What common mistakes do we make when dealing with product analytics?

Possible answers (26)

We do not take into account the different professional levels (qualifications) of our colleagues on the team. As a result, we do not get the professional opinions we need and accidentally impose our own opinion on colleagues. This is a very common problem that teams widely ignore. To solve it, we should fix the basic criteria and values we operate with from the very beginning. Data analysis should start only after these criteria have been discussed and accepted by all team members.
We used inaccurate data prepared by incompetent project participants. This problem becomes even more relevant if several different teams prepared the data we are analyzing.
Because people under the Dunning-Kruger effect cannot recognize their own incompetence, they often speak as confidently and convincingly as their competent colleagues. From time to time, we can conduct additional data reviews and show the data to random team members for a double-check.
Our strong interest in a particular result leads us to unconsciously manipulate the data or the course of the experiment.
We find correlations in random data and interpret them as causal. We begin to consider random events less random depending on how well the data "tells a story."
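A quick sketch of how easily this happens (the "metrics" below are pure random noise, invented for illustration): if we scan enough unrelated series, chance alone produces a correlation strong enough to "tell a story."

```python
# Sketch: among many unrelated random "metrics", some pairs will
# correlate strongly by pure chance.
import numpy as np

rng = np.random.default_rng(42)
n_metrics, n_days = 50, 30
metrics = rng.normal(size=(n_metrics, n_days))  # pure noise, no causality

best = 0.0
for i in range(n_metrics):
    for j in range(i + 1, n_metrics):
        r = abs(np.corrcoef(metrics[i], metrics[j])[0, 1])
        best = max(best, r)

print(f"Strongest correlation among {n_metrics} random metrics: r = {best:.2f}")
# Typically r comes out around 0.6 or higher -- a convincing "story" from noise.
```

The more metrics a dashboard tracks, the more of these chance correlations it will contain.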
We did not correctly estimate the amount of time required for data collection. Perhaps we started evaluating the data before the marketing campaign actually finished.
Our sample size was not large enough to extrapolate our conclusions to the entire group.
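One hedged way to sanity-check this before extrapolating (the conversion numbers below are invented): compute the margin of error the sample actually gives us.

```python
# Sketch: 95% confidence interval for a conversion rate
# (normal approximation) shows how wide small-sample estimates are.
import math

def margin_of_error(conversions: int, visitors: int, z: float = 1.96) -> float:
    p = conversions / visitors
    return z * math.sqrt(p * (1 - p) / visitors)

# Hypothetical numbers: 12 conversions out of 80 visitors.
moe = margin_of_error(12, 80)
print(f"15.0% conversion, +/- {moe * 100:.1f} points at 95% confidence")
# With n = 80 the true rate could be anywhere from ~7% to ~23%.
```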
In decision-making, we are guided by our past success, which was largely the result of chance. We believe that we can repeat or surpass this result.
We do not correctly assess the causes of events. We attribute successful events to our own merits while blaming unsuccessful ones on the "outside world" and market conditions.
In our decisions, we overestimate the importance of an authority figure's opinion.
We rely too heavily on data received from the system without checking its reliability (calculation mechanisms, etc.).
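A minimal example of such a check (the event log and dashboard number are hypothetical): recompute the metric from raw data and compare it with what the system reports.

```python
# Sketch: cross-check a system-reported metric against a recomputation
# from raw event logs before trusting it.
raw_events = [  # hypothetical raw log: (user_id, event_type)
    ("u1", "purchase"), ("u2", "purchase"), ("u2", "purchase"),
    ("u3", "view"), ("u4", "purchase"),
]

# Recompute "purchasing users" ourselves: unique users, not raw events.
recomputed = len({uid for uid, ev in raw_events if ev == "purchase"})

dashboard_value = 4  # what the system reports (hypothetical)

if recomputed != dashboard_value:
    print(f"Mismatch: dashboard says {dashboard_value}, raw data says {recomputed}.")
    print("Check the calculation mechanism (e.g., events vs. unique users).")
```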
We ignored unlikely events, assuming they would not happen and would not affect the data. This is why several colleagues should cross-check any analytical data.
Perhaps we have come up with a data analysis method that is unique to our product/company. At first, the method seemed very effective, but later we stopped noticing the distortions in the data resulting from its use.
We were wrong in our conclusions because we analyzed the results from only one of several categories. The data we never received from the other categories distorted our understanding of the situation as a whole.
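An invented illustration of this trap: a metric computed on the one category we analyzed can look nothing like the same metric for the product as a whole.

```python
# Sketch: conversion rate per traffic source vs. the overall rate
# (all numbers invented for illustration).
segments = {  # (conversions, visitors) per source
    "ads":     (90, 1000),   # 9.0% -- the only category we analyzed
    "organic": (30, 2000),   # 1.5% -- data we never pulled
    "email":   (12, 1000),   # 1.2% -- data we never pulled
}

c, v = segments["ads"]
print(f"Analyzed category: {c / v:.1%}")

total_c = sum(c for c, v in segments.values())
total_v = sum(v for c, v in segments.values())
print(f"Whole product:     {total_c / total_v:.1%}")
# 9.0% vs. 3.3%: conclusions drawn from one category do not transfer.
```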
We overestimated our ability to influence events. As a result, the variables we use in our analysis formulas are actually beyond our control (though it seems otherwise).
We consider events that have already taken place to be more predictable than they actually were. Based on this, we misjudge the competence of our team. For example, we may reprimand a colleague for not noticing something "obvious," when it was not obvious at the time.
When analyzing the data, we "tied" several events together as "most likely" without noticing that the overall likelihood decreases with each new event added.
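The arithmetic behind this (the events and probabilities are invented): each added event multiplies the probabilities, so the combined scenario is always less likely than any single step.

```python
# Sketch: chaining "most likely" events shrinks the overall likelihood.
p_events = {  # hypothetical independent events in our "story"
    "user opens the email":  0.6,
    "user clicks the link":  0.5,
    "user completes signup": 0.4,
}

combined = 1.0
for name, p in p_events.items():
    combined *= p
    print(f"...and {name} ({p:.0%}): scenario probability is now {combined:.0%}")
# Three individually "likely" steps leave only a 12% chance overall.
```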
We postpone obvious decisions that we know we have to make and underestimate the risks of our inaction. We keep looking for "additional information" when everything is already clear.
We react to reports emotionally because of their wording.
If the collected data came from a survey, we have to make sure the respondents' answers were genuine.
We looked for confirmation of our hypotheses, unconsciously ignoring all the data that proved them wrong.
We took into account only the data that resulted from our direct actions. We did not consider events that occurred without our participation but still affected the result.
We unconsciously avoid data that might disprove our hypotheses.
We ignore existing stereotypes and analyze the data in a way that is too "politically correct." As a result, we create a situation in which team members cannot say what they really think.
We believe that everything written above does not apply to us directly: "We are in full control of our actions, we understand everything, and biases cannot affect us."
In conclusion, I will add that a person who analyzes data and makes decisions should not allow negative emotions to affect them. This is a very important skill for any manager.
