



The project's idea is to lower the threshold for understanding human thinking patterns (cognitive biases). Many scientists, ourselves included, consider this knowledge among the most vital for any person, because it lets you improve the quality of your decision-making throughout your life.

Just as you should read a software system's documentation before modifying it, UX Core allows you to read your own "documentation," know yourself better, and adjust your mental processes in the most feasible way.

Here you'll find 105 cognitive biases (thinking patterns) with simple descriptions of each, brief usage examples, and detailed examples of how a particular bias can be spotted and used in real-life situations.

The best way to begin is to read the article: "The Science of User Experience", then listen to the short podcast, and then read the article "Cognitive Science and User Experience." You may also take a look at a few examples of how to use this project in education.

The only thing that fuels this project is your support. We kindly ask you to share it within your network. Thank you.

#1 Availability heuristics


The tendency to overestimate the likelihood of events with greater "availability" in memory, which can be influenced by how recent the memories are or how unusual or emotionally charged they may be.


In application development, understanding this bias is necessary for consistent interface design, content design, and user communication. If the action we need the user to take is associated with something negative (especially something covered in the media not long ago), the likelihood that the action will be taken drops sharply. Understanding this allows us to design content (text, images, etc.) so that it is associated only with what we need. This bias also prompts us to reflect on the current state of the world and the market when choosing the "tone" of our messages.

Another example: Bitcoin and various kinds of ICOs. The topic of cryptocurrencies was boosted so negatively in the media that, at some point, investors simply decided to avoid everything connected with it without going into detail. Ordinary users learned that being blinded by the increased volatility of this market did not end well. The hype on this topic eventually fizzled out, and many high-quality blockchain projects have faced severe difficulties in development due to the highly distorted reputation of everything associated with blockchain, Bitcoin, and crypto in general.

The last example: I chose the topics of software design and blockchain technologies to describe the Availability heuristic. The first topic is obvious to me due to my profession (product manager); the second simply came to mind with ease when I asked myself, "What stream in IT was full of hype and then quickly died out?"

#2 Attentional bias


It refers to how a person's perception is affected by selective factors in their attention. For example, people who think a lot about their Instagram profiles are more likely to pay attention to other Instagram profiles.


Understanding this bias gives us a reason to think about what we want our users to pay attention to. If we want to encourage our users to open other members' profiles more often, we should create some value in the user's profile, and that value should carry special significance (karma points, the total number of "likes," the number of sent messages, etc.). This bias partially explains the decline of user-generated content on Facebook: people became so heavily dependent on "likes" and other "sympathy gestures" that the fear of being "not popular" reduced the number of publications. Realizing this, Facebook created the "Stories" mechanism: an opportunity to unobtrusively share something that "disappears" after 24 hours.

Facebook is also testing the removal of "likes" in different countries. If people pay less attention to such elements of "public recognition," it will become easier for them to create content. The more content is created, the more valuable the network, and the more users get involved. Further, according to Metcalfe's law, the more users are involved, the higher the utility of the network, and hence the profit.
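As a back-of-the-envelope sketch of Metcalfe's law (the `value_per_link` unit here is a made-up illustration, not real data):

```python
# Metcalfe's law: a network's utility is proportional to the square of the
# number of connected users, since n users can form n * (n - 1) / 2 links.

def network_value(users: int, value_per_link: float = 1.0) -> float:
    """Utility of a network under Metcalfe's law (count of pairwise links)."""
    links = users * (users - 1) // 2
    return links * value_per_link

# Doubling the user base roughly quadruples the utility:
print(network_value(1_000))  # 499500.0
print(network_value(2_000))  # 1999000.0
```

This is why "more content, more users" compounds: each new user adds value not just for themselves but for every existing user they can connect with.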

#3 Illusory truth effect


Also known as "reiteration effect." A tendency to believe in the validity of information if it has been repeated multiple times.


This bias is food for politicians, marketing specialists, and the media. In developing applications, understanding this bias shows us why the terminology used in marketing/promotions of our project should be consistent.

If we say that "our app is the best work tracker in the world," and this correlates with customer feedback and blog reviews that refer to our app as the "best tracker," a potential user wondering which tracker to choose is more likely to remember us. The potential user may even put us on the same level as the trackers of multibillion-dollar companies, even if we operate locally, targeting only the village he lives in.

This effect also explains the power of slogans. For example, Apple, with its slogan "Think Different," plays on our need to "be original" and "not like everyone else." They use it repeatedly across all their products and ads to maintain a consistent image of their products in our minds.

#4 Mere-exposure effect


The psychological phenomenon by which people tend to develop a preference for things merely because they are familiar with them. The more often one sees someone, the more pleasant and attractive this person seems to become.


It is important to emphasize that this applies not only to people but to objects in general. Let us assume we have a successful app we want to promote in Spain. If we use the colors of the Spanish flag in the background of our application's description and our digital promotional materials (tastefully and without overdoing it, of course), Spanish readers will subconsciously find it familiar.

Another example: we place a faded, monochrome image of the Sagrada Família in the background of a white page that describes the benefits of our app in Spanish. If the text is written in suitably grand, solemn words, adding the background image will easily produce the effect of "familiarity" and increase the probability of converting potential users into actual ones.

Another example is to use standard buttons, colors, and sizes in our application, making the entire interface "universal." When we need to notify our users of new app features, we can do it unobtrusively with a standard pop-up using the same familiar buttons found in other parts of the app. If instead we need to focus the user's attention and create a feeling of novelty, we move away from our standard system and use a slightly different window color and a slightly different button size in the pop-up. Seeing the new window, the user's first association will be a feeling of "novelty," whose intensity will be proportional to how long the user has been using our application.

#5 Context effect


It describes the influence of environmental factors on one's perception of a stimulus.


This effect has been used and studied predominantly in marketing.

For example, research has shown that the comfort level of the floor that shoppers are standing on while reviewing products can affect their assessments of product quality, leading to higher assessments if the floor is comfortable and lower ratings if it is uncomfortable.

In the context of software, the Context effect relates to the user's expectations of what the system will provide. If the main page of our website talks about the non-profit orientation of the project, but at some point after registration we offer the user something to buy (even if not from us), the dissonance between the purchase offer and what the user saw earlier will cause negative feelings. The purchase offer will fall out of the general context and reduce user satisfaction with the product. This is another reason why it is important to approach information architecture very carefully.

#6 Cue-dependent forgetting


Cue-dependent forgetting, or retrieval failure, is the failure to recall information without memory cues: an inability to recall a memory due to the absence of the stimuli or signals that were present during its encoding.


In the context of the application, we can use this bias to "remind" the user what they can do with the system.

A simple example is an online sweepstakes service where many users place bets. Obviously, the "average" user both wins and loses, and in the interests of the business it is good to "support" such a user at the difficult moment when he has lost everything. Since the mind of a player who has just experienced a series of "defeats" holds only those defeats, the system can remind him of several past wins, bringing back that series of good memories. This can be done indirectly, with a message like "Dear %username%, we just wanted to remind you of your incredible winning streak that lasted three days in a row in %game_name%." This approach may be too direct, so we can change the message to "Due to your winning streak in %game_name%, you are now among the top 20% of our players in %sport_name%." This kind of approach is less aggressive and simply shows statistics. Doing this is not morally right, of course; therefore, betting offices and casinos operating under the licenses of the Malta Gaming Authority (MGA), Curaçao, and others agree in advance that they will not push players into compulsive gambling. In any case, this example clearly illustrates how businesses can benefit from knowing such a simple error of our brain.

#7 Mood-congruent memory bias


A phenomenon in which a person tends to recall information congruent with his current mood, i.e., if you are in a good mood, you are more likely to remember happy moments of the past, whereas, if you are sad, you will recall sad and depressing events.


In the context of application development, understanding this phenomenon is important for properly managing user expectations and avoiding errors that may lead to cancellation of the service. If the user is in a bad mood while using our application (no matter why), he tends to recall sad events. Since human memory is associative, if the user recalls sad events while using the app, those feelings will often become related to the application itself. This means that when working with a user who has had a bad experience, worsening his mood should be especially avoided.

To avoid being too vague, here is a simple example. Suppose our app is an online video game in which a user has lost 30% of his game currency over the last three days, and he understands that it will take him about a week to win it back. Obviously, he cannot be in a good mood, and keeping him in negative emotions does not benefit us. Let us assume his annual membership is ending and we need to request the due payment. If we do nothing, the user will get the payment request while continually recalling the fresh memory of losing 30% of his game money; he may make an emotional decision and give up playing completely, which is not in our interest. Instead, we can provide him with some kind of temporary bonus, wait for his in-game economic condition to stabilize, and only then send the payment request. Once he is back on track, he will be able to pay for the game: it once again brings him positive emotions, he recovers his playing power, and he spends his time effectively.

#8 Frequency illusion


A cognitive bias in which recently learned information that reappears shortly afterward is perceived as occurring with improbably high frequency.


This bias highlights the importance of competent content formation in the application. For example, we have an ecosystem consisting of several products, and we need to convey to the user the idea that he is part of our community. To do this, it is important to make sure that the same idea is emphasized in the content of these products, their texts, and images.

It is better if this idea is worded absolutely identically, for example, "Best solutions for freelancers!" Of course, this bias can also be used within a single product; however, the smaller the product, the more difficult it is to use this bias without being pushy and overly repetitive.

#9 Empathy gap


A hot-cold empathy gap is a cognitive bias in which people underestimate the influences of visceral drives on their attitudes, preferences, and behaviors. The most important aspect of this idea is that human understanding is "state-dependent." For example, when one is angry, it is difficult to understand what it is like to be calm and vice versa.


The best use of this bias can be described by the example of the support service of our application.

We can introduce this bias to our customer-support colleagues. First, we can outline the possible emotional states of our users. It is then better to try to calm an aggressive user with a soft, sympathetic voice, using mirroring and labeling techniques (see, e.g., Chris Voss's book "Never Split the Difference").

Training the customer support service will reduce the emotional stress of our colleagues. It will become clear that the customer's negativity is not a personal claim against them, but only a brain error common to all of us.

#10 Omission bias


One of the cognitive biases, manifested in people's tendency to underestimate the consequences of inaction compared to action with a similar result: people perceive responsibility for action as greater than responsibility for inaction.


Using this bias in software often goes against moral and ethical standards, yet it continues to be actively exploited by a wide variety of companies. For example, knowing that inaction is underestimated, we can design an online magazine service with subscriptions in such a way that people would simply be too lazy to unsubscribe.

In another case, we may need people to unsubscribe from services they don't use. This may be necessary to obtain vital information for analyzing how our application is used. To achieve this, we can send a request to update the list of services in use, worded so that users can see that inaction will be followed by negative consequences. As a result, the number of users unsubscribing from unused services will likely increase. In this case, understanding this bias directly affects the text and tone of the message we are about to convey.

#11 Base rate fallacy


The tendency to ignore general information and focus on only the specific case, even when the general information is more accurate.


For instance, people instantly believe the test results for a rare disease, not taking into account that the disease is rare. Another example is the fear of terrorism or of flying on a plane. The bottom line is that our brain tends to weigh a particular case over the statistics.

Understanding this bias makes us look more closely at the text content we develop for applications. A message describing a potentially negative outcome of an action will be perceived very differently depending on its framing: "You are about to start the disk defragmentation process. With a 99% probability, the operation will be successful." versus "You are about to start the disk defragmentation process. There is a 1% chance that the hard drive will be destroyed, and your data will be permanently lost."

By the way, this is also why it is very important to provide quality technical support. A disappointed user who did not find a reasonable explanation for a program error can leave negative feedback about the application, which will harm the product more than it may seem. When people see 50 extremely negative reviews mixed with 15,800 good ones, they tend to consider the product far less valuable, even though the negative reviews make up well under 1% of the total.
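The rare-disease example above can be made concrete with Bayes' theorem. A minimal sketch, with illustrative numbers (the 0.1% prevalence and the 99%-accurate test are assumptions, not data from any real test):

```python
# Base rate fallacy in numbers: even a 99%-accurate test for a disease that
# only 0.1% of people have yields mostly false positives, because healthy
# people vastly outnumber sick ones.

def p_disease_given_positive(prevalence: float,
                             sensitivity: float,
                             specificity: float) -> float:
    """Bayes' theorem: P(disease | positive test result)."""
    true_positives = sensitivity * prevalence
    false_positives = (1 - specificity) * (1 - prevalence)
    return true_positives / (true_positives + false_positives)

p = p_disease_given_positive(prevalence=0.001, sensitivity=0.99, specificity=0.99)
print(f"{p:.1%}")  # about 9%, far below the intuitive "99%"
```

The intuition ignores the base rate: with only 1 sick person per 1,000, the few false positives among the 999 healthy people swamp the true positives.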

#12 Bizarreness effect


The tendency of bizarre material to be better remembered than common material.


Understanding this effect is key to creating vivid appeals to the audience. In software development, this effect should be avoided in the vast majority of cases: bizarre material requires more cognitive load to analyze, and more cognitive load means a poorer user experience. Using this effect where the application's context allows bizarreness is the exception to the rule.

To use this bias, we first need to understand whether our product is suitable for it. For an app that helps you quickly organize a memorial ceremony, bizarreness will surely be out of place and will hurt the app's success. By contrast, a Facebook app that matches your profile picture to mythological heroes and generates texts like "You were born in the same month, under the same stars. You are Apollo!" draws huge masses of users precisely through its bizarreness and its simple principle of work.

#13 Humor effect


Humorous items are more easily remembered than non-humorous ones, which might be explained by the increased cognitive processing time to understand the humor or the emotional arousal caused by the humor.


The best example of using humor is memes. Plenty of big IT companies use memes to promote their products, and the "Humor effect" is exactly what they bet on.

It is very important to understand that this bias is about remembering humorous things, not about a positive attitude towards them. If the user, while performing an important action (filling a form, saving data), suddenly lands on error page 500 (Internal server error), 502 (Bad gateway), 503 (Service unavailable), or 504 (Gateway timeout), then humor like "Ho ho! Our pirates are working on it, and soon everything will work again!" will not be appropriate. In this scenario, the humor will be noticed and remembered, will most likely cause anger, and the incident will be remembered all the better. If such an event occurs several times in a month, then, per the Availability heuristic, the user is likely to give a negative rating the next time he thinks about the quality of our product, even if the application ran smoothly 99% of the time (Base rate fallacy).

In this case, a good user experience is when the company takes the blame, explains the essence of the error, and notifies the user that everything will be restored soon and the page will refresh.

#14 Picture superiority effect


A phenomenon in which images are more likely to be remembered than words.


The debate about this effect continues to this day.

The claim is that images have an advantage over words in the encoding and retrieval of stored memory because words are encoded more easily, and therefore processed less deeply.

Since mental effort is an investment of time, a person is more likely to remember something when the process of analyzing it is longer, which is why an image is remembered better than text. However, it is not that simple, as much depends on the image and the text themselves.

Without speculating too much on this effect, I will only note that a well-chosen image can replace a thousand words in certain cases. Placing an image in a context familiar to the user, and deploying it sparingly at the right moment, is a clever move. The use of this effect depends on an excessively large number of factors, so we will not elaborate on this bias further.

#15 Von Restorff effect


Also known as the "isolation effect." It predicts that when multiple homogeneous objects are presented, the object that differs from the rest is more likely to be remembered.


For example, in our website's header, we have a navigation menu with the following items:

| Home | About us | Our services | Our products | Contact |

Now, let us put another item here, naming it, say 420:

| Home | About us | Our services | 420 | Our products | Contact |

Obviously, item 420 will draw disproportionate attention, and even if we do not click on it, we are highly likely to hover over it (what if there are subcategories?). This effect can be used both to make some content "invisible" (such as the "Terms of Service" and "Privacy Policy" items) and, vice versa, to draw users' attention to a new CRM filter item, or anything else.

#16 Self-reference effect


The tendency for people to encode information differently depending on the degree to which they are implicated in the information.


This effect is widely used daily by marketing specialists. For instance, in advertising, a customer perceives information better if it features people similar to him. Likewise, people are more likely to remember birthdays close to their own, and so on.

If we have developed an application for the elderly, then using the images of young couples on the home page will be unreasonable.

This effect can be successfully used to compose contextual links ("Click here if you served in the army between 2005 and 2015"), menus (| Freelance | Private Business | Public Institutions |), and more. All this is so intuitive that, in most cases, it is used for its intended purpose. However, a better understanding of this effect will help you move towards creating a simpler interface.

#17 Negativity bias


Also known as the negativity effect. It is the notion that even when of equal intensity, things of a more negative nature (e.g., unpleasant thoughts, emotions, or social interactions) have a greater effect on one's psychological state and processes than neutral or positive things. In other words, something very positive will generally have less of an impact on a person's behavior and cognition than something equally emotional but negative.


That is why it is necessary to resolve conflicts and problems with our users as gently as possible, and erroneous system behavior should be avoided. To erase the memory of a negative event that lasted only half an hour or so, it may in some cases take many days or even weeks of uninterrupted service that meets client expectations. If the system fails every so often, so that the memory of the previous failure stays fresh, it will push the user to conclude that the system is full of errors and to switch to competitors.

It is important to understand that we can solve the user's problems 99% of the time, and it will still not be enough if we cannot handle the errors that do occur properly (Base rate fallacy). As a solution, many companies notify users in advance of possible system errors. Even if a user was offline and was not affected by the errors that occurred, a letter of apology and a progress report is still sent to him. All these details become far more important if the application serves enterprise customers.

#18 Anchoring effect


Anchoring or focalism is a cognitive bias where an individual depends too heavily on an initial piece of information offered (considered to be the "anchor") to make subsequent judgments during decision making.

Examples of use in stores aimed at increasing sales:

  • Group item pricing (even without a big discount), for example, "Buy 3 bottles for $6."
  • Marked limitations (only X pieces per person)
  • Mentioning a random number (buy 18 "Snickers")

The paradox is not only that this method works, but that its effect persists even if disproportionately large or small numbers are used as anchors, and even if the consumer is aware of the Anchoring effect.


Understanding this bias is crucial for any kind of e-commerce business, because a simple game with numbers can be converted into large sales. In addition to the examples above, which are successfully used by stores (both physical and online), another example deserves attention: the indication of the MSRP (manufacturer's suggested retail price) for an item. The manufacturer uses this indicator to standardize the prices of its products across different regions of the world. However, if expansion into other regions is not planned, the MSRP can be used more deftly.

For example, you can mark the MSRP for a laptop as $2849, indicate the "store" price of $2099 next to it while crossing out the "$2849," and then put a 20% discount on the product with a "final" price of $1679. Visually it will look perfect: the potential consumer sees a cascade of discounts and, when evaluating the product, mentally anchors on the initial price of "$2849." Because of that, he believes he can buy the product at more than a 40% discount! But what if the store originally planned to sell this laptop for $1550 all along, as part of its strategic plan? This is a brief example of how online stores work. Of course, there are countless methods of working with numbers; yet, returning to the Anchoring effect, it must be emphasized that working in any market without knowing how this effect works is, to say the least, naive.
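The arithmetic of that discount cascade can be sketched directly (prices taken from the example above):

```python
# Anchoring in a price cascade: every discount is judged against the first
# number shown (the MSRP anchor), not against the seller's real target price.

msrp = 2849.00         # the anchor, shown first and then crossed out
store_price = 2099.00  # the "store" price next to the crossed-out MSRP
final_price = round(store_price * (1 - 0.20), 2)  # extra 20% off store price

perceived_discount = 1 - final_price / msrp
print(final_price)                  # 1679.2
print(f"{perceived_discount:.0%}")  # 41%, i.e. "more than 40% off!"
```

The buyer evaluates $1679 against the $2849 anchor, not against the $1550 the store may have planned to charge anyway.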

This effect can be used not only in eCommerce. There are many methods of using it outside software development, which is a material for another article.

#19 Conservatism (belief revision)


Conservatism is a bias in human information processing that refers to the tendency to revise one's beliefs insufficiently when presented with new evidence: people over-weigh the prior distribution (base rate) and under-weigh new sample evidence compared to rational (Bayesian) belief revision. This cognitive bias explains one of the reasons why it is so difficult for people to change their views, values, and beliefs.


From a software perspective, understanding this bias obliges us to be cautious when developing content for our app. Any messages, even "philosophical" musings about the advantages of our product, should be reviewed for whether they might somehow offend someone's ideological values.

Another benefit that can be drawn from this bias is the need to consider the value and belief systems of our target audience when designing solutions. If, in addition to basic data about potential end-users such as gender, age, race, and nationality, we correctly determine their value and belief systems, we can develop more precise app content. This particularly applies to promotional materials and the various articles we prepare to promote our product. Content writers are often assigned tasks like, "Write an article about this product feature. We are the best on the market, our competitors have nothing of the kind, and thousands of our users have already tested it." With data on our audience's values, we can instead set a task like this: "We need an article describing the %name% feature. Our audience consists of young couples aged 18-27, most of whom either already have children or plan to. They live in %location_city_name%, and they are concerned about air quality. It is also common practice in this area to attend Sunday worship services." In the latter case, a competent content writer will be able to produce a far more heartfelt article from this material.

Although conservatism is a vast topic, I would like to stress the importance of meeting socio-cultural norms. Regardless of what our application is and how it looks and works, if we do not take into account the socio-cultural factor and the resulting conservative beliefs during development, we are less likely to succeed.

#20 Contrast effect


This is an enhancement or diminishment of an object's perceived parameters (of people, phenomena, or processes) when an immediately preceding object was of lesser or greater value in the same dimension.

A couple of examples:

  • A rectangle of a neutral shade of grey will appear lighter or darker than it actually is, due to the dark or light frame around it;
  • A person appears more attractive compared to less attractive people, and less attractive if others are more appealing.


Understanding this effect is important for creating consistent interfaces. If, while working with the application, the user suddenly lands on a page where the engineers have not yet updated the design (CSS), that page will make the whole website look faded, and the user will perceive the design as older than it actually is. Unfortunately, engineers often ignore this effect: "it's just CSS, there are more important things!"

However, unlike the project engineer, the user will overreact to this "just CSS." Given that the development of high-quality products is always focused primarily on users, priorities should be adjusted accordingly.

This effect also gives food for thought when it comes to working with audiences. Let's say we are going to launch a dating site. Depending on our audience and how we position ourselves, the images we use will differ greatly.

If we are doing a project for single people who were previously married, our audience will probably be 30+. It's expected that the priority for these people is "reliable," "kind," "good," "family" partners. As a result, we get images of "average," happy couples who "found each other on our website!"

A different approach is needed when we focus on students. Obviously, the images described above will not be suitable, due to the audience's greater impulsiveness, its hormonal background, and its completely different life priorities. Vivid photos of young girls and guys will fit better here.

What does the Contrast effect have to do with this? In the first case, if we place images of vivid young people, the confidence of those "who have been divorced" will suffer: according to this effect, they will consider themselves less attractive. In the second case, if we put photos of "average couples" in front of students, it will contrast with young people's opinion of themselves. After all, they consider themselves quite extraordinary, unusual, and simply the best in the world; in other words, they are prone to youthful exuberance.

All these are just examples, of course; in general, when the criterion for developing something is the Contrast effect, A/B testing will have to be carried out to find out the truth.

#21 Distinction bias


The tendency to view two options as more dissimilar when evaluating them simultaneously than when evaluating them separately.


Understanding this bias allows us to take different approaches to developing the information structure of our application.

For example, if we want the user to clearly see the difference between two service plans, we can put the service plans in a row indicating the characteristics and prices of each (you have seen such on many websites in the "Price" section).

But if we need the user to consider our service plans "almost the same" (no matter why), we can place them vertically instead of horizontally. He will then have to scroll the page, which will prevent him from evaluating the differences between the plans side by side; hence we will be more likely to achieve our goal.

This bias is also one of the reasons why online sweepstakes and various kinds of casinos do not show "Deposit Amount," "Winnings," and "Losses" on the same page. Doing so might be user-friendly, but it does not meet the project's business targets, as the user would give more importance to the difference between winning and losing. It does not even matter which he would weigh more: the very fact that the user would have feelings and thoughts beyond our control poses a risk for the business.

#22 Framing effect


Drawing different conclusions from the same information, depending on how that information is presented. Depending on wording and semantic emphasis, the same statement can be presented negatively or positively, as a gain or as a loss.


This is one of the most important biases to understand. I recommend everyone to properly study it, regardless of one's occupation.

Here are two example messages shown after the user clicks the "Start System Update" button:

"You are about to start the system update process. 99% probability the operation will succeed."

"You are about to start the system update process. The probability of permanent data loss is very low."

If, for example, the probability of data loss is less than 1%, the very mention of "permanent data loss" will be perceived by the user much more intensely, disproportionately to the 1% probability.

Or let's study another example. Our mobile app's user receives the message "You got 20 points and beat 70% of our project's users!" Or a message with the same meaning stated differently: "You got 20 points and got into the top 30% of users of our project!" The alert reader will notice that the two messages are semantically identical, but the emotions they evoke differ greatly.
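Both messages derive from one underlying statistic; a minimal sketch of how the two framings could be generated from the same number (the function name is hypothetical):

```python
def framings(points: int, beat_percent: int) -> tuple[str, str]:
    """Build two semantically equivalent framings of the same result."""
    top_percent = 100 - beat_percent  # "beat 70%" is the same fact as "top 30%"
    gain_frame = f"You got {points} points and beat {beat_percent}% of our project's users!"
    rank_frame = f"You got {points} points and got into the top {top_percent}% of users of our project!"
    return gain_frame, rank_frame

gain, rank = framings(20, 70)
# Only the frame differs; the fact reported is identical.
```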

In addition to this effect, I recommend reading about the "7-38-55" rule deduced by Albert Mehrabian, a remarkable Armenian scientist. Although it concerns live communication, an attentive reader will be able to find valuable information to use in the product.

Special attention should be paid to the fact that this effect forms the end-user expectations after reading the content—every word, every message, written anywhere in the project matters.

Alas, in most companies the importance and integrity of content are currently highly underestimated, especially if the company's management team consists only of engineers. For the same reason, most such companies are either closed down or sold to other engineers.

#23Money illusion


Money illusion, or Price illusion, is the human cognitive bias of thinking of money in nominal rather than real terms: the tendency to perceive money as a material object, while in fact money is just a symbol. For example, if you have $1,000 (10 banknotes with a face value of $100 each), you will assume that if you do not spend a single banknote, and no one steals them, the amount of money will not change in a year. In fact, there will be less money in a year, although neither the number of banknotes nor the figures printed on them will change. Simply put: the ordinary person does not think about money in terms of inflation.


Understanding this illusion is primarily important in the context of computer game development.

Despite great graphics and playability, many online games face big challenges because they underestimate "inflation." So if an online game's management decides to distribute any "bonuses" to players, understanding the Money illusion and the mechanics of inflation is very important, mainly for determining the optimal size of these "bonuses": large enough to satisfy the players' wishes, yet not harmful to the business in the long term. This illusion can also be used when developing applications for casinos and various types of sweepstakes.

When you know that inflation has hit some currency by 7% in the last 3 months, but the users still think it "costs the same," you have room to maneuver and to organize magnificent bonuses and various kinds of promotions.

#24Weber-Fechner Law


It is an empirical psychological law quantifying the perception of change in a given stimulus. It relates to human perception, specifically the relation between the actual change in a physical stimulus and the perceived change.

Could be illustrated as follows:

An eight-light chandelier seems brighter than a four-light chandelier by the same amount that a four-light chandelier seems brighter than a two-light one.

Accordingly, the number of light bulbs must grow by the same factor each time for the apparent gain in brightness to feel constant. If instead the absolute gain in brightness (the difference "before" and "after") is constant, the apparent gain will seem to decrease as the brightness itself grows. For example, if you add one light bulb to a two-light chandelier, the apparent gain in brightness will be significant. If you add one light bulb to a chandelier of 12 light bulbs, the gain will be almost imperceptible.
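The chandelier illustration follows from Fechner's formulation, in which perceived intensity grows with the logarithm of the stimulus, so perceived gain depends on the ratio rather than the difference. A minimal sketch (the constant k and the function name are illustrative):

```python
import math

def perceived_gain(bulbs_before: int, bulbs_after: int, k: float = 1.0) -> float:
    """Fechner's law: perceived intensity grows with the logarithm of the
    stimulus, so the perceived gain depends on the ratio, not the difference."""
    return k * math.log(bulbs_after / bulbs_before)

# Going 2 -> 4 bulbs feels like going 4 -> 8 (same ratio),
# while adding one bulb to 12 is barely perceptible:
step_small = perceived_gain(2, 3)
step_large = perceived_gain(12, 13)
```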


In the case of sweepstakes, understanding this law makes it obvious that offering the user $2 when he has just lost $1,000 is, to put it mildly, "not a good idea." If the goal is to soften the effect of the loss, the $2 can be spent in a more useful way. For instance, in online poker rooms, the system gives tournament tickets instead of $1 bonuses. The ticket still costs $1, but participation in the tournament leads to more intense, positive feelings.

#25Confirmation bias


Confirmation bias is the tendency to search for, interpret, favor, and recall information that confirms or supports one's prior personal beliefs or values.


This is another bias that is important to understand, along with "Conservatism." If, in the case of Conservatism, we should avoid conflicts with the long-standing beliefs of our users, then in the case of Confirmation bias, we should place mental "bonuses" in different parts of our application that meet our users' expectations and beliefs.

For example, we run a website for "online auctions" of various secondhand things. There are many such projects, and we stand out by allowing people to trade "anonymously." From the very first day, all our content shows that our users' privacy and security come first for us. The platform's visitors thus register and sell items with a certain set of expectations.

In such a project, understanding Confirmation bias allows us to distinguish two different viewpoints among our users. Some will consider the project doubtful: once we make the slightest mistake, they will leave, thinking it was bound to happen. Others will think that everything is OK and believe in the anonymity and security. Understanding this allows us to create content, newsletters, and new system features that take the plurality of our users' beliefs into account.

As a result, at some point after gaining our users' trust, this approach can make us one of the world's largest online retailers (more detailed examples are described in the book "Amazon Era").

#26Congruence bias


Congruence bias is a type of cognitive bias similar to Confirmation bias. It occurs due to people's overreliance on direct testing of a given hypothesis and neglecting or rejecting indirect testing.


Understanding this bias is important when creating the metrics by which we evaluate the results of A/B tests and other success metrics of our product.

Let's assume that the 2,500 new users who registered on our website in the last week are the result of launching our advertising banner on the popular website testing.com 10 days ago. Due to the Congruence bias, we tend to focus our attention on the characteristics of the audience that visited testing.com, without taking into account that testing.com visitors could have ignored our banner, and completely overlooking the group of people who would have visited our website regardless of the banner on testing.com.

For example, if our partner at testing.com turned out to be dishonest, realizing in advance that we expect good results from our cooperation, he could have run Google Ads before launching our banner, targeting an audience relevant to it - the Congruence bias will never let us see this. Everything changes once we consider such a scenario and test our hypothesis indirectly as well.

#27Post-purchase rationalization


Also known as Choice-supportive bias, Post-purchase rationalization is the tendency to retroactively ascribe positive attributes to a subject or action one has selected and/or to demote the forgone options. People are more likely to amplify the advantages of the chosen option and the disadvantages of the rejected one.


Knowing the rationalization mechanism allows a wide variety of online retailers to send customers thank-you letters with texts like "Thank you very much for choosing %product_name%! This is the best %product_type% on the market, and we are glad you made such a good choice!"

One of the main goals of such a letter is to give the person more material with which to rationalize the purchase, because shops do not want to issue refunds.

It is also important to highlight that the intensity of the rationalization material should be proportional to the complexity of the user's decision. Of course, in most cases we do not know whether a purchase on our website was easy or difficult for the user, especially when goods are bought regularly. However, if, for example, we sell beautiful teacups that cost $49 apiece, it may make sense to spend a few dollars on a beautiful wrapper, a thank-you message that no one reads (and yet they are important!), an email about how great it is to own this limited-edition cup, and so on.

I remember buying a bike saddle made by the English brand "Brooks." The brand has always been distinguished by high-quality saddles, but in any case, $150 per saddle is expensive. From the moment of purchase, I kept rationalizing it, mainly focusing on the fact that "they are the best!" Having received the saddle in a beautiful box with nice inscriptions, I continued unpacking. I was surprised to find not only a thank-you letter from the "Director of the Company" but also an issue of an English newspaper which, as I opened it, revealed a large image of a young couple hiding in a forest. Of course, newspapers have no direct relation to bicycle saddles, and yet finding all that in the box that day closed the purchase-rationalization question for me.

What I mean is: if the user has just purchased from us, this should be the beginning of his journey, not the "final goal." A business needs to convert potential buyers into actual ones; however, if we want a high-quality name on the market in the long run, our work on UX should continue until the user has used his purchase and left a good review on the site, preferably also rating the product.

#28Selective perception


It is the tendency of people to notice those elements that are consistent with their expectations and ignore the rest. An example of this phenomenon is the selective perception of facts from news reports.


Thanks to this form of perception, when we work in MS Word and at some point need to change the color of some text, we very quickly find the appropriate tool on the panel. Working with different editors has formed in us a set of expectations of what an editor should be and where the color panel should be placed. Understanding this when creating our own editor allows us to place many tools on one panel without worrying that people will get confused. In other words, it may be appropriate to group interface elements with the expectation that Selective perception will let users intuitively understand how the application works. Of course, due to its complexity, testing working hypotheses that rely on Selective perception should be done very carefully and include A/B testing.

#29Observer-expectancy effect


Also called the Experimenter-expectancy effect or Expectancy bias. It is a form of reactivity in which a researcher expects a certain result and subconsciously manipulates the experiment or misinterprets the data to detect this result.


Understanding this effect is necessary to correctly read the analytical data on our application. For top management this is even more important, because middle managers are more vulnerable to the effect, due to the constant pressure to prove both their own effectiveness and the effectiveness of their decisions. Most often, people fall victim to the effect simply because they wanted to see bigger numbers. For example, the number of "returned" users of our website is 98, but if you set the Google Analytics filter to start not from February 18, 2020, but from the 17th, the number grows to 235. At some point, we can fool ourselves with something vague like "well, there is a time zone difference and all that," and by shifting the filter from the 18th to the 17th generate a report that better meets our expectations.

In turn, this report will be handed to the management, who will perceive it as accurate and not notice the mistake. Based on the incorrect data, the management will make the suboptimal decision to extend cooperation with the marketing agency, since it "seems" to be effective.

#30Ostrich effect


It is the tendency to avoid dangerous or negative information by ignoring it.

In a broader sense: avoiding information that can cause psychological discomfort. The best illustration of this effect is everyday expenses.

Having spent a significant amount of money, a person is in no hurry to check his bank account, pretending that his balance has not changed.


Understanding the role of this effect in e-Commerce or computer-game logic allows us to create mechanisms for replacing the real currency spent with various bonus points or bonus gaming currency. Psychological discomfort here should be understood in the broadest sense. To help our users avoid this discomfort, in addition to gathering data on gender, age, race, and nationality, we can add to the "Persona" of our typical user his expectations of the application, as well as of each action performed.

Knowing our users' expectations allows us to assume the type of psychological discomfort from a particular action in the application and act on it. So if we know that by sending a message on our social network, the user may worry if the message has been sent, we can add the "Delivered" notice, which, as we all know, is widely used in most social networks and various kinds of messengers.

#31Subjective validation


This bias is the perception that something is true if a subject's belief demands it to be true. It also creates perceived connections between coincidences.


This bias is similar to Confirmation bias. However, while Confirmation bias is the tendency to seek confirmation of one's beliefs, here we are talking about distorting the interpretation of information already received. For example, after a large purchase that was emotionally difficult for us, we tend to rationalize it by giving importance to properties of the product that we would have evaluated differently as outside observers.

Let's say we bought a bike. If it was expensive, and we spent all our savings on it, then we will tend to exaggerate the significance of every compliment received for the bike from both friends and strangers. Our friend tells us, "So cool, it is so lightweight!" And we cling to his phrase willingly agreeing, adding, "Yes, and at the same time it rides really well!"

When working on software, understanding this effect is important for creating a good, comfortable "path" after the user takes a meaningful action. Let's say we have an online store of expensive fitness watches. Taking into account Subjective validation, as well as people's need to rationalize their purchase (since the watches are expensive), immediately after the purchase we can send the user an email with the serial number of the watch, noting that only %number% similar watches were made and emphasizing that it is a limited edition.

In a couple of days, we can send an article from our blog, which describes the benefits of the watch purchased by our user and includes reviews by athletes competing in various sports. A week later, we can send the last "thank-you letter" with a 10% discount bonus coupon, emphasizing that this coupon is offered ONLY in case of purchasing this watch.

Of course, Subjective validation can be used in a variety of ways in an infinite number of cases.

Here is another, more complex example. Let's say that when registering on our site, we asked the user whether he wants to participate in charity events protecting the environment. If the user agrees, we visually emphasize the importance of his choice and "thank" him. Later, if we decide to send him a letter asking to donate a couple of dollars to restore a burnt forest in Ukraine, the likelihood of his consent will be higher: when he agreed to participate in the program, he created a new belief about himself ("I am someone who cares about the environment"), and thereby Subjective validation will work in our favor. Such a user will find it emotionally difficult to step back from this self-image. In the context of the Internet, this bias is more difficult to use than in the "real world," where such experiments have been successfully conducted and documented. Yet certain business benefits in IT can be derived from it.

#32Continued influence effect


This is the tendency to believe previously learned misinformation even after it has been corrected. Misinformation can still influence inferences that one generates after a correction has occurred.


This effect can be interpreted as a consequence of two other cognitive biases: Conservatism and Confirmation bias. The reason for this effect is that our brain protects us from possible cognitive dissonance and the associated emotional trauma. That is why the brain needs time to carry out all the necessary validations before the influence of previously learned misinformation decreases.

In application logic, understanding this effect is important for proper communication with the user about "external world" events. For example, if we decide to send our users a message or a support letter about COVID-19 (which has become very trendy for many applications), it will be risky to base the selected tips and ideas on information that was considered dubious and has become "true" only recently.

#33Bias blind spot


The Bias blind spot is the failure to notice one's own cognitive biases; the inability to detect biases in oneself is thus itself a cognitive bias. People are more likely to notice erroneous behavior and motives in others than in themselves.


This bias is why it is not necessary to explain to users the reasons for a particular product decision if it is based on psychology and the cognitive sciences. For example, if in the release notes of the next version of our product we add information that focuses on psychology, most users will likely react with distrust. Moreover, in some cases it will be perceived as if we are manipulating them.

Of course, the design of any application is 100% about managing users' expectations. The attentive user will certainly notice this, and yet even he will consider himself smarter than others and, to some extent, will be exposed to the Bias blind spot.

Sometimes it is better to seem simpler than you are.

#34Clustering Illusion


This is the human tendency to expect random events to appear more regular or uniform than they actually are, which originates from the idea that the appearance of clusters or sequences in data cannot be caused just by chance.


Sometimes random events are just random, regardless of how frequent they seem, and we should understand this and be more realistic when analyzing product statistics.

Understanding this bias allows us to work more carefully with our user base. Let's say we want to give our mobile app's users the feeling that we take care of them 24/7. For this, we can write special notifications that remind them from time to time of something worth considering. To avoid annoying users, we can send the messages at random intervals of 36 to 55 hours. The notifications are non-systematic, so the user will not detect an exact schedule; yet, due to the Clustering illusion, the randomly sent messages will seem more regular, and more attentive, than they actually are.
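The 36-55-hour scheme above fits in a few lines; a sketch, with a hypothetical function name and a seed used only to make the example reproducible:

```python
import random

def next_notification_delay_hours(rng: random.Random,
                                  lo: float = 36.0, hi: float = 55.0) -> float:
    """Pick a random delay for the next care-taking notification so that
    no fixed schedule emerges for the user to notice."""
    return rng.uniform(lo, hi)

rng = random.Random(7)  # seeded only for reproducibility of the sketch
delays = [next_notification_delay_hours(rng) for _ in range(10)]
# Every delay falls somewhere in the 36-55 hour window.
```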

Of course, it is a crude example, but I hope you get a general idea. A good implementation of this idea was applied in the Healthy Minds app by Dr. Richard Davidson. During the COVID-19 lockdown, their app randomly sent notifications during the day with the text "This is a really challenging time. It can help to pause for a few slow, mindful breaths."

#35Insensitivity to sample size


This is a cognitive bias that occurs when people judge the probability of obtaining a sample statistic without respect to the sample size.


Understanding this bias is necessary for choosing the right size of cohort user groups. If our cohort is too small (in absolute numbers, not percentage), we risk receiving data that does not represent all our users. As a result, we may decide to build new tools in our app that the majority eventually turns out not to need. This bias is very broad, and I recommend reading the corresponding work of Daniel Kahneman and Amos Tversky, which is publicly available on the Internet.
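The risk of a small cohort can be made concrete with the standard margin of error for a proportion (a sketch; 1.96 is the usual z-score for a 95% confidence level):

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate 95% margin of error for an observed proportion p
    in a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# The same 60% "yes" rate is far less trustworthy in a tiny cohort:
small_cohort = margin_of_error(0.6, 25)    # roughly +/- 0.19
large_cohort = margin_of_error(0.6, 2500)  # roughly +/- 0.02
```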

#36Neglect of probability


It is the tendency to disregard probability when making a decision under uncertainty, in which people regularly violate the normative rules of decision making. Small risks are typically either neglected entirely or hugely overrated; the continuum between the extremes is ignored. As Rolf Dobelli explains, "We have no intuitive grasp of risk and thus distinguish poorly among different threats. The more serious the threat and the more emotional the topic (such as radioactivity), the less reassuring a reduction in risk seems to us."


Exploiting this bias daily brings hundreds of millions of dollars to various casinos and betting companies worldwide. It is also the reason why we tend to click "I accept the terms of the license agreement" without even reading it. Understanding this bias allows companies to present important information to users in a way that makes them likely to ignore the risk. Or, on the contrary, we can push users toward the actions we need by using specific language.

For example, knowing that our users ignore the probability of complete data loss, we can push them to create backups with a message like "Dear %user_name%, the last time you backed up your data is 571 days ago. We strongly recommend creating a backup to avoid the risk of a permanent data loss." The likelihood of loss may remain a constant 0.1%. Still, by provoking emotions ("permanent data loss") and converting the conditional 19 months into 571 days, we are more likely to prompt the user action we need (a system backup).
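A helper producing the message above might look like this (a sketch; the function name is hypothetical, and the template is the one quoted in the text):

```python
def backup_reminder(user_name: str, days_since_backup: int) -> str:
    """Phrase the reminder in days rather than months and name the feared
    outcome explicitly, so the small risk feels larger than it is."""
    return (f"Dear {user_name}, the last time you backed up your data is "
            f"{days_since_backup} days ago. We strongly recommend creating "
            f"a backup to avoid the risk of a permanent data loss.")

# ~19 months, deliberately stated as a count of days:
msg = backup_reminder("%user_name%", 571)
```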

#37Anecdotal evidence


This is a statement or evidence based on cases or episodes from personal life or unique experiences. It should not be confused with an anecdote as a humorous story: anecdotal testimony is a short personal story.


I added this bias to emphasize the importance of being familiar with it rather than to set an example of its use in product development. Most people tend to exaggerate the events happening in their lives, and this is primarily important to understand when writing instructions for our application's support service. For example, a user contacts us and describes an improbable case: some error in our system happened to him personally, he was not able to take a screenshot or a video, and now he assures us that everything was exactly as he states. Understanding the significance of Anecdotal evidence, in conjunction with Confirmation bias and Subjective validation, will let our team choose a more cautious policy in which we do not offend the user by stating, "Our system does not work like this; what you are saying is impossible!"

#38The illusion of validity


This is a cognitive bias in which a person overestimates his ability to interpret and predict an outcome accurately when analyzing a set of data, in particular when the analyzed data shows a very consistent pattern, that is, when the data "tells" a coherent story.


Understanding this bias, along with the Clustering illusion, is necessary for a more cautious approach to analyzing product data. These errors are the main reason a product manager needs colleagues on the team with whom he can double-check the information received. Even if we are very alert and aware of people's tendency to confirm their own viewpoint, having a third-party opinion on our data interpretation can greatly reduce the likelihood of error.

#39Recency illusion


It is our belief that the things we noticed only recently are actually new. This illusion arises from another cognitive bias: Selective perception.


In the process of developing an application, this illusion can be used to create the "new" from the "well-forgotten" or unnoticed old. I will give an example using a web application opened in desktop browsers. With an abundance of functionality, it will be easy for us to find something that users do not notice or use extremely rarely. Suppose we analyzed the application data and noticed tools we once spent a decent amount of time developing, but which got lost against the rest. Since we do not need "ballast" in the form of dead tools in the interface, we must make a decision: either get rid of a tool or redo it. Typically, this is the approach used by technical managers entrusted with product management: more often they decide to remake the tool if, in their opinion, something "more useful" can be created from it, and if the upgrade takes too much time, they decide to remove the tool altogether.

Understanding the Recency illusion allows us to take a third path, the one most often used by politicians. We can carry out a minimal modernization of these tools and present them again, pointing to a fundamentally new approach we used in their modernization. Even if practically nothing has changed visually, a well-balanced message, coupled with prominent visual highlighting of the tool, will most likely stimulate its use. By the way, a similar approach can be seen in a number of Microsoft products, in particular MS Office, where the application occasionally advises you to use the "new features of the office suite" to perform certain actions, while the "new features" are the same tools that existed in previous versions.

#40Gambler’s fallacy


Also known as the Monte Carlo fallacy, it reflects a common misconception about the randomness of events: as a rule, a person intuitively does not realize that the probability of each subsequent outcome of a random event does not depend on the previous outcomes. Studies note that people with a higher intelligence quotient are more prone to this bias than others, because they attach more importance to patterns and thus tend to believe they can predict which event will occur next.
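The independence this fallacy denies is easy to check with a quick coin-flip simulation (a sketch; the estimated probability of heads stays near 0.5 no matter how long the preceding streak of tails):

```python
import random

def heads_rate_after_tails_streak(trials: int, streak: int, seed: int = 0) -> float:
    """Estimate P(heads) on the flip right after `streak` tails in a row.
    Independence means the estimate stays near 0.5 for any streak length."""
    rng = random.Random(seed)
    flips = [rng.random() < 0.5 for _ in range(trials)]  # True = heads
    after = [flips[i] for i in range(streak, trials)
             if not any(flips[i - streak:i])]  # preceding `streak` flips all tails
    return sum(after) / len(after)
```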


This fallacy is one of the most important for owners of casinos and various gambling businesses to understand. In fact, any product that can add an element of gambling excitement can benefit from it. For example, a number of modern computer games give players the opportunity to buy "lootboxes" from which different kinds of objects drop randomly, providing a kind of play value. Since computer games and their lootbox "drop" distribution algorithms are not subject to audits by licensing authorities (such as MGA / Curacao gaming), competent intervention in the distribution, taking the Gambler's fallacy into account, can stimulate the purchase of new lootboxes. By the way, the understanding of such details has led more and more countries to impose legal restrictions on game developers who apply similar dishonest practices (example).

#41Hot hand fallacy


The opposite of the Gambler's fallacy. This is the mistaken belief that a person who has experienced success in a random event has a higher chance of repeating the result. The error in judgment was named after basketball fans' perception of a "hot hand": a player is said to have a hot hand if he or she makes several shots in a row, and based on this, fans believe that the player's chances of making the next shot are higher than usual.


In the context of software, exploiting this error can bring serious benefits to various streaming platforms like Twitch: adding visual indications to the most "successful" players attracts the audience's attention to them.

For the same purpose, computer games like League of Legends or DOTA2 have concepts such as a "spree" - a series of "kills" of opponents' heroes, with each subsequent kill adding a new visual and audio indication. eSports betting platforms, in turn, can earn extra profits by using this effect to adjust the "win" odds of certain "successful" players in real time.

#42Illusory correlation


The cognitive bias of exaggerating the relationship between variables when in reality it either does not exist or is significantly weaker than expected. The "illusory correlation" is usually explained, first of all, by the way people make decisions and think. In any cognitive process, a person relies on the simplest solution, the one that requires the least effort. If an event needs to be tied to something, it is most likely to be tied to whatever first comes to mind, even if, in reality, such a connection is illusory. Thus, illusory correlation is considered one of the ways stereotypes are formed.


Although I cannot give a clear example of using this bias in product development, I will nevertheless try to emphasize its importance for companies that want to become a world-class brand.

Thus, understanding this bias is necessary in order to double-check all communication with our audience, users, and the media. For the same reason, absolutely all content writers working for us should follow the same instructions and keep the same tone of voice. In the long term, using this bias, we can create opinions in the subconscious of our users about our brand regarding things we have never publicly announced.

I will try to give a slightly complicated example.

Let's say our company is on the Fortune 500 list. We do not donate to charity, as we see serious risks in it ahead of our competitors, and yet, for one reason or another, we would like people to assume that we are engaged in donations. A rough plan of action to address this issue:

  1. Gather data on donations from public sources, in particular the blogs of Fortune 500 companies and mentions in the media;
  2. Make a list of the most "powerful" statements from these data, the ones most vigorously discussed on social networks;
  3. Instruct our content writers to compose materials on the theme of solidarity with large donating companies, using the most successful / most discussed statements from the list compiled earlier;
  4. Hand these materials to our marketers and develop a strategy for promoting them from private accounts, with an indirect mention of our company in the text;
  5. Wait for the results.

The implementation of this plan is rather complicated, because many different nuances will have to be taken into account; nevertheless, with competent implementation, we may well create in society the illusion that we, too, are connected with donations. The example is very crude, but I hope it is enough for the attentive reader to catch the general idea.

The illusory correlation that many wealthy companies donate money to charity is common to all of us. If you conduct even a cursory study of this topic, you will be greatly surprised by how few companies are really involved in charity. This is an even clearer example of the cognitive bias being discussed.

#43Group attribution error


Refers to people's tendency to believe that the characteristics of an individual group member are reflective of the group as a whole.

Understanding this error makes it obvious that after releasing a new application feature that lets participants unite into “Groups”, the very next step is to let them add a description to the group.


In other words, if we allow our users to form groups, communities, clans, etc., but do not give them the opportunity to “describe themselves”, this creates dissonance among people outside the group, because it is harder for them to find common collective properties and form an attitude toward the group.

Another benefit we can extract from understanding this error is the importance of associating ourselves with successful teams that impress our audience. This technique is widely used by marketers who advertise their brand on the uniforms of entire sports clubs (e.g., football teams).

#44Fundamental attribution error


When considering a particular situation, we treat internal or external factors as decisive without paying attention to how events actually developed. For example, we tend to explain our own improper behavior (no matter what form it takes) by external circumstances, and someone else's by internal (personal) factors.


When it comes to success, the vector of the fundamental attribution error usually flips. We tend to explain other people's success by external circumstances and luck, and our own as the fair result of our efforts, knowledge, skills, etc.

Understanding this very interesting error gives us ample opportunity to work with the users of our applications. We can analyze the main problems our users run into and how they interpret them, and then write instructions for our customer support department to make users' communication with us more comfortable. Understanding this error also makes it possible to create competent tactics for contacting potential users (leads) for our sales team.

For example, if a user angrily contacts our support team pointing to our personal mistakes (attributing someone else's improper behavior to internal factors), before answering his question we can refer to his own successes: “First of all, let me congratulate you on the success of the last campaign you ran with us! We don't doubt your professionalism (vector displacement: we are talking about a successful user result), and we will certainly conduct a thorough analysis of the situation and answer you before the end of the working day.”

Amazon is the best example of a company whose support-service instructions are plainly built on an understanding of this error.



#45Stereotyping


Perception, classification, and evaluation of objects, events, and individuals by projecting onto them the characteristics of a social group or social phenomenon, based on certain preconceptions or established stereotypes.


Although our politically correct society actively condemns stereotyping and ostracizes people prone to it, ignoring stereotypes in business is unwise, however morally commendable. As Dr. Daniel Kahneman correctly noted, “Neglecting valid stereotypes inevitably results in suboptimal judgments. Resistance to stereotyping is a laudable moral position, but the simplistic idea that the resistance is costless is wrong. The costs are worth paying to achieve a better society, but denying that the costs exist, while satisfying to the soul and politically correct, is not scientifically defensible.”

I will not give explicit examples describing the harm and benefit of stereotypes, because that would create many openings for speculation; nevertheless, I will point out controversial issues that may arise during product development.

Suppose we have an online platform where people can rent apartments short-term. One category of users posts ads; the other reads those ads looking for an apartment. Suppose we know that the cities “A”, “B” and “C” have a crime problem, most often arising from the actions of a particular ethnic group. Suppose we also know that members of the same ethnic group are expected in increased numbers in city “D” in a month. Since we worry about the security of our users and want them to have a better experience, it would be logical to minimize their risks by removing cities A, B and C from the listings, citing, of course, not the ethnic group but official crime rates.

But what should we do about city “D”? If we remove it from the listings, citing our expectations, it will be, to put it mildly, politically incorrect and may cause a flurry of criticism against us. On the other hand, if we do not act, many of our users will most likely have a very bad experience, which they will essentially “get” on our website by renting apartments in city “D”. Of course, we can leave everything as it is, wait for city D's crime index to rise, and use that as the basis for removing it from the listings. But, first, this will take a lot of time, and second, it will be clear neglect of product quality for our users. I will not write out options for resolving this situation here; let this dilemma be food for thought.

#46Functional fixedness


The essence of functional fixedness is that using an object in one capacity prevents its subsequent use in a different capacity within the same situation. Prior knowledge that creates the fixation focuses attention on certain aspects of the problem and thus hinders its successful solution.


Using a web application as an example, this bias can become a problem with “multifunctional menus”. Say we have a button on our site that opens a menu of four tools. Suppose we collected data on the use of this menu and found out that in 97% of cases, after opening it, users select the same tool “A”, while the other three remain unused. Given how regularly the tool is used within a single session, we can assume the user is affected by a functional fixation associated with the menu button: he has selected tool “A” immediately after opening the menu so often that new tools added to the same menu might go unnoticed. Understanding this, we can, as one option, remove tool “A” from the menu and “hang” it in a more visible place in the interface. We can also “re-invent” the old menu taking into account another bias, the illusion of novelty.
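The menu analysis described above can be sketched in a few lines; the event log, tool names, and the 90% promotion threshold are all assumptions for illustration:

```python
from collections import Counter

def tool_usage_share(selections):
    """Share of total selections for each tool chosen after a menu open."""
    counts = Counter(selections)
    total = sum(counts.values())
    return {tool: n / total for tool, n in counts.items()}

def promotion_candidates(selections, threshold=0.9):
    """Tools so dominant that they are candidates for being moved
    out of the menu into a more visible place in the interface."""
    shares = tool_usage_share(selections)
    return [tool for tool, share in shares.items() if share >= threshold]

# Hypothetical log: 97 picks of "A" against a handful of others.
log = ["A"] * 97 + ["B", "B", "C"]
print(promotion_candidates(log))  # ['A']
```

The same counting could of course run over real analytics events; the point is only that the decision (“pull tool A out of the menu”) follows from measured shares, not intuition.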

#47Just-world fallacy


A socio-psychological phenomenon expressed in the belief that the world is fair and people get what they deserve in accordance with their personal qualities and actions: good people are rewarded, and bad people are punished.


Understanding the basis of this belief is very important for creating the right communication with the users of our products. Because of this phenomenon, we must avoid in every possible way statements that society may call “unfair” (regardless of what arguments we have in their support and how we relate to them ourselves). We must also emphasize the fairness of our actions in public statements and all kinds of release notes, thereby increasing our audience's sympathy for our decisions. If we do not greatly outperform our competitors in product quality, the “fairness” factor can play a key role in potential users' choice of product.

#48Authority bias


The tendency to ascribe greater weight to the opinion of an authority figure (regardless of its content) and to depend on that opinion to a greater extent. According to this effect, people usually have a deep-seated respect for authority and tend to obey when an authority figure demands it.


Understanding the mechanics of this effect is important in the context of applications with public figures who come in contact with users, such as administrators and moderators. The effect also matters for applications with arbitration or escrow functions.

To resolve problems between users, in some cases, the authority will be the support staff. For the successful use of this effect, it is advisable to secure a reputation of a “fair” company, which, in turn, is possible only if communication with users was previously carried out taking into account the just-world fallacy bias.

If the company lacks authority of its own, you can refer to other authoritative sources. Incidentally, this effect is the reason expensive brands spend millions of dollars to get positive opinions about their products from world-class athletes and other celebrities (we are not talking about football players advertising perfume here, but rather about runners advertising sneakers).

#49Automation bias


The tendency of people to favor suggestions from automated systems and to ignore contradictory information obtained without automation, even if that information is correct.

A study published in the Journal of the American Medical Informatics Association reached an interesting conclusion about this bias: the position and prominence of recommendations appearing on the screen can affect the likelihood of misplaced reliance on automation. The most prominently presented recommendations, correct or not, are the most likely to be followed.


According to another study, more elements on the screen can make users less “conservative” and thus increase the likelihood of misplaced reliance on automation.

Understanding this bias allows you to work more carefully on applications where every action can have far-reaching consequences for many people. For example, in one project that we programmed for the General Department of Civil Aviation of Armenia, we designed the interface so that at every moment the user (a GDCA member) understood what stage each request under processing was at. Almost every user action required additional confirmation that clearly described its expected result. Obviously, for a mass-market application we would have avoided such “intrusiveness” by removing the “excessive” confirmations and questions.

In some cases we might want to increase our users' faith in the reliability of our automation. This can be achieved by analyzing their behavior and sending them various “tips” derived from machine analysis.

#50Bandwagon effect


This effect is also known as the "imitation effect." This is a form of group thinking, shown in the fact that the popularity of certain beliefs increases as more people accept them.


This effect is very important to understand when working on an application aimed at the broad masses. Online stores like Amazon use it in recommendations like “Together with this product, our users choose %product_name%”. A more “aggressive” use, also found among online retailers, is “84% of our users choose %product_name%”. In fact, such “recommendations” differ from “Based on your purchases over the past week, we believe you will like %product_name%” only in that they exploit the effect of joining the majority. In all cases the system analyzes the information in the same way; only the form of its presentation changes.
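A trivial sketch of how such a majority-framed message could be generated; the function name, the placeholder product name, and the user counts are all made up for illustration:

```python
def bandwagon_message(product, choosers, total_users):
    """Frame a recommendation around the share of users who chose it."""
    share = round(100 * choosers / total_users)
    return f"{share}% of our users choose {product}"

# In our hypothetical data, 840 of 1,000 users picked the product.
print(bandwagon_message("%product_name%", 840, 1000))
# 84% of our users choose %product_name%
```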

In highly specialized applications, the use of this bias may not have any effect, and in some cases, it can even be harmful (for example, recommendations on “choosing the majority” will be extremely inappropriate in medical applications where the users are doctors).



#51Placebo


A substance with no therapeutic value, used to simulate a drug in studies where the evaluated effect may be distorted by the patient's belief in the drug's effectiveness, or to improve the patient's well-being when a more effective drug is not available.


I suppose it may seem very strange to the reader to find a medical term on our list, and yet it is not here by accident. Understanding the placebo and how it works can be beneficial when working on a product. Let me give an example.

Those who have worked for a long time in the support service of a technical company know the difficulties of working with older users. If such users have any technical knowledge (most often obsolete), they insist on the single correct solution (as they understand it) that should fix everything in the current situation.

Sometimes, instead of a stubborn confrontation with such users, where the argument is between your “It doesn't work like that” and the user's “I set up Linux systems back in '99, I know how it works”, you can create a kind of technical “placebo” for the user, thanks to which he will stop being nervous, and our employee can solve the problem the way the user wanted. Of course, the placebo itself can take very different forms and depends heavily on the situation.

For example, when I was working as a system administrator 8-9 years ago, I noticed that some users on my networks (adults 50+ years old) got very annoyed when certain web pages did not load instantly. They invariably concluded that the problem was in our network, and attempts to explain that the server they were accessing at that moment might simply be under load were pointless. For such users, I created on their desktop a .bat file with a single command: “ping gmail.com -t”. When launched, it opened a small console window and sent requests to the Gmail server, receiving responses and reporting the time each round trip took. I did not limit the number of requests, so once the file was launched the user saw new requests and changing numbers every second, in the spirit of “hacker” films of the 90s. This put practically no load on the system and, technically, made almost no sense in the context of my users' problem. When leaving, I gave them one instruction: “Run this file every time you see that there are problems with the network.” The effect of this “medicine” was colossal. Users who had complained daily about “the network” stopped complaining altogether. People worked quietly for months and were happy, and every time they saw me they thanked me for “this technical miracle”, and I thanked them for “the network error they found”.
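For the curious, the .bat file described above would have contained a single line along these lines (Windows syntax; the `-t` flag makes ping run continuously until interrupted):

```bat
ping gmail.com -t
```

The endless stream of replies and millisecond timings is the entire “treatment”: it looks diagnostic while doing essentially nothing.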

Sometimes, letting the user “click” or fill out something can be useful in order to reduce the tension between the user and the company/product.

#52Out-group homogeneity


The perception of members of other groups as more similar to one another, and of members of one's own group as more unique.


In application logic, understanding this bias may allow us to develop a more “personalized” user interface. For example, we can segment and group users by some specific parameter, then show on a user's page the “recommendations” of other users from his group, winning his sympathy along the way. Alas, for obvious reasons I cannot write a detailed example that takes every socio-cultural norm into account.

#53In-group favoritism


Also known as in-group bias. A socio-psychological phenomenon expressed in the tendency to favor members of one's own group over members of other groups. This can be seen both in a group member's externally observable behavior and in how he forms opinions, judgments, etc.


Journalists grasp this distortion best of all, particularly when creating “catchy”, sometimes provocative headlines. In the logic of applications with “common content”, understanding this bias is important for building a competent content-filtering mechanism. The bottom line is to use knowledge of this bias and of out-group homogeneity to show users only the content that produces an “emotional response” in them. Both of these distortions are actively used in social networks, Google search results, and many other products.

#54Halo effect


The result of the impact of a general impression of something (a phenomenon, a person, a thing) on the perception of its particular features. An example is the impression that people with attractive appearance have greater mental abilities.


Understanding this effect is necessary first of all to create a consistent (holistic) application. Although content is the key to everything, the consistency of the application's appearance and the general level of “quality” of the service are no less important. For example, a spectacular interface creates equally “spectacular” expectations in users: the user expects high technical quality from such an application. This correspondence between the “wrapper” and the “features” ultimately produces a high-quality user experience.

If our application has a beautiful interface and functionality, then the user will have an impression of our support service in accordance with the impressions of this interface and functionality. That is why, speaking of user experience, we should understand all the possible interactions of our users with our application and company.

#55Positivity effect


Describes the ability of a person to constructively analyze a situation in which the desired results were not achieved and at the same time continue to remain positive in the analysis, in order to increase the likelihood of future success.

Using social media as an example, scientists have concluded that on Twitter and Instagram users prefer to share positive news. Moreover, these same users are twice as sensitive to positive news as to negative.

Another, private study pointed out that some Instagram users use the application to spread positivity to others, and at the same time feel happier themselves. Positivity in social networks affects not only the person who receives positive comments but also the person who writes them.


I added a description of this effect here not to write a usage example, but rather so that the attentive reader would conduct his own research on the topic. Nowadays, when various events confront us with a huge cascade of negative information and work stress every day, understanding this effect matters first of all so that we present positive information and create “good”, “positive” things wherever possible.

#56Not invented here


A position in a social, corporate, or organizational culture that avoids using existing developments, research, standards, or knowledge because of their external origin. The reasons for refusing to use the labor of others vary: a lack of understanding of someone else's work, unwillingness to recognize or appreciate it, jealousy, or a wider “war for territory”. As a social phenomenon, this philosophy manifests itself as a reluctance to accept an idea or product because it comes from a different culture.


A more down-to-earth example is the “wars” between Android and iOS users. In product development, understanding this social position matters to a product manager first of all so that he can cleanse himself of such biases, because such “rejection” deprives the manager of flexibility, and a manager deprived of flexibility becomes a “fifth wheel”. Ideally, at the very beginning of work on our application, it would not hurt to explain to our colleagues and partners that we should avoid rejecting things on the principle of “someone else's development”, since this is fundamentally counterproductive. In programming, to the surprise of many managers, a wide variety of programmers share the “rejection” position and are divided into camps a la GitHub vs GitLab, Bootstrap vs Ant Design, etc. Studying the roots of these problems, and having the arguments ready to reconcile the parties in case of conflict, is the direct responsibility of the manager.

#57Mental accounting


Describes the process by which people code, classify, and evaluate economic outcomes. A person can hold several “mental accounts” in his mind for the same resource. For example, a person may consider eating in a restaurant and buying groceries to be different expense items and budget for them independently, even though both are food and both require money. Similarly, people as a rule spend more on a purchase when paying with a credit card than with cash, because they compare the cost of goods against a small pool of resources (banknotes in a wallet) versus a large one (money in a bank account). In the second case, parting with money is easier, since the sensitivity threshold is lower.


The use of mental accounting in a product can be illustrated with computer games. In a number of “MMOG” games, developers create different types of in-game currency and, for various reasons, convince players to spend real money. The logic of such games is to get the player to create several distinct “mental accounts”: one account for the monthly subscription that grants access to the game, another “mental account” waiting for discounts on in-game currency, and a third “mental account”, kept “for a last resort”, in case the game studio unexpectedly announces a large discount on in-game weapons. With this approach, the likelihood of the player “pouring” additional money into the game increases, because he is making “different” purchases.

In-game currency is a separate, interesting object of study. Say, when a player buys 5,590 “gold coins” for $4, he may, contrary to common sense, consider this “incredibly profitable”, provided, of course, that something “significant” can be bought with those coins. Creating precisely such “values” is one of the jobs of a game studio's product manager.
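A quick back-of-the-envelope check of that framing (the $4 bundle of 5,590 coins comes from the text above; the 5,000-coin item price is a made-up assumption):

```python
coins_bought = 5590
bundle_price_usd = 4.0
usd_per_coin = bundle_price_usd / coins_bought  # a fraction of a cent per coin

# An item priced at 5,000 coins "feels" affordable in coin terms,
# yet consumes almost the entire $4 bundle in real money.
item_price_coins = 5000
item_price_usd = item_price_coins * usd_per_coin
print(round(item_price_usd, 2))  # 3.58
```

The opaque exchange rate is exactly what lets the coin price and the dollar price live in two different mental accounts.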

Of course, an understanding of mental accounting is useful not only for the gaming business.

Let's say we have an online store. In addition to spending on the goods themselves, we can push our users to create new “mental accounts”. For example, we can sell status (Amazon Prime) or annual extended warranties, creating in advance a “peace of mind for my purchases” account in the minds of our users, or sell access to “exclusive” broadcasts (also a form of status, PlayStation Now).

The very task of “lowering the sensitivity threshold” of the user base in order to ease “parting with money” is significant enough for a business to justify studying “mental accounting” and the other distortions documented by Richard Thaler.

#58Normality bias


It is the tendency of people to believe that things will function in the future as they normally functioned in the past, and therefore underestimate both the likelihood of a disaster and its possible consequences.


For us, in the logic of developing an application, understanding this bias is important so that we prepare in advance for negative consequences, should they befall the user. For example, we can automatically back up our users' data whenever something happens that goes far beyond regular user behavior (e.g., a request to delete all data in the system). In part, an understanding of this distortion has led various investment platforms to implement loss-limiting “fixation” functionality for “catastrophic” developments.
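A minimal sketch of such a safeguard; the action names and the backup hook are assumptions for illustration:

```python
DESTRUCTIVE_ACTIONS = {"delete_all_data", "close_account"}

def handle_request(action, user_id, backup_log):
    """Snapshot the user's data before any action that falls far
    outside regular behavior, so the consequences stay reversible."""
    if action in DESTRUCTIVE_ACTIONS:
        backup_log.append((user_id, action))  # stand-in for a real backup job
        return f"backed up, then performed {action} for user {user_id}"
    return f"performed {action} for user {user_id}"

backups = []
print(handle_request("delete_all_data", 42, backups))
print(backups)  # [(42, 'delete_all_data')]
```

The point is that the guard runs unconditionally: the user who believes “nothing bad ever happens” never has to opt in to the protection.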

#59Survival bias


A form of selection bias in which there is plenty of data on one group of objects (the “survivors”) and practically no data on another (the “dead”).


In the logic of working on an application, understanding this error is critically important when compiling cohort user groups for their subsequent analysis. Also, remember this error when analyzing existing information. A simple example: we developed two web pages, which were advertised differently in the media. The goal of both pages is to attract users and encourage them to register. One of the pages focuses on the “A” service, the other on the “B” service.

If after two weeks of “promotion” we have a tremendous amount of information on one of the pages (say “A”) but practically none on the other (“B”), then decisions made on the basis of data from only one page (service “A”) may be wrong. Even if those decisions turn out to be beneficial, there remains a risk of missing additional benefits, ignored because the reasons for the “failure” of the service “B” page were never sufficiently analyzed.

An attentive reader will notice that understanding this error is, in fact, important when setting up almost any A/B test.
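The caveat above can be reduced to a simple guard: before drawing conclusions, check that every cohort produced enough data to reason about. The cohort names and the minimum-sample threshold are assumptions:

```python
def thin_cohorts(cohorts, min_samples=100):
    """Names of cohorts with too little data to support conclusions,
    i.e. the 'dead' we are tempted to ignore."""
    return [name for name, samples in cohorts.items() if len(samples) < min_samples]

cohorts = {
    "page_A": list(range(5000)),  # plenty of signup events
    "page_B": list(range(12)),    # almost no data
}
print(thin_cohorts(cohorts))  # ['page_B']
```

A non-empty result is a signal to investigate why page “B” failed before acting on page “A”'s numbers alone.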

#60Subadditivity effect


This is a tendency to evaluate the probability of the whole as less than the probability of its constituent parts.


For example, suppose my colleagues and I estimate the probability of postponing our project's release date at 30%. Under a more detailed analysis, with specific questions such as “What is the likelihood of postponing the release due to technical problems, changing market conditions, team inconsistencies, unprepared partners, etc.?”, we will be inclined to give answers that in total exceed the 30% probability we assumed initially. Since such differences of opinion are dangerous for any team, understanding this effect lets us approach the decomposition of such questions very strictly. Breaking the question into small parts and then analyzing the probability of precisely those parts not only yields “cleaner” information to work with, but also lets the entire team discuss the risks and understand them in the same way.
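The arithmetic of the example can be made explicit. Treating the sub-risks as independent (an assumption) shows why the naive sum of the parts drifts above the original whole-event estimate:

```python
def combined_risk(parts):
    """P(at least one delay cause fires), assuming the causes are independent."""
    p_no_delay = 1.0
    for p in parts:
        p_no_delay *= (1.0 - p)
    return 1.0 - p_no_delay

# Hypothetical per-cause estimates: each looks small in isolation...
parts = [0.10, 0.10, 0.08, 0.07]
print(round(sum(parts), 2))            # 0.35: the naive sum, already above 0.30
print(round(combined_risk(parts), 3))  # 0.307: the consistent combined estimate
```

Reconciling the team around one combination rule (whatever it is) is what removes the gap between the 30% gut estimate and the inflated sum of parts.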

#61The Magical Number 7+-2


Also known as Miller's Law. The pattern according to which short-term human memory, as a rule, cannot remember and repeat more than 7 ± 2 elements.


It is important to understand that an “element” can be an object of any type: an image, text, a number, etc. Beyond memorization, a large number of elements primarily causes the user discomfort by increasing the cognitive load of the interface. It is precisely because of the 7±2 law that almost all applications with a large amount of content work first and foremost on optimizing and categorizing it.
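The categorization this implies can be sketched trivially: split any long flat list into groups of at most seven items (the chunk size follows the 7±2 rule; the menu items are made up):

```python
def chunk(items, size=7):
    """Split a flat list into groups of at most `size` elements so that
    no single group exceeds short-term memory capacity."""
    return [items[i:i + size] for i in range(0, len(items), size)]

menu = [f"item_{n}" for n in range(20)]
print([len(group) for group in chunk(menu)])  # [7, 7, 6]
```

In a real interface each chunk would of course get a meaningful category label rather than an arbitrary cut, but the capacity constraint is the same.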

In corporate culture, understanding this law makes it possible to create more “flexible”, “comfortable” systems of rules and values. For example, the vast majority of modern IT companies have, in addition to unspoken rules, “value systems”. Usually these are flashy, catchy phrases such as “Be Open and Honest” or “Live Like A Team, Work Like A Family”, which most often mean absolutely nothing, not only to employees but also to the companies' own management. The situation with such “values” is complicated by the fact that the HR “specialists” who thoughtlessly copy them from the Internet give no thought to their “usability”; as a result, the number of such “values” varies from 7 to 15+ depending on the company. That may look pretty written on a wall, but it brings the company no value or benefit, simply because it is all too much to keep in mind.

Ideally, any undertaking aimed at results and quality is simply obliged to take Miller's law into account and not inflate the number of elements an employee is expected to remember. This applies both to what happens “inside” the company and “inside” the product.

In a world full of information noise, less is more often better.

#62Illusion of transparency


Cognitive bias expressed in the tendency of people to overestimate the ability of others to understand them, as well as their ability to understand others. Because of this distortion, we think our words are clearer, our feelings are clearer, and our experiences are more visible to outsiders than they really are.


In the logic of application development, an understanding of this bias is necessary primarily for those who work with content. Fortunately, as far as I can tell, modern content writers have already realized that what is obvious to us is most often obvious only to us.

So, even what I am writing right here is in fact much less clear to you than it seems to me. Understanding this forces me to add links to sources on Wikipedia and thereby give you the opportunity to verify the information presented here. In other words, I am essentially solving a good-user-experience problem, even though the “product” I am “creating” is very small. The same understanding leads me to rarely reference famous products in these descriptions, because their life cycles are unknown to me, as are the plans of the companies that own them; so as not to have to rework this material in the future, I try to give examples that will stay “universal” for years (even if they are not as beautiful as they could be).

#63Curse of knowledge


A cognitive bias whereby it is extremely difficult for better-informed people to consider a problem from the point of view of less-informed people.


An extremely important bias, which any product manager should, without exaggeration, remind himself of regularly. Blindness to the point of view of less-informed people is the first nail in the coffin of any product. I am deeply convinced that a wide variety of products have fallen victim to this bias because management either never established a “connection” with users or lost it. This bias matters especially in the context of startups, above all successful ones, which develop rapidly and are forced to process huge amounts of information related not only to users but also to the legal and financial sides of the business. We can partly protect our decisions from this distortion by correctly forming cohort groups and metrics that clearly reflect the mood of our audience.

#64Spotlight effect


A psychological effect consisting in the tendency to overestimate how noticeable one's actions and appearance are to others. In other words, it always seems to a person that he and his actions are “in plain sight”.


In fact, people pay less attention to us and our actions than we think. Nevertheless, an understanding of this effect is important in the logic of those applications where we want to “push” users to perform “public” actions that may be visible to other participants in the application.

Any information on our audience is also important here.

For example, if our project's target audience is shy people, understanding the spotlight effect will lead us to more careful wording in communication with the user. It will also affect the design of application features, their appearance, visual effects, etc. If our audience is people seeking attention, we can act more “confidently”: we can afford more “personal” alerts (“%username% likes you!”) and bolder visual effects. Obviously, in this case we will most likely also build a number of system features that “spur” our users' desire to receive attention.

Understanding this effect in the context of the studied user base also allows us to understand why certain messages or features of our system are interpreted by our users as “boring” or, conversely, “aggressive”, “dangerous”.

#65Illusion of asymmetric insight


A cognitive bias that makes us believe we can see through other people while remaining completely impenetrable to them ourselves. Because of this bias, we tend to believe we have some kind of supernatural ability to understand others.


In product work, understanding this illusion is very important in order to distance ourselves from our own private opinion of our users whenever a product decision has to be based on that opinion.

This distortion is one of the key reasons why, in order to make the right decisions, we must rely first of all on scientific evidence and on data obtained from the system.

The illusion of asymmetric insight = Data First.

#66Hindsight bias


A tendency to perceive events that have already occurred, or facts that have already been established, as obvious and predictable, despite there having been insufficient information to predict them at the time.


This bias was indirectly described by Nassim N. Taleb in his book The Black Swan (the narrative fallacy). Understanding it is important first of all for company management. The most unsuccessful decisions are made after a few “I knew it!”s uttered upon receiving unsatisfactory results. The more a manager is subject to this bias, the more problems he will create, first for himself, then for his team, and, of course, for the product he is working on.

This is one of those biases for whose sake it is worth reading not only the scientific studies in which it was derived but also the book mentioned above.

Another mistake in the same series: outcome bias.

#67Planning fallacy


A cognitive bias associated with excessive optimism and underestimation of the time required to complete a task. The error occurs even when the individual knows that solving similar problems took longer than planned in the past. The distortion affects only forecasts of one’s own tasks; when forecasting how long others will need to solve a problem, the required time is, on the contrary, overestimated.


An understanding of this bias is necessary for managers at any level, since it minimizes the risks associated with the commitments undertaken by their subordinates. The more people involved in a project, and the more individual commitments added to the overall plan, the higher the risk that something will go wrong. Experienced managers always keep backup plans for when something slips the schedule.

#68Pro-innovation bias


The inability to adequately assess the pros and cons of one’s own invention.


In software development, understanding this bias, and how to work with it, is needed to smooth out the “sore points” that arise as work on a product progresses. A good employee (especially a manager) should not cling to his idea to the last. Yet often a careless remark about his idea can turn into an oppositional mood and a stubborn rejection of alternatives. Ideally, a group of thoroughly sane people will never allow this bias to slow down their work, but I have met very few such groups and consider them the exception rather than the norm.

#69Overconfidence effect


A cognitive bias in which a person’s confidence in his actions and decisions is much higher than the objective accuracy of those judgments. It is also expressed in a flattering self-image.


Another bias that should be “fought” with data received from the system rather than with anyone’s opinions (including management’s). Most often, the degree to which this effect influences a person is proportional to the size of his ego. For the same reason, such people are very difficult to work with.

#70Social desirability bias


A term describing a type of bias in survey responses: respondents tend to give answers that, from their point of view, look preferable in the eyes of others. The tendency is expressed in exaggerating positive, desirable qualities and behavior and understating negative, undesirable ones.


A very important effect to understand when working on a product. Whatever the application, we should always strive to create an interface and features whose use does not go against the accepted norms of the socio-cultural environment of our potential audience. We will never convert visitors into active users, no matter how cool the software we wrote for them, if using it involves a step against social norms.

#71Third-person effect


A psychological effect whereby an individual believes that advertising and other methods of persuasion can affect most people, while he himself is less susceptible to such influence. In other words, the third-person effect is an individual’s underestimation of the degree to which advertising and other mass media influence him.


Understanding this effect is important when developing applications for, say, an “intelligent” audience, in particular in the B2B sector and in SaaS/PaaS business models. In these areas it is necessary to understand that simple techniques of “nudging” and “background advertising” are easily seen through and, in some cases, can even cause a negative reaction. The more of a “thinker” a person considers himself, the more clearly this effect is expressed. For the same reason, in a number of companies, the employees responsible for communicating with high-touch clients receive specific instructions and behave much more courteously and carefully in matters related to upselling.

#72Consensus bias


A tendency to project one’s way of thinking onto other people. In other words, people tend to believe that everyone else thinks exactly the same way they do.


First of all, understanding this bias is important for ensuring transparency in the team working on the product. Pluralism of opinion and managerial transparency are the cornerstone of any successful discussion. Whatever question arises, it is better to ask it and receive the expected confirmation than to remain silent, assuming “this is obvious,” and make a mistake in the product design.

Everything obvious is obvious only to ourselves.

#73Hard-easy effect


Predictions about the probability of solving a problem do not always correspond to its complexity. People often overestimate their ability on difficult problems and underestimate it on simple ones.


In application development, understanding this effect is very important wherever the “ease” or “complexity” of the interface is at stake. If a feature we have created seems relatively complex to us, it should be obvious that from the users’ perspective it will look far more complicated still.

This effect is one of the most important reasons to “simplify” the interface wherever it can be done safely.

#74Dunning-Kruger effect


A cognitive bias whereby people with a low level of qualification draw erroneous conclusions and make unsuccessful decisions, while being unable to recognize their mistakes precisely because of that low level of qualification.

This leads them to overestimate their own abilities, while genuinely highly qualified people, on the contrary, tend to underestimate their abilities and suffer from insufficient self-confidence, considering others more competent.


For a team working on product development, understanding this effect is first and foremost necessary to ensure that all team members have a voice. It is better to listen to every participant’s opinion and risk improving the software than to wait until everyone silently nods along with the most confident speaker who made the last proposal at the table.

In the logic of developing a given application, say a SaaS product in the B2B sector, understanding this effect and how it operates makes it possible to “smooth out” the places where users can make mistakes. Many companies miscalculate here, believing that users will point out inconveniences in the program themselves; in practice this happens extremely rarely.

In many cases, if errors do not block the operation of the application, users can use the software for years and never learn that their workflow could have been optimized. Moreover, they will teach their colleagues the same non-optimal methods, until at some point we come up with an elegant solution to the problem, presented so as not to offend them.

#75Barnum effect


Also called the “Forer effect.” A general observation that people rate as highly accurate descriptions of their personality that they assume were created individually for them, but that are in fact vague and generalized enough to apply equally well to many other people.

Factors affecting this effect:

  • The subject is convinced that the description applies only to him.
  • The subject is convinced of the authority of the person who formulated the description.
  • The description contains mainly positive characteristics.


This effect explains the popularity of horoscopes, of the psychological tests so widespread on social networks, and of various comparison applications of the kind “Your facial features resemble Cleopatra’s and, of course, make you the owner of an extremely insightful mind!” In other words, no pretext for reading something flattering about ourselves will ever be perceived as “superfluous.”

A huge number of applications already exploit this effect with high efficiency. Examples, in the form of thousands of social applications like those mentioned above, can be found literally a couple of swipes into our Facebook news feed.

This effect is also used with high profit by sales specialists, for whom flattery toward the client is another “weapon” in the agent’s arsenal.

#76Illusion of control


A cognitive bias expressed in people’s tendency to believe they can somehow influence events that are objectively independent of them, or that depend on them to a much lesser extent. The effect appears when a person is interested in a positive outcome of an event and is somehow involved in it, or knows a favorable outcome in advance.


This bias is often used by application developers who want to “engage” users “in development.” For example, a company can put three options for the platform’s further development up for a user vote, wrapped as “only you can tell us what to do next.” This creates the illusion of “control over the fate of the project” among users, and turnout in the vote is likely to increase significantly. I call this situation an illusion because the options laid out for public voting usually pass very strict filtering, are sometimes very similar to each other, and in 100% of cases, whatever users choose, fit the company’s business plans perfectly. The indirect benefit of creating such illusions is that we get to know our audience better (depending on the presentation form) and learn how the options are prioritized through our audience’s eyes, which lets us organize our product backlog more optimally.

#77Illusory superiority


A cognitive bias manifested in the tendency to exaggerate one’s strengths and downplay one’s shortcomings in comparison with other people.


D. Carnegie called flattery a counterfeit that leads to trouble. Scientists have found that even when we are being openly flattered, and even when we are fully aware of it, we are nevertheless pleased. Illusory superiority is one possible explanation of this phenomenon and, most importantly, a reason why flattery should be kept as another “weapon” in the arsenal of both our sales specialists and our support specialists.

#78Risk compensation


Also referred to as the “Peltzman effect.” A cognitive bias in which an abundance of protective devices and safety regulations increases the risk of accidents by creating a false sense of invulnerability.


Understanding this effect is important when creating applications where safety regulations are critical. For example, in a project we developed for the General Department of Civil Aviation of Armenia (GDCA), this effect was taken into account both by me and by our designer, with the goal of creating an interface that fosters users’ sense of responsibility for their actions and does not let them assume that “the system will do everything itself.” Taking this effect into account is especially important when it comes to users in the older age group (50+), because their threshold for an “overloaded” interface is much lower, while their learning cycle for the application is longer.

#79Hyperbolic discounting


A cognitive bias manifested in people’s tendency to prefer smaller but immediate rewards to larger ones received some time later.


Obviously, this bias primarily benefits applications whose logic includes the concept of an “instant reward.” Various casinos and betting platforms benefit from it the most.

In other applications, understanding this bias can, for example, help us encourage users to try a “beta” product, that is, to become our testers.

For example, Atlassian, in its Confluence and JIRA software, exploits this bias by encouraging users to “Try our beta version of the new interface right now!” Such calls are usually “spiced” with an immediate benefit of use, such as “89% of our users who tried the new interface did not return to the old one.” The “big reward” in this situation is that once this “interface” or other feature leaves “beta” status, it is expected to work reliably, and in case of any malfunction we can demand something from the vendor.

I believe the attentive reader will recall a number of such “nudges,” when we were lured into trying something by a momentary benefit.

#80Appeal to novelty


The fallacy of prematurely asserting that an idea or proposal is correct or better solely because it is new or modern.


A widespread bias that causes problems for many companies. Its first victims are departments whose managers come up with the “brilliant idea” of integrating some new management framework with a hard-to-pronounce name, ignoring the fact that documented successes of its use are nowhere to be found. If this fallacy is characteristic of a company’s top management, it usually costs them dearly, because an employee lower on the hierarchical ladder rarely dares to express public dissatisfaction with a “new” approach coming from the “very top.”

In application development, understanding this fallacy gives us a wide variety of opportunities to induce our users to adopt new functionality in our system.

For example, suppose we work on an online platform whose customers are other companies (a B2B model), and the end users of our product are those companies’ mid-level managers. When releasing new functionality, we can write the release notes so that it is easy for our users (the managers) to extract their “own” arguments from the text, arguments they can take to their management to propose buying our higher-level service plan. By appealing to novelty, beautifully presented, we turn our users into our “agents,” who will themselves take care of the upsell in the long term.

The example may be convoluted, but the attentive reader, I suppose, has grasped the essence.

#81Escalation of commitment


Also called “escalation of participation” or “escalation of obligations.” A psychological behavioral pattern in which an individual or group, encountering the negative consequences of a decision or action, continues with it nonetheless.


In software development, this bias appears when management makes a decision and later, unable to accept “defeat,” keeps modifying it, aggravating the situation more and more.

For example, you are a product manager, and your immediate supervisor proposed creating a new option that lets your users download pictures from paid sources. Your argument that the vast majority of your users have no accounts on such paid portals was met with “when they see this option, they will start using these portals more.” Your other argument, that this helps the cause of those paid portals more than our own product, was ignored.

Time passes. The new functionality is developed and added to the application. A little more time passes, and the data makes clear that no one needs the option. You suggest removing it, because it is unneeded and clutters the interface; however, your manager, in order not to lose face and being unable to admit his mistake, proposes modifying it instead. Moreover, he suggests you contact the paid portals and offer a $5 bonus to users who register there through our application, so they can immediately “buy” several images and use them in our app.

To your argument that we are still helping other products at the expense of our own, your manager snaps that “you just don’t see the whole picture!” There is a wide variety of examples where an unchecked ego became the cause of an irrational approach, and understanding this is necessary to ensure a healthy working atmosphere in a team.

#82Generation effect


A mnemonic effect: information completed (generated) by a person himself is memorized better than similar information merely presented for reading, viewing, or listening.


The existence of this effect is in itself a strong argument in favor of keeping logs (reports) of information. If the application we are developing is multifaceted and lets users make mistakes in various small details, it makes sense to consider a transparent logging system that records the user’s actions and shows them either to the user or to the account owner. Otherwise, if the user generates an assumption that is not valid and believes in it, convincing him without technical evidence will be extremely difficult. Working with the logs themselves and their presentation is a separate, extensive, and exciting topic.
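
As a minimal sketch of such a transparent action log (an in-memory store with hypothetical method names, not a production design):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ActionLog:
    """Append-only log of user actions (hypothetical names, in-memory only)."""
    entries: list = field(default_factory=list)

    def record(self, user: str, action: str) -> None:
        # Record who did what and when, so a dispute can be settled with
        # evidence instead of arguing against the user's generated recollection.
        self.entries.append({
            "at": datetime.now(timezone.utc).isoformat(),
            "user": user,
            "action": action,
        })

    def for_user(self, user: str) -> list:
        # The slice of history we would show to the user or the account owner.
        return [e for e in self.entries if e["user"] == user]

log = ActionLog()
log.record("alice", "changed billing address")
log.record("bob", "deleted draft invoice")
print(len(log.for_user("alice")))  # 1
```

In a real system the same idea would sit on durable storage, but the point stands: the log, not anyone’s memory, is the arbiter.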

#83Loss aversion


The negative utility associated with losing an object is greater than the utility associated with acquiring it. Simply put, people are more upset by the loss of a thing than they would be pleased by finding it.


A very important bias to understand when it comes to monetizing our application.

By “loss” in this case one should understand not only money but also disappointment with a choice.

For example, we have developed a large platform that we are going to offer to the market under the SaaS (Software as a Service) model. We are considering developing many tariff plans, since the breadth of our functionality allows it. Our colleagues propose making 8 service plans, arguing that more options mean more room for users to choose: plans for every audience, so that any user will find something to his liking.

Besides clearly violating Miller’s law (the 7 ± 2 rule), our colleagues ignore the fact that a large number of plans is a large number of opportunities to make a mistake. Since we charge for subscriptions from the very start, where we see “8 flexible service plans,” our users will see “1 correct option and 7 wrong ones.” Loss aversion will do its work, and the user will simply go to competitors where everything is “simpler.” Incidentally, this is one of the reasons many companies package their market offers into 2-3 service plans, or (if the product is small) go even simpler, toward the Freemium model, which gives the user a choice between a free and a premium version of the product.

It is also worth adding the words of Dr. Daniel Kahneman himself, one of the authors of the key studies of this bias:

“In mixed games, where both wins and losses are possible, loss aversion leads to choices that maximize risk aversion. A significant rejection of losses is present even when the amount you risk is negligible compared to your wealth.”

Here we can safely interpret “mixed games” as any uncertain situation in the app where a choice must be made with incomplete information.

#84IKEA effect


A cognitive bias in which people place a disproportionately high value on goods they participated in creating themselves (for example, by assembling them from parts).


In the logic of developing a certain application, the benefit of understanding this bias can be illustrated as follows.

Suppose we decided to create an online platform where anyone can post their photos for sale. Our benefit is a 30% commission. Everything is extremely simple. Sound familiar? However, unlike Shutterstock, we have no hype behind us and are just taking our first steps. We ran a marketing campaign, gathered a number of users, and now people have begun to place their ads. What are we waiting for?

Consistent with the IKEA effect, users will put such huge prices on their photos that no one will buy them. At some point the lack of sales will drive the photographers off the site, and we will sit at a loss, trying to understand what went wrong.

A correct implementation of such an application (as one option) would restrict the maximum price of photographs depending on their resolution. Another option would be to link the price of the photos to the user’s rating, which would increase every time someone buys his photos, as well as when his photos are “liked” by other “sellers” (to minimize the possibility of rating “cheating”). A more “democratic” option is to show the “market value” of the product, which, of course, is not an option if we are among the first on the market (it might simply lower our profits).
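
A price ceiling tied to resolution and seller rating could be sketched like this; the formula and all numbers are invented for illustration and are not taken from any real marketplace:

```python
def max_listing_price(megapixels: float, seller_rating: float) -> float:
    """Cap a photo's asking price (hypothetical policy, illustrative only).

    The base cap grows with resolution; a 0..5 seller rating can raise it
    up to 2x, so proven sellers earn pricing freedom while newcomers
    cannot overprice their work out of the market.
    """
    base_cap = 5.0 + 2.0 * megapixels  # e.g. 12 MP -> $29 base cap
    rating_multiplier = 1.0 + min(max(seller_rating, 0.0), 5.0) / 5.0
    return round(base_cap * rating_multiplier, 2)

print(max_listing_price(12, 0))   # newcomer: 29.0
print(max_listing_price(12, 5))   # top-rated seller: 58.0
```

The exact curve matters less than the principle: the system, not the seller’s inflated valuation, sets the upper bound.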

Another example where the IKEA effect partially applies.

We should be especially careful about the parts of the application where the user has “invested” more of his own time. If somewhere in the application the user spent 15 minutes filling out a form or taking a “test,” it is vital to eliminate errors in recording the results.

For example, a user spent 5 minutes filling out his profile information to make it more “presentable,” and after he clicked “Submit,” a server error prevented us from saving the data. Those 5 minutes may not seem critical to us, but since they were actual “building of his own profile,” into which the person invested a certain amount of mental energy, if the data is really lost, the likelihood that the user will be lost along with it is extremely high.

#85Unit bias


The tendency of people to want to complete the tasks assigned to them. People want to finish any given portion of a task, regardless of its size; the feeling of completion gives them satisfaction.


The bulk of the research on this bias was conducted in the context of healthy eating. Studies found that people tend to follow an established “norm” of food consumption even when that “norm” is excessively large for them. These studies also concluded that, in the fight for a healthy lifestyle, it would be best to limit the actual serving sizes of dishes.

I included this bias in our list because the same “completion desire” is found among office workers at all levels, and because understanding this distortion can bring serious benefits in product development.

The simplest illustration of this effect in office work is the employees’ work environment. Trello, JIRA, Asana, Basecamp, and so on: everywhere there are Kanban boards, tasks, and columns that only we are responsible for, or tasks to which we are specifically assigned. People who have worked on various projects for a long time will recognize the desire to “read all letters in Gmail,” “view all notifications in Slack and Skype,” and “complete all tasks in this column before the end of the working day.” Not every employee shows the desire to finish the “portion of daily work,” but over my years of work I have seen it in enough colleagues to include this bias in the list with confidence.

Now about the benefits. For management, an understanding of this bias invites reflection on the “optimal portion” of tasks that lets an employee “self-actualize” during the working day without burning out. How to do this is a topic for a separate article.

As for the product, understanding this tendency is useful in any application where the user works with lists or queues of requests. If we want to “push” our users to “process” requests, then choosing the “right” amount visible at one time, and adding page numbering, can play into our hands.

Again: if the user sees that “34” of some kind of “request” are waiting for him, he will be far less willing to take up the “task” than if we “stretch” those 34 requests across several pages of 6 requests each.
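
A minimal sketch of that “stretching”: split the queue into small pages so each page reads as a completable unit (34 requests do not divide evenly by 6, so the last page is shorter):

```python
def paginate(items, per_page):
    """Split a work queue into small pages so each page feels completable."""
    return [items[i:i + per_page] for i in range(0, len(items), per_page)]

requests = list(range(1, 35))   # the 34 pending requests from the example
pages = paginate(requests, 6)
print(len(pages))               # 6 pages
print(len(pages[-1]))           # the last page holds the remaining 4 requests
```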

#86Zero-risk bias


The tendency to choose a strategy in which one of several risks disappears completely over a strategy that partially reduces several risks, even when the latter yields a lower overall risk. This cognitive bias is widely seen in decision-making areas such as investing, healthcare, the environment, and public safety.


This bias explains why people buy extended warranties for various products and online services. The simplest example: a 99.99% uptime guarantee, where the company “guarantees” the availability of the service in exchange for an additional fee. Nowadays this service is given away as a “bonus” in contracts, because nobody buys it separately, but 10-15 years ago it was a popular upselling point for many companies. The idea was that once the user imagined a disaster caused by the service being unavailable at some point, that was enough to pay a few extra dollars for a “guarantee.”
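
To see what such a guarantee is actually worth, it helps to translate an uptime percentage into concrete downtime; a quick back-of-the-envelope conversion (assuming a 365-day year, ignoring leap years):

```python
def yearly_downtime_minutes(uptime_percent: float) -> float:
    """Allowed downtime per (365-day) year for a given uptime guarantee."""
    minutes_per_year = 365 * 24 * 60  # 525,600 minutes
    return round(minutes_per_year * (1 - uptime_percent / 100), 1)

print(yearly_downtime_minutes(99.9))    # 525.6 minutes (~8.8 hours) per year
print(yearly_downtime_minutes(99.99))   # 52.6 minutes per year
```

The jump from 99.9% to 99.99% removes only about eight hours of possible downtime per year, yet the “zero-risk” framing makes the extra nine feel worth paying for.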

Understanding this bias helps you see where additional business benefit can be gained by offering zero risk in a procedure that normally contains some.

#87Processing difficulty effect


Information that requires more time to read and process is remembered better.


The effect is still not well understood, but it can plausibly be explained by the fact that the more time you invest in something (here, in processing the semantic load), the better it is remembered. Research on the subject is also complicated by the factor of awareness (mindfulness) during information processing, which is a separate topic.

In application development, understanding this effect gives us the opportunity to shape the text wherever we need a person to better understand or remember the meaning of an action.

The best example: contextual buttons. Instead of a generic “Next” button, contextual buttons with labels like “Continue and delete my data” or “I understand that by clicking this button I make my data public” are used more and more often.

#88Endowment effect


A psychological phenomenon in which a person values the things he already owns more than the things he could acquire.


Understanding this effect is extremely important when planning “changes” to existing functionality. Creating new functionality is good, but “taking away” from the user something he already had (especially if it was in plain sight) is a risk. It is very important to understand that the “taken” functionality may never have been used by the user at all; it may simply have been in front of him. Nevertheless, by removing it we can cause the user emotional pain. This applies in particular to options that can be described as “nice to have.”

Understanding this effect also leads us to “hide” functionality under test in different corners of the application, so that if we decide to remove it as inefficient, users will not be hurt. This is especially true when we expect a feature to appeal to, say, 20% of users, but set the bar for keeping it in the system at 50%.
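
That “higher bar” rule can be written as a tiny decision function. The 20% expectation and 50% keep-bar come from the text above; the function name and the safety factor are my own illustration:

```python
def keep_feature(adoption_rate: float, expected_rate: float,
                 safety_factor: float = 2.5) -> bool:
    """Decide whether a quietly-tested feature graduates into the product.

    Hypothetical rule: if we expect ~20% of users to like a feature, we keep
    it only if actual adoption clears a much higher bar (here 20% * 2.5 = 50%),
    because removing a shipped, visible feature later hurts users far more
    than never promoting it at all (the endowment effect).
    """
    return adoption_rate >= min(expected_rate * safety_factor, 1.0)

print(keep_feature(0.35, 0.20))  # 35% adoption < 50% bar: quietly retire it
print(keep_feature(0.55, 0.20))  # 55% clears the bar: promote the feature
```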

#89Backfire effect


A bias due to which we persist in our beliefs even when presented with evidence to the contrary.

“At a loss, you become even firmer in your beliefs instead of questioning them. When someone tries to correct you and dispel your delusions, it backfires and strengthens your confidence.”


An extremely important bias to understand, as it calls for special attention when selecting a team for a project. The intensity of this effect is directly proportional to the conceit and inflated ego of the individual.

So, in my opinion, an understanding of this effect is key both for selecting a team and for “cleansing” one’s own picture of the world of illusions.

In the context of working on the application, understanding this effect is important for writing instructions for our user support center. The popular maxim “the client is always right” partly appeals to this bias: do not try to “outsmart” and argue with the client, no matter how wrong he is.

An example of perfect support: Amazon. Even if I contact them and report that, due to the poor quality of the colored crayons I ordered on amazon.com, I cannot summon the devil, they will apologize for my bad experience and offer alternative solutions.

#90System justification


A tendency to protect and maintain the status quo: a preference for the existing social, political, and economic structure, and a denial of change even at the cost of sacrificing individual and collective interests.


Although this distortion describes tendencies not strongly related to product development, I have kept it on our list to show that any communication with our users (blog articles, mailing lists, etc.) is fraught with negative consequences if the text touches opinions that could potentially violate the current status quo. In other words, obvious as it sounds, a company should be engaged in business, not politics or sociology; any text carrying a shade of support for social phenomena, “recent events,” political statements, or economic doctrines carries a big risk.

This tendency is described here to arm you with arguments if you ever need to resist a “strong-willed” decision from management to make some “eccentric” gesture with the business.

Another distortion on the same topic: status quo bias.



#91Reactance


A motivational state that arises when an external condition (another person, a proposal, or a rule) restricts an individual's freedom or threatens to restrict it. The main objective of the resulting behavior is to restore the lost or limited freedom.


Many scientists have studied this bias; to avoid going into detail, I will mention one of the more recent major studies of this phenomenon:

In 2007, a study examined the impact of message wording on the perception of information. The message presented to the subjects concerned caring for one's health. It showed that if the message ended with a phrase that did not demand following the advice in the future, the readers' reactance decreased. It was also found that specific messages formulated with the fewest controlling words were remembered and perceived much better than abstract ones.

For example, a message like "You can cancel your subscription at any time!" shown after we sign up for a paid service is one application of the principle of reactance.

In general, reminding users that they have a choice is always a good move to minimize potential alarm and the resulting reactance.

#92Decoy effect


Also known as the asymmetric dominance effect. It is based on the observation that when choosing between two options, the consumer faces a dilemma whose resolution, as a rule, does not benefit the seller. To simplify the process, a third, obviously unattractive option is introduced; by rejecting it, the client more easily decides in favor of one of the other two options.


This bias is one of the reasons why it sometimes makes sense to have "extra" goods and offers. Suppose we offer the user two products or services on the same web page that look similar and evoke roughly the same desire of possession. A good way to stimulate sales may be to add a third variant of this product/service that is clearly inferior to the other two in quality, but not in price. For example, say we have an online store where we sell various construction tools.

On one page, we offer the user a choice between two sets of screwdrivers. They are almost identical in characteristics, but come in different colors.

Set "A" costs $49; set "B" costs $49 (free shipping in both cases). Suppose that, after taking many conditions into account, including system data showing that users linger on this page for a very long time without making a choice, we add a third, "empty" option: set "X", a copy of set "A" or "B" that costs $85 and does not include free shipping. According to a number of experiments conducted by marketers and scientists, with competent execution, adding such a "dummy offer" stimulates sales of both remaining options (A and B).

Of course, when working with the decoy effect, a wide range of factors must be taken into account, and it is imperative to collect system data with every test to verify your hypotheses.

This effect is often used by many eCommerce platforms, along with anchoring.
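Before running such a test, it helps to check that the candidate decoy really is "asymmetrically dominated": at least as bad as the target on every attribute and strictly worse on at least one. A minimal sketch, with hypothetical product data based on the screwdriver example above:

```python
def is_dominated(decoy, target):
    """True if `target` is at least as good as `decoy` on every
    attribute and strictly better on at least one (here: a lower
    price and/or free shipping)."""
    no_better = (decoy["price"] >= target["price"]
                 and decoy["free_shipping"] <= target["free_shipping"])
    strictly_worse = (decoy["price"] > target["price"]
                      or decoy["free_shipping"] < target["free_shipping"])
    return no_better and strictly_worse

set_a = {"price": 49, "free_shipping": True}   # target offer
set_x = {"price": 85, "free_shipping": False}  # candidate decoy

# set_x is dominated by set_a: more expensive AND no free shipping,
# so rejecting it makes choosing set_a feel easier.
```

A decoy that beats the target on even one attribute is no longer a decoy but a genuine third option, which reintroduces the original dilemma.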

#93Ambiguity effect


The effect suggests that people tend to choose an option whose probability of a favorable outcome is known over an option where that probability is unknown.


Understanding how this effect works is important so that we adhere to clear, concrete wording in every part of the product where ambiguity could create psychological discomfort for the user.

For example, suppose we know that a user who buys a product on our website will receive it with essentially 100% probability: based on the last six months of data, we have not had a single case of a "lost" order. Then adding a label such as "100% delivery guarantee" or "100% customer satisfaction" will gently nudge the user in the direction the business needs.

In fact, to use this effect correctly, we must first of all remind ourselves that what is obvious to us is obvious only to us. And wherever we can emphasize a guaranteed successful outcome, we should do so (carefully, of course).

#94Information bias


It is the tendency to seek information even when it cannot affect our actions.

An example of this bias is when we look for more and more evidence before changing jobs or cutting off contact with someone, "filling" ourselves with more and more information in search of "confirmation" of our decision.

It is indirectly explained by loss aversion.


Understanding this bias is very important when working on a product at moments when an unpopular, painful decision must be made. A person under the influence of this bias can carry out a "comprehensive analysis of the situation" for weeks while, deep down, knowing from the very first day what the right thing to do is.

Understanding the principle of this distortion, we can help teammates avoid falling into it, or climb out of it much faster. We ourselves will be able to make cold-blooded decisions in moments when every hour saves our product from systematic losses. In the long run, a deep understanding of this bias can save us hundreds of hours of our lives otherwise spent on useless analysis of useless information.

#95Law of triviality


Also called the "bicycle-shed effect" (bikeshedding). In its simplest form: members of an organization attach excessive importance to trivial issues.


Understanding this effect is very important primarily because it helps us keep meetings on track. We can ask ourselves many times during a meeting: "Are we discussing what needs to be discussed?", "Is the discussion of this issue timely?".

In the same way, in addition to shifting our colleagues' attention to more important issues, we can monitor how much attention we ourselves pay to details. We can then more soberly assess whether it is reasonable to spend time on the detail under discussion, or whether at this stage it is better to do something else, more significant.

#96Conjunction fallacy


A cognitive bias in which a conjunction of events is judged more probable than one of those events on its own.


In application development, understanding this fallacy lets us build more attractive "hooks" for users. Suppose we develop highly specialized software, and our target audience consists of people who will carefully study us and our competitors before signing up for the service. To lend "credibility" to our technical prowess, we can add an "Our Team" section to the main page of the site, with our photos and descriptions of our most striking technical achievements (these achievements do not even have to concern the product we sell). What matters far more is that the story be coherent and well-composed.

For the same reason, a number of projects with an "Our Team" section add details about the olympiads their colleagues have won and the certificates they hold. At first glance, the fact that a person won the California State Mathematics Olympiad says almost nothing about how well he works in a company offering remote server configuration services. However, as we glance through the pages of the website in search of "evidence" that the product is "good" and, in general, "what we need", noticing such a "mathematics champion", or a manager with an MBA and a PMP certificate, we mentally increase the likelihood that this product is the right choice.

Of course, the design of an "Our Team" page should take into account many more distortions than the conjunction fallacy alone, but I believe the reader has grasped the main idea.

#97Less-is-better effect


A cognitive bias in which, absent a direct comparison of two things, the option of lower absolute value is preferred. Simply put: people prefer a smaller or cheaper item that appears to be of higher quality for its kind.


Understanding this effect matters when we create promotions or give out "bonuses". More often than not, the effect is intuitive enough to be used correctly.

For example, suppose we work at LinkedIn and want to give bonuses to customers to stimulate conversion from "free" to "premium" accounts. Management has approved giving our users 10 days of free premium use. On the one hand, a premium account is an expensive, presentable offering; on the other hand, ten days may be too short, given that many users do not visit the platform every day. Obviously, they will not be able to appreciate our "bonus". This can be a good example of when less is better. Instead of 10 days of premium access, we can give just one of its features, but in a larger quantity (thereby increasing the perceived "quality" of the bonus in users' eyes). For example, we could grant 20 of the In-Mail messages available to premium users, or the ability to view detailed, premium information on 20 companies of the user's choice. This costs much less than full premium, but it is more meaningful to users.

The example is rather crude, but an attentive reader will catch the essence.

#98Implicit stereotypes


Also called "implicit association": the unconscious attribution of certain qualities to members of a certain social group. Implicit stereotypes are formed on the basis of experience and learned associations between specific qualities and social categories, including race and/or gender. Such stereotypes affect people's perception and behavior even if people themselves are not aware of their existence.


Understanding this phenomenon and the human tendency to form such associations is important when working with content and communicating with the audience of our application. No matter how politically correct our speech is, it is necessary to take the current state of affairs into account and maintain neutrality in our public messages.

I partially touched on this topic when explaining Stereotyping.



#99Prejudice


By prejudice is meant a judgment acquired uncritically, without reflection: the irrational components (stereotypes) of social and individual consciousness, such as superstitions and preconceptions. Prejudices are views and opinions based on inaccurate or distorted knowledge, most often taken on faith from the words of other people.


Although the topic of prejudice and superstition is in many ways obvious, I added it to our list because I think it is very important in the context of application development. Knowledge of prejudice may be needed when creating what is called the "persona" of our application: a list of properties of our potential audience. In addition to attributes such as age, gender, occupation, and expectations from the system, depending on the specifics of the product and the region where we work, it is also useful to record the prejudices of our potential users.

I want to emphasize that you need not even dig into the causes of a prejudice, since they are not important here. A prejudice may be the result of cultural values, due to which certain colors are associated with death or happiness among different nations, or of a fear of certain numbers. For example, without much fanfare, in a number of respectable buildings (in Asia, the Middle East), the "13" button in the elevator is either replaced with some kind of picture or absent altogether. This is an example of applying knowledge of prejudice in product development.

#100Fading affect bias


A psychological phenomenon according to which we tend to forget memories associated with negative emotions faster than those associated with positive ones.


Understanding this effect is important if at some point in the development of the application we have made a public mistake. Often, responsible, serious specialists greatly overestimate how long the negative effect of their latest miscalculation will last. Understanding the fading affect bias lets you act, and recover from errors, faster. Of course, we are not talking about cases where the system accidentally deleted a day's worth of user data, and the next day we decide to ship a new release under the headline "Great news!".

#101Peak-end rule


People tend to evaluate an experience not as a whole, but by its quality at its peak and at its end. Most often, customers who were dissatisfied throughout the experience but satisfied at the end rate it positively.


This bias is critical to understand in the context of application development. Any part of our application associated with the completion of an action or a series of actions, including closing the application, should be designed so that the user leaves with positive emotions or, if negativity is inevitable, with as little of it as possible.

It is also useful to familiarize yourself with the fading affect bias and make development decisions with both of these biases in mind.
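The rule is often operationalized in research as the average of the most intense moment and the final moment of an experience. A minimal sketch (the function names and the 0-10 session ratings are my own, not from the article):

```python
def peak_end_score(ratings):
    """Remembered quality of an experience under the peak-end rule:
    the average of the peak moment and the final moment."""
    if not ratings:
        raise ValueError("need at least one rating")
    return (max(ratings) + ratings[-1]) / 2

def mean_score(ratings):
    """Actual average quality, for comparison."""
    return sum(ratings) / len(ratings)

# A session that was mediocre throughout but ended on a high note is
# remembered far better than its moment-by-moment average suggests.
session = [2, 2, 3, 2, 8]
```

For the sample session, `peak_end_score` gives 8.0 while `mean_score` gives 3.4, which is why investing in the closing steps of a flow pays off disproportionately.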

#102Serial recall


Also called "sequential recall": the ability to recall elements or events in the order in which they occurred.


Understanding the principle of serial recall lets us return to our designs from time to time and think about the sequence of actions we expect from our users. We must be sure that the sequence of what happens on the user's screen is logical and coherent, i.e., free of logical inconsistencies and contradictions. This makes it more comfortable for the user to "remember" what happened and minimizes possible discomfort when recalling any action performed in our app.

Adapting the path our user follows through the system to serial recall is another important point to take into account.

#103List-length effect


An effect derived from the study of serial recall: the ability to recall a list in order decreases as the length of the list or sequence increases.


The effect is fairly obvious to most people, and yet applications often neglect it. Besides the fact that the ability to remember and recall long lists deteriorates, another negative feature of such lists is excessive cognitive load, which is, in essence, bad user experience. It is important to understand that this concerns not only "classic" lists, such as options "A", "B", "C", "D", but any lists. For example, a navigation bar consisting of "Our Products", "About Us", and "Contact Us" is easier to take in than "Our TVs", "Our Video Players", "Our Computers", "Our Laptops", and so on, laid out in wide rows. In other words, understanding the list-length effect matters when optimizing and categorizing content.
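The navigation example above amounts to collapsing one long flat list into a short list of categories, so the user faces only a few items at every level. A minimal sketch (the category map and helper names are illustrative, not from the article):

```python
# Group a long flat navigation list under a few top-level categories,
# keeping every visible list short (list-length effect).
CATEGORIES = {
    "Our Products": ["Our TVs", "Our Video Players",
                     "Our Computers", "Our Laptops"],
    "About Us": [],
    "Contact Us": [],
}

def top_level(categories):
    """What the user actually sees in the navigation bar."""
    return list(categories)

# The flat alternative the user would otherwise have to scan.
flat = ([item for items in CATEGORIES.values() for item in items]
        + ["About Us", "Contact Us"])
```

Here the visible list shrinks from six items to three, and each sub-list stays short enough to be recalled in order.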

#104Primacy effect


It is human nature to remember events and phenomena that occur at the beginning of a process better than those that happen in its middle or at its end.


If the peak-end rule shows why we must take care of the "tails" of our application, where the user completes an action or a chain of actions, then the primacy effect explains why it is equally important to pay attention to our users' first acquaintance with the product. The primacy effect lies at the heart of companies' decisions to invest in highly effective product packaging. It can also partially explain the popularity of the wide variety of "unboxing" shows.

#105Serial-position effect


The Serial-position effect is the tendency of a person to recall best the first and last items in a series.


This effect is essential to understand when developing various lists for our users. It is also wise to know something about serial recall and the list-length effect; understanding these three effects makes it possible to build lists whose results meet our goals.

An application of this effect can be seen in Homer's rule of persuasion, according to which the order of the arguments given affects their credibility. According to this rule, to persuade your interlocutor, the arguments must be put in the following order:

  1. strong, preferably rational arguments
  2. medium ones, but not weak
  3. the single strongest argument.

We can use this effect everywhere: writing release notes, appealing to our audience, compiling surveys, and so on.

Of course, if our list consists not of arguments but of, say, our web application's options, we can put the most-used tools at the very beginning and at the end of the list. Much depends on the size of the list, our goals, and many other things.
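Homer's rule above can be sketched as a small ordering function: the second-strongest argument opens, the strongest closes, and the rest fill the middle. The helper name and the sample argument strengths are hypothetical, for illustration only:

```python
def persuasive_order(args, strength):
    """Order arguments per Homer's rule of persuasion: open with a
    strong argument, bury the weakest in the middle, and close with
    the single strongest one (serial-position effect)."""
    ranked = sorted(args, key=strength, reverse=True)
    strongest, opener, rest = ranked[0], ranked[1:2], ranked[2:]
    return opener + rest + [strongest]

# Hypothetical arguments for a product pitch, scored by strength.
strengths = {"price": 5, "speed": 3, "support": 1, "design": 2}
order = persuasive_order(list(strengths), strengths.get)
```

The same shape works for option lists: rank items by usage frequency instead of argument strength, so the most-used tools land at the beginning and end.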

#1Availability heuristics


The tendency to overestimate the likelihood of events with greater "availability" in memory, which can be influenced by how recent the memories are or how unusual or emotionally charged they may be.


In application development, understanding this bias is necessary for consistent interface design, content design, and user communication. If the action we need the user to do is associated with something negative (especially if it's been covered in the media not so long ago), the likelihood that an action will be taken is greatly reduced. Understanding this allows us to design content (text, images, etc.) so that it is associated only with what we need. This bias allows us to reflect on the current world and market conjuncture to choose a better "tone" of our messages.

Another example: Bitcoin and various kinds of ICOs. The topic of cryptocurrencies was boosted negatively in the media so often that, at some point, investors simply decided to avoid everything connected with it without going into detail. Common users realized that being blinded by the extreme volatility of this market did not end well, and the hype on the topic eventually fizzled out. Many high-quality blockchain projects have faced severe difficulties in development because of the deeply damaged reputation of everything associated with blockchain, Bitcoin, and crypto in general.

A final example is why I chose the topics of software design and blockchain technology to describe the availability heuristic. The first topic is obvious to me because of my profession (product manager); the second simply came to mind with ease when I asked myself, "What stream in IT was full of hype and then quickly died out?".