What did I tell you?
…did you misread that line?
That's right, take a closer look.
Is this just an optical illusion? What is going on in our minds when we fall for these tricks? We read right past errors in the text, as in the example above, yet we fully understand what the sentence means.
We often take mental shortcuts, or heuristics, which can lighten the intellectual load and allow us to make decisions quickly and efficiently. Heuristics often lead to cognitive biases which, in turn, can lead to misjudgment, something we want to avoid, especially when building and testing software.
Here, I'll give some background on the psychology behind this phenomenon, describe some common cognitive biases I've encountered in load testing, and offer some practical ways to avoid them.
Biases, biases everywhere
An influential study from a Google research group found that:
For 70% of the mobile landing pages we analyzed, it took more than five seconds for the visual content above the fold to display on the screen, and it took more than seven seconds to fully load all visual content above and below the fold.
What do you think is an acceptable response time for your own website? Is it less than five seconds? Less than seven seconds? Was your answer outside of that range? If so, congratulations: you've avoided one of the most common cognitive biases in load testing.
The anchoring effect is the tendency to rely too heavily on an initial piece of information, known as an anchor, when making decisions. In the thought experiment just described, respondents are more likely to give an answer in the range of five to seven seconds. If no anchor is offered, a completely different range might be chosen.
Do you also see Google as an authority when it comes to this type of information? If you do, you are susceptible to a social cognitive bias known as authority bias. This occurs when we overestimate the legitimacy of an authority's opinion and are therefore more likely to be influenced by it in decision making.
Cognitive biases have their origin in the field of psychology, but they affect our daily work. Knowing that they exist and becoming familiar with them can help us avoid costly mistakes in our performance testing efforts.
Thinking, fast and slow
Why are we subject to these biases? In a nutshell, the phenomenon boils down to the interplay between two modes of thought: fast cognition, our effortless, almost reflexive thoughts; and slow cognition, our more deliberate and purposeful thoughts.
Fast cognition is more susceptible to cognitive biases: our personal assumptions and prejudices can easily creep into these automatic thoughts. But biases can still creep into our slow cognition as systematic errors. Each thought system overlaps and cooperates with the other, and neither is immune to bias. That is why it is so important to develop strategies to identify and avoid cognitive errors.
Common biases and how to avoid them
Hundreds of different biases have been recognized in the field of psychology. I can't cover them all in this post, but here's a short list of biases I commonly encounter in load tests, along with tips on how to deal with them.
Anchoring effect
The best way to combat the anchoring effect in load testing is to always visualize your data.
Demonstrating the importance of data visualization, the Datasaurus is a popular adaptation of Anscombe's quartet: its summary statistics are almost identical to those of other datasets, yet plotting it reveals a dinosaur on the scatterplot.
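To see how identical summary statistics can hide very different data, here is a minimal sketch using two of the published y-series from Anscombe's quartet. Their means and standard deviations agree to two decimal places, even though one series scatters around a straight line and the other traces a smooth parabola:

```python
from statistics import mean, stdev

# Two y-series from Anscombe's quartet (the shared x values are omitted):
# y1 scatters noisily around a straight line; y2 traces a smooth curve.
y1 = [8.04, 6.95, 7.58, 8.81, 8.33, 9.96, 7.24, 4.26, 10.84, 4.82, 5.68]
y2 = [9.14, 8.14, 8.74, 8.77, 9.26, 8.10, 6.13, 3.10, 9.13, 7.26, 4.74]

for name, ys in (("y1", y1), ("y2", y2)):
    print(f"{name}: mean={mean(ys):.2f}  stdev={stdev(ys):.2f}")
# Both report mean=7.50 and stdev=2.03, yet scatterplots of the two
# series against the shared x values look completely different.
```

If you anchor on the summary numbers alone, the two datasets are indistinguishable; only the plot breaks the anchor.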
Authority bias
The authority bias is often exacerbated by the belief that obedience constitutes correct behavior. This is a good example of how the systemic nature of cognitive biases can affect our slow cognition. As software testers, it can be especially difficult for us to question authority.
To combat this bias, identify the assumptions you make when formulating hypotheses and question whether any are made in deference to authority. That authority could be the specification document, the organization you work for, a recognized arbitrator like Google, or even yourself. You should question authority in all cases rather than mindlessly give in to a potentially biased opinion.
The post hoc fallacy
Beware of the logical fallacy post hoc ergo propter hoc: "after this, therefore because of this."
Consider the site reliability engineer who sees a spike in database write performance on a daily basis, followed by a brief site outage in the early hours of the morning. The assumption might be that since the outage follows the database spike, the outage is caused by the spike; in other words, a correlation seems to suggest causation. I often liken this error to looking for symptoms instead of the root cause.
An experienced engineer can avoid this bias by considering other factors that could potentially be responsible for the results in question, thus ruling out a false connection.
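A hypothetical sketch of how a shared cause can make two unrelated metrics move together. The scenario and all the numbers here are invented: a nightly batch job (the confounder) drives both database write volume and response latency, so the two correlate strongly even though neither causes the other:

```python
# Invented hourly metrics. The nightly batch job is the real driver
# of both series; the writes do not cause the slow responses.
batch_load = [0, 1, 2, 5, 9, 4, 1, 0]
db_writes  = [10 + 8 * load for load in batch_load]
latency_ms = [50 + 30 * load + (-1) ** i  # small noise term
              for i, load in enumerate(batch_load)]

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson(db_writes, latency_ms)
print(f"writes vs. latency correlation: r = {r:.3f}")
# r is close to 1.0, but removing the batch job (the root cause) is
# what would fix the outage, not throttling the writes.
```

The point is not the arithmetic: a high r between two metrics should prompt a search for a common driver before assuming one causes the other.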
Availability heuristic
The availability heuristic is a mental shortcut that leans on the examples that come to mind most readily, which tend to be anecdotal. For example: "this happened twice in production," or "last time we changed this, it fixed the problem."
These shortcuts can be countered by deliberately activating your slow cognition. I like to use the following mnemonic device when diagnosing production performance issues under pressure:
- Stop and weigh the immediacy of a solution.
- Think about the problem in the context of what has changed.
- Observe what you can measure and what still needs to be measured.
- Plan your next test, making small, observable changes.
Confirmation bias
Confirmation bias is our tendency to seek out and remember information that confirms our existing beliefs or hypotheses. A classic example of this in performance analysis is: "this component was never a problem in production, so surely it's not the root cause." The correct conclusion would be: "we expect this component is not the root cause." We can avoid this bias by changing the language we use to define problems.
Focusing less on being right and more on what can go wrong in your tests will help you avoid confirmation bias in your decisions and conclusions.
Inattentional blindness
This occurs when we are blind to unexpected results that are right in front of us. Many of us have experienced the frustration of spending hours on a performance problem, only to go back to the beginning of the investigation and find we missed some simple, obvious detail, such as a database left at its default maximum connection count or an operating system capped at the default limit of 1024 file descriptors.
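Defaults like these are cheap to check up front rather than rediscover hours into an investigation. As an illustrative sketch (Unix-only, using the standard resource module), this prints the per-process file-descriptor limit, which on many Linux distributions defaults to 1024:

```python
import resource

# Soft limit: enforced right now. Hard limit: the ceiling the process
# may raise its soft limit to without extra privileges.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"open file descriptors: soft={soft}, hard={hard}")

if soft <= 1024:
    print("warning: near-default fd limit; a busy server can exhaust it")
```

A one-liner like this at the top of a troubleshooting checklist costs seconds and rules out a whole class of "obvious in hindsight" findings.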
I think a good counter to this bias is simply to sleep on it. Postpone decision making until the next day, or at least until after lunch. Often, after a night of rest or a spell away from the keyboard, I return to my desk and immediately find the answer to a performance issue.
Ask yourself: "Am I missing something?" and do not rush to jump to conclusions. It's valuable to set up your own mental checklists, as small mistakes like misconfiguration often turn into bigger problems. Using checklists helps ease the cognitive load on our slow cognition and gives fast cognition more room to work with less susceptibility to cognitive bias.
How to lighten the load
Those of us who work in software engineering and related disciplines are bound to encounter a variety of cognitive biases on a regular basis. The strategies I've outlined can help mitigate the threat of misjudgments that lead us to wrong conclusions, but it's also important to choose software that helps ease the cognitive load, which itself aids the prevention of cognitive bias.
Tricentis' continuous testing platform provides comprehensive software testing insight with each release and enables us to make better software quality decisions. The most powerful tool we have when it comes to testing software is our own intelligence, but a good backup never hurts.
Try Tricentis qTest for Jira