Stories are powerful devices for getting a point across. They close the distance between abstract thought and the way our brains operate through narrative; they impose order on events. With stories we can convey very complex ideas and help others understand them. Even more powerfully, this is how we are wired to understand events and information. But what happens if the story we tell is not the right one? How would you even know? Stories are often far more powerful at bypassing rational decisions than at facilitating them. The human mind is wired to support whatever conclusion it comes to, even in the face of mounting evidence against that supposition. Nassim Taleb describes this error in logic as the narrative fallacy: the need for people to create stories even when we have no evidence that the story is true, or even the best explanation of events.

Taleb’s description is as follows:

The narrative fallacy addresses our limited ability to look at sequences of facts without weaving an explanation into them, or, equivalently, forcing a logical link, an arrow of relationship upon them. Explanations bind facts together. They make them all the more easily remembered; they help them make more sense. Where this propensity can go wrong is when it increases our impression of understanding.

Here is an all too common example of how this plays out in the real world: You run a test and discover that a variant produces a 6% lift in RPV. You also discover that the same variant produces a 5% drop in internal searches. So you tell others that people were obviously finding what they wanted more easily, so they didn’t need search, and so they spent more…

The data does not tell you that; it only tells you that the recipe produced a lift in RPV and a drop in searches. It doesn’t tell you that search and RPV are related, or why someone spent more; it only provides a single data point for comparison between the default and that recipe. Any story you come up with adds nothing to the reason you should make the decision (it raised RPV), and it can set a dangerous precedent for believing that dropping search always raises RPV (it might, but a single data point provides no insight into that relationship).
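The disciplined reading of that result can be sketched as a plain comparison with no causal story attached. The variant names, metric values, and field names below are hypothetical illustrations, not data from the example above:

```python
# Hypothetical test results: revenue per visitor (RPV) and internal
# searches per visitor for the default experience and one variant.
results = {
    "default": {"rpv": 10.00, "searches_per_visit": 1.00},
    "variant": {"rpv": 10.60, "searches_per_visit": 0.95},
}

rpv_lift = results["variant"]["rpv"] / results["default"]["rpv"] - 1
search_change = (results["variant"]["searches_per_visit"]
                 / results["default"]["searches_per_visit"] - 1)

# The data supports exactly two statements -- a lift and a drop --
# and says nothing about why, or whether the two are related.
print(f"RPV lift: {rpv_lift:+.1%}")            # RPV lift: +6.0%
print(f"Search change: {search_change:+.1%}")  # Search change: -5.0%
```

Everything beyond those two printed numbers is narrative supplied by the analyst, not by the data.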

Any set of data can be used to make a story. There doesn’t have to be a connection between the real world and the story we tell, since we are the ones filling in the gaps between data points. Randomness and direct cause are easily confused once we start narrating an action. We love these stories because they make a sequence accessible and easy to hear: first event A happened, then event B, then event C. Our minds instantly race to say that event B happened BECAUSE event A happened, and that because A led to B, naturally C followed. This might be true, it might not be, but the story we tell ourselves grants us an excuse not to understand what really went on. By granting ourselves a story that fills in those gaps, despite its lack of connection to the real world, we eliminate any discovery of the real relationship. A story can completely ignore hundreds of other causes, and it rules out the involvement of chance.

The world is a very complex place, and there is almost never an answer as simple as a short series of events to explain any action, let alone one important enough to justify a business decision. So why do we fall into this trap, and why do we fall back on stories as a tool to make those decisions? We are not actually adding any real value to the information with these stories; we are simply packaging it in a way that advances an agenda.

We don’t actually need stories to make decisions; we only need discipline. Often we find ourselves trying to convey concepts that others cannot absorb in a short period and amid many competing draws on their attention, but this is no excuse for believing the fiction we narrate. Many people base their entire jobs on their ability to tell these stories, not on their ability to deliver meaningful information or change. The reality is that to make a decision, you simply need the ability to compare numbers and choose the best one. I don’t need to know why variant C was better than B; I simply need to know that it was 5% better. Patterns and anomalies are powerful tools and an analyst’s best friends, but we can never confuse them with explanations for those events. Things often happen for very complex and difficult reasons, and while it is nice to feel that we understand them, that feeling does not change the pattern of events.
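That “compare numbers and choose the best one” decision needs no narrative at all. A minimal sketch, with hypothetical RPV values chosen so that C is 5% better than B:

```python
# Hypothetical measured RPV for each variant. The decision is just
# "pick the largest number" -- no explanation of why C won is needed.
rpv = {"A": 10.00, "B": 10.20, "C": 10.71}

best = max(rpv, key=rpv.get)          # variant with the highest RPV
lift_over_b = rpv["C"] / rpv["B"] - 1

print(best)                    # C
print(f"{lift_over_b:+.0%}")   # +5%
```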

One of the main opportunities for groups to grow is to move past this dangerous habit of creating stories and instead establish disciplined, previously agreed-upon rules of action, so that decisions can be made away from the narrative. This move lets you stop wasting energy on these discussions and instead use it to think of better, more creative opportunities to explore and measure the value of. Any system is only as good as its input, so focus on improving the input, and stop worrying about creating stories for every input into that system.
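One way to make such agreed-upon rules concrete is to write them down as code before the test ever runs. The thresholds, sample sizes, and function name here are illustrative assumptions, not a prescribed standard:

```python
# Decision rule agreed on before the test: ship the variant only if
# the primary metric improves by at least the pre-set threshold and
# the test reached its planned sample size. Numbers are illustrative.
MIN_LIFT = 0.02            # minimum lift on the primary metric (2%)
PLANNED_VISITORS = 50_000  # per-arm sample size agreed in advance

def decide(control_rpv, variant_rpv, visitors_per_arm):
    """Return the pre-agreed action; no storytelling enters the decision."""
    if visitors_per_arm < PLANNED_VISITORS:
        return "keep running"
    lift = variant_rpv / control_rpv - 1
    return "ship variant" if lift >= MIN_LIFT else "keep default"

print(decide(10.00, 10.60, 60_000))  # ship variant
print(decide(10.00, 10.10, 60_000))  # keep default
print(decide(10.00, 10.60, 10_000))  # keep running
```

Because the rule exists before the results do, there is no room for a post-hoc story to change the action.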

Because this change is difficult for some groups, many are turning to more advanced techniques in the hope of avoiding this bias. A fundamental goal of machine learning is to remove human interpretation of results and instead let an algorithm find the most efficient option. All of these systems fail when we lose focus and slip back into storytelling, when we let someone’s ego dictate an action based on how well they believe they understand the situation. They also fail when we let our own biases override the system instead of letting it learn and choose the best option. When we free ourselves from storytelling, we gain the freedom to focus on the other end of the system. We no longer need to worry about acting on the data, or about others understanding it; we can instead focus our own and others’ energy on trying new things and feeding the system more quality input.
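The “let the system learn and choose” idea can be sketched with an epsilon-greedy bandit, one of the simplest such algorithms; this is not a claim about what any particular platform uses, and the conversion rates are simulated, not real data:

```python
import random

random.seed(0)

# Simulated (hypothetical) per-visit conversion rates for three variants.
true_rates = {"A": 0.10, "B": 0.12, "C": 0.15}

counts = {v: 0 for v in true_rates}     # pulls per variant
values = {v: 0.0 for v in true_rates}   # running mean reward per variant
EPSILON = 0.1                           # fraction of traffic spent exploring

def choose():
    if random.random() < EPSILON:
        return random.choice(list(true_rates))  # explore a random variant
    return max(values, key=values.get)          # exploit the current best

for _ in range(20_000):
    v = choose()
    reward = 1.0 if random.random() < true_rates[v] else 0.0
    counts[v] += 1
    values[v] += (reward - values[v]) / counts[v]  # incremental mean update

# The algorithm converges on the best-performing arm; no one has to
# explain WHY that arm wins in order to act on it.
print(max(values, key=values.get))
```

The ego-free part is structural: the loop allocates traffic from observed rewards alone, so there is no step at which a narrative can redirect the decision.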

Love your stories, and if you need them to get a point across, don’t instantly remove them from your arsenal. Just don’t believe that they convey anything resembling the cause and effect of the world, and don’t let them be the deciding factor in how you view and act on it. They are color, and they make others feel good, but they add no value to the decisions being made. Be clear with others on how you will act before you ever get to the storytelling, and you will discover that stories are simply color. Every journey is a story; just make sure yours is less fiction and more about making correct decisions.