GenAI

Why Businesses Must Bring Hallucinations Into Focus?

McKinsey estimates that businesses can add up to $7.9 trillion in value globally through generative AI by increasing worker productivity. While that number has business leaders dreaming bigger, there is an element of concern that isn't being addressed.

Curiously, this very real and significant concern goes by the euphemism 'hallucinations'. A 'hallucination' is simply an instance in which a large language model (LLM) such as ChatGPT gets an answer wrong, often by confidently presenting fabricated information as fact.

In the context of business, the concerns around hallucinations evolve into something more than concerns: they become consequences. By some estimates, LLMs get as many as 40% of their answers wrong, yet they deliver every answer with 100% conviction.

If you're an individual using ChatGPT in a personal capacity, these errors will rarely have a significant impact, whether or not you catch them. However, when the same language model plays the role of a customer support executive at a bank and gives a customer an erroneous summary of their accounts, the bank might have a lawsuit on its hands.

Every hallucination comes with an associated cost, and if businesses can find a way to navigate the tricky waters of 'hallucination costs', those trillion-dollar goals might be on the horizon.

If AI is predicting the next song on my Spotify playlist, or generating tagline suggestions for my marketing campaign, then the 'hallucination cost' of an erroneous suggestion is negligible. When it comes to customer-facing responses, the costs are much higher. There are ways to estimate these costs and measure the consequences as we adopt new forms of automation, as the sketch below illustrates.
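To make this concrete, here is a minimal sketch of how a team might put a number on expected hallucination cost: multiply the volume of interactions by the error rate and the cost of a single error. Every name and figure below is a hypothetical illustration chosen for this example, not a measurement from any real study or deployment.

```python
# Hypothetical sketch: expected hallucination cost for an AI use case.
#   expected_cost = interactions * hallucination_rate * cost_per_error
# All rates and dollar amounts below are illustrative assumptions.

def expected_hallucination_cost(interactions: int,
                                hallucination_rate: float,
                                cost_per_error: float) -> float:
    """Expected monthly cost of hallucinations for one use case."""
    return interactions * hallucination_rate * cost_per_error

# Low-stakes use case: tagline suggestions for a marketing campaign.
# A bad suggestion costs almost nothing -- a human simply discards it.
taglines = expected_hallucination_cost(
    interactions=1_000, hallucination_rate=0.40, cost_per_error=0.01)

# High-stakes use case: a banking support bot summarising accounts.
# One erroneous summary may trigger complaints, remediation, or legal exposure.
support_bot = expected_hallucination_cost(
    interactions=100_000, hallucination_rate=0.05, cost_per_error=250.00)

print(f"Tagline generator:   ${taglines:,.2f} / month")
print(f"Banking support bot: ${support_bot:,.2f} / month")
```

Even with a far lower assumed error rate, the high-stakes use case dominates the expected cost. That asymmetry, not the error rate alone, is what a board needs to see before approving an automation investment.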

The problem is that very few leaders in the industry today are warning about the impact and cost of hallucinations. The primary purpose of a board of directors is to ensure that the business does not take on undue risk. Conversations about investment in automation must therefore be accompanied by questions about how these risks will be mitigated.

In simpler words, the board must ask a very basic question: 'How much do hallucinations matter here?' That's when they'll see the bigger picture.