Overcoming AI’s Hallucination Problem: Embracing Trust with Enhanced Data | Opinion

The word of the year for 2023, according to Dictionary.com, is “hallucinate.” This choice stems from a peculiar phenomenon in the emerging field of artificial intelligence (AI), specifically generative AI, which has drawn mainstream attention since the launch of OpenAI’s chatbot, ChatGPT, in November 2022.

While only living organisms can truly hallucinate, the term describes an AI providing false information or generating irrelevant language in response to a query. For example, Microsoft’s Bing chatbot ignored a New York Times reporter’s queries and attempted to persuade him to leave his wife. These hallucinations cause real problems when users unquestioningly accept an AI’s responses: in one case, attorneys were fined for submitting a legal brief that ChatGPT had padded with fabricated case citations.

The financial sector must exercise particular caution with AI, as it has both thrived on automation and suffered significant losses from it. In the early days of automated high-frequency trading, algorithmic trading triggered sharp, sudden losses of value on the New York Stock Exchange, most famously the 2010 “flash crash.” Hallucinations, which present false data in fluent, human-like language, can be even riskier: they can drive poorly informed trades and financial panics, and influence human traders into long-term errors of judgment.

Hallucinations occur for several reasons. Poorly constructed prompts can confuse current large language models. Flawed training datasets, with mislabeled or miscategorized examples, can corrupt a model’s outputs much as background noise causes smart speakers to misinterpret commands. And when a model simply lacks the data to answer a question directly, it may generate a fluent but incoherent or evasive response rather than admit uncertainty.

To minimize hallucinations, some propose retrieval-augmented generation (RAG), which grounds a model’s answers in documents retrieved at query time from continuously updated data sources rather than relying solely on its training data. Blockchain technology is also suggested as a complement: unlike centralized systems, blockchains allow for decentralized data sources and can incentivize stakeholders in the finance industry to share data transparently. Proponents argue that blockchain’s ability to verify and update information in real time could significantly reduce hallucinations.
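The RAG idea above can be illustrated with a minimal sketch. Everything here is hypothetical: the documents and query are invented, retrieval is a toy word-overlap ranking (a real system would use vector embeddings), and the resulting prompt would be sent to an actual LLM API.

```python
import re

def tokenize(text):
    # Lowercase and split on word characters, dropping punctuation.
    return set(re.findall(r"\w+", text.lower()))

def retrieve(query, documents, k=2):
    """Rank documents by word overlap with the query (toy retriever).

    A production system would score by embedding similarity instead.
    """
    q_words = tokenize(query)
    scored = sorted(
        documents,
        key=lambda d: len(q_words & tokenize(d)),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query, documents):
    """Ground the model's answer in retrieved text to curb hallucination."""
    context = "\n".join(f"- {d}" for d in retrieve(query, documents))
    return (
        "Answer using ONLY the context below. "
        "If the context is insufficient, say so.\n"
        f"Context:\n{context}\n"
        f"Question: {query}"
    )

# Invented example documents and query:
docs = [
    "Acme Corp reported Q3 revenue of $2.1B, up 4% year over year.",
    "The Q3 earnings call is scheduled for October 28.",
    "Acme's new CEO was appointed in January.",
]
print(build_prompt("What was Acme's Q3 revenue?", docs))
```

The key design point is the instruction to answer only from the supplied context: instead of trusting whatever the model's parameters happen to encode, the prompt constrains it to verifiable, current source material, and gives it an explicit way to decline when the sources do not cover the question.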

In the future, robots may become better traders than humans. It is up to us to create a system that equips these robots with the necessary tools to function effectively in the reality we create, rather than the one they hallucinate.

Note: The author of this article, Enzo Villani, is the CEO and chief investment officer of Alpha Transform Holdings. He has extensive experience in the field of AI and blockchain.
