AI chatbots hallucinate seahorse emoji

metro.co.uk

AI chatbots incorrectly assert the existence of a seahorse emoji, even though no such character has ever been recognized by the Unicode Consortium. When prompted, ChatGPT repeatedly offers incorrect emoji suggestions and gets stuck in a loop, a phenomenon attributed to its pattern-matching algorithms and a tendency to "hallucinate" plausible but false information. This behavior highlights how large language models can generate confident, incorrect responses when encountering speculative or non-existent concepts found in their training data.
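The claim is easy to verify programmatically: Unicode assigns every character an official name, and a lookup by name fails for characters that do not exist. A minimal sketch using Python's standard-library unicodedata module (the function name emoji_exists is illustrative, not from the article):

```python
import unicodedata

def emoji_exists(name: str) -> bool:
    """Return True if Unicode defines a character with this official name."""
    try:
        unicodedata.lookup(name)
        return True
    except KeyError:
        return False

# Real animal emoji resolve via their official Unicode names...
print(emoji_exists("TROPICAL FISH"))  # True  (U+1F420)
print(emoji_exists("HORSE"))          # True  (U+1F40E)
# ...but Unicode has never assigned a "SEAHORSE" character.
print(emoji_exists("SEAHORSE"))       # False
```

A model trained on text that casually mentions a seahorse emoji has no such ground-truth check, which is why it can confidently suggest lookalikes instead.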


With a significance score of 1.9, this news ranks in the top 29% of the 33,489 articles analyzed today.
