Some were not so lucky. An Oregon family was hospitalized in 2015 after eating mushrooms that an identification app indicated were safe, according to news reports. In 2022, an Ohio man became seriously ill after eating poisonous mushrooms that an app had also misidentified. Confidently identifying mushrooms in the wild requires expertise, and technological tools are not yet up to the task, Claypool said.
A variety of new AI-powered mushroom identifiers are now appearing in the Apple, Google, and OpenAI app stores. These tools use artificial intelligence to analyze photos and descriptions of mushrooms and compare them to known varieties. As with past mushroom identification apps, their accuracy is low, Claypool found in a new report for Public Citizen, a nonprofit consumer advocacy group. Yet AI companies and app stores offer these apps anyway, often without clearly disclosing how often their tools are wrong.
Apple, Google, OpenAI, and Microsoft did not respond to requests for comment.
The small explosion of apps is emblematic of a larger trend of incorporating AI into products that might not benefit from it, from tax software to medical appointments. Powerful new technologies such as large language models and image generators are good at some things, but consistently producing accurate information is not one of them. Mushroom identification is a poor candidate for automation because the stakes are high and the AI frequently fails, yet companies are doing it anyway, Claypool concluded.
“They're advertising it like the Star Trek computer, saying, 'This is a source of knowledge,'” he said. “But the reality is these things get it wrong all the time.”
Despite the risks, budding foragers appear to be increasingly relying on apps to identify mushroom species. According to Google Trends, three of the top five searches related to “mushroom identification” involve apps or software. A search for “mushroom” on OpenAI's GPT store, where users find specialized chatbots, quickly brings up suggestions such as Mushroom Guide, which claims to identify mushrooms from photos and determine whether they're edible. In the Apple and Google app stores, you'll find many apps that claim to identify mushrooms, some with “AI” in the name or description.
Last year, in response to a spike in poisonings, Australian scientists tested the accuracy of popular mushroom identification apps and found that even the most accurate app correctly identified dangerous mushrooms only 44 percent of the time.
However, a cognitive bias known as automation bias can lead consumers to quickly place trust in even less accurate AI products. As early as 1999, scientists found that people tend to trust a computer's decisions even when its recommendations contradict their own common sense and training.
In some situations, AI can improve accuracy and outcomes. For example, a February study published in JAMA Ophthalmology found that large language model chatbots were as good as eye doctors at diagnosing and recommending treatments for glaucoma and retinal diseases.
“While promising, our findings should not be interpreted as supporting direct clinical application, given the unclear limitations of chatbots in complex decision-making and the necessary ethical, regulatory and validation considerations,” the researchers cautioned.
When Claypool looked into other AI mushroom identification tools, he discovered that Amazon's bookstore offered what appeared to be AI-generated mushroom field guides. He also tested Microsoft's Bing Image Creator, asking it to generate and label images of different mushrooms. The tool invented mushroom parts that do not exist, labeling them with made-up names such as “Gin” and “Nurupe.” Inaccurate AI-generated images of mushrooms could contaminate future AI training data and search engine results, further reducing the accuracy of these systems, Claypool wrote.
“We are committed to providing a safe shopping and reading experience for our customers, and we take this matter seriously,” said Amazon spokeswoman Ashley Vanicek. The company's guidelines require authors to disclose their use of AI-generated content, and Vanicek noted that Amazon blocks or removes books that don't follow its rules.
(Amazon founder Jeff Bezos owns the Washington Post.)
Lesson learned: Don't eat wild mushrooms without consulting an expert. (A human one.)