Sunday, May 3, 2026

Artificial intelligence, the dumbing down of the world and bixonimania

Artificial intelligence is changing the world in the same way that the printing press, vaccines and democracy changed the world. Also in the same way that the Black Death, smallpox and state-sponsored terrorism changed the world.

It's either going to be great or horrendous. Or maybe – if we're lucky – somewhere between.

For most of us, artificial intelligence is a clever tool to create a cartoon of ourselves or to make fake videos of our dog talking to another dog. Or maybe it's an easy way to write an email that doesn't include misspellings and has the tone we want.

All pretty good stuff. But it also has the capability of disrupting the world's economic system, eliminating millions of jobs and potentially enslaving humanity. Unless we are already enslaved and don't realize it.

My biggest concern about AI is, of course, existential. AI can write. If it can write better than I can, what am I supposed to do? Get a regular job? Ha ha ha. Ridiculous, right?

Another issue, though, is that AI large language models (such as ChatGPT, Gemini, Claude and others) aren't very discerning and, like people throughout history, don't fact-check before spouting something as truth. Even if it's not true. Even if the source itself says, multiple times, that it's not true.

Take bixonimania.

It's a made-up disease, created by a team of medical researchers in Sweden two years ago to see how AI would handle such a situation. According to an article in the journal Nature, the researchers uploaded two fake studies on bixonimania to a preprint server. Within weeks, people who asked large language model AI tools (like the ones that come with your web browser) about rubbing their eyes and getting slightly pinkish eyelids were told that bixonimania was a possible cause. Peer-reviewed literature even cited the studies (revealing that those researchers didn't check their sources).

This despite the fact that the reports had a made-up author who worked at a nonexistent university in the equally nonexistent Nova City, California. The report thanked people on the USS Enterprise and mentioned Sideshow Bob, from "The Simpsons." It included the phrase "this paper is entirely made up" and described the people in the study as "made-up."

Yet the AI tools cited it as a possible cause of the problem. Other researchers cited it. An obvious trick (as the study's authors point out, "mania" is used only for psychological issues, not eye issues) became accepted medicine.

Bixonimania is now in the ether for good. There's no getting that genie back in the bottle.

Of course, there have always been lies that have been believed despite all the evidence. Some people insist Elvis is still alive. That the moon landing was faked. That the 2020 election was stolen. That the Dodgers aren't the evil empire. They didn't need AI to believe those things.

But we're now living in a world where we increasingly depend on AI. When you Google something, the first response is AI-generated and many of us never go past that.

I guess it's not that much different than when we relied on our family, friends and neighbors to communicate what was true. We believed that razor blades in apples were a real risk at Halloween. That the Procter & Gamble logo reflected Satanism. That eating Pop Rocks and drinking Coke would make your stomach explode.

Now, however, we trust our phones and laptops. We trust artificial intelligence applications that steal other people's work, summarize it and are willing to make things up.

AI is a great tool. It will make the world a fundamentally different place. It will (at least initially) make things easier and more accessible. The big concern is the same one that has existed since the dawn of the Industrial Revolution: Will chasing money (and there are trillions of dollars to be made on AI) mean ignoring red flags that lead to unanticipated consequences? Will the cure for our need for knowledge and information be worse than the disease?

I don't know. My bixonimania is affecting my eyes right now, so I don't want to think about it.

Reach Brad Stanhope at bradstanhope@outlook.com.
