bad ui ux

I only recently heard the term AI slop (it seems I've been living under a rock), but now I see it everywhere. I also watched this Kurzgesagt video about the impact of AI-generated content on the web, especially on fact-checking their videos. They point out one thing that is so obvious yet so disturbing that I don't know how I missed it all this time: you already can't trust papers, research, books, etc. online, as they might be generated by AI. Soon, AI will train on this AI content and use it as ground truth, or at least reference it as a source. This will go on until 1) you don't know what is correct and what is made up, and 2) AI content is established everywhere on the web, so that it gets referenced in papers, making it seem even more "trustworthy".

At that point there's no real use for the internet anymore. You can't use it to look anything up. We're back to books and papers released before the introduction of transformers. It might even happen that papers are backdated to make them seem like trustworthy works written before the age of AI.

The internet as we know it will slowly vanish; you'll only be able to trust and consume a handful of websites, blogs, and authors online. It'll be a transformation back to analog. In-person meetings become more attractive and reliable, as it might be easier to spot people who are just making things up. All of this reminds me a lot of a public key infrastructure (PKI), where you have to trust a few certificate authorities. With AI slop, these authorities become your coworkers, people you trust, and the websites trusted by the people you trust.
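The PKI analogy can be sketched as a tiny trust graph: a source is believable only if a chain of endorsements connects it back to one of the few "roots" you vouch for yourself, much like certificate path validation chains back to a trusted CA. All the names below are made up for illustration.

```python
# Toy sketch of the PKI analogy: trust flows transitively from a few
# "root" sources. All source names here are hypothetical.
trusted_roots = {"my coworker", "my friend's blog"}

# Who vouches for whom (endorser -> set of endorsed sources).
endorsements = {
    "my coworker": {"some research blog"},
    "my friend's blog": {"a niche forum"},
    "some research blog": {"an obscure paper"},
}

def is_trusted(source: str) -> bool:
    """Breadth-first walk: a source is trusted if a chain of
    endorsements connects it back to one of my roots."""
    seen, queue = set(), list(trusted_roots)
    while queue:
        current = queue.pop(0)
        if current == source:
            return True
        if current in seen:
            continue
        seen.add(current)
        queue.extend(endorsements.get(current, ()))
    return False

print(is_trusted("an obscure paper"))  # True: coworker -> blog -> paper
print(is_trusted("random AI site"))    # False: no chain of trust
```

Of course, real PKI also handles revocation and expiry, which is roughly what happens when a once-trusted source starts publishing slop and you cut it out of your graph.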

Not knowing what to believe or trust feels overwhelming, though. Imagine you're not a researcher and you just want to look up at what temperature water boils. How do you decide which source is correct? You might go to your local library, but even there you don't know whether the books are AI-generated or fact-based. It feels like in a few decades we might live in a world where we make up our own ground truth, our own axioms, and nobody would or could care.

Until then, there need to be ways to differentiate AI-generated content from real science.