AI Deals a Blow to Wikipedia: Page Views Are Dropping Steadily

The proliferation of artificial intelligence is threatening the internet’s largest transparent source of information. Sophisticated bot crawlers are muddying traffic data, and AI-generated summaries in search engines have already reduced human traffic by 8%. None of this bodes well for the site’s future.

The Wikimedia Foundation has joined the chorus of those voicing serious concerns about the effects of artificial intelligence on the internet’s ecosystem of reliable information. Marshall Miller, Senior Director of Product at the Foundation, explained in a blog post how the growing use of chatbots built on large language models (LLMs) and of AI-generated summaries in search engines is affecting Wikipedia’s page views.

Miller notes that the Foundation has observed an 8% year-over-year drop in Wikipedia’s page views. He believes this decline reflects how generative AI and social media are changing the way people access information. The biggest factor is that search engines now present users with summarized answers directly in the search results, often drawn from Wikipedia content. Many users find the search engine’s ready-made summary sufficient and never click through to Wikipedia itself.

On top of this, the Foundation is also contending with increasingly sophisticated AI bot crawlers that make human traffic difficult to distinguish from automated traffic. Newer, more accurate data, obtained after improving the bot-detection mechanisms, confirms the worrying drop in page views.


A Greater Risk: Sustainability Threat

Miller argues that this decline is more than a simple loss of traffic; it points to an existential risk. If Wikipedia’s traffic continues to fall, it could threaten “the only site of its scale that provides information across the entire internet with standards of verifiability, neutrality, and transparency.” Miller warns that fewer visitors will mean fewer volunteers, fewer donations, and ultimately less of the reliable content that is the site’s greatest strength.

To address this, Wikimedia argues that LLM providers and search engines should do more to give users the opportunity to interact directly with the sources of the information they present. As Miller puts it, “For people to be able to trust the information shared on the internet, platforms must clearly state where the information comes from and increase opportunities to visit and participate in those sources.”

This debate is not new. Earlier this summer, Wikipedia tested AI-generated summaries displayed at the top of articles, but abandoned the project after strong backlash from the site’s volunteer editors. Miller’s latest warning once again foregrounds the concern that while AI is transforming access to information, it may endanger the future of the very sources that produce and verify that information.
