Wikipedia Implements Guidelines for AI Developers Amid Declining Traffic

By Lisa Wong

Most recently, Wikipedia released new guidelines for AI developers and providers, announced via The Signpost, its community newsletter. The move comes as the platform contends with falling traffic and widespread content scraping by AI bots. The Wikimedia Foundation, which operates Wikipedia, is pressing for proper attribution: generative AI developers, it argues, must credit the human authors whose work they draw on.

In recent months, Wikipedia has seen human page views fall roughly 8% compared to the previous year, raising concerns about the platform’s sustainability in an AI-driven landscape. Bot activity climbed sharply from May into June, and Wikipedia reported a large surge in traffic from bots attempting to avoid detection during that period. In response, the organization has retrained its bot-detection systems to identify these disguised automated visitors more accurately and keep them from being counted as human readers.

Wikipedia.org is firmly committed to protecting the trustworthiness of its content. That commitment is evident in the new AI strategy for editors the Wikimedia Foundation unveiled earlier this year. The strategy aims not only to boost the output of human contributors but also to use AI to make editorial workflows more effective. Central to it are automating translation processes and building technology that assists editors rather than doing their jobs for them.

As the Wikimedia Foundation noted recently when discussing the implications of AI content generation, attribution is essential. Platforms should be transparent about where their information comes from, and they should direct users back to those sources to sustain engagement and foster confidence in information published online.

“For people to trust information shared on the internet, platforms should make it clear where the information is sourced from and elevate opportunities to visit and participate in those sources.” – Wikimedia Foundation blog post.

Wikipedia has been adapting to the shift away from a desktop-dominated web, and it continues to prioritize human contributions in how it measures and manages the platform. The organization is taking proactive steps to ensure its website remains viable and useful in an era increasingly shaped by artificial intelligence. By establishing clear guidelines and strengthening its technical capabilities, Wikipedia aims to balance the integration of AI with safeguarding the contributions of its dedicated community of editors.