
Wikipedia has all but abandoned its efforts to combat the pornographic images littering its servers, after board members could not settle on one of the dozens of technical solutions that could have solved the problem.

FoxNews.com published a series of articles in May 2010 about thousands of questionable pictures, videos and other material hidden throughout the popular online encyclopedia. The revelation caused an uproar at the Wikimedia Foundation, which runs the site, leading co-founder Jimmy Wales to purge hundreds of images and task members with finding a way to keep objectionable material away from children.

Summary

Three years after a FoxNews.com exposé highlighted the pornography problem, the site is no closer to resolving it.

April 27, 2010: Co-founder Larry Sanger says Wikipedia is knowingly distributing child porn -- and implores the FBI to investigate.

May 10, 2010: Wikimedia Foundation begins purging explicit images as it prepares a new policy regarding sexual content.

May 14, 2010: A shakeup begins at the top levels of Wikipedia as admins try to deal with the controversy.

June 25, 2010: A loose network of pedophiles spinning the popular online encyclopedia in their favor is uncovered.

Feb. 24, 2012: Two years later, Wikimedia is still littered with graphic pornography.

June 5, 2012: Pornographic images and videos are the most popular items on Wikipedia.

But when the board could not reach a consensus on the best answer, the effort was simply dropped -- and the problem has been largely ignored.

"I'm not surprised that there is still a ton of pornography and sexually graphic material on Wikipedia and Wikimedia Commons, despite Jimmy Wales' frantic efforts in April 2010 to delete a substantial amount of it,” Gregory Kohs, who co-founded criticism site Wikipediocracy.com, said to FoxNews.com. “Frankly, Wales could delete a thousand images that are inappropriate for children, and the so-called 'wiki community' would just restore 900 of them."

Wales did indeed remove images from the site and called for the foundation to implement a “personal image filter”; in May 2011, the Board of Trustees voted unanimously, 10-0, in favor of the filter. But protests against the decision led the board to reverse it.


The board ultimately canceled any plans for an image filter, leaving pornography freely available to all site visitors.

"Wales no longer has much, if any, control over Wikipedia. Neither does the foundation," Eric Barbour, a critic of Wikipedia who recently co-wrote a book on the site, told FoxNews.com. “Current Wikipedia administrators tend to be young males who don't write any content for it and love to fight amongst themselves.

"To them, Wikipedia is a giant video game, not an ‘encyclopedia.’”

Officials at the Wikimedia Foundation confirmed in statements to FoxNews.com that the board had abandoned the effort.

“This was a major discussion within our community. Thousands [about 24,000] of users contributed to the process,” Jay Walsh, a spokesman for Wikimedia told FoxNews.com. “Beyond simply conducting votes, our community also suggested dozens of possible technological models for how an image filter might work, what issues the system would encounter, and how it might be implemented.”


Walsh added: “Ultimately, our board declared that the results of this referendum were inconclusive, and that no single system would be effective, nor was there consensus about the need for the system.”

While many images were removed during the purge, just as many new ones have been posted by Wiki users since 2010. Walsh admitted that images have resurfaced, but said the board was trying to combat the problem as best it could.

“A few years ago there was a situation where several hundred images were removed from our project Wikimedia Commons over the course of a few days,” he said. “Some of the material removed then has been returned, some remains removed.”

“On our major media repository, hundreds of images are added and removed every single minute,” he added. “Images that are deemed inappropriate, possible copyright violations, or potentially illegal are removed by volunteers very quickly if not immediately. This is part of the normal, daily process on our project.”