Sarah Palin Facebook Post Deleted, Thanks to "Social Experiment"

The mystery behind a vanishing Facebook post by Sarah Palin that railed against a planned Ground Zero mosque has been cleared up. She fell victim to a “social experiment.”

Facebook's terms of service prohibit users from posting content that is hateful, threatening, incites violence or contains nudity or graphic violence. But that's not why a blogger called on readers to report a post by Palin describing a planned mosque at Ground Zero in New York City as "an intolerable mistake."

Explaining his actions, Tumblr blogger Moneyries wrote "This is not really about ‘refudiating’ Sarah Palin. It was a social experiment to explore the boundaries of Facebook’s government-like Terms and Conditions and the power of the Tumblr community."

"Free speech on the Internet is still a work-in-progress," he added.

The blogger had called on readers Thursday to report the Palin post, in which she wrote that "to build a mosque at Ground Zero is a stab in the heart of the families of the innocent victims of those horrific attacks." The blogger asked readers to flag the post as hateful or racist speech -- and enough people did so that it was automatically deleted.

But not for long.

Facebook promptly apologized to Palin and reinstated the post, but that didn't stop the blogger from touting his victory.

"Tumblr: I’m pretty sure we f--king did it!" wrote Moneyries. "756 likes and reblogs later, Sarah Palin’s Facebook note “has either been deleted, or does not exist.” SHE HAS BEEN REFUDIATED!!!" -- a reference to a much-publicized Tweet from Palin using the non-English word.

Initial speculation was that the post was flagged by Facebook's system as inappropriate. But spokesman Andrew Noyes explained that the post itself didn't set off the system.

"The note in question did not violate our content standards but was removed by an automated system. We're always working to improve our processes and we apologize for any inconvenience this caused," he said.

An addendum to the reinstated post notes that "the original post of this statement (on July 20, 2010) was somehow unintentionally deleted by mistake or technical glitch."

The automated filters on Facebook aren't designed to search out and automatically delete such content, explained Noyes.

"In order for this particular system to be activated, there has to be a report," explained Noyes to And the Tumblr blogger was just such a catalyst.

Facebook declined to detail how the system works, to prevent future attempts to game it. But it was clear that the automated process does not sort through the tens of millions of comments and entries made on Facebook daily; instead, it depends on user reports to find content that violates the terms of use.

How many complaints are required to activate the system and eventually have a post removed remains unclear.

"These processes are by and large very effective at keeping our system safe," Noyes pointed out. But in a community larger than the United States itself, incidents are bound to arise.

"We've got 500 million members, and these things happen from time to time."