Saturday, September 27, 2014

Barred From Facebook, and Wondering Why




MICHAEL LETWIN, a lawyer living in Brooklyn, went to sign into his Facebook account, as he does almost daily, and received a surprising — and unpleasant — message.
“Your account has been disabled,” it said. “If you have any questions or concerns, you can visit our F.A.Q. page.” Mr. Letwin, who besides his personal page also helps administer a Facebook page for the group Jews for Palestinian Right of Return, clicked onto the F.A.Q. page and found a reference to Facebook’s community standards, none of which he felt he had violated, along with the option to appeal.
He did. And then he waited. And waited.
Mr. Letwin’s situation is not unusual, or new. The question of what role social media companies should play — a hands-off observer that steps in only in extreme circumstances, or a curator that decides what goes up and what comes down — has long been debated.
Recently, Twitter refused to allow posts with links to videos of the beheading of the American journalist James Foley. Facebook is involved in a battle with drag queens whose accounts were disabled because they used their stage names in their profiles. Using anything but your real name is a violation of the company’s rules. The furor led this week to a meeting with Facebook representatives and a news conference called by a San Francisco supervisor.
“We don’t realize how ingrained Facebook is in our everyday lives,” a drag queen named Heklina told KNTV in San Jose, Calif. “I was shut out of Facebook for 24 hours and felt like I had a limb chopped off.”
But few users, until they are faced with a similar situation, are aware of how little control they actually have over something they view as their own — their pages, their posts, their photos.
“When Facebook makes a termination decision, it’s potentially life-altering for some people,” said Eric Goldman, a professor of law at Santa Clara University in California and co-director of the High Tech Law Institute there. “They’re cut off from access to their communities” and possibly to their clients.
That is not to say that Professor Goldman thinks social media platforms should be completely unregulated. And, he said, Facebook and other social media companies largely do a good job of monitoring so many users and posts.
His and others’ main criticism focuses on transparency.
“The average person’s soapbox is now digital, and we’re now in a world where the large social media companies have a governmentlike ability to set social norms,” said Lee Rowland, a staff lawyer with the American Civil Liberties Union. “It’s a massive power and it comes with a responsibility.”
These questions arise with all social media, but the relationship users have with Facebook is particularly passionate, Professor Goldman said. Even as some say its impact is waning, it still provides 1.3 billion people — compared, say, to Twitter’s 271 million active monthly users — with access to news about their friends and to community groups.
“Our goal has always been to strike an appropriate balance between the interests of people who want to express themselves and the interests of others who may not want to see certain kinds of content,” Monika Bickert, head of Facebook’s global policy management, wrote in an email.
Social media companies have every legal right to take down content or kick someone off, said Danielle Citron, a professor of law at the University of Maryland School of Law.
Facebook, like other social media companies, has a list of standards that users agree to abide by when they set up their accounts, even if they never read the standards.
Among other things, they prohibit posting of hate speech (which means individuals or groups cannot attack others based on race, ethnicity, national origin, religion, sex, gender, sexual orientation, disability or medical condition), encouragements of self-harm, graphic content or threats of violence. And the user’s real name must be used.
Anyone can easily file a report against a user. And Facebook has hundreds of people working globally, around the clock and in 30 languages, reading and responding to reports of violations.
Obviously, many of these categories are open to interpretation. Breast-feeding, for example, is something Facebook has grappled with in the past — essentially, how much of the breast can you show before it becomes graphic?
If Facebook decides to remove content, it sends a warning to the user about the action. People can also be locked out temporarily for a few days or a week. Grounds for immediately disabling an account include using a fake name or promoting child exploitation.
But Heather Dorsey, who lives in Milwaukee, had not done any of those things when she found herself barred from logging onto Facebook three years ago.
“My profile didn’t break any rules. I hadn’t done anything out of the ordinary prior to getting temporarily kicked off,” she wrote in an email. “It was frustrating not knowing how long it was going to take to get the issue resolved, as I do use Facebook to stay connected, particularly with friends and relatives who live out of town. I am a freelance writer and social media consultant, so it was also an issue for my work.”
She tried to call, but ended up in an endless circle of recordings. She found an email address for advertisers and contacted it, asking what she had done wrong. And as suddenly as she was taken off, she was allowed back on.
In 2012, the website Gawker published a far more detailed list of Facebook’s Abuse Standards Violations used by the company’s regulators. Facebook refused to confirm that the list was valid.
While the community standards are global, the company does obey a country’s laws.
For example, visually or verbally insulting Turkey’s first president, Mustafa Kemal Ataturk, is illegal in Turkey. If Facebook is notified of such a post, it limits the visibility of that post in Turkey. The same with Holocaust denial in countries where that is against the law.
Facebook would not release the number of reports it receives or how much content it takes down. But it does not take more than a quick search on the Internet to see that many users are confounded when they try to log in and find they cannot.
That includes the American Civil Liberties Union. The organization last year posted a photo of a bare-chested bronze female statue in an article on its Facebook page about controversial public art in Kansas.
Facebook took the post down, telling the organization that it had violated Facebook’s community standards. It then blocked the A.C.L.U. from posting for 24 hours, contending it had posted again, which it had not.
Once the A.C.L.U. contacted Facebook’s public policy manager, apologies were given and the post was allowed back up. But as Ms. Rowland said, “Our ultimate success is cold comfort for anyone who has a harder time getting their emails returned than does the A.C.L.U.”
Professor Citron, author of “Hate Crimes in Cyberspace,” said of Facebook, “I think it’s a positive thing that they’re allowed to set community norms.” The problem is a lack of “technological due process,” she said.
Ms. Bickert of Facebook acknowledged that “one area where we’re focusing is improving the information we share with people about our community standards and when we take action on reported content.”
For Mr. Letwin, that can’t come soon enough. A month after his account was disabled, he received an email apologizing, saying it had all been a mistake on Facebook’s part.
A Facebook spokesman said a report was filed against Mr. Letwin for using a fake name, which he had not done, and a reviewer looking at his account then mistakenly thought it violated Facebook’s standards regarding promotion of violence and terrorism. But the process took far longer than it should have, he acknowledged, saying that typically, an appeal should be responded to within a few days.
“It was a Kafkaesque thing,” Mr. Letwin said. “You don’t know if you did too many posts, too many likes. The rules are constantly changing.”

New York Times, September 20, 2014

