Unintended Social Consequences Catching Up to Facebook

Years of limited oversight and unchecked growth have turned Facebook into a force with incredible power over the lives of its 2 billion users. But the social network has also given rise to unintended social consequences, and they’re starting to catch up with it:

House and Senate panels investigating Russian interference in the 2016 elections have invited Facebook, along with Google and Twitter, to testify this fall. Facebook just agreed to give congressional investigators 3,000 political ads purchased by Russian-backed entities and announced new disclosure policies for political advertising.
Facebook belatedly acknowledged its role in purveying false news to its users during the 2016 campaign and announced new measures to curb it. Founder and CEO Mark Zuckerberg even apologized, more than 10 months after the fact, for calling the idea that Facebook might have influenced the election “pretty crazy.”
The company has taken flak for a live video feature that was quickly used to broadcast violent crime and suicides; for removing an iconic Vietnam War photo as “child pornography” and then backtracking; and for allegedly putting its thumb on the scale of a feature that ranked trending news stories.

Facebook is behind the curve in understanding that “what happens in their system has profound consequences in the real world,” said Fordham University media-studies professor Paul Levinson. The company’s knee-jerk response has often been “none of your business” when confronted about these consequences, he said.

Moving fast, still breaking things

That response may not work much longer for a company whose original but now-abandoned slogan — “move fast and break things” — sometimes still seems to govern it.

Facebook has, so far, enjoyed seemingly unstoppable growth in users, revenue and its stock price. Along the way, it has also pushed new features onto users even when they protested, targeted ads at them based on a plethora of carefully collected personal details, and run behavioral experiments that sought to influence their moods.

“There’s a general arrogance — they know what’s right, they know what’s best, we know how to make it better for you so just let us do it,” said Notre Dame business professor Timothy Carone, who added that this is true of Silicon Valley giants in general. “They need to take a step down and acknowledge that they really don’t have all the answers.”

Hands-off Facebook

Facebook generally points out that its policies prohibit misuse of its platform and that it is difficult to catch everyone who tries to abuse it. When pressed, it tends to acknowledge some problems, offer a few narrowly tailored fixes and move on.

But a larger question remains: has Facebook taken sufficient care to build policies and systems that are resistant to abuse in the first place?

Facebook declined to address the subject on the record, although it pointed to earlier public statements in which Zuckerberg described how he wants Facebook to be a force for good in the world. The company also recently launched a blog called “Hard Questions” that attempts to address its governance issues in more depth.

But Sheryl Sandberg, the company’s No. 2 executive, offered an unexpected perspective on this question in a recent apology. Facebook, she wrote, “never intended or anticipated” how people could use its automated advertising to target ads at users who expressed anti-Semitic views. That “is on us. And we did not find it ourselves — and that is also on us.”

As a result, she said, the company will tighten its ad policies to ensure such abuses don’t happen again.


