Facebook begins to shift from free and open platform to responsible public utility
When Facebook recently removed several accounts for trying to influence the 2018 midterm elections, it was the company’s latest move acknowledging the key challenge facing the social media giant: It is both an open platform for free expression of diverse viewpoints and a public utility on which huge numbers of people – and democracy itself – rely for accurate information.
Under pressure from the public and lawmakers alike since 2016, Facebook responded in early 2018 by making significant changes to the algorithms it uses to deliver posts and shared items to users. The changes were intended to show more status updates from friends and family – sparking “meaningful interactions” – and fewer viral videos and news articles that don’t get people talking to each other. As a result, users have spent far less time on the site, and the company’s stock-market value has dropped.
Yet the problem remains: The very features of social media that encourage participation and citizen engagement also make these platforms vulnerable to hate speech, fake news and interference in the democratic process. This inherent contradiction is what the company must resolve as it shifts from being just another startup in a crowded marketplace of big-data businesses to a public information utility with monopoly power and broad social influence.
No longer a platform
Facebook continues to struggle with the convergence of three related developments over the past few years. My own research describes the first: the rise of social media networks designed for constant interaction and engagement with users. That enabled the second, the elimination of gatekeepers for news and information: Now anyone can post or share information, whether or not it is true. And in the third development, these systems have given companies huge amounts of detailed personal information about their users, enabling them to display information – and paid advertisements – matched to an individual’s likes, political and religious views, hobbies, marital status, drug use and sexual orientation.
The company’s founder and CEO, Mark Zuckerberg, is most comfortable describing his creation in terms often used for technology firms, with words like “connections” and phrases like “bring people closer together.” And the company is still structured as a regular corporation, responsible only for maximizing value for shareholders.
That mindset sidesteps the fact that Facebook wields societal power on an unprecedented scale. The company’s decisions about what behaviors, words and accounts it will allow govern billions of private interactions, shape public opinion and affect people’s confidence in democratic institutions.
Facebook used to be about extracting profit from data about its users. Now the company is starting to realize it needs its users’ trust even more than their information.
Becoming a utility
What the public expects from a technology company is substantially different from what people expect of, say, a water company or the landline telephone company. Utility companies need to be accountable to the public: offering transparency about their operations, taking responsibility when things go wrong, allowing verification of their claims and obeying regulations meant to protect the public interest.
I expect Facebook will face increasing pressure from politicians, government communications regulators, researchers and social commentators to go beyond filtering out fake news. Soon, the company will be asked to acknowledge what its actual role in democracy has become.
Technological advances mean that what used to be extra services – like internet access and social media – are now necessary parts of modern life. Internet service providers are facing similar transitions, as the net neutrality policy debate lays the ground rules for the future of an open internet.
Facebook has already signaled its understanding of that pressure – and not only with the algorithm changes and the shutdown of the fake accounts leading up to the midterm elections. In a recent court filing in California, Facebook claimed it was a publisher of information, protected by the First Amendment against possible government regulation.
It’s true that overregulation does run the risk of censorship and limiting free expression. But the dangers of too little regulation are already clear, in the toxic hate, fake news and intentionally misleading propaganda proliferating online and poisoning democracy. In my view, taking no action is no longer an option.
Anjana Susarla, Associate Professor of Information Systems, Michigan State University
This article was originally published on The Conversation. Read the original article.