By Jacqueline Hyman
Views expressed in opinion columns are the author’s own.
Ever since the rise of social media sites, more consumers have been getting their news from media aggregators on the internet. Facebook, Twitter and even Google News play a huge role in disseminating news simply by bringing all that material under one umbrella.
These companies, which were not created with journalistic intentions, now face ethical dilemmas similar to those of newsroom editors. In mid-September, Facebook came under fire for giving its advertisers the option to target anti-Semitic users. Essentially, anyone can pay for “promoted” content that is shared with users who have similar interests. ProPublica, an investigative journalism site, found these ad categories and tested Facebook’s advertising system, which approved its ads within 15 minutes, according to its report.
These anti-Semitic ads could reach people who described themselves as “Jew-haters” or who searched topics like “How to burn Jews,” according to ProPublica. Uproar is to be expected when thousands of people could be influenced by such dangerous ads.
Facebook did not create these hateful categories, and it removed them after ProPublica contacted the company. At a traditional news organization, an editor or manager deciding whether to publish an ad has the discretion to block hateful material. But Facebook is so huge that its ads are approved by machine, without human review until someone reports them later.
The thing is, it’s not these social media companies that create the content. Rather, users share content they like on their personal pages, or news organizations pay to have ads targeted toward news consumers. This also means hateful ads can easily be “promoted” to Facebook’s millions of users.
So here’s the dilemma: We can’t consider Facebook, Twitter and other social networking sites news media. They just aren’t the same. We can’t hold them to the same ethical values, because their business models are completely different. Their No. 1 goal is to make money. News organizations’ priority, on the other hand, should be adhering to the values of journalism: truth, accuracy and loyalty to the people.
There is always some crossover, however. Of course, news companies need to make money to stay afloat. And social media sites should adopt some kind of ethical code. As of August, 67 percent of U.S. adults said they get news through social media, according to a Pew Research Center study. That’s a huge number of people advertisers can influence just by paying some fees.
Facebook’s leaders need to be aware of the impact their site can have. I don’t think Facebook, or any social site, should impose prior restraint on its users. After all, potentially offensive content is not illegal, and it would be extremely hard to monitor among 2 billion monthly users. But the company could create a more efficient way to weed out hateful users and report suspicious activity. There is a report button next to posts and promotions, but there is no clear formula for how many clicks it takes before a post or account is deleted.
Luckily, it seems Facebook is trying to create positive change, as its removal of the anti-Semitic ad categories shows. Similarly, Twitter recently closed 201 fake accounts with Russian ties. These steps show the companies have recognized the serious nature of their impact on the public. Companies like Snapchat and Instagram, which are also jumping on the news aggregation bandwagon, will surely face similar ethical dilemmas before long. Hopefully, they can all come up with better systems to remove unwanted or disturbing content. It may seem like a tall order, but this is a process news editors go through every day. Social media companies can meet news organizations in the middle.
Jacqueline is a senior journalism and English major. She can be contacted at email@example.com.