More people are wondering about the weird crap that mysteriously appears in their news
feeds. How much is fake news? Did disinformation tilt an election? What are Google and Facebook going to do to clean up the mess?
You could almost hear the entire PR industry shifting uncomfortably amidst the backlash. I mean, crafting news (that some might call fake, or at least a stretch) is our stock in trade. We package propaganda as newsworthy information and sell it to the media; and, increasingly, we publish directly to the Web and social networks.
I understand that the fuss is more about blatant lies than the average press release. But it highlights the challenge of determining what is newsworthy and true, a role that is increasingly being taken on by algorithms.
The Web and social media gave us all ways to easily share and spread information. This can include rumor, conjecture, commercial information, news, and yes, slander and outright lies.
I would never defend the last two, but I will fight for our right to issue press releases and traffic in other kinds of info. Any good system needs to be able to deal with all of this, i.e., anticipate some BS and surface the most credible and significant information, whether via the wisdom of crowds, programs, or a combination.
It is naïve to think that a publication, editors, or algorithms (which of course are written by humans) can present news without bias. The journalistic piece you just wrote might be pristine, free of opinion, but the very act of deciding which stories to feature shows partiality.
That said, the social networking platforms where more of us are getting our news can do a much better job of separating the wheat from the chaff. I thought I’d share some of the best stories I’ve seen about the controversy, and the takeaways from each.
Anna Escher says “Facebook is hiding behind its [position that] ‘we’re a tech company, not a media company’ … For such an influential platform that preaches social responsibility and prioritizes user experience, it’s irresponsible …”
She recommends that they bring journalists into the process, remove the influence of engagement on news selection during elections, and expand Trending Topics to show a greater diversity of political stories – not just the ones that are the most popular.
Tim’s exhaustive Medium piece looks at all sides. He rails against “operating from an out-of-date map of the world [in which] algorithms are overseen by humans who intervene in specific cases to compensate for their mistakes,” and says:
“Google has long demonstrated that you can help guide people to better results without preventing anyone’s free speech… They do this without actually making judgments about the actual content of the page. The ‘truth signal’ is in the metadata, not the data.”
Tim draws an analogy between news algorithms and airplanes: “Designing an effective algorithm for search or the newsfeed has more in common with designing an airplane so it flies… than with deciding where that airplane flies.”
He cites an example from the history of aircraft design. While it’s impossible to build a plane that doesn’t suffer from cracks and fatigue, “the right approach … kept them from propagating so far that they led to catastrophic failure. That is also Facebook’s challenge.”
Mike Ananny writes about the public editor’s role and the challenges public editors face in an increasingly tech-driven environment:
“Today, it is harder to say where newsrooms stop and audiences begin. Public editors still need to look after the public interest, hold powerful forces accountable, and explain to audiences how and why journalism works as it does — but to do so they need to speak and shape a new language of news platform ethics.”
He asks “Will the public editor have access to Facebook’s software engineers and News Feed algorithms, as she does to Times journalists and editorial decisions?” and says:
“… public editors must speak a new language of platform ethics that is part professional journalism, part technology design, all public values. This means a public editor who can hold accountable a new mix of online journalists, social media companies, algorithm engineers, and fragmented audiences — who can explain to readers what this mix is and why it matters.”