Few holidays underscore America’s unhealthy info diet more than Thanksgiving. While this year’s holiday will certainly be different, and hopefully gatherings will be kept small or scrubbed entirely to limit the health risk, my suspicion is that the pattern will prove true yet again, whether over Zoom gatherings or at in-person dinners.
In fact, fresh off the heels of a heated election and amid a surging pandemic, it might even prove to be worse than usual. I suspect that quite a few families will have relatives who believe that the election was rigged or stolen. Other families might encounter members who refuse to wear masks or abide by other safety protocols. And some might see their loved ones espouse QAnon-related rhetoric.
Certainly, Fox News and talk radio play a role in this. There is no question about that. But social media platforms such as Facebook and YouTube also factor heavily into the equation. Not only do these platforms empower bad faith and dishonest actors, but they algorithmically encourage them. These sites were once places you’d sign on to and see some family photos or a funny viral video. Now, they’re loaded with disinformation and hyper-partisan rhetoric that circulates among and influences the people we care most about.
The change “resulted in a spike in visibility for big, mainstream publishers like CNN, The New York Times and NPR, while posts from highly engaged hyperpartisan pages, such as Breitbart and Occupy Democrats, became less visible,” Facebook sources told Roose, Isaac, and Frenkel. It offered a peek at what a “calmer, less divisive Facebook might look like.” According to the NYT, some employees argued the tweak should have been made permanent. But that didn’t happen.
>> The key question: Why not make permanent a change that gives weight to credible, authoritative news sources? Doesn’t that seem like common sense?
“Mundane middle-class American life and high-octane propaganda”
>> One of Warzel’s key takeaways was the “problem of comments” which he pointed out often descend “into intense, acrimonious infighting.” Warzel wrote, “The more I scrolled through them, the more comments felt like a central and intractable issue. Unlike links to outside articles, comments aren’t subject to third-party fact checks or outside moderation. They are largely invisible to those people who study or attempt to police the platform. Yet in my experience they were a primary source of debunked claims, harassment and divisive rhetoric…”
>> This isn’t unique to Facebook’s promise to ban content that denies the Holocaust. The company has a history of announcing crackdowns, generating good PR, and then not fully executing on its promises…
Employees understand history will judge them
Donie O’Sullivan emails: “From speaking to Facebook staffers, my sense is some have considered how history is going to judge them and their association with the company… but they also view it as unfair how much scrutiny Facebook gets when (perhaps correctly) they believe YouTube might be even worse. There are also Facebook staffers, of course, who think the news industry, particularly the NYT and CNN, just goes after the company unfairly and thus tune out critical coverage…”