YouTube said it was always working to strike a balance between allowing free expression and protecting online and real-world communities from harm. Nicole Bell, a spokeswoman for the company, said that YouTube removed six videos flagged by Media Matters for violating its policies, and it terminated a channel for uploading content from a banned creator. But most of the more than two dozen videos flagged by Media Matters did not break the platform’s rules, she said.
Last year, the International Fact-Checking Network, representing more than 80 organizations, warned in a letter addressed to YouTube that the platform was “one of the major conduits of online disinformation and misinformation worldwide,” and that it was not addressing the problem.
The consequences of easing up on the fight against misinformation have become clear on Twitter. A new report by two advocacy groups, the Network Contagion Research Institute and the Combat Antisemitism Movement, found a surge in antisemitic content as Mr. Musk took over.
It described an organized campaign by extremists who had previously been barred from the platform. One, Tim Gionet, who used the name Baked Alaska online, was recently convicted and sentenced to 60 days in prison for his part in the Jan. 6 riot at the Capitol. Tweeting this month, he pushed what he called a conspiracy theory: “twitter unbanned all of us cuz their engagement was tanking w/o us.”
“It is true that the trust and safety efforts we have had to date have been really broken, but at least there were efforts,” said Mr. Finkelstein, an author of the report, “and there was some baby in the bath water.”
Despite Mr. Musk’s pledge to foster unfettered speech on the platform, he has also moved to suspend accounts, including Kanye West’s after a series of antisemitic remarks.
Nora Benavidez, senior counsel at Free Press, an advocacy group for digital rights and accountability, said the experience at Twitter showed that moderating offensive content remained important for the viability of platforms, regardless of economic considerations.
“Content moderation is good for business, and it is good for democracy,” she said. “Companies are failing to do that because they seem to think they don’t have a big enough role to play, so they’re turning their back on it.”