Facebook’s parent company Meta Platforms Inc. issued a statement Wednesday addressing concerns over potential threats seen on its platform before the January 6 attack on the Capitol, as the riot’s one-year anniversary approached.
The company said it had made preparations to prevent potential extremist activity, as well as the spread of misinformation, which critics have claimed is rampant across the platform.
“Facebook has taken extraordinary steps to address harmful content and we will continue our work,” the statement said. “We have strong policies that we continue to implement, including restrictions on hate organizations and the removal of content that praises or supports them.”
“We are in contact with law enforcement agencies, including those responsible for addressing domestic terrorism threats,” the company continued. “We are continuing to actively monitor threats on our platform and will respond accordingly.”
The statement also said Facebook moderators have already removed “several of these groups” for various violations of the website’s user agreement.
Newsweek received information from Facebook that expanded on some of the specific measures the company is using to combat misinformation.
These include removing material supporting the events of January 6, taking down false claims about the winner of the 2020 election, and blocking pro-violence messages.
Additionally, Facebook reiterated to Newsweek that it has policies in place to direct people to credible sources when they search for certain terms, such as “QAnon” and others associated with extremism.
The company noted that it had also banned more than 250 white supremacist groups and about 900 militia movements.
The massive spread of misinformation on social media was pointed to as a major factor in the Capitol attack. Despite Facebook saying it continues to remove extremism from its platform, research has shown that several alt-right and fringe groups maintain active accounts.
A report published Tuesday by the Tech Transparency Project (TTP), a non-profit internet watchdog, listed several groups linked to January 6 that remained on Facebook a year after the riot.
These include the Three Percenters, a militia group that the TTP notes is “part of the anti-government militia movement [that] played a visible role in the Capitol riots.” Despite Facebook’s ban on the Three Percenters, TTP found that many of the group’s pages were still active on the website, and that Facebook’s algorithms sometimes promoted these pages.
Facebook’s algorithmically recommended content remains an ongoing issue, with TTP stating that such recommendations “often actively push extremist and dangerous content into people’s feeds.”
Beyond the Three Percenters, TTP found that several other militia groups were able to run recruitment ads on Facebook and Instagram, which is also owned by Meta.
An Instagram ad allegedly linked to a militia group said “our kids are trained on the streets and online… we’re ready for battle.”
An earlier TTP report, released 13 days after the Capitol attack, provided evidence that militia groups had coordinated on Facebook in connection with a march on Washington, D.C.