Germany’s Federal Office of Justice (BFJ) has initiated action against Twitter, arguing the popular microblogging site failed to deal effectively with illegal content.
“The Internet is not a lawless space,” said Marco Buschmann, Minister of Justice. He said Twitter had failed to fulfil its legal obligation to address reports of illegal content, and that the authority has sufficient evidence to support this.
According to the Network Enforcement Act (NetzDG), social media firms must maintain effective and transparent procedures for handling user complaints about illegal content.
The BFJ claims Twitter did not delete or block access to reported illegal content within the legally stipulated deadlines. The allegations point to a systemic failure of complaint management.
“The provider of Twitter is subject to the provisions of the NetzDG. The BFJ has sufficient indications that it has violated the legal obligation to deal with complaints about illegal content and that this is a systemic failure in the complaint management of the provider, which is subject to a fine,” stated the BFJ in a statement.
Before a fine can be issued against a social network provider, a court must first determine that the reported content is illegal. The Bonn District Court will handle this so-called preliminary ruling procedure.
Fruitless complaint management
According to the BFJ, “Numerous content was reported that was published on Twitter, which the authority considers illegal and, despite user complaints, was not deleted or blocked by the provider within the legally stipulated periods.
“The fine proceedings initiated are based on this,” it added.
The BFJ said the defamatory content, directed at an undisclosed person, was posted on Twitter over a four-month period and consisted of similarly unjustified statements of opinion.
Twitter’s content moderation practices have faced continuous criticism lately. Since taking over Twitter last October, CEO Elon Musk has reduced the number of staff responsible for content moderation and hate speech handling.
The site now automatically responds to all press queries with a single poop emoji, months after dismissing its communications team.
Another lawsuit is pending
This isn’t the only legal action Twitter is contending with. A separate lawsuit has been filed in Germany by digital rights campaign group HateAid and the European Union of Jewish Students (EUJS). It relates to Twitter’s alleged failure to remove six pieces of content that, the groups say, trivialize or deny the Holocaust.
Twitter is also facing warnings from the EU to hire more content moderation staff to comply with the Digital Services Act, which comes into force next year. While no fines have yet been issued under the Network Enforcement Act, companies have been compelled to act by the threat of action.
Adam Hadley, director of Tech Against Terrorism, expressed concern about the potential consequences of Twitter’s decision last month to reduce staff working on content moderation.
“Platforms should be under no illusion that cutting costs risks cutting corners in an area that has taken years to develop and refine,” said Hadley. “We are worried about the signal Twitter’s latest move sends to the rest of the industry.”