Attorney General William Barr appears before the House Judiciary Committee on Capitol Hill in Washington, DC, on July 28, 2020.
Matt McClain | Reuters
The Justice Department proposed new legislation Wednesday to reform Section 230, a key legal liability shield for the technology industry.

The bill, which would have to be passed by Congress, focuses on two areas of reform. First, it aims to narrow the criteria online platforms must meet to gain the liability protection Section 230 provides. Second, it would carve out certain cases from the law's immunity, such as those involving child sexual abuse.
Section 230 of the Communications Decency Act protects online platforms from liability for their users' posts. That legal protection helped technology platforms flourish from the early days of the internet, but it has come under scrutiny in recent years as lawmakers and regulators more broadly question the power of the technology industry.
Several lawmakers have proposed reforms to Section 230 in recent months, and President Donald Trump signed an executive order in May targeting the law, which he said would crack down on "censorship" by technology platforms. Trump issued the order shortly after Twitter first applied fact-checking labels to his tweets.
The Justice Department has been reviewing Section 230 for the better part of a year. Attorney General William Barr told a conference in December 2019 that the department was "thinking critically" about Section 230. Then, in February, it hosted experts to discuss the merits of the law and how it might be reformed.
The reforms proposed by the Justice Department echo some bills already introduced by legislators. For example, the proposal narrows the standard technology companies must meet when removing content deemed "obscene, lewd, lascivious, filthy, excessively violent," shifting it from a subjective judgment to an "objectively reasonable belief." A bill introduced by three powerful Republicans earlier this month includes the same standard and similarly narrows the types of content platforms are protected for removing, such as content that promotes self-harm or terrorism.
The proposal also includes a "Bad Samaritan" carve-out, which would explicitly deny immunity to platforms that deliberately fail to act on content that violates federal criminal law. Under the proposal, platforms could be held liable if they fail to promptly remove or restrict posts that would violate federal criminal law, or if they fail to report illegal material to law enforcement when required. The language is similar to that of a bill from Sens. Richard Blumenthal, D-Conn., and Lindsey Graham, R-S.C., which aims to tie liability protection for technology platforms to action against child exploitation material. A version of their bill, the EARN IT Act, was approved by the Senate Judiciary Committee in July.
This story is developing. Check back for updates.