Introduction

On July 29th, executives from Google, Amazon, Apple, and Facebook took the spotlight for questioning before Congress. After months of debate over internet platforms' responsibility for third-party content uploaded to their sites, Facebook took the brunt of the questioning regarding platform immunity and the application of antitrust law to social media. A majority of the questioning revolved around Section 230 of the Communications Decency Act, and the House Judiciary Committee's Subcommittee on Antitrust, Commercial, and Administrative Law has been examining major online platforms in search of an amendment to this law that may satisfy all ends of the political spectrum.

As major social media platforms like Facebook and Instagram face backlash for publishing false and/or harmful information, Section 230 must be amended to enable better social media regulation. Of the proposals to amend Section 230, Congress should follow the Department of Justice's proposal to narrow the scope of immunity. That proposal would expose both users and internet platforms to criminal liability for unlawful content, thereby increasing accountability and transparency. Although several approaches have gained support in Congress, the DOJ's is the one that most effectively clarifies who is responsible for regulating unlawful content.


What Does Section 230 Currently Protect?

The Communications Decency Act is a piece of legislation enacted in 1996 to regulate the posting of inappropriate or obscene content on internet platforms. Section 230 was included to clarify the liability of internet platforms for obscene or inaccurate content published by third-party users, and it survived the Supreme Court challenges that struck down much of the rest of the Act. The portion of Section 230 currently under examination states, "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." In simpler terms, internet platforms enjoy a form of protection from lawsuits over content that their users publish.

Since the spring, a number of significant events on social media have raised questions about the legal responsibility these platforms should bear. From false advertisements about primary election dates to the removal of a Twitter video making false claims about COVID-19, social media users are calling on executives to improve regulation and fact-checking so that accurate, harmless content becomes the norm. As a result, a number of proposals have been issued to amend Section 230.


How We Can Amend Section 230

The goal in amending Section 230 should be to narrow its scope of interpretation. Under the current section, social media platforms and other major internet sites have become hotspots for posts involving child trafficking, adult content, and false advertisements about election dates. To improve social media regulation and protect users from inaccurate and obscene content, Congress should follow through with the Department of Justice's proposed amendment.

The DOJ believes that Section 230 must be amended "to realign the scope of Section 230 with the realities of the modern internet." Its proposal is broken into four areas of reform, including giving major platforms incentives to address illicit content and clarifying the federal government's role in enforcing the law. The proposal shines in allowing civil lawsuits against third-party users to continue while also opening the door to federal action against internet giants that use Section 230 as a shield, all while preserving users' freedom of speech. The DOJ also calls for increased transparency and for distinguishing content uploaded in good faith from content uploaded with the purposeful intent of being unlawful.


Long-Term Effects Of This Amendment

The long-term effects of amending Section 230 along the lines of the DOJ proposal would be the criminal charging of third-party users who purposefully upload illicit content and a clearer distinction in how immunity applies to different kinds of internet platforms, such as social media sites versus retail sites.

If third-party users face criminal charges, they will be less likely to upload illicit content in the first place. Currently, the expectation to regulate internet content rests on the section's "Good Samaritan" provision, which relies on platforms and other users to flag and remove objectionable content. Criminal charges for those who purposefully upload illicit content matter because deleting an account is not a deterrent: it is simple to create a new one. Responsibility must be placed on the users themselves in order to reduce illicit content across the web over the long run.

It is also important to note the differences between sites that depend heavily on third-party users, such as social media platforms, and other sites, such as retail outlets. Currently, Section 230 provides a sweeping immunity around content moderation for all internet sites. Under the DOJ's proposal, social media sites, which rely far more on third-party users than other sites do, would bear greater legal responsibility if illicit content is published. This is a sensible approach, as the consequences a site faces should be proportional to the number of users the content could reach. The long-term goal would be for all sites to take greater accountability for regulating content, since the amendment would raise the potential consequences of failing to do so.

Other Amendment Proposals

As mentioned above, the Department of Justice's proposal is one of many under Congress's consideration. Among the others, the EARN IT Act and the proposal from the state attorneys general have been at the forefront. However, neither would narrow the scope of Section 230 as effectively as the DOJ's proposal.

The EARN IT Act, introduced this past March by a bipartisan group of senators, focuses on one of the most serious categories of illicit content on the web: child sexual exploitation. The Act would limit Section 230 immunity for legal claims involving child sexual exploitation, exposing those who publish such content to criminal charges. While the Act is right to seek criminal charges against third-party users who publish this content, internet platforms are once again protected. Unlike the DOJ's proposal, the EARN IT Act creates no regulation that would prevent this content from appearing in the first place. If major platforms were legally responsible when such content is uploaded, they would do a far better job of regulating it.

Similarly, the proposal by the 47 state attorneys general falls short for the same reason as the EARN IT Act. It requests that Section 230 be amended so that publishing illicit content can be prosecuted as both a federal and a state criminal offense. Again, this is a valid way to narrow the scope of Section 230, but it places no expectation of accountability on the internet platforms where illicit content is published. The emphasis must fall on both the third-party users who publish the content and the platforms that fail to take steps to regulate and fact-check what is posted.

Neither of these other proposals under Congress's consideration would be effective moving forward. Congress should shift its focus to the DOJ's proposal, as it encompasses the outcomes of the other proposals while taking further steps to hold internet platforms accountable.


Conclusion

In conclusion, an amendment to Section 230 of the Communications Decency Act is much needed. Numerous proposals have been presented as U.S. citizens see the detrimental effects of a law that no longer reflects the reality of internet usage. Even though other proposals make valid points about narrowing the scope of this section, Congress should follow through with the Department of Justice's proposal. Its amendment would place legal responsibility on third-party users who publish illicit content while preventing internet platforms from escaping liability through Section 230 immunity. Internet regulation, especially on social media, must be redefined to reflect the internet's global reach while protecting the public from inaccurate or harmful content.