New Law Targets Nonconsensual Images but Raises Free Speech Concerns

By Lisa Wong

On March 4, during his joint address to Congress, President Donald Trump expressed anticipation that the newly passed Take It Down Act would help address a very real concern: the publication of nonconsensual explicit images. The legislation closes a gap by covering both actual photographs and AI-generated creations, from revenge porn to explicit deepfakes. In his speech, Trump described the law as critical to protecting Americans from online exploitation, said he hoped to sign the Take It Down Act into law, and argued it would improve the prospects for digital safety.

The Take It Down Act puts the onus on online platforms to move fast: they face civil liability if they fail to take down a victim’s requested images within 48 hours. The provision is meant to get harmful content removed more swiftly and to give victims a clearer, more cost-effective path to accountability. Senator Marsha Blackburn, a co-sponsor of the Act and an ardent proponent of the legislation, has made protecting users central to her work; she also co-sponsored the Kids Online Safety Act, which aims to shield children from dangerous online material.

Implications for Online Platforms

As it stands, the Take It Down Act requires digital platforms to establish a user-friendly process for removing nonconsensual intimate imagery no later than one year after the law takes effect, and missing that deadline could carry serious legal consequences. Free speech advocates fear that such a tight response requirement will push content moderation practices toward overreach.
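To make the compliance mechanics concrete, here is a minimal, hypothetical sketch of how a platform might track the 48-hour removal window described above. The class names, fields, and deadline handling are illustrative assumptions, not anything mandated by the Act or drawn from any real platform’s implementation.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Assumed compliance window, per the article's description of the Act.
REMOVAL_DEADLINE = timedelta(hours=48)

@dataclass
class TakedownRequest:
    """A victim's request to remove nonconsensual intimate imagery (NCII)."""
    content_id: str
    received_at: datetime
    removed_at: datetime | None = None  # None means the content is still up

    def is_overdue(self, now: datetime) -> bool:
        """True if the content remains up past the 48-hour window."""
        return self.removed_at is None and now - self.received_at > REMOVAL_DEADLINE

# Example: a request received 50 hours ago that has not been actioned.
now = datetime.now(timezone.utc)
req = TakedownRequest("img-123", received_at=now - timedelta(hours=50))
print(req.is_overdue(now))  # True -> potential civil liability under the Act
```

A real system would also need identity verification, an appeals path, and audit logging, none of which this sketch attempts; the point is only that the statutory deadline reduces to a simple clock check that platforms have strong incentives to err on the side of.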

India McKinney, director of federal affairs at the Electronic Frontier Foundation, is alarmed by the new law’s implications. She cautioned that content moderation at scale has a way of unintentionally censoring important speech.

“Content moderation at scale is widely problematic and always ends up with important and necessary speech being censored.” – India McKinney

McKinney further elaborated that the pressure on platforms to comply quickly could lead them to remove content without thoroughly investigating whether it qualifies as nonconsensual intimate imagery or falls under protected speech.

“The default is going to be that they just take it down without doing any investigation to see if this actually is NCII or if it’s another type of protected speech.” – India McKinney

As platforms make these changes, they are likely to over-moderate, filtering content before it even spreads. That chilling effect could reach well beyond explicit imagery to other expressive content, and even to encrypted communications.

Concerns About Content Moderation

The Take It Down Act has also raised concerns that its chilling effect will extend to decentralized platforms, including increasingly popular services such as Mastodon, Bluesky, and Pixelfed. Advocates fear that the demands of stringent content moderation may stifle diverse voices and foster an environment where only certain kinds of content are allowed.

McKinney predicts a troubling trend: requests to take down images depicting queer and trans individuals in relationships may surge, inadvertently targeting consensual material. She said she hopes to be proven wrong but warned that broader censorship is a real risk.

“I really want to be wrong about this, but I think there are going to be more requests to take down images depicting queer and trans people in relationships.” – India McKinney

Her concern is widely shared. Many fear the law will inadvertently encourage broader censorship of content deemed objectionable or offensive, and civil rights advocates argue that, despite its stated intent to protect victims, the law could disproportionately affect marginalized communities and limit their ability to engage freely and boldly online.

The Broader Context of Online Safety

Senator Marsha Blackburn’s role as a leading champion of both the Take It Down Act and the Kids Online Safety Act underscores lawmakers’ growing bipartisan concern over the state of safety on the internet. Much of the legislation Blackburn has pushed has characterized content discussing transgender people as harmful, age-inappropriate material for children, and she argues that protecting children from such material is a basic responsibility.

“Keeping trans content away from children is protecting kids.” – Heritage Foundation

Supporters of the Take It Down Act say it will encourage online platforms to adopt more proactive solutions and do more to deter nonconsensual imagery. Kevin Guo, CEO and cofounder of Hive, was cautiously optimistic about the impact the new law could have.

“It’ll help solve some pretty important problems and compel these platforms to adopt solutions more proactively.” – Kevin Guo

As lawmakers push for tighter regulation of online content, many advocates caution against measures that may infringe on free speech rights. The tension between protecting people from harassment and preserving a full range of public discourse remains an enduring challenge.