I believe what EFF and Mike Masnick have to say about it: it's the most foreseeably abusable internet censorship law (so far), and it's a crying shame HN isn't confident enough to look past its good intentions to recognize that.
How have DMCA takedowns gone? How many perfectly innocent, upstanding developers have been nuked, without recourse, by some invalid or malevolent false report? How many times over and over[0] has HN lamented the DMCA takedowns, decried what an obviously stupid thing they are, how awful it is to have state censorship (and without any due process too)?
Well, here we are again: "The takedown provision also lacks critical safeguards against frivolous or bad-faith takedown requests".
If courts don't strike this down, this is going to be a DMCA for the entire society. It'll be used by politicians to silence their critics. It'll be overrun by trolls. You'll see the results, and you'll hate them.
> “And I’m going to use that bill for myself too, if you don’t mind, because nobody gets treated worse than I do online, nobody.”
I'd never heard of this act [1], and the article frames it in a way that is not especially informative, though the EFF has an excellent page on it. [2] The law passed nearly unanimously in both chambers (409-2 in the House, unanimous in the Senate), a rarity in modern times. Quoting the EFF on why they oppose it:
- "The takedown provision applies to a much broader category of content—potentially any images involving intimate or sexual content—than the narrower NCII definitions found elsewhere in the bill. The takedown provision also lacks critical safeguards against frivolous or bad-faith takedown requests. Lawful content—including satire, journalism, and political speech—could be wrongly censored."
So sexually or intimately themed memes that use realistic-looking imagery may end up eligible for takedown. I also find myself disagreeing with the EFF here, despite generally being a tremendous supporter of their work. Their main argument is that existing laws already cover this issue, without the need to introduce new, potentially abusable legislation:
- "If a deepfake is used for criminal purposes, then criminal laws will apply. If a deepfake is used to pressure someone to pay money to have it suppressed or destroyed, extortion laws would apply. For any situations in which deepfakes were used to harass, harassment laws apply. There is no need to make new, specific laws about deepfakes in either of these situations."
But I think that on this issue one should not need to suffer some form of measurable loss to want intimate/sexual images removed from a site, let alone have to go through the legal system and file a lawsuit to achieve it.
I agree. I generally support the EFF, but I disagree with them on this. I read through the bill and the language is very specific to revenge porn (although I’m not a lawyer). I think it would be very difficult for Trump or anyone else to abuse this law and use it for censorship.
I have friends who were the victim of revenge porn and I think this law would help them. I’m looking forward to this becoming a law.
DanAtC 3 hours ago [-]
[flagged]
JumpCrisscross 3 hours ago [-]
Help me with the relevance?
AnthonyMouse 3 hours ago [-]
It's political criticism of public figures protected by the First Amendment. It's also arguably sexual, so risk averse managers are going to execute a takedown for it. Ergo, an effective means for political censorship.
actionfromafar 1 hour ago [-]
What about Musk kissing Trump's feet? (Or the other way around? Can't remember.)
tbrownaw 4 hours ago [-]
Hm. If in practice it ends up being as over-broad as the EFF seems to expect, might it get tossed on the grounds that it abridges freedom of speech?
shmerl 3 hours ago [-]
DMCA 1201 violates freedom of speech, but it's backed by corrupt beneficiaries, so it was never tossed. This one is comparable.
otterley 3 hours ago [-]
Attorney here! (Not your attorney, not legal advice.)
Why do you believe it runs afoul of the First Amendment?
AnthonyMouse 3 hours ago [-]
Circumvention and circumvention tools are prohibited regardless of whether there is any underlying infringement, e.g. preventing an excerpt from being taken for the purpose of criticism. In general, fair use is required to square copyright with free speech, but circumvention for the purposes of fair use is prohibited.
thayne 2 hours ago [-]
More than that, it prohibits you from telling someone how to circumvent DRM, even if the purpose of doing so is just to, say, watch a movie they legally purchased on the device of your choice.
AnthonyMouse 2 hours ago [-]
Banning secret numbers is dumb, but it's also the part of the law which is the most completely ineffective. Do you think you can find a copy of DeCSS on the internet? Of course you can.
The actual problem is the fair use problem, because it prevents you from e.g. creating a third party Netflix or Twitter client without the corporation's approval. Which in turn forces you to use their app and puts them in control of recommendations, keeps you within a walled garden instead of exposing you to other voices, etc. It's a mechanism to monopolize the marketplace of ideas.
Of course, Apple et al have turned this against the creators in order to extract their vig, which is a related problem.
otterley 2 hours ago [-]
Courts have never equated code with speech in such a way that it’s protected the same way as, say, political speech. People have been making the argument that “code is speech” (without understanding that not all speech is treated alike by our legal system) since the DMCA was being drafted 20+ years ago, but the legal system has never seen it that way.
JimDabell 1 hour ago [-]
What about Bernstein v. United States?
> the Ninth Circuit Court of Appeals ruled that software source code was speech protected by the First Amendment and that the government's regulations preventing its publication were unconstitutional.
This reply doesn't seem responsive to the issue. It's not just whether you can censor someone from publishing code -- that's a separate problem. It's whether the law can prohibit circumvention even when the copying is fair use -- or when the same technological protection measure is locking away works in the public domain.
otterley 2 hours ago [-]
We’re talking about freedom of speech here, so First Amendment law is on point. There’s no other mechanism in our legal system than the Constitution that would prevent DMCA, including its anti-circumvention provisions, from having full force and effect.
Similarly, absent some Constitutional protection, states can restrict who can purchase lock picks.
actionfromafar 1 hour ago [-]
The Constitution doesn't seem to be very respected these days, by either the Executive or Congress.
The DeCSS case actually went to court (Bunner case), though it wasn’t about the T-shirt. It was a civil case based on trade secret law, not DMCA. The trial court assumed that sharing a trade secret without the permission of the secret’s owner is unlawful. That assumption wasn’t challenged.
That law is consistent with trade secret law in general. The First Amendment does not require trade secrets to lose all protection. If it did, you could freely disclose your own employer’s secrets without penalty.
actionfromafar 1 hour ago [-]
Did Bunner work at the DVD Consortium? Can you freely disclose my employer's secrets without penalty?
f33d5173 3 hours ago [-]
It obviously doesn't, because the constitution is by definition whatever the supreme court says it is, yet by assumption the law in question hasn't been tossed. However, notice that the GP said "freedom of speech" and not the first amendment. Perhaps they understand the former to be more expansive than the latter.
some_furry 3 hours ago [-]
I wouldn't bet the farm on that mattering, especially if it's a law that the powerful find "useful".
Glyptodon 2 hours ago [-]
Like most modern laws, if conservatives are able to bend it to theocratic authoritarian ends, it will be upheld.
hsuduebc2 43 minutes ago [-]
Indeed. But to be fair, if it’s framed as "protecting the vulnerable" — for example, as a tool to prevent hate speech — liberals would likely support it as well. Or you can always use the classic catch-all: "protecting children from online predators." If it grants politicians more power, it’s almost always useful to them. It's rare for them to believe the government should not have more authority. It’s really just a matter of framing it the right way.
yapyap 3 hours ago [-]
> might it get tossed on the grounds that it abridges freedom of speech?
yeah, when people got unlawfully sent to a prison in another country even though they were US citizens, lawmakers sprang into action (/s)
I can't identify where the EFF's concerns are coming from. There's a specific limitation of liability for online platforms, and the entire process appears to be complaint-driven and requires quite a bit of evidence from the complainant.
What actually concerns me in this bill?
> (B) INVOLVING MINORS.—Except as provided in subparagraph (C), it shall be unlawful for any person, in interstate or foreign commerce, to use an interactive computer service to knowingly publish a digital forgery of an identifiable individual who is a minor with intent to—
> “(i) abuse, humiliate, harass, or degrade the minor; or
> “(ii) arouse or gratify the sexual desire of any person.
> “(C) EXCEPTIONS.—Subparagraphs (A) and (B) shall not apply to—
> “(i) a lawfully authorized investigative, protective, or intelligence activity of—
> “(I) a law enforcement agency of the United States, a State, or a political subdivision of a State; or
> “(II) an intelligence agency of the United States;
Wut? Why do you need this? Are we the baddies?
yuliyp 3 hours ago [-]
The scenarios envisioned in the way the bill is written don't actually apply sanely to how the Internet works. Anyone can send any number of (valid or not) reports to any service provider. That service provider then has to somehow decide if every one of those reports fits the definitions of an intimate visual depiction, and, if so, take it down. There's nothing preventing someone from making fraudulent claims, nor any punishment for doing so. The requirements in Section (3)(a)(1)(B) are trivial to automate ("Hi, my name is So-and-so. The image at URL is an intimate image of me posted without my consent. For reference, URL is a picture of me that confirms that URL contains a picture of me. My email address is example@invalid.com." satisfies the requirements of that section, at a glance).
The limitation on liability is only saying they're not responsible for the consequences of taking something down, not for the consequences of leaving something up.
That, plus the FTC being able to sue any company for the inevitable false negatives that will happen means that the only reasonable response to takedown requests is to be extremely cautious about rejecting them. It'll inevitably be abused for spurious takedowns way more than the DMCA already is.
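The trivial-to-automate point above can be made concrete. Here is a minimal Python sketch; the field wording is paraphrased from the comment's example notice, not quoted from the statute, so treat the template text as an assumption:

```python
# Sketch: the notice requirements described above (claimant identity, the
# target URL, a reference image, contact info) reduce to a fill-in-the-blanks
# template, so producing "valid-looking" requests in bulk is trivial.
# Template wording paraphrases the comment above, not the bill text.

NOTICE_TEMPLATE = (
    "Hi, my name is {name}. The image at {target_url} is an intimate "
    "image of me posted without my consent. For reference, {ref_url} is "
    "a picture of me that confirms {target_url} contains a picture of me. "
    "My email address is {email}."
)

def make_notice(name: str, target_url: str, ref_url: str, email: str) -> str:
    """Produce a notice that superficially satisfies the described requirements."""
    return NOTICE_TEMPLATE.format(
        name=name, target_url=target_url, ref_url=ref_url, email=email
    )

print(make_notice("So-and-so", "https://example.com/img.jpg",
                  "https://example.com/ref.jpg", "example@invalid.com"))
```

Nothing in the sketch is verified by the recipient: the name, reference URL, and email are free-form strings, which is exactly the asymmetry the comment is pointing at.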
mschuster91 20 seconds ago [-]
> That, plus the FTC being able to sue any company for the inevitable false negatives that will happen means that the only reasonable response to takedown requests is to be extremely cautious about rejecting them.
... or to finally hire enough moderators to make competent judgements to avoid getting a counter lawsuit for restricting free speech.
timewizard 2 hours ago [-]
> fits the definitions of an intimate visual depiction,
Hardly seems difficult. I think a lot of services have TOSes which cover this type of content. The text of the bill also plainly defines what is covered.
> are trivial to automate
And removal is trivial to automate. I'm pretty sure providers already have systems which cover this case. Those that don't likely don't allow posting of pornographic material whether it's consensual or not.
> they're not responsible for the consequences of taking something down
So the market for "intimate depictions" got a little harder to participate in. This is a strange hill to fight over.
> It'll inevitably be abused for spurious takedowns
Of pornographic content. The law is pretty well confined to "visual depictions." I can see your argument on its technical merits; I just can't rationalize it into the real world other than for some absurdly narrow cases.
tpxl 1 hour ago [-]
How does your trivial removal automation distinguish between 'intimate depictions' and 'political imagery I dislike'?
The whole point of this discussion is that this is going to be used to censor everything, not just 'intimate visual depictions'.
misnome 1 hour ago [-]
What’s stopping someone from taking down this post of yours?
Assume they are a serial liar. Or that they are a political opponent with zero shame.
wmf 3 hours ago [-]
The limitation of liability is itself concerning because it means platforms won't care about fake takedowns. This law doesn't even have the counter notice process that DMCA has. You say take down X, they take it down, they're not liable. There's no appeal process that I see.
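The missing appeal step can be sketched as a toy state machine. The states and transitions below model only the two processes as described in this thread (DMCA with a counter-notice step, this law without one), not the statutes themselves:

```python
# Toy state machine contrasting the two takedown flows described above.
# In the DMCA flow, a counter-notice can restore content; in the flow
# described for this law, "taken_down" is terminal.

DMCA_FLOW = {
    "posted": "takedown_notice",
    "takedown_notice": "taken_down",
    "taken_down": "counter_notice",  # the uploader may contest
    "counter_notice": "restored",    # restored unless the claimant sues
}

TAKE_IT_DOWN_FLOW = {
    "posted": "takedown_notice",
    "takedown_notice": "taken_down",
    # no counter-notice state: nothing maps out of "taken_down"
}

def final_state(flow: dict, start: str = "posted") -> str:
    """Follow transitions until reaching a state with no outgoing edge."""
    state = start
    while state in flow:
        state = flow[state]
    return state

print(final_state(DMCA_FLOW))           # restored
print(final_state(TAKE_IT_DOWN_FLOW))   # taken_down
```

The asymmetry is the whole point: with no counter-notice edge, every takedown, valid or not, ends in the same terminal state.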
timewizard 2 hours ago [-]
It applies to a narrow category of providers.
> IN GENERAL.—The term “covered platform” means a website, online service, online application, or mobile application—
> (i) that serves the public; and
> (ii) (I) that primarily provides a forum for user-generated content, including messages, videos, images, games, and audio files; or
> (II) for which it is in the regular course of trade or business of the website, online service, online application, or mobile application to publish, curate, host, or make available content of nonconsensual intimate visual depictions.
If you publish the content on your own website or on certain public websites, you don't even have to respond to these requests. You do, however, open yourself to criminal and civil liability for publicly hosting any nonconsensual visual images. Your provider is also excluded from service and cannot legally be compelled to participate in their removal.
tbrownaw 3 hours ago [-]
Sounds like something meant to allow the vice squad to post "selfies" when they're pretending to be kids?
beej71 3 hours ago [-]
We need it so I don't file a takedown request against what you just posted because I disagree with it.
mmooss 1 hour ago [-]
The article says the takedown section is much broader than other sections.
[0] https://hn.algolia.com/?query=dmca&type=all
[1] - https://en.wikipedia.org/wiki/TAKE_IT_DOWN_Act
[2] - https://www.eff.org/deeplinks/2025/04/congress-passes-take-i...
— https://en.wikipedia.org/wiki/Bernstein_v._United_States