Why do things always go from bad to worse? Why can't they go from bad to somewhat less bad?
In personal news, how many nos is one expected to get before getting a yes?
( I managed to find some non-doom-and-gloom links to shove in here as well )
How are you doing?

| Response | Votes |
| --- | --- |
| I am OK. | 11 (73.3%) |
| I am not OK, but don't need help right now. | 4 (26.7%) |
| I could use some help. | 0 (0.0%) |

How many other humans live with you?

| Response | Votes |
| --- | --- |
| I am living single. | 5 (31.2%) |
| One other person. | 7 (43.8%) |
| More than one other person. | 4 (25.0%) |
A Senate bill re-introduced this week threatens security and free speech on the internet. EFF urges Congress to reject the STOP CSAM Act of 2025 (S. 1829), which would undermine services offering end-to-end encryption and force internet companies to take down lawful user content.
Tell Congress Not to Outlaw Encrypted Apps
As in the version introduced last Congress, S. 1829 purports to limit the online spread of child sexual abuse material (CSAM), also known as child pornography. CSAM is already highly illegal. Existing law already requires online service providers who have actual knowledge of “apparent” CSAM on their platforms to report that content to the National Center for Missing and Exploited Children (NCMEC). NCMEC then forwards actionable reports to law enforcement agencies for investigation.
S. 1829 goes much further than current law and threatens to punish any service that works to keep its users secure, including those that do their best to eliminate and report CSAM. The bill applies to “interactive computer services,” which broadly includes private messaging and email apps, social media platforms, cloud storage providers, and many other internet intermediaries and online service providers.
The bill makes it a crime to intentionally “host or store child pornography” or knowingly “promote or facilitate” the sexual exploitation of children. The bill also opens the door for civil lawsuits against providers for the intentional, knowing or even reckless “promotion or facilitation” of conduct relating to child exploitation, the “hosting or storing of child pornography,” or for “making child pornography available to any person.”
The terms “promote” and “facilitate” are broad, and civil liability may be imposed based on a low recklessness state-of-mind standard. This means a court could find an app or website liable for hosting CSAM even if it did not know it was hosting CSAM, for example because the provider employed end-to-end encryption and could not view the content its users uploaded.
Creating new criminal and civil claims against providers based on broad terms and low standards will undermine digital security for all internet users. Because the law already prohibits the distribution of CSAM, the bill’s broad terms could be interpreted as reaching more passive conduct, like merely providing an encrypted app.
Due to the nature of their services, encrypted communications providers who receive a notice of CSAM may be deemed to have “knowledge” under the criminal law even if they cannot verify and act on that notice. And there is little doubt that plaintiffs’ lawyers will (wrongly) argue that merely providing an encrypted service that can be used to store any image—not necessarily CSAM—recklessly facilitates the sharing of illegal content.
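To make that technical point concrete, here is a minimal sketch of end-to-end encryption using the PyNaCl library. The `relay_server` function is purely illustrative, not any real provider's API; the point is that the provider's server only ever handles ciphertext, so it has nothing it can "verify" when a notice arrives.

```python
from nacl.public import PrivateKey, Box

# Each endpoint generates its own keypair; the provider never holds
# the private keys.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts directly to Bob's public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"a perfectly lawful family photo")

def relay_server(blob: bytes) -> bytes:
    # Illustrative stand-in for the provider. No key material lives
    # here, so the server cannot distinguish a vacation photo from
    # anything else; it only sees an opaque blob.
    return blob

delivered = relay_server(bytes(ciphertext))

# Only Bob, holding his private key, can decrypt.
receiving_box = Box(bob_key, alice_key.public_key)
print(receiving_box.decrypt(delivered))
```

The middle function is exactly the position an E2EE provider is in when it receives a takedown notice: an opaque blob and no way to inspect it.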
While the bill includes an affirmative defense that a provider can raise if it is “technologically impossible” to remove the CSAM without “compromising encryption,” it is not sufficient to protect our security. Online services that offer encryption shouldn’t have to face the impossible task of proving a negative in order to avoid lawsuits over content they can’t see or control.
First, by making this protection an affirmative defense, providers must still defend against litigation, with significant costs to their business. Not every platform will have the resources to fight these threats in court, especially newcomers that compete with entrenched giants like Meta and Google. Encrypted platforms should not have to rely on prosecutorial discretion or favorable court rulings after protracted litigation. Instead, specific exemptions for encrypted providers should be addressed in the text of the bill.
Second, although technologies like client-side scanning break encryption, members of Congress have misleadingly claimed otherwise. Plaintiffs are likely to argue that providers who do not use these techniques are acting recklessly, leading many apps and websites to scan all of the content on their platforms and remove any content that a state court could find, even wrongfully, is CSAM.
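For illustration, here is a hedged sketch of what hash-based client-side scanning looks like and why it is described as breaking end-to-end encryption: the check runs on the user's device before encryption, against an opaque list the user cannot audit. The `FLAGGED` set and the use of SHA-256 are simplifications for the sketch (real deployments use perceptual hashes such as PhotoDNA), not any vendor's actual implementation.

```python
import hashlib

# Opaque, vendor-supplied list of flagged digests. Contents here are
# purely hypothetical.
FLAGGED = {hashlib.sha256(b"example flagged image bytes").hexdigest()}

def scan_before_encrypting(plaintext: bytes) -> bool:
    """Runs on the user's device, before any encryption happens."""
    return hashlib.sha256(plaintext).hexdigest() in FLAGGED

message = b"example flagged image bytes"
if scan_before_encrypting(message):
    # The match is established pre-encryption, so it can be reported
    # even though the message itself travels end-to-end encrypted.
    # A third party's unauditable list now sits between the two
    # endpoints, which is why scanning at this layer is described as
    # breaking the end-to-end guarantee.
    print("content flagged before encryption")
```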
The bill allows a new type of lawsuit to be filed against internet platforms, accusing them of “facilitating” child sexual exploitation based on the speech of others. It does this by creating an exception to Section 230, the foundational law of the internet and online speech. Section 230 provides partial immunity to internet intermediaries when sued over content posted by their users. Without that protection, platforms are much more likely to aggressively monitor and censor users.
Section 230 creates the legal breathing room for internet intermediaries to create online spaces for people to freely communicate around the world, with low barriers to entry. However, creating a new exception that exposes providers to more lawsuits will cause them to limit that legal exposure. Online services will censor more and more user content and accounts, with minimal regard as to whether that content is in fact legal. Some platforms may even be forced to shut down or may not even get off the ground in the first place, for fear of being swept up in a flood of litigation and claims around alleged CSAM. On balance, this harms all internet users who rely on intermediaries to connect with their communities and the world at large.
California lawmakers are continuing to promote a bill that would reinforce the power of giant AI companies by burying small AI companies and non-commercial developers in red tape, copyright demands, and, potentially, lawsuits. After several amendments, the bill hasn’t improved much, and in some ways has actually gotten worse. If A.B. 412 passes, it will make California’s economy less innovative and less competitive.
A.B. 412 masquerades as a transparency bill, but it’s actually a government-mandated “reading list” that will allow rights holders to file a new type of lawsuit in state court, even as the federal courts continue to assess whether and how federal copyright law applies to the development of generative AI technologies.
The bill would require developers, even two-person startups, to keep lists of training materials that are “registered, pre-registered or indexed” with the U.S. Copyright Office, and to help rights holders create digital ‘fingerprints’ of those works, a technical task with no established standards and no realistic path for small teams to follow. Even if it were limited to registered copyrighted material, that would be a monumental task, as we explained in March when we examined the earlier text of A.B. 412.
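Because the bill specifies no standard, any implementation of a "fingerprint" is guesswork. As one assumed interpretation, here is a sketch that fingerprints each training file with a SHA-256 digest. Note that such an exact-match hash would not recognize a re-encoded or cropped copy of the same work, which is part of why the requirement is so ill-defined.

```python
import hashlib
from pathlib import Path

def fingerprint(path: Path) -> str:
    # Stream the file in 1 MiB chunks so large training files don't
    # need to fit in memory.
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical usage over a training corpus directory:
# for p in Path("training_data").rglob("*"):
#     if p.is_file():
#         print(p, fingerprint(p))
```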
The bill’s amendments have made compliance even harder, since it now requires technologists to go beyond copyrighted material and somehow identify “pre-registered” copyrights. The amended bill also has new requirements that demand technologists document and keep track of when they look at works that aren’t copyrighted but are subject to exclusive rights, such as pre-1972 sound recordings—rights that, not coincidentally, are primarily controlled by large entertainment companies.
The penalties for noncompliance are steep—up to $1,000 per day per violation—putting small developers at enormous financial risk even for accidental lapses.
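To put that figure in perspective with a purely hypothetical example: a developer found to have mis-documented 100 works for 30 days would face up to 100 × 30 × $1,000 = $3,000,000 in exposure.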
The goal of this list is clear: for big content companies to more easily file lawsuits against software developers, big and small. And for most AI developers, the burden will be crushing. Under A.B. 412, a two-person startup building an open-source chatbot, or an indie developer fine-tuning a language model for disability access, would face the same compliance burdens as Google or Meta.
It’s critical to remember that AI training is very likely protected by fair use under U.S. copyright law—a point that’s still being worked out in the courts. The idea that we should preempt that process with sweeping state regulation is not just premature; it’s dangerous.
It’s also worth noting that copyright is governed by federal law. Federal courts are already working to define the boundaries of fair use and copyright in the AI context—the California legislature should let them do their job. A.B. 412 tries to create a state-level regulatory scheme in an area that belongs in federal hands—a risky legal overreach that could further complicate an already unsettled policy space.
A.B. 412 is a solution in search of a problem. The courthouse doors are far from closed to content owners who want to dispute the use of their copyrighted works: multiple high-profile lawsuits over the copyright status of works used in AI training are working their way through trial and appellate courts right now.
Rather than narrowing its focus to make compliance more realistic, the latest amendments to A.B. 412 actually expand the scope of covered works. The bill now demands documentation of obscure categories of content like pre-1972 sound recordings. These recordings have rights that are often murky, and largely controlled by major media companies.
The bill also adds “preregistered” and indexed works to its coverage. Preregistration, designed to help entertainment companies punish unauthorized copying even before commercial release, expands the universe of content that developers must track—without offering any meaningful help to small creators.
Ironically, the companies that will benefit most from A.B. 412 are the very same large tech firms that lawmakers often claim they want to regulate. Big companies can hire teams of lawyers and compliance officers to handle these requirements. Small developers? They’re more likely to shut down, sell out, or never enter the field in the first place.
This bill doesn’t create a fairer marketplace. It builds a regulatory moat around the incumbents, locking out new competitors and ensuring that only a handful of companies have the resources to develop advanced AI systems. Truly innovative technology often comes from unknown or small companies, but A.B. 412 threatens to turn California—and anyone who does business there—into a fortress where only the biggest players survive.
A.B. 412 is becoming an increasingly extreme and one-sided piece of legislation. It’s a maximalist wishlist for legacy rights-holders, delivered at the expense of small developers and the public. The result will be less competition, less innovation, and fewer choices for consumers—not more protection for creators.
This new version does close a few loopholes, and expands the period for AI developers to respond to copyright demands from 7 days to 30 days. But it seriously fails to close others: for instance, the exemption for noncommercial development applies only to work done “exclusively for noncommercial academic or governmental” institutions. That still leaves a huge window to sue hobbyists and independent researchers who don’t have university or government jobs.
While the bill nominally exempts developers who use only public or developer-owned data, that’s a carve-out with no practical value. Like a search engine, nearly every meaningful AI system relies on mixed sources, and developers can’t realistically track the copyright status of them all.
At its core, A.B. 412 is a flawed bill that would harm the whole U.S. tech ecosystem. Lawmakers should be advancing policies that protect privacy, promote competition, and ensure that innovation benefits the public—not just a handful of entrenched interests.
If you’re a California resident, now is the time to speak out. Tell your legislators that A.B. 412 will hurt small companies, help big tech, and lock California’s economy in the past.