AI Regulation is Still Addressing Fictional Problems
California's SB-1047 bill is a case in point. Who is paying for this?
Hi Everyone,
My jumping off point for this post is yet another post from Gary Marcus, who wrote today on his Substack in defense of California's new AI bill, commonly known as SB-1047, currently on the floor, which would hold "the party responsible and the original developer of that model" responsible for systems that run amok, or, in the language of the bill, go on to create "critical harms"—mass casualties, weapons of mass destruction, and other horrors. What's going on here?

Okay, I'm not keen on AI regulation, but in this case it's not regulation per se but the wording of the bill that's wacky. Gary's defense of the bill roused me to offer a few points by way of perspective. I'm not the last word on the issue, but I do have some concerns that we're paddling out into abstractions again. Let me begin.
First, so far anyway, "AI regulation" proposals have been alarmingly impervious to social and cultural context—and common sense. Holding the developers of an LLM/AI responsible for its (we must assume) errors is on its face suspect. Unlike, say, automobiles, which if they significantly malfunction all but guarantee lawsuits against the manufacturer (or the last mechanic, or, well, someone), we already know that LLMs have an error rate, that some of these errors can damage business bottom lines or even endanger public safety, that they can be "jailbroken" and otherwise hacked, and so on. We don't slap a sticker on a new Volvo that says "This car may occasionally veer off the road for no reason. Driver beware." There's an assumption of proper function. But the errors of LLMs are baked into the technology itself. They can't be eliminated, as everyone including Marcus is at pains to explain. So the fact that errors might occur is already common knowledge to the end user. What's the regulation for?