Discussion about this post

Shon Pan

Notice the astroturfing even here, where the objectors to SB-1047 have consistently written no other comments, follow no subscriptions, etc.

Herbert Roitblat

My objection to this bill (CA SB-1047) is not that it would be fatal, or even significant, to the companies producing LLMs; it is that the bill is silly. How many LLMs does it take to change a lightbulb? Yeah, that's right, LLMs cannot change lightbulbs. They can't do anything but model language. They don't provide any original information that could not be found elsewhere. They are fluent, but not competent.

The bill includes liability for "enabling" catastrophic events. The latest markup revises that to "materially enable," but that is still too vague. Computers enabled the Manhattan Project. Was that material? Could it have been foreseen by their developers?

The silliest provision is the requirement to install a kill switch, "the capability to promptly enact a full shutdown," in any model before training.

The risks it seeks to mitigate might be real for some model, some day, but not today. The current state-of-the-art models do not present the anticipated risks, yet the criteria for what constitutes a "covered model" are all stated relative to current models (e.g., the number of FLOPs used in training or the cost of training). Those criteria would not necessarily apply to future models, for example models built on quantum computing, or they may sweep in too many models that present no risk at all. Future models may be trainable for less than $100 million, for example, and would then be excluded from this regulation. That makes no sense: the bill applies today's criteria to models that do not yet exist, and those criteria may not be relevant to the models that eventually do present risks.
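To make the threshold objection concrete, here is a minimal Python sketch (my own illustration, not text from the bill) of the kind of fixed-cutoff test being criticized. The 1e26-FLOP and $100 million figures are the commonly reported criteria for a "covered model," and the function name is hypothetical; treat both thresholds as assumptions. A future model trained below the cost cutoff falls outside the definition no matter how capable it is.

```python
# Illustrative sketch of a static "covered model" test; thresholds are assumptions.
FLOP_THRESHOLD = 1e26          # assumed training-compute cutoff
COST_THRESHOLD = 100_000_000   # assumed training-cost cutoff, in USD

def is_covered_model(training_flops: float, training_cost_usd: float) -> bool:
    """A model counts as 'covered' only if it exceeds both fixed thresholds."""
    return training_flops >= FLOP_THRESHOLD and training_cost_usd >= COST_THRESHOLD

# A hypothetical future model that is very capable but cheaply trained
# is excluded by the cost cutoff alone:
print(is_covered_model(training_flops=5e26, training_cost_usd=50_000_000))  # False
```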

What this bill does is respond to, and provide government certification of, the hype surrounding GenAI models. It lends official weight to the claim that these models are more powerful than they are. Despite industry protestations, this bill is a gift to the industry. If today's models are not genuinely (and generally) intelligent, they will be within a few years (or so the bill presumes), and so this specific kind of regulation is needed now. The state is contributing to marketing based on science fiction. That is silly.

Finally, the bill creates a business model for "third party" organizations to certify that the models are safe. For the foreseeable future, those parties will be able to collect large fees without actually having to do any valuable work.

Today's models do not present the dangers anticipated by this bill, and it is dubious whether any future model ever will. The California legislature is being conned, and that is why I object to this bill. Stop the hype.
