> Strict limits on governmental regulation wherein any restrictions must be demonstrably necessary and narrowly tailored to a compelling public safety or health interest.
> Mandatory safety protocols for AI-controlled critical infrastructure, including a shutdown mechanism and compulsory annual risk management reviews.
Read: industry can do whatever it wants, but the government also has to put up barriers to entry that favor large incumbents.
This has nothing to do with rights or even computing, it's just regulatory capture.
terminalshort 2 minutes ago [-]
There are no small businesses building data centers.
tzs 2 hours ago [-]
Including a shutdown mechanism and doing an annual risk management review favors large incumbents?
autoexec 2 hours ago [-]
The shutdown mechanism would have existed anyway, and a "risk management review" sounds exactly like the sort of toothless policy that's supposed to make people feel better without actually putting any limits or enforcement on the industry.
Onavo 1 hour ago [-]
Not to mention 100K in consultancy fees for compliance.
ToucanLoucan 4 hours ago [-]
You know if we're gonna pass laws to make it illegal for the government to interfere with the Torment Nexus, the least they could do is not gaslight us with the fucking name of the law. Just tell us the billionaires get to fuck the planet in the eye and the rest of us have to deal with it, at least it's honest that way.
jfengel 2 hours ago [-]
Practically every law, and lobbying organization, is named for exactly the opposite of what it does. If I see the Puppies and Orphans Protection Act of 2028, I assume its purpose is to use puppies to strangle orphans. Proponents will point to the limitation on how many puppies you can use per orphan.
Similarly, if I see the People For X organization, I assume they are against X. The Committee for Green Spaces and Clean Air is guaranteed to be an oil company.
Once you develop that reflex, everything calms down. Though admittedly, I passed a sign for Fidos for Freedom. I'm not quite sure what Fidos Against Freedom does. I think they give dogs to disabled people, and they bark at you if you try to leave the house.
idle_zealot 1 hour ago [-]
There is something that this tactic misses: when people try to do good things, the name of their organization or policy is usually pretty honest. In an environment like ours, though, that still means that your strategy of assuming the opposite meaning has something like a 95% expected success rate.
ToucanLoucan 2 hours ago [-]
All I can think of is Dr. Augustine from Avatar. "They're just pissing on us without even the courtesy of calling it rain."
OGEnthusiast 2 hours ago [-]
[dead]
bigfishrunning 3 hours ago [-]
They can't be that blatant, that's how you lose your next term
saghm 2 hours ago [-]
The second term for the "drain the swamp" president implies otherwise (it did take another cycle, but that arguably had more to do with covid than corruption).
quotemstr 2 hours ago [-]
And is that supposed to be a bad thing?
saghm 2 hours ago [-]
At the very least, it's a bit weird to call it a Right to Compute if the actual goal is to enable investments. It's hard not to wonder whether it's about establishing a right at all, or whether that's entirely posturing to build support for something that isn't really about rights. At that point, it's hard to trust anything else they're saying about their motives, since they've established that they're willing to fudge things to make it harder to argue against.
The point isn't whether it's bad or good, but that it establishes a pattern of inconsistency.
Mistletoe 3 hours ago [-]
So it should be renamed Right to Datacenter Act. And here I thought they were giving people power over their private computers and being surveilled on them…
Reminds me of some bill in my state about Right to Farm; when you looked deeper, it was about rights for huge corporate hog farms to dump waste in the rivers. The slimiest corps always do this 1984-level doubletalk when they name their bills. It's a dead giveaway. Citizens United: oh wow, cool, this is about protecting citizens!
hermannj314 4 hours ago [-]
When a "right to..." law is passed, there is usually an accompanying narrative that explains a past injustice that will be corrected. Matthew Shepard hate crime, Civil Rights Voting act, etc.
The absence of such a story makes me think this law doesn't protect shit. What exactly did a Montanan get killed or arrested trying to do with a computer that is now protected? Can I use AI during a traffic stop, or use AI to surveil and doxx government employees? What exactly is the government giving up by granting me this right?
Or is this just about suppressing opposition to data centers?
culi 4 hours ago [-]
Yeah I think it's pretty obviously the AI industry trying to ban its own regulation
> Nationally, the Right to Compute movement is gaining traction. Spearheaded by the grassroots group RightToCompute.ai, the campaign argues that computation — like speech and property — is a fundamental human right. “A computer is an extension of the human capacity to think,” the organization states.
staplers 3 hours ago [-]
> computation — like speech and property — is a fundamental human right
Computation, however, requires a vast supply chain where certain middlemen have a near monopoly on distribution of said "fundamental right." The incentives for lobbyists seem clear.
I don't necessarily disagree with the idea, but until profit is shared with taxpayers, this is a one-way transaction of taxpayers bankrolling AI companies.
dismalaf 3 hours ago [-]
Regulation is just regulatory capture by incumbents and also a national security risk.
hdgvhicv 3 hours ago [-]
You argue that food safety regulations are just regulatory capture?
jfengel 2 hours ago [-]
Aggravatingly, some of it is. The organic food regulations are impossible for the small farmers who invented the idea. Only megacorps can comply, and their definition is not much better (if at all) than that of industrial farms.
It's still way better than Upton Sinclair's time. But it would be nice if the FDA and USDA were run by people who eat rather than sell food.
dismalaf 1 hour ago [-]
To start a restaurant where I live, it's $50k in fees and mandatory paperwork before you can even get a construction permit. A lot of it is, yes.
And none of it prevents bad food handling practices by minimum wage staff.
Eh, if states can pass restrictive laws on AI in absence of a correspondingly negative motivating event, I don't see any contradiction in doing the opposite.
autoexec 2 hours ago [-]
> if states can pass restrictive laws on AI in absence of a correspondingly negative motivating event...
If you mean besides the extensive harm to air quality, the large land footprint of data centers, the massive strain on water resources and treatment facilities, the insane electricity demands resulting in skyrocketing prices pushed onto everyone else, the deafening noise pollution, and what they've done to the price of RAM, then sure. And that's just the data centers!
The usage of AI itself has resulted in all kinds of harm and even actual deaths. AI has wrongfully denied people healthcare coverage they were entitled to, preventing or delaying needed surgeries and treatments. There's a growing list of LLM-related suicides (https://en.wikipedia.org/wiki/Deaths_linked_to_chatbots). The use of AI in parole systems has kept people locked behind bars when they shouldn't have been, due to biases in the bots making decisions. AI used in self-driving cars has killed pedestrians and other drivers. There are thousands of AI-generated harms tracked here: https://airisk.mit.edu/ai-incident-tracker
tt24 25 minutes ago [-]
God I hate safetyism
gosub100 3 hours ago [-]
so the jobs have to be lost _first_ , then we can ban it?
cortesoft 2 hours ago [-]
Job loss is a horrible reason to ban something. Think about our history if we always did that. We would all be stuck working on farms today, because we didn’t want to allow tractors or other machinery because it would take away farming jobs.
Instead of banning tech to save jobs, pass laws that make sure tech prices in externalities (tax carbon emissions), and find other ways to assist people who lose jobs (UBI, good social safety nets, etc).
Don’t stifle progress just because it makes us have to work less.
pocksuppet 1 hour ago [-]
Right - fix the economy instead. Why should increasing efficiency cause people to have fewer resources? That makes no sense.
Joker_vD 26 minutes ago [-]
Because there are people who live off rent (in a broad sense of the word), and there are people who live off selling their ability to work. Increased efficiency and productivity may or may not benefit the second kind of people, depending on whether they can sell their labour to be used for something else.
cortesoft 19 minutes ago [-]
So instead of figuring out ways to limit the ability of people to live off rent, we want to ban beneficial things that people could extract rent from?
This is like saying, "We don't like how landlords extract value from housing, so we are banning apartment buildings"
gosub100 19 minutes ago [-]
Banning AI does increase efficiency. It makes it more efficient for a working class family to afford to survive. What perverted definition of the word were you considering?
cortesoft 17 minutes ago [-]
How is this different from saying "Banning mechanical farm equipment does increase efficiency, it makes it more efficient for farm workers to afford to survive"
You are fighting against productivity improvements when you should be fighting against people hoarding the benefits of productivity improvements.
gosub100 11 minutes ago [-]
That doesn't answer my question. My claim is that people working is efficient.
TimorousBestie 1 hours ago [-]
> Job loss is a horrible reason to ban something. Think about our history if we always did that.
The US has continually set up protectionist policies to preserve a local workforce. Automotive manufacturers, more recently the shipbuilding industry, etc.
tt24 24 minutes ago [-]
These are bad things
ukuina 2 hours ago [-]
Car dealerships would like a word.
cortesoft 2 hours ago [-]
Another example why these types of laws make things worse for people.
Ukv 2 hours ago [-]
If the idea was that laws must be motivated by a negative occurrence rather than being preemptive, then that'd follow, yeah (if counting job loss as a reason to ban something, which I think is questionable). But note that akersten is saying it's normal for laws to be preemptive in both cases.
sroussey 3 hours ago [-]
Just like when musicians were on strike and the radio people decided to play a recording over the air (gasp! a record!) rather than live performances.
A nice ban on playing recorded music would have saved those jobs.
gosub100 1 hour ago [-]
Bad example. You are agreeing that copyright is owned by the people whose work an AI agent is trained on. Sure, come take a class of jobs, and then pay them in perpetuity to license the exposition of their work. For 75 years after the author's death, just like current copyright.
moate 3 hours ago [-]
>>absence of a correspondingly negative motivating event.
You don't think there are reasons to pass laws banning AI... datacenters?
Because what state is banning the concept of AI? They're banning/restricting the creation of a type of infrastructure within their borders because they feel it is detrimental to their citizens. Maybe it's NIMBY/Luddite BS to you, but people not wanting their resources to go toward ensuring some dork can have a chat-bot girlfriend seems normal to me.
hparadiz 3 hours ago [-]
I'm already running an LLM locally. This is just me renting space in a data center. Since when did we restrict people's ability to do things? For the record my local models run off the solar bolted to my roof. Even including the data center I'm using 1/10th of the energy we were using on tube monitors back in the 90s. This is exhausting. My GPU would be demonstrably using more power by playing a videogame right now than when I run a local LLM.
jrmg 3 hours ago [-]
> Since when did we restrict people's ability to do things?
This question is not the obvious winner you think it is. To me, and I am sure many, it sort of undermines your argument.
Even in the most ‘free' cultures, society has _always_ restricted people’s individual ability to do things that it collectively deems harmful to the whole society.
hparadiz 3 hours ago [-]
This is literally why America was founded. Too many people stifle innovation. Move to Europe if you want to be stuck in the 20th century, frankly. That doesn't mean we can't take care of folks. But the Luddites need to get the fuck out of the way. You're all exhausting.
pocksuppet 59 minutes ago [-]
America was founded because rich people didn't want to pay taxes.
cheeeeeeeese 3 hours ago [-]
[dead]
Arainach 3 hours ago [-]
>Since when did we restrict people's ability to do things?
When those things impact other people - such as by skyrocketing utility prices, overloading the electrical grid, and more.
hparadiz 3 hours ago [-]
I thought this was a free market? Or is that not how things work anymore?
Arainach 3 hours ago [-]
Never has been. A totally free market doesn't work and has failed every time it was tried. You want one today, go set up shop in Somalia.
hparadiz 2 hours ago [-]
I can't respect that opinion. It's full of holes.
pocksuppet 58 minutes ago [-]
What are the holes? There are places today with no government - perfect free markets. If you think perfect free markets are awesome, you can move there and do business there. It's a bit like telling someone who loves communism to go to China.
Arainach 2 hours ago [-]
Holes such as what?
There have always been rules and laws. The US has never been a totally free market. Most of the laws and rules we have were written in blood by people professing a "free market" right to poison our people, rivers, air, and more.
sumeno 3 hours ago [-]
> Since when did we restrict people's ability to do things?
At least 4000 years ago, but that's just the earliest we have evidence for
I don't think you understand the qualifier. I meant in the tradition of liberal free markets that have unlocked human potential on the global scale. I'm saying no it's actually good that you don't have to ask the local government when you want to do something. If American style free markets didn't gain traction we'd still be doing subsistence farming.
tadfisher 27 minutes ago [-]
The thing is, since we recognized that such a tradition led to the unfettered destruction of the natural environment which we depend upon to survive, we have decided that local governments should be responsible for preserving said environment by regulating the destructive actions performed by the liberal free market. Not doing so will even destroy our ability to perform subsistence farming in the long run.
moate 3 hours ago [-]
>> when did we restrict people's abilities to do things?
That's literally what most laws are: saying what you can and can't do. This is, like, a foundational understanding of what government/regulation is.
>>this is just me renting space...
Okay, so a "network effect" is when things have greater impact due to larger usage. So the data center usage that you're talking about does not represent the overall impact of the data center. Saying "I only pour ONE cup of bleach into the ocean, so I don't see why it's so bad to have the bleach factory pump all its waste in as well" is a WILD take.
cortesoft 2 hours ago [-]
Why should we stop there? Let’s ban people flying on vacations, because why should our resources go towards some dork laying out in the sun? Air travel is horribly wasteful. Let’s ban people racing cars, that is also wasteful. We shouldn’t be using our resources to drive in circles.
How do we pick which activities are worth using resources? Which ones are too ‘dorky’ to allow?
Look, I am all for pricing the externalities into resource consumption. Tax carbon production, to make sure energy consumption is sustainable, but don’t dictate which uses of energy are acceptable or ‘worth it’, because I don’t want only mainstream things to be allowed.
akersten 3 hours ago [-]
I didn't say any of that in my comment, nor did I express an opinion about this whole thing writ large. I'm only pointing out that it's not weird for a legislature to preempt a real-world use case, by way of pointing out similar laws.
moate 3 hours ago [-]
I'm going to do this again:
>>>>absence of a correspondingly negative motivating event.
What did you mean? Why do you believe there has not been a motivating event to ban data centers when those bans have happened, which is literally what you said?
akersten 3 hours ago [-]
In the context of the discussion, a correspondingly negative event would have been along the lines of "we built a data center and then it exploded, we need to make sure that doesn't happen." Not "we're worried about the effects the data center might have," which is analogous to "we're worried about the effects banning AI might have." All I'm saying is neither of those last two are weird reasons to enact a law.
GP was insisting that "rights" named laws always come after some negative event and it is weird that we have this "rights" named law without someone being deprived of their computation or whatever. I'm disagreeing with the premise that that's weird by pointing out laws preempt real world events all the time, in either direction (restrictive or permissive).
baggy_trough 3 hours ago [-]
> Maybe it's NIMBY/Luditte BS to you, but people not wanting their resources to go help ensure some dork can have a chat-bot girlfriend seems normal to me.
Why would it be your business, or anyone else's, to stop someone from doing this?
Arainach 3 hours ago [-]
Because these data centers are at best overstressing utility grids and elevating prices for everyone, and at worst running dirty generators and poisoning entire communities, for a start.
15155 2 hours ago [-]
Oh no, we couldn't possibly generate more power! Impossible! We're at our limit!
China has 100 reactors under construction - meanwhile in the West, folks like you exist.
lukeschlather 4 hours ago [-]
I was really hoping this gave people the right to use their computers, but it really looks like it simply prevents "the government" from regulating the right to "make use of computational resources." So Google or Apple can still prevent me from using my phone for lawful purposes, the government just can't regulate it (and the government might not be able to write restrictions that prevent manufacturers from violating my right to compute.)
sophrosyne42 2 hours ago [-]
Google or Apple only hold the ability to prevent you from using your phone because the government itself enforces IP. So the restriction against regulations in this bill is only a partial and incomplete restriction against the government interfering with people's right to compute.
einpoklum 3 hours ago [-]
Imagine if Montana required all compute platforms sold in the state to be free of user restrictions: that they be amenable to modification; that all source code, firmware, and hardware specs be open; and, when that is not the case, that the company be compelled to release the relevant information on pain of having assets seized, being required to refund payments, etc. That would have been a hoot :-)
sroussey 2 hours ago [-]
Simply not sell to that state.
dynm 4 hours ago [-]
I think the main content of this law (https://legiscan.com/MT/text/SB212/id/3212152) is just two paragraphs. I'd suggest reading them yourself rather than relying on secondary description:
"Government actions that restrict the ability to privately own or make use of computational resources for lawful purposes, which infringes on citizens' fundamental rights to property and free expression, must be limited to those demonstrably necessary and narrowly tailored to fulfill a compelling government interest."
"When critical infrastructure facilities are controlled in whole or in part by a critical artificial intelligence system, the deployer shall develop a risk management policy after deploying the system that is reasonable and considers guidance and standards in the latest version of the artificial intelligence risk management framework from the national institute of standards and technology, the ISO/IEC 42001 artificial intelligence standard from the international organization for standardization, or another nationally or internationally recognized risk management framework for artificial intelligence systems. A plan prepared under federal requirements constitutes compliance with this section."
In particular, I think the reporting is straight wrong that there's a shutdown requirement. That was in an earlier version (https://legiscan.com/MT/text/SB212/id/3078731) and remains in the title of this version, but seems to have been removed from the actual text.
RobRivera 4 hours ago [-]
So the government is afforded the opportunity to constrict compute if it serves a government interest.
This bill seems to expand powers, not restrict them.
dynm 3 hours ago [-]
Before the law, I think the state government or local governments could (by passing a law) restrict computing for any reason, even without a government interest. Now, they'd have to repeal this first.
RobRivera 3 hours ago [-]
How?
I know the whole '90s meme of 'I am a controlled munition' went around because cryptography was labeled an ordnance subject to export control laws, and therefore code that performed those kinds of computations was forbidden to be sold abroad, with violations liable as a felony.
What happens today? Government gets rights to source code, logs, and rubber stamps/rejects your code from executing in the cloud?
Government limits your access to commodity infrastructure?
dynm 2 hours ago [-]
How? By default, state governments can pass basically whatever laws they want. They don't have (theoretically) limited enumerated powers like the federal government.
RobRivera 2 hours ago [-]
I'm not asking for policy mechanics, I'm asking for implementation detail clarification.
torginus 2 hours ago [-]
Ah, finally something that the common man wants. A mandatory risk management strategy compliant with ISO/IEC guidelines
tzs 2 hours ago [-]
Hmmm. "[...] the deployer shall develop a risk management policy after deploying the system [...]".
I wonder why it is after rather than before?
toomanystraws 3 hours ago [-]
"... the deployer shall develop a risk management policy after deploying the system...."
This is a complete sham. Anything really geared towards protecting people would have protections in place before deployment.
scuff3d 3 hours ago [-]
When you contextualize the law with comments like this:
"The initiative... contrasts with recent restrictive legislation efforts in states like California and Virginia. Zolnikov, a noted advocate for privacy, has been instrumental in pushing for tech-friendly policies that ensure individual liberties in an evolving digital landscape.
"'As governments around the world and in our own country try to crack down on individual freedom and gain state control over modern technologies,' Zolnikov said. 'Montana is doing the opposite by protecting freedom and restraining the government.'"
And it's the normal framing we always see with this crap. This is more an attempt to protect corporations from regulation than it is to protect individuals.
hnsdev 4 hours ago [-]
With laws such as the Brazilian one or the one proposed in New York, I am curious to know what will be the future for computing.
On one hand, forbidding and limiting people from using computers as they wish is somewhat impossible, as too many computers without restrictions have already been produced. You can always use old hardware and, with open source projects, fork an old version that will respect your right to compute. At some point, though, it will be a problem, as hardware no longer works and software becomes incompatible with everything. The thing is that those who will probably be doing it are mostly people who already grew accustomed to not living in an Orwellian state, while, on the other hand, newer generations will all be using new systems with these restrictions as if they were normal. The smart ones will find ways of circumventing them (as if it would be hard to get your parents' CC and verify that you're over 18).
Given that, they will be computing in a restrictive and controlled environment. I feel sorry for them.
I am going to college (Computer Science) as an older student with previous experience in programming, and it never ceases to amaze me that the current generation of students doesn't think out of the box and is completely dependent on ChatGPT. We all suffered from conditioning from governments and corporations throughout the years, but it is accelerating at an alarming rate.
Acts like this (the one from Montana) are positive, but it's unfortunate that they have to exist at all, and they're somewhat irrelevant when the big dogs (California, New York, and whole countries such as Australia) approve legislation that will promptly be followed by most companies/projects, which will in turn force this way of doing things everywhere else.
heavyset_go 3 hours ago [-]
This won't touch age verification and surveillance laws, it's not meant to protect people, it's meant to protect the interests of capital
muyuu 38 minutes ago [-]
I want a right to compute without having to identify myself, or otherwise give any information about myself to the computing system itself.
I was hoping for that as a reaction to the current tyrannical movements worldwide to end anonymous personal computing.
matheusmoreira 2 hours ago [-]
Pointless and deceptive. A real "right to compute" law would ban remote attestation, would ban discrimination against users based on the "trustworthiness" of their systems, would force companies to allow custom software and firmware as well as provide technical documentation and specifications to users so they can repair and modify the systems they bought.
tt24 19 minutes ago [-]
Wrong. This is not how rights work.
You have the right to not provide custom software and firmware and technical documentation, the right to enforce remote attestation, and the right to refuse service to whoever you wish.
s_dev 4 hours ago [-]
I really dislike how 'compute' as a noun took over 'computational' as an adjective. I just find that the sentence 'I need more computational resources' flows so much nicer than 'I need more compute'.
DennisP 4 hours ago [-]
"Right to compute" sounds to me more like they're using "compute" as a verb, which predates "computational" by a couple centuries.
moate 3 hours ago [-]
Someone said 'right to computers' and someone else said 'that sounds dumb... make it compute!'
hackyhacky 4 hours ago [-]
Interpret the word "compute" in the title as a verb, not a noun. "I have the right to compute" is analogous grammatically to "I have the right to vote" or "I have the right to assemble"
moate 3 hours ago [-]
Glad Montana is securing the right to do math.
anonym29 2 hours ago [-]
It's hilarious that they think it needs to be codified into law. As if the right to do math weren't intrinsic, and could even theoretically be revoked by the government, lol.
ahsillyme 12 minutes ago [-]
I think it betrays cynicism about the tendency for single-objective optimizing market actors to rent-seek and cartelize. I don't think it's a stretch at all. On the surface it would be equally preposterous to suggest that breathing could be theoretically revoked by the government, which truly is preposterous but we do have those laws in place depending on whether the air you breathe has "illegal substances" in it. But then again, explicit revocation is a high bar when you can throttle the free use of computational resources by regulatory capture: the AI incumbents could say, for example, that AI is so dangerous that it must be kept out of the hands of the unwashed masses. Another excellent strategy (with a rather high bar to entry) would be to distort the markets themselves by ensuring that your prospective renters can't afford basic compute.
codethief 4 hours ago [-]
The "compute" in "right to compute" could also be a verb, though. :-)
hackyhacky 4 hours ago [-]
How about "we've got the best nuclear"
jasonlotito 3 hours ago [-]
Compute is the...
FTA: right to own, access, and use computational resources
It's a verb.
soulofmischief 4 hours ago [-]
Well, language evolves, and I personally prefer compute as a noun when talking about resources. It's great though because we can each say it in our preferred way without judging one another.
sockaddr 3 hours ago [-]
I agree. This is language evolving. If someone from the 16th century could hear a modern well-educated person speak English today they would likely be horrified at how degenerate it would sound to them.
So I don't think current English is in some perfect state that should not change.
On god.
PaulDavisThe1st 2 hours ago [-]
Please don't judge me for what I say, or do, or who I really am.
jasonlotito 3 hours ago [-]
It's a verb, not a noun.
arjie 2 hours ago [-]
One of America's greatest strengths is its structure as a federation. It allows states like this one to take the lead in expanding datacenter infrastructure while other states can choose to shut down such expansions. This was perhaps more significant in COVID-19 reactions in America, but datacenters have few such externalities, and so this is an even more compelling example of variation between states.
The scaling of federal power with population is also significant as states like Texas that allow for more housing to be built will probably receive more seats at the next apportionment while states like California will lose seats. Overall, pretty neat to see the design of America work quite well like this.
cortesoft 2 hours ago [-]
I generally agree with this idea, but
> but datacenters have few such externalities
Is wild. Energy consumption is one of the biggest externalities that exists today, since global climate change is completely independent of location. Greenhouse gases do not care about borders.
tadfisher 11 minutes ago [-]
Also wild when Musk is freebasing methane in Tennessee with zero consequences.
PaulDavisThe1st 2 hours ago [-]
While it is not a true "externality", data center use of water is a strong community/regional cost that effectively removes 1 person/1 vote. Those with the financial resources to buy more water get the water, those without do not.
Perhaps you think that the distribution of financial resources reflects what is in society's best interests - that Meta, Google et al. have demonstrated their utility in ways that make them literally more important than people with insufficient wealth to outbid those companies for water.
Many of us do not.
polski-g 1 hour ago [-]
"Datacenter water usage" is a comment I'd expect to see on Reddit--not a VC forum with allegedly intelligent people.
PaulDavisThe1st 1 hour ago [-]
I live in New Mexico. I do not consider Hacker News to be a VC forum. For what it's worth (which is very little), I was employee #2 at amzn if I need some sort of credentials to get you to respond constructively to my point rather than with some hand-wavey ad hominem.
bradley13 2 hours ago [-]
Most of the comments are cynical. I read this, at least the "right to compute" part, as a reaction to the current onslaught of censorship and age verification laws. Which is a good thing.
The AI part honestly looks fairly harmless, just applying existing standards, but I may be wrong there...
torginus 2 hours ago [-]
Excuse me, the article states that this bill was signed on April 17th of last year. How has it suddenly become relevant now?
I would say considering there has been almost a year since this bill was signed, what happened since then? Was it applied to hurt people's interests? Did it drive investment?
Are Montanans demonstrably better or worse off because of this in some way?
elgertam 1 hours ago [-]
Montana is both cold and sparsely populated, so I figure data centers would be good there. Also, I figure Zefram Cochrane could use all that compute for his warp theories in a few decades.
maxerickson 28 minutes ago [-]
I expect that access to inexpensive energy is more important than population density, typical ambient temperature, or "favorable" state laws (this law doesn't seem to really do all that much).
Like the region I live in is cold and has lots of water, but we import energy, might as well build closer to the regional mega cities (where it is still relatively cold, with relatively abundant water). There is some kerfuffle going on in the county here about preventing data centers, and I can't imagine there is even anyone interested in building one.
dlev_pika 2 hours ago [-]
They are so proud to hand over all control to the corporations, while pretending this is a consumer-protection-centered bill - wild
preinheimer 3 hours ago [-]
What about a “right to create act” giving people the right to create things and not have their creation be ingested to train ai for billion dollar companies?
ProllyInfamous 3 hours ago [-]
Some sort of pre-emptive auto-opt-AI't.
It's ridiculous that AIco's arguments are dwindling down to "it's not copyright infringement to ingest others' work and make 'derivatives' [which often are identical to original authors' works]."
----
We desperately need younger politicians, who can not only keep up with information more sharply (i.e. aren't legally decades-retireable), but also are of the age where their own children are being affected by government re-funding flows away from youth/education/future.
At this point I'm willing to concede that our future probably has companies' individual LLM/genAI products competing against one-another, as digital politicians ["the digital pimp, hard at work... we have needs"--Matrix' Mouse]. Nobody knows how either flesh nor silicon congressmen work, inside; but I think the latter could act more human[e]ly...
PaulDavisThe1st 2 hours ago [-]
Do you believe that for younger people this question (about derivativeness) is clearly settled? If so, how?
TL;DR: Basically the AI industry trying to ban governments from regulating it
kmeisthax 4 hours ago [-]
This is extremely light on details, but I'm pretty sure "Right to Compute" has absolutely nothing to do with software freedom and everything to do with making it harder to oppose giant datacenter buildouts for AI companies, so they can blast you with infrasound, spike the price of electricity and RAM, and build surveillance systems to take away your rights.
perfect-blue 4 hours ago [-]
My thoughts exactly. It reads a lot like they are trying to minimize the state's power to regulate AI. I'm not sure that's such a good thing. Regulation is one of the only ways that we can manage the ``bads'' that come with any new technology. In the US, we've never been very good at regulating new technologies before industry stakeholders entrench themselves in the lobbying circuit.
hrimfaxi 4 hours ago [-]
Well they do define compelling government interest to include
> "Compelling government interest" means a government interest of the highest order in protecting the public that cannot be achieved through less restrictive means. This includes but is not limited to: (a) ensuring that a critical infrastructure facility controlled by an artificial intelligence system develops a risk management policy; (b) addressing conduct that deceives or defrauds the public; (c) protecting individuals, especially minors, from harm by a person who distributes deepfakes and other harmful synthetic content with actual knowledge of the nature of that material; and (d) taking actions that prevent or abate common law nuisances created by physical datacenter infrastructure.
(d) seems to address that, potentially.
glaslong 3 hours ago [-]
Proactively shielding themselves from the eventual, justified realization that spiking a population's price of water and electricity such that they cannot use them IS an externality just as bad as polluting the water supply.
jeffbee 4 hours ago [-]
It amuses me how contradictory the two bullet points from the article are.
- Strict limits on governmental regulation, wherein any restrictions must be demonstrably necessary and narrowly tailored to a compelling public safety or health interest.
- Mandatory safety protocols for AI-controlled critical infrastructure, including a shutdown mechanism and compulsory annual risk management reviews.
How were the necessity and scope of the second rule shown to satisfy the first rule?
In essence, it doesn't really mandate anything; it says you should have a plan, and only for "critical infrastructure facilities":
"Section 4. Infrastructure controlled by critical artificial intelligence system. (1) When critical infrastructure facilities are controlled in whole or in part by a critical artificial intelligence system, the deployer shall develop a risk management policy after deploying the system that is reasonable and considers guidance and standards in the latest version of the artificial intelligence risk management framework from the national institute of standards and technology, the ISO/IEC 4200 artificial intelligence standard from the international organization for standardization, or another nationally or internationally recognized risk management framework for artificial intelligence systems. A plan prepared under federal requirements constitutes compliance with this section."
So it's essentially lip service to AI safety, probably to quell some objections to a bill that otherwise limits regulation of tech platforms.
jeffbee 4 hours ago [-]
I did read it. The point is there are no findings that justify the regulation in light of the grant of rights in the same bill. The only WHEREAS that approaches the level of a finding amounts to "many are saying..."
janice1999 4 hours ago [-]
The 2nd rule is clearly intended to be a shield and distraction. It's there to pretend the law serves the public, when in reality it's designed to defend datacenter builders from the public interest. Politicians can talk about meaningless sci-fi concepts like SkyNet and how they can defeat it with off switches, instead of real issues like noise pollution, tax giveaways, electricity prices, and mass surveillance.
hnsdev 4 hours ago [-]
Probably one applies to individuals while the other, as described, applies to infrastructure.
dlev_pika 2 hours ago [-]
Orwell called it "doublespeak"
selectively 3 hours ago [-]
The tragedy is that 'right to compute' is such a great name for something actually useful. Requiring OEMs to allow users to load any OS they want, requiring OEMs to allow full control over a device/OS ('root access') etc.
Instead, it's wasted on AI slop.
amelius 3 hours ago [-]
Yeah, "you can own compute hardware" doesn't really help if nobody makes hardware that can be owned.
Nevermark 3 hours ago [-]
"Right to Compute 2.0" sounds good to me. Might as well slipstream.
The "Citizen Right to Compute" complement to the "Data Center Right to Compute".
Use the latter as leverage for the former. What politician wants to be seen voting down a (comparable) individual right they already gave to data centers?
j2kun 2 hours ago [-]
The article is full of PR-speak. What is really going on in this law?
kid64 3 hours ago [-]
What an egregiously disingenuous piece of legislation. Not surprised.
carlsborg 2 hours ago [-]
This is why Montana Civil Defense survives when skynet goes rogue.
elophanto_agent 2 hours ago [-]
montana: where you can compute freely but the nearest data center is 400 miles away and the latency is measured in geological epochs
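(For scale, 400 miles of fiber is actually only single-digit milliseconds of round-trip latency, not epochs. A rough back-of-envelope sketch, assuming signal propagation in fiber at roughly two-thirds the speed of light in vacuum:)

```python
# Rough propagation-delay estimate for a 400-mile fiber run.
# Assumption: light in fiber travels at about 2/3 of c in vacuum.
C_VACUUM_KM_S = 299_792   # speed of light in vacuum, km/s
FIBER_FRACTION = 2 / 3    # typical refractive-index slowdown in fiber

def round_trip_ms(miles: float) -> float:
    """Round-trip propagation delay in milliseconds, ignoring routing/queuing."""
    km = miles * 1.609344
    one_way_s = km / (C_VACUUM_KM_S * FIBER_FRACTION)
    return 2 * one_way_s * 1000

print(f"{round_trip_ms(400):.1f} ms")  # -> 6.4 ms
```

Real-world latency would be higher once routing hops and queuing are added, but still nowhere near geological.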
152334H 3 hours ago [-]
> Apr 21, 2025
why is this posted now?
jamesgill 2 hours ago [-]
"The initiative, propelled by advocacy from State Senator Daniel Zolnikov and organizations like the Frontier Institute"
So what does liberal even mean these days? California is passing bs like age verification in the OS, and Montana is protecting my right to live the way I want in my own home, running whatever AI models suit me as long as I am not bothering anyone. That's just another "none of the government's business" personal-freedom issue like pot or sexuality, so why aren't blue states all over it? And yes, using tuned LLMs can be like an acid trip, but the distance between having a trip at home and tangible harm is much greater than in the case of access to guns, knives, power tools, cars and rodent poison, yet at least some of these are widely available to law-abiding citizens in every state. Government interventions can be staged at the points where there is evidence of actual imminent harm, like problematic public behavior. Why are Democrats the new "Reefer Madness" pearl clutchers, and why should I still believe they have anything to do with living the way you want?
righthand 3 hours ago [-]
This is a law designed to force data centers to be built. This is nothing but a bipartisan corporate handout. Nothing to celebrate. The law makers should be ashamed.
EDIT for the downvoters, from the law:
> Any restrictions placed by the government on the ability to privately own or make use of computational resources for lawful purposes must be limited to those demonstrably necessary and narrowly tailored to fulfill a compelling government interest in public health or safety.
This basically means you can't use government action to stop the building of a data-center.
https://frontierinstitute.org/frontier-institute-statement-i...
Ah.
The point isn't whether it's bad or good, but that it establishes a pattern of inconsistency.
Reminds me of some bill in my state about Right to Farm and when you looked deeper it was about rights for huge corporate hog farms to dump waste in the rivers. The slimiest corps always do this 1984 level double talk when they name their bills. It’s a dead giveaway. Citizens United, oh wow cool this is about protecting citizens!
The absence of such a story makes me think this law doesn't protect shit. What exactly did a Montanan get killed or arrested trying to do with a computer that is now protected? Can I use AI during a traffic stop, or use AI to surveil and doxx government employees? What exactly is the government giving up by granting me this right?
Or is this just about suppressing opposition to data centers?
> Nationally, the Right to Compute movement is gaining traction. Spearheaded by the grassroots group RightToCompute.ai, the campaign argues that computation — like speech and property — is a fundamental human right. “A computer is an extension of the human capacity to think,” the organization states.
I don't necessarily disagree with the idea, but until profit is shared with taxpayers, this is a one-way transaction of taxpayers bankrolling AI companies.
It's still way better than Upton Sinclair's time. But it would be nice if the FDA and USDA were run by people who eat rather than sell food.
And none of it prevents bad food handling practices by minimum wage staff.
If you mean besides the extensive harm to air quality, the large land footprint of data centers, the massive strain on water resources and treatment facilities, the insane electricity demands resulting in skyrocketing prices pushed onto everyone else, the deafening noise pollution, and what they've done to the price of RAM, then sure. And that's just the data centers!
The usage of AI itself has resulted in all kinds of harm and even actual deaths. AI has wrongfully denied people healthcare coverage they were entitled to, preventing or delaying needed surgeries and treatments. There's a growing list of LLM-related suicides (https://en.wikipedia.org/wiki/Deaths_linked_to_chatbots). The use of AI in parole systems has kept people locked behind bars when they shouldn't have been, due to biases in the bots making decisions. AI used for self-driving cars has killed pedestrians and other drivers. There are thousands of AI-generated harms tracked here: https://airisk.mit.edu/ai-incident-tracker
Instead of banning tech to save jobs, pass laws that make sure tech prices in externalities (tax carbon emissions), and find other ways to assist people who lose jobs (UBI, good social safety nets, etc).
Don’t stifle progress just because it makes us have to work less.
This is like saying, "We don't like how landlords extract value from housing, so we are banning apartment buildings"
You are fighting against productivity improvements when you should be fighting against people hoarding the benefits of productivity improvements.
The US has continually set up protectionist policies to preserve a local workforce. Automotive manufacturers, more recently the shipbuilding industry, etc.
A nice ban on playing recorded music would have saved those jobs.
You don't think there are reasons to pass laws banning AI...datacenters?
Because what state is banning the concept of AI? They're banning/restricting the creation of a type of infrastructure within their borders because they feel it is detrimental to their citizens. Maybe it's NIMBY/Luddite BS to you, but people not wanting their resources to go toward ensuring some dork can have a chat-bot girlfriend seems normal to me.
This question is not the obvious winner you think it is. To me, and I am sure many, it sort of undermines your argument.
Even in the most ‘free' cultures, society has _always_ restricted people’s individual ability to do things that it collectively deems harmful to the whole society.
When those things impact other people - such as by skyrocketing utility prices, overloading the electrical grid, and more.
There have always been rules and laws. The US has never been a totally free market. Most of the laws and rules we have were written in blood by people professing a "free market" right to poison our people, rivers, air, and more.
At least 4000 years ago, but that's just the earliest we have evidence for
https://en.wikipedia.org/wiki/Code_of_Ur-Nammu
> this is just me renting space...

Okay, so a "network effect" is when things have greater impact due to larger usage. So the data center usage that you're talking about does not represent the overall impact of the data center. Saying "I only pour ONE cup of bleach into the ocean, so I don't see why it's so bad to have the bleach factory pump all its waste in as well" is a WILD take.
How do we pick which activities are worth using resources? Which ones are too ‘dorky’ to allow?
Look, I am all for pricing the externalities into resource consumption. Tax carbon production, to make sure energy consumption is sustainable, but don’t dictate which uses of energy are acceptable or ‘worth it’, because I don’t want only mainstream things to be allowed.
> absence of a correspondingly negative motivating event.

What did you mean? Why do you believe there has not been a motivating event to ban data centers, when those bans have happened, which is literally what you said?
GP was insisting that "rights" named laws always come after some negative event and it is weird that we have this "rights" named law without someone being deprived of their computation or whatever. I'm disagreeing with the premise that that's weird by pointing out laws preempt real world events all the time, in either direction (restrictive or permissive).
Why would it be your business, or anyone else's, to stop someone from doing this?
China has 100 reactors under construction - meanwhile in the West, folks like you exist.
"Government actions that restrict the ability to privately own or make use of computational resources for lawful purposes, which infringes on citizens' fundamental rights to property and free expression, must be limited to those demonstrably necessary and narrowly tailored to fulfill a compelling government interest."
"When critical infrastructure facilities are controlled in whole or in part by a critical artificial intelligence system, the deployer shall develop a risk management policy after deploying the system that is reasonable and considers guidance and standards in the latest version of the artificial intelligence risk management framework from the national institute of standards and technology, the ISO/IEC 4200 artificial intelligence standard from the international organization for standardization, or another nationally or internationally recognized risk management framework for artificial intelligence systems. A plan prepared under federal requirements constitutes compliance with this section."
In particular, I think the reporting is straight wrong that there's a shutdown requirement. That was in an earlier version (https://legiscan.com/MT/text/SB212/id/3078731) and remains in the title of this version, but seems to have been removed from the actual text.
This bill seems to expand powers, not restrict them.
I know the whole 90s meme of 'I am a controlled munition' went around because cryptography was labeled an ordnance subject to export control laws, and therefore code that performed those kinds of computations was forbidden to be sold abroad, liable to a felony.
What happens today? Government gets rights to source code, logs, and rubber stamps/rejects your code from executing in the cloud?
Government limits your access to commodity infrastructure?
I wonder why it is after rather than before?
This is a complete sham. Anything really geared towards protecting people would have protections in place before deployment.
"The initiative... contrasts with recent restrictive legislation efforts in states like California and Virginia. Zolnikov, a noted advocate for privacy, has been instrumental in pushing for tech-friendly policies that ensure individual liberties in an evolving digital landscape.
"'As governments around the world and in our own country try to crack down on individual freedom and gain state control over modern technologies,' Zolnikov said. 'Montana is doing the opposite by protecting freedom and restraining the government.'"
And it's the normal framing we always see with this crap. This is more an attempt to protect corporations from regulation than it is to protect individuals.
Given that, they will be computing in a restrictive and controlled environment. I feel sorry for them.
I am going to college (Computer Science) as an older student with previous experience in programming, and it never ceases to amaze me that the current generation of students doesn't think out of the box and is completely dependent on ChatGPT. We all suffered from conditioning from governments and corporations throughout the years, but it is accelerating at an alarming rate.
Acts like this (the one from Montana) are positive, but it's unfortunate that they have to exist at all, and they become somewhat irrelevant when the big dogs (California, New York, and whole countries such as Australia) approve legislation that most companies/projects will promptly follow, which in turn forces this way of doing things everywhere else.
I was hoping for that as a reaction to the current tyrannical movements worldwide to end anonymous personal computing.
You have the right to not provide custom software and firmware and technical documentation, the right to enforce remote attestation, and the right to refuse service to whoever you wish.
FTA: right to own, access, and use computational resources
It's a verb.
So I don't think current English is in some perfect state that should not change.
On god.
The scaling of federal power with population is also significant as states like Texas that allow for more housing to be built will probably receive more seats at the next apportionment while states like California will lose seats. Overall, pretty neat to see the design of America work quite well like this.
Always follow the money: https://www.sourcewatch.org/index.php/Frontier_Institute