AI Is A Gun, Not A Nuke
Authored by Dylan Dean via AmericanThinker.com
It's in vogue, as artificial intelligence becomes more sophisticated, to compare A.I. systems to nuclear weapons. Such analogies have come from traditional media like Bloomberg, niche internet microcelebrities, and the world's most famous A.I. doomer, Eliezer Yudkowsky.
But this comparison does not hold up to scrutiny.
Game theory - the study of how rational actors interact - shows us why A.I. is not the threat it is made out to be. A.I. is more like a gun than a nuclear weapon.
Nuclear weapons are unlike all other weapons because of their destructive power.
If two nations have a nuclear exchange, both sides lose: retaliatory missiles will be in the air before the first strike even lands, leaving no hope of disabling the enemy's arsenal in time to prevent a counterstrike.
For this reason, game theory suggests a doctrine of "Mutually Assured Destruction": any rational actor will avoid using nuclear weapons, because his own side would be destroyed in the process.
Thus, nuclear safety depends on centralizing nuclear capabilities in the hands of a small number of rational actors.
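For readers who want the deterrence logic spelled out, here is a minimal sketch in Python of the game described above. The payoff numbers are illustrative assumptions, chosen only to capture "both sides lose"; they come from no official source.

    STRIKE, REFRAIN = "strike", "refrain"

    # payoffs[(a, b)] -> (payoff to side A, payoff to side B).
    # Guaranteed retaliation means any first strike ends in mutual destruction.
    payoffs = {
        (REFRAIN, REFRAIN): (0, 0),        # peace
        (STRIKE,  REFRAIN): (-100, -100),  # B's retaliation destroys both
        (REFRAIN, STRIKE):  (-100, -100),  # A's retaliation destroys both
        (STRIKE,  STRIKE):  (-100, -100),  # mutual launch
    }

    # Refraining is at least as good for A against every move B can make,
    # so a rational A never launches first; the game is symmetric for B.
    refraining_is_best = all(
        payoffs[(REFRAIN, b)][0] >= payoffs[(STRIKE, b)][0]
        for b in (STRIKE, REFRAIN)
    )
    print("Refraining weakly dominates striking:", refraining_is_best)  # True

Striking first never improves a rational actor's outcome, which is the whole point of the doctrine.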
However, consider firearms within this same framework: the best counter to a mass shooter is another person with a gun, whether a civilian or a law-enforcement officer.
Unlike with nuclear weapons, the risk of gun violence can be mitigated with decentralization. Centralize firearms in the hands of the state, and you have a society of tyranny; centralize them in the hands of criminals, by banning ownership for the law-abiding, and you get a society of plunder. These two are not very different, save for aesthetics. A free and safe society, on the other hand, is the result of a wide dispersal of guns. Guns in the hands of everyday people work as a countervailing force against those who would use guns to take away rights or personal property.
Intelligence, which can be used for both good and evil, operates on a similar principle to that of gun ownership. As with firearms, asymmetric access to intelligence can lead to dangerous situations. This is why the mentally ill are protected through conservatorships (when they aren't being abused — #FreeBritney). It's why we have age of consent laws and why we criminalize elder financial abuse. A situation where mental capability is nearly equal, on the other hand, is a safer one. This is why the U.S. Constitution guarantees the right to a lawyer: the law is an intellectual domain, and defendants without a legal background are at a significant disadvantage. What is really being provided here is the lawyer's mind, with the knowledge and experience necessary to level the playing field.
Artificial intelligence works the same way.
An A.I. model might have access to information and resources vastly greater than an individual person's - like the gap between a prosecutor and a defendant. But just as a lawyer steps in to level the playing field, another A.I. system could play the same equalizing role.
As with guns and lawyers, then, wide access to artificial intelligence is necessary to avoid an asymmetry of resources and information - and the dangerous anti-competitive landscape that centralization would create.
Open-source models like Llama 2 and StableLM level the playing field, or come close, by allowing anyone with sufficient hardware to run and control his own model locally. Promoting the continued development of these models - by avoiding government regulation and increasing nonprofit funding of open-source A.I. research - will keep the playing field as level as possible, allowing the A.I. capabilities of the public to remain near those of large corporations. Regulation, on the other hand, would centralize A.I. resources in the hands of a few large companies, taking away competition - the only countervailing force that could check centralized A.I. in the first place.
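To make "controlling his own model locally" concrete, here is a minimal sketch using the Hugging Face transformers library. The specific model identifier and prompt are assumptions for illustration, and Llama 2 weights additionally require accepting Meta's license on the Hugging Face Hub; any open-weights model stored on the user's own machine works the same way.

    # A minimal local-inference sketch; no API provider sits between
    # the prompt and the completion.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    # Assumed model ID for illustration; swap in any locally available
    # open-weights causal language model.
    model_id = "meta-llama/Llama-2-7b-chat-hf"

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)

    # Generation runs entirely on the user's own hardware.
    inputs = tokenizer("Open-source A.I. matters because", return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=50)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))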
It is no surprise that the people pushing for A.I. regulation tend to also be proponents of gun control — both positions are premised on the same faulty reasoning. On September 13, Chuck Schumer hosted an A.I. summit that included leaders from the largest A.I. companies. Whatever comes of that summit might be good for those companies, and for gun-grabbing Chuck Schumer, but not for the American people.
Proponents of regulation will try to scare people into supporting it, likely by trotting out the same tired comparison to nuclear weapons. We must not fall for it.