For regulators trying to put guardrails on AI, it’s mostly about the arithmetic. Specifically, an AI model trained using a total of 10 to the 26th floating-point operations must now be reported to the U.S. government and could soon trigger even stricter requirements in California.
Say what? Well, if you’re counting the zeroes, that’s 100,000,000,000,000,000,000,000,000, or 100 septillion, calculations, using a measure known as flops.
What it signals to some lawmakers and AI safety advocates is a level of computing power that might enable rapidly advancing AI technology to create or proliferate weapons of mass destruction, or conduct catastrophic cyberattacks.
It’s already dangerous. Lavender AI has been used by Israel to “identify Hamas agents”. Anduril (a name they do not deserve) is working on drones that will autonomously identify targets and attack them. It’s already enabling killing to be done without remorse.
It doesn’t have to be “powerful” to be dangerous. People just have to believe that it is and/or believe what it craps out without fact checking it.
Then there’s also the fact that it’s driving up demand for energy and keeping dirty power plants online.
goes to look for some perspective
A Radeon 7900 XTX can do 61.4 teraflops.
The federal threshold is 10^26 operations of total training compute, i.e. 100 yotta-flops’ worth. At 61.4 teraflops, a single card would take roughly 50,000 years to get there, so figure a system with training hardware the equivalent of about a million Radeon 7900 XTXes running for around three weeks.
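If you want to check that arithmetic yourself, here’s a quick back-of-envelope sketch. It assumes the threshold is total operations and that the card sustains its peak 61.4 TFLOP/s the whole time, which real training runs won’t:

```python
# Back-of-envelope check of the "about a million GPUs" figure.
# Assumptions: 10**26 total training operations, and a Radeon 7900 XTX
# sustaining its peak 61.4 TFLOP/s (FP32) for the entire run.

THRESHOLD_OPS = 10**26      # federal reporting threshold, total operations
GPU_FLOPS = 61.4e12         # Radeon 7900 XTX peak FP32, operations per second

seconds_one_gpu = THRESHOLD_OPS / GPU_FLOPS
years_one_gpu = seconds_one_gpu / (365 * 24 * 3600)
print(f"one GPU: {years_one_gpu:,.0f} years")           # ~51,600 years

gpus = 1_000_000
days_million_gpus = seconds_one_gpu / gpus / (24 * 3600)
print(f"{gpus:,} GPUs: {days_million_gpus:.0f} days")   # ~19 days
```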
I have to say I’m not sure that training compute is the strongest metric to use – I think you could probably get human-level intelligence out of a plain old desktop CPU in a reasonable amount of time, given the right software and the training data that exists in 2024 – but it’s also not very intrusive and it’s an easy bar to evaluate.
Measuring how much data your brain stores is kinda hard – some of it is in an encoding that has to provide for redundancy – but I don’t think it’s too controversial to say that it’s probably within what a hard drive, or at most a small array of drives, can hold today.
Estimates of the storage capacity of the brain vary from 10^10 to 10^15 binary digits. I incline to the lower values and believe that only a very small fraction is used for the higher types of thinking.
– Alan Turing, Computing Machinery and Intelligence
Treating binary digits as bits, that’d be roughly 1GB to 125TB.
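The conversion is just bits to bytes; here’s the arithmetic, assuming “binary digits” means bits and using decimal units:

```python
# Converting Turing's estimate (10^10 to 10^15 binary digits, i.e. bits)
# into familiar storage units: 8 bits per byte.

low_bits, high_bits = 10**10, 10**15
low_bytes = low_bits / 8      # 1.25e9  bytes ~ 1.25 GB
high_bytes = high_bits / 8    # 1.25e14 bytes ~ 125 TB

print(f"{low_bytes / 1e9:.2f} GB to {high_bytes / 1e12:.0f} TB")
```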
https://aiimpacts.org/information-storage-in-the-brain/
The brain probably stores around 10-100TB of data.
You can get a 10TB hard drive these days. So for training you’re talking about maybe touching every bit on a hard drive at least once. The question is how much actual computational work is required to process it. Might not be a lot.
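For scale, if the training set really were brain-sized (the ~10TB figure above) and you spent the full federal threshold on it, the compute budget per bit is enormous. A rough sketch, assuming just those two numbers:

```python
# How much compute per bit of training data does the threshold imply?
# Assumptions: 10 TB of training data and the 10**26-operation threshold.

THRESHOLD_OPS = 10**26
data_bits = 10e12 * 8        # 10 TB expressed in bits = 8e13

ops_per_bit = THRESHOLD_OPS / data_bits
print(f"{ops_per_bit:.1e} operations per bit")   # ~1.2e12, about a trillion
```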
As for runtime, your brain does have a lot of parallel capacity, but it also has a “clock” on the order of 100 Hz, which isn’t all that fast for serial computation. A lot of problems are constrained by serial computation capacity; you can trivially serialize any parallel computation, but you cannot parallelize every serial computation.
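The standard way to quantify that serial bottleneck is Amdahl’s law (not something the comment above cites, just the usual formulation). Even a small inherently serial fraction caps how much all that parallel hardware can help:

```python
# Amdahl's law: speedup from parallel hardware is capped by the fraction
# of the work that is inherently serial. Offered only to illustrate the
# serial-bottleneck point above.

def amdahl_speedup(serial_fraction: float, workers: int) -> float:
    """Maximum speedup with `workers` parallel units when `serial_fraction`
    of the computation cannot be parallelized."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / workers)

# With 1% serial work, even a million parallel units tops out near 100x.
for workers in (10, 1_000, 1_000_000):
    print(workers, round(amdahl_speedup(0.01, workers), 1))
```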
When companies start using it