The massive amount of energy needed to power artificial intelligence could slow its development in the U.S., according to a Connecticut database security and privacy expert.
Connecticut and other states would have to significantly improve their energy infrastructure to keep up with the electricity needed for AI computing, said Chetan Jaiswal, a computer science professor at Quinnipiac University and an AI researcher.
“For example, a single chip running for nine days uses more than 27,000 kilowatt-hours. An average household uses approximately 10,000 kilowatt-hours annually,” Jaiswal said.
“That’s just one processor running for nine days. You realize the more processors we bring in, the more energy consumption it becomes,” he added.
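To put those figures in perspective, here is a back-of-the-envelope sketch using only the numbers Jaiswal cites; the 27,000 and 10,000 kilowatt-hour values are his quoted examples, not independently verified measurements:

```python
# Rough comparison built from the figures quoted above (assumed, not verified).
CHIP_ENERGY_KWH = 27_000          # one processor running for nine days (quoted)
RUN_HOURS = 9 * 24                # nine days expressed in hours
HOUSEHOLD_KWH_PER_YEAR = 10_000   # average U.S. household per year (quoted)

# Average continuous power draw implied by the quoted energy use.
avg_draw_kw = CHIP_ENERGY_KWH / RUN_HOURS

# How many household-years of electricity one nine-day run consumes.
household_years = CHIP_ENERGY_KWH / HOUSEHOLD_KWH_PER_YEAR

print(f"Implied average draw: {avg_draw_kw:.0f} kW")              # -> 125 kW
print(f"Household-years of electricity: {household_years:.1f}")  # -> 2.7
```

By that arithmetic, a single nine-day run would consume roughly what 2.7 average households use in a full year, at a continuous draw of about 125 kilowatts.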
That growing demand is among the top problems for America in maintaining its edge over China in AI development.
A massive high-speed data center complex has been proposed for Connecticut, next to the Millstone nuclear power plant in Waterford.
It would draw its electricity directly from the nuclear plant and would be the single largest electricity user in the state.
Meanwhile, the General Assembly took no action on an electric vehicle mandate this year because some lawmakers were concerned Connecticut lacked the necessary electricity infrastructure.