Physical Address
304 North Cardinal St.
Dorchester Center, MA 02124

Electricity is a critical input for artificial intelligence, but new hardware is outpacing data center operators' ability to manage their relationship with the power grid, forcing some operators to run as much as 30% below capacity.
“There’s a lot of wasted energy in these AI factories,” Nvidia CEO Jensen Huang said at the company’s annual GTC conference. “Every watt not used is money lost.”
Today, a startup called Niv emerged from stealth with $12 million in seed funding to tackle that problem, using new sensors to precisely measure GPU power draw and building better control tools on top.
The Tel Aviv-based startup was founded last year by CEO Tomer Timor and CTO Edward Kizis, and is backed by Glilot Capital, Grove Ventures, Arc VC, Encoded VC, Leap Forward, and Aurora Capital Partners. The company declined to share its valuation.
As AI labs use thousands of GPUs to train and serve advanced models, power draw swings sharply on millisecond timescales as processors switch between computing and communicating with other GPUs.
These surges make it difficult for data centers to manage the power they draw from the grid. To avoid being caught short, operators either pay for short-term energy storage to buffer the spikes or throttle their GPU utilization. Either way, the return on investment in expensive chips goes down.
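To make the economics concrete, here is a minimal back-of-the-envelope sketch in Python. The wattages and phase durations are hypothetical, not figures from Niv or Nvidia; the point is simply that grid provisioning must cover the peak draw while the chips only consume the average.

```python
# Illustrative sketch (hypothetical numbers): why millisecond power swings
# strand capacity. A GPU alternates between a high-power compute phase and
# a lower-power communication phase.
compute_w, comm_w = 700, 250                 # assumed per-GPU draw in watts
cycle = [compute_w] * 6 + [comm_w] * 4       # 6 ms compute, 4 ms comm

trace = cycle * 100                          # 1 second of millisecond samples
peak = max(trace)
avg = sum(trace) / len(trace)

# Provisioned power must cover the peak, but the chips only use the average.
stranded_fraction = 1 - avg / peak
print(f"peak={peak} W, avg={avg:.0f} W, stranded={stranded_fraction:.0%}")
# → peak=700 W, avg=520 W, stranded=26%
```

With these made-up numbers, roughly a quarter of the provisioned power goes unused, in the same ballpark as the 30% figure cited above.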
“We can’t continue to build data centers the way we’re building them now,” said Lior Handelsman, a partner at Grove Ventures who sits on Niv’s board.
The first step on Niv’s roadmap is observability: the company is now shipping rack-level sensors that measure power consumption at millisecond resolution on individual GPUs. The goal is to understand the power dynamics of different deep learning workloads, and to develop analytical methods that let data centers unlock capacity they already have.
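As an illustration of what millisecond-resolution telemetry makes possible, here is a toy surge detector in Python. The function name, window size, and threshold are assumptions for the sketch; Niv has not published how its sensors or analytics actually work.

```python
from collections import deque

def detect_surges(samples_w, window=5, threshold_w=300):
    """Flag sample indices where power jumps more than threshold_w watts
    above the rolling average of the previous `window` samples.
    Hypothetical logic, purely for illustration."""
    history = deque(maxlen=window)
    surges = []
    for i, w in enumerate(samples_w):
        if len(history) == window and w - sum(history) / window > threshold_w:
            surges.append(i)
        history.append(w)
    return surges

# A steady 250 W trace with a brief 700 W spike, sampled every millisecond.
trace = [250] * 10 + [700] * 3 + [250] * 10
print(detect_surges(trace))
# → [10, 11]
```

A coarser sensor averaging over, say, one-second windows would miss this spike entirely, which is why millisecond resolution matters for this kind of detection.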
Eventually, the company hopes to train an AI model on the data it collects, with the aim of predicting and coordinating electrical load across the data center: a “copilot” for data center operators.
Niv hopes to have the system running in a few US data centers in the next six to eight months. It’s a timely idea, because hyperscalers trying to build new data centers face land-use and power-supply constraints. The startup envisions its end product as an “intelligent layer” between the data center and the power grid.
“The industry is afraid that data centers are using too much energy at a certain time,” Timor told TechCrunch. “The problem we’re looking at is a problem on both sides of the cable.” Niv is trying to help data centers use more of their GPUs, and ideally make more of the power they’re already paying for.