Walrus, io.net bring GPUs and decentralized storage to AI builders


This is a segment from The Drop newsletter. Subscribe to read the full edition.

io.net, a distributed cloud computing provider with its own IO token, is working with the Walrus team to enable startups to train, run and store their own custom AI models.

io.net provides a network of GPUs for AI training and fine-tuning, while Walrus enables storage of AI models. The combined offering will be available on a pay-as-you-go basis, meaning builders are billed only for the storage and computing power they use.

This allows AI agent devs or AI app builders to develop and operate AI models without needing to set up their own data centers or hardware.

io.net says it has over 10,000 GPUs and CPUs around the world.

The BYOM offering from Walrus and io.net will compete with other AI developer cloud services like Bittensor, Lambda, Spheron, Akash, Gensyn, Vast AI, and Google’s Vertex AI.

“The traditional centralized cloud model is not only expensive, it also carries significant privacy risks and limited options that challenge developers who prioritize decentralization,” said Rebecca Simmons, executive director at the Walrus Foundation.

“By leveraging distributed data storage solutions, io.net can provide the computing power needed for advanced AI and ML development without the drawbacks of traditional models, making this a clear win for developers, users, and the Web3 industry as a whole,” the exec continued.

The internet can feel very centralized when it goes through an outage, like the one Google’s giant cloud service experienced last week (which impacted Cloudflare and various other apps and sites). That’s one of the obvious reasons why centralized AI compute may not be ideal.


Walrus’ mainnet launched in March, offering programmable, decentralized data storage. The Walrus Foundation announced a $140 million raise that month.

More broadly, demand for AI computing power is expected to keep growing every year. Researchers at McKinsey predict that data centers worldwide will require $6.7 trillion by 2030 to meet demand (though that figure varies across the range of scenarios modeled).
