
Category: Volunteer Computing

32B AI Model Trained by a Swarm of Volunteer GPUs – Inside INTELLECT-2’s Decentralized Revolution

A Globally Distributed AI Training Milestone

In May 2025, the Prime Intellect research team unveiled INTELLECT-2, a 32-billion-parameter large language model (LLM) trained not on a single data-center supercomputer but on a globally distributed "swarm" of volunteer GPUs. This makes INTELLECT-2 the first LLM of its scale to be trained via fully asynchronous…