Swiss researchers at EPFL have developed Anyway Systems, software that lets powerful AI reasoning models run on local networks of ordinary desktop PCs rather than relying on massive, power-hungry data centers. Traditional large data centers consume huge amounts of electricity and water, depend on rare-earth components, and create supply-chain bottlenecks for GPUs; Anyway Systems addresses this by distributing AI processing across a small cluster of consumer machines, allowing even large models such as the open-source ChatGPT-120B to run locally. Installation takes about 30 minutes. Inference may be slightly slower than in a data center, but accuracy remains comparable, and the local setup offers better data privacy and sovereignty. The approach does not eliminate the need for giant facilities to train new models, but it gives small organizations and groups a more sustainable, decentralized way to use advanced AI without Big Tech infrastructure.
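
To make the idea of "distributing AI processing across a small cluster of consumer machines" concrete, here is a minimal sketch of pipeline-parallel inference over a LAN. It is illustrative only and not Anyway Systems' actual implementation: it assumes PyTorch with the gloo backend, two desktop PCs, a hypothetical LAN address (192.168.1.10) for the first machine, the standard RANK/MASTER_ADDR environment variables, and a toy two-stage model standing in for a real large language model.

```python
# Illustrative sketch only, not Anyway Systems' code.
# Each PC runs one "stage" of the model; activations travel over the LAN.
import os
import torch
import torch.distributed as dist
import torch.nn as nn

RANK = int(os.environ["RANK"])  # 0 on the first PC, 1 on the second
MASTER = os.environ.get("MASTER_ADDR", "192.168.1.10")  # LAN IP of rank 0 (assumed)

def main():
    dist.init_process_group(
        backend="gloo",  # CPU-friendly backend that works over ordinary Ethernet
        init_method=f"tcp://{MASTER}:29500",
        rank=RANK,
        world_size=2,
    )

    hidden = 512
    with torch.no_grad():  # inference only, no gradients needed
        if RANK == 0:
            # First half of the (toy) model lives on PC 0.
            stage = nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU())
            x = torch.randn(1, hidden)  # stand-in for embedded input tokens
            activations = stage(x)
            dist.send(activations, dst=1)  # ship intermediate activations to PC 1
        else:
            # Second half of the (toy) model lives on PC 1.
            stage = nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU(), nn.Linear(hidden, 4))
            activations = torch.empty(1, hidden)
            dist.recv(activations, src=0)  # receive activations from PC 0
            logits = stage(activations)
            print("logits from the distributed forward pass:", logits)

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```

Run the same script on each machine with RANK=0 and RANK=1 respectively. A production system would go much further, sharding real transformer layers, streaming generated tokens back to the user, and handling quantized weights so a 120B-parameter model fits in consumer RAM, which is presumably where most of the engineering effort in a system like this lies.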

