SpikingBrain: China's new 'brain-like' AI to cut energy and boost speed

Updated 2025.09.14 18:55 GMT+8
CGTN

Instead of activating an entire network at once, as Transformers do, SpikingBrain only 'fires' specific neurons when they are needed. /AI-generated image/CGTN

China has unveiled what is being described as the world's first large-scale, brain-inspired artificial intelligence (AI) system, a potential game-changer for both global technology development and energy sustainability.

Developed by scientists at the Chinese Academy of Sciences' Institute of Automation, the new model, named SpikingBrain 1.0, marks a bold step away from traditional AI architecture and toward a new kind of intelligence, one that mimics how the human brain processes information.

More importantly, it achieves this without using Nvidia GPUs (graphics processing units), the high-performance chips that dominate current AI systems and are now subject to U.S. export restrictions to China.

This breakthrough comes amid intensifying global interest in making AI more efficient, less energy-hungry, and less dependent on a few key hardware suppliers.

 

What makes SpikingBrain different?

Most of today’s large language models (LLMs), such as ChatGPT or GPT-4, are based on a framework called the Transformer architecture, introduced in 2017. It was a revolutionary step forward, allowing machines to process language in parallel rather than word by word. This led to faster training times and more powerful models.

But there is a trade-off. These systems require enormous computational power, relying on GPUs to handle complex parallel processing. Training and running these models demands vast data centres and consumes significant energy, raising environmental and economic concerns.

SpikingBrain 1.0 does things differently. It is based on spiking neural networks, a technology inspired by how biological brains work.

Instead of activating an entire network at once, as Transformers do, SpikingBrain only "fires" specific neurons when they're needed. This event-driven approach mirrors how the human brain operates: most neurons remain inactive until triggered by relevant stimuli.

The result? An energy-efficient system that responds faster and can even learn from significantly less data.
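For readers who want a feel for the mechanism, the sketch below shows a toy "leaky integrate-and-fire" neuron, the textbook building block of spiking networks. It is a conceptual illustration only, not code from SpikingBrain 1.0, and the threshold and leak values are arbitrary assumptions.

# Minimal leaky integrate-and-fire (LIF) neuron: a common way to model
# event-driven "spiking" behaviour. Not SpikingBrain's actual code;
# parameter values are illustrative only.
def lif_neuron(inputs, threshold=1.0, leak=0.9):
    """Accumulate input over time and emit a spike (1) only when the
    membrane potential crosses the threshold; otherwise stay silent (0)."""
    potential = 0.0
    spikes = []
    for x in inputs:
        potential = leak * potential + x   # integrate input, with leakage
        if potential >= threshold:
            spikes.append(1)               # neuron "fires" an event
            potential = 0.0                # reset after spiking
        else:
            spikes.append(0)               # no event: no downstream work
    return spikes

# The neuron stays inactive for weak inputs and fires only once enough
# stimulus has accumulated, mirroring the sparse, event-driven activity
# described above.
print(lif_neuron([0.2, 0.1, 0.9, 0.0, 0.3, 0.8]))  # prints [0, 0, 1, 0, 0, 1]

Because most neurons stay silent most of the time, the bulk of the network does no work on any given input, which is where the claimed energy savings come from.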

 

How fast is it?

According to a non-peer-reviewed technical paper published on arXiv, a public research repository, SpikingBrain 1.0 can handle long-form tasks at speeds up to 100 times faster than conventional AI models.

In one demonstration, a smaller version of the model processed a four-million-token input in a fraction of the time required by conventional models.

The system has been tested in continuous operation over several weeks using China's MetaX chip platform, developed domestically by Shanghai-based MetaX Integrated Circuits Co.

 

Why is this development significant?

One of the most notable features of SpikingBrain 1.0 is that it does not rely on Nvidia chips. This is crucial given that the United States has banned the export of high-end AI chips to China.

Most Western AI systems depend on Nvidia's powerful GPUs, which have become a bottleneck in global AI development. 

SpikingBrain was trained and run entirely on Chinese-built hardware, making it a strategic milestone for China's domestic AI ecosystem.

Lead researcher Li Guoqi from the Chinese Academy of Sciences explained that the model opens a new path for AI development, providing a framework optimized for Chinese chip platforms while delivering high performance and energy efficiency.

He said it could be useful to process long sequences of data such as legal documents, medical records or scientific simulations.

 

What are the environmental benefits?

Energy consumption is one of the AI industry's biggest challenges.

Training just one large model can consume hundreds of megawatt-hours of electricity. Running these models at scale, as done by global tech firms, contributes to significant carbon emissions, and adds pressure on energy infrastructure.

SpikingBrain 1.0's energy-efficient design directly addresses this concern. By only activating parts of the network when needed, it consumes far less power. This leads to:

- Lower electricity usage during training and inference

- Reduced cooling requirements in data centres

- Smaller carbon footprint, making AI development more sustainable

If adopted widely, this type of architecture could help AI evolve in a way that aligns with climate targets, potentially shifting AI from being a growing environmental challenge to becoming part of the solution.

 

Can it compete with Western models?

According to the researchers, SpikingBrain 1.0 was trained using just 150 billion tokens, roughly two percent of the data used by some mainstream LLMs, yet still demonstrated comparable performance to leading open-source alternatives.

The team built two versions of the model: a seven-billion-parameter version for general use and a more advanced 76-billion-parameter version for complex tasks.

The team has open-sourced the smaller version and made the larger model available for public testing. 

On the demo website, the model introduces itself: "Hello! I'm SpikingBrain 1.0, or 'Shunxi,' a brain-inspired AI model. I combine the way the human brain processes information with a spiking computation method, aiming to deliver powerful, reliable, and energy-efficient AI services entirely built on Chinese technology."

 

What could this mean for the future of AI?

If the model's reported performance holds up to peer-reviewed scrutiny, SpikingBrain 1.0 could reshape the global AI landscape in several ways. 

With AI development no longer tied to high-end GPUs, more countries and smaller organizations could access powerful AI tools, reducing dependence on scarce hardware.

Additionally, by dramatically cutting energy usage, spiking neural networks could become the standard for sustainable AI in the future. And models like SpikingBrain could be deployed on users' phones, laptops or edge devices, reducing the need for vast cloud infrastructure.

SpikingBrain suggests an alternative future in which AI learns and reacts more like the human brain, potentially leading to more natural and adaptive systems.

The model is still new, but it is a clear effort by China to chart its own course in AI, free from Western hardware dependencies, while addressing the environmental and computational limitations of existing AI technologies. It opens a new chapter in the race toward faster, greener and more autonomous artificial intelligence.
