With enormous changes taking place across the technological landscape, IEEE Spectrum has pulled together the biggest computing stories of 2025. Their variety speaks to the challenges, breakthroughs, and evolution shaping the industry, and it underscores the rich contributions of experienced voices like Robert Charette, a longtime technologist and frequent contributor to IEEE Spectrum. The stories offer a glimpse of a literal moonshot, moving data centers to the moon; beyond that, they are a testament to advances in software and computing efficiency, and a warning about both the potential and the perils of today's technology.
Charette's 2005 observations on the preventable causes of software project failure still resonate keenly with industry practitioners, and they remain a wake-up call about the need for effective management practices. As technology evolves, so do the realities of data security, software reliability, and project success as a whole.
The Moon as a Data Sanctuary
This year's collection includes the most audacious effort to date: a plan to build data centers on the moon. This creative approach prioritizes safeguarding vital, sensitive public data from catastrophe, whether natural disasters such as hurricanes, floods, and earthquakes, or man-made ones like cyberattacks.
The moon's unique position, free from any single nation's jurisdiction, is particularly attractive because it presents an opportunity to host data "black boxes" that must operate under varied legal frameworks. Since 2009, data breaches have compromised more than 520 million records, and the frequency is only increasing, with this year poised to be the worst yet. With growing acknowledgement that legacy data centers are vulnerable, organizations are feeling the heat and face a critical need for creative data center solutions that mitigate risk.
Beyond the physical protection this extraterrestrial move would afford, relocating data centers to the moon would provide new layers of privacy and security. By operating outside Earth's regulatory environments, companies may find new ways to manage data in compliance with diverse international laws while protecting themselves against potential breaches.
Reviving Software Innovations
It's no surprise, then, that the rapid evolution of software technologies continues to be a hot topic in the computing sector. Remarkably, Apache Airflow was in dire straits by 2019, on the brink of obsolescence. Thanks to the determination of one dedicated grassroots open-source developer, the project made an astounding comeback.
With the release of Apache Airflow 2.0 in late 2020, the project's popularity found a second wind, bringing with it an energized and growing community of users and developers. The subsequent release of Apache Airflow 3.0 introduced a modular architecture that lets Airflow run natively across different platforms. This adaptability positions Apache Airflow as an essential tool for managing complex workflows in diverse computing environments.
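For readers curious what "managing complex workflows" looks like in practice, here is a minimal sketch of an Airflow pipeline using the TaskFlow API that arrived with the 2.0 release. It assumes Airflow 2.4 or later (where the `schedule` argument replaced `schedule_interval`), and the task names and logic are illustrative placeholders, not drawn from any project described here.

```python
# Minimal Airflow DAG sketch using the TaskFlow API (Airflow 2.x).
# Task names and data are hypothetical placeholders.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False)
def example_pipeline():
    @task
    def extract() -> list[int]:
        # Stand-in for pulling records from a source system.
        return [1, 2, 3]

    @task
    def transform(records: list[int]) -> list[int]:
        # Stand-in for cleaning or enriching the data.
        return [r * 2 for r in records]

    @task
    def load(records: list[int]) -> None:
        print(f"Loaded {len(records)} records")

    # Airflow infers the dependency graph from this data flow.
    load(transform(extract()))


example_pipeline()
```

Expressing dependencies as ordinary function calls is the TaskFlow style: Airflow builds the task graph from the data flow itself, which is part of what made the project approachable enough to attract its revived community.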
Apache Airflow's rapid return is a striking example of an ongoing, robust developer-first trend: community-driven approaches are breathing new life into projects that would otherwise have become costly dead ends. As enterprises increasingly depend on automation, AI, orchestration tools, and more, the need for resilient solutions has never been more critical.
Advancements in Computing Efficiency
Technologies enabling leapfrog gains in computing efficiency also recently drew the spotlight in IEEE Spectrum's insightful analysis. Reversible computing, which has so far mostly stayed in academia, is closer than ever to commercial viability. It's easy to see why: Vaire Computing, an innovative leader in the field, is currently testing approaches that might improve energy efficiency 4,000-fold over traditional chips.
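The core idea behind such efficiency claims is that reversible logic never erases information, so in principle it can sidestep Landauer's limit on the energy cost of bit erasure. Below is a minimal sketch of a reversible Toffoli (CCNOT) gate, written in plain Python purely for illustration; it is not drawn from Vaire's designs.

```python
# Sketch of a reversible Toffoli (CCNOT) gate: it flips the target
# bit c only when both control bits a and b are 1. The mapping is a
# bijection, so applying it twice recovers the original inputs --
# the property that lets reversible circuits avoid erasing
# information (and, in principle, the associated energy cost).
def toffoli(a: int, b: int, c: int) -> tuple[int, int, int]:
    return a, b, c ^ (a & b)

state = (1, 1, 0)
once = toffoli(*state)   # (1, 1, 1): both controls set, target flipped
twice = toffoli(*once)   # (1, 1, 0): original state recovered
assert twice == state    # the gate is its own inverse
```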
This leap in efficiency is particularly critical as global energy demands rise alongside technological advancements. Smart, efficient, sustainable computing solutions are needed now more than ever.
At the same time, large language models (LLMs) keep learning and advancing at breakneck speed. Their capabilities double roughly every seven months, yet they remain limited by issues of quality and reliability: as it stands today, there is roughly a 50 percent chance that an LLM will correctly solve a given hard challenge. This duality is perhaps the strongest evidence of how quickly artificial intelligence is advancing, and it highlights the critical importance of improving model training and deployment.
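The arithmetic behind that doubling claim is simple exponential growth, sketched below; the starting capability score and time span are hypothetical placeholders, not figures from the reporting.

```python
# Back-of-the-envelope projection of "capabilities double every
# seven months." Starting value and horizon are hypothetical.
def projected_capability(start: float, months: float,
                         doubling_months: float = 7.0) -> float:
    """Capability after `months`, doubling every `doubling_months`."""
    return start * 2 ** (months / doubling_months)

# A capability score of 1.0 today implies roughly 10.8x in two years:
print(projected_capability(start=1.0, months=24))  # ~10.77
```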
Biocomputing: A New Frontier
Cortical Labs sits at the cutting edge of computing technology. The company recently began commercializing biocomputers that harness the collective processing power of 800,000 living human neurons embedded on a silicon chip. At a price of $35,000, the product is not cheap, but it marks a new frontier in using naturally occurring biological systems to process information.
The potential applications of biocomputing are vast, with the capacity to transform fields from medicine to AI. By combining biological activity with conventional computing techniques, researchers could achieve functions once considered impossible.
Research into biocomputing fits into broader efforts to make computing more efficient and increase the potential of today’s technology. As researchers continue to investigate this intersection of biology and technology, it may lead to unexpected advancements that reshape how society interacts with machines.