Innovation & Entrepreneurship Institute

Frontiers of Computing: The Technologies and Challenges Shaping Our Future

As AI reshapes entire industries and global data demand explodes, foundational computing technologies and next-gen semiconductors are moving from theory to reality. But with innovation comes friction. Founders, researchers, and investors are now grappling with the scientific, commercial, and ethical challenges that will define this next era of computing.
 


New infrastructure and hardware technologies are emerging, yet questions remain around their most effective use cases and long-term scalability. From quantum and neuromorphic to photonic computing, this article looks at the key trends and roadblocks that ventures and mentors examined during the recent Creative Destruction Lab (CDL) Next Gen Computing session in Heilbronn, Germany, where Europe’s top minds gathered to explore both the breakthroughs and the bottlenecks defining the deep tech frontier.

The Frontier Technologies

While quantum computing gets much of the attention, it’s far from the only technology shaping the future of computation. Quantum computing promises exponential leaps in processing power, capable of simulating molecules, solving optimization problems, and advancing secure communication in ways that classical computers can’t.

But building reliable quantum systems is notoriously difficult. As Prof. Dr. Achim Kempf, Canada Research Chair in the Physics of Innovation at the University of Waterloo, explained, the industry is undergoing a shift in expectations:

“We’re realizing we won’t build a single processor with 100,000 qubits. The future of quantum computing is modular. We’ll create smaller, reliable quantum units—each with around 100 qubits—and connect them via quantum networks.”

This modular approach, akin to distributed computing, may offer a more feasible path forward. Still, controlling qubits and reducing error rates remain core scientific hurdles. 
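To make the modular picture more concrete, here is a minimal, purely illustrative Python sketch, not a model of any vendor's architecture, that splits a large logical machine into roughly 100-qubit modules and counts the interconnect links a simple ring layout would need. The 100-qubit module size and the 100,000-qubit target echo the quote above; the ring topology is an assumption made for illustration.

```python
import math

def plan_modular_system(total_qubits: int, qubits_per_module: int = 100) -> dict:
    """Back-of-the-envelope plan for a modular quantum system: split a large
    logical machine into small modules and link them with quantum interconnects."""
    n_modules = math.ceil(total_qubits / qubits_per_module)
    # Hypothetical ring topology: each module shares entanglement with two
    # neighbours, so the number of links equals the number of modules (for n > 2).
    links = n_modules if n_modules > 2 else max(n_modules - 1, 0)
    return {"modules": n_modules,
            "qubits_per_module": qubits_per_module,
            "interconnect_links": links}

# The 100,000-qubit figure comes from the quote above; the topology is assumed.
print(plan_modular_system(100_000))
# -> {'modules': 1000, 'qubits_per_module': 100, 'interconnect_links': 1000}
```

The point is not the numbers but the shape of the system: many small, well-controlled units plus a quantum network, rather than one monolithic processor.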

Prof. Dr. Achim Kempf, Canada Research Chair in the Physics of Innovation at the University of Waterloo

While quantum computing aims to push the boundaries of power, neuromorphic computing focuses on efficiency. By mimicking the brain’s architecture—spiking neurons, parallel pathways, adaptive responses—neuromorphic chips offer ultra-low-power, real-time processing, particularly useful at the edge.
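To illustrate the spiking idea itself, independent of any particular neuromorphic chip, the short Python sketch below simulates a single leaky integrate-and-fire neuron; all parameter values are arbitrary choices for demonstration.

```python
import numpy as np

def lif_spike_times(input_current, dt=1e-3, tau=0.02, v_thresh=1.0, v_reset=0.0):
    """Simulate one leaky integrate-and-fire (LIF) neuron, a basic building
    block of spiking models. Returns the times (in seconds) at which it spikes."""
    v, spike_times = 0.0, []
    for step, i_in in enumerate(input_current):
        # The membrane potential leaks back toward rest and integrates the input.
        v += dt * (-v / tau + i_in)
        if v >= v_thresh:              # threshold crossed: emit a spike...
            spike_times.append(step * dt)
            v = v_reset                # ...and reset the membrane potential
    return spike_times

# A constant drive (arbitrary units) over 100 ms yields only a couple of spikes:
# the output is sparse events, which is where neuromorphic power savings come from.
print(lif_spike_times(np.full(100, 60.0)))
```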

This approach is especially promising for autonomous systems, like satellites, drones, and defense technologies, where energy and size constraints make conventional AI hardware unsustainable. “Neuromorphic computing is really about power efficiency,” said Florian Corgnou, Co-founder of Neurobus, a CDL venture. “That’s essential for enabling the next generation of autonomous systems.”

In a world where AI models require increasing amounts of energy and compute power, neuromorphic systems could help decentralize intelligence—processing data locally instead of relying on energy-hungry cloud data centers.

Photonic computing—using photons instead of electrons to process information—offers another radical shift. Unlike traditional chips, photonic processors can transmit data at the speed of light with minimal heat and energy loss.

This is especially attractive for AI and large-scale data workloads, where speed and thermal efficiency are critical. However, integrating photonic components into existing silicon-based infrastructure remains a complex engineering challenge.
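One way research photonic processors map such workloads onto optics is with meshes of Mach-Zehnder interferometers, where each interferometer applies a small unitary transformation to optical amplitudes, so matrix-vector products happen in the analog optical domain. The NumPy sketch below models only the linear algebra of a single interferometer stage as an illustration; it is not a simulation of any real photonic chip, and the phase settings are arbitrary.

```python
import numpy as np

def mzi_unitary(theta: float, phi: float) -> np.ndarray:
    """2x2 unitary realized by one Mach-Zehnder interferometer: an input phase
    shifter (phi) followed by a tunable beam splitter (theta)."""
    phase_shifter = np.array([[np.exp(1j * phi), 0.0],
                              [0.0,              1.0]])
    beam_splitter = np.array([[np.cos(theta),      1j * np.sin(theta)],
                              [1j * np.sin(theta), np.cos(theta)]])
    return beam_splitter @ phase_shifter

# Two optical amplitudes go in; the stage performs a matrix-vector product optically.
x = np.array([1.0 + 0.0j, 0.5 + 0.0j])          # input light amplitudes
y = mzi_unitary(theta=np.pi / 4, phi=np.pi / 3) @ x
print(np.abs(y) ** 2)                            # intensities seen by photodetectors
```

A full device cascades many such stages into a mesh that implements larger matrices, which is why matrix-heavy AI workloads map naturally onto optics.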

Although less mature than quantum or neuromorphic solutions, photonic computing is gaining attention as a complementary layer in future hybrid computing systems.

Oliver Kahl, Principal at MIG Capital and CDL Mentor

Next-Gen Semiconductors & Software-Hardware Convergence

Behind every AI breakthrough lies the question: what’s running it? From specialized chips like GPUs and TPUs to edge AI processors and custom ASICs, we’re entering an era of hardware-software co-design, where systems are built from the ground up to support specific types of intelligence.

As Alan Lau, Co-founder of Two Small Fish Ventures, put it:

“AI has been around for decades, but the convergence we’re seeing now—between hardware and software—is relatively new. Edge computing, robotics, generative AI… they all demand a different kind of hardware stack. There’s still a lot to figure out.”

This convergence is driven by a shift in AI workloads—from cloud data centers to edge environments—and the need for flexible, low-latency, high-performance systems that can run AI close to where data is generated.
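A concrete, everyday example of this co-design is quantization: models are trained in floating point but deployed on edge accelerators in low-precision integer arithmetic that the silicon can execute cheaply. The sketch below is a simplified, framework-agnostic illustration of symmetric int8 weight quantization, not a recipe for any particular toolchain.

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Symmetric per-tensor int8 quantization: represent float weights as an
    int8 tensor plus one float scale, the kind of format edge AI chips expect."""
    scale = float(np.max(np.abs(weights))) / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Reconstruct approximate float weights from the int8 tensor and scale."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4)).astype(np.float32)   # stand-in for a layer's weights
q, scale = quantize_int8(w)
# int8 storage is 4x smaller than float32, and integer arithmetic is far cheaper
# in silicon; the price is a small, bounded reconstruction error.
print("max abs error:", float(np.max(np.abs(w - dequantize(q, scale)))))
```

This is one small piece of the stack Lau describes: the model, the compiler, and the chip all have to agree on details like this for low-latency edge inference to work.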

In short, the computing frontier is no longer defined by one technology, but by an interconnected stack of innovations that together form the backbone of the next digital era.

The Challenges Ahead

The technologies shaping the future of computing are not just cutting-edge—they’re complex, capital-intensive, and often years away from mainstream adoption. While the potential is enormous, the road to real-world impact is filled with significant challenges that must be acknowledged and addressed.

At CDL’s Next Gen Computing session, mentors and founders spoke candidly about the obstacles facing deep tech entrepreneurs. Here are five major challenges facing technologies like quantum, neuromorphic, and photonic computing:

  1. Scientific Maturity & Technical Readiness
  2. Infrastructure & Cost
  3. Commercialization Gap
  4. Strategic & Ethical Considerations
  5. Human Capital
     
Ventures during session 3 of the Next Generation Computing Stream in Heilbronn

Scientific Maturity & Technical Readiness: Many innovations are still in the lab. From neuromorphic prototypes to early-stage quantum modules, these technologies require years of R&D before real-world deployment. Corgnou, from Neurobus, knows this stage firsthand: “We need to ensure these technologies provide enough value and maturity to compete in the market.”

Infrastructure & Cost: This is one of deep tech’s biggest hurdles: it is expensive, especially in hardware. Startups in quantum, photonics, and semiconductors often need to build or access specialized fabrication and testing facilities that can cost millions. As Oliver Kahl, Principal at MIG Capital, pointed out:

“Some technologies don’t even have the infrastructure yet—it’s hard to build what doesn’t exist.”

Commercialization Gap: Layered on top of this are the geopolitical landscape and varying investment climates across regions. While Europe leads in scientific research, it continues to lag in converting that research into scalable businesses. Deep tech founders frequently encounter investor skepticism due to long development timelines and uncertain business models, and they receive less funding than their American and Chinese counterparts.

Strategic & Ethical Considerations: At the same time, ventures must learn to navigate regulation and build responsibly. In this regard, Ekaterina Almasque, General Partner at OpenOcean and CDL Mentor, emphasized: “We want innovation that doesn’t just scale—but sustains society and aligns with our values.”

Human Capital: Finally, one universal challenge remains: talent. Europe produces exceptional engineering and research talent—but often loses it to other regions, particularly the U.S., in a persistent brain drain. This talent loss undermines Europe's long-term competitiveness in next-generation technologies. With shifting global policy and increased investment, closing the human capital gap will be key to supporting the growing demand for specialized expertise in these fields.