Microelectronics are advancing at record speed, pushed by artificial intelligence (AI) and machine learning (ML), Internet of Things (IoT), next-gen communications, and emerging quantum technologies. In this ACS/CAS webinar session, three experts mapped the research landscape, shared brain-inspired compute breakthroughs, and outlined how the U.S. CHIPS effort is catalyzing materials innovation. Speakers included Jean-Marc Pécourt (CAS), Steve Furber (University of Manchester), and Jim Warren (National Institute of Standards and Technology (NIST)). 

Landscape analysis: Which way is the literature pointing?

Dr. Jean-Marc Pécourt, Principal Scientist at CAS, opened with a landscape analysis built from data within the CAS Content Collection™. A search within CAS IP Finder, powered by STN™, revealed ~2.3 million microelectronics records across two decades.

Stacked bar chart showing journal articles and patents related to microelectronics. The combined number grows from around 60,000 in 2004 to more than 140,000 in 2023.

Figure 1. Publication trends for the field of microelectronics. The inset donut chart illustrates the distribution of journal articles and patent families. Data includes journal and patent family publications from the CAS Content Collection for the period 2004-2024. *Includes data only for the months January to November.

Jean-Marc then zoomed into a five-year, 1.2 million-record subset to capture recent momentum. The domain’s unusually high patent-to-journal ratio underscores its proximity to commercialization. Country curves show China’s sharp ascent in both journals and patents, steady patent strength in the U.S. and South Korea, a long-term dip in Japan since ~2013, and India’s growth weighted toward journals.

Several NLP-derived CAS Trendscape maps, curated by CAS scientists to reconcile terminology, highlight fast movers: 2D materials (transition metal dichalcogenides, MXenes), perovskite optoelectronics/photovoltaics, and a broad sweep of sensors. Among devices, FETs accelerate on the patent side, while journals show rapid growth in memristors and triboelectric nanogenerators. Mature processes (chemical mechanical planarization, chemical vapor deposition, etching) may look “slow” because their baselines are huge; they remain central to modern fabs.

Why neuromorphic matters now 

Dr. Steve Furber, Professor Emeritus at the University of Manchester, UK, shifted the lens from what we build to how we compute. Today’s AI stacks thrive on dense matrix math; the biological cortex thrives on sparse, recurrent signaling. That mismatch, he argued, is the root of AI’s energy wall. His SpiNNaker project approached the problem with a million modest ARM-class cores and, crucially, a selective multicast fabric that turns a single spike into thousands of targeted messages, mirroring cortical fan-out without saturating interconnects. The result is biological real-time operation for key brain models, demonstrations of spiking-based constraint solving (e.g., Sudoku), and responsive robotics when paired with event-based sensors.
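The selective-multicast idea can be sketched in a few lines: one spike packet enters the fabric, and a routing table fans it out only to the cores that subscribe to that source neuron, so delivery cost scales with actual fan-out rather than with total system size. This is a toy illustration with assumed names, not SpiNNaker's actual router implementation.

```python
from collections import defaultdict

class MulticastFabric:
    """Toy model of a SpiNNaker-style selective multicast router."""

    def __init__(self):
        # source neuron id -> list of subscribing cores
        self.routes = defaultdict(list)

    def subscribe(self, source, core):
        # Routing is set up once; neural connectivity changes slowly,
        # so routes can effectively be hardcoded.
        self.routes[source].append(core)

    def spike(self, source):
        # One packet in, many targeted deliveries out: no broadcast
        # to uninterested cores.
        return [(core, source) for core in self.routes[source]]

fabric = MulticastFabric()
for core in range(5):           # five cores listen to neuron 42
    fabric.subscribe(42, core)
deliveries = fabric.spike(42)   # a single spike reaches all five
```

The key design choice mirrored here is that the sender emits one tiny packet identifying only the source; the fabric, not the sender, knows the destinations.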

The industry context strengthens this case. Intel’s Loihi 2 architecture and the 2024 Hala Point system (1.15 billion “neurons”) demonstrate peta-ops-scale neuromorphic computing with leading TOPS/W efficiency on certain deep-learning workloads, evidence that event-driven, brain-inspired execution can complement (and in some niches surpass) conventional accelerators. Steve believes sparsity and recurrence aren’t biological curiosities; they’re the blueprint for scalable, sustainable intelligence.

He closed with a pragmatic view of convergence rather than replacement: graphics processing units and neuromorphics will likely coevolve, with neuromorphics shouldering ultra-low latency, low-power perception and closed-loop control, while conventional accelerators handle dense training phases until, perhaps, the two worlds meet in hybrid systems. 

From maps to manufacturing  

Dr. James Warren, Director of the Materials Genome Program at NIST, connected the literature signals and computational ideas to the reality of manufacturing. He outlined the CHIPS R&D Office Broad Agency Announcement (BAA), a call for proposals in advanced semiconductor manufacturing, AI, quantum, and standards, along with an atypical returns model (equity/warrants/licenses) designed to align public investment with long-term outcomes. His throughline: to make discovery matter faster, labs need autonomy, data, and rigor.

Here, the Materials Genome Initiative meets self-driving labs and physics-informed ML. Jim described how closed-loop experimentation can generate rich, auditable datasets while surrogate models, constrained by first principles (e.g., transport, thermodynamics, electrochemistry), make near real-time decisions that are useful on fab timescales. One memorable anecdote: AI extracting phase boundaries directly from high-throughput diffraction streams, an early glimpse of how autonomy shifts scientists’ time from data wrangling to discovery. The message was an invitation to build measurement-grade autonomy that propels ideas across the lab-to-fab divide.
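The closed-loop pattern Jim described can be sketched in a few lines: measure, update a surrogate, let the surrogate propose the next experiment, and log every step so the dataset stays auditable. Everything here (the instrument stand-in, the naive "search near the best point" proposal rule, all names) is an illustrative assumption, not NIST's actual pipeline.

```python
import random

def measure(x):
    # Stand-in for an instrument: a noisy property with an optimum near x = 0.6
    return -(x - 0.6) ** 2 + random.gauss(0, 0.01)

def propose_next(history, candidates):
    # Trivial surrogate policy: try the untried candidate closest to the
    # best observation so far. Real systems would use a physics-constrained
    # model and an acquisition function instead.
    best_x, _ = max(history, key=lambda h: h[1])
    tried = {x for x, _ in history}
    untried = [c for c in candidates if c not in tried]
    return min(untried, key=lambda c: abs(c - best_x))

random.seed(0)
candidates = [i / 10 for i in range(11)]            # compositions 0.0 .. 1.0
history = [(0.0, measure(0.0)), (1.0, measure(1.0))]  # seed measurements
for _ in range(4):                                  # four closed-loop rounds
    x = propose_next(history, candidates)
    history.append((x, measure(x)))                 # every decision is logged
```

The loop's append-only `history` is the point: the same record that drives the next decision doubles as the auditable dataset.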

What our audience focused on

Throughout the discussion, our audience was active in asking questions and providing some of their own thoughts. These were the areas that piqued their interest most:

1. Multimodal inputs. Steve explained that true multimodal integration, such as linking sight and sound, is modeled at a higher level than most neuromorphic research today, which focuses on low-level brain circuits. Some neuroscience experiments and models do explore sensory fusion, such as aligning auditory and visual spatial maps, or the rat study where combined weak stimuli became detectable. However, neuromorphic systems generally prioritize understanding fundamental neural microcircuits rather than full multisensory processing.

2. NVIDIA and AMAT digital twins. Jim didn’t know details of the NVIDIA–AMAT collaboration but highlighted the broader trend: major investment in digital twins for semiconductor design and manufacturing. Virtual models now span from fab-level systems down to materials and atoms, enabling better prediction and optimization. He compared this to NVIDIA’s work in drug discovery with fully virtualized design loops. As modeling improves and data increases, digital twins are becoming central to engineering and manufacturing innovation. Jean-Marc added that these tools are emerging across multiple scales. 

3. SpiNNaker vs. conventional high-performance computing (HPC). Steve described SpiNNaker as superficially HPC-like but fundamentally different. It uses many small processors suited to highly parallel neural modeling, unlike HPC’s large, powerful processors. Its major distinction is a custom communication fabric enabling efficient multicast (one spike reaching thousands of targets), which standard HPC networks cannot do efficiently. Because neural network structure changes slowly, SpiNNaker can hardcode routing, a strategy unsuitable for general-purpose HPC but ideal for neuromorphic workloads.

4. U.S. vs. China in semiconductors and research output. Jean-Marc noted that Chinese institutions produce many commercially focused patents, while the U.S. research landscape mixes commercial and academic efforts. Because semiconductors are strategic, U.S. projects showing commercial relevance may find strong funding opportunities. Jim avoided political specifics but said final U.S. budgets for NSF and NIST remained stable despite earlier concerns. He acknowledged China as a major long-term competitor in semiconductor manufacturing and emphasized ongoing U.S. efforts to secure critical supply chains. 

5. Energy and water use in neuromorphic computing. Steve questioned why data centers consume water at all, noting that cooling water could be fully recycled with heat exchangers. Neuromorphic systems could cut AI energy needs and associated cooling demands by up to 1,000 times. He also cautioned that efficiency gains can lead to greater overall consumption (Jevons paradox), as lower costs expand usage. He illustrated this with historical examples, such as the unexpected explosion in global computer deployment. 

Our thanks to our guests and audience members. For more information, watch the webinar recording and review the related CAS report, Microelectronics frontiers: Emerging technologies, materials, and innovations.

Related CAS Insights

Emerging Science

16 billion reasons for hope: How biomarkers are reshaping cancer outcomes

Emerging Science

The most impactful scientific breakthroughs and emerging trends of 2023

Emerging Science

Scientific breakthroughs and emerging trends to watch in 2024
