Altering the Foundation of Our Digital Society
Our computational demands today differ from those of the 20th century. New workloads range from sensing and communication at the edge of the Internet of Things, to real-time scene analysis in autonomous vehicles, to machine-learning training and inference in massive data centers. In every case, efficiency is critical: low power, low cost, and small size.
Given these new application domains, and acknowledging that we can no longer rely on CMOS scaling to bury our problems under a mountain of cheap transistors, what are our options? DARPA has invested in alternative architectures such as neuromorphic computing, seeking novel base technologies, circuits, and systems that could continue to deliver greater computational power at much greater efficiency. Why have these approaches not changed the industry (yet)? Where and how will they be deployed? Do our new artificial-intelligence workloads open the door to novel computational circuits and systems? What will industry, and what will DARPA, do to build the 21st-century compute engines that 21st-century compute problems demand?