- Applied Brain Research Inc. (ABR) today released benchmarks showing that ABR Nengo DL deep networks running on the Intel Loihi research chip are 38x more energy-efficient than a leading GPU for real-time deep-learning inference
- ABR is working with customers on applying the Nengo stack plus Loihi to IoT devices, smartphones and driverless cars, offering a much more efficient alternative to GPUs
TORONTO, Dec. 6, 2018 /PRNewswire/ – Applied Brain Research Inc. today released a study comparing the energy efficiency of its Nengo Deep Learning toolkit (Nengo DL), running a real-time keyword-spotting deep learning network on Intel's Loihi neuromorphic research chip, with that of traditional hardware. The results show that Nengo DL on Loihi uses 38x less energy per inference than an architecturally identical network running on an NVIDIA Quadro K4000 GPU.
The study also compared the dynamic energy cost per inference of the same deep network on several other platforms. In each case the Nengo DL on Loihi network consumed significantly less energy. Specifically, the NVIDIA Jetson TX1 edge GPU consumed 7.3x more energy per inference, the Intel Xeon E5-2630 CPU 8.2x more, and the Movidius Neural Compute Stick 1.9x more.
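To make the relative figures above concrete, the following sketch converts the published per-platform multipliers into absolute energy-per-inference numbers. Only the multipliers come from the study; the 1.0 mJ Loihi baseline is a purely hypothetical placeholder for illustration.

```python
# Sketch: relative dynamic-energy-per-inference multipliers from the study,
# scaled against an ASSUMED (hypothetical) Loihi baseline of 1.0 mJ/inference.
LOIHI_BASELINE_MJ = 1.0  # hypothetical placeholder, not from the study

# Multipliers relative to Nengo DL on Loihi, as reported in the press release.
multipliers = {
    "Intel Loihi (Nengo DL)": 1.0,
    "Movidius Neural Compute Stick": 1.9,
    "NVIDIA Jetson TX1": 7.3,
    "Intel Xeon E5-2630": 8.2,
    "NVIDIA Quadro K4000": 38.0,
}

# Print platforms from most to least efficient under the assumed baseline.
for platform, factor in sorted(multipliers.items(), key=lambda kv: kv[1]):
    energy_mj = factor * LOIHI_BASELINE_MJ
    print(f"{platform:32s} {energy_mj:6.1f} mJ/inference ({factor:g}x Loihi)")
```

Because the study reports ratios, the ordering and relative gaps between platforms hold regardless of the baseline chosen.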
"These benchmarking results show that Loihi can perform inference on real-time data streams using standard feed-forward deep networks with significant efficiency advantages compared to conventional processor architectures. ABR's Nengo DL software makes these gains accessible for mainstream use by hiding the complexity of the underlying spiking neural network implementation. This has important implications for the commercialization outlook for this technology," stated Mike Davies, Director, Neuromorphic Computing Lab at Intel Corporation.
Dr. Chris Eliasmith, co-CEO of ABR, noted that for this application, "Nengo DL on Loihi outperforms all of these alternative platforms on an energy cost per inference basis while maintaining near-equivalent inference accuracy." Furthermore, an analysis of the tradeoffs between network size, inference speed, and energy cost indicates that Loihi's comparative advantage over other low-power computing devices grows for larger networks. The full study is available at https://arxiv.org/abs/1812.01739. For more information on Nengo and Nengo DL visit https://www.nengo.ai/nengo-dl/introduction.html.
Computing with artificial spiking neurons directly in software and hardware, known as "neuromorphics," has long been pursued as a means of exploiting how the brain computes intelligence so efficiently. The lessons learned can be applied to improve artificial intelligence. "In this study, ABR has delivered strong empirical evidence that the long-sought-after efficiencies of computing with spiking neurons can now be realized in commercially valuable applications using Nengo DL on Loihi," said Peter Suma, co-CEO of ABR.
ABR's Nengo DL toolkit allows deep learning networks to run on neuromorphic hardware, CPUs and GPUs. This provides a single development environment in which to define power-efficient, real-time neuromorphic networks that can then run on any supported hardware platform, including the Intel Loihi neuromorphic research chip. Neuromorphic edge-AI computing with Nengo and Loihi reduces power costs while preserving the performance of deep networks.
About Applied Brain Research Inc. (ABR)
ABR is the maker of the Nengo neuromorphic software development suite, which includes the world's leading multi-platform visual neuromorphic software compiler, runtime, and spiking deep learning platform. Using ABR's neuromorphic tools, the team at ABR has built Spaun, the world's largest functional brain model, and builds real-time, full-loop AI "brains" for customers in the military, self-driving car, IoT, and smartphone markets. (www.AppliedBrainResearch.com)
View original content to download multimedia: http://www.prnewswire.com/news-releases/applied-brain-research-inc-shows-nengo-spiking-real-time-ai-deep-learning-networks-on-intel-loihi-use-38x-less-energy-than-on-nvidia-quadro-k4000-gpu-300761243.html
SOURCE Applied Brain Research Inc.