De-Biasing Neural Networks

Neural networks trained on categorical data from biased training datasets can produce biased predictions. This issue is especially prevalent when a certain type of data is difficult to acquire for training yet common in the wild, where one would like to deploy the network. Here, we are developing a technique to reduce sensitivity to such training inequalities.
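As a point of comparison (not the technique under development here), a common baseline mitigation for class imbalance is to reweight the loss by inverse class frequency, so that under-represented categories contribute as much to training as common ones. A minimal sketch:

```python
from collections import Counter

def inverse_frequency_weights(labels):
    """Weight each class inversely to its frequency so rare classes
    contribute as much total loss as common ones."""
    counts = Counter(labels)
    total, n_classes = len(labels), len(counts)
    # weight = total / (n_classes * count): rarer classes get larger weights
    return {c: total / (n_classes * n) for c, n in counts.items()}

# Example: class 1 is four times rarer than class 0.
labels = [0, 0, 0, 0, 1]
weights = inverse_frequency_weights(labels)
# Total weighted contribution is equal: 4 * weights[0] == 1 * weights[1]
```

These per-class weights would typically be passed to a weighted cross-entropy loss; frameworks such as PyTorch and scikit-learn accept them directly.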

Optimization Methods

Training deep neural networks is extremely energy intensive. Optimization methods that converge in fewer training epochs can drastically reduce the energy consumption of large models, and better optimization techniques can also significantly improve performance and generalization. I am currently working on one such optimization method, which shows promising speed-ups and performance gains by reframing the optimization path more formally as a loss trajectory. I hope this technique will also provide some scientific insight into learning in deep neural networks.
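The method described above is unpublished, but the general point that optimizer design changes how many steps convergence takes can be illustrated with a standard example: heavy-ball momentum versus plain gradient descent on an ill-conditioned quadratic. Everything below (function, learning rate, step counts) is an illustrative assumption, not the author's method:

```python
import numpy as np

def minimize(grad, x0, lr=0.1, momentum=0.0, steps=50):
    """Gradient descent with optional heavy-ball momentum."""
    x, v = np.asarray(x0, dtype=float), 0.0
    for _ in range(steps):
        v = momentum * v - lr * grad(x)  # accumulate velocity
        x = x + v
    return x

# Ill-conditioned quadratic f(x) = 0.5 * x @ A @ x, minimum at the origin.
A = np.diag([1.0, 10.0])
grad = lambda x: A @ x
x_plain = minimize(grad, [1.0, 1.0], momentum=0.0)
x_mom = minimize(grad, [1.0, 1.0], momentum=0.5)
# After the same number of steps, the momentum iterate is closer to the
# minimum: fewer steps (hence fewer epochs) for the same accuracy.
```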

Neuromorphic Computing

I am working on a neuromorphic computing system in collaboration with researchers at the University of Illinois Urbana-Champaign and Zhejiang University.

Spiking neural networks on neuromorphic hardware have numerous benefits over standard neural networks on their respective accelerated hardware. Intel's Pohoiki system (built from Loihi neuromorphic chips) has been shown to perform a real-time deep learning benchmark with over 100 times less energy than a GPU, and even 5 times lower power consumption than specialized Internet of Things (IoT) inference hardware [1]. There are numerous reasons to pursue spiking neural networks: lower power consumption, event-driven computation, massive parallel processing, elimination of the von Neumann memory wall through collocated memory and processing, robustness to noise, and scalability [2], [3], as well as basic research into artificial and biological neural networks.
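The event-driven character of these networks comes from the neuron model itself. A leaky integrate-and-fire (LIF) neuron, the standard building block of spiking networks, can be sketched in a few lines; the time constant, threshold, and input drive below are arbitrary illustrative values:

```python
def lif_spikes(input_current, tau=10.0, v_thresh=1.0, v_reset=0.0, dt=1.0):
    """Leaky integrate-and-fire neuron: the membrane potential leaks
    toward rest, integrates input, and emits a spike (then resets)
    whenever it crosses threshold."""
    v, spikes = 0.0, []
    for i in input_current:
        v += dt * (-v / tau + i)  # leaky integration of input current
        if v >= v_thresh:
            spikes.append(1)
            v = v_reset           # reset membrane after spiking
        else:
            spikes.append(0)
    return spikes

# A constant drive yields a regular spike train: information is carried
# by discrete events rather than continuous activations, which is what
# makes event-driven, low-power hardware implementations possible.
train = lif_spikes([0.3] * 20)
```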

[1] Blouw, P., Choo, X., Hunsberger, E., Eliasmith, C. Benchmarking Keyword Spotting Efficiency on Neuromorphic Hardware. Proceedings of the 7th Annual Neuro-inspired Computational Elements Workshop (2019).

[2] Schuller, I. K. & Stevens, R. Neuromorphic Computing: From Materials to Systems Architecture - Report of a Roundtable Convened to Consider Neuromorphic Computing Basic Research Needs. BES Roundtable (2015).

[3] Schuman, C.D., Kulkarni, S.R., Parsa, M. et al. Opportunities for neuromorphic computing algorithms and applications. Nat Comput Sci 2, 10–19 (2022).

Applying Insights from Neuroscience to AI

The field of neural networks grew out of knowledge garnered from neuroscience, which led to the rate-based models most ANNs use today. Further insights, such as the roles of glial cells and dendritic processing, may improve even rate-based ANNs and are being explored.