Incoming Assistant Professor - Computer Science - University of South Carolina
Postdoctoral Researcher - Computer Science - University of California, Irvine
I am a postdoctoral researcher at the University of California, Irvine, and I will be an Assistant Professor at the University of South Carolina in the Fall. I completed my Ph.D. in computer science at the University of California, Irvine, my M.S. in computer science at the University of Michigan, and my B.S. in electrical and computer engineering at the Ohio State University.
My goal is to design artificial intelligence algorithms that learn and act independently and efficiently to solve problems that benefit humanity. I am particularly interested in deep learning, reinforcement learning, search, and applications to the natural sciences.
I am currently seeking highly motivated students interested in deep learning and reinforcement learning.
The Rubik's cube has over 10^19 possible configurations. We have created a deep reinforcement learning algorithm, called DeepCubeA, that can solve the Rubik's cube and six other combinatorial puzzles without domain-specific knowledge. We are currently investigating how DeepCubeA can be used to solve problems in the natural sciences.
Solving the Rubik's Cube with Deep Reinforcement Learning and Search, Nature Machine Intelligence (2019)
Solving the Rubik's Cube with Approximate Policy Iteration, ICLR (2019)
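At its core, DeepCubeA trains a deep neural network to estimate the cost-to-go from any state to the goal and then uses that estimate as a heuristic during search. The sketch below illustrates only the cost-to-go update idea, using a tabular value function on a hypothetical toy ring puzzle as a stand-in for the neural network; the puzzle, state encoding, and function names are illustrative and are not DeepCubeA's actual implementation:

```python
# Toy puzzle: states 0..N-1 arranged on a ring, moves are +1 or -1
# (mod N), the goal is state 0, and each move costs 1.
N = 10
ACTIONS = (+1, -1)

def step(state, action):
    return (state + action) % N

def value_iteration(iters=50):
    # J[s] approximates the cost-to-go from state s to the goal.
    # DeepCubeA learns this function with a deep network; here a
    # simple table suffices for illustration.
    J = [0.0] * N
    for _ in range(iters):
        J_new = [0.0] * N
        for s in range(1, N):  # the goal state keeps cost 0
            # Bellman-style update: J(s) = min_a [1 + J(s')]
            J_new[s] = min(1.0 + J[step(s, a)] for a in ACTIONS)
        J = J_new
    return J

def solve(start, J):
    # Greedy descent on the estimated cost-to-go; DeepCubeA instead
    # runs a search guided by the network's estimates.
    path, s = [start], start
    while s != 0:
        s = min((step(s, a) for a in ACTIONS), key=lambda t: J[t])
        path.append(s)
    return path

J = value_iteration()
```

On this toy ring, the learned values converge to the true shortest distances, e.g. `J[5] == 5.0`, and `solve(3, J)` returns the path `[3, 2, 1, 0]`.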
Circadian rhythms are found in virtually all forms of life. They play a fundamental role in functions ranging from metabolism to cognition. We have developed CircadiOmics for accessing and mining circadian omic datasets and BIO_CYCLE for analyzing circadian rhythm experiments using deep learning.
CircadiOmics: Circadian Omic Web Portal, Nucleic Acids Research (2018)
What Time is It? Deep Learning Approaches for Circadian Rhythms, ISMB (2016)
The hippocampus plays a key role in the memory of sequences of events; however, its role in nonspatial tasks is not yet well understood. Using unsupervised deep learning techniques, we visualize hippocampal activity during a nonspatial sequential memory task. We discovered that hippocampal activity correlates strongly with the sequence presented in this task.
Hippocampal Ensembles Represent Sequential Relationships Among Discrete Nonspatial Events, bioRxiv (2019)
Artificial neural networks typically have a fixed, nonlinear activation function at each neuron. We have designed a novel form of piecewise linear activation function that is learned through gradient descent. With this adaptive activation function, we are able to improve upon deep neural network architectures that use static activation functions.
Learning Activation Functions to Improve Deep Neural Networks, ICLR Workshop (2015)
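The adaptive piecewise linear (APL) unit sums a rectifier with a set of learned hinges: h(x) = max(0, x) + Σ_s a_s · max(0, −x + b_s), where the slopes a_s and hinge locations b_s are learned by gradient descent alongside the network's weights. A minimal forward-pass sketch (the function name and parameter layout here are illustrative, not a library API):

```python
import numpy as np

def apl(x, a, b):
    """Adaptive piecewise linear activation:
    h(x) = max(0, x) + sum_s a[s] * max(0, -x + b[s]).
    In a full network, a and b are per-neuron parameters updated
    by gradient descent; here they are passed in directly."""
    x = np.asarray(x, dtype=float)
    out = np.maximum(0.0, x)
    for a_s, b_s in zip(a, b):
        out = out + a_s * np.maximum(0.0, -x + b_s)
    return out
```

With a single hinge, a = [-0.5] and b = [0.0], the unit behaves like a leaky rectifier with slope 0.5 on the negative side, e.g. `apl(-2.0, [-0.5], [0.0])` gives -1.0 while `apl(3.0, [-0.5], [0.0])` gives 3.0; with more hinges, the function can approximate a broad family of piecewise linear shapes.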