Speaker "Joseph Schneible" Details

Topic

Using Genetic Algorithms to Optimize Recurrent Neural Networks

Abstract

One of the more challenging tasks in deep learning is designing and tuning a neural network for a specific problem. This process is often more art than science, requiring deep learning expertise and a significant amount of trial and error. We'll present the use of genetic algorithms to automate the tuning of recurrent neural network hyperparameters, including the size of the network, the number of time steps through which to backpropagate, and the learning rate. This approach lets us exploit the model parallelism of GPU-based neural network training while also taking advantage of the data parallelism of genetic algorithms. We show that this approach lowers the barrier to entry for using neural networks and is faster than other automated network-tuning approaches.
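To make the idea concrete, here is a minimal sketch of a genetic algorithm over the three hyperparameters the abstract mentions. The search-space ranges, function names, and population settings are illustrative assumptions, not the speaker's actual implementation, and the fitness function is a synthetic stand-in for the real step of training an RNN on a GPU and scoring it on validation data.

```python
import random

# Illustrative hyperparameter search space (not taken from the talk).
SEARCH_SPACE = {
    "hidden_size":   [32, 64, 128, 256, 512],
    "bptt_steps":    [8, 16, 32, 64, 128],
    "learning_rate": [1e-4, 3e-4, 1e-3, 3e-3, 1e-2],
}

def random_individual():
    """Sample one hyperparameter configuration at random."""
    return {k: random.choice(v) for k, v in SEARCH_SPACE.items()}

def fitness(individual):
    """Placeholder fitness: in practice this would train an RNN with the
    given hyperparameters on a GPU and return, e.g., negative validation
    loss. A synthetic surrogate stands in so the sketch runs as-is."""
    target = {"hidden_size": 256, "bptt_steps": 32, "learning_rate": 1e-3}
    return -sum(abs(individual[k] - target[k]) / target[k] for k in target)

def crossover(parent_a, parent_b):
    """Uniform crossover: each gene is taken from one parent at random."""
    return {k: random.choice([parent_a[k], parent_b[k]]) for k in SEARCH_SPACE}

def mutate(individual, rate=0.2):
    """With probability `rate`, resample each gene from the search space."""
    return {
        k: random.choice(SEARCH_SPACE[k]) if random.random() < rate else v
        for k, v in individual.items()
    }

def evolve(pop_size=12, generations=20, elite=2):
    population = [random_individual() for _ in range(pop_size)]
    for gen in range(generations):
        # Fitness evaluations are independent of one another, so they can be
        # dispatched to separate GPUs or workers (the data parallelism of GAs).
        scored = sorted(population, key=fitness, reverse=True)
        best = scored[0]
        print(f"gen {gen:02d}  best={best}  fitness={fitness(best):.3f}")
        parents = scored[: pop_size // 2]
        children = [
            mutate(crossover(*random.sample(parents, 2)))
            for _ in range(pop_size - elite)
        ]
        population = scored[:elite] + children  # elitism keeps the top configs
    return max(population, key=fitness)

if __name__ == "__main__":
    random.seed(0)
    print("best configuration:", evolve())
```

Swapping the surrogate `fitness` for actual RNN training is where the two kinds of parallelism meet: each individual's training run uses the GPU's model parallelism, while the population of independent runs can be evaluated concurrently.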

Profile

Joseph Schneible is a Software Engineer leading the Independent Research and Development team at Technica Corporation in Dulles, Virginia. His research focuses on a systems-based approach to optimizing graph analytics and machine-learning algorithms for commodity hardware. Dr. Schneible holds a PhD in Physics from Syracuse University, where his research focused on parallel simulations of quantum field theory. Prior to joining Technica, he performed postdoctoral research as a member of the High Performance Computing Lab at The George Washington University. His research interests include the use of GPUs to accelerate simulations and analytics.