Speaker Details: Ganapathi Pulipaka



1974 Gradient-Based Deep Learning.


The past decade has produced high-precision, best-performing AI systems that run on ordinary smartphones for speech recognition, with neural translation architectures implementing deep learning techniques. Deep neural networks contain thousands of processing nodes joined by millions of weighted connections; in a feed-forward network, data traverses these connections in one direction. Inspired by the human brain, each node receives a number of incoming connections, and each connection is assigned a weight by the node. When the network is active, each time a node receives a data item it multiplies that item by the associated weight and adds the resulting products together, producing a single number. Each node is also defined with a threshold value: if the produced number falls below the threshold, the node does not pass data on to the next node. Structurally, a neural network consists of an input layer, one or more hidden layers, and an output layer. A network with a single hidden layer is constrained in its capacity for spatial expression, so increasing the number of hidden layers can increase the efficiency of the overall network. However, the architecture has to be designed carefully so that it remains efficient. This session provides in-depth coverage of neural network architectures, linear algebra, calculus, and backpropagation algorithm implementations in TensorFlow and Python.
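The weighted-sum-and-threshold behavior of a single node described above can be sketched in a few lines of NumPy. This is a minimal illustration, not material from the session itself; the function name, weights, and threshold values are assumptions chosen for the example.

```python
import numpy as np

def node_output(inputs, weights, threshold):
    # Multiply each incoming data item by its connection weight
    # and add the resulting products into a single number.
    total = np.dot(inputs, weights)
    # If the number is below the node's threshold, no data is
    # passed to the next node (represented here as 0.0).
    return total if total >= threshold else 0.0

# Weighted sum is 1.0*0.5 + 2.0*0.25 = 1.0, which clears a 0.8 threshold:
active = node_output(np.array([1.0, 2.0]), np.array([0.5, 0.25]), threshold=0.8)
# The same inputs fail a 1.5 threshold, so nothing is passed forward:
blocked = node_output(np.array([1.0, 2.0]), np.array([0.5, 0.25]), threshold=1.5)
```

Stacking many such nodes into layers, and feeding each layer's outputs forward as the next layer's inputs, yields the feed-forward architecture the session builds on.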
Prerequisites:
- Installation of Python and TensorFlow
- Basic understanding of linear algebra and calculus
- Anaconda Jupyter environment
- Google Colab
- A Python IDE such as Spyder or PyCharm
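Since the session covers backpropagation implementations, one gradient step on a single-hidden-layer network can be sketched in plain NumPy as a preview. All shapes, the toy data, and the learning rate below are illustrative assumptions, not the session's actual code.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 3))          # toy input batch
y = rng.normal(size=(8, 1))          # toy regression targets

W1 = rng.normal(size=(3, 4)) * 0.1   # input -> hidden weights
W2 = rng.normal(size=(4, 1)) * 0.1   # hidden -> output weights

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X, W1, W2):
    h = sigmoid(X @ W1)              # hidden-layer activations
    return h, h @ W2                 # linear output layer

def mse(pred, y):
    return np.mean((pred - y) ** 2)

# One backpropagation step: apply the chain rule from the loss
# back through the output layer to each weight matrix.
h, pred = forward(X, W1, W2)
before = mse(pred, y)
g_pred = 2.0 * (pred - y) / len(X)   # dL/dpred for the mean-squared error
g_W2 = h.T @ g_pred                  # dL/dW2
g_h = g_pred @ W2.T                  # dL/dh, flowing back into the hidden layer
g_W1 = X.T @ (g_h * h * (1.0 - h))   # dL/dW1 via the sigmoid derivative h(1-h)

lr = 0.1                             # illustrative learning rate
W1 -= lr * g_W1                      # gradient-descent update
W2 -= lr * g_W2
after = mse(forward(X, W1, W2)[1], y)
```

A single step along the negative gradient lowers the loss on this toy batch; the session shows how TensorFlow automates exactly this derivative bookkeeping.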


Dr. Ganapathi Pulipaka is a Chief AI Scientist at DeepSingularity LLC with 10+ years of experience in machine learning, deep learning, mathematics, and statistics, working with R and Python and with frameworks such as PyTorch and TensorFlow.