Under-the-Hood Mechanisms of Neural Networks with TensorFlow

by Sophia Turol, September 1, 2016
Learn about the capabilities of convolutional and recurrent neural networks, their application to image captioning, as well as the implementation in TensorFlow.


Neural networks are actively applied to improve speech recognition, facial identification, emotion reading, sentiment analysis, disease diagnosis, etc. At the recent TensorFlow meetup in Seattle, the attendees were plunged into the world of convolutional and recurrent neural networks, their under-the-hood mechanisms, and usage with TensorFlow, learning some handy tricks along the way.


All things neural

In his session, Nick McClure of PayScale took a close look at neural networks. He introduced the audience to a basic unit of a neural network, an operational gate, and explained how to make use of multiple gates (a minimal sketch follows the list below). Then, Nick moved on to:

  • loss functions
  • learning rate (it determines how much model parameters are allowed to change at each training step)
  • logistic regression as a neural network (see the second sketch below)
  • activation functions
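
To make the gate idea concrete, here is a minimal sketch in TensorFlow 1.x style; the target of 50, the input of 5, and the learning rate of 0.01 are illustrative assumptions, not figures from the talk. A single multiplication gate f(x) = a * x is trained with gradient descent to push its output toward the target, while the loss function measures how far off it still is.

```python
import tensorflow as tf

# A multiplication gate f(x) = a * x with one learnable parameter.
# We tune a so that f(5.0) moves toward the target value of 50.0.
a = tf.Variable(4.0, dtype=tf.float32)
x = tf.placeholder(tf.float32)
gate_output = tf.multiply(a, x)

# L2 loss: the squared distance between the gate output and the target.
loss = tf.square(gate_output - 50.0)

# The learning rate scales how much each step is allowed to change a.
train_step = tf.train.GradientDescentOptimizer(learning_rate=0.01).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(10):
        sess.run(train_step, feed_dict={x: 5.0})
    print(sess.run(a))  # a drifts from 4.0 toward the optimum of 10.0
```

A larger learning rate moves a in bigger jumps and can overshoot the optimum; a smaller one converges more slowly but more steadily.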
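
In the same spirit, logistic regression falls out of the gate picture as a one-neuron network: a linear gate followed by a sigmoid activation. The two-feature input shape below is an illustrative assumption.

```python
import tensorflow as tf

# Logistic regression as a one-neuron network: a linear gate
# followed by a sigmoid activation that squashes output into (0, 1).
x = tf.placeholder(tf.float32, shape=[None, 2])   # two input features
y = tf.placeholder(tf.float32, shape=[None, 1])   # binary labels

w = tf.Variable(tf.zeros([2, 1]))
b = tf.Variable(tf.zeros([1]))
logits = tf.matmul(x, w) + b
prediction = tf.sigmoid(logits)                   # the activation function

# Cross-entropy loss, computed on the logits for numerical stability.
loss = tf.reduce_mean(
    tf.nn.sigmoid_cross_entropy_with_logits(labels=y, logits=logits))
train_step = tf.train.GradientDescentOptimizer(0.1).minimize(loss)
```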

Nick outlined that neural networks can have a number of “hidden layers” and that it is possible to make them as deep as wanted, simply by stacking such layers (see the sketch below). He also mentioned that neural networks can have as many inputs/outputs as necessary.
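
A minimal sketch of what a hidden layer looks like in TensorFlow 1.x terms; the layer sizes (three inputs, ten hidden units, one output) are arbitrary assumptions for illustration:

```python
import tensorflow as tf

# Three inputs -> ten hidden units (ReLU) -> one output.
x = tf.placeholder(tf.float32, shape=[None, 3])

w1 = tf.Variable(tf.random_normal([3, 10], stddev=0.1))
b1 = tf.Variable(tf.zeros([10]))
hidden = tf.nn.relu(tf.matmul(x, w1) + b1)   # the "hidden layer"

w2 = tf.Variable(tf.random_normal([10, 1], stddev=0.1))
b2 = tf.Variable(tf.zeros([1]))
output = tf.matmul(hidden, w2) + b2

# Going deeper is just a matter of inserting more such
# weight-plus-activation blocks between the input and the output.
```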

Nick also overviewed the perks of TensorFlow as a library for deep learning.

Overviewing convolutional neural networks (CNNs), Nick touched upon reducing the number of parameters and showed some tricks to try out: pooling and dropout (both appear in the sketch below). He also talked about using a regional CNN together with a recurrent neural network for image captioning.
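
Here is a rough sketch of those two tricks in TensorFlow 1.x; the image size, filter count, and kernel size are illustrative assumptions. Weight sharing in the convolution is what cuts the parameter count, pooling shrinks the spatial resolution, and dropout randomly silences activations during training:

```python
import tensorflow as tf

# A batch of 28x28 grayscale images (sizes are illustrative).
images = tf.placeholder(tf.float32, shape=[None, 28, 28, 1])
keep_prob = tf.placeholder(tf.float32)  # dropout keep probability

# Convolution: 32 filters of size 5x5 share their weights across
# the whole image, which keeps the parameter count small.
conv_filter = tf.Variable(tf.truncated_normal([5, 5, 1, 32], stddev=0.1))
conv = tf.nn.relu(tf.nn.conv2d(images, conv_filter,
                               strides=[1, 1, 1, 1], padding='SAME'))

# Max pooling halves the spatial resolution: 28x28 -> 14x14.
pooled = tf.nn.max_pool(conv, ksize=[1, 2, 2, 1],
                        strides=[1, 2, 2, 1], padding='SAME')

# Dropout randomly zeroes activations during training to curb overfitting.
dropped = tf.nn.dropout(pooled, keep_prob)
```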
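
As for captioning, one common way to wire a CNN to an RNN (not necessarily the exact architecture from the talk) is to project the CNN's image features into the word-embedding space and feed them to the RNN as its first timestep, so the generated caption is conditioned on the image. All shapes below are assumptions for illustration:

```python
import tensorflow as tf

# Illustrative shapes: 512-dim CNN features, 256-dim word embeddings,
# captions of up to 19 words.
cnn_features = tf.placeholder(tf.float32, [None, 512])    # from the CNN
captions = tf.placeholder(tf.float32, [None, 19, 256])    # embedded words

# Project image features to the embedding size and prepend them
# as the first timestep of the RNN input sequence.
proj = tf.Variable(tf.random_normal([512, 256], stddev=0.1))
image_step = tf.expand_dims(tf.matmul(cnn_features, proj), 1)  # [None, 1, 256]
rnn_inputs = tf.concat([image_step, captions], axis=1)         # [None, 20, 256]

cell = tf.nn.rnn_cell.BasicLSTMCell(num_units=256)
outputs, state = tf.nn.dynamic_rnn(cell, rnn_inputs, dtype=tf.float32)
```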

You can find Nick’s “TensorFlow Machine Learning Cookbook” on his GitHub profile.


Want details? Watch the video!


Join the meetup group to stay informed about upcoming events.


About the expert

Nick McClure is a Senior Data Scientist at PayScale, where he works on machine learning and natural language processing algorithms. He has experience in house price estimation, image recognition, casino game design, optimal slot machine placement, and customer worth prediction. Nick holds a PhD in applied mathematics from the University of Montana and is also an instructor at the University of Washington, teaching students data science and applied mathematics.