Deep Learning With Python | Deep Learning And Neural Networks | Deep Learning Tutorial | Simplilearn

Description
This presentation on Deep Learning with Python will help you understand what deep learning is, applications of deep learning, what a neural network is, biological versus artificial neural networks, an introduction to TensorFlow, activation functions, cost functions, how neural networks work, and what gradient descent is. Deep learning is a technology used to achieve machine learning through neural networks. We will also look into how neural networks can help achieve the capability of a machine to mimic human behavior. We'll also implement a neural network manually. Finally, we'll code a neural network in Python using TensorFlow.

The following topics are explained in this Deep Learning with Python presentation:
1. What is Deep Learning
2. Biological versus Artificial Intelligence
3. What is a Neural Network
4. Activation function
5. Cost function
6. How do Neural Networks work
7. How do Neural Networks learn
8. Implementing the Neural Network
9. Gradient descent
10. Deep Learning platforms
11. Introduction to TensorFlow
12. Implementation in TensorFlow

You can gain in-depth knowledge of Deep Learning by taking our Deep Learning certification training course. With Simplilearn's Deep Learning course, you will prepare for a career as a Deep Learning engineer as you master concepts and techniques including supervised and unsupervised learning, mathematical and heuristic aspects, and hands-on modeling to develop algorithms. Those who complete the course will be able to:
1. Understand the concepts of TensorFlow, its main functions, operations, and the execution pipeline
2. Implement deep learning algorithms, understand neural networks, and traverse the layers of data abstraction, which will empower you to understand data like never before
3. Master and comprehend advanced topics such as convolutional neural networks, recurrent neural networks, training deep networks, and high-level interfaces
4. Build deep learning models in TensorFlow and interpret the results
5. Understand the language and fundamental concepts of artificial neural networks
6. Troubleshoot and improve deep learning models
7. Build your own deep learning project
8. Differentiate between machine learning, deep learning, and artificial intelligence

There is booming demand for skilled deep learning engineers across a wide range of industries, making this deep learning course with TensorFlow training well suited for professionals at the intermediate to advanced level of experience. We recommend this deep learning online course particularly for the following professionals:
1. Software engineers
2. Data scientists
3. Data analysts
4. Statisticians with an interest in deep learning

Learn more at https://www.simplilearn.com/deep-learning-course-with-tensorflow-training
Transcript
  • 1. Building Robots Deep Learning is used to train robots to perform human tasks
  • 2. Music composition Deep Neural Nets can be used to produce music by making computers learn the patterns involved in composing music
  • 3. Image Colorization A neural network recognises objects and uses information from the images to colour them
  • 4. Machine Translation Google Translate is one popular machine translator you may have come across. Given a word, phrase or sentence in one language, neural networks automatically translate it into another language
  • 5. What’s in it for you? 1. What is Deep Learning? 2. Biological versus Artificial intelligence 3. What is a Neural Network? 4. Activation function 5. Cost function 6. How do Neural Networks work? 7. How do Neural Networks learn? 8. Implementing the Neural Network 9. Gradient descent 10. Deep learning platforms 11. Introduction to TensorFlow 12. Implementation in TensorFlow
  • 6. What is Deep Learning? Deep Learning is a subfield of Machine Learning that deals with algorithms inspired by the structure and function of the brain. Artificial Intelligence: the ability of a machine to imitate intelligent human behavior. Machine Learning: an application of AI that allows a system to automatically learn and improve from experience. Deep Learning: an application of Machine Learning that uses complex algorithms and deep neural nets to train a model
  • 7. Biological Neuron vs Artificial Neuron Dendrites fetch information from adjacent neurons and pass it on as inputs; in an artificial neuron, the data is fed as input
  • 8. Biological Neuron vs Artificial Neuron The cell nucleus processes the information received from the dendrites; the artificial neuron processes the information provided as input
  • 9. Biological Neuron vs Artificial Neuron Axons are the cables over which the information is transmitted; in an artificial neuron, the information is transferred over weighted channels
  • 10. Biological Neuron vs Artificial Neuron Synapses receive the information from the axons and transmit it to the adjacent neurons; the output is the final value predicted by the artificial neuron
  • 11. What is a Neural Network? We feed an unlabeled image to a machine which identifies it without any human intervention
  • 12. What is a Neural Network? This machine is intelligent enough to differentiate between the various shapes
  • 13. What is a Neural Network? Neural networks provide this capability
  • 14. What is a Neural Network? A neural network is a system modeled on the human brain
  • 15. What is a Neural Network? inputs → neuron → output A neural network is a system modeled on the human brain. Inputs are fed to a neuron, which processes the data and gives an output
  • 16. What is a Neural Network? inputs → neuron → output This is the most basic structure of a neural network, known as a perceptron. A neural network is a system modeled on the human brain
  • 17. What is a Neural Network? Let’s start with visualising a neural network as a black box that takes an input and gives an output. However, neural networks are usually much more complex
  • 18. However, neural networks are usually much more complex What is a Neural Network? square The box takes inputs, processes them and gives an output
  • 19. However, neural networks are usually much more complex What is a Neural Network? square The box takes inputs, processes them and gives an output Let’s have a look at what happens within this box
  • 20. However, neural networks are usually much more complex What is a Neural Network? Within the box exists a network that is the core of deep learning neuron layer
  • 21. However, neural networks are usually much more complex What is a Neural Network? The network consists of layers of neurons neuron layer
  • 22. However, neural networks are usually much more complex What is a Neural Network? Each neuron is associated with a number called the bias neuron layer b1 b2 b3 b4
  • 23. However, neural networks are usually much more complex What is a Neural Network? Neurons of each layer transmit information to neurons of the next layer over channels neuron
  • 24. However, neural networks are usually much more complex What is a Neural Network? These channels are associated with numbers called weights neuron w1 w2 w3 w4
  • 25. However, neural networks are usually much more complex What is a Neural Network? These weights along with the biases determine the information that is passed over from neuron to neuron neuron w1 w2 w3 w4
  • 26. However, neural networks are usually much more complex What is a Neural Network? neuron Neurons of each layer transmit information to neurons of the next layer
  • 27. However, neural networks are usually much more complex What is a Neural Network? neuron Neurons of each layer transmit information to neurons of the next layer
  • 28. However, neural networks are usually much more complex What is a Neural Network? neuron Neurons of each layer transmit information to neurons of the next layer
  • 29. However, neural networks are usually much more complex What is a Neural Network? neuron square Neurons of each layer transmit information to neurons of the next layer
  • 30. However, neural networks are usually much more complex What is a Neural Network? neuron square The output layer emits a predicted output
  • 31. However, neural networks are usually much more complex What is a Neural Network? neuron square The output is emitted by the only active neuron in the final layer Let’s now go deeper. What happens within the neuron?
  • 32. Activation Function Within each neuron the following operations are performed:
  • 33. Activation Function Within each neuron the following operations are performed: • The product of each input and the weight of the channel it’s passed over is found
  • 34. Activation Function Within each neuron the following operations are performed: • The product of each input and the weight of the channel it’s passed over is found • Sum of the weighted products is computed. This is called the weighted sum
  • 35. Activation Function Within each neuron the following operations are performed: • The product of each input and the weight of the channel it’s passed over is found • Sum of the weighted products is computed. This is called the weighted sum • Bias unique to the neuron is added to the weighted sum
  • 36. Activation Function Within each neuron the following operations are performed: • The product of each input and the weight of the channel it’s passed over is found • Sum of the weighted products is computed. This is called the weighted sum • Bias unique to the neuron is added to the weighted sum • The final sum is then subjected to a particular function
  • 37. Activation Function Within each neuron the following operations are performed: • The product of each input and the weight of the channel it’s passed over is found • Sum of the weighted products is computed. This is called the weighted sum • Bias unique to the neuron is added to the weighted sum • The final sum is then subjected to a particular function. This is the activation function
  • 38. Activation Function ∑ xi*wi + Bias → Output: an activation function takes the weighted sum of the inputs as its input, adds a bias and provides an output
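As a rough sketch of those operations inside a single neuron, in plain Python (the input values, weights and bias below are made-up numbers, and a sigmoid stands in for the "particular function"):

```python
import math

def neuron(inputs, weights, bias):
    # 1-2. Multiply each input by its channel's weight and sum the products (the weighted sum)
    weighted_sum = sum(x * w for x, w in zip(inputs, weights))
    # 3. Add the bias unique to this neuron
    z = weighted_sum + bias
    # 4. Subject the final sum to the activation function (a sigmoid here)
    return 1 / (1 + math.exp(-z))

print(neuron(inputs=[1.0, 2.0, 3.0], weights=[0.4, -0.2, 0.1], bias=0.5))
```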
  • 39. Activation Function Here are the most popular types of activation function: the Sigmoid Function, the Threshold Function, the Rectifier Function and the Hyperbolic Tangent Function
  • 40. Activation Function Sigmoid Function: used for models where we have to predict a probability as the output; its value lies between 0 and 1. Φ(X) = 1 / (1 + e^(-X)), where X = ∑ wi*xi
  • 41. Activation Function Threshold Function: a threshold-based activation function; if X is greater than or equal to a certain value, the function is activated and fires, otherwise it is not. Φ(X) = 1 if X >= 0, else 0
  • 42. Activation Function Rectifier Function: the most widely used activation function; it gives an output of X if X is positive and 0 otherwise. Φ(X) = max(X, 0)
  • 43. Activation Function Hyperbolic Tangent Function: similar to the Sigmoid function, but bound to the range (-1, 1). Φ(X) = (1 - e^(-2X)) / (1 + e^(-2X))
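The four functions above translate directly into Python. This is a minimal sketch for scalar inputs; in practice the argument X would be the weighted sum ∑ xi*wi plus the bias:

```python
import math

def sigmoid(x):
    # phi(X) = 1 / (1 + e^(-X)); output lies between 0 and 1
    return 1 / (1 + math.exp(-x))

def threshold(x):
    # phi(X) = 1 if X >= 0, else 0
    return 1 if x >= 0 else 0

def rectifier(x):
    # phi(X) = max(X, 0); also known as ReLU
    return max(x, 0)

def hyperbolic_tangent(x):
    # phi(X) = (1 - e^(-2X)) / (1 + e^(-2X)); output lies between -1 and 1
    return (1 - math.exp(-2 * x)) / (1 + math.exp(-2 * x))

for f in (sigmoid, threshold, rectifier, hyperbolic_tangent):
    print(f.__name__, f(0.5), f(-0.5))
```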
  • 44. Cost Function The cost value is the difference between the neural net’s predicted output ŷ and the actual output y from a set of labeled training data
  • 45. Cost Function The cost value is the difference between the neural net’s predicted output ŷ and the actual output y from a set of labeled training data. The least cost value is obtained by making adjustments to the weights and biases iteratively throughout the training process
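In code, the squared-error cost used throughout the following slides, C = 1/2 (y - ŷ)², is a one-liner; the sample prices echo the $2000 predicted versus $4000 actual example that appears later:

```python
def cost(y_actual, y_predicted):
    # C = 1/2 * (y - y_hat)^2
    return 0.5 * (y_actual - y_predicted) ** 2

print(cost(y_actual=4000.0, y_predicted=2000.0))  # 2000000.0
```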
  • 46. How do Neural Networks work? But what happens within these neurons?
  • 47. How do Neural Networks work? Let’s build a neural network to predict a bike’s price based on a few of its features: mileage, ABS and cc, fed as inputs x1, x2, x3 to the input layer
  • 48. How do Neural Networks work? The output layer emits the predicted bike price ŷ
  • 49. How do Neural Networks work? Between the input and output layers sits a hidden layer, which helps in improving the output accuracy
  • 50. How do Neural Networks work? Each of the connections has a weight assigned to it (w1, w2, ...)
  • 51. How do Neural Networks work? The neuron takes a subset of the inputs and processes it. Step 1: x1*w1 + x2*w2 + b1. Step 2: Φ(x1*w1 + x2*w2 + b1), where Φ is an activation function and b1 is the neuron’s bias
  • 52.–55. How do Neural Networks work? Similarly, the other hidden neurons combine their own subsets of the inputs over weighted channels (w3–w9), each adding its own bias (b2, b3, b4)
  • 56. How do Neural Networks work? The information reaching the neurons in the hidden layer is subjected to the respective activation function
  • 57. How do Neural Networks work? The processed information is now sent to the output layer, once again, over weighted channels (w10, w11, w12, w13)
  • 58. How do Neural Networks learn? The output ŷ, which is the predicted value, is compared against the actual value y
  • 59. How do Neural Networks learn? A cost function, C = 1/2 (y - ŷ)², determines the error in prediction and reports it back to the neural network
  • 60. How do Neural Networks learn? This is called back propagation
  • 61. How do Neural Networks learn? The weights are adjusted (w1 → w1', w2 → w2', ...) in order to reduce the error
  • 62. How do Neural Networks learn? The network is now trained using the new weights
  • 63. How do Neural Networks learn? Once again, the cost is determined and back propagation continues until the cost cannot be reduced any further
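A minimal sketch of that compare-and-adjust loop for a single neuron with one weight and one bias; the feature value, target price, starting weight and learning rate are all assumptions of this sketch, and a full network would update every weight and bias through backpropagation:

```python
x, y_actual = 7.4, 4000.0        # one input feature and the actual price (placeholders)
w, b = 0.5, 0.0                  # arbitrary starting weight and bias
learning_rate = 0.01

for step in range(2000):
    y_hat = w * x + b                        # forward pass (no activation, for simplicity)
    cost = 0.5 * (y_actual - y_hat) ** 2     # C = 1/2 (y - y_hat)^2
    error = y_hat - y_actual                 # dC/dy_hat, the signal propagated back
    w -= learning_rate * error * x           # adjust the weight to reduce the error
    b -= learning_rate * error               # adjust the bias as well

print(round(w * x + b, 2), round(cost, 6))   # prediction approaches 4000, cost approaches 0
```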
  • 64. Implementing the neural network Let’s plug in values and see how our neural network works
  • 65. Implementing the neural network Initially, our channels are assigned random weights
  • 66. Implementing the neural network Our first neuron takes the values of mileage and cc as inputs: n1 = Φ(7.4*w1 + 3.51*w2 + b1)
  • 67. Implementing the neural network Similarly, each of the neurons takes a different combination of inputs: n2 = Φ(7.4*w3 + 9.4*w4 + b2)
  • 68. Implementing the neural network n3 = Φ(3.51*w5 + 9.4*w6 + b3)
  • 69. Implementing the neural network n4 = Φ(7.4*w7 + 3.51*w8 + 9.4*w9 + b4)
  • 70. Implementing the neural network The processed value from each neuron (n1, n2, n3, n4) is sent to the output layer over weighted channels
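Written out in plain Python, those hidden-neuron computations look like this. The input values 7.4, 3.51 and 9.4 come from the slides, while the weights w1–w9, the biases b1–b4 and the choice of a sigmoid for Φ are placeholders of this sketch:

```python
import math

def phi(z):                              # sigmoid standing in for the activation function
    return 1 / (1 + math.exp(-z))

x1, x2, x3 = 7.4, 3.51, 9.4              # example input values from the slides

w = {i: 0.1 for i in range(1, 10)}       # placeholder weights w1..w9 (random in practice)
b1, b2, b3, b4 = 0.1, 0.2, 0.3, 0.4      # placeholder biases

# Each hidden neuron takes a different combination of the inputs
n1 = phi(x1 * w[1] + x2 * w[2] + b1)
n2 = phi(x1 * w[3] + x3 * w[4] + b2)
n3 = phi(x2 * w[5] + x3 * w[6] + b3)
n4 = phi(x1 * w[7] + x2 * w[8] + x3 * w[9] + b4)
print(n1, n2, n3, n4)                    # these values travel on to the output layer
```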
  • 71. Implementing the neural network Once again, the values are subjected to an activation function and a single value is emitted as the output ŷ
  • 72. Implementing the neural network On comparing the predicted value ($2000) to the actual value ($4000), we clearly see that our network requires training
  • 73. Implementing the neural network The cost function C = 1/2 (y - ŷ)² is calculated, and back propagation takes place
  • 74. Implementing the neural network Based on the value of the cost function, certain weights are changed (w5 → w5', w6 → w6')
  • 75. Implementing the neural network The values are once again processed using these new weights at the neuron
  • 76. Implementing the neural network Our neural network is considered trained when the value of the cost function is at its minimum; the predicted price now matches the actual price of $4000
  • 77. Gradient Descent But what approach do we take to minimise the cost function?
  • 78. Gradient Descent Let’s start with plotting the cost function C = 1/2 (y - ŷ)² against the predicted value ŷ
  • 79.–83. Gradient Descent [plot: the cost curve C = 1/2 (y - ŷ)², with successive predictions stepping toward its minimum]
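A minimal sketch of that descent: start from an arbitrary prediction and repeatedly step against the slope dC/dŷ = (ŷ - y) until the cost stops shrinking (the target value and learning rate are placeholders):

```python
y_actual = 4000.0        # the value we want the network to predict (placeholder)
y_hat = 0.0              # an arbitrary starting prediction
learning_rate = 0.1

for step in range(200):
    cost = 0.5 * (y_hat - y_actual) ** 2     # C = 1/2 (y_hat - y)^2
    gradient = y_hat - y_actual              # slope of the cost curve, dC/dy_hat
    y_hat -= learning_rate * gradient        # step downhill along the curve

print(round(y_hat, 2), round(cost, 6))       # y_hat converges to 4000, cost to 0
```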
  • 84. Gradient Descent And with that, we have all the right weights and we can say our network is trained
  • 85. Deep Learning Platforms Torch, Keras, TensorFlow, DeepLearning4J (Java)
  • 86. Introduction to TensorFlow TensorFlow is an open source tool used to define and run computations on tensors
  • 87. TensorFlow is an open source tool used to define and run computations on tensors Introduction to TensorFlow What are tensors?
  • 88. Introduction to TensorFlow Tensors are just another name for arrays, for example a tensor of dimensions [5], a tensor of dimensions [5, 4], or a tensor of dimensions [3, 3, 3]
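In TensorFlow, arrays of these dimensions can be created directly; a minimal sketch (the element values are arbitrary):

```python
import tensorflow as tf

t1 = tf.constant([2, 4, 8, 1, 1])          # tensor of dimensions [5]
t2 = tf.constant([[9, 3, 2, 5],
                  [4, 4, 6, 6],
                  [3, 3, 7, 8],
                  [2, 9, 5, 4],
                  [8, 2, 9, 5]])           # tensor of dimensions [5, 4]
t3 = tf.zeros([3, 3, 3])                   # tensor of dimensions [3, 3, 3]

print(t1.shape, t2.shape, t3.shape)        # (5,) (5, 4) (3, 3, 3)
```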
  • 89. Introduction to TensorFlow An open-source software library developed by Google; the most popular library in deep learning; can run on either CPU or GPU; can create data flow graphs that have nodes and edges; used for Machine Learning applications such as neural networks
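As a small illustration of defining a computation on tensors and running it as a data flow graph (this sketch uses the TensorFlow 2.x API; the values are arbitrary):

```python
import tensorflow as tf

@tf.function                     # traces the Python function into a data flow graph
def affine(x, w, b):
    # matmul and add become nodes of the graph; tensors flow along its edges
    return tf.matmul(x, w) + b

x = tf.constant([[1.0, 2.0]])
w = tf.constant([[0.5], [0.25]])
b = tf.constant([0.1])
print(affine(x, w, b))           # runs on CPU or GPU, whichever is available
```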
  • 90. Implementation in TensorFlow Let’s build a neural network to identify handwritten digits using the MNIST database (Modified National Institute of Standards and Technology database). It has a collection of 70,000 handwritten digits, and digit labels identify each of the digits from 0 to 9
  • 91. Implementation in TensorFlow The dataset is used to train the machine; a new image of a digit is fed in, and the digit is identified (for example, 3)
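A compact sketch of such a digit classifier using TensorFlow's Keras API and its built-in MNIST loader. The layer sizes, activation choices, optimizer and number of epochs are assumptions of this sketch, not details taken from the presentation:

```python
import numpy as np
import tensorflow as tf

# Load the 70,000 MNIST handwritten digits (60,000 for training, 10,000 for testing)
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0    # scale pixel values to [0, 1]

# A small feed-forward network: 28x28 pixels -> 128 hidden units -> 10 digit classes
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.fit(x_train, y_train, epochs=5)                # train on the labeled digits
model.evaluate(x_test, y_test)                       # accuracy on unseen digits

prediction = model.predict(x_test[:1])               # feed a new image of a digit
print("Identified digit:", np.argmax(prediction))    # the predicted label, 0-9
```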