Deep Learning Basics(8): Intermediate layers and backpropagation
In the previous article we learned that neural networks look for the correlation between the inputs and the outputs of a training set. We also learned that based on the pattern, the weights will have an overall tendency to increase or decrease until the network predicts all the values correctly.
Sometimes there is not a clear direction in this ...
Deep Learning Basics(7): Correlation
In previous articles, we learned how neural networks adjust their weights to improve the accuracy of their predictions using techniques like gradient descent.
In this article, we will take a look at the learning process using a more abstract perspective. We will discuss the correlation between inputs and outputs in a training set, and how neura...
Deep Learning Basics(6): Generalized gradient descent (II)
In the previous article, we laid the foundations for a generalized implementation of gradient descent; namely, the cases with multiple inputs and one output, and with multiple outputs and one input.
In this article, we will continue our generalization efforts to come up with a version of gradient descent that works with any number of inputs and outputs.
First, ...
Deep Learning Basics(5): Generalized gradient descent (I)
In the previous article, we learned about gradient descent with a simple 1-input/1-output network. In this article, we will learn how to generalize this technique for networks with any number of inputs and outputs.
We will concentrate on 3 different scenarios:
Gradient descent on NNs with multiple inputs and a single output (a minimal sketch of this case follows below).
Gradient...
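As a rough sketch of that first scenario, with made-up inputs, weights, and learning rate rather than the article's own values, the weight update for a multi-input, single-output network might look like this:

```python
# Illustrative sketch: gradient descent on a network with multiple inputs
# and a single output. Inputs, weights, and learning rate are made up.

def predict(inputs, weights):
    # The prediction is the weighted sum (dot product) of inputs and weights.
    return sum(i * w for i, w in zip(inputs, weights))

inputs = [8.5, 0.65, 1.2]      # one training example with three features
goal = 1.0                     # the value we want the network to output
weights = [0.1, 0.2, -0.1]
alpha = 0.01                   # learning rate

for iteration in range(3):
    pred = predict(inputs, weights)
    delta = pred - goal
    error = delta ** 2
    # Each weight's update is the shared delta scaled by its own input.
    weights = [w - alpha * delta * i for w, i in zip(weights, inputs)]
    print(iteration, round(pred, 4), round(error, 6))
```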
Quick tips: Integrating Google Analytics with Rails 6/5 + Turbolinks + Webpacker
Just including the script tags Google Analytics provides you is not enough to enable analytics when a user is navigating your app with Turbolinks.
We need to enable our app to send analytics to Google every time Turbolinks loads, and I wanted to share the solution I came up with for this problem. Before we start, let me share a disclaimer:
...
Deep Learning Basics(4): Gradient Descent
In the previous article, we learned about hot/cold learning.
We also learned that hot/cold learning has some problems: it’s slow and prone to overshoot, so we need a better way of adjusting the weights.
A better approach should take into consideration how accurate our predictions are and adjust the weights accordingly. Predictions that are way...
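A minimal sketch of that idea, assuming a 1-input/1-output network and made-up numbers (this is an illustration, not the article's code):

```python
# Illustrative sketch: a single weight adjusted in proportion to the error.
# Input, goal, starting weight, and learning rate are made-up values.

input_value = 0.5
goal_prediction = 0.8
weight = 0.0
alpha = 0.1                    # learning rate

for iteration in range(20):
    prediction = input_value * weight
    delta = prediction - goal_prediction
    error = delta ** 2
    # A prediction that is way off produces a large delta and therefore a
    # large weight update; a nearly correct prediction barely moves it.
    weight -= alpha * delta * input_value
    print(f"iteration {iteration}: prediction={prediction:.4f}, error={error:.6f}")
```

Because each update depends on how wrong the prediction was, the weight can settle on a value instead of repeatedly stepping past it.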
Deep Learning Basics(3): Hot/Cold learning
In the previous articles, we learned how neural networks perform estimations: the inputs are combined with the network's weights in a weighted sum. Until now, the values of those weights were given to us by a mysterious external force. We took for granted that those were the values that produce the best estimates.
Finding the right value for e...
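Assuming the usual formulation of hot/cold learning (nudge the weight a small fixed step in each direction and keep whichever step lowers the error), a made-up example of that search might look like this:

```python
# Illustrative sketch of hot/cold learning: try a small fixed step up and
# down, keep whichever direction reduces the error. All values are made up.

input_value = 0.5
goal_prediction = 0.8
weight = 0.5
step = 0.001

for iteration in range(1100):
    error_up = (input_value * (weight + step) - goal_prediction) ** 2
    error_down = (input_value * (weight - step) - goal_prediction) ** 2

    # Move the weight in whichever direction is "hotter" (lower error).
    if error_down < error_up:
        weight -= step
    else:
        weight += step

print(weight, input_value * weight)   # close to 1.6 and 0.8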
Deep Learning Basics(2): Estimation
In the previous article we learned what a neural network is and how it performs predictions: the input is combined with knowledge (in the form of a weight value) to produce an output.
In practice, just one input and one weight are rarely of any use. Most systems in the real world are much more complex, so you will need networks that can handle ...
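As a minimal, made-up illustration of one input combined with one weight, and the weighted-sum extension to several inputs:

```python
# Illustrative sketch of the simplest prediction: one input times one weight.
# The numbers are made up for the example.

weight = 0.1
single_input = 8.5
prediction = single_input * weight
print(round(prediction, 2))    # 0.85

# With several inputs, each gets its own weight and the prediction becomes
# a weighted sum, which is the kind of network the series builds toward.
inputs = [8.5, 0.65, 1.2]
weights = [0.1, 0.2, 0.0]
prediction = sum(i * w for i, w in zip(inputs, weights))
print(round(prediction, 2))    # 0.98
```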