Neural Networks

April 16, 2023 |  Categories:  Machine Learning  

In this short post, I want to quickly cover what a neural network is.

You will often see neural networks depicted as a series of nodes connected by lines. This image can be helpful to visualize, but apart from the obvious media appeal, it doesn't explain what is happening at its core. Don't get me wrong, I believe the classic diagram is useful for abstracting away the nitty-gritty and helping us focus on the overarching ideas, but if you look one layer deeper, it becomes much less like magic and much more like math.

The individual nodes in a neural network diagram represent functions with weights and a bias, for example: f(x1, x2) = w1·x1 + w2·x2 + b. Each node takes the outputs of the nodes connected to it as its inputs, multiplies them by its own weights, adds its bias, and passes the result along as an input to the nodes in the next layer. Through some more advanced mathematics than I am going to get into here, these weights and biases are adjusted based on how close the final output is to a target value. After adjusting these numbers over and over, eventually your large function (the neural network) outputs answers similar to the ones you are expecting. It can then be applied to new inputs with the hope of giving an accurate prediction.
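To make that concrete, here is a minimal sketch in Python of a single node: a weighted sum plus a bias, with its weights nudged over and over toward a target. The toy data, the learning rate, and the target function (y = 2·x1 + 3·x2 + 1) are made up purely for illustration, and a real network would chain many of these nodes together with activation functions and backpropagation, but the core idea is the same.

```python
# One "node": prediction = w1*x1 + w2*x2 + b, trained with plain gradient descent.
# Toy data chosen so the node should learn y = 2*x1 + 3*x2 + 1 (an assumption for this sketch).
data = [((1.0, 2.0), 9.0), ((2.0, 1.0), 8.0), ((0.0, 3.0), 10.0), ((3.0, 0.0), 7.0)]

w1, w2, b = 0.0, 0.0, 0.0   # the weights and bias that get adjusted
lr = 0.01                    # learning rate: how big each adjustment is

for epoch in range(2000):
    for (x1, x2), target in data:
        # Forward pass: the node's function f(x1, x2) = w1*x1 + w2*x2 + b
        prediction = w1 * x1 + w2 * x2 + b

        # How far off are we from the target value?
        error = prediction - target

        # Nudge each weight and the bias in the direction that shrinks the error
        w1 -= lr * error * x1
        w2 -= lr * error * x2
        b  -= lr * error

print(f"learned: w1={w1:.2f}, w2={w2:.2f}, b={b:.2f}")
# After training, predictions on new inputs should be close to 2*x1 + 3*x2 + 1.
print("prediction for (1.5, 1.5):", w1 * 1.5 + w2 * 1.5 + b)
```

Run it and the weights settle near 2, 3, and 1: no magic, just repeated small adjustments to a function's numbers.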

You can see, then, that this emerging age of AI is not magic. It's not some black box of sentient beings (though the more advanced methods can be very convincing). It is math.
