Tensors and Neural Networks

Tensors are multi-dimensional arrays that generalize matrices to higher dimensions.
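As a quick illustration (using NumPy here, since its ndarray embodies the same multi-dimensional-array concept that TensorFlow's tensors build on), each added dimension raises the tensor's rank:

```python
import numpy as np

scalar = np.array(5.0)                 # rank 0: a single number
vector = np.array([1.0, 2.0, 3.0])    # rank 1: a 1-D array
matrix = np.array([[1.0, 2.0],
                   [3.0, 4.0]])        # rank 2: a 2-D array (a matrix)
cube = np.zeros((2, 3, 4))             # rank 3: a 3-D array

print(scalar.ndim, vector.ndim, matrix.ndim, cube.ndim)  # 0 1 2 3
```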

I am studying the book Neural Networks with TensorFlow and Keras by Philip Hua. Let's learn things!

P.S. I am not affiliated with the author or publisher in any way; I am just learning from this book and sharing my notes. All credit goes to the author. I would buy him a beer.

Tensor 1: Using Tensors

As AI models have grown larger and datasets more expansive, the need for parallel computing has risen. Deep learning models at this scale need to represent and process data across multiple dimensions, and this is where tensors become crucial.
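For example, a mini-batch of grayscale images is naturally a rank-4 tensor, with one axis each for batch, height, width, and channels (the shape below is an illustrative choice of mine, not from the book):

```python
import numpy as np

# A hypothetical batch of 32 grayscale 28x28 images.
batch = np.zeros((32, 28, 28, 1))  # (batch, height, width, channels)
print(batch.ndim)    # 4
print(batch.shape)   # (32, 28, 28, 1)
```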

Tensor 2: How a Machine Learns Using Neural Networks

Did you know that a feedforward neural network with a single hidden layer can approximate any continuous function to arbitrary precision, given a suitable activation function and enough neurons in the hidden layer? This is known as the Universal Approximation Theorem.
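A tiny sketch of the idea (my own illustration, not from the book): fix 50 random tanh hidden units and fit only the output weights by least squares, which is already enough to approximate sin(x) closely on an interval.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
y = np.sin(x)

# Hidden layer: 50 neurons with fixed random weights and tanh activation.
W = rng.normal(size=(1, 50))
b = rng.normal(size=50)
H = np.tanh(x @ W + b)

# Fit the output weights by least squares (a stand-in for training).
w_out, *_ = np.linalg.lstsq(H, y, rcond=None)
y_hat = H @ w_out

max_err = np.max(np.abs(y_hat - y))
print(max_err)  # small: the single hidden layer approximates sin well
```

More neurons drive the error down further, which is exactly what the theorem promises.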

Tensor 3: Components of Neural Networks

In this post we look at the components of a neural network. Let's try to understand what each component does and how they combine to make intelligent decisions.
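As a preview of those components (a minimal NumPy sketch with made-up layer sizes, not the book's code): a layer combines weights, biases, and an activation function, and a forward pass chains layers together.

```python
import numpy as np

rng = np.random.default_rng(42)

def relu(z):
    # Activation function: introduces non-linearity between layers.
    return np.maximum(0.0, z)

# Weights and biases are the learnable parameters of each layer.
W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)   # hidden layer: 4 -> 8
W2, b2 = rng.normal(size=(8, 2)), np.zeros(2)   # output layer: 8 -> 2

def forward(x):
    # Forward pass: each layer applies an affine map, then an activation.
    h = relu(x @ W1 + b1)
    return h @ W2 + b2

x = rng.normal(size=(5, 4))   # a batch of 5 input vectors
print(forward(x).shape)       # (5, 2)
```

Training would then adjust W1, b1, W2, b2 to reduce a loss, which is where the "learning" in the previous post comes in.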