
Build and Run an Artificial Neural Network on Your Browser

Feb 10th, 2017 3:00am by
Feature image via Pixabay.

In an earlier post, we built an image classifier that could detect flowers in an image. Building it involved several steps: installing Docker, downloading the data set, linking the TensorFlow image, and retraining the artificial neural network.

But what if you could skip all those steps and visualize the results in the web browser, with no fear of breaking the code? In this post, we will visualize how an Artificial Neural Network (ANN) learns from the input data it is given and how its output changes with different kinds of input, while examining the parameters that play a crucial role in the learning process. The diagram below shows the basic working principle of an ANN.

And then we’ll test out these concepts in an in-browser ANN simulator, created by some folks at Google.

Weights and Measures


Working of a perceptron: the basic operating principle of an Artificial Neural Network

First, the ANN takes a combination of input features, each with its own weight. Assigning a weight can be thought of as assigning a priority to a feature. In the summation block, each input feature is multiplied by its weight, and the products are summed.

Next, a threshold check verifies whether the sum exceeds a chosen threshold. If it does, the ANN activates: its state becomes ON and a corresponding output is generated. Otherwise, it remains deactivated and its state stays OFF.

For instance, consider building an algorithm that classifies a red ball in an image. The most likely input features to choose are red color and round shape. Now, suppose we want to give more importance to the shape feature than to the color feature. To do this, we assign weight parameters to both features, giving the shape feature a higher weight than the color feature.

This completes the input features and the summation block. Next comes choosing an activation function to perform the threshold check. Several types of function are available, such as the sigmoid and step functions, or you can create one of your own; experimenting with different functions shows how the output changes. For this example, we want an output like True or False (1 or 0), so a step function is a good choice. Its output response is given by the following equation:

Activation function: f(Sum) = 1 if Sum > threshold, else f(Sum) = 0

Hence, if the sum is greater than the chosen threshold, the state is ON, meaning a red ball is present in the image; otherwise it is OFF. It may happen that the chosen threshold does not work well for the classifier, or that a simple linear activation function cannot cope with the type of input features.
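The weighted summation and threshold check described above can be sketched as a minimal perceptron. The feature names, weight values, and threshold here are illustrative choices for the red-ball example, not values from the article:

```python
def step(total, threshold):
    """Step activation: ON (1) if the weighted sum exceeds the threshold, else OFF (0)."""
    return 1 if total > threshold else 0

def perceptron(features, weights, threshold):
    # Summation block: multiply each input feature by its weight, then sum.
    total = sum(f * w for f, w in zip(features, weights))
    # Threshold check via the step activation function.
    return step(total, threshold)

# Shape is weighted more heavily than color, as discussed above.
weights = [0.3, 0.7]   # [red_color, round_shape]
threshold = 0.5

print(perceptron([1.0, 1.0], weights, threshold))  # red and round: 1 (ON)
print(perceptron([1.0, 0.0], weights, threshold))  # red but not round: 0 (OFF)
```

Because the shape weight alone exceeds the threshold while the color weight does not, a round non-red object would still activate the perceptron, but a red non-round one would not.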

Sometimes the model overfits the input feature data, and as a result it may not generalize well and can produce erratic predictions. To address this, a regularization term is often added to the function f(Sum), which helps improve the algorithm's performance.
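One common form of that regularization term is an L2 penalty on the weights, which discourages the model from relying too heavily on any single feature. This is a hedged sketch of the idea; the weight values and the lambda coefficient are illustrative, not from the article:

```python
def l2_penalty(weights, lam):
    # L2 regularization: lambda times the sum of squared weights.
    return lam * sum(w * w for w in weights)

def regularized_loss(base_loss, weights, lam):
    # The overall objective: the original loss plus the regularization term.
    return base_loss + l2_penalty(weights, lam)

# A small penalty is added on top of the base loss; larger weights
# (or a larger lambda) would increase it.
print(regularized_loss(0.10, [0.3, 0.7], lam=0.01))
```

Minimizing this combined quantity during training keeps the weights small, which tends to smooth the decision boundary and reduce overfitting.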

To understand more about the terms covered in this post and to explore further, check out this online experimental tool, created by Google's Daniel Smilkov and Shan Carter, which lets you tinker with an operational ANN directly in your web browser:

Hands-On Artificial Neural Network, Without the Fear of Breaking the Code

In the tool, the blocks represent the neurons and the thickness of the connections between the layers of neurons shows their weights (importance). As you can see, the ANN contains 2 hidden layers, but more layers can be added by clicking the plus and minus buttons available on the top.

Similarly, the number of neurons can be adjusted with matching buttons. On the left are different types of datasets on which the ANN can be trained and tested, and unique properties can be added to a dataset by selecting features. The activation function, along with its learning rate, can be chosen from the drop-down menus.

The learning rate determines how quickly an ANN learns over time. Similarly, the regularization rate can be adjusted to prevent the overfitting problem discussed earlier. Another term closely related to the classification of data, yet quite different from it, is regression. Regression predicts output values for given inputs without labeling them, whereas classification predicts the output and also assigns a label (a class) to it. In the tool, all these parameters can be tuned by editing the entries directly or by hovering over a block and editing its values.
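To see why the learning rate matters so much, consider a minimal gradient-descent sketch: each update moves a weight a step proportional to the learning rate. The toy loss function (w - 3)^2 and the rate values below are illustrative assumptions, not taken from the tool:

```python
def gradient(w):
    # Derivative of the toy loss (w - 3)^2, which is minimized at w = 3.
    return 2 * (w - 3)

def train(w, lr, steps):
    # Plain gradient descent: step against the gradient, scaled by the learning rate.
    for _ in range(steps):
        w = w - lr * gradient(w)
    return w

print(train(0.0, lr=0.1, steps=50))  # a modest rate converges near the minimum at 3
print(train(0.0, lr=1.1, steps=5))   # too large a rate overshoots and diverges
```

This is the same trade-off you can observe in the simulator: a small learning rate converges slowly but steadily, while an overly large one makes the loss curve oscillate or blow up.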

Depending on the dataset chosen, the amount of noise added, and the activation function selected, you can simultaneously view the output response (the training loss, the test loss, and the progression of learning over time) in graphical form on the right side of the screen. I hope that by using this tool you can see many basic machine learning concepts in action, which is important if you want to use artificial intelligence to your advantage.

Happy Learning and stay tuned for the next exploration!

TNS owner Insight Partners is an investor in: Docker.