Introduction to Neural Networks: Feedforward

You can find this article and source code at my GitHub repo

A Dead Simple Neural Network

If you have ever heard of neural networks or looked at some material about them, a model like the one in the figure below should be familiar to you.

An artificial neural network model

Well, that's a little bit complicated, so what about this one?

Diagram of a simple neural network

Let's briefly explain each component in the figure. Each circle represents a unit (or a neuron), and each square represents a calculation. The three leftmost units form the input layer. The unit labeled h is the only neuron in the output layer of this network.

The input to the output unit

Recall that a biological neuron has a threshold that must be exceeded before it fires. In our neural network, the unit passes its input through an activation function and sends the result on as its output. A big advantage here is that the activation function can be almost any function you like: a step function, a polynomial, a sigmoid, and so on. The output unit returns the result of f(h), where h is the input to the output unit and y is the output of the neural network.
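In symbols (my reconstruction of the figure, using the same weights, inputs, and bias that appear in the code below; the original figure may have omitted the bias term), the input to the output unit and the network output are

$$h = \sum_i w_i x_i + b, \qquad y = f(h)$$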

If you let f(h) = h be your activation function, the output of the network becomes the following (note that here y = f(h)).

The output

You are correct if you think this is just a linear regression model. Once you start using activation functions that are continuous and differentiable, it becomes possible to train the network with gradient descent; for that we will need the first derivative of the activation function. Before we dive into the training process, let's code this dead simple neural network in Python. For the activation function, we will use the sigmoid. And don't worry that this network can only make predictions by feedforward for now; it learns (gets trained) through backpropagation, which comes later.

The sigmoid function
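Written out (the derivative is a standard identity I am adding here, since we will need it for the gradient-descent training mentioned above), the sigmoid and its derivative are

$$\sigma(x) = \frac{1}{1 + e^{-x}}, \qquad \sigma'(x) = \sigma(x)\,\bigl(1 - \sigma(x)\bigr)$$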
import numpy as np

def sigmoid(x):
    # sigmoid function
    return 1/(1 + np.exp(-x))

inputs = np.array([0.7, -0.3])
weights = np.array([0.1, 0.8])
bias = -0.1

# calculate the output
output = sigmoid(np.dot(weights, inputs) + bias)

print('Output:')
print(output)
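As a quick sanity check by hand: h = 0.1·0.7 + 0.8·(−0.3) + (−0.1) = −0.27, and sigmoid(−0.27) ≈ 0.433, which is roughly the value the script should print.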

You can find the code here.

Your First 2-Layer NN

By now you should have a basic idea of how a neural network makes predictions. For a real-world problem, though, such a simple network may not be of much help. A new concept needs to be introduced here: the hidden layer.

A network with 3 input units, 2 hidden units and 1 output unit

In the previous simple network our weights formed a vector, but in the more common case the weights form a matrix like the one below (this is the weight matrix represented in the figure above).

Weights matrix for 3 input units and 2 hidden units
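In other words (my paraphrase of the figure), the weights form a 3×2 matrix with one row per input unit and one column per hidden unit, so each column holds the three weights feeding a single hidden unit; this is also the shape used for weights_in_hidden in the code below.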

You can probably see how to calculate h1 from the structure of the 2-layer network. Let's write it down as a mathematical formula.

The formula of calculating hidden layer inputs
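Written as an equation (my reconstruction, using the row-before-column indexing explained in the note below), the input to hidden unit j is the weighted sum over all input units:

$$h_j = \sum_i w_{ij}\, x_i$$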

And for our case,

The matrix multiplication for calculating hidden layer inputs of our network
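Expanded (my reconstruction of the figure, again with row-before-column indices), the product looks like

$$\begin{bmatrix} h_1 & h_2 \end{bmatrix} = \begin{bmatrix} x_1 & x_2 & x_3 \end{bmatrix} \begin{bmatrix} w_{11} & w_{12} \\ w_{21} & w_{22} \\ w_{31} & w_{32} \end{bmatrix}$$

so that, for example, $h_1 = x_1 w_{11} + x_2 w_{21} + x_3 w_{31}$.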

Note: The weight indices have changed in the above image and no longer match up with the labels used in the earlier diagrams. That's because, in matrix notation, the row index always precedes the column index, so it would be misleading to label them the way we did in the neural net diagram.

Weight matrix shown with labels matching earlier diagrams.

Remember, the above is not a correct view of the indices, but it uses the labels from the earlier neural net diagrams to show you where each weight ends up in the matrix.

Combining this with the formula we learned in the first section, we can implement the 2-layer neural network! The activation function used here is again the sigmoid.

Things to do:

  • Calculate the input to the hidden layer.
  • Calculate the hidden layer output.
  • Calculate the input to the output layer.
  • Calculate the output of the network.

import numpy as np

def sigmoid(x):
    # sigmoid function
    return 1/(1+np.exp(-x))

# Network size
N_input = 3
N_hidden = 2
N_output = 1

np.random.seed(42)
# Make some fake input data: one sample with N_input features
X = np.random.randn(N_input)

weights_in_hidden = np.random.normal(0, scale=0.1, size=(N_input, N_hidden))
weights_hidden_out = np.random.normal(0, scale=0.1, size=(N_hidden, N_output))

hidden_layer_in = np.dot(X, weights_in_hidden)
hidden_layer_out = sigmoid(hidden_layer_in)

print('Hidden-layer Output:')
print(hidden_layer_out)

output_layer_in = np.dot(hidden_layer_out, weights_hidden_out)
output_layer_out = sigmoid(output_layer_in)

print('Output-layer Output:')
print(output_layer_out)

You can find the code here.
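As an aside, the same forward pass is often wrapped in a small function so it can be reused; below is a minimal sketch under my own naming (the forward helper and the batch example are not part of the original code), relying only on the fact that np.dot applies the same weights to every row of its first argument.

import numpy as np

def sigmoid(x):
    # sigmoid function
    return 1/(1 + np.exp(-x))

def forward(X, w_in_hidden, w_hidden_out):
    # one feedforward pass: input -> hidden -> output
    hidden_out = sigmoid(np.dot(X, w_in_hidden))
    return sigmoid(np.dot(hidden_out, w_hidden_out))

np.random.seed(42)
w_in_hidden = np.random.normal(0, scale=0.1, size=(3, 2))
w_hidden_out = np.random.normal(0, scale=0.1, size=(2, 1))

# works for a single sample of shape (3,) or a batch of shape (n, 3)
X_batch = np.random.randn(5, 3)
print(forward(X_batch, w_in_hidden, w_hidden_out))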

Reference

Thanks for reading. If you find any mistake or typo in this post, please don't hesitate to let me know. You can reach me by email: jyang7[at]ualberta.ca
