As part of my study of machine learning, I am reading *Learning from the Basics: Artificial Intelligence Textbook*.
A nice feature of this book is that the end-of-chapter problems include simple Python programs.
Here, I have ported one of them to Ruby.
neuralnet.rb
INPUTNO = 2
HIDDENNO = 2
def forward(wh, wo, hi, e)
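  # Hidden layer: for each unit, take the weighted sum of the inputs
  # and subtract the threshold stored as the last element of wh[i]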
  HIDDENNO.times do |i|
    u = 0.0
    INPUTNO.times do |j|
      u += e[j] * wh[i][j]
    end
    u -= wh[i][INPUTNO]
    hi[i] = f(u)
  end
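  # Output layer: weighted sum of the hidden outputs, minus the
  # threshold stored as the last element of wo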
  o = 0.0
  HIDDENNO.times do |i|
    o += hi[i] * wo[i]
  end
  o -= wo[HIDDENNO]
  f(o)
end
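# Transfer function: a step function (1.0 if u >= 0, else 0.0);
# swapped out in the variants below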
def f(u)
  return 1.0 if u >= 0
  0.0
end
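# Fixed weights; the last element of each wh row and of wo is the
# unit's threshold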
wh = [[-2, 3, -1], [-2, 1, 0.5]]
wo = [-60, 94, -1]
e = [[0, 0], [0, 1], [1, 0], [1, 1]]
hi = [0] * (HIDDENNO + 1)
e.each do |x|
  puts "#{x}->#{forward(wh, wo, hi, x)}"
end
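To see what forward is doing, here is a hand trace of the input [0, 1] (my own trace, not from the book): hidden unit 0 gets u = 0*(-2) + 1*3 - (-1) = 4, so hi[0] = f(4) = 1.0; hidden unit 1 gets u = 0*(-2) + 1*1 - 0.5 = 0.5, so hi[1] = 1.0; the output is then o = 1.0*(-60) + 1.0*94 - (-1) = 35, and forward returns f(35) = 1.0.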
It is a simple feed-forward (hierarchical) neural network calculation with no learning. Be careful, though: there is an indentation error on line 26 of the book's listing **(September 25, 2019, 1st edition, 1st printing)**.
error.py
for i in range(HIDDENNO):     # Wrong
    for i in range(HIDDENNO): # Correct
As you know, indentation errors in Python can be fatal, but fortunately the sample code can be downloaded from the Ohmsha page for the book.
step.rb
def f(u)
  return 1.0 if u >= 0
  0.0
end
# Output example
[0, 0]->0.0
[0, 1]->1.0
[1, 0]->1.0
[1, 1]->0.0
Here the transfer function f is a step function.
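Since these outputs match XOR exactly, here is a quick check of all four cases (my own addition, not from the book; append it to the end of neuralnet.rb):
# XOR check (my own addition): append to the end of neuralnet.rb
expected = { [0, 0] => 0.0, [0, 1] => 1.0, [1, 0] => 1.0, [1, 1] => 0.0 }
e.each do |x|
  actual = forward(wh, wo, hi, x)
  raise "mismatch at #{x}: got #{actual}" unless actual == expected[x]
end
puts "all four XOR cases match"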
sigmoid.rb
def f(u)
  1 / (1 + Math::E ** -u)
end
# Output example
[0, 0]->0.0006265270712940932
[0, 1]->0.6434453861326787
[1, 0]->0.0003334059232134918
[1, 1]->8.512503196901111e-16
Here the transfer function f is a sigmoid function.
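The weights were apparently tuned for the step function, so the sigmoid outputs are graded rather than binary. If you threshold them at 0.5 (my own experiment, not from the book), only [0, 1] comes out as 1, so the XOR pattern is not reproduced:
# Threshold the sigmoid outputs at 0.5 (my own experiment); append to
# the end of neuralnet.rb with the sigmoid f
e.each do |x|
  puts "#{x}->#{forward(wh, wo, hi, x) >= 0.5 ? 1 : 0}"
end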
ramp.rb
def f(u)
  return u if u >= 0
  0.0
end
# Output example
[0, 0]->0.0
[0, 1]->0.0
[1, 0]->1.0
[1, 1]->0.0
Here the transfer function f is a ramp function (ReLU).
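As far as I can tell, this is because the ramp outputs are not clipped to 1: hidden unit 0 fires for every input except [1, 0], and its un-clipped activation times the -60 output weight then dominates the sum. Only [1, 0], where both hidden units output 0.0 and the output threshold alone gives o = 1, produces a positive result; the weights really are tailored to the step function.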