This article covers forward propagation only; backward propagation is demonstrated in a separate article. -> Please see here.
Let's learn the basics of neural networks using the micro:bit. Actually programming one deepens your understanding. To train a neural network to solve various tasks, back propagation is generally required, and that training involves quite lengthy computation. Doing it on such a tiny board as the micro:bit, with its limited computing power, is not realistic, even if not impossible. Therefore, here we perform only the forward calculation, using the edge weights of a neural network that has already been trained. Even so, you can experience some of the important points of neural networks.
Neural network with micro:bit: this illustrates evaluation of z = XOR(1, 1) = 0
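The forward-only calculation described above can be sketched in Python as follows. Note that the weights here are illustrative hand-picked values that happen to realize XOR with a 2-2-1 sigmoid network; they are not the actual weights learned by the NetLogo program used in the article.

```python
import math

def sigmoid(u):
    # Standard logistic activation used in the forward pass
    return 1.0 / (1.0 + math.exp(-u))

def xor_forward(x, y):
    # Hidden layer: two neurons (weights are hand-picked for illustration,
    # not the values learned by the NetLogo simulation)
    h1 = sigmoid(20 * x + 20 * y - 10)    # behaves roughly like OR
    h2 = sigmoid(-20 * x - 20 * y + 30)   # behaves roughly like NAND
    # Output neuron: roughly AND of the two hidden outputs
    return sigmoid(20 * h1 + 20 * h2 - 30)

for x, y in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, y, xor_forward(x, y))
```

Only multiplications, additions, and the sigmoid are needed, which is why this direction of the computation fits comfortably on the micro:bit.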
The edge weights used above were obtained by training with a NetLogo program based on the following:
- Wilensky, U. (1999). NetLogo. http://ccl.northwestern.edu/netlogo/. Center for Connected Learning and Computer-Based Modeling, Northwestern University, Evanston, IL.
The weights used in the above example were obtained by this NetLogo simulation
Please see the video for an actual operation example.
https://youtu.be/PPUcsXgCnZ4

In the first half, z = XOR(x, y) is calculated with x = 1 and y = 1; the value 0.0178... is obtained, so "0" is finally displayed. In the second half, with x = 1 and y = 0, the value 0.9852... is obtained, so "1" is displayed as the final result of z.
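The final display step seen in the video amounts to rounding the sigmoid output to the nearest binary digit. A minimal sketch (the function name `to_digit` is a hypothetical helper, not taken from the actual program):

```python
def to_digit(z):
    # Round the sigmoid output to the nearest binary digit for display
    return "1" if z >= 0.5 else "0"

print(to_digit(0.0178))  # the first run in the video: displays "0"
print(to_digit(0.9852))  # the second run: displays "1"
```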