Lec7 - Ex:1 - "Perceptron-based multi-layer networks with hard thresholds and relative weights (without using learning)"

by SYED FAHAD HASSAN

Perceptron-based multi-layer networks with hard thresholds and hand-set relative weights (no learning involved) can implement simple Boolean functions such as A and (not B) and A xor B.

  • For the function A and (not B), a single unit suffices: two input neurons (one for A, one for B) feed one output neuron. One working choice is weight +1 on A, weight -1 on B, and threshold 1. The weighted sum reaches 1 only when A is 1 and B is 0, so the output fires (1) exactly when A and (not B) is true and stays 0 for the other three input combinations (see the sketch after this list).
  • For the function A xor B, a single output neuron with direct connections from the inputs is not enough: XOR is not linearly separable, so no choice of weights and threshold can separate the true cases (1,0) and (0,1) from the false cases (0,0) and (1,1). A hidden layer is needed. For example, one hidden neuron can compute A and (not B), a second can compute (not A) and B, and the output neuron ORs their activations, firing when either hidden neuron fires (again, see the sketch after this list).
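
Below is a minimal Python sketch of both constructions. The linked Colab notebook contains the actual implementation; the function names and the specific weight/threshold values here are my own illustrative choices, not necessarily those in the notebook. Each unit outputs 1 when the weighted sum of its inputs reaches its threshold, and 0 otherwise:

```python
def hard_threshold_unit(inputs, weights, threshold):
    """Hard-threshold perceptron unit: 1 if the weighted input sum reaches the threshold, else 0."""
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total >= threshold else 0

def a_and_not_b(a, b):
    # A single unit suffices: weight +1 on A, weight -1 on B, threshold 1.
    # Only (A=1, B=0) gives a weighted sum of 1, so only that case fires.
    return hard_threshold_unit([a, b], [1, -1], 1)

def a_xor_b(a, b):
    # XOR needs a hidden layer because it is not linearly separable.
    h1 = hard_threshold_unit([a, b], [1, -1], 1)   # A and (not B)
    h2 = hard_threshold_unit([a, b], [-1, 1], 1)   # (not A) and B
    # The output unit ORs the two hidden activations: it fires if either one fires.
    return hard_threshold_unit([h1, h2], [1, 1], 1)

# Truth-table check over all four input combinations.
for a in (0, 1):
    for b in (0, 1):
        print(f"A={a} B={b}  A and (not B) = {a_and_not_b(a, b)}  A xor B = {a_xor_b(a, b)}")
```

Note that the same building block (a hard-threshold unit with hand-picked weights) is reused at every layer; the only thing that changes between the two functions is the wiring.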

It's worth noting that hand-setting relative weights with hard thresholds is a very simplistic approach to neural networks and is not used in practice: the weights are fixed and the hard-threshold function is not differentiable, so the network cannot learn from data or generalize beyond the function it was wired for.

Here is the link to the Google Colab notebook where you can see the implementation:

https://colab.research.google.com/drive/1rvFwnJK5-IXlNaPa8JZaTdk1tmNMHyBc?usp=sharing