level: Level 1 of Chapter 5 - Neural Networks

Questions and Answers List


Q: Types of input–output interaction
A: Linear methods: every input contributes separately to the output. Decision trees: some interaction between input variables, but limited. Deep neural networks: long computational paths and many interactions between input variables.

Q: Brains
A: About 10^11 neurons of more than 20 types, 10^14 synapses, 1 ms–10 ms cycle time. Signals are noisy "spike trains" of electrical potential.

Q: McCulloch–Pitts "unit"
A: a_i ← g(in_i) = g(Σ_j W_{j,i} a_j)

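The update rule above can be sketched in code. The function name, the choice of a step function for g, and the example thresholds below are illustrative assumptions, not from the text:

```python
# Sketch of a McCulloch-Pitts unit: the weighted sum of the incoming
# activations a_j is passed through a step activation g.
def mcp_unit(activations, weights, threshold=0.0):
    in_i = sum(w * a for w, a in zip(weights, activations))
    return 1 if in_i >= threshold else 0  # g = step function

# With weights (1, 1) and threshold 2, the unit fires only when
# both inputs fire, i.e. it computes AND.
print(mcp_unit([1, 1], [1, 1], threshold=2))  # 1
print(mcp_unit([1, 0], [1, 1], threshold=2))  # 0
```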
Q: Activation functions
A: (a) A step function (threshold function). (b) A rectified linear function, ReLU(x) = max(0, x). The smooth, everywhere-differentiable version of ReLU is called softplus: softplus(x) = log(1 + e^x). Changing the bias weight W_{0,i} moves the threshold location.

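A quick numerical check of the two functions (the sample points are arbitrary): softplus tracks ReLU closely away from zero, where it smooths the kink.

```python
import math

def relu(x):
    return max(0.0, x)

def softplus(x):
    # log(1 + e^x): the smooth, everywhere-differentiable version of ReLU
    return math.log1p(math.exp(x))

# softplus approaches ReLU away from zero
for x in (-5.0, 0.0, 5.0):
    print(x, relu(x), round(softplus(x), 4))
```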
Q: McCulloch and Pitts: can every Boolean function be implemented?
A: Yes: every Boolean function can be built by combining units that compute AND, OR, and NOT.

Q: Expressiveness of perceptrons
A: Can represent AND, OR, NOT, majority, etc., but not XOR. A perceptron represents a linear separator in input space.

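As a sketch, the hand-picked weights and biases below (assumed for illustration) give the linear separators for AND and OR; no single weight/bias pair separates the XOR outputs, since they are not linearly separable.

```python
# A perceptron outputs step(w . x + b), i.e. a linear separator.
def perceptron(x, w, b):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else 0

AND = lambda x: perceptron(x, (1, 1), -1.5)
OR = lambda x: perceptron(x, (1, 1), -0.5)

for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, AND(x), OR(x))
```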
Q: Multi-layer perceptrons
A: 1) A single perceptron cannot represent complex non-linear relations. 2) Neural networks are also known as multi-layer perceptrons. 3) When every node of a layer is connected to all nodes of the next layer, the network is called a fully connected neural network. 4) When a network has connections in only one direction, from input to output, it is called a feed-forward neural network.

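To illustrate point 1, a minimal two-layer feed-forward network computes XOR, which no single perceptron can. The weights are hand-picked for illustration: the hidden units compute OR and NAND, and the output unit ANDs them.

```python
def step(z):
    return 1 if z >= 0 else 0

def xor_net(x1, x2):
    h1 = step(x1 + x2 - 0.5)    # hidden unit: OR
    h2 = step(1.5 - x1 - x2)    # hidden unit: NAND
    return step(h1 + h2 - 1.5)  # output unit: AND of the two

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, xor_net(a, b))
```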
Q: Summary
A: Most brains have many neurons; each neuron is approximately a linear threshold unit. Perceptrons (one-layer networks) are insufficiently expressive. Multi-layer networks are sufficiently expressive and can be trained by gradient descent, i.e., error back-propagation. Many applications: speech, driving, handwriting, fraud detection, etc. The engineering, cognitive-modelling, and neural-system-modelling subfields have largely diverged.
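The summary's "trained by gradient descent, i.e., error back-propagation" can be sketched on the XOR task. The network size (2-2-1, sigmoid units), learning rate, and iteration count below are illustrative assumptions, not from the text:

```python
import math
import random

random.seed(0)
sig = lambda z: 1.0 / (1.0 + math.exp(-z))
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

# 2 inputs -> 2 hidden sigmoid units -> 1 sigmoid output;
# each weight vector carries a trailing bias weight.
wh = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
wo = [random.uniform(-1, 1) for _ in range(3)]

def forward(x1, x2):
    x = (x1, x2, 1.0)
    h = [sig(sum(w * v for w, v in zip(u, x))) for u in wh]
    y = sig(sum(w * v for w, v in zip(wo, h + [1.0])))
    return x, h, y

def total_error():
    return sum((forward(*x)[2] - t) ** 2 for x, t in data)

before = total_error()
for _ in range(10000):
    for (x1, x2), t in data:
        x, h, y = forward(x1, x2)
        dy = (y - t) * y * (1 - y)  # output delta (squared-error loss)
        dh = [dy * wo[j] * h[j] * (1 - h[j]) for j in range(2)]
        hb = h + [1.0]
        for j in range(3):          # gradient step, learning rate 0.5
            wo[j] -= 0.5 * dy * hb[j]
        for j in range(2):
            for k in range(3):
                wh[j][k] -= 0.5 * dh[j] * x[k]

print(before, total_error())  # squared error shrinks with training
```

The backward pass propagates the output error through the sigmoid derivative y(1 - y) to get each layer's deltas, then takes a plain gradient step on every weight.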