
From course:

Intro to AI 2

Question:

Activation function and Pooling

Author: Christian N



Answer:

• An activation function (e.g. ReLU) is applied to the feature map.
• A (max) pooling layer groups together a selection of pixels and selects the maximum value.
• This identifies the area of the image where the filter found the best match.
• It makes the network robust against small shifts in the image.
• It reduces the size of the input while keeping the important information.
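The two steps above can be sketched with NumPy: ReLU zeroes out negative filter responses, then non-overlapping 2×2 max pooling keeps only the strongest response in each window, halving the map size. The function names and the example feature map are illustrative, not from the course.

```python
import numpy as np

def relu(feature_map):
    # ReLU: zero out negative responses in the feature map
    return np.maximum(feature_map, 0)

def max_pool(feature_map, size=2):
    # Non-overlapping max pooling: keep the strongest response in each
    # size x size window, shrinking the map while preserving where the
    # filter matched best
    h, w = feature_map.shape
    h2, w2 = h // size, w // size
    windows = feature_map[:h2 * size, :w2 * size].reshape(h2, size, w2, size)
    return windows.max(axis=(1, 3))

# Hypothetical 4x4 feature map produced by a convolution filter
fmap = np.array([[ 1., -2.,  0.,  3.],
                 [-1.,  5., -3.,  2.],
                 [ 2., -1.,  4., -2.],
                 [ 0.,  1., -1.,  6.]])

activated = relu(fmap)
pooled = max_pool(activated)   # 4x4 -> 2x2
print(pooled)                  # [[5. 3.]
                               #  [2. 6.]]
```

Note that a small shift of the peak inside a pooling window leaves the pooled output unchanged, which is the robustness mentioned above.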

