This repository was archived by the owner on Dec 21, 2017. It is now read-only.
Since the prediction convolutions are separate from the convolutions that feed forward to the next (conv) layers, during backprop, some prediction layer X will receive weight updates for:
(1) its classification prediction conv weights
(2) its regression prediction conv weights
(3) its normal conv weights that feed into later layers
Thus, for another prediction layer X' (earlier in the forward pass, hence later in the backward pass), the weights will be updated as follows:
(1) classification prediction conv weights are updated directly from the output layer
(2) regression prediction conv weights are updated directly from the output layer
(3) normal conv weights are updated via gradients flowing back through the normal conv weights of X
True/false? The included diagram for SSD 300 seems to support this intuition.
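To make the intuition concrete, here is a toy scalar sketch of that gradient flow in plain Python. Each "conv" is collapsed to a single scalar multiply, and the sum of the prediction outputs stands in for the multibox loss; all variable names and numbers are illustrative, not taken from the SSD code. The point is that X's feed-forward weight only sees gradients from X's own heads, while X''s feature receives its own heads' gradients plus the path back through X's normal conv:

```python
# Toy scalar model of SSD-style branching.
# Layer 1 plays the role of X' (earlier), layer 2 plays the role of X (later).
x = 2.0
w1, wc1, wr1 = 0.5, 1.5, -0.3   # X': feed-forward, cls-pred, reg-pred weights
w2, wc2, wr2 = 0.8, 2.0, 0.7    # X : feed-forward, cls-pred, reg-pred weights

# Forward pass
f1 = w1 * x          # feature "map" of X'
c1 = wc1 * f1        # classification prediction branch of X'
r1 = wr1 * f1        # regression prediction branch of X'
f2 = w2 * f1         # X's feature, produced by X''s normal (feed-forward) conv
c2 = wc2 * f2        # classification prediction branch of X
r2 = wr2 * f2        # regression prediction branch of X
loss = c1 + r1 + c2 + r2   # stand-in for the combined detection loss

# Backward pass by hand (chain rule)
dL_df2 = wc2 + wr2                 # gradient reaching X's feature: only its two heads
dL_dw2 = dL_df2 * f1               # X's feed-forward weight update
dL_df1 = wc1 + wr1 + dL_df2 * w2   # X' gets its own heads PLUS the path through w2
dL_dw1 = dL_df1 * x                # X''s feed-forward weight update
```

So point (3) for X' holds in the sense that the gradient for its normal conv weights passes *through* X's normal conv weights (the `dL_df2 * w2` term), while its two prediction branches are reached directly from the loss.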