sophisticated concepts made easier to understand through the use of memes
-
2019-03-10 at 3:02 AM UTC
[image post: the Pikachu meme discussed below]
-
2019-03-10 at 3:04 AM UTC
10/10 idea. 0/10 execution.
I have no idea why I'm looking at a Pikachu face. -
2019-03-10 at 3:07 AM UTC
Originally posted by gadzooks: 10/10 idea. 0/10 execution.
I have no idea why I'm looking at a Pikachu face.
Pikachu’s face shows how you feel once you internalize that quote and come to terms with the fact that it’s true. If you don’t know what the Pikachu meme is, you’ll probably have to study meme culture.
This meme rehashes the fact that we never acknowledge the ugly side of God. -
2019-03-10 at 3:07 AM UTC
More memes are welcome in this thread.
-
2019-03-10 at 3:21 AM UTC
Originally posted by username: Pikachu’s face shows how you feel once you internalize that quote and come to terms with the fact that it’s true. If you don’t know what the Pikachu meme is, you’ll probably have to study meme culture.
This meme rehashes the fact that we never acknowledge the ugly side of God.
You showed your work. You get a "Thanks".
I'm gonna try and come up with one too, just need a minute cuz I'm inebriated as fuck. -
2019-03-10 at 3:23 AM UTC
-
2019-03-10 at 3:27 AM UTC
-
2019-03-10 at 3:28 AM UTC
-
2019-03-10 at 3:30 AM UTC
-
2019-03-10 at 3:30 AM UTC
-
2019-03-10 at 3:32 AM UTC
Wow, this thread is like watching your dog get run over by a bus.
-
2019-03-10 at 3:33 AM UTC
[image post]
-
2019-03-10 at 3:34 AM UTC
[image post]
-
2019-03-10 at 3:36 AM UTC
[image post]
-
2019-03-10 at 6:16 AM UTC
[image post]
-
2019-03-10 at 6:19 AM UTC
[image post]
-
2019-03-10 at 7:02 AM UTC"Backpropagation is a method used in artificial neural networks to calculate a gradient that is needed in the calculation of the weights to be used in the network. Backpropagation is shorthand for "the backward propagation of errors," since an error is computed at the output and distributed backwards throughout the network’s layers. It is commonly used to train deep neural networks.
Backpropagation is a generalization of the delta rule to multi-layered feedforward networks, made possible by using the chain rule to iteratively compute gradients for each layer. It is closely related to the Gauss–Newton algorithm and is part of continuing research in neural backpropagation.
Backpropagation is a special case of a more general technique called automatic differentiation. In the context of learning, backpropagation is commonly used by the gradient descent optimization algorithm to adjust the weight of neurons by calculating the gradient of the loss function. "
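To make that wall of text concrete, here's a minimal backprop sketch in plain NumPy (my own toy example, not part of the quote): one hidden layer, squared-error loss, and the chain rule applied backwards through each layer, which is exactly the "errors distributed backwards" the definition describes.

# Minimal backpropagation sketch: 2 inputs -> 3 hidden (tanh) -> 1 output.
# The backward pass applies the chain rule layer by layer to get the
# gradients that a gradient descent step then uses to adjust the weights.
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 3))      # input -> hidden weights
W2 = rng.normal(size=(3, 1))      # hidden -> output weights

x = np.array([[0.5, -1.0]])       # one training example
y = np.array([[1.0]])             # its target

# Forward pass
h_pre = x @ W1                    # hidden pre-activation
h = np.tanh(h_pre)                # hidden activation
y_hat = h @ W2                    # network output
loss = 0.5 * np.sum((y_hat - y) ** 2)

# Backward pass: compute the error at the output, then push it backwards
d_yhat = y_hat - y                        # dL/dy_hat
dW2 = h.T @ d_yhat                        # dL/dW2
d_h = d_yhat @ W2.T                       # error propagated to hidden layer
d_hpre = d_h * (1.0 - np.tanh(h_pre)**2)  # chain rule through tanh
dW1 = x.T @ d_hpre                        # dL/dW1

# Gradient descent step using the backpropagated gradients
lr = 0.1
W1 -= lr * dW1
W2 -= lr * dW2

Run that in a loop and the loss shrinks. That's the whole trick; a real deep network just automates the same chain-rule bookkeeping across many more layers and weights.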
-
2019-03-10 at 8:18 AM UTC
Holy fuck, I remember trying to understand LSTM neural networks and I was like that dog + brain damage.
-
2019-03-10 at 8:42 AM UTC
[image post]
-
2019-03-10 at 10:00 PM UTC
[image post]