AcadiFi
DeepLearner_Vidia · 2026-03-22
CFA Level II › Quantitative Methods › Machine Learning

How does backpropagation train a neural network?

I understand forward pass but how does the network actually learn? What is the gradient and how does it update weights?

Verified Expert · AcadiFi Certified Professional
Backpropagation applies the chain rule to compute the gradient of the loss with respect to every weight, propagating error signals backward from the output layer through each hidden layer. The gradient ∂L/∂w tells you how much the loss changes if weight w changes; gradient descent then updates each weight in the direction that reduces the loss: w ← w − η·∂L/∂w, where η is the learning rate. ReLU activations help mitigate vanishing gradients (their derivative is 1 for positive inputs, so error signals don't shrink layer after layer), and adaptive optimizers such as Adam scale each parameter's step size to speed convergence — though Adam on its own does not eliminate vanishing gradients.
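The forward pass, backward pass, and weight update described above can be sketched end to end with NumPy. This is a minimal illustration, not production code: the architecture (one hidden layer of 8 ReLU units), the toy target y = x₁ + x₂, the learning rate, and all variable names are assumptions chosen for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: learn y = x1 + x2 (illustrative target)
X = rng.normal(size=(64, 2))
y = X.sum(axis=1, keepdims=True)

# Parameters of a 2 -> 8 -> 1 network (sizes chosen for illustration)
W1 = rng.normal(scale=0.5, size=(2, 8))
b1 = np.zeros((1, 8))
W2 = rng.normal(scale=0.5, size=(8, 1))
b2 = np.zeros((1, 1))
lr = 0.1  # learning rate (eta)

for step in range(500):
    # --- Forward pass: compute predictions and loss ---
    z1 = X @ W1 + b1
    a1 = np.maximum(z1, 0.0)          # ReLU activation
    pred = a1 @ W2 + b2
    loss = np.mean((pred - y) ** 2)   # mean squared error

    # --- Backward pass: chain rule, layer by layer ---
    dpred = 2 * (pred - y) / len(X)   # dLoss/dpred
    dW2 = a1.T @ dpred                # gradient w.r.t. W2
    db2 = dpred.sum(axis=0, keepdims=True)
    da1 = dpred @ W2.T                # error propagated to hidden layer
    dz1 = da1 * (z1 > 0)              # ReLU derivative: 1 where z1 > 0
    dW1 = X.T @ dz1                   # gradient w.r.t. W1
    db1 = dz1.sum(axis=0, keepdims=True)

    # --- Gradient-descent update: w <- w - lr * dL/dw ---
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"final loss: {loss:.6f}")
```

After a few hundred updates the loss should fall close to zero, since the target is linear and easily representable by the network. Note how each backward-pass line mirrors one forward-pass line in reverse order — that symmetry is the chain rule at work.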



#backpropagation #gradient-descent #training