The course is a few years old, but Ng did a nice job organizing a wide range of material.
One of the more interesting things for me was finally understanding how regularization reduces variance in a model. The concept reminds me of the weight decay techniques our team implemented back in the late '90s.
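The variance-reduction effect is easy to see in a small sketch. The example below (illustrative data and function names, not from the course) uses the closed-form ridge regression solution to show how increasing the L2 penalty shrinks the learned weights:

```python
import numpy as np

# Illustrative sketch of L2 regularization (ridge regression).
# The data and lambda values here are made up for demonstration.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
true_w = np.array([3.0, -2.0, 1.5, 0.0, 0.5])
y = X @ true_w + rng.normal(scale=0.5, size=50)

def ridge_weights(X, y, lam):
    """Closed-form ridge solution: w = (X^T X + lam*I)^{-1} X^T y."""
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ y)

for lam in [0.0, 1.0, 100.0]:
    w = ridge_weights(X, y, lam)
    print(f"lambda={lam:6.1f}  ||w|| = {np.linalg.norm(w):.3f}")
```

As lambda grows, the weight norm shrinks, which constrains the model and lowers its variance at the cost of some bias.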
Professor Ng also offered a tip on gradient checking: numerically testing gradients to verify the partial derivative calculations in a model. This technique alone is worth the price of admission for the development and training time it saves on buggy models.
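The idea can be sketched in a few lines: compare an analytic gradient against a centered finite-difference approximation. The cost function and its gradient below are illustrative examples of my own, not taken from the course:

```python
import numpy as np

# Sketch of gradient checking via centered finite differences.
# cost() and analytic_grad() are hypothetical examples for illustration.
def cost(w):
    return np.sum(w ** 2) + np.sin(w[0])

def analytic_grad(w):
    g = 2 * w
    g[0] += np.cos(w[0])
    return g

def numeric_grad(f, w, eps=1e-5):
    """Approximate each partial derivative: (f(w+eps) - f(w-eps)) / (2*eps)."""
    g = np.zeros_like(w)
    for i in range(w.size):
        w_plus, w_minus = w.copy(), w.copy()
        w_plus[i] += eps
        w_minus[i] -= eps
        g[i] = (f(w_plus) - f(w_minus)) / (2 * eps)
    return g

w = np.array([0.3, -1.2, 2.0])
diff = np.linalg.norm(analytic_grad(w) - numeric_grad(cost, w))
print(f"gradient check difference: {diff:.2e}")
```

If the difference is tiny (on the order of the squared step size), the analytic gradient is almost certainly correct; a large difference points straight at a derivation bug before you waste hours training a broken model.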