By Andras Palfi.
In recent months I've been applying Machine Learning to traditional business problems like churn, where I rarely get a chance to work with Deep Learning models, so I thought it would be a good idea to enroll in Andrew Ng's new Deep Learning course.
Besides the course being an amazing introductory resource on Deep Neural Networks, Andrew Ng also managed to pack in some great interviews with leaders in the field. One of the most interesting was with none other than Geoffrey Hinton, the godfather of Deep Learning. If you're unfamiliar with his work, Professor Hinton was one of the first researchers to demonstrate the use of the backpropagation algorithm for training multi-layer Neural Networks. He gave a great behind-the-curtains view of how he managed to publish his work on backpropagation in Nature, thus starting a revolution in Neural Networks.
It turns out that using the chain rule to compute derivatives was not a novel idea. Several other people had invented it independently, but their work received little attention. Professor Hinton, on the other hand, made some very clever moves to make sure his work got the exposure it needed. In his words:
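To make the chain-rule idea concrete, here is a minimal sketch (my own illustrative example, not Hinton's original formulation): a two-layer "network" of nested sigmoids, where the gradient with respect to the inner weight is obtained by multiplying the local derivatives along the chain, then checked against a finite-difference estimate.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(x, w1, w2):
    # Two nested layers: h = sigmoid(w1 * x), y = sigmoid(w2 * h)
    h = sigmoid(w1 * x)
    y = sigmoid(w2 * h)
    return h, y

def grad_w1(x, w1, w2):
    # Chain rule: dy/dw1 = (dy/dh) * (dh/dw1)
    h, y = forward(x, w1, w2)
    dy_dh = y * (1 - y) * w2    # derivative through the outer sigmoid
    dh_dw1 = h * (1 - h) * x    # derivative through the inner sigmoid
    return dy_dh * dh_dw1

# Sanity check: the analytic gradient should match a numerical estimate
x, w1, w2 = 0.5, 0.8, -1.2
eps = 1e-6
analytic = grad_w1(x, w1, w2)
numeric = (forward(x, w1 + eps, w2)[1] - forward(x, w1 - eps, w2)[1]) / (2 * eps)
print(analytic, numeric)  # the two values should agree closely
```

Backpropagation is exactly this multiplication of local derivatives, organized so that gradients for every weight in a deep network are computed in a single backward pass.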
“…I did quite a lot of political work to get the paper accepted. I figured out that one of the referees was probably going to be Stuart Sutherland, who was a well-known psychologist in Britain. And I went to talk to him for a long time and explained to him exactly what was going on. And he was very impressed by the fact that we showed that backprop could learn representations for words.”
I believe this illustrates an important lesson: sometimes even revolutionary ideas need a fair amount of strategic maneuvering, marketing, or PR to go mainstream.
If you’re interested in reading one of the seminal papers of Machine Learning, you can find it here: https://www.iro.umontreal.ca/~vincentp/ift3395/lectures/backprop_old.pdf.