NN architectures

I finally did it. I just watched the first half of Andrew Ng's "Convolutional Neural Networks" course, and I have now seen what exactly all these VGGs and AlexNets look like. I had some ideas, and I almost knew I wouldn't like to learn the truth. And yet, here it is: I am terrified by how astonishingly stupid these models are...

They really are hand-picking the shapes and order of layers based on empirical observations and, well... stacking a LOT of layers, more than I had imagined. Inception modules are something especially revolting. Not nearly something I could train (or even run?) on my Intel-powered laptop. UPD: I mean, of course I knew about all these particular things... yet it still shocked me to see the actual architectures.
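
To make the "stack parallel convolutions and concatenate" idea concrete, here is a minimal sketch of an Inception-style block in PyTorch. The channel counts and the smoke test below are made up for illustration; they are not the actual GoogLeNet numbers.

```python
import torch
import torch.nn as nn

class NaiveInceptionBlock(nn.Module):
    """A bare-bones Inception-style module: run several filter sizes in
    parallel and concatenate the results along the channel axis.
    Channel counts are arbitrary, not the ones from the GoogLeNet paper."""
    def __init__(self, in_ch):
        super().__init__()
        self.branch1 = nn.Conv2d(in_ch, 16, kernel_size=1)
        self.branch3 = nn.Conv2d(in_ch, 16, kernel_size=3, padding=1)
        self.branch5 = nn.Conv2d(in_ch, 16, kernel_size=5, padding=2)
        self.pool = nn.Sequential(
            nn.MaxPool2d(kernel_size=3, stride=1, padding=1),
            nn.Conv2d(in_ch, 16, kernel_size=1),
        )

    def forward(self, x):
        # Every branch preserves the spatial size, so the outputs
        # can be concatenated along the channel dimension.
        return torch.cat(
            [self.branch1(x), self.branch3(x), self.branch5(x), self.pool(x)],
            dim=1,
        )

# Quick smoke test: a batch of two 64x64 RGB "images".
x = torch.randn(2, 3, 64, 64)
print(NaiveInceptionBlock(3)(x).shape)  # torch.Size([2, 64, 64, 64])
```

The real networks then stack dozens of blocks like this on top of each other, which is exactly the part that makes them hopeless to train on a laptop.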

It disgusts me that such naive and arbitrary approaches should work better than everything else merely because we invented GPUs at some point. This is cruel.
