Hacker News

You know, driving cars, like the article was talking about?

...or, NLP, audio & image recognition, recommendations... come on. It’s not controversial.



> recommendations

A lot of this is standard statistical methods, many of which are much older than twenty years.

Really, that's the part that threw me: twenty years ago everyone was going crazy for SVMs, which is still machine learning, but the features were definitely hand-crafted.
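To make that workflow concrete, here's a rough sketch of the pre-deep-learning recipe: hand-engineer features, then fit a linear SVM on top. The task, the feature functions, and the toy data are all invented for illustration, and the trainer is a bare-bones Pegasos-style subgradient descent on the hinge loss, not a production SVM solver:

```python
import random

def handcrafted_features(text):
    # Hypothetical hand-engineered features for a toy "is this shouting?" task.
    return [
        sum(c.isupper() for c in text) / max(len(text), 1),  # uppercase ratio
        text.count("!") / max(len(text), 1),                 # exclamation density
        1.0,                                                 # bias term
    ]

def train_linear_svm(X, y, lam=0.01, epochs=500, seed=0):
    """Pegasos: stochastic subgradient descent on the regularised hinge loss."""
    rng = random.Random(seed)
    w = [0.0] * len(X[0])
    t = 0
    for _ in range(epochs):
        for i in rng.sample(range(len(X)), len(X)):
            t += 1
            eta = 1.0 / (lam * t)
            margin = y[i] * sum(wj * xj for wj, xj in zip(w, X[i]))
            # Shrink weights; add the hinge-loss subgradient if the margin is violated.
            w = [(1 - eta * lam) * wj for wj in w]
            if margin < 1:
                w = [wj + eta * y[i] * xj for wj, xj in zip(w, X[i])]
    return w

texts = ["HELLO!!!", "STOP NOW!", "hi there", "ok thanks"]
labels = [1, 1, -1, -1]  # +1 = shouting, -1 = not
X = [handcrafted_features(t) for t in texts]
w = train_linear_svm(X, labels)

def predict(text):
    score = sum(wj * xj for wj, xj in zip(w, handcrafted_features(text)))
    return 1 if score > 0 else -1
```

The point isn't the classifier, which is generic; it's that all the domain knowledge lives in `handcrafted_features`, which is exactly what deep learning later made unnecessary for unstructured data.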

I think deep learning has been super successful with unstructured data, but for tabular data it's pretty much a wash between NNs and boosted trees or generalised additive models.

You do need some flexibility in your function approximation, but not as much as people commonly believe.
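For a sense of what the boosted-tree side of that comparison looks like, here is a minimal gradient-boosting sketch using decision stumps on a toy 1-D regression problem. The dataset and all names are invented for illustration; in practice you'd reach for a library like XGBoost or LightGBM rather than anything hand-rolled:

```python
def fit_stump(xs, residuals):
    """Best single threshold split minimising squared error (1-D features here)."""
    best = None
    for t in sorted(set(xs)):
        left = [r for x, r in zip(xs, residuals) if x <= t]
        right = [r for x, r in zip(xs, residuals) if x > t]
        if not left or not right:
            continue
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        err = sum((r - lm) ** 2 for r in left) + sum((r - rm) ** 2 for r in right)
        if best is None or err < best[0]:
            best = (err, t, lm, rm)
    _, t, lm, rm = best
    return lambda x: lm if x <= t else rm

def fit_gbm(xs, ys, n_rounds=50, lr=0.3):
    base = sum(ys) / len(ys)
    stumps = []
    preds = [base] * len(xs)
    for _ in range(n_rounds):
        # For squared loss, the negative gradient is just the residual.
        residuals = [y - p for y, p in zip(ys, preds)]
        s = fit_stump(xs, residuals)
        stumps.append(s)
        preds = [p + lr * s(x) for p, x in zip(preds, xs)]
    return lambda x: base + lr * sum(s(x) for s in stumps)

xs = [0, 1, 2, 3, 4, 5, 6, 7]
ys = [0, 0, 0, 0, 1, 1, 1, 1]  # a step function: trivial for trees, awkward for smooth models
model = fit_gbm(xs, ys)
```

Each round fits a weak learner to the current residuals, which is the kind of piecewise-constant flexibility that turns out to be "enough" for a lot of tabular problems.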


I do sorta wonder about that.

The deep learning "revolution" coincided with exponential growth in the time, effort, and money thrown at these problems, and in the amount of data available.

In an alternate universe, could everyone be going crazy about kernel machines?
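For reference, a "kernel machine" here can be as simple as kernel ridge regression with an RBF kernel. This is a from-scratch sketch at toy scale only; scaling kernel methods to big datasets needs much cleverer solvers than the dense Gaussian elimination used below, which is part of why they lost ground:

```python
import math

def rbf(a, b, gamma=10.0):
    # Gaussian (RBF) kernel on scalars.
    return math.exp(-gamma * (a - b) ** 2)

def solve(A, b):
    """Gaussian elimination with partial pivoting, for small dense systems."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_krr(xs, ys, alpha=1e-3):
    # Dual coefficients from (K + alpha*I) c = y.
    K = [[rbf(xi, xj) + (alpha if i == j else 0.0)
          for j, xj in enumerate(xs)] for i, xi in enumerate(xs)]
    return solve(K, ys)

def predict_krr(xs, coef, x):
    return sum(c * rbf(xi, x) for c, xi in zip(coef, xs))

# Fit a smooth 1-D function from a handful of samples.
xs = [i / 10 for i in range(11)]  # 0.0 .. 1.0
ys = [math.sin(2 * math.pi * x) for x in xs]
coef = fit_krr(xs, ys)
```

Note that training cost grows with the number of samples squared (the Gram matrix), whereas an NN's cost grows with parameters, which is one concrete reason the alternate universe might not have happened.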


Maybe.

GPUs are definitely a big part of why this stuff has improved, since you can train much, much faster on larger datasets, which tends to improve performance.

NNs are super flexible though, and I'm not sure you'd have gotten the same level of performance out of other methods.

Interesting question though.



