The very definition of cloaking is serving different pages to bots than to users. So the 'optimized' stuff is essentially cloaking... What prevents googlebot from using a different header to check if the site is cloaked?
Btw, I prefer haXe (haxe.org) as a true unified web language ;-)
Yes, cloaking is giving a user and a search engine different pages, but the purpose of cloaking is to deceive search engines, and that's why it's evil. With NOLOH, search engines aren't deceived; they are simply given the content in a way that they can understand, since they do not understand JavaScript. They're still given the same content (same text, images, etc...) but they're not told to modify the DOM since they can't. So it is a helping hand to search engines, not a deception in any sense. Why would googlebot use a different header to pretend that it is a user in order to receive a broken page? That would not benefit any party.
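For illustration, a toy version of this kind of serving might look like the sketch below (the bot tokens and helper names are my own, not NOLOH's actual implementation): the server looks at the User-Agent header and hands crawlers the same content as static mark-up instead of DOM-modifying JavaScript.

```python
# Hypothetical sketch: serve the SAME content two ways depending on the client.
# Crawlers get plain mark-up; browsers get the content inserted via JavaScript.

BOT_TOKENS = ("googlebot", "bingbot", "slurp")  # common crawler UA substrings

def is_search_engine(user_agent: str) -> bool:
    """Crude User-Agent sniff; real detection would be more careful."""
    ua = user_agent.lower()
    return any(token in ua for token in BOT_TOKENS)

def render(content: str, user_agent: str) -> str:
    if is_search_engine(user_agent):
        # Crawlers can't run JavaScript, so give them static HTML.
        return f"<html><body><p>{content}</p></body></html>"
    # Browsers get the same content, but written into the DOM by script.
    return ("<html><body><script>"
            f"document.body.appendChild(document.createTextNode({content!r}));"
            "</script></body></html>")
```

Note that both branches carry the identical content string; only the delivery mechanism differs, which is the whole argument that this isn't deception.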
haXe is only unified in the sense that it has support for JS, Flash, and a few other languages. But going through their tutorials (for example, http://www.haxe.org/doc/js/ajax) you still see that you have to write mark-up. So all of the traditional problems that NOLOH is designed to address still persist in haXe. Mark-up is static, error-prone, not intuitive for application development, interpreted by browsers differently, and the list goes on and on...
> Why would googlebot use a different header to pretend that it is a user in order to receive a broken page? That would not benefit any party.
Well... it would benefit Google (and its users), because they could detect and punish cloaking. How would Google detect cloaking? A clerk examining the two served versions, or automagically comparing the two versions of served content? How smart would their comparing algorithm be? Could it determine that your site was not being deceptive? How would you write such an algorithm (and it had better be generic, and difficult to exploit)?
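For what it's worth, a naive version of such a comparison (fetching aside) could just strip the mark-up from both served versions and diff the visible text. Everything below is my guess at what a crawler might do, not anything Google has documented:

```python
# Naive cloaking check: two pages are "the same" if their visible text matches,
# even when the surrounding mark-up (or a JS delivery mechanism) differs.
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, skipping <script> and <style> contents."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self.skip = False

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip = True

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self.skip = False

    def handle_data(self, data):
        if not self.skip and data.strip():
            self.parts.append(data.strip())

def visible_text(html: str) -> str:
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.parts)

def looks_cloaked(bot_html: str, user_html: str) -> bool:
    # Flag only when the *content* differs, not the surrounding mark-up.
    return visible_text(bot_html) != visible_text(user_html)
```

Even this toy version would pass a site that serves identical text in different wrappers, and flag one that serves genuinely different content. Whether Google's real check is anywhere near this forgiving is exactly what I don't know.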
Basically, I have no idea what the hell Google does and does not do, but I am scared (and ignorant) enough to always serve search engines exactly what I serve users.