
One of the interesting contradictions in software is that, logically, new versions should get faster, since each release has had more time and more data available for optimization. In practice, however, new versions are usually slower than their predecessors, presumably because of the features they add. It'd be interesting to see a product developed with a "no slowdowns allowed" policy.


Pretty sure Snow Leopard was a release aimed at doing just that: rewriting a lot of the core system libraries for x86_64 and also cleaning house on a lot of old cruft. No real new features were delivered with the release.


Then Lion came along and slowed things to a crawl. It was definitely the slowest OS X release I ran on my old MacBook (circa '08). I eventually downgraded back to Snow Leopard until Mountain Lion came out, which was noticeably faster than Lion but still not quite as smooth as Snow Leopard. Mavericks, though, is back to being pretty good.

All in all, still not a very straightforward 'improvement' path performance-wise -- at least not from the end consumer's point of view.


WebKit has long been known for having a good set of automated performance tests and has a "zero-tolerance policy for performance regressions": http://www.webkit.org/projects/performance/
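A zero-tolerance policy like that usually means an automated gate that compares a candidate build against a stored baseline and fails the build on any slowdown. Here's a minimal sketch of the idea in Python; the function names and the workload are invented for illustration and this is not WebKit's actual harness (which has its own test runner and benchmarks):

```python
import statistics
import time


def measure(fn, runs=10):
    """Median wall-clock time of fn over several runs, in seconds.

    The median is used rather than the mean so one noisy run
    doesn't trigger a false regression.
    """
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        samples.append(time.perf_counter() - start)
    return statistics.median(samples)


def check_regression(fn, baseline_s, tolerance=0.0):
    """Fail if the candidate is slower than the baseline.

    tolerance=0.0 models a zero-tolerance policy; a real harness
    would likely allow a small noise margin.
    """
    current = measure(fn)
    if current > baseline_s * (1 + tolerance):
        raise AssertionError(
            f"perf regression: {current:.4f}s vs baseline {baseline_s:.4f}s")
    return current


# Hypothetical workload standing in for a real benchmark.
workload = lambda: sum(range(100_000))
```

In a real setup the baseline would come from the previous release's recorded numbers, and the check would run in CI on every commit so regressions are caught at the patch that introduced them.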


The websites themselves are what get slower and use more memory over time, not the browser. Try using Tumblr on your phone someday and check the data usage afterward: it doesn't compress those multi-megabyte GIFs for mobile at all.


Some large websites (I've heard this about Google Search, specifically) have ironclad latency limits. If a feature can't be added without increasing latency above XXms, then it's not shipped.
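The gating logic behind a hard latency budget is simple enough to sketch. Everything below is invented for illustration: the comment above only says such limits exist, not what the budget is or how it's enforced, so the 200 ms figure and all names here are hypothetical:

```python
# Hypothetical hard latency budget, in milliseconds.
BUDGET_MS = 200.0


def can_ship(current_ms: float, feature_ms: float,
             budget_ms: float = BUDGET_MS) -> bool:
    """A feature ships only if total latency stays within the budget."""
    return current_ms + feature_ms <= budget_ms


def plan_release(current_ms: float, proposals: dict[str, float]) -> list[str]:
    """Greedily accept proposed features, cheapest first, while they
    still fit in the remaining budget; the rest don't ship."""
    shipped = []
    for name, cost_ms in sorted(proposals.items(), key=lambda kv: kv[1]):
        if can_ship(current_ms, cost_ms):
            current_ms += cost_ms
            shipped.append(name)
    return shipped
```

The interesting consequence of such a policy is that features end up competing for latency the way they compete for engineering time: adding one means either finding an optimization to pay for it or dropping something else.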



