
Personally, I think there was nothing golden-age about that time. I stuck with Physics. Monthly breakthroughs, yes. The giants whose shoulders we stand on, walking the earth, yes. But opportunities and platforms were minuscule compared to now. IBM'ers sneered at everyone else. This is the golden age of computing.


> This is the golden age of computing.

I disagree. In 1995, I picked up an issue of BYTE and marveled at the coverage of microkernel operating systems and RISC processors. New and exciting things were on the horizon! Now, I see only decadence. Hardware has stagnated for a decade. Software has less functionality while using exponentially more resources. Compare the Windows 10 Settings app to the Control Panel it replaced. Or the OneNote UWP app, which is now discontinued, to previous versions of OneNote. Key functionality has gotten worse. Chrome's PDF handling is so awful that I spent weeks searching for a PDF viewer that's as good as Acrobat 10. To my shock, everything was markedly worse--slower, less reliable searching, etc. I finally installed PDF-XChange Editor and it was like a breath of fresh air. Not only does it search PDFs correctly, but it highlights the results and shows indicators of hits on the scrollbar! (Apps these days don't even have scrollbars.)

Apple has given up on the Mac, and now it only gets iOS hand-me-downs. One of my great regrets is that I didn't get my first Mac until 2007, and missed out on a big part of the Mac's heyday. I remember when each new release of OS X came out, I'd carefully read John Siracusa's intensive review on Ars Technica. The PDF imaging model in OS X 10.0, hardware-accelerated compositing in 10.2, fine-grained kernel locking in 10.4. That was 15 years ago! Kids who were born then are stealing alcohol from their parents' liquor cabinets now, and the only cool thing to happen to OS X in their lifetime is APFS.

There are highlights, no doubt. Rust is a breakthrough. Apple's Ax processors bring desktop computing power into impossibly small form factors. Pervasive mobile broadband has enabled a lot of new applications. (But the best software stack for leveraging that capability is a direct descendant of NeXTSTEP.) But such highlights are few and far between now.


There is a very simple reason for that, which I learned when I was in college: the end of Moore's law. During the 80s and 90s Moore's law was in full force, and every couple of years hardware doubled in speed and halved in price. This was the heyday of computing! With the end of Moore's law we won't see that happening again anytime soon. The future is incremental improvements in computer architectures, unless we find something fundamental to fuel the next generation of computing.


You're gonna get the "lots of progress in hardware" part back soon, even without Moore's law :|

...but you're likely not going to like it: it will be a rain of exotic heterogeneous computing platforms first, with various ML/AI accelerators all tailored to specific areas, at first used in IoT and mobile, then everywhere else. Things like the NN accelerators you now see in some mobile chips, and "secondary chips" like the T2 and whatever follows, will become the dominant sources of computing power in systems, dwarfing the general-purpose-and-portable CPUs. And all software interacting with these will stop being portable in the way we know it, and as a result of network effects apps will no longer be portable across the many OS flavours we'll have.

The second wave will be when ML gets to the point where it is heavily used to design hardware: you'll have an explosion of hardware architectures with non-(human-brain-comprehensible) instruction sets (plus maybe even analog computing modules), to the point that compilers will likely not even be possible to code by hand in "assembler"; we'll need to have evolved compiler-generator-generators too, etc.

Progress is coming back, and it will accelerate... just that it will not be the ape-brain-comprehensible kind of progress we knew before...


>Progress is coming back, and it will accelerate... just that it will not be the ape-brain-comprehensible kind of progress we knew before...

It won't be the kind of progress we desire, either.

It's like we asked for jetpacks, but we get self-driving cars instead...


Moore's law was that transistor density doubles roughly every 18 months. That 'law' has not ended (15nm -> 7nm, for example), it's just that the gains are being used for things other than clock speed.
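As a back-of-the-envelope check on that claim (a sketch only, assuming density scales with the inverse square of the feature size, which real process nodes don't strictly achieve since "nm" names are partly marketing):

```python
import math

# If density ~ 1 / (feature size)^2, a 15nm -> 7nm shrink gives:
density_gain = (15 / 7) ** 2        # ~4.6x more transistors per area
doublings = math.log2(density_gain)  # ~2.2 density doublings
years = doublings * 1.5              # at one doubling per 18 months

print(f"{density_gain:.1f}x density, {doublings:.1f} doublings, ~{years:.1f} years")
```

So even under idealized scaling, that shrink represents only a couple of doublings' worth of Moore's-law cadence.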


It was interesting, but not that golden. No modems, even. You had to type in the code yourself, mostly in BASIC, or PEEK and POKE your way through raw binary.

The Internet era is.

Still, I loved waiting to read Chaos Manor. S-100 ...


Someone in 20 years will read this comment, laugh, and say "no, this is the golden age of computing!"


And someone else will say "bollocks" to that.

After all, the premise of the grandparent comment is that some today consider the era 40 years ago the "golden age".



