I am still in awe of NeXT's software technology, generally. It was just so carefully and intentionally designed as a coherent whole; one would hope this was where we were going as we got better at architecting software (as individuals, as a field), but disappointingly in retrospect it appears as a kind of high point, after which we continued to descend into ball-and-twine mediocrity. For reasons economic and social that I think people could argue about a lot, but we don't, in part because as a field we don't seem to even agree on what excellence in software architecture/design even means anymore.
But what I want to talk about instead is:
> Like EOF, our database layer that still puts Ruby-on-Rails to shame.
I spent a couple years programming with EOF (the "Enterprise Object Framework", an ORM), and many more recent years programming with ActiveRecord. EOF had a few features that ActiveRecord still doesn't that I miss (like properly functioning multi-table inheritance; and lazy "eager loading" triggered on first access for all associations; Rails 6.1 has a welcome feature to raise on n+1 behavior, but why not just lazily trigger the efficient load instead, which is probably no harder to implement? Maybe nobody thought of it, having not used EOF?).
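The EOF-style behavior I mean can be sketched in plain Ruby. This is a toy illustration under my own invented names (ToyRelation, ToyRecord, FAKE_DB), not real ActiveRecord or EOF API: records share a reference back to their relation, so the first access to an association batch-loads it for every record at once, instead of raising or firing N separate queries.

```ruby
# Simulated database: a query counter plus a "comments" table.
FAKE_DB = {
  query_count: 0,
  comments: [{ post_id: 1, body: "hi" }, { post_id: 2, body: "yo" }]
}

class ToyRelation
  attr_reader :records

  def initialize(rows)
    @records = rows.map { |row| ToyRecord.new(row, self) }
    @loaded_associations = {}
  end

  # Simulates "SELECT * FROM comments WHERE post_id IN (...)",
  # run at most once per association for the whole relation.
  def batch_load(name)
    @loaded_associations[name] ||= begin
      FAKE_DB[:query_count] += 1
      ids = @records.map(&:id)
      FAKE_DB[name].select { |c| ids.include?(c[:post_id]) }
                   .group_by { |c| c[:post_id] }
    end
  end
end

class ToyRecord
  attr_reader :id

  def initialize(row, relation)
    @id = row[:id]
    @relation = relation
  end

  # First access triggers one batched load for the whole relation.
  def comments
    @relation.batch_load(:comments).fetch(id, [])
  end
end

posts = ToyRelation.new([{ id: 1 }, { id: 2 }])
posts.records.each { |p| p.comments } # one batched query, not n+1
```

The point is that nothing here is harder than detecting the n+1 in order to raise on it: the relation already knows all its members, so it can do the `IN`-list load lazily on first touch.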
But I wouldn't actually say it still puts ActiveRecord "to shame". ActiveRecord is very similar to EOF in design, by 2020 nearly as mature, with 80-90% of the features.
Yeah, it's striking that ~20 years later we can say AR is mostly as good as EOF haha (and doesn't have anything of note that EOF didn't already have; it hasn't superseded it in any way). Its internal architecture isn't quite as elegant. But it really is nearly as good as EOF; its deficiencies compared to EOF aren't large enough to be particularly shameful, in my experience/opinion. It's in the ballpark!
AR is so similar to EOF that I have always wondered if some of its designers had experience with EOF.
As a former NeRD (NeXT Registered Developer) who started a company that did custom NeXT development, I both strongly agree and strongly disagree.
The technology really was great. Their understanding of object orientation was superior. The developer tools were wonderful. The user experience was generally a delight. We could develop custom software in a fraction of the time of people using the tools of the day. NeXT had a true vision of the future.
However, what they didn't have was much understanding of economics. The only reason that NeXT wasn't a complete commercial failure was that Apple's board wanted Steve Jobs back. If not, Apple might instead have bought out Be. And if Apple had succeeded in developing their own next-gen OS, both NeXT and Be might be minor footnotes these days. Even prior to the deus ex machina buyout, NeXT was on a slow and steady path to failure. They'd gone from an integrated hardware vendor to an OS-on-other-hardware vendor to a dev-tools-on-other-OSes vendor, and it's not clear that would have worked either. Once the acquisition was announced, they promised to take care of the people who had stuck with them and then did jack.
I took a few lessons away from my time with NeXT. 1) Just because I thought something was technically superior didn't mean it was commercially viable. 2) Being too far ahead of the market is worse than being behind it. 3) Never trust a "visionary leader" to look out for you, no matter what he says. He's in it for himself and the vision; the little people are expendable.
But you're definitely right that it made using other stuff painful. I stopped doing GUI development altogether rather than shift to Windows, which was incomparably awful.
I think the technical excellence was mostly a side issue there. What really mattered was that they made a better product. I'm sure there were companies that were just as technically excellent but not as focused on value delivery.
Technical excellence always matters for technology products. But my point is that pursuing technical excellence as a product goal often leads to unsuccessful products.
Google could have -- and probably would have -- failed if not for Google Ads. The great search results drove people to use their product, but it didn't actually earn them money directly.
You can have the best product on the market and fail, and you can have a terrible one, yet succeed. Google has as much business acumen as they do technical chops, and that's why they are such a success. Same with MS and Apple.
Back in the day AltaVista was superior for search, but it was a loss leader; it was designed to sell DEC hardware. I shared some space with Lycos, but they couldn't profit either. I think Google executed really well, even to their own surprise in hindsight.
I'm lost, you're saying because their product has a monetization strategy that it doesn't count somehow? By that logic, we've reduced the statement I'm replying to into the tautology "cool products with no monetization strategy aren't monetized". Doesn't have quite the same ring to it.
Edit: I guess if we want to limit it to cool technologies that are directly monetizable, the ford motor company of the early 20th century (or maybe Tesla) is a better example? It's just as silly a statement either way.
I'm just saying that technical excellence is orthogonal to how much money a product makes.
There is a ton of shit software out there that makes buckets of money, and lots of well designed, well executed products never make it off the ground. So you can't just focus on good engineering and expect clients to line up out the door.
NeXT was also mostly ahead of the market on the software side. Their machines were a very tough sell compared to the price and performance of other UNIX workstations of the time (which is why I know SunOS and not NeXTStep).
All the vision and all the software quality in the world won't make you competitive in the 90s UNIX workstation market if your machines are underpowered, and we were used to garbage software anyway. Chasing the "personal workstation"/PC market also would never work. DOS/Windows was far too strong and the Macintosh deep in a niche. It's very unfortunate.
NeXT failed on the hardware cost side because they wanted to be a personal computer and not a workstation. They were priced for neither market.
I looked very seriously at Unix machines around the time NeXT came out, having been converted to that religion in college. NeXT started at around US$6500, and that was with the optical disk only. The equivalent-ish Sun box (Sun 3/80) started at around US$15k with disk as I recall and went up in price really fast if you wanted more memory/disk/etc. About the cost of a new Honda Accord at the time. And the Sparcstations were out at much higher performance (and price...I seem to recall around US$22k for a usable config).
On the other hand, you could get a nicely decked out 386/33 for maybe half the cost of the NeXT, or a 486 for a grand or so more. And it ran tons of software, even if it was garbage. Even Unix.
The NeXT at launch was $6500 list for the base model. There were academic deals where you could get it for less, but that's not what we are talking about. And you could get a machine for $3k at least as fast in 1988; the 68030 was past its prime. If by 'NeXTstation', you mean the 'Slab' pizzabox NeXT, that was $5k list for the mono version, released in 1990, and had a 68040. You could get a much faster machine for $3k in 1990.
The trade name for the original $6500 cube was the NeXTcube (68030). The names for the pizza box workstations were the NeXTstation (68040/25 MHz) and NeXTstation Turbo (68040/33 MHz). The NeXTstation spec’d out at 15 MIPS.
You could buy a NeXTstation for $4995 on the open market, and considerably less with an educational discount.
You could buy a Sun Sparcstation 1, which was released a bit earlier and ran a RISC 20 MHz processor but it was $9,000.
If I recall correctly, SGI workstations like the SGI Indigo at the time started at somewhere around $7500 and depending on configuration could be nearly $40k. The SGI Indy, which was the low end SGI machine, was faster and priced at an identical $4995 - but it wasn’t released until mid 1993.
The first Macs to use the 68040 weren’t released until mid 1992. And they also cost $7,000+.
I am unaware of a machine that was available in 1990 for under $5,000 that had more horsepower. If you can point me to one I’ll gladly concede. But at the time I was actually a NeXT campus consultant, which meant I was selling them and knew the specs of both the NeXT products and the major competitors, and I’m not aware of one. Certainly by 1992 the Mac and high end PCs were catching up, albeit both with vastly inferior operating systems.
That was a long time ago and I could well be wrong. But if so I want to see evidence.
The fact is NeXT machines were dogs. The OS was nice but the hardware was very underpowered. I have a slab in my retro collection. It looks pretty, but a Sun Sparc from the same era is much more powerful.
But they were sold to the same markets. NeXTs were being sold both to the workstation and to the personal computer market. They were cheap but underpowered for a workstation, making them not very good for workstation-ish things, because you couldn't scale up. For a personal computer, they were very expensive, so they didn't do well there either.
I’m not arguing they had a good business plan - clearly they did not. All I’m arguing is that they were very good computers for the price, both in terms of software and hardware.
Ah, a lovely machine. That was their second generation, when they were starting to get a sense of reality. Although Wikipedia has the introductory price at $4995, or nearly $10k in 2020 dollars.
Sure, but the Mac II fx was the high-end machine in a consumer line with plenty of low-end options. And the Sun boxes were workstations targeted at businesses and institutions, where high price is not a barrier if the business value is there.
The NeXT hardware was never really competitive in either market except certain niches. E.g., all our clients were in financial trading, because they were willing to pay a huge premium for rapid app development for financial traders.
Thanks for sharing. Can you elaborate a bit why GUI development for NeXT was (and probably is) superior comparing to Windows GUI development (even if we include Borland's effort).
At the time of NeXT’s heyday in the early ‘90s, most GUI programming was textual. You’d call add(button) and button.text = “Hello World” to build up your GUI, and have to wire up the events from your button to take specific actions. Quite a lot of GUI programming is still like this, even now.
What NeXT brought was a GUI editor that allowed you to drag a button from a palette and onto a window (or view). You could then change the text on the button by double clicking on it and renaming the default text. You also got to determine where and how large the button was in relation to the rest of the window.
Most GUI builders could do this, so what was special about Interface Builder?
Two things stood out. First, you could specify how the button reacted to window resizing. There was a “springs and struts” layout mechanism that allowed you to say which parts were fixed offsets and which were variable. You could also say if the button would resize, and if so, in the X or Y or both directions.
The second thing was the ability to connect the button to an action. By Ctrl clicking and dragging, you could wire up the default action to a “selector” — in effect, a virtual method call, on the owner of the button. This owner would be populated at startup, typically the application (controller). So you could have your code with the responder and another team build the UI, and they would join together at runtime.
You could also use properties generated by code as well - you could connect the button’s field to an object’s property (aka an outlet) so that changing the code changed the UI.
The fact that you could drag and drop connections from UI to code, and from code to UI, as well as building a responsive UI, was really what stood out.
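The target/action wiring described above translates naturally to any dynamic language. Here's a rough Ruby analogue (all class and method names are mine, for illustration only): the "selector" is just a symbol dispatched at runtime, much as Interface Builder stored a selector name to be resolved against the target when the nib was loaded.

```ruby
class Button
  attr_accessor :target, :action, :title

  def click
    # In spirit: [target performSelector:action withObject:self]
    target.public_send(action, self) if target && action
  end
end

class Controller
  attr_reader :log

  def initialize
    @log = []
  end

  # The "action" method, receiving the sender, as IB-wired actions do.
  def save_clicked(sender)
    @log << "saved via #{sender.title}"
  end
end

controller = Controller.new
button = Button.new
button.title  = "Save"
button.target = controller    # the owner, populated at load time
button.action = :save_clicked # the selector, stored by name
button.click
```

Because the connection is just (object reference, selector name), the UI team and the code team can build their halves independently and let them join up at runtime, exactly as described above.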
This still lives on in Xcode today: Project Builder and Interface Builder begat Xcode and Interface Builder, which in turn begat today's Xcode with IB built in. The "nib" format (NeXT Interface Builder) was a binary file containing the descriptive state of the UI and the wiring requirements; renamed "xib" when XML became all the rage, it is still the same thing. The fact that IB has been subsumed into Xcode merely hides that this is what's happening under the covers.
I think it’s important to realise that this was in an age when Windows 3.1 was all the rage, and we had only just got out of 256-colour VGA while the NeXTstation had 16 million colours.
Nowadays with everyone doing MVC programming with the web, it doesn’t seem so important. But then there was a time when no one wrote unit tests because it was seen as pointless; but it is from these seeds that ideas become mainstream.
Great description. Two things I'd add to that: NeXT's Smalltalk heritage, with what I think of as real object orientation, was great for UI programming. Objects on screen were actually objects that you could message. And Display Postscript made it much easier to get visually solid results.
I can't remember where I read this, but the basic notion was that the Mac was easy to use but hard to program, limiting its adoption. Jobs learned a lesson in that he wanted his NeXT machine to be easy to use and easy to program.
As you say, it might all seem a bit tatty now. The web has raised the game of interface creation quite a bit. But this was in the late 1980s, where a lot of what they did was revolutionary.
And the dithering was wicked fast. I remember playing back multiple videos on NeXTSTEP, fully dithered and quite good looking, where the same machine (a dual-boot Intel P90) had huge issues even playing back one video.
I’m sure you’re aware that MS Access, Delphi, Visual Basic, and Progress, as well as a host of other tools, existed at the time, and you’re fine to point out that NeXT was superior. But given that really none of these systems survived, something else must be going on.
Sure, but Delphi's first version was released in 1995, whereas this was something I was programming in 1992 (and I came late to the party with NeXTSTEP 3).
Try using plain ES7+ (with async/await) JavaScript with Mithril (for defining components and their behaviors) and Tachyons (for Atomic CSS for styling). I like that combination best after having used Smalltalk and a variety of GUI builders (including Delphi and ones for Smalltalk and NewtonScript) and Angular and React. (TypeScript is OK too for bigger projects where documenting interfaces wins out over speed of development in plain JavaScript...)
And having dealt with GUI builders with special formats and coding implication related to objects sending special events, I'd much rather just write plain code in one language in a text editor than wrestle with a limited WYSIWYG tool.
Mithril's brilliance is assuming the UI is dirty if you have touched it in some way (mouse click, keystroke, etc.) and always rerendering after the event is handled (unless you want to optimize that away). That leads to UI code which is much easier to reason about than the arbitrary networks of dependencies that older UI toolkits emphasized. That style of UI development feels a lot more like, say, programming a continually-rerendering video game in OpenGL than programming a dependency-based UI for VisualWorks/NeXTSTEP/Delphi/VB/etc.
More on all that by me: https://github.com/pdfernhout/choose-mithril
"tl;dr: Choose Mithril whenever you can for JavaScript UI development because Mithril is overall easier to use, understand, debug, refactor, and maintain than most other JavaScript-based UI systems. That ease of use is due to Mithril's design emphasis on appropriate simplicity – including by leveraging the power of JavaScript to define UIs instead of using an adhoc templating system. Mithril helps you focus on the essential complexity of UI development instead of making you struggle with the accidental complexity introduced by problematically-designed tools. Many popular tools emphasize ease-of-use through looking familiar in a few narrow situations instead of emphasizing overall end-to-end simplicity which -- after a short learning curve for Mithril -- leads to greater overall ease-of-use in most situations."
And I say that even having been an official NeXTSTEP developer once upon a time -- after I gave Steve Jobs my business card when I met him following a talk he gave at Princeton, he got me into the developer program (my paperwork to join that program had previously apparently been ignored, along with its aspiration to build a system where any piece of data could be linked to any other piece of data). Even after reading through all the glorious NeXT developer info, I never felt I could afford the NeXT hardware, as much as I wanted it (the short warranty gave me pause too) -- so my career as an independent software developer went in different directions. After reading the article and comments here, I can wish I had just thought to go work for NeXT instead of wanting to be a customer...
Personal experience: Around 2005 I was looking for a platform for a new web app, after some years out of development but having worked extensively with NeXTstep and EOF in the 90s.
After watching DHH's video and reading the Rails book, it reminded me so much of my previous experience with NeXT technology that I had no other choice but to go with Rails.
The dynamism of Ruby had a lot in common with ObjC's runtime. And reading about ActiveRecord at that time I also had the feeling that its authors had worked with EOF before.
All in all, NeXT built great stuff. I still own a NeXTstation Color that I got in 1992 (one of these days I should try to turn it on again). And it's a testament to the quality of that software that some pieces that I'm still running today, like Apple Mail, trace back almost directly to tools I started using back then (NeXTMail).
Yep, people don't often comment on how similar ruby and ObjC are, in fundamentals.
I think it's because both of them were so influenced by smalltalk, more than ObjC influencing ruby necessarily. But not sure.
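The shared Smalltalk lineage shows up concretely in the runtime machinery. A small sketch from the Ruby side, with the rough Objective-C counterpart noted in comments (the Greeter class and its methods are my own example, not any real API):

```ruby
class Greeter
  def hello(name) # roughly -hello: in ObjC terms
    "hello, #{name}"
  end

  # ~ Smalltalk's doesNotUnderstand: / ObjC's forwarding machinery
  def method_missing(sym, *args)
    "no method #{sym}"
  end

  def respond_to_missing?(sym, include_private = false)
    true # this sketch claims to handle any message
  end
end

g = Greeter.new
g.respond_to?(:hello)         # ~ [g respondsToSelector:@selector(hello:)]
g.public_send(:hello, "NeXT") # ~ [g performSelector:... withObject:@"NeXT"]
g.anything_else               # falls through to method_missing
```

Late-bound message sends, selector introspection, and a hook for unhandled messages are all Smalltalk ideas, which is consistent with both languages arriving at them independently of each other.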
But I'm still very curious if AR's creators knew EOF, yeah. I haven't found DHH mentioning it; not sure if there might be forgotten other person/people central to original AR architecture.
WebObjects itself was nice in many, many ways (I think its encapsulation of form handling is far better than anything anyone's managed in Rails)... but it made a fundamental mistake in trying to keep a fundamentally stateful architecture and apply it to the web, by putting what was effectively an opaque state ID in every single URL. This was a basically bad design for the web (although it also provided for the aforementioned good encapsulation of form handling :) ).
But yeah, the sense I get in my career is that we spend a lot of time trying to reinvent something that already existed, and getting close to being as good as it... then collectively moving on to the next language/platform and doing it again. With not a lot of progress. Up to and through the 90s, it seemed like there was actual progress in software design and architecture at the high level -- the level of affordances for developers to efficiently create reliable, maintainable software -- but it seems to me to have stalled, perhaps in favor of huge advances in more low-level stuff, better/different languages/language paradigms, etc.
> I am still in awe of NeXT's software technology, generally. It was just so carefully and intentionally designed as a coherent whole [...]
The closest I got to experience inner workings of NeXT software is observing the boot log of Mac OS (which you can see if you boot it with Qemu/Clover). I haven't seen so many triple exclamation marks in a while. That somehow didn't leave the impression of carefully and intentionally designed software.
I couldn't say how similar a 2020 MacOS bootlog is at this point to anything that was in NeXT, and wouldn't assume that whatever you're seeing now that you find inelegant was there in NeXTStep 20 years ago or longer. I mean, maybe, but I wouldn't just assume it and judge NeXT for it. ¯\_(ツ)_/¯
In any event, the boot log is not something I had occasion to pay attention to in NeXTStep, I couldn't speak to it.
NeXTStep/OpenStep had a great development environment and was full of innovation, but even in the '90s it had old BSD components that were rarely updated, and it really wasn't a great unix. Mac OS X has followed that pattern.

Also, Mach was inherently slow, so running OpenStep on x86 hardware was slower than Linux or Windows. In Mac OS X they finally gave up on a pure microkernel and flattened the kernel to reduce the overhead of message passing through the BSD personality layer to Mach.

But folks running OpenStep were running it for the RAD development tools and EOF, which let you quickly design a UI with a very usable ORM and turn a desktop app into a webapp via WebObjects seamlessly. They complained about the *nix layer even then, but it was adequate, and you could compile newer versions of the tools you needed, then as now.
After Apple bought NeXT, they upgraded the Mach component from 2.5 to 3.0 (from Apple’s MkLinux project). But it was always a hybrid kernel in both NEXTSTEP and macOS.
And I'm upset I didn't get an opportunity to properly work with WebObjects. WebObjects with Swift would revolutionize the web - IMHO - it was gone too soon.
I coded fulltime in WebObjects from 2006-2008 making webApps in the health care industry.
During my Software Engineering degree I learned the difference between a Library and a Framework, but it wasn't until actually using the WebObjects Framework that the light bulb went off in my head. It was a pleasure to work with, and clearly very, VERY well thought out.
EOF was great, and every time I made a new NSArray() it brought a smile to my face.
It would hardly do that; in case you aren't aware, WebObjects was in at the genesis of J2EE.
> Since the transition of WebObjects to Java in 2000, the functionality of many of Apple's Java Foundation classes is replicated in Sun's own JDK. However, they persist largely for reasons of backwards-compatibility and developers are free to use whichever frameworks they prefer.