
> percentage of good, well planned, consistent and coherent software is going to approach zero

So everything stays exactly the same?



I get this comment every time I say this, but there are levels to this. What you think is bad today could be considered artisanal when things become worse than they are today.


I mean, you've never used the desktop version of Deltek Maconomy, have you? Somehow I can tell.

My point here is not to roast Deltek, although that's certainly fun (and 100% deserved), but to point out that the bar for how bad software can be and still, somehow, be commercially viable is already so low it basically intersects the Earth's centre of gravity.

The internet has always been a machine that allows for the ever-accelerated publishing of complete garbage of all varieties, but it's also meant that in absolute terms more good stuff also gets published.

The problem is one of volume, not, I suspect, that the percentages of good versus crap change that much.

So we'll need better tools to search and filter but, again, I suspect AI can help here too.


No, wealth gets more concentrated. Fewer people on the team will be able to afford a comfortable lifestyle and save for retirement. More will edge infinitesimally closer to "barely scraping by".


Underrated comment. The reason everyone complains about code all the time is that most code is bad, and it's written by humans. I think this can only be a step up. Nailing validation is the trick now.


Validation was always the hard part, outside of truly novel areas - think the edges of computer science (which come up very rarely and only need to be explored once or a handful of times).

Validation was always the hard part because great validation requires great design. You can't validate garbage.


> So everything stays exactly the same?

No, we get applications so hideously inefficient that your $3000 developer machine feels like it's running a Pentium II with 256 MB of RAM.

We get software that's as slow as it was 30 years ago, for no reason other than our own arrogance and apathy.


Do not insult P-II w/256 MB of RAM. That thing used to run this demo[0] at full speed without even getting overwhelmed.

Except for some very well-maintained software, some of the mundane things we do today waste so many resources it makes me sad.

Heck, the memory use of my IDE peaks at VSCode's initial memory consumption, and I'd argue that my IDE will draw circles around VSCode while sipping coffee and compiling code.

> for no reason other than our own arrogance and apathy.

I'll add greed and apparent cost reduction to this list. People think they win because they reduce time to market, but that time penalty is delegated to users. Developers gain a couple of hours once; we lose the same time every couple of days waiting on our computers.

Once I read a comment by a developer that can be paraphrased as "I won't implement this. It'll take 8 hours. That's too much." I wanted to plant my face into my keyboard full-force, not kidding.

Heck, I tuned/optimized an algorithm for two weeks, which resulted in 2x-3x speedups and enormous memory savings.

We should understand that we don't own the whole machine while running our code.

[0]: https://www.pouet.net/prod.php?which=1221


No insult intended!

Thanks for sharing the demo!


> No insult intended!

Haha, I know. I just worded it like that to mean that even a P-II can do many things if the software is written well enough.

You're welcome. That demo single-handedly threw me down the high-performance-computing path. I thought, if making things this efficient is possible, all the code I write will be as optimized as the constraints allow.

Another amazing demo is Elevated [1]. I show its video to people and ask them to guess the binary and resource size. When they hear the real value, they generally can't believe it!

Cheers!

[1]: https://www.pouet.net/prod.php?which=52938


> No, we get applications so hideously inefficient that your $3000 developer machine feels like it it's running a Pentium II with 256 MB of RAM.

> We get software that's as slow as it was 30 years ago, for no reason other than our own arrogance and apathy.

I feel like I read this exact same take on this site for the past 15 years.


I think the error was to say "as fast"

https://news.ycombinator.com/item?id=36446933


Because it's been true for the past 15 years.


I find it hard to disagree with this (sadly).

I do feel things in general are more "snappy" at the OS level, but once you get into apps (local or web), things don't feel much better than 30 years ago.

The two big exceptions for me are video and gaming.

I wonder how people who work in CAD, media editing, or other "heavy" workloads feel.


> I wonder how people who work in CAD, media editing, or other "heavy" workloads etc, feel.

I’d let you know how I feel but I’m too busy restarting Solidworks after its third crash of the day. I pay thousands a year for the privilege.


Well that's sad to hear. Also kinda makes me glad I didn't wander down the path of learning Solidworks during the pandemic.


It's usually pretty stable for a while. It's when you get into very complex parts and assemblies that it starts to really show problems. (You'll still see some crashes while learning, though.)


Moving data for 'heavy' workflows into the cloud is the most common performance bottleneck I see.


> I wonder how people who work in CAD, media editing, or other "heavy" workloads etc, feel.

I would assume (generally speaking) that CAD and video editing applications are carefully designed for efficiency because it's an important differentiator between applications in the same class.

In my experience, these applications are some of the most exciting to use, because I feel like I'm actually able to leverage the power of my hardware.

IMO the real issue is bloated desktop apps like Slack, Discord, Spotify, or Claude's TUI, which consume massive amounts of resources without doing much beyond displaying text or streaming audio files.


The problem with CAD is that mechanical engineering is still deeply proprietary, up to and including the software stacks.

There is basically no "open source" in mechanical engineering. So you are relegated to super-heavy legacy applications that coast by through their integrations with other proprietary tools. Solidworks is much heavier than FreeCAD, but FreeCAD didn't have integrations with simulation tools or CAM software, used a different geometry engine than the industry standard, etc., so when a company tried to turn FreeCAD into a product, they failed.

The only open source one sees in mechanical engineering comes out of academia, which, while interesting, faces the problem that once the research funds dry up or the project finishes, the software is dumped into the open in hard-to-find places and is not developed further.

I remain hopeful about the potential of open source. I believe that to have a truly accessible and innovative industry, a greater level of openness is needed, but it is yet to come.

I think CAD is a good place to start, as it is not a space where lots of hidden, closely guarded tricks are needed, as in Finite Element Analysis. For personal use, FreeCAD is getting there. Snappier than Solidworks, but the workflow layout needs some work.

I am also looking at projects such as https://zoo.dev. By mapping the design 1:1 to code (while keeping a GUI workflow as well), I think they have a real chance of offering enough value that new companies will be interested in trying their approach. It opens the door to automation, analysis, and generation that, while possible with something like Solidworks, is cumbersome and not well documented.


That's the hidden price of fast development.


In a bizarre way, all these new datacentre build-outs may reflect "fake demand" because of how inefficiently software gets produced.


Much of the new datacenter capacity is for GPU-based training or inference, which are highly optimized already. But there's plenty of scope for optimizing other, more general workloads with some help from AI. DRAM has become far more expensive and a lot of DRAM use on the server is just plain waste that can be optimized away. Same for high-performance SSDs.
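To make the "plain waste" point concrete, here is a toy sketch (Python; the record shape and counts are made up for illustration, not taken from any real server workload) comparing a dict-per-record layout, common in quickly written services, with compact parallel arrays holding the same data:

```python
import array
import tracemalloc

N = 100_000

# Naive layout: one dict per record.
tracemalloc.start()
records = [{"id": i, "value": i * 2} for i in range(N)]
dict_mem, _ = tracemalloc.get_traced_memory()
tracemalloc.stop()
del records

# Compact layout: two parallel arrays of 64-bit ints.
tracemalloc.start()
ids = array.array("q", range(N))
values = array.array("q", (i * 2 for i in range(N)))
array_mem, _ = tracemalloc.get_traced_memory()
tracemalloc.stop()

print(f"dicts:  {dict_mem / 1e6:.1f} MB")
print(f"arrays: {array_mem / 1e6:.1f} MB")
```

On CPython the dict version typically uses around an order of magnitude more memory for the same 100k integer pairs; the same kind of gap shows up in long-running services that keep millions of small records resident in DRAM.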



