
It's not ape coding. It's skill coding. People who don't have the skill to do math and logic ask others to do it for them.

The reason we have programming languages is the same reason we have musical notation or math notation. It is a far more concise and precise way of communicating than using natural languages.

We could write music using natural language, but no one does because a single page of music would require dozens of pages of natural language to describe the same thing.


It's funny that you mention music and notation: sheet music is very compact for musical absolutes like pitch/rhythm/harmony, but a huge part of what we care about with music is nuance, which doesn't reduce cleanly to symbols. Hence sheet music is full of words that try to describe the desired character of a performance, things that can't otherwise be encoded in the notation. For example, "with feeling".

That reminds me of an argument on here a while back, where I said I wished Spotify let you filter tracks by the presence of pitch-correction or autotune. This wasn't because I thought autotune was 'bad' or modern artists were 'fake', but because sometimes I wanted to listen to vocals as a raw performance - intonation, stability, phrasing. I wanted the option of listening to recordings that let me appreciate the _skill_ possessed by the artists who recorded them.

I got _absolutely destroyed_ in that comments section, with people insisting I'm a snob, that I'm disrespectful, bigoted towards modern artists, that there's no way I can actually hear the difference, and if I can't, why does it even matter, and anyway everyone uses it now because studio time is expensive and it's so much cheaper than trying to get that perfect take. People got so angry, I even got a couple of DMs on Twitter. All the while I struggled to articulate or justify why I personally value the _skill_ of exceptional raw vocal performance - what I considered to be performance "with feeling".

But, I had to come to terms with the fact that anyone can sing now - no-one can tell the difference, so the skill generally isn't valued any more. Oh, you spent your entire life learning to sing? You studied it? Because you loved music? Sorry dude, I dunno what to say. I guess you'll have to find another way to stand out. You could try losing some weight. Maybe show some skin.


Actually, learning to sing was never really valued. Anyone can learn to sing, but for most that means being a backing singer. Being a lead/soloist is more about timbre and presence (including, to a not insignificant extent, looks). It's something you either have or you don't.

Self-evidently not the case: look at people absolutely falling over themselves to pay hundreds for seats at West End/Broadway shows just to see the spectacle of live human performance.

This is why I never use a calculator. Since my school days I have had the skill to do long division. Why hit the sin button when I have the skill to write out a Taylor series expansion? For many other purposes I have the skill to use Newton-Raphson methods to calculate values that mostly work.

Those who use a calculator simply don't have these skills.
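(For what it's worth, the sin-by-Taylor-series trick really is only a few lines. A rough Go sketch, purely illustrative and not part of any real calculator:)

  package main

  import (
    "fmt"
    "math"
  )

  // sinTaylor approximates sin(x) with the first n terms of its Maclaurin
  // series: x - x^3/3! + x^5/5! - ...
  func sinTaylor(x float64, n int) float64 {
    term, sum := x, x
    for k := 1; k < n; k++ {
      // each new term is the previous one times -x^2 / ((2k)(2k+1))
      term *= -x * x / float64(2*k*(2*k+1))
      sum += term
    }
    return sum
  }

  func main() {
    fmt.Println(sinTaylor(1, 8), math.Sin(1)) // both print ~0.8414709848
  }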


There is a notable difference between, say, doing long division on a calculator and prompting an AI to calculate the derivative of a simple continuous function. One requires _understanding_ of the function, while the other just skips the understanding and returns the required derivative. One is just a means to skip labor-intensive and repetitive actions, while the other is meant to skip the entire point of _why_ you are even calculating in the first place. What is the point of dividing two numbers if you don't even understand the reason behind it?

I'm not quite sure I understand the logic of this, or how people don't see that claims of "well, now everyone is going to be dumber because they don't learn" have been a refrain literally every time a major technological or industrial revolution happens. Computers? The internet? Calculators?

The skills we needed before are just no longer as relevant. It doesn't mean the world will get dumber, it will adapt to the new tooling and paradigm that we're in. There are always people who don't like the big paradigm change, who are convinced it's the end of the "right" way to do things, but they always age terribly.

I find I learn an incredible amount from using AI + coding agents. It's a _different_ experience, and I would argue a much more efficient one to understand your craft.


100%. I have been learning so much faster as the models get better at both understanding the world and explaining it to me at whatever level I am ready for.

Using AI as just a generator is really missing out on a lot.


Integration and differentiation, even before LLMs, were already something that you would be better off just getting a machine to do in most cases. It's far more important to understand what the operations represent than it is to derive the exact closed form of the result yourself, because the actual process of doing it is almost always tedious and mechanical and doesn't give you much insight into the equation you are working with.

> This is why I never use a calculator.

I always use the calculator.

But, because the numbers that get returned aren't always the right numbers, I try to approximate the answer in my head or with paper and pencil to kind of make sure it's in the ball park.

Also, sometimes it returns digits that don't actually exist, and it's pretty insistent that the digit is correct. If I catch it early I just re-run the equation but there is a special button where I can tell it that it used a digit that does not actually exist.

Sometimes, for complex ones, it tells me it's trying to calculate and provides some details about how it's going about it, and it keeps going and going and going; for those I just reboot the calculator.


Solution for a hallucinating calculator: get a second unreliable calculator to verify the work of the first one. This message brought to you by a trillion dollars in investment desperately trying to replace the labor force with pseudo-intelligent calculators.

Also, the calculator may refuse to process certain operations deemed to be offensive or against the interest of the corporate-state.

Not to forget, the calculator consumes so much processing power that most people are unable to run it at home, so you need a subscription service to access general-purpose calculation.


>get a second unreliable calculator to verify the work of the first one

In do-or-die situations we actually use 3 calculators.


You probably also don't use a calculator because it uses a scary language called Arabic numerals. Why write 123,456 when you could write it out in English: One Hundred Twenty-Three Thousand Four Hundred Fifty-Six? English is your programming language and also your math language, right?

I hope this comment is sarcastic.

LLMs are able to ingest numbers. And not just Arabic numerals; did you know that there are other kinds of number systems?

Believe it or not, they also ingest multimedia. You don't need the English language to talk to a language model. Anything can be a language; you can communicate using only images.

And for that matter, modern LLMs are great at abstract math (and like anything else the results still need proofreading).


Bad analogy. The things I delegate to a calculator, I'm absolutely sure I understand well (and could debug if need be). They are also very legible skills that are easy to refresh by re-reading the recipe -- so I'm not too worried about skills "atrophying".

Meanwhile those who use a calculator merely hit that sin button and get on with the actual problem at hand, and life in general.

Strongly suspect this is sarcasm, but if it isn't, I applaud your... gusto? Or whatever it is you have going on here.


It's not sarcasm, it's satire.

That's the right word, thanks.

> It is a far more concise and precise way of communicating than using natural languages.

No. We have programming languages because reading and writing binary/hexadecimal is extremely painful to nigh on impossible for humans. And over the years we got better and better languages, from Assembly to C to Python, etc. Natural language was always the implicit ultimate goal of creating programming languages, and each step toward it was primarily hindered by the need to ensure correctness. We still aren't quite there yet, but this is pretty close.


No, that is not true at all.

Natural language is natural because it's good for communicating with fellow humans. We have ways to express needs, wants, feelings, doubts, ideas etc. It is not at all "natural" to program a computer with the same language because those computers were not part of the development of the language.

Now, if we actually could develop a real natural language for programming that would be interesting. However, currently LLMs do not participate in natural language development. The development of the language is expected to have been done already prior to training.

Invented languages and codes are used everywhere. Chemical nomenclature, tyre sizes, mathematics. We could try to do that stuff in "natural" language, but it would be considered a serious regression. We develop these things because they empower us to think in ways that aren't "natural" and free our minds to focus on the problem at hand.


Natural languages are "natural" because they evolved as the de facto way for humans to communicate. Doesn't need to be with fellow humans, but humans were all we've been able to communicate with over our ~300,000 years of existence as a species. And we've done it in thousands of varieties.

> currently LLMs do not participate in natural language development

It's quite literally what LLMs are trained on. You create the core architecture, and then throw terabytes of human-generated text at it until a model that works with said text results. Doesn't matter if it participates in language development or not, it only matters that humans can communicate with it "naturally".

> Invented languages and codes

All languages are invented; the only difference is how conscious and deliberate the process was, which is a function of intended purpose. Just look at Esperanto. Or Valyrian.


A natural language is a living thing. Every day each speaker adjusts his model a tiny bit. This has advantages but also some serious disadvantages which is why technical writers are very careful to use only a small subset of the language in their writing.

For true natural language programming we'd need to develop a language for reliably describing programs, but this doesn't exist in the language, so why would it exist in the LLM models? It will never exist, unless we invent it, which is, of course, exactly what programming languages are.

Natural languages are not invented. Written scripts are said to be invented, but nobody says a natural language like English or French is invented. It just happened, naturally, as the name suggests.

If natural language were the end goal then mathematics and music would use it too. There's nothing stopping them.


> For true natural language programming we'd need to develop a language for reliably describing programs

We really don't. Eventually we won't even be programming anymore per se. Consider communicating with someone who isn't fluent in any language you know, and vice versa. In the beginning you need to use a pretty restricted vocabulary set so you understand each other, similar to a programming language. But over time as communication continues, that vocabulary set grows and things become increasingly "natural", and it's easier for you to "program" each other.

Same with LLMs. We just need to get to the point where a model has sufficient user context (as it already has all the vocabulary) for effective communication. Like OpenClaw is currently accessing enough context for enough use cases that its popularity is through the roof. Tell it to do something, and as long as it has access to the relevant tools and services, it just gets it done. All naturally.


This page was put together very well. It has interactive illustrations when needed (not excessive), and the explanations were informative yet concise. I also like how it brings up other uses of quadtrees, such as for images. This encouraged me to think about how they might be used elsewhere.
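For anyone who hasn't implemented one, the core of a point quadtree is surprisingly small. A rough Go sketch of the idea the page describes (illustrative only; the names are mine, and degenerate/duplicate points aren't handled):

  package main

  type Point struct{ X, Y float64 }

  const capacity = 4 // max points per node before it splits

  type Quad struct {
    CX, CY, Half float64  // center and half-width of this node's square
    Points       []Point
    Kids         *[4]Quad // nil until subdivided
  }

  func (q *Quad) contains(p Point) bool {
    return p.X >= q.CX-q.Half && p.X < q.CX+q.Half &&
      p.Y >= q.CY-q.Half && p.Y < q.CY+q.Half
  }

  func (q *Quad) Insert(p Point) bool {
    if !q.contains(p) {
      return false
    }
    if q.Kids == nil {
      if len(q.Points) < capacity {
        q.Points = append(q.Points, p)
        return true
      }
      q.subdivide()
    }
    for i := range q.Kids {
      if q.Kids[i].Insert(p) {
        return true
      }
    }
    return false
  }

  func (q *Quad) subdivide() {
    h := q.Half / 2
    q.Kids = &[4]Quad{
      {CX: q.CX - h, CY: q.CY - h, Half: h},
      {CX: q.CX + h, CY: q.CY - h, Half: h},
      {CX: q.CX - h, CY: q.CY + h, Half: h},
      {CX: q.CX + h, CY: q.CY + h, Half: h},
    }
    for _, old := range q.Points { // push existing points down into the children
      for i := range q.Kids {
        if q.Kids[i].Insert(old) {
          break
        }
      }
    }
    q.Points = nil
  }

  func main() {
    root := &Quad{CX: 0, CY: 0, Half: 100}
    for _, p := range []Point{{1, 2}, {3, 4}, {-5, 6}, {7, -8}, {9, 10}} {
      root.Insert(p)
    }
  }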

I used quadtrees in a janky fractal compression scheme, starting big and only moving small if a feature couldn't be represented in the larger space. It kinda worked. And then you add motion into the mix with octrees. Love this stuff.

This looks very interesting. It says it has networking, cryptography, etc. Is there any documentation of the APIs? I can't try this out if I don't know what they are.


There is now: https://docs.lilush.link. More docs and user guides will follow.

Thanks for the update!


There's no way to say this without sounding mean: Everything Chris Lattner has done has been a "successful mess". He's obviously smart, but a horrible engineer. No one should allow him to design anything.

Edit: I explained my position better below.


People are correct that I didn't explain my position.

LLVM: Pretty much everyone who has created a programming language with it has complained about its design. gingerbill, Jon Blow, and Andrew Kelley have all complained about it. LLVM is a good idea, but that idea was executed better by Ken Thompson with his C compiler for Plan 9, and then again with his Go compiler design. Ken decided to create his own "architecture-agnostic" assembly, which is very similar to the IR idea in LLVM.

Swift: I was very excited with the first release of Swift. But it ultimately did not have a very focused vision outlined for it. Because of this, it has morphed into a mess. It tries to be everything for everyone, like C++, and winds up being mediocre, and slow to compile to top it off.

Mojo doesn't exist for the public yet. I hope it turns out to be awesome, but I'm just not going to get my hopes up this time.


Yes. I've also written a compiler and I've also complained about LLVM.

LLVM is

  - Slow to compile
  - Breaks compilers/doesn't have a stable ABI
  - Optimizes poorly (at least, worse than GCC)
Swift I never used, but I tried compiling it once and it was among the two slowest compilers I've ever tested. The only thing nearly as bad was Kotlin, but 1) I don't actually remember which of the two was worse, and 2) Kotlin wasn't meant to be a CLI compiler; it was meant to compile in the background as a language server, so it was designed around that.

Mojo... I have things I could say... But I'll stick to this. I talked to engineers there and asked one how they expected any Python developers to use the planned borrow checker. The engineer said "Don't worry about it", i.e. they didn't have a plan. The nicest thing I can say is they didn't bullshit me 100% of the time when I directly asked a question privately. That's the only nice or neutral thing I could say.


> LLVM: Pretty much everyone who has created a programming language with it has complained about its design. gingerbill, Jon Blow, and Andrew Kelley have all complained about it. LLVM is a good idea, but that idea was executed better by Ken Thompson with his C compiler for Plan 9, and then again with his Go compiler design. Ken decided to create his own "architecture-agnostic" assembly, which is very similar to the IR idea in LLVM.

I suggest you ask around to see what the consensus is for which compiler is actually mature. Hint: for all its warts, nobody is writing a seriously optimized language in any of the options you listed besides LLVM.


How many languages are using LLVM as their backend vs Go's?


As far as I know, only Go uses Go's back end because it was specifically designed for Go. But the architecture is such that it makes it trivial for Go to cross compile for any OS and architecture. This is something that LLVM cannot do. You have to compile a new compiler for every OS and arch combo you wish to compile to.

You could imagine creating a modified Go assembler that is more generic and not tied to Go's ABI, which could accomplish the same effect as LLVM. However, it'd probably be better to create a project like that from scratch, because most of Go's optimizations happen before reaching the assembler stage.

It would probably be best to have an intermediate language like QBE's and transform that into an "intermediate assembly" (IA) very similar to Go's assembly. That way the IL stage could contain nearly all the optimization passes, and the IA stage would focus on code generation that would translate to any OS/arch combo.


> As far as I know, only Go uses Go's back end because it was specifically designed for Go. But the architecture is such that it makes it trivial for Go to cross compile for any OS and architecture. This is something that LLVM cannot do. You have to compile a new compiler for every OS and arch combo you wish to compile to.

I don't think that's true. Zig has a cross-compiler (that also compiles C and C++) based on LLVM. I believe LLVM (unlike GCC) is inherently a cross-compiler, and it's mostly just shipping header files for every platform that `zig cc` is adding.


I do not have enough knowledge to say anything bad about LLVM. As an "amateur" compiler writer, it did confuse me a bit though.

What I will say is that it seems popular to start with LLVM and then move away from it. Zig is doing that. Rust is perhaps heading in that direction with Cranelift. It feels that, if LLVM had completely nailed its mission, these kinds of projects would be less common.

It is also notable that the Dragonegg project to bring GCC languages to LLVM died but we have multiple projects porting Rust to GCC.


Even Clang can cross-compile from one compiler binary - what's missing is bundling the "platform SDK" for all targets like Zig does.


Go never advertised, designed for, nor supported external usage of their backend.


Chris Lattner is definitely a genius engineer at innovation, implementation and delivery, but long-term, robust, maintainable software design doesn't appear to be in his capability set.

The latter is definitely a defining capability of Anders Hejlsberg (the C#/TypeScript designer).


> Everything Chris Lattner has done has been a "successful mess".

I don't have an emotional reaction to this, i.e. I don't think you're being mean, but it is wrong and reductive, which people usually will concisely, and perhaps reductively, describe as "mean".

Why is it wrong?

LLVM is great.

Chris Lattner left Apple a *decade* ago, & thus has ~0 impact or responsibility on Swift interop with C++ today.

Swift is a fun language to write, hence why they shoehorned it in in the first place.

Mojo is fine, but I wouldn't really know how you or I would judge it. For me, I'm not super-opinionated on Python, and it doesn't diverge heavily from it afaik.


Not just LLVM, but Google's TPU seems to be doing fine also. Honestly it's an impressive track record.


He had 0 to do with the TPU.

I was hired at Google around the same time, but not nearly as famous :)

AFAICouldT it was a "hire first, figure out what to do later", and it ended up being Swift for TensorFlow. That went ~nowhere, and he left within 2 years.

That's fine and doesn't reflect on him; in general, that's Google for ya. At least that era of Google.


Ahh, thanks for the info. Yeah, I heard Google was a bit messy from colleagues who went there.


You don't explain or support your position; you are just calling Lattner names. That's not helpful to me or anyone else trying to evaluate his work. Swift has millions of users, as do Mojo and Modular in general. These are not trivial accomplishments.


Mojo and Modular have millions of users?


You can answer that question yourself.


You're right, looks like they don't, in fact, have millions of users.


Sick burn, my man. Hope you check out Mojo lang instead of arguing about rounding errors. I'm still unclear why I'm supposed to steer clear of Lattner, which was what we were discussing.


Not sure what you're even talking about. You said something I was curious about since I thought you knew something about it I didn't but then you told me to look it up myself, for some reason, and I did. I don't know how you construed that as a "sick burn" or that you're "supposed to steer clear of Lattner," both of which I never said or implied.


It is all not important and I'm moving on as I'm sure you are as well. I assume good faith so I'm thinking we talked past each other a bit plus I was grouchy at one point (I apologize).

Here's the nut from the relevant comment: "There's no way to say this without sounding mean: Everything Chris Lattner has done has been a "successful mess". He's obviously smart, but a horrible engineer. No one should allow him to design anything."

I can't support a take like this (the commenter's, not yours); it is not helpful or informative. That original comment was edited too, or added onto. On the face of it, the sentiment that Swift and Mojo and LLVM are these awful abominations is beyond the pale for me from an engineering standpoint. I think there are some FOSS feelings in play that stir up strongly held ideologies.


That's why there's nothing that comes close to LLVM and MLIR, right?

If he's such a horrible engineer then we should have lots of LLVM replacements, right?


QBE is a tiny project, but I think it illustrates a better intermediate language design. https://c9x.me/compile/


Except performance isn't great and it covers far fewer platforms. It aims for 70% performance but the few benchmarks I've seen show more like 30-50% performance.

It's a cool project and I'd consider it for a toy language but it's far from an LLVM replacement.


Many compilers, including my own, use C89.


You'll still need a C compiler...


I've never heard of hardware without one.


Avoiding interacting with LLVM as a user doesn't mean you've created something equivalent to LLVM.

And if the C compiler you use is clang then you're still literally making use of LLVM.


I don't know what point you're trying to make, but your question was what's an alternative to LLVM. People writing compilers have always used C89 or a version of it (C11 allows an easier atomics implementation). There are a lot more C89 compilers than backends that LLVM supports. When I was writing Arduino code, clang/LLVM couldn't generate the AVR code for the board. The default toolchain was GCC with an AVR backend. IIRC it had optimizations and was the only reasonable compiler I could use. There's nothing wrong with using C as your backend.


> There's nothing wrong using C as your backend

I didn't say there was. Saying use C instead of LLVM is fine for a language designer.

But that doesn't make it a replacement for LLVM as a piece of infrastructure. C compilers still need an optimizing backend like LLVM.

The conversation is about whether or not LLVM is a shit piece of engineering, not whether you should target C or IR as a language designer. Avoiding using LLVM directly isn't a replacement lol.

You could say "GCC" is a replacement which at least isn't completely false, but GCC's backend is far more annoying to use for different languages.


You don't really need to deal with GCC's backend. Most languages are expressible in C89. It'll obviously be less readable, since generated lines look nothing like source lines, but C89 beats LLVM in most cases, especially if the C compiler supports assembly. I stuck to using intrinsics only, because I've heard of compilers producing worse code when they hit asm blocks, since they can't understand them as well as intrinsics. I took out LLVM the last time it broke my compiler, because it didn't add any value.


IIRC, the Inmos Transputer shipped without a C compiler, and while third party products (including a port of GCC) did come later, it was Occam-only for quite some time.


So what you're saying is that C source code is the (only) stable LLVM ABI ;)


Haha, yes

I have an old printer, maybe 15 years old or so. Linux still prints and Windows doesn't!


There is also an unofficial mirror which has updates.

https://github.com/TinyCC/tinycc


That has the same content as git://repo.or.cz/tinycc.git


I'm not. I'm learning a little bit each day, making my brain better and myself more productive as I go.


It feels like everything is falling apart and getting worse. Yet somehow people are racing to produce AI slop faster. If software eventually collapses under its own weight, things might be so borked we have to bootstrap everything from scratch, starting with assembly.


I know it's not fashionable to be positive about macOS, but since ditching Windows a decade ago it has pretty much just worked, and I can run Excel and the like. I'm sticking with Sequoia and avoiding 26 for now, though.


It sounds weird to say, but Steve Ballmer was probably the best CEO of Microsoft.


Or: Steve Ballmer oversaw the decline of Microsoft's flagship product, but left before he could be blamed for it.

A lot of Windows' current problems can be traced back to the Ballmer era, including the framework schizophrenia, as Microsoft shifted between Win32, UWP, WPF, and god knows what else. This has led to the current chaotic and disjointed UI experience, and served to confuse and drive away developers. Repeatedly sacrificing reliable and consistent UX while chasing shiny new technologies is no way to run an OS.


I think MS's biggest mistake was to not properly maintain and develop the Foundation Classes, basically a thin C++ wrapper library on top of the C API that retained most of the benefits of the Win32 API while eliminating a lot of the boilerplate code. Instead they went after Java with the .NET managed stuff, bloated and slow compared to the native API.

Qt is now the best "old school" UI framework by far.


Experimentation is a cost center?


>including the framework schizophrenia, as Microsoft shifted between Win32, UWP, WPF

Ah yes, and the solution being presented is Linux, with Xlib, Motif, Qt, GTK, and your choice of 167 different desktop environments. Don't forget the whole Wayland schism.

Mac is no better, shifting SDKs every few years, except Apple goes one step further by breaking all legacy applications so you are forced to upgrade. Can't be schizo when you salt the earth and light a match to everything that came before the Current Thing.


macOS has had Cocoa since 2000, which is still usable, and SwiftUI for a few years now. No comparison to the mess of UI toolkits on Windows.


And what about Carbon?

Gone.

32-bit apps?

Gone.

PowerPC stuff? Anything more than a few years old?

Forget it.

You can't even run versions of iPhoto or iTunes after they deliberately broke them and replaced them with objectively shittier equivalents. Their own apps!

Windows can still run programs from the 90s unmodified. There are VB6 apps from 1998 talking to Access databases still running small businesses today.

Can't say the same for either Mac or Linux.

It's not really a problem for Apple because their userbase is content to re-buy all their software every 5 years.


Well, that's true. It's an interesting point actually. Windows certainly wins in terms of binary compatibility.

I was thinking more about the developer perspective, i.e. churn in terms of frameworks. Yes, PowerPC is gone. Intel will be gone soon.

But both the transitions from PowerPC to Intel as well as from Intel to ARM were pretty straightforward for developers if you were using Cocoa and not doing any assembly stuff.

Carbon was only ever a bandaid to give devs some time for the transition to Cocoa.


Maybe I am a bit jaded, but with Apple's yearly OS release cycle — and breaking things nearly every time — I grew sick and tired of software I spent good money on, or relied on, suddenly not working anymore.

Imagine taking your car in for an oil change annually and the radio stopped working when you got it back. It's incompatible with the new oil, they say. You'd be furious.

With the Windows of yore this wasn't so much of an issue — with 5-10 years between upgrade cycles, and service packs in between, you could space it out.

When you work in the computer industry, there tends to be a disconnect from how computers are used in the real world by real people — as tools. People grow accustomed to their tools and expect them to be reliable, as opposed to some ephemeral service.


I share your sentiment very strongly.

Apple's change for the sake of change is extremely annoying, especially since the changes have been regressions lately. They always push their commercial interest at the cost of their users, refusing to maintain stuff properly to save money.

At some point I had to change a Mac because the GPU wasn't compatible with some apps after they pushed their Metal framework. But it was working just fine for me, and I didn't really need to change it at that moment; Apple just decided so.

And if you use their software on different hardware and make the mistake of upgrading just one, it is very likely that you will have to upgrade the other because the newer software version won't be compatible with the older hardware (had the problem with Notes/Reminders database needing an OS upgrade to be able to sync).

Microsoft is all over the place, but at least it is very likely that you can get away with changing your hardware only once every 10 years if you buy high-end stuff.


Is that a good or bad thing? Yes, Mac chops off legacy after a decade or so, but I don't see not being able to run apps from the 90s as a problem (or if I did, I'd probably be running Windows or Linux instead of macOS).


> 32-bit apps?

The Core 2 Duo, used in the last 32 bit Mac, was released in 2006.

> PowerPC stuff?

The last G5 PowerPCs were, similarly, discontinued in 2006.

> every 5 years.

20?


Your stance is that all software should die as soon as the generation of chip it was developed on stops being sold?


Sorry, I see how my post might have been somewhat unclear. No, my stance is that 2006 is closer to 20 years ago than 5 years ago.


> Can't say the same for either Mac or Linux.

From my own experience, things tend to keep working on Linux if you package your own userland libraries instead of depending on the ever-changing system libraries. More or less how you would do it on Windows.

Except Windows isn't perfect either: I had to deal with countless programs that required an ancient version of the C runtime, some weird database libraries that weren't installed by default, and countless other Microsoft dependencies that somehow weren't part of the ever-growing bloat.


Even WinAmp 2.0 from 1998 still runs on Windows 11.


> Windows can still run programs from the 90s unmodified.

Did _you_ try? Because I hear this mantra a lot on HN, but my experience is different. MDK (the game) cannot _run_ on a current Windows.


Although it's rare for me, I have used some old software that was built for Windows 9X or old versions of NT. So far, the track record is perfect - native programs have worked just fine, though I obviously can't vouch for all of them.

Old, complex games are the worst-case scenario, and are the exception, not the rule. Since they were only beginning to figure out hardware-accelerated 3D gaming in the 90s, it meant that we were left with lots of janky implementations and outdated graphics APIs that were quickly forgotten about. Though, MDK doesn't seem to suffer from this - it should be capable of running on newer systems directly [1]. One big issue it does have is that it uses a 16-bit installer, which is one thing that was explicitly retired during the transition to 64-bit due to it being so archaic at that moment, only being relevant to Windows 1-3. But you can still install the game using the method described in the article, and it should hopefully run fine from there on. Since it has options to use a software renderer and old DirectX, at least one of these should work.

[1] https://www.pcgamingwiki.com/wiki/MDK


I sometimes use WinAmp 2.0, which was released in 1996. I prefer to use v5, but I like to show friends that such old software still works fine (even Shoutcast streaming works fine).


Try running windows 11 on old CPUs, or machines without secure boot / TPM 2.0.


> Try running windows 11 on old CPUs, or machines without secure boot / TPM 2.0.

The more relevant test is the reverse: running Windows XP and apps of that era on modern hardware. It will work perfectly. The same cannot be said of 2000-era Mac software.


That's an academic use case rather than something a lot of people would like to do.


That's because the TPM 2.0 module allows M$ to uniquely identify you and sell your info to advertisers - it's not an actual technical limitation, it's just because M$ is greedy, and it's a shame they aren't punished by governments for creating all this unnecessary e-waste just to make even more cash.


With GNU/Linux and BSD I just recompile. I can run old C stuff from the 90s with a few flags.

Under GNU/Linux, the VB6 counterpart would be Tcl/Tk + SQLite, which has run nearly the same for almost 25-30 years.

As a plus, I can run my code with any editor, and the Tcl/Tk dependencies will run directly on XP, Mac, BSD and GNU/Linux with no proprietary chains ever, or worse, that Visual Studio monstrosity. A simple editor will suffice, and IronTCL weighs less than 100MB, even bundled with some tool, such as BFG:

https://codeberg.org/luxferre/BFG

IronTCL:

https://www.irontcl.com/index.html

Good luck finding VB5/6 runtime libraries out there that aren't a nest of viruses.


I suggest paying attention to some mac development podcasts.


What would I learn there?


That development on macOS UI frameworks isn't as rosy.


I'm not saying it's perfect. Just that it's less of a mess than the situation on Windows.


Again, listen to those podcasts, especially how Apple has (not) handled bug reports.

Between Carbon, Cocoa, UI Touch, UIKit, Catalyst and SwiftUI, there is plenty to choose from, with many non-overlapping capabilities.

Then there is naturally the whole plethora of Kits, with several deprecated and replaced by others without feature parity:

https://marcoeidinger.github.io/appleframeworks/


Carbon is long deprecated and as mentioned was only ever meant as a transitional framework.

Cocoa still exists and is usable. UITouch is not a framework, but a class in UIKit. UIKit still exists and is usable. Same for Catalyst. Same for SwiftUI.

As said, I'm not pretending everything is sunshine and roses in Apple-Land. But at least Apple seems to mostly dogfood their own frameworks, which unfortunately doesn't seem to be the case anymore with Microsoft. WinUI 3 and WPF are supposed to be the "official" frameworks to use, but it seems Microsoft themselves are not using them consistently and they also don't seem to put a lot of resources behind them.


Win32, MFC, Windows Forms, and WPF also exist and are quite usable.

Apple also doesn't always use their stuff as they're supposed to: webviews are used in a few "native" apps, some macOS apps are actually iOS ones ported via Catalyst (which is the reason they feel strange), and there's plenty of other stuff I could list.

Two measures, two weights.


> Ah yes, and the solution being presented is Linux, with Xlib, Motif, Qt, GTK

I'm not going to descend into a "my OS's API is worse than yours" pissing match with you, because it's pointless and tangential. The issue is not "is the Windows framework situation worse than Linux's" but rather "is the Windows framework situation worse than it used to be", and the answer is emphatically yes, due mostly to Ballmer's obsession with chasing shiny things, such as that brief period when he decided that all of Windows must look like a phone.


> Xlib, Motif, Qt, GTK,

Xlib and Motif are stable APIs. Qt and ... GTK on the other hand...


Feb 2014: Satya Nadella becomes CEO.

July 2014: Microsoft lays off 14k people, a large portion of which are SDET (Software Development Engineer in Test)/QA/test people.

The idea was that regular developers themselves would be writing and owning tests rather than relying on separate testers.

I'm sure there were multiple instances of insane empire-building and lots of unproductivity, but it's also hard not to think this was where the downfall began.


Ultimately it still comes down to someone in the chain giving a damn. There are obvious, surface level bugs across most technologies. Yet, developers, PMs, VPs all sign off and say, "Close enough".


And that's also on the CEO, especially after this much time.

He has failed to correctly incentivize/hire/motivate/plan/structure/etc.


Does Satya Nadella even use a windows laptop?


Yes, probably.


Yeah, it sounds weird because the person you’re replying to is using examples of things that came in under Nadella, not Ballmer.


Danluu has a great piece on why Ballmer was better than people gave him credit for: https://danluu.com/ballmer/


The first Go proverb Rob Pike listed in his talk "Go Proverbs" was, "Don't communicate by sharing memory, share memory by communicating."

Go was designed from the beginning to use Tony Hoare's idea of communicating sequential processes for designing concurrent programs.

However, like any professional tool, Go allows you to do the dangerous thing when you absolutely need to, but it's disappointing when people insist on using the dangerous way and then blame it on the language.

https://www.youtube.com/watch?v=PAAkCSZUG1c
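To make the proverb concrete, here's a minimal, illustrative Go sketch (mine, not from the talk): a single goroutine owns a counter, and everything else talks to it over channels instead of sharing the variable behind a mutex.

  package main

  import "fmt"

  // counter owns n; nothing else ever touches it directly.
  func counter(inc <-chan int, read chan<- int) {
    n := 0
    for {
      select {
      case d := <-inc:
        n += d
      case read <- n: // hand out the current value on request
      }
    }
  }

  func main() {
    inc := make(chan int)
    read := make(chan int)
    go counter(inc, read)

    inc <- 1
    inc <- 1
    fmt.Println(<-read) // 2, with no shared variable and no lock
  }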


> people insist on using the dangerous way and then blame it on the language

Can you blame them when the dangerous way uses 0 syntax while the safe way uses non-0 syntax? I think it's fine to criticize unsafe defaults, though of course it would not be fair to treat it like it's the only option


They're not using the dangerous way because of syntax, they're using it because they think they're "optimizing" their code. They should write correct code first, measure, and then optimize if necessary.


This is all very nice as an idea or a mythical background story ("Go was designed entirely around CSP"), but Go is not a language that encourages "sharing by communicating". Yes, Go has channels, but many other languages also have channels, and they are less error prone than Go[1]. For many concurrent use cases (e.g. caching), sharing memory is far simpler and less error-prone than using channels.

If you're looking for a language that makes "sharing by communicating" the default for almost every kind of use case, that's Erlang. Yes, it's built around the actor model rather than CSP, but the end result is the same, and with Erlang it's the real deal. Go, on the other hand, is not "built around CSP" and does not "encourage sharing by communicating" any more than Rust or Kotlin are. In fact, Rust and Kotlin are probably a little bit more "CSP-centric", since their channel interface is far less error-prone.

[1] https://www.jtolio.com/2016/03/go-channels-are-bad-and-you-s...
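For the caching case mentioned above, the memory-sharing version really is short. A minimal sketch (illustrative only, not from the linked post) of the "share memory, guard it with a lock" style:

  package main

  import (
    "fmt"
    "sync"
  )

  // Cache is a lock-guarded map; for a simple lookup table this is usually
  // shorter and less error-prone than routing every read and write through
  // a channel-owning goroutine.
  type Cache struct {
    mu sync.RWMutex
    m  map[string]string
  }

  func NewCache() *Cache { return &Cache{m: make(map[string]string)} }

  func (c *Cache) Get(k string) (string, bool) {
    c.mu.RLock()
    defer c.mu.RUnlock()
    v, ok := c.m[k]
    return v, ok
  }

  func (c *Cache) Set(k, v string) {
    c.mu.Lock()
    defer c.mu.Unlock()
    c.m[k] = v
  }

  func main() {
    c := NewCache()
    c.Set("answer", "42")
    fmt.Println(c.Get("answer")) // 42 true
  }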


Meaning similar to Erlang style message passing?


Not quite. Erlang uses the Actor model, which delivers messages asynchronously to named processes. In Go, messages are passed between goroutines via channels, which provide a synchronization mechanism (when unbuffered). The ability to synchronize allows one to set up a "rhythm" to the computation that the Actor model is explicitly not designed for. Also, note that a process must know its consumer in the Actor model, but goroutines do not need to know their consumer in the CSP model. Channels can even be passed around to other goroutines!

Each has its own pros and cons. You can see some of the legends who invented different methods of concurrency here: https://www.youtube.com/watch?v=37wFVVVZlVU

There's also a nice talk Rob Pike gave that illustrated some very useful concurrency patterns that can be built using the CSP model: https://www.youtube.com/watch?v=f6kdp27TYZs
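A tiny, illustrative Go sketch (mine, not from the talks) of the two CSP points made above: an unbuffered channel makes sender and receiver meet in time, and since channels are ordinary values, a request can carry its own reply channel, so the worker never needs to know who its consumer is.

  package main

  import "fmt"

  // request carries its own reply channel, so the worker stays anonymous
  // to its consumers.
  type request struct {
    x     int
    reply chan int
  }

  func worker(in <-chan request) {
    for req := range in {
      req.reply <- req.x * req.x
    }
  }

  func main() {
    in := make(chan request) // unbuffered: the send below blocks until the worker receives
    go worker(in)

    r := request{x: 7, reply: make(chan int)}
    in <- r
    fmt.Println(<-r.reply) // 49
  }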


It's true that message sends with Erlang processes do not perform rendezvous synchronization (i.e., sends are nonblocking), but they can be used in a similar way by having process A send a message to process B and then blocking on a reply from process B. This is not the same as unbuffered channel blocking in Go or Clojure, but it's somewhat similar.

For example, in Erlang, `receive` _is_ a blocking operation that you have to attach a timeout to if you want to unblock it.

You're correct about identity/names: the "queue" part of processes (the part that is most analogous to a channel) is their mailbox, which cannot be interacted with except via message sends to a known pid. However, you can again mimic some of the channel-like functionality by sending around pids, as they are first-class values and can be sent, stored, etc.

I agree with all of your points, just adding a little additional color.

