Those deps have to come from somewhere, right? Unless you're actually rolling your own everything. With languages that don't have package managers, what you end up doing is adding submodules of various libraries and running their CMake configs, which is at least as insecure as npm or crates.io.
Go is a bit unique as it has a really substantial stdlib, so you eliminate some of the otherwise-necessary deps. But it's also trivial to rely on established packages like Tokio etc., vendor them into your codebase, and not have to worry about it in the future.
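For reference, vendoring is a single command in both ecosystems. A sketch of the commands and config involved, assuming an existing project checkout (the `.cargo/config.toml` snippet shown in the comments is what `cargo vendor` itself prints on success):

```shell
# Go: copy every module dependency into ./vendor;
# builds use the directory automatically once it exists
go mod vendor

# Rust: copy every crate's source into ./vendor
cargo vendor
# then tell Cargo to use it, in .cargo/config.toml:
#   [source.crates-io]
#   replace-with = "vendored-sources"
#   [source.vendored-sources]
#   directory = "vendor"
```

From that point on, the dependency sources live in your repo and can be diffed and reviewed like any other code.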
> Those deps have to come from somewhere, right? Unless you're actually rolling your own everything
The point is someone needs to curate those "deps". It's not about rolling your own, it's about pulling standard stuff from standard places where you have some hope that smart people have given thought to how to audit, test, package, integrate and maintain the "deps".
NPM and Cargo and PyPI all have this disease (to be fair NPM has it much worse) where it's expected that this is all just the job of some magical Original Author and it's not anyone's business to try to decide for middleware what they want to rely on. And that way lies surprising bugs, version hell, and eventually supply chain attacks.
The curation step is a critical piece of infrastructure: think of things like the Linux maintainer hierarchy, C++ Boost, Linux distro package systems, or in its original conception the Apache Foundation (though they've sort of lost the plot in recent years). You can pull from those sources, get lots of great software with attested (!) authorship, and be really quite certain (not 100%, but close) that something in the middle hasn't been sold to Chinese intelligence.
But the Darwinian soup of Dueling Language Platforms all think they can short circuit that process (because they're in a mad evangelical rush to get more users) and still ship good stuff. They can't.
I mean, somebody could make a singular Rust dependency that re-packages all of the language team's packages.
But what's the threat model here? Does it matter that the Rust std library doesn't expose, say, regex functionality, forcing you to depend on the regex crate [1], which is written by the same people who write the std library [2]? If they wanted to add a backdoor to regex, they could just as easily add a backdoor to Vec. Personally I like the idea of a very small std library, so that it stays focused (and so that anything it needs to do has to be expressible in the language itself, unlike, say, pre-generics Go or Elm).
Personally I think there's just some willful blindness going on here. You should never have been blindly trusting a giant binary blob from the std library. Instead you should have been vendoring your dependencies, and at that point it doesn't matter if it's 100 crates totaling 100k LOC or a singular std library totaling 100k LOC; it's the same amount to review (if not less, because the crates can only interact along `pub` boundaries).
[1]: https://docs.rs/regex/latest/regex/
> I mean somebody could make a singular rust dependency that re-packages all of the language team's packages.
That's not the requirement though! Curation isn't about packaging, it's about independent (!) audit/test/integration/validation paths that provide a backstop to the upstream maintainers going bonkers.
> But what's the threat model here.
A repeat of the xz-utils fiasco, more or less precisely. This was a successful supply chain attack that was stopped because the downstream Debian folks noticed some odd performance numbers and started digging.
There's no Debian equivalent in the soup of Cargo dependencies. That mistake has bitten NPM repeatedly already, and the reckoning is coming for Rust too.
> Wasn't that a suspected state actor? Against that threat model your best course of action is a prayer and some incense.
No? They caught it! But they did so because the software had extensive downstream (!) integration and validation sitting between the users and authors. xz-utils pushed backdoored software, but Fedora and Debian picked it up only in rawhide/testing and found the issue.
> Notably, xz utils didn't use any package manager ala NPM and it relied on package management by hand.
With all respect, this is an awfully obtuse take. The problem isn't the "package manager", it's (and I was explicit about this) the lack of curation.
It's true that xz-utils didn't use NPM. The point is that NPM's lack of curation is, from a security standpoint, isomorphic to not having any packaging regime at all, and equally dangerous.
> a Postgres dev running bleeding edge Debian
Exactly. Not sure how you think this makes the point different. Everything in Debian is volunteer; the fact that people do other stuff is a bonus. The point is that the Debian community is immunized against malicious software because everyone is working on validation downstream of the authors.
No one does that for NPM. There is no Cargo Rawhide or NPM Testing operated by attested organizations where new software gets quarantined and validated. If the malicious authors of your upstream dependencies want you to run backdoored software, then that's what you're going to run.
No? Who else has 2-3 years' worth of time to become a contributor and maintainer of obscure OSS utils?
Plus they made sockpuppets to put pressure on the OG maintainer to give Jia Tan maintainer privileges.
> Exactly. Not sure how you think this makes the point different. Everything in Debian is volunteer, the fact that people do other stuff is a bonus.
What do you mean exactly? This isn't curation working as intended. This is some random dev discovering it by chance, while it snuck past the maintainers and curators of both Debian and Red Hat.
> Everything in Debian is volunteer, the fact that people do other stuff is a bonus. Point is the debian community is immunized against malicious software because everyone is working on validation downstream of the authors.
You can do the same in NPM and Cargo.
Release a v1.x.y-rc0, give everyone a trial run, see if anyone complains. If they do, it's downstream validation working as intended.
Then yank RC version and publish a non-RC version. No one is preventing anyone from making their release candidate version.
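A sketch of that flow as a command sequence (package name and version numbers hypothetical). One caveat worth noting: semver pre-release versions aren't matched by ordinary version requirements in either Cargo or npm, so only downstreams that explicitly opt in will pick up the RC:

```shell
# Cargo: publish with version = "1.2.0-rc.0" set in Cargo.toml
cargo publish
# npm: tag the pre-release so a plain `npm install` (which follows
# the "latest" dist-tag) won't pick it up
npm publish --tag next

# if downstream validation turns up nothing, ship the real release
# and retire the RC (npm has no yank; deprecate is the closest)
cargo yank --version 1.2.0-rc.0
npm deprecate mypkg@1.2.0-rc.0 "superseded by 1.2.0"
```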
> No one does that for NPM. There is no Cargo Rawhide or NPM Testing
Because, it makes no more sense to have Cargo Rawhide than to have XZ utils SID.
Cargo isn't an integration point, it's infra.
Bevy, which integrates many different libs, has a Release Candidate. But a TOML/XYZ library it uses doesn't.
Isn't xz-utils exactly why you would want a lot of dependencies over a singular one?
If, say, Serde gets compromised, then only the projects depending on that version of Serde are compromised, whereas if Serde were part of the std library, every Rust program would be.
> That mistake has bitten NPM repeatedly already, and the reckoning is coming for Rust too.
Eh, the only thing that's coming is that using software expressly provided without a warranty will (expectedly) cause you problems at an unknown time.
This falls under the "selling something" angle I mentioned. Yes yes yes, generality and abstraction are tradeoffs, and higher-level platforms lack primitives for things the lower levels can do.
That is, at best, a ridiculous and specious way to interpret the upthread argument (again c.f. "selling something").
The actual point is that all real systems involve tradeoffs, and one of the core ones for a programming language is "what problems are best solved in this language?". That's not the same question as "what problems CAN be solved in this language", and trying to conflate the two tells me (again) that you're selling something. The applicability of C to problem areas it "can" solve has its own tradeoffs, obviously.
It is more of a cultural thing. Package managers encourage lots of dependencies, while programmers using languages with no package manager will often pride themselves on having as few dependencies as possible. When you consider the complete dependency graph, that has an exponential effect.
It is also common in languages without package managers to rely on the distro to provide the package, which adds a level of scrutiny.
Technically it's the same. But behaviorally it's not. When pulling in more dependencies is so easy, it's very hard to slow down and ask the question do we need all of this?
Mucking around with cmake adds enough friction that everyone can take a beat for thoughtful decision-making.
But to clarify: this was about a year ago, when I struggled to find an autocompletion for HttpServer, and when I searched it up the JDK HttpServer simply wasn't in the results, so I made assumptions that were wrong.
I tried to implement a minimal server, only to conclude that there was still no way to do so in Java 21... I stand corrected, it's there: https://docs.oracle.com/en/java/javase/25/docs/api/jdk.https..., though it's a com.sun package instead of the standard runtime.
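For what it's worth, a minimal server really is doable with just the JDK: `com.sun.net.httpserver` has shipped with the JDK since Java 6 (as the `jdk.httpserver` module since Java 9); it just isn't in `java.base`, which is likely why it doesn't surface well in searches. A sketch:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class MinimalServer {
    // Start a tiny HTTP server on an ephemeral port using only the JDK.
    static HttpServer start() throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/", exchange -> {
            byte[] body = "hello from the JDK's built-in server".getBytes();
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
        return server;
    }

    public static void main(String[] args) throws Exception {
        HttpServer server = start();
        int port = server.getAddress().getPort();
        // Hit it with the (also built-in, Java 11+) java.net.http.HttpClient.
        HttpResponse<String> resp = HttpClient.newHttpClient().send(
                HttpRequest.newBuilder(URI.create("http://localhost:" + port + "/")).build(),
                HttpResponse.BodyHandlers.ofString());
        System.out.println(resp.body());
        server.stop(0);
    }
}
```

It's nowhere near a full framework, but for "minimal server" it covers the case without any dependency at all.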
I did mention that, but for a lot of things it is not enough compared to the full HTTP client most stdlibs have. HttpClient was introduced for a reason.