Knowing the incentive of the human who deployed it, at one remove or another, would require knowing more. But the likely cases are easy to guess: e.g., someone is playing with OpenClaw. I'd guess "someone is playing with OpenClaw and intends to write something about it to boost their brand; could be a Show HN, could be a LinkedIn screed they hope goes viral."
Humans are not judged on the basis of what they _can_ do.
Reasoning about how to constrain tools on the basis of what they _could_ do, if e.g. used outside their established guardrails, needs to be very nuanced.
Correct; the ability of a model to reproduce source material verbatim does not necessarily make the model's existence illegal. However, using a model to do just that might very well present a legal liability for the user. I would be interested to see the extent to which models can "recite from memory" source code, e.g., from the various MS code leaks. Put another way, if I'm using LLM code generation extensively, do I need to run a filter on its output to ensure that I don't "accidentally" copy large chunks of the Windows codebase?
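On that last question, here is a minimal sketch of the kind of filter one might run: compare token n-grams ("shingles") of generated code against a reference corpus and flag high verbatim overlap. The function names, window size, and threshold are illustrative assumptions, not an established practice or tool.

```python
# Sketch of a verbatim-copy filter (illustrative, not a vetted compliance tool):
# flag generated output that shares long token n-grams ("shingles") with a
# reference corpus of code you must not reproduce.

def shingles(text: str, n: int = 20) -> set[tuple[str, ...]]:
    """All n-token windows in `text`, tokenized naively by whitespace."""
    toks = text.split()
    return {tuple(toks[i:i + n]) for i in range(len(toks) - n + 1)}

def overlap_ratio(generated: str, corpus: str, n: int = 20) -> float:
    """Fraction of the generated text's n-grams found verbatim in the corpus."""
    gen = shingles(generated, n)
    if not gen:
        return 0.0
    return len(gen & shingles(corpus, n)) / len(gen)

def looks_copied(generated: str, corpus: str, n: int = 20,
                 threshold: float = 0.1) -> bool:
    """True if more than `threshold` of the output overlaps the corpus."""
    return overlap_ratio(generated, corpus, n) > threshold
```

A real filter would tokenize per-language and hash the shingles (MinHash or similar) to scale to a large corpus; this only shows the shape of the check.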
Setting aside the marvelous murk in that use of "you" (which, parenthetically, I would be happy to chat about ad nauseam),
I would say this is a fine time to haul out:
Ximm's Law: every critique of AI assumes to some degree that contemporary implementations will not, or cannot, be improved upon.
Lemma: any statement about AI which uses the word "never" to preclude some feature from future realization is false.
Lemma: contemporary implementations have already improved; they're just unevenly distributed.
These days I can't stop thinking about the XKCD whose punchline is the alarmingly brief window between "can do at all" and "can do with superhuman capacity."
I'm fully aware of the numerous dimensions along which the advancement from one state to the other, in any specific domain, is unpredictable, Hard, or less likely to be quick... but this is the rare case where, absent black swan externalities ending the game, line goes up.
"every critique of AI assumes to some degree that contemporary implementations will not, or cannot, be improved upon."
They're token predictors. This is inherently a limited technology, which is optimized for making people feel good about interacting with it.
There may be future AI technologies which are not just token predictors, and will have different capabilities. Or maybe there won't be. But when we talk about AI these days, we're talking about a technology with a skill ceiling.
This is oddly timed, inasmuch as one of the big success stories I've heard from a friend is their new practice of having Claude Code develop in Rust, then translate that to WebAssembly.
That seems much more like the future than embracing Node...
If you’re making a web app, your fancy Rust wasm module still has to interface with the DOM, so you can’t escape that. Claude might offer you some fake simplicity on that front for a while, but I'm skeptical that it's fully scalable.
What is your argument for why denecessitating labor is very bad?
This is certainly the assertion of the capitalist class,
whose well-documented behavior makes clear that the objection is not that the elimination of labor fails to deliver happiness and the freedom to pursue indulgences of every kind.
It is not at all clear that universal life-consuming labor is necessary for a society's stability and sustainability.
The assertion is rooted, IMO, in the fact that denecessitating labor is inconveniently bad for the maintenance of the capitalists' control and primacy,
inasmuch as those who are occupied with labor, and fearful of losing access to it, are controlled and controllable.
People willing to do something harder or riskier than others will always have a better chance at a better position. Be it sports, labor, or anything else in life.
I am 1000% OK with living in a world where basic needs are fully provided for,
and competition and drive are worked out in domains which do not come at the expense of someone else's basic needs.
Scifi has speculated about many potential outlets for "human drive," the frontier/pioneer spirit being a big one; if I could name my one dream for my kids it'd be that they live in an equitable post-scarcity society which has turned its interests to exploring the solar system and beyond.
Sports, "FKT" competitions, and social capital ("influence") are also relatively innocuous ways to absorb the drive for hierarchy and power.
The X factor is whether the will to dominate, control, or be subjugated is suppressible or manageable.
Could be for fun. I remember fun.