absolutely does! for a new language that no one has heard of, it is essential that examples draw at least a parallel to other languages. providing examples for mundane things is very useful for building understanding with a reader who hasn't been writing a paper on the OM language.
that's simply not how it works, and quite obviously so. the stop time is absolutely not linear in the number of people who board the bus. just think about all the time it takes to slow down, possibly make the whole bus kneel, and then rise again. by your argument, there should be infinitely many bus stops, each allowing only a single person to board. like, what? surely we can think more critically than this...
i love how this disintermediates the next.js/vercel axis, which seems to be determined to make basically everything hard except for exactly what they want to do. as much as i love what vercel has done for open source in general (amazing stuff!) it is hard to interpret some of the stuff they do with next as anything other than vendor lock-in bs… the kind that i know is not in their hearts.
that's what the cultivators of these examples are preying on. but in practice what people care about is "can i get it to do <X>", not "is it a decider on every possible token sequence that humans perceive to be about <X>".
Fair, but that's just what hype is. Overpromise, underdeliver. Most of us recognize its limits and take advantage of its strengths. This post (and many comments in it) seem to imply that AI is useless because it isn't AGI, answered a simple question wrong, was tricked, or didn't answer perfectly. That's cherry-picking at best, disingenuous at worst.
well the other person in the comments said the guy literally held his accelerator to the floor the entire time. is that actually a reasonable standard, or are you preemptively out for blood because you would never let reality get in the way of a good agenda? ironic, given that you go out of your way to accuse others of this. methinks you doth protest too much?
and that’s somehow okay while you’re driving? i didn’t realize there was a “uhh but if you drop your phone it’s okay to cause a crash” clause in the statutes.
"fat and opinionated" has always been true of them (especially compared to openai), and to all appearances remains a feature rather than a bug. i can't say the approach makes my heart sing, personally, but it has absolutely augured tremendous success among thought workers / the intelligentsia
no it doesn’t. not at all. “abnormality” is a measure vs. the median… what else could “abnormal” possibly even mean? how could anyone ever be abnormal in any way otherwise, given the number of possible avenues of abnormality in the universe? this logic can only even “play ball” with a singular “is this person abnormal or not?” boolean… if there existed even two axes of abnormality then by your folksy definition it cannot actually exist. QED.
I don't think anyone doubted the story. The details might be questionable, but the basic claim that he fought with elephants is highly likely true. We have plenty of sources for war elephants in his time, so the bigger surprise would be if someone could prove he didn't have them.
> but the basic claim that he fought with elephants is highly likely true. We have plenty of sources for war elephants in his time, so the bigger surprise would be if someone could prove he didn't have them.
Just to add to your point: Many, many cultures have used elephants in their armies, so the only real bone of contention (oh god, I do love my puns) would have been that Hannibal was using them on the European continent.
It's more like, the surviving written histories and the archaeological record are each giving us only a part of the real truth. It's as if they're both grasping in the dark at two different parts of the same elephant.