Well, this looks like the equivalent of GPTs to me. Meaning some user, in this case the user “sluttypuffin”, has chosen the prompt directives that instruct the model about how it should respond. So the hyperbolized nature of this (and I’m guessing, to be clear) is likely something that sluttypuffin specifically constructed with their prompt engineering.
I know. There are basically no studies that were run to completion. This one was. Go ahead and look for yourself. However, this study proves that if the effect exists, it is pretty small, and probably placebo. Placebo effects are real. But they affect things like surveys. Actual sleep quality did not seem to change.
This is where prompt engineering becomes more important. Next time consider pre-pending some kind of plain English set of expectations before pasting your code. Something like, “I want you to write tests for this code. Here are the expected behaviors <expected behaviors list>, and here are unexpected behaviors <unexpected behaviors list>. Tests should pass if they adhere to the expected behaviors and fail if they have unexpected behaviors. Here is the code: <code>”.
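The template described above can be sketched as a small helper. This is a hypothetical illustration, not any particular tool's API; the function name and arguments are made up for the example.

```python
def build_test_prompt(code, expected, unexpected):
    """Prepend a plain-English set of expectations before the pasted code,
    as described above: expected behaviors, unexpected behaviors, then code."""
    expected_list = "\n".join(f"- {b}" for b in expected)
    unexpected_list = "\n".join(f"- {b}" for b in unexpected)
    return (
        "I want you to write tests for this code.\n"
        f"Here are the expected behaviors:\n{expected_list}\n"
        f"Here are unexpected behaviors:\n{unexpected_list}\n"
        "Tests should pass if they adhere to the expected behaviors "
        "and fail if they have unexpected behaviors.\n"
        f"Here is the code:\n{code}"
    )

# Example usage with a trivial function:
prompt = build_test_prompt(
    "def add(a, b): return a + b",
    expected=["add(2, 3) returns 5"],
    unexpected=["add silently coerces strings"],
)
```

The point is just that the expectations come first, so the model reads the evaluation criteria before it sees the code.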
Like most LLM generation though, it’s not a deterministic thing, and like you mentioned originally it takes some verification of the output. I still think that, even with the extra steps, it saves time when applied to the right scenarios. In my experience, the longer the input the higher the hallucination count, so I always keep the code provided in the smallest chunk possible that still has enough context.
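The chunking strategy above can be sketched roughly like this. A minimal, assumption-laden example: it greedily packs blank-line-separated blocks of source into chunks under a size cap, so each prompt stays small but doesn't split a function mid-body. The function name and the cap are illustrative, not from any real tool.

```python
def chunk_source(source, max_chars=2000):
    """Greedily pack top-level blocks (separated by blank lines) into
    chunks no larger than max_chars, keeping related lines together."""
    chunks, current = [], ""
    for block in source.split("\n\n"):
        candidate = (current + "\n\n" + block) if current else block
        if len(candidate) <= max_chars:
            current = candidate
        else:
            if current:
                chunks.append(current)
            # A single block longer than max_chars becomes its own chunk.
            current = block
    if current:
        chunks.append(current)
    return chunks
```

Each chunk can then be sent as its own prompt, trading a few extra round trips for fewer hallucinations per call.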
>cultural adoption reproducing itself
… what? You’ve gone to great lengths here to fabricate a phrase that is semantically equivalent to “influence” in the most obfuscated way possible.
>cultural hegemony
See previous … what?
Let’s get down to brass tacks.
A does foo.
B notices A has done foo.
B decides it will also do foo.
A has influenced B.
There are other aspects of the original comment that were better targets than attacking the meaning of “influence”, such as America’s status as a melting pot whose traditions are in large part just the collection of influences from the multitude of ethnicities represented by the single term “American”. Claiming America has such dramatically unique influence while ignoring that it is itself composed of influences from other cultures is not honest context. That’s the part I would have gone after.
I mean, they probably would if public services paid their engineers Google salaries. Do you think someone who can make 150k at Google is going to take a public services eng job for at minimum 50% less pay? Of course, to have public service jobs like that we’d have to pay more taxes… and so you end up with overworked engineers who are probably OK at their job but not on the same level as engineers at the big private companies.
Imagine we start paying government employees Google salaries tomorrow. Could we then start firing incompetent employees and replace them with the competent ones lured by high pay? Everything I know about how politics and public sector unions work tells me that the answer is very much no. This means that this “pay them and they will come” plan can only expect to see any success many decades down the line, once enough incompetent workers leave on their own. And this is assuming your plan actually works: who’s to say that the high quality employees won’t get lazy themselves if they get Google pay with public sector job security?
Overall, this means that before we entertain your idea, we need to bust the public sector unions and fire the incompetent employees. I suspect that once you do that, you won’t even have to raise the pay much to get a large increase in public sector competence.
Google-level salaries are a recent thing in engineering. There wasn’t that much of a difference in engineering salaries between public and private sector say in the 1990s. But public services still sucked.