An LLM can't invent meaning in a text where there is none. It's the equivalent of CSI's classic "zoom, enhance" on resolution-limited photographs. You need to consider that you may be learning a load of rubbish from LLMs.
Pretend learning is absolutely the key point for me. There is danger in shifting our reasoning from knowing "stuff" to knowing a symbolic summary of "stuff" (helpfully generated by an LLM at varying levels of accuracy).
Previously, we saw a shift with search engines where we no longer needed to learn data because we could use a search engine as a mental signpost to the data, freeing up capacity for other thought.
LLMs are shifting knowledge creation to this mental pointer model. We don't need to know real "stuff" because we know how to look it up later (never?).
Each of these summaries is a secondary source, delivered through an agent biased by whatever is in its current context window. Like a game of telephone, the summaries are inherently lossy: each one may be 95% correct, and crucially we don't know which 5% is wrong.
When our basis for decision making is a collection of hundreds or thousands of LLM-generated "Schrödinger's facts", we risk cumulative, cascading errors. We will be wrong in unpredictable, chaotic ways.
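To make the compounding concrete, here's a small sketch of the arithmetic, under the simplifying assumption that each fact is independently correct with probability 0.95 (real errors are likely correlated, which can be better or worse):

```python
# Simplified model: each "fact" is independently correct with probability p.
p = 0.95

for n in (10, 100, 1000):
    all_correct = p ** n          # chance that every single fact is right
    expected_wrong = n * (1 - p)  # how many facts we should expect to be wrong
    print(f"{n} facts: P(all correct) = {all_correct:.4g}, "
          f"expected wrong facts = {expected_wrong:.0f}")
```

Even at 100 facts the chance that everything is correct is well under 1%, and you don't know which ~5 facts are the wrong ones.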
We are voluntarily capping ourselves at this childish level of thought, because it feels like we are exercising our critical judgement the same as ever. However, the integrity of the inputs has been compromised, and bad inputs always lead to bad outputs.
Established corporations will be doing the yoinking, with pre-existing credibility. There's a huge incentive to offer these copied services for cents on the dollar as a way to kill the competition.
It'll be interesting to see if this happens at a service level too. Like how lots of companies offer an S3-compatible API, will companies start offering similar services and building a compatibility layer over the top as an easy way for customers to transition? You could use the existing service as a test suite to check that your compatibility API behaves the same as the original product.
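The "existing service as a test suite" idea is basically differential testing. A minimal sketch, where the endpoint URLs, request paths, and response shape are all hypothetical placeholders rather than any real product's API:

```python
# Differential-testing sketch: replay the same requests against the original
# service and the compatibility layer, then diff the observable behaviour.
import urllib.request

def fetch(base_url, path):
    """Issue a GET and return (status, body) as the observable behaviour."""
    with urllib.request.urlopen(base_url + path) as resp:
        return resp.status, resp.read()

def record(base_url, paths):
    """Capture a service's responses, keyed by request path."""
    return {path: fetch(base_url, path) for path in paths}

def diff_behaviour(expected, actual):
    """Return the paths where the compat layer disagrees with the original."""
    return [path for path in expected if actual.get(path) != expected[path]]
```

You'd run `record()` once against the original product to build the oracle, then repeatedly against your compatibility layer as you build it, treating an empty `diff_behaviour()` result as passing.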
The law locks up the man or woman
Who steals the goose from off the common
But leaves the greater villain loose
Who steals the common from off the goose.
Pilates class at my local gym. Friendly, low stress, and only once or twice a week for 45 minutes. Sorted out my core strength, messed-up shoulders and back.
Not really that easy. The production bugs that end up getting measured are only the ones that got reported; there is an unknown (and unknowable) number of unreported bugs that you never measure.