
Who are the weak being "sacrificed"?

And who is the one calling for action?

Sorry for being dense, but I'm trying to understand if I'm the "strong" or the "weak" in your analogy.



> Who are the weak being "sacrificed"?

The work of artists, authors, etc.

I know the legal situation is currently messy, but that's exactly the point: anyone who can't engage in a lengthy legal battle and defend their position in court is being sacrificed. The companies behind LLMs are spending hundreds of millions of dollars on lobbying and exploiting loopholes.

Let's be real: without the data there wouldn't be LLMs, so it's crazy that some people downplay its significance or value while, on the other hand, losing sleep over finding fresh sources to scrape.

The big publishers seem to have given up and decided it's best to reach agreements with their counterparts, while independent authors are given the finger.


What about programmers? I never consented to have my code consumed by LLMs.


Any case where someone's work was used without respecting the terms is included in my answer. That's why I said "etc." here:

> The work of artists, authors, etc.


I wanted to make sure I understood which side of the equation I fell on. And I must say, it looks to me like a lot of people in the "weak" camp aren't helpless martyrs, myself included. People are excited and enthusiastic about AI and are actively reaping the benefits of progress. I don't think your analogy is quite apt.


> a lot of people in the "weak" camp

Define "a lot"? Most people barely know how to use their email. Even among the minority who do actively use "AI" and excited about it, outside of ML engineers they aren't well-informed or aware what data is used for training, or even what training means and how these models work to begin with.

> People are excited and enthusiastic about AI and are actively reaping the benefits of progress.

Except the terms were already violated in the initial training phase, before the services were even public and saw adoption. That's like pointing at a rape victim who later got some form of compensation and saying:

  see how she's "reaping the benefits"
So let's not play the "people wanted it" card.

By the time some people started raising concerns, OpenAI claimed the cat was already out of the bag: "if we didn't do it, someone else would, so deal with it."

As with privacy: just because some people don't care, lack awareness, or don't want the hassle of fighting for it doesn't justify taking it away from others.


Your argument seems to be that the majority of the world are the weak being sacrificed but are too ignorant to realize it. I wholeheartedly disagree with this theory.


Yes. Your intellectual labor will be exploited to the maximum degree possible by "AI" companies that are anything but.

This is repackaging content, laundering it, and reselling it.

As others have noted, IP law has lots of problems; Sam Altman et al. are exploiting the gap between the speed of technology and that of law, imposing their own version of social good without waiting for the consent of those they're exploiting.



