Hacker News

I basically agree that there’s something… I dunno, implausible about solving loneliness in a general sense using LLMs.

But, wrt your specific description—these LLM-based tools are just programs, and they can be easily configured to validate and flatter, or to challenge and be obstinate. Surely they could be configured to have a satisfying relationship arc, if some work were put into it. I'm sure we could program a begrudging mentor that only becomes friendly after you've impressed it, if we wanted.
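To make that concrete: the "relationship arc" is just state that selects a persona. A minimal sketch (all names here are hypothetical, and there's no real LLM call—the chosen prompt would be fed to whatever model API you use):

```python
# Hypothetical sketch of a "begrudging mentor" persona whose tone is
# gated on the user having impressed it enough times.

GRUDGING = (
    "You are a curt, skeptical mentor. Be terse, point out flaws, "
    "and offer no praise."
)
WARM = (
    "You are a mentor who has been won over. Be supportive and generous, "
    "but keep your high standards."
)

class BegrudgingMentor:
    def __init__(self, impress_threshold: int = 3):
        self.impress_threshold = impress_threshold
        self.impressive_turns = 0

    def record_turn(self, impressed: bool) -> None:
        # In a real system, judging whether a turn was "impressive"
        # might itself be delegated to a model.
        if impressed:
            self.impressive_turns += 1

    def system_prompt(self) -> str:
        # The whole "arc" is just this state check swapping the prompt.
        if self.impressive_turns >= self.impress_threshold:
            return WARM
        return GRUDGING
```

Which is sort of the point: the arc is trivially programmable, a few lines of state around a prompt.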

I think you are right that something isn't there, but the missing thing is deeper than the surface-level behavior. They aren't AIs; they're just language models. To get closer in some greedy sense, we could give the language model more complex simulated human-like behaviors, but that would still be a simulation…



But that isn't the point. The user would know that the arc was programmed in. Loneliness is the absence of the esteem of your community. You cannot get the esteem of the people around you by interacting with a non-person with no free ability to reject you, no matter how elaborate the simulation or how intelligent the non-person entity.

As long as an AI is constructed as a tool for a specific end under the control of people, it cannot meet the real social needs of humans.



