
I agree. In our current neoliberal era, those in power have been replacing various systems in our society with algorithms and computers, formulating everything as an optimization problem. As a result, the reverse has happened: the systems are now shaping humanity into something that can be optimized.

We have been trying to manage governments, public services, and education the same way as corporations, creating numerical targets for institutions to optimize for. Education itself was reformulated as an optimization problem about how to create more jobs. Public services like healthcare were privatized and became targets for profit optimization. Half of the stock market is driven by high-frequency trading systems, which will do virtually anything to gain an upper hand in profits. Those methods were all inherited from the corporate management styles that took hold in the neoliberal era.

As fundamental parts of our society are replaced by these systems, society now curve-fits itself to the systems rather than the systems curve-fitting themselves to society. We hyperoptimize ourselves to fit this neoliberal landscape: our time is treated as something to be optimized across work, socializing, exercising, and self-improvement, with no space left for "actual free time" of our own. We go to college not to learn but to pass exams and land a good job that can sustain us. The faults of our systems are now blamed on individuals: "You didn't optimize for the current trends of the job market; it's your fault." And from the corporations' point of view, we are just agents waiting to be optimized for cash, and we are becoming exactly that as we fit our bodies and minds to social media that maximizes engagement and ad revenue no matter the real societal cost.

Now, the real problem with AI, compared to the algorithms of the past, is its data-driven nature: it can only learn from the data you give it for training. We can only accumulate data from the past, never from the future, so AI systems will just keep repeating the past, no matter what unseen changes come. We will lose the ability to imagine new political and economic alternatives; we will just keep feeding ourselves the status quo, and societal advancement will stagnate at the hands of automated systems. The cancellation of the future: this is what I'm ultimately afraid of.


