
> Then you are probably forced to argue that consciousness is very closely related to computation, to the point where a coin flip, or a hello world program, has some sliver of consciousness.

Why not? Those things have a zero level of consciousness: one that is conscious only of itself, and which correctly reflects their lack of a self-model.



The argument is: if our brains are just curve-fitting machines, then we can dial in the complexity of the computation. Start with a single parameter, then two parameters, and so on until we reach the complexity of the brain. By that procedure, we can ask after each added parameter whether the machine is now conscious, and I strongly doubt that there is a good answer.


The way we ascribe consciousness to entities other than ourselves is based on similarity to ourselves.

Obviously other living humans have the highest similarity, so they are automatically deemed conscious. Next are other primates, followed by other domesticated mammals, and other animals.

Furthest from the status of conscious are creatures we see as automata like dung beetles rolling their food, or jellyfish ... jellyfishing.

Presumably we'd apply a similar process to hypothetical AGIs.


That's an ill-defined question. You can do the same thing with far less vague concepts than consciousness, and I could even ask you the same question about a brain and adding neurons. https://en.wikipedia.org/wiki/Sorites_paradox



