
I don't think of the 4000 tokens as its memory as such. It's more like the size of its thinking workspace.


It also functions as memory in practice, though. With complex tasks that can be broken into steps, it often makes a big difference if you tell GPT to summarize its "understanding" of the current state as it works through those steps - by restating that summary each time, it stays within the token window, and the model effectively "remembers" it.
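To make that concrete, here's a minimal sketch of the pattern using the official OpenAI Python client. The model name, prompts, step list, and "STATE:" convention are my own illustrative choices, not anything from the comment above - the point is just carrying a compact summary forward instead of the whole transcript.

    # Sketch: "summarize state between steps" so the summary stays in the
    # token window. Assumes the OpenAI Python client (pip install openai);
    # model name and prompts are illustrative.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    steps = [
        "Step 1: parse the requirements.",
        "Step 2: outline a solution.",
        "Step 3: write the final answer.",
    ]

    state_summary = "No work done yet."

    for step in steps:
        response = client.chat.completions.create(
            model="gpt-3.5-turbo",
            messages=[
                {"role": "system", "content": "Work through a multi-step task."},
                # Feeding the previous summary back in keeps it inside the
                # token window, so the model effectively "remembers" it.
                {"role": "user", "content": (
                    f"Current state: {state_summary}\n\n"
                    f"{step}\n\n"
                    "Do this step, then end with a one-paragraph summary of "
                    "your updated understanding, prefixed 'STATE:'."
                )},
            ],
        )
        reply = response.choices[0].message.content
        # Carry only the compact summary forward, not the full transcript.
        # (If the model omits 'STATE:', this falls back to the whole reply.)
        state_summary = reply.split("STATE:", 1)[-1].strip()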



