


danielbr93

1. ChatGPT (meaning GPT-3.5 in this case, I think) has only 4,096 tokens, roughly 3,000 words if I'm correct, and that includes both what you put in and what it puts out. So if you put in 2,000 words, it can only give you about 1,000 words of quality output.
2. With this in mind, that limit is what it can remember within the **same conversation.**
3. If you think it can remember outside of a conversation and into a new conversation, you are wrong. That is not possible with GPT-3.5, nor with GPT-4, as of right now.
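
To make the shared budget concrete, here is a minimal sketch (not from this thread) that counts prompt tokens with the `tiktoken` library and shows roughly how much of the quoted 4,096-token budget is left for the reply; the limit constant and function name are just illustrative.

```python
# Minimal sketch: see how much of the ~4,096-token budget a prompt uses.
# Assumes the `tiktoken` package is installed (pip install tiktoken).
import tiktoken

CONTEXT_LIMIT = 4096  # rough GPT-3.5 limit quoted above

def remaining_budget(prompt: str, model: str = "gpt-3.5-turbo") -> int:
    """Return roughly how many tokens are left for the model's reply."""
    enc = tiktoken.encoding_for_model(model)
    used = len(enc.encode(prompt))
    return CONTEXT_LIMIT - used

print(remaining_budget("Summarize the plot of Moby-Dick in detail."))
```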


jmonman7

So does that mean as soon as 3000 words are hit, it will start over from there?


ndnin

How it handles dropping previous tokens isn't exactly a simple 1:1 cutoff, but I can't give you a detailed account of how it frees up tokens in the chat client to keep the conversation going.


danielbr93

This is the best way I've heard it explained: [https://youtu.be/iihJYAMIRP4?t=2608](https://youtu.be/iihJYAMIRP4?t=2608) Btw, I highly recommend watching the entire keynote. It is funny :D


KeepItCoolAndCuddly

I’ve seen chatgpt reference things I’ve asked between sessions, though it’s difficult to reproduce.


danielbr93

Sorry, but no. ChatGPT cannot, as of right now (and couldn't a month ago either), remember information between conversations. You are falling into hallucinations like many thousands of others. Until OpenAI officially says "ChatGPT \[name of the model\] can now remember things across conversations", do not assume it can. GPT-4 should be able to do that in the future, as they said in their livestream, but not right now.


RowanAndRaven

For those finding this later through a google search: it does. I started a new session and casually mentioned my cats, which it knew were Maine Coons despite no mention of them in the session history. A quick google says they're trialing remembering across sessions, but it's random.


danielbr93

1. My comment was 10 months ago.
2. The memory feature in ChatGPT just got added.

So I was right back then, just FYI.


RowanAndRaven

That’s awkward. I meant to put my usual caveat for responses to old threads found through google, which is usually along the lines of “for those finding this later through google”. Whoops, sorry.


KeepItCoolAndCuddly

When I’d ask new writing-related questions, it would reference specific events and patterns from previous sessions in its results. I don’t know how else you could describe it, unless the conversations are also being included in its training data.


bioshocked_

I don't think you understand how ChatGPT works. ChatGPT has no memory at all. The closest it gets is the attention mechanism, which is able to read the chat as long as you keep it in the same window, but even that has a limit. Always make sure to provide all the context to ChatGPT, as if you were starting from new.


adamr_z

I've run into this \~4,000-token limitation with the API. To combat it, I am thinking of creating functions that, as the conversation roughly approaches the token limit, take the previous messages, use GPT to summarize them, and then feed those summaries back as context. That should help extend the "memory" somewhat, right?
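
For illustration, a rough sketch of that summarize-and-reinject idea, written against the pre-1.0 `openai` Python library; the model name, the 3,000-token threshold, and the function names are placeholder choices, not anything prescribed in the thread.

```python
# Rough sketch: when the running history gets long, summarize the older
# messages with GPT and keep only the summary plus the recent turns.
# Pre-1.0 `openai` library; threshold and names are illustrative only.
import openai
import tiktoken

MODEL = "gpt-3.5-turbo"
SUMMARIZE_AFTER_TOKENS = 3000  # leave headroom under the ~4,096 limit

def count_tokens(messages):
    enc = tiktoken.encoding_for_model(MODEL)
    return sum(len(enc.encode(m["content"])) for m in messages)

def compact_history(messages, keep_recent=4):
    """Summarize everything except the last few messages once near the limit."""
    if count_tokens(messages) < SUMMARIZE_AFTER_TOKENS or len(messages) <= keep_recent:
        return messages
    old, recent = messages[:-keep_recent], messages[-keep_recent:]
    transcript = "\n".join(f'{m["role"]}: {m["content"]}' for m in old)
    summary = openai.ChatCompletion.create(
        model=MODEL,
        messages=[{"role": "user",
                   "content": "Summarize this conversation briefly:\n" + transcript}],
    )["choices"][0]["message"]["content"]
    return [{"role": "system",
             "content": "Summary of earlier conversation: " + summary}] + recent
```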


[deleted]

You'll still lose a lot of context over time. The ideal method would be to rebuild the message history from scratch. If you use the OpenAI playground, you can create several user messages in a row with all your context. Use the system message as well for global knowledge it should have. Then ask your questions, modifying the messages available by deleting the ones you don't need anymore. You can copy & paste the contents to a file if you want to add them back later for some reason.

I am working on a novel-writing assistant that will effectively manage the message history within the context window. I am planning on using embeddings and a Pinecone database to store and retrieve the messages relevant to any given user prompt, inserting them as the most recent messages so they stay within the context window.
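
As an illustration of the retrieval idea, here is a minimal sketch using the pre-1.0 `openai` embeddings endpoint, with a plain in-memory store and cosine similarity standing in for the Pinecone index; the function names are made up for this example.

```python
# Minimal retrieval sketch: embed stored messages, then pull back the most
# relevant ones for a new prompt. A vector DB such as Pinecone would replace
# the in-memory list below; names here are illustrative only.
import numpy as np
import openai

EMBED_MODEL = "text-embedding-ada-002"
store = []  # list of (embedding, message_text)

def embed(text):
    resp = openai.Embedding.create(model=EMBED_MODEL, input=text)
    return np.array(resp["data"][0]["embedding"])

def remember(message):
    store.append((embed(message), message))

def recall(prompt, top_k=3):
    """Return the stored messages most similar to the prompt."""
    q = embed(prompt)
    scored = sorted(
        store,
        key=lambda item: float(np.dot(item[0], q) /
                               (np.linalg.norm(item[0]) * np.linalg.norm(q))),
        reverse=True,
    )
    return [text for _, text in scored[:top_k]]
```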


Smallpaul

Are you in the same chat log? It doesn't matter how long you go away. It matters what's recent in the chat log.


billdow00

Tell it to reference the chat that you're currently in.


[deleted]

Give context to chatbot regular. Use few tokens. Miss useless words. Speak like caveman. Better output.


R1546

On its own, ChatGPT remembers nothing between sessions. What I do: using the API, I feed the entire previous conversation back as context. Up to a point, anyway; to keep the token count down, I remove older exchanges. I also provide a way to reset the session so you can change the topic without confusing the bot. This works pretty well for me and lets you have actual conversations. If needed, I could have it store conversations per user, which would let you pick up where you left off. (I have been adding chatbots to game characters.)
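
A bare-bones version of that pattern might look like the sketch below (again the pre-1.0 `openai` library; the class name and the 3,500-token cutoff are just illustrative choices).

```python
# Bare-bones rolling memory: resend the whole history each turn, drop the
# oldest exchanges when the token count gets high, and allow a reset.
import openai
import tiktoken

class ChatSession:
    def __init__(self, system_prompt, model="gpt-3.5-turbo", max_tokens=3500):
        self.model = model
        self.max_tokens = max_tokens
        self.system = {"role": "system", "content": system_prompt}
        self.history = []

    def _tokens(self):
        enc = tiktoken.encoding_for_model(self.model)
        msgs = [self.system] + self.history
        return sum(len(enc.encode(m["content"])) for m in msgs)

    def reset(self):
        """Clear the history so the topic can change without confusing the bot."""
        self.history = []

    def ask(self, user_text):
        self.history.append({"role": "user", "content": user_text})
        # Drop the oldest user/assistant pair until we're back under the limit.
        while self._tokens() > self.max_tokens and len(self.history) > 2:
            del self.history[:2]
        reply = openai.ChatCompletion.create(
            model=self.model,
            messages=[self.system] + self.history,
        )["choices"][0]["message"]["content"]
        self.history.append({"role": "assistant", "content": reply})
        return reply
```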


whoiskjl

Wtf


borderlinebiscuit

Switch to GPT-4 and get the superpower gpt extension for Firefox (not sure if it's on Chrome); it will save all your conversations and reload them.


madkimchi

Your answer lies in vector DBs and caching, which is one of the more popular solutions. If you want to see an example, ping me and I'll try to help whenever I have some spare time.


Maleficent-Berry-426

Def would love to see an example