Create strategies for continuing amidst token limit issues #45
Comments
Thoughts on summarizing prior messages?
While I agree it's the obvious default, removing messages starting with the first doesn't seem like an assumption that would work well for most of my chats. While I sometimes go off on tangents, I try to keep to a contained task in each chat, so the first message usually includes the setup of the whole task. I think if it's removed, the chat may lose focus. I would like ChatGPT to include the option of deleting parts of the conversation: it sometimes isn't apparent for a few prompts that an answer wasn't helpful, and then I don't think one can get rid of it. Anyway, I don't think you can control that. Being able to define important prompts and responses might help? Then the default could be to delete the oldest prompt and response (but leave the first one and any others that the user protects), and the user could have a way of protecting prompts and responses. I'm not sure if that's feasible?
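The "delete the oldest, but leave the first message and any user-protected ones" idea above could be sketched roughly like this. This is a hypothetical illustration, not ChatLab's actual API: the `dict` message shape, the `protected` flag, and `count_tokens` (a crude characters-per-token stand-in for a real tokenizer) are all assumptions.

```python
# Hypothetical sketch: drop the oldest unprotected messages until the
# conversation fits under the token limit. The first message and any
# messages the user has marked "protected" are always kept.

def count_tokens(message: dict) -> int:
    # Crude stand-in for a real tokenizer: roughly 4 characters per token.
    return max(1, len(message["content"]) // 4)

def trim_messages(messages: list[dict], token_limit: int) -> list[dict]:
    kept = list(messages)
    while sum(count_tokens(m) for m in kept) > token_limit:
        # Walk oldest-first; skip the first message and protected ones.
        for i, m in enumerate(kept):
            if i == 0 or m.get("protected"):
                continue
            del kept[i]
            break
        else:
            break  # everything remaining is protected; give up
    return kept
```

The `else` on the `for` loop guards against an infinite loop when every remaining message is protected and the conversation still exceeds the limit.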
Maybe when the point is reached that old messages need removing, ChatGPT asks you to go back through and thumbs-up or thumbs-down the answers that have been helpful so far, and uses that to prioritise what to remove.
@nicosuave Now that's a cool idea. We could send (a portion of) the prior messages over to a model for summarization. Hopefully, it still has context for any recent debugging and identifiers it needs. What's a good prompt we can get started with?
We could store that as metadata alongside the actual messages. We could have an interactive widget in the notebook to clean up the prior messages or tag them.
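One possible shape for the summarization step: build a prompt from the oldest messages, replace them with a single summary message, and tag that message with metadata so notebook tooling can distinguish it. This is a sketch under assumptions — `SUMMARY_PROMPT`, `build_summary_request`, `compact`, and the injected `summarize` callable are all hypothetical names, and a real implementation would call a chat model rather than a test stub.

```python
# Hypothetical sketch: collapse the oldest messages into one summary
# message. `summarize` is injected so the model call stays out of scope
# here; in practice it would hit a chat completion endpoint.

SUMMARY_PROMPT = (
    "Summarize the conversation below for your own future reference. "
    "Preserve the original task, any identifiers (variable names, file "
    "paths, URLs), and the current state of debugging. Be concise."
)

def build_summary_request(messages: list[dict]) -> list[dict]:
    transcript = "\n\n".join(f"{m['role']}: {m['content']}" for m in messages)
    return [
        {"role": "system", "content": SUMMARY_PROMPT},
        {"role": "user", "content": transcript},
    ]

def compact(messages: list[dict], keep_recent: int, summarize) -> list[dict]:
    old, recent = messages[:-keep_recent], messages[-keep_recent:]
    if not old:
        return messages
    summary = summarize(build_summary_request(old))
    # Store the summary as a message, with metadata alongside it so a
    # widget could later inspect or undo the compaction.
    return [{"role": "system", "content": summary,
             "metadata": {"summary_of": len(old)}}] + recent
```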
I just tried these two prompts on one of my longest and most successful coding chats with GPT.
Sometimes ChatLab is off to the races cranking on analysis and then it runs out of tokens.
We need to do two things:
The simplest strategy could be "remove the first message until under the token limit". Advanced strategies could include taking out the system message or even parts of it.
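The simplest strategy named above could look something like this minimal sketch. The function names are hypothetical, and `estimate_tokens` is a crude characters-per-token heuristic; a real implementation would use the model's actual tokenizer.

```python
# Hypothetical sketch of the simplest strategy: pop messages from the
# front until the total estimated token count fits under the limit.

def estimate_tokens(text: str) -> int:
    # Crude heuristic: roughly 4 characters per token.
    return max(1, len(text) // 4)

def drop_oldest_until_fits(messages: list[dict], token_limit: int) -> list[dict]:
    messages = list(messages)
    while messages and sum(
        estimate_tokens(m["content"]) for m in messages
    ) > token_limit:
        messages.pop(0)
    return messages
```

An advanced variant would special-case the system message (or trim parts of it) instead of treating every message the same.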