Description
What feature would you like to see?
Currently, when a conversation for a large task (e.g., developing a complex piece of code, writing a long document, or working through a multi-step problem) approaches the context token limit, the session effectively ends. The workflow breaks, and the only workaround is to manually:
Summarize the progress and context from the previous chat.
Start a brand new chat.
Paste the summary and try to prompt the model to pick up where it left off.
This manual process is cumbersome and error-prone, and, most importantly, it completely shatters the creative and logical flow. It feels like trying to build a skyscraper but having to carry your tools up from the ground floor by hand every 20 stories.
The Solution: Automatic "Session Chaining"
I envision a feature where the system can automatically handle this transition.
Proactive Monitoring: The system detects when the current session's token count is approaching its limit (e.g., at 99% capacity).
Automated Handoff Trigger: Instead of just failing or cutting off, it triggers an automated "handoff" process.
Intelligent Summarization: The model internally generates a concise summary of the current task, including key context, progress, code snippets, and the immediate next step.
Seamless New Session: It then automatically initiates a new session, prepending the generated summary as the foundational context.
Uninterrupted Continuation: The model then seamlessly continues with the task, picking up exactly where it left off, without the user needing to do anything.
To the user, this would feel like one continuous, infinite conversation, even though it's technically a chain of linked sessions behind the scenes.
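To make the handoff loop concrete, here is a minimal sketch in Python of how such chaining could work. Everything in it is hypothetical: `Session`, `count_tokens`, `complete`, the 128k context limit, and the handoff prompt text are illustrative placeholders rather than existing Codex APIs; the 99% threshold just mirrors the example above.

```python
from dataclasses import dataclass, field

CONTEXT_LIMIT = 128_000    # assumed context window size in tokens (illustrative)
HANDOFF_THRESHOLD = 0.99   # trigger the handoff near capacity, as in the example above

HANDOFF_PROMPT = (
    "Summarize the current task for a fresh session: key context, progress so far, "
    "relevant code snippets, and the immediate next step."
)

@dataclass
class Session:
    messages: list[dict] = field(default_factory=list)

    def append(self, role: str, content: str) -> None:
        self.messages.append({"role": role, "content": content})

def count_tokens(messages: list[dict]) -> int:
    # Placeholder: a real implementation would use the model's tokenizer.
    return sum(len(m["content"]) // 4 for m in messages)

def complete(messages: list[dict]) -> str:
    # Placeholder for the actual model call; returns a canned reply here.
    return "(model reply)"

def send(session: Session, user_input: str) -> tuple[Session, str]:
    """Send one user turn, chaining to a new session when the context is nearly full."""
    if count_tokens(session.messages) >= HANDOFF_THRESHOLD * CONTEXT_LIMIT:
        # Intelligent summarization: ask the model to distill the session so far.
        summary = complete(session.messages + [{"role": "user", "content": HANDOFF_PROMPT}])
        # Seamless new session: seed it with the summary as foundational context.
        session = Session()
        session.append(
            "system",
            "Continuation of a previous session. Handoff summary:\n" + summary,
        )
    # Uninterrupted continuation: the user just keeps talking to one conversation.
    session.append("user", user_input)
    reply = complete(session.messages)
    session.append("assistant", reply)
    return session, reply
```

The point is only that the capacity check, summarization, and re-seeding could live behind a single call, so the user-facing conversation never visibly restarts.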
Additional information
The Benefits:
Unbroken Workflow: This would be a game-changer for maintaining focus and momentum on large-scale projects.
Enables More Complex Tasks: It would unlock the ability to tackle even more ambitious projects that are currently impractical due to context limits.
Vastly Improved User Experience: It removes a major point of friction and makes the interaction feel truly intelligent and collaborative.
This feature would elevate Codex from a powerful tool to a true long-term project partner.
Thanks for considering this! Hope to see something like this in the future.