I'm currently using DeepSeek for detailed engineering and quantum mechanics projects, but I'm really struggling with the 128k token limit. I can't afford to lose any of my chat history or information. I even tried waiting 24 hours, thinking that might help, but it didn't work, and it's really stressing me out.
2 Answers
Is there any chance you could copy the entire chat and paste it into a new DeepSeek conversation? I wonder if that would help keep your information intact.
Unfortunately, the 128k token limit is a hard cap on the model's context window, and there's no practical way to exceed it. If you run the model on your own hardware you can stretch the context somewhat, but past its trained window the model tends to go off the rails and produce nonsensical output. Your best bet is to summarize the key points from your chats and start a new conversation from that summary. Alternatively, you might look at a model with a larger context window, such as Gemini 2.5 Pro.
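If you want a quick sense of how close a chat is to the cap before deciding what to summarize, a rough character-based estimate is enough. This is a minimal sketch: the ~4 characters-per-token ratio is a common rule of thumb for BPE tokenizers on English text, not DeepSeek's actual tokenizer, so treat the result as a ballpark figure only.

```python
# Rough context-budget check. The 4-chars-per-token ratio is a
# heuristic for English text, NOT DeepSeek's real tokenizer.

CONTEXT_LIMIT = 128_000  # DeepSeek's context window, in tokens


def estimate_tokens(text: str) -> int:
    """Approximate token count of `text` (~4 chars per token)."""
    return max(1, len(text) // 4)


def context_budget_used(chat_log: str) -> float:
    """Fraction of the 128k-token window the chat log roughly consumes."""
    return estimate_tokens(chat_log) / CONTEXT_LIMIT


if __name__ == "__main__":
    sample = "word " * 10_000  # ~50k characters of placeholder chat
    print(f"~{estimate_tokens(sample):,} tokens, "
          f"{context_budget_used(sample):.0%} of the window")
    # → ~12,500 tokens, 10% of the window
```

Once the estimate gets anywhere near 100%, that's the cue to condense the chat and carry only the summary forward.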