Request for "Copilot Ultra" Tier (Uncapped Context Windows, 1M+ Tokens) #189163
Replies: 2 comments
-
💬 Your Product Feedback Has Been Submitted 🎉

Thank you for taking the time to share your insights with us! Your feedback is invaluable as we build a better GitHub experience for all our users. Here's what you can expect moving forward ⏩
Where to look to see what's shipping 👀
What you can do in the meantime 💻
As a member of the GitHub community, your participation is essential. While we can't promise that every suggestion will be implemented, we want to emphasize that your feedback is instrumental in guiding our decisions and priorities. Thank you once again for your contribution to making GitHub even better! We're grateful for your ongoing support and collaboration in shaping the future of our platform. ⭐
-
Select Topic Area: Product Feedback
Copilot Feature Area: VS Code
Body:
I've been using Copilot Pro+ extensively for large-scale architectural work, but even the 400k-token limit on GPT-5.4 is becoming a major bottleneck.
When working in deep monorepos or long-running refactoring sessions, context compaction starts dropping critical details far too early. Given that the underlying models now support 1M+ tokens, I'm curious whether others are hitting this "context ceiling" as well.
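To put the ceiling in perspective, here is a back-of-the-envelope sketch of how quickly a monorepo outgrows a 400k-token window. It assumes roughly 4 characters per token (a common heuristic; real tokenizers vary by language and content), and the file counts, sizes, and extension filter are illustrative, not measurements of any particular repo:

```python
import os

# Assumption: ~4 characters per token, a rough heuristic for source code.
CHARS_PER_TOKEN = 4

def estimate_tokens(num_chars: int) -> int:
    """Estimate token count from a character count (heuristic, not exact)."""
    return num_chars // CHARS_PER_TOKEN

def repo_token_estimate(root: str, exts=(".py", ".ts", ".go", ".java")) -> int:
    """Walk a repo and estimate total tokens across source files by size."""
    total_chars = 0
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            if name.endswith(exts):
                try:
                    total_chars += os.path.getsize(os.path.join(dirpath, name))
                except OSError:
                    pass  # skip unreadable files
    return estimate_tokens(total_chars)

if __name__ == "__main__":
    # Hypothetical monorepo: 10,000 source files averaging 8 KB each.
    files, avg_bytes = 10_000, 8_000
    print(estimate_tokens(files * avg_bytes))  # ~20M tokens estimated
```

On these (assumed) numbers, even a modest slice of such a repo blows past 400k tokens, and the whole tree would dwarf a 1M-token window too, which is why aggressive compaction kicks in so early.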
To the GitHub Team:
I would be more than happy to pay for an "Ultra" or "Uncapped" tier above Pro+ to unlock the full 1M+ token window. For those of us working on massive codebases, the productivity gain would be huge.