Published August 15, 2024 by Gonzalo
I recently saw a tweet from a fellow YC founder explaining that despite all of the AI coding tools, he still prefers to copy-paste code in and out of Claude’s web client, instead of relying on retrieval algorithms.

His reasoning resonated with me:

- Preparing snippets of code for an LLM to ingest forces you to organize and clarify your thoughts
- Similar to rubber ducking, you externalize your thought process by presenting your code to an external entity

Context Control with Double
Engineering the context inside and across context windows is a huge factor in getting an LLM to produce high-quality code. At Double, our mission is to build the best context management UX of any LLM product, prioritizing visibility: you should never have to wonder which parts of your codebase are being used, how they're being selected, how much space you've got left in a context window, or how context is managed across windows. In the video below, I'll walk you through Double's current context management and how we're working to make it even better.
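To make the idea of a visible token budget concrete, here's a minimal sketch of one way snippet selection against a context window could work. Everything here is hypothetical: the function names and the 4-characters-per-token heuristic are illustrative assumptions, not Double's actual implementation or tokenizer.

```python
def estimate_tokens(text: str) -> int:
    """Crude token estimate: roughly 4 characters per token for English/code.
    A real system would use the model's actual tokenizer. (Assumption.)"""
    return max(1, len(text) // 4)


def pack_snippets(snippets: list[str], budget: int) -> list[str]:
    """Greedily include snippets, in priority order, until the token budget
    is exhausted. Skipped snippets simply never enter the context window."""
    chosen, used = [], 0
    for snip in snippets:
        cost = estimate_tokens(snip)
        if used + cost <= budget:
            chosen.append(snip)
            used += cost
    return chosen


# Highest-priority snippets first; the long utility module gets dropped
# once the budget runs out, and the user can see exactly why.
snippets = [
    "def auth(user): ...",
    "class Session: ...",
    "# a long utility module " * 50,
]
window = pack_snippets(snippets, budget=40)
print(len(window), "snippets fit in the window")
```

The point of surfacing a budget like this in the UI is that selection stops being a black box: you can see what made it in, what got cut, and how much room remains.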
If after reading this you have any feedback or questions, please reach me at help@double.bot. I personally read and reply to every email.