When you run a tool like Claude Code, it seems to load all the files in the working directory and use them as context, so you can ask questions like "summarize this project".
The context pushed by vim-ollama is a configurable number of lines before and after the current line. Would it be possible to extend that context to the whole working directory, like Claude Code does? This should improve code completion considerably, because the LLM would be aware of the whole project.
Sorry if my suggestion is a bit naive (and I think it is); I'm not familiar with LLMs.