Conversation

@dangazineu
Collaborator

`vyb init` now asks the user for their LLM provider of choice (from a list of supported providers, currently only OpenAI).

…refactor

The "What is left to do" section in 0_TODO_vyb.md now contains a detailed,
chronologically ordered list of atomic changes required to migrate the
codebase to the new provider-agnostic `llm` architecture. Each item keeps
the workspace in a buildable and testable state and explicitly mentions
accompanying test and documentation updates.
Adds two strongly-typed enumerations that will be used by higher-level
logic to reference LLM models in a provider-agnostic way.

• llm/types.go – defines ModelFamily and ModelSize string types with
  constants plus String() helpers.
• llm/types_test.go – unit tests asserting the String() behaviour and
  compile-time exhaustiveness via switch statements.
• llm/README.md – documents the new enumerations under a dedicated
  "Model abstractions" section.
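The enumerations described above can be sketched roughly as follows. This is a minimal illustration, not the project's actual source: the constant names (other than ModelFamilyReasoning and ModelSizeLarge, which appear later in this PR) and the specific family/size values are assumptions.

```go
package main

import "fmt"

// ModelFamily and ModelSize are provider-agnostic identifiers that
// higher-level logic uses instead of concrete, provider-specific
// model names.
type ModelFamily string
type ModelSize string

const (
	// Family/size values are illustrative placeholders.
	ModelFamilyReasoning ModelFamily = "reasoning"

	ModelSizeSmall ModelSize = "small"
	ModelSizeLarge ModelSize = "large"
)

// String helpers mirror the ones the PR description mentions.
func (f ModelFamily) String() string { return string(f) }
func (s ModelSize) String() string   { return string(s) }

func main() {
	fmt.Println(ModelFamilyReasoning.String(), ModelSizeLarge.String())
}
```

Keeping these as distinct string types (rather than bare strings) lets the compiler catch a family passed where a size is expected.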
Adds new package `config` with:
* Struct `Config` containing the single field `Provider`.
* Helper `Load(projectRoot string)` that looks for `.vyb/config.yaml` under the given root and unmarshals it.
  When the file is missing or malformed a default configuration is returned (provider "openai").
* Helper `LoadFS(fsys fs.FS)` mirroring `Load` but operating directly on an `fs.FS`, useful for tests.

Unit tests (`config/config_test.go`) exercise both the success and fallback paths with an in-memory filesystem (`fstest.MapFS`).
Extend the `vyb init` command so it now also creates a configuration
file under `.vyb/config.yaml` when it does not exist yet.

Behaviour
---------
1. After generating project metadata (`project.Create(".")`) the command
   checks the presence of `.vyb/config.yaml`.
2. If the file is missing, the user is prompted (via `survey`) to choose
   the desired provider from the supported list (currently only
   "openai").
3. In non-interactive environments (or on prompt failure) the selection
   silently falls back to the default provider defined in
   `config.Default()`.
4. The chosen provider is marshalled to YAML through the existing
   `config.Config` struct and written to `.vyb/config.yaml`.
5. When the config file already exists the command leaves it untouched.

A new dependency on `github.com/AlecAivazis/survey/v2` is introduced and
recorded in `go.mod`.
Add an internal provider interface inside the `llm` package mirroring the
existing OpenAI façade helpers (workspace change proposals, module context,
and external contexts). Implement `resolveProvider(projectRoot string)`,
which loads `.vyb/config.yaml` via `config.Load` and returns a concrete
provider based on `cfg.Provider`.

Wrap the existing `llm/openai` helpers behind an `openAIProvider` type so
they satisfy the new interface without changing their behaviour.

Supported providers are matched in a simple switch with a descriptive
error for unknown values. This establishes the groundwork for future
providers while keeping current code paths unchanged.
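The interface-plus-switch shape described above might look like this. A minimal sketch, not the project's code: the interface is trimmed to a single method, and the method signature and error wording are assumptions.

```go
package main

import "fmt"

// provider mirrors the OpenAI façade helpers; the real interface also
// covers module context and external contexts.
type provider interface {
	ProposeWorkspaceChange(prompt string) (string, error)
}

// openAIProvider wraps the existing llm/openai helpers (stubbed here).
type openAIProvider struct{}

func (openAIProvider) ProposeWorkspaceChange(prompt string) (string, error) {
	return "", fmt.Errorf("stubbed in this sketch")
}

// resolveProvider matches the configured provider in a simple switch,
// with a descriptive error for unknown values.
func resolveProvider(name string) (provider, error) {
	switch name {
	case "openai":
		return openAIProvider{}, nil
	default:
		return nil, fmt.Errorf("unsupported LLM provider %q (supported: openai)", name)
	}
}

func main() {
	if _, err := resolveProvider("openai"); err == nil {
		fmt.Println("openai resolved")
	}
	if _, err := resolveProvider("unknown"); err != nil {
		fmt.Println(err)
	}
}
```

Note the switch as sketched is case-sensitive, which is the mismatch the review comments below this PR description call out.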
…er model names and accept *config.Config param. Updated annotation and template code to load project config and call the new helpers. Added mapModel and updated llm dispatcher interface to new signatures preserving functionality.
…elects correct provider implementation or errors for unknown provider
Extend the top-level README with a new `.vyb/config.yaml` section explaining the
LLM provider choice made during `vyb init` and how `(family, size)` model
specs are resolved internally. Update the command template README to list the new
`model` field (family + size) so authors know how to configure templates under
`cmd/template/*`.

No functional code changes – documentation only.
@dangazineu dangazineu requested a review from Copilot June 11, 2025 01:21

Copilot AI left a comment

Pull Request Overview

Adds a facade in the llm package to dispatch LLM calls based on user-selected providers and abstracts model selection by family/size.

  • Introduces SupportedProviders and resolveProvider to route calls through a common provider interface
  • Implements mapModel in the OpenAI backend and updates functions to accept ModelFamily & ModelSize
  • Updates vyb init and templates to prompt for and persist the chosen provider and model settings

Reviewed Changes

Copilot reviewed 24 out of 24 changed files in this pull request and generated 1 comment.

Summary per file:

  • llm/providers.go: Added SupportedProviders and internal list of providers
  • llm/dispatcher.go: Introduced provider interface and dispatch wrappers
  • llm/dispatcher_test.go: Added tests for resolveProvider
  • llm/internal/openai/openai.go: Added mapModel and updated workspace proposal to use family/size
  • llm/README.md: Documented model abstractions and family/size mapping
  • config/types.go: Defined ModelFamily and ModelSize enums
  • config/types_test.go: Added tests for enum string methods
  • config/config.go: Config loading now includes provider with sane defaults
  • config/config_test.go: Tests for default and file-based provider loading
  • cmd/init.go: Prompts user to choose provider during vyb init
  • cmd/template/template.go: Loads config and passes family/size to the LLM facade
  • cmd/template/loader.go: Sets default model in loader and unmarshals into Definition
  • cmd/template/README.md: Added docs for optional model field in templates
  • README.md: Updated quick-start and documented project config
  • go.mod: Added survey/v2 and bumped indirect dependencies
Comments suppressed due to low confidence (3)

llm/dispatcher.go:62

  • resolveProvider uses case-sensitive matching, but documentation states provider names are case-insensitive. Consider normalizing cfg.Provider (e.g., strings.ToLower) before the switch or using case-insensitive comparison.
case "openai":
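A minimal sketch of the normalization this comment suggests, assuming the surrounding switch shape; the helper name `normalizeProvider` is hypothetical, not part of the PR:

```go
package main

import (
	"fmt"
	"strings"
)

// normalizeProvider lowercases (and trims) the configured provider so
// "OpenAI", "OPENAI", and "openai" all match the same switch case,
// aligning behaviour with the documented case-insensitivity.
func normalizeProvider(provider string) (string, error) {
	switch strings.ToLower(strings.TrimSpace(provider)) {
	case "openai":
		return "openai", nil
	default:
		return "", fmt.Errorf("unsupported LLM provider %q", provider)
	}
}

func main() {
	for _, p := range []string{"openai", "OpenAI", "claude"} {
		name, err := normalizeProvider(p)
		fmt.Println(p, name, err == nil)
	}
}
```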

llm/internal/openai/openai.go:64

  • Add unit tests covering mapModel for each ModelFamily/ModelSize combination and an unsupported case to ensure mappings and error paths behave as expected.
func mapModel(fam config.ModelFamily, sz config.ModelSize) (string, error) {
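A table-driven test along the lines this comment asks for could be sketched as follows. The mapping table here is a placeholder (the concrete OpenAI model names in the real `mapModel` are not shown in this PR), so both the model strings and the covered combinations are assumptions.

```go
package main

import "fmt"

type ModelFamily string
type ModelSize string

// mapModel is a stand-in for the OpenAI backend's mapping; the model
// names below are illustrative, not the project's real table.
func mapModel(fam ModelFamily, sz ModelSize) (string, error) {
	switch {
	case fam == "reasoning" && sz == "large":
		return "model-large", nil
	case fam == "reasoning" && sz == "small":
		return "model-small", nil
	default:
		return "", fmt.Errorf("unsupported model %s/%s", fam, sz)
	}
}

func main() {
	// One case per supported combination plus an unsupported one,
	// exercising both the mapping and the error path.
	cases := []struct {
		fam ModelFamily
		sz  ModelSize
		ok  bool
	}{
		{"reasoning", "large", true},
		{"reasoning", "small", true},
		{"chat", "large", false},
	}
	for _, c := range cases {
		_, err := mapModel(c.fam, c.sz)
		if (err == nil) != c.ok {
			fmt.Println("MISMATCH", c.fam, c.sz)
		} else {
			fmt.Println("ok", c.fam, c.sz)
		}
	}
}
```

In a real test file the loop body would call `t.Errorf` inside `t.Run` subtests instead of printing.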

README.md:102

  • Documentation states provider names are case-insensitive, but resolveProvider currently matches only lowercase. Update docs or code to ensure behavior aligns.
The provider string is case-insensitive and must match one of the options returned by `vyb llm.SupportedProviders()`.


cmd/template/loader.go (truncated diff context):

    var cmdDef Definition
    if err := yaml.Unmarshal(data, &cmdDef); err != nil {
    cmdDef := &Definition{

Copilot AI Jun 11, 2025

[nitpick] The default model (ModelFamilyReasoning, ModelSizeLarge) is hardcoded here; consider extracting these defaults into named constants or a shared configuration to centralize fallback values.

@dangazineu dangazineu merged commit a7c8528 into vybdev:main Jun 11, 2025
1 check passed