toubatbrian (Contributor)
No description provided.

changeset-bot bot commented Sep 15, 2025

⚠️ No Changeset found

Latest commit: fb9943f

Merging this PR will not cause a version bump for any packages. If these changes should not result in a new version, you're good to go. If these changes should result in a version bump, you need to add a changeset.

This PR includes no changesets

When changesets are added to this PR, you'll see the packages that this PR includes changesets for and the associated semver types


@toubatbrian toubatbrian changed the base branch from main to brian/bump-lk-node September 16, 2025 04:29
Base automatically changed from brian/bump-lk-node to main September 16, 2025 06:49
@toubatbrian toubatbrian marked this pull request as ready for review September 18, 2025 04:10
maxRetries?: number;
timeout?: number;
verbosity?: Verbosity;
extraKwargs?: LLMOptions<TModel>;
@bcherry (Contributor), Sep 19, 2025:

Can we just call this `extra` or `extraArgs`, and also rename to match on the Python side? It's not great to leak the word "kwargs" into TypeScript, since that's a Python word. cc @longcw @theomonnom
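A minimal sketch of what the proposed rename might look like. The interface name and the extra-options shape here are illustrative, not the plugin's actual types; only the field names `maxRetries`, `timeout`, and the `extraKwargs`-to-`extraArgs` rename come from the discussion above.

```typescript
// Hypothetical option bag illustrating the proposed rename:
// "extraKwargs" -> "extraArgs", avoiding the Python-specific term.
interface ChatOptions<TExtra extends Record<string, unknown>> {
  maxRetries?: number;
  timeout?: number;
  // Provider-specific passthrough options, renamed from extraKwargs.
  extraArgs?: TExtra;
}

const opts: ChatOptions<{ logprobs?: boolean }> = {
  maxRetries: 3,
  extraArgs: { logprobs: true },
};
```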

// SPDX-FileCopyrightText: 2025 LiveKit, Inc.
//
// SPDX-License-Identifier: Apache-2.0
export { LLM, LLMStream, type LLMModels, type LLMOptions, type OpenAIModels } from './llm.js';
Contributor:

Suggested change
- export { LLM, LLMStream, type LLMModels, type LLMOptions, type OpenAIModels } from './llm.js';
+ import * as llm from './llm.js';
+ import * as stt from './stt.js';
+ import * as tts from './tts.js';
+ export { stt, tts, llm };
+ export { LLM, LLMStream, type LLMModels, type LLMOptions, type OpenAIModels } from './llm.js';

Can we also export the whole llm, stt, and tts modules so that people can access the underlying OpenAiOptions, CerebrasOptions, etc types in generated docs? (we can't just export them normally since CartesiaOptions appears in both STT and TTS)
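The namespace re-export pattern in the suggestion keeps colliding type names apart. A self-contained sketch (using inline `namespace` blocks in place of the real `./stt.js` and `./tts.js` files, with made-up option fields) of how consumers would then disambiguate:

```typescript
// Sketch: two modules that each export a CartesiaOptions type.
// In the real plugin these would be separate files re-exported via
// `import * as stt from './stt.js'`; namespaces stand in here.
namespace stt {
  export interface CartesiaOptions {
    language?: string; // hypothetical field
  }
}
namespace tts {
  export interface CartesiaOptions {
    voice?: string; // hypothetical field
  }
}

// Consumers disambiguate the colliding name via the namespace:
const sttOpts: stt.CartesiaOptions = { language: 'en' };
const ttsOpts: tts.CartesiaOptions = { voice: 'some-voice' };
```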

keyterms_prompt?: string[]; // default: not specified
}

export type STTModels = DeepgramModels | CartesiaModels | AssemblyaiModels;
Contributor:

I think you need `| string` here (or at consumption sites) so people can adopt new models added to the service that aren't yet in the type enums (same for the other types).
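One common way to do this is the "open enum" pattern: union the known literals with `(string & {})`, which accepts any string while still letting editors autocomplete the known names (a plain `| string` also works but collapses the union and loses autocomplete). A sketch with illustrative model names:

```typescript
// Known model literals (names here are illustrative).
type DeepgramModels = 'nova-2' | 'nova-3';

// Open union: known literals keep autocomplete, but any string is valid,
// so users can adopt models the enum hasn't caught up with yet.
type STTModels = DeepgramModels | (string & {});

const known: STTModels = 'nova-3';       // a listed literal
const future: STTModels = 'brand-new-model'; // not listed, still accepted
```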

@bcherry (Contributor) commented Sep 19, 2025

I see a bunch of this warning in my terminal when I run the starter agent in dev with the gateway integrated:

(node:50680) MaxListenersExceededWarning: Possible EventTarget memory leak detected. 11 abort listeners added to [AbortSignal]. MaxListeners is 10. Use events.setMaxListeners() to increase limit

@Shubhrakanti (Contributor)

> I see a bunch of this error in my terminal when i run the starter agent in dev with the gateway integrated
>
> (node:50680) MaxListenersExceededWarning: Possible EventTarget memory leak detected. 11 abort listeners added to [AbortSignal]. MaxListeners is 10. Use events.setMaxListeners() to increase limit

@toubatbrian Maybe too many Tasks?
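If the listener count is legitimate (many concurrent tasks sharing one signal) rather than a leak, Node's `events.setMaxListeners` can raise the cap for a specific `AbortSignal` instead of globally. A minimal sketch; the limit of 50 is an arbitrary example, not a recommendation from this thread:

```typescript
import { setMaxListeners } from 'node:events';

const controller = new AbortController();

// setMaxListeners accepts EventTarget instances (AbortSignal included)
// in addition to EventEmitters; this raises the warning threshold for
// this one signal rather than process-wide.
setMaxListeners(50, controller.signal);

// Many tasks can now register abort handlers on the same signal
// without tripping MaxListenersExceededWarning at the default of 10.
for (let i = 0; i < 20; i++) {
  controller.signal.addEventListener('abort', () => {
    /* cancel task i */
  });
}
```

If the listeners are instead never being removed after tasks finish, the right fix is cleanup (`removeEventListener` or `{ once: true }`), not a higher cap.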

@Shubhrakanti (Contributor)

Approved by accident. Please get @bcherry's approval before merging.
