Qwen #6538

Changes from 4 commits
15 changes: 15 additions & 0 deletions app/SyncOnFirstLoad.tsx
@@ -0,0 +1,15 @@
"use client";
import { useEffect } from "react";
import { useSyncStore } from "./store/sync";

export default function SyncOnFirstLoad() {
const syncStore = useSyncStore();

useEffect(() => {
if (syncStore.lastSyncTime === 0) {
syncStore.sync();
}
}, []);
⚠️ Potential issue

Missing dependency in useEffect hook

The useEffect hook uses syncStore but doesn't include it in the dependency array. This could lead to stale closures.

Apply this fix:

   useEffect(() => {
     if (syncStore.lastSyncTime === 0) {
       syncStore.sync();
     }
-  }, []);
+  }, [syncStore]);
πŸ“ Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
useEffect(() => {
if (syncStore.lastSyncTime === 0) {
syncStore.sync();
}
}, []);
useEffect(() => {
if (syncStore.lastSyncTime === 0) {
syncStore.sync();
}
}, [syncStore]);
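One thing to note about the suggested fix: if `useSyncStore()` returns a fresh reference on each render, adding it to the deps makes the effect re-fire, so the intent "sync once on first load" needs its own guard. Here is a plain-TypeScript sketch of that run-once behavior; it deliberately avoids React so it runs standalone, and `makeOneShot`/`syncCalls` are illustrative names, not code from this PR (inside the component the equivalent is a `useRef(false)` flag checked in the effect).

```typescript
// Hypothetical one-shot guard: mirrors what a hasSynced ref would do inside
// the effect, so sync() fires at most once even if the effect re-runs after
// `syncStore` is added to the dependency array.
function makeOneShot(fn: () => void): () => void {
  let done = false;
  return () => {
    if (done) return;
    done = true;
    fn();
  };
}

let syncCalls = 0;
const syncOnce = makeOneShot(() => {
  syncCalls += 1; // stands in for syncStore.sync()
});

syncOnce(); // first effect run: sync fires
syncOnce(); // later re-run with a fresh store reference: no-op
```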


return null;
}
4 changes: 4 additions & 0 deletions app/api/[provider]/[...path]/route.ts
@@ -20,7 +20,11 @@ async function handle(
req: NextRequest,
{ params }: { params: { provider: string; path: string[] } },
) {
// Handle OPTIONS request for CORS preflight
// params.provider = MODEL_PROVIDER;

const apiPath = `/api/${params.provider}`;

console.log(`[${params.provider} Route] params `, params);
switch (apiPath) {
case ApiPath.Azure:
104 changes: 76 additions & 28 deletions app/api/alibaba.ts
@@ -1,22 +1,16 @@
import { getServerSideConfig } from "@/app/config/server";
import {
ALIBABA_BASE_URL,
ApiPath,
ModelProvider,
ServiceProvider,
} from "@/app/constant";
import { ALIBABA_BASE_URL, ApiPath, ModelProvider } from "@/app/constant";
import { prettyObject } from "@/app/utils/format";
import { NextRequest, NextResponse } from "next/server";
import { auth } from "@/app/api/auth";
import { isModelNotavailableInServer } from "@/app/utils/model";

const serverConfig = getServerSideConfig();

export async function handle(
req: NextRequest,
{ params }: { params: { path: string[] } },
) {
console.log("[Alibaba Route] params ", params);
// console.log("[Alibaba Route] params ", params);
πŸ› οΈ Refactor suggestion

Remove commented console.log statements instead of leaving them commented out.

Commented-out debug logs should be removed entirely to keep the codebase clean, rather than left as commented code.

-  // console.log("[Alibaba Route] params ", params);
+
-    // console.error("[Alibaba] ", e);
+
-  // console.log("[Alibaba] fetchUrl", fetchUrl);
+
-  // console.log("[Proxy] Alibaba options: ", fetchOptions);
+
-        // console.log("[Alibaba] custom models", current_model);
+
-        // console.log("[Alibaba] request body json", jsonBody);
+
-      // console.log("[Alibaba] request body", fetchOptions.body);
+
-      // console.error(`[Alibaba] filter`, e);
+

Also applies to: 30-30, 65-65, 81-81, 103-103, 131-131, 138-138, 159-159



if (req.method === "OPTIONS") {
return NextResponse.json({ body: "OK" }, { status: 200 });
@@ -42,7 +36,9 @@ async function request(req: NextRequest) {
const controller = new AbortController();

// alibaba use base url or just remove the path
let path = `${req.nextUrl.pathname}`.replaceAll(ApiPath.Alibaba, "");
let path = `${req.nextUrl.pathname}`
.replaceAll(ApiPath.Alibaba, "")
.replace("/api", "");

let baseUrl = serverConfig.alibabaUrl || ALIBABA_BASE_URL;

@@ -65,6 +61,9 @@ async function request(req: NextRequest) {
);

const fetchUrl = `${baseUrl}${path}`;

console.log("[Alibaba] fetchUrl", fetchUrl);

const fetchOptions: RequestInit = {
headers: {
"Content-Type": "application/json",
@@ -83,28 +82,77 @@ async function request(req: NextRequest) {
if (serverConfig.customModels && req.body) {
try {
const clonedBody = await req.text();
fetchOptions.body = clonedBody;
let jsonBody: any = {};

try {
jsonBody = JSON.parse(clonedBody);

// Move input.messages to messages at the root level if present
if (jsonBody.input && Array.isArray(jsonBody.input.messages)) {
jsonBody.messages = jsonBody.input.messages;

// Remove input.messages to avoid duplication
delete jsonBody.input;

jsonBody.stream = true;
}
Comment on lines +87 to +100

⚠️ Potential issue

Fix performance issue with delete operator.

The delete operator can impact performance when used on objects.

Replace the delete operation with object destructuring:

-        // Move input.messages to messages at the root level if present
-        if (jsonBody.input && Array.isArray(jsonBody.input.messages)) {
-          jsonBody.messages = jsonBody.input.messages;
-
-          // Remove input.messages to avoid duplication
-          delete jsonBody.input;
-
-          jsonBody.stream = true;
-        }
+        // Move input.messages to messages at the root level if present
+        if (jsonBody.input && Array.isArray(jsonBody.input.messages)) {
+          const { input, ...bodyWithoutInput } = jsonBody;
+          jsonBody = {
+            ...bodyWithoutInput,
+            messages: input.messages,
+            stream: true
+          };
+        }
πŸ“ Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
let jsonBody: any = {};
try {
jsonBody = JSON.parse(clonedBody);
// Move input.messages to messages at the root level if present
if (jsonBody.input && Array.isArray(jsonBody.input.messages)) {
jsonBody.messages = jsonBody.input.messages;
// Remove input.messages to avoid duplication
delete jsonBody.input;
jsonBody.stream = true;
}
let jsonBody: any = {};
try {
jsonBody = JSON.parse(clonedBody);
// Move input.messages to messages at the root level if present
if (jsonBody.input && Array.isArray(jsonBody.input.messages)) {
const { input, ...bodyWithoutInput } = jsonBody;
jsonBody = {
...bodyWithoutInput,
messages: input.messages,
stream: true
};
}
🧰 Tools
πŸͺ› Biome (1.9.4)

[error] 95-95: Avoid the delete operator which can impact performance.

Unsafe fix: Use an undefined assignment instead.

(lint/performance/noDelete)

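The destructuring rewrite is easy to verify in isolation. A self-contained sketch, with `AlibabaBody` and `hoistInputMessages` as illustrative names rather than code from this PR:

```typescript
type AlibabaBody = { input?: { messages: unknown[] }; [k: string]: unknown };

// Hoist input.messages to the root without the `delete` operator: rest
// destructuring builds a new object, so nothing is mutated in place.
function hoistInputMessages(body: AlibabaBody): Record<string, unknown> {
  if (!body.input || !Array.isArray(body.input.messages)) return body;
  const { input, ...rest } = body;
  return { ...rest, messages: input!.messages, stream: true };
}

const out = hoistInputMessages({
  model: "qwen-vl-plus",
  input: { messages: [{ role: "user", content: "hi" }] },
});
// `out` has no `input` key; messages sit at the root and stream is true.
```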


const current_model = jsonBody?.model;
console.log("[Alibaba] custom models", current_model);

//kiem tra xem model co phai la qwen-vl hay khong (vision model)
if (current_model && current_model.startsWith("qwen-vl")) {
console.log("[Alibaba] current model is qwen-vl");
console.log("xu ly hinh anh trong message");

// Reformat image objects in messages
if (Array.isArray(jsonBody.messages)) {
jsonBody.messages = jsonBody.messages.map((msg: any) => {
if (Array.isArray(msg.content)) {
msg.content = msg.content.map((item: any) => {
if (item && typeof item === "object" && "image" in item) {
return {
type: "image_url",
image_url: {
url: item.image,
},
};
}
return item;
});
}
return msg;
});
}
}
Comment on lines +105 to +129

πŸ› οΈ Refactor suggestion

Refactor vision model processing and use English comments.

The vision model processing logic has several maintainability issues:

  1. Comments should be in English for better team collaboration
  2. The complex nested transformation logic should be extracted to a helper function

Consider refactoring like this:

-        //kiem tra xem model co phai la qwen-vl hay khong (vision model)
-        if (current_model && current_model.startsWith("qwen-vl")) {
-          console.log("[Alibaba] current model is qwen-vl");
-          console.log("xu ly hinh anh trong message");
-
-          // Reformat image objects in messages
-          if (Array.isArray(jsonBody.messages)) {
-            jsonBody.messages = jsonBody.messages.map((msg: any) => {
-              if (Array.isArray(msg.content)) {
-                msg.content = msg.content.map((item: any) => {
-                  if (item && typeof item === "object" && "image" in item) {
-                    return {
-                      type: "image_url",
-                      image_url: {
-                        url: item.image,
-                      },
-                    };
-                  }
-                  return item;
-                });
-              }
-              return msg;
-            });
-          }
-        }
+        // Check if model is a qwen-vl vision model
+        if (current_model?.startsWith("qwen-vl")) {
+          console.log("[Alibaba] Processing vision model:", current_model);
+          jsonBody.messages = transformVisionMessages(jsonBody.messages);
+        }

Add this helper function:

function transformVisionMessages(messages: any[]): any[] {
  if (!Array.isArray(messages)) return messages;
  
  return messages.map((msg) => {
    if (!Array.isArray(msg.content)) return msg;
    
    return {
      ...msg,
      content: msg.content.map((item) => {
        if (item && typeof item === "object" && "image" in item) {
          return {
            type: "image_url",
            image_url: { url: item.image },
          };
        }
        return item;
      }),
    };
  });
}
🧰 Tools
πŸͺ› Biome (1.9.4)

[error] 106-106: Change to an optional chain.

Unsafe fix: Change to an optional chain.

(lint/complexity/useOptionalChain)

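A quick usage check for the proposed `transformVisionMessages` helper, reproduced so the example compiles standalone (only an explicit `any` on `item` is added; the sample message is made up):

```typescript
function transformVisionMessages(messages: any[]): any[] {
  if (!Array.isArray(messages)) return messages;

  return messages.map((msg) => {
    if (!Array.isArray(msg.content)) return msg;

    return {
      ...msg,
      // Rewrite bare { image } parts into the image_url shape; leave text parts alone.
      content: msg.content.map((item: any) => {
        if (item && typeof item === "object" && "image" in item) {
          return {
            type: "image_url",
            image_url: { url: item.image },
          };
        }
        return item;
      }),
    };
  });
}

const sample = [
  {
    role: "user",
    content: [{ text: "describe this" }, { image: "https://example.com/cat.png" }],
  },
];
const transformed = transformVisionMessages(sample);
// transformed[0].content[1] → { type: "image_url", image_url: { url: "https://example.com/cat.png" } }
```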


// console.log("[Alibaba] request body json", jsonBody);

fetchOptions.body = JSON.stringify(jsonBody);
} catch (e) {
fetchOptions.body = clonedBody; // fallback if not JSON
}

const jsonBody = JSON.parse(clonedBody) as { model?: string };
// console.log("[Alibaba] request body", fetchOptions.body);

// not undefined and is false
if (
isModelNotavailableInServer(
serverConfig.customModels,
jsonBody?.model as string,
ServiceProvider.Alibaba as string,
)
) {
return NextResponse.json(
{
error: true,
message: `you are not allowed to use ${jsonBody?.model} model`,
},
{
status: 403,
},
);
}
// if (
// isModelNotavailableInServer(
// serverConfig.customModels,
// jsonBody?.model as string,
// ServiceProvider.Alibaba as string,
// )
// ) {
// return NextResponse.json(
// {
// error: true,
// message: `you are not allowed to use ${jsonBody?.model} model`,
// },
// {
// status: 403,
// },
// );
// }
Comment on lines +141 to +157

πŸ› οΈ Refactor suggestion

Remove commented model availability check code.

This large block of commented code should either be removed entirely if no longer needed, or uncommented and fixed if the functionality is still required.

If this functionality is no longer needed, remove the commented code:

-      // not undefined and is false
-      // if (
-      //   isModelNotavailableInServer(
-      //     serverConfig.customModels,
-      //     jsonBody?.model as string,
-      //     ServiceProvider.Alibaba as string,
-      //   )
-      // ) {
-      //   return NextResponse.json(
-      //     {
-      //       error: true,
-      //       message: `you are not allowed to use ${jsonBody?.model} model`,
-      //     },
-      //     {
-      //       status: 403,
-      //     },
-      //   );
-      // }

If this functionality should be retained, please uncomment and ensure the imports are available.

Committable suggestion skipped: line range outside the PR's diff.

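If the gate is reinstated, its intended behavior can be pinned down with a small model. The sketch below is a hypothetical stand-in for `isModelNotavailableInServer` (the real helper lives in `@/app/utils/model` and may differ); it assumes the `+model` / `-model` / `-all` convention used by `customModels` strings:

```typescript
// Illustrative allow/deny check: "-model" blocks a model, "-all" blocks
// everything not explicitly re-enabled with "+model".
function isModelBlocked(customModels: string, model: string): boolean {
  const rules = customModels
    .split(",")
    .map((r) => r.trim())
    .filter(Boolean);
  if (rules.includes(`-${model}`)) return true;
  if (rules.includes("-all") && !rules.includes(`+${model}`)) return true;
  return false;
}

isModelBlocked("-all,+qwen-max", "qwen-max"); // allowed: explicitly re-enabled
isModelBlocked("-all,+qwen-max", "qwen-turbo"); // blocked by -all
isModelBlocked("-qwen-vl-plus", "qwen-vl-plus"); // blocked by name
```

A route using such a gate would return the 403 JSON body shown in the commented block whenever the check is true.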

} catch (e) {
console.error(`[Alibaba] filter`, e);
}
2 changes: 2 additions & 0 deletions app/api/common.ts
@@ -32,6 +32,8 @@ export async function requestOpenai(req: NextRequest) {
let baseUrl =
(isAzure ? serverConfig.azureUrl : serverConfig.baseUrl) || OPENAI_BASE_URL;

// console.log("[Base Url]", baseUrl);

if (!baseUrl.startsWith("http")) {
baseUrl = `https://${baseUrl}`;
}
90 changes: 27 additions & 63 deletions app/client/platforms/alibaba.ts
@@ -18,7 +18,6 @@
LLMModel,
SpeechOptions,
MultimodalContent,
MultimodalContentForAlibaba,
} from "../api";
import { getClientConfig } from "@/app/config/client";
import {
@@ -156,86 +155,38 @@ export class QwenApi implements LLMApi {
);

if (shouldStream) {
// Get the tools and funcs from the current chat session's plugins
const [tools, funcs] = usePluginStore
.getState()
.getAsTools(
useChatStore.getState().currentSession().mask?.plugin || [],
);
// Call streamWithThink to handle the chat as a stream (server-sent events)
return streamWithThink(
chatPath,
requestPayload,
headers,
tools as any,
funcs,
controller,
// parseSSE
// SSE parse callback for OpenAI-style streaming
(text: string, runTools: ChatMessageTool[]) => {
// console.log("parseSSE", text, runTools);
const json = JSON.parse(text);
const choices = json.output.choices as Array<{
message: {
content: string | null | MultimodalContentForAlibaba[];
tool_calls: ChatMessageTool[];
reasoning_content: string | null;
};
}>;

if (!choices?.length) return { isThinking: false, content: "" };

const tool_calls = choices[0]?.message?.tool_calls;
if (tool_calls?.length > 0) {
const index = tool_calls[0]?.index;
const id = tool_calls[0]?.id;
const args = tool_calls[0]?.function?.arguments;
if (id) {
runTools.push({
id,
type: tool_calls[0]?.type,
function: {
name: tool_calls[0]?.function?.name as string,
arguments: args,
},
});
} else {
// @ts-ignore
runTools[index]["function"]["arguments"] += args;
}
}

const reasoning = choices[0]?.message?.reasoning_content;
const content = choices[0]?.message?.content;

// Skip if both content and reasoning_content are empty or null
if (
(!reasoning || reasoning.length === 0) &&
(!content || content.length === 0)
) {
return {
isThinking: false,
content: "",
};
}

if (reasoning && reasoning.length > 0) {
return {
isThinking: true,
content: reasoning,
};
} else if (content && content.length > 0) {
return {
isThinking: false,
content: Array.isArray(content)
? content.map((item) => item.text).join(",")
: content,
};
// Each `text` is a line like: data: {...}
let json: any;
try {
json = JSON.parse(text);
} catch {
return { isThinking: false, content: "" };
}
const delta = json.choices?.[0]?.delta;
const content = delta?.content ?? "";

// You can accumulate content outside if needed
return {
isThinking: false,
content: "",
content,
};
},
// processToolMessage, include tool_calls message and tool call results
(
requestPayload: RequestPayload,
toolCallMessage: any,
@@ -248,7 +199,20 @@
...toolCallResult,
);
},
options,
{
...options,
// Accumulate and render result as it streams
onUpdate: (() => {
let accumulated = "";
return (chunk: string, fetchText?: string) => {
accumulated += chunk;
options.onUpdate?.(accumulated, fetchText ?? "");
};
})(),
onFinish: (final: string, res: any) => {
options.onFinish?.(final, res);
},
},
);
} else {
const res = await fetch(chatPath, chatPayload);
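The rewritten parseSSE callback above treats each `text` chunk as one OpenAI-style stream line: parse it if possible, otherwise contribute nothing. That per-line contract can be sketched standalone (`extractDelta` is an illustrative name; the payload shape assumes OpenAI-compatible streaming):

```typescript
// Pull choices[0].delta.content out of one SSE line, tolerating the
// "data: " prefix, the [DONE] sentinel, and non-JSON keep-alive lines.
function extractDelta(line: string): string {
  const payload = line.startsWith("data: ") ? line.slice(6) : line;
  if (payload.trim() === "[DONE]") return "";
  let json: any;
  try {
    json = JSON.parse(payload);
  } catch {
    return ""; // comment / keep-alive line: contribute nothing
  }
  return json.choices?.[0]?.delta?.content ?? "";
}

const chunk = extractDelta('data: {"choices":[{"delta":{"content":"Hel"}}]}'); // "Hel"
const done = extractDelta("data: [DONE]"); // ""
const noise = extractDelta(": keep-alive"); // ""
```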
3 changes: 2 additions & 1 deletion app/client/platforms/deepseek.ts
@@ -151,7 +151,8 @@ export class DeepSeekApi implements LLMApi {
controller,
// parseSSE
(text: string, runTools: ChatMessageTool[]) => {
// console.log("parseSSE", text, runTools);
console.log("parseSSE", text, runTools);

πŸ› οΈ Refactor suggestion

Make debug logging conditional or remove it.

The enabled debug log will add noise to production logs and potentially expose sensitive data. Consider making it conditional based on a debug flag or environment variable.

-            console.log("parseSSE", text, runTools);
+            if (process.env.NODE_ENV === 'development' || process.env.DEBUG_DEEPSEEK) {
+              console.log("parseSSE", text, runTools);
+            }

Or if debugging is no longer needed:

-            console.log("parseSSE", text, runTools);
πŸ“ Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
console.log("parseSSE", text, runTools);
if (process.env.NODE_ENV === 'development' || process.env.DEBUG_DEEPSEEK) {
console.log("parseSSE", text, runTools);
}


const json = JSON.parse(text);
const choices = json.choices as Array<{
delta: {
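The conditional-logging suggestion above can be factored into a small helper so the gate is written once. This is a sketch, not existing project code: the env-variable names follow the suggested diff, and the injectable `sink` parameter exists only to make the behavior observable outside a console:

```typescript
// Build a logger that forwards to `sink` only when debugging is enabled
// via NODE_ENV=development or a DEBUG_DEEPSEEK flag.
function makeDebugLogger(
  env: Record<string, string | undefined>,
  sink: (...args: unknown[]) => void = console.log,
) {
  const enabled = env.NODE_ENV === "development" || Boolean(env.DEBUG_DEEPSEEK);
  return (...args: unknown[]) => {
    if (enabled) sink("[DeepSeek]", ...args);
  };
}

const logged: unknown[][] = [];
const record = (...args: unknown[]) => { logged.push(args); };

makeDebugLogger({ NODE_ENV: "development" }, record)("parseSSE", "chunk");
makeDebugLogger({ NODE_ENV: "production" }, record)("parseSSE", "chunk"); // dropped
makeDebugLogger({ NODE_ENV: "production", DEBUG_DEEPSEEK: "1" }, record)("parseSSE", "chunk");
// logged holds two entries: the development run and the DEBUG_DEEPSEEK run.
```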