
Conversation

@godnight10061

  • Replace OpenAI-specific responses.create() API with standard chat.completions.create()
  • Update get_stock_news_openai, get_global_news_openai, and get_fundamentals_openai
  • Add comprehensive tests for OpenAI, Gemini, and OpenRouter compatibility
  • All functions now use standard OpenAI-compatible chat completion API
  • Fixes RuntimeError: All vendor implementations failed for method 'get_global_news'
  • Fixes RuntimeError: All vendor implementations failed for method 'get_indicators'

The issue was that the openai vendor functions used OpenAI-specific API features (responses.create with web_search_preview tools) that are not supported by Gemini or OpenRouter. By switching to the standard chat completions API, these functions now work with any OpenAI-compatible provider.
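The switch can be sketched as follows. This is a minimal, self-contained illustration of the portable call shape, not the repo's actual code: `fetch_news` and the prompts are illustrative, and a `Mock` stands in for the real `OpenAI(base_url=...)` client so the snippet runs offline.

```python
from unittest.mock import Mock

def fetch_news(client, model, query, start_date, end_date):
    # Only the common chat-completions surface is used, so the same code
    # works against OpenAI, Gemini's OpenAI-compatible endpoint, or
    # OpenRouter -- only base_url/api_key differ when building the client.
    response = client.chat.completions.create(
        model=model,
        messages=[
            {"role": "system", "content": "You are a financial news analyst."},
            {"role": "user", "content": f"Search news for {query} from {start_date} to {end_date}."},
        ],
        max_tokens=4096,
    )
    # Standard response shape shared by all OpenAI-compatible providers
    return response.choices[0].message.content

# Stub client stands in for OpenAI(base_url=config["backend_url"])
stub = Mock()
stub.chat.completions.create.return_value = Mock(
    choices=[Mock(message=Mock(content="stub news"))]
)
print(fetch_news(stub, "gpt-4o-mini", "NVDA", "2024-11-01", "2024-11-09"))  # prints "stub news"
```

The key point is that `response.choices[0].message.content` is part of the shared API surface, unlike `response.output[1].content[0].text` from the OpenAI-only `responses` API.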

@gemini-code-assist

Summary of Changes

Hello @godnight10061, I'm Gemini Code Assist¹! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request addresses a critical compatibility issue where the openai dataflow functions were tightly coupled to OpenAI-specific API features, preventing their use with other OpenAI-compatible LLM providers like Gemini and OpenRouter. By refactoring these functions to use the standard chat completions API, the system now offers greater flexibility and robustness across different language model backends, ensuring that all vendor implementations can function correctly.

Highlights

  • API Compatibility: Replaced the OpenAI-specific responses.create() API call with the more standard chat.completions.create() across all relevant functions to ensure broader compatibility.
  • Function Updates: Updated get_stock_news_openai, get_global_news_openai, and get_fundamentals_openai to utilize the standard chat completion API, removing reliance on OpenAI-specific features like web_search_preview tools.
  • Enhanced Test Coverage: Added comprehensive new tests to verify the compatibility of the openai dataflow functions with various LLM providers, including OpenAI, Google Gemini (via OpenAI-compatible API), and OpenRouter.
  • Bug Fixes: Resolved RuntimeError issues that previously occurred for methods like get_global_news and get_indicators when used with Gemini and OpenRouter, stemming from the use of non-standard OpenAI API features.
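To make the compatibility claim concrete, here is a hedged sketch of the per-provider wiring the refactor enables. The base URLs are the publicly documented OpenAI-compatible endpoints for each provider; the model names and the `PROVIDER_BACKENDS`/`client_kwargs` names are illustrative, not taken from the repo.

```python
# Only backend_url and the model name vary per provider; the request code
# (chat.completions.create) is identical for all three.
PROVIDER_BACKENDS = {
    "openai": {
        "backend_url": "https://api.openai.com/v1",
        "quick_think_llm": "gpt-4o-mini",
    },
    "gemini": {
        "backend_url": "https://generativelanguage.googleapis.com/v1beta/openai/",
        "quick_think_llm": "gemini-2.0-flash",
    },
    "openrouter": {
        "backend_url": "https://openrouter.ai/api/v1",
        "quick_think_llm": "openai/gpt-4o-mini",
    },
}

def client_kwargs(provider, api_key):
    """Kwargs for constructing OpenAI(...) regardless of backend choice."""
    cfg = PROVIDER_BACKENDS[provider]
    return {"base_url": cfg["backend_url"], "api_key": api_key}
```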
Using Gemini Code Assist

The full guide for Gemini Code Assist can be found on our documentation page; here are some quick tips.

Invoking Gemini

You can request assistance from Gemini at any point by creating a comment using either /gemini <command> or @gemini-code-assist <command>. Below is a summary of the supported commands on the current page.

| Feature | Command | Description |
| --- | --- | --- |
| Code Review | /gemini review | Performs a code review for the current pull request in its current state. |
| Pull Request Summary | /gemini summary | Provides a summary of the current pull request in its current state. |
| Comment | @gemini-code-assist | Responds in comments when explicitly tagged, both in pull request comments and review comments. |
| Help | /gemini help | Displays a list of available commands. |

Customization

To customize the Gemini Code Assist for GitHub experience, repository maintainers can create a configuration file and/or provide a custom code review style guide (such as PEP-8 for Python) by adding files to a .gemini/ folder in the base of the repository. Detailed instructions can be found here.

Limitations & Feedback

Gemini Code Assist may make mistakes. Please leave feedback on any instances where its feedback is incorrect or counterproductive. You can react with 👍 and 👎 on @gemini-code-assist comments. If you're interested in giving feedback about your experience with Gemini Code Assist for GitHub and other Google products, sign up here.

You can also get AI-powered code generation, chat, as well as code reviews directly in the IDE at no cost with the Gemini Code Assist IDE Extension.

Footnotes

  1. Review the Privacy Notices, Generative AI Prohibited Use Policy, Terms of Service, and learn how to configure Gemini Code Assist in GitHub here. Gemini can make mistakes, so double check it and use code with caution.


@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request does a great job of refactoring the OpenAI dataflow to use the standard chat.completions.create API, which successfully enables compatibility with Gemini and OpenRouter. The addition of comprehensive tests is a valuable contribution to ensuring the stability of these integrations. My feedback focuses on opportunities to improve maintainability by reducing code duplication in both the core logic and the new tests, and on expanding test coverage to ensure all modified functions are validated.

import pytest
from unittest.mock import Mock, patch, MagicMock
from tradingagents.dataflows.openai import (
    get_stock_news_openai,


high

The function get_stock_news_openai is imported but is not covered by any tests in this file. Since this function was modified in this pull request, it's important to add tests to verify its compatibility with the different providers, similar to what has been done for get_global_news_openai.
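The missing coverage could follow the same mock-client pattern as the existing `get_global_news_openai` tests. The sketch below uses a local stand-in, `fetch_stock_news`, that mirrors the refactored function's call-and-parse shape so it runs without the package installed; in the repo's test file you would instead patch `tradingagents.dataflows.openai.get_config` and `tradingagents.dataflows.openai.OpenAI` and call `get_stock_news_openai` directly.

```python
from unittest.mock import Mock

def fetch_stock_news(client, query, start_date, end_date):
    # Stand-in with the same request/parse shape as get_stock_news_openai
    response = client.chat.completions.create(
        model="quick-think-llm",
        messages=[{
            "role": "user",
            "content": f"Search Social Media for {query} from {start_date} to {end_date}.",
        }],
    )
    return response.choices[0].message.content

def test_fetch_stock_news_parses_standard_response():
    # Mock the OpenAI-compatible client and its chat completion response
    mock_client = Mock()
    mock_response = Mock()
    mock_response.choices = [Mock()]
    mock_response.choices[0].message.content = "Test stock news"
    mock_client.chat.completions.create.return_value = mock_response

    result = fetch_stock_news(mock_client, "AAPL", "2024-11-01", "2024-11-09")

    assert mock_client.chat.completions.create.called
    assert result == "Test stock news"

test_fetch_stock_news_parses_standard_response()
print("ok")  # prints "ok"
```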

This test reproduces issue #275 where Gemini and OpenRouter fail with openai vendor.
"""
import pytest
from unittest.mock import Mock, patch, MagicMock


medium

The MagicMock import is not used in this file. It's good practice to remove unused imports to keep the code clean.

Suggested change:
-from unittest.mock import Mock, patch, MagicMock
+from unittest.mock import Mock, patch

Comment on lines +46 to +65
def test_get_global_news_with_openai(self, mock_openai_class, mock_get_config, mock_config_openai):
    """Test get_global_news_openai works with OpenAI provider."""
    mock_get_config.return_value = mock_config_openai

    # Mock the OpenAI client and response
    mock_client = Mock()
    mock_openai_class.return_value = mock_client

    # Mock chat completion response (standard API)
    mock_response = Mock()
    mock_response.choices = [Mock()]
    mock_response.choices[0].message.content = "Test news content"
    mock_client.chat.completions.create.return_value = mock_response

    # Call the function
    result = get_global_news_openai("2024-11-09", 7, 5)

    # Verify it was called
    assert mock_client.chat.completions.create.called
    assert result == "Test news content"


medium

This test is very similar to test_get_global_news_with_gemini and test_get_global_news_with_openrouter. To avoid repetition and improve maintainability, consider consolidating these three tests into a single parameterized test using pytest.mark.parametrize. This would allow you to test all providers with a single test function, making the suite more concise and easier to extend.
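A hedged sketch of that consolidation follows. The provider tuples and the local `get_global_news` stand-in are illustrative so the snippet runs offline; in the repo's test file the parameters would carry each provider's mock config fixture, and the test would call `get_global_news_openai` with `get_config` and `OpenAI` patched as in the existing tests.

```python
import pytest
from unittest.mock import Mock

# Illustrative (provider name, model) pairs -- values are assumptions
PROVIDERS = [
    ("openai", "gpt-4o-mini"),
    ("gemini", "gemini-2.0-flash"),
    ("openrouter", "openai/gpt-4o-mini"),
]

def get_global_news(client, model, curr_date, look_back_days, limit):
    # Stand-in mirroring get_global_news_openai's request/parse shape
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user",
                   "content": f"Top {limit} global news items from the "
                              f"{look_back_days} days before {curr_date}."}],
    )
    return response.choices[0].message.content

@pytest.mark.parametrize("provider,model", PROVIDERS)
def test_get_global_news_any_provider(provider, model):
    # One mock covers every provider: they all expose the same
    # chat.completions surface, which is the whole point of the refactor.
    mock_client = Mock()
    mock_client.chat.completions.create.return_value = Mock(
        choices=[Mock(message=Mock(content=f"news via {provider}"))]
    )
    result = get_global_news(mock_client, model, "2024-11-09", 7, 5)
    assert mock_client.chat.completions.create.called
    assert result == f"news via {provider}"

# pytest collects one test case per provider; the loop below just runs
# the same checks directly.
for provider, model in PROVIDERS:
    test_get_global_news_any_provider(provider, model)
```

Extending coverage to a fourth provider then means adding one tuple rather than copying a whole test function.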


@patch('tradingagents.dataflows.openai.get_config')
@patch('tradingagents.dataflows.openai.OpenAI')
def test_get_fundamentals_with_different_providers(self, mock_openai_class, mock_get_config, mock_config_gemini):


medium

The name test_get_fundamentals_with_different_providers is a bit misleading as it only tests with the Gemini configuration. To improve clarity, you could either rename it to test_get_fundamentals_with_gemini or, preferably, parameterize it to run against all provider configurations, similar to the suggestion for the get_global_news tests.

Comment on lines 5 to +39
 def get_stock_news_openai(query, start_date, end_date):
     """
     Retrieve stock news using LLM provider configured in backend_url.
     Compatible with OpenAI, Gemini (via OpenAI-compatible API), and OpenRouter.
     Args:
         query: Stock ticker or search query
         start_date: Start date for news search
         end_date: End date for news search
     Returns:
         str: News content as text
     """
     config = get_config()
     client = OpenAI(base_url=config["backend_url"])

-    response = client.responses.create(
+    # Use standard chat completions API for compatibility with all providers
+    response = client.chat.completions.create(
         model=config["quick_think_llm"],
-        input=[
+        messages=[
             {
                 "role": "system",
-                "content": [
-                    {
-                        "type": "input_text",
-                        "text": f"Can you search Social Media for {query} from {start_date} to {end_date}? Make sure you only get the data posted during that period.",
-                    }
-                ],
-            }
-        ],
-        text={"format": {"type": "text"}},
-        reasoning={},
-        tools=[
-            {
-                "type": "web_search_preview",
-                "user_location": {"type": "approximate"},
-                "search_context_size": "low",
-            }
-        ],
+                "content": "You are a financial news analyst. Search and summarize relevant news from social media and news sources."
+            },
+            {
+                "role": "user",
+                "content": f"Can you search Social Media for {query} from {start_date} to {end_date}? Make sure you only get the data posted during that period."
+            }
+        ],
         temperature=1,
-        max_output_tokens=4096,
+        max_tokens=4096,
         top_p=1,
-        store=True,
     )

-    return response.output[1].content[0].text
+    return response.choices[0].message.content


medium

There is significant code duplication across get_stock_news_openai, get_global_news_openai, and get_fundamentals_openai. Each function repeats the logic for getting the configuration, initializing the OpenAI client, making the API call, and parsing the response. Consider refactoring this common logic into a private helper function, for example `_execute_chat_completion(messages: list) -> str`. This would make the code more DRY (Don't Repeat Yourself) and easier to maintain.
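One possible shape for that refactor is sketched below. The helper name comes from the review comment; passing `client` and `model` explicitly (rather than calling `get_config()` and `OpenAI(...)` inside the helper, as the repo code would) is an assumption made so the snippet runs offline, and the prompt text is illustrative.

```python
from unittest.mock import Mock

def _execute_chat_completion(client, model, messages, max_tokens=4096):
    """Shared request/parse logic for every OpenAI-compatible provider."""
    response = client.chat.completions.create(
        model=model,
        messages=messages,
        temperature=1,
        max_tokens=max_tokens,
        top_p=1,
    )
    return response.choices[0].message.content

def get_global_news(client, model, curr_date, look_back_days):
    # Each public function now only builds its own prompt
    messages = [
        {"role": "system", "content": "You are a financial news analyst."},
        {"role": "user", "content": f"Summarize global macro news from the "
                                    f"{look_back_days} days before {curr_date}."},
    ]
    return _execute_chat_completion(client, model, messages)

# Stub client so the sketch runs without network access
stub = Mock()
stub.chat.completions.create.return_value = Mock(
    choices=[Mock(message=Mock(content="macro summary"))]
)
print(get_global_news(stub, "quick-think", "2024-11-09", 7))  # prints "macro summary"
```

With this split, a future change to the request or parsing logic (for example, a different `max_tokens` default) lands in one place instead of three.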
