Fix #275: Make openai dataflow compatible with Gemini and OpenRouter #276
base: main
Conversation
Fix #275: Make openai dataflow compatible with Gemini and OpenRouter

- Replace the OpenAI-specific responses.create() API with the standard chat.completions.create()
- Update get_stock_news_openai, get_global_news_openai, and get_fundamentals_openai
- Add comprehensive tests for OpenAI, Gemini, and OpenRouter compatibility
- All functions now use the standard OpenAI-compatible chat completion API
- Fixes "RuntimeError: All vendor implementations failed for method 'get_global_news'"
- Fixes "RuntimeError: All vendor implementations failed for method 'get_indicators'"

The issue was that the openai vendor functions used OpenAI-specific API features (responses.create with web_search_preview tools) that are not supported by Gemini or OpenRouter. By switching to the standard chat completions API, these functions now work with any OpenAI-compatible provider.
Summary of Changes

Hello @godnight10061, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed: this pull request addresses a critical compatibility issue where the openai vendor functions relied on an OpenAI-specific API that Gemini and OpenRouter do not support.
Code Review
This pull request does a great job of refactoring the OpenAI dataflow to use the standard chat.completions.create API, which successfully enables compatibility with Gemini and OpenRouter. The addition of comprehensive tests is a valuable contribution to ensuring the stability of these integrations. My feedback focuses on opportunities to improve maintainability by reducing code duplication in both the core logic and the new tests, and on expanding test coverage to ensure all modified functions are validated.
```python
import pytest
from unittest.mock import Mock, patch, MagicMock
from tradingagents.dataflows.openai import (
    get_stock_news_openai,
```
```python
This test reproduces issue #275 where Gemini and OpenRouter fail with openai vendor.
"""
import pytest
from unittest.mock import Mock, patch, MagicMock
```
```python
def test_get_global_news_with_openai(self, mock_openai_class, mock_get_config, mock_config_openai):
    """Test get_global_news_openai works with OpenAI provider."""
    mock_get_config.return_value = mock_config_openai

    # Mock the OpenAI client and response
    mock_client = Mock()
    mock_openai_class.return_value = mock_client

    # Mock chat completion response (standard API)
    mock_response = Mock()
    mock_response.choices = [Mock()]
    mock_response.choices[0].message.content = "Test news content"
    mock_client.chat.completions.create.return_value = mock_response

    # Call the function
    result = get_global_news_openai("2024-11-09", 7, 5)

    # Verify it was called
    assert mock_client.chat.completions.create.called
    assert result == "Test news content"
```
This test is very similar to test_get_global_news_with_gemini and test_get_global_news_with_openrouter. To avoid repetition and improve maintainability, consider consolidating these three tests into a single parameterized test using pytest.mark.parametrize. This would allow you to test all providers with a single test function, making the suite more concise and easier to extend.
```python
@patch('tradingagents.dataflows.openai.get_config')
@patch('tradingagents.dataflows.openai.OpenAI')
def test_get_fundamentals_with_different_providers(self, mock_openai_class, mock_get_config, mock_config_gemini):
```
The name test_get_fundamentals_with_different_providers is a bit misleading as it only tests with the Gemini configuration. To improve clarity, you could either rename it to test_get_fundamentals_with_gemini or, preferably, parameterize it to run against all provider configurations, similar to the suggestion for the get_global_news tests.
```diff
 def get_stock_news_openai(query, start_date, end_date):
     """
     Retrieve stock news using LLM provider configured in backend_url.
     Compatible with OpenAI, Gemini (via OpenAI-compatible API), and OpenRouter.

     Args:
         query: Stock ticker or search query
         start_date: Start date for news search
         end_date: End date for news search

     Returns:
         str: News content as text
     """
     config = get_config()
     client = OpenAI(base_url=config["backend_url"])

-    response = client.responses.create(
+    # Use standard chat completions API for compatibility with all providers
+    response = client.chat.completions.create(
         model=config["quick_think_llm"],
-        input=[
+        messages=[
             {
                 "role": "system",
-                "content": [
-                    {
-                        "type": "input_text",
-                        "text": f"Can you search Social Media for {query} from {start_date} to {end_date}? Make sure you only get the data posted during that period.",
-                    }
-                ],
-            }
-        ],
-        text={"format": {"type": "text"}},
-        reasoning={},
-        tools=[
-            {
-                "type": "web_search_preview",
-                "user_location": {"type": "approximate"},
-                "search_context_size": "low",
-            }
-        ],
+                "content": "You are a financial news analyst. Search and summarize relevant news from social media and news sources."
+            },
+            {
+                "role": "user",
+                "content": f"Can you search Social Media for {query} from {start_date} to {end_date}? Make sure you only get the data posted during that period."
+            }
+        ],
         temperature=1,
-        max_output_tokens=4096,
+        max_tokens=4096,
         top_p=1,
-        store=True,
     )

-    return response.output[1].content[0].text
+    return response.choices[0].message.content
```
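With `responses.create()` gone, provider choice reduces to the OpenAI-compatible base URL placed in `config["backend_url"]`. The following sketch illustrates that point; `make_config` is a hypothetical helper (not part of the PR), while the Gemini and OpenRouter URLs are those providers' documented OpenAI-compatibility endpoints:

```python
# Each provider exposes an OpenAI-compatible endpoint; any of these can be
# passed as base_url to the OpenAI client used by the dataflow functions.
PROVIDER_BASE_URLS = {
    "openai": "https://api.openai.com/v1",
    "gemini": "https://generativelanguage.googleapis.com/v1beta/openai/",
    "openrouter": "https://openrouter.ai/api/v1",
}


def make_config(provider: str, model: str) -> dict:
    """Build the minimal config the refactored functions read (hypothetical)."""
    return {
        "backend_url": PROVIDER_BASE_URLS[provider],
        "quick_think_llm": model,
    }
```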
There is significant code duplication across get_stock_news_openai, get_global_news_openai, and get_fundamentals_openai. Each function repeats the logic for getting the configuration, initializing the OpenAI client, making the API call, and parsing the response. Consider refactoring this common logic into a private helper function, for example _execute_chat_completion(messages: list) -> str:. This would make the code more DRY (Don't Repeat Yourself) and easier to maintain.