Unable to integrate self-hosted LLM (provider: vLLM) #935
Replies: 2 comments
@Gitkakkar1597 Crawl4AI uses LiteLLM under the hood, essentially as an adapter, so it can stay API-agnostic. Based on the LiteLLM docs for this provider, you seem to have configured it correctly as well. Given the error code you shared (301), this looks like an issue with your vLLM installation. I found some related references in the vLLM repository: vllm-project/vllm#8532 (comment). Do check whether there is anything you can change in your vLLM installation and whether that fixes the problem. I changed this to a forum post, since this isn't a bug in the Crawl4AI library itself. If you get it to work, do link back here so other community members with the same problem can benefit from it.
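For reference, here is a minimal sketch of the kind of LiteLLM call Crawl4AI issues under the hood for a `hosted_vllm` provider; the host, token, and prompt below are placeholders, not values from this thread. Two things worth double-checking against the config reported here: LiteLLM generally expects `api_base` to be the OpenAI-compatible root (ending in `/v1`) rather than the full `/v1/completions` path, and it adds the `Bearer` prefix to `api_key` by itself.

```python
import litellm

# Rough equivalent of what Crawl4AI does via LiteLLM for a hosted_vllm
# provider. api_base and api_key are placeholders for your deployment.
response = litellm.completion(
    model="hosted_vllm/meta-llama/Meta-Llama-3-8B-Instruct",
    api_base="https://your-vllm-host.example.com/v1",  # root path, not /v1/completions
    api_key="your-token",  # sent as "Authorization: Bearer <api_key>"
    messages=[{"role": "user", "content": "ping"}],
)
print(response.choices[0].message.content)
```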
I get similar errors when I use my local LLM deepseek-v2. Please share some insight on how to solve this.
crawl4ai version
0.5.0
Expected Behavior
Hi @unclecode, thanks for creating such a powerful open-source resource for us devs. I am using Crawl4AI to extract and process data from some web URLs, but I am unable to integrate my own deployed LLM model. The LlmConfig should use the 'api_base' endpoint URL. The endpoint requires bearer auth, which should go in 'api_token'.
The LLM endpoint can be called in a plain Python request like the sketch below.
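A minimal sketch, assuming an OpenAI-compatible /v1/completions route; the domain, token, model name, and payload are placeholders rather than the original values:

```python
import requests

endpoint_domain = "my-endpoint"  # placeholder for the actual domain
llm_auth_token = "<token>"       # placeholder for the bearer token

# Direct call to the deployed vLLM OpenAI-compatible completions endpoint.
response = requests.post(
    f"http://{endpoint_domain}.com/v1/completions",
    headers={
        "Authorization": f"Bearer {llm_auth_token}",
        "Content-Type": "application/json",
    },
    json={
        "model": "meta-llama/Meta-Llama-3-8B-Instruct",
        "prompt": "Hello",
        "max_tokens": 50,
    },
    timeout=30,
)
print(response.status_code, response.json())
```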
Current Behavior
With the current approach, it throws the following error:

"litellm.APIError: APIError: Hosted_vllmException - <html>\r\n<head><title>301 Moved Permanently</title></head>\r\n<body>\r\n<center><h1>301 Moved Permanently</h1></center>\r\n<hr><center>cloudflare</center>\r\n</body>\r\n</html>"

Kindly help me integrate my own-deployed LLM into Crawl4AI's LLMExtractionStrategy method.
Is this reproducible?
Yes
Inputs Causing the Bug
```python
llmConfig = LlmConfig(
    provider="hosted_vllm/meta-llama/Meta-Llama-3-8B-Instruct",
    base_url=f"http://{endpoint_domain}.com/v1/completions",
    api_token=f"Bearer {llm_auth_token}",
)
```
Steps to Reproduce
Code snippets
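For completeness, a sketch of how a config like the one above would plug into LLMExtractionStrategy, assuming the crawl4ai 0.5.0 API surface (LlmConfig, LLMExtractionStrategy, CrawlerRunConfig); the URL, instruction, and credentials below are illustrative placeholders:

```python
import asyncio

from crawl4ai import AsyncWebCrawler, CrawlerRunConfig, LlmConfig
from crawl4ai.extraction_strategy import LLMExtractionStrategy

endpoint_domain = "my-endpoint"  # placeholder
llm_auth_token = "<token>"       # placeholder

# Extraction strategy wired to the self-hosted vLLM endpoint.
strategy = LLMExtractionStrategy(
    llm_config=LlmConfig(
        provider="hosted_vllm/meta-llama/Meta-Llama-3-8B-Instruct",
        base_url=f"http://{endpoint_domain}.com/v1/completions",
        api_token=f"Bearer {llm_auth_token}",
    ),
    instruction="Summarize the page in one sentence.",  # illustrative
)

async def main():
    async with AsyncWebCrawler() as crawler:
        result = await crawler.arun(
            url="https://example.com",
            config=CrawlerRunConfig(extraction_strategy=strategy),
        )
        print(result.extracted_content)

asyncio.run(main())
```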
OS
Windows
Python version
3.12.8
Browser
Chrome
Browser version
134.0.6998.178
Error logs & Screenshots (if applicable)