Flashinfer adoption by vLLM #883
prasad-nair-amd started this discussion in General

The FlashInfer GitHub page says that FlashInfer is adopted by vLLM. Does this mean that if we are doing inference using vLLM, FlashInfer is used?

Replies: 1 comment

-
You would still have to install FlashInfer (`pip install flashinfer-python`). Otherwise, you'll see a warning pop up during inference.
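As a minimal sketch of putting this together, assuming a vLLM build that honors the `VLLM_ATTENTION_BACKEND` environment variable for backend selection (the model name below is just a placeholder):

```python
import os

# Request the FlashInfer attention backend explicitly; this must be set
# before vLLM initializes. Assumes FlashInfer was installed separately
# via `pip install flashinfer-python`, as noted above.
os.environ["VLLM_ATTENTION_BACKEND"] = "FLASHINFER"

from vllm import LLM, SamplingParams

llm = LLM(model="facebook/opt-125m")  # placeholder model
params = SamplingParams(max_tokens=32)

outputs = llm.generate(["FlashInfer makes attention"], params)
print(outputs[0].outputs[0].text)
```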