@@ -17,6 +17,8 @@ The following tables detail the models supported by LMDeploy's TurboMind engine
| InternLM3 | 8B | LLM | Yes | Yes | Yes | Yes |
| InternLM-XComposer2 | 7B, 4khd-7B | MLLM | Yes | Yes | Yes | Yes |
| InternLM-XComposer2.5 | 7B | MLLM | Yes | Yes | Yes | Yes |
+ | Intern-S1 | 241B | MLLM | Yes | Yes | Yes | Yes |
+ | Intern-S1-mini | 8.3B | MLLM | Yes | Yes | Yes | Yes |
| Qwen | 1.8B - 72B | LLM | Yes | Yes | Yes | Yes |
| Qwen1.5<sup>\[1\]</sup> | 1.8B - 110B | LLM | Yes | Yes | Yes | Yes |
| Qwen2<sup>\[2\]</sup> | 0.5B - 72B | LLM | Yes | Yes\* | Yes\* | Yes |
@@ -67,6 +69,8 @@ The following tables detail the models supported by LMDeploy's TurboMind engine
| InternLM2 | 7B - 20B | LLM | Yes | Yes | Yes | Yes | Yes |
| InternLM2.5 | 7B | LLM | Yes | Yes | Yes | Yes | Yes |
| InternLM3 | 8B | LLM | Yes | Yes | Yes | Yes | Yes |
+ | Intern-S1 | 241B | MLLM | Yes | Yes | Yes | Yes | - |
+ | Intern-S1-mini | 8.3B | MLLM | Yes | Yes | Yes | Yes | - |
| Baichuan2 | 7B | LLM | Yes | Yes | Yes | Yes | No |
| Baichuan2 | 13B | LLM | Yes | Yes | Yes | No | No |
| ChatGLM2 | 6B | LLM | Yes | Yes | Yes | No | No |
@@ -111,7 +115,6 @@ The following tables detail the models supported by LMDeploy's TurboMind engine
| Phi-3.5-mini | 3.8B | LLM | Yes | Yes | No | - | - |
| Phi-3.5-MoE | 16x3.8B | LLM | Yes | Yes | No | - | - |
| Phi-3.5-vision | 4.2B | MLLM | Yes | Yes | No | - | - |
- | gpt-oss | 20B | LLM | Yes | Yes | No | - | - |
```{note}
* [1] Currently Mono-InternVL does not support FP16 due to numerical instability. Please use BF16 instead.