ModelScope LLMs
In this notebook, we show how to use ModelScope LLM models in LlamaIndex. Check out the ModelScope website for the available models.
If you're opening this notebook on Colab, you will likely need to install LlamaIndex 🦙 and ModelScope.
!pip install llama-index-llms-modelscope
Basic Usage
from llama_index.llms.modelscope import ModelScopeLLM

# Downloads the model weights from the ModelScope Hub on first use.
llm = ModelScopeLLM(model_name="qwen/Qwen1.5-7B-Chat", model_revision="master")

rsp = llm.complete("Hello, who are you?")
print(rsp)
Chat with Messages
from llama_index.core.base.llms.types import MessageRole, ChatMessage

messages = [
    ChatMessage(
        role=MessageRole.SYSTEM, content="You are a helpful assistant."
    ),
    ChatMessage(role=MessageRole.USER, content="How to make cake?"),
]
resp = llm.chat(messages)
print(resp)