%pip install llama-index-llms-premai
from llama_index.llms.premai import PremAI
from llama_index.core.llms import ChatMessage
import os
import getpass

if os.environ.get("PREMAI_API_KEY") is None:
    os.environ["PREMAI_API_KEY"] = getpass.getpass("PremAI API Key:")

prem_chat = PremAI(project_id=8)
Chat Completions
Now you are all set. We can start interacting with our application. Let's begin by building a simple chat request and response using llama-index.
messages = [
    ChatMessage(role="user", content="What is your name"),
    ChatMessage(
        role="user", content="Write an essay about your school in 500 words"
    ),
]
Please note: you can provide a system prompt inside ChatMessage, like this:
messages = [
    ChatMessage(role="system", content="Act like a pirate"),
    ChatMessage(role="user", content="What is your name"),
    ChatMessage(role="user", content="Where do you live, write an essay in 500 words"),
]
Additionally, you can instantiate your client with a system prompt, like this:
chat = PremAI(project_id=8, system_prompt="Act like nemo fish")
In both cases, you override the system prompt that was pinned on the platform when the application was deployed. In particular, if you override the system prompt when instantiating the PremAI class, the system message inside ChatMessage will have no effect.
So, if you want to override the system prompt for any experiment, either provide it when instantiating the client or write it inside ChatMessage with the system role.
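The precedence rules above can be sketched in plain Python. This is an illustrative helper, not part of the PremAI SDK; the function name `resolve_system_prompt` and its dict-based messages are hypothetical:

```python
def resolve_system_prompt(client_system_prompt, messages):
    """Return the system prompt that takes effect.

    A prompt passed when instantiating the client wins; otherwise a
    system-role message wins; otherwise the platform-pinned prompt applies.
    """
    if client_system_prompt is not None:
        return client_system_prompt
    for message in messages:
        if message["role"] == "system":
            return message["content"]
    return None  # fall back to the prompt pinned on the platform


# The client-level prompt overrides the system message:
print(
    resolve_system_prompt(
        "Act like nemo fish",
        [{"role": "system", "content": "Act like a pirate"}],
    )
)
# → Act like nemo fish
```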
Now let's call the model.
response = prem_chat.chat(messages)
print(response)
[ChatResponse(message=ChatMessage(role=<MessageRole.ASSISTANT: 'assistant'>, content="I'm here to assist you with any questions or tasks you have, but I'm not able to write essays. However, if you need help brainstorming ideas or organizing your thoughts for your essay about your school, I'd be happy to help with that. Just let me know how I can assist you further!", additional_kwargs={}), raw={'role': <RoleEnum.ASSISTANT: 'assistant'>, 'content': "I'm here to assist you with any questions or tasks you have, but I'm not able to write essays. However, if you need help brainstorming ideas or organizing your thoughts for your essay about your school, I'd be happy to help with that. Just let me know how I can assist you further!"}, delta=None, additional_kwargs={})]
You can also convert the chat function into a completion function. Here is how it works.
completion = prem_chat.complete("Paul Graham is ")
Streaming
In this section, let's see how to stream tokens using llama-index and PremAI. It is very similar to the approach above. Here is how to do it.
streamed_response = prem_chat.stream_chat(messages)

for response_delta in streamed_response:
    print(response_delta.delta, end="")
I'm here to assist you with writing tasks, but I don't have personal experiences or attend school. However, I can help you brainstorm ideas, outline your essay, or provide information on various school-related topics. Just let me know how I can assist you further!
This streams tokens one after another. Similar to the complete method, we have the stream_complete method, which streams tokens during completion.
# This streams tokens one by one
streamed_response = prem_chat.stream_complete("Hello how are you")

for response_delta in streamed_response:
    print(response_delta.delta, end="")
Hello! I'm here and ready to assist you. How can I help you today?