Customizing Chat Prompts
If you're opening this Notebook on Colab, you will probably need to install LlamaIndex 🦙.
In [ ]:
%pip install llama-index-llms-openai
In [ ]:
!pip install llama-index
Setup Prompts
Below, we take the default prompts and customize them so that the model always answers, even when the context isn't helpful.
We show two ways of setting up the prompts:
- Explicitly define ChatMessage and MessageRole objects.
- Call ChatPromptTemplate.from_messages.
In [ ]:
qa_prompt_str = (
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Given the context information and not prior knowledge, "
    "answer the question: {query_str}\n"
)

refine_prompt_str = (
    "We have the opportunity to refine the original answer "
    "(only if needed) with some more context below.\n"
    "------------\n"
    "{context_msg}\n"
    "------------\n"
    "Given the new context, refine the original answer to better "
    "answer the question: {query_str}. "
    "If the context isn't useful, output the original answer again.\n"
    "Original Answer: {existing_answer}"
)
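These are plain Python format strings, so you can spot-check the placeholders with str.format before wrapping them in chat messages (the values below are illustrative only):

In [ ]:
# Substitute dummy values to preview the rendered QA prompt.
print(
    qa_prompt_str.format(
        context_str="(retrieved text)", query_str="(user question)"
    )
)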
1. Explicitly define ChatMessage and MessageRole objects
In [ ]:
from llama_index.core.llms import ChatMessage, MessageRole
from llama_index.core import ChatPromptTemplate

# Text QA Prompt
chat_text_qa_msgs = [
    ChatMessage(
        role=MessageRole.SYSTEM,
        content=(
            "Always answer the question, even if the context isn't helpful."
        ),
    ),
    ChatMessage(role=MessageRole.USER, content=qa_prompt_str),
]
text_qa_template = ChatPromptTemplate(chat_text_qa_msgs)

# Refine Prompt
chat_refine_msgs = [
    ChatMessage(
        role=MessageRole.SYSTEM,
        content=(
            "Always answer the question, even if the context isn't helpful."
        ),
    ),
    ChatMessage(role=MessageRole.USER, content=refine_prompt_str),
]
refine_template = ChatPromptTemplate(chat_refine_msgs)
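A quick way to verify a template is to render it with dummy values. This is a minimal sanity check; format_messages is the ChatPromptTemplate method that substitutes the variables and returns the resulting chat messages, and the context/question strings below are made up for illustration:

In [ ]:
# Render the QA template with placeholder values (illustrative only)
# to inspect the final chat messages.
messages = text_qa_template.format_messages(
    context_str="Paul Graham co-founded Y Combinator.",
    query_str="Who co-founded Y Combinator?",
)
for message in messages:
    print(message.role, ":", message.content)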
2. Call ChatPromptTemplate.from_messages
from_messages is syntactic sugar that lets you define a chat prompt template as a list of tuples, where each tuple corresponds to a chat message ("role", "message").
In [ ]:
from llama_index.core import ChatPromptTemplate

# Text QA Prompt
chat_text_qa_msgs = [
    (
        "system",
        "Always answer the question, even if the context isn't helpful.",
    ),
    ("user", qa_prompt_str),
]
text_qa_template = ChatPromptTemplate.from_messages(chat_text_qa_msgs)

# Refine Prompt
chat_refine_msgs = [
    (
        "system",
        "Always answer the question, even if the context isn't helpful.",
    ),
    ("user", refine_prompt_str),
]
refine_template = ChatPromptTemplate.from_messages(chat_refine_msgs)
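Both approaches produce equivalent templates. As a quick check (a sketch assuming the message_templates attribute, which holds the underlying ChatMessage objects), you can confirm that the tuples were converted into the same messages as in section 1:

In [ ]:
# The ("role", "message") tuples are converted into ChatMessage objects.
for msg in text_qa_template.message_templates:
    print(msg.role, ":", msg.content)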
Using the Prompts
Now, we use these prompts in an index query!
In [ ]:
import openai
import os
os.environ["OPENAI_API_KEY"] = "sk-..."
openai.api_key = os.environ["OPENAI_API_KEY"]
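Hardcoding a key in a notebook is easy to leak. As an alternative, here is a minimal sketch that prompts for the key at runtime using the standard library's getpass:

In [ ]:
import os
from getpass import getpass

# Ask for the key interactively instead of hardcoding it.
if "OPENAI_API_KEY" not in os.environ:
    os.environ["OPENAI_API_KEY"] = getpass("OpenAI API key: ")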
Download Data
In [ ]:
!mkdir -p 'data/paul_graham/'
!wget 'https://raw.githubusercontent.com/run-llama/llama_index/main/docs/docs/examples/data/paul_graham/paul_graham_essay.txt' -O 'data/paul_graham/paul_graham_essay.txt'
In [ ]:
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from llama_index.llms.openai import OpenAI

documents = SimpleDirectoryReader("./data/paul_graham/").load_data()

# Create the index with a chat model, so that we can use the chat prompts!
llm = OpenAI(model="gpt-3.5-turbo", temperature=0.1)
index = VectorStoreIndex.from_documents(documents)
Before Adding Templates
In [ ]:
print(index.as_query_engine(llm=llm).query("Who is Joe Biden?"))
I'm unable to provide an answer to that question based on the context information provided.
After Adding Templates
In [ ]:
print(
    index.as_query_engine(
        text_qa_template=text_qa_template,
        refine_template=refine_template,
        llm=llm,
    ).query("Who is Joe Biden?")
)
Joe Biden is the current President of the United States, having taken office in January 2021. He previously served as Vice President under President Barack Obama from 2009 to 2017.
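You can also swap templates on an existing query engine instead of passing them at construction time. The sketch below uses the get_prompts/update_prompts API; the "response_synthesizer:..." keys follow the module:prompt_name convention, but it's worth printing the keys on your version to confirm:

In [ ]:
query_engine = index.as_query_engine(llm=llm)

# Inspect which prompt keys this engine exposes.
print(query_engine.get_prompts().keys())

# Swap in the custom templates after construction.
query_engine.update_prompts(
    {
        "response_synthesizer:text_qa_template": text_qa_template,
        "response_synthesizer:refine_template": refine_template,
    }
)
print(query_engine.query("Who is Joe Biden?"))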