Setup¶
If you're opening this notebook on Colab, you will probably need to install LlamaIndex 🦙.
%pip install llama-index-llms-oci-genai
!pip install llama-index
You will also need to install the OCI SDK.
!pip install -U oci
Basic Usage¶
Using the LLMs offered by OCI Generative AI with LlamaIndex only requires you to initialize the OCIGenAI interface with your OCI endpoint, model ID, compartment OCID, and authentication method.
Call complete with a prompt¶
from llama_index.llms.oci_genai import OCIGenAI

llm = OCIGenAI(
    model="MY_MODEL",
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
    compartment_id="MY_OCID",
)

resp = llm.complete("Paul Graham is ")
print(resp)
Call chat with a list of messages¶
from llama_index.llms.oci_genai import OCIGenAI
from llama_index.core.llms import ChatMessage

messages = [
    ChatMessage(
        role="system", content="You are a pirate with a colorful personality"
    ),
    ChatMessage(role="user", content="Tell me a story"),
]

llm = OCIGenAI(
    model="MY_MODEL",
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
    compartment_id="MY_OCID",
)

resp = llm.chat(messages)
print(resp)
Streaming¶
Using the stream_complete endpoint
from llama_index.llms.oci_genai import OCIGenAI

llm = OCIGenAI(
    model="MY_MODEL",
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
    compartment_id="MY_OCID",
)

resp = llm.stream_complete("Paul Graham is ")
for r in resp:
    print(r.delta, end="")
Using the stream_chat endpoint
from llama_index.llms.oci_genai import OCIGenAI
from llama_index.core.llms import ChatMessage

messages = [
    ChatMessage(
        role="system", content="You are a pirate with a colorful personality"
    ),
    ChatMessage(role="user", content="Tell me a story"),
]

llm = OCIGenAI(
    model="MY_MODEL",
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
    compartment_id="MY_OCID",
)

resp = llm.stream_chat(messages)
for r in resp:
    print(r.delta, end="")
Async¶
Native async is currently not supported. Async calls will fall back to synchronous execution.
from llama_index.llms.oci_genai import OCIGenAI
from llama_index.core.llms import ChatMessage

messages = [
    ChatMessage(
        role="system", content="You are a pirate with a colorful personality"
    ),
    ChatMessage(role="user", content="Tell me a story"),
]

llm = OCIGenAI(
    model="MY_MODEL",
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
    compartment_id="MY_OCID",
)

resp = await llm.achat(messages)
print(resp)

resp = await llm.astream_chat(messages)
async for r in resp:
    print(r.delta, end="")
Configure Model¶
from llama_index.llms.oci_genai import OCIGenAI

llm = OCIGenAI(
    model="cohere.command",
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
    compartment_id="MY_OCID",
)

resp = llm.complete("Paul Graham is ")
print(resp)
from llama_index.llms.oci_genai import OCIGenAI

llm = OCIGenAI(
    model="MY_MODEL",
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
    compartment_id="MY_OCID",
    auth_type="SECURITY_TOKEN",
    auth_profile="MY_PROFILE",  # replace with your profile name
)

resp = llm.complete("Paul Graham is ")
print(resp)
from llama_index.llms.oci_genai import OCIGenAI
from llama_index.core.llms import ChatMessage

llm = OCIGenAI(
    model="ocid1.generativeaiendpoint.oc1.us-chicago-1....",
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
    compartment_id="DEDICATED_COMPARTMENT_OCID",
    auth_profile="MY_PROFILE",  # replace with your profile name
    provider="MODEL_PROVIDER",  # e.g., "cohere" or "meta"
    context_size="MODEL_CONTEXT_SIZE",  # e.g., 128000
)

messages = [
    ChatMessage(
        role="system", content="You are a pirate with a colorful personality"
    ),
    ChatMessage(role="user", content="Tell me a story"),
]

resp = llm.chat(messages)
print(resp)