OCI Data Science Model Deployment Endpoint
OCI Data Science is a fully managed and serverless platform for data science teams to build, train, and manage machine learning models in Oracle Cloud Infrastructure.
For the latest updates, examples, and experimental features, please refer to the ADS LangChain Integration.
This notebook explains how to use LLMs hosted on an OCI Data Science Model Deployment.
For authentication, the oracle-ads library is used to automatically load the credentials required for invoking the endpoint.
!pip3 install oracle-ads
Prerequisites
Deploy model
You can easily deploy, fine-tune, and evaluate foundation models using AI Quick Actions on OCI Data Science Model Deployment. For additional deployment examples, please visit the Oracle GitHub samples repository.
Policies
Make sure to have the required policies to access the OCI Data Science Model Deployment endpoint.
Setup
After deploying the model, you need to set up the following required parameters for invoking it:
endpoint: The model HTTP endpoint of the deployed model, e.g. https://modeldeployment.<region>.oci.customer-oci.com/<md_ocid>/predict.
Authentication
You can set authentication through either ads or environment variables. When you are working in an OCI Data Science Notebook Session, you can leverage the resource principal to access other OCI resources. Check here to see more options.
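As a minimal sketch, the two common options look like this (the config location and profile name below are the ADS defaults and may differ in your setup):

import ads

# Option 1: resource principal, when running inside an OCI service
# (e.g. a Notebook Session) with resource principal auth configured
ads.set_auth("resource_principal")

# Option 2: API key, when working from a local workstation;
# assumes an OCI config file at ~/.oci/config with a DEFAULT profile
ads.set_auth("api_key", oci_config_location="~/.oci/config", profile="DEFAULT")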
Examples
import ads
from langchain_community.llms import OCIModelDeploymentLLM
# Set authentication through ads
# Use resource principal when you are operating within an
# OCI service that has resource principal based
# authentication configured
ads.set_auth("resource_principal")
# Create an instance of OCI Model Deployment Endpoint
# Replace the endpoint uri and model name with your own
# Using generic class as entry point, you will be able
# to pass model parameters through model_kwargs during
# instantiation.
llm = OCIModelDeploymentLLM(
endpoint="https://modeldeployment.<region>.oci.customer-oci.com/<md_ocid>/predict",
model="odsc-llm",
)
# Run the LLM
llm.invoke("Who is the first president of United States?")
API Reference:OCIModelDeploymentLLM
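As the comment above notes, the generic class accepts model parameters through model_kwargs at instantiation. A minimal sketch, assuming the deployed model accepts common sampling parameters such as max_tokens and temperature:

# Hypothetical parameter values for illustration; the accepted
# keys depend on the inference server backing your deployment.
llm = OCIModelDeploymentLLM(
    endpoint="https://modeldeployment.<region>.oci.customer-oci.com/<md_ocid>/predict",
    model="odsc-llm",
    model_kwargs={"max_tokens": 512, "temperature": 0.2},
)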
import ads
from langchain_community.llms import OCIModelDeploymentVLLM
# Set authentication through ads
# Use resource principal when you are operating within an
# OCI service that has resource principal based
# authentication configured
ads.set_auth("resource_principal")
# Create an instance of OCI Model Deployment Endpoint
# Replace the endpoint uri and model name with your own
# Using a framework-specific class as the entry point, you
# will be able to pass model parameters in the constructor.
llm = OCIModelDeploymentVLLM(
endpoint="https://modeldeployment.<region>.oci.customer-oci.com/<md_ocid>/predict",
)
# Run the LLM
llm.invoke("Who is the first president of United States?")
API Reference:OCIModelDeploymentVLLM
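Since this is a framework-specific class, sampling parameters can be set directly in the constructor instead of through model_kwargs. A sketch, assuming fields such as max_tokens and temperature are exposed (check the API reference above for the exact field names):

# Hypothetical parameter values for illustration
llm = OCIModelDeploymentVLLM(
    endpoint="https://modeldeployment.<region>.oci.customer-oci.com/<md_ocid>/predict",
    max_tokens=512,
    temperature=0.2,
)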
import os
from langchain_community.llms import OCIModelDeploymentTGI
# Set authentication through environment variables
# Use API key setup when you are working from a local
# workstation or on a platform that does not support
# resource principals.
os.environ["OCI_IAM_TYPE"] = "api_key"
os.environ["OCI_CONFIG_PROFILE"] = "default"
os.environ["OCI_CONFIG_LOCATION"] = "~/.oci"
# Set endpoint through environment variables
# Replace the endpoint uri with your own
os.environ["OCI_LLM_ENDPOINT"] = (
"https://modeldeployment.<region>.oci.customer-oci.com/<md_ocid>/predict"
)
# Create an instance of OCI Model Deployment Endpoint
# Using a framework-specific class as the entry point, you
# will be able to pass model parameters in the constructor.
llm = OCIModelDeploymentTGI()
# Run the LLM
llm.invoke("Who is the first president of United States?")
API Reference:OCIModelDeploymentTGI
Asynchronous calls
await llm.ainvoke("Tell me a joke.")
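Because ainvoke is a coroutine, several prompts can be dispatched concurrently over the same endpoint, e.g. with asyncio.gather. A minimal sketch:

import asyncio

# Run multiple prompts concurrently against the same deployment
prompts = ["Tell me a joke.", "Tell me a fun fact about Oracle."]
results = await asyncio.gather(*(llm.ainvoke(p) for p in prompts))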
Streaming calls
for chunk in llm.stream("Tell me a joke."):
print(chunk, end="", flush=True)
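An asynchronous streaming variant is also available through the standard astream method, a sketch:

# Async streaming: print chunks as they arrive
async for chunk in llm.astream("Tell me a joke."):
    print(chunk, end="", flush=True)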
API Reference
For comprehensive details on all features and configurations, please refer to the API reference documentation for each class:
Related
- LLM conceptual guide
- LLM how-to guides