Eden AI
Eden AI is revolutionizing the AI landscape by uniting the best AI providers, empowering users to unlock limitless possibilities and tap into the true potential of artificial intelligence. With an all-in-one, hassle-free platform, users can deploy AI features to production quickly, accessing the full breadth of AI capabilities through a single API. (website: https://edenai.co/)
This example goes over how to use LangChain to interact with Eden AI models.
Accessing the EDENAI API requires an API key, which you can get by creating an account (https://app.edenai.run/user/register) and heading here: https://app.edenai.run/admin/account/settings
Once we have a key, we'll want to set it as an environment variable by running:
export EDENAI_API_KEY="..."
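From a Python session, the same variable can also be set programmatically for the current process only (a minimal sketch; replace the placeholder with your actual key):

```python
import os

# Set the key for this process if it is not already defined in the shell.
# Child processes inherit it; shells started separately will not see it.
os.environ.setdefault("EDENAI_API_KEY", "...")
```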
If you'd prefer not to set an environment variable, you can pass the key in directly via the edenai_api_key named parameter when initiating the EdenAI LLM class:
from langchain_community.llms import EdenAI
API Reference: EdenAI
llm = EdenAI(edenai_api_key="...", provider="openai", temperature=0.2, max_tokens=250)
Calling a model
The EdenAI API brings together various providers, each offering multiple models.
To access a specific model, you can simply pass 'model' during instantiation.
For instance, let's explore the models provided by OpenAI, such as GPT3.5.
Text Generation
from langchain.chains import LLMChain
from langchain_core.prompts import PromptTemplate
llm = EdenAI(
feature="text",
provider="openai",
model="gpt-3.5-turbo-instruct",
temperature=0.2,
max_tokens=250,
)
prompt = """
User: Answer the following yes/no question by reasoning step by step. Can a dog drive a car?
Assistant:
"""
llm.invoke(prompt)
API Reference: LLMChain | PromptTemplate
Image Generation
import base64
from io import BytesIO
from PIL import Image
def print_base64_image(base64_string):
# Decode the base64 string into binary data
decoded_data = base64.b64decode(base64_string)
# Create an in-memory stream to read the binary data
image_stream = BytesIO(decoded_data)
# Open the image using PIL
image = Image.open(image_stream)
# Display the image
image.show()
text2image = EdenAI(feature="image", provider="openai", resolution="512x512")
image_output = text2image.invoke("A cat riding a motorcycle by Picasso")
print_base64_image(image_output)
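If you want to keep the generated image rather than only display it, the base64 payload can also be written straight to disk (a small helper sketched here; the filename is arbitrary):

```python
import base64

def save_base64_image(base64_string, path):
    # Decode the base64 payload and write the raw bytes to a file
    with open(path, "wb") as f:
        f.write(base64.b64decode(base64_string))

# e.g. save_base64_image(image_output, "cat_motorcycle.png")
```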
Text Generation with Callbacks
from langchain_community.llms import EdenAI
from langchain_core.callbacks import StreamingStdOutCallbackHandler
llm = EdenAI(
callbacks=[StreamingStdOutCallbackHandler()],
feature="text",
provider="openai",
temperature=0.2,
max_tokens=250,
)
prompt = """
User: Answer the following yes/no question by reasoning step by step. Can a dog drive a car?
Assistant:
"""
print(llm.invoke(prompt))
API Reference: EdenAI | StreamingStdOutCallbackHandler
Chaining Calls Together
from langchain.chains import LLMChain, SimpleSequentialChain
from langchain_core.prompts import PromptTemplate
llm = EdenAI(feature="text", provider="openai", temperature=0.2, max_tokens=250)
text2image = EdenAI(feature="image", provider="openai", resolution="512x512")
prompt = PromptTemplate(
input_variables=["product"],
template="What is a good name for a company that makes {product}?",
)
chain = LLMChain(llm=llm, prompt=prompt)
second_prompt = PromptTemplate(
input_variables=["company_name"],
template="Write a description of a logo for this company: {company_name}, the logo should not contain text at all ",
)
chain_two = LLMChain(llm=llm, prompt=second_prompt)
third_prompt = PromptTemplate(
input_variables=["company_logo_description"],
template="{company_logo_description}",
)
chain_three = LLMChain(llm=text2image, prompt=third_prompt)
# Run the chain specifying only the input variable for the first chain.
overall_chain = SimpleSequentialChain(
chains=[chain, chain_two, chain_three], verbose=True
)
output = overall_chain.run("hats")
# print the image
print_base64_image(output)
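The data flow of SimpleSequentialChain above, where each step's single output becomes the next step's single input, can be illustrated without any API calls using plain functions (a hypothetical toy sketch, not Eden AI code):

```python
def name_company(product):
    # Stand-in for the first LLM call (company naming)
    return f"{product.capitalize()}Co"

def describe_logo(company_name):
    # Stand-in for the second LLM call (logo description)
    return f"A minimalist emblem for {company_name}"

def sequential_chain(steps, first_input):
    # Pipe each output into the next step, like SimpleSequentialChain
    value = first_input
    for step in steps:
        value = step(value)
    return value

print(sequential_chain([name_company, describe_logo], "hats"))
# → A minimalist emblem for HatsCo
```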
Related
- LLM conceptual guide
- LLM how-to guides