Outlines
This will help you get started with the Outlines LLM. For detailed documentation of all Outlines features and configurations, head to the API reference.
Outlines is a library for constrained language generation. It allows you to use large language models (LLMs) with various backends while applying constraints to the generated output.
Overview
Integration details
Class | Package | Local | Serializable | JS support
---|---|---|---|---
Outlines | langchain-community | ✅ | beta | ❌
Setup
To access Outlines models, you will need an internet connection to download the model weights from Hugging Face. Depending on which backend you use, you will also need to install its required dependencies (see the Outlines documentation).
Credentials
Outlines has no built-in authentication mechanism.
Installation
The LangChain Outlines integration lives in the langchain-community package and requires the outlines library:
%pip install -qU langchain-community outlines
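Depending on the backend you plan to use, you will also need that backend's own package. The package names below are examples rather than an exhaustive list (for the llamacpp, vllm, mlxlm, and transformers backends respectively); check the Outlines documentation for the exact requirements on your platform.
%pip install -qU llama-cpp-python
%pip install -qU vllm
%pip install -qU mlx-lm
%pip install -qU transformers torch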
Instantiation
Now we can instantiate our model object and generate completions:
from langchain_community.llms import Outlines
# For use with llamacpp backend
model = Outlines(model="microsoft/Phi-3-mini-4k-instruct", backend="llamacpp")
# For use with vllm backend (not available on Mac)
model = Outlines(model="microsoft/Phi-3-mini-4k-instruct", backend="vllm")
# For use with mlxlm backend (only available on Mac)
model = Outlines(model="microsoft/Phi-3-mini-4k-instruct", backend="mlxlm")
# For use with huggingface transformers backend
model = Outlines(
    model="microsoft/Phi-3-mini-4k-instruct"
)  # defaults to backend="transformers"
API Reference: Outlines
Invocation
model.invoke("Hello how are you?")
Chaining
from langchain_core.prompts import PromptTemplate
prompt = PromptTemplate.from_template("How to say {input} in {output_language}:\n")
chain = prompt | model
chain.invoke(
    {
        "output_language": "German",
        "input": "I love programming.",
    }
)
API Reference: PromptTemplate
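Because prompt | model is a standard LangChain runnable, the other runnable methods work on the resulting chain as well. The sketch below uses .batch() with made-up inputs purely for illustration; it is not part of the original example.
# Run the chain over several inputs at once via the Runnable .batch() method
chain.batch(
    [
        {"output_language": "German", "input": "Good morning."},
        {"output_language": "Spanish", "input": "Good night."},
    ]
)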
Streaming
Outlines supports streaming of tokens:
for chunk in model.stream("Count to 10 in French:"):
    print(chunk, end="", flush=True)
Constrained generation
Outlines allows you to apply various constraints to the generated output:
Regex constraints
model.regex = r"((25[0-5]|2[0-4]\d|[01]?\d\d?)\.){3}(25[0-5]|2[0-4]\d|[01]?\d\d?)"
response = model.invoke("What is the IP address of Google's DNS server?")
response
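Since the regex above constrains the output to a dotted-quad IPv4 string with each octet in range, the response can be parsed directly. A minimal illustrative check using the standard library (not part of the original example):
import ipaddress
# The regex guarantees a well-formed IPv4 address, so this parses without raising
ipaddress.ip_address(response)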
Type constraints
model.type_constraints = int
response = model.invoke("What is the answer to life, the universe, and everything?")
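The model still returns its answer as a string, so the integer-constrained output can be converted directly; an illustrative line, assuming the whole response is the integer literal:
# type_constraints=int restricts the generated text to an integer, so this conversion succeeds
int(response)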
JSON schemas
from pydantic import BaseModel
class Person(BaseModel):
    name: str
model.json_schema = Person
response = model.invoke("Who is the author of LangChain?")
person = Person.model_validate_json(response)
person
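The same pattern extends to richer schemas. The Book model below is a hypothetical example added to show multiple typed fields; it is not part of the original docs:
from pydantic import BaseModel
# Any Pydantic model that can be turned into a JSON schema should work the same way
class Book(BaseModel):
    title: str
    author: str
    year: int
model.json_schema = Book
response = model.invoke("Describe a well-known programming book.")
book = Book.model_validate_json(response)
book.title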
Grammar constraints
model.grammar = """
?start: expression
?expression: term (("+" | "-") term)*
?term: factor (("*" | "/") factor)*
?factor: NUMBER | "-" factor | "(" expression ")"
%import common.NUMBER
%import common.WS
%ignore WS
"""
response = model.invoke("Give me a complex arithmetic expression:")
response
API reference
For detailed documentation of all ChatOutlines features and configurations, head to the API reference: https://python.langchain.com/api_reference/community/chat_models/langchain_community.chat_models.outlines.ChatOutlines.html
Outlines documentation:
https://dottxt-ai.github.io/outlines/latest/
Related
- LLM conceptual guide
- LLM how-to guides