Setup¶
Install packages¶
%pip install llama-index llama-index-callbacks-langfuse
Configure the environment¶
If you haven't already, sign up for Langfuse and get your API keys from your project settings.
import os

# Langfuse
os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-..."
os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-..."
os.environ["LANGFUSE_HOST"] = "https://cloud.langfuse.com"  # 🇪🇺 EU region; 🇺🇸 US region: "https://us.cloud.langfuse.com"

# OpenAI
os.environ["OPENAI_API_KEY"] = "sk-..."
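Hard-coding keys works for a notebook, but a small guard that fails fast when a required variable is missing can save a confusing mid-run error later. The `require_env` helper below is a hypothetical convenience for illustration, not part of the Langfuse or LlamaIndex APIs:

```python
import os


def require_env(name: str) -> str:
    """Return the value of an environment variable, raising if it is unset."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"Missing required environment variable: {name}")
    return value


# Default to the EU region host if none was configured explicitly.
os.environ.setdefault("LANGFUSE_HOST", "https://cloud.langfuse.com")

# After setting your keys (as above), check them up front:
missing = [
    key
    for key in ("LANGFUSE_SECRET_KEY", "LANGFUSE_PUBLIC_KEY", "OPENAI_API_KEY")
    if not os.environ.get(key)
]
if missing:
    print(f"Warning: unset variables: {', '.join(missing)}")
```

Calling `require_env("LANGFUSE_SECRET_KEY")` at startup turns a missing key into an immediate, clearly-labeled error instead of an authentication failure deep inside a request.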
Register the Langfuse callback handler¶
Option 1: Set the global LlamaIndex handler¶
from llama_index.core import global_handler, set_global_handler
set_global_handler("langfuse")
langfuse_callback_handler = global_handler
Option 2: Use the Langfuse callback directly¶
from llama_index.core import Settings
from llama_index.core.callbacks import CallbackManager
from langfuse.llama_index import LlamaIndexCallbackHandler
langfuse_callback_handler = LlamaIndexCallbackHandler()
Settings.callback_manager = CallbackManager([langfuse_callback_handler])
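To see what `CallbackManager` buys you, here is a minimal sketch of the fan-out pattern it implements: one manager dispatches every event to each registered handler, so the Langfuse handler can sit alongside others (a debug logger, token counter, etc.). The classes below are illustrative stand-ins, not LlamaIndex's actual implementations:

```python
class StubHandler:
    """Stand-in for a callback handler such as LlamaIndexCallbackHandler."""

    def __init__(self, name):
        self.name = name
        self.events = []

    def on_event(self, event_type, payload):
        # A real handler would forward this to its backend (e.g. Langfuse).
        self.events.append((event_type, payload))


class StubCallbackManager:
    """Stand-in for CallbackManager: fans each event out to every handler."""

    def __init__(self, handlers):
        self.handlers = list(handlers)

    def dispatch(self, event_type, payload):
        for handler in self.handlers:
            handler.on_event(event_type, payload)


langfuse_like = StubHandler("langfuse")
debug_like = StubHandler("debug")
manager = StubCallbackManager([langfuse_like, debug_like])

manager.dispatch("query", {"question": "What did the author do growing up?"})
print(langfuse_like.events)  # both handlers receive the same event
```

Because the manager holds a list of handlers, adding observability is just appending another handler to the `CallbackManager` constructor call above; no engine code changes.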
Flush events to Langfuse¶
The Langfuse SDK queues and batches events in the background to reduce the number of network requests and improve overall performance. Before your application exits, make sure all queued events have been flushed to the Langfuse servers.
# ... your LlamaIndex calls here ...

langfuse_callback_handler.flush()
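One way to guarantee that final flush happens even on an early exit is to register it with the standard-library `atexit` module; with the real handler that is just `atexit.register(langfuse_callback_handler.flush)`. The sketch below uses a stub handler to illustrate the queue-then-flush behavior without needing Langfuse credentials:

```python
import atexit


class StubHandler:
    """Stand-in for the Langfuse handler: events queue up until flush() sends them."""

    def __init__(self):
        self.queued = []
        self.sent = []

    def log(self, event):
        self.queued.append(event)

    def flush(self):
        # A real handler would send the batch over the network here.
        self.sent.extend(self.queued)
        self.queued.clear()


handler = StubHandler()

# Ensure queued events are sent even if the script exits without an explicit flush.
atexit.register(handler.flush)

handler.log("trace-1")
handler.log("trace-2")
handler.flush()  # or rely on the atexit hook
print(handler.sent)  # ['trace-1', 'trace-2']
```

Flushing manually at checkpoints (as in the cell above) is still useful when you want traces visible in the UI right away; the `atexit` hook is just a safety net.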
Done! ✨ Traces and metrics from your LlamaIndex application are now automatically tracked in Langfuse. Whenever you build a new index or query an LLM with your documents in context, your traces and metrics appear immediately in the Langfuse UI. Next, let's look at what actually gets traced in Langfuse.
Example¶
Fetch and save the example data.
!mkdir -p 'data/'
!wget 'https://raw.githubusercontent.com/run-llama/llama_index/main/docs/docs/examples/data/paul_graham/paul_graham_essay.txt' -O 'data/paul_graham_essay.txt'
Run an example index construction, query, and chat.
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

# Create an index
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)

# Run a query
query_engine = index.as_query_engine()
query_response = query_engine.query("What did the author do growing up?")
print(query_response)

# Run a chat query
chat_engine = index.as_chat_engine()
chat_response = chat_engine.chat("What did the author do growing up?")
print(chat_response)

# As we want to see the results in Langfuse immediately, we flush the callback handler
langfuse_callback_handler.flush()
📚 More details¶
Check out the full Langfuse documentation to learn more about Langfuse's tracing and analytics capabilities and how to make the most of this integration.