Observability with OpenLLMetry¶
OpenLLMetry is an open-source project based on OpenTelemetry for tracing and monitoring LLM applications. It connects to all major observability platforms (such as Datadog, Dynatrace, Honeycomb, and New Relic) and can be installed in minutes.
If you're opening this notebook on Colab, you will probably need to install LlamaIndex 🦙 and OpenLLMetry.
In [ ]:
!pip install llama-index
!pip install traceloop-sdk
Configure API keys¶
Sign up for a Traceloop account at app.traceloop.com. Then go to the API keys page and create a new API key. Copy the key and paste it into the cell below.
If you prefer to use a different observability platform such as Datadog, Dynatrace, or Honeycomb, you can find instructions on how to configure it here (a sketch of pointing the SDK at a different backend follows the next cell).
In [ ]:
import os
os.environ["OPENAI_API_KEY"] = "sk-..."
os.environ["TRACELOOP_API_KEY"] = "..."
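If you want to send traces to a different OpenTelemetry-compatible backend instead of Traceloop Cloud, the SDK can be pointed at it via environment variables. The sketch below is an assumption based on the Traceloop SDK's documented TRACELOOP_BASE_URL and TRACELOOP_HEADERS variables; the endpoint and header value are placeholders, so check your platform's integration guide for the exact settings.

import os

# Assumed configuration: point the SDK at a generic OTLP collector.
# Both the endpoint and the header value below are placeholders.
os.environ["TRACELOOP_BASE_URL"] = "https://my-otel-collector.example.com:4318"
os.environ["TRACELOOP_HEADERS"] = "x-api-key=..."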
Initialize OpenLLMetry¶
In [ ]:
from traceloop.sdk import Traceloop
Traceloop.init()
Traceloop syncing configuration and prompts
Traceloop exporting traces to https://api.traceloop.com authenticating with bearer token
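Traceloop.init() also accepts optional arguments. As a minimal sketch (assuming the app_name and disable_batch parameters described in the Traceloop SDK docs; verify against the version you installed), you can name the application and export spans immediately rather than in batches, which is handy when experimenting in a notebook:

from traceloop.sdk import Traceloop

# Assumed options: "llama-index-demo" is an arbitrary service name, and
# disable_batch=True exports each span as soon as it finishes.
Traceloop.init(app_name="llama-index-demo", disable_batch=True)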
Download data¶
In [ ]:
!mkdir -p 'data/paul_graham/'
!wget 'https://raw.githubusercontent.com/run-llama/llama_index/main/docs/docs/examples/data/paul_graham/paul_graham_essay.txt' -O 'data/paul_graham/paul_graham_essay.txt'
--2024-01-12 12:43:16--  https://raw.githubusercontent.com/run-llama/llama_index/main/docs/docs/examples/data/paul_graham/paul_graham_essay.txt
Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 185.199.109.133, 185.199.108.133, 185.199.111.133, ...
Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|185.199.109.133|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 75042 (73K) [text/plain]
Saving to: ‘data/paul_graham/paul_graham_essay.txt’

data/paul_graham/pa 100%[===================>]  73.28K  --.-KB/s    in 0.02s

2024-01-12 12:43:17 (3.68 MB/s) - ‘data/paul_graham/paul_graham_essay.txt’ saved [75042/75042]
In [ ]:
from llama_index.core import SimpleDirectoryReader
docs = SimpleDirectoryReader("./data/paul_graham/").load_data()
Run a query¶
In [ ]:
Copied!
from llama_index.core import VectorStoreIndex
index = VectorStoreIndex.from_documents(docs)
query_engine = index.as_query_engine()
response = query_engine.query("What did the author do growing up?")
print(response)
The author wrote short stories and also worked on programming, specifically on an IBM 1401 computer in 9th grade. They used an early version of Fortran and typed programs on punch cards. They also mentioned getting a microcomputer, a TRS-80, in about 1980 and started programming on it.
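When you wrap the query engine in your own application code, the Traceloop SDK also provides decorators for grouping related spans under a single named trace. A minimal sketch, assuming the workflow decorator from traceloop.sdk.decorators; the function and workflow name below are illustrative and not part of the original notebook:

from traceloop.sdk.decorators import workflow

# Hypothetical wrapper: all spans produced inside this call are grouped
# under one workflow named "paul_graham_qa" in the observability platform.
@workflow(name="paul_graham_qa")
def ask(question: str) -> str:
    return str(query_engine.query(question))

print(ask("What did the author do growing up?"))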