Cognee Integration with Litefuse
What is Cognee? Cognee is an open-source AI memory engine that turns your data into a searchable, reasoning-ready knowledge graph. Pairing Cognee with Litefuse gives you production-grade tracing, evaluation, and analytics for every pipeline step and search query. Check out the GitHub repo or the docs for details.
What is Litefuse? Litefuse is the open-source LLM engineering platform. It helps teams trace applications, debug issues, evaluate quality, and monitor costs in production.
Quick Start Guide
Step 1: Install Cognee (includes Litefuse)
pip install cognee  # langfuse is declared as a dependency and will be installed automatically
Step 2: Create a Litefuse Project
- Sign up at Litefuse Cloud.
- Create a new project and copy your public and secret API keys.
Step 3: Configure Environment Variables
Create a .env file or export the variables directly in your shell:
LANGFUSE_PUBLIC_KEY=<your public key>
LANGFUSE_SECRET_KEY=<your secret key>
LANGFUSE_HOST=https://litefuse.cloud # 🇪🇺 EU region
# LANGFUSE_HOST=https://litefuse.cloud # 🇺🇸 US region
Step 4: Trace Cognee Functions
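Before running any traced workflow, it can help to verify that the variables from Step 3 are actually set. The helper below is a small sketch, not part of the Cognee or Litefuse API; the function name is illustrative.

```python
import os

# Hypothetical helper: report which required Litefuse credentials are unset.
# The names match the environment variables configured in Step 3.
REQUIRED_VARS = ("LANGFUSE_PUBLIC_KEY", "LANGFUSE_SECRET_KEY", "LANGFUSE_HOST")

def missing_litefuse_vars(env=os.environ):
    """Return the names of any required Litefuse variables that are unset or empty."""
    return [name for name in REQUIRED_VARS if not env.get(name)]

# Example: against an empty environment, all three names are reported missing.
print(missing_litefuse_vars(env={}))
```

Calling this at startup and failing fast on a non-empty result avoids silently running without tracing.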
Cognee ships with a tiny wrapper around Litefuse. Import get_observe() and decorate any function you want to monitor.
from cognee.modules.observability.get_observe import get_observe
observe = get_observe()
@observe(as_type="generation")  # optional label
async def acreate_structured_output(...):
    ...  # your business logic
Every time the function runs, the decorator automatically opens a span in Litefuse and streams metrics such as duration, token usage, and custom metadata.
Step 5: Start Cognifying & Watch Traces
Run your regular Cognee workflows:
import cognee
import asyncio
from cognee.modules.observability.get_observe import get_observe
observe = get_observe()
@observe(name="simple_example_run", as_type="example")
async def main():
    await cognee.add("Natural language processing (NLP) is ...")
    await cognee.cognify()
    results = await cognee.search("Tell me about NLP")
    for r in results:
        print(r)

asyncio.run(main())
Open the Litefuse UI – traces for any @observe-decorated helper functions will appear.
Adding Your Own Spans
You can instrument any function in your codebase – not just Cognee internals:
from cognee.modules.observability.get_observe import get_observe
observe = get_observe()
@observe(as_type="my_tool", metadata={"foo": "bar"})
def my_helper(arg1, arg2):
    ...

Resources
- Cognee homepage
- Cognee GitHub repository
- Cognee docs for this integration