
docs: langfuse on spcaes guide and gradio example #1529

Open: jannikmaierhoefer wants to merge 21 commits into base: main. Changes shown from 3 commits.
Commits:

- 7496fef docs: langfuse on spcaes guide and gradio example (jannikmaierhoefer, Dec 16, 2024)
- 0fffbdb edit toctree (jannikmaierhoefer, Dec 17, 2024)
- fcc7485 text edit (jannikmaierhoefer, Dec 17, 2024)
- 82e3972 edit troubleshooting part (jannikmaierhoefer, Dec 19, 2024)
- f024ddc edit text (jannikmaierhoefer, Dec 20, 2024)
- 41dfe8a update numbers (jannikmaierhoefer, Dec 20, 2024)
- ac70405 fix spelling (jannikmaierhoefer, Dec 20, 2024)
- 8462e6f Update docs/hub/spaces-sdks-docker-langfuse.md (jannikmaierhoefer, Dec 20, 2024)
- ad18cdd Update docs/hub/spaces-sdks-docker-langfuse.md (jannikmaierhoefer, Dec 20, 2024)
- a0dfd6e Update docs/hub/spaces-sdks-docker-langfuse.md (jannikmaierhoefer, Dec 20, 2024)
- 112e7e9 Update docs/hub/spaces-sdks-docker-langfuse.md (jannikmaierhoefer, Dec 20, 2024)
- 35f25b3 Update docs/hub/spaces-sdks-docker-langfuse.md (jannikmaierhoefer, Dec 20, 2024)
- 19dc6c0 Update docs/hub/spaces-sdks-docker-langfuse.md (jannikmaierhoefer, Dec 20, 2024)
- 737301c move troubleshoot section to gradio template readme as this is only g… (jannikmaierhoefer, Dec 20, 2024)
- 4c74941 Update docs/hub/spaces-sdks-docker-langfuse.md (jannikmaierhoefer, Dec 20, 2024)
- 9849195 edit gradio link name (jannikmaierhoefer, Dec 20, 2024)
- 9336d5d Apply suggestions from code review (andrewrreed, Dec 20, 2024)
- 192fb20 fix setup steps numbered list formatting (andrewrreed, Dec 20, 2024)
- b1a5a3d Add simple tracing example with HF Serverless API (andrewrreed, Dec 20, 2024)
- 0ca2049 remove <tip> for link formatting (andrewrreed, Dec 20, 2024)
- d08059e point "Deploy on HF" to preselected template (andrewrreed, Dec 20, 2024)
2 changes: 2 additions & 0 deletions in docs/hub/_toctree.yml

```diff
@@ -285,6 +285,8 @@
     title: Evidence on Spaces
   - local: spaces-sdks-docker-marimo
     title: marimo on Spaces
+  - local: spaces-sdks-docker-langfuse
+    title: Langfuse on Spaces
   - local: spaces-embed
     title: Embed your Space
   - local: spaces-run-with-docker
```
104 changes: 104 additions & 0 deletions in docs/hub/spaces-sdks-docker-langfuse.md

@@ -0,0 +1,104 @@
# Langfuse on Spaces

[Langfuse](https://langfuse.com) is an open-source LLM observability platform that helps teams collaboratively debug, analyze, and iterate on their LLM applications. With Langfuse, you can capture detailed traces of your applications, manage prompts, evaluate outputs, and more—all in one place.

## What is Langfuse?

Langfuse provides tools to monitor and understand the internal states of your large language model (LLM) applications. It enables developers to track LLM inference, embedding retrieval, API usage, and other interactions, making it easier to pinpoint problems and improve application performance.

Key features of Langfuse include:

- **LLM tracing** to capture the full context of your application's execution flow
- **Prompt management** for centralized and collaborative prompt iteration
- **Evaluation metrics** to assess output quality
- **Dataset creation** for testing and benchmarking
- **Playground** to experiment with prompts and model configurations

## Why LLM Observability?

As LLMs become more prevalent, understanding their behavior and performance is crucial. LLM observability refers to monitoring and understanding the internal states of an LLM application through its outputs. This is essential for addressing challenges such as complex control flows, non-deterministic outputs, and varied user intents.

Building LLM applications involves intricate workflows with repeated or chained calls, making debugging challenging. The non-deterministic nature of LLM outputs adds complexity to consistent quality assessment, and varied user inputs require deep understanding to improve user experience.

Implementing LLM observability helps in debugging complex workflows, evaluating output quality over time, and analyzing user behavior. By gaining insights into your application's performance, you can enhance reliability and user satisfaction.

## Deploy Langfuse on Spaces

You can deploy Langfuse on Hugging Face Spaces effortlessly and start using it within minutes.

### Steps to Deploy Langfuse:

> **Review comment:** Adapt to using the HF default Spaces templates instead of the Langfuse template Space. @andrewrreed to confirm that the Space will be available in the HF Spaces menu. [Screenshot: CleanShot 2024-12-19 at 12 18 00]
>
> Notes:
>
> - Pick Langfuse from the "select space" menu
> - Enable persistent storage
> - Configure environment variables / secrets

> **Author (jannikmaierhoefer):** I will update the image as soon as Langfuse is available as a Docker template.


1. **Open the Langfuse Template Space:**

Click the button below to create your own Langfuse Space:

[![Open Space](https://huggingface.co/datasets/huggingface/badges/resolve/main/deploy-to-spaces-lg.svg)](https://huggingface.co/spaces/langfuse/langfuse-template-space)

2. **Open the Langfuse Instance:**

   - Click on the **"Open in new tab"** button.

3. **Authenticate with Hugging Face OAuth:**

   - On the Langfuse login page, click on **"Sign in with Hugging Face"**.
   - Grant the necessary permissions when prompted.

4. **Start Using Langfuse:**

   After authentication, you will have a fully functioning Langfuse instance running on Hugging Face Spaces.

## Get Started with Langfuse

Now that you have Langfuse running, you can begin integrating it with your LLM applications.

### 1. Create a New Project

Create a new organization and project in Langfuse.

### 2. Generate API Credentials

Navigate to **Project Settings**, and under **API Keys**, click on **"Create New Key"**. Copy the **Public Key** and **Secret Key**; you'll need them to authenticate when sending data to Langfuse.
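These keys are typically passed to your application as environment variables, for example as Space secrets. A sketch with placeholder values (substitute the keys you just copied; the host is your own Space's URL, shown here as a hypothetical example):

```shell
# Placeholder values: substitute the keys copied from Project Settings / API Keys
export LANGFUSE_PUBLIC_KEY="pk-lf-..."
export LANGFUSE_SECRET_KEY="sk-lf-..."
# Point the SDK at your Space instead of Langfuse Cloud
export LANGFUSE_HOST="https://your-username-your-space.hf.space"
```

The Langfuse SDKs read these variables automatically, so no keys need to appear in your application code.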

### 3. Create a Sample Gradio Chat Application

To create a sample Gradio chat application in Hugging Face Spaces, follow these steps:

1. **Set Up Your Space:**

- Navigate to Hugging Face Spaces and create a new Space.
- Choose the appropriate template or start with a blank Space.

2. **Add the Application Code:**

- Create a file named `app.py` in your Space.
- Copy the application code from [app.py](docs/hub/app.py) and paste it into your `app.py` file in the Space.

3. **Define Dependencies:**

- Create a `requirements.txt` file in your Space.
- List all necessary dependencies for your application. For example:
```
gradio
langfuse
openai
```

4. **Launch the Application:**

- Once the `app.py` and `requirements.txt` files are set up, start your Space.
- The application will launch, and you can interact with the Gradio chat interface.

5. **View Example Traces in Langfuse:**

- After starting the application, navigate to your Langfuse dashboard.
- Go to the **Traces** section to view the example traces generated by your Gradio chat application.

By following these steps, you can quickly set up and run a Gradio chat application in Hugging Face Spaces and observe its traces in Langfuse.

### 4. View Traces in Langfuse Dashboard

Open your Langfuse dashboard, navigate to **Traces** to see the recorded traces from your application, and use the observability tools to analyze and debug your LLM applications.

For detailed instructions and advanced features, refer to the [Langfuse Get Started Guide](https://langfuse.com/docs/get-started).

## Feedback and Support

We value your feedback and are here to help if you have any questions.

- **Join Our Community:** Engage with us on [Discord](https://discord.gg/langfuse) or via [GitHub Discussions](https://github.com/langfuse/langfuse/discussions)
- **Report Issues:** Submit issues or feature requests on our [GitHub Issues](https://github.com/langfuse/langfuse/issues) page
- **Contact Us:** Reach out via our [Support Page](https://langfuse.com/support) or email us at [[email protected]](mailto:[email protected])