April 9, 2024

OpenAI Integration tracks used Langfuse Prompts

Marc Klingen

The OpenAI SDK integration for Langfuse Tracing now supports capturing which prompt (and prompt version) from Prompt Management was used for a generation.

Example

from langfuse import Langfuse
from langfuse.openai import openai  # Langfuse drop-in wrapper around the OpenAI SDK

langfuse = Langfuse()

# Fetch the prompt from Langfuse Prompt Management
prompt = langfuse.get_prompt("calculator")

# Pass the prompt object via `langfuse_prompt` to link the generation to the prompt version
openai.chat.completions.create(
  model="gpt-3.5-turbo",
  messages=[
    {"role": "system", "content": prompt.compile(base=10)},
    {"role": "user", "content": "1 + 1 = "},
  ],
  langfuse_prompt=prompt,
)
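
The fetched prompt can also be pinned to a specific version so that the resulting trace is linked to exactly that version. A minimal sketch, assuming the optional version parameter of get_prompt:

# Sketch: pin a specific prompt version instead of the current production version
# (assumes get_prompt's optional `version` parameter)
prompt = langfuse.get_prompt("calculator", version=2)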

Thanks to @fancyweb for contributing this!

See the prompt management docs for more details and an example notebook.
