
Quickstart

Get started with MemFuse in minutes

This guide shows you how to add memory capabilities to your AI applications using the MemFuse Python SDK.

Basic Usage Example

Here's a complete example demonstrating MemFuse integration with OpenAI:

from memfuse.llm import OpenAI
from memfuse import MemFuse
import os
 
# Initialize MemFuse
memfuse_client = MemFuse()
 
memory = memfuse_client.init(user="alice")
 
# Initialize LLM client with memory
llm_client = OpenAI(
    api_key=os.getenv("OPENAI_API_KEY"),
    memory=memory
)
 
# Make a chat completion request
response = llm_client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "I'm planning a trip to Mars. What is the gravity there?"}]
)
 
print(f"Response: {response.choices[0].message.content}")
 
# Ask a follow-up question - MemFuse automatically recalls context
followup_response = llm_client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "What are some challenges of living on that planet?"}]
)
 
print(f"Follow-up: {followup_response.choices[0].message.content}")

Creating a Stateful Conversation

Setting Up

First, ensure you have MemFuse installed. If not, check the Installation guide.

from memfuse.llm import OpenAI
from memfuse import MemFuse
import os

Initialize LLM Client

Create an LLM client for your application. Note that MemFuse's memory contexts are flexible and optional:

memfuse = MemFuse()
 
# All context parameters are optional - memories in MemFuse can exist with flexible context combinations
memory = memfuse.init(user="bob")
 
client = OpenAI(
    api_key=os.getenv("OPENAI_API_KEY"),
    memory=memory
)
 
# You can check the memory identifiers if needed
print(f"Using memory for conversation: {memory}")
# Output would be something like:
# Memory(
#   user='bob', agent='agent_default', session='bob-agent_default-ea7a0d5a',
#   user_id='auto-generated-user-id', agent_id='auto-generated-agent-id', session_id='auto-generated-session-id'
# )

Start a New Conversation

Create your first conversation by sending a message to your agent. MemFuse will automatically establish a persistent memory context.

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "I'm working on a project about space exploration. Can you tell me something interesting about Mars?"}],
)
 
print(response.choices[0].message.content)
# Output: "Hi there! Mars has the largest volcano in the solar system, Olympus Mons, which stands at about 13.6 miles (22 kilometers) high—nearly three times the height of Mount Everest."

Continue the Conversation with Context

MemFuse maintains conversation history by using the same session_id from the memory context. This allows the agent to reference earlier messages without you having to manage the context.

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "That's awesome! My project is about how humans could live on the planet someday. Any thoughts on that?"}]
)
 
print(response.choices[0].message.content)
# Output: "Glad you liked that! For humans to live on Mars, we'd need to tackle a few big challenges—like the thin atmosphere, which is mostly carbon dioxide and offers little protection from solar radiation."

Customize Model Parameters and System Prompts

You can switch to a different model or add a system prompt at any point in the conversation. In this example, we add a system prompt that instructs the agent to respond like a cowboy.

# Get AI to speak like a cowboy
cowboy_system_prompt = """
Respond in an authentic cowboy dialect. Use:
- Cowboy slang: "pardner," "fixin' to," "y'all," "reckon," "howdy"
- Drop g's from -ing words: "ridin'," "talkin'"
- Western expressions: "tough as a boot heel," "right as rain"
- Address user as "pardner" or "buckaroo"
- Reference ranch life: "the trail," "homestead," "saddlebags"
- Use contractions and informal grammar
- Start sentences with "Well shoot," "Listen here," or "I reckon"
 
Maintain accuracy while speaking like you grew up on the range. Don't explain you're using a cowboy persona.
"""
response = client.chat.completions.create(
    model="gpt-4o-mini", # Or use a different model
    messages=[
      {
        "role": "system", "content": cowboy_system_prompt
      },
      {
        "role": "user", "content": "I want to focus on food production since that seems like a major challenge. How could we grow enough food to sustain a colony on the planet?"
      }
    ]
)
 
print(response.choices[0].message.content)
# Output: "Well, pardner, let me tell ya 'bout feedin' folks on the red frontier! Them fancy hydroponics and aeroponics setups would be the trail boss of Mars farmin'! No dirt needed, and they use a measly 5% of the water compared to plowin' fields back on the homestead. You can stack 'em higher than a horse can kick, makin' the most of your limited bunkhouse space! Y'all could be roundin' up fresh grub every 24-30 days if conditions are right as rain. And listen here - tough-as-nails crops like taters, kale, and them soybeans are hardy enough to survive the Martian range and keep your cowboys and cowgirls properly fed. Say, partner, you got any particular crop seeds you're fixin' to pack in your saddlebags for this here project?"

Switch Between Different LLM Providers

MemFuse maintains conversation history regardless of which LLM provider you use. This lets you switch seamlessly between providers while maintaining context.

from memfuse.llm import Anthropic
 
anthropic_client = Anthropic(
    api_key=os.getenv("ANTHROPIC_API_KEY"),
    memory=memory
)
 
anthropic_response = anthropic_client.messages.create(
    model="claude-3-5-haiku-latest",
    max_tokens=1024,
    messages=[
      {
        "role": "user",
        "content": [{"type": "text", "text": "Potatoes seem like a good start, just like in \"The Martian\"! What about water? How would colonists get enough water?"}]
      }
    ],
)
 
# Display response content
if anthropic_response.content and isinstance(anthropic_response.content, list):
    for content_item in anthropic_response.content:
        if hasattr(content_item, 'text'):
            print(content_item.text)
else:
    print("Anthropic response content not in expected format or empty.")
# Output: "The Martian had it right with potatoes! For water, we've got options—there's subsurface ice at the poles and even at mid-latitudes, which could be mined and melted. Mars also has water bound in soil minerals that could be extracted. The real game-changer would be closed-loop water recycling systems that reclaim nearly 100% of water from everything—sweat, urine, washing, you name it. Not glamorous but essential! Would you incorporate water recycling into your habitat designs?"

Understanding Memory Contexts

In MemFuse, a memory is fundamentally owned by a User and can exist within multiple optional contexts. The examples above demonstrate how to create memories with various contexts:

  • User context: Specifying a user is mandatory, as it ensures that memories are associated with a specific user.
  • Agent context: When you specify an agent, memories become associated with a specific AI agent. If not specified, "agent_default" is used.
  • Session context: When you specify a session, memories become tied to a specific conversation session. Messages and interactions within this active session constitute short-term memory. Memories from other sessions belonging to the same user are considered long-term memory, which is automatically retrieved and utilized by default. Developers have the option to disable long-term memory retrieval if needed for specific use cases.

While the user context is always required, the agent and session contexts offer flexible scoping for memories. For example:

  • Memories can be primarily scoped by user and agent. This is useful when an agent needs to remember general information about a user, not specific to one conversation (session).
  • Memories can be scoped by user and session to capture a specific conversation. An agent (either specified or default) is also associated with this conversational memory.
  • All memories are tied to a user and can persist across different sessions (conversations). This enables long-term memory recall for the user, even for information from past conversations or interactions with different agents.
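To make the scoping rules concrete, here is an illustrative sketch in plain Python of how one user's memories could be partitioned into short-term and long-term based on the active session. This is not MemFuse's actual implementation or API, just a model of the behavior described above:

```python
# Illustrative only: a plain-Python sketch of the scoping rules above,
# not MemFuse's internal implementation.

def partition_memories(memories, active_session_id):
    """Split one user's memories into short-term (the active session)
    and long-term (all other sessions belonging to the same user)."""
    short_term = [m for m in memories if m["session_id"] == active_session_id]
    long_term = [m for m in memories if m["session_id"] != active_session_id]
    return short_term, long_term

memories = [
    {"session_id": "s1", "agent": "agent_default", "text": "User is planning a Mars project"},
    {"session_id": "s2", "agent": "agent_default", "text": "User asked about growing potatoes"},
]

short_term, long_term = partition_memories(memories, active_session_id="s1")
print(len(short_term), len(long_term))  # 1 1
```

By default MemFuse retrieves both sets when answering within a session; as noted above, long-term retrieval can be disabled for use cases that should stay strictly within one conversation.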

Accessing Memory Context Properties

The memory object created earlier provides access to the context properties:

# Access the current context properties
user = memory.user
user_id = memory.user_id
agent = memory.agent
agent_id = memory.agent_id
session = memory.session
session_id = memory.session_id
 
print(f"User: {user} (ID: {user_id})")
print(f"Agent: {agent} (ID: {agent_id})")
print(f"Session: {session} (ID: {session_id})")

These identifiers can be useful for tracking, logging, or referencing specific conversations in your application. The stable IDs align with those used in the HTTP API and won't change even if the human-readable names are modified.
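For example, you might key your application's log records by the stable IDs rather than the human-readable names, so records stay linkable even if a name is later changed. The `SimpleNamespace` stand-in below is only for illustration; in your application you would pass the real `memory` object:

```python
# Illustrative: build a structured log record from a memory context's stable IDs.
# The attribute names (user_id, agent_id, session_id) match those shown above;
# the SimpleNamespace object is a stand-in for a real Memory instance.
from types import SimpleNamespace
import json

memory = SimpleNamespace(
    user="bob", user_id="u-123",
    agent="agent_default", agent_id="a-456",
    session="bob-agent_default-ea7a0d5a", session_id="s-789",
)

def conversation_log_record(mem, event):
    # Key the record by stable IDs, not display names.
    return json.dumps({
        "event": event,
        "user_id": mem.user_id,
        "agent_id": mem.agent_id,
        "session_id": mem.session_id,
    })

print(conversation_log_record(memory, "chat.completion"))
```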

Next Steps

Now that you've learned the basics of using MemFuse, explore our detailed guides: