65 changes: 65 additions & 0 deletions contributing/samples/memory_chroma/README.md
@@ -0,0 +1,65 @@
# ChromaDB Memory Service Example

This example demonstrates using `ChromaMemoryService` for semantic memory search
with embeddings generated by Ollama.

## Prerequisites

1. **Ollama Server Running**
   ```bash
   ollama serve
   ```

2. **Embedding Model Pulled**
   ```bash
   ollama pull nomic-embed-text
   ```

3. **Dependencies Installed**
   ```bash
   pip install chromadb
   # Or with uv:
   uv pip install chromadb
   ```

## Running the Example

```bash
cd contributing/samples/memory_chroma
python main.py
```

## What This Demo Does

1. **Session 1**: Creates memories by having a conversation with the agent
   - User introduces themselves as "Jack"
   - User mentions they like badminton
   - User mentions what they ate recently

2. **Memory Storage**: The session is saved to ChromaDB with semantic embeddings
   - Data persists to the `./chroma_db` directory
   - Embeddings are generated using Ollama's `nomic-embed-text` model

3. **Session 2**: Queries the memories using semantic search (see the direct-query sketch after this list)
   - User asks about their hobbies (the agent should recall "badminton")
   - User asks about what they ate (the agent should recall "burger")

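Outside the agent loop, you can also query the stored memories directly. The snippet below is an illustrative sketch only, assuming `ChromaMemoryService` follows ADK's standard `BaseMemoryService` interface (`search_memory(app_name=..., user_id=..., query=...)` returning a response with a `memories` list):

```python
# Illustrative sketch: query the persisted memories directly, assuming
# ChromaMemoryService implements the standard BaseMemoryService interface.
import asyncio

from google.adk.memory import ChromaMemoryService
from google.adk.memory import OllamaEmbeddingProvider


async def inspect_memory():
  memory_service = ChromaMemoryService(
      embedding_provider=OllamaEmbeddingProvider(model="nomic-embed-text"),
      collection_name="demo_memory",
      persist_directory="./chroma_db",
  )
  response = await memory_service.search_memory(
      app_name="my_app",
      user_id="user1",
      query="What sports does the user play?",  # No keyword overlap needed
  )
  for memory in response.memories:
    print(memory)


asyncio.run(inspect_memory())
```
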
## Key Differences from InMemoryMemoryService

| Feature | InMemory | ChromaDB |
|---------|----------|----------|
| Search Type | Keyword matching | **Semantic similarity** |
| Persistence | No (lost on restart) | **Yes (disk)** |
| Synonyms | No | **Yes** |
| Performance | Fast | Fast (with HNSW index) |
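
The search-type row is the key difference: with semantic similarity, a query such as "What sports do I play?" can match the stored sentence "I like badminton." even though they share no keywords. Here is a standalone sketch of the idea, outside ADK, using Ollama's REST embeddings endpoint and the `requests` package:

```python
# Standalone illustration of semantic matching (not part of the demo):
# embed two sentences with Ollama, then compare them by cosine similarity.
# Assumes a local Ollama server at http://localhost:11434.
import math

import requests


def embed(text: str) -> list[float]:
  resp = requests.post(
      "http://localhost:11434/api/embeddings",
      json={"model": "nomic-embed-text", "prompt": text},
      timeout=60,
  )
  resp.raise_for_status()
  return resp.json()["embedding"]


def cosine(a: list[float], b: list[float]) -> float:
  dot = sum(x * y for x, y in zip(a, b))
  norm_a = math.sqrt(sum(x * x for x in a))
  norm_b = math.sqrt(sum(y * y for y in b))
  return dot / (norm_a * norm_b)


query = embed("What sports do I play?")
memory = embed("I like badminton.")
print(f"Cosine similarity: {cosine(query, memory):.3f}")  # High, despite no shared words
```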

## Customization

You can change the embedding model by passing different arguments to `OllamaEmbeddingProvider`:

```python
embedding_provider = OllamaEmbeddingProvider(
    model="mxbai-embed-large",  # Higher quality but slower
    host="http://remote-server:11434",  # Remote Ollama server
)
```
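
To use a different embedding backend entirely, you can supply your own provider by subclassing `BaseEmbeddingProvider`, which is exported alongside `ChromaMemoryService`. The sketch below is hypothetical: the method name `embed` and its signature are assumptions made for illustration, so check the `embeddings` module for the actual interface.

```python
# Hypothetical sketch: the method name `embed` and its signature are
# assumptions; consult google.adk.memory.embeddings for the real interface.
from google.adk.memory import BaseEmbeddingProvider


class MyEmbeddingProvider(BaseEmbeddingProvider):
  """Wraps an in-house embedding endpoint (illustrative only)."""

  def embed(self, texts: list[str]) -> list[list[float]]:
    # Call your embedding backend here and return one vector per input text.
    raise NotImplementedError
```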
15 changes: 15 additions & 0 deletions contributing/samples/memory_chroma/__init__.py
@@ -0,0 +1,15 @@
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

"""Sample package for ChromaMemoryService demonstration."""
45 changes: 45 additions & 0 deletions contributing/samples/memory_chroma/agent.py
@@ -0,0 +1,45 @@
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

"""Agent definition for ChromaMemoryService demo."""

from datetime import datetime

from google.adk import Agent
from google.adk.agents.callback_context import CallbackContext
from google.adk.tools.load_memory_tool import load_memory_tool
from google.adk.tools.preload_memory_tool import preload_memory_tool


def update_current_time(callback_context: CallbackContext):
  """Stores the current timestamp in session state for the instruction."""
  callback_context.state["_time"] = datetime.now().isoformat()


root_agent = Agent(
    model="gemini-2.0-flash-001",
    name="chroma_memory_agent",
    description="Agent with ChromaDB-backed semantic memory.",
    before_agent_callback=update_current_time,
    # {_time} is filled from session state set in update_current_time.
    instruction="""\
You are an agent that helps users answer questions.
You have access to a semantic memory system that stores past conversations.
Use the memory tools to recall information from previous sessions.

Current time: {_time}
""",
    tools=[
        # Lets the model search stored memories on demand via a tool call.
        load_memory_tool,
        # Retrieves relevant memories and adds them to the prompt each turn.
        preload_memory_tool,
    ],
)
139 changes: 139 additions & 0 deletions contributing/samples/memory_chroma/main.py
@@ -0,0 +1,139 @@
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

"""Demo script for ChromaMemoryService with OllamaEmbeddingProvider.

This example demonstrates using ChromaDB for semantic memory search
with embeddings generated by Ollama.

Prerequisites:
1. Ollama server running: `ollama serve`
2. Embedding model pulled: `ollama pull nomic-embed-text`
3. Dependencies installed: `pip install chromadb`

Usage:
python main.py
"""

import asyncio
from datetime import datetime
from datetime import timedelta
from typing import cast

import agent
from dotenv import load_dotenv
from google.adk.cli.utils import logs
from google.adk.memory import ChromaMemoryService
from google.adk.memory import OllamaEmbeddingProvider
from google.adk.runners import InMemoryRunner
from google.adk.sessions.session import Session
from google.genai import types

load_dotenv(override=True)
logs.log_to_tmp_folder()


async def main():
  app_name = "my_app"
  user_id_1 = "user1"

  # Initialize the ChromaMemoryService with Ollama embeddings
  embedding_provider = OllamaEmbeddingProvider(
      model="nomic-embed-text",  # Or another embedding model you have
  )
  memory_service = ChromaMemoryService(
      embedding_provider=embedding_provider,
      collection_name="demo_memory",
      persist_directory="./chroma_db",  # Persist to disk
  )

  runner = InMemoryRunner(
      app_name=app_name,
      agent=agent.root_agent,
      memory_service=memory_service,
  )

  async def run_prompt(session: Session, new_message: str) -> Session:
    """Sends one message and prints the agent's replies and tool calls."""
    content = types.Content(
        role="user", parts=[types.Part.from_text(text=new_message)]
    )
    print("** User says:", content.model_dump(exclude_none=True))
    async for event in runner.run_async(
        user_id=user_id_1,
        session_id=session.id,
        new_message=content,
    ):
      if not event.content or not event.content.parts:
        continue
      if event.content.parts[0].text:
        print(f"** {event.author}: {event.content.parts[0].text}")
      elif event.content.parts[0].function_call:
        # fc = function call issued by the model (e.g. load_memory).
        print(
            f"** {event.author}: fc /"
            f" {event.content.parts[0].function_call.name} /"
            f" {event.content.parts[0].function_call.args}\n"
        )
      elif event.content.parts[0].function_response:
        # fr = function response returned by the tool.
        print(
            f"** {event.author}: fr /"
            f" {event.content.parts[0].function_response.name} /"
            f" {event.content.parts[0].function_response.response}\n"
        )

    return cast(
        Session,
        await runner.session_service.get_session(
            app_name=app_name, user_id=user_id_1, session_id=session.id
        ),
    )

  # Session 1: Create memories
  session_1 = await runner.session_service.create_session(
      app_name=app_name, user_id=user_id_1
  )

  print(f"----Session to create memory: {session_1.id} ----------------------")
  session_1 = await run_prompt(session_1, "Hi")
  session_1 = await run_prompt(session_1, "My name is Jack")
  session_1 = await run_prompt(session_1, "I like badminton.")
  session_1 = await run_prompt(
      session_1,
      f"I ate a burger on {(datetime.now() - timedelta(days=1)).date()}.",
  )
  session_1 = await run_prompt(
      session_1,
      f"I ate a banana on {(datetime.now() - timedelta(days=2)).date()}.",
  )

  print("Saving session to ChromaDB memory service...")
  await memory_service.add_session_to_memory(session_1)
  print("Session saved! Data persisted to ./chroma_db")
  print("-------------------------------------------------------------------")

  # Session 2: Query memories using semantic search
  session_2 = await runner.session_service.create_session(
      app_name=app_name, user_id=user_id_1
  )
  print(f"----Session to use memory: {session_2.id} ----------------------")
  session_2 = await run_prompt(session_2, "Hi")
  session_2 = await run_prompt(session_2, "What do I like to do?")
  # Expected: The agent should recall "badminton" from semantic search
  session_2 = await run_prompt(session_2, "When did I say that?")
  session_2 = await run_prompt(session_2, "What did I eat yesterday?")
  # Expected: The agent should recall "burger" from semantic search
  print("-------------------------------------------------------------------")


if __name__ == "__main__":
  asyncio.run(main())
4 changes: 4 additions & 0 deletions pyproject.toml
@@ -164,6 +164,10 @@ otel-gcp = ["opentelemetry-instrumentation-google-genai>=0.3b0, <1.0.0"]

toolbox = ["toolbox-adk>=0.5.7, <0.6.0"]

chroma = [
"chromadb>=0.4.0, <1.0.0", # For ChromaMemoryService
]

[tool.pyink]
# Format py files following Google style-guide
line-length = 80
17 changes: 17 additions & 0 deletions src/google/adk/memory/__init__.py
@@ -35,3 +35,20 @@
      ' VertexAiRagMemoryService please install it. If not, you can ignore this'
      ' warning.'
  )

try:
  from .chroma_memory_service import ChromaMemoryService
  from .embeddings import BaseEmbeddingProvider
  from .embeddings import OllamaEmbeddingProvider

  __all__.extend([
      'ChromaMemoryService',
      'BaseEmbeddingProvider',
      'OllamaEmbeddingProvider',
  ])
except ImportError:
  logger.debug(
      'chromadb is not installed. If you want to use the ChromaMemoryService'
      ' please install it with: pip install chromadb. If not, you can ignore'
      ' this warning.'
  )