Client API Reference

Complete reference for the RecallClient class and all available methods.

RecallClient

The main client class for interacting with Recall's hybrid memory system.

Constructor

TypeScript
// TypeScript
new RecallClient(options?: RecallClientOptions)

interface RecallClientOptions {
  redisUrl?: string;
  mem0ApiKey?: string;
  environment?: string;
  cacheConfig?: CacheConfig;
  syncConfig?: SyncConfig;
  [key: string]: any;
}
Python
# Python
RecallClient(
    redis_url: str | None = None,
    mem0_api_key: str | None = None,
    environment: str = "development",
    cache_config: CacheConfig | None = None,
    sync_config: SyncConfig | None = None,
    **kwargs
)

Parameters

Parameter    | Type        | Default                  | Description
------------ | ----------- | ------------------------ | -----------
redis_url    | string      | "redis://localhost:6379" | Redis connection URL
mem0_api_key | string      | None                     | Mem0 API key for cloud storage
environment  | string      | "development"            | Environment name (development, staging, production)
cache_config | CacheConfig | None                     | Cache configuration options
sync_config  | SyncConfig  | None                     | Synchronization configuration

Example

Python
from recall import RecallClient, CacheConfig, SyncConfig  # CacheConfig / SyncConfig import path assumed

# Basic initialization
client = RecallClient(
    redis_url="redis://localhost:6379",
    mem0_api_key="m0-xxxxxxxxxxxx"
)

# With configuration
client = RecallClient(
    redis_url="redis://localhost:6379",
    mem0_api_key="m0-xxxxxxxxxxxx",
    environment="production",
    cache_config=CacheConfig(ttl=3600),
    sync_config=SyncConfig(mode="eager")
)
TypeScript
import { RecallClient } from "@recall/client";

// Basic initialization
const client = new RecallClient({
  redisUrl: "redis://localhost:6379",
  mem0ApiKey: "m0-xxxxxxxxxxxx",
});

// With configuration
const configuredClient = new RecallClient({
  redisUrl: "redis://localhost:6379",
  mem0ApiKey: "m0-xxxxxxxxxxxx",
  environment: "production",
  cacheConfig: { ttl: 3600 },
  syncConfig: { mode: "eager" },
});

Core Methods

add()

Add a new memory to the system.

Python
add(
    content: str,
    user_id: str,
    priority: str = "medium",
    metadata: dict | None = None,
    async_mode: bool = False
) -> dict
TypeScript
add(options: AddMemoryOptions): Promise<Memory>

interface AddMemoryOptions {
  content: string;
  userId: string;
  priority?: Priority;
  metadata?: Record<string, any>;
  asyncMode?: boolean;
}

Parameters

Parameter  | Type    | Required | Description
---------- | ------- | -------- | -----------
content    | string  | Yes      | The memory content to store
user_id    | string  | Yes      | User identifier
priority   | string  | No       | Priority level: "critical", "high", "medium", "low"
metadata   | object  | No       | Additional metadata
async_mode | boolean | No       | If true, returns immediately without waiting for cloud sync

Returns

A dictionary/object containing:

  • id: Unique memory identifier
  • content: The stored content
  • user_id: Associated user ID
  • priority: Assigned priority level
  • created_at: Creation timestamp
  • metadata: Any additional metadata

Example

Python
memory = client.add(
    content="User prefers dark theme",
    user_id="user_123",
    priority="high",
    metadata={
        "category": "preferences",
        "source": "settings_update"
    }
)

print(f"Memory ID: {memory['id']}")
# Output: Memory ID: mem_abc123xyz
TypeScript
const memory = await client.add({
  content: "User prefers dark theme",
  userId: "user_123",
  priority: "high",
  metadata: {
    category: "preferences",
    source: "settings_update",
  },
});

console.log(`Memory ID: ${memory.id}`);
// Output: Memory ID: mem_abc123xyz

search()

Search for relevant memories using semantic search.

Python
search(
    query: str,
    user_id: str | None = None,
    limit: int = 10,
    filters: dict | None = None,
    threshold: float = 0.0
) -> list[dict]
TypeScript
search(options: SearchOptions): Promise<Memory[]>

interface SearchOptions {
  query: string;
  userId?: string;
  limit?: number;
  filters?: Record<string, any>;
  threshold?: number;
}

Parameters

Parameter | Type    | Required | Description
--------- | ------- | -------- | -----------
query     | string  | Yes      | Search query
user_id   | string  | No       | Filter by user ID
limit     | integer | No       | Maximum results to return (default: 10)
filters   | object  | No       | Metadata filters
threshold | float   | No       | Minimum relevance score (0.0 to 1.0)

Returns

Array of memory objects, each containing:

  • All memory fields
  • score: Relevance score (0.0 to 1.0)
  • source: Whether from "cache" or "cloud"

Example

Python
results = client.search(
    query="user preferences for UI",
    user_id="user_123",
    limit=5,
    filters={"category": "preferences"},
    threshold=0.7
)

for memory in results:
    print(f"{memory['content']} (score: {memory['score']:.2f})")
TypeScript
const results = await client.search({
  query: "user preferences for UI",
  userId: "user_123",
  limit: 5,
  filters: { category: "preferences" },
  threshold: 0.7,
});

results.forEach((memory) => {
  console.log(`${memory.content} (score: ${memory.score.toFixed(2)})`);
});

get()

Retrieve a specific memory by ID.

Python
get(memory_id: str) -> dict | None
TypeScript
get(memoryId: string): Promise<Memory | null>

Example

Python
memory = client.get("mem_abc123xyz")
if memory:
    print(f"Content: {memory['content']}")
else:
    print("Memory not found")
TypeScript
const memory = await client.get("mem_abc123xyz");
if (memory) {
  console.log(`Content: ${memory.content}`);
} else {
  console.log("Memory not found");
}

update()

Update an existing memory.

Python
update(
    memory_id: str,
    content: str | None = None,
    priority: str | None = None,
    metadata: dict | None = None
) -> dict
TypeScript
update(options: UpdateOptions): Promise<Memory>

interface UpdateOptions {
  memoryId: string;
  content?: string;
  priority?: Priority;
  metadata?: Record<string, any>;
}

Example

Python
from datetime import datetime

updated = client.update(
    memory_id="mem_abc123xyz",
    priority="critical",
    metadata={"last_accessed": datetime.now().isoformat()}
)
TypeScript
const updated = await client.update({
  memoryId: "mem_abc123xyz",
  priority: "critical",
  metadata: { lastAccessed: new Date().toISOString() },
});

delete()

Delete a memory from both cache and cloud storage.

Python
delete(memory_id: str) -> bool
TypeScript
delete(memoryId: string): Promise<boolean>

Example

Python
success = client.delete("mem_abc123xyz")
print(f"Deleted: {success}")
TypeScript
const success = await client.delete("mem_abc123xyz");
console.log(`Deleted: ${success}`);

get_all()

Retrieve all memories for a user.

Python
get_all(
    user_id: str,
    limit: int | None = None,
    offset: int = 0
) -> list[dict]
TypeScript
getAll(options: GetAllOptions): Promise<Memory[]>

interface GetAllOptions {
  userId: string;
  limit?: number;
  offset?: number;
}

Example

Python
memories = client.get_all(
    user_id="user_123",
    limit=100,
    offset=0
)
print(f"Total memories: {len(memories)}")
TypeScript
const memories = await client.getAll({
  userId: "user_123",
  limit: 100,
  offset: 0,
});
console.log(`Total memories: ${memories.length}`);

Batch Operations

add_batch()

Add multiple memories in a single operation.

Python
add_batch(memories: list[dict]) -> list[dict]
TypeScript
addBatch(memories: AddMemoryOptions[]): Promise<Memory[]>

Example

Python
memories = [
    {
        "content": "Prefers email notifications",
        "user_id": "user_123",
        "priority": "high"
    },
    {
        "content": "Works in tech industry",
        "user_id": "user_123",
        "priority": "medium"
    }
]

results = client.add_batch(memories)
print(f"Added {len(results)} memories")
TypeScript
const memories = [
  {
    content: "Prefers email notifications",
    userId: "user_123",
    priority: "high",
  },
  {
    content: "Works in tech industry",
    userId: "user_123",
    priority: "medium",
  },
];

const results = await client.addBatch(memories);
console.log(`Added ${results.length} memories`);

delete_batch()

Delete multiple memories by ID.

Python
delete_batch(memory_ids: list[str]) -> dict
TypeScript
deleteBatch(memoryIds: string[]): Promise<BatchDeleteResult>
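
Example

A minimal usage sketch based on the signature above; the exact fields of the returned summary are not documented here, so the example simply prints the whole object.

Python
result = client.delete_batch(["mem_abc123xyz", "mem_def456uvw"])
print(result)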

Cache Management

cache_stats()

Get detailed cache statistics.

Python
cache_stats() -> dict
TypeScript
cacheStats(): Promise<CacheStats>

Returns

Python
{
    "size": 1234,              # Number of cached items
    "memory_usage": "45.6MB",  # Memory used
    "hit_rate": 0.92,          # Cache hit rate
    "miss_rate": 0.08,         # Cache miss rate
    "evictions": 156,          # Number of evictions
    "avg_ttl": 3600,           # Average TTL in seconds
    "by_priority": {
        "critical": 10,
        "high": 234,
        "medium": 567,
        "low": 423
    }
}
TypeScript
interface CacheStats {
  size: number;
  memoryUsage: string;
  hitRate: number;
  missRate: number;
  evictions: number;
  avgTtl: number;
  byPriority: {
    critical: number;
    high: number;
    medium: number;
    low: number;
  };
}
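
Example

A short sketch that reads the documented fields (assumes a client has already been constructed as shown earlier).

Python
stats = client.cache_stats()
print(f"Cached items: {stats['size']}")
print(f"Hit rate: {stats['hit_rate']:.0%}")
print(f"Critical entries cached: {stats['by_priority']['critical']}")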

optimize_cache()

Optimize cache by removing stale entries and reorganizing based on access patterns.

Python
optimize_cache(
    aggressive: bool = False
) -> dict
TypeScript
optimizeCache(options?: OptimizeOptions): Promise<OptimizeResult>
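
Example

A minimal sketch using the aggressive flag from the signature above; the fields of the returned summary are not documented here, so it is printed as-is.

Python
result = client.optimize_cache(aggressive=True)
print(result)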

clear_cache()

Clear the cache for a specific user, or clear it entirely.

Python
clear_cache(user_id: str | None = None) -> bool
TypeScript
clearCache(userId?: string): Promise<boolean>
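
Example

A minimal sketch of both forms shown in the signature above.

Python
client.clear_cache(user_id="user_123")  # clear cached memories for one user
client.clear_cache()                    # clear the entire cache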

Synchronization

sync()

Manually trigger synchronization between cache and cloud.

Python
sync(
    direction: str = "bidirectional",
    force: bool = False
) -> dict
TypeScript
sync(options?: SyncOptions): Promise<SyncResult>

interface SyncOptions {
  direction?: "bidirectional" | "to_cloud" | "from_cloud";
  force?: boolean;
}
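
Example

A minimal sketch using the documented direction and force parameters; the shape of the returned sync summary is not specified above, so it is printed directly.

Python
result = client.sync(direction="to_cloud", force=True)
print(result)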

Health & Monitoring

health_check()

Check the health status of all components.

Python
health_check() -> dict
TypeScript
healthCheck(): Promise<HealthStatus>

Returns

Python
{
    "status": "healthy",
    "timestamp": "2024-01-15T10:30:00Z",
    "components": {
        "redis": {
            "status": "healthy",
            "latency_ms": 1.2,
            "version": "7.0.5"
        },
        "mem0": {
            "status": "healthy",
            "latency_ms": 45.3,
            "quota_used": 0.23
        },
        "cache": {
            "status": "healthy",
            "size": 1234,
            "memory_usage": "45.6MB"
        }
    },
    "version": "1.0.0"
}
TypeScript
interface HealthStatus {
  status: "healthy" | "degraded" | "unhealthy";
  timestamp: string;
  components: {
    redis: ComponentHealth;
    mem0: ComponentHealth;
    cache: ComponentHealth;
  };
  version: string;
}
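
Example

A short sketch that walks the documented fields of the health report.

Python
health = client.health_check()
print(f"Overall status: {health['status']}")
for name, component in health["components"].items():
    print(f"  {name}: {component['status']}")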

Configuration Classes

CacheConfig

Python
class CacheConfig:
    ttl: int | dict[str, int | None] = 3600
    max_memory: str = "512mb"
    eviction_policy: str = "allkeys-lru"
    compression: bool = False
    warm_cache: bool = True
TypeScript
interface CacheConfig {
  ttl?: number | Record<Priority, number | null>;
  maxMemory?: string;
  evictionPolicy?: string;
  compression?: boolean;
  warmCache?: boolean;
}
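
The ttl field accepts either a single TTL in seconds or a per-priority mapping. The sketch below shows the per-priority form; the import path and the meaning of None (no expiry for that priority) are assumptions based on the type hints and examples above.

Python
from recall import CacheConfig, RecallClient

config = CacheConfig(
    ttl={
        "critical": None,  # assumed: critical memories never expire
        "high": 86400,     # 24 hours
        "medium": 3600,    # 1 hour
        "low": 600,        # 10 minutes
    },
    max_memory="1gb",
    eviction_policy="allkeys-lru",
)

client = RecallClient(cache_config=config)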

SyncConfig

Python
class SyncConfig:
    mode: str = "lazy"  # lazy, eager, manual
    batch_size: int = 100
    interval: int = 60
    retry_policy: str = "exponential"
    max_retries: int = 3
TypeScript
interface SyncConfig {
  mode?: "lazy" | "eager" | "manual";
  batchSize?: number;
  interval?: number;
  retryPolicy?: string;
  maxRetries?: number;
}
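
Based on the defaults above, a sketch of an eager configuration; the import path for SyncConfig is assumed to mirror the earlier examples.

Python
from recall import SyncConfig

sync_config = SyncConfig(
    mode="eager",               # push writes to cloud storage immediately
    batch_size=50,
    retry_policy="exponential",
    max_retries=5,
)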

Error Handling

Exception Types

Python
from recall.exceptions import (
    RecallError,          # Base exception
    ConnectionError,      # Redis/Mem0 connection issues
    AuthenticationError,  # Invalid API key
    ValidationError,      # Invalid parameters
    CacheError,           # Cache-specific errors
    SyncError,            # Synchronization errors
    RateLimitError        # API rate limiting
)

try:
    client.add(content="", user_id="")
except ValidationError as e:
    print(f"Invalid input: {e}")
except RecallError as e:
    print(f"Recall error: {e}")
TypeScript
import {
  RecallError,
  ConnectionError,
  AuthenticationError,
  ValidationError,
  CacheError,
  SyncError,
  RateLimitError,
} from "@recall/client";

try {
  await client.add({ content: "", userId: "" });
} catch (error) {
  if (error instanceof ValidationError) {
    console.log(`Invalid input: ${error.message}`);
  } else if (error instanceof RecallError) {
    console.log(`Recall error: ${error.message}`);
  }
}

Async Support

Async Client (Python)

Python
from recall import AsyncRecallClient
import asyncio

async def main():
    client = AsyncRecallClient(
        redis_url="redis://localhost:6379",
        mem0_api_key="your-api-key"
    )

    # Async methods
    memory = await client.add(
        content="Async memory",
        user_id="user_123"
    )

    results = await client.search(
        query="async operations",
        user_id="user_123"
    )

    # Concurrent operations
    tasks = [
        client.add(content=f"Memory {i}", user_id="user_123")
        for i in range(10)
    ]
    memories = await asyncio.gather(*tasks)

asyncio.run(main())

Next Steps