# Long-Term Memory
The Long-Term Memory system allows FinQuest AI to maintain continuity across sessions. By capturing user goals, habits, and preferences, the AI personae (Professor Ledger and your FinMon) can provide personalized advice and build rapport over time.
## Overview
The system operates by scanning AI responses for specific "Memory Tags" and persisting that data to local storage. When you interact with the AI, the most relevant memories are retrieved and injected into the conversation context.
## Memory Schema
Memories are categorized to help the AI distinguish between what you want to achieve and how you behave.
```ts
export interface Memory {
  id: string;
  category: 'GOAL' | 'HABIT' | 'PREFERENCE' | 'EVENT';
  content: string;
  timestamp: number;
}
```
| Category | Description | Example |
| :--- | :--- | :--- |
| GOAL | Long-term financial objectives. | "Buying a house", "Paying off student loans" |
| HABIT | Recurring behaviors or routines. | "Buys coffee every morning", "Saves 10% of every check" |
| PREFERENCE | Specific likes, dislikes, or naming conventions. | "Hates credit cards", "Calls savings 'The Batcave'" |
| EVENT | Significant one-time financial milestones. | "Received a promotion", "Paid off the car" |
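A record for one of the EVENT examples above might look like this (the `id` and `timestamp` values are illustrative; the real service generates them):

```ts
interface Memory {
  id: string;
  category: 'GOAL' | 'HABIT' | 'PREFERENCE' | 'EVENT';
  content: string;
  timestamp: number;
}

// Illustrative values only — ids and timestamps come from the service
const promotion: Memory = {
  id: 'mem_001',
  category: 'EVENT',
  content: 'Received a promotion',
  timestamp: Date.now(),
};
```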
## Usage: Memory Service

The `memoryService.ts` module provides the public interface for managing these records.

### `saveMemory(category, content)`

Persists a new memory, whether invoked manually or programmatically. The service automatically rejects exact duplicates.
```ts
import { saveMemory } from './services/memoryService';

// Example: Saving a user preference
saveMemory('PREFERENCE', 'Prefers aggressive investment strategies');
```
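The exact duplicate check isn't shown in the service's docs; a minimal sketch, assuming an in-memory array as the backing store (the real service persists to `localStorage`), might look like this:

```ts
// Sketch only: the in-memory array and placeholder id generator are assumptions.
type Category = 'GOAL' | 'HABIT' | 'PREFERENCE' | 'EVENT';

interface Memory {
  id: string;
  category: Category;
  content: string;
  timestamp: number;
}

const memories: Memory[] = [];

function saveMemory(category: Category, content: string): Memory | null {
  // Reject exact duplicates: same category and identical content
  if (memories.some(m => m.category === category && m.content === content)) {
    return null;
  }
  const memory: Memory = {
    id: Math.random().toString(36).slice(2), // placeholder id generator
    category,
    content,
    timestamp: Date.now(),
  };
  memories.push(memory);
  return memory;
}
```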
### `getRelevantContext(userQuery)`

Returns a string of recent memories formatted for LLM consumption. It is typically called from `geminiService` to prime the AI before it generates a response.
```ts
import { getRelevantContext } from './services/memoryService';

const context = getRelevantContext("Should I buy this?");
// Returns: "[MEMORY CONTEXT]: The user previously mentioned: (GOAL) Buying a house; (PREFERENCE) Hates debt."
```
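The selection heuristic isn't specified here; a minimal sketch that treats recency as the only relevance signal (the query is accepted but unused) could be:

```ts
// Sketch only: the sample memories and recency-only ranking are assumptions.
interface Memory {
  id: string;
  category: 'GOAL' | 'HABIT' | 'PREFERENCE' | 'EVENT';
  content: string;
  timestamp: number;
}

const memories: Memory[] = [
  { id: 'a1', category: 'GOAL', content: 'Buying a house', timestamp: 2 },
  { id: 'b2', category: 'PREFERENCE', content: 'Hates debt', timestamp: 1 },
];

function getRelevantContext(userQuery: string, limit = 10): string {
  // userQuery is unused in this recency-only sketch; a real implementation
  // might score memories against the query text.
  const recent = [...memories]
    .sort((a, b) => b.timestamp - a.timestamp)
    .slice(0, limit);
  if (recent.length === 0) return '';
  const parts = recent.map(m => `(${m.category}) ${m.content}`);
  return `[MEMORY CONTEXT]: The user previously mentioned: ${parts.join('; ')}.`;
}
```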
## AI-Driven Memory Extraction

The AI is instructed via the Memory Protocol to identify when you share something important. It uses a specific tag format: `[[MEMORY:CATEGORY:CONTENT]]`.
How it works:

- Detection: During a chat, if you say "I'm saving up for a trip to Japan," the AI responds and appends `[[MEMORY:GOAL:Trip to Japan]]` to its internal output.
- Processing: The `extractAndSaveMemories` function in `geminiService.ts` intercepts this tag.
- Storage: The tag is stripped from the visible UI message, and the content is saved to the user's Long-Term Memory.
- Recall: In future conversations, Professor Ledger might say: "Since you're planning that trip to Japan, maybe skip this luxury purchase?"
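The interception step could be sketched as follows. The regex and return shape are assumptions; the real `extractAndSaveMemories` in `geminiService.ts` may differ:

```ts
// Sketch only: matches [[MEMORY:CATEGORY:CONTENT]] tags, strips them from the
// visible text, and collects the extracted records for saving.
const MEMORY_TAG = /\[\[MEMORY:(GOAL|HABIT|PREFERENCE|EVENT):([^\]]+)\]\]/g;

interface ExtractedMemory {
  category: string;
  content: string;
}

function extractAndSaveMemories(raw: string): { visible: string; extracted: ExtractedMemory[] } {
  const extracted: ExtractedMemory[] = [];
  const visible = raw
    .replace(MEMORY_TAG, (_match, category: string, content: string) => {
      extracted.push({ category, content: content.trim() });
      return ''; // remove the tag from the user-facing message
    })
    .trim();
  return { visible, extracted };
}
```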
## Data Management

Memories are stored in the browser's `localStorage` under the key `finmon_long_term_memory`.
- Persistence: Memories survive page refreshes and browser restarts.
- Clearing Data: To wipe the AI's memory, call the internal `clearMemories()` function:

```ts
import { clearMemories } from './services/memoryService';

// Use this to reset the AI's knowledge of the user
clearMemories();
```
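The storage layer could be sketched like this. The in-memory `Map` fallback is an assumption so the sketch also runs outside a browser (e.g. in Node tests); the helper names are illustrative:

```ts
// Sketch only: readRaw/writeRaw and the Map fallback are assumptions.
const STORAGE_KEY = 'finmon_long_term_memory';
const ls = (globalThis as any).localStorage; // undefined outside a browser
const fallbackStore = new Map<string, string>();

function writeRaw(value: string): void {
  ls ? ls.setItem(STORAGE_KEY, value) : fallbackStore.set(STORAGE_KEY, value);
}

function readRaw(): string | null {
  return ls ? ls.getItem(STORAGE_KEY) : fallbackStore.get(STORAGE_KEY) ?? null;
}

function saveAll(memories: object[]): void {
  writeRaw(JSON.stringify(memories));
}

function loadAll(): object[] {
  const raw = readRaw();
  return raw ? JSON.parse(raw) : [];
}

function clearMemories(): void {
  ls ? ls.removeItem(STORAGE_KEY) : fallbackStore.delete(STORAGE_KEY);
}
```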
## Integration with Learning Service

While Memory tracks who you are, the Learning Service (via `learningService.ts`) tracks how you categorize money. The AI uses both to create a complete picture of your financial life: remembering that "The Daily Grind" is a "Food" expense (Learning) and that you are trying to cut back on caffeine (Memory).
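Concretely, the two contexts could be concatenated into a single prompt preamble. This is a hypothetical sketch; the learned-category format and the `buildPromptContext` helper are assumptions, not the actual prompt assembly in `geminiService.ts`:

```ts
// Hypothetical: join whichever context strings are non-empty into one preamble.
function buildPromptContext(memoryContext: string, learningContext: string): string {
  return [memoryContext, learningContext].filter(Boolean).join('\n');
}

const combined = buildPromptContext(
  '[MEMORY CONTEXT]: The user previously mentioned: (HABIT) Buys coffee every morning.',
  "[LEARNED CATEGORIES]: 'The Daily Grind' -> Food." // hypothetical format
);
```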