## Supported sources
| Source | Format | Flow |
|---|---|---|
| ChatGPT | conversations.json (data export) | LLM extracts memory candidates. You review, then import. |
| Claude | conversations.json (data export) | LLM extracts memory candidates. You review, then import. |
| Claude Code | MEMORY.md file or zip of ~/.claude/projects/*/memory/ | Each entry becomes a memory candidate. You review, then import. |
## ChatGPT
### Export your data
- Open ChatGPT settings → Data controls.
- Click Export data. OpenAI emails a link, usually within a few minutes.
- Download the archive and unzip it. Locate `conversations.json`.
### Import into 3ngram
- Open app.3ngram.ai and go to Settings → Import.
- Pick ChatGPT in the source filter and upload `conversations.json`.
- 3ngram processes up to the first 20 conversations and shows extracted candidates.
- Review each candidate (type, confidence, source conversation), deselect anything you don’t want, and click Import.
We only process the first 20 conversations by default to keep extraction fast and within the per-user LLM budget. If you need more, re-run the import on a smaller slice of the archive, or wait for the larger-batch background job (tracked in the roadmap).
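To see in advance whether the cap will truncate your upload, you can count the conversations in the export locally. A minimal sketch, assuming `conversations.json` is a top-level JSON array (as in current ChatGPT data exports); the helper name is illustrative:

```python
import json

CAP = 20  # conversations processed per upload by default

def count_conversations(path: str) -> int:
    """Return the number of conversations in an export file.

    Assumes the file is a top-level JSON array, as in current
    ChatGPT data exports.
    """
    with open(path, encoding="utf-8") as f:
        return len(json.load(f))
```

If `count_conversations("conversations.json")` returns more than 20, split the archive into smaller slices before uploading.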
## Claude
### Export your data
- Open Claude settings at claude.ai/settings/data-privacy-controls.
- Choose Export data. Anthropic emails a link when the archive is ready.
- Download and unzip the archive. Locate `conversations.json`.
### Import into 3ngram
- Open app.3ngram.ai and go to Settings → Import.
- Pick Claude in the source filter and upload `conversations.json`.
- 3ngram linearises each conversation’s flat message array and runs the same extraction flow as ChatGPT.
- Review extracted candidates and click Import.
Claude’s export stores a single flat `chat_messages` array per conversation (no branching tree), so nothing is lost in linearisation.
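The linearisation step can be approximated locally if you want to preview what the extractor sees. A sketch, assuming each conversation holds a flat `chat_messages` array whose entries carry `sender` and `text` fields; these field names reflect the current Claude export format and are assumptions, not a guaranteed schema:

```python
def linearise(conversation: dict) -> str:
    """Flatten one Claude conversation into a plain transcript.

    Field names ("chat_messages", "sender", "text") are assumptions
    based on the current Claude data-export format; adjust them if
    your archive differs.
    """
    lines = []
    for msg in conversation.get("chat_messages", []):
        lines.append(f'{msg.get("sender", "unknown")}: {msg.get("text", "")}')
    return "\n".join(lines)
```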
## Claude Code
Claude Code stores per-project memory at `~/.claude/projects/<project-slug>/memory/MEMORY.md`. You can import either a single `MEMORY.md` or a zip of the entire memory directory.
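The zip can be built with a short script. A sketch of bundling that per-project layout, assuming the standard `~/.claude/projects/` location; the helper name and return value are illustrative:

```python
import zipfile
from pathlib import Path

def zip_claude_memory(projects_dir: Path, out_path: str) -> int:
    """Zip every file under each project's memory/ directory.

    A minimal sketch of bundling the layout described above;
    returns the number of files added to the archive.
    """
    count = 0
    with zipfile.ZipFile(out_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for f in sorted(projects_dir.glob("*/memory/**/*")):
            if f.is_file():
                # Store paths relative to projects/ so project slugs survive.
                zf.write(f, f.relative_to(projects_dir))
                count += 1
    return count
```

Typical usage would be `zip_claude_memory(Path.home() / ".claude" / "projects", "claude-memory.zip")`.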
### Export your memory
- Single project: upload `~/.claude/projects/<project-slug>/memory/MEMORY.md` directly.
- All projects: zip the memory directories under `~/.claude/projects/` and upload the archive.
### Import into 3ngram
- Open app.3ngram.ai and go to Settings → Import.
- Pick Claude Code in the source filter and upload your `.md` file or zip.
- 3ngram parses each top-level section heading as a memory entry.
- Review candidates and click Import.
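The heading-based parse can be approximated locally to preview how a `MEMORY.md` will split into entries. A rough sketch; the exact heading levels 3ngram accepts are an assumption here:

```python
def split_memory(md: str) -> dict[str, str]:
    """Split a MEMORY.md file into candidate entries by heading.

    Each "# ..." or "## ..." line starts a new entry; which heading
    levels 3ngram actually treats as entry boundaries is an
    assumption in this sketch.
    """
    entries: dict[str, str] = {}
    title, body = None, []
    for line in md.splitlines():
        if line.startswith(("# ", "## ")):
            if title is not None:
                entries[title] = "\n".join(body).strip()
            title, body = line.lstrip("# ").strip(), []
        elif title is not None:
            body.append(line)
    if title is not None:
        entries[title] = "\n".join(body).strip()
    return entries
```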
## Troubleshooting
### File size limits
- ChatGPT / Claude extraction: 10 MB per upload.
- Claude Code and document imports: 50 MB per upload.
If your export exceeds the limit, split `conversations.json` into chunks (one top-level JSON array per chunk) and upload them separately.
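The chunking step can be scripted. A sketch, assuming the export is one top-level JSON array; the output filenames are illustrative:

```python
import json

def split_export(path: str, chunk_size: int = 20) -> list[str]:
    """Split a large conversations.json into smaller uploads.

    Assumes the export is a top-level JSON array; each output
    file is itself a valid top-level array, as required above.
    """
    with open(path, encoding="utf-8") as f:
        conversations = json.load(f)
    out_paths = []
    for i in range(0, len(conversations), chunk_size):
        out = f"conversations_part{i // chunk_size + 1}.json"
        with open(out, "w", encoding="utf-8") as f:
            json.dump(conversations[i : i + chunk_size], f)
        out_paths.append(out)
    return out_paths
```

A chunk size of 20 matches the per-upload conversation cap, so each part is processed in full.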
### Conversation cap
Extraction defaults to 20 conversations per upload. You’ll see `conversations_processed` in the API response; anything above the cap is silently dropped from this run. Re-upload a filtered archive to process the rest.
### Extraction rate
Extraction runs sequentially through conversations, at roughly 1-2 seconds per chunk. A 20-conversation export typically finishes in under a minute. If it stalls, check Settings → Billing for your monthly LLM budget; extraction fails gracefully (but noisily) when you hit your cap.
### No memories found
If extraction returns zero candidates:
- The conversations may be pure Q&A with no decisions, commitments, preferences, or patterns worth remembering.
- System messages and code snippets are intentionally skipped.
- Try uploading a different export window that contains planning or decision-making sessions.
## Export
Download your entire workspace as a ZIP archive at any time:
- Go to Settings → Export.
- Click Download ZIP.