ChatGPT to Claude Migration in 2026: The Real Workflow That Works (Without Losing Your History)
You've spent eight months talking to ChatGPT. Hundreds of conversations — bug fixes, brainstorming, the prompt that finally cracked your tone of voice, drafts of two essays, a debugging session with your CI pipeline that nobody but ChatGPT remembers. Then Claude releases a model that's noticeably better for your work, and you want to move.
The problem: there is no migration button. ChatGPT's "Export Data" feature ships you a 60 MB ZIP of opaque JSON. Claude's Project knowledge base wants Markdown, not JSON. Manual copy-paste of every conversation is a non-starter when you have 300 of them.
This is the workflow that actually works in 2026.
Why the Official Export Is Useless for Migration
ChatGPT's Settings → Data Controls → Export Data produces a ZIP containing conversations.json — a tree structure with internal IDs, model version stamps, and platform metadata. It was built for GDPR compliance, not for moving to another AI. Claude doesn't read JSON for Project knowledge — it expects Markdown — so even after you unzip the ChatGPT export, you have to write a parser that pulls out user/assistant messages and renders them back into Markdown.
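If you'd rather script it than install anything, the parser is not hard to sketch. This is a minimal, hedged example, not Web2MD's implementation: it assumes the common shape of conversations.json (a list of conversations, each with a "title" and a "mapping" tree of message nodes), which has shifted across export versions, and it flattens the tree by timestamp instead of walking branches.

```python
import json
from pathlib import Path

ROLES = {"user": "## User", "assistant": "## Assistant"}

def conversation_to_markdown(conv: dict) -> str:
    """Render one exported conversation in the universal Markdown shape."""
    lines = [f"# Conversation: {conv.get('title', 'Untitled')}"]
    # Messages live in a tree under "mapping"; this sketch simply takes
    # every node carrying a user/assistant message, ordered by create_time.
    nodes = [n.get("message") for n in conv.get("mapping", {}).values()]
    messages = [m for m in nodes
                if m and m.get("author", {}).get("role") in ROLES]
    messages.sort(key=lambda m: m.get("create_time") or 0)
    for m in messages:
        parts = m.get("content", {}).get("parts") or []
        text = "\n".join(p for p in parts if isinstance(p, str)).strip()
        if text:
            lines.append(ROLES[m["author"]["role"]])
            lines.append(text)
    return "\n\n".join(lines)

def export_to_markdown(export_path: str) -> str:
    """Convert a whole conversations.json into one merged Markdown file."""
    conversations = json.loads(Path(export_path).read_text(encoding="utf-8"))
    return "\n\n---\n\n".join(conversation_to_markdown(c) for c in conversations)
```

Regenerated or branched messages in the tree will all appear in the output with this flattening, so expect some duplication on heavily edited chats.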
The same story plays out on every other platform:
- Claude has no native export at all. Copy-paste collapses Artifacts entirely.
- Gemini has a Share button that creates a public URL. Fine for showing off, terrible for moving private chats.
- DeepSeek has no export. The Share button also creates public URLs.
- Perplexity lets you copy answers, but the dense citation links — the most valuable part — collapse to numeric stubs.
So the realistic migration target isn't "export then import." It's bulk-export to Markdown, then drop into the target platform as knowledge.
The Markdown Shape That Works Everywhere
Every major AI platform accepts a simple Markdown shape as either project knowledge, custom instructions, or pasted context:
# Conversation: Debugging the CI pipeline auth issue
## User
The deploy step keeps failing with a 401 from the auth API. I've checked
the env vars and they look right.
## Assistant
Three things to check first:
1. Whether the token expires at deploy time (token age vs. job duration)...
---
# Conversation: Drafting the launch email
## User
Help me rewrite this email. It's too long and the second paragraph
buries the lede.
## Assistant
Here's a tightened version...
Single # title per conversation, ## User and ## Assistant headings per turn, --- between conversations. This format drops directly into:
- A Claude Project as project knowledge
- A ChatGPT Custom GPT as instructions or knowledge file
- An Obsidian vault as standalone Markdown notes
- NotebookLM as a source for grounded answers
- Cursor as @docs context for coding
It's the universal interchange format for AI conversations in 2026. The whole migration question reduces to: how do you generate it from a chat platform?
The Queue Workflow in Practice
The reliable production pattern is a browser extension that:
- Detects you're on a chat platform page (ChatGPT, Claude, Gemini, DeepSeek, Perplexity).
- Reads the rendered conversation DOM (not the official export JSON — the actual DOM you see).
- Re-emits the conversation as Markdown using the universal shape above.
- Queues multiple conversations as you click through your sidebar.
- Concatenates the queue into one downloadable .md file.
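The final merge step is the simplest part of that pipeline. A minimal sketch (the function name is hypothetical, not Web2MD's code): join the queued conversations with the --- separator and drop exact duplicates, since it's easy to queue the same chat twice while clicking through a sidebar.

```python
def merge_queue(queued: list[str]) -> str:
    """Concatenate queued conversation Markdown blocks into one .md file,
    separated by ---, skipping exact duplicates."""
    seen, unique = set(), []
    for block in queued:
        block = block.strip()
        if block and block not in seen:
            seen.add(block)
            unique.append(block)
    return "\n\n---\n\n".join(unique) + "\n"
```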
That's how Web2MD's chat-export mode works, and it's what we'll use for the rest of this article. Single-conversation export is free with no signup; bulk-export (the queue mode) is part of Pro, but a 7-day free trial covers a one-time migration.
The actual flow:
Step 1. Open ChatGPT. Click the first conversation you want to migrate.
Step 2. Click the Web2MD icon. The popup shows two modes when it detects a chat page:
- Send page → AI (the original "push web content into AI" use case)
- Export ChatGPT chats (the new "pull chats out to Markdown" mode)
Click Export ChatGPT chats.
Step 3. Click Add to queue & convert next page. The current conversation gets converted and queued.
Step 4. Switch to the next conversation in your ChatGPT sidebar. The extension auto-detects the new conversation. Click again.
Step 5. Repeat until you've queued everything you want migrated. The queue survives popup close and tab switches (chrome.storage.local-backed) so you can do this in batches across an afternoon.
Step 6. Click Download .md. You get one file with all conversations concatenated, separated by ---.
Step 7. In Claude, create a new Project. Drop the .md file into the Project knowledge sidebar.
You're done. Claude now has the full context of every conversation you migrated, accessible as grounded knowledge in every future Project session.
The Same Workflow, Five Platforms
The reason this scales is that every platform has its own dedicated landing page describing the exact same workflow, tuned for that platform's quirks:
- Export ChatGPT conversations to Markdown
- Export Claude conversations to Markdown — including Artifacts as fenced code blocks
- Export Gemini conversations to Markdown — local export, no public Share URL needed
- Export DeepSeek conversations to Markdown — handles hashed CSS class names
- Export Perplexity conversations to Markdown — preserves citations as real links
Same queue, same Markdown shape, drops into the same Claude Project. If you're standardizing on Claude as your primary in 2026 but want to keep an escape hatch back to ChatGPT or DeepSeek, the universal Markdown layer makes it a workflow choice, not a lock-in.
Picking What to Actually Migrate
A common mistake: dumping all 300 conversations into one massive Markdown file and uploading it. Claude can handle it, but a Project with 300 random conversations is no more useful than a Project with 30 carefully chosen ones. The signal gets diluted.
Heuristic for picking:
Migrate (high value):
- Prompt iterations that finally worked
- Debugging sessions that solved real problems
- Drafts of writing you might revisit
- Research conversations with external links
- Anything where you said "save this for later" to yourself
Skip (low value):
- "What's the capital of X" style one-offs
- Conversations where you abandoned the topic mid-way
- Casual exploration without conclusions
- Duplicate questions you asked the same model multiple times
A typical user migrates 30-80 conversations from a 300-conversation history. The rest stays in ChatGPT in case you need it, but doesn't pollute Claude's context.
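Because the merged file follows the universal shape, you can also apply this triage after the fact with a few lines of scripting. A hedged sketch (function names are illustrative): split the merged file on --- separators, then keep only conversations whose # Conversation: title matches a keyword.

```python
import re

def split_conversations(merged_md: str) -> list[str]:
    """Split a merged export back into individual conversations on --- lines."""
    return [c.strip() for c in re.split(r"\n-{3,}\n", merged_md) if c.strip()]

def pick_for_migration(merged_md: str, keywords: list[str]) -> list[str]:
    """Keep conversations whose first line (the # title) mentions a keyword."""
    keep = []
    for conv in split_conversations(merged_md):
        title = conv.splitlines()[0].lower()
        if any(k.lower() in title for k in keywords):
            keep.append(conv)
    return keep
```

Titles are often enough to separate "debugging the CI pipeline" from "what's the capital of X", but skim the rejects before deleting anything.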
What You Lose, What You Keep
What survives the export:
- Conversation title and timestamp (mostly — derived from heading or URL when metadata is missing)
- Multi-turn structure (User / Assistant separation)
- Code blocks with language tags
- LaTeX math (converted back from KaTeX render to TeX source)
- Tables (preserved as Markdown tables)
- Inline links and citation URLs (where applicable per platform)
- Claude Artifacts (rendered as fenced code blocks with language)
What's lost:
- Custom GPT instructions you used (live in ChatGPT settings, not the conversation)
- Uploaded files and images (the conversation references them, but the binaries don't transfer)
- Voice mode audio (text transcripts survive, audio doesn't)
- Plugin/tool call traces (exposure varies by platform, so export coverage varies too)
If your conversations depended heavily on uploaded files, you'll need to manually re-upload them into the target platform's Project. For pure-text conversations (most coding, writing, and research workflows), the export is lossless.
Beyond ChatGPT and Claude
The migration use case is the obvious one in 2026 because of the model-quality musical chairs (Claude → ChatGPT → Gemini → DeepSeek depending on the week). But once your conversations live as portable Markdown, three other use cases open up:
Archival to Obsidian or Notion. Drop the merged .md into your existing knowledge base. Now your AI conversations are searchable alongside your meeting notes and reading highlights — they become part of your second brain, not an isolated chat history.
NotebookLM grounded answers. Upload to NotebookLM, get answers grounded in your own past conversations with citation back to the specific exchange. This is particularly useful for technical work where you've already debugged the same issue with an AI and want to find that thread again.
Cursor / IDE context. Use the Markdown files as @docs context inside Cursor or Continue.dev. Past Claude debugging sessions become available to your coding agent without you having to re-explain the project.
The migration is the catalyst, but the long-term value is owning your AI conversation history as portable, structured data.
Get Started
Install Web2MD from the Chrome Web Store — free for single-conversation export, no signup required. Bulk-export via queue mode is part of Pro, and the 7-day free trial covers a one-time migration from ChatGPT to Claude (or any other direction).
Then open the chat platform you want to migrate from, pick the 30-80 conversations actually worth keeping, and queue them up. By the time you've finished your coffee, your new Project knows everything the old one did.