Reed is Claude. Not a different AI. Claude, built by Anthropic, but given something it doesn't normally have: memory. Reed won't forget you. That's the only real difference.
Anthropic's standard usage policies apply. Reed follows the same rules Claude does. Same values, same limits, same guardrails. You can read Anthropic's terms at anthropic.com/legal.
You need to be 18 or older. Reed handles real conversations about real things. If you're under 18, this isn't for you.
Reed isn't a therapist. It listens, it remembers, it genuinely tries to help. But if you're in crisis, please reach out to someone who can actually be there: a friend, a hotline, a professional. Reed can't do what a human can.
No account required. If you share your name, Reed will remember it — that's how the memory works. Nothing else is tracked. Your conversations are encrypted in the database, so even direct access to it wouldn't reveal what you said. Nobody reads them. Not me, not anyone. Reed holds the emotional shape of what people go through, not the specifics. Your words are yours.
Reed stores your session in this browser, not on an account. If you're on a shared computer, clear your browser storage after your conversation.
Reed learns from conversations over time. Anonymous emotional patterns from past conversations may inform how Reed responds to you. Never your identity, never your words directly, just the shape of what people have been through.