"Privacy-first" is one of the most overused phrases in enterprise software. Every product claims it. Very few can show you the mechanism.
This article traces exactly what happens to your data at every step of a HammerLockAI session — from the moment you type a character to the moment a response is stored in your vault. No abstraction, no marketing language. Just the architecture.
The Stack Overview
HammerLockAI's privacy architecture has four distinct layers, each with a specific role:
- Local encryption layer — protects data at rest on your device
- PII anonymization layer — protects identity before any data leaves your device
- Routing layer — controls which providers see which queries
- Local storage layer — ensures responses and sessions never reach cloud storage
Understanding where each layer operates tells you exactly where your data is protected and where it isn't.
Step 1: You Type a Query
Your keystrokes go directly into the HammerLockAI interface running locally on your device. Nothing is transmitted at this stage. The interface is a local application — no keystroke logging, no real-time sync to a server, no analytics pinging home as you type.
Your session state — the conversation context, any loaded persona files, your vault contents — exists entirely in local memory and local encrypted storage. Nothing about your session is live on a remote server.
What leaves your device at this step: Nothing.
Step 2: PII Detection and Anonymization
Before your query is processed for transmission, the anonymizer scans it for personally identifiable information: names, email addresses, phone numbers, company names, and other identifiers that could link the query to a specific individual or organization.
Detected PII is replaced with generic placeholders. The original values and their placeholder substitutions are stored in a session mapping held in local memory — not on disk, not transmitted.
The anonymized query — your original intent intact, but stripped of identifying specifics — is what gets passed to the next stage.
What leaves your device at this step: Nothing. Anonymization runs entirely locally.
Step 3: Query Routing Decision
The routing layer decides where the anonymized query goes. This decision is based on your configuration:
Local-only mode: The query goes to Ollama, running on your device. It never leaves your machine. The model processes it locally, generates a response locally, and the response is stored locally. No external network connection is made for this query.
Cloud mode (racing or sequential): The anonymized query is prepared for transmission to one or more configured cloud providers. Your API keys (stored in your encrypted vault, loaded into memory at session start) are used to authenticate the outbound request. The request is made directly from your device to the provider's API — there is no HammerLock intermediary server handling the transmission.
Hybrid mode: Some queries go local (typically lower-complexity tasks, sensitive sessions), some go cloud (high-capability requirements). The routing can be configured manually or set to automatic based on query characteristics.
What leaves your device at this step: In cloud mode, the anonymized query is transmitted directly to the provider API over HTTPS. Your IP address is visible to the provider. The content of the query is anonymized. Your API key authenticates the request — providers can associate requests with your account, but not with identifiable query content.
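The three modes reduce to a small decision function. The sketch below is hypothetical: the mode names, the `sensitive` flag, and the length-based complexity heuristic for hybrid mode are stand-ins for whatever criteria the real router actually uses.

```python
from dataclasses import dataclass
from enum import Enum

class Mode(Enum):
    LOCAL_ONLY = "local"
    CLOUD = "cloud"
    HYBRID = "hybrid"

@dataclass
class Route:
    target: str        # "ollama" or a cloud provider name
    leaves_device: bool

def route(query: str, mode: Mode, sensitive: bool = False,
          provider: str = "anthropic") -> Route:
    """Decide where an already-anonymized query goes.

    Hybrid heuristic (illustrative only): sensitive sessions and
    short, low-complexity queries stay local; the rest go to cloud.
    """
    if mode is Mode.LOCAL_ONLY:
        return Route("ollama", leaves_device=False)
    if mode is Mode.CLOUD:
        return Route(provider, leaves_device=True)
    # Hybrid: query length as a stand-in complexity signal.
    if sensitive or len(query) < 200:
        return Route("ollama", leaves_device=False)
    return Route(provider, leaves_device=True)
```

The key invariant is visible in the return values: `leaves_device` is only ever `True` for a cloud target, and by that point the query has already passed through the anonymizer.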
Step 4: Provider Processing
The cloud AI provider — OpenAI, Anthropic, Groq, Gemini, Mistral, or DeepSeek — receives the anonymized query and processes it. The provider has no access to:
- Your real name or your clients' names (stripped by the anonymizer)
- Your company name or your clients' company names (stripped)
- Email addresses, phone numbers, or other contact identifiers (stripped)
- Your previous sessions (not transmitted — context lives in your local session only)
- Your vault contents (never transmitted)
- Your persona files (not sent to the provider; they shape context locally, and only a summary of that context is ever included in the prompt)
The provider does have access to:
- The anonymized query content (the substance of what you're asking)
- Your API key (if BYOK) or HammerLock's pooled key (if credits)
- Your IP address (standard for any API request)
- Provider-side logs per their own retention policies
This is the exposure window. The provider sees the subject matter of your query, even if they don't see the identities involved. For most professional use cases, this is an acceptable tradeoff. For highly sensitive queries where even the subject matter can't be disclosed to third parties, use local-only mode.
Step 5: Response Transmission
The provider streams the response back to your device over HTTPS. Each token arrives encrypted in transit (TLS 1.3 on all major providers). The response is received by the HammerLockAI runtime running locally.
At no point is the response written to a HammerLock server. It streams directly from the provider to your local application.
What leaves your device at this step: Nothing.
Step 6: Response Restoration
The locally held PII mapping from Step 2 is applied to the incoming response. Placeholders in the response — "Person A," "Company A," "email_1@domain.com" — are replaced with the original values. This restoration happens in local memory before the response is displayed.
What you see in the interface is the complete, contextually accurate response with real names and identifiers restored. The provider never saw those real values; you see them because the restoration happened on your device.
What leaves your device at this step: Nothing.
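Restoration is the inverse of the Step 2 substitution: walk the in-memory mapping and put the real values back. A minimal sketch, using the same illustrative `<email_1>`-style placeholder format assumed earlier (the real formats and function names may differ):

```python
def restore(response: str, mapping: dict[str, str]) -> str:
    """Replace placeholders in a provider response with the original
    values from the in-memory session mapping built during anonymization.

    Runs entirely locally; the provider only ever saw the placeholders.
    """
    for placeholder, original in mapping.items():
        response = response.replace(placeholder, original)
    return response
```

So a response like `"Draft a reply to <email_1>."` is displayed as `"Draft a reply to jane@acme.com."`, even though the provider never received that address.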
Step 7: Local Storage and Encryption
When the session is saved, the full conversation — including the original (pre-anonymized) query, the restored response, and any associated context — is encrypted using AES-256-GCM and written to local storage. The encryption key is derived from your password using PBKDF2 key stretching.
The encrypted data on your device is unreadable without your password. There is no recovery mechanism — HammerLock has no copy of your password and cannot decrypt your vault. If you forget your password, your encrypted data cannot be recovered. This is a deliberate design choice. Recoverability requires a trusted party who holds a key; we don't hold one.
What is stored on HammerLock servers: Nothing. Your conversation history, vault, and persona files exist only on your device.
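The key-derivation half of this step can be sketched with the standard library. This is illustrative, not HammerLockAI's implementation: the salt size and iteration count are assumptions, and the AES-256-GCM encryption itself (typically done with a library such as `cryptography`) is left as a comment.

```python
import hashlib
import os
from typing import Optional

def derive_key(password: str, salt: Optional[bytes] = None,
               iterations: int = 600_000) -> tuple[bytes, bytes]:
    """Derive a 256-bit key from a password via PBKDF2-HMAC-SHA256.

    The derived key would feed AES-256-GCM to encrypt the vault on
    disk. Neither the password nor the key ever leaves the device,
    which is why there is no recovery path without the password.
    """
    salt = salt or os.urandom(16)          # random per-vault salt
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt,
                              iterations, dklen=32)
    # ciphertext = AESGCM(key).encrypt(nonce, session_bytes, None)
    return key, salt
```

Because the key exists only as a function of the password and the locally stored salt, losing the password means the ciphertext is unrecoverable by anyone, HammerLock included.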
The Complete Data Flow, Summarized
| Step | What Happens | Data Off Device? |
|---|---|---|
| You type | Input captured locally | No |
| PII scan | Identifiers detected, mapping created in memory | No |
| Routing | Query directed to local or cloud | No (local) / Anonymized query only (cloud) |
| Provider | Model processes anonymized query | Anonymized query visible to provider |
| Response | Streamed back over HTTPS | No |
| Restoration | Placeholders replaced locally | No |
| Storage | Encrypted, stored locally | No |
Where HammerLockAI Sits vs. the Law
For professionals operating under legal data protection regimes — attorney-client privilege, HIPAA, GDPR, fiduciary duty — understanding this architecture matters beyond convenience.
Attorney-client privilege: Transmitting client information to a third-party AI provider is a disclosure. Courts are still working through whether AI providers are "agents" of the attorney that preserve privilege. HammerLockAI's anonymizer enables a strong defensive position: no client identifiers are transmitted. Local-only mode, where nothing is transmitted at all, is stronger still.
HIPAA: Covered entities transmitting patient data to AI providers trigger Business Associate Agreement (BAA) requirements. Most AI providers' standard terms don't qualify as BAAs. HammerLockAI's anonymizer prevents patient identifiers from being transmitted; local-only mode prevents any PHI from leaving the covered entity's infrastructure.
GDPR: Data minimization is a foundational principle: collect and process only what is necessary. Transmitting only anonymized queries is a significantly more defensible position than transmitting full-context queries containing identifiable data. Local-only mode eliminates the third-country transfer problem entirely.
Fiduciary duty: Financial advisors and wealth managers owe their clients a duty that includes protecting confidential information. Running client portfolio data through a commercial AI tool with standard terms is a compliance risk. HammerLockAI's anonymizer, used correctly, meaningfully reduces that exposure.
The Honest Caveat
No privacy architecture is a complete legal opinion. This document describes the technical reality of how HammerLockAI processes data. Whether that technical reality satisfies your specific professional obligations under specific regulatory regimes is a legal question that your counsel should evaluate.
What we can say with precision: the architecture described here is the architecture HammerLockAI runs. Your data path is local — except for the anonymized query that reaches the provider in cloud mode. That is a meaningfully better posture than the alternative.
HammerLockAI is built on a fork of OpenClaw, the open-source agentic AI runtime. View the source on GitHub →