</p>
<p align="center">
- Privacy proxy for LLMs. Masks personal data and secrets before sending prompts to your provider.
+ Privacy proxy for OpenAI and Anthropic. Masks personal data and secrets before they reach the API.
</p>
<p align="center">
<br/>
-<img src="assets/dashboard.png" width="100%" alt="PasteGuard Dashboard">
+<img src="assets/demo.gif" width="100%" alt="PasteGuard Demo">
## What is PasteGuard?
-PasteGuard is a privacy proxy that masks personal data and secrets before sending prompts to LLM providers.
+PasteGuard is a privacy proxy that masks personal data and secrets before they reach OpenAI or Anthropic.
```
You send: "Email Dr. Sarah Chen at sarah@hospital.org"
**Two ways to protect your data:**
-- **Mask Mode** — Replace PII with placeholders, send to your provider, restore in response. No local infrastructure needed.
-- **Route Mode** — Send PII requests to a local LLM (Ollama, vLLM, llama.cpp), everything else to your provider. Data never leaves your network.
+- **Mask Mode** — Replace PII with placeholders, send to OpenAI or Anthropic, restore in response. No local infrastructure needed.
+- **Route Mode** — Send PII requests to a local LLM (Ollama, vLLM, llama.cpp), everything else to OpenAI or Anthropic. Data never leaves your network.
Just change one URL to start protecting your data.
## Features
- **PII Detection** — Names, emails, phone numbers, credit cards, IBANs, and more
-- **Secrets Detection** — API keys, tokens, private keys caught before they reach the LLM
+- **Secrets Detection** — API keys, tokens, private keys caught before they reach OpenAI or Anthropic
- **Streaming Support** — Real-time unmasking as tokens arrive
- **24 Languages** — English, German, French, and 21 more
- **OpenAI** — Works with OpenAI and compatible APIs (Azure, OpenRouter, Groq, Together AI, etc.)
Point your app to PasteGuard:
-| Provider | PasteGuard URL | Original URL |
+| API | PasteGuard URL | Original URL |
|----------|----------------|--------------|
| OpenAI | `http://localhost:3000/openai/v1` | `https://api.openai.com/v1` |
| Anthropic | `http://localhost:3000/anthropic` | `https://api.anthropic.com` |
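The table above boils down to swapping one base URL in your client. A minimal sketch of that mapping (the helper name and the `PASTEGUARD_ORIGIN` constant are illustrative, not part of PasteGuard):

```typescript
// Map an upstream API base URL to its PasteGuard equivalent, per the
// table above. Assumes PasteGuard runs on localhost:3000 (the default).
const PASTEGUARD_ORIGIN = "http://localhost:3000";

const REWRITES: Record<string, string> = {
  "https://api.openai.com/v1": `${PASTEGUARD_ORIGIN}/openai/v1`,
  "https://api.anthropic.com": `${PASTEGUARD_ORIGIN}/anthropic`,
};

function toPasteGuardUrl(upstream: string): string {
  const rewritten = REWRITES[upstream];
  if (!rewritten) throw new Error(`No PasteGuard route for ${upstream}`);
  return rewritten;
}

console.log(toPasteGuardUrl("https://api.openai.com/v1"));
// http://localhost:3000/openai/v1
```

In practice you would pass the rewritten URL as your SDK's `baseURL` option; nothing else in your client changes.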
Dashboard: [http://localhost:3000/dashboard](http://localhost:3000/dashboard)
+<img src="assets/dashboard.png" width="100%" alt="PasteGuard Dashboard">
+
### European Languages
For German, Spanish, French, Italian, Dutch, Polish, Portuguese, and Romanian:
```
<Note>
-This endpoint supports both **mask mode** and **route mode**. Route mode requires a local provider with Anthropic API support (e.g., Ollama). The request format follows the [Anthropic Messages API](https://platform.claude.com/docs/en/api/messages).
+This endpoint supports both **mask mode** and **route mode**. Route mode requires a local LLM with Anthropic API support (e.g., Ollama). The request format follows the [Anthropic Messages API](https://platform.claude.com/docs/en/api/messages).
</Note>
## Request
---
title: Mask Mode
-description: Replace PII with placeholders before sending to your provider
+description: Replace PII with placeholders before sending to OpenAI or Anthropic
---
# Mask Mode
-Mask mode replaces PII with placeholders before sending to your configured provider. The response is automatically unmasked before returning to you.
+Mask mode replaces PII with placeholders before sending to OpenAI or Anthropic. The response is automatically unmasked before returning to you.
## How It Works
PasteGuard finds: `Dr. Sarah Chen` (PERSON), `sarah.chen@hospital.org` (EMAIL)
</Step>
<Step title="Masked request sent">
- Provider receives: `"Write a follow-up email to [[PERSON_1]] ([[EMAIL_ADDRESS_1]])"`
+ OpenAI/Anthropic receives: `"Write a follow-up email to [[PERSON_1]] ([[EMAIL_ADDRESS_1]])"`
</Step>
<Step title="Response masked">
- Provider responds: `"Dear [[PERSON_1]], Following up on our discussion..."`
+ OpenAI/Anthropic responds: `"Dear [[PERSON_1]], Following up on our discussion..."`
</Step>
<Step title="Response unmasked">
You receive: `"Dear Dr. Sarah Chen, Following up on our discussion..."`
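The four steps above can be sketched as a pair of pure functions. This is a minimal illustration, not PasteGuard's actual detector: the entities are supplied by the caller here, and only the `[[TYPE_N]]` placeholder format matches what the docs show.

```typescript
// Minimal mask/unmask sketch. In PasteGuard the `entities` list would
// come from the PII detector; here the caller supplies it directly.
type Entity = { text: string; type: string };

function mask(prompt: string, entities: Entity[]) {
  const map = new Map<string, string>(); // placeholder -> original
  const counters: Record<string, number> = {};
  let masked = prompt;
  for (const e of entities) {
    counters[e.type] = (counters[e.type] ?? 0) + 1;
    const placeholder = `[[${e.type}_${counters[e.type]}]]`;
    map.set(placeholder, e.text);
    masked = masked.split(e.text).join(placeholder);
  }
  return { masked, map };
}

function unmask(response: string, map: Map<string, string>) {
  let restored = response;
  for (const [placeholder, original] of map) {
    restored = restored.split(placeholder).join(original);
  }
  return restored;
}

const { masked, map } = mask(
  "Write a follow-up email to Dr. Sarah Chen (sarah.chen@hospital.org)",
  [
    { text: "Dr. Sarah Chen", type: "PERSON" },
    { text: "sarah.chen@hospital.org", type: "EMAIL_ADDRESS" },
  ],
);
console.log(masked);
// Write a follow-up email to [[PERSON_1]] ([[EMAIL_ADDRESS_1]])
console.log(unmask("Dear [[PERSON_1]], Following up...", map));
// Dear Dr. Sarah Chen, Following up...
```

The key property is that the placeholder map never leaves the proxy, so the upstream API only ever sees the masked text.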
## When to Use
- You want a simple setup without local infrastructure
-- Want to use OpenAI, Anthropic, or compatible providers while protecting PII
+- You want to use OpenAI or Anthropic while protecting PII
## Configuration
# Route Mode
-Route mode sends requests containing PII to a local LLM. Requests without PII go to your configured provider.
+Route mode sends requests containing PII to a local LLM. Requests without PII go to OpenAI or Anthropic.
## How It Works
PII stays on your network.
</Card>
<Card title="Request without PII" icon="server">
- Routed to **Configured Provider** (OpenAI, Anthropic, Azure, etc.)
+ Routed to **OpenAI or Anthropic**
- Full provider performance.
+ Full performance.
</Card>
</CardGroup>
```
In route mode:
-- **No PII detected** → Request goes to configured provider (OpenAI or Anthropic)
-- **PII detected** → Request goes to local provider
+- **No PII detected** → Request goes to OpenAI or Anthropic
+- **PII detected** → Request goes to local LLM
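The two routing rules above amount to a single decision function. A toy sketch, where `detectPII` stands in for PasteGuard's real detector and only matches email addresses for illustration:

```typescript
// Toy routing decision. A real deployment uses PasteGuard's full PII
// detector; this regex catches only email addresses, for the sketch.
type Upstream = "local" | "openai-or-anthropic";

function detectPII(prompt: string): boolean {
  return /\b[\w.+-]+@[\w-]+\.[\w.]+\b/.test(prompt);
}

function route(prompt: string): Upstream {
  return detectPII(prompt) ? "local" : "openai-or-anthropic";
}

console.log(route("Summarize this article"));        // openai-or-anthropic
console.log(route("Email sarah.chen@hospital.org")); // local
```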
<Note>
-For Anthropic requests, the local provider must support the Anthropic Messages API (e.g., Ollama with Anthropic API compatibility).
+For Anthropic requests, the local LLM must support the Anthropic Messages API (e.g., Ollama with Anthropic API compatibility).
</Note>
-## Local Provider Setup
+## Local LLM Setup
### Ollama
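A hedged sketch of what a local Ollama entry might look like. Only the `local` and `base_url` keys appear elsewhere in these docs; the `model` key is an assumption, and `http://localhost:11434/v1` is Ollama's standard OpenAI-compatible endpoint.

```yaml
# Illustrative only — verify key names against the generated config
# reference. `model` is an assumed key, not confirmed schema.
local:
  base_url: http://localhost:11434/v1  # Ollama's OpenAI-compatible endpoint
  model: llama3.1
```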
X-PasteGuard-Language: en
```
-When routed to configured provider:
+When routed to OpenAI or Anthropic:
```
X-PasteGuard-Mode: route
| Action | Description |
|--------|-------------|
| `mask` | Replace secrets with placeholders, restore in response (default) |
-| `block` | Return HTTP 400, request never reaches LLM |
+| `block` | Return HTTP 400, request never reaches OpenAI or Anthropic |
| `route_local` | Route to local LLM (requires route mode) |
### Mask (Default)
action: block
```
-Request is rejected with HTTP 400. The secret never reaches the LLM.
+Request is rejected with HTTP 400. The secret never reaches OpenAI or Anthropic.
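The `block` action above can be sketched as a guard that runs before anything is forwarded. Here `detectSecret` is a toy check for one common key prefix, not PasteGuard's real secret scanner, and the return shape is illustrative:

```typescript
// Sketch of the `block` action: reject with HTTP 400 before the
// request is forwarded anywhere. `detectSecret` is a toy matcher for
// one key prefix only; the real scanner covers many secret formats.
function detectSecret(prompt: string): boolean {
  return /\bsk-[A-Za-z0-9]{20,}\b/.test(prompt);
}

function handle(prompt: string): { status: number; forwarded: boolean } {
  if (detectSecret(prompt)) {
    return { status: 400, forwarded: false }; // never reaches the upstream API
  }
  return { status: 200, forwarded: true };
}

console.log(handle("my key is sk-" + "a".repeat(24)));
// { status: 400, forwarded: false }
```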
### Route to Local
log_masked_content: true
```
-Shows what was actually sent to your provider with PII replaced by placeholders.
+Shows what was actually sent to OpenAI or Anthropic with PII replaced by placeholders.
### No Content
| Value | Description |
|-------|-------------|
-| `mask` | Replace PII with placeholders, send to provider, restore in response |
-| `route` | PII requests stay on your local LLM (Ollama, vLLM, llama.cpp), others go to your configured provider |
+| `mask` | Replace PII with placeholders, send to OpenAI or Anthropic, restore in response |
+| `route` | PII requests stay on your local LLM (Ollama, vLLM, llama.cpp), others go to OpenAI or Anthropic |
See [Mask Mode](/concepts/mask-mode) and [Route Mode](/concepts/route-mode) for details.
---
title: Providers
-description: Configure your LLM providers
+description: Configure OpenAI, Anthropic, and local LLM endpoints
---
# Providers
-PasteGuard supports two provider types: configured providers (`providers`) and local provider (`local`).
+Configure endpoints for OpenAI, Anthropic, and local LLMs.
## OpenAI Provider
| `base_url` | Anthropic API endpoint |
| `api_key` | Optional. Used if client doesn't send `x-api-key` header |
-## Local Provider
+## Local LLM
Required for route mode only. Your local LLM for PII requests.
## API Key Handling
-PasteGuard forwards your client's authentication headers to the configured provider. You can optionally set `api_key` in config as a fallback:
+PasteGuard forwards your client's authentication headers to OpenAI or Anthropic. You can optionally set `api_key` in config as a fallback:
```yaml
providers:
| Action | Description |
|--------|-------------|
| `mask` | Replace secrets with placeholders, restore in response (default) |
-| `block` | Return HTTP 400, request never reaches LLM |
+| `block` | Return HTTP 400, request never reaches OpenAI or Anthropic |
| `route_local` | Route to local LLM (requires route mode) |
### Mask (Default)
PasteGuard drops into your existing workflow. Point your tools to PasteGuard and every request gets PII protection automatically.
-| Provider | PasteGuard URL |
+| API | PasteGuard URL |
|----------|----------------|
| OpenAI | `http://localhost:3000/openai/v1` |
| Anthropic | `http://localhost:3000/anthropic` |
endpoints:
custom:
- name: "PasteGuard"
- apiKey: "${OPENAI_API_KEY}" # Your API key, forwarded to provider
+ apiKey: "${OPENAI_API_KEY}" # Your API key, forwarded to OpenAI
baseURL: "http://localhost:3000/openai/v1"
models:
default: ["gpt-5.2"]
---
title: Introduction
-description: Privacy proxy for LLMs
+description: Privacy proxy for OpenAI and Anthropic
---
-PasteGuard masks personal data and secrets before sending prompts to LLM providers.
+PasteGuard masks personal data and secrets before they reach OpenAI or Anthropic.
```
You send: "Email Dr. Sarah Chen at sarah@hospital.org"
You get: Response with original names restored
```
-PasteGuard sits between your app and the LLM API:
+PasteGuard sits between your app and the API:
<Frame>
<img src="/images/demo.gif" alt="PasteGuard Demo" />
| Mode | How it works |
|------|--------------|
-| **Mask** | Replace PII with placeholders, send to provider, restore in response |
-| **Route** | PII requests stay on your local LLM (Ollama, vLLM, llama.cpp), others go to your configured provider |
+| **Mask** | Replace PII with placeholders, send to OpenAI or Anthropic, restore in response |
+| **Route** | PII requests stay on your local LLM (Ollama, vLLM, llama.cpp), others go to OpenAI or Anthropic |
## Browser Extension (Beta)
## Features
- **PII Detection** — Names, emails, phone numbers, credit cards, IBANs, and more
-- **Secrets Detection** — API keys, tokens, private keys caught before they reach the LLM
+- **Secrets Detection** — API keys, tokens, private keys caught before they reach OpenAI or Anthropic
- **Streaming Support** — Real-time unmasking as tokens arrive
- **24 Languages** — English, German, French, and 21 more
- **OpenAI** — Works with OpenAI and compatible APIs (Azure, OpenRouter, Groq, Together AI, etc.)
- Request history
- Detected PII entities
-- Masked content sent to the LLM
+- Masked content sent to OpenAI or Anthropic
<Frame>
<img src="/images/dashboard.png" alt="PasteGuard Dashboard" />
{
"name": "pasteguard",
- "version": "0.1.0",
+ "version": "0.2.1",
- "description": "Privacy proxy for LLMs. Masks personal data and secrets before sending to your provider.",
+ "description": "Privacy proxy for OpenAI and Anthropic. Masks personal data and secrets before they reach the API.",
"type": "module",
"main": "src/index.ts",
const body = (await res.json()) as Record<string, unknown>;
expect(body.name).toBe("PasteGuard");
- expect(body.version).toBe("0.1.0");
+ expect(body.version).toMatch(/^\d+\.\d+\.\d+$/);
expect(body.mode).toBeDefined();
expect(body.providers).toBeDefined();
expect(body.pii_detection).toBeDefined();