From: Stefan Gasser
Date: Tue, 20 Jan 2026 22:40:11 +0000 (+0100)
Subject: Replace generic provider/LLM terminology with OpenAI or Anthropic (#56)
X-Git-Url: http://git.99rst.org/?a=commitdiff_plain;h=4d2a80a4bd362dcb11dc78a5832afbb6e11af2e3;p=sgasser-llm-shield.git

Replace generic provider/LLM terminology with OpenAI or Anthropic (#56)

- Use specific names instead of "provider" or "LLM" in user-facing text
- Keep "local LLM" for Ollama/vLLM references
- Update README, docs, and configuration files
---

diff --git a/README.md b/README.md
index eef1409..7f1abeb 100644
--- a/README.md
+++ b/README.md
@@ -8,7 +8,7 @@

- Privacy proxy for LLMs. Masks personal data and secrets before sending prompts to your provider.
+ Privacy proxy for OpenAI and Anthropic. Masks personal data and secrets before they reach the API.

@@ -19,11 +19,11 @@
-PasteGuard Dashboard
+PasteGuard Demo
 ## What is PasteGuard?
-PasteGuard is a privacy proxy that masks personal data and secrets before sending prompts to LLM providers.
+PasteGuard is a privacy proxy that masks personal data and secrets before they reach OpenAI or Anthropic.
 ```
 You send: "Email Dr. Sarah Chen at sarah@hospital.org"
@@ -33,8 +33,8 @@ You get: Response with original names restored
 **Two ways to protect your data:**
-- **Mask Mode** — Replace PII with placeholders, send to your provider, restore in response. No local infrastructure needed.
-- **Route Mode** — Send PII requests to a local LLM (Ollama, vLLM, llama.cpp), everything else to your provider. Data never leaves your network.
+- **Mask Mode** — Replace PII with placeholders, send to OpenAI or Anthropic, restore in response. No local infrastructure needed.
+- **Route Mode** — Send PII requests to a local LLM (Ollama, vLLM, llama.cpp), everything else to OpenAI or Anthropic. Data never leaves your network.
 Just change one URL to start protecting your data.
@@ -52,7 +52,7 @@ Open source (Apache 2.0). Built in public — early feedback shapes the product.
 ## Features
 - **PII Detection** — Names, emails, phone numbers, credit cards, IBANs, and more
-- **Secrets Detection** — API keys, tokens, private keys caught before they reach the LLM
+- **Secrets Detection** — API keys, tokens, private keys caught before they reach OpenAI or Anthropic
 - **Streaming Support** — Real-time unmasking as tokens arrive
 - **24 Languages** — English, German, French, and 21 more
 - **OpenAI** — Works with OpenAI and compatible APIs (Azure, OpenRouter, Groq, Together AI, etc.)
@@ -69,13 +69,15 @@ docker run --rm -p 3000:3000 ghcr.io/sgasser/pasteguard:en
 Point your app to PasteGuard:
-| Provider | PasteGuard URL | Original URL |
+| API | PasteGuard URL | Original URL |
 |----------|----------------|--------------|
 | OpenAI | `http://localhost:3000/openai/v1` | `https://api.openai.com/v1` |
 | Anthropic | `http://localhost:3000/anthropic` | `https://api.anthropic.com` |
 Dashboard: [http://localhost:3000/dashboard](http://localhost:3000/dashboard)
+PasteGuard Dashboard
+
 ### European Languages
 For German, Spanish, French, Italian, Dutch, Polish, Portuguese, and Romanian:
diff --git a/docs/api-reference/anthropic.mdx b/docs/api-reference/anthropic.mdx
index 13079f7..f90078f 100644
--- a/docs/api-reference/anthropic.mdx
+++ b/docs/api-reference/anthropic.mdx
@@ -12,7 +12,7 @@ POST /anthropic/v1/messages
 ```
-This endpoint supports both **mask mode** and **route mode**. Route mode requires a local provider with Anthropic API support (e.g., Ollama). The request format follows the [Anthropic Messages API](https://platform.claude.com/docs/en/api/messages).
+This endpoint supports both **mask mode** and **route mode**. Route mode requires a local LLM with Anthropic API support (e.g., Ollama). The request format follows the [Anthropic Messages API](https://platform.claude.com/docs/en/api/messages).
 ## Request
diff --git a/docs/concepts/mask-mode.mdx b/docs/concepts/mask-mode.mdx
index 31f20a1..79c5db6 100644
--- a/docs/concepts/mask-mode.mdx
+++ b/docs/concepts/mask-mode.mdx
@@ -1,11 +1,11 @@
 ---
 title: Mask Mode
-description: Replace PII with placeholders before sending to your provider
+description: Replace PII with placeholders before sending to OpenAI or Anthropic
 ---
 # Mask Mode
-Mask mode replaces PII with placeholders before sending to your configured provider. The response is automatically unmasked before returning to you.
+Mask mode replaces PII with placeholders before sending to OpenAI or Anthropic. The response is automatically unmasked before returning to you.
 ## How It Works
@@ -17,10 +17,10 @@ Mask mode replaces PII with placeholders before sending to your configured provi
 PasteGuard finds: `Dr. Sarah Chen` (PERSON), `sarah.chen@hospital.org` (EMAIL)
- Provider receives: `"Write a follow-up email to [[PERSON_1]] ([[EMAIL_ADDRESS_1]])"`
+ OpenAI/Anthropic receives: `"Write a follow-up email to [[PERSON_1]] ([[EMAIL_ADDRESS_1]])"`
- Provider responds: `"Dear [[PERSON_1]], Following up on our discussion..."`
+ OpenAI/Anthropic responds: `"Dear [[PERSON_1]], Following up on our discussion..."`
 You receive: `"Dear Dr. Sarah Chen, Following up on our discussion..."`
@@ -30,7 +30,7 @@ Mask mode replaces PII with placeholders before sending to your configured provi
 ## When to Use
 - Simple setup without local infrastructure
-- Want to use OpenAI, Anthropic, or compatible providers while protecting PII
+- Want to use OpenAI or Anthropic while protecting PII
 ## Configuration
diff --git a/docs/concepts/route-mode.mdx b/docs/concepts/route-mode.mdx
index e9581f9..87d33e6 100644
--- a/docs/concepts/route-mode.mdx
+++ b/docs/concepts/route-mode.mdx
@@ -5,7 +5,7 @@ description: Route PII requests to a local LLM
 # Route Mode
-Route mode sends requests containing PII to a local LLM. Requests without PII go to your configured provider.
+Route mode sends requests containing PII to a local LLM. Requests without PII go to OpenAI or Anthropic.
 ## How It Works
@@ -16,9 +16,9 @@ Route mode sends requests containing PII to a local LLM. Requests without PII go
 PII stays on your network.
- Routed to **Configured Provider** (OpenAI, Anthropic, Azure, etc.)
+ Routed to **OpenAI or Anthropic**
- Full provider performance.
+ Full performance.
@@ -44,14 +44,14 @@ local:
 ```
 In route mode:
-- **No PII detected** → Request goes to configured provider (OpenAI or Anthropic)
-- **PII detected** → Request goes to local provider
+- **No PII detected** → Request goes to OpenAI or Anthropic
+- **PII detected** → Request goes to local LLM
-For Anthropic requests, the local provider must support the Anthropic Messages API (e.g., Ollama with Anthropic API compatibility).
+For Anthropic requests, the local LLM must support the Anthropic Messages API (e.g., Ollama with Anthropic API compatibility).
-## Local Provider Setup
+## Local LLM Setup
 ### Ollama
@@ -103,7 +103,7 @@ X-PasteGuard-PII-Detected: true
 X-PasteGuard-Language: en
 ```
-When routed to configured provider:
+When routed to OpenAI or Anthropic:
 ```
 X-PasteGuard-Mode: route
diff --git a/docs/concepts/secrets-detection.mdx b/docs/concepts/secrets-detection.mdx
index 0daca06..f969773 100644
--- a/docs/concepts/secrets-detection.mdx
+++ b/docs/concepts/secrets-detection.mdx
@@ -44,7 +44,7 @@ PasteGuard detects secrets before PII detection and can block, mask, or route re
 | Action | Description |
 |--------|-------------|
 | `mask` | Replace secrets with placeholders, restore in response (default) |
-| `block` | Return HTTP 400, request never reaches LLM |
+| `block` | Return HTTP 400, request never reaches OpenAI or Anthropic |
 | `route_local` | Route to local LLM (requires route mode) |
 ### Mask (Default)
@@ -64,7 +64,7 @@ secrets_detection:
   action: block
 ```
-Request is rejected with HTTP 400. The secret never reaches the LLM.
+Request is rejected with HTTP 400. The secret never reaches OpenAI or Anthropic.
 ### Route to Local
diff --git a/docs/configuration/logging.mdx b/docs/configuration/logging.mdx
index 284608e..ae54067 100644
--- a/docs/configuration/logging.mdx
+++ b/docs/configuration/logging.mdx
@@ -69,7 +69,7 @@ logging:
   log_masked_content: true
 ```
-Shows what was actually sent to your provider with PII replaced by placeholders.
+Shows what was actually sent to OpenAI or Anthropic with PII replaced by placeholders.
 ### No Content
diff --git a/docs/configuration/overview.mdx b/docs/configuration/overview.mdx
index 0ba4291..0db0f52 100644
--- a/docs/configuration/overview.mdx
+++ b/docs/configuration/overview.mdx
@@ -21,8 +21,8 @@ mode: mask
 | Value | Description |
 |-------|-------------|
-| `mask` | Replace PII with placeholders, send to provider, restore in response |
-| `route` | PII requests stay on your local LLM (Ollama, vLLM, llama.cpp), others go to your configured provider |
+| `mask` | Replace PII with placeholders, send to OpenAI or Anthropic, restore in response |
+| `route` | PII requests stay on your local LLM (Ollama, vLLM, llama.cpp), others go to OpenAI or Anthropic |
 See [Mask Mode](/concepts/mask-mode) and [Route Mode](/concepts/route-mode) for details.
diff --git a/docs/configuration/providers.mdx b/docs/configuration/providers.mdx
index 94f764f..32d4e5f 100644
--- a/docs/configuration/providers.mdx
+++ b/docs/configuration/providers.mdx
@@ -1,11 +1,11 @@
 ---
 title: Providers
-description: Configure your LLM providers
+description: Configure OpenAI, Anthropic, and local LLM endpoints
 ---
 # Providers
-PasteGuard supports two provider types: configured providers (`providers`) and local provider (`local`).
+Configure endpoints for OpenAI, Anthropic, and local LLMs.
 ## OpenAI Provider
@@ -76,7 +76,7 @@ providers:
 | `base_url` | Anthropic API endpoint |
 | `api_key` | Optional. Used if client doesn't send `x-api-key` header |
-## Local Provider
+## Local LLM
 Required for route mode only. Your local LLM for PII requests.
@@ -133,7 +133,7 @@ local:
 ## API Key Handling
-PasteGuard forwards your client's authentication headers to the configured provider. You can optionally set `api_key` in config as a fallback:
+PasteGuard forwards your client's authentication headers to OpenAI or Anthropic. You can optionally set `api_key` in config as a fallback:
 ```yaml
 providers:
diff --git a/docs/configuration/secrets-detection.mdx b/docs/configuration/secrets-detection.mdx
index 0b90bb4..d117228 100644
--- a/docs/configuration/secrets-detection.mdx
+++ b/docs/configuration/secrets-detection.mdx
@@ -31,7 +31,7 @@ secrets_detection:
 | Action | Description |
 |--------|-------------|
 | `mask` | Replace secrets with placeholders, restore in response (default) |
-| `block` | Return HTTP 400, request never reaches LLM |
+| `block` | Return HTTP 400, request never reaches OpenAI or Anthropic |
 | `route_local` | Route to local LLM (requires route mode) |
 ### Mask (Default)
diff --git a/docs/integrations.mdx b/docs/integrations.mdx
index 0d885fb..8076b6a 100644
--- a/docs/integrations.mdx
+++ b/docs/integrations.mdx
@@ -7,7 +7,7 @@ description: Use PasteGuard with IDEs, chat interfaces, and SDKs
 PasteGuard drops into your existing workflow. Point your tools to PasteGuard and every request gets PII protection automatically.
-| Provider | PasteGuard URL |
+| API | PasteGuard URL |
 |----------|----------------|
 | OpenAI | `http://localhost:3000/openai/v1` |
 | Anthropic | `http://localhost:3000/anthropic` |
@@ -58,7 +58,7 @@ cache: true
 endpoints:
   custom:
     - name: "PasteGuard"
-      apiKey: "${OPENAI_API_KEY}" # Your API key, forwarded to provider
+      apiKey: "${OPENAI_API_KEY}" # Your API key, forwarded to OpenAI
       baseURL: "http://localhost:3000/openai/v1"
       models:
         default: ["gpt-5.2"]
diff --git a/docs/introduction.mdx b/docs/introduction.mdx
index 1e4e843..20ed28b 100644
--- a/docs/introduction.mdx
+++ b/docs/introduction.mdx
@@ -1,9 +1,9 @@
 ---
 title: Introduction
-description: Privacy proxy for LLMs
+description: Privacy proxy for OpenAI and Anthropic
 ---
-PasteGuard masks personal data and secrets before sending prompts to LLM providers.
+PasteGuard masks personal data and secrets before they reach OpenAI or Anthropic.
 ```
 You send: "Email Dr. Sarah Chen at sarah@hospital.org"
@@ -11,7 +11,7 @@ LLM sees: "Email [[PERSON_1]] at [[EMAIL_ADDRESS_1]]"
 You get: Response with original names restored
 ```
-PasteGuard sits between your app and the LLM API:
+PasteGuard sits between your app and the API:
 PasteGuard Demo
@@ -21,8 +21,8 @@ Two privacy modes:
 | Mode | How it works |
 |------|--------------|
-| **Mask** | Replace PII with placeholders, send to provider, restore in response |
-| **Route** | PII requests stay on your local LLM (Ollama, vLLM, llama.cpp), others go to your configured provider |
+| **Mask** | Replace PII with placeholders, send to OpenAI or Anthropic, restore in response |
+| **Route** | PII requests stay on your local LLM (Ollama, vLLM, llama.cpp), others go to OpenAI or Anthropic |
 ## Browser Extension (Beta)
@@ -40,7 +40,7 @@ Open source (Apache 2.0). Built in public — early feedback shapes the product.
 ## Features
 - **PII Detection** — Names, emails, phone numbers, credit cards, IBANs, and more
-- **Secrets Detection** — API keys, tokens, private keys caught before they reach the LLM
+- **Secrets Detection** — API keys, tokens, private keys caught before they reach OpenAI or Anthropic
 - **Streaming Support** — Real-time unmasking as tokens arrive
 - **24 Languages** — English, German, French, and 21 more
 - **OpenAI** — Works with OpenAI and compatible APIs (Azure, OpenRouter, Groq, Together AI, etc.)
diff --git a/docs/quickstart.mdx b/docs/quickstart.mdx
index 3603f76..826173e 100644
--- a/docs/quickstart.mdx
+++ b/docs/quickstart.mdx
@@ -90,7 +90,7 @@ Open `http://localhost:3000/dashboard` in your browser to see:
 - Request history
 - Detected PII entities
-- Masked content sent to the LLM
+- Masked content sent to OpenAI or Anthropic
 PasteGuard Dashboard
diff --git a/package.json b/package.json
index 7347496..7d01eee 100644
--- a/package.json
+++ b/package.json
@@ -1,6 +1,6 @@
 {
   "name": "pasteguard",
-  "version": "0.1.0",
+  "version": "0.2.1",
   "description": "Privacy proxy for LLMs. Masks personal data and secrets before sending to your provider.",
   "type": "module",
   "main": "src/index.ts",
diff --git a/src/routes/info.test.ts b/src/routes/info.test.ts
index e5990a6..121a380 100644
--- a/src/routes/info.test.ts
+++ b/src/routes/info.test.ts
@@ -13,7 +13,7 @@ describe("GET /info", () => {
     const body = (await res.json()) as Record<string, unknown>;
     expect(body.name).toBe("PasteGuard");
-    expect(body.version).toBe("0.1.0");
+    expect(body.version).toMatch(/^\d+\.\d+\.\d+$/);
     expect(body.mode).toBeDefined();
     expect(body.providers).toBeDefined();
     expect(body.pii_detection).toBeDefined();
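
To make the "point your app to PasteGuard" step described in these docs concrete, here is a minimal client-side sketch. It assumes the official `openai` Node SDK and reuses the `gpt-5.2` model name from the LibreChat example; neither is required by PasteGuard, and the only real change on the client is the `baseURL`:

```typescript
// Minimal sketch: send OpenAI traffic through PasteGuard instead of calling
// https://api.openai.com/v1 directly. Assumes the official "openai" npm package;
// the model name mirrors the LibreChat example above and is illustrative only.
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "http://localhost:3000/openai/v1", // PasteGuard proxy endpoint
  apiKey: process.env.OPENAI_API_KEY,         // forwarded to OpenAI by PasteGuard
});

const response = await client.chat.completions.create({
  model: "gpt-5.2",
  messages: [
    { role: "user", content: "Email Dr. Sarah Chen at sarah@hospital.org" },
  ],
});

// PasteGuard masks the name and email before the request reaches OpenAI and
// restores them in the response, so the text printed here contains the originals.
console.log(response.choices[0].message.content);
```

Anthropic clients work the same way: swap `https://api.anthropic.com` for `http://localhost:3000/anthropic` and leave everything else unchanged.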