AI Data Leak Prevention

Stop Sensitive Data from Leaking into AI Tools

Every day, employees paste API keys, passwords, SSNs, and client PII into ChatGPT, Copilot, and Gemini. Locki's Smart Suggestions detects sensitive patterns in real time — and alerts your team before the data is submitted.

AES-256-GCM

Bank-grade encryption

Zero-Knowledge

We never see your data

Local Processing

Encryption in the browser

GDPR Ready

Compliant by design

Open Audit

Transparent cryptography

Free Trial

14 days, no credit card

AI tools see everything your employees type

AI assistants are now part of everyday work. But every prompt is sent to a third-party server. When employees paste sensitive data without thinking, it becomes part of someone else's infrastructure.

AI tools receive what you type

Every message sent to ChatGPT, Copilot, or Gemini is transmitted to external model servers. You have no control over how long it is retained, who can access it, or whether it influences future model outputs.

Employees don't always recognise sensitive data

People paste API keys into AI debugging sessions, include client names in AI-drafted emails, or ask AI to summarise documents containing personal data — often without realising the risk.

Security teams have no visibility

Unlike email or file sharing, AI tool usage is invisible to IT. There is no audit trail of what was pasted into ChatGPT last Tuesday, no DLP policy that stops it, and no alert when it happens.

One paste can be a reportable incident

Sending a single Social Security Number, private key, or credit card number to an external AI service may constitute a data breach under GDPR, HIPAA, or PCI-DSS — regardless of intent.

Smart Suggestions: real-time DLP, right in the browser

Real-time browser DLP

Locki monitors every text field as employees type or paste — including AI chat interfaces. Sensitive patterns are detected before the user clicks Submit, not after.

8 built-in detection patterns

Ready to use on day one: credit cards, IBANs, AWS access keys, PEM private keys, Social Security Numbers, generic API secrets, email addresses, and phone numbers.
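
To make the pattern types concrete, here is a rough sketch of what detectors like these can look like. These regexes are illustrative only, not Locki's actual rules; production DLP patterns typically add checksums, context, and locale variants.

```typescript
// Illustrative regexes for a few of the built-in pattern types.
// NOT Locki's actual rules — real patterns are more nuanced.
const patterns: Record<string, RegExp> = {
  awsAccessKey: /\bAKIA[0-9A-Z]{16}\b/,                     // AWS access key ID
  ssn: /\b\d{3}-\d{2}-\d{4}\b/,                             // US Social Security Number
  pemPrivateKey: /-----BEGIN (?:[A-Z]+ )?PRIVATE KEY-----/, // PEM private key header
  iban: /\b[A-Z]{2}\d{2}[A-Z0-9]{11,30}\b/,                 // IBAN (rough shape only)
};

// Return the names of every pattern that fires on the given text.
function matchedPatterns(text: string): string[] {
  return Object.entries(patterns)
    .filter(([, re]) => re.test(text))
    .map(([name]) => name);
}
```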

Scoped to groups and websites

Admins assign rules to specific user groups — DevOps, HR, Legal — and scope them to specific domains like chatgpt.com or github.com. The right rules apply to the right people, in the right context.
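
Scoping can be pictured as a simple filter over the rule list. The shape below is a hypothetical sketch; the field names and `rulesFor` helper are assumptions for illustration, not Locki's API.

```typescript
// Hypothetical rule shape — field names are assumptions, not Locki's API.
interface DlpRule {
  name: string;
  pattern: RegExp;
  groups: string[];  // user groups the rule applies to, e.g. "DevOps"
  domains: string[]; // sites the rule is scoped to, e.g. "chatgpt.com"
}

// Resolve which rules apply for a given user on a given site.
function rulesFor(userGroups: string[], domain: string, rules: DlpRule[]): DlpRule[] {
  return rules.filter(
    (r) => r.domains.includes(domain) && r.groups.some((g) => userGroups.includes(g))
  );
}

const exampleRules: DlpRule[] = [
  { name: "aws-key", pattern: /\bAKIA[0-9A-Z]{16}\b/, groups: ["DevOps"], domains: ["chatgpt.com", "github.com"] },
  { name: "ssn", pattern: /\b\d{3}-\d{2}-\d{4}\b/, groups: ["HR"], domains: ["chatgpt.com"] },
];
```

An HR user on github.com would see no alerts from either rule, while a DevOps user on chatgpt.com would be covered by the AWS key rule only.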

Negligible performance overhead

Detection runs as a lightweight regex engine inside the browser extension. No content is ever sent to Locki servers for analysis. The only server contact is to sync the rule list.
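
A matcher in the spirit described above can be sketched in a few lines: scan a field's text against the synced rule list and report what matched and where. The names here are illustrative, not Locki's internals.

```typescript
// What a single on-device detection could carry.
interface Detection {
  rule: string;   // which pattern fired
  start: number;  // offset of the match in the text
  sample: string; // the matched fragment (stays on-device)
}

// Scan text against the synced rule list — purely local, no network I/O.
function detect(text: string, rules: { name: string; pattern: RegExp }[]): Detection[] {
  const hits: Detection[] = [];
  for (const { name, pattern } of rules) {
    // Re-create with the global flag so every occurrence is found.
    const flags = pattern.flags.includes("g") ? pattern.flags : pattern.flags + "g";
    for (const m of text.matchAll(new RegExp(pattern.source, flags))) {
      hits.push({ rule: name, start: m.index ?? 0, sample: m[0] });
    }
  }
  return hits; // consumed locally by the alert UI — never sent anywhere
}
```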

How It Works

Up and running in four steps

1
Step 1

Subscribe to Teams Pro

Smart Suggestions is part of the Locki Teams Pro plan. Subscribe and deploy the browser extension to your team via managed installation or share the download link.

2
Step 2

Configure rules in the dashboard

8 detection patterns are active by default. From the admin dashboard, create custom rules, assign them to user groups (DevOps, HR, Legal), and target specific websites like chatgpt.com or copilot.microsoft.com.

3
Step 3

Locki monitors in the background

The extension runs silently across all browser tabs. When an employee types or pastes a matching pattern into any text field — including AI chat interfaces — a real-time in-browser alert appears.
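
As a rough sketch of how a content script might hook paste events: the pure check is kept separate from the DOM wiring, so only the last few lines are browser-specific. All names are illustrative, not Locki's implementation.

```typescript
// Illustrative detectors — a real deployment would use the synced rule list.
const sensitive = [/\bAKIA[0-9A-Z]{16}\b/, /-----BEGIN (?:[A-Z]+ )?PRIVATE KEY-----/];

// Pure check: does pasted text contain any sensitive pattern?
function shouldAlert(pasted: string): boolean {
  return sensitive.some((re) => re.test(pasted));
}

// Browser-only wiring (guarded so the sketch also loads outside a browser):
if (typeof document !== "undefined") {
  document.addEventListener("paste", (e: any) => {
    const text = e.clipboardData?.getData("text") ?? "";
    if (shouldAlert(text)) {
      // A real extension would render an in-page alert here instead.
      console.warn("Sensitive pattern detected in pasted text");
    }
  });
}
```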

4
Step 4

Employee decides what to do next

The alert prompts the user to reconsider. They can encrypt the sensitive content with Locki before sending, redact it, or knowingly dismiss the warning, all without leaving the page.
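
One way the redact option could work is to replace each detected match with a placeholder before the text is sent. This is a sketch under that assumption, not Locki's actual behaviour.

```typescript
// Replace every occurrence of each pattern with a placeholder.
// (For simplicity this rebuilds each regex with only the global flag.)
function redact(text: string, patterns: RegExp[]): string {
  return patterns.reduce(
    (out, re) => out.replace(new RegExp(re.source, "g"), "[REDACTED]"),
    text
  );
}
```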

Built-in Detection Patterns

8 patterns, active on day one

Credit Card Numbers
Bank Account Numbers (IBAN)
AWS Access Keys
Private Keys (PEM)
Social Security Numbers
API Keys & Secrets
Email Addresses
Phone Numbers

Smart Suggestions is a Locki Teams Pro feature

Admins can create unlimited custom patterns, assign them to user groups, and scope them to specific websites — all from the dashboard.

View Teams Pro pricing

Use Cases

Who uses Smart Suggestions

ChatGPT & Claude users

Employees drafting customer-facing messages often paste client names, email addresses, or account numbers into AI writing assistants. Smart Suggestions catches PII before it leaves the browser.

Developers using Copilot

Developers frequently paste code snippets containing API keys, database credentials, or tokens when asking AI for debugging help. Locki detects these secrets and prompts the dev to sanitise first.

AI-assisted customer support

Support agents using AI tools to summarise tickets may paste customer records with personal data. Locki alerts the agent before the data reaches the AI provider.

GDPR & HIPAA compliance teams

Compliance teams can enforce data handling policies directly in the browser — without VPNs, proxies, or application-layer controls. Detections are logged in the Locki audit trail.

DevOps and infrastructure teams

Prevent .env file contents, SSH private keys, and cloud provider tokens from being pasted into AI coding assistants. Scoped rules mean only DevOps group members see these alerts.

Legal and HR departments

Lawyers drafting with AI and HR teams summarising employee records are high-risk users. Smart Suggestions detects confidential data patterns and stops leaks before they happen.

Detection happens in your browser — nothing leaves your device

Locki's DLP engine runs entirely inside the browser extension as a lightweight regex matcher. No content is ever sent to Locki servers for analysis — the only network call is to sync the rule list (patterns, not text). Smart Suggestions is part of Locki Teams Pro and complements Locki's AES-256-GCM encryption: when sensitive data is detected, users can encrypt it on the spot before deciding whether to send it.

No Credit Card Required

Start Your 14-Day Free Trial

No credit card required. Full Teams Pro access from day one.

14-day free trial · Full Teams Pro access · Cancel anytime