Privacy Policy

Last updated: March 17, 2026

TL;DR: CMD+K runs entirely on your device. No servers, no accounts, no analytics. You bring your own API key for a cloud provider (OpenAI, Anthropic, Google Gemini, xAI, or OpenRouter), or run a local provider (Ollama, LM Studio) with no API key and no external network requests. Data goes directly from your device to your selected provider, nowhere else.

How CMD+K Works

CMD+K is a native desktop overlay application. You press a global hotkey, type a natural-language description of what you want to do, and CMD+K generates a working terminal command using AI models from your chosen provider. The entire application runs locally on your device. There is no CMD+K server in between. When using a local provider (Ollama or LM Studio), inference also runs on your device and no data leaves your machine at all.

What Is Sent to the AI Provider

When you submit a query, CMD+K sends context to your selected AI provider's API so the model can generate an accurate command. The exact data depends on which mode you are using.

Terminal Mode

  • Terminal application name (e.g., Terminal.app, iTerm2)
  • Shell type (e.g., zsh, bash)
  • Current working directory
  • Last 25 lines of visible terminal output
  • Recent terminal text (visible buffer content, up to approximately 12% of the model's context window)
  • Running process name
  • Your query text
  • Session history (last 7 conversation turns)

Assistant Mode

  • Active application name
  • Browser console last line (if DevTools is open)
  • Visible screen text (approximately 4 KB)
  • Your question

For cloud providers, all data is sent over HTTPS directly to the provider you selected:

  • OpenAI — api.openai.com
  • Anthropic — api.anthropic.com
  • Google Gemini — generativelanguage.googleapis.com
  • xAI — api.x.ai
  • OpenRouter — openrouter.ai/api

For local providers, data stays entirely on your machine:

  • Ollama — localhost:11434 (default)
  • LM Studio — localhost:1234 (default)

No data passes through any CMD+K-operated server.
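As an illustrative sketch only (not the actual CMD+K source), the routing described above amounts to a direct lookup from the selected provider to its API base URL, with local providers resolving to loopback addresses; the provider keys and helper name here are hypothetical:

```python
# Endpoint table taken from the lists above; keys are illustrative.
CLOUD_ENDPOINTS = {
    "openai": "https://api.openai.com",
    "anthropic": "https://api.anthropic.com",
    "gemini": "https://generativelanguage.googleapis.com",
    "xai": "https://api.x.ai",
    "openrouter": "https://openrouter.ai/api",
}

LOCAL_ENDPOINTS = {
    "ollama": "http://localhost:11434",   # default port; configurable
    "lmstudio": "http://localhost:1234",  # default port; configurable
}

def base_url(provider: str) -> str:
    """Resolve the selected provider directly to its API base URL.

    There is no intermediate server in this lookup: local providers
    resolve to localhost, cloud providers to the vendor's own domain.
    """
    if provider in LOCAL_ENDPOINTS:
        return LOCAL_ENDPOINTS[provider]
    return CLOUD_ENDPOINTS[provider]
```

The point of the sketch is that the mapping is total and direct: every request target is either a loopback address or the selected vendor's own domain.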

What Is Automatically Filtered

Before any context is sent to the AI provider, CMD+K automatically scans for and redacts common secret patterns, including:

  • AWS access keys
  • Generic API tokens
  • xAI API tokens
  • OpenAI API keys
  • GitHub personal access tokens
  • PEM-encoded private keys
  • Shell exports containing secret-like variable names

Matched values are replaced with [REDACTED] before the request leaves your machine.
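A minimal sketch of this kind of pattern-based redaction, using an illustrative subset of patterns (the real pattern set lives in the open-source repository and may differ):

```python
import re

# Illustrative secret patterns only; not the project's actual rule set.
SECRET_PATTERNS = [
    re.compile(r"AKIA[0-9A-Z]{16}"),       # AWS access key ID
    re.compile(r"sk-[A-Za-z0-9_-]{20,}"),  # OpenAI-style API key
    re.compile(r"ghp_[A-Za-z0-9]{36}"),    # GitHub personal access token
    # PEM-encoded private key blocks
    re.compile(r"-----BEGIN [A-Z ]*PRIVATE KEY-----[\s\S]*?"
               r"-----END [A-Z ]*PRIVATE KEY-----"),
    # Shell exports with secret-like variable names (keep the name, drop the value)
    re.compile(r"(export\s+\w*(?:SECRET|TOKEN|KEY)\w*=)\S+"),
]

def redact(text: str) -> str:
    """Replace matched secret values with [REDACTED] before a request is built."""
    for pattern in SECRET_PATTERNS:
        if pattern.groups:
            text = pattern.sub(r"\1[REDACTED]", text)
        else:
            text = pattern.sub("[REDACTED]", text)
    return text
```

For example, `redact("export MY_SECRET=hunter2")` keeps the variable name but replaces the value, so the surrounding terminal context stays intelligible to the model.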

What Is NOT Sent

The following information never leaves your device:

  • Your API key (stored in macOS Keychain, Windows Credential Manager, or Linux system keyring — never included in prompts)
  • File contents from your filesystem
  • Browsing history
  • Location data
  • Device identifiers
  • Network configuration or IP addresses
  • System hardware information
  • Hotkey preferences or app settings

Where Your Data Is Stored

All persistent data is stored locally on your device:

  • System credential store — your API key(s) for cloud providers, encrypted by the operating system (macOS Keychain on Mac, Windows Credential Manager on Windows, system keyring via libsecret on Linux)
  • Local settings file — model preference, hotkey binding, feature toggles, and local provider base URLs

There is no CMD+K server, no cloud database, no remote backup, and no sync service.

Third-Party Services

CMD+K connects to the AI provider you select. For cloud providers, your queries and the context described above are sent to that provider to generate command suggestions. Once data reaches the provider, their privacy policy applies.

Local providers (Ollama, LM Studio) run entirely on your device. No data is sent to any third-party service when using a local provider. No API key is required.

Permissions

CMD+K uses platform APIs to read terminal output and detect the active application:

  • macOS — Accessibility API permission, granted in System Settings > Privacy & Security > Accessibility. You can revoke it at any time, which will prevent CMD+K from capturing terminal context.
  • Windows — No special permissions needed. CMD+K uses standard Win32 APIs (UI Automation, CreateToolhelp32Snapshot) that work without elevation.
  • Linux — /proc filesystem for process detection (standard, no special permissions). xdotool for keystroke simulation on X11. AT-SPI2 accessibility API for reading terminal text from VTE-based terminals (GNOME Terminal, Tilix, Terminator). kitty and WezTerm remote-control APIs for those terminals.

No Telemetry

CMD+K includes zero telemetry. There are no analytics SDKs, no tracking pixels, no cookies, no crash reporters, and no usage data transmitted anywhere. The application makes network requests only when you explicitly submit a query, and only to your selected AI provider.

Open Source

CMD+K is fully open source under the MIT license. The entire codebase is available on GitHub. Anyone can audit exactly what data is collected, how it is processed, and where it is sent.

Changes to This Policy

If this policy is updated, the "Last updated" date at the top of this page will change. Significant changes will also be noted in the project's GitHub release notes.

Contact

If you have questions about this privacy policy or CMD+K's data handling, please open an issue on GitHub Issues.

Policy History

March 15, 2026 — Linux support and cross-platform credential storage

TL;DR: CMD+K runs entirely on your device. No servers, no accounts, no analytics. You bring your own API key for the provider you choose (OpenAI, Anthropic, Google Gemini, xAI, or OpenRouter). Data goes directly from your device to your selected provider's API, nowhere else.

How CMD+K Works

CMD+K is a native desktop overlay application. You press a global hotkey, type a natural-language description of what you want to do, and CMD+K generates a working terminal command using AI models from your chosen provider. The entire application runs locally on your device. There is no CMD+K server in between.

What Is Sent to the AI Provider

When you submit a query, CMD+K sends context to your selected AI provider's API so the model can generate an accurate command. The exact data depends on which mode you are using.

Terminal Mode

  • Terminal application name (e.g., Terminal.app, iTerm2)
  • Shell type (e.g., zsh, bash)
  • Current working directory
  • Last 25 lines of visible terminal output
  • Recent terminal text (visible buffer content, up to approximately 12% of the model's context window)
  • Running process name
  • Your query text
  • Session history (last 7 conversation turns)

Assistant Mode

  • Active application name
  • Browser console last line (if DevTools is open)
  • Visible screen text (approximately 4 KB)
  • Your question

All data is sent over HTTPS directly to the provider you selected:

  • OpenAI — api.openai.com
  • Anthropic — api.anthropic.com
  • Google Gemini — generativelanguage.googleapis.com
  • xAI — api.x.ai
  • OpenRouter — openrouter.ai/api

No data passes through any CMD+K-operated server.

What Is Automatically Filtered

Before any context is sent to the AI provider, CMD+K automatically scans for and redacts common secret patterns, including:

  • AWS access keys
  • Generic API tokens
  • xAI API tokens
  • OpenAI API keys
  • GitHub personal access tokens
  • PEM-encoded private keys
  • Shell exports containing secret-like variable names

Matched values are replaced with [REDACTED] before the request leaves your machine.

What Is NOT Sent

The following information never leaves your device:

  • Your API key (stored in macOS Keychain, Windows Credential Manager, or Linux system keyring — never included in prompts)
  • File contents from your filesystem
  • Browsing history
  • Location data
  • Device identifiers
  • Network configuration or IP addresses
  • System hardware information
  • Hotkey preferences or app settings

Where Your Data Is Stored

All persistent data is stored locally on your device:

  • System credential store — your API key(s), encrypted by the operating system (macOS Keychain on Mac, Windows Credential Manager on Windows, system keyring via libsecret on Linux)
  • Local settings file — model preference, hotkey binding, feature toggles

There is no CMD+K server, no cloud database, no remote backup, and no sync service.

Third-Party Services

CMD+K connects to the AI provider you select. Your queries and the context described above are sent to that provider to generate command suggestions. Once data reaches the provider, their privacy policy applies.

Permissions

CMD+K uses platform APIs to read terminal output and detect the active application:

  • macOS — Accessibility API permission, granted in System Settings > Privacy & Security > Accessibility. You can revoke it at any time, which will prevent CMD+K from capturing terminal context.
  • Windows — No special permissions needed. CMD+K uses standard Win32 APIs (UI Automation, CreateToolhelp32Snapshot) that work without elevation.
  • Linux — /proc filesystem for process detection (standard, no special permissions). xdotool for keystroke simulation on X11. AT-SPI2 accessibility API for reading terminal text from VTE-based terminals (GNOME Terminal, Tilix, Terminator). kitty and WezTerm remote-control APIs for those terminals.

No Telemetry

CMD+K includes zero telemetry. There are no analytics SDKs, no tracking pixels, no cookies, no crash reporters, and no usage data transmitted anywhere. The application makes network requests only when you explicitly submit a query, and only to your selected AI provider.

Open Source

CMD+K is fully open source under the MIT license. The entire codebase is available on GitHub. Anyone can audit exactly what data is collected, how it is processed, and where it is sent.

Changes to This Policy

If this policy is updated, the changes will be reflected by the "Last updated" date at the top of this page. Significant changes will also be noted in the project's GitHub release notes.

Contact

If you have questions about this privacy policy or CMD+K's data handling, please open an issue on GitHub Issues.

March 9, 2026 — Multi-provider support (macOS and Windows)

TL;DR: CMD+K runs entirely on your device. No servers, no accounts, no analytics. You bring your own API key for the provider you choose (OpenAI, Anthropic, Google Gemini, xAI, or OpenRouter). Data goes directly from your device to your selected provider's API, nowhere else.

How CMD+K Works

CMD+K is a native desktop overlay application. You press a global hotkey, type a natural-language description of what you want to do, and CMD+K generates a working terminal command using AI models from your chosen provider. The entire application runs locally on your device. There is no CMD+K server in between.

What Is Sent to the AI Provider

When you submit a query, CMD+K sends context to your selected AI provider's API so the model can generate an accurate command. The exact data depends on which mode you are using.

Terminal Mode

  • Terminal application name (e.g., Terminal.app, iTerm2)
  • Shell type (e.g., zsh, bash)
  • Current working directory
  • Last 25 lines of visible terminal output
  • Running process name
  • Your query text
  • Session history (last 7 conversation turns)

Assistant Mode

  • Active application name
  • Browser console last line (if DevTools is open)
  • Visible screen text (approximately 4 KB)
  • Your question

All data is sent over HTTPS directly to the provider you selected:

  • OpenAI — api.openai.com
  • Anthropic — api.anthropic.com
  • Google Gemini — generativelanguage.googleapis.com
  • xAI — api.x.ai
  • OpenRouter — openrouter.ai/api

No data passes through any CMD+K-operated server.

What Is Automatically Filtered

Before any context is sent to the AI provider, CMD+K automatically scans for and redacts common secret patterns, including:

  • AWS access keys
  • Generic API tokens
  • xAI API tokens
  • OpenAI API keys
  • GitHub personal access tokens
  • PEM-encoded private keys
  • Shell exports containing secret-like variable names

Matched values are replaced with [REDACTED] before the request leaves your machine.

What Is NOT Sent

The following information never leaves your device:

  • Your API key (stored in macOS Keychain or Windows Credential Manager, never included in prompts)
  • File contents from your filesystem
  • Browsing history
  • Location data
  • Device identifiers
  • Network configuration or IP addresses
  • System hardware information
  • Hotkey preferences or app settings

Where Your Data Is Stored

All persistent data is stored locally on your device:

  • System credential store — your API key(s), encrypted by the operating system (macOS Keychain on Mac, Windows Credential Manager on Windows)
  • Local settings file — model preference, hotkey binding, feature toggles

There is no CMD+K server, no cloud database, no remote backup, and no sync service.

Third-Party Services

CMD+K connects to the AI provider you select. Your queries and the context described above are sent to that provider to generate command suggestions. Once data reaches the provider, their privacy policy applies.

Permissions

CMD+K uses platform accessibility APIs to read terminal output and detect the active application. On macOS, this permission must be explicitly granted in System Settings > Privacy & Security > Accessibility. You can revoke it at any time, which will prevent CMD+K from capturing terminal context.

No Telemetry

CMD+K includes zero telemetry. There are no analytics SDKs, no tracking pixels, no cookies, no crash reporters, and no usage data transmitted anywhere. The application makes network requests only when you explicitly submit a query, and only to your selected AI provider.

Open Source

CMD+K is fully open source under the MIT license. The entire codebase is available on GitHub. Anyone can audit exactly what data is collected, how it is processed, and where it is sent.

Changes to This Policy

If this policy is updated, the changes will be reflected by the "Last updated" date at the top of this page. Significant changes will also be noted in the project's GitHub release notes.

Contact

If you have questions about this privacy policy or CMD+K's data handling, please open an issue on GitHub Issues.

February 27, 2026 — Initial version (xAI-only provider support)

TL;DR (original): CMD+K runs entirely on your device. No servers, no accounts, no analytics. You bring your own xAI API key. Data goes directly from your device to xAI's API, nowhere else.

Third-Party Services (original): CMD+K connects to exactly one external service: the xAI API (api.x.ai). Your queries and the context described above are sent to xAI to generate command suggestions.

Endpoint: All data was sent over HTTPS directly to api.x.ai.