paulund
#tools #ai #github #copilot

GitHub Copilot CLI has always been a solid agentic terminal experience, but it locked you into GitHub's model routing. That changes now. You can point it at your own provider, or run everything locally, and GitHub never needs to be involved.

You configure it with environment variables set before launching the CLI. Remote providers such as Azure OpenAI, Anthropic, and any OpenAI-compatible endpoint are supported; for local models, it works with Ollama, vLLM, and Foundry Local. Your model needs to support tool calling and streaming, and a 128k context window is recommended for best results.
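As a rough sketch, pointing the CLI at your own provider might look like the following. The variable names here (`OPENAI_BASE_URL`, `OPENAI_API_KEY`) are assumptions based on common OpenAI-compatible conventions, not confirmed from the changelog, so check GitHub's docs for the exact names the CLI reads:

```shell
# Hypothetical setup -- the env var names below are assumptions based on
# OpenAI-compatible conventions; verify against the BYOK changelog.

# Remote provider (any OpenAI-compatible endpoint):
export OPENAI_BASE_URL="https://api.example.com/v1"  # your provider's endpoint
export OPENAI_API_KEY="sk-..."                       # your provider key

# Or a local Ollama server, which exposes an OpenAI-compatible API
# at /v1 on its default port:
ollama pull qwen2.5-coder                    # pick a model with tool-calling support
export OPENAI_BASE_URL="http://localhost:11434/v1"
export OPENAI_API_KEY="ollama"               # Ollama ignores the key, but one must be set

copilot                                      # launch the CLI with the env vars in place
```

Either way, the pattern is the same: export the endpoint and credentials first, then launch the CLI in that shell.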

If you need a fully air-gapped setup, set COPILOT_OFFLINE=true. This stops the CLI from contacting GitHub's servers entirely and disables telemetry. When using your own provider you also no longer need a GitHub account: just set your provider credentials and you're running.
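Putting it together, an air-gapped local run might look like this. COPILOT_OFFLINE comes from the article above; the provider variable names are assumptions, as before:

```shell
# Air-gapped sketch: no GitHub contact, no telemetry.
export COPILOT_OFFLINE=true

# Provider credentials in place of a GitHub account.
# Variable names are assumed OpenAI-compatible conventions:
export OPENAI_BASE_URL="http://localhost:11434/v1"   # local Ollama server
export OPENAI_API_KEY="ollama"                       # placeholder key for Ollama

copilot
```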

This is useful if you're already paying for Anthropic or Azure OpenAI and want to consolidate spend, or if you're working in an environment where data leaving the machine isn't acceptable.

Read the GitHub Copilot CLI BYOK changelog

