Gemini CLI Tutorial #9 - MCP Servers & Extensions

Programming | December 29, 2025 | 6.84K views | 13:05

TL;DR

This tutorial explains how to extend Gemini CLI beyond local codebase interaction using MCP (Model Context Protocol) servers to connect with external APIs like Firebase, and demonstrates how to install Gemini CLI extensions that bundle MCP servers with custom commands and context files.

💡 Understanding MCP Servers

MCP bridges AI to external services

MCP (Model Context Protocol) is an Anthropic-created standard that allows AI clients like Gemini CLI to interact with external APIs through tool functions without the AI model directly accessing those services.

Context7 provides current documentation

The Context7 MCP server gives Gemini access to up-to-date framework documentation (React, Next.js, etc.), solving the problem of AI models relying on outdated training data when implementing new features.

⚙️ Configuration and Setup

Register servers in settings.json

MCP servers are configured in the project's `.gemini/settings.json` file under the `mcpServers` property, or globally in the home directory for cross-project access.
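
As a rough sketch, a project-level `.gemini/settings.json` registering a single server might look like the following; the server name and the `npx` package are illustrative assumptions rather than details taken from the video:

```json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp"]
    }
  }
}
```

Gemini CLI launches the configured command and exposes whatever tools the server advertises.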

Secure API keys with environment variables

API keys should be stored in environment variables (e.g., `CONTEXT_7_API_KEY`) and referenced in settings.json using the syntax `${VARIABLE_NAME}` to prevent accidental exposure in repositories.
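
For example, assuming the key has already been exported in the shell (e.g. `export CONTEXT_7_API_KEY=...` in a profile file), the server entry can reference it without the value ever landing in the repository; passing the key through an `env` block is a sketch here, since the exact mechanism depends on the server:

```json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp"],
      "env": {
        "CONTEXT_7_API_KEY": "${CONTEXT_7_API_KEY}"
      }
    }
  }
}
```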

Verify installation with MCP command

Run the `/mcp` command in Gemini CLI to list all registered servers and confirm their tools are available before use.
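
In practice the check is just two steps from the project root (`my-project` is a placeholder, and the listing output varies per setup, so it isn't reproduced here):

```bash
cd my-project   # the project containing .gemini/settings.json
gemini          # start an interactive Gemini CLI session
# then, at the Gemini CLI prompt:
#   /mcp        lists registered MCP servers and the tools each one exposes
```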

🧩 Gemini CLI Extensions

Extensions bundle capabilities

Extensions, a feature unique to Gemini CLI, bundle MCP server configurations, markdown context files, and custom slash commands into a single installable package.
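
As a hedged sketch of the shape this takes, an extension is a folder whose `gemini-extension.json` manifest declares the bundled pieces; every name and value below is made up for illustration:

```json
{
  "name": "my-extension",
  "version": "1.0.0",
  "mcpServers": {
    "my-backend": {
      "command": "node",
      "args": ["server.js"]
    }
  },
  "contextFileName": "GEMINI.md"
}
```

Custom slash commands typically sit next to the manifest as TOML files in a `commands/` folder, the same format used for project-level custom commands in tutorial #7.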

Firebase extension example

The Firebase extension includes an MCP server for backend communication plus custom commands like `/firebase:init` and `/firebase:deploy` for project setup and deployment.

Global installation via CLI

Install extensions with `gemini extensions install [URL]`, which places them in the home directory's `.gemini/extensions` folder and automatically enables their MCP servers globally.
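
A rough end-to-end sketch; the repository URL is a placeholder, not the real address of any published extension:

```bash
# Install an extension globally; it is unpacked into ~/.gemini/extensions
gemini extensions install https://github.com/example/some-extension

# In any later Gemini CLI session its bundled slash commands are available,
# e.g. the Firebase extension's:
#   /firebase:init
#   /firebase:deploy
```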

Bottom Line

To ensure Gemini CLI generates accurate, current code when working with external services or frameworks, configure MCP servers like Context7 for documentation lookups, or install extensions like Firebase to add specialized tools and slash commands that extend the AI's capabilities beyond its training data cutoff.

More from The Net Ninja

Gemini CLI Tutorial #7 - Custom Commands · The Net Ninja · 11:38

This tutorial demonstrates how to extend Gemini CLI by creating custom slash commands stored as TOML files in `.gemini/commands/`, enabling complex automated workflows like UI component generation with integrated testing, git branching, and preview rendering through structured multi-step prompts.

5 months ago · 8 points
new p5.js 2 functions: textWeight, textContours, textModel · The Net Ninja · 2:37:21

Dan Shiffman outlines The Coding Train's return to regular content, detailing a sustainable workflow of converting live streams into edited tutorials, dual-path teaching strategies for p5.js 2.0 features like async/await and variable fonts, and a 2026 roadmap involving a studio move to enable physical computing.

7 months ago · 8 points

More in Programming

TanStack Start Course · Traversy Media · 30:57

TanStack Start is a full-stack React framework powered by TanStack Router that provides SSR and server functions as a lightweight alternative to Next.js. Its isomorphic execution model runs code on both server and client, requiring specific patterns to handle server-only operations safely.

2 days ago · 10 points
Open Models Coding Essentials – Running LLMs Locally and in the Cloud · freeCodeCamp.org · 2:17:28

Andrew Brown tests open-source coding models including Gemma 4, Kimi 2.5, and Qwen across local and cloud deployments to evaluate viable alternatives to proprietary solutions, finding that while some models perform surprisingly well, hardware constraints make cloud hosting the practical choice for most developers.

2 days ago · 10 points