Google Gemini CLI
Gemini CLI brings Google's Gemini models into your GROOVE orchestration. It's a strong option for high-throughput tasks where Gemini's speed and cost profile make sense.
Installation
```shell
npm install -g @google/gemini-cli
```

Verify it's available:
```shell
gemini --version
```

Authentication
Gemini CLI requires a Google AI API key. Set it through the GUI's provider panel or via the CLI:
```shell
groove set-key gemini AIza-your-api-key
```

GROOVE encrypts the key with AES-256-GCM and stores it locally. The key is passed to the gemini process at spawn time.
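The spawn-time hand-off works like ordinary environment-variable injection: the decrypted key is placed only in the child process's environment, where the Gemini CLI reads it as GEMINI_API_KEY. A minimal sketch, using `sh -c` as a stand-in for the real gemini child process (the key value is a placeholder):

```shell
# Sketch: per-process key injection, roughly what GROOVE does at spawn time.
# "sh -c" stands in for the spawned gemini process.
GEMINI_API_KEY="AIza-your-api-key" sh -c 'echo "child sees: $GEMINI_API_KEY"'
# prints: child sees: AIza-your-api-key
```

Because the assignment is prefixed to the command, the key is scoped to that child process and never exported into the parent shell.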
Models
| Model | Tier | Best For |
|---|---|---|
| gemini-2.5-pro | Heavy | Complex reasoning, large-context tasks |
| gemini-2.5-flash | Medium | Fast iteration, lightweight coding, high-volume tasks |
When adaptive routing is set to Auto, GROOVE classifies tasks and selects the appropriate Gemini model.
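Conceptually, the Auto decision reduces to a tier-to-model mapping like the sketch below. The function name and tier labels are illustrative, not GROOVE's internal API:

```shell
# Hypothetical tier -> model mapping for Auto routing (illustrative only).
pick_gemini_model() {
  case "$1" in
    heavy) echo "gemini-2.5-pro" ;;    # complex reasoning, large context
    *)     echo "gemini-2.5-flash" ;;  # fast, high-volume work
  esac
}

pick_gemini_model heavy   # prints: gemini-2.5-pro
pick_gemini_model medium  # prints: gemini-2.5-flash
```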
GROOVE Integration
Gemini agents are fully integrated into the coordination layer. They receive agent introductions, respect file scope locks, participate in the approval queue, and have their token usage tracked in the dashboard.
Limitations
- No hot-swap -- switching models requires killing and respawning the agent process; GROOVE's context rotation preserves the agent's working state across the restart.
- API key billing -- usage is billed to your Google AI account.
When to Use Gemini
Gemini Flash is one of the fastest models available, making it ideal for bulk tasks like generating tests, writing documentation, or processing many small files. Pair it with Claude Code on the same project -- let Claude handle the complex architecture while Gemini handles volume.
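As a concrete picture of that volume work, a bulk pass over many small files might look like the loop below. This is a dry-run sketch that only prints the commands it would issue; the -m/--model and -p/--prompt flags are the Gemini CLI's standard options, and the file set is a made-up example:

```shell
# Dry-run sketch: one Flash invocation per source file.
# Remove "echo" to actually run the commands.
mkdir -p src && touch src/a.ts src/b.ts   # stand-in files for the demo
for f in src/*.ts; do
  echo gemini -m gemini-2.5-flash -p "Write unit tests for $f"
done
```

Inside GROOVE you would normally let the orchestrator fan these out as agent tasks rather than shelling out directly, but the shape of the work is the same: many small, independent prompts where Flash's speed dominates.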
