/gemini-consult

Engages in deep, iterative conversations with the Gemini MCP server for complex problem-solving and architectural guidance.

This command is your direct line to an external expert AI. It's designed for situations where you need a thinking partner to work through a complex problem, not just a quick answer.

Usage

  • With arguments: /gemini-consult [specific problem or question]
    • Starts a new conversation session focused on your question.
  • Without arguments: /gemini-consult
    • If a session is active, this continues the conversation for follow-up questions.
    • If no session is active, the AI intelligently infers a topic from your current work.
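
For example, a new session might be started like this (the question itself is purely illustrative):

/gemini-consult How should we restructure the authentication flow so the API and the background workers share one token-refresh path?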

Core Philosophy

The command enables persistent Gemini sessions for evolving problems through:

  • Continuous dialogue: Go back and forth multiple times until the problem is solved.
  • Context awareness: Automatically infers the problem when you don't provide one.
  • Session persistence: The session stays alive for the entire problem-solving lifecycle.

CRITICAL: Always treat Gemini's input as suggestions, not as ground truth. You, as the primary AI, must think critically about the advice and maintain independent judgment.

Command Execution

Step 1 & 1.5: Understand the Problem & Gather Docs

The AI first determines the problem, either from your arguments or by analyzing the current context. It may also decide to consult the Context7 MCP to fetch up-to-date documentation for any relevant external libraries before initiating the Gemini session. This ensures the consultation is based on the latest information.
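
When an external library is involved, the documentation lookup might resemble the sketch below. The tool and parameter names shown (resolve_library_id, get_library_docs, libraryName, topic) are placeholders for whatever your Context7 MCP server actually exposes; treat them as assumptions, not as a documented API.

# Hypothetical Context7 lookup performed before opening the Gemini session.
# Tool names and parameters are placeholders and may differ in your setup.
library_id = mcp__context7__resolve_library_id(libraryName="fastapi")
docs = mcp__context7__get_library_docs(
    context7CompatibleLibraryID=library_id,
    topic="dependency injection",
)
# The fetched documentation is then folded into the Gemini consultation context.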

Step 2: Initialize Gemini Session

A new session is started with the Gemini MCP. The gemini-context-injector.sh hook automatically attaches foundational project files to the request:

  • MCP-ASSISTANT-RULES.md (Your project's coding standards for the assistant)
  • docs/ai-context/project-structure.md (Complete tech stack and file structure)
  • docs/ai-context/docs-overview.md (Documentation architecture)

This provides Gemini with immediate, deep context about your project.
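
Concretely, this means the first request of every session already carries the files listed above. Only the file list below comes from this document; how gemini-context-injector.sh splices it into the MCP request is an implementation detail not shown here.

# Foundational files attached automatically by gemini-context-injector.sh
# on the first request of a session (you do not pass these yourself).
FOUNDATIONAL_CONTEXT = [
    "MCP-ASSISTANT-RULES.md",                # project coding standards for the assistant
    "docs/ai-context/project-structure.md",  # complete tech stack and file structure
    "docs/ai-context/docs-overview.md",      # documentation architecture
]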

Step 3: Engage in Deep Dialogue

This is an iterative process. You can have a back-and-forth conversation, refining the problem, sharing code changes, and getting feedback.

# Initial Question
session = mcp__gemini__consult_gemini(
    specific_question="How should I refactor this service?",
    ...  # additional parameters omitted in this example
)

# Follow-up
follow_up = mcp__gemini__consult_gemini(
    specific_question="That's a good idea, but what about the performance impact?",
    session_id=session["session_id"]
)

Step 4: Session Management

Sessions are kept open by default to allow for a continuous problem-solving lifecycle. They are only closed when the problem is definitively solved and tested, or when a fresh start is more beneficial.
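
In practice the distinction follows from the session_id parameter shown in Step 3: pass it to continue the same session, omit it to start fresh. Whether your Gemini MCP server also exposes an explicit "close session" operation is not covered by this document, so none is shown here. The questions below are illustrative.

# Continue the existing session (problem not yet fully solved).
mcp__gemini__consult_gemini(
    specific_question="The refactor is deployed, but the retry path still double-bills. Ideas?",
    session_id=session["session_id"],
)

# Start fresh when the accumulated context would mislead more than help.
mcp__gemini__consult_gemini(
    specific_question="New topic: how should we shard the events table?",
)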

Best Practices

  1. Be Specific: Vague questions get vague answers. Provide concrete code and clear goals.
  2. Challenge Assumptions: Don't accept unclear guidance. Ask for clarification.
  3. Share Results: After implementing a suggestion, report the outcome back to Gemini so the dialogue works as a feedback loop (see the sketch after this list).
  4. Trust but Verify: Test every suggestion thoroughly; implementation is what reveals whether the advice actually holds.
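
A "share results" follow-up might look like the sketch below. Only specific_question and session_id appear elsewhere in this document; the question text is illustrative.

# Reporting the outcome of an implemented suggestion back to Gemini.
follow_up = mcp__gemini__consult_gemini(
    specific_question=(
        "I applied the repository-pattern refactor you suggested. "
        "p95 latency is unchanged, but the test setup became more complex. "
        "Is there a simpler seam for mocking the data layer?"
    ),
    session_id=session["session_id"],
)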