By David Durika

AI-Assisted MongoDB Queries in Mingo: How It Works

Mingo's new AI assistant turns plain English into MongoDB queries. Bring your own API key, pick your provider, keep your data private. Here's how it works under the hood.


The Problem with Writing MongoDB Queries

You know what you want. "Show me all orders from the last 30 days where the total exceeds $500 and the customer is in Europe." You can picture the result set. But between that thought and a working query, there's a gap filled with $match, $gte, ISODate(), nested $and conditions, and 10 minutes of Stack Overflow.
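For concreteness, that one sentence translates into a filter like the following, shown here in pymongo-style Python syntax. Field names such as createdAt and customer.region are assumptions about the schema, not part of the original example:

```python
from datetime import datetime, timedelta, timezone

# "Last 30 days" relative to now.
cutoff = datetime.now(timezone.utc) - timedelta(days=30)

# The filter the sentence above describes (hypothetical field names):
orders_filter = {
    "createdAt": {"$gte": cutoff},   # orders from the last 30 days
    "total": {"$gt": 500},           # total exceeds $500
    "customer.region": "Europe",     # customer is in Europe
}
# With a live connection this would run as: db.orders.find(orders_filter)
```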

Aggregation pipelines are worse. A 6-stage pipeline with $lookup, $unwind, $group, and $project can take 20 minutes to write correctly. Describing what you want takes 20 seconds.
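To make the comparison concrete, here is what such a 6-stage pipeline might look like, as a pymongo-style Python list. Collection and field names are illustrative, not taken from a real schema:

```python
# Hypothetical pipeline: total spend per customer for completed orders,
# joined with the customer profile from a separate collection.
pipeline = [
    {"$match": {"status": "completed"}},
    {"$lookup": {
        "from": "customers",
        "localField": "customerId",
        "foreignField": "_id",
        "as": "customer",
    }},
    {"$unwind": "$customer"},
    {"$group": {
        "_id": "$customer._id",
        "name": {"$first": "$customer.name"},
        "totalSpent": {"$sum": "$total"},
    }},
    {"$sort": {"totalSpent": -1}},
    {"$project": {"_id": 0, "name": 1, "totalSpent": 1}},
]
```

Every stage depends on the shape the previous stage produced, which is exactly why getting one wrong by hand costs so much time.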

That's what Mingo's AI assistant is for.

What it does

The AI assistant is a side panel that appears in Documents, Aggregation, and Shell views. Type what you need in plain English, and it returns a working MongoDB query you can apply with one click.

  • Documents view: generates find() queries with filters, projections, and sort
  • Aggregation view: builds full pipelines, or adds and modifies individual stages
  • Shell view: writes shell commands and scripts

It's not a text-to-JSON translator. It understands your collection's schema, explores field values when needed, and refines its output based on what it actually finds in your data. More on that below.

Bring your own key

Every other MongoDB GUI that offers AI features makes a choice for you. Some route your queries through their own servers. Some require a cloud account login. Some lock you into a single model.

Mingo takes a different approach: bring your own API key.

You pick from four providers:

  • OpenAI (GPT-4o and newer)
  • Anthropic (Claude)
  • Google (Gemini)
  • xAI (Grok)

Your API key is stored in your operating system's secure keychain (macOS Keychain, Windows Credential Manager, Linux Secret Service). It never touches Mingo's servers. The AI request goes directly from your machine to your chosen provider.

No cloud account required. Switch providers whenever you want.

How it understands your data

When you open the AI panel, Mingo samples two documents from the current collection and builds a simplified schema. Strings are truncated, arrays show only their first element, nested objects are limited to two levels. Sensitive-looking fields are redacted. The AI sees the shape of your data, not the data itself.
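The simplification rules described above can be sketched roughly like this. This is an illustrative reimplementation, not Mingo's actual code, and the truncation limits and sensitive-field heuristics are assumptions:

```python
# Illustrative sketch of schema simplification: truncate strings, keep
# only the first array element, cap nesting at two levels, and redact
# fields whose names look sensitive.
SENSITIVE = ("password", "secret", "token", "ssn", "apikey")

def simplify(value, depth=0):
    if isinstance(value, dict):
        if depth >= 2:                      # nested objects: two levels max
            return "{...}"
        out = {}
        for key, val in value.items():
            if any(s in key.lower().replace("_", "") for s in SENSITIVE):
                out[key] = "<redacted>"     # sensitive-looking field
            else:
                out[key] = simplify(val, depth + 1)
        return out
    if isinstance(value, list):             # arrays: first element only
        return [simplify(value[0], depth)] if value else []
    if isinstance(value, str):              # strings: truncated
        return value[:12] + "..." if len(value) > 12 else value
    return value
```

Applied to a sampled document, this yields a shape the model can reason about without ever seeing full values.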

But schema alone isn't always enough. If you ask "find all users with the 'enterprise' plan," the AI needs to know that the field is called subscription.tier and the value is "enterprise", not "Enterprise" or "ENTERPRISE".

So the AI can ask for more context. It can request:

  • Field samples: "Show me example values from subscription.tier"
  • Distinct values: "What are all unique values in status?"

It gets the results, refines its understanding, and generates a query that actually matches your data. This back-and-forth happens automatically, usually in a single turn. You just see the final result.

Other tools generate queries from schema metadata alone. The tool loop is why Mingo's queries tend to be correct on the first try.

Auto-apply mode

When you're in the flow and don't want to click "Apply" every time, flip the auto-apply toggle: the AI executes generated queries directly. Turn it off whenever you want to review a query before it runs.

Per-tab conversations

Each tab has its own conversation history. Query users in one tab, aggregate orders in another. Close the tab and the conversation is gone. Nothing is persisted or sent anywhere.
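In-memory, per-tab state like this can be sketched in a few lines. This is purely illustrative of the behavior described, not Mingo's implementation:

```python
# Illustrative per-tab conversation state: keyed by tab, held only in
# memory, and discarded the moment the tab closes.
conversations: dict[str, list[dict]] = {}

def add_message(tab_id: str, role: str, text: str) -> None:
    conversations.setdefault(tab_id, []).append({"role": role, "text": text})

def close_tab(tab_id: str) -> None:
    conversations.pop(tab_id, None)   # history is gone, nothing persisted
```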

Privacy

Mingo doesn't track your queries, conversations, or usage patterns. AI requests go directly from your machine to your provider. Mingo's servers are not in the middle. The AI sees simplified schema samples, not your actual documents. Your API keys live in your OS keychain, encrypted at rest.

If you work with sensitive data (healthcare, finance, government), this matters. You control what leaves your machine and where it goes.

Getting started

  1. Open Mingo and navigate to any collection
  2. Click the sparkles icon (or press Cmd+Shift+A / Ctrl+Shift+A)
  3. Enter your API key for any supported provider
  4. Start asking questions in plain English

The AI assistant is available to all licensed users, including those on a trial.

What's next

We're working on deeper schema awareness, conversation persistence, and support for local models via OpenAI-compatible APIs like Ollama.