Introducing the Vectra AI MCP Server for On-Premises (QUX)

October 27, 2025
Fabien Guillot
Director, Technical Marketing at Vectra

Following up on our MCP journey

Back in August, we launched something that got a lot of attention across the cybersecurity world: the Vectra AI MCP Server for our SaaS Platform — an early look at how Model Context Protocol (MCP) could change how analysts and AI assistants work together. You can read that announcement here if you missed it.

Since that launch, one question kept coming up from our community: “What about those of us running Vectra on-prem?”

We heard you.

Today, we’re excited to bring the same innovation to our on-premises customers with the Vectra AI MCP Server for QUX — created in direct response to the many requests from organizations using our appliance-based platform. Now, you can connect AI assistants like Claude, ChatGPT, Cursor, or VS Code Copilot directly to your Vectra environment — all without leaving your secure infrastructure.

This release is all about inclusion — extending the power of MCP to every Vectra customer, wherever your platform lives.

Think of it as giving your on-prem SOC the same AI-native capabilities that SaaS users have enjoyed since August — an assistant that speaks your language, works with your data, and helps your team act faster than ever.

What’s MCP, and why does it matter?

If you’re hearing about Model Context Protocol (MCP) for the first time, here’s the short version: it’s an emerging open standard that allows AI agents to securely connect to your tools and data. Instead of manually digging through dashboards, analysts can ask questions and take action using natural language, with the AI acting as a bridge between your context and your intent.

In other words, MCP turns large language models (LLMs) into real SOC assistants — ones that can reason, recall, and respond based on your own environment.
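
For the technically curious, here is roughly what an MCP tool looks like under the hood: a minimal sketch using the official MCP Python SDK's FastMCP helper. The server name, the list_recent_detections tool, and its canned data are hypothetical, meant only to show how a natural-language question maps onto a callable tool, not how the Vectra server is actually implemented.

```python
# Minimal MCP server sketch (hypothetical tool, not the actual Vectra implementation).
# Requires the official MCP Python SDK: pip install mcp
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("vectra-demo")

@mcp.tool()
def list_recent_detections(hours: int = 24) -> list[dict]:
    """Return detections seen in the last N hours (canned data for illustration)."""
    # A real server would query the Vectra appliance here; we return static records.
    return [
        {"id": 101, "category": "lateral_movement", "host": "finance-ws-12"},
        {"id": 102, "category": "privilege_anomaly", "host": "dc-01"},
    ]

if __name__ == "__main__":
    # stdio is the simplest transport for local clients such as Claude Desktop.
    mcp.run(transport="stdio")
```

A client such as Claude Desktop launches this process, discovers the tool over stdio, and calls it when you ask something like "what detections fired in the last 24 hours?"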

Why is this a big deal? Because it’s how we move from “AI summarizing alerts” to AI-driven operations. The SOC of the future won’t just analyze data — it will act on it, intelligently and at scale.

At Vectra AI, this aligns perfectly with our mission: enabling agentic AI for modern security operations. MCP is the connective tissue that makes it all possible.

Why does the Vectra MCP Server matter for on-prem SOCs?

For many organizations, Vectra’s on-premises platform (QUX) is the backbone of their detection and response strategy. But until now, connecting AI assistants to these environments required a lot of custom glue — and a bit of luck.

With the new Vectra AI MCP Server for QUX, that changes. Now, your analysts can:

  • Investigate threats in natural language: “Show me all hosts related to this detection in the last 24 hours.”
  • Correlate across accounts, hosts, and detections without writing a single query.
  • Generate reports or summaries directly from conversational input.
  • Automate assignments and notes during investigations.

All while staying inside the tools they already love — from Claude Desktop to their VS Code terminal. No extra dashboards. No friction.
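
To make that concrete, here is a hedged sketch of how one capability from the list above, adding an analyst note to a detection, might be exposed as an MCP tool that wraps the on-prem appliance's REST API. The endpoint path, environment variable names, and token header below are placeholders rather than the project's actual code; the GitHub repository has the real implementation.

```python
# Hypothetical sketch: wrapping an on-prem Vectra REST call as an MCP tool.
# The endpoint path and env var names are assumptions; consult the repository
# and your appliance's API documentation for the real values.
import os
import requests
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("vectra-notes")

BASE_URL = os.environ["VECTRA_BASE_URL"]    # e.g. https://your-appliance (placeholder)
API_TOKEN = os.environ["VECTRA_API_TOKEN"]  # API token generated on the appliance (placeholder)

@mcp.tool()
def add_detection_note(detection_id: int, note: str) -> dict:
    """Attach an analyst note to a detection during an investigation."""
    resp = requests.post(
        f"{BASE_URL}/api/v2.5/detections/{detection_id}/notes",  # placeholder path
        headers={"Authorization": f"Token {API_TOKEN}"},          # placeholder auth scheme
        json={"note": note},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    mcp.run(transport="stdio")
```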

For every organization, this isn’t just convenience — it’s transformation. It’s how AI becomes a true force multiplier for your SOC. Instead of scaling by headcount, you scale by intelligence.

Why should you give the Vectra MCP Server a try?

The Vectra MCP Server for QUX is open source and free to use. It’s designed for security teams who want to experiment, innovate, and see firsthand what AI-native operations can look like.

Imagine this:

“Summarize the most critical detections involving privileged accounts this week.”

“Show lateral movement indicators in the finance subnet.”

“Generate a quick report for my CISO on unresolved identity anomalies.”

All of that, from your AI assistant — and all powered by your own Vectra data.

This isn’t a demo or simulation. It’s your SOC, extended by AI.

How to get started (it’s easier than you think)

To get started, head over to the GitHub repository.

You can run the MCP server locally with a few quick steps, and connect it directly to Claude Desktop or other MCP-compatible clients. For most users, the stdio transport mode is all you need to get started.
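
As a sketch of what those quick steps might look like, the entry below registers a stdio-based server in Claude Desktop's claude_desktop_config.json. The mcpServers structure is Claude Desktop's standard format, but the command, path, and environment variable names shown here are placeholders; follow the repository's README for the exact values.

```json
{
  "mcpServers": {
    "vectra-qux": {
      "command": "python",
      "args": ["/path/to/vectra-mcp-server/server.py"],
      "env": {
        "VECTRA_BASE_URL": "https://your-qux-appliance",
        "VECTRA_API_TOKEN": "<your-api-token>"
      }
    }
  }
}
```

Restart Claude Desktop after editing the file, and the Vectra tools should show up in the assistant's tool list.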

A few commands later, you’ll be chatting with your Vectra data — in plain English.

Let's see a quick preview in action

Curious what this looks like in practice? We’ve got you covered.

Check out two short demos where Claude connects to the Vectra MCP Server and interacts directly with real detection and asset data — all through simple natural language prompts. You’ll see how effortless it becomes to explore detections, retrieve host context, and generate investigation insights, all within seconds.

Make it better, together

This project is open source because we believe innovation in cybersecurity should be shared. The MCP ecosystem is moving fast, and we want the community — you — to help shape how AI-native SOCs evolve.

So try it. Fork it. Break it. Build on it.

If you have ideas for new features or use cases, open a pull request or share your thoughts on GitHub. This is a playground for the next generation of security operations.

AI for every SOC, everywhere

The MCP Server for QUX is more than just a connector — it’s part of a vision. A vision where every SOC, whether in the cloud or on-prem, can work at AI speed. Where your analysts don’t just keep up with attackers — they outpace them.

By connecting the Vectra AI Platform to the world of agentic AI, we’re helping teams modernize operations without replacing the human insight that matters most.

It’s time to see what your SOC can do when it speaks your language.

➡️ Explore it now: Vectra AI MCP Server for QUX on GitHub

Ready to see how Vectra AI can modernize your SOC?

Request a demo or check out our Modern SOC Vision to see how Attack Signal Intelligence and agentic AI are redefining security operations.
