Network and Firewall Requirements
Run MariaDB AI RAG on an internal, secured network; exposing application or database ports directly to the public internet is not recommended.
Before deploying MariaDB AI RAG, ensure that your firewall and network rules allow traffic on all required ports. Proper connectivity between the Docker containers, local services, and external AI providers is essential for the system to function correctly.
The following table details the necessary ports and their purposes, following the standard MariaDB deployment format.
| Service | Port | Protocol | Direction | Purpose |
| --- | --- | --- | --- | --- |
| RAG API | 8000 | TCP | Inbound | API Access: Main REST API endpoint and Swagger UI documentation. |
| MCP Server | 8002 | TCP | Inbound | AI Gateway: Model Context Protocol endpoint for AI agent and IDE interactions. |
| MariaDB Server | 3306 | TCP | Outbound | Database Access: Native port for relational and vector data storage. |
| Ollama | 11434 | TCP | Outbound | Local LLM: API endpoint for local language models (active only with the Ollama profile). |
| External API Providers | 443 | HTTPS | Outbound | AI Services: Requests to configured providers (OpenAI, Gemini, Voyage, Cohere). |
| MariaDB Licensing | 443 | HTTPS | Outbound | License Validation: Required at startup to fetch public keys for license verification. |
All inbound ports listed are TCP. Ensure your firewall rules explicitly allow TCP traffic for the specified ports within the Docker network (rag-network) or from authorized external hosts.
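As a rough illustration, the rules above could be expressed with `ufw` on an Ubuntu host. This is a minimal sketch, not a hardened policy: the subnet `10.0.0.0/24` is a placeholder for your trusted client network, and your environment (Docker iptables integration, cloud security groups) may require different tooling.

```shell
# Illustrative ufw rules for a MariaDB AI RAG host.
# 10.0.0.0/24 is a placeholder for your trusted client subnet.

# Inbound: RAG API and MCP Server, from trusted clients only
ufw allow from 10.0.0.0/24 to any port 8000 proto tcp
ufw allow from 10.0.0.0/24 to any port 8002 proto tcp

# Outbound: MariaDB, local Ollama, and HTTPS to AI providers / licensing
ufw allow out to any port 3306 proto tcp
ufw allow out to any port 11434 proto tcp
ufw allow out to any port 443 proto tcp
```

Note that Docker publishes container ports by manipulating iptables directly, which can bypass ufw rules; verify the effective policy from an external host rather than relying on the ufw status output alone.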
Summary of Required Firewall Rules
For a standard AI RAG 1.1 deployment, ensure the following rules are in place:
- Inbound Access: Allow traffic from user workstations or applications to the RAG API on port 8000 and the MCP Server on port 8002.
- Outbound Access: The host must be able to reach https://*.mariadb.com and your configured AI provider endpoints (e.g., https://generativelanguage.googleapis.com) on port 443.
- Internal Connectivity: If your MariaDB vector database is hosted on a separate machine, ensure the RAG host can communicate with it on port 3306.
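Once the rules are in place, you can spot-check connectivity from the RAG host. The snippet below is a minimal sketch using bash's built-in `/dev/tcp` redirection; the hosts shown (`localhost` for the local services) are assumptions, so substitute your actual hostnames.

```shell
#!/usr/bin/env bash
# Spot-check TCP reachability using bash's /dev/tcp feature.
# Prints "<host>:<port> open" or "<host>:<port> closed" for each pair.

check_port() {
  local host=$1 port=$2
  if timeout 3 bash -c "exec 3<>/dev/tcp/${host}/${port}" 2>/dev/null; then
    echo "${host}:${port} open"
  else
    echo "${host}:${port} closed"
  fi
}

# Placeholders: replace with your actual RAG and database hosts.
check_port localhost 8000    # RAG API
check_port localhost 8002    # MCP Server
check_port localhost 11434   # Ollama (only if the Ollama profile is active)
```

A "closed" result means either the service is not running or a firewall rule is blocking the connection; check `docker ps` and your firewall logs to distinguish the two.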