On-Premise Device · Local AI · Cloud Portal · Optional External LLMs

Federated: MODS Device

Hardware at Your Location

A MODS device is deployed at your location with local AI processing options. The cloud portal handles authentication and remote access. External LLMs can be enabled with your own API keys.

Federated Features

On-Premise Device

MODS hardware runs at your location. Connect it to your network and you're ready to go.

Local AI Processing

AI models run directly on your device. When external providers are disabled, primary query processing remains local.

Cloud Portal Access

Authenticate through our portal and access your device from anywhere. No VPN configuration needed.

Where Does Your Data Live?

MODS supports deployment models where primary data can remain at your location, depending on tier and configuration.

Traditional Cloud AI

  Your Office (your files, documents, company data)
    → data sent to the cloud →
  Third-Party Cloud (data stored on their servers; data governance is provider-dependent)

MODS On-Premise Device

  Your Office + MODS Device (AI runs here, data stored here)
    → management traffic only →
  MODS Portal (remote access & management)

Designed for data-local operation.

How Federated Works

1. We Ship Your Device

Pre-configured MODS device shipped to your location. Ready to connect.

2. Connect to Your Network

Plug the device into your local network. It registers with our cloud portal.

3. Access From Anywhere

Your team logs into the portal and connects to the device. Works remotely or on your LAN.

4. Local Processing by Default

AI queries are handled on your device by default, with behavior depending on provider settings and configuration.
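The local-by-default behavior in step 4 can be sketched as a simple routing decision: process on the device unless an external provider has been explicitly enabled. This is only an illustration; `Settings`, `route_query`, and the other names below are hypothetical and not a real MODS API.

```python
# Hypothetical sketch of local-by-default query routing.
# All names here are illustrative; they are not a real MODS API.

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Settings:
    # External providers stay disabled unless an API key is configured.
    external_api_keys: dict = field(default_factory=dict)  # e.g. {"anthropic": "sk-..."}
    preferred_provider: Optional[str] = None

def run_local_model(query: str) -> str:
    # Stand-in for on-device inference; the real model would run here.
    return f"[local] answer to: {query}"

def run_external(provider: str, api_key: str, query: str) -> str:
    # Stand-in for a call to an external LLM using the user's own key.
    return f"[{provider}] answer to: {query}"

def route_query(query: str, settings: Settings) -> str:
    """Process locally by default; use an external provider only if enabled."""
    provider = settings.preferred_provider
    if provider and provider in settings.external_api_keys:
        return run_external(provider, settings.external_api_keys[provider], query)
    return run_local_model(query)

# With no keys configured, everything stays on the device:
print(route_query("summarize Q3 report", Settings()))
# prints "[local] answer to: summarize Q3 report"
```

The design point is that the external path is opt-in twice over: a key must be stored and a provider selected, so a fresh device never routes queries off-site.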

Where Does Data Live?

On Your Device

  • Uploaded documents (primary storage)
  • Knowledge base embeddings
  • Chat history and operational logs
  • Local AI model

In The Cloud Portal

  • User authentication
  • Device connection status
  • Access permissions
  • Remote access and connection routing

Optional: External LLMs

Bring Your Own API Keys

Want access to state-of-the-art LLMs like Claude, GPT-4, or Gemini? You can optionally enable external AI providers by adding your own API keys in the device settings.

How it works: You create accounts directly with OpenAI, Anthropic, Google, etc. and enter your API keys. You pay those providers directly based on your usage.

Note: When you use external AI providers, your queries are sent to those services per their terms and privacy policies. Your bulk document storage remains on your device, but individual queries and responses are processed by the external provider.
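The split described in the note (documents stay on the device, only queries and responses cross the network) can be sketched as follows. The retrieval and request-building functions are hypothetical illustrations, assuming a retrieval-style design where only the query plus a few relevant snippets are sent to the external provider.

```python
# Hypothetical sketch: when an external provider is enabled, only the
# query and a few retrieved snippets are sent over the network. The
# document store itself stays on the device. Names are illustrative.

def retrieve_snippets(query: str, local_store: dict, k: int = 2) -> list:
    """Naive keyword retrieval over on-device documents (stand-in for embeddings)."""
    scored = sorted(
        local_store.items(),
        key=lambda kv: sum(w in kv[1].lower() for w in query.lower().split()),
        reverse=True,
    )
    return [text for _, text in scored[:k]]

def build_external_request(query: str, local_store: dict) -> dict:
    # Only this payload would cross the network; local_store never leaves.
    return {"prompt": query, "context": retrieve_snippets(query, local_store)}

docs = {"a.txt": "Quarterly revenue grew 12%.", "b.txt": "Office picnic schedule."}
req = build_external_request("what was revenue growth", docs)
```

Under this assumption, the external provider sees at most the prompt and the selected snippets, never the full knowledge base or its file names.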

Keep It Fully Local

Prefer to avoid external model providers? Keep external LLMs disabled and primary AI processing stays on your device. Authentication and connection routing still go through the cloud portal.

Why Teams Choose Federated

Work From Anywhere

Your team can access the AI assistant through our portal from home, the office, or on the road.

Data Stays With You

Primary document and knowledge-base storage is on the device at your location.

Local AI by Default

Queries are processed on your device by default. Traffic reaches external providers only if you enable them in settings.

Optional AI Upgrade

Add your own API keys to access state-of-the-art external LLMs when you need more capability.

Ready for a MODS Device?

Deploy hardware at your location with local AI processing options and cloud portal access.