
Deployment options

CKEditor AI backend is available in two deployment modes: Cloud (SaaS) and On-premises. Both options provide the same core AI features – Chat, Quick Actions, Review, and Translate – with the on-premises version offering additional capabilities such as custom AI models and MCP support.

Cloud (SaaS)

Unlock this feature with selected CKEditor Plans.

The Cloud (SaaS) deployment offers the fastest way to get started with CKEditor AI. The AI service is hosted and managed by CKEditor, so there is no server-side setup required on your end. You only need to provide a valid license key and configure the editor-side plugins as described in the integration guide.

For more information about the Cloud AI service, refer to the CKEditor AI Cloud Services documentation.

On-premises

Unlock this feature with the Custom Plan.

The on-premises deployment allows you to run the CKEditor AI service on your own infrastructure, including private cloud environments. The service is distributed as Docker images compatible with standard container runtimes.
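The exact image names, ports, and environment variables are defined in the CKEditor AI On-Premises documentation. As a rough sketch only, a container definition for the service might look like the following; every value below (image name, port, variable name) is a placeholder, not the actual configuration:

```yaml
# Hypothetical docker-compose sketch. The real image name, tag, port,
# and environment variables are listed in the CKEditor AI On-Premises
# documentation -- the values below are placeholders for illustration.
services:
  ckeditor-ai:
    image: registry.example.com/ckeditor-ai-service:latest  # placeholder image
    ports:
      - "8080:8080"        # expose the AI service to the editor
    environment:
      LICENSE_KEY: "<YOUR_LICENSE_KEY>"   # placeholder variable name
```

The host and port you expose here are what you later pass to the editor as `config.ai.serviceUrl`.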

On-premises deployment gives you full control over the AI service, including the ability to use custom AI models and providers, and to extend CKEditor AI with custom tools via MCP (Model Context Protocol).

For detailed setup instructions, requirements, and configuration, refer to the CKEditor AI On-Premises documentation.

Connecting the editor to an on-premises service

To point the editor to your on-premises AI service, set the config.ai.serviceUrl property to the URL of your on-premises instance:

ClassicEditor
    .create( document.querySelector( '#editor' ), {
        licenseKey: '<YOUR_LICENSE_KEY>',

        ai: {
            serviceUrl: 'https://your-on-prem-host.com/v1',

            // ... Other AI configuration options.
        },

        // ... Other editor configuration.
    } )
    .then( /* ... */ )
    .catch( /* ... */ );

Custom AI models

The on-premises version supports custom AI model providers, allowing you to use your own models hosted on services like Amazon Bedrock, Google Cloud, Microsoft Azure, or self-hosted solutions. Models configured on the server side will automatically appear in the editor’s model selector.

For configuration details, refer to the on-premises configuration guide.

MCP support

The on-premises deployment supports the Model Context Protocol (MCP), which allows you to extend CKEditor AI with custom external tools. Learn more in the MCP support guide.
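MCP tool calls are plain JSON-RPC 2.0 messages. The standalone Node.js sketch below (not CKEditor code) shows the general shape of the `tools/list` and `tools/call` exchanges that an MCP tool server handles; the `word_count` tool is a made-up example, and a real server would be built with an MCP SDK and registered with the on-premises service as described in the MCP support guide:

```javascript
// Minimal sketch of the JSON-RPC 2.0 request/response shapes used by MCP
// tool servers. The "word_count" tool is hypothetical.

const tools = [
    {
        name: 'word_count',
        description: 'Count the words in a piece of text.',
        inputSchema: {
            type: 'object',
            properties: { text: { type: 'string' } },
            required: [ 'text' ]
        }
    }
];

// Dispatch a single MCP JSON-RPC request and build the matching response.
function handleRequest( request ) {
    if ( request.method === 'tools/list' ) {
        return { jsonrpc: '2.0', id: request.id, result: { tools } };
    }

    if ( request.method === 'tools/call' && request.params.name === 'word_count' ) {
        const words = request.params.arguments.text.trim().split( /\s+/ ).length;

        return {
            jsonrpc: '2.0',
            id: request.id,
            // MCP tool results are returned as an array of content parts.
            result: { content: [ { type: 'text', text: String( words ) } ] }
        };
    }

    return {
        jsonrpc: '2.0',
        id: request.id,
        error: { code: -32601, message: 'Method not found' }
    };
}

const response = handleRequest( {
    jsonrpc: '2.0',
    id: 1,
    method: 'tools/call',
    params: { name: 'word_count', arguments: { text: 'CKEditor AI on premises' } }
} );

console.log( response.result.content[ 0 ].text ); // prints "4"
```

In a real deployment the AI service, not your application code, issues these requests to the MCP servers you register.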

Feature comparison

| Feature                        | Cloud (SaaS) | On-premises |
| ------------------------------ | ------------ | ----------- |
| Hosted and managed by CKEditor | ✅ Yes       | ❌ No       |
| Custom infrastructure          | ❌ No        | ✅ Yes      |
| Custom AI models and providers | ❌ No        | ✅ Yes      |
| MCP support                    | ❌ No        | ✅ Yes      |