Enable AI Copilot (Beta)#

Available on all plans

Cloud and self-hosted deployments

Significantly increase team productivity and decision-making speed with Mattermost’s AI Copilot, which enhances your real-time collaboration capabilities with instant access to AI-generated information, discussion summaries, and contextually aware action recommendations. Your users can interact with AI capabilities directly within their daily communication channels, without needing to switch between multiple tools or platforms.

Setup#

Mattermost AI Copilot ships pre-installed with Mattermost Server v9.7 and later. You must be a Mattermost system admin to enable and configure it in the System Console.

Note

If you’re running Mattermost Server v9.6 or earlier, AI Copilot must be installed using the latest binary available for download from the plugin repository. For an optimized user experience and compatibility, we recommend using AI Copilot with Mattermost v9.7 and later.

The AI Copilot integration is compatible with the following Mattermost Server versions:

  • v9.6 or later

  • v9.5.2+ (Extended Support Release - ESR)

  • v9.4.4+

  • v9.3.3+

  • v8.1.11+ (Extended Support Release - ESR)
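
If you are installing the binary manually, the downloaded plugin bundle can be uploaded through System Console > Plugins > Plugin Management, or scripted against the Mattermost REST API. The following is a minimal sketch of the upload route; the server URL, admin token, and bundle file name are placeholders:

```python
# Sketch: upload a downloaded Copilot plugin bundle via the Mattermost REST API.
# The server URL, token, and bundle file name below are placeholders.
import requests

MATTERMOST_URL = "https://mattermost.example.com"  # placeholder server URL
ADMIN_TOKEN = "admin-personal-access-token"        # placeholder admin token

with open("mattermost-plugin-ai.tar.gz", "rb") as bundle:  # placeholder file name
    resp = requests.post(
        f"{MATTERMOST_URL}/api/v4/plugins",
        headers={"Authorization": f"Bearer {ADMIN_TOKEN}"},
        files={"plugin": bundle},
    )
resp.raise_for_status()
print("Uploaded plugin with ID:", resp.json().get("id"))
```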

Enable#

Go to System Console > Plugins > AI Copilot to enable this feature.

Once the integration is installed and enabled, complete the configuration in the System Console as described below, then notify your teams that they can use AI Copilot in any Mattermost team or channel.
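
If you prefer to script this step, the same toggle is available through the Mattermost REST API. The sketch below assumes an admin personal access token and that the Copilot plugin ID is mattermost-ai; verify the actual ID in System Console > Plugins > Plugin Management.

```python
# Sketch: enable the AI Copilot plugin via the Mattermost REST API.
# The server URL, token, and plugin ID are assumptions; check the plugin ID
# in System Console > Plugins > Plugin Management.
import requests

MATTERMOST_URL = "https://mattermost.example.com"  # placeholder server URL
ADMIN_TOKEN = "admin-personal-access-token"        # placeholder admin token
PLUGIN_ID = "mattermost-ai"                        # assumed Copilot plugin ID

resp = requests.post(
    f"{MATTERMOST_URL}/api/v4/plugins/{PLUGIN_ID}/enable",
    headers={"Authorization": f"Bearer {ADMIN_TOKEN}"},
)
resp.raise_for_status()
print("AI Copilot plugin enabled")
```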

Mattermost configuration#

With extensive customization and extensibility options, you can tailor your AI Copilot to meet your specific needs, whether that’s integrating with internal systems, customizing AI responses based on team or project needs, or developing new capabilities unique to your operational requirements. You can also create custom integrations, workflows, and bots that leverage AI to meet your unique business needs.

Configure a large language model (LLM) for your AI Copilot integration by going to System Console > Plugins > AI Copilot. Mattermost supports OpenAI, Anthropic, Azure OpenAI, and any OpenAI-compatible LLM provider.

To use OpenAI:

  1. Obtain an OpenAI API key.

  2. Select OpenAI in the AI Service dropdown.

  3. Enter your OpenAI API key in the API Key field.

  4. Enter a model name in the Default Model field, such as gpt-4 or gpt-3.5-turbo.

  5. (Optional) If using multiple organizations on OpenAI, specify your Organization ID to direct API usage and billing accordingly.
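
Before saving the configuration, you may want to confirm that the API key works and that the model you plan to enter in the Default Model field is available to your account. A minimal sketch against the standard OpenAI REST API; the key and model name are placeholders:

```python
# Sketch: verify an OpenAI API key and check that a model is available.
# The key and model name are placeholders.
import requests

OPENAI_API_KEY = "sk-..."   # placeholder API key
DEFAULT_MODEL = "gpt-4"     # model you plan to enter in Default Model

resp = requests.get(
    "https://api.openai.com/v1/models",
    headers={"Authorization": f"Bearer {OPENAI_API_KEY}"},
)
resp.raise_for_status()
available = {model["id"] for model in resp.json()["data"]}
print(f"Key is valid. {DEFAULT_MODEL} available: {DEFAULT_MODEL in available}")
```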

To use Anthropic:

  1. Obtain an Anthropic API key.

  2. Select Anthropic in the AI Service dropdown.

  3. Enter your Anthropic API key in the API Key field.

  4. Specify a model name in the Default Model field, like claude-v1.
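
To confirm the Anthropic key and model name before entering them in the System Console, you can send a one-token test request to the Anthropic Messages API. A minimal sketch; the key and model name are placeholders, so substitute the model you plan to configure:

```python
# Sketch: verify an Anthropic API key with a one-token test request.
# The key and model name are placeholders.
import requests

ANTHROPIC_API_KEY = "sk-ant-..."            # placeholder API key
DEFAULT_MODEL = "claude-3-haiku-20240307"   # placeholder model name

resp = requests.post(
    "https://api.anthropic.com/v1/messages",
    headers={
        "x-api-key": ANTHROPIC_API_KEY,
        "anthropic-version": "2023-06-01",
        "content-type": "application/json",
    },
    json={
        "model": DEFAULT_MODEL,
        "max_tokens": 1,
        "messages": [{"role": "user", "content": "ping"}],
    },
)
resp.raise_for_status()
print("Anthropic key and model accepted")
```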

For integrating with Microsoft Azure’s OpenAI services, see the official Azure OpenAI documentation. To use Azure OpenAI:

  1. Get access to Azure OpenAI.

  2. Create a new OpenAI resource in Azure.

  3. Ensure your model resource on Azure does not auto-update.

  4. In Mattermost, choose OpenAI Compatible in the AI Service dropdown.

  5. Enter your Azure resource’s URL in the API URL field.

  6. Input your Azure resource API key in the API Key field.

  7. Specify your model name in the Default Model field, for example, gpt-3.5-turbo.
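
Because Azure OpenAI routes requests to a named deployment within your resource, it can help to confirm that the resource and deployment respond before pointing Copilot at them. A minimal sketch; the resource name, deployment name, key, and API version are all placeholders:

```python
# Sketch: confirm an Azure OpenAI resource and deployment respond.
# Resource name, deployment name, key, and API version are placeholders.
import requests

AZURE_RESOURCE = "my-resource"   # placeholder Azure OpenAI resource name
DEPLOYMENT = "gpt-35-turbo"      # placeholder deployment name
API_KEY = "azure-api-key"        # placeholder key
API_VERSION = "2024-02-01"       # use a version your resource supports

url = (
    f"https://{AZURE_RESOURCE}.openai.azure.com/openai/deployments/"
    f"{DEPLOYMENT}/chat/completions?api-version={API_VERSION}"
)
resp = requests.post(
    url,
    headers={"api-key": API_KEY, "content-type": "application/json"},
    json={"messages": [{"role": "user", "content": "ping"}], "max_tokens": 1},
)
resp.raise_for_status()
print("Azure OpenAI deployment reachable")
```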

The OpenAI Compatible option allows integration with any OpenAI-compatible LLM provider:

  1. Deploy your model, for example, on LocalAI.

  2. Select OpenAI Compatible in the AI Service dropdown.

  3. Enter the URL to your AI service in the API URL field.

  4. Enter your API key in the API Key field.

  5. Specify your model name in the Default Model field.
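
As a quick sanity check, you can confirm that the self-hosted service answers on its OpenAI-compatible routes before saving the API URL. The sketch below assumes a LocalAI instance on localhost; the URL and key are placeholders, and many local deployments accept any key:

```python
# Sketch: list the models exposed by an OpenAI-compatible service such as LocalAI.
# The URL and key are placeholders.
import requests

API_URL = "http://localhost:8080"   # placeholder OpenAI-compatible endpoint
API_KEY = "local-placeholder-key"   # placeholder; often ignored by local services

resp = requests.get(
    f"{API_URL}/v1/models",
    headers={"Authorization": f"Bearer {API_KEY}"},
)
resp.raise_for_status()
print("Models exposed:", [model["id"] for model in resp.json().get("data", [])])
```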

Upgrade#

We recommend updating this integration as new versions are released. Generally, updates are seamless and don’t interrupt the user experience in Mattermost.

Visit the Releases page for the latest release, all available releases, and compatibility considerations.
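
If you upgrade by uploading a new binary manually, you can replace the installed version in place. A minimal sketch using the Mattermost REST API’s force parameter to overwrite the existing installation; the server URL, token, and file name are placeholders:

```python
# Sketch: replace an installed plugin with a newer bundle via the Mattermost REST API.
# The force=true query parameter overwrites the previously installed plugin.
# Server URL, token, and bundle file name are placeholders.
import requests

MATTERMOST_URL = "https://mattermost.example.com"  # placeholder server URL
ADMIN_TOKEN = "admin-personal-access-token"        # placeholder admin token

with open("mattermost-plugin-ai-new.tar.gz", "rb") as bundle:  # placeholder file name
    resp = requests.post(
        f"{MATTERMOST_URL}/api/v4/plugins",
        headers={"Authorization": f"Bearer {ADMIN_TOKEN}"},
        params={"force": "true"},
        files={"plugin": bundle},
    )
resp.raise_for_status()
print("Upgraded plugin:", resp.json().get("id"))
```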

Usage#

See the chat with AI Copilot documentation for details on using AI Copilot to overcome information overload and streamline communication and collaboration.