Enable Copilot#
Available on all plans
Cloud and self-hosted deployments
Significantly increase team productivity and decision-making speed by enhancing your real-time collaboration capabilities with Mattermost Copilot, which provides instant access to AI-generated information, discussion summaries, and contextually aware action recommendations. Your users can interact with AI capabilities directly within their daily communication channels, without switching between multiple tools or platforms.
Tip
Looking for a Mattermost Copilot demo? Try it yourself! Then, watch this AI-Enhanced Collaboration on-demand webinar to learn how Copilot can enhance your mission-critical workflows, and download the Mattermost Copilot datasheet to learn more about integrating with industry-leading large language models (LLMs).
Setup#
You must be a Mattermost system admin to enable and configure Copilot using the System Console.
From Mattermost v10.3#
From Mattermost v10.3, Copilot is installed automatically and ready for you to configure a large language model (LLM). When no LLMs are configured, the Copilot panel prompts users to configure one.
From Mattermost v9.7#
From Mattermost v9.7, you can install Mattermost Copilot from the in-product Mattermost Marketplace: select the Product menu icon, then select App Marketplace. Search for Copilot, select Install, then configure an LLM.
Mattermost v9.6 or earlier#
Important
For an optimized user experience and compatibility, we recommend using Mattermost Copilot with Mattermost v9.7 and later.
If you’re running Mattermost Server v9.6 or earlier, AI Copilot must be installed using the latest binary available for download from the plugin repository.
Copilot is compatible with the following Mattermost Server versions:
v9.6 or later
v9.5.2+ (Extended Support Release - ESR)
v9.4.4+
v9.3.3+
v8.1.11+ (Extended Support Release - ESR)
Enable#
From Mattermost v10.3#
From Mattermost v10.3, Copilot is enabled automatically and is ready for LLM configuration.
From Mattermost v9.6#
From Mattermost v9.6, you must enable Copilot by going to System Console > Plugins > Copilot and setting Enable Plugin to True, then complete the configuration in the System Console.
Mattermost configuration#
With extensive customization and extensibility options, you can tailor Copilot to meet your specific needs, whether it’s integrating with internal systems, customizing AI responses based on the team or project needs, or developing new capabilities that are unique to your operational requirements. You can also create custom integrations, workflows, and bots that leverage AI to meet your unique business needs.
Configure an LLM for your Copilot integration by going to System Console > Plugins > Copilot and selecting Add an AI Bot. Mattermost supports the following LLM services:
Note
The ability to define multiple LLMs for your Copilot integration requires a Mattermost Enterprise license.
OpenAI
1. Obtain an OpenAI API key.
2. Select OpenAI in the Service dropdown.
3. Enter your OpenAI API key in the API Key field.
4. Enter a model name in the Default Model field corresponding with the model’s label in the API, such as gpt-4o or gpt-3.5-turbo.
5. (Optional) If your API key belongs to an OpenAI organization, specify your Organization ID.
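Before saving, you may want to verify that the key and model name work outside of Mattermost. Below is a minimal sketch using the official openai Python package; the key and model values are placeholders:

```python
# Sanity-check an OpenAI API key and model name before entering them in
# System Console > Plugins > Copilot. Values below are placeholders.
from openai import OpenAI  # pip install openai

client = OpenAI(api_key="sk-...")  # the key you will paste into the API Key field

response = client.chat.completions.create(
    model="gpt-4o",  # must match the Default Model field exactly
    messages=[{"role": "user", "content": "Reply with the word ok."}],
    max_tokens=5,
)
print(response.choices[0].message.content)
```

If the request succeeds, the same key and model name should work in Copilot.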
Anthropic
1. Obtain an Anthropic API key.
2. Select Anthropic in the Service dropdown.
3. Enter your Anthropic API key in the API Key field.
4. Specify a model name in the Default Model field corresponding with the model’s label in the API, such as claude-3-5-sonnet-20240620.
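As with OpenAI, you can sanity-check the key and model name first. This is a minimal sketch using the official anthropic Python package, with placeholder values:

```python
# Sanity-check an Anthropic API key and model name before configuring Copilot.
# Values below are placeholders.
import anthropic  # pip install anthropic

client = anthropic.Anthropic(api_key="sk-ant-...")  # key for the API Key field

message = client.messages.create(
    model="claude-3-5-sonnet-20240620",  # must match the Default Model field
    max_tokens=10,
    messages=[{"role": "user", "content": "Reply with the word ok."}],
)
print(message.content[0].text)
```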
Azure OpenAI
For more details about integrating with Microsoft Azure’s OpenAI services, see the official Azure OpenAI documentation.
1. Provision sufficient access to Azure OpenAI for your organization and access your Azure portal.
2. If you do not already have one, deploy an Azure AI Hub resource within Azure AI Studio.
3. Once the deployment is complete, navigate to the resource and select Launch Azure AI Studio.
4. In the side navigation pane, select Deployments under Shared resources.
5. Select Deploy model, then Deploy base model.
6. Select your model, such as gpt-4o, and select Confirm.
7. Select Deploy to start your model.
8. In Mattermost, select OpenAI Compatible in the Service dropdown.
9. In the Endpoint panel for your new model deployment, copy the base URI of the Target URI (everything up to and including .com) and paste it into the API URL field in Mattermost.
10. In the Endpoint panel for your new model deployment, copy the Key and paste it into the API Key field in Mattermost.
11. In the Deployment panel for your new model deployment, copy the Model name and paste it into the Default Model field in Mattermost.
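To confirm the deployment works before wiring it into Mattermost, you can use the openai Python package's AzureOpenAI client. This is a sketch only; the endpoint, key, deployment name, and api_version are placeholders you should replace with the values from your Endpoint and Deployment panels:

```python
# Verify an Azure OpenAI deployment independently of Mattermost.
# All values below are placeholders: the endpoint is the base URI you copied,
# and the model is your deployment's Model name.
from openai import AzureOpenAI  # pip install openai

client = AzureOpenAI(
    azure_endpoint="https://YOUR-RESOURCE.openai.azure.com",
    api_key="YOUR-AZURE-KEY",
    api_version="2024-02-01",  # assumption; use the api-version from your Target URI
)

response = client.chat.completions.create(
    model="gpt-4o",  # the deployment's Model name / Default Model value
    messages=[{"role": "user", "content": "Reply with the word ok."}],
    max_tokens=5,
)
print(response.choices[0].message.content)
```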
OpenAI Compatible
The OpenAI Compatible option allows integration with any OpenAI-compatible LLM provider, such as Ollama:
1. Deploy your model, for example, on Ollama.
2. Select OpenAI Compatible in the AI Service dropdown.
3. Enter the URL to your AI service from your Mattermost deployment in the API URL field. Be sure to include the port, and append /v1 to the end of the URL if using Ollama (e.g., http://localhost:11434/v1 for Ollama, otherwise http://localhost:11434/).
4. If using Ollama, leave the API Key field blank.
5. Specify your model name in the Default Model field.
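To confirm that an OpenAI-compatible endpoint such as Ollama is reachable before configuring Copilot, you can point the openai Python client at it. In this sketch the model name llama3 is an example; note that the client library itself requires a non-empty API key even though Ollama ignores it:

```python
# Confirm an OpenAI-compatible endpoint (here, a local Ollama server) responds
# before pointing Copilot at it. Replace "llama3" with a model you have pulled.
from openai import OpenAI  # pip install openai

client = OpenAI(
    base_url="http://localhost:11434/v1",  # note the /v1 suffix for Ollama
    api_key="unused",  # Ollama ignores the key, but the client requires a value
)

response = client.chat.completions.create(
    model="llama3",
    messages=[{"role": "user", "content": "Reply with the word ok."}],
)
print(response.choices[0].message.content)
```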
Custom instructions#
Text input here is included in the prompt for every request. Use this to give your bots extra context or instructions. For example, you could list all of your organization’s specific acronyms so the bot knows your vernacular and users can ask for definitions. Or you could give it specialized instructions such as adopting a specific personality or following a certain workflow. By customizing the instructions for each individual bot, you can create a more tailored AI experience for your specific needs.
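For example, a hypothetical instruction block for an engineering team might read: “Our team uses these acronyms: RFD means Request for Discussion, SEV means a production incident. When summarizing a thread, use three bullet points or fewer and list open action items last.”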
Enable vision (Beta)#
Enabling vision allows images attached to posts to be sent to the upstream LLM for analysis. This requires an upstream LLM that supports vision, and is only available with OpenAI and OpenAI-compatible services.
Disable tools (Beta)#
Disabling tools will prevent the LLM from making function calls. This is useful when a model technically supports tool usage but you want to prevent it from being used within Mattermost. Try toggling this feature if you encounter unpredictable tool-related behavior with your model.
Copilot plugin metrics#
Metrics for Copilot are exposed through the /plugins/mattermost-ai/metrics subpath under the existing Mattermost server metrics endpoint. This endpoint is controlled by the Listen address for performance configuration setting and defaults to port 8067. The following metrics are available:

copilot_system_plugin_start_timestamp_seconds: The time the plugin started.
copilot_system_plugin_info: The plugin version and installation ID.
copilot_api_time_seconds: How long API calls take to execute.
copilot_http_requests_total: The total number of API requests.
copilot_http_errors_total: The total number of HTTP API errors.
copilot_llm_requests_total: The total number of requests to upstream LLMs.
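If you don't have Prometheus scraping set up yet, a quick way to confirm the endpoint is responding is to fetch it directly. The following is a minimal Python sketch that assumes the default listen address on port 8067 and runs on the Mattermost server host:

```python
# Fetch the Copilot metrics endpoint and print only the copilot_* series.
# Assumes the default "Listen address for performance" port (8067) and a
# local Mattermost server.
import urllib.request

URL = "http://localhost:8067/plugins/mattermost-ai/metrics"

with urllib.request.urlopen(URL) as resp:
    body = resp.read().decode("utf-8")

for line in body.splitlines():
    if line.startswith("copilot_"):
        print(line)
```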
Integrations#
Integrations are currently limited to direct messages between users and Copilot bots; they won't operate within public, private, or group message channels.
Jira#
Issues with public Jira instances can be fetched. No configuration is required for this integration.
GitHub#
If you have the Mattermost GitHub plugin enabled, you can use the integration to fetch issues and PRs from your public and private GitHub repositories. The user must be logged in to their GitHub account through the Mattermost GitHub plugin.
Upgrade#
We recommend updating this integration as new versions are released. Generally, updates are seamless and don’t interrupt the user experience in Mattermost.
Visit the Releases page for the latest release, available releases, and compatibility considerations.
Usage#
When Copilot is configured, notify your teams that they can use Copilot in any Mattermost team or channel, and direct users to the chat with Copilot documentation for details on using Copilot to overcome information overload and streamline communication and collaboration.