
ChatMistralAI

This notebook covers how to get started with MistralAI chat models via their API.

A valid API key is needed to communicate with the API.
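If the key is not already set in your environment, you can prompt for it at runtime. The snippet below is a minimal sketch using the standard-library getpass module; it assumes you want to store the key in the MISTRAL_API_KEY environment variable, which the client reads by default.

import getpass
import os

# Prompt for the key only if it is not already set in the environment.
if "MISTRAL_API_KEY" not in os.environ:
    os.environ["MISTRAL_API_KEY"] = getpass.getpass("Enter your Mistral API key: ")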

import os

from langchain_core.messages import HumanMessage
from langchain_mistralai.chat_models import ChatMistralAI

mistral_api_key = os.environ.get("MISTRAL_API_KEY")
# If mistral_api_key is not passed, the client falls back to the `MISTRAL_API_KEY` environment variable.
chat = ChatMistralAI(mistral_api_key=mistral_api_key)
messages = [HumanMessage(content="say a brief hello")]
chat.invoke(messages)
AIMessage(content="Hello! I'm here to assist you. How can I help you today? If you have any questions or need information on a particular topic, feel free to ask. I'm ready to provide accurate and helpful answers to the best of my ability.")

ChatMistralAI also supports async and streaming functionality:

await chat.ainvoke(messages)
AIMessage(content="Hello! I'm glad you're here. If you have any questions or need assistance with something related to programming or software development, feel free to ask. I'll do my best to help you out. Have a great day!")
for chunk in chat.stream(messages):
    print(chunk.content, end="")
Hello! I'm happy to assist you. Is there a specific question or topic you would like to discuss? I can provide information and answer questions on a wide variety of subjects.
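Streaming is also available on the async side via astream. A minimal sketch, assuming it runs in an async context (for example a notebook cell or an async def function):

# Chunks are printed as they arrive from the API.
async for chunk in chat.astream(messages):
    print(chunk.content, end="")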