warning

🚧 Cortex.cpp is currently under development. Our documentation outlines the intended behavior of Cortex, which may not yet be fully implemented in the codebase.

cortex chat

info

This CLI command calls the following API endpoints:

  • Download Model (called only if the specified model is not yet downloaded)
  • Install Engine (called only if the required engine is not yet installed)
  • Start Model
  • Chat Completions (called only if the -c option is used)
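
For orientation, the Chat Completions call listed above takes a JSON request body. The sketch below shows a minimal example of that body; the URL, port, and `stream` field are illustrative assumptions based on an OpenAI-compatible schema and are not confirmed by this page.

```python
import json

# Illustrative endpoint; host and port are assumptions, not documented here.
url = "http://localhost:39281/v1/chat/completions"

# Minimal chat request body, assuming an OpenAI-compatible schema.
payload = {
    "model": "mistral",  # the model_id you would pass to `cortex chat`
    "messages": [
        {"role": "user", "content": "Hello, model!"},  # the -m message
    ],
    "stream": False,  # assumed option; one-shot response instead of streaming
}

body = json.dumps(payload)
print(body)
```

In a real session you would POST `body` to the running Cortex server with any HTTP client; the CLI performs the equivalent call for you.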

This command starts a chat session with a specified model, allowing you to interact directly with it through an interactive chat interface.

Usage

info

You can use the --verbose flag to display more detailed output of the internal processes. To apply this flag, use the following format: cortex --verbose [subcommand].


```sh
# Stable
cortex chat [options] <model_id> -m <message>
# Beta
cortex-beta chat [options] <model_id> -m <message>
# Nightly
cortex-nightly chat [options] <model_id> -m <message>
```

info

This command uses the model_id of a model that you have already downloaded or that is available in your file system.

Options

| Option | Description | Required | Default value | Example |
|---|---|---|---|---|
| `model_id` | Model ID to chat with. | Yes | - | `mistral` |
| `-m, --message <message>` | Message to send to the model. | Yes | - | `-m "Hello, model!"` |
| `-h, --help` | Display help information for the command. | No | - | `-h` |