This release includes new features and bug fixes.
Ensure output format using structured output
You can now use structured output when generating text using openAIChat or azureChat objects.
In LLMs with MATLAB, you can specify the structure of the output in two different ways.
- Specify a valid JSON Schema directly.
- Specify an example structure array that adheres to the required output format. The software automatically generates the corresponding JSON Schema and provides it to the LLM, then converts the output of the LLM back into a structure array.
To do this, set the ResponseFormat name-value argument of openAIChat, azureChat, or generate to:
- A string scalar containing a valid JSON Schema.
- A structure array containing an example that adheres to the required format, for example:
ResponseFormat=struct("Name","Rudolph","NoseColor",[255 0 0])
For more information on structured output, see https://platform.openai.com/docs/guides/structured-outputs.
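For example, the following is a minimal sketch of requesting structured output from an example structure. It assumes the LLMs with MATLAB add-on is installed and that a valid OpenAI API key is set in the OPENAI_API_KEY environment variable; the prompt and field names are illustrative only.
chat = openAIChat("You are a helpful assistant.");
% Provide an example structure. The software generates the matching JSON Schema,
% sends it to the LLM, and converts the response back into a structure array.
reindeer = generate(chat,"Describe a reindeer with a glowing nose.", ...
    ResponseFormat=struct("Name","Rudolph","NoseColor",[255 0 0]))
The result should be a structure array with Name and NoseColor fields matching the example.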
Argument name changes
The Model name-value argument of ollamaChat has been renamed to ModelName. However, you can still use Model instead.
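For example, this minimal sketch assumes a local Ollama server is running and that the model name "mistral" (illustrative only) is available:
chat = ollamaChat(ModelName="mistral");
% The previous argument name still works:
chat = ollamaChat(Model="mistral");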
The Deployment name-value argument of azureChat has been renamed to DeploymentID. However, you can still use Deployment instead.
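For example, this minimal sketch assumes your Azure OpenAI endpoint and API key are configured through environment variables and uses the illustrative deployment name "my-gpt-4o-deployment":
chat = azureChat(DeploymentID="my-gpt-4o-deployment");
% The previous argument name still works:
chat = azureChat(Deployment="my-gpt-4o-deployment");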