The current validation code states:

```python
if isinstance(self.inference_model, OpenAiInferenceEngine):
    if self.format and type(self.format) is not SystemFormat:
        raise ValueError(
            "Error in 'LLMAsJudge' metric. Inference model 'OpenAiInferenceEngine' does "
            "not support formatting. Please remove the format definition from the recipe"
            " (OpenAi Chat API take care of the formatting automatically)."
        )
```
Since `OpenAiInferenceEngine` does not support formatting, the validation needs to assure that `self.format` is an empty format, not just any `SystemFormat`. The condition above only rejects formats that are not `SystemFormat` instances, so any customized `SystemFormat` still passes, as illustrated in the sketch below.
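To make the intent concrete, here is a minimal, self-contained sketch that contrasts the current condition with a check that only accepts a designated empty format. The stub classes and the `empty_format` argument are illustrative placeholders, not the actual unitxt API.

```python
# Stub classes standing in for the real unitxt types (illustration only).
class Format:
    """Stand-in for unitxt's base format class."""

class SystemFormat(Format):
    """Stand-in for unitxt's SystemFormat."""
    def __init__(self, model_input_format="{source}"):
        self.model_input_format = model_input_format

class OpenAiInferenceEngine:
    """Stand-in for the OpenAI inference engine."""

def validate_current(inference_model, fmt):
    # Current check: rejects only formats that are not SystemFormat instances,
    # so any SystemFormat -- including a heavily customized one -- slips through.
    if isinstance(inference_model, OpenAiInferenceEngine):
        if fmt and type(fmt) is not SystemFormat:
            raise ValueError("formatting is not supported by OpenAiInferenceEngine")

def validate_intended(inference_model, fmt, empty_format):
    # Intended check (hypothetical `empty_format` argument): accept only the
    # designated empty/default format and reject everything else.
    if isinstance(inference_model, OpenAiInferenceEngine):
        if fmt is not None and fmt is not empty_format:
            raise ValueError("formatting is not supported by OpenAiInferenceEngine")

engine = OpenAiInferenceEngine()
empty = SystemFormat()  # plain pass-through of the source text
custom = SystemFormat(model_input_format="Instruction: {instruction}\n\n{source}\nAnswer:")

validate_current(engine, custom)                       # passes silently, although the prompt is altered
validate_intended(engine, empty, empty_format=empty)   # passes, as intended
try:
    validate_intended(engine, custom, empty_format=empty)
except ValueError as err:
    print("rejected:", err)                            # the customized format is now rejected
```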