Use an LLM provider that requires responses to be in "json_schema" format

Hello,
I'm trying to connect to an OpenAI-compatible LLM provider that only supports the "json_schema" response format. NocoBase seems to require "text". Is there a way to configure this? It seems possible within a workflow, but is it possible for other requests?
Thank you for your help.
Pix


Hi,

I think it's not that the provider only supports json_schema; it's that it no longer accepts "text" as an explicit value for response_format.type.

NocoBase is sending response_format: { "type": "text" } in its requests, which my OpenAI-compatible provider (Infomaniak) now rejects with a 400 error:

400 Your request was malformed or missing some required parameters: {"code":"validation_failed","description":"Validation failed","errors":[{"code":"validation_rule_response_format_types_blacklist","description":"The 'text' is no longer supported for 'response_format.type', only 'json_schema' is. See the documentation of the endpoint: https://developer.infomaniak.com/docs/api/post/2/ai/{product_id}/openai/v1/chat/completions or the OpenAI one: https://platform.openai.com/docs/api-reference/chat/create#chat_create-response_format","context":{"attribute":"response_format.type"}}]}

Since "text" is the default behavior of the OpenAI API, NocoBase should simply omit the response_format field when plain text output is required, rather than sending it explicitly.
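To illustrate the suggested fix, here is a minimal sketch (not NocoBase's actual code; the function name and parameters are hypothetical) of how a client can build the Chat Completions request body so that response_format is only included when a non-default format is explicitly requested:

```python
def build_chat_payload(model, messages, response_format=None):
    """Build a /v1/chat/completions request body.

    response_format is only included when explicitly set (e.g. a
    {"type": "json_schema", ...} value). Omitting it lets the API fall
    back to its default plain-text behavior, which avoids 400 errors
    from providers that reject response_format.type == "text".
    """
    payload = {"model": model, "messages": messages}
    if response_format is not None:
        payload["response_format"] = response_format
    return payload

# Plain text: no response_format key is sent at all.
text_request = build_chat_payload(
    "gpt-4o", [{"role": "user", "content": "Hello"}]
)

# Structured output: response_format is passed through unchanged.
json_request = build_chat_payload(
    "gpt-4o",
    [{"role": "user", "content": "Hello"}],
    response_format={"type": "json_schema",
                     "json_schema": {"name": "reply", "schema": {}}},
)
```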

Is there a way to work around this?

Thank you

Pix