Override Prompt Caching per model
Raul Silva
Currently, only Claude 3.7 Sonnet shows the option to override parameters per model and to enable Prompt Caching; OpenAI models are missing the toggle. Please add it to all models that support it.
A check should probably also be implemented to show or hide the override parameters based on each model's capabilities, as sketched below.
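
A minimal sketch of what that check could look like. The model IDs, capability fields, and function names here are hypothetical placeholders, not the app's actual data model:

```typescript
// Hypothetical per-model capability map — fields and IDs are illustrative only.
interface ModelCapabilities {
  promptCaching: boolean;
  temperature: boolean;
  topP: boolean;
}

const CAPABILITIES: Record<string, ModelCapabilities> = {
  "claude-3-7-sonnet": { promptCaching: true, temperature: true, topP: true },
  "o1": { promptCaching: false, temperature: false, topP: false },
};

// Render an override control only if the selected model supports that parameter.
function shouldShowOverride(
  modelId: string,
  param: keyof ModelCapabilities,
): boolean {
  return CAPABILITIES[modelId]?.[param] ?? false;
}
```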
When changing the model in a chat, I would also like these overrides to be applied. For example, if I am using temperature 0 and switch to a model that doesn't support it, I currently have to go in and change it manually for that chat.
Switching models in a chat should be seamless, applying only what each model supports.
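
In the same vein, applying overrides on a model switch could just filter the chat's saved overrides against the new model's capabilities. Again a sketch under the same assumptions, reusing the hypothetical CAPABILITIES map from above:

```typescript
// Hypothetical shape of a chat's saved overrides, keyed by capability name.
type ChatOverrides = Partial<Record<keyof ModelCapabilities, number | boolean>>;

// On a model switch, keep only the overrides the new model supports.
function filterOverridesForModel(
  overrides: ChatOverrides,
  modelId: string,
): ChatOverrides {
  const caps = CAPABILITIES[modelId];
  if (!caps) return {};
  return Object.fromEntries(
    Object.entries(overrides).filter(
      ([param]) => caps[param as keyof ModelCapabilities],
    ),
  ) as ChatOverrides;
}
```

With something like this, switching from a model with `{ temperature: 0 }` set to one whose capabilities don't include temperature would silently drop that override instead of sending an unsupported parameter.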