Allow setting a custom context length for each model. For example, when selecting GPT-4 Turbo I would like to set the context length to 64k instead of the default 128k.
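A minimal sketch of what such a per-model cap could mean in practice, trimming the conversation to a user-set token budget before sending it. All names here (`count_tokens`, `trim_history`, `MODEL_CONTEXT_OVERRIDES`) are hypothetical illustrations, not the app's actual code:

```python
# Hypothetical sketch: cap the tokens sent to a model at a user-set
# context length instead of the model's maximum.

def count_tokens(text: str) -> int:
    # Rough stand-in for a real tokenizer (e.g. tiktoken): ~1 token per word.
    return len(text.split())

def trim_history(messages: list[str], max_context_tokens: int) -> list[str]:
    """Keep the most recent messages that fit within the token budget."""
    kept, used = [], 0
    for msg in reversed(messages):
        cost = count_tokens(msg)
        if used + cost > max_context_tokens:
            break
        kept.append(msg)
        used += cost
    return list(reversed(kept))

# Per-model override: GPT-4 Turbo supports 128k, but the user caps it at 64k.
MODEL_CONTEXT_OVERRIDES = {"gpt-4-turbo": 64_000}
```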
Reason (with a longer context, the model tends to lose data):