Problem description: Chat model is gpt-3.5-turbo-0613, with the maximum token count set to 4096. Sending any chat message immediately fails with: "This model's maximum context length is 4097 tokens. However, you requested 4104 tokens (8 in the messages, 4096 in the completion). Please reduce the le..." The cause: `max_tokens` budgets only the completion, and the prompt tokens are counted on top of it, so prompt (8) + completion (4096) = 4104, which exceeds the model's 4097-token context window. The fix is to lower `max_tokens` so that prompt tokens plus the completion budget stay within the limit.
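A minimal sketch of the fix, as pure arithmetic: clamp the requested completion budget so that prompt tokens plus `max_tokens` never exceed the context window. The helper name and the context-size table entry are illustrative, not part of any SDK; in a real request you would count prompt tokens with a tokenizer (e.g. tiktoken) before calling the API.

```python
# Hypothetical helper: clamp max_tokens so that
# prompt_tokens + max_tokens <= the model's context window.
MODEL_CONTEXT = {"gpt-3.5-turbo-0613": 4097}  # context size from the error message

def safe_max_tokens(model: str, prompt_tokens: int, requested: int) -> int:
    """Return the largest completion budget that still fits in the context window."""
    limit = MODEL_CONTEXT[model]
    available = limit - prompt_tokens
    return max(0, min(requested, available))

# The failing request: 8 prompt tokens + 4096 requested = 4104 > 4097.
# Clamping yields 4097 - 8 = 4089, which the API will accept.
print(safe_max_tokens("gpt-3.5-turbo-0613", 8, 4096))  # prints 4089
```

Pass the clamped value as `max_tokens` in the chat completion request instead of a hard-coded 4096.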