fix: correct max_tokens reference in OpenAI class
Updated the reference to max_tokens in the truncation call from self.chat_api.max_tokens to self.max_tokens, ensuring the correct token limit is applied. This resolves potential issues with message length handling.
This commit is contained in:
parent c06da55d5d
commit ca7245696a
1 changed file with 1 addition and 1 deletion
@@ -393,7 +393,7 @@ class OpenAI(BaseAI):
         # Truncate messages to fit within the token limit
         self._truncate(
             messages=chat_messages,
-            max_tokens=self.chat_api.max_tokens - 1,
+            max_tokens=self.max_tokens - 1,
             system_message=system_message,
         )
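For context, a minimal sketch of how the corrected call behaves. Everything beyond the names visible in the diff (BaseAI, OpenAI, chat_api, self.max_tokens, _truncate and its keyword arguments) is an assumption for illustration, including the word-based token counting and the surrounding chat method:

```python
class BaseAI:
    pass

class ChatAPI:
    # Hypothetical: the API client's own limit, distinct from the
    # instance-level limit the truncation is supposed to respect.
    max_tokens = 4096

class OpenAI(BaseAI):
    def __init__(self, max_tokens=8192):
        self.chat_api = ChatAPI()
        self.max_tokens = max_tokens  # the limit _truncate must use

    def _truncate(self, messages, max_tokens, system_message):
        # Naive truncation sketch: drop the oldest messages until the
        # total fits, using whitespace-split word counts as a stand-in
        # for real tokenization.
        def count(msgs):
            return sum(len(m["content"].split()) for m in msgs)

        total = count(messages) + len(system_message.split())
        while messages and total > max_tokens:
            messages.pop(0)
            total = count(messages) + len(system_message.split())
        return messages

    def chat(self, chat_messages, system_message=""):
        # After the fix: read the limit from self.max_tokens rather
        # than self.chat_api.max_tokens.
        return self._truncate(
            messages=chat_messages,
            max_tokens=self.max_tokens - 1,
            system_message=system_message,
        )
```

With the old code, `_truncate` would have been bounded by the API client's attribute instead of the instance's configured limit, so messages could be cut at the wrong length whenever the two values differed.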