fix: correct max_tokens reference in OpenAI class

Updated the reference to max_tokens in the truncation call from
self.chat_api.max_tokens to self.max_tokens, ensuring the correct
token limit is applied. This resolves potential issues with message
length handling.
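A minimal sketch (with hypothetical stand-in classes; the real `OpenAI`/`BaseAI` implementation is not shown here) of why the attribute lookup matters: the token limit lives on the AI wrapper itself, not on the underlying `chat_api` client object, so `self.chat_api.max_tokens` would fail while `self.max_tokens` works.

```python
class ChatAPI:
    """Stand-in for the underlying API client; note it has no max_tokens."""
    pass

class OpenAI:
    def __init__(self, max_tokens=4096):
        self.chat_api = ChatAPI()
        self.max_tokens = max_tokens  # limit is stored on the wrapper

    def _truncate(self, messages, max_tokens, system_message=None):
        # Naive stand-in for the real truncation: drop the oldest
        # messages until the rough "token" count (words here) fits.
        def count(msgs):
            return sum(len(m.split()) for m in msgs)
        while messages and count(messages) > max_tokens:
            messages.pop(0)
        return messages

    def prepare(self, chat_messages, system_message=None):
        # The fixed line: self.max_tokens, not self.chat_api.max_tokens
        return self._truncate(
            messages=chat_messages,
            max_tokens=self.max_tokens - 1,
            system_message=system_message,
        )

ai = OpenAI(max_tokens=5)
print(ai.prepare(["one two three", "four five", "six"]))
# → ['four five', 'six']  (oldest message dropped to fit the limit)
```

With the pre-fix attribute, `ai.chat_api.max_tokens` raises `AttributeError`, since the client object never carried the limit.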
This commit is contained in:
Kumi 2024-08-04 17:42:23 +02:00
parent c06da55d5d
commit ca7245696a
Signed by: kumi
GPG key ID: ECBCC9082395383F


@@ -393,7 +393,7 @@ class OpenAI(BaseAI):
         # Truncate messages to fit within the token limit
         self._truncate(
             messages=chat_messages,
-            max_tokens=self.chat_api.max_tokens - 1,
+            max_tokens=self.max_tokens - 1,
             system_message=system_message,
         )