Replies: 1 comment
-
I found the solution.
-
I have more than 2,000 JSON files. Each file is very complex and over 700 KB in size, but they all follow the same structure. I haven't been able to find a good split strategy for this JSON data; every option I've tried eventually hits the max-token error. Does anyone have a suggestion?
{'error': {'message': "This model's maximum context length is 128000 tokens. However, your messages resulted in 153652 tokens. Please reduce the length of the messages.", 'type': 'invalid_request_error', 'param': 'messages', 'code': 'context_length_exceeded'}}
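Since the files follow the same structure, one approach is to estimate each record's token count and greedily pack records into chunks that stay under the model's context limit. Below is a minimal sketch assuming each file's top level is a list of records; the 4-characters-per-token ratio is a rough heuristic (a real tokenizer such as `tiktoken` would be more accurate), and `split_records` is an illustrative helper, not part of any official API.

```python
import json

# Rough heuristic: roughly 4 characters per token for English/JSON text.
# This is an assumption; use the model's actual tokenizer for precise counts.
CHARS_PER_TOKEN = 4

def estimate_tokens(obj) -> int:
    """Estimate the token count of a JSON-serializable object."""
    return len(json.dumps(obj)) // CHARS_PER_TOKEN + 1

def split_records(records, max_tokens=100_000):
    """Greedily pack records into chunks whose estimated size
    stays under max_tokens (leave headroom below the 128k limit
    for the prompt and the model's reply)."""
    chunks, current, current_tokens = [], [], 0
    for record in records:
        t = estimate_tokens(record)
        if current and current_tokens + t > max_tokens:
            chunks.append(current)
            current, current_tokens = [], 0
        current.append(record)
        current_tokens += t
    if current:
        chunks.append(current)
    return chunks

# Example with synthetic records standing in for one JSON file's contents.
data = [{"id": i, "text": "x" * 400} for i in range(1000)]
chunks = split_records(data, max_tokens=1000)
print(len(chunks))  # number of sub-requests needed for this file
```

Each chunk can then be sent as a separate request, with the per-chunk results merged afterwards. If a single record by itself exceeds the budget, it would need further splitting by field.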