[Bug]: Trying to run the vectorizer on a large number of new documents results in "Requested 629204 tokens, max 600000 tokens per request" from openai
#481 · Open
kolaente opened this issue on Feb 14, 2025 · 2 comments · May be fixed by #482
What happened?
I'm running the vectorizer on a large new dataset and get this error from OpenAI:

`Requested 629204 tokens, max 600000 tokens per request`

I wonder if that's a configuration error?
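The error suggests the worker combined all pending chunks into a single embeddings request whose total token count exceeded OpenAI's per-request cap. A minimal sketch of token-aware batching that would avoid this, assuming the OpenAI Python client and `tiktoken` (this is not pgai's actual code, and the 600,000 cap is taken from the error message above):

```python
# Sketch: split texts into sub-batches whose combined token count stays
# under the per-request limit reported in the error message.
import tiktoken
from openai import OpenAI

MAX_TOKENS_PER_REQUEST = 600_000  # from the error message; assumed cap
MODEL = "text-embedding-3-small"  # illustrative model choice

enc = tiktoken.encoding_for_model(MODEL)
client = OpenAI()

def batched_embeddings(texts: list[str]) -> list[list[float]]:
    embeddings, batch, batch_tokens = [], [], 0
    for text in texts:
        n = len(enc.encode(text))
        # Flush the current batch before it would exceed the cap.
        if batch and batch_tokens + n > MAX_TOKENS_PER_REQUEST:
            resp = client.embeddings.create(model=MODEL, input=batch)
            embeddings.extend(d.embedding for d in resp.data)
            batch, batch_tokens = [], 0
        batch.append(text)
        batch_tokens += n
    if batch:
        resp = client.embeddings.create(model=MODEL, input=batch)
        embeddings.extend(d.embedding for d in resp.data)
    return embeddings
```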
pgai extension affected
0.7.0
pgai library affected
No response
PostgreSQL version used
17
What operating system did you use?
timescale/timescaledb-ha:pg17 (latest)
What installation method did you use?
Docker
What platform did you run on?
On prem/Self-hosted
Relevant log output and stack trace
How can we reproduce the bug?
1. Have a large amount of data.
2. Create a vectorizer.
3. Run the worker.

I can't really pin it down; I'll try to see if I can reproduce it more reliably. A rough sketch of what a standalone reproduction might look like is below.
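A hedged sketch of a standalone reproduction, bypassing pgai entirely (the model name and document sizes are illustrative assumptions; pgai's worker builds its batches differently):

```python
# Sketch: trigger the same 400 error by sending one embeddings request
# whose combined input exceeds the per-request token cap.
from openai import OpenAI

client = OpenAI()

# ~1,000 documents of roughly 1,000 tokens each; the combined token
# count should comfortably exceed 600,000 while staying under the
# per-input limit for the model.
docs = ["some long document text " * 200] * 1000

try:
    client.embeddings.create(model="text-embedding-3-small", input=docs)
except Exception as e:
    # Expected: a BadRequestError with a message like
    # "Requested 629204 tokens, max 600000 tokens per request"
    print(e)
```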
Are you going to work on the bugfix?
None