Hi, I am creating a project that involves analyzing pretty big files. Currently I use the OpenAI API with file retrieval, but it costs too much, almost $1 per request. Is there any other, cheaper alternative? I am not into ML, so I am looking for something quick to figure out. Preferably something for working with SQL databases, but JSON files are fine too. I am not sure if this is the right place to ask this kind of question; if not, please point me to where I can get an answer. Thanks for the help.
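To illustrate the kind of workload I mean, here is a minimal sketch (the table name, schema, and records are made up) of filtering such data locally with SQLite instead of uploading the whole file to the API on every request:

```python
# Hypothetical illustration: load records (e.g. parsed from a big JSON file)
# into a local in-memory SQLite database and aggregate them with plain SQL,
# so only a small summary would ever need to be sent to a paid API.
import sqlite3

# Sample records standing in for the contents of a large file (assumed schema).
records = [
    {"id": 1, "category": "billing", "amount": 120.5},
    {"id": 2, "category": "billing", "amount": 75.0},
    {"id": 3, "category": "support", "amount": 0.0},
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER, category TEXT, amount REAL)")
conn.executemany("INSERT INTO events VALUES (:id, :category, :amount)", records)

# Aggregate locally for free; the result is tiny compared to the raw file.
total = conn.execute(
    "SELECT SUM(amount) FROM events WHERE category = 'billing'"
).fetchone()[0]
print(total)  # 195.5
```

This is just a toy example of local pre-processing, not my actual pipeline.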