Uploading large files (PDFs) to Inference Endpoints

Hi everyone!
We have a specific use case in which we need to upload large PDF files, say 150 to 200 pages each.
For smaller PDFs containing fewer than 50 pages, I'm converting the pages to images, encoding them as base64 strings, and sending them to the Endpoint server using requests, but it's very slow since it depends on upload speed.
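
For reference, here is roughly what my current approach looks like (the endpoint URL, token, and payload shape are placeholders; the payload format depends on your custom handler):

```python
import base64
import requests

# Placeholder endpoint URL and token -- replace with your own values.
ENDPOINT_URL = "https://<your-endpoint>.endpoints.huggingface.cloud"
HF_TOKEN = "hf_..."

def send_pdf_pages(image_paths):
    """Encode each page image as base64 and POST the batch to the endpoint."""
    pages = []
    for path in image_paths:
        with open(path, "rb") as f:
            pages.append(base64.b64encode(f.read()).decode("utf-8"))

    # The request body shape is just an example of what my handler expects.
    response = requests.post(
        ENDPOINT_URL,
        headers={"Authorization": f"Bearer {HF_TOKEN}"},
        json={"inputs": {"pages": pages}},
        timeout=600,
    )
    response.raise_for_status()
    return response.json()
```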

Is there a better approach for this? One idea I had was to upload the large files to cloud storage, say an S3 bucket, and download them directly on the inference endpoint server, but the problem is I couldn't find a way to set the secret keys (AWS credentials) in Inference Endpoints.
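
If the credentials could somehow be made available to the endpoint (for example as environment variables), the handler could pull the PDF from S3 instead of receiving the raw bytes, and the client would only need to send the bucket and key. A minimal sketch of that idea with boto3, assuming AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY are set on the endpoint (which is exactly the part I haven't figured out how to configure):

```python
import os
import boto3

def download_pdf_from_s3(bucket: str, key: str, local_path: str = "/tmp/input.pdf") -> str:
    """Download a PDF from S3 using credentials read from environment variables.

    Assumes the AWS keys are injected into the endpoint somehow -- this is the
    missing piece I'm asking about.
    """
    s3 = boto3.client(
        "s3",
        aws_access_key_id=os.environ["AWS_ACCESS_KEY_ID"],
        aws_secret_access_key=os.environ["AWS_SECRET_ACCESS_KEY"],
    )
    s3.download_file(bucket, key, local_path)
    return local_path
```

With something like this in the custom handler, the request payload would shrink to just `{"bucket": "...", "key": "..."}` instead of hundreds of base64-encoded pages.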

Thanks a lot!