"Worker died" error while performing inference on large text

I want to run a feature-extraction task with ‘EleutherAI/gpt-neo-125M’, but when I provide text I get a “Worker died” error. I looked into the logs and found the error io.netty.handler.codec.CorruptedFrameException: Message size exceed limit: 14010754
Kindly help me resolve this error. Thanks in advance.

Hey @Nomesh, did you end up finding a solution for this?

Hey, I think the solution might be to increase the maximum allowed request/response size using env variables as follows (or the endpoint equivalent); this worked for me:

transformer = huggingface_model.transformer(
    instance_count=1,
    output_path=s3_output,
    instance_type="ml.m5.xlarge",
    assemble_with="Line",
    max_payload=6,
    strategy="SingleRecord",
    env={
        # Give the model server more time before it kills the worker
        "SAGEMAKER_MODEL_SERVER_TIMEOUT": "3600",
        # TS_* applies to TorchServe, MMS_* to Multi Model Server;
        # setting both covers either backend
        "TS_MAX_RESPONSE_SIZE": "2000000000",
        "TS_MAX_REQUEST_SIZE": "2000000000",
        "MMS_MAX_RESPONSE_SIZE": "2000000000",
        "MMS_MAX_REQUEST_SIZE": "2000000000",
    },
)
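
In case it helps anyone landing here: for a real-time endpoint, the same env vars can be passed when constructing the model instead. Here is a minimal sketch, assuming your own role, model_data S3 path, and container versions (the values below are placeholders, not from the original post):

from sagemaker.huggingface import HuggingFaceModel

# role and the model_data path are assumed to come from your own setup
huggingface_model = HuggingFaceModel(
    model_data="s3://my-bucket/model.tar.gz",  # placeholder path
    role=role,
    transformers_version="4.26",  # placeholder versions
    pytorch_version="1.13",
    py_version="py39",
    env={
        "SAGEMAKER_MODEL_SERVER_TIMEOUT": "3600",
        "TS_MAX_RESPONSE_SIZE": "2000000000",
        "TS_MAX_REQUEST_SIZE": "2000000000",
        "MMS_MAX_RESPONSE_SIZE": "2000000000",
        "MMS_MAX_REQUEST_SIZE": "2000000000",
    },
)

predictor = huggingface_model.deploy(
    initial_instance_count=1,
    instance_type="ml.m5.xlarge",
)

Note that real-time endpoints still have SageMaker's own invocation payload cap (around 6 MB, if I remember correctly), so for very large inputs like the ~14 MB message in the original error, batch transform is probably the safer route.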