How can I create Lambda function code after deploying a BART transformer for summarization in SageMaker?

I deployed the bart-large-cnn model in AWS SageMaker for a summarization task with the following code.

from sagemaker.huggingface import HuggingFaceModel
import sagemaker

role = sagemaker.get_execution_role()

# Hub Model configuration: https://huggingface.co/models
hub = {
    'HF_MODEL_ID': 'facebook/bart-large-cnn',  # model_id from https://huggingface.co/models
    'HF_TASK': 'summarization'                 # NLP task you want to use for predictions
}

# create Hugging Face Model Class
huggingface_model = HuggingFaceModel(
    env=hub,
    role=role,                    # IAM role with permissions to create an endpoint
    transformers_version="4.6",   # transformers version used
    pytorch_version="1.7",        # pytorch version used
    py_version="py36",            # python version of the DLC
)

# deploy model to SageMaker Inference
predictor = huggingface_model.deploy(
    initial_instance_count=1,
    instance_type="ml.m5.xlarge"
)

Then I tested it with text to extract summaries, and it performed well.
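For context, the test looked roughly like this (the input text is just an illustration):

# quick sanity check against the deployed endpoint;
# the Hugging Face inference container expects a JSON body with an "inputs" key
text = (
    "The tower is 324 metres tall, about the same height as an 81-storey building, "
    "and is the tallest structure in Paris."
)
print(predictor.predict({"inputs": text}))  # e.g. [{"summary_text": "..."}]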

What code do I need to add in a Lambda function to create a REST API that calls the summarization model endpoint using AWS Lambda and API Gateway?


Hello @vihaary,

Great to hear it worked as expected. Invoking your endpoint from an AWS Lambda function is pretty easy: you can either use the boto3 sagemaker-runtime client or install the sagemaker SDK into your Lambda function.

boto3 snippet:

import boto3
import json
client = boto3.client('sagemaker-runtime')

response = client.invoke_endpoint(
EndpointName=ENDPOINT_NAME,
ContentType="application/json",
Accept="application/json",
Body=json.dumps(payload),
)
print(response['Body'].read().decode())

sagemaker SDK snippet:

from sagemaker.huggingface import HuggingFacePredictor

predictor = HuggingFacePredictor(ENDPOINT_NAME)
response = predictor.predict(payload)
print(response)
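In both snippets ENDPOINT_NAME is the name of your deployed endpoint, and payload follows the Hugging Face inference toolkit format, roughly:

# JSON document with an "inputs" key; generation parameters are optional
payload = {
    "inputs": "Paste the paragraph you want to summarize here.",
    # "parameters": {"min_length": 30, "max_length": 140},  # optional, illustrative values
}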

Hello @philschmid, thanks for your response. Will I get a summary as the response output for text paragraphs as input with this Lambda code?

No, that is just how you can invoke your endpoint in your AWS Lambda function.

@philschmid

{
  "errorMessage": "An error occurred (ModelError) when calling the InvokeEndpoint operation: Received client error (400) from model with message \"{\n  \"code\": 400,\n  \"type\": \"InternalServerException\",\n  \"message\": \"\u0027numpy.ndarray\u0027 object has no attribute \u0027pop\u0027\"\n}\n\". See https://us-west-2.console.aws.amazon.com/cloudwatch/home?region=us-west-2#logEventViewer:group=/aws/sagemaker/Endpoints/huggingface-pytorch-inference-2021-09-27-12-40-12-385 in account 215401044306 for more information.",
  "errorType": "ModelError",
  "stackTrace": [
    "  File \"/var/task/lambda_function.py\", line 20, in lambda_handler\n    Body=payload)\n",
    "  File \"/var/runtime/botocore/client.py\", line 386, in _api_call\n    return self._make_api_call(operation_name, kwargs)\n",
    "  File \"/var/runtime/botocore/client.py\", line 705, in _make_api_call\n    raise error_class(parsed_response, operation_name)\n"
  ]
}

I am getting this error. Also, can I get a summary as output for text paragraphs as input after invoking my endpoint in the AWS Lambda function?

You are not sending a JSON input to the endpoint.

Yes

@philschmid Thanks for the reply.

I tried this one:

import os
import io
import boto3
import json
import csv

# grab environment variables
ENDPOINT_NAME = os.environ['ENDPOINT_NAME']
runtime = boto3.client('runtime.sagemaker')

def lambda_handler(event, context):
    print("Received event: " + json.dumps(event, indent=2))

    data = json.loads(json.dumps(event))
    payload = data['data']
    print(payload)

    response = runtime.invoke_endpoint(EndpointName=ENDPOINT_NAME,
                                       ContentType='text/csv',
                                       Body=payload)
    print(response)
    result = json.loads(response['Body'].read().decode())
    print(result)

But I am not getting the summary output. Can you provide sample code for a Lambda function where I can give a paragraph as input and get a summary as output from my Hugging Face BART model endpoint?

Your ContentType is wrong if you compare it to what I have shared. I am going to create a sample for this but not before next week.
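The immediate fix is to send a JSON body together with the matching ContentType. A minimal sketch, reusing the ENDPOINT_NAME, runtime client, and payload variable from your handler above (where payload is assumed to be the raw paragraph text):

# wrap the raw paragraph in the JSON format the Hugging Face container expects
response = runtime.invoke_endpoint(
    EndpointName=ENDPOINT_NAME,
    ContentType='application/json',        # was 'text/csv'
    Body=json.dumps({"inputs": payload}),  # JSON body with an "inputs" key
)
result = json.loads(response['Body'].read().decode())
print(result)  # e.g. [{"summary_text": "..."}]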

Thanks @philschmid. I will be waiting for your response.

Hi @philschmid. Could you create a Lambda function sample for the above BART model summarization task? Thanks in advance.

Hey,
I’ll post about it later on Social Media (Twitter, LinkedIn) but here is the example: cdk-samples/aws-lambda-sagemaker-endpoint-huggingface at master · philschmid/cdk-samples · GitHub
It is an example using the CDK, so if you want to deploy a BART model for summarization you can follow the readme.

Clone the repository.

git clone https://github.com/philschmid/cdk-samples.git
cd cdk-samples/aws-lambda-sagemaker-endpoint-huggingface

Install the required CDK dependencies. Make sure you have the CDK installed.

pip3 install -r requirements.txt

Bootstrap your application in the cloud.

cdk bootstrap

Deploy your Hugging Face Transformer model to Amazon SageMaker


cdk deploy \
-c model="facebook/bart-large-cnn" \
-c task="summarization"

Clean up.

cdk destroy \
-c model="facebook/bart-large-cnn" \
-c task="summarization"

If you only want to see the Lambda code, you can look here: cdk-samples/handler.py at master · philschmid/cdk-samples · GitHub
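For reference, the idea behind that handler, stripped down to a sketch (not the exact file contents; it assumes an API Gateway proxy integration whose request body is JSON with an "inputs" field):

import json
import os

import boto3

# name of the SageMaker endpoint, passed in as an environment variable
ENDPOINT_NAME = os.environ["ENDPOINT_NAME"]
runtime = boto3.client("sagemaker-runtime")


def handler(event, context):
    # API Gateway (proxy integration) delivers the request body as a string
    body = json.loads(event["body"])
    text = body["inputs"]

    # forward the text to the SageMaker endpoint as JSON
    response = runtime.invoke_endpoint(
        EndpointName=ENDPOINT_NAME,
        ContentType="application/json",
        Accept="application/json",
        Body=json.dumps({"inputs": text}),
    )
    summary = json.loads(response["Body"].read().decode())

    # return an API Gateway proxy response containing the summary
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(summary),
    }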

Hi @vihaary,

Were you able to solve that problem? If yes, then please share the code with me. I need that code as I am working on a project involving the same things.

Thanks

Yes, as mentioned above, I created an example of how to do this using the AWS CDK.