You can add a requirements.txt to the code/ directory inside the archive, upload the archive to S3, and provide its S3 URI as model_data. This should work; you can use my example to test it.
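As a sketch of what that archive looks like, the helper below (hypothetical names; the inference.py and requirements.txt contents are placeholders) builds a model.tar.gz with the code/ directory at the top level, which is the layout SageMaker expects when it installs requirements.txt at container startup:

```python
import tarfile
from pathlib import Path

def build_model_archive(workdir: Path) -> Path:
    """Package a code/ directory (entry point + requirements.txt) into model.tar.gz."""
    code_dir = workdir / "code"
    code_dir.mkdir(parents=True, exist_ok=True)
    # Inference script and its extra dependencies; SageMaker pip-installs
    # code/requirements.txt when the container starts.
    (code_dir / "inference.py").write_text("def model_fn(model_dir):\n    ...\n")
    (code_dir / "requirements.txt").write_text("pandas\n")

    archive = workdir / "model.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        # code/ sits at the archive root, next to any model weights you add.
        tar.add(code_dir, arcname="code")
    return archive
```

After uploading the resulting model.tar.gz to S3, pass its S3 URI as the model_data argument of your model class.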
Additionally, the FrameworkModel class does have a dependencies attribute, but it looks considerably more complex to use for adding your dependencies:
- dependencies (list[str]) – A list of paths to directories (absolute or relative) with any additional libraries that will be exported to the container (default: []). The library folders will be copied to SageMaker in the same folder where the entrypoint is copied. If 'git_config' is provided, 'dependencies' should be a list of relative locations to directories with any additional libraries needed in the Git repo. If the source_dir points to S3, code will be uploaded and the S3 location will be used instead.
Example: the following call Model(entry_point='inference.py', ..., dependencies=['my/libs/common', 'virtual-env'])
results in the following inside the container:
$ ls
opt/ml/code
|------ inference.py
|------ common
|------ virtual-env
This is not supported with “local code” in Local Mode.
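To make the mapping explicit, this small helper (not part of the SDK, purely illustrative) predicts what lands in /opt/ml/code: the entry point plus the basename of each dependency directory, as shown in the listing above:

```python
from pathlib import PurePosixPath

def container_layout(entry_point: str, dependencies: list[str]) -> list[str]:
    """Predict the contents of /opt/ml/code: the entry point script plus
    each dependency directory, copied under its final path component."""
    names = [PurePosixPath(entry_point).name]
    names += [PurePosixPath(dep).name for dep in dependencies]
    return names
```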
If you want to go with source_dir and entry_point, I would suggest building a helper function, such as install_dependencies, that is executed before all imports.
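A minimal sketch of such a helper, assuming a requirements.txt shipped alongside the entry point (the function name install_dependencies and the dry_run flag are illustrative, not part of any SDK):

```python
import subprocess
import sys

def install_dependencies(requirements: str = "requirements.txt", dry_run: bool = False):
    """Pip-install a requirements file at runtime, before any imports
    that need the packages. dry_run returns the command without running it."""
    cmd = [sys.executable, "-m", "pip", "install", "-r", requirements]
    if dry_run:
        return cmd
    subprocess.check_call(cmd)
    return cmd
```

You would call install_dependencies() at the very top of your entry point script, before importing any of the packages it installs.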