Inference Toolkit - Init and default template for custom inference

Hey Phil,

Thanks for the response. The custom inference.py example above was just a test to make sure everything works before I start customizing it further. Would the solution you suggested still work when the custom inference.py is bundled with the model? Also, if we want to override the model loading, do we use model_fn or load_fn? I've seen both versions floating around.
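For context, here's a rough sketch of the kind of inference.py I'm aiming for (just to illustrate the direction, not something I'm committed to; the sequence-classification model and the "inputs" key are placeholders, and I've used model_fn as the loading hook here, which is exactly the part I'm unsure about):

```python
# inference.py -- rough sketch of the customization I have in mind.
# Assumes the Inference Toolkit's model_fn / predict_fn hooks; the model
# class and request format below are placeholders, not my real setup.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer


def model_fn(model_dir):
    # model_dir is the directory the model archive gets extracted into
    tokenizer = AutoTokenizer.from_pretrained(model_dir)
    model = AutoModelForSequenceClassification.from_pretrained(model_dir)
    model.eval()
    return model, tokenizer


def predict_fn(data, model_and_tokenizer):
    # data is the already-deserialized request payload
    model, tokenizer = model_and_tokenizer
    inputs = tokenizer(
        data["inputs"], return_tensors="pt", padding=True, truncation=True
    )
    with torch.no_grad():
        logits = model(**inputs).logits
    return {"logits": logits.tolist()}
```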

Thanks!