Unable to load LLM with load_in_8bit


I am trying to load mosaicml/mpt-7b-instruct with load_in_8bit=True and device_map="auto", but I get the following error:

ValueError: MPTForCausalLM does not support `device_map='auto'` yet.

How can I get rid of this error?
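For reference, here is a minimal sketch of the call that triggers the error. It assumes transformers, accelerate, and bitsandbytes are installed; the import is kept inside the function so the snippet itself loads without them.

```python
def load_mpt_8bit(model_name: str = "mosaicml/mpt-7b-instruct"):
    """Sketch of the failing load: quantize weights to int8 and let
    accelerate place layers automatically. Raises ValueError for model
    classes that do not yet support device_map="auto"."""
    from transformers import AutoModelForCausalLM  # requires transformers + bitsandbytes

    return AutoModelForCausalLM.from_pretrained(
        model_name,
        load_in_8bit=True,       # int8 quantization via bitsandbytes
        device_map="auto",       # let accelerate shard layers across devices
        trust_remote_code=True,  # MPT ships custom modeling code on the Hub
    )
```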


Update: thanks to @sam-mosaic, I got a reply on the model's community forum.

Is there a specific set of steps to add int8 inference support for an LLM?
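For context on what the error means: transformers raises the `does not support device_map='auto'` error when a model class leaves `_no_split_modules` unset, i.e. it has not declared which submodule classes accelerate must keep whole on a single device. A hedged sketch of what fixing that looks like follows; the helper name is hypothetical, and "MPTBlock" is an assumption about the block class name in MPT's custom modeling code.

```python
def enable_auto_device_map(model_cls, no_split=("MPTBlock",)):
    """Hypothetical helper: declare which submodules must not be split
    across devices, so accelerate's automatic device mapping can run.
    "MPTBlock" is an assumed name for MPT's transformer block class."""
    model_cls._no_split_modules = list(no_split)
    return model_cls
```

Note this only addresses the device-placement check; the int8 inference itself comes from passing `load_in_8bit=True`, which transformers routes through bitsandbytes.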