Hi,
I am trying to load mosaicml/mpt-7b-instruct with `load_in_8bit=True` and `device_map="auto"`, but I get the following error:
ValueError: MPTForCausalLM does not support `device_map='auto'` yet.
How can I get rid of this error?
Thanks!
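One workaround others have suggested (a sketch, not an official fix) is to pass an explicit device map instead of `"auto"`, since the `ValueError` is raised only for the `"auto"` inference path. The mapping `{"": 0}` places the entire model on GPU 0. The `from_pretrained` call itself is left commented out because it downloads a 7B checkpoint and requires a GPU plus bitsandbytes:

```python
# Assumed workaround: an explicit device map sidesteps the "auto" check
# that raises the ValueError for model classes without auto-map support.
def make_single_gpu_device_map(gpu_index: int = 0) -> dict:
    # The empty-string key "" refers to the root module, so the whole
    # model is placed on the given GPU.
    return {"": gpu_index}


device_map = make_single_gpu_device_map(0)

# from transformers import AutoModelForCausalLM
# model = AutoModelForCausalLM.from_pretrained(
#     "mosaicml/mpt-7b-instruct",
#     load_in_8bit=True,
#     device_map=device_map,     # explicit map instead of "auto"
#     trust_remote_code=True,    # MPT uses custom modeling code
# )
```

Note this only works when the whole 8-bit model fits on a single GPU; it does not shard the model across devices the way `device_map="auto"` would.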
Thanks to @sam-mosaic, I got a reply on the model community forum.
Is there a specific set of steps to add int8 inference support for an LLM?
Thanks!