Model type: chatglm - unexpected keyword argument 'padding_side'

Hi guys :slight_smile:

I'm switching between models, and I just found an interesting model that produces a strange error.

INFO: Application startup complete.
copilot-models | Using precision: 16
copilot-models | Model THUDM/codegeex4-all-9b not found locally. Downloading from Hugging Face…
copilot-models | Using fast tokenizer: ChatGLM4Tokenizer
copilot-models | Setting add_prefix_space=False for slow tokenizer.
copilot-models | Config passed
copilot-models | Model type: chatglm
copilot-models | Loading model: THUDM/codegeex4-all-9b
Loading checkpoint shards: 100% 4/4 [00:04<00:00, 1.01s/it]
copilot-models | Model and tokenizer saved locally to /models/THUDM/codegeex4-all-9b_model.
copilot-models | Using device: cuda
copilot-models | Error during model inference: _pad() got an unexpected keyword argument 'padding_side'
copilot-models | ERROR:root:Error checking if model is saved: _pad() got an unexpected keyword argument 'padding_side'

I found this: TypeError: ChatGLMTokenizer._pad() got an unexpected keyword argument 'padding_side' · Issue #1324 · THUDM/ChatGLM3 · GitHub, but it didn't help.
I also found Fix TypeError in _pad method by adding missing padding_side field · THUDM/LongWriter-glm4-9b at 778b571, but that was done 4 weeks ago and I'm not sure why it's not applied yet - or maybe it is, and this is now a different issue with the same "signature"?
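In case it helps anyone hitting the same thing: the error happens because a newer transformers release started passing padding_side down to the tokenizer's _pad(), while the model's remote tokenizer code was written against an older signature without that parameter. Until the fix lands in the repo you're loading, one generic workaround is to wrap _pad so it drops any keyword arguments the remote implementation doesn't accept. This is a minimal sketch of that pattern (patch_pad and LegacyTokenizer are illustrative names, not part of transformers):

```python
import inspect

def patch_pad(tokenizer):
    """Wrap tokenizer._pad so it silently drops kwargs the loaded
    implementation does not accept (e.g. the newer `padding_side`)."""
    orig_pad = tokenizer._pad
    # Parameters the original _pad actually declares (self excluded
    # automatically, since orig_pad is a bound method).
    accepted = set(inspect.signature(orig_pad).parameters)

    def _pad(*args, **kwargs):
        # Filter out anything the old signature would reject.
        kwargs = {k: v for k, v in kwargs.items() if k in accepted}
        return orig_pad(*args, **kwargs)

    tokenizer._pad = _pad
    return tokenizer

# Stand-in for an old remote-code tokenizer whose _pad lacks padding_side:
class LegacyTokenizer:
    def _pad(self, encoded, max_length=None):
        return encoded

tok = patch_pad(LegacyTokenizer())
# Would raise TypeError without the patch; now padding_side is dropped.
result = tok._pad([1, 2], max_length=4, padding_side="left")
```

Applying the same patch_pad to the ChatGLM tokenizer right after AutoTokenizer.from_pretrained(..., trust_remote_code=True) should sidestep the TypeError; the other option is pinning transformers to an older version from before padding_side was forwarded to _pad.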
