TypeError: forward() got an unexpected keyword argument 'input_ids'

Hello everyone! I’m totally new to BERT and I’m trying to use it for topic modeling. My goal is to get sentence embeddings for my dataset and cluster them with KMeans to find topics. To keep the feature count manageable for KMeans, I added a Dense layer that reduces the embeddings to 256 features. However, I’m getting this error:

TypeError: forward() got an unexpected keyword argument 'input_ids'

Would anyone know why this is?

from sentence_transformers import SentenceTransformer, models
from torch import nn

word_embedding_model = models.Transformer('bert-base-multilingual-uncased', max_seq_length=256)
pooling_model = models.Pooling(word_embedding_model.get_word_embedding_dimension())
dense_model = models.Dense(in_features=pooling_model.get_sentence_embedding_dimension(),
                           out_features=256,
                           activation_function=nn.Tanh())

model = SentenceTransformer(modules=[word_embedding_model, pooling_model, dense_model])
from transformers import BertTokenizer

dataset = ['panic hits financial markets via',
 'alliance bolobarienne producteur officiel fakenews',
 'zit spurs dwars voorzorg tottenham hotspur',
 'fear unadulterated fear great evening became friends owners worried family restrictions far worse know resources strained people china scared told anything',
 'might get ravaged hell way go londonlife',
 'circulait inaperçu depuis semaines italie',
 'netizens cancelled',
 'langsam nerven berichte einfach kleinste zeitung senf geben dadurch einfach berichte verschiedenen meinungen entstehen apokalyptischem ausmaß anhören',
 'welp nice knowing guess matter time norcal f',
 'knvb geeft officieel statement',
 'gon nothing matters post boner gifs mods asleep',
 'andererseits frage viele leute unnötig wegen bissl schnupfen gerade schnupfen mal symptom hausarztptaxen verstopfen einfach froh hysterie abklingt hoffe nix abgeriegelt',
 'liva kuku tutaanza upya msimu ujao',
 'pidgin abeg make fear turn soap water',
 'patient gutem zustand kontaktpersonen häuslicher quarantäne',
 'theory going around mikepence deliberately set trump fail fighting seeing pics like really wonder whether truth rtoday covd',
 'could make perfect storm trump kill us rank incompetence rtoday',
 'malaysian interested purchasing harper bazaar china boost help purchase provide reading code later info please read following texts superm taeyong ten lucas mark',
 'iran trusted deal',
 'could follow mikepence yesterday wiped nose hands proceeded shake everyone hands pence course leading american response rtoday',
 'bundeswehr koblenz',
 'criminelles incohérences gouvernement',
 'bad things never go away country nigeria came nigeria',
 'trkl habite algrange',
 'newzealand confirms first case saying recent arrival iran traveled auckland via bali tested',
 'channeling trump go ahead worry situation control trust seriously going clean teeth anti mask',
 'deutschland innerhalb weniger stunden fälle bestätigt hessen hamburg via',
 'despite imposing strict filter gag order info statements real news coming risk level high',
 'ojalá preocuparan ponerse condon preocupan virus',
 'ever happened hong kong protests happening behind wuhan virus',
 'know say border walls wipe sterilize surfaces virus staying power outside body many miles wall supposed',
 'ever happened hong kong protests happening behind wuhan virus',
 'fears frap financial markets seagal settles sec case canada stop paying harry amp meghan protection latest news delano breakfast briefing friday',
 'pidgin abeg make fear turn soap water',
 'häusliche quarantäne bedeutet',
 'first case huh hundreds boarded flight pathetic thing china succeeded containing nigeria ever coronavirusupdates coronavirusnigeria',
 'nice knowing yall',
 'pidgin abeg make fear turn soap water coronavirusupdates',
 'forscher welt verwenden berkeley lights beacon plattform kampf',
 'mauvais santé coca vaut mieux bonne',
 'ever happened hong kong protests happening behind wuhan virus',
 'erst neues bon fickt richtig',
 'south china happy valley unbeaten hong kong first division',
 'guessing influenza similar thing even flu irrelevant much dangerous common flu',
 'luftverkehr china liegt bereits weitgehend lahm',
 'kundin hamsterkäufe gleiche kundin minuten später atemschutzmasken filter',
 'daily crunch facebook cancels f concerns microsoft',
 'versetzt globalisierung fieberschub via',
 'circulait inaperçu depuis semaines italie',
 'kaspersky mcafee enough']

tokenizer = BertTokenizer.from_pretrained('bert-base-multilingual-uncased')
encoded_input = tokenizer(dataset, return_tensors='pt', truncation=True, padding=True)
output = model(**encoded_input)
last_hidden_states_01 = output[0]
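In case it helps to see where I'm headed: this is the clustering step I'm aiming for once the embeddings work. Random vectors stand in for the real model output here, and `n_clusters=5` is just a placeholder value, not something I've tuned:

```python
import numpy as np
from sklearn.cluster import KMeans

# Stand-in for the real sentence embeddings: the model above would
# output one 256-dimensional vector per sentence (48 sentences here).
rng = np.random.default_rng(0)
embeddings = rng.standard_normal((48, 256)).astype("float32")

# Cluster the embedding vectors into topic groups.
kmeans = KMeans(n_clusters=5, n_init=10, random_state=0).fit(embeddings)
topic_labels = kmeans.labels_  # one topic id in 0..4 per sentence
```
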