eli5.map: tokenizer not defined

I am trying this tutorial.
The preprocess function runs fine when I test it on a portion of eli5["train"]. However, I think it runs out of memory when given the whole dataset. Now when I run eli5.map I get `NameError: name 'tokenizer' is not defined`. Could it be a memory issue? I modified the preprocess function to return only 10 items per example, like this: `return tokenizer([" ".join(x) for x in examples["answers.text"][0:10]])`, but I still get the same error.
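For context, this kind of `NameError` inside `map` usually means the `tokenizer` the preprocess function closes over was never created in the current session (for example after a kernel restart, or when the cell defining it was not re-run), rather than a memory problem. A minimal sketch of the scoping involved, using a toy stand-in tokenizer instead of a real `AutoTokenizer` (all names here are illustrative):

```python
# Stand-in for transformers.AutoTokenizer; the real tutorial would use
# tokenizer = AutoTokenizer.from_pretrained("distilgpt2")
def tokenizer(texts):
    # toy "tokenization": split each string on whitespace
    return {"input_ids": [t.split() for t in texts]}

# tokenizer must already exist in the notebook/module scope when this
# function is actually called by Dataset.map; otherwise Python raises
# NameError: name 'tokenizer' is not defined at call time
def preprocess_function(examples):
    return tokenizer([" ".join(x) for x in examples["answers.text"]])

batch = {"answers.text": [["hello", "world"], ["foo"]]}
out = preprocess_function(batch)
print(out["input_ids"])  # [['hello', 'world'], ['foo']]
```

Slicing the batch down to 10 items does not help because the name lookup fails before any data is processed.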

Found the solution here.