Both `max_new_tokens` and `max_length` have been set but they serve the same purpose

System Info

  • transformers version: 4.26.0
  • Platform: Linux-5.10.147+-x86_64-with-glibc2.29
  • Python version: 3.8.10
  • Huggingface_hub version: 0.12.0
  • PyTorch version (GPU?): 1.13.1+cu116 (False)
  • Tensorflow version (GPU?): 2.9.2 (False)
  • Flax version (CPU?/GPU?/TPU?): not installed (NA)
  • Jax version: not installed
  • JaxLib version: not installed

I was setting up and using the Galactica language model and hit this error:
ValueError: Both max_new_tokens and max_length have been set but they serve the same purpose – setting a limit to the generated output length. Remove one of those arguments. Please refer to the documentation for more information.
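The error comes from a validation step inside `transformers`' `generate()`: both parameters cap the output length, so passing both is rejected. A minimal sketch of that check (simplified; the real logic lives in `GenerationMixin`/`GenerationConfig`, and the galai wrapper appears to set one of these parameters internally, which is an assumption based on the traceback):

```python
# Simplified sketch of the length-argument validation that
# transformers' generate() performs before decoding starts.
def validate_length_args(max_length=None, max_new_tokens=None):
    if max_length is not None and max_new_tokens is not None:
        raise ValueError(
            "Both max_new_tokens and max_length have been set but they "
            "serve the same purpose - setting a limit to the generated "
            "output length. Remove one of those arguments."
        )
    # Only one limit survives, so there is no ambiguity downstream.
    return max_new_tokens if max_new_tokens is not None else max_length
```

So if the wrapper already supplies one of the two limits, any user-supplied value for the other one triggers the `ValueError` above.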

!pip install galai
import galai as gal
from galai.notebook_utils import *

model = gal.load_model("base")
model.generate("The Transformer architecture [START_REF]")  # <- the error is raised here

The same error came up when running this:

prompt = f"Question: A bat and a ball cost $\\$1.10$ in total. The bat costs $\\$1.00$ more than the ball. How much does the ball cost?\n\nAnswer:"
display_markdown(model.generate(prompt, new_doc=True, max_length=250))
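In `transformers`, `max_length` bounds the prompt plus the generated tokens, while `max_new_tokens` bounds only the generated tokens. A hedged sketch of converting a total-length budget into a `max_new_tokens` value (the helper name and the token counts are illustrative, not part of galai):

```python
def to_max_new_tokens(desired_total_length, prompt_token_count):
    # max_length counts the prompt plus the generated tokens, while
    # max_new_tokens counts only the generated tokens, so subtract
    # the prompt length (clamped at zero so the result stays valid).
    return max(desired_total_length - prompt_token_count, 0)
```

Assuming galai forwards generation kwargs to `transformers`' `generate()`, passing only `max_new_tokens` (and dropping `max_length`) should avoid the `ValueError`; that is an assumption worth checking against galai's `generate` signature for the installed version.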