Don't hit the maximum tokens length 4097 in Lesson "Getting Started with Llama 2"

Hi, I'm trying the course locally with the Llama 2 model
(`ollama run llama2`).

But when I execute the part that, in the videos, triggers the max_tokens limit of 4097 and produces an error, I don't get one. Is there a reason for this? Shouldn't a local run behave the same as the Together.ai setup used in the course? Any explanation for why I don't hit this max-tokens limit when using the same model and prompt?
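For context, here is a minimal sketch of what I understand the hosted limit to mean. This assumes that 4097 is the model's total context window (prompt tokens plus completion tokens) and that Together.ai rejects a request when that budget is exceeded, whereas a local Ollama server may instead silently truncate the prompt; the function name `exceeds_context` is hypothetical, just for illustration:

```python
# Assumption: 4097 is the total token budget shared by prompt and completion.
CONTEXT_WINDOW = 4097

def exceeds_context(prompt_tokens: int, max_tokens: int,
                    context_window: int = CONTEXT_WINDOW) -> bool:
    """Return True when a hosted API would presumably refuse the request."""
    return prompt_tokens + max_tokens > context_window

# A long prompt plus a generous max_tokens overflows the window...
print(exceeds_context(3500, 1024))  # True  -> hosted API raises an error
# ...while a shorter request fits.
print(exceeds_context(3000, 1024))  # False -> request goes through
```

If Ollama really does truncate the prompt instead of erroring, that would explain why the same prompt succeeds locally.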

Thanks

1 Like