Error in "Exploring LLM capabilities" lab

It says “The function generate_with_single_input takes as input a prompt, role, top_k, temperature, max_tokens and model name.”.

But the parameter is actually called “top_p”, not “top_k”.


Indeed, both top_p and top_k are described in the lab, but the parameter for generate_with_single_input should be top_p, a float value controlling the randomness of the output.
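For anyone confused by the difference: top_k keeps a fixed integer number of candidate tokens, while top_p (nucleus sampling) keeps the smallest set of tokens whose cumulative probability reaches a float threshold, which is why the lab's parameter is a float. Here is a minimal illustrative sketch of top-p filtering (not the lab's actual implementation; the function name and shapes are just for illustration):

```python
def top_p_filter(probs, top_p):
    """Keep the smallest set of tokens whose cumulative probability
    reaches top_p; return their indices, highest probability first."""
    # Sort token indices by descending probability.
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    kept, cumulative = [], 0.0
    for i in order:
        kept.append(i)
        cumulative += probs[i]
        if cumulative >= top_p:
            break
    return kept

# With top_p=0.9, the two most likely tokens (0.5 + 0.4 = 0.9)
# are kept and the low-probability tail is discarded.
print(top_p_filter([0.5, 0.4, 0.05, 0.05], 0.9))  # -> [0, 1]
```

A top_k filter, by contrast, would simply take the first k indices of the sorted list regardless of how much probability mass they cover.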

I’ve logged an issue for this typo, thanks for the heads up @billyboe!