LLM API: the meaning and business impact of the temperature / top_p / repetition_penalty parameters

Could you please explain further how these parameters are chosen and what impact they actually have? From the current course demo, I'd like to understand how the default values were selected and what effect they produce. For example, with top_p = 0.9, I understand that sampling is restricted to the tokens whose cumulative probability reaches 0.9, but what are the pros and cons of that choice, and how does it shape the generated result? I would also appreciate a demo or business case, such as customized/creative marketing content generation based on customer insights, to illustrate the interpretation. Many thanks!

Hi Leory,
The default parameters are the ones the API applies when you don't set them explicitly; they vary by provider, but typically top_p defaults to 1.0 (no truncation) and temperature to a value around 1.0. Setting temperature to 0 is a special case that makes the output effectively deterministic (greedy decoding), which is useful when you want reproducible answers rather than creative variation.
The two parameters, temperature and top_p, are usually configured together. Temperature reshapes the probability distribution: values below 1 sharpen it toward the most likely tokens, while values above 1 flatten it and increase diversity. top_p (nucleus sampling) then limits the set of candidate tokens to the smallest group whose cumulative probability reaches the threshold. Together they let us manage randomness and prevent the model from generating inconsistent text.
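To make the interplay concrete, here is a minimal sketch of how temperature scaling and nucleus (top_p) filtering combine during sampling. This is an illustration, not the code of any specific API: the function name and the dict-of-logits representation are my own simplification.

```python
import math
import random

def sample_next_token(logits, temperature=0.7, top_p=0.9, rng=None):
    """Illustrative sketch: temperature scaling, then nucleus (top_p) sampling.

    `logits` is a hypothetical dict mapping token -> raw model score.
    """
    rng = rng or random.Random()
    # Temperature rescales the logits: <1 sharpens the distribution
    # toward the top tokens, >1 flattens it (more randomness).
    scaled = {tok: score / max(temperature, 1e-6) for tok, score in logits.items()}
    # Softmax to turn scores into probabilities.
    m = max(scaled.values())
    exps = {tok: math.exp(s - m) for tok, s in scaled.items()}
    total = sum(exps.values())
    probs = {tok: e / total for tok, e in exps.items()}
    # Nucleus filtering: keep the smallest set of tokens whose cumulative
    # probability reaches top_p; everything else is excluded from sampling.
    kept, cum = [], 0.0
    for tok, p in sorted(probs.items(), key=lambda kv: kv[1], reverse=True):
        kept.append((tok, p))
        cum += p
        if cum >= top_p:
            break
    # Renormalize over the kept tokens and draw one at random.
    z = sum(p for _, p in kept)
    r, acc = rng.random() * z, 0.0
    for tok, p in kept:
        acc += p
        if acc >= r:
            return tok
    return kept[-1][0]
```

Note the practical consequence: with a low top_p, a single dominant token can fill the whole nucleus, so the output becomes repetitive but safe; a high top_p admits rarer tokens, which is what you want for creative marketing copy at the cost of occasional off-topic phrasing.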
In the course notebook, you will find an exercise with good combinations of values; these can serve as a reference. However, the most appropriate values for your particular use case will have to be tuned empirically on the case itself, although they will almost certainly be close to the exercise parameters.
Additionally, you can use a repetition penalty, which lowers the scores of tokens the model has already generated so it is less likely to repeat itself, but this only helps in certain use cases.
In the notebook exercise, I’ve included an image of the section with the best values, enclosed in a yellow rectangle.

Best of luck!
