Which packages and libraries should we install if we want to use the helper functions and import print_llm_response?

In the week 1 lessons where we get to experiment with LLM responses, it appears we’re using a custom library from which print_llm_response is imported. Is there a package or library I can install on my home machine to experiment with in VS Code?

I’m not a mentor for this course.
Does this help?


Hi, welcome to the community.
You can find the helper_functions.py file by clicking on File → Open and then selecting and downloading the file, as shown in the screenshots.

This opens a new tab, where you can select and download the helper_functions.py file.

Note that it requires local setup: you need to add your own API keys, some of which require payment. Getting it running locally takes a fair amount of effort on your part, so the easiest option is simply to use the learning platform.
Here’s roughly what the file looks like: a thin wrapper around the OpenAI client.
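As a minimal sketch, assuming the current openai Python package (pip install openai), it could look something like this; the actual course file may use a different model name and include extra helpers:

```python
# helper_functions.py -- a minimal sketch, not the exact course file.
# Assumes the openai package and an OPENAI_API_KEY environment variable;
# the model name below is an assumption, not something specified by the course.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def print_llm_response(prompt):
    """Send a prompt to the model and print the text of its reply."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    print(response.choices[0].message.content)
```

With a key in place, calling print_llm_response("What is the capital of France?") prints the model’s answer.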


Thank you so much for guiding me through this. If I have a $20 subscription to ChatGPT, does this give me access to the OpenAI models through the API? Or do I need a developer subscription to use this?

I’ll take your advice and for now will only use the learning platform to experiment with the function.


No. The $20 ChatGPT Plus subscription does not include API access. To use the API client you need a separate account on the OpenAI developer platform, which is billed on its own (pay-as-you-go), independently of ChatGPT.
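In other words, the API client authenticates with an API key created on the developer platform, not with your ChatGPT login. A small illustrative check (OPENAI_API_KEY is the variable name the openai package reads by default; the check itself is just a sketch):

```python
# Illustration only: the API client needs a key from the OpenAI developer
# platform, which is billed separately from a ChatGPT Plus subscription.
import os

if "OPENAI_API_KEY" not in os.environ:
    raise RuntimeError(
        "No OPENAI_API_KEY set. A ChatGPT Plus login cannot be used here; "
        "create an API key on the developer platform instead."
    )
print("API key found; the OpenAI client can use it.")
```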


Hi @TJ2026,

As Lukman suggested, for now the best thing is to use the platform resources to go through the course content. But you can keep downloading the notebooks and all of the relevant files for later use.

In module 4, you’ll get a brief introduction to setting all of this up locally, using an external library to help with the setup. That’s when you can start trying to run things locally.
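For when you get there, here is a sketch of what a local run could look like. It assumes you’ve downloaded helper_functions.py into the same folder as your script, installed openai and python-dotenv (the specific helper library introduced in module 4 may be different), and put your key in a local .env file:

```python
# A sketch of running the downloaded helper locally, under the assumptions above.
# The .env file would contain a line such as: OPENAI_API_KEY=<your key>
from dotenv import load_dotenv

load_dotenv()  # load OPENAI_API_KEY from .env into the environment

# Imported after load_dotenv() so the helper's OpenAI client can find the key.
from helper_functions import print_llm_response

print_llm_response("Write one sentence about learning Python.")
```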

Best,
Mubsi