Build LLM prompt with variables - AI Python for Beginners: Basics of AI Python Coding

Could you please confirm whether it is still possible to use this functionality in a Jupyter notebook?
If yes, could you please provide a reference?
It looks like OpenAI released a Jupyter Code Assistant with innovative abilities to assist with tasks specific to Jupyter Notebooks. This is available with ChatGPT Plus, not for free.
Many thanks
OF8

@chris.favila can you check this?

Hi, is anyone up for helping me out with helper_functions and print_llm_responses?

Following are the steps I have followed so far:

Step 1. First I tried to run "from helper_functions import print_llm_responses". I got the error message "No module named 'helper_functions'".
Step 2. Then I asked Copilot how to get out of this problem. Copilot suggested trying "pip install helper_functions", so I tried that as well.
Step 3. Then this error message popped up:
" error: subprocess-exited-with-error
python setup.py egg_info did not run successfully.
exit code: 1

[15 lines of output]
The 'sklearn' PyPI package is deprecated, use 'scikit-learn'
rather than 'sklearn' for pip commands.

Here is how to fix this error in the main use cases:

- use 'pip install scikit-learn' rather than 'pip install sklearn'
- replace 'sklearn' by 'scikit-learn' in your pip requirements files
  (requirements.txt, setup.py, setup.cfg, Pipfile, etc ...)
- if the 'sklearn' package is used by one of your dependencies,
  it would be great if you take some time to track which package uses
  'sklearn' instead of 'scikit-learn' and report it to their issue tracker
- as a last resort, set the environment variable
  SKLEARN_ALLOW_DEPRECATED_SKLEARN_PACKAGE_INSTALL=True to avoid this error

More information is available at
https://github.com/scikit-learn/sklearn-pypi-package
[end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed

Encountered error while generating package metadata.

See above for output.

note: This is an issue with the package mentioned above, not pip.
hint: See above for details."

Step 4. Again, I had no idea what this error message meant. I tried Copilot again, but it never returned a promising reply.

Step 5. Then, following the error message, I tried "pip install scikit-learn" as well. This completed successfully. However, when I then tried "pip install helper_functions" again, the same error message popped up.
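A side note that may save time here: the helper_functions package on PyPI is an unrelated project, and the output above suggests it pulls in the deprecated sklearn package, which is why the install keeps failing. The course's helper_functions.py is a plain file kept next to the notebooks, not something pip installs. One quick way to check which module Python would actually import (a small sketch, meant to be run in a notebook cell):

import importlib.util

# Where would "import helper_functions" be loaded from?
# None means it is not on the current path at all.
spec = importlib.util.find_spec("helper_functions")
print(spec.origin if spec else "helper_functions not found on sys.path")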

Step 6. Seeing no way around this, I hopped into the community discussions for this course, first into "How to have helper_functions in python" (which is locked, by the way). There I read the discussion between Leyu_Shi, Rajankrsna, and Mubsi.

Step 7. As per their discussion, there is already a helper_functions.py file in the lab workspace. I found it, downloaded the file, and kept it in the same directory where all my other Jupyter Lab files are on my system.
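A quick way to confirm the file really is where the notebook is looking (a small sketch, assuming it is run in a notebook cell in the folder where helper_functions.py was saved):

import os

# The notebook's current working directory
print(os.getcwd())

# Should print True if helper_functions.py sits next to the notebook
print(os.path.exists("helper_functions.py"))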

Step 8. Then, when I tried to run the script "from helper_functions ..." again, I got this error:

ModuleNotFoundError Traceback (most recent call last)
Cell In[4], line 1
----> 1 from helper_functions import print_llm_reponses

File ~\helper_functions.py:5
2 import os
4 from openai import OpenAI
----> 5 from dotenv import load_dotenv
7 import random
9 #Get the OpenAI API key from the .env file

ModuleNotFoundError: No module named 'dotenv'

Step 9. Then I fixed that problem by installing the dotenv module with this command:

pip install python-dotenv
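One caution with this step: if pip runs in a different Python environment than the Jupyter kernel, the module can still be missing inside the notebook. Running the install from a notebook cell with the %pip magic installs into the kernel's own environment (a sketch, assuming an IPython/Jupyter kernel):

# Run in a notebook cell; %pip targets the environment of the current kernel
%pip install python-dotenv

# After installing, restart the kernel, then this import should succeed:
from dotenv import load_dotenv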

Step 10. I ran the same script, "from helper_functions import print_llm_reponses", again. Now this error is generated:

"---------------------------------------------------------------------------
OpenAIError Traceback (most recent call last)
Cell In[9], line 1
----> 1 from helper_functions import print_llm_reponses
File ~\helper_functions.py:14
11 openai_api_key = os.getenv('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')
13 # Set up the OpenAI client
---> 14 client = OpenAI(api_key=openai_api_key)
17 def print_llm_response(prompt):
18 """This function takes as input a prompt, which must be a string enclosed in quotation marks,
19 and passes it to OpenAI's GPT3.5 model. The function then prints the response of the model.
20 """

File ~\anaconda3\lib\site-packages\openai\_client.py:105, in OpenAI.__init__(self, api_key, organization, project, base_url, timeout, max_retries, default_headers, default_query, http_client, _strict_response_validation)
103 api_key = os.environ.get("OPENAI_API_KEY")
104 if api_key is None:
--> 105 raise OpenAIError(
106 "The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable"
107 )
108 self.api_key = api_key
110 if organization is None:

OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
"

Step 11. Now I also got the OpenAI API key and inserted the key into the script below, which is at the top of the helper_functions.py file:
import gradio as gr
import os

from openai import OpenAI
from dotenv import load_dotenv

import random

#Get the OpenAI API key from the .env file
load_dotenv('.env', override=True)
openai_api_key = os.getenv("<here I inserted the OpenAI API key>")

Step 12. I am still not getting the appropriate result.
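A likely reason the result does not change, judging from the snippet in Step 11: os.getenv() looks up an environment variable by name, so it expects something like 'OPENAI_API_KEY', not the key string itself. When the name it is given is not defined anywhere, it returns None, and the OpenAI client then raises the same missing-key error. A quick check (a sketch, run in a notebook cell):

import os
from dotenv import load_dotenv

load_dotenv('.env', override=True)

# Passing the raw key string returns None, because no environment
# variable has that name:
print(os.getenv('sk-proj-...'))

# This returns the key only if .env contains a line OPENAI_API_KEY=...
print(os.getenv('OPENAI_API_KEY'))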

Step 13. So that is my course of action so far.

Step 14. Now I need the help of a mentor to get me out of this problem.

Thank you for reading this message

I am posting again: due to this same issue, I am not able to do Lesson 10 either.

In Lesson 10, I have written "from helper_functions import *".
And this error message is generated in return, leaving my learning only partially understood.

Error message: "OpenAIError Traceback (most recent call last)
Cell In[14], line 1
----> 1 from helper_functions import *

File ~\helper_functions.py:14
11 openai_api_key = os.getenv('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')
13 # Set up the OpenAI client
---> 14 client = OpenAI(api_key=openai_api_key)
17 def print_llm_response(prompt):
18 """This function takes as input a prompt, which must be a string enclosed in quotation marks,
19 and passes it to OpenAI's GPT3.5 model. The function then prints the response of the model.
20 """

File ~\anaconda3\lib\site-packages\openai\_client.py:105, in OpenAI.__init__(self, api_key, organization, project, base_url, timeout, max_retries, default_headers, default_query, http_client, _strict_response_validation)
103 api_key = os.environ.get("OPENAI_API_KEY")
104 if api_key is None:
--> 105 raise OpenAIError(
106 "The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable"
107 )
108 self.api_key = api_key
110 if organization is None:

OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable "

I have also mentioned this same message in my previous post.
Need help please

hi @Nikunja

Sorry for the delayed response. Please check the link below to address your issue.
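In case the link alone does not cover it, here is a minimal sketch of the usual pattern, based on the helper_functions.py snippets you posted (the key value is shown as a placeholder): the key itself goes into a .env file next to the notebook, and os.getenv() is called with the variable name 'OPENAI_API_KEY' rather than the key string.

# .env file (one line, same folder as the notebook and helper_functions.py):
# OPENAI_API_KEY=sk-...your-key-here...

import os
from openai import OpenAI
from dotenv import load_dotenv

# Read the .env file and put OPENAI_API_KEY into the environment
load_dotenv('.env', override=True)

# Look the key up by its variable name, not by the key string
openai_api_key = os.getenv('OPENAI_API_KEY')

# Set up the OpenAI client
client = OpenAI(api_key=openai_api_key)

After editing .env, restart the kernel and re-run the import so helper_functions.py picks up the change.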

Let me know if this helps or not!!

Regards
DP