L1: Semantic Kernel is like your AI cooking kitchen: cant' connect to Azure

Using the provided code, I can't make a connection between Google Colab and Azure.

I get this error: TypeError: AzureOpenAI.__init__() got an unexpected keyword argument 'deployment'

code:

import os

import semantic_kernel as sk
from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion, OpenAIChatCompletion
from openai import AzureOpenAI

useAzureOpenAI = True

from dotenv import load_dotenv, find_dotenv
_ = load_dotenv(find_dotenv())

client = AzureOpenAI(
  api_key = os.getenv("AZURE_OPENAI_API_KEY"),
  azure_endpoint = os.getenv("AZURE_OPENAI_ENDPOINT"),
  deployment = os.getenv("AZURE_OPENAI_DEPLOYMENT_NAME"),  # this is the keyword the TypeError complains about
  api_version = "2023-07-01-preview" 
)

kernel = sk.Kernel()

if useAzureOpenAI:
    deployment, api_key, azure_endpoint = sk.azure_openai_settings_from_dot_env()
    kernel.add_text_completion_service("azureopenai", AzureChatCompletion(deployment, azure_endpoint, api_key))
else:
    api_key, org_id = sk.openai_settings_from_dot_env()
    kernel.add_text_completion_service("openai", OpenAIChatCompletion("gpt-3.5-turbo-0301", api_key, org_id))

print("You made a kernel!")

I set useAzureOpenAI to True because Azure is the path I want to use (unlike in the course).
It does connect via Hugging Face but I want to get it to work via Azure.

The deployment name and the model name are identical: gpt-35-turbo (from what I've read here and there, a mismatch between the two seems to cause issues from time to time).
I've checked the .env file several times: the deployment name is in there.
Any ideas?
As far as I can tell, it could be related to a package version issue?
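
For what it's worth, in the openai 1.x client the keyword seems to be azure_deployment rather than deployment, so maybe the client is supposed to be built roughly like this (just a guess on my side, not verified):

import openai
print(openai.__version__)  # check which openai package version Colab actually installed

client = AzureOpenAI(
  api_key = os.getenv("AZURE_OPENAI_API_KEY"),
  azure_endpoint = os.getenv("AZURE_OPENAI_ENDPOINT"),
  azure_deployment = os.getenv("AZURE_OPENAI_DEPLOYMENT_NAME"),  # guessed keyword for openai>=1.0
  api_version = "2023-07-01-preview"
)

Even if that gets the client created, I'm not sure it matters for the kernel part, since the if block only uses sk.azure_openai_settings_from_dot_env() and never touches client.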
