Running one's own instance of Azure OpenAI

Hi team,
It seems that spinning up an Azure OpenAI instance is limited to enterprise companies and Microsoft partners.
Is this accurate?
Is there no way for us to create our own instance?
If so, is there a better alternative?
For example, is it possible to use ChatGPT directly through LangChain and complete the same course content examples in our own environments?

Kind regards,
Alex

I think I’ve answered this for myself…

Here’s an example from lesson 2, without needing an Azure stack:

import os

import pandas as pd
from IPython.display import Markdown, HTML, display
from langchain.schema import HumanMessage
from langchain_openai import ChatOpenAI

llm_model = "gpt-4"
llm = ChatOpenAI(temperature=0.0, model=llm_model)

# Load the lesson's Covid dataset, replacing missing values with 0
df = pd.read_csv("./data/all-states-history.csv").fillna(value=0)

from langchain.agents.agent_types import AgentType
from langchain_experimental.agents.agent_toolkits import create_pandas_dataframe_agent

agent = create_pandas_dataframe_agent(
    llm=llm,
    df=df,
    verbose=True,
    allow_dangerous_code=True,  # required: the agent executes generated Python
)

agent.invoke("how many rows are there?")
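For anyone curious what the agent is doing under the hood: it just generates and executes pandas code against the DataFrame. A minimal sketch of the equivalent, using a hypothetical toy DataFrame (not the course CSV):

```python
import pandas as pd

# Hypothetical miniature stand-in for ./data/all-states-history.csv
df = pd.DataFrame({
    "state": ["AK", "AL", "AR"],
    "positive": [100.0, 200.0, None],
}).fillna(value=0)

# For "how many rows are there?" the agent runs code along these lines:
row_count = len(df)  # equivalent to df.shape[0]
print(row_count)  # 3
```

That's also why `allow_dangerous_code=True` is required: the agent really does run arbitrary generated Python in your environment.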

So, yes, it appears that the MS stack is not available for personal use. I wonder whether the other topics use MS-specific features.

Good one. I used a different setup to work directly with the OpenAI API:

import os
import json

from openai import OpenAI

client = OpenAI(api_key=os.environ.get("OPENAI_API_KEY"))
OAI_model = "gpt-4o"

assistant = client.beta.assistants.create(
    instructions="""You are an assistant answering questions
    about a Covid dataset.""",
    model=OAI_model,
    tools=Helper.tools_sql,
    # ... etc etc ...

I was able to get most of the lesson to run with some minor changes based on the OpenAI documentation, but I ran into timeouts and other challenges; I believe the Azure and OpenAI codebases differ slightly.

See:
https://platform.openai.com/docs/assistants/overview