Hi, I don't want to use together.ai: I'm a teacher, and at the institute where I work I can't ask my students to register, since they are minors and the laws of my country don't allow it.
So is there an easy way to follow the course without together.ai? The helper function used in the course is defined as:
def llama(prompt,
          add_inst=True,
          model="togethercomputer/llama-2-7b-chat",
          temperature=0.0,
          max_tokens=1024,
          verbose=False,
          url=url,
          headers=headers,
          base=2,  # number of seconds to wait
          max_tries=3):
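(For context: the `base` and `max_tries` parameters suggest the helper retries failed requests with exponential backoff. The exact logic lives in utils.py, so this is only a guess at what those two parameters do, sketched as a generic retry wrapper:)

```python
import time


def with_retries(call, base=2, max_tries=3):
    """Call `call()` up to max_tries times, sleeping base**attempt seconds
    between failures (exponential backoff). Re-raises on the last failure."""
    for attempt in range(1, max_tries + 1):
        try:
            return call()
        except Exception:
            if attempt == max_tries:
                raise
            time.sleep(base ** attempt)
```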
I guess there might be a way to use it with a local Llama, for example via the openai Python library pointed at Ollama:
from openai import OpenAI

prompt = "What is the capital of France?"

client = OpenAI(
    base_url='http://localhost:11434/v1/',
    # required but ignored
    api_key='ollama',
)

# PERHAPS SOME CODE GOES HERE

chat_completion = client.chat.completions.create(
    messages=[
        {
            'role': 'user',
            'content': prompt,
        }
    ],
    model='llama2',
    # WHAT CODE GOES HERE, PERHAPS SOME PARAMS ???
)
print(chat_completion.choices[0].message.content)
Sorry for my bad English, but I would like (or really need) help "translating" the llama function defined above (the full code is in utils.py) into something I can run locally. Using the OpenAI compatibility layer would be great; if not, the "normal" ollama Python library would also work.
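In case it helps to be concrete, here is a minimal sketch of what that translation could look like. It assumes Ollama is serving its OpenAI-compatible API at http://localhost:11434/v1/ and that a llama2 model has already been pulled; the [INST] wrapping is only my assumption about what add_inst=True does inside the course's utils.py:

```python
def wrap_inst(prompt, add_inst=True):
    # Llama-2 chat models expect the instruction wrapped in [INST] tags;
    # ASSUMPTION: this mirrors what add_inst=True does in the course helper.
    return f"[INST]{prompt}[/INST]" if add_inst else prompt


def llama(prompt,
          add_inst=True,
          model="llama2",  # the model name used with `ollama pull`
          temperature=0.0,
          max_tokens=1024,
          verbose=False,
          base_url="http://localhost:11434/v1/"):
    # Imported here so wrap_inst above can be used without the openai package.
    from openai import OpenAI

    client = OpenAI(base_url=base_url, api_key="ollama")  # key required but ignored
    text = wrap_inst(prompt, add_inst)
    if verbose:
        print(f"Prompt:\n{text}\n")
    response = client.chat.completions.create(
        messages=[{"role": "user", "content": text}],
        model=model,
        temperature=temperature,
        max_tokens=max_tokens,
    )
    return response.choices[0].message.content
```

With Ollama running, `print(llama("What is the capital of France?"))` should then behave like the course helper; the `url`/`headers` parameters of the original are no longer needed because the OpenAI client handles the HTTP details.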
Thanks in advance