BasicModelRunner("meta-llama/Llama-2-7b-hf") => ModelNameError: Not Found

non_finetuned = BasicModelRunner("meta-llama/Llama-2-7b-hf")


HTTPError Traceback (most recent call last)
File /usr/local/lib/python3.9/site-packages/llama/program/util/run_ai.py:134, in powerml_send_query_to_url(params, route)
131 response = requests.post(
132 url=url + route, headers=headers, json=params, timeout=200
133 )
--> 134 response.raise_for_status()
135 except requests.exceptions.Timeout:

File /usr/local/lib/python3.9/site-packages/requests/models.py:1021, in Response.raise_for_status(self)
1020 if http_error_msg:
--> 1021 raise HTTPError(http_error_msg, response=self)

HTTPError: 404 Client Error: Not Found for url: http://internal-aws-prod-internal-revproxy-alb-11660607.us-west-1.elb.amazonaws.com/rev-proxy/lamini/v1/llama/run_program

During handling of the above exception, another exception occurred:

ModelNameError Traceback (most recent call last)
Cell In[3], line 1
----> 1 non_finetuned_output = non_finetuned("Tell me how to train my dog to sit")

File /usr/local/lib/python3.9/site-packages/llama/runners/basic_model_runner.py:50, in BasicModelRunner.__call__(self, inputs)
47 else:
48 # Singleton
49 input_objects = Input(input=inputs)
--> 50 output_objects = self.llm(
51 input=input_objects,
52 output_type=Output,
53 model_name=self.model_name,
54 enable_peft=self.enable_peft,
55 )
56 if isinstance(output_objects, list):
57 outputs = [o.output for o in output_objects]

File /usr/local/lib/python3.9/site-packages/llama/program/builder.py:85, in Builder.__call__(self, input, output_type, *args, **kwargs)
83 else:
84 value = self.add_model(input, output_type, *args, **kwargs)
--> 85 result = gen_value(value)
86 return result

File /usr/local/lib/python3.9/site-packages/llama/program/util/api_actions.py:203, in gen_value(value)
202 def gen_value(value: Value):
--> 203 value._compute_value()
204 return value._data

File /usr/local/lib/python3.9/site-packages/llama/program/value.py:65, in Value._compute_value(self)
60 else:
61 params = {
62 "program": self._function.program.to_dict(),
63 "requested_values": [self._index],
64 }
--> 65 response = query_run_program(params)
67 response.raise_for_status()
69 # update the cache

File /usr/local/lib/python3.9/site-packages/llama/program/util/run_ai.py:11, in query_run_program(params)
10 def query_run_program(params):
--> 11 resp = powerml_send_query_to_url(params, "/v1/llama/run_program")
12 return resp

File /usr/local/lib/python3.9/site-packages/llama/program/util/run_ai.py:143, in powerml_send_query_to_url(params, route)
141 except Exception:
142 json_response = {}
--> 143 raise llama.error.ModelNameError(
144 json_response.get("detail", "ModelNameError")
145 )
146 if response.status_code == 429:
147 try:

ModelNameError: Not Found
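The two stacked tracebacks above are Python's implicit exception chaining: lamini's client catches the raw `HTTPError` from `requests` and re-raises it as its own `ModelNameError`, which is why a plain 404 surfaces under a library-specific name. A minimal sketch of that pattern (the class and function bodies here are hypothetical stand-ins, not lamini's actual internals):

```python
class ModelNameError(Exception):
    """Stand-in for llama.error.ModelNameError in the trace above."""

class HTTPError(Exception):
    """Stand-in for requests.exceptions.HTTPError."""
    def __init__(self, msg, status_code):
        super().__init__(msg)
        self.status_code = status_code

def send_query(status_code):
    # Mimics powerml_send_query_to_url: a 4xx response raises HTTPError,
    # which the client translates into its own ModelNameError. Raising
    # inside the except block is what produces the "During handling of
    # the above exception, another exception occurred" output.
    try:
        if status_code >= 400:
            raise HTTPError(f"{status_code} Client Error", status_code)
    except HTTPError:
        raise ModelNameError("Not Found")
```

So the 404 from the reverse proxy is the root cause, and `ModelNameError: Not Found` is just its translated form.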


Oh, I have found a solution: !pip install --upgrade --force-reinstall --ignore-installed lamini

Now it works …


Hi @ihorrible

Welcome to the community.

Well done troubleshooting this exception.

Best regards

Hello everyone!
This isn't working for me :slight_smile:
I got a "/usr/bin/sh: 1: pip: not found" error.

Any other ideas?


You could try:
%pip install --upgrade --force-reinstall --ignore-installed lamini

and then restart the kernel. I got a 500 Internal Server Error myself. Did you solve it?
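On the "pip: not found" error: `!pip` spawns a shell, so it fails whenever `pip` isn't on the shell's PATH, while `%pip` runs pip through the notebook kernel's own interpreter. A small sketch of that fallback, assuming a standard Python install (the `pip_command` helper is hypothetical, not part of any library):

```python
import shutil
import subprocess
import sys

def pip_command(*args):
    """Build a pip invocation that works even when `pip` is not on PATH.

    Calling pip as a module of the running interpreter (`python -m pip`)
    is roughly what the %pip magic does, and it also guarantees the
    package lands in the same environment the kernel imports from.
    """
    if shutil.which("pip"):
        return ["pip", *args]
    # Fall back to the kernel's own interpreter.
    return [sys.executable, "-m", "pip", *args]

# The reinstall command from this thread, built portably:
cmd = pip_command("install", "--upgrade", "--force-reinstall",
                  "--ignore-installed", "lamini")
print(cmd)
# subprocess.check_call(cmd)  # uncomment to actually run the install
```

Installing into `sys.executable`'s environment also avoids the common trap where `!pip` installs into a different Python than the one the notebook is running.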

/usr/bin/sh: 1: pip: not found

Could you please help with that?