Hi, I'm getting this error despite restoring the original version:
Warning: You are sending unauthenticated requests to the HF Hub. Please set a HF_TOKEN to enable higher rate limits and faster downloads.
Could not cache non-existence of file. Will ignore error and continue. Error: [Errno 30] Read-only file system: '.models/models--BAAI--bge-base-en-v1.5/.no_exist/a5beb1e3e68b9ab74eb54cfd186867f64f240e1a/adapter_config.json'
Can you post a screenshot of the error and the code where you encountered this issue?
Hi @Anjan1970,
If you read that error closely, it says it will ignore the error and continue, and the cell still runs successfully and imports the Flask app.
OSError: [Errno 30] Read-only file system indicates that the program is attempting to write to a directory or file system that is mounted read-only.
So this error shouldn't cause any problem with the lab code.
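If the warning still bothers you, one possible workaround (a sketch only, not part of the lab; the cache location below is my assumption) is to point the Hugging Face cache at a writable directory before any Hugging Face library is imported:

```python
import os
import tempfile

# Sketch: redirect the Hugging Face cache to a writable location so the
# library no longer tries to write under the read-only ".models" directory.
# HF_HOME must be set before importing transformers/huggingface_hub.
writable_cache = os.path.join(tempfile.gettempdir(), "hf_cache")
os.makedirs(writable_cache, exist_ok=True)
os.environ["HF_HOME"] = writable_cache
```

But again, since the cell runs fine, this is purely cosmetic.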
Hi Preeti. Thanks. But when I run:
response = generate_with_single_input(**kwargs)
print(response['content'])
Here is the error I get:
InvalidRequestError Traceback (most recent call last)
Cell In[5], line 2
1 # Generating
----> 2 response = generate_with_single_input(**kwargs)
3 print(response['content'])
File ~/work/utils.py:213, in generate_with_single_input(prompt, role, top_p, temperature, max_tokens, model, together_api_key, **kwargs)
211 together_api_key = os.environ['TOGETHER_API_KEY']
212 client = Together(api_key = together_api_key)
--> 213 json_dict = client.chat.completions.create(**payload).model_dump()
214 json_dict['choices'][-1]['message']['role'] = json_dict['choices'][-1]['message']['role'].name.lower()
215 try:
File /usr/local/lib/python3.12/site-packages/together/resources/chat/completions.py:141, in ChatCompletions.create(self, messages, model, max_tokens, stop, temperature, top_p, top_k, repetition_penalty, presence_penalty, frequency_penalty, min_p, logit_bias, seed, stream, logprobs, echo, n, safety_model, response_format, tools, tool_choice, **kwargs)
112 requestor = api_requestor.APIRequestor(
113 client=self._client,
114 )
116 parameter_payload = ChatCompletionRequest(
117 model=model,
118 messages=messages,
(…) 138 **kwargs,
139 ).model_dump(exclude_none=True)
--> 141 response, _, _ = requestor.request(
142 options=TogetherRequest(
143 method="POST",
144 url="chat/completions",
145 params=parameter_payload,
146 ),
147 stream=stream,
148 )
150 if stream:
151 # must be an iterator
152 assert not isinstance(response, TogetherResponse)
File /usr/local/lib/python3.12/site-packages/together/abstract/api_requestor.py:249, in APIRequestor.request(self, options, stream, remaining_retries, request_timeout)
231 def request(
232 self,
233 options: TogetherRequest,
(…) 240 str | None,
241 ]:
242 result = self.request_raw(
243 options=options,
244 remaining_retries=remaining_retries or self.retries,
245 stream=stream,
246 request_timeout=request_timeout,
247 )
--> 249 resp, got_stream = self._interpret_response(result, stream)
250 return resp, got_stream, self.api_key
File /usr/local/lib/python3.12/site-packages/together/abstract/api_requestor.py:650, in APIRequestor._interpret_response(self, result, stream)
647 else:
648 content = result.content.decode("utf-8")
649 return (
--> 650 self._interpret_response_line(
651 content,
652 result.status_code,
653 result.headers,
654 stream=False,
655 ),
656 False,
657 )
File /usr/local/lib/python3.12/site-packages/together/abstract/api_requestor.py:748, in APIRequestor._interpret_response_line(self, rbody, rcode, rheaders, stream)
746 # Handle streaming errors
747 if not 200 <= rcode < 300:
--> 748 raise self.handle_error_response(resp, rcode, stream_error=stream)
749 return resp
InvalidRequestError: Error code: 400 - {"message": "Unable to access non-serverless model meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo. Please visit Together AI | The AI Native Cloud to create and start a new dedicated endpoint for the model.", "type_": "invalid_request_error", "code": "model_not_available"}
Hi @Anjan1970,
By the way, I am not Preeti; it's Deepti.
The labs were updated again just a few minutes ago for this new exception. Kindly follow these steps:
Delete the cache and browsing history of the browser you are using.
Log in again on the DLAI website.
Open the notebook, then delete all files from File ==> Open.
Now click the 3 dots and restore the original version.
Let me know if the issue still persists.
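In the meantime, a possible workaround for the 400 "model_not_available" error (this is only my sketch, not the official lab fix, and the replacement model name is an assumption; check which models your Together account can access serverlessly) is to override the model kwarg before calling the helper:

```python
# Sketch of a workaround: the error says the account cannot reach
# meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo as a serverless model,
# so pass a different model name through kwargs instead.
kwargs = {
    "prompt": "Explain overfitting in one sentence.",
    "model": "meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo",  # model from the error
}
# Replacement name is an assumption; pick one your account can use serverlessly.
kwargs["model"] = "meta-llama/Llama-3.3-70B-Instruct-Turbo"
# response = generate_with_single_input(**kwargs)  # then call as in the lab
# print(response["content"])
```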
Regards
Dr. Deepti

