embedding = embedding_model.get_embeddings(["life"])
I am trying to run this cell, but it gives an error.
Same problem here. This also happens in the next notebook.
Gemini's conclusion (and potential solutions) is shown below.
However, this did not work, so I guess it's for the DeepLearning.AI staff to take a look at this.
=====================
Based on the stack trace and the function names visible (specifically _dlai_custom_api), the problem is not with your Python syntax, but with the environment or the API endpoint you are trying to reach.
Here is the breakdown of the issue and how to fix it.
The key clue lies in this line of your stack trace: File .../vertexai/language_models/_language_models.py:532, in TextEmbeddingModel._dlai_custom_api(self, texts)
The method _dlai_custom_api indicates that you are using a wrapper library likely provided by a DeepLearning.AI course (e.g., “Generative AI with LLMs”). This code is attempting to hit a custom proxy server set up for the course, not the standard public Google Vertex AI API.
JSONDecodeError: Extra data
The error JSONDecodeError: Extra data: line 1 column 5 usually happens when a server returns a raw HTTP status code or plain text instead of a JSON object.
For example, if the server returns the text string: 404 Not Found
The Python JSON parser reads 404 (which is a valid JSON number), stops, and then sees " Not Found". It throws an error because it found “Extra data” (column 5) that doesn’t belong there.
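You can reproduce this exact exception with nothing but the standard library json module; a minimal sketch (not course-specific code):

import json

try:
    json.loads("404 Not Found")  # a plain-text error page, not JSON
except json.JSONDecodeError as e:
    print(e)  # Extra data: line 1 column 5 (char 4)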
If you copied this code from the course and are trying to run it on your own machine, it will fail.
The library included in the course environment is patched to send requests to a temporary, course-specific proxy.
That proxy is not accessible from your local internet connection, or you are missing the specific API keys injected into the course environment.
If you are running this inside the DeepLearning.AI / Coursera environment:
The API Token Expired: The lab sessions often have time-limited credentials.
The Model is Deprecated: textembedding-gecko@001 is an older model version. The proxy might have stopped supporting it.
If you want to use Vertex AI on your own computer, do not use the _dlai specific wrapper logic. You must use the standard Google Cloud setup.
Install the official library:
pip install google-cloud-aiplatform
Authenticate: Install the Google Cloud CLI and run:
gcloud auth application-default login
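If you want to confirm the login worked before touching Vertex AI, a quick sanity check (a sketch using the google-auth package, which is installed alongside google-cloud-aiplatform) is:

import google.auth

# Raises DefaultCredentialsError if no Application Default Credentials were found
credentials, detected_project = google.auth.default()
print("Detected project:", detected_project)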
Use this code instead: You must initialize the AI platform with your specific Project ID and Location.
import vertexai
from vertexai.language_models import TextEmbeddingModel
# Initialize your project
vertexai.init(project="YOUR_PROJECT_ID", location="us-central1")
# Use a newer model version (Gecko 001 is getting old)
# Try "text-embedding-004" or "textembedding-gecko@003"
model = TextEmbeddingModel.from_pretrained("textembedding-gecko@003")
embeddings = model.get_embeddings(["life"])
for embedding in embeddings:
    print(embedding.values)
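As a usage note (a sketch, not part of the course code): each item returned by get_embeddings is a TextEmbedding object, and its values attribute is the raw list of floats, so you can inspect the vector directly:

# Continuing from the snippet above: inspect the first returned vector
vector = embeddings[0].values
print(type(vector), len(vector))  # a list of floats; the length depends on the model version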
If you are currently taking the course and seeing this error in the browser-based lab:
Restart the Kernel: Go to Kernel → Restart.
Refresh the Lab: If that fails, look for a “Get New Token” or “Lab Help” button to refresh your credentials.
Check the Model Name: If the lab instructions are old, textembedding-gecko@001 might be down. Try changing the string to textembedding-gecko@003 to see if the proxy accepts a newer version.
This fixed the problem. Thank you!
print(PROJECT_ID)
REGION = 'us-central1'
import vertexai
vertexai.init(project=PROJECT_ID,
              location=REGION,
              credentials=credentials)
===
from vertexai.language_models import TextEmbeddingModel
embedding_model = TextEmbeddingModel.from_pretrained("text-embedding-004")
embeddings = embedding_model.get_embeddings(["Life"])
for embedding in embeddings:
    print(embedding.values)
Thanks for reporting this issue! @fuad47 @Hansbakker
Please note that the current model used in the notebook is deprecated:
# deprecated
#embedding_model = TextEmbeddingModel.from_pretrained("textembedding-gecko@001")
You could use:
# updated to 2026
embedding_model = TextEmbeddingModel.from_pretrained("text-embedding-004")
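The updated model accepts the same call pattern as the old one, so the rest of the cell should run unchanged. For illustration, a minimal sketch (assuming vertexai.init has already been run earlier in the notebook) that also shows embedding several texts in one call:

from vertexai.language_models import TextEmbeddingModel

embedding_model = TextEmbeddingModel.from_pretrained("text-embedding-004")

# get_embeddings takes a list, so several texts can be embedded in a single call
embeddings = embedding_model.get_embeddings(["life", "the universe", "everything"])
for embedding in embeddings:
    print(len(embedding.values))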
However, we are still reviewing the latest updates to the Gemini models so we can update our notebooks on the platform. Thank you for your patience!