C3W3: ValueError while importing 'aiplatform' from google.cloud

While working on GCP, in the Week 3 graded assignment on “Running Distributed TensorFlow using Vertex AI”, I am encountering this error:

ValueError: numpy.dtype size changed, may indicate binary incompatibility. Expected 96 from C header, got 88 from PyObject.

The error occurs in the “Import Vertex SDK for Python” section, where we are just importing the modules.

Here is the full traceback:
ValueError                                Traceback (most recent call last)
Cell In[6], line 4
      1 import os
      2 import sys
----> 4 from google.cloud import aiplatform
      5 from google.cloud.aiplatform import gapic as aip
      7 aiplatform.init(project=PROJECT_ID, location=REGION, staging_bucket=BUCKET_NAME)

File ~/.local/lib/python3.10/site-packages/google/cloud/aiplatform/__init__.py:26
     21 __version__ = aiplatform_version.__version__
     24 from google.cloud.aiplatform import initializer
---> 26 from google.cloud.aiplatform.datasets import (
     27     ImageDataset,
     28     TabularDataset,
     29     TextDataset,
     30     TimeSeriesDataset,
     31     VideoDataset,
     32 )
     33 from google.cloud.aiplatform import explain
     34 from google.cloud.aiplatform import gapic

File ~/.local/lib/python3.10/site-packages/google/cloud/aiplatform/datasets/__init__.py:19
      1 # -*- coding: utf-8 -*-
      2
      3 # Copyright 2020 Google LLC
   (...)
     15 # limitations under the License.
     16 #
     18 from google.cloud.aiplatform.datasets.dataset import _Dataset
---> 19 from google.cloud.aiplatform.datasets.column_names_dataset import _ColumnNamesDataset
     20 from google.cloud.aiplatform.datasets.tabular_dataset import TabularDataset
     21 from google.cloud.aiplatform.datasets.time_series_dataset import TimeSeriesDataset

File ~/.local/lib/python3.10/site-packages/google/cloud/aiplatform/datasets/column_names_dataset.py:24
     21 from typing import List, Optional, Set
     22 from google.auth import credentials as auth_credentials
---> 24 from google.cloud import bigquery
     25 from google.cloud import storage
     27 from google.cloud.aiplatform import utils

File /opt/conda/lib/python3.10/site-packages/google/cloud/bigquery/__init__.py:35
     31 from google.cloud.bigquery import version as bigquery_version
     33 __version__ = bigquery_version.__version__
---> 35 from google.cloud.bigquery.client import Client
     36 from google.cloud.bigquery.dataset import AccessEntry
     37 from google.cloud.bigquery.dataset import Dataset

File /opt/conda/lib/python3.10/site-packages/google/cloud/bigquery/client.py:61
     58 from google.cloud.client import ClientWithProject  # type: ignore  # pytype: disable=import-error
     60 try:
---> 61     from google.cloud.bigquery_storage_v1.services.big_query_read.client import (
     62         DEFAULT_CLIENT_INFO as DEFAULT_BQSTORAGE_CLIENT_INFO,
     63     )
     64 except ImportError:
     65     DEFAULT_BQSTORAGE_CLIENT_INFO = None  # type: ignore

File /opt/conda/lib/python3.10/site-packages/google/cloud/bigquery_storage_v1/__init__.py:25
     19 import pkg_resources
     21 __version__ = pkg_resources.get_distribution(
     22     "google-cloud-bigquery-storage"
     23 ).version  # noqa
---> 25 from google.cloud.bigquery_storage_v1 import client
     26 from google.cloud.bigquery_storage_v1 import types
     29 class BigQueryReadClient(client.BigQueryReadClient):

File /opt/conda/lib/python3.10/site-packages/google/cloud/bigquery_storage_v1/client.py:26
     22 from __future__ import absolute_import
     24 import google.api_core.gapic_v1.method
---> 26 from google.cloud.bigquery_storage_v1 import reader
     27 from google.cloud.bigquery_storage_v1.services import big_query_read
     28 from google.cloud.bigquery_storage_v1.services import big_query_write

File /opt/conda/lib/python3.10/site-packages/google/cloud/bigquery_storage_v1/reader.py:30
     27 import google.rpc.error_details_pb2
     29 try:
---> 30     import pandas
     31 except ImportError:  # pragma: NO COVER
     32     pandas = None

File /opt/conda/lib/python3.10/site-packages/pandas/__init__.py:22
     19 del _hard_dependencies, _dependency, _missing_dependencies
     21 # numpy compat
---> 22 from pandas.compat import is_numpy_dev as _is_numpy_dev  # pyright: ignore # noqa:F401
     24 try:
     25     from pandas._libs import hashtable as _hashtable, lib as _lib, tslib as _tslib

File /opt/conda/lib/python3.10/site-packages/pandas/compat/__init__.py:25
     17 from pandas.compat._constants import (
     18     IS64,
     19     PY39,
   (...)
     22     PYPY,
     23 )
     24 import pandas.compat.compressors
---> 25 from pandas.compat.numpy import (
     26     is_numpy_dev,
     27     np_version_under1p21,
     28 )
     29 from pandas.compat.pyarrow import (
     30     pa_version_under7p0,
     31     pa_version_under8p0,
     32     pa_version_under9p0,
     33     pa_version_under11p0,
     34 )
     37 def set_function_name(f: F, name: str, cls) -> F:

File /opt/conda/lib/python3.10/site-packages/pandas/compat/numpy/__init__.py:4
      1 """ support numpy compatibility across versions """
      2 import numpy as np
----> 4 from pandas.util.version import Version
      6 # numpy versioning
      7 _np_version = np.__version__

File /opt/conda/lib/python3.10/site-packages/pandas/util/__init__.py:2
      1 # pyright: reportUnusedImport = false
----> 2 from pandas.util._decorators import (  # noqa:F401
      3     Appender,
      4     Substitution,
      5     cache_readonly,
      6 )
      8 from pandas.core.util.hashing import (  # noqa:F401
      9     hash_array,
     10     hash_pandas_object,
     11 )

File /opt/conda/lib/python3.10/site-packages/pandas/util/_decorators.py:14
      6 from typing import (
      7     Any,
      8     Callable,
      9     Mapping,
     10     cast,
     11 )
     12 import warnings
---> 14 from pandas._libs.properties import cache_readonly
     15 from pandas._typing import (
     16     F,
     17     T,
     18 )
     19 from pandas.util._exceptions import find_stack_level

File /opt/conda/lib/python3.10/site-packages/pandas/_libs/__init__.py:13
      1 __all__ = [
      2     "NaT",
      3     "NaTType",
   (...)
      9     "Interval",
     10 ]
---> 13 from pandas._libs.interval import Interval
     14 from pandas._libs.tslibs import (
     15     NaT,
     16     NaTType,
   (...)
     21     iNaT,
     22 )

File /opt/conda/lib/python3.10/site-packages/pandas/_libs/interval.pyx:1, in init pandas._libs.interval()

ValueError: numpy.dtype size changed, may indicate binary incompatibility. Expected 96 from C header, got 88 from PyObject

Hi @Abhay_Pancholi,

I think this error occurs due to a version mismatch between numpy and the other installed packages. Make sure the installed libraries are compatible and that their versions meet the lab's requirements!

Hope it helps!
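One quick way to act on this advice is to list the installed versions of the packages that appear in the traceback. This is only a minimal sketch: the package names come from the traceback above, and `installed_versions` is a hypothetical helper, not part of the lab.

```python
from importlib.metadata import version, PackageNotFoundError

def installed_versions(packages):
    """Map each package name to its installed version string (or None if absent)."""
    found = {}
    for pkg in packages:
        try:
            found[pkg] = version(pkg)
        except PackageNotFoundError:
            found[pkg] = None  # package not installed in this environment
    return found

# Packages whose compiled extensions must agree on the numpy C-ABI;
# a mismatch there is what produces the dtype-size ValueError.
print(installed_versions(["numpy", "pandas", "google-cloud-aiplatform"]))
```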

Hello @Alireza_Saei sir,
Thanks for your suggestion, I will try to fix it.

Sure, feel free to keep us updated with your progress! Good luck!


Solved it by running this:

! pip install --upgrade numpy==1.26.4

It seems numpy 2.0.0 breaks the pipeline.
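If you want the notebook to fail fast with a readable message instead of the long traceback, a guard cell like this can go before the imports. It's a sketch under the assumption (from the fix above) that a numpy 1.x release such as 1.26.4 is what the lab's pandas wheel was built against; `check_numpy_pin` is a hypothetical helper.

```python
from importlib.metadata import version

def check_numpy_pin(installed: str, max_major: int = 1) -> None:
    """Raise early with a readable message if numpy's major version is too new."""
    major = int(installed.split(".")[0])
    if major > max_major:
        raise RuntimeError(
            f"numpy {installed} is installed; run "
            "'pip install --upgrade numpy==1.26.4' and restart the kernel."
        )

# Run this before importing google.cloud.aiplatform (which pulls in pandas).
try:
    check_numpy_pin(version("numpy"))
    print("numpy looks compatible with this pandas build")
except RuntimeError as exc:
    print(exc)
```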


Hi all,
The staff have been informed so the lab can be fixed.
Thanks.

Hi everyone! Thank you for reporting. I’ve forwarded this to our partners so it can be fixed asap.

Thanks as well to Bendy for the workaround!

Will update this thread as soon as we hear back from the Qwiklabs team.
