C1 W1 Lab: I am getting a QueryFailed error

The first time I ran this notebook there was no error, but now I'm getting this error when I execute this query:

df_count_by_sentiment = wr.athena.read_sql_query(
    sql=statement_count_by_sentiment,
    database=database_name
)

print(df_count_by_sentiment)


QueryFailed                               Traceback (most recent call last)
<ipython-input> in <module>
      1 df_count_by_sentiment = wr.athena.read_sql_query(
      2     sql=statement_count_by_sentiment,
----> 3     database=database_name
      4 )
      5

/opt/conda/lib/python3.7/site-packages/awswrangler/_config.py in wrapper(*args_raw, **kwargs)
    415             del args[name]
    416         args = {**args, **keywords}
--> 417         return function(**args)
    418
    419     wrapper.__doc__ = _inject_config_doc(doc=function.__doc__, available_configs=available_configs)

/opt/conda/lib/python3.7/site-packages/awswrangler/athena/_read.py in read_sql_query(sql, database, ctas_approach, categories, chunksize, s3_output, workgroup, encryption, kms_key, keep_files, ctas_database_name, ctas_temp_table_name, use_threads, boto3_session, max_cache_seconds, max_cache_query_inspections, max_remote_cache_entries, max_local_cache_entries, data_source, params, s3_additional_kwargs)
    833         use_threads=use_threads,
    834         s3_additional_kwargs=s3_additional_kwargs,
--> 835         boto3_session=session,
    836     )
    837

/opt/conda/lib/python3.7/site-packages/awswrangler/athena/_read.py in _resolve_query_without_cache(sql, database, data_source, ctas_approach, categories, chunksize, s3_output, workgroup, encryption, kms_key, keep_files, ctas_database_name, ctas_temp_table_name, use_threads, s3_additional_kwargs, boto3_session)
    546             use_threads=use_threads,
    547             s3_additional_kwargs=s3_additional_kwargs,
--> 548             boto3_session=boto3_session,
    549         )
    550     finally:

/opt/conda/lib/python3.7/site-packages/awswrangler/athena/_read.py in _resolve_query_without_cache_ctas(sql, database, data_source, s3_output, keep_files, chunksize, categories, encryption, workgroup, kms_key, wg_config, alt_database, name, use_threads, s3_additional_kwargs, boto3_session)
    440             "(E.g. 'SELECT CAST(NULL AS INTEGER) AS MY_COL, ...')"
    441         )
--> 442         raise ex
    443     return _fetch_parquet_result(
    444         query_metadata=query_metadata,

/opt/conda/lib/python3.7/site-packages/awswrangler/athena/_read.py in _resolve_query_without_cache_ctas(sql, database, data_source, s3_output, keep_files, chunksize, categories, encryption, workgroup, kms_key, wg_config, alt_database, name, use_threads, s3_additional_kwargs, boto3_session)
    423             boto3_session=boto3_session,
    424             categories=categories,
--> 425             metadata_cache_manager=_cache_manager,
    426         )
    427     except exceptions.QueryFailed as ex:

/opt/conda/lib/python3.7/site-packages/awswrangler/athena/_utils.py in _get_query_metadata(query_execution_id, boto3_session, categories, query_execution_payload, metadata_cache_manager)
    259         _query_execution_payload: Dict[str, Any] = query_execution_payload
    260     else:
--> 261         _query_execution_payload = wait_query(query_execution_id=query_execution_id, boto3_session=boto3_session)
    262     cols_types: Dict[str, str] = get_query_columns_types(
    263         query_execution_id=query_execution_id, boto3_session=boto3_session

/opt/conda/lib/python3.7/site-packages/awswrangler/athena/_utils.py in wait_query(query_execution_id, boto3_session)
    771     _logger.debug("StateChangeReason: %s", response["Status"].get("StateChangeReason"))
    772     if state == "FAILED":
--> 773         raise exceptions.QueryFailed(response["Status"].get("StateChangeReason"))
    774     if state == "CANCELLED":
    775         raise exceptions.QueryCancelled(response["Status"].get("StateChangeReason"))

QueryFailed: com.amazonaws.services.s3.model.AmazonS3Exception: The specified bucket does not exist (Service: Amazon S3; Status Code: 404; Error Code: NoSuchBucket; Request ID: B9WNXGSX7KZBMAFN; S3 Extended Request ID: 1YtUt20CWOSy3TgZQIASoIjZGX3JmUr4Nhllgx5FcK4v4DC8lmCmJ3OIUCqBSdyQ3LYr+0qz5o0=; Proxy: null), S3 Extended Request ID: 1YtUt20CWOSy3TgZQIASoIjZGX3JmUr4Nhllgx5FcK4v4DC8lmCmJ3OIUCqBSdyQ3LYr+0qz5o0= (Path: s3://aws-athena-query-results-718062519571-us-east-1/temp_table_d27cc434eddf4a71a8236005832e34af). You may need to manually clean the data at location 's3://aws-athena-query-results-718062519571-us-east-1/tables/c681a813-4be5-4e41-8f17-cfe3ef1b029e' before retrying. Athena will not delete data in your account.

What should I do? Please advise.

Hi @proloy_pal,

The error is raised because the S3 bucket doesn't exist. Can you check in the previous cells whether the S3 bucket exists?
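If it helps, here is a minimal sketch of how you could check for the bucket from the notebook itself. The helper name `bucket_exists` is mine, not from the lab; it just wraps the standard boto3 `head_bucket` call:

```python
def bucket_exists(bucket_name, s3_client=None):
    """Return True if the S3 bucket exists and is reachable, False on a 404.

    Pass your own client object for testing; otherwise a default
    boto3 S3 client is created (requires AWS credentials).
    """
    if s3_client is None:
        import boto3  # deferred so the helper can be tried without AWS set up
        s3_client = boto3.client("s3")
    try:
        s3_client.head_bucket(Bucket=bucket_name)
        return True
    except Exception as err:
        # botocore raises ClientError, which carries the HTTP code
        # in err.response["Error"]["Code"]
        code = getattr(err, "response", {}).get("Error", {}).get("Code", "")
        if code in ("404", "NoSuchBucket"):
            return False
        raise  # anything else (e.g. 403 AccessDenied) is a different problem


# Example (substitute the bucket name from your own error message):
# bucket_exists("aws-athena-query-results-<account-id>-us-east-1")
```

A 403 is deliberately re-raised rather than treated as "missing", since in that case the bucket exists but your role can't see it.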

Best Regards,
A. Sriharsha

Hi Mentors,

I am running into the same issue.
The queries ran without error.
I tried to run it again at a later time, and I get a similar error. How do I make sure the S3 bucket exists (as you recommend)?

Thank you,

Neil A

Hi @NeilA,

After the "2.1 Register S3 dataset files as a table for querying" cell, you can use the link provided by executing the next cell to check whether the S3 bucket exists (after executing 2.1, you'll get the details of the S3 bucket).
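Based on the path in the error message earlier in this thread, the bucket that goes missing is the default Athena query-results bucket. As a sketch (the naming pattern below is inferred from that error path, not from official documentation, so verify it against your own error message), you can reconstruct the name to look for in the S3 console:

```python
def default_athena_results_bucket(account_id: str, region: str) -> str:
    """Bucket the lab's Athena queries appear to write results to.

    Pattern inferred from the error path in this thread:
    s3://aws-athena-query-results-<account-id>-<region>/...
    """
    return f"aws-athena-query-results-{account_id}-{region}"


# Using the account id and region from the traceback above:
print(default_athena_results_bucket("718062519571", "us-east-1"))
# aws-athena-query-results-718062519571-us-east-1
```

If that bucket was deleted, recreating it in the S3 console, or pointing the query at a location you know exists via the `s3_output` parameter (which appears in the `read_sql_query` signature shown in the traceback), should let the query run again.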

Best Regards,
A. Sriharsha

I got the same issue. The notebook ran with no errors the first time, but it has raised this error from the second run onward.

I have the same error. All of the cells before this one executed correctly, and I manually verified that everything exists. This is Amazon, but even here there are workarounds and hoops to jump through. They care so much about security that it is impossible to learn anything; I'm forced to fill my head with these problems, and I would even be willing to pay to avoid them.

I got the same error too. Only the first run was successful. I have checked the bucket and the table; both exist.

I got the same error. Has any solution been found?

Yesterday I could not open SageMaker Studio for the whole day.
Today I could work up to, but not past, "wr.athena.read_sql_query"; it raised an error even though I verified that the database and table exist.

I got the same error. I manually checked the dsoaws_deep_learning database and found the reviews table in it, so the query should work, but it is giving an error.
"Report a bug" does not work. Feedback does not work.
"QueryFailed: GENERIC_INTERNAL_ERROR: S3 service error. You may need to manually clean the data at location 's3://aws-athena-query-results-142957599085-us-east-1/tables/2a883ca5-5868-49c3-8fa4-db813e07d9db' before retrying. Athena will not delete data in your account."

If you're still having that problem, could you post a screenshot of the cell with the query?

Below was the error message (screenshot not included):

Finally, after many such errors, I got one chance to start and work on the lab, but I could not complete it, and now I'm getting this out-of-budget error: "Your total lab spend of $26.865066 has exceeded the total budget of $20"