QueryFailed: com.amazonaws.services.s3.model.AmazonS3Exception: The specified bucket does not exist (Service: Amazon S3; Status Code: 404; Error Code: NoSuchBucket;

When I run this piece of code:

df_count_by_sentiment = wr.athena.read_sql_query(
    sql=statement_count_by_sentiment,
    database=database_name
)

I get the following error:
QueryFailed Traceback (most recent call last)
in
1 df_count_by_sentiment = wr.athena.read_sql_query(
2 sql=statement_count_by_sentiment,
----> 3 database=database_name
4 )
5

/opt/conda/lib/python3.7/site-packages/awswrangler/_config.py in wrapper(*args_raw, **kwargs)
415 del args[name]
416 args = {**args, **keywords}
→ 417 return function(**args)
418
419 wrapper.doc = _inject_config_doc(doc=function.doc, available_configs=available_configs)

/opt/conda/lib/python3.7/site-packages/awswrangler/athena/_read.py in read_sql_query(sql, database, ctas_approach, categories, chunksize, s3_output, workgroup, encryption, kms_key, keep_files, ctas_database_name, ctas_temp_table_name, use_threads, boto3_session, max_cache_seconds, max_cache_query_inspections, max_remote_cache_entries, max_local_cache_entries, data_source, params, s3_additional_kwargs)
833 use_threads=use_threads,
834 s3_additional_kwargs=s3_additional_kwargs,
→ 835 boto3_session=session,
836 )
837

/opt/conda/lib/python3.7/site-packages/awswrangler/athena/_read.py in _resolve_query_without_cache(sql, database, data_source, ctas_approach, categories, chunksize, s3_output, workgroup, encryption, kms_key, keep_files, ctas_database_name, ctas_temp_table_name, use_threads, s3_additional_kwargs, boto3_session)
546 use_threads=use_threads,
547 s3_additional_kwargs=s3_additional_kwargs,
→ 548 boto3_session=boto3_session,
549 )
550 finally:

/opt/conda/lib/python3.7/site-packages/awswrangler/athena/_read.py in _resolve_query_without_cache_ctas(sql, database, data_source, s3_output, keep_files, chunksize, categories, encryption, workgroup, kms_key, wg_config, alt_database, name, use_threads, s3_additional_kwargs, boto3_session)
440 "(E.g. 'SELECT CAST(NULL AS INTEGER) AS MY_COL, ...')"
441 )
→ 442 raise ex
443 return _fetch_parquet_result(
444 query_metadata=query_metadata,

/opt/conda/lib/python3.7/site-packages/awswrangler/athena/_read.py in _resolve_query_without_cache_ctas(sql, database, data_source, s3_output, keep_files, chunksize, categories, encryption, workgroup, kms_key, wg_config, alt_database, name, use_threads, s3_additional_kwargs, boto3_session)
423 boto3_session=boto3_session,
424 categories=categories,
→ 425 metadata_cache_manager=_cache_manager,
426 )
427 except exceptions.QueryFailed as ex:

/opt/conda/lib/python3.7/site-packages/awswrangler/athena/_utils.py in _get_query_metadata(query_execution_id, boto3_session, categories, query_execution_payload, metadata_cache_manager)
259 _query_execution_payload: Dict[str, Any] = query_execution_payload
260 else:
→ 261 _query_execution_payload = wait_query(query_execution_id=query_execution_id, boto3_session=boto3_session)
262 cols_types: Dict[str, str] = get_query_columns_types(
263 query_execution_id=query_execution_id, boto3_session=boto3_session

/opt/conda/lib/python3.7/site-packages/awswrangler/athena/_utils.py in wait_query(query_execution_id, boto3_session)
771 _logger.debug("StateChangeReason: %s", response["Status"].get("StateChangeReason"))
772 if state == "FAILED":
→ 773 raise exceptions.QueryFailed(response["Status"].get("StateChangeReason"))
774 if state == "CANCELLED":
775 raise exceptions.QueryCancelled(response["Status"].get("StateChangeReason"))

QueryFailed: com.amazonaws.services.s3.model.AmazonS3Exception: The specified bucket does not exist (Service: Amazon S3; Status Code: 404; Error Code: NoSuchBucket; Request ID: RATVWQENBE9G5CZ2; S3 Extended Request ID: mtnqR6aUZfDWLGebB0lbV4pOb2/xCvJELLYMfwWaGDozsAcggPs2Hoz5Y2niThFjRpf6gY1zKU8=; Proxy: null), S3 Extended Request ID: mtnqR6aUZfDWLGebB0lbV4pOb2/xCvJELLYMfwWaGDozsAcggPs2Hoz5Y2niThFjRpf6gY1zKU8= (Path: s3://aws-athena-query-results-379080104215-us-east-1/temp_table_69ca3393a5e0458ab6519afdf3bd1aa9). You may need to manually clean the data at location 's3://aws-athena-query-results-379080104215-us-east-1/tables/4e5eb553-9811-46f9-b3df-48d575c4c2e9' before retrying. Athena will not delete data in your account.

Hi Barah,
This is for C1_W1, right?
It looks like you missed filling in the "None" placeholders in the query from the previous cell, exercise 3.

statement_count_by_category = """
SELECT None, COUNT(None) AS None
FROM None
GROUP BY None 
ORDER BY None DESC
"""

If that doesn’t work let me know!

If you're curious where I got that from, it is from this line: "(E.g. 'SELECT CAST(NULL AS INTEGER) AS MY_COL, ...')"


Hey @Barah_Ah

Ultimately, the raised exception is an AmazonS3Exception. You shouldn't trust subsequent errors, but I think in this case it is straightforward.
Check that the location you are trying to access (it looks like s3://aws-athena-query-results-379080104215-us-east-1/temp_table_69ca3393a5e0458ab6519afdf3bd1aa9) is available, and track down where that URL is getting set.
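
As a rough sanity check, something like the sketch below can confirm whether that bucket exists and where Athena thinks query results should go. The bucket name is just copied from the error message, and I'm assuming the default "primary" workgroup, so adjust as needed:

import boto3
from botocore.exceptions import ClientError

bucket = "aws-athena-query-results-379080104215-us-east-1"  # taken from the error message

s3 = boto3.client("s3")
try:
    s3.head_bucket(Bucket=bucket)  # raises a ClientError with a 404 code if the bucket does not exist
    print(f"Bucket {bucket} exists and is reachable")
except ClientError as e:
    print(f"Bucket check failed: {e.response['Error']['Code']}")

# Output location configured on the workgroup (assuming the default "primary" workgroup)
athena = boto3.client("athena")
wg = athena.get_work_group(WorkGroup="primary")
print(wg["WorkGroup"]["Configuration"].get("ResultConfiguration", {}).get("OutputLocation"))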

Thanks! Yeah, I should definitely try to go through some problem-solving stages/sanity checks with them instead of just telling them what's wrong (my first mentoring gig).

Why shouldn't you trust consequent errors? They haven't led me astray, from what I can remember. I guess the source code can get a bit gnarly and confusing at times, but it feels like you have a more specific reason.

Hey!

My bad, I confused "subsequent" with "consequent" (I'm not a native English speaker). I just edited my last comment.

What I meant is that you usually should not trust subsequent errors, as they can often lead you in the wrong direction, depending on how the error handling between the systems is done.
However, in this case I think the root cause is quite clear from the raised exception.

All good, I could still easily understand you, and your English is great, better than some native speakers I know!


The issue is fixed. @graham_broughton, I think I made a mistake in replacing the None values.

Thank you @graham_broughton and @sv77 for responding.
Have a great day :)