C2M2: I can't see the transferred data in Amazon S3 when using Kinesis Data Streams through Amazon Firehose

Source Systems, Data Ingestion, and Pipelines

Module-2

  • Link: https://www.coursera.org/learn/source-systems-data-ingestion-and-pipelines/lecture/vx2l5/lab-walkthrough-streaming-ingestion

  • I ran the code and then created the data stream in Kinesis Data Streams with the name given in the documentation.

  • Part 1 of the ingestion worked. Then I moved on to Part 2 of the implementation.

  • The Kinesis data stream pushes the USA and International event records to Kinesis Firehose, but Firehose didn't transfer the events to Amazon S3. The bucket did have two folders named USA and International, but when I opened them, nothing was inside.

  • I waited 10 minutes, then checked the second-to-last command (cd src/elt, running the usa and International source_stream scripts). Those push the data to Firehose, and I could see that output on the terminal CLI.

  • But the last CLI command (cd src/elt of _stream) didn't print anything on the terminal. The documentation says to wait 5 to 7 minutes; I waited 15 minutes and still could not see anything in the Amazon S3 buckets.
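For anyone hitting the same symptom: Firehose buffers incoming records (by default up to several minutes or a few MB) before writing a batch to S3, so an empty prefix for a while can be normal. A minimal polling sketch to check whether objects have landed, with the listing call injected so it can be tried without AWS credentials (the bucket and prefix names below are placeholders, not the lab's exact values):

```python
import time


def wait_for_objects(list_objects, bucket, prefix, timeout_s=900, poll_s=30):
    """Poll a bucket prefix until at least one object appears or we time out.

    list_objects is injected (e.g. boto3's s3.list_objects_v2) so this
    helper can be exercised locally without touching AWS.
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        resp = list_objects(Bucket=bucket, Prefix=prefix)
        keys = [obj["Key"] for obj in resp.get("Contents", [])]
        if keys:
            return keys
        time.sleep(poll_s)
    return []


# With a real client it would look roughly like this (untested sketch):
#   import boto3
#   s3 = boto3.client("s3")
#   keys = wait_for_objects(s3.list_objects_v2, "my-usa-bucket", "usa/")
```

If this still returns an empty list well past the buffering interval, the problem is usually the Firehose destination configuration rather than the producers.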

When calling the function:

create_kinesis_firehose(…,
    bucket_name=None,
)

did you set the bucket_name parameter to USA_BUCKET or INTERNATIONAL_BUCKET according to the case?

Based on your question, nothing else should be wrong. The lab works well; take a look at the screenshots below of the contents of the S3 buckets. Most orders go to international locations.


Hi Iaacdm,

Thanks for the reply. I checked the bucket_name parameter and everything was fine, but since I wasn't sure what else I'd done wrong, I pulled the latest version of the lab and redid the task. It finally worked.

Thanks for the help Iaacdm. Have a wonderful day.

Best,
Van.
