C3W2 assignment issue cell 3

I get this message when I submit my job:
Cell #3. Can't compile the student's code. Error: TypeError("__init__() missing 2 required positional arguments: 'op' and 'message'",)

In the past, other learners posted this issue and resolved it by specifying the data_dir parameter. This is my line:
dataset, info = tfds.load('cats_vs_dogs:4.*.*', split=splits, with_info=True, data_dir='/tf')
I keep getting the same error from the grader. Moreover, I also tried
data_dir='/'
and
current_dir = getcwd()
data_dir=current_dir

I keep getting the same error. Any hint? It seems this is not a code issue but a limitation of the grader.
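To make the call shape concrete, here is a pure-Python stand-in sketch (NOT the real TFDS API; `fake_tfds_load` is a hypothetical stand-in so the example runs without downloading the ~800 MB dataset). It mirrors how tfds.load with a list of split specs returns one dataset per spec plus an info object:

```python
# Hypothetical stand-in for tfds.load, purely to illustrate the call shape
# used in the notebook. The real call would be tfds.load(...) itself.
def fake_tfds_load(name, split, with_info=True, data_dir=None):
    # tfds.load with a LIST of split specs returns one dataset per spec,
    # plus an info object when with_info=True
    datasets = ['<ds for %s>' % s for s in split]
    info = {'name': name, 'data_dir': data_dir}
    return datasets, info

# example split spec list, as in the assignment
splits = ['train[:80%]', 'train[80%:90%]', 'train[90%:]']

dataset, info = fake_tfds_load('cats_vs_dogs:4.*.*', split=splits,
                               with_info=True, data_dir='/tf')
print(len(dataset), info['data_dir'])  # → 3 /tf
```

The point of the stand-in: whatever data_dir is passed, the return structure is a list ordered like the split specs, which matters for the unpacking discussed later in this thread.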
Thank you for help.

Hi @Dennis_Sinitsky

Can you share whether you got any error in the assignment itself,

or whether it appears only in the submission grader output?

As I am not a mentor for this course, I am tagging one of the course mentors, @Jamal022.

If he doesn't respond, DM me the lines of code that are having the issue.

As far as I can understand, your error is related to a def statement and not to data_dir, so check whether those code lines are aligned correctly.

Regards
DP

Hi Deepti,
thank you for your reply. I can DM you my code. Again, based on previous posts, other learners quickly overcame this problem once they defined the data_dir parameter, but I keep getting the compilation error. Apparently the compiler cannot find the directory from which to load? When I run it on my side, it all works.
Dennis

Actually, I have a doubt about the data_dir parameter you used. Make sure you are writing the data_dir code as per the utils file provided, and not pointing at the TensorFlow directory.

I have DMed you; kindly check there for further response, as assignment code cannot be discussed openly here.

Hi @Dennis_Sinitsky

I just referred to the tensorflow-2 repo, which tells me why your code might be marked incorrect by the autograder.

Check the image below for the same cell.

From what I can see, you don't require the current_dir = getcwd() line.

Next, the starter code has splits, info = tfds.load(#YOUR CODE HERE), but you changed this to dataset, info, which is why your data_dir path cannot be read by the autograder.

The next code line is (train_examples, validation_examples, test_examples) = splits,
but you changed this to
train_examples = dataset['train_examples']
validation_examples = dataset['validation_examples']
test_examples = dataset['test_examples']
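A small pure-Python illustration of why the dict-style indexing above fails (the list contents are stand-in strings, not real tf.data.Dataset objects): when tfds.load is given a list of split specs, it returns a plain list ordered like the specs, so there are no keys such as 'train_examples' to index by.

```python
# Stand-ins for the three tf.data.Dataset objects tfds.load would return
# when given a LIST of three split specs.
dataset = ['<train_ds>', '<val_ds>', '<test_ds>']

# Starter-code pattern: tuple-unpack in split order.
(train_examples, validation_examples, test_examples) = dataset

print(train_examples)  # → <train_ds>

# By contrast, dataset['train_examples'] would raise TypeError here,
# because a list cannot be indexed by a string key.
```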

Kindly get a fresh copy, and write code only where #YOUR CODE HERE or splits = [#DESCRIBE YOUR SPLITS HERE#] is mentioned.

Other than that, do not add any line to hard-code the path. Remember, the instructions already mention that the dataset is already downloaded.

Also, another issue I felt might be coming from the instruction given:

The assignment uses TF version 2.1.0 and TFDS version 3.2.1 so if you run this notebook on TF 1.x or some other version of TFDS, some things might not work.

So make sure your versions are as stated.
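If it helps, here is a stdlib-only sketch for comparing dotted version strings against the required TF 2.1.0 / TFDS 3.2.1 (the helper name `version_tuple` is my own, not from the course notebook):

```python
# Parse a dotted version string into a comparable tuple of ints.
def version_tuple(v):
    # '2.1.0' -> (2, 1, 0); drops suffixes like '-rc1' or '+cpu'
    core = v.split('+')[0].split('-')[0]
    return tuple(int(p) for p in core.split('.'))

print(version_tuple('2.1.0'))               # → (2, 1, 0)
print(version_tuple('3.2.1') == (3, 2, 1))  # → True
```

In the notebook one would then compare version_tuple(tf.\_\_version\_\_) and version_tuple(tfds.\_\_version\_\_) against (2, 1, 0) and (3, 2, 1) respectively.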

Regards
DP

Hi Deepti,
this is my code:

For data_dir I used '.', './', '/', and '/tf', and all failed.
For version I have
[image: screenshot of installed TF and TFDS versions]

I think they are the latest.

Wait, let me install the required versions of TF and TFDS first… :dotted_line_face:

In one of the assignments, the version needed to be downgraded for a successful submission.

Just in case doing this passes your assignment, let me know; I will then inform the L.T of the course about this.

hey @Dennis_Sinitsky

Kindly go through this thread, which mentions what data_dir needs to be.

Also, recall that the file name needs to be correct.

Check chris.favila's comment; it should help you resolve your issue. Let me know if the issue still persists.

regards
DP

FINALLY!!! This is a good hint. I set data_dir=filePath and it worked (on my 20th submission or something like that). I just copied the path definition from the week 1 assignment.
Thank you, Deepti. :slight_smile:
Dennis
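For future readers, a hedged reconstruction of what the fix looks like. The filePath value here is hypothetical: in the week 1 notebook it is built from getcwd(), so copy the exact line from your own notebook rather than this guess.

```python
from os import getcwd

# Hypothetical stand-in for the week 1 definition; the real notebook's
# filePath line may append a subdirectory to the working directory.
filePath = getcwd()

# The load call in the assignment would then pass it as data_dir:
#   dataset, info = tfds.load('cats_vs_dogs:4.*.*', split=splits,
#                             with_info=True, data_dir=filePath)
print(isinstance(filePath, str))  # → True
```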

Glad you resolved it. Remember, it is not the number of submissions that counts, it is the effort you have put in. You might forget other assignments, but you will not forget this one :stuck_out_tongue_winking_eye:

You have always been a keen learner, as I have seen through your journey of queries and responses.

Keep Learning!!!

Regards
DP