I'm having an issue with exercise 6 of the assignment. All tests pass until exercise 6, where I get the error shown in the log below.
```
UnfilteredStackTrace                      Traceback (most recent call last)
<ipython-input-...> in <module>
      3 # take that into account for you train_model implementation
----> 4 training_loop = train_model(model, train_task, [eval_task], 100, output_dir_expand)

<ipython-input-...> in train_model(classifier, train_task, eval_task, n_steps, output_dir)
     24
---> 25     training_loop.run(n_steps = n_steps)
     26 ### END CODE HERE ###

/opt/conda/lib/python3.7/site-packages/trax/supervised/training.py in run(self, n_steps)
    434
--> 435     loss, optimizer_metrics = self._run_one_step(task_index, task_changed)
    436

/opt/conda/lib/python3.7/site-packages/trax/supervised/training.py in _run_one_step(self, task_index, task_changed)
    632     (loss, stats) = trainer.one_step(
--> 633         batch, rng, step=step, learning_rate=learning_rate
    634     )

/opt/conda/lib/python3.7/site-packages/trax/optimizers/trainer.py in one_step(self, batch, rng, step, learning_rate)
    147     (new_weights, new_slots), new_state, stats = self._accelerated_update_fn(
--> 148         (weights, self._slots), step, self._opt_params, batch, state, rng)
    149

/opt/conda/lib/python3.7/site-packages/jax/_src/traceback_util.py in reraise_with_filtered_traceback(*args, **kwargs)
    182     try:
--> 183       return fun(*args, **kwargs)
    184     except Exception as e:

/opt/conda/lib/python3.7/site-packages/jax/_src/api.py in cache_miss(*args, **kwargs)
    426         device=device, backend=backend, name=flat_fun.__name__,
--> 427         donated_invars=donated_invars, inline=inline)
    428     out_pytree_def = out_tree()

/opt/conda/lib/python3.7/site-packages/jax/core.py in bind(self, fun, *args, **params)
   1559   def bind(self, fun, *args, **params):
-> 1560     return call_bind(self, fun, *args, **params)
   1561

/opt/conda/lib/python3.7/site-packages/jax/core.py in call_bind(primitive, fun, *args, **params)
   1549         params_tuple, out_axes_transforms)
-> 1550   tracers = map(top_trace.full_raise, args)
   1551   outs = primitive.process(top_trace, fun, tracers, params)

/opt/conda/lib/python3.7/site-packages/jax/_src/util.py in safe_map(f, *args)
     40     assert len(arg) == n, 'length mismatch: {}'.format(list(map(len, args)))
---> 41   return list(map(f, *args))
     42

/opt/conda/lib/python3.7/site-packages/jax/core.py in full_raise(self, val)
    384       raise escaped_tracer_error(
--> 385           val, f"Can't lift sublevels {val._trace.sublevel} to {sublevel}")
    386     elif val._trace.level < level:

UnfilteredStackTrace: jax.core.UnexpectedTracerError: Encountered an unexpected tracer. Perhaps this tracer escaped through global state from a previously traced function.
The functions being transformed should not save traced values to global state. Detail: Can't lift sublevels 1 to 0.
To catch the leak earlier, try setting the environment variable JAX_CHECK_TRACER_LEAKS or using the jax.checking_leaks
context manager.
The stack trace below excludes JAX-internal frames.
The preceding is the original exception that occurred, unmodified.
```
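In case it helps with debugging, here is a minimal sketch of how the leak check suggested by the error message could be enabled (the `model`, `train_task`, `eval_task`, and `output_dir_expand` names come from the notebook cell that raises the error):

```python
import jax

# Turn on tracer-leak checking globally; this is equivalent to setting the
# JAX_CHECK_TRACER_LEAKS environment variable mentioned in the error message.
jax.config.update("jax_check_tracer_leaks", True)

# Alternatively, wrap only the failing call in the context manager, so the
# leak is reported closer to where it originates.
with jax.checking_leaks():
    training_loop = train_model(model, train_task, [eval_task], 100, output_dir_expand)
```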
My code for exercise 6 is as follows.
```python
# UNQ_C6 (UNIQUE CELL IDENTIFIER, DO NOT EDIT)
# GRADED FUNCTION: train_model
def train_model(classifier, train_task, eval_task, n_steps, output_dir):
    '''
    Input:
        classifier - the model you are building
        train_task - Training task
        eval_task - Evaluation task. Received as a list
        n_steps - the number of training steps to run
        output_dir - folder to save your files
    Output:
        training_loop - a trax training Loop (it holds the trained model)
    '''
    rnd.seed(31)  # Do NOT modify this random seed. This makes the notebook easier to replicate

    ### START CODE HERE (Replace instances of 'None' with your code) ###
    training_loop = training.Loop(
        classifier,            # The learning model
        train_task,            # The training task
        eval_tasks=eval_task,  # The evaluation task(s), received as a list
        output_dir=output_dir, # The output directory
        random_seed=31         # Do not modify this random seed, to ensure reproducibility and for grading purposes.
    )

    training_loop.run(n_steps=n_steps)
    ### END CODE HERE ###

    # Return the training_loop, since it has the model.
    return training_loop
```
I don't know whether this error is related to the code before this exercise, as mentioned in other answers; I can share the other parts of my code if that's necessary. Thanks a lot!