Hello everyone, I have a problem.
When I was implementing my model, I wrapped it in a function, as below:
def model_by_myself(training_set, training_labels, units_for_hidden_layer, epochs):
    model = tf.keras.Sequential([
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(units_for_hidden_layer, activation=tf.nn.relu),
        tf.keras.layers.Dense(10, activation=tf.nn.softmax)
    ])
    model.compile(optimizer='adam',
                  loss='sparse_categorical_crossentropy',
                  )
    model.fit(training_set, training_labels, epochs)
    return model
But when I implement the model without wrapping it in a function, I get better accuracy. Why does this happen?
Hello mahyaalizadeh,
Neural network algorithms are stochastic. This means they make use of randomness, such as initialising the weights randomly, and as a result the same network trained on the same data can produce different results.
This can be confusing, as the algorithm appears unstable, but it is in fact unstable by design: the random initialisation allows the network to learn a good approximation of the function being learned.
You can try to set the seed to a particular number at the beginning of your program:
import tensorflow as tf
tf.random.set_seed(221)
I used 221 for illustration purposes, but it can be any number.
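If you want runs to be more repeatable, it can also help to seed Python's random module and NumPy, since data shuffling and other utilities may draw on them. A minimal sketch, assuming TensorFlow 2.x:

import random
import numpy as np
import tensorflow as tf

random.seed(221)         # Python's built-in RNG
np.random.seed(221)      # NumPy RNG (e.g. data shuffling)
tf.random.set_seed(221)  # TensorFlow RNG (weight initialisation, dropout, ...)

Note that even with fixed seeds, some GPU operations can be non-deterministic, so small run-to-run differences may remain.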
Best,
German
Hello German.
Thank you for your response.
But the accuracies are very different (about 10%), even when I use tf.random.set_seed().
Is this normal?
I think I realized my mistake:
I put model.fit() inside the function, and I think that was wrong.
Am I right? And if I am, could you please explain the reason to me?
Thank you.
Hi mahyaalizadeh,
Calling model.fit() inside a separate function should not make any difference unless you end up fitting the model twice, once inside the function and once more outside it. In that case it would indeed affect accuracy, because you would be continuing training on an already fitted model.
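For example (a self-contained sketch with hypothetical random data, just to illustrate the effect), the second fit() call below does not start from fresh weights; it continues training the model left by the first call:

import numpy as np
import tensorflow as tf

# Hypothetical toy data, only to demonstrate repeated fitting.
x_train = np.random.rand(100, 28, 28).astype("float32")
y_train = np.random.randint(0, 10, size=(100,))

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation=tf.nn.relu),
    tf.keras.layers.Dense(10, activation=tf.nn.softmax),
])
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

model.fit(x_train, y_train, epochs=2)  # first training run
model.fit(x_train, y_train, epochs=2)  # weights are NOT reset: training continues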
As a minor point, I noticed you didn't include the metrics parameter in the compilation:
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
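With accuracy tracked, both fit() and evaluate() will report it, so you can compare the two setups directly, e.g. (hypothetical variable names for your test data):

loss, acc = model.evaluate(test_set, test_labels)
print(f"Test accuracy: {acc:.3f}")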
Best