Validation accuracy increasing but validation loss is also increasing


I am using a CNN to classify images into 5 classes. My dataset has around 370K images. I am using the Adam optimizer with a learning rate of 0.0001 and a batch size of 32. Surprisingly, validation accuracy improves over the epochs, but validation loss grows constantly.



I am assuming that the model is becoming less and less sure about the validation set, but accuracy still improves because the softmax output for the predicted class remains above the threshold value.



What can be the reason behind this? Any help in this regard would be highly appreciated.
[Loss curve]



[Accuracy curve]





This is a case of overfitting: the training loss goes down while the validation loss goes up.
– seralouk
30 mins ago





Yeah, but in the case of overfitting, shouldn't the validation accuracy also go down?
– Upendar Gareri
22 mins ago




1 Answer
1



I think it's tending toward overfitting. The change in val_loss may be too small to immediately affect the model's performance (val_acc): accuracy depends only on which class receives the highest softmax score, while cross-entropy loss also depends on how confident those scores are. You can picture a linearly separable problem separated by a line: points can drift toward the boundary (raising the loss) without crossing it (leaving accuracy unchanged), and a few confidently wrong predictions can drive the mean loss up even as more points land on the correct side.
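A small numeric sketch of this effect, with made-up softmax outputs for a hypothetical 2-class validation set: between the two "epochs" below, one more sample becomes correctly classified (accuracy rises), but the one remaining error becomes extremely confident, so the mean cross-entropy loss rises too.

```python
import numpy as np

def accuracy(probs, labels):
    # Accuracy looks only at the argmax; confidence is irrelevant.
    return np.mean(np.argmax(probs, axis=1) == labels)

def cross_entropy(probs, labels):
    # Loss looks at the probability assigned to the true class.
    return -np.mean(np.log(probs[np.arange(len(labels)), labels]))

labels = np.array([0, 0, 1, 1])

# "Earlier epoch": two correct, two wrong, all moderately confident.
early = np.array([[0.60, 0.40],
                  [0.60, 0.40],
                  [0.60, 0.40],   # wrong (true label 1)
                  [0.60, 0.40]])  # wrong (true label 1)

# "Later epoch": three correct, but the remaining error is now made
# with extreme confidence, which blows up the mean loss.
late = np.array([[0.55, 0.45],
                 [0.55, 0.45],
                 [0.45, 0.55],
                 [0.99, 0.01]])  # confidently wrong (true label 1)

print(accuracy(early, labels), cross_entropy(early, labels))  # 0.5,  ~0.71
print(accuracy(late, labels), cross_entropy(late, labels))    # 0.75, ~1.60
```

So accuracy going up and loss going up at the same time is entirely consistent: a handful of increasingly confident mistakes can dominate the loss while the majority of argmax decisions keep improving.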





