Validation Leaderboard is Not Public

Hi Everyone,

The validation scores are for your own information and debugging, and are not intended to be made public. Please do not share your validation score results on the forum. Also please keep in mind that the validation score results are not meant to be indicative of performance on the test dataset.

Thank you.


Did you actively change the validation set to be misleading, or does it make sense to include the scoring of different approaches in the selection process for which model should be the final submission? Given its size it can obviously only act as a minor indication, but I was planning to use it as a tiebreaker if it comes to that…

Dear Admin team,
With regard to “keep in mind that the validation score results are not meant to be indicative of performance on the test dataset”: if that is true, does this mean I have to rely on my own cross-validation setup?

Otherwise, what is the point of releasing a validation set that does not represent the real test set?

Also, would it be possible to change the rules so that the 3 best predictions are taken into account (like Kaggle does) rather than a single prediction?

Many thanks,

The validation set is meant for debugging, i.e. to check that the submission format is correct — for example, to catch a formatting error when the results show a massively degraded metric.

We do not plan to change rules or ranking metric.

Dear Organizers,

Would it be possible to add a leaderboard for the validation dataset, the same as in previous years? It is very helpful for getting a rough estimate of our model performance on unseen data.



I would like to support that motion!

Or at least display the results instead of just showing “Success”? It would be comforting to know that more than just the submission format is correct.


Hi @vesal, @simonhschaefer,

You should now be able to see your score for validation submissions immediately. However, as mentioned in the Validation Leaderboard thread, the validation leaderboard is not public this year.