Results announcement

Are we going to be able to see the leaderboard today (July 31st)? I saw a new August 1st date on the site.

Hi @fredguth,
Since we had to change from arXiv articles to personally submitted articles, we now have to go through all manuscripts manually. We hope to meet August 1st as the publishing date, but due to the high participation rate it may happen later.
We are working hard to get this done as fast as possible.

Thanks for the info. Can you please share the link to the official leaderboard?

Leaderboards will be posted on our main challenge site at https://challenge2018.isic-archive.com/ . We’ll be sure to make a Forum and email announcement too.

Hi, is August 1st the publishing date, or are delays expected?
Thanks,

We’ll make the announcement on this forum and by email when the final leaderboard goes up. Otherwise, we have no updates yet.

Hi, would you be able to disclose the approximate number of participants this year? I was just curious.

Mmmm … why is this taking so long?
Thanks,

We received about 300 submissions in total, and a corresponding number of manuscripts to screen.

That sounds like a lot of work; I had no idea there was such a large cohort of participants.

Hello all. Here it is: Leaderboard initial release

Hi folks, any idea when the extended metrics will be published? Our group is particularly interested in the accuracies and AUCs for each lesion class; any chance those will be made public?

Any update on Eduardo’s request? When are you planning to make the additional metrics available?

Hi all,

Unfortunately, there still seem to be infrastructure issues delaying implementation of the secondary metrics on the official platform. Since you are rightfully expecting them, let me share preliminary secondary metrics for Task 3 within this post (and the next, due to character limits).
Caveat: these are not official numbers, but ones I calculated offline with a different backend. Although I used standard implementations, I haven’t tested the results extensively (e.g. for rounding), so I can’t guarantee there won’t be differences from the future official leaderboard metrics. I still hope they are useful to you.

Please note that I will be traveling over the next hours (and days), so I will most likely not be able to answer questions here.
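
For anyone who wants to cross-check a prediction file against these numbers, below is a minimal sketch (not the official evaluation code) of the kind of one-vs-rest computation the columns correspond to. It assumes scikit-learn, a hypothetical `y_true` array of integer ground-truth labels and `y_score` array of per-class probabilities, with the class order simply mirroring the table header.

```python
import numpy as np
from sklearn.metrics import confusion_matrix, roc_auc_score

# Assumed class order, mirroring the column order in the tables below.
CLASSES = ["MEL", "BCC", "AKIEC", "NV", "BKL", "DF", "VASC"]

def secondary_metrics(y_true, y_score):
    """Per-class one-vs-rest Sens/Spec/PPV/NPV/AUC and the macro-averaged AUC.

    y_true  : (n,) integer ground-truth labels, indices into CLASSES
    y_score : (n, 7) predicted probabilities, columns in CLASSES order
    """
    y_true, y_score = np.asarray(y_true), np.asarray(y_score)
    y_pred = y_score.argmax(axis=1)                # hard label = most probable class
    rows, aucs = [], []
    for k, name in enumerate(CLASSES):
        t = (y_true == k).astype(int)              # binarize: class k vs. rest
        p = (y_pred == k).astype(int)
        tn, fp, fn, tp = confusion_matrix(t, p, labels=[0, 1]).ravel()
        auc = roc_auc_score(t, y_score[:, k])      # AUC from the class-k probability
        aucs.append(auc)
        rows.append({
            "class": name,
            "sens": tp / (tp + fn),
            "spec": tn / (tn + fp),
            "ppv":  tp / (tp + fp) if (tp + fp) else float("nan"),
            "npv":  tn / (tn + fn) if (tn + fn) else float("nan"),
            "auc":  auc,
        })
    return rows, float(np.mean(aucs))              # avgAUC = mean over the 7 classes
```

The guards on PPV/NPV also explain the NaN entries that appear in a few rows below: they occur when a class was never predicted for any test image.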

Task 3 — Test-Set — Preliminary Secondary Metrics (1/2)

team_name approach_name MEL_Sens MEL_Spec MEL_PPV MEL_NPV MEL_AUC BCC_Sens BCC_Spec BCC_PPV BCC_NPV BCC_AUC AKIEC_Sens AKIEC_Spec AKIEC_PPV AKIEC_NPV AKIEC_AUC NV_Sens NV_Spec NV_PPV NV_NPV NV_AUC BKL_Sens BKL_Spec BKL_PPV BKL_NPV BKL_AUC DF_Sens DF_Spec DF_PPV DF_NPV DF_AUC VASC_Sens VASC_Spec VASC_PPV VASC_NPV VASC_AUC avgAUC
MetaOptima Technology Inc. Top 10 Models Averaged 0.825 0.931 0.605 0.977 0.949 0.925 0.983 0.782 0.995 0.997 0.837 0.988 0.679 0.995 0.987 0.845 0.977 0.982 0.807 0.979 0.829 0.974 0.841 0.971 0.974 0.932 0.976 0.539 0.998 0.992 1.000 0.994 0.795 1.000 1.000 0.983
MetaOptima Technology Inc. Meta Ensemble 0.825 0.922 0.576 0.976 0.946 0.892 0.989 0.847 0.993 0.997 0.860 0.982 0.587 0.996 0.985 0.849 0.970 0.977 0.810 0.981 0.816 0.979 0.868 0.969 0.977 0.932 0.983 0.621 0.998 0.990 1.000 0.993 0.761 1.000 1.000 0.982
MetaOptima Technology Inc. Best Single Model 0.819 0.919 0.562 0.975 0.948 0.914 0.986 0.810 0.994 0.996 0.860 0.987 0.661 0.996 0.981 0.809 0.975 0.980 0.773 0.978 0.834 0.965 0.801 0.972 0.971 0.886 0.980 0.565 0.997 0.986 0.971 0.985 0.607 0.999 0.999 0.980
DAISYLab Large Ensemble with heavy multi-cropping and loss weighting 0.813 0.954 0.692 0.976 0.959 0.839 0.989 0.830 0.989 0.995 0.884 0.988 0.679 0.997 0.995 0.928 0.937 0.957 0.897 0.983 0.885 0.987 0.919 0.981 0.990 0.727 0.998 0.914 0.992 0.987 0.914 0.998 0.914 0.998 0.999 0.987
Medical Image Analysis Group, Sun Yat-sen University Emsemble Of SENET and PNANET with DataAugmentation when TEST 0.743 0.974 0.784 0.967 0.945 0.882 0.985 0.796 0.992 0.992 0.860 0.986 0.649 0.996 0.988 0.934 0.904 0.936 0.901 0.974 0.816 0.975 0.847 0.969 0.969 0.795 0.999 0.946 0.994 0.982 0.886 0.996 0.838 0.997 0.998 0.978
Medical Image Analysis Group, Sun Yat-sen University Emsemble of SENET and PNASNET 0.731 0.977 0.801 0.966 0.945 0.871 0.985 0.794 0.991 0.992 0.814 0.988 0.673 0.995 0.985 0.941 0.881 0.922 0.908 0.973 0.779 0.974 0.833 0.963 0.966 0.773 0.999 0.944 0.993 0.982 0.857 0.996 0.833 0.997 0.998 0.977
Li densenet 0.673 0.985 0.852 0.959 0.949 0.849 0.990 0.849 0.990 0.992 0.767 0.993 0.767 0.993 0.991 0.959 0.867 0.916 0.934 0.974 0.848 0.974 0.848 0.974 0.977 0.750 0.995 0.825 0.993 0.980 0.857 0.999 0.938 0.997 0.998 0.980
Ask Sina Approach 3 : Average of Approach 1 and 2 0.830 0.880 0.469 0.976 0.883 0.828 0.980 0.733 0.989 0.977 0.767 0.993 0.750 0.993 0.979 0.786 0.957 0.965 0.748 0.969 0.811 0.953 0.743 0.968 0.937 0.773 0.991 0.723 0.993 0.976 0.886 0.997 0.886 0.997 0.998 0.960
RECOD Titans Average of 15 Deep Learning Models Trained Only with Challenge Data 0.608 0.961 0.667 0.951 0.934 0.849 0.982 0.760 0.990 0.991 0.860 0.979 0.544 0.996 0.983 0.911 0.884 0.922 0.868 0.965 0.770 0.965 0.788 0.962 0.958 0.795 0.995 0.814 0.994 0.987 0.829 0.999 0.935 0.996 0.997 0.974
Ask Sina Approach 1 : 7 Classes Classifier 0.789 0.904 0.513 0.971 0.923 0.806 0.973 0.664 0.987 0.979 0.791 0.988 0.667 0.994 0.980 0.839 0.945 0.958 0.796 0.967 0.760 0.968 0.797 0.960 0.950 0.773 0.992 0.739 0.993 0.985 0.857 0.996 0.833 0.997 0.999 0.969
Ask Sina Approach 2 : 2 Steps Classifier (2C , 6C) 0.836 0.832 0.389 0.976 0.782 0.828 0.982 0.748 0.989 0.974 0.767 0.991 0.717 0.993 0.976 0.673 0.968 0.970 0.663 0.960 0.834 0.921 0.640 0.971 0.916 0.773 0.991 0.723 0.993 0.958 0.829 0.997 0.853 0.996 0.996 0.938
NWPU-SAIIP FV+Res101 0.661 0.965 0.706 0.957 0.813 0.828 0.984 0.778 0.989 0.906 0.744 0.988 0.640 0.992 0.866 0.925 0.879 0.920 0.886 0.902 0.774 0.961 0.771 0.962 0.868 0.795 0.996 0.854 0.994 0.896 0.771 0.998 0.900 0.995 0.885 0.876
Wonlab in Sungkyunkwan University, Korea, Republic of WonDerM: Skin Lesion Classification with Fine-tuned Neural Networks 0.620 0.968 0.711 0.952 0.939 0.860 0.985 0.792 0.991 0.991 0.628 0.988 0.614 0.989 0.982 0.926 0.877 0.919 0.888 0.966 0.797 0.971 0.824 0.966 0.964 0.750 0.984 0.589 0.992 0.983 0.914 0.997 0.889 0.998 0.997 0.974
vess Resnext101 & DPN92, Snapshot ensamble, D4 TTA 0.614 0.974 0.750 0.952 0.794 0.849 0.987 0.806 0.990 0.918 0.767 0.991 0.717 0.993 0.879 0.939 0.866 0.913 0.905 0.903 0.839 0.961 0.784 0.973 0.900 0.682 0.999 0.968 0.991 0.841 0.800 0.999 0.933 0.995 0.899 0.876
Medical Image Analysis Group, Sun Yat-sen University Emsemble Of ResNet-152 0.772 0.905 0.510 0.969 0.940 0.871 0.981 0.750 0.991 0.991 0.581 0.991 0.658 0.988 0.977 0.844 0.934 0.950 0.799 0.966 0.793 0.955 0.748 0.965 0.958 0.727 0.998 0.914 0.992 0.987 0.886 0.997 0.886 0.997 0.996 0.974
LMU DataMining thresholding DF AKIEC MEL VASC BKL 0.754 0.878 0.440 0.966 0.923 0.785 0.983 0.753 0.986 0.983 0.651 0.988 0.622 0.990 0.960 0.776 0.954 0.962 0.739 0.966 0.760 0.950 0.717 0.959 0.943 0.818 0.978 0.529 0.994 0.966 0.914 0.991 0.711 0.998 0.994 0.962
NWPU-SAIIP FV+Res50 fine-tuning 0.667 0.967 0.722 0.958 0.817 0.796 0.991 0.851 0.987 0.893 0.744 0.982 0.552 0.992 0.863 0.927 0.876 0.918 0.889 0.901 0.793 0.961 0.775 0.965 0.877 0.750 0.996 0.846 0.993 0.873 0.771 0.998 0.900 0.995 0.885 0.873
LMU DataMining thresholding DF AKIEC 0.690 0.922 0.529 0.959 0.928 0.796 0.983 0.755 0.987 0.984 0.651 0.988 0.622 0.990 0.963 0.849 0.929 0.947 0.803 0.968 0.779 0.958 0.758 0.963 0.945 0.818 0.978 0.529 0.994 0.965 0.857 0.993 0.750 0.997 0.994 0.964
University of Washington Densenet201 0.632 0.936 0.557 0.952 0.881 0.785 0.982 0.745 0.986 0.981 0.744 0.985 0.593 0.992 0.981 0.855 0.905 0.932 0.805 0.951 0.756 0.939 0.675 0.958 0.923 0.750 0.990 0.702 0.992 0.941 0.857 0.992 0.714 0.997 0.986 0.949
LMU DataMining No thresholding 0.696 0.922 0.531 0.960 0.931 0.796 0.982 0.747 0.987 0.984 0.558 0.992 0.667 0.987 0.963 0.874 0.925 0.946 0.830 0.970 0.788 0.953 0.737 0.964 0.945 0.795 0.996 0.854 0.994 0.967 0.857 0.993 0.750 0.997 0.994 0.965
Holidayburned Ensemble_of_resnet_and_inception_iteration_20000 0.830 0.856 0.424 0.975 0.926 0.839 0.980 0.736 0.989 0.991 0.512 0.993 0.687 0.986 0.984 0.762 0.960 0.966 0.728 0.951 0.793 0.943 0.699 0.964 0.956 0.773 0.996 0.850 0.993 0.986 0.771 0.994 0.750 0.995 0.997 0.970
DeepOncology.AI DeepOncology.AI-real_test_0.915_ensemble 0.591 0.974 0.743 0.949 0.882 0.839 0.988 0.821 0.989 0.990 0.698 0.992 0.714 0.991 0.975 0.967 0.818 0.889 0.943 0.965 0.751 0.981 0.867 0.959 0.945 0.682 0.999 0.937 0.991 0.955 0.743 0.997 0.867 0.994 0.996 0.958
University of Minnesota Center for Distributed Robotics An ensemble of resnet, densenet, inception, xception, and inceptionresnet 0.702 0.934 0.577 0.961 0.929 0.839 0.982 0.757 0.989 0.991 0.674 0.991 0.690 0.990 0.973 0.926 0.900 0.933 0.890 0.974 0.774 0.976 0.844 0.963 0.965 0.636 0.998 0.903 0.989 0.983 0.714 0.999 0.926 0.993 0.998 0.973
NWPU-SAIIP finetuning res101 0.667 0.901 0.462 0.955 0.784 0.753 0.972 0.642 0.984 0.863 0.814 0.953 0.337 0.994 0.883 0.774 0.934 0.946 0.733 0.854 0.664 0.943 0.661 0.944 0.803 0.795 0.990 0.700 0.994 0.893 0.771 0.991 0.675 0.995 0.881 0.852
Li Exp 5 0.673 0.963 0.701 0.958 0.945 0.785 0.981 0.730 0.986 0.987 0.628 0.987 0.587 0.989 0.978 0.956 0.852 0.907 0.928 0.970 0.724 0.983 0.877 0.955 0.960 0.727 0.997 0.889 0.992 0.982 0.743 0.998 0.897 0.994 0.999 0.974
Holidayburned Ensemble_of_resnet_and_Inception 0.825 0.861 0.431 0.975 0.922 0.817 0.982 0.745 0.988 0.987 0.558 0.991 0.649 0.987 0.978 0.772 0.940 0.951 0.733 0.953 0.793 0.947 0.717 0.965 0.949 0.682 0.998 0.909 0.991 0.988 0.771 0.995 0.771 0.995 0.997 0.968
Li 1-resins 0.690 0.963 0.707 0.961 0.935 0.796 0.979 0.712 0.986 0.980 0.581 0.992 0.676 0.988 0.958 0.958 0.834 0.897 0.930 0.969 0.719 0.985 0.891 0.954 0.947 0.705 1.000 1.000 0.991 0.973 0.714 0.999 0.926 0.993 0.983 0.964
BIL, NTU Conv. Ensemble (Inception Models Normalized+Un-Norm) + avg. pooling 0.614 0.971 0.729 0.952 0.934 0.753 0.994 0.886 0.984 0.988 0.698 0.995 0.811 0.991 0.988 0.969 0.818 0.889 0.946 0.974 0.765 0.971 0.814 0.961 0.973 0.614 0.999 0.964 0.989 0.966 0.743 0.998 0.897 0.994 0.984 0.972
Tufts University School of Medicine Convnet ResNet50 with cyclical learning rates V2 0.573 0.968 0.695 0.947 0.932 0.817 0.984 0.768 0.988 0.988 0.605 0.991 0.667 0.988 0.976 0.946 0.829 0.893 0.911 0.964 0.737 0.968 0.792 0.956 0.953 0.750 0.995 0.805 0.993 0.988 0.714 0.999 0.926 0.993 0.998 0.971
Le-Health Classification Using the Cascade Structure 0.602 0.951 0.609 0.949 0.913 0.817 0.980 0.724 0.988 0.978 0.791 0.980 0.540 0.994 0.978 0.901 0.856 0.904 0.851 0.957 0.737 0.956 0.737 0.956 0.948 0.545 1.000 1.000 0.987 0.949 0.743 0.999 0.929 0.994 0.984 0.958
BIL, NTU Conv. Ensemble (Inception Models) + avg. pooling 0.673 0.966 0.714 0.959 0.931 0.753 0.992 0.854 0.984 0.987 0.674 0.995 0.806 0.991 0.987 0.970 0.826 0.894 0.949 0.973 0.751 0.978 0.849 0.959 0.972 0.591 0.999 0.963 0.988 0.971 0.714 0.999 0.926 0.993 0.974 0.971
RECOD Titans XGB Ensemble of 43 Deep Learning Models 0.620 0.940 0.570 0.951 0.904 0.763 0.994 0.899 0.985 0.967 0.558 0.990 0.632 0.987 0.961 0.936 0.814 0.884 0.894 0.962 0.682 0.977 0.831 0.948 0.944 0.705 0.997 0.861 0.991 0.941 0.857 0.999 0.938 0.997 0.986 0.952
DeepOncology.AI DeepOncology.AI-real_test_0.877_resnet152_finetune_90.2458_0.347_66_2018-07-13_1 0.620 0.963 0.679 0.952 0.794 0.860 0.980 0.741 0.991 0.983 0.512 0.994 0.710 0.986 0.948 0.906 0.862 0.908 0.860 0.949 0.742 0.947 0.703 0.956 0.867 0.705 0.990 0.689 0.991 0.959 0.771 0.994 0.750 0.995 0.987 0.927
350818 multidimensional ensembling 0.702 0.944 0.615 0.961 0.943 0.656 0.991 0.824 0.978 0.985 0.465 0.999 0.952 0.985 0.983 0.936 0.886 0.925 0.902 0.974 0.816 0.952 0.741 0.969 0.955 0.636 0.999 0.966 0.989 0.989 0.886 0.998 0.912 0.997 0.999 0.975
Nile University - MIIP Ensembles of ResNet50, ResNet34, InceptionV3 0.667 0.952 0.640 0.957 0.809 0.796 0.985 0.779 0.987 0.890 0.698 0.988 0.625 0.991 0.843 0.947 0.842 0.901 0.914 0.895 0.691 0.976 0.829 0.950 0.834 0.523 0.999 0.920 0.986 0.761 0.771 0.999 0.931 0.995 0.885 0.845
QuindiTech: Xuan Li, Peng Xu Inception Ensemble 0.596 0.972 0.734 0.950 0.784 0.763 0.985 0.772 0.984 0.874 0.605 0.995 0.788 0.988 0.800 0.970 0.804 0.882 0.947 0.887 0.733 0.976 0.837 0.956 0.854 0.705 1.000 1.000 0.991 0.852 0.714 0.999 0.926 0.993 0.856 0.844
Tufts University School of Medicine Convnet ResNet50 with cyclical learning rates V1 0.632 0.962 0.679 0.953 0.940 0.785 0.984 0.760 0.986 0.984 0.605 0.995 0.765 0.988 0.973 0.943 0.824 0.890 0.905 0.967 0.742 0.970 0.805 0.957 0.958 0.773 0.998 0.919 0.993 0.985 0.600 0.999 0.913 0.991 0.998 0.972
Tandon Titans ResNet-50-Post-Processing 0.719 0.917 0.526 0.962 0.914 0.742 0.986 0.775 0.983 0.982 0.651 0.988 0.622 0.990 0.962 0.889 0.892 0.925 0.842 0.957 0.724 0.956 0.734 0.954 0.933 0.636 0.999 0.933 0.989 0.989 0.714 0.999 0.926 0.993 0.986 0.960
RECOD Titans Average of 8 Deep Learning Models Augmented with Synthetic Images 0.550 0.961 0.644 0.944 0.929 0.774 0.995 0.911 0.985 0.993 0.628 0.992 0.692 0.989 0.978 0.967 0.769 0.863 0.939 0.968 0.631 0.982 0.856 0.941 0.955 0.750 0.995 0.825 0.993 0.984 0.771 0.998 0.900 0.995 0.995 0.972
Texas A&M Aggies Sequential PNASNet Classification based on Balanced Color-Normed Dataset 0.655 0.943 0.596 0.955 0.799 0.817 0.976 0.691 0.988 0.897 0.605 0.989 0.619 0.988 0.797 0.906 0.846 0.898 0.857 0.876 0.618 0.968 0.766 0.938 0.793 0.727 0.988 0.653 0.992 0.858 0.743 0.997 0.839 0.994 0.870 0.841
Hangzhou Dianzi University CAD429 CNN; ensemble learning; multi-features. 0.684 0.968 0.731 0.960 0.826 0.763 0.991 0.845 0.985 0.877 0.465 0.993 0.645 0.984 0.729 0.956 0.844 0.902 0.927 0.900 0.825 0.970 0.821 0.971 0.897 0.614 0.999 0.964 0.989 0.806 0.743 0.999 0.929 0.994 0.871 0.844
University of Washington Densenet169_nesterov 0.608 0.957 0.642 0.950 0.888 0.731 0.982 0.731 0.982 0.960 0.535 0.988 0.575 0.986 0.965 0.913 0.842 0.897 0.865 0.935 0.677 0.962 0.750 0.947 0.922 0.727 0.984 0.582 0.992 0.908 0.857 0.993 0.732 0.997 0.976 0.936
QuindiTech: Xuan Li, Peng Xu Resnet152 Ensemble with Other Models 0.596 0.971 0.723 0.950 0.784 0.763 0.985 0.772 0.984 0.874 0.605 0.994 0.743 0.988 0.799 0.968 0.799 0.879 0.943 0.884 0.724 0.978 0.844 0.955 0.851 0.705 1.000 1.000 0.991 0.852 0.686 0.999 0.923 0.993 0.842 0.841
2nd Appinion Wide residual network applied to 7-class skin lesion classification 0.749 0.925 0.559 0.966 0.939 0.828 0.986 0.794 0.989 0.991 0.628 0.993 0.711 0.989 0.974 0.877 0.877 0.915 0.825 0.960 0.802 0.959 0.767 0.967 0.960 0.477 0.998 0.875 0.985 0.940 0.686 0.999 0.923 0.993 0.982 0.964
Hdu CAD429 Ensemble with many Multi Scale Convolutional Neural Network 0.678 0.970 0.744 0.959 0.824 0.774 0.991 0.847 0.985 0.883 0.465 0.993 0.645 0.984 0.729 0.956 0.839 0.899 0.927 0.898 0.820 0.968 0.813 0.970 0.894 0.591 0.999 0.963 0.988 0.795 0.743 0.999 0.929 0.994 0.871 0.842
University of Washington Densenet161 0.731 0.922 0.543 0.964 0.880 0.731 0.982 0.723 0.982 0.965 0.581 0.983 0.500 0.988 0.972 0.824 0.925 0.943 0.777 0.943 0.793 0.923 0.635 0.964 0.917 0.591 0.990 0.650 0.988 0.884 0.771 0.996 0.818 0.995 0.966 0.932
Tandon Titans ResNet50-with-hair-removal 0.754 0.837 0.372 0.964 0.879 0.731 0.984 0.747 0.982 0.978 0.605 0.984 0.531 0.988 0.957 0.743 0.947 0.955 0.710 0.941 0.724 0.927 0.623 0.952 0.911 0.636 0.997 0.875 0.989 0.989 0.829 0.997 0.879 0.996 0.985 0.948
Dysion AI Technology Co., Ltd Deep Model Ensemble with Data Alignment v1 0.544 0.944 0.554 0.942 0.905 0.839 0.966 0.619 0.989 0.982 0.674 0.984 0.558 0.990 0.977 0.905 0.847 0.899 0.856 0.956 0.668 0.971 0.792 0.946 0.941 0.636 0.994 0.757 0.989 0.968 0.743 0.997 0.839 0.994 0.993 0.960
AI Toulouse Using Znet and bagging from two first approachs 0.673 0.919 0.513 0.956 0.852 0.871 0.971 0.664 0.991 0.950 0.535 0.982 0.469 0.986 0.845 0.850 0.897 0.926 0.799 0.914 0.691 0.951 0.701 0.948 0.890 0.727 0.997 0.889 0.992 0.896 0.657 0.994 0.719 0.992 0.869 0.888
QuindiTech: Xuan Li, Peng Xu vgg ensemble 0.591 0.972 0.727 0.949 0.781 0.731 0.987 0.791 0.982 0.859 0.628 0.993 0.711 0.989 0.810 0.967 0.793 0.875 0.941 0.880 0.747 0.978 0.848 0.958 0.862 0.659 1.000 1.000 0.990 0.830 0.657 0.999 0.920 0.992 0.828 0.836
Dysion AI Technology Co., Ltd Deep Model Ensemble with Data Alignment v2 0.444 0.960 0.585 0.931 0.893 0.839 0.956 0.553 0.989 0.985 0.721 0.984 0.564 0.992 0.979 0.915 0.834 0.893 0.867 0.953 0.645 0.967 0.765 0.942 0.930 0.636 0.992 0.700 0.989 0.967 0.771 0.997 0.871 0.995 0.995 0.958
DeepOncology.AI DeepOncology.AI-real_test_0.849_resnet101_finetune_92.3418_0.2891_111_2018-07-15 0.602 0.943 0.575 0.949 0.848 0.806 0.984 0.773 0.987 0.978 0.628 0.990 0.643 0.989 0.969 0.931 0.819 0.886 0.887 0.944 0.696 0.974 0.816 0.950 0.908 0.614 0.999 0.964 0.989 0.934 0.686 0.999 0.923 0.993 0.969 0.936
Department of Dermatology, University of Rzeszów, Poland ResNet101-SGD 0.585 0.948 0.588 0.947 0.887 0.753 0.972 0.636 0.984 0.966 0.605 0.984 0.520 0.988 0.952 0.894 0.859 0.905 0.844 0.950 0.710 0.953 0.716 0.951 0.927 0.614 0.997 0.844 0.989 0.958 0.800 0.994 0.757 0.995 0.988 0.947
QuindiTech: Yuchen Lu ls + resnet + ens 0.632 0.966 0.706 0.954 0.901 0.753 0.985 0.769 0.984 0.983 0.651 0.989 0.636 0.990 0.943 0.949 0.808 0.881 0.914 0.946 0.710 0.971 0.806 0.952 0.941 0.659 0.999 0.935 0.990 0.942 0.600 0.999 0.913 0.991 0.986 0.949
BioImaging-KHU Deep Learning with Adapted InceptionResNetV2 0.649 0.937 0.566 0.954 0.912 0.774 0.984 0.758 0.985 0.982 0.674 0.977 0.460 0.990 0.937 0.876 0.876 0.914 0.824 0.952 0.691 0.940 0.661 0.948 0.929 0.614 0.995 0.794 0.988 0.977 0.657 0.998 0.885 0.992 0.980 0.953
Dominiks AI team Above dermatologist-level classification of malignant melanomas with deep neural 0.877 0.606 0.221 0.975 0.671 0.774 0.984 0.758 0.985 0.919 0.674 0.983 0.537 0.990 0.868 0.524 0.980 0.975 0.578 0.658 0.470 0.987 0.857 0.917 0.675 0.773 0.997 0.895 0.993 0.909 0.829 0.993 0.744 0.996 0.971 0.810
Mammoth Old fashion 0.673 0.934 0.567 0.957 0.893 0.688 0.967 0.577 0.979 0.967 0.628 0.988 0.600 0.989 0.973 0.862 0.900 0.929 0.813 0.946 0.700 0.932 0.633 0.949 0.908 0.682 0.993 0.732 0.990 0.977 0.686 0.997 0.857 0.993 0.981 0.949
AI Toulouse ZNet classification with additional data 0.608 0.931 0.528 0.949 0.853 0.871 0.975 0.692 0.991 0.957 0.512 0.986 0.524 0.986 0.847 0.881 0.864 0.907 0.828 0.922 0.673 0.952 0.702 0.946 0.892 0.750 0.998 0.917 0.993 0.896 0.629 0.995 0.759 0.991 0.869 0.891
Redha Ali, Russell C. Hardie, Manawaduge Supun De Silva, and Temesguen Messay Ke Combining Deep and Handcrafted Image Features for Skin Cancer Classification 0.626 0.935 0.552 0.951 0.780 0.817 0.955 0.543 0.988 0.886 0.744 0.971 0.432 0.992 0.858 0.791 0.949 0.959 0.751 0.870 0.793 0.907 0.589 0.963 0.850 0.682 0.993 0.750 0.990 0.838 0.457 0.996 0.727 0.987 0.727 0.830
CNR-ISASI_Lecce Deep Convolutional Neural Network with Stochastic Gradient Descent Optimization 0.620 0.949 0.606 0.951 0.908 0.742 0.987 0.784 0.983 0.978 0.651 0.990 0.667 0.990 0.976 0.911 0.833 0.891 0.861 0.954 0.751 0.954 0.734 0.958 0.957 0.591 0.996 0.812 0.988 0.979 0.629 0.999 0.917 0.991 0.992 0.963
BioImaging-KHU Deep Learning with Adapted ResNet-50 0.696 0.916 0.515 0.959 0.916 0.742 0.971 0.627 0.983 0.969 0.488 0.992 0.636 0.985 0.944 0.877 0.871 0.911 0.824 0.936 0.691 0.960 0.743 0.949 0.925 0.705 0.998 0.912 0.991 0.966 0.686 0.998 0.889 0.993 0.982 0.949
Hosei University, Iyatomi lab SEResNet101 w/ mean_teacher + SEResNet152 w/o mean_teacher 0.696 0.959 0.684 0.961 0.943 0.699 0.987 0.783 0.980 0.979 0.674 0.991 0.690 0.990 0.977 0.956 0.791 0.873 0.923 0.960 0.700 0.985 0.889 0.951 0.955 0.455 1.000 1.000 0.984 0.978 0.686 0.998 0.889 0.993 0.996 0.970
Manu Goyal DeeplabV3+ with Priority strategy based on benign/maligant and number of imag 0.585 0.928 0.510 0.946 0.757 0.785 0.983 0.753 0.986 0.884 0.488 0.980 0.420 0.985 0.734 0.867 0.862 0.905 0.811 0.865 0.525 0.983 0.838 0.925 0.754 0.841 0.970 0.457 0.995 0.905 0.771 0.963 0.333 0.994 0.867 0.824
QuindiTech: Yuchen Lu resnet all data 0.655 0.963 0.696 0.956 0.912 0.774 0.979 0.706 0.985 0.971 0.674 0.984 0.558 0.990 0.957 0.942 0.831 0.893 0.904 0.943 0.714 0.971 0.803 0.953 0.924 0.523 0.999 0.920 0.986 0.962 0.571 0.999 0.952 0.990 0.989 0.951
PA_Tech deep convolutional neural network with transfer learning 0.637 0.954 0.637 0.954 0.924 0.753 0.984 0.753 0.984 0.983 0.581 0.985 0.532 0.988 0.963 0.921 0.841 0.897 0.876 0.952 0.687 0.957 0.730 0.948 0.933 0.636 0.994 0.757 0.989 0.976 0.629 0.997 0.815 0.991 0.995 0.961
Hosei University, Iyatomi lab SEResNet152 w/ mean_teacher + SEResNet152 w/o mean_teacher (10fold ensemble) 0.667 0.960 0.683 0.958 0.945 0.742 0.987 0.784 0.983 0.977 0.605 0.993 0.703 0.988 0.969 0.959 0.776 0.866 0.927 0.963 0.668 0.986 0.890 0.947 0.959 0.455 0.999 0.909 0.984 0.987 0.743 0.999 0.929 0.994 0.996 0.971
Hosei University, Iyatomi lab SEResNet101 w/ mean_teacher + SEResNet152 w/o mean_teacher + 10fold SEResNets 0.661 0.962 0.689 0.957 0.946 0.720 0.985 0.761 0.982 0.979 0.674 0.993 0.725 0.990 0.976 0.959 0.776 0.866 0.927 0.963 0.687 0.987 0.898 0.949 0.959 0.477 1.000 1.000 0.985 0.985 0.657 0.998 0.885 0.992 0.996 0.972
Opsins Transfer learning based CNN 0.573 0.963 0.662 0.946 0.930 0.667 0.992 0.838 0.978 0.983 0.628 0.992 0.692 0.989 0.958 0.946 0.811 0.883 0.909 0.964 0.751 0.958 0.751 0.958 0.943 0.727 0.995 0.821 0.992 0.960 0.543 0.999 0.905 0.989 0.968 0.958
Mammoth Luan Dun 0.696 0.934 0.572 0.960 0.912 0.699 0.971 0.613 0.980 0.967 0.721 0.984 0.564 0.992 0.960 0.876 0.896 0.927 0.827 0.949 0.687 0.940 0.659 0.947 0.915 0.614 0.994 0.750 0.988 0.867 0.543 0.998 0.864 0.989 0.773 0.906
Manu Goyal DeeplabV3+ with Priority strategy based on no. of images in Dataset 0.480 0.954 0.569 0.935 0.717 0.613 0.992 0.826 0.975 0.802 0.581 0.963 0.313 0.987 0.772 0.867 0.862 0.905 0.811 0.865 0.567 0.973 0.778 0.931 0.770 0.841 0.971 0.462 0.995 0.906 0.886 0.946 0.282 0.997 0.916 0.821

Task 3 — Test-Set — Preliminary Secondary Metrics (2/2)

team_name approach_name MEL_Sens MEL_Spec MEL_PPV MEL_NPV MEL_AUC BCC_Sens BCC_Spec BCC_PPV BCC_NPV BCC_AUC AKIEC_Sens AKIEC_Spec AKIEC_PPV AKIEC_NPV AKIEC_AUC NV_Sens NV_Spec NV_PPV NV_NPV NV_AUC BKL_Sens BKL_Spec BKL_PPV BKL_NPV BKL_AUC DF_Sens DF_Spec DF_PPV DF_NPV DF_AUC VASC_Sens VASC_Spec VASC_PPV VASC_NPV VASC_AUC avgAUC
BIL, NTU Conv. Ensemble (Inception Models Normalized) + avg. pooling 0.561 0.977 0.756 0.946 0.930 0.645 0.994 0.870 0.977 0.987 0.558 0.995 0.750 0.987 0.987 0.969 0.791 0.875 0.945 0.969 0.747 0.956 0.740 0.957 0.963 0.545 0.998 0.889 0.987 0.967 0.800 0.998 0.903 0.995 0.988 0.970
BioImaging-KHU Deep Learning with Adapted InceptionV3 0.673 0.935 0.569 0.957 0.917 0.645 0.989 0.800 0.977 0.981 0.558 0.989 0.600 0.987 0.962 0.903 0.849 0.900 0.853 0.947 0.728 0.952 0.718 0.954 0.932 0.659 0.995 0.784 0.990 0.939 0.657 0.998 0.885 0.992 0.981 0.951
UNIST_BMIPL Multiscale Lesion Segmentation and Application to Skin Cancer Classification 0.807 0.825 0.371 0.971 0.897 0.613 0.982 0.695 0.975 0.971 0.674 0.985 0.569 0.990 0.969 0.761 0.887 0.910 0.711 0.894 0.419 0.974 0.734 0.909 0.889 0.705 0.993 0.738 0.991 0.975 0.829 0.965 0.358 0.996 0.974 0.938
Dysion AI Technology Co., Ltd Deep Model Ensemble without Data Alignment v2 0.398 0.961 0.567 0.926 0.890 0.839 0.951 0.531 0.989 0.984 0.605 0.986 0.565 0.988 0.978 0.916 0.814 0.881 0.866 0.952 0.636 0.966 0.758 0.941 0.927 0.636 0.993 0.737 0.989 0.961 0.771 0.995 0.794 0.995 0.994 0.955
Manu Goyal DeeplabV3+ with strategy based on maximum no. of pixel per class 0.608 0.921 0.495 0.949 0.765 0.731 0.984 0.756 0.982 0.858 0.512 0.978 0.400 0.986 0.745 0.870 0.856 0.901 0.814 0.863 0.512 0.987 0.867 0.923 0.749 0.795 0.973 0.473 0.994 0.884 0.771 0.966 0.351 0.994 0.869 0.819
Mammoth Lu Zhu 0.684 0.933 0.565 0.959 0.907 0.699 0.969 0.596 0.980 0.958 0.698 0.984 0.556 0.991 0.958 0.880 0.894 0.926 0.832 0.948 0.687 0.944 0.671 0.947 0.910 0.636 0.995 0.800 0.989 0.863 0.514 0.998 0.857 0.989 0.827 0.910
Nitwit AI Inception V3 0.632 0.962 0.679 0.953 0.797 0.774 0.980 0.713 0.985 0.877 0.581 0.986 0.543 0.988 0.784 0.954 0.798 0.877 0.920 0.876 0.631 0.981 0.846 0.941 0.806 0.682 0.999 0.937 0.991 0.840 0.543 0.997 0.826 0.989 0.770 0.821
The Homeboy’s Fusion of classical and DL technique 0.655 0.904 0.465 0.954 0.892 0.677 0.972 0.618 0.979 0.956 0.488 0.986 0.500 0.985 0.962 0.824 0.887 0.917 0.770 0.936 0.696 0.936 0.645 0.948 0.917 0.705 0.994 0.775 0.991 0.973 0.743 0.993 0.722 0.994 0.995 0.947
Tencent Youtu Lab ResNet based skin lesion diagnosis with triplet loss and feature similarity 0.608 0.961 0.667 0.951 0.930 0.753 0.987 0.795 0.984 0.967 0.488 0.992 0.636 0.985 0.943 0.946 0.826 0.891 0.910 0.962 0.724 0.958 0.744 0.954 0.939 0.636 0.996 0.824 0.989 0.970 0.629 0.998 0.880 0.991 0.967 0.954
Persistent Systems Two-stage hierarchical classifier 0.509 0.952 0.576 0.938 0.763 0.667 0.970 0.590 0.978 0.930 0.744 0.960 0.356 0.992 0.951 0.869 0.834 0.888 0.809 0.895 0.581 0.961 0.712 0.932 0.858 0.705 0.982 0.534 0.991 0.917 0.686 0.988 0.585 0.993 0.902 0.888
DC resnet ensemble 0.737 0.940 0.612 0.966 0.937 0.570 0.991 0.803 0.972 0.965 0.442 0.997 0.792 0.984 0.961 0.911 0.887 0.924 0.869 0.966 0.843 0.936 0.688 0.973 0.964 0.682 0.997 0.882 0.991 0.968 0.571 1.000 1.000 0.990 0.984 0.964
QuindiTech: Yuchen Lu resnet + label_smooth + balance 0.596 0.959 0.650 0.949 0.884 0.753 0.985 0.769 0.984 0.981 0.628 0.988 0.600 0.989 0.936 0.945 0.791 0.872 0.905 0.933 0.682 0.972 0.804 0.948 0.926 0.591 0.998 0.897 0.988 0.927 0.543 0.999 0.905 0.989 0.976 0.938
CNB-CSIC & FTN-UNS Ensemble of transfer learning on VGG16 and GoogLeNet 0.602 0.955 0.632 0.950 0.902 0.774 0.980 0.720 0.985 0.972 0.512 0.980 0.423 0.986 0.932 0.922 0.806 0.877 0.873 0.943 0.636 0.966 0.758 0.941 0.920 0.682 0.997 0.882 0.991 0.920 0.629 0.997 0.846 0.991 0.989 0.940
Università degli Studi di Modena e Reggio Emilia (AImage Lab .zip) inception fine-tuned, weighted loss and data augmentation 0.544 0.953 0.596 0.942 0.748 0.731 0.982 0.723 0.982 0.856 0.721 0.984 0.564 0.992 0.852 0.934 0.789 0.870 0.888 0.862 0.664 0.974 0.809 0.945 0.819 0.568 0.997 0.862 0.987 0.783 0.571 0.997 0.833 0.990 0.784 0.815
Tandon Titans ResNet-50-without-hair-removal 0.561 0.954 0.611 0.945 0.929 0.710 0.989 0.805 0.981 0.984 0.605 0.990 0.650 0.988 0.965 0.950 0.776 0.865 0.912 0.964 0.682 0.973 0.809 0.948 0.947 0.614 0.999 0.964 0.989 0.983 0.600 0.999 0.913 0.991 0.986 0.965
The Homeboy’s Pyramidal DL 0.643 0.903 0.458 0.952 0.906 0.731 0.965 0.581 0.982 0.966 0.512 0.988 0.564 0.986 0.964 0.809 0.902 0.926 0.759 0.936 0.700 0.917 0.587 0.948 0.913 0.614 0.997 0.844 0.989 0.973 0.686 0.996 0.800 0.993 0.996 0.951
WVUmich resenet without augmentation fusion epochs 0.456 0.972 0.672 0.933 0.892 0.710 0.987 0.776 0.981 0.981 0.581 0.980 0.455 0.988 0.943 0.946 0.760 0.856 0.903 0.945 0.599 0.960 0.714 0.935 0.916 0.614 0.995 0.794 0.988 0.925 0.771 0.995 0.771 0.995 0.990 0.942
WVUmich Fusion resenet models 0.526 0.964 0.652 0.941 0.901 0.753 0.987 0.795 0.984 0.987 0.535 0.986 0.535 0.986 0.969 0.954 0.748 0.851 0.915 0.950 0.631 0.975 0.811 0.940 0.937 0.523 0.998 0.885 0.986 0.955 0.743 0.998 0.897 0.994 0.993 0.956
Computer Vision Lab, Nankai University CNN ensemble 0.602 0.970 0.720 0.950 0.786 0.817 0.975 0.685 0.988 0.896 0.488 0.990 0.583 0.985 0.739 0.954 0.794 0.875 0.919 0.874 0.645 0.967 0.765 0.942 0.806 0.545 0.999 0.960 0.987 0.772 0.600 0.999 0.913 0.991 0.799 0.810
Department of Dermatology, University of Rzeszów, Poland ResNet101-Adam 0.649 0.936 0.563 0.954 0.913 0.731 0.971 0.624 0.982 0.974 0.535 0.986 0.535 0.986 0.957 0.883 0.804 0.872 0.821 0.927 0.525 0.960 0.687 0.923 0.895 0.523 0.995 0.742 0.986 0.939 0.771 0.988 0.600 0.995 0.963 0.938
Stony Brook University Transfer Learning 2 (0.9) 0.614 0.925 0.512 0.949 0.770 0.763 0.954 0.522 0.984 0.859 0.744 0.956 0.333 0.992 0.850 0.783 0.939 0.951 0.742 0.861 0.645 0.910 0.547 0.939 0.778 0.455 0.990 0.571 0.984 0.722 0.600 0.991 0.600 0.991 0.795 0.805
ISI_NLP_LAB Ensembel of ResNet50, DenseNet121 and MobileNet 0.608 0.970 0.722 0.951 0.789 0.645 0.983 0.714 0.977 0.814 0.674 0.987 0.604 0.990 0.831 0.957 0.765 0.860 0.922 0.861 0.645 0.974 0.805 0.942 0.809 0.500 0.997 0.815 0.985 0.748 0.571 0.998 0.870 0.990 0.785 0.805
ISI_NLP_LAB Ensemble of ResNet50, DenseNet121, MobileNet 0.608 0.970 0.722 0.951 0.789 0.645 0.983 0.714 0.977 0.814 0.674 0.987 0.604 0.990 0.831 0.957 0.765 0.860 0.922 0.861 0.645 0.974 0.805 0.942 0.809 0.500 0.997 0.815 0.985 0.748 0.571 0.998 0.870 0.990 0.785 0.805
Persistent Systems Five-stage hierarchical classifier 0.474 0.952 0.559 0.934 0.717 0.699 0.961 0.537 0.980 0.835 0.628 0.964 0.337 0.989 0.800 0.869 0.834 0.888 0.809 0.895 0.576 0.951 0.665 0.930 0.772 0.727 0.988 0.640 0.992 0.856 0.629 0.989 0.579 0.991 0.834 0.816
AI Toulouse Classification by ZNet, using only official data for training 0.713 0.890 0.454 0.961 0.860 0.763 0.972 0.645 0.984 0.919 0.488 0.984 0.467 0.985 0.811 0.840 0.869 0.906 0.783 0.904 0.590 0.952 0.674 0.933 0.841 0.545 0.997 0.857 0.987 0.861 0.657 0.997 0.852 0.992 0.884 0.869
Stony Brook University Transfer Learning Ensemble 3 0.614 0.925 0.512 0.949 0.770 0.753 0.955 0.522 0.983 0.854 0.744 0.955 0.327 0.992 0.850 0.783 0.939 0.951 0.742 0.861 0.645 0.910 0.547 0.939 0.778 0.455 0.990 0.571 0.984 0.722 0.600 0.991 0.600 0.991 0.795 0.804
SAIIP-MIA A Multi-Level Deep Ensemble Model 0.632 0.921 0.505 0.951 0.887 0.785 0.967 0.608 0.986 0.965 0.674 0.982 0.518 0.990 0.953 0.898 0.859 0.906 0.848 0.942 0.618 0.968 0.761 0.938 0.935 0.409 0.997 0.818 0.983 0.924 0.571 0.998 0.870 0.990 0.982 0.941
Stony Brook University Transfer Learning Ensemble 1 0.579 0.941 0.556 0.946 0.760 0.753 0.958 0.538 0.983 0.855 0.744 0.958 0.344 0.992 0.851 0.846 0.910 0.934 0.797 0.878 0.636 0.934 0.619 0.939 0.785 0.455 0.991 0.606 0.984 0.723 0.571 0.992 0.625 0.990 0.782 0.805
University of Padua Ensemble of different CNN topologies 0.538 0.972 0.708 0.943 0.933 0.688 0.981 0.703 0.980 0.980 0.512 0.993 0.667 0.986 0.978 0.960 0.766 0.861 0.928 0.963 0.691 0.967 0.777 0.949 0.946 0.568 0.998 0.893 0.987 0.986 0.600 0.999 0.913 0.991 0.997 0.969
UnB Task1 preprocessed K-fold Ensemble of Resnet50 0.561 0.955 0.615 0.945 0.903 0.634 0.978 0.656 0.976 0.960 0.581 0.986 0.556 0.988 0.979 0.923 0.784 0.866 0.871 0.941 0.622 0.955 0.699 0.938 0.926 0.568 0.995 0.781 0.987 0.966 0.657 0.997 0.852 0.992 0.990 0.952
LTS5 Convolutional Neural Network, DermoNet segmentation, ResNet50 0.684 0.934 0.571 0.959 0.918 0.699 0.982 0.714 0.980 0.966 0.488 0.978 0.396 0.985 0.962 0.924 0.801 0.875 0.875 0.946 0.548 0.974 0.778 0.928 0.929 0.545 0.999 0.960 0.987 0.965 0.657 0.999 0.920 0.992 0.995 0.954
University of Padua Ensemble of VGG16 CNN 0.596 0.957 0.642 0.949 0.926 0.634 0.977 0.641 0.976 0.975 0.535 0.986 0.535 0.986 0.967 0.936 0.806 0.879 0.893 0.959 0.700 0.963 0.760 0.950 0.939 0.545 0.997 0.857 0.987 0.986 0.571 0.999 0.909 0.990 0.994 0.964
UnB Task 1 preprocessed Resnet50 pretrained model - CropBest strategy 0.532 0.954 0.595 0.941 0.900 0.624 0.975 0.624 0.975 0.957 0.581 0.986 0.543 0.988 0.977 0.920 0.793 0.870 0.868 0.939 0.631 0.951 0.685 0.939 0.926 0.591 0.995 0.765 0.988 0.966 0.629 0.998 0.880 0.991 0.987 0.950
CAMP TUM Webly Supervised Learning for Skin Lesion Classification 0.538 0.951 0.582 0.942 0.866 0.720 0.984 0.753 0.982 0.959 0.512 0.991 0.629 0.986 0.932 0.946 0.778 0.865 0.905 0.948 0.654 0.963 0.747 0.943 0.922 0.500 0.999 0.957 0.985 0.861 0.600 0.999 0.913 0.991 0.948 0.919
miltonbd SENet-154 final valid 0.77 0.591 0.942 0.564 0.947 0.876 0.731 0.983 0.739 0.982 0.950 0.558 0.988 0.571 0.987 0.961 0.931 0.784 0.867 0.882 0.937 0.631 0.969 0.774 0.940 0.904 0.455 0.997 0.800 0.984 0.913 0.571 0.999 0.952 0.990 0.899 0.920
Nitwit AI ResNet 0.637 0.869 0.384 0.949 0.877 0.602 0.974 0.602 0.974 0.963 0.419 0.988 0.500 0.983 0.961 0.724 0.925 0.936 0.690 0.920 0.719 0.887 0.517 0.950 0.898 0.614 0.984 0.540 0.988 0.972 0.714 0.987 0.568 0.993 0.983 0.939
WVUmich resenet using augmentation tuning block 0.520 0.942 0.533 0.939 0.886 0.677 0.984 0.733 0.979 0.979 0.419 0.992 0.600 0.983 0.936 0.920 0.740 0.842 0.859 0.931 0.622 0.962 0.734 0.938 0.915 0.545 0.999 0.960 0.987 0.959 0.657 0.997 0.852 0.992 0.984 0.941
Mehdi&peyman InceptionV3 with augmentation 0.491 0.963 0.627 0.937 0.727 0.720 0.972 0.626 0.981 0.846 0.535 0.982 0.460 0.986 0.758 0.913 0.824 0.887 0.863 0.869 0.770 0.939 0.679 0.960 0.854 0.477 0.996 0.778 0.985 0.737 0.314 0.999 0.917 0.984 0.657 0.778
gchhor Pre-trained Inception-v3 0.520 0.958 0.614 0.940 0.868 0.602 0.984 0.718 0.974 0.937 0.419 0.995 0.692 0.983 0.907 0.933 0.692 0.820 0.872 0.897 0.641 0.967 0.764 0.941 0.872 0.409 0.999 0.947 0.983 0.894 0.686 0.997 0.857 0.993 0.972 0.907
University of Dayton, Signal and Image Processing Lab SVM classifier with hand-crafted features using segmentation 0.673 0.867 0.392 0.954 0.884 0.559 0.968 0.536 0.971 0.936 0.558 0.966 0.324 0.987 0.918 0.750 0.900 0.919 0.705 0.916 0.535 0.906 0.489 0.921 0.864 0.477 0.984 0.477 0.984 0.941 0.486 0.995 0.680 0.988 0.952 0.916
kevint Two-Step Training of Deep Residual Networks for Skin Lesion Diagnosis 0.456 0.957 0.574 0.932 0.896 0.559 0.987 0.732 0.972 0.965 0.651 0.984 0.538 0.990 0.947 0.961 0.677 0.817 0.921 0.942 0.470 0.972 0.739 0.916 0.919 0.477 0.997 0.840 0.985 0.968 0.429 0.996 0.714 0.987 0.979 0.945
Andrey Sorokin Hybrid Model of Lesion Boundary Detector and lesion Pixel-wise class detection 0.743 0.854 0.394 0.963 0.800 0.462 0.984 0.662 0.965 0.723 0.372 0.988 0.485 0.982 0.680 0.854 0.819 0.877 0.788 0.836 0.539 0.969 0.745 0.926 0.754 0.432 0.995 0.704 0.983 0.713 0.571 0.998 0.870 0.990 0.785 0.756
kevint One-Step Training of Deep Residual Networks for Skin Lesion Diagnosis 0.368 0.984 0.750 0.924 0.887 0.677 0.958 0.516 0.978 0.967 0.628 0.984 0.540 0.989 0.954 0.967 0.638 0.801 0.928 0.938 0.461 0.979 0.787 0.915 0.915 0.250 0.997 0.733 0.978 0.932 0.400 0.998 0.824 0.986 0.966 0.937
Le-Health Ensemble of Densenet 0.673 0.817 0.319 0.951 0.859 0.280 0.983 0.520 0.954 0.877 0.372 0.986 0.432 0.982 0.819 0.650 0.924 0.928 0.637 0.904 0.618 0.835 0.386 0.929 0.806 0.409 0.994 0.667 0.982 0.857 0.686 0.980 0.444 0.992 0.944 0.867
Department of Dermatology, University of Rzeszów, Poland TripletLoss-Margin 0.415 0.973 0.664 0.929 0.885 0.484 0.970 0.517 0.966 0.944 0.558 0.972 0.369 0.987 0.965 0.946 0.756 0.854 0.903 0.938 0.548 0.935 0.586 0.925 0.886 0.205 0.995 0.529 0.977 0.897 0.457 0.993 0.615 0.987 0.934 0.921
Andreas Pirchner Generative adversarial networks for skin lesion classification 0.398 0.950 0.504 0.925 0.875 0.387 0.970 0.456 0.960 0.936 0.302 0.986 0.382 0.980 0.949 0.869 0.743 0.836 0.790 0.907 0.488 0.906 0.467 0.914 0.851 0.500 0.978 0.400 0.985 0.938 0.657 0.991 0.622 0.992 0.959 0.916
Team MLMI Cascaded DenseNets for Multi-Class Skin Lesion Classification 0.649 0.851 0.358 0.950 0.839 0.538 0.941 0.373 0.969 0.834 0.349 0.974 0.283 0.981 0.812 0.742 0.879 0.902 0.694 0.839 0.447 0.913 0.464 0.908 0.781 0.295 0.989 0.448 0.979 0.731 0.486 0.992 0.586 0.988 0.791 0.804
NTHU CVLab Deep learning with ResNet-50 0.398 0.963 0.581 0.926 0.850 0.462 0.979 0.589 0.965 0.933 0.465 0.980 0.400 0.984 0.936 0.920 0.658 0.802 0.845 0.908 0.525 0.945 0.616 0.922 0.888 0.409 0.997 0.783 0.983 0.920 0.314 0.993 0.500 0.984 0.904 0.905
University of York Multitask learning, single CNN for all three tasks, segmentation via FCN 0.363 0.949 0.477 0.921 0.814 0.570 0.965 0.515 0.972 0.943 0.140 0.993 0.353 0.975 0.905 0.904 0.725 0.832 0.834 0.904 0.484 0.910 0.473 0.913 0.834 0.409 0.994 0.667 0.982 0.956 0.600 0.997 0.840 0.991 0.975 0.904
SSNMLRG hierarchy_resnet 0.474 0.939 0.497 0.933 0.706 0.462 0.966 0.473 0.965 0.714 0.395 0.978 0.340 0.982 0.686 0.888 0.728 0.831 0.811 0.808 0.438 0.928 0.505 0.908 0.683 0.159 0.988 0.292 0.975 0.574 0.571 0.997 0.800 0.990 0.784 0.708
QuindiTech: Pingao Wang Ensemble of Transfer Learning with pre-trained model over collected ISIC Archive 0.029 1.000 1.000 0.890 0.770 0.570 0.984 0.707 0.972 0.957 0.163 0.994 0.437 0.976 0.864 0.975 0.393 0.707 0.912 0.852 0.217 0.982 0.671 0.882 0.858 0.432 0.999 0.905 0.983 0.927 0.857 0.971 0.411 0.997 0.988 0.888
QuindiTech: Pingao Wang Transfer Learning with ISIC2018 archive 0.012 1.000 1.000 0.888 0.771 0.548 0.990 0.785 0.971 0.958 0.140 0.993 0.375 0.975 0.868 0.970 0.355 0.694 0.888 0.850 0.198 0.983 0.662 0.880 0.863 0.386 0.999 0.944 0.982 0.936 0.886 0.970 0.413 0.997 0.986 0.890
Balazs Harangi SKIN LESION DETECTION BASED ON AN ENSEMBLE OF ONE VERSUS ALL TRAINED CNN 0.480 0.919 0.429 0.933 0.744 0.484 0.965 0.479 0.966 0.923 0.326 0.984 0.368 0.980 0.924 0.893 0.756 0.847 0.825 0.902 0.484 0.923 0.512 0.914 0.854 0.091 0.997 0.444 0.973 0.921 0.371 0.998 0.812 0.985 0.962 0.890
Balazs Harangi SKIN LESION DETECTION BASED ON AN ENSEMBLE OF ONE VERSUS ALL TRAINED CNNS 0.480 0.916 0.423 0.932 0.744 0.484 0.965 0.479 0.966 0.923 0.326 0.984 0.368 0.980 0.924 0.892 0.760 0.848 0.824 0.898 0.484 0.923 0.512 0.914 0.854 0.091 0.997 0.444 0.973 0.921 0.371 0.998 0.812 0.985 0.962 0.890
LABCIN Handcrafted features based on ABCD rule and ELM Hierarchical Classification 0.632 0.834 0.326 0.947 0.733 0.398 0.944 0.319 0.960 0.671 0.395 0.960 0.227 0.982 0.678 0.652 0.909 0.915 0.634 0.780 0.548 0.876 0.425 0.920 0.712 0.205 0.984 0.273 0.976 0.594 0.286 0.987 0.345 0.983 0.636 0.686
University of Illinois Springfield Multiple CNN model 0.222 0.977 0.551 0.908 0.856 0.473 0.976 0.564 0.966 0.938 0.419 0.983 0.419 0.983 0.933 0.953 0.536 0.755 0.883 0.902 0.359 0.953 0.561 0.899 0.852 0.045 0.996 0.250 0.972 0.900 0.600 0.995 0.724 0.991 0.976 0.908
Altumview and UBC Joint Team Automatic Skin Lesion Analysis using Sliced Dermoscopy Images and Deep Residual 0.614 0.757 0.244 0.939 0.686 0.634 0.910 0.316 0.974 0.772 0.186 0.972 0.163 0.976 0.579 0.556 0.833 0.833 0.555 0.694 0.138 0.994 0.789 0.873 0.566 0.182 0.988 0.320 0.976 0.585 0.657 0.896 0.131 0.991 0.777 0.666
QuindiTech: Pingao Wang Transfer Learning with ISIC Archive and data augmentation 0.018 1.000 1.000 0.889 0.767 0.387 0.995 0.837 0.961 0.962 0.116 0.993 0.333 0.975 0.797 0.985 0.294 0.677 0.927 0.845 0.129 0.991 0.718 0.872 0.858 0.386 0.997 0.773 0.982 0.923 0.914 0.975 0.464 0.998 0.993 0.878
SSNMLRG Hierarchy 0.497 0.936 0.497 0.936 0.716 0.452 0.956 0.400 0.964 0.704 0.395 0.982 0.395 0.982 0.689 0.889 0.683 0.809 0.803 0.786 0.479 0.941 0.578 0.915 0.710 0.023 0.999 0.333 0.971 0.511 0.200 0.997 0.636 0.981 0.599 0.673
Deep-Class CNNCBR 0.591 0.813 0.288 0.940 0.702 0.484 0.950 0.388 0.966 0.717 0.326 0.881 0.074 0.978 0.604 0.532 0.919 0.908 0.566 0.725 0.346 0.900 0.366 0.891 0.623 0.273 0.957 0.160 0.978 0.615 0.343 0.978 0.273 0.984 0.661 0.664
SSNMLRG hierarchy_transferlearning 0.281 0.950 0.417 0.912 0.615 0.355 0.958 0.355 0.958 0.656 0.628 0.972 0.397 0.989 0.800 0.907 0.600 0.774 0.812 0.754 0.346 0.944 0.507 0.896 0.645 0.000 1.000 NaN 0.971 0.500 0.343 0.993 0.545 0.985 0.668 0.663
SSNMLRG hierarchy_transferlearning 0.281 0.950 0.417 0.912 0.615 0.355 0.958 0.355 0.958 0.656 0.628 0.972 0.397 0.989 0.800 0.907 0.600 0.774 0.812 0.754 0.346 0.944 0.507 0.896 0.645 0.000 1.000 NaN 0.971 0.500 0.343 0.993 0.545 0.985 0.668 0.663
Math & Stat Dept., UNC-Greensboro, USA and CSIE Dept., NTNU, Taiwan Support vector machine with topological features from persistent homology III 0.310 0.890 0.264 0.910 0.600 0.430 0.934 0.301 0.962 0.682 0.465 0.947 0.204 0.984 0.706 0.705 0.743 0.805 0.626 0.724 0.336 0.903 0.369 0.890 0.620 0.114 0.968 0.096 0.973 0.541 0.400 0.986 0.412 0.986 0.693 0.652
University of Texas Freshman Research Initiative Deep Learning with augmentation, dropout, and svm 0.561 0.825 0.290 0.936 0.805 0.312 0.964 0.362 0.955 0.888 0.209 0.990 0.375 0.977 0.905 0.749 0.801 0.850 0.679 0.860 0.406 0.877 0.356 0.898 0.786 0.114 0.995 0.385 0.974 0.873 0.343 0.997 0.750 0.985 0.923 0.863
Math & Stat Dept., UNC-Greensboro, USA and CSIE Dept., NTNU, Taiwan Support vector machine with topological features from persistent homology II 0.327 0.902 0.299 0.913 0.615 0.344 0.953 0.327 0.957 0.649 0.326 0.955 0.175 0.980 0.640 0.741 0.690 0.783 0.639 0.716 0.387 0.901 0.396 0.898 0.644 0.114 0.976 0.125 0.973 0.545 0.400 0.986 0.412 0.986 0.693 0.643
Math & Stat Dept., UNC-Greensboro, USA and CSIE Dept., NTNU, Taiwan Support vector machine with topological features from persistent homology 0.339 0.873 0.254 0.912 0.606 0.344 0.933 0.252 0.956 0.639 0.349 0.946 0.158 0.980 0.647 0.648 0.750 0.796 0.585 0.699 0.332 0.900 0.358 0.889 0.616 0.136 0.964 0.102 0.974 0.550 0.429 0.968 0.242 0.986 0.698 0.636
SSTL Skin Disease Classification by Resnet50 0.140 0.983 0.511 0.900 0.773 0.527 0.942 0.374 0.968 0.864 0.163 0.997 0.583 0.976 0.859 0.499 0.706 0.719 0.484 0.665 0.452 0.814 0.289 0.898 0.742 0.000 1.000 NaN 0.971 0.672 0.743 0.779 0.074 0.992 0.868 0.777
Deep-Class CNNCBR-2 0.468 0.807 0.236 0.922 0.620 0.419 0.951 0.358 0.961 0.858 0.209 0.955 0.120 0.976 0.700 0.612 0.849 0.859 0.593 0.827 0.442 0.850 0.331 0.901 0.720 0.091 0.980 0.121 0.973 0.524 0.257 0.994 0.500 0.983 0.834 0.726
Michigan weighted super learner 0.211 0.967 0.450 0.906 0.849 0.344 0.970 0.427 0.958 0.894 0.209 0.991 0.409 0.977 0.941 0.968 0.464 0.731 0.906 0.894 0.309 0.955 0.536 0.892 0.851 0.068 0.997 0.429 0.973 0.882 0.000 1.000 NaN 0.977 0.935 0.892
miltonbd senet 154 0.023 0.991 0.250 0.888 0.508 0.000 0.988 0.000 0.938 0.603 0.023 0.992 0.077 0.972 0.674 0.969 0.035 0.602 0.429 0.544 0.000 0.998 0.000 0.856 0.629 0.000 1.000 NaN 0.971 0.470 0.000 1.000 NaN 0.977 0.579 0.573
UnB Snapshot Ensemble of Resnet50 - SnapEsem stragegy 0.012 0.966 0.043 0.885 0.622 0.151 0.836 0.057 0.938 0.576 0.023 0.975 0.026 0.971 0.681 0.009 0.879 0.099 0.371 0.761 0.065 0.893 0.092 0.851 0.523 0.159 0.376 0.008 0.937 0.758 0.543 0.997 0.826 0.989 0.987 0.701