# Blog

## Running in Google Colab

You can run this experiment in Google Colab by clicking the button below:
## Dataset
Blog Feedback [1] is a dataset containing 54,270 data points from blog
posts. The raw HTML documents of the blog posts were crawled and
processed. The prediction task associated with the data is to predict
the number of comments in the upcoming 24 hours. The features of the
dataset have 276 dimensions, and 8 attributes among them should be
monotonically non-decreasing with the prediction: A51, A52, A53, A54,
A56, A57, A58, and A59. Thus, the `monotonicity_indicator`
corresponding to these features is set to 1. As done in [2], we only
use the data points with targets smaller than the 90th percentile.
References:

1. Krisztian Buza. Feedback prediction for blogs. In *Data Analysis, Machine Learning and Knowledge Discovery*, pages 145–152. Springer, 2014.
2. Xingchao Liu, Xing Han, Na Zhang, and Qiang Liu. Certified monotonic neural networks. *Advances in Neural Information Processing Systems*, 33:15427–15438, 2020.
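As a minimal sketch of the 90th-percentile filtering described above, assuming the raw data has been loaded into a pandas `DataFrame` named `df` with the comment count in a `target` column (both names are hypothetical):

```python
import pandas as pd

def clip_to_90th_percentile(df: pd.DataFrame, target_col: str = "target") -> pd.DataFrame:
    """Keep only rows whose target lies below the 90th percentile, as in [2]."""
    threshold = df[target_col].quantile(0.9)  # 90th-percentile cutoff
    return df[df[target_col] < threshold]
```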
```python
# Features 50-53 and 55-58 (0-indexed) correspond to the 1-indexed
# attributes A51-A54 and A56-A59, which should be monotonically non-decreasing.
monotonicity_indicator = {
    f"feature_{i}": 1 if i in range(50, 54) or i in range(55, 59) else 0
    for i in range(276)
}
```
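As a quick sanity check (an illustrative sketch, not part of the original pipeline), the indicator can be verified to mark exactly the eight intended features:

```python
monotone_features = [name for name, flag in monotonicity_indicator.items() if flag == 1]
assert monotone_features == [f"feature_{i}" for i in (50, 51, 52, 53, 55, 56, 57, 58)]
```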
These are a few examples from the dataset (columns are individual data points, rows are features):
 | 0 | 1 | 2 | 3 | 4 |
---|---|---|---|---|---|
feature_0 | 0.001920 | 0.001920 | 0.000640 | 0.001920 | 0.001920 |
feature_1 | 0.001825 | 0.001825 | 0.001825 | 0.000000 | 0.000000 |
feature_2 | 0.002920 | 0.002920 | 0.000000 | 0.001460 | 0.001460 |
feature_3 | 0.001627 | 0.001627 | 0.000651 | 0.001627 | 0.001627 |
feature_4 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_5 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_6 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_7 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_8 | 0.035901 | 0.035901 | 0.035901 | 0.035901 | 0.035901 |
feature_9 | 0.096250 | 0.096250 | 0.096250 | 0.096250 | 0.096250 |
feature_10 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_11 | 0.196184 | 0.196184 | 0.196184 | 0.196184 | 0.196184 |
feature_12 | 0.011416 | 0.011416 | 0.011416 | 0.011416 | 0.011416 |
feature_13 | 0.035070 | 0.035070 | 0.035070 | 0.035070 | 0.035070 |
feature_14 | 0.090234 | 0.090234 | 0.090234 | 0.090234 | 0.090234 |
feature_15 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_16 | 0.264747 | 0.264747 | 0.264747 | 0.264747 | 0.264747 |
feature_17 | 0.005102 | 0.005102 | 0.005102 | 0.005102 | 0.005102 |
feature_18 | 0.032064 | 0.032064 | 0.032064 | 0.032064 | 0.032064 |
feature_19 | 0.089666 | 0.089666 | 0.089666 | 0.089666 | 0.089666 |
feature_20 | 0.264747 | 0.264747 | 0.264747 | 0.264747 | 0.264747 |
feature_21 | 0.003401 | 0.003401 | 0.003401 | 0.003401 | 0.003401 |
feature_22 | 0.031368 | 0.031368 | 0.031368 | 0.031368 | 0.031368 |
feature_23 | 0.083403 | 0.083403 | 0.083403 | 0.083403 | 0.083403 |
feature_24 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_25 | 0.195652 | 0.195652 | 0.195652 | 0.195652 | 0.195652 |
feature_26 | 0.009302 | 0.009302 | 0.009302 | 0.009302 | 0.009302 |
feature_27 | 0.068459 | 0.068459 | 0.068459 | 0.068459 | 0.068459 |
feature_28 | 0.085496 | 0.085496 | 0.085496 | 0.085496 | 0.085496 |
feature_29 | 0.716561 | 0.716561 | 0.716561 | 0.716561 | 0.716561 |
feature_30 | 0.265120 | 0.265120 | 0.265120 | 0.265120 | 0.265120 |
feature_31 | 0.419453 | 0.419453 | 0.419453 | 0.419453 | 0.419453 |
feature_32 | 0.120206 | 0.120206 | 0.120206 | 0.120206 | 0.120206 |
feature_33 | 0.345656 | 0.345656 | 0.345656 | 0.345656 | 0.345656 |
feature_34 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_35 | 0.366667 | 0.366667 | 0.366667 | 0.366667 | 0.366667 |
feature_36 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_37 | 0.126985 | 0.126985 | 0.126985 | 0.126985 | 0.126985 |
feature_38 | 0.226342 | 0.226342 | 0.226342 | 0.226342 | 0.226342 |
feature_39 | 0.375000 | 0.375000 | 0.375000 | 0.375000 | 0.375000 |
feature_40 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_41 | 0.125853 | 0.125853 | 0.125853 | 0.125853 | 0.125853 |
feature_42 | 0.224422 | 0.224422 | 0.224422 | 0.224422 | 0.224422 |
feature_43 | 0.375000 | 0.375000 | 0.375000 | 0.375000 | 0.375000 |
feature_44 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_45 | 0.114587 | 0.114587 | 0.114587 | 0.114587 | 0.114587 |
feature_46 | 0.343826 | 0.343826 | 0.343826 | 0.343826 | 0.343826 |
feature_47 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_48 | 0.384615 | 0.384615 | 0.384615 | 0.384615 | 0.384615 |
feature_49 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_50 | 0.108675 | 0.108675 | 0.108675 | 0.108675 | 0.108675 |
feature_51 | 0.195570 | 0.195570 | 0.195570 | 0.195570 | 0.195570 |
feature_52 | 0.600000 | 0.600000 | 0.600000 | 0.600000 | 0.600000 |
feature_53 | 0.391304 | 0.391304 | 0.391304 | 0.391304 | 0.391304 |
feature_54 | 0.333333 | 0.333333 | 0.333333 | 0.333333 | 0.333333 |
feature_55 | 0.516725 | 0.516725 | 0.518486 | 0.516725 | 0.516725 |
feature_56 | 0.550000 | 0.550000 | 0.550000 | 0.550000 | 0.550000 |
feature_57 | 0.486111 | 0.486111 | 0.138889 | 0.819444 | 0.819444 |
feature_58 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_59 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_60 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_61 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_62 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_63 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_64 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_65 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_66 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_67 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_68 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_69 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_70 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_71 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_72 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_73 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_74 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_75 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_76 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_77 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_78 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_79 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_80 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_81 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_82 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_83 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_84 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_85 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_86 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_87 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_88 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_89 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_90 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_91 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_92 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_93 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_94 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_95 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_96 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_97 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_98 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_99 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_100 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_101 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_102 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_103 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_104 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_105 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_106 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_107 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_108 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_109 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_110 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_111 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_112 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_113 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_114 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_115 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_116 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_117 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_118 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_119 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_120 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_121 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_122 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_123 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_124 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_125 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_126 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_127 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_128 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_129 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_130 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_131 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_132 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_133 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_134 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_135 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_136 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_137 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_138 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_139 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_140 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_141 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_142 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_143 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_144 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_145 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_146 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_147 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_148 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_149 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_150 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_151 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_152 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_153 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_154 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_155 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_156 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_157 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_158 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_159 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_160 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_161 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_162 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_163 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_164 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_165 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_166 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_167 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_168 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_169 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_170 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_171 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_172 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_173 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_174 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_175 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_176 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_177 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_178 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_179 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_180 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_181 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_182 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_183 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_184 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_185 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_186 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_187 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_188 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_189 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_190 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_191 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_192 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_193 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_194 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_195 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_196 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_197 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_198 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_199 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_200 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_201 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_202 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_203 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_204 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_205 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_206 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_207 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_208 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_209 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_210 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_211 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_212 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_213 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_214 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_215 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_216 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_217 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_218 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_219 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_220 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_221 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_222 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_223 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_224 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_225 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_226 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_227 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_228 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_229 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_230 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_231 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_232 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_233 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_234 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_235 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_236 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_237 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_238 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_239 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_240 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_241 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_242 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_243 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_244 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_245 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_246 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_247 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_248 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_249 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_250 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_251 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_252 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_253 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_254 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_255 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_256 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_257 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_258 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_259 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_260 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_261 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_262 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_263 | 1.000000 | 1.000000 | 1.000000 | 0.000000 | 0.000000 |
feature_264 | 0.000000 | 0.000000 | 0.000000 | 1.000000 | 1.000000 |
feature_265 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_266 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_267 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_268 | 1.000000 | 1.000000 | 0.000000 | 1.000000 | 1.000000 |
feature_269 | 0.000000 | 0.000000 | 1.000000 | 0.000000 | 0.000000 |
feature_270 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_271 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_272 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_273 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_274 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
feature_275 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
ground_truth | 0.000000 | 0.000000 | 0.125000 | 0.000000 | 0.000000 |
## Hyperparameter search
The choice of the batch size and the maximum number of epochs depends on the dataset size. For this dataset, we use the following values:

```python
batch_size = 256
max_epochs = 30
```
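As an illustrative sketch of how these values might be used (assuming the preprocessed features and targets are available as NumPy arrays `X_train`, `y_train`, `X_val`, and `y_val`, all hypothetical names), a standard `tf.data` input pipeline would be:

```python
import tensorflow as tf

# Hypothetical arrays: X_* have shape (n, 276), y_* have shape (n,).
train_ds = (
    tf.data.Dataset.from_tensor_slices((X_train, y_train))
    .shuffle(buffer_size=10_000)
    .batch(batch_size)
    .prefetch(tf.data.AUTOTUNE)
)
val_ds = (
    tf.data.Dataset.from_tensor_slices((X_val, y_val))
    .batch(batch_size)
    .prefetch(tf.data.AUTOTUNE)
)
```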
We use the Type-2 architecture built using the `MonoDense` layer with the following hyperparameter ranges:
```python
def hp_params_f(hp):
    return dict(
        units=hp.Int("units", min_value=16, max_value=32, step=1),
        n_layers=hp.Int("n_layers", min_value=2, max_value=2),
        activation=hp.Choice("activation", values=["elu"]),
        learning_rate=hp.Float(
            "learning_rate", min_value=1e-4, max_value=1e-2, sampling="log"
        ),
        weight_decay=hp.Float(
            "weight_decay", min_value=3e-2, max_value=0.3, sampling="log"
        ),
        dropout=hp.Float("dropout", min_value=0.0, max_value=0.5, sampling="linear"),
        decay_rate=hp.Float(
            "decay_rate", min_value=0.8, max_value=1.0, sampling="reverse_log"
        ),
    )
```
The following fixed parameters are used to build the Type-2 architecture for this dataset:

- `final_activation` is used to build the final layer: it is set to `None` for a regression problem or to `"sigmoid"` for a classification problem,
- `loss` is used for training a regression (`"mse"`) or classification (`"binary_crossentropy"`) problem, and
- `metrics` denotes the metrics used to compare with previously published results: `"accuracy"` for classification and `"mse"` or `"rmse"` for regression.
The parameters `objective` and `direction` are used by the tuner such that `objective=f"val_{metrics}"` and `direction` is either `"min"` or `"max"`.

The parameter `max_trials` denotes the number of trials performed by the tuner, while `patience` is the number of epochs a trial is allowed to perform worse than its best epoch before it is stopped early. The parameter `executions_per_trial` denotes the number of runs executed before calculating the results of a trial; it should be set to a value greater than 1 for small datasets that have high variance in results.
```python
final_activation = None
loss = "mse"
metrics = tf.keras.metrics.RootMeanSquaredError()
objective = "val_root_mean_squared_error"
direction = "min"
max_trials = 50
executions_per_trial = 1
patience = 10
```
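As a minimal sketch of how these settings could be wired into a Keras Tuner search: the `build_model` function below is a hypothetical stand-in, with plain `Dense` layers where the actual experiment uses `MonoDense` layers in the Type-2 architecture, and the `decay_steps` value is an assumption.

```python
import keras_tuner as kt
import tensorflow as tf

def build_model(hp):
    """Hypothetical builder: the real experiment uses MonoDense layers in a
    Type-2 architecture; plain Dense layers stand in here for illustration."""
    params = hp_params_f(hp)
    inputs = tf.keras.Input(shape=(276,))
    x = inputs
    for _ in range(params["n_layers"]):
        x = tf.keras.layers.Dense(params["units"], activation=params["activation"])(x)
        x = tf.keras.layers.Dropout(params["dropout"])(x)
    outputs = tf.keras.layers.Dense(1, activation=final_activation)(x)
    model = tf.keras.Model(inputs, outputs)
    lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(
        params["learning_rate"], decay_steps=1_000, decay_rate=params["decay_rate"]
    )  # decay_steps is an assumed value
    model.compile(
        optimizer=tf.keras.optimizers.AdamW(  # requires TF >= 2.11
            learning_rate=lr_schedule, weight_decay=params["weight_decay"]
        ),
        loss=loss,
        metrics=[tf.keras.metrics.RootMeanSquaredError()],  # fresh instance per model
    )
    return model

tuner = kt.RandomSearch(
    build_model,
    objective=kt.Objective(objective, direction=direction),
    max_trials=max_trials,
    executions_per_trial=executions_per_trial,
)
early_stopping = tf.keras.callbacks.EarlyStopping(
    monitor=objective, mode=direction, patience=patience
)
tuner.search(
    train_ds,  # hypothetical datasets from the pipeline sketch above
    validation_data=val_ds,
    epochs=max_epochs,
    callbacks=[early_stopping],
)
```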
The following table describes the best models and their hyperparameters found by the tuner:
## The optimal model
These are the best hyperparameters found by previous runs of the tuner:
```python
def final_hp_params_f(hp):
    return dict(
        units=hp.Fixed("units", value=4),
        n_layers=hp.Fixed("n_layers", 2),
        activation=hp.Fixed("activation", value="elu"),
        learning_rate=hp.Fixed("learning_rate", value=0.01),
        weight_decay=hp.Fixed("weight_decay", value=0.0),
        dropout=hp.Fixed("dropout", value=0.0),
        decay_rate=hp.Fixed("decay_rate", value=0.95),
    )
```
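A hedged sketch of how these fixed hyperparameters could be replayed through the hypothetical `build_model` from above to retrain and evaluate the final model; this relies on keras-tuner returning an already-registered value when a hyperparameter name is reused:

```python
import keras_tuner as kt

# Registering the Fixed values first makes the later hp.Int/hp.Float calls
# inside hp_params_f return these fixed values instead of sampling new ones.
hp = kt.HyperParameters()
final_hp_params_f(hp)
final_model = build_model(hp)

# Hypothetical datasets from the earlier pipeline sketch.
final_model.fit(train_ds, validation_data=val_ds, epochs=max_epochs)
print(final_model.evaluate(val_ds, return_dict=True))
```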
The final evaluation of the optimal model:
 | value |
---|---|
units | 4 |
n_layers | 2 |
activation | elu |
learning_rate | 0.010000 |
weight_decay | 0.000000 |
dropout | 0.000000 |
decay_rate | 0.950000 |
val_root_mean_squared_error_mean | 0.154109 |
val_root_mean_squared_error_std | 0.000568 |
val_root_mean_squared_error_min | 0.153669 |
val_root_mean_squared_error_max | 0.154894 |
params | 1665 |