Key

week : week number
rider : rider number
# : approximate degrees of freedom contributing to the data. These are
    non-integral because the degrees of freedom for riders are reduced by the
    degrees of freedom lost in determining the weekly normalizations, and the
    degrees of freedom for weeks are reduced by those lost in determining the
    rider scores. The calculation is iterative.
rating : for riders, the natural log of the ratio of the rider's expected time
    to that of the average rider; for weeks, the natural log of the time in
    seconds for the average rider. Thus, the predicted time for a given rider
    in a given week is exp(rating_rider + rating_week) seconds.
sigma : the standard error associated with a rating
Weeks

week        #   rating     sigma     climb
   1  31.647   7.75363  0.0398593   Montebello
   2  44.4949  7.98183  0.0356381   Page Mill Road
   3  45.3847  8.3447   0.02699     Mt Diablo
   4  54.5015  7.48581  0.0295556   Kings Mountain
   5  53.6109  8.07969  0.0443287   West 84 & W OLH
   6  39.1248  7.76355  0.0395436   Bohlman-On Orbit-Bohlman
   7  42.7838  6.91905  0.0341594   Alpine & Juaquin
   8  37.4537  7.93937  0.0296849   Hick's & Loma Almaden
   9  44.1999  8.70311  0.0289379   Mt Hamilton Road
   X  21.4496  7.32499  0.0810513   Old La Honda Road
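As a worked check of the formula in the key, the following Python sketch (not part of the original nawk analysis; the rider rating used is rider 59's, from the Riders list later in this section) converts ratings back into predicted times:

```python
import math

# Values copied from the tables in this section.
week1_rating = 7.75363    # week 1 (Montebello): natural log of the average rider's time
rider_rating = -0.105103  # rider 59; negative ratings are faster than average

# Predicted time for the average rider (rating 0) is exp(rating_week) seconds.
avg_time = math.exp(week1_rating)                    # about 2330 s, i.e. ~39 minutes

# Predicted time for rider 59 is exp(rating_rider + rating_week) seconds.
rider_time = math.exp(rider_rating + week1_rating)   # about 10% faster than average
```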
These raw rankings can be converted into hill conversion factors, represented in the following matrix:
                                   to week
from\to     1      2      3      4      5      6      7      8      9      X
   1    1.000  1.256  1.806  0.765  1.385  1.010  0.434  1.204  2.584  0.651
   2    0.796  1.000  1.437  0.609  1.103  0.804  0.345  0.958  2.057  0.518
   3    0.554  0.696  1.000  0.424  0.767  0.559  0.240  0.667  1.431  0.361
   4    1.307  1.642  2.361  1.000  1.811  1.320  0.567  1.574  3.378  0.851
   5    0.722  0.907  1.303  0.552  1.000  0.729  0.313  0.869  1.865  0.470
   6    0.990  1.244  1.788  0.757  1.372  1.000  0.430  1.192  2.559  0.645
   7    2.304  2.894  4.161  1.763  3.192  2.327  1.000  2.774  5.954  1.501
   8    0.830  1.043  1.500  0.635  1.151  0.839  0.360  1.000  2.146  0.541
   9    0.387  0.486  0.699  0.296  0.536  0.391  0.168  0.466  1.000  0.252
   X    1.535  1.929  2.772  1.174  2.127  1.550  0.666  1.849  3.967  1.000
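Each entry of this matrix follows directly from the week ratings above: the factor for converting a time set on week i into an equivalent time on week j is exp(rating_j - rating_i). A minimal Python sketch (illustrative, not the original tooling) that reproduces the matrix:

```python
import math

# Week ratings (natural log of the average rider's time in seconds),
# copied from the Weeks table above.
week_rating = {
    "1": 7.75363, "2": 7.98183, "3": 8.3447,  "4": 7.48581, "5": 8.07969,
    "6": 7.76355, "7": 6.91905, "8": 7.93937, "9": 8.70311, "X": 7.32499,
}

def conversion_factor(from_week, to_week):
    """Multiply a time ridden on `from_week` by this factor to get the
    equivalent time on `to_week`."""
    return math.exp(week_rating[to_week] - week_rating[from_week])

# Reproduces the first row of the matrix (from week 1 to each week):
row1 = [round(conversion_factor("1", w), 3) for w in week_rating]
```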
Riders

rider  #  rating  sigma
14 0.984074 -0.750676 0
123 0.988922 -0.745592 0
200 0.988244 -0.682599 0
218 0.988623 -0.611946 0
15 0.984074 -0.58744 0
217 0.988623 -0.500624 0
214 0.988623 -0.500422 0
195 0.987138 -0.471075 0
185 1.97777 -0.438934 0.0758477
137 0.988922 -0.412395 0
138 0.988922 -0.412395 0
228 0.976412 -0.409311 0
63 3.95232 -0.385065 0.129368
71 0.984074 -0.328921 0
142 1.97734 -0.320458 0.008018
126 0.988922 -0.28925 0
56 7.89887 -0.287188 0.06064
32 1.973 -0.256515 0.0124924
80 5.9359 -0.253942 0.0407013
182 1.97777 -0.235325 0.0414138
196 0.987138 -0.233994 0
191 0.987138 -0.223821 0
174 0.990784 -0.219989 0
192 0.987138 -0.21492 0
13 0.984074 -0.197291 0
40 0.984074 -0.184484 0
84 5.92819 -0.176916 0.0611196
129 0.988922 -0.176742 0
177 1.97887 -0.175985 0.0186729
81 5.92153 -0.171566 0.0368434
97 0.988699 -0.169285 0
49 0.984074 -0.166784 0
103 0.988699 -0.164935 0
64 8.89367 -0.160514 0.0227901
194 0.987138 -0.160508 0
76 0.984074 -0.160203 0
68 0.984074 -0.157632 0
210 0.988623 -0.15022 0
225 0.988623 -0.145189 0
10 1.97719 -0.140036 0.0261499
197 2.96194 -0.13865 0.0675284
173 0.990784 -0.138138 0
17 0.984074 -0.136445 0
58 8.88116 -0.131178 0.0576084
86 0.988699 -0.128836 0
73 0.984074 -0.121305 0
25 0.984074 -0.120542 0
119 0.988922 -0.113654 0
79 6.91998 -0.111508 0.0076885
136 0.988922 -0.111311 0
205 0.98656 -0.108179 0
59 9.87009 -0.105103 0.0484398
16 1.9747 -0.101795 0.0357074
207 0.98656 -0.0988288 0
66 0.984074 -0.0958113 0
85 8.88601 -0.0946885 0.0309075
202 1.97687 -0.0920767 0.00898471
21 1.97277 -0.0854065 0.0121905
55 4.92605 -0.0831742 0.0535985
100 6.91053 -0.0770808 0.046833
118 5.92045 -0.0750928 0.0567755
106 0.988699 -0.0737142 0
107 1.97948 -0.0730018 0.00277274
38 7.90505 -0.066382 0.027467
89 4.94766 -0.0631822 0.0254037
47 7.8906 -0.0588127 0.0321898
190 2.96194 -0.0565618 0.0506294
183 0.99063 -0.0556763 0
121 2.96262 -0.0552238 0.0257693
9 3.94996 -0.0531579 0.032044
75 1.9727 -0.0515168 0.0437369
208 0.98656 -0.0493581 0
213 0.988623 -0.0479608 0
159 2.96765 -0.0395697 0.0282118
48 5.92959 -0.0383228 0.0239397
19 0.984074 -0.0374093 0
67 0.984074 -0.0365795 0
155 0.990784 -0.0344097 0
39 2.96162 -0.0286113 0.016978
131 2.96817 -0.0212069 0.0211312
204 0.98656 -0.0210511 0
115 0.988922 -0.0179186 0
102 0.988699 -0.0177349 0
227 0.976412 -0.0158454 0
151 3.95828 -0.015802 0.0138969
152 3.95828 -0.015802 0.0138969
44 7.89483 -0.0138222 0.0537359
232 0.976412 -0.00997264 0
236 0.976412 -0.00997264 0
184 0.99063 -0.00472331 0
135 0.988922 -0.00447972 0
62 3.95248 -0.00440555 0.0229056
30 1.9747 -0.00364245 0.00829667
146 3.94439 0.00165159 0.0527561
95 1.97948 0.00463461 0.0176936
122 2.97034 0.00499541 0.0301333
220 0.988623 0.00653884 0
42 0.984074 0.0114638 0
221 0.988623 0.0161438 0
128 0.988922 0.0180848 0
171 1.97941 0.0202143 0.00312445
165 1.98141 0.0207741 0.00405197
92 6.90847 0.0224632 0.0442421
222 0.988623 0.0256709 0
105 3.95903 0.0265655 0.0343722
150 2.96966 0.0281093 0.0181596
203 0.988244 0.0313138 0
53 6.91829 0.0329005 0.022054
34 0.984074 0.0339267 0
167 0.990784 0.0342337 0
206 0.98656 0.0347489 0
23 8.89367 0.0390259 0.0385895
170 1.98141 0.0419923 0.00593128
109 5.93206 0.0430454 0.046531
124 0.988922 0.0496534 0
153 1.97903 0.0498533 0.00582125
12 6.91633 0.0517648 0.0282747
156 0.990784 0.0524534 0
164 2.96654 0.0544596 0.0524685
8 6.91681 0.0552699 0.0289824
178 0.99063 0.0627683 0
161 0.990784 0.063187 0
231 0.976412 0.0633302 0
27 2.96356 0.0666457 0.0154091
108 4.94766 0.068528 0.0174084
72 0.984074 0.0734504 0
41 0.984074 0.07577 0
188 0.987138 0.0778188 0
78 0.984074 0.0794927 0
215 1.97725 0.0812552 0.0640234
172 2.96855 0.0818263 0.0190334
61 2.96378 0.0819327 0.0325556
219 0.988623 0.0828293 0
125 1.97971 0.0829355 0.00297025
93 5.93019 0.0838583 0.0263485
179 2.96543 0.0861482 0.00623828
52 9.87009 0.0883234 0.0404602
134 0.988922 0.0890044 0
112 2.96624 0.0909975 0.0375779
180 1.96704 0.0911934 0.0280273
35 1.9747 0.0919104 0.00770484
24 8.89367 0.0956702 0.0271279
199 0.987138 0.0959546 0
114 5.93228 0.0977282 0.0270777
69 0.984074 0.098792 0
82 0.988699 0.100048 0
90 0.988699 0.102323 0
74 0.984074 0.102601 0
145 0.990784 0.102935 0
233 0.976412 0.104299 0
50 0.984074 0.108343 0
193 1.9737 0.111503 0.0159071
37 1.97277 0.111519 0.0173242
198 0.987138 0.113947 0
169 0.990784 0.114228 0
132 3.95642 0.115831 0.00444312
91 6.92304 0.116247 0.0281384
154 4.94336 0.118675 0.0245562
28 1.97486 0.120727 0.00270629
77 2.96169 0.121107 0.0224311
176 0.990784 0.1231 0
110 1.97948 0.127079 0.00388746
157 0.990784 0.13398 0
22 0.984074 0.136075 0
229 0.976412 0.136355 0
235 0.976412 0.136355 0
163 1.97941 0.136801 0.00334122
33 0.984074 0.141517 0
149 3.95828 0.145565 0.0218166
148 1.98141 0.146678 0.021887
43 5.91774 0.151034 0.0417444
6 4.9411 0.154099 0.0208203
209 1.96503 0.155171 0.0569649
45 4.93725 0.155433 0.0268866
46 7.8906 0.158148 0.0431735
88 0.988699 0.158544 0
212 0.988623 0.164442 0
99 2.97011 0.166966 0.0378676
26 1.97277 0.171456 0.00357809
29 0.984074 0.172242 0
87 8.88601 0.173728 0.0303258
117 2.95396 0.176579 0.0560572
127 4.94572 0.176668 0.0390158
70 6.91226 0.178509 0.0175664
0 4.94105 0.179756 0.0274114
113 4.94313 0.181492 0.0187196
234 0.976412 0.184464 0
20 9.87009 0.189009 0.0335662
83 4.93994 0.190728 0.0311907
130 0.988922 0.195031 0
223 0.988623 0.197991 0
147 0.990784 0.201182 0
160 0.990784 0.202559 0
158 0.990784 0.212255 0
116 0.988922 0.212276 0
54 0.984074 0.212957 0
96 1.96511 0.215793 0.0475268
187 2.96401 0.21889 0.0491922
101 2.9624 0.221618 0.0302988
120 0.988922 0.228323 0
94 0.988699 0.23358 0
166 1.97941 0.237474 0.035549
1 7.89739 0.238551 0.0597298
133 2.97034 0.241172 0.0468569
230 0.976412 0.243401 0
104 1.97948 0.244297 0.0367161
168 4.94336 0.249947 0.0421082
224 0.988623 0.257398 0
139 6.90653 0.260411 0.0418354
36 0.984074 0.263774 0
162 2.96654 0.268084 0.00929699
189 0.987138 0.269214 0
31 0.984074 0.278455 0
11 0.984074 0.337624 0
18 0.984074 0.35774 0
98 6.90676 0.366107 0.0494707
A regression scheme was implemented to rank riders and weeks. This was done to support the determination of "most improved" rider awards, for which it was important to determine the relative performance of the winners of each week, so that the winner-normalized scores could be converted into globally-normalized scores. Weeks in which riders exhibited anomalously low normalized scores were pruned from the data, and the scores were then re-fit. For example, weeks in which riders punctured were typically noted and eliminated from the data in this way.
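The original regression was implemented in nawk (listings below). As an illustration of the alternating scheme it implements, here is a Python sketch on made-up data (the observations and variable names are hypothetical): week ratings and rider ratings are refit in turn, each as the mean log-time residual against the other, until the pair converges.

```python
import math

# Hypothetical observations: (rider, week, time in seconds).
obs = [
    ("a", "w1", 1800.0), ("b", "w1", 2000.0),
    ("a", "w2", 2700.0), ("b", "w2", 3000.0),
]

riders = {r for r, _, _ in obs}
weeks = {w for _, w, _ in obs}
rider_rating = {r: 0.0 for r in riders}
week_rating = {w: 0.0 for w in weeks}

# Alternate: each week rating is the mean of (log time - rider rating) over
# that week's rides; each rider rating is the mean of (log time - week rating)
# over that rider's weeks.  Starting riders at 0 pins the average rider there.
for _ in range(200):
    for w in weeks:
        res = [math.log(t) - rider_rating[r] for r, wk, t in obs if wk == w]
        week_rating[w] = sum(res) / len(res)
    for r in riders:
        res = [math.log(t) - week_rating[wk] for rd, wk, t in obs if rd == r]
        rider_rating[r] = sum(res) / len(res)

# Predicted time = exp(rating_rider + rating_week), as in the key above.
pred = math.exp(rider_rating["a"] + week_rating["w1"])
```

Because the toy data are multiplicatively consistent (rider a is 10% faster everywhere, week w2 is 1.5x longer), the fit is exact and pred recovers rider a's 1800 s ride; the pruning described above amounts to deleting outlier (rider, week) triples from obs and rerunning this loop.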
A shell script called weeks_to_regression was used to initiate the analysis:
[listing not recovered]
weeks_to_regression.nawk was as follows:
[listing not recovered]
The rider data file is the following:
[listing not recovered]