oh yeah, “… at least one occupant fatality”.
The source dataset seems to include pedestrian/non-occupant fatalities, so it's pretty shitty of them to go out of their way to exclude them.
They looked at fatal crashes only, which is presumably a very small share of all crashes. They also normalised to fatalities per mile driven using a sample of data they have - presumably some estimate of miles driven by car type.
Could be sketchy, could just be a much smaller sub-population.
… sufficiently random …
Since they just use the 8m for the normalisation, it'd be interesting to know how sensitive the rankings are if they assumed some bias in it. Or maybe even just perturb the normalisation factors a bit and see how robust the ranking is.
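Something like this would be cheap to try - a minimal sketch with entirely made-up fatality counts and mileage estimates (none of these numbers or model names are from the study), jittering each model's miles-driven figure and checking how often the worst-first ranking survives:

```python
import random

# Hypothetical numbers purely for illustration -- not from the study.
# model -> (occupant fatalities, estimated miles driven in billions)
data = {
    "ModelA": (120, 4.0),
    "ModelB": (90, 2.5),
    "ModelC": (200, 8.0),
    "ModelD": (60, 1.2),
}

def ranking(mile_bias):
    # rate = fatalities / (estimated miles * per-model bias factor);
    # return models sorted worst (highest rate) first
    rates = {m: f / (mi * mile_bias[m]) for m, (f, mi) in data.items()}
    return sorted(rates, key=rates.get, reverse=True)

baseline = ranking({m: 1.0 for m in data})

random.seed(0)
trials = 1000
stable = 0
for _ in range(trials):
    # assume each model's mileage estimate could be off by up to +/-30%
    bias = {m: random.uniform(0.7, 1.3) for m in data}
    if ranking(bias) == baseline:
        stable += 1

print(f"baseline ranking (worst first): {baseline}")
print(f"ranking unchanged in {stable}/{trials} biased trials")
```

If the ranking flips under modest mileage bias, the headline ordering is mostly an artifact of the normalisation sample; if it barely moves, the small sample matters less.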
I guess they do have near complete data on the deaths, and pretty good data on the population of registered vehicles.
Or, their manufacturers also make some safer vehicles. It seems that all of Tesla’s vehicles are high up the list, so the whole manufacturer average is higher than all others. Whereas Hyundai, for example, must sell plenty of safer models that bring down its average.
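The mixing effect is just a weighted average - a toy illustration with invented rates (the numbers and the per-model shares are assumptions, not from the study): a narrow line-up where every model is risky keeps the manufacturer average high, while one risky model diluted by safer high-volume models drags the average down:

```python
# Toy illustration with made-up per-mile fatality rates.
def mfr_average(models):
    # models: list of (rate per billion miles, share of the
    # manufacturer's total miles driven); shares sum to 1
    return sum(rate * share for rate, share in models)

# Narrow line-up, every model risky (Tesla-like in the thread's sense)
narrow = mfr_average([(5.0, 0.5), (4.5, 0.5)])   # -> 4.75

# Broad line-up: one risky model at 10% of miles, safer models at 90%
broad = mfr_average([(5.0, 0.1), (1.5, 0.9)])    # -> 1.85

print(narrow, broad)
```

Both line-ups contain a model with the same worst rate (5.0), but the broad one's average is far lower - which is why a per-manufacturer ranking can look very different from a per-model one.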