One of the drawbacks of the original consistent hashing algorithm (without vnodes) is that a node can end up owning a wide subrange of the ring, which leads to a hotspot. I am wondering: if the hashing algorithm is random enough, why is this still a problem?

Even if the hashing algorithm is random enough, the problem shows up when a node leaves (or fails): its entire range falls to its clockwise successor, so that successor's range roughly doubles. The probability of a key landing in the widened range doubles as well, so the successor can see roughly 2x the traffic.
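A small simulation can make this concrete. The sketch below (node and key names are made up, and MD5 is just an illustrative hash) places nodes on a ring, assigns each key to the first node clockwise from its hash point, then removes one node and shows that its successor absorbs every key the removed node used to own:

```python
import hashlib
from bisect import bisect_left

RING = 2**32

def h(key: str) -> int:
    # Hash onto a 2**32-point ring (MD5 here, purely illustrative).
    return int(hashlib.md5(key.encode()).hexdigest(), 16) % RING

def load(nodes, keys):
    # Count how many keys land on each node: a key belongs to the
    # first node clockwise from its hash point (wrapping around).
    ring = sorted((h(n), n) for n in nodes)
    points = [p for p, _ in ring]
    counts = {n: 0 for _, n in ring}
    for k in keys:
        i = bisect_left(points, h(k)) % len(ring)
        counts[ring[i][1]] += 1
    return counts

nodes = [f"node-{i}" for i in range(8)]
keys = [f"key-{i}" for i in range(50_000)]

before = load(nodes, keys)
after = load([n for n in nodes if n != "node-0"], keys)

# node-0's clockwise successor absorbs node-0's entire arc.
ring = sorted((h(n), n) for n in nodes)
idx = [n for _, n in ring].index("node-0")
successor = ring[(idx + 1) % len(ring)][1]
print(successor, before[successor], "->", after[successor])
```

The successor's count after removal equals its old count plus everything node-0 held, since no other node's range changed. Virtual nodes fix this by scattering each physical node across many small arcs, so a departing node's load is spread over many successors instead of one.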