Can the default hashCode act as a source of entropy?
The default implementation of hashCode() returns the object's identity hash, which is supplied by the JVM and is not particularly random. If hashCode() is overridden, which happens quite often, the return value is even less random, and any implementation that returns a random number from hashCode() (even one that stays the same across calls on a given instance) will most likely break the contract defined for equals() and hashCode().
From JavaDoc:
Note that it is generally necessary to override the hashCode method whenever the equals method is overridden, so as to maintain the general contract for the hashCode method, which states that equal objects must have equal hash codes.
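To make the quoted contract concrete, here is a minimal sketch with a hypothetical Point class: equals() and hashCode() are overridden together, so equal instances produce equal hash codes and hash-based collections behave correctly.

```java
import java.util.HashSet;
import java.util.Objects;
import java.util.Set;

// Hypothetical value class: hashCode() is overridden together with
// equals(), so equal objects always have equal hash codes.
final class Point {
    final int x, y;

    Point(int x, int y) { this.x = x; this.y = y; }

    @Override
    public boolean equals(Object o) {
        if (!(o instanceof Point)) return false;
        Point p = (Point) o;
        return x == p.x && y == p.y;
    }

    @Override
    public int hashCode() {
        // Deterministic: derived only from the fields equals() compares.
        return Objects.hash(x, y);
    }
}

public class ContractDemo {
    public static void main(String[] args) {
        Set<Point> set = new HashSet<>();
        set.add(new Point(1, 2));
        // The lookup works only because hashCode() agrees with equals():
        System.out.println(set.contains(new Point(1, 2))); // prints "true"
    }
}
```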
So I would recommend using java.util.Random (a single shared instance is fine) or something similar, but not hashCode().
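A minimal sketch of that recommendation: one shared java.util.Random instance as the entropy source. (Depending on the use case, ThreadLocalRandom or SecureRandom may be better choices; this just shows the basic pattern.)

```java
import java.util.Random;

// Sketch: a single shared Random instance instead of abusing hashCode().
public class EntropyDemo {
    private static final Random RNG = new Random();

    public static void main(String[] args) {
        // nextInt(6) yields 0..5, so this simulates a die roll in 1..6.
        int die = RNG.nextInt(6) + 1;
        System.out.println(die);
    }
}
```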
An ideal hash code should
- be equal for equal instances
- be different for unequal instances (in as many cases as possible)
Note that the second condition contradicts randomness. Imagine we have 10 objects. The best hashCode implementation returns
0, 1, 2, 3, 4, 5, 6, 7, 8, 9
for those instances, so hashCode has no collisions at all. Random numbers, by contrast, would look something like this:
5, 3, 7, 0, 5, 4, 2, 8, 1, 7
note the collisions (two 5s and two 7s). A good hashCode must therefore be deliberately non-random (to prevent, or at least minimize, collisions), so it should not be used as a random number tied to an instance. If you override hashCode to return a per-instance random value as the hash code, you actually make hashCode worse by introducing needless collisions.
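To see how badly this fails in practice, here is a sketch with a hypothetical BrokenKey class whose hashCode() returns a random value fixed per instance. Two instances that are equal by equals() almost always have different hash codes, so HashSet lookups fail.

```java
import java.util.HashSet;
import java.util.Random;
import java.util.Set;

// Hypothetical broken class: hashCode() returns a random value chosen
// once per instance, violating the equals()/hashCode() contract.
final class BrokenKey {
    private static final Random RNG = new Random();

    final int id;
    private final int randomHash = RNG.nextInt();

    BrokenKey(int id) { this.id = id; }

    @Override
    public boolean equals(Object o) {
        return o instanceof BrokenKey && ((BrokenKey) o).id == id;
    }

    @Override
    public int hashCode() {
        return randomHash; // equal instances get unrelated hash codes
    }
}

public class BrokenDemo {
    public static void main(String[] args) {
        Set<BrokenKey> set = new HashSet<>();
        set.add(new BrokenKey(42));
        // Equal by equals(), but the hash codes almost certainly differ,
        // so the set fails to find the element:
        System.out.println(set.contains(new BrokenKey(42)));
    }
}
```

HashMap and HashSet compare stored hash codes before calling equals(), which is why a mismatched hashCode() makes an equal key effectively invisible.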
So: use hashCode for hash codes and java.util.Random for random numbers.