What is the impedance of your antenna at the point where you will install the resistor?
If you are putting it at one end of an end-fed wire, the impedance may be high on some
bands. If the resistance is 10 times the impedance, then roughly 1/10 of your transmit power
will be dissipated in the resistor. For a 50 ohm line, that would argue for at least 1K, and perhaps
10K. For an end-fed half wave wire where the impedance might be 2000 ohms or more, then
something like 47K or 100K would be better.
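To make the ratio concrete, here is a small sketch (my own numbers, not from the original post) of the fraction of transmit power lost in a bleeder resistor R placed in parallel with an antenna whose feedpoint impedance is Z. Since both see the same RF voltage, the power split follows the conductances, which for R much larger than Z works out to roughly Z/R:

```python
def bleed_loss_fraction(z_ohms: float, r_ohms: float) -> float:
    """Fraction of total power dissipated in a bleeder R in parallel with impedance Z."""
    # Same voltage across both; power divides in proportion to conductance.
    g_ant = 1.0 / z_ohms
    g_res = 1.0 / r_ohms
    return g_res / (g_ant + g_res)

# Example values discussed above: 50 ohm line vs. high-Z end-fed wire.
for z, r in [(50, 1_000), (50, 10_000), (2_000, 47_000), (2_000, 100_000)]:
    print(f"Z={z:>5} ohm, R={r:>7} ohm -> {bleed_loss_fraction(z, r) * 100:.2f}% lost")
```

With 47K across a 2000 ohm feedpoint the loss is about 4%, which is why the higher-value resistor is the better choice there.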
The resistor also has to withstand the TX voltage across it. At 50 ohms, 100 watts is only
SQRT( 100 watts * 50 ohms ) = 70V RMS or ~100V peak. Most standard leaded resistors
are rated for 250V, and should be safe. But a 2000 ohm load would have a voltage of
SQRT( 100 watts * 2000 ohms ) = 450V RMS or ~630V peak, so you'd need a string of at
least 3 of them in series to handle the voltage.
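The voltage check above can be sketched in a few lines (the 250 V per-part rating is the assumption from the post; adjust it for the resistors you actually have). RMS voltage across an impedance Z at power P is SQRT(P * Z), and peak is SQRT(2) higher:

```python
import math

def rms_voltage(power_w: float, z_ohms: float) -> float:
    """RMS voltage across impedance z_ohms at power_w watts."""
    return math.sqrt(power_w * z_ohms)

def resistors_needed(peak_v: float, rating_v: float = 250.0) -> int:
    """How many identical series resistors keep each one within its voltage rating."""
    return math.ceil(peak_v / rating_v)

for z in (50, 2_000):
    vrms = rms_voltage(100, z)
    vpk = vrms * math.sqrt(2)
    print(f"{z} ohm at 100 W: {vrms:.0f} V RMS, {vpk:.0f} V peak -> "
          f"{resistors_needed(vpk)} x 250 V resistor(s) in series")
```

For the 2000 ohm case this confirms the string of three 250 V parts.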
I would normally look for a power resistor like 47K or 100K. For example, a 47K 10W resistor
reaches full rated dissipation at SQRT( 10 watts * 47K ) = 685V RMS, so check its voltage
rating too. Wire-wound resistors are fine - the inductance just increases the impedance.
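The same SQRT(P * R) relation gives the RMS voltage at which any power resistor reaches its rated dissipation; a quick sketch reproducing the 47K / 10W figure:

```python
import math

def full_dissipation_voltage(r_ohms: float, p_rated_w: float) -> float:
    """RMS voltage at which a resistor dissipates its full power rating."""
    return math.sqrt(p_rated_w * r_ohms)

print(f"47K 10W: {full_dissipation_voltage(47_000, 10):.0f} V RMS")
```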
And, what is the purpose of the "bleeder"? Discharging static voltages? If that is the intended application, one could avoid the losses entirely by feeding the dipole through a common trifilar-wound voltage balun (an autotransformer) with its secondary center tap grounded.