If 10 seconds of key-down gets the resistor hot enough to burn skin, then something is wrong: did you check that the resistor really is 150K ohms and not 150 ohms?
The voltage at the feedpoint of an end-fed wire antenna depends on the output power and the length of the wire in wavelengths: wires that are a multiple of a half wave can have feedpoint impedances of a few thousand ohms, while those that are an odd multiple of a quarter wave are often less than 100 ohms.
If the wire is very short in terms of wavelength, especially less than 1/8 wavelength, you can also see high voltages at the feedpoint: the feedpoint impedance will have a low resistance but a high (capacitive) reactance.
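As a rough illustration of that trend (my own sketch, not a real antenna model): treating the wire as a lossless open-circuited transmission line reproduces the pattern of low impedance near a quarter wave, high impedance near a half wave, and high capacitive reactance for a short wire. The 450-ohm characteristic impedance is an assumed ballpark for a wire over ground, and the model ignores radiation resistance, so the extreme values come out exaggerated.

```python
import math

# Idealized open-stub model of the wire, for trend only:
# X_in = -Z0 / tan(2*pi*l/lambda). Radiation resistance is ignored,
# and Z0 = 450 ohms is an assumed value for a wire over ground.
Z0 = 450.0

def reactance(l_wavelengths):
    """Input reactance of a lossless open-circuited line, in ohms."""
    return -Z0 / math.tan(2 * math.pi * l_wavelengths)

for l, label in [(0.05, "very short"), (0.125, "1/8 wave"),
                 (0.24, "near 1/4 wave"), (0.49, "near 1/2 wave")]:
    print(f"{label:>14} ({l} wl): X = {reactance(l):6.0f} ohms")

# very short:   -1385 ohms (large capacitive reactance)
# 1/8 wave:      -450 ohms
# near 1/4 wave:  -28 ohms (low impedance)
# near 1/2 wave: +7153 ohms (thousands of ohms)
```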
So the dissipation in the resistor depends on the wire length and the band of operation. But 150K ohms should be more than adequate to keep that dissipation low. As long as the resistor doesn't get so hot that it is uncomfortable to touch, I wouldn't worry about it.
You can add an RF choke in series with the resistor, which will increase the impedance, but if the resistor really is 150K ohms it won't make much difference.
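To put a number on that (the 1 mH choke and the 7 MHz operating frequency below are assumed values, just for illustration): the resistor and choke in series have an impedance magnitude of sqrt(R^2 + X_L^2), so the choke's reactance would have to approach 150K ohms before it changed anything.

```python
import math

R = 150e3   # bleed resistor, ohms
L = 1e-3    # assumed choke inductance, henries
f = 7e6     # assumed operating frequency, Hz

XL = 2 * math.pi * f * L        # choke reactance: about 44K ohms
Z = math.sqrt(R**2 + XL**2)     # magnitude of series R + jX_L

print(f"X_L = {XL/1e3:.0f}K ohms, |Z| = {Z/1e3:.0f}K ohms vs R = 150K")
# X_L = 44K ohms, |Z| = 156K ohms vs R = 150K -- only about 4% more
```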
Let's say you are using a half-wave end-fed wire and the feedpoint impedance is 5000 ohms (which is probably on the high side). The RMS voltage is given by the square root of P * R, so at 100 watts we have SQRT( 100 * 5000 ) = 707V, or about 1000V peak. We'd use the RMS voltage to determine the power dissipation in the 150K resistor: 707^2 / 150,000 works out to about 3 1/3 watts. That's enough to make the resistor run noticeably warm, but one rated for a few watts of dissipation shouldn't get too hot to touch.
With other wire lengths the dissipated power will be less, unless the wire is very short.
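Here's the same arithmetic as a quick sketch, checking the 5000-ohm case and sweeping a few other feedpoint resistances (the values other than 5000 ohms are my own illustrative picks, not measurements):

```python
import math

P_TX = 100.0      # transmitter output power, watts
R_BLEED = 150e3   # bleed resistor, ohms

for r_feed in (50, 450, 2500, 5000):
    v_rms = math.sqrt(P_TX * r_feed)    # feedpoint voltage, RMS
    v_peak = v_rms * math.sqrt(2)
    p_diss = v_rms**2 / R_BLEED         # dissipation in the resistor
    print(f"R_feed = {r_feed:5d} ohms: {v_rms:4.0f} V RMS "
          f"({v_peak:4.0f} V peak), {p_diss:5.2f} W in the resistor")

# R_feed =    50 ohms:   71 V RMS ( 100 V peak),  0.03 W in the resistor
# R_feed =   450 ohms:  212 V RMS ( 300 V peak),  0.30 W in the resistor
# R_feed =  2500 ohms:  500 V RMS ( 707 V peak),  1.67 W in the resistor
# R_feed =  5000 ohms:  707 V RMS (1000 V peak),  3.33 W in the resistor
```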