What would be the downside of increasing the bias voltage?
Higher idle current, so higher power dissipation in the output transistor.
At low duty cycles this might not be a problem, but if dissipation is
already marginal it can cause the transistor to overheat.
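
For a rough sense of scale, here is a minimal sketch of the idle dissipation; the supply voltage and bias currents are assumed values for illustration, not taken from any particular rig:

```python
# Continuous dissipation in the output transistor at idle.
# v_supply and the bias currents are assumptions for illustration.

v_supply = 13.8                      # supply voltage, volts (assumed)

for i_idle in (0.02, 0.05, 0.10):    # idle current, amps (assumed)
    p_idle = v_supply * i_idle       # P = V * I at idle, watts
    print(f"{i_idle * 1000:4.0f} mA idle -> about {p_idle:.2f} W")
```

An extra watt or so is nothing for a heatsinked final, but it can easily exceed what a small plastic package can shed on its own.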
The idle current is not a linear function of voltage. The base-emitter
junction drop of the final transistor is also about 0.6 V, so increasing
the bias even slightly above that causes a large increase in current, far
more than a simple application of Ohm's law would suggest: the junction
current rises roughly exponentially with voltage.
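
To make that nonlinearity concrete, here is a minimal sketch using the Shockley diode model of the base-emitter junction; the saturation current and thermal voltage are typical assumed values, not measured from this circuit:

```python
import math

I_S = 1e-14          # saturation current, amps (assumed typical)
V_T = 0.026          # thermal voltage near room temperature, volts

def junction_current(v_be):
    """Shockley model: current through the base-emitter junction."""
    return I_S * (math.exp(v_be / V_T) - 1.0)

for v_be in (0.60, 0.62, 0.64, 0.66):
    print(f"V_BE = {v_be:.2f} V -> {junction_current(v_be) * 1000:7.3f} mA")
```

In this model each extra 18 mV or so doubles the current, so a mere 60 mV increase in bias gives roughly ten times the idle current.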
Probably the best approach is to adjust the bias for the standing
current that gives the most linear operation at normal operating
temperature. Or rebuild the bias circuit to make it more stable with
temperature, as sketched below. Or just ignore the difference and enjoy
operating regardless of the output power.
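
One common way to stabilize the bias is to put a diode (or a diode-connected transistor) in the bias divider and thermally couple it to the output transistor, so its roughly -2 mV/°C forward drop tracks the transistor's V_BE. A minimal sketch of the effect, using typical assumed values rather than measurements:

```python
import math

V_T = 0.026       # thermal voltage, volts (near room temperature)
TEMPCO = -0.002   # V_BE temperature coefficient, V/degC (typical)

def idle_current_ratio(delta_t_c, compensated):
    """Idle current relative to its room-temperature value."""
    # With a fixed bias voltage the junction needs about 2 mV less
    # drive per degree C, so the effective overdrive grows as the
    # transistor warms up.
    overdrive = -TEMPCO * delta_t_c
    if compensated:
        overdrive = 0.0   # an ideally tracking diode cancels the drift
    return math.exp(overdrive / V_T)

for dt in (10, 20, 30):
    print(f"+{dt:2d} degC: fixed bias x{idle_current_ratio(dt, False):5.1f}, "
          f"compensated x{idle_current_ratio(dt, True):.1f}")
```

The same exponential sensitivity that makes the bias touchy also makes uncompensated thermal drift compound quickly, which is why the sense diode should sit close to (ideally on the same heatsink as) the output device.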
And even though the bias voltage changes, that isn't necessarily
what is causing the difference: you'd have to check that the drive
power is constant as well. There are other possible causes: for
example, a temperature-sensitive capacitor in a high-Q circuit in a
driver stage, or a marginal electrolytic capacitor.
You can investigate the source of the problem further by using
freeze spray or a small source of heat (perhaps a hair dryer with
a narrow nozzle) to change the temperature of a small section of
the circuit and see whether that makes a difference. That was how
I found the bad capacitor that was causing frequency drift in my
40m pocket CW transceiver.