That depends on what you are trying to accomplish.
If you want to drop the voltage to the specified output of the transformer,
then with a 15 ohm load it should put out 3V at 200 mA.
But if you want to use it as a 5V (or 3V) source for an indicator of
some sort, then you don't need to draw full power.
For example, if it puts out 6V no load, you could put six 1K
(or even 10K) resistors in series across the output, and take your 5V
reference off the tap one resistor down from the top of the stack,
because the output voltage under that light load won't drop much.
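A quick sanity check of that divider-tap idea (the 6V no-load figure and six 1K resistors are the assumed values from above):

```python
# Resistor-stack divider: six equal resistors across a 6 V no-load output,
# tap taken one resistor down from the top (i.e. across the bottom five).
v_out = 6.0    # assumed no-load output voltage (V)
n = 6          # number of equal resistors in the stack
r = 1000.0     # each resistor (ohms)
total = n * r  # 6 kOhm total across the supply

# Tap voltage across the bottom five resistors:
v_tap = v_out * (n - 1) * r / total
print(v_tap)   # 5.0 V

# Current drawn from the supply -- light enough that it barely sags:
i_stack = v_out / total
print(i_stack) # 0.001 A, i.e. 1 mA
```

At 1 mA the drop across the transformer's internal resistance is only millivolts, which is why the stack barely loads the supply.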
Or you could experiment to find what load resistance drops the
output to 5V: a good starting point is to assume the internal
resistance of the transformer is about 15 ohms. Since the load
then drops 5V while the internal resistance drops 1V, the load
resistor should be about 5 x 15 = 75 ohms.
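The same estimate as a voltage-divider calculation (the 6V no-load voltage and 15 ohm internal resistance are the assumptions stated above):

```python
# Load-resistor estimate: the internal resistance r_int and the load
# r_load form a divider across the open-circuit voltage v_oc.
v_oc = 6.0      # assumed open-circuit (no-load) voltage (V)
r_int = 15.0    # assumed internal resistance of the transformer (ohms)
v_target = 5.0  # desired loaded output voltage (V)

# v_target = v_oc * r_load / (r_load + r_int)
# => r_load = r_int * v_target / (v_oc - v_target)
r_load = r_int * v_target / (v_oc - v_target)
print(r_load)   # 75.0 ohms

# Worth checking the dissipation in that resistor:
i_load = v_target / r_load
p_load = v_target * i_load
print(i_load)   # about 0.067 A
print(p_load)   # about 0.33 W -- use at least a half-watt resistor
```

Note this burns roughly a third of a watt continuously, which is why the lightly loaded divider below is the nicer option for an indicator.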
But in your application, using higher-value resistors so the
supply is lightly loaded, then tapping the resistor chain at
the desired voltage, dissipates less power and is probably
easier to adjust. (You could, of course, just put a 10K pot
across the output of the supply and adjust it for the desired
voltage.)
That's assuming that your detector / indicator circuit has a
high input impedance, of course.