The answer to your question is called Ohm's Law. Unless you have zero resistance you'll always find a voltage drop along a line.

If it were just resistance, it would be much easier to build a supply with good regulation.

If you have a transformer rated at 12VAC (RMS), then the peak voltage will be (1.4 * 12 =) 16.8V, and this is what the capacitor will charge to with no load. But at full load it will drop down closer to the nominal 12VAC, a swing of 4.8V. This voltage change is *in addition to* any voltage drop due to resistance in the circuit.
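The arithmetic above can be sketched in a few lines of Python; note the answer rounds the exact factor √2 ≈ 1.414 down to 1.4 (the function name here is just for illustration):

```python
import math

def peak_from_rms(v_rms):
    """Peak of a sine wave from its RMS value: V_pk = sqrt(2) * V_rms."""
    return math.sqrt(2) * v_rms

v_rms = 12.0                      # transformer rating, VAC RMS
v_peak = peak_from_rms(v_rms)     # ~16.97 V exact; ~16.8 V with the 1.4 approximation
print(f"No-load capacitor voltage: {v_peak:.1f} V")
print(f"Drop from no load to nominal: {v_peak - v_rms:.1f} V")
```

With the exact √2 the no-load voltage comes out slightly higher (about 17.0 V), but the conclusion is the same: a swing of several volts on top of any resistive drops.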

First of all, my intention was to keep the answer simple for easier understanding.

Of course you are right concerning the input capacitor, i.e. the one after the rectifier. To complete the picture, the transformer also has to be able to deliver sufficient current, not just voltage. I usually use a transformer with an output of at least 1.4 times the desired DC output voltage, so 18V would be desirable here for 12V DC at whatever current is planned. Good transformers are stiff ones, i.e. they show little voltage change from no load to full load. But now we are really getting into design details.
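The 1.4× rule of thumb above can be sketched as a small selection helper; the function and parameter names are hypothetical, and the list of standard secondary voltages is just an assumed set of common values:

```python
def min_secondary_rms(v_dc_out, headroom_factor=1.4):
    """Rule-of-thumb minimum transformer secondary voltage (VAC RMS)
    for a desired DC output: at least headroom_factor times the DC target."""
    return headroom_factor * v_dc_out

# Assumed list of common off-the-shelf secondary voltages (VAC)
standard_secondaries = [6, 9, 12, 15, 18, 24]

need = min_secondary_rms(12.0)                             # 16.8 V minimum
choice = next(v for v in standard_secondaries if v >= need)
print(f"Need at least {need:.1f} VAC; pick {choice} VAC")
```

For a 12V DC target this lands on the 18V secondary mentioned above, the next standard value at or above the 16.8V minimum.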