>2.) percentage of voltage drop that is allowed<

I think that is where people muddle up conductor size and conclude that the 'wire is too small to let the current through'. Possibly because of the old water-pipe analogy?

Now what about true Litz wire with all those small conductors?

I think we try to use fancy words, and that confuses our communications. Ampacity is a word specially invented to mean ampere capacity as dictated by the NEC. We would all do a whole lot better not using it outside of NEC conductor-sizing applications, because we might assume it means something it does not.

Conductors have thermal ratings caused by I^2 R heating, and they have voltage drop caused by current and resistance.

Nowhere in this does working voltage come into play.

If we allow 1 volt drop, it does not matter if the supply is 1 volt or 1 million volts, the wire size is the same for the same current and length.
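A quick sketch of that claim, with assumed example numbers: for a fixed allowable drop and current, the required conductor resistance is just R = V/I, and the supply voltage never enters the calculation.

```python
def max_wire_resistance(allowed_drop_v, current_a):
    """Largest conductor resistance that keeps the drop within the allowance."""
    return allowed_drop_v / current_a  # Ohm's law; supply voltage is absent

# Allowing a 1 V drop at 20 A always requires R <= 0.05 ohm,
# whether the supply is 1 V or 1 million volts:
for supply_v in (1, 120, 1_000_000):
    r = max_wire_resistance(1.0, 20)   # supply_v is never used
```

Since the required resistance fixes the cross-sectional area for a given length and material, the wire size is the same in all three cases.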

If we allow a certain amount of heat for a given current, it is always the same no matter what the voltage.

When we mix this into a compound requirement, like a certain % of regulation or a heat limit at a certain load **power**, it is a different criterion. Now we change the current through the wire, and of COURSE the required size changes.

For example, suppose we allow 5% voltage drop and a certain amount of heat, run 120 volts, and find it requires X mils of cross-sectional area for acceptable behavior. Now run 240 volts at the same load POWER on the same wire. We have half the current, so half the voltage drop in volts, against twice the supply voltage, so regulation is FOUR times better. And since heat is I^2 R, halving the current makes the wire heat FOUR times less for the same load power.
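The 120 V vs 240 V comparison works out like this in a short sketch (load power and wire resistance are made-up example values):

```python
def run_load(supply_v, load_w, wire_r):
    """Regulation (% drop) and wire heating for a constant-POWER load."""
    i = load_w / supply_v            # same power, so current halves when V doubles
    drop = i * wire_r                # volts lost in the wire
    reg_pct = 100 * drop / supply_v  # drop as a percent of supply
    heat_w = i**2 * wire_r           # I^2 * R wire heating
    return reg_pct, heat_w

# Same 1200 W load, same 0.2 ohm wire, two supply voltages:
reg_120, heat_120 = run_load(120, 1200, 0.2)
reg_240, heat_240 = run_load(240, 1200, 0.2)
# reg_120 / reg_240 -> 4.0   (regulation four times better at 240 V)
# heat_120 / heat_240 -> 4.0 (wire heat four times less at 240 V)
```

Half the current gives half the absolute drop, but measured against twice the supply voltage the percentage regulation improves by a factor of four; the I^2 term gives the same factor of four on heat.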

What messes us up is not framing the problem around what we are really interested in. But the CURRENT rating of a wire, for a given voltage drop or amount of heat, clearly does not change with voltage.