Almost all of the loss in coaxial cable at commonly used frequencies comes from the surface resistance of the center conductor. The outer conductor, having much more surface area, contributes very little, and dielectric losses are negligible until you get into the SHF region.
The conductor loss splits between inner and outer roughly in inverse proportion to their diameters, or about 3.3:1 (inner:outer) for solid-PE-dielectric 50 ohm cable. I don't know that that qualifies as "almost all", but I'd certainly go for 75% of the loss being in the center conductor. For foam or air dielectric the ratio is smaller (about 2.46:1 for LMR400), and for the same outer diameter the inner conductor is bigger, so the total loss is lower, but the shield becomes a bigger fraction of that loss.
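A quick back-of-the-envelope check in Python, assuming both conductors are the same metal (so loss per unit length goes inversely with diameter) and using the ideal coax impedance formula Z0 = (60/sqrt(er))*ln(D/d); the numbers are illustrative, not a substitute for datasheet figures:

import math

def conductor_loss_split(z0_ohms, eps_r):
    # D/d from Z0 = (59.96 / sqrt(eps_r)) * ln(D/d)
    d_ratio = math.exp(z0_ohms * math.sqrt(eps_r) / 59.96)
    # Same metal inner and outer: loss per unit length ~ 1/diameter,
    # so the center conductor's share of the conductor loss is D/(D+d).
    inner_share = d_ratio / (d_ratio + 1.0)
    return d_ratio, inner_share

ratio, share = conductor_loss_split(50.0, 2.3)   # solid polyethylene
print(f"D/d = {ratio:.2f}, center conductor ~{share:.0%} of conductor loss")
# Prints roughly D/d = 3.5 and ~78%, in the same ballpark as the
# 3.3:1 / 75% figures above (real cables differ a bit from the ideal formula).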
I ran across an interesting anomaly in the whole coax-loss question the other day: a cable can be really good at microwave but not so hot at HF. Some cables intended for microwave use have a very thin layer of silver or copper over a stainless steel core. At microwave frequencies the skin depth is much less than the cladding thickness, but at HF that's no longer true. (LMR400, for instance, is copper over aluminum.) The really big coax actually uses tubing for the center conductor. I'm pretty sure the tube wall is thicker than the skin depth even at low HF (in copper, about 33 microns, or 0.0013", at 4 MHz), but it would be something to check. As a rule of thumb, you want the conductor thickness to be at least 5 times the skin depth.
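For reference, here's a minimal skin-depth calculator (copper conductivity assumed, delta = 1/sqrt(pi*f*mu*sigma)), with the 5x-skin-depth rule of thumb applied:

import math

MU0 = 4e-7 * math.pi     # permeability of free space, H/m
SIGMA_CU = 5.8e7         # conductivity of annealed copper, S/m

def skin_depth_m(freq_hz, sigma=SIGMA_CU, mu_r=1.0):
    return 1.0 / math.sqrt(math.pi * freq_hz * mu_r * MU0 * sigma)

for f_mhz in (1, 4, 30, 144, 1000, 10000):
    delta_um = skin_depth_m(f_mhz * 1e6) * 1e6
    print(f"{f_mhz:>6} MHz: skin depth {delta_um:6.2f} um, "
          f"want >= {5 * delta_um:6.1f} um of conductor (5x rule)")
# 4 MHz comes out around 33 um, so a few microns of plating that is
# plenty at 10 GHz (skin depth ~0.66 um) is nowhere near enough at HF.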
For a small-diameter conductor, less than about 10 skin depths across, the AC resistance is higher still. For very small coax, 'WIK is probably closer to the truth when he says "almost all of the loss is in the center conductor", but even something as small as RG-8X or RG-58 has a center conductor around 1 mm in diameter, which is roughly 30 skin depths at 4 MHz.
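A quick check of how many skin depths across a few common center conductors are at 4 MHz (the diameters are assumed nominal values, so check the datasheet for your particular cable):

import math

MU0 = 4e-7 * math.pi
SIGMA_CU = 5.8e7  # S/m

def skin_depth_m(freq_hz):
    return 1.0 / math.sqrt(math.pi * freq_hz * MU0 * SIGMA_CU)

delta = skin_depth_m(4e6)   # ~33 um at 4 MHz
diameters_mm = {"RG-58 (solid)": 0.81, "RG-8X": 1.0, "RG-213": 2.26}
for name, d_mm in diameters_mm.items():
    print(f"{name:14s}: {d_mm * 1e-3 / delta:5.1f} skin depths across at 4 MHz")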
I've been looking into this in connection with inexpensive coax (e.g. RG-6) intended for cable-TV type applications, a lot of which has copper-clad-steel center conductors.
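As a rough way to see where a copper-clad-steel conductor stops looking like copper, here's a sketch comparing skin depth against an assumed cladding thickness; the 35 um figure is purely a placeholder, since the actual copper layer varies a lot between CCS grades:

import math

MU0 = 4e-7 * math.pi
SIGMA_CU = 5.8e7  # S/m

def skin_depth_um(freq_mhz):
    return 1e6 / math.sqrt(math.pi * freq_mhz * 1e6 * MU0 * SIGMA_CU)

cladding_um = 35.0   # hypothetical copper layer thickness -- check the actual spec

for f in (1.8, 4.0, 7.0, 14.0, 30.0, 100.0, 550.0, 1000.0):
    d = skin_depth_um(f)
    verdict = "comfortably copper" if cladding_um >= 5 * d else "thinner than 5 skin depths"
    print(f"{f:7.1f} MHz: skin depth {d:6.2f} um vs {cladding_um:.0f} um cladding -> {verdict}")
# At CATV/UHF frequencies the copper layer easily covers 5 skin depths,
# but at the low end of HF the current reaches well into the steel.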