Ground wave (or "surface wave") attenuation with distance
is a function of soil conductivity, because the radio wave
induces currents in the ground.
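This is easy to put rough numbers on. The return currents
flow in a layer about one skin depth thick, and the sheet
resistance of that layer sets the loss. A quick sketch in
Python (the conductivities, 5 S/m for sea water and
1 mS/m for poor ground, are the usual CCIR reference
values, assumed here):

    import math

    MU0 = 4e-7 * math.pi  # free-space permeability, H/m
                          # (non-magnetic ground assumed)

    def skin_depth(freq_hz, sigma):
        """Depth in metres at which induced currents fall to 1/e."""
        return math.sqrt(1.0 / (math.pi * freq_hz * MU0 * sigma))

    def sheet_resistance(freq_hz, sigma):
        """Effective resistance, ohms per square, of that layer."""
        return 1.0 / (sigma * skin_depth(freq_hz, sigma))

    f = 200e3  # 200 kHz, as in the CCIR example below
    for name, sigma in [("sea water", 5.0), ("poor ground", 1e-3)]:
        print(f"{name:11s} (sigma = {sigma:g} S/m): "
              f"skin depth {skin_depth(f, sigma):5.1f} m, "
              f"Rs {sheet_resistance(f, sigma):5.2f} ohm/sq")

Poor ground comes out roughly 70 times more resistive per
square than sea water, which is the loss mechanism behind
the range difference below.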
Looking at the CCIR curves for ground-wave propagation,
a 1 kW transmitter at 200 kHz gives a signal strength of
1 uV/m at 2000 km over sea water vs. 800 km over poor
ground. (Assuming my eyes could read the graphs
properly.)
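For scale: those curves are normalised (as I recall) to a
short vertical monopole radiating 1 kW, which gives about
300 mV/m at 1 km. With no ground loss the field would fall
off only as 1/d, so the quoted ranges imply tens of dB of
extra absorption. The arithmetic in Python (that 300 mV/m
reference level is an assumption on my part):

    import math

    E0_uV = 300e3  # assumed: 300 mV/m at 1 km for 1 kW, in uV/m

    for name, d_km in [("sea water", 2000.0), ("poor ground", 800.0)]:
        e_spread = E0_uV / d_km                    # 1/d spreading alone
        loss_db = 20 * math.log10(e_spread / 1.0)  # field read: 1 uV/m
        print(f"{name:11s}: spreading alone predicts "
              f"{e_spread:4.0f} uV/m at {d_km:.0f} km -> "
              f"ground absorbs ~{loss_db:.0f} dB extra")

That's roughly 44 dB of ground loss over sea water and
51 dB over poor ground on top of ordinary spreading.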
Why don't we point the antenna at the ground? The
"surface wave" signal is confined in the space between
the surface of the earth and a height of roughly one
wavelength above it. We're not picking it up from the
ground, but rather the wave travels along the earth/
air boundary. A vertically polarized antenna is optimum
for both transmitting and receiving.
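To put "roughly one wavelength" in numbers for the 200 kHz
case above (trivial, but it makes the point):

    C = 299_792_458.0  # speed of light, m/s
    print(f"wavelength at 200 kHz: {C / 200e3:,.0f} m")
    # -> ~1,499 m: the wave rides in a layer roughly
    #    1.5 km deep, so a ground-level antenna is already
    #    immersed in it; tilting it downward gains nothing.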
There may be other factors that affect the received
signal strength, such as the orientation of the loop
antenna or local noise levels, but the intervening ground
between you and the transmitter certainly makes a
difference.