I'm not an electrical
engineer, but I play one at work and I think the following is accurate.
For a 15 amp current, the voltage drop per linear foot and the total drop across 50 and 100 foot extension cords are as follows:

10 gauge: 0.01550 volts/ft, 0.775 volts for 50 ft, 1.55 volts for 100 ft
12 gauge: 0.02468 volts/ft, 1.234 volts for 50 ft, 2.468 volts for 100 ft

For a 30 amp current, the same data is as follows:

10 gauge: 0.03100 volts/ft, 1.55 volts for 50 ft, 3.10 volts for 100 ft
12 gauge: 0.04936 volts/ft, 2.468 volts for 50 ft, 4.936 volts for 100 ft

(One caveat: these figures work out to the drop along a single conductor. The current also has to come back through the neutral wire, so the drop you'd actually measure at the far end of a cord is roughly double.)
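For anyone who wants to try other lengths and gauges, here is a minimal Python sketch of the same arithmetic. The per-1000-foot resistances are approximate values I picked to match the figures above, not numbers off any cord's spec sheet:

```python
# Rough voltage-drop calculator for copper extension cords.
# Single-conductor resistances in ohms per 1000 ft, chosen to
# match the tables above (approximate, assumed values).
R_PER_1000FT = {10: 1.033, 12: 1.645}

def drop_per_foot(gauge, amps):
    """Voltage drop per foot of one conductor, in volts."""
    return amps * R_PER_1000FT[gauge] / 1000.0

def cord_drop(gauge, amps, feet, round_trip=True):
    """Drop across a cord; round_trip doubles the wire length to
    account for the current returning through the neutral."""
    return drop_per_foot(gauge, amps) * feet * (2 if round_trip else 1)

for gauge in (10, 12):
    for feet in (50, 100):
        print(f"{gauge} AWG, {feet} ft at 15 A: "
              f"{cord_drop(gauge, 15, feet, round_trip=False):.3f} V one way, "
              f"{cord_drop(gauge, 15, feet):.3f} V round trip")
```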
So if you use a 12 gauge, 100 foot long extension cord connected to a typical home 15 amp line, the voltage drop is approximately 2.5 volts per conductor between the wall and the trailer.
If we assume the trailer has a built-in 25 foot 10 gauge line, an additional drop of 0.3875 volts occurs. That is a total voltage drop of almost 3 volts, or about 2.6 percent of the nominal 110 volts available. Internal trailer wiring and plain bad luck will push the loss to 3% or more. And we should assume there is a measurable voltage drop between the power pole outside the house and the electric plug on the side of the house as well.
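Spelled out in a few lines (the 25 foot built-in line is the assumption from the paragraph above):

```python
# Adding up the per-conductor drops from the example above:
# a 100 ft 12-gauge cord plus an assumed 25 ft of 10-gauge
# wiring built into the trailer, all carrying 15 A.
cord_drop    = 0.02468 * 100   # 2.468 V
trailer_drop = 0.01550 * 25    # 0.3875 V
total = cord_drop + trailer_drop
print(f"total drop: {total:.4f} V "
      f"({100 * total / 110:.1f}% of a nominal 110 V)")
```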
What all this means is beyond me, but here is my best guess. If your device strives to maintain a constant power output, the current will increase by roughly 3% to compensate for the 3% voltage drop. The extra current produces a further voltage drop, and the cycle continues until a thermal or current limit, or a stable operating point, is reached. In other words, the lower voltage makes your pumps and fans draw more current, which lowers the voltage further, which forces them to draw more current still to maintain the necessary power level.
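To see why that feedback usually settles rather than running away, here is a toy iteration. The wattage and cord resistance are made-up illustrative numbers (roughly a 15 amp load on 100 feet of 12 gauge), not measurements:

```python
# Toy model of the feedback loop described above: a constant-power
# load draws I = P / V, the cord drops I * R volts, repeat until
# the numbers settle. P and R are assumed, illustrative values.
P = 1650.0        # watts the device tries to maintain (15 A at 110 V)
V_SOURCE = 110.0  # volts at the wall
R_CORD = 0.33     # ohms round trip, ~100 ft of 12 gauge (rough)

v_load = V_SOURCE
for step in range(6):
    i = P / v_load                   # lower voltage -> more current...
    v_load = V_SOURCE - i * R_CORD   # ...more current -> bigger drop
    print(f"step {step}: {i:.2f} A, {v_load:.2f} V at the load")
```

In this toy model the loop converges quickly to a stable point, about 15.7 amps at roughly 105 volts; with a long enough (high enough resistance) cord there is no stable point at all, which is the runaway case.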
This voltage drop and current increase are part of the reason extension cords shouldn't be hooked together. It's also why my extension cords get so darn hot at Christmas.
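One last back-of-the-envelope, since heat is the thing you actually feel: the power burned in the cord itself is I squared times R, and chaining cords simply adds their resistances. The resistance below is the same rough round-trip figure used earlier:

```python
# Heat dissipated in the cord itself is I^2 * R, and
# daisy-chaining cords adds their resistances.
def cord_heat_watts(amps, r_ohms):
    return amps ** 2 * r_ohms

ONE_CORD = 0.33  # ohms round trip, ~100 ft of 12 gauge (rough)
print(f"one cord:  {cord_heat_watts(15, ONE_CORD):.0f} W of heat")
print(f"two cords: {cord_heat_watts(15, 2 * ONE_CORD):.0f} W of heat")
```

Seventy-odd watts spread along one cord is merely warm; the same again in a second cord, often coiled up behind the tree, is how they end up hot to the touch.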