I'm having this same issue that I've been working through for the last couple of days, and I believe I've found the answer, although it's yet to be definitively tested.
It looks like it all comes down to the extension cords. I've used two different microwaves and they both behave the same way. I used my Kill-a-Watt to confirm that both microwaves draw the same amount of power (approx. 880 watts at 9.2 amps). For me, I believe it comes down to the gauge of the extension cords, combined with the length of the run.
Here are a couple of good links to help explain:
* This one covers the wire size for a circuit depending on the voltage, length of the run, amps, etc. It stands to reason that the same would be true of extension cords too: Wire Size Calculator
So, for me, running copper wire approx. 100 ft off a single-pole, 120 V, 20 A circuit, it says I should be using 8 gauge wire.
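For anyone who wants to sanity-check the calculator's numbers, here's a rough sketch of the underlying voltage-drop math. The resistance values are approximate ohms per 1000 ft for copper from standard AWG tables; actual cord resistance varies with temperature and construction, so treat this as an estimate, not gospel:

```python
# Rough extension-cord voltage-drop estimate (DC approximation).
# Resistances are approximate ohms per 1000 ft of copper at room
# temperature, taken from standard AWG tables.
OHMS_PER_1000FT = {14: 2.525, 12: 1.588, 10: 0.999, 8: 0.628}

def voltage_drop(awg, length_ft, amps):
    """Round-trip drop: current flows out and back, so 2x the run length."""
    return 2 * length_ft * amps * OHMS_PER_1000FT[awg] / 1000

def drop_percent(awg, length_ft, amps, volts=120):
    return 100 * voltage_drop(awg, length_ft, amps) / volts

# 100 ft run at the microwave's measured 9.2 A:
for awg in (14, 12, 10, 8):
    print(f"{awg} AWG: {voltage_drop(awg, 100, 9.2):.1f} V "
          f"({drop_percent(awg, 100, 9.2):.1f}%)")
```

At 9.2 A over 100 ft, a light 14 AWG cord drops roughly 4-5 V while 8 AWG stays near 1 V, which is why the calculator pushes you toward the heavier gauge once you size for the circuit's full 20 A rating.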
* This one is just an overall good read on the subject. The key piece I learned from it is: "Length - As mentioned above, for every foot that current flows through a cord, there is a voltage loss. For a given current flow, fifty feet of cord will have twice as much voltage loss as 25 feet of the same cord." There's also a good chart showing the percentage of voltage drop depending on several factors: http://home.mchsi.com/~gweidner/extension-cords.pdf
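That quoted point, that drop scales linearly with length, falls straight out of the formula. A quick sketch using 16 AWG as an example (a typical light-duty cord gauge; the ~4.016 ohms per 1000 ft figure is an approximate table value I'm assuming here):

```python
R_16AWG = 4.016 / 1000  # approx. ohms per foot, copper, 16 AWG

def drop(length_ft, amps, ohms_per_ft=R_16AWG):
    # Out-and-back path, hence the factor of 2.
    return 2 * length_ft * amps * ohms_per_ft

short = drop(25, 9.2)   # 25 ft of cord at the microwave's 9.2 A
long = drop(50, 9.2)    # same cord, twice the length
print(short, long)      # the 50 ft drop is exactly double the 25 ft drop
```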
I had already replaced circuit breakers, etc. Tonight I'm going to go back to square one and plug the microwave directly into the outlet. If it still happens, I'm going to put a new single-pole 120 V/20 A circuit into the panel and run it to one dedicated outlet using either 10 or 8 gauge wire and see if that resolves the issue.
But I'm confident this issue is not due to the microwave itself, since both units behaved identically.
Hope that helps anyone who happens across this in the future. I'll try to remember to update with any findings going forward.