In 80 Plus testing, the difference between 50% load and 100% load is typically a 3-4% loss in efficiency.
It varies some; I've seen a few high-end Gold units lose closer to 2% between those points.
PSUs, unlike miners, are ASSETS and not LIABILITIES.
Miners are not liabilities as long as they can mine at a profit. I'd class them as short-term, high-depreciation assets, versus power supplies, where a good unit can be a very long-term asset.
Given that I anticipate the generation of miners now showing up to have a high probability of remaining viable for 4+ years, I doubt power supplies will outlast three generations of miners past the current one on a high-probability basis. Miners are catching up to the state of the art in the current generation, after several years of each ASIC generation lagging well behind the current state of the art in semiconductor manufacturing.
Going from 50% to 80% load on MOST Gold power supplies usually costs less than a 1% drop in efficiency. The optimal point on an ALL-costs basis seems to be around the 70-80% ballpark, which is also a good place to be for LONG-TERM reliability.
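To put a rough dollar figure on the load-point tradeoff, here is a minimal sketch. All the numbers are hypothetical assumptions for illustration: a 600 W DC load, $0.10/kWh, and a typical 80 Plus Gold curve of roughly 90% efficiency at 50% load versus 87% at 100% load (actual units vary, as noted above).

```python
# Hypothetical figures: 600 W DC miner load, $0.10/kWh,
# ~90% PSU efficiency at 50% load vs ~87% at 100% load.
dc_load_w = 600.0
price_per_kwh = 0.10
hours_per_year = 24 * 365

def annual_cost(efficiency):
    # Wall draw = DC load / PSU efficiency; cost = kWh/yr * price.
    wall_watts = dc_load_w / efficiency
    return wall_watts / 1000 * hours_per_year * price_per_kwh

cost_100pct = annual_cost(0.87)  # PSU sized so the miner is ~100% load
cost_50pct = annual_cost(0.90)   # larger PSU, miner is ~50% load
print(f"100% load: ${cost_100pct:.2f}/yr")
print(f" 50% load: ${cost_50pct:.2f}/yr")
print(f"savings:   ${cost_100pct - cost_50pct:.2f}/yr")
```

The savings come out to roughly $20/yr per unit under these assumptions, which is why the bigger-PSU question hinges on how much extra the oversized supply costs up front.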
I'm really not sure about downclocking and power usage
In my S5 testing, efficiency was pretty close to flat versus clock rate from 300 MHz to 380 MHz or so. The variance seemed to be mostly measurement tolerance, plus a very small contribution from the power supply's efficiency shifting slightly with load level.
I don't see dropping to 275 MHz being a significant efficiency change.
POWER usage, on the other hand, did change with the clock: downclocking WOULD drop the total power used approximately in proportion to the clock rate change.
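The observation above (power and hashrate both scale roughly linearly with clock, so J/GH stays flat) can be sketched as follows. The baseline figures are hypothetical ballpark stock-S5 numbers (~590 W at the wall, ~1155 GH/s at 350 MHz), not measurements from this thread:

```python
# Hypothetical baseline: ballpark stock S5 figures at 350 MHz.
base_clock_mhz = 350
base_power_w = 590.0
base_hashrate_ghs = 1155.0

def scaled(clock_mhz):
    # Model: hashrate and power both scale linearly with clock,
    # so efficiency (J/GH = W per GH/s) stays approximately flat.
    ratio = clock_mhz / base_clock_mhz
    power = base_power_w * ratio
    hashrate = base_hashrate_ghs * ratio
    return power, hashrate, power / hashrate

for clk in (275, 300, 350, 380):
    p, h, eff = scaled(clk)
    print(f"{clk} MHz: ~{p:.0f} W, ~{h:.0f} GH/s, {eff:.3f} J/GH")
```

In this simple model the J/GH column is identical at every clock; real silicon deviates a bit (voltage headroom, leakage), which is the "measurement tolerance" wiggle mentioned above.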
2 TH at 700 watts wouldn't be practical out of an S5 form-factor unit; you'd need more chips than you could fit on 3 boards. 1 TH at around 350 W, though, I could see happening if Bitmain wanted to do it: 2 strings of 20 chips per board with 3 boards. I'd have to go look up one of my posts in the Gekko BM1384 thread to find the figures I worked out.