An attack I thought of years ago would be to formulate UTXOs systematically, chaining blocks together in such a way that 1) their count would stress the working database (currently implemented in RAM most often) and 2) verifying them would touch as many blocks as possible, making much of the actual blockchain itself required to process many transactions. I'm sure there is a name for such an attack. If not, call it the 'tvbcof attack' I suppose.
Something like this was brought up on reddit. Why not have higher fees for these kinds of "tvbcof transactions"? (Higher fees in proportion to how much they scatter the UTXOs, and perhaps lower or zero fees for transactions that consolidate UTXOs.)
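As a rough sketch of how such a rule could key off UTXO scattering (purely illustrative; the function name, the constants, and the satoshi amounts below are made up for the example, not anything in Core):

    def utxo_delta_fee(num_inputs: int, num_outputs: int,
                       base_fee: int = 10_000,
                       penalty_per_utxo: int = 1_000) -> int:
        """Hypothetical fee rule (amounts in satoshis): scale the fee by the
        net number of UTXOs a transaction adds to the set.  Transactions that
        consolidate (more inputs than outputs) get a discount, floored at zero."""
        net_new_utxos = num_outputs - num_inputs
        return max(0, base_fee + net_new_utxos * penalty_per_utxo)

    # A 1-in/20-out "scatter" tx pays more than a 20-in/1-out consolidation:
    print(utxo_delta_fee(1, 20))   # 29000
    print(utxo_delta_fee(20, 1))   # 0

The point is only that the penalty keys off outputs minus inputs, i.e. off how many entries the transaction leaves behind in the UTXO set, rather than off raw transaction size.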
A spotlight has gone onto the UTXO set because it is bloating, e.g.
FWIW, here are a few observations related to the growth of the utxo set.
The growth rate of the utxo set has increased since late 2013 / early 2014.
More interestingly, a repeated pattern has appeared since early 2014, showing steps every Sunday (around 100k utxos added to the set each Sunday).

My bold emphasis.
Gavin has kicked over the card-table by writing his 20MB patch and going public with it. There is now a phenomenal amount of constructive debate going on in dev, particularly around ideas like leveraging an increased block limit for maintaining a cleaner utxo set and an improved fees market by paying for extra block space.
Interesting that the market has been looking more bullish since Gavin's announcement than it did for a long while before.
The drawn line extrapolates into the future based on an assumption of linear growth from some point midway along the data.
If we drew a linear line from the start of the dataset through the current time, we would hit 20MB at a different, earlier date.
If we drew the line of best fit as a polynomial function (which is currently above the line and returning to it), we would hit 20MB at a different, still earlier date.
If we drew a sigmoid function in which we are approaching a ceiling, it is possible the block size would never hit 20MB at all.
If it is within anyone's capacity, I think it would be worth throwing these data points into some statistical software and determining line(s) of best fit, with correlations and such.
It would turn something that is subjectively interpretable into something objective.
I think that's important in this debate, for many of the reasons Zangelbert mentioned above.
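For what it's worth, a rough cut at that fitting exercise in Python (assuming the 7-day-average block sizes have been exported to a two-column CSV of unix timestamp and size in MB; the file name and column layout are placeholders, not real data):

    import numpy as np
    from scipy.optimize import curve_fit

    # Placeholder input: unix timestamp, 7-day average block size in MB.
    t, size_mb = np.loadtxt("blocksize.csv", delimiter=",", unpack=True)
    t = (t - t.min()) / (365.25 * 24 * 3600)  # years since start of the data

    def linear(t, a, b):
        return a * t + b

    def quadratic(t, a, b, c):
        return a * t**2 + b * t + c

    def sigmoid(t, ceiling, rate, midpoint):
        # Logistic curve: approaches `ceiling` instead of growing without bound.
        return ceiling / (1.0 + np.exp(-rate * (t - midpoint)))

    for name, model, p0 in [("linear", linear, None),
                            ("quadratic", quadratic, None),
                            ("sigmoid", sigmoid, (20.0, 1.0, t.mean()))]:
        params, _ = curve_fit(model, t, size_mb, p0=p0, maxfev=10000)
        residuals = size_mb - model(t, *params)
        r_squared = 1 - np.sum(residuals**2) / np.sum((size_mb - size_mb.mean())**2)
        print(f"{name}: params={np.round(params, 4)}, R^2={r_squared:.4f}")

Each candidate curve (linear, polynomial, sigmoid) then gets an R^2 to compare, and the fitted parameters can be extrapolated forward to see when, if ever, each one crosses 20MB.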

Agreed, but Peter's projection is probably as good as any other fit. Some considerations:
The volatility in the 7-day average block size has collapsed since 2010, which would imply a maturing ecosystem. Arguably, the phase up to mid-2012 was primarily usage by "innovators", and the phase since then by "early adopters", which continues today. Projecting forward from mid-2012 data would seem more realistic for predicting the near future.
The "bump" from April 2012 to March 2013 is basically the effect of SatoshiDice, which doubled the transaction load until its owner (Erik Voorhees) modified it and Core Dev changed dust thresholds.
Average transaction size has doubled since 2012, from about 250 to 500 bytes (hence the oft-quoted max throughput of 7tps is more like 3tps), largely because of scripting and multisig. Without this, the average block data would look more sigmoid. Adam Back suggests that off-chain tx volume is already 100x on-chain (e.g. including exchange volume etc.), and he is probably right. The Lightning Network, Coinbase, and the like could eventually take so much volume off-chain that, indeed, 20MB may not be reached until well after 2021.
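The tps figures in that last point are just block space divided by average transaction size over the 10-minute target interval, e.g.:

    # Back-of-the-envelope throughput at the current 1 MB cap and
    # 10-minute target block interval, for the two average tx sizes above.
    BLOCK_LIMIT_BYTES = 1_000_000
    BLOCK_INTERVAL_SECONDS = 600

    for avg_tx_bytes in (250, 500):
        tps = BLOCK_LIMIT_BYTES / avg_tx_bytes / BLOCK_INTERVAL_SECONDS
        print(f"{avg_tx_bytes}-byte transactions: {tps:.1f} tx/s")

    # 250-byte transactions: 6.7 tx/s   (the oft-quoted ~7 tps)
    # 500-byte transactions: 3.3 tx/s   (closer to ~3 tps)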