The industry warranties for degradation tend to be 90% of rated value at 10 years, and 80% of rated value at 25 years. My panels (Sharp ND-216U1F) have a 25 year limited warranty. Their rating is 216 Watts with an initial tolerance of +10/-5%.
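To put numbers on that, here is a minimal sketch of the worst-case warranted output, assuming (as is common practice, though not stated in the post) that the warranty floor is measured against the rated value minus the -5% initial tolerance:

```python
# Worst-case warranted output for a 216 W panel with a -5% initial
# tolerance. The 90%-at-10-years / 80%-at-25-years figures are the
# typical industry warranty points quoted above.
RATED_W = 216.0
TOLERANCE_LOW = 0.95  # -5% end of the +10/-5% initial tolerance

floor_10yr = RATED_W * TOLERANCE_LOW * 0.90  # 90% of rated at 10 years
floor_25yr = RATED_W * TOLERANCE_LOW * 0.80  # 80% of rated at 25 years
print(f"10-year warranted floor: {floor_10yr:.0f} W")
print(f"25-year warranted floor: {floor_25yr:.0f} W")
```

So the warranty only promises roughly 185 W at year 10 and 164 W at year 25 in the worst case, well below the nameplate 216 W.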
The data sheet also lists temperature coefficients for the panels, which carry their ratings at 25 degrees C. Mine has a power derating of -0.485% per degree C. The data sheet does not state whether this temperature is ambient or the surface of the cells; I'll go out tomorrow and measure ambient vs. cell surface temperature so we can put a number on that.
So at 77 degrees F (25 C), the panel in "full light" makes 216 Watts of power. At other temperatures:
90F makes 208 Watts
100F makes 203 Watts
110F makes 197 Watts
120F makes 191 Watts
448F makes no power (and I suspect the plastic on the back is melting)
The temperature derating is linear, and I suspect it's only good for real planet temperatures, not the extrapolated zero-power point :laughing:.
Note also that as the temperature goes down, the output goes up. So:
50F makes 232 Watts
32F makes 242 Watts
0F makes 260 Watts
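Both tables above fall out of the same straight line. Here's a quick sketch of that arithmetic; the 216 W rating, 25 C reference, and -0.485%/C coefficient are the Sharp ND-216U1F data sheet figures quoted above:

```python
# Linear temperature derating for a 216 W panel rated at 25 C with a
# -0.485%/C power coefficient (Sharp ND-216U1F data sheet figures).
RATED_W = 216.0
REF_C = 25.0
COEFF = -0.00485  # -0.485% per degree C

def f_to_c(temp_f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (temp_f - 32.0) / 1.8

def panel_watts(temp_f):
    """Estimated full-light output at a given cell temperature in F."""
    return RATED_W * (1.0 + COEFF * (f_to_c(temp_f) - REF_C))

for t in (77, 90, 100, 110, 120, 50, 32, 0):
    print(f"{t:4d} F -> {panel_watts(t):5.1f} W")

# The extrapolated zero-power temperature (the line's x-intercept,
# not a physically meaningful operating point):
zero_f = (REF_C + 1.0 / 0.00485) * 1.8 + 32.0
print(f"Zero power at about {zero_f:.0f} F")
```

Running it reproduces the values above (about 208 W at 90F, 191 W at 120F, 242 W at 32F, and the ~448F zero-power extrapolation).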
And again, there is probably a limit on this end too. Temperature also has effects on the copper interconnections, and temperature cycling is probably a prime wear-out mechanism, even though it takes decades. Here in North Carolina, I'll see an output range between roughly 242 and 200 watts for my 216-watt-rated panels. The "110 degrees and they stop performing" claim doesn't make a lot of sense, and I suspect there are other criteria applied to system performance that make that temperature a cutoff point (as orezok mentions).
ShenandoahJoe's post points out the difficulty of making decade-long (or longer) guesstimates on payback periods, a point I danced around in a previous post.
BTW, in the last 24 hours I made 86% of the energy my house needed, and made money to boot! Lower temps mean lower HVAC use; this is a great time of year!
Pete