This falls into that – all too large – class of things I would not want to have to explain myself, but am willing to accept as being properly explained by someone else.

That is, on the basis that I don’t understand the method by which the point is being made, only the conclusion.

But Marty Weitzman told us that the probability of calamitous climate change has a fat tail, the implication being that we should – indeed must – do much more about it than we thought we had to.

As it turns out, the initial claim is wrong:

The basic paradigm that underpins his analysis is that if we try to estimate the parameters of a distribution by taking random draws from it, then our estimate of the distribution is naturally going to take the form of a t-distribution, which is fat-tailed. Importantly, this remains true even when we know the distribution to be Gaussian (thin-tailed) but don’t know its width and can only estimate it from the data. The presentation of this paradigm is hidden behind several pages of verbiage and economics which you have to read through first, but it’s clear enough from page 7 onwards (starting with “The point of departure here”).
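This paradigm is easy to demonstrate numerically. A minimal sketch (with arbitrary sample sizes and thresholds chosen purely for illustration): draw small samples from a known Gaussian, standardize each sample mean by the *estimated* standard deviation, and count how often the result lands beyond 3 "sigma" compared with the true-Gaussian rate of about 0.27%.

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 5, 200_000  # small samples, many repetitions (illustrative choices)

# Draw from a known Gaussian (mean 0, width 1), but pretend the width is
# unknown and must be estimated from each sample of n draws.
x = rng.normal(0.0, 1.0, size=(trials, n))
t_stat = x.mean(axis=1) / (x.std(axis=1, ddof=1) / np.sqrt(n))

# How often does the standardized estimate land beyond 3 "sigma"?
tail_t = (np.abs(t_stat) > 3).mean()
tail_gauss = 0.0027  # known two-sided tail probability P(|z| > 3) for a Gaussian

print(f"observed tail: {tail_t:.4f}  vs Gaussian tail: {tail_gauss:.4f}")
```

With n = 5 the standardized estimate follows a t-distribution with 4 degrees of freedom, and its 3-sigma tail comes out roughly an order of magnitude fatter than the Gaussian one, even though the underlying distribution is thin-tailed by construction.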

The simple point I have to make is that this paradigm is not relevant to how we generate estimates of the equilibrium climate sensitivity. We are not trying to estimate parameters of “the distribution of climate sensitivity”; in fact, even to talk of such a thing would be to commit a category error. Climate sensitivity is an unknown parameter; it does not have a distribution.

Furthermore, we do not generate an uncertainty estimate by comparing a handful of different observationally-based point estimates and building a distribution around them. (Amusingly, if we were to do this, we would actually end up with a much lower uncertainty than usually stated at the 1-sigma level, though in this case it could indeed end up being fat-tailed in the Weitzman sense.) Instead, we have independent uncertainty estimates attached to each observational analysis, based on how the observations are made and processed in each specific case.

There is no fundamental reason why these uncertainty estimates should necessarily be either fat- or thin-tailed; they just are what they are, and in many cases the uncertainties we attach to them are a matter of judgment rather than detailed mathematical analysis. It is easy to create artificial toy scenarios (where we can control all structural errors and other “black swans”) in which the correct posterior pdf arising from the analysis can be of either form.
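The parenthetical point above can be illustrated with a toy calculation. All the numbers here are made up for illustration and are not real climate-sensitivity estimates: a handful of hypothetical point estimates of one unknown parameter, each with its own independently stated 1-sigma uncertainty.

```python
import numpy as np

# Hypothetical central values from a handful of studies (made-up numbers):
estimates = np.array([2.9, 3.1, 3.0, 2.8, 3.2])
# Hypothetical per-study 1-sigma uncertainties, stated independently:
stated_1sigma = np.array([0.7, 0.8, 0.6, 0.9, 0.7])

# "Build a distribution around the point estimates": the sample spread.
spread = estimates.std(ddof=1)

# Compare with the typical independently stated uncertainty.
print(f"spread of point estimates: {spread:.2f}")
print(f"mean stated 1-sigma:       {stated_1sigma.mean():.2f}")
```

In this toy setup the spread of the point estimates alone is several times smaller than the uncertainties the individual analyses actually attach to themselves, which is the sense in which the distribution-around-point-estimates approach would understate the uncertainty at the 1-sigma level.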

From Sabine Hossenfelder (https://backreaction.blogspot.com/2019/10/the-crisis-in-physics-is-not-only-about.html), who, interestingly enough, is a member of the climate doom cult in spite of all the wrong predictions of the mathematical models.