Nassim discussed definitions of success and his own life journey in this commencement speech at the American University in Beirut. The full text of the speech, published on Nassim’s home page, is available below.
Nassim spoke to Bloomberg News while he was at the SALT Conference in Las Vegas from May 10th to the 13th. Bloomberg has put up four short videos with him on their website, which we share below.
On his Facebook page, Nassim recently posted links to a new short technical paper on the probability distribution of p-values and a video commentary. He wrote:
I was able to pull out the exact meta-distribution of p-values (i.e. p-values as random variables).
The point is that the same phenomenon will produce p-values all over the map. A true p-value of .12 will produce p-values <.05 more than half the time, so people may never replicate and get the same result.
One Hundred Years of P-Value Bullshit!
Here is the text of the paper, which was originally posted on his website, Fooled By Randomness.
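The instability Nassim describes is easy to reproduce with a small Monte Carlo sketch (our illustration, not the paper's derivation; the sample size and effect size below are arbitrary choices that make the median p-value land near .12):

```python
import random, math

def one_sided_p(xbar, n, sigma=1.0):
    """One-sided p-value for H0: mu = 0 from a z-test."""
    z = math.sqrt(n) * xbar / sigma
    # Survival function of the standard normal via erfc
    return 0.5 * math.erfc(z / math.sqrt(2))

random.seed(42)
n, mu, reps = 30, 0.2146, 10_000   # mu chosen so the *median* p is near .12
pvals = []
for _ in range(reps):
    sample = [random.gauss(mu, 1.0) for _ in range(n)]
    pvals.append(one_sided_p(sum(sample) / n, n))

pvals.sort()
frac_sig = sum(p < 0.05 for p in pvals) / reps
print("median p:", round(pvals[reps // 2], 3))
print("5th/95th percentile p:", round(pvals[reps // 20], 4), round(pvals[-reps // 20], 3))
print("fraction below .05:", round(frac_sig, 2))
```

Even though the data-generating process never changes, the replication p-values range over several orders of magnitude, and a sizable fraction fall below .05.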
Nassim kicks off The Bank of England’s One Bank Flagship Seminar, the first such seminar offered by the bank in an effort at greater transparency:
The first part of this talk – The Law of Large Numbers in the Real World – presents fat tails, defines them, and shows how the conventional statistics fail to operate in the real world, particularly with econometric variables, for two main reasons: 1) we need a lot, a lot more data for fat tails; and 2) we are going about estimators the wrong way. The second part – Detecting Fragility – presents heuristics to detect fragility in portfolios. Fragility is shown to be ‘anything that is harmed by volatility’. The good news is that while (tail) risk is not measurable, fragility is.
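The "we need a lot, a lot more data" claim can be sketched numerically: under a fat tail the sample mean converges far more slowly than under a Gaussian. A minimal illustration (the tail index alpha = 1.2, sample size, and trial count are our arbitrary choices):

```python
import random, statistics

def pareto(alpha, rng, xmin=1.0):
    """Draw from a Pareto distribution via inverse transform sampling."""
    return xmin / rng.random() ** (1.0 / alpha)

def mean_error(draw, true_mean, n, seed):
    """Absolute error of the sample mean after n draws."""
    rng = random.Random(seed)
    return abs(sum(draw(rng) for _ in range(n)) / n - true_mean)

alpha, n, trials = 1.2, 50_000, 20
true_mean = alpha / (alpha - 1)          # = 6 for xmin = 1

# Gaussian with the same mean vs. a fat-tailed Pareto (infinite variance)
gauss_errs  = [mean_error(lambda r: r.gauss(6, 1), 6, n, s) for s in range(trials)]
pareto_errs = [mean_error(lambda r: pareto(alpha, r), true_mean, n, s) for s in range(trials)]

print("median |error|, Gaussian:", round(statistics.median(gauss_errs), 4))
print("median |error|, Pareto  :", round(statistics.median(pareto_errs), 4))
```

After 50,000 draws the Gaussian sample mean has essentially converged while the Pareto sample mean is still typically far from the true value, which is the sense in which "large numbers operate slowly" under fat tails.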
Nassim joins Jean-Philippe Bouchaud, founder and chairman of Capital Fund Management, in a debate at the 2015 Chaire PARI Conference.
Some highlights of the Fares Center for Eastern Mediterranean Studies’ discussion on “Peace for Syria through a Federal Governance Strategy” with Dr. Yaneer Bar-Yam, moderated by Nassim Taleb. Dr. Bar-Yam, founder and president of the New England Complex Systems Institute, discussed complexity theory and violence, with a special focus on the theory’s implications in the Syrian context.
Here is the full version of the talk.
Discussing the concept of antifragility, Nassim remotely addresses the Theory of Constraints International Conference held in Cape Town, South Africa, from September 6th to 9th, 2015. The theme of the conference was how to use the Theory of Constraints to transform organizations and people from fragile (harmed by volatility) to robust (not harmed by volatility) to antifragile (benefiting from volatility).
In this short video, Nassim speaks at the Lebanese Diaspora Energy event, a conference designed to strengthen the bonds between Lebanese residents and emigrants worldwide.
Nassim was interviewed by Bruce Oreck at the Tomorrow Conference in Finland in June.
This video is of Nassim’s keynote address at this year’s Fletcher Conference on Managing Political Risk.
The conference’s website provides the text of some of the address, as follows:
Keynote address by Nassim Nicholas Taleb
Nadim Shehadi, moderator
Let’s start with the notion of fat tails. A fat tail is a situation in which a small number of observations creates the largest effect: even when you have a lot of data, the outcome is explained by the smallest number of observations. In finance, almost everything is fat tails. A small number of companies represent most of the sales; in pharmaceuticals, a small number of drugs represent almost all the sales. The law of large numbers operates slowly: the outlier determines outcomes. In wealth, if you sample the top 1% of wealthy people you get half the wealth. In violence, a few conflicts (e.g. World Wars I and II) represent most of the deaths in combat: that is a super fat tail.
So why is the world becoming more and more characterized by fat tails? Because of globalization. More “winner takes all” effects. You have fewer crises, but when they happen they are more consequential. And the mean is not visible by conventional methods.
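The "top 1% holds half the wealth" remark corresponds to a Pareto tail. Under a pure Pareto distribution with tail index alpha, the top fraction p of the population holds a share p^(1 − 1/alpha) of the total; alpha = 1.16 below is our illustrative choice, picked because it reproduces both that figure and the familiar 80/20 rule:

```python
def top_share(p, alpha):
    """Share of the total held by the top fraction p under a Pareto tail
    with index alpha: share = p ** (1 - 1/alpha)."""
    return p ** (1.0 - 1.0 / alpha)

alpha = 1.16  # illustrative tail index, not a fitted value
print("top 1% share :", round(top_share(0.01, alpha), 2))
print("top 20% share:", round(top_share(0.20, alpha), 2))
```

The closer alpha gets to 1, the more extreme the winner-takes-all concentration becomes, which is the globalization effect described above.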
Now, moral hazard. Banks like to make money. Under fat tails, large numbers operate slowly. Let’s say you get a bonus for each year you make money. Then in 1982, banks lost more money than they did in their history. Then in 2007-2008, $4.7 trillion were lost. Then bankers wrote letters about how the odds were so low that the event was as much of a surprise to them as it was to you. Any situation in which you see the upside without the downside, you are inviting risks. People will tell you something is very safe, when in fact it is dangerous. Visible profits, and invisible losses. People are getting bonuses on things that are extremely high risk. And then the system collapses.
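The "visible profits, invisible losses" asymmetry reduces to one-line arithmetic: the firm bears the tail loss while the bonus is paid on the upside only. A sketch with made-up illustrative numbers (the probabilities and payoffs are not from the talk):

```python
# Illustrative numbers: a strategy that earns a small premium most years
# and takes a large hidden loss in rare years.
p_blow, premium, loss, bonus_rate = 0.05, 1.0, 50.0, 0.5

ev_firm  = (1 - p_blow) * premium - p_blow * loss  # firm's expected P&L per year
ev_bonus = bonus_rate * (1 - p_blow) * premium     # bonus paid on upside only, no clawback

print(f"firm expected P&L per year: {ev_firm:+.3f}")
print(f"trader expected bonus/year: {ev_bonus:+.3f}")
```

The firm's expectation is negative while the trader's is positive: exactly the upside-without-downside configuration that invites hidden risk.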
If you have skin in the game at all times, this does not happen. Modernity: a situation in which people get benefits from the action, but the adverse effects do not touch them. You hide risks to improve your year end job assessment. Bear Stearns never lost money – until they lost money.
Hedge fund managers are forced to eat their own cooking. When the fund loses money, the hedge fund manager loses his own money: he has skin in the game. You have fools of randomness, and crooks of randomness. Driving on a highway, you could go against traffic and kill 30 people – why does that not happen more often? Because types of people who would do this kill themselves along with others, so they filter themselves out of the system. Entrepreneurs, who make mistakes, are effectively dead if there is a filtering system. Suicide bombers kill themselves – so we can’t talk about them as a real threat to the system. So there is a filtering mechanism. People don’t survive high risk. If they have skin in the game, traders don’t like high risk.
Let’s now talk about fragility. The past does not predict the future. The black swan idea is not to predict – it is to describe this phenomenon, and how to build systems that can resist black swan events. We define fragility as something that does not like disorder. What is disorder? Imagine driving a car into a wall 50 times at 1 mph, and then once at 50 mph: which would hurt you more? So there is an acceleration of fragility. The goal is to be anti-fragile.
There are two types of portfolios: 1) if there is bad news you lose money, 2) if there is bad news you win money. One doesn’t like disorder, one likes disorder. One is fragile, one is anti-fragile. Size (such as size of debt, size of a corporation) makes you more fragile to disorder.
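The car-crash example is a statement about convexity: harm accelerates with the size of the shock, so one large deviation hurts far more than many small ones of the same total magnitude. A minimal sketch, assuming a quadratic damage function (the quadratic is our assumption; any convex function makes the same point):

```python
def harm(speed_mph, k=1.0):
    """Stylized convex damage function: harm grows with the square of speed."""
    return k * speed_mph ** 2

fifty_small = 50 * harm(1)   # fifty collisions at 1 mph
one_big     = 1 * harm(50)   # one collision at 50 mph
print(fifty_small, one_big)  # 50.0 vs 2500.0
```

Same total "mileage into the wall", fifty times the damage: that is what it means for a system to dislike disorder.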
Questions & Answers:
- Do the people of ISIS returning home pose a risk?
- This is not a risk. Debt is a risk. ISIS makes the newspapers and people talk about it, but the real risks are not ISIS – the real risk is Ebola, because it can spread. And the next Ebola will be worse. So when people ask me to talk about risk, an epidemic is the biggest risk.
- Can you discuss some examples in the world that are fragile, examples of the fat tail?
- The Soviet Union did not collapse because of the regime but because of the size. Similarly, a lot of people don’t fully understand the history of Italy before unification. There was constant, low-grade turmoil. After unification, there were infrequent but deep problems. The risks facing us today are the real things that can harm us and spread uncontrollably.
- Should we still think about risks on a country level? How do we think about transnational risks?
- Cybersecurity – banks spend 5% of their money on it. Netflix engineers failures every day. They pay an army of agents to try to destroy their system, to discover its vulnerabilities. Things that experience constant stress are more stable. In cybersecurity, there are a lot of risks, but we’re doing so much to protect against them that we don’t need to worry much. But eventually the cost of controlling these risks might explode.
- What is your blind spot?
- If I knew my blind spots, they wouldn’t be blind spots. I’m developing something that is improving stress testing. The good thing about fragility theory is you can touch a lot of things. I want to make narrow improvements, little by little, not try to save the world.
- Is statistics useless or are there some redeeming qualities?
- Any science becomes applied mathematics and if it’s not applied mathematics yet, it is not a science. Stats is used mechanistically. Statisticians need to make risk an application of probability theory. A lot of the people doing this come from the insurance industry.
- How does bad data affect your work?
- When you have a lot of variables, but not much data per variable, you are more likely to have spurious correlations. And when you have a lot of data, you are likely to find a stock figure that correlates with your blood pressure – that’s spurious. More data is not always good.
- Another problem is that if I want to write a paper, I test, test, test something until it fits my expectations – and I won’t reveal to you how many times I have tried. If there is someone doing this for a living, for money, then I don’t trust them.
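The spurious-correlation point above is easy to reproduce: with many variables but few observations per variable, pure noise produces large sample correlations. A sketch (the dimensions are arbitrary; the "blood pressure" label is just the example from the answer):

```python
import random, statistics

def corr(x, y):
    """Pearson correlation of two equal-length sequences."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

random.seed(0)
n_obs, n_vars = 20, 500  # many variables, few observations each
target = [random.gauss(0, 1) for _ in range(n_obs)]       # e.g. "blood pressure"
noise_vars = [[random.gauss(0, 1) for _ in range(n_obs)]  # unrelated series
              for _ in range(n_vars)]

best = max(abs(corr(target, v)) for v in noise_vars)
print(f"largest |correlation| among {n_vars} pure-noise variables: {best:.2f}")
```

Every one of these correlations is spurious by construction, yet the best of them looks impressive – which is why searching over many variables (or many test specifications) without disclosure cannot be trusted.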
- This is a great system you’re developing but can it be misused?
- The problem is in the math and in the ethics.
- If we stop using statistics, how can we make decisions? Don’t we have to make assumptions?
- Have skin in the game. Only use statistics for decisions if the stats are reliable. Joseph Stiglitz is blocking evolution – he made a prediction about Fannie Mae not collapsing, and it collapsed – and yet he’s still lecturing us on what to do next.