# Tag Archives: probability

## Percentages for sceptics: part III

I wanted to do some self-criticism of my previous two posts in this series:

1. You can calculate the minimum number of responses from a single percentage by hand (no need for computer programs or look-up tables).
2. I’ve made a very rough model to estimate how many people the program typically returns when fed six percentages (as I did several times here).

In between, I’ve collected some links demonstrating how great continued fractions can be.

### Handy calculations

There are many ways of writing real numbers (fractions and irrationals) apart from in decimal notation. You can represent them in binary, for instance $\pi = 11.001001000011\ldots$, or in other bases. These have their uses: the Bailey–Borwein–Plouffe formula calculates the $n$th binary digit of $\pi$ without calculating all the preceding digits.
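As a quick illustration (my own sketch, not from the original post), the binary digits quoted above can be recovered by repeatedly doubling the fractional part: at each step, the integer part of $2x$ is the next binary digit.

```python
import math

# Binary expansion of pi's fractional part by repeated doubling.
# The integer part, 3, is 11 in binary.
x = math.pi - 3
digits = []
for _ in range(12):
    x *= 2
    d = int(x)       # next binary digit: integer part of 2x
    digits.append(d)
    x -= d           # keep only the fractional part

print("11." + "".join(map(str, digits)))  # 11.001001000011
```

Double precision only carries about 52 binary digits of $\pi$, which is exactly why a digit-extraction formula like Bailey–Borwein–Plouffe is remarkable.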

For our purposes we will use continued fractions. People write $\pi$ as $[3; 7, 15, 1, 292, 1, 1, 1, 2, 1, \ldots]$: this notation means that 3 is a good first approximation to $\pi$; the well-known $3+\frac{1}{7}=\frac{22}{7}$ is the closest you can get with any fraction $\frac{p}{q}$ with $q \leq 7$. Then $3+\frac{1}{ 7 + \frac{1}{15}}=\frac{333}{106}$ is the best with $q\leq 106$, and the fourth convergent

$3+\frac{1}{7+\frac{1}{15+\frac{1}{1}}}=\frac{355}{113}$

is a very good approximation, as the next number in the square brackets, 292, is very large (I’ll motivate this observation at the end of the section). The golden ratio is sometimes called the ‘most irrational’ number because its continued fraction expansion is all ones, the smallest possible entries, so the sequence of convergents approaches it as slowly as possible.
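The convergents quoted above can be checked with a short script (my own sketch, not from the original post) using the standard recurrence $h_n = a_n h_{n-1} + h_{n-2}$, $k_n = a_n k_{n-1} + k_{n-2}$ for the numerators and denominators:

```python
from fractions import Fraction

def convergents(cf):
    """Yield successive rational convergents of a continued fraction.

    cf = [a0, a1, a2, ...] denotes a0 + 1/(a1 + 1/(a2 + ...)).
    """
    h_prev, h = 1, cf[0]   # numerators h_{n-1}, h_n
    k_prev, k = 0, 1       # denominators k_{n-1}, k_n
    yield Fraction(h, k)
    for a in cf[1:]:
        h_prev, h = h, a * h + h_prev
        k_prev, k = k, a * k + k_prev
        yield Fraction(h, k)

for frac in convergents([3, 7, 15, 1, 292]):
    print(frac, "=", float(frac))
```

Running it prints 3, 22/7, 333/106, 355/113 and then 103993/33102 — and the jump in accuracy at 355/113 reflects that large coefficient 292.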

Filed under Accessible, Applications

## All for one and none for all! Diversification, regulation and the tragedy of the commons.

Some proverbs come in contradictory pairs, for instance “Too many cooks spoil the broth” and “Many hands make light work”. I’d like to present an example that I feel illustrates two of these simultaneously. Everyone knows “Don’t put all your eggs in one basket”, including banks, which diversify by holding many different investments of different types. But while it may be in banks’ best interests to lower their levels of risk through diversification, it may plausibly raise the risk of a system-wide failure: if all banks follow that same advice, the rest of us may be “In for a penny, in for a pound”.

This is the argument given by Beale et al. in their paper Individual versus systemic risk and the Regulator’s Dilemma. Here’s a simple example that illustrates the main idea, adapted from a related comment in the earlier paper Systemic risk: the dynamics of model banking systems.

Some people are playing a dice game. Each of them rolls a single fair die once and receives £1 for rolling a 1, £2 for a 2, and so on, up to £6 for a 6. However, afterwards they each have to pay £1.50 for the privilege of playing this rewarding game. Each of them starts out with no money, and so if they roll a 1, they go bankrupt.

If you play this game, there’s a $\frac{1}{6}$ chance that you’d go bankrupt on your roll. Let’s say ten people play the game: the chance of all of them going bankrupt on their single throw is $(1/6)^{10}$, which is about one in sixty million.
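That arithmetic is easy to verify exactly (a quick check of my own, not from the post), since ten independent players all going bankrupt requires ten simultaneous 1s:

```python
from fractions import Fraction

p_bankrupt = Fraction(1, 6)   # a single player goes bust only on a 1:
                              # £1 winnings minus the £1.50 fee is negative
n_players = 10

# All ten players bankrupt at once: the rolls are independent.
p_all = p_bankrupt ** n_players
print(p_all)            # 1/60466176
print(float(1 / p_all)) # about sixty million
```

So the system-wide wipeout really is roughly a one-in-sixty-million event — when the players' fortunes are independent. The paper's point is what happens when they are not.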

Filed under Accessible, Applications, Economics

## Are you 100% sure? Betting for the same team.

Betting is often a good way of settling arguments: “I bet you so much that my sports team will beat yours” can quickly lead to a resolution. But some of the most heated arguments are between people who completely agree with each other. So what happens when you want to gamble with someone else, but both of you want to bet on the same outcome?

You could bet on a particular score, or on the timing of an event. But what if the options are limited: is the correct direction to turn left or right? Or if you hold identical views: that the game will end in a nil-nil draw, or that this party will win the election?

Here, the bet can go ahead as long as one of you is more sure than the other that they’re correct.
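A hypothetical sketch of why such a bet can work (the names and numbers are my own, not from the post): suppose Alice and Bob both think an event $E$ will happen, but Bob is more confident — Alice assigns it probability 0.7, Bob 0.95. Then Bob can stake on $E$ and Alice against it, at stakes pitched between their two odds, so that each expects a profit by their own beliefs.

```python
p_alice, p_bob = 0.70, 0.95  # both above 1/2: they agree E is likely

# Alice stakes a against E; Bob stakes b on E. Each has positive
# expected value whenever b/a lies strictly between their odds,
# p_alice/(1-p_alice) = 7/3 and p_bob/(1-p_bob) = 19.
a, b = 1.0, 4.0

# Bob's expected profit by his beliefs: wins a if E, loses b if not.
ev_bob = p_bob * a - (1 - p_bob) * b
# Alice's expected profit by hers: wins b if not E, loses a if E.
ev_alice = (1 - p_alice) * b - p_alice * a

print(ev_bob, ev_alice)  # both positive
```

The less confident party is effectively taking the "against" side even though she, too, expects $E$ to happen: she is being paid enough for the risk by her own lights.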