This is part 2 of an examination of the various ways we can predict the future, their strengths and their weaknesses. Part 1, which focuses on data and when it becomes misleading, is available here. Part 3, which focuses on my critique of Hegelian dialectics, is available here, and the conclusion in part 4 is available here.
Following up on the Turkey problem from the previous part, it is interesting to think about the various ways we can try to predict the future, and which of them would successfully account for the turkey problem.
Another way to do so would be to look at incentive structures. The turkey could think about the human being who dedicates time and resources to feeding it, and wonder what that human has to gain from all of this. It could reasonably figure out that the human is acting for its own benefit, since the human is unlikely to feel particularly attached to the turkey, and thus realize that it's going to be killed eventually.
The interesting thing about analyses based on incentive structures is that they can often sound like conspiracy theories. Imagine if the turkey in our setup lived with another one, and started telling it how human beings only keep turkeys on farms in order to eat them. The second turkey, if it was indoctrinated in Bayesian reasoning, would reassure the first that every day brings more evidence that they are safe, and that it should ditch its conspiracy theories, which are just based on fear. And yet the first turkey was right, as crazy as it might have sounded to the second one.
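The second turkey's rising confidence can be made concrete with Laplace's rule of succession: after n consecutive safe days, a naive Bayesian estimate of tomorrow's safety is (n+1)/(n+2), which climbs toward certainty right up until the day of slaughter. A minimal sketch (the specific day counts are illustrative assumptions of mine, not from the text):

```python
def confidence_after(n_safe_days: int) -> float:
    """Laplace's rule of succession: after n safe days out of n observed,
    the posterior probability that tomorrow is also safe is (n+1)/(n+2)."""
    return (n_safe_days + 1) / (n_safe_days + 2)

# The estimate only ever rises, even if the slaughter date is fixed in
# advance: the model has no way to represent an event it has never observed.
for day in [1, 10, 100, 999]:
    print(day, round(confidence_after(day), 3))
```

The point is not that Bayesian updating is wrong, but that it can only extrapolate from observed evidence, so a one-off structural break is invisible to it until it happens.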
As I have said before, conspiracy theories are very limited, but this doesn't mean that they are false. The most powerful people have far more influence than any regular person, and they aren't so numerous that they couldn't coordinate with one another. The problem, of course, is figuring out which powerful people will have enough influence to steer our world, how they will do it, and for what reason.
Analyzing incentive structures is quite powerful at a much smaller scale, however. Within a company, it is possible to figure out what the people around you want and how they are going to try to get it. If you know that someone isn't particularly engaged in their work, it means either that they are being tactically lazy, doing the bare minimum to get paid, or that they are about to leave the job. If, on the other hand, someone is ambitious and takes up space, it means that they are angling for a raise or even a promotion, possibly at the expense of others. Or their enthusiasm might not pay off at all, coming to nothing and leading them to burn out or leave the job.
The analysis of incentive structures at small scale has a name: game theory. Game theory is all about analyzing how different people pursue incentives, especially in relation to one another. The games we are involved in are often adversarial, which is to say that we are competing with one another over finite resources, but they can also be cooperative. In practice, though, cooperative environments in society involve an individual dimension as well, since we rarely find ourselves in situations where individuals cannot get ahead at others' expense.
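The tension between the cooperative and individual dimensions is exactly what the prisoner's dilemma, the textbook game-theory example, captures. A toy sketch of my own (the payoff numbers are the standard illustrative ones, not from the text):

```python
# Classic prisoner's dilemma payoffs: (row player, column player).
# Strategies: "C" = cooperate, "D" = defect.
PAYOFFS = {
    ("C", "C"): (3, 3),
    ("C", "D"): (0, 5),
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),
}

def best_response(opponent_move: str) -> str:
    # Pick the move that maximizes our own payoff against a fixed opponent move.
    return max("CD", key=lambda my: PAYOFFS[(my, opponent_move)][0])

# Defecting is the best response to either move, so two purely
# incentive-driven players both defect...
assert best_response("C") == "D" and best_response("D") == "D"
# ...and end up with (1, 1) instead of the mutually better (3, 3).
```

Each player following their individual incentive leaves both worse off, which is why analyzing incentives alone, without the cooperative context, can mislead.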
Game theory is useful but, like everything, limited, because we don't know other people perfectly. In fact, we don't even know ourselves all that well. It is very often the case that a strategy that might be "optimal" isn't followed at all by other people, either out of ignorance or simply because they value other things.
Another way to make predictions about the future is by focusing on limiting factors. In this philosophy, we are not trying to make conclusive predictions about what will happen; instead, we try to establish statements about what cannot happen. This is what underlies the peak oil worldview, to which I subscribe. I might have no way of predicting how the coming decades will go, but the peak oil lens tells me that:
Predictions based on limiting factors are highly unpopular because they focus on establishing limits and what we collectively cannot achieve, which doesn't make for very good PR. People want hope in the projects of civilization, not someone telling them how it's impossible to maintain perpetual exponential growth on a finite planet.
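The arithmetic behind that last claim is worth spelling out: anything growing at a fixed percentage doubles on a fixed schedule, so even modest growth rates compound into absurd quantities over historical timescales. A quick illustration of my own (the 3% figure is an assumption for the example, not from the text):

```python
import math

def doubling_time(annual_growth_rate: float) -> float:
    # Years for a quantity growing at a fixed rate to double: ln(2) / ln(1 + r).
    return math.log(2) / math.log(1 + annual_growth_rate)

# At 3% annual growth, consumption doubles roughly every 23 years,
# i.e. ten doublings (a ~1000x increase) in under two and a half centuries.
print(round(doubling_time(0.03), 1))
print(round(10 * doubling_time(0.03), 1))
```

Nothing physical on a finite planet can sustain a 1000x increase per few centuries indefinitely, which is the whole force of the limiting-factor argument.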
Because of this inherent unpopularity, these predictions usually take an apocalyptic form, the kind that tells you our world is doomed because of AI taking over, nuclear warfare, climate change, the spread of a super-virus, a solar flare, or some prophetic vision from our ancestors. Exaggeration is always tempting because it grabs people's attention better than the highly complex and slow developments our world inevitably goes through.
But analyzing limiting factors is important because of an asymmetry: it is sometimes possible to establish from first principles that something cannot happen, for instance that it's impossible to build a perpetual motion machine, whereas to show that something is possible you need to actually go out there and do it, which can take a lot of time and energy.
Focusing on what can go wrong is often labelled as defeatism in our techno-optimist times, but it doesn't have to be, because the extra time and energy you save by not spending them on foolish endeavors allows you to build things which bring you value in the near term and which have a better chance of lasting.
I subscribe to John Michael Greer's view of the Long Descent: the view that peak oil makes it impossible for us to progress technologically forever, and that we are currently in a long descent, spanning many decades, of technological and social decline. Because of this, I have personally focused more on feeling well in my body, out of the basic realization that such an investment is guaranteed to stay with me no matter what happens to our world, which cannot be said of investments in the financial markets, or of anything that relies on complex technology.
It's only defeatism if you focus on what you cannot do, as opposed to focusing on what you can do and what matters. Ultimately, I think human beings are meant to focus on their local surroundings, their family and friends, their work and their health, not on trying to "save the world" or on being citizens of a global civilization, but that's my view.
1 See Our World in Data for instance, which is the closest thing we have to an official source. Europe likes to pretend that it's getting "greener", but really it's only exporting its pollution to other countries, from which it then buys its products, which doesn't solve the problem at all. It's similar to a child in kindergarten playing with a ball and making a mess, then handing it to another child just before the teacher comes to get him in trouble, so that the other child gets blamed instead.
2 The theory of abiotic oil is typical of what we should expect from people desperate enough to cling to the myth of progress that they embrace all sorts of crackpot theories. Even if oil could technically form from non-biological processes deep within the Earth, it wouldn't matter in the least if it wasn't easily accessible and if the rate of replenishment wasn't high enough. The fact that oil companies turn to non-conventional sources, such as oil sands and shale oil, which require far more processing and thus have a lower return on investment, tells me that even if abiotic oil were technically real, it would be economically irrelevant.
2025-12-22