35,000. That's roughly how many choices we make every day. About 200 of them concern food alone, according to a study from Cornell University, while thousands of micro-decisions happen automatically or unconsciously. That figure alone undercuts the common assumption that our choices are rational. Tens of thousands of our decisions never weigh facts or consider risks, and if you're choosing unconsciously, there is a high probability your decision rests less on logic than on familiarity.
In his seminal work Thinking, Fast and Slow, Daniel Kahneman reveals that much of our thinking is far less deliberate than we believe. The book not only dives deep into the psychology of how our minds work; it also exposes the hidden machinery behind every judgment we make and the errors that come with it.
Mindless autopilot

The focal point of Thinking, Fast and Slow is the concept of two systems that govern how we think: System 1 (fast, intuitive, and automatic) and System 2 (slow, analytical, and deliberate). Kahneman describes System 1 as the part of the mind that leaps to conclusions, forms first impressions, and handles routine tasks with ease. Should I get up and brush my teeth? Should I make the bed? Or should I just stay in bed?
System 2 is the exact opposite. This is the part of our mind that solves complex problems, analyzes data, and handles self-reflection. But because System 2 demands focus, energy, and attention, the brain defaults to the quicker judgments System 1 offers, even when the situation calls for more careful thought.
The mill that churns cognitive biases

Ever noticed how we gravitate toward narratives that fit our assumptions and beliefs? Say you think a particular politician is corrupt. When you scroll through your news feed, you automatically notice stories and articles that support that belief. Kahneman's dual-system model explains this confirmation bias, along with a wider range of cognitive biases and the errors they introduce into our choices, whether in our views on politics and morality or in our financial decisions.
We are easily swayed by arbitrary numbers, or by ideas presented to us beforehand. We then lean on whatever information comes to mind most easily, even when it is not representative of the broader reality. Call it human nature: we tend to swap difficult questions for easier ones, automatically and unconsciously.
Suppose you've recently seen several news reports about airplane crashes. Even though air travel is statistically one of the safest modes of transportation, those vivid, in-your-face, recent stories dominate your memory. So when asked whether flying or driving is more dangerous, you might instinctively say flying, despite the data showing car accidents to be far more frequent and deadly. These errors and biases affect real-world outcomes. They're not just quirks of the mind; they have a lasting impact on how clearly we see things.
Wisdom nuggets and mind tricks

Perhaps the most powerful lesson in the book is the idea of loss aversion. Thinking, Fast and Slow shows that we fear losses more than we value equivalent gains; negative outcomes hit us harder than positive ones. Lost a thousand pesos? It feels worse, right? That sting weighs more than the joy of gaining a thousand pesos. This emotional imbalance shapes how we make decisions under uncertainty, how we invest, and how we negotiate. Kahneman teaches us how important it is to manage our emotional responses if we want to make more balanced decisions.
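The asymmetry above can be made concrete with a small sketch of the prospect-theory value function from Kahneman and Tversky's research. The parameter values (alpha = beta = 0.88, lambda = 2.25) are the often-cited estimates from their 1992 paper; treat the whole thing as an illustration, not a precise model of anyone's feelings.

```python
def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Subjective value of a gain (x > 0) or loss (x < 0).

    Gains are valued as x**alpha; losses as -lam * (-x)**beta,
    where lam > 1 captures the extra sting of losing.
    """
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** beta

gain = prospect_value(1000)    # joy of gaining 1,000 pesos
loss = prospect_value(-1000)   # pain of losing 1,000 pesos

# With these parameters the loss hurts 2.25 times as much
# as the equivalent gain pleases.
print(f"gain feels like {gain:.1f}")
print(f"loss feels like {loss:.1f}")
print(f"pain-to-pleasure ratio: {abs(loss) / gain:.2f}")
```

Losing and gaining the same amount are objectively symmetric, but the subjective values are not, which is exactly why a coin flip for equal stakes feels like a bad bet.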
He also challenges the myth of expert intuition. Normally, we trust people with years of experience to make quick, sound decisions. Kahneman argues that this works only in predictable environments that provide immediate feedback. Why do we expect experienced ER doctors to accurately diagnose a heart attack or stroke? Because they have developed reliable intuition over time, thanks to high repetition, clear patterns, and a fast feedback loop. Cases are relatively predictable and repetitive, and whether the patient improves or quickly worsens, the doctor finds out right away.
But in messier, more complex domains? For Kahneman, even seasoned professionals fall into the same mental traps as everyone else. Intuitive expertise is real, but it develops only where feedback is clear and rapid. In one-shot decisions, experienced people may be confident, but confidence doesn't necessarily mean accuracy.
Strangers to our own minds

Reading Thinking, Fast and Slow is both humbling and empowering. It's a concise reminder that we are not as rational as we think we are; our thinking is often shaped by patterns we rarely recognize. But the book also equips us to become better thinkers. If we engage System 2 by slowing down and questioning our snap judgments instead of relying on familiarity and comfort, we can navigate the world with greater clarity and wisdom.
It is a call to reclaim thoughtfulness in a world where stimuli fire at us from every corner with constant noise and urgency. Kahneman's work teaches us that being aware of how we think is the first step toward wiser decisions, toward living with more insight and intention. As Henry David Thoreau urged in Walden, live deliberately.
