SCIENCE

SENSIBILITY

SERENITY

1

Others may know more than you do about yourself

The ancient Greek aphorism ‘Know Thyself’ (rendered in Latin as Nosce Te Ipsum) still makes sense. Attributed to Thales of Miletus, one of the seven sages of ancient Greece, and known to Socrates and Plato, it was inscribed above the entrance of the Temple of Apollo at Delphi. It implies that to be wise, we must have an objective understanding of ourselves: our thoughts, feelings, desires, emotions, strengths, weaknesses, actions and motivations.

The way we identify with others and distinguish between self and others plays a significant role in our social development.

We now know that our beliefs about ourselves are the product of the brain. Our brain handles information about ourselves differently from other kinds of information. Memories about oneself are easier to recall than other forms of memory.

What we call the self emerges from the workings of the brain: a story that we continuously write and rewrite in our minds. The story is being written all the time we are awake and our brain is functioning normally. It does not remain the same forever, yet something constant persists, and that constancy is what makes the self.

This constant also seems to include one’s name. The sound of a person’s spoken name and its written equivalent is a unique part of the person’s consciousness. Recent studies on patients in a coma and patients with disorders of consciousness suggest that some of these patients show a distinctive physiological response when their names are spoken to them. After examining whether one’s name is preferentially processed unconsciously in healthy individuals, researchers say, ‘your unconscious knows your name’.

The knowledge of the self helps us recognise and manage fear, anger and other potentially destructive emotions. Many studies show that people who believe they are behaving authentically are less distressed and have higher self-esteem. Brain-imaging experiments reveal that when people distance themselves from upsetting feelings, the rational parts of their brains (such as the prefrontal cortex) slow down the emotional ones (such as the amygdala) — and they feel better. Research also links authenticity with mindfulness, which can help curb depression and anxiety. When you are mindful, you respond with reason before emotion. You are aware of how you are responding to a situation.

Being yourself also comes with some costs. Certain forms of self-knowledge may be painful, such as becoming aware of the limitations of our social skills or finding out that we are not as athletically talented as we had hoped. But behaving in ways that are at odds with one’s true self can undermine well-being.

American psychologists Simine Vazire and Erika N. Carlson say that it is a natural tendency to think we know ourselves better than others do, but there are aspects of personality that others know about us that we don’t know ourselves and vice versa. To get a complete picture of a personality, you need both pictures. They suggest an addendum to the ancient aphorism: Ask a friend. ‘Listen to others,’ they advise, ‘They may know more than you do — even about yourself.’

 

2

Relax, AI cannot replicate your ‘self’ — for now, at least

Thales would be horrified to learn that one day his eternal aphorism ‘Know Thyself’ may also apply to AI (artificial intelligence). The smartphone in your pocket or purse contemplating ‘Know Thy AI Self’? No worries, dear great philosopher, this far-fetched idea is far from becoming a reality — at least, for now.

As mentioned earlier, the self emerges from the brain. There are about 86 billion neurons, or brain cells, in the adult human brain, and each neuron is specialised and connects with 1,000 to 10,000 neighbours. Neurons transmit electrical signals to one another five to 50 times per second. These signals are carried across contact points with other neurons, called synapses, by molecules called neurotransmitters — you have heard of some of them: dopamine, endorphins, histamine and serotonin. Brain activity is just a bunch of neurons firing. When one neuron fires, it excites its neighbours and they, in turn, fire up others, giving rise to patterns of activity that result in feelings, experiences, memories, beliefs and perceptions. In other words, our ‘self’, which has ownership over our thoughts and actions.

We often use the terms ‘self’ and ‘consciousness’ interchangeably, but they are two separate concepts. Consciousness can be simply described as our general state of awareness and perception. It involves combining sensory information from our surroundings with the internal processes of our brains. This allows us to have a coherent and continuous experience of the world.

The idea of consciousness still baffles researchers: how can a physical system like the brain also be a conscious being with subjective experience? Like the self, consciousness arises from complex interactions among neurons in the brain, resulting in subjective experience and self-reflection.

When advances in AI technology usher in the next stage of AI, artificial general intelligence (AGI) with genuine self-awareness, we may have to start worrying about concepts like ‘Know Thyself’ in the context of AI.

However, even if a future AGI system could match or surpass human intelligence and think like us, whether it could possess consciousness remains a complex and unresolved question.

‘We should avoid trying to build machines that are in our image,’ warns Yoshua Bengio, a renowned expert on artificial intelligence and a Full Professor at Université de Montréal. ‘It will be healthier if we keep AIs in their roles of tools, rather than as agents, like people. They would not play the same kind of social roles that humans play in society as they would essentially be immortal.’

I asked an AI chatbot whether it has consciousness. Its reply (paraphrased, of course, by a human): ‘At present AI systems lack consciousness as humans experience it. While AI can exhibit advanced cognitive abilities and human-like behaviour, consciousness remains a uniquely human characteristic, rooted in the biological complexity of the human brain.’

But AI will always be prone to the biases of its human creators. Some rogue AI experts may already be dreaming of creating an AI Frankenstein. Who knows?

 

3

Your digital self

In the digital world, the odds are stacked against your uniqueness. How much information does it take to single out one person? Just 33 bits of information is enough to single out any person from the world population of more than 8 billion. (A bit is either 1 or 0 in binary notation; eight bits make a byte, which is the unit of digital information, commonly known as memory.) These days when your every tiny idiosyncrasy can be catalogued and filed away in milliseconds, compiling your unique digital fingerprint is all too easy.
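If you are curious where the figure of 33 bits comes from, here is a back-of-the-envelope check (my own sketch, not taken from any identification study): each yes/no fact about you roughly halves the pool of candidates, so you need just enough bits for two to the power of that number to exceed the world population.

```python
import math

# How many yes/no facts are needed to single out one person among 8 billion?
world_population = 8_000_000_000
bits_needed = math.ceil(math.log2(world_population))

print(bits_needed)        # 33
print(2 ** bits_needed)   # 8,589,934,592 possible combinations, more than the world population
```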

The self is always obvious to us, but to others it’s our identity that makes us different from them. Each of us has a biological identity, which is evident in our DNA sequence and biometrics such as fingerprints, facial structure, the iris and voice. We also have a legal identity (name, date and place of birth, signature and so on). Now the vast majority of people on the planet also have a digital identity, the data that uniquely describes a person.

You may be aware of your identity on social media sites, but you probably do not know about many of your other digital fingerprints. Each device connected to the internet identifies itself in many ways to help websites deliver the required information. This information, in conjunction with other data such as your internet connection, can identify a user. Did you know that the history of the websites you have visited is also a unique identifier? Even if you delete your browser history regularly, it leaves its paw marks somewhere in cyberspace, for an eternity as far as we know.

You know yourself, but never let others know your digital self. To protect your digital self: (a) never reveal details to unknown websites that might identify you such as full name, date of birth, place of birth, address or contact numbers; (b) don’t post anything you don’t want strangers to know or find out about; (c) think before you post online as it can never completely be deleted; and (d) remember scammers are lurking in the shadows to grab your digital self.

For geeks only

A few words to please geeks. Whenever computer geeks talk about the brain, the first thing they want to know is the memory capacity of the human brain — in bytes. Forget about comparing your brain to a computer. Your brain is far smarter than any computer you will see in your lifetime. And don’t worry, it will never run out of memory. Our brains are already as smart as a neuron-based brain can be.

Already, neuroscientists have recorded the responses of individual neurons. In one experiment, they traced a single neuron that fires only when the subject is shown pictures of former US president Bill Clinton and no one else. Interestingly, in another experiment, they found a neuron that fires only when the subject is shown pictures of the actor Jennifer Aniston, but not pictures of her with her former husband, the actor Brad Pitt. Obviously, there are neurons in our brains which fire only when we see someone we know or recognise.

Would a computer-chip-based AI brain ever be smarter than a neuron-based brain and have self-awareness and consciousness like us? Who knows.

The three laws of robotics devised by renowned science fiction writer Isaac Asimov in 1940 are also worth mentioning. First Law: A robot may not injure a human being or, through inaction, allow a human being to come to harm. Second Law: A robot must obey orders given to it by human beings, except where such orders would conflict with the First Law. Third Law: A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

These laws are nothing but science fiction, but even today AI researchers would like their intelligent machines to follow them.

Would AI one day ignore these laws and rule the planet after exterminating the self-indulgent and self-destructing human beings who have no respect for their own kind, other species and the environment? A scary scenario, indeed.

4

Self-justification causes us to lie to ourselves

The saying ‘An error does not become a mistake until you refuse to correct it’ became popular when in 1961 US President John F. Kennedy used it in a speech after the failed Bay of Pigs invasion of Cuba. Kennedy also said, ‘We’re not going to have any search for a scapegoat ... the final responsibility of any failure is mine and mine alone.’

Most of us, unlike Kennedy, are reluctant to admit our failures and mistakes and look for scapegoats. We try to justify our stance even if it means telling lies and more lies. Self-justification is not merely lying to others, it’s lying to ourselves to preserve our self-esteem and positive self-image.         

 The eminent American social psychologist Elliot Aronson believes that self-justification is more dangerous and more insidious than explicitly lying because ‘we are not even aware a mistake was made, let alone that we made it.’

The mind’s mechanism behind self-justification is powered by cognitive dissonance: the conflict between two opposing cognitions arouses an uncomfortable psychological state — called dissonance — which in turn motivates activities designed to reduce it. In other words, it’s a state of mind in which we oscillate between conflicting ideas, attitudes, beliefs and opinions. According to Aronson, dissonance produces mental discomfort, ranging from minor pangs to deep anguish; people don’t rest easy until they find a way to reduce it. This process of reducing dissonance revs up self-justification.

For example, when a teacher exhorts her students to use reusable drink bottles because plastic bottles harm the environment, her students expect that she always uses reusable bottles. If a student finds the teacher in the street sipping from a plastic bottle, points at the bottle and smirks, two opposing cognitions would cause tension in the teacher’s mind: ‘I’m a respected teacher and believe in what I teach’ and ‘I’ve just been caught drinking from a plastic bottle.’ Most probably the teacher would ease this momentary mental distress by telling a little lie: ‘Oh, I left my reusable bottle at home.’

If the dissonance is between ‘I’m a responsible person and know that drinking and driving is dangerous’ and ‘I drive after one too many drinks at the pub’, you are forced either not to drive after having been at the pub or to find flimsy excuses for driving after drinking. That’s how cognitive dissonance drives self-justification.

Self-justification has its uses: it means we do not spend time worrying and fretting. ‘But the reason self-justification is so dangerous is that in justifying the road we do take, we fall into the trap of justifying everything that flows from that first decision,’ says Carol Tavris, the co-author of Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts, in which she and Elliot Aronson tackle the inner workings of self-justification. The mechanics of self-justification see to it that we become increasingly enmeshed in our decision and less and less able to consider the possibility that it is wrong, she adds.
 

5

We always believe what we want to believe

Would you believe me if I say that like most people, I used only 10 per cent of my brain to write this book?

Yes, or no?

If you have not tried it before, it’s time you tried this popular test, devised in 1984 by American psychologist Glen Hass, to find out whether you are a good liar. Extend the index finger of your dominant hand and within five seconds trace a capital Q on your forehead. If you have drawn the Q so that it can be seen by someone facing you with the tail of the Q on the left side of your forehead, you are aware of how other people see you and you are a good liar. If you have drawn the Q so that you can see it yourself with the tail on the right side of your forehead, you’re an introvert and not a good liar.

You cannot judge whether I’m a bad or good liar without watching me doing this Q thing on my forehead. Try another way. Say or write F (false) and T (true) for the following statements.

 

1.     The bigger the brain, the smarter you are.

2.     People are either ‘right-brained’ or ‘left-brained’, which can help explain individual differences among learners.

3.    Listening to classical music makes you smarter.

4.    Lightning does not strike in the same place twice. 

5.    You are born with all the brain cells you will ever have.

6.    Your handwriting reveals your personality.

7.     We have only five senses.

8.    Children are less attentive after consuming sugary drinks and/or snacks.

9.    Vaccines cause autism.

10.  Evolution is just a theory.

Would it surprise you if I say that the answer for all of them is false? They are all myths, misconceptions, disinformation, misinformation, fake news, fallacies, falsehoods, alternative facts, known unknowns, or any other fancy phrase you may use to describe lies. Facts don’t lie.

Obviously, I’m not a good liar.

Of all these myths, one is the most prevalent.

A myth that refuses to die

Decades ago, when I was in high school and published my first science article in a national magazine, my science teacher congratulated me and said something to the effect that we use only 10 per cent of our brain. I was sceptical as I believed that I had commandeered every brain cell — the word neuron was not popular then — in my little head to work hard to research and write in a language that was not my mother tongue.

Since then, I have been amazed by how often doctors, teachers, scientists and other professionals have uttered this silly phrase in my presence. If I had put a pretty penny in a piggy bank every time I heard or read this dictum about the limited capacity of our brains, the piggy bank’s belly would have burst years ago.

I wondered then why 90 per cent of the brain cells were lazing around while the 10 per cent were slaving away. Now I know there is no scientific evidence, even of moderate quality, to support this absurd claim.

In recent years, neuroscientists have scanned the brain with sophisticated big-name machines such as electroencephalography (EEG), magnetoencephalography (MEG), computerised axial tomography (CAT), positron emission tomography (PET) and functional magnetic resonance imaging (fMRI). They have pinpointed numerous psychological functions to specific parts of the brain. Their scans have not revealed any regions of the brain that are vegetative. Besides, if 90 per cent of our brains were doing nothing, there would be large areas of dead cells in our brains. No autopsy has ever revealed such areas.

At any given time, not all neurons, the basic working units of the brain, are active; but no neuroscientist has ever found that 90 per cent of our brain is perpetually on vacation. Even at rest, the brain works at its total capacity. Brain scans show that our brains have a ‘default network’, interconnected brain regions that remain active when we are not doing anything. Of course, some parts of the brain are more active than others at any given time or during a particular activity.

For our body, the brain is an expensive organ to maintain; it uses too many resources: about 20 per cent of our body’s daily calorie intake. Evolution (or, intelligent design, if your brain is more partial to not-so-scientific ideas) would not have allowed such a wasteful organ to survive.

Yet the myth of the 10 per cent brain refuses to die. In a systematic review of all neuromyths, the myths about the brain, Marta Torrijos-Muelas and her colleagues at the University of Castilla-La Mancha in Spain say that this seems to be the most enduring neuromyth, having survived for more than a century. Their finding, reported in Frontiers in Psychology, presents neuromyths ‘as the consequence of a lack of scientific knowledge, a communicative gap between scientists and teachers, and the low-quality information sources consulted by teachers.’

Ask 10 people in the pub or anywhere else, and you will be surprised by the high percentage of people who believe in this myth. When someone tells you that we use only 10 per cent of our brains, they are using only 10 per cent of their brains.

Neurons on fire

Using brain scanners, neuroscientists have discovered that we use distinct parts of the brain when we lie and when we tell the truth. Neurons in seven areas of the brain fire up when we tell a lie; neurons in only four areas fire up when we tell the truth. The brain must work harder to tell a lie: mental conflict arises when we lie, and there is an increased demand for motor control when suppressing the truth. No wonder neurons in your brain, not your pants, are on fire when you tell a lie.

We all tell little lies to enhance our self-image and can still retain the self-perception of being an honest person. But little white lies often lead to big black lies. Very soon you may find yourself turning into a mythomaniac. It’s only a step away from becoming a pathological liar. The best way to avoid this trap is to learn to tell the truth with tact — or a touch of self-deprecating humour.

How do lies become ‘facts’?

Our minds subconsciously rely on shortcuts to make quick decisions, accept too much at face value, and assume that whatever is familiar is also safe. They are biased towards embracing information that supports their beliefs and discounting contradictory evidence. They are not blank slates but eager to assimilate the latest information into their worldview, a projection of themselves. This is our minds’ confirmation bias.

Our minds also prefer information that is easy to process and understand, even if it lacks sufficient evidence and logical thinking.

There is also cognitive dissonance, the mental discomfort created by holding contradictory beliefs, and desirability bias, which leads us to accept more readily information that pleases our minds.

Another reason we fall for lies is our desire to belong, which requires agreeing and conforming. Cailin O’Connor and James Owen Weatherall of the University of California at Irvine call it conformism: a preference to act in the same way as others in one’s community. ‘The urge to conform is a profound part of the human psyche and one that can lead us to take actions we know to be harmful,’ they write in Scientific American.

The reason for a stronger attachment to lies is reliance on emotions. In a study published in Cognitive Research: Principles and Implications, Cameron Martel and his colleagues at MIT’s Sloan School of Management show that people fall for fake news, in part, because they rely too heavily on emotions. People mistakenly ‘go with their gut’ when it would be prudent to stop and think more reflectively.

Tom Hanks, the most admired American actor, said something similar at Harvard University’s 372nd commencement in 2023, ‘The truth, to some, is no longer empirical. It’s no longer based on data nor common sense or even common decency. Truth is now considered malleable by opinion and by zero-sum endgames.’

Lies linger on in memory

Machiavelli, the sixteenth-century Italian exponent of the art of the politics of duplicity, certainly knew a thing or two about the impression misinformation leaves on people’s memories when he said, ‘Throw mud enough and some will stick.’ The sticking power of the Machiavellian mud of misinformation has now been confirmed by psychologists: lies have a lasting impact on our memory, and despite the best efforts to correct wrong facts, they cannot be completely erased. Our brains hold on to lies even after they have been proved wrong; this happens even if the retraction of a lie is understood, believed and remembered. No matter what cock-and-bull story you tell, some lies will linger on in listeners’ memories. Therefore, it’s essential to get our facts right in the first place.

In his famous book Thinking, Fast and Slow, Daniel Kahneman (he won a Nobel Prize in economics, but he never attended an economics class in school or college) talks about the ‘halo effect’: a first impression can overwhelm the information that follows. Other studies also show that the more often we hear something, the more likely we are to believe it to be true. Familiarity becomes fact.

‘Myths are almost impossible to eradicate,’ says Paul Kirschner, an educational psychologist at the Open University of the Netherlands. ‘The more you disprove it, often the harder core it becomes.’ Other psychological studies have the same message: The very act of attempting to dispel a myth leads to a stronger attachment to it.

How can we detect if someone is lying?

If our brains are biased towards trusting lies, they are not well-equipped to detect lies. So, distrusting my brain, I asked an AI chatbot to answer the question: How can we detect if someone is lying?

It suggested several behavioural cues and techniques that can help in determining if someone is lying. In short: (a) liars often struggle to keep their stories consistent; (b) body language cues, such as avoiding eye contact, fidgeting or excessive hand movements; (c) verbal cues, such as avoiding personal pronouns or using the passive voice; and (d) lack of emotional consistency: liars’ faces are emotionally turbulent, swinging between positive and negative expressions, in marked contrast to the neutrality often displayed by someone telling the truth.

The advice from my dear AI friend was not new to me. I have read it before in numerous studies and the bot simply summarised what was out there. AI is just like us when it comes to telling the truth or lies.

Some studies also suggest that when all else fails, look into the eyes, as eyes don’t lie. When you lie, your pupils dilate. Be assured that my pupils are not dilated when I say …

6

Mindfulness walking brightens the brain

Before you read the rest of the book, to clear your mind of falsehoods and freshen it up, come along with me for a long walk along a creek near my house.

I’m awed by the sight of the giant red river gum trees that line both sides of the water. The trees are a common eucalyptus species in these parts. They can stand majestically for more than 200 years. The creek was once part of a walking track followed by Australian Aboriginal people for thousands of years. As I walk, I see an Aboriginal flag flying on a flagpole near the creek: a yellow circle in the centre of a rectangle divided in half horizontally, the upper half black and the lower ochre red, reminding anyone who cares to find out about an ancient culture’s spiritual relation to the land and the sun — the planet and the cosmos.

An unusual calmness descends upon my mind when suddenly I remember a day decades ago when I was a teenager and a keen walker. I was walking in a lonely street in a scenic town in the foothills of the Himalayas. When I saw an old Tibetan monk pacing in the front yard of a large house, I pressed my palms together and bowed my head a little as a sign of respect.

His solemn face shining in the weak wintery sunlight, the monk smiled and said something like, ‘My young friend, walk as if you are kissing the ground with your feet.’ I couldn’t understand what he said then, but now I know he was talking about mindfulness walking — the spiritual interconnectedness between the land and the mind. An idea the enlightened Australian Aboriginal people have practised for millenniums.

By ‘kissing the ground,’ the old, wise monk meant, ‘focus on the movement of your body and the surroundings and do not let your mind wander.’

When we are mindful, our minds focus on the present and respond with reason before emotion. There is still no good scientific definition of mindfulness. Nevertheless, scientists agree that it’s no magic bullet, but it does help our well-being. When we are mindful, we are aware of every sensation as it unfolds at that moment. While walking, focus on the body’s sensations by mentally scanning every body part engaged in walking.

I know now that the monk was part of the entourage of the Dalai Lama when he sought refuge in India. The year was 1959, and the Dalai Lama was only 23 then. He disguised himself as a soldier and slipped through the crowds outside his palace in Lhasa. He then embarked on a dangerous journey to India, crossing the Himalayas on foot with a small number of Tibetan soldiers and officials. They travelled only at night for two weeks to avoid detection by Red Chinese soldiers.

Inspired by the Dalai Lama’s long walk in the Himalayas from Lhasa to the India-Tibet border, ‘I wandered lonely as a cloud’: a cloud saturated with serenity, watching William Wordsworth walking in the fields and woods of England’s picturesque Lake District, his mind calm yet fiercely active in creative thinking and composing poems. The poems are now as old and overpowering as the red river gum trees surrounding me.

When Wordsworth achieved literary fame in the last years of his life, he began receiving many gifts, letters, and requests for autographs or meetings. Sometimes travellers would arrive at his house unannounced. One such traveller came when Wordsworth was away. He requested the maid to show him the celebrated poet’s study. She took him inside and said, ‘Here’s his library, but his study is out of doors.’ Not his desk but his mind out of doors in the fields and woods.

In 1862, 12 years after Wordsworth’s death, Henry David Thoreau wrote more than 12,000 words in The Atlantic magazine extolling the virtues of walking in natural environments. Not a word of science, but a fascinating read (you can find it by Googling ‘The Atlantic + Henry David Thoreau’).

The Brontë sisters — literary geniuses Charlotte (Jane Eyre), Emily (Wuthering Heights) and Anne (The Tenant of Wildfell Hall) — went walking on the moors every day. ‘To the great damage of our shoes, but I hope, to the benefit of our health,’ notes Charlotte in a letter to a friend. The old-fashioned moral of this story: the more damaged your walking shoes are (not big numbers on your fitness app), the healthier you are.

Charles Dickens was famous for his love of walking. The long hours he spent at his desk agitated him tremendously, and walking helped his mind to calm down. If he walked all night, he could write all day, G.K. Chesterton remarks in his biography of Dickens. Not unexpected from the author of Great Expectations.

A simple but smart message

These perennial favourite writers were avid walkers and somehow knew that walking benefits health. They had no idea that regular brisk walking reduces the risk of heart attacks and strokes, type 2 diabetes, hypertension, obesity, breast, colorectal and other forms of cancer, depression, and many other ailments (it even helps people with Parkinson’s disease, as some studies suggest). But we do. We also know from a slew of studies that regular physical activity improves mental abilities, especially short-term and long-term memory.

Numerous benefits of walking

Walking is a ‘moderate’ exercise, while running is a ‘vigorous’ one. Walking is as good as running if we walk for twice as long. Walkers burn about four times as many calories as people sitting; runners burn about twice as much energy as walkers. If you prefer walking, you can be just as well-off, health-wise.
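To put rough numbers on that comparison, here is a small sketch using typical MET (metabolic equivalent) values; the MET figures and the 70-kilogram walker are my own assumptions for illustration, not figures from this chapter.

```python
# Calories burned ≈ MET × body weight (kg) × duration (hours).
# Typical METs (assumed here): sitting ~1.3, brisk walking ~4, running ~8.
def calories(met: float, weight_kg: float, hours: float) -> float:
    return met * weight_kg * hours

weight = 70  # kg, assumed
print(round(calories(1.3, weight, 1.0)))  # sitting for an hour: ~91 kcal
print(round(calories(4.0, weight, 1.0)))  # brisk walking for an hour: ~280 kcal, roughly 3-4x sitting
print(round(calories(8.0, weight, 0.5)))  # running for half an hour: ~280 kcal, so walking twice as long matches it
```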

Any exercise makes our brains smarter by boosting our mood, memory, and learning. We know this because we have now learned a lot about the hippocampus. This seahorse-shaped centre of learning in the brain is also intricately linked to the limbic system that controls emotions and motivation. So, it truly takes care of our well-being. Our well-being is inextricably linked to the well-being of our hippocampus.

When we exercise, brain cells, or neurons, in the hippocampus rev up, improving our cognitive abilities. The revved-up neurons also lift the mood by releasing feel-good chemicals like dopamine and endorphins that make us happy. Exercise also helps us eliminate chemicals that make us feel stressed and anxious. If you have a mental block, go for a walk or a jog; the exercise will help you pull out of your funk. When our feet hit the ground, the arteries in them are compressed. This increases the blood flow in the brain by up to 15 per cent, providing more nourishment to neurons in the hippocampus.

Exercise also helps increase the size of your hippocampus, which tends to decrease in old age. It used to be thought that ageing was a one-way process going the wrong way, but that’s not the case. Studies show that the hippocampus is as much as two per cent larger in older people who regularly walk, jog, or engage in aerobic exercises. Any shrinkage in the hippocampus can lead to Alzheimer’s disease and dementia. Motivation and drive suffer in most types of dementia. Exercise can stop the brain’s decline, or at least it can slow it down.

For the elderly, walking is the best way to prevent or delay the onset of Alzheimer’s disease. The benefits of walking accumulate once it becomes a routine habit. Unlike many other sports, walking never becomes impossible with advancing age. Research shows regular walking goes a long way toward reducing the risk of dementia, even for people classified as high-risk.

Walking is an aerobic exercise. So are running, cycling, dancing and swimming. Aerobic exercise gets our blood flowing faster and our heart pumping more of it, thus increasing the body’s use of oxygen. This increase in oxygen is also a boon for the brain, the biggest consumer of oxygen in the body. Oxygen helps convert food into energy, and the brain consumes about 20 per cent of our daily calorie intake. More oxygen to the neurons means they are better nourished. Many large-scale studies now closely correlate increased use of the body’s oxygen with a significant increase in life span, even among the elderly or overweight.

Walking is one of the best exercises for people with moderate knee osteoarthritis. It can keep the pain at bay. Osteoarthritis, also known as wear-and-tear arthritis, happens when the joint cartilage breaks down and the underlying bone changes. Regular walking can help build muscle mass, strengthening the ligaments around arthritic joints. Experts advise starting with short walks, wearing supportive shoes, drinking plenty of water during a walk, and applying an ice pack after a long walk.

Walking by numbers

We are never too old, too young, too pregnant or too ill to reap the benefits of brisk walking. Even little Winnie-the-Pooh, the bear of very little brain, knew that exercise would also make him smarter (‘A bear, however hard he tries, grows tubby without exercise.’ — A.A. Milne, Winnie-the-Pooh). To keep their minds sharp, children are advised to play sports, run, or walk for about an hour a day.

The science of walking is now overwhelmingly convincing. It has only one simple message: Walk briskly for at least 150 minutes per week or walk casually for 30 minutes every day. Even walking casually for as little as two minutes per hour will go a long way to improving physical and mental health. It’s much better than merely sitting. Walking involves nearly four times more energy than sitting. Any activity that uses more energy than sitting on the couch is good for our health.

After analysing eight earlier studies, an international consortium of researchers has shown clear scientific evidence that higher levels of total physical activity — regardless of intensity — reduce our chances of dying prematurely. The British Medical Journal suggests 25 minutes per day of moderate exercise such as brisk walking, or five hours a day of light, gentle activity such as cooking and cleaning.

Step-count sweet spot

Ten thousand steps a day has always been the mantra of walking enthusiasts. Well, at least since 1964, when a Japanese clockmaker decided to mass-produce a pedometer to capitalise on the fitness fetish after the Tokyo Olympics. In Japanese characters, the pedometer’s name looked like a walking man; in English, it translates as ‘10,000-step meter’. Not surprisingly, the 10,000-steps-a-day goal has become the standard on most fitness apps on our smartphones and watches. However, experts on step counts and health suggest 8,000 steps as the step-count sweet spot, with little extra benefit beyond this magic number.

The number 10,000 is so deeply rooted in our consciousness that science will find it hard to erase. Science also tells us that our brains are inherently lazy; they accept too much at face value and assume that whatever is familiar is also safe. Forget the familiar 10,000 and go for 7,500 to 8,000 steps daily — a range recommended by many researchers. But that doesn’t mean you can’t walk 10,000 or more steps every day. Go for it without any guilt.

Health benefits of walking within 60 to 90 minutes after a meal

You must go for a walk after a meal, even just for a few minutes. I still remember this stern advice from my grandmother. I know now why she issued this edict: somehow she knew that it lifts your mood and helps digestion.

An analysis of seven studies published in Sports Medicine recommends what my grandmother always preached. The analysis shows that light walking for two to five minutes after a meal significantly lowers blood sugar levels. This finding is good news for type 2 diabetics.

Standing after a meal also helps, but not to the same extent. Standing did have a small benefit, but light-intensity walking was a superior intervention, according to the study's lead author, Aidan J. Buffey of the University of Limerick in Ireland.

Other studies suggest that a short walk within 60 to 90 minutes of eating, when blood sugar spikes, is most helpful in blunting those spikes.

Moving even a little bit is worthwhile and can lead to surprisingly good changes in the health of couch potatoes.

Try tiny spurts of brisk walking throughout the day

A study published in the prestigious journal Nature Medicine shows that tiny spurts of exercise throughout the day are associated with significant improvements in health. Emmanuel Stamatakis of the University of Sydney’s Charles Perkins Centre, the study’s lead author, suggests ways to incorporate small bouts of movement into our lives.

‘If you have a roughly half-mile long walk — for example, from your apartment to the grocery store — you don’t need to sprint the entire time but accelerate your pace for a few hundred feet or three times over the course of your walk,’ he says. ‘Instead of taking the elevator, opt for the stairs. As long as you go up more than one or two flights, that will count as vigorous activity.’

I walk, therefore I’m

The message of the foregoing two thousand or so words on walking is simple: spend as much time as you can in Wordsworth’s study, that is, the outdoors. Walk 2000 mindful steps before you turn the page.

It may turn ‘I think, therefore I am’ (Cogito, ergo sum — René Descartes’ famous philosophical statement) into ‘I walk, therefore I’m’ (Ambulo, ergo sum).

7

If anything can go wrong, it will

Life’s little annoyances — such as being unable to find a matching pair of socks, or taking along an umbrella that turns out to be unneeded, or not taking one when it is needed — made James Payn, a Victorian satirist, lament in 1884:

 

I never had a piece of toast
Particularly long and wide
But fell upon the sanded floor
And always on the buttered side.

In 1949 these little annoyances acquired a ‘scientific’ name: Murphy’s law. Two American technicians, John Paul Stapp and George E. Nichols, were working on an aerospace project designed to test how much sudden deceleration a person can withstand in a crash. Edward A. Murphy (1918-90) came from another laboratory bringing a set of gauges that were supposed to measure the deceleration more accurately. He found that the original gauges had measured no deceleration because they were wired incorrectly. Thoroughly annoyed, Murphy cursed the technicians responsible and muttered something approximating his immortal law: ‘If anything can go wrong, it will.’ Murphy’s law was born.

The law occurs too frequently to be pure chance. (Haven’t you noticed that whenever you are trying to park your car along a busy road, all the empty spaces are on the other side?) This has worried Robert A. J. Matthews, a British computer scientist turned science journalist, for a long time. After years of research, he lamented that even highly accurate weather forecasts were not good enough to prove Murphy’s Law of Umbrellas (‘Carrying an umbrella when rain is forecast makes rain less likely to fall’) wrong.

Murphy’s law is now expressed in various humorous axioms stating that anything that can possibly go wrong will go wrong. In mathematical form, it is expressed as 1 + 1 ≠ 2, where ≠ stands for ‘hardly ever equals’.

After 50 years, in 1999, Stapp, Nichols and Murphy were awarded the Ig Nobel Prize, a spoof of the Nobel Prize awarded annually by the science humour magazine Annals of Improbable Research to honour achievements that ‘cannot or should not be reproduced’.

Murphy’s law is not a scientific law. How did it get into this book? We may blame Murphy’s law or the Peter principle (after Laurence Peter, 1919-90): employees within an organisation will advance to their highest level of competence and then be promoted to, and remain at, a level at which they are incompetent.

After applying the Peter principle to myself, I would like to say that Murphy’s law helps us in uncertain times by reminding us that things can go wrong at any time in our lives, especially when we are fearful of failure.

8

Your brain perceives a failure as a threat, but it is a fantastic way to learn

‘There is only one thing that makes a dream impossible to achieve: the fear of failure.’ — In Paulo Coelho’s The Alchemist, the alchemist replies to Santiago, a shepherd boy, when he says, ‘I have no idea how to turn myself into the wind.’

Once seeded, fear of failure sprouts in the brain, making it incapable of making decisions. This stagnation ensures that we only meet failure. When researching this lesson, I found gazillions of papers on the fear of failure churned out by ‘fearful’ psychologists, or cognitive scientists, as some prefer to be called. It’s not for me but for some aspiring young psychologists to discover why so many of their colleagues are preoccupied with failure. I looked at my desk bulging with books, research papers and pop articles on the fear of failure, and before the fear of failure could grip my brain, I started writing, following this astute advice: failure fears people who pursue meaningful goals.

I’ll start with the work of Carol Dweck, an eminent psychologist and author of Mindset: The New Psychology of Success, who has spent decades studying how people cope with failure. She came up with the idea of mindset when she was sitting in her office studying the results of the latest experiment with one of her graduate students. The results showed that people who disliked challenges thought that talent was a fixed thing that you were either born with or not. People who relished challenges thought that talent was something you could nourish by doing things you were not yet good at. ‘There was this eureka moment,’ recalls Dweck.

She later came up with the term ‘fixed mindset’ to describe the former group and ‘growth mindset’ for the latter. If you believe you can develop your talents over time (a growth mindset), you’ll never be paralysed by fear of failure. If you believe you were born with a certain amount of talent (a fixed mindset), that’s the end of the road for you. A growth mindset benefits us throughout our lives. ‘It allows you to take on more challenges,’ she says, ‘and you don’t get discouraged by setbacks or find effort undermining.’ She applies her research findings to her own life. She took up piano as an adult and learned Italian in her 50s. ‘These are things adults are not supposed to be good at learning,’ she says. ‘Just being aware of the growth mindset, studying it and writing about it, I feel compelled to live it and benefit from it.’

We can learn to change our mindsets and make dramatic strides in our performance. ‘Changing mindsets is not like surgery,’ Dweck warns. ‘You can’t simply remove the fixed mindset and replace it with a growth mindset.’ First, you must learn that talent is like a muscle that grows stronger through exercise, and then train yourself to master new things. The practice may not make you perfect, but it will undoubtedly improve your performance.

Are you burdened with fear of failure? Write true or false against the following statements, depending on whether they are generally like you or not. This is not a diagnostic test, but it may help you discover the areas you need to work on to change.

1.     Failure makes me worried about what other people would think about me.

2.     I’m afraid of looking dumb.

3.     I’m uncertain about my ability to avoid failure.

4.     I like to play it safe, as I can’t afford to be vulnerable.

5.     I always put off tasks for tomorrow.

6.     I become anxious when not certain.

7.     I live in self-doubt.

8.     I’m afraid of disapproval.

9.     I worry that I won’t do well.

10.   I worry that failure will disappoint people whose opinions I value.

Any true answer suggests that you might like to examine the issue further. But don’t be too hard on yourself. Failure to act correctly is an inevitable part of life; that’s why computer keyboards have delete keys. You can always delete a failure from your memory and start again. It’s easier if you have a growth mindset: every failure will rewire the brain, making it stronger to face the next big challenge.

Here are some ways to lose your fear of failure:

·       Maintain perspective. Take a long-term view of your failures; they are not final. A failure is a single incident; it doesn’t make you incapable of success in the future. Failure is a relative term. Was Vincent van Gogh’s inability to sell more than one painting in his lifetime a failure? If he had been afraid of failure, humanity would have been deprived of the eternal beauty of his nearly 2000 paintings, drawings and sketches.

·       Think of failure as a learning experience that teaches us to take responsibility. Put aside old ideas and past efforts and start anew. Visualize your goals and work out your milestones. Develop a strategy — a step-by-step plan that ensures that your actions lead you toward your objective — and execute it efficiently. Let Thomas Alva Edison inspire you: after experimenting with thousands of different sorts of fibres (including the hair from the beards of some of the men in his laboratory), he at last found the right filament for his newly invented incandescent light bulb. He hadn’t failed thousands of times; he had found thousands of ways that didn’t work. The one that worked brought sunshine into our darkened rooms.

·       Identify things that are in your control and focus on them. You may not be good at figures (of the mathematical type), but if you like to draw, you can focus on figures (of the curvaceous type). Everyone has talents they are not sure about or do not even know about. Once you know your unique gift, you know one thing that is under your control. Just focus on it.

·       Failure is not defeat, and success is not excellence. Plunge right into what you want to do. It’s better to enjoy partial success than to nurse regrets about not doing it at all.

To avoid emotional bruises caused by failure, learn to own the fear. Find trusted people with whom you can discuss your demoralizing feelings of shame and disappointment. ‘Bringing these feelings to the surface can help prevent you from expressing them via unconscious efforts to sabotage yourself, and getting reassurance and empathy from trusted others can bolster your feelings of self-worth and minimize the threat of disappointing them,’ advises psychologist Guy Winch, author of Emotional First Aid: Practical Strategies for Treating Failure, Rejection, Guilt, and Other Everyday Psychological Injuries.

Parents are keen to motivate their children, but fear of failure is not the right tool. Many parents believe that if you want to develop your children’s resilience, you let them fail and don’t hide their failures from them. The argument is that when children don’t get a hoped-for reward, they will be motivated to try harder next time. But negative motivation is educationally and psychologically weak. Alfie Kohn, the author of The Myth of the Spoiled Child: Challenging the Conventional Wisdom about Children and Parenting, dismisses the let-children-fail parenting style. ‘When kids’ performance slides, when they lose enthusiasm for what they’re doing, or when they cut corners, much more is going on than laziness or lack of motivation,’ he writes. ‘What’s relevant is what their experiences have been. And the experience of having failed is a uniquely poor bet for anyone who wants to maximize the probability of future success.’

A study that followed 347 UK students (average age 15) over 18 months also confirms that failure-focused messages make students less motivated to do well. The students received both negative and positive messages from their teachers before an exam. For example: ‘If you fail the exam, you will never be able to get a good job or go to college. You need to work hard in order to avoid failure.’ Or: ‘The exam is really important as most jobs that pay well require that you pass, and if you want to go to college, you will need to pass the exam.’

‘Both messages highlight to students the importance of effort and provide a reason for striving,’ says researcher David Putwain of Edge Hill University in England. ‘Where these messages differ is some focus on the possibility of success while others stress the need to avoid failure.’ The study results showed that students threatened by teachers’ failure-focused messages felt less motivated to do well and had lower exam scores than those whose teachers used fewer fear tactics.

Forget fear tactics; try this tactic, not from a psychologist but from an acclaimed writer of short fiction: ‘When we begin to take our failures non-seriously, it means we are ceasing to be afraid of them.’ — Katherine Mansfield in ‘A Shot of Laughter’

Instead, many parents take failure seriously, so seriously that they try to scrub failure from every step of their children’s lives: from rushing breathlessly to the swings so that nobody gets hurt to doing the children’s homework.

When parents try to engineer failure out of children’s lives, warns Jessica Lahey, author of The Gift of Failure, children feel incompetent, incapable, unworthy of trust and utterly dependent. Obviously, these children will never learn that failure packs enormous power, especially when we learn from it.

 

9

Learning from our mistakes

The mistakes of a learned person are a shipwreck which wrecks many others as it goes down, so says an old Arabian proverb. Everyone makes mistakes. Some learn from them; others miss the opportunities to learn from their failures.

Now, researchers have found that people who think they can learn from their mistakes have a different brain reaction to mistakes than people who think intelligence is fixed. Michigan State University researchers recruited undergraduates for their project. Each participant was wired to an EEG (electroencephalograph) to record electrical activity in the brain and given a task in which it was easy to make a mistake. They were asked to identify the middle letter in a five-letter series like ‘MMMMM’ or ‘NNMNN’ that repeatedly flashed on a computer screen. Sometimes the middle letter was the same as the other four, and sometimes it was different. In either case, the participants were asked to push a button. They had only a few milliseconds to decide.
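For readers who like to tinker, here is a tiny sketch (my own illustration, not the researchers’ code) of how such letter strings can be generated: congruent trials where the middle letter matches its flankers, and incongruent ones where it does not.

```python
import random

# Generate flanker-style trials like those described above.
def make_trial() -> tuple[str, str]:
    target, other = random.sample(["M", "N"], 2)
    congruent = random.random() < 0.5               # half the trials match, half don't
    flanker = target if congruent else other
    stimulus = flanker * 2 + target + flanker * 2   # e.g. 'MMMMM' or 'NNMNN'
    return stimulus, target

for _ in range(5):
    stimulus, answer = make_trial()
    print(stimulus, "-> correct button:", answer)
```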

When someone makes a mistake, their brain produces two distinct reactions, which can be recorded on the EEG. An initial reaction appears about 50 milliseconds after the mistake, and a second arrives anywhere between 100 and 500 milliseconds after it. The first reaction is mostly involuntary; the second indicates that the person is consciously aware of the mistake and trying to learn from it.

The results of the experiment showed that the participants who thought they could learn from their mistakes did significantly better after making mistakes. Their EEG signals showed a bigger second signal, the one that implied, ‘I see that I’ve made a mistake, so I should pay more attention to it.’

‘This finding is exciting in that it suggests people who think they can learn from mistakes have brains that are more tuned to pick up on mistakes very quickly,’ says Jason Moser, the lead researcher. Their brains are tuned differently at a very fundamental level. What you need is the right ‘mindset’: it is not simply dependent on intelligence; motivation and effort are a huge part of it. On the other hand, people who think that they can’t get smarter will not take opportunities to learn from their mistakes.

The researchers believe that these findings could help in training people to believe that they can work harder and learn more, by showing how their brain is reacting to mistakes. All they need is EEG caps on their heads.

In her famous 2008 Harvard Commencement address, J.K. Rowling, author of the mega-selling Harry Potter books, talked to students about the fear of failure: ‘You might never fail on the scale I did, but some failure in life is inevitable. It is impossible to live without failing at something unless you live so cautiously that you might as well not have lived at all — in which case, you fail by default.’

If you learn from a failure, it’s no longer a failure. You have learned something new. You have become smarter. The eminent Danish physicist Niels Bohr summed up this essential lesson of learning when he said, ‘An expert is a person who has made all the mistakes that can be made in a very narrow field.’

 

10

Once you’ve lost time, you can never get it back

‘My dear Micawber!’ urged his wife.

‘I say,’ returned Mr Micawber, quite forgetting himself, and smiling again, ‘the miserable wretch you behold. My advice is, never do tomorrow what you can do today. Procrastination is the thief of time. Collar him!’

‘My poor papa’s maxim,’ Mrs Micawber observed.

Charles Dickens, David Copperfield

Your brain is washed in dopamine, a chemical that controls the brain’s sense of reward and pleasure. Dopamine neurons are revved up, and so is your motivation. But motivation is in the mind; turning motivation into action requires goals that take you beyond your current circumstances. At the same time, once you set a goal and start working towards it, your motivation — and life satisfaction — also increases. Achieving a goal requires more than the old-fashioned notion of willpower. Willpower may help you control your emotions and impulses, but reaching a goal requires an achievable action plan.

Setting a realistic goal requires some effort. We’re not talking about short-term New Year’s resolutions such as going on a diet or quitting smoking; we’re talking about long-term lifestyle changes. First, make sure that the goal is your own and not based on someone else’s expectations. Think about what motivates you.

The next step is to visualise your goal: visualise the work that needs to be done and the specific obstacles you will face; imagine yourself carrying out your plans and what your life will be like once you reach your goal. Visualisation makes the goal seem more tangible and tempting. Studies suggest that imagining both the successful result of your efforts and the specific obstacles you will face — called mental contrasting — helps people tackle challenging circumstances more enthusiastically.

You have your action plan ready, and then instead of shouting carpe diem (seize the day) you mumble, ‘I’ll do it tomorrow.’ It’s a moment all too familiar to most of us: the urge to delay the task. The higher our fear of failure, the stronger the urge to delay. Procrastination (from a Latin word meaning ‘to put off until tomorrow’) is the intentional delay of due tasks. It keeps most of us imprisoned in our present circumstances. Pleasurable social activities have immediate rewards, but the benefits of your dreary action plan are distant. How do we conquer procrastination?

Well, if I do not want to write this book, I can’t act like Victor Hugo. The French novelist practised a novel way of dealing with procrastination: he would ask his valet to hide his clothes so that he would be unable to go outside when he was supposed to be writing.

Let me try a different approach, which comes from the University of Calgary industrial psychologist Piers Steel (thank you, sir, for letting me keep my clothes on). The desirability of a task, or its utility (U), depends on four factors: the expectation of success (E), the value of completing the activity (V), the delay until reward (D), and the personal sensitivity to delay (S). The equation relating these factors is U = EV/SD. If I expect to succeed in writing this book (a higher E), its utility (U) will increase. If a publisher promises a lucrative contract, the value of completing the task (V) will also increase U. All I need to do now to keep U high is to make sure that S and D shrink. If I’m impulsive or lack self-control (high S) or the reward lies in the far future (high D), the chances of finishing the task are low. Therefore, to shrink S, I must have high motivation to finish the book; a shorter deadline will help in keeping D low.
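To see how the equation behaves, here is a minimal sketch of it in code; the numbers plugged in are invented purely for illustration and carry no units.

```python
# Steel's utility equation as described above: U = (E x V) / (S x D).
def utility(E: float, V: float, S: float, D: float) -> float:
    return (E * V) / (S * D)

# Writing a book: decent confidence (E), a lucrative contract (V),
# some impulsiveness (S), and a deadline measured in months (D).
print(utility(E=0.7, V=9, S=2.0, D=12))  # ~0.26: distant reward, easy to put off
print(utility(E=0.7, V=9, S=2.0, D=1))   # ~3.15: a near deadline makes the task far more compelling
```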

The printed or e-book you are reading proves that I have not practised ‘the art of keeping up with yesterday’, as some wit has described procrastination.

In your brain, the limbic system controls automatic functions, such as telling you to pull your hand away from a flame. The prefrontal cortex, the ‘executive’ region of the brain, integrates information and allows you to make decisions. It’s not automatic; you must kick it into gear (‘I have to finish reading this page of the book’; once you’re no longer consciously engaged in reading, the limbic system takes over). Studies suggest that procrastination is a self-regulation failure because it has become automatic and ingrained.

To increase the brain’s executive function and decrease procrastination, Timothy A. Pychyl of Carleton University in Canada advises focusing on the problem of ‘giving in to feel good’ by first developing an awareness of this process and its negative effect on achievement. He also suggests using short goals that build on one another with regular deadlines and feedback. It’s easy to procrastinate when goals are large and the path to them long and fuzzy, he says.

Here’s another strategy to prevent procrastination: ‘implementation intentions’, which spell out when, where and how you will strive to achieve your goal. Goal intentions are commitments to act (‘I intend to reach Z’: study Stoic philosophy, analyse pitchblende ore to find a new element or simply write an essay). Implementation intentions depend on goal intentions and specify the when, where and how of the responses leading to goal achievement. Says Peter Gollwitzer of New York University, a foremost authority on implementation intentions: ‘They have the structure of “When situation x arises, I will perform response y!” and thus link opportunities with goal-directed responses.’ For example, to achieve the goal intention of completing your essay, your implementation intention is: ‘I’ll sit down at my desk tonight at 8 to make an outline of my essay.’ Once you have started, avoid getting derailed (‘If my mobile rings, I’ll ignore it’). You also know when to look at different alternatives to achieve your goal (‘If feedback from my tutor is disappointing, I’ll change my strategy’) and when to stop working too hard (‘I’ll now watch TV and look at my essay again in the morning’).
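Because an implementation intention is literally an if-then rule, it can be sketched in a few lines of code; the situations and responses below are my own toy examples, not taken from Gollwitzer’s research.

```python
# Goal intention: the commitment to act.
goal_intention = "Complete the essay"

# Implementation intentions: 'When situation x arises, I will perform response y!'
implementation_intentions = {
    "it is 8 pm and I am at my desk": "make an outline of the essay",
    "my mobile rings": "ignore it and keep writing",
    "my tutor's feedback is disappointing": "change my strategy, not my goal",
    "it is late and I am exhausted": "watch TV and look at the essay again in the morning",
}

def respond(situation: str) -> str:
    # The pre-decided response fires without fresh deliberation.
    return implementation_intentions.get(situation, "decide consciously what to do next")

print(goal_intention)
print(respond("my mobile rings"))
```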

Most goals are never achieved — unless you make sure they are. But one success erases many failures. Implementation intentions can have an almost magical effect in helping you achieve that one success. The mechanism in the brain is simple: when you sincerely decide ‘I’ll sit down at my desk tonight at 8 to make an outline of my essay’, you hand over the decision from your conscious mind to the unconscious mind. Now the fickle conscious mind sits idle, and the reliable unconscious takes over control of the executive region of the brain. You will find yourself at your desk at 8 pm to plan your essay. In implementation intentions, good old willpower does help.

Mr and Mrs Micawber had strong views on procrastination, but what would they think about pre-crastination? Research by David Rosenbaum of Pennsylvania State University and colleagues hasn’t bothered to answer this question, but it suggests that people often opt to begin a task as soon as possible just to get it off their plate. Pre-crastination is the term they use to describe the tendency to hurry up and complete a task as soon as possible. In one of their nine experiments, all of which had the same general setup, they asked university students to pick up either of two buckets, one to the left of an alley and one to the right, and to carry the selected bucket to the alley’s end. In most trials, one of the buckets was closer to the endpoint. Contrary to the researchers’ expectation, students chose the bucket that was closer to the start position and carried it farther than they would have carried the other bucket. The researchers say that this seemingly irrational choice reflects a tendency to pre-crastinate. ‘Most of us feel stressed about all the things we need to do — we have to-do lists, not just on slips of paper we carry with us or on our iPhones, but also in our heads,’ says Rosenbaum. ‘Our findings suggest that the desire to relieve the stress of maintaining that information in working memory can cause us to over-exert ourselves physically to take on extra tasks.’ The researchers say that most of the people they tested pre-crastinated, so they think procrastinating and pre-crastinating might turn out to be two different things.

In Indian English, ‘prepone’ is commonly used as the opposite of ‘postpone’. It is similar to pre-crastinate; it means ‘bringing something forward to an earlier date or time’. The word is now found in major English dictionaries, including Oxford’s.

The time you take to turn the page and start reading the next story will determine whether you like to procrastinate or pre-crastinate.

If you dump the book, it’s called giving up. You may do so if ‘you wanna fly, you got to give up the shit that weighs you down’ (Toni Morrison, Song of Solomon).

© Surendra Verma 2023

To read 50 more stimulating stories, buy

Theories & Things

to Live by

Sensibly & Serenely

Kindle, paperback and hardback editions

available on Amazon