People Archives - Farnam Street
https://myvibez.link/category/people/
Mastering the best of what other people have already figured out

The Feynman Learning Technique
https://myvibez.link/feynman-learning-technique/
Mon, 22 Feb 2021

The Feynman Technique is the best way to supercharge your learning. And it works no matter the subject. Devised by Nobel Prize-winning physicist Richard Feynman, it leverages the power of teaching for better learning.

Learning doesn’t happen from skimming through a book or remembering enough to pass a test.

Information is learned when you can explain it and use it in a wide variety of situations. The Feynman Technique gets more mileage from the ideas you encounter instead of rendering anything new into isolated, useless factoids.

When you really learn something, you give yourself a tool to use for the rest of your life. The more you know, the fewer surprises you will encounter because most new things will connect to something you already understand.

Ultimately, the point of learning is to understand the world. But most of us don’t bother to deliberately learn anything.

We memorize what we need to as we move through school, then forget most of it. As we continue through life, we don’t extrapolate from our experiences to broaden the applicability of our knowledge. Consequently, life kicks us in the ass time and again.

To avoid the pain of being bewildered by the unexpected, the Feynman Technique helps you turn information into knowledge that you can access as easily as reaching for a chair.

***

The Feynman Technique

“Any intelligent fool can make things bigger, more complex, and more violent. It takes a touch of genius—and a lot of courage—to move in the opposite direction.”

—E.F. Schumacher

There are four steps to the Feynman Learning Technique, based on the method Richard Feynman originally used. We have adapted it slightly after reflecting on our own experiences using this process to learn. The steps are as follows:

  1. Pretend to teach a concept you want to learn about to a student in the sixth grade.
  2. Identify gaps in your explanation. Go back to the source material to better understand it.
  3. Organize and simplify.
  4. Transmit (optional).

Step 1: Pretend to teach it to a child

Take out a blank sheet of paper. At the top, write the subject you want to master. Now write out everything you know about the subject as if you were teaching it to a child.

It’s important to remember that you are not teaching to your smart adult friend, but rather a child who has just enough vocabulary and attention span to understand basic concepts and relationships. It has to be simple and clear. There is nowhere to hide in obfuscation.

Or, for a different angle on the Feynman Technique, you could place a rubber duck on your desk and try explaining the concept to it. Software engineers sometimes tackle debugging by explaining their code, line by line, to a rubber duck. It sounds silly, but it’s a forcing function to make you walk through your thinking as simply as possible.

It turns out that one of the ways we mask our lack of understanding is by using complicated vocabulary and jargon. The truth is, if you can’t clearly and simply define the words and terms you are using, you don’t really know what you’re talking about.

If you look at a painting and describe it as “abstract” because that’s what you heard in art class, you demonstrate no understanding. You’re just mimicking what you’ve heard. You haven’t learned anything.

When you write out an idea from start to finish in simple language that a child can understand, you force yourself to understand the concept at a deeper level and simplify relationships and connections between ideas. You can better explain the why behind your description of the what.

Writing helps you think because it gives you nowhere to hide.

Looking at the painting again, you will be able to say that the painting doesn’t display buildings like the ones we look at every day. Instead, it uses certain shapes and colors to depict a city landscape. You will be able to point out what these are. You will be able to engage in speculation about why the artist chose those shapes and those colors. You will be able to explain why artists sometimes do this, and you will be able to communicate what you think of the piece considering all of this.

Chances are, after capturing a full explanation of the painting in the simplest possible terms that would be easily understood by a sixth-grader, you will have learned a lot about that painting and abstract art in general.

Some of capturing what you would teach will be easy. These are the places where you have a clear understanding of the subject. But you will find many places where things are much foggier.

Step 2: Identify gaps in your explanation

The areas where you struggled in Step 1 are the points where you have gaps in your understanding.

Identifying gaps in your knowledge—where you forget something important, aren’t able to explain it, or simply have trouble thinking of how variables interact—is a critical part of the learning process.

Filling those gaps is when you really make the learning stick.

Now that you know where you have gaps in your understanding, go back to the source material. Augment it with other sources. Look up definitions. Keep going until you can explain everything you need to in basic terms.

Only when you can explain your understanding without jargon and in simple terms can you demonstrate understanding. Think about it this way. If you require complicated terminology to explain what you know, you have no flexibility. When someone asks you a question, you can only repeat what you’ve already said.

Simple terms can be rearranged and easily combined with other words to communicate your point. When you can say something in multiple ways using different words, you understand it really well.

Being able to explain something in a simple, accessible way shows you’ve done the work required to learn. Skipping it leads to the illusion of knowledge—an illusion that can be quickly shattered when challenged.

Identifying the boundaries of your understanding is also a way of defining your circle of competence. When you know what you know (and are honest about what you don’t know), you limit the mistakes you’re liable to make and increase your chance of success when applying knowledge.

Step 3: Organize and simplify

Now you have a set of hand-crafted notes containing a simple explanation. Organize them into a narrative that you can tell from beginning to end. Read it out loud. If the explanation sounds confusing at any point, go back to Step 2. Keep iterating until you have a story that you can tell to anyone who will listen.

If you follow this approach over and over, you will end up with a binder full of pages on different subjects. If you take some time twice a year to go through this binder, you will find just how much you retain.

Step 4: Transmit (optional)

This part is optional, but it’s the logical result of everything you’ve just done.

If you really want to be sure of your understanding, run it past someone (ideally someone who knows little of the subject).

The ultimate test of your knowledge is your capacity to convey it to another. You can read out directly what you’ve written. You can present the material like a lecture. You can ask your friends for a few minutes of their time while you’re buying them dinner. You can volunteer as a guest speaker in your child’s classroom or your parents’ retirement residence. All that really matters is that you attempt to transmit the material to at least one person who isn’t that familiar with it.

The questions you get and the feedback you receive are invaluable for further developing your understanding.

Hearing what your audience is curious about will likely pique your own curiosity and set you on a path for further learning. After all, it’s only when you begin to learn a few things really well that you appreciate how much there is to know.

***

The Feynman Technique is not only a wonderful recipe for learning but also a window into a different way of thinking that allows you to tear ideas apart and reconstruct them from the ground up. It also allows you to supercharge your learning from others.

Too often, we want to seem smart rather than learn. We nod along even when we don’t understand what someone is talking about. This is a missed opportunity for learning. If you’re having a conversation with someone and they start using jargon that you don’t understand, ask them to explain it to you like you’re twelve. Not only will you supercharge your own learning, but you’ll also supercharge theirs.

Feynman’s approach rests on the belief that intelligence is a process of growth, which dovetails nicely with the work of Carol Dweck, who describes the difference between a fixed and a growth mindset.

“If you can’t reduce a difficult engineering problem to just one 8-1/2 x 11-inch sheet of paper, you will probably never understand it.”

—Ralph Peck

What does it mean to “know”?

Richard Feynman believed that “the world is much more interesting than any one discipline.” He understood the difference between knowing something and knowing the name of something, as well as how, when you truly know something, you can use that knowledge broadly.

When you only know what something is called, you have no real sense of what it is.

You can’t take it apart and play with it or use it to make new connections and generate new insights. When you know something, the labels are unimportant because it’s not necessary to keep it in the box it came in.

“The person who says he knows what he thinks but cannot express it usually does not know what he thinks.”

—Mortimer Adler

Feynman’s explanations—on why questions, why trains stay on the tracks as they go around a curve, how we look for new laws of science, or how rubber bands work—are simple and powerful. He doesn’t hide behind abstraction or jargon.

Here he articulates the difference between knowing the name of something and understanding it.

“See that bird? It’s a brown-throated thrush, but in Germany it’s called a halzenfugel, and in Chinese they call it a chung ling, and even if you know all those names for it, you still know nothing about the bird. You only know something about people: what they call the bird. Now that thrush sings, and teaches its young to fly, and flies so many miles away during the summer across the country, and nobody knows how it finds its way.”

Knowing the name of something doesn’t mean you understand it. We talk in fact-deficient, obfuscating generalities to cover up our lack of understanding.

There is no substitute for translating things into simple language that a kid can understand, because doing that translation forces the reflection you need in order to learn.

The High Price of Mistrust
https://myvibez.link/mistrust/
Mon, 25 Jan 2021

When we can’t trust each other, nothing works. As we participate in our communities less and less, we find it harder to feel other people are trustworthy. But if we can bring back a sense of trust in the people around us, the rewards are incredible.

There are costs to falling community participation. Rather than simply lamenting the loss of a past golden era (as people have done in every era), Harvard political scientist Robert D. Putnam explains these costs, as well as how we might bring community participation back.

First published twenty years ago, Bowling Alone is an exhaustive, hefty work. In its 544 pages, Putnam negotiated mountains of data to support his thesis that the previous few decades had seen Americans retreat en masse from public life. Putnam argued Americans had become disconnected from their wider communities, as evidenced by changes such as a decline in civic engagement and dwindling membership rates for groups such as bowling leagues and PTAs.

Though aspects of Bowling Alone are a little dated today (“computer-mediated communication” isn’t a phrase you’re likely to have heard recently), a quick glance at 2021’s social landscape would suggest many of the trends Putnam described have only continued and apply in other parts of the world too.

Right now, polarization and social distancing have forced us apart from any sense of community to a degree that can seem irresolvable.

Will we ever bowl in leagues alongside near strangers and turn them into friends again? Will we ever bowl again at all, even if alone, or will those gleaming lanes, too-tight shoes, and overpriced sodas fade into a distant memory we recount to our children?

The idea of going into a public space for a non-essential reason can feel incredibly out of reach for many of us right now. And who knows how spaces like bowling alleys will survive in the long run without the social scenes that fueled them. Now is a perfect time to revisit Bowling Alone to see what it can still teach us, because many of its warnings and lessons are perhaps more relevant now than at its time of publication.

One key lesson we can derive from Bowling Alone is that the less we trust each other—something which is both a cause and consequence of declining community engagement—the more it costs us. Mistrust is expensive.

We need to trust the people around us in order to live happy, productive lives. If we don’t trust them, we end up having to find costly ways to formalize our relationships. Even if we’re not engaged with other people on a social or civic level, we still have to transact with them on an economic one. We still have to walk along the same streets, send our children to the same schools, and spend afternoons in the same parks.

To live our lives freely, we need to find ways to trust that other people won’t hurt us, rip us off, or otherwise harm us. Otherwise we may lose something too precious to put a price tag on.

***

No person is an island

Putnam calls the thing we lose as community engagement declines “social capital,” meaning the sum of our connections with other individuals and the benefits those connections bring us.

Being part of a social network gives you access to all sorts of value. Putnam explains, “Just as a screwdriver (physical capital) or a college education (human capital) can increase productivity (both individual and collective), so too can social contacts affect the productivity of individuals and groups.” For example, knowing the right people can help you find a job where your skills are well utilized. If you don’t know many people, you might struggle to find work and end up doing something you’re overqualified for or be unemployed for a while.

To give another example, if you’re friends with other parents in your local neighborhood, you can coordinate with them to share childcare responsibilities. If you’re not, you’re likely to end up paying for childcare or being more limited in what you can do when your kids are home from school.

Both individuals and groups have social capital. Putnam also explains that “social capital also can have externalities that affect the wider community, so that not all of the costs and benefits of social connections accrue to the person making the contact . . . even a poorly connected individual may derive some of the spillover benefits from living in a well-connected community.” A well-connected community is usually a safer community, and the safety extends, at least partly, to the least connected members.

For example, the more neighbors know each other, the more they notice when something on the street is out of the norm and potentially harmful. That observation benefits everyone on the street—especially the most vulnerable people.

Having social capital is valuable because it undergirds certain norms. Our connections to other people require and encourage us to behave in ways that maintain those connections. Being well-connected is both an outcome of following social norms and an incentive to follow them. We adhere to “rules of conduct” for the sake of our social capital.

Social capital enables us to trust other people. When we’re connected to many others, we develop a norm of “generalized reciprocity.” Putnam explains this as meaning “I’ll do this for you without expecting anything specific back from you, in the confident expectation that someone else will do something for me down the road.” We can go for the delayed payoff that comes from being nice without an agenda. Generalized reciprocity makes all of our interactions with other people easier. It’s a form of trust.

Putnam goes on to write, “A society characterized by generalized reciprocity is more efficient than a distrustful society, for the same reason that money is more efficient than barter. If we don’t have to balance every exchange instantly, we can get a lot more accomplished. Trustworthiness lubricates social life.” Trust requires that we interact with the same people more than once, or at least think that we might.

Generalized reciprocity as a norm also enables us to work together to do things that benefit the whole group or even that don’t benefit us personally at all, rather than focusing on ourselves. If you live in a neighborhood with a norm of generalized reciprocity, you can do things like mowing a neighbor’s lawn for free because you know that when you need similar help, someone will come through. You can do things that wouldn’t make sense in an “every person for themselves” area.

Societies and groups with a norm of generalized reciprocity maintain that norm through “gossip and other valuable ways of cultivating reputation.”

When people are linked to each other, they know that news will spread if they deviate from norms. If one member of a bowling league cheats and another member notices, they’re likely to discuss it with others, and everyone will know to trust that member a little less. Knowing gossip will spread enables us to trust our perceptions of others, because if something were amiss we would have surely heard about it. It also nudges us towards behaving well—if something is amiss about us, others are sure to hear of that, too.

But with the decline of community participation comes the decline of trust. If you don’t know the people around you, how can you trust them? The more disconnected we are from each other, the less we can rely on each other to be good and nice. Without repeated interactions with the same people, we become suspicious of each other. This suspicion carries heavy costs.

***

Rising transaction costs

In economics, a “transaction cost” refers to the cost of making some sort of trade within a market. Transaction costs are the price we pay in order to exchange value. They’re in addition to the cost of producing or otherwise providing that value.

For example, when you make a credit card purchase in a shop, the shop likely pays a processing fee to the card company. It’s part of the cost of doing business with you. Another cost is that the shop needs people working in it to ensure you pay. They can’t just rely on you popping the right money in the till then leaving.

Putnam explains later in the book that being able to trust people as a result of a norm of generalized reciprocity in our social lives leads to reduced transaction costs. It means we can relax around other people and not be distracted by “worrying whether you got back the right change from the clerk to double-checking that you locked the car door.” We can easily be honest if we know others will do the same.

With the decline of social capital comes rising transaction costs. We can’t rely on other people to treat us as they would like to be treated because we don’t know them and haven’t built the opportunities to engage in reciprocal relationships.

Much like trusting trustworthy people has great benefits, trusting untrustworthy people has enormous costs. No one likes being exploited or ripped off because they assumed good faith in the wrong person.

If we’re uncertain, we default to mistrust. You can see the endpoint of a loss of trust in societies and groups which must rely on the use or threat of force to get anything done because everyone is out to rip off everyone else.

At a certain point, transaction costs can cancel out the benefits of transacting at all. If borrowing a leaf blower from a neighbor requires a lawyer to set up a contract stipulating the terms of its use, then borrowing doesn’t save any money. The borrower might as well hire someone or buy their own.

We don’t try new things when we can’t trust other people. So we have to find additional ways of making transactions work. One way we do this is through “the rule of law—formal contracts, courts, litigation, adjudication, and enforcement by the state.” During the period since the 1970s in which Putnam considers social capital to have declined, the legal profession grew faster than any other: “After 1970 the legal profession grew three times faster than the other professions as a whole.”

While we can’t attribute that solely to a decline in social capital, it seems clear that mistrusting each other makes us more likely to prefer to get things in writing. We are “forced to rely increasingly on formal institutions, and above all the law, to accomplish what we used to accomplish through informal networks reinforced by generalized reciprocity—that is, through social capital.”

***

The high price of mistrust

The cost of mistrust doesn’t just show up in the form of bills from lawyers. It poisons everything we do and further drives us apart.

Mistrust drives us to install remote monitoring software on our employees’ laptops and ask them to fill in reports on every tiny task to prove they’re not skiving off. It drives us to make excuses when a friend asks for help moving or a lift to the airport because no one was available last time we needed that same help. It drives us to begrudgingly buy a household appliance or tool we’ll only use once because we don’t even consider borrowing it from a neighbor.

Mistrust nudges us to peek at the search history of a partner or to cross-reference what a child says. It causes us to keep our belongings close in public, to double-lock the doors, to not let our kids play in the street, and a million other tiny changes.

Mistrust costs us time and money, sure. But it also costs us a little bit of our humanity. We are sociable animals, and seeing the people around us as a potential threat, even a small one, wears on us. Constant vigilance is exhausting. So is being under constant suspicion.

One lesson we can take from Bowling Alone is that anything we can do to increase trust between people will have tremendous knock-on benefits. Trust allows us to relax, delay gratification, and generally be nicer to everyone. It makes for a nicer day-to-day existence. We don’t need to spend so much time and money checking up on others. Ultimately, it’s worth investing in trust whenever possible, as opposed to investing in more ways of monitoring and controlling people.

That’s not to say that there was ever a golden utopia when everyone trusted everyone. People have always abused the trust of others. And people on the fringes of society have always been unfairly mistrusted and struggled to trust that others would act in good faith. Nonetheless, whenever we go to install some mechanism intended to replace trust, it’s worth asking if there’s a different way.

The ingredients for trust are simple. We need to repeatedly interact with the same people, know that others will warn us about their bad behavior, and feel secure in the knowledge we’ll be helped when and if we need it. At the same time, we need to know others will be warned if we behave badly and that everything we give to others will come back to us, perhaps multiplied.

If you want people to trust you, the best place to start is by trusting them. That isn’t always easy to do, especially if you’ve paid the price for it in the past. But it’s the best place to start. Then you need to combine it with repeat interactions, or the possibility thereof. In the iterated Prisoner’s Dilemma, a game that reveals how cooperation works, the best strategy to adopt is tit for tat. In the first round you cooperate, then in subsequent rounds do whatever the other player did last.
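To make the mechanics concrete, here is a minimal sketch of tit for tat in an iterated Prisoner’s Dilemma. The payoff values and the comparison strategy are illustrative assumptions, not anything from Putnam’s book.

```python
# Minimal sketch of tit for tat in an iterated Prisoner's Dilemma.
# The payoff matrix uses conventional illustrative values (an assumption).

PAYOFFS = {  # (my_move, their_move) -> my points
    ("C", "C"): 3,  # mutual cooperation
    ("C", "D"): 0,  # I cooperate, they defect
    ("D", "C"): 5,  # I defect, they cooperate
    ("D", "D"): 1,  # mutual defection
}

def tit_for_tat(my_history, their_history):
    """Cooperate on the first round, then copy the other player's last move."""
    if not their_history:
        return "C"
    return their_history[-1]

def always_defect(my_history, their_history):
    """A mistrustful strategy for comparison: never cooperate."""
    return "D"

def play(strategy_a, strategy_b, rounds=10):
    """Play repeated rounds and return each side's total score."""
    history_a, history_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(history_a, history_b)
        move_b = strategy_b(history_b, history_a)
        score_a += PAYOFFS[(move_a, move_b)]
        score_b += PAYOFFS[(move_b, move_a)]
        history_a.append(move_a)
        history_b.append(move_b)
    return score_a, score_b

if __name__ == "__main__":
    print(play(tit_for_tat, tit_for_tat))    # sustained cooperation pays both sides
    print(play(tit_for_tat, always_defect))  # tit for tat punishes defection after round one
```

The repeated rounds are the point: a cooperative opening only pays off when there will be future interactions in which it can be reciprocated.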

How might that play out in real life? If you want your employees to trust you, then you might start by trusting them—while also making it clear that you’re not going to fire them suddenly and you want them to stick around.

Mistrust is expensive. But trusting the wrong people can sometimes seem too risky. The lesson we can take from Bowling Alone is that building trust is absolutely worthwhile—and that the only way to do it is by finding ways to get out there and engage with other people.

We can create trust by contributing to existing communities and creating new ones. The more we show up and are willing to have faith in others, the more we’ll get back in return.

The Best of The Knowledge Project 2020
https://myvibez.link/best-conversations-2020/
Sun, 27 Dec 2020

One of the best ways to learn is a good conversation.

While there are many advantages to a good conversation, perhaps the best is that you can benefit from the lessons other people have already paid the price for. Of course, that’s not all. Good conversations can also offer a new way to interpret your past experiences, help you discover something new, and remind you of something you already know.

A good conversation updates the software in your brain. But not all updates are the same. Learning more isn’t simply a matter of having more conversations, but rather getting more out of each conversation that you are a part of. Deep conversations with ‘people that do’ offer the richest source of learning. Conversations that skim the surface, on the other hand, only offer the illusion of learning.

With that in mind, we’d like to invite you to join us in the top conversations we had on The Knowledge Project in 2020.

It’s time to listen and learn.

  • Episode 82: Bill Ackman: Getting Back Up — Legendary activist investor Bill Ackman talks about lessons he’s learned growing up, raising a family, what drives him forward, how he gets back up from failure, consuming information and ideas, and facing criticism.
  • Episode 94: Chamath Palihapitiya: Understanding Yourself — Founder and CEO of Social Capital, Chamath Palihapitiya sits down with Shane Parrish to chat about what it means to be an observer of the present, how to think in first principles, the psychology of successful investing, his thoughts on the best public company CEOs and much more.
  • Episode 74: Embracing Confusion with Jeff Hunter — CEO of Talentism, Jeff Hunter, teaches how to rewrite damaging narratives that hold us back, how to give and receive helpful feedback, and why confusion can be a good thing.
  • Episode 80: Developing the Leader in You with John Maxwell — Leadership expert John Maxwell breaks down the four traits every successful person possesses and how to awaken the leader within you, no matter what your job title says.
  • Episode 85: Bethany McLean: Crafting a Narrative — Best-selling author of The Smartest Guys in the Room and All the Devils Are Here, Bethany McLean discusses how to write a story, the behaviors of CEOs, visionaries, and fraudsters, and so much more.

Honorable mention to Derek Sivers: Innovation Versus Imitation [The Knowledge Project Ep. #88], which was only 131 downloads away from making the list.

In other news this year, we released a TKP YouTube channel with full-length videos of our conversations so you can see the guests, as well as a “Clips” channel, where we are building the world’s best repository of nugget-sized information you can use in work and life.

If you’re still curious, check out the 2019 list.

Aim For What’s Reasonable: Leadership Lessons From Director Jean Renoir
https://myvibez.link/aim-for-whats-reasonable-leadership-lessons-from-director-jean-renoir/
Mon, 31 Aug 2020

Directing a film involves getting an enormous group of people to work together on turning the image inside your head into a reality. In this 1970 interview, director Jean Renoir dispenses time-tested wisdom for leaders everywhere on humility, accountability, goal-setting, and more.

***

Many of us end up in leadership roles at some point in our career. Most of us, however, never get any training or instruction on how to actually be a good leader. But whether we end up offering formal or informal leadership, at some point we need to inspire or motivate people towards accomplishing a shared vision.

Directors are the leaders of movie productions. They assemble their team, they communicate their vision, and they manage the ups and downs of the filming process. Thus the experience of a successful director offers great insight into the qualities of a good leader. In 1970, film director Jean Renoir gave an interview with George Stevens Jr. of the American Film Institute where he discussed the leadership aspects of directing. His insights illustrate some important lessons. Renoir started out making silent films, and he continued filmmaking through to the 1960s. His two greatest cinematic achievements were the films The Grand Illusion (1937) and The Rules of the Game (1939). He received a Lifetime Achievement Academy Award in 1975 for his contribution to the motion picture industry.

In the interview, Renoir speaks to humility in leadership when he says, “I’m a director who has spent his life suggesting stories that nobody wanted. It’s still going on. But I’m used to it and I’m not complaining, because the ideas which were forced on me were often better than my own ideas.”

Leadership is not necessarily coming up with all the answers; it’s also important to put aside your own ego to cultivate and support the contributions from your team. Sometimes leaders have the best ideas. But often people on their team have excellent ones as well.

Renoir suggests that the role of a director is to have a clear enough vision that you can work through the imperfections involved in executing it. “A picture, often when it is good, is the result of some inner belief which is so strong that you have to show what you want, in spite of a stupid story or difficulties about the commercial side of the film.”

Good leaders don’t require perfection to achieve results. They work with what they have, often using creativity and ingenuity to fill in when reality doesn’t conform to the ideal image in their head. Having a vision is not about achieving exactly that vision. It’s about doing the best you can once you come into contact with reality.

When Renoir says, “We directors are simply midwives,” he implies that effective leadership is about giving shape to the talents and capabilities that already exist. Excellent leaders find a way to challenge and develop those on their team. In explaining how he works with actors, he says, “You must not ask an actor to do what he cannot do.” Rather, you need to work with what you have, using clear feedback and communication to find a way to bring out the best in people. Sometimes getting out of people’s way and letting their natural abilities come out is the most important thing to do.

Although Renoir says, “When I can, I shoot my scenes only once. I like to be committed, to be a slave to my decision,” he further explains, “I don’t like to make the important decisions alone.” Good leaders know when to consult others. They know to take in information from those who know more than they do and to respect different forms of expertise. But they still take accountability for their decisions because they made the final choice.

Good leaders are also mindful of the world outside the group or organization they are leading. They don’t lead in a vacuum but are sensitive to all those involved in achieving the results they are trying to deliver. For a director, it makes no sense to conceive of a film without considering the audience. Renoir explains, “I believe that the work of art where the spectator does not collaborate is not a work of art.” Similarly, we all have groups that we interact with outside of our organization, like clients or customers. We too need to run our teams with an understanding of that outside world.

No one can be good at everything, and thus effective leadership involves knowing when to ask for help. Renoir admits, “That’s where I like to have my friends help me, because I am very bad at casting.” Knowing your weaknesses is vital, because then you can find people who have strengths in those areas to assist you.

Additionally, most organizations are too complex for any one person to be an expert at all of the roles. Leaders show hubris when they assume they can do the jobs of everyone else well. Renoir explains this notion of knowing your role as a leader: “Too many directors work like this. They tell the actor, ‘Sit down, my dear friends, and look at me. I am going to act a scene, and you are going to repeat what I just did.’ He acts a scene and he acts it badly, because if he is a director instead of an actor, it’s probably because he’s a bad actor.”

***

Although leadership can be all-encompassing, we shouldn’t be intimidated by the ideal list of qualities and behaviors a good leader displays. Focus on how you can improve. Set goals. Reflect on your failures, and recognize your successes.

“You know, there is an old slogan, very popular in our occidental civilization: you must look to an end higher than normal, and that way you will achieve something. Your aim must be very, very high. Myself, I am absolutely convinced that it is mere stupidity. The aim must be easy to reach, and by reaching it, you achieve more.”

Job Interviews Don’t Work
https://myvibez.link/job-interviews/
Mon, 06 Jul 2020

Better hiring leads to better work environments, less turnover, and more innovation and productivity. When you understand the limitations and pitfalls of the job interview, you improve your chances of hiring the best possible person for your needs.

***

The job interview is a ritual just about every adult goes through at least once. They seem to be a ubiquitous part of most hiring processes. The funny thing about them, however, is that they take up time and resources without actually helping to select the best people to hire. Instead, they promote a homogenous workforce where everyone thinks the same.

If you have any doubt about how much you can get from an interview, think of what’s involved for the person being interviewed. We’ve all been there. The night before, you dig out your smartest outfit, iron it, and hope your hair lies flat for once. You frantically research the company, reading every last news article based on a formulaic press release, every blog post by the CEO, and every review by a disgruntled former employee.

After a sleepless night, you trek to their office, make awkward small talk, then answer a set of predictable questions. What’s your biggest weakness? Where do you see yourself in five years? Why do you want this job? Why are you leaving your current job? You reel off the answers you prepared the night before, highlighting the best of the best. All the while, you’re reminding yourself to sit up straight, don’t bite your nails, and keep smiling.

It’s not much better on the employer’s side of the table. When you have a role to fill, you select a list of promising candidates and invite them for an interview. Then you pull together a set of standard questions to riff off, doing a little improvising as you hear their responses. At the end of it all, you make some kind of gut judgment about the person who felt right—likely the one you connected with the most in the short time you were together.

Is it any surprise that job interviews don’t work when the whole process is based on subjective feelings? They are in no way the most effective means of deciding who to hire because they maximize the role of bias and minimize the role of evaluating competency.

What is a job interview?

“In most cases, the best strategy for a job interview is to be fairly honest, because the worst thing that can happen is that you won’t get the job and will spend the rest of your life foraging for food in the wilderness and seeking shelter underneath a tree or the awning of a bowling alley that has gone out of business.”

— Lemony Snicket, Horseradish

When we say “job interviews” throughout this post, we’re talking about the type of interview that has become standard in many industries and even in universities: free-form interviews in which candidates sit in a room with one or more people from a prospective employer (often people they might end up working with) and answer unstructured questions. Such interviews tend to focus on how a candidate behaves generally, emphasizing factors like whether they arrive on time or if they researched the company in advance. While questions may ostensibly be about predicting job performance, they tend to better select for traits like charisma rather than actual competence.

Unstructured interviews can make sense for certain roles. The ability to give a good first impression and be charming matters for a salesperson. But not all roles need charm, and just because you don’t want to hang out with someone after an interview doesn’t mean they won’t be an amazing software engineer. In a small startup with a handful of employees, someone being “one of the gang” might matter because close-knit friendships are a strong motivator when work is hard and pay is bad. But that group mentality may be less important in a larger company in need of diversity.

Considering the importance of hiring and how much harm getting it wrong can cause, it makes sense for companies to study and understand the most effective interview methods. Let’s take a look at why job interviews don’t work and what we can do instead.

Why job interviews are ineffective

Discrimination and bias

Information like someone’s age, gender, race, appearance, or social class shouldn’t dictate if they get a job or not—their competence should. But that’s unfortunately not always the case. Interviewers can end up picking the people they like the most, which often means those who are most similar to them. This ultimately means a narrower range of competencies is available to the organization.

Psychologist Ron Friedman explains in The Best Place to Work: The Art and Science of Creating an Extraordinary Workplace some of the unconscious biases that can impact hiring. We tend to rate attractive people as more competent, intelligent, and qualified. We consider tall people to be better leaders, particularly when evaluating men. We view people with deep voices as more trustworthy than those with higher voices.

Implicit bias is pernicious because it’s challenging to spot the ways it influences interviews. Once an interviewer judges someone, they may ask questions that nudge the interviewee towards fitting that perception. For instance, if they perceive someone to be less intelligent, they may ask basic questions that don’t allow the candidate to display their expertise. Having confirmed their bias, the interviewer has no reason to question it or even notice it in the future.

Hiring often comes down to how much an interviewer likes a candidate as a person. This means that we can be manipulated by manufactured charm. If someone’s charisma is faked for an interview, an organization can be left dealing with the fallout for ages.

The map is not the territory

The representation of something is not the thing itself. A job interview is meant to be a quick snapshot to tell a company how a candidate would be at a job. However, it’s not a representative situation in terms of replicating how the person will perform in the actual work environment.

For instance, people can lie during job interviews. Indeed, the situation practically encourages it. While most people feel uncomfortable telling outright lies (and know they would face serious consequences later on for a serious fabrication), bending the truth is common. Ron Friedman writes, “Research suggests that outright lying generates too much psychological discomfort for people to do it very often. More common during interviews are more nuanced forms of deception which include embellishment (in which we take credit for things we haven’t done), tailoring (in which we adapt our answers to fit the job requirements), and constructing (in which we piece together elements from different experiences to provide better answers.)” An interviewer can’t know if someone is deceiving them in any of these ways. So they can’t know if they’re hearing the truth.

One reason why we think job interviews are representative is the fundamental attribution error. This is a logical fallacy that leads us to believe that the way people behave in one area carries over to how they will behave in other situations. We view people’s behaviors as the visible outcome of innate characteristics, and we undervalue the impact of circumstances.

Some employers report using one single detail they consider representative to make hiring decisions, such as whether a candidate sends a thank-you note after the interview or if their LinkedIn picture is a selfie. Sending a thank-you note shows manners and conscientiousness. Having a selfie on LinkedIn shows unprofessionalism. But is that really true? Can one thing carry across to every area of job performance? It’s worth debating.

Gut feelings aren’t accurate

We all like to think we can trust our intuition. The problem is that intuitive judgments tend to only work in areas where feedback is fast and cause and effect clear. Job interviews don’t fall into that category. Feedback is slow. The link between a hiring decision and a company’s success is unclear.

Overwhelmed by candidates and the pressure of choosing, interviewers may resort to making snap judgments based on limited information. And interviews introduce a lot of noise, which can dilute relevant information while leading to overconfidence. In a study entitled Belief in the Unstructured Interview: The Persistence of an Illusion, participants predicted the future GPA of a set of students. They either received biographical information about the students or both biographical information and an interview. In some of the cases, the interview responses were entirely random, meaning they shouldn’t have conveyed any genuine useful information.

Before the participants made their predictions, the researchers informed them that the strongest predictor of a student’s future GPA is their past GPA. Seeing as all participants had access to past GPA information, they should have factored it heavily into their predictions.

In the end, participants who were able to interview the students made worse predictions than those who only had access to biographical information. Why? Because the interviews introduced too much noise. They distracted participants with irrelevant information, making them forget the most significant predictive factor: past GPA. Of course, we do not have clear metrics like GPA for jobs. But this study indicates that interviews do not automatically lead to better judgments about a person.

We tend to think human gut judgments are superior, even when evidence doesn’t support this. We are quick to discard information that should shape our judgments in favor of less robust intuitions that we latch onto because they feel good. The less challenging information is to process, the better it feels. And we tend to associate good feelings with ‘rightness’.

Experience ≠ expertise in interviewing

In 1979, the University of Texas Medical School at Houston suddenly had to increase its incoming class size by 50 students due to a legal change requiring larger classes. Without time to interview again, the school filled the extra spots from the pool of candidates it had interviewed and then rejected as unsuitable for admission. Seeing as they got through to the interview stage, these candidates had to be among the best applicants. They just weren’t previously considered good enough to admit.

When researchers later studied the result of this unusual situation, they found that the students whom the school first rejected performed no better or worse academically than the ones they first accepted. In short, interviewing students did nothing to help select for the highest performers.

Studying the efficacy of interviews is complicated and hard to manage from an ethical standpoint. We can’t exactly give different people the same real-world job in the same conditions. We can take clues from fortuitous occurrences, like the University of Texas Medical School change in class size and the subsequent lessons learned. Without the legal change, the interviewers would never have known that the students they rejected were of equal competence to the ones they accepted. This is why building up experience in this arena is difficult. Even if someone has a lot of experience conducting interviews, it’s not straightforward to translate that into expertise. Expertise is about having a predictive model of something, not just knowing a lot about it.

Furthermore, the feedback from hiring decisions tends to be slow. An interviewer cannot know what would happen if they hired an alternate candidate. If a new hire doesn’t work out, that tends to fall on them, not the person who chose them. There are so many factors involved that it’s not terribly conducive to learning from experience.

Making interviews more effective

It’s easy to see why job interviews are so common. People want to work with people they like, so interviews allow them to scope out possible future coworkers. Candidates expect interviews, as well—wouldn’t you feel a bit peeved if a company offered you a job without the requisite “casual chat” beforehand? Going through a grueling interview can make candidates more invested in the position and likely to accept an offer. And it can be hard to imagine viable alternatives to interviews.

But it is possible to make job interviews more effective or make them the final step in the hiring process after using other techniques to gauge a potential hire’s abilities. Doing what works should take priority over what looks right or what has always been done.

Structured interviews

While unstructured interviews don’t work, structured ones can be excellent. In Thinking, Fast and Slow, Daniel Kahneman describes how he redefined the Israel Defense Forces’ interviewing process as a young psychology graduate. At the time, recruiting a new soldier involved a series of psychometric tests followed by an interview to assess their personality. Interviewers then based their decision on their intuitive sense of a candidate’s fitness for a particular role. It was very similar to the method of hiring most companies use today—and it proved to be useless.

Kahneman introduced a new interviewing style in which candidates answered a predefined series of questions that were intended to measure relevant personality traits for the role (for example, responsibility and sociability). He then asked interviewers to give candidates a score for how well they seemed to exhibit each trait based on their responses. Kahneman explained that “by focusing on standardized, factual questions I hoped to combat the halo effect, where favorable first impressions influence later judgments.” He tasked interviewers only with providing these numbers, not with making a final decision.

Although interviewers at first disliked Kahneman’s system, structured interviews proved far more effective and soon became the standard for the IDF. In general, they are often the most useful way to hire. The key is to decide in advance on a list of questions, specifically designed to test job-specific skills, then ask them to all the candidates. In a structured interview, everyone gets the same questions with the same wording, and the interviewer doesn’t improvise.

Tomas Chamorro-Premuzic writes in The Talent Delusion:

There are at least 15 different meta-analytic syntheses on the validity of job interviews published in academic research journals. These studies show that structured interviews are very useful to predict future job performance. . . . In comparison, unstructured interviews, which do not have a set of predefined rules for scoring or classifying answers and observations in a reliable and standardized manner, are considerably less accurate.

Why does it help if everyone hears the same questions? Because, as we learned previously, interviewers can make unconscious judgments about candidates, then ask questions intended to confirm their assumptions. Structured interviews help measure competency, not irrelevant factors. Ron Friedman explains this further:

It’s also worth having interviewers develop questions ahead of time so that: 1) each candidate receives the same questions, and 2) they are worded the same way. The more you do to standardize your interviews, providing the same experience to every candidate, the less influence you wield on their performance.

What, then, is an employer to do with the answers? Friedman says you must then create clear criteria for evaluating them.

Another step to help minimize your interviewing blind spots: include multiple interviewers and give them each specific criteria upon which to evaluate the candidate. Without a predefined framework for evaluating applicants—which may include relevant experience, communication skills, attention to detail—it’s hard for interviewers to know where to focus. And when this happens, fuzzy interpersonal factors hold greater weight, biasing assessments. Far better to channel interviewers’ attention in specific ways, so that the feedback they provide is precise.
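As a rough sketch of what that kind of standardized scoring might look like in practice, here is a minimal example; the criteria, the 1–5 scale, and the equal weighting are illustrative assumptions rather than a prescription from Kahneman or Friedman.

```python
# Minimal sketch: combine per-criterion scores from several interviewers
# into a single comparable number per candidate. Criteria and weights are
# illustrative assumptions, not a recommended rubric.

CRITERIA = ["relevant_experience", "communication", "attention_to_detail"]

def average_scores(scores_by_interviewer):
    """scores_by_interviewer: list of dicts mapping criterion -> rating on a 1-5 scale."""
    totals = {criterion: 0.0 for criterion in CRITERIA}
    for ratings in scores_by_interviewer:
        for criterion in CRITERIA:
            totals[criterion] += ratings[criterion]
    n = len(scores_by_interviewer)
    return {criterion: total / n for criterion, total in totals.items()}

def overall(criterion_averages):
    """Equal-weight overall score; the weighting is a design choice."""
    return sum(criterion_averages.values()) / len(criterion_averages)

if __name__ == "__main__":
    # Two interviewers rate the same candidate against the same rubric.
    candidate = [
        {"relevant_experience": 4, "communication": 3, "attention_to_detail": 5},
        {"relevant_experience": 4, "communication": 4, "attention_to_detail": 4},
    ]
    averages = average_scores(candidate)
    print(averages, round(overall(averages), 2))
```

Because every candidate is rated against the same rubric by every interviewer, the numbers are comparable, and the final decision can be separated from any single interviewer’s gut feel.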

Blind auditions

One way to make job interviews more effective is to find ways to “blind” the process—to disguise key information that may lead to biased judgments. Blinded interviews focus on skills alone, not who a candidate is as a person. Orchestras offer a remarkable case study in the benefits of blinding.

In the 1970s, orchestras had a gender bias problem. A mere 5% of their members were women, on average. Orchestras knew they were missing out on potential talent, but they found the audition process seemed to favor men over women. Those who were carrying out auditions couldn’t sidestep their unconscious tendency to favor men.

Instead of throwing up their hands in despair and letting this inequality stand, orchestras began carrying out blind auditions. During these, candidates would play their instruments behind a screen while a panel listened and assessed their performance. They received no identifiable information about candidates. The idea was that orchestras would be able to hire without room for bias. It took a bit of tweaking to make it work: at first, the panel could still discern gender from the sound of a candidate’s shoes, so candidates were asked to remove their shoes before auditioning.

The results? By 1997, up to 25% of orchestra members were women. Today, the figure is closer to 30%.

Although this is sometimes difficult to replicate for other types of work, blind auditions can provide inspiration to other industries that could benefit from finding ways to make interviews more about a person’s abilities than their identity.

Competency-related evaluations

What’s the best way to test if someone can do a particular job well? Get them to carry out tasks that are part of the job. See if they can do what they say they can do. It’s much harder for someone to lie and mislead an interviewer during actual work than during an interview. Using competency tests for a blinded interview process is also possible—interviewers could look at depersonalized test results to make unbiased judgments.
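For illustration only, here is a minimal sketch of depersonalizing competency-test results before review; the field names and scores are hypothetical, and a real process would also need care around indirect re-identification.

```python
# Minimal sketch: strip identifying fields from a candidate record so that
# reviewers see only anonymized competency scores. Field names are hypothetical.

import uuid

IDENTIFYING_FIELDS = {"name", "email", "photo_url", "age", "gender"}

def blind(candidate_record):
    """Return (anonymous_id, depersonalized_record) for reviewer use."""
    anonymous_id = str(uuid.uuid4())
    depersonalized = {key: value for key, value in candidate_record.items()
                      if key not in IDENTIFYING_FIELDS}
    return anonymous_id, depersonalized

if __name__ == "__main__":
    record = {"name": "A. Candidate", "email": "a@example.com",
              "code_test_score": 87, "writing_sample_score": 72}
    anon_id, blinded = blind(record)
    print(anon_id, blinded)  # reviewers only ever see this blinded view
```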

Tomas Chamorro-Premuzic writes in The Talent Delusion: Why Data, Not Intuition, Is the Key to Unlocking Human Potential, “The science of personnel selection is over a hundred years old yet decision-makers still tend to play it by ear or believe in tools that have little academic rigor. . . . An important reason why talent isn’t measured more scientifically is the belief that rigorous tests are difficult and time-consuming to administer, and that subjective evaluations seem to do the job ‘just fine.’”

Competency tests are already quite common in many fields. But interviewers tend not to accord them sufficient importance. They come after an interview, or they’re considered secondary to it. A bad interview can override a good competency test. At best, interviewers accord them equal importance to interviews. Yet they should consider them far more important.

Ron Friedman writes, “Extraneous data such as a candidate’s appearance or charisma lose their influence when you can see the way an applicant actually performs. It’s also a better predictor of their future contributions because unlike traditional in-person interviews, it evaluates job-relevant criteria. Including an assignment can help you better identify the true winners in your applicant pool while simultaneously making them more invested in the position.”

Conclusion

If a company relies on traditional job interviews as its sole or main means of choosing employees, it simply won’t get the best people. And getting hiring right is paramount to the success of any organization. A driven team of people passionate about what they do can trump one with better funding and resources. The key to finding those people is using hiring techniques that truly work.

The post Job Interviews Don’t Work appeared first on Farnam Street.

]]>
42535
Standing on the Shoulders of Giants https://myvibez.link/shoulders-of-giants/ Mon, 13 Apr 2020 13:33:34 +0000 https://myvibez.link/?p=41681 Innovation doesn’t occur in a vacuum. Doers and thinkers from Shakespeare to Jobs, liberally “stole” inspiration from the doers and thinkers who came before. Here’s how to do it right. *** “If I have seen further,” Isaac Newton wrote in a 1675 letter to fellow scientist Robert Hooke, “it is by standing on the shoulders …

Innovation doesn’t occur in a vacuum. Doers and thinkers from Shakespeare to Jobs, liberally “stole” inspiration from the doers and thinkers who came before. Here’s how to do it right.

***

“If I have seen further,” Isaac Newton wrote in a 1675 letter to fellow scientist Robert Hooke, “it is by standing on the shoulders of giants.”

It can be easy to look at great geniuses like Newton and imagine that their ideas and work came solely out of their minds, that they spun it from their own thoughts—that they were true originals. But that is rarely the case.

Innovative ideas have to come from somewhere. No matter how unique or unprecedented a work seems, dig a little deeper and you will always find that the creator stood on someone else’s shoulders. They mastered the best of what other people had already figured out, then made that expertise their own. With each iteration, they could see a little further, and they were content in the knowledge that future generations would, in turn, stand on their shoulders.

Standing on the shoulders of giants is a necessary part of creativity, innovation, and development. It doesn’t make what you do less valuable. Embrace it.

Everyone gets a lift up

Ironically, Newton’s turn of phrase wasn’t even entirely his own. The phrase can be traced back to the twelfth century, when the author John of Salisbury wrote that philosopher Bernard of Chartres compared people to dwarves perched on the shoulders of giants and said that “we see more and farther than our predecessors, not because we have keener vision or greater height, but because we are lifted up and borne aloft on their gigantic stature.”

Mary Shelley put it this way in the nineteenth century, in a preface for Frankenstein: “Invention, it must be humbly admitted, does not consist in creating out of void but out of chaos.”

There are giants in every field. Don’t be intimidated by them. They offer an exciting perspective. As the film director Jim Jarmusch advised, “Nothing is original. Steal from anywhere that resonates with inspiration or fuels your imagination. Devour old films, new films, music, books, paintings, photographs, poems, dreams, random conversations, architecture, bridges, street signs, trees, clouds, bodies of water, light, and shadows. Select only things to steal from that speak directly to your soul. If you do this, your work (and theft) will be authentic. Authenticity is invaluable; originality is non-existent. And don’t bother concealing your thievery—celebrate it if you feel like it. In any case, always remember what Jean-Luc Godard said: ‘It’s not where you take things from—it’s where you take them to.’”

That might sound demoralizing. Some might think, “My song, my book, my blog post, my startup, my app, my creation—surely they are original? Surely no one has done this before!” But that’s likely not the case. It’s also not a bad thing. Filmmaker Kirby Ferguson states in his TED Talk: “Admitting this to ourselves is not an embrace of mediocrity and derivativeness—it’s a liberation from our misconceptions, and it’s an incentive to not expect so much from ourselves and to simply begin.”

There lies the important fact. Standing on the shoulders of giants enables us to see further, not merely as far as before. When we build upon prior work, we often improve upon it and take humanity in new directions. However original your work seems to be, the influences are there—they might just be uncredited or not obvious. As we know from social proof, copying is a natural human tendency. It’s how we learn and figure out how to behave.

In Antifragile: Things That Gain from Disorder, Nassim Taleb describes the type of antifragile inventions and ideas that have lasted throughout history. He describes himself heading to a restaurant (the likes of which have been around for at least 2,500 years), in shoes similar to those worn at least 5,300 years ago, to use silverware designed by the Mesopotamians. During the evening, he drinks wine based on a 6,000-year-old recipe, from glasses invented 2,900 years ago, followed by cheese unchanged through the centuries. The dinner is prepared with one of our oldest tools, fire, and using utensils much like those the Romans developed.

Much about our societies and cultures has undeniably changed and continues to change at an ever-faster rate. But we continue to stand on the shoulders of those who came before in our everyday life, using their inventions and ideas, and sometimes building upon them.

Not invented here syndrome

When we discredit what came before or try to reinvent the wheel or refuse to learn from history, we hold ourselves back. After all, many of the best ideas are the oldest. “Not Invented Here Syndrome” is a term for situations when we avoid using ideas, products, or data created by someone else, preferring instead to develop our own (even if it is more expensive, time-consuming, and of lower quality).

The syndrome can also manifest as reluctance to outsource or delegate work. People might think their output is intrinsically better if they do it themselves, becoming overconfident in their own abilities. After all, who likes getting told what to do, even by someone who knows better? Who wouldn’t want to be known as the genius who (re)invented the wheel?

Developing a new solution for a problem is more exciting than using someone else’s ideas. But new solutions, in turn, create new problems. Some people joke that, for example, the largest Silicon Valley companies are in fact just impromptu incubators for people who will eventually set up their own business, firm in the belief that what they create themselves will be better.

The syndrome is also a case of the sunk cost fallacy. If a company has spent a lot of time and money getting a square wheel to work, it may resist buying the round ones that someone else comes out with. The opportunity costs can be tremendous. Not Invented Here Syndrome pulls an organization or individual away from their core competency and wastes time and talent on what are ultimately distractions. Better to build on someone else’s idea and, in turn, become a giant for someone else to stand on.

Why Steve Jobs stole his ideas

“Creativity is just connecting things. When you ask creative people how they did something, they feel a little guilty because they didn’t really do it. They just saw something. It seemed obvious to them after a while; that’s because they were able to connect experiences they’ve had and synthesize new things.” 

— Steve Jobs

In The Runaway Species: How Human Creativity Remakes the World, Anthony Brandt and David Eagleman trace the path that led to the creation of the iPhone and track down the giants upon whose shoulders Steve Jobs perched. We often hail Jobs as a revolutionary figure who changed how we use technology. Few who were around in 2007 could have failed to notice the buzz created by the release of the iPhone. It seemed so new, a total departure from anything that had come before. The truth is a little messier.

The first touchscreen came about almost half a century before the iPhone, developed by E.A. Johnson for air traffic control. Other engineers built upon his work and developed usable models, filing a patent in 1975. Around the same time, the University of Illinois was developing touchscreen terminals for students. Prior to touchscreens, light pens used similar technology. The first commercial touchscreen computer came out in 1983, soon followed by graphics boards, tablets, watches, and video game consoles. Casio released a touchscreen pocket computer in 1987 (remember, this is still a full twenty years before the iPhone).

However, early touchscreen devices were frustrating to use, with very limited functionality, often short battery lives, and minimal use cases for the average person. As touchscreen devices developed in complexity and usability, they laid down the groundwork for the iPhone.

Likewise, the iPod built upon the work of Kane Kramer, who took inspiration from the Sony Walkman. Kramer designed a small portable music player in the 1970s. The IXI, as he called it, looked similar to the iPod but arrived too early for a market to exist, and Kramer lacked the marketing skills to create one. When pitching to investors, Kramer described the potential for immediate delivery, digital inventory, taped live performances, back catalog availability, and the promotion of new artists and microtransactions. Sound familiar?

Steve Jobs stood on the shoulders of the many unseen engineers, students, and scientists who worked for decades to build the technology he drew upon. Although Apple has a long history of merciless lawsuits against those they consider to have stolen their ideas, many were not truly their own in the first place. Brandt and Eagleman conclude that “human creativity does not emerge from a vacuum. We draw on our experience and the raw materials around us to refashion the world. Knowing where we’ve been, and where we are, points the way to the next big industries.”

How Shakespeare got his ideas

“Nothing will come of nothing.”

— William Shakespeare, King Lear

Most, if not all, of Shakespeare’s plays draw heavily upon prior works—so much so that some question whether he would have survived today’s copyright laws.

Hamlet took inspiration from Gesta Danorum, a twelfth-century work on Danish history by Saxo Grammaticus, consisting of sixteen Latin books. Although it is doubtful whether Shakespeare had access to the original text, scholars find the parallels undeniable and believe he may have read another play based on it, from which he drew inspiration. In particular, the account of the plight of Prince Amleth (whose name contains the same letters as Hamlet) involves strikingly similar events.

Holinshed’s Chronicles, a co-authored account of British history from the late sixteenth century, tells stories that mimic the plot of Macbeth, including the three witches. Holinshed’s Chronicles itself was a mélange of earlier texts, which transferred their biases and fabrications to Shakespeare. It also likely inspired King Lear.

Parts of Antony and Cleopatra are copied verbatim from Plutarch’s Life of Mark Antony. Arthur Brooke’s 1562 poem The Tragicall Historye of Romeus and Juliet was an undisguised template for Romeo and Juliet. Once again, there are more giants behind the scenes—Brooke copied a 1559 poem by Pierre Boaistuau, who in turn drew from a 1554 story by Matteo Bandello, who in turn drew inspiration from a 1530 work by Luigi da Porto. The list continues, with Plutarch, Chaucer, and the Bible acting as inspirations for many major literary, theatrical, and cultural works.

Yet what Shakespeare did with the works he sometimes copied, sometimes learned from, is remarkable. Take a look at any of the original texts and, despite the mimicry, you will find that they cannot compare to his plays. Many of the originals were dry, unengaging, and lacking any sort of poetic language. J.J. Munro wrote in 1908 that The Tragicall Historye of Romeus and Juliet “meanders on like a listless stream in a strange and impossible land; Shakespeare’s sweeps on like a broad and rushing river, singing and foaming, flashing in sunlight and darkening in cloud, carrying all things irresistibly to where it plunges over the precipice into a waste of waters below.”

Despite bordering on plagiarism at times, he overhauled the stories with exceptional use of the English language, bringing drama and emotion to dreary chronicles or poems. He had a keen sense for the changes required to restructure plots, creating suspense and intensity in their stories. Shakespeare saw far further than those who wrote before him, and with their help, he ushered in a new era of the English language.

Of course, it’s not just Newton, Jobs, and Shakespeare who found a (sometimes willing, sometimes not) shoulder to stand upon. Facebook is presumed to have built upon Friendster. Cormac McCarthy’s books often replicate older history texts, with one character coming straight from Samuel Chamberlain’s My Confession. John Lennon borrowed from diverse musicians, once writing in a letter to the New York Times that though the Beatles copied black musicians, “it wasn’t a rip off. It was a love in.”

In The Ecstasy of Influence, Jonathan Lethem points to many other instances of influences in classic works. In 1916, journalist Heinz von Lichberg published a story of a man who falls in love with his landlady’s daughter and begins a love affair, culminating in her death and his lasting loneliness. The title? Lolita. It’s hard to believe Nabokov hadn’t read it, yet aside from the plot and the name, the language and style of his version owe nothing to the original.

The list continues. The point is not to be flippant about plagiarism but to cultivate sensitivity to the elements of value in a previous work, as well as the ability to build upon those elements. If we restrict the flow of ideas, everyone loses out.

The adjacent possible

What’s this about? Why can’t people come up with their own ideas? Why do so many people come up with a brilliant idea but never profit from it? The answer lies in what scientist Stuart Kauffman calls “the adjacent possible.” Quite simply, each new innovation or idea opens up the possibility of additional innovations and ideas. At any time, there are limits to what is possible, yet those limits are constantly expanding.

In Where Good Ideas Come From: The Natural History of Innovation, Steven Johnson compares this process to being in a house where opening a door creates new rooms. Each time we open the door to a new room, new doors appear and the house grows. Johnson compares it to the formation of life, beginning with basic fatty acids. The first fatty acids to form were not capable of turning into living creatures. When they self-organized into spheres, the groundwork formed for cell membranes, and a new door opened to genetic codes, chloroplasts, and mitochondria. When dinosaurs evolved a new bone that meant they had more manual dexterity, they opened a new door to flight. When our distant ancestors evolved opposable thumbs, dozens of new doors opened to the use of tools, writing, and warfare. According to Johnson, the history of innovation has been about exploring new wings of the adjacent possible and expanding what we are capable of.

A new idea—like those of Newton, Jobs, and Shakespeare—is only possible because a previous giant opened a new door and made their work possible. They in turn opened new doors and expanded the realm of possibility. Technology, art, and other advances are only possible if someone else has laid the groundwork; nothing comes from nothing. Shakespeare could write his plays because other people had developed the structures and language that formed his tools. Newton could advance science because of the preliminary discoveries that others had made. Jobs built Apple out of the debris of many prior devices and technological advances.

The questions we all have to ask ourselves are these: What new doors can I open, based on the work of the giants that came before me? What opportunities can I spot that they couldn’t? Where can I take the adjacent possible? If you think all the good ideas have already been found, you are very wrong. Other people’s good ideas open new possibilities, rather than restricting them.

As time passes, the giants just keep getting taller and more willing to let us hop onto their shoulders. Their expertise is out there in books and blog posts, open-source software and TED talks, podcast interviews, and academic papers. Whatever we are trying to do, we have the option to find a suitable giant and see what can be learned from them. In the process, knowledge compounds, and everyone gets to see further as we open new doors to the adjacent possible.

The post Standing on the Shoulders of Giants appeared first on Farnam Street.

What You Truly Value https://myvibez.link/find-what-you-truly-value/ Mon, 30 Mar 2020 14:07:11 +0000 https://myvibez.link/?p=41495 Our devotion to our values gets tested in the face of a true crisis. But it’s also an opportunity to reconnect, recommit, and sometimes, bake some bread. *** The recent outbreak of the coronavirus is impacting people all over the world — not just in terms of physical health, but financially, emotionally, and even socially. …

Our devotion to our values gets tested in the face of a true crisis. But it’s also an opportunity to reconnect, recommit, and sometimes, bake some bread.

***

The recent outbreak of the coronavirus is impacting people all over the world — not just in terms of physical health, but financially, emotionally, and even socially. As we struggle to adapt to our new circumstances, it can be tempting to bury our heads and wait for it all to blow over so we can just get back to normal. Or we can see this as an incredible opportunity to figure out who we are.

What many of us are discovering right now is that the things we valued a few months ago don’t actually matter: our cars, the titles on our business cards, our privileged neighborhoods. Instead, we are shifting our attention to what we find intrinsically rewarding.

When everything is easy, it can seem like you have life figured out. When things change and you’re called to put your philosophy into practice, it’s another matter entirely. It’s one thing to say you are stoic when your coffee spills and another when you’re watching your community collapse. When life gets hard, you realize you’ve never had to practice what you thought you knew about coping with disaster.

But when a crisis hits, everything is put to the real test.

The challenge then becomes connecting our struggles to our values, because what we value only has meaning if it still matters when life is hard. To prove their worth, our values need to help us move forward when we can barely crawl and the obstacles in our way seem insurmountable.

In the face of a crisis, what is important to us becomes evident when we give ourselves the space to reflect on what is going to get us through the hard times. And so we find renewed commitment to get back to core priorities. What seemed important before falls apart to reveal what really matters: family, love, community, health.

“I was 32 when I started cooking; up until then, I just ate.” 

— Julia Child

One unexpected activity that many people are turning to now that they have time and are more introspective is baking. In fact, this week Google searches for bread recipes hit a noticeable high.


Baking is a very physical experience: kneading dough, tasting batter, smelling the results of the ingredients coming together. It’s an activity that requires patience. Bread has to rise. Pies have to cook. Cakes have to cool before they can be covered with icing. And, as prescriptive as baking seems on its surface, it’s something that facilitates creativity as we improvise our ingredients based on what we have in the cupboard. We discover new flavors, and we comfort ourselves and others with the results. Baked goods are often something we share, and in doing so we are providing for those we care about.

Why might baking be useful in times of stress? In Overcoming Anxiety, Dennis Tirch explains that “research has demonstrated that when people engage more fully in behaviors that give them a sense of pleasure and mastery, they can begin to overcome negative emotions.”

At home with their loved ones, people can reconsider what they value one muffin at a time. Creating with the people we love instead of consuming on our own allows us to focus on what we value as the world changes around us. With more time, slow, seemingly unproductive pursuits have new appeal because they help us reorient to the qualities in life that matter most.

Giving yourself the space to tune in to your values doesn’t have to come through baking. What’s important is that you find an activity that lets you move past fear and panic, to reconnect with what gives your life meaning. When you engage with an activity that gives you pleasure and releases negative emotions, it allows you to rediscover what is important to you.

Change is stressful. But neither stress nor change has to be scary. If you think about it, you undergo moments of change every day because nothing in life is ever static. Our lives are a constant adaptation to a world that is always in motion.

All change brings opportunity. Some change gives us the opportunity to pause and ask what we can do better. How can we better connect to what has proven to be important? Connection is not an abstract intellectual exercise, but an experience that orients us to the values that provide us direction. If you look for opportunities in line with your values, you will be able to see a path through the fear and uncertainty guided by the light that is hope.

The post What You Truly Value appeared first on Farnam Street.

Seduced by Logic: Émilie du Châtelet and the Struggles to create the Newtonian Revolution https://myvibez.link/seduced-logic/ Tue, 04 Apr 2017 11:00:04 +0000 https://www.farnamstreetblog.com/?p=31164 Against great odds, Émilie du Châtelet (1706–1749) taught herself mathematics and became a world authority on Newtonian mathematical physics. I say against great odds because being a woman at the time meant she was ineligible for the same formal and informal opportunities available to others. Seduced by Logic, by Robyn Arianrhod tells her story with …

Against great odds, Émilie du Châtelet (1706–1749) taught herself mathematics and became a world authority on Newtonian mathematical physics.

I say against great odds because being a woman at the time meant she was ineligible for the same formal and informal opportunities available to others. Seduced by Logic, by Robyn Arianrhod, tells her story with captivating color.

Émilie and her lover and collaborator Voltaire realized that Newton’s Principia changed not only our view of the world but also the way we do science.

“Newton,” writes Arianrhod, “had created a method for constructing and then testing theories, so the Principia provided the first truly modern blueprint for theoretical science as both a predictive, quantitative discipline—Newton eschewed qualitative, unproven, metaphysical speculations—and a secular discipline, separate from religion, although by no means inherently opposed to it.”

This, of course, has impacted the way we live and see ourselves. While Newton is relatively well known today, his theories were not easily accepted at the time. Émilie was one of the first to realize his impact and promote his thinking. In the late 1740s, she created what remains, to this day, the authoritative French translation of Newton’s masterpiece, complete with detailed commentary. Voltaire considered du Châtelet “a genius worthy of Horace and Newton.”

Émilie du Châtelet didn’t limit herself to only commenting on Newton. The reason the book still stands today is that she added a lot of original thought.

***

How did Émilie du Châtelet come to learn so much in a world that overtly limited her opportunities? This is where her character shines.

While her brothers were sent to the most prestigious Jesuit secondary schools, Émilie was left to fend for herself and acquired much of her knowledge through reading. While her brothers could attend university, “such a thing was unthinkable for a girl.”

Luckily her family environment was conducive to self-education. Émilie’s parents “were rather unorthodox in the intellectual freedom they allowed in their children: both parents allowed Émilie to argue with them and express opinions, and from the time they were about ten years old, the children had permission to browse freely through the library.”

 

***

At eighteen, Émilie entered an arranged marriage with thirty-year-old Florent-Claude, marquis du Châtelet and count of Lomont. Less than a year later she gave birth to their first child, Gabrielle-Pauline, who was followed seventeen months later by a son, Florent-Louis. Another child, a boy, would come six years later, only to die within two years. His death led her to remark, in her grief, that the ‘sentiments of nature must exist in us without us suspecting.’

“Sometime around 1732, she experienced a true intellectual epiphany,” Arianrhod writes. As a result, Émilie would come to see herself as a ‘thinking creature.’

“At first, she only caught a glimpse of this new possibility, and she continued to allow her time to be wasted by superficial society life and its dissipation, ‘which was all I had felt myself born for.’ Fortunately, her ongoing friendship with these ‘people who think’—including another mathematically inclined woman, Marie de Thil, who would remain her lifelong friend—led Émilie to the liberating realisation that it was not too late to begin cultivating her mind seriously.”

It would be a difficult journey. “I feel,” Émilie wrote, “all the weight of the prejudice that universally excludes [women] from the sciences. It is one of the contradictions of this world that has always astonished me, that there are great countries whose destiny the law permits us to rule, and yet there is no place where we are taught to think.”

To become a person who thinks, she became a person who reads.

“Presumably,” Arianrhod writes, “she studied Descartes, Newton, and the great English philosopher of liberty, John Locke, because when she met Voltaire a year after her epiphany, he was immediately captivated by her mind as well as her other charms.”

In an early love letter, Voltaire would write to her “Ah! What happiness to see you, to hear you … and what pleasures I taste in your arms! I am so fortunate to love the one I admire … you are the idol of my heart, you make all my happiness.”

“When Émilie and Voltaire began their courtship in 1733,” Arianrhod writes, “she was twenty-six, and he was thirty-eight (the same age as her husband, with whom Voltaire would eventually become good friends, thanks to Émilie’s encouragement and her efforts as a diplomatic go-between).”

***

Arianrhod writes of Émilie’s struggles to learn:

Émilie’s plan to become a mathematician would require all her courage and determination. Firstly, envious acquaintances like Madame du Deffand would try to cast her as a dry and ugly ‘learned woman’ or femme savante, despite the fact that she had such appeal and charisma that the handsome duc de Richelieu, one of the most sought-after men in Paris, was rumoured to have once been her lover, while the celebrated Voltaire adored her. Of course, some of her female contemporaries admired her scholarship: Madame de Graffigny would later say, ‘Our sex ought to erect altars to her!’ But many were irritated by, or envious of, her liberated commitment to an intellectual life, because Émilie was very different from the glamorous women who ran many of Paris’s legendary literary salons. It was acceptable, even admirable, for such women to know enough of languages and philosophy to be good conversationalists with the learned men who dominated salon gatherings, but it was expected that women be modest about their knowledge. By contrast, Émilie would become famous as a scholar in her own right, thus angering the likes of Madame du Deffand, a powerful salonnière who claimed Émilie’s interest in science was all for show.

There were few truly learned women of the time, the belief being they were “either pretentious or ugly,” something that lingered “for the next three centuries.”

If you’re going to blaze the trail, you really have to blaze it.

At thirty-five, (Pierre-Louis Moreau de Maupertuis) Maupertuis was both ambitious and charming. When he agreed to tutor Émilie, he probably expected her to be a dilettante like his other female students: he had quite a following among society ladies. But her first known letter to him, written in January 1734, is both deferential and eager: ‘I spent all yesterday evening working on your lessons. I would like to make myself worthy of them. I fear, I confess to you, losing the good opinion you have of me.’ Perhaps he still doubted her commitment, because a week or two later she wrote, ‘I spent the evening with binomials and trinomials, [but] I am no longer able to study if you do not give me a task, and I have an extreme desire for one.’ Over the next few months, she sent him a stream of notes, trying to arrange lessons, asking him to come to her house for a couple of hours, or offering to meet him outside the Academy of Sciences – women were allowed inside only for the twice-yearly public lectures – or outside Gradot’s, one of the favourite cafés of the intellectual set.

[…]

It was this kind of intensity – as expressed in this multitude of requests for rendezvous – that fuelled gossip among her peers, and jealousy from Voltaire. Until the late twentieth century, most historians, too, seemed unable to imagine a woman like Émilie could be seduced only by mathematics – after all, until then, few women had actually become mathematicians. But it is true that many of Émilie’s letters to Maupertuis have a very flirtatious style – it was, after all, an era that revelled in the game of seduction. There is no evidence to prove whether or not they ever became lovers in those early months, before she and Voltaire had fully committed themselves to each other, but her letters certainly prove that all her life she would continue to hold a deep affection and respect for Maupertuis. In late April 1734, Émilie wrote to Maupertuis: ‘I hope I will render myself less unworthy of your lessons by telling you that it is not for myself that I want to become a mathematician, but because I am ashamed of making such mediocre progress under such a master as you.’ It was, indeed, an era of flattery! (Voltaire was quite adept at it – as a mere bourgeois, he often needed to flatter important people to help advance his literary career.) Although this letter suggests Émilie was simply using flattery to extract more lessons from her mathematical ‘master’, she always did have genuine doubts about her ability, which is not surprising given her lack of formal education and the assumed intellectual inferiority of her gender. She would later write, ‘If I were king … I would reform an abuse which cuts back, as it were, half of humanity. I would have women participate in all human rights, and above all those of the mind.’

***

In the translator’s preface to her late 1730s edition of Selected Philosophical and Scientific Writings, Du Châtelet highlights a few of the traits that helped her overcome so much.

You must know what you want:

(Knowledge) can never be acquired unless one has chosen a goal for one’s studies. One must conduct oneself as in everyday life; one must know what one wants to be. In the latter endeavors irresolution produces false steps, and in the life of the mind confused ideas.

She considered herself a member of the ordinary class, and she wrote about how regular people can come to acquire talent.

It sometimes happens that work and study force genius to declare itself, like the fruits that art produces in a soil where nature did not intend it, but these efforts of art are nearly as rare as natural genius itself. The vast majority of thinking men — the others, the geniuses, are in a class of their own — need to search within themselves for their talent. They know the difficulties of each art, and the mistakes of those who engage in each one, but they lack the courage that is not disheartened by such reflections, and the superiority that would enable them to overcome such difficulties. Mediocrity is, even among the elect, the lot of the greatest number.

Seduced by Logic is worth reading in its entirety. Du Châtelet’s story is as fascinating as it is informative.

The post Seduced by Logic: Émilie du Châtelet and the Struggles to create the Newtonian Revolution appeared first on Farnam Street.

Using Multidisciplinary Thinking to Approach Problems in a Complex World https://myvibez.link/steven-pinker-what-is-true-complex-world/ Tue, 08 Nov 2016 12:00:36 +0000 https://www.farnamstreetblog.com/?p=29628 Complex outcomes in human systems are a tough nut to crack when it comes to deciding what’s really true. Any phenomena we might try to explain will have a host of competing theories, many of them seemingly plausible. So how do we know what to go with? One idea is to take a nod from the best. One of the most …

Complex outcomes in human systems are a tough nut to crack when it comes to deciding what’s really true. Any phenomenon we might try to explain will have a host of competing theories, many of them seemingly plausible.

So how do we know what to go with?

One idea is to take a nod from the best. One of the most successful “explainers” of human behavior has been the cognitive psychologist Steven Pinker. His books have been massively influential, in part because they combine scientific rigor, explanatory power, and plainly excellent writing.

What’s unique about Pinker is the range of sources he draws on. His book The Better Angels of Our Nature, a cogitation on the decline in relative violence in recent human history, draws on ideas from evolutionary psychology, forensic anthropology, statistics, social history, criminology, and a host of other fields. Pinker, like Vaclav Smil and Jared Diamond, is the opposite of the man with a hammer, ranging over much material to come to his conclusions.

In fact, when asked about the progress of social science as an explanatory arena over time, Pinker credited this cross-disciplinary focus:

Because of the unification with the sciences, there are more genuinely explanatory theories, and there’s a sense of progress, with more non-obvious things being discovered that have profound implications.

But, even better, Pinker lays out an outline for how a multidisciplinary thinker should approach problems in a complex world.

***

Here’s the issue at stake: When we’re viewing a complex phenomenon—say, the decline in certain forms of violence in human history—it can be hard to come up with a rigorous explanation. We can’t just set up repeated lab experiments and vary the conditions of human history to see what pops out, as with physics or chemistry.

So out of necessity, we must approach the problem in a different way.

In the above-referenced interview, Pinker gives a wonderful example of how to do it. Note how he carefully “cross-checks” against a variety of data sources, developing a 3D view of the landscape he’s trying to assess:

Pinker: Absolutely, I think most philosophers of science would say that all scientific generalizations are probabilistic rather than logically certain, more so for the social sciences because the systems you are studying are more complex than, say, molecules, and because there are fewer opportunities to intervene experimentally and to control every variable. But the existence of the social sciences, including psychology, to the extent that they have discovered anything, shows that, despite the uncontrollability of human behavior, you can make some progress: you can do your best to control the nuisance variables that are not literally in your control; you can have analogues in a laboratory that simulate what you’re interested in and impose an experimental manipulation.

You can be clever about squeezing the last drop of causal information out of a correlational data set, and you can use converging evidence, the qualitative narratives of traditional history in combination with quantitative data sets and regression analyses that try to find patterns in them. But I also go to traditional historical narratives, partly as a sanity check. If you’re just manipulating numbers, you never know whether you’ve wandered into some preposterous conclusion by taking numbers too seriously that couldn’t possibly reflect reality. Also, it’s the narrative history that provides hypotheses that can then be tested. Very often a historian comes up with some plausible causal story, and that gives the social scientists something to do in squeezing a story out of the numbers.

Warburton: I wonder if you’ve got an example of just that, where you’ve combined the history and the social science?

Pinker: One example is the hypothesis that the Humanitarian Revolution during the Enlightenment, that is, the abolition of slavery, torture, cruel punishments, religious persecution, and so on, was a product of an expansion of empathy, which in turn was fueled by literacy and the consumption of novels and journalistic accounts. People read what life was like in other times and places, and then applied their sense of empathy more broadly, which gave them second thoughts about whether it’s a good idea to disembowel someone as a form of criminal punishment. So that’s a historical hypothesis. Lynn Hunt, a historian at the University of California–Berkeley, proposed it, and there are some psychological studies that show that, indeed, if people read a first-person account by someone unlike them, they will become more sympathetic to that individual, and also to the category of people that that individual represents.

So now we have a bit of experimental psychology supporting the historical qualitative narrative. And, in addition, one can go to economic historians and see that, indeed, there was first a massive increase in the economic efficiency of manufacturing a book, then there was a massive increase in the number of books published, and finally there was a massive increase in the rate of literacy. So you’ve got a story that has at least three vertices: the historian’s hypothesis; the economic historians identifying exogenous variables that changed prior to the phenomenon we’re trying to explain, so the putative cause occurs before the putative effect; and then you have the experimental manipulation in a laboratory, showing that the intervening link is indeed plausible.

Pinker is saying: Look, we can’t just rely on “plausible narratives” generated by folks like the historians. There are too many possibilities that could be correct.

Nor can we rely purely on correlations (i.e., the rise in literacy statistically tracking the decline in violence) — they don’t necessarily offer us a causative explanation. (Does the rise in literacy cause less violence, or is it vice versa? Or, does a third factor cause both?)

However, if we layer in some other known facts from areas we can experiment on — say, psychology or cognitive neuroscience — we can sometimes establish the causal link we need or, at worst, a better hypothesis of reality.

In this case, it would be the finding from psychology that certain forms of literacy do indeed increase empathy (for logical reasons).

Does this method give us absolute proof? No. However, it does allow us to propose and then test, re-test, alter, and strengthen or ultimately reject a hypothesis. (In other words, rigorous thinking.)

We can’t stop here though. We have to take time to examine competing hypotheses — there may be a better fit. The interviewer continues, asking Pinker about this methodology:

Warburton: And so you conclude that the de-centering that occurs through novel-reading and first-person accounts probably did have a causal impact on the willingness of people to be violent to their peers?

Pinker: That’s right. And, of course, one has to rule out alternative hypotheses. One of them could be the growth of affluence: perhaps it’s simply a question of how pleasant your life is. If you live a longer and healthier and more enjoyable life, maybe you place a higher value on life in general, and, by extension, the lives of others. That would be an alternative hypothesis to the idea that there was an expansion of empathy fueled by greater literacy. But that can be ruled out by data from economic historians that show there was little increase in affluence during the time of the Humanitarian Revolution. The increase in affluence really came later, in the 19th century, with the advent of the Industrial Revolution.

***

Let’s review the process that Pinker has laid out, one that we might think about emulating as we examine the causes of complex phenomena in human systems:

  1. We observe an interesting phenomenon in need of explanation, one we feel capable of exploring.
  2. We propose and examine competing hypotheses that would explain the phenomenon (set up in a falsifiable way, in harmony with the divide between science and pseudoscience laid out for us by the great Karl Popper).
  3. We examine a cross-section of evidence: empirical data relating to the phenomenon; sensible qualitative inference (from multiple fields/disciplines, the more fundamental the better); and, finally, “demonstrable” aspects of nature we are nearly certain about, arising from controlled experiment or other rigorous sources of knowledge ranging from engineering to biology to cognitive neuroscience.

What we end up with is not necessarily a bulletproof explanation, but probably the best we can do if we think carefully. A good cross-disciplinary examination, with quantitative and qualitative sources coming into equal play and a good dose of judgment, can be far more rigorous than the gut-instinct or plausible-nonsense stories that many of us lazily spout.

A Word of Caution

Although Pinker’s “multiple vertices” approach to problem solving in complex domains can be powerful, we always have to be on guard for phenomena that we simply cannot explain at our current level of competence: We must have a “too hard” pile when competing explanations come out “too close to call” or we otherwise feel we’re outside of our circle of competence. Always tread carefully and be sure to follow Darwin’s Golden Rule: Contrary facts are more important than confirming ones. Be ready to change your mind, like Darwin, when the facts don’t go your way.

***

Still Interested? For some more Pinker goodness check out our prior posts on his work, or check out a few of his books like How the Mind Works or The Blank Slate: The Modern Denial of Human Nature.

The post Using Multidisciplinary Thinking to Approach Problems in a Complex World appeared first on Farnam Street.

How To Mentally Overachieve — Charles Darwin’s Reflections On His Own Mind https://myvibez.link/charles-darwins-reflections-mind/ Tue, 18 Oct 2016 11:00:34 +0000 https://www.farnamstreetblog.com/?p=29384 We’ve written quite a bit about the marvelous British naturalist Charles Darwin, who with his Origin of Species created perhaps the most intense intellectual debate in human history, one which continues up to this day. Darwin’s Origin was a courageous and detailed thought piece on the nature and development of biological species. It’s the starting point …

We’ve written quite a bit about the marvelous British naturalist Charles Darwin, who with his Origin of Species created perhaps the most intense intellectual debate in human history, one which continues up to this day.

Darwin’s Origin was a courageous and detailed thought piece on the nature and development of biological species. It’s the starting point for nearly all of modern biology.

But, as we’ve noted before, Darwin was not a man of pure IQ. He was not Isaac Newton, or Richard Feynman, or Albert Einstein — breezing through complex mathematical physics at a young age.

Charlie Munger thinks Darwin would have placed somewhere in the middle of a good private high school class. He was also in notoriously bad health for most of his adult life and, by his son’s estimation, a terrible sleeper. He really only worked a few hours a day in the many years leading up to the Origin of Species.

Yet his “thinking work” outclassed almost everyone. An incredible story.

In his autobiography, Darwin reflected on this peculiar state of affairs. What was he good at that led to the result? What was he so weak at? Why did he achieve better thinking outcomes? As he put it, his goal was to:

“Try to analyse the mental qualities and the conditions on which my success has depended; though I am aware that no man can do this correctly.”

In studying Darwin ourselves, we hope to better appreciate our own strengths and weaknesses, not to mention understand the working methods of a “mental overachiever.”

Let’s explore what Darwin saw in himself.

***

1. He did not have a quick intellect or an ability to follow long, complex, or mathematical reasoning. He may have been a bit hard on himself, but Darwin realized that he wasn’t a “5 second insight” type of guy (and let’s face it, most of us aren’t). His life also proves how little that trait matters if you’re aware of it and counterbalance it with other methods.

I have no great quickness of apprehension or wit which is so remarkable in some clever men, for instance, Huxley. I am therefore a poor critic: a paper or book, when first read, generally excites my admiration, and it is only after considerable reflection that I perceive the weak points. My power to follow a long and purely abstract train of thought is very limited; and therefore I could never have succeeded with metaphysics or mathematics. My memory is extensive, yet hazy: it suffices to make me cautious by vaguely telling me that I have observed or read something opposed to the conclusion which I am drawing, or on the other hand in favour of it; and after a time I can generally recollect where to search for my authority. So poor in one sense is my memory, that I have never been able to remember for more than a few days a single date or a line of poetry.

2. He did not find it easy to write clearly and concisely. He compensated by getting things down quickly and then coming back to them later, thinking them through again and again. Slow, methodical…and ridiculously effective: for those who haven’t read it, the Origin of Species is extremely readable and clear, even now, 150 years later.

I have as much difficulty as ever in expressing myself clearly and concisely; and this difficulty has caused me a very great loss of time; but it has had the compensating advantage of forcing me to think long and intently about every sentence, and thus I have been led to see errors in reasoning and in my own observations or those of others.

There seems to be a sort of fatality in my mind leading me to put at first my statement or proposition in a wrong or awkward form. Formerly I used to think about my sentences before writing them down; but for several years I have found that it saves time to scribble in a vile hand whole pages as quickly as I possibly can, contracting half the words; and then correct deliberately. Sentences thus scribbled down are often better ones than I could have written deliberately.

3. He forced himself to be an incredibly effective and organized collector of information. Darwin’s system of reading and indexing facts in large portfolios is worth emulating, as is the habit of taking down conflicting ideas immediately.

As in several of my books facts observed by others have been very extensively used, and as I have always had several quite distinct subjects in hand at the same time, I may mention that I keep from thirty to forty large portfolios, in cabinets with labelled shelves, into which I can at once put a detached reference or memorandum. I have bought many books, and at their ends I make an index of all the facts that concern my work; or, if the book is not my own, write out a separate abstract, and of such abstracts I have a large drawer full. Before beginning on any subject I look to all the short indexes and make a general and classified index, and by taking the one or more proper portfolios I have all the information collected during my life ready for use.
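If it helps to see the shape of this habit written out, here is a minimal sketch in Python of a Darwin-style fact file: notes get dropped into labelled portfolios as they arrive, and a classified index is assembled before starting work on a subject. This is purely our illustration: the class, labels, and sample entries are invented placeholders, not anything Darwin himself describes beyond the paragraph above.

```python
# A minimal sketch (our illustration only) of the filing-and-indexing habit described above:
# drop each detached fact into one or more labelled portfolios, then build a classified
# index of everything relevant before beginning work on a subject.
from collections import defaultdict

class FactPortfolios:
    def __init__(self):
        # portfolio label -> list of (source, note) entries
        self.portfolios = defaultdict(list)

    def file_fact(self, labels, source, note):
        """File a detached reference or memorandum under every relevant label."""
        for label in labels:
            self.portfolios[label].append((source, note))

    def classified_index(self, subject_labels):
        """Gather everything filed under the given labels, grouped by label."""
        return {label: list(self.portfolios[label]) for label in subject_labels}

# Hypothetical usage: file notes as you read, then pull the index when starting a subject.
notes = FactPortfolios()
notes.file_fact(["subject A", "subject B"], "Book 1", "an observed fact worth keeping")
notes.file_fact(["subject A"], "Book 2", "a fact that contradicts the working hypothesis")
print(notes.classified_index(["subject A"]))
```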

4. He had possibly the most valuable trait in any sort of thinker: a passionate interest in understanding reality and putting it in useful order in his head. This “Reality Orientation” is hard to measure and certainly does not show up on IQ tests, but probably determines, to some extent, success in life.

On the favourable side of the balance, I think that I am superior to the common run of men in noticing things which easily escape attention, and in observing them carefully. My industry has been nearly as great as it could have been in the observation and collection of facts. What is far more important, my love of natural science has been steady and ardent.

This pure love has, however, been much aided by the ambition to be esteemed by my fellow naturalists. From my early youth I have had the strongest desire to understand or explain whatever I observed,–that is, to group all facts under some general laws. These causes combined have given me the patience to reflect or ponder for any number of years over any unexplained problem. As far as I can judge, I am not apt to follow blindly the lead of other men. I have steadily endeavoured to keep my mind free so as to give up any hypothesis, however much beloved (and I cannot resist forming one on every subject), as soon as facts are shown to be opposed to it.

Indeed, I have had no choice but to act in this manner, for with the exception of the Coral Reefs, I cannot remember a single first-formed hypothesis which had not after a time to be given up or greatly modified. This has naturally led me to distrust greatly deductive reasoning in the mixed sciences. On the other hand, I am not very sceptical—a frame of mind which I believe to be injurious to the progress of science. A good deal of scepticism in a scientific man is advisable to avoid much loss of time, but I have met with not a few men, who, I feel sure, have often thus been deterred from experiment or observations, which would have proved directly or indirectly serviceable.

[…]

Therefore my success as a man of science, whatever this may have amounted to, has been determined, as far as I can judge, by complex and diversified mental qualities and conditions. Of these, the most important have been—the love of science—unbounded patience in long reflecting over any subject—industry in observing and collecting facts—and a fair share of invention as well as of common sense.

5. Most inspirational to us of average intellect, he outperformed his own mental aptitude with these good habits, surprising even himself with the results.

With such moderate abilities as I possess, it is truly surprising that I should have influenced to a considerable extent the belief of scientific men on some important points.

***

Still Interested? Read his autobiography, his The Origin of Species, or check out David Quammen’s wonderful short biography of the most important period of Darwin’s life. Also, if you missed it, check out our prior post on Darwin’s Golden Rule.

The post How To Mentally Overachieve — Charles Darwin’s Reflections On His Own Mind appeared first on Farnam Street.

Lee Kuan Yew’s Rule https://myvibez.link/lee-kuan-yew-rule/ Tue, 13 Sep 2016 11:00:42 +0000 https://www.farnamstreetblog.com/?p=28705 Lee Kuan Yew, the “Father of Modern Singapore”, who took a nation from “Third World to First” in his own lifetime, has a simple idea about using theory and philosophy. Here it is: Does it work? He isn’t throwing away big ideas or theories, or even discounting them per se. They just have to meet the …

Lee Kuan Yew, the “Father of Modern Singapore”, who took a nation from “Third World to First” in his own lifetime, has a simple idea about using theory and philosophy. Here it is: Does it work?

He isn’t throwing away big ideas or theories, or even discounting them per se. They just have to meet the simple, pragmatic standard.

Does it work?

Try it out the next time you study a philosophy, a value, an approach, a theory, an ideology…it doesn’t matter if the source is a great thinker of antiquity or your grandmother. Has it worked? We’ll call this Lee Kuan Yew’s Rule, to make it easy to remember.

Here’s his discussion of it in The Grand Master’s Insights on China, the United States, and the World:

My life is not guided by philosophy or theories. I get things done and leave others to extract the principles from my successful solutions. I do not work on a theory. Instead, I ask: what will make this work? If, after a series of solutions, I find that a certain approach worked, then I try to find out what was the principle behind the solution. So Plato, Aristotle, Socrates, I am not guided by them…I am interested in what works…Presented with the difficulty or major problem or an assortment of conflicting facts, I review what alternatives I have if my proposed solution does not work. I choose a solution which offers a higher probability of success, but if it fails, I have some other way. Never a dead end.

We were not ideologues. We did not believe in theories as such. A theory is an attractive proposition intellectually. What we faced was a real problem of human beings looking for work, to be paid, to buy their food, their clothes, their homes, and to bring their children up…I had read the theories and maybe half believed in them.

But we were sufficiently practical and pragmatic enough not to be cluttered up and inhibited by theories. If a thing works, let us work it, and that eventually evolved into the kind of economy that we have today. Our test was: does it work? Does it bring benefits to the people?…The prevailing theory then was that multinationals were exploiters of cheap labor and cheap raw materials and would suck a country dry…Nobody else wanted to exploit the labor. So why not, if they want to exploit our labor? They are welcome to it…. We were learning how to do a job from them, which we would never have learnt… We were part of the process that disproved the theory of the development economics school, that this was exploitation. We were in no position to be fussy about high-minded principles.

***

Want More? Check out our prior posts on Lee Kuan Yew, or check out the short book of his insights from where this clip came. If you really want to dive deep, check out his wonderful autobiography, the amazing story of Singapore’s climb.

The post Lee Kuan Yew’s Rule appeared first on Farnam Street.

Isaac Watts and the Improvement of the Mind https://myvibez.link/isaac-watts-improvement-of-the-mind/ Tue, 30 Aug 2016 11:00:31 +0000 https://www.farnamstreetblog.com/?p=28701 What did an 18th-century hymn writer have to contribute to the modern understanding of the world? As it turns out, a lot. Sometimes we forget how useful the old wisdom can be. *** One of the most popular and prolific Christian hymn writers of all time — including Joy to the World — was a man …

What did an 18th-century hymn writer have to contribute to the modern understanding of the world? As it turns out, a lot. Sometimes we forget how useful the old wisdom can be.

***

One of the most popular and prolific Christian hymn writers of all time — author of Joy to the World, among many others — was a man named Isaac Watts, who lived in England in the late 17th and early 18th century. Watts was a well-educated Nonconformist (in the religious sense, not the modern one) who, along with his hymn writing, published a number of books on logic, science, and the learning process, at a time when these concepts were only just starting to take hold as a dominant ideology, replacing the central role of religious teaching.

Watts’s book The Improvement of the Mind was an important contribution to the growing body of work emphasizing the importance of critical thinking and rational, balanced inquiry, rather than adhering to centuries of dogma. If, as Alfred North Whitehead once pronounced, modernity’s progress was due to the “invention of the method of invention,” Watts and his books (which became textbooks in English schools, including Oxford) can easily be credited with helping push the world along.

One non-conformist who would later come to be deeply influenced by Watts was the great scientist Michael Faraday. Faraday grew up in a poor area of 18th-century England and received a fairly crude education, and yet would go on to become the Father of Electromagnetism. How?

In part, Faraday credits his own "inventing the method of invention" to reading Watts's books, particularly The Improvement of the Mind — a self-improvement guide written a few centuries before the internet. Watts recommended keeping a commonplace book to record facts, and Faraday did. Watts recommended being guided by observed facts, and Faraday was. Watts recommended finding a great teacher, and Faraday started attending lectures.

In Watts's book, Faraday had found a guiding ethos for sorting truth from fiction, what we now call the scientific method. And, given his tremendous achievements from a limited starting point, it's worth asking: what did Faraday find?

***

We needn’t search far to figure it out. Smack dab in Chapter One of the book, Watts lays out his General Rules for the Improvement of Knowledge.

Watts first lays out the goal of the whole enterprise. The idea is a pretty awesome one, the same ethos we promote here all the time: We all need to make decisions constantly, so why not figure out how to make better ones? You don't have to be an intellectual to pursue this goal. Everybody has a mind worth cultivating in order to improve the practical outcome of their lives:

No man is obliged to learn and know every thing; this can neither be sought nor required, for it is utterly impossible: yet all persons are under some obligation to improve their own understanding; otherwise it will be a barren desert, or a forest overgrown with weeds and brambles. Universal ignorance or infinite errors will overspread the mind, which is utterly neglected, and lies without any cultivation.

Skill in the sciences is indeed the business and profession but of a small part of mankind; but there are many others placed in such an exalted rank in the world, as allows them much leisure and large opportunities to cultivate their reason, and to beautify and enrich their minds with various knowledge. Even the lower orders of men have particular callings in life, wherein they ought to acquire a just degree of skill; and this is not to be done well, without thinking and reasoning about them.

The common duties and benefits of society, which belong to every man living, as we are social creatures, and even our native and necessary relations to a family, a neighbourhood, or government, oblige all persons whatsoever to use their reasoning powers upon a thousand occasions; every hour of life calls for some regular exercise of our judgment, as to time and things, persons and actions; without a prudent and discreet determination in matters before us, we shall be plunged into perpetual errors in our conduct. Now that which should always be practised, must at some time be learnt.

We then get into the Rules themselves, an 18th-century guide to becoming smarter, better, and more useful that holds up three hundred years later. In the Rules, Watts promotes the idea of becoming wiser, more humble, more hungry, and more broad-thinking. These are as good a guide to improving your mind as you'll find.

Below is an abridged version of the Rules. Check them all out here or get them in book form here. Watts had a bit of a bent towards solemnity and godliness that need not be emulated (unless you'd like to, of course), but most of the Rules are as useful today as the day they were written.

***

Rule I. DEEPLY possess your mind with the vast importance of a good judgment, and the rich and inestimable advantage of right reasoning.

Review the instances of your own misconduct in life; think seriously with yourselves how many follies and sorrows you had escaped, and how much guilt and misery you had prevented, if from your early years you had but taken due pains to judge aright concerning persons, times, and things. This will awaken you with lively vigour to address yourselves to the work of improving your reasoning powers, and seizing every opportunity and advantage for that end.

Rule II. Consider the weaknesses, frailties, and mistakes of human nature in general, which arise from the very constitution of a soul united to an animal body, and subjected to many inconveniences thereby.

Consider the many additional weaknesses, mistakes, and frailties, which are derived from our original apostasy and fall from a state of innocence; how much our powers of understanding are yet more darkened, enfeebled, and imposed upon by our senses, our fancies, and our unruly passions, &c.

Consider the depth and difficulty of many truths, and the flattering appearances of falsehood, whence arises an infinite variety of dangers to which we are exposed in our judgment of things.

Read with greediness those authors that treat of the doctrine of prejudices, prepossessions, and springs of error, on purpose to make your soul watchful on all sides, that it suffer itself, as far as possible, to be imposed upon by none of them.

Rule III. A slight view of things so momentous is not sufficient.

You should therefore contrive and practise some proper methods to acquaint yourself with your own ignorance, and to impress your mind with a deep and painful sense of the low and imperfect degrees of your present knowledge, that you may be incited with labour and activity to pursue after greater measures. Among others, you may find some such methods as these successful.

1. Take a wide survey now and then of the vast and unlimited regions of learning. […] The worlds of science are immense and endless.

2. Think what a numberless variety of questions and difficulties there are belonging even to that particular science in which you have made the greatest progress, and how few of them there are in which you have arrived at a final and undoubted certainty; excepting only those questions in the pure and simple mathematics, whose theorems are demonstrable, and leave scarce any doubt; and yet, even in the pursuit of some few of these, mankind have been strangely bewildered.

3. Spend a few thoughts sometimes on the puzzling enquiries concerning vacuums and atoms, the doctrine of infinites, indivisibles, and incommensurables in geometry, wherein there appear some insolvable difficulties: do this on purpose to give you a more sensible impression of the poverty of your understanding, and the imperfection of your knowledge. This will teach you what a vain thing it is to fancy that you know all things, and will instruct you to think modestly of your present attainments […]

4. Read the accounts of those vast treasures of knowledge which some of the dead have possessed, and some of the living do possess. Read and be astonished at the almost incredible advances which have been made in science. Acquaint yourself with some persons of great learning, that by converse among them, and comparing yourself with them, you may acquire a mean opinion of your own attainments, and may thereby be animated with new zeal, to equal them as far as possible, or to exceed: thus let your diligence be quickened by a generous and laudable emulation.

Rule IV. Presume not too much upon a bright genius, a ready wit, and good parts; for this, without labour and study, will never make a man of knowledge and wisdom.

This has been an unhappy temptation to persons of a vigorous and gay fancy, to despise learning and study. They have been acknowledged to shine in an assembly, and sparkle in a discourse on common topics, and thence they took it into their heads to abandon reading and labour, and grow old in ignorance; but when they had lost their vivacity of animal nature and youth, they became stupid and sottish even to contempt and ridicule. Lucidas and Scintillo are young men of this stamp; they shine in conversation; they spread their native riches before the ignorant; they pride themselves in their own lively images of fancy, and imagine themselves wise and learned; but they had best avoid the presence of the skilful, and the test of reasoning; and I would advise them once a day to think forward a little, what a contemptible figure they will make in age.

The witty men sometimes have sense enough to know their own foible; and therefore they craftily shun the attacks of argument, or boldly pretend to despise and renounce them, because they are conscious of their own ignorance, and inwardly confess their want of acquaintance with the skill of reasoning.

Rule V. As you are not to fancy yourself a learned man because you are blessed with a ready wit; so neither must you imagine that large and laborious reading, and a strong memory, can denominate you truly wise.

What that excellent critic has determined when he decided the question, whether wit or study makes the best poet, may well be applied to every sort of learning:

“Concerning poets there has been contest,
Whether they’re made by art, or nature best;
But if I may presume in this affair,
Among the rest my judgment to declare,
No art without a genius will avail,
And parts without the help of art will fail:
But both ingredients jointly must unite,
Or verse will never shine with a transcendent light.”
– Oldham.

It is meditation and studious thought, it is the exercise of your own reason and judgment upon all you read, that gives good sense even to the best genius, and affords your understanding the truest improvement. A boy of a strong memory may repeat a whole book of Euclid, yet be no geometrician; for he may not be able perhaps to demonstrate one single theorem. Memorino has learnt half the Bible by heart, and is become a living concordance, and a speaking index to theological folios, and yet he understands little of divinity. […]

Rule VII. Let the hope of new discoveries, as well as the satisfaction and pleasure of known truths, animate your daily industry.

Do not think learning in general is arrived at its perfection, or that the knowledge of any particular subject in any science cannot be improved, merely because it has lain five hundred or a thousand years without improvement. The present age, by the blessing of God on the ingenuity and diligence of men, has brought to light such truths in natural philosophy, and such discoveries in the heavens and the earth, as seemed to be beyond the reach of man. But may there not be Sir Isaac Newtons in every science? You should never despair therefore of finding out that which has never yet been found, unless you see something in the nature of it which renders it unsearchable, and above the reach of our faculties. […]

Rule VIII. Do not hover always on the surface of things, nor take up suddenly with mere appearances; but penetrate into the depth of matters, as far as your time and circumstances allow, especially in those things which relate to your own profession.

Do not indulge yourselves to judge of things by the first glimpse, or a short and superficial view of them; for this will fill the mind with errors and prejudices, and give it a wrong turn and ill habit of thinking, and make much work for retractation. Subito is carried away with title pages, so that he ventures to pronounce upon a large octavo at once, and to recommend it wonderfully when he had read half the preface. Another volume of controversies, of equal size, was discarded by him at once, because it pretended to treat of the Trinity, and yet he could neither find the word essence nor subsistences in the twelve first pages; but Subito changes his opinions of men and books and things so often, that nobody regards him.

As for those sciences, or those parts of knowledge, which either your profession, your leisure, your inclination, or your incapacity, forbid you to pursue with much application, or to search far into them, you must be contented with an historical and superficial knowledge of them, and not pretend to form any judgments of your own on those subjects which you understand very imperfectly.

Rule IX. Once a day, especially in the early years of life and study, call yourselves to an account what new ideas, what new proposition or truth you have gained, what further confirmation of known truths, and what advances you have made in any part of knowledge;

And let no day, if possible, pass away without some intellectual gain: such a course, well pursued, must certainly advance us in useful knowledge. It is a wise proverb among the learned, borrowed from the lips and practice of a celebrated painter,

“Let no day pass without one line at least.”

…and it was a sacred rule among the Pythagoreans, That they should every evening thrice run over the actions and affairs of the day, and examine what their conduct had been, what they had done, or what they had neglected: and they assured their pupils, that by this method they would make a noble progress on the path of virtue.

Rule X. Maintain a constant watch at all times against a dogmatical spirit;

Fix not your assent to any proposition in a firm and unalterable manner, till you have some firm and unalterable ground for it, and till you have arrived at some clear and sure evidence; till you have turned the proposition on all sides, and searched the matter through and through, so that you cannot be mistaken.

And even where you may think you have full grounds of assurance, be not too early, nor too frequent, in expressing this assurance in too peremptory and positive a manner, remembering that human nature is always liable to mistake in this corrupt and feeble state. A dogmatical spirit has many inconveniences attending it: as

1. It stops the ear against all further reasoning upon that subject, and shuts up the mind from all farther improvements of knowledge. If you have resolutely fixed your opinion, though it be upon too slight and insufficient grounds, yet you will stand determined to renounce the strongest reason brought for the contrary opinion, and grow obstinate against the force of the clearest argument. Positive is a man of this character; and has often pronounced his assurance of the Cartesian vortexes: last year some further light broke in upon his understanding, with uncontrollable force, by reading something of mathematical philosophy; yet having asserted his former opinions in a most confident manner, he is tempted now to wink a little against the truth, or to prevaricate in his discourse upon that subject, lest by admitting conviction, he should expose himself to the necessity of confessing his former folly and mistake: and he has not humility enough for that.

2. A dogmatical spirit naturally leads us to arrogance of mind, and gives a man some airs in conversation which are too haughty and assuming. Audens is a man of learning, and very good company; but his infallible assurance renders his carriage sometimes insupportable.

[…]

Rule XI. Though caution and slow assent will guard you against frequent mistakes and retractions; yet you should get humility and courage enough to retract any mistake, and confess an error.

Frequent changes are tokens of levity in our first determinations; yet you should never be too proud to change your opinion, nor frighted at the name of a changeling. Learn to scorn those vulgar bugbears, which confirm foolish man in his old mistakes, for fear of being charged with inconstancy. I confess it is better not to judge, than judge falsely; it is wiser to withhold our assent till we see complete evidence; but if we have too suddenly given up our assent, as the wisest man does sometimes, if we have professed what we find afterwards to be false, we should never be ashamed nor afraid to renounce a mistake. That is a noble essay which is found among the occasional papers, 'to encourage the world to practise retractations;' and I would recommend it to the perusal of every scholar and every Christian.

Rule XV. Watch against the pride of your own reason, and a vain conceit of your own intellectual powers, with the neglect of divine aid and blessing.

Presume not upon great attainments in knowledge by your own self-sufficiency: those who trust to their own understandings entirely are pronounced fools in the word of God; and it is the wisest of men who gives them this character:

'He that trusteth in his own heart is a fool,' Prov. xxviii. 26. And the same divine writer advises us to 'trust in the Lord with all our heart, and not to lean to our understandings, nor to be wise in our own eyes,' chap. iii. 5, 7.

Those who, with a neglect of religion and dependence on God, apply themselves to search out every article in the things of God by the mere dint of their own reason, have been suffered to run into wild excesses of foolery, and strange extravagance of opinions. Every one who pursues this vain course, and will not ask for the conduct of God in the study of religion, has just reason to fear he shall be left of God, and given up a prey to a thousand prejudices; that he shall be consigned over to the follies of his own heart, and pursue his own temporal and eternal ruin. And even in common studies we should, by humility and dependence, engage the God of truth on our side. (Transcriber's Note: This talk of God, pure nonsense that it is, does not diminish the value of his other rules.)

Andy Grove and the Value of Facing Reality https://myvibez.link/andy-grove-value-facing-reality/ Tue, 16 Aug 2016 11:00:26 +0000

“People who have no emotional stake in a decision
can see what needs to be done sooner.”
— Andy Grove

***

What do you do when you wake up one day and realize that reality has changed, and you will either change with it or perish? Here’s one story of someone who did it successfully: Andy Grove, the former CEO of Intel Corp.

Here's the long and short: As late as 1981, Intel Corp had massive dominance of the worldwide semiconductor business. They made memory chips (RAM), owning about 60% of the global trade in a business that was growing by leaps and bounds. The personal computer revolution was taking off and the world was going digital slowly, year by year. It was the right business to be in, and Intel owned it. They got designed into the IBM PC, one of the first popular personal computers, in 1981. Life was good.

The problem was that everyone else wanted into the same business. New companies were popping up every day in the United States, and in the late ’70s and throughout the ’80s, Japanese semiconductor manufacturers started nipping at Intel’s heels. They were competing on price and fast availability. Slowly, Intel realized its products were becoming commodities. By 1988, Japanese manufacturers had over 50% of the global market.

What did Intel do in response?

At first, as most of us do, they tried to cope with the old reality. They tried running faster on a treadmill to nowhere. This is the first true difficulty of facing a new reality: Seeing the world as it truly is. The temptation is always to stick to the old paradigm.

What Intel really wanted was to be able to stay in the old business and make money at it. Andy Grove describes some of the tactics they tried to this end in his great book Only the Paranoid Survive, written in 1996:

We tried a lot of things. We tried to focus on a niche of the memory market segment, we tried to invent special-purpose memories called value-added designs, we introduced more advanced technologies and built memories with them. What we were desperately trying to do was earn a premium for our product in the marketplace as we couldn't match the Japanese downward pricing spiral. There was a saying at Intel at that time: "If we do well we get '2x' [twice] the price of Japanese memories, but what good does it do if 'X' gets smaller and smaller?"

[…]

We had meetings and more meetings, bickering and arguments, resulting in nothing but conflicting proposals. There were those who proposed what they called a “go for it” strategy: “Let’s build a gigantic factory dedicated to producing memories and nothing but memories, and let’s take on the Japanese.” Others proposed that we should get really clever and use an avant-garde technology, “go for it” but in a technological rather than a manufacturing sense and build something the Japanese producers couldn’t build. Others were still clinging to the idea that we could come up with special-purpose memories, an increasingly unlikely possibility as memories became a uniform worldwide commodity. Meanwhile, as the debates raged, we just went on losing more and more money.

As Grove started waking up to the reality that the old way of doing business wasn’t going to work anymore, he allowed himself the thought that Intel would leave the business that had buttered its bread for so long.

And with this came the second difficulty of facing a new reality: Being the first to see it means you’ll face tremendous resistance from those who are not there yet. 

Of course, Grove faced this in spades at Intel. Notice how he describes the ties to the old reality: Religious conviction.

The company had a couple of beliefs that were as strong as religious dogmas. Both of them had to do with the importance of memories as the backbone of our manufacturing and sales activities. One was that memories were our “technology drivers.” What this phrase meant was that we always developed and refined our technologies on our memory products first because they were easier to test. Once the technology had been debugged on memories, we would apply it to microprocessors and other products. The other belief was the “full product-line” dogma. According to this, our salesmen needed a full product line to do a good job in front of our customers; if they didn’t have a full product line, the customer would prefer to do business with our competitors who did.

Given the strength of these beliefs, an open-minded, rational discussion about getting out of memories was practically impossible. What were we going to use for technology drivers? How were our salespeople going to do their jobs when they had an incomplete product family?

Eventually, after taking half-measures and facing all kinds of resistance from the homeostatic system that is a large organization, Grove was able to convince the executive team it was time to move on from the memory business and go whole-hog into microprocessors, a business where Intel could truly differentiate itself and build a formidable competitive position.

It’s here that Grove hits on a very humbling point about facing reality: We’re often the last ones to see things the way they truly are! We’re sitting on a train oblivious to the fact that it’s moving at 80 miles per hour, but anyone sitting outside the train watches it whiz right by! This is the value of learning to see the world through the eyes of others.

After all manner of gnashing of teeth, we told our sales force to notify our memory customers. This was one of the great bugaboos: How would our customers react? Would they stop doing business with us altogether now that we were letting them down? In fact, the reaction was, for all practical purposes, a big yawn. Our customers knew that we were not a very large factor in the market and they had half figured that we would get out; most of them had already made arrangements with other suppliers.

In fact, when we informed them of the decision, some of them reacted with the comment, “It sure took you a long time.” People who have no emotional stake in a decision can see what needs to be done sooner. 

This is where the rubber hits the road. As Grove mentions regarding Intel, you must train yourself to see your situation from the perspective of an outsider.

This is why companies often bring outside management or consulting organizations in to help them — they feel only someone sitting outside the train can see how fast it’s moving! But what if you could have for yourself that kind of objectivity? It takes a passionate interest in reality and a commitment to being open to change. In business especially, the Red Queen effect means that change is a constant, not a variable.

And the story of Andy Grove shows that it can be done. Despite the myriad problems discussed above, not only did Grove realize how fast the train was moving, but he got all of his people off, and onto a new and better train! By the late '80s Intel had pushed into microprocessors and out of memories, and became one of the great growth companies of the 1990s in a brand new business. (And he did it without bringing in outside help.)

What it took was the courage to face facts and act on them: As hard as it must have been, the alternative was death.

Here’s what Grove took from the experience:

Change is pain

I learned how small and helpless you feel when facing a force that’s “10X” larger than what you are accustomed to. I experienced the confusion that engulfs you when something fundamental changes in the business, and I felt the frustration that comes when the things that worked for you in the past no longer do any good. I learned how desperately you want to run from dealing with even describing a new reality to close associates. And I experienced the exhilaration that comes from a set-jawed commitment to a new direction, unsure as that may be.

A new reality doesn’t happen overnight

In this case, the Japanese started beating us in the memory business in the early eighties. Intel’s performance started to slump when the entire industry weakened in mid-1984. The conversation with Gordon Moore that I described occurred in mid-1985. It took until mid-1986 to implement our exit from memories. Then it took another year before we returned to profitability. Going through the whole strategic inflection point took us a total of three years.

The new reality may be preferable to the old one

I also learned that strategic inflection points, painful as they are for all participants, provide an opportunity to break out of a plateau and catapult to a higher level of achievement. Had we not changed our business strategy, we would have been relegated to an immensely tough economic existence and, for sure, a relatively insignificant role in our industry. By making a forceful move, things turned out far better for us.

So here is your opportunity: When a new reality awaits, don’t go at it timidly. Take it head on and make it not only as good, but better than the old reality. Don’t be the boy in the well, looking up and seeing only the sides of the well. Take the time to see the world around you as it truly is.

***

Still Interested? Check out Grove's classic book on strategic inflection points, Only the Paranoid Survive. For another business case study, read the interesting story of how IBM first built its monster 20th-century competitive advantage.

Henry Ford and the Actual Value of Education https://myvibez.link/henry-ford-actual-value-education/ Tue, 26 Jul 2016 11:00:56 +0000

“The object of education is not to fill a man’s mind with facts;
it is to teach him how to use his mind in thinking.”
— Henry Ford

***

In his memoir My Life and Work, written in 1922, the brilliant (but flawed) Henry Ford (1863-1947) offers perhaps the best definition you'll find of the value of an education, and a useful warning against accumulating information merely for its own sake. A devotee of lifelong learning need not be a Jeopardy contestant, accumulating trivia to spit back as needed. In the Age of Google, that sort of knowledge is increasingly irrelevant.

A real life-long learner seeks to learn and apply the world’s best knowledge to create a more constructive and more useful life for themselves and those around them. And to do that, you have to learn how to think on your feet. The world does not offer up no-brainers every day; more frequently, we’re presented with a lot of grey options. Unless your studies are improving your ability to handle reality as it is and get a fair result, you’re probably wasting your time.

From Ford’s memoir:

An educated man is not one whose memory is trained to carry a few dates in history—he is one who can accomplish things. A man who cannot think is not an educated man however many college degrees he may have acquired. Thinking is the hardest work anyone can do—which is probably the reason why we have so few thinkers. There are two extremes to be avoided: one is the attitude of contempt toward education, the other is the tragic snobbery of assuming that marching through an educational system is a sure cure for ignorance and mediocrity. You cannot learn in any school what the world is going to do next year, but you can learn some of the things which the world has tried to do in former years, and where it failed and where it succeeded. If education consisted in warning the young student away from some of the false theories on which men have tried to build, so that he may be saved the loss of the time in finding out by bitter experience, its good would be unquestioned.

An education which consists of signposts indicating the failure and the fallacies of the past doubtless would be very useful. It is not education just to possess the theories of a lot of professors. Speculation is very interesting, and sometimes profitable, but it is not education. To be learned in science today is merely to be aware of a hundred theories that have not been proved. And not to know what those theories are is to be “uneducated,” “ignorant,” and so forth. If knowledge of guesses is learning, then one may become learned by the simple expedient of making his own guesses. And by the same token he can dub the rest of the world “ignorant” because it does not know what his guesses are.

But the best that education can do for a man is to put him in possession of his powers, give him control of the tools with which destiny has endowed him, and teach him how to think. The college renders its best service as an intellectual gymnasium, in which mental muscle is developed and the student strengthened to do what he can. To say, however, that mental gymnastics can be had only in college is not true, as every educator knows. A man’s real education begins after he has left school. True education is gained through the discipline of life.

[…]

Men satisfy their minds more by finding out things for themselves than by heaping together the things which somebody else has found out. You can go out and gather knowledge all your life, and with all your gathering you will not catch up even with your own times. You may fill your head with all the “facts” of all the ages, and your head may be just an overloaded fact−box when you get through. The point is this: Great piles of knowledge in the head are not the same as mental activity. A man may be very learned and very useless. And then again, a man may be unlearned and very useful.

The object of education is not to fill a man’s mind with facts; it is to teach him how to use his mind in thinking. And it often happens that a man can think better if he is not hampered by the knowledge of the past.

Ford is probably wrong in his very last statement: the study of the past is crucial to understanding the human condition. But the sentiment offered in the rest of the piece should be read and re-read frequently.

This brings to mind a debate you’ll hear that almost all debaters get wrong: What’s more valuable, to be educated in the school of life, or in the school of books? Which is it?

It’s both!

This is what we call a false dichotomy. There is absolutely no reason to choose between the two. We’re all familiar with the algebra. If A and B have positive value, then A+B must be greater than A or B alone! You must learn from your life as it goes along, but since we have the option to augment that by studying the lives of others, why would we not take advantage? All it takes is the will and the attitude to study the successes and failures of history, add them to your own experience, and get an algebra-style A+B result.

So, resolve to use your studies to learn to think, to learn to handle the world better, to be more useful to those around you. Don’t worry about the facts and figures for their own sake. We don’t need another human encyclopedia.

***

Still Interested? Check out all of Ford’s interesting memoir, or try reading up on what a broad education should contain. 

Hares, Tortoises, and the Trouble with Genius https://myvibez.link/james-march-the-trouble-with-genius/ Thu, 14 Jul 2016 11:00:12 +0000

“Geniuses are dangerous.”
— James March

How many organizations would deny that they want more creativity, more genius, and more divergent thinking among their constituents? The great genius leaders of the world are fawned over breathlessly and a great amount of lip service is given to innovation; given the choice between “mediocrity” and “innovation,” we all choose innovation hands-down.

So why do we act the opposite way?

Stanford’s James March might have some insight. His book On Leadership (see our earlier notes here) is a collection of insights derived mostly from the study of great literature, from Don Quixote to Saint Joan to War & Peace. In March’s estimation, we can learn more about human nature (of which leadership is merely a subset) from studying literature than we can from studying leadership literature.

March discusses the nature of divergent thinking and “genius” in a way that seems to reflect true reality. We don’t seek to cultivate genius, especially in a mature organization, because we’re more afraid of the risks than appreciative of the benefits. A classic case of loss aversion. Tolerating genius means tolerating a certain amount of disruption; the upside of genius sounds pretty good until we start understanding its dark side:

Most original ideas are bad ones. Those that are good, moreover, are only seen as such after a long learning period; they rarely are impressive when first tried out. As a result, an organization is likely to discourage both experimentation with deviant ideas and the people who come up with them, thereby depriving itself, in the name of efficient operation, of its main source of innovation.

[…]

Geniuses are dangerous. Othello’s instinctive action makes him commit an appalling crime, the fine sentiments of Pierre Bezukhov bring little comfort to the Russian peasants, and Don Quixote treats innocent people badly over and over again. A genius combines the characteristics that produce resounding failures (stubbornness, lack of discipline, ignorance), a few ingredients of success (elements of intelligence, a capacity to put mistakes behind him or her, unquenchable motivation), and exceptional good luck. Genius therefore only appears as a sub-product of a great tolerance for heresy and apparent craziness, which is often the result of particular circumstances (over-abundant resources, managerial ideology, promotional systems) rather than deliberate intention. “Intelligent” organizations will therefore try to create an environment that allows genius to flourish by accepting the risks of inefficiency or crushing failures…within the limits of the risks that they can afford to take.

Note the important component: exceptional good luck. The kind of genius that rarely surfaces but that we desperately pursue needs great luck to make an impact. Truthfully, genius is always recognized in hindsight, with the benefit of positive results in mind. We cherry-pick the good results of divergent thinkers, but forget that we use those results to decide who's a genius and who isn't. Thus, tolerating divergent, genius-level thinking requires an ability to tolerate failure, loss, and change if it's to be applied prospectively.

Sounds easy enough, in theory. But as Daniel Kahneman and Charlie Munger have so brilliantly pointed out, we become very risk averse when we possess anything, including success; we feel loss more acutely than gain, and we seek to keep the status quo intact. (And it’s probably good that we do, on average.)

Compounding the problem, when we do recognize and promote genius, some of our exalting is likely to be based on false confidence, almost by definition:

Individuals who are frequently promoted because they have been successful will have confidence in their own abilities to beat the odds. Since in a selective, and therefore increasingly homogenous, management group the differences in performance that are observed are likely to be more often due to chance events than to any particular individual capacity, the confidence is likely to be misplaced. Thus, the process of selecting on performance results in exaggerated self-confidence and exaggerated risk-taking.

Let's use a current example: Elon Musk. Elon is (justifiably) recognized as a modern genius, leaping tall buildings in a single bound. Yet as Ashlee Vance makes clear in his biography, Musk teetered on the brink several times. It's a near miracle that his businesses have survived (and thrived) to where they are today. The press would read much differently if SpaceX or Tesla had gone under — he might be considered a brilliant but fatally flawed eccentric rather than a genius. Luck played a fair part in that outcome (which is not to take away from Musk's incredible work).

***

Getting back to organizations, the failure to appropriately tolerate genius is also a problem of homeostasis: The tendency of systems to “stay in place” and avoid disruption of strongly formed past habits. Would an Elon Musk be able to rise in a homeostatic organization? It generally does not happen.

James March has a solution, though, and it's one we've heard echoed by other thinkers like Nassim Taleb; it seems to be used fairly well in some modern technology organizations. As with most organizational solutions, it requires realigning incentives, which is the job of a strong and selfless leader.

An analogy of the hare and the tortoise illustrates the solution:

Although one particular hare (who runs fast but sleeps too long) has every chance of being beaten by one particular tortoise, an army of hares in competition with an army of tortoises will almost certainly result in one of the hares crossing the finish line first. The choices of an organization therefore depend on the respective importance that it attaches to its mean performance (in which case it should recruit tortoises) and the achievement of a few dazzling successes (an army of hares, which is inefficient as a whole, but contains some outstanding individuals).

[…]

In a simple model, a tortoise advances at a constant speed of 1 mile per hour while a hare runs at 5 miles per hour, but in each given 5-minute period a hare has a 90 percent chance of sleeping rather than running. A tortoise will cover the one-mile course in exactly one hour, and a hare will have only about an 11 percent chance of arriving faster (the probability that he will be awake for at least three of the 5-minute periods). If there is a race between the tortoise and one hare, the probability that the hare will win is only 0.11. However, if there are 100 tortoises and 100 hares in the race, the probability that at least one hare will arrive before any tortoise (and thus the race will be won by a hare) is 1 − (0.89)^100, or greater than 0.9999.
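
If you want to check March's arithmetic, here is a minimal sketch in Python. It is not from the book; it uses only the parameters stated above, plus the added assumption that each hare sleeps independently of the others.

from math import comb

# Model as stated above: a 1-mile course, a tortoise that finishes in exactly 60 minutes,
# and a hare that is awake (running at 5 mph) in each 5-minute period with probability 0.10.
# A hare needs at least 3 awake periods out of the 12 to cover the mile within the hour.
p_awake = 0.10
periods = 12

p_hare_wins = sum(
    comb(periods, k) * p_awake**k * (1 - p_awake)**(periods - k)
    for k in range(3, periods + 1)
)
print(round(p_hare_wins, 2))  # ~0.11, matching March's figure

# With 100 hares racing independently, a hare wins unless every single one oversleeps.
p_any_hare_wins = 1 - (1 - p_hare_wins) ** 100
print(round(p_any_hare_wins, 5))  # greater than 0.9999

The simplification doesn't change the point: a single erratic performer usually loses, but a population of them almost never does.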

The analogy holds up well in the business world. Any one young, aggressive “hare” is unlikely to beat the lumbering “tortoise” that reigns king, but put 100 hares out against 100 tortoises and the result is much different.

This means that any organization must conduct itself in such a way that hares have a chance to succeed internally. It means becoming open to divergence and allowing erratic genius to rise, while keeping the costs of failure manageable. It means having the courage to create an “army of hares” inside of your own organization rather than letting tortoises have their way, as they will if given the opportunity.

For a small young organization, the cost of failure isn't all that high, comparatively speaking — you can't fall far off a pancake. So hares tend to get a lot more leash. But for a large organization, the cost of failure tends to increase to such a pain point that it stops being tolerated! At this point, real innovation ceases.

But, if we have the will and ability to create small teams and projects with "hare-like" qualities, in ways that allow the "talent + luck" equation to surface truly better and different work, necessarily tolerating (and encouraging) failure and disruption, then we might have a shot at overcoming homeostasis in the same way that a specific combination of engineering and fuel allows rockets to overcome the equally strong force of gravity.

***

Still Interested? Check out our notes on James March’s books On Leadership and The Ambiguities of Experience, and an interview March did on the topic of leadership.

Our Genes and Our Behavior https://myvibez.link/our-genes-and-our-behavior/ Sun, 10 Jul 2016 11:00:25 +0000

“But now we are starting to show genetic influence on individual differences using DNA. DNA is a game changer; it’s a lot harder to argue with DNA than it is with a twin study or an adoption study.”
— Robert Plomin

***

It's not controversial to say that our genetics help explain our physical traits. Tall parents will, on average, have tall children. Overweight parents will, on average, have overweight children. Irish parents have Irish-looking kids. This is true to the point of banality, and only the willfully ignorant would dispute it.

It’s slightly more controversial to talk about genes influencing behavior. For a long time, it was denied entirely. For most of the 20th century, the “experts” in human behavior had decided that “nurture” beat “nature” with a score of 100-0. Particularly influential was the child’s early life — the way their parents treated them in the womb and throughout early childhood. (Thanks Freud!)

So, where are we at now?

Genes and Behavior

Developmental scientists and behavioral scientists eventually got to work with twin studies and adoption studies, which tended to show that certain traits were almost certainly heritable and not reliant on environment, thanks to the natural controlled experiments of twins separated at birth. (This eventually provided fodder for Judith Rich Harris’s wonderful work on development and personality.)

All throughout, the geneticists, starting with Gregor Mendel and his peas, kept on working. As behavioral geneticist Robert Plomin explains, the genetic camp split early on. Some people wanted to understand the gene itself in detail, using very simple traits to figure it out (eye color, long or short wings, etc.) and others wanted to study the effect of genes on complex behavior, generally:

People realized these two views of genetics could come together. Nonetheless, the two worlds split apart because Mendelians became geneticists who were interested in understanding genes. They would take a convenient phenotype, a dependent measure, like eye color in flies, just something that was easy to measure. They weren’t interested in the measure, they were interested in how genes work. They wanted a simple way of seeing how genes work.

By contrast, the geneticists studying complex traits—the Galtonians—became quantitative geneticists. They were interested in agricultural traits or human traits, like cardiovascular disease or reading ability, and would use genetics only insofar as it helped them understand that trait. They were behavior centered, while the molecular geneticists were gene centered. The molecular geneticists wanted to know everything about how a gene worked. For almost a century these two worlds of genetics diverged.

Eventually, the two began to converge. One camp (the gene people) figured out that once we could sequence the genome, they might be able to understand more complicated behavior by looking directly at genes in specific people with unique DNA, and contrasting them against one another.

The reason this whole gene-behavior game is hard is that, as Plomin makes clear, complex traits like intelligence are not like eye color. There's no "smart gene" — intelligence emerges from the interaction of thousands of different genes and can occur in a variety of combinations. Basic Mendel-style counting (the sort of dominant/recessive eye color gene thing you learned in high school biology) doesn't work for analyzing the influence of genes on complex traits:

The word gene wasn't invented until 1903. Mendel did his work in the mid-19th century. In the early 1900s, when Mendel was rediscovered, people finally realized the impact of what he did, which was to show the laws of inheritance of a single gene. At that time, these Mendelians went around looking for Mendelian 3:1 segregation ratios, which was the essence of what Mendel showed, that inheritance was discrete. Most of the socially, behaviorally, or agriculturally important traits aren't either/or traits, like a single-gene disorder. Huntington's disease, for example, is a single-gene dominant disorder, which means that if you have that mutant form of the Huntington's gene, you will have Huntington's disease. It's necessary and sufficient. But that's not the way complex traits work.

The importance of genetics is hard to overstate, but until the right technology came along, we could only observe it indirectly. A study might have shown that 50% of the variance in cognitive ability was due to genetics, but we had no idea which specific genes, in which combinations, actually produced smarter people.

But the Moore's law style improvement in genetic testing means that we can now map entire genomes quickly and at very low cost. And with that, the geneticists have a lot of data to work with, a lot of correlations to begin sussing out. The good thing about finding strong correlations between genes and human traits is that we know which one is causative: The gene! Obviously, your reading ability doesn't cause you to have certain DNA; it must be the other way around. So "Big Data" style screening is extremely useful, once we get a little better at it.

***

The problem is that, so far, the successes have been modest. There are millions of DNA variants ("ATCG" differences) to check. As Plomin points out, we can only pinpoint about 20% of the specific genetic influence for something as simple as height, which we know is about 90% heritable. Complex traits like schizophrenia are going to take a lot of work:

We’ve got to be able to figure out where the so-called missing heritability is, that is, the gap between the DNA variants that we are able to identify and the estimates we have from twin and adoption studies. For example, height is about 90 percent heritable, meaning, of the differences between people in height, about 90 percent of those differences can be explained by genetic differences. With genome-wide association studies, we can account for 20 percent of the variance of height, or a quarter of the heritability of height. That’s still a lot of missing heritability, but 20 percent of the variance is impressive.

With schizophrenia, for example, people say they can explain 15 percent of the genetic liability. The jury is still out on how that translates into the real world. What you want to be able to do is get this polygenic score for schizophrenia that would allow you to look at the entire population and predict who’s going to become schizophrenic. That’s tricky because the studies are case-control studies based on extreme, well-diagnosed schizophrenics, versus clean controls who have no known psychopathology. We’ll know soon how this polygenic score translates to predicting who will become schizophrenic or not.
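
To make "polygenic score" concrete: it is just a weighted sum of the variants a person carries, with the weights (effect sizes) estimated from genome-wide association studies. Here is a minimal, hypothetical sketch in Python; the numbers are invented for illustration and come from no real study.

# Hypothetical effect sizes for five DNA variants (illustrative only, not real data)
effect_sizes = [0.30, -0.12, 0.08, 0.21, -0.05]

# One person's allele counts at those same five variants: 0, 1, or 2 copies of each
allele_counts = [2, 0, 1, 1, 2]

polygenic_score = sum(beta * count for beta, count in zip(effect_sizes, allele_counts))
print(round(polygenic_score, 2))

# Across a whole sample, the squared correlation between such scores and the measured
# trait is the "percent of variance explained" quoted above (e.g., ~20% for height).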

It brings up an interesting question that gets us back to the beginning of the piece: If we know that genetics influence some complex behavioral traits (and we do), and if, with the continuing progress of science and technology, we can sequence a baby's genome and predict to some extent their reading level, facility with math, facility with social interaction, and so on, do we do it?

Well, we can't until there is general recognition that genes do indeed influence behavior and do have predictive power for how children perform. So far, the track record of getting educators to see that it's all quite real is pretty bad. As with the Freudians before, there's resistance to the "nature" aspect of the debate, probably influenced by some strong ideologies:

If you look at the books and the training that teachers get, genetics doesn’t get a look-in. Yet if you ask teachers, as I’ve done, about why they think children are so different in their ability to learn to read, and they know that genetics is important. When it comes to governments and educational policymakers, the knee-jerk reaction is that if kids aren’t doing well, you blame the teachers and the schools; if that doesn’t work, you blame the parents; if that doesn’t work, you blame the kids because they’re just not trying hard enough. An important message for genetics is that you’ve got to recognize that children are different in their ability to learn. We need to respect those differences because they’re genetic. Not that we can’t do anything about it.

It’s like obesity. The NHS is thinking about charging people to be fat because, like smoking, they say it’s your fault. Weight is not as heritable as height, but it’s highly heritable. Maybe 60 percent of the differences in weight are heritable. That doesn’t mean you can’t do anything about it. If you stop eating, you won’t gain weight, but given the normal life in a fast-food culture, with our Stone Age brains that want to eat fat and sugar, it’s much harder for some people.

We need to respect the fact that genetic differences are important, not just for body mass index and weight, but also for things like reading disability. I know personally how difficult it is for some children to learn to read. Genetics suggests that we need to have more recognition that children differ genetically, and to respect those differences. My grandson, for example, had a great deal of difficulty learning to read. His parents put a lot of energy into helping him learn to read. We also have a granddaughter who taught herself to read. Both of them now are not just learning to read but reading to learn.

Genetic influence is just influence; it’s not deterministic like a single gene. At government levels—I’ve consulted with the Department for Education—I don’t think they’re as hostile to genetics as I had feared, they’re just ignorant of it. Education just doesn’t consider genetics, whereas teachers on the ground can’t ignore it. I never get static from them because they know that these children are different when they start. Some just go off on very steep trajectories, while others struggle all the way along the line. When the government sees that, they tend to blame the teachers, the schools, or the parents, or the kids. The teachers know. They’re not ignoring this one child. If anything, they’re putting more energy into that child.

It’s frustrating for Plomin because he knows that eventually DNA mapping will get good enough that real, and helpful, predictions will be possible. We’ll be able to target kids early enough to make real differences — earlier than problems actually manifest — and hopefully change the course of their lives for the better. But so far, no dice.

Education is the last backwater of anti-genetic thinking. It’s not even anti-genetic. It’s as if genetics doesn’t even exist. I want to get people in education talking about genetics because the evidence for genetic influence is overwhelming. The things that interest them—learning abilities, cognitive abilities, behavior problems in childhood—are the most heritable things in the behavioral domain. Yet it’s like Alice in Wonderland. You go to educational conferences and it’s as if genetics does not exist.

I’m wondering about where the DNA revolution will take us. If we are explaining 10 percent of the variance of GCSE scores with a DNA chip, it becomes real. People will begin to use it. It’s important that we begin to have this conversation. I’m frustrated at having so little success in convincing people in education of the possibility of genetic influence. It is ignorance as much as it is antagonism.

Here’s one call for more reality recognition.

***

Still Interested? Check out a book by John Brockman of Edge.org with a curated collection of articles published on genetics.

J.K. Rowling On People's Intolerance of Alternative Viewpoints https://myvibez.link/j-k-rowling-intolerance-viewpoints/ Thu, 07 Jul 2016 11:00:27 +0000

At the PEN America Literary Gala & Free Expression Awards, J.K. Rowling, of Harry Potter fame, received the 2016 PEN/Allen Foundation Literary Service Award. Embedded in her acceptance speech is some timeless wisdom on tolerance and acceptance:

Intolerance of alternative viewpoints is spreading to places that make me, a moderate and a liberal, most uncomfortable. Only last year, we saw an online petition to ban Donald Trump from entry to the U.K. It garnered half a million signatures.

Just a moment.

I find almost everything that Mr. Trump says objectionable. I consider him offensive and bigoted. But he has my full support to come to my country and be offensive and bigoted there. His freedom to speak protects my freedom to call him a bigot. His freedom guarantees mine. Unless we take that absolute position without caveats or apologies, we have set foot upon a road with only one destination. If my offended feelings can justify a travel ban on Donald Trump, I have no moral ground on which to argue that those offended by feminism or the fight for transgender rights or universal suffrage should not oppress campaigners for those causes. If you seek the removal of freedoms from an opponent simply on the grounds that they have offended you, you have crossed the line to stand alongside tyrants who imprison, torture and kill on exactly the same justification.

Too often we look at the world through our own eyes and fail to acknowledge the eyes of others. In so doing, we lose touch with reality.

The quick reaction our brains have to people who disagree with us is often that they are idiots. They shouldn’t be allowed to talk or have a platform. They should lose.

This reminds me of Kathryn Schulz’s insightful view on what we do when someone disagrees with us.

As a result, we dismiss the views of others, failing even to consider that our own view of the world might be wrong.

It’s easy to be dismissive and intolerant of others. It’s easy to say they’re idiots and wish they didn’t have the same rights you have. It’s harder to map that to the very freedoms we enjoy and relate it to the world we want to live in.

The post J.K. Rowling On People’s Intolerance of Alternative Viewpoints appeared first on Farnam Street.

Warren Berger’s Three-Part Method for More Creativity https://myvibez.link/warren-berger-system-questioning/ Wed, 29 Jun 2016 11:00:27 +0000 https://www.farnamstreetblog.com/?p=28061 “A problem well stated is a problem half-solved.” — Charles “Boss” Kettering *** The whole scientific method is built on a very simple structure: If I do this, then what will happen? That’s the basic question on which more complicated, intricate, and targeted lines of inquiry are built, across a wide variety of subjects. This simple form helps us push deeper …

“A problem well stated is a problem half-solved.”
— Charles “Boss” Kettering

***

The whole scientific method is built on a very simple structure: If I do this, then what will happen? That’s the basic question on which more complicated, intricate, and targeted lines of inquiry are built, across a wide variety of subjects. This simple form helps us push deeper and deeper into knowledge of the world. (On a sidenote, science has become such a loaded, political word that this basic truth of how it works frequently seems to be lost!)

Individuals learn this way too. From the time you were a child, you were asking why (maybe even too much), trying to figure out all the right questions to ask to get better information about how the world works and what to do about it.

Because question-asking is such an integral part of how we know things about the world, both institutionally and individually, it seems worthwhile to understand how creative inquiry works. If we want to do things that haven't been done or learn things that have never been learned (in short, be more creative), we must learn to ask the right questions, ones so good that they're half-answered in the asking. And to do that, it helps to understand the process, no?

Warren Berger proposes a simple method in his book A More Beautiful Question: an interesting three-part system to help (partially) solve the problem of inquiry. He calls it The Why, What If, and How of Innovative Questioning, and reminds us why it's worth learning about.

Each stage of the problem solving process has distinct challenges and issues–requiring a different mind-set, along with different types of questions. Expertise is helpful at certain points, not so helpful at others; wide-open, unfettered divergent thinking is critical at one stage, discipline and focus is called for at another. By thinking of questioning and problem solving in a more structured way, we can remind ourselves to shift approaches, change tools, and adjust our questions according to which stage we’re entering.

Three-Part Method for More Creativity

Why?

It starts with the Why?

A good Why? seeks true understanding. Why are things the way they are currently? Why do we do it that way? Why do we believe what we believe?

This start is essential because it gives us permission to continue down a line of inquiry fully equipped. Although we may think we have a brilliant idea in our heads for a new product, or a new answer to an old question, or a new way of doing an old thing, unless we understand why things are the way they are, we’re not yet on solid ground. We never want to operate from a position of ignorance, wasting our time on an idea that hasn’t been pushed and fleshed out. Before we say “I already know” the answer, maybe we need to step back and look for the truth.

At the same time, starting with a strong Why also opens up the idea that the current way (whether it’s our way or someone else’s) might be wrong, or at least inefficient. Let’s say a friend proposes you go to the same restaurant you’ve been to a thousand times. It might be a little agitating, but a simple “Why do we always go there?” allows two things to happen:

A. Your friend can explain why, and this gives him/her a legitimate chance at persuasion. (If you’re open minded.)

B. The two of you may agree you only go there out of habit, and might like to go somewhere else.

This whole Why? business is the realm of contrarian thinking, which not everyone enjoys doing. But Berger cites the case of George Lois:

George Lois, the renowned designer of iconic magazine covers and celebrated advertising campaigns, was also known for being a disruptive force in business meetings. It wasn’t just that he was passionate in arguing for his ideas; the real issue, Lois recalls, was that often he was the only person in the meeting willing to ask why. The gathered business executives would be anxious to proceed on a course of action assumed to be sensible. While everyone else nodded in agreement, “I would be the only guy raising his hand to say, ‘Wait a minute, this thing you want to do doesn’t make any sense. Why the hell are you doing it this way?”

Others in the room saw Lois to be slowing the meeting and stopping the group from moving forward. But Lois understood that the group was apt to be operating on habit–trotting out an idea or approach similar to what had been done in similar situations before, without questioning whether it was the best idea or the right approach in this instance. The group needed to be challenged to “step back” by someone like Lois–who had a healthy enough ego to withstand being the lone questioner in the room.

The truth is that a really good Why?-type question tends to be threatening. That's also what makes it useful. It challenges us to step back and stop thinking on autopilot. It also requires what Berger calls a step back from knowing: letting go of that recognizable feeling of knowing something without knowing how you know it. Forcing this shift in perspective is, of course, among the most valuable things you can do.

Berger describes a valuable exercise that's sometimes used to force perspective on people who think they already have a complete answer. A facilitator (in Berger's telling, a trainer named Srinivas) shows the audience a drawing of a large square seemingly divided into 16 smaller squares and asks, "How many squares do you see?"

The easy answer is sixteen. But the more observant people in the group are apt to notice–especially after Srinivas allows them to have a second, longer, look–that you can find additional squares by configuring them differently. In addition to the sixteen single squares, there are nine two-by-two squares, four three-by-three squares, and one large four-by-four square, which brings the total to thirty squares.

“The squares were always there, but you didn’t find them until you looked for them.”

Point being, until you step back, re-examine, and look a little harder, you might not have seen all the damn squares yet!
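
If you want to check the arithmetic, here is a minimal sketch in Python (not from Berger's book, just a generic grid-counting illustration): for each square size k from 1 to n, a k-by-k square can start at (n - k + 1) positions along each axis of an n-by-n grid.

def count_squares(n):
    # Count every axis-aligned k-by-k square visible in an n-by-n grid.
    # A k-by-k square can start at (n - k + 1) positions along each axis,
    # so there are (n - k + 1) ** 2 of them for each size k.
    return sum((n - k + 1) ** 2 for k in range(1, n + 1))

# Berger's 4x4 drawing: 16 + 9 + 4 + 1 = 30 squares.
print(count_squares(4))  # prints 30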

What If?

The second part is where a good questioner, after using Why? to understand as deeply as possible and open a new line of inquiry, proposes a new type of solution, usually an audacious one — all great ideas tend to be, almost by definition — by asking What If…?

Berger illustrates this one well with the story of Pandora Music. The founder Tim Westergren wanted to know why good music wasn’t making it out to the masses. His search didn’t lead to a satisfactory answer, so he eventually asked himself, What if we could map the DNA of music? The result has been pretty darn good, with something close to 80 million listeners at present:

The Pandora story, like many stories of inquiry-driven startups, started with someone’s wondering about an unmet need. It concluded with the questioner, Westergren, figuring out how to bring a fully realized version of the answer into the world.

But what happened in between? That’s when the lightning struck. In Westergren’s case, ideas and influences began to come together; he combined what he knew about music with what he was learning about technology. Inspiration was drawn from a magazine article, and from a seemingly unrelated world (biology). A vision of the new possibility began to form in the mind. It all resulted in an audacious hypothetical question that might or might not have been feasible–but was exciting enough to rally people to the challenge of trying to make it work.

The What If stage is the blue-sky moment of questioning, when anything is possible. Those possibilities may not survive the more practical How stage; but it’s critical to innovation that there be time for wild, improbable ideas to surface and to inspire.

If the word Why has penetrative power, enabling the questioner to get past assumptions and dig deep into problems, the words What if have a more expansive effect–allowing us to think without limits or constraints, firing the imagination.

Clearly, Westergren had engaged in serious combinatorial creativity, pulling from multiple disciplines, which led him to ask the right kind of questions. This seems to be a common feature at this stage of the game, and of new ideas in general:

Smart recombinations are all around us. Pandora, for example, is a combination of a radio station and search engine; it also takes the biological method of genetic coding and transfers it to the domain of music […] In today’s tech world, many of the most successful products–Apple’s iPhone being just one notable example–are hybrids, melding functions and features in new ways.

Companies, too, can be smart recombinations. Netflix was started as a video-rental business that operated like a monthly membership health club (and now it has added "TV production studio" to the mix). Airbnb is a combination of an online travel agency, a social media platform, and a good old-fashioned bed-and-breakfast (the B&B itself is a smart combination from way back.)

It may be that the Why? -> What If? line of inquiry is common to all types of innovative thinking because it engages the part of our brain that starts turning over old ideas in new ways by combining them with other unrelated ideas, many of them previously sitting idle in our subconscious. That churning is where new ideas really arise.

The idea then has to be “reality-tested”, and that’s where the last major question comes in.

How?

Once we think we’ve hit on a brilliant new idea, it’s time to see if the thing actually works. Usually and most frequently, the answer is no. But enough times to make it worth our while, we discover that the new idea has legs.

The most common problem here is that we try to perfect a new idea all at once, leading to stagnation and paralysis. That’s usually the wrong approach.

Another, often better, way is to try the idea quickly and start getting feedback. As much as possible. In the book, Berger describes a fun little experiment that drives home the point, and serves as a fairly useful business metaphor besides:

A software designer shared a story about an interesting experiment in which the organizers brought together a group of kindergarten children who were divided into small teams and given a challenge: Using uncooked spaghetti sticks, string, tape, and a marshmallow, they had to assemble the tallest structure they could, within a time limit (the marshmallow was supposed to be placed on top of the completed structure.)

Then, in a second phase of the experiment, the organizers added a new wrinkle. They brought in teams of Harvard MBA grad students to compete in the challenge against the kindergartners. The grad students, I’m told, took it seriously. They brought a highly analytical approach to the challenge, debating among themselves about how best to combine the sticks, the string, and the tape to achieve maximum altitude.

Perhaps you’ll have guessed this already, but the MBA students were no match for the kindergartners. For all their planning and discussion, the structures they carefully conceived invariably fell apart–and then they were out of time before they could get in more attempts.

The kids used their time much more efficiently by constructing right away. They tried one way of building, and if it didn’t work, they quickly tried another. They got in a lot more tries. They learned from their mistakes as they went along, instead of attempting to figure out everything in advance.

This little experiment gets run in the real world all the time by startups looking to outcompete ponderous old bureaucracies. They simply substitute velocity for scale and see what happens — it often works well.

The point is to move along the axis of Why? -> What If? -> How? without too much self-censoring in the last phase. Being afraid to fail often means a great What If? proposition gets stuck there forever: analysis paralysis, as it's sometimes called. But if you can enter the How? testing stage quickly, even if only to show that an idea won't work, then you can start the loop over again, either asking a new Why? or proposing a new What If? to an existing Why?

Thus moving your creative engine forward.

***

Berger’s point is that there is an intense practical end to understanding productive inquiry. Just like “If I do this, then what will happen?” is a basic structure on which all manner of complex scientific questioning and testing is built, so can a simple Why, What If, and How structure catalyze a litany of new ideas.

Still Interested? Check out the book, or check out some related posts: Steve Jobs on Creativity; Seneca on Gathering Ideas And Combinatorial Creativity; or, for some fun with question-asking, What If? Serious Scientific Answers to Absurd Hypothetical Questions.

The post Warren Berger’s Three-Part Method for More Creativity appeared first on Farnam Street.

Atul Gawande and the Mistrust of Science https://myvibez.link/atul-gawande-mistrust-science/ Thu, 23 Jun 2016 11:00:03 +0000 https://www.farnamstreetblog.com/?p=27887 Continuing on with Commencement Season, Atul Gawande gave an address to the students of Cal Tech last Friday, delivering a message to future scientists, but one that applies equally to all of us as thinkers: “Even more than what you think, how you think matters.” Gawande addresses the current growing mistrust of “scientific authority” — the thought …

Continuing with Commencement Season, Atul Gawande gave an address to the students of Caltech last Friday, delivering a message to future scientists, but one that applies equally to all of us as thinkers:

“Even more than what you think, how you think matters.”

Gawande addresses the growing mistrust of "scientific authority": the thought that because science creaks along one mistake at a time, it isn't to be trusted. The misunderstanding of what scientific thinking is and how it works is at the root of much problematic ideology, and it's up to those who do understand it to promote its virtues.

It’s important to realize that scientists, singular, are as fallible as the rest of us. Thinking otherwise only sets you up for a disappointment. The point of science is the collective, the forward advance of the hive, not the bee. It’s sort of a sausage-making factory when seen up close, but when you pull back the view, it looks like a beautifully humming engine, steadily giving us more and more information about ourselves and the world around us. Science is, above all, a method of thought. A way of figuring out what’s true and what we’re just fooling ourselves about.

So explains Gawande:

Few working scientists can give a ground-up explanation of the phenomenon they study; they rely on information and techniques borrowed from other scientists. Knowledge and the virtues of the scientific orientation live far more in the community than the individual. When we talk of a “scientific community,” we are pointing to something critical: that advanced science is a social enterprise, characterized by an intricate division of cognitive labor. Individual scientists, no less than the quacks, can be famously bull-headed, overly enamored of pet theories, dismissive of new evidence, and heedless of their fallibility. (Hence Max Planck’s observation that science advances one funeral at a time.) But as a community endeavor, it is beautifully self-correcting.

Beautifully organized, however, it is not. Seen up close, the scientific community—with its muddled peer-review process, badly written journal articles, subtly contemptuous letters to the editor, overtly contemptuous subreddit threads, and pompous pronouncements of the academy— looks like a rickety vehicle for getting to truth. Yet the hive mind swarms ever forward. It now advances knowledge in almost every realm of existence—even the humanities, where neuroscience and computerization are shaping understanding of everything from free will to how art and literature have evolved over time.

He echoes Steven Pinker in the thought that science, traditionally left to the realm of discovering “physical” reality, is now making great inroads into what might have previously been considered philosophy, by exploring why and how our minds work the way they do. This can only be accomplished by deep critical thinking across a broad range of disciplines, and by the dual attack of specialists uncovering highly specific nuggets and great synthesizers able to suss out meaning from the big pile of facts.

The whole speech is worth a read and reflection, but Gawande’s conclusion is particularly poignant for an educated individual in a Republic:

The mistake, then, is to believe that the educational credentials you get today give you any special authority on truth. What you have gained is far more important: an understanding of what real truth-seeking looks like. It is the effort not of a single person but of a group of people—the bigger the better—pursuing ideas with curiosity, inquisitiveness, openness, and discipline. As scientists, in other words.

Even more than what you think, how you think matters. The stakes for understanding this could not be higher than they are today, because we are not just battling for what it means to be scientists. We are battling for what it means to be citizens.

Still Interested? Read the rest, and read a few of this year's other commencement addresses by Nassim Taleb and Gary Taubes. Or read about E.O. Wilson, the great Harvard biologist, and what he thought it took to become a great scientist. (Hint: the same stuff it takes for anyone to become a great critical thinker.)

The post Atul Gawande and the Mistrust of Science appeared first on Farnam Street.

Eric Hoffer and the Creation of Fanatical Mass Movements https://myvibez.link/eric-hoffer-creation-fanatical-mass-movements/ Mon, 20 Jun 2016 11:00:08 +0000 https://www.farnamstreetblog.com/?p=27808 What is the nature of a true mass movement? In 1951, the American philosopher Eric Hoffer attempted to answer this, and published his first and most well-known work: The True Believer: Thoughts on the Nature of Mass Movements. The True Believer became a hit because it was released on the heels of World War II …

What is the nature of a true mass movement? In 1951, the American philosopher Eric Hoffer attempted to answer this, and published his first and most well-known work: The True Believer: Thoughts on the Nature of Mass Movements.

The True Believer became a hit because it was released on the heels of World War II and at the outset of the US/Soviet Cold War, and it sought to explain the nature of the "mass movements" that had created widespread devastation: Nazism, Fascism, and Stalinism among them. Most people were still hungry for answers. (Heck, many people all over the world were still hungry, period.)

Hoffer took the analysis a bit further. What did all mass movements seem to have in common? He didn’t stop with modern political movements, but thought also about religious movements, reformations, and nationalist movements throughout history, featuring heavy commentary on Christianity and Islam in particular.

The book is a series of loosely connected cogitations on the nature of fanatically organized mass movements, the kind that can lead to mass murder and starvation as in the cases above, but that have also led to movements we generally look fondly upon like the Catholic Reformation, the American Revolution, and the Indian Independence Movement.

Like any good book, The True Believer is impossible to summarize without losing a tremendous amount of understanding. But Hoffer does offer a loose framework for how mass movements start and move toward completion, and his insights are worth studying, for they give us a great window into humanity and history. This will be a longer one, but it's worth the ride.

The Intellectual Underpinning of Revolution

Hoffer makes it clear that the nature of a true mass movement is one of unified struggle. It comes at a time of disillusionment with the state of affairs. It’s not necessarily and not even usually desperation, though: People who can barely feed themselves do not tend to revolt, for they do not have the time, interest, or energy. Revolt tends to happen in a society of intellectual discourse and, counter-intuitively, a certain amount of freedom.

But when conditions are right and people are sufficiently whipped into a fervor, a mass movement can arise among a frustrated group. Hoffer outlines the basic definition of a mass movement by emphasizing the call for self-sacrifice as a central element:

The vigor of a mass movement stems from the propensity of its followers for united action and self-sacrifice. When we ascribe the success of a movement to its faith, doctrine, propaganda, leadership, ruthlessness and so on, we are but referring to instruments of unification and the means used to inculcate a readiness for self-sacrifice. It is perhaps impossible to understand the nature of mass movements unless it is recognized that their chief preoccupation is to foster, perfect and perpetuate a facility for united action and self-sacrifice. To know the processes by which such a facility is engendered is to grasp the inner logic of most of the characteristic attitudes and practices of an active mass movement.

With few exceptions, any group or organization which tries, for one reason or another, to create and maintain compact unity and a constant readiness for self-sacrifice usually manifests the peculiarities—both noble and base—of a mass movement. On the other hand, a mass movement is bound to lose much which distinguishes it from other types of organization when it relaxes its collective compactness and begins to countenance self-interest as a legitimate motive of activity. In times of peace and prosperity, a democratic nation is an institutionalized association of more or less free individuals. On the other hand, in time of crisis, when the nation's existence is threatened, and it tries to reinforce its unity and generate in its people a readiness for self-sacrifice, it almost always assumes in some degree the character of a mass movement. The same is true of religious and revolutionary organizations: whether or not they develop into mass movements depends less on the doctrine they preach and the program they project than on the degree of their preoccupation with unity and the readiness for self-sacrifice.

The essential nature, then, of a fanatical mass movement is one where a group of loyal followers can be made to believe, or simply nudged into indulging a prior belief, that their own lives are less important than a greater cause. A willingness to lose their own identity for the "greater good," however defined, seems a necessary element.

In order for this to happen, a population must obviously be given something to believe — a cause strong enough to subsume them. And in order to do that, the cause must be all-encompassing. The reason Hoffer titles the book True Believer is that a strong mass movement only works when it purports to provide a solution that can be turned to for all the essential answers: The Bible, the Qur’an, Liberty, Freedom, Communism, Equality, Lebensraum, the State, the Nation…all-encompassing narratives which would become the central dogma of a mass movement, to be enacted and upheld by force. (These narratives would become a central element of Yuval Harari’s wonderful book Sapiens about 50 years later. He called them the human myths.)

The process is kicked off by a radical intellectual, or what Hoffer calls a Man of Words.

The Man of Words

As Hoffer describes it, the process begins with a thinker or a group of thinkers with a strong set of ideas that offer a solution to a proposed societal problem. Karl Marx. Jesus Christ. Thomas Paine. Martin Luther. The German philosophers. The French philosophers.

These intellectuals "prepare the ground" for the movements to come by providing the central dogmas of the revolution, which begin to get the populace on board by offering them an alternative to the present: a new future. And without these forceful new ideas, the momentum will eventually die.

Generally, this only works in a society where the intellectuals are not already part of the establishment: There has to be a certain disaffected nature to their work. If they don’t hate the powers that be, why bother inciting Revolution — intentionally or not? Hoffer makes the point that in societies where the intellectuals are the ruling class, or participate heavily in the ruling class, there’s not much of a tendency towards a true mass movement.

In one passage, he describes the outline of this beginning period:

The men of letters of eighteenth-century France are the most familiar example of intellectuals pioneering a mass movement. A somewhat similar pattern may be detected in the periods preceding the rise of most movements. The ground for the Reformation was prepared by the men who satirized and denounced the clergy in popular pamphlets, and by men of letters like Johann Reuchlin, who fought and discredited the Roman curia. The rapid spread of Christianity in the Roman world was partly due to the fact that the pagan cults it sought to supplant were already thoroughly discredited. The discrediting was done, before and after the birth of Christianity, by the Greek philosophers who were bored with the puerility of the cults and denounced and ridiculed them in schools and city streets.

Christianity made little headway against Judaism because the Jewish religion had the ardent allegiance of the Jewish men of words. The rabbis and their disciples enjoyed an exalted status in Jewish life of that day, where the school and the book supplanted the temple and the fatherland. In any social order where the reign of men of words is so supreme, no opposition can develop within and no foreign mass movement can gain a foothold. The mass movements of modern time, whether socialist or nationalist, were invariably pioneered by poets, writers, historians, scholars, philosophers and the like. The connection between intellectual theoreticians and revolutionary movements needs no emphasis.

But it is equally true that all nationalist movements— from the cult of la patrie in revolutionary France to the latest nationalist rising in Indonesia—were conceived not by men of action but by fault-finding intellectuals. The generals, industrialists, landowners and businessmen who are considered pillars of patriotism are latecomers who join the movement after it has become a going concern. The most strenuous part of the early phase of every nationalist movement consists in convincing and winning over these future pillars of patriotism. The Czech historian Palacky said that if the ceiling of a room in which he and a handful of friends were dining one night had collapsed, there would have been no Czech nationalist movement.

Such handfuls of impractical men of words were at the beginning of all nationalist movements. German intellectuals were the originators of German nationalism, just as Jewish intellectuals were the originators of Zionism. It is the deep-seated craving of the man of words for an exalted status which makes him oversensitive to any humiliation imposed on the class or community (racial, lingual or religious) to which he belongs however loosely. It was Napoleon’s humiliation of the Germans, particularly the Prussians, which drove Fichte and the German intellectuals to call on the German masses to unite into a mighty nation which would dominate Europe. Theodore Herzl and the Jewish intellectuals were driven to Zionism by the humiliations heaped upon millions of Jews in Russia, and by the calumnies to which the Jews in the rest of continental Europe were subjected toward the end of the nineteenth century. To a degree the nationalist movement which forced the British rulers out of India had its inception in the humiliation of a scrawny and bespectacled Indian man of words in South Africa

Of course, the Man of Words is simply an intellectual forebear. Although it has happened, it's rare that he or she is the actual leader of the movement. As Hoffer points out, Christ was not a Christian and Marx was not a Marxist. The movements came as a result of their anti-establishment thought-work, but there's a crucial difference between the Man of Words and the Fanatic who will begin catalyzing the movement: the Man of Words may be a lot more intellectually flexible than the true believers who follow in their footsteps:

The genuine man of words himself can get along without faith in absolutes. He values the search for truth as much as truth itself. He delights in the clash of thought and in the give-and-take of controversy. If he formulates a philosophy and a doctrine, they are more an exhibition of brilliance and an exercise in dialectics than a program of action and the tenets of a faith. His vanity, it is true, often prompts him to defend his speculations with savagery and even venom; but his appeal is usually to reason and not to faith. The fanatics and the faith-hungry masses, however, are likely to invest such speculations with the certitude of holy writ, and make them the fountainhead of a new faith. Jesus was not a Christian, nor was Marx a Marxist.

[…]

The reason for the tragic fate which almost always overtakes the intellectual midwives of a mass movement is that, no matter how much they preach and glorify the united effort, they remain essentially individualists. They believe in the possibility of individual happiness and the validity of individual opinion and initiative. But once a movement gets rolling, power falls into the hands of those who have neither faith in, nor respect for, the individual. And the reason they prevail is not so much that their disregard of the individual gives them a capacity for ruthlessness, but that their attitude is in full accord with the ruling passion of the masses.

The next step, corralling the people, is often carried out by a Fanatic.

Fanatics of a Revolution

When the moment is ripe, only the fanatic can hatch a genuine mass movement. Without him the disaffection engendered by militant men of words remains undirected and can vent itself only in pointless and easily suppressed disorders. Without him the initiated reforms, even when drastic, leave the old way of life unchanged, and any change in government usually amounts to no more than a transfer of power from one set of men of action to another. Without him there can perhaps be no new beginning.

This may be a bit of historical curve-fitting, but Hoffer thinks that a frustrated creative intellectual makes a pretty good Fanatic, and points to specific examples of where that held — Hitler, Robespierre and Lenin among them. Whether or not that is true matters less than the fact that the Fanatic is responsible for galvanizing the movement and pushing it to a point of no return.

But the Fanatic is not really a good leader or manager in a long-term sense, simply because of the nature of their fanaticism: They believe too strongly in their dogmas, and their intellectual blindness pushes them to blunder. Hitler and Mussolini would be excellent examples. Their fanaticism got them a long way, but ultimately it sowed the seeds of its own destruction.

If it is to be a sustainable movement, then the Men of Action, as Hoffer names them, will have to do the hard work.

Men of Action: Holding Down the Fort

Hoffer continues by describing the final phase of the beginning of a Movement: Leadership by a practical actor with less devotion to pure fanaticism and a healthier dose of order (ideally):

The chief preoccupation of a man of action when he takes over an “arrived” movement is to fix and perpetuate its unity and readiness for self-sacrifice. His ideal is a compact, invincible whole that functions automatically. To achieve this he cannot rely on enthusiasm, for enthusiasm is ephemeral. Persuasion, too, is unpredictable. He inclines, therefore, to rely mainly on drill and coercion. He finds the assertion that all men are cowards less debatable than that all men are fools, and, in the words of Sir John Maynard, inclines to found the new order on the “necks of the people rather than in their hearts.” The genuine man of action is not a man of faith but a man of law.

Still, he cannot help being awed by the tremendous achievements of faith and spontaneity in the early days of the movement when a mighty instrument of power was conjured out of the void. The memory of it is still extremely vivid. He takes, therefore, great care to preserve in the new institutions an impressive façade of faith, and maintains an incessant flow of fervent propaganda, though he relies mainly on the persuasiveness of force. His orders are worded in pious vocabulary, and the old formulas and slogans are continually on his lips. The symbols of faith are carried high and given reverence. The men of words and the fanatics of the early period are canonized.

Though the steel fingers of coercion make themselves felt everywhere and great emphasis is placed on mechanical drill, the pious phrases and the fervent propaganda give to coercion a semblance of persuasion, and to habit a semblance of spontaneity. No effort is spared to present the new order as the glorious consummation of the hopes and struggles of the early days.

It is, of course, true that the categories often overlap. Joseph Stalin was both a practical leader and a man of deep fanaticism, as was Hitler. The leaders of the American Revolution certainly carried both traits, as did Gandhi in leading the Indian independence movement. The balance between the fanaticism necessary to catalyze real change and the practical sense needed to sustain a cohesive movement probably goes a long way toward determining success or failure as time goes on. (Nazism collapsed, at least partially, due to fanaticism outstripping reality.)

From there, the new movement, no longer a minority but the dominant power, must find a way to stabilize, and it often does this by patching together a structure from many other institutions:

Stalin’s Russia was a patchwork of bolshevism, czarism, nationalism, pan-Slavism, dictatorship and borrowings from Hitler, and monopolistic capitalism. Hitler’s Third Reich was a conglomerate of nationalism, racialism, Prussianism, dictatorship and borrowings from fascism, bolshevism, Shintoism, Catholicism and the ancient Hebrews. Christianity, too, when after the conflicts and dissensions of the first few centuries it crystallized into an authoritarian church, was a patchwork of old and new and of borrowings from friend and foe. It patterned its hierarchy after the bureaucracy of the Roman Empire, adopted portions of the antique ritual, developed the institution of an absolute leader and used every means to absorb all existent elements of life and power.

***

Thus we have an outline for the True Believer style mass social movement: It starts by creating an ideology, a set of dogmas around which a fanatical leader or group can create a following. From there, it must find a way to sustain itself as reality creeps in on ideology — structure is introduced or the whole thing will collapse. In almost all cases, there is tremendous violence, although in certain (rare) circumstances, it can be bloodless. Even the rise of religions or nations that promote peace came with tremendous bloodshed.

In the end, mass movements likely take many forms, but Hoffer gives us as good a framework as any to start thinking about the way they are constructed.

Still Interested? Check out the whole book; it's a very interesting read.

The post Eric Hoffer and the Creation of Fanatical Mass Movements appeared first on Farnam Street.
