Saturday, May 26, 2018

In Defense of Virtue Ethics

I sometimes get the sense that utilitarianism has become the dominant view in pop philosophy, with deontology seen as a close -- or perhaps the only -- alternative. The most famous thought experiment in philosophy, the trolley problem, distinguishes between the two views. Five people are tied to a trolley track. A trolley barrels down the track and approaches a junction with another track. There is one person tied to the second track. You can pull a lever and switch the trolley from the five-person track to the one-person track. Utilitarians (typified by John Stuart Mill) pull the lever so that only one person dies instead of five. Deontologists (typified by Immanuel Kant) wouldn't pull the lever, because pulling it makes you an active participant in a killing, and killing is categorically wrong.

An illustration of the trolley problem.
The 'in' thing these days seems to be that utilitarianism is correct and deontology is dumb. At least that's what you see in the comments of the Trolley Problem Memes Facebook page.


An anti-deontologist meme.
And to be fair, utilitarianism is a pretty good ethical system. With a few basic assumptions (e.g. we care about human happiness) and infinite computing power, you can get a sensible answer to pretty much any ethical quandary by tallying up the utils (happiness points) for any given action. There are paradoxes that emerge from this approach, but overall it's basically sensible. Deontology has its merits too, though, perhaps chief among them that it relies on simple-to-follow rules instead of intractable calculations of future outcomes.
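
In fact, the basic utilitarian calculus is simple enough to sketch in a few lines of code. Here's a minimal toy version -- the actions, probabilities, and util values are entirely invented for illustration:

```python
# A toy utilitarian calculus: choose the action whose possible outcomes
# maximize expected utils (happiness points). All numbers are invented.

actions = {
    "pull_lever":      [(1.0, -100.0)],  # one certain death: -100 utils
    "dont_pull_lever": [(1.0, -500.0)],  # five certain deaths: -500 utils
}

def expected_utils(outcomes):
    """Sum of probability-weighted utils over the possible outcomes."""
    return sum(p * utils for p, utils in outcomes)

best_action = max(actions, key=lambda a: expected_utils(actions[a]))
print(best_action)  # -> pull_lever
```

The "infinite computing power" caveat is doing a lot of work here: in real life the hard part is enumerating the outcomes and assigning the utils, not taking the maximum.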

What's missing from this debate is any mention of virtue ethics. Virtue ethics, instead of focusing on outcomes or actions, is primarily interested in virtues, or positive moral qualities of the individual. Virtue ethics is ambivalent about the trolley problem - on the one hand, actively killing someone by pulling the lever can have dramatic negative consequences on the psyche a la Crime and Punishment, but choosing not to save the five people on the first track, when you know that you could have if you so chose, can be equally character-demolishing.

One trolley problem-esque thought experiment for virtue ethics is the corruption problem. You are in a position where you could run for political office. You know that power corrupts, and according to virtue ethics being corrupted is bad. However, there is another candidate running for office, and you know that the power of political office will corrupt the other candidate more than it will corrupt you. The utilitarian would say "it's better that you run for office, because if the other guy wins then he'll do worse things because he will become more corrupt." A virtue ethics purist, however, might say that your responsibility is to focus on your own virtues, and thus you should avoid the corrupting occupation even if this choice results in net negative consequences for society. (Thought experiment proposed by R. Aryeh Klapper.)

In any event, virtue ethics has been subject to criticism in the rationalist community. Scott Alexander the Great in particular wrote a series of posts in 2013 arguing against virtue ethics in favor of utilitarianism (link, link, link).

...we can debate important moral questions – like abortion, or redistributive taxation – until the cows come home, but this is in fact only the appearance of debate since we have no agreed-upon standards against which to judge these things...
The interesting thing about virtue ethics is that it is uniquely bad at this problem... For example, in Kant's famous "an axe murderer asks you where his intended victim is" case, the virtue of truthfulness conflicts with the virtue of compassion (note, by the way, that no one has an authoritative list of the virtues and they cannot be derived from first principles, so anyone is welcome to call anything a virtue and most people do).
This is a valid critique of virtue ethics, but I think it misses the point of what virtue ethics is about. In real life, most people are not confronted by runaway trolleys or inquisitive ax murderers very often. I personally find myself in morally ambiguous situations maybe once a year on average. Consequentialism and deontology, while useful for resolving the hard problems when they do come up, simply don't play much of a role in my day-to-day life.

When a person wakes up in the morning, he faces a litany of seemingly trivial choices: what to eat, how to behave at work, how to interact with his family, how to apportion his time, etc. Utilitarianism has little to say about these choices, other than things like "if you exercise you'll live longer, but then again if you live longer you might be unhappy during those extra years and watching Netflix will make you happy now..." and deontology can say "The categorical imperative says you must exercise 30 minutes a day." But neither of these seems like the correct approach to even framing the issue of whether and why you should exercise.

Enter virtue ethics. Virtue ethics answers the question "what is the proper path that a man should take for himself in life?" Virtue ethics promotes the concept of eudaimonia, or human flourishing. Virtue ethics differs from utilitarianism and deontology because it views the flourishing of the individual -- that is to say, you personally -- as an important teleological end in itself. While utilitarianism cares about universal happiness and suffering and deontology cares about universal moral rules, virtue ethics cares about you. Virtue ethics is a project to build humans into the best possible versions of themselves. Of course, within virtue ethics there is some debate as to what the best possible version of a person looks like, just as there is debate within any other ethical framework. But questions like "should I exercise now" or "how should I spend my time this evening" begin to make a lot of sense in a virtue ethics framework.

For the most part, virtue ethics isn't interested in the big once-a-year moral dilemmas, and it would be happy to hand off that job to utilitarianism. But by the same token, utilitarianism isn't particularly interested in the role that virtue ethics plays either. If you're a billionaire with a million dollars to burn and need to decide which charity to give that money to, on a utilitarian frame the answer to that question is astronomically more important than your decision of whether or not to exercise, to the point where the exercise question doesn't even register as a worthwhile ethical problem. Virtue ethics, on the other hand, will say "hey buddy, it's great that you gave a million dollars to charity, but first of all it's a tiny portion of your wealth, so you're not really that much better than anyone else, and also you still need to exercise, because being physically active is a virtue just like generosity is."

II.

Perhaps one of the great losses to the world when religion lost its philosophical standing is that questions of individual virtue took a back seat to political questions, which - by dint of affecting large numbers of people - obscured the need for every person to independently build their own personal character. This sentiment is typified in the second-wave feminist mantra "the personal is political." From a utilitarian perspective, of course, there is a great deal of truth to this. The overall physical well-being of people, especially people who are members of vulnerable groups, depends largely on policy.

But there is a sense in which relegating all individual well-being to the political sphere is profoundly dangerous. Practically speaking, if political change is unattainable, the vulnerable individual is done for. If the individual has to wait for society to remove barriers to his success before moving forward in life, he has no recourse but to either passively wait or attempt to effect societal transformation. The latter, being the more active approach, tends to feel like the nobler choice, so political activism becomes the moral activity.

Political activism, when done correctly, does have utilitarian moral value, assuming that one candidate dominates the other in total utility and that you made the right choice. The website 80,000 Hours estimates that a vote in a US election is worth somewhere on the order of $100,000 to $1 million in terms of the expected impact on the economy (which is a weak metric for well-being, but it's what we have). It also cites a Brookings Institution study finding that door-to-door canvassing yields one vote for every 14 contacts and phone banking one vote for every 38 contacts. On the other hand, leaflet and email campaigns had little to no statistically significant effect. Interestingly, there is a surprisingly large incidence - on the order of 20% - of people who change their minds on a political issue due to social media, according to this Pew study.

So from a utilitarian standpoint, social media activism might actually be one of the most impactful and important ways you can affect people's lives in a positive way. Swinging one Facebook friend to vote for the better candidate is worth more than most people's annual salaries.
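
To spell out the back-of-the-envelope arithmetic, here is how those figures combine. Everything in this sketch is an order-of-magnitude estimate taken from the sources cited above, not precise data:

```python
# Back-of-the-envelope value of political activism, using the rough
# figures cited above. These are order-of-magnitude estimates, not data.

vote_value_low, vote_value_high = 100_000, 1_000_000  # $ per vote (80,000 Hours)
contacts_per_vote = {
    "door knock": 14,   # door-to-door canvassing (Brookings)
    "phone call": 38,   # phone banking (Brookings)
}

for method, contacts in contacts_per_vote.items():
    low = vote_value_low / contacts
    high = vote_value_high / contacts
    print(f"{method}: ${low:,.0f} - ${high:,.0f} of expected value per contact")

# Swinging one voter outright (say, a Facebook friend) is worth the full
# $100,000 - $1,000,000 -- more than most annual salaries.
```

Of course, every number in that calculation is a soft estimate, which is part of what makes naive utility maximization so slippery.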

But think about how dystopian this becomes, even for a person such as myself who spends, er, a very utility-maximizing amount of time arguing with people on the internet. All of a sudden "being ethical" turns into "spending every waking hour in flame wars on Twitter." What you eat doesn't matter. Whether you exercise doesn't matter, except in that it takes time away from social media. How you treat your friends, family, colleagues, barely matters. All that matters is that sweet, sweet, utility maximization by convincing people on the internet to vote for the right person. And under utilitarianism, that is an ethical imperative.

Maybe this example is silly, because I'm overestimating the impact that social media has on policy outcomes and human well-being. But there are examples that hit a lot harder. Chiune Sugihara was a Japanese official serving as vice consul in Lithuania during World War II. Sugihara took it upon himself to hand-write thousands of visas, for up to 20 hours a day, for Jews so that they could escape the Holocaust via Japan. He produced as many visas in a day as would normally be produced in a month. He slept little and wrote until he could literally write no more, knowing that every hour he slept would mean the deaths of dozens of people. He saved 5,558 Jews from death at the hands of the Germans and is undoubtedly one of the great heroes of the Second World War.

A photographic portrait of Chiune Sugihara.

But somehow it seems unreasonable to make this kind of heroism ethically normative for everyone. Realistically, we can expect heroic individuals who completely disregard their own needs in order to help others to come along only once in a blue moon. And most people don't find themselves in positions where they have an opportunity to accomplish such gargantuan feats of utility maximization.

Ethics needs to have a normative default for normal people living boring, non-heroic lives. And people need to have a sense of what it means to live a good life -- that is, a happy, healthy, meaningful, and psychologically rewarding life. It turns out that a big part of living a good life is being selfless and generous, but that's not the entirety of it.

Utilitarianism believes in the maximization of well-being but has no clear definition of what well-being looks like. It uses things like standards of living, wealth, and "happiness" as proxies, but it turns out that A) wealth is not necessarily a great indicator of happiness, especially between countries (link), and B) "happiness" is ill-defined and notoriously difficult to achieve. Politics, at the end of the day, is mostly concerned with questions of wealth, which is why, though politics is certainly necessary for achieving well-being, it is far from sufficient. For people to actually be better off, they have to understand what "better off" means.

Virtue ethics defines well-being and, importantly, states that even under the best of circumstances, well-being requires effort on the part of the individual -- and that's a feature, not a bug. Well-being means investing time in your family. It means eating healthy food. It means being nice to people. It means taking your work seriously. But above all, it means that psychology -- in terms of your personality and character traits -- is really, really important.

III.

A lot of religions and ancient traditions of ethical thought are essentially virtue ethics systems. Confucianism, for example, emphasized virtues like benevolence, justice, filial piety, honesty, and loyalty. Rabbinic Judaism, with its myriad rules and commandments, is also, in many interpretations, essentially a virtue ethics system. Though some laws in Judaism have a strong deontological vibe ("don't murder" probably just means "don't murder"), many of the rituals have ethical symbolism, meant to inculcate virtues in their practitioners. The obvious example here is "don't cook a kid goat in its mother's milk," which is meant to teach the virtue of compassion. The Bible and Talmud are also replete with stories and lessons about virtuous behavior.

The import of virtue ethics fully crystallized in Judaism with the advent of the mussar (ethics) movement in the 19th century. In the mussar framework, ethical behavior was seen as Judaism's paramount value, in some sense superseding the ritual aspects of the religion. That being said, the mussar movement remained fully Orthodox in terms of practice; it simply shifted the emphasis towards ethics. One of the major insights of the mussar movement was that personal character is not only important but can be improved with constant, vigilant attention -- and, critically, that personal character improvement is man's chief responsibility in life.

Adherents of the mussar movement had a number of methods for achieving personal character goals. One prominent mussarist, Eliyahu Dessler, incorporated Freudian psychology into his teachings. Other schools of thought were even more unorthodox. Yisrael Salanter, the founder of the movement, recommended that people chant mantras to instill in themselves personal virtues. An old joke goes that in one mussar school, a student sits swaying back and forth, vociferously and emphatically repeating the mantra ich bin a gornisht - I am nothing (meant to invoke humility) - over and over again. Another student, seeing this display, starts chanting the same mantra even more loudly than the first. The first student picks up his head and angrily yells, "I can't believe this guy thinks he's a bigger gornisht than I am!" One yeshiva (religious academy) in Nevarodok, Russia had a practice of sending its students to bakeries to ask to buy hardware items, so that the students would build personal resilience and humility in the face of mockery.

The mussar movement was essentially wiped out in the Holocaust. Its only real remnants are books written by its adherents and the 15-minute seder (study period) or shmuess (sermon) devoted to mussar in some yeshivot. This raises the question: why was there no serious mussar revival movement in Orthodox Judaism? The Lithuanian tradition of Talmud study refounded itself after the Holocaust, as did many streams of Hassidic Judaism. Why not mussar?

I think the answer here is that, in some sense, people knew that the mussar movement was a failure. As a teacher of mine once remarked, "if you do everything the mussar books tell you to do, you turn into the kind of person no one really wants to be around." Some historical accounts seem to support this take; Chaim Grade's book The Yeshiva is an excellent historical novel portraying the darker side of the Nevarodok mussar tradition. While mussar was perhaps noble in its conception, its methods and overall obsessiveness left much to be desired.

The word mussar (מוסר) is most famously associated with the biblical verse (Proverbs 1:8) "שְׁמַע בְּנִי מוּסַר אָבִיךָ וְאַל תִּטֹּשׁ תּוֹרַת אִמֶּךָ" - listen, my son, to the instruction (mussar) of your father, and do not forsake the teaching (Torah) of your mother. There is a bit of an oddity in this verse: the teaching of the law (Torah) was historically considered the province of males in the patriarchal society of the Bible, yet here the gender roles are reversed - the mother is the teacher of the law and the father is giving ethical instruction. I claim that the gendered coding here is no accident. The verse in Proverbs is describing an ancient practice of fathers teaching their sons "how to become men," which is distinct from the teaching of the law. Virtue ethics, at its root, is the set of values that parents instill in their children, and in particular the values that fathers teach their sons.

For whatever reason, males seem to have a particular need for this sort of instruction, perhaps due to greater levels of testosterone-fueled aggression relative to women. Men are in particular need of having their basest instincts reined in, and it's often only a father figure who can play that crucial instructive role. Fatherlessness is one of the most important predictors of male incarceration (source), as well as of a number of other outcomes, like school performance and behavioral issues, that are more pronounced in boys than in girls (source).

It is perhaps for this reason that personal improvement preachers like Jordan Peterson have become so popular among males. Peterson is notorious (among other things) for his exhortation that people clean their rooms (literally and metaphorically) as a critical aspect of their character development. Many find this lesson comical, but it makes perfect sense in terms of virtue ethics. In virtue ethics, the little details of our lives - like the cleanliness of our living space - reflect and affect how we see ourselves. And unsurprisingly, "clean your room" is also exactly the kind of instruction that fathers have been giving their sons around the world, from time immemorial.

Speaking of time immemorial, I came across this cuneiform tablet at the Oriental Institute in Chicago.


There's a certain irony to the fact that the #MeToo movement in 2018 in America has an ally in a 4,400-year-old message in which a father tells his son how to behave around women. This text is part of a genre known as "wisdom literature" -- of which the book of Proverbs is also a part -- which is in many ways synonymous with virtue ethics. A tablet like this doesn't give us a utility-maximizing function (though, incidentally, there were some cuneiform texts with algebra problems in the same display case), nor does it really establish a deontological system of rules that everyone is obliged to obey. Rather, this is advice -- wisdom -- from a father to his son about how to live a good, ethical life.

I think there may be a lesson here, about what we need to effect societal change in areas like, say, the prevalence of sexual assault. The political and criminal justice systems might be able to make a dent in predatory male behavior via the standard deterrents. But the first line of defense against bad behavior shouldn't be politics or the justice system, it should be individual ethics. And most of the time individual ethics isn't about utilitarianism or deontology -- most people don't even know what those words mean. In the real world, for most people, ethics is simply the values that good parents teach their children.

Otherwise known as virtue ethics.

Monday, May 21, 2018

On the Value of Received Tradition


I.

One of the foundational beliefs of rabbinic Judaism is that when God gave the Torah to Moses, it contained two components: the oral law and the written law. The written law is the biblical text of the Five Books of Moses of the Old Testament, and the oral law is an oral tradition, passed down from teacher to student, containing explanations of the written law as well as laws which were not included in the biblical text. It is the oral law which comprises the primary canonical document of rabbinic Judaism, the Talmud. Although committing the Talmud to writing changed the dynamic nature of the student-teacher process of transmission, the Talmud remains mostly a closed book to people who try to study it on their own, due to its difficult language, obscure references, and perplexing argumentative structure. Thus, although the flexible nature of an oral tradition of law was lost with the writing of the Talmud, there remained a strong need for young aspiring Talmudists to find themselves learned teachers.

There is another important role that oral tradition plays in rabbinic Judaism, namely as a verification of the national history of the Jewish people. The (orthodox) argument for the historicity of the events recorded in the Bible depends on the idea that the Jewish people kept a parent-to-child oral tradition about the major events discussed in the Bible, such as the Exodus and the receiving of the Torah. As the argument goes, the Jewish people thus have an oral tradition that independently verifies the text of the Bible. [This is the essence of the argument proposed by Rabbi Yehuda Halevi in his work The Kuzari, written in 1140.]

It is therefore no surprise that when Jewish philosophers in the Rabbinic tradition (Maimonides, Luzzatto) write about epistemology, "received tradition" is afforded a high place in their epistemological frameworks. Oral traditions also play a strong role in Christianity and Islam and are an important component of the hierarchies of religious authority, especially in the early histories of those religions. Oral traditions give learned individuals a monopoly on the teachings of the religion, which ensures that religious doctrine and practice don't change with the whims of the masses.

In contrast, modern philosophers, such as Descartes and Hume, placed emphasis on empirical observation and reason. This emphasis also meant that the individual played a much stronger role in epistemology, expected to rely on his own senses and reasoning ability to come to true conclusions about the world. [This dovetails with ideas being formulated at around the same time that placed the individual at the pinnacle of moral and political considerations.] In a simplified version of history, the philosophical breakthrough of emphasizing observation and reason over tradition is what led to modern science and the fall of religion, at least in terms of the standing of religion within philosophy. The narrative of rationalism and empiricism replacing tradition, heralding enlightenment, and building a brave new world is still a staple of contemporary secular thought, despite challenges raised from both the right and the left. (See, for example, the debate about Steven Pinker's recent book Enlightenment Now.)

II.

I came across an article by a journalist who attended a Flat-Earth convention. The author characterizes the conference attendees as follows:
...[F]lat earthers do seem to place a lot of emphasis and priority on scientific methods and, in particular, on observable facts. The weekend in no small part revolved around discussing and debating science, with lots of time spent running, planning, and reporting on the latest set of flat earth experiments and models. Indeed, as one presenter noted early on, flat earthers try to “look for multiple, verifiable evidence” and advised attendees to “always do your own research and accept you might be wrong”. 
The emphasis on individual reasoning and skepticism toward extant knowledge also characterizes a number of other anti-science movements that have cropped up in the United States and around the world, such as climate change denial and anti-vaccination activism. The curious thing about these movements is that they are prima facie doing exactly what Descartes and Hume would have wanted them to do: they are experimenting and thinking for themselves. So where did they go wrong?

As a graduate student in the hard sciences, I find myself in the position of observing firsthand how scientists actually work. And it turns out that received tradition plays a huge role in how science is conducted today, and without it the scientific enterprise would be doomed to fail.

Before beginning the research phase of their careers, most scientists attend graduate school to get up to speed on the current state of knowledge in the field. While flat-earthers might view graduate school (and perhaps school in general) as an indoctrination camp of sorts, what actually happens inside the classrooms is vigorous debate between professors and students (more often in seminars than in courses, but in courses as well), the tossing around of ideas for experimental design, and the explicit rejection of old theories in favor of recently published results. Scientific graduate school classrooms and seminars are among the most critical-thinking-suffused spaces in the world. But there is still - in most cases - a professor at the front of the classroom who generally has the final say.

When a doctoral student begins to research a scientific topic, they choose an adviser who will mentor them for the duration of the doctoral research. The role of the adviser is to give guidance, assistance, and - of course - funding to the intrepid young researcher. Few professional scientists start their careers doing research independently; scientific careers almost always begin as apprenticeships under an accomplished scientist. There are a small number of genius-level individuals who don't take this route, but they are the exceptions that prove the rule.

In addition to the doctoral process itself, there is a general sense in which everyone doing science is a midget atop the shoulders of giants. Most scientific work, even groundbreaking, revolutionary findings, starts from things that people already know. All of us, from Nobel Prize winners to lowly graduate students, begin our papers with a literature review. The greatest faux pas a scientist can make is to publish work that is rendered either trivial or incorrect by researchers who preceded him. I've personally witnessed situations where seasoned researchers have given uninformed talks outside their field of expertise while an expert was sitting in the audience. The result is... not pretty. The zeroth step of the scientific method is "do your homework," and woe unto him who begins to experiment without reading the relevant literature.

I remember that on the last day of one of our graduate courses, our professor sat us down and gave us advice for the beginnings of our scientific careers. That advice? Have humility and listen to your advisers and senior graduate students. He didn't need to tell us to be creative, to think critically, to reject dogma -- we're graduate students, we already do that all the time. But the hard part of science often isn't the "think for yourself" part; it's the slog through the technicalities that your adviser and older graduate students have all experienced and know to anticipate. And if you don't listen to them, you're going to have a bad time.
Rabbi Chanina the son of Idi said: Why are the words of Torah compared to water, as it is written "Ho, all who are thirsty, go to the water" (Isaiah 55:1)? To teach you: just as water, when placed in a high place goes to a low place, so the words of Torah only exist in one whose temperament is humble. (Taanit 7a)

III. 

There is a sense in which intellectual traditions have worth beyond practical necessity. In academia, just as in the world of religious learning, the relationship between a teacher and a student goes beyond the purely functional. A graduate student at a respected institution of higher learning will often have instructors and research advisers who are renowned experts in their fields. Studying under and being mentored by a prominent scientist is a badge of honor for a student; having successful graduate students is a badge of honor for professors. There will occasionally be moments as a graduate student - the "sitting at the feet of Moses at Sinai" moments - when a professor writes on the board an equation that he himself discovered, and you realize that you'll be able to tell your future students that Professor Y himself taught you that formula. The significance of the chain of knowledge from professor to student is evident in the existence of academic genealogy projects, where academics can trace their academic lineage through their advisers hundreds of years back in history to the founders of their fields.

There are parallels to this phenomenon in the arts as well. In music, for example, most rock artists can trace their musical influences back to the very beginning of rock music, to bands like The Beatles and musicians like Jimi Hendrix. (Most rock bands can probably claim direct influence from those early trailblazers, which is possible when everything is recorded.) Kirk Hammett, the lead guitarist of Metallica, was a student of Joe Satriani, who picked up a guitar (and went on to become a virtuoso) after hearing of Hendrix's death, having been inspired by him (source).

Received tradition is also a very prominent part of the Asian martial arts. The movie The Karate Kid does a good job of portraying this received-tradition relationship: the wizened Japanese karate instructor Mr. Miyagi teaches the young Daniel LaRusso how to master karate and helps him navigate the challenges in his personal life. In the real world, the martial art of ninjutsu was popularized by an Israeli, Doron Navon, and an American, Stephen Hayes, who both studied under Dr. Masaaki Hatsumi, the 34th in a chain of grandmasters who have passed down the tradition of Togakure-ryū ninjutsu from teacher to student since 1162. And we actually have a list of all the people in that chain. There is no doubt that the martial art has changed over the centuries, but all students of ninjutsu today are links in an unbroken chain of tradition that is almost a millennium old. [Interestingly, Japan has a number of other professions that can trace themselves back many generations, including a family-owned hotel that dates back 46 generations to the 8th century.]

In summary, chains of received tradition are not just functional in the sense of enabling the transmission of knowledge and skills; they can also provide a deep-rooted feeling of belonging to a system that spans past, present, and future. There is something lamentable about the fact that most industries in the modern world have little sense of history. I imagine it's easier to get up with a sense of purpose every morning as a worker in a 1,300-year-old hotel passed down from generation to generation than at an equivalent job at a Holiday Inn. In some ways, I wouldn't mind if our educational systems - including academia - emphasized this received-tradition aspect a bit more. My experiences in the religious world and in the martial arts world have taught me that there is a powerful value to the deep teacher-student relationships that are common in traditional frameworks. As in The Karate Kid, these relationships can go beyond the functional to the moral and "spiritual" (in the sense of personal character building) in a way that few other relationships can.


IV.

Maybe I've overstated my case. After all, the modern rationalist-empiricist epistemological view was an essential step in establishing a world where science became ascendant over religion and superstition. What is to prevent students of a received tradition from becoming blind to errors made by their forebears and perpetuating incorrect and dangerous ideas? This is an especially critical question when it comes to religious traditions - traditions that have given rise to inquisitions, crusades, and holy wars.

Here, it is crucial to distinguish between traditions that are open to criticism and change and traditions that eschew any kind of reform. The scientific tradition is predicated on revolutionary discoveries overtaking accepted dogma. The artistic tradition celebrates those who break established forms and create something new and original. And even liberal religious traditions are open to ethical ideas from secular philosophies and the questioning of foundational beliefs.

The truly problematic traditions are those that refuse to engage with criticism or self-modify when an aspect of the tradition is convincingly demonstrated to be wrong (in the case of 'is' beliefs) or harmful (in the case of 'ought' claims). Fundamentalist religions tend to take an all-or-none approach to their beliefs - you either buy into everything or you're a heretic. Either you accept the Bible as true in its entirety or you're a non-believer. Either you accept all of the 13 doctrines of Maimonides or you are not Orthodox. Either you accept that homosexuality is an abomination in the eyes of God or you're an apostate. Fundamentalist religions also shun people who study outside sources of knowledge which have the potential to conflict with internal traditional views.

In healthy intellectual traditions, individuals are free to choose which parts of the tradition they wish to accept and which they wish to reject, and the tradition itself should be open to improvements made by the individual. Individuals should, of course, approach this task with humility, but ultimately the individual is expected to improve the tradition. If a martial arts master develops an effective new technique or discovers that a particular stance is ineffective in sparring, he is responsible for conveying that information to his students. If a mathematician discovers an error in a published proof, the mathematics community would hope that she makes that error known to the community so others do not build further theorems on faulty assumptions.

I am essentially arguing here that received traditions should operate as evolutionary processes. In evolution, you start with existing genetic material, which is continually modified over time in a way that is adaptive to the environment. An evolutionary process that doesn't allow mutation -- a stagnant, rigid tradition -- will never make it out of the primordial soup. At the same time, individuals who want to scrap all prior evolutionary progress and start from scratch -- the flat-earthers -- are also not going to make it very far.
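
The analogy can be made concrete with a toy simulation. The following sketch is purely illustrative -- the fitness landscape, starting points, and mutation sizes are all invented -- but it shows why both failure modes fail:

```python
import random

# Toy model: a "tradition" adapts toward an unknown optimum by keeping
# only those mutations that improve its fitness. All numbers are invented.

TARGET = 100.0  # the (unknown) optimum the tradition is groping toward

def fitness(x):
    return -abs(x - TARGET)  # closer to the optimum = fitter

def evolve(x, mutation_size, generations=1000):
    for _ in range(generations):
        candidate = x + random.uniform(-mutation_size, mutation_size)
        if fitness(candidate) > fitness(x):  # keep only improvements
            x = candidate
    return x

random.seed(0)
inherited = 10.0  # the starting point handed down by prior generations

print(evolve(inherited, mutation_size=0.0))  # rigid tradition: stuck at 10.0
print(evolve(inherited, mutation_size=1.0))  # open tradition: ends up near 100
print(evolve(random.uniform(-1e6, 1e6), mutation_size=1.0))
# Starting from scratch: a random restart typically lands so far from the
# optimum that a thousand generations of small improvements barely helps.
```

Rigid traditions never move; traditions open to incremental, tested change converge; and throwing away the inherited starting point squanders all the distance already covered.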