Response to Ratnayake and Merry 2018 ‘Forgetting ourselves: epistemic costs and ethical concerns in mindfulness exercises’

Ratnayake and Merry argue that the prescription of ‘mindfulness practices’ by medical/psychiatric professionals may be more ethically problematic than previously thought, and that greater care should be taken to ensure that clients can provide informed consent before being prescribed these treatments. First, a case is made that common mindfulness practices (including at least some kinds of cognitive behavioural therapy and similar programs) at least imitate and have historical links to Buddhist and Vedic practices, which in turn presuppose certain beliefs about the metaphysics of existence. (Stronger claims, such as that the mindfulness practices rely on such a worldview in order to ‘make sense’, are avoided.) Second, the paper argues that encouraging a client/patient to engage in mindfulness exercises may inadvertently lead them to lose credence in beliefs that are of great importance to them. (There is a bit of a gap in the argument here, but perhaps not an unbridgeable one.) The main example here, and the most plausible, is the allegation that Christianity (or some forms of it) requires a more substantial and complex notion of the self than the Vedic or Buddhist traditions allow. The authors conclude that those prescribing mindfulness practices should be careful to inform clients of the ‘metaphysical loadedness’ of the practices, and that this loadedness should be reduced where appropriate.

The fact that this is a problem of epistemology is significant: the problem is not that a fundamentalist member of the Abrahamic tradition might feel that Vedic-inspired practices are somehow tainted by their origins (inspired by Satan, or whatever). This would be a rather different problem. The problem is that a person’s web of beliefs might be inadvertently disrupted, with consequences, perhaps, for their values and relationships. This is prima facie a more interesting problem.

It is a peculiar trait of academic philosophers that we take it to be respectful to assume that a person is ardently committed to their ‘comprehensive doctrine’ and everything that it entails. Isn’t this what it means to be sincere, logical, and free-willed? Once we accept that a person’s comprehensive doctrine must be taken seriously, rather than being dismissed as a mere error, we feel that we should accommodate it insofar as this is consistent with accommodating others and their doctrines. I think this might be a mistake: I’ll call this the ‘little philosophers view’ of ordinary people. This I think is an error of Ratnayake and Merry, but one they share with many others, and one with far-reaching significance.

The first problem with this account is its paternalist and coddling mentality – we are both reasoning on behalf of others to decide which ideas could harm them, and then trying to protect them from this harm. (Is this a practice that, if adopted, can be kept within its proper bounds?) But we don’t have the sort of access to their beliefs that we might think we have, and it is not clear that we are ‘protecting’ them from harm rather than benefit anyway. I’ll deal with each of these points in turn (the latter I think is by far the more controversial):

First, on the ‘little philosophers view’, the archetype Christian is the fundamentalist. As a non-religious person, I understand the allure of this idea. The Christian ‘little philosopher’ is a sort of foundationalist whose foundation is the Bible. If somebody thinks that their book is a holy book, shouldn’t they treat it as ‘truer’ than just about anything else? Shouldn’t its words, and every conclusion logically derivable from them, be absolute law? (Again, isn’t this just what it means to be sincere, logical, and free-willed?) I understand the allure of this idea, but I think it must be fiercely resisted. This is not the way faith operates for most people of faith – most have a different view of their religious authorities; others manage to balance literalist beliefs with other beliefs that allow them to navigate the modern world, despite some cognitive dissonance. This is the case, and we would all be in trouble if it were not. The same can be said, I think, for many other sorts of beliefs, importantly including non-religious moral and political beliefs. (So, my emphasis is not at all on ‘faith’ here, but, like Ratnayake and Merry, on epistemology in general.)

Second, and closely related: It is significant that ordinary people are generally not ‘little philosophers’, and the harms they are threatened with are not the harms of being driven to logical inconsistency. If they are not devoting themselves to theorising, then reasoning will have some other purposes for them, and it is worth thinking about what these are. If a religious person is not a literalist of a fairly extreme sort, then there will be latent flexibility in their interpretations of what their religion says and demands of them. Perhaps this will not even be apparent to them until they are confronted with a novel practice (e.g. mindfulness training). Perhaps, instead, they will have previously had the cognitive resources to identify this ambiguity, but no motive to consider it until the novel experience occurs. Perhaps this new experience will force some sort of crisis of cognitive dissonance – but this is not the usual reaction and an outsider cannot generally be expected to predict when this will occur. Whichever situation the new experience provokes, these are all familiar experiences to a moderately reflective person and they are basically what we would call ‘learning from experience’: ordinary people learn by day-to-day experience of the world with the help of limited reasoning, not by sophisticated reasoning from first principles.

Indeed, if they are too deeply attached to reasoning from first principles, this may seriously inhibit their ability to learn at all. This lesson is not unique to ordinary people living their day-to-day lives. I tend to be interested in moral and political accounts of public reason, where ‘comprehensive doctrines’ are an important topic. But it happens that I have just finished reading Lakatos’s 1970 ‘History of Science and Its Rational Reconstructions’, so I will go there for an example. Lakatos writes:

“Furthermore, for Popper, working on an inconsistent system must invariably be regarded as irrational ‘a self-contradictory system must be rejected… [because it] is uninformative… No statement is singled out… since all are derivable’. But some of the greatest scientific research programmes progressed on inconsistent foundations. Indeed in such cases the best scientists’ rule is frequently: ‘Allez en avant et la foi vous viendra’ [‘Go forward, and faith will come to you’]. This anti-Popperian methodology secured a breathing space both for the infinitesimal calculus and for naive set theory when they were bedevilled by logical paradoxes.

“Indeed, if the game of science had been played according to Popper’s rule book, Bohr’s 1913 paper would never have been published because it was inconsistently grafted on to Maxwell’s theory, and Dirac’s delta functions would have been suppressed until Schwartz. All these examples of research based on inconsistent foundations constitute further ‘falsifications’ of falsificationist methodology.”  (112-3)

The lesson (relevant to ‘Forgetting Ourselves’): even very sophisticated reasoners should not be too quick to theorise themselves (or others) into a corner. Cognitive dissonance is a resource which we can expend when faced with novel challenges. A person with a comprehensively over-theorised worldview can do nothing when this worldview is challenged but shrug and dismiss the challenge as an anomaly (or, dismiss their entire worldview as an error). A person who can maintain some degree of cognitive dissonance can draw on a variety of perhaps incompatible epistemic resources when trying to learn from a new experience. They may in the process discard some of these resources, and they may acquire others. And for ordinary people, this is progress – unless we conceive of them as little philosophers, the achievement of cognitive closure is not.