5 A naturalistic responsibility

5.1 Proactive epigenesis

The first sentence of the 1948 Universal Declaration of Human Rights states: “All human beings are born free and equal in dignity and rights.”

Read as a description of the actual situation of human beings, this is blatantly and tragically false. Read as a normative ideal that we should strive for, it is noble but tragically unrealistic: considering our present cerebral structure, we are not likely to acknowledge in actual social practice the equal dignity and rights of all individuals regardless of race, gender, creed, etc. Life conditions may have improved for many humans over time, yet the present global situation remains appalling, notably with respect to poverty, the unequal distribution of health care, and the predominantly non-egalitarian or bellicose relations between individuals and groups. The vast majority of human beings appear reluctant or unable to identify with, or show compassion towards, those who are beyond (and sometimes even those who are within) their own sphere. While some societies or individuals may be more prone than others to developing a strong ethnic identity, violence, racism, sexism, social hierarchies, or exclusion, all exhibit some form and measure of xenophobia.

What I have here suggested, however, is that we might make presently unrealistic ideals, such as equality in dignity and rights, somewhat more realistic by selecting them for epigenetic proactivity.

Synaptic epigenetic theories of cultural and social imprinting on our brain architecture open the door to being epigenetically proactive: we may culturally influence our brain organisation with the aim of self-improvement, individually as well as socially, and change our biological predispositions by achieving a better fit between our brains and our cultures and social structures.

I suggest that certain areas of research are especially important to pursue with the goal of “epigenetic proaction” in mind. They aim at integrating recent advances in neuroscientific research into normative debates at the level of society. This does not necessarily mean that my level of explanation is “neurocentric” or “neuroreductionist”. My aim is rather “encyclopedic”, in the sense that I wish to illustrate the benefits that neuroscience can bring to the humanities and social sciences, and vice versa. I do not see myself as either neuro-“centric” or “reductionist”, which would mean excluding other categories of determinants at the social or historical levels; more modestly, I am willing to unify knowledge between the humanities and the neurosciences, the latter being too often deliberately omitted from the debate. This can be illustrated by two examples: violence in adolescents in relation to their social environments, and violence in adults associated with interconfessional conflicts.

Violence in adolescents is a common phenomenon in our societies, and it is frequently repressed through police and judicial means, often resulting in incarceration. But this approach to juvenile violence simply omits the scientifically established fact that adolescence is also a time of “neurodevelopmental crisis”. Evidence from anatomical and functional-imaging studies has highlighted major modifications of cortical circuits during adolescence. These include reductions of gyrification and grey matter, increases in the myelination of cortico-cortical connections, and changes in the architecture of large-scale cortical networks, including precentral, temporal, and frontal areas (Klein et al. 2014). Uhlhaas et al. (2009) have used MEG synchrony as an indicator of conscious access and cognitive performance (reviewed in Dehaene & Changeux 2011). Until early adolescence, developmental improvements in cognitive performance are accompanied by increases in neural MEG synchrony. This developmental phase is followed by an unexpected decrease in neural synchrony during late adolescence, which is associated with reduced performance. This period of destabilization is in turn followed by a reorganization of synchronization patterns, accompanied by pronounced increases in gamma-band power and in theta- and beta-band phase synchrony (Uhlhaas et al. 2009). These remarkable changes in neural connectivity and performance in the adolescent brain are only just being explored and may call for special, hitherto unanticipated, proactive care from society. This, in turn, requires active research, as well as a social and educative environment adequate to adolescents’ special needs. Such an environment may include adequate physical exercise, cultural games, educational training, and new kinds of therapies yet to be invented.

Violent interconfessional conflicts have raged throughout human history. They continue to plague our modern societies and are presently an important cause of wars and other forms of violence throughout the world. One should remember that every newborn and child brain incorporates critical features of its biological, social, and cultural environment, including, in addition to spoken and written language, symbolic systems and religious rituals (among them dietary and dress practices that serve as markers of the faith). These epigenetic traces are almost irreversibly laid down and may persist throughout the whole life of the individual. Moreover, they may be renewed across generations through epigenetic transmission from adults to newborns. In this context, early proactive epigenetic imprinting through education is of critical importance. The aim of such education should not be to abolish faith or emotional convictions (e.g., moral, political, or religious) but only to control the fervour, intolerance, and fanaticism in their expression. The problem, as I see it, is not a belief itself, but the emotional intensity to which it gives rise and the manner in which it is expressed. Influencing a child’s brain to reduce its propensity to ideological violence or fanaticism and to enhance its tolerance of others’ differences likewise requires special proactive care from society, which perforce involves active research, including a social and educative environment adequate to this particular goal.

These are only two illustrations of the many that are possible, chosen because they have been problematic throughout the history of humankind and show no signs of disappearing.

At the individual level, the social conditions of an infant or adolescent are of crucial importance for cerebral development, and adequate conditions can in principle be provided. The factual realism of this application is largely a matter of political will and social agreement. The scientific challenge will be to further develop our knowledge of these conditions and their effects on the developing infant and adolescent brain. A further challenge will be to develop our knowledge of how social conditions affect the adult brain, for example, in order to prevent neurodegeneration.

On a more general level, when applied on a larger scale to a society, a population, or to the entire human species, the argument follows the same logic and is no less important—but it becomes considerably more complicated to apply, theoretically as well as practically.

If new cultural imprints were epigenetically stored in our brains (say, less violent or less sectarian features), future generations would presumably develop societies that reflect them (i.e., become more peaceful and inclusive). A weakness of this optimistic reasoning is its circularity, since we would already need to be peaceful in order for a peaceful society to be maintained. A crucial question then becomes: how long does it take for a cultural characteristic to leave a cerebral trace? Cultural structures that are in some measure stable and enduring are needed in order to effect stable neurobiological changes and to store cultural imprints in the brain that might give evolution a push in the right direction; but the chances of maintaining societies that conflict with the present nature of their inhabitants (say, maintaining a peaceful egalitarian rule in a society of violent xenophobes) are arguably slim.

The challenges involved in trying to be epigenetically proactive by culturally influencing the future actions of human genes and neuronal structures, with the aim of altering higher cognitive functions and their resulting behaviour, seem formidable, at least if enlarged sympathy is on the agenda. Still, within the epigenetic neuroscientific framework, at least the theoretical possibility exists, and it is worthy of consideration by many other disciplines beyond neuroscience. Depending on how we choose to develop our culture, epigenetic rules that enlarge the presently narrow realm of human sympathy might one day emerge.

5.2 Conclusion: A naturalistic responsibility

The origins of norms and the relationship between facts and values have been much debated in philosophy. Reasoning that weds scientific theory with normative considerations has been accused of committing the logical error of confusing facts and values, which is known as “the naturalistic fallacy”.

The expression “the naturalistic fallacy” was coined by the British moral philosopher G. E. Moore and refers in his work to the identification (or reduction) of goodness with (or to) another property such as utility, pleasure, or happiness (Moore 1903). That issue is not relevant in the present context. In the interpretation of the naturalistic fallacy that is relevant here, the fallacy consists in deriving an “ought” from an “is”, or a value from a fact, and letting descriptive properties entail normative properties, which confuses the distinction between facts and values in a fallacious manner. This argument is reminiscent of David Hume’s claim that what is is entirely different from what ought to be, for “the distinction of vice and virtue is not founded on the relations of objects, nor is perceiv’d by reason” but is fundamentally a matter of feelings and as such is neither true nor false (Hume 1739, III, I). I agree that it is fallacious to derive “ought to be” from “is”, and consider this a conceptual mistake that our theory of epigenetic proaction must, and indeed does, avoid. I do not assert that factual descriptions of the brain’s architecture amount to recommendations or assertions of norms; I do not confuse “is” with “ought”, and consequently I do not commit the naturalistic fallacy in this formulation.

We should observe that a value may be represented on many levels: non-conscious as well as conscious, as a basic biological function or as a feature of advanced moral reasoning. When discussing the naturalistic fallacy, value as a feature of advanced normative reasoning is the relevant sense of the term. The logical distinction between fact and value could collapse if the term were defined differently, say, as a non-normative biological function. The logical error in the naturalistic fallacy concerns the fact/value distinction as it is drawn between normative and descriptive statements, namely between ought and is; not the distinction between facts that are biological values and facts that are not, where that concern would presumably not arise.

However, eagerness to avoid the naturalistic fallacy must not prevent our normative reasoning from being informed by scientific theories. Normative judgments should be informed by facts, even though they cannot be entailed by them. If certain evaluative tendencies, such as self-interest and selective sympathy, are innate in the normal human brain’s architecture, this fact (if it is one) about the human being’s neuronal structure would admittedly entail that every healthy, sufficiently mature individual will to some degree feel both self-interest and sympathy towards some other creature. However, this is not the entailment of a norm, but the empirical entailment of another fact. It does not entail that it is good (or bad), or that we ought to conceive it as good (or bad), that we are thus construed. Similarly, if it is true, as we have argued, that we are self-projective xenophobes, knowledge of this (presumed) fact is not in itself a justification of it. Understanding is not the same as justification: to know, or to understand, is not to approve. On the contrary, knowledge of our neural structures’ predispositions should increase our awareness of the need for stable and realistic social structures and agreements to keep us in check.

We should also observe that a belief in the approximate universality of certain values, or preferential tendencies as innate features of the human neurobiological make-up, is logically compatible with a belief in maintaining the description/norm distinction.

My primary focus has been on the important empirical connections between biological facts and norms. Norms are brain constructs elaborated by human societies, biologically as well as culturally embedded in, and constrained by, the contingent evolution of socio-cultural structures, in particular by the multiple philosophical and religious symbolic systems that have developed. This fact, together with the realisation that normative judgments should be informed by facts even though they cannot be entailed by them, suggests that science, philosophy, and not least neuroethics have a major responsibility: namely, to decipher the network of causal connections between the neurobiological, socio-cultural, and contingent historical perspectives that allow a moral norm to be enunciated at a given moment in human history; and to evaluate which norms have a “universal” character, pre-specified in our genome and shared by the human species, in distinction from those relative to a given culture or symbolic system. The “fallacy” of the naturalistic approach is thus inverted into a naturalistic responsibility (Evers 2009): the responsibility to connect facts and values, biology and socio-cultural structures, and to use that enriched understanding for the benefit of ourselves and our societies.

We may hope that, through the rational exchange of arguments between partners from different cultures and moral traditions debating together, a species-specific “human core” could become dominant beyond individual differences and converge on a common structure (Changeux & Ricoeur 2000). At the same time, we must note that the diversity of human individuals and societies is enormous and must be respected while we strive to find this common ground that might allow coexistence.

The idea of proactively selecting those specific dispositions or capacities (such as sympathy) that we all share as human beings and that, if properly developed, may benefit our global co-existence while respecting individual and ideological diversities, is well in line with Darwin, who wrote in The Descent of Man:

As man advances in civilization, and small tribes are united into larger communities, the simplest reason would tell each individual that he ought to extend his social instincts and sympathies to all members of the same nation, though personally unknown to him. This point being once reached, there is only an artificial barrier to prevent his sympathies extending to the men of all nations and races.

Lewontin (1993) argues that while traditional Darwinism has portrayed the organism as a passive recipient of environmental influences, a correct understanding should emphasize that humans are active constructors of their own environment, in particular of the social and cultural environment. I agree, and argue further that, in line with Darwin, we can be active constructors of our own brains by using our environment and culture, in a reciprocal relationship.

In this article, my main focus has been on feasibility, that is, on whether we can be epigenetically proactive. If we assume an affirmative answer to that question, an important follow-up question arises: whether we should be so. My basic position, which I have tried to express here, is that epigenetic proaction could be a very promising, powerful, and long-term way of influencing human nature and of improving our societies. However, in order to pursue this in a responsible and adequate manner, caution is required, along with careful analyses of the relevant social and ethical issues. Science can be, and throughout history repeatedly has been, ideologically hijacked, and the resulting dangers increase with the strength of the science in question. If, say, humans learn to design their own brains more potently than we already do by selecting what we believe to be brain-nourishing food and pursuing neuronally healthy lifestyles, we could use that knowledge well: there is certainly room for improvement. On the other hand, the dream of the perfect human being has a sordid past, providing ample cause for concern about such projects. Historical awareness is of the utmost importance for neuroethics when assessing suggested applications in a responsible and adequate manner. Moreover, what we mean by “responsible and adequate” is open to interpretation. The traits we choose to favour epigenetically, and the social structures we choose to develop, depend on who “we” are, and in what society we wish to live.

Arthur Koestler compares evolution to “a labyrinth of blind alleys” and suggests that “there is nothing very strange or improbable in the assumption that man’s native equipment, though superior to that of any other living species, nevertheless contains some built-in error or deficiency which predisposes him to self-destruction” (Koestler 1967, xi). In that light, steering evolution by influencing the cultural imprints to be stored in our brains appears to be an attractive option.

Acknowledgements

I wish to thank Jean-Pierre Changeux for his important scientific contributions to this paper, and for his detailed scrutiny of the arguments expressed. I also wish to thank Yadin Dudai, Sten Grillner, Hugo Lagercrantz, and Arleen Salles for their valuable comments on earlier versions of this manuscript. The research leading to these results has received funding from the European Union Seventh Framework Programme (FP7/2007-2013) under grant agreement n° 604102 (HBP).