It is now arguably the dominant perspective to see human history as progressive. This has not always been the case. The Judaeo-Christian view envisions a fall from a primary state of divine favour and perfection; the Greek poet Hesiod saw human civilisation as devolving from an original Golden Age of peace and plenitude; similarly, Hinduism envisages history in terms of a moral and physical decline through four yugas, at the end of which a global cataclysm reboots the eternal cycle. That such views have fallen out of favour is often attributed to scientific advance. From its beginnings in ignorance and superstition – so this story goes – humanity has bootstrapped itself into a state of rational enlightenment and technological mastery. Given the abundant evidence of the latter, certainly, it might seem perverse to deny progress, as over the last century or so science has enabled us to achieve the incredible. While medieval theologians and astronomers redundantly debated the true nature of what lies beyond the moon, science has put a man on it, and pointed high-resolution telescopes into the far reaches of space. Whereas once the poet William Blake dreamed of holding infinity in the palm of his hand, anyone with a smartphone now has in their pocket a device that is capable (at least theoretically) of accessing the totality of human knowledge – not quite an infinity, perhaps, but still a mind-boggling achievement.
And yet, for all that the apparently unstoppable advance of science and technology has given, and continues to give us, there are those who wonder whether we have not in the process also lost something. Such a voice is Meghan O’Gieblyn’s. God, Human, Animal, Machine plots the rise of the modern world view, illustrating how this has involved a steady process of disenchantment, as the world is gradually stripped bare not only of superstition and myth, but ultimately also of those things that science and rationality themselves hold dear. This may all ultimately be laid at the door of the French philosopher René Descartes, who decided to employ sceptical doubt as a tool with which to conduct an inventory of his own beliefs in order to weed out the false ones – or at least, those he was capable of doubting. Seventeenth-century science took up Descartes’s challenge and ran with it, and it has not stopped since. So what, then, if in the process the world has become “disenchanted”? A good rationalist will not miss fairies and hobgoblins, and other supernatural explanations for natural events, nor a morality based solely on the promise of eternal punishment or reward. But the inventory of allegedly false beliefs does not stop there, and with these too must be jettisoned our naive notions of personal identity, our belief in free will, even perhaps the very notion that we are conscious at all (which, incredible as it is for the layperson to discover, certain philosophers and neuroscientists are now seriously proposing). The physical world, too, the investigation of which has so often rendered decisive blows against religious dogma and armchair philosophy, has also begun to prove treacherous.
Finding at the very heart of matter not the concrete answers they sought, physicists have uncovered an increasingly bizarre subatomic realm where the very notions of solidity, causality and predictability are replaced by baffling conundrums with no untroubling resolution; a world, in fact, closer to metaphor, where any precision of language must break down. As Niels Bohr – one of the founding fathers of quantum physics – admitted, “When it comes to atoms … language can be used only as in poetry.” In fact, Bohr thought that any scientific ambition to explain the fundamental nature of reality is hampered by the possibility that human understanding may itself not be fit for such a purpose. And if our intellectual tools are insufficient, then any theories they produce can only ultimately serve as metaphors, poetic rules of thumb. After all, despite its revolutionary technological impact, quantum physics works in spite of our continued inability to provide a full theoretical understanding of it – a point which would seem to undermine the idea that technological advance is itself proof of the scientific world view.
But perhaps, then, such theoretical understanding is not necessary. Perhaps we don’t need theories. With the rise of Big Data, we may now answer questions for which we may not yet – or ever – possess an adequate theory. But does that matter? This segues nicely into the rise of AI, and the use of algorithms for everything from cancer detection to judicial sentencing. To take the latter example, shouldn’t a man who is sentenced to a certain length of jail time based on algorithmic recommendation be entitled to know how that decision was arrived at? But the alarming truth is that the creators of said algorithm may not actually know, for it is a “black box”. The algorithm has not been schooled on a theory of human nature or society. It has merely been fed masses of data concerning behavioural trends, crime rates, postcodes, etc, among which it detects patterns, and from which it then evolves its own principles that allow it to predict (e.g.) whether the defendant is likely to reoffend, whether they are a danger to the community, a flight risk, and so forth. But if the algorithm cannot “show its workings” – that is, communicate what these principles are (which, apparently, it cannot) – and there is no appeal against the algorithm’s judgement, then justice becomes frighteningly Kafkaesque. We must simply trust that it is correct … almost as if on faith. O’Gieblyn here notes the curious similarity between these sorts of black-box algorithms, to which we are increasingly and blindly entrusting important decision-making processes, and the opaque justice of the God of the Protestant theologians John Calvin and Martin Luther, which was similarly considered beyond human comprehension. Like the biblical Job – whose demand to know the reason for his seemingly unmerited sufferings was mocked and belittled by God – are we similarly to unquestioningly accept that the algorithm knows best?
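The gap between prediction and explanation can be made concrete with a toy sketch. Everything below is invented for illustration – the features, the data, even the “ground truth” rule – and real risk-assessment systems are vastly larger and more genuinely opaque; but even this miniature model, asked to show its workings, can offer nothing more articulate than a handful of fitted numbers.

```python
import math
import random

# Hypothetical training data: (prior_convictions, age, employed) -> reoffended?
# None of these figures come from any real system; the hidden "rule" below
# is purely illustrative.
random.seed(0)

def make_record():
    priors = random.randint(0, 5)
    age = random.randint(18, 60)
    employed = random.randint(0, 1)
    # The invented ground truth the model is never told about:
    p = 0.15 + 0.12 * priors - 0.005 * (age - 18) - 0.1 * employed
    return (priors, age, employed), 1 if random.random() < p else 0

data = [make_record() for _ in range(2000)]

# A simple model trained by gradient descent. Its learned "principles"
# are just three weights and a bias - numbers, not stated reasons.
w, b, lr = [0.0, 0.0, 0.0], 0.0, 0.01
for _ in range(200):
    for (priors, age, employed), y in data:
        x = (priors / 5, (age - 18) / 42, employed)  # crude feature scaling
        z = b + sum(wi * xi for wi, xi in zip(w, x))
        p = 1 / (1 + math.exp(-z))                   # predicted risk
        g = p - y                                    # gradient of the loss
        b -= lr * g
        w = [wi - lr * g * xi for wi, xi in zip(w, x)]

def risk_score(priors, age, employed):
    x = (priors / 5, (age - 18) / 42, employed)
    z = b + sum(wi * xi for wi, xi in zip(w, x))
    return 1 / (1 + math.exp(-z))

# The model can confidently rank defendants by "risk"...
high = risk_score(priors=5, age=20, employed=0)
low = risk_score(priors=0, age=50, employed=1)
print(f"high-risk profile: {high:.2f}, low-risk profile: {low:.2f}")

# ...but asked *why*, all it can produce is its weights:
print("its 'workings':", [round(wi, 3) for wi in w], round(b, 3))
```

Even here, where the model is simple enough that its weights could in principle be inspected, those weights encode no theory of human nature; scale this up to millions of parameters and the defendant’s question – “on what grounds?” – has no answer at all.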
But the parallels between science and religion go even deeper. Having rejected religion’s traditional goals and consolations – immortality, resurrection, revelation, transcendence – science (in the form of transhumanism) now embodies many of those same strivings, but translated into the scientific idiom. Digital technology will allow our minds to be uploaded onto computers, where we may live on as pure information, or be downloaded into new robotic bodies. This will be possible once technological progress brings about the Singularity, a point of Rapture-like consummation at which machines will be capable of building even better versions of themselves, and so accelerating progress to an exponential rate. This all seems to suggest that the further science progresses, the more it comes to resemble the very thing it sought to escape – almost like some Freudian “Return of the Repressed”.
But O’Gieblyn is not arguing for a return to religion – in fact, it is not clear that she is proposing any sort of simple solution (I suspect not). As a lapsed religious believer, she finds herself in a postmodern hinterland, where neither faith nor reason gives solace. What we are left with, in O’Gieblyn’s view, are two competing metaphors, both unsatisfactory, both falling short of desired certainty by virtue of the limitations of the human mind. Human beings, perhaps, are simply not fitted to understand reality. A key virtue of the book is O’Gieblyn’s honesty and sincerity. She mixes personal history and anecdote with a wonderfully lucid and insightful grasp of some of the most obscure questions in science, philosophy and technology – the book covers an impressive amount of theoretical ground across numerous disciplines in clear, engaging prose, and could be read and enjoyed for that alone. Furthermore, the fact that she is not completely at home in any of these spheres makes her the ideal commentator on each. She writes wonderfully, clearly, and always from the heart, and if she has no ultimate pat answers to leave us with, this is to her credit too.
[Disclaimer: The above review was based on a complimentary review copy provided by the author]