
Historians of the Now: From digital to living machines

Updated: Jan 7, 2021

Vijay Chandru

November 12, 2020

“The historians of the now begin with the study of what is today, asking not how to avoid the perils of the past but how to maximise the advantages of the future”

Abraham Verghese (from the foreword to Deep Medicine by Eric Topol)

This essay was prepared for a session on interpreting social transformations to prepare for tomorrow, in the context of the extraordinary crisis that the world is experiencing with the CoViD-19 pandemic, and the “reset” that we are going to need. It seems that we need a few historians of the here and now to maximize the advantages of the future, and it is natural to assume that technology will drive the reset agenda. We wish the reset were driven by humanitarian concerns as well, rather than just technology, but unfortunately, as George Santayana famously said: “Those who cannot remember the past are condemned to repeat it.” If there is one lesson from our pandemic experience in 2020, it is that, while we have made amazing progress in responding to the crisis scientifically and technologically, as a society we seem condemned to repeat historical patterns of conspiracy theories, polarization, and socially irresponsible behaviour.

The world today, with its pace of technological change and scale of disruption, suggests that the advantages of the future can only be imagined with an understanding of the scientific and technological trajectories that have brought us to the now. The remarkable, exponentially scaled advances in technology that we have witnessed over the last 100 years, towards the digital revolution and the second machine age, and how we are now poised to enter the age of living machines, are the narratives of this editorial. But as Martin Heidegger[1] has warned us, we need to look beyond technology as a “means to an end” and consider the “essence” of technology, which has both an instrumental dimension (a means) and an anthropological one (a human activity). To posit ends, and to procure and utilize the means to them, is a human activity. And essence (the noun Wesen in German) “does not simply mean what technology is, but it means, further, the way in which technology pursues its course, the way in which it endures.”

A century ago, physicists and engineers combined forces and symbiotically produced telegraphs, telephones, radios, aircraft, radar, electro-mechanical calculators, and power systems, completing the agenda of the first machine age. The post-WWII era, the middle of the 20th century, saw physicists and mathematicians, fresh out of the war effort and looking for new frontiers, aid in the birth of computers and digital hardware, along with the beginnings of molecular biology. Digital hardware was a little further along and was soon laying the foundations for the revolution that would usher in the next disruption.

The Second Machine Age:

In 2016, at Carnegie India’s Global Tech Summit in Bangalore, a senior leader of the information technology industry in India, Nandan Nilekani, remarked that “apparently 2007 was when it all (planet scale computing and communications) began happening.” That remark by India’s technology czar on the inflection in technology deserved more investigation. I was concurrently reading a fascinating book, “The Second Machine Age[2]”, a commentary by economists Erik Brynjolfsson and Andrew McAfee on “work, progress and prosperity in a time of brilliant technologies.” The authors point out that all kinds of technology-driven “miracles” have begun to manifest in the last few years. Science fiction is turning into reality at a speed that is breathtaking.

The exponential “Moore’s Law” of semiconductor computing technology is clearly the cause of the second machine age. To explain exponential growth, Brynjolfsson and McAfee use the example from the Indian Paalu Payasam fable about doubling rice grains on subsequent squares of the 64-square grid of a chessboard. Place one single grain of rice on the first square of the board, two on the second, four on the third, and so on, so that each square receives twice as many grains as the previous. When the chessboard is half covered, the 32nd square alone holds over two billion grains (2,147,483,648 grains, to be precise), the first half of the board holds 4,294,967,295 grains in total, and if we went all the way to the 64th square we would wind up with more than eighteen quintillion grains of rice. The steep effects of the exponential curve in the second half of the chessboard define the disruptive appearance of technological feats that seemed out of reach just a short while back.
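The doubling arithmetic behind the fable is easy to verify directly. A short Python sketch (the function names are mine, for illustration):

```python
# Rice on the chessboard: square n holds 2**(n-1) grains,
# and the first n squares together hold 2**n - 1 grains.
def grains_on_square(n):
    return 2 ** (n - 1)

def grains_on_first_squares(n):
    return 2 ** n - 1

print(grains_on_square(32))         # 2147483648 -- over two billion on the 32nd square alone
print(grains_on_first_squares(32))  # 4294967295 -- the whole first half of the board
print(grains_on_first_squares(64))  # 18446744073709551615 -- more than eighteen quintillion
```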

How does all this connect with Nandan’s remark? Well, consider 1958, the date of the first registration of a semiconductor company, as the start of Moore’s Law. Now consider doubling at a frequency of once every 18 months (Moore’s Law): we reach 2³², or the second half of the chessboard, 48 years later, bringing us to 2006. Nandan was spot on in stating that 2007 was when “it all began happening.” The impact we are seeing today of being around the 40th square of the chessboard with digital hardware is enormous, and still rapidly compounding.
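This back-of-the-envelope calculation can itself be sketched in a few lines of Python; the 18-month doubling period is the essay's working assumption, not a measured constant:

```python
# Number of Moore's-law doublings between two years,
# assuming one doubling every 18 months (1.5 years).
def doublings(start_year, year, period_years=1.5):
    return (year - start_year) / period_years

print(doublings(1958, 2006))  # 32.0 -- the start of the second half of the chessboard
print(doublings(1958, 2020))  # ~41 -- roughly the 40th square today
```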

We are truly climbing a steep incline towards instant communication and artificial intelligence in the purest sense of the word. Software has been playing catch-up with hardware since the 1970s, and now, with the digitization of the world and even the universe, the challenges of data science have hardware of amazing scale at their beck and call. If you are amazed by AlphaGo, Watson, and autonomous vehicles, you ain’t seen nothing yet! Work, progress and prosperity with brilliant technologies is the promise of Brynjolfsson and McAfee.

What this second machine age has also done is warn us that the immense and unbridled power of digital machines will create difficult social problems: inequity in access to technology, the perverse use of technology to spread fake news, and the ability of technologies to influence traditional democracies. Cathy O’Neil calls these the weapons of math destruction, and Yuval Harari writes[3], “Once big data systems know me better than I know myself, authority will shift from humans to algorithms. Big data could then empower Big Brother.” So in this narrative, George Orwell’s 1984 was perhaps misinterpreted as a warning about socialist authoritarianism when, in fact, it was the benevolent internet that brought Big Brother to life.

Towards a Quantified Science:

Molecular biology, as I stated earlier, saw its birth in the immediate post-WWII period (the 1940s and 1950s), with the use of better instruments and measurement techniques helping scientists understand biology at the level of the cell’s hardware: the DNA, RNA and protein building blocks of all living things. It led to understanding disease at a molecular scale. Early developments in biotechnology and the birth of biopharma essentially built on this understanding.

There is yet another exponential technology law that we will all learn to be awestruck by in the years to come, as it drives a different set of science fiction scenarios to reality: the law of the cost (and speed) of genome sequencing, sometimes referred to as Flatley’s Law, as a counterpart to Moore’s Law. The start date for genomics is 1984, when Applied Biosystems commercialized the sequencing technology first proposed by Fred Sanger in 1977. From 1984 to around 2008 (about 24 years), sequencing costs went down at the rate of Moore’s Law, so Fred Sanger helped us get about 2¹⁶ of the way. After 2008 came the advent of Shankar Balasubramanian’s Solexa NGS (Next Generation Sequencing) technology, which sped up the clock. So 2008 to 2016 got us from 2¹⁶ to 2³², and voilà, we are in the second half of the proverbial chessboard.
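The same chessboard arithmetic applies here. A hedged sketch in Python, where the 18-month and 6-month doubling periods are inferred from the timeline above rather than from measured cost data:

```python
# Doublings in sequencing cost-performance over the two eras in the text.
sanger_era = (2008 - 1984) / 1.5  # Moore's-law pace: one doubling per 18 months
ngs_era = (2016 - 2008) / 0.5     # NGS pace: roughly one doubling every 6 months

print(sanger_era)  # 16.0 -- the Sanger era delivered a factor of 2**16
print(ngs_era)     # 16.0 -- NGS added another 2**16, reaching 2**32 overall
```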

And this excitement about genomics is really just the tip of the proverbial iceberg. The next wave of the Genomics Revolution comes from our ability to write on genomes, i.e., to edit[4] and modify them. This will have enormous impact because we will be able to do this for plant genomes, for animals, for improving the sustainability of the planet, for the eradication of serious pandemics, and for industrial applications with engineered microbes. The Financial Times, in March 2018, named CRISPR the “greatest discovery since Darwin,” and Prof Jennifer Doudna, our recent Nobel laureate and a pioneer in gene editing, calls this “A Crack in Creation.” A recent reputable private equity report on the revolution in genomic medicine using gene therapies and genome editing estimates the total addressable market at USD 4.8 trillion in the Western markets alone.

Genomics, the software of biology, in combination with the advances in molecular biology has positioned biology now to progress as a quantified science that can have great synergies with engineering – materials research, micro and nano technologies that will drive a huge adoption of biotechnology in improving the human condition in health, food and energy security, and environmental sustainability. If so, this will certainly be the century of the life sciences and the foundation of Industry 5.0.

President Emerita Susan Hockfield, the first woman and first biologist to serve as MIT’s President, from 2004 to 2012, has called this the emergence of the “Age of Living Machines” that biology plus engineering are starting to create. The new avatars of bio-engineering that might be called BioSystems Science and Engineering, which took shape under her leadership, are lyrically described in her book as case studies of breakthroughs by extraordinary researchers at MIT on living machines, ranging from virus-based batteries, protein-based water filters, cancer-detecting nanoparticles, and computer-engineered crops to cognitive bionic limbs. Dr Hockfield concludes that her hope for a better future rests with as much confidence on the next generation as on the technologies themselves[5].

But since I raised the issue of unbridled technologies in the context of digital machines, we need to be even more alert about the impact of living machines. Francis Fukuyama was troubled by this even twenty years back, when the human genome project succeeded. He wrote in 2002[6]: “Biotechnology presents us with a special moral dilemma, because any reservations we may have about progress need to be tempered with a recognition of its undisputed promise…. Hanging over the entire field of genetics has been the specter of eugenics – that is, the deliberate breeding of people for certain selected heritable traits.” The challenges created by the renegade scientist He Jiankui at Shenzhen, with gene-edited and implanted embryos, have reinforced this specter. Even Professor Jennifer Doudna was haunted by a dream in which Adolf Hitler appeared, holding a pen and paper, requesting a copy of the CRISPR recipe. What horrible purpose could the nightmare Hitler have had in mind?

Work, Progress and Prosperity:

The success of the Indian IT industry in the latter part of the 20th century owes a great deal to the visionaries in the government in the 1960s and 1970s. First, the Department of Electronics identified “software-led exports” as a segue for Indian export promotion in 1972 and provided resources to buy computers and get the private sector up to speed. Then we had STPs (software technology parks) in the 80s and 90s that provided shelter for the industry to import equipment at competitive prices, along with tax holidays and free connectivity to the internet. The leg up that the industry received for these two decades is sometimes overlooked in the facetious praise of “benign neglect by the government” as the reason for the success of the IT sector. The truth is that the government provided the right help at the right time and did not over-regulate the sector once it was off to the races. It was a marvellous example of directed public policy in technology. The obvious question to ask is whether India has a strategy that can help us ride the steep advances created by exponential technology laws in bioengineering to build a bioeconomy. India has already succeeded in localizing biopharmaceuticals, cinema, satellite and cellular communications, cable television, radio and computing. Can it do the same for bioengineering? What are the right levers to push this sector of Indian biotechnology onto the world stage? The decisions we make today will make a difference.

The countries that take the lead in the twenty-first century will be the ones that implement an innovation ecosystem that more effectively supports the production of new ideas in the private sector[7]. Economic growth occurs whenever people take resources and rearrange them in ways that make them more valuable. Possibilities do not merely add up; they multiply[8]. Using the phraseology of bioengineering, Recombinant Growth[9] has to be the focus of our national strategy.

Acknowledgements: To Susan Hockfield for her extraordinary book on living machines, Foy Sayas for the pointer to Heidegger’s work and George Verghese for his careful reading of early versions of this essay.

Author: Vijay Chandru is an engineer-scientist trained at BITS, UCLA and MIT, and has been a professor of engineering and operations research at Purdue University and of computer science at the Indian Institute of Science (IISc). Chandru is a Fellow of the Indian academies of science and engineering and serves on the faculty at the Centre for BioSystems Science and Engineering at IISc. An inventor of India’s first handheld computer, the Simputer, he went on to help start Strand Life Sciences, India’s leading clinical genomics and precision medicine company, which he led for 18 years. He can rightfully claim to the historians of the now that he has journeyed with digital and living machines.

[1] Martin Heidegger, “The Question Concerning Technology and Other Essays,” translated by William Lovitt, Garland, 1977.

[2] E. Brynjolfsson and A. McAfee, “The Second Machine Age,” Norton, 2014.

[3] “Yuval Noah Harari on big data, Google and the end of free will,” Financial Times, August 26, 2016.

[4] A professor of linguistics, Dr K P Mohanan, points out an interesting misuse of the word “edit” in the context of genetics. He says, “To the extent I can tell, what CRISPR does cannot be described as editing the human genome. From my perspective, what it does is proof-read it without understanding the text. A person who does not know any English whatsoever can still proofread a copy of T S Eliot's poem The Waste Land if (s)he is given the original to compare it with. And if the word 'breeding' in “breeding lilacs out of the dead land” is mis-spelt as 'bleeding', (s)he can correct it and change l back to r. But one cannot edit the book of the human species without understanding the grammar of the language in which that book is written, no matter how well we understand the 'code', i.e., how well we understand how the sounds are transcribed as letters. And to understand the grammar, we need the equivalent of a theoretical linguistics in biology.”

[5] Catch Dr Hockfield’s plenary address on Living Machines at the Bangalore Tech Summit, November 19-21, 2020.

[6] Francis Fukuyama, “Our Posthuman Future: Consequences of the Biotechnology Revolution,” 2002.

[7] Paul Romer, “Economic Growth,” Library of Economics and Liberty, 2008.

[8] Martin Weitzman, “Recombinant Growth,” Quarterly Journal of Economics 113, no. 2 (1998): 331-60.

[9] An early reviewer suggested that I add an example to illustrate how this recombinant idea works. A good illustration is a recent brilliant initiative of the Office of the Principal Scientific Adviser (PSA) in India called Indigenous Diagnostics (InDx), a virtual marketplace for innovations, funded by the Rockefeller Foundation, that can be leveraged to put together recombinant diagnostic solutions for pandemics like CoViD-19.
