đź“š Book Notes: A Short History of Nearly Everything

Swapnil Agarwal
8 min read · Dec 15, 2020

Here are my notes from A Short History of Nearly Everything:

  1. In his final years, Cope developed one other interesting obsession. It became his earnest wish to be declared the type specimen for Homo sapiens — that is, to have his bones be the official set for the human race. Normally, the type specimen of a species is the first set of bones found, but since no first set of Homo sapiens bones exists, there was a vacancy, which Cope desired to fill. It was an odd and vain wish, but no-one could think of any grounds to oppose it. To that end, Cope willed his bones to the Wistar Institute, a learned society in Philadelphia endowed by the descendants of the seemingly inescapable Caspar Wistar. Unfortunately, after his bones were prepared and assembled, it was found that they showed signs of incipient syphilis, hardly a feature one would wish to preserve in the type specimen for one’s own race. So Cope’s petition and his bones were quietly shelved. There is still no type specimen for modern humans.
  2. Mendeleyev was said to have been inspired by the card game known as solitaire in North America and patience elsewhere, wherein cards are arranged by suit horizontally and by number vertically. Using a broadly similar concept, he arranged the elements in horizontal rows called periods and vertical columns called groups. This instantly showed one set of relationships when read up and down and another when read from side to side. Specifically, the vertical columns put together chemicals that have similar properties. Thus copper sits on top of silver and silver sits on top of gold because of their chemical affinities as metals, while helium, neon and argon are in a column made up of gases. (The actual, formal determinant in the ordering is something called their electron valences, and if you want to understand them you will have to enrol in evening classes.) The horizontal rows, meanwhile, arrange the chemicals in ascending order by the number of protons in their nuclei — what is known as their atomic number.
  3. For all his success, Rutherford was not an especially brilliant man and was actually pretty terrible at mathematics. Often during lectures he would get so lost in his own equations that he would give up halfway through and tell the students to work it out for themselves. According to his longtime colleague James Chadwick, discoverer of the neutron, he wasn’t even particularly clever at experimentation. He was simply tenacious and open-minded. For brilliance he substituted shrewdness and a kind of daring. His mind, in the words of one biographer, was “always operating out towards the frontiers, as far as he could see, and that was a great deal further than most other men.” Confronted with an intractable problem, he was prepared to work at it harder and longer than most people and to be more receptive to unorthodox explanations. His greatest breakthrough came because he was prepared to spend immensely tedious hours sitting at a screen counting alpha particle scintillations, as they were known — the sort of work that would normally have been farmed out. He was one of the first — possibly the very first — to see that the power inherent in the atom could, if harnessed, make bombs powerful enough to “make this old world vanish in smoke.”
  4. It is still a fairly astounding notion to consider that atoms are mostly empty space, and that the solidity we experience all around us is an illusion. When two objects come together in the real world — billiard balls are most often used for illustration — they don’t actually strike each other. “Rather,” as Timothy Ferris explains, “the negatively charged fields of the two balls repel each other…[W]ere it not for their electrical charges they could, like galaxies, pass right through each other unscathed.” When you sit in a chair, you are not actually sitting there, but levitating above it at a height of one angstrom (a hundred millionth of a centimetre), your electrons and its electrons implacably opposed to any closer intimacy.
  5. Libby’s idea was so useful that he would be awarded a Nobel Prize for it in 1960. It was based on the realization that all living things have within them an isotope of carbon called carbon-14, which begins to decay at a measurable rate the instant they die. Carbon-14 has a half-life — that is, the time it takes for half of any sample to disappear — of about 5,600 years, so by working out how much of a given sample of carbon had decayed, Libby could get a good fix on the age of an object — though only up to a point. After eight half-lives, only 0.39 per cent of the original radioactive carbon remains, which is too little to make a reliable measurement, so radiocarbon dating works only for objects up to forty thousand or so years old. (There’s a short decay-arithmetic sketch after these notes.)
    Curiously, just as the technique was becoming widespread, certain flaws within it became apparent. To begin with, it was discovered that one of the basic components of Libby’s formula, known as the decay constant, was out by about 3 per cent. By this time, however, thousands of measurements had been taken throughout the world. Rather than restate every one, scientists decided to keep the inaccurate constant. “Thus,” Tim Flannery notes, “every raw radiocarbon date you read today is given as too young by around 3 per cent.” The problems didn’t quite stop there. It was also quickly discovered that carbon-14 samples can be easily contaminated with carbon from other sources — a tiny scrap of vegetable matter, for instance, that has been collected with the sample and not noticed. For younger samples — those under twenty thousand years or so — slight contamination does not always matter so much, but for older samples it can be a serious problem because so few remaining atoms are being counted. In the first instance, to borrow from Flannery, it is like miscounting by a dollar when counting to a thousand; in the second it is more like miscounting by a dollar when you have only two dollars to count.
    Libby’s method was also based on the assumption that the amount of carbon-14 in the atmosphere, and the rate at which it has been absorbed by living things, has been consistent throughout history. In fact it hasn’t been. We now know that the volume of atmospheric carbon-14 varies depending on how well or not the Earth’s magnetism is deflecting cosmic rays, and that that can vary significantly over time. This means that some carbon-14 dates are more dubious than others. Among the more dubious are dates just around the time that people first came to the Americas, which is one of the reasons the matter is so perennially in dispute.
    Finally, and perhaps a little unexpectedly, readings can be thrown out by seemingly unrelated external factors — such as the diets of those whose bones are being tested. One recent case involved the long-running debate over whether syphilis originated in the New World or the Old. Archaeologists in Hull found that monks in a monastery graveyard had suffered from syphilis, but the initial conclusion that the monks had done so before Columbus’s voyage was cast into doubt by the realization that they had eaten a lot of fish, which could make their bones appear to be older than in fact they were. The monks may well have had syphilis, but how it got to them, and when, remain tantalizingly unresolved.
    Because of the accumulated shortcomings of carbon-14, scientists devised other methods of dating ancient materials, among them thermoluminescence, which measures electrons trapped in clays, and electron spin resonance, which involves bombarding a sample with electromagnetic waves and measuring the vibrations of the electrons. But even the best of these could not date anything older than about two hundred thousand years, and they couldn’t date inorganic materials like rocks at all, which is of course what you need to do if you wish to determine the age of your planet.
    The problems of dating rocks were such that at one point almost everyone in the world had given up on them. Had it not been for a determined English professor named Arthur Holmes, the quest might well have fallen into abeyance altogether.
  6. We have better maps of Mars than we do of our own seabeds.
  7. So why, you are bound to ask at some point in your life, do microbes so often want to hurt us? What possible satisfaction could there be to a microbe in having us grow feverish or chilled, or disfigured with sores, or above all deceased? A dead host, after all, is hardly going to provide long-term hospitality.
    To begin with, it is worth remembering that most micro-organisms are neutral or even beneficial to human well-being. The most rampantly infectious organism on Earth, a bacterium called Wolbachia, doesn’t hurt humans at all — or, come to that, any other vertebrates — but if you are a shrimp or worm or fruit fly, it can make you wish you had never been born. Altogether, only about one microbe in a thousand is a pathogen for humans, according to the National Geographic — though, knowing what some of them can do, we could be forgiven for thinking that that is quite enough. Even if most of them are benign, microbes are still the number three killer in the Western world — and even many that don’t kill us make us deeply rue their existence.
    Making a host unwell has certain benefits for the microbe. The symptoms of an illness often help to spread the disease. Vomiting, sneezing and diarrhoea are excellent methods of getting out of one host and into position for boarding another. The most effective strategy of all is to enlist the help of a mobile third party. Infectious organisms love mosquitoes because the mosquito’s sting delivers them directly into a bloodstream where they can get straight to work before the victim’s defence mechanisms can figure out what’s hit them. This is why so many grade A diseases — malaria, yellow fever, dengue fever, encephalitis and a hundred or so other less celebrated but often rapacious maladies — begin with a mosquito bite. It is a fortunate fluke for us that HIV, the AIDS agent, isn’t among them — at least not yet. Any HIV the mosquito sucks up on its travels is dissolved by the mosquito’s own metabolism. When the day comes that the virus mutates its way around this, we may be in real trouble.
    It is a mistake, however, to consider the matter too carefully from the position of logic because micro-organisms clearly are not calculating entities. They don’t care what they do to you any more than you care what distress you cause when you slaughter them by the millions with a soapy shower or a swipe of deodorant. The only time your continuing well-being is of consequence to a pathogen is when it kills you too well. If they eliminate you before they can move on, then they may well die out themselves. History, Jared Diamond notes, is full of diseases that “once caused terrifying epidemics and then disappeared as mysteriously as they had come.” He cites the robust but mercifully transient English sweating sickness, which raged from 1485 to 1552, killing tens of thousands as it went, before burning itself out. Too much efficiency is not a good thing for any infectious organism.
  8. In the meantime, back on Dubois’ old turf of Java, a team led by Ralph von Koenigswald had found another group of early humans which became known as the Solo People, from the site of their discovery on the Solo River at Ngandong. Koenigswald’s discoveries might have been more impressive still but for a tactical error that was realized too late. He had offered locals 10 cents for every piece of hominid bone they could come up with, then discovered to his horror that they had been enthusiastically smashing large pieces into small ones to maximize their income.
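The decay arithmetic behind note 5 is easy to sanity-check. Below is a minimal Python sketch, not anything from the book itself, using Bryson’s rounded 5,600-year half-life (modern calibration uses roughly 5,730 years); the constant and function names are just illustrative. It shows why eight half-lives leave about 0.39 per cent of the carbon-14 and how a roughly 3 per cent error in the decay constant shifts a date by about the same proportion.

```python
import math

# The book's rounded half-life figure; modern calibration uses roughly 5,730 years.
HALF_LIFE_YEARS = 5_600

def fraction_remaining(age_years: float) -> float:
    """Fraction of the original carbon-14 still present after age_years."""
    return 0.5 ** (age_years / HALF_LIFE_YEARS)

def age_from_fraction(fraction: float) -> float:
    """Invert the decay curve: years elapsed for a measured remaining fraction."""
    return HALF_LIFE_YEARS * math.log2(1.0 / fraction)

# Eight half-lives leave (1/2)^8, about 0.39 per cent of the carbon-14,
# which is the book's "forty thousand or so years" ceiling for the method.
print(f"{fraction_remaining(8 * HALF_LIFE_YEARS):.4%}")   # -> 0.3906%
print(f"{age_from_fraction(0.5 ** 8):,.0f} years")        # -> 44,800 years

# An error of about 3 per cent in the decay constant propagates roughly
# proportionally into the computed age, which is Flannery's "too young by
# around 3 per cent" point.
print(f"shift on a 44,800-year date: about {0.03 * age_from_fraction(0.5 ** 8):,.0f} years")
```

Running it prints 0.3906%, 44,800 years, and a shift of about 1,344 years, which lines up with the figures quoted in the note.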

If you liked the above content, I’d definitely recommend reading the whole book. 💯

A little email digest to share what I’m reading, listening to, and finding interesting. 💌
