Transistors, Translation and tRNAs

Prior to the 1960s, most computers were analog computers.

World War II era gun directors, gun data computers, and bomb sights used mechanical analog computers. These machines were central to gun fire control through World War II, the Korean War, and well past the Vietnam War, and they were made in significant numbers.

The FERMIAC was an analog computer invented by physicist Enrico Fermi in 1947 to aid in his studies of neutron transport.[12] Project Cyclone was an analog computer developed by Reeves in 1950 for the analysis and design of dynamic systems.[13] Project Typhoon was an analog computer developed by RCA in 1952. It consisted of over 4000 electron tubes and used 100 dials and 6000 plug-in connectors to program.[14] The MONIAC Computer was a hydraulic model of a national economy first unveiled in 1949.


In those years, many universities and companies were building analog computers; work on digital machines was the rare exception.

Computer Engineering Associates was spun out of Caltech in 1950 to provide commercial services using the “Direct Analogy Electric Analog Computer” (“the largest and most impressive general-purpose analyzer facility for the solution of field problems”) developed there by Gilbert D. McCann, Charles H. Wilts, and Bart Locanthi.[15][16]

Educational analog computers illustrated the principles of analog calculation. The Heathkit EC-1, a $199 educational analog computer, was made by the Heath Company, USA c. 1960.[17] It was programmed using patch cords that connected nine operational amplifiers and other components.[18] General Electric also marketed an “educational” analog computer kit of a simple design in the early 1960s, consisting of a two-transistor tone generator and three potentiometers wired such that the frequency of the oscillator was nulled when the potentiometer dials were positioned by hand to satisfy an equation. The relative resistance of the potentiometers was then equivalent to the formula of the equation being solved. Multiplication or division could be performed depending on which dials were considered inputs and which was the output. Accuracy and resolution were limited, and a simple slide rule was more accurate; however, the unit did demonstrate the basic principle.
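The dial-nulling idea behind the GE kit can be mimicked in a few lines of code. This is a hedged sketch, not a model of the actual GE circuit: it assumes a null condition of the form x · y = z on dial settings in [0, 1], and sweeps the output dial until the "detector" signal reaches its minimum, just as a user would turn the knob until the oscillator nulled.

```python
# Hedged sketch of the dial-nulling principle (the null condition
# x * y == z is an assumption for illustration, not the GE design).

def null_solve(x, y, steps=1000):
    """Sweep the output dial z over [0, 1] and return the position
    that minimizes the detector signal |x*y - z| (the 'null')."""
    best_z, best_err = 0.0, float("inf")
    for i in range(steps + 1):
        z = i / steps
        err = abs(x * y - z)  # detector signal: zero at the null
        if err < best_err:
            best_z, best_err = z, err
    return best_z

print(null_solve(0.5, 0.4))  # → 0.2
```

Reading the same three dials differently turns the multiplier into a divider: fix z and one input, and sweep the other input instead.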

Analog computers are almost completely gone from the scene today, and the transition happened in several phases. In the 1930s, Alan Turing developed the mathematical principles of an abstract digital machine to show that it could do arbitrary calculations. A number of digital machines were built in the 1940s and 50s, but the big transition could start only after the invention of silicon integrated circuits built from complementary-symmetry metal–oxide–semiconductor (CMOS) MOSFET switches. This is the predominant technology today.
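Turing's abstract machine is simple enough to sketch in a few lines. The simulator and the transition table below are illustrative inventions, not Turing's own construction: this particular machine just walks right along the tape flipping every bit, but the same table-driven loop can, with a richer rule set, carry out any computation.

```python
# Hedged illustration of a table-driven abstract digital machine.
# The rule set here is made up for demonstration: flip every bit.

def run_turing(tape, rules, state="flip"):
    """rules maps (state, symbol) -> (new_symbol, move, new_state).
    Runs until the machine halts or walks off the tape."""
    tape = list(tape)
    pos = 0
    while state != "halt" and 0 <= pos < len(tape):
        symbol, move, state = rules[(state, tape[pos])]
        tape[pos] = symbol
        pos += 1 if move == "R" else -1
    return "".join(tape)

rules = {
    ("flip", "0"): ("1", "R", "flip"),
    ("flip", "1"): ("0", "R", "flip"),
}

print(run_turing("0110", rules))  # → 1001
```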

However, despite the prevalence of digital computers everywhere, the analog world exists, and it exists right beneath the digital circuitry. Think about it this way. If the natural world (of silicon materials) is analog and the computer is digital, the transition from analog to digital must happen somewhere, right? That transition is carefully hidden inside the MOSFET.
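One way to see how the MOSFET hides the analog world is to look at a logic gate's transfer curve. The sketch below is a deliberately idealized model (a steep logistic function standing in for a real CMOS inverter, with an assumed 1 V supply): because the gain around the switching threshold is high, any in-between analog voltage gets pushed back toward the supply rails, so noise on a "digital" line is erased at every logic stage.

```python
# Hedged sketch: an idealized CMOS inverter transfer curve (a logistic
# function, not a real device model) restores noisy analog levels to
# clean digital ones.

import math

VDD = 1.0  # assumed supply voltage

def inverter(v_in, gain=40.0):
    """Idealized inverter: high gain around the midpoint VDD/2."""
    return VDD / (1.0 + math.exp(gain * (v_in - VDD / 2)))

noisy_one = 0.8                           # a logic "1" corrupted by noise
restored = inverter(inverter(noisy_one))  # two stages act as a buffer
print(restored > 0.99 * VDD)              # → True: the noise is gone
```

This threshold-and-restore behavior is exactly what lets digital designers ignore the continuous physics underneath, at least until miniaturization brings it back.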

Between 1997 and 2000, I worked at a semiconductor company, and the task of our group was to hide the analog world from those who did not want to face it. The task gets increasingly difficult with every round of miniaturization, because all elements of the circuit, including n- and p-transistors, gate capacitors, and even the electrical wires connecting them, start to remind everyone that the analog world exists.



Switching now to the living world: Darwin’s model of evolution was entirely analog. Darwin was not aware of Mendel’s discovery of ‘genes’, which introduced discreteness into evolution.

Biologists did not find out about Mendel’s experiments until about 1900, and a huge conflict then ensued between those who sided with Darwin’s continuous evolution and those who sided with Mendel’s discrete evolution.

Mendel’s results were quickly replicated, and genetic linkage was quickly worked out. Biologists flocked to the theory; even though it was not yet applicable to many phenomena, it sought to give a genotypic understanding of heredity, which they felt was lacking in previous studies of heredity that focused on phenotypic approaches. Most prominent of these previous approaches was the biometric school of Karl Pearson and W. F. R. Weldon, which was based heavily on statistical studies of phenotype variation. The strongest opposition to this school came from William Bateson, who perhaps did the most in the early days to publicise the benefits of Mendel’s theory (the word “genetics”, and much of the discipline’s other terminology, originated with Bateson). This debate between the biometricians and the Mendelians was extremely vigorous in the first two decades of the twentieth century, with the biometricians claiming statistical and mathematical rigor,[37] whereas the Mendelians claimed a better understanding of biology.[38][39] (Modern genetics shows that Mendelian heredity is in fact an inherently biological process, though not all genes of Mendel’s experiments are yet understood.)[40][41]


Fisher’s mathematics (1920s and 30s) brought those two camps together, and then the series of discoveries (1950s and 60s) starting with Watson and Crick’s solving of the DNA structure further confirmed the existence of a digital information layer in every living organism. Since then, research on the living world has moved together with the technological advances in the computing world. Advances in digital technology allowed better experimentation on the living world, and those experiments focused more on genetics and genomics. As a result, we learned a lot about how the information layer of living organisms evolved over time, and we plan to learn more through the completion of the 10,000 insect genome project, the 1,000 fish transcriptome project, and various other comparative sequencing projects.

However, when you compare the two narratives, do you notice something missing from the second one? In the case of the semiconductor industry, a dedicated group hides the existence of the analog world from the vast majority of practitioners, but who does so for the living world? Moreover, where is the analog connection of the information layer of life hidden?

As you can guess from the title, the answer most likely lies in ‘translation’ and ‘tRNAs’. Exactly how will be the topic of another commentary.

Written by M. //