There is a large gap between the mechanisms of chemistry and the magic of biology that most people do not see closed until late in their education. It's a real shame that this gap cannot be closed sooner.
In undergrad I took a bunch of biology and chemistry classes. It wasn't until I took Biochemistry (a senior level class) that everything came together. The biochemistry class I took was a re-telling of all the stories you learned in molecular biology but with the tools you acquire in organic chemistry.
Equipped with those tools I relearned the Krebs cycle and photosynthesis as real chemical reactions that make sense rather than a chain of facts to be memorized.
The class left me with a deep and profound reverence for life. Every process in a cell has a mechanism that can be understood with chemistry. However, the magic of life exists where those processes come together and interact in incredibly complicated ways.
It's seductive to think that we should be able to tease apart these complicated processes and figure out how "life" works, and maybe someday we will. However, it's easy to underestimate the level of complexity and interconnectedness in these systems.
Many of us understand how hard it can be to debug a distributed system. Imagine trying to reverse engineer a distributed system with tens of thousands of interconnected services and messaging queues that all just sort of evolved and were not built with clean engineering practices.
I had a similar revelation in structural biology, applying the physics I learned for bridges and buildings to microscopic proteins. They are structurally like a cathedral built by a blind and deranged architect. The fact that they mechanically bend, pivot, and move like a complex machine at a micro scale to do real work is the most sci-fi thing I can conceive of.
Think of even a simple walking protein like Kinesin [1]. What is not shown in the video is that this is all happening in a hurricane of molecules battering it from all sides. Each part of the structure is being pushed, pulled, and bent, like a robot made out of sticks and rubber bands.
The other word missing is "cheap". Proteins are under massive selection pressure: many reactions in fundamental bits of biology are as thermodynamically efficient as they can be; otherwise some slightly more efficient mutant would have out-competed them aeons ago.
I became interested in biology as a physicist when I realised that all of the problems, on some level, boil down to putting a load of lego pieces in a box, shaking it up with some energy not terribly different to k_B T, and getting fully-formed, self-replicating lego models out the other end. It's all physics. It's all utterly, incomprehensibly, mind-bogglingly complex, with layers of complexity wrapped around each other, and far out of the realms of either physics or chemistry to compute completely. It's why I work at the intersection of the two fields.
Another famous, often-mentioned paper related to this is "Can a Biologist Fix a Radio?", in which the biologist is essentially armed only with a shotgun. The tools of modern molecular biology may be scalpels rather than shotguns, but still, the idea is arguably the same.
I was about two sentences into the parent comment when these videos came to mind. If I had seen just about anything done by Drew Berry when I was in middle school I probably would be in a completely different career:
I recall the same "everything coming together" feeling, but for me it didn't happen until Applied Biochemistry in grad school.
I recall the final exam being only a single question, with a bunch of blank lined pages to write your answer, and the question was something like "You just ate a ham sandwich. What happens to it?" A good answer needed to include everything down to the molecular/chemical level and tie it together all the way up to the macro scale, and I finally felt like that class had prepared me to tell the story.
Oh man, where do I even start? Sensory input from the inner ear for balance, the networks that handle feedback from afferent signals from the periphery, efferent pathways to control motor movement. I don't even know all the details, but it's mind-bogglingly complex. Do I explain the molecular basis of action potentials? The modulating effects of inhibitory feedback within the networks? I feel like all of that barely scratches the surface of the insane complexity of neuronal networks.
And how does one even begin to talk about our desire and internal drive to do things like ride a bicycle?
Assume an experienced rider, as learning is different.
Intention is set, requiring the basal ganglia and forebrain, and either a notion of free will, determinism, or whatever you fancy.
The area ahead is scanned and mapped for a clear path via the retina, optic nerve, and visual cortex, particularly the dorsal parietal pathway.
Initial organised motor signal sequences originate in premotor and pre-premotor areas, hitting the motor strip of the brain, particularly those homuncular areas corresponding to legs, arms, and torso. Basal ganglia loops prime these circuits into action and help maintain their engagement.
Activated motor strip neurons pass through the internal capsule, down the pyramidal tracts and the spinal cord, and meet a lower motor neuron in the anterior horn, which then carries the baton and traverses out of the cord (still the central nervous system) into the body and to the final destination: a muscle. Electrical depolarisation along the axon hops rapidly between nodes of Ranvier, enabled by insulating myelination. At the terminal synaptic bouton, lower motor neurons branch across a muscle body. Each neuron's innervated patch is a motor unit; multiple combine into a motor pool. Depolarisation triggers fusion of vesicles to the membrane endplate and release of acetylcholine into the thin synapse. Rapid diffusion carries the small molecules to the muscle membrane, the sarcolemma, where they bind to the membrane-spanning Nm nicotinic receptors, which open and allow a rapid flooding of sodium into the muscle cell syncytium and an efflux of potassium into the extracellular matrix. Depolarisation of that muscle allows further calcium, released from the sarcoplasmic reticulum, to activate protein machinery; myosin and actin run across each other and fibres contract. With enough activity, concentric movement is achieved across the associated joint.
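The depolarisation chain above is far richer than any toy model, but the threshold-and-reset behaviour of a neuron can be caricatured with a leaky integrate-and-fire model. This is a standard simplification, not the full Hodgkin-Huxley machinery, and the constants here are illustrative rather than measured:

```python
# Leaky integrate-and-fire caricature: the membrane potential leaks back
# toward rest, input current pushes it up, and crossing threshold counts
# as a spike and resets the potential.
V_REST, V_THRESH, V_RESET = -70.0, -55.0, -75.0  # mV (illustrative)
TAU, R = 10.0, 10.0    # membrane time constant (ms), resistance (MOhm)
DT = 0.1               # timestep (ms)

def simulate(i_input, steps=1000):
    """Return the number of spikes over `steps` timesteps of constant input."""
    v, spikes = V_REST, 0
    for _ in range(steps):
        dv = (-(v - V_REST) + R * i_input) / TAU
        v += dv * DT
        if v >= V_THRESH:
            spikes += 1
            v = V_RESET
    return spikes

print(simulate(0.5))   # weak input: potential settles below threshold, no spikes
print(simulate(2.0))   # strong input: repetitive firing
```

Subthreshold input decays back toward rest and never fires; stronger input drives the repeated depolarise-and-reset cycle that, in a real axon, would propagate between the nodes of Ranvier.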
In a manner similar to walking, various spinal reflexes and the spinal locomotor pattern generator create a local, fast framework for actualisation of the impulses.
Feedback on the state of the musculature ascends the spine via dorsal root ganglia and the dorsal horn. Amongst these are proprioceptive afferents, rapidly feeding back the state of tension in muscle fibres from Golgi tendon organs along highly myelinated type 1a fibres. These signals pass into the cerebellum, where they are co-processed with signals from the eyes and vestibular system.
The cerebellum modulates the intensity of descending motor activity by comparing expected to perceived muscle state. It also orchestrates balance by integrating general body state, visual cues, and vestibular information. In this way the small and large oscillations of riding the bike are maintained and constrained into an orderly process.
Experienced riders can dedicate higher function, i.e. executive frontal areas, to other tasks, or to refined modulation of the task to overcome specific issues.
Beginners must use all their frontal powers to focus attention on the task, painstakingly sequence actions, and reflect on the numerous errors and their consequences. Learning is slow, multi system, and largely independent of autobiographical memory.
You forgot the metabolic cycles of the signalling molecules and their receptor proteins, as well as the ion pumps important for those reactions. I think that is really what the professor was going for. Also, perhaps, some [partial] description of the learning process as it impacts a single neuron.
It took me a long time to figure out how I made a turn on a bicycle. No, it's not just turning the handlebars in the direction you want to turn. You actually slightly turn them the other way, then the bike tilts into the turn you want to make, and you turn the handlebars into the turn to stop the tilt from turning into a crash.
It all happens so subtly, and your body does it perfectly with no input from the brain other than "I want to turn".
It is a fun one to show new or inexperienced riders. "Now that we are coasting, what do you think a slight push forward of your left hand will do to the bike?"
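The counter-steering effect in this exchange can be sketched with a toy inverted-pendulum model: treat the bike as a pendulum balanced on its wheel-contact line, so that briefly steering the bars right accelerates the contact patch to the right and tips the bike left, setting up a left turn. The model and all its numbers are illustrative assumptions, not real bicycle dynamics:

```python
import math

# Inverted-pendulum caricature of counter-steering. phi is the lean angle
# (radians, positive = leaning right); a is the lateral acceleration of the
# wheel-contact line caused by steering (positive = moving right).
G, H, DT = 9.81, 1.0, 0.001   # gravity (m/s^2), centre-of-mass height (m), timestep (s)

phi, omega = 0.0, 0.0          # start upright and still
for step in range(1000):       # simulate 1 second
    t = step * DT
    a = 2.0 if t < 0.3 else 0.0   # brief steer-right pulse moves the wheels right
    alpha = (G * math.sin(phi) - a * math.cos(phi)) / H
    omega += alpha * DT
    phi += omega * DT

print(phi)   # negative: the bike has fallen into a LEFT lean
```

The sign of the result is the whole point: pushing the bars one way makes the bike lean, and then turn, the other way.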
My finger gently depresses the black plastic key. As the machine begins to pull in data from the net I shift my weight back in the chair and look up over the top of the screen, out the window at the lunch-hour foot traffic passing silently by beyond the steam-tinted cafe window. The overheavy graphics begin to render, but my gaze is caught by a momentary glimpse between the rushing cars of a woman in a red dress on the far side of the street...
... it'll be fun if you start from what happens when the enter key is pressed: the mechanics and electronics involved in submitting that URL (and some chemistry and physics behind what your eyes see on the screen), the physical transmission of the signal from your computer through the interwebs, and some error-correction protocols to ensure your signals are still useful.
Maybe toss a line or two in about the complexities of running a large data center and how your response time varies based on some sorcery.
Then you go the extra mile and weave a tale of electrons wrestling with their universe of invisible electromagnetic-wave overlords that determine their fate while they embark on a treacherous journey to convey information thousands of kilometers with blistering speed. Tell them of the aged electron who saw a family member get attacked by a stray cosmic ray, and of the fright of the pack when one simply tunneled out of existence...
Not during an interview, though. The interviewer would see it as trolling (at best), and you would fail the interview. And for a good reason! Because as an engineer (and an intelligent person in general) you must be able to separate what is essential from the non-essential for the subject in question. For instance, the physics or the physiology of the process of pushing a key on a keyboard is probably not what the question was about, nor do those things in fact have much to do with typing, even (which you can do on a touchscreen or using the mouse).
This answer reminds me of the blog post by the guy who, IIRC, was asked to implement a linked list during an interview and did it all in Haskell's type system.
When we start looking at life at the level of physics, chemistry, and biochemistry, the absolute beauty of the system begins to appear. The complexity is on a scale that's difficult or even impossible to imagine, even for those trained in the fields, and there is a feeling of wonder that words can't capture.
Going over quantum electrodynamics to explain how to make microcircuits would be fun; that was my first class specific to electrical engineering, some 30-plus years ago.
I went through biochem, but didn't fully understand just how gigantic and complicated proteins are until I started learning about computational protein folding. There are several levels of abstraction just between RNA/ribosomes and functional proteins. That's one of the most shocking complexities to me: most pieces of life are rather elegant once you come to understand them, but it's hard to imagine how complex proteins evolved spontaneously. There's just endless complexity there.
There are 574 amino acids making up four separate interlocking chains in a single hemoglobin, plus the heme groups, all just to bind 4 oxygen molecules. It's simultaneously elegant and hugely complex, far above any discussion of the RNA sequence.
It’s a big part of the “gap” between chemistry and biology IMO.
I worked for a professor (James Milner-White) who was interested in early protein evolution and I remember a conversation we had about the possibility that proteins could have evolved from large to small.
Not sure if it was from a published paper, but the idea was that early proteins might have been large - say several hundred residues - but mostly disordered.
The smaller, more ordered 'domains' would then have evolved within these larger chains. Recombination and deletion would then have pruned down the disordered parts to leave more efficient smaller proteins.
No idea if that idea makes sense or has any research behind it, but it's quite a neat theory.
There was a paper a few years ago about a similar effect in artificial neural networks [0]. The gist was that a large network can contain many subnetworks, and the number of subnetworks grows much faster than the size of the network they are contained in. They were able to find a subnetwork in a randomly weighted network with equivalent performance to a trained network of a much smaller size.
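The combinatorial point behind that paper can be made concrete: if each weight of a dense n-by-n layer can be independently kept or masked out, the number of candidate subnetworks is 2 raised to the weight count, which explodes far faster than the layer size itself. A quick sketch (the layer sizes are arbitrary):

```python
# Count the edge-masked subnetworks of a dense n-by-n layer: every weight
# is either kept or masked, so there are 2**(n*n) candidate subnetworks.
for n_inputs in (2, 4, 8):
    n_weights = n_inputs * n_inputs      # dense layer: n inputs times n outputs
    n_subnetworks = 2 ** n_weights       # each weight independently in or out
    print(n_inputs, n_weights, n_subnetworks)
```

Even at n = 8, the 64 weights already admit about 1.8 * 10^19 subnetworks, which is why a big random network has room to contain a good small one.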
It's actually top-down and bottom-up at the same time. All of biochemistry operates on the basic rules of physics, which determine how the chemistry happens, with feedback from the surroundings/system as the top-down part.
The hemoglobin molecule is different for every species, and if you chart changes in the molecule, it forms the same tree as evolutionary biologists had already figured out.
Humans have the most complicated hemoglobin molecule.
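The tree-building idea above can be illustrated with a toy version of UPGMA-style clustering. The 12-residue sequences below are invented for the example, not real hemoglobin fragments; only the method (pairwise sequence differences, then repeatedly merging the closest clusters) reflects how molecular phylogenies are actually built:

```python
from itertools import combinations

def hamming(a, b):
    """Number of differing residues between two equal-length sequences."""
    return sum(x != y for x, y in zip(a, b))

# Hypothetical 12-residue "globin" fragments, invented for illustration.
seqs = {
    "human":   "VLSPADKTNVKA",
    "chimp":   "VLSPADKTNVKG",
    "mouse":   "VLSGEDKTNIKA",
    "chicken": "MLTPEDKNNVKG",
}

def leaves(cluster):
    """Flatten a nested-tuple cluster into its species names."""
    return [cluster] if isinstance(cluster, str) else leaves(cluster[0]) + leaves(cluster[1])

def avg_dist(c1, c2):
    """UPGMA criterion: mean pairwise distance between two clusters' leaves."""
    pairs = [(a, b) for a in leaves(c1) for b in leaves(c2)]
    return sum(hamming(seqs[a], seqs[b]) for a, b in pairs) / len(pairs)

clusters = list(seqs)
while len(clusters) > 1:                 # greedily merge the closest pair
    a, b = min(combinations(clusters, 2), key=lambda p: avg_dist(*p))
    clusters.remove(a); clusters.remove(b)
    clusters.append((a, b))

print(clusters[0])   # ('chicken', ('mouse', ('human', 'chimp')))
```

The greedy merges recover the familiar grouping, human with chimp first, then mouse, with chicken as the outgroup, which is the sense in which molecular distances redraw the evolutionary tree.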
> I went through biochem, but didn’t fully understand just how gigantic & complicated proteins are until I started learning about computational protein folding.
Some years back, there seemed to be an opportunity to create an educational web interactive: a full-scale 3D folding sim with hands-on direct manipulation, aiming for plausible-not-correct folding. The simulation literature has built up lots of shortcuts for slashing computation costs that sacrifice correctness but not plausibility. So one might variously knead a protein, alter it and its environment, and watch it flail. I wonder if anyone ever got around to it?
I would go further and describe living systems as not just distributed and so on. They are also self-assembling and self-repairing, and they are redundant, which makes them more damage-resistant and 'evolvable'.
Also, these complex assemblies of machines work at (mostly) room temperature and pressure. Except for extremophiles that can work down to freezing or up to boiling temperatures, or in acid or high pressure environments.
Also enzymes catalyse stereospecific reactions, or can use light to drive proton gradients across a lipid membrane, or reduce nitrogen gas. I've always found it funny the sci-fi obsession with 'nanomachines' when living systems are basically composed of exactly that.
I'm not sure there is such a fundamental difference. In biology the code is the DNA and RNA, whereas the hardware is the proteins. DNA and RNA are self-modifying and imperfectly transmitted, but those traits can also exist in computer code (to the extent that they don't, it's because humans make it so, because they hate trying to understand dynamically changing things). The hardware of life is self-creating and self-repairing, but, again, this can also be easily simulated in computer hardware; to the extent that it isn't, it's because it's costly and there is no good reason for it.
Biology's difference from computers is in scope (organisms are whole factories who just happen to have computational abilities by necessity) and origin (organisms aren't designed, and this profoundly and significantly affects everything about them).
> In biology the code is the DNA and RNA, whereas the hardware is the proteins.
This distinction isn't as clear as you think. The active parts of ribosomes (the machines that translate mRNA into proteins) are catalytic RNA. There are organisms that use RNA to store templates (RNA viruses).
Proteins are the runtime on which DNA is executed, because they are the mechanism that "reads" DNA. But proteins are the compiled output of DNA, because they are the result of "reading" DNA. So the DNA defines the runtime environment that is necessary for DNA to run.
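The read/compile loop described above can be made concrete with a toy transcription-and-translation sketch. The codon assignments below are real entries from the standard genetic code, but the table is tiny and strand/direction details (reading the template 3' to 5', splicing, etc.) are glossed over:

```python
# A few real entries from the standard genetic code ('*' marks a stop codon).
CODON_TABLE = {
    "AUG": "M", "UUU": "F", "GGC": "G",
    "UGG": "W", "AAA": "K", "UAA": "*",
}

def transcribe(dna_template):
    """Template DNA strand -> mRNA: complement each base, pairing T with A
    and A with U (direction details glossed over)."""
    pair = {"A": "U", "T": "A", "C": "G", "G": "C"}
    return "".join(pair[b] for b in dna_template)

def translate(mrna):
    """Read the mRNA three bases at a time until a stop codon."""
    protein = []
    for i in range(0, len(mrna) - 2, 3):
        aa = CODON_TABLE[mrna[i:i + 3]]
        if aa == "*":
            break
        protein.append(aa)
    return "".join(protein)

mrna = transcribe("TACAAACCGATT")
print(mrna)               # AUGUUUGGCUAA
print(translate(mrna))    # MFG
```

Of course, in a cell the `translate` step is itself performed by machinery (ribosomes, built partly of catalytic RNA) that was produced by earlier rounds of exactly this process, which is the bootstrapping loop the comment above describes.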
RNA actually has a large role to play in going from DNA to protein. It's been suspected that the first life was RNA-based, because RNA can actually form functional sites, similar to proteins, to do enzymatic reactions. RNA is some of the secret sauce in many of these systems.
Definitely true, and my comment was without a doubt extremely oversimplified and wrong in several respects in an attempt to explain the analogy. Thank you for giving the clarification on it.
I feel the same way about math in general, and all the sciences that derive a lot of their knowledge and systems from it. You start learning math as high-level/abstracted-away things where you just have to memorize that this thing does that, and in this case do this instead. Especially differentiation: I remember they showed us the formula with dy/dx, but they never showed us any proofs of why or how that led to the different outcomes; we just had to memorize.
Meanwhile, later when you get to higher education, math just kind of explodes into this creative problem-solving field with loads of interesting problems and ways to reason about them. But you almost have to relearn it, or properly learn the basics over again, when you get there, because you never learned why or how the basics work, just their inputs and outputs.
I had the opposite experience. My teacher took extra care to explain to us why and how certain things worked in math. The reason I loved math so much, and still do, is because I never had to memorize anything. I just had to understand how it worked. In biology, however, it was very different. I had to memorize facts instead of understanding them.
I agree in spirit. I’d love to see a curriculum that somehow teaches kids to “discover” counting, addition, the utility of notation, squares, cubes, up through square roots, complex numbers, derivatives, etc…. But I feel like it would be tough to create.
Lovely life lesson shared. It also blew my mind how much complexity handling non-formal, non-discrete systems adds. Just sampling root development takes years and endless hours of tedious, non-automatable work. No wonder the field progresses orders of magnitude slower. Also, add chaos theory, quantum mechanics, differential equations, and enzyme molecules to the distributed system to make it a bit more realistic.
This happens so much in education, and specifically in academics: a lot of theoretical explanations without stepping back from theory and going back to reality once in a while to ponder the implications. I passed a lot of courses by memorizing theoretical concepts without truly understanding them. That feels wasteful. I'm not advocating for longer lectures, but rather for more effective teaching methods (at least they would be for me). Understanding a concept is very different from proving a theorem.
I was told that this was so that they could craft curricula that stretch over decades when the material could be taught much quicker. Doing otherwise would be detrimental to the labor market in academia.
> In undergrad I took a bunch of biology and chemistry classes. It wasn't until I took Biochemistry (a senior level class) that everything came together.
In high school I really hated biology and chemistry. It was just a bunch of abstract stuff. What made me (re)discover biology was taking up gardening. To me gardening is like applied biology. After a while you really start to get a sense of how it all works and just how unbelievably complex life systems are: photosynthesis, the carbon cycle, the different water cycles, how soil life affects the plants that grow in it, and how incredibly resourceful plants are in interacting with their environment (not to mention insects and other creatures higher up the food chain...)
While retaining the typical high school separation between math, biology, chemistry, and physics, but given control over the curricula taught in those courses, do you think it is possible to teach a single very high-level concept such as the Krebs cycle in full complexity at a high school level (i.e. starting from algebra and very limited science education, completed in four full-time years)? This seems like a foothold for a potentially interesting restructuring of how we educate children, oriented toward depth in a few things to enlighten future breadth. I ask specifically about feasibility, since that seems like a necessary prerequisite to a discussion of beneficial value.
I read a good book review[1] that moved me towards the view that figuring out how a living thing works is possible, though not necessarily easy. I highly recommend reading it, but here's a summary.
Evolution promotes fitness-enhancing functionality, so we should expect biological processes to be useful for some purpose and hence not be distributed like a random graph (e.g. Erdős-Rényi graphs). Indeed, if we look at biological structures, we find that the causal networks they form are far from random. Furthermore, there are often repeated motifs present. These motifs are quite simple and seem to map neatly onto human-understandable concepts (like XOR gates, autoregulators, feed-forward loops, etc.)
And often the overall graphs seem tree-like rather than some complicated mess of feedback loops (barring autoregulation). This kind of structure is quite modular, and hence we can leverage our understanding of component parts to understand greater and greater pieces of the organism.
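One of the motifs mentioned above, the coherent feed-forward loop, can be sketched as a discrete-time boolean circuit: X activates Y, and Z requires both X and the delayed Y, so brief input blips never reach Z while sustained input does. This is a deliberately minimal caricature of the real (continuous, kinetic) dynamics:

```python
# Coherent feed-forward loop as a boolean circuit: X drives Y with a
# one-step delay, and Z fires only when X AND Y are both on. The delay
# makes Z a persistence filter that ignores transient input.
def run_ffl(x_signal):
    """Return the trace of Z for a given input trace of X."""
    y, z_trace = 0, []
    for x in x_signal:
        z = x and y          # Z needs current X AND delayed Y
        y = x                # Y follows X one step later
        z_trace.append(z)
    return z_trace

print(run_ffl([1, 0, 0, 0, 0]))   # brief blip: Z never fires
print(run_ffl([1, 1, 1, 1, 0]))   # sustained input: Z fires after one step
```

The motif is human-comprehensible in exactly the sense the review argues: you can read its function (filter out noise, respond to persistent signals) straight off the wiring.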
There are two problems with this argument: first, a lot of the data used for it is not nearly exhaustive; maybe the people examining biological circuitry stumbled on the rare areas where there are repeated sub-components. Second, even if there are repeated sub-components, why should we get modularity, i.e. few connections, mostly local?
The former may not be an issue if there hasn't been a lot of dedicated effort towards finding human comprehensible structure in biological circuits, which there might not have been. These things are big and complicated, with many constituent parts, and teasing out the underlying structure may require loads of computation and statistical analysis, which was hard for most of the history of biology.
The latter is not addressed in the book review or in the comments, but the review author's work makes it plausible to me that modularity will be common in biological systems. I don't have a good summary of that, nor can I clearly articulate why I'm hopeful about it. But read the rest of that writer's work if you're interested in this kind of stuff (key words: natural abstractions, interfaces, selection theorems).
> Imagine trying to reverse engineer a distributed system with tens of thousands of interconnected services and messaging queues that all just sort of evolved and were not built with clean engineering practices.
The more I understand about biology, the more bizarre it is that people try to beat it down to simple, obvious, narrow, and globally consistent binaries to serve their ideological purposes.