Cornell University

Central Campus


Overcoming Thermal Fluctuations Requires Large Energetic Costs for Electrical Signaling in Neurons

 

Biological information processing is energetically costly; our brain consumes more energy per gram than our muscles. But while thermodynamic tools roughly capture the energy requirements of the mechanical work done by muscles, we have no equivalent framework for understanding the energy requirements and efficiency of the information processing done by our brains. Fundamental bounds applied without reference to physical details typically predict costs on the order of the thermal energy, k_BT, per bit, at least six orders of magnitude smaller than the measured costs of transmission between neurons. Here I will argue that these costs can still be understood as arising from the need to beat intrinsic thermal fluctuations, but that practical costs can be much larger than k_BT due to geometric constraints and the physical properties of the communication medium.
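To make the scales in this comparison concrete, here is a brief back-of-the-envelope sketch (my numbers, not the speaker's): it computes the thermal energy k_BT at an assumed physiological temperature of 310 K, and the cost implied if measured transmission costs are about six orders of magnitude larger, as the abstract states.

```python
# Back-of-the-envelope scales, assuming T = 310 K (physiological).
k_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)
T = 310.0           # assumed physiological temperature, K

kBT = k_B * T            # thermal energy: the scale of fundamental per-bit bounds
implied_cost = 1e6 * kBT # scale implied by "six orders of magnitude" larger

print(f"k_B T at 310 K       : {kBT:.2e} J per bit")
print(f"~1e6 x k_B T per bit : {implied_cost:.2e} J per bit")
```

So the fundamental bound sits near 4 x 10^-21 J per bit, while the measured costs the abstract refers to would be on the order of 10^-15 J per bit or more.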

While living systems use physically distinct media for communication, here I will focus on understanding the energy costs of electrical communication in neurons. In these cells, electrical currents mediated by ion channels flow through the cytoplasm and are sensed by distant voltage-sensitive ion channels. I will show that sending signals that are reliable in the presence of thermal fluctuations implies costs that are often orders of magnitude larger than the thermal energy. I will argue that the requirement that signals compete with thermal fluctuations limits the sensitivity of individual ion channels, and that these limits set the voltage scale required to power reliable dynamics. These considerations plausibly account for the high energetic cost of fast electrical communication.

 
