Thermodynamic costs of information processing in sensory adaptation.

Sartori P, Granger L, Lee CF, Horowitz JM - PLoS Comput. Biol. (2014)

Bottom Line: We apply these principles to the E. coli chemotaxis pathway during binary ligand concentration changes. In this regime, we quantify the amount of information stored by each methyl group and show that receptors consume energy in the range of the information-theoretic minimum. Our work provides a basis for further inquiries into more complex phenomena, such as gradient sensing and frequency response.

View Article: PubMed Central - PubMed

Affiliation: Max Planck Institute for the Physics of Complex Systems, Dresden, Germany.

ABSTRACT
Biological sensory systems react to changes in their surroundings. They are characterized by fast response and slow adaptation to varying environmental cues. Insofar as sensory adaptive systems map environmental changes to changes of their internal degrees of freedom, they can be regarded as computational devices manipulating information. Landauer established that information is ultimately physical, and that its manipulation is subject to the entropic and energetic bounds of thermodynamics. Thus the fundamental costs of biological sensory adaptation can be elucidated by tracking how the information the system has about its environment is altered. These bounds are particularly relevant for small organisms, which, unlike everyday computers, operate at very low energies. In this paper, we establish a general framework for the thermodynamics of information processing in sensing. With it, we quantify how, during sensory adaptation, information about the past is erased while information about the present is gathered. This process produces entropy larger than the amount of old information erased and has an energetic cost bounded by the amount of new information written to memory. We apply these principles to the E. coli chemotaxis pathway during binary ligand concentration changes. In this regime, we quantify the amount of information stored by each methyl group and show that receptors consume energy in the range of the information-theoretic minimum. Our work provides a basis for further inquiries into more complex phenomena, such as gradient sensing and frequency response.
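In schematic form, the two bounds stated in the abstract can be written as follows; the notation here is illustrative, and the precise definitions of the information terms are given in the paper:

\[
\Delta S_{\mathrm{tot}} \;\geq\; k_B\, I_{\mathrm{erased}},
\qquad
W \;\geq\; k_B T\, I_{\mathrm{written}},
\]

where $\Delta S_{\mathrm{tot}}$ is the total entropy produced during adaptation, $I_{\mathrm{erased}}$ is the information about the past signal lost from the memory, $W$ is the energetic cost, and $I_{\mathrm{written}}$ is the information about the new signal recorded in the memory (both information terms measured in nats). Landauer's familiar $k_B T \ln 2$ per bit is the binary special case of bounds of this type.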


pcbi-1003974-g002: Equilibrium adaptation in a symmetric feedforward SAS. (A) Reaction network of the four states in activity (a) and memory (m) space, with the kinetic rates indicated for each transition. (B) Topology of the model: feedforward with mutual inhibition. For a fixed signal, a sudden increase in the memory makes the average activity drop, and vice versa for activity changes. This symmetry of the topology, which is at the core of detailed balance, allows an equilibrium construction. (C/D) Representation of the steady-state probabilities for low/high signals using the (a, m) space in (A). A wider state diameter represents a higher probability and thus a lower energy.
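As a reading aid for panels (C/D): because the rates obey detailed balance, the steady state at each fixed signal is a Boltzmann distribution, which is why a wider (more probable) state in the figure corresponds to a lower energy. Schematically, writing $E_\ell(a,m)$ for a model-dependent energy at signal $\ell$ (our notation, not necessarily the paper's),

\[
p^{\mathrm{eq}}_{\ell}(a,m) \;=\; \frac{e^{-E_{\ell}(a,m)/k_B T}}{Z_{\ell}},
\qquad
\frac{\omega\big[(a,m)\to(a',m')\big]}{\omega\big[(a',m')\to(a,m)\big]}
\;=\; e^{-\left[E_{\ell}(a',m')-E_{\ell}(a,m)\right]/k_B T},
\]

so the ratio of forward and backward rates on every edge of the network in (A) is fixed by an energy difference, and no probability current flows in the steady state.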

Mentions: To facilitate the development of our formalism, we first present a minimal stochastic model of a SAS, in which the activity a and the memory m are binary variables (0 or 1). This model is minimal, since it has the smallest possible number of degrees of freedom (or states) while still exhibiting the required response and adaptive behavior. Treating the environmental signal as an external field that drives the SAS, the system can be viewed as evolving by jumping stochastically between the four states depicted in Fig. 2A. The rates for activity transitions from a to 1 − a given m at a fixed signal are denoted ω_a, and those for memory transitions from m to 1 − m given a are denoted ω_m.
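To make the minimal model concrete, the following Python sketch builds the four-state rate matrix and integrates the master equation through a step change in the signal. The energy function, the field strength h, the coupling J, and the fast/slow rate scales wa and wm are illustrative assumptions (not the paper's parameter values or rate notation), chosen only so that the activity responds quickly to the step and then partially re-adapts as the slow memory catches up.

import numpy as np

# Illustrative energy for the symmetric feedforward model (an assumption, not the
# paper's parameterization): the signal l biases both activity a and memory m,
# and the symmetric coupling J*a*m implements their mutual inhibition.
def energy(a, m, l, h=2.0, J=4.0):
    return -h * l * a - h * l * m + J * a * m

def generator(l, beta=1.0, wa=1.0, wm=0.02):
    """Generator matrix Q with Q[i, j] = rate from state i to state j.

    States are (a, m) with a, m in {0, 1}. Arrhenius-type rates
    w * exp(-beta * dE / 2) satisfy detailed balance with respect to
    exp(-beta * E), so each fixed signal has a Boltzmann steady state.
    Activity flips fast (wa) and memory flips slowly (wm), giving fast
    response followed by slow adaptation.
    """
    states = [(0, 0), (0, 1), (1, 0), (1, 1)]
    idx = {s: i for i, s in enumerate(states)}
    Q = np.zeros((4, 4))
    for a, m in states:
        i = idx[(a, m)]
        dEa = energy(1 - a, m, l) - energy(a, m, l)   # activity flip a -> 1-a
        Q[i, idx[(1 - a, m)]] = wa * np.exp(-beta * dEa / 2)
        dEm = energy(a, 1 - m, l) - energy(a, m, l)   # memory flip m -> 1-m
        Q[i, idx[(a, 1 - m)]] = wm * np.exp(-beta * dEm / 2)
    np.fill_diagonal(Q, -Q.sum(axis=1))
    return Q, states

def steady_state(Q):
    # Solve p Q = 0 together with the normalization sum(p) = 1.
    A = np.vstack([Q.T, np.ones(Q.shape[0])])
    b = np.zeros(Q.shape[0] + 1)
    b[-1] = 1.0
    return np.linalg.lstsq(A, b, rcond=None)[0]

def mean_activity(p, states):
    return sum(p[i] * a for i, (a, _) in enumerate(states))

# Adapt to l = 0, then step the signal to l = 1 and integrate the master
# equation dp/dt = p Q with a simple Euler scheme.
Q0, states = generator(l=0)
Q1, _ = generator(l=1)
p = steady_state(Q0)
print("adapted activity before step :", round(mean_activity(p, states), 3))
dt, trace = 0.01, []
for _ in range(int(500 / dt)):
    p = p + dt * (p @ Q1)
    trace.append(mean_activity(p, states))
print("peak response after the step :", round(max(trace), 3))
print("activity after re-adaptation :", round(trace[-1], 3))

Running the sketch prints the adapted activity before the step, the transient peak just after it, and the partially re-adapted value long afterwards, reproducing the response-then-adaptation behavior described above; because the rates satisfy detailed balance, the distribution it relaxes to is a Boltzmann steady state analogous to the high-signal case shown in panel (D).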

