Interface Focus current issue
http://rsfs.royalsocietypublishing.org
Interface Focus | ISSN 2042-8898, 2042-8901 | December 6, 2018

Towards fungal computer
http://rsfs.royalsocietypublishing.org/cgi/content/short/8/6/20180029?rss=1
We propose that basidiomycete fungi can be used as computing devices: information is represented by spikes of electrical activity, computation is implemented in a mycelium network and an interface is realized via fruit bodies. In a series of scoping experiments, we demonstrate that electrical activity recorded on fruit bodies can act as a reliable indicator of the fungus's response to thermal and chemical stimulation. Stimulation of one fruit body is reflected in changes in the electrical activity of other fruit bodies in a cluster, i.e. there is distant information transfer between fungal fruit bodies. In an automaton model of a fungal computer, we show how to implement computation with fungi and demonstrate that the structure of the logical functions computed is determined by the mycelium geometry.
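The idea that mycelium geometry selects the logical function can be sketched with a toy spiking automaton on a graph. This is our illustrative construction, not the authors' model: node names, thresholds and the update rule are all assumptions made for the sketch.

```python
def step(graph, thresholds, active):
    """One synchronous update: a node fires if enough of its inputs are active."""
    return {node for node, inputs in graph.items()
            if sum(1 for i in inputs if i in active) >= thresholds[node]}

def compute(graph, thresholds, in_nodes, out_node, bits, max_steps=10):
    """Clamp input nodes to `bits`, propagate spikes, report whether the output fires."""
    clamped = {n for n, b in zip(in_nodes, bits) if b}
    active = set(clamped)
    for _ in range(max_steps):
        active = step(graph, thresholds, active) | clamped
        if out_node in active:
            return 1
    return 0

# The same wiring computes AND or OR depending only on the firing threshold,
# a stand-in for how network structure determines the logical function.
and_gate = {'out': ['a', 'b']}
print(compute(and_gate, {'out': 2}, ['a', 'b'], 'out', (1, 1)))
```

With threshold 2 the output node needs spikes from both inputs (AND); lowering it to 1 turns the same wiring into OR.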
Interface Focus 8(6): 20180029 | DOI: 10.1098/rsfs.2018.0029 | Published 2018-10-19 | Article
Theme issue 'Computation by natural systems' organised by Dominique Chu, Christian Ray and Mikhail Prokopenko

From statistical inference to a differential learning rule for stochastic neural networks
http://rsfs.royalsocietypublishing.org/cgi/content/short/8/6/20180033?rss=1
Stochastic neural networks are a prototypical computational device able to build a probabilistic representation of an ensemble of external stimuli. Building on the relationship between inference and learning, we derive a synaptic plasticity rule that relies only on delayed activity correlations, and that shows a number of remarkable features. Our delayed-correlations matching (DCM) rule satisfies some basic requirements for biological feasibility: finite and noisy afferent signals, Dale’s principle and asymmetry of synaptic connections, locality of the weight update computations. Nevertheless, the DCM rule is capable of storing a large, extensive number of patterns as attractors in a stochastic recurrent neural network, under general scenarios without requiring any modification: it can deal with correlated patterns, a broad range of architectures (with or without hidden neuronal states), one-shot learning with the palimpsest property, all the while avoiding the proliferation of spurious attractors. When hidden units are present, our learning rule can be employed to construct Boltzmann machine-like generative models, exploiting the addition of hidden neurons in feature extraction and classification tasks.
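A delayed-correlation-style plasticity rule can be sketched in a few lines. This is a hedged illustration in the spirit of the abstract, not the paper's exact DCM rule: the weights are nudged until the network's one-step stochastic mean response to each stored pattern matches the pattern itself, making the patterns attractors.

```python
import numpy as np

# Illustrative delayed-correlation-style learning (our sketch, not the DCM rule):
# match the correlation between the state at t (the clamped pattern) and the
# network's mean response at t+1, tanh(beta * W @ xi).
rng = np.random.default_rng(0)
N, P = 50, 5
patterns = rng.choice([-1.0, 1.0], size=(P, N))
W = np.zeros((N, N))
eta, beta = 0.1, 4.0

for epoch in range(200):
    for xi in patterns:
        pred = np.tanh(beta * (W @ xi))          # mean response one step later
        W += eta * np.outer(xi - pred, xi) / N   # close the correlation gap
        np.fill_diagonal(W, 0.0)                 # no self-connections

# after training, each pattern should be (close to) a fixed point of the dynamics
print(np.mean(np.sign(W @ patterns[0]) == patterns[0]))
```

The update is local (each weight sees only its pre- and postsynaptic activity), echoing the locality requirement listed above, though it ignores Dale's principle for simplicity.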
Interface Focus 8(6): 20180033 | DOI: 10.1098/rsfs.2018.0033 | Published 2018-10-19 | Article

Something has to give: scaling combinatorial computing by biological agents exploring physical networks encoding NP-complete problems
http://rsfs.royalsocietypublishing.org/cgi/content/short/8/6/20180034?rss=1
On-chip network-based computation using biological agents is a new hardware-embedded approach which attempts to find solutions to combinatorial problems, in principle in a shorter time than fast but sequential electronic computers. This analytical review starts by describing the underlying mathematical principles, presents several types of combinatorial (including NP-complete) problems and shows current implementations of proof-of-principle developments. Taking the subset sum problem as an example for in-depth analysis, the review presents various options for computing agents and compares several possible operation ‘run modes’ of network-based computer systems. Given the brute-force approach of network-based systems, for a problem of input size C all 2^C candidate solutions must be visited. As this exponentially increasing workload needs to be distributed in space, time and per computing agent, this review identifies the scaling-related key technological challenges in terms of chip fabrication, readout reliability and energy efficiency. The estimated computing time of massively parallel or combinatorially operating biological agents is then compared to that of electronic computers. Among future developments which could considerably improve network-based computing, labelling agents ‘on the fly’ and reading out their travel history at network exits could offer promising avenues for finding hardware-embedded solutions to combinatorial problems.
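The 2^C search space mentioned above is exactly what a brute-force subset sum solver enumerates; a network-based computer distributes these candidates over physical paths and agents rather than iterating them sequentially. A minimal sequential version, for comparison:

```python
from itertools import combinations

# Brute-force subset sum: every one of the 2^C subsets of a C-element input
# set is a candidate solution, the workload a network-based computer would
# spread over its biological agents.
def subset_sum(values, target):
    """Return the first subset of `values` summing to `target`, or None."""
    for r in range(len(values) + 1):
        for combo in combinations(values, r):
            if sum(combo) == target:
                return combo
    return None

print(subset_sum([2, 9, 11, 4], 13))
```

Doubling the input size doubles nothing for a single agent to do per step, but squares the number of subsets, which is the scaling pressure the review analyses.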
Interface Focus 8(6): 20180034 | DOI: 10.1098/rsfs.2018.0034 | Published 2018-10-19 | Article

Thermodynamic efficiency of contagions: a statistical mechanical analysis of the SIS epidemic model
http://rsfs.royalsocietypublishing.org/cgi/content/short/8/6/20180036?rss=1
We present a novel approach to the study of epidemics on networks as thermodynamic phenomena, quantifying the thermodynamic efficiency of contagions, considered as distributed computational processes. Modelling susceptible–infected–susceptible (SIS) dynamics on a contact network statistical-mechanically, we follow the maximum entropy (MaxEnt) principle to obtain steady-state distributions and derive, under certain assumptions, relevant thermodynamic quantities both analytically and numerically. In particular, we obtain closed-form solutions for some cases, while interpreting key epidemic variables, such as the reproductive ratio of an SIS model, in a statistical mechanical setting. We also consider configuration and free entropy, as well as the Fisher information, in the epidemiological context. This allows us to identify criticality and distinct phases of epidemic processes. For each of the considered thermodynamic quantities, we compare the analytical solutions informed by the MaxEnt principle with numerical estimates for SIS epidemics simulated on Watts–Strogatz random graphs.
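The numerical side of the comparison can be sketched with a minimal Monte Carlo SIS simulation on a Watts–Strogatz graph. The rates, sizes and seeds below are our assumptions for illustration; the paper's contribution is the MaxEnt analysis against which such simulations are compared.

```python
import random
import networkx as nx

# Minimal stochastic SIS on a Watts-Strogatz small-world graph (illustrative
# parameters, not taken from the paper).
random.seed(1)
G = nx.watts_strogatz_graph(n=200, k=4, p=0.1, seed=1)
beta_inf, gamma = 0.3, 0.1                  # per-step infection / recovery probabilities
infected = set(random.sample(sorted(G.nodes), 10))

for t in range(500):
    new_inf, new_rec = set(), set()
    for u in infected:
        if random.random() < gamma:         # recovery returns a node to susceptible
            new_rec.add(u)
        for v in G.neighbors(u):            # each infected neighbour may transmit
            if v not in infected and random.random() < beta_inf:
                new_inf.add(v)
    infected = (infected | new_inf) - new_rec

prevalence = len(infected) / G.number_of_nodes()   # endemic steady-state fraction
print(prevalence)
```

With these parameters the effective reproductive ratio is well above 1, so the simulation settles into the endemic phase rather than dying out; lowering `beta_inf` towards `gamma / k` pushes the system towards the critical regime the abstract refers to.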
Interface Focus 8(6): 20180036 | DOI: 10.1098/rsfs.2018.0036 | Published 2018-10-19 | Article

A thermodynamically consistent model of finite-state machines
http://rsfs.royalsocietypublishing.org/cgi/content/short/8/6/20180037?rss=1
Finite-state machines (FSMs) are a theoretically and practically important model of computation. We propose a general, thermodynamically consistent model of FSMs and characterize the resource requirements of these machines. We model FSMs as time-inhomogeneous Markov chains. The computation is driven by instantaneous manipulations of the energy levels of the states. We calculate the entropy production of the machine, its error probability, and the time required to complete one update step. We find that a sequence of generalized bit-setting operations is sufficient to implement any FSM.
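The trade-off between invested energy and error probability in a bit-setting operation can be made concrete with a two-state equilibrium sketch. This is our illustration of the general idea, not the paper's formalism: to drive the system into the target state, lower that state's energy by dE and let it relax to the Gibbs distribution; the residual weight on the other state is the error.

```python
import math

# Thermodynamic bit-set sketch: after relaxing to equilibrium with the target
# state's energy lowered by dE, the probability of NOT being in the target
# state follows the two-state Gibbs distribution.
def bitset_error(dE, kT=1.0):
    """Equilibrium error probability of a two-state bit-set with energy gap dE."""
    return 1.0 / (1.0 + math.exp(dE / kT))

for dE in (1.0, 5.0, 10.0):
    print(f"dE = {dE:>4} kT  ->  error {bitset_error(dE):.2e}")
```

The error falls exponentially with the energy gap, so reliable operation costs many kT per update step, which is the kind of resource requirement the model characterizes.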
Interface Focus 8(6): 20180037 | DOI: 10.1098/rsfs.2018.0037 | Published 2018-10-19 | Article

Computational modelling unravels the precise clockwork of cyanobacteria
http://rsfs.royalsocietypublishing.org/cgi/content/short/8/6/20180038?rss=1
Precisely timing the regulation of gene expression by anticipating recurring environmental changes is a fundamental part of global gene regulation. Circadian clocks are one form of this regulation, found in both eukaryotes and prokaryotes and providing a fitness advantage for these organisms. Whereas many different eukaryotic groups harbour circadian clocks, cyanobacteria are the only known oxygenic phototrophic prokaryotes to regulate large parts of their genes in a circadian fashion. A decade of intensive research into the mechanisms and functionality of the cyanobacterial clock, using computational and mathematical approaches alongside detailed biochemical and biophysical characterization, has made this the best-understood circadian clock. Here, we summarize the findings and insights into various parts of the cyanobacterial circadian clock gained through mathematical modelling. These findings have implications for eukaryotic circadian research as well as for synthetic biology harnessing the power and efficiency of global gene regulation.
Interface Focus 8(6): 20180038 | DOI: 10.1098/rsfs.2018.0038 | Published 2018-10-19 | Article

Intrinsic limits of information transmission in biochemical signalling motifs
http://rsfs.royalsocietypublishing.org/cgi/content/short/8/6/20180039?rss=1
All living things have evolved to sense changes in their environment in order to respond in adaptive ways. At the cellular level, these sensing systems generally involve receptor molecules at the cell surface, which detect changes outside the cell and relay those changes to the appropriate response elements downstream. With the advent of experimental technologies that can track signalling at the single-cell level, it has become clear that many signalling systems exhibit significant levels of ‘noise’, manifesting as differential responses of otherwise identical cells to the same environment. This noise has a large impact on the capacity of cell signalling networks to transmit information from the environment. Application of information theory to experimental data has found that all systems studied to date encode less than 2.5 bits of information, with the majority transmitting significantly less than 1 bit. Given the growing interest in applying information theory to biological data, it is crucial to understand whether the low values observed to date represent some sort of intrinsic limit on information flow, given the inherently stochastic nature of biochemical signalling events. In this work, we used a series of computational models to explore how much information a variety of common ‘signalling motifs’ can encode. We found that the majority of these motifs, which serve as the basic building blocks of cell signalling networks, can encode far more information (4–6 bits) than has ever been observed experimentally. In addition to providing a consistent framework for estimating information-theoretic quantities from experimental data, our findings suggest that the low levels of information flow observed so far in living systems are not necessarily due to intrinsic limitations. Further experimental work will be needed to understand whether certain cell signalling systems actually can approach the intrinsic limits described here, and to understand the sources and purpose of the variation that reduces information flow in living cells.
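How noise caps the bits a channel can carry can be illustrated with a plug-in mutual information estimate on a toy input–output channel. The channel and noise level below are our construction, not one of the paper's signalling motifs.

```python
import math
import random
from collections import Counter

# A noiseless read-out of 4 equally likely input levels carries log2(4) = 2 bits;
# the scrambling noise below collapses responses together and pushes the mutual
# information well under that bound, mimicking cell-to-cell variability.
random.seed(0)

def mutual_information(pairs):
    """Plug-in mutual information estimate, in bits, from (input, output) samples."""
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum(c / n * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

samples = []
for _ in range(20000):
    x = random.randrange(4)                                   # 4 input doses
    y = x if random.random() < 0.8 else random.randrange(4)   # 20% scrambled responses
    samples.append((x, y))

print(round(mutual_information(samples), 2))   # noticeably below the 2-bit ceiling
```

Even this modest noise level costs most of a bit, which is why experimentally measured capacities sit so far below the number of distinguishable input levels.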
Interface Focus 8(6): 20180039 | DOI: 10.1098/rsfs.2018.0039 | Published 2018-10-19 | Article

Haematopoietic stem cells: entropic landscapes of differentiation
http://rsfs.royalsocietypublishing.org/cgi/content/short/8/6/20180040?rss=1
The metaphor of a potential epigenetic differentiation landscape broadly suggests that, during differentiation, a stem cell descends from a higher free-energy state towards a stable equilibrium state which represents the final cell type. It has been conjectured that there is an analogy to the concept of entropy in statistical mechanics. In this context, in the undifferentiated state, the entropy would be large, since fewer constraints exist on the gene expression programmes of the cell. As differentiation progresses, gene expression programmes become more and more constrained, and thus the entropy would be expected to decrease. In order to assess these predictions, we compute the Shannon entropy for time-resolved single-cell gene expression data in two different experimental set-ups of haematopoietic differentiation. We find that the behaviour of this entropy measure contradicts these predictions. In particular, the Shannon entropy is not a decreasing function of developmental pseudo-time but instead increases towards the time point of commitment before decreasing again. This behaviour is consistent with an increase in gene expression disorder observed in populations sampled at the time point of commitment. Single cells in these populations exhibit different combinations of regulator activity, suggesting the presence of multiple configurations of a potential differentiation network as a result of multiple entry points into the committed state.
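The entropy measure in question is the ordinary Shannon entropy of the distribution of cell states in a population. A simplified sketch, with made-up state labels and population compositions standing in for discretized expression profiles:

```python
import math
from collections import Counter

# Shannon entropy of the empirical distribution of cell states in a population
# sampled at one point of developmental pseudo-time.
def shannon_entropy(states):
    """H = -sum p * log2(p) over the observed state frequencies."""
    n = len(states)
    return -sum(c / n * math.log2(c / n) for c in Counter(states).values())

# Hypothetical populations: one dominated by a single state, one spread over
# many regulator configurations, as reported around the commitment point.
uncommitted = ["A"] * 80 + ["B"] * 20
at_commitment = ["A"] * 30 + ["B"] * 30 + ["C"] * 25 + ["D"] * 15

print(shannon_entropy(uncommitted) < shannon_entropy(at_commitment))
```

The more configurations a population occupies, the higher its entropy, which is how a rise towards the commitment point registers as increased gene expression disorder.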
Interface Focus 8(6): 20180040 | DOI: 10.1098/rsfs.2018.0040 | Published 2018-10-19 | Article

Semantic information, autonomous agency and non-equilibrium statistical physics
http://rsfs.royalsocietypublishing.org/cgi/content/short/8/6/20180041?rss=1
Shannon information theory provides various measures of so-called syntactic information, which reflect the amount of statistical correlation between systems. By contrast, the concept of ‘semantic information’ refers to those correlations which carry significance or ‘meaning’ for a given system. Semantic information plays an important role in many fields, including biology, cognitive science and philosophy, and there has been a long-standing interest in formulating a broadly applicable and formal theory of semantic information. In this paper, we introduce such a theory. We define semantic information as the syntactic information that a physical system has about its environment which is causally necessary for the system to maintain its own existence. ‘Causal necessity’ is defined in terms of counterfactual interventions which scramble correlations between the system and its environment, while ‘maintaining existence’ is defined in terms of the system's ability to keep itself in a low-entropy state. We also use recent results in non-equilibrium statistical physics to analyse semantic information from a thermodynamic point of view. Our framework is grounded in the intrinsic dynamics of a system coupled to an environment, and is applicable to any physical system, living or otherwise. It leads to formal definitions of several concepts that have been intuitively understood to be related to semantic information, including ‘value of information’, ‘semantic content’ and ‘agency’.
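The scrambling intervention has a simple operational reading: measure the system–environment mutual information, then shuffle the environment samples to break the correlation and measure again. The toy distributions below are our construction; in the paper's framework the information destroyed by scrambling is then tested for being causally necessary for the system's persistence.

```python
import math
import random
from collections import Counter

# Counterfactual-scrambling sketch: mutual information between paired
# "system" and "environment" samples, before and after shuffling the
# environment column.
random.seed(3)

def mi_bits(pairs):
    """Plug-in mutual information estimate in bits."""
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum(c / n * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

env = [random.randrange(2) for _ in range(10000)]
sys_state = [e if random.random() < 0.95 else 1 - e for e in env]  # system tracks environment

original = mi_bits(list(zip(sys_state, env)))
scrambled = mi_bits(list(zip(sys_state, random.sample(env, len(env)))))
print(original > scrambled)   # scrambling destroys the correlation
```

The scrambled estimate hovers near zero (up to finite-sample bias), so the difference isolates exactly the syntactic information the intervention removed.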
Interface Focus 8(6): 20180041 | DOI: 10.1098/rsfs.2018.0041 | Published 2018-10-19 | Article

Computation by natural systems
http://rsfs.royalsocietypublishing.org/cgi/content/short/8/6/20180058?rss=1
Computation is a useful concept far beyond the disciplinary boundaries of computer science. Perhaps the most important class of natural computers can be found in biological systems that perform computation on multiple levels. From molecular and cellular information processing networks to ecologies, economies and brains, life computes. Despite ubiquitous agreement on this fact going back as far as von Neumann automata and McCulloch–Pitts neural nets, we so far lack principles to understand rigorously how computation is done in living, or active, matter. What is the ultimate nature of natural computation that has evolved, and how can we use these principles to engineer intelligent technologies and biological tissues?
Interface Focus 8(6): 20180058 | DOI: 10.1098/rsfs.2018.0058 | Published 2018-10-19 | Introduction