Phys. Rev. X 14, 021007 (2024) – Published 8 April 2024
A novel theoretical framework unravels how processes in complex systems that occur at different timescales are coupled at the functional level through shared information.
Phys. Rev. X 14, 011037 (2024) – Published 5 March 2024
An analytical framework describing ecosystems in which species interactions drive large population fluctuations provides a way to address fundamental questions about this dynamical state.
Logan A. Becker, Baowang Li, Nicholas J. Priebe, Eyal Seidemann, and Thibaud Taillefumier
Phys. Rev. X 14, 011021 (2024) – Published 16 February 2024
Achieving realistic subthreshold variability in a biophysical neuronal model requires low-level synchrony in its synaptic input drive, a finding that challenges current theories of how cortical neurons generate spiking activity.
Claudia Merger, Alexandre René, Kirsten Fischer, Peter Bouss, Sandra Nestler, David Dahmen, Carsten Honerkamp, and Moritz Helias
Phys. Rev. X 13, 041033 (2023) – Published 20 November 2023
Models of systems in physics usually start with elementary processes. New work with a neural network shows how models can also be built by observing the system as a whole and deducing the underlying interactions.
Steven Durr, Youssef Mroueh, Yuhai Tu, and Shenshen Wang
Phys. Rev. X 13, 041004 (2023) – Published 5 October 2023
A simplified model of a family of machine-learning architectures offers a way to explore a major form of training failure—mode collapse—that is not well understood.
Freya Behrens, Barbora Hudcová, and Lenka Zdeborová
Phys. Rev. X 13, 031021 (2023) – Published 21 August 2023
A simple twist on a mainstay tool for analyzing the dynamics of disordered systems provides a way to describe out-of-equilibrium properties, which are traditionally much harder to obtain.
Phys. Rev. X 13, 031020 (2023) – Published 18 August 2023
A wide class of physical systems could be turned into learning machines, thanks to a new general approach to training them based entirely on physical dynamics combined with a time-reversal operation.
Sara Dal Cengio, Vivien Lecomte, and Matteo Polettini
Phys. Rev. X 13, 021040 (2023) – Published 27 June 2023
A new framework for analyzing forces and currents in nonequilibrium systems generalizes existing graph-theoretical tools to encompass interacting reaction networks and time-dependent properties.
I. Samoylenko, D. Aleja, E. Primo, K. Alfaro-Bittner, E. Vasilyeva, K. Kovalenko, D. Musatov, A. M. Raigorodskii, R. Criado, M. Romance, D. Papo, M. Perc, B. Barzel, and S. Boccaletti
Phys. Rev. X 13, 021032 (2023) – Published 31 May 2023
The “six degrees of separation” is a property of the equilibrium state of any network in which individuals weigh their aspiration to improve their centrality against the costs incurred in forming or maintaining connections.
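The small-world phenomenon behind “six degrees of separation” can be illustrated with a generic toy model, which is not the authors’ equilibrium framework: on a ring network, adding even a handful of long-range shortcuts collapses the average shortest-path distance. A minimal sketch (graph sizes and shortcut counts chosen purely for illustration):

```python
import random
from collections import deque

def average_distance(n, adj):
    """Mean shortest-path length over all ordered node pairs, via BFS."""
    total, pairs = 0, 0
    for src in range(n):
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += sum(dist.values())
        pairs += len(dist) - 1
    return total / pairs

def ring_with_shortcuts(n, shortcuts, seed=0):
    """Ring lattice plus a few random long-range links (toy small-world graph)."""
    rng = random.Random(seed)
    adj = {i: {(i - 1) % n, (i + 1) % n} for i in range(n)}
    for _ in range(shortcuts):
        u, v = rng.sample(range(n), 2)
        adj[u].add(v)
        adj[v].add(u)
    return adj

n = 200
ring_only = average_distance(n, ring_with_shortcuts(n, 0))    # pure ring: ~n/4
small_world = average_distance(n, ring_with_shortcuts(n, 20)) # far smaller
```

On the pure ring, typical distances grow linearly with the network size; with shortcuts they grow only logarithmically, which is the generic mechanism that makes short separation chains possible.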
Jack Murdoch Moore, Gang Yan, and Eduardo G. Altmann
Phys. Rev. X 12, 021056 (2022) – Published 10 June 2022
A new approach to applying a power-law model to describe extreme events avoids traditional pitfalls and offers a more robust basis for predicting and mitigating risk.
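Fitting a power-law tail to event-size data is commonly done with the Hill estimator; the sketch below illustrates that standard baseline technique on synthetic data, not the improved method proposed in the paper. All names and parameters here are illustrative:

```python
import math
import random

def hill_estimator(data, k):
    """Hill estimator of a power-law tail exponent alpha,
    computed from the k largest observations."""
    xs = sorted(data, reverse=True)
    logs = [math.log(xs[i] / xs[k]) for i in range(k)]
    return k / sum(logs)

# Synthetic heavy-tailed data: Pareto(alpha) via inverse-transform sampling,
# P(X > x) = x**(-alpha) for x >= 1.
rng = random.Random(0)
alpha = 2.5
samples = [(1.0 - rng.random()) ** (-1.0 / alpha) for _ in range(20000)]

estimate = hill_estimator(samples, k=2000)  # should recover alpha ~ 2.5
```

A known pitfall of this baseline, and part of what motivates more robust approaches, is its sensitivity to the arbitrary cutoff `k`: choosing it too large mixes in non-tail data and biases the exponent.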
A. Molavi Tabrizi, A. Mesgarnejad, M. Bazzi, S. Luther, J. Christoph, and A. Karma
Phys. Rev. X 12, 021052 (2022) – Published 6 June 2022
When the heart starts to beat irregularly, mechanical contraction of the heart muscle does not simply follow electrical excitation waves but exhibits more complex disorganization.
Phys. Rev. X 12, 021051 (2022) – Published 3 June 2022
A similarity metric for neural networks, based on synaptic differences, accurately predicts how novel networks function, thus identifying a key relation between structure and function.
Efe Ilker, Özenç Güngör, Benjamin Kuznets-Speck, Joshua Chiel, Sebastian Deffner, and Michael Hinczewski
Phys. Rev. X 12, 021048 (2022) – Published 31 May 2022
Graph theory provides universal algorithms that can be used to control stochastic biological systems at any scale, from single proteins to the evolution of whole populations of organisms.
Phys. Rev. X 12, 011052 (2022) – Published 18 March 2022
Random walkers (such as particles, cells, or animals) that interact attractively or repulsively with their own paths exhibit memories that aid the exploration of their domains.
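A self-interacting walker of this kind can be sketched with a simple toy rule, which is an assumed illustration rather than the model studied in the paper: the walker weights each candidate step by how often it has already visited the target site, so a repulsive coupling pushes it into unexplored territory.

```python
import math
import random
from collections import Counter

def interacting_walk(steps, beta, seed=1):
    """1D walker whose step weights depend on its own visit history:
    weight(site) = exp(-beta * visits(site)).
    beta > 0 is self-repelling; beta = 0 recovers an ordinary random walk."""
    rng = random.Random(seed)
    visits = Counter({0: 1})  # count the starting site
    x = 0
    for _ in range(steps):
        w_right = math.exp(-beta * visits[x + 1])
        w_left = math.exp(-beta * visits[x - 1])
        x += 1 if rng.random() < w_right / (w_right + w_left) else -1
        visits[x] += 1
    return visits

plain = interacting_walk(5000, beta=0.0)      # memoryless random walk
repelling = interacting_walk(5000, beta=2.0)  # avoids its own path
```

Comparing the two runs, the self-repelling walker visits many more distinct sites in the same number of steps, illustrating how path memory aids exploration of the domain.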
Phys. Rev. X 12, 011044 (2022) – Published 8 March 2022
An analysis of neural network models with biophysically realistic descriptions of synaptic connections shows that irregular neuronal activity—observed in experiments—emerges naturally from interactions between cells.
Thomas J. Elliott, Mile Gu, Andrew J. P. Garner, and Jayne Thompson
Phys. Rev. X 12, 011007 (2022) – Published 11 January 2022
Quantum information processing can provide a significant competitive advantage for any system that must adapt to its environment, an enhancement that scales without bound.