Exact Renormalization Groups As a Form of Entropic Dynamics

Entropy
2018
Vol 20 (1)
pp. 25

The Renormalization Group (RG) is a set of methods that have been instrumental in tackling problems involving an infinite number of degrees of freedom, such as those arising in quantum field theory and critical phenomena. What all these methods have in common, and what explains their success, is that they allow a systematic search for those degrees of freedom that happen to be relevant to the phenomena in question. In the standard approaches, the RG transformations are implemented either by coarse graining or by a change of variables. When these transformations are infinitesimal, the formalism can be described as a continuous dynamical flow in a fictitious time parameter, and these exact RG equations generally take the form of functional diffusion equations. In this paper we show that the exact RG equations can be derived using entropic methods. The RG flow is then described as a form of entropic dynamics of field configurations. Although equivalent to other versions of the RG, in this approach the RG transformations receive a purely inferential interpretation that establishes a clear link to information theory.
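The statement that exact RG equations are functional diffusion equations can be made concrete with a standard example from the ERG literature (shown here for orientation only; signs and conventions vary by author). Polchinski's equation for the interacting part of the Wilsonian action reads

    \frac{\partial S_{\mathrm{int}}[\phi]}{\partial t}
      = \frac{1}{2}\int_q \dot{C}_t(q)
        \left[
          \frac{\delta^2 S_{\mathrm{int}}}{\delta\phi(q)\,\delta\phi(-q)}
          - \frac{\delta S_{\mathrm{int}}}{\delta\phi(q)}\,
            \frac{\delta S_{\mathrm{int}}}{\delta\phi(-q)}
        \right],

where t = ln(Λ₀/Λ) is the flow parameter (the fictitious time mentioned above) and Ċ_t is the scale derivative of the cutoff propagator; the second functional derivative supplies the diffusion term.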

2019
Vol 28 (14)
pp. 1944006
Author(s):
ChunJun Cao
Aidan Chatwin-Davies
Ashmeet Singh

According to the holographic bound, there is only a finite density of degrees of freedom in space when gravity is taken into account. Conventional quantum field theory does not conform to this bound, since in this framework, infinitely many degrees of freedom may be localized to any given region of space. In this paper, we explore the viewpoint that quantum field theory may emerge from an underlying theory that is locally finite-dimensional, and we construct a locally finite-dimensional version of a Klein–Gordon scalar field using generalized Clifford algebras. Demanding that the finite-dimensional field operators obey a suitable version of the canonical commutation relations makes this construction essentially unique. We then find that enforcing local finite dimensionality in a holographically consistent way leads to a huge suppression of the quantum contribution to vacuum energy, to the point that the theoretical prediction becomes plausibly consistent with observations.
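The generalized Clifford algebras used in this construction are generated by finite-dimensional clock and shift operators. A minimal sketch (our illustration, not code from the paper) verifying their Weyl-type commutation relation, the finite-dimensional counterpart of the canonical commutation relations:

    import numpy as np

    d = 5  # local Hilbert-space dimension (any integer >= 2)
    omega = np.exp(2j * np.pi / d)  # primitive d-th root of unity

    # Shift and clock generators: X|k> = |k+1 mod d>, Z|k> = omega^k |k>.
    X = np.roll(np.eye(d), 1, axis=0)
    Z = np.diag(omega ** np.arange(d))

    # Weyl commutation relation of the generalized Clifford algebra:
    assert np.allclose(Z @ X, omega * (X @ Z))

Roughly speaking, in a suitable d → ∞ limit such operators recover a conjugate position-momentum pair, which is the sense in which the finite-dimensional construction deforms, rather than discards, the canonical commutation relations.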


Author(s):  
Xiaoyong Cao
Pu Tian

Molecular modeling is widely utilized in subjects including, but not limited to, physics, chemistry, biology, materials science and engineering. Impressive progress has been made in the development of theories, algorithms and software packages. Dividing and conquering, and caching intermediate results, have been long-standing principles in algorithm development; not surprisingly, most of the important methodological advances in more than half a century of molecular modeling are implementations of these two fundamental principles. In mainstream classical computational molecular science, which is based on force-field parameterization, tremendous effort has been invested in two lines of algorithm development. The first is coarse graining, which represents multiple basic particles of a higher-resolution model as a single larger and softer particle in the lower-resolution counterpart, yielding force fields that are partially transferable at the expense of some information loss. The second is enhanced sampling, which realizes "dividing and conquering" and/or "caching" in configurational space, focusing either on reaction coordinates and collective variables, as in metadynamics and related algorithms, or on state discretization and the transition matrix, as in Markov state models. In this line of algorithms, spatial resolution is maintained but no transferability is available. Deep learning has been utilized to realize more efficient and accurate ways of "dividing and conquering" and "caching" along both lines of algorithmic research. We proposed and demonstrated the local free energy landscape approach, a new framework for classical computational molecular science and a third class of algorithms, which facilitates molecular modeling through partially transferable, resolution-preserving "caching" of distributions for local clusters of molecular degrees of freedom. Differences, connections and potential interactions among these three algorithmic directions are discussed, with the hope of stimulating the development of more elegant, efficient and reliable formulations and algorithms for "dividing and conquering" and "caching" in complex molecular systems.
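For instance, the "caching" performed by a Markov state model consists of discretizing configuration space into states and storing transition statistics at a chosen lag time. A minimal sketch of this estimation step (our illustration, not code from the paper):

    import numpy as np

    def msm_transition_matrix(dtraj, n_states, lag=1):
        """Estimate a Markov state model transition matrix from a
        discretized trajectory (a sequence of integer state labels)."""
        counts = np.zeros((n_states, n_states))
        for t in range(len(dtraj) - lag):
            counts[dtraj[t], dtraj[t + lag]] += 1.0
        # Row-normalize; rows with no observed counts get a uniform row.
        rows = counts.sum(axis=1, keepdims=True)
        return np.where(rows > 0, counts / np.maximum(rows, 1e-12), 1.0 / n_states)

    # Toy usage: a three-state trajectory.
    dtraj = [0, 0, 1, 2, 2, 1, 0, 0, 1, 2]
    T = msm_transition_matrix(dtraj, n_states=3)
    print(T)  # each row sums to 1: the cached transition statistics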


2020
pp. 289-318
Author(s):  
Giuseppe Mussardo

Chapter 8 introduces the key ideas of the renormalization group, including how they provide a theoretical scheme and a proper language for facing critical phenomena. It covers the scaling transformations of a system and their implementation in the space of coupling constants, together with the accompanying reduction of the degrees of freedom. From this analysis, the reader is led to the important notion of relevant, irrelevant and marginal operators and then to the universality of critical phenomena. The chapter also covers RG transformation laws, effective Hamiltonians, the Gaussian model, the Ising model, operators of quantum field theory, universal ratios, critical exponents and β-functions.
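For orientation, the textbook criterion behind relevant, irrelevant and marginal operators (standard definitions, not a quotation from the chapter): linearized near a fixed point, a coupling g_i with RG eigenvalue y_i transforms under a rescaling of lengths by a factor b as

    g_i \;\longmapsto\; b^{\,y_i} g_i,
    \qquad
    \begin{cases}
      y_i > 0 & \text{relevant,}\\
      y_i = 0 & \text{marginal,}\\
      y_i < 0 & \text{irrelevant,}
    \end{cases}

and the continuum version of this flow is summarized by the β-functions, β_i(g) = dg_i / d\ln b.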


2020
Vol 29 (14)
pp. 2043012
Author(s):  
Tejinder P. Singh

We start from classical general relativity coupled to matter fields. Each configuration variable and its conjugate momentum, as well as the spacetime points themselves, are raised to the status of matrices (equivalently, operators). These matrices obey a deterministic Lagrangian dynamics at the Planck scale. By coarse-graining this matrix dynamics over time intervals much larger than the Planck time, one derives quantum theory as a low-energy emergent approximation. If a sufficiently large number of degrees of freedom become entangled, spontaneous localisation takes place, leading to the emergence of classical spacetime geometry and a classical universe. In our theory, dark energy is shown to be a large-scale quantum gravitational phenomenon. Quantum indeterminism is not fundamental, but results from our not probing physics at the Planck scale.


2013
Vol 28 (17)
pp. 1330023
Author(s):
Marco Benini
Claudio Dappiaggi
Thomas-Paul Hack

The goal of this paper is to introduce the algebraic approach to quantum field theory on curved backgrounds. Based on a set of axioms first written down by Haag and Kastler, this method consists of a two-step procedure. In the first step, one assigns to a physical system a suitable algebra of observables, meant to encode all algebraic relations among the observables, such as commutation relations. In the second step, one selects an algebraic state in order to recover the standard Hilbert space interpretation of a quantum system. As quantum field theories possess infinitely many degrees of freedom, many unitarily inequivalent Hilbert space representations exist, and the power of this approach lies in its ability to treat them all in a coherent manner. We discuss in detail the algebraic approach for free fields in order to give the reader all the information necessary to deal with the recent literature, which focuses on applications to specific problems, mostly in cosmology.
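As a concrete instance of the first step (a standard formula of the algebraic approach for the free Klein-Gordon field, not quoted from the paper), the algebra is generated by smeared fields φ(f) subject to

    [\phi(f), \phi(g)] = i\,E(f,g)\,\mathbb{1},
    \qquad
    E \;:=\; E^{\mathrm{adv}} - E^{\mathrm{ret}},

where E is the causal propagator of the Klein-Gordon operator and f, g are test functions. The second step then picks a state, a positive normalized linear functional ω on this algebra, and the GNS construction turns the pair (algebra, ω) into a concrete Hilbert space representation.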


2020
Vol 117 (39)
pp. 24061-24068
Author(s):
Thomas T. Foley
Katherine M. Kidder
M. Scott Shell
W. G. Noid

The success of any physical model critically depends upon adopting an appropriate representation for the phenomenon of interest. Unfortunately, it remains generally challenging to identify the essential degrees of freedom or, equivalently, the proper order parameters for describing complex phenomena. Here we develop a statistical physics framework for exploring and quantitatively characterizing the space of order parameters for representing physical systems. Specifically, we examine the space of low-resolution representations that correspond to particle-based coarse-grained (CG) models for a simple microscopic model of protein fluctuations. We employ Monte Carlo (MC) methods to sample this space and determine the density of states for CG representations as a function of their ability to preserve the configurational information, I, and large-scale fluctuations, Q, of the microscopic model. These two metrics are uncorrelated in high-resolution representations but become anticorrelated at lower resolutions. Moreover, our MC simulations suggest an emergent length scale for coarse-graining proteins, as well as a qualitative distinction between good and bad representations of proteins. Finally, we relate our work to recent approaches for clustering graphs and detecting communities in networks.
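Schematically, such an exploration can be run as a Metropolis Monte Carlo walk over which microscopic sites are retained in the CG representation. The sketch below is our hypothetical illustration only: the function score is a toy stand-in for the paper's information (I) and fluctuation (Q) metrics.

    import numpy as np

    rng = np.random.default_rng(0)
    n_atoms, n_cg, beta = 50, 10, 2.0  # sites, retained CG sites, search "temperature"

    def score(mask):
        # Toy representation-quality metric: prefer evenly spread CG sites.
        idx = np.flatnonzero(mask)
        return -np.var(np.diff(idx))

    # Random initial representation: a boolean mask selecting n_cg sites.
    mask = np.zeros(n_atoms, dtype=bool)
    mask[rng.choice(n_atoms, n_cg, replace=False)] = True
    s = score(mask)

    for _ in range(5000):
        # Propose swapping one retained site with one discarded site.
        on = rng.choice(np.flatnonzero(mask))
        off = rng.choice(np.flatnonzero(~mask))
        trial = mask.copy()
        trial[on], trial[off] = False, True
        s_new = score(trial)
        if rng.random() < np.exp(min(0.0, beta * (s_new - s))):  # Metropolis rule
            mask, s = trial, s_new

    print(np.flatnonzero(mask))  # indices of a locally good CG representation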


2019
Vol 36 (20)
pp. 205013
Author(s):
Selman Ipek
Mohammad Abedi
Ariel Caticha

2007
Vol 98 (26)
Author(s):
H. Bock
K. E. Gubbins
S. H. L. Klapp
