The scientific method, built on observing, formulating a hypothesis, and then running experiments to support or refute that hypothesis, has been the cornerstone of modern science for several centuries. It is used across scientific fields because it provides a structured, logical way to answer questions with empirical evidence, an approach that ushered humanity out of the dark ages and into today’s era of breakthrough discoveries in physics, astronomy and modern medicine.
But are there situations in scientific investigation where the scientific method is not needed? A team of researchers at Princeton University’s Plasma Physics Laboratory (PPPL) is now proposing that this is indeed possible, using a machine learning algorithm that can predict the physical orbits of planets without being based on the laws of physics.
The paper on the work, recently published in Scientific Reports, outlines how the team trained a machine-learning algorithm on data about the known orbits of Mercury, Venus, Earth, Mars, Jupiter, and the dwarf planet Ceres. This machine-learning algorithm, paired with what the team calls a “serving algorithm,” was then used to predict the orbits of other planets in the solar system, including parabolic and hyperbolic escape orbits, without Newton’s laws of motion and gravitation ever being supplied as input. Instead, the approach forms what the team calls a discrete field theory, which models the universe as a kind of “black box.” This means one can go from “data to data,” from training observations directly to predictions, without relying on the intermediary steps offered by the laws of physics.
“It is worthwhile to emphasize that the serving and learning algorithms do not know, learn, or use Newton’s laws of motion and universal gravitation,” wrote the team. “The discrete field theory directly connects the observational data and new predictions. Newton’s laws are not needed.”
In essence, the team’s algorithm can learn the laws of planetary motion, and more broadly the dynamics underlying a physical system, after being trained on only a few examples. It does not need to solve complex differential equations in order to deliver highly accurate predictions.
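To make the “data to data” idea concrete, here is a minimal sketch in Python. It is not the team’s published code: the circular test orbit, the simple linear update rule, and every name in it are assumptions made purely for illustration. A model is fit to consecutive position samples and then rolled forward to extrapolate the orbit, with no gravitational law supplied anywhere.

```python
# Illustrative sketch only (not the PPPL algorithm): learn a one-step update
# rule directly from orbit samples, with no model of gravity anywhere.
import numpy as np

# Synthetic "observations": positions on a circular orbit, sampled at equal
# time steps, standing in for real training data.
steps, dt, omega = 200, 0.1, 0.7
t = np.arange(steps) * dt
orbit = np.stack([np.cos(omega * t), np.sin(omega * t)], axis=1)  # (steps, 2)

# Training pairs: two consecutive positions -> the next position.
X = np.hstack([orbit[:-2], orbit[1:-1]])   # (q_{t-1}, q_t)
Y = orbit[2:]                              # q_{t+1}
W, *_ = np.linalg.lstsq(X, Y, rcond=None)  # fit a linear update map

# Roll the learned map forward from the last two observed points.
q_prev, q_curr = orbit[-2], orbit[-1]
predictions = []
for _ in range(50):
    q_next = np.hstack([q_prev, q_curr]) @ W
    predictions.append(q_next)
    q_prev, q_curr = q_curr, q_next

print(np.round(predictions[:3], 3))  # extrapolated positions, no Newton used
```

The point of the toy example is the workflow: observations in, predictions out, and nothing resembling Newton’s equations in between.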
“The method proposed learns a field theory from a given set of training data consisting of observed values of a physical field at discrete spacetime locations,” said the researchers. They explain:
“The laws of physics are fundamentally expressed in the form of field theories instead of differential equations. It is thus more important to learn the underpinning field theories when possible. Since field theories are in general simpler than the corresponding differential equations, learning field theories is easier, which is true for both human intelligence and artificial intelligence. Except for the fundamental assumption that the observational data are governed by field theories, the learning and serving algorithms proposed do not assume any knowledge of the laws of physics, such as Newton’s law of motion and Schrödinger’s equation [a wave equation that predicts the future behavior of a dynamic system]. This is different from most existing methodologies of machine learning in physics.”
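One way to picture what “learning a field theory” might look like, offered here as a hedged sketch rather than the authors’ algorithm: assume the observations obey some discrete Lagrangian L_d(q_k, q_{k+1}) with unknown parameters, fit those parameters so the discrete Euler-Lagrange equations hold on the training trajectory (the “learning” step), then predict new points by solving the same equations forward in time (the “serving” step). The one-dimensional harmonic test data, the midpoint form of the Lagrangian, and the quadratic potential ansatz below are all assumptions made for illustration only.

```python
# Illustrative sketch only: "learning" fits the parameters of a discrete
# Lagrangian; "serving" predicts the next point by enforcing the discrete
# Euler-Lagrange equation. No differential equation is solved.
import numpy as np

h = 0.1                               # time step of the discrete theory
t = np.arange(400) * h
q = np.cos(2.0 * t)                   # synthetic observations (harmonic motion)

# For L_d(a, b) = h * [((b - a)/h)**2 / 2 - V((a + b)/2)] with the ansatz
# V'(q) = c1 + c2*q, the discrete Euler-Lagrange residual is linear in
# (c1, c2), so the "learning" step reduces to a least-squares fit.
qm, q0, qp = q[:-2], q[1:-1], q[2:]   # q_{k-1}, q_k, q_{k+1}
lhs = (2*q0 - qm - qp) / h            # inertial part of the residual
A = np.stack([h * np.ones_like(q0), (h/4) * (qm + 2*q0 + qp)], axis=1)
c1, c2 = np.linalg.lstsq(A, lhs, rcond=None)[0]

# "Serving": given the last two observed points, solve the discrete
# Euler-Lagrange equation for the next point, step by step.
def next_point(q_prev, q_curr):
    num = (2*q_curr - q_prev)/h - h*c1 - (h*c2/4) * (q_prev + 2*q_curr)
    return num / (1/h + h*c2/4)

q_prev, q_curr, preds = q[-2], q[-1], []
for _ in range(100):
    q_prev, q_curr = q_curr, next_point(q_prev, q_curr)
    preds.append(q_curr)

print(round(float(c2), 3))        # close to 4.0, the squared frequency, learned from data
print(np.round(preds[:3], 3))     # extrapolated positions
```

The prediction step only enforces the learned discrete relation between three consecutive samples, which is the sense in which the field theory, not a differential equation, carries the physics.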
Most of us know machine learning algorithms from recommendation engines, facial recognition, and natural language processing applications. In physics, however, they are typically used to model complex processes such as plasma disruptions in magnetic fusion devices or the dynamic motion of fluids. In this work, the Princeton team’s algorithm skips that interim step entirely: it is never explicitly programmed with the conventions of physics.
“The algorithms developed are robust against variations of the governing laws of physics because the method does not require any knowledge of the laws of physics other than the fundamental assumption that the governing laws are field theories,” said the team. “When the effects of special relativity or general relativity are important, the algorithms are expected to be valid as well.”
Living in a Computer Simulation?
The researchers’ approach was inspired in part by Oxford philosopher Nick Bostrom’s thought experiment that the universe is actually a computer simulation. Bostrom’s simulation hypothesis posits that humanity may one day reach a highly technological “posthuman” stage in which enormous computing power is available, and that we may in fact be living in an “ancestor simulation” run by such posthuman descendants, roughly analogous to someone turning on a console to play a video game.
“The simulation hypothesis states that the physical universe is a computer simulation, and it is being carefully examined by physicists as a possible reality,” said the team. “If the hypothesis is true, then the spacetime is necessarily discrete. So are the field theories in physics. It is then reasonable, at least from a theoretical point of view, to suggest that some machine learning and serving algorithms of discrete field theories are what the discrete universe, i.e., the computer simulation, runs to minimize the actions.”
While we may not know anytime soon whether we really do live in a simulation, the team indicated that its novel approach will likely have interesting applications in predicting how plasma particles behave in fusion energy experiments. Fusion occurs when two or more atomic nuclei combine to form one or more different atomic nuclei and subatomic particles, a process that releases a massive amount of energy. Nuclear fusion is the main source of energy for stars, and if scientists can accurately predict how fusion processes unfold, it could be one way to harness fusion as a near-limitless source of clean energy. Currently, the team is working on improving the accuracy of the machine learning model by reducing noise, for example by using a noise-canceling signal or employing generative machine learning models.
Read more in the team’s paper.
Images: Photo by Karlis Reimanis via Unsplash