Every Crucial Equation in Math and Physics

Seventeen Equations That Built the Modern World

The Pythagorean Theorem

The Pythagorean theorem is a very important idea in geometry. It helps us understand the relationship between the sides of a right triangle, which is a triangle with a 90° angle. If you know the lengths of the two shorter sides, then you can use the formula a² + b² = c² to find the length of the longest side. Or, if you know the length of the longest side and one of the shorter sides, you can use the formula to find the length of the remaining shorter side.

However, there’s one important caveat. The Pythagorean theorem only works on flat surfaces, like a piece of paper. If you draw a right triangle on a curved surface like a ball, the theorem no longer holds exactly. This is because the Pythagorean theorem is a rule of flat, or Euclidean, geometry.
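Both uses of the formula can be sketched in a few lines of Python; the 3-4-5 and 5-12-13 triangles below are just classic illustrative examples:

```python
import math

def hypotenuse(a, b):
    # c² = a² + b²  →  c = √(a² + b²)
    return math.sqrt(a * a + b * b)

def missing_leg(c, a):
    # b² = c² − a²  →  b = √(c² − a²)
    return math.sqrt(c * c - a * a)

print(hypotenuse(3, 4))    # 5.0 — the 3-4-5 right triangle
print(missing_leg(13, 5))  # 12.0 — the 5-12-13 right triangle
```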

Logarithms

Logarithms are a way to make multiplication easier. They are the opposite of exponents. An exponent tells you how many times to multiply a number by itself. For example, 2³ means 2 × 2 × 2, which equals 8. Or 10², which means 10 × 10 = 100. A logarithm tells you what exponent you need to make a certain number. For example, the logarithm base 2 of 8 is 3, because 2 raised to the power of 3 equals 8. And the logarithm base 10 of 100 is 2, because 10 raised to the power of 2 equals 100.

The base of the logarithm is the number that gets raised to the exponent. The most common bases are 10, 2, and the mathematical constant e. One very useful property of logarithms is that they turn multiplication into addition. For example, log(2 × 3) is equal to log(2) + log(3). Before calculators and computers, people used logarithm tables to quickly multiply large numbers. This made calculations much faster in science and engineering.
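A short Python sketch shows both ideas: logarithms undoing exponents, and multiplication becoming addition in “log space” (the two numbers multiplied below are arbitrary examples):

```python
import math

print(math.log2(8))     # 3.0, because 2³ = 8
print(math.log10(100))  # 2.0, because 10² = 100

# log(a × b) = log(a) + log(b): multiply by adding —
# the trick behind old logarithm tables.
a, b = 1234.0, 5678.0
log_sum = math.log10(a) + math.log10(b)
product = 10 ** log_sum  # convert back from log space
print(product)           # ≈ 7006652.0, the same as a * b
```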

Calculus

Calculus is a branch of mathematics that helps us understand how things change over time. It has two main parts: derivatives and integrals.

A derivative measures the rate of change of a quantity. For example, if you are walking at a speed of 3 mph, the derivative of your position is 3 mph. This tells you how quickly your position is changing. The formula for a derivative says that the derivative of a function f(x) is the limit of the change in f(x) divided by the change in x, as the change in x gets smaller and smaller.

Integrals are the opposite of derivatives. They measure the total change in a quantity over a period of time. For example, if you know your speed over time, you can use an integral to calculate the total distance that you traveled. Together, derivatives and integrals form the foundation of calculus. Calculus is incredibly useful in science and engineering because it allows us to understand and quantify how things change. This is essential for studying motion, growth, rates of change, and much more.
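Both ideas can be approximated numerically in a few lines of Python; this is a minimal sketch using a small step size for the derivative and a midpoint sum for the integral, with f(x) = x² as an arbitrary test function:

```python
def derivative(f, x, h=1e-6):
    # Rate of change: Δf / Δx as Δx gets very small
    return (f(x + h) - f(x - h)) / (2 * h)

def integral(f, a, b, n=100_000):
    # Total accumulated change: the sum of many thin rectangles
    dx = (b - a) / n
    return sum(f(a + (i + 0.5) * dx) for i in range(n)) * dx

square = lambda x: x * x
print(derivative(square, 3.0))       # ≈ 6.0, since d/dx x² = 2x
print(integral(square, 0.0, 1.0))    # ≈ 0.333…, the exact answer is 1/3
```

Note that the integral of the derivative recovers the total change, which is exactly the “opposites” relationship the text describes.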

The Law of Gravity

Isaac Newton discovered that there is a force called gravity that pulls objects towards each other. He figured out a way to calculate how strong this force is between any two objects. Newton’s law of gravity states that F = G(m₁m₂)/r², where F is the force of gravity between the two objects, G is a constant number called the gravitational constant, m₁ and m₂ are the masses of the two objects, and r is the distance between the centers of the two objects.

So the force of gravity depends on two things: the masses of the two objects (the more massive they are, the stronger the force) and the distance between them (the force gets weaker as the distance increases). This law explains why planets orbit the Sun and why objects fall to the ground on Earth. It works everywhere in the universe.
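The formula translates directly into code. As a rough illustration, here it is applied to approximate textbook values for the Earth and Moon (the masses and distance below are standard approximations, not exact figures):

```python
G = 6.674e-11  # gravitational constant, N·m²/kg²

def gravity(m1, m2, r):
    # Newton's law: F = G · m1 · m2 / r²
    return G * m1 * m2 / (r * r)

# Approximate Earth and Moon masses (kg) and their separation (m)
earth, moon, distance = 5.972e24, 7.348e22, 3.844e8
print(gravity(earth, moon, distance))  # ≈ 2 × 10²⁰ newtons
```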

Newton’s law was remarkably accurate for over 200 years, but in the early 1900s Albert Einstein developed a more complete theory called general relativity, which describes gravity even more precisely.

The Square Root of −1

In math we keep expanding the types of numbers we use. We start with the simple whole numbers: 1, 2, and 3. Then we add negative numbers like −1, −2, and −3. After that we add fractions, decimals, and irrational numbers like √2 to get the real numbers. But there’s one more important type of number: the imaginary number i.

The number i is defined as the square root of −1. This means if you multiply i by itself, you get −1, so i² = −1. This may seem strange at first, but it turns out to be very useful in math and science. When you combine real numbers and imaginary numbers, you get the complex numbers.

Complex numbers have some special properties. Any polynomial equation can be solved using complex numbers. Calculus and algebra work perfectly with complex numbers. And complex numbers reveal hidden symmetries and patterns. These properties make complex numbers essential in fields like electronics and signal processing. They help us understand and work with things like waves, circuits, and digital signals.
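Python has complex numbers built in (it writes the imaginary unit as j rather than i), so these properties are easy to check directly; the numbers below are arbitrary examples:

```python
# Python's complex type: 1j is the imaginary unit i
i = 1j
print(i * i)        # (-1+0j): i² = −1

# Complex numbers solve equations the reals cannot:
# x² + 1 = 0 has the root i
print(i ** 2 + 1)   # 0j

# Arithmetic just mixes real and imaginary parts
z = (2 + 3j) * (1 - 1j)
print(z)            # (5+1j)
```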

Euler’s Polyhedra Formula

A polyhedron is a 3D shape made up of flat polygons. The corners of a polyhedron are called vertices, the lines connecting the vertices are called edges, and the polygons covering it are called faces. For example, a cube has 8 vertices, 12 edges, and 6 faces.

If you add the number of vertices and faces together and subtract the number of edges, you always get 2. This is true for any convex polyhedron, no matter how many faces it has. The formula is V + F − E = 2, where V is the number of vertices, F is the number of faces, and E is the number of edges.
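The formula is easy to verify by hand or in code. The cube’s counts come from the text above; the counts for the other solids are the standard values:

```python
def euler_characteristic(v, e, f):
    # V + F − E, which is 2 for any convex polyhedron
    return v + f - e

# (vertices, edges, faces) for some familiar solids
solids = {
    "tetrahedron":  (4, 6, 4),
    "cube":         (8, 12, 6),
    "octahedron":   (6, 12, 8),
    "dodecahedron": (20, 30, 12),
}
for name, (v, e, f) in solids.items():
    print(name, euler_characteristic(v, e, f))  # always prints 2
```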

This surprising relationship was discovered by Leonhard Euler. It’s an example of a topological invariant, a number or property that stays the same for a whole class of similar shapes. Euler’s polyhedra formula was an important step in the development of topology, a branch of math that studies the properties of shapes that don’t change when the shape is stretched or bent, as long as it’s not torn or glued together. Topology has many applications in modern physics, like in the study of spacetime and quantum mechanics. So the simple formula of polyhedra turned out to be the start of some very deep and powerful mathematics.

The Fourier Transform

The Fourier transform is a mathematical tool that helps us understand and work with waves. For example, imagine a recording of somebody talking. The sound wave looks like a messy, complicated pattern. But using the Fourier transform, we can analyze this wave and express it as a combination of many pure sine waves of different frequencies. This is incredibly useful because it simplifies the analysis of complex wave patterns, allows us to focus on the individual frequency components, and forms the basis of modern signal processing and analysis.

The Fourier transform is used in many fields: audio and speech processing, compression formats like MP3 and JPEG, analyzing vibrations and oscillations, and quantum mechanics and quantum computing.
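The idea of pulling pure frequencies out of a messy signal can be sketched with a small discrete Fourier transform written from scratch (real software uses the much faster FFT; the 3 Hz and 7 Hz mixture below is an arbitrary test signal):

```python
import cmath, math

def dft(samples):
    # Discrete Fourier transform: measures how strongly the signal
    # matches a pure wave at each whole-number frequency k
    n = len(samples)
    return [sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
            for k in range(n)]

# A "messy" signal that is secretly two sine waves, at 3 and 7 cycles
n = 64
signal = [math.sin(2 * math.pi * 3 * t / n)
          + 0.5 * math.sin(2 * math.pi * 7 * t / n)
          for t in range(n)]

spectrum = [abs(c) for c in dft(signal)]
peaks = sorted(range(n // 2), key=lambda k: -spectrum[k])[:2]
print(sorted(peaks))  # [3, 7] — the hidden frequencies recovered
```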

The Wave Equation

The wave equation is a mathematical formula that describes how waves behave. It shows how the shape of a wave changes over time. The equation is ∂²u/∂t² = c² ∂²u/∂x², where u is the wave function (or the shape of the wave), t is time, x is position, and c is the speed of the wave. It says that the acceleration of the wave shape over time is equal to the wave speed squared times the curvature of the wave shape along its length. In other words, it shows how the wave shape at one moment determines how it will change in the next moment.

The wave equation applies to many different types of waves: sound waves, water waves, light waves, and even vibrations in solids. Solving the wave equation helps us understand and predict the behavior of these waves. It was one of the first differential equations ever studied, and the techniques developed to solve it paved the way for understanding other differential equations as well.
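One standard way to solve the wave equation on a computer is to replace the derivatives with small finite differences and step forward in time. This is a minimal sketch of that scheme, with an arbitrary initial “pluck” in the middle of a string fixed at both ends:

```python
# 1D wave equation ∂²u/∂t² = c² ∂²u/∂x², stepped with finite differences
n, c, dx, dt = 100, 1.0, 1.0, 0.5   # grid size, wave speed, step sizes
u = [0.0] * n
u[n // 2] = 1.0       # a small initial disturbance in the middle
u_prev = u[:]         # the string starts at rest

for _ in range(30):   # advance 30 time steps
    u_next = [0.0] * n
    for x in range(1, n - 1):
        # Discrete wave equation: acceleration = c² × curvature
        u_next[x] = (2 * u[x] - u_prev[x]
                     + (c * dt / dx) ** 2 * (u[x - 1] - 2 * u[x] + u[x + 1]))
    u_prev, u = u, u_next

# The initial bump has split and spread outward as traveling pulses
print(max(u), u[n // 2])
```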

Maxwell’s Equations

Maxwell’s equations are a set of four mathematical equations that describe the relationship between electricity and magnetism. These equations are the foundation of our understanding of classical electromagnetism, the study of electric and magnetic fields and how they interact.

The four equations are Gauss’s law for electric fields, Gauss’s law for magnetic fields, Faraday’s law of induction, and Ampère’s law with Maxwell’s correction. These equations show how electric and magnetic fields are created and how they change over time. They explain phenomena like how electric charges create electric fields, how changing magnetic fields create electric fields, how electric currents create magnetic fields, and how changing electric fields create magnetic fields.

Maxwell’s equations are as important to electromagnetism as Newton’s laws are to classical mechanics. They provide a comprehensive mathematical framework for understanding the behavior of electric and magnetic fields.

The Second Law of Thermodynamics

The second law of thermodynamics says that in a closed system, disorder, or entropy, never decreases over time. Entropy is a measure of disorder. For example, heat naturally flows from hot to cold regions, increasing disorder. This one-way nature of the second law is unique in physics. Most fundamental laws work the same forward and backward in time, but the second law runs in only one direction. An ice cube melting in coffee, but never the coffee freezing back into an ice cube, is a good example.

The second law has profound implications, explaining why perpetual motion machines are impossible and why the universe’s disorder increases over time. It’s a fundamental principle of thermodynamics.

The Normal Distribution

The normal distribution, or bell curve, is a common statistical model. It has a distinctive symmetrical bell-shaped graph. The normal distribution is defined by the mean and standard deviation. About 68% of data falls within one standard deviation of the mean, and 95% within two.

This bell-shaped pattern appears in many real-world phenomena, from test scores to heights to measurement errors. That’s why the normal distribution is so widely used in statistics. It’s a powerful tool for understanding the behavior of large groups of independent processes. The normal distribution’s predictable shape makes it a fundamental concept in data analysis.
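The 68% and 95% rules can be checked empirically by drawing random samples; the mean of 100 and standard deviation of 15 below are just an illustrative test-score-style scale:

```python
import random

random.seed(0)                  # fixed seed so the run is repeatable
mean, sd = 100.0, 15.0          # an illustrative score scale
samples = [random.gauss(mean, sd) for _ in range(100_000)]

within_1sd = sum(mean - sd <= x <= mean + sd for x in samples) / len(samples)
within_2sd = sum(mean - 2*sd <= x <= mean + 2*sd for x in samples) / len(samples)
print(round(within_1sd, 2))  # ≈ 0.68
print(round(within_2sd, 2))  # ≈ 0.95
```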

Relativity

Albert Einstein changed physics forever with his theories of relativity. Special relativity showed that matter and energy are equivalent (E = mc²), the speed of light is a universal speed limit, and time passes differently for people moving at different speeds. General relativity showed that gravity is caused by the curvature of space and time. This was a major change from Newton’s law of gravity.

Special relativity says that the laws of physics are the same for everyone, no matter how fast they are moving. General relativity describes gravity as a warping of space and time rather than just a force between objects. Relativity is a cornerstone of modern physics, and while the math behind relativity is complex, the core ideas are actually quite simple.
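Two of special relativity’s headline results take only a few lines to compute: the energy locked in ordinary matter via E = mc², and the time-dilation factor γ = 1 / √(1 − v²/c²) (the 80%-of-light-speed example below is arbitrary):

```python
import math

c = 299_792_458.0  # speed of light, m/s

# E = mc²: the energy equivalent of 1 kilogram of matter
print(1.0 * c ** 2)  # ≈ 9 × 10¹⁶ joules

def lorentz_factor(v):
    # Time-dilation factor γ = 1 / √(1 − v²/c²)
    return 1.0 / math.sqrt(1.0 - (v / c) ** 2)

# A clock moving at 80% of light speed runs slow by γ = 5/3
print(lorentz_factor(0.8 * c))  # ≈ 1.667
```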

The Schrödinger Equation

The Schrödinger equation is a foundation of quantum mechanics. While general relativity explains our universe at its largest scales, this equation governs the behavior of atoms and subatomic particles. The Schrödinger equation allows us to calculate a particle’s wave function, which represents its quantum state and behavior. This wave function gives probabilities, not definite values. This probabilistic, wavelike nature of quantum mechanics differs greatly from classical physics. The Schrödinger equation is the mathematical basis for understanding this strange quantum realm.

Heisenberg’s Uncertainty Principle

Heisenberg’s uncertainty principle states that there are limits to how precisely we can know certain pairs of properties of a particle, like its position and momentum. The more precisely we measure one property, the more uncertain we become about the other. This is represented mathematically as Δx · Δp ≥ ℏ/2, where Δx is the uncertainty in the particle’s position, Δp is the uncertainty in its momentum, and ℏ is the reduced Planck constant.
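Rearranging the inequality gives the smallest possible momentum uncertainty for a given position uncertainty, Δp ≥ ℏ/(2·Δx). The atom-sized confinement below is an illustrative example:

```python
HBAR = 1.054571817e-34  # reduced Planck constant, J·s

def min_momentum_uncertainty(dx):
    # Δp ≥ ℏ / (2·Δx): pinning down position spreads out momentum
    return HBAR / (2 * dx)

# Confine a particle to an atom-sized region (~1e-10 m)
dp = min_momentum_uncertainty(1e-10)
print(dp)  # ≈ 5 × 10⁻²⁵ kg·m/s — large on an electron's scale
```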

The uncertainty principle is a fundamental feature of quantum mechanics with profound implications for our understanding of the microscopic world.

The Navier-Stokes Equations

The Navier-Stokes equations describe the behavior of flowing fluids: water moving through a pipe, air flowing over an airplane wing, or smoke rising from a cigarette. While approximate numerical solutions let computers simulate fluid motion fairly well, it is still an open question, with a million-dollar prize attached, whether smooth, well-behaved solutions of the equations always exist.

Information Theory

Shannon information entropy, like thermodynamic entropy, is a measure of disorder. In this case, it measures the information content of a message, a book, a JPEG picture sent on the internet, or anything that can be represented symbolically. The Shannon entropy of a message sets a lower bound on the size to which that message can be compressed without losing any of its content. Shannon’s entropy measure launched the mathematical study of information, and its results are central to how we communicate over networks today.
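Shannon’s formula, H = −Σ p·log₂(p), averages the “surprise” of each symbol in a message. A minimal implementation makes the intuition concrete (the example strings are arbitrary):

```python
import math
from collections import Counter

def shannon_entropy(message):
    # H = −Σ p·log₂(p): average bits of information per symbol
    counts = Counter(message)
    total = len(message)
    return -sum((c / total) * math.log2(c / total)
                for c in counts.values())

print(shannon_entropy("abab"))      # 1.0 — one bit per symbol
print(shannon_entropy("abcdefgh"))  # 3.0 — eight equally likely symbols
# A predictable message like "aaaa" has entropy 0: no surprise at all
```

Higher entropy means more surprise per symbol, and therefore less room for compression.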

The Black-Scholes Equation

The Black-Scholes equation describes how finance experts and traders find prices for derivatives. Derivatives are financial products based on some underlying asset, like a stock, and they’re a major part of the modern financial system. The Black-Scholes equation allows financial professionals to calculate the value of these financial products based on the properties of the derivative and the underlying asset. The equation’s widespread use has transformed the financial industry. It facilitates the derivatives market and is essential for managing financial risk today.
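For a European call option, the Black-Scholes equation has a well-known closed-form solution, which can be sketched with just the standard library. The option parameters below are an arbitrary illustrative trade, not real market data:

```python
import math

def norm_cdf(x):
    # Standard normal cumulative distribution, via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def black_scholes_call(s, k, t, r, sigma):
    """Black-Scholes price of a European call option.
    s: stock price, k: strike, t: years to expiry,
    r: risk-free rate, sigma: volatility."""
    d1 = (math.log(s / k) + (r + sigma ** 2 / 2) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    return s * norm_cdf(d1) - k * math.exp(-r * t) * norm_cdf(d2)

# A one-year at-the-money call on a $100 stock,
# 20% volatility, 5% risk-free rate
price = black_scholes_call(100, 100, 1.0, 0.05, 0.20)
print(round(price, 2))  # ≈ 10.45
```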

Chaos Theory

The logistic map demonstrates chaos theory, showing how small changes in a system can lead to dramatically different outcomes over time. The map shows how a quantity x changes based on its current value and a constant k. For certain k values, tiny differences in the initial x cause the process to evolve very differently. This sensitivity to initial conditions is seen in complex systems like weather, where a small atmospheric change can trigger radically different patterns days later (the butterfly effect). Chaos theory reveals how simple rules generate intricate, unpredictable behavior, and it has applications across physics, biology, economics, and more.
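The butterfly effect is easy to see by iterating the logistic map, x → k·x·(1 − x), from two starting points that differ only in the tenth decimal place (k = 4 puts the map in its chaotic regime; the starting value 0.2 is arbitrary):

```python
def logistic_orbit(x, k=4.0, steps=50):
    # Iterate the logistic map: x_next = k · x · (1 − x)
    orbit = []
    for _ in range(steps):
        x = k * x * (1 - x)
        orbit.append(x)
    return orbit

a = logistic_orbit(0.2)
b = logistic_orbit(0.2 + 1e-10)  # differs in the 10th decimal place
gap = [abs(p - q) for p, q in zip(a, b)]

# The gap starts microscopic and grows by many orders of magnitude
print(gap[0], max(gap))
```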

