Are all equations just a form of addition+subtraction?

1. Like other people mentioned, math is not just about numbers, so that won't work for fields of math without addition.
2. If you want to reduce equations dealing with irrational numbers down to additions/subtractions, then you'll have to deal with approximations or infinite decimals/infinite sums which, misleadingly, are not really sums. An infinite sum is usually defined as the limit of the sequence of partial sums, and I don't think limits can be reduced to addition (since you now have logical quantifiers and sequences).
3. If you want to go deeper, addition itself is just a form of repeated "successor". For example, 3 + 2 is just the successor of the successor of 3.

However, this train of thought is very interesting in my opinion and we can follow it while sticking to the natural numbers.

A particular class of functions that I think you would find interesting is the primitive recursive functions, which are:

* constant functions (always returns the same thing)
* the successor function (returns the number that comes after)
* projections (returns one of the inputs; for example, f(x, y) = y is a projection)
* compositions of primitive recursive functions (for example, x + (y \* z): a multiplication followed by an addition)
* functions defined using "primitive" recursion. By "primitive" I mean that the function has to be defined for 0, and then the function for n can only be defined in terms of the function for n-1. For example, multiplication is usually defined by primitive recursion: x \* 0 = 0 and x \* y = (x \* (y-1)) + x. "Times 0" is defined directly, and "times n" is defined in terms of "times n-1".
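To make this concrete, here is a small Python sketch (my own illustration, not part of the original discussion) that builds addition and multiplication out of nothing but the successor function and primitive recursion, following the definitions above:

```python
def succ(n):
    # The successor function: the only "real" arithmetic we allow ourselves.
    return n + 1

def add(x, y):
    # Primitive recursion on y:
    #   add(x, 0) = x
    #   add(x, y) = succ(add(x, y - 1))
    return x if y == 0 else succ(add(x, y - 1))

def mul(x, y):
    # Primitive recursion on y, using add as a previously defined function:
    #   mul(x, 0) = 0
    #   mul(x, y) = add(mul(x, y - 1), x)
    return 0 if y == 0 else add(mul(x, y - 1), x)
```

For example, `add(3, 2)` really is the successor of the successor of 3, and `mul(4, 3)` unfolds into three nested additions.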

This is one possible definition of the class of functions that "boil down to" the successor function. If that's what you had in mind, then you can reword your question as, "Are all functions just primitive recursive functions?". Since this class of functions has a name, you might have guessed that the answer is no.

Another class of functions called general recursive functions adds another way to specify a function, the "unbounded search operator" which returns the smallest natural number that satisfies a condition. For example, "the smallest prime bigger than x" would be a general recursive function.
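The unbounded search operator (often written μ) is easy to sketch in Python; the helper names `mu`, `is_prime`, and `next_prime` are my own choices for illustration:

```python
def is_prime(n):
    # Trial division; enough for a small demonstration.
    if n < 2:
        return False
    return all(n % d for d in range(2, int(n ** 0.5) + 1))

def mu(predicate):
    # Unbounded search: return the smallest natural number satisfying
    # the predicate. Note the loop has no upper bound -- if no such
    # number exists, this never terminates. That is exactly what makes
    # it "unbounded", unlike primitive recursion.
    n = 0
    while not predicate(n):
        n += 1
    return n

def next_prime(x):
    # "The smallest prime bigger than x", expressed via unbounded search.
    return mu(lambda n: n > x and is_prime(n))
```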

It turns out that you can also define "the smallest prime bigger than x" in a primitive recursive way (it's just more complicated), so that function also happens to be a primitive recursive function. However, there are some functions which are general recursive but are NOT primitive recursive, the best-known example being the Ackermann function. It is recursive, but there's no way to write it as a primitive recursion.
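The Ackermann function is short to write down even though it escapes primitive recursion; a Python sketch:

```python
def ackermann(m, n):
    # General recursive but not primitive recursive: the recursion
    # decreases the pair (m, n) rather than a single argument, and
    # the function grows too fast to be captured by any primitive
    # recursion. Values blow up quickly -- keep m small.
    if m == 0:
        return n + 1
    if n == 0:
        return ackermann(m - 1, 1)
    return ackermann(m - 1, ackermann(m, n - 1))
```

Even `ackermann(4, 2)` already has 19,729 digits, which is one informal way to see why no fixed tower of primitive recursions can keep up with it.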

So now, another question we can ask is "Are all functions just general recursive functions?". Again, the answer is no. However! General recursive functions actually correspond with computable functions, functions for which an algorithm can be written. This correspondence led in part to the Church-Turing thesis, which I think you might find interesting.

But like I said there are some functions that are not general recursive, functions that are not computable. The most well-known is the busy beaver function.

Edit: One thing I want to point out. In programming, a "function" is an implementation of an algorithm. In that sense, every (programming) function is a general recursive (math) function. Every (programming) function can indeed be boiled down to constants, projections, and the successor function, by using composition, primitive recursion, and the unbounded search operator.

As for the busy beaver function, since there's no algorithm for it, it's not a function in the programming sense of the word. It's a function only in the mathematical sense that it's an abstract, but "well-defined", mapping from one set to another.
It is known that the first-order theory of (N, +) (Presburger arithmetic) is complete, and hence it does not interpret Peano arithmetic.

So in that sense, you can't boil everything down to addition and subtraction.

You need at least multiplication. If you do allow multiplication, then you can take the strict formalist view of math, in which case, by using Gödel encoding and the MRDP theorem, every math question is just a polynomial equation.

Note: it might sound like you can break multiplication into addition, but that does not quite work, at least not without other operations. If you think carefully about the concept of **repeated** addition, you will notice that you need the ability to repeat some action.
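A minimal Python sketch of this point: even "multiplication as repeated addition" smuggles in a loop, i.e. an ability to repeat, which is not itself addition.

```python
def mul_repeated_addition(x, y):
    # The additions are the easy part. The for-loop is the hidden
    # extra ingredient: a repetition construct (a form of primitive
    # recursion) that plain addition does not give you.
    total = 0
    for _ in range(y):
        total += x
    return total
```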
For example, solving for the value of i^4 +1 requires knowing complex number multiplication rules, which can't be boiled down to addition or subtraction.

Even without complex numbers, you gotta know multiplication rules like -1*-1=1 to do the most basic algebra.

And then there are entire branches of mathematics which have equations which do not involve any numbers. Equation is a much more general concept than numbers.
For anything a computer can compute this is arguably true (Turing machines), but it is wildly impractical for actual reasoning.
Even multiplication cannot be thought of as repeated addition. How can you reframe sqrt(2) * sqrt(3) as an addition problem?
How about an equation like sin x = 0? Off the top of my head I don't know a way, even a roundabout one, to solve it with just addition and subtraction.
An equation is just an equivalence relation, of which there are many that do not involve adding and subtracting (subtracting really being just another form of adding). For example, you could argue that two triangles A and B are identical in the sense that they are similar (every angle is the same), so A = B. However, there is nothing additive here. When you do have a sense of addition, that's linear algebra, and many (but certainly not all) things are studied in this way.
Galois theory shows that polynomials of degree 5 or more cannot always be solved by radicals which are +, -, multiplication, division and nth roots. So no.
An equation is just a statement that two objects are equal; it doesn't inherently suggest that there is anything to be "solved", or any operations (such as addition or subtraction) to be "performed". Even if you're trying to perform some kind of calculation or simplification in order to show that an equality holds, it may not be the case that the set you're working in has a notion of addition or subtraction.
I would argue that set "equations" have only an analogue of addition in them, in the form of union or intersection.

The limit is also a challenge here, because of the extra processes involved beyond the arithmetic itself.

But at a fundamental level there is this common thread that addition or additive inverse is at the root of almost all operations in mathematics.