Take, for example, differential forms and tensor calculus. Of course, differential forms are a subset of tensor calculus, but they are often advertised as their own thing (at least to physicists). So, to my question: are there other examples of mathematical fields which are like this?
A lot of introductory complex analysis is nothing more than PDEs: things like the Cauchy integral formula and the analyticity of holomorphic functions, for instance, really are statements about harmonic functions, and are sometimes even more naturally viewed as statements about solutions to PDEs more generally. It's only later in a course that people tend to do things that have some legitimate claim to being complex analysis proper, and not just PDE.
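The link to harmonic functions is immediate: if $f = u + iv$ is holomorphic, the Cauchy-Riemann equations force both components to satisfy Laplace's equation,

```latex
u_x = v_y, \quad u_y = -v_x
\;\Longrightarrow\;
u_{xx} + u_{yy} = (v_y)_x + (-v_x)_y = v_{yx} - v_{xy} = 0,
```

so $u$ (and likewise $v$) is harmonic, i.e. $\Delta u = 0$.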
Group theory is used in basically everything, yet group theory exists by itself.
The theory of Riemann surfaces and algebraic geometry, or number theory and algebraic geometry, and while we're at it, linear algebra and algebraic geometry. I think the pattern is obvious.
Maybe probability theory and measure theory, but I don't know enough to say for sure.
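For what it's worth, the modern formalization does run through measure theory: a probability space is just a measure space of total mass one,

```latex
(\Omega, \mathcal{F}, P), \qquad P : \mathcal{F} \to [0,1], \qquad P(\Omega) = 1,
```

with random variables as measurable functions and expectation as the Lebesgue integral $\mathbb{E}[X] = \int_\Omega X \, dP$.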
In every course I've taken, isomorphism theory is the same thing as isomorphism theory.

/s
Statistics always struck me as a layer of terminology atop concepts from analysis and probability. I remember one time we did a proof in statistics that involved the Borel-Lebesgue characterization of compact sets (from any open cover of a compact set, a finite subcover can be extracted), in order to cover a compact set with finitely many balls of radius epsilon... That is very unusual for statistics, and it was the only glimpse of intelligible math I found in all the statistics courses I had to suffer through. I would describe statistics as object-oriented mathematics, where everything is hidden from view. But on that day I was allowed to see that there is something behind the terminological "thicket", to quote Numerical Recipes in Fortran. Sorry, this turned into a rant.

Machine learning and such is basically interpolation with complicated or steep functions (like logistic functions, though they go by other names there), naturally using optimization (called fitting or training, because why not). What they call a "loss function" is what everyone else calls a "cost function", so the renaming tendency is there too. They also mostly use stochastic optimization because they have many variables, but that does not make for a field per se, in my opinion. It is similar to statistics in that it is an old field used for new purposes, hence the new terminology.
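To make the point concrete, here is a minimal sketch of what "training" a logistic model amounts to: stochastic gradient descent on a cost function (the log loss). All names and the toy data below are made up for illustration; nothing here comes from an ML library.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    z = np.clip(z, -30, 30)  # avoid overflow in exp for large |z|
    return 1.0 / (1.0 + np.exp(-z))

def log_loss(w, X, y):
    # The "loss function" -- what an optimizer would call the cost function.
    p = sigmoid(X @ w)
    return -np.mean(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))

def sgd_fit(X, y, lr=0.5, epochs=200):
    # "Training" = stochastic gradient descent, one sample at a time.
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for i in rng.permutation(len(y)):
            p = sigmoid(X[i] @ w)
            w -= lr * (p - y[i]) * X[i]  # gradient of the log loss at sample i
    return w

# Two linearly separable clusters in the plane, plus a bias column.
X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
X = np.hstack([X, np.ones((100, 1))])
y = np.array([0] * 50 + [1] * 50)

w = sgd_fit(X, y)
acc = np.mean((sigmoid(X @ w) > 0.5) == y)
print(f"loss={log_loss(w, X, y):.3f}  accuracy={acc:.2f}")
```

Nothing in it is specific to "learning": it is plain stochastic optimization of a smooth cost, which is the commenter's point.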
Algebra Lectures advertise that you will get to play with Groups, Rings, Modules, Ideals and a bit of Category Theory sprinkled in, and that's mostly true.

Topology Lectures advertise that you will play with Donuts, Coffee Mugs, Spaces, and Knots, and count holes. What they often forget to tell you is that you count holes by spending your time with Chains of Rings and Modules, basically juggling Morphisms and doing Algebra and Category Theory all day.
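Concretely, the hole counting is homology: you build a chain complex of modules connected by boundary morphisms,

```latex
\cdots \xrightarrow{\;\partial_{n+1}\;} C_n \xrightarrow{\;\partial_n\;} C_{n-1} \xrightarrow{\;\partial_{n-1}\;} \cdots,
\qquad \partial_n \circ \partial_{n+1} = 0,
```

and the $n$-th homology group $H_n = \ker \partial_n / \operatorname{im} \partial_{n+1}$ counts the $n$-dimensional holes; for the torus, for instance, $H_1 \cong \mathbb{Z}^2$. Pure algebra, as advertised (or not).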
Machine Learning techniques are just numerical analysis
For those who are interested in AI, you'll find that Reinforcement Learning is more or less secretly Control Theory.
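The shared core is the Bellman optimality equation, which optimal control solves by dynamic programming and RL approximates from samples. A minimal sketch, using value iteration on a hypothetical two-state, two-action MDP (all transition probabilities and rewards below are made up for illustration):

```python
import numpy as np

# P[a][s, s'] = probability of moving s -> s' under action a;
# R[a][s] = expected reward for taking action a in state s.
P = [np.array([[0.9, 0.1], [0.2, 0.8]]),   # action 0
     np.array([[0.1, 0.9], [0.7, 0.3]])]   # action 1
R = [np.array([1.0, 0.0]),                 # action 0
     np.array([0.0, 2.0])]                 # action 1
gamma = 0.9                                # discount factor

V = np.zeros(2)
for _ in range(500):
    # Bellman optimality update: V(s) = max_a [R(a,s) + gamma * sum_s' P(s'|s,a) V(s')]
    Q = np.stack([R[a] + gamma * P[a] @ V for a in range(2)])
    V_new = Q.max(axis=0)
    if np.max(np.abs(V_new - V)) < 1e-10:  # stop at the fixed point
        break
    V = V_new

policy = Q.argmax(axis=0)  # greedy policy w.r.t. the converged values
print("V* =", V, "policy =", policy)
```

This fixed-point iteration is textbook dynamic programming; RL methods like Q-learning do essentially the same update with sampled transitions instead of a known model, which is why the two fields keep rediscovering each other.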