(Joint) moments, when they exist, help characterize the distribution of a random vector. Agreement of all moments is a necessary but not sufficient condition for two random vectors to be identically distributed. As you know, the mean and covariance are the first- and second-order (central) moments.
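As a quick numerical illustration (a sketch assuming NumPy is available; the sample data are arbitrary), the mean vector is the empirical first-order moment and the covariance matrix is the empirical second-order central moment:

```python
import numpy as np

rng = np.random.default_rng(0)
# A sample of a 2-dimensional random vector (arbitrary data for illustration).
X = rng.normal(size=(1000, 2))

mean = X.mean(axis=0)          # first-order moment, estimating E[X]
centered = X - mean
# Second-order central moment E[(X - mu)(X - mu)^T], i.e. the covariance.
cov = centered.T @ centered / len(X)

# np.cov with bias=True computes the same quantity.
assert np.allclose(cov, np.cov(X, rowvar=False, bias=True))
```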
The moment-generating function, when it exists in a neighborhood of zero, uniquely determines the distribution. Note that for some distributions, moments of all orders exist and yet the MGF does not; the lognormal distribution is the classic example. In such cases the collection of all moments need *not* uniquely determine the distribution; indeed, there are distributions distinct from the lognormal that share all of its moments.
Equivalently, one can consider the cumulant-generating function, which is the natural logarithm of the MGF. In many ways cumulants are nicer to work with than moments, although they carry equivalent information. For instance, every cumulant of a sum of independent random variables is the sum of the corresponding cumulants of the summands, a property that fails for central moments of order four and higher.
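The additivity of cumulants can be checked exactly on small discrete distributions (a sketch; the two distributions below are arbitrary examples, and the first four cumulants are computed from central moments via the standard identities k1 = mean, k2 = mu2, k3 = mu3, k4 = mu4 - 3*mu2^2):

```python
from itertools import product

# Two small independent discrete distributions (arbitrary example values).
X = {0: 0.5, 1: 0.3, 2: 0.2}
Y = {0: 0.6, 3: 0.4}

def central_moment(dist, k):
    mean = sum(x * p for x, p in dist.items())
    return sum((x - mean) ** k * p for x, p in dist.items())

def cumulants(dist):
    # First four cumulants from central moments:
    # k1 = mean, k2 = mu2, k3 = mu3, k4 = mu4 - 3*mu2^2.
    m = sum(x * p for x, p in dist.items())
    mu2, mu3, mu4 = (central_moment(dist, k) for k in (2, 3, 4))
    return (m, mu2, mu3, mu4 - 3 * mu2 ** 2)

# Exact distribution of X + Y under independence (discrete convolution).
S = {}
for (x, px), (y, py) in product(X.items(), Y.items()):
    S[x + y] = S.get(x + y, 0.0) + px * py

kX, kY, kS = cumulants(X), cumulants(Y), cumulants(S)
for a, b, c in zip(kX, kY, kS):
    assert abs((a + b) - c) < 1e-12   # each cumulant adds

# By contrast, the fourth *central moment* is not additive:
# mu4(X + Y) = mu4(X) + mu4(Y) + 6 * mu2(X) * mu2(Y).
mu4_sum = central_moment(S, 4)
```

The cross term `6 * mu2(X) * mu2(Y)` is exactly what the `-3*mu2^2` correction in the fourth cumulant cancels, which is why k4 adds even though mu4 does not.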