Friendly reminder that despite public perception, not all of math is universally agreed upon.

Defining 0^0 is essentially a matter of convenience: in many applications it's useful to have a value there. For example, taking 0^0 = 1 lets the binomial theorem and power series notation work without special cases. Not everyone in the math community agrees that it should be defined, or what the definition should be, and it won't be defined in all contexts.

Your calculator is programmed by people somewhere, and whoever programmed it decided (or more likely, was told) that the calculator should return 1 when you input 0^0.
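You can see this kind of design decision play out within a single language. In Python, the built-in numeric types return 1 for 0^0, while the `decimal` module, which follows a different arithmetic standard, treats it as undefined (a minimal sketch; the printed messages are just for illustration):

```python
from decimal import Decimal, InvalidOperation

# Built-in ints and floats: the language designers chose 0**0 == 1,
# the convention common in combinatorics and power series.
print(0 ** 0)      # 1
print(0.0 ** 0.0)  # 1.0

# The decimal module follows a different specification, under which
# 0**0 is an invalid operation rather than 1.
try:
    Decimal(0) ** Decimal(0)
except InvalidOperation:
    print("decimal arithmetic leaves 0**0 undefined")
```

Same machine, same language, two different answers, because two different groups of people made two different (reasonable) choices.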

So really, both are "true" or neither is "true," depending on your point of view. Any definition we give in math is an attempt to provide a starting point from which to build useful results. Those definitions can change across different areas, even when they are generally agreed upon (example: parallel lines in Euclidean vs. non-Euclidean geometry).