I believe it _is_ supposed to feel like this when you encounter it for the first time. Felt the same to me too. It's one of those foundational topics that pervade a really vast terrain in maths, yet on their own feel almost useless and dry. That's because linear algebra is an abstraction, a generalization of useful concepts that pop up in many different places. For example, if you're going to take multivariable calc, you'll see it pop up a bit. If you take differential equations, you'll see it pop up quite a bit. If you take machine learning, that's basically applied calc and lin alg through and through. Courses further up the ladder, e.g. probability theory, abstract algebra, differential geometry etc., become more and more dependent on linear algebra. If you take physics courses, quantum mechanics, special and general relativity, continuum mechanics etc. make heavy use of lin alg. Heck, it's not just pure maths and physics - basically any field that involves mathematical modelling leans heavily on linear algebra. The specifics often don't matter, but it's very, very useful (and necessary) to have a high-level understanding. Also, many of the techniques you learnt in this course will eventually become a permanent part of your arsenal.

The basic takeaway right now should be how linear transformations work, how we can represent them with matrices, how the same transformation looks different in different bases, and how transformations will in general have a preferred basis in which they look really simple. I studied a few chapters of Axler several years ago and I believe the book does a good job of introducing these things. Nevertheless, I'd say have a go at watching _3blue1brown_'s YouTube playlist called _Essence of Linear Algebra_. It will, hopefully, make a lot of things click for you.
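If you want to see the "preferred basis" idea concretely, here's a minimal NumPy sketch (my own illustration, not from Axler or the videos): take a matrix, change to its eigenbasis, and watch the same transformation turn diagonal.

```python
import numpy as np

# A linear transformation written in the standard basis.
A = np.array([[3.0, 1.0],
              [1.0, 3.0]])

# Eigendecomposition: the columns of P are eigenvectors of A,
# and together they form the transformation's "preferred" basis.
eigenvalues, P = np.linalg.eig(A)

# The same transformation expressed in the eigenbasis: D = P^{-1} A P.
# Same map, different coordinates - and now it's just a diagonal
# matrix that stretches each axis by its eigenvalue.
D = np.linalg.inv(P) @ A @ P

print(np.round(D, 10))
```

Up to floating-point noise, `D` comes out diagonal with the eigenvalues on the diagonal, which is exactly the "looks really simple in the right basis" phenomenon.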