(Unrelated to sudgy.) A few years ago, I made a post here complaining about the way coordinate vectors are introduced in linear algebra. In particular, no one ever explained to me whether, when you write (2,3,4), you mean "the element (2,3,4) \in \R^3" or "the vector 2*b_1 + 3*b_2 + 4*b_3 for some basis {b_1, b_2, b_3}". In the former case, writing (2,3,4)_B would mean "I apply the operation _B to the vector (2,3,4), mapping it onto 2*b_1 + 3*b_2 + 4*b_3, which is a different vector". In the latter case, you would technically -always- have to denote which basis you mean whenever you write down any 'coordinate vector', and omitting it would just be an abbreviation.

The second case is weird because no one follows it consistently (also, how do you write down the vectors of a basis themselves? You need two different notations!), and the former is weird once you throw matrices into the mix, because matrices are defined relative to a basis while vectors aren't. And when you define a function by applying a matrix, i.e. f : x -> A * x, it gets even worse, because A still depends on a basis, but f definitely shouldn't. Our lectures did this, ugh. As far as I remember, there were some decent comments, but no one really gave me a satisfying explanation.
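To make the ambiguity concrete, here are the two readings side by side (notation mine, basis picked just for illustration):

```latex
% Reading 1: (2,3,4) is literally an element of \R^3,
% and _B is an operation sending it to a *different* vector:
(2,3,4)_B := 2 b_1 + 3 b_2 + 4 b_3.
% E.g. for B = \{(1,1,0), (0,1,0), (0,0,1)\}:
(2,3,4)_B = 2(1,1,0) + 3(0,1,0) + 4(0,0,1) = (2,5,4) \neq (2,3,4).

% Reading 2: (2,3,4) is never a vector on its own, only shorthand
% for a basis expansion -- so strictly every tuple needs a basis
% annotation, and a bare (2,3,4) is an abbreviation for (2,3,4)_E
% with E the standard basis.
```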

Fast forward to today: I've been reading "Linear Algebra Done Right", and it immediately did what neither of the lectures I attended managed to do. In the book, vectors are elements of F^n, not relative to anything. Matrices are something separate. There is a function M taking any linear map *and bases of the two vector spaces* and returning a matrix. Matrix multiplication is defined in the usual way, but there is no multiplication of a matrix with a vector. Instead, there is another function which takes a vector *and a basis* and returns an n×1 matrix representing the vector *relative to that basis*. That matrix can then be multiplied with another matrix by ordinary matrix multiplication.
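In symbols, the setup looks roughly like this (paraphrased from memory, so the exact notation may differ slightly from the book):

```latex
% For a linear map T : V \to W, a basis v_1,\dots,v_n of V
% and a basis w_1,\dots,w_m of W, the matrix of T is
\mathcal{M}\bigl(T, (v_1,\dots,v_n), (w_1,\dots,w_m)\bigr) = A \in F^{m,n},
\quad\text{where } T v_k = \sum_{j=1}^{m} A_{j,k}\, w_j.

% Separately, for a vector v = c_1 v_1 + \dots + c_n v_n,
% the matrix of v relative to the basis is an n-by-1 matrix:
\mathcal{M}\bigl(v, (v_1,\dots,v_n)\bigr)
  = \begin{pmatrix} c_1 \\ \vdots \\ c_n \end{pmatrix} \in F^{n,1}.

% With the bases fixed, applying the map is then *ordinary*
% matrix multiplication of two matrices -- no matrix-times-vector:
\mathcal{M}(Tv) = \mathcal{M}(T)\,\mathcal{M}(v).
```

The nice part is that the basis dependence sits entirely inside M, so vectors and linear maps themselves never carry a basis around.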

I find that super awesome, so I wanted to share it. This is actually a completely consistent way to differentiate between vectors and 'coordinate vectors', one that answers all my questions satisfyingly, and it's not even difficult! That's how you should do it. The book completely deserves its title, even though the justification given for the title wasn't about this, but about the determinant and how it's usually introduced without providing any intuition, which was also true in the lectures I attended. Looking forward to seeing how this book handles that, too.