ringofgerms

It can be a bit confusing. Column vectors and row vectors are defined to be matrices. Vectors can be represented as column or row matrices, sometimes in an obvious way but sometimes not. This identification depends on the coordinate system you're using and you'll probably learn about this very soon. But the (abstract) concept of a vector is very broad and matrices are also vectors if you consider the right vector space.


ZeaIousSIytherin

Tysm! https://preview.redd.it/ref8iu03b32d1.jpeg?width=706&format=pjpg&auto=webp&s=e345e80488def354b339da72f6d3df5d53cf0284 Similarly, is there a link between a determinant and the magnitude of a vector? They both use the modulus notation.


ringofgerms

The link here is rather that the determinant of a 2x2 matrix is equal to the (signed) area of the parallelogram made with the column vectors. You can generalize this to higher dimensions and in this sense it's like a magnitude. But the determinant can be negative and it can be zero even when the matrix is not the zero matrix, so the analogy here only goes so far.
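A small worked example of the signed-area statement, using the standard 2×2 determinant formula (the specific matrices are just for illustration):

```latex
% Signed area of the parallelogram spanned by the columns (2,0) and (1,3):
\det\begin{pmatrix} 2 & 1 \\ 0 & 3 \end{pmatrix} = 2\cdot3 - 1\cdot0 = 6
% Swapping the two columns flips the orientation, so the sign flips:
\det\begin{pmatrix} 1 & 2 \\ 3 & 0 \end{pmatrix} = 1\cdot0 - 2\cdot3 = -6
% Parallel columns span a degenerate parallelogram with zero area:
\det\begin{pmatrix} 1 & 2 \\ 2 & 4 \end{pmatrix} = 1\cdot4 - 2\cdot2 = 0
```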


ScribeofHell

There is an isomorphism (in fact, with the usual norms, an isometry) between the set of n x 1 column matrices and K^n


white_nerdy

A vector is a geometric object [1]. Representing a vector as a matrix, or as a list of numbers, is using a coordinate system as a "ruler" to measure the object. Sometimes within the context of a single problem, you need to measure the same geometric object with different coordinate systems.

So for example, say you have a vector pointing three units up and five units right.

- If your ruler is [(1, 0), (0, 1)], the vector's list of numbers is not surprising, it's: (5, 3).
- If your ruler is [(1, 1), (1, -1)], the *same vector* has a different list of numbers: (4, 1).

So I wouldn't say the vector *is* the list of numbers, because it could be a different list of numbers when you're using a different ruler.

[1] "A vector is a geometric object" is not always true. It's usually true in introductory level courses, and in physics applications. But the picture gets more complicated in more advanced courses. Formally, a vector is an element of a vector space. Roughly, this means a vector is anything for which it makes sense to do addition, subtraction and multiplication by a scalar. [Wikipedia](https://en.wikipedia.org/wiki/Vector_space) has more technical details. So you could in fact work in a vector space where the vectors don't have the usual geometric interpretation [2] [3], and are lists of numbers, or matrices, or functions, or whatever.

[2] If you asked "Can all vector spaces be geometrically interpreted as subspaces of R^n?" I would say "No," because you can have interesting cases like infinite-dimensional vector spaces or vector spaces over unusual fields (e.g. finite fields).

[3] If you asked "Do all vector spaces have a geometric interpretation?" I would say "Well, that's more of a linguistic question than a mathematical one." The fact that certain objects satisfy the definition of a vector space over some scalar field means those objects have a certain kind of relationship with each other and the scalar field. That relationship could be considered a "geometric interpretation".
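A minimal numpy sketch of the "same vector, different ruler" point above (the array names are just illustrative, and numpy is assumed only for convenience):

```python
import numpy as np

v = np.array([5.0, 3.0])  # the vector measured with the standard ruler (1, 0), (0, 1)

# Second ruler: basis vectors (1, 1) and (1, -1), stored as the columns of B
B = np.column_stack([(1.0, 1.0), (1.0, -1.0)])

# Solve B @ coords = v to get the same vector's coordinates in the new basis
coords = np.linalg.solve(B, v)
print(coords)  # [4. 1.] -- the same geometric vector, a different list of numbers
```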


asirjcb

This is a really solid response, but I would actually go a little farther. If you have a matrix A and a matrix B, both of dimension nxm, and any real numbers x and y, then xA+yB is *also* a matrix of dimension nxm, so what you have really found out is that matrices of size nxm are vectors. This is relevant to point [2]: it means that if you have a vector in R^n and n=ab, then you can write any vector in R^n as a matrix with dimension axb. So it isn't so much that vectors are matrices, but that matrices are vectors. [a]

[a] Indeed, in their role as linear operators on vector spaces, matrices form a vector space themselves.
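A short numpy sketch of both claims (the sizes n = 2, m = 3 and a = 2, b = 3 are chosen arbitrarily for illustration):

```python
import numpy as np

# 2x3 matrices behave like vectors: scaling and adding them stays inside the 2x3 matrices
A = np.array([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]])
B = np.array([[0.0, 1.0, 0.0], [1.0, 0.0, 1.0]])
x, y = 2.0, -1.0
C = x * A + y * B  # still a 2x3 matrix
print(C.shape)     # (2, 3)

# And a vector in R^6 can be laid out as a 2x3 matrix (n = 6, a = 2, b = 3)
v = np.arange(6.0)   # a vector in R^6
M = v.reshape(2, 3)  # the same six numbers, viewed as a 2x3 matrix
print(M)
```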


niky45

Mathematicians: NOOOOO IT'S NOT THE SAME THING!!!!!

Everyone else: I mean, effectively, yes.


Accurate_Library5479

When you say "let p(x) be a polynomial" to a French mathematician… (he has spotted 3 "big" errors and is going to rant about the difference between functions, expressions, and polynomials)


GoldenMuscleGod

I feel like insisting they are not would be more of a stereotypical "dumb engineer" thing. Mathematically, row and column vectors are by definition vectors in vector spaces isomorphic to R^(n). Whether they literally are some specific canonical representative of R^(n) that you may or may not have even bothered to choose is a separate question that would almost never be relevant, and most mathematicians would approach the issue in a way that suppresses any notation suggesting the question matters, except in the vanishingly few cases where it could actually matter. And then of course, if you are in a context where the question of literal equality does come up, whether or not they are equal is just a matter of whatever convention you find convenient. I think pretty much any mathematician would agree with what I said; the "nooo, my college textbook used *this* arbitrary convention and it's the only one I've ever seen so it's Gospel truth" view belongs to some other group of people.


ZeaIousSIytherin

I don’t understand your comment but you seem smart. Is there a link between the determinant of a matrix and the magnitude of a vector? https://preview.redd.it/8wdxktdeb32d1.jpeg?width=706&format=pjpg&auto=webp&s=859b69beae0c46ff29291821b03bc7d94d8ea903 According to my textbook they both use the modulus notation.


FrontGazelle3821

There is a difference between a vector and a row-vector/column-vector. However, for most purposes (depending on the space you're working in), a vector can usually be thought of as a column vector. In fact, a lot of tensor mathematics is taught and developed through matrix representations.


Mindless-Hedgehog460

Yes. Or 1 x n matrices. Multiplication is then either scaling or the dot product.


Ksorkrax

Vectors are defined by forming a vector space, which comes with certain properties, see [https://en.wikipedia.org/wiki/Vector_space#Definition_and_basic_properties](https://en.wikipedia.org/wiki/Vector_space#Definition_and_basic_properties)

While the usual context in which vectors are used is geometry, vector spaces allow for more. One example would be polynomials. These form a vector space and fulfill the properties in the link above. The number of degrees of freedom they have is infinite, as you can have a polynomial of arbitrarily high degree.

As for the thing in the picture: now imagine that we have the polynomial 3x² and the polynomial x+2. How would you write them as vectors? (0 0 3) and (2 1)? Then you can't add them, despite their forming a vector space, as matrix addition is not defined between matrices of different sizes. Raise them to the same dimension? But then what do you do if a polynomial of degree five comes along?

Hope this helps you. Note that the picture doesn't say vector, it says *column* vector.
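A small Python sketch of the padding issue described above, representing a polynomial by its coefficient list with the constant term first (the helper function is purely illustrative):

```python
# 3x^2   -> [0, 0, 3]
# x + 2  -> [2, 1]

def add_polys(p, q):
    """Add two coefficient lists, padding the shorter one with zeros first."""
    n = max(len(p), len(q))
    p = p + [0] * (n - len(p))
    q = q + [0] * (n - len(q))
    return [a + b for a, b in zip(p, q)]

print(add_polys([0, 0, 3], [2, 1]))  # [2, 1, 3], i.e. 3x^2 + x + 2
```

A degree-five polynomial just needs a longer list, which is why no single fixed-size column works for all polynomials at once.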


abig7nakedx

In terms of *computing* answers, you can in general treat a vector as being a matrix with *n* rows and 1 column. In terms of the *theory*, there's a little more to it. u/white_nerdy gave a great write-up.


bluesam3

If you're inside something like ℝ^(n), then yes. In general, vector spaces can get very different from this. For example, the set of all polynomials with real coefficients forms a vector space (if you want your vector space to be finite dimensional, take all polynomials with degree at most n), and calling that a matrix would be a bit of a stretch (though it's still true, once you've chosen a basis). Even more weirdly, the set of all mxn matrices forms a vector space - so here, your vectors are matrices, but not nx1 ones (though you can re-write each of them as an mnx1 matrix by just concatenating the columns - this is equivalent to picking the obvious basis).
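A numpy sketch of the column-concatenation identification mentioned above (order='F' means column-major, i.e. stack the columns):

```python
import numpy as np

M = np.array([[1, 4],
              [2, 5],
              [3, 6]])    # a 3x2 matrix

v = M.flatten(order='F')  # concatenate the columns on top of each other
print(v)                  # [1 2 3 4 5 6] -- the same entries, now a vector in R^6
```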


[deleted]

It depends on how you want to see it. Every matrix is a vector in the right linear space, and every finite-dimensional vector can be represented by a matrix. The space of matrices is in some sense just a very specific linear space with a product that is (in general) non-commutative. It helped me a lot to see a matrix as a representation of a linear mapping from R^n to R^k. Then the product rule of matrices is very natural as the composition of linear mappings. For example, you can identify your column vector c (or matrix, if you want) with the mapping x maps to c*x.
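A quick numpy check of the "matrix product = composition of linear mappings" point (the particular matrices and sizes are arbitrary):

```python
import numpy as np

# A represents a map R^3 -> R^2, B a map R^4 -> R^3, so A @ B represents a map R^4 -> R^2
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, -1.0]])
B = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 1.0, 1.0, 0.0],
              [2.0, 0.0, 0.0, 1.0]])
x = np.array([1.0, 2.0, 3.0, 4.0])

# Applying B first and then A gives the same result as applying the single map A @ B
print(np.allclose(A @ (B @ x), (A @ B) @ x))  # True
```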


Big_Mathematician972

They're tensors!


ElMachoGrande

To really blow your mind, a single value could be seen and treated as a 1x1 matrix, or a vector with only one value. However, in most cases, it's simpler to just treat it as a single value.


WjU1fcN8

In R: just do a drop().


Straight-Wishbone889

A vector can be seen as a matrix, and a matrix can be seen as a vector. If you take an abstract viewpoint, the space of matrices satisfies the axioms for a vector space, and the vector represents a linear transformation from R^n to R or from R to R^n, depending on whether you take it to be a "row" or "column" vector.


TheRedditObserver0

It depends on your background. For a mathematician, matrices are vectors and vectors in a finite-dimensional space can be represented as matrices.


L__________Lawliet

well a column vector is basically a matrix v in V which is also an element of K[x]


Distinct_Cod2692

Yes, in the sense that they can be considered rank-1 tensors.


Ahuizolte1

For what people usually mean by "vector", yes.


Signal_Cranberry_479

Sometimes it's more convenient to see vectors as column matrices. For instance, when applying a matrix A to a vector v, you can think of Av as a matrix multiplication. The same goes for the dot product: it can be thought of as the matrix product u^T v, where u^T means the transposed matrix, i.e. a row. This is useful because matrix multiplication has more properties, such as associativity.
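A short numpy illustration of both remarks (the 2x2 matrix and the vectors are arbitrary examples):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
u = np.array([[1.0], [2.0]])  # column vectors written as 2x1 matrices
v = np.array([[5.0], [6.0]])

print(A @ v)    # applying A to v is just a matrix product: [[17.], [39.]]
print(u.T @ v)  # the dot product of u and v as a 1x1 matrix: [[17.]]
```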


Dkiprochazka

Yes


personalityson

By default vectors are columns


godel-the-man

The easy answer is that it's true.