1. Determinant of a matrix
2. Difference between inverse matrix and identity matrix, and what are they?
3. Eigenvalues
4. Unitary or orthonormal matrix
5. Diagonal matrix
6. How to compute matrices?

Thank you in advance for answering any one of them.

4 points

@meowmeowmeow
2(a). In a lot of mathematical systems, the “identity” is the thing that “does nothing.” For example, when adding ordinary numbers the identity is 0, because adding 0 to any number does nothing: the other number stays the same. Similarly, when multiplying, the identity is 1, because multiplying any number by 1 also does nothing. The identity matrix plays the same role: if you multiply any (square) matrix by the identity, you get back the same matrix you started with.
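A small sketch of this in Python with NumPy (the matrix values here are arbitrary examples, not anything from the thread):

```python
import numpy as np

# An arbitrary 2x2 matrix and the 2x2 identity matrix.
A = np.array([[2.0, 1.0],
              [5.0, 3.0]])
I = np.eye(2)  # [[1, 0], [0, 1]]

# Multiplying by the identity "does nothing": A @ I is just A again,
# and so is I @ A.
print(np.allclose(A @ I, A))  # True
print(np.allclose(I @ A, A))  # True
```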

4 points

@meowmeowmeow
2(b). The inverse is related to the identity. It’s sort of the “opposite” of a math object (a number, matrix, etc.), but in a specific way: when you combine something with its inverse by some operation (like adding or multiplying), the result is the identity. For example: when adding, the inverse of x is -x because x+(-x) = 0. And when multiplying, the inverse of x is 1/x because x*(1/x) = 1 (as long as x isn’t 0, which has no multiplicative inverse). In the same way, when a matrix is multiplied by its inverse, the result is the identity matrix; and just like with 0, not every matrix has an inverse.
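Continuing the sketch (again with an arbitrary example matrix, chosen so that it actually has an inverse):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [5.0, 3.0]])

# The inverse "undoes" A: multiplying the two together gives the
# identity matrix, just like x * (1/x) = 1 for ordinary numbers.
A_inv = np.linalg.inv(A)
print(np.allclose(A @ A_inv, np.eye(2)))  # True
print(np.allclose(A_inv @ A, np.eye(2)))  # True
```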

2 points

@meowmeowmeow
3. Remember a matrix is like a function: multiply it with a column vector as input, and you get another column vector as output. In general, a matrix can transform vectors in all sorts of ways, but sometimes a matrix has special input vectors called “eigenvectors.” What makes them special is that, after multiplying, you get back the same vector you started with, just scaled (multiplied) by some number called an “eigenvalue.” This page has some examples: https://www.mathsisfun.com/algebra/eigenvalue.html
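As a sketch, NumPy can find the eigenvalues and eigenvectors for us, and we can check the defining property directly (the example matrix is arbitrary; I picked a diagonal one so the eigenvalues, 2 and 3, are easy to see):

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

# eig returns the eigenvalues and the eigenvectors (one per column).
eigenvalues, eigenvectors = np.linalg.eig(A)

# For each pair, feeding the eigenvector v into the matrix gives back
# the same vector, just scaled by its eigenvalue: A @ v == lam * v.
for lam, v in zip(eigenvalues, eigenvectors.T):
    print(np.allclose(A @ v, lam * v))  # True
```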

3 points

@meowmeowmeow
4(a). “Orthonormal” combines “orthogonal” (which roughly means “perpendicular”) and “normal” (which in this context means a vector with length 1). If a matrix is orthonormal, that means if we treat its columns as separate vectors, they’re all mutually perpendicular and each has length 1. Why do we care enough to give this a special name? Well, it turns out orthonormal matrices rotate and reflect vectors without stretching them, which has obvious uses in science and computer graphics.
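A sketch of one familiar example: a 2D rotation matrix is orthonormal, and we can check both properties (perpendicular unit-length columns, and lengths preserved) numerically. The angle and test vector are arbitrary choices:

```python
import numpy as np

# A rotation by 90 degrees, written as a matrix.
theta = np.pi / 2
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# The columns are unit vectors (length 1)...
print(np.allclose(np.linalg.norm(Q[:, 0]), 1.0))  # True
print(np.allclose(np.linalg.norm(Q[:, 1]), 1.0))  # True
# ...and perpendicular to each other (dot product 0).
print(np.allclose(Q[:, 0] @ Q[:, 1], 0.0))        # True

# So multiplying by Q rotates a vector without changing its length.
v = np.array([3.0, 4.0])
print(np.allclose(np.linalg.norm(Q @ v), np.linalg.norm(v)))  # True
```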

1 point

Thanks for your explanation with examples. What is a column vector? Is it something like (1 2 3), which means move x upwards 1 unit, y up 2 units, z up 3 units?


Explain Like I’m Five

!explainlikeimfive@lemmy.ml
