The sum of kets is a ket: $|A\rangle + |B\rangle = |C\rangle$
Vector addition is commutative: $|A\rangle + |B\rangle = |B\rangle + |A\rangle$
Vector addition is associative: $(|A\rangle + |B\rangle) + |C\rangle = |A\rangle + (|B\rangle + |C\rangle)$
There is a vector, 0, that, when added to any ket, gives the same ket back: $|A\rangle + 0 = |A\rangle$.
Given any ket $|A\rangle$, there is a ket $-|A\rangle$ such that $|A\rangle + (-|A\rangle) = 0$.
Given any ket $|A\rangle$ and any complex number $z$, you can multiply them to get a new ket, and this multiplication is linear: $z|A\rangle = |B\rangle$.
The distributive property holds: $z(|A\rangle + |B\rangle) = z|A\rangle + z|B\rangle$ and $(z + w)|A\rangle = z|A\rangle + w|A\rangle$.
Axioms 6 and 7 together represent *linearity*.
What makes these vectors different from ordinary 3-vectors is that they can be multiplied by *complex* numbers.
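To make this concrete, here's a quick numerical check of the axioms above (my own sketch, not from the source), using 2-component NumPy arrays of complex numbers as stand-ins for kets:

```python
# Minimal sketch: verify the vector-space axioms numerically with NumPy,
# treating complex 2-component arrays as kets.
import numpy as np

A = np.array([1 + 2j, 3 - 1j])   # |A>
B = np.array([0.5j, -2 + 1j])    # |B>
C = np.array([1.0, 1j])          # |C>
z, w = 2 - 1j, 0.5 + 3j          # complex scalars
zero = np.zeros(2, dtype=complex)

assert np.allclose(A + B, B + A)                 # commutativity
assert np.allclose((A + B) + C, A + (B + C))     # associativity
assert np.allclose(A + zero, A)                  # additive identity (the 0 vector)
assert np.allclose(A + (-A), zero)               # additive inverse
assert np.allclose(z * (A + B), z * A + z * B)   # distributivity over kets
assert np.allclose((z + w) * A, z * A + w * A)   # distributivity over scalars
print("All vector-space axioms check out numerically.")
```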
---
$Ax = b$ is the formula for multiplying a matrix by a vector; I believe there's a relationship to this, since that formula comes from linear algebra, which is the mathematical language of quantum computing.
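As a hedged illustration of that relationship (my own example, not from the source): a matrix acting on a ket is exactly this kind of $Ax = b$ operation, producing another ket. The Pauli-X matrix below is just a convenient concrete operator.

```python
# Sketch: a linear operator (matrix) acting on a ket gives another ket,
# which is the "A x = b" pattern from linear algebra.
import numpy as np

M = np.array([[0, 1],             # example operator (here, the Pauli-X matrix)
              [1, 0]], dtype=complex)
ket = np.array([1 + 0j, 0 + 0j])  # |A>

new_ket = M @ ket                 # operator times ket yields a new ket
print(new_ket)                    # -> [0.+0.j 1.+0.j]
```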
Typically, we can't multiply 3-vectors by complex numbers, and our ability to do so indicates that these are a more abstract type of vector than we classically expect.
A typical example is the two-dimensional column vector, where we stack up the complex numbers associated with $|A\rangle$, $\begin{pmatrix} \alpha_1 \\ \alpha_2 \end{pmatrix}$; however, these vectors (and spaces) can have any number of dimensions.
We typically do not mix vectors of different dimensionalities.
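A quick sketch of that column-vector picture (the component names $\alpha_1$, $\alpha_2$ are illustrative, not from the source):

```python
# Sketch: kets as column vectors of complex numbers, of any dimension,
# and why vectors of different dimensions don't mix.
import numpy as np

alpha_1, alpha_2 = 1 + 1j, 2 - 3j
ket_A = np.array([[alpha_1],      # 2-dimensional column vector for |A>
                  [alpha_2]])

ket_5d = np.arange(5) * (1 + 1j)  # a 5-dimensional ket: any dimension is allowed

# Adding vectors of different dimensionalities is not defined; NumPy refuses.
try:
    ket_A.flatten() + ket_5d
except ValueError as err:
    print("Cannot add vectors of different dimensions:", err)
```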
Bras and kets
Just as complex numbers have their conjugates, kets have their duals: these live in a dual space, are called bras, and are row vectors rather than column vectors.
These two vector types allow us to form inner products, written as $\langle B | A \rangle$, which are extremely important to the mathematical structure of quantum mechanics and to characterizing vector spaces.
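A minimal sketch of how this looks numerically, assuming the usual convention that the bra $\langle B|$ is the conjugate transpose of the ket $|B\rangle$ (the specific component values are illustrative):

```python
# Sketch: build a bra as the conjugate transpose of a ket and compute <B|A>.
import numpy as np

ket_A = np.array([[1 + 2j],
                  [3 - 1j]])      # |A> as a column vector
ket_B = np.array([[0.5j],
                  [-2 + 1j]])     # |B> as a column vector

bra_B = ket_B.conj().T            # <B| is the conjugate-transpose row vector
inner = (bra_B @ ket_A).item()    # <B|A> is a single complex number
print(inner)

# Equivalently, np.vdot conjugates its first argument:
assert np.isclose(inner, np.vdot(ket_B, ket_A))
```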