Suppose little g is abelian, so the bracket is zero. Then little ad_X equals 0 for all X, so K of X, Y equals zero for all X and Y. The Killing form was introduced to equip little g with some interesting geometry, and defining the dot product of two vectors to be identically zero isn't an interesting geometry, so we want to focus on examples where this doesn't happen.
Dual Killing form
Killing form
Recall that any Lie algebra admits a symmetric bilinear form called the Killing form: K of X, Y equals the trace of little ad_X times little ad_Y.
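To make the definition concrete, here is a quick numerical sketch (my own illustration, not from the notes) computing the Killing form of sl(2) in the ordered basis H, E, F, where the brackets are [H, E] = 2E, [H, F] = -2F and [E, F] = H:

```python
import numpy as np

# Column j of each ad matrix holds the coordinates of [X, e_j] in the
# ordered basis (H, E, F), read off from the bracket relations above.
adH = np.array([[0, 0, 0], [0, 2, 0], [0, 0, -2]])   # [H,E] = 2E, [H,F] = -2F
adE = np.array([[0, 0, 1], [-2, 0, 0], [0, 0, 0]])   # [E,H] = -2E, [E,F] = H
adF = np.array([[0, -1, 0], [0, 0, 0], [2, 0, 0]])   # [F,H] = 2F, [F,E] = -H

ads = [adH, adE, adF]
# Killing form matrix: K_ij = trace(ad_i ad_j)
K = np.array([[np.trace(a @ b) for b in ads] for a in ads])
print(K)  # rows (8, 0, 0), (0, 0, 4), (0, 4, 0)
```

The determinant of this matrix is minus 128, which is nonzero, so this form is far from identically zero; in fact it is nondegenerate.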
A symmetric bilinear form K is called nondegenerate if the only X satisfying K of X, Y equals zero for all Y is X equals zero.
We say that little g is semisimple if its Killing form is nondegenerate.
I'm being immoral here. The correct definition of semisimplicity is actually quite different from this, involving diagonalisability of certain matrices. It's then a theorem ("Cartan's criterion") that semisimplicity is equivalent to nondegeneracy of the Killing form. However, the way I've set things up, we don't need to know the usual definition of semisimplicity, but we do need to know about nondegeneracy of the Killing form.
Compact groups
Here is a theorem (which we won't prove; it will be an exercise) which gives a large class of examples of semisimple Lie algebras.
If G is a compact group then the Killing form K from little g times little g to R is (real-valued and) negative semidefinite, i.e. K of X, X is less than or equal to zero for all X in little g. Moreover K(X, X) = 0 if and only if little ad_X equals zero, i.e. if and only if X commutes with everything in the Lie algebra.
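To illustrate (my own sketch, not part of the notes): for the compact group SU(2) we can compute the Killing form of su(2) numerically, taking the basis X_j = i sigma_j over 2 built from the Pauli matrices, and see that it is in fact negative definite.

```python
import numpy as np

# Basis of su(2): X_j = i * sigma_j / 2, traceless skew-Hermitian matrices.
s1 = np.array([[0, 1], [1, 0]], dtype=complex)
s2 = np.array([[0, -1j], [1j, 0]])
s3 = np.array([[1, 0], [0, -1]], dtype=complex)
X = [0.5j * s for s in (s1, s2, s3)]

def coeffs(A):
    # Expand A in the basis X_k, using trace(X_i X_k) = -delta_ik / 2.
    return np.array([-2 * np.trace(A @ Xk) for Xk in X]).real

# ad_{X_i} as a 3x3 matrix: column j holds the coordinates of [X_i, X_j].
ads = [np.column_stack([coeffs(Xi @ Xj - Xj @ Xi) for Xj in X]) for Xi in X]
K = np.array([[np.trace(a @ b) for b in ads] for a in ads])
print(K)                       # minus 2 times the identity matrix
print(np.linalg.eigvalsh(K))   # all eigenvalues negative
```

Here K of X, X is strictly negative for every nonzero X, matching the theorem: su(2) has trivial centre, so no nonzero X commutes with everything.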
The centre
Recall that the centre Z of big G of a group big G is the set of all elements g which commute with every other element of the group. It's possible to have Lie groups with nontrivial centre but where the Lie algebra has trivial centre, for example SU(2) has centre comprising plus or minus the identity matrix while the centre of the Lie algebra little s u 2 is zero.
If G is compact and has trivial centre then little g is semisimple.
Since the centre is trivial, little ad_X is nonzero for every nonzero X, so K of X, X is strictly negative for all nonzero X in little g. In particular K is negative definite, hence nondegenerate, so little g is semisimple.
Assume the centre of little g is trivial. Let T inside G be a maximal torus. Let little t be the Lie algebra of T. Let little h be the complexification of little t and little h R be i times little t. Then K restricted to little h R is positive definite.
If X in little t is nonzero then K of X, X is negative, so K of i X, i X equals minus K of X, X, which is positive.
Another name for a positive definite symmetric bilinear form K from V times V to R is a scalar product (or inner product).
The dual form
Definition
Remember that our weight diagram lives in little h R dual, not in little h R.
If K from V times V to C is a nondegenerate symmetric bilinear form then V dual inherits a symmetric bilinear form K star from V dual times V dual to C.

There is an isomorphism sharp from V dual to V (its inverse is called flat from V to V dual) defined in the following way: K of alpha sharp, v equals alpha of v for all alpha in V dual and v in V.
This uniquely determines the map sharp because K is nondegenerate (it will be an exercise to see this).

Define K star of alpha, beta to be K of alpha sharp, beta sharp.
More concretely...
To make this a little less abstract, let's write out what this means in terms of matrices. Pick a basis e_1 up to e_n of V and define a matrix K whose i,j entry is K_{i j} = K(e_i, e_j). Then K of v, w equals K of sum v_i e_i, sum w_j e_j, which equals sum of v_i w_j times K_{i j}.
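For instance (a made-up example of mine), in coordinates K of v, w is just the matrix product v transpose K w:

```python
import numpy as np

# Evaluating a symmetric bilinear form via its Gram matrix K_ij = K(e_i, e_j):
# K(v, w) = sum_ij v_i w_j K_ij, i.e. the matrix product v^T K w.
K = np.array([[12., 6.], [6., 12.]])   # an example Gram matrix
v = np.array([1., 2.])
w = np.array([3., -1.])

explicit = sum(v[i] * w[j] * K[i, j] for i in range(2) for j in range(2))
matrix_form = v @ K @ w
print(explicit, matrix_form)  # both 42.0
```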
Given alpha in V dual, we can think of alpha as a row vector alpha_1 along to alpha_n and we want to produce a column vector alpha sharp equals alpha sharp_1 down to alpha sharp_n such that K of alpha sharp, v equals alpha of v for all v in V.
We have alpha of v equals sum of alpha_j v_j, and K of alpha sharp, v equals sum of K_{i j} alpha sharp_i times v_j. Taking coefficients of v_j on both sides we get alpha_j equals sum of K_{i j} alpha sharp_i.
Since K_{i j} is symmetric, this implies alpha_j equals sum of K_{j i} alpha sharp_i, which we can write as the column-vector equation alpha transpose equals K times alpha sharp. Hence alpha sharp equals K inverse times alpha transpose.
This only makes sense if the matrix K_{i j} is invertible, and invertibility turns out to be equivalent to nondegeneracy. The flat map is then given by v flat equals the transpose of K times v, which makes sense even if K is degenerate.
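In code, the sharp map is just a linear solve (my own sketch, with a made-up Gram matrix), and we can check the defining property K of alpha sharp, v equals alpha of v directly:

```python
import numpy as np

K = np.array([[12., 6.], [6., 12.]])   # an invertible (nondegenerate) Gram matrix

def sharp(alpha):
    # alpha_sharp = K^{-1} alpha^T, i.e. the solution of K @ alpha_sharp = alpha.
    return np.linalg.solve(K, alpha)

alpha = np.array([1., -2.])            # a covector, given by its coefficients
v = np.array([3., 5.])
lhs = sharp(alpha) @ K @ v             # K(alpha_sharp, v)
rhs = alpha @ v                        # alpha(v)
print(np.isclose(lhs, rhs))           # True
```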
Finally, let's define K star_{i j} by K star of alpha, beta equals sum of K star_{i j} times alpha_i times beta_j. Since K star of alpha, beta equals K of alpha sharp, beta sharp, we have sum of K star_{i j} alpha_i beta_j equals sum of K_{i j} K inverse_{i p} alpha_p K inverse_{j q} beta_q, which equals the sum of K inverse_{p q} alpha_p beta_q.
Therefore K star_{i j} equals K inverse_{i j}. So if we want to compute the matrix of the dual symmetric bilinear form K star (with respect to the dual basis), we first compute the matrix of K, then we invert it.
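Here is a numerical check of this claim (my own sketch; the random symmetric Gram matrix is made up):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
K = A @ A.T + 5 * np.eye(4)        # a random symmetric invertible Gram matrix
Kinv = np.linalg.inv(K)

alpha = rng.standard_normal(4)
beta = rng.standard_normal(4)
a_sharp = Kinv @ alpha             # alpha_sharp = K^{-1} alpha^T
b_sharp = Kinv @ beta
lhs = a_sharp @ K @ b_sharp        # K*(alpha, beta) = K(alpha_sharp, beta_sharp)
rhs = alpha @ Kinv @ beta          # sum of K^{-1}_{p q} alpha_p beta_q
print(np.isclose(lhs, rhs))       # True
```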
Example
For SU(3), we calculated K(H_{1 3}, H_{1 3}) = K(H_{2 3}, H_{2 3}) = 12 and K(H_{1 3}, H_{2 3}) = 6 for the Killing form on little h R (spanned by e_1 = H_{1 3} and e_2 = H_{2 3}). So, as a matrix K_{i j} = K(e_i, e_j), we get K to be the 2 by 2 matrix with rows (12, 6) and (6, 12).
On the dual space little h R dual, we therefore get K star to be one over 108 times the matrix with rows (12, minus 6) and (minus 6, 12). Let e_1 flat and e_2 flat be the dual basis of little h R dual. Then each e_i flat has length one third, and e_1 flat dot e_2 flat equals minus 6 over 108, so if phi is the angle between these two vectors then cos phi equals minus 6 times 9 over 108, which equals minus a half, and phi is 120 degrees.
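These numbers are easy to verify numerically (my own check of the computation above):

```python
import numpy as np

K = np.array([[12., 6.], [6., 12.]])
Kstar = np.linalg.inv(K)               # (1/108) times rows (12, -6), (-6, 12)

len1 = np.sqrt(Kstar[0, 0])            # length of e_1 flat: 1/3
len2 = np.sqrt(Kstar[1, 1])            # length of e_2 flat: 1/3
cos_phi = Kstar[0, 1] / (len1 * len2)  # -1/2
print(np.degrees(np.arccos(cos_phi)))  # approximately 120 degrees
```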
This is why I was drawing my k and ℓ axes at 120 degrees to one another in SU(3) weight diagrams.