@tobydriscoll
Created August 30, 2016 21:25
TB-Lecture-02-julia
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Lecture 2: Orthogonal vectors and matrices"
]
},
{
"cell_type": "markdown",
"metadata": {
"collapsed": true
},
"source": [
"With real vectors and matrices, the transpose operation is simple and familiar. It also happens to correspond to what we call the **adjoint** mathematically. In the complex case, one must also conjugate the entries to keep the mathematical structure intact. We call this operation the **Hermitian adjoint** (or conjugate transpose) of a matrix and use a star superscript for it."
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"2x4 Array{Complex{Float64},2}:\n",
" 0.840791+0.839765im 0.505249+0.883819im … 0.583826+0.634644im\n",
" 0.971119+0.123017im 0.119297+0.765509im 0.0494804+0.222855im"
]
},
"execution_count": 1,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"A = rand(2,4) + 1im*rand(2,4)"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"4x2 Array{Complex{Float64},2}:\n",
" 0.840791-0.839765im 0.971119-0.123017im\n",
" 0.505249-0.883819im 0.119297-0.765509im\n",
" 0.0546813-0.662234im 0.715531-0.348226im\n",
" 0.583826-0.634644im 0.0494804-0.222855im"
]
},
"execution_count": 2,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"Aadjoint = A'"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"To get the plain transpose, without conjugation, use the `.'` operator."
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"4x2 Array{Complex{Float64},2}:\n",
" 0.840791+0.839765im 0.971119+0.123017im\n",
" 0.505249+0.883819im 0.119297+0.765509im\n",
" 0.0546813+0.662234im 0.715531+0.348226im\n",
" 0.583826+0.634644im 0.0494804+0.222855im"
]
},
"execution_count": 3,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"Atrans = A.'"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Inner products"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"If **u** and **v** are column vectors of the same length, then their **inner product** is $\\mathbf{u}^*\\mathbf{v}$. The result is a scalar. (In Julia, though, the result is a 1-element array rather than a true scalar.)"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"1-element Array{Complex{Int64},1}:\n",
" -2-3im"
]
},
"execution_count": 4,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"u = [ 4; -1; 2+2im ]\n",
"v = [ -1; 1im; 1 ]\n",
"innerprod = u'*v"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The inner product has geometric significance. It is used to define length through the 2-norm, since $\\|\\mathbf{u}\\|_2^2 = \\mathbf{u}^*\\mathbf{u}$."
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"1-element Array{Complex{Int64},1}:\n",
" 25+0im"
]
},
"execution_count": 5,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"length_u_squared = u'*u"
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"25.0"
]
},
"execution_count": 6,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"sum( abs(u).^2 )"
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"5.0"
]
},
"execution_count": 7,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"norm_u = norm(u)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"It also defines the angle between two vectors as a generalization of the familiar dot product."
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"1-element Array{Complex{Float64},1}:\n",
" -0.23094-0.34641im"
]
},
"execution_count": 8,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"cos_theta = (u'*v) / ( norm(u)*norm(v) )"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The angle may be complex when the vectors are complex! "
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"1-element Array{Complex{Float64},1}:\n",
" 1.79019+0.34786im"
]
},
"execution_count": 9,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"theta = acos(cos_theta)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The operations of inverse and Hermitian adjoint commute."
]
},
{
"cell_type": "code",
"execution_count": 10,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"4x4 Array{Complex{Float64},2}:\n",
" 0.0886673+0.982318im -0.490957-1.13907im … 1.2432-0.145164im\n",
" -1.07021-1.818im 1.13836+1.15101im -0.139613+0.474516im\n",
" -0.436714+0.051986im 0.352478+0.521735im 0.793142+0.407391im\n",
" 0.882468+1.38138im -0.467233-0.833646im -0.954656-0.650509im"
]
},
"execution_count": 10,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"A = rand(4,4)+1im*rand(4,4); (inv(A))'"
]
},
{
"cell_type": "code",
"execution_count": 11,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"4x4 Array{Complex{Float64},2}:\n",
" 0.0886673+0.982318im -0.490957-1.13907im … 1.2432-0.145164im\n",
" -1.07021-1.818im 1.13836+1.15101im -0.139613+0.474516im\n",
" -0.436714+0.051986im 0.352478+0.521735im 0.793142+0.407391im\n",
" 0.882468+1.38138im -0.467233-0.833646im -0.954656-0.650509im"
]
},
"execution_count": 11,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"inv(A')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"So we just write $\\mathbf{A}^{-*}$ for either case. "
]
},
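{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a quick numerical check (an added illustration), the two results above should agree to roundoff:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"norm( (inv(A))' - inv(A') )"
]
},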
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Orthogonality"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Orthogonality, which is the multidimensional extension of perpendicularity, means that $\\cos \\theta=0$, i.e., that the inner product between vectors is zero. A collection of vectors is orthogonal if they are all pairwise orthogonal. \n",
"\n",
"Don't worry about how we are creating the vectors here for now."
]
},
{
"cell_type": "code",
"execution_count": 12,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"5x3 Array{Float64,2}:\n",
" -0.103283 -0.0246192 0.792393\n",
" -0.33304 -0.475506 -0.348076\n",
" -0.629735 0.364999 0.291983\n",
" -0.13881 -0.797678 0.295044\n",
" -0.680135 0.061426 -0.28045 "
]
},
"execution_count": 12,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"Q = full( qrfact(rand(5,3))[:Q] )"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Since $\\mathbf{Q}^*\\mathbf{Q}$ is a matrix of all inner products between columns of $\\mathbf{Q}$, those columns are orthogonal if and only if that matrix is diagonal."
]
},
{
"cell_type": "code",
"execution_count": 13,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"3x3 Array{Float64,2}:\n",
" 1.0 -6.40858e-17 4.16557e-17\n",
" -6.40858e-17 1.0 7.12044e-17\n",
" 4.16557e-17 7.12044e-17 1.0 "
]
},
"execution_count": 13,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"QhQ = Q'*Q"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"In fact we have a stronger condition here: the columns are **orthonormal**, meaning that they are orthogonal and each has 2-norm equal to 1. "
]
},
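{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a quick check (an added illustration), we can confirm the normalization by computing the 2-norm of each column of $\\mathbf{Q}$; each should equal 1."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"[ norm(Q[:,j]) for j = 1:3 ]"
]
},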
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Given any other vector of length 5, we can compute its inner product with each of the columns of $\\mathbf{Q}$. "
]
},
{
"cell_type": "code",
"execution_count": 14,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"3x1 Array{Float64,2}:\n",
" -1.11399 \n",
" -0.901804 \n",
" 0.00955111"
]
},
"execution_count": 14,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"u = rand(5,1); c = Q'*u"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We can then use these coefficients to find a vector in the column space of $\\mathbf{Q}$."
]
},
{
"cell_type": "code",
"execution_count": 15,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"5x1 Array{Float64,2}:\n",
" 0.144826\n",
" 0.796491\n",
" 0.375148\n",
" 0.8768 \n",
" 0.699589"
]
},
"execution_count": 15,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"v = Q*c"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"As explained in the text, $\\mathbf{r} = \\mathbf{u}-\\mathbf{v}$ is orthogonal to all of the columns of $\\mathbf{Q}$."
]
},
{
"cell_type": "code",
"execution_count": 16,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"3x1 Array{Float64,2}:\n",
" -2.24722e-16\n",
" 7.41632e-17\n",
" 1.13865e-16"
]
},
"execution_count": 16,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"r = u-v; Q'*r"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Consequently, we have decomposed $\\mathbf{u}=\\mathbf{v}+\\mathbf{r}$ into the sum of two orthogonal parts: $\\mathbf{v}$, lying in the range of $\\mathbf{Q}$, and $\\mathbf{r}$, orthogonal to that range. "
]
},
{
"cell_type": "code",
"execution_count": 17,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"1x1 Array{Float64,2}:\n",
" 1.90887e-16"
]
},
"execution_count": 17,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"v'*r"
]
},
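{
"cell_type": "markdown",
"metadata": {},
"source": [
"One consequence of the orthogonality (an added check) is the Pythagorean identity $\\|\\mathbf{u}\\|^2 = \\|\\mathbf{v}\\|^2 + \\|\\mathbf{r}\\|^2$; the difference below should be zero up to roundoff."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"norm(u)^2 - ( norm(v)^2 + norm(r)^2 )"
]
},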
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Unitary matrices"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We just saw that a matrix whose columns are orthonormal is pretty special. It becomes even more special if the matrix is also square, in which case we call it **unitary**. (In the real case, such matrices are confusingly called _orthogonal_. Ugh.) Say $\\mathbf{Q}$ is unitary and $m\\times m$. Then $\\mathbf{Q}^*\\mathbf{Q}$ is the $m\\times m$ identity matrix---that is, $\\mathbf{Q}^*=\\mathbf{Q}^{-1}$! It can't get much easier in terms of finding the inverse of a matrix. "
]
},
{
"cell_type": "code",
"execution_count": 18,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"5x5 Array{Float64,2}:\n",
" 2.77556e-16 5.55112e-17 1.57009e-16 3.10317e-16 1.75542e-16\n",
" 1.24127e-16 2.48253e-16 3.04364e-16 1.41271e-16 1.24127e-16\n",
" 3.88578e-16 2.77556e-16 3.60822e-16 1.57009e-16 1.49468e-16\n",
" 1.77858e-16 1.19696e-16 2.22045e-16 1.24127e-16 8.32667e-17\n",
" 2.48253e-16 1.75542e-16 1.57009e-16 2.65135e-16 1.19834e-16"
]
},
"execution_count": 18,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"Q = full( qrfact(rand(5,5)+1im*rand(5,5))[:Q] );\n",
"abs( inv(Q) - Q' )"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The rank of $\\mathbf{Q}$ is $m$, so continuing the discussion above, the original vector $\\mathbf{u}$ lies in its column space. Hence the remainder $\\mathbf{r}=\\boldsymbol{0}$, up to roundoff. "
]
},
{
"cell_type": "code",
"execution_count": 19,
"metadata": {
"collapsed": false
},
"outputs": [
{
"data": {
"text/plain": [
"5x1 Array{Complex{Float64},2}:\n",
" -1.94289e-16-7.63278e-17im\n",
" 0.0+1.38778e-16im\n",
" -2.77556e-16+2.35922e-16im\n",
" -2.22045e-16-3.33067e-16im\n",
" -2.22045e-16+1.11022e-16im"
]
},
"execution_count": 19,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"c = Q'*u; \n",
"v = Q*c;\n",
"r = u - v"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This is another way to arrive at a fact we already knew: Multiplication by $\\mathbf{Q}^*=\\mathbf{Q}^{-1}$ changes the basis to the columns of $\\mathbf{Q}$."
]
}
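,
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Because $\\mathbf{Q}^*\\mathbf{Q}=\\mathbf{I}$, this change of basis preserves inner products, and hence the 2-norm; as an added check, the difference below should vanish up to roundoff."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"norm(Q'*u) - norm(u)"
]
}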
],
"metadata": {
"kernelspec": {
"display_name": "Julia 0.4.3",
"language": "julia",
"name": "julia-0.4"
},
"language_info": {
"file_extension": ".jl",
"mimetype": "application/julia",
"name": "julia",
"version": "0.4.3"
}
},
"nbformat": 4,
"nbformat_minor": 0
}