
Tuesday, November 16, 2021

Symmetric and Anti-symmetric Components

 

At various times in my math studies, I came across the following very similar-looking equations.
For any complex number $z$, we can split it into a symmetric and an antisymmetric part:
$$z = \underbrace{\frac{z + \bar{z}}{2}}_{=:z_1} + \underbrace{\frac{z - \bar{z}}{2}}_{=:z_2}, \qquad \overline{z_1} = z_1, \quad \overline{z_2} = -z_2$$
For any square matrix $A$, we can split it into a symmetric and an antisymmetric part:
$$A = \underbrace{\frac{A + A^T}{2}}_{=:A_1} + \underbrace{\frac{A - A^T}{2}}_{=:A_2}, \qquad A_1^T = A_1, \quad A_2^T = -A_2$$
For any polynomial $f$, we can split it into an odd and an even part:
$$f(x) = \underbrace{\frac{f(x) + f(-x)}{2}}_{=:f_1(x)} + \underbrace{\frac{f(x) - f(-x)}{2}}_{=:f_2(x)}, \qquad f_1(-x) = f_1(x), \quad f_2(-x) = -f_2(x)$$
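These splittings are easy to check directly. Here is a minimal numerical sketch of the matrix case; the particular matrix, the variable names, and the use of NumPy are my own illustrative choices, not part of the post.

```python
import numpy as np

# Split a square matrix into its symmetric and antisymmetric parts
# and confirm the defining identities above.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 9.0]])

A1 = (A + A.T) / 2   # symmetric part:      A1.T == A1
A2 = (A - A.T) / 2   # antisymmetric part:  A2.T == -A2

assert np.allclose(A1.T, A1)
assert np.allclose(A2.T, -A2)
assert np.allclose(A1 + A2, A)   # the two parts sum back to A
```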
This prompts us to ask ourselves: is there a generalization of this?

Consider a vector space $V$. Let $g : V \to V$ be a function such that
$$\begin{aligned} g(x_1 + x_2) &= g(x_1) + g(x_2) && \text{(linearity)} \\ g(cx) &= c\,g(x) && \text{(linearity)} \\ g^2(x) &= x \end{aligned}$$
(For the complex-conjugation example above, linearity holds once $\mathbb{C}$ is viewed as a vector space over $\mathbb{R}$.) Then, given any $x \in V$, it can be split into a $g$-symmetric and a $g$-antisymmetric component as follows:
$$x = \underbrace{\frac{x + g(x)}{2}}_{=:x_1} + \underbrace{\frac{x - g(x)}{2}}_{=:x_2}$$
We can verify that
$$\begin{aligned} g(x_1) &= g\left(\frac{x + g(x)}{2}\right) = \frac{g(x) + g^2(x)}{2} = \frac{g(x) + x}{2} = x_1 \\ g(x_2) &= g\left(\frac{x - g(x)}{2}\right) = \frac{g(x) - g^2(x)}{2} = \frac{g(x) - x}{2} = -x_2 \end{aligned}$$
Not only that, this decomposition is unique. If not, let $x = x_S + x_A = \tilde{x}_S + \tilde{x}_A$ be two such representations. Then
$$0 = \underbrace{(x_S - \tilde{x}_S)}_{=:y_S} + \underbrace{(x_A - \tilde{x}_A)}_{=:y_A}$$
Note that
$$\begin{aligned} g(y_S) &= g(x_S - \tilde{x}_S) = g(x_S) - g(\tilde{x}_S) = x_S - \tilde{x}_S = y_S \\ g(y_A) &= g(x_A - \tilde{x}_A) = g(x_A) - g(\tilde{x}_A) = -x_A + \tilde{x}_A = -y_A \end{aligned}$$
Thus, $y_S$ is $g$-symmetric and $y_A$ is $g$-antisymmetric.
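As a quick sanity check of this general construction, here is a small Python sketch. The function name `g_decompose` and the NumPy-based examples are assumptions of mine for illustration; the decomposition itself is exactly the formula above.

```python
import numpy as np

def g_decompose(x, g):
    """Split x into its g-symmetric and g-antisymmetric parts,
    assuming g is additive and satisfies g(g(x)) == x."""
    x1 = (x + g(x)) / 2   # g-symmetric:      g(x1) == x1
    x2 = (x - g(x)) / 2   # g-antisymmetric:  g(x2) == -x2
    return x1, x2

# The three opening examples are instances of the same construction:
z1, z2 = g_decompose(3 + 4j, np.conj)             # conjugation: 3.0 and 4j
A = np.arange(9.0).reshape(3, 3)
A1, A2 = g_decompose(A, np.transpose)             # symmetric / antisymmetric matrices
f = np.poly1d([1.0, 1.0, 1.0, 1.0])               # f(x) = x^3 + x^2 + x + 1
f1, f2 = g_decompose(f, lambda p: p(np.poly1d([-1.0, 0.0])))  # x -> -x: even / odd parts
```

In each case the two returned pieces sum back to the input, and applying `g` fixes the first piece while negating the second.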

Now we have
$$\begin{aligned} 0 &= y_S + y_A \\ 0 &= g(0) = g(y_S) + g(y_A) \\ 0 &= y_S - y_A \end{aligned}$$
The first and third equations above imply $y_A = 0$ and hence $y_S = 0$. Thus
$$x_S = \tilde{x}_S, \qquad x_A = \tilde{x}_A$$
and the representation is unique.
