Hypercomplex Numbers $$ \newcommand{\V}[1]{\mathbf{#1}} \newcommand{\M}[1]{\mathbf{#1}} \newcommand{\I}[1]{#1} \newcommand{\K}[1]{#1} \newcommand{\A}[1]{\mathbf{#1}} \newcommand{\scalars}[2][]{\K{#2}\I{#1}} \newcommand{\versors}[2][]{\A{#2}\I{#1}} \newcommand{\xs}[1]{\scalars[^{#1}]{x}} \newcommand{\ys}[1]{\scalars[^{#1}]{y}} \newcommand{\zs}[1]{\scalars[^{#1}]{z}} \newcommand{\es}[1]{\versors[_{#1}]{e}} \newcommand{\fs}[1]{\versors[_{#1}]{f}} \newcommand{\Xs}[2]{\scalars[_{#1}^{#2}]{X}} \newcommand{\Ys}[2]{\scalars[_{#1}^{#2}]{Y}} \newcommand{\Zs}[2]{\scalars[_{#1}^{#2}]{Z}} \newcommand{\Cs}[3]{\scalars[_{#1#2}^{#3}]{C}} $$
The theory of hypercomplex numbers generalizes the previously discussed families of complex and quaternion numbers.
Preliminaries
This page uses tensors in Einstein notation.
Definitions
(Note: The following definitions are based on Catoni et al.)
Elements
An $(n+1)$-dimensional hypercomplex number is given by the expression:
$$ \A{x} = \es{\alpha}\xs{\alpha} = \sum _{k=0}^{n} \es{k}\xs{k} = \es{0}\xs{0} + \es{1}\xs{1} + \dots + \es{n}\xs{n} $$
where $\xs{\alpha} \in \mathbb{R}$ are called components and $\es{\alpha} \notin \mathbb{R}$ are called versors (sometimes units or bases) as in vector algebra.
Operations
As in vector algebra, the product of two hypercomplex numbers is determined once the products between the versors are specified. The multiplication rule for hypercomplex numbers is given by:
$$ \es{\alpha}\es{\beta} = \es{\gamma}\Cs{\alpha}{\beta}{\gamma} $$
where the constants $C_{\alpha\beta}^{\gamma} \in \mathbb{R}$ are called structure constants and define the characteristics of the system.
Let us consider the product $\A{z}$ of two hypercomplex numbers $\A{x}$ and $\A{y}$. From the definition:
$$ \begin{align*} \A{x} &= \es{\alpha}\xs{\alpha} \\ \A{y} &= \es{\beta}\ys{\beta} \\ \A{z} &= \es{\gamma}\zs{\gamma} \\ \end{align*} $$
Therefore, since scalars commute with versors:
$$ \es{\gamma}\zs{\gamma} = \es{\alpha}\xs{\alpha} \; \es{\beta}\ys{\beta} = \es{\alpha}\es{\beta} \; \xs{\alpha}\ys{\beta} = \es{\gamma}\Cs{\alpha}{\beta}{\gamma} \; \xs{\alpha}\ys{\beta} $$
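As a concrete sketch of this contraction (the helper name and the convention of storing $\Cs{\alpha}{\beta}{\gamma}$ as a NumPy array `C[a, b, g]` are assumptions, not from the text), the component formula $\zs{\gamma} = \Cs{\alpha}{\beta}{\gamma}\xs{\alpha}\ys{\beta}$ is a single `einsum`:

```python
import numpy as np

def hypercomplex_product(C, x, y):
    """Compute z^g = C_ab^g x^a y^b via Einstein summation.

    C has shape (n+1, n+1, n+1), indexed as C[a, b, g] for C_ab^g;
    x and y are component vectors of shape (n+1,).
    """
    return np.einsum("abg,a,b->g", C, x, y)

# Structure constants of the ordinary complex numbers (e0 = 1, e1 = i),
# i.e. the k = 1 member of the family discussed later on this page.
C = np.zeros((2, 2, 2))
C[0, 0, 0] = 1.0   # e0 e0 = e0
C[0, 1, 1] = 1.0   # e0 e1 = e1
C[1, 0, 1] = 1.0   # e1 e0 = e1
C[1, 1, 0] = -1.0  # e1 e1 = -e0

z = hypercomplex_product(C, np.array([1.0, 2.0]), np.array([3.0, 4.0]))
print(z)  # (1 + 2i)(3 + 4i) = -5 + 10i  ->  [-5. 10.]
```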
Given the structure constants, we can tabulate the products $\es{\alpha}\es{\beta} = \es{\gamma}\Cs{\alpha}{\beta}{\gamma}$ for every pair of versors to produce the corresponding multiplication table.
Characteristics
The system of hypercomplex numbers is commutative if the structure constants satisfy the relations:
$$ \Cs{\alpha}{\beta}{\gamma} = \Cs{\beta}{\alpha}{\gamma}, \forall (\alpha, \beta, \gamma) $$
The system of hypercomplex numbers is associative if the structure constants satisfy the relations:
$$ \Cs{\gamma}{\delta}{\epsilon}\Cs{\alpha}{\beta}{\gamma} = \Cs{\alpha}{\gamma}{\epsilon}\Cs{\beta}{\delta}{\gamma}, \forall (\alpha, \beta, \delta, \epsilon) $$
The system of hypercomplex numbers is anti-commutative if the structure constants satisfy the relations:
$$ \Cs{\alpha}{\beta}{\gamma} = -\Cs{\beta}{\alpha}{\gamma}, \forall (\alpha, \beta, \gamma) $$
The system of hypercomplex numbers is anti-associative if the structure constants satisfy the relations:
$$ \Cs{\gamma}{\delta}{\epsilon}\Cs{\alpha}{\beta}{\gamma} = -\Cs{\alpha}{\gamma}{\epsilon}\Cs{\beta}{\delta}{\gamma}, \forall (\alpha, \beta, \delta, \epsilon) $$
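These conditions are pointwise identities on the structure constants, so they can be checked mechanically. A minimal NumPy sketch (the helper names and the `C[a, b, g]` storage convention are assumptions):

```python
import numpy as np

def is_commutative(C, tol=1e-12):
    """C_ab^g == C_ba^g for all a, b, g."""
    return bool(np.allclose(C, np.swapaxes(C, 0, 1), atol=tol))

def is_anticommutative(C, tol=1e-12):
    """C_ab^g == -C_ba^g for all a, b, g."""
    return bool(np.allclose(C, -np.swapaxes(C, 0, 1), atol=tol))

def is_associative(C, tol=1e-12):
    """C_gd^e C_ab^g == C_ag^e C_bd^g (g summed) for all a, b, d, e."""
    lhs = np.einsum("gde,abg->abde", C, C)
    rhs = np.einsum("age,bdg->abde", C, C)
    return bool(np.allclose(lhs, rhs, atol=tol))

# Complex numbers (e1 e1 = -e0): commutative and associative.
C = np.zeros((2, 2, 2))
C[0, 0, 0] = C[0, 1, 1] = C[1, 0, 1] = 1.0
C[1, 1, 0] = -1.0
print(is_commutative(C), is_associative(C))  # True True
```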
Matrix Representations
Recall the product of two hypercomplex numbers:
$$ \es{\gamma}\zs{\gamma} = \es{\alpha}\xs{\alpha} \; \es{\beta}\ys{\beta} = \es{\gamma}\Cs{\alpha}{\beta}{\gamma} \; \xs{\alpha}\ys{\beta} $$
We introduce the rank two tensor $\Xs{\beta}{\gamma}$:
$$ X_{\beta}^{\gamma} = \Cs{\alpha}{\beta}{\gamma}\xs{\alpha} $$
Then the components of $\zs{\gamma} $ can be expressed in terms of $X_{\beta}^{\gamma}$ and $\ys{\beta}$:
$$ \zs{\gamma} = \Xs{\beta}{\gamma}\ys{\beta} $$
The combination of the structure constants $\A{C}$ and the hypercomplex number $\A{x}$ can be represented as a matrix. The hypercomplex numbers $\A{y}$ and $\A{z}$ can be represented as column vectors. In this way, the product of two hypercomplex numbers is equivalent to a matrix-vector product.
$$ \begin{bmatrix} \zs{0} \\ \zs{1} \\ \vdots \\ \zs{n} \\ \end{bmatrix} = \begin{bmatrix} \Xs{0}{0} & \Xs{1}{0} & \cdots & \Xs{n}{0} \\ \Xs{0}{1} & \Xs{1}{1} & \cdots & \Xs{n}{1} \\ \vdots & \vdots & \ddots & \vdots \\ \Xs{0}{n} & \Xs{1}{n} & \cdots & \Xs{n}{n}\\ \end{bmatrix} \begin{bmatrix} \ys{0} \\ \ys{1} \\ \vdots \\ \ys{n} \\ \end{bmatrix} = \begin{bmatrix} \Xs{0}{0}\ys{0} + \Xs{1}{0}\ys{1} + \dots + \Xs{n}{0}\ys{n} \\ \Xs{0}{1}\ys{0} + \Xs{1}{1}\ys{1} + \dots + \Xs{n}{1}\ys{n} \\ \vdots \\ \Xs{0}{n}\ys{0} + \Xs{1}{n}\ys{1} + \dots + \Xs{n}{n}\ys{n} \\ \end{bmatrix} $$
Further, all the associative hypercomplex numbers can be represented by a characteristic matrix and their product by a matrix-matrix product.
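A sketch of this construction in NumPy (the helper name is hypothetical; `C[a, b, g]` stores $\Cs{\alpha}{\beta}{\gamma}$), including the multiplicativity check for an associative system:

```python
import numpy as np

def char_matrix(C, x):
    """X^g_b = C_ab^g x^a, so that z^g = X^g_b y^b is just X @ y."""
    return np.einsum("abg,a->gb", C, x)

# Complex numbers again: e0 = 1, e1 = i.
C = np.zeros((2, 2, 2))
C[0, 0, 0] = C[0, 1, 1] = C[1, 0, 1] = 1.0
C[1, 1, 0] = -1.0

x = np.array([1.0, 2.0])
y = np.array([3.0, 4.0])
X = char_matrix(C, x)
z = X @ y
print(X)  # [[ 1. -2.]
          #  [ 2.  1.]]
print(z)  # [-5. 10.]

# For an associative system the representation is multiplicative:
# X(x) X(y) == X(x y), so products become matrix-matrix products.
assert np.allclose(X @ char_matrix(C, y), char_matrix(C, z))
```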
Examples
Complex Number Family
The complex number family can be viewed as 2-dimensional hypercomplex numbers.
$$ \A{x} = \es{0}\xs{0} + \es{1}\xs{1} $$
Take the structure constants $\Cs{\alpha}{\beta}{\gamma}$ with the following non-zero elements:
$$ \begin{align*} \Cs{0}{0}{0} &= 1 & \Cs{0}{1}{1} &= 1 & \Cs{1}{0}{1} &= 1 & \Cs{1}{1}{0} &= -k \\ \end{align*} $$
The resulting multiplication table is:
$$ \begin{array}{c|cc} & \es{0} & \es{1} \\ \hline \es{0} & \es{0} & \es{1} \\ \es{1} & \es{1} & -k\es{0} \\ \end{array} $$
Based solely on the structure constants, we can characterize these hypercomplex numbers. Checking the relations above shows that they are commutative and associative, in agreement with the complex number family.
Indeed, if we set $\es{0} = 1$ and $\es{1} = \mathbf{\kappa}$ we recover the familiar multiplication table for the unified complex number family.
The resulting matrix representation is:
$$ \A{X}(\A{x}) = \begin{bmatrix} \Xs{0}{0} & \Xs{1}{0} \\ \Xs{0}{1} & \Xs{1}{1} \\ \end{bmatrix} = \begin{bmatrix} \Cs{0}{0}{0}\xs{0} + \Cs{1}{0}{0}\xs{1} & \Cs{0}{1}{0}\xs{0} + \Cs{1}{1}{0}\xs{1} \\ \Cs{0}{0}{1}\xs{0} + \Cs{1}{0}{1}\xs{1} & \Cs{0}{1}{1}\xs{0} + \Cs{1}{1}{1}\xs{1} \\ \end{bmatrix} = \begin{bmatrix} \xs{0} & -k\xs{1} \\ \xs{1} & \xs{0} \\ \end{bmatrix} $$
Let's verify the multiplication:
$$ \begin{bmatrix} \zs{0} \\ \zs{1} \\ \end{bmatrix} = \begin{bmatrix} \xs{0} & -k\xs{1} \\ \xs{1} & \xs{0} \\ \end{bmatrix} \begin{bmatrix} \ys{0} \\ \ys{1} \\ \end{bmatrix} = \begin{bmatrix} \xs{0}\ys{0} - k\xs{1}\ys{1}\\ \xs{1}\ys{0} + \xs{0}\ys{1}\\ \end{bmatrix} $$
You may recognize this as the multiplication rule of the unified complex number family.
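The verified rule can be wrapped into a small routine (the name `mul` is mine; I am assuming the usual convention in which $k = 1$, $0$, and $-1$ select the elliptic, parabolic, and hyperbolic members of the family):

```python
def mul(x, y, k):
    """Multiply x = (x0, x1) and y = (y0, y1) with e1 e1 = -k e0.

    k = 1: ordinary complex, k = 0: dual, k = -1: split-complex numbers.
    """
    x0, x1 = x
    y0, y1 = y
    return (x0 * y0 - k * x1 * y1, x1 * y0 + x0 * y1)

# k = 1 reproduces ordinary complex multiplication:
print(mul((1.0, 2.0), (3.0, 4.0), k=1))  # (-5.0, 10.0)
print(complex(1, 2) * complex(3, 4))     # (-5+10j)
```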
Vectors in $\mathbb{R}^{3}$
The vectors in $\mathbb{R}^{3}$ can be viewed as 3-dimensional hypercomplex numbers.
$$ \A{x} = \es{0}\xs{0} + \es{1}\xs{1} + \es{2}\xs{2} $$
Take the structure constants $\Cs{\alpha}{\beta}{\gamma}$ with the following non-zero elements:
$$ \begin{align*} \Cs{0}{1}{2} &= 1 & \Cs{0}{2}{1} &= -1 & \Cs{1}{2}{0} &= 1 & \Cs{1}{0}{2} &= -1 & \Cs{2}{0}{1} &= 1 & \Cs{2}{1}{0} &= -1 \\ \end{align*} $$
The resulting multiplication table is:
$$ \begin{array}{c|ccc} & \es{0} & \es{1} & \es{2} \\ \hline \es{0} & 0 & \es{2} & -\es{1} \\ \es{1} & -\es{2} & 0 & \es{0} \\ \es{2} & \es{1} & -\es{0} & 0 \\ \end{array} $$
Based solely on the structure constants, we can characterize these hypercomplex numbers. Checking the relations above shows that they are neither commutative nor associative; they are, however, anti-commutative.
The resulting matrix representation is:
$$ \A{X}(\A{x}) = \begin{bmatrix} \Xs{0}{0} & \Xs{1}{0} & \Xs{2}{0} \\ \Xs{0}{1} & \Xs{1}{1} & \Xs{2}{1} \\ \Xs{0}{2} & \Xs{1}{2} & \Xs{2}{2} \\ \end{bmatrix} = \begin{bmatrix} 0 & -\xs{2} & \xs{1} \\ \xs{2} & 0 & -\xs{0} \\ -\xs{1} & \xs{0} & 0 \\ \end{bmatrix} $$
Can you already tell what the multiplication in these hypercomplex numbers corresponds to?
Let's verify the multiplication:
$$ \begin{bmatrix} \zs{0} \\ \zs{1} \\ \zs{2} \\ \end{bmatrix} = \begin{bmatrix} 0 & -\xs{2} & \xs{1} \\ \xs{2} & 0 & -\xs{0} \\ -\xs{1} & \xs{0} & 0 \\ \end{bmatrix} \begin{bmatrix} \ys{0} \\ \ys{1} \\ \ys{2} \\ \end{bmatrix} = \begin{bmatrix} \xs{1}\ys{2} - \xs{2}\ys{1}\\ \xs{2}\ys{0} - \xs{0}\ys{2}\\ \xs{0}\ys{1} - \xs{1}\ys{0}\\ \end{bmatrix} $$
You may recognize this as the 3-dimensional cross product.
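A quick numerical sketch (NumPy; the variable names are mine) confirms that these structure constants, which are exactly the Levi-Civita symbol, reproduce `np.cross`:

```python
import numpy as np

# C[a, b, g] is the Levi-Civita symbol eps_abg: the structure
# constants listed above for vectors in R^3.
C = np.zeros((3, 3, 3))
for a, b, g in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    C[a, b, g] = 1.0
    C[b, a, g] = -1.0

x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 5.0, 6.0])
z = np.einsum("abg,a,b->g", C, x, y)
print(z)               # [-3.  6. -3.]
print(np.cross(x, y))  # [-3.  6. -3.]
```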
Quaternion Family
The quaternion family can be viewed as 4-dimensional hypercomplex numbers.
$$ \A{x} = \es{0}\xs{0} + \es{1}\xs{1} + \es{2}\xs{2} + \es{3}\xs{3} $$
Deriving the structure constants for the unified quaternion family is left as an exercise for the reader.