Hypercomplex Numbers

The theory of hypercomplex numbers generalizes the previously discussed families of complex and quaternion numbers.

Preliminaries

This page uses tensors in Einstein notation.

Definitions

(Note: The following definitions are based on Catoni et al.)

Elements

An $(n+1)$-dimensional hypercomplex number is given by the expression:

$$ \A{x} = \es{\alpha}\xs{\alpha} = \sum _{k=0}^{n} \es{k}\xs{k} = \es{0}\xs{0} + \es{1}\xs{1} + \dots + \es{n}\xs{n} $$

where $\xs{\alpha} \in \mathbb{R}$ are called components and $\es{\alpha} \notin \mathbb{R}$ are called versors (sometimes units or bases) as in vector algebra.

Operations

As in vector algebra, the product of two hypercomplex numbers is defined as soon as the product of the versors is defined. The multiplication rule for hypercomplex numbers is given by:

$$ \es{\alpha}\es{\beta} = \es{\gamma}\Cs{\alpha}{\beta}{\gamma} $$

where the constants $\Cs{\alpha}{\beta}{\gamma} \in \mathbb{R}$ are called structure constants and define the characteristics of the system.

Let us consider the product $\A{z}$ of two hypercomplex numbers $\A{x}$ and $\A{y}$. From the definition:

$$ \begin{align*} \A{x} &= \es{\alpha}\xs{\alpha} \\ \A{y} &= \es{\beta}\ys{\beta} \\ \A{z} &= \es{\gamma}\zs{\gamma} \\ \end{align*} $$

Therefore, since scalars commute with versors:

$$ \es{\gamma}\zs{\gamma} = \es{\alpha}\xs{\alpha} \; \es{\beta}\ys{\beta} = \es{\alpha}\es{\beta} \; \xs{\alpha}\ys{\beta} = \es{\gamma}\Cs{\alpha}{\beta}{\gamma} \; \xs{\alpha}\ys{\beta} $$
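In code, this product is a single tensor contraction. A minimal sketch in NumPy, assuming the structure constants are stored as an array `C` with `C[a, b, g]` holding $\Cs{a}{b}{\gamma}$ (this layout is a convention of the example, not part of the definition):

```python
import numpy as np

def hypercomplex_product(C, x, y):
    """Multiply two hypercomplex numbers given as component vectors.

    C[a, b, g] is assumed to hold the structure constant C_ab^g, so the
    product components are z^g = C_ab^g x^a y^b (summation over a and b).
    """
    return np.einsum('abg,a,b->g', C, x, y)
```

For example, with the structure constants of the ordinary complex numbers this reproduces $(1 + 2i)(3 + 4i) = -5 + 10i$.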

Given the structure constants we can produce the corresponding multiplication table as follows:

| $\times$ | $\es{0}$ | $\es{1}$ | $\cdots$ | $\es{n}$ |
| --- | --- | --- | --- | --- |
| $\es{0}$ | $\es{\gamma}\Cs{0}{0}{\gamma}$ | $\es{\gamma}\Cs{0}{1}{\gamma}$ | $\cdots$ | $\es{\gamma}\Cs{0}{n}{\gamma}$ |
| $\es{1}$ | $\es{\gamma}\Cs{1}{0}{\gamma}$ | $\es{\gamma}\Cs{1}{1}{\gamma}$ | $\cdots$ | $\es{\gamma}\Cs{1}{n}{\gamma}$ |
| $\vdots$ | $\vdots$ | $\vdots$ | $\ddots$ | $\vdots$ |
| $\es{n}$ | $\es{\gamma}\Cs{n}{0}{\gamma}$ | $\es{\gamma}\Cs{n}{1}{\gamma}$ | $\cdots$ | $\es{\gamma}\Cs{n}{n}{\gamma}$ |
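Such a table can also be generated mechanically from the structure constants. A small sketch (the array layout `C[a, b, g]` for $\Cs{a}{b}{\gamma}$ is an assumption of this example):

```python
import numpy as np

def multiplication_table(C):
    """Render each versor product e_a e_b = C_ab^g e_g as a plain string.

    C[a, b, g] is assumed to hold the structure constant C_ab^g.
    """
    n = C.shape[0]

    def cell(a, b):
        terms = []
        for g in range(n):
            c = C[a, b, g]
            if c == 0:
                continue
            coeff = '' if c == 1 else ('-' if c == -1 else f'{c:g}*')
            terms.append(f'{coeff}e{g}')
        return ' + '.join(terms) if terms else '0'

    return [[cell(a, b) for b in range(n)] for a in range(n)]
```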

Characteristics

The system of hypercomplex numbers is commutative if the structure constants satisfy the relations:

$$ \Cs{\alpha}{\beta}{\gamma} = \Cs{\beta}{\alpha}{\gamma}, \forall (\alpha, \beta, \gamma) $$

The system of hypercomplex numbers is associative if the structure constants satisfy the relations:

$$ \Cs{\gamma}{\delta}{\epsilon}\Cs{\alpha}{\beta}{\gamma} = \Cs{\alpha}{\gamma}{\epsilon}\Cs{\beta}{\delta}{\gamma}, \forall (\alpha, \beta, \delta, \epsilon) $$

The system of hypercomplex numbers is anti-commutative if the structure constants satisfy the relations:

$$ \Cs{\alpha}{\beta}{\gamma} = -\Cs{\beta}{\alpha}{\gamma}, \forall (\alpha, \beta, \gamma) $$

The system of hypercomplex numbers is anti-associative if the structure constants satisfy the relations:

$$ \Cs{\gamma}{\delta}{\epsilon}\Cs{\alpha}{\beta}{\gamma} = -\Cs{\alpha}{\gamma}{\epsilon}\Cs{\beta}{\delta}{\gamma}, \forall (\alpha, \beta, \delta, \epsilon) $$
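Each of these conditions is a finite set of equations in the structure constants, so it can be checked numerically. A sketch, again assuming the structure constants are stored as an array with `C[a, b, g]` holding $\Cs{a}{b}{\gamma}$:

```python
import numpy as np

def is_commutative(C):
    # C_ab^g == C_ba^g for all indices
    return np.allclose(C, C.transpose(1, 0, 2))

def is_anticommutative(C):
    # C_ab^g == -C_ba^g for all indices
    return np.allclose(C, -C.transpose(1, 0, 2))

def is_associative(C):
    # C_gd^e C_ab^g == C_ag^e C_bd^g for all (a, b, d, e)
    lhs = np.einsum('gde,abg->abde', C, C)
    rhs = np.einsum('age,bdg->abde', C, C)
    return np.allclose(lhs, rhs)

def is_antiassociative(C):
    # C_gd^e C_ab^g == -C_ag^e C_bd^g for all (a, b, d, e)
    lhs = np.einsum('gde,abg->abde', C, C)
    rhs = np.einsum('age,bdg->abde', C, C)
    return np.allclose(lhs, -rhs)
```

Applied to the examples below, these checks confirm that the complex number family is commutative and associative, while the cross product on $\mathbb{R}^{3}$ is anti-commutative and not associative.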

Matrix Representations

Recall the product of two hypercomplex numbers:

$$ \es{\gamma}\zs{\gamma} = \es{\alpha}\xs{\alpha} \; \es{\beta}\ys{\beta} = \es{\gamma}\Cs{\alpha}{\beta}{\gamma} \; \xs{\alpha}\ys{\beta} $$

We introduce the rank two tensor $\Xs{\beta}{\gamma}$:

$$ \Xs{\beta}{\gamma} = \Cs{\alpha}{\beta}{\gamma}\xs{\alpha} $$

Then the components of $\zs{\gamma}$ can be expressed in terms of $\Xs{\beta}{\gamma}$ and $\ys{\beta}$:

$$ \zs{\gamma} = \Xs{\beta}{\gamma}\ys{\beta} $$

The combination of the structure constants $\A{C}$ and the hypercomplex number $\A{x}$ can be represented as a matrix. The hypercomplex numbers $\A{y}$ and $\A{z}$ can be represented as column vectors. In this way, the product of two hypercomplex numbers is equivalent to a matrix-vector product.

$$ \begin{bmatrix} \zs{0} \\ \zs{1} \\ \vdots \\ \zs{n} \\ \end{bmatrix} = \begin{bmatrix} \Xs{0}{0} & \Xs{1}{0} & \cdots & \Xs{n}{0} \\ \Xs{0}{1} & \Xs{1}{1} & \cdots & \Xs{n}{1} \\ \vdots & \vdots & \ddots & \vdots \\ \Xs{0}{n} & \Xs{1}{n} & \cdots & \Xs{n}{n}\\ \end{bmatrix} \begin{bmatrix} \ys{0} \\ \ys{1} \\ \vdots \\ \ys{n} \\ \end{bmatrix} = \begin{bmatrix} \Xs{0}{0}\ys{0} + \Xs{1}{0}\ys{1} + \dots + \Xs{n}{0}\ys{n} \\ \Xs{0}{1}\ys{0} + \Xs{1}{1}\ys{1} + \dots + \Xs{n}{1}\ys{n} \\ \vdots \\ \Xs{0}{n}\ys{0} + \Xs{1}{n}\ys{1} + \dots + \Xs{n}{n}\ys{n} \\ \end{bmatrix} $$

Further, all the associative hypercomplex numbers can be represented by a characteristic matrix and their product by a matrix-matrix product.
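As a concrete sketch (assuming, as in the earlier examples, that the structure constants are stored in an array with `C[a, b, g]` holding $\Cs{a}{b}{\gamma}$), the matrix $\Xs{\beta}{\gamma}$ can be built with a single contraction:

```python
import numpy as np

def characteristic_matrix(C, x):
    """Return the matrix X with X[g, b] = C_ab^g x^a, so that z = X @ y."""
    return np.einsum('abg,a->gb', C, x)
```

For an associative system this representation is multiplicative, i.e. `characteristic_matrix(C, x) @ characteristic_matrix(C, y)` equals the characteristic matrix of the product, which is exactly the matrix-matrix product mentioned above.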

Examples

Complex Number Family

The complex number family can be viewed as 2-dimensional hypercomplex numbers.

$$ \A{x} = \es{0}\xs{0} + \es{1}\xs{1} $$

Take the structure constants $\Cs{\alpha}{\beta}{\gamma}$ with the following non-zero elements:

$$ \begin{align*} \Cs{0}{0}{0} &= 1 & \Cs{0}{1}{1} &= 1 & \Cs{1}{0}{1} &= 1 & \Cs{1}{1}{0} &= -k \\ \end{align*} $$

The resulting multiplication table is:

| $\times$ | $\es{0}$ | $\es{1}$ |
| --- | --- | --- |
| $\es{0}$ | $\es{0}$ | $\es{1}$ |
| $\es{1}$ | $\es{1}$ | $-k\es{0}$ |

Based solely on the structure constants, we can characterize these hypercomplex numbers. Specifically, we can check the aforementioned relations to show that they are associative and commutative, in agreement with the complex number family.

Indeed, if we set $\es{0} = 1$ and $\es{1} = \mathbf{\kappa}$ we recover the familiar multiplication table for the unified complex number family.

The resulting matrix representation is:

$$ \A{X}(\A{x}) = \begin{bmatrix} \Xs{0}{0} & \Xs{1}{0} \\ \Xs{0}{1} & \Xs{1}{1} \\ \end{bmatrix} = \begin{bmatrix} \Cs{0}{0}{0}\xs{0} + \Cs{1}{0}{0}\xs{1} & \Cs{0}{1}{0}\xs{0} + \Cs{1}{1}{0}\xs{1} \\ \Cs{0}{0}{1}\xs{0} + \Cs{1}{0}{1}\xs{1} & \Cs{0}{1}{1}\xs{0} + \Cs{1}{1}{1}\xs{1} \\ \end{bmatrix} = \begin{bmatrix} \xs{0} & -k\xs{1} \\ \xs{1} & \xs{0} \\ \end{bmatrix} $$

Let's verify the multiplication:

$$ \begin{bmatrix} \zs{0} \\ \zs{1} \\ \end{bmatrix} = \begin{bmatrix} \xs{0} & -k\xs{1} \\ \xs{1} & \xs{0} \\ \end{bmatrix} \begin{bmatrix} \ys{0} \\ \ys{1} \\ \end{bmatrix} = \begin{bmatrix} \xs{0}\ys{0} - k\xs{1}\ys{1}\\ \xs{1}\ys{0} + \xs{0}\ys{1}\\ \end{bmatrix} $$

You may recognize this as the multiplication of the unified complex number family.
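As a numerical sanity check, here is a small sketch with $k = 1$, where the system reduces to the ordinary complex numbers and the result can be compared against Python's built-in `complex` type (the array layout `C[a, b, g]` for $\Cs{a}{b}{\gamma}$ is an assumption of the example):

```python
import numpy as np

# Structure constants of the complex number family with k = 1
# (ordinary complex numbers): C[a, b, g] holds C_ab^g.
k = 1
C = np.zeros((2, 2, 2))
C[0, 0, 0] = 1
C[0, 1, 1] = 1
C[1, 0, 1] = 1
C[1, 1, 0] = -k

x = np.array([1.0, 2.0])  # 1 + 2i
y = np.array([3.0, 4.0])  # 3 + 4i

# z^g = C_ab^g x^a y^b
z = np.einsum('abg,a,b->g', C, x, y)

# Compare against the built-in complex product.
w = complex(*x) * complex(*y)
assert np.allclose(z, [w.real, w.imag])  # both give -5 + 10i
```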

Vectors in $\mathbb{R}^{3}$

The vectors in $\mathbb{R}^{3}$ can be viewed as 3-dimensional hypercomplex numbers.

$$ \A{x} = \es{0}\xs{0} + \es{1}\xs{1} + \es{2}\xs{2} $$

Take the structure constants $\Cs{\alpha}{\beta}{\gamma}$ with the following non-zero elements:

$$ \begin{align*} \Cs{0}{1}{2} &= 1 & \Cs{0}{2}{1} &= -1 & \Cs{1}{2}{0} &= 1 & \Cs{1}{0}{2} &= -1 & \Cs{2}{0}{1} &= 1 & \Cs{2}{1}{0} &= -1 \\ \end{align*} $$

The resulting multiplication table is:

| $\times$ | $\es{0}$ | $\es{1}$ | $\es{2}$ |
| --- | --- | --- | --- |
| $\es{0}$ | $0$ | $\es{2}$ | $-\es{1}$ |
| $\es{1}$ | $-\es{2}$ | $0$ | $\es{0}$ |
| $\es{2}$ | $\es{1}$ | $-\es{0}$ | $0$ |

Based solely on the structure constants, we can characterize these hypercomplex numbers. Specifically, we can check the aforementioned relations to show that they are neither associative nor commutative. However, they are anti-commutative.

The resulting matrix representation is:

$$ \A{X}(\A{x}) = \begin{bmatrix} \Xs{0}{0} & \Xs{1}{0} & \Xs{2}{0} \\ \Xs{0}{1} & \Xs{1}{1} & \Xs{2}{1} \\ \Xs{0}{2} & \Xs{1}{2} & \Xs{2}{2} \\ \end{bmatrix} = \begin{bmatrix} 0 & -\xs{2} & \xs{1} \\ \xs{2} & 0 & -\xs{0} \\ -\xs{1} & \xs{0} & 0 \\ \end{bmatrix} $$

Can you already tell what the multiplication in these hypercomplex numbers corresponds to?

Let's verify the multiplication:

$$ \begin{bmatrix} \zs{0} \\ \zs{1} \\ \zs{2} \\ \end{bmatrix} = \begin{bmatrix} 0 & -\xs{2} & \xs{1} \\ \xs{2} & 0 & -\xs{0} \\ -\xs{1} & \xs{0} & 0 \\ \end{bmatrix} \begin{bmatrix} \ys{0} \\ \ys{1} \\ \ys{2} \\ \end{bmatrix} = \begin{bmatrix} \xs{1}\ys{2} - \xs{2}\ys{1}\\ \xs{2}\ys{0} - \xs{0}\ys{2}\\ \xs{0}\ys{1} - \xs{1}\ys{0}\\ \end{bmatrix} $$

You may recognize this as the 3-dimensional cross product.
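The same check can be run in code; a short sketch comparing against NumPy's built-in cross product (the array layout `C[a, b, g]` for $\Cs{a}{b}{\gamma}$ is assumed as before):

```python
import numpy as np

# Structure constants of the cross product on R^3: C[a, b, g] holds C_ab^g.
C = np.zeros((3, 3, 3))
C[0, 1, 2] = 1; C[0, 2, 1] = -1
C[1, 2, 0] = 1; C[1, 0, 2] = -1
C[2, 0, 1] = 1; C[2, 1, 0] = -1

x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 5.0, 6.0])

# z^g = C_ab^g x^a y^b
z = np.einsum('abg,a,b->g', C, x, y)
assert np.allclose(z, np.cross(x, y))  # [-3, 6, -3]
```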

Quaternion Family

The quaternion family can be viewed as 4-dimensional hypercomplex numbers.

$$ \A{x} = \es{0}\xs{0} + \es{1}\xs{1} + \es{2}\xs{2} + \es{3}\xs{3} $$

Deriving the structure constants for the unified quaternion family is left as an exercise for the reader.

Contributors: filonik, Daniel Filonik