Overview of Higher Order Singular Value Decomposition (HOSVD) and examples of algorithms and implementations.

Overview of Higher Order Singular Value Decomposition (HOSVD)

Higher Order Singular Value Decomposition (HOSVD) is a method for dimensionality reduction and data compression of tensors (multidimensional arrays of order three or higher). Whereas the ordinary SVD is applied to matrices, HOSVD generalises it to tensors.

HOSVD captures the structure of the original tensor by decomposing it into a small core tensor and a factor matrix for each mode. Specifically, HOSVD applies singular value decomposition (SVD), as described in “Overview of Singular Value Decomposition (SVD) and examples of algorithms and implementations“, to the unfolding (matricisation) of the tensor along each mode (dimension), and uses the resulting left singular matrices as the factor matrices.

Using HOSVD, the original tensor can be represented as a core tensor multiplied by a factor matrix along each mode. Truncating the ranks of the factor matrices yields a low-rank approximation of the original tensor that reduces dimensionality and compresses the data.
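In the standard notation with the mode-n product \(\times_n\), the decomposition of a third-order tensor takes the Tucker form:

\[
\mathcal{X} \approx \mathcal{S} \times_1 U^{(1)} \times_2 U^{(2)} \times_3 U^{(3)}
\]

where \(\mathcal{S}\) is the core tensor and each \(U^{(n)}\) is the column-orthogonal matrix of left singular vectors of the mode-n unfolding of \(\mathcal{X}\).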

HOSVD is useful in high-dimensional data such as images, video and audio, and has been applied to efficient data representation and feature extraction.

Algorithms related to Higher Order Singular Value Decomposition (HOSVD)

The steps of the HOSVD algorithm are described below.

1. Understanding the structure of the tensor: first, identify the order of the tensor of interest and the size of each dimension.

2. Singular value decomposition (SVD) in each mode: for each mode (dimension) of the tensor, unfold the tensor into a matrix along that mode and perform SVD on it. For a 3D tensor, this step yields a left singular matrix (U) for each of the three modes.

3. Construction of the core tensor: multiply the original tensor by the transpose of the U matrix in each mode. For a 3D tensor, this yields a core tensor that, together with the U matrices, represents the original tensor; truncating the ranks gives a low-rank tensor that approximates the original.

4. Dimensionality reduction and data compression: the truncated core tensor and factor matrices provide a more compact representation than the original tensor while efficiently capturing its information.

HOSVD effectively captures the structure of the tensor by performing a singular value decomposition for each mode, which allows for dimensionality reduction and data compression. The method has been widely applied to efficient representation and feature extraction of high-dimensional data.
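The per-mode unfolding used in step 2 can be sketched as follows; note that `unfold` is a small helper introduced here for illustration, not a NumPy function:

```python
import numpy as np

def unfold(tensor, mode):
    # Mode-n unfolding: bring the chosen mode to the front, flatten the rest
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

X = np.arange(24).reshape(2, 3, 4)
for mode in range(X.ndim):
    M = unfold(X, mode)
    U, S, Vt = np.linalg.svd(M, full_matrices=False)
    print(mode, M.shape, U.shape)  # mode 0: (2, 12) (2, 2), mode 1: (3, 8) (3, 3), ...
```

Each mode thus contributes its own left singular matrix U, whose number of rows equals the size of that mode.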

Application examples of Higher Order Singular Value Decomposition (HOSVD)

The following are examples of HOSVD applications.

1. Image processing:

Compression of high-resolution images: HOSVD is used to compress high-resolution images efficiently. A multi-dimensional image tensor can be decomposed with HOSVD and its ranks truncated to compress the image data.
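As a minimal sketch of this kind of rank truncation (the ranks here are arbitrary illustrative choices, and `truncated_hosvd` is a local helper, not a library function):

```python
import numpy as np

def unfold(tensor, mode):
    # Mode-n unfolding: bring the chosen mode to the front, flatten the rest
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def truncated_hosvd(tensor, ranks):
    # Keep only the leading ranks[mode] left singular vectors per mode
    us = [np.linalg.svd(unfold(tensor, m), full_matrices=False)[0][:, :r]
          for m, r in enumerate(ranks)]
    core = tensor
    for U in us:
        # Contracting the leading axis with U cycles the axes to the back,
        # so after ndim iterations the mode order is restored
        core = np.tensordot(core, U, axes=(0, 0))
    return core, us

# A toy "image stack": 8 greyscale images of size 16x16
images = np.random.rand(8, 16, 16)
core, us = truncated_hosvd(images, ranks=(4, 8, 8))
print(core.shape)  # (4, 8, 8): far fewer entries than the original tensor
```

Storing the small core plus the truncated U matrices in place of the full tensor is what realises the compression.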

2. Video analysis:

Video compression: HOSVD represents a video as a tensor whose modes include the temporal and spatial axes, which helps to capture and compress information along those axes. This improves the efficiency of data storage and transmission.

3. Audio processing:

Analysis of speech data: HOSVD is applied to decompose high-dimensional data, such as speech spectrograms, for feature extraction and pattern recognition, thereby enabling efficient processing and analysis of speech data.

4. Tensor data analysis:

Dimensionality reduction of tensor data: HOSVD is used to analyse data in tensor form, such as sensor data and biomedical engineering data, for dimensionality reduction and feature extraction. For example, data from multiple sensors can be represented as a single tensor, which can then be decomposed by HOSVD to facilitate understanding of the data structure.

HOSVD is widely used for analysing and compressing multidimensional data, contributing to efficient representation of data and extraction of information.

Example implementation of Higher Order Singular Value Decomposition (HOSVD).

The implementation of HOSVD involves a singular value decomposition (SVD) of the unfolding in each mode. An example implementation using Python and the NumPy library is given below.

import numpy as np

def unfold(tensor, mode):
    # Mode-n unfolding: bring the given mode to the front and flatten the rest
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def hosvd(tensor):
    # Get the order (number of modes) of the tensor
    dim = tensor.ndim
    
    # Singular value decomposition (SVD) of the unfolding in each mode
    u_matrices = []
    s_matrices = []
    for mode in range(dim):
        U, S, _ = np.linalg.svd(unfold(tensor, mode), full_matrices=False)
        u_matrices.append(U)
        s_matrices.append(np.diag(S))
    
    # Core tensor construction: contract each mode with the transpose of U.
    # Contracting the leading axis cycles the axes to the back, so after
    # dim iterations the original mode order is restored.
    core_tensor = tensor
    for mode in range(dim):
        core_tensor = np.tensordot(core_tensor, u_matrices[mode], axes=(0, 0))
    
    return core_tensor, u_matrices, s_matrices

# Tensor example
tensor = np.random.rand(3, 4, 5)

# Execution of HOSVD
core_tensor, u_matrices, s_matrices = hosvd(tensor)

# Display of results
print("Core Tensor:")
print(core_tensor)
print("\nU Matrices:")
for i, u_matrix in enumerate(u_matrices):
    print(f"Mode {i+1}:")
    print(u_matrix)
    print()
print("\nS Matrices:")
for s_matrix in s_matrices:
    print(s_matrix)
    print()

The code performs HOSVD on the given tensor and returns the core tensor together with the left singular matrix (U matrix) and singular value matrix (S matrix) for each mode. Multiplying the core tensor by the U matrices reconstructs the original tensor, and truncating the columns of the U matrices yields a low-rank approximation.
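As a quick sanity check, the untruncated decomposition should reconstruct the original tensor up to numerical error; a compact sketch (re-stating a minimal `hosvd` so the snippet is self-contained):

```python
import numpy as np

def unfold(t, mode):
    # Mode-n unfolding of the tensor
    return np.moveaxis(t, mode, 0).reshape(t.shape[mode], -1)

def hosvd(tensor):
    us = [np.linalg.svd(unfold(tensor, m), full_matrices=False)[0]
          for m in range(tensor.ndim)]
    core = tensor
    for U in us:
        core = np.tensordot(core, U, axes=(0, 0))  # multiply by U^T per mode
    return core, us

tensor = np.random.rand(3, 4, 5)
core, us = hosvd(tensor)

# Reconstruct: multiply the core by each U along the matching mode
recon = core
for U in us:
    recon = np.tensordot(recon, U, axes=(0, 1))

print(np.allclose(recon, tensor))  # True: full-rank HOSVD is exact
```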

Challenges and measures for Higher Order Singular Value Decomposition (HOSVD).

Although HOSVD is useful in many situations, some challenges also exist. The main challenges and measures to address them are described below.

1. Increased computational cost: computing the singular value decompositions is expensive, especially when the tensor order is high or when the original tensor is very large.

Solution: the computational cost can be reduced by using approximate HOSVD algorithms or by utilising distributed or parallel processing.
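One common approximation is to replace the full SVD of each unfolding with a randomised SVD, which is much cheaper when only a few leading singular vectors are needed. A minimal NumPy sketch of the idea (the parameter names and the rank 5 are illustrative choices):

```python
import numpy as np

def randomized_svd(M, rank, n_oversample=10, seed=None):
    # Randomised range finder followed by an SVD of a small matrix
    rng = np.random.default_rng(seed)
    Omega = rng.standard_normal((M.shape[1], rank + n_oversample))
    Q, _ = np.linalg.qr(M @ Omega)            # orthonormal basis for range(M)
    Ub, S, Vt = np.linalg.svd(Q.T @ M, full_matrices=False)
    return (Q @ Ub)[:, :rank], S[:rank], Vt[:rank]

tensor = np.random.rand(50, 60, 70)
unfolding = tensor.reshape(50, -1)            # mode-0 unfolding
U, S, Vt = randomized_svd(unfolding, rank=5, seed=0)
print(U.shape, S.shape)  # (50, 5) (5,)
```

Applying this per mode gives an approximate HOSVD whose cost scales with the target rank rather than the full dimensions.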

2. Rank selection: HOSVD produces a low-rank approximation tensor, but how to select the rank in each mode is an important issue.

Solution: model selection methods such as cross-validation and information criterion can be used to determine the appropriate rank.
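A simple heuristic, sketched below, is to keep in each mode the smallest rank whose leading singular values retain a fixed fraction of the squared-singular-value energy (the 95% threshold is an arbitrary illustrative choice):

```python
import numpy as np

def choose_rank(singular_values, energy=0.95):
    # Smallest rank whose leading singular values carry `energy` of the total
    sq = singular_values ** 2
    cumulative = np.cumsum(sq) / np.sum(sq)
    return int(np.searchsorted(cumulative, energy) + 1)

s = np.array([10.0, 5.0, 1.0, 0.1])
print(choose_rank(s))  # 2: 10^2 + 5^2 covers about 99.2% of the energy
```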

3. Dealing with non-linearity of the data: HOSVD is based on the linear singular value decomposition, so it can be difficult to obtain an adequate approximation when the data have non-linear structure.

Solution: Non-parametric approaches and non-linear singular value decomposition methods may be used. Some methods also take into account the local non-linearity of the tensor.

4. Handling sparse data: the effectiveness of HOSVD may be limited if the original tensor is sparse.

Solution: variants that take into account the properties of sparse data, and methods such as sparse singular value decomposition, can be considered.
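For example, a truncated SVD of a sparse unfolding can be computed without densifying the matrix, using SciPy's sparse SVD (a sketch; the shapes, density, and rank are arbitrary):

```python
import scipy.sparse as sp
from scipy.sparse.linalg import svds

# A sparse mode-0 unfolding (e.g. of a mostly-zero sensor tensor)
unfolding = sp.random(100, 2000, density=0.01, format="csr", random_state=0)

# Truncated SVD of the sparse matrix: k leading singular triplets
# (note: svds returns the singular values in ascending order)
U, S, Vt = svds(unfolding, k=5)
print(U.shape, S.shape)  # (100, 5) (5,)
```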

Reference Information and Reference Books

For more information on optimization in machine learning, see also “Optimization for the First Time Reading Notes” “Sequential Optimization for Machine Learning” “Statistical Learning Theory” “Stochastic Optimization” etc.

Reference books include Optimization for Machine Learning

Machine Learning, Optimization, and Data Science

Linear Algebra and Optimization for Machine Learning: A Textbook
