• Title/Summary/Keyword: tensor decomposition

Vector decomposition of the evolution equations of the conformation tensor of Maxwellian fluids

  • Cho, Kwang-Soo
    • Korea-Australia Rheology Journal
    • /
    • v.21 no.2
    • /
    • pp.143-146
    • /
    • 2009
  • A breakthrough in the high Weissenberg number problem depends on preserving the positive definiteness of the conformation tensor in numerical procedures. In this paper, we suggest a simple method to preserve positive definiteness through a vector decomposition of the conformation tensor that does not require solving an eigenvalue problem. We also derive the constitutive equation of the tensor-logarithmic transform in a simpler way than that of Fattal and Kupferman, and we compare the vector decomposition with the tensor-logarithmic transformation.
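The vector-decomposition idea can be illustrated with a small sketch (Python with NumPy; the 3x3 setup and variable names are assumptions for illustration, not the paper's notation):

```python
import numpy as np

# Hypothetical illustration: write the conformation tensor c as a sum of
# outer products of vectors, c = sum_k b_k b_k^T. Any tensor built this way
# is positive semidefinite by construction, so evolving the vectors b_k
# instead of c itself preserves positive definiteness without solving an
# eigenvalue problem.
rng = np.random.default_rng(0)
b = rng.standard_normal((3, 3))          # three vectors b_k in 3D (assumed setup)

c = sum(np.outer(bk, bk) for bk in b)    # reconstruct the conformation tensor

# Positive semidefiniteness holds regardless of how the b_k evolve.
eigvals = np.linalg.eigvalsh(c)
print(np.all(eigvals >= -1e-12))         # True
```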

DECOMPOSITION OF SPECIAL PSEUDO PROJECTIVE CURVATURE TENSOR FIELD

  • MOHIT SAXENA;PRAVEEN KUMAR MATHUR
    • Journal of applied mathematics & informatics
    • /
    • v.41 no.5
    • /
    • pp.989-999
    • /
    • 2023
  • The aim of this paper is to study the projective curvature tensor field of the curvature tensor R^i_jkh on a recurrent non-Riemannian space admitting recurrent affine motion, which is also decomposable in the form R^i_jkh = X^i Y_jkh, where X^i and Y_jkh are a non-null vector and a tensor, respectively. In this paper we decompose the special pseudo projective curvature tensor field. In the sequel of the decomposition we establish several properties of such decomposed tensor fields. We also consider the curvature tensor field R^i_jkh in a Finsler space equipped with a non-symmetric connection and study the decomposition of this field. In a special pseudo recurrent Finsler space, if the arbitrary tensor field 𝜓ij is assumed to be covariant constant then, in view of the decomposition rule, 𝜙kh behaves as a recurrent tensor field. Finally, we consider the decomposition of curvature tensor fields in Kaehlerian recurrent spaces and obtain several related theorems.
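As a hedged sketch, using only the decomposition rule quoted in the abstract, one can see how a recurrence property passes from the curvature tensor to its tensor factor; the symbols λ_m and ∇_m below are assumed notation, not necessarily the paper's:

```latex
% Decomposition of the curvature tensor (as quoted in the abstract):
R^{i}_{\;jkh} = X^{i}\, Y_{jkh}
% On a recurrent space, \nabla_m R^{i}_{\;jkh} = \lambda_m R^{i}_{\;jkh}.
% If the vector X^{i} is covariantly constant (\nabla_m X^{i} = 0), then
X^{i}\left(\nabla_m Y_{jkh} - \lambda_m Y_{jkh}\right) = 0
% and, since X^{i} is non-null, the tensor factor is itself recurrent:
\nabla_m Y_{jkh} = \lambda_m Y_{jkh}
```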

S-PARAFAC: Distributed Tensor Decomposition using Apache Spark (S-PARAFAC: 아파치 스파크를 이용한 분산 텐서 분해)

  • Yang, Hye-Kyung;Yong, Hwan-Seung
    • Journal of KIISE
    • /
    • v.45 no.3
    • /
    • pp.280-287
    • /
    • 2018
  • Recently, the use of recommendation systems and tensor data analysis, which involve high-dimensional data, has been increasing, as they allow us to analyze a tensor and extract latent elements and patterns. However, due to the large size and complexity of the tensor, it must be decomposed before the tensor data can be analyzed. Several tools, such as rTensor, pyTensor, and MATLAB, are used for tensor decomposition; however, since they run on a single machine, they cannot handle large data. Distributed tensor decomposition tools based on Hadoop can handle a scalable tensor, but their computing speed is slow. In this paper, we propose S-PARAFAC, a tensor decomposition tool based on Apache Spark that runs in a distributed in-memory environment. We converted the PARAFAC algorithm into an Apache Spark version that enables rapid processing of tensor data, and we compared the performance of a Hadoop-based tensor tool and S-PARAFAC. The results showed that S-PARAFAC is approximately 4 to 25 times faster than the Hadoop-based tensor tool.
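For reference, the PARAFAC (CP) algorithm that S-PARAFAC distributes can be sketched on a single machine as alternating least squares over the mode unfoldings; this is a minimal NumPy sketch under assumed names and conventions, not the paper's Spark implementation:

```python
import numpy as np

def khatri_rao(B, C):
    # Column-wise Kronecker product: row (j*K + k) equals B[j] * C[k].
    J, R = B.shape
    K = C.shape[0]
    return (B[:, None, :] * C[None, :, :]).reshape(J * K, R)

def cp_als(T, rank, n_iter=500, seed=0):
    # PARAFAC/CP via alternating least squares: T[i,j,k] ~ sum_r A[i,r]B[j,r]C[k,r].
    I, J, K = T.shape
    rng = np.random.default_rng(seed)
    A, B, C = (rng.standard_normal((n, rank)) for n in (I, J, K))
    for _ in range(n_iter):
        # Solve a linear least-squares problem for each factor with the others fixed.
        A = np.reshape(T, (I, J * K)) @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = np.reshape(np.moveaxis(T, 1, 0), (J, I * K)) @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = np.reshape(np.moveaxis(T, 2, 0), (K, I * J)) @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C
```

In the Spark version described by the paper, these unfolding-times-Khatri-Rao products are the step that gets distributed across the cluster.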

PARAFAC Tensor Reconstruction for Recommender System based on Apache Spark (아파치 스파크에서의 PARAFAC 분해 기반 텐서 재구성을 이용한 추천 시스템)

  • Im, Eo-Jin;Yong, Hwan-Seung
    • Journal of Korea Multimedia Society
    • /
    • v.22 no.4
    • /
    • pp.443-454
    • /
    • 2019
  • In recent years, there has been active research on recommender systems that consider three or more inputs in addition to users and goods, making the data a multi-dimensional array, also known as a tensor. The main issue with using a tensor is that it contains many missing values, making it sparse. To address this, the tensor can be shrunk using a tensor decomposition algorithm into lower-dimensional arrays called factor matrices. The tensor is then reconstructed by multiplying the factor matrices, filling the originally empty cells with predicted values. This is called tensor reconstruction. In this paper, we propose a user-based Top-K recommender system using normalized PARAFAC tensor reconstruction. This method factorizes a tensor into factor matrices and reconstructs the tensor again. Before decomposition, the original tensor is normalized along each dimension to reduce overfitting. Using a real-world dataset, this paper demonstrates the processing of a large amount of data and implements a recommender system based on Apache Spark. In addition, this study confirms that recommender performance is improved through normalization of the tensor.
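The reconstruction step can be sketched in a few lines (Python with NumPy; the factor matrices, sizes, and K below are assumed for illustration):

```python
import numpy as np

# Sketch of PARAFAC tensor reconstruction for recommendation: after
# factorization, the rating tensor is rebuilt from the factor matrices and the
# predicted values in the originally empty cells are ranked for Top-K.
# Factor matrices A (users), B (items), C (contexts) are assumed given here.
rng = np.random.default_rng(1)
n_users, n_items, n_ctx, rank = 4, 6, 3, 2
A = rng.random((n_users, rank))
B = rng.random((n_items, rank))
C = rng.random((n_ctx, rank))

# Reconstructed tensor: T_hat[u, i, c] = sum_r A[u,r] * B[i,r] * C[c,r]
T_hat = np.einsum('ur,ir,cr->uic', A, B, C)

# Top-K items for user 0 in context 0 (K assumed to be 3 here).
K = 3
top_k = np.argsort(T_hat[0, :, 0])[::-1][:K]
print(top_k)
```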

An Application of Tucker Decomposition for Detecting Epilepsy EEG signals

  • Thieu, Thao Nguyen;Yang, Hyung-Jeong
    • Journal of Multimedia Information System
    • /
    • v.2 no.2
    • /
    • pp.215-222
    • /
    • 2015
  • Epileptic seizure is a common brain disease worldwide. It affects the nervous system and brain function, so that a person with seizures cannot control or predict his or her actions. Based on electroencephalography (EEG) signals recorded from human or animal brains, scientists use many methods to detect and recognize abnormal brain activity. The Tucker model is investigated to solve this problem. Tucker decomposition is a higher-order form of singular value decomposition (SVD), a well-known algorithm for decomposing a matrix, and is widely used to extract good features from a tensor. The result of Tucker decomposition is a core tensor and a factor matrix along each mode. This core tensor contains much of the most important information in the original data. In this paper, we use Tucker decomposition to obtain good features. The training data is first projected into the core tensor, and the remaining factor matrices are combined with the test data to build the Tucker basis used for testing. Using the core tensor makes the process simpler and achieves higher accuracies.
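A minimal HOSVD-style Tucker sketch shows how the core tensor used as a feature representation is obtained; the function name, ranks, and tensor sizes are assumptions for illustration:

```python
import numpy as np

def tucker_hosvd(T, ranks):
    # Factor matrices come from the SVD of each mode unfolding (HOSVD);
    # the core tensor is the original tensor projected onto them.
    factors = []
    for mode, r in enumerate(ranks):
        unfolding = np.reshape(np.moveaxis(T, mode, 0), (T.shape[mode], -1))
        U, _, _ = np.linalg.svd(unfolding, full_matrices=False)
        factors.append(U[:, :r])
    # core[a,b,c] = sum_{i,j,k} T[i,j,k] U1[i,a] U2[j,b] U3[k,c]
    core = np.einsum('ijk,ia,jb,kc->abc', T, *factors)
    return core, factors

T = np.random.default_rng(2).standard_normal((5, 6, 7))
core, factors = tucker_hosvd(T, (3, 3, 3))
print(core.shape)  # (3, 3, 3)
```

With full ranks the factor matrices are orthogonal and the reconstruction is exact; truncating the ranks compresses the data into the small core, which is what serves as the feature set.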

DECOMPOSITION FOR CARTAN'S SECOND CURVATURE TENSOR OF DIFFERENT ORDER IN FINSLER SPACES

  • Abdallah, Alaa A.;Navlekar, A.A.;Ghadle, Kirtiwant P.;Hamoud, Ahmed A.
    • Nonlinear Functional Analysis and Applications
    • /
    • v.27 no.2
    • /
    • pp.433-448
    • /
    • 2022
  • Cartan's second curvature tensor P^i_jkh is positively homogeneous of degree -1 in y^i, where y^i represents the directional coordinate of the line element in a Finsler space. In this paper, we discuss the decomposition of Cartan's second curvature tensor P^i_jkh in two spaces, a generalized 𝔅P-recurrent space and a generalized 𝔅P-birecurrent space. We obtain different tensors which satisfy the recurrence and birecurrence properties under the decomposition. Also, we prove that the decomposed tensors are non-vanishing. As an illustration of the applicability of the obtained results, we finish this work with some illustrative examples.
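As a hedged side note, the degree -1 homogeneity can be stated via Euler's theorem, and recurrence and birecurrence can be written schematically; λ_m, a_lm, and the operator 𝓑_m below are assumed notation, not necessarily the paper's generalized definitions:

```latex
% Euler's theorem for positive homogeneity of degree -1 in y^i:
y^{m}\,\frac{\partial P^{i}_{\;jkh}}{\partial y^{m}} = -\,P^{i}_{\;jkh}
% Recurrence and birecurrence, schematically:
\mathcal{B}_{m}\, P^{i}_{\;jkh} = \lambda_{m}\, P^{i}_{\;jkh}, \qquad
\mathcal{B}_{l}\,\mathcal{B}_{m}\, P^{i}_{\;jkh} = a_{lm}\, P^{i}_{\;jkh}
```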

Differential Evolution Based Clustering (차분진화에 기초한 클러스터링)

  • Ham, Seo-Hyun;Lee, Hyun-Chang;Shin, Seong-Yoon
    • Proceedings of the Korean Society of Computer Information Conference
    • /
    • 2019.07a
    • /
    • pp.389-390
    • /
    • 2019
  • Tensor decomposition, proven to be an efficient data processing method, can be used to provide data-driven services. We propose a novel data-driven mutation strategy for parent individual selection, namely tensor-based DE with parapatric and cross-generation strategies (TPCDE).
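The abstract gives few details of TPCDE itself, so as context, here is only the classical DE/rand/1 mutation that differential evolution mutation strategies build on; the function name and parameters are assumptions:

```python
import numpy as np

def de_rand_1_mutation(population, F=0.5, seed=0):
    # Classical DE/rand/1: for each target vector, pick three distinct parents
    # and combine them as v = x_r1 + F * (x_r2 - x_r3).
    rng = np.random.default_rng(seed)
    n = len(population)
    mutants = np.empty_like(population)
    for i in range(n):
        r1, r2, r3 = rng.choice([j for j in range(n) if j != i], size=3, replace=False)
        mutants[i] = population[r1] + F * (population[r2] - population[r3])
    return mutants

pop = np.random.default_rng(3).random((6, 2))   # 6 individuals in 2D (assumed)
print(de_rand_1_mutation(pop).shape)  # (6, 2)
```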

Nonnegative Tucker Decomposition (텐서의 비음수 Tucker 분해)

  • Kim, Yong-Deok;Choi, Seung-Jin
    • Journal of KIISE:Computing Practices and Letters
    • /
    • v.14 no.3
    • /
    • pp.296-300
    • /
    • 2008
  • Nonnegative tensor factorization (NTF) is a recent multiway (multilinear) extension of nonnegative matrix factorization (NMF), where nonnegativity constraints are imposed on the CANDECOMP/PARAFAC model. In this paper we consider the Tucker model with nonnegativity constraints and develop a new tensor factorization method, referred to as nonnegative Tucker decomposition (NTD). We derive multiplicative updating algorithms for various discrepancy measures: the least squares error function, I-divergence, and α-divergence.
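NTD generalizes NMF's multiplicative updates to the Tucker model; as a minimal sketch of the underlying idea, here is the least-squares multiplicative update for the matrix case V ≈ W H, which keeps factors nonnegative by construction (function name and parameters are assumptions, not the paper's algorithm):

```python
import numpy as np

def nmf_multiplicative(V, rank, n_iter=200, seed=0):
    # Least-squares multiplicative updates: each factor is multiplied
    # element-wise by a nonnegative ratio, so it can never turn negative.
    rng = np.random.default_rng(seed)
    W = rng.random((V.shape[0], rank)) + 1e-3
    H = rng.random((rank, V.shape[1])) + 1e-3
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + 1e-12)
        W *= (V @ H.T) / (W @ H @ H.T + 1e-12)
    return W, H

V = np.random.default_rng(4).random((8, 10))
W, H = nmf_multiplicative(V, rank=4)
print(np.all(W >= 0) and np.all(H >= 0))  # True
```

The NTD updates in the paper have the same multiplicative structure, applied to the Tucker core and factor matrices instead of two matrix factors.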

Blind signal separation for coprime planar arrays: An improved coupled trilinear decomposition method

  • Zhongyuan Que;Xiaofei Zhang;Benzhou Jin
    • ETRI Journal
    • /
    • v.45 no.1
    • /
    • pp.138-149
    • /
    • 2023
  • In this study, the problem of blind signal separation for coprime planar arrays is investigated. For coprime planar arrays comprising two uniform rectangular subarrays, we link the signal separation to the tensor-based model called coupled canonical polyadic decomposition (CPD) and propose an improved coupled trilinear decomposition approach. The output data of coprime planar arrays are modeled as a coupled tensor set that can be further interpreted as a coupled CPD model, allowing signal separation to be achieved using coupled trilinear alternating least squares (TALS). Furthermore, in the procedure of the coupled TALS, a Vandermonde structure enforcing approach is explicitly applied, which is shown to ensure fast convergence. The results of Monte Carlo simulations show that our proposed algorithm has the same separation accuracy as the basic coupled TALS but with a faster convergence speed.
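One common way to enforce Vandermonde structure on an estimated steering vector, which may or may not match the paper's exact procedure, is to recover the generator from consecutive-entry ratios and rebuild the column as its powers; the sketch below uses assumed names and an assumed example generator:

```python
import numpy as np

def enforce_vandermonde(a):
    # Project an estimated steering vector back onto the Vandermonde set:
    # estimate the generator z from ratios of consecutive entries, force unit
    # modulus (far-field steering vector), and rebuild the column as powers of z.
    z = np.mean(a[1:] / a[:-1])
    z /= abs(z)
    return z ** np.arange(len(a))

# Noisy steering vector with true generator exp(1j * 0.4) (assumed example).
n = 8
true = np.exp(1j * 0.4 * np.arange(n))
noisy = true + 0.01 * np.random.default_rng(5).standard_normal(n)
cleaned = enforce_vandermonde(noisy)
print(np.allclose(cleaned, true, atol=0.1))  # True
```

Re-imposing the structure after each TALS update shrinks the feasible set the iteration searches over, which is the intuition behind the faster convergence reported in the paper.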