2021-08

Sparse Modeling

Protected: Sparse Modeling and Multivariate Analysis (6) Image Processing and Sparsity (Overview of Machine Learning for Signal Processing)

Overview of sparse models for machine learning on image data for artificial intelligence (AI) and digital transformation (DX): JPEG, the DCT, and the Sparseland model
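As a rough illustration of the DCT-based sparsity that underlies JPEG compression, the following is a minimal sketch assuming SciPy and NumPy are available; the 8x8 block, the number of retained coefficients, and the variable names are illustrative assumptions, and the Sparseland dictionary model itself is not implemented here.

```python
import numpy as np
from scipy.fft import dctn, idctn

rng = np.random.default_rng(0)
# Smooth toy "image" block: a low-frequency ramp plus mild noise
x, y = np.meshgrid(np.linspace(0, 1, 8), np.linspace(0, 1, 8))
block = 100.0 * (x + y) + rng.normal(scale=2.0, size=(8, 8))

# 2-D DCT of the block; for smooth content most energy sits in a few coefficients
coeffs = dctn(block, norm="ortho")

# Keep only the k largest-magnitude coefficients (a sparse approximation)
k = 8
threshold = np.sort(np.abs(coeffs).ravel())[-k]
sparse_coeffs = np.where(np.abs(coeffs) >= threshold, coeffs, 0.0)

# Reconstruct from the sparse coefficients and measure the relative error
reconstruction = idctn(sparse_coeffs, norm="ortho")
print("kept coefficients:", int(np.count_nonzero(sparse_coeffs)))
print("relative error:", np.linalg.norm(block - reconstruction) / np.linalg.norm(block))
```

The small relative error despite keeping only a handful of coefficients is the sparsity property that JPEG-style coding and sparse image models exploit.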
Sparse Modeling

Protected: Sparse Modeling and Multivariate Analysis (5) Graphical Lasso and Its Applications (Anomaly Detection, etc.)

Sparse graph models used for dimensionality reduction of graph data and for explaining machine learning models: introducing sparsity into relationships, the graphical lasso, and its applications to anomaly detection and other tasks.
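As a rough illustration of the graphical lasso mentioned above, here is a minimal sketch assuming scikit-learn's GraphicalLasso estimator; the synthetic data, the regularization value alpha=0.1, and the variable layout are illustrative assumptions, not the article's own example.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)
# Two latent factors drive two observed variables each, giving a block dependency structure
z1 = rng.normal(size=(500, 1))
z2 = rng.normal(size=(500, 1))
X = np.hstack([
    z1 + 0.5 * rng.normal(size=(500, 1)),
    z1 + 0.5 * rng.normal(size=(500, 1)),
    z2 + 0.5 * rng.normal(size=(500, 1)),
    z2 + 0.5 * rng.normal(size=(500, 1)),
])

# Graphical lasso: L1-penalized estimation of a sparse precision (inverse covariance) matrix
model = GraphicalLasso(alpha=0.1).fit(X)
precision = model.precision_

# Off-diagonal zeros in the precision matrix indicate conditional independence;
# for anomaly detection, new samples can be scored against the learned structure
# (e.g. with model.mahalanobis, which uses the estimated precision matrix).
print(np.round(precision, 2))
```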
Sparse Modeling

Protected: Sparse Modeling and Multivariate Analysis (4) Introducing Sparsity into Relationships

The sparse graph model, which is used to reduce the dimensionality of graph data and to explain machine learning models, is discussed in terms of introducing sparsity into relationships and the graphical lasso.
Machine Learning

Protected: Two approaches to language meaning (fusion of symbolic and distributed representations)

Symbolic and vector (distributed) representation approaches to the meaning of natural language and their integration, applicable to artificial intelligence (AI) and digital transformation (DX) tasks; recognizing textual entailment, paraphrase recognition, semantic similarity recognition, and related datasets (RTE, RITE, STS).
Machine Learning

Protected: Teaching the meaning of words to a computer (on various language models)

Representing the meaning of words in natural language processing: WordNet (a dictionary-based approach), the distributional hypothesis, PPMI, singular value decomposition (SVD), and Word2Vec (distributed representations).
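As a rough illustration of the count-based pipeline mentioned above (co-occurrence counts, PPMI, then SVD), here is a minimal NumPy-only sketch; the toy corpus, the one-word window, and the two-dimensional embedding size are illustrative assumptions.

```python
import numpy as np

corpus = ["you say goodbye and i say hello".split()]
vocab = sorted({w for sent in corpus for w in sent})
idx = {w: i for i, w in enumerate(vocab)}

# Word-word co-occurrence counts within a +/-1 word window
C = np.zeros((len(vocab), len(vocab)))
for sent in corpus:
    for i, w in enumerate(sent):
        for j in (i - 1, i + 1):
            if 0 <= j < len(sent):
                C[idx[w], idx[sent[j]]] += 1

# Positive PMI: max(0, log2( p(w, c) / (p(w) p(c)) ))
total = C.sum()
pw = C.sum(axis=1, keepdims=True)
pc = C.sum(axis=0, keepdims=True)
with np.errstate(divide="ignore", invalid="ignore"):
    pmi = np.log2(C * total / (pw * pc))
ppmi = np.where(np.isfinite(pmi) & (pmi > 0), pmi, 0.0)

# Truncated SVD turns the sparse PPMI matrix into dense low-dimensional word vectors
U, S, _ = np.linalg.svd(ppmi)
word_vectors = U[:, :2] * S[:2]
print(dict(zip(vocab, np.round(word_vectors, 2))))
```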
Machine Learning

Protected: Topic models that capture the individuality of language

Topic models that capture the latent meanings behind sentences, the differences between various probabilistic approaches and deep learning, supervised LDA, Boltzmann machines, and naive Bayes.
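As a rough illustration of the topic-model idea, here is a minimal sketch of plain (unsupervised) LDA using scikit-learn's LatentDirichletAllocation; the supervised LDA and Boltzmann-machine variants discussed in the article are not covered, and the toy documents and topic count are illustrative assumptions.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "the stock market fell and investors sold shares",
    "the team won the match and fans celebrated the goal",
    "shares rose as the market recovered from the fall",
    "the striker scored a late goal to win the match",
]

# Bag-of-words counts, then LDA infers latent topics behind the documents
vectorizer = CountVectorizer(stop_words="english").fit(docs)
X = vectorizer.transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

# Show the highest-weight words per topic
terms = vectorizer.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top = [terms[i] for i in topic.argsort()[-4:][::-1]]
    print(f"topic {k}: {top}")
```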
Probability and Statistics

Introduction to models of language (probabilistic unigram models and Bayesian probability)

Natural language processing as it applies to digital transformation (DX), artificial intelligence (AI), machine learning, and related fields: modeling natural language with unigram models and Bayesian probabilistic models.
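As a rough illustration of the unigram model with a Bayesian treatment, here is a minimal plain-Python sketch in which a symmetric Dirichlet (add-alpha) prior smooths the maximum-likelihood word probabilities; the toy corpus and the alpha value are illustrative assumptions.

```python
from collections import Counter

corpus = "the cat sat on the mat the cat slept".split()
counts = Counter(corpus)
N = len(corpus)
V = len(counts) + 1  # one extra slot for unseen words
alpha = 1.0          # Dirichlet prior strength (add-one / Laplace smoothing)

def unigram_prob(word):
    # Posterior predictive word probability under a symmetric Dirichlet prior
    return (counts[word] + alpha) / (N + alpha * V)

print(unigram_prob("cat"))  # seen word
print(unigram_prob("dog"))  # unseen word still gets nonzero probability
```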
Semantic Web Technology

Strategies in similarity matching methods (7) Improved alignment disambiguation

Alignment disambiguation for optimizing natural language similarity and ontology matching in digital transformation (DX) and artificial intelligence (AI) applications.
Semantic Web Technology

Strategies in similarity matching methods (6) Alignment extraction approach

Optimization and alignment extraction for natural language similarity and ontology matching in digital transformation (DX) and artificial intelligence (AI) applications.
Semantic Web Technology

Strategies in similarity matching methods (5) Tuning approaches

Natural language similarity for digital transformation (DX) and artificial intelligence (AI); tuning ontology matching with machine learning, stacked generalization, and genetic algorithms.