確率・統計:Probability and Statistics

微分積分:Calculus

Protected: Complexity of Hypothesis Sets in Statistical Learning Theory

Complexity of hypothesis sets in statistical learning theory used in digital transformation, artificial intelligence, and machine learning tasks: Rademacher complexity, VC dimension, the law of large numbers, the uniform law of large numbers, decision stumps, sets of linear discriminators, sets of linear functions, the Cauchy-Schwarz inequality, Jensen's inequality, Massart's lemma, Talagrand's lemma, empirical Rademacher complexity, Sauer's lemma, Radon's theorem
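As a small illustration of one quantity from the list above, the empirical Rademacher complexity of a finite hypothesis set can be estimated by Monte Carlo sampling of sign vectors. The hypothesis set and sample below are invented for illustration only:

```python
import random

def empirical_rademacher(outputs, n_trials=2000, seed=0):
    """Monte Carlo estimate of the empirical Rademacher complexity
    of a finite hypothesis set, given each hypothesis's predictions
    on a fixed sample (rows: hypotheses, columns: sample points)."""
    rng = random.Random(seed)
    n = len(outputs[0])
    total = 0.0
    for _ in range(n_trials):
        sigma = [rng.choice((-1.0, 1.0)) for _ in range(n)]
        # sup over hypotheses of the correlation with the random signs
        total += max(sum(s * h[i] for i, s in enumerate(sigma)) / n
                     for h in outputs)
    return total / n_trials

# Two decision-stump-like hypotheses on a 4-point sample (illustrative)
H = [[1, 1, -1, -1], [1, -1, 1, -1]]
print(empirical_rademacher(H))
```

For this toy set the exact value is 0.25, so the estimate should land close to that; richer hypothesis sets give larger values, which is what the generalization bounds in the article control.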
アルゴリズム:Algorithms

Protected: Stochastic Optimization and Online Optimization Overview

Overview of stochastic optimization and online optimization used in digital transformation, artificial intelligence, and machine learning tasks: expected error, regret, minimax optimality, strongly convex loss functions, stochastic gradient descent, the stochastic dual averaging method, AdaGrad, online stochastic optimization, batch stochastic optimization
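The summary mentions stochastic gradient descent; a minimal sketch on a scalar least-squares problem (the data and step size here are invented for illustration; a decaying ~1/t step is the textbook choice for strongly convex objectives, but a small constant step suffices for this toy case):

```python
import random

def sgd_least_squares(data, lr=0.05, epochs=200, seed=0):
    """Plain stochastic gradient descent on the one-sample squared loss
    l(w; x, y) = (w*x - y)^2 / 2 for a scalar parameter w."""
    rng = random.Random(seed)
    data = list(data)
    w = 0.0
    for _ in range(epochs):
        rng.shuffle(data)           # sample without replacement each epoch
        for x, y in data:
            w -= lr * (w * x - y) * x   # one-sample gradient step
    return w

# y = 2x exactly, so the minimizer is w = 2
print(sgd_least_squares([(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]))
```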
アルゴリズム:Algorithms

Protected: Unconstrained optimization for continuous optimization in machine learning

Unconstrained optimization for continuous optimization in machine learning, used in digital transformation, artificial intelligence, and machine learning tasks: machine epsilon, stopping conditions without scaling, stopping conditions with scaling, Taylor's theorem, stopping conditions for optimization algorithms, the Hessian matrix
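A sketch of one of the listed ideas, a scaled stopping condition for an unconstrained optimizer (the objective and tolerance are invented for illustration; scaling the gradient test by max(1, |f(x)|) is one common convention, not the only one):

```python
import sys

def minimize_gd(f, grad, x0, lr=0.1, tol=1e-8, max_iter=10000):
    """Gradient descent with a scaled stopping condition: stop when
    |f'(x)| <= tol * max(1, |f(x)|), which behaves more robustly than
    the unscaled test |f'(x)| <= tol when f takes large values."""
    x = x0
    for _ in range(max_iter):
        g = grad(x)
        if abs(g) <= tol * max(1.0, abs(f(x))):
            break
        x -= lr * g
    return x

# machine epsilon bounds how small a useful tolerance can be
print(sys.float_info.epsilon)     # ~2.22e-16 in double precision
f = lambda x: (x - 3.0) ** 2
df = lambda x: 2.0 * (x - 3.0)
print(minimize_gd(f, df, x0=0.0))
```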
アルゴリズム:Algorithms

Protected: Unsupervised Learning with Gaussian Processes (1) Overview and Algorithms of Gaussian Process Latent Variable Models

Overview and algorithms of unsupervised learning using Gaussian process latent variable models (GPLVM), an application of probabilistic generative models used in digital transformation, artificial intelligence, and machine learning: Bayesian Gaussian process latent variable models (Bayesian GPLVM)
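The core ingredient of a GPLVM is the kernel Gram matrix evaluated at the latent coordinates; a minimal sketch with the RBF kernel (the latent points and hyperparameters below are invented for illustration):

```python
import math

def rbf_kernel(X, lengthscale=1.0, variance=1.0):
    """Gram matrix of the RBF kernel k(x, x') = v * exp(-|x - x'|^2 / (2 l^2)),
    the covariance a GPLVM places on the outputs as a function of the
    latent points X (1-D latent space here for simplicity)."""
    return [[variance * math.exp(-((a - b) ** 2) / (2 * lengthscale ** 2))
             for b in X] for a in X]

# latent coordinates; in a GPLVM these are optimized (or, in the
# Bayesian GPLVM, integrated out approximately)
Z = [0.0, 0.5, 3.0]
K = rbf_kernel(Z)
print(round(K[0][0], 4), round(K[0][1], 4), round(K[0][2], 4))
```

Nearby latent points get high covariance and distant ones near zero, which is how the latent layout shapes the model's predictions.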
アルゴリズム:Algorithms

Protected: Implementation of model-free reinforcement learning in Python (2): Monte Carlo and TD methods

Python implementations of model-free reinforcement learning such as Monte Carlo and TD methods: Q-learning, value-based methods, Monte Carlo methods, neural networks, the epsilon-greedy method, the TD(λ) method, multi-step learning, Rainbow, A3C/A2C, DDPG, Ape-X DQN
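A minimal sketch combining two of the pieces above, tabular Q-learning (a TD method) with an epsilon-greedy policy; the chain environment is invented for illustration:

```python
import random

def q_learning_chain(n_states=5, episodes=500, alpha=0.1, gamma=0.9,
                     epsilon=0.1, seed=0):
    """Tabular Q-learning with an epsilon-greedy behavior policy on a
    toy chain MDP: states 0..n-1, actions {0: left, 1: right}, reward 1
    only when the rightmost state is reached (which ends the episode)."""
    rng = random.Random(seed)
    Q = [[0.0, 0.0] for _ in range(n_states)]
    for _ in range(episodes):
        s = 0
        while s != n_states - 1:
            if rng.random() < epsilon:                    # explore
                a = rng.randrange(2)
            else:                                         # exploit
                a = max((0, 1), key=lambda act: Q[s][act])
            s2 = s + 1 if a == 1 else max(s - 1, 0)
            r = 1.0 if s2 == n_states - 1 else 0.0
            # TD target bootstraps from the greedy value of the next state
            Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
            s = s2
    return Q

Q = q_learning_chain()
print([round(max(q), 3) for q in Q])
```

The learned values decay geometrically with distance from the goal (roughly γ^k), and the greedy policy ends up always moving right.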
グラフ理論:Graph Theory

Protected: Information Geometry of Positive Definite Matrices (1) Introduction of dual geometric structure

Introduction of dual geometric structures as information geometry for positive definite matrices utilized in digital transformation, artificial intelligence, and machine learning tasks (Riemannian metric, tangent vector space, semidefinite programming problems, autoparallelism, Levi-Civita connection, Riemannian geometry, geodesics, Euclidean geometry, ∇-geodesics, tangent vectors, tensor quantities, dual flatness, the set of positive definite matrices)
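For intuition about geodesics on the positive definite matrices, a sketch of the 1×1 case (positive reals), where the geodesic under the affine-invariant Riemannian metric reduces to the geometric interpolation a^(1-t) b^t; the endpoints below are invented for illustration:

```python
def geodesic(a, b, t):
    """Geodesic between two positive-definite 1x1 matrices (positive
    reals) under the affine-invariant metric: a^(1-t) * b^t.
    At t = 1/2 this is the geometric mean, the Riemannian midpoint
    (contrast with the Euclidean midpoint (a + b) / 2 = 5.0 here)."""
    return (a ** (1 - t)) * (b ** t)

print(geodesic(1.0, 9.0, 0.5))   # → 3.0, the geometric mean of 1 and 9
```

In the full matrix case the same curve reads A^(1/2) (A^(-1/2) B A^(-1/2))^t A^(1/2), which stays inside the positive definite cone for all t.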
バンディット問題:Bandit Problems

Protected: Fundamentals of Stochastic Bandit Problems

Basics of stochastic bandit problems utilized in digital transformation, artificial intelligence, and machine learning tasks (the large deviation principle with examples for the Bernoulli distribution, the Chernoff-Hoeffding inequality, Sanov's theorem, Hoeffding's inequality, Kullback-Leibler divergence, probability mass functions, tail probabilities, probability approximation by the central limit theorem).
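Hoeffding's inequality, central to bandit analysis, can be checked empirically; a sketch for Bernoulli(1/2) samples (sample size, threshold, and trial count are invented for illustration):

```python
import math
import random

def hoeffding_violation_rate(n=100, eps=0.1, trials=2000, seed=0):
    """Empirically compare the tail probability P(|mean - 1/2| >= eps)
    for n Bernoulli(1/2) draws against the Hoeffding bound
    2 * exp(-2 * n * eps^2)."""
    rng = random.Random(seed)
    bound = 2 * math.exp(-2 * n * eps ** 2)
    hits = 0
    for _ in range(trials):
        mean = sum(rng.random() < 0.5 for _ in range(n)) / n
        if abs(mean - 0.5) >= eps:
            hits += 1
    return hits / trials, bound

rate, bound = hoeffding_violation_rate()
print(rate, bound)   # the empirical rate sits well below the bound
```

The gap between the observed rate and the bound reflects that Hoeffding is distribution-free; the KL-based bounds in the article are tighter for specific distributions.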
微分積分:Calculus

Protected: Fundamentals of Convex Analysis as a Foundation for Continuous Optimization in Machine Learning

Basics of convex analysis as a foundation for continuous optimization utilized in digital transformation, artificial intelligence, and machine learning tasks: subgradients, subdifferentials, conjugate functions, closed proper convex functions, strongly convex functions, upper and lower bounds on function values, the Hessian matrix, epigraphs, Taylor's theorem, relative interior, the Moreau envelope, continuity, convex envelopes, convex functions, convex sets
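Subgradients are what let optimization proceed on non-differentiable convex functions; a sketch of the subgradient method on f(x) = |x| (starting point and step schedule are invented for illustration):

```python
def subgradient_descent(x0=5.0, steps=200):
    """Subgradient method for f(x) = |x|, which is not differentiable
    at 0; there any g in [-1, 1] is a subgradient (we pick 0).
    Uses the classic diminishing step size 1/sqrt(t) and tracks the
    best iterate, since the method is not monotone."""
    x = x0
    best = abs(x)
    for t in range(1, steps + 1):
        g = 1.0 if x > 0 else (-1.0 if x < 0 else 0.0)  # a subgradient
        x -= g / t ** 0.5
        best = min(best, abs(x))
    return best

print(subgradient_descent())
```

Unlike gradient descent on a smooth function, the iterates oscillate around the minimizer, which is why the best-so-far value is reported.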
アルゴリズム:Algorithms

Protected: Basic Framework of Statistical Learning Theory

Basic framework of statistical learning theory used in digital transformation, artificial intelligence, and machine learning tasks: regularization, approximation and estimation errors, Hoeffding's inequality, prediction discriminant error, statistical consistency, learning algorithms, performance evaluation, ROC curves, AUC, the Bayes rule, Bayes error, prediction loss, empirical loss
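AUC, one of the performance measures listed, can be computed directly from its probabilistic definition rather than by integrating the ROC curve; the scores and labels below are invented for illustration:

```python
def auc(scores, labels):
    """AUC as the probability that a randomly chosen positive example
    is scored above a randomly chosen negative one (ties count 1/2).
    This equals the area under the ROC curve."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

print(auc([0.9, 0.8, 0.3, 0.1], [1, 0, 1, 0]))   # → 0.75
```

The pairwise form makes clear why AUC is invariant to any monotone rescaling of the scores: only their ordering matters.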
アルゴリズム:Algorithms

Protected: Supervised learning and regularization

Overview of supervised learning (regression, discrimination) and regularization as the basis of machine learning optimization methods used for digital transformation, artificial intelligence, and machine learning tasks: ridge regularization, L1 regularization, bridge regularization, elastic net regularization, SCAD, group regularization, generalized fused regularization, trace norm regularization
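The characteristic effect of L1 regularization is captured by its proximal operator, soft thresholding, which is what produces exact zeros in the coefficients (the inputs below are invented for illustration):

```python
def soft_threshold(w, lam):
    """Proximal operator of the L1 penalty lam * |w|: shrinks w toward
    zero and sets it exactly to zero when |w| <= lam. This is the
    building block of coordinate-descent lasso solvers; ridge, by
    contrast, only scales coefficients and never zeroes them."""
    if w > lam:
        return w - lam
    if w < -lam:
        return w + lam
    return 0.0

print(soft_threshold(3.0, 1.0), soft_threshold(-0.5, 1.0))   # → 2.0 0.0
```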