機械学習:Machine Learning

アルゴリズム:Algorithms

Protected: Stochastic Optimization and Online Optimization Overview

Stochastic and online optimization used in digital transformation, artificial intelligence, and machine learning tasks (expected error, regret, minimax optimality, strongly convex loss functions, stochastic gradient descent, the stochastic dual averaging method, AdaGrad, online stochastic optimization, batch stochastic optimization).
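The stochastic gradient descent mentioned in this entry can be sketched minimally in Python; the function name, learning rate, and toy least-squares data below are illustrative assumptions, not taken from the linked article:

```python
import random

def sgd(grad, w0, data, lr=0.01, epochs=50, seed=0):
    """Minimal stochastic gradient descent: update on one sample at a time."""
    rng = random.Random(seed)
    w = w0
    for _ in range(epochs):
        rng.shuffle(data)  # visit samples in random order each epoch
        for x, y in data:
            w -= lr * grad(w, x, y)
    return w

# Toy problem: fit y = 3*x by squared loss; gradient of (w*x - y)^2 wrt w.
data = [(x, 3.0 * x) for x in [0.5, 1.0, 1.5, 2.0]]
w = sgd(lambda w, x, y: 2 * (w * x - y) * x, 0.0, data)
```

With a small constant step size the iterates contract toward the least-squares solution; diminishing step sizes are what the regret analyses in the article's setting typically assume.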
アルゴリズム:Algorithms

Protected: Unconstrained optimization for continuous optimization in machine learning

Unconstrained optimization for continuous optimization in machine learning, used in digital transformation, artificial intelligence, and machine learning tasks (machine epsilon, stopping conditions without scaling, stopping conditions with scaling, Taylor's theorem, stopping conditions for optimization algorithms, Hessian matrix).
アルゴリズム:Algorithms

Protected: Unsupervised Learning with Gaussian Processes (1) Overview and Algorithm of Gaussian Process Latent Variable Models

Overview and algorithms of unsupervised learning using Gaussian process latent variable models (GPLVM), an application of probabilistic generative models used in digital transformation, artificial intelligence, and machine learning (Bayesian Gaussian process latent variable models, Bayesian GPLVM).
アルゴリズム:Algorithms

Protected: Implementation of model-free reinforcement learning in python (2) Monte Carlo and TD methods

Python implementations of model-free reinforcement learning such as Monte Carlo and TD methods (Q-learning, value-based methods, Monte Carlo methods, neural nets, the epsilon-greedy method, the TD(λ) method, multi-step learning, Rainbow, A3C/A2C, DDPG, APE-X DQN).
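The epsilon-greedy method listed above admits a very small sketch; the function name and default epsilon here are illustrative, not the article's implementation:

```python
import random

def epsilon_greedy(q_values, epsilon=0.1, rng=random):
    """With probability epsilon pick a random action (explore),
    otherwise pick the action with the highest Q-value (exploit)."""
    if rng.random() < epsilon:
        return rng.randrange(len(q_values))
    return max(range(len(q_values)), key=lambda a: q_values[a])

# With epsilon = 0 the choice is purely greedy.
best = epsilon_greedy([0.0, 1.0, 0.5], epsilon=0.0)
```

Value-based methods such as Q-learning typically pair this action selection with a TD update of the Q-table after each transition.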
Clojure

Protected: Clojure implementation of distributed computation processing (map-reduce) used in Hadoop

Clojure implementation of distributed computation processing (map-reduce) used in Hadoop for digital transformation, artificial intelligence, and machine learning tasks (Tesser, reducer functions, fold, cost function, gradient descent method, feature extraction, the feature-scales function, feature scaling, gradient descent learning rate, gradient descent update rule, iterative algorithms, multiple regression, correlation matrix, fuse, commutativity, linear regression, co-reduction, covariance, Hadoop, parallel fold).
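The feature-scaling-plus-gradient-descent pipeline this entry describes (in Clojure with Tesser) can be sketched in plain Python for orientation; the function names, learning rate, and toy data are illustrative assumptions:

```python
def feature_scale(xs):
    """Standardize a feature column to zero mean and unit variance."""
    mean = sum(xs) / len(xs)
    var = sum((x - mean) ** 2 for x in xs) / len(xs)
    std = var ** 0.5 or 1.0  # guard against a constant column
    return [(x - mean) / std for x in xs]

def gradient_descent_step(w, b, xs, ys, lr=0.1):
    """One squared-loss update for simple linear regression y ≈ w*x + b."""
    n = len(xs)
    dw = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
    db = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
    return w - lr * dw, b - lr * db

xs = feature_scale([1.0, 2.0, 3.0, 4.0])
ys = [2.0, 4.0, 6.0, 8.0]
w, b = 0.0, 0.0
for _ in range(200):
    w, b = gradient_descent_step(w, b, xs, ys)
```

Scaling the features first is what keeps one learning rate adequate for all coordinates; in the Tesser version the per-sample sums in each step are what get distributed as a commutative parallel fold.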
コンピューター:Computer

Introduction to FPGAs for Software Engineers: Machine Learning

Summary An FPGA (Field Programmable Gate Array) is a programmable hardware device that can perform high-...
グラフ理論:Graph Theory

Protected: Information Geometry of Positive Definite Matrices (1) Introduction of dual geometric structure

Introduction of dual geometric structures as information geometry for positive definite matrices utilized in digital transformation, artificial intelligence, and machine learning tasks (Riemannian metric, tangent vector space, semidefinite programming problem, self-equilibrium, Levi-Civita connection, Riemannian geometry, geodesics, Euclidean geometry, ∇-geodesics, tangent vector, tensor quantity, dual flatness, set of positive definite matrices).
バンディット問題:Bandit Problems

Protected: Fundamentals of Stochastic Bandit Problems

Basics of stochastic bandit problems utilized in digital transformation, artificial intelligence, and machine learning tasks (the large deviation principle and examples for the Bernoulli distribution, the Chernoff-Hoeffding inequality, Sanov's theorem, Hoeffding's inequality, Kullback-Leibler divergence, probability mass function, tail probability, probability approximation by the central limit theorem).
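Hoeffding's inequality, central to the bandit analyses this entry covers, can be checked numerically; the function names, sample sizes, and Monte Carlo setup below are illustrative assumptions:

```python
import math
import random

def hoeffding_bound(n, t):
    """Two-sided Hoeffding bound P(|mean - mu| >= t) <= 2*exp(-2*n*t^2)
    for n i.i.d. samples taking values in [0, 1]."""
    return 2 * math.exp(-2 * n * t * t)

def empirical_tail(n, t, p=0.5, trials=2000, seed=0):
    """Monte Carlo estimate of P(|mean - p| >= t) for Bernoulli(p) samples."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        mean = sum(rng.random() < p for _ in range(n)) / n
        hits += abs(mean - p) >= t
    return hits / trials

n, t = 100, 0.1
tail = empirical_tail(n, t)
bound = hoeffding_bound(n, t)
```

The bound is loose but dimension-free in the mean, which is exactly what makes it usable for the per-arm confidence intervals in UCB-style bandit algorithms.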
微分積分:Calculus

Protected: Fundamentals of Convex Analysis as a Basic Matter for Sequential Optimization in Machine Learning

Basics of convex analysis as a fundamental matter of continuous optimization utilized in digital transformation, artificial intelligence, and machine learning tasks (subgradient, subdifferential, conjugate function, closed proper convex function, strongly convex function, upper and lower bounds on function values, Hessian matrix, epigraph, Taylor's theorem, relative interior, affine hull, continuity, convex hull, convex function, convex set).
アルゴリズム:Algorithms

Protected: Basic Framework of Statistical Mathematics Theory

Basic framework of statistical mathematics theory used in digital transformation, artificial intelligence, and machine learning tasks (regularization, approximation and estimation errors, Hoeffding's inequality, prediction discriminant error, statistical consistency, learning algorithms, performance evaluation, ROC curves, AUC, Bayes rule, Bayes error, prediction loss, empirical loss).