Fundamentals

Algorithms

Protected: Fundamentals of convex analysis in stochastic optimization (1): Convex functions, subdifferentials, and dual functions

Convex functions, subdifferentials, and dual functions (convex function, conjugate function, Young-Fenchel inequality, subdifferential, Legendre transform, subgradient, L1 norm, relative interior, affine hull, affine set, closure, epigraph, convex hull, smooth convex function, strictly convex function, proper closed convex function, effective domain, convex set) as basic matters of convex analysis in stochastic optimization used for digital transformation, artificial intelligence, and machine learning tasks.
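The concepts listed above can be made concrete with a small numeric sketch. The following is a minimal, illustrative example (the functions and test points are my own assumptions, not from the article): it checks the Young-Fenchel inequality for the self-conjugate function f(x) = ½‖x‖², and the subgradient inequality for the L1 norm.

```python
import numpy as np

# f(x) = 0.5 * ||x||^2 is self-conjugate: its conjugate f*(y) = 0.5 * ||y||^2.
def f(x):
    return 0.5 * np.dot(x, x)

def f_conj(y):
    return 0.5 * np.dot(y, y)

rng = np.random.default_rng(0)
x, y = rng.normal(size=3), rng.normal(size=3)
# Young-Fenchel inequality: f(x) + f*(y) >= <x, y>
assert f(x) + f_conj(y) >= np.dot(x, y)

# A subgradient of the L1 norm: g_i = sign(x_i) where x_i != 0,
# and any g_i in [-1, 1] (here 0) where x_i = 0.
def l1_subgradient(x):
    return np.sign(x)

x0 = np.array([1.0, -2.0, 0.0])
g = l1_subgradient(x0)
# Subgradient inequality: ||z||_1 >= ||x0||_1 + <g, z - x0> for all z.
z = rng.normal(size=3)
assert np.sum(np.abs(z)) >= np.sum(np.abs(x0)) + np.dot(g, z - x0) - 1e-12
```

The L1 norm is not differentiable at points with zero coordinates, which is exactly where the subdifferential becomes a set rather than a single gradient.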
Algorithms

Protected: Kernel functions as the basis of kernel methods in the theory of statistical mathematics.

Kernel functions (Gaussian kernel, polynomial kernel, linear kernel, regression functions, linear models, regression problems, classification problems) as the basis for kernel methods in the theory of statistical mathematics used in digital transformation, artificial intelligence, and machine learning tasks.
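The three kernels named above have simple closed forms. A minimal sketch (test vectors and hyperparameters are illustrative assumptions):

```python
import numpy as np

def linear_kernel(x, y):
    # k(x, y) = <x, y>
    return float(np.dot(x, y))

def polynomial_kernel(x, y, degree=3, c=1.0):
    # k(x, y) = (<x, y> + c)^degree
    return float((np.dot(x, y) + c) ** degree)

def gaussian_kernel(x, y, sigma=1.0):
    # k(x, y) = exp(-||x - y||^2 / (2 * sigma^2))
    return float(np.exp(-np.sum((x - y) ** 2) / (2 * sigma ** 2)))

x = np.array([1.0, 0.0])
y = np.array([0.0, 1.0])
# Orthogonal vectors have zero linear kernel; any point has
# Gaussian kernel 1 with itself.
assert linear_kernel(x, y) == 0.0
assert gaussian_kernel(x, x) == 1.0
```

A Gram matrix built from any of these kernels is positive semidefinite, which is what lets kernel methods replace explicit feature maps with inner products.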
Algorithms

Machine Learning by Ensemble Methods – Fundamentals and Algorithms Reading Notes

Fundamentals and algorithms of machine learning with ensemble methods used in digital transformation, artificial intelligence, and machine learning tasks: class-imbalance learning, cost-sensitive learning, active learning, semi-supervised learning, similarity-based methods, clustering ensemble methods, graph-based methods, relabeling-based methods, transformation-based methods, clustering, optimization-based pruning, ensemble pruning, combination methods, bagging, boosting.
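Of the techniques listed, bagging is the easiest to sketch. Below is a minimal, assumption-laden illustration: the "weak learner" is a trivial mean-threshold stump on 1-D data (my own toy choice, not from the book), trained on bootstrap resamples and combined by majority vote.

```python
import random
from collections import Counter

def fit_stump(data):
    # data: list of (x, label) pairs; threshold at the mean of x
    return sum(x for x, _ in data) / len(data)

def predict_stump(threshold, x):
    return 1 if x >= threshold else 0

def bagging(data, n_estimators=25, seed=0):
    rng = random.Random(seed)
    stumps = []
    for _ in range(n_estimators):
        # Bootstrap: resample the training set with replacement
        sample = [rng.choice(data) for _ in data]
        stumps.append(fit_stump(sample))
    def predict(x):
        # Majority vote over the ensemble
        votes = Counter(predict_stump(t, x) for t in stumps)
        return votes.most_common(1)[0][0]
    return predict

data = [(0.1, 0), (0.2, 0), (0.3, 0), (0.8, 1), (0.9, 1), (1.0, 1)]
predict = bagging(data)
assert predict(0.0) == 0 and predict(1.2) == 1
```

The resampling decorrelates the learners, and voting reduces variance; boosting instead reweights examples sequentially, which this sketch does not show.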
Calculus

Protected: Fundamentals of Convex Analysis as a Basic Matter for Continuous Optimization in Machine Learning

Basics of convex analysis as a fundamental matter of continuous optimization utilized in digital transformation, artificial intelligence, and machine learning tasks: subgradient, subdifferential, conjugate function, proper closed convex function, strongly convex function, upper and lower bounds on function values, Hessian matrix, epigraph, Taylor's theorem, relative interior, affine hull, continuity, convex hull, convex function, convex set.
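Two of the items above, Taylor's theorem with the Hessian matrix and the link between convexity and a positive semidefinite Hessian, can be checked numerically. A minimal sketch on the toy convex function f(x) = exp(x₀) + x₁² (my own example):

```python
import numpy as np

def f(x):
    return np.exp(x[0]) + x[1] ** 2

def grad(x):
    return np.array([np.exp(x[0]), 2 * x[1]])

def hessian(x):
    return np.diag([np.exp(x[0]), 2.0])

x = np.array([0.3, -0.7])
h = np.array([1e-3, 2e-3])
# Second-order Taylor expansion: f(x+h) ~ f(x) + g^T h + 0.5 h^T H h,
# with error O(||h||^3).
taylor = f(x) + grad(x) @ h + 0.5 * h @ hessian(x) @ h
assert abs(f(x + h) - taylor) < 1e-8

# Convexity shows up as a positive semidefinite Hessian everywhere.
assert np.all(np.linalg.eigvalsh(hessian(x)) >= 0)
```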
Algorithms

Protected: Supervised learning and regularization

Overview of supervised learning (regression and classification) and regularization (ridge regularization, L1 regularization, bridge regularization, elastic net regularization, SCAD, group regularization, generalized fused regularization, trace norm regularization) as the basis of optimization methods in machine learning used for digital transformation, artificial intelligence, and machine learning tasks.
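L1 regularization, the workhorse on the list above, is usually handled through its proximal operator, soft thresholding. A minimal sketch (the regularization weight and test vector are illustrative assumptions):

```python
import numpy as np

def soft_threshold(w, lam):
    # Proximal operator of lam * ||w||_1: shrink each coordinate
    # toward zero by lam, and zero out anything smaller than lam.
    return np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)

w = np.array([3.0, -0.5, 0.2])
w_reg = soft_threshold(w, 1.0)
# Coordinates with |w_i| <= lam are set exactly to zero: sparsity.
assert np.allclose(w_reg, [2.0, 0.0, 0.0])
```

This exact-zeroing behavior is why L1-type penalties perform variable selection, while ridge (L2) regularization only shrinks coefficients without making them zero.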
Algorithms

Fundamentals of Continuous Optimization – Calculus and Linear Algebra

Fundamentals of Continuous Optimization - Calculus and Linear Algebra (Taylor's theorem, Hessian matrix, Landau notation, Lipschitz continuity, Lipschitz constant, implicit function theorem, Jacobian matrix, diagonal matrix, eigenvalues, positive semidefinite matrix, positive definite matrix, subspace, projection, rank-1 update, natural gradient method, quasi-Newton method, Sherman-Morrison formula, norm, Euclidean norm, p-norm, Cauchy-Schwarz inequality, Hölder inequality, functions on matrix spaces)
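The Sherman-Morrison formula mentioned above is what makes rank-1 updates cheap in quasi-Newton methods: the inverse of A + uvᵀ can be obtained from A⁻¹ without refactorizing. A minimal numeric check (the matrix and vectors are arbitrary assumptions):

```python
import numpy as np

# Sherman-Morrison: (A + u v^T)^{-1}
#   = A^{-1} - (A^{-1} u)(v^T A^{-1}) / (1 + v^T A^{-1} u)

rng = np.random.default_rng(0)
A = np.eye(3) + 0.1 * rng.normal(size=(3, 3))  # well-conditioned test matrix
u, v = rng.normal(size=3), rng.normal(size=3)

A_inv = np.linalg.inv(A)
denom = 1.0 + v @ A_inv @ u        # must be nonzero for the update to exist
updated_inv = A_inv - np.outer(A_inv @ u, v @ A_inv) / denom

# Compare against inverting the rank-1-updated matrix directly.
assert np.allclose(updated_inv, np.linalg.inv(A + np.outer(u, v)))
```

The update costs O(n²) instead of the O(n³) of a fresh inversion, which is the point of the rank-1 machinery in quasi-Newton schemes.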
Algorithms

Protected: Stochastic Generative Models and Gaussian Processes (2) Maximum Likelihood and Bayesian Estimation

Overview of maximum likelihood and Bayesian estimation as fundamentals of probabilistic generative models and Gaussian processes used in digital transformation, artificial intelligence, and machine learning tasks.
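The contrast between the two estimation approaches can be sketched on the simplest possible model, estimating a Gaussian mean with known variance (my own toy example, not from the article): maximum likelihood gives the sample mean, while a Gaussian prior yields a posterior mean shrunk toward the prior.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma2 = 1.0            # known observation variance (assumed)
mu_true = 2.0
y = rng.normal(mu_true, np.sqrt(sigma2), size=50)

# Maximum likelihood estimate: the sample mean.
mu_mle = y.mean()

# Bayesian estimate with prior mu ~ N(0, tau2): the posterior mean
# is a precision-weighted blend of the data mean and the prior mean 0.
tau2 = 1.0
n = len(y)
mu_post = (n / sigma2) / (n / sigma2 + 1 / tau2) * y.mean()

# Shrinkage: the Bayesian estimate lies between the MLE and the prior mean.
assert abs(mu_post) < abs(mu_mle)
```

As n grows, the data precision n/σ² dominates the prior precision 1/τ², and the Bayesian estimate converges to the maximum likelihood one.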
Algorithms

Stochastic Optimization

Stochastic optimization methods for solving large-scale learning problems on large amounts of data used in digital transformation, artificial intelligence, and machine learning tasks: supervised learning and regularization, basics of convex analysis, what stochastic optimization is, online stochastic optimization, batch stochastic optimization, and stochastic optimization in distributed environments.
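The prototypical online stochastic optimization method is plain stochastic gradient descent: at each step, sample one data point and step along its gradient. A minimal sketch on a synthetic least-squares problem (data, step sizes, and iteration count are all illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 3
X = rng.normal(size=(n, d))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.01 * rng.normal(size=n)   # slightly noisy labels

w = np.zeros(d)
for t in range(5000):
    i = rng.integers(n)                       # sample a single data point
    grad = (X[i] @ w - y[i]) * X[i]           # stochastic gradient of 0.5*(x_i^T w - y_i)^2
    w -= 0.01 / (1 + 0.001 * t) * grad        # decaying step size

assert np.allclose(w, w_true, atol=0.1)
```

Each iteration touches one example, so the per-step cost is independent of the dataset size, which is precisely what makes stochastic methods viable for large-scale learning.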
Symbolic Logic

Protected: Fundamentals of Submodular Optimization (2) Basic Properties of Submodular Functions

Three basic properties of submodular functions (normalization, non-negativity, symmetry) as a basis for optimization algorithms (submodular optimization) over discrete information used in digital transformation, artificial intelligence, and machine learning tasks, and their application to graph-cut maximization and minimization problems.
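The graph cut function is the classic example tying these properties together: F(S) = number of edges crossing the cut (S, V∖S) is normalized, non-negative, symmetric, and submodular. A minimal exhaustive check on a small toy graph (the graph itself is my own assumption):

```python
from itertools import combinations

V = {0, 1, 2, 3}
edges = [(0, 1), (1, 2), (2, 3), (0, 3), (1, 3)]

def cut(S):
    # Number of edges with exactly one endpoint in S
    return sum(1 for u, v in edges if (u in S) != (v in S))

subsets = []
for r in range(len(V) + 1):
    subsets += [frozenset(c) for c in combinations(V, r)]

# Submodularity: F(S) + F(T) >= F(S ∪ T) + F(S ∩ T) for all S, T.
for S in subsets:
    for T in subsets:
        assert cut(S) + cut(T) >= cut(S | T) + cut(S & T)

assert cut(frozenset()) == 0                     # normalized
assert all(cut(S) >= 0 for S in subsets)         # non-negative
assert all(cut(S) == cut(V - S) for S in subsets)  # symmetric
```

Minimizing this function is the minimum-cut problem, solvable in polynomial time, while maximizing it (max-cut) is NP-hard, which is why the two directions get such different algorithmic treatment.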