Regularization

Algorithms

Protected: Batch Stochastic Optimization – Stochastic Variance-Reduced Gradient Descent and Stochastic Average Gradient Methods

Batch stochastic optimization for digital transformation, artificial intelligence, and machine learning tasks: stochastic variance-reduced gradient descent and stochastic average gradient methods (SAGA, SAG, convergence rate, regularization term, strong convexity condition, improved stochastic average gradient method, unbiased estimator, SVRG, algorithm, regularization, step size, memory efficiency, Nesterov's acceleration method, mini-batch method, SDCA)
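As a quick illustration of the variance-reduction idea behind SVRG named above, here is a minimal Python sketch (not code from the protected article; the least-squares objective, step size, and iteration counts are illustrative assumptions). Each epoch computes a full gradient at a snapshot point, and every inner step corrects a stochastic gradient with the snapshot gradient, keeping the estimate unbiased while shrinking its variance:

```python
import numpy as np

def svrg(grad_i, w0, n, step=0.01, epochs=10, inner=None):
    """Minimal SVRG sketch: grad_i(w, i) returns the gradient of the
    i-th component function; n is the number of components."""
    inner = inner or n
    w = w0.copy()
    for _ in range(epochs):
        w_snap = w.copy()
        # Full gradient at the snapshot point (once per epoch).
        mu = sum(grad_i(w_snap, i) for i in range(n)) / n
        for _ in range(inner):
            i = np.random.randint(n)
            # Variance-reduced gradient: unbiased, with shrinking variance.
            g = grad_i(w, i) - grad_i(w_snap, i) + mu
            w -= step * g
    return w

# Illustrative least-squares components: f_i(w) = 0.5 * (x_i @ w - y_i)^2
rng = np.random.default_rng(0)
X, y = rng.normal(size=(100, 5)), rng.normal(size=100)
gi = lambda w, i: (X[i] @ w - y[i]) * X[i]
print(svrg(gi, np.zeros(5), n=100))
```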
Algorithms

Protected: Basic Framework of Statistical Mathematics Theory

Basic framework of statistical mathematics theory used in digital transformation, artificial intelligence, and machine learning tasks: regularization, approximation and estimation errors, Hoeffding's inequality, predictive discrimination error, statistical consistency, learning algorithms, performance evaluation, ROC curves, AUC, Bayes rule, Bayes error, prediction loss, empirical loss
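As a small numerical illustration of Hoeffding's inequality from the list above (a sketch under assumed [0,1]-valued Bernoulli losses; the sample size, tolerance, and probability are arbitrary), the bound P(|empirical mean − expectation| ≥ ε) ≤ 2·exp(−2nε²) can be checked against simulation:

```python
import numpy as np

# Hoeffding bound for [0,1]-valued losses: 2 * exp(-2 * n * eps^2).
n, eps, p = 1000, 0.05, 0.3
bound = 2 * np.exp(-2 * n * eps**2)

rng = np.random.default_rng(0)
# 10000 repetitions of averaging n Bernoulli(p) losses.
means = rng.binomial(1, p, size=(10000, n)).mean(axis=1)
empirical = np.mean(np.abs(means - p) >= eps)
print(f"Hoeffding bound: {bound:.4f}, empirical frequency: {empirical:.4f}")
```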
Algorithms

Protected: Supervised learning and regularization

Overview of supervised learning (regression and discrimination) and regularization (ridge regularization, L1 regularization, bridge regularization, elastic net regularization, SCAD, group regularization, generalized fused regularization, trace norm regularization) as the basis of machine learning optimization methods used for digital transformation, artificial intelligence, and machine learning tasks
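To make two of the regularizers above concrete, here is a minimal sketch of ridge regression's closed form and the soft-thresholding operator that underlies L1 regularization (the data and lambda values are illustrative assumptions, not code from the article):

```python
import numpy as np

def ridge(X, y, lam):
    """Closed-form ridge (L2-regularized) least squares:
    w = (X^T X + lam * I)^{-1} X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def soft_threshold(w, lam):
    """Proximal operator of the L1 norm: shrinks toward zero,
    zeroing out small coefficients (the source of sparsity)."""
    return np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)

rng = np.random.default_rng(0)
X, y = rng.normal(size=(50, 10)), rng.normal(size=50)
w_ridge = ridge(X, y, lam=1.0)
print(w_ridge)
print(soft_threshold(w_ridge, lam=0.1))
```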
IoT Technology

Protected: Structural regularization learning with submodular optimization (1) Regularization and p-norm review

Review of sparse modeling, regularization, and p-norms as a basis for structural regularization learning with submodular optimization, an optimization technique for discrete information, for digital transformation, artificial intelligence, and machine learning tasks
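As a one-screen refresher on the p-norms reviewed above (a minimal sketch; the example vector is arbitrary), ‖w‖_p = (Σ|w_i|^p)^(1/p), with the max norm and the sparsity-counting "L0 norm" as limiting cases:

```python
import numpy as np

w = np.array([3.0, -4.0, 0.0, 1.0])
for p in (1, 2, 3):
    # ||w||_p = (sum_i |w_i|^p)^(1/p)
    print(f"p={p}: {np.sum(np.abs(w) ** p) ** (1.0 / p):.4f}")
# p -> infinity gives the max norm; the "p=0 norm" counts nonzeros (sparsity).
print("p=inf:", np.max(np.abs(w)), " 'p=0':", np.count_nonzero(w))
```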
Symbolic Logic

Protected: Fundamentals of Submodular Optimization (2) Basic Properties of Submodular Functions

Three basic properties of submodular functions (normalized, non-negative, symmetric) as a basis for optimization algorithms (submodular optimization) over discrete information for digital transformation, artificial intelligence, and machine learning tasks, and their application to graph cut maximization and minimization problems
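The three properties and the submodularity of a graph cut function can be checked by brute force on a toy graph. A minimal sketch (the 4-node graph below is an assumption for illustration, not from the article):

```python
from itertools import combinations

# Undirected graph on 4 nodes; cut(S) counts edges crossing between S and V\S.
V = {0, 1, 2, 3}
E = [(0, 1), (1, 2), (2, 3), (0, 3), (0, 2)]

def cut(S):
    return sum(1 for u, v in E if (u in S) != (v in S))

subsets = [set(c) for r in range(len(V) + 1) for c in combinations(sorted(V), r)]
# Submodularity: F(A) + F(B) >= F(A | B) + F(A & B) for all A, B.
print(all(cut(A) + cut(B) >= cut(A | B) + cut(A & B)
          for A in subsets for B in subsets))
# The three basic properties: normalized, non-negative, symmetric.
print(cut(set()) == 0, all(cut(S) >= 0 for S in subsets),
      all(cut(S) == cut(V - S) for S in subsets))
```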
Online Learning

Protected: Reinforcement Learning with Function Approximation (2) – Function Approximation of Value Functions (For Online Learning)

Theory of online function approximation methods (gradient TD learning, least-squares-based TD learning (LSTD), GTD2) for reinforcement learning with a huge number of states, used in digital transformation, artificial intelligence, and machine learning tasks, and regularization with LASSO.
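For orientation, here is a minimal sketch of semi-gradient TD(0) with linear function approximation, the simplest relative of the gradient-TD and LSTD methods listed above (the toy chain environment, one-hot features, and step sizes are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
n_states = 5
phi = np.eye(n_states)   # one-hot features (tabular as a special case)
w = np.zeros(n_states)   # linear value estimate: V(s) = phi[s] @ w
alpha, gamma = 0.1, 0.9

for _ in range(2000):
    s = rng.integers(n_states)
    s2 = min(s + 1, n_states - 1)           # toy chain: always move right
    r = 1.0 if s2 == n_states - 1 else 0.0  # reward when next state is last
    # Semi-gradient TD(0): w += alpha * (r + gamma*V(s') - V(s)) * phi(s)
    td_error = r + gamma * (phi[s2] @ w) - (phi[s] @ w)
    w += alpha * td_error * phi[s]

print(np.round(w, 3))
```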
Optimization

Machine Learning Professional Series "Sparsity-Based Machine Learning" Reading Notes

Overview of sparse modeling used for regularization and other applications in machine learning for digital transformation, artificial intelligence, and machine learning tasks.
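As a taste of the sparse modeling the book covers, here is a minimal ISTA (iterative soft-thresholding) sketch for the lasso (the synthetic data, lambda, and step size are assumptions, not from the reading notes):

```python
import numpy as np

def ista(X, y, lam, step, iters=500):
    """ISTA sketch for the lasso: a gradient step on the squared loss
    followed by soft thresholding (the proximal step for the L1 term)."""
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        g = X.T @ (X @ w - y)                             # gradient step
        w = w - step * g
        w = np.sign(w) * np.maximum(np.abs(w) - step * lam, 0.0)
    return w

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))
w_true = np.zeros(20)
w_true[:3] = [2.0, -1.0, 0.5]                             # sparse ground truth
y = X @ w_true + 0.01 * rng.normal(size=100)
print(np.round(ista(X, y, lam=5.0, step=0.002), 2))       # sparse estimate
```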
Calculus

Protected: Anomaly Detection Using Sparse Structure Learning – Graph models and regularization that link broken dependencies between variables to anomalies

Graph models and regularization that link broken dependencies between variables to anomalies, for digital transformation, artificial intelligence, and machine learning tasks.
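One way to realize this idea is the graphical lasso, which estimates a sparse precision matrix whose nonzero entries encode conditional dependencies; comparing precision matrices before and after a dependency breaks flags the responsible variables. A minimal sketch (scikit-learn's GraphicalLasso, and all data and parameters here, are illustrative assumptions rather than necessarily the article's method):

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)
# Reference data: variables 0 and 1 strongly dependent, variable 2 independent.
z = rng.normal(size=500)
X_ref = np.column_stack([z, z + 0.5 * rng.normal(size=500),
                         rng.normal(size=500)])
# "Faulty" data: the dependency between variables 0 and 1 is broken.
X_new = rng.normal(size=(500, 3))

P_ref = GraphicalLasso(alpha=0.1).fit(X_ref).precision_
P_new = GraphicalLasso(alpha=0.1).fit(X_new).precision_
# Per-variable change in precision entries highlights the broken dependency.
print(np.round(np.abs(P_ref - P_new).sum(axis=1), 2))
```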
Calculus

"This Is a Good Introduction to Deep Learning" (Machine Learning Startup Series) Reading Notes

Overview of deep learning for digital transformation and artificial intelligence tasks, including machine learning basics, gradient descent, regularization, error backpropagation, autoencoders, convolutional neural networks, recurrent neural networks, Boltzmann machines, and reinforcement learning.
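As a minimal illustration of gradient descent and error backpropagation from the list above, here is a tiny one-hidden-layer network in NumPy (the architecture, synthetic data, and learning rate are arbitrary assumptions, not taken from the book):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(float).reshape(-1, 1)  # XOR-like target

# One hidden layer, manual error backpropagation, plain gradient descent.
W1, b1 = rng.normal(size=(2, 8)) * 0.5, np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)) * 0.5, np.zeros(1)
lr = 0.5

for _ in range(2000):
    h = np.tanh(X @ W1 + b1)                  # forward pass, hidden layer
    p = 1 / (1 + np.exp(-(h @ W2 + b2)))      # sigmoid output
    # Backward pass for cross-entropy loss with a sigmoid output.
    dp = (p - y) / len(X)
    dW2, db2 = h.T @ dp, dp.sum(axis=0)
    dh = dp @ W2.T * (1 - h**2)               # tanh derivative
    dW1, db1 = X.T @ dh, dh.sum(axis=0)
    for param, grad in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
        param -= lr * grad                    # gradient descent step

print("training accuracy:", ((p > 0.5) == y).mean())
```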