Registration number: 0225U005198 (0124U002162)
Document type: R & D report
Title: New Subgradient and Extragradient Methods for Non-smooth Regression Problems
Stage title: Development and software implementation of subgradient and extragradient algorithms for problems related to image processing and machine learning
Heads: Liashko Serhii I., Dr. Sc. (Phys.-Math.); Serhiienko Ivan V., Dr. Sc. (Phys.-Math.)
Registration date: 26-12-2025
Organization: Department for special training, National academy of science of Ukraine at Kiev University

Description: The goal of the research is to create new effective subgradient and extragradient methods, with theoretical justification, for non-smooth regression problems.

Properties of three computational forms of r-algorithms are investigated, along with nonsmooth L_1-regularized linear regression problems and the problem of finding the minimum-radius Euclidean ball containing a finite set of points in R^n. Existence theorems, necessary optimality conditions, and approximate solution methods for vector optimization problems in linear distributed systems are considered. The application of the ellipsoid-method algorithm emshor to a range of problems is studied, including: estimating the parameters of a linear regression model with L_1-regularization under a least p-th power absolute deviations criterion with p ∈ [1, 2], as well as a generalized nonsmooth regression model; and a nonparametric regression problem of approximating convex/concave quadratic functions under the same least p-th power absolute deviations criterion, p ∈ [1, 2]. Computational experiments are performed for the one-dimensional total variation (TV) problem in denoising applications. For training a binary SVM classification model with the hinge loss function, the ralgsvm algorithm is developed and implemented.
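To illustrate the class of nonsmooth problems the report addresses, here is a minimal subgradient-descent sketch for L_1-regularized regression with the sum-of-absolute-deviations criterion (the p = 1 case). This is only a generic textbook method for the same objective, not the report's emshor or r-algorithm implementations; the function name and step-size rule are the author's own choices.

```python
import numpy as np

def subgrad_l1_regression(X, y, lam=0.1, iters=2000, step0=1.0):
    """Plain subgradient descent for min_w ||Xw - y||_1 + lam * ||w||_1.

    A minimal illustrative sketch (NOT the report's emshor/r-algorithm
    codes). The diminishing steps step0 / sqrt(k) make the best iterate
    converge for this convex nonsmooth objective.
    """
    w = np.zeros(X.shape[1])
    best_w, best_f = w.copy(), np.inf
    for k in range(1, iters + 1):
        r = X @ w - y
        # sign(.) picks one valid subgradient of each |.| term
        g = X.T @ np.sign(r) + lam * np.sign(w)
        w = w - (step0 / np.sqrt(k)) * g
        f = np.abs(X @ w - y).sum() + lam * np.abs(w).sum()
        if f < best_f:                 # track the best iterate,
            best_f, best_w = f, w.copy()  # since f is not monotone here
    return best_w, best_f
```

Because the objective is nonsmooth, plain subgradient iterates oscillate; tracking the best objective value seen so far is the standard remedy.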
To solve the problem of minimizing a convex quadratic function under two-sided linear constraints, the QPralg algorithm is constructed and implemented. Algorithms for solving variational inequalities with monotone, Lipschitz, or Hölder continuous operators in 2-uniformly convex and uniformly smooth Banach spaces are presented. For extrapolation-from-the-past algorithms and operator-extrapolation methods with Bregman divergence applied to variational inequalities, efficiency estimates are obtained in terms of the gap function. Several variants of the extragradient algorithm and of operator-extrapolation methods for pseudomonotone variational inequalities of saddle-point type are developed for problems arising in data science and noisy-signal processing.

Authors: Volodymyr V. Semenov, Petro I. Stetsyuk, Viktor O. Stovba, Korablov Mykola M., Kovalenko Oleksandra Yu., Cherhykalo Denys O.
NRAT date: 2025-12-26
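The extragradient idea mentioned above can be sketched on a toy bilinear saddle-point problem min_x max_y xᵀAy, whose operator F(x, y) = (Ay, −Aᵀx) is monotone and Lipschitz. This is Korpelevich's classical two-step scheme on a hypothetical instance, not the report's operator-extrapolation or Bregman-divergence variants; the step size tau and the test matrix are assumptions.

```python
import numpy as np

def extragradient_saddle(A, steps=500, tau=0.3):
    """Korpelevich's extragradient method for min_x max_y x^T A y.

    Illustrative sketch only (NOT the report's algorithms). Requires
    tau < 1 / ||A|| for convergence to the saddle point (0, 0).
    """
    n, m = A.shape
    x, y = np.ones(n), np.ones(m)
    for _ in range(steps):
        # prediction (extrapolation) step: move along F at the current point
        xb = x - tau * (A @ y)
        yb = y + tau * (A.T @ x)
        # correction step: move along F evaluated at the predicted point
        x = x - tau * (A @ yb)
        y = y + tau * (A.T @ xb)
    return x, y
```

The correction step uses the operator at the extrapolated point; dropping it (plain gradient descent-ascent) fails to converge on this bilinear problem, which is exactly why the extragradient family exists.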
R & D report
Head: Liashko Serhii I. New Subgradient and Extragradient Methods for Non-smooth Regression Problems. (Stage: Development and software implementation of subgradient and extragradient algorithms for problems related to image processing and machine learning). Department for special training, National academy of science of Ukraine at Kiev University. № 0225U005198

Updated: 2026-03-22