Tangent-cut optimizer on gradient descent: an approach towards Hybrid Heuristics

Authors: Saptarshi Biswas, Subhrapratim Nath, Sumagna Dey, Utsha Majumdar

Abstract

Artificial Intelligence systems have seen widespread use for a long time, and many problems are shifting from purely logical solutions into statistical domains. This shift calls for machine learning algorithms that mine useful information from statistical datasets, which in turn demands high-end computing. Machine learning algorithms generally use Gradient Descent to find the optimal solution of computationally expensive problems, which gave rise to optimization algorithms such as Momentum, RMSProp, and Adam that speed up convergence to the global optimum while improving learning accuracy. However, supervised machine learning models have become increasingly data-intensive, raising their computational cost and putting the efficiency of these algorithms into question. In this context, a new optimization algorithm, the Tangent-Cut Optimizer (TC-Opt), is proposed, which converges faster than traditional optimization algorithms for supervised machine learning models. Furthermore, the proposed work intertwines statistical and logical decision-making into a single model, shedding light on a new heuristic approach named "Hybrid Heuristics". The proposed algorithm has been implemented on the standard Boston House Pricing dataset for linear regression and the MNIST dataset of handwritten digits (0 to 9) for logistic regression, and its performance has been compared with that of the existing algorithms. Finally, the robustness and high accuracy of the proposed optimization algorithm are demonstrated.
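
For reference, the update rules of the baseline optimizers the abstract names (vanilla Gradient Descent, Momentum, RMSProp, and Adam) are sketched below on a toy quadratic objective. The quadratic stand-in and all hyperparameter values are illustrative assumptions, not the paper's experimental setup; the abstract does not specify TC-Opt's update rule, so only the baselines it compares against are shown.

```python
import numpy as np

def grad(w):
    # Gradient of a toy quadratic objective f(w) = 0.5 * ||w||^2,
    # standing in for a real model's loss gradient (assumption).
    return w

lr = 0.1  # learning rate (illustrative value)

# --- Vanilla gradient descent: step against the raw gradient ---
w = np.full(2, 5.0)
for _ in range(100):
    w -= lr * grad(w)

# --- Momentum: accumulate a velocity of past gradients ---
w, v, beta = np.full(2, 5.0), np.zeros(2), 0.9
for _ in range(100):
    v = beta * v + grad(w)
    w -= lr * v

# --- RMSProp: scale each step by a running average of squared gradients ---
w, s, rho, eps = np.full(2, 5.0), np.zeros(2), 0.9, 1e-8
for _ in range(100):
    g = grad(w)
    s = rho * s + (1 - rho) * g**2
    w -= lr * g / (np.sqrt(s) + eps)

# --- Adam: momentum plus per-coordinate scaling, with bias correction ---
w, m, v = np.full(2, 5.0), np.zeros(2), np.zeros(2)
b1, b2, eps = 0.9, 0.999, 1e-8
for t in range(1, 101):
    g = grad(w)
    m = b1 * m + (1 - b1) * g        # first-moment estimate
    v = b2 * v + (1 - b2) * g**2     # second-moment estimate
    m_hat = m / (1 - b1**t)          # correct initialization bias
    v_hat = v / (1 - b2**t)
    w -= lr * m_hat / (np.sqrt(v_hat) + eps)
```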

Keywords: Gradient Descent, Heuristics, Hybrid Heuristics, Optimization, Supervised learning

DOI: https://doi.org/10.1007/s10462-021-09984-0