Abstract:
We explore a specific subclass of convex functions with stronger properties, known as
strongly convex functions. By focusing on strong convexity, we revisit
classical inequalities such as Jensen's inequality and Hermite-Hadamard (HH) type inequalities.
This approach leads to more robust estimates and refinements of well-known divergence
measures such as the Kullback-Leibler (KL), χ2-, and Jeffreys divergences, among others. Moreover,
we extend our investigation to improved Riemann-Liouville HH ϒ-divergence inequalities
tailored to strongly convex functions. These improvements serve as
a foundation to bridge fractional information inequalities with recent significant research
outcomes, providing valuable insights and connections within the field of mathematical
analysis and information theory. We also establish various novel bounds for the Csiszár and related
divergences and for the Zipf-Mandelbrot entropy by means of the Jensen-Mercer inequality via
strongly convex functions.