Profile Image
Assistant Lecturer

Yeldez Jingeez Subhi

Research Interests

Optimization

Numerical Analysis

Dai-Liao Analysis

Algebra

Gender FEMALE
Place of Work College of Oil & Gas Techniques Engineering / Kirkuk
Department Renewable Energy Techniques Department
Position Head of the Quality Assurance Unit
Qualification Master’s
Speciality Mathematics
Email yeldez.j.subhi@ntu.edu.iq
Phone 07722370470
Address Kirkuk/Alwasite, Kirkuk, Iraq

Skills

Turkmen language (100%)
English language (65%)
Arabic language (70%)
Kurdish language (75%)
Turkish language (86%)

Academic Qualification

Master's degree
Sep 20, 2020 - Nov 7, 2022

Bachelor's degree
Dec 1, 2012 - Jun 20, 2015

Working Experience

Mathematics / Numerical Analysis [worked at the University of Kirkuk / College of Dentistry]
Oct 8, 2018 - Sep 20, 2019

Mathematics / Numerical Analysis [worked at the Kirkuk Directorate of Education / Al-Intisar Mixed Secondary School]
Sep 15, 2017 - Jun 29, 2018

Mathematics / Numerical Analysis [lecturer and department rapporteur at Al-Kitab University / College of Petroleum Engineering]
Apr 29, 2023 - Aug 16, 2024

Mathematics / Numerical Analysis [lecturer at Northern Technical University / Kirkuk]
Sep 25, 2024 - Present

Publications

Numerical and Convergence Analysis of an Enhanced Dai-Liao Method for Unconstrained Optimization
Sep 1, 2025

Journal Journal of Mathematics and Its Applications

publisher Basim A. Hassan, Ibrahim Mohammed Sulaiman, Yeldez J. Subhi

DOI https://doi.org/10.30598/barekengvol19no4pp2993-3004

Iterative algorithms play an important role in mathematical optimization, particularly in solving large-scale unconstrained optimization problems. Conjugate gradient (CG) methods are widely used due to their low memory requirements and efficiency. However, their performance highly depends on the choice of parameters that influence search directions and convergence speed. Despite their advantages, traditional CG algorithms sometimes suffer from slow convergence or poor accuracy. The choice of gradient parameters significantly influences the performance, and there is a need to develop improved strategies to enhance solution accuracy and efficiency. (Article history: Received: 29 2025; Revised: 29 April 2025; Accepted: 10 June 2025; Available online: September 2025.)
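The abstract above describes the general CG framework; as a minimal sketch, the classical Dai-Liao parameter (beta_k = g_{k+1}^T (y_k - t s_k) / (d_k^T y_k)) can be plugged into a standard CG loop with a backtracking Armijo line search. The enhanced parameter proposed in the paper is not reproduced here, and the function name and tolerances are illustrative assumptions:

```python
import numpy as np

def dai_liao_cg(f, grad, x0, t=0.1, tol=1e-8, max_iter=500):
    """Nonlinear CG with the classical Dai-Liao parameter
    beta_k = g_{k+1}^T (y_k - t*s_k) / (d_k^T y_k).
    Sketch only; the paper's enhanced parameter is not reproduced."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # backtracking line search enforcing a simple Armijo condition
        alpha = 1.0
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * (g @ d):
            alpha *= 0.5
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        denom = d @ y
        beta = (g_new @ (y - t * s)) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d                  # new search direction
        x, g = x_new, g_new
    return x
```

On a simple strictly convex quadratic the iteration reduces to steepest descent on the first step and terminates once the gradient norm falls below the tolerance.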

Solving single variable functions using a new secant method
Feb 13, 2025

Journal Journal of Interdisciplinary Mathematics

publisher Hawraz N. Jabbar, Yeldez J. Subhi, Hakeem N. Hussein, Basim A. Hassan

DOI https://doi.org/10.47974/JIM-1854

Issue 0972-0502 (Print), ISSN: 2169-012X (Online)

Volume 28 (2025), No. 1, pp. 245–251

The quadratically convergent Newton method is a fundamental and significant approach for minimizing functions of one variable. To solve a single-variable minimization problem, this study derives a novel secant-type approach based on estimating second-derivative information. The convergence of the novel secant-type iterative approach is of order
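For orientation, the classical secant iteration that the paper builds on applies the secant formula to the first derivative, since a minimizer of f is a root of f'. This is a generic sketch under that assumption; the paper's modified second-derivative estimate is not reproduced, and `secant_minimize` is an illustrative name:

```python
def secant_minimize(df, x0, x1, tol=1e-10, max_iter=100):
    """Classical secant iteration on the derivative df:
    x_{k+1} = x_k - df(x_k) * (x_k - x_{k-1}) / (df(x_k) - df(x_{k-1})).
    Converges to a stationary point of f (a root of df)."""
    f0, f1 = df(x0), df(x1)
    for _ in range(max_iter):
        if abs(f1) < tol or f1 == f0:   # converged, or flat secant slope
            break
        x0, x1 = x1, x1 - f1 * (x1 - x0) / (f1 - f0)
        f0, f1 = f1, df(x1)
    return x1
```

For a quadratic f the derivative is linear, so the secant step lands on the exact minimizer in one iteration.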

On new secant-method for minimum functions of one variable
Feb 13, 2025

Journal Journal of Interdisciplinary Mathematics

publisher Ali M. Jasim , Yeldez J. Subhi & Basim Abbas Hassan

DOI https://doi.org/10.47974/JIM-1899

Issue 0972-0502 (Print), ISSN: 2169-012X (Online)

Volume 28 (2025), No. 1, pp. 291–296

In this article, we developed the Newton method by utilizing the Taylor series to estimate derivatives based on the function’s minimum value. The aim was to reduce the number of iterations required to obtain the optimal solution of the function. We compare the execution time and number of iterations between the proposed approach and the classical Newton method.
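The baseline being compared against can be sketched as the textbook Newton iteration for a one-variable minimum, with both derivatives estimated by Taylor-series-based central finite differences. This is a generic sketch, not the paper's specific derivative estimate; the step size `h` and function name are assumptions:

```python
def newton_minimize(f, x0, h=1e-5, tol=1e-8, max_iter=100):
    """Classical Newton iteration x_{k+1} = x_k - f'(x_k)/f''(x_k),
    with derivatives approximated by central finite differences."""
    x = x0
    for _ in range(max_iter):
        d1 = (f(x + h) - f(x - h)) / (2 * h)            # ~ f'(x)
        d2 = (f(x + h) - 2 * f(x) + f(x - h)) / h ** 2  # ~ f''(x)
        if abs(d2) < 1e-14:                             # avoid division by ~0
            break
        step = d1 / d2
        x -= step
        if abs(step) < tol:
            break
    return x
```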

Impact to formula gradient impulse noise reduction from images
Oct 15, 2024

Journal Journal of Interdisciplinary Mathematics

publisher Isam H. Halil, Yeldez J. Subhi, Basim A. Hassan

DOI 10.47974/JIM-2187

For the majority of image processing techniques and applications, denoising a photo is a must. The Taylor series is used to suggest a new conjugate gradient scalar. Together with the descent property, the novel formula satisfies the convergence properties. Lastly, we provide a few illustrations of picture restoration using the suggested conjugate gradient technique.

Enhancements Self-Scaling Quasi-Newton for Unconstrained Optimization
May 5, 2024

Journal Advances in Nonlinear Variational Inequalities

publisher Basim A. Hassan, Hakeem N. Hussein, Yeldez J. Subhi, Yoksal A. Laylani, Hawraz N. Jabbar, Mohammed W. Taha

DOI https://doi.org/10.52783/anvi.v27.974

Issue 1092-910X

Volume Vol. 27, No. 2 (2024)

A self-scaling for the quasi-Newton technique is derived by using a second-order Taylor expansion to achieve optimal computational performance. Following this, new updating formulas for the quasi-Newton method are introduced based on the newly derived self-scaling equation. The numerical results confirm this derivation and suggest that the new method could potentially rival the BFGS method in terms of performance.
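As context for the self-scaling idea, here is a minimal sketch of one self-scaling BFGS update using the classical Oren-Luenberger scaling factor; the paper's newly derived scaling equation is not reproduced, and the function name is an illustrative assumption:

```python
import numpy as np

def selfscaling_bfgs_update(H, s, y):
    """One self-scaling BFGS update of an inverse-Hessian approximation H.
    H is first rescaled by the classical factor tau = (s^T y) / (y^T H y),
    then the standard BFGS formula is applied. s = x_{k+1} - x_k,
    y = g_{k+1} - g_k."""
    sy = s @ y
    if sy <= 1e-12:                      # curvature condition failed; skip
        return H
    H = (sy / (y @ H @ y)) * H           # self-scaling step
    rho = 1.0 / sy
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)
```

After the update the secant equation H_{k+1} y_k = s_k holds, which is the defining property of quasi-Newton updates.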

Image Impulse Noise Reduction Using a Conjugate Gradient of Alternative Parameter
Jul 30, 2023

Journal EUROPEAN JOURNAL OF PURE AND APPLIED MATHEMATICS

publisher Hawraz N. Jabbar, Yeldez J. Subhi, Basim A. Hassan

DOI https://doi.org/10.29020/nybg.ejpam.v16i3.4849

Issue No. 3 (July 2023)

Volume Vol. 16

Conjugate gradient approaches emphasise the conjugate formula. This study creates a new conjugate coefficient for the conjugate gradient approach to restore pictures using Perry’s conjugacy condition and a quadratic model. The algorithms have global convergence and the descent property. The new technique performed better in numerical testing, and the new conjugate gradient technique outperforms the FR method.