
Yeldez Jingeez Subhi
Research Interests
| Gender | Female |
|---|---|
| Place of Work | College of Oil & Gas Techniques Engineering / Kirkuk |
| Position | Teacher |
| Qualification | Master's |
| Speciality | Mathematics |
| Email | yeldez.j.subhi@ntu.edu.iq |
| Phone | 07722370470 |
| Address | Kirkuk / Alwasite, Kirkuk, Iraq |
Publications
Impact to formula gradient impulse noise reduction from images
Jun 12, 2025 | Journal: Journal of Interdisciplinary Mathematics
Authors: Issam H. Halil, Yeldez J. Subhi, Basim A. Hassan
Volume 28 (2025), No. 4, pp. 1635–1642
Denoising is a prerequisite for the majority of image processing techniques and applications. The Taylor series is used to suggest a new conjugate gradient scalar. Together with the descent property, the novel formula satisfies the convergence properties. Lastly, we provide a few illustrations of picture restoration using the suggested conjugate gradient technique.
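As a rough illustration of where such a conjugate gradient scalar enters a denoising iteration (this is not the formula proposed in the paper), the sketch below runs a standard nonlinear conjugate gradient loop, with the classical Fletcher-Reeves scalar as a stand-in, on a simple smoothing objective for a noisy 1-D signal; the objective, parameters, and function names are illustrative assumptions.

```python
import numpy as np

# Rough sketch (not the paper's method): nonlinear conjugate gradient on a
# simple smoothing objective for a noisy 1-D signal,
#     F(x) = 0.5*||x - b||^2 + 0.5*lam*||D x||^2,
# where b is the noisy signal and D is a first-difference operator. The
# classical Fletcher-Reeves scalar stands in for the paper's new scalar.

def denoise_cg(b, lam=2.0, iters=50):
    n = b.size
    D = np.diff(np.eye(n), axis=0)                      # first-difference operator
    F = lambda z: 0.5 * np.sum((z - b) ** 2) + 0.5 * lam * np.sum((D @ z) ** 2)
    grad = lambda z: (z - b) + lam * D.T @ (D @ z)

    x = b.copy()
    g = grad(x)
    d = -g
    for _ in range(iters):
        if g @ d >= 0:                                   # safeguard: restart with steepest descent
            d = -g
        t = 1.0                                          # backtracking (Armijo) line search
        while F(x + t * d) > F(x) + 1e-4 * t * (g @ d):
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        beta = (g_new @ g_new) / (g @ g)                 # Fletcher-Reeves scalar
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# The denoised signal should end up closer to the clean one than the noisy input.
rng = np.random.default_rng(0)
clean = np.sin(np.linspace(0, 3 * np.pi, 200))
noisy = clean + 0.3 * rng.standard_normal(200)
print(np.linalg.norm(denoise_cg(noisy) - clean) < np.linalg.norm(noisy - clean))
```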
Solving single variable functions using a new secant method
Feb 13, 2025 | Journal: Journal of Interdisciplinary Mathematics
Authors: Hawraz N. Jabbar, Yeldez J. Subhi, Hakeem N. Hussein, Basim A. Hassan
ISSN: 0972-0502 (Print), 2169-012X (Online)
Volume 28 (2025), No. 1, pp. 245–251
The quadratically convergent Newton method is a fundamental and significant approach for solving one-variable functions. To solve a single-variable minimization problem, this study derives a novel secant-type approach based on estimating second-derivative information. The convergence of the novel secant-type iterative approach is of order
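For context, the sketch below shows the classical secant iteration for one-variable minimization: Newton's step with the second derivative replaced by a finite-difference estimate built from two previous derivative values. It is the textbook baseline, not the new secant-type formula derived in the paper, and the test function is an illustrative assumption.

```python
import math

# Classical secant iteration for minimising a one-variable function:
# apply the secant root-finding update to f'(x), i.e. Newton's step with
# f''(x) approximated from two previous derivative values. Textbook
# baseline only, not the paper's new formula.

def secant_minimize(df, x0, x1, tol=1e-8, max_iter=100):
    """Find a stationary point of f given its derivative df."""
    for _ in range(max_iter):
        d0, d1 = df(x0), df(x1)
        if abs(d1 - d0) < 1e-14:                 # secant estimate of f'' breaks down
            break
        x2 = x1 - d1 * (x1 - x0) / (d1 - d0)     # secant step on f'
        if abs(x2 - x1) < tol:
            return x2
        x0, x1 = x1, x2
    return x1

# Example: minimise f(x) = x**2 + 4*sin(x), whose derivative is 2*x + 4*cos(x).
print(secant_minimize(lambda x: 2 * x + 4 * math.cos(x), -2.0, -1.0))
```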
On new secant-method for minimum functions of one variable
Feb 13, 2025 | Journal: Journal of Interdisciplinary Mathematics
Authors: Ali M. Jasim, Yeldez J. Subhi, Basim A. Hassan
Volume 28 (2025), No. 1, pp. 291–296
In this article, we developed the Newton method by utilizing the Taylor series to estimate derivatives based on the function’s minimum value. The aim was to reduce the number of iterations required to obtain the optimal solution of the function. We compare the execution time and number of iterations between the proposed approach and the classical
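As a point of comparison only, the sketch below implements the classical Newton iteration for one-variable minimization with the derivatives approximated by finite differences and reports the iteration count, mirroring the kind of comparison mentioned in the abstract; the step sizes, tolerances, and test function are illustrative assumptions, not the paper's method.

```python
# Classical Newton iteration for one-variable minimisation, with f'(x) and
# f''(x) approximated by central finite differences (a simple stand-in for
# Taylor-series-based derivative estimates). Counts iterations to allow the
# kind of comparison described in the abstract. Illustrative only.

def newton_minimize(f, x, h=1e-5, tol=1e-8, max_iter=100):
    for k in range(1, max_iter + 1):
        d1 = (f(x + h) - f(x - h)) / (2 * h)              # ~ f'(x)
        d2 = (f(x + h) - 2 * f(x) + f(x - h)) / h ** 2    # ~ f''(x)
        step = d1 / d2
        x -= step
        if abs(step) < tol:
            return x, k
    return x, max_iter

# Example: minimise f(x) = (x - 3)**4 + (x - 3)**2, starting from x = 0.
x_star, iters = newton_minimize(lambda x: (x - 3) ** 4 + (x - 3) ** 2, 0.0)
print(x_star, iters)
```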
Enhancements Self-Scaling Quasi-Newton for Unconstrained Optimization
Sep 27, 2024 | Journal: Advances in Nonlinear Variational Inequalities
Authors: Basim A. Hassan, Hakeem N. Hussein, Yeldez J. Subhi, Yoksal A. Laylani, Hawraz N. Jabbar, Mohammed W. Taha
DOI: https://doi.org/10.52783/anvi.v27.974
Volume 27, No. 2 (2024)
A self-scaling factor for the quasi-Newton technique is derived by using a second-order Taylor expansion to achieve optimal computational performance. Following this, new updating formulas for the quasi-Newton method are introduced based on the newly derived self-scaling equation. The numerical results confirm this derivation and suggest that the new method could potentially rival the BFGS method in terms of performance.
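To illustrate where a self-scaling factor enters a quasi-Newton update (the paper's new scaling formula is not reproduced here), the sketch below applies the classical Oren-Luenberger scaling to the standard BFGS inverse-Hessian update; the line search, test problem, and tolerances are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of a self-scaled BFGS update for the inverse Hessian
# approximation H, using the classical Oren-Luenberger factor
#     tau = (s.T y) / (y.T H y)
# as a stand-in for the new self-scaling equation derived in the paper.

def self_scaling_bfgs(f, grad, x, iters=200, tol=1e-8):
    n = x.size
    H = np.eye(n)
    g = grad(x)
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g
        t = 1.0                                            # backtracking (Armijo) line search
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d):
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 1e-12:                                     # keep H positive definite
            tau = sy / (y @ H @ y)                         # self-scaling factor
            rho = 1.0 / sy
            V = np.eye(n) - rho * np.outer(s, y)
            H = tau * (V @ H @ V.T) + rho * np.outer(s, s) # scaled BFGS update
        x, g = x_new, g_new
    return x

# Example: minimise the 2-D Rosenbrock function from the usual starting point.
f = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
    200 * (x[1] - x[0] ** 2),
])
print(self_scaling_bfgs(f, grad, np.array([-1.2, 1.0])))
```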
Image Impulse Noise Reduction Using a Conjugate Gradient of Alternative Parameter
Jul 30, 2023 | Journal: European Journal of Pure and Applied Mathematics
Authors: Hawraz N. Jabbar, Yeldez J. Subhi, Basim A. Hassan
DOI: https://doi.org/10.29020/nybg.ejpam.v16i3.4849
Volume 16 (2023), No. 3, pp. 1624–1633
Conjugate gradient approaches emphasise the conjugate formula. This study creates a new conjugate coefficient for the conjugate gradient approach to restore pictures, using Perry's conjugacy condition and a quadratic model. The algorithms have global convergence and the descent property. In numerical testing, the new technique performed better, and the new conjugate gradient technique outperforms the FR method.
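For reference, the sketch below contrasts the Fletcher-Reeves (FR) scalar mentioned in the abstract with a Perry-style scalar obtained from one common statement of Perry's conjugacy condition, d_{k+1}^T y_k = -g_{k+1}^T s_k, and checks the descent condition g^T d < 0 on the resulting direction. Both formulas are textbook baselines, the toy vectors stand in for one iteration of an image-restoration solve, and the paper's own coefficient is not reproduced here.

```python
import numpy as np

# Illustrative only: two classical conjugate gradient scalars and the descent
# check g.T d < 0 on the resulting search direction. Textbook baselines, not
# the coefficient proposed in the paper.

def beta_fr(g_new, g_old):
    """Fletcher-Reeves scalar: ||g_{k+1}||^2 / ||g_k||^2."""
    return (g_new @ g_new) / (g_old @ g_old)

def beta_perry(g_new, d, s, y):
    """Perry-style scalar from the conjugacy condition
    d_{k+1}.T y_k = -g_{k+1}.T s_k (one common form)."""
    return (g_new @ (y - s)) / (d @ y)

def next_direction(g_new, d, beta):
    """Direction update d_{k+1} = -g_{k+1} + beta_k * d_k."""
    return -g_new + beta * d

# Toy vectors standing in for one iteration of an image-restoration solve.
rng = np.random.default_rng(1)
g_old = rng.standard_normal(5)
g_new = 0.5 * rng.standard_normal(5)
d = -g_old                          # previous direction
s = 0.1 * d                         # step s_k = x_{k+1} - x_k
y = g_new - g_old                   # gradient change y_k

for name, beta in [("FR", beta_fr(g_new, g_old)),
                   ("Perry", beta_perry(g_new, d, s, y))]:
    d_next = next_direction(g_new, d, beta)
    print(name, "descent:", g_new @ d_next < 0)
```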