
Browse by Author "Genc, Murat"

Now showing 1 - 5 of 5
  • Item
    A new double-regularized regression using Liu and lasso regularization
    (Springer Heidelberg, 2022) Genc, Murat
    This paper discusses a new estimator that performs simultaneous parameter estimation and variable selection within the scope of penalized regression methods. The estimator is an extension of the Liu estimator with l1-norm penalization. We give the coordinate descent algorithm to efficiently estimate the coefficient vector of the proposed estimator. We also examine the consistency properties of the estimator. We conduct simulation studies and two real data analyses to compare the proposed estimator with several estimators including the ridge, Liu, lasso and elastic net. The simulation studies and real data analyses show that besides performing automatic variable selection, the new estimator has considerable prediction performance with a small mean squared error under sparse and non-sparse data structures.
  • Item
    An Enhanced Extreme Learning Machine Based on Square-Root Lasso Method
    (Springer, 2024) Genc, Murat
    Extreme learning machine (ELM) is one of the most notable machine learning algorithms, with many advantages, especially its training speed. However, ELM has some drawbacks, such as instability, poor generalizability and overfitting in the case of multicollinearity in the linear model. This paper introduces square-root lasso ELM (SQRTL-ELM) as a novel regularized ELM algorithm to deal with these drawbacks of ELM. A modified version of the alternating minimization algorithm is used to obtain the estimates of the proposed method. Various techniques are presented to determine the tuning parameter of SQRTL-ELM. The method is compared with the basic ELM, RIDGE-ELM, LASSO-ELM and ENET-ELM on six benchmark data sets. Performance evaluation results show that SQRTL-ELM exhibits satisfactory performance in terms of testing root mean squared error on the benchmark data sets, at the cost of slightly more computation time. The superiority level of the method depends on the tuning parameter selection technique. As a result, the proposed method can be considered a powerful alternative to avoid performance loss in regression problems.
  • Item
    Lasso regression under stochastic restrictions in linear regression: An application to genomic data
    (Taylor & Francis Inc, 2024) Genc, Murat; Ozkale, M. Revan
    Variable selection approaches are often employed in high-dimensionality and multicollinearity problems. Since the lasso selects variables by shrinking the coefficients, it has extensive use in many fields. On the other hand, we may sometimes have extra information on the model. In this case, the extra information should be considered in the estimation procedure. In this paper, we propose a stochastic restricted lasso estimator in the linear regression model which uses the extra information as stochastic linear restrictions. The estimator is a generalization of the mixed estimator with L1-type penalization. We give the coordinate descent algorithm to estimate the coefficient vector of the proposed method and strong rules for the coordinate descent algorithm to discard variables from the model. Also, we propose a method to estimate the tuning parameter. We conduct two real data analyses and simulation studies to compare the new estimator with several estimators including the ridge, lasso and stochastic restricted ridge. The real data analyses and simulation studies show that the new estimator enjoys the automatic variable selection property of the lasso while outperforming standard methods, achieving lower test mean squared error.
  • Item
    Regularization and variable selection with triple shrinkage in linear regression: a generalization of lasso
    (Taylor & Francis Inc, 2024) Genc, Murat; Ozkale, M. Revan
    We propose a new shrinkage and variable selection method in linear regression, which is based on triple shrinkage of the regression coefficients. The new estimation method contains the ridge, lasso and elastic net as special cases. The term based on the shrunken estimator in the new method can provide estimates with a smaller length, depending on the size of a new tuning parameter, compared to the elastic net, while maintaining the variable selection feature in the case of multicollinearity. The new estimator has a grouping effect property similar to that of the elastic net. The well-known coordinate descent algorithm is used to efficiently compute the coefficient path of the new estimator. We conduct real data analysis and simulation studies to compare the new estimator with several methods including the lasso and elastic net.
  • Item
    Weighted LAD-Liu-LASSO for robust estimation and sparsity
    (Springer Heidelberg, 2025) Genc, Murat; Lukman, Adewale
    The Least Absolute Shrinkage and Selection Operator (LASSO) is widely used for parameter estimation and variable selection but can encounter challenges with outliers and heavy-tailed error distributions. Integrating variable selection methods such as LASSO with Weighted Least Absolute Deviation (WLAD) has been explored in limited studies to handle these problems. In this study, we propose the integration of Weighted Least Absolute Deviation with Liu-LASSO to handle variable selection, parameter estimation, and heavy-tailed error distributions, owing to the advantages of the Liu-LASSO approach over traditional LASSO methods. This approach is demonstrated through a simple simulation study and a real-world application. Our findings showcase the superiority of our method over existing techniques while maintaining asymptotic efficiency comparable to the unpenalized LAD estimator.
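Several of the abstracts above mention coordinate descent for l1-penalized estimators. As a hedged illustration of the underlying technique, here is a minimal cyclic coordinate descent for the plain lasso — a generic sketch, not the Liu-lasso algorithm from the first paper; the (1/2n) objective scaling and iteration count are assumptions:

```python
import numpy as np

def soft_threshold(z, t):
    # Soft-thresholding operator S(z, t) = sign(z) * max(|z| - t, 0)
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    # Cyclic coordinate descent for the plain lasso:
    #   minimize (1/2n) ||y - X b||^2 + lam * ||b||_1
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n   # per-coordinate curvature
    r = y - X @ b                        # current residual
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * b[j]          # remove coordinate j from the fit
            rho = X[:, j] @ r / n        # partial correlation with residual
            b[j] = soft_threshold(rho, lam) / col_sq[j]
            r -= X[:, j] * b[j]          # add the updated coordinate back
    return b
```

Each coordinate update has a closed form, which is why these papers can extend the same loop to their penalties with little extra cost.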
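The SQRTL-ELM abstract builds on the basic extreme learning machine: a fixed random hidden layer followed by a regularized linear readout. The sketch below shows the ridge-readout variant (the RIDGE-ELM baseline the abstract compares against), not the paper's square-root lasso solver; the tanh activation, hidden-layer scale, and regularization strength are assumptions:

```python
import numpy as np

def elm_ridge_fit(X, y, n_hidden=50, alpha=1e-3, scale=2.0, seed=0):
    # Basic ELM: a random hidden layer that is never trained, then a
    # ridge-regularized linear readout solved in closed form.
    # SQRTL-ELM would replace this ridge step with a square-root lasso.
    rng = np.random.default_rng(seed)
    W = scale * rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)                        # random hidden features
    beta = np.linalg.solve(H.T @ H + alpha * np.eye(n_hidden), H.T @ y)
    return W, b, beta

def elm_predict(model, X):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta
```

Only the readout is fitted, which is where the training-speed advantage cited in the abstract comes from; the choice of readout penalty is what distinguishes the ELM variants being compared.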
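The stochastic restricted lasso paper starts from the classical mixed estimator, which treats stochastic restrictions r = Rb + e as extra observations. A minimal sketch of that starting point — the homoscedastic, unweighted case only, without the paper's l1 penalty:

```python
import numpy as np

def mixed_estimator(X, y, R, r):
    # Mixed estimation in its simplest form: stack the stochastic
    # restrictions r = R b + e under the data as extra rows and solve
    # the augmented least squares problem. The paper's estimator adds
    # an l1 penalty on top of this augmented system.
    X_aug = np.vstack([X, R])
    y_aug = np.concatenate([y, r])
    b, *_ = np.linalg.lstsq(X_aug, y_aug, rcond=None)
    return b
```

The appeal of this formulation is that prior information enters through ordinary rows of the design matrix, so penalized extensions can reuse standard solvers on the augmented system.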
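Finally, the weighted LAD-Liu-LASSO paper rests on least absolute deviation regression for robustness to outliers. One common way to approximate an LAD fit, shown here as an unpenalized, unweighted sketch (the iteration count and epsilon floor are assumptions), is iteratively reweighted least squares:

```python
import numpy as np

def lad_irls(X, y, n_iter=50, eps=1e-4):
    # LAD regression via iteratively reweighted least squares: each pass
    # solves a weighted LS problem with weights 1/|residual|, which
    # approximates the l1 loss. The paper builds weighted LAD plus
    # Liu-lasso penalties on top of this robust baseline.
    b = np.linalg.lstsq(X, y, rcond=None)[0]     # least squares start
    for _ in range(n_iter):
        r = y - X @ b
        w = 1.0 / np.maximum(np.abs(r), eps)     # downweight large residuals
        Xw = X * w[:, None]
        b = np.linalg.solve(X.T @ Xw, Xw.T @ y)
    return b
```

Points with large residuals get small weights, so a handful of gross outliers barely influence the fit — the robustness property the abstract's heavy-tailed setting calls for.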

| Tarsus Üniversitesi | Library | Guide | OAI-PMH |

This site is protected by the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.


Tarsus Üniversitesi, Mersin, TÜRKİYE
If you see any errors in the content, please report them to us

Powered by İdeal DSpace

DSpace software copyright © 2002-2025 LYRASIS
