Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distiller
Penalized least squares estimation using the Orthogonalizing EM (OEM) algorithm
Machine learning time series regressions
Block coordinate descent for group lasso
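For orientation, here is a minimal sketch of the block coordinate descent update that repositories like this one implement for the group lasso. It is illustrative rather than code from any listed project, and it assumes each group's columns have been orthonormalized so that every per-group subproblem has a closed-form group soft-thresholding solution; the names `group_lasso_bcd`, `groups`, and `lam` are hypothetical.

```python
# Illustrative sketch: block coordinate descent for the group lasso objective
#   0.5 * ||y - X @ beta||^2 + lam * sum_g ||beta[g]||_2,
# assuming each group's columns are orthonormal (X[:, g].T @ X[:, g] = I).
import numpy as np

def group_lasso_bcd(X, y, groups, lam, n_iter=100):
    """X: (n, p) design, y: (n,) response, groups: list of column-index arrays."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        for g in groups:
            # partial residual with group g's current contribution removed
            r = y - X @ beta + X[:, g] @ beta[g]
            z = X[:, g].T @ r  # unpenalized group solution (orthonormal case)
            norm_z = np.linalg.norm(z)
            # group soft-thresholding: shrink the whole block toward zero,
            # setting it exactly to zero when lam exceeds the block's norm
            beta[g] = max(0.0, 1.0 - lam / norm_z) * z if norm_z > 0 else 0.0
    return beta
```

Without groupwise orthonormalization, each block update requires an inner iterative solve rather than a single closed-form shrinkage.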
Programming assignments (course homework) for Optimization Methods (Convex Optimization), 2023 Fall, Peking University, taught by Zaiwen Wen (PKU WenZW)
Regularization paths of linear, logistic, Poisson, or Cox models with overlapping grouped covariates
A variable selection procedure for settings with redundancy among explanatory variables, a situation common in high-dimensional data
Molecular-property prediction with sparsity
R package: Adaptively weighted group lasso for semiparametric quantile regression models
HIWT-GSC: Homotopy Iterative Weighted Thresholding algorithm for solving Group Sparsity Constrained optimization problems
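As a point of reference for group-sparsity-constrained problems like the one this repository targets, the sketch below shows plain group iterative hard thresholding, i.e. projected gradient descent onto vectors supported on at most k groups. It is a simplified baseline, not the HIWT-GSC homotopy/weighted scheme itself, and the names `group_iht`, `group_hard_threshold`, and `k` are illustrative.

```python
# Simplified baseline: group iterative hard thresholding for
#   min 0.5 * ||y - X @ beta||^2  subject to  at most k nonzero groups.
# This is NOT the HIWT-GSC algorithm, just a projected-gradient sketch.
import numpy as np

def group_hard_threshold(beta, groups, k):
    """Project beta onto vectors supported on the k groups with largest norms."""
    norms = np.array([np.linalg.norm(beta[g]) for g in groups])
    out = np.zeros_like(beta)
    for i in np.argsort(norms)[-k:]:  # indices of the k largest-norm groups
        g = groups[i]
        out[g] = beta[g]
    return out

def group_iht(X, y, groups, k, n_iter=200):
    """Projected gradient descent with step size 1 / ||X||_2^2."""
    step = 1.0 / (np.linalg.norm(X, 2) ** 2)  # 1 / Lipschitz constant of the gradient
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y)
        beta = group_hard_threshold(beta - step * grad, groups, k)
    return beta
```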
In this study we explored various approaches for handling high-dimensional data and compared them on simulated and soil datasets. We found that grouping had a significant impact on model accuracy and error reduction. For the core projection step, we first looked at the properties of all the algorithms and how they function to com…
This is a development version of DMRnet — Delete or Merge Regressors Algorithms for Linear and Logistic Model Selection and High-Dimensional Data.
My project for STATS-608A in Fall 2018 at the University of Michigan
R package : Group lasso based selection for high-dimensional mediation analysis