Implements cross-validated smooth adaptive lasso regularization for structural equation models. The penalty function is given by: $$p(x_j) = \frac{1}{w_j}\lambda\sqrt{x_j^2 + \epsilon}$$
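For illustration only, a minimal R sketch of this penalty for a single parameter (the helper function below is hypothetical and not part of lessSEM):
# Hypothetical helper: evaluates the smooth adaptive lasso penalty for a
# single parameter value x_j with weight w_j.
# sqrt(x_j^2 + epsilon) is a differentiable approximation of |x_j|.
smoothAdaptiveLassoPenaltyValue <- function(x_j, w_j, lambda, epsilon) {
  (1 / w_j) * lambda * sqrt(x_j^2 + epsilon)
}
# for small epsilon, the penalty is close to lambda * |x_j| / w_j:
smoothAdaptiveLassoPenaltyValue(x_j = .3, w_j = 2, lambda = .1, epsilon = 1e-8)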
Usage
cvSmoothAdaptiveLasso(
lavaanModel,
regularized,
weights = NULL,
lambdas,
epsilon,
k = 5,
standardize = FALSE,
returnSubsetParameters = FALSE,
modifyModel = lessSEM::modifyModel(),
control = lessSEM::controlBFGS()
)
Arguments
- lavaanModel
model of class lavaan
- regularized
vector with names of parameters which are to be regularized. If you are unsure what these parameters are called, use getLavaanParameters(model) with your lavaan model object
- weights
labeled vector with weights for each of the parameters in the model. If you are unsure what these parameters are called, use getLavaanParameters(model) with your lavaan model object. If set to NULL, the default weights will be used: the inverse of the absolute values of the unregularized parameter estimates. A sketch showing how custom weights can be constructed is given below the argument list.
- lambdas
numeric vector: values for the tuning parameter lambda
- epsilon
epsilon > 0; controls the smoothness of the approximation. Larger values result in a smoother approximation.
- k
the number of cross-validation folds. Alternatively, you can pass a matrix of booleans (TRUE, FALSE) that indicates for each person which subset they belong to. See ?lessSEM::createSubsets for an example of what this matrix should look like; a manual construction of such a matrix is sketched below the argument list.
- standardize
Standardizing your data prior to the analysis can undermine the cross-validation, because information from the test folds then leaks into the training folds. Set standardize = TRUE to automatically standardize the data instead.
- returnSubsetParameters
set to TRUE to return the parameters for each training set
- modifyModel
used to modify the lavaanModel. See ?modifyModel.
- control
used to control the optimizer. This element is generated with the controlBFGS function. See ?controlBFGS for more details.
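To make the weights and k arguments more concrete, the following sketch shows one way to construct custom weights and a manual cross-validation subset matrix. Object names are illustrative; lavaanModel and dataset refer to the objects created in the Examples below.
# custom adaptive lasso weights: the inverse of the absolute values of the
# unregularized parameter estimates (this mirrors the documented default)
unregularizedEstimates <- getLavaanParameters(lavaanModel)
customWeights <- 1 / abs(unregularizedEstimates)
# a manual subset matrix for k: one row per person, one column per fold;
# TRUE marks the fold a person belongs to
N <- nrow(dataset)
fold <- sample(rep(1:5, length.out = N))
subsets <- sapply(1:5, function(j) fold == j)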
Details
As in regsem, models are specified using lavaan. Currently,
most standard SEMs are supported. lessSEM also provides full information
maximum likelihood for missing data. To use this functionality,
fit your lavaan model with sem(..., missing = 'ml').
lessSEM will then automatically switch to full information maximum likelihood
as well.
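For example (sketch; lavaanSyntax and dataset refer to the objects created in the Examples below, and the data are assumed to contain missing values):
# fit the lavaan model with full information maximum likelihood so that
# lessSEM also uses full information maximum likelihood:
lavaanModelFiml <- lavaan::sem(lavaanSyntax,
                               data = dataset,
                               meanstructure = TRUE,
                               std.lv = TRUE,
                               missing = 'ml')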
Adaptive lasso regularization:
Zou, H. (2006). The adaptive lasso and its oracle properties. Journal of the American Statistical Association, 101(476), 1418–1429. https://doi.org/10.1198/016214506000000735
Regularized SEM:
Huang, P.-H., Chen, H., & Weng, L.-J. (2017). A Penalized Likelihood Method for Structural Equation Modeling. Psychometrika, 82(2), 329–354. https://doi.org/10.1007/s11336-017-9566-9
Jacobucci, R., Grimm, K. J., & McArdle, J. J. (2016). Regularized Structural Equation Modeling. Structural Equation Modeling: A Multidisciplinary Journal, 23(4), 555–566. https://doi.org/10.1080/10705511.2016.1154793
Examples
library(lessSEM)
# Like regsem, lessSEM builds on the lavaan
# package for model specification. The first step
# therefore is to specify the model in lavaan.
dataset <- simulateExampleData()
lavaanSyntax <- "
f =~ l1*y1 + l2*y2 + l3*y3 + l4*y4 + l5*y5 +
l6*y6 + l7*y7 + l8*y8 + l9*y9 + l10*y10 +
l11*y11 + l12*y12 + l13*y13 + l14*y14 + l15*y15
f ~~ 1*f
"
lavaanModel <- lavaan::sem(lavaanSyntax,
data = dataset,
meanstructure = TRUE,
std.lv = TRUE)
# Regularization:
lsem <- cvSmoothAdaptiveLasso(
# pass the fitted lavaan model
lavaanModel = lavaanModel,
# names of the regularized parameters:
regularized = paste0("l", 6:15),
lambdas = seq(0,1,.1),
epsilon = 1e-8)
# use the plot-function to plot the cross-validation fit
plot(lsem)
# the coefficients can be accessed with:
coef(lsem)
# elements of lsem can be accessed with the @ operator:
lsem@parameters
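# As a further illustration, the call below is a sketch that uses only
# arguments documented above: user-supplied weights (mirroring the
# documented default) and standardization within the cross-validation.
# custom weights: inverse of the absolute unregularized estimates
# (in practice, exactly-zero estimates would need special handling)
weights <- 1 / abs(getLavaanParameters(lavaanModel))
lsemStd <- cvSmoothAdaptiveLasso(
  lavaanModel = lavaanModel,
  regularized = paste0("l", 6:15),
  weights = weights,
  lambdas = seq(0, 1, .1),
  epsilon = 1e-8,
  k = 5,
  standardize = TRUE)
# cross-validation fit of the standardized analysis:
plot(lsemStd)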