# Ignore the models: they are very large and don't need to be pushed. They can be recreated through code on the user's end if necessary.
models/
# Ignore all job files and outputs (cross-validation, train-test prediction, and others). These are extra bloat that can be regenerated later.
cv-jobs/
cv-out/
train-test-predict-jobs/
# Ignore the original data. It can be transferred by a method other than GitHub; this also prevents restricted HCP information from being uploaded.
This project uses various linear and nonlinear machine learning algorithms to model cortical thickness as a function of anatomical and retinotopic properties in early visual cortex.
This section sets up the libraries used to import, manipulate, and test data with linear models from sklearn. It also defines a function, calculate_vif_, that will be used to find and remove predictor variables that are considered collinear; a VIF cutoff of 10 will be used for feature removal. Models will be created for each smoothing level (0, 2, 5, and 10 mm kernels). For all models, unsmoothed Sulc values will be used.
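The original `calculate_vif_` is not shown here, so the following is only a minimal sketch of a common implementation of VIF-based feature pruning, assuming `statsmodels` is available. It iteratively drops the predictor with the highest variance inflation factor until every remaining VIF falls below the cutoff (10, as described above); the column names in the usage check below are hypothetical.

```python
import numpy as np
import pandas as pd
from statsmodels.stats.outliers_influence import variance_inflation_factor


def calculate_vif_(X: pd.DataFrame, thresh: float = 10.0) -> pd.DataFrame:
    """Iteratively drop the predictor with the highest VIF until all VIFs < thresh.

    A sketch of one common approach; the project's actual function may differ.
    """
    X = X.copy()
    # Stop if only one predictor remains (VIF is undefined for a lone column).
    while X.shape[1] > 1:
        vifs = [variance_inflation_factor(X.values, i) for i in range(X.shape[1])]
        if max(vifs) < thresh:
            break
        # Drop the single worst offender, then recompute on the reduced set.
        worst = X.columns[int(np.argmax(vifs))]
        X = X.drop(columns=[worst])
    return X


if __name__ == "__main__":
    # Hypothetical demo: x3 is (nearly) a linear combination of x1 and x2,
    # so one of the three collinear columns should be removed.
    rng = np.random.default_rng(0)
    x1 = rng.normal(size=200)
    x2 = rng.normal(size=200)
    x3 = x1 + x2 + rng.normal(scale=0.01, size=200)
    X = pd.DataFrame({"x1": x1, "x2": x2, "x3": x3})
    reduced = calculate_vif_(X)
    print(reduced.columns.tolist())
```

Recomputing the VIFs after each single-column drop matters: removing one collinear predictor can bring the VIFs of the remaining predictors back under the cutoff, so dropping all high-VIF columns in one pass would discard more features than necessary.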