
Metabolic Syndrome Prediction | 4. ML-based Feature Importance #89

Closed
Arihant-Bhandari opened this issue May 16, 2024 · 2 comments

Comments

@Arihant-Bhandari
Contributor

Step 4: ML-based feature importance using the following models (see the sketch after the list):

  1. XGBoost (eXtreme Gradient Boosting model)
  2. Random Forest
  3. Decision Tree
  4. LightGBM (Light Gradient Boosting Machine)
  5. CatBoost (Categorical Boosting model)
  6. Extra Trees
  7. AdaBoost (Adaptive Boosting)
  8. Gradient Boosting model
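
A minimal sketch of the intended workflow, assuming each of these models exposes `feature_importances_` after fitting; the dataset here is a synthetic placeholder, not the actual Metabolic Syndrome data, and the exact notebook code in the PR may differ:

```python
# Rough sketch only (not the PR code): fit each tree-based model and collect its
# built-in feature_importances_. Synthetic placeholder data stands in for the
# real Metabolic Syndrome features.
import pandas as pd
from sklearn.datasets import make_classification
from sklearn.ensemble import (
    AdaBoostClassifier,
    ExtraTreesClassifier,
    GradientBoostingClassifier,
    RandomForestClassifier,
)
from sklearn.tree import DecisionTreeClassifier
from xgboost import XGBClassifier
from lightgbm import LGBMClassifier
from catboost import CatBoostClassifier

# Placeholder feature matrix / target (assumption for the sketch).
X_arr, y = make_classification(n_samples=500, n_features=10, random_state=42)
X = pd.DataFrame(X_arr, columns=[f"feature_{i}" for i in range(10)])

models = {
    "XGBoost": XGBClassifier(random_state=42),
    "Random Forest": RandomForestClassifier(random_state=42),
    "Decision Tree": DecisionTreeClassifier(random_state=42),
    "LightGBM": LGBMClassifier(random_state=42),
    "CatBoost": CatBoostClassifier(random_state=42, verbose=0),
    "Extra Trees": ExtraTreesClassifier(random_state=42),
    "AdaBoost": AdaBoostClassifier(random_state=42),
    "Gradient Boosting": GradientBoostingClassifier(random_state=42),
}

# One column of importances per model, rows indexed by feature name.
importances = {}
for name, model in models.items():
    model.fit(X, y)
    importances[name] = pd.Series(model.feature_importances_, index=X.columns)

importance_df = pd.DataFrame(importances)
print(importance_df.sort_values("Random Forest", ascending=False))
```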

Congratulations, @Arihant-Bhandari! 🎉 Thank you for creating your issue. Your contribution is greatly appreciated and we look forward to working with you to resolve the issue. Keep up the great work!

We will promptly review your changes and offer feedback. Keep up the excellent work! Kindly remember to check our contributing guidelines.

@Arihant-Bhandari
Contributor Author

Arihant-Bhandari commented May 16, 2024

@SrijanShovit Hi, sorry to disturb you. I figured you were busy since you didn't create an issue for Step 4, so I opened this one to link a PR against it; please mark it with the appropriate labels. I have worked on the models listed above as is, and I am sending a PR in which you can review the work done. If anything needs to be added or redone, I will be more than glad to do so.

Thank you for your time and patience.

Additionally, I have added a scikit-learn permutation-importance test based on HistGradientBoostingClassifier, as an addition to the feature-importance work we covered in Step 3.
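
A minimal sketch of that idea, assuming scikit-learn's `HistGradientBoostingClassifier` and `permutation_importance` with placeholder data (the actual PR code may differ):

```python
# Sketch only: fit HistGradientBoostingClassifier, then estimate feature importance
# by permutation on held-out data. Placeholder data stands in for the real features.
import pandas as pd
from sklearn.datasets import make_classification
from sklearn.ensemble import HistGradientBoostingClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Placeholder feature matrix / target (assumption for the sketch).
X_arr, y = make_classification(n_samples=500, n_features=10, random_state=42)
X = pd.DataFrame(X_arr, columns=[f"feature_{i}" for i in range(10)])
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

clf = HistGradientBoostingClassifier(random_state=42).fit(X_train, y_train)

# Shuffle one feature at a time on the test split and measure the score drop.
result = permutation_importance(clf, X_test, y_test, n_repeats=10, random_state=42)
perm_importance = pd.Series(result.importances_mean, index=X.columns)
print(perm_importance.sort_values(ascending=False))
```

Unlike the impurity-based `feature_importances_` used above, permutation importance is model-agnostic and is computed on held-out data, so it reflects how much each feature actually contributes to generalization.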
