Out-of-fold predictions for unsupervised models? #1025
@salbalkus Yes, I currently think your suggestion to add some kind of support makes sense. As you say, this is closely related to #575, which could be dealt with at the same time. Also related is JuliaAI/MLJBase.jl#660.

In the meantime, you might be able to implement what you want using a learning network. See this tutorial for the general idea. (Note that the way learning networks are exported as stand-alone models has changed (…).)
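A minimal sketch of the learning-network workaround, assuming current MLJ node syntax and using `Standardizer` as a stand-in for the custom unsupervised model (not the poster's actual model):

```julia
using MLJ  # assumes MLJ is installed

# Wrap the training data in a source node, then build the network.
X = source(MLJ.table(rand(100, 3)))

model = Standardizer()   # stand-in for the custom unsupervised model
mach = machine(model, X)
W = transform(mach, X)   # a node representing the transformed data

fit!(W)  # trains every machine in the network
W()      # calling the node materializes the transformed data
```

Because nodes are lazy, `W(Xnew)` can also be called on fresh data, which is what makes per-fold fitting and out-of-fold transforming expressible this way.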
I have written a custom unsupervised learning model that implements `fit` and `transform`. I'd like to use MLJ to perform resampling and obtain out-of-fold output for this model. For example, given a 5-fold cross-validation, I would like to `fit` the model on four of the folds and obtain the output of `transform` on the left-out fold, then repeat this until I've obtained "out-of-fold predictions" for each fold.

Is there (or could there be) an easy way to do this using MLJ without having to write my own function to extract folds? As mentioned in #575, `evaluate!` with a custom measure that simply returns the prediction can accomplish this for supervised models. However, this does not appear to work for unsupervised models, since `evaluate!` seems to require a `predict` method and a `y` value.