This repository has been archived by the owner on Mar 21, 2024. It is now read-only.

Is it possible to use InnerEye fully on a local GPU machine (without Azure ML) #842

Answered by peterhessey
furtheraway asked this question in Q&A


Hi @furtheraway, thanks for opening this thread! The short answer to your question is yes, absolutely 😄 All of InnerEye is designed to work locally in exactly the same way it works in AzureML.

When you complete your training locally, the trained model ensemble and all the files you need for running inference are saved to the final_model/ folder. In there you will find score.py, the entry point for running inference on a single image; it is the same script used by the submit_for_inference.py script provided in the InnerEye-DeepLearning repository.
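For orientation, the folder layout looks roughly like the sketch below. Only score.py is named above; the other entries are assumptions based on a typical InnerEye training run and may differ for your model:

final_model/
├── score.py          <- entry point for single-image inference
├── environment.yml   <- Python environment used for scoring (assumed)
└── checkpoints/      <- trained weights for each model in the ensemble (assumed)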

cd-ing into the final_model/ folder and running the following command will give you the functionality you're looking for:

score.py --data_folder <path_t…
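As a minimal sketch, a full invocation might look like the lines below. Only --data_folder comes from the answer above; the example path is a placeholder, and any further arguments your model needs (GPU selection, image file names, etc.) should be taken from score.py's own help output rather than from this sketch:

cd final_model/
# Point --data_folder at the directory holding the image(s) you want to segment.
# Run `python score.py --help` to see the remaining arguments for your model.
python score.py --data_folder /path/to/test_data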

Answer selected by furtheraway