First of all, I really appreciate your clean and well-documented code.
I have trained the ESANet model on the SUNRGBD and SceneNet datasets. For SUNRGBD, running prepare_dataset.py in the src folder generated the refined depth data, and with the depth mode set to 'refined', training used this refined depth. The SceneNet depth files I downloaded were already refined as well (a sketch of what I mean by the depth mode is below). My question: do the inference times you report in Fig. 4 and Fig. 5 of the paper include the data pre-processing time (turning the raw depth into refined depth)? And without refining the depth data, is real-time application still feasible?
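For clarity, here is roughly what I mean by the depth mode switch. The file names and helper below are my own illustration of how I understand the loading step, not the repository's exact API:

```python
import os
import cv2
import numpy as np

def load_depth(sample_dir, depth_mode='refined'):
    # Hypothetical file layout: prepare_dataset.py writes the refined depth
    # map next to the raw one; depth_mode only selects which file is read.
    fname = 'depth.png' if depth_mode == 'refined' else 'depth_raw.png'
    depth = cv2.imread(os.path.join(sample_dir, fname), cv2.IMREAD_UNCHANGED)
    return depth.astype(np.float32)  # 16-bit depth PNG -> float for the network
```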
Thank you for reading my issue.
Yes, we trained on refined depth data, as is common practice. Still, you can take the model and apply it to "raw depth" samples. Alternatively, you can also train on raw depth data, which results in only slightly lower mIoU. In practice, I don't think you will notice the difference.
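If you want to avoid the offline refinement entirely at inference time, a cheap per-frame alternative is to fill the missing measurements in the raw depth with the nearest valid pixel. A minimal sketch of such a hole filling (this is not the refinement procedure used to build the training data, just a fast fallback):

```python
import numpy as np
from scipy import ndimage

def fill_depth_holes(depth):
    """Replace zero (missing) depth values with the nearest valid measurement.

    Expects a float32 depth map where 0 marks invalid pixels. This is a
    simple nearest-neighbor fill, much cheaper than a full refinement.
    """
    mask = depth == 0
    if not mask.any():
        return depth
    # For every hole pixel, get the indices of the nearest valid pixel
    idx = ndimage.distance_transform_edt(
        mask, return_distances=False, return_indices=True)
    return depth[tuple(idx)]
```

Since this is a single distance transform per frame, it should not be the bottleneck compared to the network's forward pass.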