Prevent bug when the number of patches is 1

#6

Hello, I am Sungyoon from South Korea.

First of all, thank you for sharing the great MIL model, TITAN. We are very interested in TITAN and would like to deploy it to evaluate its performance in our environment.

While using the code, we encountered a minor issue:
`bg_mask` is a `torch.Tensor`, and when a slide contains only a single patch its shape is (1, 1, 1), which raises an error. Converting it to a NumPy array resolves the error.
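A minimal sketch of the proposed fix, assuming `bg_mask` is the boolean background mask described above (the variable setup here is hypothetical, for illustration only):

```python
import torch

# Hypothetical reproduction: with a single patch, bg_mask has shape (1, 1, 1).
bg_mask = torch.zeros((1, 1, 1), dtype=torch.bool)

# Proposed change (sketch): convert the tensor to a NumPy array before the
# downstream code that expects ndarray semantics.
bg_mask_np = bg_mask.cpu().numpy()

print(type(bg_mask_np).__name__, bg_mask_np.shape)
```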


Could you please take a look at this issue and consider merging this PR?

Thank you very much.

Sincerely,
Sungyoon Kim

