
confusezius / unet-lits-2d-pipeline

Liver Lesion Segmentation with 2D Unets

Languages: Python 99.94%, Shell 0.06%
Topics: deep-learning, medical-image-segmentation, unet-pytorch, lesion-segmentation, liver-segmentation, unet, semantic-segmentation

unet-lits-2d-pipeline's Issues

GPU use

I want to run the program on two GPUs. I set os.environ['CUDA_LAUNCH_BLOCKING'] = "0, 1" in Train_Networks, but it still doesn't work. Could you tell me how to do this?
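
For context, CUDA_LAUNCH_BLOCKING only toggles synchronous kernel launches; in plain PyTorch, GPU selection is usually done with CUDA_VISIBLE_DEVICES plus nn.DataParallel. Below is a minimal sketch of that general pattern, not specific to this repository; the model is just a placeholder.

    # Illustrative sketch only: make two GPUs visible and split batches across them.
    import os
    os.environ['CUDA_VISIBLE_DEVICES'] = '0,1'   # set before any CUDA initialisation

    import torch
    import torch.nn as nn

    model = nn.Conv2d(1, 1, 3)                   # placeholder for the actual network
    if torch.cuda.device_count() > 1:
        model = nn.DataParallel(model)           # replicates the model on all visible GPUs
    model = model.cuda()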

Time limit exceeded on LiTS challenge

Hello,
yesterday I made a submission on CodaLab, but after it had been running for a couple of hours I got a strange "time limit exceeded" error. Do you have any idea what causes this? Thank you.

Boundary_weightmap

Thank you for sharing your work. I have some questions:

  1. Regarding these lines:

         boundary_weightmap_lesion = 0.75*np.clip(np.exp(-0.02*(ndi.morphology.distance_transform_edt(1-total_border))),weight,None)+0.25*lesion_mask
         boundary_weightmap_liver = 0.65*boundary_weightmap_liver + 0.35*boundary_weightmap_lesion

     (1) Does the boundary_weightmap formula come from weight_map = wc + w0 * np.exp(-1/2 * ((d1+d2)/sigma) ** 2)?
     (2) How were the coefficients 0.75/0.25 and 0.65/0.35 chosen?
     (3) How is this boundary_weightmap passed to the network, and to which part of the network? (See the sketch below.)

  2. Regarding:

         if opt.Training['data']=='lesion': vol_info['ref_mask_info'] = pd.read_csv(opt.Paths['Training_Path']+'/Assign_2D_LiverMasks.csv', header=0)

     Q: What does this code mean? Why is Assign_2D_LiverMasks.csv assigned to ref_mask_info when 'data' is 'lesion'?

Looking forward to your answer, thank you very much!
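
For reference, a common way such a per-pixel weight map is used in general (a minimal sketch of the technique, not necessarily this repository's exact code; all names and coefficients below are illustrative) is to precompute it alongside the labels and multiply it into an unreduced cross-entropy loss, so it affects the loss rather than the network layers:

    # Illustrative sketch only: build a distance-to-border weight map and apply
    # it as a per-pixel factor on the segmentation loss.
    import numpy as np
    import scipy.ndimage as ndi
    import torch.nn.functional as F

    def boundary_weightmap(mask, min_weight=0.1, decay=0.02):
        mask   = mask.astype(bool)
        border = mask ^ ndi.binary_erosion(mask)          # one-pixel border of the mask
        dist   = ndi.distance_transform_edt(1 - border)   # distance of every pixel to that border
        wmap   = np.clip(np.exp(-decay * dist), min_weight, None)
        return 0.75 * wmap + 0.25 * mask                  # extra weight on foreground pixels

    def weighted_ce(logits, target, weightmap):
        # logits: (B, C, H, W), target: (B, H, W), weightmap: torch tensor of shape (B, H, W)
        per_pixel = F.cross_entropy(logits, target, reduction='none')
        return (per_pixel * weightmap).mean()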

pretrained weight

Thank you very much for sharing! I have a simple task that uses the segmented liver for post-processing on my own datasets. Could you provide the weights pretrained on this dataset? Thanks again!

train with self-made dataset

Thanks for sharing this great work. I am a student working on a similar project and still learning the techniques. I wonder how I could adjust the data preparation code to use a self-made dataset, with binary masks stored as PNG files and sliced volumes stored as MAT files, instead of the LiTS database. Thanks in advance for your help.
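
For what it is worth, reading such a self-made dataset into numpy arrays usually only needs scipy.io.loadmat for the MAT slices and PIL for the PNG masks. Below is a minimal sketch under the assumption that each MAT file holds one slice under a key called 'slice'; the key name, folder layout, and output format are all hypothetical:

    # Illustrative sketch only: convert .mat volume slices and .png binary masks
    # into numpy arrays; the 'slice' key and the folder layout are assumptions.
    import glob
    import numpy as np
    from PIL import Image
    from scipy.io import loadmat

    mat_files = sorted(glob.glob('volumes/*.mat'))
    png_files = sorted(glob.glob('masks/*.png'))

    for mat_path, png_path in zip(mat_files, png_files):
        vol_slice = loadmat(mat_path)['slice'].astype(np.float32)      # assumed key name
        mask      = (np.array(Image.open(png_path)) > 0).astype(np.uint8)
        np.save(mat_path.replace('.mat', '.npy'), vol_slice)
        np.save(png_path.replace('.png', '_mask.npy'), mask)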

I got some error when training

When running

python Train_Networks/Train_Networks.py --search_setup Small_UNet_Liver.txt --no_date

I got this error and don't know how to fix it.

Setup Iteration... :   0%|                                                                    | 0/1 [00:00<?, ?it/s]
Traceback (most recent call last):
  File "Train_Networks/Train_Networks.py", line 232, in <module>
    main(training_setup)
  File "Train_Networks/Train_Networks.py", line 114, in main
    train_dataset, val_dataset = Data.Generate_Required_Datasets(opt)
  File "/home/cc/jupyter/unet-lits-2d-pipeline/Train_Networks/../Utilities/PyTorch_Datasets.py", line 23, in Generate_Required_Datasets
    available_volumes = sorted(list(set(np.array(vol_info['volume_slice_info']['Volume']))), key=lambda x: int(x.split('-')[-1]))
  File "/home/cc/jupyter/unet-lits-2d-pipeline/Train_Networks/../Utilities/PyTorch_Datasets.py", line 23, in <lambda>
    available_volumes = sorted(list(set(np.array(vol_info['volume_slice_info']['Volume']))), key=lambda x: int(x.split('-')[-1]))
ValueError: invalid literal for int() with base 10: 'Volume'

Could you please help me?
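
For what it is worth, the traceback shows that the sort key assumes every entry in the 'Volume' column looks like 'volume-<number>'; int() then fails on a value without that suffix (here the literal string 'Volume', possibly a header row that ended up in the data). A hypothetical illustration of the failure and a defensive key, not the repository's code:

    # Hypothetical illustration: int(x.split('-')[-1]) fails for entries that
    # do not end in '-<number>', e.g. a stray header value 'Volume'.
    names = ['volume-3', 'Volume', 'volume-12']
    valid = [n for n in names if n.split('-')[-1].isdigit()]
    available_volumes = sorted(valid, key=lambda x: int(x.split('-')[-1]))
    print(available_volumes)   # ['volume-3', 'volume-12']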

creating weight map

I have some trouble creating the weight maps with Convert_NibVolumes_to_Slices.py (not the 3D version, since my data are already slices). The program outputs:

Reloaded modules: network_zoo, Network_Utilities, General_Utilities, PyTorch_Datasets, Function_Library
0%| | 0/1 [00:00<?, ?it/s]

but no weight maps or csv files are written. I noticed that L178 of that file seems to read a txt file; which txt file should I adjust? Thanks :)

RuntimeError

RuntimeError: CUDA out of memory. Tried to allocate 360.00 MiB

Execution Error

May I know what this error is about?

0% 0/4165 [00:00<?, ?it/s]
train_iter 4165
0% 0/4165 [00:00<?, ?it/s]
(#26453082) Training [lr=[3.e-05]]: 0% 0/50 [00:00<?, ?it/s]
Setup Iteration... : 0% 0/1 [00:05<?, ?it/s]
Traceback (most recent call last):
  File "/content/drive/MyDrive/My Drive/Tversky Loss/Train_Networks/Train_Networks.py", line 235, in <module>
    main(training_setup)
  File "/content/drive/MyDrive/My Drive/Tversky Loss/Train_Networks/Train_Networks.py", line 137, in main
    flib.trainer([network,optimizer], train_data_loader, [base_loss_func, aux_loss_func], opt, Metrics, epoch)
  File "/content/drive/MyDrive/My Drive/Tversky Loss/Train_Networks/Function_Library.py", line 36, in trainer
    for slice_idx, file_dict in enumerate(train_data_iter):
  File "/usr/local/lib/python3.7/dist-packages/tqdm/std.py", line 1180, in __iter__
    for obj in iterable:
  File "/usr/local/lib/python3.7/dist-packages/torch/utils/data/dataloader.py", line 521, in __next__
    data = self._next_data()
  File "/usr/local/lib/python3.7/dist-packages/torch/utils/data/dataloader.py", line 1203, in _next_data
    return self._process_data(data)
  File "/usr/local/lib/python3.7/dist-packages/torch/utils/data/dataloader.py", line 1229, in _process_data
    data.reraise()
  File "/usr/local/lib/python3.7/dist-packages/torch/_utils.py", line 434, in reraise
    raise exception
ValueError: Caught ValueError in DataLoader worker process 0.
Original Traceback (most recent call last):
  File "/usr/local/lib/python3.7/dist-packages/torch/utils/data/_utils/worker.py", line 287, in _worker_loop
    data = fetcher.fetch(index)
  File "/usr/local/lib/python3.7/dist-packages/torch/utils/data/_utils/fetch.py", line 49, in fetch
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/usr/local/lib/python3.7/dist-packages/torch/utils/data/_utils/fetch.py", line 49, in <listcomp>
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/content/drive/MyDrive/My Drive/Tversky Loss/Utilities/PyTorch_Datasets.py", line 161, in __getitem__
    crops_for_picked_batch = gu.get_crops_per_batch(files_to_crop, crop_mask, crop_size=self.pars.Training['crop_size'], seed=self.rng.randint(0,1e8))
  File "/content/drive/MyDrive/My Drive/Tversky Loss/Utilities/General_Utilities.py", line 307, in get_crops_per_batch
    crop_idx = [rng.randint(crop_size[i]//2-1,np.array(batch_to_crop[0].shape[i+1])-crop_size[i]//2-1) for i in range(batch_to_crop[0].ndim-1)]
  File "/content/drive/MyDrive/My Drive/Tversky Loss/Utilities/General_Utilities.py", line 307, in <listcomp>
    crop_idx = [rng.randint(crop_size[i]//2-1,np.array(batch_to_crop[0].shape[i+1])-crop_size[i]//2-1) for i in range(batch_to_crop[0].ndim-1)]
  File "mtrand.pyx", line 746, in numpy.random.mtrand.RandomState.randint
  File "_bounded_integers.pyx", line 1254, in numpy.random._bounded_integers._rand_int64
ValueError: low >= high
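
For what it is worth, numpy's RandomState.randint(low, high) raises "ValueError: low >= high" whenever the sampling range is empty; in the crop line above that happens when the requested crop_size is not smaller than the slice along some axis, so there is no valid position left for the crop centre. A hypothetical illustration with made-up numbers, not the repository's code:

    # Hypothetical numbers: a crop larger than the image axis leaves an empty
    # range for the crop centre, so randint raises "ValueError: low >= high".
    import numpy as np
    rng = np.random.RandomState(0)
    crop_size, dim = 256, 200
    low, high = crop_size // 2 - 1, dim - crop_size // 2 - 1
    print(low, high)        # 127, 71 -> empty range
    rng.randint(low, high)  # raises ValueError: low >= high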
