Comments

qaqzzz commented on June 2, 2024

Xu, thank you for your reply. I have understood the AC loss and implemented it successfully. Thank you anyway. Best wishes, Li

Sorry to bother you. Could you please share your PyTorch implementation? I am quite confused about this loss function, and I got very bad performance. :(

Thank you very much!

import torch
from torch.nn import Module

class ActiveContourLoss(Module):
    def __init__(self):
        super(ActiveContourLoss, self).__init__()

    def forward(self, y_pred, y_true, combine=None):
        # gradients of the prediction in the horizontal and vertical directions
        x = y_pred[:, :, 1:, :] - y_pred[:, :, :-1, :]
        y = y_pred[:, :, :, 1:] - y_pred[:, :, :, :-1]

        delta_x = x[:, :, 1:, :-2]**2
        delta_y = y[:, :, :-2, 1:]**2
        delta_u = torch.abs(delta_x + delta_y)

        epsilon = 1e-8  # small constant so the square root is never taken of exactly zero
        w = 1.

        if combine is not None:
            length = w * torch.mean(torch.sqrt(delta_u + epsilon))  # equ. (11) in the paper
        else:
            length = w * torch.sum(torch.sqrt(delta_u + epsilon))   # equ. (11) in the paper

        if torch.cuda.is_available():
            C_1 = torch.ones(y_true.shape, dtype=torch.float32).cuda()
            C_2 = torch.zeros(y_true.shape, dtype=torch.float32).cuda()
        else:
            C_1 = torch.ones(y_true.shape, dtype=torch.float32)
            C_2 = torch.zeros(y_true.shape, dtype=torch.float32)

        if combine is not None:
            region_in = torch.abs(torch.mean(y_pred * ((y_true - C_1)**2)))          # equ. (12) in the paper
            region_out = torch.abs(torch.mean((1. - y_pred) * ((y_true - C_2)**2)))  # equ. (12) in the paper
        else:
            region_in = torch.abs(torch.sum(y_pred * ((y_true - C_1)**2)))           # equ. (12) in the paper
            region_out = torch.abs(torch.sum((1. - y_pred) * ((y_true - C_2)**2)))   # equ. (12) in the paper

        lambdaP = 5.  # the lambda weight can be tuned
        loss = length + lambdaP * (region_in + region_out)

        return loss
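
For reference, here is a minimal usage sketch of how this loss might be called; the channels-first shapes, the random tensors and the sigmoid on the logits are my own assumptions for illustration, not part of the original post:

import torch

# Hypothetical usage sketch (shapes and sigmoid are assumptions, not from the original post).
criterion = ActiveContourLoss()

logits = torch.randn(4, 1, 128, 128)                  # raw network output, channels-first (B, C, H, W)
y_true = (torch.rand(4, 1, 128, 128) > 0.5).float()   # binary ground-truth mask of the same shape

y_pred = torch.sigmoid(logits)                        # squash logits into [0, 1] before the loss
loss = criterion(y_pred, y_true)                      # default branch uses the sum-based variant
loss_mean = criterion(y_pred, y_true, combine=True)   # mean-based variant via the `combine` flag
print(loss.item(), loss_mean.item())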

xuuuuuuchen commented on June 2, 2024

Sorry, I didn't notice that you closed it. If you are still wondering, here is my reply:

AC loss (this version) may not work well on imbalanced-label problems. A weighting hyperparameter between region_in and region_out may be required in practice (a sketch is included below).

The 0 in y_pred[:,0,:,:] slices out the first channel, so y_pred ends up with the same shape as C_1 and C_2.

The input shape is channels-first, i.e. (batch size, channels, H, W).
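
To illustrate the weighting remark above, here is a minimal sketch of how such a hyperparameter could be exposed; the standalone ac_loss_weighted function and its region_weight argument are my own naming for illustration, not part of the official code:

import torch

# Hedged sketch: a balance weight between the two region terms for imbalanced labels.
# `region_weight` is a hypothetical argument, not part of the official implementation.
def ac_loss_weighted(y_pred, y_true, lambdaP=5., region_weight=1., epsilon=1e-8):
    x = y_pred[:, :, 1:, :] - y_pred[:, :, :-1, :]
    y = y_pred[:, :, :, 1:] - y_pred[:, :, :, :-1]
    delta_u = torch.abs(x[:, :, 1:, :-2]**2 + y[:, :, :-2, 1:]**2)
    length = torch.sum(torch.sqrt(delta_u + epsilon))

    # C_1 = 1 and C_2 = 0 as in the class above, written out directly
    region_in = torch.abs(torch.sum(y_pred * ((y_true - 1.)**2)))
    region_out = torch.abs(torch.sum((1. - y_pred) * (y_true**2)))

    # region_weight > 1 emphasises the foreground term, < 1 the background term
    return length + lambdaP * (region_weight * region_in + region_out)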

shiqi1994 commented on June 2, 2024

(quoting the question and ActiveContourLoss code from the first comment above)

Thank you very much! And may I ask whether I need to apply a sigmoid to y_pred? Also, how do you set the learning rate and the number of iterations? In my experiment the output always looks like this:
[screenshot of the segmentation output omitted]
I am very confused about why AC loss by itself does not work...
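
For what it's worth, here is a rough sketch of how the loss could be wired into a training step, assuming the ActiveContourLoss class quoted above; the placeholder conv model, the Adam optimizer and the learning rate of 1e-4 are assumptions for illustration only, not settings confirmed by the authors:

import torch

# Rough training-step sketch; the stand-in conv "model", Adam and lr=1e-4 are assumptions.
model = torch.nn.Conv2d(1, 1, kernel_size=3, padding=1)   # stand-in for a real segmentation network
criterion = ActiveContourLoss()                           # the class defined earlier in this thread
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

def train_step(images, masks):
    optimizer.zero_grad()
    logits = model(images)              # (B, 1, H, W) raw scores
    y_pred = torch.sigmoid(logits)      # squash into [0, 1] before the AC loss
    loss = criterion(y_pred, masks)
    loss.backward()
    optimizer.step()
    return loss.item()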

Gaojun211 commented on June 2, 2024

Hi Xu,

I have used AC loss to try to segment two-class images. Although the loss keeps decreasing, the Dice score does not improve at all; on the contrary, it stays at a fairly low value, about 0.0001.

import torch
from torch.nn import Module

class ActiveContourLoss(Module):
    def __init__(self):
        super(ActiveContourLoss, self).__init__()

    def forward(self, y_pred, y_true):
        # gradients of the prediction in the horizontal and vertical directions
        x = y_pred[:, :, 1:, :] - y_pred[:, :, :-1, :]
        y = y_pred[:, :, :, 1:] - y_pred[:, :, :, :-1]

        delta_x = x[:, :, 1:, :-2]**2
        delta_y = y[:, :, :-2, 1:]**2
        delta_u = torch.abs(delta_x + delta_y)

        epsilon = 1e-8  # small constant so the square root is never taken of exactly zero
        w = 1.

        length = w * torch.sum(torch.sqrt(delta_u + epsilon))  # equ. (11) in the paper

        C_1 = torch.ones(y_true.shape, dtype=torch.float32).cuda()
        C_2 = torch.zeros(y_true.shape, dtype=torch.float32).cuda()

        region_in = torch.abs(torch.sum(y_pred * ((y_true - C_1)**2)))           # equ. (12) in the paper
        region_out = torch.abs(torch.sum((1. - y_pred) * ((y_true - C_2)**2)))   # equ. (12) in the paper

        lambdaP = 5.  # the lambda weight can be tuned
        loss = length + lambdaP * (region_in + region_out)

        return loss

This is my PyTorch implementation. In y_pred[:,0,:,:], what does the 0 mean? Is it the channel index? Does y_pred need a sigmoid before it is passed in? And is the input shape (channel, batch size, H, W) or (batch size, channel, H, W)?
My y_pred shape is (16, 1, 512, 512), so do I need to modify it?

Best,
qaqzzz

Hello, I have the same problem: the Dice score is always 0.00000, even though I have set the learning rate to 0.00005, so I am very confused about it. Could you give me some suggestions? Thank you very much.
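
Not an authoritative answer, but when the Dice score is stuck near zero it is worth checking how the score itself is computed on the sigmoid outputs. Here is a small sketch of a Dice check on thresholded predictions; the 0.5 threshold and the smoothing constant are my own choices, not from this thread:

import torch

# Hedged sketch of a Dice score check; threshold=0.5 and smooth=1e-6 are arbitrary choices.
def dice_score(logits, y_true, threshold=0.5, smooth=1e-6):
    y_prob = torch.sigmoid(logits)            # make sure predictions are probabilities in [0, 1]
    y_bin = (y_prob > threshold).float()      # hard mask used for scoring
    intersection = (y_bin * y_true).sum()
    return (2. * intersection + smooth) / (y_bin.sum() + y_true.sum() + smooth)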
