y0ast / deterministic-uncertainty-quantification
Code for "Uncertainty Estimation Using a Single Deep Deterministic Neural Network"
Home Page: https://arxiv.org/abs/2003.02037
License: MIT License
In your ICML presentation slides, you mentioned that DUQ is able to estimate aleatoric uncertainty, but in Section 2.3 of the paper you said DUQ captures both aleatoric and epistemic uncertainty.
Compared with "What Uncertainties Do We Need in Bayesian Deep Learning for Computer Vision?" by Alex Kendall in NeurIPS 2017, what is the advantage of DUQ?
Compared with using a Gaussian Mixture Density Network to estimate the standard deviations (uncertainty), what is your advantage?
Thanks in advance!
Hi,
thanks for sharing your work.
I'm trying to reproduce your SVHN/CIFAR-10 results. I have trained your model and am now testing it.
I produced two histograms of scores (kernel distance) on CIFAR-10 and SVHN, but they are quite different from those in your paper. The accuracy and AUROC are similar to the paper:
SVHN
Accuracy, AUROC: 0.9135159073448065, 0.9238
obtained from
accuracy, auroc = get_cifar_svhn_ood(model)
CIFAR-10
Accuracy, AUROC: 0.9238, 0.9070916430423466
obtained from
accuracy, auroc = get_auroc_classification(test_dataset, model)
I have attached the two histograms.
Do you think these results are similar to yours?
Thanks
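For context, the OOD AUROC in this kind of comparison can be computed from the two sets of kernel-distance scores directly. A minimal sketch, assuming `id_scores` and `ood_scores` are per-example kernel distances (higher means more in-distribution) and using scikit-learn's `roc_auc_score`; the toy numbers below are made up:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Toy kernel-distance scores; in practice these would come from the model
# on the in-distribution (e.g. CIFAR-10) and OOD (e.g. SVHN) test sets.
id_scores = np.array([0.9, 0.8, 0.95, 0.7])
ood_scores = np.array([0.2, 0.4, 0.1, 0.3])

# Label in-distribution as 1 and OOD as 0; the score is the kernel distance.
labels = np.concatenate([np.ones_like(id_scores), np.zeros_like(ood_scores)])
scores = np.concatenate([id_scores, ood_scores])

auroc = roc_auc_score(labels, scores)
print(auroc)  # 1.0 here, since the toy scores separate perfectly
```

Note that the AUROC only depends on the ranking of the scores, so two histogram pairs can look quite different while giving a similar AUROC.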
Maybe adding a link or a note for "./data/notMNIST_small.mat" might be helpful.
Thanks,
I'm confused about the method of updating centroids; could you please explain it?
The paper mentions:
The method of updating centroids was introduced in the Appendix of van den Oord et al. (2017) for updating quantised latent variables.
I found the paper, but didn't find the Appendix of van den Oord et al. (2017). Could you provide a link to the Appendix?
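For reference, the exponential-moving-average centroid update described there can be sketched as follows. This is an illustrative standalone version, not the repository's code; the momentum `gamma`, shapes, and function name are assumptions:

```python
import numpy as np

gamma = 0.99  # EMA momentum; illustrative value

# Per-class example counts N and per-class feature sums m are updated as
# exponential moving averages; the centroid for class c is then m_c / N_c.
N = np.ones(3)          # running count per class (3 classes here)
m = np.zeros((3, 4))    # running feature sum per class (4-dim features)

def ema_update(N, m, features, one_hot_y):
    # features: (batch, dim); one_hot_y: (batch, classes)
    N = gamma * N + (1 - gamma) * one_hot_y.sum(axis=0)
    m = gamma * m + (1 - gamma) * one_hot_y.T @ features
    return N, m

rng = np.random.default_rng(0)
features = rng.standard_normal((8, 4))
one_hot_y = np.eye(3)[rng.integers(0, 3, size=8)]

N, m = ema_update(N, m, features, one_hot_y)
centroids = m / N[:, None]    # shape (classes, dim)
print(centroids.shape)  # (3, 4)
```

Dividing the running sum by the running count gives a smoothed per-class mean, which avoids recomputing centroids from scratch every step.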
def step(engine, batch):
    model.train()
    optimizer.zero_grad()

    x, y = batch
    x, y = x.cuda(), y.cuda()

    # Track gradients w.r.t. the input for the gradient penalty
    x.requires_grad_(True)

    y_pred = model(x)
    y = F.one_hot(y, num_classes).float()

    loss = F.binary_cross_entropy(y_pred, y, reduction="mean")

    if l_gradient_penalty > 0:
        gp = calc_gradient_penalty(x, y_pred)
        loss += l_gradient_penalty * gp

    loss.backward()
    optimizer.step()

    x.requires_grad_(False)

    # Update the class centroids without tracking gradients
    with torch.no_grad():
        model.eval()
        model.update_embeddings(x, y)

    return loss.item()
Is the gradient of x just for calculating the gradient penalty? How does the loss term l_gradient_penalty * gp backpropagate?
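On the backpropagation question, the key is that computing an input gradient with `create_graph=True` makes that gradient itself part of the autograd graph, so a penalty built from it can be backpropagated into the model's weights. A minimal self-contained sketch (the toy model and penalty form are illustrative, not the repository's exact code):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy model standing in for the DUQ network.
model = nn.Sequential(nn.Linear(2, 8), nn.ReLU(), nn.Linear(8, 1))

x = torch.randn(4, 2, requires_grad=True)
y_pred = model(x)

# Gradient of the output w.r.t. the input. create_graph=True records the
# gradient computation itself, so the penalty stays differentiable.
gradients = torch.autograd.grad(
    outputs=y_pred,
    inputs=x,
    grad_outputs=torch.ones_like(y_pred),
    create_graph=True,
)[0]

# Two-sided penalty: push the input-gradient norm towards 1.
grad_norm = gradients.flatten(1).norm(2, dim=1)
gp = ((grad_norm - 1) ** 2).mean()

gp.backward()  # gradients now flow into the model's parameters
print(model[0].weight.grad is not None)  # True
```

Without `create_graph=True`, `gradients` would be a plain tensor detached from the graph and `gp.backward()` could not update the weights.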
def update_embeddings(self, x, y):
    # Exponential moving average of the number of examples per class
    self.N = self.gamma * self.N + (1 - self.gamma) * y.sum(0)

    z = self.feature_extractor(x)
    z = torch.einsum("ij,mnj->imn", z, self.W)

    # Exponential moving average of the summed embeddings per class
    embedding_sum = torch.einsum("ijk,ik->jk", z, y)
    self.m = self.gamma * self.m + (1 - self.gamma) * embedding_sum
Could you please explain the process of model.update_embeddings? What is the meaning of self.N and self.m?
Thank you so much!
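For intuition, the running statistics define the class centroids: `self.N` tracks how many examples of each class have been seen (as a moving average) and `self.m` tracks the summed class embeddings, so the centroid of class c is `m_c / N_c`. A small numeric sketch of how a kernel distance to each centroid could then be read off (the values, `sigma`, and the RBF form are illustrative assumptions):

```python
import numpy as np

# Running statistics for 2 classes with 2-dim features (made-up values).
N = np.array([5.0, 3.0])          # EMA of examples seen per class
m = np.array([[10.0, 0.0],        # EMA of summed features per class
              [0.0, 6.0]])

centroids = m / N[:, None]        # per-class mean feature = m_c / N_c

z = np.array([2.0, 0.0])          # feature vector of a new input
sigma = 1.0                       # kernel length scale (illustrative)

# RBF kernel distance of z to each centroid; larger = closer.
kernel_distance = np.exp(-((z - centroids) ** 2).sum(axis=1) / (2 * sigma**2))
print(kernel_distance.argmax())   # 0: z sits exactly on the first centroid
```

The moving averages mean the centroids drift slowly as the feature extractor trains, rather than jumping with each minibatch.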
Hello, can this method be used for object detection? Have you done any relevant experiments? Thank you so much!
File "/deterministic-uncertainty-quantification-master/utils/evaluate_ood.py", line 44, in loop_over_dataloader
    kernel_distance, pred = output.max(1)
AttributeError: 'tuple' object has no attribute 'max'
As the title says.
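One plausible cause (an assumption about this setup, not a confirmed diagnosis): the model's forward returns a tuple such as `(y_pred, z)` rather than a single tensor, so `.max(1)` is called on the tuple. A minimal sketch of a defensive unpack, with a fake output standing in for `model(x)`:

```python
import torch

# Fake model output standing in for model(x): (predictions, extra tensor).
output = (torch.tensor([[0.1, 0.9], [0.8, 0.2]]), torch.zeros(2, 4))

# If the forward returns a tuple, keep only the prediction tensor
# before taking the per-row max.
if isinstance(output, tuple):
    output = output[0]

kernel_distance, pred = output.max(1)
print(pred.tolist())  # [1, 0]
```

If your model really does return only one tensor, the error likely comes from somewhere else and this sketch will not apply.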
Thanks for the great paper.
I am trying to replicate the uncertainty plot of the Deep Ensemble on the toy example, but I couldn't get the same plot as shown in Figure 1 of the paper.
I noticed that you only provide the code for DUQ. Can you also share the code for generating the uncertainty plot for the Deep Ensemble?
Thanks,
Hello,
Very interesting work!
I am also interested in your work about semantic segmentation:
https://arxiv.org/abs/2111.00079
Are you planning on publishing the code?
Thanks a lot!
Hi,
Thank you for sharing your working code.
I wanted to try your code, but I ran into a problem with the training code:
File "directory/train_duq_cifar.py", line 103, in calc_gradient_penalty
    gradients = calc_gradients_input(x, y_pred)
File "directory/train_duq_cifar.py", line 95, in calc_gradients_input
    create_graph=True,
It occurred after 1 epoch.
As you can see at line 122 of train_duq_cifar.py, the authors set x.requires_grad_(True) to track the gradients.
I don't know why this error occurred.
Could you help me?
Thank you