Comments (4)
I have solved this problem, and the core issue is:
The official .pth checkpoint, the .tar checkpoint trained through the official tutorial, and the network structure used at inference each use different key names, so the keys need to be converted when loading.
The official conversion method is already written in the LightGlue library, at approximately line 470 of lightglue.py:
for i in range(self.conf.n_layers):
    pattern = f"self_attn.{i}", f"transformers.{i}.self_attn"
    state_dict = {k.replace(*pattern): v for k, v in state_dict.items()}
    pattern = f"cross_attn.{i}", f"transformers.{i}.cross_attn"
    state_dict = {k.replace(*pattern): v for k, v in state_dict.items()}
I followed this approach and manually checked each key-name mapping.
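To make the idea above concrete, here is a minimal, self-contained sketch of the same renaming pattern applied to a loaded checkpoint. The function name `remap_keys` and the example key names are assumptions for illustration, not part of the LightGlue API; the substring pairs mirror the snippet quoted above.

```python
def remap_keys(state_dict, n_layers):
    """Rename checkpoint keys from the training-time layout
    (e.g. "self_attn.{i}...") to the inference-time layout
    (e.g. "transformers.{i}.self_attn..."), as in the LightGlue snippet."""
    for i in range(n_layers):
        for old, new in [
            (f"self_attn.{i}", f"transformers.{i}.self_attn"),
            (f"cross_attn.{i}", f"transformers.{i}.cross_attn"),
        ]:
            # Rebuild the dict with every matching substring replaced.
            state_dict = {k.replace(old, new): v for k, v in state_dict.items()}
    return state_dict


# Hypothetical usage: remap before calling model.load_state_dict(...).
ckpt = {"self_attn.0.Wqkv.weight": 1, "cross_attn.1.ffn.bias": 2}
remapped = remap_keys(ckpt, n_layers=2)
```

If some keys still fail to match after remapping, comparing `state_dict.keys()` against `model.state_dict().keys()` (or loading with `strict=False`) helps locate the remaining mismatches.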
from glue-factory.
Hello, I want to fine-tune on my own dataset. How should I do it, and what format should the dataset be in?
from glue-factory.
You can download the Homography or MegaDepth datasets used officially and refer to them (although in my own tests the fine-tuning results were not good).
from glue-factory.
I have more questions I would like to ask you. Would it be convenient to add each other on QQ? 963170859
from glue-factory.