Comments (3)

LiBromine commented on July 28, 2024

For example, when fine-tuning LLaMA we use the following code, which works correctly under both DeepSpeed and FSDP. Here optim_mode is a custom argument we added; its value determines whether our optimizer is used. If there is a better approach, suggestions are welcome.

import torch.nn as nn

import lpmm
from transformers import Trainer
from transformers.trainer_pt_utils import get_parameter_names


class LpmmTrainer(Trainer):  # subclass name is illustrative; only create_optimizer is overridden
    def create_optimizer(self):
        """
        Setup the optimizer.

        We provide a reasonable default that works well. If you want to use something else, you can pass a tuple in the
        Trainer's init through `optimizers`, or subclass and override this method in a subclass.
        """
        if self.optimizer is None:
            if self.args.optim_mode is None or self.args.optim_mode == "base":
                # Fall back to the stock HF optimizer selected via the `optim` argument.
                print(f"Use {self.args.optim}")
                super().create_optimizer()
            else:
                assert self.args.optim_mode == "lpmm"
                opt_model = self.model

                # Mirror the default Trainer grouping: apply weight decay to all
                # parameters except LayerNorm weights and biases.
                decay_parameters = get_parameter_names(opt_model, [nn.LayerNorm])
                decay_parameters = [name for name in decay_parameters if "bias" not in name]
                optimizer_grouped_parameters = [
                    {
                        "params": [
                            p for n, p in opt_model.named_parameters() if (n in decay_parameters and p.requires_grad)
                        ],
                        "weight_decay": self.args.weight_decay,
                    },
                    {
                        "params": [
                            p for n, p in opt_model.named_parameters() if (n not in decay_parameters and p.requires_grad)
                        ],
                        "weight_decay": 0.0,
                    },
                ]

                # Reuse HF's parsing of lr/betas/eps, but hand the groups to the low-bit AdamW.
                _, optimizer_kwargs = Trainer.get_optimizer_cls_and_kwargs(self.args)
                self.optimizer = lpmm.optim.AdamW(
                    params=optimizer_grouped_parameters,
                    factor_second_moment=False,
                    qconfig=self.args.qconfig,
                    fused=False,
                    **optimizer_kwargs,
                )
        return self.optimizer
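
For reference, a sketch of how this subclass might be wired up. The extra optim_mode/qconfig fields on TrainingArguments and the config path are illustrative assumptions (not names shipped by low-bit-optimizers or transformers), and model/train_dataset are assumed to be constructed elsewhere:

from dataclasses import dataclass, field
from typing import Optional

from transformers import TrainingArguments


@dataclass
class LpmmTrainingArguments(TrainingArguments):
    # Hypothetical custom fields backing self.args.optim_mode / self.args.qconfig above.
    optim_mode: Optional[str] = field(default="base")  # "base" -> stock HF optimizer, "lpmm" -> low-bit AdamW
    qconfig: Optional[str] = field(default=None)       # path to an lpmm quantization config

args = LpmmTrainingArguments(
    output_dir="out",
    optim_mode="lpmm",
    qconfig="configs/default.yml",  # hypothetical path
    weight_decay=0.01,
)
trainer = LpmmTrainer(model=model, args=args, train_dataset=train_dataset)
trainer.train()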

LiBromine commented on July 28, 2024

Hi, our optimizer is not built into the transformers Trainer, so the low-bit optimizers cannot currently be enabled by passing the optim field directly. To use them with the Trainer, you can override the create_optimizer method.
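
A minimal sketch of such an override (the class name is illustrative, the weight-decay parameter groups from the fuller snippet above are omitted for brevity, and lpmm.optim.AdamW is assumed to accept the lr/betas/eps kwargs that Trainer.get_optimizer_cls_and_kwargs returns):

import lpmm
from transformers import Trainer


class MinimalLowBitTrainer(Trainer):  # illustrative name
    def create_optimizer(self):
        if self.optimizer is None:
            # Let HF parse lr/betas/eps from TrainingArguments, then build
            # the low-bit AdamW from lpmm instead of the stock optimizer.
            _, optimizer_kwargs = Trainer.get_optimizer_cls_and_kwargs(self.args)
            self.optimizer = lpmm.optim.AdamW(self.model.parameters(), **optimizer_kwargs)
        return self.optimizer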

sunshin2012 commented on July 28, 2024

> Hi, our optimizer is not built into the transformers Trainer, so the low-bit optimizers cannot currently be enabled by passing the optim field directly. To use them with the Trainer, you can override the create_optimizer method.

How exactly should this be done?
