Comments (19)

sovrasov commented on May 17, 2024 (+133)

Most modern hardware architectures use FMA instructions for operations with tensors.
FMA computes a*x + b as one operation. Roughly, GMACs = 0.5 * GFLOPs.

from flops-counter.pytorch.
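
A minimal illustrative sketch (not from the thread) of the convention the discussion below settles on, namely one MAC = one multiply + one add = two FLOPs; the helper names are hypothetical:

# Hypothetical helpers: one multiply-accumulate counts as two floating-point operations.
def macs_to_flops(macs: float) -> float:
    return 2.0 * macs

def flops_to_macs(flops: float) -> float:
    return 0.5 * flops

print(macs_to_flops(4.1e9))  # roughly 4.1 GMACs -> 8.2 GFLOPs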

snownus commented on May 17, 2024 (+24)

I think GFLOPs = 2 * GMACs, since in general each MAC contains one multiplication and one addition.

sovrasov commented on May 17, 2024 (+17)

MAC = Multiply–accumulate operation

cassie101 commented on May 17, 2024 (+12)

@sovrasov, in this case would you consider changing the variable flops to mac to avoid confusion?

flops, params = get_model_complexity_info(net, (3, 224, 224),
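
For reference, a complete usage sketch along these lines (keyword names assumed from a recent ptflops release and may differ between versions; the example model and values are illustrative):

import torchvision.models as models
from ptflops import get_model_complexity_info

net = models.resnet50()
# Returns the multiply-accumulate count and parameter count as human-readable strings.
macs, params = get_model_complexity_info(net, (3, 224, 224),
                                         as_strings=True,
                                         print_per_layer_stat=False)
print(macs, params)  # roughly "4.1 GMac", "25.56 M" for ResNet-50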

sovrasov commented on May 17, 2024 (+12)

In the original ResNet paper, the authors mixed up MACs and FLOPs. As far as I remember, they provided a definition of FLOPs that counts one multiply & add as a single FLOP. Please check the paper and correct me if I'm wrong.

cmj18 commented on May 17, 2024 (+11)

GMACs = 2 * GFLOPs, because MACs include an addition and a multiplication operation, while GFLOPs only have an add operation.

drcege commented on May 17, 2024 (+8)

@cmj18 @jerryli1981
No, it should be GFLOPs = 2 * GMACs.
MAC stands for multiply–accumulate operation, which performs a <- a + (b x c) (counted as one operation).
FLOPs is an abbreviation of floating-point operations, which include mul / add / div ... etc. (each is counted separately as a single floating-point operation).
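
A small worked example of the difference, for a fully connected layer y = W x + b (sizes are assumed, and counting conventions for the bias vary):

# Back-of-the-envelope counts for a dense layer with assumed sizes.
in_features, out_features = 1024, 512

macs = out_features * in_features        # one multiply-accumulate per weight
flops = 2 * macs                         # each MAC = 1 multiply + 1 add
flops_with_bias = flops + out_features   # optionally count one add per output for the bias

print(macs, flops, flops_with_bias)      # 524288 1048576 1049088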

sovrasov commented on May 17, 2024 (+3)

@cassie101 makes sense, I'll change it

nizhenliang commented on May 17, 2024

Thank you, sir! Is the output value directly FLOPs? Do we need to divide it by 2 to get FLOPs?

XavierCHEN34 commented on May 17, 2024

What does MAC stand for? Multi-Add Calculation?

chnadell commented on May 17, 2024

Roughly GMACs = 2 * GFLOPs

@sovrasov is there a typo here? I did a little reading, and it seems that @snownus has it right. In general, a multiply-accumulate is one multiplication and one addition, each of which can be a floating-point operation. So 1 GMAC counts as 2 GFLOPs, meaning GMACs = 0.5 * GFLOPs (I'm not sure if this is what was already meant).

As for fused multiply-add (FMA), it seems that (if it is supported on a given chip/system) the two FLOPs are indeed computed "in a single step" (see here) or "at once" (see here). But this complicates our conversion. Perhaps in the case of FMA it is more accurate to say 1 GMAC = 1 GFLOP? Hopefully someone with more expertise than me can clarify!

sovrasov commented on May 17, 2024

@chnadell yes, you're right! @snownus also figured it out. I'll edit the first post to avoid any future confusion.

code-by-jin commented on May 17, 2024

Thank you, sir! Is the output value directly FLOPs? Do we need to divide it by 2 to get FLOPs?

I am also confused. Shouldn't we multiply it by 2 to get FLOPs?

sovrasov commented on May 17, 2024

@code-by-jin yes, exactly, we should multiply GMACs by 2 to get FLOPs.

code-by-jin commented on May 17, 2024

@code-by-jin yes, exactly, we should multiply GMACs by 2 to get FLOPs.

Thanks for your response. I checked ResNet-50 using your tool. It has around 4 GMACs, which is close to the number of FLOPs claimed in the ResNet paper. Now I am confused: do I really need to multiply your output GMACs by two?
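
The arithmetic behind this confusion, using the approximate numbers mentioned in this thread: as noted above, the ResNet paper counts one multiply-add as a single "FLOP", so its figure is comparable to the GMACs number rather than to 2 * GMACs.

# Assumed round numbers from the discussion above.
resnet50_gmacs = 4.1                    # reported by the tool
resnet50_gflops = 2 * resnet50_gmacs    # ~8.2 GFLOPs with multiply and add counted separately
print(resnet50_gflops)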

zyxjtu commented on May 17, 2024

Most modern hardware architectures use FMA instructions for operations with tensors. FMA computes a*x + b as one operation. Roughly, GMACs = 0.5 * GFLOPs.

Hi, I've never seen GMACs written like this before; does it mean 10^9 MACs? As far as I know, a capital letter before the word is used with FLOPS, not with FLOPs or MACs, which confuses me. Looking forward to your reply, thanks.

jerryli1981 commented on May 17, 2024

It isn't always true that GMACs = 2 * GFLOPs. For example, two models with the same GMACs may have very different GFLOPs. It depends on how efficiently you implement the model.

minecraftdixit commented on May 17, 2024

I want to know whether there is any relation between GOPS (giga operations per second) and GFLOPS. If I know GFLOPS, can I determine GOPS, or are they independent?

sovrasov commented on May 17, 2024

GOPS is a characteristic of hardware; it can only be determined by measurement. ptflops just shows an approximation of the theoretical number of operations required for one forward pass. Time is not considered by ptflops.
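
To make the distinction concrete, a rough, idealized sketch (all numbers assumed): hardware throughput is measured in operations per second, while ptflops reports an operation count per forward pass, so at best they combine into a lower-bound latency estimate.

# Assumed example values; real latency also depends on memory bandwidth and utilization.
model_gmacs = 4.1                      # per-forward-pass count, e.g. from ptflops
model_gflops = 2 * model_gmacs         # multiply and add counted separately
hardware_peak_gflops = 10_000.0        # hypothetical accelerator peak (10 TFLOPS)

best_case_latency_s = model_gflops / hardware_peak_gflops
print(f"best-case forward pass: {best_case_latency_s * 1e3:.3f} ms")  # ~0.820 ms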
