
flops are counted multiple times if a module is shared by other modules #106

Open
@CocytusDuo

Description


If a module instance is passed to a sub-module, so the same object is registered in two places, for example:

import torch.nn as nn
import ptflops

class Block(nn.Module):
    def __init__(self, linear_layer) -> None:
        super().__init__()
        self.linear_layer = linear_layer
    
    def forward(self, x):
        return self.linear_layer(x)
    
class Test_model(nn.Module):
    def __init__(self) -> None:
        super().__init__()
        self.linear_layer = nn.Linear(1000, 1000)
        self.block = Block(self.linear_layer)

    def forward(self, x):
        out = self.linear_layer(x)
        out = self.block(out)
        return out

net = Test_model()
print(ptflops.get_model_complexity_info(net, (20, 1000)))

then the FLOPs and parameters of the shared nn.Linear(1000, 1000) are counted twice, once under Test_model and once under Block:

Warning: variables __flops__ or __params__ are already defined for the moduleLinear ptflops can affect your code!
Test_model(
  2.0 M, 200.000% Params, 80.0 MMac, 100.000% MACs, 
  (linear_layer): Linear(1.0 M, 100.000% Params, 40.0 MMac, 50.000% MACs, in_features=1000, out_features=1000, bias=True)
  (block): Block(
    1.0 M, 100.000% Params, 40.0 MMac, 50.000% MACs, 
    (linear_layer): Linear(1.0 M, 100.000% Params, 40.0 MMac, 50.000% MACs, in_features=1000, out_features=1000, bias=True)
  )
)
('80.0 MMac', '1.0 M')
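The warning suggests ptflops attaches its `__flops__`/`__params__` counters per module and then sums over the module tree, so a module reachable through two parents is tallied once per parent. A minimal sketch in plain PyTorch (no ptflops) reproducing the effect on parameter counts, plus a dedup-by-object-identity workaround; `TestModel` mirrors the `Test_model` from the report, and the dedup loop is an illustration of the idea, not a ptflops API:

```python
import torch.nn as nn

class Block(nn.Module):
    def __init__(self, linear_layer):
        super().__init__()
        self.linear_layer = linear_layer

    def forward(self, x):
        return self.linear_layer(x)

class TestModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear_layer = nn.Linear(1000, 1000)
        # Same Linear instance, now registered under two parents.
        self.block = Block(self.linear_layer)

    def forward(self, x):
        return self.block(self.linear_layer(x))

net = TestModel()

# Walking the tree without deduplication visits the shared Linear twice,
# which is what inflates a per-module tally.
naive = sum(p.numel()
            for _, m in net.named_modules(remove_duplicate=False)
            for p in m.parameters(recurse=False))

# Deduplicating by object identity counts each module instance once.
seen, dedup = set(), 0
for _, m in net.named_modules(remove_duplicate=False):
    if id(m) not in seen:
        seen.add(id(m))
        dedup += sum(p.numel() for p in m.parameters(recurse=False))

print(naive)  # 2002000: the Linear's 1001000 params counted twice
print(dedup)  # 1001000
```

Note that FLOPs are a different story: the shared Linear really does execute twice per forward pass, so 80.0 MMac total is arguably correct here, while 2.0 M params is not.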

Metadata

Assignees: no one assigned
Labels: bug (Something isn't working), wontfix (This will not be worked on)
Projects: none
Milestone: none