Description
After going through the source code, I found that addition operations are counted differently in different functions: in torch.add all additions seem to be counted, while in some matrix multiplications part of the additions is not. One example is torch.addmm, which computes input + mat1 @ mat2 (up to the beta and alpha scaling factors):
```python
def _addmm_tensor_flops_hook(input, mat1, mat2, *, beta=1, alpha=1, out=None):
    flops = np.prod(mat1.shape, dtype=np.int64) * mat2.shape[-1]
    if beta != 0:
        flops += np.prod(input.shape, dtype=np.int64)
    return flops
```
Here the addition of input to the result of mat1 @ mat2 is included, while the additions inside mat1 @ mat2 itself do not seem to be counted.
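For concreteness, here is a quick check of what the quoted hook reports; the shapes are just an illustrative example I picked, and the hook is copied locally so the snippet is self-contained (the real module path may differ):

```python
import numpy as np
import torch

# Copied from the hook quoted above, only for this self-contained check.
def _addmm_tensor_flops_hook(input, mat1, mat2, *, beta=1, alpha=1, out=None):
    flops = np.prod(mat1.shape, dtype=np.int64) * mat2.shape[-1]
    if beta != 0:
        flops += np.prod(input.shape, dtype=np.int64)
    return flops

m, n, l = 4, 3, 5
inp, mat1, mat2 = torch.randn(m, l), torch.randn(m, n), torch.randn(n, l)
print(_addmm_tensor_flops_hook(inp, mat1, mat2))  # m*n*l + m*l = 60 + 20 = 80
```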
Assuming mat1 and mat2 have shapes m*n and n*l, the hook reports m*n*l + m*l operations for addmm, but I think it should be m*n*l (multiplications) + (n-1)*m*l (additions inside the matrix product) + m*l (adding input) = 2*m*n*l.
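Below is a minimal sketch of the counting convention I have in mind; it is only an illustration, not the library's actual code, and addmm_flops_with_additions is a hypothetical name:

```python
import numpy as np

def addmm_flops_with_additions(input_shape, mat1_shape, mat2_shape, beta=1):
    # Hypothetical count that also includes the additions inside mat1 @ mat2.
    m, n = mat1_shape
    l = mat2_shape[-1]
    flops = m * n * l             # multiplications in mat1 @ mat2
    flops += (n - 1) * m * l      # additions inside mat1 @ mat2
    if beta != 0:
        flops += np.prod(input_shape, dtype=np.int64)  # adding the (scaled) input
    return flops

# For m, n, l = 4, 3, 5 this gives 60 + 40 + 20 = 120, i.e. 2*m*n*l,
# versus the 80 reported by the current hook.
print(addmm_flops_with_additions((4, 5), (4, 3), (3, 5)))  # 120
```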
I understand that the additions contribute little to the total, but wouldn't a consistent convention for counting them be more reasonable? Or is there an error in my reasoning, e.g. some extra condition that applies in real cases?
Thank you very much