MuladdMacro.jl

This package provides the @muladd macro. It automatically converts expressions containing multiplications and additions or subtractions into muladd calls, which fuse into FMA instructions when doing so increases performance. The @muladd macro can be placed on whole code blocks; it automatically finds the appropriate expressions and nests muladd calls when necessary. In mixed expressions, summands without a multiplication are grouped together and evaluated first, but otherwise the order of evaluation of multiplications and additions is not changed.
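
For instance, a single multiply-add or multiply-subtract expression becomes one muladd call. The following is a minimal illustration with hypothetical scalar variables a, x, and b; the printed form follows the same conventions as the examples below.

julia> using MuladdMacro

julia> @macroexpand @muladd a * x + b
:((muladd)(a, x, b))

julia> @macroexpand @muladd b - a * x
:((muladd)(-a, x, b))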

Tutorials and Documentation

For information on using the package, see the stable documentation. Use the in-development documentation for documentation of unreleased features.

Examples

julia> using MuladdMacro

julia> @macroexpand(@muladd k3 = f(t + c3 * dt, @. uprev + dt * (a031 * k1 + a032 * k2)))
:(k3 = f((muladd)(c3, dt, t), (muladd).(dt, (muladd).(a032, k2, (*).(a031, k1)), uprev)))

julia> @macroexpand(@muladd integrator.EEst = integrator.opts.internalnorm(
           (update -
            dt * (bhat1 * k1 + bhat4 * k4 + bhat5 * k5 +
                  bhat6 * k6 + bhat7 * k7 + bhat10 * k10)) ./
           @. (integrator.opts.abstol +
               max(abs(uprev), abs(u)) * integrator.opts.reltol)))
:(integrator.EEst = integrator.opts.internalnorm((muladd)(-dt, (muladd)(bhat10, k10, (muladd)(bhat7, k7, (muladd)(bhat6, k6, (muladd)(bhat5, k5, (muladd)(bhat4, k4, bhat1 * k1))))), update) ./ (muladd).(max.(abs.(uprev), abs.(u)), integrator.opts.reltol, integrator.opts.abstol)))

Broadcasting

A muladd call will be broadcasted if both the * and the + or - are broadcasted. If either one is not broadcasted, then the expression will be converted to a non-dotted muladd.
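
For example, compare the fully dotted and the undotted form (a minimal sketch with hypothetical variables a, b, and c; the exact printed form may vary slightly between Julia versions):

julia> using MuladdMacro

julia> @macroexpand @muladd @. a * b + c
:((muladd).(a, b, c))

julia> @macroexpand @muladd a * b + c
:((muladd)(a, b, c))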

Limitations

Currently, @muladd handles only explicit calls to +, -, and *. In particular, updating assignments such as += and literal powers such as ^2 are not supported. Thus, you need to rewrite them, e.g.

julia> using MuladdMacro

julia> a = 1.0;
       b = 2.0;
       c = 3.0;

julia> @macroexpand @muladd a += b * c # does not work
:(a += b * c)

julia> @macroexpand @muladd a = a + b * c # good alternative
:(a = (muladd)(b, c, a))

julia> @macroexpand @muladd a + b^2 # does not work
:(a + b ^ 2)

julia> @macroexpand @muladd a + b * b # good alternative
:((muladd)(b, b, a))

Credit

Most of the credit goes to @fcard and @devmotion for building the first version and greatly refining the macro. These contributions are not directly shown as this was developed in Gitter chats and in the DiffEqBase.jl repository, but these two individuals did almost all of the work.
