What does it mean to constrain weights in a deconvolutional setting?  #4

@maffettone

Constraining the weights of a regular NMF makes sense when the goal is to say, "discover the components that are monotonically, or linearly, mixing to create these samples." However, in doing so we lose the functionality of the NMFD: specifically, our components no longer possess any degree of translational robustness.

What do the weights of the NMFD look like in a normal context? Should constraining them amount to constraining their sum/mean to match the constraints of pure NMF? These soft constraints are not yet implemented. Or do we fix each weight kernel such that its sum/mean matches the constraints of the pure NMF?

Let's get some plots of weights from the NMFD, and see what we can do.
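As a starting point for the sum/mean idea above, here is a minimal sketch of the "fix each weight kernel" option. It assumes the NMFD weights are stored as a 3-D array `W` of shape `(n_components, n_features, kernel_width)`; the array name, shape, and helper name are hypothetical, not anything currently in the codebase. It rescales each component's convolutional kernel so its total weight sums to a target value, mimicking the sum constraint of a pure (non-convolutional) NMF:

```python
import numpy as np

def constrain_kernel_sums(W, target=1.0):
    """Rescale each component's convolutional kernel so that its
    entries sum to `target`, mimicking the sum constraint of a
    pure (non-convolutional) NMF.

    W : array of shape (n_components, n_features, kernel_width)
    """
    W = np.asarray(W, dtype=float)
    sums = W.sum(axis=(1, 2), keepdims=True)  # per-component totals
    sums[sums == 0] = 1.0  # leave all-zero components untouched
    return W * (target / sums)

rng = np.random.default_rng(0)
W = rng.random((3, 5, 4))   # 3 components, 5 features, width-4 kernels
W_c = constrain_kernel_sums(W)
print(W_c.sum(axis=(1, 2)))  # each component now sums to 1.0
```

Note this only rescales existing kernels after the fact; whether the constraint should instead be enforced inside the multiplicative-update loop (as a soft penalty) is exactly the open question here.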
