---
Hi, thank you for trying out RxInfer! You have two variables in your model, r and p. Now, if we look at the documentation for defining a custom node, we can see what we should do: it shows both the update equations for sum-product message passing and the update rules for variational message passing. There is a subtle difference between these two update equations: the sum-product update rule only depends on the incoming messages μ on the other edges, whereas the variational update rule depends on the posterior marginals q of the connected variables. Since you don't impose any additional constraints on the variational posterior, sum-product message passing will be used here, which means you have to initialize the messages μ(r) and μ(p) rather than the marginals q(r) and q(p):

initialization = @initialization begin
μ(r) = GammaShapeRate(1.0, 1.0)
μ(p) = Beta(1.0, 1.0)
end

Now if you try to run inference with this initialization, you'll run into the next error, telling you that there are no rules available for the combination of your node and the incoming messages. This is because you defined your node yourself, so RxInfer has no message passing rules for it yet. You can work around this with the NodeFunctionRuleFallback option together with projection constraints from ExponentialFamilyProjection:

using RxInfer, Random, Distributions, ExponentialFamilyProjection
# Register the Negative Binomial node
@node Distributions.NegativeBinomial Stochastic [out, r, p]
# Define the model for a single gene using RxInfer
@model function simple_nb(data, γ_α, γ_β, β_α, β_β)
# Set prior for r
r ~ GammaShapeRate(γ_α, γ_β)
# Set prior for p
p ~ Beta(β_α, β_β)
# Negative Binomial likelihood
data .~ NegativeBinomial(r, p)
end
# Generate synthetic dataset
Random.seed!(42)
n_samples = 100
# Ground truth parameters for p
β_α = 1.0
β_β = 1.0
p_true = rand(Beta(β_α, β_β))
# Ground truth parameters for r
γ_α = 5.0
γ_β = 0.1
r_true = rand(GammaShapeRate(γ_α, γ_β))
# Simulate the data
data = rand(NegativeBinomial(r_true, p_true), n_samples)
initialization = @initialization begin
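# Initial messages for r and p (messages μ rather than marginals q, since no factorization constraint is imposed)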
μ(r) = GammaShapeRate(1.0, 1.0)
μ(p) = Beta(1.0, 1.0)
end
constraints = @constraints begin
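# Project both the posterior marginals q and the messages μ onto tractable exponential-family forms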
q(r) :: ProjectedTo(Gamma)
q(p) :: ProjectedTo(Beta)
μ(r) :: ProjectedTo(Gamma)
μ(p) :: ProjectedTo(Beta)
end
# Run inference
result = infer(
model = simple_nb(γ_α=γ_α, γ_β=γ_β, β_α=β_α, β_β=β_β),
data = (data = data,),
initialization = initialization,
constraints = constraints,
options = (
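# Fall back to generic rules that evaluate the node's density when no hand-derived rule exists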
rulefallback = NodeFunctionRuleFallback(),
),
iterations=10
)

This gives you an inference result. You can improve the inference by deriving the message passing rules yourself; hopefully this gets you started.

EDIT: After messing around a bit more with your model, I have found a way to improve the inference result. For this, I imposed a Mean Field constraint on the joint posterior q(r, p), so the variational updates only need the posterior marginals, which we initialize instead of the messages:

using RxInfer, Random, Distributions, ExponentialFamilyProjection
# Register the Negative Binomial node
@node Distributions.NegativeBinomial Stochastic [out, r, p]
# Define the model for a single gene using RxInfer
@model function simple_nb(data, γ_α, γ_β, β_α, β_β)
# Set prior for r
r ~ GammaShapeRate(γ_α, γ_β)
# Set prior for p
p ~ Beta(β_α, β_β)
# Negative Binomial likelihood
data .~ NegativeBinomial(r, p)
end
# Generate synthetic dataset
Random.seed!(42)
n_samples = 100
# Ground truth parameters for p
β_α = 1.0
β_β = 1.0
p_true = rand(Beta(β_α, β_β))
# Ground truth parameters for r
γ_α = 5.0
γ_β = 0.1
r_true = rand(GammaShapeRate(γ_α, γ_β))
# Simulate the data
data = rand(NegativeBinomial(r_true, p_true), n_samples)
initialization = @initialization begin
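# With the mean-field constraint below, initialize the posterior marginals q(r) and q(p) instead of the messages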
q(r) = GammaShapeRate(1.0, 1.0)
q(p) = Beta(1.0, 1.0)
end
constraints = @constraints begin
q(r) :: ProjectedTo(Gamma)
q(p) :: ProjectedTo(Beta)
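# Factorize the joint posterior as q(r, p) = q(r)q(p), so each update only needs the other marginal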
q(r, p) = MeanField()
end
# Run inference
result = infer(
model = simple_nb(γ_α=γ_α, γ_β=γ_β, β_α=β_α, β_β=β_β),
data = (data = data,),
initialization = initialization,
constraints = constraints,
options = (
rulefallback = NodeFunctionRuleFallback(),
),
iterations = 10
)

The inference result I get with this code is a lot better than when we approximate the messages first and the posterior marginals afterwards, so hopefully this method is a bit more useful.
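As a quick check (a minimal sketch, assuming the usual RxInfer result layout in which result.posteriors[:r] holds one posterior per iteration), you can compare the final posteriors against the ground truth:

```julia
# Take the posterior from the last iteration for each variable
q_r = last(result.posteriors[:r])
q_p = last(result.posteriors[:p])

# Compare posterior means against the simulated ground truth
println("r: true = ", r_true, ", posterior mean = ", mean(q_r))
println("p: true = ", p_true, ", posterior mean = ", mean(q_p))
```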
---
It has been a fascinating journey to understand enough of RxInfer's mathematical background. Last time, I couldn't understand @wouterwln's suggestions, but after many hours of work I think I now understand at least the basics of message passing and variational message passing. After this long journey, I came back to my original problem. I was able to derive the variational message passing rule for the newly defined negative binomial node on the p edge:

@node Distributions.NegativeBinomial Stochastic [out, r, p]
@rule NegativeBinomial(:p, Marginalisation) (
q_out::PointMass, q_r::Any
) = begin
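# Under Distributions' parametrization, log NB(x | r, p) = const + r*log(p) + x*log(1 - p),
# so taking the expectation over q(r) gives the log-kernel of a Beta(mean(q_r) + 1, x + 1) message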
# Get mean of the counts
x = mean(q_out)
# Get mean of q(r)
r = mean(q_r)
# Returns Beta distribution with parameters (1 + r, 1 + x)
return Beta(one(r) + r, one(x) + x)
end

However, regardless of how much I tried, I think the message on the r edge does not have a closed form. I got excited about the idea behind the projection approach suggested above, so I set up my model as follows:

@model function negbinom_model(
counts,
r_prior_shape,
r_prior_rate,
p_prior_shape,
p_prior_rate,
)
# Prior for the Gamma shape parameter
r ~ GammaShapeRate(r_prior_shape, r_prior_rate)
# Prior for the Beta shape parameter
p ~ Beta(p_prior_shape, p_prior_rate)
# Loop over each count
for i in eachindex(counts)
counts[i] ~ NegativeBinomial(r, p)
end # for
end # negbinom_model

I then generate synthetic data to test it:

Random.seed!(42)
# Define number of samples
n_samples = 1_000
# Define ground truth parameters for each r
r_α = 5.0
r_β = 0.1
r_true = rand(RxInfer.GammaShapeRate(r_α, r_β))
# Define prior parameters
p_α = 1.0
p_β = 1.0
p_true = rand(RxInfer.Beta(p_α, p_β))
# Simulate the data
counts = rand(NegativeBinomial(r_true, p_true), n_samples)
# Store data in dictionary
data = Dict(:counts => counts)

I then run the inference with the following initialization and constraints:

# Initialize variational distributions
initialization = @initialization begin
q(r) = vague(Gamma)
q(p) = vague(Beta)
end
# Define constraints
constraints = @constraints begin
q(r)::ProjectedTo(Gamma)
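# Only q(r) gets a projection constraint; the hand-derived rule above already yields a Beta message for p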
q(r, p) = MeanField()
end
# Run inference
results = infer(
model=negbinom_model(
r_prior_shape=r_α,
r_prior_rate=r_β,
p_prior_shape=p_α,
p_prior_rate=p_β,
),
data=data,
initialization=initialization,
constraints=constraints,
options=(
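# No closed-form rule exists for the r edge, so fall back to rules based on the node's density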
rulefallback=NodeFunctionRuleFallback(),
),
)

But the inference looks terrible. It is not a subtle failure; it completely misses the ground truth. Do you have any suggestions on whether this can be improved? I tried changing the number of iterations used for the projection,

ProjectedTo(Gamma; parameters = ProjectionParameters(niterations = 500))

or the tolerance,

ProjectedTo(Gamma; parameters = ProjectionParameters(tolerance = 1e-2))

but nothing worked. Is it simply that this likelihood is not amenable to this approach?
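For completeness, this is roughly how those projection parameters slot into the constraints block (a sketch combining the two variants above; I have not re-checked which keyword values actually help):

```julia
constraints = @constraints begin
    # Custom projection settings for q(r): more iterations and a custom tolerance
    q(r)::ProjectedTo(Gamma; parameters = ProjectionParameters(niterations = 500, tolerance = 1e-2))
    q(r, p) = MeanField()
end
```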
---
Dear RxInfer team,
I’m trying to implement a simple Negative Binomial model using RxInfer.jl, but I’m encountering an error when running inference. Despite reviewing the initialization section in the documentation, I’m unsure how to proceed.
Code to reproduce the issue:
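(A minimal sketch of the setup in question, reconstructed from the model echoed in the replies above; the prior values and data here are placeholders, and the exact original snippet may have differed:)

```julia
using RxInfer, Distributions

# Register the Negative Binomial distribution as a stochastic node
@node Distributions.NegativeBinomial Stochastic [out, r, p]

@model function simple_nb(data, γ_α, γ_β, β_α, β_β)
    r ~ GammaShapeRate(γ_α, γ_β)    # prior for r
    p ~ Beta(β_α, β_β)              # prior for p
    data .~ NegativeBinomial(r, p)  # Negative Binomial likelihood
end

# Placeholder priors and data; infer is called here without any initialization
result = infer(
    model = simple_nb(γ_α = 5.0, γ_β = 0.1, β_α = 1.0, β_β = 1.0),
    data = (data = rand(NegativeBinomial(50.0, 0.5), 100),),
)
```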
Error message:
What I’ve tried:
Question:
Could you please help me understand why this error occurs and how to initialize the variables p and r for this model properly? Any guidance on correctly setting up the inference for the Negative Binomial model would be greatly appreciated.