Half-edge error in switching model #371
---
I also tried:

```julia
if is_at_A[i]
    x[i] ~ Normal(; mean = A * x_prev + B * x_A_prev, precision = 10)
else
    x[i] ~ Normal(; mean = A * x_prev + B * x_B_prev, precision = 10)
end
```

which results in the same error.
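For reference, the same gate can also be written without branching, mirroring the `compute_term` approach used later in the thread (a sketch, assuming `is_at_A[i]` is a plain 0/1 value; not a confirmed fix for the half-edge error):

```julia
# Branch-free equivalent of the if/else above when is_at_A[i] ∈ {0, 1}
x[i] ~ Normal(; mean = A * x_prev + B * (is_at_A[i] * x_A_prev + (1 - is_at_A[i]) * x_B_prev), precision = 10)
```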
---
I updated my model (still the same error), but I think it is nicer to read:

```julia
@model function split(x0, x0_A, x0_B, y, is_at_A, A, B)
    P ~ Gamma(; shape = 0.001, scale = 0.001)
    x_A[1] ~ x0_A
    x_B[1] ~ x0_B
    for i in 2:length(y)
        x_A[i] ~ Normal(; mean = x_A[i - 1], precision = 10) # random walk
        x_B[i] ~ Normal(; mean = x_B[i - 1], precision = 10) # random walk
    end
    x[1] ~ x0
    for i in 2:length(y)
        if is_at_A[i - 1]
            x[i] ~ Normal(; mean = A * x[i - 1] + B * x_A[i - 1], precision = 10)
        else
            x[i] ~ Normal(; mean = A * x[i - 1] + B * x_B[i - 1], precision = 10)
        end
        y[i] ~ Normal(; mean = x[i], precision = P)
    end
end

A = 0.9
B = 0.1

constraints = @constraints begin
    q(x, x_A, x_B, y, P) = q(x, x_A, x_B)q(P)q(y)
end

x0_prior = NormalMeanVariance(0.0, 1000.0)
x0_A_prior = NormalMeanVariance(0.0, 1000.0)
x0_B_prior = NormalMeanVariance(0.0, 1000.0)

initialization = @initialization begin
    q(P) = Gamma(0.001, 0.001)
end

result = infer(;
    model = split(; x0 = x0_prior, x0_A = x0_A_prior, x0_B = x0_B_prior, is_at_A = is_at_A, A = A, B = B),
    data = (y = y,),
    constraints = constraints,
    initialization = initialization,
    returnvars = (x = KeepLast(), x_A = KeepLast(), x_B = KeepLast()),
    iterations = 20,
    options = (limit_stack_depth = 100,),
)
```
---
Ok, I now have something that diverges:

```julia
@model function split(xA0, xB0, x0, y, is_at_A)
    P ~ Gamma(; shape = 0.001, scale = 0.001)
    xA_prior ~ Normal(; mean = mean(xA0), var = var(xA0))
    xB_prior ~ Normal(; mean = mean(xB0), var = var(xB0))
    x_prior ~ Normal(; mean = mean(x0), var = var(x0))
    local xA, xB
    xA_prev = xA_prior
    xB_prev = xB_prior
    for i in 1:length(y)
        xA[i] ~ Normal(; mean = xA_prev, precision = 20)
        xB[i] ~ Normal(; mean = xB_prev, precision = 20)
        xA_prev = xA[i]
        xB_prev = xB[i]
    end
    local x
    x_prev = x_prior
    for i in 1:length(y)
        x[i] := 0.999 * x_prev + 0.001 * (is_at_A[i] * xA[i] + (1 - is_at_A[i]) * xB[i]) # TODO: indices are wrong
        y[i] ~ Normal(; mean = x[i], precision = P)
        x_prev = x[i]
    end
end

constraints = @constraints begin
    q(xA_prior, xA, xB_prior, xB, y, x, x_prior, P) = q(xA_prior, xA, xB_prior, xB, x, x_prior)q(P)q(y)
end

xA0_prior = NormalMeanVariance(0.0, 10.0)
xB0_prior = NormalMeanVariance(0.0, 10.0)
x0_prior = NormalMeanVariance(0.0, 10.0)

initm = @initialization begin
    q(P) = Gamma(0.001, 0.001)
    # Initialize the marginals
    q(xA) = vague(NormalMeanPrecision)
    q(xB) = vague(NormalMeanPrecision)
    q(x) = vague(NormalMeanPrecision)
    # Initialize the messages
    μ(xA) = vague(NormalMeanPrecision)
    μ(xB) = vague(NormalMeanPrecision)
    μ(x) = vague(NormalMeanPrecision)
end

result = infer(;
    model = split(; xA0 = xA0_prior, xB0 = xB0_prior, x0 = x0_prior),
    data = (y = y, is_at_A = is_at_A),
    constraints = constraints,
    initialization = initm,
    returnvars = (xA = KeepLast(), xB = KeepLast()),
    iterations = 10,
    options = (limit_stack_depth = 250,),
    # free_energy=true, # ERROR: Failed to compute variable bound free energy component for `name = anonymous_var_graphppl, index = nothing, linked to (NodeData in context with properties name = is_at_A, index = 1 with extra:
)
```

I would appreciate any advice.
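One possible reading of the commented-out `free_energy` error: the message points at an `anonymous_var_graphppl` linked to `is_at_A`, which suggests the inline gate expression creates anonymous deterministic nodes around the `is_at_A` data points. The reply below sidesteps this by moving the gate into a named function handled as a delta node.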
---
I will take a look at your model this week. Cheers
---
@ValentinKaisermayer I've had a lazy attempt at fixing it. For starters, you can simplify the constraints: we only need to specify the factors that need to be factorised. I thought going with a delta node would work just fine, but I am getting an error with the following:

```julia
using RxInfer
using ExponentialFamilyProjection

generated_xA = cumsum(vcat(2, randn(99)))
generated_xB = cumsum(vcat(-10, randn(99)))
generated_is_at_A = float.(rand(Bool, 100))
generated_x = generated_is_at_A .* generated_xA .+ (1 .- generated_is_at_A) .* generated_xB
generated_y = generated_x .+ randn(100)

function compute_term(is_at_A, x_A, x_B)
    return is_at_A * x_A + (1 - is_at_A) * x_B
end
```
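In equation form, `compute_term` implements the gate $\mathrm{term}_i = s_i \, x_{A,i} + (1 - s_i) \, x_{B,i}$, where $s_i =$ `is_at_A[i]` $\in \{0, 1\}$, so exactly one of the two random walks feeds the latent state at each step. Keeping the gate as a named function (rather than an inline expression) gives the resulting deterministic node an explicit identity in the graph.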
```julia
@model function split(y, is_at_A, x0, x0_A, x0_B, A, B)
    P ~ Gamma(shape = 1.0, rate = 1.0)
    x_prior ~ Normal(mean = mean(x0), var = var(x0))
    x_A_prior ~ Normal(mean = mean(x0_A), var = var(x0_A))
    x_B_prior ~ Normal(mean = mean(x0_B), var = var(x0_B))
    x_prev = x_prior
    x_A_prev = x_A_prior
    x_B_prev = x_B_prior
    for i in 1:length(y)
        x_A[i] ~ Normal(mean = x_A_prev, precision = 1.0)
        x_B[i] ~ Normal(mean = x_B_prev, precision = 1.0)
        term[i] := compute_term(is_at_A[i], x_A[i], x_B[i])
        x[i] ~ Normal(mean = A * x_prev + B * term[i], precision = 1.0)
        y[i] ~ Normal(mean = x[i], precision = P)
        x_prev = x[i]
        x_A_prev = x_A[i]
        x_B_prev = x_B[i]
    end
end
```
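Because `compute_term` is an arbitrary Julia function, it enters the factor graph as a delta node, and the `@meta` block below assigns it a `Linearization()` approximation so that Gaussian messages can pass through it.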
```julia
constraints = @constraints begin
    q(x, P) = q(x)q(P)
end

xA0_prior = NormalMeanVariance(0.0, 1.0)
xB0_prior = NormalMeanVariance(0.0, 1.0)
x0_prior = NormalMeanVariance(0.0, 1.0)

initm = @initialization begin
    q(P) = GammaShapeRate(1.0, 1.0)
    # Initialize the messages
    μ(x_A) = NormalMeanPrecision(0.0, 1.0)
    μ(x_B) = NormalMeanPrecision(0.0, 1.0)
end

meta = @meta begin
    compute_term() -> Linearization()
end

result = infer(
    model = split(x0 = x0_prior, x0_A = xA0_prior, x0_B = xB0_prior, A = 1.0, B = 1.0),
    data = (y = generated_y, is_at_A = generated_is_at_A),
    constraints = constraints,
    initialization = initm,
    meta = meta,
    iterations = 10,
    options = (limit_stack_depth = 250,),
)
```

Given that you are interested in `x_A` and `x_B`, the following trick may work for you, although I am afraid that it won't scale very well.
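If the inference runs, the posterior marginals can be pulled out along these lines (a sketch; with `iterations = 10` and no `returnvars`, `result.posteriors[:x_A]` holds one entry per iteration, so the last entry is taken):

```julia
# Per-time-step marginals of the two random walks after the final iteration
q_xA = result.posteriors[:x_A][end]
q_xB = result.posteriors[:x_B][end]
mean.(q_xA)  # posterior means of x_A over time
```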
---
I have a measurement signal that measures either quantity A or quantity B. An additional signal tells me which one it is. However, there is a dynamic in the system, so switching between the two takes some time.
So my model is two random walks for the original quantities:

$$x_{A,t} \sim \mathcal{N}(x_{A,t-1}, \tau^{-1}), \qquad x_{B,t} \sim \mathcal{N}(x_{B,t-1}, \tau^{-1})$$
The model for combining the two:

$$x_t \sim \mathcal{N}\left(A\, x_{t-1} + B\,\big(s_t\, x_{A,t} + (1 - s_t)\, x_{B,t}\big),\ \tau^{-1}\right)$$

where $s_t \in \{0, 1\}$ indicates which quantity is currently being measured.
And finally, my noisy observations:

$$y_t \sim \mathcal{N}(x_t, P^{-1})$$
However, I have problems implementing this in RxInfer.
Note that my simulated data is without the dynamics, but that should not matter.
I'm getting a half-edge error.
Thx in advance!