Switch to EliminateDiscrete for max-product #1362

Merged
merged 52 commits from hybrid/test_with_evaluate into hybrid/elimination on Jan 4, 2023

Conversation

@dellaert (Member) commented on Jan 1, 2023

  • added tiny example in c++
  • made SumFrontals a method
  • added test for SumFrontals
  • added test for elimination in python, and it works (now we need to move to the next simplest example):
  • below shows the estimate of the marginals, which matches the eliminated result! And the ratios of the posterior resulting from elimination also match (a standalone sketch of this ratio check follows the output below)! Awesome job :-)
True mode: 0
P(mode=0; z0, z1) = 0.776760546738582
P(mode=1; z0, z1) = 0.22323945326141803
HybridBayesNet
 
size: 2
conditional 0: Hybrid  P( x0 | m0)
 Discrete Keys = (m0, 2), 
 Choice(m0) 
 0 Leaf  p(x0)
  R = [ 2.83549 ]
  d = [ 33.4989 ]
  mean: 1 elements
  x0: 11.8142
  No noise model

 1 Leaf  p(x0)
  R = [ 0.512076 ]
  d = [ 5.53226 ]
  mean: 1 elements
  x0: 10.8036
  No noise model

conditional 1:  P( m0 ):
 Choice(m0) 
 0 Leaf 0.76887015
 1 Leaf 0.23112985

expected_ratio: 9.889012746635817

Ratio: 9.889012746635933
Ratio: 9.889012746636144
Ratio: 9.889012746635778
Ratio: 9.889012746633508
Ratio: 9.889012746635919
Ratio: 9.889012746635949
Ratio: 9.889012746633531
Ratio: 9.889012746633517
Ratio: 9.889012746633536
Ratio: 9.889012746635643
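
For reference, here is a standalone C++ sketch (not GTSAM code; all numbers are made up) of why a constant ratio is the right check: for any sample (x0, m0), the unnormalized posterior divided by the normalized posterior obtained from elimination equals the evidence P(z), independent of the sample.

```cpp
// Standalone illustration of the ratio check; all values are made up.
#include <cmath>
#include <cstdio>
#include <initializer_list>

// N(x; mu, sigma^2)
double gauss(double x, double mu, double sigma) {
  const double kSqrt2Pi = 2.5066282746310002;
  const double d = (x - mu) / sigma;
  return std::exp(-0.5 * d * d) / (sigma * kSqrt2Pi);
}

int main() {
  const double priorM[2] = {0.5, 0.5};      // P(m0)
  const double muX[2] = {10.0, 12.0};       // mode-dependent prior mean of x0
  const double sigmaX = 0.5, sigmaZ = 3.0;  // prior and measurement sigmas
  const double z = 11.0;                    // a single measurement of x0

  // Evidence P(z) = sum_m P(m) * N(z; muX[m], sigmaX^2 + sigmaZ^2).
  const double sigmaPred = std::sqrt(sigmaX * sigmaX + sigmaZ * sigmaZ);
  double evidence = 0.0;
  for (int m : {0, 1}) evidence += priorM[m] * gauss(z, muX[m], sigmaPred);

  // Normalized posterior p(x0, m0 | z) = P(m0 | z) * p(x0 | m0, z),
  // i.e. what elimination produces.
  auto posterior = [&](double x, int m) {
    const double pM = priorM[m] * gauss(z, muX[m], sigmaPred) / evidence;
    const double lambda = 1.0 / (sigmaX * sigmaX) + 1.0 / (sigmaZ * sigmaZ);
    const double muPost =
        (muX[m] / (sigmaX * sigmaX) + z / (sigmaZ * sigmaZ)) / lambda;
    return pM * gauss(x, muPost, 1.0 / std::sqrt(lambda));
  };

  // Unnormalized posterior f(x0, m0) = P(m0) * p(x0 | m0) * p(z | x0).
  auto unnormalized = [&](double x, int m) {
    return priorM[m] * gauss(x, muX[m], sigmaX) * gauss(z, x, sigmaZ);
  };

  // The ratio is the same for every (x0, m0), exactly the behavior in the log.
  for (double x : {9.0, 10.5, 11.8, 13.0})
    for (int m : {0, 1})
      std::printf("Ratio: %.12f (evidence: %.12f)\n",
                  unnormalized(x, m) / posterior(x, m), evidence);
  return 0;
}
```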

@dellaert (Member, Author) commented on Jan 1, 2023

Not sure why it works yet; we should talk.
The switch to EliminateDiscrete in this PR was key, I think. And maybe your constant, although was that already used?
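
For context, a minimal sketch of calling EliminateDiscrete on a tiny discrete factor graph; the factor values are made up, and the call is the sum-product discrete elimination declared in gtsam/discrete/DiscreteFactorGraph.h.

```cpp
// Toy example; the factor values are made up.
#include <gtsam/discrete/DiscreteFactorGraph.h>
#include <gtsam/inference/Ordering.h>

using namespace gtsam;

int main() {
  DiscreteKey m0(0, 2);  // binary mode with key 0

  DiscreteFactorGraph graph;
  graph.add(m0, "0.6 0.4");  // prior-like factor on m0
  graph.add(m0, "0.7 0.3");  // e.g. the factor produced by eliminating x0

  Ordering ordering;
  ordering.push_back(m0.first);

  // Sum-product elimination: returns a DiscreteConditional on the frontal
  // key and a factor on the separator (empty here, so it is a constant).
  auto result = EliminateDiscrete(graph, ordering);
  result.first->print("P(m0): ");
  return 0;
}
```

For the max-product variant in the PR title, the analogous entry point is, if I remember correctly, EliminateForMPE in the same header.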

@dellaert (Member, Author) commented on Jan 1, 2023

@varunagrawal So, I thought the code would now be correct, but it's not working out. The ratios also stopped working. Maybe we can investigate together. In a nutshell: in the C++ code, because we only use one measurement, the discrete factor in the final posterior should be exactly equal to the prior. That means the factor computed in the first elimination step should carry zero information: a DiscreteFactor with two equal leaves. But the constants do not add up to the same value (which is what is required). A toy sketch of this zero-information check follows.
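
To make the zero-information statement concrete, here is a toy sketch with made-up numbers: if the constants contributed by each mode were equal, the discrete factor from the first elimination step would have two equal leaves, and multiplying it into the prior and renormalizing would give back the prior exactly.

```cpp
#include <iostream>
#include <gtsam/discrete/DecisionTreeFactor.h>

using namespace gtsam;

int main() {
  DiscreteKey m0(0, 2);
  DecisionTreeFactor prior(m0, "0.6 0.4");
  // Discrete factor from the first elimination step; two equal leaves
  // carry zero information about the mode.
  DecisionTreeFactor fromElimination(m0, "0.5 0.5");

  DecisionTreeFactor product = prior * fromElimination;

  DiscreteValues mode0, mode1;
  mode0[m0.first] = 0;
  mode1[m0.first] = 1;
  const double total = product(mode0) + product(mode1);
  // Normalizing recovers the prior exactly: 0.6 and 0.4.
  std::cout << product(mode0) / total << " " << product(mode1) / total << "\n";
  return 0;
}
```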

@varunagrawal (Collaborator) commented:

> @varunagrawal So, I thought the code would now be correct, but it's not working out. The ratios also stopped working. Maybe we can investigate together. In a nutshell: in the C++ code, because we only use one measurement, the discrete factor in the final posterior should be exactly equal to the prior. That means the factor computed in the first elimination step should carry zero information: a DiscreteFactor with two equal leaves. But the constants do not add up to the same value (which is what is required).

I guess the issue with this was the assumption that one measurement wasn't informative enough to distinguish the modes? If yes, then glad we figured it out today. Making the note here for posterity.

@varunagrawal (Collaborator) left a comment:

I'm making a PR to address some of these things and fix the tests.

Review thread on gtsam/hybrid/GaussianMixtureFactor.cpp (outdated, resolved)
@@ -172,6 +169,16 @@ class GTSAM_EXPORT GaussianMixture
*/
double error(const HybridValues &values) const override;

// /// Calculate probability density for given values `x`.

Collaborator comment:

Are we keeping these for the future? My intuition says yes, but this would probably move to a base class.

Review thread on gtsam/hybrid/HybridBayesNet.cpp (outdated, resolved)
Review thread on gtsam/hybrid/GaussianMixtureFactor.h (outdated, resolved)
Review thread on gtsam/hybrid/HybridGaussianFactorGraph.cpp (outdated, resolved)
};
DecisionTree<Key, double> fdt(separatorFactors, factorProb);
// Normalize the values of decision tree to be valid probabilities

Collaborator comment:

We need this. Or will normalization be handled later?
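
If the normalization does belong here, a minimal sketch of one way to do it, assuming DecisionTree::fold and DecisionTree::apply are available with the (leaf, accumulator) and unary-op signatures shown; the tree contents below are made up.

```cpp
#include <iostream>
#include <gtsam/discrete/DecisionTree-inl.h>
#include <gtsam/inference/Key.h>

using namespace gtsam;

int main() {
  // Toy tree over a binary key with unnormalized leaf values,
  // standing in for fdt above.
  DecisionTree<Key, double> fdt(Key(0), 0.3, 0.1);

  // Sum the leaves, then divide each leaf by the total so they
  // form valid probabilities.
  const double total =
      fdt.fold([](const double& leaf, double acc) { return acc + leaf; }, 0.0);
  DecisionTree<Key, double> normalized =
      fdt.apply([total](const double& leaf) { return leaf / total; });

  Assignment<Key> assignment;
  assignment[0] = 0;
  std::cout << normalized(assignment) << "\n";  // 0.75
  return 0;
}
```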

Review thread on gtsam/hybrid/HybridGaussianFactorGraph.cpp (resolved)
Review thread on gtsam/hybrid/HybridGaussianFactorGraph.cpp (resolved)
Review thread on gtsam/hybrid/tests/testHybridEstimation.cpp (resolved)
Review thread on gtsam/hybrid/HybridConditional.cpp (resolved)
@varunagrawal mentioned this pull request on Jan 3, 2023
@varunagrawal (Collaborator) commented:

@dellaert I am merging this.

@varunagrawal merged commit 5cdff9e into hybrid/elimination on Jan 4, 2023
@varunagrawal deleted the hybrid/test_with_evaluate branch on January 4, 2023 06:37
@dellaert added this to the Hybrid Inference milestone on Feb 7, 2023