HybridFactorGraph base class and other niceties #1221

Merged (93 commits, Aug 24, 2022)

Commits (changes shown are from 22 commits)
2927d92
add HybridNonlinearFactor and nonlinear HybridFactorGraph
varunagrawal May 28, 2022
78ea90b
Add MixtureFactor for nonlinear factor types
varunagrawal May 28, 2022
e91a354
convert to cpp
varunagrawal May 29, 2022
9cbd2ef
Base Hybrid Factor Graph
varunagrawal May 29, 2022
01b9a65
make GaussianMixtureFactor a subclass of HybridGaussianFactor
varunagrawal May 29, 2022
fe0d666
HybridFactorGraph fixes
varunagrawal May 29, 2022
cdd030b
Make MixtureFactor only work with NonlinearFactors and make some impr…
varunagrawal May 29, 2022
08fab8a
HybridNonlinearFactor linearize method
varunagrawal May 29, 2022
9279bd6
push_back for GaussianHybridFactor
varunagrawal May 29, 2022
3274cb1
clean up testHybridFactorGraph, need to add more tests
varunagrawal May 29, 2022
6c36b2c
GaussianHybridFactorGraph inherits from HybridFactorGraph
varunagrawal May 29, 2022
85f4b48
Improvements to GaussianHybridFactorGraph, make MixtureFactor a subcl…
varunagrawal May 30, 2022
53e8c32
Add NonlinearHybridFactorGraph class
varunagrawal May 30, 2022
7e18277
fix base class
varunagrawal May 30, 2022
119679a
linearize returns object instead of pointer
varunagrawal May 30, 2022
3212dde
remove unneeded method
varunagrawal May 30, 2022
0c16799
GaussianMixtureFactor inherits from HybridFactor
varunagrawal May 30, 2022
e711a62
More tests working
varunagrawal May 30, 2022
9e737db
initial pruning method
varunagrawal Jun 3, 2022
89768cf
record continuous keys separately
varunagrawal Jun 3, 2022
ad77a45
formatting and docs update
varunagrawal Jun 3, 2022
c2e5061
add pruning to HybridBayesNet
varunagrawal Jun 3, 2022
e25b0c8
Merge branch 'develop' into hybrid/pruning
varunagrawal Jun 3, 2022
28db8b2
use KeyVector and iterator constructor
varunagrawal Jun 4, 2022
5c5c053
Add HybridFactorGraph base class and more methods for adding gaussian…
varunagrawal Jun 7, 2022
44079d1
refactor testGaussianHybridFactorGraph to include comments and valid …
varunagrawal Jun 7, 2022
374e3cb
Improved hybrid bayes net and tests
varunagrawal Jun 7, 2022
8d6a225
fix printing
varunagrawal Jun 7, 2022
00be610
Add Switching class and make it linear
varunagrawal Jun 7, 2022
26e0cef
fixes
varunagrawal Jun 7, 2022
5bf5d03
local include
varunagrawal Jun 8, 2022
8d35925
make makeBinaryOrdering inline to prevent multiple definition error
varunagrawal Jun 8, 2022
622ebdd
makeSwitchingChain is also inline
varunagrawal Jun 8, 2022
09d512a
add docstring for makeBinaryOrdering
varunagrawal Jun 10, 2022
7496f2f
address review comments
varunagrawal Jun 10, 2022
f1a1f1c
Merge branch 'develop' into hybrid/pruning
varunagrawal Jun 10, 2022
dfbfca9
Merge branch 'develop' into hybrid/hybrid-factor-graph
varunagrawal Jun 20, 2022
91da520
add new lines
varunagrawal Jun 20, 2022
34298c4
Merge branch 'hybrid/hybrid-factor-graph' into hybrid/pruning
varunagrawal Jul 10, 2022
3780b8c
Merge branch 'develop' into feature/nonlinear-hybrid
varunagrawal Jul 26, 2022
8ddc2ea
rename to HybridNonlinearFactorGraph
varunagrawal Jul 26, 2022
7a55341
add IsGaussian template check
varunagrawal Jul 26, 2022
43c28e7
renaming fixes
varunagrawal Jul 26, 2022
987448f
remove derived push_back in HybridNonlinearFactorGraph and HybridFact…
varunagrawal Jul 26, 2022
8471c97
add nonlinear switching system tests
varunagrawal Jul 30, 2022
8907922
get more nonlinear tests to work and make some updates
varunagrawal Aug 1, 2022
16124f3
get all but 2 tests passing
varunagrawal Aug 2, 2022
ee124c3
fix discrete only elimination (use EliminateForMPE)
varunagrawal Aug 2, 2022
b39c231
all tests pass!!!
varunagrawal Aug 2, 2022
0f732d7
fix discrete conditional test
varunagrawal Aug 3, 2022
6670779
Wrap DiscreteLookupTable
varunagrawal Aug 3, 2022
f124ccc
Merge branch 'develop' into hybrid/hybrid-factor-graph
varunagrawal Aug 3, 2022
2fb11db
Merge branch 'develop' into hybrid/pruning
varunagrawal Aug 3, 2022
92a5868
Merge branch 'develop' into feature/nonlinear-hybrid
varunagrawal Aug 3, 2022
db56909
Merge branch 'hybrid/pruning' into feature/nonlinear-hybrid
varunagrawal Aug 3, 2022
5965d8f
change discrete key variable from C to M
varunagrawal Aug 8, 2022
f5e046f
split HybridNonlinearFactorGraph to .h and .cpp
varunagrawal Aug 8, 2022
51d2f07
fix printing and key bug in MixtureFactor linearize
varunagrawal Aug 8, 2022
a3eacaa
fix adding priors in Switching
varunagrawal Aug 8, 2022
588f56e
HybridGaussianISAM unit tests
varunagrawal Aug 8, 2022
4ee23cf
Merge branch 'hybrid/hybrid-factor-graph' into hybrid/pruning
varunagrawal Aug 8, 2022
60c88e3
fix print tests
varunagrawal Aug 10, 2022
4ea897c
cleaner printing
varunagrawal Aug 10, 2022
fbceda3
got some more tests working
varunagrawal Aug 10, 2022
0e4db30
use templetized constructor for MixtureFactor
varunagrawal Aug 10, 2022
aa48658
more tests running
varunagrawal Aug 12, 2022
77bea31
one more test passing
varunagrawal Aug 13, 2022
5806850
HybridValues and optimize() method for hybrid gaussian bayes net
xsj01 Aug 14, 2022
36d6097
Merge branch 'hybrid/hybrid-factor-graph' into hybrid/hybrid-optimize
xsj01 Aug 15, 2022
9564e32
Merge branch 'hybrid/hybrid-factor-graph' into hybrid/hybrid-optimize
xsj01 Aug 15, 2022
7d36a9e
add some comments
xsj01 Aug 15, 2022
2a974a4
Address review comments
varunagrawal Aug 15, 2022
ac20cff
add incremental pruning to HybridGaussianISAM
varunagrawal Aug 16, 2022
379a65f
Address review comments
xsj01 Aug 16, 2022
83b8103
last test to get running
varunagrawal Aug 17, 2022
746ca78
Address review comments
xsj01 Aug 18, 2022
c4184e1
fix error in python unit test
xsj01 Aug 18, 2022
7977f77
Merge pull request #1263 from borglab/feature/nonlinear-hybrid
varunagrawal Aug 21, 2022
07f0101
check subset rather than equality for GaussianISAM pruning
varunagrawal Aug 21, 2022
29c19ee
handle HybridConditional and explicitly set Gaussian Factor Graphs to…
varunagrawal Aug 21, 2022
f6df641
remove custom orderings, let it happen automatically
varunagrawal Aug 21, 2022
6b792c0
add note about sumFrontals
varunagrawal Aug 21, 2022
893c5f7
cast to only HybridGaussianFactor
varunagrawal Aug 22, 2022
ef066a0
Merge pull request #1270 from borglab/hybrid/hybrid-optimize
xsj01 Aug 22, 2022
4c9c106
Merge pull request #1271 from borglab/feature/nonlinear-incremental
varunagrawal Aug 22, 2022
84456f4
Merge pull request #1273 from borglab/hybrid-incremental
varunagrawal Aug 22, 2022
05b1174
Merge pull request #1277 from borglab/feature/nonlinear-hybrid
varunagrawal Aug 22, 2022
8fd6091
add new line
varunagrawal Aug 22, 2022
b07964b
name file correctly in doc string
varunagrawal Aug 22, 2022
7227965
Merge branch 'develop' into hybrid/pruning
varunagrawal Aug 22, 2022
89cdf4f
Merge pull request #1215 from borglab/hybrid/pruning
varunagrawal Aug 23, 2022
aa52b39
fix docstring
varunagrawal Aug 23, 2022
df64982
add extra tests
varunagrawal Aug 23, 2022

Diff view

2 changes: 1 addition & 1 deletion gtsam/hybrid/GaussianMixtureFactor.h
@@ -108,7 +108,7 @@ class GTSAM_EXPORT GaussianMixtureFactor : public HybridFactor {
bool equals(const HybridFactor &lf, double tol = 1e-9) const override;

void print(
const std::string &s = "HybridFactor\n",
const std::string &s = "GaussianMixtureFactor\n",
const KeyFormatter &formatter = DefaultKeyFormatter) const override;
/// @}

36 changes: 36 additions & 0 deletions gtsam/hybrid/HybridBayesNet.cpp
@@ -10,7 +10,43 @@
* @file HybridBayesNet.cpp
* @brief A bayes net of Gaussian Conditionals indexed by discrete keys.
* @author Fan Jiang
* @author Varun Agrawal
* @author Shangjie Xue
* @date January 2022
*/

#include <gtsam/hybrid/HybridBayesNet.h>
#include <gtsam/hybrid/HybridValues.h>
#include <gtsam/hybrid/HybridLookupDAG.h>

namespace gtsam {

/* ************************************************************************* */
GaussianMixture::shared_ptr HybridBayesNet::atGaussian(size_t i) const {
return boost::dynamic_pointer_cast<GaussianMixture>(factors_.at(i)->inner());
}

/* ************************************************************************* */
DiscreteConditional::shared_ptr HybridBayesNet::atDiscrete(size_t i) const {
return boost::dynamic_pointer_cast<DiscreteConditional>(
factors_.at(i)->inner());
}

/* ************************************************************************* */
GaussianBayesNet HybridBayesNet::choose(
const DiscreteValues &assignment) const {
GaussianBayesNet gbn;
for (size_t idx = 0; idx < size(); idx++) {
GaussianMixture gm = *this->atGaussian(idx);
gbn.push_back(gm(assignment));
}
return gbn;
}

/* *******************************************************************************/
HybridValues HybridBayesNet::optimize() const {
auto dag = HybridLookupDAG::FromBayesNet(*this);
return dag.argmax();
}

} // namespace gtsam
31 changes: 31 additions & 0 deletions gtsam/hybrid/HybridBayesNet.h
@@ -18,7 +18,9 @@
#pragma once

#include <gtsam/hybrid/HybridConditional.h>
#include <gtsam/hybrid/HybridValues.h>
#include <gtsam/inference/BayesNet.h>
#include <gtsam/linear/GaussianBayesNet.h>

namespace gtsam {

@@ -36,6 +38,35 @@ class GTSAM_EXPORT HybridBayesNet : public BayesNet<HybridConditional> {

/** Construct empty bayes net */
HybridBayesNet() = default;

/// Add HybridConditional to Bayes Net
using Base::add;

/// Add a discrete conditional to the Bayes Net.
void add(const DiscreteKey &key, const std::string &table) {

Review comment (Collaborator): I am not in favor of more helpers, do we have similar stuff in DiscreteBayesNet?

Reply (Collaborator, Author): We'll need this since we need to wrap the DiscreteConditional into a HybridConditional.

push_back(
HybridConditional(boost::make_shared<DiscreteConditional>(key, table)));
}

/// Get a specific Gaussian mixture by index `i`.
GaussianMixture::shared_ptr atGaussian(size_t i) const;

/// Get a specific discrete conditional by index `i`.
DiscreteConditional::shared_ptr atDiscrete(size_t i) const;

/**
* @brief Get the Gaussian Bayes Net which corresponds to a specific discrete
* value assignment.
*
* @param assignment The discrete value assignment for the discrete keys.
* @return GaussianBayesNet
*/
GaussianBayesNet choose(const DiscreteValues &assignment) const;

/// Solve the HybridBayesNet by back-substitution.
/// TODO(Shangjie) do we need to create a HybridGaussianBayesNet class, and
/// put this method there?
HybridValues optimize() const;
};

} // namespace gtsam
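
Taken together, the HybridBayesNet additions give a small query API: atGaussian/atDiscrete accessors, choose() to fix a discrete assignment, and optimize() to solve for both discrete and continuous variables. The following is a minimal usage sketch, not part of the PR: the function name, keys, and assignment value are hypothetical, and the Bayes net is assumed to come out of hybrid elimination.

// Sketch only: `hbn` is assumed to be a HybridBayesNet produced by hybrid
// elimination; `mode` is a discrete key appearing in it. Values are made up.
#include <gtsam/discrete/DiscreteKey.h>
#include <gtsam/discrete/DiscreteValues.h>
#include <gtsam/hybrid/HybridBayesNet.h>

using namespace gtsam;

void query(const HybridBayesNet& hbn, const DiscreteKey& mode) {
  // Fix the discrete assignment and extract the corresponding Gaussian
  // Bayes net, then solve that slice by back-substitution.
  DiscreteValues assignment;
  assignment[mode.first] = 1;
  GaussianBayesNet gbn = hbn.choose(assignment);
  VectorValues continuous = gbn.optimize();

  // Or solve jointly over the discrete and continuous variables.
  HybridValues best = hbn.optimize();
}
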
140 changes: 140 additions & 0 deletions gtsam/hybrid/HybridFactorGraph.h
@@ -0,0 +1,140 @@
/* ----------------------------------------------------------------------------

* GTSAM Copyright 2010, Georgia Tech Research Corporation,
* Atlanta, Georgia 30332-0415
* All Rights Reserved
* Authors: Frank Dellaert, et al. (see THANKS for the full author list)

* See LICENSE for the license information

* -------------------------------------------------------------------------- */

/**
* @file HybridFactorGraph.h
* @brief Hybrid factor graph base class that uses type erasure
* @author Varun Agrawal
* @date May 28, 2022
*/

#pragma once

#include <gtsam/discrete/DiscreteFactor.h>
#include <gtsam/hybrid/HybridDiscreteFactor.h>
#include <gtsam/hybrid/HybridFactor.h>
#include <gtsam/inference/FactorGraph.h>
#include <gtsam/inference/Ordering.h>

#include <boost/format.hpp>
namespace gtsam {

using SharedFactor = boost::shared_ptr<Factor>;

/**
* Hybrid Factor Graph
* -----------------------
* This is the base hybrid factor graph.
* Everything inside needs to be hybrid factor or hybrid conditional.
*/
class HybridFactorGraph : public FactorGraph<HybridFactor> {
public:
using Base = FactorGraph<HybridFactor>;
using This = HybridFactorGraph; ///< this class
using shared_ptr = boost::shared_ptr<This>; ///< shared_ptr to This

using Values = gtsam::Values; ///< backwards compatibility
using Indices = KeyVector; ///> map from keys to values

protected:
/// Check if FACTOR type is derived from DiscreteFactor.
template <typename FACTOR>
using IsDiscrete = typename std::enable_if<
std::is_base_of<DiscreteFactor, FACTOR>::value>::type;

/// Check if FACTOR type is derived from HybridFactor.
template <typename FACTOR>
using IsHybrid = typename std::enable_if<
std::is_base_of<HybridFactor, FACTOR>::value>::type;

public:
/// @name Constructors
/// @{

/// Default constructor
HybridFactorGraph() = default;

/**
* Implicit copy/downcast constructor to override explicit template container
* constructor. In BayesTree this is used for:
* `cachedSeparatorMarginal_.reset(*separatorMarginal)`
* */
template <class DERIVEDFACTOR>
HybridFactorGraph(const FactorGraph<DERIVEDFACTOR>& graph) : Base(graph) {}

/// @}

// Allow use of selected FactorGraph methods:
using Base::empty;
using Base::reserve;
using Base::size;
using Base::operator[];
using Base::add;
using Base::push_back;
using Base::resize;

/**
* Add a discrete factor *pointer* to the internal discrete graph
* @param discreteFactor - boost::shared_ptr to the factor to add
*/
template <typename FACTOR>
IsDiscrete<FACTOR> push_discrete(
const boost::shared_ptr<FACTOR>& discreteFactor) {
Base::push_back(boost::make_shared<HybridDiscreteFactor>(discreteFactor));
}

/**
* Add a discrete-continuous (Hybrid) factor *pointer* to the graph
* @param hybridFactor - boost::shared_ptr to the factor to add
*/
template <typename FACTOR>
IsHybrid<FACTOR> push_hybrid(const boost::shared_ptr<FACTOR>& hybridFactor) {
Base::push_back(hybridFactor);
}

/// delete emplace_shared.
template <class FACTOR, class... Args>
void emplace_shared(Args&&... args) = delete;

/// Construct a factor and add (shared pointer to it) to factor graph.
template <class FACTOR, class... Args>
IsDiscrete<FACTOR> emplace_discrete(Args&&... args) {
auto factor = boost::allocate_shared<FACTOR>(
Eigen::aligned_allocator<FACTOR>(), std::forward<Args>(args)...);
push_discrete(factor);
}

/// Construct a factor and add (shared pointer to it) to factor graph.
template <class FACTOR, class... Args>
IsHybrid<FACTOR> emplace_hybrid(Args&&... args) {
auto factor = boost::allocate_shared<FACTOR>(
Eigen::aligned_allocator<FACTOR>(), std::forward<Args>(args)...);
push_hybrid(factor);
}

/**
* @brief Add a single factor shared pointer to the hybrid factor graph.
* Dynamically handles the factor type and assigns it to the correct
* underlying container.
*
* @param sharedFactor The factor to add to this factor graph.
*/
void push_back(const SharedFactor& sharedFactor) {
if (auto p = boost::dynamic_pointer_cast<DiscreteFactor>(sharedFactor)) {
push_discrete(p);
}
if (auto p = boost::dynamic_pointer_cast<HybridFactor>(sharedFactor)) {
push_hybrid(p);
}
}
};

} // namespace gtsam
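
A usage note on the base class above: it is normally used through its Gaussian and nonlinear subclasses, but the dispatch logic can be exercised directly. The sketch below is an illustration, not code from the PR; the keys and table strings are hypothetical, and it assumes GTSAM's DecisionTreeFactor constructor from a DiscreteKeys and a string of potentials.

// Sketch only: hypothetical keys and potentials; illustrates the type-based
// dispatch in HybridFactorGraph rather than a recommended workflow.
#include <boost/make_shared.hpp>
#include <gtsam/discrete/DecisionTreeFactor.h>
#include <gtsam/hybrid/HybridFactorGraph.h>

using namespace gtsam;

void example() {
  HybridFactorGraph graph;
  DiscreteKey mode(0, 2);  // binary discrete variable on key 0

  // emplace_discrete constructs the DecisionTreeFactor in place and wraps it
  // in a HybridDiscreteFactor before pushing it onto the graph.
  graph.emplace_discrete<DecisionTreeFactor>(DiscreteKeys{mode}, "0.4 0.6");

  // push_back dispatches on the dynamic type: a DiscreteFactor shared_ptr is
  // routed through push_discrete, a HybridFactor shared_ptr through push_hybrid.
  auto f = boost::make_shared<DecisionTreeFactor>(DiscreteKeys{mode}, "0.7 0.3");
  graph.push_back(f);
}
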
60 changes: 53 additions & 7 deletions gtsam/hybrid/HybridGaussianFactorGraph.h
@@ -19,6 +19,8 @@
#pragma once

#include <gtsam/hybrid/HybridFactor.h>
#include <gtsam/hybrid/HybridFactorGraph.h>
#include <gtsam/hybrid/HybridGaussianFactor.h>
#include <gtsam/inference/EliminateableFactorGraph.h>
#include <gtsam/inference/FactorGraph.h>
#include <gtsam/inference/Ordering.h>
@@ -53,10 +55,9 @@ struct EliminationTraits<HybridGaussianFactorGraph> {
typedef HybridBayesNet
BayesNetType; ///< Type of Bayes net from sequential elimination
typedef HybridEliminationTree
EliminationTreeType; ///< Type of elimination tree
typedef HybridBayesTree BayesTreeType; ///< Type of Bayes tree
typedef HybridJunctionTree
JunctionTreeType; ///< Type of Junction tree
EliminationTreeType; ///< Type of elimination tree
typedef HybridBayesTree BayesTreeType; ///< Type of Bayes tree
typedef HybridJunctionTree JunctionTreeType; ///< Type of Junction tree
/// The default dense elimination function
static std::pair<boost::shared_ptr<ConditionalType>,
boost::shared_ptr<FactorType> >
@@ -72,10 +73,16 @@ struct EliminationTraits<HybridGaussianFactorGraph> {
* Everything inside needs to be hybrid factor or hybrid conditional.
*/
class GTSAM_EXPORT HybridGaussianFactorGraph
: public FactorGraph<HybridFactor>,
: public HybridFactorGraph,
public EliminateableFactorGraph<HybridGaussianFactorGraph> {
protected:
/// Check if FACTOR type is derived from GaussianFactor.
template <typename FACTOR>
using IsGaussian = typename std::enable_if<
std::is_base_of<GaussianFactor, FACTOR>::value>::type;

public:
using Base = FactorGraph<HybridFactor>;

Review comment (Collaborator): nice!

Reply (Collaborator, Author): Yeah, this makes adding HybridNonlinearFactorGraph later a cinch! :D


using Base = HybridFactorGraph;
using This = HybridGaussianFactorGraph; ///< this class
using BaseEliminateable =
EliminateableFactorGraph<This>; ///< for elimination
@@ -100,7 +107,13 @@ class GTSAM_EXPORT HybridGaussianFactorGraph

/// @}

using FactorGraph::add;
using Base::empty;
using Base::reserve;
using Base::size;
using Base::operator[];
using Base::add;
using Base::push_back;
using Base::resize;

/// Add a Jacobian factor to the factor graph.
void add(JacobianFactor&& factor);
@@ -113,6 +126,39 @@ class GTSAM_EXPORT HybridGaussianFactorGraph

/// Add a DecisionTreeFactor as a shared ptr.
void add(boost::shared_ptr<DecisionTreeFactor> factor);

/**
* Add a gaussian factor *pointer* to the internal gaussian factor graph
* @param gaussianFactor - boost::shared_ptr to the factor to add
*/
template <typename FACTOR>
IsGaussian<FACTOR> push_gaussian(
const boost::shared_ptr<FACTOR>& gaussianFactor) {
Base::push_back(boost::make_shared<HybridGaussianFactor>(gaussianFactor));
}

/// Construct a factor and add (shared pointer to it) to factor graph.
template <class FACTOR, class... Args>
IsGaussian<FACTOR> emplace_gaussian(Args&&... args) {
auto factor = boost::allocate_shared<FACTOR>(
Eigen::aligned_allocator<FACTOR>(), std::forward<Args>(args)...);
push_gaussian(factor);
}

/**
* @brief Add a single factor shared pointer to the hybrid factor graph.
* Dynamically handles the factor type and assigns it to the correct
* underlying container.
*
* @param sharedFactor The factor to add to this factor graph.
*/
void push_back(const SharedFactor& sharedFactor) {
if (auto p = boost::dynamic_pointer_cast<GaussianFactor>(sharedFactor)) {
push_gaussian(p);
} else {
Base::push_back(sharedFactor);
}
}
};

} // namespace gtsam
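
For the Gaussian specialization, push_back routes plain GaussianFactor pointers through push_gaussian (wrapping them in a HybridGaussianFactor) and defers discrete and hybrid factors to the HybridFactorGraph base. A hedged sketch of those entry points, with made-up keys, matrices, and right-hand sides (not part of the PR):

// Sketch only: made-up keys and measurements; shows emplace_gaussian and the
// push_back dispatch added in this file.
#include <boost/make_shared.hpp>
#include <gtsam/hybrid/HybridGaussianFactorGraph.h>
#include <gtsam/linear/JacobianFactor.h>

using namespace gtsam;

void example() {
  HybridGaussianFactorGraph hfg;

  Matrix A = Matrix::Identity(3, 3);
  Vector b = Vector::Zero(3);

  // IsGaussian<JacobianFactor> enables emplace_gaussian, which constructs the
  // factor in place and wraps it in a HybridGaussianFactor.
  hfg.emplace_gaussian<JacobianFactor>(0, A, b);

  // push_back on a shared_ptr<Factor>: GaussianFactor pointers go through
  // push_gaussian; anything else falls through to HybridFactorGraph::push_back.
  auto jf = boost::make_shared<JacobianFactor>(1, A, b);
  hfg.push_back(jf);
}
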