
Symbolic bound tightening over absolute values #251

Merged: 39 commits into NeuralNetworkVerification:master from nlr3, Apr 27, 2020

Conversation

guykatzz (Collaborator)

Teach the network-level reasoner to work with AbsoluteValue constraints, not just ReLUs. One motivating example is better support for adversarial robustness queries that use the L1 norm.

This is PR #3 in this series.
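For context, the core of bound propagation through an AbsoluteValue constraint is a case split on the sign of the input interval. The sketch below is illustrative only and does not use Marabou's actual classes or API; `abs_interval_bounds` is a hypothetical helper name showing the concrete (interval arithmetic) case that the symbolic tightening refines.

```python
def abs_interval_bounds(lb: float, ub: float) -> tuple[float, float]:
    """Interval propagation through y = |x|, given x in [lb, ub] (illustrative sketch)."""
    if lb >= 0.0:
        # Non-negative phase: abs acts as the identity.
        return lb, ub
    if ub <= 0.0:
        # Non-positive phase: abs negates and flips the interval.
        return -ub, -lb
    # Input interval straddles zero: the output lower bound is 0,
    # and the upper bound is the larger endpoint magnitude.
    return 0.0, max(-lb, ub)
```

Symbolic bound tightening propagates linear lower and upper bounds through the layers instead of constants; in the straddling case the symbolic lower bound of the abs output can safely be flattened to 0, analogous to the treatment of ReLU.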

@ahmed-irfan (Collaborator)

There are conflicts.

@guykatzz (Collaborator, Author)

Fixed the conflicts.

@ahmed-irfan (Collaborator) left a comment


LGTM.

@guykatzz guykatzz merged commit 8a116f4 into NeuralNetworkVerification:master Apr 27, 2020
@guykatzz guykatzz deleted the nlr3 branch April 27, 2020 07:11
guykatzz added a commit to guykatzz/Marabou that referenced this pull request Apr 27, 2020
AleksandarZeljic pushed a commit to AleksandarZeljic/Marabou that referenced this pull request Oct 9, 2020
…tion#251)

* test

* basic interval arithmetic bound propagation

* another test case

* initialize an SBT within the NLR. Pass topology, weights and biases

* wip

* SBT functionality into NLR

* wip

* bug fix

* cleanup

* cleanup

* wip

* handle eliminated relus

* cleanup: remove symbolic bound tightener

* oops

* additional cleanup

* oops

* first unit test

* unit tests for NLR

* basic support for symbolic bound propagations over ABS constraints

* unit test - evaluate abs + relus

* a test for interval bound propagation, for abs constraints

* unit tests, and some consequent corrections, to AbsConstraint SBT

* set lblb to 0 for both Relu and Abs

* always initialize the scalar to 0 by default

* consistency

* oops

Co-authored-by: Guy Katz <guykatz@cs.huji.ac.il>
matanost pushed a commit that referenced this pull request Nov 2, 2021