diff --git a/Notes on Papers/A Review of Formal Methods applied to Machine Learning.md b/Notes on Papers/A Review of Formal Methods applied to Machine Learning.md
index cd0c3ca5..7d7157dc 100644
--- a/Notes on Papers/A Review of Formal Methods applied to Machine Learning.md
+++ b/Notes on Papers/A Review of Formal Methods applied to Machine Learning.md
@@ -24,6 +24,22 @@ network?
 
 # Second Pass
 
 **What is the main thrust?**
+This review gives an overview of formal methods applied to machine learning.
+Formal methods have been used in ML mainly to test robustness against various
+input perturbations. The authors start by describing several families of formal methods: deductive verification, design by refinement, proof
+assistants, model checking, and semantic static analysis, and finally
+abstract interpretation, a form of semantic static analysis. The last has
+been used the most in formal methods applied to ML.
+
+A large part of the paper focuses on formal methods for neural network
+perturbation robustness. They split the methods into two types: complete
+and incomplete formal methods. Complete formal methods are *sound* and
+*complete*, meaning they produce neither false positives nor false negatives.
+These methods generally do not scale to large networks (beyond a few hundred
+neurons they become impractical), but give the highest level of assurance.
+There are two further subcategories: SMT-based solvers and MILP-based solvers.
+
+SMT solvers are Satisfiability Modulo Theory solvers. These solvers
 
 **What is the supporting evidence?**
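
To make the complete/SMT-based category above concrete, here is a minimal sketch (my own illustration, not code or notation from the paper) of how a local-robustness query on a toy ReLU network can be handed to an SMT solver through Z3's Python API. The network weights, the input point `(0.5, 0.5)`, and the radius `eps` are all made-up values. A `sat` answer gives a concrete adversarial perturbation; `unsat` is a proof that none exists in the ball, which is what makes such methods both sound and complete.

```python
# Hypothetical example, not from the paper: local robustness of a toy
# ReLU network posed as an SMT query with Z3 (pip install z3-solver).
from z3 import Real, Solver, If, And, sat

def relu(v):
    # ReLU written as an if-then-else term, which the solver reasons about exactly.
    return If(v > 0, v, 0)

# Toy network: 2 inputs -> 2 hidden ReLU units -> 2 output scores (made-up weights).
x1, x2 = Real("x1"), Real("x2")
h1 = relu(1.0 * x1 - 0.5 * x2 + 0.1)
h2 = relu(-0.3 * x1 + 0.8 * x2)
y0 = 1.0 * h1 + 0.5 * h2  # score of class 0, the class predicted at the centre point
y1 = 0.2 * h1 + 0.3 * h2  # score of the competing class 1

# L-infinity ball of radius eps around the (hypothetical) input point (0.5, 0.5).
x0, eps = (0.5, 0.5), 0.05

s = Solver()
s.add(And(x1 >= x0[0] - eps, x1 <= x0[0] + eps,
          x2 >= x0[1] - eps, x2 <= x0[1] + eps))
# Negate the robustness property: ask for a perturbed input that flips the ranking.
s.add(y1 >= y0)

if s.check() == sat:
    print("Not robust; counterexample:", s.model())
else:
    print("Robust: no point in the ball flips the decision")
```

The exactness comes from encoding each ReLU with an if-then-else term rather than a relaxation; the solver effectively case-splits on every unit's active/inactive phase, which is also why such complete encodings stop scaling past a few hundred neurons.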