vault backup: 2025-07-07 11:46:38
This commit is contained in: parent c3a19a4c54, commit a1c2690909
@@ -7,7 +7,7 @@ authors:
citekey: "urbanReviewFormalMethods2021"
publish_date: 2021-04-21
last_import: 2025-06-30
last_import: 2025-07-07
---
# Indexing Information
@@ -18,8 +18,7 @@ Published: 2021-04
#Computer-Science---Machine-Learning, #Computer-Science---Logic-in-Computer-Science, #Computer-Science---Programming-Languages
#InFirstPass
#ToRead
>[!Abstract]
@@ -207,7 +206,7 @@ Published: 2021-04
>[!tip] Brilliant
> When the training inputs are not linearly separable in the input space, the input space is implicitly projected, using so-called kernel functions, into a much higher-dimensional space in which the training inputs become linearly separable. This allows a support vector machine to also learn non-linear decision boundaries.
> 2025-06-30 1:59 pm
> 2025-07-07 11:41 am
>
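As a side note to that highlight, here is a minimal sketch of the kernel trick in action (my own illustration, not from the paper; the dataset, seed, and parameters are made up): an XOR-style dataset is not linearly separable in the input space, but an RBF-kernel SVM separates it easily.

```python
# Hypothetical illustration of the kernel trick (data and parameters are made up).
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(int)    # XOR-style labels: not linearly separable

linear = SVC(kernel="linear").fit(X, y)
rbf = SVC(kernel="rbf").fit(X, y)          # implicit projection via the RBF kernel

print("linear kernel training accuracy:", linear.score(X, y))  # near chance
print("RBF kernel training accuracy:   ", rbf.score(X, y))     # close to 1.0
```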
>[!done] Important
@@ -239,6 +238,7 @@ Published: 2021-04
## Follow-Ups
>[!example]
@@ -277,3 +277,7 @@ Published: 2021-04
>[107] B. Livshits, M. Sridharan, Y. Smaragdakis, O. Lhoták, J. Nelson Amaral, B.-Y. E. Chang, S. Z. Guyer, U. P. Khedker, A. Møller, and D. Vardoulakis. In Defense of Soundiness: A Manifesto. 2015.
>- [ ] #Follow-Up
>[!example]
>[131] F. Ranzato, M. Zanella. Robustness Verification of Support Vector Machines. In SAS 2019.
>- [ ] #Follow-Up
@@ -23,7 +23,6 @@ Very well written, easy to understand. Except, what is abstractification of a
network?
# Second Pass
**What is the main thrust?**
This review gives an overview of formal methods applied to machine learning.
Formal methods have been used for ML to test for robustness against various
perturbations of the inputs. They start by talking about several types of formal
@@ -47,12 +46,26 @@ That way, if a safety condition is violated, the SMT solver will pick it up as a
counterexample.
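To make the SMT encoding concrete, here is a minimal sketch (my own toy, not taken from the review) using the Z3 Python bindings: a 2-input, single-ReLU network and the safety property $y \geq 0$ on a box of inputs. The negation of the property is asserted, so a `sat` answer comes back with a counterexample. All weights, bounds, and variable names are made-up assumptions.

```python
# Hypothetical toy example (weights and bounds are assumptions, not from the paper):
# encode a 2-input, single-ReLU network in Z3 and ask for a safety violation.
from z3 import Reals, Real, Solver, If, And, sat

x1, x2 = Reals('x1 x2')
h, y = Real('h'), Real('y')

s = Solver()
pre = 1.0 * x1 - 2.0 * x2 + 0.5                  # pre-activation of the single neuron
s.add(h == If(pre > 0, pre, 0))                  # ReLU written as a conditional
s.add(y == 3.0 * h - 1.0)                        # output layer
s.add(And(0 <= x1, x1 <= 1, 0 <= x2, x2 <= 1))   # input region of interest
s.add(y < 0)                                     # negation of the safety property y >= 0

if s.check() == sat:
    print("counterexample:", s.model())          # a concrete input violating safety
else:
    print("no counterexample: the property holds on this region")
```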
MILP-based solvers are Mixed Integer Linear Programming solvers. MILPs use
linear programming where certain constraints are integers to reduce the
solving space. Instead of having an explicit satisfiability condition on safety
or not, MILPs can instead use a minimizing function to generate
counterexamples. For example, a MILP safety condition might be formalized as:
$$ \min\ \mathbf{x}_n $$
where a negative value of $\mathbf{x}_n$ is 'a valid counterexample'.
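As a concrete sketch of that idea (my own toy construction, not the paper's), the same kind of single-ReLU network can be encoded as a MILP with PuLP, using the standard big-M trick for the ReLU, and the output is minimized: a negative optimum is a counterexample, in the same spirit as the $\min\ \mathbf{x}_n$ condition above. The weights, input bounds, big-M constant, and variable names are all assumptions.

```python
# Hypothetical toy MILP encoding (all constants are assumptions, not from the paper).
import pulp

M = 100.0                                   # big-M constant, assumed large enough here
prob = pulp.LpProblem("relu_safety", pulp.LpMinimize)

x1 = pulp.LpVariable("x1", lowBound=0, upBound=1)
x2 = pulp.LpVariable("x2", lowBound=0, upBound=1)
h = pulp.LpVariable("h", lowBound=0)        # ReLU output
a = pulp.LpVariable("a", cat="Binary")      # ReLU active/inactive indicator

pre = 1.0 * x1 - 2.0 * x2 + 0.5             # pre-activation
prob += h >= pre                            # big-M encoding of h = max(pre, 0)
prob += h <= pre + M * (1 - a)
prob += h <= M * a

y = 3.0 * h - 1.0                           # network output
prob += y                                   # objective: minimize the output

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("min y =", pulp.value(y))             # negative => counterexample to y >= 0
print("at x1 =", pulp.value(x1), ", x2 =", pulp.value(x2))
```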
**What is the supporting evidence?**
There are also 'incomplete formal methods'. These incomplete methods are
sound, meaning they will not produce false negatives, but they may not be
complete, meaning they may produce false positives. They scale better, but
the false positives can be pretty annoying.
**What are the key findings?**
Local robustness:
One of the main incomplete methods is abstract interpretation. Things get very
weird very fast: this is all about zonotopes and using over-approximation of sets
to say things about stability. There is weird stuff going on with cell splitting
and merging that seems like a lot to grasp. But, apparently it works.
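A rough sketch of the over-approximation idea (my own construction; the weights and input region are made up, and the ReLU step that real zonotope analyzers handle is left out): a zonotope is a center plus generator vectors, an affine layer maps it exactly, and concretizing to a box gives sound bounds on the outputs.

```python
# Hypothetical zonotope propagation through one affine layer (all numbers are assumptions).
import numpy as np

# zonotope = center c plus generators G (one column per noise symbol in [-1, 1])
c = np.array([0.5, 0.5])
G = np.array([[0.1, 0.0],
              [0.0, 0.1]])

# an affine layer y = W x + b maps a zonotope exactly: c' = W c + b, G' = W G
W = np.array([[1.0, -2.0],
              [3.0,  0.5]])
b = np.array([0.5, -1.0])
c2 = W @ c + b
G2 = W @ G

# concretize to a bounding box: per-dimension radius = sum of |generators|
radius = np.abs(G2).sum(axis=1)
lower, upper = c2 - radius, c2 + radius
print("sound output bounds:", list(zip(lower, upper)))
# if the lower bound of the safety-relevant output is >= 0, the property is proved;
# otherwise the analysis is inconclusive (the over-approximation may be too coarse)
```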
Next, they spend some time talking about other machine learning methods and
the formal methods that have been applied to them. Of particular interest to me
is the application of formal methods to support vector machines and decision trees.
# Third Pass
**Recreation Notes:**