# First Pass
**Category:**

This is a methods paper.

**Context:**

This paper proposes a way of using mixed integer linear programming (MILP) to evaluate properties of neural networks.
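
As a worked sketch of the kind of property such an encoding can check (the notation here is mine, not necessarily the paper's): for an input $x_0$ with correct label $y_0$, network logits $f(x)$, and a perturbation budget $\epsilon$, local robustness asks whether the worst-case margin over the perturbation ball stays positive,

$$
\min_{\|x - x_0\|_\infty \le \epsilon} \Big( f_{y_0}(x) - \max_{j \ne y_0} f_j(x) \Big) > 0,
$$

and the MILP's job is to encode the network's piecewise-linear layers exactly so that this minimum can actually be computed rather than only estimated by attacks.
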
**Correctness:**

Formal

**Contributions:**

They do nifty things with bounds tightening and presolving that make their solver very fast compared to the state of the art, including Reluplex. They also talk about stable and unstable neurons.
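
My shorthand for the stable/unstable idea (generic symbols, not necessarily the paper's): once pre-activation bounds $l \le x \le u$ are known for a ReLU $y = \max(0, x)$, there are three cases,

$$
\begin{cases}
l \ge 0 & \Rightarrow\ y = x \quad \text{(stably active, purely linear)}\\
u \le 0 & \Rightarrow\ y = 0 \quad \text{(stably inactive, purely linear)}\\
l < 0 < u & \Rightarrow\ \text{unstable, needs a binary variable,}
\end{cases}
$$

so tighter bounds push more neurons into the first two cases, which removes integer variables and is presumably a large part of why their presolve and bounds tightening make the solver fast.
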
**Clarity:**

They have a really good explanation of what a MILP problem is and how one might encode a neural network as one.
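
For my own reference, the standard bound-based encoding of a single ReLU $y = \max(0, x)$ as mixed-integer linear constraints, given finite bounds $l \le x \le u$ and a binary indicator $z$ (this is the textbook formulation; the paper's exact notation may differ):

$$
y \ge 0, \qquad y \ge x, \qquad y \le u\,z, \qquad y \le x - l\,(1 - z), \qquad z \in \{0, 1\}.
$$

With $z = 1$ the constraints collapse to $y = x$, and with $z = 0$ to $y = 0$, so the binaries pick each ReLU's phase while the rest of the network stays linear.
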
# Second Pass
**What is the main thrust?**

The main thrust is their new method for solving the MILPs that encode neural networks. With their method, a network's neurons can be analyzed to prove whether or not the network is robust to input perturbations. This is especially important for classifiers, where you need to know whether sneaky nonlinear behavior could be harmful to a built system (like a glitch). The combination of bounds tightening and careful MILP formulation makes their solver much faster and thereby more capable of handling large networks.
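
To make the idea concrete for myself, here is a minimal sketch of a robustness query posed as a MILP. Everything in it is my own illustration: PuLP as the solver interface, a made-up one-hidden-unit network, and toy numbers. It is not the paper's code or experimental setup, just the shape of the encoding.

```python
# Toy robustness check as a MILP (illustration only, not the paper's code).
# Made-up network: x in R^2 -> one ReLU hidden unit -> two logits.
# Question: within an L-infinity ball of radius eps around x0, can the
# "wrong" logit ever overtake the "right" one?
from pulp import LpProblem, LpVariable, LpMinimize, LpBinary, LpStatus, value

w, b = [1.0, -2.0], -0.3          # hidden pre-activation h = w.x + b
x0, eps = [0.3, -0.1], 0.1        # nominal input (class 0) and budget

prob = LpProblem("robustness_check", LpMinimize)

# Perturbed input, confined to the L-infinity ball around x0.
x = [LpVariable(f"x{i}", x0[i] - eps, x0[i] + eps) for i in range(2)]

# Interval bounds on h over the ball; these play the role of the paper's
# tightened bounds and supply the big-M constants below.
lo = b + sum(min(w[i] * (x0[i] - eps), w[i] * (x0[i] + eps)) for i in range(2))
hi = b + sum(max(w[i] * (x0[i] - eps), w[i] * (x0[i] + eps)) for i in range(2))

h = LpVariable("h", lo, hi)
prob += h == w[0] * x[0] + w[1] * x[1] + b

# ReLU y = max(0, h) via the bound-based mixed-integer encoding. With these
# numbers lo < 0 < hi, so the unit is "unstable" and the binary is needed;
# if tightening showed lo >= 0 or hi <= 0 it could be dropped entirely.
y = LpVariable("y", 0)
z = LpVariable("z", cat=LpBinary)
prob += y >= h
prob += y <= hi * z
prob += y <= h - lo * (1 - z)

# Toy logits: class 0 scores 2*y + 0.1, class 1 scores -1*y.
# Minimising the margin finds the worst case over the whole ball.
prob += (2.0 * y + 0.1) - (-1.0 * y)

prob.solve()
print(LpStatus[prob.status], "worst-case margin:", value(prob.objective))
# A positive worst-case margin certifies robustness inside the ball;
# a non-positive one means some perturbation flips the prediction.
```
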
**What is the supporting evidence?**

They have a whole bunch of experimental results.

**What are the key findings?**

MILPs and bound tightening are very good!
# Third Pass
**Recreation Notes:**

**Hidden Findings:**

**Weak Points? Strong Points?**