Obsidian/Zettelkasten/Literature Notes/Output Range Analysis for Deep Feedforward Neural Networks.md


| authors | citekey | publish_date | publisher | location | pages | last_import |
| --- | --- | --- | --- | --- | --- | --- |
| Dutta, Souradeep; Jha, Susmit; Sankaranarayanan, Sriram; Tiwari, Ashish; Dutle, Aaron; Muñoz, César; Narkawicz, Anthony | duttaOutputRangeAnalysis2018 | 2018-01-01 | Springer International Publishing | Cham | 121-138 | 2025-05-12 |

## Indexing Information

Published: 2018-01
DOI: 10.1007/978-3-319-77935-5_9
ISBN: 978-3-319-77935-5

#ToRead

> [!Abstract]
> Given a neural network (NN) and a set of possible inputs to the network described by polyhedral constraints, we aim to compute a safe over-approximation of the set of possible output values. This operation is a fundamental primitive enabling the formal analysis of neural networks that are extensively used in a variety of machine learning tasks such as perception and control of autonomous systems. Increasingly, they are deployed in high-assurance applications, leading to a compelling use case for formal verification approaches. In this paper, we present an efficient range estimation algorithm that iterates between an expensive global combinatorial search using mixed-integer linear programming problems, and a relatively inexpensive local optimization that repeatedly seeks a local optimum of the function represented by the NN. We implement our approach and compare it with Reluplex, a recently proposed solver for deep neural networks. We demonstrate applications of our approach to computing flowpipes for neural network-based feedback controllers. We show that the use of local search in conjunction with mixed-integer linear programming solvers effectively reduces the combinatorial search over possible combinations of active neurons in the network by pruning away suboptimal nodes.

> [!seealso] Related Papers
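The abstract describes a two-phase scheme: an inexpensive local search that finds achievable output values (a lower bound on the maximum), and a global procedure that certifies a sound upper bound. A minimal sketch of that bracketing idea, on a toy 2-2-1 ReLU network with hand-picked weights — all values here are illustrative assumptions, not from the paper; plain random sampling stands in for the paper's local optimization, and interval arithmetic stands in for the MILP-based global bound:

```python
# Toy illustration (assumed weights): bracket max f(x) over a box by
# a cheap achievable lower bound and a sound interval upper bound.
import random

W1 = [[1.0, -1.0], [0.5, 2.0]]   # hidden-layer weights (illustrative)
b1 = [0.0, -1.0]
W2 = [[1.0, 1.0]]                # output-layer weights
b2 = [0.5]

def relu(v):
    return v if v > 0 else 0.0

def forward(x):
    """Evaluate the 2-2-1 ReLU network at point x."""
    h = [relu(sum(w * xi for w, xi in zip(row, x)) + b)
         for row, b in zip(W1, b1)]
    return sum(w * hi for w, hi in zip(W2[0], h)) + b2[0]

def interval_upper_bound(lo, hi):
    """Sound over-approximation of max output via interval arithmetic
    (a stand-in for the global MILP phase)."""
    hl, hh = [], []
    for row, b in zip(W1, b1):
        lo_acc = b + sum(w * (lo[i] if w >= 0 else hi[i]) for i, w in enumerate(row))
        hi_acc = b + sum(w * (hi[i] if w >= 0 else lo[i]) for i, w in enumerate(row))
        hl.append(max(lo_acc, 0.0))   # ReLU of interval lower end
        hh.append(max(hi_acc, 0.0))   # ReLU of interval upper end
    return b2[0] + sum(w * (hh[i] if w >= 0 else hl[i]) for i, w in enumerate(W2[0]))

def local_search(lo, hi, iters=2000, seed=0):
    """Inexpensive achievable bound: best value found by sampling
    (a stand-in for the local-optimization phase)."""
    rng = random.Random(seed)
    best = max(forward(lo), forward(hi))
    for _ in range(iters):
        x = [rng.uniform(lo[i], hi[i]) for i in range(len(lo))]
        best = max(best, forward(x))
    return best

lo, hi = [-1.0, -1.0], [1.0, 1.0]
ub = interval_upper_bound(lo, hi)
lb = local_search(lo, hi)
assert lb <= ub  # the true max output lies in [lb, ub]
```

The paper's contribution is tightening this bracket: local optima found by the cheap phase let the MILP solver prune combinations of active neurons, so the certified upper bound converges toward the achievable one.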

## Annotations

## Notes

![[Paper Notes/Output Range Analysis for Deep Feedforward Neural Networks.md]]

## Highlights From Zotero

## Follow-Ups