
dc.contributor.author: Man, Y.
dc.contributor.author: Li, M.
dc.contributor.author: Gerdes, R.
dc.date.accessioned: 2021-07-16T01:37:44Z
dc.date.available: 2021-07-16T01:37:44Z
dc.date.issued: 2020
dc.identifier.citation: Man, Y., Li, M., & Gerdes, R. (2020). GhostImage: Remote Perception Attacks against Camera-based Image Classification Systems. In 23rd International Symposium on Research in Attacks, Intrusions and Defenses (RAID 2020) (pp. 317-332).
dc.identifier.isbn: 9781940000000
dc.identifier.uri: http://hdl.handle.net/10150/660577
dc.description.abstract: In vision-based object classification systems, imaging sensors perceive the environment and objects are then detected and classified for decision-making purposes; e.g., to maneuver an automated vehicle around an obstacle or to raise an alarm to indicate the presence of an intruder in surveillance settings. In this work we demonstrate how the perception domain can be remotely and unobtrusively exploited to enable an attacker to create spurious objects or alter an existing object. An automated system relying on a detection/classification framework subject to our attack could be made to undertake actions with catastrophic results due to attacker-induced misperception. We focus on camera-based systems and show that it is possible to remotely project adversarial patterns into camera systems by exploiting two common effects in optical imaging systems, viz., lens flare/ghost effects and auto-exposure control. To improve the robustness of the attack to channel effects, we generate optimal patterns by integrating adversarial machine learning techniques with a trained end-to-end channel model. We experimentally demonstrate our attacks using a low-cost projector, on three different image datasets, in indoor and outdoor environments, and with three different cameras. Experimental results show that, depending on the projector-camera distance, attack success rates can reach as high as 100%, including under targeted conditions. © 2020 by The USENIX Association. All Rights Reserved.
dc.language.iso: en
dc.publisher: USENIX Association
dc.rights: Copyright © The Author(s).
dc.rights.uri: http://rightsstatements.org/vocab/InC/1.0/
dc.title: GhostImage: Remote perception attacks against camera-based image classification systems
dc.type: Proceedings
dc.type: text
dc.contributor.department: University of Arizona
dc.identifier.journal: RAID 2020 Proceedings - 23rd International Symposium on Research in Attacks, Intrusions and Defenses
dc.description.note: Immediate access
dc.description.collectioninformation: This item from the UA Faculty Publications collection is made available by the University of Arizona with support from the University of Arizona Libraries. If you have questions, please contact us at repository@u.library.arizona.edu.
dc.eprint.version: Final published version
dc.source.journaltitle: RAID 2020 Proceedings - 23rd International Symposium on Research in Attacks, Intrusions and Defenses
refterms.dateFOA: 2021-07-16T01:37:44Z
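
The abstract above describes generating attack patterns by combining adversarial machine learning with a trained end-to-end channel model. The following is a minimal, illustrative sketch of that general idea only, not the authors' implementation: the names (victim, channel_model, optimize_pattern), the tiny randomly initialized classifier, and the additive channel form are all assumptions, used to show a gradient-based search for a projected color pattern that induces a targeted misclassification.

import torch
import torch.nn.functional as F

# Hypothetical stand-in for the victim classifier (tiny CNN, random weights);
# a real attack would target the actual camera-facing model.
victim = torch.nn.Sequential(
    torch.nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1),
    torch.nn.ReLU(),
    torch.nn.AdaptiveAvgPool2d(1),
    torch.nn.Flatten(),
    torch.nn.Linear(16, 10),
).eval()

def channel_model(scene, pattern, gain=0.5):
    # Toy differentiable "projector-to-camera channel": the projected pattern
    # is blended additively into the scene and clipped to valid pixel values.
    # (Assumption; the paper trains this model from real measurements.)
    return torch.clamp(scene + gain * pattern, 0.0, 1.0)

def optimize_pattern(scene, target_class, steps=200, lr=0.05):
    # Gradient-based search for a coarse color-grid pattern that, after
    # passing through the channel model, pushes the classifier toward
    # target_class (a targeted misclassification).
    cell_logits = torch.zeros(1, 3, 8, 8, requires_grad=True)
    opt = torch.optim.Adam([cell_logits], lr=lr)
    target = torch.tensor([target_class])
    for _ in range(steps):
        pattern = torch.sigmoid(cell_logits)            # keep colors in [0, 1]
        pattern = F.interpolate(pattern, size=scene.shape[-2:],
                                mode="bilinear", align_corners=False)
        logits = victim(channel_model(scene, pattern))
        loss = F.cross_entropy(logits, target)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return torch.sigmoid(cell_logits).detach()

scene = torch.rand(1, 3, 224, 224)        # placeholder for a camera frame
adv_pattern = optimize_pattern(scene, target_class=1)

In the paper the channel model is learned end to end from projector-camera measurements and accounts for effects such as lens flare/ghosting and auto-exposure control; the additive blend above is only a placeholder for that learned model.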


Files in this item

Name: raid20-man.pdf
Size: 860.9Kb
Format: PDF
Description: Final Published Version
