Evaluating the Feasibility of Latency Attacks against Black-box Object Detection Models
Adversarial examples crafted by attaching external objects drawn from pre-collected data to target images can mount successful latency attacks against black-box object detection models, without any prior knowledge of the target model.
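The patch-attachment step described above can be sketched as a simple compositing routine. This is a minimal illustration, not the paper's implementation: the function name `attach_objects`, the placement scheme, and the synthetic patches are assumptions made here for demonstration. In a real latency attack, the patches would be crops of actual objects selected so the detector produces many extra candidate boxes, inflating post-processing (e.g. NMS) time.

```python
import numpy as np

def attach_objects(image, patches, positions):
    """Composite pre-collected object patches onto a target image.

    image: HxWx3 uint8 array; patches: list of hxwx3 uint8 arrays;
    positions: list of (row, col) top-left corners, one per patch.
    Returns a new image with the patches pasted in, clipped at borders.
    (Illustrative sketch only; not the authors' implementation.)
    """
    adv = image.copy()
    H, W = adv.shape[:2]
    for patch, (r, c) in zip(patches, positions):
        h = min(patch.shape[0], H - r)
        w = min(patch.shape[1], W - c)
        if h > 0 and w > 0:
            adv[r:r + h, c:c + w] = patch[:h, :w]
    return adv

# Demo with a gray 64x64 target image and two synthetic "object" patches.
target = np.full((64, 64, 3), 128, dtype=np.uint8)
patch_a = np.zeros((16, 16, 3), dtype=np.uint8)      # dark square
patch_b = np.full((12, 12, 3), 255, dtype=np.uint8)  # bright square
adv = attach_objects(target, [patch_a, patch_b], [(4, 4), (40, 56)])
print(adv.shape)            # (64, 64, 3)
print(int(adv[4, 4, 0]))    # 0   (patch_a pasted)
print(int(adv[41, 58, 0]))  # 255 (patch_b pasted, clipped at the right edge)
```

Because the attack only composites image content and queries the model's outputs, it needs no gradients or architecture details, which is what makes the black-box setting feasible.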