
What is an adversarial attack?

by Hyeon.___. Dec. 9, 2022.

An adversarial attack is an attack technique that disrupts the inference of a trained model by feeding it manipulated images known as adversarial examples.
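The paper cited below introduces the fast gradient sign method (FGSM), which crafts an adversarial example by nudging each input coordinate in the direction of the loss gradient's sign: x_adv = x + ε·sign(∇ₓL). Here is a minimal, dependency-free sketch on a toy logistic-regression classifier; the weights, input, and ε are hypothetical values chosen only for illustration, not taken from the paper.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Toy linear classifier p(y=1|x) = sigmoid(w . x); weights are illustrative.
w = [1.0, -1.0] * 5          # 10-dimensional weight vector
x = [0.25, -0.25] * 5        # clean input, correctly classified
y = 1                        # true label

logit = sum(wi * xi for wi, xi in zip(w, x))   # 2.5
p = sigmoid(logit)                             # ~0.92 -> predicts class 1

# Gradient of the cross-entropy loss w.r.t. the input is (p - y) * w.
grad = [(p - y) * wi for wi in w]

# FGSM step: move each coordinate by eps in the sign of the gradient.
eps = 0.3
x_adv = [xi + eps * (1.0 if g > 0 else -1.0) for xi, g in zip(x, grad)]

logit_adv = sum(wi * xi for wi, xi in zip(w, x_adv))  # 2.5 - 10*0.3 = -0.5
p_adv = sigmoid(logit_adv)                            # ~0.38 -> predicts class 0
```

Note that each coordinate moves by only ε, yet the per-coordinate nudges accumulate across dimensions and flip the prediction; this is why, as the paper argues, high-dimensional linear behavior makes models vulnerable to small perturbations.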


Reference


Goodfellow, I., Shlens, J., & Szegedy, C. Explaining and Harnessing Adversarial Examples. https://arxiv.org/abs/1412.6572
