Adversarial attack
What is an adversarial attack?
Hyeon.___.
2022. 12. 9. 16:07
An adversarial attack is an attack technique that disrupts the inference of a trained model by feeding it manipulated inputs called adversarial examples: images perturbed slightly, often imperceptibly to a human, so that the model misclassifies them.
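As a concrete illustration, below is a minimal PyTorch sketch of the Fast Gradient Sign Method (FGSM) introduced in the referenced paper. The function name, the epsilon value, and the assumption that pixel values lie in [0, 1] are choices for this example, not something fixed by the paper.

```python
import torch
import torch.nn.functional as F

def fgsm_attack(model, image, label, epsilon=0.03):
    """Build an adversarial example with FGSM (Goodfellow et al., 2014).

    Each pixel is nudged by epsilon in the direction that increases
    the model's loss, which tends to flip the prediction.
    """
    # Track gradients with respect to the input image itself
    image = image.clone().detach().requires_grad_(True)

    # Forward pass and loss against the true label
    output = model(image)
    loss = F.cross_entropy(output, label)

    # Backward pass: gradient of the loss w.r.t. the input pixels
    model.zero_grad()
    loss.backward()

    # Perturb the input along the sign of that gradient
    adv_image = image + epsilon * image.grad.sign()

    # Keep pixel values in a valid range (assumes [0, 1] inputs)
    return adv_image.clamp(0, 1).detach()
```

Feeding `fgsm_attack(model, image, label)` back into the model typically yields a different (wrong) prediction even though the perturbed image looks unchanged to the eye.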
Reference
Goodfellow et al., "Explaining and Harnessing Adversarial Examples" — https://arxiv.org/abs/1412.6572