
Bug#1023309: RFP: fawkes - privacy-preserving tool against facial recognition systems

Package: wnpp
Severity: wishlist

* Package name    : fawkes
  Version         : 0.3
  Upstream Author : Shawn Shan
* URL             : https://github.com/Shawn-Shan/fawkes
* License         : BSD-3-Clause
  Programming Lang: Python
  Description     : privacy-preserving tool against facial recognition

Fawkes is a software tool that gives individuals the ability to limit
how unknown third parties can track them by building facial recognition
models out of their publicly available photos. At a high level, Fawkes
"poisons" models that try to learn what you look like, by putting hidden
changes into your photos, and using them as Trojan horses to deliver that
poison to any facial recognition models of you. Fawkes takes your personal
images and makes tiny, pixel-level changes that are invisible to the human
eye, in a process the authors call image cloaking. You can then use these "cloaked"
photos as you normally would, sharing them on social media, sending them
to friends, printing them or displaying them on digital devices, the same
way you would any other photo. The difference, however, is that if and
when someone tries to use these photos to build a facial recognition model,
"cloaked" images will teach the model an highly distorted version of what
makes you look like you. The cloak effect is not easily detectable by humans
or machines and will not cause errors in model training. However, when
someone tries to identify you by presenting an unaltered, "uncloaked" image
of you (e.g. a photo taken in public) to the model, the model will fail to
recognize you.
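
For illustration only, here is a minimal Python sketch of what a small,
bounded pixel-level change to an image looks like in code. It is not the
Fawkes cloaking algorithm (Fawkes computes its perturbation by optimisation
against facial recognition feature extractors, not at random); the file
names and the epsilon budget below are made up for the example.

# Conceptual sketch only -- not the Fawkes algorithm. A random perturbation
# like this would not actually protect against a recognition model; it just
# shows what a bounded, pixel-level change to a photo means in practice.
import numpy as np
from PIL import Image

def apply_random_cloak(image_path, epsilon=8, seed=0):
    """Add a perturbation bounded by +/- epsilon per pixel channel."""
    img = np.asarray(Image.open(image_path).convert("RGB"), dtype=np.int16)
    rng = np.random.default_rng(seed)
    delta = rng.integers(-epsilon, epsilon + 1, size=img.shape)
    cloaked = np.clip(img + delta, 0, 255).astype(np.uint8)
    return Image.fromarray(cloaked)

# Hypothetical usage:
# apply_random_cloak("portrait.jpg").save("portrait_cloaked.png")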

I'm not aware of any other similar tool in Debian.

--

