Interest in signals captured about individuals by an array of sensing devices will continue to grow as algorithms engineered to predict not only our attributes but also our future choices become increasingly effective. A fundamental risk arises from the fact that individuals still hold significant misperceptions about the sensitivity of the information that can be inferred from these data, and about the agency they have to protect their privacy when their fine-grained (and possibly involuntary) behavior is tracked. This work addresses the problem directly by examining how individuals obfuscate their intent in situations where data-driven prediction of their choices may threaten their privacy, and perhaps their autonomy. In this paradigm, which we call covert embodied choice, we leverage virtual reality as an experimental platform that allows us to capture key motor and physiological signals (such as eye and arm movements, and electrodermal activity). Analysis of these data streams will provide insight into people’s prior beliefs and strategic choices, as well as into the sensitivity of physiological tracking in this setting and the risks it poses.
For inquiries: https://www.ischool.berkeley.edu/people/jeremy-gordon