Unmanned Aerial Vehicles (UAVs) are being used in public domains and hazardous environments where effective communication strategies are critical. UAV gesture techniques have been shown to communicate meaning to human observers and may be ideal in contexts that require lightweight systems, such as unmanned aerial flight; however, prior work may be limited to an idealized range of viewer perspectives. Because gesture is a visual communication technique, it is necessary to consider how the perception of a robot gesture may suffer from obfuscation or self-occlusion from some viewpoints. This thesis presents the results of three online user studies that examine participants’ ability to accurately perceive the intended shape of two-dimensional UAV gestures from varying viewer perspectives. We used a logistic regression model to characterize participant gesture classification accuracy, demonstrating that viewer perspective does impact how participants perceive the shape of UAV gestures. Our results yielded a viewpoint angle threshold beyond which participants were able to assess the intended shape of a gesture’s motion with 90% accuracy. We also present methods for predicting intra-set gesture differentiability and viewpoint perceptibility. We demonstrate that differentiability is correlated with trajectory difference measures and that viewpoint perceptibility can be predicted within one standard error of mean participant responses. These findings will enable UAV gesture systems that ensure, with a high degree of confidence, that gesture motions can be accurately perceived by human observers.
Advisors: Brittany Duncan and Carrick Detweiler