Department of Computer Science and Engineering


First Advisor

Brittany Duncan

Second Advisor

Carrick Detweiler

Date of this Version

Spring 4-25-2022


A THESIS Presented to the Faculty of The Graduate College at the University of Nebraska In Partial Fulfillment of Requirements For the Degree of Master of Science, Major: Computer Science, Under the Supervision of Professors Brittany Duncan and Carrick Detweiler. Lincoln, Nebraska: May, 2022

Copyright © 2022 Paul Fletcher


Unmanned Aerial Vehicles (UAVs) are being used in public domains and hazardous environments where effective communication strategies are critical. UAV gesture techniques have been shown to communicate meaning to human observers and may be ideal in contexts that require lightweight systems, such as unmanned aerial flight; however, prior work may be limited to an idealized range of viewer perspectives. Because gesture is a visual communication technique, it is necessary to consider how the perception of a robot gesture may suffer from obfuscation or self-occlusion at some viewpoints. This thesis presents the results of three online user studies that examine participants’ ability to accurately perceive the intended shape of two-dimensional UAV gestures from varying viewer perspectives. We used a logistic regression model to characterize participant gesture classification accuracy, demonstrating that viewer perspective does impact how participants perceive the shape of UAV gestures. Our results yielded a viewpoint angle threshold beyond which participants were able to assess the intended shape of a gesture’s motion with 90% accuracy. We also present methods for predicting intra-set gesture differentiability and viewpoint perceptibility. We demonstrate that differentiability is correlated with trajectory difference measures and that viewpoint perceptibility can be predicted within one standard error of mean participant responses. These findings will enable UAV gesture systems that ensure, with a high degree of confidence, that gesture motions can be accurately perceived by human observers.

Advisors: Brittany Duncan and Carrick Detweiler