Factors Affecting Digital Map Reading

Digital maps are commonplace in modern life, and mapping software is constantly improving, with frequent feature additions and design adjustments. Given their ubiquity, it is important to understand how end-user cognition affects the ability to learn from these maps. Are there cognitive factors that aid (or hinder) our ability to learn digital maps? How is the user experience shaped by (1) interface design choices and (2) the cognition of the end user? Specifically, we are interested in whether cognitive load and the sex of the user affect map reading.

Our data suggest that increased cognitive load affects males and females differently when they learn map-based information. The literature posits the evolution of sexually dimorphic navigation strategies, and our research supports this hypothesis. While both males and females are negatively impacted by an increase in cognitive load, the detriment is selective and depends strongly on the type of information being learnt.

Overall, this line of research suggests that cognitive and individual-difference factors should be taken into account when designing mapping software, as these factors can affect our ability to learn map information in unexpected ways.

Eye Gaze in Real-time Dyadic Interactions

Eye-tracking technology has frequently been used to study social gaze behaviour. However, most experiments have examined this behaviour with computer-based paradigms, which lack many of the qualities of genuine social interactions between live participants. We addressed this limitation by using dual eye-trackers to study social interactions in live, dyadic settings. Furthermore, we turned to signal analysis techniques to uncover the temporal relationships between eye gaze signals in these natural environments.

Our data show that both eye gaze and speech can be quantified as discrete binary signals, and that both are key to regulating turn-taking during interactions between live participants. Temporal analysis of these signal pairings indicates that maximal alignment can occur at multiple time points, and that the specific temporal offset of the alignment can signal different intentions during turn-taking.
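To make the alignment analysis concrete, here is a minimal sketch of how the temporal offset of maximal alignment between two binary signals can be located, assuming gaze and speech have already been coded as 0/1 arrays at a common sampling rate. The function name, the 2-second lag window, and the toy data are illustrative assumptions, not our actual pipeline:

```python
import numpy as np

def binary_cross_correlation(gaze: np.ndarray, speech: np.ndarray,
                             fs: float, max_lag_s: float = 2.0):
    """Normalised cross-correlation between two binary (0/1) signals.

    Returns lags (in seconds) and correlation values; peaks mark the
    temporal offsets at which the two signals align most strongly.
    """
    g = gaze - gaze.mean()
    s = speech - speech.mean()
    max_lag = int(max_lag_s * fs)
    lags = np.arange(-max_lag, max_lag + 1)
    denom = len(g) * g.std() * s.std()
    # For each lag k, correlate the overlapping portions of the signals.
    corr = np.array([
        np.sum(g[max(0, -k):len(g) - max(0, k)] *
               s[max(0, k):len(s) - max(0, -k)])
        for k in lags
    ]) / denom
    return lags / fs, corr

# Toy example: speech is a copy of gaze delayed by ~200 ms at 100 Hz.
fs = 100.0
rng = np.random.default_rng(0)
gaze = (rng.random(3000) > 0.5).astype(float)
speech = np.roll(gaze, int(0.2 * fs))  # shifted copy simulates a fixed lag
lags, corr = binary_cross_correlation(gaze, speech, fs)
print(f"peak alignment at {lags[np.argmax(corr)]:+.2f} s")
```

In real dyadic data the correlation function typically shows several local peaks rather than one global maximum, and it is the offsets of those peaks that carry the turn-taking information.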

Role of Vocal Features in Social Hierarchies

How do humans form social hierarchies? Rank assignment in social groups is contingent upon multiple factors. For example, our perception of a person's leadership style can depend on their sex, physical size, and testosterone level. These physical features are of little use when visual information is difficult to obtain (e.g. in the dark). This line of research explores whether rank assignment can occur in the absence of visual cues, and whether vocal information can act as a supplementary cue to aid the formation of social hierarchies.

We used signal analysis techniques to extract features (e.g. fundamental frequency, formant dispersion) from vocal signals recorded during natural group interactions. Our results indicate that the most important feature for predicting an individual's rank and leadership style is the change in fundamental frequency during the early parts of the interaction.
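As an illustration, fundamental frequency can be estimated from a recording with an off-the-shelf pitch tracker and compared across portions of the interaction. The sketch below uses librosa's pYIN implementation; the 60-second "early" window, the pitch range, and the file name are illustrative assumptions rather than our actual analysis:

```python
import numpy as np
import librosa  # assumed toolchain; any pitch tracker would serve

def early_f0_change(path: str, early_s: float = 60.0) -> float:
    """Median F0 shift (in semitones) of the early interaction vs. the rest.

    A hypothetical proxy for 'change in fundamental frequency during
    early parts of the interaction'; the window length is an assumption.
    """
    y, sr = librosa.load(path, sr=None, mono=True)
    # pYIN returns NaN for unvoiced frames; nanmedian ignores them below.
    f0, _, _ = librosa.pyin(y, fmin=65.0, fmax=400.0, sr=sr)
    times = librosa.times_like(f0, sr=sr)
    early = np.nanmedian(f0[times < early_s])
    later = np.nanmedian(f0[times >= early_s])
    return 12.0 * np.log2(early / later)  # >0 means higher pitch early on

# Usage (hypothetical file): print(early_f0_change("speaker_01.wav"))
```

Expressing the change in semitones rather than raw hertz makes the measure comparable across speakers with different baseline pitch, which matters when ranking individuals within a mixed group.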