Our research explores how humans understand and develop viewing behaviors with mutual paralleled first-person view sharing, in which a person can see others' first-person video perspectives as well as their own in real time. We developed a paralleled first-person view sharing system consisting of multiple video see-through head-mounted displays and an embedded eye-tracking system, allowing four people to see one another's shared first-person videos. We then conducted workshop-based research with two activities using the system: drawing pictures and playing a simple chasing game. Our results show that 1) people can complement one another's memory and decisions, and 2) people can develop viewing behaviors to understand their own physical embodiment and their spatial relationships with others in complex situations. Our findings on patterns of viewing behavior, together with the resulting design implications, will contribute to building design experience for paralleled view sharing applications.
In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (CHI '16). ACM, New York, NY, USA.