When you go window shopping, you need to bring your imagination. You’re not really interested in buying that expensive outfit on the mannequin, nor are you invested enough to head to a dressing room, but it would be nice to know how you’d look in it. So your mind’s eye takes up the challenge, with its erratic vision and often unreliable accounts.

Fortunately, a new technology offers lukewarm shoppers a way to passively (and virtually) try on clothing just by looking in shop windows – without ever leaving the sidewalk. When we stand in front of a reflective, transparent surface such as a window, our reflection overlaps the objects behind it. Researchers from the University of Bristol and Lancaster University in England have introduced a new take on planar reflective optical combiners and two-way mirrors: visually merging the physical spaces in front of and behind a reflective surface, and enabling augmentations and interactive possibilities. Its inspiration? Bulky virtual-reality helmets.

Courtesy of Dr. Diego Martinez Plasencia, Bristol Interaction and Graphics Group, University of Bristol

“I have been trying to come up with systems that allow you to get the benefits of collaborative virtual-reality worlds, but without adding [work for] the user,” said Dr. Diego Martinez Plasencia of the University of Bristol. “By [making] the system more clever, we can avoid cumbersome devices … A flat, semireflective surface optically blends together the spaces in front and behind it, independently of where the observer is located.”

Martinez Plasencia and his team used the reflections in semitransparent mirrors to overlap otherwise unreachable content placed behind a combiner. To produce the effect of “trying on” clothing or accessories, they implemented a modified liquid-crystal sandwich arrangement with a two-way mirror, supporting a variety of 3-D techniques such as perspective-corrected fish-tank virtual reality, parallax barriers and multiuser random-hole masks.
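The viewer-independence Martinez Plasencia describes follows from basic planar-mirror geometry: the virtual image of a point in front of the glass lies at its mirrored position behind the glass, regardless of where the observer stands. A minimal sketch of that geometry (hypothetical illustration, not the authors’ code), modeling the combiner as the plane z = 0 with the shopper in front at z > 0:

```python
def mirror_image(point):
    """Virtual-image position of a point reflected in the combiner plane z = 0."""
    x, y, z = point
    return (x, y, -z)

# A shopper's shoulder 1.2 m in front of the window appears, as a reflection,
# 1.2 m *behind* the glass -- exactly where displayed content can be placed.
shoulder = (0.3, 1.4, 1.2)
image = mirror_image(shoulder)
print(image)  # (0.3, 1.4, -1.2)
```

Because the virtual image’s position does not depend on the eye’s location (only the ray directions do), content rendered at the mirrored position overlaps the reflection for every viewer at once, which is what lets the system blend the two spaces without head-mounted gear.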
A motion-sensing camera tracks the person’s location and maps augmentations – such as hats, glasses or clothes – onto the person’s reflection.

And window shopping is just one of many applications for this technology. When looking at a museum display, for instance, the reflection of a person’s fingers can act as a 3-D cursor within the display, visible among the objects. The person can then virtually touch the items behind the glass and perhaps even enhance the exhibition – for example, virtually “opening” an ancient Egyptian sarcophagus to see a display of its contents. These actions can be seen by everyone in attendance, regardless of where they stand.

“We explored the influence of different factors, such as the material’s reflectivity, illumination, display technology, projection mapping, volumetric displays, flat autostereoscopic effects and also issues related to human perception,” Martinez Plasencia said. “Our eye has a limited depth of field, which determines the elements in front and behind the mirror.”

Other potential applications lie in the medical field, where patients could see 3-D visuals of an illness or disease overlaid on their own bodies, helping them fully grasp the inner workings of their biology. For surgeons in training, displays controlled by the teaching surgeon could be mapped onto the operating table, showing relevant information.

These capabilities combine the magic of virtual reality with the freedom of a reflection, providing users with an immersive, interactive experience. “The virtual world bends its rules to adapt to you,” Martinez Plasencia said.

The research was published in the 2014 proceedings of the ACM Symposium on User Interface Software and Technology (doi: 10.1145/2642918.2647351).