Designing New Experiences: Where Mixed Reality Meets Climate Change
As virtual reality (VR) and augmented reality (AR) technologies become increasingly accessible, the big question on many people’s minds is how these new tools will translate into meaningful applications.

While early examples like immersive games, which might superimpose digital content on users’ views of the physical world (an augmented reality approach) or place users in fully simulated environments (virtual reality), have captured public attention, technologists and designers are also using these technologies to create tools with significant potential for impact in a wide range of fields, from medicine to manufacturing.
In his work with Yale’s recently launched OpenLab, Tsai CITY innovator-in-residence Martin Wainstein and his collaborators are building AR/VR tools to tackle what might be the biggest challenge on the planet: climate change. In November, Wainstein shared his work with two different audiences.

Mid-month, he joined energy services company Avangrid’s innovation summit at the Yale School of Management to give a presentation and live VR demo to company representatives. Partnering with Avangrid, the Yale Center for Collaborative Arts and Media, and the Yale Center for Business and the Environment (CBEY), Wainstein had worked with a group of students (Bobby Berry, Lance Chantiles-Wertz, and Winter Willoughby-Spera) to prototype a mixed reality tool that could help practitioners manage Avangrid’s systems, using real-time information to adjust the energy mix, respond to emergencies, and more. As November came to a close, Wainstein and his colleague Sophie Janaskie, environmental innovation fellow at CITY and CBEY, presented at Yale’s Day of Data, an annual gathering of researchers and campus community members working with data in diverse fields.

At both events, the team highlighted their focus on experience design, pointing out how emerging technologies could open new possibilities for understanding, using, and presenting complex information. We caught up with Wainstein to learn more.
In recent presentations, you've spoken to two different audiences — one primarily industry-focused, and one primarily research-focused. What insights or new ideas are you bringing to each of these audiences? How do you hope to engage or collaborate with these audiences going forward?
The industry-focused insight has been around AR’s greater versatility when it comes to operational use-cases. This is a key focus in industry, because these applications of new technology can help optimize specific processes. However, we’ve found that VR is almost essential for designing AR applications in the first place. Through VR, you can recreate the environments in which industry practitioners operate and test prototypes directly there. For example, we’d like to design an AR tool to help wind turbine engineers, who operate in an engine room 150 meters off the ground. With VR, we can virtually recreate this setting to develop prototypes, without having to go up a wind turbine just to do user testing. Industry-wise, we hope to keep engaging students to work on new use-case applications, as well as to take current prototypes to the next level. Our next big step is integrating machine learning algorithms with mixed reality outputs.
On the research front, the main insight we’ve had is that AR/VR is a canvas for showing data in context. This helps researchers explain the background and results of their work to a broader audience in a more engaging way. For example, if a research group is studying protein folding in the photosystems of plant cells, VR can help them show exactly where these photosystems are located on a thylakoid membrane, and even place the output of their work directly in this immersive context. We would love to get researchers’ input on how to create applications that they can use in the lab or the classroom.
What do you see as some key use-cases for the areas you're exploring, particularly the intersection of mixed reality and climate change?
The first use-case that inspired us is education around energy. Energy is an all-encompassing topic that literally connects the whole universe — it’s hard to grasp the macro and micro perspectives of such a topic through purely abstract concepts. It’s also crucial that the incoming generation has a good grasp of the central role our energy system plays in the sustainability of the planet, understanding at a systems level how climate change is happening and why everyone needs to do their part. Here, immersive VR experiences can help elicit empathy and consolidate knowledge across spatial scales (from planetary dynamics to cellular or atomic dynamics). Ultimately, mixed reality is a technology that can help us reframe how we tell stories.
There are, of course, other use-cases, some more related to concrete business operations. For example, we’ve been designing an application to help operators manage smart grids and renewable energy portfolios through more intuitive user interfaces. In a nutshell, we’ve defined our use-case analysis to cover education, training, and operations.
Many of the emerging tech projects you've executed this fall, like the Blockchain Bootcamp and the Energy Academy project, aim to bring together people and skillsets from different disciplines, including the arts as well as STEM. What can Yale uniquely offer in this regard?
I’m convinced Yale is in an incredibly interesting position to engage with emerging technology in a totally different way, and thus produce innovative outcomes. Emerging tech and STEM will not shine if we engage them only through computer science prowess and math, focusing on projects guided by market opportunities or purely technical challenges. I believe that for these technologies to serve humanity and the planet (and not the other way around), the mind and sensitivity of artists and humanists are crucial to reframing our relationship with technology.
Learn more about the OpenLab and its emerging technology projects here.