The project integrates physical and virtual components by digitally replicating a houseplant and visualizing its data (temperature, light intensity, humidity, and soil parameters). Plant owners can interact with experienced individuals, receive advice, and communicate regardless of location. Blender, Unity, and the Meta Quest Pro were used for 3D modeling, virtual environment creation, and MR visualization, respectively. Three and a half years of historical weather data were used to visualize and predict plant conditions. The implementation bridges physical and virtual spaces and provides valuable insights into plant maintenance, underscoring how MR and Digital Twins can help people understand how to care for indoor plants effectively.
Tao et al. describe precisely what we were aiming for and what we took into consideration during our brainstorming: "AR (including MR) contains a set of innovative techniques (e.g., real-time data acquisition, human-computer interaction, scene capture, real-time tracking, and registration, etc.), and can augment the view of the physical world by embedding computer-generated elements or objects. It enables people to work in a physical environment with valuable and abundant virtual information and models". After creating a possible scenario, we therefore decided on interactions that included manipulating the physical plant and its digital counterpart, engaging with the data, and fostering user-to-user interaction. The plant's 3D model was designed in the 3D modeling software Blender. During modeling, we focused on matching the number, size, shape, and placement of individual leaves to our physical template. We then imported the 3D model into Unity and created the virtual environment components for visualization. For the real-time collaborative experience we used Photon, a real-time multiplayer game development framework that allows for collaborative experiences, as sketched after this paragraph. We first designed our data visualization in Figma to get an initial idea and later built a similar dynamic visualization in Unity. It is important to emphasize that we used simulated data for this project.
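To illustrate how the shared session is set up with Photon, the following is a minimal sketch of a PUN 2 connection script; the class name, room name, and player limit are illustrative assumptions rather than the project's exact code.

using Photon.Pun;
using Photon.Realtime;
using UnityEngine;

// Minimal sketch: connects to Photon and joins a shared two-person room so that
// both users can enter the same MR session. Names and values are placeholders.
public class BloomTwinsLauncher : MonoBehaviourPunCallbacks
{
    void Start()
    {
        // Connect using the settings stored in the PhotonServerSettings asset.
        PhotonNetwork.ConnectUsingSettings();
    }

    public override void OnConnectedToMaster()
    {
        // Join (or create) a two-person room for the shared plant-care session.
        RoomOptions options = new RoomOptions { MaxPlayers = 2 };
        PhotonNetwork.JoinOrCreateRoom("BloomTwinsRoom", options, TypedLobby.Default);
    }

    public override void OnJoinedRoom()
    {
        Debug.Log("Joined room with " + PhotonNetwork.CurrentRoom.PlayerCount + " player(s).");
    }
}

Once both headsets run a script of this kind with the same Photon App ID, they end up in the same room and can exchange object state in real time.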
Although some technologies, such as connected pots and sensors capable of measuring nutritional values and environmental conditions, were unavailable, we recognized their potential value for future project iterations. For instance, the Xiaomi Mi Flora sensor, which measures light level (lux), temperature, humidity, and nutrient levels in the soil, could have provided valuable real-time data.
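To make the role of such a sensor concrete, the hypothetical sketch below shows one possible data structure for a reading together with a simple simulated-data generator of the kind that can stand in for real sensor input; the field names and value ranges are illustrative assumptions, not the project's actual code.

using System;
using UnityEngine;

// Hypothetical reading covering the values a Mi Flora-style sensor would provide.
[Serializable]
public struct PlantReading
{
    public float temperatureC;   // air temperature in degrees Celsius
    public float lightLux;       // light level in lux
    public float humidityPct;    // humidity in percent
    public float soilNutrients;  // aggregate soil nutrient level (arbitrary units)
}

public static class SimulatedSensor
{
    // Returns a randomly simulated reading; the ranges are illustrative only.
    public static PlantReading NextReading()
    {
        return new PlantReading
        {
            temperatureC  = UnityEngine.Random.Range(18f, 26f),
            lightLux      = UnityEngine.Random.Range(500f, 3000f),
            humidityPct   = UnityEngine.Random.Range(30f, 60f),
            soilNutrients = UnityEngine.Random.Range(200f, 2000f)
        };
    }
}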
To gather historical weather data for Stockholm, we collected data from January 1, 2020, to May 31, 2023. This data served as a foundation for modeling the environmental conditions experienced by the plant. To showcase historical data in the MR experience, we developed a script in Unity that reads data from a CSV file and updates sliders representing temperature, light, and humidity, enabling users to observe the environmental and soil conditions over the past seven days (see the sketches following this paragraph). Regarding plant growth, we collected data on the monthly growth in centimeters and the number of leaves produced under minimal and maximal optimal conditions. We also determined the minimum and maximum values for temperature, humidity, light, pH, nitrogen, phosphorus, and potassium. To estimate plant growth, we employed the logistic growth equation, a commonly used mathematical model for describing population growth; in our case, it was applied to approximate the growth of the Ficus elastica plant. The logistic growth equation used in the script is N(t) = K / (1 + (K / N0 - 1) * exp(-r * t)). The script solved it for time (t), giving t = ln((K / N0 - 1) / (K / N(t) - 1)) / r, using the given values of N0 (initial leaves), N(t) (target leaves), K (carrying capacity), and r (growth rate). By inputting these values, the script calculated the estimated time required for the Ficus elastica plant to grow from the initial number of leaves to the target number of leaves. The estimated growth time in months was then displayed in the Unity console.
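As an illustration of the CSV-driven visualization, the following sketch shows how a Unity script can read one row per day and push the selected day's values to sliders; the file layout, column order, and slider references are assumptions rather than the project's exact script.

using System.IO;
using UnityEngine;
using UnityEngine.UI;

// Sketch: reads one row per day from a CSV file (date,temperature,light,humidity)
// and pushes the chosen day's values to UI sliders.
public class HistoricalDataDisplay : MonoBehaviour
{
    public Slider temperatureSlider;
    public Slider lightSlider;
    public Slider humiditySlider;
    public string csvFileName = "stockholm_weather.csv"; // placeholder file name

    public void ShowDay(int daysAgo)
    {
        string path = Path.Combine(Application.streamingAssetsPath, csvFileName);
        string[] rows = File.ReadAllLines(path);

        // The last row is the most recent day; row 0 is the header.
        int index = rows.Length - 1 - daysAgo;
        if (index < 1) return;

        string[] cols = rows[index].Split(',');
        temperatureSlider.value = float.Parse(cols[1]);
        lightSlider.value = float.Parse(cols[2]);
        humiditySlider.value = float.Parse(cols[3]);
    }
}

The growth estimate can be computed in a similarly compact way. The next sketch inverts the logistic growth equation to obtain the time needed to reach a target leaf count; the variable names and example values are illustrative, not the project's exact script.

using UnityEngine;

// Sketch: inverts N(t) = K / (1 + (K/N0 - 1) * exp(-r * t)) to estimate the
// time t (in months) required to grow from the initial to the target leaf count.
public class GrowthEstimator : MonoBehaviour
{
    public float initialLeaves = 10f;     // N0
    public float targetLeaves = 25f;      // N(t)
    public float carryingCapacity = 40f;  // K
    public float growthRate = 0.3f;       // r, per month (illustrative)

    void Start()
    {
        float t = Mathf.Log((carryingCapacity / initialLeaves - 1f) /
                            (carryingCapacity / targetLeaves - 1f)) / growthRate;
        Debug.Log("Estimated growth time: " + t.ToString("F1") + " months");
    }
}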
Given the novelty and limited testing of MR technologies, we had considerable difficulty finding guidance, instructions, and inspiration on integrating the various components. As a result, we spent much time exploring and testing different approaches. However, this is a typical challenge when working with new technologies, and such exploration is one of the goals of this course. One challenge we encountered was implementing the collaborative part the way we needed. Synchronizing objects and interfaces in the multiplayer space between the two participating users was a significant challenge, as the positions and states of objects in the virtual world did not always behave the way we intended (a sketch of this kind of synchronization follows this paragraph). BloomTwins uses the "Passthrough" functionality of the Meta Quest Pro headset, which lets the user see the surrounding physical environment and is necessary for an MR application. This presented another challenge, as many of the frameworks we tried for the multiplayer part did not support Passthrough, which meant spending considerable time exploring possible technologies for the project.
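As an example of the synchronization logic this required, the sketch below shows how an object's position and a simple state flag can be kept in sync between the two users via Photon's serialization callback; it is a simplified illustration under the assumption of PUN 2, not the exact BloomTwins code.

using Photon.Pun;
using UnityEngine;

// Sketch: keeps an object's position and a simple state flag in sync between the
// two participants. Requires a PhotonView component observing this script.
public class SyncedPlantObject : MonoBehaviourPun, IPunObservable
{
    public bool isHighlighted;
    private Vector3 networkPosition;

    public void OnPhotonSerializeView(PhotonStream stream, PhotonMessageInfo info)
    {
        if (stream.IsWriting)
        {
            // The owning client sends its current state to the other user.
            stream.SendNext(transform.position);
            stream.SendNext(isHighlighted);
        }
        else
        {
            // The remote copy receives and applies the state.
            networkPosition = (Vector3)stream.ReceiveNext();
            isHighlighted = (bool)stream.ReceiveNext();
        }
    }

    void Update()
    {
        // Smoothly move the non-owned copy toward the owner's reported position.
        if (!photonView.IsMine)
        {
            transform.position = Vector3.Lerp(transform.position, networkPosition, Time.deltaTime * 10f);
        }
    }
}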
In conclusion, the design and development of the Mixed Reality collaboration experience featuring a Digital Twin of a houseplant showcased the potential of merging physical and virtual realms within individual homes. By utilizing MR technology and incorporating Digital Twin and Interaction Design concepts, the project aimed to offer plant owners an immersive and collaborative platform for plant care and maintenance. The project effectively integrated physical and virtual spaces using technologies like Blender, Unity, and the Meta Quest Pro. Real-time visualization of plant data, historical weather data, and collaborative interactions enriched the user experience and supported informed decision-making for plant owners. Furthermore, the project highlighted the significance of addressing user needs and challenges during the design and development of MR applications. However, the project encountered a few limitations: relying on simulated data instead of sensor data, the absence of readily available sources and guidance for integrating MR components, and difficulties in implementing the collaborative aspect. These limitations present opportunities for future research to address, refining the design and development process of MR applications for individual households. Overall, the project successfully demonstrated the practical application and potential of MR technologies and Digital Twins in the realm of houseplants, providing valuable insights into plant care, health monitoring, and collaborative engagement.
Future studies in this field could focus on gamified and educational approaches to promote sustainable plant care practices. Integration of advanced sensor technologies could enhance the accuracy and real-time monitoring of plant health. Additionally, exploring user-centered design principles such as adaptability, flexibility, and usability could further enhance the interaction design of MR applications for plant care.