VR-Bio-Talk: AI-driven Virtual Reality, Digital Twins in Ag Datasets, and CyVerse

Sept. 12, 2024

CyVerse, U of A, and Purdue University Partner on AI Platform that Enhances Agricultural Datasets

[Image: Purdue illustration for VR-AI. Image credit: Purdue University]

[Image: Duke Pauli on a bed of lettuce]

Researchers are generating large amounts of data that capture many features of physical plants, bringing that data into computer systems to understand and reveal complex biological processes and interactions. One example of such a project at the University of Arizona is PhytoOracle, a scalable, distributed workflow manager for analyzing high-throughput phenotyping data collected by the U of A gantry (field scanner) under Professor Duke Pauli. Extracting meaningful information from this data requires advanced technical skills such as algorithm development, programming, and statistics.

U of A, with Purdue as its partner, was awarded a new NSF grant for the VR-Bio-Talk project, which will develop a lifelike, voice-controlled virtual reality (VR) visual analytics system. Pauli leads the plant phenotyping efforts, has overseen multiple large experiments utilizing the Field Scanalyzer, and provides domain expertise. Nirav Merchant, director of the University of Arizona Data Science Institute, is the principal investigator for NSF CyVerse, a cyberinfrastructure project for managing data and analysis workflows from phenotyping platforms such as the Field Scanalyzer. He will provide access to the vast collection of open datasets and to methods for rapidly extracting information from them.

Imagine this: a VR-Bio-Talk user, say a farmer, puts on a headset and is immediately immersed in a field of scanned plants that she can interact with verbally. For example, the command "show me all plants older than four weeks and their average leaf area" will display only the matching plants, each labeled with its leaf area and accompanied by a graph showing how that value changes over time. At its core, this project will develop novel algorithms for artificial intelligence (AI)-based interaction and for the advanced processing, reconstruction, and visualization of large plant datasets. The overall aim is to bridge the domain gap of current data analytics systems.
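
As a rough illustration, the following minimal Python sketch shows the kind of structured query such a spoken command might compile to. The table, column names, and run_query helper here are hypothetical stand-ins for this article, not part of the VR-Bio-Talk system.

    import pandas as pd

    # Hypothetical phenotype records; in the real system this data would come
    # from field-scanner output hosted on CyVerse. Column names are illustrative.
    plants = pd.DataFrame({
        "plant_id": ["p01", "p02", "p03", "p04"],
        "age_weeks": [3, 5, 6, 2],
        "leaf_area_cm2": [[110.0, 125.5], [180.2, 195.8],
                          [210.1, 230.4], [80.3, 85.0]],
    })

    def run_query(df: pd.DataFrame, min_age_weeks: float) -> pd.DataFrame:
        """Structured form of "show me all plants older than four weeks and
        their average leaf area": filter by age, then average each plant's
        repeated leaf-area measurements."""
        older = df[df["age_weeks"] > min_age_weeks].copy()
        older["avg_leaf_area_cm2"] = older["leaf_area_cm2"].apply(
            lambda areas: sum(areas) / len(areas)
        )
        return older[["plant_id", "age_weeks", "avg_leaf_area_cm2"]]

    print(run_query(plants, min_age_weeks=4))  # selects p02 and p03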

[Image: Lettuce]

The anticipated impact is support for developing a data-enabled biology workforce capable of advancing the understanding of plant biology and contributing to innovation by deriving insights from data with novel systems and algorithms for interacting with large phenotypic datasets. Special attention will be given to including potentially disadvantaged users: the system will be robust to accents, support learners with limited English proficiency, and offer VR data interaction designed to be accessible to users with limited motor skills. The novel AI-based voice-controlled VR interaction and visualization algorithms will have a broad impact extending beyond the life sciences.

This project has three main aims:

  • Develop novel AI-based algorithms that reconstruct the vast, rich, but often underused data from phenotyping facilities into plant digital twins that respond to the environment with highly detailed 3D geometry and light interaction. In particular, the project will use the rich data acquired by the University of Arizona field scanner. The plants will be rendered with high visual plausibility and photorealism, as well as in more salient false colors as needed for analysis and AI training.
  • Develop a voice-controlled VR user interface that interprets complex compound commands. The interface will be connected to an AI-based voice recognition system and a voice-to-text encoder that generates code and executable commands (see the sketch after this list). The control will be tuned to respond to a wide variety of accents and commands.
  • Deploy and evaluate a set of carefully designed experiments with participants ranging from novices to experts, who will interact with an AI-enabled data analytics system through intuitive, natural dialogue.
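
To make the second aim concrete, here is a minimal, hypothetical Python sketch of the voice-to-command step: a recognized utterance is mapped to a structured, executable command. The toy regular-expression grammar and the Command structure are illustrative assumptions; the actual system will rely on AI-based voice recognition and code generation rather than hand-written patterns.

    import re
    from dataclasses import dataclass

    @dataclass
    class Command:
        """Structured command a VR renderer could execute (fields are illustrative)."""
        action: str     # e.g., "filter"
        attribute: str  # e.g., "age_weeks"
        operator: str   # e.g., ">"
        value: float

    # Toy grammar standing in for AI-based voice-to-text-to-command translation.
    # A real system would also normalize spoken numbers ("four" -> 4).
    PATTERN = re.compile(r"show me all plants (older|younger) than (\d+) weeks?",
                         re.IGNORECASE)

    def interpret(transcript: str) -> Command:
        """Map a recognized utterance to an executable command."""
        match = PATTERN.search(transcript)
        if match is None:
            raise ValueError(f"unrecognized command: {transcript!r}")
        operator = ">" if match.group(1).lower() == "older" else "<"
        return Command("filter", "age_weeks", operator, float(match.group(2)))

    print(interpret("Show me all plants older than 4 weeks"))
    # Command(action='filter', attribute='age_weeks', operator='>', value=4.0)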

“Traditional data science projects require domain expertise to process the data. People need to write programs to extract information from data. We aim to lower this barrier by using the latest advances in AI for voice recognition, conversational interfaces, digital twins, and procedural modeling,” said Bedrich Benes, Purdue's PI.
