PROJECT IDEA

Overview

My idea is to use MIT App Inventor to create a fully immersive VR TOHCAR (Timeline Of Habitat Changes And Relationships) app designed to strengthen students’ awareness and understanding of relationships in nature (such as those between abiotic and biotic components, biogeochemical cycles, plants and animals, and between animals themselves) and how all of these relationships have changed over time, by taking students on guided virtual reality excursions to different types of habitats.

It would use audio and visual components to fully immerse the user in the environment, as well as provide a hands-on experience that would build empathy and deepen students’ comprehension of the extent to which humans have disrupted the environment. Teachers would use the TOHCAR VR app to guide students through tours of 360-degree photos of different types of habitats.

Motivation

The reason I wanted to create this app is that I wanted to increase the interactivity of the educational resources used in VCE Environmental Science in high school, a subject that I believe many students found boring. I also wanted to find a way to reduce the potential for technology distractions in classroom settings in order to maximise students’ learning: since VR is a fully immersive experience with a headset and hand controllers, students have no opportunity to disengage from the environmental science content by checking social media or playing games on their devices.

Furthermore, according to a study in the Journal of Media Communication, students not only spend ‘an average of 11.43 times checking their devices during class’ (Unimersiv 2017), but they also waste one fifth of their class time on non-educational activities on their digital devices (Unimersiv 2017). Hence, I believe that the TOHCAR VR app would help students develop a deeper understanding of how humans have negatively impacted the environment, by comparing the natural changes in the biogeochemical cycles with the exaggerated changes made by human interference.

This app would be designed to appeal to auditory and visual learners. I am also motivated to create the TOHCAR VR app because of Edgar Dale’s Cone of Experience, which is a visual representation of which activities are most beneficial for memory retention in learning. Since the Cone of Experience also states that people typically remember 90% of what they do or personally experience (Unimersiv 2017), the TOHCAR VR app acts as a substitute for the personal experience of existing in those habitats over time.

I also wanted the TOHCAR VR app to run on a phone because not all schools require students to have laptops or provide free laptops on campus (students could be using an iPad for class instead, and VR isn’t compatible with iPads) (Paul Evans 2019); it is therefore more likely that most students will bring a phone with them when they attend school.

Description

Prior to the students entering the TOHCAR VR simulated habitat tour, the teacher would select the habitat that they will be teaching from within the app. This would function similarly to Google VR Expeditions, where the user selects the location they want to visit in the app before they enter VR (Brad Dale 2019).

The TOHCAR VR would operate as seated-scale VR, which means that the user sits while using it, rather than standing (standing-scale VR) or moving around the room they are physically in (room-scale VR) (Harry Baker 2022). I did not want the TOHCAR VR experience to be a standing-scale VR experience because postural instability increases the likelihood of motion sickness whilst using VR; remaining in a seated position limits the user’s movements and therefore minimises their disorientation, especially when their body’s movements in VR do not sync up with their body’s movements in real life (Sophie Thompson 2020).

I also did not want the TOHCAR VR experience to be a room-scale VR experience because, given that a class typically has around 20 students in a room, it would be difficult for the teacher to maintain order if students were continuously and accidentally running into each other. While the room-scale VR experience undoubtedly has the highest level of immersion of the three, and there are “chaperone” or “guardian” boundaries that appear whenever the user is about to bump into someone, I felt that seated-scale VR would be the easiest way for teachers to ensure that no accidental injuries or chaos disrupt the lesson, while still keeping students physically separated enough that they do not bump into each other during the TOHCAR VR experience (Harry Baker 2022).

As such, the VR headset used with the TOHCAR VR app would not include head tracking (where the simulated image changes perspective whenever the user moves their head). It would, however, include motion tracking (which lets VR users see their own hands or the VR hand controllers in the simulated space) and eye tracking (an infrared sensor inside the headset that monitors where the user’s eyes are looking in virtual reality, which helps minimise simulation sickness, the disorientation that occurs when your brain knows your vision does not match what you should be seeing from your current point of view) (Charara 2017). Instead, students would use the hand controllers to change their point of view and move around the habitat that their teacher is giving them a virtual ‘tour’ of. I removed the head-tracking aspect because the hand controllers already provide the ability to rotate, tilt and move the point of view (Andrew Courtney n.d.), so a VR headset that lets the user rotate their view by turning their head would be redundant.
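To illustrate how controller-driven view rotation could work without head tracking, here is a minimal sketch in TypeScript; the names, the rotation speed and the thumbstick structure are my own illustrative assumptions rather than part of any particular SDK.

    // Minimal sketch: updating the camera's yaw and pitch from the right
    // thumbstick instead of head tracking. All names here are illustrative.
    interface ThumbstickInput {
      x: number; // -1 (left) to 1 (right)
      y: number; // -1 (down) to 1 (up)
    }

    const ROTATE_SPEED = 1.5; // radians per second, an assumed comfort value

    let yaw = 0;   // rotation around the vertical axis
    let pitch = 0; // tilt up or down

    function updateView(stick: ThumbstickInput, deltaSeconds: number): void {
      yaw += stick.x * ROTATE_SPEED * deltaSeconds;
      // Clamp pitch so the user cannot flip the view upside down.
      pitch = Math.max(
        -Math.PI / 2,
        Math.min(Math.PI / 2, pitch + stick.y * ROTATE_SPEED * deltaSeconds)
      );
    }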

The VR headset would also be capable of mounting the smartphone (with the TOHCAR VR app installed) so that the user can view the VR content from the app (Virtual Reality Basics n.d.).

Further, the VR headset has a field of view (FOV), which is everything that the user can see of the virtual world whilst wearing the headset (Harry Baker 2022). In order to minimise the discomfort caused by using VR, the headset would restrict the FOV to a tunnel-vision-like view whenever the user is moving around in the virtual world (Harry Baker 2022).
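As a rough sketch of how this FOV restriction could be tied to movement, the snippet below narrows the field of view as the user’s virtual speed increases; the specific angles and speed threshold are assumed values, not measured comfort settings.

    // Sketch: narrow the field of view while the user is moving, to reduce
    // discomfort. FULL_FOV, MIN_FOV and MAX_SPEED are illustrative values.
    const FULL_FOV = 100; // degrees when stationary
    const MIN_FOV = 60;   // degrees at full movement speed ("tunnel vision")
    const MAX_SPEED = 3;  // metres per second

    function fovForSpeed(speed: number): number {
      const t = Math.min(Math.abs(speed) / MAX_SPEED, 1); // 0 = still, 1 = fast
      return FULL_FOV - t * (FULL_FOV - MIN_FOV);
    }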

Additionally, the VR headset would contain stereoscopic lenses, which match how humans see the world with two eyes: it uses two lenses, each showing a slightly different angle, which creates the 3D depth of VR (Immersion VR n.d.).

In addition, the frame rate of the VR headset refers to the number of frames (of the 360-degree image of the habitat) displayed per second. The TOHCAR VR app must therefore maintain a high frame rate, because a low frame rate makes the VR visuals jittery and laggy (e.g. when you rotate your point of view, the virtual habitat updates more slowly than what you would see if you were actually in the habitat and turning around) (3D Realise 2021). The VR headset would also need to be of extremely high quality, especially when multiple people are using the same VR experience at the same time (ST Engineering ANTYCIP 2020), as the TOHCAR VR app is designed to be used by a whole class of students and their teacher simultaneously.
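A simple way the browser version of the app could watch for dropped frames is sketched below; the 60 fps target and the warning threshold are assumptions for illustration only.

    // Sketch: warn when the frame time exceeds a budget (here 1/60 s),
    // which is when the visuals would start to feel jittery.
    const TARGET_FPS = 60;
    const FRAME_BUDGET_MS = 1000 / TARGET_FPS;

    let lastTime = performance.now();

    function onFrame(now: number): void {
      const frameTime = now - lastTime;
      lastTime = now;
      if (frameTime > FRAME_BUDGET_MS * 1.5) {
        console.warn(`Dropped below target frame rate: ${frameTime.toFixed(1)} ms/frame`);
      }
      requestAnimationFrame(onFrame);
    }

    requestAnimationFrame(onFrame);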

Finally, the VR headset for the TOHCAR VR app would allow physical adjustment of the IPD (short for “interpupillary distance”) to ensure that the headset can be worn by a wide range of people with varying IPDs (Harry Baker 2022). IPD refers to the distance between the centres of the user’s two pupils; if the headset’s lenses and display are not aligned in front of the user’s pupils, the VR images will likely appear blurry and, in the worst cases, will increase the chances of a headache or nausea (Harry Baker 2022).
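The sketch below illustrates the relationship between the IPD and the two stereoscopic views: each eye’s camera is offset from the head position by half the IPD. This is only an illustration under my own assumptions; a real engine such as UNITY handles this internally.

    // Sketch: derive left/right eye camera positions from a single head
    // position by offsetting each eye by half the interpupillary distance.
    interface Vec3 { x: number; y: number; z: number; }

    function eyePositions(head: Vec3, ipdMetres: number): { left: Vec3; right: Vec3 } {
      const half = ipdMetres / 2;
      // Assumes the user is looking down the -z axis, so eyes are offset along x.
      return {
        left:  { x: head.x - half, y: head.y, z: head.z },
        right: { x: head.x + half, y: head.y, z: head.z },
      };
    }

    // Example: a typical adult IPD is around 0.063 m.
    const eyes = eyePositions({ x: 0, y: 1.6, z: 0 }, 0.063);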

Additionally, a VR engine is necessary for storing the data of the habitats and organisms that the user will see (Biztech n.d.) in the TOHCAR app. UNITY will be used because it is suited to any mobile phone (unlike UNREAL, which needs a computer, or the iOS platform, which needs an Apple device) and has a free plan, which makes it less expensive to use in the classroom than Java (Yan Telles 2022). However, UNITY has slower rendering and graphics performance than UNREAL, cannot run on older devices (only newer ones, as it is performance intensive) (Definline n.d.), and would use up a lot of space on students’ phones (Yan Telles 2022).

In addition, in order for the TOHCAR VR app to provide the most accurate timelapse of the habitat that the students would be visiting, the app would rely on data from the Smithsonian National Museum of Natural History. This data would be stored in the app’s VR engine, UNITY.

Moreover, TOHCAR would also need WebGL (Web Graphics Library), which would be responsible for rendering the interactive 3D graphics of the habitat (the interactivity of the 3D graphics being important so that the user can move around in the selected habitat) (TutorialsPoint n.d.). Further, WebGL would need to be used in conjunction with a powerful GPU (graphics card) in order to ensure that the frame rate and rendering of the 360-degree photo of the selected habitat are immersive (ST Engineering ANTYCIP 2020), which in turn prevents the user from experiencing motion sickness due to a low frame rate.
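A minimal sketch of how the browser version could obtain a WebGL context and clear it each frame is shown below; the canvas id "tohcar-view" is a made-up example, and the actual habitat drawing code is omitted.

    // Minimal WebGL setup sketch: get a rendering context from a canvas and
    // clear it every frame before drawing the habitat.
    const canvas = document.getElementById("tohcar-view") as HTMLCanvasElement;
    const gl = canvas.getContext("webgl");
    if (!gl) {
      // WebGL is unavailable, e.g. unsupported on this device or browser.
      throw new Error("WebGL is not supported in this browser");
    }

    function renderFrame(ctx: WebGLRenderingContext): void {
      ctx.viewport(0, 0, canvas.width, canvas.height);
      ctx.clearColor(0, 0, 0, 1);
      ctx.clear(ctx.COLOR_BUFFER_BIT | ctx.DEPTH_BUFFER_BIT);
      // ...draw the 360-degree habitat geometry here...
      requestAnimationFrame(() => renderFrame(ctx));
    }

    renderFrame(gl);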

Since it would be inconvenient for students to only have the option of downloading the TOHCAR VR app on their phones, implementing WebVR (a JavaScript API that runs in the browser) would enable students to use the TOHCAR app in a browser on a computer instead, if they so choose; WebGL is also compatible with browsers (Vilmate n.d.).
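Since WebVR has largely been succeeded by the WebXR API in current browsers, a hedged sketch of how the app could check whether the browser can run VR at all (and otherwise fall back to the phone app) might look like this:

    // Sketch: detect whether the browser can run the VR version of the app.
    // Checks for WebXR support; the fallback message is illustrative.
    async function browserSupportsVR(): Promise<boolean> {
      const xr = (navigator as any).xr;
      if (xr && typeof xr.isSessionSupported === "function") {
        return await xr.isSessionSupported("immersive-vr");
      }
      return false;
    }

    browserSupportsVR().then((supported) => {
      if (!supported) {
        console.log("No browser VR support detected; fall back to the phone app.");
      }
    });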

However, the farther back the user goes in the timeline of the habitat in the TOHCAR VR app, the less interactive the 3D graphics from WebGL will be in the earliest geologic periods of history. For example, for what the habitat looked like in the Hadean eon (the first geologic eon in the Earth’s history), there is little information on what organisms looked like when they moved, or on the timing of major natural events such as when the atmosphere and ocean formed; there is, however, more general information about asteroid and meteorite impacts dominating the Earth at that time (EarthHow 2022), so that is what the TOHCAR app would show rather than what the exact location of that habitat looked like in the Hadean. Hence, the interactivity would be significantly decreased: since we do not have definitive information on what specific places looked like during the Hadean beyond the fact that there were many asteroid and meteorite impacts, at most the user would be able to rotate their point of view, but not tilt it or move it forwards or backwards.
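One way to represent this reduced interactivity is a simple lookup from time period to the interactions allowed, as sketched below; the era names and flags are illustrative examples rather than a complete timeline.

    // Illustrative data structure: how interactivity could be reduced for the
    // earliest geologic periods, where only rotation is possible.
    interface EraCapabilities {
      rotate: boolean;  // turn the point of view
      tilt: boolean;    // tilt the point of view up or down
      move: boolean;    // smooth locomotion forwards or backwards
      source: "photo360" | "rendered3D"; // photo tour vs UNITY-rendered scene
    }

    const eraCapabilities: Record<string, EraCapabilities> = {
      "present-day": { rotate: true, tilt: true,  move: true,  source: "photo360" },
      "holocene":    { rotate: true, tilt: true,  move: true,  source: "rendered3D" },
      "hadean":      { rotate: true, tilt: false, move: false, source: "rendered3D" },
    };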

In addition, it is still debated what exactly caused each mass extinction, so the teacher using the TOHCAR VR app would have to explain to the students what the potential causes of these mass extinctions could be, since the exact causes are not yet confirmed and therefore cannot be shown.

Furthermore, the farther back in the timeline of a habitat the user goes, the more likely it is that there are no photos of what that location looked like at that specific point in time, because photography did not exist yet. The TOHCAR VR experience would therefore have to revert from 360-degree high-definition photos to a 3D environment generated and rendered in UNITY (The University of Melbourne n.d.), showing what that place would likely have looked like.

Another issue with the content rendered through WebGL may be the depiction of certain symbiotic relationships between organisms in the selected habitat. For example, a teacher may want to teach their students about antagonistic symbiotic relationships through the TOHCAR VR app, but it may be visually confronting for some students to see the extinction or hunting of some species in the VR simulation, so the photos included in the 360-degree habitat tour would have to be carefully selected.

In addition, the wireless hand controllers would operate similarly to those used in Google Earth VR: the right thumbstick lets the user move their perspective forwards or backwards (also known as smooth locomotion) (Harry Baker 2022), holding down the button on the right controller’s handle rotates the view, and the ‘B’ button switches between the two available views (a perspective that is tilted up or tilted down) (Andrew Courtney n.d.).

The left-hand controller’s thumbstick would act as the ‘mouse’, enabling the user to select which habitat the teacher wants to explain to their students, select and change the VR settings from the VR homepage, and exit the TOHCAR VR simulated habitat. For the purposes of a succinct lesson, the teacher would be in charge of the hand controllers, guiding the rest of the students on the ‘tour’ of the habitat that they want to teach about in the TOHCAR VR app.
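A minimal sketch of how these controller inputs could be mapped to app actions is shown below; the binding names are placeholders I have invented for illustration, not bindings from a real SDK.

    // Illustrative mapping of the controller inputs described above to actions.
    type ControllerAction =
      | "smoothLocomotion" // right thumbstick: move forwards or backwards
      | "rotateView"       // hold the right handle button to rotate the view
      | "toggleTilt"       // right 'B' button: switch tilted-up / tilted-down views
      | "menuSelect";      // left thumbstick: select habitat, settings, exit

    const controllerBindings: Record<string, ControllerAction> = {
      "right.thumbstick":   "smoothLocomotion",
      "right.handleButton": "rotateView",
      "right.buttonB":      "toggleTilt",
      "left.thumbstick":    "menuSelect",
    };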

Also, the VR headphones would provide high-quality 360-degree audio, which would heighten the immersion of the TOHCAR VR experience: it makes the user feel like they are actually in the simulated habitat, because the sounds in that location appear to come from above, to the side of, or behind the user (Anthony Mattana n.d.).
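As an illustration of how 360-degree audio could be produced in the browser version, the sketch below positions a habitat sound above and behind the listener using the Web Audio API; the file name "bird-call.mp3" is a placeholder asset.

    // Sketch: position a habitat sound (e.g. a bird call) above and behind the
    // listener so it appears to come from that direction.
    const audioCtx = new AudioContext();
    const panner = audioCtx.createPanner();
    panner.panningModel = "HRTF";  // head-related transfer function for 3D audio
    panner.positionX.value = 0;    // metres to the listener's right
    panner.positionY.value = 2;    // metres above the listener
    panner.positionZ.value = 3;    // positive z is behind the default listener

    async function playSpatialSound(url: string): Promise<void> {
      const response = await fetch(url);
      const buffer = await audioCtx.decodeAudioData(await response.arrayBuffer());
      const source = audioCtx.createBufferSource();
      source.buffer = buffer;
      source.connect(panner).connect(audioCtx.destination);
      source.start();
    }

    playSpatialSound("bird-call.mp3");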

The TOHCAR VR app experience would be designed to follow this process in order to be used properly (a minimal code sketch of the flow is given after the list):

  1. The user puts on the VR headset and headphones and physically adjusts the IPD.
  2. The user mounts the phone onto their VR headset and turns the headset on, while remaining in a seated position.
  3. When the VR headset is turned on, the user is located in the equivalent of a VR homepage, which serves as a waiting room where the user can change settings and also select which habitat that they would like to explore with the VR TOHCAR app (XRToday 2022).
  4. When the user has selected the habitat that they wish to explore with the VR TOHCAR app, the 360-degree photo of the habitat is fed through the video source (in this case, the phone mounted onto the VR headset), and then the stereoscopic lenses create the 3D depth of the 360-degree photo in the user’s VR experience (Immersion VR n.d.).
  5. The VR headphones provide the 360-degree audio, and the user can use the wireless right-hand controller to rotate, tilt and move their point of view forwards or backwards, and the wireless left-hand controller to exit the TOHCAR VR habitat, switch to another habitat to explore, or select the habitat’s time period (i.e. go back to what the habitat looked like at an earlier point in time).
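A minimal sketch of steps 3 to 5 as a simple state flow, with placeholder habitat and era names, might look like this:

    // Sketch of the flow above as a simple state machine. Names are placeholders.
    type AppState = "homepage" | "habitatTour";

    interface Session {
      state: AppState;
      habitat: string | null;
      era: string | null;
    }

    let session: Session = { state: "homepage", habitat: null, era: null };

    // Step 4: the teacher selects a habitat and time period from the homepage.
    function selectHabitat(habitat: string, era: string): void {
      session = { state: "habitatTour", habitat, era };
      // ...load the 360-degree photo or rendered scene for (habitat, era)...
    }

    // Step 5: the left-hand controller can exit back to the homepage.
    function exitTour(): void {
      session = { state: "homepage", habitat: null, era: null };
    }

    selectHabitat("wetland", "present-day"); // example usage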

Tools and Technology

The TOHCAR app requires a VR headset (also called an HMD, or head-mounted display), which would enable the user to see the VR habitat; the headset for TOHCAR would be like the Google Cardboard headset, which can mount smartphones for VR content. It also requires a phone that is Virtual Reality Support compatible, meaning it has a gyroscope sensor (a device used to sense and maintain direction) and an accelerometer (a device used to sense acceleration) (Aditya Tiwari 2016), and wireless hand controllers for the user to interact with the VR environment by changing their point of view of the location they are in and moving around in the simulated location. Additionally, the TOHCAR app needs a software platform and SDK (software development kit), a VR engine (which would be responsible for rendering the visual aspects of what the user sees in the habitat) and VR headphones (Vilmate Software Organisation n.d.).
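As a small illustration of the gyroscope and accelerometer requirement, the browser version could read the phone’s orientation and acceleration as sketched below; a real app would feed these values into the renderer rather than simply logging them.

    // Sketch: read the phone's orientation (gyroscope-derived) and acceleration
    // in the browser, which is what "VR support compatible" relies on.
    window.addEventListener("deviceorientation", (event: DeviceOrientationEvent) => {
      // alpha/beta/gamma are rotations (in degrees) around the z, x and y axes.
      console.log("orientation", event.alpha, event.beta, event.gamma);
    });

    window.addEventListener("devicemotion", (event: DeviceMotionEvent) => {
      // Acceleration is reported in metres per second squared.
      const a = event.accelerationIncludingGravity;
      if (a) {
        console.log("acceleration", a.x, a.y, a.z);
      }
    });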

Skills Required

The software that would need to be written would be similar to Google Expeditions VR, which is written in C++, uses 360-degree photos, and enables the user to move anywhere around the world and look at different locations from different perspectives through the hand controllers. However, since the headsets for Google Expeditions VR need a lot of processing power and would therefore be reliant on a PC, the software for the TOHCAR VR app would also need to be similar to the Fulldive VR app, which is able to function on a smartphone (Fulldive n.d.).

In addition, the people trialling the VR TOHCAR app would need to not be susceptible to motion sickness, or have any physical conditions that may make them more likely to experience motion sickness while using the app. Further testing and trials would likely be needed to minimise the potential for nausea and motion sickness during VR usage, so that the VR TOHCAR app is safer for students and teachers to use in the classroom. Furthermore, before this app is implemented in classrooms, teachers would have to be trained in how to use the TOHCAR VR app effectively in order to maximise student learning.

Additionally, in order to make the TOHCAR VR app look good, it requires prototyping, which would help develop the 360-degree view of the photo of the habitat that the teachers would be teaching their students about (Biztech n.d.). As such, software like Google Blocks would be useful, because it would enable the developers of this app to create 3D models in VR, which would help with developing the depth of the 360-degree photos used in the TOHCAR VR app by enabling them to test the sensors, sense of scale and viewing position of the user (Biztech n.d.).

Outcome

The outcome of the VR TOHCAR app is to increase students’ understanding of the naturally evolving relationships between animals, biogeochemical cycles, plants, and biotic and abiotic components, as well as the effects humans have had on nature, by going on virtual reality excursions to different types of habitats. In addition, the TOHCAR VR app would be used for educational purposes in the Geography, Biology and Environmental Science high school subjects, thereby increasing students’ interest in pursuing a career in sustainability and environmentalism, whether as scientists or as part of an environmental government body like the EPA (Environmental Protection Authority), in order to mitigate and improve the consequences of human pollution and activities on nature. Additionally, my VR TOHCAR app could potentially be adapted for educational purposes in a history or architecture subject, to show how places, both natural and manmade, have looked over time.
