Sixth Sense is an AR app that simulates schizophrenia symptoms in an authentic and respectful way to foster understanding and empathy.


Understanding psychosis is critical to effective allyship and to interacting safely with affected persons.


  • Platform: Magic Leap 1 (spatial audio, digital objects)
  • Dynamic dialogue: Google Dialogflow v2
  • Pulse Sensor: Heart rate monitor



  • Games for Change Festival 2022 
  • XR Brain Jam 2022 Live Demo 

The App

To run the app, download The Lab and use Device Bridge to load the app onto your Magic Leap 1 device.

User Journey

The player takes a walk in the shoes of someone with schizophrenia under the guidance of a benevolent friend. Typical encounters in daily life trigger auditory hallucinations, and the player turns to their friend to help find their way through.

The Features

We delivered the first iteration of our product at the XR Brain Jam, focusing on spatial audio to create auditory hallucinations anchored to simple AR objects in the real world: an alarm, a book, and a TV.
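The Unity build relies on Magic Leap's built-in spatial audio, but the underlying idea is easy to illustrate on its own. Below is a minimal, hypothetical Python sketch of constant-power stereo panning driven by a source's azimuth relative to the listener — an illustration of the concept, not the project's actual code.

```python
import math

def pan_gains(azimuth_deg):
    """Constant-power stereo gains for a source at the given azimuth.

    azimuth_deg: angle of the source relative to the listener's forward
    direction, in degrees (-90 = hard left, 0 = center, +90 = hard right).
    Returns (left_gain, right_gain); the squared gains always sum to 1,
    so perceived loudness stays constant as the source moves around you.
    """
    # Clamp to the frontal arc, then map [-90, +90] degrees onto
    # [0, pi/2] for the constant-power panning law.
    azimuth_deg = max(-90.0, min(90.0, azimuth_deg))
    theta = (azimuth_deg + 90.0) / 180.0 * (math.pi / 2)
    return math.cos(theta), math.sin(theta)

# A centered source plays equally in both ears:
left, right = pan_gains(0.0)
```

In the headset the same role is played by the engine's spatializer, which also adds distance attenuation and head tracking; this sketch only shows the direction-to-gain mapping.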

“Auditory hallucinations are among the most common symptoms in schizophrenia, affecting more than 70% of the patients.” source


  • “It is quite intense.”
  • “I can see how it could be useful for medical staff to experience for better understanding.”
  • “You can only learn so much by reading. This is a wonderful tool to train empathy.”


I am the lead developer for this project. I am proud to have successfully built the MVP experience with spatial audio within 25 hours. I also took on the role of technical and art director, assigning technical tasks to my teammates for specific features (AI companion, text-to-speech, Dialogflow, voice-over recording, and 3D assets).

Development Process

assets/sixthsense/XRBrainJamDeveloping.jpg

I enjoyed developing for Magic Leap with Unity, although at first I spent around 20 hours troubleshooting:

  • Trying out different example templates
  • The original repo didn’t have the MLTK tools embedded, so I found the MLTK package on GitHub
  • Explored the example scenes in MLTK
  • Decided the Controller Input example fit our experience best
  • Also explored the spatial map and would love to explore it more in the future


  • It is included in the Controller Input scene, and I would love to dive into it more to make it adaptable to different physical spaces
  • I was so happy when I finally succeeded in building our application!


  • Testing out spatial audio

Storyboard + Content Creation



Script for Inner Voices

Event1: Alarm

  • Look at what the hell you did! You ruined everything
  • Run! Run!
  • You’re always messing up. You need to leave before security finds you.
  • Voice A: If you weren’t so stupid you wouldn’t get caught.
  • Voice B: Exactly! What a fucking idiot

Event2: Book

  • What’s the point of reading this
  • There is no hope for someone as meaningless as you
  • Everyone is watching you, you look so pathetic
  • Put it down right now, or else.

Event3: Video

The Video on TV: (1:28 - 1:44)

  • You’re so weak, you’re so sensitive - Look they’re talking about you on TV!
  • They’re watching you
  • If you weren’t such a snowflake maybe you’d be better off
  • (crazy laughter)
  • Did that hurt your feelings? I don’t give a shit

Voice Overs



Life without troubleshooting is dull. 

I have encountered the following challenges while working with Magic Leap:

  • Issues with the mabu build file and manifest.xml


  • I enjoyed the documentation on Magic Leap’s official website, yet one page hosts confusing, out-of-date templates. Luckily, GitHub came to the rescue. Here is the image showing that Zero Iteration works with the Magic Leap Unity Examples repo.


  • Device registration issues with Magic Leap. We had three Magic Leap devices to develop for, yet due to registration issues we ended up working with a single registered device.
  • It was our first time using Magic Leap; we learned how to use it in less than 24 hours.
  • Limited development time, since we only agreed on our storyboard 15 hours before the end of the jam

Feedback from demoing 

  1. Create an engaging storyline for users to immerse themselves in
  2. Add more AR objects for a longer experience
  3. Make the spatial audio clearer, or play fewer audio sources at the same time to preserve the effect of spatialization
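The third point can be addressed by capping how many hallucination voices play at once. A minimal Python sketch of that idea (voice names and priorities are invented for illustration):

```python
def select_voices(requested, max_concurrent=2):
    """Pick which hallucination voices may play this frame.

    requested: list of (voice_id, priority) pairs, higher priority wins.
    Returns the ids of at most max_concurrent voices; the rest are held
    back so the sources that do play stay clearly spatialized.
    """
    ranked = sorted(requested, key=lambda v: v[1], reverse=True)
    return [voice_id for voice_id, _ in ranked[:max_concurrent]]

# With three voices competing, only the two most urgent play:
playing = select_voices([("alarm", 3), ("whisper", 1), ("tv", 2)])
# → ["alarm", "tv"]
```

In the headset the held-back voices could be queued or dropped; either way, fewer simultaneous sources makes each one easier to localize.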

Next Steps

  • Expand library of trigger events
    • Diversify scenarios for the AR experience
  • Spatial audio + visual FX
    • Incorporate visual hallucinations
  • AI Companion
    • Hard-coded dialogue → conversational AI
    • Integrate Dialogflow for dynamic dialogue between the player and companion
    • Image recognition of the physical world to generate responses in the spatial audio content
  • Post-experience biofeedback → real-time visualization
    • Improve the design of the heart rate graph for easier readability and engagement
  • Incorporate hand tracking
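For the biofeedback item, the pulse sensor ultimately yields beat timestamps; averaging the recent inter-beat intervals gives a heart-rate series to plot. A hypothetical Python sketch, not the project's actual pipeline:

```python
def heart_rate_bpm(beat_times, window=5):
    """Estimate heart rate from beat timestamps (in seconds).

    Averages the last `window` inter-beat intervals and converts to
    beats per minute; returns None until two beats have been seen.
    """
    if len(beat_times) < 2:
        return None
    # Differences between consecutive beats, most recent first retained.
    intervals = [b - a for a, b in zip(beat_times, beat_times[1:])]
    recent = intervals[-window:]
    mean_interval = sum(recent) / len(recent)
    return 60.0 / mean_interval

# Beats exactly one second apart give a resting 60 BPM:
bpm = heart_rate_bpm([0.0, 1.0, 2.0, 3.0])
# → 60.0
```

Feeding each new estimate into the heart rate graph would turn the post-experience readout into the real-time visualization described above.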




Special thanks to Destiny Guzmán, Julia Scott, Tingru Lian, and Lucas Wozniak.