Ryan G. Clark

About Me

My first experience with an XR headset was at a hackathon back in 2017. Long story short: it was a cinderblock strapped to your face, with terrible graphics and zero peripheral vision; but it worked. Down the rabbit-hole, through the wardrobe, it felt like I had stepped out of this universe into a whole new fantastical realm. Instantly I was obsessed. I could see the immense potential of this technology and I knew this was exactly what I wanted to do with my life.

A month later I came down with a debilitating, undiagnosed illness that robbed me of six years of my life. Every doctor was baffled. I saw countless specialists who ran countless tests, but none of them could determine what was wrong with me.

The single benefit I derived from being bedridden for all that time was the opportunity to let my newfound obsession run wild. On the lucky few days when I felt strong enough to work, I began learning as much as I could about the science and psychology behind XR headsets. I even taught myself how to build my own XR applications.

The whimsical novelty of traversing virtual worlds is certainly charming, but I quickly realized that XR devices are, at their core, very sophisticated data-collection machines, and can be powerful tools for scientific research. Your hands, eyes, expressions, movements—they’re all tracked and quantified. I believe this data will be hugely valuable in studies of psychology, emotion, neuroscience, and biomechanics. My obsession with XR headsets quickly blossomed into a fascination with human movement, the nervous system, and non-verbal communication.

Early last year, after nothing short of a miracle, I finally found a doctor who could provide me with answers. He offered a diagnosis, prescribed a medication, and after years of agony I was back to good health in a matter of months.

Now that I’m feeling strong enough to reenter the world, I’m dying to act on these ideas and bring them to life. Currently, I’m an XR researcher at the Georgia Institute of Technology. I’m also an active member of the Georgia Tech XR Club, where I helped establish a research branch of the organization. Our aim is to inspire new developers to get involved in XR, and push forward the role of XR devices in mainstream scientific research.

Overall, my goal in life is to blur the boundary between the physical and digital worlds, and to explore the many use cases for XR devices that are just waiting to be discovered.

XR Light Painter

This is my latest exploration into controlling IoT lights with XR devices.

Mixed-Reality Crystal Ball for Apple Vision Pro

Plenty of VR games can make you feel like an all-powerful wizard, but only when you’re in the virtual world. I wanted to feel like a wizard in real life.

Instead of using my hand gestures to trigger a virtual in-game event, I broadcast that information over an IoT network to a Home Assistant server running on a Raspberry Pi, which triggers an event on a Wi-Fi-enabled smart lightbulb.
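
For the curious, the Unity side of that bridge boils down to a single web request against Home Assistant’s REST API. Here’s a minimal sketch rather than production code; the host, access token, and entity ID are placeholders for your own setup.

    using System.Collections;
    using System.Text;
    using UnityEngine;
    using UnityEngine.Networking;

    // Bridges a detected hand gesture to Home Assistant's REST API.
    // The host, token, and entity ID are placeholders for your setup.
    public class GestureLightBridge : MonoBehaviour
    {
        const string HaUrl = "http://raspberrypi.local:8123/api/services/light/turn_on";
        [SerializeField] string haToken = "YOUR_LONG_LIVED_ACCESS_TOKEN";
        [SerializeField] string entityId = "light.living_room";

        // Call this from your gesture-recognition code.
        public void OnGestureDetected() => StartCoroutine(TurnOnLight());

        IEnumerator TurnOnLight()
        {
            string json = "{\"entity_id\": \"" + entityId + "\"}";
            var req = new UnityWebRequest(HaUrl, "POST");
            req.uploadHandler = new UploadHandlerRaw(Encoding.UTF8.GetBytes(json));
            req.downloadHandler = new DownloadHandlerBuffer();
            req.SetRequestHeader("Content-Type", "application/json");
            req.SetRequestHeader("Authorization", "Bearer " + haToken);
            yield return req.SendWebRequest();

            if (req.result != UnityWebRequest.Result.Success)
                Debug.LogWarning("Light call failed: " + req.error);
            req.Dispose();
        }
    }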

Now that I’ve built a simple pipeline to allow AR devices to directly interact with IoT devices, I’m very excited to see what new interactions I can come up with next.


For this project, my goal was to use the Meta Quest Pro to replicate the Apple Vision Pro’s eye-tracking-based input system.
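
The interaction loop itself is simple: cast a ray from an eye-gaze transform, focus whatever it hits, and treat a fresh pinch as the click. Here’s a rough sketch assuming the Meta XR SDK’s OVRHand component and a transform driven by its eye tracking (e.g. OVREyeGaze); the OnGazeSelect message is a hypothetical hook for your own targets.

    using UnityEngine;

    // Gaze-plus-pinch selection, mimicking the Vision Pro input model:
    // look at a target to focus it, pinch to "click".
    public class GazePinchSelector : MonoBehaviour
    {
        [SerializeField] Transform gazeSource;  // driven by eye tracking (e.g. OVREyeGaze)
        [SerializeField] OVRHand hand;          // Meta XR SDK tracked hand
        [SerializeField] float maxDistance = 10f;

        Transform focused;
        bool wasPinching;

        void Update()
        {
            // 1. Focus whatever the eyes are pointed at.
            focused = Physics.Raycast(gazeSource.position, gazeSource.forward,
                                      out RaycastHit hit, maxDistance)
                      ? hit.transform : null;

            // 2. A fresh index-finger pinch acts as the click.
            bool pinching = hand.GetFingerIsPinching(OVRHand.HandFinger.Index);
            if (pinching && !wasPinching && focused != null)
                focused.SendMessage("OnGazeSelect", SendMessageOptions.DontRequireReceiver);
            wasPinching = pinching;
        }
    }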

XR X-Mas Tree

Talk to the Hand (Literally)

By combining the power of a Raspberry Pi, the Unity Game Engine, a Quest Pro, and an assortment of IoT lightbulbs, I gave myself the ability to talk to my Christmas tree with nothing but hand gestures. It’s silly, I know, but it’s surprisingly fun to play around with.
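
The gesture vocabulary itself is tiny: each finger pinch maps to one “word” the tree understands. Here’s a rough sketch of that mapping, where SetTreeColor is a hypothetical stand-in for the Home Assistant call sketched earlier.

    using UnityEngine;

    // Maps a few distinct hand gestures to tree light states.
    // Debouncing and fancier gestures are omitted for brevity.
    public class TreeGestureVocabulary : MonoBehaviour
    {
        [SerializeField] OVRHand hand;  // Meta XR SDK tracked hand

        void Update()
        {
            if (hand.GetFingerIsPinching(OVRHand.HandFinger.Index))
                SetTreeColor(Color.red);      // index pinch: red
            else if (hand.GetFingerIsPinching(OVRHand.HandFinger.Middle))
                SetTreeColor(Color.green);    // middle pinch: green
            else if (hand.GetFingerIsPinching(OVRHand.HandFinger.Pinky))
                SetTreeColor(Color.white);    // pinky pinch: white
        }

        void SetTreeColor(Color c)
        {
            // Hypothetical hook: forward the color to the Raspberry Pi /
            // Home Assistant bridge (see the GestureLightBridge sketch).
            Debug.Log($"Tree color -> {c}");
        }
    }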

StarDust

Send the stars on their way to clear a path for the morning sun.

There’s one big difference between movies and VR experiences: timing. While film scores follow a fixed timeline, VR experiences need to adapt to the player’s actions in real time.

To create the music for this experience, I had to ensure that the musical motifs could be played asynchronously from one another, yet still feel like a cohesive, pre-planned score.
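
One way to pull that off in Unity is to quantize everything to a shared beat grid: whenever gameplay asks for a motif, it’s scheduled to start on the next bar boundary instead of immediately, so the layers always line up. A rough sketch (the tempo and meter values are placeholders):

    using UnityEngine;

    // Keeps independently triggered motifs on a shared musical grid.
    public class MotifScheduler : MonoBehaviour
    {
        [SerializeField] double bpm = 90.0;
        [SerializeField] int beatsPerBar = 4;

        double songStartDsp;

        void Start() => songStartDsp = AudioSettings.dspTime;

        // Call with an AudioSource holding the motif clip; it will start
        // sample-accurately on the next bar boundary.
        public void PlayOnNextBar(AudioSource motif)
        {
            double barLength = beatsPerBar * 60.0 / bpm;
            double elapsed = AudioSettings.dspTime - songStartDsp;
            double nextBar = songStartDsp
                + (System.Math.Floor(elapsed / barLength) + 1) * barLength;
            motif.PlayScheduled(nextBar);
        }
    }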

Essays

  • Your smart lamp is dumb. You might be able to speak to it through Siri or Alexa, but can you really call either of them “smart”?

    Don’t get me wrong, the recent breakthroughs in AI are extraordinary, and they are about to revolutionize the way in which we use text and speech to communicate with technology. The real issue, however, is that a staggering amount of human communication is completely non-verbal. Conservative estimates put that number at over fifty percent.

    So how do we bridge that gap? Let’s look at what humans require to communicate non-verbally: Imagine there is someone standing opposite you on a city crosswalk. They look past you, pointing their fingers wildly, with a horrified expression plastered on their face. You, like any rational human being, turn around to make sure you’re not in any imminent danger. To understand that complex message, all you needed to know was where that person was looking, what they were doing with their hands, and what they were doing with their face.

    Now let’s take a look at modern XR headsets. They track your eyes, they track your hands, and they track your facial expressions: nearly everything you need to understand human non-verbal communication. If your headset knows this information, it can relay it over an IoT network and broadcast it to your dumb lamp (a minimal sketch of that kind of broadcast follows these essays). And once it knows your body language, your dumb lamp can finally live up to the moniker of “smart”.

    When you connect an XR device to an IoT network, you are simultaneously connecting your IoT devices to the realm of human social interaction.

    In the near future, when XR devices replace smartphones as the dominant computing platform, we will be able to interact with technology on a deeply human level, not with bits and bytes, but with waves, winks, and smiles.

  • Coming soon

  • Coming soon

  • Maybe? I know that’s a hell of a claim to make, but I believe there’s a lot more potential there than you might think.

    (more coming soon)
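
As promised in the first essay, here’s a minimal sketch of what broadcasting nonverbal state over the local network could look like. The packet format, field names, and port are illustrative, not a real protocol.

    using System.Net;
    using System.Net.Sockets;
    using System.Text;
    using UnityEngine;

    // Broadcasts a tiny nonverbal-state packet on the local network so
    // any IoT listener (a lamp bridge, a Raspberry Pi) can react to
    // body language. Field names and port are illustrative only.
    public class NonverbalBroadcaster : MonoBehaviour
    {
        const int Port = 49152;  // arbitrary private-range port
        readonly UdpClient udp = new UdpClient { EnableBroadcast = true };
        readonly IPEndPoint everyone = new IPEndPoint(IPAddress.Broadcast, Port);

        [System.Serializable]
        struct State { public bool waving; public bool smiling; public Vector3 gaze; }

        // Call with the headset's latest tracking summary.
        public void Send(bool waving, bool smiling, Vector3 gazeDirection)
        {
            string json = JsonUtility.ToJson(new State {
                waving = waving, smiling = smiling, gaze = gazeDirection });
            byte[] bytes = Encoding.UTF8.GetBytes(json);
            udp.Send(bytes, bytes.Length, everyone);
        }

        void OnDestroy() => udp.Close();
    }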

Contact

ryan@ryangclark.net

Ryan Gerhard Clark