Earlier today, I tested Google Project Starline, which aims to deliver an immersive telepresence experience in which participants appear life-sized and in three dimensions, without the need for glasses or a headset.

As described by Google, Project Starline is like looking through a magic window, where you can see another person, with volume and depth. You can talk naturally, gesture and make eye contact. The effect is the feeling of a person sitting directly in front of you.

Project Starline combines computer vision, machine learning, spatial audio, a light field display and real-time data compression.

The latest version of Project Starline (which has been through multiple iterations) is delivered as a compact array of cameras attached to what appears to be a regular display.

Project Starline

In use, the combination of spatial audio and the light field display successfully tricks the mind into believing the individual is present within the room, delivering an impressively realistic effect.

Unfortunately, I was not able to record my test session, but even if I had, the “magic” would not translate to video, because the eye-tracking technology that maintains the visual illusion only works for the viewer in the room. Therefore, Project Starline can only be experienced in person.

The best overview/description I have seen is the Marques Brownlee (MKBHD) video from last year.

I completed a similar series of tests to Marques (e.g., passing the apple). My reaction was similar to his: initially a feeling of disbelief, followed by a desire to see how far the technology could be pushed before the illusion broke. In the end, I settled into a natural conversation with the other person, which was the most realistic I have ever experienced via virtual conferencing.

The specifics of the technology were left a little vague. For example, I would love to know exactly how the light field display works. Without any insider knowledge, Looking Glass appears to be the closest comparison.

Information gleaned from other interviews (e.g., Wired) states that the depth sensors are capable of capturing approximately 180 degrees. Therefore, if you move outside of this window, the volume and depth effects are lost.
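Interpreting that literally, a viewer keeps the effect as long as they stay in front of the display plane. A toy sketch of that geometry follows; the 180-degree figure is from the interviews, but the plane/angle model and the function name are my own simplifications, not anything Google has described:

```python
import math

# Toy model of the reported ~180-degree capture window. The geometry
# below (display plane at z = 0, viewer in front at z > 0) is an
# assumption for illustration, not a published Starline detail.
CAPTURE_FOV_DEG = 180.0

def inside_capture_window(x: float, z: float) -> bool:
    """True if a viewer at horizontal offset x and distance z (metres,
    relative to the display centre, z > 0 in front) sits inside the
    capture window; outside it, the volume and depth effects are lost."""
    if z <= 0:
        return False  # level with, or behind, the display plane
    angle = math.degrees(math.atan2(abs(x), z))
    return angle <= CAPTURE_FOV_DEG / 2

print(inside_capture_window(0.3, 1.0))   # seated roughly centred -> True
print(inside_capture_window(0.5, -0.2))  # stepped past the display -> False
```

With a full 180-degree window, the only way to "move outside" is to leave the half-space in front of the display, which matches the described behaviour.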

The data compression technology leverages a modified version of WebRTC, which is rumoured to reduce the size of the raw data by a factor of 100. This is what enables Project Starline to operate over a traditional network (although all the test sessions appear to be connected within the same building).
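To illustrate why a ~100x reduction matters, here is a back-of-the-envelope calculation. The stream count, resolution and frame rate are assumed numbers chosen for illustration, not published Starline specifications:

```python
# Back-of-the-envelope bandwidth sketch. The stream count, resolution
# and frame rate are assumptions for illustration only, not published
# Project Starline specifications.
NUM_STREAMS = 4             # assumed colour + depth streams
WIDTH, HEIGHT = 1920, 1080  # assumed per-stream resolution
BYTES_PER_PIXEL = 3         # 24-bit colour
FPS = 60

raw_bps = NUM_STREAMS * WIDTH * HEIGHT * BYTES_PER_PIXEL * 8 * FPS
compressed_bps = raw_bps / 100  # the rumoured ~100x reduction

print(f"Raw:        {raw_bps / 1e9:.1f} Gbit/s")       # ~11.9 Gbit/s
print(f"Compressed: {compressed_bps / 1e6:.0f} Mbit/s")  # ~119 Mbit/s
```

Even with assumed numbers, the shape of the result is clear: the raw streams would saturate almost any wide-area link, while the compressed stream fits on a fast office connection.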

At this time, Project Starline remains a “Technology Project” and has specific restrictions. For example, it only supports one-to-one discussions. However, Google shared that they plan to commercialise the product in 2025, working with partners such as HP.

The real-world applicability of Project Starline is yet to be proven. However, I can certainly see it being popular with business executives.

Looking beyond 2025, I am interested in understanding how quickly this technology could be delivered to a wider audience, as it certainly feels like a positive step forward from traditional virtual conferencing technologies.