One of the unique features of the Pixel 4 series is the presence of the Soli radar chip. However, Google has added little functionality that takes advantage of it, apart from the air gesture for controlling music playback introduced in the March Pixel feature drop. Now, the company’s research team (Research at Google) has released an app named ‘Soli Sandbox’ for building new Soli workflows. The app exposes four event types:

  • Presence Event – Triggers every time Soli detects a person within 0.7 meters (2.3 ft) of the device.
  • Reach Event – Detects movement resembling a hand reaching toward the device, within about 5-10 cm.
  • Swipe Event – Detects motion that resembles a hand wave gesture above the device.
  • Tap Event – Detects movement that resembles a single hand bounce above the center of the phone.

Google says that Soli Sandbox prototypes are HTML files that receive and respond to Soli events with JavaScript. The app uses Android System WebView to display prototypes, so every web technology supported by Android System WebView also works in Soli Sandbox, with the exception of WebAR, WebGL, and WebVR.
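Since a prototype is just a web page, handling Soli events can be sketched in plain JavaScript. The event names used below (‘presence’, ‘reach’, ‘swipe’, ‘tap’) and the DOM-event wiring are assumptions inferred from the event list above, not the confirmed API; the exact names and payloads are defined in the Soli Sandbox Manual.

```javascript
// Map a Soli event type to a human-readable message.
// Event names here are assumptions; consult the Soli Sandbox Manual
// for the actual API surface.
function describeSoliEvent(type) {
  const messages = {
    presence: 'Person detected within 0.7 m of the device',
    reach: 'Hand reaching toward the device (~5-10 cm)',
    swipe: 'Hand wave gesture above the device',
    tap: 'Single hand bounce above the center of the phone',
  };
  return messages[type] || 'Unknown Soli event: ' + type;
}

// Hypothetical wiring: listen for each event type in the WebView
// and show the matching message on the page. The real dispatch
// mechanism may differ from plain DOM events.
if (typeof window !== 'undefined' && typeof document !== 'undefined') {
  ['presence', 'reach', 'swipe', 'tap'].forEach((type) => {
    window.addEventListener(type, () => {
      document.body.textContent = describeSoliEvent(type);
    });
  });
}
```

The guard around `window` keeps the mapping function testable outside a browser while the listeners only attach inside the WebView.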

If you’re interested in getting involved, check out the Soli Sandbox Manual and the starter project on Glitch.