New Apple Patents reveal an XR Headset Gaze Control System that can interpret a user's different hand gestures in MR environments


Aug 15, 2023

In May 2023 the European Patent Office published a patent application from Apple that technically relates to a computer system with one or more display generation components and one or more input devices that provide computer-generated experiences, including but not limited to electronic devices that provide virtual reality and mixed reality experiences via one or more displays. In practical terms, Apple's patent dives into providing users of Apple's future XR Headset with the ability to use various hand gestures to control or move objects that the user will see in AR/VR/MR/XR environments using gaze input from a gaze-tracking system. The cameras on or in the headset use depth mapping to distinguish hand gestures.
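To make that interaction concrete, here is a hypothetical Swift sketch of the flow the patent describes: gaze picks the target object and a depth-mapped hand gesture supplies the motion. None of the type names below (GazeSample, HandPose, MixedRealityScene) are real Apple APIs; they are stand-ins for the headset's gaze tracker, hand tracker, and scene graph.

```swift
import Foundation

// Hypothetical stand-ins for the headset's gaze and hand-tracking outputs.
struct GazeSample { var hitObjectID: String? }      // object the user is currently looking at
struct HandPose { var translation: SIMD3<Float> }   // hand movement recovered via depth mapping

final class MixedRealityScene {
    private var positions: [String: SIMD3<Float>] = ["photoPanel": .zero]

    // Move the object currently targeted by gaze along the tracked hand motion.
    func applyGesture(_ pose: HandPose, gaze: GazeSample) {
        guard let id = gaze.hitObjectID, let current = positions[id] else { return }
        positions[id] = current + pose.translation
        print("Moved \(id) to \(positions[id]!)")
    }
}

// Usage: gaze selects the target, a pinch-and-drag gesture supplies the motion.
let scene = MixedRealityScene()
scene.applyGesture(HandPose(translation: SIMD3(0.1, 0, 0)),
                   gaze: GazeSample(hitObjectID: "photoPanel"))
```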

Apple notes in their patent background that methods and interfaces for interacting with environments that include at least some virtual elements (e.g., applications, augmented reality environments, mixed reality environments, and virtual reality environments) are cumbersome, inefficient, and limited.

For example, systems that provide insufficient feedback for performing actions associated with virtual objects, systems that require a series of inputs to achieve a desired outcome in an augmented reality environment, and systems in which manipulation of virtual objects is complex, tedious and error-prone, create a significant cognitive burden on a user, and detract from the user experience with the virtual/augmented reality environment. In addition, these methods take longer than necessary, thereby wasting energy. This latter consideration is particularly important in battery-operated devices.

In some embodiments, the computer system can distinguish a user's hand movement based on the type of grip or hand posture that is maintained during the user's hand movement. The type of grip or hand posture is distinguished based on the position of the fingers on a respective hand, the orientation of a respective hand, the number of hands forming the grip, the relative position and orientation of the hands forming the grip, or a combination of two or more of the above.
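As a rough illustration only, a grip classifier could be organized along the lines the patent suggests; the feature set and thresholds below (curled-finger count, palm orientation, hand count) are invented for this sketch and are not taken from the patent.

```swift
import Foundation

// Hedged sketch of a grip / hand-posture classifier using the cues the
// patent lists: finger positions, hand orientation, and number of hands.
struct HandFeatures {
    var curledFingerCount: Int   // derived from per-finger joint angles
    var palmFacingUser: Bool     // derived from hand orientation
    var handsInvolved: Int       // one- vs two-handed grip
}

enum GripType { case pinch, fist, twoHanded, open }

func classifyGrip(_ f: HandFeatures) -> GripType {
    if f.handsInvolved == 2 { return .twoHanded }
    if f.curledFingerCount >= 4 { return .fist }
    if f.curledFingerCount >= 1 && f.palmFacingUser { return .pinch }
    return .open
}

// Example: four curled fingers on a single hand reads as a fist grip.
print(classifyGrip(HandFeatures(curledFingerCount: 4, palmFacingUser: false, handsInvolved: 1)))
```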

In some embodiments, once a destination position or anchor position is selected, if a movable virtual object is present at the destination position or anchor position, the computer system recognizes the user's hand movement as a request to move the virtual object relative to the three-dimensional environment (and relative to the virtual position of the user or the viewpoint of the currently displayed view of the three-dimensional environment) if the user's hand(s) are in a first type of grip during the hand movement. If the user's hand(s) are instead in a second type of grip different from the first, the computer system recognizes the hand movement as a request to move the viewpoint relative to the virtual object (and relative to the three-dimensional environment).
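A minimal sketch of that branching logic, assuming a simplified two-case grip classifier: the same hand movement either drags the targeted virtual object or moves the viewpoint, depending on which grip is held. The grip names and the NavigationAction type are illustrative, not taken from the patent.

```swift
import Foundation

enum GripType { case pinch, fist }   // simplified "first" and "second" grip types

enum NavigationAction {
    case moveObject(id: String, by: SIMD3<Float>)
    case moveViewpoint(by: SIMD3<Float>)
}

func interpret(handDelta: SIMD3<Float>, grip: GripType, targetObjectID: String?) -> NavigationAction? {
    switch grip {
    case .pinch:   // first grip type: reposition the targeted object in the environment
        guard let id = targetObjectID else { return nil }
        return .moveObject(id: id, by: handDelta)
    case .fist:    // second grip type: move the viewpoint relative to the environment
        return .moveViewpoint(by: handDelta)
    }
}

// The same pull-toward-the-body motion navigates the viewpoint when a fist grip is held.
if let action = interpret(handDelta: SIMD3(0, 0, -0.5), grip: .fist, targetObjectID: "photoPanel") {
    print(action)
}
```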

In patent FIG. 2 below we see a block diagram illustrating a controller of a computer system that is configured to manage and coordinate a CGR experience for the user. Controller #110 includes a suitable combination of software, firmware, and/or hardware. Figures 7A-7D illustrate selecting a navigation target and navigating to it in accordance with a physical hand gesture.

Apple's patent Figures 7E/F/H/I below illustrate selectively moving a virtual object relative to the three-dimensional environment (and the viewpoint) or moving the viewpoint relative to the three-dimensional environment (and all the virtual objects inside the three-dimensional environment) based on a user's hand grip that is maintained during a sequence of user inputs.

Apple's invention is exhaustively detailed in their 163-page patent that was published in Europe on May 24, 2023. Developers and geeks who want to take a deeper dive into the rich details can check out European patent number EP4182799.

The patent is associated with Apple's patent application 20210286502, published back in October 2021, on which the European filing expands. Apple shields patents it deems critical from prying eyes like ours by filing them under their engineers' names, as is the case with the '502 patent. The inventors listed on the '502 are clearly Apple engineers, and in the secondary patent Jonathan Ive is listed. Can you get any more Apple than that?

Posted by Jack Purcher on June 03, 2023 at 11:25 AM in 1A. Patent Applications, HMDs, Smartglasses +, In-Air Gesturing, Gestures, Eye-Tracking
