Today, the US Patent and Trademark Office formally granted Apple a patent for streaming immersive video content presented to users wearing head-mounted devices.
immersive video streaming
According to Apple, wearable display devices such as virtual reality headsets and augmented reality headsets can be used to present immersive video content to users in three dimensions. Additionally, different portions of the immersive video content can be presented to the user depending on the position and orientation of the user’s body and/or the user’s input.
Apple Patent FIG. 1 below shows an exemplary system #100 for presenting immersive video content to a user #102. The system includes a video content source #104 communicatively coupled to a wearable display device #106 via a network #108.
A wearable display device may be any device worn by a user and configured to display visual data to the user. As an example, the wearable display device may be a wearable headset such as a virtual reality headset, an augmented reality headset, a mixed reality headset, or a wearable holographic display.
FIG. 2A of the Apple patent shows an example viewport for displaying immersive video content; FIG. 2B is a diagram illustrating an example of the degrees of freedom of movement of the user's body.
Apple’s HMD viewport
In FIG. 2A, Apple states that immersive video content #200 can include visual data that can be presented according to a range of viewing directions and/or viewing positions for the user. A viewport #202 can be selected to present portions of the immersive video content to the user (e.g., based on the position and/or orientation of the user's head), giving the user the impression that they are viewing the visual data according to a particular field of view and/or perspective.
Additionally, the viewport can be continuously updated based on the user's movements, giving the user the impression that they are looking around the visual environment.
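The viewport selection described above can be sketched in a few lines of code. The following is a minimal, hypothetical illustration only: the patent does not specify a projection or coordinate convention, so this sketch assumes an equirectangular immersive-video frame and an illustrative helper name `viewport_center`.

```python
import math

def viewport_center(yaw_deg: float, pitch_deg: float,
                    frame_w: int, frame_h: int) -> tuple[int, int]:
    """Map a head yaw/pitch (in degrees) to the center pixel of a
    viewport in an equirectangular immersive-video frame.

    Hypothetical helper for illustration; assumes yaw 0 maps to the
    left edge of the frame and pitch in [-90, 90] spans top to bottom.
    """
    u = (yaw_deg % 360.0) / 360.0      # horizontal position, 0.0 .. 1.0
    v = (90.0 - pitch_deg) / 180.0     # vertical position, 0.0 (up) .. 1.0 (down)
    x = int(u * frame_w) % frame_w
    y = min(int(v * frame_h), frame_h - 1)
    return x, y
```

In a real player, this center point would drive which tiles or regions of the streamed video are decoded at full quality, with the mapping re-evaluated every frame as head-tracking data arrives.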
Headset sensors can also be configured to detect the position and orientation of the user's head in multiple dimensions. For example, as shown in FIG. 2B, a Cartesian coordinate system can be defined such that the x-, y-, and z-axes are orthogonal to each other and intersect at an origin O (e.g., corresponding to the position of the user's head).
A sensor (#120, FIG. 1) can detect when the user translates their head along one or more of these axes and/or rotates their head about one or more of these axes (e.g., according to six degrees of freedom, 6DoF).
For example, the sensors can detect when a user translates their head forward or backward (e.g., along the x-axis), sometimes referred to as "surge" motion. As another example, the sensors can detect when a user translates their head left or right (e.g., along the y-axis), sometimes referred to as "sway" motion. As another example, the sensors can detect when a user translates their head up or down (e.g., along the z-axis), sometimes referred to as "heave" motion.
As another example, the sensors can detect when the user rotates their head about the x-axis, sometimes referred to as "roll" motion. As another example, the sensors can detect when the user rotates their head about the y-axis, sometimes referred to as "pitch" motion.
As another example, the sensors can detect when the user rotates their head about the z-axis, sometimes referred to as "yaw" motion.
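The six degrees of freedom described above can be summarized in a short sketch. The `HeadPose` dataclass and the `dominant_motion` helper below are illustrative names invented for this example, not structures from the patent; the axis-to-name mapping follows the FIG. 2B convention described in the text.

```python
from dataclasses import dataclass

@dataclass
class HeadPose:
    """Illustrative 6DoF head pose (assumed units: meters and degrees)."""
    x: float = 0.0      # translation along x ("surge": forward/backward)
    y: float = 0.0      # translation along y ("sway": left/right)
    z: float = 0.0      # translation along z ("heave": up/down)
    roll: float = 0.0   # rotation about the x-axis
    pitch: float = 0.0  # rotation about the y-axis
    yaw: float = 0.0    # rotation about the z-axis

MOTION_NAMES = {
    "x": "surge", "y": "sway", "z": "heave",
    "roll": "roll", "pitch": "pitch", "yaw": "yaw",
}

def dominant_motion(prev: HeadPose, curr: HeadPose) -> str:
    """Name the degree of freedom that changed most between two poses.

    A real tracker would normalize translations and rotations into
    comparable units before comparing; this sketch compares raw deltas.
    """
    deltas = {field: abs(getattr(curr, field) - getattr(prev, field))
              for field in MOTION_NAMES}
    return MOTION_NAMES[max(deltas, key=deltas.get)]
```

For instance, a pose change dominated by forward translation would be classified as "surge", while one dominated by rotation about the z-axis would be classified as "yaw".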
Developers and/or engineers can delve deeper into the details of this invention in Apple's granted patent US 11570417 B2.
- Fanyi Duanmu: Video Coding and Processing Engineer
- Jun Xin: Engineering Manager, Video Coding and Processing
- Xiaosong Zhu: Senior Software QA Engineer (years of experience in broadcast, digital video encoding process, and IPTV industry)
- Hsi-Jung Wu: No LinkedIn profile found.