WO2022254833A1 - Information processing device, information processing method, and program recording medium - Google Patents

Information processing device, information processing method, and program recording medium

Info

Publication number
WO2022254833A1
WO2022254833A1 (PCT/JP2022/008177)
Authority
WO
WIPO (PCT)
Prior art keywords
signal
viewpoint position
user
information processing
phase
Prior art date
Application number
PCT/JP2022/008177
Other languages
English (en)
Japanese (ja)
Inventor
仕豪 温
雅人 赤尾
佳明 神山
Original Assignee
ソニーグループ株式会社 (Sony Group Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ソニーグループ株式会社 (Sony Group Corporation)
Publication of WO2022254833A1


Classifications

    • G: PHYSICS
    • G03: PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03H: HOLOGRAPHIC PROCESSES OR APPARATUS
    • G03H1/00: Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
    • G03H1/04: Processes or apparatus for producing holograms
    • G03H1/08: Synthesising holograms, i.e. holograms synthesized from objects or objects from holograms
    • G03H1/22: Processes or apparatus for obtaining an optical image from holograms
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00: Manipulating 3D models or images for computer graphics

Definitions

  • The present disclosure relates to an information processing device, an information processing method, and a recording medium recording a program.
  • In virtual reality (VR) systems using a head-mounted display (HMD), rendering is performed based on the detected positions of the user's head and eyes.
  • The head and eye positions move constantly, so there is a time lag (latency) between the detection of those positions and the display of the image. As a result, the actual head and eye positions when the user views the displayed image differ from the positions used for rendering, making it difficult to provide the user with a smooth viewing experience.
  • Time warp processing exists as a way to deal with this problem. After an image has been generated by rendering, time warp detects the user's head and eye positions again immediately before display and compensates the displayed image based on the newly detected positions, reducing the gap between the viewing-time head and eye positions and the displayed image.
  • A similar problem arises in augmented reality (AR) displays based on computer-generated holograms (CGH).
  • The present disclosure therefore proposes an information processing device, an information processing method, and a recording medium recording a program that can suppress deterioration of the user's viewing experience.
  • To solve the above problem, an information processing apparatus according to one aspect of the present disclosure includes: a viewpoint position acquisition unit that acquires a user's viewpoint position; a rendering unit that generates a rendered image based on the viewpoint position; a wavefront propagation unit that generates a propagation signal representing a hologram based on the rendered image; a phase signal generation unit that generates a first phase signal for displaying the hologram based on the propagation signal; and a correction unit that corrects the displayed hologram based on the user's current viewpoint position acquired by the viewpoint position acquisition unit.
  • FIG. 1 is a schematic diagram for explaining the outline of the hologram display system according to the first embodiment.
  • FIG. 2 is a simplified schematic diagram of the hologram display system according to the first embodiment.
  • FIG. 3 is the schematic diagram of FIG. 2, rotated.
  • FIG. 4 is a block diagram showing a schematic configuration example of the hologram display system according to the first embodiment.
  • FIG. 5 is a flowchart showing an operation example of the hologram display system according to the first embodiment.
  • FIG. 6 is a diagram showing a specific example for explaining step S102 of FIG. 5 according to the first embodiment.
  • FIGS. 7A and 7B are diagrams showing an example of RGB image data and a depth map according to the first embodiment.
  • FIG. 8 is a diagram for explaining a case where the user's viewpoint position changes between time T1 and time T2 according to the first embodiment.
  • FIG. 9 is a diagram showing an example of a linear phase pattern in the x direction according to the first embodiment.
  • FIG. 10 is a diagram showing an example of the phase pattern when the compensating phase pattern shown in FIG. 9 is wrapped to a period of 0 to 2π.
  • FIG. 11 is a two-dimensional image expressing the compensating phase pattern shown in FIG. 10 in grayscale.
  • FIG. 13 is a timing chart for explaining the operation flow according to Modification 1 of the first embodiment.
  • FIG. 14 is a diagram showing an example of a pre-computed one-dimensional compensation phase pattern according to Modification 3 of the first embodiment.
  • FIG. 15 is a two-dimensional image obtained by two-dimensionally representing the compensating phase pattern shown in FIG. 14.
  • FIG. 16 is a diagram showing an example of a y-direction compensating phase pattern according to Modification 4 of the first embodiment.
  • FIG. 17 is a diagram showing an example of compensating phase patterns in the x direction and the y direction according to Modification 4 of the first embodiment.
  • FIG. 18 is a block diagram showing a schematic configuration example of the hologram display system according to the second embodiment.
  • FIG. 19 is a flowchart showing an operation example of the hologram display system according to the second embodiment.
  • FIG. 20 is a schematic diagram for explaining an outline of an AR HUD system using CGH according to the third embodiment.
  • FIG. 21 is a block diagram showing a hardware configuration example of an information processing apparatus according to an embodiment.
  • Hereinafter, an information processing apparatus, an information processing method, and a recording medium storing a program according to the first embodiment of the present disclosure will be described in detail with reference to the drawings.
  • In the following, an AR device using an optical see-through HMD is exemplified as the hologram display system to which the information processing apparatus, information processing method, and program according to the embodiment are applied; however, the application is not limited to this.
  • FIG. 1 is a schematic diagram for explaining the outline of the hologram display system according to this embodiment.
  • In FIG. 1, (A) to (F) indicate the respective schematic positions.
  • As shown in FIG. 1, this embodiment exemplifies a hologram display system 1 including a light source 11, an enlarging optical system composed of a plurality of lenses 12 and 13, a beam splitter 14, and a spatial light modulator 15.
  • A laser beam L1 emitted from the light source 11 at position (D) is converted into coherent light L2 with an enlarged beam diameter by the enlarging optical system composed of the lenses 12 and 13 at position (C).
  • The coherent light L2 passes through the beam splitter 14 and enters the spatial light modulator 15.
  • Here, a reflective SLM is exemplified as the spatial light modulator 15, but the spatial light modulator is not limited to this and may be a transmissive SLM. Likewise, a phase-only SLM is exemplified in this description, but the present disclosure is not limited to this, and an amplitude-only SLM may be used.
  • The coherent light L2 is modulated by the spatial light modulator 15 so as to display a hologram at a desired display position (E) in real space.
  • The beam splitter 14 at position (B) is arranged so that a viewer (hereinafter referred to as a user) 19 at the viewpoint position (F) can observe the hologram 18 reproduced by the spatial light modulator 15 and projected at the display position (E) in real space. By looking toward the display position (E), the user 19 at the viewpoint position (F) sees the hologram 18 superimposed on the real space visible through the beam splitter 14.
  • In this way, the hologram display system can make a virtual object or the like appear in the real space, add special effects to an object in the real space, or present predetermined information to the user, providing an AR experience.
  • The schematic diagram of the hologram display system 1 illustrated in FIG. 1 can be simplified as shown in FIG. 2.
  • In FIG. 2, the spatial light modulator 15, which was reflective in FIG. 1, is drawn as a transmissive type for simplicity.
  • FIG. 3 is the schematic diagram of FIG. 2 rotated 90 degrees to the right for easier viewing.
  • FIG. 4 is a block diagram showing a schematic configuration example of the hologram display system according to this embodiment.
  • As shown in FIG. 4, the hologram display system 1 includes a viewpoint position acquisition unit 110, a rendering unit 120, a wavefront propagation unit 130, a viewpoint position change calculation unit 140, a correction signal generation unit 150, a complex signal synthesizing unit 160, an interference fringe conversion unit 170, and an output device 180.
  • The complex signal synthesizing unit 160 and the interference fringe conversion unit 170 can correspond to, for example, an example of the phase signal generation unit in the scope of the claims.
  • The viewpoint position change calculation unit 140 and the correction signal generation unit 150 can correspond to, for example, an example of the correction unit in the claims.
  • The viewpoint position acquisition unit 110 acquires information about the viewpoint position (F) of the user 19 (hereinafter referred to as viewpoint position information) based on sensor information from, for example, a sensor such as an IMU (Inertial Measurement Unit) provided in an HMD worn on the head of the user 19, or a camera that captures images of the eyes and face of the user 19.
  • The viewpoint position information may be the position of the viewpoint of the user 19 in the physical space. The position of the viewpoint may be the position of both eyes or one eye of the user 19 in the physical space, or the position of the head of the user 19 in the physical space.
  • The viewpoint position information may also include information about the orientation of both eyes, one eye, or the head of the user 19, in addition to the information about their position.
  • The viewpoint position acquisition unit 110 may acquire the viewpoint position of the user 19 in the physical space by SLAM (Simultaneous Localization and Mapping). Furthermore, the viewpoint position acquisition unit 110 may predict the viewpoint position after a predetermined time (for example, a preset time) has elapsed from the current viewpoint position (F), and acquire this predicted viewpoint position as the user's viewpoint position.
  • The rendering unit 120 renders the three-dimensional image information 101 of the object to be displayed as a hologram based on the viewpoint position information input from the viewpoint position acquisition unit 110, generating RGB image data representing the texture of the object and a depth map representing its depth.
  • A set of RGB image data and a depth map indicating the depth value of each pixel of the RGB image data is also called RGB+D image data or a rendered image.
  • The wavefront propagation unit 130 propagates the RGB+D image data generated by the rendering unit 120 to generate a propagation signal of the hologram pattern displayed by the spatial light modulator 15.
  • The viewpoint position change calculation unit 140 calculates, as the viewpoint position change amount, the difference between the viewpoint position (F) of the user 19 in the viewpoint position information used by the rendering unit 120 for rendering the three-dimensional image information 101 and the viewpoint position of the user 19 in the current viewpoint position information acquired by the viewpoint position acquisition unit 110.
  • The correction signal generation unit 150 generates a correction signal for correcting the propagation signal of the hologram pattern generated by the wavefront propagation unit 130, based on the viewpoint position change amount calculated by the viewpoint position change calculation unit 140.
  • The complex signal synthesizing unit 160 synthesizes the propagation signal generated by the wavefront propagation unit 130 and the correction signal generated by the correction signal generation unit 150 to generate a complex signal of the hologram pattern.
  • The interference fringe conversion unit 170 converts the complex signal generated by the complex signal synthesizing unit 160 to generate a phase signal for displaying the hologram pattern on the spatial light modulator 15.
  • The output device 180 displays the phase signal generated by the interference fringe conversion unit 170 on the spatial light modulator 15. As a result, the hologram is superimposed on the real space and displayed to the user 19.
  • FIG. 5 is a flowchart showing an operation example of the hologram display system according to this embodiment.
  • As shown in FIG. 5, the operation of the hologram display system 1 consists of a first half step S100, in which a propagation signal of the hologram pattern is generated based on the viewpoint position (F) of the user 19 at a certain time (time T1), and a second half step S200, in which the hologram pattern is corrected based on the viewpoint position (F′) of the user 19 immediately before display (time T2) and the corrected hologram pattern is displayed.
  • (Step S101) As shown in FIG. 5, in the first half step S100, the viewpoint position acquisition unit 110 first acquires the viewpoint position information of the user 19 at time T1, based on, for example, sensor information from a sensor such as an IMU provided in an HMD or the like worn on the head of the user 19 (step S101).
  • The acquired viewpoint position information may include a time stamp indicating the time T1 at which the viewpoint position information was acquired.
  • The acquired viewpoint position information is input to the rendering unit 120 and the viewpoint position change calculation unit 140.
  • (Step S102) Next, the rendering unit 120 renders the externally input three-dimensional image information 101 based on the viewpoint position (F) of the user 19 indicated by the viewpoint position information at time T1, generating RGB image data representing the texture of the object and a depth map representing the depth of the object (step S102).
  • FIG. 6 is a diagram showing a specific example for explaining step S102 of FIG. 5. In FIG. 6, using the simplified schematic shown in FIG. 2, a case is illustrated in which three-dimensional image information 101 for displaying a point object 18a located 500 mm (millimeters) in front of the viewpoint position (F) of the user 19 (for example, the position of the right eye) is input to the rendering unit 120. The spatial light modulator 15 is assumed to be arranged 100 mm in front of the viewpoint position (right eye) of the user 19.
  • FIG. 7A is a diagram showing an example of the RGB image data 18RGB when the hologram of the point object 18a shown in FIG. 6 is displayed to the user, and FIG. 7B is a diagram showing an example of the corresponding depth map 18D.
  • When displaying the hologram of the point object 18a located 500 mm from the viewpoint position (F) to the user 19, as shown in FIG. 7A, the rendering unit 120 gives pixels in the region of the point object 18a colored pixel values corresponding to the texture of the point object 18a, and gives black pixel values to pixels in the other regions, generating the RGB image data 18RGB. In other words, the rendering unit 120 sets regions where there is no image to project to black.
  • As shown in FIG. 7B, the rendering unit 120 gives pixels in the region of the point object 18a a depth value corresponding to 400 mm, the distance from the spatial light modulator 15 to the point object 18a, and gives the pixels in the other regions white pixel values indicating that the distance from the spatial light modulator 15 is infinite, generating the depth map 18D.
  • (Step S103) Next, the wavefront propagation unit 130 applies wavefront propagation to the RGB+D image data generated in step S102, generating the propagation signal of the hologram pattern at the position (A) where the spatial light modulator 15 is arranged (hereinafter also referred to as the SLM position) (step S103).
  • The arithmetic processing for generating the propagation signal of the hologram pattern depends on the type of the spatial light modulator 15, the propagation distance, and the encoding technique.
  • Here, a case where a phase-only liquid crystal on silicon (LCoS) device is used as the spatial light modulator 15 is exemplified. In this case, the wavefront propagation can be expressed by the Fresnel diffraction formula, and a dual-phase encoding algorithm can be used for phase-only encoding. The propagation formula and encoding algorithm according to this embodiment are described below.
  • The wavefront propagation unit 130 propagates the wavefront of each point represented in units of pixels in FIGS. 7A and 7B using the Fresnel diffraction formula shown in Equation (1) below.
  • In Equation (1), z indicates the distance from the display position (E) in the real space where each point is displayed to the SLM position (A) where the spatial light modulator 15 is arranged. In the example of FIG. 6, the distance (400 mm) from the display position (E) of the point object 18a to the SLM position (A) is set as the value of z.
  • The wavefront propagation unit 130 sets the value of z for every point and propagates each point using Equation (1); as shown in Equation (1), the values of all points are summed to generate the propagation signal of the hologram pattern.
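  • Equation (1) is reproduced only as an image in the published application and is lost in this text. A standard Fresnel point-source superposition consistent with the description above (a per-point distance z, contributions summed over all points) would be:

    $$u(x, y) = \sum_{j} \frac{a_j}{i \lambda z_j} \exp\!\left( \frac{i\pi}{\lambda z_j} \left[ (x - x_j)^2 + (y - y_j)^2 \right] \right)$$

    where (x_j, y_j) is the lateral position of point j taken from the RGB image, z_j its distance taken from the depth map, a_j its amplitude, and λ the wavelength of the light source. This is an assumed form, not the patent's verbatim equation.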
  • Other wave propagation functions capable of generating the hologram pattern at the SLM position (A) can also be used.
  • Through the processing of step S103, the wavefront propagation unit 130 generates a propagation signal representing the complex field that displays a hologram at the SLM position (A).
  • The generated propagation signal is input to the complex signal synthesizing unit 160.
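  • As a concrete illustration, the following is a minimal NumPy sketch of this per-point Fresnel propagation, using the assumed form of Equation (1) above. The wavelength (532 nm) and pixel pitch (3.74 µm) are the values quoted later in this embodiment; all function and variable names are ours, not the patent's.

```python
import numpy as np

WAVELENGTH = 532e-9  # light source wavelength quoted in the embodiment (532 nm)
PITCH = 3.74e-6      # SLM pixel pitch quoted in the embodiment (3.74 um)

def fresnel_point_propagation(points, slm_shape):
    """Sum Fresnel wavefronts of point sources at the SLM plane.

    points: iterable of (x, y, z, amplitude) tuples, where z is the distance
    from the point's display position to the SLM plane, taken from the depth
    map (e.g. 0.4 m for the point object 18a).
    """
    h, w = slm_shape
    ys, xs = np.mgrid[0:h, 0:w]
    # physical coordinates of the SLM pixels, centered on the optical axis
    x = (xs - w / 2) * PITCH
    y = (ys - h / 2) * PITCH
    field = np.zeros(slm_shape, dtype=np.complex128)
    for px, py, pz, amp in points:
        r2 = (x - px) ** 2 + (y - py) ** 2
        field += amp / (1j * WAVELENGTH * pz) * np.exp(1j * np.pi * r2 / (WAVELENGTH * pz))
    return field  # the "propagation signal": a complex field at the SLM position (A)

# Example: the single point object 18a on the optical axis, 400 mm from the SLM.
propagation_signal = fresnel_point_propagation([(0.0, 0.0, 0.4, 1.0)], (1080, 1920))
```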
  • (Step S201) Subsequently, in the second half step S200, similar to step S101, the viewpoint position acquisition unit 110 first acquires the viewpoint position information of the user 19 at time T2, based on, for example, sensor information from a sensor such as an IMU provided in the HMD or the like worn on the head of the user 19 (step S201). The acquired viewpoint position information is input to the viewpoint position change calculation unit 140.
  • (Step S202) Next, the viewpoint position change calculation unit 140 obtains the difference (viewpoint position change amount) between the viewpoint position (F) in the viewpoint position information at time T1 input in step S101 and the viewpoint position (F′) in the viewpoint position information at time T2 newly input in step S201 (step S202). Note that the viewpoint position information acquired in step S201 may also be input to the rendering unit 120 and used for rendering the three-dimensional image information 101 from the next frame onward.
  • FIG. 8 is a diagram for explaining a case where the user's viewpoint position changes between time T1 and time T2. In FIG. 8, the positional relationship between the viewpoint position (F) of the user 19, the spatial light modulator 15, and the display position (E) where the hologram 18 (here, the point object 18a shown in FIG. 6 and elsewhere) is displayed is the same as illustrated in FIGS. 3 and 6 to 7B.
  • In FIG. 8, the hologram pattern (point object 18a) displayed by the spatial light modulator 15 is calculated based on the viewpoint position (F) of the user 19 at time T1. If this pattern is displayed as-is after the viewpoint has translated rightward by, for example, a distance d from (F) to (F′), the point object 18a is displayed at a position (E′) shifted rightward by the distance d from the display position (E) where it should be displayed.
  • In FIG. 8, the arrows extending from the point object 18a indicate the optical paths from the point object 18a through the spatial light modulator 15 to the viewpoint positions (F) and (F′) of the user 19.
  • To correct this deviation of the hologram display position caused by movement of the viewpoint position (F) between rendering and display, the viewpoint position change calculation unit 140 acquires, as the viewpoint position change amount, the difference between the viewpoint position (F) of the user 19 in the viewpoint position information at time T1 and the viewpoint position (F′) of the user 19 in the viewpoint position information acquired by the viewpoint position acquisition unit 110 at time T2 immediately before display.
  • For example, in the case shown in FIG. 8, the viewpoint position change calculation unit 140 acquires the distance d (for example, 3 mm) between the viewpoint position (F) and the viewpoint position (F′) as the viewpoint position change amount.
  • Note that the viewpoint position change amount may be a vector from the viewpoint position (F) to the viewpoint position (F′).
  • (Step S203) Next, the correction signal generation unit 150 generates a correction signal for reducing the display position deviation caused by the change in viewpoint position, based on the viewpoint position change amount obtained in step S202 (step S203).
  • This embodiment illustrates a method of adding a linear phase signal to the propagation signal generated in step S103.
  • A linear phase signal is a correction signal for shifting an object displayed in the complex field in accordance with the change in viewpoint position, and can be expressed by, for example, Equation (2) below.
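  • Equation (2) is likewise reproduced only as an image in the published application. A linear phase ramp of the standard grating form would be consistent with the surrounding description:

    $$\phi_{\mathrm{corr}}(x) = \frac{2\pi}{\lambda}\, x \sin\theta$$

    where θ is the diffraction angle required to shift the displayed object, determined from the viewpoint position change amount and the distance from the SLM position (A) to the display position (E) (the embodiment quotes 0.86 degrees for the example below), and λ is the wavelength. This is an assumed form, not the patent's verbatim equation.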
  • For example, when the viewpoint position change amount is 3 mm and the distance from the SLM position (A) to the display position (E) is 400 mm, adding the linear phase signal obtained from Equation (2) to the propagation signal makes it possible to keep the display position of the point object 18a at the original display position (E) even though the viewpoint position has moved from (F) to (F′).
  • Specifically, assume that the incident angle of the optical path from the point object 18a to the spatial light modulator 15 in the initial state is zero degrees, that the wavelength of the light emitted from the light source 11 is 532 nm (nanometers), and that the pixel pitch of the spatial light modulator 15 is 3.74 μm (micrometers). Then, to rotate the optical path of light from the spatial light modulator 15 by 0.86 degrees, a linear phase of approximately 0.33128 radians (about 19 degrees) between adjacent pixels should be added to the complex field.
  • FIG. 9 is a diagram showing an example of a compensating phase pattern consisting of a linear phase in the x direction only.
  • In FIG. 9, the horizontal axis indicates position in units of pixels, and the vertical axis indicates phase.
  • As shown in FIG. 10, the linear phase shown in FIG. 9 can be wrapped into a phase pattern that repeats with a period of 0 to 2π.
  • FIG. 11 is a two-dimensional image of the linear compensating phase pattern, with the phase values in the range of 0 to 2π expressed in grayscale.
  • In FIG. 11, black represents the smallest phase value, and values closer to white represent larger phase values.
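  • A minimal sketch of how such a wrapped linear compensating phase pattern could be generated follows. The function name and grid size are ours; the phase step of 0.33128 rad is the embodiment's example value.

```python
import numpy as np

def linear_phase_pattern(shape, phase_step):
    """x-direction linear compensating phase ramp, wrapped to [0, 2*pi).

    phase_step: phase increment between adjacent pixels, in radians.
    """
    h, w = shape
    ramp = np.arange(w) * phase_step   # grows linearly along x (FIG. 9)
    ramp = np.mod(ramp, 2 * np.pi)     # wrapped 0..2*pi sawtooth (FIG. 10)
    return np.tile(ramp, (h, 1))       # constant along y; shown as grayscale in FIG. 11

pattern = linear_phase_pattern((1080, 1920), 0.33128)
```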
  • (Step S204) Next, the complex signal synthesizing unit 160 synthesizes the propagation signal generated in step S103 and the correction signal generated in step S203 to generate a complex signal of the hologram pattern (step S204). Equation (3) below, for example, is used to synthesize the propagation signal and the correction signal.
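  • Equation (3) is also reproduced only as an image in the source. Multiplying the complex propagation signal by the unit-magnitude correction phase is the natural composition consistent with the description:

    $$u'(x, y) = u(x, y)\, e^{\,i\,\phi_{\mathrm{corr}}(x, y)}$$

    where u is the propagation signal and φ_corr the correction signal; multiplying by a linear phase ramp tilts the wavefront and thereby shifts the reconstructed object laterally. This is an assumed form, not the patent's verbatim equation.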
  • (Step S205) Next, the interference fringe conversion unit 170 executes hologram pattern conversion processing (step S205).
  • In this processing, the complex field represented by the complex signal generated in step S204 is converted into a hologram-pattern phase signal that can be displayed on the spatial light modulator 15.
  • This hologram pattern conversion processing requires a different conversion technique for each type of spatial light modulator 15. If the spatial light modulator 15 is a phase-only SLM, as exemplified in this description, the complex field must be converted into a phase-only field for the hologram to be displayed on the spatial light modulator 15. However, the conversion is not limited to this, and various other encoding methods may be used.
  • Pa and Pb in Equation (4) are arranged alternately in a checkerboard pattern to form a phase-only signal (phase signal) for displaying the hologram pattern on the spatial light modulator 15.
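  • Equation (4) is again an image in the source. The dual-phase decomposition it refers to is standard: writing the normalized complex field as A·e^{iφ} with A ≤ 1, the two phases

    $$P_a = \phi + \arccos A, \qquad P_b = \phi - \arccos A$$

    satisfy (e^{iP_a} + e^{iP_b})/2 = A·e^{iφ}, so interleaving Pa and Pb in a checkerboard approximates the complex field with phase-only pixels. A minimal sketch under this assumption (names ours):

```python
import numpy as np

def dual_phase_encode(field):
    """Checkerboard dual-phase encoding of a complex field into a phase-only signal."""
    amp = np.abs(field)
    amp = amp / amp.max()                  # normalize so arccos is defined
    phase = np.angle(field)
    pa = phase + np.arccos(amp)
    pb = phase - np.arccos(amp)
    out = pa.copy()
    checker = (np.indices(field.shape).sum(axis=0) % 2).astype(bool)
    out[checker] = pb[checker]             # alternate Pa and Pb in a checkerboard
    return np.mod(out, 2 * np.pi)          # phase signal for the phase-only SLM
```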
  • (Step S206) The phase signal of the hologram pattern generated in step S205 is input to the output device 180, which controls the light source 11. By controlling the light source 11 based on the input phase signal, the light for displaying the hologram pattern is irradiated onto the modulation surface of the spatial light modulator 15, and the hologram at the display position (E) is displayed to the user 19 (step S206).
  • (Step S300) Thereafter, the control unit that controls the overall operation of the hologram display system determines whether or not to end this operation (step S300). When ending (YES in step S300), this operation is terminated. When not ending (NO in step S300), the control unit returns to step S101 and executes the subsequent operations.
  • As described above, in this embodiment, the hologram pattern to be displayed is corrected based on the viewpoint position (F′) immediately before display, so the display position deviation caused by movement of the viewpoint can be reduced. As a result, the sense of discomfort the user feels due to the positional deviation between the real space and the object superimposed on it is reduced, and deterioration of the user's viewing experience can be suppressed.
  • (Modification 1) In the above description, the first half step S100, which generates the propagation signal of the hologram pattern based on the viewpoint position (F) of the user 19 at a certain time T1, and the second half step S200, which corrects the hologram pattern based on the viewpoint position (F′) of the user 19 at time T2 immediately before display and then displays it, were executed in sequence. In Modification 1, the two steps are looped asynchronously.
  • FIG. 13 is a timing chart for explaining the operation flow according to Modification 1.
  • As shown in FIG. 13, the viewpoint position acquisition unit 110 generates viewpoint position information at a predetermined cycle based on sensor information sequentially input from the sensors provided in the HMD or the like (steps S101, S201).
  • The generated viewpoint position information is sequentially input to the rendering unit 120 and the viewpoint position change calculation unit 140.
  • In Modification 1, the first half step S100, that is, steps S101 to S103 in FIG. 5, is looped asynchronously with the second half step S200.
  • The cycle at which the viewpoint position acquisition unit 110 outputs the viewpoint position information may be shorter than the cycles at which the first half step S100 and the second half step S200 are looped.
  • In the second half step S200, the following sequence is repeated: a correction signal is generated based on the acquired viewpoint position change amount (S203); the propagation signal is corrected using the generated correction signal (S204); the complex signal produced by this correction is converted into a phase signal (S205) and output, so that the hologram pattern is displayed on the spatial light modulator 15 (S206).
  • In general, the wavefront propagation process of step S103 requires far more processing time than the other processes. Therefore, when displaying a hologram to the user 19 at, for example, time T3, the propagation signal used to generate the hologram pattern is based on a rendered image rendered with the viewpoint position information at time T1. If this propagation signal were not corrected based on the viewpoint position information at time T2 immediately before display, a very large display position shift could occur. Since the user 19 is sensitive to such display position deviations, the viewing experience could be significantly degraded without correction.
  • In Modification 1, the correction signal generation unit 150 generates the correction signal asynchronously with the generation of the propagation signal by the wavefront propagation unit 130, so a hologram based on the latest viewpoint position can always be displayed to the user 19 and the display position deviation can be reduced. As a result, deterioration of the user's viewing experience can be suppressed.
  • In addition, since the propagation signal generated in one pass of the first half step S100 can be corrected based on each of two or more viewpoint positions at different times, the effective frame rate of the hologram displayed to the user 19 can be increased. A structural sketch of this two-loop arrangement follows.
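  • The following is a minimal sketch of the asynchronous two-loop structure described above. All function names, the shared-state layout, and the placeholder bodies are ours, used only to make the loop structure concrete; they stand in for the embodiment's processing blocks.

```python
import threading
import numpy as np

# Placeholder stand-ins for the embodiment's processing blocks (names are ours).
def get_viewpoint():                  # S101/S201: viewpoint position acquisition unit 110
    return np.zeros(3)

def render_and_propagate(viewpoint):  # S102 + S103: rendering + wavefront propagation
    return np.ones((1080, 1920), dtype=np.complex128)

def correction_signal(delta):         # S203: correction signal generation unit 150
    return np.ones((1080, 1920), dtype=np.complex128)

def encode_and_display(field):        # S204-S206: synthesis, fringe conversion, output
    pass

state = {"prop": None, "vp": None}    # newest propagation signal and its render viewpoint
lock = threading.Lock()

def first_half_loop():                # S100: slow loop, dominated by wavefront propagation
    while True:
        vp = get_viewpoint()
        prop = render_and_propagate(vp)
        with lock:
            state["prop"], state["vp"] = prop, vp

def second_half_loop():               # S200: fast loop, corrects the newest propagation signal
    while True:
        vp_now = get_viewpoint()
        with lock:
            prop, vp_render = state["prop"], state["vp"]
        if prop is None:
            continue
        delta = vp_now - vp_render    # S202: viewpoint position change amount
        encode_and_display(prop * correction_signal(delta))

threading.Thread(target=first_half_loop, daemon=True).start()
threading.Thread(target=second_half_loop, daemon=True).start()
```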
  • (Modification 2) Modification 2 concerns the viewpoint position acquisition unit 110.
  • In the first embodiment, the case where the viewpoint position information of the user 19 is acquired based on sensor information from sensors such as an IMU provided in the HMD worn on the head of the user 19 and a camera that captures images of the eyes and face of the user 19 was illustrated.
  • In Modification 2, a case where the viewpoint position (head and eye positions) of the user 19 is estimated from the position and posture of the HMD worn on the head of the user 19 is illustrated.
  • The sensors include, for example, an internal sensor such as an IMU (for example, a sensor used for SLAM), an external camera, and a ToF (Time of Flight) sensor.
  • The IMU is capable of sensing at a high sampling frequency of 1000 Hz or higher with a low latency of several milliseconds or less, so the position and posture of the HMD can be acquired with low latency and at high frequency.
  • Since the viewpoint position (head and eye positions) of the user 19 can be estimated from the position and posture of the HMD, using the sensor information from the IMU to estimate the viewpoint position of the user 19 makes it possible to obtain an accurate viewpoint position with low latency and at high frequency. Acquiring an accurate viewpoint position with low latency and at high frequency in this way contributes to reducing side effects such as residual displacement of the display position in the operation illustrated in FIG. 13.
  • (Modification 3) Since the hologram pattern is displayed after the viewpoint position information of the viewpoint position (F′) immediately before display is acquired during the second half step S200, it is effective to shorten the time until display. Therefore, in Modification 3, a lookup table (LUT) is used in the correction signal generation process (step S203), thereby shortening the processing time of the entire second half step S200.
  • The LUT stores compensating phase patterns calculated in advance for the required diffraction angles θ given by Equation (2) above.
  • The number of compensating phase patterns stored in the LUT depends on how finely the diffraction angle θ is quantized. For example, when the diffraction angle θ from 0.1 to 3.0 degrees is quantized in increments of 0.1 degrees, the LUT stores 30 compensating phase patterns.
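  • A minimal sketch of such a LUT, reusing the linear_phase_pattern helper sketched earlier and assuming the standard grating relation between the per-pixel phase step and the diffraction angle (all names and the exact relation are our assumptions, not the patent's):

```python
import numpy as np

ANGLES_DEG = np.arange(0.1, 3.01, 0.1)   # 30 quantized diffraction angles

def build_lut(shape, wavelength=532e-9, pitch=3.74e-6):
    """Precompute one wrapped x-direction phase ramp per quantized angle."""
    lut = {}
    for deg in ANGLES_DEG:
        # grating relation: per-pixel phase step for a deflection angle theta
        step = 2 * np.pi * pitch * np.sin(np.radians(deg)) / wavelength
        lut[round(float(deg), 1)] = linear_phase_pattern(shape, step)
    return lut

def lookup_correction(lut, angle_deg):
    """Snap the required angle to the nearest stored pattern."""
    key = round(min(max(angle_deg, 0.1), 3.0), 1)
    return lut[key]
```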
  • FIG. 14 shows an example of a pre-computed compensating phase pattern in one-dimensional representation.
  • FIG. 15 shows a two-dimensional image obtained by two-dimensionally expressing the compensating phase pattern shown in FIG. 14.
  • The other compensating phase patterns in the LUT can be generated by the same simple method.
  • (Modification 4) The compensation phase obtained from the LUT illustrated in Modification 3 may be further corrected according to actual use. For example, if the viewpoint of the user 19 moves not only horizontally (x direction) but also vertically (y direction), the compensation phase (correction signal) may be corrected so as to shift the object displayed as a hologram two-dimensionally.
  • In that case, by applying a compensation phase pattern that shifts the display position in the x direction and the y direction, the display position (E′) of the point object 18a can be returned in both directions to the original display position (E).
  • A compensation phase pattern that shifts the display position in both the x direction and the y direction can be obtained by combining an x-direction compensation phase pattern and a y-direction compensation phase pattern.
  • As the compensating phase pattern in the x direction, the compensating phase pattern illustrated in FIG. 11 can be used.
  • The compensating phase pattern in the y direction is as shown in FIG. 16. It may be obtained, for example, by rotating the x-direction compensating phase pattern 90 degrees to the right, scaling it by m/n in the x direction so that its resolution becomes (m : n), and cropping an (m : n) region from the result.
  • In the first embodiment, the propagation signal generated by the wavefront propagation unit 130 and the correction signal generated by the correction signal generation unit 150 are combined in the complex signal synthesizing unit 160, and the resulting complex signal is converted into a phase signal in the interference fringe conversion unit 170.
  • In the second embodiment, by contrast, the propagation signal is converted into a phase signal first, and the correction signal is then synthesized with that phase signal.
  • FIG. 18 is a block diagram showing a schematic configuration example of a hologram display system according to this embodiment.
  • As shown in FIG. 18, the hologram display system 2 according to the present embodiment has a configuration similar to that of the hologram display system 1 described in the first embodiment with reference to FIG. 4, except that the complex signal synthesizing unit 160 is replaced with a phase signal synthesizing unit 260, and the interference fringe conversion unit 170 is moved between the wavefront propagation unit 130 and the phase signal synthesizing unit 260.
  • In the second embodiment, the interference fringe conversion unit 170 converts the propagation signal of the hologram pattern generated by the wavefront propagation unit 130 into a phase signal, and inputs the phase signal to the phase signal synthesizing unit 260.
  • The phase signal synthesizing unit 260 synthesizes the phase signal input from the interference fringe conversion unit 170 and the correction signal generated by the correction signal generation unit 150 to generate the phase signal for displaying the hologram pattern on the spatial light modulator 15, and inputs it to the output device 180.
  • The interference fringe conversion unit 170 and the phase signal synthesizing unit 260 can correspond to, for example, an example of the phase signal generation unit in the scope of the claims.
  • The viewpoint position change calculation unit 140 and the correction signal generation unit 150 can correspond to, for example, an example of the correction unit in the claims.
  • FIG. 19 is a flowchart showing an operation example of the hologram display system according to this embodiment.
  • As shown in FIG. 19, the operation of the hologram display system 2 is similar to the operation described with reference to FIG. 5 in the first embodiment, and consists of a first half step of generating a phase signal of the hologram pattern based on the viewpoint position (F) of the user 19 at a certain time T1, and a second half step S500 of correcting the hologram pattern based on the viewpoint position (F′) of the user 19 at time T2 immediately before display and displaying the corrected hologram pattern.
  • (Step S401) In the first half step, the interference fringe conversion unit 170 performs hologram pattern conversion processing on the propagation signal of the hologram pattern generated by the wavefront propagation unit 130 (step S401).
  • The content of the hologram pattern conversion processing may be the same as the processing shown in step S205 of FIG. 5 in the first embodiment.
  • (Step S501) Next, the phase signal synthesizing unit 260 synthesizes the phase signal generated in step S401 and the correction signal generated in step S203 to generate the phase signal of the hologram pattern (step S501).
  • Equation (6) below, for example, is used to synthesize the phase signal and the correction signal.
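  • Equation (6) is reproduced only as an image in the source. Since both inputs are now phase signals, the composition consistent with the complex product of Equation (3) is an addition of phases modulo 2π:

    $$\phi_1(x, y) = \left(\phi_2(x, y) + \phi_{\mathrm{corr}}(x, y)\right) \bmod 2\pi$$

    where φ₂ is the phase signal produced by the interference fringe conversion and φ_corr is the correction signal. This is an assumed form, not the patent's verbatim equation.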
  • (Step S206) The phase signal of the hologram pattern generated in step S501 is input to the output device 180, and the output device 180 controls the light source 11 based on the input phase signal.
  • As a result, the modulation surface of the spatial light modulator 15 is irradiated with light for displaying the hologram pattern, and the hologram at the display position (E) is displayed to the user 19 (step S206).
  • (Step S300) Thereafter, the control unit determines whether or not to end this operation (step S300). When ending (YES in step S300), this operation is terminated. When not ending (NO in step S300), the control unit returns to step S101 and executes the subsequent operations.
  • In the above embodiments, an AR device using an optical see-through HMD was exemplified as the hologram display system, but the present technology is not limited to this. It can be applied to various AR devices that can superimpose and display a hologram on the real world, such as an AR device using a naked-eye stereoscopic display.
  • For example, an AR HUD (head-up display) system using CGH can be realized as an application example of the hologram display system described with reference to FIG. 1 in the first embodiment. Therefore, in the third embodiment, an AR HUD system using CGH is described as an example.
  • FIG. 20 is a schematic diagram for explaining the outline of the AR HUD system using the CGH according to this embodiment.
  • The principle by which the spatial light modulator 15 forms the hologram 18 may be the same as in the first embodiment.
  • In this embodiment, a transmissive SLM is used as the spatial light modulator 15.
  • As shown in FIG. 20, the light emitted from the light source 11 is magnified by the magnifying optical system composed of the lenses 12 and 13, and then transmitted through the spatial light modulator 15, forming the hologram 18 at the display position (E).
  • The hologram 18 formed at the display position (E) is reflected by the combiner 30 and viewed by the user 19 at the viewpoint position (F) as a hologram 18A at the display position (G).
  • As the combiner 30, for example, a hologram combiner that reflects the light from the light source 11 while transmitting light in the visible range may be used.
  • As described above, the technology according to the present disclosure is not limited to HMD-type AR devices, and can be applied to various AR devices that can display holograms superimposed on the real world.
  • FIG. 21 is a hardware configuration diagram showing an example of a computer 1000 that implements the functions of the information processing apparatus according to the above-described embodiments.
  • As shown in FIG. 21, the computer 1000 has a CPU 1100, a RAM 1200, a ROM (Read Only Memory) 1300, an HDD (Hard Disk Drive) 1400, a communication interface 1500, and an input/output interface 1600. The parts of the computer 1000 are connected by a bus 1050.
  • The CPU 1100 operates based on programs stored in the ROM 1300 or the HDD 1400 and controls each part. For example, the CPU 1100 loads programs stored in the ROM 1300 or the HDD 1400 into the RAM 1200 and executes processing corresponding to the various programs.
  • The ROM 1300 stores a boot program such as the BIOS (Basic Input Output System) executed by the CPU 1100 when the computer 1000 is started, as well as programs dependent on the hardware of the computer 1000.
  • The HDD 1400 is a computer-readable recording medium that non-transitorily records programs executed by the CPU 1100 and data used by such programs. Specifically, the HDD 1400 is a recording medium that records the program according to the present disclosure, which is an example of the program data 1450.
  • The communication interface 1500 is an interface for connecting the computer 1000 to an external network 1550 (for example, the Internet). For example, the CPU 1100 receives data from other devices and transmits data generated by the CPU 1100 to other devices via the communication interface 1500.
  • The input/output interface 1600 is an interface for connecting an input/output device 1650 to the computer 1000. For example, the CPU 1100 receives data from input devices such as a keyboard and a mouse via the input/output interface 1600, and transmits data to output devices such as a display, a speaker, or a printer.
  • The input/output interface 1600 may also function as a media interface for reading a program or the like recorded on a predetermined recording medium.
  • Media include, for example, optical recording media such as DVDs (Digital Versatile Discs) and PDs (Phase change rewritable Disks), magneto-optical recording media such as MOs (Magneto-Optical disks), tape media, magnetic recording media, and semiconductor memories.
  • For example, when the computer 1000 functions as the information processing apparatus according to the above embodiments, the CPU 1100 of the computer 1000 executes a program loaded on the RAM 1200 to realize at least some of the functions of the viewpoint position acquisition unit 110, the rendering unit 120, the wavefront propagation unit 130, the viewpoint position change calculation unit 140, the correction signal generation unit 150, the complex signal synthesizing unit 160 or the phase signal synthesizing unit 260, the interference fringe conversion unit 170, and the output device 180.
  • The HDD 1400 also stores the program and the like according to the present disclosure. The CPU 1100 reads the program data 1450 from the HDD 1400 and executes it, but as another example, these programs may be obtained from another device via the external network 1550.
  • The present technology can also take the following configurations.
  • (1) An information processing apparatus comprising: a viewpoint position acquisition unit that acquires a user's viewpoint position; a rendering unit that generates a rendered image based on the viewpoint position; a wavefront propagation unit that generates a propagation signal representing a hologram based on the rendered image; a phase signal generation unit that generates a first phase signal for displaying the hologram based on the propagation signal; and a correction unit that corrects the displayed hologram based on the user's current viewpoint position acquired by the viewpoint position acquisition unit.
  • (2) The information processing device according to (1) above, wherein the correction unit generates a correction signal for correcting the propagation signal based on the user's current viewpoint position acquired by the viewpoint position acquisition unit, and the phase signal generation unit includes a complex signal synthesizing unit that generates a complex signal based on the propagation signal and the correction signal, and an interference fringe conversion unit that converts the complex signal into the first phase signal.
  • (3) The information processing device according to (1) above, wherein the phase signal generation unit includes an interference fringe conversion unit that converts the propagation signal into a second phase signal, the correction unit generates a correction signal for correcting the second phase signal based on the user's current viewpoint position acquired by the viewpoint position acquisition unit, and the phase signal generation unit further includes a phase signal synthesizing unit that generates the first phase signal based on the second phase signal and the correction signal.
  • The information processing device according to any one of the above, wherein the correction unit generates the correction signal by referring to a lookup table prepared in advance, based on the user's current viewpoint position.
  • The information processing device according to the above, wherein the correction unit corrects the correction signal obtained from the lookup table prepared in advance, based on the user's current viewpoint position.
  • The information processing device according to any one of the above, wherein the correction signal has a linear-phase compensating phase pattern.
  • The information processing device according to the above, wherein the compensating phase pattern repeats with a period of 0 to 2π.
  • The information processing device according to any one of the above, wherein the correction signal includes a horizontal compensating phase pattern and a vertical compensating phase pattern.
  • The information processing apparatus according to any one of (2) to (9) above, wherein the correction signal has a compensating phase pattern obtained by synthesizing a horizontal compensating phase pattern and a vertical compensating phase pattern.
  • The information processing device according to any one of the above, wherein the correction unit generates the correction signal asynchronously with the generation of the propagation signal by the wavefront propagation unit.
  • The information processing apparatus according to any one of (1) to (12) above, further comprising an output unit that controls a light source that outputs light to a spatial light modulator based on the first phase signal.
  • The information processing device according to the above, wherein the spatial light modulator is a phase-only spatial light modulator or an amplitude-only spatial light modulator.
  • An information processing method comprising: acquiring a user's viewpoint position; generating a rendered image based on the viewpoint position; generating a propagation signal representing a hologram based on the rendered image; generating a first phase signal for displaying the hologram based on the propagation signal; acquiring the user's current viewpoint position; and correcting the displayed hologram based on the acquired current viewpoint position.
  • (19) A recording medium recording a program that causes a processor, mounted in a system for displaying a hologram by irradiating a spatial light modulator with light, to execute: a process of acquiring a user's viewpoint position; a process of generating a rendered image based on the viewpoint position; a process of generating a propagation signal representing a hologram based on the rendered image; a process of generating a first phase signal for displaying the hologram based on the propagation signal; a process of acquiring the user's current viewpoint position; and a process of correcting the displayed hologram based on the acquired current viewpoint position.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Holography (AREA)

Abstract

The present disclosure suppresses deterioration of the user's viewing experience. An information processing device according to the present disclosure comprises a viewpoint position acquisition unit (110) that acquires the viewpoint position of a user, a rendering unit (120) that generates a rendered image based on the viewpoint position, a wavefront propagation unit (130) that generates a propagation signal representing a hologram based on the rendered image, a phase signal generation unit (160, 170) that generates a first phase signal for displaying the hologram based on the propagation signal, and a correction unit (140, 150) that corrects the displayed hologram based on the current viewpoint position of the user acquired by the viewpoint position acquisition unit.
PCT/JP2022/008177 2021-05-31 2022-02-28 Information processing device, information processing method, and program recording medium WO2022254833A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021091158 2021-05-31
JP2021-091158 2021-05-31

Publications (1)

Publication Number Publication Date
WO2022254833A1 true WO2022254833A1 (fr) 2022-12-08

Family

ID=84324113

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/008177 WO2022254833A1 (fr) 2021-05-31 2022-02-28 Information processing device, information processing method, and program recording medium

Country Status (1)

Country Link
WO (1) WO2022254833A1 (fr)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005516258A * 2002-01-30 2005-06-02 XYZ Imaging Inc. Method for writing composite one-step holograms
JP2009544036A * 2006-07-18 2009-12-10 Celloptic, Inc. System, apparatus and method for extracting three-dimensional information of an object from received electromagnetic radiation
JP2010528325A * 2007-05-16 2010-08-19 Seereal Technologies S.A. Method for generating video holograms in real time for enhancing a 3D rendering graphics pipeline
US20190310585A1 * 2013-06-06 2019-10-10 Seereal Technologies S.A. Device and method for calculating holographic data
JP2020516933A * 2017-04-07 2020-06-11 Dualitas Ltd. Holographic projector
JP2020533629A * 2017-09-08 2020-11-19 Dualitas Ltd. Holographic projector

Similar Documents

Publication Publication Date Title
Maimone et al. Holographic near-eye displays for virtual and augmented reality
Park et al. Holographic techniques for augmented reality and virtual reality near-eye displays
US10481554B2 Near-eye device
Padmanaban et al. Holographic near-eye displays based on overlap-add stereograms
KR101620852B1 Holographic image projection with holographic correction
JP5265546B2 Method for generating video holograms in real time using sub-holograms
EP1287400B8 Reduction of computation time in three-dimensional displays
JP7486822B2 Holographic head-up display device
JP5468537B2 Method for generating video holograms in real time for enhancing a 3D rendering graphics pipeline
US10330936B2 Focal surface display
JP5266223B2 Method for generating computer video holograms in real time using propagation
US10845761B2 Reduced bandwidth holographic near-eye display
JP2008541159A Projection device and method for holographic reconstruction of scenes
JP2008541159A5 (fr)
JP7227095B2 Hologram generation device and hologram generation method
Kang et al. Color holographic wavefront printing technique for realistic representation
Akşit et al. Holobeam: Paper-thin near-eye displays
JP2004516498A Improved 3D display
WO2022254833A1 Information processing device, information processing method, and program recording medium
JP2009540353A Method for reducing the effective pixel pitch in an electroholographic display, and electroholographic display including a reduced effective pixel pitch
US20220342366A1 Holographic display apparatus including freeform curved surface and operating method thereof
CN109946943B Holographic display system for suppressing speckle noise based on beam shaping
US10168668B2 Method of forming a rarefied hologram for video imaging and 3D lithography
JP6607491B2 Hologram data generation device and program therefor
WO2019062306A1 Holographic display method and device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22815591

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22815591

Country of ref document: EP

Kind code of ref document: A1