CN116996656A - VR device control method and device, electronic device and storage medium - Google Patents


Info

Publication number: CN116996656A (application CN202311207102.6A)
Authority: CN (China)
Prior art keywords: distance, remote, data transmission, cameras, transmission connection
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application number: CN202311207102.6A
Other languages: Chinese (zh)
Inventor: 梁洪尧 (Liang Hongyao)
Current Assignee: Luxshare Precision Technology Nanjing Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original Assignee: Luxshare Precision Technology Nanjing Co Ltd
Priority date: (the priority date is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed)
Application filed by Luxshare Precision Technology Nanjing Co Ltd
Priority to CN202311207102.6A
Publication of CN116996656A
Legal status: Pending


Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10: Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/194: Transmission of image signals
    • H04N 13/106: Processing image signals
    • H04N 13/111: Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
    • H04N 13/117: Transformation of image signals corresponding to virtual viewpoints, the virtual viewpoint locations being selected by the viewers or determined by viewer tracking
    • H04N 13/20: Image signal generators
    • H04N 13/204: Image signal generators using stereoscopic image cameras
    • H04N 13/239: Image signal generators using two 2D image sensors having a relative position equal to or related to the interocular distance
    • H04N 13/246: Calibration of cameras
    • H04N 13/296: Synchronisation thereof; Control thereof

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The embodiment of the invention discloses a VR device control method and apparatus, an electronic device, and a storage medium. The method comprises the following steps: in response to a remote connection instruction triggered by a first user, establishing a data transmission connection with a remote shooting end; acquiring a first pupil distance parameter of the first user based on the data transmission connection; sending the first pupil distance parameter to the remote shooting end based on the data transmission connection, so that the remote shooting end adjusts the distance between its two cameras to a target distance matched with the first pupil distance parameter; and, based on the data transmission connection, receiving and displaying an image shot by the remote shooting end at the target distance. With this method, the VR terminal can automatically determine the distance between the two cameras of the remote shooting end according to the interpupillary distance of the human eyes, so that the remote shooting end can focus more accurately according to the interpupillary distance, improving the display definition and anti-distortion effect of the device and thus the user's experience of using the VR device.

Description

VR device control method and device, electronic device and storage medium
Technical Field
The embodiment of the invention relates to the technical field of virtual reality, and in particular to a VR device control method and apparatus, an electronic device, and a storage medium.
Background
In recent years, AR/VR technology has been applied more and more widely in everyday life, and VR equipment has moved from its originally high price to today's consumer-level price. A VR device using a 360-degree camera can control the combination of the images for the two eyes well. However, VR devices using 180-degree cameras are difficult to implement in this respect and may suffer from ghosting and loss of focus.
Conventional VR devices generally require the user to manually adjust the distance between the two cameras, with the user judging from the visual effect whether the distance has been adjusted to a proper position. With 180-degree cameras in particular, ghost images easily appear under this adjustment mode, resulting in poor display definition and anti-distortion effects of the device and affecting the user's experience of using the VR device.
Disclosure of Invention
The embodiment of the invention provides a VR device control method and apparatus, an electronic device, and a storage medium, which can quickly, accurately, and automatically determine the distance between the two cameras at the remote shooting end according to the interpupillary distance of the human eyes, so that the remote shooting end can focus more accurately according to the interpupillary distance, improving the display definition and anti-distortion effect of the equipment and further improving the user's experience of using the VR equipment.
In a first aspect, an embodiment of the present invention provides a VR device control method, applied to a Virtual Reality (VR) terminal, where the method includes:
responding to a remote connection instruction triggered by a first user, and establishing data transmission connection with a remote shooting end; the remote shooting end comprises two cameras, and an included angle between the two cameras is 180 degrees;
acquiring a first pupil distance parameter of the first user based on the data transmission connection;
based on the data transmission connection, the first pupil distance parameter is sent to the remote shooting end, so that the remote shooting end adjusts the distance between the two cameras to be a target distance matched with the first pupil distance parameter;
and receiving and displaying, based on the data transmission connection, the image shot by the remote shooting end at the target distance.
In a second aspect, an embodiment of the present invention provides a VR device control method, which is applied to a remote capturing end, where the remote capturing end includes two cameras, and the method includes:
responding to a remote connection instruction sent by a VR terminal, and establishing data transmission connection with the VR terminal;
based on the data transmission connection, receiving a first pupil distance parameter sent by the VR terminal, and determining a target distance matched with the first pupil distance parameter;
adjusting the distance between the two cameras to be the target distance;
and shooting images through the two adjusted cameras, and sending the images to the VR terminal based on the data transmission connection, so that the VR terminal displays the images.
In a third aspect, an embodiment of the present invention provides a VR device control apparatus, applied to a VR terminal, where the apparatus includes:
the first connection module is used for responding to a remote connection instruction triggered by a first user and establishing data transmission connection with a remote shooting end; the remote shooting end comprises two cameras, and an included angle between the two cameras is 180 degrees;
the parameter acquisition module is used for acquiring a first pupil distance parameter of the first user based on the data transmission connection;
the parameter sending module is used for sending the first pupil distance parameter to the remote shooting end based on the data transmission connection, so that the remote shooting end adjusts the distance between the two cameras to be a target distance matched with the first pupil distance parameter;
and the image display module is used for receiving and displaying, based on the data transmission connection, the image obtained by the remote shooting end shooting based on the target distance.
In a fourth aspect, an embodiment of the present invention provides a VR device control apparatus, which is applied to a remote capturing end, where the remote capturing end includes two cameras, and the apparatus includes:
the second connection module is used for responding to a remote connection instruction sent by the VR terminal and establishing data transmission connection with the VR terminal;
the distance determining module is used for receiving a first pupil distance parameter sent by the VR terminal based on the data transmission connection and determining a target distance matched with the first pupil distance parameter;
the distance adjusting module is used for adjusting the distance between the two cameras to be the target distance;
and the image shooting module is used for shooting images through the two adjusted cameras, and sending the images to the VR terminal based on the data transmission connection so that the VR terminal can display the images.
In a fifth aspect, an embodiment of the present invention further provides an electronic device, where the electronic device includes a memory, a processor, and a computer program stored in the memory and capable of running on the processor, and when the processor executes the program, the VR device control method according to any one of the embodiments of the present invention is implemented.
In a sixth aspect, embodiments of the present invention further provide a computer readable storage medium having stored thereon a computer program that, when executed by a processor, implements a VR device control method as set forth in any one of the embodiments of the present invention.
In the embodiment of the invention, in response to a remote connection instruction triggered by the first user, a data transmission connection can be established with the remote shooting end, wherein the remote shooting end comprises two cameras and the included angle between the two cameras is 180 degrees; a first pupil distance parameter of the first user is acquired based on the data transmission connection; the first pupil distance parameter is sent to the remote shooting end based on the data transmission connection, so that the remote shooting end adjusts the distance between the two cameras to a target distance matched with the first pupil distance parameter; and an image shot by the remote shooting end at the target distance is received and displayed based on the data transmission connection. In the embodiment of the invention, the VR terminal can quickly and accurately determine the distance between the two cameras of the remote shooting end according to the interpupillary distance of the human eyes, so that the remote shooting end can focus more accurately according to the interpupillary distance, improving the display definition and anti-distortion effect of the device and further improving the user's experience of using the VR device.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present invention and therefore should not be considered limiting of the scope; a person skilled in the art may obtain other related drawings from these drawings without inventive effort.
Fig. 1 is a flowchart of a VR device control method applied to a VR terminal according to an embodiment of the present invention;
fig. 2 is another flowchart of a VR device control method applied to a VR terminal according to an embodiment of the present invention;
fig. 3 is a flowchart of a VR device control method applied to a remote shooting end according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of a shooting end according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of a VR device control apparatus applied to a VR terminal according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of a VR device control apparatus applied to a remote shooting end according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The invention is described in further detail below with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting thereof. It should be further noted that, for convenience of description, only some, but not all of the structures related to the present invention are shown in the drawings.
Fig. 1 is a flowchart of a VR device control method applied to a VR terminal. This embodiment of the present invention is applicable to remote live connection, in which the picture shooting mode of the live broadcast end is matched to human visual habits, so that the accuracy of the picture presented by the VR end matches the accuracy of the real scene shot by the live broadcast end, for example in a remote medical scene. The following embodiments are described taking as an example the integration of the apparatus in an electronic device. Referring to fig. 1, the method may specifically include the following steps:
Step 101, responding to a remote connection instruction triggered by a first user, and establishing a data transmission connection with a remote shooting end.
The VR equipment comprises a VR terminal and a remote shooting end, wherein the remote shooting end is used for capturing a real-world scene. In this scheme, the remote shooting end includes two cameras, and the included angle between the two cameras is 180 degrees. The remote connection instruction is an instruction for instructing the VR terminal to establish a data transmission connection with the remote shooting end. The data transmission connection modes include wired connection and wireless connection; in this scheme, a wireless network or a dedicated wireless transmission technology (such as Wi-Fi or 5G) is used to establish the data transmission connection between the remote shooting end and the VR terminal.
Specifically, when using the VR device, the user may send the connection instruction to the remote shooting end through a power-on button on the VR terminal. Alternatively, the remote connection instruction may be sent to the VR terminal through a terminal (mobile phone, tablet computer, etc.) of the first user connected to the VR terminal. After receiving the remote connection instruction, the VR terminal responds to it and sends a data transmission connection request to the remote shooting end; after receiving the request, the remote shooting end establishes the data transmission connection with the VR terminal.
Further, after the VR terminal and the remote shooting end establish the data transmission connection, the network delay of data transmission between them can be obtained in real time. If, within a preset time period, the network delay between the VR terminal and the remote shooting end remains greater than a preset delay, the network connection between them is unstable. In this case, the VR terminal may prompt the first user, through a voice broadcast device or the like, that the network connection between the VR terminal and the remote shooting end is unstable, so that the first user can adjust the network between the VR terminal and the remote shooting end in time.
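As an illustrative sketch of this delay check (the patent does not specify the sampling scheme; the window length, threshold, and the `ping_rtt_ms` callable are assumptions for this sketch):

```python
import time

def connection_unstable(ping_rtt_ms, window_s=10.0, max_delay_ms=100.0, interval_s=1.0):
    """Report whether the link delay stayed above the threshold for a whole window.

    ping_rtt_ms: hypothetical callable returning the current network delay in ms.
    window_s / max_delay_ms mirror the patent's "preset time period" and
    "preset time delay", with assumed values.
    """
    deadline = time.monotonic() + window_s
    while time.monotonic() < deadline:
        if ping_rtt_ms() <= max_delay_ms:
            return False  # delay dropped below the threshold: connection OK
        time.sleep(interval_s)
    return True  # delay exceeded the threshold throughout: prompt the user
```

A VR terminal would call this periodically and, on a `True` result, trigger the voice prompt described above.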
Step 102, acquiring a first pupil distance parameter of a first user based on data transmission connection.
Wherein the pupil distance parameter is the distance between the pupils of the first user's two eyes. In an alternative embodiment, the VR terminal is provided with a measuring device, such as a sensor or a camera, for measuring the user's pupil distance. After the VR terminal and the remote shooting end are connected, the user can be prompted to aim the eyes at the camera for a pupil distance measurement, so as to obtain the pupil distance data of the first user. Alternatively, the VR terminal can acquire a human eye image of the user through the camera and perform feature extraction on it to obtain the left eye center point position and the right eye center point position in the image; the pupil distance parameter of the first user is then determined based on the left eye center point position and the right eye center point position.
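The center-point computation can be sketched as follows (the `mm_per_pixel` calibration factor and the coordinate format are assumptions; a real system would obtain the scale from camera calibration or a depth sensor):

```python
import math

def interpupillary_distance_mm(left_center, right_center, mm_per_pixel):
    """Estimate the pupil distance from detected eye-center pixel coordinates.

    left_center / right_center: (x, y) pixel positions of the two eye centers
    extracted from the human eye image; mm_per_pixel converts the pixel
    separation into millimetres (assumed known from calibration).
    """
    dx = right_center[0] - left_center[0]
    dy = right_center[1] - left_center[1]
    return math.hypot(dx, dy) * mm_per_pixel
```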
Step 103, based on data transmission connection, the first pupil distance parameter is sent to the remote shooting end, so that the remote shooting end adjusts the distance between the two cameras to be a target distance matched with the first pupil distance parameter.
Specifically, the pupil distance data of the user is very important for the VR equipment: the remote shooting end can adjust the distance between the two cameras according to the user's pupil distance data, thereby realizing focusing and alignment of the virtual reality content, ensuring comfort and definition when the user watches, and providing the user with a personalized virtual reality experience.
In an alternative embodiment, after determining the pupil distance parameter of the first user, the VR terminal may send it to the remote shooting end through a preset network communication protocol based on the data transmission connection. The preset network communication protocol is a network communication protocol used for live broadcasting, such as a real-time streaming protocol that uses a push server. Under such a protocol, the remote shooting end packages the encoded image and video data into real-time transport packets and pushes them to the terminal, so that the remote shooting end can realize streaming media transmission with low system overhead.
In this scheme, the remote shooting end includes two cameras, a stepper motor, and two strip-shaped sliding bars; the stepper motor drives two gears, each gear is connected to one strip-shaped sliding bar, and each sliding bar carries one camera. In an alternative implementation, after receiving the pupil distance data of the first user, the remote shooting end calculates the sliding distance of the two strip-shaped sliding bars based on the target distance; the stepper motor then rotates the two gears, which drive their corresponding sliding bars to slide by the calculated distance, so that the distance between the two cameras equals the target distance.
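The sliding-distance calculation can be sketched as follows. The symmetric movement of the two bars and the motor's steps-per-revolution and travel-per-revolution figures are assumptions; the patent only states that the sliding distance is computed from the target distance:

```python
def steps_for_target_distance(current_mm, target_mm, steps_per_rev=200, mm_per_rev=2.0):
    """Convert a desired change in camera separation into stepper-motor steps.

    Assumes both strip-shaped sliding bars move symmetrically, so each bar
    travels half of the total change in separation; a negative result means
    the motor turns in the opposite direction.
    """
    travel_per_bar = (target_mm - current_mm) / 2.0
    return round(travel_per_bar * steps_per_rev / mm_per_rev)
```

For example, widening the separation from 60 mm to 64 mm moves each bar 2 mm, i.e. one full motor revolution under the assumed gearing.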
And 104, receiving and displaying an image obtained by shooting the remote shooting end based on the target distance based on the data transmission connection.
Specifically, after the distance between the two cameras is adjusted, the remote shooting end performs remote shooting and recording in the virtual reality environment through the two cameras. In this scheme, VR equipment can provide live broadcast service for the user. In an optional implementation manner, after the remote shooting end establishes data transmission connection with the VR terminal, the remote shooting end receives the pupil distance parameter of the first user sent by the VR terminal, adjusts the distance between the two cameras to be a distance corresponding to the pupil distance parameter of the first user, and ensures comfort and definition when the user watches. Further, the two cameras are controlled to shoot, images shot by the two cameras are obtained, and the shot images are sent to the VR terminal in real time. The VR terminal can receive images (live streams) sent by the remote shooting end and synchronously display the images for the user, so that the user can clearly and smoothly watch live broadcast.
According to the technical scheme, in response to a remote connection instruction triggered by the first user, a data transmission connection can be established with the remote shooting end, wherein the remote shooting end comprises two cameras and the included angle between the two cameras is 180 degrees; a first pupil distance parameter of the first user is acquired based on the data transmission connection; the first pupil distance parameter is sent to the remote shooting end based on the data transmission connection, so that the remote shooting end adjusts the distance between the two cameras to a target distance matched with the first pupil distance parameter; and an image shot by the remote shooting end at the target distance is received and displayed based on the data transmission connection. With this technical scheme, the distance between the two cameras of the remote shooting end can be determined automatically according to the interpupillary distance of the human eyes, so that the remote shooting end can focus more accurately according to the interpupillary distance, improving the display definition and anti-distortion effect of the equipment and further improving the user's experience of using the VR equipment.
Fig. 2 is another flowchart of a VR device control method applied to a VR terminal according to an embodiment of the present invention, where the method may be performed by a VR device control apparatus according to an embodiment of the present invention, and the apparatus may be implemented by software and/or hardware. As shown in fig. 2, the method specifically comprises the following steps:
Step 201, a data transmission connection is established with a remote shooting end in response to a remote connection instruction triggered by a first user.
The remote shooting end comprises two cameras, and an included angle between the two cameras is 180 degrees.
Step 202, acquiring an eye movement parameter of a first user based on the data transmission connection.
Wherein the eye movement parameters are parameters acquired by the VR device while tracking the movement of the user's eyes, such as changes in the position of the user's gaze and the length of time the user gazes at a certain point.
In the scheme, the VR terminal is internally provided with the eye tracking sensor, and after the first user wears the VR terminal, the eye tracking sensor arranged in the VR terminal can track the movement of the eyes of the first user in real time and generate the eye movement parameters of the first user according to the movement condition of the eyes of the first user.
Step 203, determining gaze point information of the first user based on the eye movement parameters.
Wherein the gaze point information includes: the gaze point coordinates (the position coordinates of the user's eye gaze), the gaze point duration (the length of time the user gazes at a point), and the gaze point order (the sequence in which the user gazes at different points).
In an alternative embodiment, after acquiring the eye movement parameters of the first user, the VR terminal analyzes and organizes them to obtain eye movement characteristics of the first user. For example, gaze characteristics: gaze duration, gaze location, etc.; eye jump (saccade) characteristics: spontaneous or non-spontaneous, the length and trajectory of the eye jump, etc.; and blink characteristics. Further, the VR terminal determines the gaze point information of the first user based on these characteristics.
And 204, transmitting the gaze point information to a remote shooting end based on the data transmission connection, so that the remote shooting end focuses on the two cameras based on the gaze point information to obtain target imaging planes corresponding to the two cameras.
Wherein the target imaging plane is a display plane generated by images captured by two cameras. This plane is a window through which the user views the virtual reality scene through the VR terminal, and is also a representation of the virtual world that the user can see.
In an alternative embodiment, after obtaining the gaze point information of the user, the VR terminal sends it to the remote shooting end through the data transmission connection. The remote shooting end is provided with a zoom engine; after receiving the user's gaze point information, it can use the zoom engine to estimate the best focus based on the gaze point information at the current moment and determine the focal length for the current moment from that best focus. In this scheme, the cameras at the remote shooting end are zoom cameras that can use liquid as a lens: after the focal length for the current moment is determined, the focal length is changed by changing the curvature of the liquid, and the two cameras are focused accordingly to obtain the target imaging planes corresponding to the two cameras.
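One plausible way such a zoom engine could estimate the best focus is to weight recent gaze fixations by dwell time and look up the scene depth at the gazed pixels; this is purely a hypothetical sketch, and the fixation format and the depth map are assumptions not taken from the patent:

```python
def estimate_focus_depth(fixations, depth_map):
    """Dwell-weighted focus-depth estimate from recent gaze fixations.

    fixations: list of ((x, y), dwell_seconds) gaze samples (hypothetical format).
    depth_map: 2D list of per-pixel scene depths, e.g. from stereo matching.
    """
    total_dwell = sum(dwell for _, dwell in fixations)
    weighted = sum(depth_map[y][x] * dwell for (x, y), dwell in fixations)
    return weighted / total_dwell
```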
Step 205, acquiring a first pupil distance parameter of a first user based on the data transmission connection.
Step 206, based on the data transmission connection, the first pupil distance parameter is sent to the remote shooting end, so that the remote shooting end adjusts the distance between the two cameras to be the target distance matched with the first pupil distance parameter.
Specifically, while the user is using the VR device, the environment the user is in may change, for example its ambient brightness. Therefore, after the remote shooting end adjusts the distance between the two cameras, information such as the target field angle, lens focal length, and target screen brightness of the VR terminal also needs to be obtained, and the VR equipment adjusted according to this information.
In this scheme, after sending the first pupil distance parameter to the remote shooting end based on the data transmission connection so that the remote shooting end adjusts the distance between the two cameras to the target distance matched with the first pupil distance parameter, the method further includes: obtaining the target field angle, lens focal length, and target screen brightness of the VR terminal; and sending them to the remote shooting end based on the data transmission connection, so that the remote shooting end adjusts the field angles of the two cameras to the target field angle, adjusts the focal length of the two cameras to the lens focal length, and adjusts the screen brightness to the target screen brightness.
Wherein the target screen brightness is a brightness matched to the ambient light. The target field of view includes a left eye field of view and a right eye field of view, each of which is a monocular field of view, as well as the binocular field of view. The binocular field of view, also called the overall field of view, is the combination of the two monocular fields of view; the overlapping stereoscopic portion of the two monocular fields of view is the binocular field of view. The monocular and binocular fields of view are particularly important factors for virtual reality: the wider the field of view, the easier it is for the user to feel personally on the scene.
Specifically, an illumination sensor is arranged on the VR equipment, the ambient light brightness at the current moment can be determined through the illumination sensor, and the target screen brightness matched with the ambient light brightness is obtained according to the ambient light brightness. Further, the VR terminal may send the target field angle, the lens focal length, and the target screen brightness to the remote photographing terminal based on the data transmission connection, and the remote photographing terminal may adjust the field angles of the two cameras to the target field angle after receiving the target field angle, the lens focal length, and the target screen brightness; the focal length of the two cameras is adjusted to be the focal length of the lens; the screen brightness is adjusted to the target screen brightness.
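A minimal sketch of matching screen brightness to the ambient light might look like the following; the lux range, nit range, and linear ramp are all assumptions, since the patent only says the target brightness is matched to the ambient light:

```python
def target_screen_brightness(ambient_lux, min_nits=80.0, max_nits=500.0,
                             lo_lux=10.0, hi_lux=1000.0):
    """Map ambient illuminance to a display brightness with a linear ramp.

    Below lo_lux the screen stays at its minimum brightness, above hi_lux at
    its maximum; in between, brightness rises linearly with illuminance.
    """
    if ambient_lux <= lo_lux:
        return min_nits
    if ambient_lux >= hi_lux:
        return max_nits
    frac = (ambient_lux - lo_lux) / (hi_lux - lo_lux)
    return min_nits + frac * (max_nits - min_nits)
```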
Step 207, receiving and displaying an image obtained by shooting by the remote shooting end based on the target distance and the target imaging plane based on the data transmission connection.
Step 208, responding to a pupil distance parameter update instruction, and acquiring a second pupil distance parameter of a second user.
Wherein the second user is a user of the VR device different from the first user. The pupil distance parameter update instruction is an instruction for instructing the VR terminal to acquire the current user's pupil distance. In this scheme, it may be actively triggered by the second user, or triggered when the VR terminal, continuously detecting the user's pupil distance, finds that the pupil distance has changed.
When the first user finishes using the VR device and the second user starts using it, the second user can send a connection instruction to the remote shooting end through the power-on button on the VR terminal to establish a data transmission connection with the remote shooting end, and the second pupil distance parameter of the second user is then acquired based on that connection. Alternatively, the VR terminal may acquire the pupil distance of the user wearing the VR device once every minute and compare it with the value acquired one minute earlier; when the pupil distance changes, the VR terminal automatically triggers the pupil distance parameter update instruction and acquires the second pupil distance parameter.
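The per-minute polling logic can be sketched as follows. Here `measure_ipd` and `send_update` stand in for the headset's eye camera and the link to the remote shooting end, and the change tolerance is an assumption (the patent only says "when the pupil distance changes"):

```python
import time

def monitor_ipd(measure_ipd, send_update, interval_s=60.0, tol_mm=1.0, cycles=None):
    """Poll the wearer's pupil distance and push an update when it changes.

    measure_ipd: hypothetical callable returning the current IPD in mm.
    send_update: hypothetical callable sending the new parameter to the
    remote shooting end; cycles limits the loop for testing (None = forever).
    """
    last = measure_ipd()
    done = 0
    while cycles is None or done < cycles:
        time.sleep(interval_s)
        current = measure_ipd()
        if abs(current - last) > tol_mm:
            send_update(current)  # this is the "second pupil distance parameter"
            last = current
        done += 1
```

With a 62.0 mm baseline, a drift to 62.2 mm is ignored while a jump to 65.0 mm (e.g. a new wearer) triggers an update.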
Step 209, based on the data transmission connection, sending the second pupil distance parameter to the remote shooting end, so that the remote shooting end updates the target distance based on the second pupil distance parameter, and obtains an updated distance matched with the second pupil distance parameter.
Specifically, after the VR terminal obtains the second pupil distance parameter, it needs to send the parameter to the remote shooting end so that the remote shooting end can adjust the cameras accordingly. After receiving the second pupil distance parameter, the remote shooting end can derive the updated distance matched with it from the user's pupil distance data and adjust the distance between the two cameras to that updated distance. This keeps the virtual reality content focused and aligned, ensures comfort and clarity for the viewer, and provides a personalized virtual reality experience.
Step 210: based on the data transmission connection, receive and display the image captured by the remote shooting end using the updated distance.
In the technical scheme of this embodiment, a data transmission connection is established with the remote shooting end in response to a remote connection instruction triggered by the first user; the remote shooting end comprises two cameras with an included angle of 180 degrees between them. Based on the data transmission connection, the eye movement parameters of the first user are obtained, and the first user's gaze point information is determined from them. The gaze point information is sent to the remote shooting end over the data transmission connection, so that the remote shooting end focuses the two cameras based on it and obtains the target imaging planes corresponding to the two cameras. The first pupil distance parameter of the first user is obtained based on the data transmission connection and sent to the remote shooting end, so that the remote shooting end adjusts the distance between the two cameras to the target distance matched with the first pupil distance parameter. The image captured by the remote shooting end at the target distance is received and displayed over the data transmission connection. In response to a pupil distance parameter update instruction, the second pupil distance parameter of the second user is acquired and sent to the remote shooting end, so that the remote shooting end updates the target distance based on it and obtains the updated distance matched with the second pupil distance parameter. Finally, the image captured by the remote shooting end at the updated distance is received and displayed over the data transmission connection.
With this technical scheme, the distance between the two cameras at the shooting end can be determined automatically from the user's interpupillary distance, so that the shooting end can focus more accurately according to it. When different users use the device, their pupil distance parameters are updated in time, providing a personalized service for each user. The gaze point information of the current user is obtained through eye tracking, and the cameras are refocused in real time according to it, so that the user watches video with higher definition. Moreover, on the basis of eye tracking, the effective range watched by the user's eyes can be obtained, and only the definition within that range needs to be adjusted, which reduces bandwidth, improves data transmission efficiency, and further improves the experience of using the VR device.
Fig. 3 is a flowchart of a VR device control method applied to a remote shooting end according to an embodiment of the present invention. The method may be executed by a VR device control apparatus applied to the remote shooting end according to an embodiment of the present invention, and the apparatus may be implemented by software and/or hardware. As shown in fig. 3, the method specifically comprises the following steps:
Step 301: establish a data transmission connection with the VR terminal in response to a remote connection instruction sent by the VR terminal.
The remote connection instruction is an instruction that instructs the VR terminal to establish a data transmission connection with the remote shooting end. The data transmission connection may be wired or wireless; in this scheme, a wireless network or a dedicated wireless transmission technology is used to establish the connection between the remote shooting end and the VR terminal.
Specifically, when the user uses the VR device, the connection instruction may be sent to the remote shooting end through the power-on button on the VR terminal. Alternatively, the remote connection instruction may be sent to the VR terminal through a terminal (mobile phone, tablet computer, etc.) of the first user that is connected to the VR terminal. After receiving the remote connection instruction, the VR terminal responds to it by sending a data transmission connection request to the remote shooting end, and the remote shooting end establishes the data transmission connection with the VR terminal upon receiving that request.
Step 302: based on the data transmission connection, receive the first pupil distance parameter sent by the VR terminal, determine a target distance matched with the first pupil distance parameter, and adjust the distance between the two cameras to the target distance.
The first pupil distance parameter is the distance between the pupils of the first user's two eyes, and the target distance is the distance between the center points of the two cameras of the remote shooting end when they are at their optimal positions. Specifically, the VR device stores basic information of the two cameras in advance, including the initial distance between their center points when the cameras are at their original positions. After the target distance between the two cameras is determined, the distance between them can be adjusted so that it equals the target distance.
In this scheme, adjusting the distance between the two cameras to the target distance comprises: calculating the rolling distance of the two strip-shaped rolling bars based on the target distance; and driving the two gears to rotate by the stepping motor, so that each gear drives its corresponding strip-shaped rolling bar to slide by the rolling distance, until the distance between the two cameras equals the target distance.
Specifically, the remote shooting end further comprises a stepping motor and two strip-shaped rolling bars; the stepping motor comprises two gears, each gear is connected with one strip-shaped rolling bar, and each strip-shaped rolling bar is connected with one camera. Fig. 4 is a schematic structural diagram of a remote shooting end according to an embodiment of the present invention. The two cameras, camera 1 and camera 2, are arranged at the two ends of the remote shooting end; camera 1 is connected with strip-shaped rolling bar 1, and camera 2 with strip-shaped rolling bar 2. Strip-shaped rolling bar 1 is connected with gear 1 of the stepping motor, and strip-shaped rolling bar 2 with gear 2. Once the remote shooting end has determined the target distance, it determines the rolling distance and rolling direction of the two strip-shaped rolling bars from the target distance and, through gears 1 and 2 of the stepping motor, drives strip-shaped rolling bars 1 and 2 to roll by their respective rolling distances, leaving the distance between the two cameras equal to the target distance.
Illustratively, assume that the distance between the center points of camera 1 and camera 2 is 3 cm when both are in the initial position. The pupil distance of the first user is 5 cm, and the target distance between the two cameras derived from it is 5.2 cm. From the target distance, the rolling distance of camera 1 is determined to be 1.1 cm, and that of camera 2 is also 1.1 cm. Further, the remote shooting end can control the stepping motor to rotate gear 1 so that it drives strip-shaped rolling bar 1 to move left by 1.1 cm, and to rotate gear 2 so that it drives strip-shaped rolling bar 2 to move right by 1.1 cm. Camera 1 therefore moves left by 1.1 cm, driven by strip-shaped rolling bar 1, and camera 2 moves right by 1.1 cm, driven by strip-shaped rolling bar 2, so that the distance between camera 1 and camera 2 equals the target distance (5.2 cm).
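The worked example above reduces to splitting the change in camera separation evenly between the two strip-shaped rolling bars. A minimal sketch follows; the 0.01 cm-per-motor-step resolution is an assumed value, not specified in the scheme:

```python
def rolling_distances(initial_cm: float, target_cm: float):
    """Split the required change in camera separation evenly between the two
    strip-shaped rolling bars. Positive values move the cameras apart
    (camera 1 leftward, camera 2 rightward); negative values move them
    back together."""
    per_camera = (target_cm - initial_cm) / 2.0
    return per_camera, per_camera

def steps_for_distance(distance_cm: float, cm_per_step: float = 0.01) -> int:
    """Convert a rolling distance into whole stepping-motor steps; the
    0.01 cm-per-step resolution is an illustrative assumption."""
    return round(abs(distance_cm) / cm_per_step)
```

With the example values (initial 3 cm, target 5.2 cm), each bar rolls 1.1 cm, matching the figures in the text.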
Step 303: capture images through the two adjusted cameras, and send the images to the VR terminal based on the data transmission connection, so that the VR terminal displays them.
Specifically, after the distance between the two cameras has been adjusted, the remote shooting end controls the two cameras to capture images and sends the captured images to the VR terminal in real time through the data transmission connection. The VR terminal receives the images (live stream) sent by the remote shooting end and displays them synchronously, so that the user can watch the live broadcast clearly and smoothly.
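One way to frame the real-time transfer of the stereo pair over the data transmission connection is to length-prefix each camera's encoded frame; this wire format (big-endian 4-byte lengths) is an assumption for illustration, not specified by the scheme:

```python
import struct

def pack_frame(left_jpeg: bytes, right_jpeg: bytes) -> bytes:
    """Length-prefix each camera's encoded frame so the VR terminal can split
    the stereo pair back out of the live byte stream."""
    return (struct.pack(">I", len(left_jpeg)) + left_jpeg
            + struct.pack(">I", len(right_jpeg)) + right_jpeg)

def unpack_frame(data: bytes):
    """Inverse of pack_frame: recover the (left, right) encoded frames."""
    n = struct.unpack_from(">I", data, 0)[0]
    left = data[4:4 + n]
    m = struct.unpack_from(">I", data, 4 + n)[0]
    right = data[8 + n:8 + n + m]
    return left, right
```

The remote shooting end would call `pack_frame` on each captured stereo pair and write the result to the connection; the VR terminal would `unpack_frame` each packet and display the two views.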
According to the technical scheme of this embodiment, a data transmission connection is established with the VR terminal in response to a remote connection instruction sent by the VR terminal; based on the data transmission connection, the first pupil distance parameter sent by the VR terminal is received and the target distance matched with it is determined; the distance between the two cameras is adjusted to the target distance; and images are captured through the two adjusted cameras and sent to the VR terminal over the data transmission connection so that the VR terminal displays them. With this scheme, the distance between the two cameras of the remote shooting end is determined automatically from the user's interpupillary distance, so that the remote shooting end can focus more accurately according to it. This improves the display definition and anti-distortion effect of the device, and further improves the experience of using the VR device.
Fig. 5 is a schematic structural diagram of a VR device control apparatus applied to a VR terminal according to an embodiment of the present invention. As shown in fig. 5, the apparatus may specifically include:
a first connection module 501, configured to establish a data transmission connection with a remote capturing end in response to a remote connection instruction triggered by a first user; the remote shooting end comprises two cameras, and an included angle between the two cameras is 180 degrees;
a parameter obtaining module 502, configured to obtain a first pupil distance parameter of the first user based on the data transmission connection;
a parameter sending module 503, configured to send the first pupil distance parameter to the remote capturing end based on the data transmission connection, so that the remote capturing end adjusts a distance between the two cameras to be a target distance matched with the first pupil distance parameter;
and the image display module 504 is configured to receive and display, based on the data transmission connection, an image obtained by the remote shooting end based on the target distance.
Optionally, the parameter sending module 503 is specifically configured to: acquire eye movement parameters of the first user based on the data transmission connection;
determine gaze point information of the first user based on the eye movement parameters;
and send the gaze point information to the remote shooting end based on the data transmission connection, so that the remote shooting end focuses the two cameras based on the gaze point information and obtains the target imaging planes corresponding to the two cameras.
Optionally, the parameter obtaining module 502 is specifically configured to: acquire a second pupil distance parameter of a second user in response to a pupil distance parameter update instruction;
and send the second pupil distance parameter to the remote shooting end based on the data transmission connection, so that the remote shooting end updates the target distance based on the second pupil distance parameter and obtains an updated distance matched with it.
Optionally, the parameter sending module 503 is further configured to: acquire a target field angle, a lens focal length, and a target screen brightness of the VR terminal; wherein the target field angle includes a left-eye field angle, a right-eye field angle, and a binocular field angle, and the target screen brightness is a brightness matched with the ambient light;
and send the target field angle, the lens focal length, and the target screen brightness to the remote shooting end based on the data transmission connection, so that the remote shooting end adjusts the field angles of the two cameras to the target field angle, adjusts the focal lengths of the two cameras to the lens focal length, and adjusts the screen brightness to the target screen brightness.
The VR device control apparatus provided by this embodiment of the invention can execute the VR device control method provided by any embodiment of the invention, and has the functional modules and beneficial effects corresponding to the executed method. For details not described in this embodiment, reference is made to the description of any method embodiment of the invention.
Fig. 6 is a schematic structural diagram of a VR device control apparatus applied to a remote shooting end according to an embodiment of the present invention. As shown in fig. 6, the apparatus may specifically include:
a second connection module 601, configured to establish a data transmission connection with a VR terminal in response to a remote connection instruction sent by the VR terminal;
the distance determining module 602 is configured to receive a first pupil distance parameter sent by a VR terminal based on the data transmission connection, and determine a target distance matched with the first pupil distance parameter;
a distance adjustment module 603, configured to adjust a distance between the two cameras to the target distance;
the image capturing module 604 is configured to capture an image through the adjusted two cameras, and send the image to the VR terminal based on the data transmission connection, so that the VR terminal displays the image.
Optionally, the remote shooting end further includes a stepping motor and two strip-shaped rolling bars; the stepping motor includes two gears, each gear is connected with one strip-shaped rolling bar, and each strip-shaped rolling bar is connected with one camera. The distance adjustment module 603 is specifically configured to: calculate the rolling distance of the two strip-shaped rolling bars based on the target distance;
and drive the two gears to rotate through the stepping motor, so that each gear drives its corresponding strip-shaped rolling bar to slide by the rolling distance, until the distance between the two cameras equals the target distance.
The VR device control apparatus provided by this embodiment of the invention can execute the VR device control method provided by any embodiment of the invention, and has the functional modules and beneficial effects corresponding to the executed method. For details not described in this embodiment, reference is made to the description of any method embodiment of the invention.
Fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present invention, and referring to fig. 7, a schematic structural diagram of a computer system 12 suitable for implementing the electronic device according to the embodiment of the present invention is shown. The electronic device shown in fig. 7 is only an example and should not be construed as limiting the functionality and scope of use of the embodiments of the invention. Components of the electronic device 12 may include, but are not limited to: one or more processors or processing units 16, a system memory 28, a bus 18 that connects the various system components, including the system memory 28 and the processing units 16.
Bus 18 represents one or more of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, a processor, and a local bus using any of a variety of bus architectures. By way of example, and not limitation, such architectures include the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the Enhanced ISA (EISA) bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
Electronic device 12 typically includes a variety of computer system readable media. Such media can be any available media that is accessible by electronic device 12 and includes both volatile and nonvolatile media, removable and non-removable media.
The system memory 28 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM) 30 and/or cache memory 32. The electronic device 12 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 34 may be used to read from or write to non-removable, nonvolatile magnetic media (not shown in FIG. 7, commonly referred to as a "hard disk drive"). Although not shown in fig. 7, a magnetic disk drive for reading from and writing to a removable non-volatile magnetic disk (e.g., a "floppy disk"), and an optical disk drive for reading from or writing to a removable non-volatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In such cases, each drive may be coupled to bus 18 through one or more data medium interfaces. Memory 28 may include at least one program product having a set (e.g., at least one) of program modules configured to carry out the functions of embodiments of the invention.
A program/utility 40 having a set (at least one) of program modules 42 may be stored in, for example, memory 28, such program modules 42 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment. Program modules 42 generally perform the functions and/or methods of the embodiments described herein.
The electronic device 12 may also communicate with one or more external devices 14 (e.g., keyboard, pointing device, display 24, etc.), one or more devices that enable a user to interact with the electronic device 12, and/or any devices (e.g., network card, modem, etc.) that enable the electronic device 12 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 22. In the electronic device 12 of the present embodiment, the display 24 is not provided as a separate body but is embedded in the mirror surface, and the display surface of the display 24 and the mirror surface are visually integrated when the display surface of the display 24 is not displayed. Also, the electronic device 12 may communicate with one or more networks such as a Local Area Network (LAN), a Wide Area Network (WAN) and/or a public network, such as the Internet, through a network adapter 20. As shown in fig. 7, the network adapter 20 communicates with other modules of the electronic device 12 over the bus 18. It should be appreciated that although not shown in fig. 7, other hardware and/or software modules may be used in connection with electronic device 12, including, but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
The processing unit 16 executes various functional applications and VR device control by running programs stored in the system memory 28, for example implementing the VR device control method provided by an embodiment of the present invention: responding to a remote connection instruction triggered by a first user, and establishing a data transmission connection with a remote shooting end, the remote shooting end comprising two cameras with an included angle of 180 degrees between them; acquiring a first pupil distance parameter of the first user based on the data transmission connection; sending the first pupil distance parameter to the remote shooting end based on the data transmission connection, so that the remote shooting end adjusts the distance between the two cameras to a target distance matched with the first pupil distance parameter; and receiving and displaying, based on the data transmission connection, the image captured by the remote shooting end at the target distance.
Embodiments of the present invention provide a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements a VR device control method as provided by all embodiments of the present invention: responding to a remote connection instruction triggered by a first user, and establishing data transmission connection with a remote shooting end; the remote shooting end comprises two cameras, and an included angle between the two cameras is 180 degrees; acquiring a first pupil distance parameter of the first user based on the data transmission connection; based on the data transmission connection, the first pupil distance parameter is sent to the remote shooting end, so that the remote shooting end adjusts the distance between the two cameras to be a target distance matched with the first pupil distance parameter; and receiving and displaying the image shot by the remote shooting end based on the target distance based on the data transmission connection. Any combination of one or more computer readable media may be employed. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. 
In this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations of the present invention may be written in one or more programming languages, including object-oriented programming languages such as Java, Smalltalk, and C++, and conventional procedural programming languages such as the "C" programming language or similar languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
Note that the above is only a preferred embodiment of the present invention and the technical principle applied. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, while the invention has been described in connection with the above embodiments, the invention is not limited to the embodiments, but may be embodied in many other equivalent forms without departing from the spirit or scope of the invention, which is set forth in the following claims.

Claims (10)

1. A VR device control method, applied to a virtual reality VR terminal, comprising:
responding to a remote connection instruction triggered by a first user, and establishing data transmission connection with a remote shooting end; the remote shooting end comprises two cameras, and an included angle between the two cameras is 180 degrees;
acquiring a first pupil distance parameter of the first user based on the data transmission connection;
based on the data transmission connection, the first pupil distance parameter is sent to the remote shooting end, so that the remote shooting end adjusts the distance between the two cameras to be a target distance matched with the first pupil distance parameter;
And receiving and displaying the image shot by the remote shooting end based on the target distance based on the data transmission connection.
2. The method according to claim 1, wherein after the step of establishing a data transmission connection with the remote shooting end in response to the remote connection instruction triggered by the first user, the method further comprises:
acquiring eye movement parameters of the first user based on the data transmission connection;
determining gaze point information of the first user based on the eye movement parameters;
and based on the data transmission connection, sending the gaze point information to the remote shooting end, so that the remote shooting end focuses the two cameras based on the gaze point information to obtain target imaging planes corresponding to the two cameras.
3. The method according to claim 1, wherein after the step of receiving and displaying the image captured by the remote capturing end based on the target distance based on the data transmission connection, the method further comprises:
responding to the pupil distance parameter updating instruction, and acquiring a second pupil distance parameter of a second user;
and based on the data transmission connection, the second pupil distance parameter is sent to the remote shooting end, so that the remote shooting end updates the target distance based on the second pupil distance parameter, and an update distance matched with the second pupil distance parameter is obtained.
4. The method of claim 1, wherein after the step of sending the first pupil distance parameter to the remote photographing end based on the data transmission connection, the method further comprises:
acquiring a target field angle, a lens focal length and target screen brightness of the VR terminal; wherein the target field of view includes a left eye field of view, a right eye field of view, and a binocular field of view; the target screen brightness is brightness matched with the ambient light;
based on the data transmission connection, the target field angle, the lens focal length and the target screen brightness are sent to the remote shooting end, so that the remote shooting end adjusts the field angles of the two cameras to the target field angle; adjusting the focal lengths of the two cameras to be the focal length of the lens; and adjusting the screen brightness to the target screen brightness.
5. A VR device control method, applied to a remote shooting end, wherein the remote shooting end comprises two cameras and an included angle between the two cameras is 180 degrees, the method comprising:
responding to a remote connection instruction sent by a VR terminal, and establishing data transmission connection with the VR terminal;
Based on the data transmission connection, receiving a first pupil distance parameter sent by a VR terminal, and determining a target distance matched with the first pupil distance parameter;
adjusting the distance between the two cameras to be the target distance;
and shooting images through the two adjusted cameras, and sending the images to the VR terminal based on the data transmission connection, so that the VR terminal displays the images.
6. The method of claim 5, wherein the remote shooting end further comprises a stepping motor and two strip-shaped rolling bars, the stepping motor comprises two gears, each gear is connected with one strip-shaped rolling bar, and each strip-shaped rolling bar is connected with one camera; and adjusting the distance between the two cameras to the target distance comprises:
calculating the rolling distance of the two strip-shaped rolling bars based on the target distance;
and driving the two gears to rotate through the stepping motor, so that each gear drives its corresponding strip-shaped rolling bar to slide by the rolling distance, until the distance between the two cameras is the target distance.
7. A VR device control apparatus, characterized by being applied to a VR terminal, the apparatus comprising:
The first connection module is used for responding to a remote connection instruction triggered by a first user and establishing data transmission connection with a remote shooting end; the remote shooting end comprises two cameras, and an included angle between the two cameras is 180 degrees;
the parameter acquisition module is used for acquiring a first pupil distance parameter of the first user based on the data transmission connection;
the parameter sending module is used for sending the first pupil distance parameter to the remote shooting end based on the data transmission connection, so that the remote shooting end adjusts the distance between the two cameras to be a target distance matched with the first pupil distance parameter;
and the image display module is used for receiving and displaying the image obtained by the remote shooting end based on the target distance shooting based on the data transmission connection.
8. A VR device control apparatus, applied to a remote capturing end, wherein the remote capturing end comprises two cameras, the apparatus comprising:
a second connection module, configured to establish a data transmission connection with a VR terminal in response to a remote connection instruction sent by the VR terminal;
a distance determining module, configured to receive, based on the data transmission connection, a first pupil distance parameter sent by the VR terminal, and to determine a target distance matched with the first pupil distance parameter;
a distance adjusting module, configured to adjust the distance between the two cameras to be the target distance;
and an image capturing module, configured to capture images by the two adjusted cameras and send the images to the VR terminal based on the data transmission connection, so that the VR terminal displays the images.
9. An electronic device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the program, implements the VR device control method of any one of claims 1 to 6.
10. A computer-readable storage medium, on which a computer program is stored, characterized in that the program, when executed by a processor, implements the VR device control method of any one of claims 1 to 6.
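The distance-adjustment flow of claims 5 and 6 (match the inter-camera baseline to the user's pupil distance, compute the rolling distance for the two strip-shaped bars, and drive them via the stepping motor's gears) can be sketched as below. The baseline-equals-pupil-distance mapping, the gear radius, and the steps-per-revolution figure are illustrative assumptions for a rack-and-pinion style drive, not values taken from the patent text.

```python
import math

def target_distance_mm(pupil_distance_mm: float) -> float:
    """Target inter-camera distance matched to the first pupil distance
    parameter. Simplest assumed mapping: baseline equals pupil distance."""
    return pupil_distance_mm

def rolling_distance_mm(current_distance_mm: float, target_mm: float) -> float:
    """Each strip-shaped bar carries one camera, so each bar slides half
    of the total change in baseline."""
    return (target_mm - current_distance_mm) / 2.0

def stepper_steps(rolling_mm: float, gear_radius_mm: float = 5.0,
                  steps_per_rev: int = 200) -> int:
    """Convert a bar's linear rolling distance into stepping-motor steps:
    one gear revolution advances the bar by one gear circumference."""
    mm_per_step = (2 * math.pi * gear_radius_mm) / steps_per_rev
    return round(rolling_mm / mm_per_step)

# Example: user pupil distance 64 mm, cameras currently 58 mm apart.
target = target_distance_mm(64.0)            # 64.0 mm baseline
per_bar = rolling_distance_mm(58.0, target)  # 3.0 mm slide per bar
steps = stepper_steps(per_bar)               # steps each gear must turn
```

With the assumed 5 mm gear radius and 200 steps per revolution, one step moves a bar about 0.157 mm, so a 3 mm slide per bar needs roughly 19 steps; any real device would substitute its own gear geometry and motor resolution.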
CN202311207102.6A 2023-09-18 2023-09-18 VR device control method and device, electronic device and storage medium Pending CN116996656A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311207102.6A CN116996656A (en) 2023-09-18 2023-09-18 VR device control method and device, electronic device and storage medium

Publications (1)

Publication Number Publication Date
CN116996656A true CN116996656A (en) 2023-11-03

Family

ID=88528637

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311207102.6A Pending CN116996656A (en) 2023-09-18 2023-09-18 VR device control method and device, electronic device and storage medium

Country Status (1)

Country Link
CN (1) CN116996656A (en)

Similar Documents

Publication Publication Date Title
US11860511B2 (en) Image pickup device and method of tracking subject thereof
US11694353B2 (en) Single depth tracked accommodation-vergence solutions
US6919907B2 (en) Anticipatory image capture for stereoscopic remote viewing with foveal priority
US20230132407A1 (en) Method and device of video virtual background image processing and computer apparatus
CN105787884A (en) Image processing method and electronic device
AU2011237473B2 (en) Remote gaze control system and method
WO2013191120A1 (en) Image processing device, method, and program, and storage medium
KR101690646B1 (en) Camera driving device and method for see-through displaying
CN112666705A (en) Eye movement tracking device and eye movement tracking method
WO2022262839A1 (en) Stereoscopic display method and apparatus for live performance, medium, and system
US20190265787A1 (en) Real world interaction utilizing gaze
CN116996656A (en) VR device control method and device, electronic device and storage medium
CN104239877B (en) The method and image capture device of image procossing
CN116405653A (en) Real-time naked eye 3D image processing method, system, equipment and medium
CN113960788B (en) Image display method, device, AR glasses and storage medium
CN111736692B (en) Display method, display device, storage medium and head-mounted device
WO2021184338A1 (en) Automatic focusing method and apparatus, gimbal, device, and storage medium
CN111857461A (en) Image display method and device, electronic equipment and readable storage medium
JP7498459B2 (en) Video transmission system, video transmission device, and video transmission program
JP2000182058A (en) Three-dimensional motion input method and three- dimensional motion input system
CN115589531B (en) Shooting method, shooting system and storage medium of target scene
CN117452642A (en) Auto-focusing method, system, head-mounted device and readable storage medium
CN114374815B (en) Image acquisition method, device, terminal and storage medium
CN117687211A (en) Virtual reality device
CN117319790A (en) Shooting method, device, equipment and medium based on virtual reality space

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination