CN114281190A - Information control method, device, system, equipment and storage medium - Google Patents

Information control method, device, system, equipment and storage medium

Info

Publication number
CN114281190A
Authority
CN
China
Prior art keywords
robot
target robot
augmented reality
remote control
parameters
Prior art date
Legal status
Pending
Application number
CN202111530308.3A
Other languages
Chinese (zh)
Inventor
周强
孙雨
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202111530308.3A
Publication of CN114281190A
Legal status: Pending

Landscapes

  • Manipulator (AREA)
  • Selective Calling Equipment (AREA)

Abstract

The embodiment of the application discloses an information control method, which comprises the following steps: selecting a target robot in response to a selection operation for the robot; in response to a remote control operation for the target robot, at least one frame of image acquired by the target robot under the remote control operation is obtained, and the at least one frame of image is presented on the augmented reality device. The embodiment of the application also discloses an information control device, a robot control system, augmented reality equipment and a storage medium.

Description

Information control method, device, system, equipment and storage medium
Technical Field
The present application relates to the field of wireless communication technologies, and in particular, to an information control method, an information control apparatus, a robot control system, an augmented reality device, and a storage medium.
Background
With the continuous development of science and technology, robots have gradually extended from the initial military field to application scenarios such as family companionship, education and training, entertainment and nursing, and industrial inspection, freeing people's hands to a certain extent and bringing convenience to daily life. In the related art, however, a robot is controlled by a wireless remote controller through keys, a joystick, voice control, or gesture recognition, which only moves the robot forward, backward, left, and right over a short distance. It is therefore desirable to provide an intelligent method for remotely controlling a robot.
Disclosure of Invention
Embodiments of the present application are intended to provide an information control method, an information control apparatus, a robot control system, an augmented reality device, and a storage medium, to meet the urgent need in the related art for a method of intelligently and remotely controlling a robot. The information control method provided by the application has at least the following beneficial effects: by responding to a remote control operation for a target robot, functions such as interaction between the augmented reality device and the robot and remote operation control are realized, which improves the robot's interactivity, the interest of its content, and its visual innovation.
The technical scheme of the application is realized as follows:
the application provides an information control method, which is applied to augmented reality equipment and comprises the following steps:
selecting a target robot in response to a selection operation for the robot;
in response to a remote control operation for a target robot, obtaining at least one frame of image acquired by the target robot under the remote control operation, and presenting the at least one frame of image on the augmented reality device.
The application provides an information control apparatus, the apparatus includes:
a selection module for selecting a target robot in response to a selection operation for the robot;
a processing module for obtaining at least one frame of image acquired by the target robot under a remote control operation in response to the remote control operation for the target robot;
a display module to present the at least one frame of image on the augmented reality device.
The application provides a robot control system, the system comprises an augmented reality device and a target robot,
responding to remote control operation aiming at the target robot by the augmented reality equipment, and sending control parameters corresponding to the remote control operation to the target robot;
the target robot collects at least one frame of image in the process of operating with the control parameters and sends the at least one frame of image to the augmented reality equipment;
the augmented reality device presents the at least one frame of image.
The application provides an augmented reality equipment, augmented reality equipment includes: a processor, a memory, and a communication bus;
the communication bus is used for realizing communication connection between the processor and the memory;
the processor is used for executing the information control program stored in the memory so as to realize the information control method.
The present application provides a storage medium, wherein the storage medium stores one or more programs, and the one or more programs are executable by one or more processors to implement the above-described information control method.
According to the information control method, the information control apparatus, the robot control system, the augmented reality device, and the storage medium provided by the application, a target robot is selected in response to a selection operation for the robot; in response to a remote control operation for the target robot, at least one frame of image acquired by the target robot under the remote control operation is obtained, and the at least one frame of image is presented on the augmented reality device. In this way, functions such as interaction between the augmented reality device and the robot and remote operation control are realized in response to the remote control operation for the target robot, which improves the robot's interactivity, the interest of its content, and its visual innovation.
Drawings
Fig. 1 is an alternative flow chart of an information control method provided in an embodiment of the present application;
fig. 2 is a schematic view of an optional application scenario of monitoring internet protocol addresses according to an embodiment of the present application;
fig. 3 is an alternative flow chart of an information control method according to an embodiment of the present disclosure;
fig. 4 is an alternative flowchart of an information control method according to an embodiment of the present application;
FIG. 5 is a schematic diagram illustrating an alternative gesture of an information control method according to an embodiment of the present disclosure;
fig. 6 is an alternative flowchart of an information control method according to an embodiment of the present application;
FIG. 7 is a schematic diagram of an alternative remote lever of the information control method provided by embodiments of the present application;
fig. 8 is an alternative flowchart of an information control method according to an embodiment of the present application;
fig. 9 is an alternative communication interaction diagram of an information control method according to an embodiment of the present application;
fig. 10 is an alternative flowchart of an information control method according to an embodiment of the present application;
fig. 11 is an alternative flowchart of an information control method according to an embodiment of the present application;
FIG. 12 is a schematic diagram of an alternative architecture of a robot control system according to an embodiment of the present disclosure;
fig. 13 is an alternative structural diagram of an information control apparatus according to an embodiment of the present application;
fig. 14 is an optional schematic structural diagram of an augmented reality device provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application.
It should be appreciated that reference throughout this specification to "an embodiment of the present application" or "an embodiment described previously" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present application. Thus, the appearances of the phrases "in the embodiments of the present application" or "in the embodiments" in various places throughout this specification are not necessarily all referring to the same embodiments. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the embodiments of the present application, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application. The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
An embodiment of the present application provides an information control method, which is applied to augmented reality equipment, and is shown in fig. 1, where the method includes the following steps:
step 101, in response to a selection operation for the robot, a target robot is selected.
In the embodiment of the application, in the process of selecting the target robot, the augmented reality device may make the selection in a multicast mode or in a wireless connection verification mode; the manner of selecting the target robot is not specifically limited in this application.
In practical applications, Augmented Reality (AR) technology promotes the integration of real-world information and virtual-world information: entity information that is difficult to experience within the spatial range of the real world is simulated, and virtual information content is overlaid onto the real world for effective application, where it can be perceived by human senses, thereby realizing a sensory experience beyond reality. After the real environment and the virtual object are superimposed, they can exist in the same picture and space at the same time. Augmented reality technology not only effectively presents the content of the real world but also promotes the display of virtual information content, with the two kinds of content complementing and superimposing each other. In visual augmented reality, a user wears a head-mounted display so that the real world and computer graphics are superimposed, after which the user can fully see the real world surrounded by the graphics. Augmented reality mainly involves new technologies and means such as multimedia, three-dimensional modeling, and scene fusion, and the information content it provides is distinctly different from the information content humans can otherwise perceive.
In the embodiment of the present application, the augmented reality device is a device with computing processing capability, and includes, but is not limited to, an AR head-mounted device, such as AR glasses and an AR helmet.
Step 102, in response to a remote control operation for the target robot, obtaining at least one frame of image acquired by the target robot under the remote control operation, and presenting the at least one frame of image on the augmented reality device.
In the embodiment of the application, the remote control operation is an operation by which the augmented reality device remotely controls the robot. Here, the remote control operation may be understood as an operation in which the augmented reality device remotely controls the robot when the two are in the same scene but in different spaces; it may also be understood as an operation in which the augmented reality device remotely controls the robot when the two are in different scenes and different spaces.
In the embodiment of the application, after the augmented reality device selects the target robot in response to the selection operation for the robot, the augmented reality device acquires the remote control operation for the target robot and responds to the remote control operation. Furthermore, at least one frame of image collected by the target robot under the remote control operation is obtained, and at least one frame of image is presented on a display module of the augmented reality device.
In other embodiments of the present application, taking the example that the augmented reality device selects the target robot by multicast, before the target robot is selected in response to the selection operation for the robot in step 101, the following steps may be further performed:
and monitoring a plurality of internet protocol addresses multicast by a plurality of robots in the wireless network.
In the embodiment of the application, an internet protocol address is a unique logical address allocated to each electronic device or robot on the internet, so that when an electronic device or robot is operated, the object communicating with it can be determined efficiently and conveniently from among tens of thousands of electronic devices or robots. Here, an Internet Protocol (IP) address is a uniform address format provided by the Internet Protocol.
In an implementation scenario, a plurality of robots periodically send multicast detection messages in a wireless network, after an augmented reality device is started, the multicast detection messages in the wireless network are monitored, an internet protocol address corresponding to each robot is obtained from each monitored multicast detection message, and all the internet protocol addresses are maintained in a data list.
In another implementation scenario, referring to fig. 2, after the robot 1, the robot 2, and the robot 3 start multicast services respectively, periodically multicast their IP addresses to a multicast group, so that the multicast group receives the IP address of the robot; and after the augmented reality device 4 is started, monitoring a specified multicast group, respectively acquiring the IP addresses of the robot 1, the robot 2 and the robot 3 from the multicast group, and maintaining all the IP addresses in a data list.
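As an illustrative aid only, the following Python sketch shows how such a listener might be implemented on the augmented reality device; the multicast group, port, and JSON message layout are assumptions, since the embodiment does not fix them.

    import json
    import socket
    import struct

    # Hypothetical multicast group, port, and message layout; the
    # embodiment does not prescribe them.
    MCAST_GROUP = "239.255.0.1"
    MCAST_PORT = 5007

    def listen_for_robots(timeout_s: float = 10.0) -> dict:
        """Listen for robot multicast announcements and build the data list."""
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        sock.bind(("", MCAST_PORT))
        # Join the specified multicast group on all interfaces.
        mreq = struct.pack("4sl", socket.inet_aton(MCAST_GROUP), socket.INADDR_ANY)
        sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
        sock.settimeout(timeout_s)

        data_list = {}  # robot name -> IP address
        try:
            while True:
                payload, (addr, _port) = sock.recvfrom(1024)
                # Assume each detection message is JSON such as {"name": "robot2"}.
                msg = json.loads(payload.decode("utf-8"))
                data_list[msg.get("name", addr)] = addr
        except socket.timeout:
            pass
        finally:
            sock.close()
        return data_list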
In the embodiment of the present application, the step 101, in response to the selection operation for the robot, selects the target robot, and may be implemented by the following steps:
in response to the selection operation for the robot, a target robot having a specified internet protocol address is selected from the plurality of robots based on the plurality of internet protocol addresses.
Wherein the plurality of internet protocol addresses includes a designated internet protocol address.
In this embodiment of the application, the selection operation may be a touch operation on the augmented reality device, the selection operation may also be a voice operation on the augmented reality device, and the selection operation may also be a gesture operation on the augmented reality device, which is not specifically limited in this application.
In the embodiment of the application, after the augmented reality device monitors a plurality of internet protocol addresses multicast by a plurality of robots in a wireless network, a user selects a specified internet protocol in a data list maintained in the augmented reality device, that is, a selection operation for the robot. Referring to fig. 2, at this time, the augmented reality apparatus selects, as the target robot, a robot 2 having a designated internet protocol address such as 192.168.1.3 from among a plurality of robots based on a plurality of internet protocol addresses in response to a selection operation of the user for the robot.
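Continuing the sketch above, selecting the target robot then reduces to looking up the designated address in the maintained data list; the function and the list structure are illustrative assumptions, not part of the embodiment.

    def select_target_robot(data_list: dict, designated_ip: str) -> str:
        """Return the robot whose announced address matches the user's choice."""
        for name, ip in data_list.items():
            if ip == designated_ip:
                return name
        raise LookupError(f"no robot with address {designated_ip} in the data list")

    # e.g. select_target_robot(listen_for_robots(), "192.168.1.3") -> "robot2"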
According to the information control method provided by the embodiment of the application, the target robot is selected by responding to the selection operation aiming at the robot; responding to the remote control operation aiming at the target robot, obtaining at least one frame of image acquired by the target robot under the remote control operation, and presenting the at least one frame of image on the augmented reality equipment; therefore, the interaction between the augmented reality device and the robot, the remote operation control and other functions are realized in response to the remote control operation of the target robot, and the interactivity, the interestingness of the content and the innovation of the vision of the robot are improved.
An embodiment of the present application provides an information control method, which is applied to an augmented reality device, and as shown in fig. 3, the method includes the following steps,
step 301, a plurality of internet protocol addresses multicast by a plurality of robots in a wireless network are monitored.
Step 302, in response to the selecting operation for the robot, selecting a target robot having a specified internet protocol address from the plurality of robots based on the plurality of internet protocol addresses.
Wherein the plurality of internet protocol addresses includes a designated internet protocol address.
And 303, responding to the remote control operation, and converting the remote control operation into the attitude parameter and the position parameter corresponding to the target robot.
In the embodiment of the application, the remote control operation is an operation by which the augmented reality device remotely controls the robot. Here, the remote control operation may be understood as an operation in which the augmented reality device remotely controls the robot when the two are in the same scene but in different spaces; it may also be understood as an operation in which the augmented reality device remotely controls the robot when the two are in different scenes and different spaces.
In the embodiment of the application, the attitude parameter is used for adjusting the current attitude of the target robot; the posture of the target robot includes, but is not limited to, standing, squatting, walking, facing, turning, swinging, leaning forward and backward, shaking hands, etc.
In the embodiment of the present application, the position parameter is a parameter for adjusting the current position of the target robot.
In the embodiment of the application, the augmented reality device monitors a plurality of internet protocol addresses multicast by a plurality of robots in a wireless network and, in the case that a target robot with the specified internet protocol address has been selected from the plurality of robots based on those addresses in response to the selection operation for the robot, responds to the remote control operation and converts it into the attitude parameter and the position parameter corresponding to the target robot.
In the embodiment of the present application, referring to fig. 4, step 303, in response to the remote control operation, converts the remote control operation into the corresponding attitude parameter and position parameter of the target robot, and can be implemented by the following steps,
step 3031, obtaining at least one frame of gesture image collected by the augmented reality equipment.
In the embodiment of the application, the augmented reality device acquires at least one frame of gesture image through an image acquisition module. Here, the image acquisition module may be built into the augmented reality device, or may be connected to the augmented reality device through a data cable; this is not specifically limited in this application. Further, the image acquisition module may be a camera or a fisheye camera; this is likewise not specifically limited, as long as the at least one frame of gesture image collected for the augmented reality device can be acquired.
In other embodiments of the present application, when the augmented reality device acquires at least one frame of image, the distance between the hand and the image acquisition module needs to be detected; when the distance is within a distance threshold, the hand can be determined to be within the visible range of the image acquisition module, and the image acquisition module then captures the at least one frame of image.
Step 3032, inputting the at least one frame of gesture image into the deep learning gesture model to obtain a gesture recognition result of the at least one frame of gesture image.
In the embodiment of the application, the augmented reality device stores the meaning corresponding to each gesture, and recognizes at least one frame of gesture image through the deep learning gesture model to obtain a gesture recognition result of the at least one frame of gesture image. For example, referring to fig. 5, if one frame of gesture image is recognized and the gesture recognition result is an open-palm state, the robot is controlled to adopt a standing posture; if one frame of gesture image is recognized and the gesture recognition result is a fist state, the robot is controlled to adopt a squatting posture; if two adjacent frames of gesture images are recognized, the gesture recognition result is an open-palm state, and the gesture moves from a first position to a second position, the robot is controlled to adopt a standing posture and to move from its current position to the target position corresponding to the second position in the space where the robot is located. It should be noted that the above description is only an exemplary illustration of the present application.
Step 3033, converting the gesture recognition result into a posture parameter and a position parameter.
In the embodiment of the application, the augmented reality device inputs at least one frame of gesture image into the deep learning gesture model to obtain a gesture recognition result of the at least one frame of gesture image, and then converts the gesture recognition result into the posture parameter and the position parameter of the target robot, so that the target robot changes its posture with the posture parameter and/or changes its position with the position parameter while collecting at least one frame of image. In this way, a gesture image is obtained through the image acquisition module of the augmented reality device and recognized to obtain a gesture recognition result, and the target robot is then controlled to perform interactive actions according to that result; combined with the immersive scene of the augmented reality device, this increases the interest of interactive control between the augmented reality device and the robot.
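A minimal sketch of this conversion step is given below. The gesture labels, the mapping table, and the hand-to-robot scale factor are all assumptions for illustration; the embodiment only gives the open-palm/stand, fist/squat, and position-movement examples above.

    from dataclasses import dataclass
    from typing import List, Optional, Tuple

    @dataclass
    class ControlParams:
        posture: str                                     # e.g. "stand", "squat"
        target_position: Optional[Tuple[float, float]]   # offset in robot space

    # Hypothetical mapping from gesture labels (as a deep learning gesture
    # model might emit them) to robot postures.
    GESTURE_TO_POSTURE = {"open_palm": "stand", "fist": "squat"}

    def gestures_to_params(labels: List[str],
                           positions: List[Tuple[float, float]],
                           scale: float = 1.0) -> ControlParams:
        """Convert per-frame gesture labels and hand positions into parameters.

        labels:    one recognized gesture label per frame
        positions: the hand's 2-D position per frame in the camera view
        scale:     hand-space to robot-space factor (an assumption)
        """
        posture = GESTURE_TO_POSTURE.get(labels[-1], "stand")
        target = None
        if len(positions) >= 2:
            # The hand moved from a first position to a second position, so
            # derive a displacement for the robot in its own space.
            dx = (positions[-1][0] - positions[0][0]) * scale
            dy = (positions[-1][1] - positions[0][1]) * scale
            target = (dx, dy)
        return ControlParams(posture=posture, target_position=target)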
In the embodiment of the present application, referring to fig. 6, step 303 converts the remote control operation into the corresponding posture parameter and position parameter of the target robot in response to the remote control operation, and can also be implemented by the following steps,
step 3034, obtaining a trigger operation of the remote lever aiming at the augmented reality equipment.
In one implementation scenario, referring to fig. 7, a virtual button 701 is presented on the remote lever of the augmented reality device, and a trigger operation for the remote lever is generated by pressing the virtual button 701 and moving it from a central position 702 in any direction, such as up, down, left, right, diagonally up-left or down-right, in a circle, continuing downward, and the like.
And step 3035, converting the operation information corresponding to the trigger operation into the attitude parameter and the position parameter.
In the embodiment of the present application, the operation information corresponding to the trigger operation includes information describing how the virtual button is pressed and moved from the central position in any direction, such as up, down, left, right, diagonally up-left or down-right, in a circle, continuing downward, and the like.
In the embodiment of the application, after the augmented reality device acquires the trigger operation for its remote lever, the operation information corresponding to the trigger operation is converted into the attitude parameter and the position parameter, so that the target robot changes its attitude with the attitude parameter and/or changes its position with the position parameter while collecting at least one frame of image. That is to say, in the embodiment of the application, a precise control direction is identified through the touch operation, and the target robot is thereby controlled to perform interactive actions; combined with the immersive scene of the augmented reality device, this makes virtual control convenient and increases the interest of interactive control between the augmented reality device and the robot.
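One plausible mapping from the button displacement to the two parameters is sketched below; the normalized input range and the speed limit are assumptions, not values taken from the embodiment.

    import math

    def joystick_to_params(dx: float, dy: float, max_speed: float = 0.5) -> dict:
        """Map the virtual button's offset from the central position to motion.

        dx, dy: the button's displacement from center, normalized to [-1, 1]
        max_speed: an assumed robot speed limit in m/s
        """
        magnitude = min(math.hypot(dx, dy), 1.0)
        heading = math.atan2(dy, dx)      # attitude parameter: direction to face
        speed = magnitude * max_speed     # position parameter: rate of movement
        return {"heading_rad": heading, "speed_mps": speed}

    # e.g. moving the button straight up: joystick_to_params(0.0, 1.0)
    # -> {"heading_rad": 1.5707..., "speed_mps": 0.5}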
In the embodiment of the present application, referring to fig. 8, step 303 converts the remote control operation into the corresponding posture parameter and position parameter of the target robot in response to the remote control operation, and can also be implemented by the following steps,
and 3036, establishing a coordinate system of the augmented reality equipment based on the acquired reference image acquired by the target robot.
Step 3037, the position of the operation object of the augmented reality device is taken as the starting point of the ray in the coordinate system, the direction of the operation object is taken as the extending direction of the ray, and the collision point in the coordinate system is determined.
Wherein the attitude parameters include an extension direction, and the position parameters include a start point and a collision point.
In the embodiment of the application, the reference image is an image sent by the target robot to the augmented reality device through wireless communication, and it may be the image first acquired by the target robot. After receiving the reference image, the augmented reality device establishes its spatial rectangular coordinate system, takes the position of an operation object of the augmented reality device, such as a hand, as the starting point of a ray in that coordinate system, takes the orientation of the operation object as the extension direction of the ray, and extends the ray along that direction; during the extension, the collision point where the ray collides with the reference image in the spatial rectangular coordinate system is determined. The attitude parameter of the target robot is then determined based on the extension direction, and the position parameter of the target robot is determined based on the starting point and the collision point.
In an implementation scenario, when the augmented reality device starts, its spatial rectangular coordinate system is established; the position of the operation object, such as a hand, in this coordinate system is determined through a Simultaneous Localization And Mapping (SLAM) algorithm and taken as the starting point of the ray, while the angle of the handle is calculated by a gyroscope and the orientation of the hand is taken as the extension direction of the ray. After the starting point and the extension direction are determined, the ray is extended from the starting point along the extension direction, and the collision point in the spatial rectangular coordinate system is determined, where the position of the collision point can be determined by a collision detection component such as a collider. Further, the augmented reality device determines the attitude parameter of the target robot based on the extension direction, and determines the position parameter of the target robot based on the starting point and the collision point.
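The geometry of this ray cast can be illustrated with a simple ray-plane intersection, as in the sketch below; the analytic plane stands in for the collider component mentioned above, since a real engine would use its own collision detection.

    import numpy as np

    def cast_ray(start, direction, plane_point, plane_normal):
        """Return where the ray hits the plane holding the reference image.

        start:        the hand's position in the spatial rectangular coordinates
        direction:    the hand's orientation (the ray's extension direction)
        plane_point:  any point on the plane where the reference image is placed
        plane_normal: that plane's normal vector
        """
        start = np.asarray(start, dtype=float)
        d = np.asarray(direction, dtype=float)
        d = d / np.linalg.norm(d)
        n = np.asarray(plane_normal, dtype=float)
        denom = n.dot(d)
        if abs(denom) < 1e-9:
            return None  # ray parallel to the plane: no collision point
        t = n.dot(np.asarray(plane_point, dtype=float) - start) / denom
        if t < 0:
            return None  # the plane lies behind the starting point
        return start + t * d  # the collision point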
And step 304, sending the attitude parameters and the position parameters to the target robot in a wireless communication mode.
In the embodiment of the application, the wireless communication modes between the augmented reality device and the robot include Rosbridge and Ros Unity Hub. Here, Rosbridge is the main method provided by the Robot Operating System (ROS) official website for developers to communicate interactively with non-ROS systems such as AR devices, and it provides external programs with a JavaScript Object Notation (JSON) Application Programming Interface (API) that can call the functions of ROS. Referring to fig. 9, the rosbridge_server sub-function module in Rosbridge is responsible for managing the communication transport layer, and the call instructions of these functions are all issued in JSON format through the rosbridge_server sub-function module. Here, Rosbridge provides communication schemes for different communication architectures, including WebSocket, the Transmission Control Protocol (TCP), and the User Datagram Protocol (UDP). WebSocket follows a Browser/Server architecture and is mainly used for servers that interact with web browsers; TCP and UDP are directed to a Client/Server architecture, in which UDP communication is faster but may lose packets, so there is some uncertainty, while TCP communication requires a three-way handshake, so the transmission process is more stable and accurate. A suitable communication mode in Rosbridge can be selected according to the actual project requirements.
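For illustration, the snippet below uses roslibpy, an existing Python client for the rosbridge protocol, to publish a command over a WebSocket connection; the robot address, topic name, and use of geometry_msgs/Twist are assumptions, since the embodiment does not name them.

    import roslibpy

    # Address and port of the rosbridge_server on the robot (examples only).
    ros = roslibpy.Ros(host="192.168.1.3", port=9090)
    ros.run()

    cmd = roslibpy.Topic(ros, "/ar_teleop/cmd_vel", "geometry_msgs/Twist")
    # rosbridge serializes this call instruction as JSON on the wire, e.g.
    # {"op": "publish", "topic": "/ar_teleop/cmd_vel", "msg": {...}}.
    cmd.publish(roslibpy.Message({
        "linear": {"x": 0.2, "y": 0.0, "z": 0.0},   # move forward
        "angular": {"x": 0.0, "y": 0.0, "z": 0.5},  # and turn
    }))
    cmd.unadvertise()
    ros.terminate()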
In the embodiment of the present application, referring to fig. 10, step 304 of sending the attitude parameter and the position parameter to the target robot through wireless communication can be implemented by the following steps,
step 3041, obtain the assigned port number of the target robot.
Step 3042, establishing a communication link between the augmented reality device and the target robot based on the designated port number and the designated internet protocol address.
Step 3043, sending the pose parameters and the position parameters to the target robot through the communication link.
In the embodiment of the application, when the augmented reality device selects the TCP communication mode for data transmission, the augmented reality device first acquires the designated port number of the target robot, establishes a communication link between the augmented reality device and the target robot based on the designated port number and the designated internet protocol address, and sends the attitude parameter and the position parameter to the target robot through the communication link.
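A minimal sender sketch is given below; the JSON-lines wire format and the example port are assumptions, as the embodiment only requires that the parameters travel over the link established from the designated port number and IP address.

    import json
    import socket

    def send_params_tcp(ip: str, port: int, attitude: dict, position: dict) -> None:
        """Open a TCP communication link to the target robot and send parameters."""
        with socket.create_connection((ip, port), timeout=5.0) as link:
            payload = json.dumps({"attitude": attitude, "position": position})
            link.sendall(payload.encode("utf-8") + b"\n")

    # e.g. send_params_tcp("192.168.1.3", 10000,
    #                      {"posture": "stand"}, {"x": 1.0, "y": 0.5})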
In the embodiment of the present application, referring to fig. 11, step 304 of sending the attitude parameter and the position parameter to the target robot through wireless communication can be implemented by the following steps,
step 3044, establish a connection node.
The connecting node is used for carrying out data transmission with the robot.
Step 3045, sending the attitude parameter and the position parameter to the target robot through the connection node, so that the target robot receives the attitude parameter and the position parameter through the connection end point established by itself.
In an implementation application scenario, in the Ros Unity Hub communication mode, a connection endpoint, such as a ROS TCP Endpoint, is established at the robot end; this connection endpoint is a ROS node used for receiving data information sent by the augmented reality device and responsible for sending data messages to the augmented reality device. The augmented reality device end establishes a connection node, such as a ROS TCP Connector, to send and/or receive data information.
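To make the data flow concrete, the sketch below is a robot-side stand-in for the connection endpoint; it is not the real ROS TCP Endpoint (which is a ROS node), and the newline-delimited JSON framing is the same assumption made in the sender sketch above.

    import json
    import socket

    def run_connection_endpoint(port: int = 10000) -> None:
        """Accept the AR device's connection node and read parameter messages."""
        server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        server.bind(("", port))
        server.listen(1)
        conn, _ = server.accept()  # the connection node dials in
        try:
            buffer = b""
            while True:
                chunk = conn.recv(4096)
                if not chunk:
                    break
                buffer += chunk
                while b"\n" in buffer:
                    line, buffer = buffer.split(b"\n", 1)
                    params = json.loads(line)
                    # Hand the attitude/position parameters to motion control.
                    print("received parameters:", params)
        finally:
            conn.close()
            server.close()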
And 305, acquiring at least one frame of image acquired in the process that the target robot carries out posture change by using the posture parameters and/or carries out position change by using the position parameters, and presenting the at least one frame of image on the augmented reality equipment.
In the embodiment of the application, the augmented reality device sends the posture parameter and the position parameter to the target robot through wireless communication; the target robot acquires at least one frame of image while changing its posture with the posture parameter and/or changing its position with the position parameter, and transmits the at least one frame of image to the augmented reality device through wireless communication, where it is presented.
In this way, in the embodiment of the application, the images acquired by the target robot's vision are synchronized to the augmented reality device, bringing the user an immersive experience; meanwhile, interactive control methods of the augmented reality device, such as rays, gesture recognition, and the remote lever, are used to realize functions such as interaction and remote operation control between the augmented reality device and the robot, improving the robot's interactivity, the interest of its content, and its visual innovation.
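The return leg of this loop — the robot streaming frames back for presentation — might look like the following sketch on the augmented reality device side; the 4-byte length-prefix framing and the use of encoded (e.g. JPEG) frames are common choices assumed here, not something the embodiment prescribes.

    import socket
    import struct

    def _recv_exact(link, n):
        """Read exactly n bytes from the link, or None if it closed early."""
        data = b""
        while len(data) < n:
            chunk = link.recv(n - len(data))
            if not chunk:
                return None
            data += chunk
        return data

    def receive_frames(link: socket.socket):
        """Yield encoded image frames sent back by the target robot."""
        while True:
            header = _recv_exact(link, 4)
            if header is None:
                return
            (length,) = struct.unpack(">I", header)
            frame = _recv_exact(link, length)
            if frame is None:
                return
            yield frame  # decode and present on the AR display, e.g. with OpenCV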
It should be noted that, for the descriptions of the same steps and the same contents in this embodiment as those in other embodiments, reference may be made to the descriptions in other embodiments, which are not described herein again.
An embodiment of the present application provides a robot control system, which may be used to implement an information control method provided in the embodiments corresponding to fig. 1, fig. 3 to fig. 4, fig. 6, fig. 8, and fig. 10 to fig. 11, and as shown in fig. 12, the robot control system 12 includes: the augmented reality device 4 and the target robot 2,
the augmented reality device 4 transmits a control parameter corresponding to the remote control operation to the target robot 2 in response to the remote control operation for the target robot 2;
the target robot 2 collects at least one frame of image in the process of operating with the control parameters and sends the at least one frame of image to the augmented reality device 4;
the augmented reality device 4 presents at least one frame of image.
In an implementation scenario, referring to fig. 12, fig. 12 is an architectural diagram of the robot control system 12 provided in the present application, where the robot control system 12 includes at least a plurality of robots, such as a robot 1, a target robot 2, and a robot 3, an augmented reality device 4, and a network 5; the robots such as the robot 1, the target robot 2, and the robot 3 are each connected to the augmented reality device 4 through the network 5. The augmented reality device 4 includes, but is not limited to, AR head-mounted devices, such as AR glasses and AR helmets. The robots include, but are not limited to, quadruped robots, standing robots, and the like. The network 5 includes, but is not limited to, local area networks, metropolitan area networks, and wide area networks.
In other embodiments of the present application, the robot control system further includes at least one robot different from the target robot, and the augmented reality device may select the target robot from the plurality of robots in the following way: the augmented reality device monitors a plurality of internet protocol addresses multicast by a plurality of robots in a wireless network; the augmented reality device responds to the selection operation for the robot and selects, based on the plurality of internet protocol addresses, a target robot with a designated internet protocol address from the plurality of robots, where the plurality of internet protocol addresses include the designated internet protocol address.
In other embodiments of the present application, the augmented reality device may convert the remote control operation into control parameters to control the target robot to capture at least one frame of image in the following way: the augmented reality device responds to the remote control operation and converts it into the attitude parameter and the position parameter corresponding to the target robot, where the control parameters include the attitude parameter and the position parameter; the augmented reality device sends the attitude parameter and the position parameter to the target robot through wireless communication; and at least one frame of image is acquired while the target robot changes its attitude with the attitude parameter and/or changes its position with the position parameter.
In other embodiments of the present application, in an interactive control scenario, the augmented reality device may generate the gesture parameter and the position parameter based on the collected gesture image: the augmented reality equipment obtains at least one acquired frame of gesture image; the augmented reality equipment inputs at least one frame of gesture image into the deep learning gesture model to obtain a gesture recognition result of at least one frame of gesture image; the augmented reality device converts the gesture recognition result into an attitude parameter and a position parameter.
In other embodiments of the present application, in an interactive control scenario, the augmented reality device may further generate the attitude parameter and the position parameter based on the acquired trigger operation for the joystick: the augmented reality equipment acquires a trigger operation of a remote lever aiming at the augmented reality equipment; and the augmented reality equipment converts the operation information corresponding to the trigger operation into an attitude parameter and a position parameter.
In other embodiments of the present application, in an interactive control scenario, the augmented reality device may generate the above-mentioned attitude parameter and position parameter based on a ray: the augmented reality equipment establishes a coordinate system of the augmented reality equipment based on the acquired reference image acquired by the target robot; the augmented reality equipment takes the position of an operation object of the augmented reality equipment as a starting point of a ray in a coordinate system, and takes the orientation of the operation object as the extension direction of the ray to extend, and determines a collision point in the coordinate system; wherein the attitude parameters include an extension direction, and the position parameters include a start point and a collision point.
In other embodiments of the present application, the augmented reality device may establish a wireless communication mode in such a way that the augmented reality device obtains a designated port number of the target robot; the augmented reality equipment establishes a communication link between the augmented reality equipment and the target robot based on the designated port number and the designated internet interconnection protocol address; and the augmented reality equipment sends the attitude parameters and the position parameters to the target robot through the communication link.
In other embodiments of the present application, the augmented reality device may further establish another wireless communication mode by establishing a connection node, where the connection node is used for data transmission with the robot; the augmented reality equipment sends the attitude parameters and the position parameters to the target robot through the connecting nodes so that the target robot receives the attitude parameters and the position parameters through the connecting end points established by the target robot.
An embodiment of the present application provides an information control apparatus that can be used to implement an information control method provided in the embodiments corresponding to fig. 1, fig. 3 to fig. 4, fig. 6, fig. 8, and fig. 10 to fig. 11, and as shown in fig. 13, the information control apparatus 13 includes:
a selection module 1301 for selecting a target robot in response to a selection operation for the robot;
a processing module 1302, configured to obtain at least one frame of image acquired by a target robot under a remote control operation in response to the remote control operation for the target robot;
and a display module 1303, configured to present at least one frame of image on the augmented reality device.
In other embodiments of the present application, the information control apparatus 13 further includes a monitoring module, configured to monitor a plurality of internet protocol addresses multicast by a plurality of robots in a wireless network; a selecting module 1301, further configured to select, in response to the selecting operation for the robot, a target robot having a designated internet protocol address from the plurality of robots based on the plurality of internet protocol addresses; the plurality of internet protocol addresses includes a designated internet protocol address.
In other embodiments of the present application, the processing module 1302 is further configured to convert the remote control operation into an attitude parameter and a position parameter corresponding to the target robot in response to the remote control operation; sending the attitude parameters and the position parameters to a target robot in a wireless communication mode; and acquiring at least one frame of image acquired in the process that the target robot carries out posture change by using the posture parameters and/or carries out position change by using the position parameters.
In other embodiments of the present application, the processing module 1302 is further configured to obtain at least one frame of gesture image acquired by the augmented reality device; inputting at least one frame of gesture image into a deep learning gesture model to obtain a gesture recognition result of at least one frame of gesture image; and converting the gesture recognition result into a posture parameter and a position parameter.
In other embodiments of the present application, the processing module 1302 is further configured to obtain a trigger operation for a joystick of an augmented reality device; and converting the operation information corresponding to the trigger operation into an attitude parameter and a position parameter.
In other embodiments of the present application, the processing module 1302 is further configured to establish a coordinate system of the augmented reality device based on the obtained reference image acquired by the target robot; taking the position of an operation object of the augmented reality equipment as a starting point of a ray in a coordinate system, and taking the direction of the operation object as the extension direction of the ray to extend, and determining a collision point in the coordinate system; wherein the attitude parameters include an extension direction, and the position parameters include a start point and a collision point.
In other embodiments of the present application, the processing module 1302 is further configured to obtain a designated port number of the target robot; establishing a communication link between the augmented reality device and the target robot based on the designated port number and the designated internet interconnection protocol address; and transmitting the attitude parameters and the position parameters to the target robot through the communication link.
In other embodiments of the present application, the processing module 1302 is further configured to establish a connection node, where the connection node is used for data transmission with the robot; and sending the attitude parameters and the position parameters to the target robot through the connecting nodes so that the target robot receives the attitude parameters and the position parameters through a connecting endpoint established by the target robot.
It should be noted that, for the descriptions of the same steps and the same contents in this embodiment as those in other embodiments, reference may be made to the descriptions in other embodiments, which are not described herein again.
Based on the foregoing embodiments, an embodiment of the present application provides an augmented reality device, which may be used to implement an information control method provided in the embodiments corresponding to fig. 1, fig. 3 to fig. 4, fig. 6, fig. 8, and fig. 10 to fig. 11, where, as shown in fig. 14, the augmented reality device 4 (the augmented reality device 4 in fig. 14 corresponds to the information control apparatus 13 in fig. 13) includes: a processor 1401, a memory 1402, and a communication bus 1403, wherein:
the communication bus 1403 is used for enabling communication connection between the processor 1401 and the memory 1402;
the processor 1401 is configured to execute the information control program stored in the memory 1402 to realize the steps of:
selecting a target robot in response to a selection operation for the robot;
in response to a remote control operation for the target robot, at least one frame of image acquired by the target robot under the remote control operation is obtained, and the at least one frame of image is presented on the augmented reality device.
In other embodiments of the present application, the processor 1401 is configured to execute an information control program stored in the memory 1402, so as to implement the following steps:
monitoring a plurality of internet protocol addresses multicast by a plurality of robots in a wireless network;
selecting a target robot having a designated internet protocol address from the plurality of robots based on the plurality of internet protocol addresses in response to the selection operation for the robot; the plurality of internet protocol addresses includes a designated internet protocol address.
In other embodiments of the present application, the processor 1401 is configured to execute an information control program stored in the memory 1402, so as to implement the following steps:
responding to the remote control operation, and converting the remote control operation into an attitude parameter and a position parameter corresponding to the target robot; sending the attitude parameters and the position parameters to a target robot in a wireless communication mode; and acquiring at least one frame of image acquired in the process that the target robot carries out posture change by using the posture parameters and/or carries out position change by using the position parameters.
In other embodiments of the present application, the processor 1401 is configured to execute an information control program stored in the memory 1402, so as to implement the following steps:
obtaining at least one frame of gesture image acquired by augmented reality equipment; inputting at least one frame of gesture image into a deep learning gesture model to obtain a gesture recognition result of at least one frame of gesture image; and converting the gesture recognition result into a posture parameter and a position parameter.
In other embodiments of the present application, the processor 1401 is configured to execute an information control program stored in the memory 1402, so as to implement the following steps:
acquiring a trigger operation of a remote lever for augmented reality equipment; and converting the operation information corresponding to the trigger operation into an attitude parameter and a position parameter.
In other embodiments of the present application, the processor 1401 is configured to execute an information control program stored in the memory 1402, so as to implement the following steps:
establishing a coordinate system of augmented reality equipment based on the acquired reference image acquired by the target robot; taking the position of an operation object of the augmented reality equipment as a starting point of a ray in a coordinate system, and taking the direction of the operation object as the extension direction of the ray to extend, and determining a collision point in the coordinate system; wherein the attitude parameters include an extension direction, and the position parameters include a start point and a collision point.
In other embodiments of the present application, the processor 1401 is configured to execute an information control program stored in the memory 1402, so as to implement the following steps:
acquiring a designated port number of a target robot; establishing a communication link between the augmented reality device and the target robot based on the designated port number and the designated internet interconnection protocol address; and transmitting the attitude parameters and the position parameters to the target robot through the communication link.
In other embodiments of the present application, the processor 1401 is configured to execute an information control program stored in the memory 1402, so as to implement the following steps:
establishing a connection node, wherein the connection node is used for carrying out data transmission with the robot; and sending the attitude parameters and the position parameters to the target robot through the connecting nodes so that the target robot receives the attitude parameters and the position parameters through a connecting endpoint established by the target robot.
It should be noted that, for the descriptions of the same steps and the same contents in this embodiment as those in other embodiments, reference may be made to the descriptions in other embodiments, which are not described herein again.
Based on the foregoing embodiments, embodiments of the present application provide a computer storage medium storing one or more programs, the one or more programs being executable by one or more processors to implement the steps of:
selecting a target robot in response to a selection operation for the robot;
in response to a remote control operation for the target robot, at least one frame of image acquired by the target robot under the remote control operation is obtained, and the at least one frame of image is presented on the augmented reality device.
In other embodiments of the present application, the one or more programs are executable by the one or more processors and further implement the steps of:
monitoring a plurality of internet protocol addresses multicast by a plurality of robots in a wireless network;
selecting a target robot having a designated internet protocol address from the plurality of robots based on the plurality of internet protocol addresses in response to the selection operation for the robot; the plurality of internet protocol addresses includes a designated internet protocol address.
In other embodiments of the present application, the one or more programs are executable by the one or more processors and further implement the steps of:
responding to the remote control operation, and converting the remote control operation into an attitude parameter and a position parameter corresponding to the target robot; sending the attitude parameters and the position parameters to a target robot in a wireless communication mode; and acquiring at least one frame of image acquired in the process that the target robot carries out posture change by using the posture parameters and/or carries out position change by using the position parameters.
In other embodiments of the present application, the one or more programs are executable by the one or more processors and further implement the steps of:
obtaining at least one frame of gesture image acquired by augmented reality equipment; inputting at least one frame of gesture image into a deep learning gesture model to obtain a gesture recognition result of at least one frame of gesture image; and converting the gesture recognition result into a posture parameter and a position parameter.
In other embodiments of the present application, the one or more programs are executable by the one or more processors and further implement the steps of:
acquiring a trigger operation of a remote lever for augmented reality equipment; and converting the operation information corresponding to the trigger operation into an attitude parameter and a position parameter.
In other embodiments of the present application, the one or more programs are executable by the one or more processors and further implement the steps of:
establishing a coordinate system of augmented reality equipment based on the acquired reference image acquired by the target robot; taking the position of an operation object of the augmented reality equipment as a starting point of a ray in a coordinate system, and taking the direction of the operation object as the extension direction of the ray to extend, and determining a collision point in the coordinate system; wherein the attitude parameters include an extension direction, and the position parameters include a start point and a collision point.
In other embodiments of the present application, the one or more programs are executable by the one or more processors and further implement the steps of:
acquiring a designated port number of a target robot; establishing a communication link between the augmented reality device and the target robot based on the designated port number and the designated internet interconnection protocol address; and transmitting the attitude parameters and the position parameters to the target robot through the communication link.
In other embodiments of the present application, the one or more programs are executable by the one or more processors and further implement the steps of:
establishing a connection node, wherein the connection node is used for carrying out data transmission with the robot; and sending the attitude parameters and the position parameters to the target robot through the connecting nodes so that the target robot receives the attitude parameters and the position parameters through a connecting endpoint established by the target robot.
It should be noted that, for the descriptions of the same steps and the same contents in this embodiment as those in other embodiments, reference may be made to the descriptions in other embodiments, which are not described herein again.
The computer storage medium/Memory may be a Read Only Memory (ROM), a Programmable Read Only Memory (PROM), an Erasable Programmable Read Only Memory (EPROM), an Electrically Erasable Programmable Read Only Memory (EEPROM), a Ferromagnetic Random Access Memory (FRAM), a Flash Memory, a magnetic surface memory, an optical disc, or a Compact Disc Read-Only Memory (CD-ROM); it may also be any of various terminals that include one or any combination of the above-mentioned memories, such as mobile phones, computers, tablet devices, and personal digital assistants.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The device embodiments described above are merely illustrative. For example, the division into units is only a division by logical function, and other divisions are possible in actual implementation; for instance, multiple units or components may be combined or integrated into another system, or some features may be omitted or not implemented. In addition, the coupling, direct coupling, or communication connection between the components shown or discussed may be implemented through some interfaces, and the indirect coupling or communication connection between devices or units may be electrical, mechanical, or take other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units; that is, they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present application may all be integrated into one processing unit, or each unit may serve as a separate unit, or two or more units may be integrated into one unit; the integrated unit may be implemented in the form of hardware, or in the form of hardware plus a software functional unit. Those of ordinary skill in the art will understand that all or part of the steps for implementing the above method embodiments may be performed by hardware related to program instructions. The program may be stored in a computer-readable storage medium and, when executed, performs the steps of the above method embodiments. The aforementioned storage medium includes various media capable of storing program code, such as a removable storage device, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disc.
The methods disclosed in the several method embodiments provided in the present application may be combined arbitrarily without conflict to obtain new method embodiments.
The features disclosed in the several method or apparatus embodiments provided in the present application may be combined arbitrarily, without conflict, to arrive at new method embodiments or apparatus embodiments.
The above descriptions are only specific embodiments of the present application, but the scope of protection of the present application is not limited thereto. Any change or substitution that can be readily conceived by a person skilled in the art within the technical scope disclosed in the present application shall fall within the scope of protection of the present application. Therefore, the scope of protection of the present application shall be subject to the scope of protection of the claims.

Claims (14)

1. An information control method, applied to an augmented reality device, the method comprising:
selecting a target robot in response to a selection operation for the robot;
in response to a remote control operation for the target robot, obtaining at least one frame of image acquired by the target robot under the remote control operation, and presenting the at least one frame of image on the augmented reality device.
2. The method of claim 1, wherein before the selecting a target robot in response to a selection operation for the robot, the method further comprises:
monitoring a plurality of internet protocol addresses multicast by a plurality of robots in a wireless network;
accordingly, the selecting a target robot in response to the selection operation for the robot comprises:
in response to the selection operation for the robot, selecting, based on the plurality of internet protocol addresses, the target robot having a specified internet protocol address from the plurality of robots; wherein the plurality of internet protocol addresses include the specified internet protocol address.
3. The method of claim 1, wherein the obtaining, in response to a remote control operation for the target robot, at least one frame of image acquired by the target robot under the remote control operation comprises:
in response to the remote control operation, converting the remote control operation into pose parameters and position parameters corresponding to the target robot;
sending the pose parameters and the position parameters to the target robot by wireless communication; and
obtaining the at least one frame of image acquired while the target robot changes pose with the pose parameters and/or changes position with the position parameters.
4. The method of claim 3, wherein the converting, in response to the remote control operation, the remote control operation into the pose parameters and the position parameters corresponding to the target robot comprises:
obtaining at least one frame of gesture image acquired by the augmented reality device;
inputting the at least one frame of gesture image into a deep learning gesture model to obtain a gesture recognition result of the at least one frame of gesture image;
and converting the gesture recognition result into the pose parameters and the position parameters.
5. The method of claim 3, wherein the converting, in response to the remote control operation, the remote control operation into the pose parameters and the position parameters corresponding to the target robot comprises:
acquiring a trigger operation on a remote lever of the augmented reality device; and
converting operation information corresponding to the trigger operation into the pose parameters and the position parameters.
6. The method of claim 3, wherein the converting, in response to the remote control operation, the remote control operation into the pose parameters and the position parameters corresponding to the target robot comprises:
establishing a coordinate system of the augmented reality device based on the obtained reference image acquired by the target robot;
determining a collision point in the coordinate system by taking the position of an operation object of the augmented reality device as a starting point of a ray in the coordinate system and the orientation of the operation object as an extension direction of the ray; wherein the pose parameters include the extension direction, and the position parameters include the starting point and the collision point.
7. The method of any one of claims 3 to 6, wherein the sending the pose parameters and the position parameters to the target robot by wireless communication comprises:
acquiring a designated port number of the target robot;
establishing a communication link between the augmented reality device and the target robot based on the designated port number and a designated internet protocol address; and
transmitting the pose parameters and the position parameters to the target robot over the communication link.
8. The method of any one of claims 3 to 6, wherein the sending the pose parameters and the position parameters to the target robot by wireless communication comprises:
establishing a connection node, wherein the connection node is used for data transmission with the robot; and
sending the pose parameters and the position parameters to the target robot through the connection node, so that the target robot receives the pose parameters and the position parameters through a connection endpoint established by the target robot.
9. An information control apparatus, characterized in that the apparatus comprises:
a selection module for selecting a target robot in response to a selection operation for the robot;
a processing module for obtaining, in response to a remote control operation for the target robot, at least one frame of image acquired by the target robot under the remote control operation; and
a display module for presenting the at least one frame of image on the augmented reality device.
10. A robot control system, characterized in that the robot control system includes an augmented reality device and a target robot,
the augmented reality device, in response to a remote control operation for the target robot, sends control parameters corresponding to the remote control operation to the target robot;
the target robot acquires at least one frame of image while operating with the control parameters and sends the at least one frame of image to the augmented reality device; and
the augmented reality device presents the at least one frame of image.
11. The system of claim 10, wherein the robot control system further comprises at least one robot different from the target robot,
the augmented reality device monitors a plurality of internet protocol addresses multicast by a plurality of robots in a wireless network;
the augmented reality device, in response to a selection operation for the robot, selects, based on the plurality of internet protocol addresses, the target robot having a specified internet protocol address from the plurality of robots; wherein the plurality of internet protocol addresses include the specified internet protocol address.
12. The system of claim 10 or 11,
the augmented reality device, in response to the remote control operation, converts the remote control operation into pose parameters and position parameters corresponding to the target robot; wherein the control parameters include the pose parameters and the position parameters;
the augmented reality device sends the pose parameters and the position parameters to the target robot by wireless communication; and
the target robot acquires the at least one frame of image while changing pose with the pose parameters and/or changing position with the position parameters.
13. An augmented reality device, comprising: a processor, a memory, and a communication bus;
the communication bus is used for realizing communication connection between the processor and the memory;
the processor is configured to execute an information control program stored in the memory to implement the information control method according to any one of claims 1 to 8.
14. A storage medium characterized by storing one or more programs, which are executable by one or more processors to implement the information control method according to any one of claims 1 to 8.
CN202111530308.3A 2021-12-14 2021-12-14 Information control method, device, system, equipment and storage medium Pending CN114281190A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111530308.3A CN114281190A (en) 2021-12-14 2021-12-14 Information control method, device, system, equipment and storage medium


Publications (1)

Publication Number Publication Date
CN114281190A 2022-04-05

Family

ID=80872163

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111530308.3A Pending CN114281190A (en) 2021-12-14 2021-12-14 Information control method, device, system, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114281190A (en)


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108520552A * 2018-03-26 2018-09-11 Guangdong Oppo Mobile Telecommunications Corp Ltd Image processing method, device, storage medium and electronic equipment
CN111383348A * 2020-03-17 2020-07-07 Beijing Institute of Technology Method for remotely and synchronously controlling robot through virtual reality
CN112198957A * 2020-08-28 2021-01-08 Beijing Institute of Technology Remote interaction system and method

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116761212A * 2023-07-27 2023-09-15 Beijing Xiaomi Robot Technology Co., Ltd. Image transmission control method, device, terminal equipment and storage medium
CN116761212B * 2023-07-27 2024-04-23 Beijing Xiaomi Robot Technology Co., Ltd. Image transmission control method, device, terminal equipment and storage medium
CN117021117A * 2023-10-08 2023-11-10 University of Electronic Science and Technology of China Mobile robot man-machine interaction and positioning method based on mixed reality
CN117021117B * 2023-10-08 2023-12-15 University of Electronic Science and Technology of China Mobile robot man-machine interaction and positioning method based on mixed reality

Similar Documents

Publication Publication Date Title
CN107340853B (en) Remote presentation interaction method and system based on virtual reality and gesture recognition
CN102253712B (en) Recognition system for sharing information
US20170038829A1 (en) Social interaction for remote communication
US11782272B2 (en) Virtual reality interaction method, device and system
EP1193651A2 (en) Compound reality presentation
CN114281190A (en) Information control method, device, system, equipment and storage medium
CN106873767B (en) Operation control method and device for virtual reality application
WO2019019968A1 (en) Displacement control method and device for virtual character, and storage medium
WO2019057150A1 (en) Information exchange method and apparatus, storage medium and electronic apparatus
CN110947181A (en) Game picture display method, game picture display device, storage medium and electronic equipment
JP2012171024A (en) Robot system
CN108525305A (en) Image processing method, device, storage medium and electronic equipment
CN111716365B (en) Immersive remote interaction system and method based on natural walking
US20150331483A1 (en) System and method for simulating a user presence
CN107943282A (en) A kind of man-machine interactive system and method based on augmented reality and wearable device
CN112639685A (en) Display device sharing and interaction in Simulated Reality (SR)
WO2018032970A1 (en) Authentication method based on virtual reality scene, virtual reality device, and storage medium
CN115337634A (en) VR (virtual reality) system and method applied to meal games
CN112121406A (en) Object control method and device, storage medium and electronic device
CN111459432B (en) Virtual content display method and device, electronic equipment and storage medium
CN113315963A (en) Augmented reality display method, device, system and storage medium
CN114053693B (en) Object control method and device in virtual scene and terminal equipment
CN115624740A (en) Virtual reality equipment, control method, device and system thereof, and interaction system
CN114020978A (en) Park digital roaming display method and system based on multi-source information fusion
CN109753140B (en) Operation instruction obtaining method and device based on virtual reality

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination