CN112201243A - Human-computer interaction device and corresponding mobile user terminal - Google Patents

Human-computer interaction device and corresponding mobile user terminal Download PDF

Info

Publication number
CN112201243A
Authority
CN
China
Prior art keywords
user terminal
human
computer interaction
user
controlled
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011053863.7A
Other languages
Chinese (zh)
Inventor
戚耀文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mercedes Benz Group AG
Original Assignee
Daimler AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Daimler AG filed Critical Daimler AG
Priority to CN202011053863.7A
Publication of CN112201243A
Legal status: Pending

Classifications

    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L 15/00 Speech recognition
    • G10L 15/22 Procedures used during a speech recognition process, e.g. man-machine dialogue
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/16 Sound input; Sound output
    • G06F 3/162 Interface to dedicated audio devices, e.g. audio drivers, interface to CODECs
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/16 Sound input; Sound output
    • G06F 3/165 Management of the audio stream, e.g. setting of volume, audio stream path
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/4401 Bootstrapping
    • G06F 9/4418 Suspend and resume; Hibernate and awake
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L 15/00 Speech recognition
    • G10L 15/22 Procedures used during a speech recognition process, e.g. man-machine dialogue
    • G10L 2015/223 Execution procedure of a spoken command

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Acoustics & Sound (AREA)
  • Computational Linguistics (AREA)
  • Computer Security & Cryptography (AREA)
  • Telephone Function (AREA)

Abstract

The invention relates to the field of human-computer interaction, and more particularly to a human-computer interaction device (100) comprising a control apparatus (10) and at least one controlled apparatus (30) communicatively connected to the control apparatus (10) so as to be controllable by it. The control apparatus (10) is configured to determine the position and orientation of a mobile user terminal (200) and, based on the determined position and orientation, to identify the controlled apparatus (30) toward which the user terminal (200) is oriented or pointed as the object to be controlled by a user's voice command. The invention further relates to a mobile user terminal that can be communicatively connected to the human-computer interaction device, the user terminal comprising a sensor for detecting the position and/or orientation of the user terminal and transmitting the detected data to the human-computer interaction device; in particular, the user terminal may further comprise a voice input means for receiving a user's voice input. With the invention, the target smart device to be controlled by voice can be identified conveniently and accurately.

Description

Human-computer interaction device and corresponding mobile user terminal
Technical Field
The invention relates to a human-computer interaction device. The invention further relates to a mobile user terminal that can be communicatively connected to the human-computer interaction device.
Background
With advancing technology, more and more voice-controllable smart devices are appearing in everyday life. When several smart devices are arranged in one space, a user who wants to control one of them often has to approach that device to issue a voice command. This is not only inconvenient; the command may also be picked up by a nearby smart device and cause an unintended operation.
Moreover, in a shared space, such as a shared vehicle, voice interaction involving several persons can lead to their voice commands interfering with one another.
Accordingly, it is desirable to provide a solution that can conveniently and accurately identify a target smart device for which voice control is desired.
Disclosure of Invention
The object of the present invention is achieved by providing a human-computer interaction device, which includes a control means and at least one controlled means communicatively connected to the control means so as to be controllable by the control means, the control means being configured to determine a position and an orientation at which a movable user terminal is located and to identify the controlled means, toward which the user terminal is oriented or pointed, as an object to be controlled by a voice command of a user based on the determined position and orientation.
According to an alternative embodiment, the control device is configured to respond to a user's voice command by sending a control signal to the controlled device toward which, or at which, the user terminal was oriented or pointed when the voice command was issued.
According to an alternative embodiment, the control means utilizes position and/or orientation data of the user terminal measured by sensors in the user terminal when determining the position and orientation of the user terminal.
According to an optional embodiment, the human-computer interaction device further comprises a storage device in communication connection with the control device, wherein the storage device stores therein position information of each controlled device, and the control device utilizes the position information when determining the controlled device towards which the user terminal is oriented or pointed.
According to an alternative embodiment, the human-computer interaction device further comprises a voice input device in communication with the control device, the voice input device being configured to receive a user's voice command for operating the controlled device.
According to an alternative embodiment, the human-computer interaction device further comprises a positioning device communicatively connected to the control device, the positioning device being configured to determine the position and/or distance of the user terminal relative to the positioning device, wherein the position information of the positioning device is pre-stored and the position information and the measurement results of the positioning device are used to determine the position and orientation of the user terminal.
The object of the present invention is also achieved by providing a mobile user terminal capable of communicative connection with a human-computer interaction device according to the above description, the user terminal comprising a sensor for detecting a position and/or orientation of the user terminal and transmitting the detected position and/or orientation data to the human-computer interaction device.
According to an alternative embodiment, the user terminal further comprises voice input means for receiving a voice input of the user.
According to an alternative embodiment, the user terminal is configured as a device carried on the user's head or hand, or as a device that the user can easily hold.
According to an alternative embodiment, the user terminal is configured to wake up voice control of the controlled device upon detection of an occurrence of a triggering event, the triggering event comprising at least one of:
-the switch of the user terminal is operated;
-the display of the user terminal is tapped;
- a voice input from the user for waking up is detected.
According to an alternative embodiment, the user terminal is configured to display the following voice control states by means of a display screen:
- awakened and awaiting input;
- input in progress;
- input command understood; and/or
- operation being carried out.
By the invention, the following effects are realized:
-enabling convenient and accurate targeting of a target smart device for which voice control is desired, without the need for a user to approach the target smart device;
-avoiding mutual interference of the multi-person voice commands;
-preventing the voice command from causing malfunction of a device other than the target smart device; and
- achieving "what you see is what you control" and "where you point is what you control".
Further advantages and advantageous embodiments of the inventive subject matter are apparent from the description, the drawings and the claims.
Drawings
Further features and advantages of the present invention will be further elucidated by the following detailed description of an embodiment thereof, with reference to the accompanying drawings. The attached drawings are as follows:
FIG. 1 shows a schematic block diagram of a human-computer interaction device according to an exemplary embodiment of the present invention; and
FIG. 2 illustrates a schematic block diagram of a user terminal according to an exemplary embodiment of the present invention.
Detailed Description
In order to make the technical problems, technical solutions and advantageous effects of the present invention more apparent, the present invention will be described in further detail with reference to the accompanying drawings and exemplary embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and do not limit the scope of the invention. In the drawings, the same or similar reference numerals refer to the same or equivalent parts.
Fig. 1 shows a schematic block diagram of a human-computer interaction device 100 according to an exemplary embodiment of the present invention. The human-computer interaction device 100 comprises a control device 10 and at least one controlled device 30 which is communicatively connected to the control device 10 and is controlled by it. The control device is configured to determine the position and orientation of the mobile user terminal 200 and, based on the determined position and orientation, to identify the controlled device 30 toward which the user terminal 200 is oriented or pointed as the object to be controlled by the user's voice command.
According to an exemplary embodiment of the present invention, the human-computer interaction device 100 can be used to voice-control a plurality of controlled devices 30 in the same spatial region. For example, as a vehicle-mounted human-computer interaction device it may operate a plurality of vehicle-mounted devices serving as the controlled devices 30, including but not limited to: the vehicle air-conditioning or ventilation system, the individual windows, lighting devices, interior lamps, the head-unit display screen, storage compartments, the trunk lid and the individual doors. Alternatively, the human-computer interaction device 100 can also be used to operate a plurality of controlled devices in the same indoor space, such as a room or a public place (e.g., a mall). In this case, the controlled devices 30 may be various electrical appliances.
According to an exemplary embodiment of the invention, the user terminal 200 may be a wearable device, e.g. a head-mounted device such as a headset or glasses, carried along with the user's head. In this case, the orientation of the headset generally represents the direction of the user's line of sight, so when the user wants to control a controlled device 30 by voice, he can select it simply by turning his head naturally to look at it, without any further action. This achieves "what you see is what you control". For example, when a person looks at a window and says "open window" or "close window", the window being looked at is opened or closed; when a person wants to turn on a lighting device, he can look at it and say "turn on" to light it up.
Additionally or alternatively, the user terminal 200 may be a device carried on or easily held by the user's arm or hand, for example: another wearable device such as a bracelet or watch; a vehicle key; a mobile phone; a tablet computer; or a dedicated control terminal. In this case, the user selects the controlled device 30 by pointing the user terminal 200 at the controlled device 30 that is to be voice-controlled, thereby realizing "where you point is what you control".
Further, the control device 10 is configured to respond to a user's voice command by sending a control signal to the controlled device 30 toward which, or at which, the user terminal 200 was oriented or pointed when the voice command was issued. To this end, the control device 10 acquires the time at which the voice command was issued and determines the position and orientation of the user terminal 200 at that time.
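By way of illustration only, one possible way to meet this timing requirement is to keep a short history of terminal poses and look up the pose whose timestamp is closest to the start of the voice command. The following minimal Python sketch assumes such a pose history; the class and field names are illustrative assumptions, not taken from the disclosure.

```python
from bisect import bisect_left
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Pose:
    timestamp: float                         # seconds since epoch
    position: Tuple[float, float, float]     # terminal position in the room/vehicle frame
    direction: Tuple[float, float, float]    # unit vector of the calibration direction

class PoseHistory:
    """Short buffer of recent terminal poses, queried by the voice-command time."""

    def __init__(self, max_age_s: float = 5.0):
        self.max_age_s = max_age_s
        self._poses: List[Pose] = []         # kept in ascending timestamp order

    def add(self, pose: Pose) -> None:
        self._poses.append(pose)
        cutoff = pose.timestamp - self.max_age_s
        while self._poses and self._poses[0].timestamp < cutoff:
            self._poses.pop(0)               # drop samples older than max_age_s

    def pose_at(self, command_time: float) -> Pose:
        """Return the stored pose closest in time to command_time (assumes at least one sample)."""
        times = [p.timestamp for p in self._poses]
        i = bisect_left(times, command_time)
        if i == 0:
            return self._poses[0]
        if i == len(self._poses):
            return self._poses[-1]
        before, after = self._poses[i - 1], self._poses[i]
        return before if command_time - before.timestamp <= after.timestamp - command_time else after
```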
The control device 10 utilizes the position and/or orientation data of the user terminal 200 measured by the sensor 220 in the user terminal 200 when determining the position and orientation of the user terminal 200. The sensors 220 are, for example: an accelerometer (such as a three-axis or six-axis accelerometer), a gyroscope (such as a three-axis or six-axis gyroscope), a magnetometer (such as a three-axis or six-axis magnetometer), or any combination thereof.
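A minimal sketch of how the terminal's attitude could be derived from a single accelerometer and magnetometer sample is given below (a standard tilt-compensated compass); a real implementation would additionally fuse the gyroscope, e.g. with a complementary or Kalman filter. The function name and frame conventions are assumptions.

```python
import numpy as np

def orientation_from_imu(accel, mag):
    """Static attitude estimate from one accelerometer and one magnetometer sample
    (3-vectors in the terminal frame). Returns roll, pitch, yaw in radians;
    yaw is a tilt-compensated magnetic heading."""
    ax, ay, az = np.asarray(accel, dtype=float) / np.linalg.norm(accel)
    roll = np.arctan2(ay, az)
    pitch = np.arctan2(-ax, np.sqrt(ay ** 2 + az ** 2))
    mx, my, mz = np.asarray(mag, dtype=float) / np.linalg.norm(mag)
    # Rotate the magnetic field back into the horizontal plane before taking the heading.
    mx_h = (mx * np.cos(pitch)
            + my * np.sin(pitch) * np.sin(roll)
            + mz * np.sin(pitch) * np.cos(roll))
    my_h = my * np.cos(roll) - mz * np.sin(roll)
    yaw = np.arctan2(-my_h, mx_h)
    return roll, pitch, yaw
```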
In the case where the human machine interaction device 100 is used for a vehicle, an in-vehicle sensor 20 for detecting the position and/or direction of the vehicle may also be disposed in the vehicle. In this case, the control device 10 may determine the position and the direction of the user terminal 200 within the vehicle based on the measurement result of the sensor 220 of the user terminal 200 and the measurement result of the in-vehicle sensor 20. The in-vehicle sensor 20 is, for example: an accelerometer (such as a three-axis or six-axis accelerometer), a gyroscope (such as a three-axis or six-axis gyroscope), a magnetometer (such as a three-axis or six-axis magnetometer), or any combination thereof.
Additionally or alternatively, a positioning device may be provided within the space, e.g. within a vehicle or a room, which positioning device is configured to be able to measure the position and/or distance of the user terminal 200 relative to the positioning device, wherein the position information of the positioning device within the space is pre-stored. The position information and measurement results of the positioning means may also be used to determine the position and orientation of the user terminal 200. Illustratively, the positioning device may be a depth camera.
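Assuming the positioning device reports an offset vector to the terminal in its own frame (for example from a depth-camera detection), its pre-stored pose can be used to place the terminal in the vehicle or room frame, roughly as sketched below; all names and frames are illustrative.

```python
import numpy as np

def terminal_position(locator_position, locator_rotation, measured_offset):
    """Place the terminal in the vehicle/room frame.

    locator_position -- pre-stored 3D position of the positioning device
    locator_rotation -- 3x3 matrix rotating the device frame into the room frame
    measured_offset  -- 3D vector from the device to the terminal, in the device frame
    """
    return (np.asarray(locator_position, dtype=float)
            + np.asarray(locator_rotation, dtype=float) @ np.asarray(measured_offset, dtype=float))
```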
According to an example, the onboard sensor 20 and/or the positioning device may be arranged at a central portion of the vehicle body. According to another example, the depth camera may be disposed near a front end of a roof of the vehicle toward the in-vehicle space.
The human-computer interaction device 100 further includes a storage device 40 communicatively connected to the control device 10, and the storage device 40 stores therein position information of each of the controlled devices 30, the position information being used by the control device 10 when determining the controlled device 30 toward which the user terminal 200 is directed or pointed.
For this purpose, the spatial environment may be modeled to form a three-dimensional model of the spatial environment including all the controlled devices 30, for example. In the case where the human-computer interaction device 100 is used for a vehicle, the environment inside the vehicle may be modeled at the time of shipment of the vehicle.
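A minimal way to represent such a model is a table of device identifiers with bounding boxes in a common cabin or room frame, for example as below; all identifiers and coordinates are invented for illustration.

```python
from dataclasses import dataclass
from typing import Dict, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class DeviceBox:
    """Axis-aligned bounding box of a controlled device in the cabin/room frame (metres)."""
    device_id: str
    box_min: Vec3
    box_max: Vec3

# Hypothetical content of the storage device 40: one entry per controlled device,
# obtained from a one-time model of the vehicle interior (values purely illustrative).
DEVICE_MODEL: Dict[str, DeviceBox] = {
    "driver_window": DeviceBox("driver_window", (-0.9, 0.4, 0.2), (-0.8, 1.2, 0.9)),
    "sunroof":       DeviceBox("sunroof",       (-0.4, 0.0, 1.1), ( 0.4, 0.9, 1.2)),
    "reading_lamp":  DeviceBox("reading_lamp",  (-0.1, 0.5, 1.0), ( 0.1, 0.7, 1.1)),
}
```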
The human-computer interaction device 100 further comprises a voice input device 50 communicatively connected to the control device 10 for receiving the user's voice input. In response to the user's voice command, the control device 10 generates a control instruction for operating the identified controlled device 30. Where the human-computer interaction device 100 is used in a vehicle, the voice input device 50 may be a microphone mounted in the vehicle.
Additionally or alternatively, the user terminal 200 may have a voice input device 250 of its own. In this case, the user terminal 200 may transmit the voice signal received by the voice input device 250 directly to the human-computer interaction device 100 without processing it, so that it is processed by the control device 10. This simplifies the structure of the user terminal 200 and reduces its weight.
According to an exemplary embodiment of the present invention, a noise reduction device is provided for performing noise reduction processing on the received speech signal so that the user's voice command can be recognized clearly. For example, the noise reduction device may include a microphone array mounted in the vehicle, comprising a plurality of microphones arranged at different positions of the vehicle. By means of these microphones, sound signals can be received from different locations in the vehicle for targeted noise reduction of the speech signals received by the voice input device 50 or 250. Additionally or alternatively, the noise reduction device may be integrated in the user terminal 200.
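One common technique for such targeted noise reduction with a microphone array is delay-and-sum beamforming steered toward the (known or estimated) position of the speaking user. The patent does not prescribe a specific algorithm, so the following sketch is only one possible choice, with illustrative parameter names.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s

def delay_and_sum(signals, mic_positions, source_position, fs):
    """Very small delay-and-sum beamformer.

    signals         -- array of shape (n_mics, n_samples), simultaneously recorded channels
    mic_positions   -- array of shape (n_mics, 3), microphone positions in metres
    source_position -- 3-vector, assumed position of the speaking user
    fs              -- sampling rate in Hz
    """
    signals = np.asarray(signals, dtype=float)
    dists = np.linalg.norm(np.asarray(mic_positions, dtype=float)
                           - np.asarray(source_position, dtype=float), axis=1)
    # Each channel hears the source (dist - min_dist)/c later than the nearest microphone.
    delays = (dists - dists.min()) / SPEED_OF_SOUND
    shifts = np.round(delays * fs).astype(int)
    n = signals.shape[1]
    aligned = np.zeros_like(signals)
    for i, s in enumerate(shifts):
        aligned[i, : n - s] = signals[i, s:]   # advance late channels so the speech lines up
    return aligned.mean(axis=0)                # speech adds constructively, diffuse noise averages out
```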
According to an exemplary embodiment of the present invention, the communication between the user terminal 200 and the control device 10, in particular the communication from the user terminal 200 to the control device 10, may be encrypted and/or secured by voiceprint. In this case, decryption and/or voiceprint verification may be performed in the control device 10 or in a remote server.
According to an exemplary embodiment of the present invention, a speech recognition apparatus for recognizing a speech input of a user is provided. The voice recognition device may be integrated with the control device 10 (for example, mounted in a vehicle), or may be provided in a remote server.
According to an exemplary embodiment of the present invention, a specific orientation of the user terminal 200 (e.g., the orientation of a specific component) may be defined as the calibration direction used for pointing at the controlled device 30 to be operated. Where the user terminal 200 is a headset, the direction pointing straight ahead of the user when the headset is worn may be set as the calibration direction. Where the user terminal 200 is a vehicle key, the orientation of the tip of the key may be set as the calibration direction. Where the user terminal 200 is a mobile phone, a tablet computer or a dedicated control terminal as explained in detail later, the orientation of the microphone of the device may be set as the calibration direction.
According to an exemplary embodiment of the present invention, the user terminal 200 may have a mark therein or thereon, and the mark is used for indicating the calibration direction to the user. The marker is for example an arrow.
According to an exemplary embodiment of the present invention, a device that intersects the calibration direction may be identified as the controlled device 30 toward which the user terminal 200 is oriented or pointed. When several controlled devices 30 intersect the calibration direction, the controlled device closest to the user terminal 200 along the calibration direction is identified as the controlled device 30 toward which the user terminal 200 is oriented or pointed.
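One straightforward way to implement this selection rule is to cast a ray from the terminal position along the calibration direction and keep the nearest bounding box it enters, e.g. using the device model sketched above. The following Python sketch (standard ray/box "slab" test) is illustrative only.

```python
import numpy as np

def pick_device(origin, direction, device_model):
    """Return the id of the device whose bounding box the calibration direction hits first,
    or None if no box is hit.

    origin, direction -- terminal position and unit pointing vector in the room frame
    device_model      -- mapping id -> DeviceBox, as in the sketch above
    """
    origin = np.asarray(origin, dtype=float)
    direction = np.asarray(direction, dtype=float)
    best_id, best_t = None, np.inf
    for dev in device_model.values():
        t_near, t_far, hit = 0.0, np.inf, True
        for k in range(3):
            if abs(direction[k]) < 1e-9:
                # Ray is parallel to this slab: must already be inside it.
                if not (dev.box_min[k] <= origin[k] <= dev.box_max[k]):
                    hit = False
                    break
            else:
                t1 = (dev.box_min[k] - origin[k]) / direction[k]
                t2 = (dev.box_max[k] - origin[k]) / direction[k]
                t_near = max(t_near, min(t1, t2))
                t_far = min(t_far, max(t1, t2))
        # Keep the nearest intersected device, matching the rule above.
        if hit and t_near <= t_far and t_near < best_t:
            best_id, best_t = dev.device_id, t_near
    return best_id
```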
In particular, when the user terminal 200 is a wearable device carried along with the user's head, a calibration procedure may be provided for calibrating the deviation between the person's line of sight and the actual calibration direction of the wearable device. The calibration procedure may be carried out as follows: the user wearing the user terminal 200 is prompted to look at different calibration positions in turn, and it is determined whether the calibration direction of the user terminal 200 intersects each calibration position while it is being looked at. If it does not, a compensation value is determined from the deviation; the determined compensation value can be stored and used subsequently to compensate the deviation of the orientation of the user terminal 200 from the line-of-sight direction. Where the human-computer interaction device 100 is used in a vehicle, the calibration procedure may, for example, be carried out by having a person sit in a fixed position (e.g., the driver's seat) wearing the user terminal 200 and look in turn at different points on the vehicle's display screen as prompted.
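If the per-position deviations are summarized as pairs of reported and true directions, a single compensating rotation can be estimated by solving the standard orthogonal Procrustes (Wahba) problem, for instance as below. This specific method is an assumption; the patent only requires that some compensation value be determined and stored.

```python
import numpy as np

def estimate_compensation(measured_dirs, true_dirs):
    """Estimate a fixed rotation R that maps the terminal's reported calibration
    direction onto the user's actual line of sight.

    measured_dirs, true_dirs -- arrays of shape (n, 3): for each prompted calibration
    position, the direction reported by the terminal and the known direction from the
    fixed seating position to that calibration point.
    """
    M = np.asarray(true_dirs, dtype=float).T @ np.asarray(measured_dirs, dtype=float)
    U, _, Vt = np.linalg.svd(M)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])  # guard against a reflection
    return U @ D @ Vt

# Applied later to every reported pointing vector before the device lookup:
# corrected_direction = R @ measured_direction
```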
According to an exemplary embodiment of the present invention, the user terminal 200 has a data communication device configured to be able to exchange data with a communication device of the human-computer interaction device 100 by means of a communication protocol (e.g. WIFI), for example to transmit position and/or orientation data detected by the sensor 220 and voice data received by the voice input device 250 to the human-computer interaction device 100.
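The concrete protocol is not specified beyond "e.g. WIFI". Purely as an illustration, sensor and voice data could be packaged and sent to the interaction device as follows; JSON-over-TCP, the host and port, and the message layout are all hypothetical choices.

```python
import json
import socket
import time

def send_pose_and_audio(host, port, position, direction, audio_bytes):
    """Send one pose sample and one chunk of recorded audio to the interaction device."""
    header = {
        "timestamp": time.time(),
        "position": list(position),
        "direction": list(direction),
        "audio_len": len(audio_bytes),
    }
    payload = json.dumps(header).encode("utf-8") + b"\n" + audio_bytes
    with socket.create_connection((host, port), timeout=2.0) as sock:
        sock.sendall(payload)
```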
According to an exemplary embodiment of the present invention, the user terminal 200 is configured to wake up the voice control function of the controlled device 30 upon detecting the occurrence of a triggering event (a minimal event-handling sketch follows the list below), the triggering event comprising at least one of:
- the wake-up switch (a physical or virtual switch) of the user terminal 200 is operated;
- the display of the user terminal 200 is tapped;
- a voice input from the user for waking up is detected.
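The event handling on the terminal can be as simple as setting a flag once any of the listed triggers fires, as in the following illustrative sketch; all names are assumptions.

```python
from enum import Enum, auto

class Trigger(Enum):
    WAKE_SWITCH_PRESSED = auto()   # physical or virtual wake-up switch operated
    DISPLAY_TAPPED = auto()        # tap on the terminal's display
    WAKE_WORD_DETECTED = auto()    # spoken wake-up phrase recognised

class VoiceControl:
    """Tiny state holder: data only flows to the interaction device once woken."""

    def __init__(self):
        self.awake = False

    def on_trigger(self, trigger: Trigger) -> None:
        if trigger in (Trigger.WAKE_SWITCH_PRESSED,
                       Trigger.DISPLAY_TAPPED,
                       Trigger.WAKE_WORD_DETECTED):
            self.awake = True

    def may_forward_data(self) -> bool:
        return self.awake
```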
Only when the voice control function of the controlled device 30 has been awakened are the voice command received by the user terminal 200 and the position and/or orientation data detected by the sensor 220 transferred to the human-computer interaction device 100 in order to operate the controlled device 30.
According to an exemplary embodiment of the invention, the user terminal 200 is configured to: when the user presses the switch of the user terminal 200, the voice input device 250 of the user terminal 200 receives a voice instruction of the user; when the user releases the switch, the received voice command is transmitted to the control device 10. The switch may be the same switch as the wake-up switch or a different switch.
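This press-to-record, release-to-send behaviour can be sketched as follows; the callback names and the transmit hook are illustrative assumptions.

```python
class PushToTalk:
    """Record while the switch is held; send the buffered command on release."""

    def __init__(self, transmit):
        self.transmit = transmit      # callable that forwards bytes to the control device 10
        self._recording = False
        self._buffer = bytearray()

    def on_switch_pressed(self):
        self._recording = True
        self._buffer.clear()

    def on_audio_chunk(self, chunk: bytes):
        if self._recording:
            self._buffer.extend(chunk)

    def on_switch_released(self):
        self._recording = False
        if self._buffer:
            self.transmit(bytes(self._buffer))
```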
According to an exemplary embodiment of the present invention, the user terminal 200 is configured to display the following voice control states by means of a display screen (e.g., a display screen of the user terminal, a display screen of the controlled device 30 or an in-vehicle display screen):
- awakened and awaiting input;
- input in progress;
- input command understood; and/or
- operation being carried out.
Fig. 2 shows a schematic block diagram of an exemplary user terminal 200. In this example, the user terminal 200 is configured as a dedicated control terminal.
The control terminal 200 includes a touch screen 210; a microphone hole 230 is arranged around the touch screen 210, and a microphone for receiving the user's voice input is positioned behind the microphone hole 230. The control terminal 200 is configured such that it can be awakened when the user taps the touch screen a preset number of times (e.g., twice in succession).
Further, the user terminal 200 comprises a sensor 220 for detecting the position and/or orientation of the user terminal 200, as described above, as well as a data communication device.
Additionally, user terminal 200 also carries a battery, which may or may not be readily removable, and which may or may not be rechargeable.
According to an exemplary embodiment of the present invention, the user terminal 200 is configured and dimensioned to be held by a user. In particular, the user terminal 200 has a size similar to a credit card.
According to an exemplary embodiment of the present invention, the user terminal 200 is designed to be temporarily fixed to a vehicle as an accessory device of the vehicle. For this purpose, the vehicle is provided with a fixing structure, such as a slot or a hook, for releasably fixing the user terminal 200. Correspondingly, the user terminal 200 may be provided with corresponding fixing structures cooperating with the fixing structures, such as protrusions cooperating with slots or slots cooperating with hooks.
Although some embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the invention. The appended claims and their equivalents are intended to cover all such modifications, substitutions and changes as fall within the true scope and spirit of the invention.

Claims (10)

1. A human-computer interaction device (100), comprising a control device (10) and at least one controlled device (30) which is communicatively connected to the control device (10) and can be controlled by the control device (10), wherein the control device (10) is configured to determine the position and orientation of a mobile user terminal (200) and, based on the determined position and orientation, to identify the controlled device (30) toward which the user terminal (200) is oriented or pointed as the object to be controlled by a user's voice command.
2. The human-computer interaction device (100) of claim 1,
the control device (10) is configured to respond to a user's voice command by sending a control signal to the controlled device (30) toward which, or at which, the user terminal (200) was oriented or pointed when the voice command was issued.
3. The human-computer interaction device (100) of claim 1 or 2,
the control device (10) utilizes position and/or orientation data of the user terminal (200) measured by sensors in the user terminal (200) when determining the position and orientation of the user terminal (200).
4. The human-computer interaction device (100) of any one of the preceding claims,
the human-computer interaction device (100) further comprises a storage device (40) communicatively connected to the control device (10), wherein position information of each controlled device (30) is stored in the storage device (40), and the control device (10) utilizes the position information when determining the controlled device (30) toward which the user terminal (200) is oriented or pointed.
5. The human-computer interaction device (100) of any one of the preceding claims,
the human-computer interaction device (100) further comprises a voice input device (50) which is in communication connection with the control device (10), and the voice input device is configured to receive voice instructions of a user for operating the controlled device (30).
6. The human-computer interaction device (100) of any one of the preceding claims,
the human-computer interaction device (100) further comprises a positioning device in communication connection with the control device (10), the positioning device being configured to be able to determine the position and/or distance of the user terminal (200) relative to the positioning device, wherein position information of the positioning device is pre-stored, and the position information and the measurement result of the positioning device are used to determine the position and orientation of the user terminal (200).
7. A mobile user terminal (200) capable of communicative connection with a human-computer interaction device (100) according to any of the preceding claims, the user terminal (200) comprising a sensor for detecting a position and/or orientation of the user terminal (200) and transmitting the detected position and/or orientation data to the human-computer interaction device (100), in particular the user terminal (200) further comprising voice input means (250) for receiving a voice input of a user.
8. The user terminal (200) of claim 7,
the user terminal (200) is configured as a device carried on the user's head or hand, or as a device that the user can easily hold.
9. The user terminal (200) of claim 7 or 8,
the user terminal (200) is configured to wake up voice control of the controlled device (30) upon detection of an occurrence of a triggering event, the triggering event comprising at least one of:
-a switch of the user terminal (200) is operated;
-the display of the user terminal (200) is tapped;
- a voice input from the user for waking up is detected.
10. The user terminal (200) of any of claims 7-9,
the user terminal (200) is configured to display the following voice control states by means of the display screen:
- awakened and awaiting input;
- input in progress;
- input command understood; and/or
- operation being carried out.
CN202011053863.7A 2020-09-29 2020-09-29 Human-computer interaction device and corresponding mobile user terminal Pending CN112201243A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011053863.7A CN112201243A (en) 2020-09-29 2020-09-29 Human-computer interaction device and corresponding mobile user terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011053863.7A CN112201243A (en) 2020-09-29 2020-09-29 Human-computer interaction device and corresponding mobile user terminal

Publications (1)

Publication Number Publication Date
CN112201243A true CN112201243A (en) 2021-01-08

Family

ID=74008064

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011053863.7A Pending CN112201243A (en) 2020-09-29 2020-09-29 Human-computer interaction device and corresponding mobile user terminal

Country Status (1)

Country Link
CN (1) CN112201243A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022188552A1 (en) * 2021-03-10 2022-09-15 Oppo广东移动通信有限公司 Device control method and related apparatus


Similar Documents

Publication Publication Date Title
CN105009202B (en) It is divided into two-part speech recognition
CN103890836B (en) The bluetooth with power management or other wave points for head mounted display
CN107801413B (en) Terminal for controlling electronic equipment and processing method thereof
KR102517610B1 (en) Electronic device
US7519537B2 (en) Method and apparatus for a verbo-manual gesture interface
KR102535720B1 (en) Electronic device
US9346471B2 (en) System and method for controlling a vehicle user interface based on gesture angle
CN114127665A (en) Multimodal user interface
US20110199292A1 (en) Wrist-Mounted Gesture Device
AU2015206668A1 (en) Smart necklace with stereo vision and onboard processing
WO2014144015A2 (en) Computing interface system
CN104521223A (en) Headset computer with handsfree emergency response
JP2001265457A (en) Device for manually operating unit of automobile and method for using the same
CN109238306A (en) Step counting data verification method, device, storage medium and terminal based on wearable device
WO2017193311A1 (en) Intelligent reminding method and apparatus for vehicle
CN112201243A (en) Human-computer interaction device and corresponding mobile user terminal
KR101763186B1 (en) Method and program for providing real-time traffic informaion
KR20100065074A (en) System for controlling robot based on motion recognition and method thereby
US10871837B2 (en) Wearable electronic devices having a rotatable input structure
KR100901482B1 (en) Remote control system and method by using virtual menu map
JP2002318684A (en) Information system for supplying functional element or explanatory information regarding control, etc., in automobile
CN115968461A (en) Terminal control system and method
KR20110061974A (en) Mobile terminal and operation control method thereof
US20090125640A1 (en) Ultrasmall portable computer apparatus and computing system using the same
CN110767285A (en) Reminding method and terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20210108