CN110990686A - Control device of voice equipment, voice interaction method and device and electronic equipment - Google Patents


Info

Publication number
CN110990686A
CN110990686A (application number CN201910990389.1A)
Authority
CN
China
Prior art keywords
voice
information
control device
control
controller
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910990389.1A
Other languages
Chinese (zh)
Other versions
CN110990686B (en)
Inventor
韩雪
王慧君
高磊
廖湖锋
王子
毛跃辉
梁博
陶梦春
王现林
林金煌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Gree Electric Appliances Inc of Zhuhai
Original Assignee
Gree Electric Appliances Inc of Zhuhai
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Gree Electric Appliances Inc of Zhuhai filed Critical Gree Electric Appliances Inc of Zhuhai
Priority to CN201910990389.1A priority Critical patent/CN110990686B/en
Publication of CN110990686A publication Critical patent/CN110990686A/en
Application granted granted Critical
Publication of CN110990686B publication Critical patent/CN110990686B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/95Retrieval from the web
    • G06F16/953Querying, e.g. by the use of web search engines
    • G06F16/9535Search customisation based on user profiles and personalisation
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10LSPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00Speech recognition
    • G10L15/22Procedures used during a speech recognition process, e.g. man-machine dialogue
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10LSPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00Speech recognition
    • G10L15/22Procedures used during a speech recognition process, e.g. man-machine dialogue
    • G10L2015/223Execution procedure of a spoken command

Landscapes

  • Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Acoustics & Sound (AREA)
  • Human Computer Interaction (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computational Linguistics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application relates to a control device for a voice device, a voice interaction method and apparatus, and an electronic device. The control device comprises an attitude sensor, a controller, and a communication module. The attitude sensor is communicatively connected with the controller and sends the acquired attitude information of the control device to the controller. The controller determines, according to the attitude information, the information type of a voice signal to be input, and sends a control instruction to the voice device, where the control instruction controls the voice device to search for information belonging to that information type. The communication module is associated with the voice device and transmits the control instruction sent by the controller to the voice device. By determining the interaction domain from the posture of the device before carrying out voice interaction with the user, the user's intention is identified and the desired response information is returned; this effectively improves interaction accuracy, is simple to implement, and improves the user experience.

Description

Control device of voice equipment, voice interaction method and device and electronic equipment
Technical Field
The present application relates to the field of home appliance control technologies, and in particular, to a control device for a voice device, a voice interaction method, a voice interaction device, and an electronic device.
Background
Voice products are now increasingly popular and increasingly full-featured, supporting functions such as playing songs, reading the news, telling stories, encyclopedia lookups, and checking the weather. Because a voice product has so many functions, the user often has to phrase a voice instruction very explicitly; otherwise the speech may be parsed ambiguously and an unwanted response returned. For example, when a user speaks the voice command "Loess Hillside" intending to hear the song of that name, the voice product may instead first return the encyclopedia definition of the term "loess hillside"; only when the instruction is changed to "play the song Loess Hillside" can the product correctly recognize the user's intention to listen to the song. The process of voice control is therefore tedious and cumbersome, and the user experience is poor.
Disclosure of Invention
In order to solve, or at least partially solve, the above technical problem, the present application provides a control device for a voice device, a voice interaction method and apparatus, and an electronic device, which determine the domain of interaction resources according to the posture of the device and interact with the user accordingly, improving the accuracy of interaction with the user.
In a first aspect, the present application provides a control apparatus for a speech device, including an attitude sensor, a controller, and a communication module;
the attitude sensor is in communication connection with the controller and is used for sending the acquired attitude information of the control device to the controller;
the controller is used for determining the information type of a voice signal to be input according to the attitude information and sending a control instruction to the voice equipment, wherein the control instruction is used for controlling the voice equipment to search the information belonging to the information type;
the communication module is associated with the voice device and is used for transmitting the control instruction sent by the controller to the voice device.
Further, the control device has a plurality of gestures, and a plurality of regions corresponding to the plurality of gestures, each of the plurality of regions being associated with one of the plurality of gestures.
Further, one of the plurality of areas is associated with one of the plurality of information types, wherein the information types associated with any two of the plurality of areas are different.
Further, the controller is connected to the mobile terminal, and the controller is further configured to receive control information sent by the mobile terminal, and generate the control instruction for controlling the voice device according to the control information;
or the controller directly receives a voice control signal sent by a user and converts the voice control signal into the control instruction for controlling the voice equipment.
Further, the attitude information is used for indicating a target area on the control device that faces a target direction, and the controller determines the information type corresponding to the target area as the information type of the voice signal to be input.
In a second aspect, the present application provides a voice interaction method, which includes:
acquiring attitude information of a control device;
determining the information type of the voice signal to be input according to the attitude information;
and sending a control instruction to a voice device, wherein the control instruction is used for controlling the voice device to search for the information belonging to the information type.
Further, before determining the information type of the voice signal to be input according to the posture information, the method further comprises: establishing an association between each of a plurality of gestures with which the control device has and one of the information types, wherein the information types associated with any two of the plurality of gestures are different;
establishing an association between each of a plurality of regions that the control device has and a gesture, wherein the gestures associated with any two of the plurality of regions are different;
Determining the information type of the voice signal to be input according to the attitude information comprises: determining that the information type associated with the target area is the information type of the current voice signal to be input, wherein the target area is the area on the control device that faces the target direction.
In another aspect, the present application provides a voice interaction apparatus, including:
the attitude acquisition module is used for acquiring attitude information of the control device;
the type matching module is used for determining the information type of the voice signal to be input according to the attitude information;
and the control module is used for sending a control instruction to the voice equipment, wherein the control instruction is used for controlling the voice equipment to search the information belonging to the information type.
In another aspect, the present application provides an electronic device including a memory, a processor, and a program stored on the memory and executable on the processor, wherein the processor implements the steps of the method when executing the program.
In another aspect, the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method described above.
Compared with the prior art, the technical scheme provided by the embodiment of the application has the following advantages:
according to the method provided by the embodiment of the application, the interaction field is determined through the posture of the equipment, and then voice interaction is carried out with the user, so that the intention of the user is known, the response information wanted by the user is given, the interaction accuracy is effectively improved, the implementation method is simple, and the user experience is improved.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious for those skilled in the art that other drawings can be obtained according to the drawings without inventive exercise.
Fig. 1 is a block diagram of a control apparatus of a speech device according to an embodiment of the present disclosure;
fig. 2 is a schematic structural diagram of a control device of a speech apparatus according to an embodiment of the present application;
fig. 3 is a schematic flow chart of a voice interaction method according to an embodiment of the present application;
fig. 4 is a schematic diagram of a voice interaction apparatus according to an embodiment of the present application;
fig. 5 is an internal structure diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Fig. 1 shows a control device of a voice device provided in an embodiment of the present application, including an attitude sensor 11, a controller 13, and a communication module 15;
the attitude sensor 11 is in communication connection with the controller and is used for sending the acquired attitude information of the control device to the controller.
Specifically, the attitude sensor 11 may be a three-axis gyroscope sensor that simultaneously measures the position, movement track, and movement speed of the control device in the six directions "up, down, left, right, front, back" (i.e., the six directions along the three axes of a three-dimensional coordinate system), measures the current attitude of the control device, determines the orientation of each part of the control device, and transmits the measured attitude data to the controller. The control device may take any shape; preferably, for ease of description and of operation by the user, it is a regular polyhedron or a sphere, for example a football-like sphere divided into several regions of equal area whose orientations are detected separately. In this embodiment, a cube is taken as the example.
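For illustration only, the orientation determination described above can be sketched as follows. The face labels A–F, the axis convention, and the use of an at-rest acceleration vector as the "up" reference are assumptions made for this sketch, not part of the disclosure:

```python
import math

# Assumed mapping from cube faces to outward normals in the device frame.
FACE_NORMALS = {
    "A": (0, 0, 1),   # +z
    "B": (0, 0, -1),  # -z
    "C": (0, 1, 0),   # +y
    "D": (0, -1, 0),  # -y
    "E": (1, 0, 0),   # +x
    "F": (-1, 0, 0),  # -x
}

def face_up(accel):
    """Return the face whose outward normal best aligns with 'up'.

    'accel' is the (x, y, z) acceleration in the device frame; for a device
    at rest it points opposite to gravity, i.e. toward the sky.
    """
    norm = math.sqrt(sum(a * a for a in accel)) or 1.0
    up = tuple(a / norm for a in accel)
    # Pick the face normal with the largest dot product against 'up'.
    return max(FACE_NORMALS,
               key=lambda f: sum(n * u for n, u in zip(FACE_NORMALS[f], up)))
```

A real three-axis gyroscope would additionally track rotation over time; the dot-product rule above only recovers the static face-up orientation.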
As shown in fig. 2, as the user rotates the control device, the attitude sensor 11 measures the attitude of the six faces A, B, C, D, E, and F of the cube in real time, determines the orientation of each face after rotation, and sends the attitude information to the controller, which may be a processor, a programmed single-chip microcomputer, or the like.
The user can preset the attitude information associated with each face of the control device, so that the attitude sensor 11 can accurately detect the attitude of each face, and one orientation can be configured as the target orientation: when the attitude sensor 11 detects that a face's orientation matches the configured target orientation, that face can be controlled, while the other faces are not connected with the mobile terminal. For example, if the target orientation is configured as "upward", the control device can be rotated so that the desired face faces upward, and the mobile terminal then connects to that face.
The controller 13 is configured to determine, according to the attitude information, the information type of the voice signal to be input, and to send a control instruction to the voice device, where the control instruction is used to control the voice device to search for information belonging to that information type.
Specifically, the controller 13 receives the attitude information of each face of the control device sent by the attitude sensor 11 and determines from it the information type within which the voice device to be controlled should search. The controller 13 is connected to the communication module 15, which includes a WiFi or Bluetooth unit; the unit is associated with each face of the control device and is connected and paired with the corresponding voice device. For example, if the control device is a cube, the attitude sensor 11 computes the current attitude of the control device, i.e., which face is currently upward. If face A is upward, the controller 13 activates the connection between face A and the WiFi or Bluetooth unit, and the user can then use the client application on a mobile terminal to connect to the WiFi or Bluetooth unit and configure face A. Each face of the control device has a display interface, and the configured content is shown on the display interface of the corresponding face.
Each face of the control device is configured through the communication module 15 as follows. The attitude sensor 11 computes the current attitude of the control device, i.e., which face is upward. If face A is upward, the user connects the client application of the mobile terminal to face A through the WiFi or Bluetooth unit in the control device; a list of information types to be configured is then displayed in the client interface of the mobile terminal, the required information type is selected from the list, and the controller 13 writes the information type selected by the user into the memory of face A, completing the configuration.
The information types in the list include "song", "story", "news", "ancient poetry", "encyclopedia", "map navigation", and so on. For example, if the user configures face A as "news", then whenever the user uses face A of the control device, searches are restricted to the "news" category.
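A minimal sketch of this per-face configuration store, with the update-on-reconfigure behavior of this embodiment; the class and method names are hypothetical, not from the patent:

```python
# Hypothetical per-face information-type store.
# Configuring a face again simply overwrites its previous type.
class FaceConfig:
    TYPES = {"song", "story", "news", "ancient poetry",
             "encyclopedia", "map navigation"}

    def __init__(self):
        self._types = {}  # face name -> configured information type

    def configure(self, face, info_type):
        if info_type not in self.TYPES:
            raise ValueError("unknown information type: " + info_type)
        self._types[face] = info_type

    def type_for(self, face):
        # Returns None for a face that has not been configured yet.
        return self._types.get(face)
```

In the described device the configured type would also be written to the face's memory and shown on its display interface; the dictionary here stands in for both.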
After the configuration of one face is completed, the control device is rotated and the attitude sensor 11 again acquires the current attitude of the control device; the now-upward face is connected with the mobile terminal through the WiFi or Bluetooth unit, and the mobile terminal configures that face and defines the information type it represents. The configuration of all six faces is completed in this way.
During configuration, if a face of the control device is configured repeatedly, its information type is updated to the latest configuration. For example, if face A is configured as "news" and the user now wants face A to be "song", the control device is rotated so that face A faces upward again, face A is reconfigured, and its information type is updated to "song". After each face is configured, the configured information type name is displayed on that face's display interface.
After configuration of the control device is completed, the user selects the required information type by rotating the control device so that the corresponding face is upward, and inputs a voice instruction to the control device. The controller 13 converts the received voice instruction into a control signal and sends the control signal together with the corresponding information type, through the communication module 15, to the voice device that has been connected in advance. The voice device searches for matching content within that category according to the information type and the control signal sent by the control device, and either broadcasts the result by voice or sends the required information to the mobile terminal. The control device can thus effectively pinpoint the user's intention and give an accurate response, greatly improving the user experience.
For example, suppose face A is configured as the "ancient poetry" type, face B as "map navigation", and face C as "encyclopedia". When the user sets face A upward and speaks the voice instruction "Huanghelou" (Yellow Crane Tower) to the control device, the control device, according to the voice instruction and the "ancient poetry" type of face A, controls the corresponding voice device to search the network for poems related to "Huanghelou" and to broadcast the title of each poem by voice for the user to choose from.
When the user sets face B upward and speaks "Huanghelou", the control device, according to the voice instruction and the "map navigation" type of face B, controls the corresponding voice device to search the network for navigation information from the current position to "Huanghelou", and to broadcast it by voice or send it to the mobile terminal.
When the user sets face C upward and speaks "Huanghelou", the control device, according to the voice instruction and the "encyclopedia" type of face C, controls the corresponding voice device to search the network for background knowledge about "Huanghelou", introduce the scenic spot, briefly recount its history and stories, and broadcast them by voice, so that the user gains a deeper understanding of "Huanghelou".
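The three "Huanghelou" examples share one mechanism: the same recognized text is paired with a different category depending on which face is up. A minimal sketch, with hypothetical function and parameter names:

```python
# Hypothetical dispatch step: route one spoken query by the information type
# of the face currently up, as in the "Huanghelou" examples above.
def handle_voice_command(text, face, face_types):
    info_type = face_types.get(face)
    if info_type is None:
        raise KeyError("face %r is not configured" % face)
    # The control instruction pairs the recognized text with the category,
    # so the voice device restricts its search to that category.
    return {"query": text, "category": info_type}
```

The dictionary returned here stands in for the control instruction that the communication module would transmit to the paired voice device.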
Besides configuring the information type of each face with the mobile client, the information type can also be configured by speaking a voice command directly to the control device. Specifically, a voice command of the form "configure as a certain type" may be input to the face currently oriented in the target direction to set the information type required for that face; the control device is then rotated to another face and voice-command configuration continues. The currently configured information type name is displayed on each face's display interface.
Determining the interaction domain through the posture of the device and then carrying out voice interaction with the user makes it possible to identify the user's intention and return the response information the user wants; this effectively improves interaction accuracy, is simple to implement, and improves the user experience.
As shown in fig. 3, an embodiment of the present application further discloses a voice interaction method, including:
s31, acquiring attitude information of the control device;
s32, determining the information type of the voice signal to be input according to the attitude information;
and S33, sending a control instruction to the voice equipment, wherein the control instruction is used for controlling the voice equipment to search the information belonging to the information type.
Specifically, the control device is first bound to the voice device, and each face of the control device is then configured with a different information type. The attitude sensor acquires the attitude information of the control device, determines the target orientation (upward, in this embodiment), and sends the acquired attitude information to the controller. When a voice control instruction is issued to the control device, the controller determines the information type corresponding to the upward face from the current attitude information and, according to that information type and the voice control instruction, controls the voice device to search for response resources within the corresponding information type; this avoids ambiguity and effectively improves the accuracy of interaction.
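Steps S31–S33 can be sketched end to end as follows; the largest-axis-component orientation rule, the face labels, and all names are assumptions for illustration, not the patent's implementation:

```python
# Hypothetical end-to-end sketch of S31-S33.
# Face labels assume A/B on +/-z, C/D on +/-y, E/F on +/-x.
FACE_BY_AXIS = {(2, 1): "A", (2, -1): "B", (1, 1): "C",
                (1, -1): "D", (0, 1): "E", (0, -1): "F"}

def voice_interaction_step(accel, face_types, spoken_text, sent):
    # S31: acquire attitude information -> which face currently points up.
    axis = max(range(3), key=lambda i: abs(accel[i]))
    face = FACE_BY_AXIS[(axis, 1 if accel[axis] > 0 else -1)]
    # S32: determine the information type of the voice signal to be input.
    info_type = face_types[face]
    # S33: send a control instruction restricting the search to that type.
    instruction = {"query": spoken_text, "category": info_type}
    sent.append(instruction)  # stand-in for transmitting to the voice device
    return instruction
```

For example, with face A configured as "news", the at-rest acceleration vector (0, 0, 9.8) selects face A and the spoken text is searched within the "news" category.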
As shown in fig. 4, an embodiment of the present application further discloses a voice interaction apparatus, including:
an attitude acquisition module 41, configured to acquire attitude information of the control device;
a type matching module 45, configured to determine the information type of the voice signal to be input according to the attitude information;
and a control module 43, configured to send a control instruction to the voice device, where the control instruction is used to control the voice device to search for information belonging to the information type.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, apparatus (device), or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (devices) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
Fig. 5 is an internal structure diagram of an electronic device according to an embodiment of the present application. As shown in fig. 5, the electronic device includes a processor, a memory, a network interface, an input device, and a display screen connected through a system bus. Wherein the memory includes a non-volatile storage medium and an internal memory. The non-volatile storage medium of the electronic device stores an operating system and may also store a computer program, which, when executed by the processor, causes the processor to implement the voice interaction method. The internal memory may also have stored therein a computer program that, when executed by the processor, causes the processor to perform the voice interaction method. The display screen of the electronic device can be a liquid crystal display screen or an electronic ink display screen, and the input device of the electronic device can be a touch layer covered on the display screen, a key, a track ball or a touch pad arranged on the shell of the electronic device, an external keyboard, a touch pad or a mouse, and the like.
It is noted that, in this document, relational terms such as "first" and "second," and the like, may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The foregoing are merely exemplary embodiments of the present invention, which enable those skilled in the art to understand or practice the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. The control device of the voice equipment is characterized by comprising an attitude sensor, a controller and a communication module;
the attitude sensor is in communication connection with the controller and is used for sending the acquired attitude information of the control device to the controller;
the controller is used for determining the information type of a voice signal to be input according to the attitude information and sending a control instruction to the voice equipment, wherein the control instruction is used for controlling the voice equipment to search the information belonging to the information type;
the communication module is associated with the voice device and is used for transmitting the control instruction sent by the controller to the voice device.
2. The control device according to claim 1,
the control device has a plurality of gestures, and also has a plurality of regions corresponding to the plurality of gestures, each of the plurality of regions being associated with one of the plurality of gestures.
3. The control apparatus of claim 2, wherein one of the plurality of regions is associated with one of a plurality of the information types, wherein the information types associated with any two of the plurality of regions are different.
4. The control device according to claim 2,
the controller is connected with the mobile terminal, and is further used for receiving control information sent by the mobile terminal and generating the control instruction for controlling the voice equipment according to the control information;
or the controller directly receives a voice control signal sent by a user and converts the voice control signal into the control instruction for controlling the voice equipment.
5. The control device according to claim 1, wherein the posture information indicates a target area on the control device in a direction toward a target, and the controller determines an information type corresponding to the target area as the information type of the voice signal to be input.
6. A method of voice interaction, comprising:
acquiring attitude information of a control device;
determining the information type of the voice signal to be input according to the attitude information;
and sending a control instruction to a voice device, wherein the control instruction is used for controlling the voice device to search for the information belonging to the information type.
7. The method of claim 6,
before determining the information type of the voice signal to be input according to the posture information, the method further comprises: establishing an association between each of a plurality of postures of the control device and one of the information types, wherein the information types associated with any two of the plurality of postures are different; and
establishing an association between each of a plurality of regions of the control device and one of the postures, wherein the postures associated with any two of the plurality of regions are different;
determining the information type of the voice signal to be input according to the posture information comprises: determining that the information type associated with a target region is the information type of the voice signal currently to be input, wherein the target region is the region of the control device that faces a target direction in the current posture.
8. A voice interaction apparatus, comprising:
a posture acquisition module, configured to acquire posture information of a control device;
a type matching module, configured to determine, according to the posture information, an information type of a voice signal to be input;
and a control module, configured to send a control instruction to a voice device, wherein the control instruction is used for controlling the voice device to search for information belonging to the information type.
9. An electronic device, comprising a memory, a processor, and a program stored in the memory and executable on the processor, wherein the processor, when executing the program, implements the steps of the method of claim 6 or 7.
10. A computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the method of claim 6 or 7.
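The mapping chain described in claims 2 through 7 (posture → region facing the target direction → information type → search instruction) can be sketched as a small illustrative program. All region names, information types, and function names below are hypothetical; the patent does not prescribe any concrete implementation:

```python
# Illustrative sketch of the posture-driven lookup in claims 6-7.
# Region names and information types are invented examples only.

# Claims 2-3: each region of the control device is bound to a distinct
# information type (any two regions map to different types).
REGION_TO_INFO_TYPE = {
    "top": "music",
    "front": "weather",
    "side": "news",
}

def target_region(posture_info: dict) -> str:
    """Claim 5: the posture information indicates which region of the
    control device currently faces the target direction."""
    return posture_info["facing_region"]

def build_control_instruction(posture_info: dict) -> dict:
    """Claim 6: determine the information type of the voice signal to be
    input from the posture, then produce a control instruction telling
    the voice device to search for information of that type."""
    info_type = REGION_TO_INFO_TYPE[target_region(posture_info)]
    return {"action": "search", "info_type": info_type}

print(build_control_instruction({"facing_region": "top"}))
# {'action': 'search', 'info_type': 'music'}
```

In this sketch the dictionary plays the role of the pre-established associations of claim 7; a real controller would derive `facing_region` from an orientation sensor rather than receive it directly.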
CN201910990389.1A 2019-10-17 2019-10-17 Control device of voice equipment, voice interaction method and device and electronic equipment Active CN110990686B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910990389.1A CN110990686B (en) 2019-10-17 2019-10-17 Control device of voice equipment, voice interaction method and device and electronic equipment

Publications (2)

Publication Number Publication Date
CN110990686A true CN110990686A (en) 2020-04-10
CN110990686B CN110990686B (en) 2021-04-20

Family

ID=70082110

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910990389.1A Active CN110990686B (en) 2019-10-17 2019-10-17 Control device of voice equipment, voice interaction method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN110990686B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1895510A1 (en) * 2006-08-29 2008-03-05 Aisin Aw Co., Ltd. Voice recognition method and voice recognition apparatus
CN103529762A (en) * 2013-02-22 2014-01-22 Tcl集团股份有限公司 Intelligent household control method and system based on sensor technology
CN105425648A (en) * 2016-01-11 2016-03-23 北京光年无限科技有限公司 Portable robot and data processing method and system thereof
CN106647398A (en) * 2016-12-23 2017-05-10 广东美的制冷设备有限公司 Remote controller, operation control method and device
CN106940591A (en) * 2016-01-04 2017-07-11 百度在线网络技术(北京)有限公司 View display methods, device and the wearable device of wearable device
CN109255064A (en) * 2018-08-30 2019-01-22 Oppo广东移动通信有限公司 Information search method, device, intelligent glasses and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant