CN113495622A - Interactive mode switching method and device, electronic equipment and storage medium - Google Patents

Interactive mode switching method and device, electronic equipment and storage medium

Info

Publication number: CN113495622A
Authority: CN (China)
Prior art keywords: interaction, switching, touch, voice, screen
Legal status: Pending (assumed status; not a legal conclusion)
Application number: CN202010261346.2A
Other languages: Chinese (zh)
Inventors: 王莎莎, 张刚, 罗咏曦, 王峰磊, 李明伟, 卢家广
Current Assignee: Baidu Online Network Technology Beijing Co Ltd; Shanghai Xiaodu Technology Co Ltd
Original Assignee: Baidu Online Network Technology Beijing Co Ltd; Shanghai Xiaodu Technology Co Ltd
Priority date: 2020-04-03
Filing date: 2020-04-03
Publication date: 2021-10-12
Application filed by Baidu Online Network Technology Beijing Co Ltd and Shanghai Xiaodu Technology Co Ltd
Priority to CN202010261346.2A
Publication of CN113495622A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04847: Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F 3/0485: Scrolling or panning
    • G10: MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L: SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L 15/00: Speech recognition
    • G10L 15/22: Procedures used during a speech recognition process, e.g. man-machine dialogue
    • G10L 2015/223: Execution procedure of a spoken command
    • G10L 2015/225: Feedback of the input speech

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computational Linguistics (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Acoustics & Sound (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses an interaction mode switching method and apparatus, an electronic device and a storage medium, relating to the technical field of intelligent interaction. The switching method is implemented as follows: when a voice interaction device with a screen is in a multi-item interaction mode, timing is started; the multi-item interaction mode is a mode that supports both displaying voice interaction information and touch interaction. When a predetermined time is reached, the voice interaction device with a screen is switched to a touch interaction mode. With this scheme, the interaction mode can be switched automatically. In video scenes in particular, the poor experience caused by continuously displaying voice interaction information to the user can be avoided, so the user's overall interaction satisfaction can be improved.

Description

Interactive mode switching method and device, electronic equipment and storage medium
Technical Field
The application relates to the technical field of computers, in particular to the technical field of intelligent interaction.
Background
Existing intelligent voice interaction devices with screens cannot perform other operations while carrying out voice interaction with a user; that is, such a device supports only a single interaction mode at any given time. In addition, in a full-duplex, wake-word-free working mode, switching from one single interaction mode to another may cause accidental operations, resulting in a poor user experience.
Disclosure of Invention
Embodiments of the present application provide a method and an apparatus for switching an interaction mode, an electronic device, and a storage medium, so as to solve one or more technical problems in the prior art.
In a first aspect, the present application provides a method for switching an interaction mode, including:
starting timing when a voice interaction device with a screen is in a multi-item interaction mode, where the multi-item interaction mode is a mode that supports both displaying voice interaction information and touch interaction; and
switching the voice interaction device with a screen to a touch interaction mode when a predetermined time is reached.
With this scheme, the interaction mode can be switched automatically. In video scenes in particular, the poor experience caused by continuously displaying voice interaction information to the user can be avoided, so the user's overall interaction satisfaction can be improved.
In a second aspect, the present application provides an apparatus for switching an interaction mode, including:
a timing module, configured to start timing when a voice interaction device with a screen is in a multi-item interaction mode, where the multi-item interaction mode is a mode that supports both displaying voice interaction information and touch interaction; and
a first switching module, configured to switch the voice interaction device with a screen to a touch interaction mode when a predetermined time is reached.
In a third aspect, an embodiment of the present application provides an electronic device, including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to cause the at least one processor to perform a method provided by any one of the embodiments of the present application.
In a fourth aspect, embodiments of the present application provide a non-transitory computer-readable storage medium storing computer instructions for causing a computer to perform a method provided by any one of the embodiments of the present application.
Other effects of the above-described alternative will be described below with reference to specific embodiments.
Drawings
The drawings are included to provide a better understanding of the present solution and are not intended to limit the present application. Wherein:
fig. 1 is a flowchart of a switching method of an interactive mode according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a plurality of interaction patterns according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a touch interaction mode according to another embodiment of the present application;
fig. 4 is a flowchart of a switching method of an interactive mode according to another embodiment of the present application;
FIG. 5 is a schematic diagram of an apparatus for switching interaction modes according to an embodiment of the present application;
fig. 6 is a block diagram of an electronic device for implementing the method for switching the interaction mode according to the embodiment of the present application.
Detailed Description
The following description of exemplary embodiments of the present application, taken in conjunction with the accompanying drawings, includes various details of those embodiments to aid understanding; these details are to be considered exemplary only. Those of ordinary skill in the art will therefore recognize that various changes and modifications can be made to the embodiments described herein without departing from the scope and spirit of the present application. Descriptions of well-known functions and constructions are likewise omitted in the following for clarity and conciseness.
As shown in fig. 1, in an embodiment, a method for switching an interaction mode is provided, which includes the following steps:
S101: when the voice interaction device with a screen is in the multi-item interaction mode, start timing; the multi-item interaction mode is a mode that supports both displaying voice interaction information and touch interaction.
S102: when the predetermined time is reached, switch the voice interaction device with a screen to the touch interaction mode.
In the embodiments of the present application, the touch interaction mode includes receiving and responding to touch instructions. That is, the screen of the voice interaction device can receive a tap or slide control instruction from the user, and responding to the instruction means performing the corresponding operation. For example, an application can be opened or closed according to the user's tap instruction, and the playback progress or volume can be adjusted according to the user's slide instruction.
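To make this dispatch concrete, a minimal Python sketch is given below. It is an illustration only, not part of the patent disclosure: the TouchCommand fields, the target names, and the mapping to operations are assumptions.

```python
# Minimal sketch of dispatching touch instructions in the touch interaction mode.
# All names (TouchCommand, handle_touch) and actions are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class TouchCommand:
    kind: str           # "tap" or "slide"
    target: str         # e.g. "app_icon", "progress_bar", "volume_bar"
    delta: float = 0.0  # normalized slide amount, e.g. +0.1 = 10% forward/up

def handle_touch(cmd: TouchCommand) -> str:
    """Map a touch instruction to the corresponding device operation."""
    if cmd.kind == "tap" and cmd.target == "app_icon":
        return "toggle application (open/close)"
    if cmd.kind == "slide" and cmd.target == "progress_bar":
        return f"seek playback by {cmd.delta:+.0%}"
    if cmd.kind == "slide" and cmd.target == "volume_bar":
        return f"change volume by {cmd.delta:+.0%}"
    return "no-op"

print(handle_touch(TouchCommand("slide", "volume_bar", 0.10)))  # change volume by +10%
```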
The multi-item interaction mode may include receiving and executing touch instructions while displaying voice interaction information. In the multi-item interaction mode, the display area of the device may be divided into a first screen area and a second screen area.
As shown in fig. 2, the multi-item interaction mode may be applied to video scenes such as video conferences, movie playback, or games. In a video scene, the first screen area may serve as the display area for voice interaction information, for example displaying the user's spoken query "what is the weather today". Alternatively, feedback on the user's question or voice control instruction may be displayed. For example, if the user asks "what is the weather today", the query result, such as "the temperature in XX district of XX city is XX ℃ today", may be shown in the voice interaction information display area (the first screen area). As another example, if the user's voice control command is "play the next episode", the message "switched to episode XX for you" may be displayed in that area.
The second screen area may be used to receive the user's tap or slide operations. For example, while the user's voice information, or feedback on the user's question or voice control instruction, is displayed in the first screen area, the second screen area may at the same time receive a slide instruction from the user and, by parsing that instruction, perform operations such as adjusting the playback progress or the volume.
In the multi-item interaction mode, the device can switch to the touch interaction mode automatically based on whether a predetermined time has elapsed. For example, a preset duration is configured, and once it is exceeded, the voice interaction device with a screen switches from the multi-item interaction mode to the touch interaction mode.
In a video scene, considering that the user prefers an undisturbed viewing experience, the duration of the multi-item interaction mode (e.g., 1 minute) may be preset. During this duration, the user's voice information, or feedback on the user's question or control instruction, may be received and displayed. Once the duration is exceeded, the multi-item interaction mode automatically switches to the touch interaction mode shown in fig. 3. As mentioned above, in the touch interaction mode the entire screen area may receive and respond to the user's tap or slide operations.
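The timed switch of steps S101/S102 can be sketched as a small state machine. The following Python sketch is illustrative only; the class names (InteractionMode, ModeSwitcher), the monotonic-clock timing, and the 60-second default (taken from the 1-minute example above) are assumptions rather than the patent's implementation.

```python
# Minimal sketch of the timed switch from the multi-item interaction mode to the
# touch interaction mode (steps S101/S102). Names are illustrative assumptions.
import time
from enum import Enum, auto

class InteractionMode(Enum):
    MULTI_ITEM = auto()   # voice interaction information display + touch
    TOUCH_ONLY = auto()   # full-screen touch interaction

class ModeSwitcher:
    def __init__(self, duration_s: float = 60.0):
        self.duration_s = duration_s
        self.mode = InteractionMode.TOUCH_ONLY
        self._t0 = None

    def enter_multi_item(self) -> None:
        """S101: entering the multi-item interaction mode starts the timer."""
        self.mode = InteractionMode.MULTI_ITEM
        self._t0 = time.monotonic()

    def tick(self) -> InteractionMode:
        """S102: once the predetermined time has elapsed, fall back to touch-only."""
        if (self.mode is InteractionMode.MULTI_ITEM
                and time.monotonic() - self._t0 >= self.duration_s):
            self.mode = InteractionMode.TOUCH_ONLY
        return self.mode

switcher = ModeSwitcher(duration_s=60.0)   # 1-minute duration from the example above
switcher.enter_multi_item()                # e.g. the user speaks during video playback
switcher.tick()                            # call periodically; switches back after 60 s
```

In practice, tick() would be driven by the device's event loop or a timer callback rather than called manually.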
With this scheme, the interaction mode can be switched automatically. In video scenes in particular, the poor experience caused by continuously displaying voice interaction information to the user can be avoided, so the user's overall interaction satisfaction can be improved.
As shown in fig. 4, in an embodiment, when the voice interaction device with a screen is in the touch interaction mode, the method further includes:
S401: determining a valid switching instruction from the received switching instructions.
S402: switching the voice interaction device with a screen to the multi-item interaction mode according to the valid switching instruction.
The switching instruction may include a line of sight switching instruction, a voice switching instruction, a gesture switching instruction, a touch switching instruction, and the like.
A line of sight switching instruction may be considered received when it is detected that the user's gaze has been focused on the voice interaction device with a screen for more than a certain length of time.
The voice switching instruction may include a specific wake-up word, such as "Xiaodu, Xiaodu", or a specific command such as "turn the volume up" or "play the next song". When the wake-up word or such a specific command is received, it is confirmed that a voice switching instruction has been received.
The gesture switching instruction may include a preset specific gesture. When that specific gesture made by the user is captured, it is confirmed that a gesture switching instruction has been received.
The touch switching instruction may include detecting that the user has tapped the interaction mode switch on the screen, which confirms that a touch switching instruction has been received.
In a video scene, the user's gaze is very likely to remain locked on the display screen of the voice interaction device. For this reason, in the touch interaction mode, switching back to the multi-item interaction mode is performed according to the determined valid switching instruction.
When any of the above switching instructions is received, the valid switching instructions must be determined. For example, in a video scene, since the user's line of sight may be focused on the voice interaction device for a long time, the voice switching instruction, gesture switching instruction, and touch switching instruction may be determined to be valid switching instructions, while the line of sight switching instruction is treated as invalid.
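A minimal sketch of this screening step, assuming the instructions are tagged with simple string labels and that the device knows whether a video scene is active (both assumptions, not the patent's API):

```python
# Minimal sketch of determining the valid switching instructions while the
# device is in the touch interaction mode. Labels are illustrative assumptions.
VALID_IN_VIDEO_SCENE = {"voice", "gesture", "touch"}   # "line_of_sight" excluded

def valid_switching_instructions(received, video_scene: bool) -> list:
    """Filter the received switching instructions down to the valid ones."""
    if not video_scene:
        return list(received)
    return [kind for kind in received if kind in VALID_IN_VIDEO_SCENE]

# A lingering gaze alone does not switch the mode during video playback:
print(valid_switching_instructions(["line_of_sight"], video_scene=True))            # []
print(valid_switching_instructions(["line_of_sight", "voice"], video_scene=True))   # ['voice']
```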
With this scheme, the different switching instructions are screened to determine the appropriate ones. Accidental operations caused by line of sight switching instructions can be avoided, which improves switching accuracy and the user's interaction experience.
In one embodiment, the switching instruction includes a line of sight switching instruction, a voice switching instruction, a gesture switching instruction, and a touch switching instruction.
With this scheme, users can conveniently trigger switching using whichever type of instruction they are accustomed to.
In one embodiment, the predetermined time is 90 seconds.
For example, in a video scene in the multi-item interaction mode, the user may be given 90 seconds of voice interaction time. Within those 90 seconds, control can be performed according to the captured voice instructions and the control results displayed. For example, when voice commands such as "turn the volume up" or "dim the brightness" are captured, the volume or brightness is adjusted accordingly. Meanwhile, the user's voice control instruction and its result can be displayed in the first screen area, for example "the volume has been turned up for you" or "the brightness has been adjusted for you".
Once 90 seconds have elapsed, the device may switch back to the touch interaction mode automatically. Voice interaction information is no longer displayed on the screen, allowing the user to watch the video immersively.
It will be appreciated that the predetermined time may be adjusted according to the actual needs of the user.
Setting this time period leaves the user with a relatively sufficient amount of voice interaction time: in the initial stage, the user can conveniently control the video scene through voice instructions, and after the predetermined time, the user can become immersed in watching the video.
In one embodiment, in the multi-item interaction mode, the display interface of the voice interaction device with a screen includes:
an interaction mode switch, a voice interaction information display area, and a touch instruction receiving area.
As mentioned above, the multi-item interaction mode supports both displaying voice interaction information and touch interaction. Accordingly, in the multi-item interaction mode, the display interface of the voice interaction device with a screen includes: a voice interaction information display area for displaying voice interaction information; a touch instruction receiving area for receiving touch instructions; and an interaction mode switch for receiving the first and second switching commands.
The interactive information display area corresponds to the first screen area, and the touch instruction receiving area corresponds to the second screen area.
In addition, the appearance of the interaction mode switch may change as the interaction mode is switched. Taking fig. 2 and fig. 3 as examples, the icon in the lower left corner of each figure is the interaction mode switch. As shown in fig. 2, in the multi-item interaction mode the switch may be presented in a first form; as shown in fig. 3, in the touch interaction mode it may be presented in a second form.
The voice interaction information display area and the touch instruction receiving area may partially overlap. For example, the voice interaction information display area may be laid over the touch instruction receiving area as an overlay layer. The covered portion of the touch instruction receiving area is set not to respond to received touch instructions, while the remaining portion continues to respond. Not responding means that no corresponding operation is performed after a touch instruction from the user is received.
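A minimal sketch of this overlay behavior is given below, assuming a fixed 1280x800 screen with the voice interaction information display area rendered as a bottom strip; the rectangle layout and class names are illustrative assumptions, not the patent's implementation.

```python
# Minimal sketch of the partially overlapping layout: the voice interaction
# information display area covers part of the touch instruction receiving area,
# and touches landing under the overlay are not acted upon.
from dataclasses import dataclass

@dataclass
class Rect:
    x: int
    y: int
    w: int
    h: int

    def contains(self, px: int, py: int) -> bool:
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

class MultiItemScreen:
    def __init__(self):
        self.touch_area = Rect(0, 0, 1280, 800)       # whole screen receives touches
        self.voice_overlay = Rect(0, 600, 1280, 200)  # bottom strip shows voice info
        self.overlay_visible = True                   # False in the touch interaction mode

    def on_touch(self, px: int, py: int) -> str:
        if self.overlay_visible and self.voice_overlay.contains(px, py):
            return "ignored (covered by the voice interaction information overlay)"
        if self.touch_area.contains(px, py):
            return "dispatched to the touch handler"
        return "outside the screen"

screen = MultiItemScreen()
print(screen.on_touch(640, 700))   # ignored: the point lies under the overlay
screen.overlay_visible = False     # closing the overlay restores full-screen response
print(screen.on_touch(640, 700))   # dispatched to the touch handler
```

Setting overlay_visible to False here corresponds to closing the voice interaction information display area, after which the whole screen responds to touch again.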
With this scheme, voice interaction information can be displayed while touch instructions from the user are still received, meeting the user's interaction needs and improving the interaction experience.
In one embodiment, in the touch interaction mode, the display interface of the voice interaction device with a screen includes:
an interaction mode switch and a touch instruction receiving area.
In the touch interaction mode, the voice interaction information display area can be closed. Closing it may include canceling the text display attribute of the original voice interaction information display area and adjusting that area to respond to received touch instructions.
With this scheme, in the touch interaction mode, the voice interaction device with a screen can respond to received touch instructions across the full screen.
As shown in fig. 5, in one embodiment, there is provided an interactive mode switching apparatus, including the following components:
the timing module 501 is configured to start timing when the voice interaction device with screen is in a multi-item interaction mode; the multiple interaction modes are modes supporting voice interaction information display and touch interaction.
The first switching module 502 is configured to switch the voice interaction device with the screen to the touch interaction mode when a predetermined time is reached.
In one embodiment, the apparatus further includes:
a second switching module, configured to determine a valid switching instruction from the received switching instructions when the voice interaction device with a screen is in the touch interaction mode,
and to switch the voice interaction device with a screen to the multi-item interaction mode according to the valid switching instruction.
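For illustration only, the claimed modules could map onto code roughly as follows; the class names, the string-valued mode field, and the set of valid instructions are assumptions, and the logic mirrors the method sketched earlier.

```python
# Minimal structural sketch of the apparatus of fig. 5: a timing module, a first
# switching module (multi-item -> touch after the predetermined time) and a second
# switching module (touch -> multi-item on a valid switching instruction).
import time

class TimingModule:                       # cf. the timing module 501
    def __init__(self, duration_s: float = 90.0):
        self.duration_s = duration_s
        self._t0 = None

    def start(self) -> None:
        self._t0 = time.monotonic()

    def expired(self) -> bool:
        return self._t0 is not None and time.monotonic() - self._t0 >= self.duration_s

class Device:
    """Stands in for the voice interaction device with a screen."""
    def __init__(self):
        self.mode = "touch"               # "touch" or "multi_item"

class FirstSwitchingModule:               # cf. the first switching module 502
    def maybe_switch(self, device: Device, timer: TimingModule) -> None:
        if device.mode == "multi_item" and timer.expired():
            device.mode = "touch"         # predetermined time reached

class SecondSwitchingModule:              # switches back on a valid instruction
    VALID = {"voice", "gesture", "touch"}

    def maybe_switch(self, device: Device, instruction: str, timer: TimingModule) -> None:
        if device.mode == "touch" and instruction in self.VALID:
            device.mode = "multi_item"
            timer.start()                 # restart the timer for the new mode

timer, dev = TimingModule(90.0), Device()
SecondSwitchingModule().maybe_switch(dev, "voice", timer)   # touch -> multi_item
FirstSwitchingModule().maybe_switch(dev, timer)             # multi_item -> touch after 90 s
```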
In one embodiment, the switching instruction includes a line of sight switching instruction, a voice switching instruction, a gesture switching instruction, and a touch switching instruction.
In one embodiment, the predetermined time is 90 seconds.
In one embodiment, in the multi-item interaction mode, the display interface of the voice interaction device with a screen includes:
an interaction mode switch, a voice interaction information display area, and a touch instruction receiving area.
In one embodiment, in the touch interaction mode, the display interface of the voice interaction device with a screen includes:
an interaction mode switch and a touch instruction receiving area.
The functions of each module in each apparatus in the embodiment of the present application may refer to corresponding descriptions in the above method, and are not described herein again.
According to an embodiment of the present application, an electronic device and a readable storage medium are also provided.
Fig. 6 is a block diagram of an electronic device according to an embodiment of the present application. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital processing, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be examples only, and are not meant to limit implementations of the present application that are described and/or claimed herein.
As shown in fig. 6, the electronic apparatus includes: one or more processors 610, memory 620, and interfaces for connecting the various components, including a high-speed interface and a low-speed interface. The various components are interconnected using different buses and may be mounted on a common motherboard or in other manners as desired. The processor may process instructions for execution within the electronic device, including instructions stored in or on the memory to display graphical information of a GUI on an external input/output apparatus (such as a display device coupled to the interface). In other embodiments, multiple processors and/or multiple buses may be used, along with multiple memories, as desired. Also, multiple electronic devices may be connected, with each device providing portions of the necessary operations (e.g., as a server array, a group of blade servers, or a multi-processor system). One processor 610 is illustrated in fig. 6.
Memory 620 is a non-transitory computer readable storage medium as provided herein. The memory stores instructions executable by at least one processor, so that the at least one processor executes the method for switching the interaction mode provided by the application. The non-transitory computer-readable storage medium of the present application stores computer instructions for causing a computer to perform the switching method of the interaction mode provided by the present application.
The memory 620, as a non-transitory computer readable storage medium, may be used to store non-transitory software programs, non-transitory computer executable programs, and modules, such as program instructions/modules (e.g., the timing module 501 and the first switching module 502 shown in fig. 5) corresponding to the switching method of the interaction mode in the embodiment of the present application. The processor 610 executes various functional applications of the server and data processing by running non-transitory software programs, instructions, and modules stored in the memory 620, that is, implements the switching method of the interaction mode in the above-described method embodiment.
The memory 620 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to use of the electronic device of the switching method of the interactive mode, and the like. Further, the memory 620 may include high speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, the memory 620 may optionally include a memory remotely located from the processor 610, and these remote memories may be connected to the electronic device of the switching method of the interactive mode through a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The electronic device of the method for switching the interaction mode may further include: an input device 630 and an output device 640. The processor 610, the memory 620, the input device 630, and the output device 640 may be connected by a bus or other means, such as the bus connection in fig. 6.
The input device 630 may receive input numeric or character information and generate key signal inputs related to user settings and function control of the electronic apparatus for the switching method of the interaction mode, such as an input device of a touch screen, a keypad, a mouse, a track pad, a touch pad, a pointing stick, one or more mouse buttons, a track ball, a joystick, etc. The output device 640 may include a display device, an auxiliary lighting device (e.g., an LED), a haptic feedback device (e.g., a vibration motor), and the like. The display device may include, but is not limited to, a Liquid Crystal Display (LCD), a Light Emitting Diode (LED) display, and a plasma display. In some implementations, the display device can be a touch screen.
Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, application specific ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implemented in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
These computer programs (also known as programs, software applications, or code) include machine instructions for a programmable processor, and may be implemented using high-level procedural and/or object-oriented programming languages, and/or assembly/machine languages. As used herein, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus, and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present application may be executed in parallel, sequentially, or in different orders, as long as the desired results of the technical solutions disclosed in the present application can be achieved, and the present invention is not limited herein.
The above-described embodiments should not be construed as limiting the scope of the present application. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (13)

1. A method for switching an interaction mode, comprising:
starting timing when a voice interaction device with a screen is in a multi-item interaction mode, wherein the multi-item interaction mode is a mode that supports both displaying voice interaction information and touch interaction; and
switching the voice interaction device with a screen to a touch interaction mode when a predetermined time is reached.
2. The method of claim 1, wherein, when the voice interaction device with a screen is in the touch interaction mode, the method further comprises:
determining a valid switching instruction from received switching instructions; and
switching the voice interaction device with a screen to the multi-item interaction mode according to the valid switching instruction.
3. The method of claim 2, wherein the switching instructions comprise a line of sight switching instruction, a voice switching instruction, a gesture switching instruction, and a touch switching instruction.
4. The method of claim 1, wherein the predetermined time is 90 seconds.
5. The method of claim 1, wherein, in the multi-item interaction mode, a display interface of the voice interaction device with a screen comprises:
an interaction mode switch, a voice interaction information display area, and a touch instruction receiving area.
6. The method of claim 1, wherein, in the touch interaction mode, a display interface of the voice interaction device with a screen comprises:
an interaction mode switch and a touch instruction receiving area.
7. An apparatus for switching an interaction mode, comprising:
a timing module, configured to start timing when a voice interaction device with a screen is in a multi-item interaction mode, wherein the multi-item interaction mode is a mode that supports both displaying voice interaction information and touch interaction; and
a first switching module, configured to switch the voice interaction device with a screen to a touch interaction mode when a predetermined time is reached.
8. The apparatus of claim 7, further comprising:
a second switching module, configured to determine a valid switching instruction from received switching instructions when the voice interaction device with a screen is in the touch interaction mode,
and to switch the voice interaction device with a screen to the multi-item interaction mode according to the valid switching instruction.
9. The apparatus of claim 8, wherein the switching instructions comprise a line of sight switching instruction, a voice switching instruction, a gesture switching instruction, and a touch switching instruction.
10. The apparatus of claim 7, wherein, in the multi-item interaction mode, a display interface of the voice interaction device with a screen comprises:
an interaction mode switch, a voice interaction information display area, and a touch instruction receiving area.
11. The apparatus of claim 7, wherein, in the touch interaction mode, a display interface of the voice interaction device with a screen comprises:
an interaction mode switch and a touch instruction receiving area.
12. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1 to 6.
13. A non-transitory computer readable storage medium storing computer instructions for causing a computer to perform the method of any one of claims 1 to 6.
CN202010261346.2A, filed 2020-04-03 (priority 2020-04-03): Interactive mode switching method and device, electronic equipment and storage medium. Status: Pending. Published as CN113495622A.

Priority Applications (1)

Application Number: CN202010261346.2A (published as CN113495622A); Priority Date: 2020-04-03; Filing Date: 2020-04-03; Title: Interactive mode switching method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number: CN202010261346.2A (published as CN113495622A); Priority Date: 2020-04-03; Filing Date: 2020-04-03; Title: Interactive mode switching method and device, electronic equipment and storage medium

Publications (1)

Publication Number: CN113495622A; Publication Date: 2021-10-12

Family

ID=77995374

Family Applications (1)

Application Number: CN202010261346.2A (CN113495622A, pending); Priority Date: 2020-04-03; Filing Date: 2020-04-03; Title: Interactive mode switching method and device, electronic equipment and storage medium

Country Status (1)

CN: CN113495622A

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102792764A (en) * 2010-02-10 2012-11-21 惠普发展公司,有限责任合伙企业 Mobile device having plurality of input modes
US20170193992A1 (en) * 2015-12-30 2017-07-06 Le Holdings (Beijing) Co., Ltd. Voice control method and apparatus
CN107682536A (en) * 2017-09-25 2018-02-09 努比亚技术有限公司 A kind of sound control method, terminal and computer-readable recording medium
CN108572764A (en) * 2018-03-13 2018-09-25 努比亚技术有限公司 A kind of word input control method, equipment and computer readable storage medium
CN108804010A (en) * 2018-05-31 2018-11-13 北京小米移动软件有限公司 Terminal control method, device and computer readable storage medium
CN109299249A (en) * 2018-09-18 2019-02-01 广州神马移动信息科技有限公司 Ask-Answer Community exchange method, device, terminal device and computer storage medium
CN109712621A (en) * 2018-12-27 2019-05-03 维沃移动通信有限公司 A kind of interactive voice control method and terminal
CN110083068A (en) * 2019-03-12 2019-08-02 上海绿联软件股份有限公司 Household electrical appliance exchange method and household electrical appliance based on switch door operation
CN110321006A (en) * 2019-06-20 2019-10-11 佛吉亚好帮手电子科技有限公司 Vehicle system intelligent interactive method and vehicle system

Similar Documents

Publication Publication Date Title
CN112533041A (en) Video playing method and device, electronic equipment and readable storage medium
CN104915115A (en) Application program switching method and device for terminal
CN110992112B (en) Advertisement information processing method and device
CN110620844B (en) Program starting method, device, equipment and storage medium
CN112148160B (en) Floating window display method and device, electronic equipment and computer readable storage medium
US11175823B2 (en) Method and apparatus for controlling terminal device using gesture control function, and non-transitory computer-readable storage medium
CN111405377A (en) Video playing method and device, electronic equipment and storage medium
CN111586459B (en) Method and device for controlling video playing, electronic equipment and storage medium
US20210097993A1 (en) Speech recognition control method and apparatus, electronic device and readable storage medium
CN110913277A (en) Video playing method and device, electronic equipment and storage medium
CN112581946A (en) Voice control method and device, electronic equipment and readable storage medium
CN108710512A (en) Preloading method, apparatus, storage medium and the intelligent terminal of application program
CN112055261A (en) Subtitle display method and device, electronic equipment and storage medium
JP7051800B2 (en) Voice control methods, voice control devices, electronic devices, and readable storage media
CN112905134A (en) Method and device for refreshing display and electronic equipment
CN110933227A (en) Assistance method, device, equipment and medium for intelligent terminal
CN112000272B (en) Keyboard panel layout adjusting method and device, electronic equipment and storage medium
CN112584280B (en) Control method, device, equipment and medium for intelligent equipment
CN112162800A (en) Page display method and device, electronic equipment and computer readable storage medium
CN111638787A (en) Method and device for displaying information
CN113495620A (en) Interactive mode switching method and device, electronic equipment and storage medium
CN112578962A (en) Information flow display method, device, equipment and medium
CN113495621A (en) Interactive mode switching method and device, electronic equipment and storage medium
CN113495622A (en) Interactive mode switching method and device, electronic equipment and storage medium
CN110674338A (en) Voice skill recommendation method, device, equipment and storage medium

Legal Events

Code Title
PB01 Publication
SE01 Entry into force of request for substantive examination