WO2015133022A1 - Information processing apparatus, information processing method, and program - Google Patents
Information processing apparatus, information processing method, and program
- Publication number
- WO2015133022A1 (PCT/JP2014/081430)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- response
- request
- information processing
- function
- unit
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04Q—SELECTING
- H04Q9/00—Arrangements in telecontrol or telemetry systems for selectively calling a substation from a main station, in which substation desired apparatus is selected for applying a control signal thereto or for obtaining measured values therefrom
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F13/00—Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
- G06F13/38—Information transfer, e.g. on bus
- G06F13/382—Information transfer, e.g. on bus using universal interface adapter
- G06F13/385—Information transfer, e.g. on bus using universal interface adapter for adaptation of a particular data processing system to different peripheral devices
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/28—Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
- H04L12/2803—Home automation networks
- H04L12/2805—Home Audio Video Interoperability [HAVI] networks
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/28—Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
- H04L12/2803—Home automation networks
- H04L12/2816—Controlling appliance services of a home automation network by calling their functionalities
- H04L12/282—Controlling appliance services of a home automation network by calling their functionalities based on user interaction within the home
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/4104—Peripherals receiving signals from specially adapted client devices
- H04N21/4122—Peripherals receiving signals from specially adapted client devices additional display device, e.g. video projector
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/4104—Peripherals receiving signals from specially adapted client devices
- H04N21/4126—The peripheral being portable, e.g. PDAs or mobile phones
- H04N21/41265—The peripheral being portable, e.g. PDAs or mobile phones having a remote control device for bidirectional communication between the remote control device and client device
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/4104—Peripherals receiving signals from specially adapted client devices
- H04N21/4131—Peripherals receiving signals from specially adapted client devices home appliance, e.g. lighting, air conditioning system, metering devices
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/42203—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS] sound input device, e.g. microphone
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/42204—User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
- H04N21/42206—User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor characterized by hardware details
- H04N21/42222—Additional components integrated in the remote control device, e.g. timer, speaker, sensors for detecting position, direction or movement of the remote control, microphone or battery charging device
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/439—Processing of audio elementary streams
- H04N21/4396—Processing of audio elementary streams by muting the audio signal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/28—Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
- H04L12/2803—Home automation networks
- H04L2012/2847—Home automation networks characterised by the type of home appliance used
- H04L2012/2849—Audio/video appliances
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/28—Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
- H04L12/2803—Home automation networks
- H04L2012/2847—Home automation networks characterised by the type of home appliance used
- H04L2012/285—Generic home appliances, e.g. refrigerators
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04Q—SELECTING
- H04Q2209/00—Arrangements in telecontrol or telemetry systems
- H04Q2209/10—Arrangements in telecontrol or telemetry systems using a centralized architecture
Definitions
- This disclosure relates to an information processing apparatus, an information processing method, and a program.
- An information processing apparatus is provided that includes: a request acquisition unit that acquires a request to a system; a response determination unit that determines a response to the request; a device selection unit that selects, from devices included in the system, a device to provide the response based on at least an attribute of the response; and a device control unit that executes control for causing the selected device to provide the response.
- Also provided is an information processing method that includes: acquiring a request to a system; determining a response to the request; selecting, from devices included in the system, a device to provide the response based on at least an attribute of the response; and executing control for causing the selected device to provide the response.
- Also provided is a program that causes a computer to realize: a function of acquiring a request to a system; a function of determining a response to the request; a function of selecting, from devices included in the system, a device to provide the response based on at least an attribute of the response; and a function of executing control for causing the selected device to provide the response.
- According to the present disclosure described above, by automatically selecting the device that provides a response to a request, a user can use a device network with a natural and simple operation.
- FIG. 3 is a block diagram illustrating a configuration example of a server according to an embodiment of the present disclosure.
- FIG. 3 is a block diagram illustrating a configuration example of an agent function according to an embodiment of the present disclosure.
- FIG. 5 is a diagram illustrating an example of a device information DB according to an embodiment of the present disclosure.
- A sequence diagram illustrating a first example of a specific usage pattern according to an embodiment of the present disclosure.
- A sequence diagram illustrating a second example of a specific usage pattern according to an embodiment of the present disclosure.
- FIG. 3 is a block diagram illustrating a hardware configuration example of an information processing apparatus according to an embodiment of the present disclosure.
- FIG. 1 is a diagram illustrating a configuration example of a system according to an embodiment of the present disclosure.
- a system 10 includes an electronic device 100 and a network 200 to which the electronic device 100 is connected.
- the system 10 may further include a server 300 connected to the network 200.
- the electronic device 100 is a device used by a user.
- the system 10 can include a plurality of electronic devices 100.
- Examples of the electronic device 100 are not limited to these devices, and may include any device that can be connected to the network 200, such as a media player, a printer, a game machine, an air conditioner, or a refrigerator.
- the electronic devices 100 may be disposed in the home (living room, bedroom, study, etc.), such as the television 100a to the personal computer 100j, or may be carried by the user and taken outside, like the smartphone 100m.
- the network 200 is a wireless and / or wired network that connects the electronic devices 100 to each other.
- the network 200 includes, for example, a LAN (Local Area Network) to which each device arranged in the home is connected.
- the network 200 can include a device such as a smartphone 100m that is taken outside, the Internet to which the server 300 is connected, a mobile phone network, and the like.
- the server 300 provides a service to the electronic device 100 via the network 200.
- the server 300 is realized by an information processing device connected to the network 200, for example.
- the function of the server 300 to be described later may be realized by a single information processing apparatus or may be realized by cooperation of a plurality of information processing apparatuses connected via a wired or wireless network.
- a function of automatically selecting a device that provides a response to a user request acquired by any one of the electronic devices 100 is realized.
- the electronic device 100 on the network 200 is controlled by an agent function realized by either the server 300 or the electronic device 100, for example.
- the agent function extracts a user request from a user instruction input (for example, voice input) acquired by the television 100a, and determines a response to the request.
- the agent function needs to select which electronic device 100 provides the response.
- When the response is provided to a user different from the user who made the request, there are even more options for the device that provides the response. For example, when a user who has gone out uses the smartphone 100m to ask the system to tell a family member at home to take in the laundry, which of the devices in the home (the television 100a to the personal computer 100j) is appropriate for providing the information differs depending on where that family member is.
- the agent function automatically selects a device suitable for providing a response based on the response attribute determined for the request. This eliminates the need for the user to specify a device when inputting a request to the agent function. Further, even when the user does not know a device suitable for providing a response, the user can leave the device selection to the agent function.
- FIG. 2 is a block diagram illustrating a configuration example of an electronic device according to an embodiment of the present disclosure.
- electronic device 100 may include an image / audio output unit 110, an image / audio input unit 120, an operation unit 130, a control unit 140, a communication unit 150, and a storage unit 160.
- the illustrated configuration is simplified for the description of the present embodiment, and the electronic device 100 may further include components not illustrated. However, components not shown in the figure may already be known as general components of each device, and thus detailed description thereof is omitted here.
- the image / sound output unit 110 can be realized by, for example, a display that outputs an image and a speaker that outputs sound.
- the display is, for example, an LCD (Liquid Crystal Display) or an organic EL (Electro-Luminescence) display, and electronically displays various images under the control of the control unit 140.
- an image includes a still image and a moving image.
- the speaker outputs various sounds according to the control of the control unit 140.
- the image / audio output unit 110 may output only one of the image and the audio, or the image / audio output unit 110 may not be provided.
- the image / sound input unit 120 can be realized by, for example, a camera that acquires an image and a microphone that acquires sound.
- the camera generates image data by electronically imaging a real space using an imaging element such as a CMOS (Complementary Metal Oxide Semiconductor).
- the microphone records voice such as a user's utterance to generate voice data.
- the generated image data and / or audio data is provided to the control unit 140.
- the image / sound input unit 120 may acquire only one of the image and the sound. Alternatively, the image / sound input unit 120 may not be provided.
- the operation unit 130 can be realized by, for example, a touch panel, a keyboard, a mouse, a keypad, or a button for acquiring a user operation. Information indicating the user operation acquired by the operation unit 130 is provided to the control unit 140.
- a user instruction input can be acquired via the operation unit 130, or can be acquired as voice or a gesture via the image/audio input unit 120. Therefore, when the electronic device 100 mainly acquires the user's instruction input via the operation unit 130, the image/sound input unit 120 can be omitted; conversely, when the electronic device 100 mainly acquires the user's instruction input via the image/sound input unit 120, the operation unit 130 may not be provided.
- the control unit 140 can be realized by a processor such as a CPU (Central Processing Unit) and / or a DSP (Digital Signal Processor) operating according to a program stored in the storage unit 160, for example.
- the control unit 140 controls the operation of each unit of the electronic device 100.
- the control unit 140 controls the image / sound output unit 110 to output an image and / or sound received via the communication unit 150 or read from the storage unit 160.
- the control unit 140 controls the image/audio input unit 120 to acquire image data and/or audio data, processes the acquired data as necessary, and transmits the data via the communication unit 150 or stores it in the storage unit 160.
- the control unit 140 can execute these controls in accordance with, for example, a user instruction input acquired via the operation unit 130 or the image / audio input unit 120.
- the communication unit 150 is a communication interface corresponding to a wireless or wired communication method that configures the network 200.
- the communication unit 150 can include, for example, a communication circuit and an antenna or a port.
- the control unit 140 exchanges various types of information with the other electronic devices 100 on the network 200 or the server 300 via the communication unit 150.
- the storage unit 160 can be realized by, for example, a semiconductor memory or a hard disk.
- the storage unit 160 stores various data used in the electronic device 100 or generated by the electronic device 100.
- the storage unit 160 has a temporary storage area, and temporarily stores programs being executed by the control unit 140, data acquired by the image/sound input unit 120, data received by the communication unit 150, and the like.
- the storage unit 160 has a permanent storage area.
- the permanent storage area may store a program to be executed by the control unit 140, various setting data, local content data to be output from the image/audio output unit 110, and data acquired by the image/audio input unit 120 that the user has instructed, via the operation unit 130, to be saved.
- FIG. 3 is a block diagram illustrating a configuration example of a server according to an embodiment of the present disclosure.
- the server 300 may include a control unit 310, a communication unit 320, and a storage unit 330. Note that the illustrated configuration is simplified for the description of the present embodiment, and the server 300 may further include components not illustrated. However, components not shown in the figure may already be known as general components of the server, and thus detailed description thereof is omitted here. Further, as described above, the server 300 may be realized by a single information processing apparatus or may be realized by cooperation of a plurality of information processing apparatuses. Therefore, the illustrated components can also be realized by being distributed to a plurality of information processing apparatuses.
- the control unit 310 can be realized by a processor such as a CPU and / or a DSP operating according to a program stored in the storage unit 330, for example.
- the control unit 310 controls the operation of each unit of the server 300.
- the control unit 310 transmits information to the electronic device 100 on the network 200 via the communication unit 320 while referring to setting information stored in the storage unit 330 as necessary. This information may include a command for causing the electronic device 100 to execute a predetermined operation.
- the control unit 310 transmits information that can include a command to another electronic device 100 based on the result of processing the information received from the electronic device 100 via the communication unit 320.
- the control unit 310 may update the setting information stored in the storage unit 330 based on the result of processing the information received from the electronic device 100 via the communication unit 320.
- the communication unit 320 is a communication interface corresponding to a wired or wireless communication method that configures the network 200.
- the communication unit 320 can include, for example, a communication circuit and a port or an antenna.
- the control unit 310 exchanges various types of information with the electronic device 100 on the network 200 via the communication unit 320.
- the storage unit 330 can be realized by, for example, a semiconductor memory or a hard disk.
- the storage unit 330 stores various data used by the server 300 or generated by the server 300.
- the storage unit 330 has a temporary storage area, and temporarily stores programs being executed by the control unit 310, data received from the electronic device 100 by the communication unit 320, data generated by the control unit 310, and the like.
- the storage unit 330 has a permanent storage area, and can store programs to be executed by the control unit 310, various setting data, and the like.
- FIG. 4 is a block diagram illustrating a configuration example of the agent function according to an embodiment of the present disclosure.
- the agent function includes a request acquisition unit 510, a response determination unit 520, a device selection unit 550, and a device control unit 580.
- the agent function may further include a user position information acquisition unit 540 and a device state monitoring unit 590.
- the agent function may refer to the command DB 530, the user preference DB 560, and the device information DB 570.
- These components are realized by, for example, the control unit 310 and the storage unit 330 of the server 300 described above. Alternatively, the components may be realized by the control unit 140 and the storage unit 160 of any of the electronic devices 100, in which case the system 10 may not include the server 300. That is, the agent function can be realized by either the server 300 or the electronic device 100: for example, the agent function may be realized by the server 300 while the server 300 can be reached via the network 200, and otherwise may be realized by an electronic device 100 instead.
- each component will be further described.
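As an illustrative sketch only, the flow through the units described below (request acquisition unit 510, response determination unit 520, device selection unit 550, device control unit 580) might be expressed as follows. All class names, method names, and the simple dictionary "databases" here are hypothetical and are not part of the disclosure.

```python
# Hypothetical sketch of the agent pipeline: request acquisition ->
# response determination -> device selection -> device control.

class AgentFunction:
    def __init__(self, command_db, device_info_db):
        self.command_db = command_db          # maps request text -> response
        self.device_info_db = device_info_db  # device id -> device attributes

    def acquire_request(self, raw_input):
        """Request acquisition unit 510: accept a user request."""
        return raw_input.strip().lower()

    def determine_response(self, request):
        """Response determination unit 520: look up a response for the request."""
        return self.command_db.get(request)

    def select_device(self, response):
        """Device selection unit 550: pick a device that supports the
        function required by the response (attribute-based selection)."""
        for dev_id, info in self.device_info_db.items():
            if response["function"] in info["functions"]:
                return dev_id
        return None

    def control_device(self, dev_id, response):
        """Device control unit 580: issue a control command (here, a dict)."""
        return {"device": dev_id, "command": response["function"]}

    def handle(self, raw_input):
        request = self.acquire_request(raw_input)
        response = self.determine_response(request)
        if response is None:
            return None
        dev_id = self.select_device(response)
        return self.control_device(dev_id, response)
```

In a real system each step would involve speech/image recognition, richer databases, and network communication; the sketch only shows how the four units hand data to one another.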
- the request acquisition unit 510 acquires a request from the user for the system 10.
- a request from the user is input in any of the electronic devices 100.
- a request from the user is input as audio data using a microphone included in the image / audio input unit 120 of the electronic device 100.
- the user request may include speech.
- the request from the user may be input as image data using a camera included in the image / audio input unit 120 of the electronic device 100.
- the user request may include a gesture image.
- a request from the user may be input via the operation unit 130 of the electronic device 100.
- When the agent function is realized by the server 300, or by an electronic device 100 different from the electronic device 100 to which the request is input, the request acquisition unit 510 receives the request via the network 200.
- the response determination unit 520 determines a response to the request acquired by the request acquisition unit 510. For example, when the request is acquired as voice data including uttered voice, the response determining unit 520 executes voice recognition processing and extracts the content of the uttered voice as, for example, text. Furthermore, the response determination unit 520 refers to the command DB 530 based on the extracted text, and determines a command indicated by the user's speech and a response to the command. For example, when the request is acquired as image data including a gesture image, the response determination unit 520 executes image recognition processing to extract the content of the gesture. Furthermore, the response determination unit 520 refers to the command DB 530 based on the extracted gesture content, and determines a command indicated by the user gesture and a response to the command.
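The lookup against the command DB 530 might be sketched as below. This is a deliberately naive illustration: the command patterns, response fields, and substring matching are hypothetical stand-ins for the speech/image recognition and command-matching the text describes.

```python
# Hypothetical sketch of the response determination step: text extracted
# by speech recognition is matched against a command DB that maps
# command patterns to responses.

COMMAND_DB = {
    "turn on the light": {"command": "light_on", "function": "lighting"},
    "play some music":   {"command": "play_music", "function": "audio_output"},
}

def determine_response(recognized_text):
    text = recognized_text.strip().lower()
    for pattern, response in COMMAND_DB.items():
        if pattern in text:  # naive substring match on the extracted text
            return response
    return None  # no matching command found
```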
- the response determined by the response determination unit 520 can include any function that can be realized by using the electronic device 100 alone or in combination.
- the response may be an audio output using the electronic device 100 including the speaker in the image / audio output unit 110.
- the output voice may be music content, for example, or may be a voice message that provides some information.
- the response may be an image output using the electronic device 100 including the display in the image / sound output unit 110.
- the output image may be, for example, image content or a notification image that provides some information.
- the response may be that the electronic device 100 having a specific function, for example, a lighting device, a printer, an air conditioner, a refrigerator, or the like performs on / off or adjustment of each function.
- Techniques in which an agent function itself provides a response to a request from a user are already known, and examples of such requests and responses can also be applied in the present embodiment.
- the response determined by the response determination unit 520 may be provided for any of the users who use the system 10. For example, when the response is an output of sound or provision of an image, the response determination unit 520 determines to which user the sound or image is provided. The response determination unit 520 may extract information indicating the target user from the content such as text and gesture extracted from the request, for example. Alternatively, the response determination unit 520 may specify the user (specified by the user position information acquisition unit 540) closest to the electronic device 100 to which the request is input as a target for providing a response.
- the user position information acquisition unit 540 acquires user position information related to the response determined by the response determination unit 520.
- the user position information acquisition unit 540 may acquire position information of a target user to whom a response is provided.
- the user position information acquisition unit 540 may acquire position information of the user who has input the request (may be the same as the target user to whom the response is provided).
- the user's position information is acquired using, for example, a GPS (Global Positioning System) receiver of the smartphone 100m.
- the position information of the user may be acquired based on analysis of an image or sound acquired by the image/sound input unit 120 of the electronic device 100, a login state to a service using the electronic device 100, or the like.
- the position information can be provided in association with each electronic device 100.
- the device selection unit 550 selects a device that provides a response from a group of devices that can be used by the user based on at least the attribute of the response.
- the device group that can be used by the user is, for example, the electronic devices 100 included in the system 10. Alternatively, the device group that can be used by the user may be limited to those electronic devices 100 to which the user is granted access authority.
- the device selection unit 550 may acquire information related to the access authority of the electronic device 100 from the device information DB 570.
- For example, the attribute of the response includes a target to whom the response is provided. In this case, the device selection unit 550 may select the device that provides the response based on the position of the target and the positions of the devices included in the system 10.
- the target user who provides the response can be specified by the response determination unit 520 as described above.
- the user position information can be acquired by the user position information acquisition unit 540.
- the position information of the devices included in the system 10 can be acquired from the device information DB 570, for example. More specifically, the device selection unit 550 may select, as the device that provides the response, an electronic device 100 that is located close to the target and is capable of providing the response determined by the response determination unit 520.
- the device selection unit 550 may reselect the device that provides the response when at least one of the target to whom the response is provided, the position of the target, or the position of an electronic device 100 changes. That is, the device selection unit 550 may dynamically change the electronic device 100 that provides the response when, for example, the target is changed by a request newly acquired by the request acquisition unit 510, or the target user or an electronic device 100 moves while the response is being provided.
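Position-based selection of this kind can be sketched as picking, among the devices capable of the required function, the one closest to the target user; reselection is then simply rerunning the same selection with updated positions. The device dictionary and its fields are hypothetical.

```python
import math

# Hypothetical sketch of position-based device selection: among devices
# that can provide the required function, pick the one closest to the
# target user's position.

def select_device(devices, required_function, user_pos):
    """devices: dict of id -> {"pos": (x, y), "functions": [...]}."""
    candidates = [
        (math.dist(info["pos"], user_pos), dev_id)
        for dev_id, info in devices.items()
        if required_function in info["functions"]
    ]
    if not candidates:
        return None  # no device can provide the function
    return min(candidates)[1]  # nearest capable device
```

Dynamic reselection, as described above, amounts to calling `select_device` again whenever the target, the target's position, or a device's position changes.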
- As another example, the attribute of the response includes a function realized in the response. In this case, the device selection unit 550 may select the device that provides the response based on the performance of each electronic device 100 with respect to the function realized in the response.
- the performance of the electronic device 100 related to the function can be acquired from the device information DB 570, for example. More specifically, the device selection unit 550 may select the electronic device 100 having the highest performance related to the function realized in the response as the device that provides the response.
- For example, when the function to be realized is provision of an image or sound, the device selection unit 550 may combine this with selection based on position information: from among the electronic devices 100 within a predetermined range based on the position of the target user (for example, in the same room), the electronic device 100 having the highest performance for providing images or sound is selected.
- On the other hand, when the function to be realized does not necessarily need to be executed near the user, such as information retrieval or image processing, the device selection unit 550 may simply select the electronic device 100 having the highest performance.
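The two cases just described (location-bound functions filtered by range before ranking on performance, versus location-independent functions ranked on performance alone) can be sketched as one selector. The per-function performance scores and the fixed radius are hypothetical illustrations.

```python
import math

# Hypothetical sketch combining performance-based and position-based
# selection: for location-bound functions (image/sound output), filter
# to devices within a radius of the user, then take the best performer;
# for location-independent functions, take the best performer overall.

def select_by_performance(devices, function, user_pos=None, radius=5.0):
    """devices: id -> {"pos": (x, y), "perf": {function: score, ...}}."""
    candidates = {dev_id: info for dev_id, info in devices.items()
                  if function in info["perf"]}
    if user_pos is not None:  # location-bound function: same-room filter
        candidates = {dev_id: info for dev_id, info in candidates.items()
                      if math.dist(info["pos"], user_pos) <= radius}
    if not candidates:
        return None
    return max(candidates, key=lambda d: candidates[d]["perf"][function])
```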
- the device selection unit 550 may select a plurality of devices as the devices that provide the response. For example, the device selection unit 550 may cause a plurality of electronic devices 100 to provide the function realized in the response simultaneously in parallel. Alternatively, the device selection unit 550 may divide the function realized in the response among a plurality of electronic devices 100. More specifically, when an image and audio reproduction function is realized in the response, the device selection unit 550 may assign the image reproduction function and the audio reproduction function to different electronic devices 100. For example, in the example of the system 10 shown in FIG. 1, the television 100a may reproduce the image while the speaker 100c reproduces the sound. In this case, the television 100a may be controlled to mute its own sound while reproducing the image.
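Dividing an image-and-audio response in this way can be sketched as assigning each sub-function to the best-suited device and muting the display when a separate speaker takes the audio, mirroring the television/speaker example above. The score fields and command dictionaries are hypothetical.

```python
# Hypothetical sketch of dividing one response across multiple devices:
# the image goes to the best display, the sound to the best speaker, and
# the display device is muted so audio plays only from the speaker.

def plan_av_playback(devices):
    """devices: id -> {"video": score or None, "audio": score or None}."""
    video_dev = max((d for d in devices if devices[d]["video"] is not None),
                    key=lambda d: devices[d]["video"])
    audio_dev = max((d for d in devices if devices[d]["audio"] is not None),
                    key=lambda d: devices[d]["audio"])
    commands = [{"device": video_dev, "action": "play_video"},
                {"device": audio_dev, "action": "play_audio"}]
    if video_dev != audio_dev and devices[video_dev]["audio"] is not None:
        # mute the display device so sound comes only from the speaker
        commands.append({"device": video_dev, "action": "mute"})
    return commands
```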
- the device selection unit 550 may select a device that provides a response based on the preference of the user to whom the response is to be provided.
- the target user who provides the response can be specified by the response determination unit 520 as described above.
- user preferences can be acquired from the user preference DB 560.
- the user preference DB 560 is generated based on, for example, the usage history of the electronic device 100 by each user who uses the system 10 or information explicitly input by each user. More specifically, the device selection unit 550 may select the electronic device 100 that best matches the preference of the user to whom the response is provided as the device that provides the response.
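Preference-based selection might be sketched as follows, with the user preference DB holding per-context scores for device types (built, as the text notes, from usage history or explicit input). The users, contexts, and scores shown are hypothetical examples.

```python
# Hypothetical sketch of preference-based device selection: the user
# preference DB maps user -> context -> device-type score, and the
# selector picks the candidate device whose type scores highest.

USER_PREF_DB = {
    "alice": {"web_browsing": {"tablet": 0.9, "television": 0.4}},
}

def select_by_preference(user, context, candidates):
    """candidates: dict of device id -> device type."""
    prefs = USER_PREF_DB.get(user, {}).get(context, {})
    if not prefs:
        return None  # no preference recorded; fall back to other criteria
    return max(candidates, key=lambda d: prefs.get(candidates[d], 0.0))
```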
- The device selection unit 550 may not always be able to select a device that provides the response; for example, no selectable device may be found as a result of applying conditions such as position, function, and/or preference. In such a case, the device selection unit 550 may cause the electronic device 100, via the device control unit 580, to output a notification that no device was selected, a request for additional information for selecting a device, or the like.
- the device control unit 580 executes control for causing the device selected by the device selection unit 550 to provide the response determined by the response determination unit 520. More specifically, the device control unit 580 transmits a control command to the electronic device 100 via the network 200. In addition, when the device selection unit 550 cannot select a device that provides the response, the device control unit 580 may control the electronic device 100 to output a message notifying that no device was selected or requesting additional information for selecting a device; in this case, the message may be output from the electronic device 100 to which the request was input. Alternatively, the device control unit 580 may control the electronic device 100 to output a message notifying that a response has been provided in response to the request; in this case, when the electronic device 100 that provides the response is different from the electronic device 100 to which the request was input, the message may be output from the electronic device 100 to which the request was input.
- the device state monitoring unit 590 monitors the electronic devices 100 included in the system 10 and updates the device information DB 570 as necessary. For example, the device state monitoring unit 590 updates the device information DB 570 when an electronic device 100 moves, when the provision of a function by an electronic device 100 starts or ends, or when the settings of an electronic device 100 are changed.
- FIG. 5 is a diagram illustrating an example of a device information DB according to an embodiment of the present disclosure.
- the device information DB 570 includes items of ID 570a, device type 570b, location 570c, location details 570d, owner 570e, function 570f, and function details 570g. Note that the content of the device information DB 570 in the illustrated example does not necessarily match the example of the system 10 illustrated in FIG. 1, for example. Hereinafter, each item will be further described.
- the device ID 570a is used to identify each piece of device information.
- the device type 570b indicates the type of the electronic device 100. For example, when the device selection unit 550 selects a device based on the user's preference, the device type 570b can be used for the selection. More specifically, when the user preference DB 560 indicates a user preference such as "prefers the tablet over the television when browsing the web", the tablet rather than the television may be selected based on the device type 570b.
- the location 570c indicates a schematic location of the electronic device 100.
- for example, the name of a room, such as living room, study, or bedroom, is recorded as the location 570c.
- the place 570c is recorded based on setting information input by the user when the electronic device 100 is installed, for example.
- the location details 570d indicate the detailed location of the electronic device 100.
- the coordinates of three axes of X, Y, and Z are recorded as the location details 570d.
- the location details 570d are acquired using, for example, a GPS or a home sensor network.
- the location 570c and the location details 570d may differ depending on circumstances. For example, when the tablet in the living room is temporarily moved to the bedroom, the location 570c remains "living room" unless the setting information is changed, but the location details 570d may be changed to the coordinates of the bedroom as the device moves. Alternatively, when the location 570c is set based on the location details 570d instead of the setting information, the location 570c may also change as the electronic device 100 moves.
- the location 570c and/or the location details 570d may be used for selecting a device that provides a response. More specifically, the device selection unit 550 selects the electronic device 100 indicated by the location 570c or the location details 570d to be located near the user to whom the response is provided. When the location 570c and/or the location details 570d change due to the movement of an electronic device 100, the device selected by the device selection unit 550 may change dynamically: the electronic device 100 may be newly selected as the device that provides the response, or may be removed from the devices providing the response.
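As a minimal sketch of this position-based selection (the coordinates and identifiers below are invented for illustration), the device whose location details are nearest to the user can be chosen, and simply re-running the selection after a move yields the dynamic re-selection described above:

```python
import math

# Hypothetical sketch: choose the device whose three-axis location details
# (X, Y, Z) are closest to the position of the user receiving the response.
def nearest_device(user_pos, devices):
    return min(devices, key=lambda d: math.dist(user_pos, d["xyz"]))["id"]

devices = [
    {"id": "speaker_100c", "xyz": (1.0, 2.0, 0.0)},  # living room (assumed)
    {"id": "speaker_100f", "xyz": (8.0, 1.0, 0.0)},  # bathroom (assumed)
]

# Re-selection is just another call with the updated user position.
in_living_room = nearest_device((0.5, 2.0, 0.0), devices)
in_bathroom = nearest_device((7.5, 1.0, 0.0), devices)
```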
- the owner 570e indicates the owner of the electronic device 100 or an accessible user.
- for example, "shared" or "father" is set as the owner 570e.
- “Shared” means that there is no restriction on access to the electronic device 100, and all of the users who use the system 10 (for example, the entire family) can use the electronic device 100.
- “Father” means that only the father is authorized to access the electronic device 100.
- the access authority is not limited to the father, and the access authority may be applied to the mother, the child, or a combination thereof.
- when selecting a device that provides a response, the device selection unit 550 may first filter the devices based on the owner 570e. More specifically, if neither the user who input the request nor the user who is the target of the response can use an electronic device 100, that electronic device 100 can be excluded from the selection targets. In addition, when only one of those users can use the electronic device 100, whether or not to select it can be determined depending on the settings made by the users, the type of function provided in the response, and the like.
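The owner-based filtering step might look like the following; this is a sketch under the assumption that the owner field holds either "shared" or a single user name, as in FIG. 5:

```python
# Hypothetical sketch: drop devices that neither the requesting user nor the
# target user of the response is authorized to access.
def filter_by_owner(devices, requester, target):
    allowed_owners = {"shared", requester, target}
    return [d for d in devices if d["owner"] in allowed_owners]

devices = [
    {"id": "television_100a", "owner": "shared"},
    {"id": "pc_100j", "owner": "father"},
]

for_mother = filter_by_owner(devices, "mother", "son")     # excludes the PC
for_father = filter_by_owner(devices, "father", "mother")  # keeps both
```

The disclosure notes that when only one of the two users has access, selection may further depend on user settings and the function type; that refinement is omitted here.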
- the function 570f indicates the type of function that can be provided by the electronic device 100.
- the function detail 570g indicates the specification of the electronic device 100 related to the function. These items may be acquired from a database on the network based on information such as a model number provided to the agent function when the electronic device 100 is added to the system 10, for example.
- the device selection unit 550 may perform filtering based on the function 570f when selecting a device that provides a response. More specifically, electronic device 100 that cannot realize the function provided in the response may be excluded from selection targets.
- the function details 570g can be used for selection. More specifically, the device selection unit 550 may select, as the device that provides the response, the electronic device 100 that is indicated by the function details 570g to have the highest performance related to the function realized in the response.
- the function details 570g include video, audio, microphone, and network items, but many more items may be included.
- the device selection unit 550 may select the electronic device 100 based on the function details 570g individually for each function. For example, when the video and audio playback function is realized in the response, the function details 570g indicate that the television performance is the highest with respect to the video playback function and the speaker performance is the highest with respect to the audio playback function. If so, the device selection unit 550 may select a television for the video playback function and a speaker for the audio playback function.
- the device information DB 570 can include various items.
- the device information DB 570 may include items indicating functions that the electronic device 100 is providing.
- the device information DB 570 may include an item indicating a user who is using the electronic device 100.
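The items of the device information DB 570 described above could be modeled as one record per device; the concrete field types and sample values below are assumptions for illustration:

```python
from dataclasses import dataclass, field

# Hypothetical record mirroring the items of the device information DB 570.
@dataclass
class DeviceInfo:
    id: str                    # 570a: identifies each piece of device information
    device_type: str           # 570b: e.g. "television"
    location: str              # 570c: schematic location, e.g. "living room"
    location_details: tuple    # 570d: (X, Y, Z) coordinates
    owner: str                 # 570e: "shared" or a user name
    functions: list            # 570f: types of functions the device can provide
    function_details: dict = field(default_factory=dict)  # 570g: per-function specs
    providing: list = field(default_factory=list)  # optional: functions now in use
    current_user: str = ""                         # optional: user using the device

tv = DeviceInfo("1", "television", "living room", (1.0, 0.5, 0.0),
                "shared", ["video", "audio"], {"video": "high"})
```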
- the device selection unit 550 can automatically select a device that provides a response in the agent function in the present embodiment.
- FIG. 6 is a sequence diagram illustrating a first example of a specific usage pattern according to an embodiment of the present disclosure.
- a speaker 100c installed in a living room is playing music.
- the speaker 100c notifies the television 100a that realizes the agent function that music content is being reproduced (S101).
- upon receiving this notification, in the agent function realized by the control unit 140 of the television 100a, the device state monitoring unit 590 records in the device information DB 570 that the speaker 100c is reproducing music content, together with information for identifying the music content being reproduced (for example, stored in the NAS 100i).
- subsequently, a spoken request from the user, for example a request to continue the music playback in the bathroom, is acquired as a request from the user by the agent function realized by the control unit 140.
- the response determination unit 520 determines a response based on the audio data acquired by the request acquisition unit 510.
- the response determination unit 520 analyzes the request by executing, for example, voice recognition or natural language processing, and determines a response to the request. In the illustrated example, the response is “continue playback of the music content currently provided to the user with the device in the bathroom”.
- the control unit 140 of the television 100a may request the server 300, via the network 200, to perform the voice recognition and natural language processing.
- the device selection unit 550 selects a device that provides a response.
- the place where the response is provided is the bathroom, and the function realized in the response is the reproduction of music content, so the speaker 100f is selected (S105).
- the device control unit 580 executes control for providing a response to the device selected by the device selection unit 550, that is, the speaker 100f. More specifically, the device control unit 580 transmits a control command to the bathroom speaker 100f so as to continuously play the music content that has been played on the living room speaker 100c (S107).
- the control command at this time may include information indicating the location of the music content (for example, NAS 100i) and the position where playback is started.
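A control command of this kind could be serialized, for instance, as a small JSON payload; the field names and the content locator are hypothetical, since the disclosure does not define a command format:

```python
import json

# Hypothetical control command telling the bathroom speaker to continue
# playback of content stored elsewhere, from a given position.
command = {
    "action": "play",
    "content_location": "nas-100i/music/current-track",  # assumed locator
    "start_position_sec": 93.5,  # resume point from the living-room speaker
}
encoded = json.dumps(command)
```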
- when the control command has been transmitted to the speaker 100f, or when an acknowledgment of the control command has been returned, the device control unit 580 may notify the user who input the request to the television 100a that the response has been provided.
- the notification may be output by voice or may be output as an image using a GUI or the like.
- in addition, the device control unit 580 transmits a control command for stopping the reproduction of the music content to the speaker 100c (S111).
- FIG. 7 is a sequence diagram illustrating a second example of a specific usage pattern according to an embodiment of the present disclosure.
- referring to FIG. 7, in the system 10 shown in FIG. 1, a smartphone 100m carried by a user (the father) and taken out (to the workplace) periodically transmits position information to the server 300 that realizes the agent function (S201).
- here, suppose that a user (the mother) speaks to the tablet 100e at home, saying "Tell Dad to buy milk on his way home" (S203).
- the microphone included in the image / sound input unit 120 acquires the speech sound.
- the audio data output from the microphone is transmitted from the tablet 100e to the server 300 via the network 200 (S205).
- the request acquisition unit 510 of the agent function acquires the audio data, and the response determination unit 520 determines a response based on the audio data. As in the first example, the response determination unit 520 executes, for example, voice recognition and natural language processing, and determines a response to the user's request. In the illustrated example, the response is "notify the father to buy milk". Further, in the agent function, the device selection unit 550 selects a device that provides the response. In the illustrated example, since the user to whom the response is provided is the father, the device selection unit 550 selects the electronic device 100 closest to the father, that is, the smartphone 100m carried by the father who is out, as the device that provides the response (S207).
- the device control unit 580 executes control for causing the smartphone 100m selected by the device selection unit 550 to provide a response. More specifically, the device control unit 580 transmits a control command so that the smartphone 100m outputs a notification (S209).
- the control command at this time may include information indicating the content of the notification (request for purchasing milk on the way back) and the requester (the user who input the request to the tablet 100e, that is, the mother).
- when the control command has been transmitted to the smartphone 100m, or when an acknowledgment of the control command has been returned, the device control unit 580 may notify the user who input the request to the tablet 100e that the response (the notification to the father via the smartphone 100m) has been provided (S211, S213). The notification may be output by voice or as an image using a GUI.
- various examples other than those shown in FIGS. 6 and 7 are also possible. For example, suppose that a user who is about to watch a movie on the television 100a in the living room says to the television 100a, "Play the movie on the TV". In this case, the response realizes a function of displaying the image of the movie (video content) and a function of outputting its audio, and the device selection unit 550 selects, from among the electronic devices 100 in the living room, the device having the highest performance regarding the image display function and the device having the highest performance regarding the audio output function. Specifically, the television 100a is selected as the device having the highest performance for displaying images, and the speaker 100c as the device having the highest performance for outputting audio.
- the device selection unit 550 may further specify the position where the user is sitting in the living room and select the speaker 100c as a device that can provide the optimum sound for the position.
- the device control unit 580 transmits a control command to the television 100a and the speaker 100c.
- the control command may include information indicating a video content acquisition method (for example, receiving a broadcast wave using the tuner of the television 100a or reading it from the NAS 100i). Further, the control command transmitted to the television 100a may include an instruction to mute the audio output of the video content.
- the agent function may also be able to provide a more complex response, or a response that is activated after some time, in response to a request. For example, suppose that when the mother goes out, she gives an input by spoken voice to the microphone provided in the entrance lighting fixture 100k: "When my son comes back, let him know that there is a snack in the cupboard". At this time, in the agent function, the request acquisition unit 510 acquires the above voice input, and the response determination unit 520 determines a response to the request (a notification to the son); however, since the son has not yet returned home at that point, the user position information acquisition unit 540 cannot acquire the position information of the son, who is the target of the response.
- in this case, the device selection unit 550 determines from the content of the request ("when he comes back") that the response should be provided at a later time, and waits to provide the response. Thereafter, when the son comes home, the user position information acquisition unit 540 acquires the position information of the son, for example from the analysis of an image acquired by the camera included in the lighting fixture 100k. The device selection unit 550, having determined that the response can now be provided, selects the electronic device 100 closest to the position of the son at that time, for example the speaker 100b in the living room, as the device that provides the response.
- the device control unit 580 transmits a control command for outputting the notification “There is a snack in the cupboard” from the speaker 100b by voice.
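The deferred delivery in this example, holding a response until position information for the target user becomes available, might be sketched as a simple pending queue (all names below are illustrative assumptions):

```python
# Hypothetical sketch of a deferred response: queued responses are held until
# the target user's position is acquired, then a device is selected.
pending = []

def queue_response(response):
    pending.append(response)

def on_user_position(user, position, select_device):
    """Deliver queued responses for `user` once their position is known."""
    ready = [r for r in pending if r["target"] == user]
    for r in ready:
        pending.remove(r)
        r["device"] = select_device(position)  # e.g. nearest device to `position`
    return ready

queue_response({"target": "son", "message": "There is a snack in the cupboard."})
# Later, when the son's position is detected (e.g. via a camera image):
delivered = on_user_position("son", (1.0, 2.0, 0.0), lambda pos: "speaker_100b")
```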
- the agent function does not necessarily have to acquire a user's explicit speech, gesture, operation, or the like as a request.
- the agent function may extract a request to the system 10 from a user's action that is not particularly intended to operate a device, from a natural phenomenon, or the like. For example, when an image acquired by an electronic device 100 (not shown in FIG. 1) provided with a camera facing the outside of a window shows that it has started to rain while laundry is hung out to dry, the request acquisition unit 510 and the response determination unit 520 may recognize this event through image analysis and automatically specify a virtual request, "notify someone in the house to take in the laundry", together with a response to that virtual request.
- in this case, the device selection unit 550 selects, based on the position information acquired by the user position information acquisition unit 540, an electronic device 100 close to a user in the house (the specific person need not be identified) as the device that provides the response. For example, when the user position information acquisition unit 540 detects from the analysis of an image acquired by the camera included in the personal computer 100j in the study that the father is in the study, the device selection unit 550 selects the personal computer 100j as the device that provides the response (the notification). In this case, the device control unit 580 transmits a control command for outputting a notification, "Please take in the laundry", by voice from a speaker included in the personal computer 100j.
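Extracting a virtual request from observed events, as in the rain-and-laundry example, could be sketched as a lookup from recognized event combinations to requests; the event labels here are assumptions standing in for the image-analysis results:

```python
# Hypothetical sketch: map combinations of recognized events to a virtual
# request, instead of acquiring an explicit request from the user.
EVENT_TO_REQUEST = {
    frozenset({"rain_started", "laundry_hung_out"}):
        "notify someone in the house to take in the laundry",
}

def virtual_request(observed_events):
    # frozenset makes the lookup independent of event order
    return EVENT_TO_REQUEST.get(frozenset(observed_events))
```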
- as described above, the agent function according to the embodiment of the present disclosure makes it possible for a user to use a network of devices with natural and simple operations, for example by automatically selecting the device that provides a response to a request in various situations in the system 10.
- in the above description, the system 10 including electronic devices 100 installed in a home is exemplified, but the present technology is not limited to such an example.
- a system similar to that in a home can be constructed in an office or a car.
- electronic devices that can be used by a user are not concentrated in a specific place, but may be ubiquitous in various places such as homes, offices, and cars.
- FIG. 8 is a block diagram illustrating a hardware configuration example of the information processing apparatus according to the embodiment of the present disclosure.
- the illustrated information processing apparatus 900 can realize, for example, the electronic device or server in the above-described embodiment.
- the information processing apparatus 900 includes a CPU (Central Processing Unit) 901, a ROM (Read Only Memory) 903, and a RAM (Random Access Memory) 905.
- the information processing apparatus 900 may include a host bus 907, a bridge 909, an external bus 911, an interface 913, an input device 915, an output device 917, a storage device 919, a drive 921, a connection port 923, and a communication device 925.
- the information processing apparatus 900 may include an imaging device 933 and a sensor 935 as necessary.
- the information processing apparatus 900 may include a processing circuit called DSP (Digital Signal Processor) or ASIC (Application Specific Integrated Circuit) instead of or in addition to the CPU 901.
- the CPU 901 functions as an arithmetic processing device and a control device, and controls all or a part of the operation in the information processing device 900 according to various programs recorded in the ROM 903, the RAM 905, the storage device 919, or the removable recording medium 927.
- the ROM 903 stores programs and calculation parameters used by the CPU 901.
- the RAM 905 primarily stores programs used in the execution of the CPU 901, parameters that change as appropriate during the execution, and the like.
- the CPU 901, the ROM 903, and the RAM 905 are connected to each other by a host bus 907 configured by an internal bus such as a CPU bus. Further, the host bus 907 is connected to an external bus 911 such as a PCI (Peripheral Component Interconnect / Interface) bus via a bridge 909.
- the input device 915 is a device operated by the user, such as a mouse, a keyboard, a touch panel, a button, a switch, and a lever.
- the input device 915 may be, for example, a remote control device that uses infrared rays or other radio waves, or may be an external connection device 929 such as a mobile phone that supports the operation of the information processing device 900.
- the input device 915 includes an input control circuit that generates an input signal based on information input by the user and outputs the input signal to the CPU 901. The user operates the input device 915 to input various data and instruct processing operations to the information processing device 900.
- the output device 917 is a device that can notify the user of the acquired information visually or audibly.
- the output device 917 can be, for example, a display device such as an LCD (Liquid Crystal Display), a PDP (Plasma Display Panel), an organic EL (Electro-Luminescence) display, an audio output device such as a speaker and headphones, and a printer device.
- the output device 917 outputs the result obtained by the processing of the information processing device 900 as video such as text or an image, or outputs it as audio such as voice or sound.
- the storage device 919 is a data storage device configured as an example of a storage unit of the information processing device 900.
- the storage device 919 includes, for example, a magnetic storage device such as an HDD (Hard Disk Drive), a semiconductor storage device, an optical storage device, or a magneto-optical storage device.
- the storage device 919 stores programs executed by the CPU 901, various data, various data acquired from the outside, and the like.
- the drive 921 is a reader / writer for a removable recording medium 927 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and is built in or externally attached to the information processing apparatus 900.
- the drive 921 reads information recorded on the attached removable recording medium 927 and outputs the information to the RAM 905.
- the drive 921 also writes records to the attached removable recording medium 927.
- the connection port 923 is a port for directly connecting a device to the information processing apparatus 900.
- the connection port 923 can be, for example, a USB (Universal Serial Bus) port, an IEEE 1394 port, a SCSI (Small Computer System Interface) port, or the like.
- the connection port 923 may be an RS-232C port, an optical audio terminal, an HDMI (registered trademark) (High-Definition Multimedia Interface) port, or the like.
- the communication device 925 is a communication interface configured with, for example, a communication device for connecting to the communication network 931.
- the communication device 925 may be, for example, a communication card for wired or wireless LAN (Local Area Network), Bluetooth (registered trademark), or WUSB (Wireless USB).
- the communication device 925 may be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), or a modem for various communication.
- the communication device 925 transmits and receives signals and the like using a predetermined protocol such as TCP / IP with the Internet and other communication devices, for example.
- the communication network 931 connected to the communication device 925 is a wired or wireless network, such as the Internet, a home LAN, infrared communication, radio wave communication, or satellite communication.
- the imaging device 933 is an apparatus that images a real space and generates a captured image using various members, such as an imaging element, for example a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor, and a lens for controlling the formation of a subject image on the imaging element.
- the imaging device 933 may capture a still image or may capture a moving image.
- the sensor 935 is various sensors such as an acceleration sensor, a gyro sensor, a geomagnetic sensor, an optical sensor, and a sound sensor.
- the sensor 935 acquires information about the state of the information processing apparatus 900 itself, such as the attitude of the information processing apparatus 900, and information about the surrounding environment of the information processing apparatus 900, such as the brightness and noise around the information processing apparatus 900.
- the sensor 935 may include a GPS sensor that receives a GPS (Global Positioning System) signal and measures the latitude, longitude, and altitude of the apparatus.
- Each component described above may be configured using a general-purpose member, or may be configured by hardware specialized for the function of each component. Such a configuration can be appropriately changed according to the technical level at the time of implementation.
- Embodiments of the present disclosure can include, for example, an information processing apparatus (an electronic device or a server) as described above, a system, an information processing method executed by the information processing apparatus or the system, a program for causing the information processing apparatus to function, and a non-transitory tangible medium on which the program is recorded.
- (1) An information processing apparatus including: a request acquisition unit configured to acquire a request to a system; a response determination unit configured to determine a response to the request; a device selection unit configured to select, based on at least an attribute of the response, a device that provides the response from among devices included in the system; and a device control unit configured to execute control for causing the selected device to provide the response.
- (2) The information processing apparatus according to (1), wherein the attribute of the response includes a target to which the response is provided, and the device selection unit selects the device that provides the response based on a position of the target and positions of the devices included in the system.
- (3) The information processing apparatus according to (2), wherein the device selection unit selects a device located near the target.
- (4) The information processing apparatus according to (2) or (3), wherein the device selection unit reselects the device that provides the response when at least one of the target, the position of the target, or the position of a device included in the system changes.
- (5) The information processing apparatus according to any one of (1) to (4), wherein the attribute of the response includes a function realized by the response, and the device selection unit selects the device that provides the response based on performance of the devices included in the system with respect to the function.
- (6) The information processing apparatus according to any one of (1) to (5), wherein the function realized by the response includes a first function and a second function, and the device selection unit selects a first device that realizes the first function and a second device that realizes the second function.
- (7) The information processing apparatus according to (6), wherein the function realized by the response includes reproduction of video content, the device selection unit selects the first device that displays images of the video content and the second device that outputs audio of the video content, and the device control unit executes control for causing the first device to display the images of the video content while muting the audio of the video content.
- (8) The information processing apparatus according to any one of (1) to (7), wherein the attribute of the response includes a target to which the response is provided, and the device selection unit selects the device that provides the response based on a preference of a user who is the target.
- (9) The information processing apparatus according to any one of (1) to (8), wherein the device control unit executes control for causing the device to which the request was input to request additional information for selecting the device that provides the response.
- (10) The information processing apparatus according to any one of (1) to (9), wherein the device control unit executes control for causing the device to which the request was input to notify that the response has been provided by the selected device.
- (11) An information processing method including: acquiring a request to a system; determining a response to the request; selecting, based on at least an attribute of the response, a device that provides the response from among devices included in the system; and executing control for causing the selected device to provide the response.
- (12) A program for causing a computer to realize: a function of acquiring a request to a system; a function of determining a response to the request; a function of selecting, based on at least an attribute of the response, a device that provides the response from among devices included in the system; and a function of executing control for causing the selected device to provide the response.
Landscapes
- Engineering & Computer Science (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Automation & Control Theory (AREA)
- Computer Networks & Wireless Communication (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Selective Calling Equipment (AREA)
- Telephonic Communication Services (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
1. Configuration example of an embodiment
1-1. System configuration example
1-2. Configuration example of each device
1-3. Configuration example of the agent function
1-4. Example of device information
2. Examples of specific usage patterns
2-1. First example
2-2. Second example
2-3. Other examples
3. Hardware configuration
4. Supplement
(1-1. System configuration example)
FIG. 1 is a diagram illustrating a configuration example of a system according to an embodiment of the present disclosure. Referring to FIG. 1, the system 10 includes electronic devices 100 and a network 200 to which the electronic devices 100 are connected. The system 10 may further include a server 300 connected to the network 200.
FIG. 2 is a block diagram illustrating a configuration example of an electronic device according to an embodiment of the present disclosure. Referring to FIG. 2, the electronic device 100 may include an image/audio output unit 110, an image/audio input unit 120, an operation unit 130, a control unit 140, a communication unit 150, and a storage unit 160. Note that the illustrated configuration is simplified for the description of the present embodiment, and the electronic device 100 may further include components that are not illustrated. However, since the components that are not illustrated can be those already known as general components of each device, a detailed description of them is omitted here.
FIG. 4 is a block diagram illustrating a configuration example of the agent function according to an embodiment of the present disclosure. Referring to FIG. 4, the agent function includes a request acquisition unit 510, a response determination unit 520, a device selection unit 550, and a device control unit 580. The agent function may further include a user position information acquisition unit 540 and a device state monitoring unit 590. The agent function may also refer to a command DB 530, a user preference DB 560, and a device information DB 570. These components are realized by, for example, the control unit 310 and the storage unit 330 of the server 300 described above. Alternatively, the components may be realized by the control unit 140 and the storage unit 160 of any of the electronic devices 100 (for example, the television 100a or the smartphone 100m). The components may also be realized in a distributed manner across the electronic devices 100 and the server 300.
The request acquisition unit 510 acquires a request from a user to the system 10. The request from the user is input at one of the electronic devices 100. For example, the request from the user is input as audio data using a microphone included in the image/audio input unit 120 of the electronic device 100. In this case, the user's request can include spoken voice. Alternatively, the request from the user may be input as image data using a camera included in the image/audio input unit 120 of the electronic device 100. In this case, the user's request can include a gesture image. The request from the user may also be input via the operation unit 130 of the electronic device 100. When the agent function is realized by the server 300, or by an electronic device 100 different from the electronic device 100 to which the request was input, the request acquisition unit 510 receives the request via the network 200.
The response determination unit 520 determines a response to the request acquired by the request acquisition unit 510. For example, when the request is acquired as audio data including spoken voice, the response determination unit 520 executes voice recognition processing and extracts the content of the spoken voice, for example as text. Further, the response determination unit 520 refers to the command DB 530 based on the extracted text and determines the command indicated by the user's spoken voice and the response to that command. Also, for example, when the request is acquired as image data including a gesture image, the response determination unit 520 executes image recognition processing and extracts the content of the gesture. Further, the response determination unit 520 refers to the command DB 530 based on the extracted gesture content and determines the command indicated by the user's gesture and the response to that command.
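The command DB lookup performed by the response determination unit, matching recognized text against registered commands, might be sketched as follows; the patterns and command names are hypothetical, since the disclosure does not specify the DB format:

```python
import re

# Hypothetical sketch: match recognized utterance text against command
# patterns registered in a command DB and return the matched command.
COMMAND_DB = [
    (re.compile(r"play .* in the (?P<place>\w+)"), "continue_playback"),
    (re.compile(r"tell (?P<who>\w+) to (?P<what>.+)"), "notify_user"),
]

def match_command(text):
    for pattern, command in COMMAND_DB:
        m = pattern.search(text)
        if m:
            return command, m.groupdict()
    return None, {}
```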
The user position information acquisition unit 540 acquires position information of the user related to the response determined by the response determination unit 520. For example, the user position information acquisition unit 540 may acquire position information of the target user to whom the response is to be provided. Alternatively, the user position information acquisition unit 540 may acquire position information of the user who input the request (who may be the same as the target user to whom the response is provided). The user's position information is acquired using, for example, a GPS (Global Positioning System) receiver of the smartphone 100m. When the user is at home, it may also be possible to acquire detailed position information of the user through a sensor network in the home or the like. Alternatively, the user's position information may be acquired based on analysis of images or audio acquired by the image/audio output unit 110 of the electronic device 100, the login state to a service using the electronic device 100, or the like. In this case, the position information can be provided in association with each electronic device 100.
The device selection unit 550 selects, for the response determined by the response determination unit 520, a device that provides the response from among the group of devices available to the user, based on at least an attribute of the response. In the present embodiment, the group of devices available to the user is the electronic devices 100 included in the system 10. For example, when an owner is set for at least some of the electronic devices 100 and access to those electronic devices 100 is restricted by the owner, the group of devices available to the user is those electronic devices 100 to which the user is granted access authority. The device selection unit 550 may acquire information on the access authority of the electronic devices 100 from the device information DB 570.
機器制御部580は、機器選択部550によって選択された機器に、レスポンス決定部520によって決定されたレスポンスを提供させるための制御を実行する。より具体的には、機器制御部580は、ネットワーク200を介して電子機器100に制御コマンドを送信する。また、機器制御部580は、例えば機器選択部550がレスポンスを提供する機器を選択できなかった場合に、機器が選択されなかったことを通知したり、機器を選択するための追加的な情報を要求したりするメッセージを出力するように、電子機器100を制御してもよい。この場合、メッセージは、リクエストが入力された電子機器100から出力されてもよい。あるいは、機器制御部580は、リクエストに対応してレスポンスが提供されたことを通知するメッセージを出力するように、電子機器100を制御してもよい。この場合、レスポンスを提供する電子機器100はリクエストが入力された電子機器100とは異なる機器であり、メッセージはリクエストが入力された電子機器100から出力されてもよい。
機器状態監視部590は、システム10に含まれる電子機器100を監視し、必要に応じて機器情報DB570を更新する。例えば、機器状態監視部590は、電子機器100が移動した場合や、電子機器100による機能の提供が開始/終了された場合、電子機器100の設定が変更された場合などに、機器情報DB570を更新する。
図5は、本開示の一実施形態における機器情報DBの例を示す図である。図5を参照すると、機器情報DB570には、ID570a、機器種類570b、場所570c、場所詳細570d、所有者570e、機能570f、機能詳細570gの項目が含まれる。なお、図示された例の機器情報DB570の内容は、例えば図1に示したシステム10の例とは必ずしも一致しない。以下、それぞれの項目についてさらに説明する。
(2-1.第1の例)
図6は、本開示の一実施形態の具体的な利用形態の第1の例を示すシーケンス図である。図6を参照すると、図1に示したシステム10において、リビングルームに設置されたスピーカ100cが音楽を再生している。このとき、スピーカ100cは、エージェント機能を実現するテレビ100aに対して、音楽コンテンツを再生中であることを通知している(S101)。テレビ100aでは、この通知を受けて、制御部140によって実現されるエージェント機能において、機器状態監視部590が、機器情報DB570に、スピーカ100cが音楽コンテンツを再生中であることと、再生されている音楽コンテンツ(例えば、NAS100iに格納されている)を特定するための情報とを記録する。
図7は、本開示の一実施形態の具体的な利用形態の第2の例を示すシーケンス図である。図7を参照すると、図1に示したシステム10において、ユーザ(父親)によって携帯されて外(職場)に持ち出されたスマートフォン100mが、定期的に、エージェント機能を実現するサーバ300に位置情報を送信している(S201)。ここで、家にあるタブレット100eに対して、ユーザ(母親)が、「帰りに牛乳を買ってくるようお父さんに伝えて」と発話したとする(S203)。タブレット100eでは、画像/音声入力部120に含まれるマイクロフォンが発話音声を取得する。マイクロフォンから出力された音声データは、タブレット100eからネットワーク200を介してサーバ300に送信される(S205)。
(その他の例-1)
上記で図6および図7に示した例に限らず、他にもさまざまな例が可能である。例えば、ユーザが、リビングルームのテレビ100aで映画を視聴しようとしているときに、テレビ100aに向かって「テレビで映画を再生して」と発話したとする。このとき、エージェント機能(例えば、テレビ100a、他の電子機器100、またはサーバ300で実現される)では、リクエスト取得部510が取得したリクエスト(ユーザの発話音声を含む音声データ)に基づいて、レスポンス決定部520がレスポンスを決定する。この場合のレスポンスでは、「映画(映像コンテンツ)の画像を表示する」機能と、「映画(映像コンテンツ)の音声を出力する」機能とが実現される。
また、エージェント機能は、リクエストに対して、より複雑な、または時間をおいて起動するレスポンスを提供可能であってもよい。例えば、母親が出かけるときに、玄関の照明器具100kが備えるマイクロフォンに「息子が帰ってきたら、戸棚におやつがあることを知らせて」と発話音声による入力を与えたとする。このとき、エージェント機能では、リクエスト取得部510が上記の音声入力を取得し、レスポンス決定部520がリクエストに対するレスポンス(息子への通知)を決定するが、その時点でまだ息子は帰宅していないため、ユーザ位置情報取得部540はレスポンスの対象である息子の位置情報を取得できない。
Furthermore, the agent function does not necessarily have to acquire an explicit utterance, gesture, or operation by the user as a request. The agent function may extract a request to the system 10 from a user action (one not particularly intended to operate a device) or from a natural phenomenon. For example, if an image acquired by an electronic device 100 equipped with a camera facing out the window (not shown in FIG. 1) shows rain starting to fall and laundry hung out to dry, the request acquisition unit 510 and the response determination unit 520 may recognize these events by image analysis and automatically identify a virtual request, "notify someone in the house to bring in the laundry," together with a response to it.
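Extracting a virtual request from recognized events can be illustrated as a simple rule lookup over the output of image analysis. The event labels and the rule table below are invented for this sketch; the disclosure does not prescribe any particular rule format.

```python
# Hypothetical rule table: a set of recognized events -> a virtual request.
RULES = {
    ("rain_started", "laundry_out"): "notify someone at home to bring in the laundry",
}

def extract_virtual_request(observed_events):
    """Sketch of identifying a virtual request: if all events of a rule's
    trigger were recognized (e.g. by image analysis), return the request."""
    observed = set(observed_events)
    for trigger, request in RULES.items():
        if set(trigger) <= observed:
            return request
    return None
```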
Next, the hardware configuration of an information processing apparatus according to an embodiment of the present disclosure will be described with reference to FIG. 8. FIG. 8 is a block diagram illustrating an example of the hardware configuration of an information processing apparatus according to an embodiment of the present disclosure. The illustrated information processing apparatus 900 can realize, for example, the electronic device or the server in the embodiments described above.
Embodiments of the present disclosure can include, for example, an information processing apparatus (an electronic device or a server) as described above, a system, an information processing method executed in the information processing apparatus or the system, a program for causing the information processing apparatus to function, and a non-transitory tangible medium on which the program is recorded.
(1) An information processing apparatus including:
a request acquisition unit configured to acquire a request to a system;
a response determination unit configured to determine a response to the request;
a device selection unit configured to select, based at least on an attribute of the response, a device to provide the response from among devices included in the system; and
a device control unit configured to execute control for causing the selected device to provide the response.
(2) The information processing apparatus according to (1), wherein the attribute of the response includes a target to which the response is provided, and
the device selection unit selects the device to provide the response based on a position of the target and positions of the devices included in the system.
(3) The information processing apparatus according to (2), wherein the device selection unit selects a device located near the target.
(4) The information processing apparatus according to (2) or (3), wherein the device selection unit reselects the device to provide the response when at least one of the target, the position of the target, or the positions of the devices included in the system has changed.
(5) The information processing apparatus according to any one of (1) to (4), wherein the attribute of the response includes a function realized by the response, and
the device selection unit selects the device to provide the response based on performance of the devices included in the system regarding the function.
(6) The information processing apparatus according to any one of (1) to (5), wherein the function realized by the response includes a first function and a second function, and
the device selection unit selects a first device that realizes the first function and a second device that realizes the second function.
(7) The information processing apparatus according to (6), wherein the function realized by the response includes playback of video content,
the device selection unit selects the first device, which displays images of the video content, and the second device, which outputs audio of the video content, and
the device control unit executes control for causing the first device to display the images of the video content and to mute the audio of the video content.
(8) The information processing apparatus according to any one of (1) to (7), wherein the attribute of the response includes a target to which the response is provided, and
the device selection unit selects the device to provide the response based on preferences of a user who is the target.
(9) The information processing apparatus according to any one of (1) to (8), wherein the device control unit executes control for causing the device to which the request was input to request additional information for selecting the device to provide the response.
(10) The information processing apparatus according to any one of (1) to (9), wherein the device control unit executes control for causing the device to which the request was input to notify that the response has been provided by the selected device.
(11) An information processing method including:
acquiring a request to a system;
determining a response to the request;
selecting, based at least on an attribute of the response, a device to provide the response from among devices included in the system; and
executing control for causing the selected device to provide the response.
(12) A program for causing a computer to realize:
a function of acquiring a request to a system;
a function of determining a response to the request;
a function of selecting, based at least on an attribute of the response, a device to provide the response from among devices included in the system; and
a function of executing control for causing the selected device to provide the response.
100 Electronic device
110 Image/audio output unit
120 Image/audio acquisition unit
130 Operation unit
140 Control unit
150 Communication unit
160 Storage unit
200 Network
300 Server
310 Control unit
320 Communication unit
330 Storage unit
510 Request acquisition unit
520 Response determination unit
550 Device selection unit
580 Device control unit
Claims (12)
- An information processing apparatus including:
a request acquisition unit configured to acquire a request to a system;
a response determination unit configured to determine a response to the request;
a device selection unit configured to select, based at least on an attribute of the response, a device to provide the response from among devices included in the system; and
a device control unit configured to execute control for causing the selected device to provide the response.
- The information processing apparatus according to claim 1, wherein the attribute of the response includes a target to which the response is provided, and
the device selection unit selects the device to provide the response based on a position of the target and positions of the devices included in the system.
- The information processing apparatus according to claim 2, wherein the device selection unit selects a device located near the target.
- The information processing apparatus according to claim 2, wherein the device selection unit reselects the device to provide the response when at least one of the target, the position of the target, or the positions of the devices included in the system has changed.
- The information processing apparatus according to claim 1, wherein the attribute of the response includes a function realized by the response, and
the device selection unit selects the device to provide the response based on performance of the devices included in the system regarding the function.
- The information processing apparatus according to claim 1, wherein the function realized by the response includes a first function and a second function, and
the device selection unit selects a first device that realizes the first function and a second device that realizes the second function.
- The information processing apparatus according to claim 6, wherein the function realized by the response includes playback of video content,
the device selection unit selects the first device, which displays images of the video content, and the second device, which outputs audio of the video content, and
the device control unit executes control for causing the first device to display the images of the video content and to mute the audio of the video content.
- The information processing apparatus according to claim 1, wherein the attribute of the response includes a target to which the response is provided, and
the device selection unit selects the device to provide the response based on preferences of a user who is the target.
- The information processing apparatus according to claim 1, wherein the device control unit executes control for causing the device to which the request was input to request additional information for selecting the device to provide the response.
- The information processing apparatus according to claim 1, wherein the device control unit executes control for causing the device to which the request was input to notify that the response has been provided by the selected device.
- An information processing method including:
acquiring a request to a system;
determining a response to the request;
selecting, based at least on an attribute of the response, a device to provide the response from among devices included in the system; and
executing control for causing the selected device to provide the response.
- A program for causing a computer to realize:
a function of acquiring a request to a system;
a function of determining a response to the request;
a function of selecting, based at least on an attribute of the response, a device to provide the response from among devices included in the system; and
a function of executing control for causing the selected device to provide the response.
Priority Applications (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/113,301 US9848253B2 (en) | 2014-03-03 | 2014-11-27 | Information processing apparatus, information processing method, and program |
EP20168014.7A EP3739460A1 (en) | 2014-03-03 | 2014-11-27 | Information processing apparatus, information processing method, and program |
JP2016506085A JP6503557B2 (ja) | 2014-03-03 | 2014-11-27 | Information processing apparatus, information processing method, and program |
EP14884408.7A EP3115905A4 (en) | 2014-03-03 | 2014-11-27 | Information processing apparatus, information processing method, and program |
KR1020167023227A KR102325697B1 (ko) | 2014-03-03 | 2014-11-27 | 정보 처리 장치, 정보 처리 방법 및 프로그램 |
US15/843,805 US10244293B2 (en) | 2014-03-03 | 2017-12-15 | Information processing apparatus, information processing method, and program |
US16/279,317 US10623835B2 (en) | 2014-03-03 | 2019-02-19 | Information processing apparatus, information processing method, and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014040840 | 2014-03-03 | ||
JP2014-040840 | 2014-03-03 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/113,301 A-371-Of-International US9848253B2 (en) | 2014-03-03 | 2014-11-27 | Information processing apparatus, information processing method, and program |
US15/843,805 Continuation US10244293B2 (en) | 2014-03-03 | 2017-12-15 | Information processing apparatus, information processing method, and program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015133022A1 true WO2015133022A1 (ja) | 2015-09-11 |
Family
ID=54054845
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/081430 WO2015133022A1 (ja) | Information processing apparatus, information processing method, and program | 2014-03-03 | 2014-11-27 |
Country Status (5)
Country | Link |
---|---|
US (3) | US9848253B2 (ja) |
EP (2) | EP3115905A4 (ja) |
JP (1) | JP6503557B2 (ja) |
KR (1) | KR102325697B1 (ja) |
WO (1) | WO2015133022A1 (ja) |
Cited By (42)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2018195312A (ja) * | 2017-05-19 | 2018-12-06 | NAVER Corporation | Media selection for providing information corresponding to a voice request |
JP2019510247A (ja) * | 2016-12-30 | 2019-04-11 | Google LLC | Multimodal transmission of packetized data |
JP2019164615A (ja) * | 2018-03-20 | 2019-09-26 | Sharp Corporation | Information processing system and information processing method |
US10593329B2 (en) | 2016-12-30 | 2020-03-17 | Google Llc | Multimodal transmission of packetized data |
US10650066B2 (en) | 2013-01-31 | 2020-05-12 | Google Llc | Enhancing sitelinks with creative content |
US10735552B2 (en) | 2013-01-31 | 2020-08-04 | Google Llc | Secondary transmissions of packetized data |
JP2020129776A (ja) * | 2019-02-12 | 2020-08-27 | NTT Docomo, Inc. | Control system |
US10776830B2 (en) | 2012-05-23 | 2020-09-15 | Google Llc | Methods and systems for identifying new computers and providing matching services |
JP2020528594A (ja) * | 2017-10-03 | 2020-09-24 | Google LLC | Display mode dependent response generation with latency considerations |
WO2020195388A1 (ja) * | 2019-03-26 | 2020-10-01 | Panasonic Intellectual Property Management Co., Ltd. | Information notification system and information notification method |
US11087760B2 (en) | 2016-12-30 | 2021-08-10 | Google, Llc | Multimodal transmission of packetized data |
JP2022008837A (ja) * | 2016-02-22 | 2022-01-14 | Sonos, Inc. | Audio response playback |
US11513763B2 (en) | 2016-02-22 | 2022-11-29 | Sonos, Inc. | Audio response playback |
US11646023B2 (en) | 2019-02-08 | 2023-05-09 | Sonos, Inc. | Devices, systems, and methods for distributed voice processing |
US11646045B2 (en) | 2017-09-27 | 2023-05-09 | Sonos, Inc. | Robust short-time fourier transform acoustic echo cancellation during audio playback |
US11714600B2 (en) | 2019-07-31 | 2023-08-01 | Sonos, Inc. | Noise classification for event detection |
US11727933B2 (en) | 2016-10-19 | 2023-08-15 | Sonos, Inc. | Arbitration-based voice recognition |
US11778259B2 (en) | 2018-09-14 | 2023-10-03 | Sonos, Inc. | Networked devices, systems and methods for associating playback devices based on sound codes |
US11792590B2 (en) | 2018-05-25 | 2023-10-17 | Sonos, Inc. | Determining and adapting to changes in microphone performance of playback devices |
US11790911B2 (en) | 2018-09-28 | 2023-10-17 | Sonos, Inc. | Systems and methods for selective wake word detection using neural network models |
US11790937B2 (en) | 2018-09-21 | 2023-10-17 | Sonos, Inc. | Voice detection optimization using sound metadata |
US11798553B2 (en) | 2019-05-03 | 2023-10-24 | Sonos, Inc. | Voice assistant persistence across multiple network microphone devices |
US11797263B2 (en) | 2018-05-10 | 2023-10-24 | Sonos, Inc. | Systems and methods for voice-assisted media content selection |
US11816393B2 (en) | 2017-09-08 | 2023-11-14 | Sonos, Inc. | Dynamic computation of system response volume |
US11817083B2 (en) | 2018-12-13 | 2023-11-14 | Sonos, Inc. | Networked microphone devices, systems, and methods of localized arbitration |
US11817076B2 (en) | 2017-09-28 | 2023-11-14 | Sonos, Inc. | Multi-channel acoustic echo cancellation |
US11854547B2 (en) | 2019-06-12 | 2023-12-26 | Sonos, Inc. | Network microphone device with command keyword eventing |
US11863593B2 (en) | 2016-02-22 | 2024-01-02 | Sonos, Inc. | Networked microphone device control |
US11862161B2 (en) | 2019-10-22 | 2024-01-02 | Sonos, Inc. | VAS toggle based on device orientation |
US11869503B2 (en) | 2019-12-20 | 2024-01-09 | Sonos, Inc. | Offline voice control |
US11881222B2 (en) | 2020-05-20 | 2024-01-23 | Sonos, Inc | Command keywords with input detection windowing |
US11881223B2 (en) | 2018-12-07 | 2024-01-23 | Sonos, Inc. | Systems and methods of operating media playback systems having multiple voice assistant services |
US11887598B2 (en) | 2020-01-07 | 2024-01-30 | Sonos, Inc. | Voice verification for media playback |
US11893308B2 (en) | 2017-09-29 | 2024-02-06 | Sonos, Inc. | Media playback system with concurrent voice assistance |
US11899519B2 (en) | 2018-10-23 | 2024-02-13 | Sonos, Inc. | Multiple stage network microphone device with reduced power consumption and processing load |
US11900937B2 (en) | 2017-08-07 | 2024-02-13 | Sonos, Inc. | Wake-word detection suppression |
US11934742B2 (en) | 2016-08-05 | 2024-03-19 | Sonos, Inc. | Playback device supporting concurrent voice assistants |
US11961519B2 (en) | 2020-02-07 | 2024-04-16 | Sonos, Inc. | Localized wakeword verification |
US11973893B2 (en) | 2018-08-28 | 2024-04-30 | Sonos, Inc. | Do not disturb feature for audio notifications |
US11979960B2 (en) | 2016-07-15 | 2024-05-07 | Sonos, Inc. | Contextualization of voice inputs |
US11984123B2 (en) | 2020-11-12 | 2024-05-14 | Sonos, Inc. | Network device interaction by range |
US11983463B2 (en) | 2016-02-22 | 2024-05-14 | Sonos, Inc. | Metadata exchange involving a networked playback system and a networked microphone system |
Families Citing this family (70)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9318108B2 (en) | 2010-01-18 | 2016-04-19 | Apple Inc. | Intelligent automated assistant |
US8977255B2 (en) | 2007-04-03 | 2015-03-10 | Apple Inc. | Method and system for operating a multi-function portable electronic device using voice-activation |
US8676904B2 (en) | 2008-10-02 | 2014-03-18 | Apple Inc. | Electronic devices with voice command and contextual data processing capabilities |
US20120309363A1 (en) | 2011-06-03 | 2012-12-06 | Apple Inc. | Triggering notifications associated with tasks items that represent tasks to perform |
US10417037B2 (en) | 2012-05-15 | 2019-09-17 | Apple Inc. | Systems and methods for integrating third party services with a digital assistant |
DE112014000709B4 (de) | 2013-02-07 | 2021-12-30 | Apple Inc. | Verfahren und vorrichtung zum betrieb eines sprachtriggers für einen digitalen assistenten |
US10652394B2 (en) | 2013-03-14 | 2020-05-12 | Apple Inc. | System and method for processing voicemail |
US10748529B1 (en) | 2013-03-15 | 2020-08-18 | Apple Inc. | Voice activated device for use with a voice-based digital assistant |
US10176167B2 (en) | 2013-06-09 | 2019-01-08 | Apple Inc. | System and method for inferring user intent from speech inputs |
KR101749009B1 (ko) | 2013-08-06 | 2017-06-19 | 애플 인크. | 원격 디바이스로부터의 활동에 기초한 스마트 응답의 자동 활성화 |
US9916839B1 (en) * | 2014-03-27 | 2018-03-13 | Amazon Technologies, Inc. | Shared audio functionality based on device grouping |
US9715875B2 (en) | 2014-05-30 | 2017-07-25 | Apple Inc. | Reducing the need for manual start/end-pointing and trigger phrases |
US10170123B2 (en) | 2014-05-30 | 2019-01-01 | Apple Inc. | Intelligent assistant for home automation |
AU2015266863B2 (en) | 2014-05-30 | 2018-03-15 | Apple Inc. | Multi-command single utterance input method |
US9338493B2 (en) | 2014-06-30 | 2016-05-10 | Apple Inc. | Intelligent automated assistant for TV user interactions |
US9886953B2 (en) | 2015-03-08 | 2018-02-06 | Apple Inc. | Virtual assistant activation |
US10460227B2 (en) | 2015-05-15 | 2019-10-29 | Apple Inc. | Virtual assistant in a communication session |
US10200824B2 (en) | 2015-05-27 | 2019-02-05 | Apple Inc. | Systems and methods for proactively identifying and surfacing relevant content on a touch-sensitive device |
US20160378747A1 (en) | 2015-06-29 | 2016-12-29 | Apple Inc. | Virtual assistant for media playback |
US10671428B2 (en) | 2015-09-08 | 2020-06-02 | Apple Inc. | Distributed personal assistant |
US10747498B2 (en) | 2015-09-08 | 2020-08-18 | Apple Inc. | Zero latency digital assistant |
US10740384B2 (en) | 2015-09-08 | 2020-08-11 | Apple Inc. | Intelligent automated assistant for media search and playback |
US10331312B2 (en) | 2015-09-08 | 2019-06-25 | Apple Inc. | Intelligent automated assistant in a media environment |
US10691473B2 (en) | 2015-11-06 | 2020-06-23 | Apple Inc. | Intelligent automated assistant in a messaging environment |
US10956666B2 (en) | 2015-11-09 | 2021-03-23 | Apple Inc. | Unconventional virtual assistant interactions |
US10223066B2 (en) | 2015-12-23 | 2019-03-05 | Apple Inc. | Proactive assistance based on dialog communication between devices |
EP3418881B1 (en) * | 2016-02-18 | 2020-04-01 | Sony Corporation | Information processing device, information processing method, and program |
US10586535B2 (en) | 2016-06-10 | 2020-03-10 | Apple Inc. | Intelligent digital assistant in a multi-tasking environment |
DK201670540A1 (en) | 2016-06-11 | 2018-01-08 | Apple Inc | Application integration with a digital assistant |
DK179415B1 (en) | 2016-06-11 | 2018-06-14 | Apple Inc | Intelligent device arbitration and control |
US10271093B1 (en) * | 2016-06-27 | 2019-04-23 | Amazon Technologies, Inc. | Systems and methods for routing content to an associated output device |
US11204787B2 (en) | 2017-01-09 | 2021-12-21 | Apple Inc. | Application integration with a digital assistant |
US10726832B2 (en) | 2017-05-11 | 2020-07-28 | Apple Inc. | Maintaining privacy of personal information |
DK180048B1 (en) | 2017-05-11 | 2020-02-04 | Apple Inc. | MAINTAINING THE DATA PROTECTION OF PERSONAL INFORMATION |
DK179745B1 (en) | 2017-05-12 | 2019-05-01 | Apple Inc. | SYNCHRONIZATION AND TASK DELEGATION OF A DIGITAL ASSISTANT |
DK179496B1 (en) | 2017-05-12 | 2019-01-15 | Apple Inc. | USER-SPECIFIC Acoustic Models |
DK201770427A1 (en) | 2017-05-12 | 2018-12-20 | Apple Inc. | LOW-LATENCY INTELLIGENT AUTOMATED ASSISTANT |
DK201770411A1 (en) | 2017-05-15 | 2018-12-20 | Apple Inc. | MULTI-MODAL INTERFACES |
US10303715B2 (en) | 2017-05-16 | 2019-05-28 | Apple Inc. | Intelligent automated assistant for media exploration |
US20180336892A1 (en) | 2017-05-16 | 2018-11-22 | Apple Inc. | Detecting a trigger of a digital assistant |
DK179560B1 (en) | 2017-05-16 | 2019-02-18 | Apple Inc. | FAR-FIELD EXTENSION FOR DIGITAL ASSISTANT SERVICES |
US10665522B2 (en) | 2017-12-22 | 2020-05-26 | Intel IP Corporation | Package including an integrated routing layer and a molded routing layer |
US10425780B1 (en) * | 2018-02-22 | 2019-09-24 | Amazon Technologies, Inc. | Outputting notifications using device groups |
US10818288B2 (en) | 2018-03-26 | 2020-10-27 | Apple Inc. | Natural assistant interaction |
US10928918B2 (en) | 2018-05-07 | 2021-02-23 | Apple Inc. | Raise to speak |
US11145294B2 (en) | 2018-05-07 | 2021-10-12 | Apple Inc. | Intelligent automated assistant for delivering content from user experiences |
DK179822B1 (da) | 2018-06-01 | 2019-07-12 | Apple Inc. | Voice interaction at a primary device to access call functionality of a companion device |
DK180639B1 (en) | 2018-06-01 | 2021-11-04 | Apple Inc | DISABILITY OF ATTENTION-ATTENTIVE VIRTUAL ASSISTANT |
US10892996B2 (en) | 2018-06-01 | 2021-01-12 | Apple Inc. | Variable latency device coordination |
US11462215B2 (en) | 2018-09-28 | 2022-10-04 | Apple Inc. | Multi-modal inputs for voice commands |
US11638059B2 (en) | 2019-01-04 | 2023-04-25 | Apple Inc. | Content playback on multiple devices |
WO2020142681A1 (en) * | 2019-01-04 | 2020-07-09 | Apple Inc. | Content playback on multiple devices |
US11348573B2 (en) | 2019-03-18 | 2022-05-31 | Apple Inc. | Multimodality in digital assistant systems |
US11307752B2 (en) | 2019-05-06 | 2022-04-19 | Apple Inc. | User configurable task triggers |
DK201970509A1 (en) | 2019-05-06 | 2021-01-15 | Apple Inc | Spoken notifications |
US11140099B2 (en) | 2019-05-21 | 2021-10-05 | Apple Inc. | Providing message response suggestions |
DK180129B1 (en) | 2019-05-31 | 2020-06-02 | Apple Inc. | USER ACTIVITY SHORTCUT SUGGESTIONS |
DK201970511A1 (en) | 2019-05-31 | 2021-02-15 | Apple Inc | Voice identification in digital assistant systems |
US11227599B2 (en) | 2019-06-01 | 2022-01-18 | Apple Inc. | Methods and user interfaces for voice-based control of electronic devices |
US11488406B2 (en) | 2019-09-25 | 2022-11-01 | Apple Inc. | Text detection using global geometry estimators |
JP2021081875A (ja) * | 2019-11-15 | 2021-05-27 | Ricoh Co., Ltd. | Information processing system, information processing method, information processing apparatus, and output device |
US20210185365A1 (en) * | 2019-12-11 | 2021-06-17 | Google Llc | Methods, systems, and media for providing dynamic media sessions with video stream transfer features |
US11061543B1 (en) | 2020-05-11 | 2021-07-13 | Apple Inc. | Providing relevant data items based on context |
US11043220B1 (en) | 2020-05-11 | 2021-06-22 | Apple Inc. | Digital assistant hardware abstraction |
US11755276B2 (en) | 2020-05-12 | 2023-09-12 | Apple Inc. | Reducing description length based on confidence |
US11490204B2 (en) | 2020-07-20 | 2022-11-01 | Apple Inc. | Multi-device audio adjustment coordination |
US11438683B2 (en) | 2020-07-21 | 2022-09-06 | Apple Inc. | User identification using headphones |
KR20220014752A (ko) * | 2020-07-29 | 2022-02-07 | Samsung Electronics Co., Ltd. | Electronic device and control method therefor |
US11699444B1 (en) * | 2020-10-23 | 2023-07-11 | Amazon Technologies, Inc. | Speech recognition using multiple voice-enabled devices |
US12021806B1 (en) | 2021-09-21 | 2024-06-25 | Apple Inc. | Intelligent message delivery |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004304228A (ja) * | 2003-03-28 | 2004-10-28 | Sharp Corp | Device control system, control device, and recording medium |
JP2005303423A (ja) * | 2004-04-07 | 2005-10-27 | Sony Corp | Control system, control apparatus and method, program, and recording medium |
JP2006324876A (ja) * | 2005-04-18 | 2006-11-30 | Sony Corp | Control apparatus and method, program, and recording medium |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3658896B2 (ja) * | 1996-11-26 | 2005-06-08 | Sony Corporation | Information signal transmission system, reproduction apparatus, and display apparatus |
DE69712485T2 (de) | 1997-10-23 | 2002-12-12 | Sony Int Europe Gmbh | Sprachschnittstelle für ein Hausnetzwerk |
EP1046097B1 (en) * | 1998-09-17 | 2004-03-17 | Koninklijke Philips Electronics N.V. | Remote control device with location dependent interface |
JP2000215598A (ja) * | 1999-01-27 | 2000-08-04 | Sony Corp | Digital signal transmission method, digital signal transmission system, digital signal transmission apparatus, and recording medium |
JP2002132292A (ja) * | 2000-10-26 | 2002-05-09 | Daisuke Murakami | Voice-operated home automation system |
GB2378779B (en) * | 2001-08-14 | 2005-02-02 | Advanced Risc Mach Ltd | Accessing memory units in a data processing apparatus |
US7170422B2 (en) * | 2002-06-24 | 2007-01-30 | Matsushita Electric Industrial Co., Ltd. | Personal programmable universal remote control |
JP4443989B2 (ja) * | 2003-09-10 | 2010-03-31 | Panasonic Corporation | Service request terminal device |
JP2007235613A (ja) * | 2006-03-01 | 2007-09-13 | Murata Mach Ltd | Remote control device |
US7945251B2 (en) * | 2006-03-27 | 2011-05-17 | Sony Ericsson Mobile Communications Ab | Locating a service device for a portable communication device |
JP2007318319A (ja) * | 2006-05-24 | 2007-12-06 | Seiko Epson Corp | Remote controller and control method therefor |
US20090027222A1 (en) * | 2007-07-23 | 2009-01-29 | Sony Ericsson Mobile Communications Ab | Providing services to a mobile device in a personal network |
US9509753B2 (en) * | 2014-01-08 | 2016-11-29 | Samsung Electronics Co., Ltd. | Mobile apparatus and method for controlling thereof, and touch device |
US10032008B2 (en) * | 2014-02-23 | 2018-07-24 | Qualcomm Incorporated | Trust broker authentication method for mobile devices |
-
2014
- 2014-11-27 WO PCT/JP2014/081430 patent/WO2015133022A1/ja active Application Filing
- 2014-11-27 EP EP14884408.7A patent/EP3115905A4/en not_active Ceased
- 2014-11-27 EP EP20168014.7A patent/EP3739460A1/en not_active Withdrawn
- 2014-11-27 KR KR1020167023227A patent/KR102325697B1/ko active IP Right Grant
- 2014-11-27 US US15/113,301 patent/US9848253B2/en active Active
- 2014-11-27 JP JP2016506085A patent/JP6503557B2/ja active Active
-
2017
- 2017-12-15 US US15/843,805 patent/US10244293B2/en active Active
-
2019
- 2019-02-19 US US16/279,317 patent/US10623835B2/en active Active
Cited By (59)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10776830B2 (en) | 2012-05-23 | 2020-09-15 | Google Llc | Methods and systems for identifying new computers and providing matching services |
US10650066B2 (en) | 2013-01-31 | 2020-05-12 | Google Llc | Enhancing sitelinks with creative content |
US10776435B2 (en) | 2013-01-31 | 2020-09-15 | Google Llc | Canonicalized online document sitelink generation |
US10735552B2 (en) | 2013-01-31 | 2020-08-04 | Google Llc | Secondary transmissions of packetized data |
US11750969B2 (en) | 2016-02-22 | 2023-09-05 | Sonos, Inc. | Default playback device designation |
US11863593B2 (en) | 2016-02-22 | 2024-01-02 | Sonos, Inc. | Networked microphone device control |
US11832068B2 (en) | 2016-02-22 | 2023-11-28 | Sonos, Inc. | Music service selection |
JP7346516B2 (ja) | 2016-02-22 | 2023-09-19 | Sonos, Inc. | Audio response playback |
JP2022008837A (ja) | 2016-02-22 | 2022-01-14 | Sonos, Inc. | Audio response playback |
US11983463B2 (en) | 2016-02-22 | 2024-05-14 | Sonos, Inc. | Metadata exchange involving a networked playback system and a networked microphone system |
US11513763B2 (en) | 2016-02-22 | 2022-11-29 | Sonos, Inc. | Audio response playback |
JP2022020760A (ja) | 2016-02-22 | 2022-02-01 | Sonos, Inc. | Voice control of a media playback system |
US11979960B2 (en) | 2016-07-15 | 2024-05-07 | Sonos, Inc. | Contextualization of voice inputs |
US11934742B2 (en) | 2016-08-05 | 2024-03-19 | Sonos, Inc. | Playback device supporting concurrent voice assistants |
US11727933B2 (en) | 2016-10-19 | 2023-08-15 | Sonos, Inc. | Arbitration-based voice recognition |
JP7139295B2 (ja) | 2016-12-30 | 2022-09-20 | Google LLC | System and method for multimodal transmission of packetized data |
US11087760B2 (en) | 2016-12-30 | 2021-08-10 | Google, Llc | Multimodal transmission of packetized data |
JP2019510247A (ja) | 2016-12-30 | 2019-04-11 | Google LLC | Multimodal transmission of packetized data |
EP3961372A1 (en) * | 2016-12-30 | 2022-03-02 | Google LLC | Multimodal transmission of packetized data |
US10593329B2 (en) | 2016-12-30 | 2020-03-17 | Google Llc | Multimodal transmission of packetized data |
US11381609B2 (en) | 2016-12-30 | 2022-07-05 | Google Llc | Multimodal transmission of packetized data |
US11930050B2 (en) | 2016-12-30 | 2024-03-12 | Google Llc | Multimodal transmission of packetized data |
JP2020042270A (ja) | 2016-12-30 | 2020-03-19 | Google LLC | System and method for multimodal transmission of packetized data |
US10708313B2 (en) | 2016-12-30 | 2020-07-07 | Google Llc | Multimodal transmission of packetized data |
US11705121B2 (en) | 2016-12-30 | 2023-07-18 | Google Llc | Multimodal transmission of packetized data |
US10748541B2 (en) | 2016-12-30 | 2020-08-18 | Google Llc | Multimodal transmission of packetized data |
JP2018195312A (ja) | 2017-05-19 | 2018-12-06 | NAVER Corporation | Media selection for providing information corresponding to a voice request |
US11900937B2 (en) | 2017-08-07 | 2024-02-13 | Sonos, Inc. | Wake-word detection suppression |
US11816393B2 (en) | 2017-09-08 | 2023-11-14 | Sonos, Inc. | Dynamic computation of system response volume |
US11646045B2 (en) | 2017-09-27 | 2023-05-09 | Sonos, Inc. | Robust short-time fourier transform acoustic echo cancellation during audio playback |
US11817076B2 (en) | 2017-09-28 | 2023-11-14 | Sonos, Inc. | Multi-channel acoustic echo cancellation |
US11893308B2 (en) | 2017-09-29 | 2024-02-06 | Sonos, Inc. | Media playback system with concurrent voice assistance |
JP2020528594A (ja) | 2017-10-03 | 2020-09-24 | Google LLC | Display mode dependent response generation with latency considerations |
US11823675B2 (en) | 2017-10-03 | 2023-11-21 | Google Llc | Display mode dependent response generation with latency considerations |
US11120796B2 (en) | 2017-10-03 | 2021-09-14 | Google Llc | Display mode dependent response generation with latency considerations |
JP7088703B2 (ja) | 2018-03-20 | 2022-06-21 | Sharp Corporation | Information processing system |
JP2019164615A (ja) | 2018-03-20 | 2019-09-26 | Sharp Corporation | Information processing system and information processing method |
US11797263B2 (en) | 2018-05-10 | 2023-10-24 | Sonos, Inc. | Systems and methods for voice-assisted media content selection |
US11792590B2 (en) | 2018-05-25 | 2023-10-17 | Sonos, Inc. | Determining and adapting to changes in microphone performance of playback devices |
US11973893B2 (en) | 2018-08-28 | 2024-04-30 | Sonos, Inc. | Do not disturb feature for audio notifications |
US11778259B2 (en) | 2018-09-14 | 2023-10-03 | Sonos, Inc. | Networked devices, systems and methods for associating playback devices based on sound codes |
US11790937B2 (en) | 2018-09-21 | 2023-10-17 | Sonos, Inc. | Voice detection optimization using sound metadata |
US11790911B2 (en) | 2018-09-28 | 2023-10-17 | Sonos, Inc. | Systems and methods for selective wake word detection using neural network models |
US11899519B2 (en) | 2018-10-23 | 2024-02-13 | Sonos, Inc. | Multiple stage network microphone device with reduced power consumption and processing load |
US11881223B2 (en) | 2018-12-07 | 2024-01-23 | Sonos, Inc. | Systems and methods of operating media playback systems having multiple voice assistant services |
US11817083B2 (en) | 2018-12-13 | 2023-11-14 | Sonos, Inc. | Networked microphone devices, systems, and methods of localized arbitration |
US11646023B2 (en) | 2019-02-08 | 2023-05-09 | Sonos, Inc. | Devices, systems, and methods for distributed voice processing |
JP2020129776A (ja) | 2019-02-12 | 2020-08-27 | NTT Docomo, Inc. | Control system |
JP7181116B2 (ja) | 2019-02-12 | 2022-11-30 | NTT Docomo, Inc. | Control system |
WO2020195388A1 (ja) | 2019-03-26 | 2020-10-01 | Panasonic Intellectual Property Management Co., Ltd. | Information notification system and information notification method |
US11798553B2 (en) | 2019-05-03 | 2023-10-24 | Sonos, Inc. | Voice assistant persistence across multiple network microphone devices |
US11854547B2 (en) | 2019-06-12 | 2023-12-26 | Sonos, Inc. | Network microphone device with command keyword eventing |
US11714600B2 (en) | 2019-07-31 | 2023-08-01 | Sonos, Inc. | Noise classification for event detection |
US11862161B2 (en) | 2019-10-22 | 2024-01-02 | Sonos, Inc. | VAS toggle based on device orientation |
US11869503B2 (en) | 2019-12-20 | 2024-01-09 | Sonos, Inc. | Offline voice control |
US11887598B2 (en) | 2020-01-07 | 2024-01-30 | Sonos, Inc. | Voice verification for media playback |
US11961519B2 (en) | 2020-02-07 | 2024-04-16 | Sonos, Inc. | Localized wakeword verification |
US11881222B2 (en) | 2020-05-20 | 2024-01-23 | Sonos, Inc | Command keywords with input detection windowing |
US11984123B2 (en) | 2020-11-12 | 2024-05-14 | Sonos, Inc. | Network device interaction by range |
Also Published As
Publication number | Publication date |
---|---|
JP6503557B2 (ja) | 2019-04-24 |
US20180109853A1 (en) | 2018-04-19 |
US9848253B2 (en) | 2017-12-19 |
EP3739460A1 (en) | 2020-11-18 |
US10244293B2 (en) | 2019-03-26 |
US10623835B2 (en) | 2020-04-14 |
KR102325697B1 (ko) | 2021-11-15 |
EP3115905A1 (en) | 2017-01-11 |
US20190182566A1 (en) | 2019-06-13 |
EP3115905A4 (en) | 2017-10-25 |
US20170013331A1 (en) | 2017-01-12 |
JPWO2015133022A1 (ja) | 2017-04-06 |
KR20160127737A (ko) | 2016-11-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10623835B2 (en) | Information processing apparatus, information processing method, and program | |
US11741979B1 (en) | Playback of audio content on multiple devices | |
US20190304448A1 (en) | Audio playback device and voice control method thereof | |
US9431021B1 (en) | Device grouping for audio based interactivity | |
KR102264600B1 (ko) | System and method for an adaptive notification network | |
US9084003B1 (en) | Methods, systems, and media for media playback | |
US8497796B2 (en) | Methods and apparatus for controlling one or more electronic devices based on the location of a user | |
JP6542419B2 (ja) | Media system access via a cellular network | |
CN106209800B (zh) | Device permission sharing method and apparatus | |
US11736760B2 (en) | Video integration with home assistant | |
JP2016523017A (ja) | Transfer of a playback queue of a media playback system | |
WO2020005563A1 (en) | Privacy chat trigger using mutual eye contact | |
CN113613046A (zh) | Managing playback groups | |
US11233490B2 (en) | Context based volume adaptation by voice assistant devices | |
US12001754B2 (en) | Context based media selection based on preferences setting for active consumer(s) | |
JP2017144521A (ja) | Information processing apparatus, information processing method, and program | |
US11445269B2 (en) | Context sensitive ads | |
CN111630413B (zh) | Confidence-based application-specific user interaction | |
US20130117385A1 (en) | Personal area network of devices and applications | |
WO2020105466A1 (ja) | Information processing apparatus and information processing method | |
WO2019239738A1 (ja) | Information processing apparatus and information processing method | |
JP6927331B2 (ja) | Information processing apparatus, information processing method, and program | |
WO2022209227A1 (ja) | Information processing terminal, information processing method, and program | |
JP2016057439A (ja) | Terminal device and program | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14884408 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2016506085 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15113301 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 20167023227 Country of ref document: KR Kind code of ref document: A |
|
REEP | Request for entry into the european phase |
Ref document number: 2014884408 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2014884408 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |