WO2015198716A1 - Information processing apparatus, information processing method, and program - Google Patents
Information processing apparatus, information processing method, and program
- Publication number
- WO2015198716A1 (PCT/JP2015/062725)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- atmosphere
- user
- information
- information processing
- content
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/431—Generation of visual interfaces for content selection or interaction; Content or additional data rendering
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/442—Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
- H04N21/44213—Monitoring of end-user related data
- H04N21/44222—Analytics of user selections, e.g. selection of programs or purchase activity
- H04N21/44224—Monitoring of user activity on external systems, e.g. Internet browsing
- H04N21/44226—Monitoring of user activity on external systems, e.g. Internet browsing on social networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/95—Retrieval from the web
- G06F16/953—Querying, e.g. by the use of web search engines
- G06F16/9535—Search customisation based on user profiles and personalisation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N5/00—Computing arrangements using knowledge-based models
- G06N5/02—Knowledge representation; Symbolic representation
- G06N5/022—Knowledge engineering; Knowledge acquisition
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/4104—Peripherals receiving signals from specially adapted client devices
- H04N21/4131—Peripherals receiving signals from specially adapted client devices home appliance, e.g. lighting, air conditioning system, metering devices
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/436—Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/436—Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
- H04N21/43615—Interfacing a Home Network, e.g. for connecting the client to a plurality of peripherals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/45—Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
- H04N21/4508—Management of client data or end-user data
- H04N21/4532—Management of client data or end-user data involving end-user characteristics, e.g. viewer profile, preferences
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/45—Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
- H04N21/466—Learning process for intelligent management, e.g. learning user preferences for recommending movies
- H04N21/4667—Processing of monitored end-user data, e.g. trend analysis based on the log file of viewer selections
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/472—End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
Definitions
- the present disclosure relates to an information processing apparatus, an information processing method, and a program.
- Patent Document 1 discloses a system that automatically forms an atmosphere such as an illumination atmosphere based on a keyword input such as a keyword typed by the user or spoken by the user.
- colors, sounds, and images are exemplified as the atmosphere.
- Patent Document 2 discloses a technology that includes light sources producing a plurality of colors along at least one side surrounding a television main screen, each of the plurality of colors being adapted to the video content in the spatially corresponding area of the screen.
- In Patent Document 1, an atmosphere is formed based on a keyword specified by the user, not on the content, and the atmosphere is not formed automatically in step with the presentation of information on the content while the content is in use. Therefore, the technique disclosed in Patent Document 1 cannot provide a sense of realism linked to the content. Further, the technique disclosed in Patent Document 2 does not reflect the user's preference or operation in the colors formed to match the video content. If, when forming the colors, the output of the colors to be formed were adjusted to reflect the user's preference and operation, the user could enjoy the atmosphere linked to the content more comfortably.
- According to the present disclosure, there is provided an information processing apparatus including a control unit that controls an atmosphere forming device so that an atmosphere is formed in conjunction with presentation of information on content used by a user, the atmosphere being a stimulus perceivable by the user, different from the presentation of the information, and reflecting the user's preference or operation.
- According to the present disclosure, there is also provided an information processing method in which a control device controls an atmosphere forming device so that an atmosphere is formed in conjunction with presentation of information on content used by a user, the atmosphere being a stimulus perceivable by the user, different from the presentation of the information, and reflecting the user's preference or operation.
- Further, according to the present disclosure, there is provided a program that causes a computer to execute a function of controlling an atmosphere forming device so that an atmosphere is formed in conjunction with presentation of information on content used by a user, the atmosphere being a stimulus perceivable by the user, different from the presentation of the information, and reflecting the user's preference or operation.
- As described above, according to the present disclosure, the user can enjoy the content more comfortably.
- 1. First embodiment (reflecting user preferences): 1.1. Overview of atmosphere forming system; 1.2. Hardware configuration; 1.3. Software configuration (1.3.1. Content analysis unit; 1.3.2. Atmosphere management unit; 1.3.3. Knowledge base system unit; 1.3.4. Device management unit; 1.3.5. Output control unit); 1.4. Atmosphere control processing (1.4.1. Content extraction processing; 1.4.2. Feature amount extraction processing; 1.4.3. Atmosphere table generation processing; 1.4.4. Time series graph generation processing; 1.4.5. Device optimization processing); 1.5. Usage examples (1.5.1. Example when watching a movie; 1.5.2. Example when listening to music; 1.5.3. Example when watching a video)
- 2. Second embodiment (combined processing of multiple atmospheres)
- 3. Third embodiment (reflecting user operations): 3.1. Overview of atmosphere forming system; 3.2. Configuration example of atmosphere forming system; 3.3. Atmosphere adjustment processing (3.3.1. Atmosphere adjustment by direct operation; 3.3.2. Atmosphere adjustment by voice operation; 3.3.3. Atmosphere adjustment by gesture operation; 3.3.4. Atmosphere adjustment by other operations)
- 4. Fourth embodiment (atmosphere formation grasping the surrounding environment)
- 5. Fifth embodiment (device supporting multiple atmosphere outputs)
- the atmosphere forming system is a system that forms an atmosphere in a space where a user is present in conjunction with presentation of information on contents used by the user. A user's preference is reflected in the atmosphere formed by the atmosphere forming system according to the present embodiment.
- Examples of content include various information that can be used by the user, such as movies, videos, music, images, EPG (Electronic Program Guide) data, SNS (Social Network Service) posts, and schedule data. That is, the content includes visual information, audio information, and other information presented to the user.
- The atmosphere forming system extracts feature amounts from these contents and collates the feature amounts with the knowledge database to generate the atmosphere information to be formed in the space.
- the user's preference is reflected in the atmosphere information generated here.
- The atmosphere forming system according to this embodiment forms the atmosphere of the space with the optimal atmosphere forming device according to the generated atmosphere information.
- the atmosphere forming system can also form an atmosphere at an appropriate timing. Therefore, the user who is using the content can experience a sense of realism along with the presentation of the content information.
- the content information includes an element from which a feature amount can be extracted.
- a movie includes sound and subtitles
- music includes lyrics and the like
- an image can be converted into keywords and the like
- a video includes sound and the like.
- schedule includes ToDo list text.
- the atmosphere forming system generates atmosphere information based on these elements included in the content, and generates an atmosphere in the space.
- An “atmosphere” can be said to be the whole of the light, sound, scent, and other signs surrounding a specific place, event, thing, or person that a person can perceive.
- the visual, auditory, olfactory, and tactile sensations are stimulated to form an atmosphere that is linked to the presentation of content information.
- the “atmosphere” formed does not include presentation of information on the content itself.
- Home appliances such as lighting fixtures, speakers, electric fans, aroma diffusers, and air conditioners can be used as the atmosphere forming device.
- FIG. 1 is an explanatory diagram schematically showing a hardware configuration of the atmosphere forming system 10.
- the atmosphere forming system 10 includes an input device 20, an atmosphere forming device 30, an information processing device 60, a storage device 40, and a communication device 50.
- the input device 20 includes a playback device for playing back or displaying content such as a video recorder, a personal computer, a smartphone, or a music player. Information on the content to be reproduced or displayed is input to the information processing apparatus 60.
- the input device 20 includes an imaging device and a sensor device. Information detected by the imaging device or the sensor device is input to the information processing device 60 and used for output control of the atmosphere forming device 30. Examples of the imaging device include a surveillance camera and a color recognition camera. Examples of the sensor device include an illuminance sensor, a temperature sensor, a humidity sensor, and a position detection module such as an RFID (Radio Frequency Identifier).
- the atmosphere forming device 30 is a device used to form an atmosphere in the space.
- the atmosphere forming device 30 is subjected to output control by the information processing device 60.
- a device capable of changing visual, auditory, olfactory, and tactile sensations can be used.
- FIG. 2 shows examples of output devices (atmosphere forming devices) capable of acting on the human visual, auditory, olfactory, and tactile senses.
- As a device that can act on vision, a device that affects the light visually recognized by the user, using color or video as a medium, can be used.
- Examples include lighting fixtures, indirect lighting, and display devices such as televisions, smartphones, and tablet computers.
- As a device that can act on hearing, a device that affects the sound recognized by the user, using BGM, music, or sound effects as a medium, can be used.
- An example is a speaker device.
- As a device that can act on the sense of smell, a device that affects the fragrance recognized by the user, using air as a medium, can be used.
- an aroma diffuser, an air conditioner, and a fan can be used.
- As a device that can act on the tactile sense, a device that affects the user's bodily sensation using vibration or air flow can be used.
- For example, a smartphone or wearable device capable of generating vibration, or an air conditioner or fan capable of adjusting airflow, temperature, or humidity can be used.
- Speakers, displays, lighting fixtures, aroma diffusers, fans, air conditioners, and the like are all home appliances. By connecting these home appliances, an atmosphere can be formed without using special equipment.
- these devices are merely examples, and other devices may be used.
- the information processing apparatus 60 includes a CPU, a storage element such as a RAM and a ROM, a storage apparatus 40 that stores various data, and a communication apparatus 50 for communicating with an external device or the like.
- the ROM stores a control program executed by the CPU.
- the calculation results of the CPU and various information input from the input device 20 are written in the RAM.
- the CPU functions as the control unit 100 in the present disclosure.
- the storage device 40 stores data such as a knowledge database, a user database, and a device profile.
- the knowledge database is a database in which various feature amounts extracted from content and atmosphere information are mapped.
- the user database is a database that stores user preference information.
- the communication device 50 is configured as a device for communicating with an external communication network such as the Internet, for example.
- FIG. 3 is a block diagram functionally showing the software configuration of the control unit 100 of the information processing apparatus 60.
- the control unit 100 includes a content analysis unit 110, an atmosphere management unit 130, a knowledge base system unit 150, a device management unit 170, and an output control unit 190.
- each of these units can be a functional unit realized by executing a program by the CPU.
- the content analysis unit 110 performs a process of extracting the feature amount of the content by analyzing the content information.
- the content analysis unit 110 is configured to be able to execute each analysis process of time series analysis, image analysis, sound analysis, and language analysis.
- the time series analysis is performed within the content timeline.
- In image analysis, an object in an image in the content information is recognized, and a feature amount related to the image is extracted.
- Voice analysis performs processing that converts speech into text by speech recognition. Voice analysis may also extract features of the audio itself, such as excitement, genre, volume, tempo, and sound quality.
- Language analysis performs morphological analysis, syntax analysis, and the like, and performs processing for extracting keywords and the like as feature quantities.
- The content analysis unit 110 separates the timeline, images, audio, and text in the content, and extracts feature amounts from each separated data stream. For example, for an image, the content analysis unit 110 converts the feature amount related to the image into a keyword and extracts it. For audio, the content analysis unit 110 can first convert the audio into text by voice analysis and then extract feature amounts from keywords by language analysis. The content analysis unit 110 may also execute analysis processes other than the above.
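As a non-authoritative sketch (not the patented implementation), the separation and per-modality feature extraction described above might look like the following; the content structure and the keyword heuristic are illustrative assumptions.

```python
# Illustrative sketch: separate content into modalities and extract
# feature keywords per modality, as the content analysis unit does.

def extract_keywords(text):
    # Stand-in for morphological/syntax analysis: keep words longer than 3 chars.
    return [w.strip(".,").lower() for w in text.split() if len(w.strip(".,")) > 3]

def analyze_content(content):
    """Split content into per-modality streams and extract feature keywords."""
    features = {}
    for modality in ("image", "audio", "text"):
        data = content.get(modality, [])
        if modality == "image":
            # Image analysis: recognized objects become keywords.
            features[modality] = [obj["label"] for obj in data]
        elif modality == "audio":
            # Voice analysis: speech-to-text, then language analysis on the text.
            transcript = " ".join(seg["text"] for seg in data)
            features[modality] = extract_keywords(transcript)
        else:
            features[modality] = extract_keywords(" ".join(data))
    return features
```

In use, each timeline point of the content would be passed through such a function to obtain the keywords later used as knowledge-base query keys.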
- the atmosphere management unit 130 inquires the knowledge base system unit 150 about the feature amount of the content extracted by the content analysis unit 110, and generates atmosphere information to be formed using the atmosphere forming device 30 based on the returned result. Specifically, the atmosphere management unit 130 acquires the atmosphere parameters (color, temperature, etc.) by making an inquiry to the knowledge base system unit 150 using the feature value at a certain point in the timeline as a key. Then, the atmosphere management unit 130 generates an atmosphere table by filling the acquired atmosphere parameters in each slot of the atmosphere table prepared in advance.
- When generating the atmosphere table, the atmosphere management unit 130 generates a personalized atmosphere table reflecting the user preference information returned from the knowledge base system unit 150 together with the atmosphere parameters.
- If an atmosphere parameter to be output matches the user's preference, the output of that atmosphere parameter is emphasized.
- If an atmosphere parameter to be output is contrary to the user's preference, the output of that atmosphere parameter may be weakened, or the parameter may not be output.
- Alternatively, an atmosphere parameter contrary to the user's preference may simply not be selected.
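The emphasis and suppression of atmosphere parameters by preference, described in the points above, might be sketched as follows; the scaling factors and data shapes are illustrative assumptions, not values from the disclosure.

```python
def apply_preference(params, likes, dislikes, boost=1.5, damp=0.5):
    """Adjust atmosphere parameter output strengths by user preference.

    params: {name: (value, strength)}, e.g. {"color": ("green", 1.0)}
    A parameter matching a liked value is emphasized; one matching a
    disliked value is weakened, or dropped entirely when damp is 0.
    """
    adjusted = {}
    for name, (value, strength) in params.items():
        if value in likes:
            adjusted[name] = (value, strength * boost)   # emphasize
        elif value in dislikes:
            if damp > 0:
                adjusted[name] = (value, strength * damp)  # weaken
            # damp == 0: do not output the parameter at all
        else:
            adjusted[name] = (value, strength)
    return adjusted
```

Setting `damp` to zero corresponds to the variant in which a parameter contrary to the user's preference is not output at all.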
- The atmosphere management unit 130 connects the atmosphere tables generated along the timeline in time series, and generates a time series graph of the atmosphere, that is, a graph indicating how the atmosphere changes over time. Further, the atmosphere management unit 130 may detect a climax in the timeline by detecting the excitement of the sound from the generated time series graph, and may emphasize each atmosphere parameter in the corresponding atmosphere table. In the present embodiment, the time series graph of the atmosphere generated in this way is sent to the device management unit 170 as atmosphere information.
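The connection of atmosphere tables into a time series graph, with emphasis at detected climaxes, might be sketched as follows; the climax criterion (a normalized loudness threshold) and the boost factor are illustrative assumptions.

```python
def build_time_series(tables, loudness, climax_threshold=0.8, boost=1.3):
    """Connect per-timepoint atmosphere tables into a time series graph.

    tables:   list of {param: strength} dicts, one per timeline point
    loudness: list of normalized sound-excitement values, same length
    Points where the sound excitement reaches the threshold are treated
    as a climax, and every parameter in that table is emphasized.
    """
    series = []
    for table, level in zip(tables, loudness):
        factor = boost if level >= climax_threshold else 1.0
        series.append({param: strength * factor
                       for param, strength in table.items()})
    return series
```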
- The knowledge base system unit 150 refers to the knowledge database stored in the storage device 40, acquires the atmosphere parameter corresponding to the feature amount of the inquired content, and returns it to the atmosphere management unit 130. If the corresponding atmosphere parameter cannot be obtained from the knowledge database, a related atmosphere parameter may be acquired by accessing an external search engine on the Internet via the communication device 50. At the same time, the knowledge base system unit 150 refers to the user database stored in the storage device 40 and returns user preference information to the atmosphere management unit 130.
- the knowledge base system unit 150 will be described together with an example of a database to which the knowledge base system unit 150 refers.
- the knowledge database used in the present embodiment stores various data in which feature amounts and atmosphere parameters are mapped.
- parameters such as color, fragrance (aroma), wind, temperature, sound, and vibration exist.
- parameter information such as color and fragrance is defined as a core database, and atmospheric parameters corresponding to key feature quantities can be acquired by referring to the core database.
- color and scent atmosphere parameters will be described as an example of the core database.
- Fig. 4 shows an example of a color database.
- green is a color reminiscent of healing
- red is a color reminiscent of passion
- blue is a color reminiscent of intellectual coolness.
- In the color database, the image (symbol) represented by each color and the impressions associated with the color are exemplified.
- the knowledge database stores various knowledge data in which keywords and colors are mapped.
- the knowledge database stores knowledge in a triple of “forest-color-green (forest is green)” with respect to the feature quantity of “forest”. Therefore, the knowledge base system unit 150 acquires the color parameter “green” by referring to the knowledge database for the keyword “forest”.
- A scent can change a person's feelings, for example calming or exciting them.
- The image associated with each aroma system varies. Therefore, by generating an aroma corresponding to an image that matches the semantic concept of a keyword, the user's sense of smell is stimulated, and the scene associated with the keyword is conveyed to the user more strongly as a sense of presence.
- Fig. 5 shows an example of an aroma database.
- Aroma systems are divided into seven types, and the image represented by the scent of each system and the characteristics of each scent are illustrated.
- the knowledge database stores various knowledge data in which keywords and scents are mapped.
- the knowledge database stores knowledge in a triple of “nature-aroma-tree system (nature is the scent of tree system)”.
- The semantic concept of “forest” is generalized by the ontology “forest-upper-nature” (the superordinate concept of forest is nature).
- Therefore, the knowledge base system unit 150 acquires the scent parameter “tree system” by referring to the knowledge database for the keyword “forest”. The semantic generalization (superordinate conceptualization) of a keyword may be repeated until it matches some knowledge data.
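The repeated superordinate conceptualization can be sketched as a bounded lookup over triples; the triple-store format here is an illustrative assumption.

```python
# Illustrative sketch of the knowledge lookup: if no triple maps the
# keyword to the requested parameter, climb the "upper" (superordinate)
# relation and retry, as in forest -> nature -> "nature-aroma-tree".

TRIPLES = {
    ("forest", "upper"): "nature",
    ("forest", "color"): "green",
    ("nature", "aroma"): "tree",
}

def lookup(keyword, relation, max_hops=5):
    """Resolve a keyword to an atmosphere parameter, generalizing as needed."""
    for _ in range(max_hops):
        if (keyword, relation) in TRIPLES:
            return TRIPLES[(keyword, relation)]
        keyword = TRIPLES.get((keyword, "upper"))
        if keyword is None:
            return None  # no superordinate concept left to try
    return None
```

The `max_hops` bound keeps the generalization loop from running forever on a cyclic or incomplete ontology.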
- the user database used in the present embodiment is a database in which user profiles are stored, and stores user preferences regarding colors, scents, people, places, and the like. Such a user database is used to form an atmosphere in which the user's preference is reflected in the space by emphasizing the atmosphere parameter according to the user's preference.
- FIG. 6 shows an example of a user database.
- the user database stores properties such as individual gender, nationality, age, interest, likes and dislikes, and behavior history.
- the properties shown here are merely examples, and appropriate property information can be used.
- Two modes, a macro level and a micro level, are defined for each property. With this level mode, the degree to which the user's preference is reflected can be set relative to the initial value of the generated atmosphere parameter, that is, the analysis result that does not reflect the user's preference.
- The macro level can be selected when it is desired to reflect the user's preference while reproducing the atmosphere faithfully to the initial values of the atmosphere parameters.
- The micro level can be selected when it is desired to reflect the user's individual preference more strongly.
- the mode may be selectable by the user himself or may be defined in advance by the information processing apparatus 60.
- The macro level can be defined as a group level, in which the number of values (variations) that a predetermined property of the user profile can take is equal to or less than a predetermined value. For example, the gender and age properties have few variations: gender is male or female, and age is divided into at most about ten groups.
- the micro level is positioned opposite to the group level, and can be defined as a level in which the number of values (variations) that can be taken with a predetermined property can exceed a predetermined value. For example, a favorite entertainer's property has a large number of possible candidates and is classified as a property exceeding a predetermined value.
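The macro/micro distinction, properties with few possible values versus properties whose values can exceed a predetermined number, might be classified as in this sketch; the threshold of ten is an assumed example value.

```python
def property_level(possible_values, threshold=10):
    """Classify a user-profile property as macro (group) or micro (individual).

    A property whose number of possible values is at or below the threshold
    (e.g. gender, age group) is macro level; one whose values can exceed it
    (e.g. favorite entertainer) is micro level.
    """
    return "macro" if len(possible_values) <= threshold else "micro"
```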
- the knowledge base system unit 150 refers to the user database when an atmosphere parameter inquiry is received from the atmosphere management unit 130 and returns the current user preference information to the atmosphere management unit 130 together with the atmosphere parameter.
- The current user may be specified based on, for example, login data to the atmosphere forming system 10, or based on login data to the input device 20 or the atmosphere forming device 30 used to reproduce the content.
- The device management unit 170 manages the sensor information input from the various sensor devices constituting the input device 20, and selects the atmosphere forming device 30 that performs optimum output control based on the atmosphere information generated by the atmosphere management unit 130. For example, the device management unit 170 grasps the current atmosphere based on the sensor information, and selects the optimum atmosphere forming device 30 for forming an atmosphere in line with the atmosphere information. Information on the selected atmosphere forming device 30 is sent to the output control unit 190.
- the device management unit 170 refers to the user's device profile, extracts atmosphere parameters that can be output, and transmits a command signal to the output control unit 190 that performs output control of the atmosphere forming device 30. At this time, the output of the atmosphere forming device 30 around the user needs to correspond to the output value corresponding to the atmosphere information to be formed. Therefore, the device management unit 170 refers to the device profile stored in the storage apparatus 40, and selects the atmosphere forming device 30 that can be output from the atmosphere forming devices 30 existing in the space based on the atmosphere information.
- FIG. 7 shows an example of a device profile.
- the device profile includes, for example, information on the name of the atmosphere forming device 30, an output format, and perceptual grasp characteristics.
- the device profile is created for each user according to the atmosphere forming device 30 existing in the space and stored in the storage device 40.
- the perception grasp characteristic is information indicating immediacy until the user perceives a change in the atmosphere when the atmosphere changes in accordance with the output of the atmosphere forming device 30.
- The perceptual grasp characteristic is represented by three levels: “1. Immediate”, “2. Normal”, and “3. Delayed”. The lower the level, the more quickly the user can grasp the change in the atmosphere; the higher the level, the more slowly the user perceives the change in the atmosphere.
- the change in the color of the lighting fixture is set to level 1 because the user can immediately perceive it visually at the changed stage.
- the change in the amount of air blown by the electric fan is set to level 2 because it takes a little time to be perceived by touch after the wind is sent.
- the change of the scent of the aroma diffuser is set to level 3 because it takes a relatively long time for the user to perceive it with the sense of smell.
- the difference in immediacy is due to the time from when the perceptual elements such as color, wind, and scent are output from the atmosphere forming device 30 until they reach the user.
- The time difference arises because the medium transmitting the sensory element is light (color), sound waves (sound), air flow (wind), or diffusion (fragrance).
- The device management unit 170 refers to the device profile and decides the timing of output from each atmosphere forming device 30 so that the user perceives the change in the atmosphere of the space at the optimal timing in conjunction with the content information.
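The timing decision based on perceptual grasp characteristics could be sketched as follows: devices whose output is perceived more slowly are started earlier so that all changes are perceived together with the content event. The lead-time values per level are illustrative assumptions, not values from the disclosure.

```python
# Assumed example lead times (seconds) per perceptual grasp level:
# 1 = immediate, 2 = normal, 3 = delayed.
LEAD_TIME = {1: 0.0, 2: 2.0, 3: 10.0}

def schedule_outputs(event_time, devices):
    """Return per-device output start times for a content event.

    devices: {name: perceptual_grasp_level}
    Slower-perceived devices start earlier than the event so that every
    atmosphere change is perceived at the same moment as the content.
    """
    return {name: event_time - LEAD_TIME[level]
            for name, level in devices.items()}
```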
- The output control unit 190 outputs a control signal to the atmosphere forming device 30 in accordance with a command from the device management unit 170, so that an atmosphere corresponding to the atmosphere information generated to reflect the user's preference is formed.
- FIG. 8 is a flowchart of the atmosphere control process.
- FIG. 9 is a diagram showing in detail the flow of the atmosphere control process.
- In step S10, the content analysis unit 110 separates the timeline, images, audio, and text in the content.
- In step S20, the content analysis unit 110 extracts a feature amount from each separated data stream.
- the content analysis unit 110 converts the feature amount of the image data into a keyword.
- the content analysis unit 110 converts the voice data into text by voice analysis, and then extracts a feature amount by language analysis.
- the atmosphere management unit 130 inquires of the knowledge base system unit 150 about an atmosphere parameter corresponding to the feature amount using the extracted feature amount as a key.
- the knowledge base system unit 150 refers to the knowledge database, extracts the atmosphere parameter corresponding to the feature amount of the inquired content, and returns it to the atmosphere management unit 130. Further, the knowledge base system unit 150 refers to the user database, extracts the current user preference information, and returns it to the atmosphere management unit 130 together with the atmosphere parameters.
- the atmosphere management unit 130 generates a personalized atmosphere table based on the atmosphere parameters and user preference information returned from the knowledge base system unit 150. Such a personalized atmosphere table is generated for each time point on the timeline. Specifically, the personalized atmosphere table generation process can be performed as follows.
- the knowledge base system unit 150 refers to the core database and extracts the triple knowledge data corresponding to the keyword as follows.
- the extracted knowledge data is returned to the atmosphere management unit 130 as atmosphere parameters:
  - "Forest - is-a - Nature"
  - "Forest - color - green"
  - "Nature - aroma - tree"
  - "Forest - wind - weak"
  - "Forest - sound - birdsong"
  - "Forest - vibration - none"
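- the triple lookup can be sketched as follows (an illustrative Python fragment; the in-memory triple store and the inheritance through the "is-a" relation are assumptions modeled on the example triples above, not the actual knowledge database):

```python
# Minimal sketch of the triple lookup: the core database holds
# "subject - relation - value" knowledge triples, and a keyword query
# returns the matching relations as atmosphere parameters.

KNOWLEDGE_TRIPLES = [
    ("Forest", "is-a",      "Nature"),
    ("Forest", "color",     "green"),
    ("Nature", "aroma",     "tree"),
    ("Forest", "wind",      "weak"),
    ("Forest", "sound",     "birdsong"),
    ("Forest", "vibration", "none"),
]

def atmosphere_parameters(keyword: str) -> dict:
    """Collect {relation: value} for the keyword, following one is-a link."""
    params = {r: v for s, r, v in KNOWLEDGE_TRIPLES if s == keyword}
    parent = params.pop("is-a", None)
    if parent:  # inherit parameters from the superordinate concept
        for s, r, v in KNOWLEDGE_TRIPLES:
            if s == parent and r not in params:
                params[r] = v
    return params
```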
- when extracting the atmosphere parameters, the knowledge base system unit 150 also refers to the user database and acquires user preference information as illustrated in FIG. As described above, macro-level and micro-level modes are associated with the properties of the user preference information. The acquired user preference information is returned to the atmosphere management unit 130 together with the atmosphere parameters.
- the knowledge base system unit 150 may identify the user based on login data to the atmosphere forming system 10, or based on login data to the input device 20 or the atmosphere forming device 30 that reproduces the content.
- the atmosphere management unit 130 generates a personalized atmosphere table reflecting the user's preference, based on the atmosphere parameters and the user's preference information. Among the properties of the user preference information, macro-level properties are classified by broad frameworks such as gender and age. For these properties, the output of atmosphere parameters can be emphasized based on general knowledge, current trends, and historical trends: for example, the colors and scents preferred by men or women, the colors and scents preferred by each age group, and the content popular in each era.
- the output and type of the atmosphere parameters can also be changed for each user based on preferences such as behavior history and likes and dislikes. For example, if the behavior history information shows that the user has recently been to a certain location, the output of the atmosphere parameter may be emphasized when the content is related to that location. When changing the output of the atmosphere parameter based on likes-and-dislikes preference information, the following may be performed.
- considering the user's preference information, the atmosphere management unit 130 defines the atmosphere parameters returned from the knowledge base system unit 150 that are positive for the user as positive, and those that are negative for the user as negative.
- the atmosphere management unit 130 emphasizes the output of positive atmosphere parameters. For example, if the atmosphere parameter relates to color, the output intensity may be set higher than the initial setting value. Conversely, for negative atmosphere parameters, the atmosphere management unit 130 may weaken the output intensity or stop the output so that the user does not feel uncomfortable. When there are a plurality of atmosphere parameter candidates, negative atmosphere parameters may simply not be selected.
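- the positive/negative adjustment described above can be sketched as follows (the boost and suppression factors are illustrative values, not values from the embodiment):

```python
# Sketch of the positive/negative adjustment: parameter values matching the
# user's likes are boosted above the initial intensity, and values matching
# dislikes are suppressed so the user does not feel uncomfortable.

def personalize(params: dict, likes: set, dislikes: set,
                base: float = 1.0) -> dict:
    out = {}
    for name, value in params.items():
        if value in likes:        # positive parameter: emphasize the output
            out[name] = (value, base * 1.5)
        elif value in dislikes:   # negative parameter: suppress the output
            out[name] = (value, 0.0)
        else:                     # neutral: keep the initial intensity
            out[name] = (value, base)
    return out

adjusted = personalize({"color": "green", "aroma": "rose"},
                       likes={"green"}, dislikes={"rose"})
```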
- the atmosphere parameters include direct-type parameters, such as color and scent, whose output contents change directly according to preference.
- when adjusting the output intensity according to color likes and dislikes, if the user's favorite color is blue, the blue output may be set higher than the initial setting value.
- for other colors, the output may be kept at or below the initial setting value. As a result, the user sees his or her favorite color highlighted.
- the atmosphere parameters also include related-type parameters whose output contents, unlike the direct-type parameters, do not change directly according to preference.
- the relationship between the tag information associated with the content and the analysis result may be grasped, and the output of the atmosphere parameter may be emphasized.
- the atmosphere parameter output may be emphasized when content related to the home country appears.
- the output of the atmosphere parameter may be emphasized when a popular content appears in the corresponding age.
- the atmosphere parameter closest to the user's preference is selected. For example, when red and blue are both color candidates to be output, blue may be selected if the user is male and red if the user is female. Alternatively, the output color may be forcibly replaced with the user's favorite color.
- FIG. 10 shows an atmosphere table generated by the atmosphere management unit 130 based on the atmosphere parameter information returned from the knowledge base system unit 150.
- such an atmosphere table includes the user's degree of interest in the keyword representing the feature amount; the example illustrated in FIG. 10 does not reflect the user's preference information, so the degree of interest is "normal".
- the atmosphere management unit 130 sets “normal” for the slots not included in the information returned from the knowledge base system unit 150 among the slots of the atmosphere table.
- FIG. 11 shows a personalized atmosphere table generated by the atmosphere management unit 130 based on the atmosphere parameter information and the user preference information.
- the original data of each atmosphere parameter extracted from the knowledge database is defined as a basic parameter
- the basic strength of the output of each atmosphere parameter is defined as basic emphasis.
- data that reflects user preferences with respect to basic parameters is defined as personalization parameters
- data that reflects user preferences with respect to basic emphasis is defined as personalization emphasis.
- each atmosphere parameter is adjusted so that the formed atmosphere becomes positive for the user and does not become negative, reflecting the preference information of the user.
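- one slot of such a personalized atmosphere table can be pictured as follows (field names and the boost value are hypothetical; only the basic/personalization split is taken from the description above):

```python
# Sketch of one slot of the personalized atmosphere table: the basic
# parameter and basic emphasis come from the knowledge database, and the
# personalization parameter and personalization emphasis overlay the
# user's preference on top of them.

def make_slot(basic_param, basic_emphasis, preference=None):
    slot = {
        "basic_parameter": basic_param,
        "basic_emphasis": basic_emphasis,
        "personalization_parameter": basic_param,
        "personalization_emphasis": basic_emphasis,
    }
    if preference and preference.get("value") == basic_param:
        # the value matches a favorite: raise the emphasis above the basic level
        slot["personalization_emphasis"] = basic_emphasis * preference["boost"]
    return slot

slot = make_slot("green", 1.0, preference={"value": "green", "boost": 1.5})
```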
- in step S40, the atmosphere management unit 130 connects the personalized atmosphere tables generated along the timeline in time series, generating a time-series graph that represents the change in atmosphere over time.
- the atmosphere management unit 130 may also detect the climax of each timeline by detecting the climax of the sound, and may further emphasize the atmosphere parameters of the atmosphere table there.
- FIG. 12 shows an example of a time-series graph regarding colors.
- a time-series graph of green, as a specific color, is shown.
- since the user preference information includes the fact that the user's favorite color is green,
- the personalization emphasis of the green output is set higher than the basic emphasis in the personalized atmosphere table. Therefore, the time-series graph (solid line) reflecting the user's preference has a higher output intensity than the graph (dashed line) obtained by connecting, in time series, the atmosphere tables based on the original data extracted from the knowledge database.
- the intensity of the atmosphere parameter output can be adjusted to reflect the user's preference for a favorite scent, a favorite place, a favorite artist, and the like.
- Identification of information such as a place and an artist included in the content may be performed by object recognition of the image, identification of a voice speaker, analysis of the geotag (latitude / longitude information) of the content, and the like.
- a place, an artist, or the like may be specified by reading a person tag or a place tag that is already embedded for each timeline of content.
- FIG. 13 shows an example of a time-series graph in which the climax part in the content is detected and the intensity of the atmosphere parameter output is adjusted.
- in the graph (solid line), the output intensity at the climax is higher than in the graph (dashed line) obtained by connecting the atmosphere tables based on the original data extracted from the knowledge database in time series.
- the climax in the content can be detected, for example, by creating a climax distribution over the content using an existing technique that detects audio excitement by analyzing the sound information. Reflecting the climax may be applied to the time-series graph obtained by connecting the personalized atmosphere tables reflecting the user's preference, rather than to the graph obtained by connecting the atmosphere tables based on the original data.
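- the climax-based emphasis can be sketched as follows (an illustrative fragment; the threshold and boost factor are assumptions, and the audio-energy series stands in for a real sound-excitement analysis):

```python
# Sketch of the climax adjustment: given a per-interval audio-energy series,
# points above a threshold are treated as the climax, and the corresponding
# time-series-graph intensities are raised.

def boost_climax(intensity, audio_energy, threshold=0.8, boost=1.3):
    """Raise the graph intensity wherever the audio energy marks a climax."""
    return [v * boost if e >= threshold else v
            for v, e in zip(intensity, audio_energy)]

graph = boost_climax([1.0, 1.0, 1.0, 1.0],
                     [0.2, 0.9, 0.95, 0.3])
```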
- step S50 the device management unit 170 extracts atmosphere parameters that can be output with reference to the device profile based on the sensor information and the atmosphere table.
- step S ⁇ b> 50 the device management unit 170 selects the atmosphere forming device 30 corresponding to the atmosphere parameter that can be output, and sends a control command to the output control unit 190.
- the output control unit 190 performs output control of the atmosphere forming device 30 and changes the atmosphere parameter according to the timeline of the content based on the time series graph in which the personalized atmosphere table is connected in time series. At this time, feedback may be applied so that a desired atmosphere is formed based on sensor information of various installed sensor devices.
- FIG. 14 shows a time-series graph in which the output timing is corrected according to the perceptual grasp characteristics of each atmospheric parameter.
- the device management unit 170 refers to the device profile illustrated in FIG. 7 and corrects the time-series graph so that the timing of the output change is advanced more, the higher the level of the perceptual grasp characteristic.
- the time-series graph of the "Level 1: Immediate" atmosphere parameter is not corrected, while the time-series graphs of the "Level 2: Normal" and "Level 3: Delayed" atmosphere parameters are corrected so that the timing of the output change is advanced.
- since an atmosphere parameter corresponding to level 1 produces an output change that the user perceives immediately, it may be output without correction, along the timeline obtained from the analysis result.
- since atmosphere parameters corresponding to levels 2 and 3 require time before the output change is perceived by the user, the output change timing must be made earlier than the timeline obtained from the analysis result.
- a level 3 atmosphere parameter has a large delay and needs to be output from the atmosphere forming device 30 before the content information changes.
- when correcting the output change, the correction may use an initial value set in advance in the system for each atmosphere parameter.
- the sensor device may be used to detect the distance between the target atmosphere forming device 30 and the user, and the correction value may be dynamically determined according to the distance.
- when the correction value is determined according to the distance, it may be determined so that the output change timing is advanced in proportion to the distance.
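- the timing correction can be sketched as follows (the lead times per level and the per-meter term are illustrative initial values, not values from the embodiment):

```python
# Sketch of the output-timing correction: higher perceptual-grasp levels get
# a larger lead time, and the lead grows with the user's distance from the
# device, so the change is perceived when the content reaches that point.

BASE_LEAD_SECONDS = {1: 0.0, 2: 1.0, 3: 5.0}  # level -> default lead time

def output_time(event_time: float, level: int, distance_m: float = 0.0,
                per_meter: float = 0.2) -> float:
    """Advance the device output so the change is perceived at event_time."""
    lead = BASE_LEAD_SECONDS[level] + per_meter * distance_m
    return event_time - lead

t = output_time(60.0, level=3, distance_m=5.0)
```

A level 1 parameter is emitted at the event time itself, while a level 3 parameter at 5 m would here be emitted 6 seconds early.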
- <Example of use> The configuration example of the atmosphere forming system 10 according to the present embodiment and an example of the atmosphere control process have been described above. Hereinafter, specific usage examples of the atmosphere forming system 10 will be described.
- a plurality of atmosphere tables including the personalized atmosphere table are connected in time series to generate a time series graph.
- with the atmosphere forming system 10, in the above scene, the lighting of the lighting fixture 30a turns green, a fresh tree scent is emitted from the aroma diffuser 30b, and warm air is output from the air conditioner 30c. Further, the chirping of small birds is played as BGM from the speaker device 30d. At this time, reflecting the user's preference, the output intensities of the lighting, fragrance, and airflow are increased, and the user can experience a sense of presence as if they were in Hawaii.
- the atmosphere forming system 10 can also be used, for example, to change the illumination color dramatically in accordance with a characteristic keyword extracted from the lyrics included in the music content.
- the color of the lighting can be changed gently.
- the information processing apparatus 60 can extract a keyword suitable for a person or a place from a movie image, or extract a characteristic keyword from a line or a caption.
- the information processing apparatus 60 generates a time-series graph by reflecting the user's preference information in the atmosphere parameters selected from these keywords, and can identify the climax of the movie from the sound and raise the output intensity of the atmosphere parameters there.
- in accordance with the movie timeline, the color of the lighting, the scent emitted from the aroma diffuser, and the temperature and flow rate of the air sent from the air conditioner are changed. BGM matching each scene also plays from the speaker. Therefore, the user can appreciate the movie while experiencing a more realistic feeling.
- the processing executed by the content analysis unit 110, the atmosphere management unit 130, and the knowledge base system unit 150 to generate the atmosphere information reflecting the user's preference may be executed in real time as the content is reproduced (processing mode 1). Alternatively, each process may be executed in advance, before the content is played back (processing mode 2). In particular, when the atmosphere forming system 10 is used while correcting the output change timing of the atmosphere forming device 30 using the perceptual grasp characteristic, it is preferable to generate the personalized atmosphere table and the time-series graph in advance.
- the personalized atmosphere table and the time-series graph are generated in advance based on the content information, and the time-series graph is read as the content is reproduced to control the output of the atmosphere forming device 30.
- alternatively, the atmosphere table and the time-series graph before reflecting the user's preference may be generated for each content, and the atmosphere information may be generated while reflecting the user's preference at the time of content reproduction (processing mode 2-2).
- based on the atmosphere table and time-series graph generated in advance, the atmosphere information may also be generated while correcting the output change timing of the atmosphere forming device 30 according to the distance between the user's position and the atmosphere forming device 30 during content reproduction.
- the information processing apparatus 60 can take various forms of processing.
- while the information processing apparatus 60 described in the first embodiment has a configuration in which the processing units 110, 130, 150, 170, and 190 are provided in a single information processing apparatus 60, they may instead be distributed over a plurality of information processing apparatuses.
- for example, a first information processing apparatus including the content analysis unit 110, the atmosphere management unit 130, and the knowledge base system unit 150 may be configured separately from a second information processing apparatus including the device management unit 170 and the output control unit 190.
- the first information processing device generates atmosphere information that reflects user preference information for each content in advance
- the second information processing apparatus may read the atmosphere information and execute output control of the atmosphere forming device 30.
- alternatively, a first information processing apparatus can be configured to include the content analysis unit 110 and, from among the functions of the atmosphere management unit 130 and the knowledge base system unit 150, the function of generating an atmosphere table or a time-series graph with reference to the knowledge database.
- a second information processing apparatus including the output control unit 190 can then be configured.
- in this case, the first information processing apparatus generates in advance, for each content, the atmosphere information before the user's preference information is reflected, and the second information processing apparatus can read that atmosphere information, reflect the user's preference information, and execute output control of the atmosphere forming device 30.
- in the atmosphere forming system 10, the atmosphere parameters are extracted with reference to the knowledge database, based on the feature amounts extracted from the substance of the content.
- the atmosphere forming system 10 extracts user preference information with reference to the user database together with the atmosphere parameters.
- the atmosphere formation system 10 generates a personalized atmosphere table and a time series graph by reflecting user preference information with respect to the atmosphere parameters, and controls output of the atmosphere forming device 30.
- an atmosphere is formed in the space according to the timeline of the content, and the user can experience a sense of reality that matches the presentation of the content information.
- the user's preference is reflected in the atmosphere formed, the user can experience a sense of reality without feeling uncomfortable.
- since the atmosphere that is formed differs for each user, an atmosphere suited to each individual user can be reproduced.
- the atmosphere forming system according to the second embodiment of the present disclosure is configured as a system that enables complex processing when a plurality of output information exists for the same kind of atmosphere parameters.
- a plurality of personalized atmosphere tables can be generated when there are a plurality of contents to be used at the same time or when a plurality of users use the contents.
- a plurality of pieces of output information are generated for the same kind of atmospheric parameters.
- the atmosphere forming system according to the present embodiment can form an atmosphere by combining a plurality of output information.
- the hardware configuration of the atmosphere forming system according to the present embodiment can be the same as that of the atmosphere forming system according to the first embodiment.
- the atmosphere management unit 130 of the control unit 100 configuring the information processing apparatus 60 performs a process of combining a plurality of output information for the same kind of atmosphere parameters.
- the information processing apparatus 60 can form an atmosphere by combining the atmospheres of a plurality of contents.
- the composition performed by the information processing apparatus 60 is a composition of atmospheres, not of contents, and the user selects the content to be used from among a plurality of contents. For example, when there is only one content playback device, the user selects one content to use; when there are a plurality of content playback devices, the user can use a plurality of contents simultaneously.
- the first playback device can play back the first content
- the second playback device can play back the second content. In the atmosphere forming system according to the present embodiment, the plurality of playback devices are therefore linked so that the second playback device plays back in synchronization with the first playback device, and the atmospheres generated based on the respective contents are output in combination.
- for example, when content of a concert is played back and content related to a natural forest is played back in synchronization with it, the atmosphere management unit 130 generates a time-series graph for each content.
- the atmosphere management unit 130 combines time series graphs of a plurality of contents for each atmosphere parameter.
- the composition can be performed, for example, by a simple merge method in which a plurality of time-series graphs are combined as they are, at equal speed.
- FIG. 16 shows an example of complex processing of time series graphs by the simple merge method.
- the intensity of the combined atmosphere parameter is the sum of the output intensities of the plurality of atmosphere parameters.
- as a result, the user can experience a sense of presence unlikely to occur in the real world, such as being in a natural forest while at a concert venue.
- such composition by the simple merge method is applicable when the intensities of the atmosphere parameters can be added.
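- the simple merge method can be sketched as follows (intensities on an illustrative 0-100 scale):

```python
# Sketch of the simple merge method: two time-series graphs of the same
# additive atmosphere parameter are combined by summing their output
# intensities sample by sample, at equal speed.

def simple_merge(graph_a, graph_b):
    """Sum two equal-length intensity series point by point."""
    return [a + b for a, b in zip(graph_a, graph_b)]

concert = [60, 80, 100]  # e.g. sound intensity from the concert content
forest = [20, 20, 30]    # e.g. birdsong intensity from the forest content
merged = simple_merge(concert, forest)
```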
- when the intensities of the atmosphere parameters cannot be added, such as when there is only one lighting fixture that can reproduce only one color at a time, one of the atmosphere parameters acquired based on the plurality of contents must be selected.
- FIG. 17 shows a selection rule that defines which content is prioritized when the parameter strengths cannot be added.
- in the selection rule, the user can designate the content to be prioritized; when content is designated, the atmosphere parameter acquired based on that content is selected.
- when no content is designated, the automatic mode is set. In automatic mode, the atmosphere parameters are selected by, for example, prioritizing the content whose atmosphere parameters reflect the user's preferences, prioritizing the content whose atmosphere parameters have a high rate of change in output intensity, or prioritizing the content with a short playback time.
- the selection rules are not limited to those shown in the figure and can be set as appropriate. For example, instead of deciding on a single content to prioritize, the ratio in which each content is reflected may be determined.
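- the content selection rule can be sketched as follows (field names and the automatic-mode ordering are illustrative; the rule follows the priorities listed above):

```python
# Sketch of the selection rule for non-additive parameters: a user-designated
# content wins outright; otherwise automatic mode prefers the candidate whose
# parameter reflects the user's preference, then the one with the highest
# rate of change of output intensity.

def select_parameter(candidates, user_choice=None):
    """candidates: dicts with 'content', 'value', 'reflects_preference',
    and 'change_rate' keys; returns the winning candidate."""
    if user_choice is not None:
        for c in candidates:
            if c["content"] == user_choice:
                return c
    # automatic mode: preference first, then rate of output change
    return max(candidates,
               key=lambda c: (c["reflects_preference"], c["change_rate"]))

cands = [
    {"content": "movie", "value": "green", "reflects_preference": False,
     "change_rate": 0.9},
    {"content": "music", "value": "blue", "reflects_preference": True,
     "change_rate": 0.4},
]
```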
- FIG. 18 shows a usage example of the atmosphere forming system according to the present embodiment when a plurality of contents are reproduced simultaneously.
- the user reproduces moving image content using a display device such as a television or a tablet, and also reproduces music content using a music player.
- the music content is prioritized by the user's designation, in accordance with the selection rule shown in FIG. 17.
- the keywords "Maui" and "warm wind" are extracted from the video content. From the music content, the feature amount "relax" and the texts "lyrics" and "artist" are extracted.
- the information processing apparatus 60 generates the following personalized atmosphere table based on the atmosphere parameters extracted via the knowledge database from these keywords and the like, the user preference information extracted from the user database, and the content priority. Since the music content is being played back, no other sound atmosphere parameters are included.
  - Color: blue (priority), green
  - Aroma: herb (priority), tree
  - Temperature: warm (25 degrees)
  - Airflow: moderate
  - Emphasis: yes
- here the color and scent parameters compete; since these are elements that cannot be added, the color and scent parameters corresponding to the music content, which is the priority content, are selected.
- a plurality of atmosphere tables including the personalized atmosphere table are connected in time series to generate a time series graph.
- the information processing apparatus 60 performs a composite process on a plurality of time series graphs generated by reflecting preference information for each user.
- the composite processing can be performed in the same manner as when a plurality of contents are reproduced at the same time. That is, when the intensity of the atmosphere parameter can be added, for example, a time series graph is combined by the simple merge method. On the other hand, when the intensity of the atmosphere parameter cannot be added, it is determined which user's preference information is prioritized according to a preset selection rule.
- FIG. 19 shows a selection rule that defines which user preference information is prioritized when the strengths of the parameters cannot be added.
- in the selection rule, a user to be prioritized can be selected from among the plurality of users; when a user is selected, the atmosphere parameters reflecting that user's preference information are chosen.
- when no user is selected, the automatic mode is set. In automatic mode, for example, the user with the highest output emphasis can be given priority.
- the selection rules are not limited to those illustrated, and can be set as appropriate.
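- the per-user selection rule can be sketched similarly (user names, field layout, and emphasis values are illustrative):

```python
# Sketch of the per-user rule: a designated user's parameter is taken;
# otherwise automatic mode prefers the user whose personalization
# emphasis is highest.

def select_user_parameter(per_user, chosen_user=None):
    """per_user: {user: (value, emphasis)}; returns the winning value."""
    if chosen_user in per_user:
        return per_user[chosen_user][0]
    winner = max(per_user, key=lambda u: per_user[u][1])
    return per_user[winner][0]

params = {"alice": ("blue", 1.5), "bob": ("red", 1.1)}
```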
- the atmosphere forming system according to the present embodiment can basically obtain the same effects as the atmosphere forming system according to the first embodiment.
- the atmosphere forming system according to the present embodiment forms an atmosphere by combining atmosphere parameters when a plurality of contents are reproduced simultaneously or when a plurality of users use the contents.
- for atmosphere parameters whose output intensities can be added, the plurality of output intensities are summed and output. The user can therefore experience an atmosphere that hardly occurs in the real world.
- the priority atmospheric parameter is selected according to a preset selection rule. Therefore, a desired atmosphere is formed according to any one of the plurality of contents.
- the information processing apparatus can be divided into a plurality of pieces according to the processing mode for generating the atmosphere information.
- the atmosphere forming system according to the present embodiment is a system that forms an atmosphere in a space using the atmosphere forming devices in conjunction with the presentation of content information, and it is configured so that the user's operations are reflected when the atmosphere is formed. For example, while watching a live video of a favorite artist, the user may want to make the lighting more gorgeous; or, while an atmosphere is formed in conjunction with the presentation of content information, the user may want to turn off the fragrance output during a meal. The atmosphere forming system according to the present embodiment therefore allows the user to adjust the atmosphere formed in conjunction with the presentation of content information.
- the input device 20 includes an operation device that enables a system setting operation, an imaging device, a sound collecting microphone, and the like as a device for detecting a user's operation.
- FIG. 20 is a block diagram functionally showing the software configuration of the control unit 200 of the information processing apparatus 60.
- the control unit 200 includes a user action detection unit 210, a content analysis unit 110, an atmosphere management unit 130, a knowledge base system unit 150, a device management unit 270, and an output control unit 190.
- each of these units can be a functional unit realized by executing a program by the CPU.
- each unit other than the user operation detection unit 210 and the device management unit 270 can have the same configuration as each unit of the information processing apparatus 60 according to the first embodiment.
- the control process for generating the time-series graph of the atmosphere formed in conjunction with the presentation of the content information can be basically the same as in the atmosphere forming system according to the first embodiment.
- user preference information does not necessarily have to be reflected in the formed atmosphere.
- referring to FIG. 1 and FIG. 20, the description below centers on the control processing that adjusts the atmosphere formed in conjunction with the presentation of content information, reflecting the user's operations.
- the user's operation for adjusting the atmosphere includes, for example, a method of directly touching and operating the input device 20, a voice operation by user's utterance, a user's gesture operation, and the like.
- the adjustment of the atmosphere includes scenes where the type of atmosphere is selected and scenes where the output value is changed; the operations that the information processing apparatus 60 accepts may differ between scenes.
- the input device 20 here is, for example, a device that enables a setting operation of the atmosphere forming system 10, and examples thereof include an operation device with a screen such as a smart phone or a tablet.
- the user operation detection unit 210 receives user operation input information to these input devices 20 and sends the input information to the device management unit 270.
- the device management unit 270 can adjust the output intensity of a desired atmosphere parameter or stop the output from some of the atmosphere forming devices 30 in accordance with a user operation input.
- FIG. 21 is an explanatory diagram showing atmosphere adjustment by voice operation. Assume that the user says "make it a little darker" in a situation where the lighting fixture 30a and the aroma diffuser 30b are outputting in conjunction with the presentation of content information.
- the user operation detection unit 210 receives a user's utterance via a sound collection microphone, and performs a process of extracting a keyword or the like representing the user's intention by known voice analysis or language analysis.
- the user operation detection unit 210 understands the degree from the word "little", identifies the lighting fixture 30a from the word "darker", and grasps the user's intention to make the room darker. That is, even for an ambiguous expression that does not directly designate an atmosphere forming device 30, unlike a direct expression such as "weaken the lighting output", the user operation detection unit 210 identifies the lighting fixture 30a by language analysis and adjusts its output.
- the device management unit 270 weakens the output of the lighting fixture 30a in accordance with the user's intention detected by the user operation detection unit 210. This makes it possible to darken the illumination according to the user's voice operation while forming an atmosphere linked to the presentation of the content information.
- the user operation detection unit 210 can likewise grasp, by voice analysis and language analysis, the user's intention to stop the output of all the atmosphere forming devices 30.
- the device management unit 270 stops the output of the lighting fixture 30a and the aroma diffuser 30b according to the user's intention detected by the user operation detection unit 210. Thereby, from the state where the atmosphere linked to the presentation of the content information is formed, the formation of the atmosphere can be temporarily stopped according to the user's voice operation.
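- the keyword-to-device resolution for voice operation can be pictured with the following sketch (the vocabularies and step sizes are hypothetical; real speech and language analysis would replace the simple word lookup):

```python
# Sketch of voice-command interpretation: extracted keywords are matched
# against device and intent vocabularies, so even an indirect phrase like
# "make it a little darker" resolves to a device and an output change.

DEVICE_WORDS = {"darker": "lighting", "brighter": "lighting",
                "smell": "diffuser", "scent": "diffuser"}
DIRECTION = {"darker": -1, "brighter": +1}
DEGREE = {"little": 0.1, "much": 0.3}

def parse_command(utterance: str):
    """Return (device, signed output change) inferred from the utterance."""
    words = utterance.lower().replace(",", "").split()
    device = next((DEVICE_WORDS[w] for w in words if w in DEVICE_WORDS), None)
    step = next((DEGREE[w] for w in words if w in DEGREE), 0.2)  # default step
    sign = next((DIRECTION[w] for w in words if w in DIRECTION), 0)
    return device, sign * step

cmd = parse_command("make it a little darker")
```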
- FIG. 22 is an explanatory diagram showing an example of adjusting the atmosphere by detecting the movement of the user's finger and line of sight.
- from the imaging information, the user operation detection unit 210 identifies the lighting fixture 30a at the end of the user's line of sight, and grasps the user's intended output intensity from the movement of the index finger.
- the device management unit 270 adjusts the output of the lighting fixture 30a according to the user's intention detected by the user operation detection unit 210. Thereby, the brightness of the illumination can be adjusted according to the user's gesture operation while forming an atmosphere linked to the presentation of the content information.
- FIG. 23 is an explanatory diagram showing an example of adjusting the atmosphere by detecting the number and movement of the user's fingers.
- information associated with the atmosphere forming device 30 corresponding to the number of fingers is stored in the information processing apparatus 60.
- one finger refers to the lighting fixture 30a, and two fingers refer to the aroma diffuser 30b.
- the user moves his / her index finger up and down in a situation where the lighting apparatus 30a and the aroma diffuser 30b are outputting in conjunction with presentation of content information.
- the user motion detection unit 210 detects the number and movement of the user's fingers through the imaging information, identifies the lighting fixture 30a because there is one finger, and grasps the user's intended output intensity from the movement of the index finger.
- the device management unit 270 adjusts the output of the lighting fixture 30a according to the user's intention detected by the user operation detection unit 210.
- the user motion detection unit 210 identifies the aroma diffuser 30b because there are two fingers, and grasps the user's intention for the strength of the output by the movement of the two fingers.
- the device management unit 270 adjusts the output of the aroma diffuser 30b according to the user's intention detected by the user operation detection unit 210.
- the atmosphere can be adjusted according to the gesture operation of the user while the atmosphere linked to the presentation of the content information is formed.
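The finger-count dispatch described above can be sketched as follows. The device identifiers, the 0..1 output scale, and the step size are illustrative assumptions, not details taken from the disclosure:

```python
# Sketch: dispatch a gesture to an atmosphere forming device by finger count,
# then adjust that device's output from the vertical finger movement.
# The mapping table and the adjustment step are illustrative assumptions.

FINGER_COUNT_TO_DEVICE = {
    1: "lighting_fixture_30a",   # one finger -> lighting fixture 30a
    2: "aroma_diffuser_30b",     # two fingers -> aroma diffuser 30b
}

def adjust_by_finger_gesture(outputs, finger_count, vertical_delta, step=0.1):
    """Raise or lower the selected device's output (clamped to 0..1)
    according to the up/down movement of the fingers."""
    device = FINGER_COUNT_TO_DEVICE.get(finger_count)
    if device is None:
        return outputs  # unrecognized gesture: leave outputs unchanged
    level = outputs.get(device, 0.5)
    level += step if vertical_delta > 0 else -step
    outputs[device] = min(1.0, max(0.0, round(level, 3)))
    return outputs

outputs = {"lighting_fixture_30a": 0.5, "aroma_diffuser_30b": 0.5}
adjust_by_finger_gesture(outputs, 1, vertical_delta=+1)  # one finger, moved up
adjust_by_finger_gesture(outputs, 2, vertical_delta=-1)  # two fingers, moved down
print(outputs)
```

The same structure would cover the finger-type variant of FIG. 24 by keying the table on the recognized finger (thumb, index) instead of the count.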
- FIG. 24 is an explanatory diagram showing an example of adjusting the atmosphere by detecting the type and movement of the user's finger.
- information associated with the atmosphere forming device 30 corresponding to the type of finger is stored in the information processing apparatus 60.
- the thumb indicates the lighting device 30a
- the index finger indicates the aroma diffuser 30b.
- the user motion detection unit 210 detects the type and movement of the user's finger through the imaging information, identifies the lighting fixture 30a because the finger is the thumb, and grasps the user's intended output intensity from the movement of the thumb.
- the device management unit 270 adjusts the output of the lighting fixture 30a according to the user's intention detected by the user operation detection unit 210.
- the user motion detection unit 210 identifies the aroma diffuser 30b because the finger is an index finger, and grasps the user's intention for the strength of the output by the movement of the index finger.
- the device management unit 270 adjusts the output of the aroma diffuser 30b according to the user's intention detected by the user operation detection unit 210.
- the atmosphere can be adjusted according to the gesture operation of the user while the atmosphere linked to the presentation of the content information is formed.
- the information processing apparatus 60 may automatically adjust the atmosphere according to the user's actions. For example, when the user makes a phone call, the information processing apparatus 60 may turn off the sound and vibration being output in conjunction with the presentation of the content information, or weaken their output intensity. Specifically, when the user is using a specific device such as a telephone, the information processing apparatus 60 stops playback devices for content that is not being used, or stops the output of atmosphere forming devices 30 that are estimated, from the user's operation status, to require stopping.
- by linking the telephone device with the atmosphere forming system so that the information processing apparatus 60 knows the usage state of each device, the information processing apparatus 60 executes appropriate adjustment processing according to the usage state of the devices.
- the information processing apparatus 60 can detect that the user is making a phone call by analyzing the imaging information.
- the information processing apparatus 60 may brighten the illumination if it is dark when the user stands up while an atmosphere is being formed in conjunction with the presentation of the content information. Since the user may start walking after standing up, brightening the room reduces the risk of accidents.
- the information processing apparatus 60 can detect that the user has stood up by analyzing the imaging information.
- the information processing apparatus 60 may control the output and stopping of the atmosphere forming devices 30, together with pausing and resuming the content, depending on whether the user is in the content usage environment. For example, when the user leaves the room, that is, when the user moves out of the detection range of the imaging device or the sensor device, or moves away from the content playback device beyond a preset distance, the information processing apparatus 60 temporarily stops the output of the content and the atmosphere. When the user returns to the detection range of the imaging device or the sensor device, or returns within the preset distance from the content playback device, the information processing apparatus 60 automatically resumes playback of the temporarily stopped content. In this way, the output of the atmosphere can be stopped and resumed automatically without hindering the user's activities.
- the information processing apparatus 60 may change the rate of change of the atmosphere output according to the speed of the user's motion. For example, the rate of change can be reduced when the user is moving with small, relaxed motions like a conductor, and increased when the user is moving with large, fast motions. With this control, the user can freely change the rate of change of the atmosphere through his or her own motions.
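The motion-speed control just described can be sketched as a rate limiter on the output. The normalized speed estimate and the slow/fast rate bounds are illustrative assumptions:

```python
# Sketch: map the speed of the user's motion to the rate at which the
# atmosphere output is allowed to change per update step. The thresholds
# and rates below are illustrative, not values from the disclosure.

def change_rate_for_motion(motion_speed, slow=0.2, fast=1.0):
    """Return the per-step change rate for the atmosphere output.

    motion_speed: normalized 0..1 estimate of how large/fast the user's
    movement is (e.g. from frame-to-frame hand displacement).
    Small, relaxed motion -> small rate; large, fast motion -> large rate.
    """
    motion_speed = min(1.0, max(0.0, motion_speed))
    return slow + (fast - slow) * motion_speed

def step_toward(current, target, motion_speed):
    """Move the output toward the target, limited by the motion-derived rate."""
    rate = change_rate_for_motion(motion_speed)
    delta = max(-rate, min(rate, target - current))
    return current + delta

print(step_toward(0.0, 1.0, motion_speed=0.0))  # relaxed, conductor-like motion
print(step_toward(0.0, 1.0, motion_speed=1.0))  # large, fast motion
```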
- the information processing apparatus 60 may update the user preference information in the user database by learning the user's evaluation of the formed atmosphere, based on the user's utterances and gestures, according to a predetermined rule. With this control, an atmosphere matching the user's preferences can be appropriately reproduced.
- the atmosphere output is adjusted to reflect the user's operation while the atmosphere is formed in conjunction with the presentation of the content information.
- the atmosphere output is adjusted automatically according to the user's intention or actions, so that a comfortable atmosphere can be formed in response to the user's operation.
- the device management unit 270 adjusts the atmosphere by reflecting the user's operation or intention.
- the atmosphere may be adjusted by the atmosphere management unit 130 correcting the atmosphere parameter.
- the output control unit 190 may adjust the output intensity.
- the information processing apparatus may be divided into a plurality of apparatuses according to the processing mode for generating the atmosphere information.
- the atmosphere forming system according to the present embodiment is configured by adding, to the atmosphere forming systems according to the first to third embodiments, a function in which the information processing apparatus grasps the environment around the user and emphasizes the output of the atmosphere toward a specific location.
- the location where the output of the atmosphere is emphasized can be an object or a place.
- examples include a glass, a mirror, a table, and a display, but various other objects and places may be used.
- FIG. 25 is an explanatory diagram illustrating, as an example of the atmosphere control process by the atmosphere forming system according to the present embodiment, a case in which light emitted from the lighting fixture 30a is directed onto a predetermined object.
- this example shows a function that specifically illuminates the glass 90 and the display device 95 present around the user.
- the object or location to be illuminated may be identified by recognizing objects in the space using imaging information acquired by a camera.
- the object or location to be illuminated may also be specified by attaching an RFID tag to the object or location at which the user wants to actively emphasize the atmosphere.
- FIG. 26 shows an example in which an RFID tag is attached to the glass 90 and an object to be illuminated is specified.
- with the RFID tag, even if the glass 90 is moved, the irradiation position follows it, so the glass 90 can always be illuminated.
- for example, at a bar counter, the glass 90 used by a customer can be tracked and illuminated with light matching the atmosphere of the background music playing in the bar.
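A minimal sketch of the tag-following behavior is shown below. The tag-report callback and the aiming interface are hypothetical stand-ins for whatever RFID reader and light fixture the system actually uses:

```python
# Sketch: keep a spotlight aimed at an RFID-tagged object as it moves.
# The tag IDs and the (x, y) position reports are illustrative assumptions.

class SpotlightFollower:
    def __init__(self, tag_id):
        self.tag_id = tag_id
        self.aim = None  # last (x, y) position the light was aimed at

    def on_tag_report(self, tag_id, position):
        """Called whenever an RFID reader reports a tag position.
        Re-aims the light only when the followed tag has moved."""
        if tag_id != self.tag_id or position == self.aim:
            return False
        self.aim = position
        return True  # in a real system: send the new pan/tilt to the fixture

follower = SpotlightFollower("glass_90")
print(follower.on_tag_report("glass_90", (1.0, 2.0)))  # first sighting: re-aim
print(follower.on_tag_report("glass_90", (1.0, 2.0)))  # not moved: no re-aim
print(follower.on_tag_report("mirror", (9.9, 9.9)))    # different tag: ignored
```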
- the fifth embodiment of the present disclosure is a multi-atmosphere output-compatible device in which a content playback device itself is configured as an atmosphere forming system.
- FIG. 27 shows an example of an image display device 400 configured as a multi-atmosphere output compatible device.
- the image display device 400 is configured by providing a display device with an input device (not shown) and, as atmosphere forming devices 30, a lighting device, a blower, a vibration generation device, and a scent diffusion device, so that the atmosphere control process can be executed by an information processing apparatus provided together.
- the multi-atmosphere output compatible device according to the present embodiment can, as a so-called all-in-one content playback device, easily be carried to any space the user desires.
- the multi-atmosphere output compatible device may not include all the processing units of the information processing apparatus.
- the information processing apparatus may be configured to execute the output control of the atmosphere forming devices 30 by reading atmosphere information generated in advance with the user's preferences reflected.
- alternatively, the information processing apparatus may read atmosphere information generated in advance without reflecting the user's preferences, and then reflect the user's preferences when executing the output control of the atmosphere forming devices 30.
- the atmosphere forming systems described above are capable of stimulating the visual, auditory, olfactory, and tactile senses.
- however, the present technology is not limited to such examples; any system that stimulates at least one of these senses may be used. The configurations of the first to fifth embodiments may also be combined as appropriate.
- (1) An information processing apparatus comprising a control unit that controls an atmosphere forming device so that, in conjunction with the presentation of information on content used by a user, an atmosphere is formed by stimulation perceivable by the user that is different from the presentation of the information and that reflects the user's preference or operation.
- (2) The information processing apparatus according to (1), wherein the control unit is configured to form the atmosphere based on a feature amount extracted from the content.
- (3) The information processing apparatus according to (1) or (2), wherein the control unit is configured to change the output intensity of the atmosphere forming device according to the user's preference or operation.
- (4) The information processing apparatus according to any one of (1) to (3), wherein the control unit is configured to select the type of atmosphere forming device to be used according to the user's preference or operation.
- (5) The information processing apparatus according to any one of (1) to (4), wherein the control unit is configured to reflect the user's preferences in the atmosphere by referring to a database in which the user's preference information is stored.
- (6) The information processing apparatus according to (5), wherein the user's preference information is associated with a preference intensity level, and the control unit is configured to vary the degree to which the user's preferences are reflected in the atmosphere according to the preference intensity.
- (7) The information processing apparatus according to (5), wherein the user's preference information is associated with a preference intensity level, and, when there are a plurality of users using the content, the control unit is configured to reflect the users' preferences in the atmosphere while prioritizing the preference information of users with high preference intensity.
- (8) The information processing apparatus according to any one of (1) to (7), wherein, when a plurality of pieces of content are used simultaneously, the control unit is configured to form the atmosphere giving priority to the content designated by the user.
- (9) The information processing apparatus according to any one of (1) to (7), wherein, when a plurality of pieces of content are used simultaneously, the control unit is configured to form the atmosphere by combining the outputs of the corresponding atmospheres.
- (10) The information processing apparatus according to (9), wherein, when a plurality of pieces of content are used simultaneously, the control unit is configured to form, for atmospheres whose outputs cannot be combined, the atmosphere selected according to the priority of the content.
- (11) The information processing apparatus according to any one of (1) to (10), wherein, when there are a plurality of users using the content, the control unit is configured to form the atmosphere preferentially reflecting the preferences of a selected specific user.
- (12) The information processing apparatus according to any one of (1) to (11), wherein the control unit is configured to form the atmosphere in which the user's preference or operation is reflected in atmosphere parameters corresponding to feature amounts extracted from the content.
- (13) The information processing apparatus according to (12), wherein the control unit is configured to form the atmosphere based on a time series graph in which the atmosphere parameters at a plurality of time points of the content are connected in time series.
- (14) The information processing apparatus according to any one of (2) to (13), wherein the control unit is configured to determine, based on the change rate of the output of the atmosphere forming device, the content to be the target of forming the atmosphere in conjunction with the presentation of the content information.
- (15) The information processing apparatus according to any one of (1) to (14), wherein the control unit is configured to control the output timing of the atmosphere forming device according to the perception grasp characteristic of each atmosphere.
- (16) The information processing apparatus according to any one of (1) to (15), wherein the control unit is configured to control the output timing of the atmosphere forming device according to the distance between the user and the atmosphere forming device.
- (17) The information processing apparatus according to any one of (1) to (16), wherein the control unit is configured to adjust the atmosphere based on an input operation, a voice input operation, or a gesture operation by the user.
- (18) The information processing apparatus according to any one of (1) to (17), wherein the control unit is configured to change the atmosphere according to the user's intention obtained by analyzing the operations of the user using the content.
- (19) The information processing apparatus according to any one of (1) to (18), wherein the atmosphere forming device is at least one of a lighting fixture, an aroma diffuser, a speaker, a fan, an air conditioner, and a vibration generation device.
- (20) An information processing method in which a control device controls an atmosphere forming device so that, in conjunction with the presentation of information on content used by a user, an atmosphere is formed by stimulation perceivable by the user that is different from the presentation of the information and that reflects the user's preference or operation.
- (21) A program that causes a computer to execute a function of controlling an atmosphere forming device so that, in conjunction with the presentation of information on content used by a user, an atmosphere is formed by stimulation perceivable by the user that is different from the presentation of the information and that reflects the user's preference or operation.
- (22) An atmosphere forming system comprising: an atmosphere forming device for forming, in the user's surrounding environment, an atmosphere by stimulation perceivable by the user; and an information processing apparatus that controls the atmosphere forming device so that, in conjunction with the presentation of information on content used by the user, an atmosphere different from the presentation of the information and reflecting the user's preference or operation is formed.
- Atmosphere Formation System 20 Input Device 30 Output Device (Atmosphere Formation Equipment) 30a Lighting fixture 30b Aroma diffuser 30c Air conditioner 40 Storage device 50 Communication device 60 Information processing device 90 Glass 95 Display device 100 Control unit 110 Content analysis unit 130 Atmosphere management unit 150 Knowledge base system unit 170 Device management unit 190 Output control unit 200 Control unit 210 User operation detection unit 270 Device management unit 400 Image display device (multi-atmosphere output compatible device)
Landscapes
- Engineering & Computer Science (AREA)
- Databases & Information Systems (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Social Psychology (AREA)
- Data Mining & Analysis (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Automation & Control Theory (AREA)
- Computer Networks & Wireless Communication (AREA)
- Computational Linguistics (AREA)
- Artificial Intelligence (AREA)
- Evolutionary Computation (AREA)
- Computing Systems (AREA)
- Mathematical Physics (AREA)
- Software Systems (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
Abstract
Description
1. First Embodiment (Reflecting User Preferences)
1.1. Overview of the Atmosphere Forming System
1.2. Hardware Configuration
1.3. Software Configuration
1.3.1. Content Analysis Unit
1.3.2. Atmosphere Management Unit
1.3.3. Knowledge Base System Unit
1.3.4. Device Management Unit
1.3.5. Output Control Unit
1.4. Atmosphere Control Process
1.4.1. Content Extraction Process
1.4.2. Feature Amount Extraction Process
1.4.3. Atmosphere Table Generation Process
1.4.4. Time Series Graph Generation Process
1.4.5. Device Optimization Process
1.5. Usage Examples
1.5.1. Example of Video Watching
1.5.2. Example of Music Listening
1.5.3. Example of Movie Watching
2. Second Embodiment (Combining Multiple Atmospheres)
3. Third Embodiment (Reflecting User Operations)
3.1. Overview of the Atmosphere Forming System
3.2. Configuration Example of the Atmosphere Forming System
3.3. Atmosphere Adjustment Process
3.3.1. Atmosphere Adjustment by Direct Operation
3.3.2. Atmosphere Adjustment by Voice Operation
3.3.3. Atmosphere Adjustment by Gesture Operation
3.3.4. Atmosphere Adjustment According to Other Operations
4. Fourth Embodiment (Atmosphere Formation Grasping the Surrounding Environment)
5. Fifth Embodiment (Multi-Atmosphere Output Compatible Device)
<1.1. Overview of the Atmosphere Forming System>
First, an overview of the atmosphere forming system according to the first embodiment of the present disclosure will be described. The atmosphere forming system is a system that forms an atmosphere in the space where the user is located, in conjunction with the presentation of information on the content the user is using. The atmosphere formed by the atmosphere forming system according to this embodiment reflects the user's preferences.
Next, a configuration example of the atmosphere forming system 10 according to this embodiment will be described. FIG. 1 is an explanatory diagram schematically showing the hardware configuration of the atmosphere forming system 10. The atmosphere forming system 10 includes an input device 20, atmosphere forming devices 30, an information processing apparatus 60, a storage device 40, and a communication device 50.
FIG. 3 is a block diagram functionally showing the software configuration of the control unit 100 of the information processing apparatus 60. The control unit 100 includes a content analysis unit 110, an atmosphere management unit 130, a knowledge base system unit 150, a device management unit 170, and an output control unit 190. Specifically, each of these units can be a functional unit realized by a CPU executing a program.
The content analysis unit 110 executes a process of extracting feature amounts of the content by analyzing the information in the content. In this embodiment, the content analysis unit 110 is configured to execute time-series analysis, image analysis, audio analysis, and language analysis. Time-series analysis performs processing within the timeline of the content. Image analysis performs object recognition on images in the content information and extracts image-related feature amounts. Audio analysis converts speech into text by speech recognition; it may also extract feature amounts such as the excitement, genre, volume, tempo, and sound quality of the audio itself. Language analysis performs morphological analysis, syntactic parsing, and the like, and extracts keywords and other items as feature amounts.
The atmosphere management unit 130 queries the knowledge base system unit 150 with the feature amounts of the content extracted by the content analysis unit 110, and generates atmosphere information to be formed using the atmosphere forming devices 30 based on the returned results. Specifically, the atmosphere management unit 130 obtains atmosphere parameters (color, temperature, etc.) by querying the knowledge base system unit 150 using the feature amount at a given point in the timeline as a key. The atmosphere management unit 130 then fills each slot of a prepared atmosphere table with the obtained atmosphere parameters to generate the atmosphere table.
The knowledge base system unit 150 refers to the knowledge database stored in the storage device 40, obtains the atmosphere parameters corresponding to the queried content feature amounts, and returns them to the atmosphere management unit 130. If no corresponding atmosphere parameters are found in the knowledge database, the knowledge base system unit 150 may access an external search engine on the Internet or the like via the communication device 50 to obtain related atmosphere parameters. At the same time, the knowledge base system unit 150 refers to the user database stored in the storage device 40 and returns the user's preference information to the atmosphere management unit 130. The knowledge base system unit 150 is described below together with examples of the databases it refers to.
The knowledge database used in this embodiment stores various data in which feature amounts are mapped to atmosphere parameters. Elements that form an atmosphere include, for example, parameters such as color, scent (aroma), wind, temperature, sound, and vibration. In this embodiment, such parameter information on colors, scents, and the like is defined as a core database, and atmosphere parameters corresponding to a key feature amount can be obtained by referring to the core database. Color and scent atmosphere parameters are described below as an example of the core database.
The user database used in this embodiment is a database storing user profiles, in which the user's preferences regarding colors, scents, people, places, and the like are recorded. The user database is used to form an atmosphere reflecting the user's preferences in the space by emphasizing atmosphere parameters according to those preferences.
The device management unit 170 selects the optimal atmosphere forming devices 30 for output control based on the atmosphere information generated by the atmosphere management unit 130, while managing sensor information input from the various sensor devices constituting the input device 20. For example, the device management unit 170 grasps the current atmosphere based on the sensor information and selects the atmosphere forming devices 30 best suited to forming an atmosphere matching the atmosphere information. Information on the selected atmosphere forming devices 30 is sent to the output control unit 190.
FIG. 7 shows an example of a device profile. The device profile includes, for example, the name, output format, and perception grasp characteristic of the atmosphere forming device 30. Device profiles are created for each user according to the atmosphere forming devices 30 present in the space and are stored in the storage device 40. The perception grasp characteristic is information indicating how quickly the user perceives a change in the atmosphere when the atmosphere changes with the output of the atmosphere forming device 30. In the example of FIG. 7, the perception grasp characteristic is expressed in three levels: "1. immediate", "2. normal", and "3. delayed". A lower level indicates that the user can grasp a change in the atmosphere immediately, while a higher level indicates that the user perceives the change less readily.
The output control unit 190 outputs control signals to the atmosphere forming devices 30 in accordance with instructions from the device management unit 170 so that an atmosphere matching the atmosphere information, generated with the user's preferences reflected, is formed.
Next, the atmosphere control process executed by the information processing apparatus 60 according to this embodiment will be described. FIG. 8 is a flowchart of the atmosphere control process, and FIG. 9 shows its flow in detail.
First, in step S10, the content analysis unit 110 separates the timeline, images, audio, and text in the content.
Next, in step S20, the content analysis unit 110 extracts feature amounts for each type of separated data. For example, the content analysis unit 110 converts feature amounts of image data into keywords. It also converts audio data into text by audio analysis and then extracts feature amounts by language analysis. These analysis processes can be executed using existing techniques.
Next, in step S30, the atmosphere management unit 130 queries the knowledge base system unit 150 for the atmosphere parameters corresponding to the extracted feature amounts, using those feature amounts as keys. The knowledge base system unit 150 refers to the knowledge database, extracts the atmosphere parameters corresponding to the queried content feature amounts, and returns them to the atmosphere management unit 130. It also refers to the user database, extracts the current user's preference information, and returns it together with the atmosphere parameters. The atmosphere management unit 130 generates a personalized atmosphere table based on the atmosphere parameters and user preference information returned from the knowledge base system unit 150. Such a personalized atmosphere table is generated for each point on the timeline. Specifically, the personalized atmosphere table can be generated as follows.
For example, if a content feature amount is given by the keyword "forest", the knowledge base system unit 150 refers to the core database and extracts triples of knowledge data corresponding to the keyword, as follows. The extracted knowledge data are returned to the atmosphere management unit 130 as atmosphere parameters.
- "forest-hypernym-nature"
- "forest-color-green"
- "nature-aroma-woody"
- "forest-wind-weak"
- "forest-sound-birdsong"
- "forest-vibration-none"
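The triple lookup above can be sketched as follows. The tiny in-memory triple store and the single hypernym hop are illustrative stand-ins for the actual knowledge database:

```python
# Sketch: look up knowledge triples for a keyword and fill the slots of an
# atmosphere table, following the "forest" example. The triple data below
# mirrors the example in the text; the lookup logic is an assumption.

TRIPLES = [
    ("forest", "hypernym", "nature"),
    ("forest", "color", "green"),
    ("nature", "aroma", "woody"),
    ("forest", "wind", "weak"),
    ("forest", "sound", "birdsong"),
    ("forest", "vibration", "none"),
]

def atmosphere_table(keyword, triples=TRIPLES):
    """Fill atmosphere-parameter slots for a keyword, also following one
    hypernym link (forest -> nature), as the aroma triple requires."""
    subjects = {keyword}
    subjects |= {o for s, p, o in triples if s == keyword and p == "hypernym"}
    table = {}
    for s, p, o in triples:
        if s in subjects and p != "hypernym":
            table.setdefault(p, o)
    return table

print(atmosphere_table("forest"))
```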
When extracting atmosphere parameters, the knowledge base system unit 150 refers to the user database and obtains the user's preference information, as illustrated in FIG. 6. As described above, macro-level and micro-level modes are associated with the user preference information according to its properties. The obtained user preference information is returned to the atmosphere management unit 130 together with the atmosphere parameters. The knowledge base system unit 150 may identify the user based on, for example, login data to the atmosphere forming system 10, or based on login data to the input device 20 for playing content or to the atmosphere forming devices 30.
The atmosphere management unit 130 generates a personalized atmosphere table reflecting the user's preferences based on the atmosphere parameters and the user preference information. For example, among the properties of the user preference information, macro-level properties are classified by broad frameworks such as gender and age group. For these properties, the output of atmosphere parameters can be emphasized according to colors and scents preferred by men or women, colors and scents preferred by each age group, and content popular in each era, based on general knowledge, trends, and historical tendencies.
Next, in step S40, the atmosphere management unit 130 connects the personalized atmosphere tables generated within the timeline in time series and generates a time series graph representing the change of the atmosphere over time. At this time, the atmosphere management unit 130 may detect climactic points in each timeline by detecting rises in the sound, and further emphasize the atmosphere parameters in the atmosphere table.
FIG. 12 shows an example of a time series graph for color, here for the specific color green. If the user's preference information includes the information that the user's favorite color is green, the personalization emphasis of the green output in the personalized atmosphere table is set higher than the basic emphasis. Therefore, the time series graph reflecting the user's preferences (solid line) has a higher output intensity than the graph (dashed line) obtained by connecting, in time series, the atmosphere tables based on the original data extracted from the knowledge database.
FIG. 13 shows an example of a time series graph in which climactic points in the content are detected and the output intensity of the atmosphere parameters is adjusted. In the time series graph reflecting the climaxes (solid line), the output intensity at climactic points is higher than in the graph (dashed line) based on the original data extracted from the knowledge database. By reflecting the climaxes in the content, the output intensity can be raised according to the climax even where the output intensity based on the original data is low.
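The preference emphasis and climax emphasis of FIGS. 12 and 13 can be sketched as multiplicative boosts on a per-timepoint series. The base values and boost factors are illustrative assumptions:

```python
# Sketch: build a time series of output intensities for one atmosphere
# parameter, boosted where the user's preference applies and where a
# climax is detected. Values and factors are illustrative, not from the
# disclosure.

def emphasized_series(base, preferred=False, climaxes=(),
                      preference_boost=1.5, climax_boost=1.3):
    """base: per-timepoint output intensities from the knowledge database;
    climaxes: indices of detected climactic points."""
    series = []
    for t, v in enumerate(base):
        if preferred:          # e.g. the user's favorite color is green
            v *= preference_boost
        if t in climaxes:      # detected rise in the sound
            v *= climax_boost
        series.append(min(1.0, round(v, 3)))  # clamp to the output range
    return series

base = [0.2, 0.4, 0.3, 0.6]
print(emphasized_series(base, preferred=True, climaxes={3}))
```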
Next, in step S50, the device management unit 170 refers to the device profiles and extracts the atmosphere parameters that can be output, based on the sensor information and the atmosphere table. Also in step S50, the device management unit 170 selects the atmosphere forming devices 30 corresponding to the outputtable atmosphere parameters and sends control instructions to the output control unit 190. The output control unit 190 controls the output of the atmosphere forming devices 30 and varies the atmosphere parameters along the content timeline, based on the time series graph obtained by connecting the personalized atmosphere tables in time series. At this time, feedback may be applied based on sensor information from the various installed sensor devices so that the desired atmosphere is formed.
FIG. 14 shows a time series graph in which the output timing is corrected according to the perception grasp characteristic of each atmosphere parameter. The device management unit 170 refers to the device profiles illustrated in FIG. 7 and corrects the time series graph so that the higher the level of the perception grasp characteristic, the earlier the timing of output changes. In the example of FIG. 14, the time series graph of the "level 1: immediate" atmosphere parameter is not corrected, while the graphs of the "level 2: normal" and "level 3: delayed" atmosphere parameters are corrected so that their output changes occur earlier.
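The timing correction of FIG. 14 can be sketched as a per-level lead time subtracted from each scheduled output change. The concrete lead times per level are assumptions; the disclosure specifies only that higher levels start earlier:

```python
# Sketch: shift the output timing of each atmosphere earlier according to
# its perception grasp level (1 = immediate, 2 = normal, 3 = delayed), so
# slower-to-perceive outputs start sooner. The lead seconds are assumed.

LEAD_SECONDS = {1: 0.0, 2: 1.0, 3: 3.0}  # level -> how much earlier to start

def corrected_start(scheduled_start, perception_level):
    """Return the output start time corrected for the perception grasp
    level; a higher level (slower perception) gets an earlier start."""
    lead = LEAD_SECONDS.get(perception_level, 0.0)
    return max(0.0, scheduled_start - lead)

for level in (1, 2, 3):
    print(level, corrected_start(10.0, level))
```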
The configuration example of the atmosphere forming system 10 according to this embodiment and an example of the atmosphere control process have been described above. Specific usage examples of the atmosphere forming system 10 are described below.
For example, as shown in FIG. 15, suppose the user is watching video content shot in Maui, Hawaii, on a display device such as a television or tablet. Mountains are registered as a favorite place in this user's preference information. Suppose there is a scene in the video in which a person, looking at a mountain on Maui, says "It's warm and the breeze feels good". From the content information in this scene, the keywords "Maui" and "warm breeze" are extracted. The information processing apparatus 60 reflects the user preference information extracted from the user database in the atmosphere parameters extracted from these keywords via the knowledge database, generating the following personalized atmosphere table.
- Color: green
- Scent: woody
- Temperature: warm (25 degrees)
- Airflow: gentle
- Emphasis: yes
- Sound: birdsong
For example, suppose the user is listening to songs by a favorite artist using a music player or the like. In this case, the atmosphere forming system 10 can be used, for example, to change the lighting color dramatically in accordance with characteristic keywords extracted from the lyrics of the music content. On the other hand, when the user is listening to songs by an artist they do not like, the lighting color can, for example, be changed gently.
For example, suppose the user is playing movie content on a video playback device and watching the movie on a television screen. In this case, the information processing apparatus 60 can extract keywords suited to people and places from the movie footage, and characteristic keywords from dialogue and subtitles. The information processing apparatus 60 can also generate a time series graph by reflecting the user's preference information in the atmosphere parameters selected from these keywords, identify climactic points of the movie from the sound, and raise the output intensity of the atmosphere parameters. Using the atmosphere forming system 10, lighting, scent, and airflow are output while the lighting color, the scent released from the aroma diffuser, and the temperature and volume of the air sent from the air conditioner change along the movie's timeline. BGM matching each scene also plays from the speakers. The user can therefore watch the movie with a greater sense of immersion.
The atmosphere forming system according to the second embodiment of the present disclosure is configured as a system that enables combination processing when multiple pieces of output information exist for the same type of atmosphere parameter. For example, when multiple pieces of content are used simultaneously, or when multiple users use content, multiple personalized atmosphere tables may be generated. In such cases, multiple pieces of output information are generated for the same type of atmosphere parameter. In such situations, the atmosphere forming system according to this embodiment can combine the multiple pieces of output information to form the atmosphere.
- Color: blue (priority), green
- Scent: herbal (priority), woody
- Temperature: warm (25 degrees)
- Airflow: gentle
- Emphasis: yes
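One way the combination processing above could be sketched: parameters whose outputs can be blended are combined, while the rest fall back to the higher-priority content. Which parameters count as blendable, and the "+" blend notation, are illustrative assumptions:

```python
# Sketch: merge the atmosphere tables of two simultaneously used contents.
# Blendable parameters are combined; non-blendable ones take the value
# from the higher-priority content. The BLENDABLE set is an assumption.

BLENDABLE = {"color"}  # e.g. two light colors can be mixed

def merge_tables(primary, secondary):
    """primary: table of the higher-priority content; secondary: the other."""
    merged = {}
    for key in primary.keys() | secondary.keys():
        a, b = primary.get(key), secondary.get(key)
        if a is not None and b is not None and key in BLENDABLE:
            merged[key] = f"{a}+{b}"  # stand-in for a real blend operation
        else:
            merged[key] = a if a is not None else b
    return merged

table_a = {"color": "blue", "aroma": "herbal", "temperature": "warm"}
table_b = {"color": "green", "aroma": "woody", "wind": "gentle"}
print(merge_tables(table_a, table_b))
```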
<3.1. Overview of the Atmosphere Forming System>
First, an overview of the atmosphere forming system according to the third embodiment of the present disclosure will be described. The atmosphere forming system according to this embodiment is a system that forms an atmosphere in a space using atmosphere forming devices in conjunction with the presentation of content information, and is configured so that the user's operations are reflected when the atmosphere is formed. For example, while watching live footage of a favorite artist, the user may want the lighting to change more dramatically. Also, when forming an atmosphere in conjunction with the presentation of content information, the user may, for example, want to turn off the scent output during a meal. The atmosphere forming system according to this embodiment therefore allows the user to adjust the atmosphere formed in conjunction with the presentation of content information.
The hardware configuration of the atmosphere forming system according to this embodiment can be basically the same as that of the atmosphere forming system according to the first embodiment. In this embodiment, however, the input device 20 includes, as devices for detecting the user's operations, operation devices enabling setting operations of the system, an imaging device, a sound-collecting microphone, and the like.
User operations for adjusting the atmosphere include, for example, directly touching and operating the input device 20, voice operation by the user's speech, and gesture operation. Atmosphere adjustment involves situations where the type of atmosphere is selected and situations where an output value is changed, and the operations accepted by the information processing apparatus 60 may differ between these situations.
First, the atmosphere adjustment process in which the user directly touches and operates the input device 20 will be described. The input device 20 here is, for example, a device enabling setting operations of the atmosphere forming system 10, such as an operation device with a screen, e.g., a smartphone or tablet. The user motion detection unit 210 receives the user's operation input to these input devices 20 and sends the input information to the device management unit 270. The device management unit 270 can adjust the output intensity of desired atmosphere parameters or stop the output from some of the atmosphere forming devices 30 according to the user's operation input.
Next, the atmosphere adjustment process by voice operation through the user's speech will be described. This method is effective, for example, when the user is cooking or eating and cannot operate the input device 20 described above. FIG. 21 is an explanatory diagram showing the atmosphere adjustment process by voice operation. Suppose the user says "make it a little darker" while the lighting fixture 30a and the aroma diffuser 30b are outputting in conjunction with the presentation of content information. The user motion detection unit 210 receives the user's speech via the sound-collecting microphone and performs processing to extract keywords expressing the user's intention by known audio analysis, language analysis, and the like.
Next, the atmosphere adjustment process by user gestures will be described. Various motions are conceivable as gesture-based operation methods. Here, examples of adjusting the atmosphere by finger and gaze movements are described. User gestures such as finger and gaze movements can be detected, for example, by analyzing imaging information acquired via an imaging device.
Besides the above examples, the information processing apparatus 60 may automatically adjust the atmosphere according to the user's actions. For example, when the user makes a phone call, the information processing apparatus 60 may turn off the sound and vibration being output in conjunction with the presentation of content information, or weaken their output intensity. Specifically, when the user is using a specific device such as a telephone, the information processing apparatus 60 stops playback devices for content not in use, or stops the output of atmosphere forming devices 30 that are estimated from the user's operation status to require stopping. Alternatively, by linking the telephone device with the atmosphere forming system so that the information processing apparatus 60 keeps track of the usage state of each device, the information processing apparatus 60 executes appropriate adjustment processing according to the usage state of the devices. The information processing apparatus 60 can detect that the user is making a phone call by analyzing the imaging information.
The atmosphere forming system according to the fourth embodiment of the present disclosure is configured by adding, to the atmosphere forming systems according to the first to third embodiments, a function in which the information processing apparatus grasps the environment around the user and emphasizes the output of the atmosphere toward a specific location. The location where the atmosphere output is emphasized can be an object or a place. Examples include a glass, a mirror, a table, and a display, but various other objects and places may be used.
The fifth embodiment of the present disclosure is a multi-atmosphere output compatible device in which the content playback device itself is configured as an atmosphere forming system. FIG. 27 shows an example of an image display device 400 configured as a multi-atmosphere output compatible device. The image display device 400 is configured by providing a display device with an input device (not shown) and, as atmosphere forming devices 30, a lighting device, a blower, a vibration generation device, and a scent diffusion device, so that the atmosphere control process can be executed by an information processing apparatus provided together. The multi-atmosphere output compatible device according to this embodiment can, as a so-called all-in-one content playback device, easily be carried to any space the user desires.
Claims (20)
- An information processing apparatus comprising a control unit that controls an atmosphere forming device so that, in conjunction with the presentation of information on content used by a user, an atmosphere is formed by stimulation perceivable by the user that is different from the presentation of the information and that reflects the user's preference or operation.
- The information processing apparatus according to claim 1, wherein the control unit is configured to form the atmosphere based on a feature amount extracted from the content.
- The information processing apparatus according to claim 1, wherein the control unit is configured to change the output intensity of the atmosphere forming device according to the user's preference or operation.
- The information processing apparatus according to claim 1, wherein the control unit is configured to select the type of atmosphere forming device to be used according to the user's preference or operation.
- The information processing apparatus according to claim 1, wherein the control unit is configured to reflect the user's preferences in the atmosphere by referring to a database in which the user's preference information is stored.
- The information processing apparatus according to claim 5, wherein the user's preference information is associated with a preference intensity level, and the control unit is configured to vary the degree to which the user's preferences are reflected in the atmosphere according to the preference intensity.
- The information processing apparatus according to claim 5, wherein the user's preference information is associated with a preference intensity level, and, when there are a plurality of users using the content, the control unit is configured to reflect the users' preferences in the atmosphere while prioritizing the preference information of users with high preference intensity.
- The information processing apparatus according to claim 1, wherein, when a plurality of pieces of content are used simultaneously, the control unit is configured to form the atmosphere giving priority to the content designated by the user.
- The information processing apparatus according to claim 1, wherein, when a plurality of pieces of content are used simultaneously, the control unit is configured to form the atmosphere by combining the outputs of the corresponding atmospheres.
- The information processing apparatus according to claim 9, wherein, when a plurality of pieces of content are used simultaneously, the control unit is configured to form, for atmospheres whose outputs cannot be combined, the atmosphere selected according to the priority of the content.
- The information processing apparatus according to claim 1, wherein, when there are a plurality of users using the content, the control unit is configured to form the atmosphere preferentially reflecting the preferences of a selected specific user.
- The information processing apparatus according to claim 1, wherein the control unit is configured to form the atmosphere in which the user's preference or operation is reflected in atmosphere parameters corresponding to feature amounts extracted from the content.
- The information processing apparatus according to claim 2, wherein the control unit is configured to determine, based on the change rate of the output of the atmosphere forming device, the content to be the target of forming the atmosphere in conjunction with the presentation of the content information.
- The information processing apparatus according to claim 1, wherein the control unit is configured to control the output timing of the atmosphere forming device according to the perception grasp characteristic of each atmosphere.
- The information processing apparatus according to claim 1, wherein the control unit is configured to control the output timing of the atmosphere forming device according to the distance between the user and the atmosphere forming device.
- The information processing apparatus according to claim 1, wherein the control unit is configured to adjust the atmosphere based on an input operation, a voice input operation, or a gesture operation by the user.
- The information processing apparatus according to claim 1, wherein the control unit is configured to change the atmosphere according to the user's intention obtained by analyzing the operations of the user using the content.
- The information processing apparatus according to claim 1, wherein the atmosphere forming device is at least one of a lighting fixture, an aroma diffuser, a speaker, a fan, an air conditioner, and a vibration generation device.
- An information processing method in which a control device controls an atmosphere forming device so that, in conjunction with the presentation of information on content used by a user, an atmosphere is formed by stimulation perceivable by the user that is different from the presentation of the information and that reflects the user's preference or operation.
- A program that causes a computer to execute a function of controlling an atmosphere forming device so that, in conjunction with the presentation of information on content used by a user, an atmosphere is formed by stimulation perceivable by the user that is different from the presentation of the information and that reflects the user's preference or operation.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP15811430.6A EP3163891A4 (en) | 2014-06-24 | 2015-04-27 | Information processing apparatus, information processing method, and program |
JP2016529148A JP6504165B2 (ja) | 2014-06-24 | 2015-04-27 | 情報処理装置及び情報処理方法並びにプログラム |
CN201580026384.7A CN106416278A (zh) | 2014-06-24 | 2015-04-27 | 信息处理装置、信息处理方法及程序 |
US15/313,657 US20170214962A1 (en) | 2014-06-24 | 2015-04-27 | Information processing apparatus, information processing method, and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014-129194 | 2014-06-24 | ||
JP2014129194 | 2014-06-24 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015198716A1 true WO2015198716A1 (ja) | 2015-12-30 |
Family
ID=54937815
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/062725 WO2015198716A1 (ja) | 2014-06-24 | 2015-04-27 | 情報処理装置及び情報処理方法並びにプログラム |
Country Status (5)
Country | Link |
---|---|
US (1) | US20170214962A1 (ja) |
EP (1) | EP3163891A4 (ja) |
JP (1) | JP6504165B2 (ja) |
CN (1) | CN106416278A (ja) |
WO (1) | WO2015198716A1 (ja) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109109624A (zh) * | 2018-07-12 | 2019-01-01 | Geely Automobile Research Institute (Ningbo) Co., Ltd. | *** for intelligently controlling a gas generation device |
JP2020511044A (ja) * | 2017-06-21 | 2020-04-09 | Z5X Global FZ-LLC | Content interaction system and method |
JP2020077983A (ja) * | 2018-11-08 | 2020-05-21 | SKY Perfect JSAT Corporation | Home appliance control device, display device, broadcasting device, and control system |
US10743087B2 (en) | 2017-06-21 | 2020-08-11 | Z5X Global FZ-LLC | Smart furniture content interaction system and method |
JP2023509506A (ja) * | 2020-01-14 | 2023-03-08 | NEC Corporation | Control device, control method, and program |
WO2023190206A1 (ja) * | 2022-03-31 | 2023-10-05 | Sony Group Corporation | Content presentation system, content presentation program, and content presentation method |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPWO2017094326A1 (ja) * | 2015-11-30 | 2018-09-13 | Sony Corporation | Information processing device, information processing method, and program |
US10110851B2 (en) * | 2016-05-06 | 2018-10-23 | Avaya Inc. | System and method for dynamic light adjustment in video capture |
JP6774018B2 (ja) * | 2016-09-15 | 2020-10-21 | Fuji Xerox Co., Ltd. | Dialogue device |
CN109891357A (zh) * | 2016-10-20 | 2019-06-14 | 阿恩齐达卡士技术私人有限公司 | Emotionally intelligent companion device |
WO2018163700A1 (ja) * | 2017-03-07 | 2018-09-13 | Sony Corporation | Content presentation system, content presentation device, and wind presentation device |
US10942569B2 (en) * | 2017-06-26 | 2021-03-09 | SonicSensory, Inc. | Systems and methods for multisensory-enhanced audio-visual recordings |
CN108332360B (zh) * | 2018-01-15 | 2020-12-01 | GD Midea Air-Conditioning Equipment Co., Ltd. | Fragrance spraying method for an air conditioner, air conditioner, and computer-readable storage medium |
CN111819565A (zh) * | 2018-02-27 | 2020-10-23 | Panasonic Intellectual Property Management Co., Ltd. | Data conversion ***, data conversion method, and program |
JP7080078B2 (ja) * | 2018-03-19 | 2022-06-03 | Honda Motor Co., Ltd. | Information providing system, information providing method, and program |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003324402A (ja) * | 2002-05-07 | 2003-11-14 | Nippon Hoso Kyokai <Nhk> | External-device-linked content generation apparatus, method, and program therefor, and external-device-linked content reproduction apparatus, method, and program therefor |
JP2010034687A (ja) * | 2008-07-25 | 2010-02-12 | Sharp Corp | Additional data generation system |
JP2011040963A (ja) * | 2009-08-10 | 2011-02-24 | Toshiba Corp | Video display device and illumination control method |
JP2011259354A (ja) * | 2010-06-11 | 2011-12-22 | Sharp Corp | Viewing environment control system, transmitting device, and receiving device |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006345486A (ja) * | 2005-05-10 | 2006-12-21 | Victor Co Of Japan Ltd | External device control apparatus and device control system |
US8510778B2 (en) * | 2008-06-27 | 2013-08-13 | Rovi Guides, Inc. | Systems and methods for ranking assets relative to a group of viewers |
KR101078641B1 (ko) * | 2008-07-14 | 2011-11-01 | Myongji University Industry and Academia Cooperation Foundation | Multimedia application system and method using metadata related to sensory reproduction devices |
WO2010007987A1 (ja) * | 2008-07-15 | 2010-01-21 | Sharp Corporation | Data transmitting device, data receiving device, data transmitting method, data receiving method, and viewing environment control method |
WO2010008234A2 (ko) * | 2008-07-16 | 2010-01-21 | Electronics and Telecommunications Research Institute | Sensory effect representation method and apparatus, and computer-readable recording medium on which sensory device capability metadata are recorded |
CN102282849A (zh) * | 2009-01-27 | 2011-12-14 | Sharp Corporation | Data transmission device, data transmission method, audiovisual environment control device, audiovisual environment control method, and audiovisual environment control *** |
KR20110111251A (ko) | 2010-04-02 | 2011-10-10 | Electronics and Telecommunications Research Institute | Method and apparatus for providing metadata for sensory effects, computer-readable recording medium on which metadata for sensory effects are recorded, and sensory reproduction method and apparatus |
US9959400B2 (en) * | 2010-06-25 | 2018-05-01 | Philips Lighting Holding B.V. | Controlling the access to a user interface for atmosphere control with an atmosphere creation system |
US8949901B2 (en) * | 2011-06-29 | 2015-02-03 | Rovi Guides, Inc. | Methods and systems for customizing viewing environment preferences in a viewing environment control application |
US8928811B2 (en) * | 2012-10-17 | 2015-01-06 | Sony Corporation | Methods and systems for generating ambient light effects based on video content |
US8984568B2 (en) * | 2013-03-13 | 2015-03-17 | Echostar Technologies L.L.C. | Enhanced experience from standard program content |
- 2015
- 2015-04-27 CN CN201580026384.7A patent/CN106416278A/zh active Pending
- 2015-04-27 EP EP15811430.6A patent/EP3163891A4/en not_active Withdrawn
- 2015-04-27 US US15/313,657 patent/US20170214962A1/en not_active Abandoned
- 2015-04-27 WO PCT/JP2015/062725 patent/WO2015198716A1/ja active Application Filing
- 2015-04-27 JP JP2016529148A patent/JP6504165B2/ja active Active
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2020511044A (ja) * | 2017-06-21 | 2020-04-09 | Z5X Global FZ-LLC | Content interaction system and method |
US10743087B2 (en) | 2017-06-21 | 2020-08-11 | Z5X Global FZ-LLC | Smart furniture content interaction system and method |
US10990163B2 (en) | 2017-06-21 | 2021-04-27 | Z5X Global FZ-LLC | Content interaction system and method |
US11009940B2 (en) | 2017-06-21 | 2021-05-18 | Z5X Global FZ-LLC | Content interaction system and method |
US11194387B1 (en) | 2017-06-21 | 2021-12-07 | Z5X Global FZ-LLC | Cost per sense system and method |
US11509974B2 (en) | 2017-06-21 | 2022-11-22 | Z5X Global FZ-LLC | Smart furniture content interaction system and method |
CN109109624A (zh) * | 2018-07-12 | 2019-01-01 | Geely Automobile Research Institute (Ningbo) Co., Ltd. | *** for intelligently controlling a gas generation device |
JP2020077983A (ja) * | 2018-11-08 | 2020-05-21 | SKY Perfect JSAT Corporation | Home appliance control device, display device, broadcasting device, and control system |
JP2023509506A (ja) * | 2020-01-14 | 2023-03-08 | NEC Corporation | Control device, control method, and program |
JP7388562B2 (ja) | 2020-01-14 | 2023-11-29 | NEC Corporation | Control device, control method, and program |
WO2023190206A1 (ja) * | 2022-03-31 | 2023-10-05 | Sony Group Corporation | Content presentation system, content presentation program, and content presentation method |
Also Published As
Publication number | Publication date |
---|---|
CN106416278A (zh) | 2017-02-15 |
JPWO2015198716A1 (ja) | 2017-04-20 |
JP6504165B2 (ja) | 2019-04-24 |
EP3163891A4 (en) | 2017-10-25 |
EP3163891A1 (en) | 2017-05-03 |
US20170214962A1 (en) | 2017-07-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6504165B2 (ja) | Information processing apparatus, information processing method, and program | |
EP2229228B1 (en) | System and method for automatically creating an atmosphere suited to social setting and mood in an environment | |
CN111372109B (zh) | Smart television and information interaction method | |
US11647261B2 (en) | Electrical devices control based on media-content context | |
JP2004527809A (ja) | Environment-responsive user interface/entertainment device that simulates personal interaction | |
JP2015525417A (ja) | Selection and conveyance of supplemental content | |
TW201821946A (zh) | Data transmission system and method thereof | |
CN114128299A (zh) | Template-based excerpting and presentation of multimedia performances | |
Jalal et al. | Enhancing TV broadcasting services: A survey on mulsemedia quality of experience | |
US20200304882A1 (en) | Method and device for controlling the setting of at least one audio and/or video parameter, corresponding terminal and computer program | |
Epelde et al. | Providing universally accessible interactive services through TV sets: implementation and validation with elderly users | |
KR100934690B1 (ko) | Ubiquitous home media reproduction method and service method based on single media and multiple devices | |
WO2022249522A1 (ja) | Information processing device, information processing method, and information processing system | |
US20220164024A1 (en) | User-driven adaptation of immersive experiences | |
JP7170884B2 (ja) | Determining a light effect based on the degree of speech in media content | |
Esau-Held et al. | “Foggy sounds like nothing”—enriching the experience of voice assistants with sonic overlays | |
CN113424659B (zh) | Enhancing a user's recognition of light scenes | |
CN109491499A (zh) | Electrical appliance control method and apparatus, electrical appliance, and medium | |
CN113056066A (zh) | Lighting adjustment method, device, *** and storage medium based on a television program | |
CN112883144A (zh) | Information interaction method | |
CN113055748A (zh) | Lighting adjustment method, device, *** and storage medium based on a television program | |
WO2024125478A1 (zh) | Audio presentation method and device | |
CN118012550A (zh) | Player background image control method and apparatus, electronic device, and storage medium | |
Hinde | Concurrency in auditory displays for connected television | |
Dumoulin et al. | Movie's affect communication using multisensory modalities |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15811430 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2016529148 Country of ref document: JP Kind code of ref document: A |
|
REEP | Request for entry into the european phase |
Ref document number: 2015811430 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2015811430 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15313657 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |