WO2015186534A1 - Information processing apparatus and method, and program - Google Patents
Information processing apparatus and method, and program
- Publication number
- WO2015186534A1 (PCT/JP2015/064676)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- communication
- message
- information processing
- display
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/07—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail characterised by the inclusion of specific contents
- H04L51/10—Multimedia information
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F13/00—Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/147—Digital output to display device ; Cooperation and interconnection of the display device with other functional units using display panels
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/10—Office automation; Time management
- G06Q10/107—Computer-aided management of electronic mailing [e-mailing]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/21—Monitoring or handling of messages
- H04L51/216—Handling conversation history, e.g. grouping of messages in sessions or threads
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/52—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail for supporting social networking services
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M11/00—Telephonic communication systems specially adapted for combination with other electrical systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/174—Facial expression recognition
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2310/00—Command of the display device
- G09G2310/02—Addressing, scanning or driving the display screen or processing steps related thereto
- G09G2310/0264—Details of driving circuits
- G09G2310/027—Details of drivers for data electrodes, the drivers handling digital grey scale data, e.g. use of D/A converters
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/04—Changes in size, position or resolution of an image
- G09G2340/0407—Resolution change, inclusive of the use of different resolutions for different screen areas
- G09G2340/0435—Change or adaptation of the frame rate of the video stream
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
- H04M1/72427—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality for supporting games or graphical animations
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
- H04M1/7243—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
- H04M1/72436—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages for text messaging, e.g. short messaging services [SMS] or e-mails
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/60—Details of telephonic subscriber devices logging of communication history, e.g. outgoing or incoming calls, missed calls, messages or URLs
Definitions
- As a means of communication, communication tools that use various kinds of data such as images and sounds, in addition to text data, have been developed (see, for example, Patent Document 1). For example, the network system described in Patent Document 1 provides a chat system in which an avatar image reflecting a user's facial expression is sent and received together with a message.
- However, the technique of Patent Document 1 is not sufficient as a method of communication, and other methods have been sought.
- This disclosure has been made in view of such a situation, and is intended to improve the expressiveness of communication.
- One aspect of the present technology is an information processing apparatus including a display control unit that causes both a first display representing a communication history between users and a second display representing a communication history between a virtual character, which can respond on behalf of a user, and the user to be displayed on the display unit, arranged along the same time series.
- The communication history between users can include a history of communication between a first user, who is a predetermined user, and users other than the first user. The communication history between a virtual character and a user can include a history of communication between the first user and a virtual character corresponding to another user, or a history of communication between a virtual character corresponding to the first user and another user.
- The communication history can further include a third display expressing an emotion assigned when each message was exchanged, and the display control unit can cause the display unit to display, together with each message, the emotional expression assigned when that message was exchanged.
- The emotional expression can include a facial expression of the user or virtual character that sent the message, and the display control unit can cause the display unit to display, together with each message, a face image of the sending user or virtual character expressing the emotion of the message.
- The emotional expression can include an effect image representing the emotion of the message, and the display control unit can cause the display unit to display such an effect image together with each message.
- The emotional expression can include a balloon shape representing the emotion of the message, and the display control unit can display, together with each message, a balloon whose shape represents the emotion of the message.
- the emotional expression can be based on information obtained by sensing the first user or the other user who is the recipient of the message when the message is exchanged.
- The display control unit can display, in a mutually distinguishable state, messages sent from the first user to another user or to the virtual character corresponding to that user, and messages sent from the virtual character corresponding to the first user to another user. It can likewise display, in a mutually distinguishable state, messages sent from another user to the first user or to the virtual character corresponding to the first user, and messages sent from the virtual character corresponding to that other user to the first user.
- The apparatus can further include an instruction receiving unit that receives an instruction to rearrange the messages. When the instruction receiving unit receives such an instruction, the display control unit can rearrange the messages according to the specified rearrangement condition and display them on the display unit.
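The rearrangement step could be sketched as a dispatch over selectable conditions. The condition names and message fields below are hypothetical illustrations; the patent only states that the messages are rearranged according to the rearrangement conditions received by the instruction receiving unit.

```python
def rearrange(messages, condition):
    """Reorder messages according to a rearrangement condition.

    Each message is assumed to be a dict with "timestamp", "sender" and
    "emotion" keys (hypothetical fields, for illustration only).
    """
    if condition == "newest-first":
        return sorted(messages, key=lambda m: m["timestamp"], reverse=True)
    if condition == "by-sender":
        return sorted(messages, key=lambda m: (m["sender"], m["timestamp"]))
    if condition == "by-emotion":
        return sorted(messages, key=lambda m: (m["emotion"], m["timestamp"]))
    return sorted(messages, key=lambda m: m["timestamp"])  # default: oldest first
```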
- Both a first display representing the history of communication between the user of the information processing apparatus itself and the user of another information processing apparatus or the virtual character corresponding to that user, and a second display representing the history of communication between the virtual character corresponding to the user of the apparatus itself and the user of the other information processing apparatus, can be displayed on the display unit arranged along the same time series.
- The communication processing unit can realize the communication while a communication screen, which is a screen for communication, is displayed on the display unit, and the display control unit can cause a history display screen different from the communication screen to display both the first display and the second display arranged along the same time series.
- The communication processing unit can display, on the communication screen, a face image of the user of the other information processing apparatus or of the virtual character corresponding to that user, who is the communication partner, together with messages sent from that user or virtual character.
- The communication processing unit can further display, on the communication screen, the emotional expression assigned to a message displayed on the communication screen.
- As the emotional expression assigned to a message sent from the user of the other information processing apparatus or from the virtual character corresponding to that user, the communication processing unit can display on the communication screen a face image of that user or virtual character with an expression representing the emotion of the message.
- As the emotional expression assigned to a message sent from the user of the other information processing apparatus or from the virtual character corresponding to that user, the communication processing unit can display an effect image representing the emotion of the message at an arbitrary position on the communication screen.
- One aspect of the present technology is also an information processing method for displaying, on the display unit and arranged along the same time series, both a first display representing a communication history between users and a second display representing a communication history between a virtual character that can respond on behalf of a user and the user.
- One aspect of the present technology is also a program for causing a computer to function as a display control unit that causes both a first display representing a history of communication between users and a second display representing a history of communication between a virtual character that can respond on behalf of a user and the user to be displayed on the display unit, arranged along the same time series.
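As a rough illustration of this idea, the two histories can be merged by timestamp so that both the user-to-user exchange (first display) and the user-to-virtual-character exchange (second display) appear on one timeline. The record type and identifiers below are hypothetical; the patent does not specify a data model.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Message:
    timestamp: datetime
    sender: str   # e.g. "user:101-1" or "character:103-2" (hypothetical identifiers)
    text: str
    kind: str     # "user-user" (first display) or "user-character" (second display)

def merged_timeline(user_history, character_history):
    """Arrange both histories along the same time series, oldest first."""
    return sorted(user_history + character_history, key=lambda m: m.timestamp)

history_a = [Message(datetime(2015, 6, 1, 10, 0), "user:101-1", "Hello", "user-user")]
history_b = [Message(datetime(2015, 6, 1, 9, 55), "character:103-2", "Hi!", "user-character")]

for m in merged_timeline(history_a, history_b):
    print(m.timestamp, m.kind, m.sender, m.text)
```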
- The number of terminal devices 131 included in the communication system 100 is arbitrary. Further, a plurality of types of information processing devices, such as mobile phones and smartphones, may be used as the terminal devices 131.
- the input / output interface 160 is also connected to the bus 154.
- An input unit 161, an output unit 162, a storage unit 163, a communication unit 164, and a drive 165 are connected to the input / output interface 160.
- the output unit 162 includes an output device that outputs information such as images and sounds.
- the output unit 162 includes a display, a speaker, an output terminal, and the like.
- the user management server 111 implements functional blocks such as a user information management unit 181 and a friend management unit 182.
- the user information management unit 181 performs processing related to management of user information that is information related to the user 101.
- the content of the user information is arbitrary.
- the user information may include personal information of the user 101, an ID and password for logging in to the communication system 100, and other various setting information.
- the friend management unit 182 performs processing related to management of friends who are communication partners of each user set by each user 101.
- each user 101 can communicate with another user 101 set as a friend (or the virtual character 103 corresponding to the other user 101).
- Therefore, the user 101 must register in advance, with the user management server 111 (friend management unit 182), another user 101 as a friend who will be a communication partner (or whose corresponding virtual character 103 will be a communication partner).
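This prerequisite can be sketched as a simple per-user friend set; the actual friend management unit 182 is not specified beyond this behavior, so everything below is an illustrative assumption.

```python
class FriendManager:
    """Sketch of the friend management unit 182: communication is permitted
    only with users registered in advance as friends."""

    def __init__(self):
        self.friends = {}  # user_id -> set of friend user_ids

    def register(self, user, friend):
        self.friends.setdefault(user, set()).add(friend)

    def can_communicate(self, user, partner):
        # A partner (or the partner's virtual character) is reachable only
        # if the partner was registered as a friend beforehand.
        return partner in self.friends.get(user, set())
```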
- the terminal device 131 includes an SOC (System-on-a-Chip) 201, a RAM (Random Access Memory) 202, an input unit 211, an output unit 212, a storage unit 213, a communication unit 214, and A drive 215 is included.
- the input unit 211 includes various input devices.
- the input unit 211 includes, for example, an operation unit 221, a touch panel 222, a photographing unit 223, an audio input unit 224, a sensor unit 225, and an input terminal 226.
- the operation unit 221 includes, for example, arbitrary input devices operated by the user such as keys, buttons, switches, and levers.
- The operation unit 221 accepts user input through user operations on these input devices, and supplies the accepted user input to, for example, the SOC 201.
- The touch panel 222 is formed, for example, so as to overlap a display unit 231 described later. It receives information (that is, user input) indicated by the position and the like of a user operation (for example, movement of a user's finger or a stylus pen) performed on a GUI (Graphical User Interface) displayed on the display unit 231, and supplies the received user input to, for example, the SOC 201.
- the photographing unit 223 has, for example, a lens, a diaphragm, an image sensor, and the like, obtains a photographed image by photographing a subject, and supplies the obtained photographed image data to the SOC 201, for example.
- The audio input unit 224 has an audio input device such as a microphone, for example; it receives audio input and supplies the received audio data to the SOC 201 or the like.
- the sensor unit 225 includes, for example, various sensors such as an acceleration sensor, an optical sensor, and a temperature sensor, obtains information corresponding to the sensor through sensing, and supplies the obtained various sensor information to, for example, the SOC 201 and the like.
- The input terminal 226 accepts an arbitrary signal input from the outside, such as an analog signal, and supplies the input data to the SOC 201 or the like.
- the drive 215 drives a removable medium 251 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and reads information stored in the removable medium 251 or writes information in the removable medium 251.
- this program can be installed in advance in the ROM or the storage unit 213 in the SOC 201.
- the user 101 operates the terminal device 131 to communicate with the user management server 111 and the virtual character management server 112, and performs processing such as registration of user information and setting information of the virtual character 103 as an initial setting.
- the initial setting unit 261 accepts registration of an image (icon) in step S103.
- This “image (icon)” is an image that represents the user 101, the virtual character 103 corresponding to the user 101, and their emotions on the conversation screen displayed on the display unit 231 during communication. Details of the conversation screen will be described later.
- the initial setting unit 261 causes the display unit 231 to display an icon registration screen 310 as shown in FIG.
- When new photographing is selected, a newly captured image is registered as the image (icon); that is, photographing is performed and the resulting captured image is registered. Alternatively, an image registered in the album function (for example, a captured image acquired in the past or an image downloaded from another device) can be used; in this case, an image selected from the image group registered in the album function is registered as the image (icon).
- Selection buttons are provided for each emotion (selection button 332-1 to selection button 332-4).
- the selection buttons 332-1 to 332-4 are referred to as selection buttons 332 when there is no need to distinguish them from each other.
- Each selection button 332 may display a message, a picture, or the like (such as “select” in the example of the figure) indicating that an image is selected when the user 101 operates the button. In this way, the user 101 can more easily grasp the meaning of operating the selection button 332 (that is, the content of the processing performed when the user 101 operates it).
- the user information management unit 181 of the user management server 111 notifies the terminal device 131 that the user information has been successfully registered. Upon receiving the notification, the initial setting unit 261 of the terminal device 131 completes the account registration.
- By expressing emotions using effect images (effects), it is possible to increase the expressive power of communication compared with communication that simply transmits and receives messages.
- the message analysis unit 191 of the analysis server 114 acquires the message and the like in step S231, and analyzes the message in step S232.
- the emotion setting unit 192 sets an emotion corresponding to the message based on the message analysis result.
- the effect setting unit 193 sets an effect (effect image) corresponding to the set emotion.
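The flow of these steps (analyze the message, set an emotion, set an effect) can be sketched as a small pipeline. The keyword table and effect file names below are purely illustrative assumptions; the actual analysis performed by the message analysis unit 191 is not described at this level of detail.

```python
# Illustrative emotion keywords and effect images (hypothetical values).
EMOTION_KEYWORDS = {
    "happy": ["yay", "great", "glad"],
    "angry": ["angry", "mad"],
    "excited": ["bring it on"],
}
EFFECTS = {"happy": "sparkle.png", "angry": "flame.png", "excited": "explosion.png"}

def analyze_message(text):
    """Analyze the message text and return an emotion label."""
    lowered = text.lower()
    for emotion, words in EMOTION_KEYWORDS.items():
        if any(w in lowered for w in words):
            return emotion
    return "neutral"

def set_effect(emotion):
    """Map the emotion to an effect image, or None if no effect applies."""
    return EFFECTS.get(emotion)
```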
- For example, in response to a received message “Donto koi daa!” (“Bring it on!”), an effect image evoking an explosion is displayed over the entire communication display area 363 (the image display area 381, the sent message display area 382, the received message display area 383, and so on).
- On the conversation screen 360 on the message transmitting side, the effect image (effect) may be displayed only in the image display area 381, as shown in the figure.
- the communication management server 113 manages such communication using information of the user management server 111, for example. Also, the communication management server 113 creates a message for the virtual character 103 using the virtual character management server 112. Further, the communication management server 113 uses the analysis server 114 to generate and add an emotion expression corresponding to the message of the virtual character 103.
- the emotional expression of the virtual character 103 is displayed on the display unit 231 of the terminal device 131.
- An emotional expression is generated based on the transmitted or received message. Therefore, emotional expression for the virtual character 103 can be performed in the same manner as for the user 101. That is, the expressiveness of communication can be improved.
- the emotional expression ability of the virtual character 103 may be increased.
- step S291 the communication processing unit 263 of the terminal device 131 that is the transmission source of the stamp image controls, for example, the operation unit 221 and receives a message input.
- the user 101 inputs a message on the conversation screen 360 displayed on the display unit 231 of the terminal device 131.
- The message analysis unit 191 of the analysis server 114 acquires the stamp image and the like in step S311, and analyzes the stamp image in step S312. That is, it determines whether or not the stamp image is the predetermined stamp image 421. In step S313, the message analysis unit 191 supplies the stamp image and its analysis result to the virtual character management server 112.
- In step S321, the setting management unit 183 of the virtual character management server 112 acquires the supplied stamp image and its analysis result. If the supplied stamp image is the predetermined stamp image 421, the setting management unit 183 counts the number of acquired stamp images 421 in step S322. In step S323, the setting management unit 183 updates the setting of the virtual character 103 in accordance with the tally (that is, the number of acquired stamp images 421). For example, as shown in FIG. 30, the images (expressions) used to represent each emotion are updated (the more stamp images that have been obtained, the more images can be used).
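Steps S312, S322 and S323 amount to counting a designated stamp and unlocking expression images at certain counts. The thresholds and identifiers below are hypothetical; the patent does not give concrete numbers.

```python
# (stamps collected, expression images usable) — hypothetical thresholds.
UNLOCK_THRESHOLDS = [(0, 2), (5, 4), (10, 8)]

class VirtualCharacterSettings:
    def __init__(self):
        self.stamp_count = 0

    def receive_stamp(self, stamp_id, designated_id="stamp-421"):
        if stamp_id == designated_id:      # step S312: is it the predetermined stamp 421?
            self.stamp_count += 1          # step S322: count acquired stamps
        return self.usable_expressions()   # step S323: update the character setting

    def usable_expressions(self):
        usable = 0
        for threshold, count in UNLOCK_THRESHOLDS:
            if self.stamp_count >= threshold:
                usable = count
        return usable
```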
- step S317 the effect setting unit 193 supplies the communication management server 113 with information related to the set emotion and effect.
- When the communication management unit 189 of the communication management server 113 acquires the information about the emotion and effect in step S303, it acquires, based on the user information managed by the user information management unit 181 of the user management server 111, an image showing the virtual character 103 corresponding to the emotion, and generates the conversation screen 630 using that image, the effect, and the like.
- In the above, communication between the user 101 and another user's virtual character 103 has been described. Communication between the user 101 and the user 101's own virtual character 103 can be processed in the same manner as the above-described communication between the user 101 and another user's virtual character 103. That is, the above description can also be applied to communication between the user 101 and the user 101's own virtual character 103.
- The user 101 may participate midway in communication between another user and the user 101's own virtual character 103 (that is, take the place of the virtual character 103).
- an image for accepting midway participation may be displayed on the display unit 231 of the terminal device 131 of the user 101.
- The user 101-2, who is another user, operates the terminal device 131-2 to access the server 110 and communicates with the virtual character 103-1 of the user 101-1.
- In this state, the user 101-1 may operate the terminal device 131-1 to access the server 110 and participate in the communication midway.
- When such midway participation takes place, the state shown in the example of FIG. 19 results, and communication between the users is performed.
- step S361 the communication processing unit 263 of the terminal device 131-2 that is the message transmission source, for example, controls the operation unit 221 or the like to receive the message input.
- the user 101-2 inputs a message on the conversation screen 360 displayed on the display unit 231 of the terminal device 131-2.
- step S344 the communication management unit 189 supplies the data of the conversation confirmation screen to the terminal device 131-1 on the virtual character side.
- step S331 the interrupt processing unit 264 of the terminal device 131-1 acquires the data of the conversation confirmation screen.
- step S332 the communication processing unit 263 of the terminal device 131-1 displays the conversation confirmation screen on the display unit 231.
- the exchanged message may be displayed as in the conversation confirmation screen 432 shown in FIG.
- the conversation confirmation screen 432 is provided with a GO button 432A.
- the user 101-1 becomes the communication partner of the user 101-2 in place of the virtual character 103-1, and communication between the users is started.
- the conversation confirmation screen 434 is shown in FIG.
- the shoji screen is half open, and a design that looks like a musical note mark is visible behind it.
- the user 101-1 becomes the communication partner of the user 101-2 in place of the virtual character 103-1, and communication between the users starts (that is, the user 101-1 participates midway).
- the conversation confirmation screen 434 is shown in FIG.
- the image is like a shoji screen.
- the user 101-1 becomes the communication partner of the user 101-2 in place of the virtual character 103-1, and communication between the users starts (that is, the user 101-1 participates midway).
- depending on the situation, a display prompting the user 101 to participate midway may be made.
- in step S346, the state management unit 188 of the communication management server 113 updates the states of the user 101-1 and its virtual character 103-1. That is, the user 101-1 is set as the communication partner.
- in step S347, the communication management unit 189 of the communication management server 113 generates a conversation screen in which the user 101-1 is the communication partner and supplies it to the terminal device 131-2.
- the history management unit 190 of the communication management server 113 records and manages the message transmitted as described above and the emotion expression given to the message as a log.
- the log recording location may be the storage unit 163 of the communication management server 113 or another data server (not shown).
- the user 101 can do this by operating the terminal device 131 to exchange information with the user management server 111, the virtual character management server 112, the communication management server 113, and the like.
- the mode management unit 185 of the virtual character management server 112 receives the block mode setting instruction in step S391 and, in step S392, sets the virtual character 103-1 corresponding to the user 101-1 to block mode with respect to the user 101-2 in accordance with the instruction.
- in step S401, the square management unit 187 of the communication management server 113 acquires the block mode setting instruction.
- in step S402, the square management unit 187 sets the image of the friend (user 101-2) who is the target of the block mode on the square screen of the requesting user 101-1 to the block-mode state.
- in step S403, the square management unit 187 transmits the updated square screen to the terminal device 131-1.
- in step S404, the square management unit 187 of the communication management server 113 sets the image of the friend who requested the block mode (user 101-1) on the square screen of the user 101-2 to the image of the virtual character 103.
- in step S405, the square management unit 187 transmits the updated square screen to the terminal device 131-2.
- in step S381, the square processing unit 262 of the terminal device 131-2 receives the updated square screen and, in step S382, causes the display unit 231 to display it. That is, a square screen in which the image of the friend who requested the block mode is shown as a virtual character is displayed on the display unit 231 of the terminal device 131-2.
- in step S421, the square processing unit 262 of the terminal device 131-2 receives the updated square screen and, in step S382, causes the display unit 231 to display it. That is, a square screen in which the image of the friend who requested the block mode has been returned to that of the user 101 is displayed on the display unit 231 of the terminal device 131-2.
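The square-screen updates in this block-mode sequence amount to choosing which image is shown for each friend. The following is a minimal illustrative sketch, not the patent's implementation; the dictionary keys and state names are assumptions for the example.

```python
def square_icon(friend, block_mode_active):
    """Return the image shown for a friend on the square screen.

    While block mode is active, the friend who requested it is
    represented by their virtual character's image instead of
    their own user image.
    """
    return friend["avatar_image"] if block_mode_active else friend["user_image"]
```

When the block mode is later canceled, calling the same function with `block_mode_active=False` restores the user's own image, matching the two update flows described above.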
- in step S451, the mode setting unit 265 of the terminal device 131 accepts a friend deletion instruction. For example, when the user 101-1 inputs such an instruction and a friend deletion instruction for a predetermined other user is received, the mode setting unit 265 transmits the received friend deletion instruction to the user management server 111 in step S452.
- in step S471, the user information management unit 181 of the user management server 111 receives the friend deletion instruction and, in step S472, deletes the specified other user from the friends of the user 101 in accordance with the instruction.
- in step S473, the user information management unit 181 notifies the communication management server 113 of the deletion from the friends.
- the message history is basically not displayed on the conversation screen 360, but it may of course be made displayable. In that case, the message history may be displayed on a screen different from the conversation screen 360. In this way, the message history can be displayed while maintaining the above-described characteristics of the conversation screen 360. That is, in this case as well, the expressiveness of communication can be increased.
- a display example of the message history display screen 442 is shown in FIG. 42. As shown in FIG. 42, the message history display screen 442 may be wider than the display area of the display unit 231. In the example of FIG. 42, the area surrounded by the dotted frame 443 indicates the display area of the display unit 231. In this case, the user 101 can view the entire message history on the display unit 231 by sliding the message history display screen 442 up and down.
- both the first display, representing the history of communication between users, and the second display, representing the history of communication between a user and a virtual character that can respond on behalf of a user, may be displayed on the display unit 231 arranged along the same time series.
- the messages of the user 101 and the messages of the virtual character 103 are displayed in time series (for example, from top to bottom).
- in this way, the history can be viewed as a single time series, and the message history is easy to follow. Therefore, communication using both the user 101 and the virtual character 103 can be realized without reducing the visibility of the message history, and more varied communication can be performed. That is, the expressiveness of communication can be improved.
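The single-time-series behavior described above can be sketched as merging the two histories by timestamp. This is an illustrative sketch under stated assumptions, not the patent's implementation; the record fields (`timestamp`, `sender`, `text`) are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class HistoryEntry:
    timestamp: float  # time the message was exchanged
    sender: str       # e.g. "user", "partner", "own_avatar", "partner_avatar"
    text: str

def merge_histories(user_history, avatar_history):
    """Merge the user-to-user history and the user-to-virtual-character
    history into one list ordered along the same time series."""
    return sorted(user_history + avatar_history, key=lambda e: e.timestamp)
```

Because the two histories are sorted on one shared timestamp key, the merged list interleaves user messages and virtual-character messages exactly as the combined timeline display requires.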
- the history may include information other than messages; that is, it may be a general communication history.
- a question mark is added to the message 456, and an exclamation mark is added to the message 457.
- emotional expressions are not limited to facial images.
- the emotional expression may include a balloon shape representing the emotion of the message.
- a balloon having a shape representing the emotion of the message may be displayed on the display unit 231 together with each message.
- these messages can be distinguished from one another by changing the color (pattern) of the balloon.
- the message 451 and the message 457, displayed in white balloons on the left side of the message history display screen 442, are messages sent from the first user to another user or to a virtual character corresponding to another user.
- the message 453 and the message 455, displayed in hatched balloons on the left side of the message history display screen 442, are messages sent from the virtual character corresponding to the first user to another user.
- the message 452, the message 454, and the message 456, displayed in white balloons on the right side of the message history display screen 442, are messages sent from another user to the first user or to the virtual character corresponding to the first user.
- the message 458, displayed in a hatched balloon on the right side of the message history display screen 442, is a message sent from the virtual character corresponding to another user to the first user.
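The sender-dependent balloon styling described above can be illustrated with a small lookup table; the style tuples and sender-category names here are assumptions for the sketch, not values taken from the specification.

```python
# Map each sender category to a balloon style (side, fill) so that
# messages from users and from virtual characters stay distinguishable,
# as in the four cases on the message history display screen.
BALLOON_STYLES = {
    "first_user":     ("left",  "white"),    # first user -> other side
    "own_avatar":     ("left",  "hatched"),  # first user's virtual character
    "other_user":     ("right", "white"),    # other user -> first user
    "partner_avatar": ("right", "hatched"),  # other user's virtual character
}

def balloon_style(sender):
    """Return the (side, fill) balloon style for a sender category."""
    return BALLOON_STYLES[sender]
```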
- the communication may be realized with the communication screen displayed on the display unit 231, and a history display screen different from the communication screen may be displayed on the display unit 231, in which both the first display and the second display are shown arranged along the same time series.
- the communication history may be displayed on a message history display screen 442 different from the conversation screen 360 which is a communication screen.
- on the communication screen, the face image of the communication partner (the user of the other information processing apparatus or the virtual character corresponding to that user), one message sent from the user of the other information processing apparatus or the virtual character corresponding to that user, and one message sent to the user of the other information processing apparatus or the virtual character corresponding to that user may be displayed.
- the communication screen may be realized as the conversation screen 360 configured as described with reference to FIG.
- by using such a conversation screen 360, the expressiveness of communication can be increased.
- emotion expressions assigned to the messages displayed on the communication screen may also be displayed. As described with reference to FIG. 21, an emotion expression may be displayed on the conversation screen 360 in addition to the message. Using such emotion expressions increases the expressiveness of communication.
- as the emotion expression, a face image of the user of the other information processing apparatus, or of the virtual character corresponding to that user, with a facial expression representing the emotion of the message may be displayed on the communication screen.
- an image corresponding to the emotion estimated from the transmitted or received message may be selected from the images registered for each emotion during initial setup and displayed.
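Selecting a pre-registered image by estimated emotion could look like the following minimal sketch; the emotion labels and the fallback to a neutral image are assumptions for the example, not part of the specification.

```python
def select_emotion_image(registered_images, estimated_emotion):
    """Pick the face image registered for the estimated emotion at
    initial setup, falling back to the neutral image if no image was
    registered for that emotion."""
    return registered_images.get(estimated_emotion,
                                 registered_images["neutral"])
```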
- an effect image representing the emotion of the message may be displayed at an arbitrary position on the communication screen.
- this effect image may take any form.
- for example, it may be a predetermined stamp image representing the emotion, a design applied to the display area of a message in the communication display area 363 (any visual effect such as size, shape, pattern, or color), a design applied to the message text itself (any visual effect such as font type, size, shape, pattern, or color), or a presentation effect that changes the display of part or all of the communication display area 363.
- an effect image displayed over the entire communication display area 363, as shown in B of FIG. 25, may be displayed on the conversation screen 360 on the message receiving side, while on the conversation screen 360 on the message transmitting side it may be displayed only in the image display area 381.
- a display unit 231 may be further provided.
- in step S511, the history display control unit 266 of the terminal device 131 controls the operation unit 221, the touch panel 222, and the like to receive input of a history display instruction. For example, when input of a history display instruction by which the user 101 or the like requests display of the message history is received, the history display control unit 266 transmits the received history display instruction to the communication management server 113 in step S512.
- the history display screen is displayed as described above.
- in step S515, the history display control unit 266 of the terminal device 131 controls the operation unit 221, the touch panel 222, and the like to receive a history sort instruction requesting rearrangement of the history. For example, when input of a history sort instruction made by the user 101 or the like is received, the history display control unit 266 transmits the received history sort instruction to the communication management server 113 in step S516.
- in step S544, the history management unit 190 of the communication management server 113 receives the history sort instruction.
- in step S545, the history management unit 190 sorts the history on the history display screen under the conditions specified in the history sort instruction.
- in step S546, the history management unit 190 transmits the history display screen with the rearranged history to the terminal device 131.
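The server-side rearrangement in steps S544 to S546 can be sketched as a sort keyed by the condition carried in the instruction. The condition names and entry fields below are hypothetical, chosen only to illustrate the idea.

```python
def rearrange_history(entries, condition):
    """Sort history entries per the condition in the sort instruction.

    `entries` are dicts with 'timestamp' and 'sender' keys (assumed
    fields for this sketch).
    """
    if condition == "newest_first":
        return sorted(entries, key=lambda e: e["timestamp"], reverse=True)
    if condition == "by_sender":
        return sorted(entries, key=lambda e: (e["sender"], e["timestamp"]))
    # default: chronological order along the single time series
    return sorted(entries, key=lambda e: e["timestamp"])
```

The sorted list would then be rendered into the updated history display screen that is sent back to the terminal device.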
- in step S519, the history display control unit 266 of the terminal device 131 controls the operation unit 221, the touch panel 222, and the like to receive a history selection instruction specifying which history to display.
- the history display control unit 266 transmits the received history selection instruction to the communication management server 113 in step S520.
- in step S547, the history management unit 190 of the communication management server 113 receives the history selection instruction.
- in step S548, the history management unit 190 selects the history to be displayed on the history display screen under the conditions specified in the history selection instruction.
- in step S549, the history management unit 190 transmits the history display screen with the selected history to the terminal device 131.
- in step S521, the history display control unit 266 of the terminal device 131 receives the history display screen.
- in step S522, the history display control unit 266 causes the display unit 231 to display the history display screen with the selected history.
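The selection in steps S547 to S549 amounts to filtering the history by the condition in the selection instruction before the screen is regenerated. The predicate form below is an assumption for this sketch.

```python
def select_history(entries, predicate):
    """Keep only the history entries matching the selection condition,
    preserving their time-series order."""
    return [e for e in entries if predicate(e)]
```

For example, a selection instruction such as "show only virtual-character messages" would map to a predicate testing the sender category of each entry.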
- the program can be installed in the storage unit 163 by attaching the removable medium 171 to the drive 165.
- the program can be installed in the storage unit 213 by attaching the removable medium 251 to the drive 215.
- this program can also be installed in advance in a storage unit or ROM.
- in the case of the user management server 111 through the analysis server 114 (or the server 110), the program can be installed in advance in the storage unit 163, the ROM 152, or the like.
- in the case of the terminal device 131, the program can be installed in advance in the storage unit 213 or the ROM in the SOC 201.
- Such an application activation method is arbitrary: a predetermined command may be entered in a predetermined CLI (Command Line Interface), a predetermined icon may be operated (for example, tapped) in a predetermined GUI (Graphical User Interface), or other methods may be used.
- CLI: Command Line Interface
- GUI: Graphical User Interface
- the icon may be configured only by the outer frame (the line representing the outer shape) of the cloud-shaped design portion of the example shown in FIG. 44A.
- the color of the outer frame (the line representing the outer shape) is arbitrary. For example, it may be black or any other color.
- the color inside the outer frame is also arbitrary. For example, it may be white, another color, or transparent.
- an icon with a design such as the examples shown in FIG. 44A or FIG. 44B may be used on the various screens described above. For example, it may be used as the design of the transmission button 372 (FIG. 21) of the conversation screen 360. Of course, this icon design may be used for other parts of the conversation screen 360 or for any part of any screen other than the conversation screen 360. By using the icon design on screens displayed as the user interface of the activated application in this way, the user 101 can more easily grasp that the activated application corresponds to the icon that the user 101 operated.
- the program executed by the computer may be a program whose processing is performed in time series in the order described in this specification, or a program whose processing is performed in parallel or at necessary timings, such as when a call is made.
- the steps describing the program recorded on the recording medium are not limited to processing performed in chronological order according to the described sequence; they also include processing executed in parallel or individually.
- each of the steps described above can be executed by any of the devices described above or by any other device.
- the device that executes the process may have the functions (functional blocks and the like) necessary for executing the process described above.
- Information necessary for processing may be transmitted to the apparatus as appropriate.
- in this specification, a system means a set of a plurality of components (devices, modules (parts), and the like), regardless of whether all the components are in the same housing. Accordingly, a plurality of devices housed in separate housings and connected via a network, and a single device in which a plurality of modules are housed in one housing, are both systems.
- the present technology can take a configuration of cloud computing in which one function is shared by a plurality of devices via a network and is jointly processed.
- the plurality of processes included in one step can be executed not only by one apparatus but also shared and executed by a plurality of apparatuses.
- (1) An information processing apparatus comprising: a display control unit that causes a display unit to display both a first display representing a history of communication between users and a second display representing a history of communication between a user and a virtual character that can respond on behalf of the user, arranged along the same time series.
- (2) The history of communication between the users includes a history of communication between a first user, who is a predetermined user, and another user other than the first user, and
- the history of communication between the virtual character and the user includes a history of communication between the first user and the virtual character corresponding to the other user, or between the virtual character corresponding to the first user and the other user.
- The information processing apparatus according to (1).
- (5) The emotion expression includes a facial expression of a face image of the user or virtual character who is the sender of the message, and the display control unit causes the display unit to display, together with each message, a face image of the user or virtual character who is the sender of the message with a facial expression representing the emotion of the message. The information processing apparatus according to (4).
- (6) The emotion expression includes an effect image representing the emotion of the message, and the display control unit causes the display unit to display, together with each message, an effect image representing the emotion of the message. The information processing apparatus according to (4) or (5).
- (7) The emotion expression includes a balloon shape representing the emotion of the message, and
- the display control unit causes the display unit to display, together with each message, a balloon with a shape representing the emotion of the message. The information processing apparatus according to any one of (4) to (6).
- (8) The emotion expression is based on information obtained by sensing the first user or the other user, who is the recipient of the message, when the message was exchanged. The information processing apparatus according to any one of (4) to (7).
- (9) The display control unit causes the display unit to display, in a mutually distinguishable state, a message sent from the first user to the other user or to the virtual character corresponding to the other user and a message sent from the virtual character corresponding to the first user to the other user, and
- causes the display unit to display, in a mutually distinguishable state, a message sent from the other user to the first user or to the virtual character corresponding to the first user and a message sent from the virtual character corresponding to the other user to the first user. The information processing apparatus according to any one of (3) to (8).
- (10) An instruction accepting unit that accepts an instruction to sort the messages is further provided, and when an instruction to sort the messages is accepted by the instruction accepting unit, the display control unit sorts the messages according to the sorting condition accepted by the instruction accepting unit and causes them to be displayed on the display unit. The information processing apparatus according to any one of (3) to (9).
- (11) An instruction accepting unit that accepts an instruction to select the messages is further provided, and when an instruction to select the messages is accepted by the instruction accepting unit, the display control unit selects messages according to the selection condition accepted by the instruction accepting unit and causes the selected messages to be displayed on the display unit. The information processing apparatus according to any one of (3) to (10).
- (12) The display control unit causes the display unit to display, arranged along the same time series, both the first display, representing the history of communication performed by the communication processing unit between the user of the information processing apparatus and the user of the other information processing apparatus, and the second display, representing the history of communication involving the virtual characters. The information processing apparatus according to any one of (3) to (11).
- (13) The communication processing unit realizes the communication with the communication screen, which is the screen for the communication, displayed on the display unit, and
- the display control unit causes the display unit to display a history display screen different from the communication screen, and causes both the first display and the second display to be shown on the history display screen arranged along the same time series.
- The information processing apparatus according to (12).
- (14) The communication processing unit causes the communication screen to display a face image of the communication partner, namely the user of the other information processing apparatus or the virtual character corresponding to that user, one message sent from the user of the other information processing apparatus or the virtual character corresponding to that user, and
- one message sent to the user of the other information processing apparatus or the virtual character corresponding to that user. The information processing apparatus according to (13).
- (15) The communication processing unit further causes an emotion expression assigned to a message displayed on the communication screen to be displayed on the communication screen.
- (16) As the emotion expression assigned to a message sent from the user of the other information processing apparatus or the virtual character corresponding to that user, the communication processing unit
- causes a face image of the user of the other information processing apparatus or the virtual character corresponding to that user, with a facial expression representing the emotion of the message, to be displayed on the communication screen. The information processing apparatus according to (15).
- (17) As the emotion expression assigned to a message sent from the user of the other information processing apparatus or the virtual character corresponding to that user, the communication processing unit
- causes an effect image representing the emotion of the message to be displayed at an arbitrary position on the communication screen. The information processing apparatus according to (15) or (16).
- (18) As the emotion expression assigned to a message sent from the user of the information processing apparatus itself or the virtual character corresponding to that user, the communication processing unit causes an effect image representing the emotion of the message to be displayed in the area of the communication screen where the face image of the user of the other information processing apparatus or of the virtual character corresponding to that user is displayed. The information processing apparatus according to any one of (15) to (17).
- (20) An information processing method comprising: causing a display unit to display both a first display representing a history of communication between users and a second display representing a history of communication between a user and a virtual character that can respond on behalf of the user, arranged along the same time series.
- (21) A program for causing a computer to function as a display control unit that causes a display unit to display both a first display representing a history of communication between users and a second display representing a history of communication between a user and a virtual character that can respond on behalf of the user, arranged along the same time series.
Description
1. First Embodiment (Communication System)
<Expressiveness of Communication Tools>
Conventionally, various systems and services, such as the system described in Patent Document 1, have come into widespread use as tools for communication between individual users via a network such as the Internet.
FIG. 1 is a diagram illustrating an overview of a communication system, which is an embodiment of a system realizing communication to which the present technology is applied.
FIG. 2 is a diagram illustrating an example of the main physical configuration of the communication system 100 described with reference to FIG. 1.
Next, configuration examples of the servers will be described. FIG. 3 is a block diagram illustrating a main configuration example of the user management server 111.
Next, the functions realized by the user management server 111 through the analysis server 114 will be described. The CPU 151 of each of the user management server 111 through the analysis server 114 realizes various functions by executing programs. FIG. 4 is a functional block diagram illustrating examples of the main functions realized in each server.
Next, a configuration example of the terminal device 131 will be described. FIG. 5 is a block diagram illustrating a main configuration example of the terminal device 131.
Next, the functions realized by the terminal device 131 will be described. The SOC 201 of the terminal device 131 realizes various functions by executing programs. FIG. 6 is a functional block diagram illustrating examples of the main functions realized in the terminal device 131.
Next, various processes performed in the communication system 100 described above will be explained. First, an overview of initial setup will be described with reference to FIG. 7. To use the communication system 100, information about the user 101 (user information) must be registered in the user management server 111. Furthermore, as described above, since communication using the virtual character 103 is also performed in the communication system 100, the virtual character must be registered in the virtual character management server 112.
An example of the flow of the initial setup process will be described with reference to the flowchart in FIG. 8, and to FIGS. 9 through 13 as necessary. This process is started, for example, when the user 101 launches an application for communication using the communication system 100 on the terminal device 131.
Next, the square will be described. On the terminal device 131, a list of images (icons) of the friends of the user 101 is displayed on the square screen. On this square screen, the user 101 selects a friend to communicate with.
Next, an overview of processing related to the square will be described with reference to FIG. 15. As described above, the user 101 selects a communication partner on the square screen 350. Processing related to the square, such as displaying the square screen 350, sorting and filtering the images (icons) on the square screen 350, and specifying a communication partner on the square screen 350, is performed by the terminal device 131, the communication management server 113, the user management server 111, and so on, as shown in FIG. 15.
As processing related to the square, an example of the flow of processing for displaying the square screen will first be described with reference to the flowchart in FIG. 16.
Next, communication between users 101 will be described. For example, when the user 101-1 and the user 101-2 communicate as indicated by the double-headed arrow 104-1 in FIG. 1, a message of the user 101-1 created on the terminal device 131-1 is transmitted to and displayed on the terminal device 131-2 of the user 101-2, as shown in FIG. 19. Conversely, a message of the user 101-2 created on the terminal device 131-2 is transmitted to and displayed on the terminal device 131-1 of the user 101-1.
An example of the flow of processing related to communication between users will be described with reference to the flowchart in FIG. 20, and to FIGS. 21 through 25 as necessary. Here, the message transmission source is assumed to be the user 101-1 (terminal device 131-1) and the transmission destination is the user 101-2 (terminal device 131-2). Since no particular distinction is made between users 101 or between terminal devices 131, the following description applies equally when a message is sent from any user 101 (terminal device 131) to any other user 101 (terminal device 131).
Next, communication between the user 101 and the virtual character 103 will be described. First, communication between the user 101 and the virtual character 103 of another user 101 will be described (for example, the double-headed arrow 104-2 in FIG. 1). In this case, the virtual character's messages are generated by the virtual space 102, that is, by the server 110 and the like. That is, a message of the user 101 created on the terminal device 131 is supplied to the server 110, as shown in FIG. 26. The servers of the server 110 cooperate to generate a response message of the virtual character 103 to the message from the user 101 and an emotion expression corresponding to that response message. The response message and emotion expression are transmitted to the terminal device 131 of the message transmission source and displayed on its display unit 231.
An example of the flow of processing related to communication between a user and a virtual character will be described with reference to the flowchart in FIG. 27.
In the communication between the user 101 and the virtual character 103 as described above, the data exchanged by the user 101 as communication is arbitrary; it is not limited to message data and may be, for example, image data of a predetermined design such as a stamp image.
An example of the flow of processing related to other communication with a virtual character will be described with reference to the flowchart in FIG. 28, and to FIGS. 29 and 30 as necessary. Here, it is assumed that stamp image data, instead of a message, is transmitted and received as communication.
While another user is communicating with the virtual character 103 of the user 101, information indicating the content of that communication may be displayed on the display unit 231 of the terminal device 131 of the user 101.
An example of the flow of processing related to communication between another user and a virtual character will be described with reference to the flowchart in FIG. 32, and to FIGS. 33 through 35 as necessary.
As described above, in communication using the communication system 100, a mode for changing the manner of response may be provided. For example, as described above, to communicate with a certain user, that user must be registered as a friend in advance. In other words, to refuse communication with a certain user, it suffices to remove that user from one's friends. However, doing so resets the relationship up to that point, and the message history and the like are discarded from the system. To communicate with that user again, it is necessary to register them as a friend once more.
An example of the flow of processing related to setting the block mode will be described with reference to the flowchart in FIG. 37.
An example of the flow of processing related to canceling the block mode will be described with reference to the flowchart in FIG. 38. For example, assume that the virtual character 103-1 corresponding to the user 101-1 has the block mode set for the user 101-2.
An example of the flow of processing for deleting a desired user from one's friends, rather than using the block mode, will be described with reference to the flowchart in FIG. 39.
The virtual character 103 may learn its manner of response based on the manner of response of the user 101 corresponding to that virtual character 103. In this way, the virtual character 103 can respond in a manner closer to that of the user 101 and can return more distinctive responses. That is, the expressiveness of communication can be improved.
As described above, the message history is basically not displayed on the conversation screen 360, but it may of course be made displayable. In that case, the message history may be displayed on a screen different from the conversation screen 360. In this way, the message history can be displayed while maintaining the above-described characteristics of the conversation screen 360. That is, in this case as well, the expressiveness of communication can be increased.
Next, an example of the flow of processing related to such history display will be described with reference to the flowchart in FIG. 43.
The series of processes described above can be executed by hardware or by software. When the series of processes is executed by software, a program constituting the software is installed from a network or a recording medium.
(1) ユーザ同士のコミュニケーションの履歴を表す第1の表示、並びに、ユーザの代わりに応答することができる仮想キャラクタとユーザとのコミュニケーションの履歴を表す第2の表示の両方を、同一の時系列に沿って並べた状態で、表示部に表示させる表示制御部
を備える情報処理装置。
(2) 前記ユーザ同士のコミュニケーションの履歴は、所定のユーザである第1のユーザと、前記第1のユーザ以外の他のユーザとのコミュニケーションの履歴を含み、
前記仮想キャラクタとユーザとのコミュニケーションの履歴は、前記第1のユーザと前記他のユーザに対応する仮想キャラクタとのコミュニケーションの履歴、または、前記第1のユーザに対応する仮想キャラクタと前記他のユーザとのコミュニケーションの履歴を含む
(1)に記載の情報処理装置。
(3) 前記第1の表示は、前記第1のユーザから前記他のユーザに送られたメッセージ、または、前記他のユーザから前記第1のユーザに送られたメッセージを表すテキスト情報を含み、
前記第2の表示は、前記第1のユーザから前記他のユーザに対応する仮想キャラクタに送られたメッセージ、または、前記他のユーザに対応する仮想キャラクタから前記第1のユーザに送られたメッセージを表すテキスト情報、または、前記第1のユーザに対応する仮想キャラクタから前記他のユーザに送られたメッセージ、または、前記他のユーザから前記第1のユーザに対応する仮想キャラクタに送られたメッセージを表すテキスト情報を含む
(2)に記載の情報処理装置。
(4) 前記コミュニケーションの履歴は、各メッセージを授受した際に割り当てられた感情を表現する第3の表示をさらに含み、
前記表示制御部は、前記メッセージとともに、前記メッセージを授受した際に割り当てられる感情表現を、前記表示部に表示させる
(3)に記載の情報処理装置。
(5) 前記感情表現は、前記メッセージの発信者となるユーザ若しくは仮想キャラクタの顔画像の表情を含み、
前記表示制御部は、各メッセージとともに、前記メッセージの感情を表す表情をした、前記メッセージの発信者となるユーザ若しくは仮想キャラクタの顔画像を、前記表示部に表示させる
(4)に記載の情報処理装置。
(6) 前記感情表現は、前記メッセージの感情を表す演出画像を含み、
前記表示制御部は、各メッセージとともに、前記メッセージの感情を表す演出画像を、前記表示部に表示させる
(4)または(5)に記載の情報処理装置。
(7) 前記感情表現は、前記メッセージの感情を表す吹き出し形状を含み、
前記表示制御部は、各メッセージとともに、前記メッセージの感情を表す形状の吹き出しを、前記表示部に表示させる
(4)乃至(6)のいずれかに記載の情報処理装置。
(8) 前記感情表現は、前記メッセージを授受した際の、メッセージの受信者となる前記第1のユーザ、または、前記他のユーザをセンシングすることにより得られる情報に基づく
(4)乃至(7)のいずれかに記載の情報処理装置。
(9) 前記表示制御部は、
前記第1のユーザから前記他のユーザ若しくは前記他のユーザに対応する仮想キャラクタに送られたメッセージと、前記第1のユーザに対応する仮想キャラクタから前記他のユーザに送られたメッセージとを、互いに識別可能な状態で前記表示部に表示させ、
前記他のユーザから前記第1のユーザ若しくは前記第1のユーザに対応する仮想キャラクタに送られたメッセージと、前記他のユーザに対応する仮想キャラクタから前記第1のユーザに送られたメッセージとを、互いに識別可能な状態で前記表示部に表示させる
(3)乃至(8)のいずれかに記載の情報処理装置。
(10) 前記メッセージの並べ替えの指示を受け付ける指示受付部をさらに備え、
前記表示制御部は、前記指示受付部により前記メッセージの並べ替えの指示が受け付けられた場合、前記指示受付部により受け付けられた、前記メッセージの並べ替えの条件に従って各メッセージを並べ変えて、前記表示部に表示させる
(3)乃至(9)のいずれかに記載の情報処理装置。
(11) 前記メッセージの選択の指示を受け付ける指示受付部をさらに備え、
前記表示制御部は、前記指示受付部により前記メッセージの選択の指示が受け付けられた場合、前記指示受付部により受け付けられた、前記メッセージの選択の条件に従ってメッセージを選択し、選択したメッセージを、前記表示部に表示させる
(3)乃至(10)のいずれかに記載の情報処理装置。
(12) The information processing apparatus according to any one of (3) to (11), further including a communication processing unit that communicates with the other information processing apparatus via a communication unit to exchange the messages, thereby realizing communication between the user of the information processing apparatus itself and the user of the other information processing apparatus, communication between the user of the information processing apparatus itself and a virtual character corresponding to the user of the other information processing apparatus, and communication between a virtual character corresponding to the user of the information processing apparatus itself and the user of the other information processing apparatus, wherein
the display control unit causes the display unit to display, arranged along the same time series, both the first display, representing the history of communication performed by the communication processing unit between the user of the information processing apparatus itself and the user of the other information processing apparatus, and the second display, representing the history of communication between the user of the information processing apparatus itself and the virtual character corresponding to the user of the other information processing apparatus, or the history of communication between the virtual character corresponding to the user of the information processing apparatus itself and the user of the other information processing apparatus.
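The core display behavior of (12), showing the user-to-user history (first display) and the history involving a virtual character (second display) arranged along the same time series, reduces to a timestamp merge. A sketch, assuming each history is a list of `(timestamp, text)` pairs and tagging entries with an invented display label:

```python
def merged_timeline(user_history, character_history):
    """Merge the user-to-user history and the user/character history
    into a single chronological list. Each input is a list of
    (timestamp, text) pairs, a format assumed for this sketch."""
    tagged = [("first", t, s) for t, s in user_history]
    tagged += [("second", t, s) for t, s in character_history]
    tagged.sort(key=lambda entry: entry[1])  # one shared time series
    return tagged

timeline = merged_timeline([(1, "hi"), (4, "bye")], [(2, "auto-reply")])
```

The tags correspond to which of the two displays an entry belongs to, so both histories can be rendered interleaved yet mutually distinguishable.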
(13) The information processing apparatus according to (12), wherein
the communication processing unit realizes the communication while a communication screen, which is a screen for the communication, is displayed on the display unit, and
the display control unit causes the display unit to display a history display screen different from the communication screen and, on the history display screen, displays both the first display and the second display arranged along the same time series.
(14) The information processing apparatus according to (13), wherein the communication processing unit displays, on the communication screen, a face image of the communication partner, namely the user of the other information processing apparatus or the virtual character corresponding to that user, one message sent from that user or virtual character, and one message sent to that user or virtual character.
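Embodiment (14) shows only a single message in each direction at a time on the communication screen. A sketch of picking those two messages from the full history, with hypothetical `sender`/`recipient`/`text` keys and the history assumed ordered oldest to newest:

```python
def latest_exchange(history, partner):
    """Return (last message received from partner, last message sent
    to partner) for display on a communication screen."""
    received = [m for m in history if m["sender"] == partner]
    sent = [m for m in history if m["recipient"] == partner]
    return (
        received[-1]["text"] if received else None,
        sent[-1]["text"] if sent else None,
    )

h = [
    {"sender": "partner", "recipient": "me", "text": "hello"},
    {"sender": "me", "recipient": "partner", "text": "hi there"},
    {"sender": "partner", "recipient": "me", "text": "how are you?"},
]
```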
(15) The information processing apparatus according to (14), wherein the communication processing unit further displays, on the communication screen, the emotional expression assigned to the message displayed on the communication screen.
(16) The information processing apparatus according to (15), wherein the communication processing unit displays on the communication screen, as the emotional expression assigned to a message sent from the user of the other information processing apparatus or the virtual character corresponding to that user, a face image of that user or virtual character bearing an expression that represents the emotion of the message.
(17) The information processing apparatus according to (15) or (16), wherein the communication processing unit displays, at an arbitrary position on the communication screen, an effect image representing the emotion of the message as the emotional expression assigned to a message sent from the user of the other information processing apparatus or the virtual character corresponding to that user.
(18) The information processing apparatus according to any one of (15) to (17), wherein the communication processing unit displays an effect image representing the emotion of the message, as the emotional expression assigned to a message sent from the user of the information processing apparatus itself or the virtual character corresponding to that user, in the region of the communication screen where the face image of the user of the other information processing apparatus or the virtual character corresponding to that user is displayed.
(19) The information processing apparatus according to any one of (1) to (18), further including the display unit.
(20) An information processing method including causing a display unit to display, arranged along the same time series, both a first display representing a history of communication between users and a second display representing a history of communication between a user and a virtual character capable of responding on behalf of a user.
(21) A program for causing a computer to function as a display control unit that causes a display unit to display, arranged along the same time series, both a first display representing a history of communication between users and a second display representing a history of communication between a user and a virtual character capable of responding on behalf of a user.
Claims (20)
- An information processing apparatus including a display control unit that causes a display unit to display, arranged along the same time series, both a first display representing a history of communication between users and a second display representing a history of communication between a user and a virtual character capable of responding on behalf of a user.
- The information processing apparatus according to claim 1, wherein the history of communication between users includes a history of communication between a first user, who is a given user, and another user other than the first user, and the history of communication between a user and a virtual character includes a history of communication between the first user and a virtual character corresponding to the other user, or a history of communication between a virtual character corresponding to the first user and the other user.
- The information processing apparatus according to claim 2, wherein the first display includes text information representing a message sent from the first user to the other user or a message sent from the other user to the first user, and the second display includes text information representing a message sent from the first user to the virtual character corresponding to the other user or a message sent from the virtual character corresponding to the other user to the first user, or text information representing a message sent from the virtual character corresponding to the first user to the other user or a message sent from the other user to the virtual character corresponding to the first user.
- The information processing apparatus according to claim 3, wherein the communication history further includes a third display expressing the emotion assigned when each message was exchanged, and the display control unit causes the display unit to display, together with each message, the emotional expression assigned when that message was exchanged.
- The information processing apparatus according to claim 4, wherein the emotional expression includes a facial expression of a face image of the user or virtual character that originated the message, and the display control unit causes the display unit to display, together with each message, a face image of the originating user or virtual character bearing an expression that represents the emotion of the message.
- The information processing apparatus according to claim 4, wherein the emotional expression includes an effect image representing the emotion of the message, and the display control unit causes the display unit to display, together with each message, the effect image representing the emotion of the message.
- The information processing apparatus according to claim 4, wherein the emotional expression includes a speech-balloon shape representing the emotion of the message, and the display control unit causes the display unit to display, together with each message, a speech balloon whose shape represents the emotion of the message.
- The information processing apparatus according to claim 4, wherein the emotional expression is based on information obtained by sensing, at the time the message was exchanged, the first user or the other user, whichever received the message.
- The information processing apparatus according to claim 3, wherein the display control unit causes the display unit to display, in a mutually distinguishable state, messages sent from the first user to the other user or to the virtual character corresponding to the other user and messages sent from the virtual character corresponding to the first user to the other user, and causes the display unit to display, in a mutually distinguishable state, messages sent from the other user to the first user or to the virtual character corresponding to the first user and messages sent from the virtual character corresponding to the other user to the first user.
- The information processing apparatus according to claim 3, further including an instruction receiving unit that receives an instruction to rearrange the messages, wherein, when the instruction receiving unit receives the rearrangement instruction, the display control unit rearranges the messages according to the rearrangement condition received by the instruction receiving unit and causes the display unit to display them.
- The information processing apparatus according to claim 3, further including an instruction receiving unit that receives an instruction to select messages, wherein, when the instruction receiving unit receives the selection instruction, the display control unit selects messages according to the selection condition received by the instruction receiving unit and causes the display unit to display the selected messages.
- The information processing apparatus according to claim 3, further including a communication processing unit that communicates with the other information processing apparatus via a communication unit to exchange the messages, thereby realizing communication between the user of the information processing apparatus itself and the user of the other information processing apparatus, communication between the user of the information processing apparatus itself and a virtual character corresponding to the user of the other information processing apparatus, and communication between a virtual character corresponding to the user of the information processing apparatus itself and the user of the other information processing apparatus, wherein the display control unit causes the display unit to display, arranged along the same time series, both the first display, representing the history of communication performed by the communication processing unit between the user of the information processing apparatus itself and the user of the other information processing apparatus, and the second display, representing the history of communication between the user of the information processing apparatus itself and the virtual character corresponding to the user of the other information processing apparatus, or the history of communication between the virtual character corresponding to the user of the information processing apparatus itself and the user of the other information processing apparatus.
- The information processing apparatus according to claim 12, wherein the communication processing unit realizes the communication while a communication screen, which is a screen for the communication, is displayed on the display unit, and the display control unit causes the display unit to display a history display screen different from the communication screen and, on the history display screen, displays both the first display and the second display arranged along the same time series.
- The information processing apparatus according to claim 13, wherein the communication processing unit displays, on the communication screen, a face image of the communication partner, namely the user of the other information processing apparatus or the virtual character corresponding to that user, one message sent from that user or virtual character, and one message sent to that user or virtual character.
- The information processing apparatus according to claim 14, wherein the communication processing unit further displays, on the communication screen, the emotional expression assigned to the message displayed on the communication screen.
- The information processing apparatus according to claim 15, wherein the communication processing unit displays on the communication screen, as the emotional expression assigned to a message sent from the user of the other information processing apparatus or the virtual character corresponding to that user, a face image of that user or virtual character bearing an expression that represents the emotion of the message.
- The information processing apparatus according to claim 15, wherein the communication processing unit displays, at an arbitrary position on the communication screen, an effect image representing the emotion of the message as the emotional expression assigned to a message sent from the user of the other information processing apparatus or the virtual character corresponding to that user.
- The information processing apparatus according to claim 15, wherein the communication processing unit displays an effect image representing the emotion of the message, as the emotional expression assigned to a message sent from the user of the information processing apparatus itself or the virtual character corresponding to that user, in the region of the communication screen where the face image of the user of the other information processing apparatus or the virtual character corresponding to that user is displayed.
- An information processing method including causing a display unit to display, arranged along the same time series, both a first display representing a history of communication between users and a second display representing a history of communication between a user and a virtual character capable of responding on behalf of a user.
- A program for causing a computer to function as a display control unit that causes a display unit to display, arranged along the same time series, both a first display representing a history of communication between users and a second display representing a history of communication between a user and a virtual character capable of responding on behalf of a user.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP15803179.9A EP3153973A4 (en) | 2014-06-06 | 2015-05-22 | Information processing device and method, and program |
JP2016525767A JP6670450B2 (ja) | 2014-06-06 | 2015-05-22 | Information processing device and method, and program |
US15/311,641 US20170093785A1 (en) | 2014-06-06 | 2015-05-22 | Information processing device, method, and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014118201 | 2014-06-06 | ||
JP2014-118201 | 2014-06-06 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015186534A1 true WO2015186534A1 (ja) | 2015-12-10 |
Family
ID=54766609
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/064676 WO2015186534A1 (ja) | 2014-06-06 | 2015-05-22 | Information processing device and method, and program |
Country Status (4)
Country | Link |
---|---|
US (1) | US20170093785A1 (ja) |
EP (1) | EP3153973A4 (ja) |
JP (1) | JP6670450B2 (ja) |
WO (1) | WO2015186534A1 (ja) |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108519977A * | 2018-03-30 | 2018-09-11 | Lenovo (Beijing) Co., Ltd. | Information processing method and electronic device |
JP2019191939A * | 2018-04-25 | 2019-10-31 | Medcare Co., Ltd. | Guidance support system, guidance support method, and guidance support server |
JP2020009424A * | 2016-05-18 | 2020-01-16 | Apple Inc. | Use of acknowledgement options in a graphical message user interface |
JP2020064616A * | 2018-10-18 | 2020-04-23 | Cloudminds (Shenzhen) Robotics Systems Co., Ltd. | Virtual robot interaction method, apparatus, storage medium, and electronic device |
KR20200113675A * | 2019-03-26 | 2020-10-07 | 권택준 | Method for generating a webtoon video in which dialogue is converted into a different voice for each character |
US10852935B2 (en) | 2016-05-18 | 2020-12-01 | Apple Inc. | Devices, methods, and graphical user interfaces for messaging |
JPWO2019116488A1 * | 2017-12-14 | 2020-12-17 | Line Corporation | Information processing method, information processing device, program, and information processing terminal |
CN112546638A * | 2020-12-18 | 2021-03-26 | NetEase (Hangzhou) Network Co., Ltd. | Virtual character switching method and apparatus, electronic device, and storage medium |
JP2021157681A * | 2020-03-30 | 2021-10-07 | NTT Data Corporation | Simple communication system, simple communication method, and program |
US11159922B2 (en) | 2016-06-12 | 2021-10-26 | Apple Inc. | Layers in messaging applications |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102306538B1 * | 2015-01-20 | 2021-09-29 | Samsung Electronics Co., Ltd. | Apparatus and method for editing content |
US20180025004A1 (en) * | 2016-07-19 | 2018-01-25 | Eric Koenig | Process to provide audio/video/literature files and/or events/activities ,based upon an emoji or icon associated to a personal feeling |
CN107291446B * | 2017-05-16 | 2021-06-08 | Beijing Kingsoft Internet Security Software Co., Ltd. | Desktop management method and device |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001160021A * | 1999-12-03 | 2001-06-12 | Sony Corp | Communication system using virtual space |
JP2002055920A * | 2000-05-31 | 2002-02-20 | Namco Ltd | Information providing system, program, and information storage medium |
JP2002236656A * | 2001-02-08 | 2002-08-23 | Nifty Corp | Chat system and server device |
JP2006338685A * | 2006-08-02 | 2006-12-14 | Konami Digital Entertainment:Kk | Chat system, communication device, control method therefor, and program |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3597374B2 (ja) * | 1998-03-20 | 2004-12-08 | Fujitsu Ltd. | Agent device in a chat system |
JP3301983B2 (ja) * | 1999-01-12 | 2002-07-15 | Fujitsu Ltd. | Interactive communication apparatus and method using a character |
US6907571B2 (en) * | 2000-03-01 | 2005-06-14 | Benjamin Slotznick | Adjunct use of instant messenger software to enable communications to or between chatterbots or other software agents |
US6983305B2 (en) * | 2001-05-30 | 2006-01-03 | Microsoft Corporation | Systems and methods for interfacing with a user in instant messaging |
US20040107251A1 (en) * | 2001-09-19 | 2004-06-03 | Hansen Wat | System and method for communicating expressive images for meetings |
US7913176B1 (en) * | 2003-03-03 | 2011-03-22 | Aol Inc. | Applying access controls to communications with avatars |
US7668922B2 (en) * | 2006-01-19 | 2010-02-23 | International Business Machines Corporation | Identifying and displaying relevant shared entities in an instant messaging system |
EP1984898A4 (en) * | 2006-02-09 | 2010-05-05 | Nms Comm Corp | PROGRESSIVE MORPHING BETWEEN AVATARS OF VIDEO CALL |
JP2008191748A (ja) * | 2007-02-01 | 2008-08-21 | Oki Electric Ind Co Ltd | Inter-user communication method, inter-user communication program, and inter-user communication device |
US8214433B2 (en) * | 2008-12-15 | 2012-07-03 | International Business Machines Corporation | System and method to provide context for an automated agent to service multiple avatars within a virtual universe |
US8279779B2 (en) * | 2009-12-10 | 2012-10-02 | Verizon Patent And Licensing Inc. | Method and system for virtual agent session monitoring and barge-in |
WO2011077501A1 (ja) * | 2009-12-26 | 2011-06-30 | 株式会社ラピースドリーム | Communication system |
US20110265018A1 (en) * | 2010-04-23 | 2011-10-27 | Ganz | Emotion and mood control of virtual characters in a virtual world |
US20120130717A1 (en) * | 2010-11-19 | 2012-05-24 | Microsoft Corporation | Real-time Animation for an Expressive Avatar |
JP2013009073A (ja) * | 2011-06-23 | 2013-01-10 | Sony Corp | Information processing device, information processing method, program, and server |
US8545330B2 (en) * | 2011-07-28 | 2013-10-01 | Zynga Inc. | Contextual in-game messaging system |
KR101907136B1 (ko) * | 2012-01-27 | 2018-10-11 | Line Corporation | Avatar service system and method over wired and wireless web |
US10116598B2 (en) * | 2012-08-15 | 2018-10-30 | Imvu, Inc. | System and method for increasing clarity and expressiveness in network communications |
US9706040B2 (en) * | 2013-10-31 | 2017-07-11 | Udayakumar Kadirvel | System and method for facilitating communication via interaction with an avatar |
US20150149925A1 (en) * | 2013-11-26 | 2015-05-28 | Lenovo (Singapore) Pte. Ltd. | Emoticon generation using user images and gestures |
- 2015-05-22: JP — application JP2016525767A, patent JP6670450B2 (Active)
- 2015-05-22: EP — application EP15803179.9A, patent EP3153973A4 (Ceased)
- 2015-05-22: US — application US15/311,641, publication US20170093785A1 (Abandoned)
- 2015-05-22: WO — application PCT/JP2015/064676, publication WO2015186534A1 (Application Filing)
Non-Patent Citations (1)
Title |
---|
See also references of EP3153973A4 * |
Cited By (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11320982B2 (en) | 2016-05-18 | 2022-05-03 | Apple Inc. | Devices, methods, and graphical user interfaces for messaging |
US11112963B2 (en) | 2016-05-18 | 2021-09-07 | Apple Inc. | Devices, methods, and graphical user interfaces for messaging |
US11126348B2 (en) | 2016-05-18 | 2021-09-21 | Apple Inc. | Devices, methods, and graphical user interfaces for messaging |
US11966579B2 (en) | 2016-05-18 | 2024-04-23 | Apple Inc. | Devices, methods, and graphical user interfaces for messaging |
US11954323B2 (en) | 2016-05-18 | 2024-04-09 | Apple Inc. | Devices, methods, and graphical user interfaces for initiating a payment action in a messaging session |
US10852935B2 (en) | 2016-05-18 | 2020-12-01 | Apple Inc. | Devices, methods, and graphical user interfaces for messaging |
US11625165B2 (en) | 2016-05-18 | 2023-04-11 | Apple Inc. | Devices, methods, and graphical user interfaces for messaging |
US11513677B2 (en) | 2016-05-18 | 2022-11-29 | Apple Inc. | Devices, methods, and graphical user interfaces for messaging |
JP2020009424A (ja) | 2016-05-18 | 2020-01-16 | Apple Inc. | Use of acknowledgement options in a graphical message user interface |
US11221751B2 (en) | 2016-05-18 | 2022-01-11 | Apple Inc. | Devices, methods, and graphical user interfaces for messaging |
US10983689B2 (en) | 2016-05-18 | 2021-04-20 | Apple Inc. | Devices, methods, and graphical user interfaces for messaging |
US11159922B2 (en) | 2016-06-12 | 2021-10-26 | Apple Inc. | Layers in messaging applications |
US11778430B2 (en) | 2016-06-12 | 2023-10-03 | Apple Inc. | Layers in messaging applications |
JP7072583B2 (ja) | 2017-12-14 | 2022-05-20 | Line Corporation | Information processing method, information processing device, program, and information processing terminal |
JPWO2019116488A1 (ja) * | 2017-12-14 | 2020-12-17 | Line Corporation | Information processing method, information processing device, program, and information processing terminal |
CN108519977A * | 2018-03-30 | 2018-09-11 | Lenovo (Beijing) Co., Ltd. | Information processing method and electronic device |
JP2019191939A (ja) * | 2018-04-25 | 2019-10-31 | Medcare Co., Ltd. | Guidance support system, guidance support method, and guidance support server |
JP2020064616A (ja) * | 2018-10-18 | 2020-04-23 | Cloudminds (Shenzhen) Robotics Systems Co., Ltd. | Virtual robot interaction method, apparatus, storage medium, and electronic device |
KR102184053B1 (ko) * | 2019-03-26 | 2020-11-27 | 권택준 | Method for generating a webtoon video in which dialogue is converted into a different voice for each character |
KR20200113675A (ko) | 2019-03-26 | 2020-10-07 | 권택준 | Method for generating a webtoon video in which dialogue is converted into a different voice for each character |
JP2021157681A (ja) * | 2020-03-30 | 2021-10-07 | NTT Data Corporation | Simple communication system, simple communication method, and program |
CN112546638B (zh) * | 2020-12-18 | 2024-05-10 | NetEase (Hangzhou) Network Co., Ltd. | Virtual character switching method and apparatus, electronic device, and storage medium |
CN112546638A (zh) | 2020-12-18 | 2021-03-26 | NetEase (Hangzhou) Network Co., Ltd. | Virtual character switching method and apparatus, electronic device, and storage medium |
Also Published As
Publication number | Publication date |
---|---|
JP6670450B2 (ja) | 2020-03-25 |
JPWO2015186534A1 (ja) | 2017-04-20 |
EP3153973A1 (en) | 2017-04-12 |
EP3153973A4 (en) | 2018-01-17 |
US20170093785A1 (en) | 2017-03-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2015186534A1 (ja) | Information processing device and method, and program | |
JP6055731B2 (ja) | Messaging service system and method for expanding member addition | |
US9253318B2 (en) | Method and apparatus for providing state information | |
Quinn et al. | Our Networked selves: Personal connection and relational maintenance in social media use | |
US10917368B2 (en) | Method and apparatus for providing social network service | |
CN101478399A (zh) | 个人广告简档秘密照片验证过程 | |
JP7326963B2 (ja) | Communication terminal, communication system, image sharing method, and program | |
CN106471784A (zh) | 设备访问控制 | |
US20160335599A1 (en) | Systems and methods for exchanging information | |
US20200225832A1 (en) | Information processing method, information processing apparatus, and information processing program | |
JP2014063342A (ja) | Management device, message management method, and program | |
US20130268483A1 (en) | Information processing apparatus, information processing method, and computer program | |
US9866505B2 (en) | Configuring presence and notifications in persistent conversations | |
JP2021051529A (ja) | Communication terminal, communication system, data sharing method, and program | |
JP2021068346A (ja) | Communication terminal, communication system, data sharing method, and program | |
KR20150070005A (ko) | Presence-based content sharing method and electronic device supporting the same | |
CN111108491B (zh) | Conference system | |
JP2008276414A (ja) | Information display system, information display terminal, and information display method | |
JP2020149338A (ja) | Communication terminal, communication system, display control method, and program | |
CN113763192A (zh) | Information processing apparatus, information processing method, and computer-readable medium | |
JP7476646B2 (ja) | Information processing device and program | |
JP2020149344A (ja) | Communication terminal, communication system, display control method, and program | |
JP7467986B2 (ja) | Communication terminal, communication system, communication method, and program | |
US11743215B1 (en) | Artificial reality messaging with destination selection | |
US20240177248A1 (en) | Spot Date App |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15803179; Country of ref document: EP; Kind code of ref document: A1 |
ENP | Entry into the national phase | Ref document number: 2016525767; Country of ref document: JP; Kind code of ref document: A |
WWE | Wipo information: entry into national phase | Ref document number: 15311641; Country of ref document: US |
REEP | Request for entry into the european phase | Ref document number: 2015803179; Country of ref document: EP |
WWE | Wipo information: entry into national phase | Ref document number: 2015803179; Country of ref document: EP |
NENP | Non-entry into the national phase | Ref country code: DE |