WO2005101216A1 - Chat system, communication device, control method thereof, and information storage medium - Google Patents
Chat system, communication device, control method thereof, and information storage medium
- Publication number
- WO2005101216A1 (PCT/JP2004/010936; JP2004010936W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- emotion
- message
- character string
- input
- string
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/40—Business processes related to the transportation industry
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/10—Office automation; Time management
- G06Q10/107—Computer-aided management of electronic mailing [e-mailing]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/02—Details
- H04L12/16—Arrangements for providing special services to substations
- H04L12/18—Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
- H04L12/1813—Arrangements for providing special services to substations for broadcast or conference, e.g. multicast for computer conferences, e.g. chat rooms
- H04L12/1827—Network arrangements for conference optimisation or adaptation
Definitions
- the present invention relates to a chat system, a communication device used in the chat system, a control method of the communication device, and an information storage medium, and more particularly to a system that outputs an image or the like representing a user's emotion.
- in a conventional chat system, the message sender cannot convey an emotion to the other party unless information indicating the content of that emotion is input each time.
- the present invention has been made in view of the above problems, and has as its object to provide a chat system capable of judging a change in the emotion of a message sender with a simple configuration and outputting the change on the receiving side.
- Another object of the present invention is to provide a communication device used for a chat system, a control method of the communication device, and an information storage medium.
- a chat system according to the present invention is configured to include a plurality of devices, each of which inputs a message character string and transmits it to another device, which receives and outputs it. The chat system includes means for determining an emotion level according to the elapsed time from the input timing of one message character string to the input timing of another message character string, and means for outputting, in the other device, at least one of an image and a sound according to the determined emotion level.
- the emotion level is determined according to the elapsed time between the input timing of one message character string and the input timing of another. Therefore, the message receiver can intuitively grasp the emotion without burdening the message sender.
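As a concrete illustration, the elapsed-time rule above can be sketched as follows. This is a minimal sketch: the threshold values and the four-level range are illustrative assumptions, not values specified by the claims.

```python
# Hypothetical thresholds in seconds; the patent does not fix concrete values.
FAST_REPLY = 5.0    # an elapsed time below this raises the emotion level
SLOW_REPLY = 30.0   # an elapsed time at or above this lowers it

MIN_LEVEL, MAX_LEVEL = 1, 4  # four levels, matching the avatar groups of FIGS. 4-6

def update_emotion_level(level: int, elapsed: float) -> int:
    """Raise or lower an emotion level according to the elapsed time
    between one message input and the next."""
    if elapsed < FAST_REPLY:
        return min(level + 1, MAX_LEVEL)
    if elapsed >= SLOW_REPLY:
        return max(level - 1, MIN_LEVEL)
    return level
```

A quick reply (e.g. 3 seconds) raises level 2 to 3, a long pause (e.g. 60 seconds) lowers it to 1, and intermediate delays leave the level unchanged.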
- the sound is, for example, a voice, music, or the like (the same applies hereinafter).
- a chat system according to another aspect of the present invention is a chat system including a first device and a second device. The first device includes means for inputting a message character string, means for inputting emotion type data indicating an emotion type, means for transmitting the input message character string to the second device, and means for transmitting the input emotion type data to the second device. The second device includes means for receiving the message character string from the first device, means for receiving the emotion type data from the first device, means for outputting the received message character string, means for acquiring an emotion level determined according to the input timing of the message character string in the first device, and means for outputting at least one of an image and a sound corresponding to the received emotion type data and the acquired emotion level.
- the image or sound corresponding to the emotion type data input in the first device, and to the emotion level determined according to the input timing of the message character string in the first device, is output by the second device, so the message receiver can intuitively understand the emotion without putting a burden on the message sender.
- the "emotion type data" is data indicating the type of emotion of the message sender, such as joy, anger, sorrow, and pleasure.
- the "input timing of the message character string in the first device" includes the timing at which input of the message character string is completed in the first device, the timing at which the message character string is transmitted from the first device to the second device, the timing at which the message character string is received or output by the second device, and the timing at which the message character string is received or transmitted by a relay device that relays communication between the first device and the second device; in other words, any timing corresponding to the input timing of the message character string in the first device.
- a communication device according to the present invention is a communication device used in a chat system, and includes means for inputting a message character string, means for inputting emotion type data indicating an emotion type, means for determining an emotion level according to the input timing of the message character string, means for transmitting the input message character string, means for transmitting the input emotion type data, and means for transmitting the determined emotion level.
- the method according to the present invention is a control method of a communication device used in a chat system, and includes a step of receiving input of a message character string, a step of receiving input of emotion type data indicating an emotion type, a step of determining an emotion level according to the input timing of the message character string, a step of transmitting the input message character string, a step of transmitting the input emotion type data, and a step of transmitting the determined emotion level.
- the information storage medium storing the program according to the present invention causes a computer to function as means for inputting a message character string, means for inputting emotion type data indicating an emotion type, means for determining an emotion level according to the input timing of the message character string, means for transmitting the input message character string, means for transmitting the input emotion type data, and means for transmitting the determined emotion level.
- a communication device according to another aspect of the present invention is a communication device used in a chat system, and includes means for receiving a message character string, means for receiving emotion type data, means for outputting the received message character string, means for determining an emotion level according to the input timing of the message character string, and means for outputting at least one of an image and a sound corresponding to the received emotion type data and the determined emotion level.
- a method according to another aspect is a control method of a communication device used in a chat system, and includes a step of receiving a message character string, a step of receiving emotion type data, a step of outputting the received message character string, a step of determining an emotion level according to the input timing of the message character string, and a step of outputting at least one of an image and a sound corresponding to the received emotion type data and the determined emotion level.
- the information storage medium storing the program according to the present invention causes a computer to function as means for receiving a message character string, means for receiving emotion type data, means for outputting the received message character string, means for determining an emotion level according to the input timing of the message character string, and means for outputting at least one of an image and a sound corresponding to the received emotion type data and the determined emotion level.
- the emotion level is further determined according to the character amount of the message character string.
- the character amount of the message character string is, for example, the number of characters in the message character string itself, or a character count weighted by input difficulty, such as giving greater weight to special kanji that are difficult to input. According to this aspect, it is possible to appropriately evaluate whether the message character string was input quickly or slowly, and to determine an appropriate emotion level accordingly.
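A minimal sketch of such a weighted character amount follows, under the assumption that kanji are simply weighted more heavily than other characters; the weight value and the Unicode-range test are illustrative, not from the patent.

```python
def character_amount(message: str, kanji_weight: float = 2.0) -> float:
    """Character amount of a message string: plain character count,
    with hard-to-input characters (here, kanji) weighted more heavily."""
    def is_kanji(ch: str) -> bool:
        return '\u4e00' <= ch <= '\u9fff'  # CJK unified ideographs
    return sum(kanji_weight if is_kanji(ch) else 1.0 for ch in message)

def time_per_unit_character(elapsed: float, message: str) -> float:
    """Normalise the reply delay by how much typing it required."""
    return elapsed / max(character_amount(message), 1.0)
```

With this normalisation, a long message typed in a given time scores as faster input than a short message typed in the same time.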
- the emotion level may also be determined according to the input interval of the message character strings in the first device. In this way, when the message sender is inputting message character strings one after another, the emotion level can be determined to be, for example, very happy or very angry.
- the second device further includes means for inputting a message character string, and means for transmitting the input message character string to the first device.
- the first device further includes means for receiving the message character string from the second device, and means for outputting the received message character string, and the emotion level is determined according to the difference between the input timing of the message character string in the second device and the input timing of the message character string in the first device.
- the "input timing of the message character string in the second device" includes the timing at which input of the message character string is completed in the second device, the timing at which a message character string addressed from the second device to the first device is transmitted, the timing at which the message character string is received or output by the first device, and the timing at which the message character string is received or transmitted by a relay device that relays communication between the first device and the second device; in other words, any timing corresponding to the input timing of the message character string in the second device.
- FIG. 1 is a diagram showing an overall configuration of a chat system according to an embodiment of the present invention.
- FIG. 2 is a diagram showing an example of a chat screen.
- FIG. 3 is a diagram showing an example of a chat log.
- FIG. 4 is a diagram showing an avatar image group corresponding to an emotion type "pleasure".
- FIG. 5 is a diagram showing an avatar image group corresponding to an emotion type “anger”.
- FIG. 6 is a diagram showing an avatar image group corresponding to an emotion type “sorrow”.
- FIG. 7 is a diagram for explaining how an avatar image changes.
- FIG. 8 is a functional block diagram of a server.
- FIG. 9 is a diagram showing storage contents of an emotion data storage unit.
- FIG. 10 is a functional block diagram of a client.
- FIG. 11 is a flowchart showing emotion data management processing in the server.
- FIG. 12 is a functional block diagram of a client according to a modification.
- 12 server, 14 data communication network, 16A, 16B client, 18 emotion type input field, 20A personal information display area, 20B partner information display area, 22A, 22B balloon image, 24A, 24B avatar image, 26 message character string input field, 30, 40 communication unit, 32, 52 emotion data management unit, 34, 50 emotion data storage unit, 42 message input unit, 44 display unit, 46 avatar image storage unit, 48 emotion type input unit.
- FIG. 1 is a diagram showing an overall configuration of a chat system according to one embodiment of the present invention.
- the chat system includes a server 12 and clients 16A and 16B.
- the server 12 and the clients 16A and 16B are both communicatively connected to a data communication network 14 such as the Internet so that data communication can be performed between them.
- the server 12 is realized by a known server computer mainly configured with a processor, various storage devices, and a data communication device, and manages and relays chats performed between the client 16A and the client 16B.
- the client 16 is realized by various computer systems, such as a well-known personal computer or a well-known computer game system, mainly composed of a monitor, input means such as a keyboard, a processor, various storage devices, and a data communication device; each user uses one of them to chat (converse by sending and receiving message character strings).
- FIG. 2 shows an example of a chat screen displayed on the monitor of the client 16A.
- a similar chat screen is displayed on the monitor of the client 16B.
- the chat screen includes a personal information display area 20A corresponding to the user of the client 16A (hereinafter referred to as "user A"), a partner information display area 20B corresponding to the chat partner (that is, the user of the client 16B, hereinafter referred to as "user B"), a message character string input field 26 for inputting a message character string, and an emotion type input field 18.
- an avatar image 24A representing the user A is displayed in the personal information display area 20A.
- a balloon image 22A is displayed below the personal information display area 20A, and a message character string input by the user A is sequentially displayed in the balloon image 22A.
- An avatar image 24B representing user B is displayed in the partner information display area 20B.
- a balloon image 22B is displayed below the partner information display area 20B, and a message character string input by the user B is sequentially displayed in the balloon image 22B.
- the message character string input field 26 is a character string editing area used by user A to input a message character string (a character string that serves as a message to the other party) using character input means such as a keyboard. By entering characters in order at the displayed cursor position, the message character string can be composed.
- the message character string displayed in this field can then be transmitted to user B, the chat partner.
- in the emotion type input field 18, characters such as "happy", "angry", and "sad" are displayed, and by selecting one of them with a predetermined emotion type switching operation, the user can set and input the emotion type of the avatar image 24A representing user A. This changes the expression of the avatar image 24A.
- a chat screen similar to that shown in the figure is also displayed on the client 16B used by the chat partner; there, the same image as the avatar image 24A is displayed in the partner information display area, and the same image as the avatar image 24B is displayed in the personal information display area. For this reason, when user A sets and inputs the emotion type of the avatar image 24A in the emotion type input field 18 and the expression of the avatar image 24A changes, the avatar image displayed in the partner information display area of the chat screen of the client 16B shows a similar expression change in accordance with it. In this way, emotions can be conveyed to the chat partner, that is, user B, using the avatar image rather than characters alone.
- the client 16 can display a chat log on a monitor by performing a specific operation.
- FIG. 3 shows an example of this chat log.
- the chat log is composed of characters (here, "A" or "B") that identify users A and B, the chat parties, and message character strings that are the chat parties' statements.
- alongside each displayed message character string, an image corresponding to the emotion type and emotion level of the chat party who input it is displayed in time series.
- on the chat screen already described, only the latest statements are displayed, in the balloon images 22A and 22B, and past statements are not shown; by displaying the chat log shown in the figure, however, the flow of past statements can be grasped at a glance.
- the avatar images 24A and 24B are stored in advance in a storage unit of the client 16, and are selectively read out from the storage unit and displayed on a monitor.
- FIGS. 4 to 6 show the avatar image groups stored in the storage means of the client 16.
- FIG. 4 shows the avatar image group corresponding to the emotion type "pleasure". More specifically, FIG. 4(a) shows the avatar image group corresponding to the emotion type "pleasure" and the emotion level "1", FIG. 4(b) the group corresponding to the emotion level "2", FIG. 4(c) the group corresponding to the emotion level "3", and FIG. 4(d) the group corresponding to the emotion level "4".
- in the client 16, a plurality of avatar images are stored in association with the same emotion type and the same emotion level; these are avatar images depicting different characters. That is, in this chat system, a plurality of characters depicting, for example, "a man in his twenties", "a woman in her twenties", "a woman in her forties", and so on are prepared, and for each character an image (avatar image) corresponding to each emotion type and emotion level is created in advance. In the present embodiment, the user specifies in advance the character to be used as his or her avatar image, and on the chat screen shown in FIG. 2 the image of the character thus specified is displayed as that user's avatar image.
- FIG. 5 shows the avatar image group corresponding to the emotion type "anger". FIG. 5(a) shows the avatar image group corresponding to the emotion type "anger" and the emotion level "1", FIG. 5(b) the group corresponding to the emotion level "2", FIG. 5(c) the group corresponding to the emotion level "3", and FIG. 5(d) the group corresponding to the emotion level "4".
- FIG. 6 shows the avatar image group corresponding to the emotion type "sorrow". FIG. 6(a) shows the avatar image group corresponding to the emotion type "sorrow" and the emotion level "1", FIG. 6(b) the group corresponding to the emotion level "2", FIG. 6(c) the group corresponding to the emotion level "3", and FIG. 6(d) the group corresponding to the emotion level "4".
- FIG. 7 is a diagram for explaining a facial expression change of avatar images corresponding to user A and user B.
- in the first column, the character "A" or "B" identifying the user who transmitted the message character string or performed the emotion type switching operation is described. In the second column, the content of the transmitted message character string, or the fact that an emotion type switching operation was performed, is described.
- the third column shows the expression of the avatar image corresponding to user A
- the fourth column shows the expression of the avatar image corresponding to user B.
- “laugh” represents the emotion type “happy”
- “anger” represents the emotion type “anger”
- a numerical value such as “1” represents the emotion level.
- FIG. 8 is a diagram showing a functional configuration of the server 12.
- the server 12 functionally includes a communication unit 30, an emotion data management unit 32, and an emotion data storage unit 34. These functional blocks are realized by executing a predetermined program on the server 12.
- the communication unit 30 includes, for example, a known data communication card, and performs data communication with the clients 16 via the data communication network 14. In particular, the communication unit 30 receives a message character string transmitted from the client 16A and forwards it to the client 16B, and likewise receives a message character string transmitted from the client 16B and forwards it to the client 16A. At this time, the time at which the message character string was received from each client 16 is notified to the emotion data management unit 32 as the input timing of the message character string in that client 16.
- when the communication unit 30 receives emotion type data from the client 16A or 16B, it passes the data to the emotion data management unit 32. It also receives from the emotion data management unit 32 an emotion data update request indicating the updated content of the emotion data, and transmits the request to the clients 16A and 16B.
- the emotion data is data including at least one of the emotion type data and the emotion level.
- the emotion data management unit 32 manages and distributes the emotion data stored in the emotion data storage unit 34. The emotion data storage unit 34 is configured to include storage means such as a hard disk storage device and a RAM, and stores the emotion data.
- FIG. 9 shows an example of the emotion data stored in the emotion data storage unit 34.
- the emotion data includes, for each user who is chatting, information identifying the user, the time at which the user last input a message character string, the currently set (specified) emotion type, and the current emotion level.
- in the example shown in the figure, the current emotion type of user A is "pleasure", the emotion level is "2", and the immediately preceding message character string was input at 18:30:25. Likewise, the current emotion type of user B is "anger", the emotion level is "1", and the immediately preceding message character string was input at 18:30:14.
- when emotion type data is received from a client 16, the emotion data management unit 32 changes the emotion type stored in the emotion data storage unit 34 in association with the user of that client 16 to the type indicated by the emotion type data. At this time, the emotion level stored in the emotion data storage unit 34 in association with that user is initialized to 1.
- when a message character string is received from a client 16, the difference between the input time of that message character string and the previous input time stored in the emotion data storage unit 34 in association with the chat partner of the user who transmitted it is calculated. This difference is then divided by the character amount of the message character string received from the client 16 to calculate a time difference per unit character amount. If the time difference per unit character amount is less than a first predetermined value, the emotion level stored in the emotion data storage unit 34 in association with the user who transmitted the message character string is increased by 1. If the emotion level is already at its maximum value, this level increase is not performed.
- if the time difference per unit character amount is equal to or greater than a second predetermined value different from the first predetermined value, the emotion level stored in the emotion data storage unit 34 in association with the user who transmitted the message character string is decreased by 1. If the emotion level is already at its minimum value, this level decrease is not performed. In this way, when a message character string is promptly input in response to the chat partner's message character string, that user's emotion level can be raised; if the response is sluggish, the user's emotion level can be lowered.
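the update rule just described can be sketched as follows. The record layout loosely mirrors FIG. 9, while the threshold values TA and TB, the level range 1 to 4, and the use of a plain character count are illustrative assumptions.

```python
# Emotion-data records keyed by user, loosely mirroring FIG. 9 (values illustrative).
emotion_data = {
    "A": {"last_input": 0.0, "type": "pleasure", "level": 2},
    "B": {"last_input": 0.0, "type": "anger",    "level": 1},
}

TA = 2.0   # first predetermined value (seconds per character): prompt reply
TB = 10.0  # second predetermined value (seconds per character): sluggish reply

def on_message(sender: str, partner: str, message: str, now: float) -> None:
    """Adjust the sender's emotion level from the time elapsed since the
    partner's previous input, divided by the message's character amount."""
    diff = now - emotion_data[partner]["last_input"]
    per_char = diff / max(len(message), 1)
    entry = emotion_data[sender]
    if per_char < TA:
        entry["level"] = min(entry["level"] + 1, 4)  # prompt: raise level
    elif per_char >= TB:
        entry["level"] = max(entry["level"] - 1, 1)  # sluggish: lower level
    entry["last_input"] = now  # update the sender's previous input time
```

Note that the sender's level is stepped against the partner's last input time, exactly as in the description: a fast answer to the partner raises the sender's level, a slow one lowers it.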
- further, the time notified by the communication unit 30 is stored in the emotion data storage unit 34 in association with the user who sent the message character string, thereby updating the previous input time.
- an emotion data update request indicating the updated content of the emotion data stored in the emotion data storage unit 34, that is, at least one of the emotion type data and the emotion level, is transmitted to both the client 16A and the client 16B.
- alternatively, the difference between the input time of the message character string and the previous input time stored in the emotion data storage unit 34 in association with the user who transmitted it may be calculated. This difference is divided by the character amount of the message character string received from the client 16 to calculate a time difference per unit character amount. If the time difference is less than the first predetermined value, the emotion level stored in the emotion data storage unit 34 in association with the user who transmitted the message character string is increased by 1; if it is equal to or greater than the second predetermined value, the emotion level is decreased by 1. In other words, the emotion level of a user who sends message character strings one after another is raised, and conversely, if the interval between transmissions is long, or if the input operation itself is slow, that user's emotion level is lowered.
- FIG. 10 is a diagram showing a functional configuration of the client 16.
- the client 16 functionally includes a communication unit 40, a message input unit 42, a display unit 44, an avatar image storage unit 46, an emotion data storage unit 50, and an emotion type input unit 48. These functions are realized by executing a predetermined program in the client 16.
- the communication unit 40 receives message character strings from the server 12 and supplies them to the display unit 44. When emotion data is received from the server 12, its content is reflected in the content stored in the emotion data storage unit 50. When a message character string is input by the message input unit 42, the communication unit 40 transmits it to the server 12; likewise, when the communication unit 40 receives emotion type data from the emotion type input unit 48, it transmits the emotion type data to the server 12.
- the message input unit 42 includes character input means such as a keyboard, and is used to input a message character string into the message character string input field 26 of the chat screen. The input message character string is combined with the balloon image 22 on the chat screen by the display unit 44 and displayed on the monitor.
- the avatar image storage unit 46 is configured to include, for example, a hard disk storage device and the like, and stores the various avatar images shown in FIGS. 4 to 6.
- the emotion data storage unit 50 stores an emotion type and an emotion level in association with each user participating in the chat. Of these, the emotion type corresponding to the user of the client 16 itself can be set and input by the emotion type input unit 48: by selecting a character indicating an emotion type in the emotion type input field 18, the corresponding emotion type is stored in the emotion data storage unit 50 in association with the user of the client 16. Also, when the communication unit 40 receives an emotion data update request from the server 12, the emotion data storage unit 50 is updated according to the content of the request.
- the display unit 44 reads the emotion type and emotion level of each user from the emotion data storage unit 50, and reads an avatar image corresponding to the user from the avatar image storage unit 46. At this time, the display unit 44 acquires the character designation of each user in advance, and reads an avatar image corresponding to the designated character. The read avatar image is displayed by the display unit 44 in the personal information display area 20A and the partner information display area 20B, respectively.
- FIG. 11 is a flowchart showing emotion data management processing by the server 12.
- the server 12 monitors whether or not a message character string has been received from either client 16 (S101).
- when one is received, the current time is acquired (S102).
- next, the previous input time corresponding to the chat-partner user is read from the emotion data storage unit 34, and the elapsed time t1 from that time to the current time is calculated (S103). If the elapsed time t1 is shorter than a first predetermined value TA, the emotion level stored in the emotion data storage unit 34 in association with the user who sent the message character string is increased by 1 (S105).
- if the emotion level is already at its maximum value, the processing of S105 is not performed.
- if the elapsed time t1 is equal to or greater than the first predetermined value TA, the processing of S105 is skipped.
- the emotion level updated as described above is transmitted to each client 16 as an emotion data update request (S108). Further, the current time acquired in S102 is stored in the emotion data storage unit 34 in association with the user who sent the message character string; in this way, the previous input time is updated.
- according to the chat system described above, the expression of the avatar image automatically changes according to the input timing of message character strings, so no special input is required to change the expression of the avatar image, and convenience for the user is greatly improved.
- in the above description, the avatar image is changed according to the input timing of the message character string, but the client 16 may instead output a sound and change this sound according to the input timing of the message character string. The sound in this case is, for example, music, or a voice that reads the message character string aloud. Even in this case, the emotion of the chat partner can be judged from the change in the sound.
- also, in the above description, the reception time at the server 12 is treated as the input timing of the message character string in the client 16. However, if the current time is acquired when the message character string is input at the client 16 and is transmitted to the server 12 together with the message, a timing closer to the actual input timing can be handled as the input timing of the message character string in the client 16.
- in the above description, the emotion level is increased or decreased according to the elapsed time from the input timing of the immediately preceding message character string to the input timing of the current message character string. Alternatively, the emotion level may be associated with ranges of the value obtained by dividing the elapsed time by the character amount, so that the emotion level can be changed more responsively. It is also possible to calculate statistics such as the average of the elapsed time, or of the value obtained by dividing the elapsed time by the character amount, and determine the emotion level accordingly.
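One hedged reading of the character-amount variant is to normalize each elapsed time by the message length and look the emotion level up from ranges of the averaged per-character time. The ranges, window size, and names below are assumptions for illustration.

```python
# Hedged sketch of the character-amount variant: elapsed time is divided
# by the number of characters, recent values are averaged, and the
# emotion level is looked up from assumed ranges of that average.
from collections import deque

# (upper bound of average seconds per character, emotion level); assumed
LEVEL_RANGES = [(0.5, 3), (1.5, 2), (3.0, 1)]
DEFAULT_LEVEL = 0

class PerCharEmotion:
    def __init__(self, window=5):
        self.samples = deque(maxlen=window)  # recent per-character times

    def on_message(self, elapsed, char_count):
        # normalize the elapsed time by the message length
        self.samples.append(elapsed / max(char_count, 1))
        avg = sum(self.samples) / len(self.samples)
        for upper, level in LEVEL_RANGES:
            if avg < upper:
                return level
        return DEFAULT_LEVEL
```

Averaging over a small window keeps a single long pause from swinging the level, while per-character normalization stops long messages from being mistaken for slow replies.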
- FIG. 12 is a diagram showing the functional configuration of the client 16 according to this modification. As shown in the figure, this example is characterized in that an emotion data management unit 52 is provided in the client 16. The other components are the same as those in FIG. 10; they are given the same reference numerals here, and detailed description is omitted.
- the emotion data management unit 52 receives message character strings from the message input unit 42, and also receives, via the communication unit 40, message character strings sent from other clients 16. When a message character string is input from the message input unit 42, the current time is acquired by a timing unit (not shown), the last input time corresponding to the chat partner stored in the emotion data storage unit 50 is read, and the elapsed time from that time to the current time is calculated. Then, as in the case of FIG. 11, the emotion level is changed according to the elapsed time. Likewise, when a message character string is input from the communication unit 40, the current time is acquired by the timing unit (not shown), the last input time corresponding to the user of this client 16 stored in the emotion data storage unit 50 is read, and the emotion level is changed according to the elapsed time.
- in this case, the timing at which the message character string is received at the client 16 is treated as the input timing of the message character string.
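The client-side variant can be sketched with the same elapsed-time rule applied locally: a locally input message reads the chat partner's record, while a received message reads this client's own user record. The class, names, and the single shared level below are assumptions for illustration; the patent attributes this behavior to the emotion data management unit 52 and the emotion data storage unit 50.

```python
# Hedged sketch of the client-side variant (emotion data management
# unit 52). All names and the single shared emotion level are
# illustrative assumptions.
TA = 30.0  # first predetermined value, in seconds (assumed)

class ClientEmotionManager:
    def __init__(self, local_user, partner):
        self.local_user = local_user
        self.partner = partner
        self.last_input_time = {}  # user -> time of that user's last message
        self.emotion_level = 0     # level driving the avatar expression

    def _update(self, read_user, sender, now):
        # elapsed time since the counterpart's previous message
        last = self.last_input_time.get(read_user)
        if last is not None and now - last < TA:
            self.emotion_level += 1  # quick reply raises the level
        self.last_input_time[sender] = now
        return self.emotion_level

    def on_local_input(self, now):
        # message from the message input unit 42: read the partner's
        # last input time, record the local user's input time
        return self._update(self.partner, self.local_user, now)

    def on_received(self, now):
        # message from the communication unit 40: read the local user's
        # last input time, record the partner's input time
        return self._update(self.local_user, self.partner, now)
```

Because both update paths share one store, the level reflects how quickly the two parties respond to each other without any round trip to the server.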
Landscapes
- Engineering & Computer Science (AREA)
- Business, Economics & Management (AREA)
- Human Resources & Organizations (AREA)
- Strategic Management (AREA)
- Entrepreneurship & Innovation (AREA)
- General Physics & Mathematics (AREA)
- Marketing (AREA)
- Theoretical Computer Science (AREA)
- General Business, Economics & Management (AREA)
- Physics & Mathematics (AREA)
- Tourism & Hospitality (AREA)
- Economics (AREA)
- Quality & Reliability (AREA)
- Operations Research (AREA)
- Computer Hardware Design (AREA)
- Multimedia (AREA)
- Computer Networks & Wireless Communication (AREA)
- Data Mining & Analysis (AREA)
- General Engineering & Computer Science (AREA)
- Signal Processing (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Primary Health Care (AREA)
- Information Transfer Between Computers (AREA)
- Telephonic Communication Services (AREA)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP04771096A EP1734453A4 (en) | 2004-03-31 | 2004-07-30 | CHAT-SYSTEM, COMMUNICATION DEVICE, CONTROL PROCEDURE AND INFORMATION RECORDING MEDIUM |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004108023A JP3930489B2 (ja) | 2004-03-31 | 2004-03-31 | チャットシステム、通信装置、その制御方法及びプログラム |
JP2004-108023 | 2004-03-31 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2005101216A1 true WO2005101216A1 (ja) | 2005-10-27 |
Family
ID=35055666
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2004/010936 WO2005101216A1 (ja) | 2004-03-31 | 2004-07-30 | チャットシステム、通信装置、その制御方法及び情報記憶媒体 |
Country Status (7)
Country | Link |
---|---|
US (1) | US20050223078A1 (ja) |
EP (1) | EP1734453A4 (ja) |
JP (1) | JP3930489B2 (ja) |
KR (1) | KR100841590B1 (ja) |
CN (1) | CN100514312C (ja) |
TW (1) | TW200534901A (ja) |
WO (1) | WO2005101216A1 (ja) |
Families Citing this family (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4671880B2 (ja) * | 2006-01-31 | 2011-04-20 | 株式会社コナミデジタルエンタテインメント | チャットシステム、チャット装置及びチャットサーバの制御方法、プログラム |
GB2443027B (en) * | 2006-10-19 | 2009-04-01 | Sony Comp Entertainment Europe | Apparatus and method of audio processing |
US8886537B2 (en) * | 2007-03-20 | 2014-11-11 | Nuance Communications, Inc. | Method and system for text-to-speech synthesis with personalized voice |
WO2008132533A1 (en) * | 2007-04-26 | 2008-11-06 | Nokia Corporation | Text-to-speech conversion method, apparatus and system |
US20090128567A1 (en) * | 2007-11-15 | 2009-05-21 | Brian Mark Shuster | Multi-instance, multi-user animation with coordinated chat |
US8788943B2 (en) * | 2009-05-15 | 2014-07-22 | Ganz | Unlocking emoticons using feature codes |
WO2013048225A2 (ko) * | 2011-09-29 | 2013-04-04 | Hur Min | 감성표현데이터 전달 방법 및 그 시스템 |
TWI482108B (zh) | 2011-12-29 | 2015-04-21 | Univ Nat Taiwan | To bring virtual social networks into real-life social systems and methods |
KR101390228B1 (ko) * | 2012-10-22 | 2014-05-07 | (주)카카오 | 채팅 영역에 이미지를 표시하는 디바이스 및 방법, 그리고 채팅 데이터를 관리하는 서버 |
WO2014129378A1 (ja) * | 2013-02-20 | 2014-08-28 | 株式会社ソニー・コンピュータエンタテインメント | 文字列入力システム |
KR20140120506A (ko) * | 2013-04-03 | 2014-10-14 | 삼성전자주식회사 | 휴대단말기에서의 대화 레벨 부여 방법 및 장치 |
JP2016118991A (ja) * | 2014-12-22 | 2016-06-30 | カシオ計算機株式会社 | 画像生成装置、画像生成方法及びプログラム |
US10594638B2 (en) | 2015-02-13 | 2020-03-17 | International Business Machines Corporation | Point in time expression of emotion data gathered from a chat session |
US10652183B2 (en) * | 2017-06-30 | 2020-05-12 | Intel Corporation | Incoming communication filtering system |
CN108536499B (zh) * | 2018-01-02 | 2021-05-18 | 联想(北京)有限公司 | 信息处理方法和电子设备 |
US10522143B2 (en) * | 2018-02-27 | 2019-12-31 | Microsoft Technology Licensing, Llc | Empathetic personal virtual digital assistant |
US10367931B1 (en) * | 2018-05-09 | 2019-07-30 | Fuvi Cognitive Network Corp. | Apparatus, method, and system of cognitive communication assistant for enhancing ability and efficiency of users communicating comprehension |
KR102117963B1 (ko) * | 2019-06-27 | 2020-06-02 | 라인 가부시키가이샤 | 사용자의 행동 패턴에 기반하여 메시지의 기대 심리 레벨을 산출하는 전자 기기, 방법 및 컴퓨터 프로그램 |
CN112312225B (zh) | 2020-04-30 | 2022-09-23 | 北京字节跳动网络技术有限公司 | 信息展示方法、装置、电子设备和可读介质 |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11203227A (ja) * | 1998-01-19 | 1999-07-30 | Network Community Creation:Kk | チャット画面表示方法 |
US6064383A (en) | 1996-10-04 | 2000-05-16 | Microsoft Corporation | Method and system for selecting an emotional appearance and prosody for a graphical character |
WO2001046910A1 (en) | 1999-12-21 | 2001-06-28 | Electronic Arts Inc. | Behavioral learning for a visual representation in a communication environment |
JP2002325965A (ja) * | 2001-04-27 | 2002-11-12 | Sega Corp | 入力文字処理方法 |
US20030110450A1 (en) | 2001-12-12 | 2003-06-12 | Ryutaro Sakai | Method for expressing emotion in a text message |
JP2003271277A (ja) * | 2002-03-12 | 2003-09-26 | Sony Corp | 情報処理装置及び情報入力方法 |
US20030210265A1 (en) | 2002-05-10 | 2003-11-13 | Haimberg Nadav Y. | Interactive chat messaging |
EP1396984A1 (en) | 2002-09-04 | 2004-03-10 | Siemens Aktiengesellschaft | User interface for a mobile communication device |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6598020B1 (en) * | 1999-09-10 | 2003-07-22 | International Business Machines Corporation | Adaptive emotion and initiative generator for conversational systems |
KR20020059963A (ko) * | 2001-01-09 | 2002-07-16 | 김재길 | 근린 배급소 경쟁방식 전자상거래방법 및 장치 및 물품배달방법 |
US7058566B2 (en) * | 2001-01-24 | 2006-06-06 | Consulting & Clinical Psychology, Ltd. | System and method for computer analysis of computer generated communications to produce indications and warning of dangerous behavior |
US20020194006A1 (en) * | 2001-03-29 | 2002-12-19 | Koninklijke Philips Electronics N.V. | Text to visual speech system and method incorporating facial emotions |
US10298700B2 (en) * | 2002-06-25 | 2019-05-21 | Artimys Technologies Llc | System and method for online monitoring of and interaction with chat and instant messaging participants |
US7137070B2 (en) * | 2002-06-27 | 2006-11-14 | International Business Machines Corporation | Sampling responses to communication content for use in analyzing reaction responses to other communications |
US20040082839A1 (en) * | 2002-10-25 | 2004-04-29 | Gateway Inc. | System and method for mood contextual data output |
-
2004
- 2004-03-31 JP JP2004108023A patent/JP3930489B2/ja not_active Expired - Lifetime
- 2004-07-30 EP EP04771096A patent/EP1734453A4/en not_active Ceased
- 2004-07-30 CN CNB2004800426009A patent/CN100514312C/zh active Active
- 2004-07-30 WO PCT/JP2004/010936 patent/WO2005101216A1/ja not_active Application Discontinuation
-
2005
- 2005-03-04 TW TW094106545A patent/TW200534901A/zh unknown
- 2005-03-30 KR KR1020050026730A patent/KR100841590B1/ko active IP Right Grant
- 2005-03-31 US US11/094,378 patent/US20050223078A1/en not_active Abandoned
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6064383A (en) | 1996-10-04 | 2000-05-16 | Microsoft Corporation | Method and system for selecting an emotional appearance and prosody for a graphical character |
JPH11203227A (ja) * | 1998-01-19 | 1999-07-30 | Network Community Creation:Kk | チャット画面表示方法 |
WO2001046910A1 (en) | 1999-12-21 | 2001-06-28 | Electronic Arts Inc. | Behavioral learning for a visual representation in a communication environment |
JP2002325965A (ja) * | 2001-04-27 | 2002-11-12 | Sega Corp | 入力文字処理方法 |
US20030110450A1 (en) | 2001-12-12 | 2003-06-12 | Ryutaro Sakai | Method for expressing emotion in a text message |
JP2003271277A (ja) * | 2002-03-12 | 2003-09-26 | Sony Corp | 情報処理装置及び情報入力方法 |
US20030210265A1 (en) | 2002-05-10 | 2003-11-13 | Haimberg Nadav Y. | Interactive chat messaging |
EP1396984A1 (en) | 2002-09-04 | 2004-03-10 | Siemens Aktiengesellschaft | User interface for a mobile communication device |
Non-Patent Citations (2)
Title |
---|
PICARD, R. W.: "Building HAL: computers that sense, recognize, and respond to human emotion", PROCEEDINGS OF THE SPIE, vol. 4299, June 2001 (2001-06-01), pages 518 - 523 |
See also references of EP1734453A4 |
Also Published As
Publication number | Publication date |
---|---|
JP2005293280A (ja) | 2005-10-20 |
JP3930489B2 (ja) | 2007-06-13 |
US20050223078A1 (en) | 2005-10-06 |
KR100841590B1 (ko) | 2008-06-26 |
EP1734453A1 (en) | 2006-12-20 |
KR20060045040A (ko) | 2006-05-16 |
EP1734453A4 (en) | 2008-05-07 |
TW200534901A (en) | 2005-11-01 |
CN1934547A (zh) | 2007-03-21 |
CN100514312C (zh) | 2009-07-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR100841590B1 (ko) | 채팅 시스템, 통신장치, 그 제어방법 및 정보기억매체 | |
US20180011841A1 (en) | Enabling an im user to navigate a virtual world | |
KR101740274B1 (ko) | 이모티콘 탐색 방법 및 단말 | |
CN110249325A (zh) | 具有通信模型的输入*** | |
KR101200559B1 (ko) | 모바일 인스턴트 메신저에서 플래시콘을 제공하는 시스템,장치 및 방법 | |
US11792141B2 (en) | Automated messaging reply-to | |
US11625542B2 (en) | Instant messaging application configuration based on virtual world activities | |
US10200338B2 (en) | Integrating communication modes in persistent conversations | |
JP4854424B2 (ja) | チャットシステム、通信装置、その制御方法及びプログラム | |
JP2001160021A (ja) | 仮想空間による通信システム | |
KR101310274B1 (ko) | 메신저 서비스를 제공하는 방법 및 그 서버 | |
KR20090075397A (ko) | 이모티콘을 포함한 통신 메시지의 송수신 방법 | |
JP2009064418A (ja) | 個人的対象体を備えるインスタントメッセージシステム及びその方法 | |
CN104301202A (zh) | 一种即时通讯的振动信息表达方法和*** | |
CN110493120A (zh) | 一种用于发送设备操作指令的方法与设备 | |
KR20050027397A (ko) | 그림 문자 채팅을 지원하는 메시지 전송 방법 및 시스템 | |
JP2004102496A (ja) | テキストおよびキャラクタ表示システム、コンピュータプログラムならびにサーバ | |
US20240022535A1 (en) | System and method for dynamically generating suggestions to facilitate conversations between remote users | |
US11716298B2 (en) | Information processing apparatus, information processing method, and non-transitory computer readable medium | |
JP3694516B2 (ja) | 文字列表示システム、文字列表示方法及びプログラム | |
KR100415549B1 (ko) | 관심도를 반영한 다자간 대화 인터페이스 방법 | |
JP2003108506A (ja) | 通信システム及び表示方法 | |
KR20030063952A (ko) | 텍스트필터 프로그램을 이용한 유무선 육성게임시스템 및그 운영방법 | |
KR20060047148A (ko) | 보이스 이모티콘 제공 시스템 및 그 방법 | |
KR20040029643A (ko) | 동적 아이템을 이용한 채팅 시스템에서 대화서비스 제공방법 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2004771096 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 200480042600.9 Country of ref document: CN |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWW | Wipo information: withdrawn in national office |
Country of ref document: DE |
|
WWP | Wipo information: published in national office |
Ref document number: 2004771096 Country of ref document: EP |