CN110674385A - Method and device for matching customer service in a customer service upgrade scenario

Info

Publication number
CN110674385A
Authority
CN
China
Prior art keywords
customer
service
interaction
client
customer service
Legal status
Pending
Application number
CN201810612634.0A
Other languages
Chinese (zh)
Inventor
吴立欣
刘金鹤
陈鹏
Current Assignee
Alibaba Group Holding Ltd
Original Assignee
Alibaba Group Holding Ltd
Application filed by Alibaba Group Holding Ltd
Priority to CN201810612634.0A
Publication of CN110674385A

Landscapes

  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The application provides a method for matching customer service in a customer service upgrade scenario, which comprises the following steps: acquiring service interaction data generated by service interaction between a customer and a first customer service; parsing customer feature information of the customer from the service interaction data; evaluating the interaction features adapted to the customer according to the customer feature information; and matching the interaction features with the customer service labels of the customer services to obtain a second customer service whose customer service label matches the interaction features. According to this method, the customer's interaction features are evaluated and then matched against the customer service labels, so the second customer service obtained is the one that matches the customer's interaction features; the matched second customer service can therefore address the customer's problem in a targeted manner, improving the fit and targeting between the second customer service and the customer.

Description

Method and device for matching customer service in a customer service upgrade scenario
Technical Field
The application relates to the field of customer service, and in particular to a method and a device for matching customer service in a customer service upgrade scenario; it also relates to an electronic device.
Background
In the customer service field, a customer's issue may remain unresolved even after the customer has received service from the first customer service. In this case, a second customer service may intervene to resolve the issue. The first-line customer service often has to explain the customer's emotional characteristics and logical communication characteristics to the second customer service so that the second customer service can address the issue in a targeted manner. In addition, customer service staff differ in ability, and a problem the first customer service cannot solve can only be resolved if it is transferred to a second customer service whose abilities match the customer.
At present, the prior art simply hands a problem left unsolved by the first customer service to an arbitrary second customer service. In this case, the second customer service, knowing nothing of the customer's emotional characteristics and logical communication characteristics, cannot address the problem in a targeted manner; at the same time, the abilities of the second customer service may not match the customer, making the second customer service inefficient at solving the customer's problem.
Disclosure of Invention
The application provides a method for matching customer service in a customer service upgrade scenario, which aims to have a customer service matched to the customer's characteristics solve the customer's problem efficiently. The application also provides a device for matching customer service in a customer service upgrade scenario, and an electronic device.
The application provides a method for matching customer service in a customer service upgrade scenario, which comprises the following steps:
acquiring service interaction data generated by service interaction between a customer and a first customer service;
analyzing the customer characteristic information of the customer from the service interaction data;
evaluating the customer-adapted interaction characteristics according to the customer characteristic information;
and matching the interactive features with the customer service labels of the customer services to obtain second customer services of which the customer service labels are matched with the interactive features.
Optionally, the data type of the service interaction data includes at least one of: voice type, video type, and text type.
Optionally, if the data type of the service interaction data is a voice type, analyzing the customer feature information of the customer from the service interaction data, and implementing the following method:
analyzing the audio characteristic information of the client contained in the voice-type service interaction data as the client characteristic information;
correspondingly, the evaluation of the interaction characteristics adapted by the customer according to the customer characteristic information is realized by adopting the following modes:
calculating an emotion characteristic value of the client according to the audio characteristic information;
and comparing the emotion characteristic value of the client with a threshold interval of emotion characteristics, and taking the emotion characteristics corresponding to the threshold interval obtained by comparison as interactive characteristics adapted to the client.
Optionally, if the data type of the service interaction data is a voice type, after the step of obtaining the service interaction data generated by the service interaction between the customer and the first customer service is executed, and before the step of analyzing the customer feature information of the customer from the service interaction data is executed, the following operations are executed:
and converting the service interaction data of the voice type into service interaction data of the text type.
Optionally, the analyzing the customer feature information of the customer from the service interaction data is implemented in the following manner:
performing word segmentation on the service interaction data of the text type to obtain a service interaction subdata set of the text type as the customer characteristic information;
correspondingly, the evaluation of the interaction characteristics adapted by the customer according to the customer characteristic information is realized by adopting the following modes:
performing semantic analysis on the service interaction subdata contained in the service interaction subdata set of the text type;
and determining a logic capacity characteristic value and/or a communication expression characteristic value according to a semantic analysis result to serve as the interactive characteristic adapted to the client.
Optionally, the analyzing the customer feature information of the customer from the service interaction data is implemented in the following manner:
analyzing the audio characteristic information of the client contained in the voice-type service interaction data as the client characteristic information;
performing word segmentation on the service interaction data of the text type to obtain a service interaction subdata set of the text type as the customer characteristic information;
correspondingly, the evaluation of the interaction characteristics adapted by the customer according to the customer characteristic information is realized by adopting the following modes:
calculating an emotion characteristic value of the client according to the audio characteristic information;
comparing the emotion characteristic value of the client with a threshold interval of emotion characteristics, and taking the emotion characteristics corresponding to the threshold interval obtained by comparison as interaction characteristics adapted to the client;
performing semantic analysis on the service interaction subdata contained in the service interaction subdata set of the text type;
and determining a logic capacity characteristic value and/or a communication expression characteristic value according to a semantic analysis result to serve as the interactive characteristic adapted to the client.
Optionally, the audio feature information includes at least one of: volume information, pitch information, and volume fluctuation amplitude.
Optionally, if the data type of the service interaction data is a video type, before analyzing the customer feature information of the customer from the service interaction data, the following operations are performed:
decomposing the service interaction data of the video type into service interaction data of an image type and service interaction data of a voice type;
correspondingly, the analyzing of the customer characteristic information of the customer from the service interaction data is implemented in the following manner:
analyzing the image characteristic information of the customer contained in the image-type service interaction data as the customer characteristic information;
analyzing the audio characteristic information of the customer contained in the voice-type service interaction data as the customer characteristic information;
correspondingly, the evaluation of the interaction characteristics adapted by the customer according to the customer characteristic information is realized by adopting the following modes:
comparing the image characteristic information with emotion characteristic images in an emotion characteristic image library, and taking emotion characteristics corresponding to the emotion characteristic images obtained by comparison as interaction characteristics adapted to the client;
calculating an emotion characteristic value of the client according to the audio characteristic information;
and comparing the emotion characteristic value of the client with a threshold interval of emotion characteristics, and taking the emotion characteristics corresponding to the threshold interval obtained by comparison as interactive characteristics adapted to the client.
Optionally, if the data type of the service interaction data is a text type, the client feature information of the client is analyzed from the service interaction data, and the following method is adopted:
performing word segmentation on the service interaction data of the text type to obtain a service interaction subdata set of the text type;
performing semantic analysis on the service interaction subdata of the text type contained in the service interaction subdata set of the text type, and taking a semantic analysis result as the customer characteristic information;
correspondingly, the evaluation of the interaction characteristics adapted by the customer according to the customer characteristic information is realized by adopting the following modes:
calculating the emotion characteristic value of the client according to the semantic analysis result;
comparing the emotion characteristic value of the client with a threshold interval of emotion characteristics, and taking the emotion characteristics corresponding to the threshold interval obtained by comparison as interaction characteristics adapted to the client;
and determining a logic capacity characteristic value and/or a communication expression characteristic value according to a semantic analysis result to serve as the interactive characteristic adapted to the client.
Optionally, the method is executed after the customer interacts with the first customer service; before the step of acquiring the service interaction data generated by the service interaction between the customer and the first customer service is executed, the following operation is performed:
judging whether the first customer service has already solved the customer's problem, and if not, executing the step of acquiring the service interaction data generated by the service interaction between the customer and the first customer service.
Optionally, the customer service label of a customer service is determined in the following manner:
acquiring historical service data of the customer service;
analyzing service characteristics of the customer service based on the historical service data;
determining a customer service label for the customer service based on the service characteristics;
wherein the service features include at least one of: service topic, service object, rate of service success, service score.
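As an illustration of how such a label could be derived from historical service data using the service features listed above (service theme, service object, success rate, service score), the sketch below assumes a simple record format; the field names are hypothetical and not prescribed by the application.

```python
# A minimal sketch of building a customer service label from historical service records.
# The record fields ("theme", "object", "solved", "score") are illustrative assumptions.
from collections import Counter
from typing import Dict, List

def build_customer_service_label(history: List[Dict]) -> Dict:
    """history: records like {"theme": str, "object": str, "solved": bool, "score": float}."""
    if not history:
        return {}
    themes = Counter(record["theme"] for record in history)
    objects = Counter(record["object"] for record in history)
    return {
        "service_theme": themes.most_common(1)[0][0],          # most frequent service theme
        "service_object": objects.most_common(1)[0][0],        # most frequent service object
        "success_rate": sum(1 for r in history if r["solved"]) / len(history),
        "service_score": sum(r["score"] for r in history) / len(history),
    }
```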
Optionally, after the step of analyzing the customer characteristic information of the customer from the service interaction data is executed, and before the step of matching the interaction characteristic with the customer service label of the customer service is executed, the following operations are executed:
analyzing the service interaction data to obtain a service interaction subdata set;
extracting key service interaction subdata from the service interaction subdata set;
determining a service interaction theme according to the extracted key service interaction subdata;
correspondingly, after the step of matching the interactive features with the customer service labels of the customer services to obtain a second customer service whose customer service label matches the interactive features is executed, the following steps are executed:
judging whether the label of the second customer service matches the service interaction theme; if so, calling the second customer service to provide interaction service for the customer regarding the problem;
and if not, returning to execute the step of matching the interactive features with the customer service labels of the customer services to obtain a second customer service whose customer service label matches the interactive features.
Optionally, before the step of judging whether the label of the second customer service matches the service interaction theme is executed, the following operations are performed:
acquiring an interaction problem set corresponding to the service interaction theme;
determining, according to the service interaction data, an interaction problem corresponding to the service interaction data from the interaction problem set;
judging whether the first customer service has already solved the customer's interaction problem, and if not, executing the following operations:
judging whether the label of the second customer service matches the interaction problem; if not, returning to execute the step of matching the interactive features with the customer service labels of the customer services to obtain a second customer service whose customer service label matches the interactive features;
and if the label matches the interaction problem, executing the step of judging whether the label of the second customer service matches the service interaction theme.
The application also provides a device for matching customer service in a customer service upgrade scenario, which comprises:
the service interaction data acquisition unit is used for acquiring service interaction data generated by service interaction between a customer and a first customer service;
the customer characteristic information analyzing unit is used for analyzing the customer characteristic information of the customer from the service interaction data;
the interactive characteristic evaluation unit is used for evaluating the interactive characteristics adapted to the customer according to the customer characteristic information;
and the second customer service matching unit is used for matching the interactive characteristics with the customer service labels of the customer services to obtain a second customer service whose customer service label matches the interactive characteristics.
Optionally, the client characteristic information parsing unit includes:
the first customer characteristic information analysis subunit is used for analyzing the audio characteristic information of the customer contained in the voice type service interaction data as the customer characteristic information;
the interactive feature evaluation unit includes:
the first emotion characteristic value acquisition subunit is used for calculating the emotion characteristic value of the client according to the audio characteristic information;
and the first interactive characteristic determining subunit is used for comparing the emotion characteristic value of the client with a threshold interval of emotion characteristics, and taking the emotion characteristics corresponding to the threshold interval obtained by comparison as the interactive characteristics adapted to the client.
Optionally, the client characteristic information parsing unit includes:
the first word segmentation subunit is used for performing word segmentation on the service interaction data of the text type to obtain a service interaction sub-data set of the text type as the customer characteristic information;
the interactive feature evaluation unit includes:
a first semantic analysis subunit, configured to perform semantic analysis on the service interaction sub data included in the service interaction sub data set of the text type;
and the second interactive characteristic determining subunit is used for determining a logic capacity characteristic value and/or a communication expression characteristic value according to a semantic analysis result, and the logic capacity characteristic value and/or the communication expression characteristic value are used as the interactive characteristics adapted to the client.
The present application further provides an electronic device, comprising:
a memory and a processor;
the memory is to store computer-executable instructions, and the processor is to execute the computer-executable instructions to:
acquiring service interaction data generated by service interaction between a customer and a first customer service;
analyzing the customer characteristic information of the customer from the service interaction data;
evaluating the customer-adapted interaction characteristics according to the customer characteristic information;
and matching the interactive features with the customer service labels of the customer services to obtain second customer services of which the customer service labels are matched with the interactive features.
According to the technical solution for matching customer service in a customer service upgrade scenario provided by the application, the interaction features adapted to the customer are evaluated from the customer feature information parsed out of the service interaction data generated between the first customer service and the customer, and are then matched with the customer service labels of the customer services to obtain a second customer service whose label matches the interaction features, so that the matched second customer service solves the customer's problem in a targeted manner.
Drawings
FIG. 1 is a flowchart illustrating an embodiment of a method for matching customer service in a customer service upgrade scenario;
FIG. 2 is a schematic diagram of an application scenario of a method for matching customer service in a customer service upgrade scenario according to the present application;
FIG. 3 is a schematic diagram of an embodiment of an apparatus for matching customer service in a customer service upgrade scenario according to the present application;
fig. 4 is a schematic diagram of an embodiment of an electronic device provided in the present application.
Detailed Description
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application. However, the application can be implemented in many ways other than those described herein, and those skilled in the art can make similar generalizations without departing from the spirit of the application; the application is therefore not limited to the specific implementations disclosed below.
Fig. 1 is a flowchart of an embodiment of the method for matching customer service in a customer service upgrade scenario provided by the present application. The technical solution of this method is described below with reference to the flow shown in fig. 1.
The flow of one embodiment of the method for matching customer service in the customer service upgrade scenario shown in fig. 1 includes:
step S101, service interaction data generated by service interaction between a client and a first client service is obtained.
In a scenario where the customer interacts with customer service, the interaction may take place by voice, video, or text. Accordingly, the service interaction data generated during the interaction may be voice-type, video-type, or text-type service interaction data. If the service interaction data acquired in step S101 is voice-type, it may also be converted into text-type service interaction data; the parsing of customer feature information in the subsequent step S102 and the evaluation of interaction features in the subsequent step S103 may then be performed either on the acquired voice-type data or on the text-type data converted from it, which makes the processing of voice-type service interaction data more flexible and efficient. If the service interaction data acquired in step S101 is video-type, the following operation is performed after step S101 and before step S102: decomposing the video-type service interaction data into image-type service interaction data and voice-type service interaction data. The parsing in step S102 and the evaluation in step S103 may then be performed on the image-type and voice-type data obtained from the decomposition, which provides the ability to process video-type service interaction data.
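A minimal sketch of routing the acquired service interaction data by type before step S102 is shown below; the speech-to-text and video-demultiplexing helpers are explicit placeholders, since the application does not prescribe particular tools for these conversions.

```python
# Illustrative routing of service interaction data by type (text / voice / video).
# speech_to_text and demux_video are placeholders for an ASR engine and a demuxer;
# they are assumptions of this sketch, not components defined by the application.
from dataclasses import dataclass
from typing import List, Optional, Tuple

def speech_to_text(audio: bytes) -> str:
    """Placeholder for converting voice-type data to text-type data."""
    raise NotImplementedError

def demux_video(video: bytes) -> Tuple[List[bytes], bytes]:
    """Placeholder for decomposing video-type data into image frames and an audio track."""
    raise NotImplementedError

@dataclass
class InteractionData:
    text: Optional[str] = None
    audio: Optional[bytes] = None
    image_frames: Optional[List[bytes]] = None

def normalize_interaction_data(kind: str, payload) -> InteractionData:
    if kind == "text":
        return InteractionData(text=payload)
    if kind == "voice":
        # Keep the audio and optionally also convert it to text for text-based analysis.
        return InteractionData(audio=payload, text=speech_to_text(payload))
    if kind == "video":
        # Decompose into image-type and voice-type service interaction data.
        frames, audio = demux_video(payload)
        return InteractionData(audio=audio, image_frames=frames, text=speech_to_text(audio))
    raise ValueError(f"unsupported data type: {kind}")
```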
Step S102, analyzing the customer characteristic information of the customer from the service interaction data.
First, when the service interaction data acquired in step S101 is voice-type service interaction data, or is video-type service interaction data from which voice-type service interaction data has been decomposed, step S102 may be implemented as follows: parsing the audio feature information of the customer contained in the voice-type service interaction data as the customer feature information. The audio feature information includes at least one of: volume information, pitch information, and volume fluctuation amplitude. The volume information is the volume value of the voice-type interaction data, measured in decibels (dB), and reflects the intensity of the sound: when the customer speaks loudly while interacting with customer service, the volume value is correspondingly large. The volume value of the customer's voice can be measured by a volume detector; since the method of this embodiment is implemented by a computer device, a volume detector for measuring the volume level is typically installed in the computer device. The detector converts the vibration of the sound waves produced by the customer into a voltage and reports the sound level corresponding to that voltage, i.e. the customer's volume value. As the loudness of the customer's voice changes, the detector reports different volume values; for example, the fluctuation range of the customer's volume value may be detected as (0-30) dB, (31-50) dB, (51-70) dB, or (71-100) dB. The pitch information refers to the pitch of the voice-type interaction data, measured in hertz (Hz). Pitch is mainly determined by the frequency of vocal cord vibration; for example, when the customer speaks in an excited mood, the vocal cords may vibrate at a higher frequency, so the pitch of the emitted sound is higher. When the vibration frequency of the vocal cords changes as the customer speaks, the pitch value of the emitted sound changes accordingly. The pitch value of the customer's voice can be measured by a pitch detector, which is likewise typically installed in the computer device. When the pitch detector picks up the sound emitted by the customer, the sound wave has a certain vibration frequency; the detector senses and follows this frequency, generates an electrical signal, and derives the pitch value corresponding to that signal. When the vibration frequency of the customer's voice changes, i.e. the pitch changes, the detector reports different pitch values, for example 20 Hz, 30 Hz, 40 Hz, 60 Hz or 80 Hz. In this way, the volume value and its fluctuation range, and the pitch value and its fluctuation range, are detected from the voice-type service interaction data generated when the customer interacts with customer service, and are then used as the customer feature information in step S102 and the steps that follow.
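Purely as an illustration of how the volume and pitch features described above could be computed in software rather than by dedicated hardware detectors, the following sketch operates on a decoded mono NumPy signal; the frame length, reference level and autocorrelation-based pitch estimate are assumptions of this sketch, not something the application prescribes.

```python
# Illustrative extraction of volume (dB) and pitch (Hz) features from a mono PCM signal.
import numpy as np

def volume_db(frame: np.ndarray, ref: float = 1.0) -> float:
    """Root-mean-square level of one frame, expressed in decibels."""
    rms = np.sqrt(np.mean(np.square(frame)) + 1e-12)
    return 20.0 * np.log10(rms / ref + 1e-12)

def pitch_hz(frame: np.ndarray, sample_rate: int) -> float:
    """Rough pitch estimate via the first autocorrelation peak in the 50-500 Hz range."""
    frame = frame - np.mean(frame)
    corr = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    min_lag = int(sample_rate / 500)          # highest pitch considered: 500 Hz
    max_lag = int(sample_rate / 50)           # lowest pitch considered: 50 Hz
    lag = min_lag + int(np.argmax(corr[min_lag:max_lag]))
    return sample_rate / lag

def audio_features(signal: np.ndarray, sample_rate: int, frame_len: int = 2048) -> dict:
    """Mean volume, volume fluctuation amplitude, and mean pitch over fixed-size frames."""
    frames = [signal[i:i + frame_len] for i in range(0, len(signal) - frame_len, frame_len)]
    if not frames:
        return {}
    volumes = [volume_db(f) for f in frames]
    pitches = [pitch_hz(f, sample_rate) for f in frames]
    return {
        "volume_db": float(np.mean(volumes)),
        "volume_fluctuation_db": float(np.max(volumes) - np.min(volumes)),
        "pitch_hz": float(np.mean(pitches)),
    }
```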
Second, when text-type service interaction data is acquired in step S101, or voice-type service interaction data is acquired in step S101 and converted into text-type service interaction data, or video-type service interaction data is acquired in step S101 and the voice-type data decomposed from it is converted into text-type service interaction data, step S102 may also be implemented as follows. First, word segmentation is performed on the text-type service interaction data to obtain a text-type service interaction sub-data set as the customer feature information. Word segmentation here means splitting the text generated by the interaction between the customer and customer service into individual words; for example, a passage of text is cut into single words, which constitute the text-type service interaction sub-data. In the method of this embodiment, the text-type service interaction sub-data set may be used as the customer feature information in the following steps. Second, semantic analysis is performed on the text-type service interaction sub-data contained in the sub-data set, and the semantic analysis result is used as the customer feature information. After the text-type service interaction sub-data set is obtained by segmentation, an optimal segmentation result is determined with a statistical language model. The word segments in the optimal segmentation result jointly represent the semantics of the text interaction data and can be used as the customer feature information in the steps after step S102.
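As one possible way to realize the word-segmentation step just described, the sketch below uses the jieba segmenter (whose dictionary and HMM model play the role of the statistical language model mentioned above); treating jieba as the segmenter is an assumption of this illustration, not a requirement of the application.

```python
# Illustrative word segmentation of text-type service interaction data.
from typing import List

import jieba

def segment_interaction_text(text: str) -> List[str]:
    """Split the text-type service interaction data into a sub-data set of word segments."""
    return [word.strip() for word in jieba.lcut(text) if word.strip()]

# Example usage with a hypothetical interaction text:
# sub_data_set = segment_interaction_text("我昨天买的商品还没有发货")
```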
Furthermore, when video-type service interaction data is acquired in step S101 and decomposed into image-type service interaction data and voice-type service interaction data, step S102 may also be implemented as follows. First, the image feature information of the customer contained in the image-type service interaction data is parsed out as the customer feature information. The image-type service interaction data is generated during the interaction between the customer and customer service, and the images produced during this interaction may contain various things, such as the customer's face as well as images of other objects (for example, a purchased commodity). The image of the customer, i.e. the image feature information of the customer, can therefore be determined by image recognition, which may be implemented as follows: first, the feature pixels of each image are detected, a series of feature pixels describing the contour of the object depicted in the image; second, the contour represented by the feature pixels is compared with a human body image library or a face image library, and if it matches an image in the library, the image is determined to be the image feature information of the customer. Second, the audio feature information of the customer contained in the voice-type service interaction data is parsed out as the customer feature information. How the audio feature information decomposed from the video-type service interaction data serves as the customer feature information has been described in detail above and is not repeated here. In this way, the image feature information and the audio feature information of the customer are used as the customer feature information in the steps after step S102.
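A minimal sketch follows, assuming OpenCV's bundled Haar face cascade is used as a stand-in for the human body / face image library comparison described above; the cascade file and detection thresholds are illustrative choices rather than anything prescribed by the application.

```python
# Illustrative extraction of the customer's face region from an image frame decomposed
# from video-type interaction data.
import cv2

_face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def extract_customer_face(frame):
    """Return the largest detected face region of the frame, or None if no face is found."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = _face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda box: box[2] * box[3])  # keep the largest box
    return frame[y:y + h, x:x + w]
```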
Step S103, evaluating the interactive characteristics adapted by the customer according to the customer characteristic information.
First, when the service interaction data acquired in step S101 is voice-type, or voice-type data has been decomposed from acquired video-type data, and the audio feature information of the customer has been determined as the customer feature information in step S102, step S103 may be implemented as follows. First, the emotion feature value of the customer is calculated from the audio feature information. Step S102 gave the example that the fluctuation range of the customer's volume value may be (0-30) dB, (31-50) dB, (51-70) dB or (71-100) dB, and that the customer's pitch value may be 20 Hz, 30 Hz, 40 Hz, 60 Hz or 80 Hz. In step S103, the volume ranges (0-30) dB, (31-50) dB, (51-70) dB and (71-100) dB may be mapped, via a preset mapping relation, to emotion value intervals, for example (71-100), (51-71), (31-51) and (0-30) respectively. Second, the emotion feature value of the customer is compared with the threshold intervals of the emotion features, and the emotion feature corresponding to the matching threshold interval is taken as the interaction feature adapted to the customer. Because each emotion value interval has a mapping relation with an emotion feature, once the customer's emotion value interval is determined, the emotion feature corresponding to the matching interval can be used as the interaction feature adapted to the customer. For example: the emotion feature corresponding to the interval (71-100) is happy, the one corresponding to (51-71) is calm, the one corresponding to (31-51) is displeased, and the one corresponding to (0-30) is angry. Similarly, the corresponding emotion feature values can be calculated from the customer's pitch values of 20 Hz, 30 Hz, 40 Hz, 60 Hz and 80 Hz according to their mapping relations, and the customer's emotion feature can then be determined from the mapping between emotion feature values and emotion feature threshold intervals. Through these steps, the emotion feature of the customer is determined from the customer's audio feature information as the customer's interaction feature.
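The interval mappings above can be expressed directly as lookup tables; the sketch below mirrors the example boundaries and emotion labels, while the function names and the handling of out-of-range values are illustrative assumptions.

```python
# Illustrative mapping: measured volume (dB) -> emotion value interval -> emotion feature.

# (volume range in dB) -> (emotion value interval), following the example above.
VOLUME_TO_EMOTION_VALUE = [
    ((0, 30), (71, 100)),
    ((31, 50), (51, 71)),
    ((51, 70), (31, 51)),
    ((71, 100), (0, 30)),
]

# (emotion value interval) -> emotion feature used as the customer-adapted interaction feature.
EMOTION_THRESHOLDS = {
    (71, 100): "happy",
    (51, 71): "calm",
    (31, 51): "displeased",
    (0, 30): "angry",
}

def emotion_value_from_volume(volume_db: float) -> tuple:
    for (lo, hi), emotion_interval in VOLUME_TO_EMOTION_VALUE:
        if lo <= volume_db <= hi:
            return emotion_interval
    return (0, 30)  # treat anything louder than 100 dB as the lowest emotion interval

def interaction_feature_from_volume(volume_db: float) -> str:
    return EMOTION_THRESHOLDS[emotion_value_from_volume(volume_db)]

# interaction_feature_from_volume(65.0) -> "displeased"
```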
Second, when text-type service interaction data is acquired in step S101, or voice-type service interaction data is acquired in step S101 and converted into text-type data, or video-type service interaction data is acquired in step S101 and the voice-type data decomposed from it is converted into text-type data, and the text-type feature information of the customer has been determined as the customer feature information in step S102, step S103 may also be implemented as follows. First, semantic analysis is performed on the service interaction sub-data contained in the text-type service interaction sub-data set; this has already been described in detail in step S102 and is not repeated here. Second, a logical ability feature value and/or a communication expression feature value is determined from the semantic analysis result and used as the interaction feature adapted to the customer. The logical ability feature value represents the logical ability the customer exhibits when communicating with customer service, for example whether the customer's statements are coherent and consistent with what was said before. The communication expression feature value represents how the customer expresses themselves during the communication, for example whether the language the customer uses is related to the question or topic under discussion. After semantic analysis of the sub-data in the text-type service interaction sub-data set, the logical ability feature value is determined from the semantic analysis result as an interaction feature adapted to the customer. The word segments obtained by the segmentation described above jointly represent the semantics of the text interaction data and can therefore serve as the customer feature information. If the N-th word segment is related only to the preceding N-1 word segments and to no other words, that segment reflects the customer's logical communication ability. The logical ability feature value of the customer can therefore be determined by counting the proportion of word segments in a text that are related only to the preceding N-1 segments and to no other words, relative to the total number of segments; for example, when this proportion is 30%, the customer's logical ability feature value is determined to be 30%. Likewise, a communication expression feature value is determined from the semantic analysis result as an interaction feature adapted to the customer.
The word segments are either related or unrelated to the question or topic under discussion, so the proportion of segments related to the question or topic can be counted, and the customer's communication expression feature value is determined from the mapping between this proportion and the communication expression feature value. For example, when the proportion of segments related to the question or topic of the communication is 40%, the customer's communication expression feature value may be determined to be 40%.
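A minimal sketch of turning the semantic analysis result into the two feature values follows, assuming two placeholder relatedness predicates stand in for the semantic analysis itself; nothing here is an API defined by the application.

```python
# Illustrative computation of the logical ability and communication expression feature values
# from the segmented text. The two predicates are placeholders for the semantic analysis
# (e.g. a statistical language model or embedding similarity).
from typing import Callable, List

def logical_ability_value(
    segments: List[str],
    related_only_to_prefix: Callable[[List[str], int], bool],
) -> float:
    """Proportion of segments related only to the preceding segments (0.0-1.0)."""
    if not segments:
        return 0.0
    hits = sum(1 for n in range(len(segments)) if related_only_to_prefix(segments, n))
    return hits / len(segments)

def communication_expression_value(
    segments: List[str],
    topic: str,
    related_to_topic: Callable[[str, str], bool],
) -> float:
    """Proportion of segments related to the question or topic under discussion (0.0-1.0)."""
    if not segments:
        return 0.0
    hits = sum(1 for segment in segments if related_to_topic(segment, topic))
    return hits / len(segments)
```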
Furthermore, after the video-type service interaction data acquired in step S101 has been decomposed into image-type and voice-type service interaction data and the customer feature information has been determined in step S102, step S103 may be implemented as follows. First, the image feature information is compared with the emotion feature images in an emotion feature image library, and the emotion feature corresponding to the matching emotion feature image is taken as the interaction feature adapted to the customer. The image feature information may be facial image information of the customer; since the customer's facial expressions can vary, the image feature information can also vary, for example image feature information in a happy state, in a calm state, or in an angry state. The emotion feature image library stores emotion feature images for these various states, such as an emotion feature image of a happy state, of a calm state, and of an angry state. The image of each emotional state has certain characteristics: in the happy-state image the corners of the customer's mouth are turned up in a smile, in the calm state the face is relatively relaxed, and in the angry state the brows are furrowed. After comparison, the emotion feature corresponding to the matching emotion feature image is taken as the interaction feature adapted to the customer. Second, the emotion feature value of the customer is calculated from the audio feature information; how this is done has been described in detail above and is not repeated here. Third, the emotion feature value of the customer is compared with the threshold intervals of the emotion features, and the emotion feature corresponding to the matching threshold interval is taken as the interaction feature adapted to the customer; the specific implementation has likewise been described above. In the above manner, step S103 evaluates the interaction feature adapted to the customer from the customer feature information.
Optionally, after step S103 and before step S104, the following operations are further performed in order to determine the customer's interaction theme. First, the service interaction data is parsed to obtain a service interaction sub-data set. In the method of this embodiment, the data parsed to obtain the sub-data set may be the text-type service interaction data: either the text-type data acquired directly in step S101, or the text-type data converted from the voice-type data acquired in step S101, or the text-type data converted from the voice-type data decomposed out of the video-type data acquired in step S101. The parsing splits the text-type service interaction data into a service interaction sub-data set, in the same way as the word segmentation described above. Second, key service interaction sub-data is extracted from the sub-data set, and the service interaction theme is determined from the extracted key sub-data. Through these steps the service theme of the interaction between the customer and customer service is determined and can be used in the subsequent steps.
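As one possible realization of extracting key sub-data and a theme, the sketch below uses jieba's TF-IDF keyword extraction; the theme lexicon and its entries are hypothetical, and any keyword-scoring method could take jieba's place.

```python
# Illustrative extraction of key service interaction sub-data and a service interaction theme.
import jieba.analyse

# Hypothetical mapping from keywords to service interaction themes.
THEME_LEXICON = {
    "发货": "shipping time",
    "退款": "refund",
    "物流": "logistics",
}

def service_interaction_theme(text: str, top_k: int = 5) -> str:
    keywords = jieba.analyse.extract_tags(text, topK=top_k)  # key sub-data by TF-IDF weight
    for word in keywords:
        if word in THEME_LEXICON:
            return THEME_LEXICON[word]
    return "general"
```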
And step S104, matching the interactive characteristics with the customer service labels of the customer services to obtain second customer services of which the customer service labels are matched with the interactive characteristics.
The customer service label of a customer service can be determined in advance as follows. First, the historical service data of the customer service is acquired; this is the record, stored in a computer, of the customer service's past interactions with customers. Second, the service features of the customer service are analyzed from the historical service data, where the service features include at least one of: service theme, service object, service success rate and service score. For example, from the record of the customer service's historical interactions with customers, the service theme (such as a closing-time theme), the service object (such as elderly customers), the service success rate (such as 70%) and the service score (such as 9 points) can be analyzed. Third, the customer service label is determined from the service features; for example, after the service features above have been determined, they may be taken directly as the label of that customer service. The interaction features adapted to the customer are determined via step S103, for example: the customer's interaction features may be emotion features such as happy, calm or angry, or a logical communication ability value of 40%, or a communication expression ability value of 30%, and so on. After step S103, step S104 matches these interaction features with the customer service labels and obtains a second customer service whose label matches the interaction features. For example, for a customer whose emotion feature is angry, only a customer service whose label contains a service score of 9 points may serve as the second customer service. For another example, for a customer whose logical communication ability value is 40%, only a customer service whose label contains a service success rate of 60% or more may serve as the second customer service. Similarly, for a customer whose communication expression ability value is 30%, only a customer service whose service success rate is 70% and whose service score is 9 points or more may serve as the second customer service.
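A minimal sketch of the label matching in step S104 follows, mirroring the example rules above; the label fields, thresholds and selection order are illustrative assumptions rather than the application's prescribed algorithm.

```python
# Illustrative matching of the customer's interaction features against customer service labels.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class AgentLabel:
    agent_id: str
    service_theme: str
    success_rate: float   # e.g. 0.70 for 70%
    service_score: float  # e.g. 9.0 points

def match_second_customer_service(
    emotion: str,
    logical_ability: float,           # e.g. 0.40 for 40%
    communication_expression: float,  # e.g. 0.30 for 30%
    agents: List[AgentLabel],
) -> Optional[AgentLabel]:
    for agent in agents:
        if emotion == "angry" and agent.service_score < 9.0:
            continue  # angry customers only go to agents scored 9 points
        if logical_ability <= 0.40 and agent.success_rate < 0.60:
            continue  # low logical ability needs a success rate of 60% or more
        if communication_expression <= 0.30 and not (
            agent.success_rate >= 0.70 and agent.service_score >= 9.0
        ):
            continue  # low expression ability needs 70% success and a score of 9 or more
        return agent
    return None
```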
Optionally, the method for matching customer service in the customer service upgrade scenario may be executed during the service interaction between the customer and the first customer service. After step S104 has been executed, the following operations may optionally also be performed: judging whether the first customer service has already solved the customer's problem, the problem being determined from the service interaction data; if so, the customer's service interaction ends; if not, the second customer service is called to provide interaction service for the customer regarding the problem. Only when the first customer service has not solved the customer's problem does a second customer service need to be called in to intervene, which is why these steps are performed. The customer's problem may be determined as follows. First, the interaction problem set corresponding to the service interaction theme is acquired; the purpose of this step is to obtain the set of interaction problems corresponding to the theme determined above. Second, the interaction problem corresponding to the service interaction data is determined from the problem set according to the service interaction data; that is, after the candidate problem set for the theme has been obtained, the more precise interaction problem is determined from the service interaction data acquired in step S101. Third, it is judged whether the first customer service has already solved the customer's interaction problem; if not, the following operations are performed: judging whether the label of the second customer service matches the interaction problem, and if not, returning to execute step S104; if the label matches the interaction problem, executing the step of judging whether the label of the second customer service matches the service interaction theme. In other words, when the first customer service has failed to solve the customer's problem, it is first judged whether the label of the second customer service matches the customer's interaction problem, and only when it does is it further judged whether the label matches the interaction theme. Through these steps, the determined second customer service fits both the interaction theme the customer wants to learn about and the interaction problem the customer needs solved, which strengthens the targeting of the second customer service chosen to intervene.
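A minimal sketch of the escalation flow just described, assuming a simple label structure and a rematch callback that re-runs the step S104 matching; all names here are illustrative.

```python
# Illustrative escalation flow: escalate only if the first customer service did not solve the
# problem, and require the second customer service's label to match both the interaction
# problem and the service interaction theme.
from typing import Callable, Optional, Set

def choose_second_customer_service(
    first_solved: bool,
    candidate_id: str,
    label_problems: Set[str],
    label_theme: str,
    interaction_problem: str,
    interaction_theme: str,
    rematch: Callable[[], Optional[str]],
) -> Optional[str]:
    """Return the id of the second customer service to call, or None if no escalation is needed."""
    if first_solved:
        return None  # the first customer service already solved the problem
    if interaction_problem not in label_problems:
        return rematch()  # label does not match the interaction problem: redo the step S104 matching
    if label_theme != interaction_theme:
        return rematch()  # label does not match the service interaction theme: redo the matching
    return candidate_id  # call this second customer service to serve the customer
```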
Optionally, the method for matching customer service in the customer service upgrade scenario may also be executed after the customer has interacted with the first customer service. Before the service interaction data generated by the service interaction between the customer and the first customer service is acquired, the following operation is performed: judging whether the first customer service has already solved the customer's problem, and if not, executing the step of acquiring the service interaction data generated by the service interaction between the customer and the first customer service. For the specific implementation, reference may be made to the foregoing description, which is not repeated here.
The technical effects that can be obtained by the technical solution of the present application are analyzed below with reference to the application scenario diagram of the method for matching customer service in a customer service upgrade scenario shown in fig. 2.
As shown in fig. 2, when a customer with a problem seeks help over the phone, the first customer service receives the customer. While the first customer service handles the customer's consultation, the system records the service interaction data in real time. The customer's emotion features, logical ability feature value, communication expression ability value and so on are then evaluated from the stored service interaction data, and a second customer service matching the customer's features is matched. When it is determined that the first-line customer service cannot solve the customer's problem, the matched second customer service is connected and the customer is called automatically for service until the service is completed. Thus, according to the technical solution for matching customer service in the customer service upgrade scenario, the customer's features are evaluated from the service interaction data generated by the service between the first customer service and the customer, and are then matched with the customer service labels to obtain a matched second customer service, so that the matched second customer service solves the customer's problem in a targeted manner. Optionally, the customer's features are evaluated from voice-type, text-type or video-type service interaction data and then matched with the customer service labels to obtain a second customer service matching the interaction features, which improves the ability to match a second customer service across different types of interaction between customer service and the customer. Optionally, the customer's interaction theme or interaction problem can also be determined, and a second customer service matching that theme or problem is then determined, which further improves the fit and targeting between the second customer service and the customer.
The application also provides a device for matching customer service in a customer service upgrade scenario. FIG. 3 is a schematic diagram of an embodiment of an apparatus for matching customer service in a customer service upgrade scenario provided by the present application. Since the apparatus embodiment is substantially similar to the method embodiment, it is described relatively simply; for the relevant portions, reference may be made to the corresponding description of the method embodiments provided above.
Fig. 3 illustrates an apparatus for matching customer service in a customer service upgrade scenario according to an embodiment of the present disclosure, including:
a service interaction data obtaining unit 301, configured to obtain service interaction data generated by service interaction between a customer and a first customer service;
a customer characteristic information parsing unit 302, configured to parse customer characteristic information of the customer from the service interaction data;
an interactive feature evaluating unit 303, configured to evaluate the customer-adapted interactive feature according to the customer feature information;
and a second customer service matching unit 304, configured to match the interaction feature with the customer service labels of the customer services, so as to obtain a second customer service whose customer service label matches the interaction feature.
Optionally, the service interaction data obtaining unit 301 includes:
the voice type service interaction data acquisition subunit is used for acquiring voice type service interaction data;
the video type service interaction data acquisition subunit is used for acquiring video type service interaction data;
and the text type service interaction data acquisition subunit is used for acquiring the text type service interaction data.
Optionally, the client characteristic information parsing unit 302 includes:
the first customer characteristic information analysis subunit is used for analyzing the audio characteristic information of the customer contained in the voice type service interaction data as the customer characteristic information;
the interactive feature evaluation unit 303 includes:
the first emotion characteristic value acquisition subunit is used for calculating the emotion characteristic value of the client according to the audio characteristic information;
and the first interactive characteristic determining subunit is used for comparing the emotion characteristic value of the client with a threshold interval of emotion characteristics, and taking the emotion characteristics corresponding to the threshold interval obtained by comparison as the interactive characteristics adapted to the client.
Optionally, the apparatus for matching customer service in a customer service upgrade scenario includes:
the first service interaction data conversion unit is used for converting the service interaction data of the voice type into service interaction data of the text type.
Optionally, the client characteristic information parsing unit 302 includes:
the first word segmentation subunit is used for performing word segmentation on the service interaction data of the text type to obtain a service interaction sub-data set of the text type as the customer characteristic information;
the interactive feature evaluation unit 303 includes:
a first semantic analysis subunit, configured to perform semantic analysis on the service interaction sub data included in the service interaction sub data set of the text type;
and the second interactive characteristic determining subunit is used for determining a logic capacity characteristic value and/or a communication expression characteristic value according to a semantic analysis result, and the logic capacity characteristic value and/or the communication expression characteristic value are used as the interactive characteristics adapted to the client.
Optionally, the apparatus for matching customer service in the customer service upgrade scenario includes:
the second service interaction data conversion unit is used for decomposing the service interaction data of the video type into service interaction data of an image type and service interaction data of a voice type;
the client characteristic information parsing unit 302 includes:
a second customer characteristic information analysis subunit, configured to analyze the image characteristic information of the customer included in the image-type service interaction data, as the customer characteristic information;
a third customer characteristic information analysis subunit, configured to analyze the audio characteristic information of the customer included in the voice-type service interaction data, as the customer characteristic information;
the interactive feature evaluation unit 303 includes:
the emotion characteristic image comparison subunit is used for comparing the image characteristic information with emotion characteristic images in an emotion characteristic image library, and taking emotion characteristics corresponding to the emotion characteristic images obtained through comparison as interaction characteristics adapted to the client;
the second emotion characteristic value acquisition subunit is used for calculating the emotion characteristic value of the client according to the audio characteristic information;
and the third interactive characteristic determining subunit is used for comparing the emotion characteristic value of the client with the threshold interval of the emotion characteristic, and taking the emotion characteristic corresponding to the threshold interval obtained by comparison as the interactive characteristic adapted to the client.
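For the video branch, comparing the image characteristic information against an emotion characteristic image library can be read as a nearest-neighbour lookup over labelled feature vectors; the library contents and the Euclidean distance below are assumptions used only for illustration:

```python
import math
from typing import Dict, List

# Hypothetical emotion characteristic image library: emotion label -> feature vector.
EMOTION_IMAGE_LIBRARY: Dict[str, List[float]] = {
    "calm": [0.1, 0.2, 0.1],
    "angry": [0.9, 0.8, 0.7],
}

def closest_emotion(image_features: List[float]) -> str:
    # Compare the extracted image characteristic vector with each library entry (Euclidean distance).
    def dist(a: List[float], b: List[float]) -> float:
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(EMOTION_IMAGE_LIBRARY, key=lambda e: dist(image_features, EMOTION_IMAGE_LIBRARY[e]))

print(closest_emotion([0.85, 0.75, 0.6]))  # -> "angry"
```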
Optionally, the client characteristic information parsing unit 302 includes:
the second word segmentation subunit is used for performing word segmentation on the service interaction data of the text type to obtain a service interaction sub-data set of the text type;
a second semantic analysis subunit, configured to perform semantic analysis on the service interaction sub data of the text type included in the service interaction sub data set of the text type, and use a semantic analysis result as the customer feature information;
the interactive feature evaluation unit 303 includes:
the third emotion characteristic value acquisition subunit is used for calculating the emotion characteristic value of the client according to the semantic analysis result;
the fourth interactive characteristic determining subunit is configured to compare the emotional characteristic value of the client with a threshold interval of emotional characteristics, and use the emotional characteristic corresponding to the threshold interval obtained through comparison as an interactive characteristic adapted to the client;
and the fifth interactive characteristic determining subunit is used for determining a logic capacity characteristic value and/or a communication expression characteristic value according to the semantic analysis result, and the logic capacity characteristic value and/or the communication expression characteristic value are used as the interactive characteristics adapted to the client.
Optionally, the apparatus for matching customer service in the customer service upgrade scenario includes:
a historical service data acquisition unit, configured to acquire historical service data of the customer service;
a service characteristic analysis unit, configured to analyze a service characteristic of the customer service based on the historical service data, where the service characteristic includes at least one of: service subject, service object, service success rate, service score;
and the customer service label determining unit is used for determining the customer service label of the customer service based on the service characteristics.
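One plausible reading of the label-determination units is simple aggregation over historical service records, with thresholds on service success rate and service score; the thresholds and label names below are assumptions:

```python
from dataclasses import dataclass
from typing import List, Set

@dataclass
class ServiceRecord:
    topic: str      # service subject
    solved: bool    # feeds the service success rate
    score: float    # service score given by the customer, e.g. 1..5

def customer_service_labels(history: List[ServiceRecord]) -> Set[str]:
    labels: Set[str] = {r.topic for r in history}            # topics the agent has handled
    if history:
        success_rate = sum(r.solved for r in history) / len(history)
        avg_score = sum(r.score for r in history) / len(history)
        if success_rate >= 0.8:                               # threshold is an assumption
            labels.add("high_success_rate")
        if avg_score >= 4.5:                                  # threshold is an assumption
            labels.add("highly_rated")
    return labels

print(customer_service_labels([ServiceRecord("refund", True, 4.8),
                               ServiceRecord("refund", True, 4.6)]))
```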
Optionally, the apparatus for matching customer service in the customer service upgrade scenario includes:
a service interaction subdata set obtaining unit, configured to parse the service interaction data to obtain a service interaction subdata set;
an extracting unit, configured to extract key service interaction subdata from the service interaction subdata set;
a service interaction theme determining unit, configured to determine a service interaction theme according to the extracted key service interaction subdata;
a first judging unit, configured to judge whether the label of the second customer service matches the service interaction theme; if so, call the second customer service to perform interaction service for the customer aiming at the problem; and if not, return to the second customer service matching unit for execution.
Optionally, the apparatus for matching customer service in the customer service upgrade scenario includes:
the interactive problem set acquisition unit is used for acquiring an interactive problem set corresponding to the service interactive theme;
the interaction problem determining unit is used for determining an interaction problem corresponding to the service interaction data from the interaction problem set according to the service interaction data;
a second judging unit, configured to judge whether the first customer service has already solved the interaction problem of the customer, and if not, perform the following operations: judge whether the label of the second customer service matches the interaction problem; if not, return to the second customer service matching unit for execution; and if the label matches the interaction problem, return to the first judging unit for execution.
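Taken together, the matching unit and the two judging units describe a small decision loop: skip escalation if the first customer service already solved the problem, otherwise keep re-matching until a second customer service whose label fits both the interaction problem and the service interaction theme is found. A compact, illustrative sketch of that control flow, with the data layout and predicates as assumptions:

```python
def escalate(ctx: dict, candidates: list):
    """Illustrative control flow only; not the claimed implementation."""
    if ctx["solved_by_first_service"]:
        return None                                   # nothing to escalate
    for agent in candidates:                          # re-running the matching unit = trying the next candidate
        if ctx["interaction_problem"] not in agent["labels"]:
            continue                                  # label does not match the interaction problem
        if ctx["service_theme"] not in agent["labels"]:
            continue                                  # label does not match the service interaction theme
        return agent                                  # call this second customer service for the customer
    return None

second = escalate(
    {"solved_by_first_service": False,
     "interaction_problem": "refund_not_received",
     "service_theme": "refund"},
    [{"name": "A", "labels": {"logistics"}},
     {"name": "B", "labels": {"refund", "refund_not_received"}}],
)
print(second and second["name"])  # -> B
```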
The application also provides an electronic device. Fig. 4 shows an embodiment of the electronic device. Since the electronic device embodiment is substantially similar to the method embodiment, it is described relatively briefly; for relevant details, refer to the corresponding description of the method embodiment provided above.
An electronic device according to an embodiment of the present application includes:
a processor 401 and a memory 402;
the memory is configured to store computer-executable instructions, and the processor is configured to execute the computer-executable instructions to perform the following operations: acquiring service interaction data generated by service interaction between a customer and a first customer service;
analyzing the customer characteristic information of the customer from the service interaction data;
evaluating the customer-adapted interaction characteristics according to the customer characteristic information;
and matching the interactive features with the customer service labels of the customer services to obtain second customer services of which the customer service labels are matched with the interactive features.
Optionally, the data type of the service interaction data includes at least one of: voice type, video type, and text type.
Optionally, if the data type of the service interaction data is a voice type, the customer feature information of the customer is analyzed from the service interaction data by executing the following instructions:
analyzing the audio characteristic information of the client contained in the voice-type service interaction data as the client characteristic information;
correspondingly, the evaluation of the interactive characteristics adapted to the customer according to the customer characteristic information is realized by adopting the following instructions:
calculating an emotion characteristic value of the client according to the audio characteristic information;
and comparing the emotion characteristic value of the client with a threshold interval of emotion characteristics, and taking the emotion characteristics corresponding to the threshold interval obtained by comparison as interactive characteristics adapted to the client.
Optionally, if the data type of the service interaction data is a voice type, after the instruction of acquiring the service interaction data generated by the service interaction between the customer and the first customer service is executed, and before the instruction of analyzing the customer feature information of the customer from the service interaction data is executed, the following instruction is executed: converting the voice-type service interaction data into text-type service interaction data.
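The voice-to-text conversion is ordinary speech recognition; the application does not name an engine. A hedged sketch using the third-party SpeechRecognition package (the choice of package, the WAV input, and the zh-CN language setting are all assumptions):

```python
# Requires: pip install SpeechRecognition (the choice of engine is an assumption, not from the application)
import speech_recognition as sr

def voice_interaction_to_text(wav_path: str) -> str:
    recognizer = sr.Recognizer()
    with sr.AudioFile(wav_path) as source:       # voice-type service interaction data
        audio = recognizer.record(source)
    # Google's free web recognizer is used here only as an example back end.
    return recognizer.recognize_google(audio, language="zh-CN")
```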
Optionally, the analyzing the customer feature information of the customer from the service interaction data is implemented by using the following instructions:
performing word segmentation on the service interaction data of the text type to obtain a service interaction subdata set of the text type as the customer characteristic information;
correspondingly, the evaluation of the interactive characteristics adapted to the customer according to the customer characteristic information is realized by adopting the following instructions:
performing semantic analysis on the service interaction subdata contained in the service interaction subdata set of the text type;
and determining a logic capacity characteristic value and/or a communication expression characteristic value according to a semantic analysis result to serve as the interactive characteristic adapted to the client.
Optionally, the analyzing the customer feature information of the customer from the service interaction data is implemented by using the following instructions:
analyzing the audio characteristic information of the client contained in the voice-type service interaction data as the client characteristic information;
performing word segmentation on the service interaction data of the text type to obtain a service interaction subdata set of the text type as the customer characteristic information;
correspondingly, the evaluation of the interactive characteristics adapted to the customer according to the customer characteristic information is realized by adopting the following instructions:
calculating an emotion characteristic value of the client according to the audio characteristic information;
comparing the emotion characteristic value of the client with a threshold interval of emotion characteristics, and taking the emotion characteristics corresponding to the threshold interval obtained by comparison as interaction characteristics adapted to the client;
performing semantic analysis on the service interaction subdata contained in the service interaction subdata set of the text type;
and determining a logic capacity characteristic value and/or a communication expression characteristic value according to a semantic analysis result to serve as the interactive characteristic adapted to the client.
Optionally, the audio feature information includes at least one of: volume information, tone information, and volume fluctuation amplitude.
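The three listed audio features could be extracted from a mono PCM signal in many ways; the sketch below uses only the standard library, treating per-frame RMS energy as volume, zero-crossing rate as a rough stand-in for tone, and the spread of frame energies as the volume fluctuation amplitude. All three proxies are assumptions:

```python
import math
from typing import List, Tuple

def audio_features(samples: List[float], frame_size: int = 1024) -> Tuple[float, float, float]:
    """Return (volume, tone_proxy, volume_fluctuation) for a mono signal scaled to [-1, 1]."""
    if not samples:
        return 0.0, 0.0, 0.0
    frames = [samples[i:i + frame_size] for i in range(0, len(samples), frame_size)]
    frame_rms = [math.sqrt(sum(x * x for x in f) / len(f)) for f in frames]
    volume = sum(frame_rms) / len(frame_rms)                      # average loudness (RMS)
    zero_crossings = sum(1 for a, b in zip(samples, samples[1:]) if a * b < 0)
    tone_proxy = zero_crossings / len(samples)                    # crude pitch indicator
    fluctuation = max(frame_rms) - min(frame_rms)                 # volume fluctuation amplitude
    return volume, tone_proxy, fluctuation
```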
Optionally, if the data type of the service interaction data is a video type, before analyzing the customer feature information of the customer from the service interaction data, the following instruction is executed:
decomposing the service interaction data of the video type into service interaction data of an image type and service interaction data of a voice type;
correspondingly, the customer characteristic information of the customer is analyzed from the service interaction data by executing the following instructions:
analyzing the image characteristic information of the customer contained in the image-type service interaction data as the customer characteristic information;
analyzing the audio characteristic information of the customer contained in the voice-type service interaction data as the customer characteristic information;
correspondingly, the evaluation of the customer-adapted interaction characteristics according to the customer characteristic information is realized by adopting the following instructions:
comparing the image characteristic information with emotion characteristic images in an emotion characteristic image library, and taking emotion characteristics corresponding to the emotion characteristic images obtained by comparison as interaction characteristics adapted to the client;
calculating an emotion characteristic value of the client according to the audio characteristic information;
and comparing the emotion characteristic value of the client with a threshold interval of emotion characteristics, and taking the emotion characteristics corresponding to the threshold interval obtained by comparison as interactive characteristics adapted to the client.
Optionally, if the data type of the service interaction data is a text type, the customer feature information of the customer is analyzed from the service interaction data by executing the following instructions:
performing word segmentation on the service interaction data of the text type to obtain a service interaction subdata set of the text type;
performing semantic analysis on the service interaction subdata of the text type contained in the service interaction subdata set of the text type, and taking a semantic analysis result as the customer characteristic information;
correspondingly, the evaluation of the interactive characteristics adapted to the customer according to the customer characteristic information is realized by adopting the following instructions:
calculating the emotion characteristic value of the client according to the semantic analysis result;
comparing the emotion characteristic value of the client with a threshold interval of emotion characteristics, and taking the emotion characteristics corresponding to the threshold interval obtained by comparison as interaction characteristics adapted to the client;
and determining a logic capacity characteristic value and/or a communication expression characteristic value according to a semantic analysis result to serve as the interactive characteristic adapted to the client.
Optionally, the instructions for matching the customer service in the customer service upgrade scenario are executed during the service interaction between the customer and the first customer service;
correspondingly, after the instruction of matching the interactive feature with the customer service label of the customer service to obtain a second customer service whose customer service label matches the interactive feature is executed, the following instructions are executed:
judging whether the first customer service has solved the problem of the customer, if so, ending the service interaction of the customer, wherein the problem is determined according to the service interaction data;
and if not, calling the second customer service to perform interactive service on the customer aiming at the problem.
Optionally, the instructions for matching the customer service in the customer service upgrade scenario are executed after the customer interacts with the first customer service, and before the instruction of acquiring the service interaction data generated by the service interaction between the customer and the first customer service is executed, the following instruction is executed:
judging whether the first customer service has solved the problem of the customer, and if not, executing the instruction of acquiring the service interaction data generated by the service interaction between the customer and the first customer service.
Optionally, the customer service label of the customer service is determined by the following instructions:
acquiring historical service data of the customer service;
analyzing service characteristics of the customer service based on the historical service data;
determining a customer service label for the customer service based on the service characteristics;
wherein the service features include at least one of: service topic, service object, rate of service success, service score.
Optionally, after the instruction of analyzing the customer feature information of the customer from the service interaction data is executed, and before the instruction of matching the interaction feature with the customer service label of the customer service to obtain a second customer service whose customer service label matches the interaction feature is executed, the following instructions are executed:
analyzing the service interaction data to obtain a service interaction subdata set;
extracting key service interaction subdata from the service interaction subdata set;
determining a service interaction theme according to the extracted key service interaction subdata;
correspondingly, after the instruction of matching the interactive feature with the customer service label of the customer service to obtain a second customer service whose customer service label matches the interactive feature is executed, the following instructions are executed:
judging whether the label of the second customer service matches the service interaction theme, and if so, calling the second customer service to perform interaction service for the customer aiming at the problem;
and if not, returning to execute the instruction of matching the interactive feature with the customer service label of the customer service to obtain a second customer service whose customer service label matches the interactive feature.
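Determining the service interaction theme from key service interaction subdata can be read as keyword extraction followed by a lookup; the stopword list and keyword-to-theme table below are assumptions used only to make the flow concrete:

```python
from collections import Counter

STOPWORDS = {"the", "a", "is", "i", "my", "to", "and", "it", "was", "want"}
THEME_KEYWORDS = {"refund": "after_sales", "broken": "after_sales",
                  "delivery": "logistics", "invoice": "billing"}

def service_interaction_theme(sub_data: list) -> str:
    # Keep the most frequent non-stopword tokens as the key service interaction subdata.
    counts = Counter(t.lower() for t in sub_data if t.lower() not in STOPWORDS)
    for token, _ in counts.most_common():
        if token in THEME_KEYWORDS:
            return THEME_KEYWORDS[token]
    return "general"

print(service_interaction_theme("I want a refund the delivery was broken".split()))  # -> "after_sales"
```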
Optionally, before the instruction of judging whether the label of the second customer service matches the service interaction theme is executed, the following instructions are executed:
acquiring an interaction problem set corresponding to the service interaction theme;
according to the service interaction data, determining an interaction problem corresponding to the service interaction data from the interaction problem set;
judging whether the first customer service has already solved the interaction problem of the customer, and if not, executing the following instructions:
judging whether the label of the second customer service matches the interaction problem, and if not, returning to execute the instruction of matching the interaction feature with the customer service label of the customer service to obtain a second customer service whose customer service label matches the interaction feature;
and if the label of the second customer service matches the interaction problem, executing the instruction of judging whether the label of the second customer service matches the service interaction theme.
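Selecting the interaction problem from the theme's interaction problem set amounts to a similarity search over the service interaction data; a minimal token-overlap version, with a made-up question set, might look like this:

```python
def pick_interaction_problem(interaction_text: str, question_set: list) -> str:
    # Choose the question sharing the most tokens with the service interaction data.
    tokens = set(interaction_text.lower().split())
    return max(question_set, key=lambda q: len(tokens & set(q.lower().split())))

questions = ["refund has not arrived", "goods damaged in delivery", "cannot change the invoice"]
print(pick_interaction_problem("my refund still has not arrived yet", questions))
# -> "refund has not arrived"
```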
Although the present application has been described with reference to preferred embodiments, these embodiments are not intended to limit the present application. Those skilled in the art can make variations and modifications without departing from the spirit and scope of the present application; therefore, the protection scope of the present application should be determined by the appended claims.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, random access memory (RAM) and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). The memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transitory media), such as modulated data signals and carrier waves.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.

Claims (18)

1. A method for matching customer service in a customer service upgrade scenario, characterized by comprising the following steps:
acquiring service interaction data generated by service interaction between a customer and a first customer service;
analyzing the customer characteristic information of the customer from the service interaction data;
evaluating the customer-adapted interaction characteristics according to the customer characteristic information;
and matching the interactive features with the customer service labels of the customer services to obtain second customer services of which the customer service labels are matched with the interactive features.
2. The method for matching customer service in the scenario of customer service upgrade as claimed in claim 1, wherein the data type of the service interaction data comprises at least one of the following items:
voice type, video type, and text type.
3. The method for matching customer service in the customer service upgrade scenario according to claim 2, wherein if the data type of the service interaction data is a voice type, the customer feature information of the customer is parsed from the service interaction data, and the following method is adopted:
analyzing the audio characteristic information of the client contained in the voice-type service interaction data as the client characteristic information;
correspondingly, the evaluation of the interaction characteristics adapted by the customer according to the customer characteristic information is realized by adopting the following modes:
calculating an emotion characteristic value of the client according to the audio characteristic information;
and comparing the emotion characteristic value of the client with a threshold interval of emotion characteristics, and taking the emotion characteristics corresponding to the threshold interval obtained by comparison as interactive characteristics adapted to the client.
4. The method for matching customer service in a customer service upgrade scenario according to claim 2, wherein if the data type of the service interaction data is a voice type, after the step of obtaining the service interaction data generated by the customer interacting with the first customer service is executed, and before the step of analyzing the customer feature information of the customer from the service interaction data is executed, the following operations are executed:
and converting the service interaction data of the voice type into service interaction data of the text type.
5. The method for matching customer service in the customer service upgrade scenario according to claim 4, wherein the analyzing the customer feature information of the customer from the service interaction data is implemented as follows:
performing word segmentation on the service interaction data of the text type to obtain a service interaction subdata set of the text type as the customer characteristic information;
correspondingly, the evaluation of the interaction characteristics adapted by the customer according to the customer characteristic information is realized by adopting the following modes:
performing semantic analysis on the service interaction subdata contained in the service interaction subdata set of the text type;
and determining a logic capacity characteristic value and/or a communication expression characteristic value according to a semantic analysis result to serve as the interactive characteristic adapted to the client.
6. The method for matching customer service in the customer service upgrade scenario according to claim 4, wherein the analyzing the customer feature information of the customer from the service interaction data is implemented as follows:
analyzing the audio characteristic information of the client contained in the voice-type service interaction data as the client characteristic information;
performing word segmentation on the service interaction data of the text type to obtain a service interaction subdata set of the text type as the customer characteristic information;
correspondingly, the evaluation of the interaction characteristics adapted by the customer according to the customer characteristic information is realized by adopting the following modes:
calculating an emotion characteristic value of the client according to the audio characteristic information;
comparing the emotion characteristic value of the client with a threshold interval of emotion characteristics, and taking the emotion characteristics corresponding to the threshold interval obtained by comparison as interaction characteristics adapted to the client;
performing semantic analysis on the service interaction subdata contained in the service interaction subdata set of the text type;
and determining a logic capacity characteristic value and/or a communication expression characteristic value according to a semantic analysis result to serve as the interactive characteristic adapted to the client.
7. The method for matching customer service in the customer service upgrade scenario according to claim 3 or 6, wherein the audio feature information comprises at least one of:
volume information, tone information, and volume fluctuation amplitude.
8. The method for matching customer service in a customer service upgrade scenario according to claim 2, wherein if the data type of the service interaction data is a video type, before analyzing the customer feature information of the customer from the service interaction data, the following operations are performed:
decomposing the service interaction data of the video type into service interaction data of an image type and service interaction data of a voice type;
correspondingly, the analyzing of the customer characteristic information of the customer from the service interaction data is implemented in the following manner:
analyzing the image characteristic information of the customer contained in the image-type service interaction data as the customer characteristic information;
analyzing the audio characteristic information of the customer contained in the voice-type service interaction data as the customer characteristic information;
correspondingly, the evaluation of the interaction characteristics adapted by the customer according to the customer characteristic information is realized by adopting the following modes:
comparing the image characteristic information with emotion characteristic images in an emotion characteristic image library, and taking emotion characteristics corresponding to the emotion characteristic images obtained by comparison as interaction characteristics adapted to the client;
calculating an emotion characteristic value of the client according to the audio characteristic information;
and comparing the emotion characteristic value of the client with a threshold interval of emotion characteristics, and taking the emotion characteristics corresponding to the threshold interval obtained by comparison as interactive characteristics adapted to the client.
9. The method for matching customer service in the customer service upgrade scenario according to claim 2, wherein if the data type of the service interaction data is a text type, the customer feature information of the customer is parsed from the service interaction data, and the method is implemented as follows:
performing word segmentation on the service interaction data of the text type to obtain a service interaction subdata set of the text type;
performing semantic analysis on the service interaction subdata of the text type contained in the service interaction subdata set of the text type, and taking a semantic analysis result as the customer characteristic information;
correspondingly, the evaluation of the interaction characteristics adapted by the customer according to the customer characteristic information is realized by adopting the following modes:
calculating the emotion characteristic value of the client according to the semantic analysis result;
comparing the emotion characteristic value of the client with a threshold interval of emotion characteristics, and taking the emotion characteristics corresponding to the threshold interval obtained by comparison as interaction characteristics adapted to the client;
and determining a logic capacity characteristic value and/or a communication expression characteristic value according to a semantic analysis result to serve as the interactive characteristic adapted to the client.
10. The method for matching customer service in the customer service upgrade scenario according to any one of claims 1 to 9, wherein the method for matching customer service in the customer service upgrade scenario is performed during the service interaction between the customer and the first customer service;
correspondingly, after the step of matching the interactive feature with the customer service label of the customer service to obtain a second customer service whose customer service label matches the interactive feature is executed, the following operations are executed:
judging whether the first customer service has solved the problem of the customer, if so, ending the service interaction of the customer, wherein the problem is determined according to the service interaction data;
and if not, calling the second customer service to perform interactive service on the customer aiming at the problem.
11. The method for matching customer service in the customer service upgrade scenario of claim 1, wherein the method is performed after the customer interacts with the first customer service, and before the obtaining of the service interaction data generated by the service interaction between the customer and the first customer service, the following operations are performed:
judging whether the first customer service has solved the problem of the customer, and if not, executing the step of acquiring the service interaction data generated by the service interaction between the customer and the first customer service.
12. The method for matching customer service in a customer service upgrade scenario according to claim 1, wherein the customer service label of the customer service is determined as follows:
acquiring historical service data of the customer service;
analyzing service characteristics of the customer service based on the historical service data;
determining a customer service label for the customer service based on the service characteristics;
wherein the service features include at least one of: service topic, service object, rate of service success, service score.
13. The method for matching customer service in a customer service upgrade scenario according to claim 1, wherein after the step of parsing the customer characteristic information of the customer from the service interaction data is executed, and before the step of matching the interaction characteristic with a customer service label of a customer service to obtain a second customer service whose customer service label matches the interaction characteristic is executed, the following operations are executed:
analyzing the service interaction data to obtain a service interaction subdata set;
extracting key service interaction subdata from the service interaction subdata set;
determining a service interaction theme according to the extracted key service interaction subdata;
correspondingly, after the step of matching the interactive feature with the customer service label of the customer service to obtain a second customer service whose customer service label matches the interactive feature is executed, the following steps are executed:
judging whether the label of the second customer service is matched with the service interaction theme or not, if so, calling the second customer service to perform interaction service for the customer aiming at the problem;
and if not, returning to execute the step of matching the interactive feature with the customer service label of the customer service to obtain a second customer service whose customer service label matches the interactive feature.
14. The method of claim 13, wherein before the step of judging whether the label of the second customer service matches the service interaction theme, the following operations are performed:
acquiring an interaction problem set corresponding to the service interaction theme;
according to the service interaction data, determining an interaction problem corresponding to the service interaction data from the interaction problem set;
judging whether the first customer service has already solved the interaction problem of the customer, and if not, executing the following operations:
judging whether the label of the second customer service matches the interaction problem, and if not, returning to execute the step of matching the interaction feature with the customer service label of the customer service to obtain a second customer service whose customer service label matches the interaction feature;
and if the label of the second customer service matches the interaction problem, executing the step of judging whether the label of the second customer service matches the service interaction theme.
15. A device for matching customer service in a customer service upgrade scenario, characterized by comprising:
the service interaction data acquisition unit is used for acquiring service interaction data generated by service interaction between a customer and a first customer service;
the customer characteristic information analyzing unit is used for analyzing the customer characteristic information of the customer from the service interaction data;
the interactive characteristic evaluation unit is used for evaluating the interactive characteristics adapted to the customer according to the customer characteristic information;
and the second customer service matching unit is used for matching the interactive features with the customer service labels of the customer services to obtain a second customer service whose customer service label matches the interactive features.
16. The apparatus for matching customer service in the customer service upgrade scenario of claim 15, wherein the customer characteristic information parsing unit comprises:
the first customer characteristic information analysis subunit is used for analyzing the audio characteristic information of the customer contained in the voice type service interaction data as the customer characteristic information;
the interactive feature evaluation unit includes:
the first emotion characteristic value acquisition subunit is used for calculating the emotion characteristic value of the client according to the audio characteristic information;
and the first interactive characteristic determining subunit is used for comparing the emotion characteristic value of the client with a threshold interval of emotion characteristics, and taking the emotion characteristics corresponding to the threshold interval obtained by comparison as the interactive characteristics adapted to the client.
17. The apparatus for matching customer service in the customer service upgrade scenario of claim 16, wherein the customer characteristic information parsing unit comprises:
the first word segmentation subunit is used for performing word segmentation on the service interaction data of the text type to obtain a service interaction sub-data set of the text type as the customer characteristic information;
the interactive feature evaluation unit includes:
a first semantic analysis subunit, configured to perform semantic analysis on the service interaction sub data included in the service interaction sub data set of the text type;
and the second interactive characteristic determining subunit is used for determining a logic capacity characteristic value and/or a communication expression characteristic value according to a semantic analysis result, and the logic capacity characteristic value and/or the communication expression characteristic value are used as the interactive characteristics adapted to the client.
18. An electronic device, comprising:
a memory and a processor;
the memory is configured to store computer-executable instructions, and the processor is configured to execute the computer-executable instructions to perform the following operations:
acquiring service interaction data generated by service interaction between a customer and a first customer service;
analyzing the customer characteristic information of the customer from the service interaction data;
evaluating the customer-adapted interaction characteristics according to the customer characteristic information;
and matching the interactive features with the customer service labels of the customer services to obtain second customer services of which the customer service labels are matched with the interactive features.
CN201810612634.0A 2018-06-14 2018-06-14 Method and device for matching customer service in customer service upgrading scene Pending CN110674385A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810612634.0A CN110674385A (en) 2018-06-14 2018-06-14 Method and device for matching customer service in customer service upgrading scene

Publications (1)

Publication Number Publication Date
CN110674385A true CN110674385A (en) 2020-01-10

Family

ID=69065873

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810612634.0A Pending CN110674385A (en) 2018-06-14 2018-06-14 Method and device for matching customer service in customer service upgrading scene

Country Status (1)

Country Link
CN (1) CN110674385A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105592237A (en) * 2014-10-24 2016-05-18 ***通信集团公司 Method and apparatus for session switching, and intelligent customer service robot
CN105072173A (en) * 2015-08-03 2015-11-18 谌志群 Customer service method and system for automatically switching between automatic customer service and artificial customer service
US20180007102A1 (en) * 2016-07-01 2018-01-04 At&T Intellectual Property I, Lp System and method for transition between customer care resource modes
CN107590159A (en) * 2016-07-08 2018-01-16 阿里巴巴集团控股有限公司 The method and apparatus that robot customer service turns artificial customer service
CN107968897A (en) * 2017-11-03 2018-04-27 平安科技(深圳)有限公司 Customer service session distribution method, electronic device and computer-readable recording medium

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111368052A (en) * 2020-02-28 2020-07-03 重庆百事得大牛机器人有限公司 Legal artificial intelligence consultation system based on semantic recognition
CN111667316A (en) * 2020-06-16 2020-09-15 中国银行股份有限公司 Service processing method, device and system
CN111667316B (en) * 2020-06-16 2023-11-10 中国银行股份有限公司 Service processing method, device and system
CN112053165A (en) * 2020-08-24 2020-12-08 北京达佳互联信息技术有限公司 Information interaction method, device, server and storage medium
CN112561268A (en) * 2020-12-07 2021-03-26 深圳市思为软件技术有限公司 Behavior evaluation method and related equipment
CN112561268B (en) * 2020-12-07 2023-12-15 深圳市思为软件技术有限公司 Behavior evaluation method and related equipment
CN112509713A (en) * 2021-02-04 2021-03-16 阿里健康科技(杭州)有限公司 Network interaction, inquiry interaction and service determination method, device and storage medium
CN113837587A (en) * 2021-09-17 2021-12-24 深圳追一科技有限公司 Customer service quality inspection method and device, computer equipment and storage medium
CN114049216A (en) * 2021-10-13 2022-02-15 北京博瑞彤芸科技股份有限公司 Method and system for matching insurance businessman for user
CN117151727A (en) * 2023-10-30 2023-12-01 南通贝瑞斯曼信息科技有限公司 Customer service intelligent switching method based on user behavior analysis
CN117151727B (en) * 2023-10-30 2024-02-02 南通贝瑞斯曼信息科技有限公司 Customer service intelligent switching method based on user behavior analysis

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20200110)