CN114067489A - Data processing method and device - Google Patents

Data processing method and device

Info

Publication number
CN114067489A
CN114067489A (Application CN202111271822.XA)
Authority
CN
China
Prior art keywords
information
getting
target object
position information
riding
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111271822.XA
Other languages
Chinese (zh)
Inventor
李龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN202111271822.XA priority Critical patent/CN114067489A/en
Publication of CN114067489A publication Critical patent/CN114067489A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07F COIN-FREED OR LIKE APPARATUS
    • G07F7/00 Mechanisms actuated by objects other than coins to free or to actuate vending, hiring, coin or paper currency dispensing or refunding apparatus
    • G07F7/08 Mechanisms actuated by objects other than coins to free or to actuate vending, hiring, coin or paper currency dispensing or refunding apparatus by coded identity card or credit card or other personal identification means
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07B TICKET-ISSUING APPARATUS; FARE-REGISTERING APPARATUS; FRANKING APPARATUS
    • G07B15/00 Arrangements or apparatus for collecting fares, tolls or entrance fees at one or more control points
    • G07B15/02 Arrangements or apparatus for collecting fares, tolls or entrance fees at one or more control points taking into account a variable factor such as distance or time, e.g. for passenger transport, parking systems or car rental systems

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Finance (AREA)
  • Devices For Checking Fares Or Tickets At Control Points (AREA)

Abstract

The embodiment of the application discloses a data processing method and a data processing device, wherein the data processing method comprises the following steps: acquiring the riding card swiping information of a target object; acquiring the auxiliary riding behavior information of the target object; determining the getting-on position information and getting-off position information of the target object based on the auxiliary riding behavior information and/or the riding card swiping information; and supplementing the information of the unsuccessful riding card swiping based on the getting-on position information, the getting-off position information and the riding card swiping information.

Description

Data processing method and device
Technical Field
The application relates to the technical field of public transport payment, in particular to a data processing method and device.
Background
In the process of urban public transport travel (such as buses, subways and the like), it often happens that a passenger forgets to swipe a card when getting on or off the vehicle or when entering or leaving the station, or that, even if the passenger does swipe, the card swiping or payment is unsuccessful because of a problem in the card swiping or payment system. As a result, the full-journey fare may be deducted when the bus fee is settled, or, because the card swiping record is incomplete, the passenger cannot smoothly leave the subway station in time.
At present, when a card is not swiped or the swipe is unsuccessful, tickets can only be replenished, or ticket and card records updated, manually by staff at a ticket kiosk or a related service point. This manual processing is time-consuming and labor-intensive, and passengers cannot smoothly leave the station in time.
Disclosure of Invention
The embodiment of the application provides a data processing method and a data processing device, which are used for solving one or more of the problems in the prior art.
The technical scheme of the application is realized as follows:
the embodiment of the application provides a data processing method, which comprises the following steps:
acquiring riding card swiping information of a target object;
acquiring the auxiliary riding behavior information of the target object;
determining getting-on position information and getting-off position information of the target object based on the riding card swiping information and/or the auxiliary riding behavior information;
and supplementing the information of the unsuccessful riding card swiping based on the getting-on position information, the getting-off position information and the riding card swiping information.
In the above scheme, the obtaining of the auxiliary riding behavior information of the target object includes at least one of the following:
acquiring a first image and a second image of the target object, and identifying the first image and the second image to obtain a riding identification result;
and acquiring the riding communication connection condition of the terminal corresponding to the target object through near field communication.
In the above scheme, acquiring a first image and a second image of the target object, and identifying the first image and the second image to obtain a riding identification result includes:
acquiring a first image of the target object during getting on the train and a second image of the target object during getting off the train, identifying the first image and the second image, and determining a first portrait, a second portrait, a clothing attribute of the first portrait and a clothing attribute of the second portrait; wherein the clothing attributes comprise clothing color, pattern and style.
In the above scheme, determining the getting-on position information and the getting-off position information of the target object based on the auxiliary riding behavior information and/or the riding card swiping information includes:
determining the getting-on position information and the getting-off position information of the target object according to the riding card swiping information;
and if the target object fails to be swiped, determining the getting-on position information and the getting-off position information of the target object according to the auxiliary riding behavior information.
In the above scheme, if the card swiping failure occurs in the target object, determining the getting-on position information of the target object according to the auxiliary riding behavior information includes at least one of the following:
when the first image identification succeeds in identifying the first portrait and the clothing attribute of the first portrait, acquiring the getting-on position information of the target object through a positioning system;
and acquiring the getting-on position information of the target object through the positioning system under the condition that the riding communication connection condition represents that the riding communication connection is formed with the riding vehicle.
In the above scheme, when the card swiping failure occurs in the target object, determining getting-off position information of the target object according to the auxiliary riding behavior information includes at least one of the following:
acquiring, through the positioning system, the getting-off position information of the target object under the condition that the second image identification succeeds and the second portrait and its clothing attributes are consistent with the first portrait and its clothing attributes;
and acquiring, through the positioning system, the getting-off position information of the target object under the condition that the riding communication connection condition indicates that the communication connection with the occupied vehicle has been disconnected.
In the above scheme, supplementing the information of the unsuccessful riding card swiping based on the getting-on position information, the getting-off position information and the riding card swiping information comprises:
determining station information of the target object when the target object takes the bus according to the getting-on position information and the getting-off position information of the target object;
determining the trading resources of the target object according to the site information and the preset site unit trading information;
and based on the transaction resources, automatically carrying out bus transaction from the bound account and supplementing the information of unsuccessful bus card swiping by judging the bus card swiping information condition.
In the above solution, when image recognition in the auxiliary riding behavior information is used for the riding transaction, before automatically performing the riding transaction from the bound account based on the transaction resource and supplementing the information of the unsuccessful riding card swiping by judging the riding card swiping information, the method further includes:
and under the condition that the first portrait and its clothing attributes are consistent with the second portrait and its clothing attributes, comparing at least one of the first portrait and the second portrait with the information in an existing passenger registration database to determine the passenger's bound account.
The embodiment of the application provides a data processing device, wherein,
the short-range communication module is used for acquiring the card swiping information of the riding target object;
the interface is used for acquiring the auxiliary riding behavior information of the target object;
the positioning module is used for determining the getting-on position information and the getting-off position information of the target object based on the auxiliary riding behavior information and the riding card swiping information;
and the automatic transaction module is used for supplementing the information of the unsuccessful riding card swiping based on the getting-on position information, the getting-off position information and the riding card swiping information.
An embodiment of the present application provides a data processing apparatus, including:
a memory for storing executable instructions;
a processor for executing executable instructions stored in the memory, the processor performing the data processing method when the executable instructions are executed.
The embodiment of the invention provides a storage medium, wherein the storage medium stores executable instructions, and when the executable instructions are executed by one or more processors, the processors execute the data processing method.
The embodiment of the application provides a data processing method and device, in which the riding card swiping information of a target object is obtained, and the auxiliary riding behavior information of the target object is obtained; the getting-on position information and getting-off position information of the target object are determined based on the auxiliary riding behavior information and/or the riding card swiping information; and the information of the unsuccessful riding card swiping is supplemented based on the getting-on position information, the getting-off position information and the riding card swiping information. By adopting this scheme, when passengers travel by public transport, payment can be completed automatically in the case of an unsuccessful card swipe, without manually performing a ticket-replenishing payment operation, improving the efficiency of riding payment.
Drawings
Fig. 1 is a first schematic flow chart of a data processing method according to an embodiment of the present application;
fig. 2 is a schematic flow chart of a data processing method according to an embodiment of the present application;
fig. 3 is a schematic flow chart of a data processing method according to an embodiment of the present application;
fig. 4 is a first schematic structural diagram of a data processing apparatus according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a data processing apparatus according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all the embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application without any creative effort belong to the protection scope of the present application.
In order that those skilled in the art will better understand the disclosure, the following detailed description will be given with reference to the accompanying drawings. Fig. 1 is a first schematic flow chart of a data processing method according to an embodiment of the present application, and will be described with reference to the steps shown in fig. 1.
S101, obtaining riding card swiping information of a target object.
In the embodiment of the application, when a passenger takes a vehicle, a card swiping operation is normally required to get on or off the vehicle, so the passenger's getting on or off can be marked by acquiring the passenger's card swiping information.
In the embodiment of the application, the riding card swiping includes, but is not limited to, interactive operation with the card reader of a vehicle or a transportation station through a bus card, a virtual card, or a dedicated APP in a mobile terminal.
And S102, acquiring the auxiliary riding behavior information of the target object.
In the embodiment of the application, after the passenger card swiping information is obtained, the passenger auxiliary riding behavior information can be further obtained. The auxiliary riding behavior information can mark the behavior of passengers getting on or off the bus under the conditions that the passengers do not carry cards, unsuccessfully swipe cards or do not carry mobile terminals.
It can be understood that, in the embodiment of the application, by acquiring the auxiliary behavior information of the passenger, payment can still be completed even if the passenger does not carry a card, the card swiping is unsuccessful, or the passenger does not carry a mobile terminal, improving the efficiency of riding payment.
S103, determining getting-on position information and getting-off position information of the target object based on the auxiliary riding behavior information and/or the riding card swiping information.
In the embodiment of the application, whether a passenger gets on or off the vehicle, or enters or leaves the station, is determined through the passenger's riding card swiping information. When the passenger gets on or enters the station, the getting-on position information of the passenger is obtained; when the passenger gets off or leaves the station, the getting-off position information is obtained. If at least one of the getting-on and getting-off position information is not successfully obtained by card swiping, the passenger's position is obtained using the auxiliary riding behavior information.
It can be understood that, in the embodiment of the application, the getting-on position information and getting-off position information of the passenger are acquired through the auxiliary riding behavior information and the riding card swiping information. Even if the passenger does not carry a card, the card swiping is unsuccessful, or the passenger does not carry a mobile terminal, the position information can still be obtained through the auxiliary riding behavior information so that payment can proceed, improving the efficiency of riding payment.
And S104, supplementing the information of the unsuccessful riding card swiping based on the getting-on position information, the getting-off position information and the riding card swiping information.
In the embodiment of the application, after the getting-on position information and getting-off position information of the passenger are obtained from the card swiping information and the auxiliary riding behavior information, payment is carried out according to the position information. When a passenger rides by swiping a card and the swipe is unsuccessful, the passenger's position is acquired in the auxiliary manner so that payment can be made; after the payment succeeds, the incomplete card swiping record is completed, ensuring that the passenger can exit smoothly and quickly without any operation.
It can be understood that, in the embodiment of the application, automatic payment is performed through the acquired position information of the passenger. When it is found that the passenger did not swipe the card, or that the card swiping record is incomplete, the incomplete record is completed after successful payment, ensuring that the passenger can exit smoothly and quickly without any operation.
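The overall flow of S101–S104 can be sketched as a minimal pipeline. Everything below (the RideRecord structure, both helper functions, and the station names) is a hypothetical illustration; the patent does not prescribe any particular data layout:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class RideRecord:
    """Hypothetical record combining card-swipe and auxiliary info for one trip."""
    swipe_on: Optional[str]   # station where an on-board swipe succeeded, if any
    swipe_off: Optional[str]  # station where an off-board swipe succeeded, if any
    aux_on: Optional[str]     # station inferred from auxiliary riding behavior info
    aux_off: Optional[str]

def determine_positions(rec: RideRecord) -> tuple:
    """S103 sketch: prefer swipe data; fall back to auxiliary behavior info."""
    on = rec.swipe_on if rec.swipe_on is not None else rec.aux_on
    off = rec.swipe_off if rec.swipe_off is not None else rec.aux_off
    return on, off

def needs_supplement(rec: RideRecord) -> bool:
    """S104 sketch: the record is incomplete if either swipe is missing."""
    return rec.swipe_on is None or rec.swipe_off is None

# Example: the on-board swipe succeeded but the off-board swipe was missed.
rec = RideRecord(swipe_on="Station A", swipe_off=None,
                 aux_on="Station A", aux_off="Station F")
print(determine_positions(rec))  # ('Station A', 'Station F')
print(needs_supplement(rec))     # True
```

The fallback order mirrors S103: card swiping information is authoritative where present, and the auxiliary riding behavior information fills the gaps.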
In some embodiments, S102 may include at least one of S1021 and S1022, which will be described in connection with the steps.
And S1021, acquiring a first image and a second image of the target object, and identifying the first image and the second image to obtain a riding identification result.
In some embodiments of the present application, image recognition may be used as an auxiliary condition to acquire the passenger riding behavior information. When this method is used, specific information about passengers does not need to be known in advance; the passengers only need to be photographed when getting on (or entering the station) and when getting off (or leaving the station), the photos are recognized and marked, and the recognized images are then compared and subjected to subsequent processing.
In some embodiments of the present application, the image recognition method may be implemented by a support vector machine, elastic graph matching, geometric features, and a neural network, which is not limited in the present application.
And S1022, acquiring the riding communication connection condition of the terminal corresponding to the target object through near field communication.
In some embodiments of the present application, this approach is suitable for scenes in which the passenger's getting-on/getting-off position is determined through the riding communication connection condition of the passenger terminal.
In some embodiments of the present application, near field communication technologies such as Wi-Fi, hotspot and Bluetooth may be utilized to acquire the riding communication connection condition of the passenger terminal as auxiliary information for acquiring the riding behavior information of passengers.
Illustratively, if the mobile terminal of passenger A is successfully paired with the Bluetooth module on the bus, or connected to the on-board Wi-Fi, this indicates that passenger A has got on the bus and remains on it. When the mobile terminal of passenger A is disconnected from the Bluetooth module or the Wi-Fi on the bus, this indicates that passenger A has got off. The riding communication connection condition of the passenger terminal can therefore be obtained through near field communication as an auxiliary condition for obtaining the riding behavior information of the passenger.
It can be understood that in some embodiments of the present application, the riding behavior information of the passenger is obtained through auxiliary means such as image recognition and near field communication. Even if the passenger does not carry a card, the behavior information can still be obtained through these auxiliary means; in particular, the image recognition approach covers the case where the passenger carries neither a card nor a mobile terminal. The behavior information of the passenger can thus be obtained under various conditions, improving the flexibility and efficiency of payment.
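The near field communication approach above can be sketched as inferring the boarding and alighting times from the terminal's connect/disconnect events. The event representation and function name are hypothetical illustrations:

```python
def infer_ride_interval(events):
    """Given a time-ordered list of (timestamp, event) pairs for one
    passenger terminal, where event is 'connected' or 'disconnected',
    return (boarding_time, alighting_time) inferred from the first
    connection and the following disconnection, or None if the ride
    is still incomplete."""
    board, alight = None, None
    for ts, ev in events:
        if ev == "connected" and board is None:
            board = ts          # first pairing = passenger got on
        elif ev == "disconnected" and board is not None:
            alight = ts         # link dropped = passenger got off
            break
    if board is not None and alight is not None:
        return (board, alight)
    return None

events = [(100, "connected"), (940, "disconnected")]
print(infer_ride_interval(events))  # (100, 940)
```

A connect event with no later disconnect yields None, modelling a passenger still on board.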
In some embodiments of the present application, S1021 may be implemented through S201, which is described in detail with reference to the following steps.
S201, acquiring a first image of a target object during getting on and a second image of the target object during getting off, identifying the first image and the second image, and determining a first portrait, a second portrait, a clothing attribute of the first portrait and a clothing attribute of the second portrait; the clothing attributes comprise clothing color, pattern and style.
In some embodiments of the application, the images acquired when getting on and off the vehicle are identified using an image identification method; the two portraits and their clothing, where successfully identified, are obtained; and the two identified portraits are compared to obtain a result before subsequent processing is performed.
In some embodiments of the present application, a passenger is photographed when getting on the vehicle to obtain a first image, the first image is identified by an image identification algorithm to obtain a first portrait, and the portrait is stored in a database. When the passenger gets off, a second image is captured and identified to obtain a second portrait, and the obtained second portrait is then matched against the passenger portraits stored in the database.
In some embodiments of the present application, the image recognition method may be implemented by a support vector machine, elastic graph matching, geometric features, and a neural network, which is not limited in the present application.
It can be understood that in some embodiments of the present application, an image of a passenger getting on a vehicle is acquired, identified and stored in a database; an image of the passenger getting off is then acquired, identified, and matched against the stored portrait and its clothing attributes. By adding clothing matching to this identification method, the matching efficiency and success rate of the portrait are improved.
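The combined portrait-plus-clothing check can be sketched as follows. The patent names several possible recognizers (support vector machines, elastic graph matching, geometric features, neural networks) but fixes none, so the cosine-similarity face comparison, the feature-vector format, and the 0.8 threshold here are all assumptions for illustration:

```python
def match_passenger(first, second, sim_threshold=0.8):
    """Hypothetical matcher: `first`/`second` are dicts holding a face
    feature vector and clothing attributes (color, pattern, style).
    Clothing attributes must match exactly AND face similarity must
    reach the threshold, mirroring the combined check described above."""
    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = sum(x * x for x in a) ** 0.5
        nb = sum(x * x for x in b) ** 0.5
        return dot / (na * nb)

    clothing_ok = all(first["clothing"][k] == second["clothing"][k]
                      for k in ("color", "pattern", "style"))
    return clothing_ok and cosine(first["face"], second["face"]) >= sim_threshold

# Boarding portrait vs. alighting portrait of (presumably) the same passenger.
a = {"face": [0.9, 0.1, 0.3],
     "clothing": {"color": "blue", "pattern": "plain", "style": "coat"}}
b = {"face": [0.88, 0.12, 0.31],
     "clothing": {"color": "blue", "pattern": "plain", "style": "coat"}}
print(match_passenger(a, b))  # True
```

Requiring both signals to agree is what the text credits with raising the matching success rate over face comparison alone.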
In some embodiments, S103 may be implemented by S1031 and S1032, which will be described with reference to each step.
And S1031, determining getting-on position information and getting-off position information of the target object according to the riding card swiping information.
In some embodiments of the present application, when the passenger swipes the card and the swipe is shown to be successful, the getting-on position and getting-off position of the passenger are obtained directly through the station information of the bus system or an existing positioning system, such as GPS or BeiDou.
In some embodiments of the present application, when the passenger swipes the card successfully, the card swiping data of the passenger (covering both subway and bus) is obtained. Because the transit card system assigns the card holder a fixed card number, the boarding point and boarding time of each trip are recorded. Each travel record comprises the traveler code, the departure place, the destination, and the detailed travel time between the two points; this data can be acquired and analyzed by the bus system to obtain the getting-on position and getting-off position of the passenger.
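Extracting the two positions from successful swipe records might look like the sketch below. The CSV layout (card number, event, station, time) is a hypothetical stand-in; the actual record format of a transit card system is not specified in the text:

```python
import csv
import io

# Hypothetical swipe log: one row per successful card interaction.
raw = """card_no,event,station,time
6222001,on,Station A,08:01
6222001,off,Station F,08:34
"""

def positions_from_swipes(text, card_no):
    """Return (getting_on_station, getting_off_station) for one card,
    or None in either slot if the corresponding swipe is missing."""
    on = off = None
    for row in csv.DictReader(io.StringIO(text)):
        if row["card_no"] != card_no:
            continue
        if row["event"] == "on":
            on = row["station"]
        elif row["event"] == "off":
            off = row["station"]
    return on, off

print(positions_from_swipes(raw, "6222001"))  # ('Station A', 'Station F')
```

A missing slot in the returned pair is exactly the condition under which S1032 falls back to the auxiliary riding behavior information.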
And S1032, if the card swiping failure occurs in the target object, determining the getting-on position information and the getting-off position information of the target object through the auxiliary riding behavior information.
In some embodiments of the application, if the card swiping is unsuccessful or a passenger forgets to take the card while traveling, the getting-on position information and the getting-off position information of the passenger can be acquired through the auxiliary riding behavior information.
It can be understood that in some embodiments of the present application, the passenger obtains the passenger position information through an auxiliary method in the case that the card swiping is unsuccessful or the passenger forgets to take the card, so that the flexibility and the efficiency of the payment are improved.
In some embodiments, determining the passenger boarding location information in S1032 may be performed by at least one of S301 and S302, which will be described in conjunction with the respective steps.
S301, when the first image is successfully identified, getting-on position information of the target object is obtained through the positioning system.
In some embodiments of the present application, the passenger is photographed when boarding the vehicle and the acquired image is identified. When the portrait and/or the clothing of the portrait are successfully identified, an existing positioning system on the vehicle, such as GPS or BeiDou, can be used to obtain the getting-on position and getting-off position of the passenger. For example, the captured first image carries the position information provided by the positioning system.
Illustratively, when a passenger takes a bus, an image of the passenger is captured and identified; once the system acquires the face image, this proves that the passenger is on the bus, and the getting-on position of the passenger is obtained through the existing positioning system on the bus. When a passenger takes the subway, the passenger's face is photographed and recognized on arrival; when the recognition succeeds, the gate at the station can open automatically to let the passenger enter and ride. Therefore, when the photographed image is successfully identified as a portrait, this shows that the passenger has entered the station, and the entry position information of the passenger is obtained through the positioning system.
In some embodiments of the application, the opening and closing times of the bus doors can be acquired in real time. Once a bus has left the depot, its doors are usually opened only at fixed stops, so the stop corresponding to each door-opening time can be determined from the collected door-opening times. When the passenger's image is acquired after boarding and the portrait is successfully identified, the door opening and closing time at that moment is obtained, and the getting-on position information of the passenger is derived by combining it with the bus route information.
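The door-time-to-stop mapping above can be sketched as a lookup: find the most recent door opening at or before the moment the boarding image was captured, and read off the corresponding stop on the route. The time units and station names are illustrative assumptions:

```python
import bisect

def station_at(capture_time, door_open_times, route_stations):
    """Map the time a boarding image was captured to the most recent
    door opening, and hence to a stop on the route. door_open_times
    must be sorted and aligned index-for-index with route_stations."""
    i = bisect.bisect_right(door_open_times, capture_time) - 1
    if i < 0:
        return None  # image captured before the first stop
    return route_stations[i]

doors = [480, 492, 505]  # door-opening times in minutes since midnight (hypothetical)
stops = ["Station A", "Station B", "Station C"]
print(station_at(494, doors, stops))  # 'Station B'
```

An image timestamped at 494 falls after the 492 door opening and before the 505 one, so the passenger is inferred to have boarded at the second stop.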
And S302, acquiring getting-on position information of the target object through the positioning system under the condition that the riding communication connection condition represents that the communication connection is formed with the riding vehicle.
In some embodiments of the application, when a user takes a bus, the bus system acquires the getting-on position information of the passenger according to the connection condition between the passenger's mobile terminal and the occupied vehicle, through Bluetooth, Wi-Fi, NFC or similar means. In one embodiment, because the time from the vehicle stopping to starting again may be short, timely positioning before the vehicle starts cannot be guaranteed, and because Bluetooth and NFC only disconnect beyond a certain distance, the passenger's situation determined at the moment of disconnection may be inaccurate. Therefore, the position information of the passenger can be obtained in real time by a positioning system in the vehicle, or a threshold value can be set so that the position is obtained through positioning when the vehicle stops or its speed falls below the threshold.
In some embodiments of the application, the bus system acquires the getting-on position information of the passenger according to the Wi-Fi connection condition between the passenger's mobile terminal and the occupied vehicle. The door opening and closing times of the bus are collected and the MAC address of the passenger's mobile phone is scanned; because a bus, once it has left the depot, opens its doors only at fixed stops, the stop corresponding to each door-opening time can be obtained. The scanning data of the Wi-Fi equipment is sent to a data center server, which judges whether the scanned MAC address of the passenger's phone is being scanned for the first time; by combining the stop at which the passenger's Wi-Fi address was first scanned with the bus route information, the getting-on position information of the passenger is obtained.
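The first-scan rule above reduces to: the boarding stop is the stop at which the terminal's MAC address first appears in the scan data. The scan representation and MAC values below are hypothetical:

```python
def boarding_station_from_scans(scans, mac):
    """scans: time-ordered list of (station, set_of_mac_addresses_seen)
    pairs, one per door opening. Return the station where `mac` is
    first seen, i.e. the inferred boarding stop, or None if never seen."""
    for station, macs in scans:
        if mac in macs:
            return station
    return None

scans = [("Station A", {"aa:bb"}),
         ("Station B", {"aa:bb", "cc:dd"}),
         ("Station C", {"cc:dd"})]
print(boarding_station_from_scans(scans, "cc:dd"))  # 'Station B'
```

The symmetric rule for alighting (last stop at which the address is still seen) corresponds to the Wi-Fi disconnection case described under S402.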
In some embodiments, the determination of the passenger alighting position information in S1032 may be achieved by at least one of S401 and S402, which will be described in conjunction with the respective steps.
S401, under the condition that the second image identification succeeds in clothing attributes of the second portrait and is consistent with the clothing attributes of the first portrait and the first portrait, getting-off position information of the target object is obtained through a positioning system.
In some embodiments of the present application, when a passenger gets off or leaves the station, the passenger is photographed and the acquired image is identified. When the image successfully yields a portrait and the clothing of the portrait, the portrait is compared with the existing database, and when the similarity exceeds a preset threshold, the getting-off position information of the passenger is acquired through positioning. In this way, the getting-off position information of the passenger can be obtained, and the getting-on and getting-off position information for one trip is finally determined through comparison with the portrait stored in the database.
In some embodiments of the application, the opening and closing times of the bus doors are collected in real time; the image obtained when the passenger gets off is identified and successfully matched with the image taken when the passenger got on; the door opening and closing time at the moment the image was acquired is obtained; and the getting-off position information of the passenger is derived by combining it with the bus route information.
S402, acquiring the getting-off position information of the target object through a positioning system under the condition that the getting-off communication connection condition represents that the communication with the occupied vehicle is disconnected.
In some embodiments of the present application, when the Bluetooth, Wi-Fi or NFC connection between the passenger's mobile terminal and the vehicle being ridden is disconnected, the passenger's location information is acquired in real time using a positioning system in the vehicle; or, as in S302, a threshold is set, and the passenger's location is acquired by positioning when the vehicle is stopped or the vehicle speed falls below the threshold.
In some embodiments of the application, in a near-field communication mode, for example after a passenger's mobile terminal is disconnected over Wi-Fi from the vehicle being ridden, the door opening and closing time at which the mobile phone's Wi-Fi was disconnected is collected and, in combination with the bus route information, the bus stop at which the passenger's Wi-Fi address was disconnected is obtained, yielding the getting-off position information of the passenger.
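The two near-field cases — first scan marks boarding (S401 side) and disappearance from a scan marks alighting (S402 side) — can be combined into a small per-device state tracker. This is a sketch, not the patent's implementation; the class name and record shapes are hypothetical.

```python
class RideTracker:
    """Track per-device near-field connection state: the first time a MAC
    address is seen it is recorded as boarding at the current stop; when it
    disappears from a scan it is recorded as alighting at that stop."""

    def __init__(self):
        self.onboard = {}  # mac -> boarding stop
        self.trips = []    # completed (mac, board_stop, alight_stop)

    def scan(self, macs_seen, current_stop):
        for mac in macs_seen:
            if mac not in self.onboard:       # first scan => boarding
                self.onboard[mac] = current_stop
        for mac in list(self.onboard):
            if mac not in macs_seen:          # disconnected => alighting
                self.trips.append((mac, self.onboard.pop(mac), current_stop))
```

One scan is taken per stop (at door opening), so a device present at stop S1 and absent at stop S3 yields the trip (S1, S3).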
In some embodiments, referring to fig. 2, fig. 2 is a schematic flow chart of a data processing method provided in the embodiments of the present application, and S104 shown in fig. 2 may be implemented through S1041 to S1043, which will be described with reference to each step.
And S1041, determining station information of the ride of the target object according to the getting-on position information and the getting-off position information of the target object.
In some embodiments of the application, the getting-on position information and the getting-off position information of the passenger are obtained from the passenger's card-swiping information and auxiliary riding behavior information, and the total number of stations travelled by the passenger on the trip, together with the preset per-station unit transaction information, is obtained from the getting-on and getting-off positions.
S1042, determining the transaction resources of the target object through the station information and the preset per-station unit transaction information;
in some embodiments of the application, after the number of stations travelled by the passenger and the preset per-station unit transaction information are obtained, the amount the passenger should pay is determined for payment.
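The per-station fare computation of S1041–S1042 can be sketched as follows; the linear per-interval pricing and fare-in-cents representation are illustrative assumptions, and the function name is hypothetical.

```python
def fare(route_stops, board_stop, alight_stop, unit_fare_cents):
    """Amount due = (number of station intervals travelled) multiplied by
    the preset per-station unit fare. `route_stops` is the ordered list of
    stops on the route."""
    n = abs(route_stops.index(alight_stop) - route_stops.index(board_stop))
    return n * unit_fare_cents
```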
And S1043, automatically carrying out the riding transaction from the bound account based on the transaction resources, and supplementing the information of the unsuccessful riding card swipe by judging the riding card-swiping information condition.
In some embodiments of the present application, after the amount that the passenger should pay is obtained, the system automatically deducts the corresponding amount from the account bound by the passenger. If the passenger's card-swiping record is incomplete, the missing swipe records are supplemented after the payment succeeds, so that the passenger can leave the station smoothly and quickly without any manual operation, and a full-fare deduction at fare settlement due to an incomplete swipe record is avoided.
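The deduct-then-backfill behaviour of S1043 can be sketched as follows. The dict-based account and swipe-log shapes, and the assumption of two expected swipes per trip (boarding and alighting), are illustrative, not from the patent.

```python
def settle(account, amount_due, swipe_log, expected_swipes=2):
    """Deduct the computed fare from the bound account and, if the trip's
    swipe log is incomplete (e.g. only a boarding swipe), append
    supplemented records so settlement does not later fall back to a
    full-fare deduction."""
    account["balance"] -= amount_due              # automatic deduction
    missing = expected_swipes - len(swipe_log)
    for _ in range(max(0, missing)):
        swipe_log.append({"supplemented": True})  # back-fill missing swipes
    return account, swipe_log
```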
In some embodiments of the present application, before S1042, S501 may also be implemented, which will be described with reference to the steps.
S501, in a case where the first portrait and the clothing attributes of the first portrait are consistent with the second portrait and the clothing attributes of the second portrait, comparing at least one of the first portrait and the second portrait with the information in the existing passenger registration database to determine the passenger's bound account.
In some embodiments of the present application, after the amount due from the passenger is calculated, the passenger's bound account is determined so that it can be charged.
In some embodiments of the present application, passengers first need to register before using the system, and all registered passenger information is stored in a database. After the boarding portrait and the alighting portrait of a passenger have been matched using the portrait storage database, the account bound at registration is obtained by comparing either of the two portraits with the information in the existing passenger registration database, and the system then deducts the corresponding amount directly from that account.
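The registry lookup of S501 can be sketched as follows; the registry record shape is a hypothetical assumption, and the similarity function is passed in so that any portrait-comparison method (such as the threshold comparison described earlier) can be plugged in.

```python
def bound_account(portrait_feat, registry, similarity, threshold=0.8):
    """Compare a boarding or alighting portrait against the passenger
    registration database and return the bound payment account of the best
    match whose similarity reaches the threshold, or None if no registered
    passenger matches."""
    best_acct, best_sim = None, threshold
    for rec in registry:
        sim = similarity(portrait_feat, rec["portrait"])
        if sim >= best_sim:
            best_acct, best_sim = rec["account"], sim
    return best_acct
```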
It can be understood that when a passenger does not swipe a card, or does not swipe it successfully, during a ride, current methods can only replenish tickets or update ticket records manually at ticket kiosks or through staff, which requires manual processing, wastes time and labor, and prevents the passenger from leaving the station promptly. According to the method and the device of the present application, the getting-on position information and the getting-off position information of the passenger are obtained by acquiring the passenger's card-swiping information and auxiliary riding behavior information. Even when the passenger does not carry a card, fails to swipe successfully, or does not carry a mobile terminal, the passenger's position information can be acquired through the auxiliary riding behavior information, and the card swipe is supplemented automatically or the system completes automatic fare deduction, improving both riding convenience and payment efficiency.
Fig. 3 shows an alternative flowchart of the data processing method provided in an embodiment of the present application.
And S1, determining that a passenger has boarded or entered the station through card swiping, face recognition or near field communication, and recording the position information of the boarding or entry.
In some embodiments of the invention, the fact that a passenger has got on the bus or arrived at the station can be marked by the current card-swiping mode; whether a passenger has boarded or entered the station can be determined through face recognition; or it can be determined by means of near field communication (Bluetooth, Wi-Fi, hotspot, NFC, etc.). For example: the bus is provided with a Bluetooth module, and a user pairs a carried smartphone (or other smart device) with the Bluetooth module when taking the bus, so that each time the user rides, the Bluetooth module automatically discovers and records the ride. In addition, the above ways can be used in combination to determine whether the passenger has boarded or entered the station; when this is determined, the existing positioning system on the vehicle, such as GPS/BeiDou, records the passenger's position.
And S2, determining whether the passenger has alighted or left the station through card swiping, face recognition or near field communication, and recording the position information of the alighting or departure.
In some embodiments of the invention, the passenger's alighting or departure can be marked by the current card-swiping mode; whether a passenger has alighted or left the station can be determined by face recognition; or it can be determined by means of near field communication (Bluetooth, Wi-Fi, hotspot, NFC, etc.). For example: the bus is provided with a Bluetooth module, the user's carried smartphone (or other smart device) is paired with the Bluetooth module on boarding, and the Bluetooth connection is interrupted when the user gets off. In addition, the above ways may be used in combination to determine whether a passenger has alighted or exited. When this is determined, the existing positioning system on the vehicle, such as GPS/BeiDou, records the passenger's position.
And S3, judging whether the passenger's card-swipe count is complete; if it is incomplete, acquiring the boarding or entry information and the alighting or departure information of the passenger, and calculating the fare the passenger should pay for the ride.
In some embodiments of the invention, whether the passenger's card-swipe count is complete is judged; if it is incomplete, the fare the passenger needs to pay for the current ride is calculated from the boarding and alighting positions of the passenger acquired in S1 and S2.
And S4, according to the fare to be paid, the system calls the passenger's bound account to deduct the fee.
In some embodiments of the invention, the system deducts the amount due from the account to which the passenger was bound at registration.
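The S1–S4 flow above can be sketched end to end as follows. This is an illustrative pipeline under stated assumptions: the ride-record and account shapes are hypothetical, two swipes (boarding and alighting) are assumed to constitute a complete record, and linear per-station pricing is assumed as before.

```python
def process_ride(ride, route_stops, unit_fare_cents, accounts):
    """S3-S4 sketch: if the ride's swipe record is complete, nothing is
    owed; otherwise compute the fare from the recorded boarding and
    alighting stops (S1/S2) and deduct it from the bound account (S4).
    Returns the amount deducted."""
    if ride["swipes"] >= 2:            # complete swipe record: nothing to do
        return 0
    n = abs(route_stops.index(ride["alight"]) - route_stops.index(ride["board"]))
    due = n * unit_fare_cents
    accounts[ride["account"]] -= due   # S4: deduct from the bound account
    return due
```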
An embodiment of the present application provides a data processing apparatus. As shown in fig. 4, fig. 4 is a schematic structural diagram of the data processing apparatus provided in the embodiment of the present application, and the apparatus includes: a short-range communication module 401, an interface 402, a positioning module 403, and an automatic transaction module 404; wherein:
the short-range communication module 401 is configured to obtain card swiping information of a target object in a car;
the interface 402 is configured to obtain the riding assisting behavior information of the target object.
The positioning module 403 is configured to determine the getting-on position information and the getting-off position information of the target object based on the auxiliary riding behavior information and the riding card swiping information.
The automatic transaction module 404 is configured to supplement the information that the card swiping for the bus is unsuccessful based on the getting-on position information, the getting-off position information, and the card swiping for the bus.
In some embodiments of the present application, the interface 402 is further configured to acquire a first image and a second image of the target object, and identify the first image and the second image to obtain a riding identification result; and acquiring the riding communication connection condition of the terminal corresponding to the target object through near field communication.
In some embodiments of the present application, the interface 402 is further configured to obtain a first image of the target object when the target object gets on the vehicle and a second image of the target object when the target object gets off the vehicle, identify the first image and the second image, and determine the first portrait, the second portrait, a clothing attribute of the first portrait and a clothing attribute of the second portrait; wherein the clothing attributes comprise clothing color, pattern and model.
In some embodiments of the present application, the positioning module 403 is further configured to determine getting-on position information and getting-off position information of the target object according to the riding card swiping information; and if the target object fails to be swiped, determining the getting-on position information and the getting-off position information of the target object through the auxiliary riding behavior information.
In some embodiments of the present application, the positioning module 403 is further configured to: obtain the getting-on position information of the target object through a positioning system when the first image is successfully identified to obtain the first portrait and the clothing attributes of the first portrait; and obtain the getting-on position information of the target object through the positioning system in a case where the riding communication connection condition indicates that a communication connection has been established with the vehicle being ridden.
In some embodiments of the present application, the positioning module 403 is further configured to: in a case where the second image is successfully identified to obtain a second portrait and clothing attributes of the second portrait that are consistent with the first portrait and the clothing attributes of the first portrait, obtain the getting-off position information of the target object through the positioning system; and obtain the getting-off position information of the target object through the positioning system in a case where the getting-off communication connection condition indicates that communication with the vehicle being ridden has been disconnected.
In some embodiments of the present application, the automatic transaction module 404 is further configured to: determine, according to the getting-on position information and the getting-off position information of the target object, the station information of the target object's ride; determine the transaction resources of the target object through the station information and the preset per-station unit transaction information; and automatically carry out the riding transaction from the bound account based on the transaction resources, supplementing the information of the unsuccessful riding card swipe by judging the riding card-swiping information condition.
In some embodiments of the present application, the automatic transaction module 404 is further configured to compare at least one of the first portrait and the second portrait with information in an existing passenger registration database to determine a passenger bound account when the clothing attributes of the first portrait and the first portrait are consistent with the clothing attributes of the second portrait and the second portrait.
It can be understood that, in the above apparatus implementation, the riding card-swiping information of the target object is first obtained; the auxiliary riding behavior information of the target object is acquired; the getting-on position information and the getting-off position information of the target object are then determined based on the auxiliary riding behavior information and the riding card-swiping information; and finally the information of the unsuccessful riding card swipe is supplemented based on the getting-on position information, the getting-off position information and the riding card-swiping information. Even when the passenger does not carry a card, fails to pay successfully, or does not carry a mobile terminal, the passenger's position information can be obtained through the auxiliary riding behavior information, improving riding payment efficiency.
Based on the method of the foregoing embodiment, an embodiment of the present application provides a schematic structural diagram, as shown in fig. 5, fig. 5 is a schematic structural diagram of a data processing apparatus provided in an embodiment of the present application, where the apparatus includes: a first processor 501 and a first memory 502; the first memory 502 stores one or more programs executable by the first processor 501, and when the one or more programs are executed, the first processor 501 executes a data processing method corresponding to the server of the aforementioned embodiment.
The embodiment of the application provides a computer readable storage medium, which stores one or more programs, wherein the one or more programs can be executed by one or more processors, and when the programs are executed by the processors, the data processing method of the embodiment of the application is realized.

Claims (10)

1. A data processing method, comprising:
acquiring riding card swiping information of a target object;
acquiring the auxiliary riding behavior information of the target object;
determining getting-on position information and getting-off position information of the target object based on the riding card swiping information and/or the auxiliary riding behavior information;
and supplementing the information that the card swiping by the bus is unsuccessful based on the getting-on position information and the getting-off position information and the card swiping by the bus.
2. The method according to claim 1, wherein the obtaining of the riding assistance behavior information of the target object comprises at least one of:
acquiring a first image and a second image of the target object, and identifying the first image and the second image to obtain a riding identification result;
and acquiring the riding communication connection condition of the terminal corresponding to the target object through near field communication.
3. The method according to claim 2, wherein the acquiring a first image and a second image of the target object, and recognizing the first image and the second image to obtain a riding recognition result comprises:
acquiring a first image of the target object when getting on the vehicle and a second image of the target object when getting off the vehicle, identifying the first image and the second image, and determining a first portrait, a second portrait, a clothing attribute of the first portrait and a clothing attribute of the second portrait; wherein the clothing attributes comprise clothing color, pattern and model.
4. The method according to claim 1, wherein the determining getting-on position information and getting-off position information of the target object based on the riding card swiping information and/or the auxiliary riding behavior information comprises:
determining the getting-on position information and the getting-off position information of the target object according to the riding card swiping information;
and if the target object fails to be swiped, determining the getting-on position information and the getting-off position information of the target object according to the auxiliary riding behavior information.
5. The method according to claim 4, wherein if the target object fails to be swiped, determining getting-in position information of the target object through the auxiliary riding behavior information comprises at least one of the following:
when the first image is successfully identified, acquiring the getting-on position information of the target object through a positioning system;
and acquiring the getting-on position information of the target object through the positioning system under the condition that the riding communication connection condition represents that the riding communication connection is formed with the riding vehicle.
6. The method according to claim 4, wherein if the card swiping failure of the target object occurs, determining getting-off position information of the target object through the auxiliary riding behavior information comprises at least one of the following steps:
in a case where the second image is successfully identified and the clothing attributes of the second portrait are consistent with the first portrait and the clothing attributes of the first portrait, acquiring the getting-off position information of the target object through the positioning system;
and acquiring, by the positioning system, the getting-off position information of the target object in a case where the getting-off communication connection condition indicates that communication with the vehicle being ridden has been disconnected.
7. The method of claim 1, wherein supplementing information that a ride was unsuccessful in swiping a card based on the boarding location information and the disembarking location information and the ride swipe information comprises:
determining station information of the target object when the target object takes the bus according to the getting-on position information and the getting-off position information of the target object;
determining the trading resources of the target object according to the site information and the preset site unit trading information;
and based on the transaction resources, automatically carrying out bus transaction from the bound account and supplementing the information of unsuccessful bus card swiping by judging the bus card swiping information condition.
8. The method according to claim 7, wherein when a vehicle taking transaction is performed by using image recognition in the auxiliary vehicle taking behavior information, before automatically performing the vehicle taking transaction from a bound account based on the transaction resource and supplementing information that the vehicle taking card swiping is unsuccessful by judging the vehicle taking card swiping information, the method further comprises:
and under the condition that the clothing attributes of the first portrait and the first portrait are consistent with the clothing attributes of the second portrait and the second portrait, comparing at least one of the first portrait and the second portrait with the information of the existing passenger registration database to determine a passenger binding account.
9. A data processing apparatus, comprising:
the short-range communication module is used for acquiring the card swiping information of the riding target object;
the interface is used for acquiring the auxiliary riding behavior information of the target object;
the positioning module is used for determining the getting-on position information and the getting-off position information of the target object based on the auxiliary riding behavior information and the riding card swiping information;
and the automatic transaction module is used for supplementing the information of unsuccessful card swiping in the bus based on the getting-on position information, the getting-off position information and the card swiping information in the bus.
10. A data processing apparatus, characterized in that the data processing apparatus comprises:
a memory for storing executable instructions;
a processor for implementing the method of any one of claims 1 to 8 when executing executable instructions stored in the memory.
CN202111271822.XA 2021-10-29 2021-10-29 Data processing method and device Pending CN114067489A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111271822.XA CN114067489A (en) 2021-10-29 2021-10-29 Data processing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111271822.XA CN114067489A (en) 2021-10-29 2021-10-29 Data processing method and device

Publications (1)

Publication Number Publication Date
CN114067489A true CN114067489A (en) 2022-02-18

Family

ID=80236094

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111271822.XA Pending CN114067489A (en) 2021-10-29 2021-10-29 Data processing method and device

Country Status (1)

Country Link
CN (1) CN114067489A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111932229A (en) * 2020-03-13 2020-11-13 武汉小码联城科技有限公司 Subway bus code payment data processing method, device and system and electronic equipment


Similar Documents

Publication Publication Date Title
CN108230465B (en) Mobile terminal-based gate quick passing control method
CN107248128B (en) Intelligent ticket checking method and system
WO2020063011A1 (en) Identity recognition system and method, terminal, and computer storage medium
JP5517393B2 (en) Mobile charging system and mobile charging method using mobile charging system
KR101956286B1 (en) Automatic get off payment system of public transport by image recognition and method thereof
CN110796747A (en) Ticket selling and checking system based on face recognition
CN113538714A (en) Parking lot control method, system and computer readable storage medium
CN104992476A (en) Parking charging method based on mobile-phone-shooting license plate identification
CN110428250B (en) Bus non-inductive payment method, device, medium and device based on Bluetooth
CN110766812B (en) Riding payment method and device for unmanned automobile and electronic equipment
CN111369727B (en) Traffic control method and device
WO2018113222A1 (en) Automatic quick payment system and method for automobile
CN104700655A (en) Parking route reminder implementation method and system based on mobile terminal
CN112348974A (en) PRT novel automatic ticket selling and checking method and system
CN106447797A (en) Parking lot intelligent management system and operation method of the same
CN111325559B (en) Payment control method and payment control system applied to buses
CN108734797B (en) Method and device for paying vehicle toll and computer readable storage medium
CN111858806A (en) Passenger travel track detection method, device, equipment and storage medium
CN114067489A (en) Data processing method and device
JP2020126491A (en) Information processing system, program, and vehicle
CN108364174A (en) A kind of elevator charging method and system
CN210166825U (en) Passenger service system of urban rail transit
CN115983863A (en) Urban intelligent passenger transport ticket selling and checking system
CN110766859A (en) Bus taking payment device and method based on face recognition
CN113542364B (en) Bus and network appointment linkage intelligent traffic system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination