WO2018030987A1 - Bus with reservation system and illuminated seating - Google Patents

Bus with reservation system and illuminated seating

Info

Publication number
WO2018030987A1
Authority
WO
WIPO (PCT)
Prior art keywords
bus
seat
image
person
processor
Prior art date
Application number
PCT/US2016/046057
Other languages
French (fr)
Inventor
Paul Kenneth DELLOCK
Pietro Buttolo
Stuart C. SALTER
James J. SURMAN
Annette Lynn HUEBNER
Original Assignee
Ford Global Technologies, Llc
Priority date
Filing date
Publication date
Application filed by Ford Global Technologies, Llc filed Critical Ford Global Technologies, Llc
Priority to PCT/US2016/046057
Publication of WO2018030987A1

Links

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
        • B60: VEHICLES IN GENERAL
            • B60N: SEATS SPECIALLY ADAPTED FOR VEHICLES; VEHICLE PASSENGER ACCOMMODATION NOT OTHERWISE PROVIDED FOR
                • B60N 2/00: Seats specially adapted for vehicles; Arrangement or mounting of seats in vehicles
                    • B60N 2/002: Seats provided with an occupancy detection means mounted therein or thereon
                    • B60N 2/005: Arrangement or mounting of seats in vehicles, e.g. dismountable auxiliary seats
                        • B60N 2/01: Arrangement of seats relative to one another
                • B60N 3/00: Arrangements or adaptations of other passenger fittings, not otherwise provided for
                    • B60N 3/02: of hand grips or straps
            • B60Q: ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
                • B60Q 3/00: Arrangement of lighting devices for vehicle interiors; Lighting devices specially adapted for vehicle interiors
                    • B60Q 3/20: for lighting specific fittings of passenger or driving compartments; mounted on specific fittings of passenger or driving compartments
                        • B60Q 3/233: Seats; Arm rests; Head rests
                    • B60Q 3/40: specially adapted for specific vehicle types
                        • B60Q 3/41: for mass transit vehicles, e.g. buses
                            • B60Q 3/44: Spotlighting, e.g. reading lamps
                    • B60Q 3/60: characterised by optical aspects
                        • B60Q 3/68: using ultraviolet light

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Traffic Control Systems (AREA)

Abstract

A bus includes: motor(s), interior camera(s), seats, each seat associated with an ultraviolet light source and an occupancy sensor, processor(s) configured to: (a) command one of the ultraviolet light sources to illuminate one of the seats; (b) periodically detect occupancy of each seat via the occupancy sensors; (c) activate the interior camera(s) based on a detected occupancy.

Description

BUS WITH RESERVATION SYSTEM AND ILLUMINATED SEATING
TECHNICAL FIELD
[0001] This document relates to monitoring and illumination systems for vehicles, such as buses.
BACKGROUND
[0002] Existing buses fail to guarantee that customers will have a place to sit upon entry. Surveys show that customers are willing to pay a premium to reserve bus seats for a more comfortable and enjoyable ride. As buses become autonomous, the lack of a driver may pose security issues. A new bus is needed to resolve these problems.
SUMMARY
[0003] A bus consistent with the present disclosure includes: motor(s), interior camera(s), seats, each seat associated with an ultraviolet light source and an occupancy sensor, processor(s) configured to: (a) command one of the ultraviolet light sources to illuminate one of the seats; (b) periodically detect occupancy of each seat via the occupancy sensors; (c) activate the interior camera(s) based on the detected occupancy.
[0004] According to various embodiments, the seats are coated with a luminescent paint or dye responsive to ultraviolet light.
[0005] According to various embodiments, the bus includes a plurality of handles, the handles being at least partially constructed from a transparent material coated with a luminescent paint or dye.
[0006] According to various embodiments, the handles house and include ultraviolet light sources configured to activate the luminescent paint or dye.
[0007] According to various embodiments, the processor(s) are configured to: identify a boarding passenger and issue the command to the ultraviolet light source to illuminate the seat based on the passenger's identity.
[0008] According to various embodiments, all of the light produced by the ultraviolet light sources falls outside of the human-visible light spectrum.
[0009] According to various embodiments, the processor(s) are configured to: associate a destination with each occupied seat, activate the interior camera(s) based on determining that the occupied seat has become unoccupied prior to the destination.
[0010] According to various embodiments, activation of the interior camera(s) comprises flagging video recorded by the interior camera(s) and saving the flagged video.
[0011] According to various embodiments, the processor(s) are configured to: deactivate the interior camera(s) based on determining that the unoccupied seat has become occupied.
[0012] According to various embodiments, the processor(s) are configured to: activate the interior camera(s) based on determining that the seat has not become unoccupied after reaching the destination.
[0013] According to various embodiments, the bus includes exterior camera(s) and the processor(s) are configured to: authorize bus access to a person when the exterior camera(s) capture an image of the person's face; deny bus access to the person when the exterior camera(s) fail to capture the image of the person's face.
[0014] According to various embodiments, the processor(s) are configured to: determine whether the exterior camera(s) have captured the image of the person's face by running an image-recognition program, run the captured image through the image-recognition program based on and after receiving a wireless signal from a mobile device associated with the person.
[0015] According to various embodiments, the occupancy sensors are capacitance sensors.
[0016] A method of operating a bus is disclosed. The bus includes processor(s), motor(s), interior camera(s), seats, each seat associated with an ultraviolet light source and an occupancy sensor; the method includes, via the processor(s): (a) commanding one of the ultraviolet light sources to illuminate one of the seats; (b) periodically detecting occupancy of each seat via the occupancy sensors; (c) activating the interior camera(s) based on the detected occupancy.
[0017] According to various embodiments, the seats are coated with a luminescent paint or dye responsive to ultraviolet light and the bus includes a plurality of handles, the handles being at least partially constructed from a transparent material coated with a luminescent paint or dye.
[0018] According to various embodiments, the method includes identifying a boarding passenger and issuing the command to the ultraviolet light source to illuminate the seat based on the passenger's identity.
[0019] According to various embodiments, the method includes associating a destination with each occupied seat, and activating the interior camera(s) based on determining that the occupied seat has become unoccupied prior to the destination.
[0020] According to various embodiments, the method includes deactivating the interior camera(s) based on determining that the unoccupied seat has become occupied.
[0021] According to various embodiments, the method includes authorizing bus access to a person when the exterior camera(s) capture an image of the person's face; denying bus access to the person when the exterior camera(s) fail to capture the image of the person's face.
[0022] According to various embodiments, the method includes determining whether the exterior camera(s) have captured the image of the person's face by running an image-recognition program on the processor(s); running the captured image through the image-recognition program based on and after receiving a wireless signal from a mobile device associated with the person.
[0023] According to various embodiments, the occupancy sensors are capacitance sensors.
BRIEF DESCRIPTION OF THE DRAWINGS
[0024] For a better understanding of the invention, reference may be made to embodiments shown in the following drawings. The components in the drawings are not necessarily to scale and related elements may be omitted, or in some instances proportions may have been exaggerated, so as to emphasize and clearly illustrate the novel features described herein. In addition, system components can be variously arranged, as known in the art. Further, in the drawings, like reference numerals designate corresponding parts throughout the several views.
[0025] Figure 1 is a block diagram of a vehicle computing system.
[0026] Figure 2 is a schematic of a vehicle including the vehicle computing system.
[0027] Figure 3 is a top cross sectional view of a bus.
[0028] Figure 4 is a side view of a bus seat.
[0029] Figure 5 is a perspective view of a bus safety handle.
[0030] Figure 6 is a block diagram of a method of operating the bus.
DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS
[0031] While the invention may be embodied in various forms, there are shown in the drawings, and will hereinafter be described, some exemplary and non-limiting embodiments, with the understanding that the present disclosure is to be considered an exemplification of the invention and is not intended to limit the invention to the specific embodiments illustrated.
[0032] In this application, the use of the disjunctive is intended to include the conjunctive. The use of definite or indefinite articles is not intended to indicate cardinality. In particular, a reference to "the" object or "a" and "an" object is intended to denote also one of a possible plurality of such objects. Further, the conjunction "or" may be used to convey features that are simultaneously present, as one option, and mutually exclusive alternatives as another option. In other words, the conjunction "or" should be understood to include "and/or" as one option and "either/or" as another option.
[0033] Figure 1 shows a computing system 100 of an example vehicle 200. The vehicle 200 includes a motor, a battery, at least one wheel driven by the motor, and a steering system configured to turn the at least one wheel about an axis. Vehicles are also described, for example, in U.S. Patent App. No. 14/991,496 to Miller et al. ("Miller"), U.S. Patent No. 8,180,547 to Prasad et al. ("Prasad"), U.S. Patent App. No. 15/186,850 to Lavoie et al. ("Lavoie") and U.S. Patent App. No. 14/972,761 to Hu et al. ("Hu"), all of which are hereby incorporated by reference in their entireties. According to various embodiments, and as discussed below, the vehicle 200 is a bus 300 configured to carry many passengers (e.g., more than 10 passengers).
[0034] The computing system 100 enables automatic control of mechanical systems within the device. The computing system 100 also enables communication with external devices. The computing system 100 includes a data bus 101, one or more processors 108, volatile memory 107, non-volatile memory 106, user interfaces 105, a telematics unit 104, actuators and motors 103, and local sensors 102.
[0035] The data bus 101 traffics electronic signals or data between the electronic components. The processor 108 performs operations on the electronic signals or data to produce modified electronic signals or data. The volatile memory 107 stores data for immediate recall by the processor 108. The nonvolatile memory 106 stores data for recall to the volatile memory 107 and/or the processor 108. The non-volatile memory 106 includes a range of non-volatile memories including hard drives, SSDs, DVDs, Blu-Rays, etc. The user interface 105 includes displays, touch-screen displays, keyboards, buttons, and other devices that enable user interaction with the computing system. The telematics unit 104 enables both wired and wireless communication with external processors via Bluetooth, cellular data (e.g., 3G, LTE), USB, etc. The telematics unit 104 may be configured to broadcast signals at a certain frequency.
[0036] The actuators/motors 103 produce physical results. Examples of actuators/motors include fuel injectors, windshield wipers, brake light circuits, transmissions, airbags, engines, power train motors, locks, doors, steering, lights (as discussed below), etc. The local sensors 102 transmit digital readings or measurements to the processor 108. Examples of suitable sensors include temperature sensors, rotation sensors, capacitance sensors, seatbelt sensors, speed sensors, cameras, lidar sensors, radar sensors, etc. It should be appreciated that the various connected components of Figure 1 may include separate or dedicated processors and memory. Further detail of the structure and operations of the computing system 100 is described, for example, in Miller and/or Prasad.
[0037] Figure 2 generally shows and illustrates the vehicle 200, which includes the computing system 100. Although not shown, the vehicle 200 is in operative wireless communication with a nomadic device, such as a mobile device or a smartphone. Some of the local sensors 102 are mounted on the exterior of the vehicle 200. Local sensor 102a may be an ultrasonic sensor, a lidar sensor, a camera, a video camera, and/or a microphone, etc. Local sensor 102a may be configured to detect objects leading the vehicle 200. Local sensor 102b may be an ultrasonic sensor, a lidar sensor, a camera, a video camera, and/or a microphone, etc. Local sensor 102b may be configured to detect objects trailing the vehicle 200 as indicated by trailing sensing range 109b. Left sensor 102c and right sensor 102d may be configured to perform the same functions for the left and right sides of the vehicle 200. The vehicle 200 includes a host of other sensors 102 located in the vehicle interior or on the vehicle exterior. These sensors may include any or all of the sensors disclosed in Prasad.
[0038] It should be appreciated that the vehicle 200 is configured to perform the methods and operations described below. In some cases, the vehicle 200 is configured to perform these functions via computer programs stored on the volatile and/or non-volatile memories of the computing system 100. A processor is "configured to" perform a disclosed operation when the processor is in operative communication with memory storing a software program with code or instructions embodying the disclosed operation. Further description of how the processor, memories, and programs cooperate appears in Prasad. It should be appreciated that the nomadic device or an external server in operative communication with the vehicle 200 may perform some or all of the methods and operations discussed below.
[0039] According to various embodiments, the vehicle 200 includes some or all of the features of the vehicle 100a of Prasad. According to various embodiments, the computing system 100 includes some or all of the features of the VCCS 102 of Figure 2 of Prasad. According to various embodiments, the vehicle 200 is in communication with some or all of the devices shown in Figure 1 of Prasad, including the nomadic device 110, the communication tower 116, the telecom network 118, the Internet 120, and the data processing center 122.
[0040] The term "loaded vehicle," when used in the claims, is hereby defined to mean: "a vehicle including: a motor, a plurality of wheels, a power source, and a steering system; wherein the motor transmits torque to at least one of the plurality of wheels, thereby driving the at least one of the plurality of wheels; wherein the power source supplies energy to the motor; and wherein the steering system is configured to steer at least one of the plurality of wheels." The term "equipped electric vehicle," when used in the claims, is hereby defined to mean "a vehicle including: a battery, a plurality of wheels, a motor, a steering system; wherein the motor transmits torque to at least one of the plurality of wheels, thereby driving the at least one of the plurality of wheels; wherein the battery is rechargeable and is configured to supply electric energy to the motor, thereby driving the motor; and wherein the steering system is configured to steer at least one of the plurality of wheels."
[0041] Figure 3 generally shows and illustrates a bus 300 consistent with the present disclosure. The bus 300 is one species of the vehicle 200 and thus may include some or all of the above-described features of the vehicle 200. It should be appreciated that while the disclosure below is described with reference to the bus 300, any of the systems and methods may be applied to any other type of vehicle, such as an airplane or a sedan (e.g., the vehicle 200 pictured in Figure 2).
[0042] The bus 300 includes the vehicle computing system 100. The bus 300 includes an automatic door 301 coupled with one or more door motors; a door camera 302 located outside of the automatic door 301 and configured to capture images of customers located outside of the bus 300; exterior lights (e.g., LEDs) 303a, 303b configured to light up a first color (e.g., green) when the bus 300 authorizes entry to a customer and a second color (e.g., red) when the bus denies entry to a customer; seats 304a, 304b, 304c, 304d; seat lights (e.g., LEDs) 305a, 305b, 305c, 305d configured to light up a corresponding seat 304; seat sensors 306a, 306b, 306c, 306d configured to sense and confirm the presence of a customer; safety handles 307a, 307b; safety handle lights (e.g., LEDs) 305e, 305f configured to light up one or more safety handles 307; first and second interior cameras 308a, 308b configured to capture images of the interior of the bus 300.
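For reference while reading the paragraphs that follow, the fixtures and their reference numerals can be modeled as a small data structure. The sketch below is illustrative only; the class and field names are assumptions and are not part of the disclosure.

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class Fixture:
        """A reservable fixture: a seat 304 or a safety handle 307."""
        fixture_id: str                     # e.g. "304a" or "307b"
        kind: str                           # "seat" or "handle"
        light_id: str                       # seat/handle light 305 aimed at it
        occupancy_sensor_id: str            # seat sensor 306 (load/capacitance)
        reserved_by: Optional[str] = None   # customer identifier, if reserved

    @dataclass
    class BusCabin:
        """Minimal inventory of the cabin hardware enumerated above."""
        fixtures: List[Fixture] = field(default_factory=list)
        door_id: str = "301"
        exterior_camera_id: str = "302"
        exterior_light_ids: tuple = ("303a", "303b")
        interior_camera_ids: tuple = ("308a", "308b")

    cabin = BusCabin(fixtures=[
        Fixture("304a", "seat", "305a", "306a"),
        Fixture("307a", "handle", "305e", "handle-sensor-a"),  # handle sensor id is assumed
    ])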
[0043] As shown in Figure 6, the bus 300 is configured to perform a reservation control method 600. More specifically, and as described above, the bus 300 includes software embodying the below-disclosed operations stored on the memory 106, 107. The processor 108 is configured to execute the software stored on the memory 106, 107 and thereby cause the bus to perform the below-disclosed operations.
[0044] The bus 300, via the reservation control method 600, accomplishes the following objectives: (a) reserves and assigns seats for customers, (b) locates customers while the bus is in motion, and (c) detects when customers unexpectedly remain in and/or leave their assigned seats.
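The flow of the reservation control method 600 can be outlined as a short skeleton keyed to the block numbers of Figure 6 that are discussed below. The helper method names on the bus object are placeholders, not an interface defined by the disclosure.

    def reservation_control_method(bus):
        """Outline of reservation control method 600 (block numbers per Figure 6).

        The bus-object helpers are placeholders; the disclosure names the
        blocks, not a programming interface.
        """
        reservation = bus.receive_confirmation_message()       # block 602
        customer = bus.verify_boarding_identity(reservation)   # block 604
        if customer is None:
            bus.deny_access()                                   # e.g., light 303a/303b red
            return
        bus.authorize_access(customer)                          # block 606
        bus.illuminate_reserved_fixture(customer)               # block 608
        bus.monitor_occupancy_between_stops(customer)           # block 610
        bus.notify_customer_near_destination(customer)          # block 612
        bus.verify_vacated_or_alarm(customer)                   # block 614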
[0045] Prior to entering or boarding the bus 300, customers download a smartphone application ("SA"). The SA displays an internal view of the bus 300 similar to Figure 3. The SA designates bus seats 304 as available or unavailable. For example, the SA may color seats 304 that have already been assigned or reserved in red and color seats that are available for reservation in blue or white. The SA may further shade the available seats according to their price. For example, lower priced seats may be shaded light blue while higher priced seats are shaded dark blue.
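As one possible way to render the seat map described above, availability and price can be mapped to the example colors as follows. The median-price threshold is an assumed heuristic, not something specified in this paragraph.

    def seat_display_color(reserved: bool, price: float, median_price: float) -> str:
        """Color a seat on the smartphone application ("SA") seat map.

        Reserved seats are red; available seats are shaded blue by price,
        following the example colors above. The median-price split is an
        assumed heuristic.
        """
        if reserved:
            return "red"
        return "dark blue" if price > median_price else "light blue"

    # Example: an available seat priced above the median fare
    print(seat_display_color(reserved=False, price=4.00, median_price=3.00))  # dark blue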
[0046] As shown in Figures 3 and 5, the bus may include safety handles 307 in addition to seats. Customers grip the safety handles 307 while riding the bus 300. The safety handles 307, as opposed to seats 304, enable the bus 300 to carry a higher density of customers. The SA may enable reservation and assignment of the safety handles similar to reservation and assignment of the seats. The SA may thus color the safety handles the various shades of blue (or white) as discussed above with reference to the seats.
[0047] After picking a seat 304 or a safety handle 307, the customer confirms payment on the SA. The SA may confirm payment via an external financial server, which processes credit cards. Upon payment confirmation, an external server executes one or more of the following: (a) sends a confirmation code, such as a QR code, to the SA; (b) sends, at block 602 of Figure 6, a confirmation message to the bus 300, the confirmation message including a unique identifier (e.g., MAC ID) of the customer's smartphone that was running the SA when the seat 304 or safety handle 307 was purchased. The confirmation code, such as the QR code, may be rendered and thus displayed on the SA.
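A minimal sketch of the block-602 confirmation message follows, assuming a simple key-value payload. The field names and the eight-character confirmation code are illustrative choices; the paragraph only requires a confirmation code (e.g., a QR code) and a unique smartphone identifier such as a MAC ID.

    import json
    import uuid

    def build_confirmation_message(smartphone_mac: str, fixture_id: str,
                                   destination: str, fare: float) -> dict:
        """Assemble an illustrative block-602 confirmation message for the bus 300."""
        return {
            "confirmation_code": uuid.uuid4().hex[:8],  # also rendered as a QR code in the SA
            "smartphone_id": smartphone_mac,            # unique identifier, e.g. MAC ID
            "fixture_id": fixture_id,                   # reserved seat 304 or handle 307
            "destination": destination,                 # used later at block 612
            "fare": fare,
        }

    message = build_confirmation_message("AA:BB:CC:DD:EE:FF", "304c", "stop 12", 2.50)
    print(json.dumps(message, indent=2))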
[0048] The bus 300 eventually stops at a location to pick up the customer (e.g., a bus stop). At block 604, the customer verifies his or her identity before entering the bus 300 via one or more of the following methods: (a) providing the displayed QR code to a sensor mounted on the bus 300, which scans the QR code and pulls up the customer's reservation; (b) communicating a unique identity of the customer's smartphone (e.g., the MAC ID) via a wireless communication technology such as Bluetooth™, NFC, or RFID; (c) manually entering a confirmation code on a keypad of the bus 300 located near an entrance of the bus 300; (d) facing the exterior camera 302 for a picture.
[0049] The bus 300 is configured to automatically accept and process the above verification methods. If the customer verifies identity via (a), then the bus 300, as described above, identifies and loads the customer's reservation via the QR code. If the QR code results in a loaded reservation, then the bus 300 authorizes access at block 606. If the customer verifies identity via (b), then the bus 300 compares the unique ID of the customer's smartphone to the unique ID contained in the confirmation message sent to the bus. If the IDs match, then the bus authorizes access at block 606. If the customer verifies identity via (c), then the bus compares the confirmation code to the confirmation message. If the code and the message sufficiently match (e.g., both contain the same series of numbers), then the bus 300 authorizes access at block 606. If the customer verifies identity via (d), then the exterior camera 302 of the bus 300 captures an image of the customer's face. The bus 300 compares the captured image to a prestored image. The prestored image may have been previously provided by the customer via the SA. If the images sufficiently match, then the bus 300 authorizes access at block 606. The bus 300 may authorize access at block 606 in any suitable manner, such as unlocking or automatically opening the door 301 and/or lighting exterior lights 303a, 303b a specific color (e.g., green).
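The access decision at block 606 can be sketched as a dispatch over verification methods (a) through (d). The matching rules below paraphrase this paragraph; the face comparison is reduced to a placeholder stand-in for the image-recognition program.

    def authorize_entry(method: str, presented, reservation: dict) -> bool:
        """Block-606 access decision for verification methods (a)-(d).

        `reservation` is the block-602 confirmation message; `presented` is
        whatever the boarding customer supplies (code, MAC ID, or image).
        """
        if method == "qr":           # (a) scanned QR code loads the reservation
            return presented == reservation.get("confirmation_code")
        if method == "wireless":     # (b) smartphone unique ID over Bluetooth/NFC/RFID
            return presented == reservation.get("smartphone_id")
        if method == "keypad":       # (c) manually entered confirmation code
            return presented == reservation.get("confirmation_code")
        if method == "face":         # (d) exterior camera 302 image vs. prestored image
            return images_sufficiently_match(presented, reservation.get("prestored_image"))
        return False

    def images_sufficiently_match(captured, prestored) -> bool:
        """Placeholder for the image-recognition comparison; deliberately strict."""
        return captured is not None and prestored is not None and captured == prestored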
[0050] It should be appreciated that if the customer verifies identity via (a), (b), or (c), the exterior camera 302 may still capture an image of the customer's face. The bus 300 may compare the captured image to the prestored image as in (d). If the pictures sufficiently match, then the bus 300 continues to authorize access. If the pictures do not match, then the bus 300 denies access.
[0051] According to various embodiments, the bus 300 captures the image via the exterior camera 302 and stores the captured image for possible later use. According to these embodiments, the bus 300 may authorize access at block 606 when the customer enters via (a), (b), and/or (c) as described above (i.e., the QR code, the wireless technology, or the manual entry) and the bus 300 is able to capture a suitable (i.e., unobscured) image of the customer's face. The bus 300 may evaluate whether the image is suitable by identifying whether the image includes sufficient detail (e.g., clear pictures of eyes, hair, and chin) to identify the customer at a later time, if needed. The bus 300 may transmit all captured images to an external server before departing from the pick-up location. These embodiments do not require a prestored image of the customer because no comparison is required.
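A rough stand-in for the suitability test might check that a face detector reported the listed details. A real implementation would run an actual detector; here the detection step is assumed to have already produced a set of labels.

    def image_is_suitable(detected_features: set) -> bool:
        """Check that the capture shows enough detail to identify the customer later."""
        required = {"eyes", "hair", "chin"}   # the example details named above
        return required.issubset(detected_features)

    print(image_is_suitable({"eyes", "hair", "chin", "nose"}))  # True: keep and upload
    print(image_is_suitable({"eyes"}))                          # False: obscured face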
[0052] Upon authorizing access or entry to a customer, the bus 300 may send an illumination signal to the seat 304 or safety handle 307 reserved by the customer at block 608. As described below, the illumination signal may cause a light (e.g., one or more LEDs) to illuminate the reserved seat 304 or safety handle 307, thus causing the reserved seat 304 or safety handle 307 to glow. Alternatively or in addition, the bus 300 may, at block 608, light up LEDs on a floor of the bus in a pattern leading to the customer's reserved seat or safety handle.
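The block-608 behavior can be sketched as follows, assuming simple light-control and path-lookup calls on the bus object; these interfaces are placeholders, since the disclosure describes the effect rather than an API.

    def illuminate_reserved_fixture(bus, customer):
        """Block 608: light the reserved seat or handle and a floor path to it."""
        fixture = bus.reserved_fixture(customer)
        bus.set_light(fixture.light_id, on=True)          # UV source makes the coating glow
        for led in bus.floor_leds_between(bus.door_id, fixture.fixture_id):
            bus.set_floor_led(led, on=True)               # pattern leading to the fixture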
[0053] Figure 4 shows an example seat 304. A surface of the seat 304 is coated with a phosphor paint, a rylene dye, or any other suitable luminescent paint or dye 401. The paint or dye 401 reacts to certain wavelengths of light (e.g., ultra-violet light). This light may be invisible to humans. When the paint or dye 401 reacts with this light, the paint or dye 401 releases new light having human-visible wavelengths. The paint or dye 401 thus appears to glow.
[0054] As shown in Figure 4, a seat light 305 is mounted above the seat 304 and pointed in the direction of the paint or dye 401. The seat light 305 may be fixed in any suitable location, including within a cushion of the seat 304. When the seat light 305 is activated, the seat light 305 produces the certain wavelengths of light (e.g., the ultra-violet light that is invisible to humans).
[0055] According to various embodiments, the seat light 305 includes a plurality of different LEDs. A first set of LEDs in the seat light 305 generates the ultra-violet light that activates the paint or dye 401. A second set of LEDs in the seat light 305 generates visible light suitable for reading. The seat light 305 may include a switch that enables a customer to turn the second set of lights on and off.
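The two LED sets and the customer-operated switch can be sketched as a small state object. Attribute and method names are assumptions; only the behavior comes from the preceding paragraphs.

    class SeatLight:
        """Seat light 305 with two LED sets, as described above."""

        def __init__(self):
            self.uv_on = False        # first set: UV LEDs that activate paint or dye 401
            self.reading_on = False   # second set: human-visible reading light

        def command_uv(self, on: bool):
            """Issued by processor 108, e.g., at block 608."""
            self.uv_on = on

        def toggle_reading_switch(self):
            """Customer-operated switch for the second set of LEDs."""
            self.reading_on = not self.reading_on

    light = SeatLight()
    light.command_uv(True)
    light.toggle_reading_switch()
    print(light.uv_on, light.reading_on)  # True True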
[0056] Figure 5 illustrates an example safety handle 307. Although the safety handle 307 of Figure 5 hangs from a ceiling of the bus 300 via a strap 501, other safety handles 307 may extend from a floor of the bus 300. The safety handle 307 includes the strap 501 and a triangular holding portion 502 with a grip 503. The holding portion 502 (and in some cases, specifically the grip 503) includes the sensors and light sources discussed above. These sensors and light sources may include (a) a load sensor, electric capacitance sensor, and/or the wireless antenna used to verify presence of the customer (detailed below), (b) the first set of LEDs (e.g., an ultra-violet LED source) configured to activate phosphor paint or rylene dye, and (c) the second set of LEDs (e.g., typical LEDs) that generates light visible to humans.
[0057] Accordingly, portions of the holding portion 502, including the grip 503, may be transparent to enable the passage of light therethrough. According to various embodiments, a first portion of the grip 503 is transparent and includes the second set of LEDs. According to various embodiments, a second portion of the grip 503 is made from a transparent material and includes the first set of LEDs. The transparent material is coated with the luminescent paint or dye. Alternatively, and as shown in Figure 3, one safety handle light 305e, 305f external to the safety handles 307, may be configured to generate the ultra-violet light. The one safety handle light 305e, 305f is thus capable of simultaneously illuminating (i.e., producing glow for) a plurality of safety handles 307. This configuration is suitable when the bus 300 prices all safety handles 307 equally.
[0058] At block 610, the bus 300 verifies that the customer has been correctly seated (or is gripping the proper safety handle) before departing from the pick-up location. The bus 300 may verify the customer's seat or grip via one or more of: (a) a load sensor 306 (e.g., a capacitive load sensor) located in the seat or handle that detects the application of force or weight; (b) an electrical sensor 306 (e.g., an electrical capacitance sensor) that detects a change in an electrical property (e.g., capacitance) corresponding to the customer; (c) wireless antennas (e.g., Bluetooth antennas or NFC antennas) that locate the customer's smartphone within the bus 300; (d) photo recognition via interior cameras 308a, 308b mounted inside the bus 300 (e.g., determining, via photo recognition, that the reserved seat has now been occupied).
[0059] The photo recognition may proceed as follows: a patch may be painted on or affixed to each seat 304 and safety handle 307. The patch may be a certain unique color or have certain unique properties that render the patch highly visible to photo detection via cameras 308a, 308b. Each patch should generally be a different color than surrounding portions of the seat or safety handle. All patches may have the same color and unique properties. When the cameras 308a, 308b can no longer view the patch (as determined by image recognition software), the bus 300 may determine that the seat 304 or handle 307 has been occupied.
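The four occupancy cues (a) through (d) can be fused with a simple disjunction, as sketched below. The numeric thresholds are invented for illustration; the paragraphs above only state which signals indicate occupancy.

    def fixture_occupied(load_newtons: float, capacitance_delta: float,
                         phone_located: bool, patch_visible: bool) -> bool:
        """Fuse occupancy cues (a)-(d) for one seat 304 or safety handle 307."""
        weight_cue = load_newtons > 50.0        # (a) load sensor 306; threshold assumed
        electrical_cue = capacitance_delta > 0  # (b) capacitance change near the customer
        wireless_cue = phone_located            # (c) Bluetooth/NFC antennas find the smartphone
        camera_cue = not patch_visible          # (d) patch hidden from cameras 308a, 308b
        return weight_cue or electrical_cue or wireless_cue or camera_cue

    print(fixture_occupied(620.0, 0.0, False, True))   # True: weight alone indicates occupancy
    print(fixture_occupied(0.0, 0.0, False, True))     # False: fixture appears vacant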
[0060] It should be appreciated that methods (a), (b), and (d) enable the bus 300 to produce an audio or visual signal when a customer has incorrectly switched seats 304 or safety handles 307, or has placed an item on top of a seat 304 not associated with (i.e., not reserved by) the customer. Further, methods (a), (b), (c), and (d) enable the bus 300 to identify when a customer has vacated the reserved seat prior to the customer's destination. At block 610, and in between stops, the bus 300 periodically queries the sensors and determines whether the sensors indicate a vacant seat 304 or safety handle 307.
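A minimal polling sketch of these periodic block-610 checks is set out below. The sensor interface, the reservation mapping, and the two-second poll interval are illustrative assumptions rather than elements of the disclosure.

```python
# Illustrative sketch: between stops, periodically query the per-seat sensors
# and report any reserved seat or handle that currently reads as vacant.
# The sensor and reservation objects are hypothetical stand-ins.
import time

POLL_INTERVAL_S = 2.0  # assumed polling period


def vacant_reservations(reservations, sensors):
    """Yield seat/handle IDs whose reserved position appears unoccupied."""
    for seat_id, customer in reservations.items():
        sensor = sensors[seat_id]
        occupied = (
            sensor.load_detected()                       # (a) weight/force on seat or grip
            or sensor.capacitance_changed()              # (b) electrical capacitance shift
            or sensor.phone_in_range(customer.phone_id)  # (c) Bluetooth/NFC proximity
            or sensor.patch_hidden()                     # (d) camera no longer sees the patch
        )
        if not occupied:
            yield seat_id


def poll_between_stops(reservations, sensors, on_vacancy):
    """Run the periodic check and hand any vacancy to an escalation callback."""
    while True:
        for seat_id in vacant_reservations(reservations, sensors):
            on_vacancy(seat_id)
        time.sleep(POLL_INTERVAL_S)
```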
[0061] Upon detecting a vacant seat 304 or safety handle 307 associated with a customer on the bus 300, the bus 300 is configured to: (a) flash the first set of LEDs associated with the vacant seat 304 or safety handle 307; (b) send a text message to the smartphone of the customer associated with the vacant seat; (c-1) initiate video recording via the cameras 308; (c-2) flag the video recording as important; (c-3) upload the flagged video; (c-4) conduct image recognition on the flagged video via image recognition software; (d) pull over and stop if the bus 300 is autonomous. Steps (a) to (d) may occur sequentially. The bus 300 may wait a predetermined amount of time (e.g., five or ten seconds) between each of the steps. The steps may be cumulative (e.g., the first set of LEDs continues to flash when the bus pulls over and stops).
[0062] The bus 300 may charge customers taking longer trips more than customers taking shorter trips. The bus 300 is therefore configured to determine whether a customer's stay or ride on the bus 300 is consistent with the fare paid via the SA. The bus 300 associates a destination with each customer. The destination is received via the message of block 602. At block 612, when (or a predetermined amount of time before) a customer reaches a destination, the bus 300 transmits a message, such as a text message, to the customer's smartphone. The bus may also flash or blink the second set of LEDs in the seat light 305 or the safety handle 307 associated with the customer.
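The sequential, cumulative escalation of paragraph [0061] could be sketched as follows. The callback names and the fixed five-second spacing are assumptions for illustration (the disclosure contemplates, e.g., five or ten seconds), and earlier steps are simply left running rather than undone.

```python
# Illustrative sketch: run escalation steps (a)-(d) in order, waiting a fixed
# delay between steps and stopping early if the seat or handle is retaken.
# The bus/customer interfaces are hypothetical stand-ins.
import time

STEP_DELAY_S = 5.0  # assumed delay between escalation steps


def escalate_vacancy(bus, seat, customer):
    steps = [
        lambda: bus.flash_uv_leds(seat),                      # (a) flash the first set of LEDs
        lambda: bus.send_text(customer.phone,
                              "Please return to your reserved seat."),  # (b) text message
        lambda: bus.record_flag_upload_and_analyze(),         # (c-1) through (c-4)
        lambda: bus.pull_over_and_stop_if_autonomous(),       # (d) autonomous stop
    ]
    for step in steps:
        step()                        # steps are cumulative; earlier ones stay active
        if bus.seat_occupied(seat):   # stop escalating once the customer returns
            return
        time.sleep(STEP_DELAY_S)
```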
[0063] At block 614, the bus 300 verifies that the customer has vacated the seat 304, safety handle 307, and/or bus 300 via any of the above-described methods. At block 614, the bus 300 may activate an alarm if the customer has not vacated the seat 304, safety handle 307, and/or bus 300 within a predetermined amount of time after the bus 300 reaches the customer's destination.
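A minimal sketch of this block-614 timeout check follows. The thirty-second grace period and the bus/seat interface are assumptions for illustration; the disclosure requires only some predetermined amount of time.

```python
# Illustrative sketch: after the bus reaches the customer's destination, wait a
# grace period for the seat (or handle) to be vacated, then raise the alarm.
import time

GRACE_PERIOD_S = 30.0  # assumed "predetermined amount of time"


def verify_vacated_or_alarm(bus, seat, customer):
    deadline = time.monotonic() + GRACE_PERIOD_S
    while time.monotonic() < deadline:
        if not bus.seat_occupied(seat):   # any of the verification methods above
            return True                   # customer vacated in time
        time.sleep(1.0)
    bus.activate_alarm(seat, customer)    # triggers the camera handling of [0064]
    return False
```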
[0064] According to various embodiments, the alarm includes activating a live-streaming capability of the cameras 308a, 308b, thus enabling a remote operator to view the inside of the bus 300. The alarm may cause the bus 300 to record video of the interior via the cameras 308a, 308b, save the video, flag the saved video as important, analyze the video with image recognition software, and upload the flagged video to a centralized server. Video from the cameras 308a, 308b may stop being saved, flagged, and/or uploaded when the customer vacates the seat 304, safety handle 307 and/or bus 300. According to some embodiments, the bus 300 is configured to save both flagged and unflagged video. According to some embodiments, the bus 300 is configured to automatically upload saved flagged video, but is not configured to automatically upload saved and unflagged video. According to some embodiments, the bus 300 is configured to automatically delete saved and unflagged video after a predetermined amount of time. According to some embodiments, the bus is configured to automatically delete saved flagged video only after successfully uploading the saved flagged video.
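The flagged/unflagged video handling just described could be sketched as the following retention routine. The clip attributes, the seventy-two-hour retention window, and the uploader callback are illustrative assumptions rather than requirements of the disclosure.

```python
# Illustrative sketch: flagged clips are uploaded automatically and deleted only
# after a successful upload; unflagged clips are never auto-uploaded and are
# deleted once they age past a retention window.
import os
import time

UNFLAGGED_RETENTION_S = 72 * 3600  # assumed retention window for unflagged video


def apply_retention_policy(clips, upload_to_server):
    """clips: iterable of records with .path, .flagged, .uploaded, .saved_at (epoch seconds)."""
    now = time.time()
    for clip in clips:
        if clip.flagged:
            if not clip.uploaded:
                clip.uploaded = upload_to_server(clip.path)  # returns True on success
            if clip.uploaded:
                os.remove(clip.path)    # delete flagged video only after upload succeeds
        elif now - clip.saved_at > UNFLAGGED_RETENTION_S:
            os.remove(clip.path)        # unflagged video simply ages out locally
```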
[0065] If the bus 300 is autonomous, the alarm may prevent the bus from departing the stop until (a) the customer vacates the bus 300, or (b) the customer reserves a new seat 304 or safety handle 307. Instead of disabling the bus 300, the alarm may cause the bus 300 to instruct a centralized server to increase the fare associated with the customer. The bus 300 may track when the customer actually departs (via the technologies described above), and then supplement the fare according to the additional time that the customer occupied the seat or safety handle, plus a penalty.
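The fare supplement could be computed along the following lines. The per-minute rate and the flat penalty are illustrative assumptions; the disclosure states only that the supplement reflects the additional occupancy time plus a penalty.

```python
# Illustrative sketch: charge for the extra time the seat or handle remained
# occupied past the reserved destination, plus a flat overstay penalty.
EXTRA_RATE_PER_MIN = 0.10  # assumed currency units per extra minute
OVERSTAY_PENALTY = 2.00    # assumed flat penalty


def supplemental_fare(scheduled_exit_s, actual_exit_s):
    """Both arguments are epoch seconds; returns the amount to add to the fare."""
    extra_minutes = max(0.0, (actual_exit_s - scheduled_exit_s) / 60.0)
    if extra_minutes == 0.0:
        return 0.0
    return extra_minutes * EXTRA_RATE_PER_MIN + OVERSTAY_PENALTY
```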

Claims

1. A bus comprising: motor(s), interior camera(s), seats, each seat associated with an ultraviolet light source and an occupancy sensor, processor(s) configured to:
(a) command one of the ultraviolet light sources to illuminate one of the seats;
(b) periodically detect occupancy of each seat via the occupancy sensors;
(c) activate the interior camera(s) based on a detected occupancy.
2. The bus of claim 1, wherein the seats are coated with a luminescent paint or dye visually responsive to ultraviolet light.
3. The bus of claim 1, comprising a plurality of handles, the handles being at least partially constructed from a transparent material coated with a luminescent paint or dye.
4. The bus of claim 3, wherein the handles house ultraviolet light sources configured to activate the luminescent paint or dye.
5. The bus of claim 1, wherein the processor(s) are configured to: identify a boarding passenger and command the ultraviolet light source to illuminate the seat based on the passenger's identity.
6. The bus of claim 1, wherein all of the light produced by the ultraviolet light sources falls outside of the human-visible light spectrum.
7. The bus of claim 1, wherein the processor(s) are configured to: associate a destination with each occupied seat, activate the interior camera(s) based on determining that one of the occupied seats has become unoccupied prior to the destination.
8. The bus of claim 7, wherein activation of the interior camera(s) comprises flagging video recorded by the interior camera(s) and saving the flagged video.
9. The bus of claim 7, wherein the processor(s) are configured to: deactivate the interior camera(s) based on determining that the unoccupied seat has become occupied.
10. The bus of claim 7, wherein the processor(s) are configured to: activate the interior camera(s) based on determining that the seat has not become unoccupied after reaching the destination.
11. The bus of claim 1, comprising exterior camera(s) and wherein the processor(s) are configured to: authorize bus access to a person when the exterior camera(s) capture an image of the person's face; deny bus access to the person when the exterior camera(s) fail to capture the image of the person's face.
12. The bus of claim 11, wherein the processor(s) are configured to: determine whether the exterior camera(s) have captured the image of the person's face by running an image-recognition program, run the captured image through the image-recognition program based on and after receiving a wireless signal from a mobile device associated with the person.
13. The bus of claim 1, wherein the occupancy sensors are capacitance sensors.
14. A method of operating a bus including processor(s), motor(s), interior camera(s), seats, each seat associated with an ultraviolet light source and an occupancy sensor; the method comprising, via the processor(s): (a) commanding one of the ultraviolet light sources to illuminate one of the seats;
(b) periodically detecting occupancy of each seat via the occupancy sensors;
(c) activating the interior camera(s) based on a detected occupancy.
15. The method of claim 14, wherein the seats are coated with a luminescent paint or dye responsive to ultraviolet light and the bus includes a plurality of handles, the handles being at least partially constructed from a transparent material coated with a luminescent paint or dye.
16. The method of claim 15, comprising: identifying a boarding passenger and commanding the ultraviolet light source to illuminate the seat based on the passenger's identity.
17. The method of claim 14, comprising: associating a destination with each occupied seat, activating the interior camera(s) based on determining that one of the occupied seats has become unoccupied prior to the destination.
18. The method of claim 17, comprising: deactivating the interior camera(s) based on determining that the unoccupied seat has become occupied.
19. The method of claim 17, comprising: authorizing bus access to a person when the exterior camera(s) capture an image of the person's face; denying bus access to the person when the exterior camera(s) fail to capture the image of the person's face; determining whether the exterior camera(s) have captured the image of the person's face by running an image-recognition program on the processor(s); running the captured image through the image-recognition program based on and after receiving a wireless signal from a mobile device associated with the person.
20. The method of claim 19, wherein the occupancy sensors are capacitance sensors.
PCT/US2016/046057 2016-08-08 2016-08-08 Bus with reservation system and illuminated seating WO2018030987A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2016/046057 WO2018030987A1 (en) 2016-08-08 2016-08-08 Bus with reservation system and illuminated seating

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2016/046057 WO2018030987A1 (en) 2016-08-08 2016-08-08 Bus with reservation system and illuminated seating

Publications (1)

Publication Number Publication Date
WO2018030987A1 true WO2018030987A1 (en) 2018-02-15

Family ID=61163237

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2016/046057 WO2018030987A1 (en) 2016-08-08 2016-08-08 Bus with reservation system and illuminated seating

Country Status (1)

Country Link
WO (1) WO2018030987A1 (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010023908A1 (en) * 2000-03-09 2001-09-27 Jens Romca Passenger orientation arrangement in a passenger cabin
US20050080533A1 (en) * 2003-09-29 2005-04-14 Basir Otman A. Vehicle passenger seat sensor network
US20150210287A1 (en) * 2011-04-22 2015-07-30 Angel A. Penilla Vehicles and vehicle systems for providing access to vehicle controls, functions, environment and applications to guests/passengers via mobile devices
US20140264079A1 (en) * 2013-03-15 2014-09-18 International Automotive Components Group North America, Inc. Luminescent, Ultraviolet Protected Automotive Interior Members
US20150251596A1 (en) * 2013-11-21 2015-09-10 Ford Global Technologies, Llc Luminescent seating assembly
US20160075275A1 (en) * 2013-11-21 2016-03-17 Ford Global Technologies, Llc System and method for remote activation of vehicle lighting
US20160200219A1 (en) * 2015-01-08 2016-07-14 Hokky Tjahjono Vehicle Occupant Presence and Reminder System

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3561629A1 (en) * 2018-04-27 2019-10-30 Delphi Technologies, LLC Autonomous vehicle operation based on passenger-count
CN110406543A (en) * 2018-04-27 2019-11-05 德尔福技术有限公司 The autonomous vehicle operation counted based on passenger
US11092963B2 (en) 2018-04-27 2021-08-17 Motional Ad Llc Autonomous vehicle operation based on passenger-count
EP4282703A3 (en) * 2018-04-27 2024-03-20 Motional AD LLC Autonomous vehicle operation based on passenger-count
DE102018006769A1 (en) * 2018-08-27 2020-02-27 Daimler Ag Arrangement for irradiating a surface
US11964065B2 (en) 2018-08-27 2024-04-23 Daimler Ag Arrangement for irradiating a surface
US11013092B2 (en) 2019-03-01 2021-05-18 Chromo Lighting, LLC Light system
US11363702B2 (en) 2019-03-01 2022-06-14 Chromo Lighting, LLC Light system
US11170459B2 (en) 2019-03-14 2021-11-09 Ford Global Technologies, Llc Systems and methods for seat selection in a vehicle of a ride service
US11655032B2 (en) 2020-03-19 2023-05-23 B/E Aerospace, Inc. Systems and methods for efficient boarding of passenger vehicles
US11479168B2 (en) 2020-06-24 2022-10-25 Shanghai Yanfeng Jinqiao Automotive Trim Systems Co. Ltd. Vehicle interior component
EP4296988A4 (en) * 2021-03-24 2024-04-10 Nec Corp Operation management system, operation management device, operation management method, and program

Similar Documents

Publication Publication Date Title
WO2018030987A1 (en) Bus with reservation system and illuminated seating
US11766993B2 (en) Automatic power door opening on sustained presence
US11465634B1 (en) Automobile detection system
US11167725B2 (en) Iris-detection alignment for vehicle feature activation
US10829034B2 (en) Vehicle control device
US9694764B2 (en) Vehicle electromechanical systems triggering based on image recognition and radio frequency
CN107972609A (en) Method and apparatus for door state detection
CN111231893B (en) Method for operating a shared vehicle and shared vehicle
CN108749767A (en) A kind of face recognition vehicle system for unlocking, unlocking method and vehicle
CN110001629A (en) The mobile device system of remote control park for vehicle connects
JP2009280063A (en) Vehicle air conditioning system
CN110001627A (en) The mobile device system of remote control park for vehicle connects
CN110758320B (en) Anti-leaving processing method and device for self-help test driving, electronic equipment and storage medium
CN109987087A (en) The mobile device system of remote control park for vehicle connects
CN109243024A (en) A kind of automobile unlocking system and method based on recognition of face
US20200186689A1 (en) Automated Vehicle (AV) Interior Inspection Method and Device
KR101663096B1 (en) Anti-theft Device for Vehicles
EP3664406B1 (en) Passenger selection and screening for automated vehicles
CN206097256U (en) Intelligent management and convenient for people system of residential quarter
CN108394372A (en) The method of seat occupancy for identification
CN113516034B (en) Keyless entry system and method for vehicle
JP2009234318A (en) Vehicular environment control system and ride intention detection device
CN103425962B (en) Vehicle face identification device and method
US20190299848A1 (en) User guidance device
CN208861343U (en) A kind of automobile system for unlocking based on recognition of face

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 16912816
Country of ref document: EP
Kind code of ref document: A1
NENP Non-entry into the national phase
Ref country code: DE
122 Ep: pct application non-entry in european phase
Ref document number: 16912816
Country of ref document: EP
Kind code of ref document: A1