WO2018056981A1 - Solar-powered, virtual-reality windshield - Google Patents

Solar-powered, virtual-reality windshield

Info

Publication number
WO2018056981A1
Authority
WO
WIPO (PCT)
Prior art keywords
windshield
vehicle
display
sensor
sub
Prior art date
Application number
PCT/US2016/053069
Other languages
French (fr)
Inventor
Wei Xu
Fazal Urrahman Syed
Scott Vincent MYERS
Original Assignee
Ford Global Technologies, LLC
Application filed by Ford Global Technologies, LLC
Priority to PCT/US2016/053069
Publication of WO2018056981A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R 16/00 Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R 16/02 Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
    • B60R 16/03 Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for supply of electrical power to vehicle subsystems or for
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R 16/00 Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R 16/02 Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
    • B60R 16/03 Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for supply of electrical power to vehicle subsystems or for
    • B60R 16/0307 Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for supply of electrical power to vehicle subsystems or for using generators driven by a machine different from the vehicle motor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2330/00 Aspects of power supply; Aspects of display protection and defect management
    • G09G 2330/02 Details of power systems and of start or stop of display operation
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2380/00 Specific applications
    • G09G 2380/10 Automotive applications

Definitions

  • This invention relates to multipurpose vehicle windshields for autonomous and semi-autonomous vehicles.
  • Modern vehicle windshields or windscreens are critical safety components of vehicles.
  • a windshield, typically made of laminated safety glass, provides a barrier to protect the vehicle's occupants from wind, weather, and flying debris during travel.
  • a windshield also provides protection to vehicle occupants in the event of a rollover accident, and aids in proper airbag deployment.
  • autonomous (e.g., driverless) cars may reduce the need for a driver to assume many of the duties traditionally associated with driving.
  • many of the utilitarian functions of vehicle windshields such as allowing clear, unobstructed views of the driving environment, may become less important.
  • because autonomous vehicles rely on various sensors to detect and navigate the external environment, less diligence may be required with respect to assessing road conditions and environmental conditions.
  • a driver may even be able to take his or her eyes off the road completely for a period of time without compromising the safety of vehicle occupants.
  • vehicle occupants, including the driver, may feel free to direct their attention to other interests and entertainment.
  • Figure 1 is a high-level block diagram showing one example of a computing system in which a system and method in accordance with the invention may be implemented;
  • Figure 2 is a high-level block diagram showing various components of a vehicle windshield system in accordance with certain embodiments of the invention;
  • Figure 3 is a flow diagram showing a process for powering a vehicle windshield system in accordance with certain embodiments of the invention;
  • Figure 4 is a flow chart showing a process for selectively allowing and disallowing visibility through a vehicle windshield in accordance with certain embodiments of the invention;
  • Figure 5 is a front view of one embodiment of a vehicle windshield including a solar panel having various sub-panels;
  • Figure 6 is a front view of the solar sub-panels of Figure 5 rotated about ninety degrees in accordance with certain embodiments of the invention;
  • Figure 7 is a front view of the solar sub-panels of Figures 5 and 6 retracted from a portion of the vehicle windshield in accordance with the invention;
  • Figure 8 is a front view of another embodiment of a vehicle windshield including a solar panel;
  • Figure 9 is a front view of the solar panel of Figure 8 retracted from a portion of the vehicle windshield in accordance with the invention;
  • Figure 10 is a front view of one embodiment of a display coupled to a vehicle windshield in accordance with the invention; and
  • Figure 11 is a front view of a second embodiment of a display in accordance with the invention.
  • Referring to Figure 1, one example of a computing system 100 is illustrated.
  • the computing system 100 is presented to show one example of an environment where a system and method in accordance with the invention may be implemented.
  • the computing system 100 may be embodied as a mobile device 100 such as a smart phone or tablet, a desktop computer, a workstation, a server, or the like.
  • the computing system 100 is presented by way of example and is not intended to be limiting. Indeed, the systems and methods disclosed herein may be applicable to a wide variety of different computing systems in addition to the computing system 100 shown. The systems and methods disclosed herein may also potentially be distributed across multiple computing systems 100.
  • the computing system 100 includes at least one processor 102 and may include more than one processor 102.
  • the processor 102 may be operably connected to a memory 104.
  • the memory 104 may include one or more non-volatile storage devices such as hard drives 104a, solid state drives 104a, CD-ROM drives 104a, DVD-ROM drives 104a, tape drives 104a, or the like.
  • the memory 104 may also include non-volatile memory such as a read-only memory 104b (e.g., ROM, EPROM, EEPROM, and/or Flash ROM) or volatile memory such as a random access memory 104c (RAM or operational memory).
  • a bus 106, or plurality of buses 106, may interconnect the processor 102, memory devices 104, and other devices to enable data and/or instructions to pass therebetween.
  • the computing system 100 may include one or more ports 108.
  • Such ports 108 may be embodied as wired ports 108 (e.g., USB ports, serial ports, Firewire ports, SCSI ports, parallel ports, etc.) or wireless ports 108 (e.g., Bluetooth, IrDA, etc.).
  • the ports 108 may enable communication with one or more input devices 110 (e.g., keyboards, mice, touchscreens, cameras, microphones, scanners, storage devices, etc.) and output devices 112 (e.g., displays, monitors, speakers, printers, storage devices, etc.).
  • the ports 108 may also enable communication with other computing systems 100.
  • the computing system 100 includes a wired or wireless network adapter 114 to connect the computing system 100 to a network 116, such as a LAN, WAN, or the Internet.
  • a network 116 may enable the computing system 100 to connect to one or more servers 118, workstations 120, personal computers 120, mobile computing devices, or other devices.
  • the network 116 may also enable the computing system 100 to connect to another network by way of a router 122 or other device 122.
  • a router 122 may allow the computing system 100 to communicate with servers, workstations, personal computers, or other devices located on different networks.
  • autonomous vehicles may relieve a driver from having to assume many of the duties traditionally associated with driving.
  • Sensors used to navigate an autonomous vehicle may assess the external environment more frequently and more accurately than human perception permits, thereby enabling a driver to divert his attention to entertainment and other interests without endangering vehicle occupants.
  • This new technology also provides a unique opportunity for a vehicle windshield to do more than simply provide unobstructed views from the inside to the outside of the vehicle.
  • Vehicle windshield systems in accordance with embodiments of the present invention provide a multi-purpose windshield configured to selectively provide information and entertainment to vehicle occupants while permitting traditional or manual use of the vehicle windshield in an emergency situation.
  • the term "windshield" refers to any windshield, windscreen, window, or other substantially transparent material mediating the space between the interior and the exterior of a vehicle.
  • the term "vehicle" refers to any autonomous, semi-autonomous, or non-autonomous passenger vehicle, including a heavy-duty industrial or transport vehicle, bus, truck, car, cart, airplane, and the like.
  • a vehicle windshield system 200 in accordance with the invention may utilize various vehicle on-board sensors 202, 204, 206, 208, 210 to sense an external environment of the vehicle and create a virtual reality therefrom, as discussed in more detail below.
  • the vehicle windshield system 200 may include lidar sensors 202, radar sensors 204, camera sensors 206, navigational or GPS sensors 208, and other sensors 210.
  • Lidar sensors 202 may be used to gather data by scanning the environment surrounding the vehicle with laser light. In this manner, lidar sensors 202 may determine the exact location of other vehicles and environmental objects and conditions, such as pedestrians, children, animals, roadway signage, and the like, in addition to the roadway surface itself. Lidar sensors 202 may likewise determine relevant features and characteristics of such objects. This data may be received by an internal virtual reality system 216 and input into fusion 212 algorithms to produce a virtual reality for display to a user, as discussed in more detail below.
  • Radar sensors 204 may use electromagnetic waves to determine distances between objects. Radar sensors 204 may also be used to determine properties of objects to facilitate object differentiation. Data gathered from radar sensors 204 may also be communicated to the virtual reality system 216 for analysis, fusion, and display.
  • Optical or camera sensors 206 may include image acquisition devices such as cameras, charge coupled devices, or the like, to acquire still images or video of the surrounding area. Data from camera sensors 206 may be communicated to the virtual reality system 216, where it may undergo image processing to increase contrast or clarity, for example. The virtual reality system 216 may collect and combine the image data with other sensor data to create a virtual reality for display.
  • Navigational system or GPS sensors 208 may include common global positioning system (“GPS”) sensors, or other similar sensors known in the art. GPS sensors 208 may be used to detect a location of the vehicle relative to road and map features, including intersections, bridges, buildings, homes, and the like. GPS sensors 208 may also provide pertinent information for display in combination with the virtual reality data gathered from other sensors 202, 204, 206, 208, 210, such as recommended speeds, distances to landmarks or desired locations, etc.
  • GPS: global positioning system
  • Other sensors 210 may include, for example, temperature sensors, humidity sensors, ultrasonic sensors, audio sensors, and the like. These other sensors 210 may enhance the accuracy and detail of the virtual reality displayed by a vehicle windshield system 200 in accordance with embodiments of the invention.
  • the various sensors 202, 204, 206, 208, 210 may work together to sense conditions of the surrounding environment, thereby enabling safe navigation of the autonomous vehicle without human intervention. According to certain embodiments of the present invention, these sensors 202, 204, 206, 208, 210 may also collect data to provide an augmented or virtual reality to vehicle occupants.
  • Data from the sensors 202, 204, 206, 208, 210 may be received by a virtual reality system 216.
  • the virtual reality system 216 may analyze and fuse 212 the data gathered from the various sensors 202, 204, 206, 208, 210 to create a virtual or augmented reality. This information may be projected onto a vehicle windshield, or may be displayed on a monitor or other display device 220 associated with the vehicle windshield, as discussed in more detail below.
  • the virtual reality system 216 may perform one or more fusion 212 algorithms, using the data from sensors 202, 204, 206, 208, 210 as input. These fusion 212 algorithms may provide object tracking 212 to display pertinent information associated with objects in the surrounding environment, such as speeds, locations, or distances associated with other vehicles on the road. Object tracking 212 may also provide information regarding speeds, locations, or distances of objects not normally within the line of sight of a driver, such as a vehicle in the driver's blind spot, an approaching animal, or children playing near or behind the vehicle. Fusion 212 algorithms may be performed repeatedly to provide updated information to vehicle occupants in real time.
  • the virtual reality system 216 may utilize the information produced from the fusion 212 algorithms to substantially reconstruct 214 the driving environment. For example, the virtual reality system 216 may use image data from the camera sensor 206 and correlate it with location data from the GPS sensors 208 to accurately depict distances of surrounding objects relative to the vehicle. Similarly, the virtual reality system 216 may fuse information from the lidar sensors 202, radar sensors 204, and other sensors 210 to indicate the presence of objects not captured by the camera sensors 206. In this manner, a vehicle windshield system 200 in accordance with the present invention may provide a more accurate and complete picture of the surrounding environment than human perception permits.
  • the virtual reality thus created by the virtual reality system 216 may be selectively displayed on a monitor or other display device 220 associated with a vehicle windshield.
  • the virtual reality may be displayed or projected onto the windshield itself.
  • an entertainment system 222 may also communicate with the vehicle windshield display 220.
  • a human/machine interface 218 may permit a user to selectively display the virtual reality environment created by the virtual reality system 216, information produced by the virtual reality system 216, entertainment provided by the entertainment system 222, or a combination thereof.
  • the human/machine interface 218 may allow the user to selectively inactivate the display 220.
  • the human/machine interface 218 may permit a user to selectively retract the display 220 to allow full or partial visibility through the windshield.
  • the human/machine interface 218 may be implemented as a touchscreen incorporated into the display 220, a separate touch panel incorporated into the dashboard, buttons on the dashboard, a keyboard, an audio processing system that accepts voice commands, or any other input device known to those in the art.
  • the display 220 may overlay information from the virtual reality system 216 onto objects visible through a transparent or substantially transparent windshield, thereby creating an augmented reality.
  • the virtual reality created by the virtual reality system 216 may be entirely reproduced onto the display 220.
  • the virtual reality system 216 and the entertainment system 222 may cooperate to produce augmented reality-type entertainment or games for display.
  • a vehicle windshield system 200 in accordance with the invention includes a solar panel 302 coupled to an exterior or interior surface of a vehicle windshield.
  • the solar panel 302 may receive solar or light energy and convert it to electricity.
  • the electricity may be communicated to a vehicle power source, such as a battery 306.
  • This energy may then be used to drive the virtual reality system 216, entertainment system 222, and/or sensor system 308, in addition to any other vehicle systems known to those in the art. In this manner, embodiments of the present invention may conserve and efficiently utilize energy.
  • light or solar energy may be received by one or more solar panels 302 associated with a vehicle windshield.
  • the solar panel 302 may be a unitary panel, or may include one or more sub-panels.
  • the received energy may then be transferred to a DC/DC converter 304 or other similar device.
  • the DC/DC converter 304 may boost the energy voltage and stabilize the current as needed to charge the battery 306. In this manner, embodiments of the present invention optimize fuel efficiency by using solar energy, rather than fuel energy, to power the battery 306.
  • the battery 306 may provide power to the entertainment system 222, the virtual reality system 216, and/or the sensor system 308.
  • the sensor system 308 may include any or all of the vehicle sensors 202, 204, 206, 208, 210 used to gather data for use by the virtual reality system 216.
  • one or more components of the vehicle windshield system 200 may cease to function correctly. Such a malfunction may adversely affect the autonomous driving capability of the vehicle. In such cases the vehicle windshield system 200 may transition between various modes of operation to reflect the vehicle's current autonomous driving capability.
  • FIG. 4 One embodiment of a method 400 describing transitions between the various modes is illustrated in Figure 4.
  • a determination may be made as to whether the various sensors 202, 204, 206, 208, 210 are functioning 402 correctly. If yes, the vehicle windshield system 200 may operate in normal mode 404. In this mode 404, all capabilities of the vehicle windshield system 200 are enabled and solar panels 302 and other components associated with the vehicle windshield system 200 may obstruct visibility through the windshield without compromising the safety of vehicle occupants.
  • the method 400 may determine whether autonomous driving is impaired 406. If not, the vehicle windshield system 200 may operate in limp mode 408, where one or more solar panels 302 allow at least partial visibility through the windshield, thereby allowing semi-autonomous driving with driver intervention.
  • limp mode 408 may include rotating one or more solar sub-panels approximately ninety degrees such that the sub-panels are positioned substantially perpendicularly or vertically relative to the vehicle windshield. In this manner, limp mode 408 may allow at least partial driver visibility through the windshield.
  • the vehicle windshield system 200 may operate in manual mode 410.
  • the solar panels 302 may at least partially retract from the windshield and the features of the vehicle windshield system 200, including the virtual reality system 216 and the entertainment system 222, may be disabled to allow the driver to assume full operation of the vehicle.
  • FIG. 5 depicts one embodiment of a vehicle 500 equipped with a vehicle windshield system 200 in accordance with the present invention.
  • the vehicle windshield system 200 may include at least one solar panel 302 coupled to an interior or exterior of the vehicle windshield 502.
  • the solar panel 302 may include one or more sub-panels 506a, 506b to receive solar energy and communicate the energy to a vehicle power source.
  • the sub-panels 506a, 506b may be positioned substantially parallel to the surface of the vehicle windshield 502. In this manner the sub-panels 506a, 506b may occupy a substantial majority of the surface area of the vehicle windshield 502 to maximize the amount of solar energy received.
  • the sub-panels 506a, 506b may be independently operable and/or movable. In other embodiments, the sub-panels 506a, 506b may operate and/or move in sections, or as an integrated unit. In certain embodiments, as discussed in more detail below, the sub-panels 506a, 506b may selectively retract or rotate to allow at least partial visibility through the vehicle windshield 502.
  • each or any of the sub-panels 506a, 506b may be configured to selectively rotate on its vertical axis to provide greater visibility to a driver through the vehicle windshield 502.
  • each of the sub-panels 506a, 506b rotates approximately ninety degrees to position the sub-panels 506a, 506b substantially perpendicular relative to the interior or exterior surface of the vehicle windshield 502. This rotation may maximize spacing between adjacent sub-panels 506a, 506b to expose a maximum surface area of the vehicle windshield 502 to a vehicle 500 driver.
  • the visibility through the vehicle windshield 502 may be sufficiently increased to enable the driver to intervene or assist with operation of the autonomous vehicle 500 if, for example, one or more sensors 202, 204, 206, 208, 210 fails to function properly. Under such circumstances, driver intervention may be necessary to maintain vehicle 500 safety.
  • the human/machine interface 218 may enable the driver or other user to selectively implement this capability by manual or voice selection of the limp mode 408 option.
  • the sub-panels 506a, 506b may automatically assume limp mode 408 positioning when the vehicle windshield system 200 detects a malfunction in one or more sensors 202, 204, 206, 208, 210.
  • a vehicle windshield system 200 in accordance with the invention may selectively retract the sub-panels 506a, 506b to optimize visibility through the vehicle windshield 502.
  • solar sub-panels 506a, 506b may automatically or selectively rotate from a horizontal position substantially adjacent to the interior or exterior of the vehicle windshield 502 surface, to a vertical position substantially perpendicular to the windshield 502 surface.
  • the sub-panels 506a, 506b may then be retracted by increasing the distance between at least two adjacent sub-panels 506a, 506b by moving the adjacent sub-panels 506a, 506b in opposite directions 700a, 700b. In this manner, the sub-panels 506a, 506b may be substantially stacked on either side of the vehicle windshield 502, similar to vertical blinds. In another embodiment, the sub-panels 506a, 506b may retract so as to create one or more stacks of sub-panels 506a, 506b on the vehicle windshield 502 surface.
  • adjacent solar sub-panels 506a, 506b may slide over one another substantially horizontally, creating one or more overlaid stacks of sub-panels 506a, 506b on the vehicle windshield 502 surface. In any case, retracting the sub-panels 506a, 506b may expose one or more large areas of the vehicle windshield 502 to provide unobstructed visibility to a vehicle 500 driver.
  • the solar panel 302 may comprise a movable screen 800 positioned substantially adjacent to an exterior surface of the vehicle windshield 502. Particularly, an outside surface of the screen 800, i.e. facing the environment exterior to the vehicle 500, may be covered with one or more solar panels 302.
  • These solar panels 302 may convert sunlight passing through the vehicle windshield 502 to electricity to power other vehicle windshield system 200 components, as discussed above.
  • An inside surface of the screen 800 i.e. facing an interior of the vehicle 500 and vehicle 500 occupants, may be used as a display for the virtual reality system 216 or the entertainment system 222.
  • the screen 800 shown in Figures 8 and 9 may be selectively retracted in a horizontal direction 900 into a space 802 between the vehicle windshield 502 and the hood of the vehicle 500, for example.
  • the solar panel 302 may be located on an interior of the vehicle windshield 502 and may retract into a space 802 or slot integrated into a dashboard of the vehicle 500.
  • the screen 800 may be integrated within the vehicle windshield 502, such as between multiple panes of a vehicle windshield 502.
  • the inside surface of the solar panel 302 may act as a display for the virtual reality system 216 or entertainment system 222.
  • a separate mechanism may be provided for the display.
  • electrically switchable smart glass or non-electrical smart glass may be incorporated into the vehicle windshield 502. This may allow automatic or selective variation in the opacity of the vehicle windshield 502 to enable use of the vehicle windshield 502 itself as the display.
  • the solar panel 302 may be substantially transparent.
  • the solar panel 302 may be integrated into the vehicle windshield 502 itself, while the screen 800 may selectively occupy an interior surface of the vehicle windshield 502 to provide a display for the virtual reality system 216 and/or entertainment system 222.
  • adjusting the opacity of the screen 800 may facilitate the ability of the solar panel 302 to convert solar energy to electricity.
  • a display 220 may be associated with the interior or exterior surface of a vehicle windshield 502.
  • the display 220 may present information or a virtual driving environment created by the virtual reality system 216, or may display entertainment from the entertainment system 222.
  • the display 220 may be coupled to an interior or exterior surface of the vehicle windshield 502. As shown in Figure 10, in some embodiments, the display 220 may comprise the interior surface of the vehicle windshield 502 itself. In other embodiments, such as that shown in Figure 11, the display 220 may comprise a separate material, screen, or device coupled to the interior surface of the vehicle windshield 502.
  • the display 220 may selectively retract into a slot or other space in the vehicle 500 hood or dashboard. Also like some embodiments of the solar panel 302, the display 220 may include more than one sub-panel configured to retract vertically or horizontally such that the sub-panels become stacked. In one embodiment, the back surface of the solar panel 302 may be used as the display 220.
  • a dual-paned or multiple-paned vehicle windshield 502 may be modified such that the vehicle windshield 502 may transition from transparent to opaque. In this manner, the vehicle windshield 502 may act as a display 220 under certain circumstances. Transitioning the vehicle windshield 502 opacity in this manner may be accomplished automatically, or may be manually selected. This may enable the vehicle windshield system 200 to display 220 either an augmented reality, or a virtual reality.
  • information and virtual elements created by the virtual reality system 216 may be overlaid onto real-world elements visible through the vehicle windshield 502.
  • a second vehicle 1004 traveling ahead of the occupied vehicle 500 may be visible through the vehicle windshield 502.
  • the vehicle windshield system 200 may display a reconstructed driving environment, or virtual reality, based on the data received and fused from the various sensors 202, 204, 206, 208, 210.
  • the virtual reality system 216 may generate an image of the second vehicle 1004 traveling ahead of the occupied vehicle 500, along with other real-world elements and conditions of the external environment such as the road, trees, mountains, and the like.
  • the virtual reality thus created may be presented on the display 220.
  • the virtual reality system 216 may display indicators 1000 in combination with the virtual or augmented reality to provide information and warnings to vehicle 500 occupants. For example, some indicators 1000 may warn that there is insufficient space or distance between the vehicle and the vehicle ahead. These indicators 1000 may change color, from green to yellow to red for example, to indicate the severity of the situation and prompt driver intervention in some circumstances (a simple color-selection sketch follows this list). Indicators 1000 may also be displayed to warn a driver or vehicle occupants of dangerous or emergency circumstances, such as pedestrian traffic, children playing, animals approaching, and the like.
  • Similarly, the virtual reality system 216 may display arrows 1002 or other symbols indicating a suggested course of action. For example, as shown in Figures 10 and 11, the virtual reality system 216 may utilize data from GPS sensors 208 and other sensors 202, 204, 206, 208, 210 to suggest a lane change or exit based on an intended destination or current traffic conditions.
  • General information generated by the virtual reality system 216 may also be provided at various locations on the display 220, such as in proximity to an affected object, or in a corner of the display 220.
  • Such information may include, for example, virtual fuel gauges, temperature gauges, tire pressure gauges, radio settings, vehicle service notifications, and the like.
  • the display 220 may also be used to present entertainment to vehicle occupants via the entertainment system 222.
  • Such entertainment may include, for example, movies, television, radio, video games, and the like.
  • the entertainment system 222 may be automatically disabled in favor of information generated by the virtual reality system 216, or the display 220 may be completely disabled and retracted as discussed above to facilitate driver intervention and effectively safeguard vehicle occupants.
  • Implementations of the systems, devices, and methods disclosed herein may comprise or utilize a special purpose or general-purpose computer including computer hardware, such as, for example, one or more processors and system memory, as discussed herein. Implementations within the scope of the present disclosure may also include physical and other computer-readable media for carrying or storing computer-executable instructions and/or data structures. Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer system. Computer-readable media that store computer-executable instructions are computer storage media (devices). Computer-readable media that carry computer-executable instructions are transmission media. Thus, by way of example, and not limitation, implementations of the disclosure can comprise at least two distinctly different kinds of computer-readable media: computer storage media (devices) and transmission media.
  • Computer storage media includes RAM, ROM, EEPROM, CD- ROM, solid state drives (“SSDs”) (e.g., based on RAM), Flash memory, phase-change memory (“PCM”), other types of memory, other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer.
  • SSDs: solid state drives
  • PCM: phase-change memory
  • An implementation of the devices, systems, and methods disclosed herein may communicate over a computer network.
  • a "network" is defined as one or more data links that enable the transport of electronic data between computer systems and/or modules and/or other electronic devices.
  • Transmissions media can include a network and/or data links, which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above should also be included within the scope of computer-readable media.
  • Computer-executable instructions comprise, for example, instructions and data which, when executed at a processor, cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions.
  • the computer executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code.
  • the disclosure may be practiced in network computing environments with many types of computer system configurations, including an in-dash vehicle computer, personal computers, desktop computers, laptop computers, message processors, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, tablets, pagers, routers, switches, various storage devices, and the like.
  • the disclosure may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by a combination of hardwired and wireless data links) through a network, both perform tasks.
  • program modules may be located in both local and remote memory storage devices.
  • ASICs: application-specific integrated circuits
  • a sensor may include computer code configured to be executed in one or more processors, and may include hardware logic/electrical circuitry controlled by the computer code.
  • These example devices are provided herein for purposes of illustration, and are not intended to be limiting. Embodiments of the present disclosure may be implemented in further types of devices, as would be known to persons skilled in the relevant art(s).
  • At least some embodiments of the disclosure have been directed to computer program products comprising such logic (e.g., in the form of software) stored on any computer useable medium.
  • Such software when executed in one or more data processing devices, causes a device to operate as described herein.
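
As a rough illustration of the color-coded following-distance indicators described above, the sketch below maps the time gap to the vehicle ahead onto a green/yellow/red warning level. The two-second and one-second thresholds, the function name, and the inputs are assumptions for illustration only; the disclosure does not specify how indicator severity is computed.

```python
def gap_indicator(range_m: float, own_speed_mps: float) -> str:
    """Map the time gap to the vehicle ahead onto an indicator color.

    Thresholds are illustrative assumptions, not taken from the disclosure.
    """
    if own_speed_mps <= 0:
        return "green"                    # stationary: no following-distance risk
    gap_s = range_m / own_speed_mps       # time gap in seconds
    if gap_s >= 2.0:
        return "green"
    if gap_s >= 1.0:
        return "yellow"
    return "red"

print(gap_indicator(range_m=45.0, own_speed_mps=30.0))  # 1.5 s gap -> yellow
print(gap_indicator(range_m=20.0, own_speed_mps=30.0))  # 0.67 s gap -> red
```

In a full system the severity would presumably be recomputed on every fusion cycle and rendered next to the tracked vehicle on the display 220.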

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • Instrument Panels (AREA)

Abstract

A vehicle windshield system provides real-time road information and entertainment to vehicle occupants. The system includes a vehicle windshield and a solar panel and display coupled to the windshield. The solar panel is configured to convert light energy to electricity and to communicate the electricity to a battery. The solar panel also selectively allows and disallows visibility through the vehicle windshield. The display utilizes the electricity from the battery to display at least one of a virtual reality and entertainment. A corresponding method is also disclosed and claimed herein.

Description

SOLAR-POWERED, VIRTUAL-REALITY WINDSHIELD
BACKGROUND
FIELD OF THE INVENTION
[0001] This invention relates to multipurpose vehicle windshields for autonomous and semi-autonomous vehicles.
BACKGROUND OF THE INVENTION
[0002] Modern vehicle windshields or windscreens are critical safety components of vehicles. Typically made of laminated safety glass, a windshield provides a barrier to protect the vehicle's occupants from wind, weather, and flying debris during travel. A windshield also provides protection to vehicle occupants in the event of a rollover accident, and aids in proper airbag deployment.
[0003] With modern technological advances, many windshields are now integrated with various sensors to detect moisture, prevent icing, and improve visibility in inclement weather. Advanced driver assistance systems and head-up displays, once reserved for fighter jets to keep fighter pilots' eyes on the sky rather than on the instruments in the cockpit, are also being offered as options on some vehicles. Despite these advances, however, windshields today still serve largely utilitarian purposes, mediating the space between a vehicle's occupants and an external environment without interfering with the driver's ability to see the external environment and safely navigate the vehicle.
[0004] Although still under development, autonomous (e.g., driverless) cars may reduce the need for a driver to assume many of the duties traditionally associated with driving. For example, many of the utilitarian functions of vehicle windshields, such as allowing clear, unobstructed views of the driving environment, may become less important. Indeed, since autonomous vehicles rely on various sensors to detect and navigate the external environment, less diligence may be required with respect to assessing road conditions and environmental conditions. A driver may even be able to take his or her eyes off the road completely for a period of time without compromising the safety of vehicle occupants. As a result, vehicle occupants, including the driver, may feel free to direct their attention to other interests and entertainment.
[0005] In view of the foregoing, what are needed are systems and methods to provide real-time road, vehicle, and environmental information, as well as entertainment, to vehicle occupants. Ideally, such systems and methods would optimize energy efficiencies, and would automatically prioritize real-time information over entertainment in an emergency.
BRIEF DESCRIPTION OF THE DRAWINGS
[0006] In order that the advantages of the invention will be readily understood, a more particular description of the invention briefly described above will be rendered by reference to specific embodiments illustrated in the appended drawings. Understanding that these drawings depict only typical embodiments of the invention and are not therefore to be considered limiting of its scope, the invention will be described and explained with additional specificity and detail through use of the accompanying drawings, in which:
[0007] Figure 1 is a high-level block diagram showing one example of a computing system in which a system and method in accordance with the invention may be implemented;
[0008] Figure 2 is a high level block diagram showing various components of a vehicle windshield system in accordance with certain embodiments of the invention;
[0009] Figure 3 is a flow diagram showing a process for powering a vehicle windshield system in accordance with certain embodiments of the invention;
[0010] Figure 4 is a flow chart showing a process for selectively allowing and disallowing visibility through a vehicle windshield in accordance with certain embodiments of the invention;
[0011 ] Figure 5 is a front view of one embodiment of a vehicle windshield including a solar panel having various sub-panels;
[0012] Figure 6 is a front view of the solar sub-panels of Figure 5 rotated about ninety degrees in accordance with certain embodiments of the invention;
[0013] Figure 7 is a front view of the solar sub-panels of Figures 5 and 6 retracted from a portion of the vehicle windshield in accordance with the invention;
[0014] Figure 8 is a front view of another embodiment of a vehicle windshield including a solar panel;
[0015] Figure 9 is a front view of the solar panel of Figure 8 retracted from a portion of the vehicle windshield in accordance with the invention;
[0016] Figure 10 is a front view of one embodiment of a display coupled to a vehicle windshield in accordance with the invention; and
[0017] Figure 11 is a front view of a second embodiment of a display in accordance with the invention.
DETAILED DESCRIPTION
[0018] Referring to Figure 1, one example of a computing system 100 is illustrated. The computing system 100 is presented to show one example of an environment where a system and method in accordance with the invention may be implemented. The computing system 100 may be embodied as a mobile device 100 such as a smart phone or tablet, a desktop computer, a workstation, a server, or the like. The computing system 100 is presented by way of example and is not intended to be limiting. Indeed, the systems and methods disclosed herein may be applicable to a wide variety of different computing systems in addition to the computing system 100 shown. The systems and methods disclosed herein may also potentially be distributed across multiple computing systems 100.
[0019] As shown, the computing system 100 includes at least one processor 102 and may include more than one processor 102. The processor 102 may be operably connected to a memory 104. The memory 104 may include one or more non-volatile storage devices such as hard drives 104a, solid state drives 104a, CD-ROM drives 104a, DVD-ROM drives 104a, tape drives 104a, or the like. The memory 104 may also include non-volatile memory such as a read-only memory 104b (e.g., ROM, EPROM, EEPROM, and/or Flash ROM) or volatile memory such as a random access memory 104c (RAM or operational memory). A bus 106, or plurality of buses 106, may interconnect the processor 102, memory devices 104, and other devices to enable data and/or instructions to pass therebetween.
[0020] To enable communication with external systems or devices, the computing system 100 may include one or more ports 108. Such ports 108 may be embodied as wired ports 108 (e.g., USB ports, serial ports, Firewire ports, SCSI ports, parallel ports, etc.) or wireless ports 108 (e.g., Bluetooth, IrDA, etc.). The ports 108 may enable communication with one or more input devices 110 (e.g., keyboards, mice, touchscreens, cameras, microphones, scanners, storage devices, etc.) and output devices 112 (e.g., displays, monitors, speakers, printers, storage devices, etc.). The ports 108 may also enable communication with other computing systems 100.
[0021] In certain embodiments, the computing system 100 includes a wired or wireless network adapter 114 to connect the computing system 100 to a network 116, such as a LAN, WAN, or the Internet. Such a network 116 may enable the computing system 100 to connect to one or more servers 118, workstations 120, personal computers 120, mobile computing devices, or other devices. The network 116 may also enable the computing system 100 to connect to another network by way of a router 122 or other device 122. Such a router 122 may allow the computing system 100 to communicate with servers, workstations, personal computers, or other devices located on different networks.
[0022] As previously mentioned, autonomous vehicles may relieve a driver from having to assume many of the duties traditionally associated with driving. Sensors used to navigate an autonomous vehicle may assess the external environment more frequently and more accurately than human perception permits, thereby enabling a driver to divert his attention to entertainment and other interests without endangering vehicle occupants. This new technology also provides a unique opportunity for a vehicle windshield to do more than simply provide unobstructed views from the inside to the outside of the vehicle. Vehicle windshield systems in accordance with embodiments of the present invention provide a multi-purpose windshield configured to selectively provide information and entertainment to vehicle occupants while permitting traditional or manual use of the vehicle windshield in an emergency situation.
[0023] As used herein, the term "windshield" refers to any windshield, windscreen, window, or other substantially transparent material mediating the space between the interior and the exterior of a vehicle. The term "vehicle" refers to any autonomous, semi-autonomous, or non-autonomous passenger vehicle, including a heavy-duty industrial or transport vehicle, bus, truck, car, cart, airplane, and the like.
[0024] Referring to Figure 2, a vehicle windshield system 200 in accordance with the invention may utilize various vehicle on-board sensors 202, 204, 206, 208, 210 to sense an external environment of the vehicle and create a virtual reality therefrom, as discussed in more detail below. For example, the vehicle windshield system 200 may include lidar sensors 202, radar sensors 204, camera sensors 206, navigational or GPS sensors 208, and other sensors 210.
[0025] Lidar (Light Detection and Ranging) sensors 202 may be used to gather data by scanning the environment surrounding the vehicle with laser light. In this manner, lidar sensors 202 may determine the exact location of other vehicles and environmental objects and conditions, such as pedestrians, children, animals, roadway signage, and the like, in addition to the roadway surface itself. Lidar sensors 202 may likewise determine relevant features and characteristics of such objects. This data may be received by an internal virtual reality system 216 and input into fusion 212 algorithms to produce a virtual reality for display to a user, as discussed in more detail below.
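
To make the lidar step concrete, the minimal sketch below converts polar range/bearing returns into vehicle-frame points and groups nearby points into object clusters. The scan format, the one-metre gap threshold, and the function names are assumptions for illustration; they are not the sensor interface or detection algorithm of this disclosure.

```python
import math

def scan_to_points(scan):
    """Convert (range_m, bearing_rad) lidar returns to vehicle-frame (x, y) points.

    Assumed input format; a real lidar driver defines its own message layout.
    """
    return [(r * math.cos(a), r * math.sin(a)) for r, a in scan]

def cluster_points(points, gap=1.0):
    """Group consecutive points separated by less than `gap` metres.

    A naive stand-in for the object detection the disclosure leaves unspecified.
    """
    clusters, current = [], []
    for p in points:
        if current and math.dist(current[-1], p) > gap:
            clusters.append(current)
            current = []
        current.append(p)
    if current:
        clusters.append(current)
    return clusters

# Example: three returns on an object ahead and one return far to the side.
scan = [(10.0, 0.00), (10.1, 0.02), (10.2, 0.04), (25.0, 1.20)]
print(cluster_points(scan_to_points(scan)))  # two clusters expected
```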
[0026] Radar sensors 204 may use electromagnetic waves to determine distances between objects. Radar sensors 204 may also be used to determine properties of objects to facilitate object differentiation. Data gathered from radar sensors 204 may also be communicated to the virtual reality system 216 for analysis, fusion, and display.
[0027] Optical or camera sensors 206 may include image acquisition devices such as cameras, charge coupled devices, or the like, to acquire still images or video of the surrounding area. Data from camera sensors 206 may be communicated to the virtual reality system 216, where it may undergo image processing to increase contrast or clarity, for example. The virtual reality system 216 may collect and combine the image data with other sensor data to create a virtual reality for display.
[0028] Navigational system or GPS sensors 208 may include common global positioning system ("GPS") sensors, or other similar sensors known in the art. GPS sensors 208 may be used to detect a location of the vehicle relative to road and map features, including intersections, bridges, buildings, homes, and the like. GPS sensors 208 may also provide pertinent information for display in combination with the virtual reality data gathered from other sensors 202, 204, 206, 208, 210, such as recommended speeds, distances to landmarks or desired locations, etc.
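
One simple item a navigation sensor could contribute to the display is the straight-line distance to a landmark or destination. The haversine sketch below is a generic illustration; the coordinates, landmark, and function name are assumptions, not values from the disclosure.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two WGS-84 coordinates, in kilometres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * 6371.0 * math.asin(math.sqrt(a))  # mean Earth radius of 6371 km

# Hypothetical vehicle position and landmark for illustration.
print(f"{haversine_km(42.30, -83.23, 42.33, -83.05):.1f} km to landmark")
```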
[0029] Other sensors 210 may include, for example, temperature sensors, humidity sensors, ultrasonic sensors, audio sensors, and the like. These other sensors 210 may enhance the accuracy and detail of the virtual reality displayed by a vehicle windshield system 200 in accordance with embodiments of the invention.
[0030] In usual operation, the various sensors 202, 204, 206, 208, 210 may work together to sense conditions of the surrounding environment, thereby enabling safe navigation of the autonomous vehicle without human intervention. According to certain embodiments of the present invention, these sensors 202, 204, 206, 208, 210 may also collect data to provide an augmented or virtual reality to vehicle occupants.
[0031] Data from the sensors 202, 204, 206, 208, 210 may be received by a virtual reality system 216. The virtual reality system 216 may analyze and fuse 212 the data gathered from the various sensors 202, 204, 206, 208, 210 to create a virtual or augmented reality. This information may be projected onto a vehicle windshield, or may be displayed on a monitor or other display device 220 associated with the vehicle windshield, as discussed in more detail below.
[0032] In certain embodiments, the virtual reality system 216 may perform one or more fusion 212 algorithms, using the data from sensors 202, 204, 206, 208, 210 as input. These fusion 212 algorithms may provide object tracking 212 to display pertinent information associated with objects in the surrounding environment, such as speeds, locations, or distances associated with other vehicles on the road. Object tracking 212 may also provide information regarding speeds, locations, or distances of objects not normally within the line of sight of a driver, such as a vehicle in the driver's blind spot, an approaching animal, or children playing near or behind the vehicle. Fusion 212 algorithms may be performed repeatedly to provide updated information to vehicle occupants in real time.
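
The disclosure does not specify a particular fusion algorithm, so the following is only a minimal sketch of the idea: combine independent range estimates (for example, lidar and radar) for one tracked object by variance weighting, and derive a closing speed from successive updates. The variances, time step, and class names are illustrative assumptions.

```python
def fuse_range(estimates):
    """Variance-weighted average of independent range estimates.

    `estimates` is a list of (range_m, variance) pairs, e.g. one from lidar
    and one from radar. A placeholder for the fusion 212 algorithms of Figure 2.
    """
    weights = [1.0 / var for _, var in estimates]
    return sum(w * r for (r, _), w in zip(estimates, weights)) / sum(weights)

class Track:
    """Minimal track of one leading vehicle: fused range and closing speed."""
    def __init__(self):
        self.range_m = None

    def update(self, lidar_m, radar_m, dt):
        fused = fuse_range([(lidar_m, 0.05), (radar_m, 0.25)])  # assumed variances
        closing = 0.0 if self.range_m is None else (self.range_m - fused) / dt
        self.range_m = fused
        return fused, closing

track = Track()
print(track.update(lidar_m=32.1, radar_m=32.6, dt=0.1))
print(track.update(lidar_m=31.6, radar_m=32.0, dt=0.1))  # positive closing speed
```

Running such an update on every sensor cycle is one way the system could refresh the displayed speeds and distances in real time.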
[0033] The virtual reality system 216 may utilize the information produced from the fusion 212 algorithms to substantially reconstruct 214 the driving environment. For example, the virtual reality system 216 may use image data from the camera sensor 206 and correlate it with location data from the GPS sensors 208 to accurately depict distances of surrounding objects relative to the vehicle. Similarly, the virtual reality system 216 may fuse information from the lidar sensors 202, radar sensors 204, and other sensors 210 to indicate the presence of objects not captured by the camera sensors 206. In this manner, a vehicle windshield system 200 in accordance with the present invention may provide a more accurate and complete picture of the surrounding environment than human perception permits.
[0034] The virtual reality thus created by the virtual reality system 216 may be selectively displayed on a monitor or other display device 220 associated with a vehicle windshield. In some embodiments, the virtual reality may be displayed or projected onto the windshield itself.
[0035] In some embodiments, an entertainment system 222 may also communicate with the vehicle windshield display 220. A human/machine interface 218 may permit a user to selectively display the virtual reality environment created by the virtual reality system 216, information produced by the virtual reality system 216, entertainment provided by the entertainment system 222, or a combination thereof. In one embodiment, the human/machine interface 218 may allow the user to selectively inactivate the display 220. In another embodiment, as discussed in more detail below, the human/machine interface 218 may permit a user to selectively retract the display 220 to allow full or partial visibility through the windshield.
[0036] The human/machine interface 218 may be implemented as a touchscreen incorporated into the display 220, a separate touch panel incorporated into the dashboard, buttons on the dashboard, a keyboard, an audio processing system that accepts voice commands, or any other input device known to those in the art.
[0037] In some embodiments, the display 220 may overlay information from the virtual reality system 216 onto objects visible through a transparent or substantially transparent windshield, thereby creating an augmented reality. In other embodiments, the virtual reality created by the virtual reality system 216 may be entirely reproduced onto the display 220. In one embodiment, the virtual reality system 216 and the entertainment system 222 may cooperate to produce augmented reality-type entertainment or games for display.
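
The human/machine interface of paragraphs [0035] through [0037] amounts to selecting what the display 220 presents. The sketch below is one assumed policy, not the claimed interface: it honors the occupant's choice except that an emergency flag forces driving information over entertainment, echoing the prioritization described in the background.

```python
from enum import Enum

class Content(Enum):
    VIRTUAL_REALITY = "virtual reality"
    INFO_OVERLAY = "info overlay"
    ENTERTAINMENT = "entertainment"
    OFF = "off"

def select_content(requested: Content, emergency: bool) -> Content:
    """Honor the occupant's request unless an emergency demands driving info.

    The exact override policy is an assumption made for this illustration.
    """
    if emergency and requested in (Content.ENTERTAINMENT, Content.OFF):
        return Content.INFO_OVERLAY
    return requested

print(select_content(Content.ENTERTAINMENT, emergency=False))  # Content.ENTERTAINMENT
print(select_content(Content.ENTERTAINMENT, emergency=True))   # Content.INFO_OVERLAY
```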
[0038] Referring now to Figure 3, a vehicle windshield system 200 in accordance with the invention includes a solar panel 302 coupled to an exterior or interior surface of a vehicle windshield. The solar panel 302 may receive solar or light energy and convert it to electricity. The electricity may be communicated to a vehicle power source, such as a battery 306. This energy may then be used to drive the virtual reality system 216, entertainment system 222, and/or sensor system 308, in addition to any other vehicle systems known to those in the art. In this manner, embodiments of the present invention may conserve and efficiently utilize energy.
[0039] As shown, light or solar energy may be received by one or more solar panels 302 associated with a vehicle windshield. The solar panel 302 may be a unitary panel, or may include one or more sub-panels. The received energy may then be transferred to a DC/DC converter 304 or other similar device. The DC/DC converter 304 may boost the energy voltage and stabilize the current as needed to charge the battery 306. In this manner, embodiments of the present invention optimize fuel efficiency by using solar energy, rather than fuel energy, to power the battery 306.
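
For a rough sense of what the DC/DC conversion step involves, the sketch below computes an ideal boost-converter duty cycle and the resulting charge current for assumed panel and battery values. All electrical figures, the efficiency, and the function name are illustrative; real converters would use maximum power point tracking and closed-loop current control.

```python
def boost_operating_point(panel_v, panel_w, battery_v, efficiency=0.92):
    """Ideal-boost duty cycle and charge current for a given panel output.

    Illustrative only; values are assumptions, not taken from the disclosure.
    """
    duty = max(0.0, 1.0 - panel_v / battery_v)   # ideal boost: Vout = Vin / (1 - D)
    charge_a = panel_w * efficiency / battery_v  # power in, minus conversion loss
    return duty, charge_a

duty, amps = boost_operating_point(panel_v=6.0, panel_w=45.0, battery_v=12.6)
print(f"duty cycle ~ {duty:.2f}, charge current ~ {amps:.2f} A")
```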
[0040] The battery 306 may provide power to the entertainment system 222, the virtual reality system 216, and/or the sensor system 308. The sensor system 308 may include any or all of the vehicle sensors 202, 204, 206, 208, 210 used to gather data for use by the virtual reality system 216.
[0041] Referring now to Figure 4, in certain cases, one or more components of the vehicle windshield system 200 may cease to function correctly. Such a malfunction may adversely affect the autonomous driving capability of the vehicle. In such cases the vehicle windshield system 200 may transition between various modes of operation to reflect the vehicle's current autonomous driving capability.
[0042] One embodiment of a method 400 describing transitions between the various modes is illustrated in Figure 4. According to the method 400, a determination may be made as to whether the various sensors 202, 204, 206, 208, 210 are functioning 402 correctly. If yes, the vehicle windshield system 200 may operate in normal mode 404. In this mode 404, all capabilities of the vehicle windshield system 200 are enabled and solar panels 302 and other components associated with the vehicle windshield system 200 may obstruct visibility through the windshield without compromising the safety of vehicle occupants.
[0043] If sensors are not functioning 402 correctly, the method 400 may determine whether autonomous driving is impaired 406. If not, the vehicle windshield system 200 may operate in limp mode 408, where one or more solar panels 302 allow at least partial visibility through the windshield, thereby allowing semi-autonomous driving with driver intervention. In one embodiment, as discussed in more detail below, limp mode 408 may include rotating one or more solar sub-panels approximately ninety degrees such that the sub-panels are positioned substantially perpendicularly or vertically relative to the vehicle windshield. In this manner, limp mode 408 may allow at least partial driver visibility through the windshield.
[0044] If autonomous driving is impaired, the vehicle windshield system 200 may operate in manual mode 410. In this mode 410, the solar panels 302 may at least partially retract from the windshield and the features of the vehicle windshield system 200, including the virtual reality system 216 and the entertainment system 222, may be disabled to allow the driver to assume full operation of the vehicle.
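A minimal sketch of the mode selection in method 400, under the assumption that the two checks reduce to boolean flags, might look as follows; the enum and function names are illustrative, not part of the disclosure.

# Hypothetical sketch of the mode transitions of method 400.
from enum import Enum, auto

class Mode(Enum):
    NORMAL = auto()  # all features enabled; panels may obstruct the windshield
    LIMP = auto()    # sub-panels repositioned to restore partial visibility
    MANUAL = auto()  # panels retracted, VR/entertainment disabled, driver takes over

def select_mode(sensors_ok: bool, autonomous_driving_impaired: bool) -> Mode:
    if sensors_ok:
        return Mode.NORMAL
    if not autonomous_driving_impaired:
        return Mode.LIMP
    return Mode.MANUAL

# A sensor fault that does not impair autonomous driving selects limp mode.
assert select_mode(sensors_ok=False, autonomous_driving_impaired=False) is Mode.LIMP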
[0045] Figure 5 depicts one embodiment of a vehicle 500 equipped with a vehicle windshield system 200 in accordance with the present invention. As described above, the vehicle windshield system 200 may include at least one solar panel 302 coupled to an interior or exterior of the vehicle windshield 502. The solar panel 302 may include one or more sub-panels 506a, 506b to receive solar energy and communicate the energy to a vehicle power source. Ideally, the sub-panels 506a, 506b may be positioned substantially parallel to the surface of the vehicle windshield 502. In this manner the sub-panels 506a, 506b may occupy a substantial majority of the surface area of the vehicle windshield 502 to maximize the amount of solar energy received.
[0046] In some embodiments, the sub-panels 506a, 506b may be independently operable and/or movable. In other embodiments, the sub-panels 506a, 506b may operate and/or move in sections, or as an integrated unit. In certain embodiments, as discussed in more detail below, the sub-panels 506a, 506b may selectively retract or rotate to allow at least partial visibility through the vehicle windshield 502.
[0047] For example, as shown in Figure 6, each or any of the sub-panels 506a, 506b may be configured to selectively rotate on its vertical axis to provide greater visibility to a driver through the vehicle windshield 502. In one embodiment, each of the sub-panels 506a, 506b rotates approximately ninety degrees to position the sub-panels 506a, 506b substantially perpendicular relative to the interior or exterior surface of the vehicle windshield 502. This rotation may maximize spacing between adjacent sub-panels 506a, 506b to expose a maximum surface area of the vehicle windshield 502 to a vehicle 500 driver.
[0048] By altering the positioning of solar sub-panels 506a, 506b in this manner, a process referred to herein as limp mode 408, the visibility through the vehicle windshield 502 may be sufficiently increased to enable the driver to intervene or assist with operation of the autonomous vehicle 500 if, for example, one or more sensors 202, 204, 206, 208, 210 fail to function properly. Under such circumstances, driver intervention may be necessary to maintain vehicle 500 safety.

[0049] In some embodiments, the human/machine interface 218 may enable the driver or other user to selectively implement this capability by manual or voice selection of the limp mode 408 option. In other embodiments, the sub-panels 506a, 506b may automatically assume limp mode 408 positioning when the vehicle windshield system 200 detects a malfunction in one or more sensors 202, 204, 206, 208, 210.
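The limp-mode positioning described in paragraphs [0048] and [0049] could be expressed roughly as below; the actuator interface and the exact trigger conditions are assumptions for illustration.

# Illustrative limp-mode positioning: rotate each sub-panel roughly ninety degrees,
# triggered either by the occupant (manual or voice selection) or automatically on
# a detected sensor malfunction. The SubPanel interface is hypothetical.
LIMP_ANGLE_DEG = 90.0

class SubPanel:
    def __init__(self):
        self.angle_deg = 0.0  # 0 degrees = lying substantially flat against the windshield

    def rotate_to(self, angle_deg: float) -> None:
        self.angle_deg = angle_deg

def enter_limp_mode(sub_panels, requested_by_user: bool, sensor_fault_detected: bool) -> bool:
    if requested_by_user or sensor_fault_detected:
        for sub_panel in sub_panels:
            sub_panel.rotate_to(LIMP_ANGLE_DEG)
        return True
    return False

panels = [SubPanel() for _ in range(6)]
enter_limp_mode(panels, requested_by_user=False, sensor_fault_detected=True)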
[0050] Referring now to Figure 7, some embodiments of a vehicle windshield system 200 in accordance with the invention may selectively retract the sub-panels 506a, 506b to optimize visibility through the vehicle windshield 502. For example, as discussed with reference to Figure 6 above, solar sub-panels 506a, 506b may automatically or selectively rotate from a horizontal position substantially adjacent to the interior or exterior of the vehicle windshield 502 surface, to a vertical position substantially perpendicular to the windshield 502 surface.
[0051] The sub-panels 506a, 506b may then be retracted by increasing the distance between at least two adjacent sub-panels 506a, 506b by moving the adjacent sub-panels 506a, 506b in opposite directions 700a, 700b. In this manner, the sub-panels 506a, 506b may be substantially stacked on either side of the vehicle windshield 502, similar to vertical blinds. In another embodiment, the sub-panels 506a, 506b may retract so as to create one or more stacks of sub-panels 506a, 506b on the vehicle windshield 502 surface. In still another embodiment, adjacent solar sub-panels 506a, 506b may slide over one another substantially horizontally, creating one or more overlaid stacks of sub-panels 506a, 506b on the vehicle windshield 502 surface. In any case, retracting the sub-panels 506a, 506b may expose one or more large areas of the vehicle windshield 502 to provide unobstructed visibility to a vehicle 500 driver.

[0052] Referring now to Figures 8 and 9, in one embodiment, the solar panel 302 may comprise a movable screen 800 positioned substantially adjacent to an exterior surface of the vehicle windshield 502. Particularly, an outside surface of the screen 800, i.e., facing the environment exterior to the vehicle 500, may be covered with one or more solar panels 302. These solar panels 302 may convert sunlight passing through the vehicle windshield 502 to electricity to power other vehicle windshield system 200 components, as discussed above. An inside surface of the screen 800, i.e., facing an interior of the vehicle 500 and vehicle 500 occupants, may be used as a display for the virtual reality system 216 or the entertainment system 222.
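As one way to picture the vertical-blind retraction of paragraph [0051] above, the sketch below slides adjacent sub-panels in opposite directions until they stack near either edge of the windshield. The geometry, panel count, and dimensions are hypothetical.

# Hypothetical geometry for the opposite-direction retraction of paragraph [0051]:
# even-indexed sub-panels stack at the left edge, odd-indexed ones at the right edge.
def retracted_positions(num_panels: int, panel_width: float, windshield_width: float):
    positions = []
    for index in range(num_panels):
        if index % 2 == 0:
            positions.append((index // 2) * panel_width)                          # stack left
        else:
            positions.append(windshield_width - ((index // 2) + 1) * panel_width)  # stack right
    return positions

# Four 0.1 m sub-panels on a 1.5 m wide windshield leave the centre span clear:
print(retracted_positions(num_panels=4, panel_width=0.1, windshield_width=1.5))
# -> [0.0, 1.4, 0.1, 1.3]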
[0053] In contrast to the embodiments of Figure 7 where the solar sub-panels 506a, 506b retract in a vertical direction 700a, 700b, the screen 800 shown in Figures 8 and 9 may be selectively retracted in a horizontal direction 900 into a space 802 between the vehicle windshield 502 and the hood of the vehicle 500, for example. In another embodiment, the solar panel 302 may be located on an interior of the vehicle windshield 502 and may retract into a space 802 or slot integrated into a dashboard of the vehicle 500. In other embodiments, the screen 800 may be integrated within the vehicle windshield 502, such as between multiple panes of a vehicle windshield 502.
[0054] As mentioned previously, in certain embodiments, the inside surface of the solar panel 302 (facing vehicle 500 occupants) may act as a display for the virtual reality system 216 or entertainment system 222. Alternatively, a separate mechanism may be provided for the display. In some embodiments, for example, electrically switchable smart glass or non-electrical smart glass may be incorporated into the vehicle windshield 502. This may allow automatic or selective variation in the opacity of the vehicle windshield 502 to enable use of the vehicle windshield 502 itself as the display.

[0055] In other embodiments, the solar panel 302 may be substantially transparent. Thus, in one embodiment, the solar panel 302 may be integrated into the vehicle windshield 502 itself, while the screen 800 may selectively occupy an interior surface of the vehicle windshield 502 to provide a display for the virtual reality system 216 and/or entertainment system 222. In this embodiment, adjusting the opacity of the screen 800 may facilitate the ability of the solar panel 302 to convert solar energy to electricity.
[0056] Referring now to Figures 10 and 11, a display 220 may be associated with the interior or exterior surface of a vehicle windshield 502. The display 220 may present information or a virtual driving environment created by the virtual reality system 216, or may display entertainment from the entertainment system 222.
[0057] The display 220 may be coupled to an interior or exterior surface of the vehicle windshield 502. As shown in Figure 10, in some embodiments, the display 220 may comprise the interior surface of the vehicle windshield 502 itself. In other embodiments, such as that shown in Figure 11, the display 220 may comprise a separate material, screen, or device coupled to the interior surface of the vehicle windshield 502.
[0058] Like the solar panel 302 discussed above, the display 220 may selectively retract into a slot or other space in the vehicle 500 hood or dashboard. Also like some embodiments of the solar panel 302, the display 220 may include more than one sub-panel configured to retract vertically or horizontally such that the sub-panels become stacked. In one embodiment, the back surface of the solar panel 302 may be used as the display 220.
[0059] In certain embodiments, as discussed above, a dual-paned or multiple-paned vehicle windshield 502 may be modified such that the vehicle windshield 502 may transition from transparent to opaque. In this manner, the vehicle windshield 502 may act as a display 220 under certain circumstances. Transitioning the vehicle windshield 502 opacity in this manner may be accomplished automatically, or may be manually selected. This may enable the vehicle windshield system 200 to present either an augmented reality or a virtual reality on the display 220.
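A minimal sketch of this opacity switching, assuming a controllable smart-glass layer and ignoring hardware details, might look like the following; the class and mode names are illustrative and not taken from the disclosure.

# Hypothetical smart-glass control: transparent for an augmented-reality overlay,
# opaque when the windshield itself serves as a full virtual-reality display.
class SmartGlassWindshield:
    def __init__(self):
        self.opacity = 0.0  # 0.0 = fully transparent, 1.0 = fully opaque

    def set_display_mode(self, mode: str, manual_opacity: float | None = None) -> None:
        if manual_opacity is not None:
            self.opacity = manual_opacity  # occupant-selected setting
        elif mode == "augmented_reality":
            self.opacity = 0.0             # overlay information on the real scene
        elif mode == "virtual_reality":
            self.opacity = 1.0             # present a fully reconstructed environment
        else:
            raise ValueError(f"unknown display mode: {mode}")

windshield = SmartGlassWindshield()
windshield.set_display_mode("virtual_reality")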
[0060] In embodiments of the vehicle windshield system 200 displaying an augmented reality, information and virtual elements created by the virtual reality system 216 may be overlaid onto real-world elements visible through the vehicle windshield 502. For example, as shown, a second vehicle 1004 traveling ahead of the occupied vehicle 500 may be visible through the vehicle windshield 502. In other embodiments, the vehicle windshield system 200 may display a reconstructed driving environment, or virtual reality, based on the data received and fused from the various sensors 202, 204, 206, 208, 210. In this embodiment, for example, the virtual reality system 216 may generate an image of the second vehicle 1004 traveling ahead of the occupied vehicle 500, along with other real-world elements and conditions of the external environment such as the road, trees, mountains, and the like. The virtual reality thus created may be presented on the display 220.
[0061] In either case, based on input from the various sensors 202, 204, 206, 208, 210, the virtual reality system 216 may display indicators 1000 in combination with the virtual or augmented reality to provide information and warnings to vehicle 500 occupants. For example, some indicators 1000 may warn that there is insufficient space or distance between the vehicle and the vehicle ahead. These indicators 1000 may change color, from green to yellow to red for example, to indicate the severity of the situation and prompt driver intervention in some circumstances. Indicators 1000 may also be displayed to warn a driver or vehicle occupants of dangerous or emergency circumstances, such as pedestrian traffic, children playing, animals approaching, and the like.

[0062] Similarly, the virtual reality system 216 may display arrows 1002 or other symbols indicating a suggested course of action. For example, as shown in Figures 10 and 11, the virtual reality system 216 may utilize data from GPS sensors 208 and other sensors 202, 204, 206, 208, 210 to suggest a lane change or exit based on an intended destination or current traffic conditions.
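For the following-distance indicator, a simple threshold rule illustrates how severity might map to the green, yellow, and red colors mentioned above. The time-gap metric and thresholds are assumptions; the disclosure only states that indicator color reflects severity.

# Hypothetical severity rule for the following-distance indicator 1000.
def following_distance_indicator(gap_m: float, speed_m_per_s: float) -> str:
    time_gap_s = gap_m / max(speed_m_per_s, 0.1)  # avoid division by zero when stopped
    if time_gap_s >= 2.0:
        return "green"   # adequate spacing
    if time_gap_s >= 1.0:
        return "yellow"  # spacing shrinking; warn the occupants
    return "red"         # insufficient spacing; may prompt driver intervention

print(following_distance_indicator(gap_m=12.0, speed_m_per_s=20.0))  # -> "red"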
[0063] General information generated by the virtual reality system 216 may also be provided at various locations on the display 220, such as in proximity to an affected object, or in a corner of the display 220. Such information may include, for example, virtual fuel gauges, temperature gauges, tire pressure gauges, radio settings, vehicle service notifications, and the like.
[0064] The display 220 may also be used to present entertainment to vehicle occupants via the entertainment system 222. Such entertainment may include, for example, movies, television, radio, video games, and the like. Under emergency circumstances, the entertainment system 222 may be automatically disabled in favor of information generated by the virtual reality system 216, or the display 220 may be completely disabled and retracted as discussed above to facilitate driver intervention and effectively safeguard vehicle occupants.
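The emergency behavior described above might reduce to a small arbitration rule like the one below; the function and argument names are assumptions for illustration.

# Hypothetical arbitration: in an emergency, entertainment yields to warnings from
# the virtual reality system, or the display is disabled and retracted entirely.
def choose_display_content(emergency: bool, display_available: bool,
                           vr_warning: str, entertainment_feed: str) -> str:
    if not emergency:
        return entertainment_feed
    if display_available:
        return vr_warning           # pre-empt entertainment with warning content
    return "DISPLAY_RETRACTED"      # disable and retract to restore driver visibility

print(choose_display_content(True, True, "Pedestrian crossing ahead", "movie"))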
[0065] In the above disclosure, reference has been made to the accompanying drawings, which form a part hereof, and in which is shown by way of illustration specific implementations in which the disclosure may be practiced. It is understood that other implementations may be utilized and structural changes may be made without departing from the scope of the present disclosure. References in the specification to "one embodiment," "an embodiment," "an example embodiment," etc., indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the knowledge of one skilled in the art to effect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.
[0066] Implementations of the systems, devices, and methods disclosed herein may comprise or utilize a special purpose or general-purpose computer including computer hardware, such as, for example, one or more processors and system memory, as discussed herein. Implementations within the scope of the present disclosure may also include physical and other computer-readable media for carrying or storing computer-executable instructions and/or data structures. Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer system. Computer-readable media that store computer-executable instructions are computer storage media (devices). Computer-readable media that carry computer-executable instructions are transmission media. Thus, by way of example, and not limitation, implementations of the disclosure can comprise at least two distinctly different kinds of computer-readable media: computer storage media (devices) and transmission media.
[0067] Computer storage media (devices) includes RAM, ROM, EEPROM, CD-ROM, solid state drives ("SSDs") (e.g., based on RAM), Flash memory, phase-change memory ("PCM"), other types of memory, other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer.

[0068] An implementation of the devices, systems, and methods disclosed herein may communicate over a computer network. A "network" is defined as one or more data links that enable the transport of electronic data between computer systems and/or modules and/or other electronic devices. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired or wireless) to a computer, the computer properly views the connection as a transmission medium. Transmission media can include a network and/or data links, which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above should also be included within the scope of computer-readable media.
[0069] Computer-executable instructions comprise, for example, instructions and data which, when executed at a processor, cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions. The computer-executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code. Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the features or acts described above. Rather, the described features and acts are disclosed as example forms of implementing the claims.
[0070] Those skilled in the art will appreciate that the disclosure may be practiced in network computing environments with many types of computer system configurations, including an in-dash vehicle computer, personal computers, desktop computers, laptop computers, message processors, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, tablets, pagers, routers, switches, various storage devices, and the like. The disclosure may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by a combination of hardwired and wireless data links) through a network, both perform tasks. In a distributed system environment, program modules may be located in both local and remote memory storage devices.
[0071] Further, where appropriate, functions described herein can be performed in one or more of: hardware, software, firmware, digital components, or analog components. For example, one or more application specific integrated circuits (ASICs) can be programmed to carry out one or more of the systems and procedures described herein. Certain terms are used throughout the description and claims to refer to particular system components. As one skilled in the art will appreciate, components may be referred to by different names. This document does not intend to distinguish between components that differ in name, but not function.
[0072] It should be noted that the sensor embodiments discussed above may comprise computer hardware, software, firmware, or any combination thereof to perform at least a portion of their functions. For example, a sensor may include computer code configured to be executed in one or more processors, and may include hardware logic/electrical circuitry controlled by the computer code. These example devices are provided herein for purposes of illustration, and are not intended to be limiting. Embodiments of the present disclosure may be implemented in further types of devices, as would be known to persons skilled in the relevant art(s).

[0073] At least some embodiments of the disclosure have been directed to computer program products comprising such logic (e.g., in the form of software) stored on any computer useable medium. Such software, when executed in one or more data processing devices, causes a device to operate as described herein.
[0074] While various embodiments of the present disclosure have been described above, it should be understood that they have been presented by way of example only, and not limitation. It will be apparent to persons skilled in the relevant art that various changes in form and detail can be made therein without departing from the spirit and scope of the disclosure. Thus, the breadth and scope of the present disclosure should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents. The foregoing description has been presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the disclosure to the precise form disclosed. Many modifications and variations are possible in light of the above teaching. Further, it should be noted that any or all of the aforementioned alternate implementations may be used in any combination desired to form additional hybrid implementations of the disclosure.

Claims

1. A system comprising:
a windshield coupled to an autonomous vehicle;
a solar panel coupled to the windshield and configured to convert light energy to electricity and to communicate the electricity to a vehicle power source, the solar panel configured to selectively allow and disallow visibility through the windshield; and
a display coupled to the windshield and utilizing the electricity from the vehicle power source to display at least one of a virtual reality and entertainment.
2. The system of claim 1, further comprising a virtual reality system communicating with the display and utilizing electricity from the vehicle power source to gather information for presentation on the display.
3. The system of claim 1, wherein the display is configured to present at least one of vehicle information, road information, environmental information, emergency information, augmented reality, virtual reality, and entertainment.
4. The system of claim 2, wherein the virtual reality system comprises at least one sensor coupled to a vehicle, the at least one sensor configured to obtain environmental information from an external environment to present on the display.
5. The system of claim 4, wherein the at least one sensor is selected from the group consisting of an optical sensor, a thermal sensor, a video sensor, an audio sensor, an ultrasonic sensor, a radar sensor, and a Lidar sensor.
6. The system of claim 1, wherein the solar panel is configured to retract from the windshield to selectively allow and disallow visibility therethrough.
7. The system of claim 6, wherein the solar panel comprises at least one sub-panel configured to rotate from a first position to a second position.
8. The system of claim 7, wherein the first position comprises a substantially adjacent orientation of the at least one sub-panel relative to the windshield, and wherein the second position comprises a substantially perpendicular orientation of the at least one sub-panel relative to the windshield.
9. The system of claim 8, wherein a first sub-panel is configured to retract in a first direction relative to the windshield, and wherein a second sub-panel is configured to retract in a second direction relative to the windshield, and wherein the first direction is substantially opposite the second direction to allow visibility through the windshield.
10. The system of claim 1, further comprising an entertainment system communicating with the display.
11. A method comprising:
converting, via a solar panel coupled to a windshield, light energy to electricity, wherein the windshield is coupled to an autonomous vehicle;
selectively allowing and disallowing, via the solar panel, visibility through the windshield;
storing the electricity in a vehicle power source; and
utilizing the electricity to present on a display coupled to the windshield at least one of a virtual reality and entertainment.
12. The method of claim 11, further comprising gathering, via a virtual reality system, information for presentation on the display.
13. The method of claim 12, wherein the display is configured to present at least one of vehicle information, road information, environmental information, emergency information, augmented reality, virtual reality, and entertainment.
14. The method of claim 12, wherein the virtual reality system comprises at least one sensor coupled to a vehicle, the at least one sensor configured to obtain environmental information from an external environment to present on the display.
15. The method of claim 14, wherein the at least one sensor is selected from the group consisting of an optical sensor, a thermal sensor, a video sensor, an audio sensor, an ultrasonic sensor, a radar sensor, and a Lidar sensor.
16. The method of claim 11, wherein selectively allowing and disallowing visibility through the windshield comprises selectively retracting the solar panel from the windshield.
17. The method of claim 16, wherein selectively retracting the solar panel comprises rotating at least one sub-panel from a first position to a second position.
18. The method of claim 17, wherein the first position comprises a substantially adjacent orientation of the at least one sub-panel relative to the windshield, and wherein the second position comprises a substantially perpendicular orientation of the at least one sub-panel relative to the windshield.
19. The method of claim 18, wherein selectively retracting the solar panel further comprises retracting a first sub-panel in a first direction relative to the windshield and retracting a second sub-panel in a second direction relative to the windshield, and wherein the first direction is substantially opposite the second direction to allow visibility through the windshield.
20. The method of claim 11, wherein at least one of the solar panel and the display is selectively transparent.
PCT/US2016/053069 2016-09-22 2016-09-22 Solar-powered, virtual-reality windshield WO2018056981A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2016/053069 WO2018056981A1 (en) 2016-09-22 2016-09-22 Solar-powered, virtual-reality windshield

Publications (1)

Publication Number Publication Date
WO2018056981A1 true WO2018056981A1 (en) 2018-03-29

Family

ID=61689682

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2016/053069 WO2018056981A1 (en) 2016-09-22 2016-09-22 Solar-powered, virtual-reality windshield

Country Status (1)

Country Link
WO (1) WO2018056981A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109083548A (en) * 2018-07-06 2018-12-25 杭州涂鸦信息技术有限公司 A kind of automatically controlled photovoltaic vehicle window
WO2020035380A1 (en) * 2018-08-13 2020-02-20 Audi Ag Operating vr glasses in autonomously driving vehicle
US10712791B1 (en) 2019-09-13 2020-07-14 Microsoft Technology Licensing, Llc Photovoltaic powered thermal management for wearable electronic devices

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7413233B1 (en) * 2007-08-28 2008-08-19 Man-Young Jung Vehicle sun visor with auto-shading prompter screen
US20080238723A1 (en) * 2007-03-28 2008-10-02 Fein Gene S Digital Windshield Information System Employing a Recommendation Engine Keyed to a Map Database System
US20120224062A1 (en) * 2009-08-07 2012-09-06 Light Blue Optics Ltd Head up displays
US20140070932A1 (en) * 2012-09-07 2014-03-13 Ford Global Technologies, Llc Wirelessly controlled heads-up display

Similar Documents

Publication Publication Date Title
CN113195329B (en) Redundant hardware system for autonomous vehicles
KR102046468B1 (en) Side mirror for vehicle
US11749114B1 (en) Occupant facing vehicle display
EP3127771B1 (en) Driver assistance apparatus and vehicle including the same
KR101741433B1 (en) Driver assistance apparatus and control method for the same
KR101631963B1 (en) Head up display device and vehicle having the same
US11048105B1 (en) Visor-like tablet and tablet holder for automotive vehicle
KR20170048781A (en) Augmented reality providing apparatus for vehicle and control method for the same
WO2018056981A1 (en) Solar-powered, virtual-reality windshield
CN108791062A (en) Dynamic information system and operating method
EP4113486A1 (en) Image processing device, display system, image processing method, and recording medium
KR101822896B1 (en) Driver assistance apparatus and control method for the same
JP6127391B2 (en) Vehicle periphery visual recognition device
KR20210053382A (en) Vision head up display system with deep neural networks vision technology
WO2023102915A1 (en) Image display method and device
WO2017195693A1 (en) Image display device
CN114202965B (en) Driving assistance method and device, vehicle-mounted terminal and storage medium
KR101631964B1 (en) Head up display device and vehicle having the same
EP4236304A1 (en) Camera module, information processing system, information processing method, and information processing device
CN214874514U (en) Driving auxiliary driving system with car body panoramic warning function
EP4112387A1 (en) Image processing device, display system, image processing method, and recording medium
US20240034237A1 (en) Vehicle having no wiper
JP7259377B2 (en) VEHICLE DISPLAY DEVICE, VEHICLE, DISPLAY METHOD AND PROGRAM
US20240119873A1 (en) Vehicular driving assist system with head up display
US20230410531A1 (en) Determining Correctness of Image Data of Camera System

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16916950

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16916950

Country of ref document: EP

Kind code of ref document: A1