CN115047626A - AR smart helmet - Google Patents

AR smart helmet

Info

Publication number
CN115047626A
Authority
CN
China
Prior art keywords
helmet
module
user
information
intelligent
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210680312.6A
Other languages
Chinese (zh)
Inventor
张宇 (Zhang Yu)
王鹏 (Wang Peng)
李宏伟 (Li Hongwei)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Shilimited Innovation Technology Co ltd
Original Assignee
Beijing Shilimited Innovation Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Shilimited Innovation Technology Co ltd
Priority to CN202210680312.6A
Publication of CN115047626A
Legal status: Pending

Classifications

    • G02B27/017: Head-up displays; head mounted (Optics; optical elements, systems or apparatus)
    • G02B27/0172: Head mounted, characterised by optical features
    • A42B3/0406: Accessories for helmets (Headwear; helmets; parts, details or accessories of helmets)
    • A42B3/042: Optical devices (accessories for helmets)
    • A42B3/0453: Signalling devices, e.g. auxiliary brake or indicator lights (detecting, signalling or lighting devices for helmets)
    • G01C21/00: Navigation; navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/16: Navigation by integrating acceleration or speed, i.e. inertial navigation (dead reckoning executed aboard the object being navigated)
    • G01D21/02: Measuring two or more variables by means not covered by a single other subclass
    • G01J1/4204: Photometry using electric radiation detectors, with determination of ambient light
    • G04G11/00: Electronic time-pieces; producing optical signals at preselected times

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Optics & Photonics (AREA)
  • Automation & Control Theory (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Sustainable Development (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Helmets And Other Head Coverings (AREA)

Abstract

The application discloses an AR smart helmet comprising an AR display module, a position indicator lamp, an ambient light detection module and a time determination module. The AR display module is used for displaying navigation information, riding posture information, meteorological information and helmet system function information. The ambient light detection module is used for detecting the ambient light intensity and for triggering the position indicator lamp to light up when the ambient light intensity is lower than a preset light intensity threshold. The time determination module is used for triggering the position indicator lamp to light up when the current time falls within a preset period, so as to indicate the wearer's position to the outside. The helmet frees the user from operating a mobile phone, leaving both hands free and eliminating the danger of looking down at a phone while riding or driving, which improves safety during use. Content such as riding or driving navigation, user interaction information and notifications can be projected into the user's view through AR display technology, removing the limitations of purely voice-based interaction.

Description

AR smart helmet
Technical Field
The application relates to the technical field of electronic equipment, and in particular to an AR smart helmet.
Background
With the popularization of electric bicycles, motorcycles and shared bicycles, the growth of the food-delivery industry, and the requirements of laws and regulations, the helmet has become an indispensable piece of riding equipment. In the related art, a conventional helmet serves only as basic protective gear and lacks any means of information interaction with the user. Taking cycling as an example, the user usually has to add accessories such as a phone mount to the bicycle and complete interactions such as navigation, message prompts and voice calls on a mobile phone. The user therefore still has to operate the phone while riding, which easily leads to accidents; the phone screen shows little content and is hard to read while riding; and stopping the bicycle to operate the phone degrades the riding experience.
In addition, although the user can handle complex information interaction by voice, the efficiency and accuracy of such communication are very limited, especially when several tasks are handled in parallel: voice is a time-shared form of input and output, so the user can only listen to one stream at a time, and from the perspective of information theory the entropy of voice is low. Moreover, listening to voice information while riding requires suppressing the sound of the surrounding environment, and a rider who cannot perceive ambient sound is more prone to accidents.
It is noted that the information disclosed in this background section is provided only for understanding the background of the concepts of the application and may therefore contain information that does not form part of the prior art.
Disclosure of Invention
The present application is directed to solving, at least to some extent, one of the technical problems in the related art. A first purpose of this application is therefore to provide an AR smart helmet that eliminates the danger of operating a mobile phone while riding a bicycle or driving an electric bicycle, projects content such as riding navigation, user interaction information and notifications in front of the user through AR display technology, removes the limitations of purely voice-based interaction, and improves safety during use.
To achieve the above object, an embodiment of the present application provides an AR smart helmet, comprising: an AR display module, a position indicator lamp mounted on the AR smart helmet, and an ambient light detection module and a time determination module integrated on the AR smart helmet. The AR display module is configured to display one or more of navigation information, riding posture information, meteorological information and helmet system function information. The ambient light detection module is configured to detect the ambient light intensity and to trigger the position indicator lamp to light up when the ambient light intensity is lower than a preset light intensity threshold. The time determination module is configured to trigger the position indicator lamp to light up when the current time falls within a preset period, so as to indicate the helmet's position to the outside, wherein the preset period is set according to the sunset time.
According to the AR smart helmet provided by the embodiments of the application, on the premise of guaranteeing riding or driving safety, the user no longer has to operate a mobile phone: both hands are freed, the danger of lowering the head to operate a phone while riding or driving is eliminated, safety during use is improved, and information interaction becomes much more convenient. At the same time, content such as riding or driving navigation, user interaction information and notifications can be projected in front of the user through AR display technology, replacing low-entropy voice with high-entropy images, so that the user can view several pieces of information in one image at the same time instead of listening to one piece of voice information at a time. This removes the limitations of purely voice-based interaction and provides an all-round, three-dimensional interactive experience; since environmental noise does not affect viewing AR content, no noise reduction is required for this purpose, and safety during use is improved.
According to an embodiment of the application, the AR display module is a portable AR display device which is independently mounted on a device bracket of the AR smart helmet or integrated into the helmet body of the AR smart helmet.
According to an embodiment of the present application, the AR smart helmet further comprises: a navigation module integrated on the AR smart helmet and turn indicator lamps mounted on both sides of the AR smart helmet, wherein the navigation module is configured to identify turning nodes from a navigation route and to trigger the turn indicator lamp of the corresponding turning direction to turn on and off according to the distance between the current position and the turning node.
According to an embodiment of the present application, the AR smart helmet further comprises: an IMU attitude module integrated on the AR smart helmet, the IMU attitude module being configured to monitor the riding posture and to trigger the turn indicator lamp to light up when the riding posture is a laterally inclined posture.
According to an embodiment of the application, the IMU attitude module is further configured to acquire the change in acceleration along the direction of travel and to trigger blinking of the turn indicator lamps and the position indicator lamp when the change in acceleration satisfies a set condition, the set condition including that the degree of acceleration change is higher than a preset value.
According to an embodiment of the present application, the AR smart helmet further comprises: a camera module mounted on the AR smart helmet, and a first button further provided on the AR smart helmet; when pressed, the first button triggers the camera module to take a picture and upload the captured image to a server.
According to an embodiment of the application, a second button is further provided on the AR smart helmet; when pressed, the second button triggers the camera module to take a picture, so that a measured object and a reference object are identified in the captured image and the volume of the measured object is derived from the volume of the reference object.
According to an embodiment of the present application, the AR smart helmet further comprises: a wearing detection module integrated on the AR smart helmet and comprising a plurality of sensors, the wearing detection module being configured to judge, from the detection results of the plurality of sensors, whether the AR smart helmet is in a worn state and to trigger power-on start-up of the AR smart helmet when it is judged to be worn.
According to an embodiment of the present application, the plurality of sensors includes a plurality of ranging sensors, a human body charge induction sensor, a light intensity sensor, and a pressure sensor.
According to an embodiment of the application, the wearing detection module judges whether the AR smart helmet is in the worn state in a set order and triggers power-on start-up of the AR smart helmet when the judgments of all sensor detection results indicate that the helmet is worn, the set order being: ranging sensor, human body charge induction sensor, light intensity sensor, pressure sensor.
Additional aspects and advantages of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
Fig. 1 is a block diagram of an AR smart helmet 10 according to an embodiment of the present application.
Fig. 2 is a schematic flow chart of triggering the position indicator lamp 210 to light up according to an embodiment of the present application.
Fig. 3 is a block diagram of an AR smart helmet 10 according to another embodiment of the present application.
Fig. 4 is a schematic flow chart of the triggering of the turn indicator 220 according to an embodiment of the present application.
Fig. 5 is a schematic flow chart of the wearing detection module 810 judging the sensor detection results according to an embodiment of the present application.
Detailed Description
Reference will now be made in detail to the embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are exemplary and intended to be used for explaining the present application and should not be construed as limiting the present application.
The AR smart helmet of the embodiments of the present application is described below with reference to the accompanying drawings.
AR (Augmented Reality) technology, often delivered through head-mounted display equipment, is a relatively new technology that promotes the integration of real-world information and virtual-world content. Using computers and related technologies, entity information that would otherwise be difficult to experience within the space of the real world is simulated, and the virtual content is superimposed onto the real world, where it can be perceived by the human senses, giving a sensory experience that goes beyond reality. After the real environment and the virtual objects are superimposed, they coexist in the same picture and the same space.
Referring to Fig. 1, an AR smart helmet 10 according to an embodiment of the present disclosure mainly includes an AR display module 100, a position indicator lamp 210, an ambient light detection module 310 and a time determination module 320. The AR smart helmet 10 mainly consists of a helmet body and a strap; the helmet body is fixed to the user's head by the strap and protects the head against external impacts.
The AR display module 100 can project notifications, messages and text in front of the user by means of AR technology, and is configured to display one or more of navigation information, riding posture information, meteorological information and helmet system function information. That is, the AR display module 100 may show the wearer of the AR smart helmet 10 only the riding posture information, or the riding posture information together with navigation information, or navigation information, riding posture information, meteorological information and helmet system function information all together. The display time and display style of each kind of information are preset by a program; when the display time is reached, the AR display module 100 performs the necessary logical processing on the acquired or derived information and displays it in AR form so that the user can view it conveniently.
The AR display of navigation information may include the travel route and turn prompts, among other things. The AR display of riding posture information may include the current travel speed, acceleration and direction; if the user is driving an electric bicycle, the riding posture information corresponds to driving posture information. The AR display of weather information may include the current temperature, humidity and barometric pressure. The AR display of helmet system function information may include the remaining battery level and remaining storage capacity of the helmet. The AR display module 100 may also display entertainment information, such as the currently playing song, the volume and the playback mode. Since the AR smart helmet 10 may be used by a food-delivery rider, the AR display module 100 may further display delivery information, such as the store name and location, the order number, and the buyer's location, name and contact details.
The position indicator lamp 210 is mounted on the AR smart helmet 10, that is, on the helmet body, and may specifically be mounted at the highest point of the outside of the helmet body, for which reason it may also be called a high-position indicator lamp. It indicates the presence and position of the helmet wearer to traffic behind, preventing surrounding vehicles or other moving objects from colliding with the user because they failed to see him or her.
The ambient light detection module 310 is integrated on the AR smart helmet 10 and may be built into it. It is configured to detect the ambient light intensity and to trigger the position indicator lamp 210 to light up when the ambient light intensity is below a preset light intensity threshold. Specifically, the ambient light detection module 310 may detect the external ambient light intensity in real time or periodically and compare it with the preset light intensity threshold. If the ambient light intensity is below the threshold, the external environment is dark and the user's presence is hard for others to observe, so the position indicator lamp 210 is lit to signal the wearer's presence; if the ambient light intensity is not below the threshold, the environment is bright enough for the user to be seen without the lamp, and the ambient light detection module 310 does not trigger the position indicator lamp 210.
Further, the ambient light detection module 310 may include a photosensitive element which, together with a light detection circuit, implements the ambient light detection function. The photosensitive element may be analog or digital. When an analog photosensitive element is used, the linear voltage output by the detection circuit varies linearly with the light intensity from dark to bright, for example over a range of 0-3.3 V; the ambient light detection module 310 converts this linear voltage into an ambient light intensity and then compares it with the preset light intensity threshold. When a digital photosensitive sensor is used, the detection circuit converts the light intensity directly into a digital value which is passed to the ambient light detection module 310 over an interface; the module analyses the data and performs the comparison with the preset threshold.
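As a rough illustration of the comparison just described, the following Python sketch converts an assumed 0-3.3 V photosensor voltage into a normalized intensity and checks it against a preset threshold; the threshold value and function names are illustrative assumptions rather than details taken from the patent.

```python
# Minimal sketch: analog photosensor voltage -> ambient light decision.
ADC_FULL_SCALE_V = 3.3          # linear output range of the analog photosensor (from the description)
PRESET_LIGHT_THRESHOLD = 0.25   # assumed threshold, as a fraction of full scale

def ambient_light_trigger(sensor_voltage: float) -> bool:
    """Return True if the position indicator lamp should be lit (dark environment)."""
    intensity = max(0.0, min(sensor_voltage, ADC_FULL_SCALE_V)) / ADC_FULL_SCALE_V
    return intensity < PRESET_LIGHT_THRESHOLD

# Example: 0.5 V from the photodetection circuit means a dark environment, so the lamp is triggered.
print(ambient_light_trigger(0.5))   # True
```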
The time determination module 320 is also integrated on the AR smart helmet 10 and may be built into it. It is configured to trigger the position indicator lamp 210 to light up when the current time falls within a preset period, so as to indicate the wearer's position to the outside. Specifically, the time determination module 320 may obtain the current time in real time or periodically through its timing function and judge whether it falls within the preset period. The preset period is set according to the sunset time; for example, the period from sunset to the next sunrise may be used. If the current time is eleven o'clock at night, it lies within the preset period, so the time determination module 320 triggers the position indicator lamp 210 to light up and indicate the user's presence in a dim or dark environment. If the current time is ten o'clock in the morning, it lies outside the preset period, the user can be observed without the lamp, and the time determination module 320 does not trigger it.
Furthermore, the sunset and sunrise times can be determined from the user's geographical location. Since sunset and sunrise differ between locations, determining the current sunset and sunrise times from the user's position makes the trigger timing of the position indicator lamp 210 by the time determination module 320 match the real scene more closely.
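The preset-period check can be sketched as follows, assuming the sunset and sunrise times for the current location are already available (for example from a lookup based on the GPS position); the concrete times are examples only.

```python
# Minimal sketch: is the current time inside the sunset-to-next-sunrise window?
from datetime import datetime, time

def in_preset_period(now: datetime, sunset: time, sunrise: time) -> bool:
    """True if 'now' falls between sunset and the following sunrise."""
    t = now.time()
    if sunset <= sunrise:                 # degenerate case: window does not wrap past midnight
        return sunset <= t < sunrise
    return t >= sunset or t < sunrise     # normal case: window wraps past midnight

# Example: 23:00 with sunset 19:12 and sunrise 05:48, so the lamp is triggered.
print(in_preset_period(datetime(2022, 6, 15, 23, 0), time(19, 12), time(5, 48)))  # True
```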
It should be noted that both the ambient light detection module 310 and the time determination module 320 can trigger the position indicator lamp 210; together they form the lighting control set of the lamp. Referring to Fig. 2, one specific decision process is: if any module in the lighting control set triggers the lamp, the lamp is lit; if several modules trigger it, the lamp is likewise lit; if no module triggers it, the lamp is turned off.
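A minimal sketch of this lighting control set decision, with illustrative names, is:

```python
# Minimal sketch of the Fig. 2 decision: the lamp is lit if any member of the
# lighting control set (ambient light module, time module, ...) asserts its trigger.
def position_lamp_on(triggers: dict) -> bool:
    return any(triggers.values())

print(position_lamp_on({"ambient_light": False, "time": True}))   # True, lamp on
print(position_lamp_on({"ambient_light": False, "time": False}))  # False, lamp off
```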
The control relationship between the ambient light detection module 310 and the time determination module 320 for the position indicator light 210 is described below by taking as an example a scenario in which the user wears the AR smart helmet 10 for cycling.
Suppose the current time is ten o'clock in the morning, so the time determination module 320 does not trigger the position indicator lamp 210. If the rider enters a tunnel and the surroundings become dark, the ambient light detection module 310 detects the drop in ambient light intensity and triggers the lamp to light up. The ambient light detection module 310 thus avoids the situation, which would arise if the lamp were controlled by the sunset time alone, where another vehicle suddenly enters the dark environment, has not yet adapted to it or cannot see clearly, and rear-ends the user.
Suppose instead that the current time is eleven o'clock at night, so both the ambient light detection module 310 and the time determination module 320 trigger the position indicator lamp 210. If a vehicle approaches from behind and its headlights shine on the sensor used by the ambient light detection module 310 to detect the external light intensity, the detected ambient light intensity rises above the preset threshold and the ambient light detection module 310 no longer triggers the lamp; however, the current time is still within the preset period, so the time determination module 320 keeps the lamp lit. The time determination module 320 thus avoids the situation, which would arise if the lamp were controlled by ambient brightness alone, where headlights shining directly on the user at night switch the position indicator lamp 210 off by mistake.
According to the AR smart helmet provided by the embodiments of the application, on the premise of guaranteeing riding or driving safety, the user no longer has to operate a mobile phone: both hands are freed, the danger of lowering the head to operate a phone while riding or driving is eliminated, safety during use is improved, and information interaction becomes much more convenient. At the same time, content such as riding or driving navigation, user interaction information and notifications can be projected in front of the user through AR display technology, replacing low-entropy voice with high-entropy images, so that the user can view several pieces of information in one image at the same time instead of listening to one piece of voice information at a time. This removes the limitations of purely voice-based interaction and provides an all-round, three-dimensional interactive experience; since environmental noise does not affect viewing AR content, no noise reduction is required for this purpose, and safety during use is improved.
In some embodiments, the AR display module 100 is a portable AR display device, which is independently installed on a device bracket of the AR smart helmet 10 or integrated on a helmet body of the AR smart helmet 10.
Using a portable AR display device can reduce the cost of the helmet, and the user can conveniently adjust its position. For example, when the portable AR display device is integrated into the helmet body of the AR smart helmet 10, the user can adjust its position, which suits users with different head shapes, widens the range of fit of the helmet and makes interaction more convenient.
A device bracket may be provided on the AR smart helmet 10, and the portable AR display device may also be mounted on this bracket externally. In that case the portable AR display device can be powered through, and communicate with, the AR smart helmet 10 via the power and signal interfaces of the bracket; alternatively, it can be independently powered and connected to the AR smart helmet 10 over a wireless data link such as Bluetooth. The AR display module 100 may also be equipped with voice interaction recognition and audio playback functions.
In some embodiments, referring to Fig. 3, the AR smart helmet 10 further comprises a navigation module 400 and turn indicator lamps 220. The navigation module 400 is integrated on the AR smart helmet 10 and may be built into it. The turn indicator lamps 220 are mounted on both sides of the AR smart helmet 10, for example on both sides of the rear, with one lamp on each side; the turn indicator lamps 220 may use flowing (sequential) bar-type light modules.
The navigation module 400 is configured to identify turning nodes from the navigation route and to trigger the turn indicator lamp 220 of the corresponding turning direction to turn on and off according to the distance between the current position and the turning node. Specifically, when the AR smart helmet 10 enters navigation mode, the navigation module 400 combines map data with the planned navigation route to identify all positions at which a turn is required; these positions are called turning nodes. The navigation module 400 also tracks the user's position, and when the user rides or drives close to a turning node it triggers the turn indicator lamp 220 on the side of that node's turning direction. For example, when the navigation module 400 recognizes that the user is 20 metres from a left turn, it can control the left turn indicator lamp 220 to display a flowing left arrow. In this way, even if the user forgets to switch on the turn signal of the vehicle being driven, the surrounding vehicles, and in particular the one behind, are still reminded that the user is about to turn; a user riding a bicycle, which has no turn signals at all, can likewise alert the vehicle behind.
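One way to sketch the turning-node check is shown below: the great-circle distance from the current position to the next node is computed and the indicator on the corresponding side is triggered within a trigger distance (20 metres, as in the example above). The coordinates and helper names are illustrative assumptions.

```python
# Minimal sketch: trigger the turn indicator when the next turning node is close enough.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS-84 points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def turn_indicator_command(pos, node, direction, trigger_distance_m=20.0):
    """Return 'left'/'right' when the node is within the trigger distance, else None."""
    if haversine_m(*pos, *node) <= trigger_distance_m:
        return direction
    return None

print(turn_indicator_command((39.9900, 116.3000), (39.99015, 116.30005), "left"))  # 'left'
```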
In some embodiments, continuing to refer to Fig. 3, the AR smart helmet 10 further comprises an IMU attitude module 500. The IMU attitude module 500 is integrated on the AR smart helmet 10 and may be built into it; it may include IMU inertial devices and be configured with the associated attitude-solving routines. The IMU attitude module 500 is configured to monitor the riding posture and to trigger the turn indicator lamp 220 to light up when the riding posture is a laterally inclined posture.
Specifically, when the user is travelling forward and changes lanes, the surrounding vehicles, and especially the one behind, should also be alerted even though a lane change is not a turn. Therefore, when the IMU attitude module 500 determines that the user's current posture is inclined to the left or to the right, the user is judged to be changing lanes and the corresponding turn indicator lamp 220 is switched on; when the posture is no longer laterally inclined, the user is not changing lanes or has just finished doing so, and the turn indicator lamp 220 can be switched off. Recognizing the user's lane-change behaviour likewise prevents the user from changing lanes without signalling, and thus prevents accidents caused by the vehicle behind having no advance warning. The riding posture (equivalent to a driving posture) of the user is therefore monitored by the IMU attitude module 500, and the relevant interaction is completed according to the detected posture. For a user driving an electric bicycle, the IMU attitude module 500 can likewise monitor the driving posture and complete the corresponding interaction.
It should be noted that whether the current riding posture is a laterally inclined posture may be decided by a lateral tilt threshold: if the inclination exceeds the threshold the posture is considered laterally inclined, otherwise it is not. A bicycle also exhibits lane-change behaviour, for example at a crossroads where the red light ahead prohibits going straight and the user changes lanes in order to turn.
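A minimal sketch of the lateral-tilt check, with an assumed threshold value (the patent does not give a concrete number), might look like this:

```python
# Minimal sketch: roll angle from the IMU attitude solution vs. an assumed lateral-tilt threshold.
LATERAL_TILT_THRESHOLD_DEG = 8.0   # illustrative assumption

def lane_change_indicator(roll_deg: float):
    """Return 'left'/'right' while the rider leans past the threshold, else None."""
    if roll_deg <= -LATERAL_TILT_THRESHOLD_DEG:
        return "left"
    if roll_deg >= LATERAL_TILT_THRESHOLD_DEG:
        return "right"
    return None

print(lane_change_indicator(-10.2))  # 'left'
print(lane_change_indicator(1.5))    # None, indicator off
```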
In addition, the IMU attitude module 500 can also provide the heading and acceleration quantities required for inertial navigation during the user's travel; both the navigation information and the riding posture information (driving posture information) displayed to the user by the AR display module 100 are provided by the IMU attitude module 500.
In some embodiments, the IMU attitude module 500 is further configured to acquire the change in acceleration along the direction of travel and to trigger blinking of the turn indicator lamps 220 and the position indicator lamp 210 when the change in acceleration satisfies a set condition. The set condition includes that the degree of acceleration change is higher than a preset value.
Specifically, when the user brakes suddenly while riding or driving forward, the change in acceleration within a unit of time is large and exceeds the change threshold. The IMU attitude module 500 detects that the acceleration in the direction of travel has suddenly changed sharply, and in response controls the turn indicator lamps 220 on both sides to blink at the same frequency while also making the position indicator lamp 210 blink brightly, warning the vehicle behind to take note of the braking and avoid a rear-end collision.
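The sudden-braking condition can be sketched as a simple comparison of successive acceleration samples against a change threshold; the threshold value is an illustrative assumption.

```python
# Minimal sketch: a sharp drop in forward acceleration triggers the braking alert.
BRAKE_DECEL_THRESHOLD_MS2 = 4.0   # assumed change threshold in m/s^2

def braking_alert(prev_accel_ms2: float, curr_accel_ms2: float) -> bool:
    """True when the forward acceleration drops sharply enough to signal sudden braking."""
    return (prev_accel_ms2 - curr_accel_ms2) > BRAKE_DECEL_THRESHOLD_MS2

if braking_alert(0.5, -5.0):
    print("blink both turn indicator lamps and the position indicator lamp")
```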
For the position indicator lamp 210 and the turn indicator lamps 220, blinking, once triggered, has a higher priority than the steady-on state. If the position indicator lamp 210 was in a steady-on state, it blinks while blinking is triggered, and when the blinking ends it returns to steady-on or switches off depending on whether the ambient light detection module 310 or the time determination module 320 is still triggering it. The same applies to the turn indicator lamps 220.
Referring to Fig. 4, one specific decision process for the turn indicator lamps 220 is as follows. After navigation starts, the helmet checks in turn whether a left turning node is about to be reached, whether a right turning node is about to be reached, and whether the drop in acceleration exceeds the change threshold; if every check is negative, the sequence of checks starts again. If one of the checks is positive, the corresponding turn indicator lamp is lit or the lamps blink at the same frequency, and while the turn indicator lamps blink at the same frequency the position indicator lamp also blinks. If the user is both about to turn and braking suddenly, the same-frequency blinking takes priority.
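A compact sketch of the priority rule between blinking and the steady-on state, using illustrative names, is:

```python
# Minimal sketch of the lamp-priority rule: a blink request (braking alert) overrides
# the steady-on state; when the blink ends the lamp returns to whatever the ambient-light
# and time triggers currently dictate.
def position_lamp_state(blinking: bool, ambient_trigger: bool, time_trigger: bool) -> str:
    if blinking:
        return "blink"                                          # highest priority
    return "on" if (ambient_trigger or time_trigger) else "off"

print(position_lamp_state(True, False, False))   # 'blink'
print(position_lamp_state(False, False, True))   # 'on' (restored after blinking ends)
```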
By highly integrating AR display, interaction, voice, attitude analysis, navigation, camera and intelligent light warning into the AR smart helmet 10, and through the fusion and interaction of multiple sensors, an integrated solution of intelligent navigation, intelligent indicator lamps and intelligent voice interaction is achieved.
In some embodiments, referring to Fig. 3, the AR smart helmet 10 further comprises a camera module 600 and a first button 710. The camera module 600 is mounted on the AR smart helmet 10, specifically it may be mounted directly in front of the forehead, and is used to capture video of the scene in front of the user. It can record in real time while the user is riding or driving, documenting the whole journey. If the user crashes and falls, the camera module 600 can automatically upload to a cloud server the video from some time before to some time after the moment of the event; for example, if the user crashes at time T1, the camera module 600 uploads the video covering the period [T1 - 10 min, T1 + 10 min], so that the accident is recorded and responsibility can be determined.
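The clip-selection rule from the example above ([T1 - 10 min, T1 + 10 min]) can be sketched as follows; the upload step itself is only a placeholder.

```python
# Minimal sketch: pick the video window around the crash time T1 for upload.
from datetime import datetime, timedelta

def crash_clip_window(t1: datetime, margin_min: int = 10):
    """Return the (start, end) of the clip to upload around the event time."""
    return t1 - timedelta(minutes=margin_min), t1 + timedelta(minutes=margin_min)

start, end = crash_clip_window(datetime(2022, 6, 15, 14, 30))
print(start, end)   # 14:20 .. 14:40, the range handed to the (hypothetical) upload routine
```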
The first button 710 may be arranged on the outer surface of the AR smart helmet 10, at a position the user can reach and operate easily. By pressing the first button 710 the user triggers the camera module 600 to take a picture and upload it to the server. The difference between the photo function of the first button 710 and the automatic recording function of the camera module 600 is that automatic recording takes place while riding or driving, whereas the first button 710 is operated manually when the user has stopped or is not riding or driving. For example, when delivering a takeout order, the user can manually press the first button 710 to take the photographs required when picking up and handing over the order, providing image-based monitoring and archiving of the delivery workflow; the user is not driving at that moment.
In some embodiments, continuing to refer to Fig. 3, the AR smart helmet 10 further comprises a second button 720. The second button 720 may also be arranged on the outer surface of the AR smart helmet 10, at some distance from the first button 710 so that the user does not confuse the two or press or touch the wrong one. Pressing the second button 720 likewise triggers the camera module 600 to take a picture, but this picture is used so that the cloud server or the computing module of the AR smart helmet 10 can identify the measured object and a reference object in the image and derive the volume of the measured object from the known volume of the reference object. The commands issued when the first button 710 and the second button 720 are pressed therefore differ. For example, when delivering a takeout order, the user can manually press the second button 720 to photograph and identify the takeout box and the transported item placed in the same image, and the volume of the transported item is estimated by calibration against the image of the standard takeout box whose volume is known.
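The patent does not specify the volume-estimation algorithm; one coarse sketch, assuming the reference box fixes a pixels-to-size scale and the estimate scales with the cube of the apparent size ratio, is:

```python
# Minimal sketch (illustrative model only): scale the known reference volume by the
# cubed ratio of apparent sizes in the same image.
def estimate_volume(ref_volume_l: float, ref_pixel_size: float, obj_pixel_size: float) -> float:
    """Estimate the measured object's volume from the reference object's known volume."""
    ratio = obj_pixel_size / ref_pixel_size
    return ref_volume_l * ratio ** 3

# Example: a 20 L reference box spans 200 px, the measured object spans 150 px.
print(round(estimate_volume(20.0, 200.0, 150.0), 2))   # ~8.44 L
```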
It is understood that the first button 710 and the second button 720 may be provided as a single button, with the duration of the press distinguishing whether the function of the first button 710 or that of the second button 720 is executed. For example, a press lasting no more than one second triggers the camera module 600 to take a picture and upload it to the server, while a press lasting more than one second triggers the camera module 600 to take a picture for estimating the volume of the measured object. Furthermore, the first button 710 and the second button 720 may be provided as a single knob, with different rotational positions distinguishing which function is executed.
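A minimal sketch of the shared-button variant, dispatching on press duration with the one-second boundary from the example above (handler names are illustrative), is:

```python
# Minimal sketch: one button, two functions, distinguished by press duration.
def dispatch_button(press_duration_s: float) -> str:
    if press_duration_s <= 1.0:
        return "capture_and_upload"          # first-button behaviour
    return "capture_and_measure_volume"      # second-button behaviour

print(dispatch_button(0.4))   # capture_and_upload
print(dispatch_button(1.8))   # capture_and_measure_volume
```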
In some embodiments, continuing to refer to Fig. 3, the AR smart helmet 10 further comprises a wearing detection module 810. The wearing detection module 810 is integrated on the AR smart helmet 10 and can be placed inside it. It is a comprehensive intelligent detection module assembling several kinds of sensors, all of which are used to detect whether the AR smart helmet 10 is being worn. The wearing detection module 810 is configured to judge, from the detection results of these sensors, whether the AR smart helmet is in a worn state and to trigger power-on start-up of the AR smart helmet when it judges that the helmet is worn.
In some embodiments, the wearing detection module 810 may include more than one, or even all, of a ranging sensor, a human body charge induction sensor, a light intensity sensor and a pressure sensor, all mounted on the inside of the helmet. The ranging sensor may be an infrared ranging sensor and detects the distance inside the helmet; the human body charge induction sensor detects charge on the inside of the helmet; the light intensity sensor detects the brightness inside the helmet; and the pressure sensor detects the force applied to the inside of the helmet. In the following, the wearing detection module 810 is taken to include all of the above sensor types as an example.
The judgment strategy of the wearing detection module 810 may be that power-on start-up of the AR smart helmet 10 is triggered only when the judgments of all the sensor detection results indicate that the helmet is being worn. In other words, the wearing detection module 810 judges that the AR smart helmet 10 is worn only when the detection results of all the sensors indicate so; if the detection results of one or more sensors indicate that the helmet is not worn, the sensors whose results indicate that it is worn may have produced false positives, which is why the sensor results disagree, and the helmet cannot be judged to be worn at that moment.
The wearing detection module 810 can start a new judgment of whether the user is wearing the helmet every 500 ms. When judging whether the helmet is worn, it compares the detection result of each sensor with a corresponding set value, and the detection results may be compared sequentially in a set order. In some embodiments, the order in which the wearing detection module 810 evaluates the sensor results is: ranging sensor, human body charge induction sensor, light intensity sensor, pressure sensor.
Specifically, referring to Fig. 5, the wearing detection module 810 first obtains the reading of the infrared ranging sensor. If the infrared ranging reading is smaller than a first set value, an object has entered the helmet and the next sensor is evaluated; otherwise the infrared ranging reading is evaluated again.
When the infrared ranging reading is smaller than the first set value, the wearing detection module 810 obtains the reading of the human body charge induction sensor. If the human body charge reading is valid, the object that has entered the helmet is part of a human body and the next sensor is evaluated; otherwise the process returns to evaluating the infrared ranging reading.
When the human body charge reading is valid, the wearing detection module 810 obtains the brightness reading of the light intensity sensor. If the brightness reading is smaller than a second set value, the body part inside the helmet occupies a large part of the space and conforms to the shape of the cavity on the inside of the helmet, and the next sensor is evaluated; otherwise the process returns to evaluating the infrared ranging reading.
When the brightness reading is smaller than the second set value, the wearing detection module 810 obtains the reading of the pressure sensor. If the pressure reading meets its corresponding set value, the body part inside the helmet is touching the corresponding pressure point, which indicates that the body part inside the helmet is a human head. At this point the helmet start-up process is triggered: the control main board 820 and the power supply system 830 are started, the peripheral function modules are powered on, and the control main board 820 completes the start-up self-check of the function modules. Otherwise the process returns to evaluating the infrared ranging reading.
It can be understood that whenever the comparison of a sensor reading with its set value fails, the process returns to the first sensor in the judgment order and all the checks are repeated. The helmet is therefore judged to be worn only when all the sensors satisfy their conditions at the same time; otherwise the helmet could be considered worn merely because the sensors satisfied their conditions one after another at different times, even though satisfying the conditions at different times does not necessarily mean the helmet is actually being worn.
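The Fig. 5 flow can be sketched as a loop that re-evaluates the sensors every 500 ms in the set order and restarts from the ranging sensor whenever a check fails; the sensor read-out names and set values below are illustrative assumptions.

```python
# Minimal sketch of the wear-detection flow: power on only when all four checks pass in one cycle.
import time

SET_ORDER = ["ranging", "body_charge", "light_intensity", "pressure"]

def check(sensor: str, readings: dict) -> bool:
    if sensor == "ranging":
        return readings["distance_mm"] < 60          # object inside the shell (first set value)
    if sensor == "body_charge":
        return readings["charge_valid"]              # human tissue detected
    if sensor == "light_intensity":
        return readings["inner_brightness"] < 10     # cavity largely filled (second set value)
    return readings["pressure_kpa"] > 2              # pressure point touched (assumed direction)

def helmet_worn(readings: dict) -> bool:
    # all() evaluates the sensors in the set order and stops at the first failure
    return all(check(sensor, readings) for sensor in SET_ORDER)

def wear_detection_loop(read_sensors, power_on, period_s: float = 0.5):
    while True:
        if helmet_worn(read_sensors()):
            power_on()                               # start the control main board and power system
            return
        time.sleep(period_s)                         # a failed check restarts the whole sequence
```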
In some embodiments, referring to Fig. 3, the AR smart helmet 10 further comprises a control main board 820 and a power supply system 830, both integrated on the AR smart helmet 10 and capable of being placed inside it. The control main board 820 is the hardware carrier of the system: it connects the functional modules of every part of the system and performs the data and control-algorithm computations. The control main board 820 is therefore electrically connected to the AR display module 100, the position indicator lamp 210, the turn indicator lamps 220, the ambient light detection module 310, the time determination module 320, the navigation module 400, the IMU attitude module 500, the camera module 600, the first button 710, the second button 720, the wearing detection module 810, the power supply system 830 and other functional modules and components. The power supply system 830 provides the different voltages required by the hardware, such as 3.3 V, 1.8 V and 5 V, and is also connected to the battery unit to manage charging and discharging and to protect against short circuit, overcurrent and overdischarge.
In some embodiments, continuing to refer to Fig. 3, the AR smart helmet 10 may further comprise a network service module, integrated on the AR smart helmet 10 and arrangeable inside it. The network service module may include a 4G/5G module 910 and a GPS module 920. The 4G/5G module 910 provides the network communication function for the AR smart helmet 10, so that once online the helmet can replace the mobile phone for functions such as messaging, voice, navigation and communication, and the camera module 600 uploads video to the server through the 4G/5G module 910. The GPS module 920 provides geographical position information and serves navigation and positioning; the geographical location from which the time determination module 320 obtains the sunrise and sunset times is obtained through the GPS module 920.
In some embodiments, continuing to refer to Fig. 3, the AR smart helmet 10 may further comprise microphones and a directional earpiece. The microphones may be digital microphones with a PDM (pulse density modulation) interface, with one arranged outside and one inside the AR smart helmet 10 near the user's mouth. The external microphone collects external noise around the AR smart helmet 10 as well as the user's voice audio, and the built-in microphone is the main microphone. Both microphones are connected to the control main board 820, in which an intelligent noise-reduction program is embedded: the environmental noise picked up by the external microphone can be processed by a noise-reduction algorithm to generate a sound wave opposite in phase to the noise, which is superimposed on the earphone audio to reduce the influence of external noise on what the user hears. Alternatively, depending on the specific scene, the external environmental sound can be passed through, so that the sealed helmet does not prevent the user from perceiving ambient sound and thereby create danger. The user can switch between the noise-reduction and pass-through functions in the sound-control function menu, for example with a rotary button.
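A conceptual sketch of the two audio modes, with a naively inverted noise estimate standing in for the actual noise-reduction algorithm (which the patent does not detail), is:

```python
# Conceptual sketch only: noise reduction superimposes an anti-phase noise estimate on the
# earpiece audio, while pass-through mixes the ambient sound in instead. Array names and the
# simple full inversion are illustrative assumptions, not the patent's algorithm.
import numpy as np

def mix_output(earpiece_audio: np.ndarray, ambient: np.ndarray, mode: str) -> np.ndarray:
    if mode == "noise_reduction":
        return earpiece_audio - ambient        # add the inverted (anti-phase) noise estimate
    if mode == "pass_through":
        return earpiece_audio + ambient        # let the environment remain audible
    return earpiece_audio

tone = np.sin(np.linspace(0, 2 * np.pi, 8))
print(mix_output(tone, 0.1 * tone, "noise_reduction").round(2))
```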
The directional earpiece may be a loudspeaker with a directional sound cavity arranged above the ear position of the AR smart helmet 10, with the outlet of the cavity pointing at the auricle of the wearer's ear, so that the user can clearly hear system audio such as navigation, voice and music even outdoors.
It should be noted that the logic and/or steps represented in the flowcharts or otherwise described herein, such as an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
In the description herein, reference to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise. Relational terms such as first and second may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
All the embodiments in the present specification are described in a related manner, and the same and similar parts among the embodiments may be referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, as for the apparatus, the electronic device, and the computer-readable storage medium embodiments, since they are substantially similar to the method embodiments, the description is relatively simple, and in relation to the description, reference may be made to some portions of the description of the method embodiments.
The above description is only for the preferred embodiment of the present application, and is not intended to limit the scope of the present application. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application are included in the protection scope of the present application.

Claims (10)

1. An AR smart helmet, comprising: an AR display module, a position indicator lamp arranged on the AR smart helmet, and an ambient light detection module and a time determination module integrated in the AR smart helmet;
the AR display module is configured to display one or more of navigation information, riding posture information, weather information, and helmet system function information;
the ambient light detection module is configured to detect the ambient light intensity and trigger the position indicator lamp to turn on when the ambient light intensity is lower than a preset light intensity threshold;
the time determination module is configured to trigger the position indicator lamp to turn on when the current time falls within a preset time period, so as to indicate the wearer's position to others, wherein the preset time period is set according to the sunset time.
2. The AR smart helmet of claim 1, wherein the AR display module is a portable AR display device mounted independently on a device bracket of the AR smart helmet or integrated into the helmet body of the AR smart helmet.
3. The AR smart helmet of claim 1, further comprising: a navigation module integrated in the AR smart helmet and turn indicator lamps installed on both sides of the AR smart helmet, wherein the navigation module is configured to identify turning nodes from a navigation route and to trigger the turn indicator lamp in the corresponding turning direction to turn on and off according to the distance between the current position and the turning node.
4. The AR smart helmet of claim 3, further comprising: an IMU posture module integrated in the AR smart helmet and configured to monitor the riding posture and trigger the turn indicator lamp to turn on when the riding posture is a lateral lean posture.
5. The AR smart helmet of claim 4, wherein the IMU posture module is further configured to obtain the acceleration change in the direction of travel and trigger the turn indicator lamps and the position indicator lamp to blink when the acceleration change satisfies a set condition, wherein the set condition comprises: the degree of acceleration change is higher than a preset value.
6. The AR smart helmet of claim 1, further comprising: a camera module installed on the AR smart helmet, wherein a first button is further provided on the AR smart helmet, and pressing the first button triggers the camera module to capture an image and upload the captured image to a server.
7. The AR smart helmet of claim 6, wherein a second button is further provided on the AR smart helmet, and pressing the second button triggers the camera module to capture an image, so that a measured object and a reference object are identified from the captured image and the volume of the measured object is obtained from the volume of the reference object.
8. The AR smart helmet of claim 1, further comprising: a wearing detection module integrated in the AR smart helmet and comprising a plurality of sensors, wherein the wearing detection module is configured to determine, from the detection results of the plurality of sensors, whether the AR smart helmet is in a worn state and to trigger power-on of the AR smart helmet when the worn state is determined.
9. The AR smart helmet of claim 8, wherein the plurality of sensors comprise ranging sensors, human body charge induction sensors, light intensity sensors, and pressure sensors.
10. The AR smart helmet of claim 9, wherein the wearing detection module evaluates the sensors in a set sequence to determine whether the AR smart helmet is in a worn state, and triggers power-on of the AR smart helmet when the detection results of all the sensors indicate the worn state, the set sequence being: ranging sensor, human body charge induction sensor, light intensity sensor, pressure sensor.
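The following sketches are editorial illustrations only and are not part of the claims. As a first example, the lamp-control behavior recited in claim 1 could be read as below; the lux threshold, the pre-sunset margin, and all function and variable names are assumptions rather than values given in the application.

```python
from datetime import datetime, timedelta

# Hypothetical values; the claim does not specify them.
LIGHT_INTENSITY_THRESHOLD_LUX = 50.0       # assumed "preset light intensity threshold"
PRE_SUNSET_MARGIN = timedelta(minutes=30)  # assumed margin before sunset


def position_lamp_should_light(ambient_lux: float,
                               now: datetime,
                               sunset: datetime,
                               next_sunrise: datetime) -> bool:
    """Return True when the position indicator lamp should be on.

    The lamp lights either when ambient light drops below the preset
    threshold (ambient light detection module) or when the current time
    falls inside a preset period derived from the sunset time
    (time determination module).
    """
    low_light = ambient_lux < LIGHT_INTENSITY_THRESHOLD_LUX
    in_night_window = (sunset - PRE_SUNSET_MARGIN) <= now <= next_sunrise
    return low_light or in_night_window
```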
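Similarly, one hedged reading of claims 3-5 (signalling near a turning node, lean detection, and blinking on a sharp acceleration change) is sketched below; the distances, the lean angle, and the acceleration threshold are hypothetical placeholders.

```python
from dataclasses import dataclass
from math import hypot

# Hypothetical thresholds; the claims leave the actual values open.
TURN_SIGNAL_ON_DISTANCE_M = 50.0   # start signalling this far before the node
TURN_SIGNAL_OFF_DISTANCE_M = 5.0   # stop signalling once the node is reached
LEAN_ANGLE_THRESHOLD_DEG = 15.0    # threshold for a "lateral lean posture"
ACCEL_CHANGE_THRESHOLD_MS2 = 3.0   # "preset value" for the acceleration change


@dataclass
class TurningNode:
    x: float          # node position in local metres
    y: float
    direction: str    # "left" or "right"


def lamp_command(pos_x: float, pos_y: float, node: TurningNode,
                 roll_angle_deg: float, accel_change_ms2: float) -> dict:
    """Sketch of the lamp decisions described in claims 3-5."""
    distance = hypot(node.x - pos_x, node.y - pos_y)

    # Claim 3: signal in the node's direction while approaching it.
    turn_lamp = None
    if TURN_SIGNAL_OFF_DISTANCE_M < distance <= TURN_SIGNAL_ON_DISTANCE_M:
        turn_lamp = node.direction

    # Claim 4: a pronounced lateral lean also lights the indicator on that side.
    if roll_angle_deg > LEAN_ANGLE_THRESHOLD_DEG:
        turn_lamp = "right"
    elif roll_angle_deg < -LEAN_ANGLE_THRESHOLD_DEG:
        turn_lamp = "left"

    # Claim 5: a sharp acceleration change blinks the turn and position lamps.
    blink_all = abs(accel_change_ms2) > ACCEL_CHANGE_THRESHOLD_MS2

    return {"turn_lamp": turn_lamp, "blink_all": blink_all}
```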
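Claim 7 does not specify how the measured object's volume is derived from the reference object. One simple reading, assuming both objects are segmented from the same image at a comparable distance from the camera, scales the known reference volume by the cube of the relative linear size; the function name and the pixel-area inputs are assumptions.

```python
def estimate_volume(measured_px_area: float,
                    reference_px_area: float,
                    reference_volume_cm3: float) -> float:
    """Scale the known reference volume by the cube of the relative size.

    Assumes both objects were segmented from the same image, lie at a
    comparable distance from the camera, and are roughly isotropic, so the
    linear scale is the square root of the pixel-area ratio and the volume
    scales with its cube. A real implementation would need segmentation
    and calibration steps that the claim does not describe.
    """
    linear_scale = (measured_px_area / reference_px_area) ** 0.5
    return reference_volume_cm3 * linear_scale ** 3
```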
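Claims 8-10 describe a sequential check over several sensors before power-on. A minimal sketch, assuming each sensor reading has already been reduced to a boolean "consistent with being worn", might look like this:

```python
from typing import Callable


def helmet_is_worn(read_ranging: Callable[[], bool],
                   read_charge: Callable[[], bool],
                   read_light: Callable[[], bool],
                   read_pressure: Callable[[], bool]) -> bool:
    """Evaluate the wear-detection sensors in the set sequence of claim 10.

    Each argument is a zero-argument callable that returns True when that
    sensor's reading is consistent with the helmet being worn. Power-on is
    triggered only if every check passes.
    """
    checks_in_order = (read_ranging, read_charge, read_light, read_pressure)
    return all(check() for check in checks_in_order)
```

Evaluating the sensors in the fixed order of claim 10 lets an early negative result short-circuit the remaining reads, so cheaper checks can rule out the worn state before the later sensors are consulted.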
CN202210680312.6A 2022-06-15 2022-06-15 AR intelligence helmet Pending CN115047626A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210680312.6A CN115047626A (en) 2022-06-15 2022-06-15 AR intelligence helmet

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210680312.6A CN115047626A (en) 2022-06-15 2022-06-15 AR intelligence helmet

Publications (1)

Publication Number Publication Date
CN115047626A true CN115047626A (en) 2022-09-13

Family

ID=83161244

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210680312.6A Pending CN115047626A (en) 2022-06-15 2022-06-15 AR intelligence helmet

Country Status (1)

Country Link
CN (1) CN115047626A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106307766A (en) * 2016-08-13 2017-01-11 上海天奕无线信息科技有限公司 Headgear, vehicle system and control method for helmet indicator lamp
CN112932007A (en) * 2021-02-22 2021-06-11 浙江欧凯车业有限公司 Intelligent helmet and control method thereof
CN213756864U (en) * 2020-11-26 2021-07-23 浙江警察学院 Intelligent safety helmet of electric bicycle
CN216723316U (en) * 2021-12-27 2022-06-14 星微科技(天津)有限公司 Intelligent helmet capable of automatically identifying vehicle running state

Similar Documents

Publication Publication Date Title
KR102552285B1 (en) Portable electronic device and method thereof
US7495549B2 (en) Integrated power, lighting, and instrumentation system for bicycles
CN109087485B (en) Driving reminding method and device, intelligent glasses and storage medium
US20150228066A1 (en) Rear Encroaching Vehicle Monitoring And Alerting System
WO2016187973A1 (en) Intelligent riding helmet
US10088911B2 (en) Programmable electronic helmet
KR20090105257A (en) Single unit video camera recoder
JP2007510575A (en) Vehicle running and / or traffic condition recording apparatus and recording evaluation method
JP2013067209A (en) Vehicle projection system and program therefor
WO2020165810A1 (en) A structure for personal protection, driving assistance and signalling, in particular for motor applications
JP6439184B2 (en) Vehicle projection system and program thereof
TWM418840U (en) Traffic recorder for bicycle or tricycle
WO2015120428A1 (en) Rear encroaching vehicle monitoring and alerting system
CN105916760A (en) Wearable signaling system and methods
CN115047626A (en) AR intelligence helmet
KR100934942B1 (en) System for recoding emergency condition using avn for automobile
JP6083019B2 (en) System, program, imaging device, and software
JP6172483B2 (en) Vehicle projection system and program thereof
CN210901568U (en) Driving auxiliary helmet system
JP6765606B2 (en) Systems, programs, imaging devices, and software
CN114587045A (en) Multifunctional riding helmet
JP6661858B2 (en) System, program, imaging device, and software
JP6533902B2 (en) Device and program
CN106952450A (en) Anti-fatigue driving device using dual cameras
CN203920640U (en) Electric vehicle body control system based on UART protocol

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 2022-09-13