US20210180917A1 - System and method for monitoring and assessing projectile performance - Google Patents

System and method for monitoring and assessing projectile performance

Info

Publication number
US20210180917A1
Authority
US
United States
Prior art keywords
projectile
viewfinder
data
trajectory
launcher device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US17/117,962
Inventor
Raymond Dikun
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Good Sportsman Marketing LLC
Original Assignee
Good Sportsman Marketing LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Good Sportsman Marketing LLC filed Critical Good Sportsman Marketing LLC
Priority to US17/117,962 priority Critical patent/US20210180917A1/en
Assigned to PLANO MOLDING COMPANY, LLC reassignment PLANO MOLDING COMPANY, LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: DIKUN, RAYMOND
Assigned to WGI INNOVATIONS, LTD. reassignment WGI INNOVATIONS, LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: PLANO MOLDING COMPANY, LLC, PLANO SYNERGY HOLDING INC.
Assigned to GOOD SPORTSMAN MARKETING, L.L.C. reassignment GOOD SPORTSMAN MARKETING, L.L.C. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: WGI INNOVATIONS, LTD.
Publication of US20210180917A1 publication Critical patent/US20210180917A1/en
Assigned to NXT CAPITAL, LLC, AS AGENT reassignment NXT CAPITAL, LLC, AS AGENT SECURITY INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: GOOD SPORTSMAN MARKETING, L.L.C.

Classifications

    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41 WEAPONS
    • F41G WEAPON SIGHTS; AIMING
    • F41G3/00 Aiming or laying means
    • F41G3/04 Aiming or laying means for dispersing fire from a battery; for controlling spread of shots; for coordinating fire from spaced weapons
    • F41G3/06 Aiming or laying means with rangefinder
    • F41G3/08 Aiming or laying means with means for compensating for speed, direction, temperature, pressure, or humidity of the atmosphere
    • F41G3/14 Indirect aiming means
    • F41G3/16 Sighting devices adapted for indirect laying of fire
    • F41G3/165 Sighting devices adapted for indirect laying of fire using a TV-monitor
    • F41G9/00 Systems for controlling missiles or projectiles, not provided for elsewhere

Definitions

  • the disclosure relates generally to a method, system and computer program for monitoring, logging or assessing performance of a launcher device or a projectile launched from the launcher device, and for adjusting the launcher device for optimal trajectory to a target based on a projectile performance assessment for the launcher device or the projectile.
  • Projectile tracking can be a daunting task. Historically, projectile tracking has relied on human spotters using various optical gear, and in some cases, laser rangefinders. Once a target is ranged a shooter is offered information based on projectile performance that will aid in range compensation. However, human spotters may not be available in certain environments, such as, for example, target practice or hunting. Moreover, human spotters can provide erroneous or inconsistent information, resulting in costly and unnecessary expenditure of ammunition, arrows or other projectiles.
  • a projectile targeting solution is provided that can monitor, log and assess projectile performance of a launcher device or a projectile launched from the launcher device.
  • the projectile targeting solution includes a method, system and computer program for monitoring, logging or assessing performance of the launcher device or projectile and adjusting the launcher device for optimal trajectory to a target based on a projectile performance assessment for the launcher device or the projectile.
  • a projectile targeting system for monitoring, logging, assessing or adjusting performance of a launcher device or a projectile launched from the launcher device.
  • the system comprises a transceiver arranged to receive projectile trajectory data from a communication device, and a projectile launcher doping unit.
  • the projectile launcher doping unit can be arranged to analyze the projectile trajectory data, determine an optimal trajectory for the projectile from the launcher device to a target, predict an actual trajectory for the projectile from the launcher device to the target, compare the optimal and predicted actual trajectories for the projectile, and generate correction parameters based on a result of the comparison.
  • the transceiver can be arranged to transmit the correction parameters to the communication device to set or adjust viewfinder settings.
  • the projectile trajectory data can comprise a distance from the launcher device to the target; and/or the projectile trajectory data can comprise global positioning system (GPS) coordinates; and/or the projectile trajectory data can comprise a horizontal doping setting; and/or the projectile trajectory data can comprise a vertical doping setting; and/or the projectile trajectory data can comprise projectile data; and/or the projectile trajectory data can comprise launcher device data; and/or the viewfinder settings can comprise riflescope doping settings.
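The enumerated trajectory-data fields can be pictured as a single record. The following sketch is purely illustrative; the field names and units are assumptions, not identifiers from the disclosure.

```python
from dataclasses import dataclass, field

@dataclass
class ProjectileTrajectoryData:
    """One trajectory-data record as enumerated in the claim (names are illustrative)."""
    distance_to_target_yd: float   # distance from the launcher device to the target
    gps_coordinates: tuple         # (latitude, longitude)
    horizontal_doping_moa: float   # horizontal (windage) doping setting
    vertical_doping_moa: float     # vertical (elevation) doping setting
    projectile_data: dict = field(default_factory=dict)       # e.g. caliber, muzzle velocity
    launcher_device_data: dict = field(default_factory=dict)  # e.g. model, barrel length

record = ProjectileTrajectoryData(
    distance_to_target_yd=300.0,
    gps_coordinates=(44.97, -93.26),
    horizontal_doping_moa=0.5,
    vertical_doping_moa=6.25,
)
```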
  • a method for projectile targeting for monitoring, logging, assessing or adjusting performance of a launcher device or a projectile launched from the launcher device.
  • the method comprises receiving projectile trajectory data from a communication device, analyzing the projectile trajectory data, determining an optimal trajectory for the projectile to a target, predicting an actual trajectory for the projectile to the target, comparing the optimal and predicted actual trajectories for the projectile, generating correction parameters, and sending the correction parameters to the communication device to set or adjust viewfinder settings.
  • the analyzing the projectile trajectory data can comprise analyzing historical performance data for the projectile or launcher device; and/or the projectile trajectory data can comprise a distance from the launcher device to the target; and/or the projectile trajectory data can comprise global positioning system (GPS) coordinates; and/or the projectile trajectory data can comprise a horizontal doping setting; and/or the projectile trajectory data can comprise a vertical doping setting; and/or the projectile trajectory data can comprise projectile data; and/or the projectile trajectory data can comprise launcher device data; and/or the viewfinder settings can comprise riflescope doping settings.
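The claimed steps (receive, analyze, determine optimal, predict actual, compare, generate corrections) can be sketched as a short pipeline. This is a minimal sketch under the assumption that a trajectory at a fixed target distance reduces to a single drop value in inches; the helper names are invented, not taken from the disclosure.

```python
def targeting_pipeline(trajectory_data, predict_optimal, predict_actual):
    """Run the claimed steps: determine the optimal trajectory, predict the
    actual one, compare them, and emit a correction parameter (illustrative units)."""
    # Determine the optimal trajectory (desired point of impact, inches of drop).
    optimal_drop = predict_optimal(trajectory_data)
    # Predict the actual trajectory under current viewfinder settings.
    actual_drop = predict_actual(trajectory_data)
    # Compare the two; the difference is the correction to send back.
    correction_in = optimal_drop - actual_drop
    return {"elevation_correction_in": correction_in}

# Example with trivial stand-in models:
corr = targeting_pipeline(
    {"distance_yd": 300},
    predict_optimal=lambda d: 0.0,    # zeroed: no drop at the aimpoint
    predict_actual=lambda d: -9.0,    # predicted 9 inches low at 300 yd
)
```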
  • a non-transitory computer-readable storage medium storing computer program instructions for projectile targeting, including monitoring, logging, assessing or adjusting performance of a launcher device or a projectile launched from the launcher device.
  • the program instructions comprise the steps of receiving projectile trajectory data from a communication device, analyzing the projectile trajectory data, determining an optimal trajectory for the projectile to a target, predicting an actual trajectory for the projectile to the target, comparing the optimal and predicted actual trajectories for the projectile, generating correction parameters, and sending the correction parameters to the communication device to set or adjust viewfinder settings.
  • the analyzing the projectile trajectory data can comprise analyzing historical performance data for the projectile or launcher device.
  • FIG. 1 shows a nonlimiting embodiment of a projectile targeting system.
  • FIG. 2 shows a nonlimiting embodiment of a projectile assessment and adjustment (PAAA) system.
  • FIG. 3 shows a nonlimiting embodiment of a process for monitoring, logging or assessing performance of a launcher device or projectile.
  • FIG. 4 shows a nonlimiting embodiment of a process for adjusting or setting performance of a launcher device.
  • FIG. 5 shows a nonlimiting embodiment of a viewfinder system.
  • Launcher devices such as, for example, guns, pistols, rifles, bows, crossbows, compound bows, or any other device capable of launching a projectile, can be used to launch a projectile at a target that is located at a distance from the launcher device.
  • For a projectile to impact a specific point on the target, the launcher device must be positioned so that the point is aligned with and in the trajectory path of the projectile. While the mechanics of targeting the projectile and hitting the target are simple in concept, in reality these processes can be extraordinarily difficult, so much so that even the most skilled marksmen can miss the target.
  • Projectile accuracy is affected significantly by factors such as, for example, user experience, user skill level, condition or characteristics of the launcher device, condition or characteristics of the projectile, or ambient conditions such as, for example, wind speed, wind direction, temperature, precipitation, humidity, or any changes in the foregoing with respect to time.
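Among the listed ambient factors, crosswind deflection has a well-known first-order estimate: drift equals crosswind speed multiplied by the projectile's lag time (actual time of flight minus the drag-free time d/v). The sketch below applies that textbook approximation; it is not a method taken from the disclosure.

```python
def wind_drift_in(distance_yd, muzzle_velocity_fps, time_of_flight_s, crosswind_mph):
    """First-order crosswind drift in inches: crosswind speed x lag time."""
    distance_ft = distance_yd * 3.0
    vacuum_time_s = distance_ft / muzzle_velocity_fps   # time of flight with no drag
    lag_time_s = time_of_flight_s - vacuum_time_s       # extra time caused by drag
    crosswind_fps = crosswind_mph * 5280.0 / 3600.0     # mph -> ft/s
    return crosswind_fps * lag_time_s * 12.0            # ft -> inches

# A 10 mph full-value crosswind at 300 yd (time of flight assumed measured):
drift = wind_drift_in(300, 2700, 0.37, 10.0)   # roughly 6.5 inches of drift
```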
  • FIG. 1 shows a nonlimiting embodiment of a projectile targeting (PT) system 1 , according to the principles of the disclosure.
  • the PT system 1 can include a target 10 , a launcher device 20 , a projectile 30 , and a viewfinder 40 .
  • the PT system 1 can include a communication device (CD) 50 .
  • the PT system 1 can include a projectile assessment and adjustment (PAAA) system 60 .
  • the communication device 50 can be arranged to interact with the viewfinder 40 via a communication link.
  • the communication device 50 can exchange data and instruction signals with the viewfinder 40 over the communication link.
  • the launcher device 20 includes a bow, the projectile 30 includes an arrow, and the communication device 50 includes a mobile cellular telephone.
  • the launcher device 20 includes a firearm (for example, a rifle or a pistol) and the projectile includes ammunition.
  • the PT system 1 can be arranged for ranging, tracking or projectile compensation of the launcher device 20 based on, for example, data from previous projectile launch events or predetermined projectile and/or launcher device performance data.
  • the viewfinder 40 can be configured to range the target 10 , and the communication device 50 can be arranged to, once the target 10 is ranged, display information based on projectile performance or launcher device performance that can aid in range compensation.
  • the PT system 1 can track the target with, for example, thermal imaging technology providing for geolocation. All of the aforementioned imagery can be captured via an imaging sensor (not shown) or camera (not shown) in the viewfinder 40 that is capable of visible light, infrared, or thermal imagery.
  • the communication device 50 can be arranged to interact with the PAAA system 60 via a communication link, which can include a cellular site 70 and/or a network 80 .
  • the PAAA system 60 can be located in the network 80 or outside the network 80 .
  • the PAAA system 60 can be accessible through the network 80 (for example, the Internet or a local area network (LAN)) or directly through a communication link.
  • the PAAA system 60 can exchange data and instruction signals with the communication device 50 over the communication link(s).
  • the PAAA system 60 can be included in the communication device 50 .
  • the target 10 can include any object or animal.
  • the object can include, for example, a bullseye target (for example, seen in FIG. 1 ), a bottle, a can or any other object suitable for target practice; the animal can include, for example, a bear, coyote, deer, elk, rabbit, squirrel, or any other game animal.
  • the launcher device 20 can include any device that can hold, support, aim, direct, guide or launch the projectile 30 .
  • the launcher device 20 can include, for example, a bow, a compound bow, a crossbow, a gun, a pistol, or a rifle; and, the projectile 30 can include, for example, an arrow, a bullet, a cartridge, or any object that can be launched from the launcher device 20 .
  • the viewfinder 40 can include, for example, a laser rangefinder, a ballistic smart laser rangefinder, a smart rifle scope, or a smart high definition (HD) thermal rifle scope (for example, ATN ThOR smart thermal scope, ATN OTS-HD 640 5-50x, Leupold LTO Tracker HD 174906, LTO Tracker HD Thermal Viewer).
  • the viewfinder 40 can include an electronic viewfinder system 40 A (shown in FIG. 5 ).
  • the viewfinder 40 can be arranged to view and capture images of the target 10 in a field-of-view (FOV) in real-time.
  • the viewfinder 40 can be arranged to measure a distance from the viewfinder 40 to a point on each object in the FOV, including one or more points on the target 10 when the target is in the FOV.
  • the viewfinder 40 can be arranged to calculate a trajectory of the projectile 30 from the launcher device 20 to an aimpoint in the FOV, such as, for example, a dot or an intersection point of the horizontal and vertical lines in a crosshair reticle in an aiming indicator in the viewfinder 40 .
  • the launcher device 20 can be manipulated and maneuvered until the aimpoint overlays a selected point or location (“target point”) on the target 10 to be impacted by the projectile 30 .
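At its simplest, predicting where the projectile will sit relative to the aimpoint is a gravity-drop calculation over the time of flight. The drag-free sketch below illustrates the idea only; real ballistic solvers integrate drag and are considerably more involved.

```python
def vacuum_drop_in(distance_yd, muzzle_velocity_fps):
    """Gravity drop (inches) over the flight distance, ignoring drag."""
    g = 32.174                                        # gravitational acceleration, ft/s^2
    t = (distance_yd * 3.0) / muzzle_velocity_fps     # drag-free time of flight, s
    return 0.5 * g * t * t * 12.0                     # 1/2 g t^2, converted ft -> inches

# A 2,700 ft/s projectile bore-sighted flat lands about 21.4 inches low at 300 yd:
drop = vacuum_drop_in(300, 2700)
```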
  • the viewfinder 40 can be arranged to detect, measure, monitor and log ambient conditions surrounding the viewfinder 40 or the target 10 .
  • the viewfinder 40 can be arranged to communicate with one or more sensors (for example, sensor 40 - 3 , shown in FIG. 5 ) to receive sensor data.
  • the sensors can be located on, in, or proximate to the viewfinder 40 , such as, for example, on, in or near the launcher device 20 , on, in or near the target 10 , or elsewhere in the surrounding environment.
  • the aiming indicator can include, for example, a dot, a reticle, a fine crosshair, a duplex crosshair, a German reticle, a target dot, a mil-dot, a circle, a range finder, an SVD-type reticle, or any other positioning, aiming or measuring indicator that can be used to range and align the trajectory of the projectile 30 with a point on the target 10 , including horizontal, vertical and distance values, such that when the projectile 30 is launched from the launcher device 20 , the projectile 30 will impact the target point that was overlayed by the aimpoint when the projectile 30 was launched.
  • the communication device 50 can include, for example, a smartphone, a tablet, or a portable computing device.
  • the communication device 50 can be arranged to receive an image or a series of images of the FOV in real-time, including the target 10 when it is in the FOV of the viewfinder 40 .
  • the images can be captured by the viewfinder 40 at, for example, 25, 30, 60, 120, or more frames-per-second (fps), or any other frame rate, depending on the application.
  • the viewfinder 40 can be configured with an image capture device (for example, high speed camera) capable of capturing 30,000 fps, or more, allowing for frame-by-frame imaging of projectiles such as small arms bullets as they travel at speeds greater than, for example, 3,000 feet-per-second.
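These frame rates determine how far a projectile travels between consecutive frames, which is what makes frame-by-frame tracking feasible: at 30,000 fps, a 3,000 feet-per-second bullet advances only 1.2 inches per frame.

```python
def travel_per_frame_in(velocity_fps, frame_rate_fps):
    """Distance (inches) a projectile moves between consecutive captured frames."""
    return velocity_fps * 12.0 / frame_rate_fps

per_frame = travel_per_frame_in(3000, 30000)   # 1.2 inches per frame at 30,000 fps
```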
  • the communication device 50 can be arranged to transmit data and instruction signals to the viewfinder 40 , including, for example, viewfinder correction parameters to adjust one or more viewfinder settings, such as, for example, an adjustment to a diopter setting, a magnification setting, an illumination setting, an elevation setting, a windage (or horizontal) setting or a parallax setting of the viewfinder 40 .
  • the viewfinder settings can be adjusted to match and align the actual trajectory of the projectile 30 to a predicted optimal trajectory of the projectile 30 , such that the projectile 30 will impact the target point when the aimpoint overlays the target point.
  • the viewfinder 40 (or communication device 50 ) can monitor or store image data, viewfinder settings, ambient condition data, time data and positional data, including, geographic location data such as GPS coordinates and directional orientation, which can be transmitted to the communication device 50 (or PAAA system 60 ) over one or more communication links.
  • FIG. 2 shows a nonlimiting embodiment of the PAAA system 60 , constructed according to the principles of the disclosure.
  • the PAAA system 60 can include one or more computing devices or one or more computer resources.
  • the PAAA system 60 can be provided separate from the communication device 50 , such as, for example, in a server located in a cloud network or on the Internet, or it can be included in the communication device 50 .
  • the PAAA system 60 can be included in the viewfinder 40 .
  • the PAAA system 60 can include a graphic processor unit (GPU) 110 , a storage 120 , a network interface 130 , an input-output (I/O) interface 140 , a user profile manager 150 , a database 160 , a projectile launcher doping (PLD) unit 170 , and a user dashboard generator 180 .
  • the PAAA system 60 can receive a communication signal from the viewfinder 40 or the communication device 50 (shown in FIG. 1 ), demodulate the communication signal and separate or parse out launcher device data, projectile data, target data, timestamp data, user data, user request data, positional data, or ambient condition data.
  • the PAAA system 60 can generate and transmit to the communication device 50 (or viewfinder 40 ) user dashboard rendering instructions and data, launcher device data, projectile data, or viewfinder correction parameter data.
  • the viewfinder correction parameter data can include viewfinder correction parameters to adjust one or more viewfinder settings, including, for example, a minute-of-angle (MOA) doping setting, a magnification setting, an illumination setting, an elevation setting, a windage (or horizontal) setting or a parallax setting.
  • the viewfinder correction parameter data can cause the viewfinder 40 to adjust the viewfinder settings automatically, or they can be rendered on the communication device 50 to allow the user to manually adjust the viewfinder settings.
  • the launcher device data can include key specifications for the launcher device 20 .
  • the launcher device data can include original equipment manufacturer (OEM) identification, gun type, gun model, year of manufacture of gun, caliber, action type, capacity, barrel length, barrel material, barrel style, stock type, length of pull, overall length, overall weight, manufacturer serial number, or UPC code.
  • the launcher device data can include, for example, draw weight, weight, length, width, arrow length, arrow velocity, power-stroke dimensions, speed, axle-to-axle, or brace height.
  • the projectile data can include key specifications for the projectile 30 .
  • the projectile data can include an OEM identification, ammunition type, model, year of manufacture, cartridge type or size, bullet type or size, bullet weight, caliber, average weight-to-caliber ratio, muzzle velocity, trajectory, drop rate (e.g., inches/yard), or UPC code.
  • the projectile data can include, for example, OEM, type, model, shaft length, total length, diameter, material, tip type, fletching type, spine, spine concentricity, straightness, or weight.
  • the target data can include, for example, a timestamp, a distance from the viewfinder 40 (shown in FIG. 1 ) to the target 10 , dimensions of the target 10 (for example, width, height, length), or target type, such as, for example, a species type of animal.
  • the PAAA system 60 can be arranged to query, retrieve, or download launcher device data or projectile data from the database 160 or an external source (not shown) such as, for example, an OEM server (not shown).
  • the PAAA system 60 can be arranged to query or retrieve historical data for the launcher device 20 or projectile 30 (shown in FIG. 1 ).
  • the PAAA system 60 can be arranged to query or retrieve historical data for the user, including, for example, historical performance with the launcher device 20 or projectile 30 .
  • the PAAA system 60 can be arranged to receive launcher device data or projectile data from the communication device 50 (or viewfinder 40 ).
  • the PAAA system 60 can be arranged to analyze historical data for the launcher device 20 , projectile 30 or user and predict an actual trajectory for the projectile 30 based on the target data when it is launched by the launcher device 20 under operation by the user.
  • the PAAA system 60 can be arranged to determine (or predict) an optimal trajectory for the projectile 30 based on the target data and current viewfinder settings, such that the projectile 30 will impact the target point when the aimpoint is overlayed on the target point and the projectile 30 launched by the launcher device 20 .
  • the PAAA system 60 can be arranged to calculate viewfinder correction parameters for the viewfinder settings to adjust the settings from their current settings to adjusted viewfinder settings necessary to adjust the actual projectile trajectory to match the predicted optimal projectile trajectory.
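At a single ranged distance, comparing the optimal and predicted actual trajectories reduces to differencing the predicted impact offsets. The sketch below assumes impacts expressed as (horizontal, vertical) offsets in inches from the target point; the names and sign conventions are illustrative, not the patent's.

```python
def correction_offsets_in(optimal_impact, predicted_impact):
    """Difference between optimal and predicted impact points (inches).
    Positive vertical = raise the point of impact; positive horizontal = move right."""
    dx = optimal_impact[0] - predicted_impact[0]
    dy = optimal_impact[1] - predicted_impact[1]
    return dx, dy

# Predicted impact 2 inches left and 7 inches low of the desired target point:
dx, dy = correction_offsets_in((0.0, 0.0), (-2.0, -7.0))
```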
  • the PAAA system 60 can predict an actual trajectory for the projectile 30 from the launcher device 20 to the target 10 when the projectile 30 is launched by the user.
  • the PAAA system 60 can determine current, real-time settings for the viewfinder 40 , such as, for example, diopter, magnification, illumination, elevation, windage, and parallax.
  • the PAAA system 60 can determine an optimal trajectory for the projectile 30 , compare the optimal and actual trajectories, and generate viewfinder correction parameters to align the actual trajectory with the optimal trajectory.
  • the viewfinder correction parameters can include, for example, horizontal and/or vertical doping instructions that can be used to set or adjust the minutes-of-angle (MOAs) on the viewfinder 40 so that the actual projectile trajectory path impacts the target point on the target 10 when the aimpoint overlays the target point and the launcher device 20 launches the projectile 30 .
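Converting a linear correction at the ranged distance into MOA doping settings uses the standard subtension of 1 MOA, approximately 1.047 inches per 100 yards. The sketch below additionally assumes a 1/4 MOA click value, which is common on riflescopes but not stated in the disclosure.

```python
def doping_moa(correction_in, distance_yd, click_moa=0.25):
    """Convert a linear correction (inches at distance) into MOA and scope clicks."""
    inches_per_moa = 1.047 * (distance_yd / 100.0)   # 1 MOA subtends ~1.047 in per 100 yd
    moa = correction_in / inches_per_moa
    clicks = round(moa / click_moa)                   # nearest whole click
    return moa, clicks

# A 9-inch elevation correction at 300 yd works out to about 2.87 MOA, 11 clicks:
moa, clicks = doping_moa(9.0, 300)
```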
  • the PAAA system 60 can include a computer-readable medium that can hold executable or interpretable computer code (or instructions) that, when executed by one or more of the components (for example, the GPU 110 ), cause the steps, processes and methods described in this disclosure to be carried out.
  • the computer-readable medium can be included in the storage 120 , or an external computer-readable medium connected to the PAAA system 60 via the network interface 130 or the I/O interface 140 .
  • the GPU 110 can include any of various commercially available graphic processors, processors, microprocessors or multi-processor architectures.
  • the GPU 110 can include a plurality of GPUs that can execute computer program instructions in parallel.
  • the GPU 110 can include a central processing unit (CPU) or a plurality of CPUs arranged to function in parallel.
  • the storage 120 can store a basic input/output system (BIOS), which can contain the basic routines that help to transfer information between computing resources within the PAAA system 60 , such as during start-up.
  • the storage 120 can include a read-only memory (ROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), a random-access memory (RAM), a non-volatile random-access memory (NVRAM), a dynamic random-access memory (DRAM), a synchronous dynamic random-access memory (SDRAM), a static random-access memory (SRAM), a burst buffer (BB), or any other device that can store digital data and computer executable instructions or code.
  • a variety of program modules can be stored in the storage 120 , including an operating system (not shown), one or more application programs (not shown), application program interfaces (APIs) (not shown), program modules (not shown), or program data (not shown). Any (or all) of the operating system, application programs, APIs, program modules, or program data can be cached in the storage 120 as executable sections of computer code.
  • the network interface 130 can be arranged to connect to the network 80 or one or more external networks (not shown).
  • the network interface 130 can include a wired or a wireless communication network interface (not shown) or a modem (not shown).
  • when communicating in a LAN, the PAAA system 60 can be connected to the LAN through the wired or wireless communication network interface; and, when communicating in a WAN, the PAAA system 60 can be connected to the WAN through the modem.
  • the modem (not shown) can be internal or external and wired or wireless.
  • the modem can be connected to the backbone B via, for example, a serial port interface (not shown).
  • the I/O interface 140 can be arranged to receive instructions and data from, for example, an operator via a user interface device (not shown), such as, for example, a keyboard (not shown), a mouse (not shown), a pointer (not shown), a microphone (not shown), a speaker (not shown), or a display (not shown).
  • the received commands and data can be forwarded to the GPU 110 , or one or more of the components 120 through 180 as instruction or data signals via the backbone B.
  • the I/O interface 140 can include a transmitter and receiver (transceiver) that can receive a communication signal from an external source, such as, for example, the viewfinder 40 , communication device 50 , or cell site 70 (shown in FIG. 1 ).
  • the network interface 130 can include a data parser (not shown) or the data parsing operation can be carried out by the GPU 110 .
  • Received communication signals from the viewfinder 40 or communication device 50 can be demodulated or depacketized and data can be transferred from the network interface 130 to the GPU 110 , user profile manager 150 , database 160 , launcher device doping unit 170 or dashboard generator 180 .
  • the network interface 130 can facilitate communication between any one or more of the components in the PAAA system 60 and communication devices located internal (or external) to the network 80 .
  • the network interface 130 can handle a variety of communication or data packet formats or protocols, including conversion from one or more communication or data packet formats or protocols used by the viewfinder 40 or communication device 50 to the communication or data packet formats or protocols used in the PAAA system 60 .
  • the user profile manager 150 can include a computing device or it can be included in a computing device as one or more computing resources.
  • the user profile manager 150 can create, manage, edit, or delete a user record for each user, which can include, for example, information for the target 10 , the target point on the target 10 , launcher device 20 , projectile 30 , viewfinder 40 or communication device 50 .
  • the user record can include, for example, a user identification, an email address, a username, a media access control (MAC) address, an Internet Protocol (IP) address, a device serial number, a device name, or any other user or device identification.
  • the user profile manager 150 can communicate with the database 160 to search, retrieve, edit or store user records in the database 160 .
  • the user profile manager 150 can manage and link multiple user profiles to enable group or individual-to-individual sharing of information between viewfinders 40 or communication devices 50 , including performance or position data.
  • the database 160 can include one or more relational databases.
  • the database 160 can store and maintain a record for each user, target 10 , launcher device 20 , projectile 30 , viewfinder 40 , or communication device 50 .
  • the record can include projectile data, launcher device data, user data, viewfinder data (including, for example, diopter, magnification, illumination, elevation, windage, or parallax data), ambient condition data, and timestamp data.
  • the record can include historical performance data for the user, launcher device 20 , projectile 30 or viewfinder 40 .
  • the historical performance data can include ambient conditions data, including, for example, ambient conditions during a launch event for which performance of the user, launcher device 20 , projectile 30 or viewfinder 40 was measured and logged.
  • the projectile launcher doping (PLD) unit 170 can include one or more computing devices or it can be included in a computing device as one or more computing resources.
  • the PLD unit 170 can include a machine learning platform, including, for example, one or more supervised machine learning systems and/or one or more unsupervised machine learning systems.
  • the machine learning platform can include machine intelligence, such as, for example, an artificial neural network (ANN), a convolutional neural network (CNN), a deep CNN (DCNN), a region-based CNN (R-CNN), a Mask-RCNN, a deep convolutional encoder-decoder (DCED), a recurrent neural network (RNN), a neural Turing machine (NTM), a differential neural computer (DNC), a support vector machine (SVM), a deep learning neural network (DLNN), Naive Bayes, decision trees, logistic model tree induction (LMT), NBTree classifier, case-based, linear regression, Q-learning, temporal difference (TD), deep adversarial networks, fuzzy logic, K-nearest neighbor, clustering, random forest, rough set, or any other machine intelligence platform capable of supervised or unsupervised learning.
  • the machine intelligence can include a machine learning model that can be trained based on training datasets and validated based on testing datasets.
  • the training datasets and testing datasets can be created in part or in whole from data received from, for example, the viewfinder 40 or the communication device 50 .
  • the training datasets can be annotated through interaction with the user via the communication device 50 .
  • the training dataset can be created through interaction with the user over time based on numerous (for example, hundreds or more) of launch events, where the user launched the projectile 30 through operation of the launcher device 20 and provided feedback with respect to accuracy of the resultant impacts on the target 10 .
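The launch-event annotation described above can be sketched as follows; the function and field names are hypothetical stand-ins for the disclosure's training-data pipeline:

```python
# Accumulating an annotated training dataset from repeated launch events.
def annotate_launch(settings, distance_yards, impact_offset_in):
    """Pair one launch's conditions with the user-reported offset
    (inches right, inches up) of the impact from the aimpoint."""
    features = {
        "elevation_moa": settings["elevation_moa"],
        "windage_moa": settings["windage_moa"],
        "distance_yards": distance_yards,
    }
    return features, impact_offset_in

dataset = []
# Hundreds or more of these events would be collected over time.
dataset.append(annotate_launch(
    {"elevation_moa": 0.0, "windage_moa": 0.0}, 100, (1.5, -2.0)))
dataset.append(annotate_launch(
    {"elevation_moa": 1.0, "windage_moa": 0.0}, 200, (0.5, -4.5)))
```

Each (features, label) pair corresponds to one launch event and its user-annotated impact offset.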
  • the PLD unit 170 can be arranged to analyze real-time data received from the viewfinder 40 or communication device 50 (shown in FIG. 1 ) and historical data queried or retrieved from the database 160 , including launcher device data, projectile data, target data, user data, user request data, positional data, timestamp data or ambient condition data and predict both an actual and an optimal trajectory of the projectile 30 between the launcher device 20 and target 10 when launched by operation of the user under existing viewfinder 40 settings.
  • the PLD unit 170 can be arranged to generate viewfinder correction parameters (such as, for example, doping settings) for the viewfinder 40 to set or adjust its viewfinder settings such that the actual trajectory of the projectile 30 will be adjusted to impact the target point.
  • the viewfinder correction parameters can be sent directly to the viewfinder 40 or through the communication device 50 .
  • the viewfinder correction parameters can be sent to the communication device 50 and rendered on a display screen, in which case the user can reference the viewfinder correction parameters to manually adjust the viewfinder settings on the viewfinder 40 .
  • the dashboard generator 180 can interact with the user profile manager 150 , database 160 , or PLD unit 170 and generate instructions and data that can be implemented by the communication device 50 to render a dashboard (not shown) on a display of the communication device 50 .
  • the dashboard generator 180 can interact with the communication device 50 to cause the communication device 50 to display a graphic user interface (GUI) that can be operated by the user to enter data or commands, which can be transmitted in response to a user entry on the communication device 50 , such as, for example, selection, addition, deletion or change in user data, projectile data, launcher device data, target data, or ambient condition data.
  • the GUI can be rendered on the display and superimposed on the image received from the viewfinder 40 .
  • the data or commands can be transmitted to the communication device 50 via, for example, the network interface 130 or I/O interface 140 over a communication link, which can include the cell site 70 .
  • FIG. 3 shows a nonlimiting embodiment of a process 200 for monitoring, logging or assessing performance of the launcher device 20 or projectile 30 (shown in FIG. 1 ), according to the principles of the disclosure.
  • the process 200 can be carried out by the communication device 50 .
  • the communication device 50 can exchange data with the viewfinder 40 over a communication link and display a projectile trajectory correction menu and/or an image of the FOV of the viewfinder 40 on the display of the communication device 50 .
  • the communication device 50 can receive current viewfinder settings, such as, for example, diopter settings, elevation setting, windage setting, magnification setting, parallax setting or MOA (minute-of-angle) settings (Step 205 ). Any one or more of the viewfinder settings can be received automatically from the viewfinder 40 over a communication link or entered manually on the GUI by the user on the communication device 50 .
  • the communication device 50 can receive a request for projectile trajectory correction (Step 210 ).
  • the received request can include a selection on the GUI by the user, such as, for example, selecting an item on a dropdown menu or entering instructions into a command field of the GUI.
  • the launcher device data or projectile data can be received from an external data source (not shown) such as, for example, an OEM server.
  • ambient conditions can be determined at the location of the viewfinder 40 or target 10 .
  • the current viewfinder settings, target distance, projectile data, launcher device data or measured ambient conditions can be sent to the PAAA system 60 (Step 230 ).
  • the user can do so by repeatedly launching the projectile 30 from the launcher device 20 , impacting the target with the projectile 30 at varying distances, and assessing the actual point of impact each time compared to the aimpoint.
  • the user can build a training dataset by displaying and annotating images for each launch on the communication device 50 .
  • the image data can include metadata that is automatically added to the image data based on the viewfinder trajectory settings, target distance, projectile data or launcher device data.
  • the training dataset can be received by the PAAA system 60 and used to train or tune the machine learning model for the particular user, target 10 , launcher device 20 , projectile 30 or viewfinder 40 .
  • the launcher device data can include specifications received from an OEM for a hunting rifle and the projectile data can include specifications received from an OEM for ammunition included in the hunting rifle.
  • a process 300 for adjusting or setting performance of a launcher device can be carried out by the PAAA system 60 (Step 235 ).
  • Projectile trajectory corrections can be received from the PAAA system 60 (Step 240 ).
  • where the viewfinder 40 includes the viewfinder system 40 A (shown in FIG. 5 ), the projectile trajectory corrections can be received by the viewfinder 40 directly from the PAAA system 60 or via the communication device 50 .
  • the received viewfinder correction parameters can be used to set the viewfinder settings (Step 245 ) for optimal trajectory of the projectile 30 from the launcher device 20 to the target point on the target 10 .
  • the viewfinder settings can be set manually by the user while referencing the viewfinder correction parameters on the display of the communication device 50 .
  • the user can view the viewfinder correction parameters displayed on the communication device 50 and manually adjust the viewfinder settings (for example, +1 MOA for elevation and +2 MOA for windage).
  • the viewfinder settings can be set automatically by the viewfinder 40 based on the received viewfinder correction parameters.
  • the process 200 can be carried out by the viewfinder 40 .
  • the steps relating to exchanges of data between the viewfinder 40 and communication device 50 can be omitted.
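The overall flow of process 200 might be condensed as follows; each callable is a hypothetical stand-in for a subsystem (viewfinder, OEM data source, PAAA system 60), and step numbers follow FIG. 3:

```python
# Condensed sketch of process 200; each callable stands in for a subsystem.
def process_200(read_settings, range_target, cached_specs, fetch_specs,
                request_correction, apply_settings):
    settings = read_settings()                   # Step 205: current settings
    specs = cached_specs() or fetch_specs()      # Steps 215-220: OEM specs
    distance = range_target()                    # Step 225: target distance
    corrections = request_correction(settings, specs, distance)  # Steps 230-240
    apply_settings(corrections)                  # Step 245: set the viewfinder
    return corrections

applied = {}
out = process_200(
    read_settings=lambda: {"elevation_moa": 0.0, "windage_moa": 0.0},
    range_target=lambda: 150,
    cached_specs=lambda: None,   # not stored yet (NO at Step 215)
    fetch_specs=lambda: {"muzzle_velocity_fps": 2700},
    request_correction=lambda s, sp, d: {"elevation_moa": 1.0, "windage_moa": 2.0},
    apply_settings=applied.update,
)
```

In the manual-adjustment variant, `apply_settings` would instead render the corrections on the communication device 50 for the user.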
  • in a nonlimiting example, the target 10 can include a bullseye target, the launcher device 20 can include a 30-06 REMINGTON SPRINGFIELD hunting rifle, and the projectile 30 can include 180 Grain .30-06 Springfield ammunition.
  • the viewfinder 40 can include a computing device and an image pickup device that can capture and record images of the field of view, including the target 10 when positioned in the field of view.
  • the communication device 50 can include a smartphone. Referring to FIG. 3 , the user can launch an app on the smartphone 50 and confirm the smartphone 50 is paired or linked to the viewfinder 40 .
  • the smartphone 50 can receive current horizontal and vertical doping settings with (or without) image data from the viewfinder 40 (Step 205 ).
  • the user can select “TRAJECTORY CORRECTION” from a dropdown menu or enter a command into the smartphone 50 (Step 210 ), in response to which the smartphone app can determine whether specifications (including performance specifications) are stored for the 30-06 REMINGTON SPRINGFIELD hunting rifle and 180 Grain .30-06 Springfield ammunition (Step 215 ), and, if not (NO at Step 215 ), then the specification data can be loaded or downloaded (Step 220 ).
  • the distance from the viewfinder 40 to the target 10 can be measured and sent to the smartphone 50 , which can determine and display the distance on the display device (Step 225 ).
  • the smartphone 50 can send the projectile trajectory data, including current viewfinder settings (including doping settings), target distance, and any other information that can be used by the PAAA system 60 to determine optimal and actual trajectories for the 180 Grain .30-06 Springfield ammunition (Step 230 ).
  • the smartphone 50 can receive viewfinder correction parameters from the PAAA system 60 , which can be received in real-time (Step 240 ).
  • the smartphone 50 via the app, can send corrected viewfinder settings (including doping settings) or adjustments to current viewfinder settings to the viewfinder 40 (Step 245 ) for an optimal trajectory to the target point on the target 10 .
  • the PAAA system 60 can be included in the smartphone 50 or located remotely such as, for example, in the cloud or network 80 (shown in FIG. 1 ).
  • FIG. 4 shows a nonlimiting embodiment of a process 300 for adjusting or setting performance of the launcher device 20 (shown in FIG. 1 ), according to the principles of the disclosure.
  • the PAAA system 60 can receive a communication signal from the communication device 50 that comprises projectile trajectory data (Step 305 ).
  • the projectile trajectory data can be parsed and, based on the received communication signal, the user profile manager 150 can retrieve, update or create a record for the user (Step 310 ).
  • the PLD unit 170 can analyze the projectile trajectory data, including the projectile data, launcher device data, target data, current viewfinder settings or ambient condition data (Step 315 ) and determine an optimal trajectory (Step 320 ) and an actual trajectory (Step 325 ) for the projectile 30 (shown in FIG. 2 ) from the launcher device 20 to the target 10 .
  • the optimal trajectory can be determined, for example, by analyzing historical data, OEM specifications or ambient conditions for the projectile 30 and launcher device 20 and determining the velocity (for example, yards/second) when the projectile 30 leaves the launcher device 20 , the deceleration rate of the projectile (for example, yards/second²), and the drop rate of the projectile (for example, inches/yard) based on the distance from the launcher device 20 to the target point on the target 10 .
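A minimal first-order sketch of that drop estimate, assuming constant deceleration and neglecting wind, follows; the function names and the sample values are illustrative, not taken from the disclosure:

```python
G_IN_PER_S2 = 386.1  # gravitational acceleration in inches/second^2

def time_of_flight(distance_yd, v0_yd_s, decel_yd_s2):
    """Smaller positive root of distance = v0*t - 0.5*decel*t^2."""
    a, b, c = -0.5 * decel_yd_s2, v0_yd_s, -distance_yd
    disc = b * b - 4 * a * c
    return (-b + disc ** 0.5) / (2 * a)

def drop_inches(distance_yd, v0_yd_s, decel_yd_s2):
    """Gravity drop accumulated over the time of flight."""
    t = time_of_flight(distance_yd, v0_yd_s, decel_yd_s2)
    return 0.5 * G_IN_PER_S2 * t * t

# e.g., 900 yd/s muzzle velocity, 50 yd/s^2 deceleration, 300 yd to target
t300 = time_of_flight(300, 900, 50)
d300 = drop_inches(300, 900, 50)
```

A production ballistic solver would also model drag as a function of velocity and air density rather than a constant deceleration.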
  • the actual projectile trajectory can be determined (or predicted) by the PLD unit 170 by determining external forces and their effects on the projectile 30 as it travels from the launcher device 20 to the target 10 .
  • the external forces can include, for example, forces resulting from gravity, wind, pressure, temperature, precipitation, or user operation, or any changes in the foregoing as a function of time.
  • the PLD unit 170 can interact with the database 160 and analyze historical data for the user, projectile 30 , launcher device 20 , target 10 , viewfinder 40 , geographic location or ambient conditions to predict the magnitudes and vectors of external forces, as well as their effects on the trajectory of the projectile 30 as it travels from the launcher device 20 to the target point, which can include, for example, external forces such as gravity, wind, precipitation, temperature, pressure, or user operation.
  • the PLD unit 170 can predict how much and how fast the trajectory of the projectile 30 will deviate from a targeted point on the target 10 due to the external forces.
  • a user can develop a unique force signature that can affect the trajectory of the projectile 30 .
  • where the launcher device 20 is a 30-06 hunting rifle, the user may consistently demonstrate a 0.1° offset to the right due to an unequal user operation force applied when squeezing the trigger on the rifle.
  • the PLD unit 170 can learn the user force signature, including the user signature associated with the specific launcher device 20 , projectile 30 , or target 10 , and predict a deviation of the projectile trajectory to the target point on the target 10 due to the user operation force.
  • the PLD unit 170 can generate viewfinder correction parameters to counteract the deviation due to the user operation force, as well as other external forces, for a projectile launch event.
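As a greatly simplified stand-in for the learned user force signature, one could average the residual offsets over logged launch events; the disclosure's machine learning platform would generalize far beyond this constant-offset model:

```python
# Averaging residual offsets over logged launch events as a crude
# "signature"; residuals are (degrees right, degrees up) per event.
def user_signature(residuals_deg):
    n = len(residuals_deg)
    dx = sum(r[0] for r in residuals_deg) / n
    dy = sum(r[1] for r in residuals_deg) / n
    return dx, dy

# A shooter who consistently pulls about 0.1 degrees right on the trigger:
sig = user_signature([(0.11, 0.01), (0.09, -0.02), (0.10, 0.01)])
```

The resulting signature can then be subtracted out when generating viewfinder correction parameters.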
  • the PLD unit 170 can be arranged to analyze past performance of the user with the same 30-06 REMINGTON rifle, 180 Grain .30-06 Springfield cartridge, at comparable distances to a target point. Based on the analysis of historical data, the PLD unit 170 can recognize patterns and, based on the analysis of real-time data, predict the likely trajectory of the bullet with respect to the aimpoint, including the drop rate and point of impact on the target 10 .
  • the PLD unit 170 can compare the actual and optimal projectile trajectories (Step 330 ) and generate viewfinder correction parameters to adjust the settings for the viewfinder 40 (shown in FIG. 1 ) and align the actual projectile trajectory with the optimal projectile trajectory so that they overlap and have the same or substantially the same trajectory (Step 335 ).
  • the viewfinder correction parameters can include, for example, horizontal or vertical MOA settings for the viewfinder 40 (shown in FIG. 1 ).
  • the viewfinder correction parameters can be sent directly to the viewfinder 40 or transmitted to the communication device 50 (shown in FIG. 1 ), in which case the communication device 50 can send the viewfinder correction parameters to the viewfinder 40 to cause the viewfinder 40 to update the viewfinder settings (Step 340 ).
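The conversion from an impact miss to horizontal and vertical MOA corrections (Step 335) can be sketched as follows, assuming the common approximation that one MOA subtends about 1.047 inches at 100 yards, and adopting the sign convention (an assumption, not from the disclosure) that positive values mean dialing right/up:

```python
MOA_INCHES_AT_100YD = 1.047  # one minute of angle subtends ~1.047" at 100 yd

def moa_correction(miss_right_in, miss_up_in, distance_yd):
    """Turret corrections (windage, elevation) in MOA that move the
    impact back onto the aimpoint; dial opposite the miss."""
    inches_per_moa = MOA_INCHES_AT_100YD * distance_yd / 100.0
    windage_moa = -miss_right_in / inches_per_moa
    elevation_moa = -miss_up_in / inches_per_moa
    return round(windage_moa, 2), round(elevation_moa, 2)

# A miss 2.09" left and 1.05" low at 100 yards:
w, e = moa_correction(-2.09, -1.05, 100)
```

This illustrative miss yields corrections of about +2 MOA windage and +1 MOA elevation, mirroring the manual-adjustment example given earlier.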
  • the PLD unit 170 can be arranged to generate and send commands (or instructions) and data to the communication device 50 that, when executed by a processor (not shown) in the device, cause the communication device 50 to render a display screen displaying an image of the FOV of the viewfinder 40 and overlaying the image with a software overlay for the aiming indicator (for example, a reticle) and informational display.
  • the commands and data can cause the processor to manage the storage of information from one or more measuring sensors (such as, for example, sensors 40 - 3 , shown in FIG. 5 ), and to pass information wirelessly, including geo-positioning data, imaging data (visible and infrared), distance measurements, and values for angle-compensated ranging.
  • the communication device 50 can be arranged to interact with the viewfinder 40 , which can acquire a target object in the FOV of the viewfinder 40 , and display the FOV, including the target object, on the display screen.
  • the captured image can include visible light, near-infrared (NIR) or longwave infrared (LWIR) wavelengths. This image can be improved optically and digitally to effect zoom.
  • the user can select to range the target in the FOV.
  • the position of the viewfinder 40 and the position of the target 10 can be recorded and geolocated.
  • the user can select the projectile data (for example, ballistic data) most relevant to the launching platform, including the launcher device 20 and projectile 30 .
  • the projectile data can include, for example, data driven by historical data on previous engagements or projectile performance data received from industry sources.
  • the projectile data can be relayed to the user via the communication device 50 so that adjustments can be made either automatically in the case of, for example a smart scope, or manually by the user in the case of a non-connected viewfinder 40 .
  • the viewfinder 40 can be linked to the communication device 50 wirelessly so that the calculations can be performed, and the data stored and/or recorded, by either device. Once the target is engaged, it can be tracked via a thermal imager so the harvest can be completed.
  • the viewfinder 40 includes the viewfinder system 40 A, depicted in FIG. 5 .
  • the viewfinder system 40 A can be included in the communication device 50 .
  • the viewfinder system 40 A includes a processor 40 - 1 , a storage device 40 - 2 , one or more onboard sensors 40 - 3 , a driver suite 40 - 4 , a transceiver 40 - 5 , and an input-output (IO) interface 40 - 6 .
  • the viewfinder system 40 A can include a network interface 40 - 7 , a target tracker 40 - 8 and/or a viewfinder adjuster suite 40 - 9 .
  • the system 40 A can include a bus B, which can be connected to any or all of the components 40 - 1 to 40 - 9 by a communication link.
  • the processor 40 - 1 , storage 40 - 2 , IO interface 40 - 6 , network interface 40 - 7 and bus B can be similar to the GPU 110 , storage 120 , I/O interface 140 , network interface 130 and bus B, respectively, shown in FIG. 2 (discussed above).
  • any one or more of the components 40 - 1 to 40 - 9 in the viewfinder system 40 A can include a computing resource or a computing device.
  • the components can include a computing resource or computing device that is separate from the processor 40 - 1 , as seen in FIG. 5 , or integrated with the processor 40 - 1 .
  • Any of the components can include a computer resource that can be executed on the processor 40 - 1 as one or more processes.
  • the computer resources can be contained in the storage 40 - 2 .
  • the viewfinder 40 is arranged to exchange data and instructions signals with the PAAA system 60 (shown in FIGS. 1 and 2 ).
  • the sensor(s) 40 - 3 can include a global positioning system (GPS) receiver or any device that can accurately determine the geographic location and/or physical orientation of the viewfinder system 40 A in real-time, regardless of whether the viewfinder 40 is stationary or moving.
  • the sensor(s) 40 - 3 can include a rangefinder, a laser rangefinder, a monocular, a binocular, a riflescope, a camera, a display, a three-dimensional (3D) depth camera, an infrared (IR) sensor, a two-dimensional (2D) Light Detection and Ranging (LiDAR) sensor, a 3D LiDAR sensor, an accelerometer, a motion detector, a temperature sensor, a humidity sensor, a precipitation sensor, a wind sensor, an atmospheric pressure sensor, a sound sensor, a light sensor, or any other device capable of detecting or measuring ambient conditions surrounding the viewfinder 40 and/or the target 10 .
  • the camera or 3D camera can be arranged to capture and output image data, with or without 3D point cloud data, of the FOV.
  • the camera can be arranged to capture visible and infrared images, including near visible and thermal wavelengths.
  • the sensor(s) 40 - 3 can be included in or attached to the viewfinder 40 , or fitted or attached to the launcher device 20 to enable sensory reception such as, for example, viewing and image capture of the target 10 .
  • the sensor(s) 40 - 3 can include a computing device, a computer resource, or a suite of computing devices or computer resources.
  • the sensor(s) 40 - 3 can be arranged to capture images in real-time of the FOV, including all objects in the FOV, measure distances to one or more points on each object in the FOV (including the target point overlaid by the aimpoint), and, in certain embodiments, measure ambient conditions surrounding the viewfinder or target.
  • the sensor(s) 40 - 3 can be arranged to measure changes and rates of change in motion of the aimpoint (including magnitude and directional vectors) as a function of time, as well as any changes or rates of change in, for example, temperature, humidity, precipitation, pressure, wind, sound, or light.
  • One or more of the sensors 40 - 3 can be provided separate from the viewfinder 40 .
  • one or more of the sensors 40 - 3 can be provided on the communication device 50 , on the launcher device 20 , or elsewhere in the surrounding environment.
  • the sensor(s) 40 - 3 can exchange data and instructions signals with the viewfinder 40 or communication device 50 over one or more communication links.
  • the driver suite 40 - 4 can include a plurality of drivers, including a driver 40 - 4 B for each sensor 40 - 3 .
  • the driver suite 40 - 4 can include a video driver 40 - 4 A to drive the display device (not shown) and the sensor driver 40 - 4 B to drive, for example, the camera to capture images of the FOV in real-time, including visible and infrared wavelengths.
  • the display device (not shown) can be arranged to display the field of view of the camera.
  • the viewfinder system 40 A can receive ambient condition data for the geographic location of the viewfinder 40 from an external data source, such as, for example, a weather service website (for example, a National Oceanic and Atmospheric Administration (NOAA) National Weather Service website).
  • the received ambient condition data can include, for example, temperature, pressure, humidity, precipitation, wind speed, or wind direction data for the geographic location of the target 10 or viewfinder 40 (or communication device 50 ).
  • the transceiver 40 - 5 can include a transmitter 40 - 5 A and a receiver 40 - 5 B.
  • the transceiver 40 - 5 can be arranged to transmit data and instruction signals between the viewfinder system 40 A and the communication device 50 (shown in FIG. 1 ) over one or more communication links.
  • the viewfinder system 40 A can transmit image data signals to the communication device 50 , which can be displayed by the communication device 50 on a display device.
  • the image data signals can include rendering instructions and data that can be used by the communication device 50 to display an aiming indicator (for example, a crosshair) superimposed on the displayed field of view.
  • the IO interface 40 - 6 can receive or communicate commands and data from or to an operator via a user interface device (not shown), such as, for example, a keyboard, a touch-display, a mouse, a pointer, a microphone, a speaker, or a display.
  • the commands and data can be communicated between any of the components in the viewfinder system 40 A and the user interface as instruction or data signals via the backbone B.
  • the IO interface 40 - 6 is arranged to receive viewfinder settings or viewfinder setting adjustment values entered by the user.
  • the network interface 40 - 7 can be arranged to connect to a network, such as, for example, the network 80 (shown in FIG. 1 ), and interact with the PAAA system 60 .
  • the target tracker 40 - 8 can be arranged to receive sensor data from the sensors 40 - 3 , including image data from the camera (not shown), and calculate a distance value to each point on an object in the FOV, including the aimpoint.
  • the target tracker 40 - 8 can be arranged to receive thermal image data and track the aimpoint until the projectile 30 is launched and impacts the target object, or another object.
  • the target tracker 40 - 8 can include a machine learning system arranged to recognize and track each object in the field of view, including characteristics of the object, such as, for example, color, size, and shape.
  • the sensor data can include visible light image data, IR image data, viewfinder location data corresponding to the real-world geo-positioning, target geo-position data, target distance data, or ambient condition data.
  • the target tracker 40 - 8 can be arranged to predict an actual trajectory for the projectile 30 from the launcher device 20 to the aimpoint.
  • the viewfinder adjuster suite 40 - 9 can include one or more computing devices or computing resources, including a diopter adjuster 40 - 9 A, elevation adjuster 40 - 9 B, horizontal (or windage) adjuster 40 - 9 C, illumination adjuster 40 - 9 D or parallax adjuster 40 - 9 E.
  • the diopter adjuster 40 - 9 A can be arranged to adjust the diopter settings of the viewfinder based on a diopter setting in the received viewfinder correction parameters.
  • the diopter adjuster 40 - 9 A can adjust an optical system (not shown) for optimal focus of an object in the FOV of the viewfinder 40 .
  • the elevation adjuster 40 - 9 B, horizontal adjuster 40 - 9 C, illumination adjuster 40 - 9 D and parallax adjuster 40 - 9 E can be arranged to adjust elevation settings, windage settings, illumination settings and parallax settings, respectively, of the viewfinder 40 based on respective parameter values in the received viewfinder correction parameters.
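A hypothetical dispatch sketch for the adjuster suite 40-9, routing each received correction parameter to its matching adjuster, might look like the following; the class and key names are illustrative assumptions:

```python
# Routing received correction parameters to the matching adjusters.
class ViewfinderAdjusterSuite:
    ADJUSTERS = ("diopter", "elevation", "windage", "illumination", "parallax")

    def __init__(self):
        self.settings = {}

    def apply(self, correction_parameters):
        # Only keys with a matching adjuster are applied; others are ignored.
        for key in self.ADJUSTERS:
            if key in correction_parameters:
                self.settings[key] = correction_parameters[key]
        return self.settings

suite = ViewfinderAdjusterSuite()
suite.apply({"elevation": 1.0, "windage": 2.0, "unknown": 9.9})
```

In an automated viewfinder each key would drive the corresponding hardware adjuster rather than a settings dictionary.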
  • the viewfinder 40 shown in FIG. 1
  • the viewfinder 40 will be set for the particular user, the particular launcher device 20 and the particular projectile 30 such that, when the projectile 30 is launched and the aimpoint is overlaid atop the target point, the projectile 30 will repeatedly and consistently impact the target point.
  • the term “backbone,” as used in this disclosure, means a transmission medium that interconnects one or more computing resources to provide a path that conveys data signals and instruction signals between the one or more computing resources.
  • the backbone can include a bus or a network.
  • the backbone can include an Ethernet TCP/IP network.
  • the backbone can include a distributed backbone, a collapsed backbone, a parallel backbone or a serial backbone.
  • the backbone can include any of several types of bus structures that can further interconnect to a memory bus (with or without a memory controller), a peripheral bus, and a local bus using any of a variety of commercially available bus architectures.
  • the term “communication device,” as used in this disclosure, means any hardware, firmware, or software that can transmit or receive data packets, instruction signals, data signals or radio frequency signals over a communication link.
  • the communication device can include a computer or a server.
  • the communication device can be portable or stationary.
  • the term “communication link,” as used in this disclosure, means a wired or wireless medium that conveys data or information between at least two points.
  • the wired or wireless medium can include, for example, a metallic conductor link, a radio frequency (RF) communication link, an Infrared (IR) communication link, or an optical communication link.
  • the RF communication link can include, for example, WiFi, WiMAX, IEEE 802.11, DECT, 0G, 1G, 2G, 3G, 4G, or 5G cellular standards, or Bluetooth.
  • a communication link can include, for example, an RS-232, RS-422, RS-485, or any other suitable serial interface.
  • the term "computing device," as used in this disclosure, means any machine, device, circuit, component, or module, or any system of machines, devices, circuits, components, or modules which are capable of manipulating data according to one or more instructions, such as, for example, without limitation, a processor, a microprocessor, a graphics processing unit, a central processing unit, a general purpose computer, a super computer, a personal computer, a laptop computer, a palmtop computer, a notebook computer, a desktop computer, a workstation computer, a server, a server farm, a computer cloud, or an array of processors, microprocessors, central processing units, general purpose computers, super computers, personal computers, laptop computers, palmtop computers, notebook computers, desktop computers, workstation computers, or servers.
  • the term "computing resource," as used in this disclosure, means software, a software application, a web application, a web page, a computer application, a computer program, computer code, machine executable instructions, firmware, or a process that can be arranged to execute on a computing device as one or more processes.
  • Non-volatile media can include, for example, optical or magnetic disks and other persistent memory.
  • Volatile media can include dynamic random access memory (DRAM).
  • Computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, any other magnetic medium, a CD-ROM, DVD, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EEPROM, any other memory chip or cartridge, a carrier wave, or any other medium from which a computer can read.
  • the computer-readable medium can include a “Cloud,” which includes a distribution of files across multiple (for example, thousands of) memory caches on multiple (for example, thousands of) computers.
  • sequences of instructions (i) can be delivered from a RAM to a processor, (ii) can be carried over a wireless transmission medium, or (iii) can be formatted according to numerous formats, standards or protocols, including, for example, WiFi, WiMAX, IEEE 802.11, DECT, 0G, 1G, 2G, 3G, 4G, or 5G cellular standards, or Bluetooth.
  • the term “database,” as used in this disclosure, means any combination of software or hardware, including at least one application or at least one computer.
  • the database can include a structured collection of records or data organized according to a database model, such as, for example, but not limited to at least one of a relational model, a hierarchical model, or a network model.
  • the database can include a database management system application (DBMS) as is known in the art.
  • the at least one application may include, but is not limited to, for example, an application program that can accept connections to service requests from clients by sending back responses to the clients.
  • the database can be configured to run the at least one application, often under heavy workloads, unattended, for extended periods of time with minimal human direction.
  • the term "network," as used in this disclosure, means, but is not limited to, for example, at least one of a personal area network (PAN), a local area network (LAN), a wireless local area network (WLAN), a campus area network (CAN), a metropolitan area network (MAN), a wide area network (WAN), a global area network (GAN), a broadband area network (BAN), a cellular network, a storage-area network (SAN), a system-area network, a passive optical local area network (POLAN), an enterprise private network (EPN), a virtual private network (VPN), the Internet, or the like, or any combination of the foregoing, any of which can be configured to communicate data via a wireless and/or a wired communication medium.
  • These networks can run a variety of protocols, including, but not limited to, for example, Ethernet, IP, IPX, TCP, UDP, SPX, IRC, HTTP, FTP, Telnet, SMTP, DNS, A
  • the term "server," as used in this disclosure, means any combination of software or hardware, including at least one application or at least one computer to perform services for connected clients as part of a client-server architecture, server-server architecture or client-client architecture.
  • a server can include a mainframe or a server cloud or server farm.
  • the at least one server application can include, but is not limited to, for example, an application program that can accept connections to service requests from clients by sending back responses to the clients.
  • the server can be configured to run the at least one application, often under heavy workloads, unattended, for extended periods of time with minimal human direction.
  • the server can include a plurality of computers configured, with the at least one application being divided among the computers depending upon the workload. For example, under light loading, the at least one application can run on a single computer. However, under heavy loading, multiple computers can be required to run the at least one application.
  • the server, or any of its computers, can also be used as a workstation.
  • the terms “send,” “sent,” “transmission,” “transmit,” “communication,” “communicate,” “connection,” or “connect,” as used in this disclosure, include the conveyance of data, data packets, computer instructions, or any other digital or analog information via electricity, acoustic waves, light waves or other electromagnetic emissions, such as those generated with communications in the radio frequency (RF), or infrared (IR) spectra.
  • Transmission media for such transmissions can include subatomic particles, atomic particles, molecules (in gas, liquid, or solid form), space, or physical articles such as, for example, coaxial cables, copper wire and fiber optics, including the wires that comprise a system bus coupled to the processor.
  • Devices that are in communication with each other need not be in continuous communication with each other unless expressly specified otherwise.
  • devices that are in communication with each other may communicate directly or indirectly through one or more intermediaries.
  • Although process steps, method steps, or algorithms may be described in a sequential or a parallel order, such processes, methods and algorithms may be configured to work in alternate orders.
  • any sequence or order of steps that may be described in a sequential order does not necessarily indicate a requirement that the steps be performed in that order; some steps may be performed simultaneously.
  • Where a sequence or order of steps is described in a parallel (or simultaneous) order, such steps can be performed in a sequential order.
  • the steps of the processes, methods or algorithms described in this specification may be performed in any order practical.
  • one or more process steps, method steps, or algorithms can be omitted or skipped.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Aiming, Guidance, Guns With A Light Source, Armor, Camouflage, And Targets (AREA)

Abstract

A projectile targeting solution is provided that can monitor, log and assess projectile performance of a launcher device or a projectile launched from the launcher device. The projectile targeting solution includes a method, system and computer program for monitoring, logging or assessing performance of the launcher device or projectile and adjusting the launcher device for optimal trajectory to a target based on a projectile performance assessment for the launcher device or the projectile.

Description

    CROSS-REFERENCE TO PRIOR APPLICATION
  • The present application claims the benefit of and priority to provisional U.S. Patent Application Ser. No. 62/946,802, filed on Dec. 11, 2019, titled, “System and Method for Monitoring and Assessing Projectile Performance,” which is hereby incorporated herein by reference in its entirety, as if fully set forth herein.
  • FIELD OF THE DISCLOSURE
  • The disclosure relates generally to a method, system and computer program for monitoring, logging or assessing performance of a launcher device or a projectile launched from the launcher device, and for adjusting the launcher device for optimal trajectory to a target based on a projectile performance assessment for the launcher device or the projectile.
  • BACKGROUND OF THE DISCLOSURE
  • Projectile tracking can be a daunting task. Historically, projectile tracking has relied on human spotters using various optical gear and, in some cases, laser rangefinders. Once a target is ranged, a shooter is offered information, based on projectile performance, that will aid in range compensation. However, human spotters may not be available in certain environments, such as, for example, target practice or hunting. Moreover, human spotters can provide erroneous or inconsistent information, resulting in costly and unnecessary expenditure of ammunition, arrows or other projectiles. An unfulfilled need exists for a solution that can monitor, log or assess performance of a launcher device or a projectile launched from the launcher device, and adjust the launcher device for optimal trajectory to a target based on a projectile performance assessment for the launcher device or the projectile.
  • SUMMARY OF THE DISCLOSURE
  • A projectile targeting solution is provided that can monitor, log and assess projectile performance of a launcher device or a projectile launched from the launcher device. The projectile targeting solution includes a method, system and computer program for monitoring, logging or assessing performance of the launcher device or projectile and adjusting the launcher device for optimal trajectory to a target based on a projectile performance assessment for the launcher device or the projectile.
  • According to a nonlimiting embodiment of the disclosure, a projectile targeting system is provided for monitoring, logging, assessing or adjusting performance of a launcher device or a projectile launched from the launcher device. The system comprises a transceiver arranged to receive projectile trajectory data from a communication device, and a projectile launcher doping unit. The projectile launcher doping unit can be arranged to analyze the projectile trajectory data, determine an optimal trajectory for the projectile from the launcher device to a target, predict an actual trajectory for the projectile from the launcher device to the target, compare the optimal and predicted actual trajectories for the projectile, and generate correction parameters based on a result of the comparison. The transceiver can be arranged to transmit the correction parameters to the communication device to set or adjust viewfinder settings. In the system: the projectile trajectory data can comprise a distance from the launcher device to the target; and/or the projectile trajectory data can comprise global positioning system (GPS) coordinates; and/or the projectile trajectory data can comprise a horizontal doping setting; and/or the projectile trajectory data can comprise a vertical doping setting; and/or the projectile trajectory data can comprise projectile data; and/or the projectile trajectory data can comprise launcher device data; and/or the viewfinder settings can comprise riflescope doping settings.
  • According to another nonlimiting embodiment of the disclosure, a method for projectile targeting is provided for monitoring, logging, assessing or adjusting performance of a launcher device or a projectile launched from the launcher device. The method comprises receiving projectile trajectory data from a communication device, analyzing the projectile trajectory data, determining an optimal trajectory for the projectile to a target, predicting an actual trajectory for the projectile to the target, comparing the optimal and predicted actual trajectories for the projectile, generating correction parameters, and sending the correction parameters to the communication device to set or adjust viewfinder settings. In the method: the analyzing the projectile trajectory data can comprise analyzing historical performance data for the projectile or launcher device; and/or the projectile trajectory data can comprise a distance from the launcher device to the target; and/or the projectile trajectory data can comprise global positioning system (GPS) coordinates; and/or the projectile trajectory data can comprise a horizontal doping setting; and/or the projectile trajectory data can comprise a vertical doping setting; and/or the projectile trajectory data can comprise projectile data; and/or the projectile trajectory data can comprise launcher device data; and/or the viewfinder settings can comprise riflescope doping settings.
  • According to another nonlimiting embodiment of the disclosure, a non-transitory computer-readable storage medium is provided, storing computer program instructions for projectile targeting, including monitoring, logging, assessing or adjusting performance of a launcher device or a projectile launched from the launcher device. The program instructions comprise the steps of receiving projectile trajectory data from a communication device, analyzing the projectile trajectory data, determining an optimal trajectory for the projectile to a target, predicting an actual trajectory for the projectile to the target, comparing the optimal and predicted actual trajectories for the projectile, generating correction parameters, and sending the correction parameters to the communication device to set or adjust viewfinder settings. The analyzing the projectile trajectory data can comprise analyzing historical performance data for the projectile or launcher device.
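The sequence of steps recited in the embodiments above (receiving trajectory data, determining an optimal trajectory, predicting an actual trajectory, comparing the two, and generating correction parameters) can be sketched in outline as follows. The data fields, function names, and the MOA-pair representation of a trajectory are illustrative assumptions for this sketch, not part of the claimed system; the trajectory models themselves are supplied externally.

```python
from dataclasses import dataclass

@dataclass
class TrajectoryData:
    """Illustrative container for the projectile trajectory data listed above."""
    distance_yd: float            # distance from launcher device to target
    gps: tuple                    # (latitude, longitude) coordinates
    horizontal_doping_moa: float  # current horizontal (windage) doping setting
    vertical_doping_moa: float    # current vertical (elevation) doping setting

def generate_correction_parameters(data, predict_actual, determine_optimal):
    """Compare the predicted actual and optimal trajectories and return the
    doping corrections (in MOA) needed to align one with the other."""
    actual = predict_actual(data)       # (horizontal_moa, vertical_moa)
    optimal = determine_optimal(data)   # (horizontal_moa, vertical_moa)
    return {
        "windage_moa": optimal[0] - actual[0],
        "elevation_moa": optimal[1] - actual[1],
    }
```

The resulting correction parameters would then be transmitted to the communication device to set or adjust the viewfinder settings, as recited above.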
  • Additional features, advantages, and embodiments of the disclosure may be set forth or apparent from consideration of the following detailed description, drawings, and claims. Moreover, it is to be understood that both the foregoing summary of the disclosure and the following detailed description are exemplary and intended to provide further explanation without limiting the scope of the disclosure as claimed.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The accompanying drawings, which are included to provide a further understanding of the disclosure, are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the detailed description serve to explain the principles of the disclosure. No attempt is made to show structural details of the disclosure in more detail than may be necessary for a fundamental understanding of the disclosure and the various ways in which it may be practiced.
  • FIG. 1 shows a nonlimiting embodiment of a projectile targeting system.
  • FIG. 2 shows a nonlimiting embodiment of a projectile assessment and adjustment (PAAA) system.
  • FIG. 3 shows a nonlimiting embodiment of a process for monitoring, logging or assessing performance of a launcher device or projectile.
  • FIG. 4 shows a nonlimiting embodiment of a process for adjusting or setting performance of a launcher device.
  • FIG. 5 shows a nonlimiting embodiment of a viewfinder system.
  • The present disclosure is further described in the detailed description and drawings that follow.
  • DETAILED DESCRIPTION OF THE DISCLOSURE
  • The embodiments of the disclosure and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments and examples that are described or illustrated in the accompanying drawings and detailed in the following description. It should be noted that the features illustrated in the drawings are not necessarily drawn to scale, and features of one embodiment may be employed with other embodiments as the skilled artisan would recognize, even if not explicitly stated. Descriptions of well-known components and processing techniques can be omitted so as to not unnecessarily obscure the embodiments of the disclosure. The examples are intended merely to facilitate an understanding of ways in which the disclosure can be practiced and to further enable those of skill in the art to practice the embodiments of the disclosure. Accordingly, the examples and embodiments should not be construed as limiting the scope of the disclosure, which is defined solely by the appended claims and applicable law. Moreover, it is noted that like reference numerals represent similar parts throughout the several views of the drawings.
  • Launcher devices such as, for example, guns, pistols, rifles, bows, crossbows, compound bows, or any other device capable of launching a projectile, can be used to launch a projectile at a target that is located at a distance from the launcher device. For a projectile to impact a specific point on the target, the launcher device must be positioned so that the point is aligned with and in the trajectory path of the projectile. While the mechanics of targeting the projectile and hitting the target are simple in concept, in reality these processes can be extraordinarily difficult, so much so that even the most-skilled marksmen can miss the target. Projectile accuracy is affected significantly by factors such as, for example, user experience, user skill level, condition or characteristics of the launcher device, condition or characteristics of the projectile, or ambient conditions such as, for example, wind speed, wind direction, temperature, precipitation, humidity, or any changes in the foregoing with respect to time. As a result, hitting a target with a projectile can elude even the most-skilled marksmen. There exists an unfulfilled need for a projectile targeting solution that can optimize target impact and projectile accuracy, regardless of user experience, user skill level, or ambient conditions.
  • FIG. 1 shows a nonlimiting embodiment of a projectile targeting (PT) system 1, according to the principles of the disclosure. The PT system 1 can include a target 10, a launcher device 20, a projectile 30, and a viewfinder 40. The PT system 1 can include a communication device (CD) 50. The PT system 1 can include a projectile assessment and adjustment (PAAA) system 60. The communication device 50 can be arranged to interact with the viewfinder 40 via a communication link. The communication device 50 can exchange data and instruction signals with the viewfinder 40 over the communication link.
  • In the nonlimiting embodiment of the PT system 1 seen in FIG. 1, the launcher device 20 includes a bow, the projectile 30 includes an arrow, and the communication device 50 includes a mobile cellular telephone. In other nonlimiting embodiments, the launcher device 20 includes a firearm (for example, a rifle or a pistol) and the projectile includes ammunition.
  • The PT system 1 can be arranged for ranging, tracking or projectile compensation of the launcher device 20 based on, for example, data from previous projectile launch events or predetermined projectile and/or launcher device performance data. The viewfinder 40 can be configured to range the target 10, and the communication device 50 can be arranged to, once the target 10 is ranged, display information based on projectile performance or launcher device performance that can aid in range compensation. Once the target 10 is engaged, the PT system 1 can track the target with, for example, thermal imaging technology providing for geolocation. All of the aforementioned imagery can be captured via an imaging sensor (not shown) or camera (not shown) in the viewfinder 40 that is capable of visible light, infrared, or thermal imagery.
  • The communication device 50 can be arranged to interact with the PAAA system 60 via a communication link, which can include a cellular site 70 and/or a network 80. The PAAA system 60 can be located in the network 80 or outside the network 80. The PAAA system 60 can be accessible through the network 80 (for example, the Internet or a local area network (LAN)) or directly through a communication link. The PAAA system 60 can exchange data and instruction signals with the communication device 50 over the communication link(s).
  • The PAAA system 60 can be included in the communication device 50.
  • The target 10 can include any object or animal. The object can include, for example, a bullseye target (for example, seen in FIG. 1), a bottle, a can or any other object suitable for target practice; the animal can include, for example, a bear, coyote, deer, elk, rabbit, squirrel, or any other game animal.
  • The launcher device 20 can include any device that can hold, support, aim, direct, guide or launch the projectile 30. The launcher device 20 can include, for example, a bow, a compound bow, a crossbow, a gun, a pistol, or a rifle; and, the projectile 30 can include, for example, an arrow, a bullet, a cartridge, or any object that can be launched from the launcher device 20.
  • The viewfinder 40 can include, for example, a laser rangefinder, a ballistic smart laser rangefinder, a smart rifle scope, or a smart high definition (HD) thermal rifle scope (for example, ATN ThOR smart thermal scope, ATN OTS-HD 640 5-50x, Leupold LTO Tracker HD 174906, LTO Tracker HD Thermal Viewer). In a nonlimiting embodiment, the viewfinder 40 can include an electronic viewfinder system 40A (shown in FIG. 5).
  • The viewfinder 40 can be arranged to view and capture images of the target 10 in a field-of-view (FOV) in real-time. The viewfinder 40 can be arranged to measure a distance from the viewfinder 40 to a point on each object in the FOV, including one or more points on the target 10 when the target is in the FOV. The viewfinder 40 can be arranged to calculate a trajectory of the projectile 30 from the launcher device 20 to an aimpoint in the FOV, such as, for example, a dot or an intersection point of the horizontal and vertical lines in a crosshair reticle in an aiming indicator in the viewfinder 40. The launcher device 20 can be manipulated and maneuvered until the aimpoint overlays a selected point or location (“target point”) on the target 10 to be impacted by the projectile 30.
  • The viewfinder 40 can be arranged to detect, measure, monitor and log ambient conditions surrounding the viewfinder 40 or the target 10. The viewfinder 40 can be arranged to communicate with one or more sensors (for example, sensor 40-3, shown in FIG. 5) to receive sensor data. The sensors can be located on, in, or proximate to the viewfinder 40, such as, for example, on, in or near the launcher device 20, on, in or near the target 10, or elsewhere in the surrounding environment.
  • The aiming indicator (not shown) can include, for example, a dot, a reticle, a fine crosshair, a duplex crosshair, a German reticle, a target dot, a mil-dot, a circle, a range finder, an SVD-type reticle, or any other positioning, aiming or measuring indicator that can be used to range and align the trajectory of the projectile 30 with a point on the target 10, including horizontal, vertical and distance values, such that when the projectile 30 is launched from the launcher device 20, the projectile 30 will impact the target point that was overlayed by the aimpoint when the projectile 30 was launched.
  • The communication device 50 can include, for example, a smartphone, a tablet, or a portable computing device. The communication device 50 can be arranged to receive an image or a series of images of the FOV in real-time, including the target 10 when it is in the FOV of the viewfinder 40. The images can be captured by the viewfinder 40 at, for example, 25, 30, 60, 120, or more frames-per-second (fps), or any other frame rate, depending on the application. For instance, for real-time ballistic imaging, the viewfinder 40 can be configured with an image capture device (for example, high speed camera) capable of capturing 30,000 fps, or more, allowing for frame-by-frame imaging of projectiles such as small arms bullets as they travel at speeds greater than, for example, 3,000 feet-per-second. The communication device 50 can be arranged to transmit data and instruction signals to the viewfinder 40, including, for example, viewfinder correction parameters to adjust one or more viewfinder settings, such as, for example, an adjustment to a diopter setting, a magnification setting, an illumination setting, an elevation setting, a windage (or horizontal) setting or a parallax setting of the viewfinder 40. The viewfinder settings can be adjusted to match and align the actual trajectory of the projectile 30 to a predicted optimal trajectory of the projectile 30, such that the projectile 30 will impact the target point when the aimpoint overlays the target point.
  • The viewfinder 40 (or communication device 50) can monitor or store image data, viewfinder settings, ambient condition data, time data and positional data, including, geographic location data such as GPS coordinates and directional orientation, which can be transmitted to the communication device 50 (or PAAA system 60) over one or more communication links.
  • FIG. 2 shows a nonlimiting embodiment of the PAAA system 60, constructed according to the principles of the disclosure. The PAAA system 60 can include one or more computing devices or one or more computer resources. The PAAA system 60 can be provided separate from the communication device 50, such as, for example, in a server located in a cloud network or on the Internet, or it can be included in the communication device 50. In a nonlimiting embodiment, the PAAA system 60 can be included in the viewfinder 40.
  • The PAAA system 60 can include a graphic processor unit (GPU) 110, a storage 120, a network interface 130, an input-output (I/O) interface 140, a user profile manager 150, a database 160, a projectile launcher doping (PLD) unit 170, and a user dashboard generator 180. The PAAA system 60 can receive a communication signal from the viewfinder 40 or the communication device 50 (shown in FIG. 1), demodulate the communication signal and separate or parse out launcher device data, projectile data, target data, timestamp data, user data, user request data, positional data, or ambient condition data. The PAAA system 60 can generate and transmit to the communication device 50 (or viewfinder 40) user dashboard rendering instructions and data, launcher device data, projectile data, or viewfinder correction parameter data. The viewfinder correction parameter data can include viewfinder correction parameters to adjust one or more viewfinder settings, including, for example, a minute-of-angle (MOA) doping setting, a magnification setting, an illumination setting, an elevation setting, a windage (or horizontal) setting or a parallax setting. The viewfinder correction parameter data can cause the viewfinder 40 to adjust the viewfinder settings automatically, or it can be rendered on the communication device 50 to allow the user to manually adjust the viewfinder settings.
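One way the viewfinder correction parameter data described above could be applied to a set of current viewfinder settings is sketched below. The setting names and delta-based representation are assumptions made for illustration; the disclosure does not fix a data format.

```python
# Hypothetical viewfinder settings record mirroring the settings listed above.
current_settings = {
    "diopter": 0.0,
    "magnification": 5.0,
    "illumination": 3.0,
    "elevation_moa": 2.0,
    "windage_moa": -0.5,
    "parallax_yd": 100.0,
}

def apply_corrections(settings: dict, corrections: dict) -> dict:
    """Return a new settings dict with each correction delta applied.

    Unknown setting names are rejected rather than silently invented, and the
    original settings are left untouched so they can be logged as history.
    """
    adjusted = dict(settings)
    for key, delta in corrections.items():
        if key not in adjusted:
            raise KeyError(f"unknown viewfinder setting: {key}")
        adjusted[key] += delta
    return adjusted
```

In the automatic case recited above, the viewfinder would apply such deltas itself; in the manual case, the deltas would simply be rendered on the communication device for the user.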
  • The launcher device data can include key specifications for the launcher device 20. For instance, in the case of a gun, the launcher device data can include original equipment manufacturer (OEM) identification, gun type, gun model, year of manufacture of gun, caliber, action type, capacity, barrel length, barrel material, barrel style, stock type, length of pull, overall length, overall weight, manufacturer serial number, or UPC code. In the case of a bow, compound bow or crossbow, the launcher device data can include, for example, draw weight, weight, length, width, arrow length, arrow velocity, power-stroke dimensions, speed, axle-to-axle, or brace height.
  • The projectile data can include key specifications for the projectile 30. For instance, in the case of small arms or gun ammunition, the projectile data can include an OEM identification, ammunition type, model, year of manufacture, cartridge type or size, bullet type or size, bullet weight, caliber, average weight-to-caliber ratio, muzzle velocity, trajectory, drop rate (e.g., inches/yard), or UPC code.
  • In the case of an arrow, the projectile data can include, for example, OEM, type, model, shaft length, total length, diameter, material, tip type, fletching type, spine, spine concentricity, straightness, or weight.
  • The target data can include, for example, a timestamp, a distance from the viewfinder 40 (shown in FIG. 1) to the target 10, dimensions of the target 10 (for example, width, height, length), or target type, such as, for example, a species type of animal.
  • The PAAA system 60 can be arranged to query, retrieve, or download launcher device data or projectile data from the database 160 or an external source (not shown) such as, for example, an OEM server (not shown). The PAAA system 60 can be arranged to query or retrieve historical data for the launcher device 20 or projectile 30 (shown in FIG. 1). The PAAA system 60 can be arranged to query or retrieve historical data for the user, including, for example, historical performance with the launcher device 20 or projectile 30. The PAAA system 60 can be arranged to receive launcher device data or projectile data from the communication device 50 (or viewfinder 40). The PAAA system 60 can be arranged to analyze historical data for the launcher device 20, projectile 30 or user and predict an actual trajectory for the projectile 30, based on the target data, when it is launched by the launcher device 20 under operation by the user. The PAAA system 60 can be arranged to determine (or predict) an optimal trajectory for the projectile 30, based on the target data and current viewfinder settings, such that the projectile 30 will impact the target point when the aimpoint is overlayed on the target point and the projectile 30 is launched by the launcher device 20. The PAAA system 60 can be arranged to calculate viewfinder correction parameters that adjust the viewfinder settings from their current values to the adjusted values necessary to align the actual projectile trajectory with the predicted optimal projectile trajectory.
  • For instance, where the launcher device 20 is a 30-06 REMINGTON RIFLE, the projectile 30 is a 180 Grain .30-06 Springfield cartridge, and the target 10 is located 272 yards away, the PAAA system 60 can predict an actual trajectory for the projectile 30 from the launcher device 20 to the target 10 when the projectile 30 is launched by the user. In this regard, the PAAA system 60 can determine current, real-time settings for the viewfinder 40, such as, for example, diopter, magnification, illumination, elevation, windage, and parallax. The PAAA system 60 can determine an optimal trajectory for the projectile 30, compare the optimal and actual trajectories, and generate viewfinder correction parameters to align the actual trajectory with the optimal trajectory. The viewfinder correction parameters can include, for example, horizontal and/or vertical doping instructions that can be used to set or adjust the minutes-of-angle (MOAs) on the viewfinder 40 so that the actual projectile trajectory path impacts the target point on the target 10 when the aimpoint overlays the target point and the launcher device 20 launches the projectile 30.
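The minute-of-angle arithmetic underlying the horizontal and vertical doping instructions in this example can be made concrete. One MOA subtends approximately 1.047 inches per 100 yards of range. The 9.8-inch drop used below is a hypothetical figure chosen only to illustrate the computation, not ballistic data from this disclosure, and the quarter-MOA click value is an assumed turret increment.

```python
MOA_INCHES_PER_100YD = 1.047  # one MOA subtends ~1.047 inches at 100 yards

def elevation_moa(drop_inches: float, range_yards: float) -> float:
    """Vertical doping: MOA of elevation needed to compensate a bullet drop."""
    return drop_inches / (MOA_INCHES_PER_100YD * range_yards / 100.0)

def clicks(correction_moa: float, moa_per_click: float = 0.25) -> int:
    """Convert an MOA correction to the nearest whole turret click."""
    return round(correction_moa / moa_per_click)

# Hypothetical drop of 9.8 inches at the 272-yard target in the example above:
moa = elevation_moa(9.8, 272.0)   # ~3.44 MOA of "up" doping
turret_clicks = clicks(moa)       # nearest quarter-MOA click count
```

The same arithmetic applies to the horizontal (windage) axis, with wind deflection in place of drop.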
  • The PAAA system 60 can include a computer-readable medium that can hold executable or interpretable computer code (or instructions) that, when executed by one or more of the components (for example, the GPU 110), cause the steps, processes and methods described in this disclosure to be carried out. The computer-readable medium can be included in the storage 120, or an external computer-readable medium connected to the PAAA system 60 via the network interface 130 or the I/O interface 140.
  • The GPU 110 can include any of various commercially available graphic processors, processors, microprocessors or multi-processor architectures. The GPU 110 can include a plurality of GPUs that can execute computer program instructions in parallel. The GPU 110 can include a central processing unit (CPU) or a plurality of CPUs arranged to function in parallel.
  • A basic input/output system (BIOS) can be stored in a non-volatile memory in the PAAA system 60, such as, for example, in the storage 120. The BIOS can contain the basic routines that help to transfer information between computing resources within the PAAA system 60, such as during start-up.
  • The storage 120 can include a read-only memory (ROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), a random-access memory (RAM), a non-volatile random-access memory (NVRAM), a dynamic random-access memory (DRAM), a synchronous dynamic random-access memory (SDRAM), a static random-access memory (SRAM), a burst buffer (BB), or any other device that can store digital data and computer executable instructions or code.
  • A variety of program modules can be stored in the storage 120, including an operating system (not shown), one or more application programs (not shown), application program interfaces (APIs) (not shown), program modules (not shown), or program data (not shown). Any (or all) of the operating system, application programs, APIs, program modules, or program data can be cached in the storage 120 as executable sections of computer code.
  • The network interface 130 can be arranged to connect to the network 80 or one or more external networks (not shown). The network interface 130 can include a wired or a wireless communication network interface (not shown) or a modem (not shown). When communicating in a LAN, the PAAA system 60 can be connected to the LAN through the wired or wireless communication network interface; and, when communicating in a WAN, the PAAA system 60 can be connected to the WAN through the modem. The modem (not shown) can be internal or external and wired or wireless. The modem can be connected to the backbone B via, for example, a serial port interface (not shown).
  • The I/O interface 140 can be arranged to receive instructions and data from, for example, an operator via a user interface device (not shown), such as, for example, a keyboard (not shown), a mouse (not shown), a pointer (not shown), a microphone (not shown), a speaker (not shown), or a display (not shown). The received commands and data can be forwarded to the GPU 110, or one or more of the components 120 through 180 as instruction or data signals via the backbone B.
  • The I/O interface 140 can include a transmitter and receiver (transceiver) that can receive a communication signal from an external source, such as, for example, the viewfinder 40, communication device 50, or cell site 70 (shown in FIG. 1).
  • The network interface 130 can include a data parser (not shown) or the data parsing operation can be carried out by the GPU 110. Received communication signals from the viewfinder 40 or communication device 50 can be demodulated or depacketized and data can be transferred from the network interface 130 to the GPU 110, user profile manager 150, database 160, launcher device doping unit 170 or dashboard generator 180. The network interface 130 can facilitate communication between any one or more of the components in the PAAA system 60 and communication devices located internal (or external) to the network 80. The network interface 130 can handle a variety of communication or data packet formats or protocols, including conversion from one or more communication or data packet formats or protocols used by the viewfinder 40 or communication device 50 to the communication or data packet formats or protocols used in the PAAA system 60.
  • The user profile manager 150 can include a computing device or it can be included in a computing device as one or more computing resources. The user profile manager 150 can create, manage, edit, or delete a user record for each user, which can include, for example, information for the target 10, the target point on the target 10, launcher device 20, projectile 30, viewfinder 40 or communication device 50. The user record can include, for example, a user identification, an email address, a username, a media access control (MAC) address, an Internet Protocol (IP) address, a device serial number, a device name, or any other user or device identification. The user profile manager 150 can communicate with the database 160 to search, retrieve, edit or store user records in the database 160. The user profile manager 150 can manage and link multiple user profiles to enable group or individual-to-individual sharing of information between viewfinders 40 or communication devices 50, including performance or position data.
  • The database 160 can include one or more relational databases. The database 160 can store and maintain a record for each user, target 10, launcher device 20, projectile 30, viewfinder 40, or communication device 50. The record can include projectile data, launcher device data, user data, viewfinder data (including, for example, diopter, magnification, illumination, elevation, windage, or parallax data), ambient condition data, and timestamp data. The record can include historical performance data for the user, launcher device 20, projectile 30 or viewfinder 40. The historical performance data can include ambient conditions data, including, for example, ambient conditions during a launch event for which performance of the user, launcher device 20, projectile 30 or viewfinder 40 was measured and logged.
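A minimal sketch of how one such per-launch-event record might be assembled follows. All field names and the record layout are illustrative assumptions; the disclosure does not prescribe a schema for the database 160.

```python
def make_record(user_id, launcher, projectile, viewfinder, ambient, timestamp):
    """Group the data classes listed above into one per-launch-event record."""
    return {
        "user_id": user_id,
        "launcher_device": launcher,   # e.g. OEM, model, caliber
        "projectile": projectile,      # e.g. bullet weight, muzzle velocity
        "viewfinder": viewfinder,      # e.g. elevation, windage, parallax
        "ambient": ambient,            # e.g. wind speed, temperature, humidity
        "timestamp": timestamp,
    }
```

Historical performance data, as described above, would then be a time-ordered series of such records for a given user, launcher device 20, projectile 30 or viewfinder 40.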
  • The launcher device doping (PLD) unit 170 can include one or more computing devices or it can be included in a computing device as one or more computing resources. The PLD unit 170 can include a machine learning platform, including, for example, one or more supervised machine learning systems and/or one or more unsupervised machine learning systems. The machine learning platform can include machine intelligence, such as, for example, an artificial neural network (ANN), a convolutional neural network (CNN), a deep CNN (DCNN), a region-based CNN (R-CNN), a Mask-RCNN, a deep convolutional encoder-decoder (DCED), a recurrent neural network (RNN), a neural Turing machine (NTM), a differential neural computer (DNC), a support vector machine (SVM), a deep learning neural network (DLNN), Naive Bayes, decision trees, logistic model tree induction (LMT), NBTree classifier, case-based, linear regression, Q-learning, temporal difference (TD), deep adversarial networks, fuzzy logic, K-nearest neighbor, clustering, random forest, rough set, or any other machine intelligence platform capable of supervised or unsupervised learning.
  • The machine intelligence can include a machine learning model that can be trained based on training datasets and validated based on testing datasets. The training datasets and testing datasets can be created in part or in whole from data received from, for example, the viewfinder 40 or the communication device 50. In a nonlimiting embodiment, the training datasets can be annotated through interaction with the user via the communication device 50. For instance, the training dataset can be created through interaction with the user over time based on numerous (for example, hundreds or more) launch events, in which the user launched the projectile 30 through operation of the launcher device 20 and provided feedback on the accuracy of the resultant impacts on the target 10.
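The annotation step above can be sketched as packing each user-labeled launch event into a (features, label) pair; the helper name, feature keys, and the tuple label convention are assumptions made for illustration:

```python
def make_training_row(viewfinder_settings, target_distance_yd,
                      ambient, impact_offset_in):
    """Pack one annotated launch event into a (features, label) pair.

    impact_offset_in: user-reported (horizontal, vertical) miss distance
    in inches relative to the aimpoint -- this is the annotation.
    """
    features = {
        "distance_yd": target_distance_yd,
        **{f"vf_{k}": v for k, v in viewfinder_settings.items()},
        **{f"amb_{k}": v for k, v in ambient.items()},
    }
    return features, impact_offset_in

features, label = make_training_row(
    {"elevation_moa": 2.0, "windage_moa": 0.0},
    100,
    {"wind_mph": 5.0},
    (-1.5, 2.0),  # user reports: 1.5 in left, 2.0 in high
)
```

Hundreds of such rows accumulated over time would form the training dataset that the machine learning model is fit against.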
  • The PLD unit 170 can be arranged to analyze real-time data received from the viewfinder 40 or communication device 50 (shown in FIG. 1), together with historical data queried or retrieved from the database 160 (including launcher device data, projectile data, target data, user data, user request data, positional data, timestamp data or ambient condition data), and predict both an actual and an optimal trajectory of the projectile 30 between the launcher device 20 and target 10 when launched by the user under the existing viewfinder 40 settings. The PLD unit 170 can be arranged to generate viewfinder correction parameters (such as, for example, doping settings) that set or adjust the viewfinder settings of the viewfinder 40 so that the actual trajectory of the projectile 30 is adjusted to impact the target point. The viewfinder correction parameters can be sent directly to the viewfinder 40 or through the communication device 50. Alternatively, the viewfinder correction parameters can be sent to the communication device 50 and rendered on a display screen, in which case the user can reference them to manually adjust the viewfinder settings on the viewfinder 40.
  • The dashboard generator 180 can interact with the user profile manager 150, database 160, or PLD unit 170 and generate instructions and data that can be implemented by the communication device 50 to render a dashboard (not shown) on a display of the communication device 50. The dashboard generator 180 can interact with the communication device 50 to cause the communication device 50 to display a graphic user interface (GUI) that can be operated by the user to enter data or commands, which can be transmitted in response to a user entry on the communication device 50, such as, for example, a selection, addition, deletion or change in user data, projectile data, launcher device data, target data, or ambient condition data. The GUI can be rendered on the display and superimposed on the image received from the viewfinder 40. The data or commands can be transmitted to the communication device 50 via, for example, the network interface 130 or I/O interface 140 over a communication link, which can include the cell site 70.
  • FIG. 3 shows a nonlimiting embodiment of a process 200 for monitoring, logging or assessing performance of the launcher device 20 or projectile 30 (shown in FIG. 1), according to the principles of the disclosure. The process 200 can be carried out by the communication device 50.
  • Referring to FIGS. 1 and 3, the communication device 50 can exchange data with the viewfinder 40 over a communication link and display a projectile trajectory correction menu and/or an image of the FOV of the viewfinder 40 on the display of the communication device 50. The communication device 50 can receive current viewfinder settings, such as, for example, diopter settings, elevation setting, windage setting, magnification setting, parallax setting or MOA (minute-of-angle) settings (Step 205). Any one or more of the viewfinder settings can be received automatically from the viewfinder 40 over a communication link or entered manually on the GUI by the user on the communication device 50. The communication device 50 can receive a request for projectile trajectory correction (Step 210). The received request can include a selection on the GUI by the user, such as, for example, selecting an item on a dropdown menu or entering instructions into a command field of the GUI.
  • A determination can be made, for example, by a processor (not shown) in the communication device 50, whether launcher device data and projectile data are stored locally on the communication device 50 for the launcher device 20 and projectile 30 (Step 215). If it is determined that the launcher device data or projectile data is not stored (NO at Step 215), then the launcher device data or projectile data can be entered by the user on the communication device 50 or downloaded from the PAAA system 60 (shown in FIGS. 1 and 2) (Step 220); otherwise (YES at Step 215), a distance to the target 10 (shown in FIG. 1) can be determined by the viewfinder 40 (Step 225). In an alternative embodiment, the launcher device data or projectile data can be received from an external data source (not shown), such as, for example, an OEM server. At this step, ambient conditions can be determined at the location of the viewfinder 40 or target 10. The current viewfinder settings, target distance, projectile data, launcher device data or measured ambient conditions can then be sent to the PAAA system 60 (Step 230).
  • If projectile data or launcher device data is not available, or the user wishes to create new projectile data or launcher device data, the user can do so by repeatedly launching the projectile 30 from the launcher device 20, impacting the target with the projectile 30 at varying distances, and assessing the actual point of impact each time compared to the aimpoint. In a nonlimiting embodiment, the user can build a training dataset by displaying and annotating images for each launch on the communication device 50. The image data can include metadata that is automatically added to the image data based on the viewfinder trajectory settings, target distance, projectile data or launcher device data. The training dataset can be received by the PAAA system 60 and used to train or tune the machine learning model for the particular user, target 10, launcher device 20, projectile 30 or viewfinder 40.
  • In a nonlimiting example, the launcher device data can include specifications received from an OEM for a hunting rifle and the projectile data can include specifications received from an OEM for ammunition included in the hunting rifle.
  • A process 300 (shown in FIG. 4) for adjusting or setting performance of a launcher device can be carried out by the PAAA system 60 (Step 235). Projectile trajectory corrections, including viewfinder correction parameters, can be received from the PAAA system 60 (Step 240). In an embodiment in which the viewfinder 40 includes the viewfinder system 40A (shown in FIG. 5), the projectile trajectory corrections can be received by the viewfinder 40 directly from the PAAA system 60 or via the communication device 50. The received viewfinder correction parameters can be used to set the viewfinder settings (Step 245) for optimal trajectory of the projectile 30 from the launcher device 20 to the target point on the target 10.
  • In the case of a manually adjusted viewfinder 40, the viewfinder settings can be set manually by the user while referencing the viewfinder correction parameters on the display of the communication device 50. For instance, the user can view the viewfinder correction parameters displayed on the communication device 50 and manually adjust the viewfinder settings (for example, +1 MOA for elevation and +2 MOA for windage).
  • In the case of a smart viewfinder 40, such as, for example, one equipped with the viewfinder system 40A (shown in FIG. 5), the viewfinder settings can be set automatically by the viewfinder 40 based on the received viewfinder correction parameters.
  • In an alternative embodiment, which does not include the communication device 50, the process 200 can be carried out by the viewfinder 40. In that embodiment, the steps relating to exchanges of data between the viewfinder 40 and communication device 50 can be omitted.
  • In a nonlimiting example of the process 200, the target 10 can include a bullseye target, the launcher device 20 can include a 30-06 REMINGTON SPRINGFIELD hunting rifle, the projectile 30 can include 180 Grain .30-06 Springfield ammunition, and the viewfinder 40 can include a computing device and an image pickup device that can capture and record images of the field of view, including the target 10 when positioned in the field of view. The communication device 50 can include a smartphone. Referring to FIG. 3, the user can pull up or launch an app on the smartphone 50 and confirm that the smartphone 50 is paired or linked to the viewfinder 40. The smartphone 50 can receive current horizontal and vertical doping settings with (or without) image data from the viewfinder 40 (Step 205). The user can select “TRAJECTORY CORRECTION” from a dropdown menu or enter a command into the smartphone 50 (Step 210), in response to which the smartphone app can determine whether specifications (including performance specifications) are stored for the 30-06 REMINGTON SPRINGFIELD hunting rifle and 180 Grain .30-06 Springfield ammunition (Step 215), and, if not (NO at Step 215), then the specification data can be loaded or downloaded (Step 220).
  • The distance from the viewfinder 40 to the target 10 can be measured and sent to the smartphone 50, which can determine and display the distance on the display device (Step 225). The smartphone 50 can send the projectile trajectory data, including current viewfinder settings (including doping settings), target distance, and any other information that can be used by the PAAA system 60 to determine optimal and actual trajectories for the 180 Grain .30-06 Springfield ammunition (Step 230). In response, the smartphone 50 can receive viewfinder correction parameters from the PAAA system 60, which can be received in real-time (Step 240). The smartphone 50, via the app, can send corrected viewfinder settings (including doping settings) or adjustments to current viewfinder settings to the viewfinder 40 (Step 245) for an optimal trajectory to the target point on the target 10.
  • As noted earlier, the PAAA system 60 can be included in the smartphone 50 or located remotely such as, for example, in the cloud or network 80 (shown in FIG. 1).
  • FIG. 4 shows a nonlimiting embodiment of a process 300 for adjusting or setting performance of the launcher device 20 (shown in FIG. 1), according to the principles of the disclosure. Referring to FIGS. 2 and 4, the PAAA system 60 can receive a communication signal from the communication device 50 that comprises projectile trajectory data (Step 305). The projectile trajectory data can be parsed and, based on the received communication signal, the user profile manager 150 can retrieve, update or create a record for the user (Step 310). The PLD unit 170 can analyze the projectile trajectory data, including the projectile data, launcher device data, target data, current viewfinder settings or ambient condition data (Step 315) and determine an optimal trajectory (Step 320) and an actual trajectory (Step 325) for the projectile 30 (shown in FIG. 2) from the launcher device 20 to the target 10. The optimal trajectory can be determined, for example, by analyzing historical data, OEM specifications or ambient conditions for the projectile 30 and launcher device 20 and determining the velocity (for example, yards/second) when the projectile 30 leaves the launcher device 20, the deceleration rate of the projectile (for example, yards/second²), and the drop rate of the projectile (for example, inches/yard) based on the distance from the launcher device 20 to the target point on the target 10.
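As a minimal sketch of the drop computation described above, assuming a constant-deceleration model (a simplification of real aerodynamic drag) and illustrative numbers for a .30-06-class load; the function names and numeric values are assumptions, not taken from the disclosure:

```python
import math

G_YD_S2 = 32.174 / 3.0  # gravitational acceleration, yards/s^2

def time_of_flight(distance_yd, muzzle_velocity_yds, decel_yds2):
    """Time for the projectile to cover distance_yd under constant deceleration."""
    v0, a, d = muzzle_velocity_yds, decel_yds2, distance_yd
    # d = v0*t - 0.5*a*t^2  ->  take the smaller root of the quadratic
    disc = v0 * v0 - 2.0 * a * d
    if disc < 0:
        raise ValueError("projectile decelerates to a stop before the target")
    return (v0 - math.sqrt(disc)) / a

def gravity_drop_in(distance_yd, muzzle_velocity_yds, decel_yds2):
    """Vertical drop (inches) due to gravity over the flight time."""
    t = time_of_flight(distance_yd, muzzle_velocity_yds, decel_yds2)
    return 0.5 * G_YD_S2 * t * t * 36.0  # yards -> inches

# ~900 yd/s muzzle velocity, notional 300 yd/s^2 deceleration, 100 yd target
drop = gravity_drop_in(100, 900.0, 300.0)  # roughly 2.5 inches of drop
```

A full implementation would replace the constant-deceleration assumption with a drag model driven by the OEM ballistic data and measured ambient conditions.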
  • The actual projectile trajectory can be determined (or predicted) by the PLD unit 170 by determining external forces and their effects on the projectile 30 as it travels from the launcher device 20 to the target 10. The external forces can include, for example, forces resulting from gravity, wind, pressure, temperature, precipitation, or user operation, or any changes in the foregoing as a function of time. For instance, the PLD unit 170 can interact with the database 160 and analyze historical data for the user, projectile 30, launcher device 20, target 10, viewfinder 40, geographic location or ambient conditions to predict the magnitudes and vectors of these external forces, as well as their effects on the trajectory of the projectile 30 as it travels from the launcher device 20 to the target point. The PLD unit 170 can predict how much and how fast the trajectory of the projectile 30 will deviate from the target point on the target 10 due to the external forces.
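As a rough sketch of one such external-force effect, the classic lag-time rule estimates crosswind drift from the difference between the actual flight time and the drag-free flight time; the constant-deceleration model and all numeric values here are illustrative assumptions:

```python
import math

def wind_drift_in(distance_yd, muzzle_velocity_yds, decel_yds2, crosswind_mph):
    """Crosswind deflection (inches) via the lag-time approximation:
    drift = crosswind speed * (actual flight time - vacuum flight time)."""
    v0, a, d = muzzle_velocity_yds, decel_yds2, distance_yd
    t_actual = (v0 - math.sqrt(v0 * v0 - 2.0 * a * d)) / a
    t_vacuum = d / v0                            # flight time with no deceleration
    wind_yds = crosswind_mph * 1760.0 / 3600.0   # mph -> yards/s
    return wind_yds * (t_actual - t_vacuum) * 36.0  # yards -> inches

# 10 mph full-value crosswind at 100 yd for a ~900 yd/s load
drift = wind_drift_in(100, 900.0, 300.0, 10.0)
```

Contributions from the other external forces (pressure, temperature, user operation) would be estimated analogously and summed into the predicted deviation.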
  • During operation of the launcher device 20, a user can develop a unique force signature that can affect the trajectory of the projectile 30. For instance, where the launcher device 20 is a 30-06 hunting rifle, the user may consistently demonstrate a 0.1° offset to the right due to an unequal user operation force applied when squeezing the trigger on the rifle. The PLD unit 170 can learn the user force signature, including the user signature associated with the specific launcher device 20, projectile 30, or target 10, and predict a deviation of the projectile trajectory to the target point on the target 10 due to the user operation force. The PLD unit 170 can generate viewfinder correction parameters to counteract the deviation due to the user operation force, as well as other external forces, for a projectile launch event.
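A simple stand-in for the learned user force signature is the mean point-of-impact offset across logged launch events; the disclosure describes a machine learning model, so the averaging below is only a hedged illustration of the idea:

```python
def estimate_force_signature(offsets_moa):
    """Estimate a user's consistent aiming bias from logged launch events.

    offsets_moa: list of (horizontal, vertical) point-of-impact offsets in
    MOA relative to the aimpoint. A plain mean stands in for the trained
    model described in the disclosure.
    """
    n = len(offsets_moa)
    h = sum(o[0] for o in offsets_moa) / n
    v = sum(o[1] for o in offsets_moa) / n
    return h, v

# A shooter who consistently pulls right and slightly low:
bias_h, bias_v = estimate_force_signature([(1.1, -0.2), (0.9, -0.4), (1.0, -0.3)])
correction = (-bias_h, -bias_v)  # counteract the learned bias
```

The learned bias can then be folded into the viewfinder correction parameters alongside the environmental terms.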
  • For instance, referring to the nonlimiting example discussed above, the PLD unit 170 can be arranged to analyze past performance of the user with the same 30-06 REMINGTON rifle and 180 Grain .30-06 Springfield cartridge at comparable distances to a target point. Based on the analysis of historical data, the PLD unit 170 can recognize patterns and, based on the analysis of real-time data, predict the likely trajectory of the bullet with respect to the aimpoint, including the drop rate and point of impact on the target 10.
  • The PLD unit 170 can compare the actual and optimal projectile trajectories (Step 330) and generate viewfinder correction parameters to adjust the settings for the viewfinder 40 (shown in FIG. 1) and align the actual projectile trajectory with the optimal projectile trajectory so that they overlap and have the same or substantially the same trajectory (Step 335). The viewfinder correction parameters can include, for example, horizontal or vertical MOA settings for the viewfinder 40 (shown in FIG. 1). The viewfinder correction parameters can be sent directly to the viewfinder 40 or transmitted to the communication device 50 (shown in FIG. 1), in which case the communication device 50 can send the viewfinder correction parameters to the viewfinder 40 to cause the viewfinder 40 to update the viewfinder settings (Step 340).
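The comparison and correction step can be sketched as converting the inch offset between the actual and optimal points of impact into MOA adjustments, using the standard relation that 1 MOA subtends about 1.047 inches at 100 yards; the function names and sign convention are assumptions:

```python
def moa_correction(offset_in, distance_yd):
    """Convert a point-of-impact offset (inches) at a given distance into
    an MOA adjustment. 1 MOA ~ 1.047 in at 100 yd, scaling linearly."""
    inches_per_moa = 1.047 * distance_yd / 100.0
    return offset_in / inches_per_moa

def viewfinder_corrections(actual_impact_in, optimal_impact_in, distance_yd):
    """Elevation/windage corrections that move the actual point of impact
    onto the optimal one. Offsets are (horizontal, vertical) in inches;
    positive corrections mean right / up."""
    dh = optimal_impact_in[0] - actual_impact_in[0]
    dv = optimal_impact_in[1] - actual_impact_in[1]
    return {"windage_moa": moa_correction(dh, distance_yd),
            "elevation_moa": moa_correction(dv, distance_yd)}

# Predicted impact 1.0 in left and 2.1 in low of the target point at 200 yd:
corr = viewfinder_corrections((-1.0, -2.1), (0.0, 0.0), 200)
```

The resulting dictionary corresponds to the horizontal and vertical MOA settings transmitted to the viewfinder 40 in Step 340.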
  • The PLD unit 170 can be arranged to generate and send commands (or instructions) and data to the communication device 50 that, when executed by a processor (not shown) in the device, cause the communication device 50 to render a display screen displaying an image of the FOV of the viewfinder 40 and overlaying the image with a software overlay for the aiming indicator (for example, a reticle) and informational display. In a nonlimiting embodiment, the commands and data can cause the processor to manage the storage of information from one or more measuring sensors (such as, for example, sensors 40-3, shown in FIG. 5), and to pass information wirelessly, including geo-positioning data, imaging data (visible and infrared), distance measurements, and values for angle-compensated ranging.
  • Referring to FIG. 1, in a nonlimiting embodiment, the communication device 50 can be arranged to interact with the viewfinder 40, which can acquire a target object in the FOV of the viewfinder 40, and display the FOV, including the target object, on the display screen. The captured image can include visible light, near-infrared (NIR) or longwave infrared (LWIR) wavelengths. This image can be improved optically and digitally to effect zoom. The user can select to range the target in the FOV. The position of the viewfinder 40 and the position of the target 10 can be recorded and geolocated. The user can select the projectile data (for example, ballistic data) most relevant to the launching platform, including the launcher device 20 and projectile 30. The projectile data can include, for example, data driven by historical data on previous engagements or projectile performance data received from industry sources. The projectile data can be relayed to the user via the communication device 50 so that adjustments can be made either automatically in the case of, for example, a smart scope, or manually by the user in the case of a non-connected viewfinder 40. The viewfinder 40 can be linked to the communication device 50 wirelessly so that the data and the calculations can be performed by either device and the data can be stored and/or recorded by either device. Once the target is engaged, it can be tracked via a thermal imager so the harvest can be completed.
  • In a nonlimiting embodiment of the disclosure, the viewfinder 40 includes the viewfinder system 40A, depicted in FIG. 5. In another embodiment, the viewfinder system 40A can be included in the communication device 50.
  • The viewfinder system 40A includes a processor 40-1, a storage device 40-2, one or more onboard sensors 40-3, a driver suite 40-4, a transceiver 40-5, and an input-output (IO) interface 40-6. The viewfinder system 40A can include a network interface 40-7, a target tracker 40-8 and/or a viewfinder adjuster suite 40-9. The system 40A can include a bus B, which can be connected to any or all of the components 40-1 to 40-9 by a communication link. The processor 40-1, storage 40-2, IO interface 40-6, network interface 40-7 and bus B can be similar to the GPU 110, storage 120, I/O interface 140, network interface 130 and bus B, respectively, shown in FIG. 2 (discussed above).
  • Any one or more of the components 40-1 to 40-9 in the viewfinder system 40A can include a computing resource or a computing device. The components can include a computing resource or computing device that is separate from the processor 40-1, as seen in FIG. 5, or integrated with the processor 40-1. Any of the components can include a computer resource that can be executed on the processor 40-1 as one or more processes. The computer resources can be contained in the storage 40-2.
  • In a nonlimiting embodiment, the viewfinder 40 is arranged to exchange data and instruction signals with the PAAA system 60 (shown in FIGS. 1 and 2).
  • The sensor(s) 40-3 can include a global positioning system (GPS) receiver or any device that can accurately determine the geographic location and/or physical orientation of the viewfinder system 40A in real-time, regardless of whether the viewfinder 40 is stationary or moving.
  • The sensor(s) 40-3 can include a rangefinder, a laser rangefinder, a monocular, a binocular, a riflescope, a camera, a display, a three-dimensional (3D) depth camera, an infrared (IR) sensor, a two-dimensional (2D) Light Detection and Ranging (LiDAR) sensor, a 3D LiDAR sensor, an accelerometer, a motion detector, a temperature sensor, a humidity sensor, a precipitation sensor, a wind sensor, an atmospheric pressure sensor, a sound sensor, a light sensor, or any other device capable of detecting or measuring ambient conditions surrounding the viewfinder 40 and/or the target 10. The camera or 3D camera can be arranged to capture and output image data, with or without 3D point cloud data, of the FOV. The camera can be arranged to capture visible and infrared images, including near visible and thermal wavelengths.
  • The sensor(s) 40-3 can be included in or attached to the viewfinder 40, or fitted or attached to the launcher device 20 to enable sensory reception such as, for example, viewing and image capture of the target 10. The sensor(s) 40-3 can include a computing device, a computer resource, or a suite of computing devices or computer resources. The sensor(s) 40-3 can be arranged to capture images in real-time of the FOV, including all objects in the FOV, measure distances to one or more points on each object in the FOV (including the target point overlaid by the aimpoint), and, in certain embodiments, measure ambient conditions surrounding the viewfinder or target. The sensor(s) 40-3 can be arranged to measure changes and rates of change in motion of the aimpoint (including magnitude and directional vectors) as a function of time, as well as any changes or rates of change in, for example, temperature, humidity, precipitation, pressure, wind, sound, or light.
  • One or more of the sensors 40-3 can be provided separate from the viewfinder 40. For instance, one or more of the sensors 40-3 can be provided on the communication device 50, on the launcher device 20, or elsewhere in the surrounding environment. The sensor(s) 40-3 can exchange data and instruction signals with the viewfinder 40 or communication device 50 over one or more communication links.
  • The driver suite 40-4 can include a plurality of drivers, including a driver 40-4B for each sensor 40-3. As seen, the driver suite 40-4 can include a video driver 40-4A to drive the display device (not shown) and the sensor driver 40-4B to drive, for example, the camera to capture images of the FOV in real-time, including visible and infrared wavelengths. The display device (not shown) can be arranged to display the field of view of the camera.
  • In a nonlimiting embodiment, the viewfinder system 40A can receive ambient condition data for the geographic location of the viewfinder 40 from an external data source, such as, for example, a weather service website (for example, a National Oceanic and Atmospheric Administration (NOAA) National Weather Service website). The received ambient condition data can include, for example, temperature, pressure, humidity, precipitation, wind speed, or wind direction data for the geographic location of the target 10 or viewfinder 40 (or communication device 50).
  • The transceiver 40-5 can include a transmitter 40-5A and a receiver 40-5B. The transceiver 40-5 can be arranged to transmit data and instruction signals between the viewfinder system 40A and the communication device 50 (shown in FIG. 1) over one or more communication links. The viewfinder system 40A can transmit image data signals to the communication device 50, which can be displayed by the communication device 50 on a display device. The image data signals can include rendering instructions and data that can be used by the communication device 50 to display an aiming indicator (for example, a crosshair) superimposed on the displayed field of view.
  • The IO interface 40-6 can receive or communicate commands and data from or to an operator via a user interface device (not shown), such as, for example, a keyboard, a touch-display, a mouse, a pointer, a microphone, a speaker, or a display. The commands and data can be communicated between any of the components in the viewfinder system 40A and the user interface as instruction or data signals via the backbone B. In a nonlimiting embodiment, the IO interface 40-6 is arranged to receive viewfinder settings or viewfinder setting adjustment values entered by the user.
  • The network interface 40-7 can be arranged to connect to a network, such as, for example, the network 80 (shown in FIG. 1), and interact with the PAAA system 60.
  • The target tracker 40-8 can be arranged to receive sensor data from the sensors 40-3, including image data from the camera (not shown), and calculate a distance value to each point on an object in the FOV, including the aimpoint. The target tracker 40-8 can be arranged to receive thermal image data and track the aimpoint until the projectile 30 is launched and impacts the target object, or another object.
  • In a nonlimiting embodiment, the target tracker 40-8 can include a machine learning system arranged to recognize and track each object in the field of view, including characteristics of the object, such as, for example, color, size, and shape. The sensor data can include visible light image data, IR image data, viewfinder location data corresponding to the real-world geo-positioning, target geo-position data, target distance data, or ambient condition data. The target tracker 40-8 can be arranged to predict an actual trajectory for the projectile 30 from the launcher device 20 to the aimpoint.
  • The viewfinder adjuster suite 40-9 can include one or more computing devices or computing resources, including a diopter adjuster 40-9A, elevation adjuster 40-9B, horizontal (or windage) adjuster 40-9C, illumination adjuster 40-9D or parallax adjuster 40-9E. The diopter adjuster 40-9A can be arranged to adjust the diopter settings of the viewfinder based on a diopter setting in the received viewfinder correction parameters. The diopter adjuster 40-9A can adjust an optical system (not shown) for optimal focus of an object in the FOV of the viewfinder 40.
  • Similar to the diopter adjuster 40-9A, the elevation adjuster 40-9B, horizontal adjuster 40-9C, illumination adjuster 40-9D and parallax adjuster 40-9E can be arranged to adjust elevation settings, windage settings, illumination settings and parallax settings, respectively, of the viewfinder 40 based on respective parameter values in the received viewfinder correction parameters. Once all viewfinder settings have been updated according to the received viewfinder correction parameters, the viewfinder 40 (shown in FIG. 1) will be set for the particular user, the particular launcher device 20 and the particular projectile 30 such that, when the projectile 30 is launched and the aimpoint is overlaid atop the target point, the projectile 30 will repeatedly and consistently impact the target point.
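The adjuster suite's handling of received correction parameters can be sketched as a simple dispatch from parameter names to settings; the class, setting names, and ignore-unknown policy are illustrative assumptions:

```python
class ViewfinderAdjusterSuite:
    """Dispatch received correction parameters to the matching setting.
    Setting names and defaults are assumptions for illustration."""
    def __init__(self):
        self.settings = {"diopter": 0.0, "elevation_moa": 0.0,
                         "windage_moa": 0.0, "illumination": 5,
                         "parallax_yd": 100}

    def apply(self, correction_parameters):
        """Apply known parameters; return the subset actually applied."""
        applied = {}
        for name, value in correction_parameters.items():
            if name in self.settings:
                self.settings[name] = value
                applied[name] = value
            # unknown parameters are ignored rather than failing the update
        return applied

suite = ViewfinderAdjusterSuite()
suite.apply({"elevation_moa": 1.0, "windage_moa": 2.0})
```

In a hardware viewfinder, each assignment would instead drive the corresponding physical adjuster (40-9A through 40-9E).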
  • The terms “a,” “an,” and “the,” as used in this disclosure, mean “one or more,” unless expressly specified otherwise.
  • The term “backbone,” as used in this disclosure, means a transmission medium that interconnects one or more computing resources to provide a path that conveys data signals and instruction signals between the one or more computing resources. The backbone can include a bus or a network. The backbone can include an ethernet TCP/IP. The backbone can include a distributed backbone, a collapsed backbone, a parallel backbone or a serial backbone. The backbone can include any of several types of bus structures that can further interconnect to a memory bus (with or without a memory controller), a peripheral bus, and a local bus using any of a variety of commercially available bus architectures.
  • The term “communication device,” as used in this disclosure, means any hardware, firmware, or software that can transmit or receive data packets, instruction signals, data signals or radio frequency signals over a communication link. The communication device can include a computer or a server. The communication device can be portable or stationary.
  • The term “communication link,” as used in this disclosure, means a wired or wireless medium that conveys data or information between at least two points. The wired or wireless medium can include, for example, a metallic conductor link, a radio frequency (RF) communication link, an Infrared (IR) communication link, or an optical communication link. The RF communication link can include, for example, WiFi, WiMAX, IEEE 802.11, DECT, 0G, 1G, 2G, 3G, 4G, or 5G cellular standards, or Bluetooth. A communication link can include, for example, an RS-232, RS-422, RS-485, or any other suitable serial interface.
  • The terms “computer,” “computing device,” or “processor,” as used in this disclosure, means any machine, device, circuit, component, or module, or any system of machines, devices, circuits, components, or modules which are capable of manipulating data according to one or more instructions, such as, for example, without limitation, a processor, a microprocessor, a graphics processing unit, a central processing unit, a general purpose computer, a super computer, a personal computer, a laptop computer, a palmtop computer, a notebook computer, a desktop computer, a workstation computer, a server, a server farm, a computer cloud, or an array of processors, microprocessors, central processing units, general purpose computers, super computers, personal computers, laptop computers, palmtop computers, notebook computers, desktop computers, workstation computers, or servers.
  • The terms “computing resource” or “computer resource,” as used in this disclosure, means software, a software application, a web application, a web page, a computer application, a computer program, computer code, machine executable instructions, firmware, or a process that can be arranged to execute on a computing device as one or more processes.
  • The term “computer-readable medium,” as used in this disclosure, means any non-transitory storage medium that participates in providing data (for example, instructions) that can be read by a computer. Such a medium can take many forms, including non-volatile media and volatile media. Non-volatile media can include, for example, optical or magnetic disks and other persistent memory. Volatile media can include dynamic random access memory (DRAM). Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, any other magnetic medium, a CD-ROM, DVD, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EEPROM, any other memory chip or cartridge, a carrier wave, or any other medium from which a computer can read. The computer-readable medium can include a “Cloud,” which includes a distribution of files across multiple (for example, thousands of) memory caches on multiple (for example, thousands of) computers.
  • Various forms of computer readable media can be involved in carrying sequences of instructions to a computer. For example, sequences of instruction (i) can be delivered from a RAM to a processor, (ii) can be carried over a wireless transmission medium, or (iii) can be formatted according to numerous formats, standards or protocols, including, for example, WiFi, WiMAX, IEEE 802.11, DECT, 0G, 1G, 2G, 3G, 4G, or 5G cellular standards, or Bluetooth.
  • The term “database,” as used in this disclosure, means any combination of software or hardware, including at least one application or at least one computer. The database can include a structured collection of records or data organized according to a database model, such as, for example, but not limited to at least one of a relational model, a hierarchical model, or a network model. The database can include a database management system application (DBMS) as is known in the art. The at least one application may include, but is not limited to, for example, an application program that can accept connections to service requests from clients by sending back responses to the clients. The database can be configured to run the at least one application, often under heavy workloads, unattended, for extended periods of time with minimal human direction.
  • The terms “including,” “comprising” and their variations, as used in this disclosure, mean “including, but not limited to,” unless expressly specified otherwise.
  • The term “network,” as used in this disclosure means, but is not limited to, for example, at least one of a personal area network (PAN), a local area network (LAN), a wireless local area network (WLAN), a campus area network (CAN), a metropolitan area network (MAN), a wide area network (WAN), a global area network (GAN), a broadband area network (BAN), a cellular network, a storage-area network (SAN), a system-area network, a passive optical local area network (POLAN), an enterprise private network (EPN), a virtual private network (VPN), the Internet, or the like, or any combination of the foregoing, any of which can be configured to communicate data via a wireless and/or a wired communication medium. These networks can run a variety of protocols, including, but not limited to, for example, Ethernet, IP, IPX, TCP, UDP, SPX, IRC, HTTP, FTP, Telnet, SMTP, DNS, ARP, or ICMP.
  • The term “server,” as used in this disclosure, means any combination of software or hardware, including at least one application or at least one computer to perform services for connected clients as part of a client-server architecture, server-server architecture or client-client architecture. A server can include a mainframe or a server cloud or server farm. The at least one server application can include, but is not limited to, for example, an application program that can accept connections to service requests from clients by sending back responses to the clients. The server can be configured to run the at least one application, often under heavy workloads, unattended, for extended periods of time with minimal human direction. The server can include a plurality of computers, with the at least one application being divided among the computers depending upon the workload. For example, under light loading, the at least one application can run on a single computer. However, under heavy loading, multiple computers can be required to run the at least one application. The server, or any of its computers, can also be used as a workstation.
  • The terms “send,” “sent,” “transmission,” “transmit,” “communication,” “communicate,” “connection,” or “connect,” as used in this disclosure, include the conveyance of data, data packets, computer instructions, or any other digital or analog information via electricity, acoustic waves, light waves or other electromagnetic emissions, such as those generated with communications in the radio frequency (RF), or infrared (IR) spectra. Transmission media for such transmissions can include subatomic particles, atomic particles, molecules (in gas, liquid, or solid form), space, or physical articles such as, for example, coaxial cables, copper wire and fiber optics, including the wires that comprise a system bus coupled to the processor.
  • Devices that are in communication with each other need not be in continuous communication with each other unless expressly specified otherwise. In addition, devices that are in communication with each other may communicate directly or indirectly through one or more intermediaries.
  • Although process steps, method steps, or algorithms may be described in a sequential or a parallel order, such processes, methods and algorithms may be configured to work in alternate orders. In other words, any sequence or order of steps that may be described in a sequential order does not necessarily indicate a requirement that the steps be performed in that order; some steps may be performed simultaneously. Similarly, if a sequence or order of steps is described in a parallel (or simultaneous) order, such steps can be performed in a sequential order. The steps of the processes, methods or algorithms described in this specification may be performed in any order practical. In certain non-limiting embodiments, one or more process steps, method steps, or algorithms can be omitted or skipped.
  • When a single device or article is described, it will be readily apparent that more than one device or article may be used in place of a single device or article. Similarly, where more than one device or article is described, it will be readily apparent that a single device or article may be used in place of the more than one device or article. The functionality or the features of a device may be alternatively embodied by one or more other devices which are not explicitly described as having such functionality or features.
  • The subject matter described above is provided by way of illustration only and should not be construed as limiting. Various modifications and changes can be made to the subject matter described herein without following the example embodiments and applications illustrated and described, and without departing from the true spirit and scope of the invention encompassed by the present disclosure, which is defined by the set of recitations in the following claims and by structures and functions or steps which are equivalent to these recitations.

Claims (20)

What is claimed is:
1. A system for monitoring, logging, assessing or adjusting performance of a launcher device or a projectile launched from the launcher device, the system comprising:
a transceiver arranged to receive projectile trajectory data from a communication device; and
a projectile launcher doping unit arranged to
analyze the projectile trajectory data,
determine an optimal trajectory for a projectile from a launcher device to a target,
predict an actual trajectory for the projectile from the launcher device to the target,
compare the optimal and predicted actual trajectories for the projectile, and
generate viewfinder correction parameters based on a result of the comparison,
wherein the transceiver is further arranged to transmit the viewfinder correction parameters to the communication device to set or adjust viewfinder settings.
2. The system in claim 1, wherein the projectile trajectory data comprises a distance from the launcher device to the target.
3. The system in claim 1, wherein the projectile trajectory data comprises global positioning system (GPS) coordinates.
4. The system in claim 1, wherein the projectile trajectory data comprises a horizontal doping setting.
5. The system in claim 1, wherein the projectile trajectory data comprises a vertical doping setting.
6. The system in claim 1, wherein the projectile trajectory data comprises projectile data.
7. The system in claim 1, wherein the projectile trajectory data comprises launcher device data.
8. The system in claim 1, wherein the viewfinder settings comprise riflescope doping settings.
9. A method for monitoring, logging, assessing or adjusting performance of a launcher device or a projectile launched from the launcher device, the method comprising:
receiving projectile trajectory data relating to a launcher device and a projectile;
analyzing the projectile trajectory data;
determining an optimal trajectory for the projectile from the launcher device to a target based on the projectile trajectory data;
predicting an actual trajectory for the projectile from the launcher device to the target based on the projectile trajectory data;
comparing the optimal trajectory and predicted actual trajectory for the projectile;
generating one or more viewfinder correction parameters to match the predicted actual trajectory to the optimal trajectory; and
sending the one or more viewfinder correction parameters to a communication device,
wherein the one or more viewfinder correction parameters include a setting adjustment for a viewfinder setting.
10. The method in claim 9, wherein the analyzing the projectile trajectory data comprises analyzing historical performance data for the projectile or launcher device.
11. The method in claim 9, wherein the projectile trajectory data comprises a distance from the launcher device to the target.
12. The method in claim 9, wherein the projectile trajectory data comprises global positioning system (GPS) coordinates.
13. The method in claim 9, wherein the projectile trajectory data comprises a horizontal doping setting.
14. The method in claim 9, wherein the projectile trajectory data comprises a vertical doping setting.
15. The method in claim 9, wherein the projectile trajectory data comprises projectile data.
16. The method in claim 9, wherein the projectile trajectory data comprises launcher device data.
17. The method in claim 9, wherein the viewfinder setting comprises a riflescope doping setting.
18. A non-transitory computer-readable storage medium containing computer program instructions for monitoring, logging, assessing or adjusting performance of a launcher device or a projectile launched from the launcher device that, when executed on a processor, cause the processor to perform an operation comprising:
receiving projectile trajectory data relating to a projectile and a launcher device;
analyzing the projectile trajectory data;
determining an optimal trajectory for the projectile from the launcher device to a target point based on the projectile trajectory data;
predicting an actual trajectory for the projectile from the launcher device to the target point based on the projectile trajectory data;
comparing the optimal trajectory and predicted actual trajectory for the projectile;
generating one or more viewfinder correction parameters to match the predicted actual trajectory to the optimal trajectory; and
sending the viewfinder correction parameters to a communication device,
wherein the one or more viewfinder correction parameters include a setting adjustment for a viewfinder setting.
19. The storage medium in claim 18, wherein the analyzing the projectile trajectory data comprises analyzing historical performance data for the projectile or launcher device.
20. The storage medium in claim 18, wherein the determining the optimal trajectory comprises analyzing historical performance data for the projectile or launcher device.
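By way of illustration only, and not as part of the claimed subject matter, the receive/determine/predict/compare/generate sequence recited in claims 9 and 18 can be sketched as follows. All names, the vacuum-ballistics drop model, and the milliradian output are hypothetical simplifications chosen for brevity; a practical embodiment would use a full ballistic solver and the historical performance data discussed above.

```python
from dataclasses import dataclass

G = 9.81  # gravitational acceleration (m/s^2)

@dataclass
class TrajectoryData:
    """Hypothetical container for the 'projectile trajectory data' of the claims."""
    range_m: float          # distance from launcher device to target (m)
    muzzle_velocity: float  # launch speed of the projectile (m/s)
    zero_range_m: float     # range at which the viewfinder is currently zeroed (m)

def drop_at_range(range_m: float, velocity: float) -> float:
    """Predict projectile drop (m) at a given range using simple vacuum
    ballistics: a deliberately simplified stand-in for a full solver."""
    time_of_flight = range_m / velocity
    return 0.5 * G * time_of_flight ** 2

def viewfinder_correction_mrad(data: TrajectoryData) -> float:
    """Compare the predicted trajectory against the optimal (zeroed) one
    and return a vertical viewfinder correction in milliradians."""
    predicted_drop = drop_at_range(data.range_m, data.muzzle_velocity)
    # Drop already compensated for by the current zero setting.
    zeroed_drop = drop_at_range(data.zero_range_m, data.muzzle_velocity)
    residual_drop = predicted_drop - zeroed_drop
    # Small-angle conversion: 1 mrad subtends about 1 m at 1000 m.
    return residual_drop / data.range_m * 1000.0

correction = viewfinder_correction_mrad(
    TrajectoryData(range_m=300.0, muzzle_velocity=800.0, zero_range_m=100.0)
)
print(f"dial {correction:.2f} mrad up")
```

The correction value plays the role of the "viewfinder correction parameters" that the transceiver would send back to the communication device for setting or adjusting the viewfinder.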
US 17/117,962 (priority date 2019-12-11, filed 2020-12-10): System and method for monitoring and assessing projectile performance. Published as US 2021/0180917 A1 (en). Status: Abandoned.

Applications Claiming Priority

  • US201962946802P (provisional), filed 2019-12-11
  • US 17/117,962, filed 2020-12-10: System and method for monitoring and assessing projectile performance

Publications

  • US 2021/0180917 A1, published 2021-06-17 (family ID 76317545)
  • WO 2021/119406 A1, published 2021-06-17

Cited By (1)

  • US 11997730 B2, Sig Sauer, Inc., priority 2021-01-19, published 2024-05-28: Establishing pairing between firearm accessories (cited by examiner)


Legal Events

  • AS (assignment), effective 2019-12-12: assignment of assignors interest from Dikun, Raymond to Plano Molding Company, LLC, Illinois (reel/frame 054608/0710)
  • STPP: application dispatched from preexam, not yet docketed
  • AS (assignment), effective 2021-04-16: assignment of assignors interest from Plano Molding Company, LLC and Plano Synergy Holding Inc. to WGI Innovations, Ltd., Texas (reel/frame 055984/0454)
  • AS (assignment), effective 2021-05-25: assignment of assignors interest from WGI Innovations, Ltd. to Good Sportsman Marketing, L.L.C., Texas (reel/frame 056385/0337)
  • AS (security interest), effective 2021-07-26: security interest of Good Sportsman Marketing, L.L.C. granted to NXT Capital, LLC, as agent, Illinois (reel/frame 056982/0801)
  • STPP: docketed new case, ready for examination
  • STPP: non-final action mailed
  • STCB: abandoned, failure to respond to an office action