US20160035246A1 - Facility operations management using augmented reality - Google Patents

Facility operations management using augmented reality Download PDF

Info

Publication number
US20160035246A1
Authority
US
United States
Prior art keywords
equipment
user
data
controller
facility
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/812,712
Inventor
Peter M. Curtis
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to US14/812,712 priority Critical patent/US20160035246A1/en
Publication of US20160035246A1 publication Critical patent/US20160035246A1/en
Assigned to BNB BANK reassignment BNB BANK SECURITY INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: CURTIS, PETER
Abandoned legal-status Critical Current

Classifications

    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 19/00 - Teaching not covered by other main groups of this subclass
    • G09B 19/003 - Repetitive work cycles; Sequence of movements
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 19/00 - Teaching not covered by other main groups of this subclass
    • G09B 19/24 - Use of tools
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 - Network arrangements or protocols for supporting network services or applications
    • H04L 67/01 - Protocols
    • H04L 67/10 - Protocols in which an application is distributed across nodes in the network
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 7/00 - Signalling systems according to more than one of groups G08B3/00 - G08B6/00; Personal calling systems according to more than one of groups G08B3/00 - G08B6/00
    • G08B 7/06 - Signalling systems according to more than one of groups G08B3/00 - G08B6/00; Personal calling systems according to more than one of groups G08B3/00 - G08B6/00 using electric transmission, e.g. involving audible and visible signalling through the use of sound and light sources
    • G08B 7/066 - Signalling systems according to more than one of groups G08B3/00 - G08B6/00; Personal calling systems according to more than one of groups G08B3/00 - G08B6/00 using electric transmission, e.g. involving audible and visible signalling through the use of sound and light sources guiding along a path, e.g. evacuation path lighting strip
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 - Network arrangements or protocols for supporting network services or applications
    • H04L 67/01 - Protocols
    • H04L 67/06 - Protocols specially adapted for file transfer, e.g. file transfer protocol [FTP]

Definitions

  • the present disclosure relates to management of facility operations, and more specifically, to management of facility operations using augmented reality.
  • Operating mission critical facilities may involve monitoring numerous building functions and equipment on a regular basis. If an individual performing such monitoring observes that equipment is operating outside of its designed limits, various steps may need to be taken to correct the situation.
  • comparison of equipment data with benchmark values can provide a reasonable indication that equipment is close to failing or that it is operating near or exceeding its designed limits.
  • facility engineers may be required to complete procedures from memory or using paper instructions. However, since these procedures can be long and complex, they can be difficult for a human operator to perform without assistance.
  • Augmented reality is a live direct or indirect view of a physical, real world environment whose elements are supplemented by computer-generated sensory input.
  • the technology functions by enhancing one's current perception of reality.
  • a wearable device configured to guide a user to perform a procedure in a facility.
  • the device includes a wearable element (e.g., eyeglass frame, a watch band, etc.); a display area; a sensor; and a controller.
  • the controller is configured to control the sensor to capture sensor data, identify equipment from the sensor data, and present an image on the display area.
  • the image includes information indicating how to perform a current step of a procedure associated with the determined equipment.
  • the controller includes a transceiver that enables the controller to wirelessly receive the information from a remote device.
  • a wearable device to guide a user safely through a facility.
  • the device includes a wearable element; a display area; a sensor; and a controller comprising a transceiver that enables the controller to wirelessly receive information from a remote device.
  • the controller is configured to control the sensor to capture sensor data, identify equipment from the sensor data and the received information, and present an image on the display area representing a safe path through the equipment using the sensor data and the received information.
  • a wearable device to manage a facility using a drone.
  • the device includes a wearable element; a display area; and a controller configured to wirelessly control the drone to capture sensor data, identify equipment from the sensor data, determine whether the equipment is malfunctioning using the sensor data, and present an image on the display area representing the equipment and providing information indicating whether the equipment is malfunctioning.
  • a wearable device to manage a facility using at least one robot.
  • the device includes a wearable element; a display area; and a controller configured to wirelessly control the robot to capture sensor data, identify equipment from the sensor data, determine whether the equipment is malfunctioning using the sensor data, and present an image on the display area representing the equipment and providing information indicating whether the equipment is malfunctioning.
  • FIG. 1 is a schematic diagram illustrating a system according to an exemplary embodiment of the invention.
  • FIG. 2 illustrates an augmented reality device according to an exemplary embodiment of the invention.
  • FIG. 3 illustrates a controller according to an exemplary embodiment of the invention.
  • FIG. 4 illustrates an augmented view of the augmented reality device according to an exemplary embodiment of the invention.
  • FIG. 5 illustrates an augmented view of the augmented reality device according to an exemplary embodiment of the invention.
  • FIG. 6 illustrates an augmented view of the augmented reality device according to an exemplary embodiment of the invention.
  • FIG. 7 illustrates an augmented view of the augmented reality device according to an exemplary embodiment of the invention.
  • FIG. 8 illustrates an augmented view of the augmented reality device according to an exemplary embodiment of the invention.
  • FIG. 9 illustrates an augmented view of the augmented reality device according to an exemplary embodiment of the invention.
  • FIGS. 10A-D illustrate augmented views of the augmented reality device according to exemplary embodiments of the invention.
  • FIG. 11 illustrates an exemplary screen for a walkthrough function of a possible graphical user interface (GUI) of the system.
  • FIG. 12 illustrates another exemplary screen for the walkthrough function.
  • FIG. 13 illustrates an exemplary screen for a trending function of a possible GUI.
  • FIG. 14 illustrates another exemplary screen for the trending function.
  • FIG. 15 illustrates an exemplary view of a smart watch that may be used in a system of an embodiment of the invention.
  • FIG. 16 illustrates an exemplary view of a smart watch that may be used in a system of an embodiment of the invention.
  • FIG. 17 illustrates an example of a drone that may be controlled by a system of an embodiment of the invention.
  • FIG. 18 illustrates an example of a robot being controlled by a system of an embodiment of the invention.
  • FIG. 19 shows an example of a computer system capable of implementing one or more devices of the invention or methods of the invention according to embodiments of the present disclosure.
  • FIG. 1 illustrates a system that enables facility management to be performed using at least one of an augmented reality device 110 , a smart watch 185 , a mobile device 180 , a drone 190 , and a robot 195 according to an exemplary embodiment of the invention.
  • the augmented reality device 110 and the smart watch 185 are wearable devices.
  • the system further includes a central server 120 and a database 130 .
  • the mobile device 180 may be a tablet computer, a smart phone, etc.
  • the augmented reality device 110 includes a client applicator layer 111 and a static presentation layer 112 .
  • the mobile device 180 or the smart watch 185 may also include layers 111 and 112 .
  • When the server 120 interfaces with a browser 140 of a remote computer, the server 120 includes a web presentation layer 121.
  • the server 120 further includes a dynamic presentation layer 122 to interface with the static presentation layer 112 of the augmented reality device 110 and the web presentation layer 121.
  • the server 120 further includes a business logic layer 123 and data access layer 124 to interface with a database layer 131 of the database 130 .
  • FIG. 2 illustrates the augmented reality device 110 according to an exemplary embodiment of the invention.
  • the augmented reality device 110 includes a controller (not shown), a frame 202 , a camera lens 203 , a lens 204 (e.g., a prism), a display 205 or a projected image 205 , and a control panel 206 with one or more physical buttons.
  • the augmented reality device 110 includes a projector that projects an image onto the prism to realize the projected image 205 .
  • a case is mounted to the frame 202 that includes the control panel 206 , the controller, a camera having the camera lens 203 , and the projector.
  • the case includes an opening that enables the projector to project images onto the lens 204 or prism attached adjacent to the case.
  • the augmented reality device 110 does not include a frame but is incorporated into a wrist watch (i.e., a smart watch).
  • the display 205 is not fitted over a lens 204 , but corresponds to the screen of the watch.
  • FIG. 3 illustrates the controller according to an exemplary embodiment of the invention.
  • the controller may be present within any of devices 110 , 180 , 185 , or 190 .
  • the controller includes an application processor 310 , a presentation subsystem 320 , a connectivity subsystem 330 , a sensor subsystem 340 , an input/output subsystem 350 , a memory 360 , and a power management system 370 .
  • the controller may omit any one of the illustrated elements shown in FIG. 3 or may include additional elements.
  • the application processor 310 is configured to execute a computer program that controls the augmented reality device 110 .
  • the computer program is stored in memory 360 .
  • the computer program will be discussed in more detail below.
  • When the application processor 310 is in a controller of devices 180 , 185 , or 190 , it is configured to execute a computer program that controls the corresponding device.
  • the presentation subsystem 320 controls what is presented on display 205 or seen in the projected image 205 of the augmented reality (AR) device 110 , a display of the mobile device 180 , or a screen of the watch 185 .
  • the connectivity subsystem 330 enables the AR device 110 to communicate with other devices such as the central server 120 , another mobile device (e.g., 180 ), or other devices such as a mainframe, a workstation, a server, a database, a desktop computer, a tablet computer, a smart watch (e.g., 185 ), another client, etc.
  • the connectivity subsystem 330 includes a wireless transceiver that enables the augmented reality device 110 or devices 180 or 185 to wirelessly communicate with the other devices.
  • the connectivity subsystem 330 may include the technology (e.g., suitable hardware and/or software) to exchange data wirelessly (e.g., using radio waves) over a computer network (e.g., the Internet). This technology may enable Wi-Fi communications based on the IEEE 802.11 standard, Bluetooth communications, Near Field Communications (NFC), Radio Frequency Identification (RFID), Infrared, etc.
  • the sensor subsystem 340 may include one or more sensors, such as an ambient light sensor, a proximity sensor, a global positioning system (GPS), a compass, an accelerometer, a gyroscope, etc. Since the controller includes the sensor subsystem 340 , and the controller may be present in any of devices 110 , 180 , or 185 , any of these devices may provide the functions of the sensor subsystem 340 .
  • the input/output (I/O) subsystem 350 may provide an interface to input devices, such as control panel 206 .
  • the I/O subsystem 350 may include a digital camera having lens 203 controlled by the applications processor 310 or by a controller of the I/O subsystem 350 for capturing images and videos.
  • the I/O subsystem may be present within any of devices 110 , 180 , or 185 .
  • the images and videos may be stored in a memory or a buffer of the I/O subsystem 350 or the memory 360 .
  • the memory 360 may be embodied by various types of volatile or non-volatile memory devices.
  • the memory 360 may include flash memory, such as an SD card, an MMC card, an eMMC card, hard drive, etc.
  • the memory may be located within any of devices 110 , 180 , 185 , or 190 .
  • the power management subsystem 370 may include a battery, an interface for receiving power from an external power source, software and/or hardware to manage power usage of the augmented reality device 110 , etc.
  • the power management subsystem 370 may include an AC power adaptor for receiving power in a wired manner or a Wireless Power Receiver for receiving power in a wireless manner.
  • the power management subsystem 370 may be located within any of devices 110 , 180 , 185 , or 190 .
  • a user places the frame 202 of the AR device 110 over one or more eyes like a pair of glasses.
  • the view perceived by the user through the lens 204 is referred to as a lens view 204 - 1 and the view perceived by the user through the display device 205 or an area of the projected image 205 is referred to as an augmented view 205 - 1 .
  • Since the lens 204 is transparent, all objects that would be visible to the naked eye are visible in the lens view 204 - 1 .
  • the size and location of the augmented view 205 - 1 may vary within the lens view 204 - 1 , and be adjusted by the controller of FIG. 3 .
  • the display device 205 or the projector can present augmented data (e.g., images) to all or just a portion of the augmented view 205 - 1 .
  • When the display device 205 or the projector presents augmented data to a portion of the augmented view 205 - 1 , any objects that would be visible to the naked eye in the remaining area of the augmented view 205 - 1 are visible to the user.
  • the augmented data is provided by the central server 120 and is sent either directly to the connectivity subsystem 330 of the controller within the augmented reality device 110 , or the augmented data is sent in an indirect manner from the central server 120 to the mobile device 180 or the watch 185 (e.g., the intermediary party), and then from the intermediary party to the augmented reality device 110 .
  • No Internet or WiFi service may be present in an equipment room.
  • the AR device 110 can be preloaded wirelessly with all the necessary augmented data from the server 120 , which may include all instructions that need to be performed, existing parameter data about the equipment in the facility, existing warnings, trends, etc.
  • the mobile device 180 or the watch 185 may be preloaded with the same or a portion of this information wirelessly.
  • the instructions are derived from operational procedures including Standard Operating Procedures (SOPs), Emergency Action Procedures (EAPs), Emergency Operating Procedures (EOPs), Maintenance Procedures (MPs), Method of Procedures (MOPs) and other facility documentation.
  • the procedures are stored on the server 120 , and procedures for a given facility are then transferred to the augmented reality device 110 prior to entering the facility. The procedures may also be transferred to the mobile device 180 or the watch 185 .
  • the facility may be marked with tags that can be scanned by the augmented reality device 110 , the mobile device 180 , or the watch 185 to determine whether the user has entered a particular room or is standing in front of a particular facility component.
  • At least one of the tags is a radio frequency identification (RFID) tag or a near field communication (NFC) tag that can identify a given room/floor in the facility and/or a given facility component within the facility.
  • a tag may be present in the doorway of a room in the facility that identifies the room the user is about to enter, and tags may be present on all the facility components within the room that identify the corresponding equipment.
  • the connectivity subsystem 330 of the augmented reality device 110 , the mobile device 180 , or the watch 185 may include an RFID or an NFC reader that is operated by the processor 310 to read the tags.
  • the reader can retrieve the next instruction for the identified room from the pre-retrieved data found in the memory 360 of the controller.
  • the reader can retrieve the next instruction for the identified component from the pre-retrieved data. If WiFi is available, the controller can retrieve the instruction directly from the server 120 .
  • scanning the tag may cause the augmented reality device 110 , the mobile device 180 , or the watch 185 to interface with the metering device to automatically download the latest data readings for a given component (e.g., power left in a UPS), which could be presented in the augmented view 205 - 1 , a display of the mobile device 180 , or on a screen of the watch 185 .
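  • As a rough illustration of the tag-based lookup described above (all names such as PreloadedCache, on_tag_scanned, and fetch_from_server are hypothetical and not taken from the patent), a scanned RFID/NFC tag ID could be resolved to the next procedure step from data pre-retrieved into memory 360 , with a direct query to the server 120 attempted only when WiFi is available:

```python
from typing import Optional

def fetch_from_server(tag_id: str) -> Optional[str]:
    """Placeholder for a wireless request to server 120 (only usable when WiFi exists)."""
    return None  # the actual network call is omitted in this sketch

class PreloadedCache:
    """Procedure steps keyed by room/component tag ID, preloaded before entering the facility."""
    def __init__(self, steps_by_tag):
        self.steps_by_tag = steps_by_tag   # e.g. {"main-electrical-room": ["check voltage levels", ...]}
        self.progress = {}                 # tag ID -> index of the next step to perform

    def next_instruction(self, tag_id):
        steps = self.steps_by_tag.get(tag_id, [])
        i = self.progress.get(tag_id, 0)
        return steps[i] if i < len(steps) else None

def on_tag_scanned(tag_id, cache, wifi_available):
    # Prefer the pre-retrieved data in memory 360; query server 120 directly only if WiFi is present.
    instruction = cache.next_instruction(tag_id)
    if instruction is None and wifi_available:
        instruction = fetch_from_server(tag_id)
    return instruction or "No pending step for this tag."

# Example: entering the "Main Electrical Room" with no WiFi coverage in the equipment room.
cache = PreloadedCache({"main-electrical-room": ["check voltage levels", "record breaker status"]})
print(on_tag_scanned("main-electrical-room", cache, wifi_available=False))  # check voltage levels
```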
  • the tags have barcodes (e.g., UPC, QR, etc.) that are scanned by a camera of the augmented reality device 110 using lens 203 , or a camera of the watch 185 , or a camera of the mobile device 180 , and the scanned codes identify a room and/or a floor of the facility, and/or a component within the facility/room/floor.
  • the augmented reality device 110 scans the barcode, retrieves identifying information from the read bar code, and retrieves augmented data to present based on the retrieved identifying information.
  • images can be captured using the AR device 110 , and then image recognition can be used to automatically identify facility components and areas of the facility (e.g., a particular room, a particular floor, etc.).
  • the AR device 110 may be configured to ping the server 120 or the mobile device 180 regularly for updates.
  • requests to the server 120 can be made more efficient by detecting shifts in an internal gyrometer, accelerometer, or inclinometer of the sensor subsystem 340 of the device, which allows the device to time its requests for data appropriately.
  • the AR device 110 would only call for new information when visuals have changed significantly or the user has moved a certain distance.
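  • A minimal sketch of this sensor-gated polling, with assumed class names, thresholds, and units (none of which appear in the patent): the device asks the server 120 for new data only when accelerometer/gyroscope changes or the distance moved exceed configurable limits.

```python
import math

class UpdateGate:
    """Decides whether the AR device should ping the server for fresh data."""
    def __init__(self, motion_threshold=1.5, distance_threshold_m=3.0):
        self.motion_threshold = motion_threshold        # assumed magnitude of sensor change
        self.distance_threshold_m = distance_threshold_m
        self.last_position = (0.0, 0.0)

    def should_request_update(self, accel_delta, gyro_delta, position):
        moved = math.dist(position, self.last_position)
        significant = (accel_delta > self.motion_threshold
                       or gyro_delta > self.motion_threshold
                       or moved > self.distance_threshold_m)
        if significant:
            self.last_position = position   # reset the reference point after a request
        return significant

gate = UpdateGate()
print(gate.should_request_update(accel_delta=0.2, gyro_delta=0.1, position=(0.5, 0.5)))  # False
print(gate.should_request_update(accel_delta=0.2, gyro_delta=0.1, position=(5.0, 0.0)))  # True
```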
  • the server 120 sends information on a granular level based on the scope that the client using the AR device 110 has most recently entered. For example, when a client enters a facility, a configuration will be retrieved from the server 120 , based on the client's particular preference during the initial installation, specifying how tags (e.g., RFID, QR, or manual queue) are used, whether geometries are used or not, and to what granularity. As a user enters a sub-area, or is in the vicinity of a piece of equipment, the server 120 will then be asked for more and more granular information as the client enters different zones, either via sensors specified in the configuration or via its manual pinging.
  • Information can be retrieved in a row-line format of column data.
  • the user can select from a list of equipment as he navigates to more and more granular levels of identifying his area/equipment. If there is data on the server 120 , the user will get coordinates, or geometries, as applicable and as available for that facility.
  • the AR device 110 may include multiple infrared cameras to distinguish geometries.
  • the AR device 110 will use the multiple cameras to scan the geometries to retrieve their metadata (such as angles, geometries, closest basic overall mesh color of the geometry, etc.). It will periodically send a query with this data to the server 120 to retrieve possible matches. Images of the equipment previously captured by any of the devices ( 110 , 180 , 185 , 190 ) and stored by the server 120 , together with a visual scan of an area of the facility processed by a geometry analyzing function or API, determine whether there is a similar fit in terms of the number of edges and the number of geometric objects. The system may be configured to confirm that the equipment being looked at matches the representation produced by the database algorithm and the visual device. If no exact match is found, the system will show other equipment in the nearest geometrically mapped vicinity/scope.
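  • The geometry-matching exchange might look roughly like the following sketch; the metadata fields, scoring function, and fallback rule are illustrative assumptions rather than the patent's actual algorithm. The device summarizes what its cameras see, the candidates stored by the server 120 are scored for a similar fit, and if no close fit exists the nearest geometrically mapped equipment is returned instead.

```python
from dataclasses import dataclass

@dataclass
class GeometryMetadata:
    edge_count: int
    object_count: int
    mesh_color: str            # closest basic overall mesh color

@dataclass
class Candidate:
    equipment_id: str
    metadata: GeometryMetadata
    distance_m: float          # distance from the user's mapped position

def mismatch(scan, cand):
    """Lower is better: compares edge and object counts plus a color-mismatch penalty."""
    return (abs(scan.edge_count - cand.edge_count)
            + abs(scan.object_count - cand.object_count)
            + (0 if scan.mesh_color == cand.mesh_color else 5))

def match_equipment(scan, candidates, max_mismatch=3):
    best = min(candidates, key=lambda c: mismatch(scan, c.metadata), default=None)
    if best is not None and mismatch(scan, best.metadata) <= max_mismatch:
        return best.equipment_id                       # confirmed match
    nearby = min(candidates, key=lambda c: c.distance_m, default=None)
    return nearby.equipment_id if nearby else None     # fall back to the nearest mapped equipment

scan = GeometryMetadata(edge_count=24, object_count=6, mesh_color="gray")
candidates = [Candidate("UPS-2", GeometryMetadata(23, 6, "gray"), 4.0),
              Candidate("CRAC-1", GeometryMetadata(40, 9, "white"), 2.5)]
print(match_equipment(scan, candidates))  # UPS-2
```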
  • FIG. 4 illustrates an example of the augmented view 205 - 1 being used to present augmented data.
  • the equipment includes an Electrical Switchgear and the augmented view 205 - 1 presents data such as the name of the equipment (e.g., “Switchgear A”), a current instruction to perform on the equipment (e.g., “check voltage levels”), the name of the room the equipment is housed within (e.g., “Main Electrical Room”), the time, date, etc.
  • the presentation subsystem 320 formats the data presented in the augmented view 205 - 1 based on augmented data pre-stored on the augmented reality device 110 or retrieved wirelessly from the mobile device 180 or the central server 120 when WiFi is present. It is assumed that prior to presenting the augmented data, a user wearing the augmented reality device 110 passed and/or scanned a tag identifying the room or the component so that the augmented reality device 110 can retrieve the next instruction that corresponds to the identified room or component.
  • the data presented in the augmented view 205 - 1 shown in FIG. 4 may also be presented on a display of the mobile device 180 or a screen of the watch 185 .
  • After the user performs the instruction, the user can acknowledge that the instruction was performed by pressing a button on control panel 206 , which sends an acknowledge command to the controller shown in FIG. 3 .
  • the control panel 206 may be located on the watch 185 to allow the user to send the acknowledge command using the watch 185 .
  • the user may also use the mobile device 180 to send the acknowledge command.
  • the controller can present a next step in the augmented view 205 - 1 , a screen of the watch 185 , or a display of the mobile device 180 , that is to be performed on a component in the same room, or another instruction (e.g., “head to another room”).
  • While the control panel 206 is illustrated in FIG. 2 as being located next to the camera lens 203 , the position of the panel 206 can be moved to various locations on the augmented reality device 110 , or the panel may be present on watch 185 .
  • the panel 206 may include at least one depressible button, such as an option to advance to and display a next instruction (e.g., a “+”), an option to display a previous instruction (e.g., a “-”), an option to acknowledge that a current instruction has been performed, etc.
  • the presentation subsystem 320 displays the text of the next step in the augmented view 205 - 1 , on a display of the mobile device 180 , or on a screen of the watch 185 .
  • the controller is configured to receive voice commands from the user.
  • the sensor subsystem 340 may include a microphone for receiving the voice commands and the memory 360 may store speech recognition software executed by the processor 310 to interpret the entered voice commands.
  • the user can acknowledge that the current instruction has been completed by speaking a term recognized by the processor 310 as indicating that the current step has been completed.
  • the controller can present the next step in the augmented view 205 - 1 , on a display of the mobile device 180 , or on a screen of the watch 185 .
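  • The step-advancement behavior described above can be pictured as a small state machine; the class name, method names, and voice terms below are assumptions for illustration only. The current step is presented, a button press on panel 206 or a recognized voice term acknowledges it, and the next step (or a completion message) is then presented.

```python
class ProcedureRunner:
    """Walks a user through the steps of a procedure downloaded from the server."""
    DONE_WORDS = {"done", "complete", "acknowledged"}   # assumed voice terms

    def __init__(self, steps):
        self.steps = steps
        self.index = 0

    def current_step(self):
        if self.index >= len(self.steps):
            return "Procedure complete."
        return self.steps[self.index]

    def acknowledge(self):
        """Called on a panel-206 button press (or a recognized voice term) to advance."""
        if self.index < len(self.steps):
            self.index += 1
        return self.current_step()

    def on_voice_command(self, utterance):
        if utterance.lower().strip() in self.DONE_WORDS:
            return self.acknowledge()
        return self.current_step()

runner = ProcedureRunner(["use key to unlock breaker", "power switch on"])
print(runner.current_step())             # use key to unlock breaker
print(runner.acknowledge())              # power switch on
print(runner.on_voice_command("done"))   # Procedure complete.
```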
  • a user may be presented with information about a given component such as parameter data about the component, a warning about the component, a data trend corresponding to the component, etc.
  • While FIG. 4 illustrates an Electrical Switchgear, the invention is not limited to providing instruction steps or information for an Electrical Switchgear, as information and instructions for various different types of facility components may be presented.
  • the facility components may include at least one of an Uninterruptible Power Supply (UPS), a Power Transfer Switch (PTS), a Computer Room Air Conditioner (CRAC), a Generator, a Boiler, or any other type of facility component that is included in a facility's core infrastructure.
  • FIG. 5 illustrates another example of the augmented view 205 - 1 being used to present augmented data according to an exemplary embodiment of the invention.
  • the user views facility components 501 , 502 , 503 , 504 , . . . , 50 n that are part of an equipment path in the lens view 204 - 1 and a path 510 through the equipment that provides the user with safe passage through the equipment.
  • If the components are transformers, an arc flash may occur that can injure someone who is too close to the equipment.
  • An arc flash is a type of electrical explosion that results from a low-impedance connection to ground or another voltage phase in an electrical system. Arc flashes are often witnessed from lines or transformers just before a power outage, creating bright flashes. Most 480 volt electrical equipment has sufficient capacity to cause an arc flash hazard. Higher voltages can cause a spark to jump, initiating an arc flash without the need for physical contact.
  • a user wearing the AR device 110 who is about to walk past equipment is presented with the safe path in the augmented view 205 - 1 .
  • the user is not likely to be injured by an arc flash if he stays inside the dotted lines.
  • the safe distances between each component in the facility may be stored in the server 120 or the mobile device 180 and pre-retrieved by the augmented reality device 110 prior to entering the facility.
  • a tag at the entrance of a room or at the beginning of a path through the equipment may identify the components along the path, so that the augmented reality device 110 can retrieve the corresponding safe distances, and then the presentation subsystem 320 can display these distances like the boundary path 510 illustrated in FIG. 5 .
  • the watch 185 can vibrate (e.g., using an internal vibration motor) or provide an audible or visible alert when the user is getting close to stepping outside the safe boundaries, such as those illustrated in FIG. 5 .
  • the watch 185 could display green to indicate the user is within a safe boundary from the equipment, yellow to indicate the user is getting too close to the boundary, and red to indicate the user is outside the safe boundary (i.e., too close to the equipment).
  • other colors or other graphical indicators may be used.
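  • The proximity feedback could be reduced to a simple classification like the sketch below; the distances and the warning margin are assumed values, not figures from the patent. The wearer's distance to a component is compared against that component's stored safe distance and mapped to the green/yellow/red cue.

```python
def boundary_status(distance_m, safe_distance_m, warn_margin_m=0.5):
    """Maps the user's distance from a component to the green/yellow/red cue shown on the watch."""
    if distance_m < safe_distance_m:
        return "red"      # inside the boundary: too close to the equipment
    if distance_m < safe_distance_m + warn_margin_m:
        return "yellow"   # approaching the boundary
    return "green"        # safely outside the boundary

# Example with an assumed 1.2 m safe distance for a given component.
for d in (2.0, 1.5, 1.0):
    print(d, "->", boundary_status(d, safe_distance_m=1.2))
```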
  • the boundary path 510 may be presented on a display of the mobile device 180 or a screen of the watch 185 .
  • the boundary path 510 shown in FIG. 5 has a left side and a right side. However, if there is no equipment on one side, for example the right side, or if the equipment on the right side is not capable of causing arc flashes, the right dotted lines would be omitted. In the example shown in FIG. 5 , the power rating of the component 503 allows the user to be closer, and accordingly its adjacent dotted line is closer to component 503 .
  • the boundary path 510 can change dynamically as the user walks past equipment (e.g., marked with a corresponding tag or recognized through image recognition), since the power ratings of each component can vary. While FIG. 5 illustrates the boundary path 510 as having dotted lines, the invention is not limited to any particular graphical shape.
  • the boundary path 510 could be represented by other types of lines (e.g., straight, dashed, etc.), or by a multisided shape with straight or curved lines.
  • any object visible to the naked eye would be visible in the area of the augmented view 205 - 1 not covered by the dotted lines.
  • the invention is not limited to arc flash boundaries, as there may be other reasons for staying a certain distance away from equipment, such as to prevent electrostatic discharge, to prevent excessive inhalation of noxious fumes, to protect the user from being harmed by equipment with sharp edges, to keep the user away from equipment that does not need to be serviced, etc.
  • FIG. 6 illustrates another example of the augmented view 205 - 1 being used to present an arc flash hazard according to an exemplary embodiment of the invention.
  • the equipment, the walls, and floor of the facility are visible in the lens view 204 - 1 and the augmented view 205 - 1 comprises several augmented images 520 , 521 , 522 , 523 , and 524 .
  • the images 520 - 524 are part of an example of an alert that the user would see when walking near electrical equipment.
  • Each of the images 521 - 524 represents a different path, where each path has a different shade or color to indicate a different amount of energy that one would be exposed to in the event of an arc flash.
  • a first shading or color (e.g., green) associated with the first image path 521 could indicate a first amount of energy
  • a second shading or color (e.g., yellow) associated with the second image path 522 could indicate a second amount of energy higher than the first
  • a third shading or color (e.g., gold) associated with the third image path 523 could indicate a third amount of energy higher than the second
  • a fourth shading or color (e.g., red) associated with the fourth image path 524 could indicate a fourth amount of energy higher than the third.
  • Other colors or shading styles may be used, and additional or fewer image paths may be presented. The absence of shading in the portions outside the paths 521 - 524 indicates zero exposure to arc flash hazards.
  • the augmented image 520 provides a textual warning, which may be omitted.
  • one or more of the image paths 521 - 524 are transparent to allow the floor to be seen.
  • the augmented images 520 - 524 , or data derived therefrom may be presented on a display of the mobile device 180 , or a screen of the watch 185 .
  • the watch 185 can vibrate when the user is close to approaching an outer one of the paths such as 521 .
  • the watch 185 can vibrate at different frequencies to indicate different levels of energy that one would be exposed to in the event of an arc flash. For example, a lower level of energy such as what would be encountered by walking within path 521 could be indicated by the watch vibrating at a first frequency and a higher level of energy such as what would be encountered by walking within path 522 could be indicated by the watch 185 vibrating at a second frequency higher than the first.
  • the watch 185 , mobile device 180 , or the augmented reality device 110 may produce audible alarms to indicate the user is about to step into one of the paths 521 - 524 or upon entering one of the paths.
  • the audible alarms may be different to indicate the different energy levels.
  • the audible alarm could be a beeping that increases in volume and/or frequency as the user moves from a lower energy path to a higher energy path.
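  • One way to sketch the escalating alerts is a lookup from the occupied zone (corresponding to image paths 521 - 524 ) to vibration and beep settings; the specific frequencies and volumes below are illustrative assumptions only.

```python
# Zones correspond to image paths 521-524; all numbers here are illustrative only.
ALERT_TABLE = {
    0: {"vibration_hz": 0,  "beep_volume": 0,   "beep_rate_hz": 0.0},  # outside all paths
    1: {"vibration_hz": 2,  "beep_volume": 30,  "beep_rate_hz": 0.5},  # path 521 (lowest energy)
    2: {"vibration_hz": 4,  "beep_volume": 50,  "beep_rate_hz": 1.0},  # path 522
    3: {"vibration_hz": 8,  "beep_volume": 70,  "beep_rate_hz": 2.0},  # path 523
    4: {"vibration_hz": 16, "beep_volume": 100, "beep_rate_hz": 4.0},  # path 524 (highest energy)
}

def alert_for_zone(zone):
    """Returns the vibration/beep settings for the arc-flash energy zone the user occupies."""
    return ALERT_TABLE.get(zone, ALERT_TABLE[4])   # treat unknown zones as the most severe

print(alert_for_zone(1))
print(alert_for_zone(3))
```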
  • FIG. 7 illustrates another example of the augmented view 205 - 1 being used to present augmented data according to an exemplary embodiment of the invention.
  • the equipment, the walls, and floor of the facility are visible in the lens view 204 - 1 and the augmented view 205 - 1 comprises several augmented images 530 , 531 , 532 , and 533 .
  • the augmented images 530 and 532 may be referred to as boundary images that correspond to the boundary of one or more facility components.
  • the boundary images 530 and 532 are transparent so that the underlying component is visible.
  • the boundary images 530 and 532 may be different colors or shading styles to indicate whether the underlying equipment has been serviced recently or is in need of service.
  • the first boundary image 530 is presented in a color (e.g., green) that indicates it was recently serviced and the second boundary image 532 is presented in a color (e.g., red) that indicates it is in need of service.
  • the boundary images may however be presented in different color or shading styles.
  • the textual images 531 and 533 may be presented to indicate the actual date the underlying equipment was last serviced, and/or to warn that service is overdue.
  • the first textual image 531 may textually and/or graphically (e.g., in green) indicate that service was recent and the second textual image 533 may textually and/or graphically (e.g., in red) indicate that service is overdue.
  • the textual images 531 and 533 may be omitted.
  • At least some of the augmented images 531 - 533 or data derived therefrom may be presented on a screen of the watch 185 or a display of the mobile device 180 .
  • the watch 185 or the mobile device 180 can indicate with a graphical, audible, or vibratory cue that the equipment needs to be serviced.
  • FIG. 8 illustrates another example of the augmented view 205 - 1 being used to present augmented data according to an exemplary embodiment of the invention.
  • the equipment, the walls, and floor of the facility are visible in the lens view 204 - 1 and the augmented view 205 - 1 comprises several augmented images 541 and 542 .
  • the augmented data presented in FIG. 8 is a warning of an action that is not to be performed on the equipment.
  • the augmented data may include a boundary image 541 that surrounds the underlying equipment and may be transparent to allow the underlying equipment to be viewed.
  • the augmented data may include a textual image 542 that describes textually and/or graphically (e.g., in yellow) the specific action not to be performed (e.g., do not turn off power, do not flip switch, do not turn lever, etc.).
  • At least some of the augmented images 541 and 542 or data derived therefrom may be presented on a screen of the watch 185 or a display of the mobile device 180 .
  • the watch 185 or the mobile device 180 can indicate with a graphical, audible, or vibratory cue that the equipment needs to be avoided.
  • FIG. 9 shows an example of the augmented view 205 - 1 being used to present augmented data according to an exemplary embodiment of the invention.
  • the user is observing a facility component 600 with several sub-components 610 , 620 , 630 , . . . , 63 n , and the augmented view 205 - 1 indicates that the third sub-component 630 should be operated next.
  • each sub-component (e.g., 610 , 620 , 630 , . . . ) is marked with a tag that identifies the sub-component and its location so that the presentation subsystem 320 can move the augmented view 205 - 1 to the identified location, assuming it corresponds to a sub-component that needs to be acted upon next.
  • When tags with barcodes are used and a user wearing the augmented reality device 110 views the tag of a sub-component, the camera takes a picture of the barcode, and the display 205 presents the augmented data in the augmented view 205 - 1 only if the augmented reality device 110 determines augmented data is available for the scanned sub-component.
  • a camera of the mobile device 180 or the watch 185 may also be used to take a picture of the barcode.
  • the component 600 is marked with an RFID or NFC tag that identifies all the sub-components and all their relative locations, a reader of the AR device 110 , the watch 185 , or the mobile device 180 scans the tag, and the AR device 110 displays the augmented view 205 - 1 at a location based on the location of the tag and the relative location corresponding to the sub-component that is next to be operated on.
  • For example, a single tag representing the location of the first sub-component 610 can be located just to its left, and scanning the tag could indicate that the third sub-component 630 is offset to the right by 12 inches, so that the augmented view 205 - 1 is presented next to the third sub-component 630 .
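  • The relative-offset placement could be computed roughly as in the sketch below; the coordinate convention (pixel offsets in the view) and the names are assumptions for illustration. The scanned tag supplies an anchor position plus per-sub-component offsets, and the overlay is drawn at the anchor plus the offset of the sub-component that is next to be operated on.

```python
def overlay_position(tag_position_px, offsets_px, next_subcomponent):
    """Anchor position of the scanned tag plus the stored offset of the target sub-component."""
    dx, dy = offsets_px[next_subcomponent]
    return (tag_position_px[0] + dx, tag_position_px[1] + dy)

# Example: the tag sits just left of sub-component 610; sub-component 630 is offset to the right.
tag_position = (120, 340)                                # location of the tag in the view
offsets = {"610": (0, 0), "620": (80, 0), "630": (160, 0)}
print(overlay_position(tag_position, offsets, "630"))    # (280, 340)
```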
  • the data presented in the augmented view 205 - 1 of FIG. 9 can also be presented on the screen of the watch 185 or a display of the mobile device 180 when the user is close to the next sub-component that needs to be acted upon.
  • FIG. 10A shows an example of the augmented view 205 - 1 being used to present augmented data, according to an exemplary embodiment of the invention.
  • the augmented view 205 - 1 includes augmented images 621 and 622 .
  • the first augmented image 621 identifies the next part of the equipment (e.g., a sub-component) that is to be operated on.
  • the first augmented image 621 may be a boundary image that surrounds the part such as a rectangle, square, circle, etc.
  • the second augmented image 622 describes textually (e.g., use key to unlock breaker, etc.) the current action/step that is to be performed on the identified equipment part.
  • the action or step may be derived from a procedure for the equipment that is initially provided by server 120 .
  • a scanner of the AR device 110 , the mobile device 180 , or the watch 185 can scan a tag (e.g., RFID, barcode, etc.) near the equipment that identifies the equipment or a camera of the device 110 , 180 , or 185 can snap a picture that can be used to identify the equipment by image recognition, and then assuming a procedure for the equipment has already been downloaded from server 120 , the AR device 110 can present a current step of procedure in the augmented view 205 - 1 using the augmented images 621 and 622 .
  • FIG. 10B shows the result of the user performing the action requested in FIG. 10A .
  • the user can notify the system that the action has been performed by pressing a button on the control panel 206 on devices 110 or 185 , touching a screen of the mobile device 180 , or through a voice command received by devices 110 , 185 , or 180 .
  • the user can request a next step in the procedure by pressing the same or a different button on the control panel 206 on devices 110 or 185 , or by again touching a screen of the mobile device.
  • FIG. 10C illustrates a second step being presented to the user in the augmented view 205 - 1 using a first image 621 to identify the component to be operated on and a second image 622 to present the next step of the procedure (e.g., power switch on).
  • the user can again notify the system that the next action has been performed by pressing the button on the control panel 206 or through a voice command.
  • FIG. 10D illustrates that no more steps are left in the procedure and the procedure has been completed.
  • FIG. 10D illustrates a textual image 622 that indicates that the procedure is complete.
  • the augmented images 621 and/or 622 , or data derived therefrom may be presented on a screen of the watch 185 or a display of the mobile device 180 .
  • the screen of the watch 185 or the display of the mobile device can indicate a current procedure step that is to be performed.
  • A press of a button of the watch 185 (e.g., on panel 206 ) or a touch of a screen of the mobile device 180 can be used to acknowledge to the system that the current procedure step has in fact been performed.
  • the augmented reality device 110 , the mobile device 180 , or the watch 185 can enable the user to perform data collection, view data trends, and perform operational procedures including Standard Operating Procedures (SOPs), Emergency Action Procedures (EAPs), Emergency Operating Procedures (EOPs), Maintenance Procedures (MPs), Method of Procedures (MOPs) and other facility documentation.
  • the steps can be presented in more detail on the mobile device 180 .
  • When the user wearing the AR device 110 scans a tag associated with equipment, a brief message related to the component can be displayed in the augmented view 205 - 1 or on the watch 185 , while more information about the component or the facility (e.g., a schematic diagram, the entire procedure, etc.) can be displayed on the mobile device 180 .
  • FIG. 11 illustrates an example of a possible graphical user interface (GUI) 700 that can be presented on the mobile device 180 , in the augmented view 205 - 1 , or on the screen of the watch 185 .
  • the GUI 700 includes a walkthrough mode 710 that is selected to enable the user to configure their facility into separate rooms.
  • Each of the rooms may be represented by selectable room buttons 711 .
  • When the user selects one of the room buttons 711 , the electrical and mechanical apparatuses that are associated with the selected room will appear.
  • When the buttons 711 appear on the display of the mobile device 180 , a user can use a touch screen of the device to touch the buttons 711 .
  • When the room buttons 711 appear in the augmented view 205 - 1 or on a screen of the watch, a user can advance to and select the room buttons 711 by selecting a button of panel 206 .
  • the panel 206 may be located on the watch.
  • Each room button 711 may provide a graphical indicator indicating whether the room has already been configured (e.g., a check) or has yet to be configured (e.g., an ‘x’ or a blank box).
  • the invention is not limited thereto.
  • the graphical indicators illustrated in FIG. 11 are merely examples, as other graphical symbols or text may be used to convey the same information.
  • the user may also upload facility floor layout plans to be used for navigation of this screen.
  • the user may configure areas of the floor layout plan to correspond to rooms within the system. Selecting the room from this view will display the electrical and mechanical apparatuses as previously described.
  • Selecting one of the buttons 711 brings up a new interface screen that enables the user to enter a new facility component that is housed within the corresponding room, or view/edit facility components that were previously entered (e.g., either manually or automatically). Further, one or more of the facility components in the rooms may be pre-loaded automatically using default facility component templates or site templates.
  • the default facility component template may be used by the GUI 700 to provide the user with a list of available facility components. Custom facility component fields may also be created by the user and/or added to the default facility component template. In this way, each room may be configured to accommodate a unique facility component setup (e.g., UPS, PTS, switchgear, generator, power distribution unit PDU, boiler, chiller, etc.).
  • FIG. 12 illustrates an exemplary screen of the GUI 700 when the walkthrough mode 710 is selected.
  • This screen enables a user to be guided through a facility walkthrough room by room, clearly indicating data values that may be recorded for each facility component.
  • the screen includes the name of the room 712 , an image 713 of the selected facility component or the room, a data entry pane 714 , and buttons 715 for selecting one of the available facility components/equipment in the room.
  • the image 713 and the room name 712 are optional.
  • the buttons 715 may include labels that identify the corresponding facility components, which can be revised as necessary by the user.
  • When all or a portion of the information presented in FIG. 12 is presented in the augmented view 205 - 1 or on a screen of the watch, the user can advance to different fields by using a button on panel 206 located on device 110 or the watch, or by using voice commands received through device 110 or the watch.
  • When the information presented in FIG. 12 is presented on the mobile device 180 , a user can adjust the different fields using a touch screen of the device 180 .
  • the image 713 (or a video) may be captured using the device's camera.
  • a camera may also be present within the watch to capture the image/video.
  • the room name 712 field may be edited by the user using a virtual/physical keyboard of the mobile device 180 to identify the room or by the user speaking into a microphone of the augmented reality device 110 , the watch, or the mobile device 180 .
  • the data entry pane 714 includes one or more parameters and data entry fields corresponding to the parameters associated with the selected facility components.
  • the parameters and corresponding data entry fields for a UPS could include its current battery voltages, currents, power levels, power quality, temperatures, statuses, alarms, etc.
  • When the data entry pane 714 is presented in the augmented view 205 - 1 or on a screen of the watch, the user can advance to and change different parameters within the pane 714 by selecting one or more buttons of the panel 206 , which is located on the device 110 or the watch.
  • the data fields can be one of various field types, such as numeric (e.g., integer or decimal), a text string, an array of choices, or a checkbox.
  • the text string may be input via a virtual/physical keyboard of the mobile device 180 or by a user speaking into a microphone of the augmented reality device 110 or the watch.
  • the array of choices may be a selectable list box or dropdown menu with selectable items.
  • the selectable choices can be presented in the augmented view 205 - 1 or on the watch, where each choice could be advanced to and/or selected by pressing one or more buttons of the panel 206 , which can be located on the device 110 or the watch.
  • Each item may be associated with or return a unique integer value when selected that corresponds to an index into an array that stores the items of the list/menu.
  • the data field may also be a label with one or more selectable arrow buttons that allow the user to increment or decrement the value on the label by a predefined amount. For example, when the selectable arrow buttons are presented in the augmented view 205 - 1 or a screen of the watch, an arrow button can be selected by selecting a button on panel 206 . Selection on the checkbox may be stored as an integer representing whether the checkbox has been checked (e.g. 1) or unchecked (e.g., 0).
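  • The field types described above might be modeled as in the following sketch, in which the class and attribute names are assumptions: a field carries its type and value, choices are stored by integer index, checkbox state is stored as 0 or 1, and increment/decrement helpers support the arrow-button style of entry.

```python
class Field:
    """One data entry field of a facility component (numeric, string, choice, or checkbox)."""
    def __init__(self, name, field_type, value=None, choices=None, step=1):
        self.name = name
        self.field_type = field_type      # "numeric", "string", "choices", or "checkbox"
        self.value = value                # numeric value, string, choice index, or 0/1
        self.choices = choices or []      # items backing a list box / dropdown menu
        self.step = step                  # amount used by the arrow buttons

    def increment(self):
        if self.field_type == "numeric":
            self.value = (self.value or 0) + self.step

    def decrement(self):
        if self.field_type == "numeric":
            self.value = (self.value or 0) - self.step

    def select_choice(self, index):
        if self.field_type == "choices" and 0 <= index < len(self.choices):
            self.value = index            # unique integer index into the choices array

    def toggle_checkbox(self):
        if self.field_type == "checkbox":
            self.value = 0 if self.value else 1   # checked stored as 1, unchecked as 0

voltage = Field("Battery voltage", "numeric", value=480, step=1)
voltage.increment()
status = Field("UPS status", "choices", choices=["Online", "On battery", "Bypass"])
status.select_choice(1)
print(voltage.value, status.choices[status.value])
```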
  • the application may maintain a data structure or object that corresponds to a facility component, which may comprise one or more of the above-described data fields.
  • the equipment object (or facility component object) may include data regarding its name, type, image file location, its collection of fields, and a collection of document references.
  • the equipment object may also include access methods (e.g., object methods) for accessing this data.
  • the name may be a string representation of the name of the equipment/facility component (e.g., “Ferro-Resonant UPS”, “Line-Interactive UPS”, etc.)
  • the image file location is the string representation of an absolute or relative file path of the image file (e.g., a .png, .jpg) that may either be located within a memory file system of the AR device 110 , the mobile device 180 , or its appropriate location on a memory file system of the server 120 that visually describes the equipment/facility component, which may have been captured by the camera of the augmented reality device 110 .
  • the collection of fields is a collection of field objects that pertains to the equipment/facility components.
  • the collection of documents is a collection of strings that point to the absolute or relative file path of the documentation files that may either be located within the memory file system of devices 110 or 180 , or its appropriate location on a memory file system of the server 120 that describe the structure, use, properties, or maintenance of the specific equipment/facility component.
  • Room objects are facility component objects that have data regarding the name, image file location, a collection of fields, and a collection of documents for the particular room.
  • Room objects may also have a collection of equipment/facility components, as mentioned previously, which is literally a collection of equipment/facility component objects whose data collection interfaces are spatially located within that room area.
  • room objects may have data pertaining to its representation in the walkthrough data collection.
  • a room object may include a flag that indicates whether or not the room is required to be checked during a specific scheduled walkthrough, as well as data denoting the percentage of fields within the room and the fields within the equipment/facility components of the room that may have been completed (whether problematic or not) over all the fields within the room and its equipment/facility components.
  • Fields, equipment/facility components, and rooms may be used as collections within a facility area, which the devices 110 / 180 or server 120 can maintain using an area object.
  • An area is a specific dimension of facility space that separates the total collection of facility rooms into smaller collections.
  • Each area object may have a name and an image file location for a picture representing that area. Areas are not only limited to different areas within the facility building itself, but also include rooftops and outside areas of a facility.
  • Fields, equipment/facility components, rooms, and areas may be used as collections within a facility, which the devices 110 / 180 or server 120 can maintain using a facility object.
  • the facility object itself may have a name, address, image file location, a collection of all its area objects, and a collection of all the actual document objects pertaining to the entire facility. All of the rooms and equipment/facility components may have a reference to the facility's master list of documents in order to link themselves to a specific document or collection of documents.
  • Fields, equipment, rooms, areas, and facilities may be used as collections within a client profile, which the devices 110 / 180 or server 120 can maintain using a software license object.
  • the software license object itself may have an organization name, owner, license key, a collection of all its facility objects, and a collection of all the user objects working for or within the facility.
  • Users of the application may be represented within the devices 110 / 180 or server 120 using user objects.
  • the user object itself may have a name and privileges.
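  • The object hierarchy described in the preceding paragraphs (equipment/facility component objects, room objects, area objects, facility objects, software license objects, and user objects) might be sketched as below; all class and attribute names are illustrative assumptions rather than the patent's actual data model.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Equipment:                  # facility component object
    name: str
    type: str
    image_path: str
    fields: List[str] = field(default_factory=list)      # collection of field names/objects
    documents: List[str] = field(default_factory=list)   # references into the facility document list

@dataclass
class Room:
    name: str
    image_path: str
    equipment: List[Equipment] = field(default_factory=list)
    required_in_walkthrough: bool = True
    percent_complete: float = 0.0

@dataclass
class Area:                       # e.g. a floor, a rooftop, or an outside area
    name: str
    image_path: str
    rooms: List[Room] = field(default_factory=list)

@dataclass
class Facility:
    name: str
    address: str
    areas: List[Area] = field(default_factory=list)
    documents: List[str] = field(default_factory=list)   # master document list

@dataclass
class User:
    name: str
    privileges: List[str] = field(default_factory=list)

@dataclass
class SoftwareLicense:            # client profile
    organization: str
    owner: str
    license_key: str
    facilities: List[Facility] = field(default_factory=list)
    users: List[User] = field(default_factory=list)

ups = Equipment("Ferro-Resonant UPS", "UPS", "img/ups.png", fields=["Battery voltage"])
room = Room("Main Electrical Room", "img/room.png", equipment=[ups])
print(room.equipment[0].name)
```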
  • the data entered into the fields may be stored in a database in the memory 360 and/or in the remote database 130 .
  • every room within the facility may be stored as a database table.
  • a time stamp may be applied to each and every walkthrough session.
  • Each record in that table may contain as columns every single field from that room and its equipment/facility components, as well as the time stamp of the walkthrough session. Every time a walkthrough session is saved, it may either overwrite the latest record within the table, or insert a new record into the table. If the current session being saved is a new session, it may insert a new record, but if it is a continued session that was previously saved it may overwrite the last record saved.
  • Each walkthrough session is saved in a manner which the devices 110 / 180 or server 120 may maintain using a walkthrough session object.
  • the walkthrough session object itself may have a timestamp for when the session began and a timestamp for when the session ends, the user which performed the walkthrough session, and the data collected.
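  • A sketch of the per-room table and the save behavior described above, using SQLite purely for illustration (the table and column names are assumptions): each save either overwrites the latest record when the session is a continuation of a previously saved session, or inserts a new record for a new session.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE main_electrical_room "
             "(walkthrough_ts TEXT, switchgear_voltage REAL, ups_charge REAL)")

def save_walkthrough(conn, ts, voltage, charge, continued_session):
    """Overwrite the latest record for a continued session; insert a new record otherwise."""
    if continued_session:
        last = conn.execute("SELECT MAX(rowid) FROM main_electrical_room").fetchone()[0]
        if last is not None:
            conn.execute("UPDATE main_electrical_room "
                         "SET walkthrough_ts=?, switchgear_voltage=?, ups_charge=? WHERE rowid=?",
                         (ts, voltage, charge, last))
            return
    conn.execute("INSERT INTO main_electrical_room VALUES (?, ?, ?)", (ts, voltage, charge))

save_walkthrough(conn, "2015-07-29T10:00", 480.0, 98.0, continued_session=False)  # new session
save_walkthrough(conn, "2015-07-29T10:05", 481.0, 97.5, continued_session=True)   # overwrites it
print(conn.execute("SELECT * FROM main_electrical_room").fetchall())
```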
  • users may record information as comments attached to facility components. These comments may be maintained by the devices 110 / 180 or the server 120 as a comment object.
  • the comment object itself may contain a title, a message, a timestamp, an image, a user as an author, and a sound file.
  • the application When the application is started, it may first retrieve the stored object-oriented data from the device memory, which may be encoded.
  • the encoded data for the facility and its areas, rooms, equipment/facility components, fields, users, comments, walkthrough sessions and documents may be decoded and recreated at runtime, after a user attempting to login to the application has been authenticated.
  • the last record from each table in the local database (e.g., each table may refer to a specific room within the facility) may also be retrieved at this time.
  • an entire set of records across multiple timestamp ranges may be imported into the application from the remote database for the sake of viewing, analyzing, and reporting trends throughout time across equipment/facility components and fields.
  • Data from a record set may be collected as an array of arrays (a two-dimensional array).
  • the first position of the two-dimensional array refers to the index of the record retrieved, uniquely identified by its timestamp, while the second position of the two-dimensional array refers to the column of the record retrieved.
  • An embodiment of the application may point to and retrieve a specific data field collected from any time.
  • multiple record sets are obtained, one for each database table (e.g., one for each room), from which a user may choose any collected data field from any time, as far as the database record set allows.
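A minimal sketch of the two-dimensional record-set indexing described above is shown below; the column layout and sample values are illustrative only.

```python
# One record set per room table: records[i][j] is column j of record i,
# where each record is uniquely identified by its timestamp (column 0 here).
record_set = [
    # timestamp,         boiler_temp, battery_voltage
    ["2015-01-01 08:00", 200,         48.1],
    ["2015-01-02 08:00", 205,         48.0],
    ["2015-01-03 08:00", 226,         47.8],
]

def field_at(records, record_index, column_index):
    """Return a specific collected data field from any point in time."""
    return records[record_index][column_index]

print(field_at(record_set, 2, 1))  # boiler temperature from the third walkthrough -> 226
```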
  • the devices 110 / 180 or server 120 may compare the entered parameter data against stored thresholds or previously entered data to determine whether an error has occurred.
  • the thresholds may include a Maximum Threshold and a Precise Threshold.
  • For the Maximum Threshold, a warning may be displayed when the entered value exceeds the nominal value minus the nominal value multiplied by the tolerance (Equation 1). For example, if the tolerance value is 10%, the actual temperature of a boiler is 200 degrees, and the nominal value is 250 degrees, no warning would be displayed because an actual of 200 is not above 250 − (250*0.1) (i.e., 200 is not above 225). If the temperature had risen to 226 degrees in this example, a warning would have been displayed.
  • the devices 110 / 180 or server 120 may also, or instead, average the previous logged data/parameter with the current entered parameter data, and compare this average value with a corresponding threshold to determine if a warning should be displayed. For example, if the current boiler temperature was entered at 226 degrees but the prior 4 samples have the temperature at 200 degrees, no warning would be displayed because the overall average (about 205 degrees) remains below the 225-degree threshold. The number of samples used for this averaging may vary and be a configurable parameter.
  • the GUI 700 may display a warning when data is outside of the tolerance band around the nominal value, as shown by the following Equation 2: warn if Value < Nominal − (Nominal × Tolerance) or Value > Nominal + (Nominal × Tolerance). If the value of the data field is beyond the threshold, then the value of the field is out of range and the system may alert the user. For example, if the tolerance value is 10%, the actual temperature of the boiler is 100 degrees, and the nominal value is 140 degrees, a warning would be displayed because an actual of 100 is less than 140 − 140*0.1 (i.e., 100 is less than 140 − 14). However, if the boiler temperature rises to 150 degrees, a warning would not be displayed since it is lower than 140 + 140*0.1. A minimal sketch of both threshold checks is given below.
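The following sketch implements the two checks as reconstructed above, together with the optional averaging of recent samples; the function names and the default sample count are assumptions for illustration.

```python
def exceeds_maximum_threshold(value, nominal, tolerance):
    """Equation 1 (Maximum Threshold): warn when value > nominal - nominal * tolerance."""
    return value > nominal - nominal * tolerance

def outside_precise_threshold(value, nominal, tolerance):
    """Equation 2 (Precise Threshold): warn when value falls outside nominal +/- nominal * tolerance."""
    return value < nominal - nominal * tolerance or value > nominal + nominal * tolerance

def should_warn(current, previous_samples, nominal, tolerance, check, num_samples=4):
    """Optionally average the current entry with the most recent logged samples
    before applying the threshold check (num_samples is configurable)."""
    window = previous_samples[-num_samples:] + [current]
    average = sum(window) / len(window)
    return check(average, nominal, tolerance)

# Boiler example from the text: tolerance 10%, nominal 250 degrees.
print(exceeds_maximum_threshold(200, 250, 0.10))  # False -> no warning
print(exceeds_maximum_threshold(226, 250, 0.10))  # True  -> warning
print(should_warn(226, [200, 200, 200, 200], 250, 0.10, exceeds_maximum_threshold))  # False (average ~205)
```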
  • the devices 110 / 180 or server 120 may notify the user to take corrective action to maintain the facility components before a failure occurs.
  • the notification may appear as a visual on the GUI 700 , an audible alert, or a vibratory alert.
  • the visual can be presented on a display of the mobile device 180 , in the augmented view 205 - 1 , or in the screen of the watch 185 .
  • a speaker of the devices 110 , 180 , or 185 can sound the audible alert, or vibrating mechanism of the devices 110 , 180 , or 185 can vibrate to present the vibratory alert.
  • parameter data for facility components that are outside of the threshold limits may be marked with visual cues on a display of the mobile device 180 , in the augmented view 205 - 1 , or on a screen of the watch.
  • different levels of alerts may be present. For example, a low level alert may become an elevated alert if new data is entered outside of a user configurable normal tolerance threshold. Since the error may have been caused by a data entry error, the user can choose to commit the data with the value as is, or edit the data before it is committed.
  • the notifications may be managed by selecting the configurations tab 730 .
  • the interface 700 enables the user to indicate who should be contacted and the form of contact to be used (e.g., SMS text message, MMS message, e-mail, social network message, etc.).
  • the devices 110 / 180 / 185 can notify the site administrator via an automatically generated email, text message, social network message, etc.
  • the devices 110 / 180 / 185 may enable the user to set the system into a training mode that prevents the application from sending a message to a remote party if it is determined that the facility components are not functioning properly.
  • the devices 110 , 180 , 185 , or server 120 may include a wireless transceiver to send a message using the wireless transceiver if it is determined that a facility component is not functioning properly.
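As a hedged sketch of the notification behavior described in the preceding items, the following dispatches a message to configured contacts over a chosen channel unless training mode is active; the contact structure, channel names, and transport function are assumptions, not the application's actual API.

```python
def notify_if_malfunctioning(component, value, nominal, tolerance, contacts, training_mode, send):
    """Send a notification (e.g., SMS, e-mail) when a component reading is out of range,
    unless the system has been placed in training mode."""
    out_of_range = value < nominal - nominal * tolerance or value > nominal + nominal * tolerance
    if not out_of_range:
        return False
    if training_mode:
        # Training/simulation: the alert is shown locally but no remote party is contacted.
        return False
    for contact in contacts:
        send(contact["address"], contact["channel"],
             f"{component}: reading {value} outside tolerance of nominal {nominal}")
    return True

# Example usage with a stand-in transport function.
sent = notify_if_malfunctioning(
    "Boiler A", 280, 250, 0.10,
    contacts=[{"address": "admin@example.com", "channel": "email"}],
    training_mode=False,
    send=lambda address, channel, message: print(channel, address, message),
)
```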
  • the user may record comments in the comment field 716 for each room for each facility component within the room.
  • the comment field 716 displayed on the mobile device 600 can be advanced to by touching a screen option of the mobile device and then the comment can be entered by using a virtual/physical keyboard.
  • the comment field 716 displayed in the augmented view 205 - 1 or a screen of the watch can be advanced to using a button on panel 206 and then the comment can be entered by a user speaking into a microphone of the augmented reality device 110 , the mobile device 180 , or the watch 185 .
  • the camera of the augmented reality device 110 , the device 180 , or watch 185 may be used to record one or more pictures that are automatically associated with the comment.
  • the comments and associated pictures may be stored on a database of the devices 110 / 180 / 185 , or the remote database 130 . For example, if the user notices that the temperature of the boiler is too high, he can provide a corresponding comment and snap a photo of the instrument panel of the boiler showing the elevated temperature.
  • each data field may be graphed against its threshold limits and an estimated prediction trend line may be illustrated to show if and when the thresholds may be exceeded.
  • the user can use a button of panel 206 to advance to and select the trending mode 720 .
  • Analysis of the data may reveal opportunities for the user to take corrective action to maintain facility components before a failure occurs.
  • the trending may identify spikes and dips in recorded data as anomalies.
  • the anomalies may be used as a basis of analysis where related data fields are analyzed for anomalies occurring during the same time period. Correlation between anomalies occurring during the same time period may provide a basis for corrective actions to locate and address the issue.
  • the AR device 110 , mobile device 180 , watch 185 , or the server 120 may trend data input by the user during a walkthrough data collection session.
  • a data trending activity fragment may first retrieve historic data from a database of the devices 110 / 180 / 185 or the remote database 130 up to a specified range of time defined by the user, by means of the data retrieval mechanism detailed previously.
  • the data trending activity fragment may then calculate the positive and negative slopes of the sinusoidal curves that were created by the obtained information.
  • the data trending activity fragment may be able to calculate transients (voltage spikes, current spikes, brownouts) by using the trends generated by the data collected. If such problematic transients are implied by the generated trends, the data trending feature may pinpoint the facility components or location of the issue within the facility.
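A minimal sketch of the trending idea, assuming a simple least-squares line stands in for the slope calculations described above and the trend is extrapolated to estimate when a threshold may be exceeded; the data, units, and function names are illustrative.

```python
def linear_trend(timestamps, values):
    """Least-squares slope and intercept for collected data."""
    n = len(values)
    mean_t = sum(timestamps) / n
    mean_v = sum(values) / n
    slope = sum((t - mean_t) * (v - mean_v) for t, v in zip(timestamps, values)) / \
            sum((t - mean_t) ** 2 for t in timestamps)
    intercept = mean_v - slope * mean_t
    return slope, intercept

def time_until_threshold(timestamps, values, threshold):
    """Extrapolate the trend line to estimate when the threshold may be exceeded
    (result is in the same units as the timestamps)."""
    slope, intercept = linear_trend(timestamps, values)
    if slope <= 0:
        return None  # no upward trend toward the threshold
    return (threshold - intercept) / slope - timestamps[-1]

# Example: equipment load rising about 1% per day against an 80% capacity limit.
days = [0, 1, 2, 3, 4]
load = [70, 71, 72, 73, 74]
print(time_until_threshold(days, load, 80))  # 6.0 -> about six more days at the current trend
```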
  • the trending mode 720 may also be used to generate a report of the collected data stating a brief analysis of said trends or anomalies, and may also include a printout of the data for the selected fields in graphical and tabular format.
  • the report can be presented on a display of the mobile device 180 , in the augmented view 205 - 1 , or in a screen of the watch.
  • the interface displayed by selecting the trending mode 720 may include an equipment label 721 identifying the equipment (e.g., a facility component), an overlay button 722 , and a toggle button 723 .
  • the interface of FIG. 13 can be displayed on a display of the mobile device 180 or may be presented in the augmented view 205 - 1 or the screen of the watch 185 .
  • the arrows to the left and right of the equipment name 721 may be used to advance to a preceding or subsequent piece of equipment in the room.
  • a user can use a touch screen to select the arrows.
  • a user can advance to and select the arrows by selecting one or more buttons on panel 206 .
  • the trending mode 720 interface enables the user to specify equipment data fields and date ranges to be displayed on the graph for comparison of past and present data.
  • the toggle button 723 can be used to toggle the view of the data between the graphical view and a tabular view.
  • a user can use a touch screen of the mobile device 180 to select the toggle button 723 .
  • when the toggle button 723 is presented in the augmented view 205 - 1 or the screen of the watch, a user can use one or more buttons of the panel 206 to advance to and select the toggle button 723.
  • the overlay button 722 can be used to view an overlay of projected equipment loading values to identify trends in data that was collected, as shown in FIG. 14 .
  • when the overlay button 722 is displayed on a display of the mobile device 180, a user can use a touch screen to select the overlay button 722.
  • when the overlay button 722 is presented in the augmented view 205 - 1 or the screen of the watch, a user can use one or more buttons of the panel 206 to advance to and select the overlay button 722.
  • the graph 724 can display one data parameter of a piece of equipment over time, and against the threshold or capacity limits, and an estimated (e.g., extrapolated) trend line will show if and when the thresholds may be exceeded. This information may aid the user in identifying system capacity trends as new facility components are added to the facility.
  • the graph 724 may be presented on the display of the mobile device 180 , the augmented view 205 - 1 , or the screen of the watch 185 .
  • the user may select equipment data fields to be displayed and analyzed on the data trending graphs. These graphs may be overlaid on the same set of axes for data comparison.
  • the interface 700 may highlight increasing and decreasing trends for the user.
  • the devices 110 / 180 / 185 or the server 120 may be configured to generate reports of the collected data, a brief analysis, and an overview of the data that was entered for selected fields.
  • the devices 110 / 180 / 185 enable the user to generate a full report based on the entered data.
  • the system may be able to manage multiple facilities' data, functionalities, and components.
  • the document library mode 740 can be selected to access one or more documents such as SOPs, EAPs, EOPs, MPs, MOPs, drawings, schematics, or other relevant documentation associated with facility components in the corresponding room or the facility.
  • the user can advance to and select the library mode 740 by selecting one or more buttons of panel 206 .
  • the user can select the library mode 740 by touching its screen.
  • These documents may be accessed in a “step by step” mode of the interface 700 on a display of the mobile device 180 , in the augmented view 205 - 1 , or the screen of the watch to guide the user through each step in a procedure of a corresponding one of the documents.
  • text of the steps that have been completed and the subsequent steps can be displayed on the display 205 or the display of the mobile device 180 , where the next step to be completed can be emphasized (e.g., highlighted in a different color, underlined, etc.).
  • the interface 700 enables the user to mark each step as complete.
  • the interface 700 may provide a check box next to each step that can be selected when the user has finished that part of the procedure.
  • the check box may be visible in the augmented view 205 - 1 , the screen of the watch, or on the display of the mobile device 180 .
  • the user may also be notified if a step has been skipped.
  • the “step by step” mode may reduce confusion during an emergency by guiding the user in a high stress environment, which may also increase safety.
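The following is a minimal sketch of a step-by-step tracker of the kind described above: it records which steps are complete, reports the next step to emphasize, and flags skipped steps. The class name and example procedure steps are assumptions for illustration.

```python
class ProcedureTracker:
    """Track completion of an ordered procedure and flag skipped steps (illustrative)."""

    def __init__(self, steps):
        self.steps = list(steps)
        self.completed = [False] * len(self.steps)

    def mark_complete(self, index):
        """Mark a step complete and return any earlier steps that were skipped."""
        self.completed[index] = True
        return [self.steps[i] for i in range(index) if not self.completed[i]]

    def next_step(self):
        """The step to emphasize (e.g., highlight) in the interface."""
        for step, done in zip(self.steps, self.completed):
            if not done:
                return step
        return None

tracker = ProcedureTracker(["Open breaker", "Verify zero voltage", "Apply lockout/tagout"])
print(tracker.mark_complete(0))   # [] -> nothing skipped
print(tracker.mark_complete(2))   # ["Verify zero voltage"] -> notify the user of the skipped step
print(tracker.next_step())        # "Verify zero voltage"
```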
  • the user may also access electrical and mechanical one-line diagrams and other various important operational drawings including floor layouts, etc. These documents may also be used to facilitate practice simulations that may reduce risk and improve life safety.
  • the documents may be viewable in the augmented view 205 - 1 , the screen of the watch, or a display of the mobile device 180 .
  • the document library mode 740 may be selected to provide personnel with immediate access to important facility drawings (e.g., electrical one line diagrams, floor plans, and mechanical drawings, etc.), emergency operating procedures, and other pertinent information to keep operations running smoothly.
  • the document library mode 740 can be selected on the mobile device 180 or using the augmented reality device 110 when the interface is presented in the augmented view 205 - 1 .
  • the document library mode 740 is not limited to providing emergency related information, as it may also provide pertinent information on switching procedures, technical maintenance programs, and other operations procedures, facility one lines, and other facility infrastructure drawings including SOPs, EAPs, EOPs, MPs, MOPs, etc. Further, other documentation may be imported from a remote server and saved into the application locally, which allows use of the data without network connectivity.
  • the Simulation mode 760 may be selected to simulate one or more functions of the client 100 .
  • the simulation mode 760 can be selected on the mobile device 600 or using the augmented reality device 110 when the interface is presented in the augmented view 205 - 1 .
  • Selection of the simulation mode 760 may present a screen that enables a trainee to run one or more available training scenarios or a site administrator to create the training scenarios.
  • the screen may be presented on a display of the mobile device or in the augmented view 205 - 1 .
  • the training scenarios may be used to teach a trainee or to simulate conditions that would be predicted to occur based on data entered by a user.
  • a training scenario could be a walkthrough of a room full of several facility components (e.g., pieces of equipment) in a room, where the trainee is expected to enter parameter data (e.g., boiler temperature, battery voltage, etc.) for each corresponding component/piece. If the trainee enters parameter data that is outside of expected thresholds, they would receive an alert that is similar to the actual one that would have been received during normal operation. However, since this is merely a simulation, the site administrator would not receive a notification (e.g., e-mail, text, etc.) of the alert.
  • the application that launches the interface 700 may be represented on the display of the mobile device 180 by a selectable graphical icon.
  • the icon may also be presented in the augmented view 205 - 1 or a screen of the watch.
  • when the icon is selected (e.g., using a touch screen of the mobile device 180 or by selecting a button of panel 206 ), the user may be prompted to enter a login name/identifier and a password that corresponds to one or more accounts maintained by the application.
  • the application may support different layers of accounts, where some accounts have access to more features of the interface 700 than others. For example, the administrator may have access to all features whereas a trainee could have access only to the simulation mode 760 (see the sketch below). This layered access may be secured by multiple authorization methods including encrypted passphrases, hardware keys, iris scan, user fingerprint, and facial recognition.
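A hedged sketch of the layered access idea follows; the role names and the mapping of roles to interface modes are assumptions, not the application's actual configuration.

```python
# Illustrative mapping of account layers to the interface modes they may open.
ACCESS_LEVELS = {
    "administrator": {"walkthrough", "trending", "documents", "configurations", "simulation"},
    "operator":      {"walkthrough", "trending", "documents"},
    "trainee":       {"simulation"},
}

def can_access(role, mode):
    """Return True if the given account layer is permitted to open the given mode."""
    return mode in ACCESS_LEVELS.get(role, set())

print(can_access("trainee", "simulation"))      # True
print(can_access("trainee", "configurations"))  # False
```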
  • the devices 110 / 180 / 185 synchronize all collected data with a centralized, secure (e.g., encrypted) database 130.
  • updates to data generated on devices 110 / 180 / 185 may be uploaded to the remote database 130 .
  • the remote database 130 may then update other device(s) with the updated data.
  • the data may include facility configurations, collected facility component data, facility documentation, document association properties with facility, equipment, and/or rooms, or training simulations.
  • the remote server 120 or the devices 110 / 180 / 185 may also aggregate data from various third party data sources including electrical system metering devices, mechanical system metering devices, facility alarms and alerts, security systems, and weather data. This data may be used for analytics and for basic viewing.
  • the remote server 120 or database 130 may host a library containing generic manufacturer data, equipment manuals, and other documentation that may be accessed by the augmented reality device 110 , the mobile device 180 , or watch 185 .
  • the devices 110 / 180 / 185 may communicate with the remote server 120 via a private network where all interactions may be encrypted.
  • the users of the devices 110 / 180 / 185 may access all information via a locally cached database for locations that do not have network access. For example, if network access is not available, the application will locally cache data on the device's internal storage (e.g., memory 360 ). Once network access is restored, the application may synchronize the locally stored data with the remote database automatically.
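A minimal sketch of the offline caching and synchronization behavior described above, assuming a simple in-memory queue standing in for the locally cached data and a caller-supplied upload function; the names are illustrative.

```python
class SyncQueue:
    """Cache records locally while offline and push them when the network returns (illustrative)."""

    def __init__(self, upload):
        self.upload = upload   # function that sends one record to the remote database
        self.pending = []      # stands in for locally cached data (e.g., in memory 360)

    def save(self, record, network_available):
        if network_available:
            self.upload(record)
        else:
            self.pending.append(record)

    def on_network_restored(self):
        """Automatically synchronize locally stored data with the remote database."""
        while self.pending:
            self.upload(self.pending.pop(0))

queue = SyncQueue(upload=lambda rec: print("uploaded", rec))
queue.save({"room": "Main Electrical Room", "boiler_temp": 226}, network_available=False)
queue.on_network_restored()
```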
  • FIG. 15 illustrates an example of a view that may be presented by the watch 185 .
  • the view may include an image of a place such as the facility, a facility room, or an area within the facility and a status message or an alert about the place.
  • FIG. 15 shows an alert that indicates the overall health of the facility shown as a grade out of 100.
  • the view may be presented when a position of the user wearing the watch 185 is within a pre-defined distance from the location of the place depicted in the image.
  • FIG. 16 illustrates another example of a view that may be presented by the watch 185 .
  • the view may include an image of a facility component, and a status or alert about the facility component.
  • FIG. 16 shows an alert that indicates the depicted facility component (i.e., a UPS of system A) is powered off.
  • the view may be presented when a position of the user wearing the watch 185 is within a predefined distance from the location of the component. Thus, if a user walks to another facility component, the view will update to show any status or alert associated with the new component.
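As a hedged illustration of the proximity behavior above, the following chooses which component's status to show based on the wearer's distance from known component locations; the coordinates, distance threshold, and component names are assumptions for the sketch.

```python
import math

COMPONENT_LOCATIONS = {            # illustrative floor-plan coordinates in meters
    "UPS System A": (12.0, 4.5),
    "Switchgear A": (30.0, 8.0),
}

def nearest_component(user_xy, threshold_m=5.0):
    """Return the component to display on the watch if the user is within the predefined distance."""
    best, best_dist = None, float("inf")
    for name, (x, y) in COMPONENT_LOCATIONS.items():
        dist = math.hypot(user_xy[0] - x, user_xy[1] - y)
        if dist < best_dist:
            best, best_dist = name, dist
    return best if best_dist <= threshold_m else None

print(nearest_component((13.0, 5.0)))     # "UPS System A"
print(nearest_component((100.0, 100.0)))  # None -> no component view shown
```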
  • the data presented in the view can be output to the smart watch 185 from the mobile device 180 .
  • Any alerts or status messages that can be viewed on the mobile device 180 or the AR device 110 can be pushed to the smart watch 185 for display.
  • Results calculated by analytics (e.g., analytics located on the mobile device 180 or the server 120 ) may also be pushed to the smart watch 185 for display.
  • facility health may be determined from factors such as frequency of rounds performed, available documentation, redundancy of critical systems, number of recent alerts, outstanding maintenance tickets, and predictions made by the analytics.
  • Other results that may be pushed to the watch for display include a rounds alert summary, equipment capacity trending, and critical equipment status.
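One possible way to combine the health factors listed above into a single grade out of 100 is a weighted score, as sketched below; the factor names, normalization to the 0 to 1 range, and equal weights are assumptions, not the actual analytics.

```python
def facility_health(factors, weights):
    """Weighted grade out of 100 from normalized factor scores (each factor in [0, 1])."""
    total_weight = sum(weights.values())
    score = sum(weights[name] * factors[name] for name in weights) / total_weight
    return round(100 * score)

factors = {
    "rounds_frequency": 0.9,   # how regularly rounds are performed
    "documentation": 0.8,      # availability of documentation
    "redundancy": 1.0,         # redundancy of critical systems
    "recent_alerts": 0.6,      # fewer recent alerts -> higher score
    "open_tickets": 0.7,       # fewer outstanding maintenance tickets -> higher score
}
weights = {name: 1.0 for name in factors}
print(facility_health(factors, weights))  # 80
```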
  • the AR device 110 is used to control a drone 190 (e.g., a remote controlled aircraft (RCA) or unmanned aircraft system (UAS)) or a robot 195 .
  • the RCA may be controlled with a handheld radio transmitter, which communicates with a receiver aboard the aircraft or the robot 195 .
  • the transmitter is embedded within the AR device 110 (e.g., within the case) to enable a user wearing the AR device 110 to control the RCA or the robot.
  • the receiver directs the aircraft's servos to move the control surfaces based on pilot input.
  • the UAS may correspond to a quadrocopter (e.g., as shown in FIG. 17 ).
  • the UAS or the robot 195 is further modified to include at least one of an infra-red sensor, a camera, an ambient temperature sensor, a smoke detector, a GPS, a gyroscope, and an accelerometer.
  • a user wearing the AR device 110 can control the camera of the drone to take pictures and videos of rooftop system components and control the IR sensor to take IR scans of equipment to detect overheating/hot spots.
  • the pictures taken by the drone 190 or the robot 195 can be presented in the augmented view 205 - 1 .
  • the radio transmitter of the drone 190 or the robot can transmit the images, the IR scans, and temperature data captured by the temperature sensor to the AR device 110 , the server 120 , the mobile device 180 , or the watch 185 .
  • the temperature sensor can be used to analyze data center cooling efficiency and identify hot spots. Large facilities covering one million square feet or more area can utilize the drone 190 to monitor temperature and equipment throughout the data center with a reduced need for an on-site staff.
  • the drone 190 is land based (e.g., an unmanned ground system (UGS)).
  • the UGS may be remotely controlled by the AR device 110 or the mobile device 180 .
  • a UGS is used to assist human operation in environments in which it is not suitable or feasible for personnel to work.
  • the UGS may include a temperature sensor and/or a forward looking infrared (FLIR) sensor synced to a user interface of the mobile device 180 or the AR device 110 to provide real time data readings.
  • the sensors can be used to detect hot spots on wire connections and larger scale hot spots such as server racks and equipment. Use of the UGS eliminates arc flash hazard exposure by removing personnel from close proximity of energized equipment.
  • the UGS has the ability to take photographs of equipment, and may be able to more precisely detect hot racks/equipment than the UAS.
  • the UGS has the ability to monitor large areas in large data centers and relay current data to the AR device 110 , the mobile device 180 , the watch 185 , or the server 120 , for access by a user.
  • the UGS allows the facility to reduce the manpower required for monitoring equipment, which allows facility engineering staff to focus on other operational tasks.
  • a system for detecting infrared radiation using infrared sensors may include: i) an infrared source such as blackbody radiators, tungsten lamps, and silicon carbide; ii) transmission medium used for infrared transmission, which includes a vacuum, the atmosphere, and optical fibers; iii) optical components such as optical lenses made from quartz, CaF 2 , Ge, and Si, polyethylene Fresnel lenses, Al or Au mirrors, used to converge or focus infrared radiation, and iv) an infrared detector for detecting the infrared radiation.
  • the IR monitoring may be used as part of a predictive maintenance regime to identify potential failures and prevent them.
  • the temperature sensors can be used to detect temperature fluctuation in a data center environment to identify areas where cooling efficiency can be improved.
  • the UGS can systematically scan each row in a data center following any work to check for changes in air flow patterns.
  • the UGS can utilize spatial detection sensors and algorithms to autonomously scan entire rooms. The sensors allow cooling efficiency to be optimized, reducing energy consumption and maximizing the useful life of the equipment.
  • FIG. 18 illustrates an example of at least one robot being controlled by a wearable device (e.g., device 110 , watch 185 ) or the mobile device 180 .
  • the circular objects depicted in the figures are the robots and the rectangular objects depicted in the figures are rows or columns of equipment that are spaced apart to create a path that is travelled by the robots.
  • Each robot includes one or more sensors.
  • the sensors may include a sensor to detect a biological or chemical agent, to detect radiation, to detect whether conductors are burning (e.g., smell or detect certain scents), to capture regular images (e.g., high resolution camera) or thermal images (e.g., a thermal imaging or infrared camera), to detect hot and cold spots (e.g., a temperature sensor), to detect noise frequencies that indicate a server shutdown (e.g., ultrasound sensor/instrumentation), a GPS locator to detect a location of the robot, etc.
  • Each robot may include a transmitter for transmitting all data collected by its sensors to the wearable device, the mobile device 180 , or the central server 120 .
  • a physical cable may be connected to a port of a robot and a port of the wearable device or the mobile device 180 to enable sensor data to be downloaded from the robot to the wearable device or to the mobile device 180.
  • the wearable device or the mobile device 180 may be used to control the robots to move to a particular location within the facility or to provide an instruction to the robots so that they can carry out their duties (e.g., capture sensor data, move to various locations) in an autonomous fashion.
  • Images or video captured by at least one of the robots may be presented in the augmented view 205 - 1 so that a user can see what the robot(s) see in real-time.
  • a digitally enhanced floor plan using a 4dScape technology may be presented in the augmented view 205 - 1 based on the location sensed by the GPS of the robot. For example, the robot can use its GPS to detect its current location, send that location to the wearable device, the mobile device 180 , or the server 120 , and then wearable device, mobile device 180 , or the server 120 is configured to retrieve and present the digitally enhanced floor plan that corresponds to the location.
  • the digitally enhanced floor plan may include three dimensional graphics representing the equipment known to be present in the room of the location, and textual information identifying the equipment and warnings based on data sensed by the robots (e.g., a hot spot, excessive radiation, etc.).
  • each robot is configured to respond to voice commands.
  • the voice commands may be spoken into a microphone of the robot when a user is near a robot, or transmitted by the wearable device, the mobile device 180 , or the server 120 to the robot when a voice command is spoken into a microphone of the sending device.
  • each robot includes a touch screen for entering commands to control the robot, or a touch screen of the mobile device 180 or the watch 185 is used to enter commands to control the robot.
  • in an exemplary embodiment, a robot includes an extendible and/or rotatable extension (e.g., an arm or leg) that can be used to remotely turn on/off equipment in the facility or make equipment adjustments.
  • the extension can be a rod that protrudes some distance away from the robot and is oriented at a certain angle, where the distance and the angle can be adjusted remotely by the wearable device, the mobile device 180 , or the server 120 using instructions transmitted from the wearable device or the mobile device 180 to the robot.
  • the extension may be attached to a ball joint attached to the robot to enable the extension to be oriented to various different angles.
  • a motor within the robot may be used to extend or retract the extension to change the length of the extension and to adjust the angle of the extension.
  • the extension can be used to reach places that are difficult or dangerous to reach, such as for closing a steam valve.
  • the robots may also be controlled globally from a central command center (e.g., the central server 120 ).
  • the robots of a given facility may include robots of different types.
  • the robots may include one set of robots for normal monitoring and another set of robots for repairing systems or shutting down systems.
  • the AR device 110 , the mobile device 180 , or the watch 185 may include the computer system shown in FIG. 19 .
  • the computer system referred to generally as system 1000 may include, for example, a central processing unit (CPU) 1001 , random access memory (RAM) 1004 , a printer interface 1010 , a display unit 1011 , a local area network (LAN) data transmission controller 1005 , a LAN interface 1006 , a network controller 1003 , an internal bus 1002 , and one or more input devices 1009 , for example, a keyboard, mouse etc.
  • the system 1000 may be connected to a data storage device 1008 (for example, a hard disk) via a link 1007.
  • While an exemplary graphical user interface (GUI) 700 has been described, embodiments are not limited thereto. For example, one or more of the illustrated modes may be omitted, additional modes may be present, selection of modes may be accomplished in a different manner from that illustrated, and different interactive or descriptive graphical elements may be used from those illustrated (e.g., labels may have different text, buttons may have different sizes, shapes, colors, etc., text fields may be replaced with drop down menus, lists, etc.).
  • the interface may not be a graphical user interface. The interface may be non-graphical and rely on other means of interaction (e.g., voice control).

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Information Transfer Between Computers (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A wearable device is provided that is configured to guide a user to perform a procedure in a facility. The device includes a wearable element; a display area; a sensor; and a controller. The controller is configured to control the sensor to capture sensor data, identify equipment from the sensor data, and present an image on the display area. The image includes information indicating how to perform a current step of a procedure associated with the determined equipment. The controller includes a transceiver that enables the controller to wirelessly receive the information from a remote device.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • The present application claims priority to U.S. provisional application Ser. No. 62/031,277 filed Jul. 31, 2014, U.S. provisional application Ser. No. 62/031,283 filed Jul. 31, 2014, and U.S. provisional application Ser. No. 62/089,633 filed Dec. 9, 2014, the entire contents of which are herein incorporated by reference.
  • BACKGROUND OF THE INVENTION
  • 1. Technical Field
  • The present disclosure relates to management of facility operations, and more specifically, to management of facility operations using augmented reality.
  • 2. Discussion of Related Art
  • Operating mission critical facilities may involve monitoring numerous building functions and equipment on a regular basis. If an individual performing such monitoring observes that equipment is operating outside of its designed limits, various steps may need to be taken to correct the situation.
  • Further, comparison of equipment data with benchmark values can provide a reasonable indication that equipment is close to failing or that it is operating near or exceeding its designed limits.
  • In the event of emergencies, facility component maintenance shutdowns, or other site specific events, facility engineers may be required to complete procedures from memory or using paper instruction. However, since these procedures can be long and complex, they can be difficult for a human operator to perform without assistance.
  • Augmented reality is a live direct or indirect view of a physical, real world environment whose elements are supplemented by computer-generated sensory input. As a result, the technology functions by enhancing one's current perception of reality.
  • Thus, there is a need for a system that enables facility management using augmented reality.
  • SUMMARY OF THE INVENTION
  • According to an exemplary embodiment of the invention, a wearable device is provided that is configured to guide a user to perform a procedure in a facility. The device includes a wearable element (e.g., eyeglass frame, a watch band, etc.); a display area; a sensor; and a controller. The controller is configured to control the sensor to capture sensor data, identify equipment from the sensor data, and present an image on the display area. The image includes information indicating how to perform a current step of a procedure associated with the determined equipment. The controller includes a transceiver that enables the controller to wirelessly receive the information from a remote device.
  • According to an exemplary embodiment of the invention, a wearable device is provided to guide a user safely through a facility. The device includes a wearable element; a display area; a sensor; and a controller comprising a transceiver that enables the controller to wirelessly receive information from a remote device. The controller is configured to control the sensor to capture sensor data, identify equipment from the sensor data and the received information, and present an image on the display area representing a safe path through the equipment using the sensor data and the received information.
  • According to an exemplary embodiment of the invention, a wearable device is provided to manage a facility using a drone. The device includes a wearable element; a display area; and a controller configured to wirelessly control the drone to capture sensor data, identify equipment from the sensor data, determine whether the equipment is malfunctioning using the sensor data, and present an image on the display area representing the equipment and providing information indicating whether the equipment is malfunctioning.
  • According to an exemplary embodiment of the invention, a wearable device is provided to manage a facility using at least one robot. The device includes a wearable element; a display area; and a controller configured to wirelessly control the robot to capture sensor data, identify equipment from the sensor data, determine whether the equipment is malfunctioning using the sensor data, and present an image on the display area representing the equipment and providing information indicating whether the equipment is malfunctioning.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • A more complete appreciation of the present disclosure and many of the attendant aspects thereof will be readily obtained as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings, wherein:
  • FIG. 1 is a schematic diagram illustrating a system according to an exemplary embodiment of the invention.
  • FIG. 2 illustrates an augmented reality device according to an exemplary embodiment of the invention.
  • FIG. 3 illustrates a controller according to an exemplary embodiment of the invention.
  • FIG. 4 illustrates an augmented view of the augmented reality device according to an exemplary embodiment of the invention.
  • FIG. 5 illustrates an augmented view of the augmented reality device according to an exemplary embodiment of the invention.
  • FIG. 6 illustrates an augmented view of the augmented reality device according to an exemplary embodiment of the invention.
  • FIG. 7 illustrates an augmented view of the augmented reality device according to an exemplary embodiment of the invention.
  • FIG. 8 illustrates an augmented view of the augmented reality device according to an exemplary embodiment of the invention.
  • FIG. 9 illustrates an augmented view of the augmented reality device according to an exemplary embodiment of the invention.
  • FIGS. 10A-D illustrate augmented views of the augmented reality device according to exemplary embodiments of the invention.
  • FIG. 11 illustrates an exemplary screen for a walkthrough function of a possible graphical user interface (GUI) of the system.
  • FIG. 12 illustrates another exemplary screen for the walkthrough function.
  • FIG. 13 illustrates an exemplary screen for a trending function of a possible GUI.
  • FIG. 14 illustrates another exemplary screen for the trending function.
  • FIG. 15 illustrates an exemplary view of a smart watch that may be used in a system of an embodiment of the invention.
  • FIG. 16 illustrates an exemplary view of a smart watch that may be used in a system of an embodiment of the invention.
  • FIG. 17 illustrates an example of a drone that may be controlled by a system of an embodiment of the invention.
  • FIG. 18 illustrates an example of a robot being controlled by a system of an embodiment of the invention.
  • FIG. 19 shows an example of a computer system capable of implementing one or more devices of the invention or methods of the invention according to embodiments of the present disclosure.
  • DETAILED DESCRIPTION
  • In describing exemplary embodiments of the present disclosure illustrated in the drawings, specific terminology is employed for sake of clarity. However, the present disclosure is not intended to be limited to the specific terminology so selected, and it is to be understood that each specific element includes all technical equivalents which operate in a similar manner.
  • FIG. 1 illustrates a system that enables facility management to be performed using at least one of an augmented reality device 110, a smart watch 185, a mobile device 180, a drone 190, and a robot 195 according to an exemplary embodiment of the invention. The augmented reality device 110 and the smart watch 185 are wearable devices. The system further includes a central server 120 and a database 130. The mobile device 180 may be a tablet computer, a smart phone, etc.
  • The augmented reality device 110 includes a client applicator layer 111 and a static presentation layer 112. The mobile device 180 or the smart watch 185 may also include layers 111 and 112.
  • When the server 120 interfaces with a browser 140 of a remote computer, the server 120 includes a web presentation layer 121. The server 120 further includes a dynamic presentation layer 122 to interface with the static presentation layer 112 of the augmented reality device 110 and the web presentation layer 121. The server 120 further includes a business logic layer 123 and data access layer 124 to interface with a database layer 131 of the database 130.
  • FIG. 2 illustrates the augmented reality device 110 according to an exemplary embodiment of the invention. Referring to FIG. 2, the augmented reality device 110 includes a controller (not shown), a frame 202, a camera lens 203, a lens 204 (e.g., a prism), a display 205 or a projected image 205, and a control panel 206 with one or more physical buttons. In an exemplary embodiment, the augmented reality device 110 includes a projector that projects an image onto the prism to realize the projected image 205. In an exemplary embodiment, a case is mounted to the frame 202 that includes the control panel 206, the controller, a camera having the camera lens 203, and the projector. In an embodiment, the case includes an opening that enables the projector to project images onto the lens 204 or prism attached adjacent to the case. In an exemplary embodiment, the augmented reality device 110 does not include a frame but is incorporated into a wrist watch (i.e., a smart watch). In this embodiment, the display 205 is not fitted over a lens 204, but corresponds to the screen of the watch.
  • FIG. 3 illustrates the controller according to an exemplary embodiment of the invention.
  • The controller may be present within any of devices 110, 180, 185, or 190. The controller includes an application processor 310, a presentation subsystem 320, a connectivity subsystem 330, a sensor subsystem 340, an input/output subsystem 350, a memory 360, and a power management system 370. The controller may omit any one of the illustrated elements shown in FIG. 3 or may include additional elements.
  • The application processor 310 is configured to execute a computer program that controls the augmented reality device 110. The computer program is stored in memory 360. The computer program will be discussed in more detail below. When the application processor 310 is in a controller of devices 180, 185, or 190, it is configured to execute a computer program that controls the corresponding devices.
  • The presentation subsystem 320 controls what is presented on display 205 or seen in the projected image 205 of the augmented reality (AR) device 110, a display of the mobile device 180, or a screen of the watch 185.
  • The connectivity subsystem 330 enables the AR device 110 to communicate with other devices such as the central server 120, another mobile device (e.g., 180), or other devices such as a mainframe, a workstation, a server, a database, a desktop computer, a tablet computer, a smart watch (e.g., 185), another client, etc. The connectivity subsystem 330 includes a wireless transceiver that enables the augmented reality device 110 or devices 180 or 185 to wirelessly communicate with the other devices. The connectivity subsystem 330 may include the technology (e.g., suitable hardware and/or software) to exchange data wirelessly (e.g., using radio waves) over a computer network (e.g., the Internet). This technology may enable Wi-Fi communications based on the IEEE 802.11 standard, Bluetooth communications, Near Field Communications (NFC), Radio Frequency Identification (RFID), Infrared, etc.
  • The sensor subsystem 340 may include one or more sensors, such as an ambient light sensor, a proximity sensor, a global positioning system (GPS), a compass, an accelerometer, a gyroscope, etc. Since the controller includes the sensor subsystem 340, and the controller may be present in any of devices 110, 180, or 185, any of these devices may provide the functions of the sensor subsystem 340.
  • The input/output (I/O) subsystem 350 may provide an interface to input devices, such as control panel 206. The I/O subsystem 350 may include a digital camera having lens 203 controlled by the applications processor 310 or by a controller of the I/O subsystem 350 for capturing images and videos. The I/O subsystem may be present within any of devices 110, 180, or 185. The images and videos may be stored in a memory or a buffer of the I/O subsystem 350 or the memory 360.
  • The memory 360 may be embodied by various types of volatile or non-volatile memory devices. For example, the memory 360 may include flash memory, such as an SD card, an MMC card, an eMMC card, hard drive, etc. The memory may be located within any of devices 110, 180, 185, or 190.
  • The power management subsystem 370 may include a battery, an interface for receiving power from an external power source, software and/or hardware to manage power usage of the augmented reality device 110, etc. The power management subsystem 370 may include an AC power adaptor for receiving power in a wired manner or a Wireless Power Receiver for receiving power in a wireless manner. The power management subsystem 370 may be located within any of devices 110, 180, 185, or 190.
  • A user places the frame 202 of the AR device 110 over one or more eyes like a pair of glasses. The view perceived by the user through the lens 204 is referred to as a lens view 204-1 and the view perceived by the user through the display device 205 or an area of the projected image 205 is referred to as an augmented view 205-1. Since the lens 204 is transparent, all objects that would be visible to the naked eye are visible in the lens view 204-1. When there is no augmented data to present, anything that would be visible to the naked eye in the area of the augmented view 205-1 is visible to the user. The size and location of the augmented view 205-1 may vary within the lens view 204-1, and be adjusted by the controller of FIG. 3. The display device 205 or the projector can present augmented data (e.g., images) to all or just a portion of the augmented view 205-1. When the display device 205 or the projector presents augmented data to a portion of the augmented view 205-1, any objects that would be visible to the naked eye in the remaining area of the augmented view 205-1 are visible to the user.
  • The augmented data is provided by the central server 120 and is sent either directly to the connectivity subsystem 330 of the controller within the augmented reality device 110, or the augmented data is sent in an indirect manner from the central server 120 to the mobile device 180 or the watch 185 (e.g., the intermediary party), and then from the intermediary party to the augmented reality device 110.
  • No Internet or WiFi service may be present in an equipment room. Thus, prior to entering the facility, the AR device 110 can be preloaded wirelessly with all the necessary augmented data from the server 120, which may include all instructions that need to be performed, existing parameter data about the equipment in the facility, existing warnings, trends, etc. The mobile device 180 or the watch 185 may be preloaded with the same or a portion of this information wirelessly.
  • In an exemplary embodiment, the instructions are derived from operational procedures including Standard Operating Procedures (SOPs), Emergency Action Procedures (EAPs), Emergency Operating Procedures (EOPs), Maintenance Procedures (MPs), Method of Procedures (MOPs) and other facility documentation. The procedures are stored on the server 120, and procedures for a given facility are then transferred to the augmented reality device 110 prior to entering the facility. The procedures may also be transferred to the mobile device 180 or the watch 185.
  • The facility may be marked with tags that can be scanned by the augmented reality device 110, the mobile device 180, or the watch 185 to determine whether the user has entered a particular room or is standing in front of a particular facility component.
  • In an exemplary embodiment, at least one of the tags are radio frequency identification (RFID) tags or near field communication (NFC) tags that can identify a given room/floor in the facility and/or a given facility component within the facility. For example, a tag may be present in the doorway of a room in the facility that identifies the room the user is about to enter, and tags may be present on all the facility components within the room that identifies the corresponding equipment. The connectivity subsystem 330 of the augmented reality device 110, the mobile device 180, or the watch 185 may include an RFID or an NFC reader that is operated by the processor 310 to read the tags. For example, if the user passes a tag identifying a room, the reader can retrieve the next instruction for the identified room from the pre-retrieved data found in the memory 360 of the controller. In another example, if the user passes a tag identifying a given facility component, the reader can retrieve the next instruction for the identified component from the pre-retrieved data. If WiFi is available, the controller can retrieve the instruction directly from the server 120.
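A hedged sketch of the tag-to-instruction lookup described above follows: a scanned room or component tag is resolved to the next instruction from data preloaded into device memory, with an optional fallback to the server when connectivity is available. The tag identifiers, instruction strings, and function names are assumptions for illustration.

```python
PRELOADED_INSTRUCTIONS = {   # illustrative data preloaded (e.g., into memory 360) before entering the facility
    "room:main-electrical": ["Check voltage levels on Switchgear A", "Record breaker positions"],
    "equip:ups-a":          ["Record remaining battery runtime"],
}
_progress = {}               # tracks the next step per tag

def next_instruction(tag_id, fetch_from_server=None):
    """Return the next instruction for the scanned room or component tag."""
    steps = PRELOADED_INSTRUCTIONS.get(tag_id)
    if steps is None and fetch_from_server is not None:
        steps = fetch_from_server(tag_id)   # only possible when WiFi is available
    if not steps:
        return None
    index = _progress.get(tag_id, 0)
    if index >= len(steps):
        return None                         # all steps for this tag acknowledged
    _progress[tag_id] = index + 1
    return steps[index]

print(next_instruction("room:main-electrical"))  # "Check voltage levels on Switchgear A"
print(next_instruction("room:main-electrical"))  # "Record breaker positions"
```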
  • For equipment that is equipped with wireless metering devices, scanning the tag may cause the augmented reality device 110, the mobile device 180, or the watch 185 to interface with the metering device to automatically download the latest data readings for a given component (e.g., power left in a UPS), which could be presented in the augmented view 205-1, a display of the mobile device 180, or on a screen of the watch 185.
  • In another exemplary embodiment, the tags have barcodes (e.g., UPC, QR, etc.) that are scanned by a camera of the augmented reality device 110 using lens 203, or a camera of the watch 185, or a camera of the mobile device 180, and the scanned codes identify a room and/or a floor of the facility, and/or a component within the facility/room/floor.
  • For example, when a user wearing the augmented reality (AR) device 110 looks at a barcode, the augmented reality device 110 scans the barcode, retrieves identifying information from the read bar code, and retrieves augmented data to present based on the retrieved identifying information.
  • Since certain clients will object to placement of tags on their equipment or on the walls of their facility, in another embodiment of the invention, images can be captured using the AR device 110, and then image recognition can be used to automatically identify facility components and areas of the facility (e.g., a particular room, a particular floor, etc.). The AR device 110 may be configured to ping the server 120 or the mobile device 180 regularly for updates. Depending on the hardware, efficiency can be improved by detecting shifts in an internal gyrometer, accelerometer, or inclinometer of the sensor subsystem 340 of the device, which allows the server 120 to better time its requests for data. For example, the AR device 110 would only call for new information when visuals have changed significantly or the user has moved a certain distance. The server 120 sends information on a granular level based on the scope that the client using the AR device 110 has most recently entered. For example, when a client enters a facility, based on his particular preference during the initial installation, a configuration will be retrieved from the server 120 to the client specifying how tags (e.g., RFID, QR, or manual queue) are used, as well as whether geometries are used and to what granularity. As a user enters a sub-area, or is in the vicinity of a piece of equipment, the server 120 will then provide more and more granular information as the client enters different zones, either via sensors specified in the configuration or via its manual pinging.
  • Information can be retrieved in a row-line format of column data. Depending on the user's configuration (e.g., preference when setting up the user's data), the user can select from a list of equipment as he navigates to more and more granular levels of identifying his area/equipment. If there is data on the server 120, the user will get coordinates, or geometries, as it applies as is available for that facility.
  • The AR device 110 may include multiple infrared cameras to distinguish geometries. The AR device 110 will use the multiple cameras to scan the geometries to retrieve their metadata (such as angles, geometries, closest basic overall mesh color of the geometry, etc.). It will send a query to the server 120 with this data periodically to retrieve possible matches. Images of the equipment previously captured by any of the devices (110, 180, 185, 190) and stored by the server 120, along with a visual scan of an area of the facility using a geometry analyzing function or API, may be used to determine whether there is a similar fit in terms of the number of edges and the number of geometric objects. The system may be configured to confirm that the equipment being looked at matches the representation determined by the database algorithm and the visual device. If no exact match is found, the system will show other equipment in the nearest geometrically mapped vicinity/scope.
  • FIG. 4 illustrates an example of the augmented view 205-1 being used to present augmented data. For example, in the lens view 204-1, the user is viewing facility components, and in the augmented view 205-1 the user is viewing information about one or more of the components. In the example shown in FIG. 4, the equipment includes an Electrical Switchgear and the augmented view 205-1 presents data such as the name of the equipment (e.g., “Switchgear A”), a current instruction to perform on the equipment (e.g., “check voltage levels”), the name of the room the equipment is housed within (e.g., “Main Electrical Room”), the time, date, etc. The presentation subsystem 320 formats the data presented in the augmented view 205-1 based on augmented data pre-stored on the augmented reality device 110 or retrieved wirelessly from the mobile device 180 or the central server 120 when WiFi is present. It is assumed that prior to presenting the augmented data, a user wearing the augmented reality device 110 passed and/or scanned a tag identifying the room or the component so that the augmented reality device 110 can retrieve the next instruction that corresponds to the identified room or component. The data presented in the augmented view 205-1 shown in FIG. 4 may also be presented on a display of the mobile device 180 or a screen of the watch 185.
  • After the user performs the instruction, the user can acknowledge that the instruction was performed by pressing a button on control panel 206, which sends an acknowledge command to the controller shown in FIG. 3. The control panel 206 may be located on the watch 185 to allow the user to send the acknowledge command using the watch 185. The user may also use the mobile device 180 to send the acknowledge command.
  • Upon receipt of the acknowledge command, the controller can present a next step in the augmented view 205-1, a screen of the watch 185, or a display of the mobile device 180, that is to be performed on a component in the same room, or another instruction (e.g., “head to another room”).
  • While the control panel 206 is illustrated in FIG. 2 as being located next to the camera lens 203, the position of the panel 206 can be moved to various locations on the augmented reality device 110, or may be present on watch 185.
  • The panel 206 may include at least one depressible button such as an option to advance to display a next instruction (e.g., a “+”), an option to display a previous instruction (e.g., a “−”), an option to acknowledge that a current instruction has been performed, etc. For example, when the user has used the panel 206 or the mobile device 180 to acknowledge that a current instruction has been performed, the presentation subsystem 320 displays the text of the next step in the augmented view 205-1, on a display of the mobile device 180, or on a screen of the watch 185.
  • In an alternate embodiment, the controller is configured to receive voice commands from the user. For example, the sensor subsystem 340 may include a microphone for receiving the voice commands and the memory 360 may store speech recognition software executed by the processor 310 to interpret the entered voice commands. For example, the user can acknowledge that the current instruction has been completed by speaking a term recognized by the processor 310 as indicating that the current step has been completed. Upon the processor 310 recognizing that the user has responded with a voice command indicating that a current step has been completed, the controller can present the next step in the augmented view 205-1, on a display of the mobile device 180, or on a screen of the watch 185.
  • While the above discusses use of information to guide the operator in performing maintenance steps during a walkthrough of a facility, the invention is not limited thereto. For example, a user may be presented with information about a given component such as parameter data about the component, a warning about the component, a data trend corresponding to the component, etc.
  • Further, while FIG. 4 illustrates an Electrical Switchgear, the invention is not limited to providing instruction steps or information for an Electrical Switchgear, as information and instructions for various different types of facility components may be presented. For example, the facility components may include at least one of an Uninterruptible Power Supply (UPS), a Power Transfer Switch (PTS), a Computer Room Air Conditioner (CRAC), a Generator, a Boiler, or any other type of facility component that is included in a facility's core infrastructure.
  • FIG. 5 illustrates another example of the augmented view 205-1 being used to present augmented data according to an exemplary embodiment of the invention. For example, the user views facility components 501, 502, 503, 504, . . . , 50 n that are part of an equipment path in the lens view 204-1 and a path 510 through the equipment that provides the user with safe passage through the equipment. For example, when the components are transformers, an arc flash may occur that can injure someone that is too close to the equipment.
  • An arc flash is a type of electrical explosion that results from a low-impedance connection to ground or another voltage phase in an electrical system. Arc flashes are often witnessed from lines or transformers just before a power outage, creating bright flashes. Most 480 volt electrical equipment has sufficient capacity to cause an arc flash hazard. Higher voltages can cause a spark to jump, initiating an arc flash without the need for physical contact.
  • Thus, a user wearing the AR device 110 who is about to walk past equipment is presented with the safe path in the augmented view 205-1. For example, the user is not likely to be injured by an arc flash if he stays inside the dotted lines. The safe distances between each component in the facility may be stored in the server 120 or the mobile device 180 and pre-retrieved by the augmented reality device 110 prior to entering the facility. A tag at the entrance of a room or at the beginning of a path through the equipment may identify the components along the path, so that the augmented reality device 110 can retrieve the corresponding safe distances, and then the presentation subsystem 320 can display these distances like the boundary path 510 illustrated in FIG. 5.
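  • As a minimal illustration of the safe-distance lookup described above, the following sketch maps tag-identified component IDs to stored safe approach distances and returns the offsets used to draw a boundary path such as 510; the component IDs, distances, and function names are hypothetical and not part of any actual implementation:

```python
# Minimal sketch of the safe-path lookup described above (illustrative names only).

# Hypothetical table of safe approach distances, in meters, keyed by component ID.
SAFE_DISTANCES = {"XFMR-501": 1.2, "XFMR-502": 1.2, "XFMR-503": 0.6, "XFMR-504": 1.2}

def boundary_offsets(tag_component_ids, default_distance=1.5):
    """Return per-component offsets used to draw a boundary path like 510.

    tag_component_ids: component IDs read from the tag at the room entrance.
    Components without a stored value fall back to a conservative default.
    """
    return [(cid, SAFE_DISTANCES.get(cid, default_distance)) for cid in tag_component_ids]

# Example: offsets for the components along the path in FIG. 5.
print(boundary_offsets(["XFMR-501", "XFMR-502", "XFMR-503", "XFMR-504"]))
```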
  • The watch 185 can vibrate (e.g., using an internal vibration motor) or provide an audible or visible alert when the user is getting close to stepping outside the safe boundaries, such as those illustrated in FIG. 5. For example, the watch 185 could display green to indicate the user is within a safe boundary from the equipment, yellow to indicate the user is getting too close to the boundary, and red to indicate the user is outside the safe boundary (i.e., too close to the equipment). However, other colors or other graphical indicators may be used. The boundary path 510 may be presented on a display of the mobile device 180 or a screen of the watch 185.
  • The boundary path 510 shown in FIG. 5 has a left side and a right side. However, if there is no equipment on one side, for example the right side, or if the equipment on the right side is not capable of causing arc flashes, the right dotted lines would be omitted. In the example shown in FIG. 5, the power rating of the component 503 allows the user to be closer, and accordingly its adjacent dotted line is closer to component 503. The boundary path 510 can change dynamically as the user walks past equipment (e.g., marked with a corresponding tag or recognized through image recognition), since the power ratings of each component can vary. While FIG. 5 illustrates the boundary path 510 as having dotted lines, the invention is not limited to any particular graphical shape. For example, the boundary path 510 could be represented by other types of lines (e.g., straight, dashed, etc.), or by a multisided shape with straight or curved lines. In the example shown in FIG. 5, any object visible to the naked eye would be visible in the area of the augmented view 205-1 not covered by the dotted lines. Further, the invention is not limited to arc flash boundaries, as there may be other reasons for staying a certain distance away from equipment, such as to prevent electrostatic discharge, to prevent excessive inhalation of noxious fumes, to protect the user from being harmed by equipment with sharp edges, to keep the user away from equipment that does not need to be serviced, etc.
  • FIG. 6 illustrates another example of the augmented view 205-1 being used to present an arc flash hazard according to an exemplary embodiment of the invention. As shown in FIG. 6 the equipment, the walls, and floor of the facility are visible in the lens view 204-1 and the augmented view 205-1 comprises several augmented images 520, 521, 522, 523, and 524. The images 520-524 are part of an example of an alert that the user would see when walking near electrical equipment. Each of the images 521-524 represents a different path, where each path has a different shade or color to indicate a different amount of energy that one would be exposed to in the event of an arc flash. For example, a first shading or color (e.g., green) associated with the first image path 521 could indicate a first amount of energy, a second shading or color (e.g., yellow) associated with the second image path 522 could indicate a second amount of energy higher than the first, a third shading or color (e.g., gold) associated with the third image path 523 could indicate a third amount of energy higher than the second, and a fourth shading or color (e.g., red) associated with the fourth image path 524 could indicate a fourth amount of energy higher than the third. Other colors or shading styles may be used, and additional or fewer image paths may be presented. The unshaded portions outside the paths 521-524 indicate zero exposure to arc flash hazards. The augmented image 520 provides a textual warning, which may be omitted. In an exemplary embodiment, one or more of the image paths 521-524 are transparent to allow the floor to be seen.
  • The augmented images 520-524, or data derived therefrom, may be presented on a display of the mobile device 180, or a screen of the watch 185. In another embodiment, the watch 185 can vibrate when the user approaches an outer one of the paths, such as path 521. For example, the watch 185 can vibrate at different frequencies to indicate different levels of energy that one would be exposed to in the event of an arc flash. For example, a lower level of energy such as what would be encountered by walking within path 521 could be indicated by the watch vibrating at a first frequency and a higher level of energy such as what would be encountered by walking within path 522 could be indicated by the watch 185 vibrating at a second frequency higher than the first.
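  • A minimal sketch of the vibration-frequency mapping described above, assuming hypothetical zone names and frequencies, since the actual values are not specified:

```python
# Illustrative mapping from arc-flash energy zones (paths 521-524) to watch
# vibration frequencies; the zone names, frequencies, and units are assumptions.
VIBRATION_HZ = {"path_521": 2, "path_522": 4, "path_523": 8, "path_524": 16}

def vibration_for_zone(zone):
    """Return a vibration frequency (Hz) that rises with the energy of the zone."""
    return VIBRATION_HZ.get(zone, 0)  # 0 Hz (no vibration) outside all paths

print(vibration_for_zone("path_522"))  # -> 4
```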
  • The watch 185, mobile device 180, or the augmented reality device 110 may produce audible alarms to indicate the user is about to step into one of the paths 521-524 or upon entering one of the paths. The audible alarms may be different to indicate the different energy levels. For example, the audible alarm could be a beeping that increases in volume and/or frequency as the user moves from a lower energy path to a higher energy path.
  • FIG. 7 illustrates another example of the augmented view 205-1 being used to present augmented data according to an exemplary embodiment of the invention. As shown in FIG. 7 the equipment, the walls, and floor of the facility are visible in the lens view 204-1 and the augmented view 205-1 comprises several augmented images 530, 531, 532, and 533. The augmented images 530 and 532 may be referred to as boundary images that correspond to the boundary of one or more facility components. In an exemplary embodiment, the boundary images 530 and 532 are transparent so that the underlying component is visible. The boundary images 530 and 532 may be different colors or shading styles to indicate whether the underlying equipment has been serviced recently or is in need of service. For example, the first boundary image 530 is presented in a color (e.g., green) that indicates it was recently serviced and the second boundary image 532 is presented in a color (e.g., red) that indicates it is in need of service. The boundary images may, however, be presented in other colors or shading styles. The textual images 531 and 533 may be presented to indicate the actual date the underlying equipment was last serviced, and/or to warn that service is overdue. For example, the first textual image 531 may textually and/or graphically (e.g., in green) indicate that service was recent and the second textual image 533 may textually and/or graphically (e.g., in red) indicate that service is overdue. The textual images 531 and 533 may be omitted.
  • At least some of the augmented images 531-533 or data derived therefrom may be presented on a screen of the watch 185 or a display of the mobile device 180. For example, if the user is within a certain distance from equipment that needs to be serviced, the watch 185 or the mobile device 180 can indicate with a graphical, audible, or vibratory cue that the equipment needs to be serviced.
  • FIG. 8 illustrates another example of the augmented view 205-1 being used to present augmented data according to an exemplary embodiment of the invention. As shown in FIG. 8 the equipment, the walls, and floor of the facility are visible in the lens view 204-1 and the augmented view 205-1 comprises several augmented images 541 and 542. The augmented data presented in FIG. 8 is a warning of an action that should not be performed on equipment. The augmented data may include a boundary image 541 that surrounds the underlying equipment and may be transparent to allow the underlying equipment to be viewed. The augmented data may include a textual image 542 that describes textually and/or graphically (e.g., in yellow) the specific action not to be performed (e.g., do not turn off power, do not flip switch, do not turn lever, etc.).
  • At least some of the augmented images 541 and 542 or data derived therefrom may be presented on a screen of the watch 185 or a display of the mobile device 180. For example, if the user is within a certain distance from equipment that should be avoided, the watch 185 or the mobile device 180 can indicate with a graphical, audible, or vibratory cue that the equipment needs to be avoided.
  • FIG. 9 shows an example of the augmented view 205-1 being used to present augmented data according to an exemplary embodiment of the invention. In FIG. 9, the user is observing a facility component 600 with several sub-components 610, 620, 630, . . . , 63n, and the augmented view 205-1 indicates that the third sub-component 630 should be operated next. In an exemplary embodiment, each sub-component (e.g., 610, 620, 630, . . . , 64n) is marked with a tag that identifies the sub-component and its location so that the presentation subsystem 320 can move the augmented view 205-1 to the identified location, assuming it corresponds to a sub-component that needs to be next acted upon. For example, when tags with barcodes are used and a user wearing the augmented reality device 110 views the tag of a sub-component, the camera takes a picture of the barcode, and the display 205 presents the augmented data in the augmented view 205-1 only if the augmented reality device 110 determines augmented data is available for the scanned sub-component. A camera of the mobile device 180 or the watch 185 may also be used to take a picture of the barcode.
  • In another example, the component 600 is marked with an RFID or NFC tag that identifies all the sub-components and all their relative locations, a reader of the AR device 110, the watch 185, or the mobile device 180 scans the tag, and the AR device 110 displays the augmented view 205-1 at a location based on the location of the tag and the relative location corresponding to the sub-component that is next to be operated on. For example, a single tag can be located just to the left of the first sub-component 610 that represents the location of component 610 and scanning of the tag could indicate the third sub-component 630 is offset to the right by 12 inches so that the augmented view 205-1 is presented next to the third subcomponent 630. The data presented in the augmented view 205-1 of FIG. 9 can also be presented on the screen of the watch 185 or a display of the mobile device 180 when the user is close to the next sub-component that needs to be acted upon.
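  • The following sketch illustrates the relative-offset placement described above, assuming the tag encodes each sub-component's offset from the tag position; the data layout and names are hypothetical:

```python
# Sketch of placing the augmented overlay from a single tag, as described above.
# The tag is assumed to encode each sub-component's offset (in inches) from the
# tag position; fields and values are illustrative.
TAG_DATA = {
    "component": "600",
    "sub_components": {"610": 0, "620": 6, "630": 12},  # offset right of the tag
}

def overlay_position(tag_x, tag_y, next_sub_component, tag_data=TAG_DATA):
    """Return (x, y) at which to present the augmented view for the next step."""
    offset = tag_data["sub_components"][next_sub_component]
    return (tag_x + offset, tag_y)

# Example: the tag sits just left of sub-component 610; 630 is 12 inches to the right.
print(overlay_position(0, 0, "630"))  # -> (12, 0)
```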
  • FIG. 10A shows an example of the augmented view 205-1 being used to present augmented data, according to an exemplary embodiment of the invention. The augmented view 205-1 includes augmented images 621 and 622. The first augmented image 621 identifies the next part of the equipment (e.g., a sub-component) that is to be operated on. For example, the first augmented image 621 may be a boundary image that surrounds the part such as a rectangle, square, circle, etc. The second augmented image 622 describes textually (e.g., use key to unlock breaker, etc.) the current action/step that is to be performed on the identified equipment part. The action or step may be derived from a procedure for the equipment that is initially provided by server 120. For example, a scanner of the AR device 110, the mobile device 180, or the watch 185 can scan a tag (e.g., RFID, barcode, etc.) near the equipment that identifies the equipment, or a camera of the device 110, 180, or 185 can snap a picture that can be used to identify the equipment by image recognition, and then, assuming a procedure for the equipment has already been downloaded from server 120, the AR device 110 can present a current step of the procedure in the augmented view 205-1 using the augmented images 621 and 622.
  • FIG. 10B shows the result of the user performing the action requested in FIG. 10A. For example, since the user has unlocked the breaker, he is no longer instructed to perform the current action. The user can notify the system that the action has been performed by pressing a button on the control panel 206 on devices 110 or 185, touching a screen of the mobile device 180, or through a voice command received by devices 110, 185, or 180. The user can request a next step in the procedure by pressing the same or a different button on the control panel 206 on devices 110 or 185, or by again touching a screen of the mobile device.
  • FIG. 10C illustrates a second step being presented to the user in the augmented view 205-1 using a first image 621 to identify the component to be operated on and a second image 622 to present the next step of the procedure (e.g., power switch on). The user can again notify the system that the next action has been performed by pressing the button on the control panel 206 or through a voice command.
  • FIG. 10D illustrates that no more steps are left in the procedure and the procedure has been completed. For example, FIG. 10D illustrates a textual image 622 that indicates that the procedure is complete.
  • The augmented images 621 and/or 622, or data derived therefrom may be presented on a screen of the watch 185 or a display of the mobile device 180. For example, when the user is within a certain distance of equipment for which a current procedure step is to be performed, the screen of the watch 185 or the display of the mobile device can indicate a current procedure step that is to be performed. A button of the watch 185 (e.g., on panel 206) or a touch of a screen of the mobile device 180 can be used to acknowledge to the system that the current procedure step has in fact been performed.
  • The augmented reality device 110, the mobile device 180, or the watch 185 can enable the user to perform data collection, view data trends, and perform operational procedures including Standard Operating Procedures (SOPs), Emergency Action Procedures (EAPs), Emergency Operating Procedures (EOPs), Maintenance Procedures (MPs), Method of Procedures (MOPs) and other facility documentation.
  • Since the mobile device 180 includes a larger screen than the augmented reality device 110 or the watch 185, the steps can be presented in more detail on the mobile device 180. For example, if the user wearing the AR device 110 scans a tag associated with equipment, while a brief message related to the component can be displayed in the augmented view 205-1 or on the watch 185, more information about the component or the facility (e.g., a schematic diagram, the entire procedure, etc.) can be displayed on the mobile device 180.
  • FIG. 11 illustrates an example of a possible graphical user interface (GUI) 700 that can be presented on the mobile device 180, in the augmented view 205-1, or on the screen of the watch 185.
  • Referring to FIG. 11, the GUI 700 includes a walkthrough mode 710 that is selected to enable the user to configure their facility into separate rooms. Each of the rooms may be represented by selectable room buttons 711. When the user selects one of the room buttons 711, the electrical and mechanical apparatuses that are associated with the selected room will appear.
  • When the room buttons 711 appear on the display of the mobile device 180, a user can use a touch screen of the device to touch the buttons 711. When one or more of the room buttons 711 appear in the augmented view 205-1 or a screen of the watch, a user can advance to and select the room buttons 711 by selecting a button of panel 206. When the watch is used, the panel 206 may be located on the watch.
  • Each room button 711 may provide a graphical indicator indicating whether the room has already been configured (e.g., a check) or has yet to be configured (e.g., an ‘x’ or a blank box). However, the invention is not limited thereto. For example, the graphical indicators illustrated in FIG. 11 are merely examples, as other graphical symbols or text may be used to convey the same information.
  • The user may also upload facility floor layout plans to be used for navigation of this screen. The user may configure areas of the floor layout plan to correspond to rooms within the system. Selecting the room from this view will display the electrical and mechanical apparatuses as previously described.
  • Selection of one of the available room buttons 711 brings up a new interface screen that enables the user to enter a new facility component that is housed within the corresponding room, or view/edit facility components that were previously entered (e.g., either manually or automatically). Further, one or more of the facility components in the rooms may be pre-loaded automatically using default facility component templates or site templates. The default facility component template may be used by the GUI 700 to provide the user with a list of available facility components. Custom facility component fields may also be created by the user and/or added to the default facility component template. In this way, each room may be configured to accommodate a unique facility component setup (e.g., UPS, PTS, switchgear, generator, power distribution unit (PDU), boiler, chiller, etc.).
  • FIG. 12 illustrates an exemplary screen of the GUI 700 when the walkthrough mode 710 is selected. This screen enables a user to be guided through a facility walkthrough room by room, clearly indicating data values that may be recorded for each facility component. In the example shown in FIG. 12, the screen includes the name of the room 712, an image 713 of the selected facility component or the room, a data entry pane 714, and buttons 715 for selecting one of the available facility components/equipment in the room. The image 713 and the room name 712 are optional. The buttons 715 may include labels that identify the corresponding facility components, which can be revised as necessary by the user.
  • In an exemplary embodiment, all or a portion of the information presented in FIG. 12 is presented in the augmented view 205-1 or a screen of the watch, and the user can advance to different fields by using a button on panel 206 located on device 110 or the watch, or by using voice commands received through device 110 or the watch. When the information presented in FIG. 12 is presented on the mobile device 180, a user can adjust the different fields using a touch screen of the device 180.
  • Since the augmented reality device 110 includes a camera, the image 713 (or a video) may be captured using its camera. A camera may also be present within the watch to capture the image/video. The room name 712 field may be edited by the user using a virtual/physical keyboard of the mobile device 180 to identify the room, or by the user speaking into a microphone of the augmented reality device 110, the watch, or the mobile device 180.
  • The data entry pane 714 includes one or more parameters and data entry fields corresponding to the parameters associated with the selected facility components. For example, the parameters and corresponding data entry fields for a UPS could include its current battery voltages, currents, power levels, power quality, temperatures, statuses, alarms, etc.
  • In an exemplary embodiment, the data entry pane 714 is presented in the augmented view 205-1 or a screen of the watch, and the user can advance to and change different parameters within the pane 714 by selecting one or more buttons of the panel 206, which is located on the device 110 or the watch.
  • In an exemplary embodiment of the invention, the data fields can be one of various field types, such as numeric (e.g., integer or decimal), a text string, an array of choices, or a checkbox. The text string may be input via a virtual/physical keyboard of the mobile device 180 or by a user speaking into a microphone of the augmented reality device 110 or the watch.
  • The array of choices may be a selectable list box or dropdown menu with selectable items. The selectable choices can be presented in the augmented view 205-1 or on a screen of the watch, where each choice could be advanced to and/or selected by pressing one or more buttons of the panel 206, which can be located on the augmented reality device 110 or the watch.
  • Each item may be associated with or return a unique integer value when selected that corresponds to an index into an array that stores the items of the list/menu. The data field may also be a label with one or more selectable arrow buttons that allow the user to increment or decrement the value on the label by a predefined amount. For example, when the selectable arrow buttons are presented in the augmented view 205-1 or a screen of the watch, an arrow button can be selected by selecting a button on panel 206. Selection of the checkbox may be stored as an integer representing whether the checkbox has been checked (e.g., 1) or unchecked (e.g., 0).
  • The application may maintain a data structure or object that corresponds to a facility component, which may comprise one or more of the above-described data fields. The equipment object (or facility component object) may include data regarding its name, type, image file location, its collection of fields, and a collection of document references. When an object is used, it may include access methods (e.g., object methods) that can be called by the system to set its data and read its data. The name may be a string representation of the name of the equipment/facility component (e.g., “Ferro-Resonant UPS”, “Line-Interactive UPS”, etc.)
  • The image file location is the string representation of an absolute or relative file path of the image file (e.g., a .png, .jpg) that may either be located within a memory file system of the AR device 110, the mobile device 180, or its appropriate location on a memory file system of the server 120 that visually describes the equipment/facility component, which may have been captured by the camera of the augmented reality device 110. The collection of fields is a collection of field objects that pertains to the equipment/facility components. Likewise, the collection of documents is a collection of strings that point to the absolute or relative file path of the documentation files that may either be located within the memory file system of devices 110 or 180, or its appropriate location on a memory file system of the server 120 that describe the structure, use, properties, or maintenance of the specific equipment/facility component.
  • Both fields and equipment/facility components may be used as collections within a facility room, which the system can maintain using a room object. Room objects are facility component objects that have data regarding the name, image file location, a collection of fields, and a collection of documents for the particular room. Room objects may also have a collection of equipment/facility components, as mentioned previously, which is literally a collection of equipment/facility component objects whose data collection interfaces are spatially located within that room area. In addition, room objects may have data pertaining to its representation in the walkthrough data collection. For example, a room object may include a flag that indicates whether or not the room is required to be checked during a specific scheduled walkthrough, as well as data denoting the percentage of fields within the room and the fields within the equipment/facility components of the room that may have been completed (whether problematic or not) over all the fields within the room and its equipment/facility components.
  • Fields, equipment/facility components, and rooms may be used as collections within a facility area, which the devices 110/180 or server 120 can maintain using an area object. An area is a specific dimension of facility space that separates the total collection of facility rooms into smaller collections. Each area object may have a name and an image file location for a picture representing that area. Areas are not only limited to different areas within the facility building itself, but also include rooftops and outside areas of a facility.
  • Fields, equipment/facility components, rooms, and areas may be used as collections within a facility, which the devices 110/180 or server 120 can maintain using a facility object. The facility object itself may have a name, address, image file location, a collection of all its area objects, and a collection of all the actual document objects pertaining to the entire facility. All of the rooms and equipment/facility components may have a reference to the facility's master list of documents in order to link themselves to a specific document or collection of documents.
  • Fields, equipment, rooms, areas, and facilities may be used as collections within a client profile, which the devices 110/180 or server 120 can maintain using a software license object. The software license object itself may have an organization name, owner, license key, a collection of all its facility objects, and a collection of all the user objects working for or within the facility. Users of the application may be represented within the devices 110/180 or server 120 using user objects. The user object itself may have a name and privileges.
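  • As a minimal sketch, the object hierarchy described above (fields, equipment/facility components, rooms, areas, facilities, and a software license/client profile) could be modeled with simple data classes such as the following; the attribute names are paraphrased from the description and are illustrative only, not taken from an actual implementation:

```python
# Illustrative data classes for the object model described above.
from dataclasses import dataclass, field
from typing import List

@dataclass
class DataField:
    name: str
    value: object = None           # numeric, string, choice index, or checkbox flag

@dataclass
class Equipment:
    name: str                      # e.g., "Ferro-Resonant UPS"
    equipment_type: str
    image_path: str                # absolute or relative path to a .png/.jpg
    fields: List[DataField] = field(default_factory=list)
    documents: List[str] = field(default_factory=list)   # paths to documentation

@dataclass
class Room:
    name: str
    image_path: str
    fields: List[DataField] = field(default_factory=list)
    documents: List[str] = field(default_factory=list)
    equipment: List[Equipment] = field(default_factory=list)
    required_in_walkthrough: bool = True
    percent_complete: float = 0.0

@dataclass
class Area:
    name: str
    image_path: str
    rooms: List[Room] = field(default_factory=list)

@dataclass
class Facility:
    name: str
    address: str
    image_path: str
    areas: List[Area] = field(default_factory=list)
    documents: List[str] = field(default_factory=list)   # facility master list

@dataclass
class SoftwareLicense:
    organization: str
    owner: str
    license_key: str
    facilities: List[Facility] = field(default_factory=list)
    users: List[str] = field(default_factory=list)        # user names/privileges
```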
  • The data entered into the fields may be stored in a database in the memory 360 and/or in the remote database 130. When a session is saved to the database 130, every room within the facility may be stored as a database table. A time stamp may be applied to each and every walkthrough session. Each record in that table may contain as columns every single field from that room and its equipment/facility components, as well as the time stamp of the walkthrough session. Every time a walkthrough session is saved, it may either overwrite the latest record within the table, or insert a new record into the table. If the current session being saved is a new session, it may insert a new record, but if it is a continued session that was previously saved it may overwrite the last record saved.
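  • A minimal sketch of the per-room save behavior described above, in which a new session inserts a new record and a continued session overwrites the most recent record; sqlite3 and the table/column names are used for illustration only, since the storage backend is not specified:

```python
# Sketch of saving a walkthrough session record for one room.
import sqlite3
import time

def save_room_record(conn, room_table, values, is_new_session):
    """Insert a new timestamped record, or overwrite the latest one."""
    cols = ", ".join(values.keys())
    placeholders = ", ".join("?" for _ in values)
    if is_new_session:
        conn.execute(
            f"INSERT INTO {room_table} (timestamp, {cols}) VALUES (?, {placeholders})",
            [time.time(), *values.values()],
        )
    else:
        # Continued session: overwrite the most recent record in the table.
        sets = ", ".join(f"{c} = ?" for c in values)
        conn.execute(
            f"UPDATE {room_table} SET {sets} "
            f"WHERE timestamp = (SELECT MAX(timestamp) FROM {room_table})",
            list(values.values()),
        )
    conn.commit()

# Example usage with an in-memory database and invented columns.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE boiler_room (timestamp REAL, boiler_temp REAL)")
save_room_record(conn, "boiler_room", {"boiler_temp": 200}, is_new_session=True)
save_room_record(conn, "boiler_room", {"boiler_temp": 205}, is_new_session=False)
```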
  • Each walkthrough session is saved in a manner that the devices 110/180 or server 120 may maintain using a walkthrough session object. The walkthrough session object itself may have a timestamp for when the session began, a timestamp for when the session ends, the user who performed the walkthrough session, and the data collected.
  • During walkthrough sessions, users may record information as comments attached to facility components. These comments may be maintained by the devices 110/180 or the server 120 as a comment object. The comment object itself may contain a title, a message, a timestamp, an image, a user as an author, and a sound file.
  • When the application is started, it may first retrieve the stored object-oriented data from the device memory, which may be encoded. The encoded data for the facility and its areas, rooms, equipment/facility components, fields, users, comments, walkthrough sessions and documents may be decoded and recreated at runtime, after a user attempting to login to the application has been authenticated. Afterwards, if any synchronization to a remote database is to be made, the last record from each table in the local database (e.g., each table may refer to a specific room within the facility) may update the values of all the fields within the facility's field objects. Also, as far as data analysis is concerned, an entire set of records across multiple timestamp ranges may be imported into the application from the remote database for the sake of viewing, analyzing, and reporting trends throughout time across equipment/facility components and fields.
  • Data from a record set may be collected as an array of arrays (a two-dimensional array). In an exemplary embodiment, the first position of the two-dimensional array refers to the index of the record retrieved, uniquely identified by its timestamp, while the second position of the two-dimensional array refers to the column of the record retrieved. An embodiment of the application may point to and retrieve a specific data field collected from any time. In an exemplary embodiment, multiple record sets are obtained, one for each database table (e.g., one for each room), from which a user may choose any collected data field from any time, as far as the database record set allows.
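  • A minimal sketch of the two-dimensional record-set indexing described above, with invented sample values; the first index selects a record (identified by its timestamp) and the second selects a column of that record:

```python
# Illustrative record set: one row per timestamped record, one column per field.
record_set = [
    # timestamp,   boiler_temp, battery_voltage
    ["2015-07-01", 200,         48.1],
    ["2015-07-02", 205,         48.0],
    ["2015-07-03", 226,         47.8],
]

def field_at(records, record_index, column_index):
    """Retrieve a specific collected data field from a specific time."""
    return records[record_index][column_index]

print(field_at(record_set, 2, 1))  # boiler temperature recorded on 2015-07-03
```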
  • The devices 110/180 or server 120 may compare the entered parameter data against stored thresholds or previously entered data to determine whether an error has occurred. The thresholds may include a Maximum Threshold and a Precise Threshold.
  • Data fields that conform to the Maximum Threshold may not exceed the nominal value (Xnominal). Warnings may be displayed on the GUI 700 when the data (Xactual) falls outside a tolerance value (T %) as shown by the following Equation 1:

  • Xactual>Xnominal−(Xnominal*T %)  (1).
  • For example, if the tolerance value is 10%, the actual temperature of a boiler is 200 degrees, and the nominal value is 250 degrees, since an actual of 200 is not above 250−(250*0.1) (i.e., 200 is not above 225), no warning would be displayed. However, if the temperature had risen to 226 degrees in this example, a warning would have been displayed.
  • The devices 110/180 or server 120 may also, or instead, average the previously logged data/parameter samples with the currently entered parameter data, and compare this average value with the corresponding threshold to determine if a warning should be displayed. For example, if the current boiler temperature were entered at 226 degrees but the prior 4 samples had the temperature at 200 degrees, the overall average (about 205 degrees) remains below the 225 degree warning threshold, so no warning would be displayed. The number of samples used for this averaging may vary and may be a configurable parameter.
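  • A minimal sketch of the Maximum Threshold check of Equation 1 and the optional averaging described above, using the worked boiler example; the function names and the 10% default tolerance are illustrative:

```python
# Maximum Threshold check (Equation 1) and averaging, per the description above.
def max_threshold_warning(actual, nominal, tolerance=0.10):
    """Warn when the value rises above nominal minus the tolerance margin."""
    return actual > nominal - nominal * tolerance

def averaged_warning(current, previous_samples, nominal, tolerance=0.10):
    """Average the current reading with prior samples before applying Equation 1."""
    samples = list(previous_samples) + [current]
    average = sum(samples) / len(samples)
    return max_threshold_warning(average, nominal, tolerance)

# Worked example from the text: nominal 250 degrees, 10% tolerance.
print(max_threshold_warning(200, 250))                   # False: 200 <= 225
print(max_threshold_warning(226, 250))                   # True:  226 >  225
print(averaged_warning(226, [200, 200, 200, 200], 250))  # False: average ~205
```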
  • Data fields that conform to the Precise Threshold must not rise above or fall below the nominal value by more than the tolerance value. The GUI 700 may display a warning when data is outside of the tolerance as shown by the following Equation 2:

  • Xactual>Xnominal+(Xnominal*T %) OR Xactual<Xnominal−(Xnominal*T %)  (2).
  • If the value of the data field is beyond the threshold, then the value of the field is out of range and the system may alert the user. For example, if the tolerance value is 10%, the actual temperature of the boiler is 100 degrees, and the nominal value is 140 degrees, a warning would be displayed because an actual of 100 is less than 140−140*0.1 (i.e., 100 is less than 126). However, if the boiler temperature rises to 150 degrees, a warning would not be displayed since 150 is lower than 154 (i.e., 140+140*0.1).
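  • A minimal sketch of the Precise Threshold check of Equation 2, using the worked boiler example; the names are illustrative:

```python
# Precise Threshold check (Equation 2): warn when the value is more than the
# tolerance above or below nominal.
def precise_threshold_warning(actual, nominal, tolerance=0.10):
    upper = nominal + nominal * tolerance
    lower = nominal - nominal * tolerance
    return actual > upper or actual < lower

# Worked example from the text: nominal 140 degrees, 10% tolerance.
print(precise_threshold_warning(100, 140))  # True:  100 < 126
print(precise_threshold_warning(150, 140))  # False: 126 <= 150 <= 154
```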
  • If it has been determined that a warning is to be displayed, the devices 110/180 or server 120 may notify the user to take corrective action to maintain the facility components before a failure occurs. The notification may appear as a visual on the GUI 700, an audible alert, or a vibratory alert. For example, the visual can be presented on a display of the mobile device 180, in the augmented view 205-1, or in the screen of the watch 185. For example, a speaker of the devices 110, 180, or 185 can sound the audible alert, or a vibrating mechanism of the devices 110, 180, or 185 can vibrate to present the vibratory alert.
  • For example, parameter data for facility components that are outside of the threshold limits may be marked with visual cues on a display of the mobile device 180, in the augmented view 205-1, or on a screen of the watch. Further, different levels of alerts may be present. For example, a low level alert may become an elevated alert if new data is entered outside of a user configurable normal tolerance threshold. Since the error may have been caused by a data entry error, the user can choose to commit the data with the value as is, or edit the data before it is committed.
  • The notifications may be managed by selecting the configurations tab 730. For example, the interface 700 enables the user to indicate who should be contacted and the form used for the contact (e.g., SMS text message, MMS message, e-mail, social network message, etc.). For example, in addition to presenting the user of the devices 110/180/185 with a visual or audible notification, the devices 110/180/185 can notify the site administrator via an automatically generated email, text message, social network message, etc. The devices 110/180/185 may enable the user to set the system into a training mode that prevents the application from sending a message to a remote party if it is determined that the facility components are not functioning properly. The devices 110, 180, 185, or server 120 may include a wireless transceiver and send a message using the wireless transceiver if it is determined that a facility component is not functioning properly.
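  • A minimal sketch of the notification behavior described above, in which the local user is always alerted but remote messages are suppressed while in training mode; the contact structure and the send_message helper are assumptions, not an actual API:

```python
# Illustrative notification dispatch with a training-mode guard.
def alert_local_user():
    print("ALERT: parameter outside threshold")  # visual, audible, or vibratory cue

def notify(out_of_range, contacts, training_mode, send_message):
    if not out_of_range:
        return
    alert_local_user()
    if training_mode:
        return  # suppress remote messages while training
    for contact in contacts:
        # contact example: {"address": "admin@example.com", "channel": "email"}
        send_message(contact["address"], contact["channel"],
                     "Facility component parameter out of range")

# Example usage with a stand-in message sender.
notify(True,
       [{"address": "admin@example.com", "channel": "email"}],
       training_mode=False,
       send_message=lambda addr, channel, msg: print("send", channel, addr, msg))
```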
  • The user may record comments in the comment field 716 for each room and for each facility component within the room. For example, the comment field 716 displayed on the mobile device 180 can be advanced to by touching a screen option of the mobile device and then the comment can be entered by using a virtual/physical keyboard. For example, the comment field 716 displayed in the augmented view 205-1 or a screen of the watch can be advanced to using a button on panel 206 and then the comment can be entered by a user speaking into a microphone of the augmented reality device 110, the mobile device 180, or the watch 185.
  • The camera of the augmented reality device 110, the device 180, or watch 185 may be used to record one or more pictures that are automatically associated with the comment. The comments and associated pictures may be stored on a database of the devices 110/180/185, or the remote database 130. For example, if the user notices that the temperature of the boiler is too high, he can provide a corresponding comment and snap a photo of the instrument panel of the boiler showing the elevated temperature.
  • Once sufficient data has been captured by the devices 110/180/185, the user may select the trending mode 720 to access a visual representation of all data. In this visual representation, each data field may be graphed against its threshold limits and an estimated prediction trend line may be illustrated to show if and when the thresholds may be exceeded. For example, when the interface 700 is displayed in the augmented view 205-1, the user can use a button of panel 206 to advance to and select the trending mode 720.
  • Analysis of the data may reveal opportunities for the user to take corrective action to maintain facility components before a failure occurs. The trending may identify spikes and dips in recorded data as anomalies. The anomalies may be used as a basis of analysis where related data fields are analyzed for anomalies occurring during the same time period. Correlation between anomalies occurring during the same time period may provide a basis for corrective actions to locate and address the issue.
  • The AR device 110, mobile device 180, watch 185, or the server 120 may trend data input by the user during a walkthrough data collection session. A data trending activity fragment may first retrieve historic data from a database of the devices 110/180/185 or the remote database 130 up to a specified range of time defined by the user, by means of the data retrieval mechanism detailed previously. The data trending activity fragment may then calculate the positive and negative slopes of the sinusoidal curves that were created by the obtained information. The data trending activity fragment may be able to calculate transients (voltage spikes, current spikes, brownouts) by using the trends generated by the data collected. If such problematic transients are implied by the generated trends, the data trending feature may pinpoint the facility components or location of the issue within the facility.
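  • One simple way to realize the trend estimation described above is to fit a straight line to the collected samples and extrapolate when a threshold may be exceeded, as in the following sketch; this is offered only as an illustration, since the actual trending calculation is not specified:

```python
# Illustrative linear-extrapolation trend estimate for one data field.
def predict_threshold_crossing(times, values, threshold):
    """Return the estimated time at which an upward trend line reaches the
    threshold, or None if the values are not trending upward."""
    n = len(times)
    mean_t = sum(times) / n
    mean_v = sum(values) / n
    slope = sum((t - mean_t) * (v - mean_v) for t, v in zip(times, values)) / \
            sum((t - mean_t) ** 2 for t in times)
    if slope <= 0:
        return None
    intercept = mean_v - slope * mean_t
    return (threshold - intercept) / slope

# Example: temperatures rising over days 0..4, warning threshold at 225 degrees.
print(predict_threshold_crossing([0, 1, 2, 3, 4], [200, 205, 210, 216, 221], 225))
```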
  • The trending mode 720 may also be used to generate a report of the collected data stating a brief analysis of said trends or anomalies, and may also include a printout of the data of the selected fields in graphical and tabular format. The report can be presented on a display of the mobile device 180, in the augmented view 205-1, or on a screen of the watch.
  • As shown in FIG. 13, the interface displayed by selecting the trending mode 720 may include an equipment label 721 identifying the equipment (e.g., a facility component), an overlay button 722, and a toggle button 723. The interface of FIG. 13 can be displayed on a display of the mobile device 180 or may be presented in the augmented view 205-1 or the screen of the watch 185.
  • The arrows to the left and right of the equipment name 721 may be used to advance to a preceding or subsequent piece of equipment in the room. For example, when the arrows are presented on the display of the mobile device 180, a user can use a touch screen to select the arrows. For example, when the arrows are presented in the augmented view 205-1 or the screen of the watch, a user can advance to and select the arrows by selecting one or more buttons on panel 206.
  • The trending mode 720 interface enables the user to specify equipment data fields and date ranges to be displayed on the graph for comparison of past and present data. The toggle button 723 can be used to toggle the view of the data between the graphical view and a tabular view. When the toggle button 723 is displayed on a display of the mobile device 180, a user can use a touch screen of the mobile device 180 to select the toggle button 723. When the toggle button 723 is presented in the augmented view 205-1 or the screen of the watch, a user can use one or more buttons of the panel 206 to advance to and select the toggle button 723.
  • The overlay button 722 can be used to view an overlay of projected equipment loading values to identify trends in data that was collected, as shown in FIG. 14. When the overlay button 722 is displayed on a display of the mobile device 180, a user can use a touch screen of the mobile device 180 to select the overlay button 722. When the overlay button 722 is presented in the augmented view 205-1 or the screen of the watch, a user can use one or more buttons of the panel 206 to advance to and select the overlay button 722.
  • As an example, the graph 724 can display one data parameter of a piece of equipment over time, and against the threshold or capacity limits, and an estimated (e.g., extrapolated) trend line will show if and when the thresholds may be exceeded. This information may aid the user in identifying system capacity trends as new facility components are added to the facility. The graph 724 may be presented on the display of the mobile device 180, the augmented view 205-1, or the screen of the watch 185.
  • The user may select equipment data fields to be displayed and analyzed on the data trending graphs. These graphs may be overlaid on the same set of axes for data comparison. The interface 700 may highlight increasing and decreasing trends for the user.
  • In the portfolio mode 750, the devices 110/180/185 or the server 120 may be configured to generate reports of the collected data, a brief analysis, and an overview of the data that was entered for selected fields. The devices 110/180/185 enable the user to generate a full report based on the entered data. The system may be able to manage data, functionalities, and components for multiple facilities.
  • The document library mode 740 can be selected to access one or more documents such as SOPs, EAPs, EOPs, MPs, MOPs, drawings, schematics, or other relevant documentation associated with facility components in the corresponding room or the facility. For example, when the interface is presented in the augmented view 205-1 or the screen of the watch, the user can advance to and select the library mode 740 by selecting one or more buttons of panel 206. For example, when the interface is presented on a display of the mobile device 180, the user can select the library mode 740 by touching its screen.
  • These documents may be accessed in a “step by step” mode of the interface 700 on a display of the mobile device 180, in the augmented view 205-1, or the screen of the watch to guide the user through each step in a procedure of a corresponding one of the documents. For example, text of the steps that have been completed and the subsequent steps can be displayed on the display 205 or the display of the mobile device 180, where the next step to be completed can be emphasized (e.g., highlighted in a different color, underlined, etc.). The interface 700 enables the user to mark each step as complete.
  • For example, the interface 700 may provide a check box next to each step that can be selected when the user has finished that part of the procedure. The check box may be visible in the augmented view 205-1, the screen of the watch, or on the display of the mobile device 180. The user may also be notified if a step has been skipped. The “step by step” mode may reduce confusion during an emergency by guiding the user in a high stress environment, which may also increase safety.
  • The user may also access electrical and mechanical one-line diagrams and other various important operational drawings including floor layouts, etc. These documents may also be used to facilitate practice simulations that may reduce risk and improve life safety. The documents may be viewable in the augmented view 205-1, the screen of the watch, or a display of the mobile device 180.
  • The document library mode 740 may be selected to provide personnel with immediate access to important facility drawings (e.g., electrical one line diagrams, floor plans, and mechanical drawings, etc.), emergency operating procedures, and other pertinent information to keep operations running smoothly. The document library mode 740 can be selected on the mobile device 180 or using the augmented reality device 110 when the interface is presented in the augmented view 205-1.
  • The document library mode 740 is not limited to providing emergency related information, as it may also provide pertinent information on switching procedures, technical maintenance programs, and other operations procedures, facility one lines, and other facility infrastructure drawings including SOPs, EAPs, EOPs, MPs, MOPs, etc. Further, other documentation may be imported from a remote server and saved into the application locally, which allows use of the data without network connectivity.
  • The Simulation mode 760 may be selected to simulate one or more functions of the client 100. The simulation mode 760 can be selected on the mobile device 180 or using the augmented reality device 110 when the interface is presented in the augmented view 205-1.
  • Selection of the simulation mode 760 may present a screen that enables a trainee to run one or more available training scenarios or a site administrator to create the training scenarios. The screen may be presented on a display of the mobile device or in the augmented view 205-1.
  • As an example, the training scenarios may be used to teach a trainee or to simulate conditions that would be predicted to occur based on data entered by a user. For example, a training scenario could be a walkthrough of a room full of several facility components (e.g., pieces of equipment) in a room, where the trainee is expected to enter parameter data (e.g., boiler temperature, battery voltage, etc.) for each corresponding component/piece. If the trainee enters parameter data that is outside of expected thresholds, they would receive an alert that is similar to the actual one that would have been received during normal operation. However, since this is merely a simulation, the site administrator would not receive a notification (e.g., e-mail, text, etc.) of the alert.
  • The application that launches the interface 700 may be represented on the display of the mobile device 180 by a selectable graphical icon. The icon may also be presented in the augmented view 205-1 or a screen of the watch. When the icon is selected (e.g., using a touch screen of 180 or selecting a button of panel 206), prior to enabling the user to access the functions of the interface 700, the user may be prompted to enter a login name/identifier and a password that corresponds to one or more accounts maintained by the application. The application may support different layers of accounts, where some accounts have access to more features of the interface 700 than others. For example, the administrator may have access to all features whereas a trainee could have access only to the simulation mode 760, etc. This layered access may be secured by multiple authorization methods including encrypted passphrases, hardware keys, iris scan, user fingerprint, and facial recognition.
  • In at least one embodiment of the invention, the devices 110/180/185 synchronize all collected data with a centralized, secure database (e.g., encrypted) 130. For example, updates to data generated on devices 110/180/185 may be uploaded to the remote database 130. The remote database 130 may then update other device(s) with the updated data. The data may include facility configurations, collected facility component data, facility documentation, document association properties with facility, equipment, and/or rooms, or training simulations.
  • The remote server 120 or the devices 110/180/185 may also aggregate data from various third party data sources including electrical system metering devices, mechanical system metering devices, facility alarms and alerts, security systems, and weather data. This data may be used for analytics and for basic viewing.
  • The remote server 120 or database 130 may host a library containing generic manufacturer data, equipment manuals, and other documentation that may be accessed by the augmented reality device 110, the mobile device 180, or watch 185.
  • The devices 110/180/185 may communicate with the remote server 120 via a private network where all interactions may be encrypted. The users of the devices 110/180/185 may access all information via a locally cached database for locations that do not have network access. For example, if network access is not available, the application will locally cache data on the device's internal storage (e.g., memory 360). Once network access is restored, the application may synchronize the locally stored data with the remote database automatically.
  • FIG. 15 illustrates an example of a view that may be presented by the watch 185. The view may include an image of a place such as the facility, a facility room, or an area within the facility and a status message or an alert about the place. For example, FIG. 15 shows an alert that indicates the overall health of the facility shown as a grade out of 100. The view may be presented when a position of the user wearing the watch 185 is within a pre-defined distance from the location of the place depicted in the image.
  • FIG. 16 illustrates another example of a view that may be presented by the watch 185. The view may include an image of a facility component, and a status or alert about the facility component. For example, FIG. 16 shows an alert that indicates the depicted facility component (i.e., a UPS of system A) is powered off. The view may be presented when a position of the user wearing the watch 185 is within a predefined distance from the location of the component. Thus, if a user walks to another facility component, the view will update to show any status or alert associated with the new component. The data presented in the view can be output to the smart watch 185 from the mobile device 180. Any alerts or status messages that can be viewed on the mobile device 180 or the AR device 110 can be pushed to the smart watch 185 for display. Results calculated by analytics (e.g., located on the mobile device 180 or the server 120) can also be pushed to the smart watch 185 for display. For example, facility health may be determined from factors such as frequency of rounds performed, available documentation, redundancy of critical systems, number of recent alerts, outstanding maintenance tickets, and predictions made by the analytics. Other results that may be pushed to the watch for display include a rounds alert summary, equipment capacity trending, and critical equipment status.
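  • A minimal sketch of how a facility health grade out of 100 might be computed from the factors listed above; the factor names, scores, and equal weighting are assumptions for illustration:

```python
# Illustrative facility "health" grade computed from weighted factor scores.
def facility_health(factors, weights=None):
    """factors: dict of factor name -> score in [0, 1]; returns a grade 0-100."""
    weights = weights or {name: 1.0 for name in factors}
    total = sum(weights.values())
    return round(100 * sum(factors[n] * weights[n] for n in factors) / total)

print(facility_health({
    "rounds_frequency": 0.9,
    "documentation": 0.8,
    "redundancy": 1.0,
    "recent_alerts": 0.7,
    "open_tickets": 0.6,
}))  # -> 80
```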
  • In an exemplary embodiment, the AR device 110 is used to control a drone 190 (e.g., a remote controlled aircraft (RCA) or unmanned aircraft system (UAS)) or a robot 195. The RCA may be controlled with a handheld radio transmitter, which communicates with a receiver aboard the aircraft or the robot 195. In an exemplary embodiment, the transmitter is embedded within the AR device 110 (e.g., within the case) to enable a user wearing the AR device 110 to control the RCA or the robot. The receiver directs the aircraft's servos to move the control surfaces based on pilot input. The UAS may correspond to a quadrocopter (e.g., FIG. 17) that includes a frame, a motor, an electronic speed control (ESC), a flight control board, a radio transmitter and receiver, a battery, and a charger. In an exemplary embodiment of the invention, the UAS or the robot 195 is further modified to include at least one of an infra-red sensor, a camera, an ambient temperature sensor, a smoke detector, a GPS, a gyroscope, and an accelerometer. A user wearing the AR device 110 can control the camera of the drone to take pictures and videos of rooftop system components and control the IR sensor to take IR scans of equipment to detect overheating/hot spots. The pictures taken by the drone 190 or the robot 195 can be presented in the augmented view 205-1. The radio transmitter of the drone 190 or the robot can transmit the images, the IR scans, and temperature data captured by the temperature sensor to the AR device 110, the server 120, the mobile device 180, or the watch 185. The temperature sensor can be used to analyze data center cooling efficiency and identify hot spots. Large facilities covering one million square feet or more can utilize the drone 190 to monitor temperature and equipment throughout the data center with a reduced need for an on-site staff.
  • In another embodiment, the drone 190 is land based (e.g., an unmanned ground system (UGS)). The UGS may be remotely controlled by the AR device 110 or the mobile device 180. A UGS is used to assist human operation in environments in which it is not suitable or feasible for personnel to work. The UGS may include a temperature sensor and/or a forward looking infrared (FLIR) sensor synced to a user interface of the mobile device 180 or the AR device 110 to provide real time data readings. The sensors can be used to detect hot spots on wire connections and larger scale hot spots such as server racks and equipment. Use of the UGS eliminates arc flash hazard exposure by removing personnel from close proximity to energized equipment. The UGS has the ability to take photographs of equipment, and may be able to more precisely detect hot racks/equipment than the UAS. The UGS has the ability to monitor large areas in large data centers and relay current data to the AR device 110, the mobile device 180, the watch 185, or the server 120, for access by a user. The UGS allows the facility to reduce the manpower required for monitoring equipment, which allows facility engineering staff to focus on other operational tasks.
  • The IR monitoring that a drone 190 or the robot 195 performs can be used to identify hot spot locations in electrical and mechanical hardware before they cause equipment failure. A system for detecting infrared radiation using infrared sensors may include: i) an infrared source such as blackbody radiators, tungsten lamps, and silicon carbide; ii) a transmission medium used for infrared transmission, which includes a vacuum, the atmosphere, and optical fibers; iii) optical components such as optical lenses made from quartz, CaF2, Ge, and Si, polyethylene Fresnel lenses, and Al or Au mirrors, used to converge or focus infrared radiation; and iv) an infrared detector for detecting the infrared radiation. The IR monitoring may be used as part of a predictive maintenance regime to identify potential failures and prevent them.
  • The temperature sensors can be used to detect temperature fluctuation in a data center environment to identify areas where cooling efficiency can be improved. The UGS can systematically scan each row in a data center following any work to check for changes in air flow patterns. The UGS can utilize spatial detection sensors and algorithms to autonomously scan entire rooms. The sensors allow optimal cooling efficiency to reduce energy consumption to maximize the useful life of the equipment.
  • FIG. 18 illustrates an example of at least one robot being controlled by a wearable device (e.g., device 110, watch 185) or the mobile device 180. The circular objects depicted in the figures are the robots and the rectangular objects depicted in the figures are rows or columns of equipment that are spaced apart to create a path that is travelled by the robots. Each robot includes one or more sensors. The sensors may include a sensor to detect a biological or chemical agent, to detect radiation, to detect whether conductors are burning (e.g., smell or detect certain scents), to capture regular images (e.g., high resolution camera) or thermal images (e.g., a thermal imaging or infrared camera), to detect hot and cold spots (e.g., a temperature sensor), to detect noise frequencies that indicate a server shutdown (e.g., ultrasound sensor/instrumentation), a GPS locator to detect a location of the robot, etc.
  • Each robot may include a transmitter for transmitting all data collected by its sensors to the wearable device, the mobile device 180, or the central server 120. A physical cable may be connected to a port of a robot and a port of the wearable device or the mobile device 180 to enable sensor data to be downloaded from the robot to the wearable device or to the mobile device 180. The wearable device or the mobile device 180 may be used to control the robots to move to a particular location within the facility or to provide an instruction to the robots so that they can carry out their duties (e.g., capture sensor data, move to various locations) in an autonomous fashion. Images or video captured by at least one of the robots may be presented in the augmented view 205-1 so that a user can see what the robot(s) see in real-time. A digitally enhanced floor plan using a 4dScape technology may be presented in the augmented view 205-1 based on the location sensed by the GPS of the robot. For example, the robot can use its GPS to detect its current location and send that location to the wearable device, the mobile device 180, or the server 120, and then the wearable device, the mobile device 180, or the server 120 is configured to retrieve and present the digitally enhanced floor plan that corresponds to the location.
  • The digitally enhanced floor plan may include three dimensional graphics representing the equipment known to be present in the room of the location, and textual information identifying the equipment and warnings based on data sensed by the robots (e.g., a hot spot, excessive radiation, etc.).
  • In an exemplary embodiment, each robot is configured to respond to voice commands. The voice commands may be spoken into a microphone of the robot when a user is near the robot, or transmitted by the wearable device, the mobile device 180, or the server 120 to the robot when a voice command is spoken into a microphone of the sending device.
  • In an exemplary embodiment, each robot includes a touch screen for entering commands to control the robot, or a touch screen of the mobile device 180 or the watch 185 is used to enter commands to control the robot.
  • In an exemplary embodiment, a robot includes an extendible and/or rotatable extension (e.g., an arm or leg) that can be used to remotely turn on/off equipment in the facility or make equipment adjustments. The extension can be a rod that protrudes some distance away from the robot and is oriented at a certain angle, where the distance and the angle can be adjusted remotely by the wearable device, the mobile device 180, or the server 120 using instructions transmitted from the wearable device or the mobile device 180 to the robot. The extension may be attached to a ball joint attached to the robot to enable the extension to be oriented to various different angles. A motor within the robot may be used to extend or retract the extension to change the length of the extension and to adjust the angle of the extension. The extension can be used to reach places that are difficult or dangerous to reach, such as for closing a steam valve. The robots may also be controlled globally from a central command center (e.g., the central server 120). The robots of a given facility may include robots of different types. For example, in one facility, the robots may include one set of robots for normal monitoring and another set of robots for repairing systems or shutting down systems.
  • The AR device 110, the mobile device 180, or the watch 185 may include the computer system shown in FIG. 19. The computer system, referred to generally as system 1000, may include, for example, a central processing unit (CPU) 1001, random access memory (RAM) 1004, a printer interface 1010, a display unit 1011, a local area network (LAN) data transmission controller 1005, a LAN interface 1006, a network controller 1003, an internal bus 1002, and one or more input devices 1009 (for example, a keyboard, a mouse, etc.). As shown, the system 1000 may be connected to a data storage device 1008, for example a hard disk, via a link 1007.
  • Please note that while a particular graphical user interface (GUI) 700 is described above and illustrated in the figures, embodiments of the inventive concept are not limited thereto, as the GUI 700 may be changed in various ways. For example, one or more of the illustrated modes may be omitted, additional modes may be present, selection of modes may be accomplished in a different manner from that illustrated, and different interactive or descriptive graphical elements may be used from those illustrated (e.g., labels may have different text; buttons may have different sizes, shapes, colors, etc.; text fields may be replaced with drop-down menus, lists, etc.). Furthermore, the interface may not be a graphical user interface. The interface may be non-graphical and rely on other means of interaction (e.g., voice control).
  • Exemplary embodiments described herein are illustrative, and many variations can be introduced without departing from the spirit of the disclosure or from the scope of the appended claims. For example, elements and/or features of different exemplary embodiments may be combined with each other and/or substituted for each other within the scope of this disclosure and appended claims.

Claims (25)

What is claimed is:
1. A wearable device configured to guide a user to perform a procedure in a facility, the device comprising:
a wearable element;
a display area;
a sensor; and
a controller configured to control the sensor to capture sensor data, identify equipment from the sensor data, and present a first image on the display area, the first image including information indicating how to perform a current step of a procedure associated with the identified equipment,
wherein the controller comprises a transceiver that enables the controller to wirelessly receive the information from a remote device.
2. The wearable device of claim 1, wherein the wearable element is one of a band or an eyeglass frame.
3. The wearable device of claim 1, further comprising a projector, wherein the display area includes a prism and the controller is configured to project the first image onto the prism.
4. The wearable device of claim 1, wherein the remote device is configured to download the entire procedure and the current step from a central server upon the remote device establishing a connection to a network attached to the central server.
5. The wearable device of claim 1, wherein the sensor is a camera, the sensor data is a second image captured by the camera, and the equipment is identified from the second image.
6. The wearable device of claim 5, wherein the second image includes a bar code and the controller determines a code from the bar code and uses the code to identify the equipment.
7. The wearable device of claim 5, wherein the controller sends the second image to the remote device, the remote device performs image recognition on the second image to detect an object, and the remote device compares the detected object against pre-stored images to identify the equipment.
8. The wearable device of claim 1, wherein the sensor includes a radio frequency identification (RFID) reader, the sensor data is RFID data, and the controller determines a code from the RFID data and uses the code to identify the equipment.
9. The wearable device of claim 1, wherein the sensor comprises a plurality of infrared (IR) cameras, the IR cameras scan geometries to retrieve metadata, the controller sends the metadata to the remote device, and the remote device uses the metadata to identify the equipment.
10. The wearable device of claim 1, wherein the first image includes text of the current step and text identifying a room the equipment is housed within.
11. The wearable device of claim 1, wherein the wearable element includes a physical button and the controller determines that the current step has been completed in response to a user depressing the physical button.
12. The wearable device of claim 1, wherein the wearable element includes a physical button and the controller sends a message to the remote device indicating the current step has been completed in response to a user depressing the physical button.
13. The wearable device of claim 1, wherein the wearable element includes a physical button and the controller presents a second image on the display area including information indicating how to perform a next step of the procedure in response to a user depressing the physical button.
14. The wearable device of claim 1, wherein the first image is transparent and overlays the equipment.
15. The wearable device of claim 1, wherein the equipment comprises a plurality of components and the first image is disposed closest to one of the components that is part of the current step.
16. A wearable device configured to guide a user safely through a facility, the device comprising:
a wearable element;
a display area;
a sensor; and
a controller comprising a transceiver that enables the controller to wirelessly receive information from a remote device,
wherein the controller is configured to control the sensor to capture sensor data, identify equipment from the sensor data and the received information, and present an image on the display area representing a safe path through the equipment using the sensor data and the received information.
17. The wearable device of claim 16, wherein the path comprises at least one line that is spaced a distance away from the equipment.
18. The wearable device of claim 17, wherein the distance is based on a range of an arc flash predicted for the equipment.
19. The wearable device of claim 17, wherein the controller dynamically adjusts a position of the line according to a distance determined between the user and the equipment using the sensor data.
20. The wearable device of claim 17, wherein the path comprises a plurality of different adjacent sub-paths, where each sub-path represents a different amount of energy the user would be exposed to when the arc flash occurs.
21. The wearable device of claim 16, further comprising a vibration motor that vibrates when the controller determines that the user is approaching a boundary of the path.
22. A wearable device configured to manage a facility using a remotely controllable device, the device comprising:
a wearable element;
a display area; and
a controller configured to wirelessly control the remotely controllable device to capture sensor data, identify equipment from the sensor data, determine whether the equipment is malfunctioning using the sensor data, and present an image on the display area representing the equipment and providing information indicating whether the equipment is malfunctioning.
23. The wearable device of claim 22, wherein the controller is configured to receive temperature data from the remotely controllable device to determine whether a temperature of the equipment is abnormal.
24. The wearable device of claim 23, wherein the temperature data comprises infrared scans of the equipment performed by the remotely controllable device.
25. The wearable device of claim 22, wherein the remotely controllable device includes a rotatable or extendible extension and the controller enables a user to transmit a command to the remotely controllable device to rotate or extend the extension.
US14/812,712 2014-07-31 2015-07-29 Facility operations management using augmented reality Abandoned US20160035246A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/812,712 US20160035246A1 (en) 2014-07-31 2015-07-29 Facility operations management using augmented reality

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201462031277P 2014-07-31 2014-07-31
US201462031283P 2014-07-31 2014-07-31
US201462089633P 2014-12-09 2014-12-09
US14/812,712 US20160035246A1 (en) 2014-07-31 2015-07-29 Facility operations management using augmented reality

Publications (1)

Publication Number Publication Date
US20160035246A1 true US20160035246A1 (en) 2016-02-04

Family

ID=55180622

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/812,647 Active 2036-02-04 US10170018B2 (en) 2014-07-31 2015-07-29 Cloud based server to support facility operations management
US14/812,712 Abandoned US20160035246A1 (en) 2014-07-31 2015-07-29 Facility operations management using augmented reality

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US14/812,647 Active 2036-02-04 US10170018B2 (en) 2014-07-31 2015-07-29 Cloud based server to support facility operations management

Country Status (1)

Country Link
US (2) US10170018B2 (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9915929B1 (en) * 2014-09-30 2018-03-13 Amazon Technologies, Inc. Monitoring availability of facility equipment
US10473270B2 (en) * 2016-09-30 2019-11-12 General Electric Company Leak detection user interfaces
US10783799B1 (en) * 2016-12-17 2020-09-22 Sproutel, Inc. System, apparatus, and method for educating and reducing stress for patients with illness or trauma using an interactive location-aware toy and a distributed sensor network
CN107728811B (en) * 2017-11-01 2022-02-18 网易(杭州)网络有限公司 Interface control method, device and system
CN108922305A (en) * 2018-05-24 2018-11-30 陈荣 A kind of guidance system and method guiding standard operation by augmented reality
FI128647B (en) * 2018-06-29 2020-09-30 Elisa Oyj Automated network monitoring and control
FI129101B (en) 2018-06-29 2021-07-15 Elisa Oyj Automated network monitoring and control
EP3761318A1 (en) * 2019-07-05 2021-01-06 Norton (Waterford) Limited Drug delivery device with electronics and power management
US20210019679A1 (en) * 2019-07-17 2021-01-21 Pacific Gas And Electric Company Location and marking system and server
US11593074B2 (en) * 2019-11-14 2023-02-28 BizFlow Corporation System, method, and apparatus for data-centric networked application development services
CN111047689B (en) * 2019-12-20 2023-09-08 合肥卓瑞信息技术有限公司 Computer lab 3D management system
CN114039972B (en) * 2021-11-11 2023-07-25 东风越野车有限公司 Intelligent vehicle guarantee system, method and medium

Family Cites Families (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3992786A (en) 1975-02-05 1976-11-23 Grumman Aerospace Corporation Apparatus for sequence training
JP2733707B2 (en) 1990-08-31 1998-03-30 ファナック株式会社 Injection molding machine parts maintenance warning method
US5600576A (en) 1994-03-11 1997-02-04 Northrop Grumman Corporation Time stress measurement device
US5864784A (en) 1996-04-30 1999-01-26 Fluor Daniel Hanford, Inc. Hand held data collection and monitoring system for nuclear facilities
US6304851B1 (en) 1998-03-13 2001-10-16 The Coca-Cola Company Mobile data collection systems, methods and computer program products
US6457049B2 (en) * 1998-06-08 2002-09-24 Telxon Corporation Enterprise wide software management system for integrating a plurality of heterogenous software systems to support clients and subclients communication by using a midware interface
EP1203377A1 (en) 1999-04-21 2002-05-08 Research Investment Network, Inc. System, method and article of manufacture for updating content stored on a portable storage medium
JP3625418B2 (en) 2000-07-21 2005-03-02 株式会社日立製作所 Maintenance information management system and maintenance plan providing method
US8655698B2 (en) * 2000-10-17 2014-02-18 Accenture Global Services Limited Performance-based logistics for aerospace and defense programs
US7173528B1 (en) 2001-05-31 2007-02-06 Alien Technology Corporation System and method for disabling data on radio frequency identification tags
US20030036983A1 (en) 2001-08-14 2003-02-20 Hougen Jerome L. Method for inventory and layout management of a facility
US20040107114A1 (en) 2002-11-15 2004-06-03 Curtis Peter M. System and method for processing, organizing and accessing mission critical facilities information and intellectual capital
US7802065B1 (en) * 2004-05-03 2010-09-21 Crimson Corporation Peer to peer based cache management
EP1672483A1 (en) 2004-12-20 2006-06-21 Siemens Aktiengesellschaft Data input method for a data processing system
US7712670B2 (en) 2005-09-28 2010-05-11 Sauerwein Jr James T Data collection device and network having radio signal responsive mode switching
WO2009115921A2 (en) * 2008-02-22 2009-09-24 Ipath Technologies Private Limited Techniques for enterprise resource mobilization
US8248237B2 (en) 2008-04-02 2012-08-21 Yougetitback Limited System for mitigating the unauthorized use of a device
US20100281387A1 (en) 2009-05-01 2010-11-04 Johnson Controls Technology Company Systems and methods for managing building automation systems and it systems
US8743130B2 (en) * 2009-08-28 2014-06-03 Zynga Inc. Apparatuses, methods and systems for a distributed object renderer
US8830267B2 (en) * 2009-11-16 2014-09-09 Alliance For Sustainable Energy, Llc Augmented reality building operations tool
US8433547B2 (en) 2009-12-03 2013-04-30 Schneider Electric It Corporation System and method for analyzing nonstandard facility operations within a data center
US9064290B2 (en) 2010-07-23 2015-06-23 Jkads Llc Method for inspecting a physical asset
US8965927B2 (en) 2010-07-30 2015-02-24 Rbm Technologies Managing facilities
GB2498575A (en) * 2012-01-20 2013-07-24 Renesas Mobile Corp Device-to-device discovery resource allocation for multiple cells in a device-to-device discovery area
US9479549B2 (en) * 2012-05-23 2016-10-25 Haworth, Inc. Collaboration system with whiteboard with federated display
US9479548B2 (en) * 2012-05-23 2016-10-25 Haworth, Inc. Collaboration system with whiteboard access to global collaboration data
US20130331963A1 (en) 2012-06-06 2013-12-12 Rockwell Automation Technologies, Inc. Systems, methods, and software to identify and present reliability information for industrial automation devices
US8800015B2 (en) * 2012-06-19 2014-08-05 At&T Mobility Ii, Llc Apparatus and methods for selecting services of mobile network operators
US9436724B2 (en) * 2013-10-21 2016-09-06 Sap Se Migrating data in tables in a database
US20150193439A1 (en) * 2014-01-08 2015-07-09 International Business Machines Corporation Schemaless data access management

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US290414A (en) * 1883-12-18 Underground conduit for electric wires
US20070219645A1 (en) * 2006-03-17 2007-09-20 Honeywell International Inc. Building management system
US20170301181A9 (en) * 2010-11-15 2017-10-19 Bally Gaming, Inc. System and method for augmented reality gaming
US20130169681A1 (en) * 2011-06-29 2013-07-04 Honeywell International Inc. Systems and methods for presenting building information
US20130031202A1 (en) * 2011-07-26 2013-01-31 Mick Jason L Using Augmented Reality To Create An Interface For Datacenter And Systems Management
US20130114100A1 (en) * 2011-11-04 2013-05-09 Canon Kabushiki Kaisha Printing system, image forming apparatus, and method
US20140022281A1 (en) * 2012-07-18 2014-01-23 The Boeing Company Projecting airplane location specific maintenance history using optical reference points
US20160103437A1 (en) * 2013-06-27 2016-04-14 Abb Technology Ltd Method and data presenting device for assisting a remote user to provide instructions
US20160127712A1 (en) * 2013-06-27 2016-05-05 Abb Technology Ltd Method and video communication device for transmitting video to a remote user
US20150310667A1 (en) * 2014-04-23 2015-10-29 Darrell L. Young Systems and methods for context based information delivery using augmented reality
US20150325047A1 (en) * 2014-05-06 2015-11-12 Honeywell International Inc. Apparatus and method for providing augmented reality for maintenance applications
US20150371455A1 (en) * 2014-06-23 2015-12-24 GM Global Technology Operations LLC Augmented reality based interactive troubleshooting and diagnostics for a vehicle

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Glockner, Holger, Kai Jannek, Johannes Mahn, Bjorn Theis; Augmented Reality in Logistics; June 30, 2014; DHL Customer Solutions & Innovation; http://www.dhl.com/content/dam/downloads/g0/about_us/logistics_insights/csi_augmented_reality_report_290414.pdf; *
Henderson, Steven; Augmented Reality for Maintenance and Repair; Oct. 10, 2009; Youtube; https://www.youtube.com/watch?v=mn-zvymlSvk; *

Cited By (120)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11754982B2 (en) 2012-08-27 2023-09-12 Johnson Controls Tyco IP Holdings LLP Syntax translation from first syntax to second syntax based on string analysis
US9779275B2 (en) * 2014-07-03 2017-10-03 Mitsubishi Electric Corporation Control system, terminal, information setting method, and program
US20160070439A1 (en) * 2014-09-04 2016-03-10 International Business Machines Corporation Electronic commerce using augmented reality glasses and a smart watch
US10055647B2 (en) * 2014-12-04 2018-08-21 Jungheinrich Aktiengesellschaft Method and system for monitoring logistics facilities
US20160165190A1 (en) * 2014-12-04 2016-06-09 Jungheinrich Aktiengesellschaft Method and system for monitoring logistics facilities
US10026033B2 (en) 2014-12-09 2018-07-17 Peter M. Curtis Facility walkthrough and maintenance guided by scannable tags or data
US20160162772A1 (en) * 2014-12-09 2016-06-09 Peter M. Curtis Facility walkthrough and maintenance guided by scannable tags or data
US9792542B2 (en) * 2014-12-09 2017-10-17 Peter M. Curtis Facility walkthrough and maintenance guided by scannable tags or data
US20200396361A1 (en) * 2015-04-14 2020-12-17 ETAK Systems, LLC 360 Degree Camera Apparatus with Monitoring Sensors
US10339793B2 (en) * 2015-07-31 2019-07-02 Johnson Controls Fire Protection LP System and method for smoke detector performance analysis
US11899413B2 (en) 2015-10-21 2024-02-13 Johnson Controls Technology Company Building automation system with integrated building information model
US11353831B2 (en) 2015-10-21 2022-06-07 Johnson Controls Technology Company Building automation system with integrated building information model
US11353832B2 (en) 2015-10-21 2022-06-07 Johnson Controls Technology Company Building automation system with integrated building information model
US10534326B2 (en) 2015-10-21 2020-01-14 Johnson Controls Technology Company Building automation system with integrated building information model
US11874635B2 (en) 2015-10-21 2024-01-16 Johnson Controls Technology Company Building automation system with integrated building information model
US11307543B2 (en) 2015-10-21 2022-04-19 Johnson Controls Technology Company Building automation system with integrated building information model
US10747634B2 (en) * 2015-10-27 2020-08-18 Fluke Corporation System and method for utilizing machine-readable codes for testing a communication network
US10395116B2 (en) * 2015-10-29 2019-08-27 Hand Held Products, Inc. Dynamically created and updated indoor positioning map
US11770020B2 (en) 2016-01-22 2023-09-26 Johnson Controls Technology Company Building system with timeseries synchronization
US11947785B2 (en) 2016-01-22 2024-04-02 Johnson Controls Technology Company Building system with a building graph
US11894676B2 (en) 2016-01-22 2024-02-06 Johnson Controls Technology Company Building energy management system with energy analytics
US10339738B2 (en) * 2016-02-16 2019-07-02 Ademco Inc. Systems and methods of access control in security systems with augmented reality
US20170236348A1 (en) * 2016-02-16 2017-08-17 Honeywell International Inc. Systems and methods of access control in security systems with augmented reality
US11127211B2 (en) * 2016-03-30 2021-09-21 Nec Corporation Plant management system, plant management method, plant management apparatus, and plant management program
US11768004B2 (en) 2016-03-31 2023-09-26 Johnson Controls Tyco IP Holdings LLP HVAC device registration in a distributed building management system
US10613729B2 (en) * 2016-05-03 2020-04-07 Johnson Controls Technology Company Building and security management system with augmented reality interface
US11927924B2 (en) 2016-05-04 2024-03-12 Johnson Controls Technology Company Building system with user presentation composition based on building context
US11774920B2 (en) 2016-05-04 2023-10-03 Johnson Controls Technology Company Building system with user presentation composition based on building context
WO2018007075A1 (en) * 2016-07-05 2018-01-11 Siemens Aktiengesellschaft Interaction system and method
US11892180B2 (en) 2017-01-06 2024-02-06 Johnson Controls Tyco IP Holdings LLP HVAC system with automated device pairing
US20180211447A1 (en) * 2017-01-24 2018-07-26 Lonza Limited Methods and Systems for Using a Virtual or Augmented Reality Display to Perform Industrial Maintenance
US11778030B2 (en) 2017-02-10 2023-10-03 Johnson Controls Technology Company Building smart entity system with agent based communication and control
US11774930B2 (en) 2017-02-10 2023-10-03 Johnson Controls Technology Company Building system with digital twin based agent processing
US11994833B2 (en) 2017-02-10 2024-05-28 Johnson Controls Technology Company Building smart entity system with agent based data ingestion and entity creation using time series data
US12019437B2 (en) 2017-02-10 2024-06-25 Johnson Controls Technology Company Web services platform with cloud-based feedback control
US11792039B2 (en) 2017-02-10 2023-10-17 Johnson Controls Technology Company Building management system with space graphs including software components
US11755604B2 (en) 2017-02-10 2023-09-12 Johnson Controls Technology Company Building management system with declarative views of timeseries data
US11762886B2 (en) 2017-02-10 2023-09-19 Johnson Controls Technology Company Building system with entity graph commands
US11809461B2 (en) 2017-02-10 2023-11-07 Johnson Controls Technology Company Building system with an entity graph storing software logic
US11764991B2 (en) 2017-02-10 2023-09-19 Johnson Controls Technology Company Building management system with identity management
US11762362B2 (en) 2017-03-24 2023-09-19 Johnson Controls Tyco IP Holdings LLP Building management system with dynamic channel communication
US11954478B2 (en) 2017-04-21 2024-04-09 Tyco Fire & Security Gmbh Building management system with cloud management of gateway configurations
US11761653B2 (en) 2017-05-10 2023-09-19 Johnson Controls Tyco IP Holdings LLP Building management system with a distributed blockchain database
US11900287B2 (en) 2017-05-25 2024-02-13 Johnson Controls Tyco IP Holdings LLP Model predictive maintenance system with budgetary constraints
US11699903B2 (en) 2017-06-07 2023-07-11 Johnson Controls Tyco IP Holdings LLP Building energy optimization system with economic load demand response (ELDR) optimization and ELDR user interfaces
US11774922B2 (en) 2017-06-15 2023-10-03 Johnson Controls Technology Company Building management system with artificial intelligence for unified agent based control of building subsystems
JP2019008605A (en) * 2017-06-26 2019-01-17 積水ハウス株式会社 Information processing system
US11495118B2 (en) * 2017-06-27 2022-11-08 Oneevent Technologies, Inc. Augmented reality of a building
US11920810B2 (en) 2017-07-17 2024-03-05 Johnson Controls Technology Company Systems and methods for agent based building simulation for optimal control
US11733663B2 (en) 2017-07-21 2023-08-22 Johnson Controls Tyco IP Holdings LLP Building management system with dynamic work order generation with adaptive diagnostic task details
US11726632B2 (en) 2017-07-27 2023-08-15 Johnson Controls Technology Company Building management system with global rule library and crowdsourcing framework
US20190057181A1 (en) * 2017-08-18 2019-02-21 International Business Machines Corporation System and method for design optimization using augmented reality
US20190057180A1 (en) * 2017-08-18 2019-02-21 International Business Machines Corporation System and method for design optimization using augmented reality
US11709965B2 (en) 2017-09-27 2023-07-25 Johnson Controls Technology Company Building system with smart entity personal identifying information (PII) masking
US12013842B2 (en) 2017-09-27 2024-06-18 Johnson Controls Tyco IP Holdings LLP Web services platform with integration and interface of smart entities with enterprise applications
US11762356B2 (en) 2017-09-27 2023-09-19 Johnson Controls Technology Company Building management system with integration of data into smart entities
US11741812B2 (en) 2017-09-27 2023-08-29 Johnson Controls Tyco IP Holdings LLP Building risk analysis system with dynamic modification of asset-threat weights
US11762353B2 (en) 2017-09-27 2023-09-19 Johnson Controls Technology Company Building system with a digital twin based on information technology (IT) data and operational technology (OT) data
US11735021B2 (en) 2017-09-27 2023-08-22 Johnson Controls Tyco IP Holdings LLP Building risk analysis system with risk decay
US20220138183A1 (en) 2017-09-27 2022-05-05 Johnson Controls Tyco IP Holdings LLP Web services platform with integration and interface of smart entities with enterprise applications
US11768826B2 (en) 2017-09-27 2023-09-26 Johnson Controls Tyco IP Holdings LLP Web services for creation and maintenance of smart entities for connected devices
US20190112148A1 (en) * 2017-10-13 2019-04-18 Otis Elevator Company Commissioning and upgrading remote software/firmware using augmented reality
CN109669852A (en) * 2017-10-13 2019-04-23 奥的斯电梯公司 It is debugged using augmented reality and upgrading remote software/firmware
US11762351B2 (en) 2017-11-15 2023-09-19 Johnson Controls Tyco IP Holdings LLP Building management system with point virtualization for online meters
US11782407B2 (en) 2017-11-15 2023-10-10 Johnson Controls Tyco IP Holdings LLP Building management system with optimized processing of building system data
US11727738B2 (en) 2017-11-22 2023-08-15 Johnson Controls Tyco IP Holdings LLP Building campus with integrated smart environment
EP3724856B1 (en) * 2017-12-14 2023-09-20 The Joan and Irwin Jacobs Technion-Cornell Institute System and method for creating geo-localized enhanced floor plans
US20190049275A1 (en) * 2017-12-29 2019-02-14 Intel Corporation Method, a circuit and a system for environmental sensing
US11781890B2 (en) * 2017-12-29 2023-10-10 Intel Corporation Method, a circuit and a system for environmental sensing
US11954713B2 (en) 2018-03-13 2024-04-09 Johnson Controls Tyco IP Holdings LLP Variable refrigerant flow system with electricity consumption apportionment
US20210216938A1 (en) * 2018-06-01 2021-07-15 Johnson Controls Technology Company Enterprise platform for enhancing operational performance
US20210256771A1 (en) * 2018-06-12 2021-08-19 Current Lighting Solutions, Llc Integrated management of sensitive controlled environments and items contained therein
US11989836B2 (en) * 2018-06-12 2024-05-21 Current Lighting Solutions, Llc Integrated management of sensitive controlled environments and items contained therein
WO2020010033A1 (en) * 2018-07-06 2020-01-09 Carrier Corporation Building maintaining method and system
US11941238B2 (en) 2018-10-30 2024-03-26 Johnson Controls Technology Company Systems and methods for entity visualization and management with an entity node editor
US11927925B2 (en) 2018-11-19 2024-03-12 Johnson Controls Tyco IP Holdings LLP Building system with a time correlated reliability data stream
US11763266B2 (en) 2019-01-18 2023-09-19 Johnson Controls Tyco IP Holdings LLP Smart parking lot system
US11775938B2 (en) 2019-01-18 2023-10-03 Johnson Controls Tyco IP Holdings LLP Lobby management system
US11769117B2 (en) 2019-01-18 2023-09-26 Johnson Controls Tyco IP Holdings LLP Building automation system with fault analysis and component procurement
US11762343B2 (en) 2019-01-28 2023-09-19 Johnson Controls Tyco IP Holdings LLP Building management system with hybrid edge-cloud processing
US11854329B2 (en) 2019-05-24 2023-12-26 Ademco Inc. Systems and methods for authorizing transmission of commands and signals to an access control device or a control panel device
CN110264820A (en) * 2019-06-24 2019-09-20 国网浙江江山市供电有限公司 A kind of high-tension switch cabinet Training Methodology based on virtual reality technology
US11048760B1 (en) * 2019-07-31 2021-06-29 Splunk Inc. Techniques for placing content in and applying layers in an extended reality environment
CN110570715A (en) * 2019-09-17 2019-12-13 南京铁道职业技术学院 Electric power simulation experiment device based on augmented reality technology
US11394609B2 (en) * 2019-10-30 2022-07-19 Wistron Corporation Equipment deploying system and method thereof
US11894944B2 (en) 2019-12-31 2024-02-06 Johnson Controls Tyco IP Holdings LLP Building data platform with an enrichment loop
US11777757B2 (en) 2019-12-31 2023-10-03 Johnson Controls Tyco IP Holdings LLP Building data platform with event based graph queries
US20220376944A1 (en) 2019-12-31 2022-11-24 Johnson Controls Tyco IP Holdings LLP Building data platform with graph based capabilities
US12021650B2 (en) 2019-12-31 2024-06-25 Tyco Fire & Security Gmbh Building data platform with event subscriptions
US11991018B2 (en) 2019-12-31 2024-05-21 Tyco Fire & Security Gmbh Building data platform with edge based event enrichment
US11991019B2 (en) 2019-12-31 2024-05-21 Johnson Controls Tyco IP Holdings LLP Building data platform with event queries
US11968059B2 (en) 2019-12-31 2024-04-23 Johnson Controls Tyco IP Holdings LLP Building data platform with graph based capabilities
US11777756B2 (en) 2019-12-31 2023-10-03 Johnson Controls Tyco IP Holdings LLP Building data platform with graph based communication actions
US11777759B2 (en) 2019-12-31 2023-10-03 Johnson Controls Tyco IP Holdings LLP Building data platform with graph based permissions
US11770269B2 (en) 2019-12-31 2023-09-26 Johnson Controls Tyco IP Holdings LLP Building data platform with event enrichment with contextual information
US11824680B2 (en) 2019-12-31 2023-11-21 Johnson Controls Tyco IP Holdings LLP Building data platform with a tenant entitlement model
US11777758B2 (en) 2019-12-31 2023-10-03 Johnson Controls Tyco IP Holdings LLP Building data platform with external twin synchronization
US11610348B2 (en) 2020-01-23 2023-03-21 Netapp, Inc. Augmented reality diagnostic tool for data center nodes
WO2021150506A1 (en) * 2020-01-23 2021-07-29 Netapp, Inc. Augmented reality diagnostic tool for data center nodes
US11796333B1 (en) * 2020-02-11 2023-10-24 Keysight Technologies, Inc. Methods, systems and computer readable media for augmented reality navigation in network test environments
US11602691B2 (en) * 2020-03-13 2023-03-14 Harmonix Music Systems, Inc. Techniques for virtual reality boundaries and related systems and methods
US11880677B2 (en) 2020-04-06 2024-01-23 Johnson Controls Tyco IP Holdings LLP Building system with digital network twin
US11874809B2 (en) 2020-06-08 2024-01-16 Johnson Controls Tyco IP Holdings LLP Building system with naming schema encoding entity type and entity relationships
US11954154B2 (en) 2020-09-30 2024-04-09 Johnson Controls Tyco IP Holdings LLP Building management system with semantic model integration
US11741165B2 (en) 2020-09-30 2023-08-29 Johnson Controls Tyco IP Holdings LLP Building management system with semantic model integration
US11902375B2 (en) 2020-10-30 2024-02-13 Johnson Controls Tyco IP Holdings LLP Systems and methods of configuring a building management system
US11570050B2 (en) 2020-11-30 2023-01-31 Keysight Technologies, Inc. Methods, systems and computer readable media for performing cabling tasks using augmented reality
US12040911B2 (en) 2020-12-28 2024-07-16 Tyco Fire & Security Gmbh Building data platform with a graph change feed
US11967192B2 (en) * 2021-01-04 2024-04-23 Bank Of America Corporation System for secure access and initiation using a remote terminal
US20220254209A1 (en) * 2021-01-04 2022-08-11 Bank Of America Corporation System for secure access and initiation using a remote terminal
US11921481B2 (en) 2021-03-17 2024-03-05 Johnson Controls Tyco IP Holdings LLP Systems and methods for determining equipment energy waste
US11899723B2 (en) 2021-06-22 2024-02-13 Johnson Controls Tyco IP Holdings LLP Building data platform with context based twin function processing
US11796974B2 (en) 2021-11-16 2023-10-24 Johnson Controls Tyco IP Holdings LLP Building data platform with schema extensibility for properties and tags of a digital twin
US11934966B2 (en) 2021-11-17 2024-03-19 Johnson Controls Tyco IP Holdings LLP Building data platform with digital twin inferences
US11769066B2 (en) 2021-11-17 2023-09-26 Johnson Controls Tyco IP Holdings LLP Building data platform with digital twin triggers and actions
US11704311B2 (en) 2021-11-24 2023-07-18 Johnson Controls Tyco IP Holdings LLP Building data platform with a distributed digital twin
US12013673B2 (en) 2021-11-29 2024-06-18 Tyco Fire & Security Gmbh Building control system using reinforcement learning
US11714930B2 (en) 2021-11-29 2023-08-01 Johnson Controls Tyco IP Holdings LLP Building data platform with digital twin based inferences and predictions for a graphical building model
US20230326146A1 (en) * 2022-04-07 2023-10-12 Asustek Computer Inc. Augmented reality implementing method
US12013823B2 (en) 2022-09-08 2024-06-18 Tyco Fire & Security Gmbh Gateway system that maps points into a graph schema

Also Published As

Publication number Publication date
US20160036898A1 (en) 2016-02-04
US10170018B2 (en) 2019-01-01

Similar Documents

Publication Publication Date Title
US20160035246A1 (en) Facility operations management using augmented reality
US10854013B2 (en) Systems and methods for presenting building information
US10516857B2 (en) Method and apparatus of secured interactive remote maintenance assist
US9459755B2 (en) Facility operations management and mobile systems
US10026033B2 (en) Facility walkthrough and maintenance guided by scannable tags or data
US10055869B2 (en) Enhanced reality system for visualizing, evaluating, diagnosing, optimizing and servicing smart grids and incorporated components
US11176383B2 (en) Hazard detection through computer vision
US20210216773A1 (en) Personal protective equipment system with augmented reality for safety event detection and visualization
US9749596B2 (en) Systems and methods for automated cloud-based analytics for security and/or surveillance
US20210343182A1 (en) Virtual-reality-based personal protective equipment training system
CA2895778C (en) Context based augmented reality
US20220188545A1 (en) Augmented reality enhanced situational awareness
US10200631B2 (en) Method for configuring a camera
KR20180041703A (en) A notification management system and method for monitoring the operation of a modular power plant
EP3104315A1 (en) Information collection system, information collection terminal device, information collection server, and information collection method
US20160364396A1 (en) Information search system and information search method
CA2942668A1 (en) Facility operations management and distributed system
US20200258360A1 (en) Sensor monitoring and mapping in a translated coordinate system
JP2017045319A (en) Information processing device, information processing method, and information processing system

Legal Events

Date Code Title Description
AS Assignment

Owner name: BNB BANK, NEW YORK

Free format text: SECURITY INTEREST;ASSIGNOR:CURTIS, PETER;REEL/FRAME:044669/0391

Effective date: 20180119

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION