CN105224897B - Information providing apparatus, detecting system, and information providing method - Google Patents

Information providing apparatus, detecting system, and information providing method

Info

Publication number: CN105224897B
Application number: CN201510206405.5A
Authority: CN (China)
Prior art keywords: information, image, subject, detection, configuration
Legal status: Active (assumed; not a legal conclusion)
Other languages: Chinese (zh)
Other versions: CN105224897A
Inventor: 王浩
Current and original assignee: Hangzhou Mission Infrared Electro Optics Technology Co Ltd
Application filed by Hangzhou Mission Infrared Electro Optics Technology Co Ltd
Priority to CN201510206405.5A
Published as CN105224897A; granted and published as CN105224897B

Landscapes

  • Radiation Pyrometers (AREA)
  • Studio Devices (AREA)

Abstract

The invention provides an information providing apparatus, a detection system, and an information providing method in the field of detection applications. Prior-art detection devices have difficulty conveniently obtaining the subject information related to the body being detected. The information providing apparatus of the invention includes a first acquisition section for acquiring a first image, and a communication unit for supplying prescribed information to an external device connected to the apparatus. The prescribed information includes one or more of: the first image; character information recognized from the first image; first subject information recognized from the first image; second subject information selected based on the first subject information; configuration information associated with the first subject information; and configuration information associated with the second subject information. Thus, the existing problems are solved.

Description

Information providing apparatus, detecting system, and information providing method
Technical Field
The invention discloses an information providing apparatus, a detection system, and an information providing method, and relates to the field of detection applications.
Background
At present, various detection devices are widely used, such as imaging instruments (visible light, infrared, ultraviolet, laser, etc.), partial discharge testers, and gas leakage testers. When a prior-art detection device detects a subject, however, it is difficult to conveniently obtain the subject information related to that subject.
Taking a thermal imaging device as an example of a detection device: the thermal image files obtained by shooting are given file names numbered sequentially by time. For subsequent archiving and analysis, the thermal image files of different subjects must be distinguished, so during infrared detection the user must record the subject information based on his knowledge of the subject or on field placards. The common recording methods are: manually recording on paper the file name of each thermal image file and the corresponding subject information; or attaching to the thermal image file a voice annotation representing the subject information. These operations are cumbersome, and the subsequent organizing workload is large.
There are also detection devices with position acquisition components such as GPS, which can display GPS information and record it in association with the detection-data file. In the prior art, however, the GPS information recorded by detection devices such as thermal imaging devices cannot conveniently be mapped to subject information: only the GPS information is recorded, and during subsequent organization the subject information must still be recorded against it, which increases the workload. Moreover, when several subjects exist at the same location, the user must still record the subject information manually; recording only the GPS position therefore does not solve the problem of associating subject information with the detection data.
Alternatively, a barcode or the like can be mounted on the subject and scanned in use to obtain the subject information. In some applications, however, such as the power industry, a detection device like a thermal imaging device typically observes the subject from a distance of about 3-20 m; scanning a barcode requires approaching its mounting position on the subject, which again increases the user's workload.
It is therefore apparent that an information providing apparatus is needed that can easily obtain information such as the subject information related to a detected subject and provide it to a connected detection device, solving the problems described above.
Disclosure of Invention
The invention provides an information providing apparatus, a detection system, and an information providing method. The information providing apparatus includes a first acquisition section for acquiring a first image, and a communication unit for supplying prescribed information to an external device connected to the information providing apparatus. The prescribed information includes one or more of: the first image; character information recognized from the first image; first subject information recognized from the first image; second subject information selected based on the first subject information; configuration information associated with the first subject information recognized from the first image; and configuration information associated with the second subject information selected based on the first subject information.
The detection device comprises an acquisition section for acquiring detection data, and a communication unit for acquiring the prescribed information provided by the connected information providing apparatus; the prescribed information includes one or more of the items listed above.
Thus, the existing problems are solved.
The information providing method of the present invention includes the following steps:
a first acquisition step of acquiring a first image;
a communication step of supplying the prescribed information to an external device connected to the information providing apparatus, the prescribed information including one or more of the items listed above.
Other aspects and advantages of the invention will become apparent from the following description.
Description of the drawings:
fig. 1 is a block diagram of an electrical configuration of an information providing apparatus 100 of embodiment 1.
Fig. 2 is an outline schematic diagram of the information providing apparatus 100 of embodiment 1.
Fig. 3 is a schematic diagram of an implementation of the subject information and its associated configuration information stored in the storage medium of the information providing apparatus 100.
Fig. 4 is a block diagram of an electrical configuration in which the information providing apparatus 100 and the detection apparatus 101 of embodiment 1 are connected.
Fig. 5 is a schematic diagram of the connection of the information providing apparatus 100 and the detection apparatus 101 of embodiment 1.
Fig. 6 is a display example of the corresponding processing performed on the acquired detection data.
Fig. 7 is a control flowchart of the information providing apparatus 100 of embodiment 1.
Fig. 8 is a schematic diagram of a display interface of the information providing apparatus 100 capturing and acquiring the first image.
Detailed Description
The following embodiments are provided for better understanding of the invention without limiting its scope, and may be modified in various forms within that scope. Although the external device in the following embodiments is exemplified by a detection device, and the detection device by the portable thermal imaging device 101, the invention is not limited thereto; the idea of the embodiments applies to detection devices in general. The detection device 101 may be any detection device that obtains detection data from detectors (including various detectors and sensors), such as imaging devices (visible light, infrared, ultraviolet, laser, and the like), partial discharge testers, gas leakage testers, or vibration testers.
Embodiment 1. The structure of the information providing apparatus 100 of embodiment 1 is explained with reference to fig. 1, a block diagram of its electrical configuration.
The information providing apparatus 100 includes a communication interface 1, an acquisition unit 2, an auxiliary storage unit 3, a display unit 4, a RAM 5, a hard disk 6, an operation unit 7, and a CPU 8 connected to the above components via a bus for overall control. Examples of the information providing apparatus 100 include a tablet computer and a personal digital assistant.
The communication interface 1 performs data communication with the connected detection device. In the present embodiment, prescribed information can be supplied through the communication interface 1 to an external device connected to the information providing apparatus 100, such as the detection device 101. The prescribed information comprises one or more of the first image, character information recognized from the first image, first subject information recognized from the first image, second subject information selected based on the first subject information, configuration information associated with the first subject information, and configuration information associated with the second subject information.
Furthermore, the communication interface 1 may also include an interface that controls the detection device 101. When the information providing apparatus 100 itself does not include the acquisition section 2, the communication interface 1 may include an interface for connecting to another apparatus that acquires the first image. Depending on the embodiment, the communication interface 1 may include various wired or wireless interfaces suitable for the information providing apparatus 100, such as a network interface, a USB interface, an IEEE 1394 interface, a video interface, GPRS, 3G, 4G, or 5G.
In another embodiment, detection data output by the detection device 101 connected to the information providing apparatus 100 may be received through the communication interface 1; in one embodiment this includes receiving detection data forwarded by a relay device.
In yet another embodiment, the communication interface 1 further comprises a wireless communication interface that can transmit the prescribed information, or the received detection data of the detection device 101 associated with it, to a storage medium at another destination such as a server.
The acquisition section 2, as an example of a first acquisition section, acquires a first image containing the information to be identified. In embodiment 1 the first acquisition unit is a visible light imaging unit comprising optical components, a lens driving component, a visible light detector, a signal preprocessing circuit, and the like (not shown); the first image containing the information to be identified is obtained by photographing a placard related to the subject.
The first acquisition unit may be implemented in various ways depending on the material and color of the placard information and its background, and on the environmental conditions of the application.
In another example, the first acquisition unit may employ a near-infrared photographing device and, at night, photograph a first image of a placard of retroreflective material with the aid of an auxiliary lighting device, for example a 950 nm infrared illuminator; preferably, the information providing apparatus 100 includes the auxiliary lighting device.
In yet another example, the first acquisition unit may employ a far-infrared thermal imaging unit, the first image being acquired from the far-infrared radiation of the placard; the first image may then be obtained from the difference in emissivity between the placard information and the background surface material.
In embodiment 1 the first image is image data obtained from the output signal of an image detector. Depending on the embodiment of the first acquisition unit, this may be raw image data obtained by A/D-converting the output signal of the image sensor, or image data obtained by applying prescribed processing to the raw data, such as white balance compensation, gamma compensation, and YC conversion, to generate image data composed of a digitized luminance signal and color-difference signals. In other embodiments the first image may instead be obtained by receiving external image data: through the communication interface 1, the information providing apparatus 100 may obtain a first image provided by an external apparatus connected by wire or wirelessly, such as a visible-light image of a placard output by a connected visible-light photographing apparatus; or the first image may be read as a file from a storage medium.
The auxiliary storage unit 3 is a storage medium such as a memory card and a related interface.
The display section 4 is, for example, a liquid crystal display; it may also be another display connected to the information providing apparatus 100, in which case the information providing apparatus 100 itself need not include a display in its electrical configuration.
The RAM 5 functions as a work memory for the CPU 8 and temporarily stores data processed by the CPU 8.
The hard disk 6 stores therein a program for control and various data used in the control. Data relating to the first image recognition and the like is stored in a storage medium such as the hard disk 6, for example, a character database for template matching is stored.
In embodiment 1 the hard disk 6 is an example of a storage medium storing information such as subject information and its associated configuration information, as shown in the table of fig. 3. Storage media in the information providing apparatus 100 include the hard disk 6, nonvolatile media such as a memory card, and volatile media such as the RAM 5. The storage medium may also be external, connected to the information providing apparatus 100 by wire or wirelessly: a storage medium of another storage apparatus or of a detection apparatus, a storage medium in a computer, or a storage medium at a network destination communicating through the communication interface 1. The information providing apparatus 100 can likewise acquire the data necessary for processing from such a storage medium.
The subject information is information related to the subject, for example information representing specific attributes of the subject such as its location, type, and number.
For subsequent matching, the subject information stored in the hard disk 6 should include the keywords of the placard information corresponding to the subject. Taking power equipment placards as an example, the subject placards installed at an actual substation site commonly fall into the following types. In one example, a placard installed on the subject entity itself carries information representing the subject's location, number, type, phase, and so on; correspondingly, the subject information includes the location (e.g. substation, equipment area), number, type (e.g. transformer, switch), and phase (e.g. phase A, B, or C). In another example, the placard is associated with the subject's equipment area; correspondingly, the subject information is the equipment-area information. In yet another example, the placard represents the subject's type; correspondingly, the subject information is the type information. The subject information stored in advance in the storage medium may further include the owning unit, voltage class, model, importance level, manufacturer, performance and characteristics, past imaging or inspection history, manufacturing date, service life, ID number, detection notes, and the like. The subject information may thus be composed in various ways according to the application.
When subject information of the above kinds has been prepared in advance, suppose the actual placard reads "equipment area 1-1200-switch-A phase" and the stored subject information includes "equipment area 1", "switch", and "equipment area 1-1200-switch-A phase". When the character information "equipment area 1-1200-switch-A phase" is obtained by recognition, the matching comparison selects the stored entry with the highest degree of match, "equipment area 1-1200-switch-A phase", as the recognized first subject information.
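As an illustrative sketch of this "highest degree of match" selection (an assumption for illustration, not the patent's algorithm; the function name, the use of Python's standard difflib similarity ratio, and the acceptance threshold are all choices of this sketch):

```python
from difflib import SequenceMatcher

def select_subject_info(recognized, stored):
    """Return the stored subject-info entry whose keyword best matches
    the character information recognized from the first image."""
    best, best_score = None, 0.0
    for candidate in stored:
        score = SequenceMatcher(None, recognized, candidate).ratio()
        if score > best_score:
            best, best_score = candidate, score
    # Accept only a sufficiently close match (threshold is an assumption).
    return best if best_score >= 0.5 else None

stored = ["equipment area 1", "switch", "equipment area 1-1200-switch-A phase"]
print(select_subject_info("equipment area 1-1200-switch-A phase", stored))
# → equipment area 1-1200-switch-A phase
```

Here the exact keyword scores highest and is selected; a string matching none of the stored entries closely enough yields no first subject information.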
For convenience of explanation, the embodiment is described below using the abbreviated placard content "subject 1" shown in fig. 5.
The operation unit 7 allows the user to perform various instruction operations and inputs such as setting information; the CPU 8 executes the program in response to operation signals from the operation unit 7. A touch screen or keys (not shown) may be used to implement these operations.
The CPU 8, as an example of a control unit, controls the overall operation of the information providing apparatus 100; the control program and the various data used for controlling each unit are stored in a storage medium such as the hard disk 6. The control unit may be realized by, for example, a CPU, an MPU, an SoC, or a programmable FPGA.
In a preferred example, the CPU8 includes a first recognition unit configured to recognize the first image acquired by the first acquisition unit and acquire first subject information.
Taking as an example a first image obtained by photographing a placard whose information consists of characters, in one embodiment the first recognition unit includes a positioning unit, a segmentation unit, a recognition unit, and a determination unit.
The positioning unit locates the placard region in the first image; for example, it searches a specified range of the first image, finds several areas matching placard characteristics (such as the placard region differing in color from the environmental background) as candidate areas, analyzes the candidates further, and finally selects the best area as the placard region and extracts it from the first image. Prescribed processing such as tilt correction may also be applied to the placard image.
The segmentation unit segments the characters in the placard: the extracted placard region is divided into individual characters. Character segmentation may, for example, use a vertical projection method to split out individual character images at the blank columns between characters, taking into account the placard's character layout, the characters themselves, and size constraints.
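A minimal sketch of vertical-projection segmentation (illustrative only; real placards would need binarization, noise handling, and the size constraints mentioned above):

```python
import numpy as np

def segment_characters(binary):
    """Split a binarized placard region (1 = ink, 0 = background) into
    per-character images at the blank columns, via vertical projection."""
    projection = binary.sum(axis=0)          # ink count per column
    chars, start = [], None
    for col, count in enumerate(projection):
        if count > 0 and start is None:
            start = col                      # a character begins
        elif count == 0 and start is not None:
            chars.append(binary[:, start:col])
            start = None                     # blank gap between characters
    if start is not None:                    # character touching right edge
        chars.append(binary[:, start:])
    return chars

# Two "characters" separated by a blank column:
img = np.array([[1, 1, 0, 1],
                [1, 0, 0, 1]])
print(len(segment_characters(img)))  # → 2
```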
The recognition unit identifies the segmented placard characters. Placard character recognition may, for example, use a template matching algorithm: the segmented character is binarized and scaled to the size of the character templates in the character database, then matched against all templates, the best match being taken as the result. How to extract a character image from an image and perform character recognition is well known in the art, and a detailed description is omitted.
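The template-matching step can be sketched as follows (a toy illustration under assumed 2x2 templates; the pixel-agreement score stands in for whatever matching measure a real character database would use):

```python
import numpy as np

def match_character(char_img, templates):
    """Pick the template name with the highest pixel-agreement score.
    char_img is binarized and already scaled to the template size."""
    def score(a, b):
        return (a == b).mean()               # fraction of agreeing pixels
    return max(templates, key=lambda name: score(char_img, templates[name]))

# Hypothetical tiny character database:
templates = {
    "1": np.array([[0, 1], [0, 1]]),
    "7": np.array([[1, 1], [0, 1]]),
}
print(match_character(np.array([[0, 1], [0, 1]]), templates))  # → 1
```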
The determination unit determines whether the character information is first subject information; the character information is the information composed of the recognized character or characters. Specifically, in one example, the information providing apparatus 100 matches the recognized character information against prescribed subject information (e.g. the subject information stored in the hard disk 6); if their keywords match, the matched entry is taken as the recognized first subject information. If not, it is determined that no first subject information was recognized.
Explaining the identification using the placard content "subject 1" of fig. 5: from the first image of the placard, the character information "subject 1", composed of the recognized individual characters, is obtained; if it matches the keyword of the subject information "subject 1" stored in the hard disk 6, the first subject information "subject 1" is obtained as the recognition result. The configuration information stored in the hard disk 6 in association with the subject information "subject 1" may then be used as the configuration information associated with the recognized first subject information "subject 1".
When the placard content consists of characters (Chinese characters, numerals, letters, etc.), whether first subject information is obtained can be determined by matching the recognized character information against the stored subject information. When the placard content is, for example, a barcode or graphic code, the code may instead be matched against code templates in a template library; on a match, the subject information associated with the matching template is taken as the recognized first subject information, and the configuration information associated with that subject information as the configuration information associated with the recognized first subject information. In that case a library of code templates is preferably stored in advance in a storage medium such as the hard disk 6.
Various embodiments of recognizing the first image to obtain character information and/or first subject information may be employed. In another example, the determination unit may be omitted and only character information recognized.
In another embodiment, the information providing apparatus 100 may include no first recognition unit, or only part of its functions. In one example, the first image may be transmitted through the communication interface 1 to a destination, for example to a designated server over wireless 3G; the server performs recognition of the first image, and the first subject information obtained by that recognition and/or its associated configuration information is received back through the communication interface 1. In another example, the segmented character images extracted from the first image may be transmitted through the communication interface 1 to a destination server, which performs the character recognition and the processing for obtaining the first subject information; the resulting first subject information and/or its associated configuration information is then received through the communication interface 1. Obviously, when the configuration information can be received through the communication interface 1, the content shown in fig. 3 need not be stored in the storage medium of the information providing apparatus 100.
The second subject information is information related to the subject that is selected based on the first subject information. Preferably, a storage medium such as the hard disk 6 stores the second subject information in association with the first subject information; as shown in the table of fig. 3, the stored subject information "subject 1" and "subject 2" are each associated with corresponding second subject information.
For example, when the first subject information "subject 1" is obtained by recognition of the first image, suppose "subject 1" corresponds to a piece of power equipment whose components, such as the body or a joint, may be detected; when the joint needs to be photographed, the user may select the associated second subject information "joint" based on the first subject information "subject 1".
Likewise, when the first subject information "subject 2" is obtained by recognition, suppose "subject 2" corresponds to an equipment area in which devices such as a switch or a blade are detected; when the switch needs to be photographed, the user may select the associated second subject information "switch" based on the first subject information "subject 2".
As another example, the first subject information may include only part of the information related to the subject, the complete subject information also containing other attribute information such as an ID; the other attribute information of the same subject is then selected according to the first subject information, the second subject information being, for example, the ID number.
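The association between first and second subject information described above could be held in a simple lookup table mirroring fig. 3 (the keys and candidate lists below are illustrative, taken from the examples in the text, not the patent's actual data):

```python
# Hypothetical association table: first subject info -> second subject
# info candidates the user may select (cf. the table of fig. 3).
SECOND_INFO = {
    "subject 1": ["body", "joint"],
    "subject 2": ["switch", "blade"],
}

def candidates_for(first_subject):
    """Second subject information selectable for a given first subject."""
    return SECOND_INFO.get(first_subject, [])

print(candidates_for("subject 1"))  # → ['body', 'joint']
```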
The configuration information may be one or more items of parameters, information, programs, and the like necessary to perform process control. The detection device 101 may perform process control according to the configuration information.
When the detection device 101 connected to the information providing apparatus 100 acquires the prescribed information, it can perform processing control based on it, for example one or more of notification processing, display control of prompt information, processing control of acquired detection data, and acquisition control of detection data.
The notification processing executes notification control, for example when the first subject information is acquired: the obtained first subject information may be displayed, or the notification may be, or be accompanied by, one or more of a vibration of a vibration component in the detection device 101, a change of an indicator light, a sound from a sound component, and the like; any notification means the user can perceive may be used. The notification control may be performed based on the recognized first subject information and/or the second subject information.
The display control of the presentation information may be, for example, one or more of display control of presentation information related to the subject, such as display control of a reference image, past imaging or inspection history, a detection notice, second subject information, and the like. The display control of the prompt message can be carried out based on the configuration information related to the first measured object information and/or the second measured object information. The configuration information may be one or more items of parameters, information, programs, and the like necessary to perform display control of the guidance information.
The processing control of the detection data includes one or more of various processing controls such as recording, labeling, communication, recognition, analysis region setting, analysis, diagnosis, classification, and the like with the detection data. The processing control of the detection data may be performed based on the configuration information associated with the first subject information and/or the second subject information. The configuration information may be one or more items of parameters, information, programs, and the like necessary to perform process control of the detected data.
The acquisition control is, for example, control related to lens switching, acquisition frequency of a detector output signal, image processing, etc. in association with the detection device 101. The acquisition control of the detection data may be performed based on the configuration information associated with the first subject information and/or the second subject information. The configuration information may be one or more items of parameters, information, programs, and the like necessary to perform acquisition control of the detection data.
Depending on the different applications of the detection data, there may be various kinds of configuration information. The configuration information related to the processing may be stored in the table shown in fig. 3 in association with the subject information and the second subject information. When the information obtained by recognition from the first image matches the subject information, the configuration information associated with that subject information may be used as the configuration information associated with the first subject information obtained by recognition from the first image; when the second subject information is selected, the configuration information associated with the second subject information may be employed in the processing.
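The fig. 3-style association just described can be sketched as a simple lookup table. In this minimal Python sketch, all names, keys, and values (e.g. `diagnosis_threshold_c`, `record_folder`) are illustrative assumptions, not part of the invention:

```python
# Illustrative fig. 3-style table (all names/values assumed): first subject
# information -> associated second subject information and configuration
# information used for later processing control.
CONFIG_TABLE = {
    "subject 1": {
        "config": {"analysis_mode": "S01MAX-S02MAX", "diagnosis_threshold_c": 2.0},
        "second": {
            "body": {"record_folder": "subject1/body"},
            "joint": {"record_folder": "subject1/joint"},
        },
    },
}

def lookup_config(first_info, second_info=None):
    """Return configuration associated with the recognized first subject
    information, or with a selected second subject information."""
    entry = CONFIG_TABLE.get(first_info)
    if entry is None:
        return None  # recognition did not match any stored subject information
    if second_info is not None:
        return entry["second"].get(second_info)
    return entry["config"]
```

When recognition yields "subject 1", the associated configuration is retrieved; when the user additionally selects the second subject information "joint", the per-second-subject configuration is used instead.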
Part of the processing will be described with reference to the display examples of the display interface of the detection apparatus 101 shown in fig. 6.
In the display control of the presentation information, for example, when the first subject information is acquired, the display unit 4 is caused to display the first subject information; preferably, when the first subject information is acquired and is associated with second subject information, the display unit 4 displays the first subject information together with the associated second subject information, so that the user can conveniently select the second subject information. As shown in fig. 6(a), when the first subject information "subject 1" is recognized, the first subject information and the associated second subject information "body", "joint", and the like may be displayed. A table such as that shown in fig. 3 may also be stored in the detection apparatus 101 for flexible use.
In the display processing of the reference image, for example, corresponding reference image display processing is executed according to reference image configuration information associated with the first subject information and/or the second subject information. The reference image configuration information includes one or more configuration parameters related to the reference image display processing; when only some of the configuration parameters related to the reference image are included, the remaining configuration parameters may adopt the default configuration of the detection apparatus 101 or configuration parameters set by the user. The reference image configuration parameters include, for example, composition data of the reference image, a position parameter for displaying the reference image or a rule for obtaining the position parameter, and other display parameters of the reference image such as transparency, line type, and color. For example, display processing of a reference image is performed on the acquired thermal image data according to reference image composition data associated with the first subject information. The reference image may be any of various figures and image data related to the detection of the subject, such as a previously captured historical image; for example, an image for assisting the capturing of the subject, which represents morphological features of the subject and, when superimposed according to predetermined position parameters (position, size, or rotation angle) on the image obtained from the detection data, assists the user in aligning with the subject for detection; or, for example, a standard figure or image of detection data for the user's reference when detecting the subject, which may be displayed in a region other than the figure or image obtained from the detection data displayed on the display unit 4. As shown in fig. 6(c), a reference image T1 obtained from the reference image composition data associated with the first subject information is superimposed, according to the predetermined position parameters, on the infrared thermal image obtained from the captured thermal image data, as a reference for capturing the subject thermal image IR1, so that the quality of capturing the subject thermal image IR1 can be standardized.
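The superimposition of a reference image according to position parameters and a transparency parameter can be sketched as a simple alpha blend. The function name, parameter names, and the grayscale representation below are assumptions for illustration:

```python
import numpy as np

def overlay_reference(image, ref, x, y, alpha=0.5):
    """Superimpose a reference image onto a detection image at the given
    position parameters (x, y), with the given transparency (alpha).
    `image` and `ref` are 2-D grayscale arrays."""
    out = image.astype(float)
    h, w = ref.shape
    region = out[y:y + h, x:x + w]
    # blend: keep (1 - alpha) of the underlying image, add alpha of the reference
    out[y:y + h, x:x + w] = (1 - alpha) * region + alpha * ref
    return out
```

A real device would also apply size and rotation-angle parameters and draw with the configured line type and color; only the position and transparency parameters are sketched here.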
In the recording process, for example, when the detection data is recorded, the received prescribed information is recorded in association with the detection data.
In the marking process, for example, when continuously acquired detection data is recorded dynamically, the received prescribed information is recorded, in response to a marking instruction, in association with a prescribed frame of the continuously acquired detection data. For example, when continuously acquired detection data is recorded dynamically, the prescribed information is recorded in association with the detection data acquired at the marking time, in response to a marking instruction given by the user via the operation unit.
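The marking process can be sketched as associating the prescribed information with the index of the frame acquired at the marking time. The class and attribute names here are assumptions for illustration:

```python
# Minimal sketch of the marking process: during dynamic recording of
# continuously acquired detection data, a marking instruction associates
# the received prescribed information with the most recent frame.
class DynamicRecorder:
    def __init__(self):
        self.frames = []   # continuously recorded detection data
        self.marks = {}    # frame index -> associated prescribed information

    def record_frame(self, frame):
        self.frames.append(frame)

    def mark(self, info):
        """Associate `info` with the frame acquired at the marking time."""
        if not self.frames:
            raise RuntimeError("no frame recorded yet")
        self.marks[len(self.frames) - 1] = info
```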
The communication process is, for example, to associate the acquired predetermined information with the acquired detection data and transmit the data to a destination storage medium via the communication interface 1.
In the recognition processing, for example, according to recognition configuration information associated with the first subject information and/or the second subject information, corresponding recognition processing control is performed based on the acquired detection data to identify whether a specific subject has been detected. The recognition configuration information may include one or more configuration parameters related to the recognition processing; when only some of the configuration parameters related to recognition are included, the remaining configuration parameters may adopt the default configuration of the detection apparatus 101 or configuration parameters set by the user. The recognition configuration parameters include, for example, subject identification information, a judgment value of the degree of correlation, and a recognition search strategy. The subject identification information is, for example, an image template or feature quantity used for matching; the judgment value of the degree of correlation is compared with the correlation obtained by matching to determine whether detection data of a specific subject has been recognized; the recognition search strategy includes, for example, determination of the detection area in the detection data and determination of the processing order when there is a combination of a plurality of templates, detection areas, and judgment values.
For example, according to the subject identification information associated with the first subject information (such as a feature template of the subject) and the corresponding judgment value of the degree of correlation, thermal image data is extracted from a specified detection area of the acquired thermal image data and matched against the subject identification information to obtain a degree of correlation, which is then compared with the judgment value to identify whether a specific subject thermal image has been captured. As shown in fig. 6(b), when a subject thermal image IR1 matching the template T1 associated with the first subject information is detected, a blinking icon SS1 is displayed.
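One common way to obtain such a degree of correlation is normalized cross-correlation of the template against windows of the detection area. This sketch assumes that form; the patent does not fix a particular correlation measure, and the function and parameter names are illustrative:

```python
import numpy as np

def match_template(data, template, judge_value, search_region=None):
    """Slide `template` over `data` (2-D arrays), compute a normalized
    correlation at each offset, and report whether the best correlation
    reaches the judgment value of the degree of correlation."""
    if search_region is not None:
        y0, y1, x0, x1 = search_region  # restrict to the configured detection area
        data = data[y0:y1, x0:x1]
    th, tw = template.shape
    t = template - template.mean()
    tn = np.linalg.norm(t)
    best = -1.0
    for y in range(data.shape[0] - th + 1):
        for x in range(data.shape[1] - tw + 1):
            w = data[y:y + th, x:x + tw]
            wc = w - w.mean()
            denom = np.linalg.norm(wc) * tn
            if denom > 0:
                best = max(best, float((wc * t).sum() / denom))
    return best >= judge_value, best
```

When the best correlation reaches the judgment value, the device would treat the specific subject thermal image as detected (and, in the fig. 6(b) example, display the blinking icon SS1).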
In the analysis region setting processing, for example, according to analysis region configuration information associated with the first subject information and/or the second subject information, corresponding analysis region setting processing is executed based on the acquired detection data. The analysis region configuration information includes one or more configuration parameters related to the analysis region setting processing; when only some of the configuration parameters related to analysis region setting are included, the remaining configuration parameters may adopt the default configuration of the detection apparatus 101 or configuration parameters set by the user. The analysis region configuration parameters include, for example, analysis region composition data and a position parameter of the analysis region or a rule for obtaining the position parameter. For example, a corresponding analysis region is set for the acquired thermal image data according to the analysis region composition data and position parameters associated with the first subject information; as shown in fig. 6(d), analysis regions S01 and S02 are set on the infrared thermal image according to the analysis region composition data and position parameters associated with the first subject information.
In the analysis processing, for example, according to analysis configuration information associated with the first subject information and/or the second subject information, corresponding analysis processing is executed based on the acquired detection data. The analysis configuration information includes one or more configuration parameters related to the analysis processing; when only some of the configuration parameters related to analysis are included, the remaining configuration parameters may adopt the default configuration of the detection apparatus 101 or configuration parameters set by the user. The analysis parameters include, for example, analysis region composition data related to the analysis, a position parameter of the analysis region or a rule for obtaining the position parameter, and an analysis mode. For example, the acquired thermal image data is analyzed according to the analysis region and/or analysis mode associated with the first subject information. As shown in fig. 6(e), the analysis region composition data and the analysis mode are associated with the first subject information, the analysis regions S01 and S02 are set, and the analysis data is obtained according to the analysis mode (S01MAX-S02MAX). The analysis mode refers to an analysis calculation rule; taking thermal image data as the detection data, for example, it represents the analysis calculation rule applied when temperature analysis is performed on the thermal image data determined by the analysis region to obtain an analysis result, such as calculating a maximum temperature, an average temperature, a minimum temperature, a percentage content, and the like; it may also include a calculation relationship between analysis regions, such as a temperature difference calculation.
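The fig. 6(e) analysis mode (S01MAX-S02MAX) can be sketched directly: take the maximum temperature in each region and form the difference. Region representation as rectangular slices is an assumption for illustration:

```python
import numpy as np

def analyze(thermal, regions, mode="S01MAX-S02MAX"):
    """Apply an analysis mode to thermal image data (a 2-D temperature array).
    `regions` maps region names to (y0, y1, x0, x1) rectangles; only the
    mode from the fig. 6(e) example is sketched here."""
    stats = {name + "MAX": float(thermal[y0:y1, x0:x1].max())
             for name, (y0, y1, x0, x1) in regions.items()}
    if mode == "S01MAX-S02MAX":  # temperature difference between regions
        return stats["S01MAX"] - stats["S02MAX"]
    raise ValueError("unsupported analysis mode: " + mode)
```

Other analysis calculation rules (average, minimum, percentage content) would be additional branches over the same region slices.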
In the diagnosis processing, for example, according to diagnosis configuration information associated with the first subject information and/or the second subject information, corresponding diagnosis processing is executed based on the acquired detection data. The diagnosis configuration information may include one or more configuration parameters related to the diagnosis processing; when only some of the configuration parameters related to diagnosis are included, the remaining configuration parameters may adopt the default configuration of the detection device 101 or configuration parameters set by the user. Taking the diagnosis of thermal image data as an example, the configuration parameters contained in the diagnosis configuration information include analysis region composition data, position parameters of the analysis regions in the thermal image, an analysis mode, a diagnosis threshold, and the corresponding diagnosis conclusion. As shown in fig. 6(f), the analysis region composition data, the position parameters of the analysis regions in the thermal image, the analysis mode, the diagnosis threshold, and the diagnosis conclusion are associated with the first subject information; the analysis regions S01 and S02 are set, the analysis data is obtained according to the analysis mode (S01MAX-S02MAX), and the diagnosis result "serious defect!" is obtained according to the diagnosis threshold and the corresponding diagnosis conclusion (S01MAX-S02MAX ≧ 2 ℃: serious defect!).
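The diagnosis rule from the fig. 6(f) example reduces to a threshold comparison on the analysis result. The default threshold and conclusion text follow that example; the "normal" fallback string is an assumption:

```python
def diagnose(analysis_value, threshold_c=2.0, conclusion="serious defect!"):
    """Compare the analysis result (e.g. S01MAX - S02MAX, in degrees C)
    with the configured diagnosis threshold and return the configured
    diagnosis conclusion when the threshold is reached."""
    if analysis_value >= threshold_c:
        return conclusion
    return "normal"  # assumed fallback when the threshold is not reached
```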
In the classification processing of the detection data, for example, the detection data is classified according to the first subject information and/or the second subject information associated with it, for example by storing the detection data in a specific folder, so as to facilitate subsequent batch processing.
In the lens switching processing, lens parameters are switched, such as changing the aperture of a lens, according to lens configuration information associated with the first subject information and/or the second subject information.
In the processing control of the detection data, for example, corresponding processing is executed based on the acquired detection data according to the configuration information associated with the first subject information and/or the second subject information. For example, image display processing such as pseudo-color rendering is performed on the acquired thermal image data based on processing parameters such as interpolation and pseudo-color parameters associated with the first subject information.
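Pseudo-color rendering maps temperature values to colors. A real device would use configurable palettes and interpolation; this blue-to-red linear mapping is an illustrative assumption:

```python
import numpy as np

def pseudo_color(thermal):
    """Map thermal image data to a simple pseudo-color image by normalizing
    values to 0..1: red grows with temperature, blue with coldness."""
    t = thermal.astype(float)
    lo, hi = t.min(), t.max()
    norm = (t - lo) / (hi - lo) if hi > lo else np.zeros_like(t)
    rgb = np.zeros(t.shape + (3,))
    rgb[..., 0] = norm        # red channel for hot areas
    rgb[..., 2] = 1.0 - norm  # blue channel for cold areas
    return rgb
```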
The above-described processing is not limited to a single kind of processing; a combination of plural kinds of processing may be performed;
for example, according to the recognition configuration information associated with the first subject information and/or the second subject information, corresponding recognition processing is performed based on the acquired detection data, and when the subject identification information matches the detection data, the detection data is processed according to one or more items of configuration information associated with the first subject information, the second subject information, the recognition configuration information, or other associated information.
In one example, according to recognition configuration information associated with the first subject information and/or the second subject information, corresponding recognition processing control is executed based on the acquired detection data, and when a specific subject thermal image is recognized in the captured thermal image data, an analysis region is set based on the analysis region configuration information associated with the first subject information. As shown in fig. 6(f), when a subject thermal image IR1 matching the template T1 associated with the first subject information is detected, analysis regions S01 and S02 may be set according to the analysis region composition data and position parameters associated with the first subject information; preferably, an analysis region may be set based on analysis region composition data and a position parameter defining a specific positional relationship with the template T1.
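Placing an analysis region at a specific positional relationship to the matched template amounts to applying configured offsets to the match position. The offset encoding (dx, dy, width, height) is an assumption for illustration:

```python
def place_analysis_regions(match_xy, region_offsets):
    """Given the (x, y) position where a template matched, place each
    analysis region at its configured offset relative to the template.
    `region_offsets` maps region names to (dx, dy, w, h)."""
    placed = {}
    for name, (dx, dy, w, h) in region_offsets.items():
        x, y = match_xy[0] + dx, match_xy[1] + dy
        placed[name] = (x, y, w, h)
    return placed
```

This way the analysis regions follow the recognized subject even when it appears at a different position in each captured frame.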
In another example, the user photographs the subject 1 to obtain detection data, and the detection data is recognized based on the subject identification information (a body template and a joint template) associated with the first subject information. When the detection data matches the body template, the detection data is processed according to the configuration information associated with the body template; for example, if the body template is associated with the second subject information "body", the second subject information can be recorded in association with the detection data, which facilitates subsequent classification processing. Likewise, if the body template is associated with diagnosis configuration information, the detection data can be diagnosed according to that diagnosis configuration information.
Fig. 2(a) and (b) show the external form of the information providing apparatus 100. The information providing apparatus 100 includes the lens 201 of the acquisition unit 2 and an auxiliary lighting device 202. The auxiliary lighting device 202 is, for example, a high-intensity lamp whose light source illuminates the label of the subject at night so that the first acquisition unit can image it. The information providing apparatus 100 may include a plurality of keys (not shown) for user operation, and related operations may also be implemented using a touch panel, a voice recognition unit (not shown), or the like. The information providing apparatus 100 can be held and used by a user. Preferably, a hand strap connected to the housing is provided, so that the user can wear the apparatus on the wrist in a manner similar to a watch (not shown).
Fig. 4 is a block diagram showing an electrical configuration of an information providing system in which the information providing apparatus 100 and the detection apparatus 101 are connected.
In this example, a thermal image capturing device is taken as the detection device 101. Fig. 4 is an electrical block diagram of the detection apparatus 101, which includes a communication interface 10, a second image capturing section 20, a flash memory 30, an image processing section 40, a RAM 50, a display section 60, an auxiliary storage section 70, a CPU 80, and the like.
The communication interface 10 is an interface for connecting the detection device 101 to an external device and exchanging data according to a communication specification such as USB, IEEE 1394, network, GPRS, 3G, 4G, or 5G; examples of the external device include a personal computer, a server, a PDA (personal digital assistant), the information providing device 100, and another storage device.
The second imaging unit 20 is configured by an optical component, a lens driving component, an infrared detector, a signal preprocessing circuit, and the like (not shown). The optical component consists of an infrared optical lens that focuses the received infrared radiation onto the infrared detector. The lens driving component drives the lens to perform focusing or zooming operations according to a control signal from the CPU 80; alternatively, the optical component may be adjusted manually. The infrared detector, such as a cooled or uncooled infrared focal plane detector, converts the infrared radiation passing through the optical component into an electrical signal. The signal preprocessing circuit includes a sampling circuit, an AD conversion circuit, a timing trigger circuit, and the like; it samples the electrical signal output from the infrared detector at a predetermined period and converts it through the AD conversion circuit into a digital thermal image signal, for example 14-bit or 16-bit binary data (also referred to as AD values). The thermal image signal is temporarily stored in the RAM 50, subjected to predetermined processing (such as compression) by the image processing unit 40 (such as a DSP), and then output via the communication interface 10. Depending on different design and detection purposes, the thermal image data output by the detection device 101 may be the thermal image signal and/or data obtained by specified processing of the thermal image signal (e.g., one or more of image data of infrared thermal images, temperature data obtained from the thermal image data, and thermal image data compressed in a specified format), all of which are collectively referred to as thermal image data.
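One typical step between the 14- or 16-bit AD values and a displayable image is a linear stretch to 8-bit grayscale. The exact processing chain in the device (correction, interpolation, compression) is design-dependent; this function is only an illustrative sketch of that one step:

```python
import numpy as np

def ad_to_display(ad_values):
    """Convert raw detector AD values (e.g. 14- or 16-bit binary data) into
    an 8-bit grayscale image by a linear stretch between the frame's
    minimum and maximum values."""
    a = ad_values.astype(float)
    lo, hi = a.min(), a.max()
    if hi == lo:
        return np.zeros(a.shape, dtype=np.uint8)  # flat frame: all black
    return ((a - lo) / (hi - lo) * 255).astype(np.uint8)
```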
The flash memory 30 stores a control program and various data used for controlling each section.
The image processing unit 40 performs predetermined processing on the thermal image data obtained by the second image capturing section 20, converting it into data suitable for display, recording, and the like, through processing such as correction, interpolation, pseudo-color rendering, synthesis, compression, and decompression. The image processing unit 40 may be implemented by a DSP, another microprocessor, a programmable FPGA, or the like, or may be a processor integrated with the CPU 80.
The RAM 50, a volatile memory such as a DRAM, serves as a buffer memory for temporarily storing thermal image data output from the second image capturing section 20, and also serves as a work memory for the image processing section 40 and the CPU 80, temporarily storing the data they process.
The display unit 60 performs display of the image data for display stored in the RAM50 on the display unit 60 under the control of the CPU 80.
The auxiliary storage unit 70 is an interface for a memory card serving as a rewritable nonvolatile memory; the memory card is detachably mounted in a card slot of the main body of the detection device 101, and data such as thermal image data is recorded on it under the control of the CPU 80.
The CPU80 controls the overall operation of the detection device 101.
Referring to fig. 5, an implementation of the connection between the information providing apparatus 100 and the detecting apparatus 101 is schematically illustrated.
After the information providing device 100 is powered on, the user can observe its presentation information during patrol; at this time, the user does not need to watch the display screen of the detection device 101 or connect the information providing device 100 to the detection device 101. When the first subject information is recognized, the information providing device 100 and the detection device 101 may be connected through a cable 103, whose two ends have plugs matching the communication interface 1 of the information providing device 100 and the communication interface 10 of the detection device 101, respectively.
The connection between the communication interface 1 of the information providing apparatus 100 and the communication interface 10 of the detection apparatus 101 may be performed by wireless, such as bluetooth.
Thus, if there is no notification from the information providing device 100 during the patrol, the user does not need to watch the display screen of the detection device 101 or even turn it on, which makes the detection device 101 very convenient to carry; moreover, the power consumption of the information providing apparatus 100 can be made small, and its volume and weight can likewise be made light, small, and portable.
In one example, as a result of the information providing device 100 providing the recognized first subject information "subject 1" to the detection device 101, the subsequently captured thermal image data can be recorded in association with the first subject information "subject 1". In a preferred embodiment, the received first subject information "subject 1" may be displayed on the display unit 60 of the detection device 101; in a further preferred embodiment, the configuration information associated with the first subject information may also be transmitted to the detection device 101.
The control procedure of embodiment 1 is explained with reference to the flowchart of fig. 7.
The specific operation and control flow of embodiment 1 will be described in detail below. Before the main image capture, a table such as that in fig. 3 is stored in advance on the hard disk 6. The CPU 8 controls the overall operation of the information providing apparatus 100 based on the control program and various control data stored on the hard disk 6, and the control procedure is as follows:
Step A01, judging whether there is an instruction to acquire a first image; when the user selects acquisition of the first image through the operation unit 7, the process proceeds to the next step;
Step A02, the first acquisition unit acquires a first image, in this example a visible light image. The user adjusts the imaging angle so that the first image shown in fig. 8(a) is displayed on the display unit 4, in which the subject 1 (501) and the label 502 attached to the subject mount are visible. Preferably, as shown in fig. 8(b), a positioning frame 503 is displayed in the image on the display unit 4, so that the user can frame the image of the sign 502 within it; the first recognition unit can then quickly locate the sign region from the range of the positioning frame 503, which increases the recognition processing speed.
A specific example of the first image is a sign region image extracted from the captured first image, or a character image obtained by further segment extraction.
Step A03, judging whether there is a recognition instruction; if not, the process goes to step A07, and if the process does not end there, it returns to step A02, where the user can adjust the angle, distance, focal length, and the like with which the information providing apparatus 100 photographs the sign, or adjust the position parameters (position, size, or rotation angle) of the positioning frame 503; when the user confirms through the operation unit 7, the process proceeds to the next step;
Step A04, performing processing related to recognition based on the first image;
specifically, the recognition unit may locate the sign region based on the first image or on the range of the positioning frame 503 within it; the characters are then segmented and template matching is performed to identify the characters making up "subject 1", and the character information "subject 1" formed from the recognized characters is obtained and stored in a predetermined area of the temporary storage unit 3.
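The character-level template matching in step A04 can be sketched as scoring each segmented character image against stored character templates. The tiny binary 3x3 "templates" below are toy assumptions, not the actual templates used by the recognition unit:

```python
# Minimal sketch of character recognition by template matching: each
# segmented character is compared against stored character templates,
# and the best-scoring template names the character.
def match_character(segment, templates):
    """Return the template key whose pattern overlaps the segment most.
    `segment` and each template are flat tuples of binary pixels."""
    def score(a, b):
        return sum(x == y for x, y in zip(a, b))  # count matching pixels
    return max(templates, key=lambda k: score(segment, templates[k]))
```

Concatenating the recognized characters then yields the character information (e.g. "subject 1") that is matched against the stored subject information.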
Step A05, judging whether first subject information has been obtained by recognition;
specifically, when the information providing apparatus 100 matches the recognized character information "subject 1" against the subject information "subject 1" stored on the hard disk 6 (for example, a keyword match), the first subject information is considered recognized and may be notified, for example by displaying it, by light from an indicator, by a sound prompt, or by vibration; any method perceivable by the user may be used.
Because the subject information "subject 1" is also associated with the second subject information "body" and "joint", the display unit 4 can, under the control of the CPU 8, display the second subject information associated with the recognized first subject information "subject 1": "body" and "joint". When the user intends to photograph the joint, the second subject information "joint" can thus be selected.
At this time, if the detection apparatus 101 and the information providing apparatus 100 are not connected, the user can connect them; the process then proceeds to step A06;
if there is no match, it is judged that first subject information has not been recognized, and the process goes to step A07; if the process does not end there, it returns to step A02, where the user can adjust the shooting angle, distance, and the like of the information providing apparatus 100 to re-acquire the first image for subsequent processing;
Step A06, the CPU 8 judges whether there is an instruction for data communication; for example, when the user presses a transmission key (not shown) of the operation unit 7, transmission processing is performed to transmit one or a combination of plural items among the recognized character information, the first subject information, the second subject information selected based on the first subject information, the configuration information associated with the first subject information, and the configuration information associated with the second subject information;
the detection apparatus 101 can then perform processing control, for example one or more of various kinds of processing control related to notification, display of information related to the subject, recording, marking, communication, recognition, setting of an analysis region, analysis, diagnosis, display of a reference image, lens switching, image processing control, and the like. Depending on the application of the detection data, there may be various processing configurations associated with the recognized first subject information and/or second subject information. For example, an image obtained from the detection data may be displayed with the first subject information superimposed on it;
the display interface of the detection device 101 may display the subject indication information "subject 1" obtained from the received subject information, and at the time of recording, prescribed record information may be recorded in association with the detection data. For example, the prescribed record information is recorded in association with the thermal image data obtained by the image capturing section and/or prescribed infrared data obtained by performing prescribed processing on that thermal image data.
Step A07, judging whether to end; if yes, the process ends; if not, it returns to step A02 to re-acquire the first image.
As described above, the prescribed information is obtained by acquiring the first image and is supplied to the detection device 101, which solves the problems of the prior art: previously, the label of the subject had to be inspected so that the subject information could be recorded manually, which was inconvenient. The information providing device with a first-image capturing function makes it convenient to acquire the first subject information of a labeled subject, is convenient to use both in the daytime and at night, does not require approaching the part of the subject where the label is mounted, and, when first subject information is obtained by recognizing the first image, allows second subject information to be selected based on the first subject information and supplied to the detection device 101.
Further, the instruction to acquire the first image and the recognition instruction are not limited to being given by the user through the operation unit 7. In one example, the first image may be acquired continuously, and the recognition processing performed according to a recognition instruction from the user; in another example, the apparatus may be configured both to acquire first images continuously and to recognize the continuously acquired first images. Likewise, communication need not be limited to a communication instruction from the user through the operation unit 7: the CPU 8 may automatically detect the connection, and when a connected detection device 101 is detected, transmit the recognized first subject information to it. In another example, the CPU 80 of the detection device 101 may be configured to detect the connection automatically, and when a connected information providing device 100 is detected, read the recognized first subject information and other prescribed information from the predetermined area of the temporary storage unit 3.
In another embodiment, the first acquisition unit may acquire the first image by reading it from a storage medium or by receiving it through the communication interface 1.
In another preferred embodiment, after the information providing apparatus 100 is connected to the detection apparatus 101, the information providing apparatus 100 may be configured to receive the detection data acquired by the detection apparatus 101 and process it, for example by associating the received detection data with the recognized character information, the first subject information, or the second subject information. In another preferred embodiment, the selected subject information is associated with the received detection data and transmitted through the communication interface 1, for example wirelessly (such as by GPRS, 3G, 4G, or 5G), to a destination storage medium, such as the storage medium of a destination computer. In a preferred example, the acquired detection data is matched based on the recognition configuration information related to subject 1, such as a joint template and a body template of the subject identification information corresponding respectively to the second subject information "joint" and "body", and the second subject information "body" is automatically selected when the detection data correlates with the body template.
Other embodiments
The present invention is not limited to acquiring detection data from the output signal of a probe; detection data may also be acquired from the outside. The invention may also be configured as a component or functional block of a detection device or an information processing device, for example one that acquires detection data from another component.
In the above examples a particular step order is described, but various orders are possible depending on the embodiment, and the processing order described in the above examples is not limiting. When the control unit 11, the image processing unit, and the like include a plurality of processors, some steps may be processed in parallel.
Aspects of the present invention can also be realized by a computer of a system or apparatus (or by devices such as a CPU or MPU) that reads out and executes a program recorded on a storage device to perform the functions of the above-described embodiments, and by a method whose steps are performed by a computer of a system or apparatus by, for example, reading out and executing a program recorded on a storage device to perform the functions of the above-described embodiments. To this end, the program is supplied to the computer, for example, via a network or from any of various types of recording media serving as the storage device (e.g., a computer-readable medium).
The present invention also provides a computer program, in which a digital signal formed by the computer program is recorded on a computer-readable recording medium such as a hard disk or a memory. When the program is executed, the following steps are performed: a first acquisition step of acquiring a first image; and a communication step of supplying prescribed information to an external device connected to the information providing device. The prescribed information includes one or more of the first image, character information recognized based on the first image, first subject information recognized based on the first image, second subject information selected based on the first subject information, configuration information associated with the first subject information recognized based on the first image, and configuration information associated with the second subject information selected based on the first subject information.
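The steps the program performs can be sketched as a simple pipeline. All function names and dictionary keys here are illustrative assumptions made for the sketch, not identifiers from the patent.

```python
def provide_information(acquire_image, recognize_chars, predetermined_subjects, send):
    """First acquisition step, recognition, then communication step.

    (Hypothetical sketch of the described steps; names are illustrative.)
    """
    image = acquire_image()                 # first acquisition step
    chars = recognize_chars(image)          # character information from the image
    # first subject information: recognized characters that match
    # the predetermined subject information
    first_info = chars if chars in predetermined_subjects else None
    prescribed = {
        "first_image": image,
        "character_information": chars,
        "first_subject_information": first_info,
    }
    send(prescribed)                        # communication step to the external device
    return prescribed
```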
Embodiments of the present invention also provide a readable storage medium storing a computer program for electronic data exchange, wherein the computer program causes a computer in a thermographic imaging apparatus to perform the above steps.
Although the functional blocks in the drawings may be implemented in hardware, software, or a combination of the two, there is generally no need for a one-to-one correspondence between structures and functional blocks. One or more of the functional blocks may be implemented by one or more software or hardware modules. In addition, some or all of the processing and control functions of the components in the embodiments of the present invention may be implemented by dedicated circuits, general-purpose processors, or FPGAs.
In addition, although the examples above describe subjects in the power industry, the invention is equally applicable to a wide range of inspection industries.
The above description gives only specific examples (embodiments) of the invention; the various illustrations do not limit its essence, and further embodiments can be formed by substituting and combining features of the embodiments described. Upon reading this specification, those skilled in the art can make other modifications and variations to the specific embodiments without departing from the spirit and scope of the invention.

Claims (7)

1. The detection system is characterized by comprising
An information providing device and a detecting device;
the information providing apparatus includes
A first acquisition section for acquiring a first image;
a first recognition unit configured to acquire character information based on recognition of the first image, compare the character information with predetermined subject information, and acquire first subject information representing recognition when the character information and the predetermined subject information match;
a communication unit for providing predetermined information;
the detection device comprises
An acquisition section for acquiring detection data;
a communication unit for acquiring predetermined information;
the information providing device communicates with the detection device through a communication part, and the information providing device provides specified information for the detection device;
a control unit for performing corresponding process control on the acquired detection data based on the predetermined information;
the prescribed information includes first subject information identified based on the first image;
the prescribed information further comprises one or more of the first image, character information recognized based on the first image, second subject information selected based on the first subject information, configuration information related to the first subject information recognized based on the first image, and configuration information related to the second subject information selected based on the first subject information.
2. The detection system of claim 1, further comprising
A second communication section for acquiring detection data;
and a control part for performing corresponding processing control on the acquired detection data according to the configuration information related to the first subject information and/or the configuration information related to the second subject information.
3. The inspection system of claim 1, wherein the processing control includes one or more of notification control, display control of a prompt message, acquisition control of inspection data, and processing control of acquired inspection data.
4. The detection system of claim 1, wherein the first image is a visible light image containing information to be identified; the information to be identified may be characters, numbers, letters, or bar codes.
5. The detection system of claim 1, having an auxiliary light source device for illuminating a sign associated with the subject; the first acquisition unit acquires the first image by imaging the sign illuminated by the auxiliary light source device.
6. The inspection system of claim 1, wherein the first acquisition unit is a visible light camera or a near infrared camera, and the first image is obtained by photographing a sign associated with the subject.
7. A detection method performed by the detection system of claim 1, comprising
an information providing step and a detection step;
the information providing step includes
A first acquisition step of acquiring a first image;
a first recognition step of acquiring character information by recognition based on the first image, comparing the character information with predetermined subject information, and acquiring first subject information representing the recognition result when the character information matches the predetermined subject information;
a communication step for providing the detection device with prescribed information;
the detecting step comprises
An acquisition step for acquiring detection data;
a communication step for acquiring prescribed information provided by the information providing device;
a control step of executing corresponding processing control on the acquired detection data according to the prescribed information;
the prescribed information includes first subject information identified based on the first image;
the prescribed information further comprises one or more of the first image, character information recognized based on the first image, second subject information selected based on the first subject information, configuration information related to the first subject information recognized based on the first image, and configuration information related to the second subject information selected based on the first subject information.
CN201510206405.5A 2014-04-29 2015-04-26 Information providing apparatus, detecting system, and information providing method Active CN105224897B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510206405.5A CN105224897B (en) 2014-04-29 2015-04-26 Information providing apparatus, detecting system, and information providing method

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CN201410179099 2014-04-29
CN2014101790996 2014-04-29
CN201510206405.5A CN105224897B (en) 2014-04-29 2015-04-26 Information providing apparatus, detecting system, and information providing method

Publications (2)

Publication Number Publication Date
CN105224897A CN105224897A (en) 2016-01-06
CN105224897B true CN105224897B (en) 2020-12-29

Family

ID=54993859

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510206405.5A Active CN105224897B (en) 2014-04-29 2015-04-26 Information providing apparatus, detecting system, and information providing method

Country Status (1)

Country Link
CN (1) CN105224897B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111563502B (en) * 2020-05-09 2023-12-15 腾讯科技(深圳)有限公司 Image text recognition method and device, electronic equipment and computer storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100369487C (en) * 2002-04-25 2008-02-13 松下电器产业株式会社 Object detection device, object detection server, and object detection method
CN101551233A (en) * 2008-04-01 2009-10-07 深圳富泰宏精密工业有限公司 Workpiece size detecting device
CN101976406A (en) * 2010-07-13 2011-02-16 江苏金典数据有限公司 Anti-counterfeiting discrimination method


Also Published As

Publication number Publication date
CN105224897A (en) 2016-01-06

Similar Documents

Publication Publication Date Title
US20160005156A1 (en) Infrared selecting device and method
CN105224896B (en) Recording apparatus, processing apparatus, recording method, and processing method
JP2008209306A (en) Camera
CN105224897B (en) Information providing apparatus, detecting system, and information providing method
CN114923583A (en) Thermal image selection device and thermal image selection method
CN105262943A (en) Thermal image recording device, thermal image processing device, thermal image recording method and thermal image processing method
CN104655636B (en) Thermal image analysis device, thermal image configuration device, thermal image analysis method and thermal image configuration method
US11093777B2 (en) Optical character recognition (OCR) and coded data for legacy instrument data transfer
CN114923581A (en) Infrared selecting device and infrared selecting method
CN105092051B (en) Information acquisition apparatus and information acquisition method
CN105208299A (en) Thermal image shooting device, thermal image processing device, thermal image shooting method and thermal image processing method
US20150358559A1 (en) Device and method for matching thermal images
JP6911914B2 (en) Inspection support device, inspection support method and program
CN105157742B (en) Identification device and identification method
CN104655284B (en) Analysis device, processing device, analysis method, and processing method
US20150334314A1 (en) Device and method for detecting thermal images
CN116358711A (en) Infrared matching updating device and infrared matching updating method
CN105021290B (en) Shooting device, pseudo color setting device, shooting method and pseudo color setting method
CN115578549A (en) Task selection device, task processing device, task recording method, and task processing method
CN104655637B (en) Selection device and selection method
CN104655289B (en) Analysis area setting device, processing device, analysis area setting method, and processing method
CN114923582A (en) Thermal image selection notification device and thermal image selection notification method
CN113542646A (en) Detection navigation recording device and detection navigation recording method
CN115993191A (en) Thermal image matching updating device and thermal image matching updating method
CN113645425A (en) Processing method, processing system, information adding device, detection device, and processing device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: Yuhang District of Hangzhou city in Zhejiang province 311113 Qixian Village Building 1 Liangzhu Street Bridge

Applicant after: Mission Infrared Electro-optics Technology Co., Ltd.

Address before: 310030 Zhejiang city of Hangzhou province Xihu District city Hongkong No. 386 thick Renlu 14 Building 3 floor

Applicant before: Mission Infrared Electro-optics Technology Co., Ltd.

CB02 Change of applicant information
GR01 Patent grant
GR01 Patent grant