WO2004068407A1 - Data linkage support method between applications - Google Patents
Data linkage support method between applications
- Publication number
- WO2004068407A1 (PCT/JP2003/000802)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- screen
- information
- program
- image data
- Prior art date
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—Information retrieval; Database structures therefor; File system structures therefor of still image data
- G06F16/58—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
- G06F16/583—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
- G06F16/5854—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content using shape and object relationship
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—Information retrieval; Database structures therefor; File system structures therefor of still image data
- G06F16/58—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
- G06F16/583—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
- G06F16/5838—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content using colour
Definitions
- the present invention relates to a technique for linking data between applications.
- target APL: one application whose display screen contains the information to be linked
- linked APL: another application to which the data is linked
- in step 2), it was necessary to modify not only the linked APL but also the target APL, which increased the scale of system development and, accordingly, the development budget and schedule. Also, if the developer of the target APL is different from that of the linked APL, confidential information may have to be disclosed, and without an agreement between both developers it may not be possible to modify the target APL in the first place.
- a system is disclosed that uses a scanner to read screen output printed on paper as image data, recognizes the regions displaying characters in the image data, generates character codes from the image data in the recognized regions, and registers the character codes in a database.
- also disclosed is a system in which layout rules of a document structure are stored in advance, image data of a document is stored, the layout of the stored image data is analyzed based on the layout rules, character patterns are cut out from the image data based on the analysis result, characters are recognized from the cut-out patterns, and the character recognition results are stored in a file. Both inventions make it possible to omit the manual re-entry in step 1). However, both still require outputting the screen on paper, and require modifying the target APL and the linked APL to enable data linkage between programs. Disclosure of the invention
- the purpose of this invention is to eliminate the need to output the screen to a paper medium, to manually re-enter data items, and to modify the target APL and the linked APL when building a system in which data is linked between the target APL and the linked APL.
- display screen capturing means for capturing screen image data of the display screen from a first program (the target APL) is provided.
- the display screen in the first program is not output on paper or the like.
- it takes in screen image data from the first program, acquires cooperation data from the taken screen image data, and outputs the acquired cooperation data to the second program.
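The three steps above (capture screen image data, acquire the cooperation data from it, output it to the second program) could be sketched roughly as follows. This is only an illustrative model, not the patent's implementation: the "screen" is a 2-D grid of pixel values, and all function names are invented.

```python
# Hypothetical sketch of the three-step flow described above. A "screen" is
# modeled as a 2-D grid of pixel values; a real system would capture the
# display via a platform API instead.

def capture_screen_image(screen_rows):
    """Step 1: capture the first program's display screen as image data."""
    return [row[:] for row in screen_rows]  # copy, as a capture would

def acquire_cooperation_data(image, region):
    """Step 2: cut out the region that holds the data to be linked."""
    x1, y1, x2, y2 = region
    return [row[x1:x2] for row in image[y1:y2]]

def output_to_second_program(data, receiver):
    """Step 3: hand the acquired data to the second program (a stub here)."""
    receiver.append(data)

# Usage: a 4x4 "screen" whose centre 2x2 block is the linked data.
screen = [[0, 0, 0, 0],
          [0, 1, 2, 0],
          [0, 3, 4, 0],
          [0, 0, 0, 0]]
inbox = []
image = capture_screen_image(screen)
output_to_second_program(acquire_cooperation_data(image, (1, 1, 3, 3)), inbox)
```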
- the data cooperation support device may further include screen type determining means for identifying a display screen by cutting out, from the screen image data, the image data displayed in an area specified by screen type determination information (which includes coordinate information indicating the area in the display screen and the image data displayed there), and determining whether the cut-out image data matches the image data included in the screen type determination information. The cooperation data may then be obtained from the screen image data of the identified display screen.
- the item information obtaining means may, based on recognition item information including coordinate information indicating an area in the display screen and data attribute information indicating the attribute of the data displayed in that area, cut out the data displayed in the area from the screen image data, recognize the cut-out data based on the data attribute information, and obtain the recognition result as the cooperation data.
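As a minimal sketch of how a data attribute could constrain recognition of a cut-out item: here the screen is modeled as lines of text and "recognition" is a simple parse driven by the declared attribute, so real character recognition of pixel data is out of scope. The item names, coordinates, and attribute labels below are all invented for illustration.

```python
# Hypothetical sketch: recognition item information pairs a screen region
# with a data attribute, and the attribute constrains how the cut-out data
# is interpreted (e.g. a digit-only field). A real system would run OCR on
# image data here.

RECOGNITION_ITEMS = [
    # (item name, (line, start col, end col), data attribute) -- illustrative
    ("CIF number",        (0, 12, 20), "numeric"),
    ("Remittance amount", (1, 12, 20), "numeric"),
    ("Customer name",     (2, 12, 20), "text"),
]

def recognize(screen_lines, item):
    name, (line, c1, c2), attr = item
    raw = screen_lines[line][c1:c2].strip()
    if attr == "numeric":
        # the attribute narrows the result to digits, as digit-only OCR would
        return int("".join(ch for ch in raw if ch.isdigit()))
    return raw

screen_lines = [
    "CIF number: 00123456",
    "Amount:     9,500   ",
    "Customer:   Yamada  ",
]
linked = {item[0]: recognize(screen_lines, item) for item in RECOGNITION_ITEMS}
```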
- the data cooperation support device may further include an item information output unit that outputs the cooperation data processed by the second program to the first program. This makes it possible to return the processing result of the cooperative data by the second program to the first program.
- the item information output means may be configured to output the cooperation data processed by the second program to an area of the display screen, based on output method information including coordinate information indicating that area.
- to support data linkage between programs, the device includes display screen capturing means for capturing screen image data of the display screen from the first program; based on screen type determination information including coordinate information indicating an area in the display screen and the image data displayed in that area, the image data displayed in the area is cut out from the screen image data, and it is determined whether the cut-out image data matches the image data included in the screen type determination information.
- the display screen in the first program is captured as screen image data, the display screen is identified based on the captured screen image data, and the data processed by the second program can be output to the identified display screen. According to this, data cooperation between the first and second programs can be realized without modifying either program, thereby achieving the above object.
- the same operation and effect as those of the computer can be obtained by a data cooperation support method that performs the same procedure as the processing performed by the data cooperation support device; therefore, the data cooperation support method can also achieve the above object. Likewise, a program that causes a processor to perform the same control as the procedure of the data cooperation support method can achieve the above object by being executed by the processor. Furthermore, the above object can also be achieved by recording the above-mentioned program on a recording medium (recording device) and causing the processor to read and execute it.
- FIG. 1 is a block diagram of the computer.
- FIG. 2 is a configuration diagram of a system according to the first embodiment.
- FIG. 3 is a functional configuration diagram of the screen plug.
- FIG. 4 is a diagram illustrating an overview of the screen definition process.
- FIG. 5 is a diagram for explaining the outline of the data linkage processing.
- FIG. 6 is a flowchart showing the procedure of the display screen capturing process.
- FIG. 7 is a flowchart illustrating the procedure of the screen type definition process.
- FIG. 8 is a flowchart showing the procedure of the screen type determination definition subroutine.
- FIG. 9 is a flowchart showing the procedure of the recognition item definition subroutine.
- FIG. 10 is a flowchart showing the procedure of the cooperation method definition subroutine.
- FIG. 11 is a flowchart illustrating the procedure of the definition duplication confirmation process.
- FIG. 12 is a flowchart illustrating the procedure of the screen type determination process.
- FIG. 13 is a flowchart showing the procedure of the image comparison subroutine.
- FIG. 14 is a flowchart showing the procedure of item information acquisition and recognition processing.
- FIG. 15 is a flowchart showing the procedure of the cooperative processing.
- FIG. 16 is a flowchart showing the procedure of the item information output process.
- FIG. 17 is a diagram showing an example of the display screen of the target APL.
- FIG. 18 is a diagram illustrating an example of the screen type determination information.
- FIG. 19 is a diagram illustrating an example of the recognition item information.
- FIG. 20 is a diagram illustrating an example of the cooperation method information.
- FIG. 21 is a diagram illustrating an example of the output method information.
- FIG. 22 is a diagram illustrating an area cut out from the screen image data.
- FIG. 23 is a diagram illustrating a portion masked in the image comparison subroutine.
- FIG. 24 is a diagram illustrating a process of extracting data to be linked from the screen image data and recognizing the data.
- FIG. 25 is a diagram showing the recognition items acquired from the screen shown in FIG. 24 based on the recognition item information shown in FIG.
- FIG. 26 is a diagram showing an example of a recognition item screen obtained from the screen shown in FIG.
- FIG. 27 is a diagram showing an example of a screen showing the result of the seal verification process by the cooperative APL.
- FIG. 28 is a diagram showing an example of a confirmation screen of the result of the cooperative processing.
- FIG. 29 is a diagram illustrating an example of a screen on which the cooperation result information is output on the screen of the target APL.
- FIG. 30 is a diagram illustrating an outline of a data cooperation process according to a modification of the first embodiment.
- FIG. 31 is a configuration diagram of a system according to the second embodiment. BEST MODE FOR CARRYING OUT THE INVENTION
- the computer includes a CPU 11, a memory 12, an input device 13, an output device 14, an external storage device 15, a medium drive device 16, and a network connection device 17, which are connected to each other by a bus 18.
- the memory 12 includes, for example, a ROM (Read Only Memory), a RAM (Random Access Memory) and the like, and stores programs and data used for processing.
- a data linkage support program (described later) that causes the computer to perform control to support data linkage between programs is stored in a specific program code segment in the memory 12 of the computer.
- the memory 12 implements a screen image writing unit 228 described later.
- the CPU 11 performs necessary processing by executing the above-described program using the memory 12.
- the computer can function as a data cooperation support device.
- the input device 13 is, for example, a keyboard, a pointing device, a touch panel, or the like, and is used for inputting an instruction or information from a user.
- the output device 14 is, for example, a display, a printer, or the like, and is used to output an inquiry to a computer user, a processing result, and the like.
- the external storage device 15 is, for example, a magnetic disk device, an optical disk device, a magneto-optical disk device, or the like.
- the external storage device 15 realizes screen information DB 229 described later.
- the above-described programs can be stored in the external storage device 15 of the computer, and can be used by loading them into the memory 12 as needed.
- the medium driving device 16 drives the portable recording medium 19 and accesses the recorded contents.
- the portable recording medium 19 is any computer-readable removable recording medium, such as a memory card, memory stick, flexible disk, CD-ROM (Compact Disc Read Only Memory), optical disk, magneto-optical disk, or DVD (Digital Versatile Disk).
- the above-described data cooperation support program can be stored in the portable recording medium 19, and can be used by loading it into the memory 12 of the computer as needed.
- the network connection device 17 communicates with an external device via an arbitrary network (line) such as a LAN or WAN, and performs data conversion accompanying the communication. As needed, the program described above can be received from an external device and loaded into the memory 12 of the computer for use.
- a computer 20 has a cooperative APL 21, a data cooperation support program (hereinafter referred to as a screen plug) 22, a target APL 23, and an operating system (hereinafter, an OS) 24 installed.
- the cooperative APL 21 receives data to be linked from the target APL 23 via the screen plug 22, and performs processing based on a predetermined algorithm using the data.
- the screen plug 22 captures screen image data of a display screen from the target APL 23 and acquires data from the screen image data. Then, the extracted data is output to the linked APL 21. Thereby, the screen plug 22 realizes data cooperation between the cooperation APL 21 and the target APL 23. Also, the screen plug 22 returns the processing result by the cooperative APL 21 to the target APL 23 as necessary.
- the target APL 23 processes data based on a predetermined algorithm.
- the OS 24 provides a system management function for applications such as the cooperative APL 21, the screen plug 22, and the target APL 23.
- the processes performed by the cooperative APL 21 and the target APL 23 on the data linked via the screen plug 22 may be any processes.
- a case is shown where computer 20 and computer 30 are connected via a network, and the target APL 23 in computer 20 acquires data from the database 31 of computer 30 via the search system 32 and performs processing using that data.
- the network N may be a single network or a combination of a plurality of networks. Examples of the network N include the Internet, a WAN (Wide Area Network), a LAN (Local Area Network), a telephone network, and a wireless network.
- the screen plug 22 includes a display screen capture section 221, a screen type definition section 222, a definition duplication check section 223, a screen type determination section 224, an item information acquisition/recognition section 225, a cooperation processing section 226, an item information output section 227, a screen image writing section 228, and a screen information database (hereinafter abbreviated as DB) 229.
- the processing performed by the screen plug 22 is roughly classified into a screen definition processing and a data linkage processing.
- the former is performed before data linkage; it defines on which display screen, at which position, and in what data format the data to be acquired from the target APL 23 exists, and to which program the acquired data is to be output.
- This processing is performed by the display screen capture unit 221, the screen type definition unit 222, and the definition duplication confirmation unit 223, and the result of this processing is written to the screen information DB 229.
- the latter is a process of linking the data extracted from the target APL 23 to the linked APL 21 based on the result of the screen definition process.
- This processing is performed by the display screen capture section 221, screen type determination section 224, item information acquisition/recognition section 225, cooperation processing section 226, and item information output section 227. In this process, the result of the screen definition process written in the screen information DB 229 is used.
- the display screen capture section 221 captures the screen image data of the display screen from the target APL 23 and writes it to the screen image writing section 228.
- based on the screen image data written in the screen image writing section 228, the screen type definition section 222 creates screen definition information that defines the position on the screen of the data to be extracted from the target APL 23, the attributes of that data, and so on. Details of the information included in the screen definition information will be described later.
- the definition duplication check section 223 checks whether the screen definition information created by the screen type definition section 222 overlaps the screen definition information stored in the screen information DB 229. If it is confirmed that there is no duplication, the created screen definition information is stored in the screen information DB 229.
- the screen type determination unit 224 identifies the captured screen based on the screen definition information stored in the screen information DB 229 and the screen image data written in the screen image writing unit 228.
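The screen identification above can be pictured as template matching: cut each defined region out of the captured image and compare it with the stored image data. The sketch below is an invented illustration of that idea (the patent does not specify the comparison algorithm), again using 2-D grids as images.

```python
# Hypothetical sketch of screen type determination: for each stored screen
# definition, cut the defined region out of the captured image and compare
# it with the stored template image; the first screen whose regions all
# match identifies the captured screen.

def crop(image, region):
    x1, y1, x2, y2 = region
    return [row[x1:x2] for row in image[y1:y2]]

def identify_screen(image, screen_definitions):
    """screen_definitions: {screen_id: [(region, template_image), ...]}"""
    for screen_id, checks in screen_definitions.items():
        if all(crop(image, region) == template for region, template in checks):
            return screen_id
    return None  # no stored definition matches the captured screen

# Usage: one stored screen whose top-left 2x1 strip must equal [[7, 7]].
definitions = {"Menu00201": [((0, 0, 2, 1), [[7, 7]])]}
captured = [[7, 7, 0],
            [0, 0, 0]]
matched = identify_screen(captured, definitions)
```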
- the item information acquisition/recognition unit 225 extracts the data to be linked to the linked APL 21 from the screen image data based on the screen definition information.
- the cooperation processing unit 226 outputs the extracted data to a predetermined data area of the cooperation APL 21 as the data cooperation destination, by a predetermined cooperation method.
- the link APL as the data link destination is specified in advance by the link destination identification information
- the data link method is defined in advance by the link method specifying information. Both the linkage destination identification information and the linkage method designation information are included in the screen definition information.
- the cooperative processing unit 226 receives the processed data from the cooperative APL 21 and outputs it to the item information output unit 227.
- the item information output unit 227 outputs the data processed by the cooperative APL 21 to the target APL 23 or another application.
- the output destination of the processing result by the cooperation APL 21 is specified by the output destination definition information included in the screen definition information. Note that FIG. 2 shows a case where the processing result by the cooperative APL 21 is returned to the target APL 23.
- the screen image writing unit 228 receives the screen image data captured by the display screen capturing unit 221; captured screen image data is basically written to this unit.
- the screen information DB 229 stores the screen definition information created by the screen type definition part 222.
- the display screen capture unit 221 first executes the display screen capture processing for capturing the display screen from the target APL 23.
- the screen image data of the fetched screen is written to the screen image writing section 228.
- the screen type definition section 222 creates screen definition information based on the screen image data written in the screen image writing section 228 and the user's instructions, and writes the created screen definition information to the screen information DB 229.
- the definition duplication confirmation unit 223 performs a definition duplication confirmation process for confirming whether any pieces of the screen definition information stored in the screen information DB 229 overlap with each other.
- if duplication is found, the definition duplication check unit 223 outputs information indicating the duplication.
- an outline of the data linkage processing will be described.
- first, the display screen capture unit 221 executes a display screen capture process to capture the display screen from the target APL 23.
- the screen image data of the fetched screen is written to the screen image writing section 228.
- the screen type determination unit 224 reads the screen definition information from the screen information DB 229 and, based on it, identifies the screen whose image data has been written to the screen image writing unit 228.
- the item information acquisition/recognition unit 225 extracts the data to be output to the cooperative APL 21 from the screen image data, based on the identification result and the screen definition information corresponding to the screen, and displays a confirmation screen for the extracted data on the output device 14. The user can confirm the extracted data on this screen. Note that this confirmation and screen display processing can be omitted.
- the cooperation processing unit 226 outputs the extracted data to the predetermined cooperation APL 21 by a predetermined cooperation method, based on the cooperation method information included in the screen definition information. Further, the cooperation processing unit 226 receives the processed data from the cooperation APL 21 and causes the output device 14 to display a cooperation result confirmation screen for confirming the data. Note that the display processing of the cooperation result confirmation screen can be omitted.
- the item information output unit 227 outputs the data obtained as a result of the cooperation to a predetermined area of the target APL 23, based on the output method information included in the screen definition information. Further, the item information output unit 227 causes the output device 14 to display a result confirmation screen for confirming the result of the data linkage processing. The output process of the data obtained as a result of the cooperation and the display process of the confirmation screen can be omitted. Also, like FIG. 2, FIG. 5 shows a case where the processing result by the cooperative APL 21 is returned to the target APL 23.
- in the following description, it is assumed that the target APL 23 is a financial remittance program and that the linked APL 21 is a seal verification program: the linked APL 21 performs seal verification based on the data items extracted from the "Remittance processing" screen of the target APL 23 and returns the verification result to the target APL 23. A specific example of a remittance processing screen is used in the description. These assumptions are only to make the description concrete and easier to understand; they are not intended to limit the target APL 23 and the linked APL 21, or to limit the scope of application of the present invention.
- the display screen capturing unit 221 captures the display screen from the target APL 23, and sets the caret or cursor in the screen to a non-display state (step S1). Subsequently, the display screen capture unit 221 acquires the area in memory where the screen image data of the screen exists (step S2), and copies the screen image data from that area (step S3). Thereafter, the display screen capture unit 221 returns the caret or cursor in the display screen to the display state (step S4). Finally, the display screen capturing unit 221 writes the copied image data to the screen image writing section 228 (step S5).
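Steps S1 to S5 can be restated as a short procedure. The sketch below uses a mock screen object; the caret-hiding flag and memory-area lookup stand in for platform-specific window system calls, which the patent text does not name.

```python
# Illustrative restatement of steps S1-S5 using a mock screen object; all
# names here are invented, and a real implementation would call window
# system APIs to hide the caret and read the screen's image memory.

def capture_display_screen(screen, writing_unit):
    screen["caret_visible"] = False          # S1: hide the caret/cursor
    area = screen["image_area"]              # S2: locate screen image memory
    copied = list(area)                      # S3: copy the screen image data
    screen["caret_visible"] = True           # S4: restore the caret/cursor
    writing_unit.append(copied)              # S5: write to the writing unit
    return copied

screen = {"caret_visible": True, "image_area": [10, 20, 30]}
writing_unit = []
capture_display_screen(screen, writing_unit)
```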
- FIG. 17 shows an example of the display screen captured by the display screen capturing unit 221.
- the display screen shown in Fig. 17 is displayed during the remittance processing by the financial remittance business program.
- "Remittance processing" is displayed at the upper left of FIG. 17 as character information indicating the processing performed on this screen, and "Menu 00201" is displayed at the upper right as the screen identification number.
- the screen identification number does not need to match the screen ID set by the screen type definition unit 222.
- the screen also displays a customer database number (CIF number; CIF: Customer Information File), the customer name, the remittance amount, a field for inputting the result of the seal verification, and an image of the seal read from the remittance processing application form (hereinafter referred to as a read mark).
- the screen type definition unit 222 reads the screen image data from the screen image writing unit 228 (step S11). Subsequently, the screen type definition unit 222 sets a screen ID for identifying the screen (step S12). Further, the screen type definition unit 222 performs a screen type determination definition subroutine (step S13), a recognition item definition subroutine (step S14), and a linkage method definition subroutine (step S15), and the process ends. Details of each subroutine will be described later.
- Screen definition information is created corresponding to the screen containing the data to be linked.
- the screen definition information includes screen type determination information, recognition item information, and linkage method information.
- the screen definition information further includes output method information.
- the screen type determination information is created in the screen type determination definition subroutine, the recognition item information is created in the recognition item definition subroutine, and the linkage method information and output method information are created in the linkage method definition subroutine.
- when the screen type definition unit 222 receives designation of an item to be extracted as screen type determination information from the captured screen, it assigns an item ID to the item (step S21). Subsequently, upon receiving designation of the area (identification part) in which the item is displayed in the screen image (step S22), the screen type definition unit 222 cuts out the image data from the screen image data based on the coordinates of that area (step S23). The screen type definition unit 222 creates screen type determination information including the item ID, the coordinates, and the cut-out image data.
- when steps S21 to S23 have been performed for all items to be extracted as screen type determination information (step S24: Yes), the processing ends. Otherwise (step S24: No), the process returns to step S21.
- FIG. 18 shows an example of the screen type determination information for the screen shown in FIG. 17.
- one piece includes the coordinate information indicating the area where the text "Remittance processing" is displayed at the upper left of FIG. 17, together with the image data cut out based on that coordinate information; the other piece includes the coordinate information indicating the area where the screen identification number "Menu 00201" is displayed, together with the image data cut out based on that coordinate information.
- in FIG. 18, as an example, each region is expressed using the coordinates of two opposing vertices among the four vertices forming the rectangle.
- in FIG. 18, the screen type determination information includes an "item name" instead of an "item ID", but either may be used.
- the screen type determination information is used to identify the screen captured from the target APL 23 in the data linkage processing described later.
- the screen type definition unit 222 receives designation of the item name of the data to be extracted as a recognition item from the screen captured by the display screen capture unit 221; this item name may be matched with the item name on the screen (step S31). Subsequently, the screen type definition unit 222 receives designation of the area where the recognition item is displayed on the screen (step S32). Further, the screen type definition unit 222 receives designation of the attribute of the data displayed in that area (step S33). Based on the designations of steps S31 to S33, the screen type definition unit 222 creates recognition item information including the item name, the coordinate information indicating the designated area, and the data attribute.
- when steps S31 to S33 have been performed for all items to be extracted as recognition items (step S34: Yes), the process ends. Otherwise (step S34: No), the process returns to step S31.
- the recognition item information is used when extracting data to be linked to the link APL 21 from the screen fetched from the target APL 23 in the data link processing described later.
- FIG. 19 shows an example of the recognition item information for the screen shown in FIG. 17. As shown in FIG. 19, there are four pieces of recognition item information for this screen.
- the first is the recognition item information corresponding to the CIF number entry field
- the second is the recognition item information corresponding to the customer name entry field
- the third is the recognition item information corresponding to the remittance amount entry field
- the fourth item is recognition item information corresponding to the image of the read mark.
- each piece of recognition item information includes an item name, coordinate information indicating an area where the item is displayed on the screen, and data attribute information of the item.
- the region is shown using the coordinates of two opposing vertices forming a rectangle.
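The four pieces of recognition item information could be represented as records like the following. The concrete coordinates and attribute labels are invented placeholders, since the text does not reproduce the values shown in FIG. 19; only the four item names come from the description above.

```python
from dataclasses import dataclass

@dataclass
class RecognitionItem:
    item_name: str
    region: tuple   # (x1, y1, x2, y2): two opposing vertices of the rectangle
    attribute: str  # data attribute used to drive recognition

# Coordinates and attribute labels below are illustrative placeholders.
RECOGNITION_ITEMS = [
    RecognitionItem("CIF number",        (40, 100, 200, 120), "digits"),
    RecognitionItem("Customer name",     (40, 130, 200, 150), "text"),
    RecognitionItem("Remittance amount", (40, 160, 200, 180), "digits"),
    RecognitionItem("Read mark",         (300, 100, 420, 220), "image"),
]
```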
- the screen type definition unit 222 receives designation of the cooperation APL 21 and designation of the cooperation method (step S41).
- the screen type definition unit 222 creates cooperation method information including cooperation destination identification information for identifying the cooperation APL 21 and cooperation method designation information for specifying a cooperation method, based on the specification.
- Examples of the linking method include TCP/IP (Transmission Control Protocol / Internet Protocol), HTTP (Hypertext Transfer Protocol), API (Application Program Interface), and DLL (Dynamic Link Library).
- FIG. 20 shows an example of the cooperation method information.
- the linkage method information includes the program name of the linked APL 21 as linkage destination identification information for identifying the linked APL 21.
- TCP/IP and socket communication are specified as the method of linking data between the linked APL 21 and the target APL 23.
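Handing linked data to the linked APL over TCP/IP socket communication could look like the sketch below. A `socketpair` stands in for a real connection to the seal verification program, and the newline-terminated "key=value" message format is invented; the patent specifies only that socket communication is the linkage method.

```python
# Hypothetical sketch of TCP/IP socket linkage between the screen plug and
# the linked APL. The message format ("key=value" lines) is an assumption,
# not part of the patent; a socketpair replaces a real network connection.

import socket

def send_cooperation_data(sock, items):
    payload = "".join(f"{k}={v}\n" for k, v in items.items())
    sock.sendall(payload.encode("utf-8"))

def receive_cooperation_data(sock, size=4096):
    lines = sock.recv(size).decode("utf-8").splitlines()
    return dict(line.split("=", 1) for line in lines)

# In-process stand-in: 'a' plays the screen plug, 'b' the linked APL.
a, b = socket.socketpair()
send_cooperation_data(a, {"CIF": "00123456", "amount": "9500"})
received = receive_cooperation_data(b)
a.close(); b.close()
```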
- the screen type definition unit 222 receives a specification of whether or not to output the information obtained as a result of processing the linked data by the cooperative APL 21 (hereinafter, cooperation result information) from the cooperative APL 21 to the target APL 23 (step S42). If the specification indicates that the cooperation result information is to be output to the target APL 23 (step S42: Yes), the process proceeds to step S43. Otherwise (step S42: No), the process proceeds to step S46.
- in step S43, the screen type definition unit 222 receives the user's designation of the item name of the cooperation result information and of the area where the cooperation result information is to be displayed, and creates output method information including the item name and coordinate information indicating the destination area (steps S43 and S44). When multiple items are output as cooperation result information, output method information is created for each item.
- when steps S43 and S44 have been performed for all the items for which cooperation result information is to be output (step S45: Yes), the screen type definition unit 222 ends the processing. Otherwise (step S45: No), the process returns to step S43.
- This output method information is used when the cooperation result information is output to the screen of the target APL 23 in the data cooperation processing described later.
- if No in step S42, it is not necessary to create output method information, so the process ends (step S46).
- Figure 21 shows an example of output method information.
- the output method information includes an item name of the cooperation result information and coordinate information indicating an output destination area on the screen where the cooperation result information is to be output.
- one item is output as the cooperation result information, and thus one output method information is created.
- the cooperation result information is displayed in the collation result input field on the screen shown in FIG.
- the screen definition information created in the above-described screen type definition processing is written to the screen information DB 229.
- in the definition duplication confirmation processing, the created screen definition information is checked for duplication with previously created screen definition information. This definition duplication confirmation processing is included in the screen definition processing.
- first, screen definition information that has not yet been subjected to duplication confirmation processing is read from the screen information DB 229 (step S51). If duplication confirmation processing has already been performed for all screen definition information in the screen information DB 229 (step S52: Yes), the process proceeds to step S58. Otherwise (step S52: No), step S53 and subsequent steps are performed.
- the screen definition information includes one or more screen type determination information.
- Each screen type determination information includes coordinate information indicating a region on the screen and image data.
- the definition duplication check unit 223 acquires, from the screen definition information read in step S51, image data in screen type determination information on which duplication determination processing has not yet been performed (step S53). If duplication check processing has already been performed on all screen type determination information in the screen definition information read in step S51 (step S54: Yes), the process proceeds to step S57. Otherwise (step S54: No), steps S55 and S56 are performed.
- in step S55, the definition duplication confirmation unit 223 cuts image data out of the screen image data written in the screen image writing unit 228, based on the coordinate information corresponding to the image data acquired in step S53. Then, the definition duplication check unit 223 compares the image data acquired in step S53 with the clipped image data, and determines whether or not the two match based on the comparison result. The procedure for comparing the images will be described later in detail as an image comparison subroutine. The definition duplication check unit 223 then records the determination result for the image data acquired in step S53 in a temporary storage area (step S56), and returns to step S53.
- in step S57, the definition duplication confirmation unit 223 determines whether or not the determination results recorded in the temporary storage area indicate a match in all determinations, temporarily records that determination result for the screen definition information read in step S51, and returns to step S51.
- in step S58, the definition duplication confirmation unit 223 counts the number of pieces of screen definition information for which the determination result is "match". The definition duplication check unit 223 determines that there is no duplication when the count is 1, and that there is duplication when the count is 2 or more. The definition duplication check unit 223 then notifies the user of the determination result and ends the processing.
- in the above description, the definition duplication confirmation processing is performed every time screen definition information is created. Alternatively, screen definition information may be created for all the screens to be defined and written to the screen information DB 229, after which the definition duplication confirmation processing determines whether or not duplicate screen definition information is stored in the screen information DB 229.
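The counting rule of step S58 can be sketched as follows. This is an illustrative Python sketch; the function name and the boolean-list representation of the per-definition match results are assumptions.

```python
def check_definition_duplication(match_results):
    """Decide duplication from per-definition match results.

    `match_results` holds one boolean per stored screen definition
    information: True when every screen type determination region of
    that stored definition matched the newly created definition.
    Exactly one match (the new definition itself) means no duplication;
    two or more mean the new definition cannot be distinguished from
    an existing screen.
    """
    matches = sum(1 for matched in match_results if matched)
    return "duplicate" if matches >= 2 else "no duplication"
```

Running the duplication check only after all screens are defined, as the paragraph above suggests, would simply feed this decision a longer list of match results.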
- the procedure of the screen type determination process will be described with reference to FIGS.
- This screen type determination processing is included in the data linkage processing.
- the display screen capture processing is performed as described with reference to FIG.
- the screen type determination unit 224 acquires the screen image data from the screen image writing unit 228 (step S61), and further extracts from the screen information DB 229 one piece of screen definition information on which screen type determination processing has not yet been performed (step S62).
- if the screen type determination processing has already been performed for all the screen definition information in the screen information DB 229 (step S63: Yes), the process proceeds to step S69. Otherwise (step S63: No), step S64 and subsequent steps are performed.
- the screen definition information includes one or more screen type determination information, and each screen type determination information includes coordinate information indicating an area in the screen image data and image data. Steps S64 to S66 are performed based on all the screen type determination information included in the screen definition information acquired in step S62.
- the screen type determination unit 224 acquires one piece of screen type determination information from the screen definition information obtained in step S62 (step S64). Further, the screen type determination unit 224 cuts image data out of the screen image data acquired in step S61, based on the coordinate information in the screen type determination information.
- the areas indicated by the coordinates in the screen type determination information shown in FIG. 18 are marked with arrows.
- specifically, the area where the characters "remittance processing" are displayed and the area where the screen identification number "Menu 00201" is displayed are cut out for the screen type determination processing, based on the screen type determination information shown in FIG. 18.
- the screen type determination unit 224 executes an image comparison subroutine that compares the image data in the screen type determination information acquired in step S64 with the image data cut out of the screen image data (step S65). This image comparison subroutine will be described later in detail.
- the screen type determination unit 224 determines, based on the result of the image comparison subroutine, whether or not the image data included in the screen type determination information matches the image data cut out in step S64 (step S66). If they do not match (step S66: No), the process returns to step S62. If they match (step S66: Yes), the process proceeds to step S67.
- in step S67, the screen type determination unit 224 determines whether or not steps S64 to S66 have been performed for all screen type determination information in the screen definition information acquired in step S62. If steps S64 to S66 have been performed for all screen type determination information (step S67: Yes), the process proceeds to step S68. Otherwise (step S67: No), the process returns to step S64.
- in step S68, the screen type determination unit 224 determines that the screen definition information acquired in step S62 corresponds to the screen image data acquired in step S61 (step S68), and the processing ends.
- step S69 the screen type determination unit 224 determines that the screen information DB 229 does not store the screen definition information corresponding to the screen image data acquired in step S61.
- the screen type determination unit 224 notifies the user that a processing error has occurred because the captured screen is not registered in the database, and ends the processing.
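The determination loop of steps S62 to S69 can be sketched as follows. This is an illustrative Python sketch; the dictionary keys and the injected `crop`/`compare` callables (standing in for the clipping step and the image comparison subroutine) are assumptions.

```python
def determine_screen_type(screen_image, definitions, crop, compare):
    """Return the first screen definition whose every screen type
    determination region matches the captured screen image; None means
    the captured screen is not registered (processing error, step S69).

    `crop(image, coords)` and `compare(a, b)` stand in for the clipping
    and image-comparison subroutines."""
    for definition in definitions:
        regions = definition["determination info"]
        if all(compare(info["image"], crop(screen_image, info["coords"]))
               for info in regions):
            return definition
    return None

# Demo: a "screen" addressed by region name instead of pixel coordinates
captured = {"title": "Remittance processing", "number": "Menu 00201"}
definitions = [
    {"screen name": "menu",
     "determination info": [{"coords": "title", "image": "Main menu"}]},
    {"screen name": "remittance",
     "determination info": [{"coords": "title", "image": "Remittance processing"},
                            {"coords": "number", "image": "Menu 00201"}]},
]
match = determine_screen_type(captured, definitions,
                              crop=lambda img, c: img[c],
                              compare=lambda a, b: a == b)
```

A definition is accepted only when all of its regions match, mirroring the step S67 check.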
- the screen type determination unit 224 detects the color distribution for each pixel of the image data in the screen type determination information obtained in step S64 and of the image data cut out of the screen image data (step S71). Subsequently, the screen type determination unit 224 determines, based on the color distribution, the background color of each of the image data included in the screen type determination information and the image data cut out of the screen image data, and masks that color (step S72).
- the screen type determination unit 224 then compares the image pattern remaining unmasked in the image data included in the screen type determination information with the image pattern remaining unmasked in the image data cut out of the screen image data, and matches the two (step S73). The matching result is used for the determination in step S66 described above (step S74).
- FIG. 23 shows the image data included in the two pieces of screen type determination information shown in FIG. 18, together with the image data cut out based on the coordinates included in each piece of screen type determination information. For example, the two image data whose item name is "process name" display the characters "remittance processing". In these image data, the shaded portions are determined to be the background color and masked. As a result, in both image data the image pattern indicating the characters "remittance processing" remains unmasked, and the patterns are compared with each other by the screen type determination unit 224.
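The background-masking comparison of steps S71 to S73 can be sketched as follows. This is an illustrative Python sketch: it treats the most frequent color as the background (an assumed heuristic; the patent only says the background color is determined from the color distribution) and represents a region as a flat list of color values.

```python
from collections import Counter

def mask_background(pixels):
    """Determine the background color from the color distribution
    (here: the most frequent color) and mask it out, keeping only
    the unmasked character pattern."""
    background = Counter(pixels).most_common(1)[0][0]
    return [p if p != background else None for p in pixels]

def images_match(pixels_a, pixels_b):
    """Compare the unmasked image patterns of two same-sized regions."""
    return mask_background(pixels_a) == mask_background(pixels_b)
```

Masking the background first lets two regions match even when their background colors differ, as in the shaded portions of FIG. 23.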
- the screen definition information includes one or more pieces of recognition item information.
- the following steps S81 to S93 are performed for each piece of recognition item information included in the screen definition information corresponding to the screen image data acquired in step S61.
- the item information acquisition/recognition unit 225 acquires one piece of recognition item information from the screen definition information, and determines the processing mode of item recognition based on the information indicating the data attribute included in the recognition item information (step S81). For example, when the data attribute indicates "character data", the item information acquisition/recognition unit 225 determines that the processing mode is character recognition using an OCR (Optical Character Reader) or the like (step S81: recognition), and the process proceeds to step S82. When the data attribute indicates "edit control" or "static control", the item information acquisition/recognition unit 225 determines that the processing mode is the reading of the data item (step S81: item reading), and the process proceeds to step S89.
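The data-attribute dispatch of step S81 can be sketched as follows. This is an illustrative Python sketch; the mode names are assumptions, and the "image data" branch reflects the step S93 path described later (items with that attribute are cut out as-is).

```python
def processing_mode(data_attribute):
    """Map the data attribute of a recognition item to a processing mode."""
    if data_attribute == "character data":
        return "character recognition"    # OCR path, steps S82-S87
    if data_attribute in ("edit control", "static control"):
        return "item reading"             # read directly from the control
    if data_attribute == "image data":
        return "image clipping"           # cut the image out as-is (step S93)
    raise ValueError("unknown data attribute: %r" % (data_attribute,))
```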
- in step S82, the item information acquisition/recognition unit 225 cuts image data out of the captured screen image data based on the coordinate information included in the recognition item information, and determines a color indicating the characters and a color indicating the background by scanning the pixels constituting the image data one by one and judging their colors (step S82). Subsequently, the item information acquisition/recognition unit 225 separates the color indicating the characters from the color indicating the background by binarizing the image data using a threshold that separates the background image from the characters (step S83). If the binarization leaves the color indicating the characters white and the color indicating the background black, the item information acquisition/recognition unit 225 inverts the values (bits) indicating both colors (step S84).
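The binarization and inversion of steps S83 and S84 can be sketched as follows. This is an illustrative Python sketch assuming grayscale pixel values and an externally supplied threshold; the function names are assumptions.

```python
def binarize(gray_pixels, threshold):
    """Separate characters from background: pixels at or above the
    threshold become 1 (white), the rest 0 (black)."""
    return [1 if p >= threshold else 0 for p in gray_pixels]

def ensure_black_characters(bits, character_bit):
    """Invert the bitmap when binarization left the characters white
    (step S84), so that later steps can collect black character pixels."""
    if character_bit == 1:                # characters came out white
        return [1 - b for b in bits]
    return bits

bits = binarize([250, 20, 30, 240], threshold=128)   # -> [1, 0, 0, 1]
chars_black = ensure_black_characters(bits, character_bit=1)
```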
- the item information acquisition/recognition unit 225 extracts label information for each character by acquiring continuous black pixel groups (label information) (step S85).
- the item information acquisition/recognition unit 225 determines the size of one character based on the result of extracting the label information; if the size is not appropriate for character recognition processing, the character is enlarged or reduced to an appropriate size (step S86).
- the item information acquisition/recognition unit 225 performs character recognition (step S87) and notifies the user of the recognition result.
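The label extraction of step S85 — collecting continuous black pixel groups — can be sketched as a connected-component search. This is an illustrative Python sketch assuming a binary grid where 1 marks a character (black) pixel and 4-connectivity; the patent does not specify the connectivity rule.

```python
from collections import deque

def extract_labels(grid):
    """Collect 4-connected groups of character pixels (label information);
    each group approximates one character. grid[y][x] == 1 marks a
    character (black) pixel."""
    h, w = len(grid), len(grid[0])
    seen = [[False] * w for _ in range(h)]
    labels = []
    for y in range(h):
        for x in range(w):
            if grid[y][x] == 1 and not seen[y][x]:
                group, queue = [], deque([(y, x)])
                seen[y][x] = True
                while queue:                     # breadth-first flood fill
                    cy, cx = queue.popleft()
                    group.append((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and grid[ny][nx] == 1 and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                labels.append(group)
    return labels

grid = [[1, 1, 0, 0, 1],
        [0, 1, 0, 0, 1],
        [0, 0, 0, 0, 0]]
labels = extract_labels(grid)   # two separate pixel groups
```

The bounding box of each group would then give the character size checked in step S86.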
- the recognition result may instead be notified to the user by displaying a confirmation screen after the recognition processing for all pieces of recognition item information is completed.
- the item information acquisition/recognition unit 225 acquires the control handle of the data item to be recognized based on the coordinates included in the recognition item information (step S89). Further, based on the information indicating the data attribute included in the recognition item information, it determines whether the control is an edit control or a static control (step S90).
- in the case of an edit control (step S90: edit control), the item information acquisition/recognition unit 225 reads the data from the control using, for example, the class member function GetLine(), acquires it as the recognition result (step S91), and proceeds to step S88.
- in the case of a static control (step S90: static control), the item information acquisition/recognition unit 225 reads the data from the control using, for example, the class member function GetText(), acquires it as the recognition result (step S92), and proceeds to step S88.
- step S93 the item information acquisition / recognition unit 225 cuts out image data from the screen image data based on the coordinate information included in the recognition item information, and proceeds to step S88.
- FIG. 24 shows four recognition items in the screen image data extracted based on the four recognition item information shown in FIG.
- the recognition item whose item name is "CIF number" is cut out of the screen image data as image data, and obtained as character data through character recognition processing.
- the recognition item whose item name is "customer name" and the recognition item whose item name is "remittance amount" are each obtained as data directly from the input fields in the image data. The recognition item whose item name is "read seal" is cut out of the image data as image data and obtained as it is. Although specific class member functions are shown in FIG. 24, they are only examples.
- FIG. 25 shows recognition items obtained from the screen shown in FIG. 24 based on the recognition item information shown in FIG.
- FIG. 26 shows an example of a confirmation screen of the recognition item acquired from the screen shown in FIG.
- for the recognition items other than those whose data attribute is "image data", both the data cut out of the screen image data before recognition and the data obtained as a result of the recognition processing are displayed for comparison.
- the cut-out, not-yet-recognized data is displayed in the upper row, and the data obtained as a result of the recognition processing is displayed in the lower row.
- for the recognition item whose data attribute is "image data", the image data cut out of the screen image data is displayed.
- using this confirmation screen, the user determines whether or not the recognition processing has been performed properly and, if necessary, corrects the recognition result on this confirmation screen.
- the recognition result is output from the item information acquisition/recognition unit 225 to the cooperation processing unit 226, and the cooperation processing is started.
- the notification of the recognition result to the user and the display of the confirmation screen can be omitted.
- the cooperation processing unit 226 receives the recognition result from the item information acquisition/recognition unit 225 as the data to be linked (step S101). Further, the cooperation processing unit 226 extracts the linkage method information from the screen definition information corresponding to the captured screen image data (step S102).
- the linkage method information includes linkage destination identification information for identifying the linkage APL 21 that is the output destination of data to be linked, and linkage method designation information for designating a linkage method.
- the cooperation processing unit 226 outputs the data to be linked to the designated cooperative APL 21 using the designated linkage method, based on the linkage method information (step S103).
- the cooperation processing unit 226 receives the processed data (cooperation result information) from the cooperation APL 21 and outputs a confirmation screen of the result of the cooperation processing (step S104). Then, the process ends.
- the output of the confirmation screen for the result of the cooperative processing can be omitted.
- FIG. 27 shows an example of a screen showing the result of the seal verification processing by the cooperative APL 21.
- the screen shown in FIG. 27 displays the CIF number, the customer name, and the image of the read seal obtained from the screen of the target APL 23, and the image of the notification seal obtained from the DB.
- as the collation result, an image in which both seal images are superimposed is displayed on the screen.
- Figure 28 shows an example of a confirmation screen for the result of the cooperative processing.
- Figure 28 shows the confirmation screen displayed when the cooperative processing determines that the image data of the read seal and the notification seal match. The user determines whether or not the cooperative processing has been performed successfully based on this confirmation screen; if so, the user can press the "continue processing" button to instruct execution of the subsequent processing.
- this item information output processing is performed when the cooperation result information needs to be output to the target APL 23. If the cooperation result information does not need to be output to the target APL 23, it can be omitted.
- the item information output unit 227 receives the cooperation result information from the cooperation processing unit 226 (step S111), and further obtains the output method information from the screen definition information corresponding to the screen currently being processed (step S112).
- the screen definition information includes one or more pieces of output method information. Each piece of output method information includes an item name of the cooperation result information and coordinate information indicating the output destination area on the screen where the cooperation result information is to be displayed.
- the item information output unit 227 extracts one piece of output method information (step S113) and, based on the coordinate information included in the output method information, obtains control information about the output destination area on the screen serving as the output destination in the target APL 23. Then, based on this control information, the item information output unit 227 displays the cooperation result information in the output destination area on the screen (step S114).
- the item information output unit 227 determines whether or not the output processing has been completed for all the output method information (step S115). If not (step S115: No), the item information output unit 227 increments by one the count of output method information processed and extracts the next piece of output method information (step S116). Further, the item information output unit 227 extracts the items that have not yet been output from the cooperation result information (step S117), and returns to step S114.
- when the item information output unit 227 determines that the output processing has been completed for all the output method information (step S115: Yes), the processing ends.
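The output loop of steps S113 to S117 can be sketched as follows. This is an illustrative Python sketch; the dictionary keys and the injected `set_field` callable (standing in for obtaining control information for the area and displaying the value there) are assumptions.

```python
def output_cooperation_result(result_info, output_methods, set_field):
    """Write each item of the cooperation result information into the
    output destination area named by its output method information.
    `set_field(coords, value)` stands in for acquiring control
    information for the area and displaying the value there."""
    for method in output_methods:
        set_field(method["coords"], result_info[method["item name"]])

screen_fields = {}                       # stands in for the target APL screen
output_cooperation_result(
    {"collation result": "seal verification confirmed"},
    [{"item name": "collation result", "coords": (120, 80)}],
    lambda coords, value: screen_fields.__setitem__(coords, value),
)
```

With one output method per result item, the loop naturally handles the multi-item case described above.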
- FIG. 29 shows an example of a screen in which the cooperation result information has been output to the screen of the target APL 23.
- "Confirmation of seal verification" is output as the cooperation result information into the input field labeled "Verification result".
- the screen plug 22 does not need to include the item information acquisition / recognition unit 225. Therefore, the screen plug 22 according to the modification of the first embodiment may have a configuration in which the item information acquisition / recognition unit 225 is removed from the functional configuration diagram of FIG.
- the screen definition information does not include the recognition item information.
- the screen type definition unit 222 does not perform the recognition item definition subroutine in the screen type definition processing.
- the item information acquisition/recognition processing is not performed in the data linkage processing.
- the display screen capture unit 221 performs display screen capture processing for capturing the display screen from the target APL 23.
- the screen image data of the fetched display screen is written to the screen image writing unit 228.
- the screen type determination unit 224 reads the screen definition information from the screen information DB 229, and identifies the screen whose screen image data has been written to the screen image writing unit 228 based on the screen definition information.
- the screen type determination unit 224 causes the output device 14 to display a confirmation screen for confirming the screen identification result. The user can confirm the retrieved data based on the confirmation screen.
- the display processing of the confirmation screen can be omitted.
- the processing device receives the processed data from the server and causes the output device 14 to display a cooperation result confirmation screen for confirming the data.
- the display processing of the confirmation screen of the coordination result can be omitted.
- the item information output unit 227 outputs the obtained data to a predetermined area of the target APL 23 based on the output method information included in the screen definition information. Further, the item information output unit 227 causes the output device 14 to display a result confirmation screen for confirming the result of the data linkage processing.
- the display processing of the confirmation screen can be omitted.
- the item information acquisition / recognition processing for acquiring information from the screen of the target APL 23 is not performed. The difference between the two is clear when comparing Fig. 5 and Fig. 30.
- the target APL 23 is a program for a financial remittance business and that the linked APL 21 is a slip processing and seal verification program.
- the cooperative APL 21 reads, using a scanner, the deposit slip on which the account number and remittance amount are recorded and on which a seal is stamped, as image data; cuts the image of the read seal out of the image data; and recognizes the account number and remittance amount as character data. Further, the cooperative APL 21 obtains the CIF number corresponding to the account number, the customer name, and the image of the notification seal from a DB (not shown), and verifies the seal based on the image of the read seal and the image of the notification seal.
- the screen plug 22 identifies the "remittance processing" screen of the target APL 23, receives the CIF number, the customer name, the remittance amount, the verification result, and the image of the read seal from the cooperative APL 21, and outputs the received information to the identified screen.
- An example of the “Remittance Processing” screen is shown in FIG.
- the input fields for "CIF number", "customer name", "remittance amount", "verification result", and "read seal" on the "remittance processing" screen of the target APL 23 are empty when the screen is identified. The screen plug 22 outputs the received data to this screen.
- the target APL 23, which does not have the function of reading the deposit slip as image data using a scanner and acquiring the necessary information from the image data, can thus be given the processing result of the cooperative APL 21, which has that function. As a result, new functions can be incorporated into the target APL 23 without modifying the target APL 23.
- in the above description, the target APL 23 and the cooperative APL 21 are installed on the same computer; however, they may be installed on different computers.
- the screen plug 22 is provided on the computer on which the target APL 23 is installed. Also in this case, the processing procedure is as described above.
- in the above description, the cooperation result information is output to the same screen as the screen from which the data to be linked is extracted from the target APL 23; however, it may be output to a screen different from the screen from which the data is extracted. In that case, the screen to which the cooperation result information is output must be defined in advance.
- the cooperation result information may also be output to a third program other than the target APL 23.
- in this case, the output method information further includes information for identifying the third program.
- the cooperation result information may be further processed by that program.
- FIG. 31 shows a configuration of a system according to the second embodiment.
- targets APL 23-1 and 23-2 are installed on the same computer.
- the screen plug 22-1 extracts data to be linked from the screen of the target APL 23-1 and outputs the data to the target APL 23-2.
- the screen plug 22-1 receives the result of the cooperative processing from the target APL 23-2 and outputs it to the target APL 23-1.
- the screen plug 22-2 extracts the data to be linked from the screen of the target APL 23-2 and outputs the data to the target APL 23-1.
- the screen plug 22-2 receives the result of the cooperative processing from the target APL 23-1, and outputs it to the target APL 23-2.
- that is, the target APL 23-2 is the linked APL of the target APL 23-1, and the target APL 23-1 is the linked APL of the target APL 23-2.
- the target APL 23-1 and the target APL 23-2 may be installed on two different computers.
- in this case, the target APL 23-1 and the screen plug 22-1 are installed on one computer, and the target APL 23-2 and the screen plug 22-2 are installed on the other computer.
- the data linkage technology according to the present invention is useful for a system that performs processing using a computer.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004567517A JP4102365B2 (ja) | 2003-01-28 | 2003-01-28 | アプリケーション間のデータ連携支援方法 |
CNB038254395A CN1324524C (zh) | 2003-01-28 | 2003-01-28 | 应用程序间的数据链接支持方法 |
PCT/JP2003/000802 WO2004068407A1 (ja) | 2003-01-28 | 2003-01-28 | アプリケーション間のデータ連携支援方法 |
EP03703065A EP1589474A4 (en) | 2003-01-28 | 2003-01-28 | METHOD FOR FACILITATING DATA COUPLING BETWEEN APPLICATIONS |
AU2003208051A AU2003208051A1 (en) | 2003-01-28 | 2003-01-28 | Method for supporting data linkage between applications |
US11/137,364 US7958458B2 (en) | 2003-01-28 | 2005-05-26 | Method for supporting data linkage between applications |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2003/000802 WO2004068407A1 (ja) | 2003-01-28 | 2003-01-28 | アプリケーション間のデータ連携支援方法 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/137,364 Continuation US7958458B2 (en) | 2003-01-28 | 2005-05-26 | Method for supporting data linkage between applications |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2004068407A1 true WO2004068407A1 (ja) | 2004-08-12 |
Family
ID=32800802
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2003/000802 WO2004068407A1 (ja) | 2003-01-28 | 2003-01-28 | アプリケーション間のデータ連携支援方法 |
Country Status (6)
Country | Link |
---|---|
US (1) | US7958458B2 (ja) |
EP (1) | EP1589474A4 (ja) |
JP (1) | JP4102365B2 (ja) |
CN (1) | CN1324524C (ja) |
AU (1) | AU2003208051A1 (ja) |
WO (1) | WO2004068407A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8266200B2 (en) | 2006-11-29 | 2012-09-11 | Nec Corporation | Application interaction system, application interaction method, recording medium, and application interaction program |
CN105160562A (zh) * | 2015-10-10 | 2015-12-16 | 金大甲 | 利用移动客户终端的电子章***及方法 |
JP2019121004A (ja) * | 2017-12-28 | 2019-07-22 | 日本電気株式会社 | 処理装置、処理方法及びプログラム |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9886461B1 (en) * | 2014-07-11 | 2018-02-06 | Google Llc | Indexing mobile onscreen content |
US10176336B2 (en) | 2015-07-27 | 2019-01-08 | Microsoft Technology Licensing, Llc | Automated data transfer from mobile application silos to authorized third-party applications |
US9300678B1 (en) | 2015-08-03 | 2016-03-29 | Truepic Llc | Systems and methods for authenticating photographic image data |
JP6646214B2 (ja) * | 2016-02-10 | 2020-02-14 | 富士通株式会社 | 情報処理システム、情報処理装置、情報処理方法および情報処理プログラム |
US10375050B2 (en) | 2017-10-10 | 2019-08-06 | Truepic Inc. | Methods for authenticating photographic image data |
US10360668B1 (en) | 2018-08-13 | 2019-07-23 | Truepic Inc. | Methods for requesting and authenticating photographic image data |
US11037284B1 (en) | 2020-01-14 | 2021-06-15 | Truepic Inc. | Systems and methods for detecting image recapture |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09198506A (ja) * | 1996-01-17 | 1997-07-31 | Hitachi Ltd | 帳票イメージ切り出し方法 |
US5889518A (en) | 1995-10-10 | 1999-03-30 | Anysoft Ltd. | Apparatus for and method of acquiring, processing and routing data contained in a GUI window |
JPH11312231A (ja) * | 1998-04-28 | 1999-11-09 | Omron Corp | Recording medium storing a data processing program, data processing device, and data processing method |
JP2000048215A (ja) * | 1998-07-27 | 2000-02-18 | Sharp Corp | Data processing device and medium storing its control program |
JP2000194869A (ja) * | 1998-12-25 | 2000-07-14 | Matsushita Electric Ind Co Ltd | Document creation device |
Family Cites Families (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH05257714A (ja) | 1990-03-13 | 1993-10-08 | Selecterm Inc | Method for cooperating multiple application programs on a single computer |
US5586240A (en) * | 1992-03-11 | 1996-12-17 | Genesis Software, Inc. | Image generation and retrieval system integrated with arbitrary application using layered interface |
JPH05290049A (ja) | 1992-04-13 | 1993-11-05 | Hitachi Ltd | Data exchange method |
JPH0728801A (ja) | 1993-07-08 | 1995-01-31 | Ricoh Co Ltd | Image data processing method and device |
US6339767B1 (en) * | 1997-06-02 | 2002-01-15 | Aurigin Systems, Inc. | Using hyperbolic trees to visualize data generated by patent-centric and group-oriented data processing |
JPH0883285A (ja) | 1994-09-13 | 1996-03-26 | N T T Data Tsushin Kk | Character code generation method and preprocessing device for a document database registration system |
US5819261A (en) * | 1995-03-28 | 1998-10-06 | Canon Kabushiki Kaisha | Method and apparatus for extracting a keyword from scheduling data using the keyword for searching the schedule data file |
US6702736B2 (en) * | 1995-07-24 | 2004-03-09 | David T. Chen | Anatomical visualization system |
JPH09282477A (ja) | 1996-04-10 | 1997-10-31 | Hitachi Ltd | Specification generation method and system |
JP3422897B2 (ja) | 1996-05-17 | 2003-06-30 | Technocraft Co., Ltd. | Character string extraction system and character string extraction method |
US6272235B1 (en) * | 1997-03-03 | 2001-08-07 | Bacus Research Laboratories, Inc. | Method and apparatus for creating a virtual microscope slide |
US5911044A (en) * | 1996-11-08 | 1999-06-08 | Ricoh Company, Ltd. | Network image scanning system which transmits image information from a scanner over a network to a client computer |
US5815149A (en) * | 1997-02-19 | 1998-09-29 | Unisys Corp. | Method for generating code for modifying existing event routines for controls on a form |
JP3598711B2 (ja) | 1997-02-21 | 2004-12-08 | Mitsubishi Electric Corporation | Document filing device |
JPH11265404A (ja) | 1998-03-17 | 1999-09-28 | Fujitsu Ltd | Graphics processing system and storage medium |
US6480304B1 (en) * | 1998-12-09 | 2002-11-12 | Scansoft, Inc. | Scanning system and method |
JP2001052015A (ja) | 1999-08-09 | 2001-02-23 | Sony Corp | Information processing device, information processing method, and program storage medium |
US6836780B1 (en) * | 1999-09-01 | 2004-12-28 | Jacada, Ltd. | Method and system for accessing data in legacy applications |
JP2001118077A (ja) | 1999-10-15 | 2001-04-27 | Ricoh Co Ltd | Document creation device and document creation method |
KR20020026276A (ko) * | 2000-03-30 | 2002-04-09 | Idei Nobuyuki | Donation processing system |
WO2001086531A1 (fr) * | 2000-05-11 | 2001-11-15 | Kazuyoshi Kouketsu | System and method for offering services to members, and customer reservation administration system via mobile telephone |
US7116807B1 (en) * | 2000-09-22 | 2006-10-03 | Ge Medical Systems Global Technology Company, Llc | Method and apparatus for linking images and reports at remote view station |
US6686930B2 (en) * | 2000-11-29 | 2004-02-03 | Xerox Corporation | Technique for accomplishing copy and paste and scan to fit using a standard TWAIN data source |
JP2002222196A (ja) | 2001-01-29 | 2002-08-09 | Sharp Corp | Image processing device |
JP4344185B2 (ja) * | 2003-01-28 | 2009-10-14 | Sharp Corporation | Client terminal device, information processing method, sub-client terminal device, computer-executable program, and recording medium |
-
2003
- 2003-01-28 JP JP2004567517A patent/JP4102365B2/ja not_active Expired - Lifetime
- 2003-01-28 AU AU2003208051A patent/AU2003208051A1/en not_active Abandoned
- 2003-01-28 CN CNB038254395A patent/CN1324524C/zh not_active Expired - Fee Related
- 2003-01-28 EP EP03703065A patent/EP1589474A4/en not_active Ceased
- 2003-01-28 WO PCT/JP2003/000802 patent/WO2004068407A1/ja active Application Filing
-
2005
- 2005-05-26 US US11/137,364 patent/US7958458B2/en not_active Expired - Fee Related
Non-Patent Citations (1)
Title |
---|
See also references of EP1589474A4 |
Also Published As
Publication number | Publication date |
---|---|
EP1589474A4 (en) | 2008-03-05 |
CN1703721A (zh) | 2005-11-30 |
AU2003208051A8 (en) | 2004-08-23 |
JPWO2004068407A1 (ja) | 2006-05-25 |
CN1324524C (zh) | 2007-07-04 |
JP4102365B2 (ja) | 2008-06-18 |
US7958458B2 (en) | 2011-06-07 |
US20050213824A1 (en) | 2005-09-29 |
EP1589474A1 (en) | 2005-10-26 |
AU2003208051A1 (en) | 2004-08-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR102149050B1 (ko) | OCR-based document analysis system and method using artificial intelligence | |
US7130466B2 (en) | System and method for compiling images from a database and comparing the compiled images with known images | |
US7958458B2 (en) | Method for supporting data linkage between applications | |
US7991778B2 (en) | Triggering actions with captured input in a mixed media environment | |
US20070050360A1 (en) | Triggering applications based on a captured text in a mixed media environment | |
US20070050419A1 (en) | Mixed media reality brokerage network and methods of use | |
US20070050341A1 (en) | Triggering applications for distributed action execution and use of mixed media recognition as a control input | |
JP6826293B2 (ja) | Information processing system, processing method thereof, and program | |
EP2884425B1 (en) | Method and system of extracting structured data from a document | |
CN113673500A (zh) | Certificate image recognition method and device, electronic equipment, and storage medium | |
JP2973913B2 (ja) | Input sheet system | |
US7844080B2 (en) | Image processing system and image processing method, and computer program | |
US20080024834A1 (en) | Information registration apparatus for registering information onto registering destination on network and method thereof | |
JP4275973B2 (ja) | Annotated image extraction device, program, storage medium, and annotated image extraction method | |
JP2004013813A (ja) | Information management system and information management method | |
JP2008257530A (ja) | Electronic pen input data processing system | |
JP2003223610A (ja) | Character recognition device and character recognition method | |
KR100776864B1 (ko) | Method for supporting data linkage between applications | |
JP7328797B2 (ja) | Terminal device, character recognition system, and character recognition method | |
JP5673277B2 (ja) | Image processing device and program | |
JP4517822B2 (ja) | Image processing device and program | |
JP5445740B2 (ja) | Image processing device, image processing system, and processing program | |
JP2007079967A (ja) | Registered seal imprint verification system | |
JP7333759B2 (ja) | Image data generation system, image data generation method, and computer program | |
JP5223328B2 (ja) | Information management device, information management method, and program therefor | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PH PL PT RO RU SC SD SE SG SK SL TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2004567517 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2003703065 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 11137364 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 20038254395 Country of ref document: CN Ref document number: 1020057009628 Country of ref document: KR |
|
WWP | Wipo information: published in national office |
Ref document number: 1020057009628 Country of ref document: KR |
|
WWP | Wipo information: published in national office |
Ref document number: 2003703065 Country of ref document: EP |