CN115079973A - Display system and display device - Google Patents


Info

Publication number
CN115079973A
CN115079973A
Authority
CN
China
Prior art keywords
display
image
region
display mode
visual field
Prior art date
Legal status
Pending
Application number
CN202210199474.8A
Other languages
Chinese (zh)
Inventor
山崎航史
大野千代
山本将史
中道拓也
Current Assignee
Hitachi Ltd
Original Assignee
Hitachi Ltd
Priority date
Filing date
Publication date
Application filed by Hitachi Ltd filed Critical Hitachi Ltd
Publication of CN115079973A publication Critical patent/CN115079973A/en


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G09G3/2092Details of a display terminals using a flat panel, the details relating to the control arrangement of the display terminal and to the interfaces thereto
    • G09G3/2096Details of the interface to the display terminal specific for a flat panel
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/53Querying
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/147Digital output to display device ; Cooperation and interconnection of the display device with other functional units using display panels
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/72409User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/7243User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
    • H04M1/72439User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages for image or video messaging
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/32Operator till task planning
    • G05B2219/32014Augmented reality assists operator in maintenance, repair, programming, assembly, use of head mounted display with 2-D 3-D display and voice feedback, voice and gesture command
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16Sound input; Sound output
    • G06F3/167Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/04Changes in size, position or resolution of an image
    • G09G2340/0464Positioning
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/04Changes in size, position or resolution of an image
    • G09G2340/0492Change of orientation of the displayed image, e.g. upside-down, mirrored
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2354/00Aspects of interface with display user

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Computer Hardware Design (AREA)
  • Multimedia (AREA)
  • General Business, Economics & Management (AREA)
  • Business, Economics & Management (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The invention provides a display system and a display device that are suited to an operator working in a real space and that can display information related to that work. A display system (10) includes a display (160) and a detector that detects the orientation of the display. The display system (10) comprises: a display source image generation unit that generates a display source image including a first image and a second image related to the first image; and a visual field image display unit that displays the first image in a first display area of the display and the second image in a second display area of the display. The visual field image display unit sets a predetermined fixed region of the display as the first display area when the display mode associated with the first image is the first display mode, and determines the first display area according to the orientation of the display when the display mode is the second display mode.

Description

Display system and display device
Technical Field
The present invention relates to a display system and a display device for displaying information for assisting a job.
Background
In work such as manufacturing products in a factory (or workshop) and inspecting or repairing factory equipment, the work is often performed while viewing work documents such as a flow file (procedure document) and pictures. Depending on the work environment, however, it may be difficult to place a display for showing such work documents near the work object. As display devices usable in such cases, see-through head-mounted displays (Head-Mounted Display, hereinafter also HMD), which are worn on the operator's head and superimpose an image of a virtual space onto the real space, and smart glasses have attracted attention. With a see-through HMD or the like, the operator need not hold a display device by hand or look at a display placed far away, so work efficiency can be improved.
Display control in an HMD is made easier to use by changing the screen layout according to the state of the HMD and the operator, or according to the displayed content. For example, in the HMD disclosed in patent document 1, an image of a virtual camera capturing a virtual space is generated from the line-of-sight angle measured by the HMD and is displayed together with an image that does not change with that angle.
The invention described in patent document 1 thus displays, for example, a first image showing the state of a virtual space according to the orientation of a non-see-through HMD, together with a second image, shown in a predetermined area of the screen, that does not change with the orientation of the HMD. In an HMD that supports work at a site such as a factory, however, the appropriate way to display the first and second images differs with the device used, the usage scene, and the displayed content. The technique of patent document 1 therefore cannot be said to be well suited to a display device for on-site work.
Patent document 1: Japanese Patent Laid-Open Publication No. 2019-101330
Disclosure of Invention
The present invention has been made against this background, and its object is to provide a display system and a display device that are suited to an operator working in a real space and that can display information related to that work.
In order to solve the above problem, a display system according to the present invention includes a display and a detector for detecting an orientation of the display, and includes: a display source image generation unit that generates a display source image including a first image and a second image related to the first image; and a visual field image display unit that displays the first image in a first display area of the display and the second image in a second display area of the display, wherein the visual field image display unit sets a predetermined fixed area of the display as the first display area if a display mode associated with the first image is a first display mode, and determines the first display area according to an orientation of the display if the display mode is a second display mode.
According to the present invention, it is possible to provide a display system and a display device which are suitable for an operator who performs work in an actual space and which can display information related to the work. Problems, structures, and effects other than those described above will become apparent from the following description of the embodiments.
Drawings
Fig. 1 is a diagram showing a configuration of a display system according to a first embodiment.
Fig. 2 is a functional block diagram of the HMD according to the first embodiment.
Fig. 3 is a functional block diagram of the smartphone of the first embodiment.
Fig. 4 is a data structure diagram of the job file database according to the first embodiment.
Fig. 5 is a diagram showing a configuration of a display source image according to the first embodiment.
Fig. 6 is a diagram showing a configuration of a field image displayed on the display in the first display mode of the first embodiment.
Fig. 7 is a diagram showing a configuration of a field image displayed on the display in the second display mode of the first embodiment.
Fig. 8 is a diagram showing a configuration of a field image displayed on the display in the third display mode of the first embodiment.
Fig. 9 is a sequence diagram of the configuration definition processing of the first embodiment.
Fig. 10 is a flowchart of the display source image generation processing in the first embodiment.
Fig. 11 is a flowchart of the sight field image display processing in the first embodiment.
Fig. 12 is a functional block diagram of the HMD of the second embodiment.
Fig. 13 is a functional block diagram of a smartphone of the third embodiment.
Fig. 14 is a flowchart of the display source image generation processing in the third embodiment.
Description of the reference numerals
10 display system;
100, 100A HMD (display device);
111 data receiving unit;
112 visual field image display unit;
113 display mode control unit;
122 display source image;
126 display mode;
160 display (display);
170 sensor (detector);
170A sensor (detector, microphone);
175 operation unit;
200, 200B smartphone (portable device, external device);
211 configuration definition unit;
212 display source image generation unit;
213 display mode control unit;
214 flow file emphasizing unit (display source image generation unit);
224 display mode;
230 display source image;
231 flow file arrangement region (region of the first image);
232 picture arrangement region (region of the second image);
233, 233A visual field region;
240 job file database;
250 picture database;
420, 420A, 420B visual field images;
421 in-view flow file region (first display region, fixed region);
422 in-view picture region (second display region);
423 in-view flow file region (first display region).
Detailed Description
Overview of the display system
Hereinafter, a display system according to embodiments for carrying out the present invention will be described. The display system includes a smartphone and a see-through HMD (head-mounted display). The smartphone stores the flow files and pictures for a job, generates an image (referred to as a display source image) in which the flow file and the picture for the work step being performed by the operator are each placed in a predetermined region, and transmits that image to the HMD.
The HMD displays the flow file and the picture contained in the display source image at positions determined by the display mode. In the first display mode, for example, the HMD cuts out the region of the display source image in which the flow file is placed and displays it at a fixed position (for example, below the center) of the HMD screen. The picture, by contrast, is displayed according to the orientation of the wearer. For example, the picture region lies on the right side of the display source image (see the picture arrangement area 232 in the display source image 230 in fig. 5, described later), so the HMD shows no picture while the operator faces forward. When it detects that the operator has turned slightly to the right, the HMD shows the left part of the picture at the right edge of the screen; when the operator turns further to the right, it shows the whole picture.
By displaying the flow file and the picture on the HMD in this manner, the operator can always refer to the flow file while working, and can bring up the picture simply by turning the head to the right. While the operator faces the work object, the picture is not displayed and does not block the view of the work object and the work place (real space). As a result, the operator can work with both hands free, referring to the flow file or the picture without manually operating the smartphone or the HMD.
In the second display mode, the HMD displays the image cut out from the display source image on its screen as-is. Since the flow file is not pinned to a fixed position as in the first display mode, visibility of the real space (the work place) is improved.
The display system switches between the first and second display modes according to the job. By presetting an appropriate display mode for each piece of work content (each flow file), the operator can work without ever having to switch display modes manually.
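As an illustrative sketch (not part of the patent), the per-job mode selection described above could look like the following; the field names ("completed", "display_mode") are assumptions, not terms from the specification:

```python
def current_display_mode(steps, default=1):
    """Return the display mode preset on the first incomplete work step,
    so the worker never has to switch modes by hand. `steps` is an ordered
    list of dicts with assumed keys "completed" and "display_mode"."""
    for step in steps:
        if not step["completed"]:
            return step["display_mode"]
    return default  # assumed fallback once every step is done
```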
First embodiment: overall structure of display system
Fig. 1 is a diagram showing the configuration of the display system 10 according to the first embodiment. The display system 10 includes an HMD 100 (display device) and a smartphone 200 (portable device). The HMD 100 is, for example, a goggle-type display device worn on the head so as to cover the field of view. The HMD 100 and the smartphone 200 are connected, for example, by a USB (Universal Serial Bus) cable, but may be connected by a cable of another standard or wirelessly. The smartphone 200 may also connect to the HMD 100 through an adapter that converts USB to an image output connector such as HDMI (High-Definition Multimedia Interface) or DisplayPort. The term HMD 100 also covers smart glasses worn like eyeglasses; likewise, the smartphone 200 (portable device) covers portable terminals such as tablet PCs and notebook PCs.
First embodiment: structure of HMD
Fig. 2 is a functional block diagram of the HMD 100 of the first embodiment. The HMD 100 includes a control unit 110, a storage unit 120, a display 160, a sensor 170, and a communication unit 180. The communication unit 180 has one or more communication interfaces such as USB and Wi-Fi (registered trademark), and transmits and receives data to and from the smartphone 200.
The display 160 is a highly transmissive see-through display device provided at the front of the HMD 100 (see fig. 1). Besides the binocular type shown in fig. 1, the HMD 100 may be a monocular type used with either the left or right eye. The operator can work while wearing the HMD 100 and referring to the information shown on the display 160.
The sensor 170 is, for example, a MEMS (Micro Electro Mechanical Systems) gyroscope; it detects the angle or angular velocity of the HMD 100 and outputs it to the control unit 110. The HMD 100 may further include an illuminance sensor that measures the ambient illuminance and a camera that photographs the surroundings and the operator.
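When the gyroscope reports angular velocity rather than angle, the heading has to be obtained by integrating the samples over time. A minimal sketch of one such integration step (assumed units and wrap-around convention; not from the patent):

```python
def integrate_yaw(yaw_deg, angular_velocity_dps, dt_s):
    """Update the HMD heading from one gyro sample: angular velocity in
    degrees per second applied over dt seconds, with the result wrapped
    to the range [-180, 180) degrees."""
    yaw = yaw_deg + angular_velocity_dps * dt_s
    return (yaw + 180.0) % 360.0 - 180.0
```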
The storage unit 120 is constituted by a ROM (Read Only Memory), a RAM (Random Access Memory), a flash Memory, and the like. The storage unit 120 stores a program 121, display source images 122, process file coordinates 123, related picture coordinates 124, in-field process file coordinates 125, and a display mode 126. The program 121 is a program executed by a CPU (Central Processing Unit) constituting the control Unit 110, and controls the HMD 100. The program 121 includes a description of the view field image display processing (see fig. 11 described later) and other processing procedures.
The display source image 122 is the display source image 230 (see figs. 3 and 5, described later) received from the smartphone 200, and is the image from which the view field image shown on the display 160 is derived. The flow file coordinates 123, the related picture coordinates 124, and the in-view flow file coordinates 125 are coordinate data, received from the smartphone 200, that indicate regions within the display source image 122; they correspond respectively to the flow file coordinates 221, the related picture coordinates 222, and the in-view flow file coordinates 223 stored in the smartphone 200 (see fig. 3), described later. The display mode 126 indicates how the view field image is displayed and corresponds to the display mode 224 stored in the smartphone 200, described later.
The control unit 110 includes a CPU, and includes a data receiving unit 111 and a visual field image display unit 112. The data receiving unit 111 stores the data received from the smartphone 200 in the display source image 122, the process file coordinates 123, the related picture coordinates 124, the in-view process file coordinates 125, and the display mode 126.
Based on the display mode 126, the visual field image display unit 112 clips an image from the display source image 122 (see the display source image 230 in fig. 5, described later) and displays the clipped image on the display 160. When clipping, the visual field image display unit 112 refers to the flow file coordinates 123, the related picture coordinates 124, the in-view flow file coordinates 125, and the display mode 126.
First embodiment: structure of smart phone
Fig. 3 is a functional block diagram of the smartphone 200 of the first embodiment. The smartphone 200 includes a control unit 210, a storage unit 220, a touch panel display 260, a microphone 270, and a communication unit 280. The communication unit 280 has one or more communication interfaces such as USB and Wi-Fi, and transmits and receives data to and from the HMD 100.
The storage unit 220 is constituted by ROM, RAM, flash memory, and the like. The storage unit 220 stores a display source image 230, a flow file coordinate 221, a related picture coordinate 222, an in-view flow file coordinate 223, a display mode 224, a job file database 240 (see fig. 4 described later), a picture database 250, and a program 229.
The program 229 is a program executed by the CPU constituting the control unit 210, and controls the smartphone 200. The program 229 includes a description of the display source image generation processing (see fig. 10 described later) and other processing procedures.
Fig. 4 is a data structure diagram of the job file database 240 according to the first embodiment. The job file database 240 is, for example, tabular data. Each row (record) of the job file database 240 represents one work step performed by the worker, i.e. the user of the display system 10. A record consists of a step number 241 (shown as # in fig. 4), a work target portion 242, a work content 243, a related picture 244, a completion flag 245 (shown as completion F in fig. 4), a completion date and time 246, and a display mode 247 (shown as display M in fig. 4).
The step number 241 is a number assigned to each work step and indicates the order of the steps. The work target portion 242 indicates the part subject to work such as repair or inspection, for example the name of a work target such as "machine No. 1 of model A". The work content 243 is the description of the work step shown on the display 160 (see figs. 1 and 2), for example "turn off the switch B". The related picture 244 is the identification information of a picture related to the work step that is shown on the display 160. The completion flag 245 indicates whether the work step is complete ("yes") or incomplete ("no"). The completion date and time 246 records when the work step was completed. The display mode 247 indicates how to display the flow file and the related picture; the first to third display modes are defined in the first embodiment and described in detail later.
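One record of the job file database 240 can be sketched as a small data type; the Python field names below are illustrative renderings of the columns of fig. 4, not identifiers from the patent:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class JobStep:
    """One record (row) of the job file database 240 (fig. 4)."""
    step_number: int             # "#": order of the work step
    work_target: str             # e.g. "machine No. 1 of model A"
    work_content: str            # e.g. "turn off the switch B"
    related_picture: str         # id of a picture in the picture database 250
    completed: bool              # completion flag: "yes"/"no"
    completed_at: Optional[str]  # completion date and time, if any
    display_mode: int            # 1-3: first to third display mode

def next_step(steps):
    """First incomplete step in step-number order, or None when all done."""
    for step in sorted(steps, key=lambda s: s.step_number):
        if not step.completed:
            return step
    return None
```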
Returning to FIG. 3, picture database 250 stores pictures associated with the job steps. The picture is given identification information corresponding to the related picture 244 (see fig. 4). The control unit 210 can access the picture by specifying the identification information. The display source image 230, the flow file coordinates 221, the related picture coordinates 222, and the in-view flow file coordinates 223 will be described with reference to fig. 5 to 8 described later.
The display mode 224 is a mode governing the layout of the view field image containing the flow file and the picture; there are three modes, the first to third display modes. Details of the display modes will be described with reference to figs. 5 to 8.
First embodiment: displaying source and visual field images
Fig. 5 is a diagram showing the configuration of the display source image 230 according to the first embodiment. The display source image 230 is the image on which the image shown on the display 160 (see figs. 1 and 2) is based, and corresponds to the display source image 122 shown in fig. 2. The flow file arrangement region 231 is a partial region of the display source image 230 in which the contents of the flow file (the step number 241, work target portion 242, work content 243, related picture 244, and completion flag 245 of fig. 4) are placed. The picture arrangement area 232 is a partial region of the display source image 230 in which the picture associated with the flow file (the picture identified by the related picture 244 of fig. 4 and stored in the picture database 250) is placed.
The visual field region 233 is a partial region of the display source image 230 and is the region shown on the display 160 (see figs. 1 and 2) in the first and second display modes. The size of the visual field region 233 equals the display size of the display 160.
The visual field region 233 moves within the display source image 230 according to the orientation of the HMD 100. For example, when the operator turns to the left and the sensor 170 (see fig. 2) detects the leftward rotation of the HMD 100, the visual field region 233A located on the left side of the display source image 230 is shown on the display 160.
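The mapping from head orientation to the position of the visual field region can be sketched as follows. The linear yaw-to-pixel mapping and the `deg_per_px` tuning constant are assumptions for illustration; the patent only states that the region moves with the orientation of the HMD:

```python
def viewport_origin_x(yaw_deg, src_width, view_width, deg_per_px):
    """Horizontal origin of the visual field region inside the display
    source image for a given head yaw. Yaw 0 centres the region; negative
    yaw (turning left) moves it left, towards region 233A. The result is
    clamped so the region stays inside the image."""
    centre = (src_width - view_width) / 2.0
    x = centre + yaw_deg / deg_per_px  # degrees -> pixels of travel
    return int(max(0.0, min(float(src_width - view_width), x)))
```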
The flow file coordinates 221 (see fig. 3) are the arrangement coordinates of the flow file arrangement region 231 in the display source image 230. The arrangement coordinates are, for example, the coordinates of the upper-left and lower-right vertices of the region within the display source image 230; they may instead be the coordinates of the upper-left vertex together with the size of the region. The related picture coordinates 222 are the arrangement coordinates of the picture arrangement region 232 in the display source image 230.
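The two coordinate conventions just mentioned are interchangeable; a minimal sketch (helper names are illustrative):

```python
def region_from_vertices(x1, y1, x2, y2):
    """Arrangement coordinates as upper-left and lower-right vertices."""
    return (x1, y1, x2, y2)

def region_from_size(x, y, width, height):
    """The same region given as upper-left vertex plus size, converted
    to the vertex form."""
    return (x, y, x + width, y + height)
```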
The configurations of the visual field images 420, 420A, and 420B in the first to third display modes will be described below with reference to figs. 6 to 8.
Fig. 6 is a diagram showing the configuration of the view field image 420 shown on the display 160 in the first display mode of the first embodiment. The view field image 420 corresponds to the visual field region 233 (see fig. 5) and is the view field image of the first display mode. It consists of the in-view picture region 422 (described later) cut out of the display source image 230, with the in-view flow file region 421 added. On the display 160, no pixels are lit outside the in-view picture region 422 and the in-view flow file region 421, so the operator wearing the HMD 100 sees the work place (real space) through that part of the field of view.
The in-view flow file area 421 is a partial area of the view image 420, and is an area where the flow file is displayed. In the first embodiment, the in-view flow file region 421 is disposed below the center of the view field image 420, but may be disposed at another position. The in-view flow file coordinates 223 (see fig. 3) are arrangement coordinates of the in-view flow file region 421 in the view field image 420.
The in-view picture region 422 is the region where the visual field region 233 (see fig. 5) overlaps the picture arrangement region 232, and contains part of the picture. Since the visual field region 233 moves within the display source image 230 according to the orientation of the HMD 100, the in-view picture region 422, being the overlap of the visual field region 233 and the picture arrangement region 232, also changes with the orientation of the HMD 100. For example, when the operator turns to the left, the visual field region moves from region 233 to region 233A. Since the visual field region 233A does not overlap the picture arrangement region 232, there is no in-view picture region 422 and no picture is shown on the display 160.
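The overlap described above is an ordinary axis-aligned rectangle intersection; a sketch assuming the vertex-pair coordinate convention from fig. 5 (the function name is illustrative):

```python
def overlap(a, b):
    """Overlap of two regions given as (x1, y1, x2, y2) tuples, or None if
    they are disjoint. The in-view picture region can be derived this way
    as the overlap of the visual field region and the picture arrangement
    region."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    if x1 >= x2 or y1 >= y2:
        return None  # no overlap -> no picture is shown on the display
    return (x1, y1, x2, y2)
```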
In the first display mode, the in-view flow file region 421 in the view field image 420 is fixed, and the flow file is always displayed on the display 160, so that the worker can confirm the flow file without changing the orientation of the HMD100 (head). Therefore, the work efficiency is improved when a plurality of relatively short processes are performed on the same work object.
Fig. 7 is a diagram showing the configuration of a visual field image 420A displayed on the display 160 in the second display mode of the first embodiment. The visual field image 420A is an image corresponding to the visual field region 233 (see fig. 5), and is the visual field image in the second display mode, obtained by cutting out the in-view picture region 422 and an in-view flow file region 423 from the display source image 230. Since the display 160 lights no pixels outside the in-view picture region 422 and the in-view flow file region 423, the operator wearing the HMD100 can see the work place through that part of the field of view.
The in-view flow file region 423 is a region where the view region 233 overlaps the flow file arrangement region 231. The in-view picture region 422 is a region where the view region 233 overlaps the picture arrangement region 232.
In the second display mode, the in-view flow file region 423 in the visual field image 420A is not fixed, and the flow file is not displayed at a fixed position, so visibility of the real space is improved. This improves work efficiency when the worker concentrates on relatively precise work.
Fig. 8 is a diagram showing the configuration of a visual field image 420B displayed on the display 160 in the third display mode according to the first embodiment. The visual field image 420B is an image corresponding to the entire display source image 230, and is the visual field image in the third display mode. Since the display 160 has fewer pixels than the display source image 230, the visual field image 420B is a reduced image of the display source image 230. The flow file arrangement region 231 and the picture arrangement region 232 are likewise reduced, becoming an in-view flow file region 425 and an in-view picture region 424, respectively.
In the third display mode, the flow file and the picture can be confirmed at the same time, which is convenient when checking the contents before performing a job. In addition, when a monocular HMD is used, visibility of the real space can be ensured even if the flow file and the picture are always displayed on the display, so the third display mode may be set permanently.
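The reduction in the third display mode can be sketched as a uniform scale-down of the display source image to the display resolution; the aspect-ratio-preserving fit and the sample resolutions are assumptions for illustration, since the patent does not specify the scaling rule.

```python
def reduce_to_fit(src_w, src_h, disp_w, disp_h):
    """Scale the display source image so it fits entirely on the display,
    preserving aspect ratio; returns the reduced (width, height)."""
    scale = min(disp_w / src_w, disp_h / src_h)
    return int(src_w * scale), int(src_h * scale)

# A 1920x1080 display source image shown on a 640x360 HMD display:
print(reduce_to_fit(1920, 1080, 640, 360))  # (640, 360)
```

The arrangement regions inside the source image shrink by the same factor, yielding the in-view flow file region 425 and the in-view picture region 424.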
First embodiment: structure of the smart phone: control section
Returning to fig. 3, the control unit 210 includes a CPU, and includes an arrangement definition unit 211, a display source image generation unit 212, and a display mode control unit 213. The arrangement defining unit 211 receives an instruction from an operator as a user of the display system 10, and defines (sets) the arrangement (arrangement coordinates) of the flow file arrangement region 231 and the picture arrangement region 232 in the display source image 230 (see fig. 5), and the arrangement of the in-view flow file region 421 in the view image 420 (see fig. 6). The arrangement definition unit 211 transmits the defined arrangement coordinates to the HMD 100.
The display source image generation unit 212 generates a display source image 230 including a flow file and a related picture related to a work procedure in which the worker works, and transmits the generated display source image to the HMD 100.
The display mode control unit 213 determines the display mode and stores it as the display mode 224.
First embodiment: configuration definition processing
Fig. 9 is a sequence diagram of the configuration definition processing of the first embodiment. With reference to fig. 9, a process of the smartphone 200 and the HMD100 for setting the display source image 230 (see fig. 5) and the visual field image 420 (see fig. 6) will be described.
In step S111, the arrangement definition unit 211 of the smartphone 200 acquires the arrangement coordinates of the flow file arrangement region 231 in the display source image 230 (see fig. 5) instructed by the operator as the user of the display system 10, and stores the arrangement coordinates as the flow file coordinates 221 (see fig. 3). The configuration coordinates are, for example, the coordinates of the upper left vertex and the lower right vertex of the region within the display source image 230.
In step S112, the arrangement definition unit 211 acquires the arrangement coordinates of the picture arrangement region 232 in the display source image 230 instructed by the operator, and stores the arrangement coordinates as the related picture coordinates 222.
In step S113, the arrangement definition unit 211 acquires the arrangement coordinates of the in-view flow file area 421 in the view field image 420 (see fig. 6) instructed by the operator, and stores the arrangement coordinates as the in-view flow file coordinates 223.
In step S114, the arrangement definition unit 211 transmits the flow file coordinates 221, the associated picture coordinates 222, and the in-field flow file coordinates 223 to the HMD 100.
In step S115, the data receiving unit 111 of the HMD100 stores the received flow file coordinates 221, associated picture coordinates 222, and in-view flow file coordinates 223 as flow file coordinates 123, associated picture coordinates 124, and in-view flow file coordinates 125, respectively.
Next, the processing of the display system 10 when the operator performs work using the display system 10 will be described.
First embodiment: display source image creation process
Fig. 10 is a flowchart of the display source image generation processing in the first embodiment. With reference to fig. 10, a process of generating a display source image 230 and transmitting the same to the HMD100 by the smartphone 200 will be described.
In step S131, if there is an instruction indicating completion of a work step (step S131 → yes), the display source image generation unit 212 proceeds to step S132; otherwise (step S131 → no), it proceeds to step S133. The completion instruction is an instruction from the operator telling the display system 10 to record that one work step has been completed. The display source image generation unit 212 determines that a completion instruction has been given when it detects the operator's utterance of "work step completed" in the voice acquired by the microphone 270 (see fig. 3). Alternatively, the display source image generation unit 212 may determine that a completion instruction has been given when it detects a tap on a work-step completion button displayed on the touch-panel display 260.
In step S132, the display source image generation unit 212 records the completion of the work step. Specifically, it updates the completion flag 245 (see fig. 4) of the current work step to "yes" and the completion date and time 246 to the current time. The current work step is the work step whose completion flag 245 is "no" and whose step number 241 is the smallest.
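The selection of the current work step described above can be sketched as follows; the records mirror rows of the job file database (fig. 4), but the field names and sample data are assumptions for illustration.

```python
# Each record mirrors a row of the job file database (fig. 4).
work_file_db = [
    {"step_number": 1, "work_content": "tighten bolt A", "completed": True},
    {"step_number": 2, "work_content": "tighten bolt B", "completed": False},
    {"step_number": 3, "work_content": "attach cover",   "completed": False},
]

def current_step(db):
    """The work step whose completion flag is 'no' and whose step number is smallest."""
    return min((s for s in db if not s["completed"]), key=lambda s: s["step_number"])

def complete_current_step(db):
    """Step S132: record completion of the current work step."""
    current_step(db)["completed"] = True

print(current_step(work_file_db)["step_number"])  # 2
complete_current_step(work_file_db)
print(current_step(work_file_db)["step_number"])  # 3
```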
In step S133, the display source image generation unit 212 generates a blank image and stores the blank image as the display source image 230.
In step S134, the display source image generation unit 212 draws the flow file in the flow file arrangement region 231 of the display source image 230 (see fig. 5). Specifically, it draws the step number 241, work target portion 242, work contents 243, related picture 244, and completion flag 245 (see fig. 4) of the current work step in the flow file arrangement region 231.
In step S135, the display mode control unit 213 acquires the display mode 247 of the current work step and stores it as the display mode 224.
In step S136, the display source image generation unit 212 draws a related picture in the picture arrangement region 232 of the display source image 230. Specifically, it acquires the picture identified by the identification information in the related picture 244 of the current work step from the picture database 250 and draws it in the picture arrangement region 232.
In step S137, the display source image generation unit 212 transmits the display source image 230 and the display mode 224 to the HMD100, and the process returns to step S131. Further, the display mode control unit 213 may transmit the display mode 224 to the HMD 100.
First embodiment: visual field image display processing
Fig. 11 is a flowchart of the sight field image display processing in the first embodiment. With reference to fig. 11, a process of displaying a field of view image on the display 160 in accordance with the display source image 230 and the display mode 224 received from the smartphone 200 (see step S137 in fig. 10) by the HMD100 will be described.
In step S151, the data receiving unit 111 receives the display source image 230 and the display mode 224 transmitted from the smartphone 200, and stores them as the display source image 122 and the display mode 126, respectively.
In step S152, if the display mode 126 is the first display mode (step S152 → first display mode), the visual field image display unit 112 proceeds to step S153, if the display mode is the second display mode (step S152 → second display mode), the process proceeds to step S156, and if the display mode is the third display mode (step S152 → third display mode), the process proceeds to step S159.
In step S153, the visual field image display unit 112 displays the image of the flow file arrangement region 231 (the partial region of the display source image 122 indicated by the flow file coordinates 123; see fig. 5) in the region of the display 160 indicated by the in-view flow file coordinates 125 (see the in-view flow file region 421 in fig. 6).
In step S154, the visual field image display unit 112 calculates a visual field region 233 (see fig. 5) from the orientation (angle) of the HMD100 acquired by the sensor 170.
In step S155, the visual field image display unit 112 displays the overlapping area of the visual field region 233 and the picture arrangement region 232 in the corresponding region of the display 160, and returns to step S151. The corresponding region of the display 160 is a region corresponding to the in-field picture region 422 in the field-of-view image 420 when the display image of the display 160 is regarded as the field-of-view image 420 (see fig. 6).
Step S156 is the same as step S154.
In step S157, the visual field image display unit 112 displays the overlapping area of the visual field area 233 and the flow document placement area 231 in the corresponding area of the display 160. In other words, the visual field image display unit 112 determines, as the in-view flow file region 423, a region overlapping with the flow file arrangement region 231 in the visual field region 233 in accordance with the orientation of the HMD 100.
Step S158 is the same as step S155.
In step S159, the visual field image display unit 112 displays the display source image 122 on the display 160, and returns to step S151. In this case, the display source image 122 may be resized so that it is displayed in its entirety on the display 160.
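The branch in step S152 can be summarized as a dispatch on the display mode; the function and label names below are illustrative, not identifiers from the patent.

```python
FIRST, SECOND, THIRD = 1, 2, 3

def drawing_steps(display_mode):
    """Which parts of the display source image are drawn in each display mode."""
    if display_mode == FIRST:
        # S153-S155: flow file at the fixed in-view coordinates; picture
        # clipped by the visual field region computed from the HMD orientation.
        return ["flow_file_at_fixed_region", "picture_clipped_by_view"]
    if display_mode == SECOND:
        # S156-S158: both the flow file and the picture are clipped by the
        # visual field region, so neither is fixed on the display.
        return ["flow_file_clipped_by_view", "picture_clipped_by_view"]
    # S159: the whole display source image, reduced to fit the display.
    return ["whole_source_image_reduced"]

print(drawing_steps(SECOND))  # ['flow_file_clipped_by_view', 'picture_clipped_by_view']
```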
First embodiment: characteristics of display system
A flow file and a related picture of a work step that the worker is working on are displayed on the display 160 of the HMD 100.
In the first display mode, the flow file is always displayed at a set position on the display 160 (see the in-view flow file region 421 in fig. 6). Therefore, the operator can work while referring to the flow file at all times. The related picture is displayed in accordance with the orientation of the operator's head (HMD100). Suppose the picture is arranged on the right side of the display source image 230. Then, while the operator faces forward, no picture is displayed, and the operator can work without being obstructed by the picture; by turning to the right, the operator can refer to the picture. As a result, the operator can work with both hands free, referring to the flow file or the picture without manually operating the smartphone 200 or the HMD100.
In the second display mode, the flow file is not fixed within the visual field, so visibility of the real space is improved, which improves work efficiency when the worker concentrates on finer work.
In the third display mode, the flow file and the picture can be confirmed on a single screen, which is convenient when checking the contents before performing a job. In addition, when a monocular HMD is used, visibility of the real space can be ensured even if the flow file and the picture are always displayed on the display, so the third display mode may be set permanently.
The display mode is switched according to the work step (see steps S135 and S137 in fig. 10 and step S152 in fig. 11). Because the display mode is set according to the work content of each work step, the operator does not need to switch display modes during work and can concentrate on the work, which improves work efficiency.
First embodiment: modification example: switching of display modes
The display system 10 according to the first embodiment switches among the first to third display modes according to the display mode 247 set in the job file database 240, but other methods may be adopted. For example, the display mode control unit 213 may recognize the type of content to be displayed and select the third display mode when a setting screen (see fig. 9) is displayed, or the first display mode when a job screen is displayed. The display mode control unit 213 may also acquire the HMD type (monocular/binocular) from the HMD100 and select the third display mode for a monocular HMD and the first display mode for a binocular HMD. The display mode control unit 213 may further switch the display mode in accordance with a voice command of the operator picked up by the microphone 270 (see fig. 3).
Second embodiment
In the first embodiment, the HMD100 switches the display mode in accordance with the display mode 126 transmitted by the smartphone 200. The display mode may be switched by an operator operating the HMD 100.
Fig. 12 is a functional block diagram of the HMD100A of the second embodiment. In comparison with the HMD100 (see fig. 2) of the first embodiment, the HMD100A includes an operation unit 175, the control unit 110 includes a display mode control unit 113, and the sensor 170A includes a microphone.
The operation unit 175 is, for example, a button. The display mode control unit 113 recognizes a voice picked up by the microphone of the sensor 170A and switches the display mode accordingly; for example, on recognizing a voice command such as "display mode 1", it switches to the first display mode. The display mode control unit 113 also cycles through the first, second, and third display modes, in that order, each time the button of the operation unit 175 is pressed.
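The button toggle described above can be sketched as cycling through the three modes; the mode labels are illustrative.

```python
MODES = ["first", "second", "third"]

def on_button_press(current_mode):
    """Advance to the next display mode in order, wrapping back to the first."""
    return MODES[(MODES.index(current_mode) + 1) % len(MODES)]

print(on_button_press("first"))  # second
print(on_button_press("third"))  # first
```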
In the first embodiment, the display mode is set in the display mode 247 (see fig. 4) and is fixed per work step (work content). In the second embodiment, the operator can switch between display modes according to the situation, which improves the usability of the display system 10.
Third embodiment
Fig. 13 is a functional block diagram of a smartphone 200B of the third embodiment. In the third embodiment, a flow file emphasizing unit 214 that emphasizes differences between flow files is added. The flow file emphasizing unit 214 emphasizes, in the work contents 243 (work process) in the work file database 240, the difference from the preceding work step. This helps prevent misreading during the work process.
Fig. 14 is a flowchart of the display source image generation processing in the third embodiment.
Steps S311 to S313 are the same as steps S131 to S133 shown in fig. 10, respectively.
In step S314, the flow file emphasizing unit 214 calculates the difference between the flow file of the current work step and that of the immediately preceding work step. The difference is calculated using, for example, the Levenshtein distance.
In step S315, the flow file emphasizing unit 214 proceeds to step S316 if the Levenshtein distance D calculated in step S314 is greater than 0 and smaller than a predetermined threshold value D_a (step S315 → yes), and proceeds to step S318 otherwise (step S315 → no).
In step S316, the flow file emphasizing unit 214 identifies the characters in which the current flow file differs from the immediately preceding flow file.
In step S317, the display source image generation unit 212 draws the flow file in the flow file arrangement region 231 of the display source image 230 (see fig. 5), emphasizing the characters identified in step S316, for example by changing their color.
Steps S318 to S321 are the same as steps S134 to S137 shown in fig. 10, respectively.
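The difference test of steps S314-S316 can be sketched with a plain Levenshtein implementation; the threshold value D_a below is an assumption, since the patent leaves it unspecified.

```python
def levenshtein(a, b):
    """Edit distance between a and b (insertion, deletion, substitution cost 1)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def should_emphasize(current_text, previous_text, d_a=5):
    """Step S315: emphasize only when 0 < D < D_a, i.e. the texts differ slightly."""
    d = levenshtein(current_text, previous_text)
    return 0 < d < d_a

print(should_emphasize("Tighten bolt A", "Tighten bolt B"))          # True
print(should_emphasize("Tighten bolt A", "Tighten bolt A"))          # False (identical)
print(should_emphasize("Tighten bolt A", "Attach the cover panel"))  # False (too different)
```

Only near-identical instructions trigger emphasis, which is exactly the case where a worker is likely to overlook a small change.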
Features of the third embodiment
In the third embodiment, when the flow files (work contents) of consecutive work steps are similar, the flow file is displayed so that their difference is visible. This prevents the operator from misreading the instructions due to preconceptions, enabling more reliable work.
In the third embodiment, the flow files of the preceding and following work steps are compared. In the case where the current work step includes a plurality of work processes, the difference between the plurality of work processes may be emphasized.
Modification example: area of vision
In the above-described embodiment, the visual field region 233 (see fig. 5) is calculated based on the orientation of the HMD100 detected by the sensor 170 (see step S154 shown in fig. 11). On the other hand, the visual field region 233 may be determined based on other information.
As a first example, the sensor 170 may include a line-of-sight sensor that detects the line of sight of the operator wearing the HMD100, and the visual field image display unit 112 may determine the moving direction of the visual field region 233 from that line of sight. For example, according to whether the line of sight points up, right, left, and so on, the visual field region 233 may be moved within the display source image 230 from its current position in the same direction.
As a second example, the visual field image display unit 112 may determine the moving direction of the visual field region 233 from the operator's voice detected by a microphone provided in the HMD100 or the smartphone 200. For example, a voice command of the operator indicating a direction such as up, upper right, or left may be detected, and the visual field region 233 in the display source image 230 may be moved in the indicated direction.
As a third example, the visual field image display unit 112 may determine the moving direction of the visual field region 233 from the direction of a sound detected by a microphone provided in the HMD100. The sound is that of the work object (a preset sound source), for example an engine sound or a motor sound. The direction of the work object's sound may be detected, and the visual field region 233 moved in that direction.
In the first to third examples, the visual field image display unit 112 determines the moving direction of the visual field region 233 from the line of sight, the voice, or the sound of the work object; it may instead determine the position of the in-view picture region 422 in the visual field image 420 (see figs. 6 and 7). For example, the in-view picture region 422 may be arranged at the top, upper right, left, and so on of the visual field image 420 (regions above, upper right of, left of, etc. the center of the visual field image 420) according to the line of sight, the voice, or the sound of the work object. In that arrangement, the visual field image display unit 112 displays the picture arrangement region 232 of the display source image 230 (see fig. 5) in the corresponding region relative to the center of the display 160.
In the case of the work object's sound, the in-view picture region 422 may be arranged in the direction opposite to the direction of the sound, in a region above, upper right of, left of, etc. the center of the visual field image 420. By arranging it on the opposite side, the operator keeps a clear view in the direction of the work object, making the work easier.
To secure the field of view, the picture arrangement region 232 is preferably displayed so as to touch an edge (the periphery) of the display 160 in the region above, upper right of, left of, etc. the center of the display 160.
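Placing the picture region against the display edge opposite the sound source can be sketched as follows; the coordinate convention (origin at the top-left, sizes in pixels) and all values are assumptions for illustration.

```python
def place_opposite(disp_w, disp_h, region_w, region_h, sound_direction):
    """Top-left corner of the in-view picture region, placed against the
    display edge opposite the sound direction and centered along that edge."""
    cx = (disp_w - region_w) // 2   # horizontally centered
    cy = (disp_h - region_h) // 2   # vertically centered
    return {
        "left":  (disp_w - region_w, cy),  # sound on the left -> right edge
        "right": (0, cy),                  # sound on the right -> left edge
        "up":    (cx, disp_h - region_h),  # sound above -> bottom edge
        "down":  (cx, 0),                  # sound below -> top edge
    }[sound_direction]

print(place_opposite(1280, 720, 320, 240, "right"))  # (0, 240)
```

The opposite-side placement keeps the view toward the work object unobstructed while still touching the display periphery, as the preceding paragraphs describe.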
Other modifications
The present invention can take various other embodiments, and various modifications such as omissions and replacements can be made without departing from the gist of the present invention. For example, the functional units and data of the smartphone 200 may be held by the HMD100. In that case, the display system 10 can be regarded as the HMD100 itself.
The components (functional units) of the control unit 110 of the HMD100 and of the control unit 210 of the smartphone 200 may be relocated between the devices. For example, the arrangement defining unit 211, the display source image generation unit 212, and the display mode control unit 213 may all be included in the HMD100. In that case, the display source image generation unit 212 and the display mode control unit 213 in the HMD100 access the job file database 240 and the picture database 250 stored in the smartphone 200. Such an embodiment is preferable when the storage unit 120 of the HMD100 is small.
In the embodiment and modifications described above, the smartphone 200 generates the display source image and transmits it to the HMD100, and the HMD100 cuts out the picture image and the flow file image from the display source image and displays them on the display 160. Instead, the smartphone 200 may transmit the picture image and the flow file image separately, and the HMD100 may cut regions out of those images and display them on the display 160. Alternatively, the smartphone 200 may transmit the text itself to the HMD100 rather than an image of the text, and the HMD100 may convert the text into an image.
These embodiments and modifications are included in the scope and gist of the invention described in the present specification and the like, and are included in the invention described in the claims and the equivalent scope thereof.

Claims (13)

1. A display system configured to include a display and a detector that detects an orientation of the display,
the display system is provided with:
a display source image generation unit that generates a display source image including a first image and a second image related to the first image; and
a visual field image display unit that displays the first image in a first display region of the display and the second image in a second display region of the display,
the sight field image display section sets a predetermined fixed region of the display as the first display region if a display mode associated with the first image is a first display mode,
the visual field image display unit determines the first display area according to an orientation of the display if the display mode is a second display mode.
2. The display system according to claim 1,
and if the display mode is a third display mode, the sight field image display part displays the display source image on the display.
3. The display system according to claim 2,
in a case where the display is a monocular display, the visual field image display unit displays the display source image on the display while setting the display mode to the third display mode.
4. The display system according to claim 1,
the first image is an image of text,
the display source image generation unit generates a plurality of display source images,
the display source image generation unit obtains a difference between the text of the first image included in the previously generated display source image and the text of the first image included in the currently generated display source image, and generates the display source image by emphasizing the difference when the difference satisfies a predetermined condition.
5. The display system according to claim 1,
the first image is an image of text,
the text comprises a plurality of partial texts,
the display source image generation unit obtains a difference between the plurality of partial texts, and generates the display source image by emphasizing the difference when the difference satisfies a predetermined condition.
6. The display system according to claim 4 or 5,
the display source image generation unit obtains the difference using a levenstein distance.
7. A display device of a display system configured to include a display device and a portable device,
the portable device includes:
a display source image generation unit that generates a display source image including a first image and a second image related to the first image, and transmits the display source image to the display device; and
a display mode control unit that transmits a display mode associated with the first image to the display device,
the display device includes:
a display;
a detector that detects an orientation of the display; and
a visual field image display unit that displays the first image in a first display region of the display and the second image in a second display region of the display,
the sight field image display section sets a predetermined fixed region of the display as the first display region if the display mode is a first display mode,
the visual field image display unit determines the first display area according to an orientation of the display if the display mode is a second display mode.
8. A display device is characterized by comprising:
a display;
a sensor; and
a visual field image display unit that displays a first image in a first display region of the display and displays a second image related to the first image in a second display region of the display,
the visual field image display unit determines the second display area based on a direction as a detection result detected by the sensor.
9. The display device according to claim 8,
the visual field image display unit displays the second display region in a region of the display located in a direction of a detection result detected by the sensor from a center of the display, or moves the second display region before detection in the direction of the detection result detected by the sensor in the display.
10. The display device according to claim 8 or 9,
the sensor detects a direction of the display device, a direction of a line of sight of a user of the display device, a direction indicated by a voice of the user of the display device, or a direction of a sound source set in advance.
11. The display device according to claim 8 or 9,
the display device further includes:
a second sensor that detects an orientation of the display; and
a display mode control unit that acquires a display mode associated with the first image,
the sensor detects a direction of a line of sight of a user of the display device, a direction indicated by a voice of the user of the display device, or a preset direction of a sound source,
the sight field image display section sets a predetermined fixed region of the display as the first display region if the display mode is a first display mode,
the visual field image display unit determines the first display area according to an orientation of the display if the display mode is a second display mode.
12. The display device according to claim 11,
the display mode control unit acquires a display mode from any one of sounds acquired by an external device, an operation unit provided in the display device, and a microphone provided in the display device.
13. The display device according to claim 8,
the sensor detects a preset sound direction of the sound source,
the visual field image display unit displays the second display region in a region of the display located in a direction opposite to a direction as a detection result detected by the sensor from a center of the display, or moves the second display region before detection in the display in a direction opposite to the direction as the detection result detected by the sensor.
CN202210199474.8A 2021-03-11 2022-03-02 Display system and display device Pending CN115079973A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021039144A JP7476128B2 (en) 2021-03-11 2021-03-11 Display system and display device
JP2021-039144 2021-03-11

Publications (1)

Publication Number Publication Date
CN115079973A true CN115079973A (en) 2022-09-20

Family

ID=83194354

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210199474.8A Pending CN115079973A (en) 2021-03-11 2022-03-02 Display system and display device

Country Status (3)

Country Link
US (1) US11574582B2 (en)
JP (1) JP7476128B2 (en)
CN (1) CN115079973A (en)


Also Published As

Publication number Publication date
JP2022138964A (en) 2022-09-26
JP7476128B2 (en) 2024-04-30
US11574582B2 (en) 2023-02-07
US20220293036A1 (en) 2022-09-15

Similar Documents

Publication Publication Date Title
US11557134B2 (en) Methods and systems for training an object detection algorithm using synthetic images
JP6387825B2 (en) Display system and information display method
CN110058759B (en) Display device and image display method
US10489981B2 (en) Information processing device, information processing method, and program for controlling display of a virtual object
JP6491574B2 (en) AR information display device
US10234955B2 (en) Input recognition apparatus, input recognition method using maker location, and non-transitory computer-readable storage program
JP6399692B2 (en) Head mounted display, image display method and program
EP3029547A1 (en) Three-dimensional input device and input system
US11972532B2 (en) Display terminal, display control system and display control method
CN110788836A (en) Cooperative action assisting device
EP3748477A1 (en) Method and system for spawning attention pointers (atp) for drawing attention of an user in a virtual screen display with augmented and virtual reality
CN115079973A (en) Display system and display device
US11333892B2 (en) Display apparatus, display system, and display method
JP2010205031A (en) Method, system and program for specifying input position
CN103713387A (en) Electronic device and acquisition method
JP2010205030A (en) Method, system and program for specifying related information display position
US11720313B2 (en) Display system, display method and program
EP3550524A1 (en) Computer program, display device, head worn display device, and marker
WO2020071144A1 (en) Information processing device, information processing method, and program
JP7505112B2 (en) Wearable terminal device, program, and notification method
WO2017122508A1 (en) Information display system and information display method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination