WO2015105044A1 - Interface device, portable device, control device, module, control method, and program storage medium - Google Patents
Interface device, portable device, control device, module, control method, and program storage medium
- Publication number
- WO2015105044A1 (PCT/JP2015/000030)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- projection
- projected
- image
- video
- operation object
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0425—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
- G06F3/0426—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected tracking fingers with respect to a virtual keyboard projected or printed on the surface
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B21/00—Projectors or projection-type viewers; Accessories therefor
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B21/00—Projectors or projection-type viewers; Accessories therefor
- G03B21/14—Details
- G03B21/26—Projecting separately subsidiary matter simultaneously with main image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/163—Wearable computers, e.g. on a belt
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1662—Details related to the integrated keyboard
- G06F1/1673—Arrangements for projecting a virtual keyboard
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0425—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3191—Testing thereof
- H04N9/3194—Testing thereof including sensor feedback
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
Definitions
- the present invention relates to interface technology.
- the interface device includes, for example, a projection unit and an imaging unit, and realizes an interface function by, for example, displaying (projecting) a keyboard and information with the projection unit and performing image processing on the video photographed by the imaging unit. Owing to the recent miniaturization of projection units and imaging units, such an interface device is easier to miniaturize and easier to carry than an input device such as a general keyboard or mouse.
- Non-Patent Document 1 discloses an example of a virtual keyboard.
- An apparatus for realizing a virtual keyboard disclosed in Non-Patent Document 1 includes a red semiconductor laser, an infrared semiconductor laser, and a camera.
- the red semiconductor laser projects a keyboard image and, at the same time, the infrared semiconductor laser irradiates the region where the keyboard is projected with a screen-like infrared beam.
- the screen-like infrared beam hits the finger and is reflected.
- the device (virtual keyboard) recognizes the position of the finger based on the captured image when the camera captures infrared light from the reflection.
- Non-Patent Document 2 discloses an example of an interface device.
- the interface device disclosed in Non-Patent Document 2 is configured by a combination of a projection device disposed on a shoulder portion of an operator and a 3D (three-dimensional) depth recognition device.
- the projection device projects a key pattern or the like, and a 3D depth recognition device, commonly known as Kinect® (Kinect is a registered trademark of Microsoft), recognizes that the key has been pressed using a three-dimensional position detection function.
- Kinect® is a registered trademark of Microsoft.
- in the virtual keyboard of Non-Patent Document 1, it is necessary to irradiate the screen-shaped infrared beam based on the projection position (operation surface) of the keyboard. For this reason, there is a problem that the positional relationship between the projection position (operation surface) of the keyboard and the apparatus is uniquely determined. Meanwhile, mobile terminals such as tablets, which have become remarkably popular in recent years, have a convenient function of rotating an image and displaying it in a direction that is easy for the user to view, regardless of whether the screen is in landscape or portrait orientation.
- however, when the virtual keyboard disclosed in Non-Patent Document 1 is mounted on a portable terminal such as a tablet, that convenient function cannot be exploited while the virtual keyboard is in use, even though the portable terminal has it.
- the distance between the portable terminal and the projection position of the keyboard needs to be set to an appropriate predetermined distance, and thus the position of the portable terminal is fixed.
- in addition, since the mobile terminal needs to be installed at a predetermined angle with respect to the projection surface of the keyboard, the degree of freedom of the attitude of the mobile terminal is reduced.
- the virtual keyboard disclosed in Non-Patent Document 1 also has a fundamental problem in that it is prone to erroneous recognition. That is, when using a normal keyboard, the fingers rest on the keys at the home position, but in the case of this virtual keyboard, the fingers must be kept floating. For this reason, if a finger is accidentally brought closer to the operation surface than necessary, the apparatus recognizes that a key is being pressed.
- the interface device disclosed in Non-Patent Document 2 also has problems related to recognition accuracy.
- recognizing a keystroke by detecting the positions of the finger and the surface in three dimensions requires a highly accurate recognition technique. For this reason, it is considered difficult for the interface device disclosed in Non-Patent Document 2 to prevent misrecognition due to the orientation of the hand or the influence of ambient light.
- a main object of the present invention is to provide an interface device having an input function with high recognition accuracy.
- the interface device of the present invention includes, as one aspect thereof: projection means for projecting a first projection video; imaging means for imaging an area in which the first projection video is projected; and control means for, when the first projection video and an operation object both appear in the captured video captured by the imaging means, recognizing operation information by the operation object based on the relationship between the imaging position at which the first projection video appears in the captured video and the imaging position at which the operation object appears.
- as another aspect, the interface device of the present invention includes projection means for projecting a projection video and imaging means for imaging an area in which the projection video is projected.
- the control device of the present invention receives a captured video in which a projection video projected by projection means to be controlled appears and, when an operation object appears in the captured video together with the projection video, recognizes operation information by the operation object based on the relationship between the imaging position at which the projection video appears in the captured video and the imaging position at which the operation object appears, and controls the projection means according to the operation information.
- one aspect of the module of the present invention includes the above-described control device of the present invention and projection means controlled by the control device.
- the control method of the present invention receives a captured video in which a projection video projected by projection means to be controlled appears; when an operation object appears in the captured video together with the projection video, recognizes operation information by the operation object based on the relationship between the imaging position at which the projection video appears in the captured video and the imaging position at which the operation object appears; and controls the projection means according to the operation information.
- the program storage medium of the present invention holds a computer program that causes a computer to execute: processing of receiving a captured video in which a projection video projected by projection means to be controlled appears and, when an operation object appears in the captured video together with the projection video, recognizing operation information by the operation object based on the relationship between the imaging position at which the projection video appears in the captured video and the imaging position at which the operation object appears; and processing of controlling the projection means according to the operation information.
- the object of the present invention is also achieved by the control method of the present invention corresponding to the interface device of the present invention. Furthermore, the object of the present invention is also achieved by a computer program corresponding to the interface device and the control method of the present invention, and by a program storage medium storing the computer program.
- an interface device having an input function with high recognition accuracy can be provided.
- FIG. 1 is a block diagram showing the configuration of the interface apparatus according to the first embodiment of the present invention.
- the interface device 100 includes an imaging unit (imaging means) 110, a control unit (control means) 120, and a projection unit (projection means) 130.
- the projection unit 130 has a function of projecting the projection image 300 determined by the control unit 120 to the position or direction determined by the control unit 120.
- an image projected by the projection unit 130 is expressed as a “projection image”.
- a surface on which the projected image is projected is expressed as an operation surface 200.
- FIG. 1 as an example of the projected image 300, an arrow image indicating the vertical and horizontal directions is shown.
- the whole image projected by the projection unit 130 is expressed as a "projection video", and a part of the image projected by the projection unit 130 (for example, the left-arrow portion in the example of the projection video 300 shown in FIG. 1) may also be expressed as a "projection video".
- the light projected by the projection unit 130 is visible light. Note that light projected by the projection unit 130 may include ultraviolet light or infrared light in addition to visible light.
- the operation object 400 is an object that points to a projection image by the projection unit 130, and is, for example, a user's hand or finger using the interface device 100.
- the operation object 400 may be an object such as a pen, or may be light projected from a laser pointer.
- the operation object 400 is not limited to a finger or a hand; any appropriate object that can be imaged by the imaging unit 110 may be adopted. Note that, for convenience, even when light from a laser pointer or the like as described above is used as the means for pointing to the video projected by the projection unit 130, that light is also regarded here as being included in the operation object 400.
- the imaging unit 110 is configured by a camera.
- the imaging unit 110 can capture an image including both the projection image 300 and the operation object 400.
- an image captured by the imaging unit 110 may be expressed as a “captured image”.
- the imaging unit 110 captures visible light.
- the imaging unit 110 may be configured by a camera that can capture not only visible light but also ultraviolet light or infrared light.
- the imaging unit 110 may include other functional devices such as a 3D depth measurement device.
- the imaging unit 110 is arranged in consideration of the positional relationship with the projection unit 130 so that an image including both the projection image 300 projected on the operation surface 200 and the operation object 400 indicating the projection image 300 can be captured.
- the imaging unit 110 and the projection unit 130 are installed in close proximity to each other. This is because it is advantageous from the viewpoint of miniaturization of the interface device 100.
- since it is preferable that the angle of view of the imaging unit 110 and the angle of view of the projection unit 130 be aligned, installing the imaging unit 110 and the projection unit 130 close to each other is also advantageous from this point of view.
- the angle of view of the imaging unit 110 is indicated by a dotted line
- the angle of view of the projection unit 130 is indicated by a one-dot chain line.
- the control unit 120 performs image processing on the video captured by the imaging unit 110 and thereby calculates the positional relationship between the position (imaging position) at which an image such as a character or figure included in the projection video 300 appears and the position (imaging position) at which the operation object 400 appears.
- the control unit 120 recognizes the operation information by the operation object 400 by detecting which image of the projection image 300 the operation object 400 indicates based on the calculated positional relationship.
- the interface device 100 is provided in advance with information on the projection videos to be sequentially projected according to operations, in a state associated with the corresponding operation information.
- the control unit 120 determines the projection video 300 to be projected next by the projection unit 130, and its projection position, based on the recognized operation information and the information on the projection videos associated with the operation information. The control unit 120 then controls the projection unit 130 so that the determined projection video 300 is projected at the determined projection position.
- the user of the interface apparatus 100 inputs operation information to the interface apparatus 100 by changing the relative position of the operation object 400 with respect to the projection image 300 projected on the operation surface 200, for example.
- a typical example of the operation by the operation object 400 is that the user moves the operation object 400 on the operation surface 200 on which the projection image 300 is projected.
- as long as the operation object 400 can be imaged by the imaging unit 110 together with the projection video 300, the user may move the operation object 400 at a position spaced apart from the operation surface 200.
- FIG. 2 is a block diagram illustrating details of the control unit 120.
- the control unit 120 includes a first detection unit (detection means) 121, a second detection unit (detection means) 122, a processing unit (processing means) 123, a video determination unit (video determination means) 124, and a position determination unit (position determination means) 125.
- the first detection unit 121 has a function of detecting the position (imaging position) where the operation object 400 is reflected in the captured image by image processing.
- the first detection unit 121 may have a function of detecting the movement (for example, speed or acceleration) of the operation object 400 using, for example, an object tracking process.
- the first detection unit 121 may have a function of detecting the shape of the operation object 400 by, for example, contour extraction processing. For example, when the operation object 400 is a user's hand using the interface device 100, the first detection unit 121 may have a function of detecting the shape of the hand.
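- As a concrete illustration of the kind of image processing the first detection unit 121 could use, the following is a minimal sketch of fingertip detection based on a skin-colour threshold and contour extraction. It assumes OpenCV and Python; the patent does not prescribe any particular detection algorithm, and the colour range, library calls, and function name are illustrative assumptions only.

```python
import cv2

def detect_fingertip(frame_bgr):
    """Return the (x, y) imaging position of a fingertip, or None.

    Illustrative only: a rough skin-colour threshold plus contour
    extraction, one of many ways a detection unit could locate the
    operation object in the captured video.
    """
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Rough skin-colour range in HSV; real lighting would need tuning.
    mask = cv2.inRange(hsv, (0, 40, 60), (25, 180, 255))
    # OpenCV 4 signature: returns (contours, hierarchy).
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)            # largest blob = hand
    tip = min(hand.reshape(-1, 2), key=lambda p: p[1])   # topmost point = fingertip
    return int(tip[0]), int(tip[1])
```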
- the second detection unit 122 has a function of detecting, by image processing, the position (imaging position) at which an image such as a character or figure included in the projection video 300 (that is, an image related to operation information) appears in the captured video.
- based on the imaging position information of the operation object 400 from the first detection unit 121 and the imaging position information of the video from the second detection unit 122, the processing unit 123 has a function of detecting the positional relationship between an image included in the projection video 300 and the operation object 400 in the captured video. The processing unit 123 then has a function of recognizing the operation information by the operation object 400 based on the detected positional relationship (in other words, detecting what instruction the user has issued using the operation object 400). Further, the processing unit 123 may have an object recognition function of recognizing an object appearing in the captured video based on, for example, contour data of the object and contour information obtained by performing image processing on the captured video.
- the processing unit 123 may also recognize the operation information by the operation object 400 based on, for example, data in which movements of the operation object 400 are associated with corresponding operation information, and on the information on the movement of the operation object 400 obtained by the first detection unit 121.
- the video determination unit 124 has a function of determining the projection video 300 to be projected next by the projection unit 130, based on the collective data of projection videos 300 corresponding to operation information and on the operation information recognized by the processing unit 123.
- the position determination unit 125 has a function of determining the projection position (or projection direction) of the projection video 300 determined by the video determination unit 124, based on data representing the projection positions (or projection directions) of projection videos 300 projected according to operation information and on the operation information recognized by the processing unit 123. In addition, the position determination unit 125 has a function of controlling the projection unit 130 so that the projection video 300 determined by the video determination unit 124 is projected at the determined position or in the determined direction.
- as described above, the control unit 120 has the configuration shown in FIG. 2.
- the control unit 120 may have a function of exchanging information with another device.
- the control unit 120 is not necessarily installed in the vicinity of the imaging unit 110 or the projection unit 130.
- the imaging unit 110 and the projection unit 130 may be incorporated in a mobile device including a power source and a wireless unit, and the control unit 120 may be implemented in a device different from the mobile device.
- the control unit 120 implements the above-described functions by communicating with the imaging unit 110 and the projection unit 130 using wireless communication technology.
- FIG. 3 is a diagram illustrating an example of a captured image captured by the imaging unit 110.
- in the captured video, a projection video 303 and an operation object 400 appear.
- the projected video 303 in FIG. 3 includes a plurality of key-like videos and provides an interface for inputting characters to the user.
- the processing unit 123 recognizes the operation information by the operation object 400 based on the positional relationship between the imaging position of the operation object 400 and the imaging position of the image included in the projection image 303 in the captured image. In the example of FIG. 3, the processing unit 123 recognizes that the operation object 400 has selected (pointed to) the key “SPC (space)”.
- the first detection unit 121 detects the imaging position of the tip (fingertip) of the operation object (finger) 400 by performing image processing on the captured video. Based on the positional relationship between the imaging position of the fingertip of the operation object 400 and the imaging position of the image of each key included in the projection video 303, the processing unit 123 detects which key in the projection video 303 the fingertip of the operation object 400 overlaps. The processing unit 123 then recognizes the key overlapping the fingertip as the key selected (pointed to) by the operation object 400. Accordingly, the processing unit 123 can recognize from the captured video shown in FIG. 3 that the operation object 400 has selected the key "SPC (space)".
- the processing unit 123 recognizes which key is selected by the operation object 400 based on, for example, the area of the portion where the operation object 400 and the key overlap in the captured video. In this case, the control unit 120 is given data representing the area of each key or the outline of each key. Based on that data, the information on the contour of the operation object 400 detected by the first detection unit 121, and the positional relationship between the imaging position of each key and the imaging position of the operation object 400, the processing unit 123 calculates, for example, the area of the overlapping portion between the operation object 400 and each key. The processing unit 123 then detects (recognizes) the key selected by the operation object 400 based on the calculated area of the overlapping portion and a predetermined rule.
- in the example of FIG. 3, about 70% to 80% of the area of the key "SPC (space)" overlaps the operation object 400, whereas almost the entire area of the key "SEL (select)" and of the key "RET (return)" overlaps the operation object 400.
- for example, the processing unit 123 is given a rule such as "a key in which 60% or more and less than 90% of the key area overlaps the operation object 400 is regarded as selected by the operation object 400".
- in this case, the processing unit 123 can detect that the operation object 400 has selected the key "SPC (space)" based on the area of the overlapping portion between the operation object 400 and each key and on this rule.
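- The area-overlap rule above can be sketched directly. The following is a minimal illustration, assuming the key regions and the operation-object silhouette have already been obtained as boolean pixel masks in captured-video coordinates; the mask representation, the thresholds as arguments, and the function name are assumptions for illustration, not elements defined by the patent.

```python
import numpy as np

def select_key_by_overlap(object_mask, key_regions, low=0.60, high=0.90):
    """Return the key whose area is covered by the operation object within [low, high).

    object_mask : boolean HxW array marking pixels where the operation object appears.
    key_regions : dict mapping a key label to a boolean HxW array of that key's
                  imaged area in the captured video.
    """
    for label, key_mask in key_regions.items():
        key_area = key_mask.sum()
        if key_area == 0:
            continue
        overlap_ratio = np.logical_and(object_mask, key_mask).sum() / key_area
        # Rule from the text: 60% or more but less than 90% of the key's
        # area overlapping the operation object counts as "selected".
        if low <= overlap_ratio < high:
            return label
    return None
```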
- in this way, the control unit 120 recognizes the operation information by the operation object 400 based on the positional relationship between the imaging position of the image included in the projection video 300 and the imaging position of the operation object 400 in the captured video.
- the interface device 100 according to the first embodiment can recognize operation information from the operation object 400 with only one imaging unit 110 and one projection unit 130 mounted. That is, since the interface device 100 according to the first embodiment can be realized with a small number of parts, downsizing, weight reduction, and power saving can be promoted.
- the operation object (finger) 400 does not necessarily need to be in direct contact with the operation surface 200. That is, as long as the imaging unit 110 can capture a video in which the operation object 400 appears to indicate the portion of the projection video 300 corresponding to the operation content, the control unit 120 can recognize the operation information by the operation object 400 even if the operation object 400 is not in contact with the operation surface 200.
- in a touch panel interface device, the user's finger directly contacts the operation surface. For this reason, when a touch panel is used by an unspecified number of users, the fingers of those unspecified users touch the operation surface of the touch panel. In this case, a fastidious user cannot use the touch panel with peace of mind. In addition, there is a risk of spreading infectious diseases through a soiled touch panel.
- in contrast, with the interface device 100 the user does not have to directly touch the operation surface 200 with a finger, so the interface device 100 is excellent in terms of hygiene. Thereby, even a fastidious user can use the interface device 100 with peace of mind. Further, unlike a touch panel interface device, the interface device 100 can be used without any problem by a user wearing gloves or the like.
- if the optical axis of the imaging unit 110 and the optical axis of the projection unit 130 were coaxial, the center position of the projection video 300 by the projection unit 130 would coincide with the center position of the video captured by the imaging unit 110.
- in that case, the appearance of the projection video 300 in the captured video by the imaging unit 110 would not change, and the imaging position of the video included in the projection video 300 would not change.
- in practice, however, the optical axis of the imaging unit 110 and the optical axis of the projection unit 130 are not coaxial.
- for this reason, the imaging position of the video included in the projection video 300 in the captured video changes according to changes in the distance between the interface device 100 and the operation surface 200 and changes in the inclination of the operation surface 200.
- the control unit 120 therefore needs to grasp the positional relationship in three-dimensional space between the imaging unit 110 and the operation surface 200 in order to accurately grasp, from the captured video by the imaging unit 110, the portion of the projection video 300 indicated by the operation object 400.
- the process in which the control unit 120 grasps the positional relationship between the imaging unit 110 and the operation surface 200 in three-dimensional space is hereinafter referred to as "calibration processing".
- the calibration processing in the first embodiment is performed as follows, using the projection video 300 projected onto the operation surface 200 rather than a pattern drawn on the operation surface 200 itself.
- FIG. 4 is a diagram for explaining calibration processing in the first embodiment.
- the field angle of the imaging unit 110 is represented by a dotted line
- the field angle of the projection unit 130 is represented by a one-dot chain line.
- since the optical axis of the imaging unit 110 and the optical axis of the projection unit 130 are not coaxial (not arranged on the same straight line), the center position of the imaging range of the imaging unit 110 and the center position of the projection video by the projection unit 130 differ. For this reason, even if the projection direction of the projection video 300 by the projection unit 130 does not change, when the position of the operation surface 200 changes, the imaging position of the video included in the projection video 300 in the captured video by the imaging unit 110 also changes.
- for example, when the operation surface 200 is at one position, the projection video (dot video) 304 is projected at the position 304A shown in FIG. 4, and in the captured video the projection video 304 appears above the center in the vertical direction.
- when the operation surface 200 is at another position, the projection video 304 is projected at the position 304B, and in the captured video the projection video 304 appears at a substantially central portion in the vertical direction.
- the imaging position of the video included in the projection video 300 in the captured video differs depending on the position and inclination of the operation surface 200.
- the control unit 120 according to the first embodiment performs calibration processing using the difference (shift) in the imaging position. Thereby, the interface apparatus 100 grasps the positional relationship between the imaging unit 110 and the operation surface 200 in the three-dimensional space.
- the control unit 120 can grasp the positional relationship in the three-dimensional space between the imaging unit 110 and the operation surface 200 based on the imaging position of the video in the captured video.
- for example, a calculation formula is given to the interface device 100 in advance. This calculation formula calculates the positional relationship in three-dimensional space between the imaging unit 110 and the operation surface 200 based on the imaging positions, in the captured video by the imaging unit 110, of images such as characters and figures included in the projection video 300.
- instead of the calculation formula, a lookup table may be given to the interface device 100. Such a calculation formula or lookup table is obtained by the following work.
- the positional relationship in three-dimensional space between the interface device 100 and the operation surface 200 is obtained at a plurality of points. Then, for example, while changing the positional relationship, the interface device 100 measures (detects) the imaging position, in the captured video by the imaging unit 110, of some of these points (hereinafter referred to as measurement points). The interface device 100 is also given in advance a format for generating the calculation formula or the lookup table that computes the positional relationship. The interface device 100 generates the above-described calculation formula or lookup table based on the format and the data obtained by measurement (detection).
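- A minimal sketch of the lookup-table idea is shown below: the imaging position of a projected point is recorded at several known distances to the operation surface, and at run time the distance is estimated by interpolating between those measurements. The numbers, the use of only the vertical pixel coordinate, and the interpolation method are illustrative assumptions; the patent does not specify the form of the formula or table.

```python
import numpy as np

# Hypothetical calibration measurements: at each known distance to the
# operation surface (mm), the vertical imaging position (pixel row) at which
# a projected dot appeared in the captured video was recorded.
measured_rows      = np.array([120.0, 180.0, 230.0, 270.0])   # pixel rows
measured_distances = np.array([150.0, 250.0, 350.0, 450.0])   # mm

def estimate_distance(observed_row):
    """Interpolate the lookup table to estimate the surface distance (mm)."""
    return float(np.interp(observed_row, measured_rows, measured_distances))

print(estimate_distance(205.0))  # 300.0 mm for this made-up table
```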
- the interface device 100 performs calibration processing based on the imaging position of the image projected on the operation surface 200. For this reason, the interface device 100 can easily execute the calibration process even if the operation surface 200 is not drawn with a pattern or the like that becomes a mark when performing calibration.
- once the interface device 100 has performed the calibration processing and obtained the positional relationship in three-dimensional space between the imaging unit 110 and the operation surface 200, it can identify the actual position of the operation surface 200 simply by analyzing the imaging position of the projection video. Therefore, the interface device 100 can easily execute subsequent calibration processing. Even if the positional relationship with the operation surface 200 changes dynamically, the interface device 100 can keep track of the positional relationship in three-dimensional space between the imaging unit 110 and the operation surface 200 by performing the calibration processing periodically.
- FIG. 5 is a diagram illustrating an example of an image projected by the projection unit 130 that is used by the control unit 120 for the calibration process.
- the projection video 305 shown in FIG. 5 is the same projection video as the projection video 303 shown in FIG. 3.
- the control unit 120 uses intersections of the lines in the projection video 305 (for example, intersections of the lines that delimit the keys) for the calibration processing. For example, the control unit 120 uses the points 3051, 3052, 3053, and 3054 shown in FIG. 5 for the calibration processing.
- the reason for using such points is that a point on the outermost contour of the video changes its imaging position more, with respect to fluctuations in the position and inclination of the operation surface 200, than a point near the center of the video, which improves the accuracy of the calibration processing.
- the control unit 120 detects the positional relationship between the imaging unit 110 and the operation surface 200 in the three-dimensional space based on the imaging positions of the points 3051, 3052, 3053, and 3054 in the captured image as described above.
- FIG. 6 is a diagram illustrating another example of the projected image by the projection unit 130 used by the control unit 120 for the calibration process.
- the projection video 306 shown in FIG. 6 includes an area 3061 (hereinafter referred to as the "operation area 3061") used for operation by the operation object 400 and a portion 3062 (hereinafter referred to as the "marker 3062") used for the calibration processing.
- the operation area 3061 is a video portion in which the interface device 100 recognizes operation information by the operation object 400 when pointed by the operation object 400.
- the video of the operation area 3061 is the same video as the projected video 305 shown in FIG.
- the projection image 306 includes a marker 3062 separately from the operation area 3061.
- the control unit 120 can execute the calibration process with high accuracy by executing the calibration process using the marker 3062 regardless of the shape of the operation region 3061.
- the markers 3062 are preferably a plurality of images (four in the example of FIG. 6) projected at intervals from one another in the peripheral portion of the projection video 306, at positions whose projection positions on the operation surface 200 can be specified.
- the shape of the marker 3062 is preferably, for example, a circle or an ellipse. The reason is that it is easy for the control unit 120 to calculate the center point of the marker.
- when the projection video 306 is projected onto the operation surface 200 in a posture inclined with respect to the optical axes of the imaging unit 110 and the projection unit 130 of the interface device 100, the projection video deforms as follows.
- the operation area 3061, which is originally rectangular, deforms into a trapezoid whose upper base is longer than its lower base.
- the four markers 3062, which are originally located at the corners of a virtual rectangle, come to be located at the corners of a virtual trapezoid. That is, the interval between the upper two markers 3062 becomes wider than the interval between the lower two markers 3062.
- in this way, the relative positional relationship of the four markers 3062 changes depending on the inclination of the operation surface 200. The reason for such deformation is that the projection video 306 becomes larger as the distance from the interface device 100 to the operation surface 200 increases.
- therefore, by detecting that the relative positional relationship of the markers 3062 in the captured video has shifted (changed) from the reference relative positional relationship given for the plurality of markers 3062, the control unit 120 can recognize that the operation surface 200 is inclined.
- in this case, the control unit 120 controls the projection unit 130 so as to project a projection video 308 including an operation area 3061 that has been deformed in advance, as illustrated in FIG. 8.
- by projecting the operation area 3061 with a shape that takes the inclination of the operation surface 200 into consideration in this way, the operation area 3061 is projected onto the operation surface 200 without distortion of its shape.
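- The pre-deformation described above is essentially a keystone (perspective) correction driven by the observed marker positions. The following is a simplified sketch, assuming OpenCV is available and treating an undistorted appearance of the operation area in the captured video as the goal; a rigorous treatment would model the projector-to-surface and surface-to-camera mappings separately, so the function below illustrates the idea rather than the patent's exact procedure.

```python
import cv2
import numpy as np

def prewarp_operation_area(area_img, marker_ref, marker_obs):
    """Pre-deform the operation area so it appears undistorted on a tilted surface.

    marker_ref : 4x2 float32 array, reference imaging positions of the four
                 markers (surface facing the device squarely).
    marker_obs : 4x2 float32 array, marker imaging positions actually observed
                 in the captured video (trapezoidal when the surface is tilted).
    """
    # Homography mapping reference marker positions to the observed ones,
    # i.e. the distortion introduced by the tilted operation surface.
    h_obs, _ = cv2.findHomography(marker_ref, marker_obs)
    # Warping the source image with the inverse mapping pre-compensates the
    # distortion, so the projected operation area appears rectangular again.
    h_inv = np.linalg.inv(h_obs)
    height, width = area_img.shape[:2]
    return cv2.warpPerspective(area_img, h_inv, (width, height))
```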
- the calibration process is performed using the deformation (distortion) of the projected image 300 due to the fluctuation of the inclination of the operation surface 200 or the like.
- by providing the marker 3062 used for the calibration processing separately from the operation area 3061, the following effect is obtained: the calibration processing can be executed using the markers 3062 while the video of the operation area 3061 is displayed without distortion on the inclined operation surface 200, as shown in FIG. 9.
- in contrast, if the entire projection video 300 were the operation area 3061, part of the operation area 3061 would have to be allowed to deform according to the inclination of the operation surface 200 for the sake of the calibration processing, and it would be difficult to display the remaining part on the operation surface 200 without deformation.
- with the separate marker 3062, the calibration processing can be executed using the marker 3062, and the operation area 3061 can be projected onto the operation surface 200 without distortion.
- the interface apparatus 100 may project the projection image 300 that allows the user to recognize that the calibration process has been executed.
- FIG. 10 is a diagram showing an example.
- for example, when the calibration processing is executed, the interface device 100 changes the marker 3062 from the form shown in FIG. 6 to the form shown in FIG. 10. By this change of the marker 3062, the user can recognize that the calibration processing has been executed.
- the marker 3062 does not necessarily have to be an image formed by visible light.
- for example, if the imaging unit 110 is configured by a camera capable of capturing infrared light, the marker 3062 may be an infrared image.
- in this case, the operation area 3061 may be an image formed by visible light, and the marker 3062 may be an image formed by infrared light.
- FIG. 11 is a flowchart illustrating an example of the operation of the interface apparatus 100.
- the interface device 100 detects the imaging position of the operation object 400 in the captured video (step S101). Based on this, the interface device 100 projects the projection video 300 onto the peripheral area of the operation object 400 (an area including the operation object 400) (step S102). Thereafter, the interface device 100 performs the calibration processing (step S103). As a result, the interface device 100 grasps the positional relationship between the operation surface 200 and the imaging unit 110. The interface device 100 then adjusts the projection direction and the like of the projection video 300 and, after this adjustment, projects a projection video 300 that is the same as or different from the projection video 300 projected in step S102. Since the operations from step S102 to the projection of the adjusted projection video 300 are completed in about one or two frames, the projection video 300 may appear to switch instantaneously from the user's perspective.
- next, the interface device 100 detects the positional relationship between the imaging position of the operation object 400 and the imaging position of the projection video 300 in the captured video (step S104). Then, the interface device 100 recognizes the operation information by the operation object 400 based on data associating images such as characters and figures included in the projection video 300 with corresponding operation information, and on the detected positional relationship (step S105).
- the interface apparatus 100 determines the projection image 300 to be projected next, the projection direction, and the like based on the operation information recognized in step S105 (step S106). Then, the interface apparatus 100 projects the determined projection video 300 in the determined direction (step S107).
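- The overall flow of steps S101 to S107 can be summarized as a simple control loop. The sketch below is only structural: the camera, projector, content object, and the four helper callables are hypothetical placeholders for the processing described in this document, passed in as parameters so the skeleton stays self-contained.

```python
def interface_main_loop(camera, projector, content,
                        detect_object, calibrate, relate, recognize):
    """Rough control flow mirroring FIG. 11 (steps S101 to S107)."""
    frame = camera.capture()
    obj_pos = detect_object(frame)                         # S101: locate operation object
    projector.project(content.initial_video(), obj_pos)    # S102: project near it
    calibrate(camera.capture())                            # S103: grasp surface pose

    while True:
        frame = camera.capture()
        relation = relate(frame)                           # S104: object vs. projected video
        operation = recognize(relation)                    # S105: recognize operation info
        if operation is None:
            continue
        video, where = content.next_video(operation)       # S106: decide next video/direction
        projector.project(video, where)                    # S107: project it
```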
- the order of the operations described above in the interface device 100 is not limited to the order described above, and can be changed within a range where there is no problem.
- the operation of step S101 and the operation of step S102 described above are not necessarily performed in this order.
- for example, the interface device 100 may project the projection video 300 in a predetermined direction before detecting the position where the operation object 400 appears in the captured video.
- in the following description, the series of operations in which the control unit 120 determines the projection video 300 and its projection direction, and the projection unit 130 projects that projection video 300 in the determined projection direction under the control of the control unit 120, may be expressed simply as the projection unit 130 projecting the projection video 300.
- similarly, the series of operations in which the imaging unit 110 captures a video and the control unit 120 recognizes the operation information by the operation object 400 based on the relationship between the imaging position of the operation object 400 and the imaging position of the projection video 300 in the captured video may be expressed simply as the control unit 120 recognizing the operation information.
- in addition, pointing to the projection video 300 with the operation object 400 may be expressed as the operation object 400 operating the projection video 300. Furthermore, the function executed by the control unit 120 or the like when information is input to the interface device 100 by the operation object 400 operating the projection video 300 may be expressed as a function provided by the projection video 300.
- the operation of the interface device 100 may be described from the viewpoint of the user of the interface device 100 in order to facilitate understanding. Further, in the following description, even when the calibration process is performed, the description related to the calibration process may be omitted.
- the interface device 100 projects the projection video 312 onto the peripheral area of the operation object 400 (an area including the operation object 400). This is shown in FIG. 12. As described above, the interface device 100 first projects the projection video 312 onto the peripheral area of the operation object 400. Then, based on the imaging position of the projection video 312 in the captured video, the interface device 100 determines the positional relationship between the operation surface 200 and the imaging unit 110 (for example, the distance between the imaging unit 110 and the operation surface 200, and the inclination of the operation surface 200 with respect to the imaging unit 110).
- when the operation object 400 selects, for example, the character "B" in the projection video 312, the control unit 120 detects that the character "B" has been selected. Then, the control unit 120 controls the projection unit 130 so that another projection video 313, as shown in FIG. 13 and related to the selected character "B", is projected around the operation object 400. From the user's point of view, when the operation object 400 is moved so as to select the character "B", the projection video 312 disappears and another projection video 313 is displayed around the operation object 400.
- the projected image 313 is an image representing the selected character “B” and options related to the character “B”, that is, the characters “B1”, “B2”, “B3”, and “B4”.
- the projected video 313 is a video that displays options related to the options selected in the projected video 312.
- the interface device 100 is provided with, for example, information on a plurality of different projection images including an option image and information for controlling the display order of the projection images.
- when the control unit 120 detects, in the captured video including the projection video 313, that the operation object 400 moves in the direction toward the image of the character "B3" in the projection video 313, the control unit 120 recognizes the character "B3" as the input result (selection result (information)). The control unit 120 then controls the projection unit 130 so that the projection video 300 corresponding to this input is projected.
- on the other hand, the operation object 400 may move in a direction away from the character images. In that case, the control unit 120 regards none of the options displayed in the projection video 313 as having been selected and, without recognizing any operation information, controls the projection unit 130 so that the projection video 312 is again projected around the operation object 400.
- the control unit 120 first detects selection information by the operation object 400 for the first projection image 300 (projection image 312). Then, in accordance with the selection information, the control unit 120 controls the projection unit 130 to project the second projection image 300 (projection image 313) in order to obtain the next selection information. Thereafter, the control unit 120 recognizes the operation information by the operation object 400 by detecting selection information of the operation object 400 with respect to the second projection image 300 (projection image 313).
- the interface device 100 in the first specific example is configured to obtain operation information through a multi-step operation, and thus the size of the projection video 300 can be kept small even when the number of options is large, because all of the options do not have to be displayed in the first projection video 300. If the size of the projection video 300 were large, inconveniences could arise: the projection video 300 might not fit within the angle of view of the imaging unit 110, it might not be projectable onto a small operation surface 200, or the operation object 400 might have to be moved greatly, making operation difficult. The interface in the first specific example can prevent such inconvenient situations by adopting a configuration in which options are displayed sequentially in multiple stages.
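- The multi-stage option display can be driven by a small tree of videos and their display order, in the spirit of the projection videos 312 and 313 above. The data below is a hypothetical toy example; the patent only states that information on the projection videos and their display order is provided to the device, without fixing its form.

```python
# Hypothetical two-stage option data: selecting "B" in the first-stage video
# brings up a second-stage video listing "B1".."B4", as in FIGs. 12 and 13.
OPTION_TREE = {
    "A": ["A1", "A2", "A3", "A4"],
    "B": ["B1", "B2", "B3", "B4"],
    "C": ["C1", "C2", "C3", "C4"],
}

def next_options(selection, tree=OPTION_TREE):
    """Return the options to project next, or None if the selection is final."""
    return tree.get(selection)

print(next_options("B"))   # ['B1', 'B2', 'B3', 'B4']
print(next_options("B3"))  # None -> "B3" is taken as the final input result
```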
- a projection video 314 shown in FIG. 14 is a video showing a plurality of key-like images, similar to the projection video 305 shown in FIG. 5, and provides the user with an interface related to character input.
- the projection video 314 is projected by the projection unit 130 as the main video.
- the control unit 120 detects the positional relationship between the imaging position of the operation object (fingertip) 400 and the imaging position of each key included in the projection video 314 in the captured video. For example, the control unit 120 detects the position of the tip of the operation object 400 in the captured video by image processing. Then, the control unit 120 recognizes what operation the operation object 400 is performing based on the following rule (criterion).
- the rule is, for example, a rule such as "when a state in which a certain key overlaps the tip of the operation object 400 has continued for a predetermined time or longer, the operation object 400 is regarded as selecting the key overlapping that tip".
- a rule with this content is hereinafter also referred to as the first rule.
- for example, based on the first rule, the control unit 120 recognizes that the key "JKL" has been selected by the operation object 400. Based on this recognition, the control unit 120 controls the projection unit 130, whereby the interface device 100 projects a projection video as shown in FIG. 15.
- the video 315 related to the selected key is displayed on the projected video 314 that is the main video.
- the video 315 is a video that presents to the user, in expanded form, the key "J", the key "K", and the key "L", which are the options related to the key "JKL" selected by the operation object 400.
- when the control unit 120 detects that the operation object 400 has quickly moved in the direction toward the key (option) "K", the control unit 120 detects that input of the character "K" has been instructed.
- the control unit 120 may recognize what operation the operation object 400 is performing based on the following rules.
- the rule is, for example, a rule such as "while the projection video 315 is projected, when the operation object 400 moves toward one of the keys displayed in the projection video 315 at a speed (or acceleration) equal to or higher than a threshold value and then stops at a position overlapping that key, the operation object 400 is regarded as selecting that key".
- a rule with this content is hereinafter also referred to as the second rule.
- in this case, the control unit 120 has a function of calculating the speed (or acceleration) of the operation object 400 using, for example, tracking processing based on image processing.
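- A minimal sketch of the second rule is given below: the fingertip speed is estimated from successive imaging positions, and a fast move toward an option followed by a stop on it is treated as a selection. The thresholds, frame rate, and the exact combination of conditions are illustrative assumptions only.

```python
import math

def fingertip_speed(prev_pos, cur_pos, dt=1 / 30):
    """Speed of the fingertip in pixels per second between two consecutive frames."""
    dx, dy = cur_pos[0] - prev_pos[0], cur_pos[1] - prev_pos[1]
    return math.hypot(dx, dy) / dt

def second_rule_selected(track, option_box, speed_threshold=600.0):
    """Apply a sketch of the second rule to a short track of fingertip positions.

    track      : list of (x, y) imaging positions, oldest first.
    option_box : (x0, y0, x1, y1) imaged rectangle of one option of video 315.
    """
    if len(track) < 2:
        return False
    peak_speed = max(fingertip_speed(a, b) for a, b in zip(track, track[1:]))
    x, y = track[-1]
    stopped_on_key = option_box[0] <= x <= option_box[2] and option_box[1] <= y <= option_box[3]
    ended_slow = fingertip_speed(track[-2], track[-1]) < speed_threshold / 4
    return peak_speed >= speed_threshold and stopped_on_key and ended_slow
```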
- the interface device 100 is preliminarily provided with a number of videos that are sequentially displayed based on selection information by the operation object 400 and information related to the display order in which the videos are displayed.
- depending on the detected movement of the operation object 400, the control unit 120 may determine that the operation object 400 has selected not the key "K" but the key "@ # / &", which overlaps the key "K". In this case, the control unit 120 does not recognize that the character "K" has been selected.
- when the control unit 120 detects that the key "@ # / &" has been selected, the control unit 120 controls the projection unit 130 so that, as described above, a video presenting to the user the expanded options related to the key "@ # / &" is projected near the key "@ # / &" (around the operation object 400).
- the interface apparatus 100 also detects the movement (speed or acceleration) of the operation object 400 in addition to the positional relationship between the imaging position of the operation object 400 and the imaging position of the projection image 300. As a result, the operation information by the operation object 400 is recognized.
- the rule (first rule) for determining the operation content of the operation object 400 with respect to the projection image 314 (first projection image) and the rule (second rule) for determining the operation content of the operation object 400 with respect to the projection image 315 (second projection image) are different from each other.
- the interface device 100 in the second specific example recognizes the operation information by the operation object 400 in consideration of not only the positional relationship in the captured image but also the movement of the operation object 400. For this reason, in addition to the effect in the first specific example, the interface device 100 in the second specific example has an effect of further reducing erroneous input. The reason will be described below.
- the operation object 400 may continue to move on the projection video 314 for a while in a state where the projection video 314 as shown in FIG. 14 is projected. In such a case, it is assumed that the operation object 400 comes over the key “JKL”, for example.
- the control unit 120 uses the positional relationship in the captured image to determine whether the operation object 400 has selected the key “JKL” or is only passing the key “JKL”. It is difficult to make an accurate judgment only by doing.
- suppose that the interface device 100 determines that the operation object 400 has selected the key "JKL" even though it is only passing over the key "JKL". In this case, a character is erroneously input into the interface device 100, and the user must perform an operation such as deleting the erroneously input character. That is, the interface device 100 may cause the user discomfort due to the complexity of the operation.
- in contrast, in the second specific example, even in such a case the interface device 100 only develops the video presenting the options related to that key (projected video 315). In this case, the user can reselect another key in the projection image 314 simply by slowly moving the operation object 400 toward the other key. The user can then input the desired character by quickly moving the operation object 400 toward the desired key in the projected video 315 displayed by the interface device 100.
- the projection video 315 is displayed so as to overlap the projection video 314, and the number of options (keys) included in the projection video 315 is smaller than the number of options (keys) included in the projection video 314.
- the control unit 120 can detect the operation “the operation object 400 has moved quickly to the key” with almost no error.
- there are various methods by which the control unit 120 detects that a state in which the imaging position of a certain key and the imaging position of the tip of the operation object 400 overlap in the captured image (key selection state) has continued for a predetermined time or longer, and an appropriately selected method is adopted.
- for example, the control unit 120 detects that a state in which the imaging position of the tip of the operation object 400 does not change in the captured image has continued for a predetermined number N of frames (N is a positive integer). Then, using this detection as a trigger, the control unit 120 analyzes the captured image of the next frame (the (N + 1)th frame after the imaging position of the tip of the operation object 400 stopped changing), that is, a captured image as a still image. Thereby, the control unit 120 detects the positional relationship between the imaging position of the key and the imaging position of the tip of the operation object 400 in the captured image, and detects the above-described key selection state based on the detected positional relationship.
- alternatively, the control unit 120 may analyze the captured image as a moving image by using a known moving-image recognition processing technique, and thereby detect that a state in which the key and the tip of the operation object 400 overlap (key selection state) has continued for a predetermined time or longer.
- there are also various methods by which the control unit 120 detects that the operation object 400 has moved toward a key at a speed equal to or higher than a threshold in the captured image, and a method appropriately selected from these methods is adopted.
- for example, the control unit 120 analyzes the captured image of each frame (that is, a captured image as a still image), tracks the imaging position of the operation object 400 across the frames, and thereby detects the moving speed of the operation object 400. The control unit 120 also detects the moving direction of the operation object 400 and the imaging position of each key included in the projection video 315. Then, based on a comparison between the detected moving speed and the threshold value, and on the moving direction of the operation object 400 and the imaging positions of the keys, the control unit 120 detects that the operation object 400 is moving toward a certain key at a speed equal to or higher than the threshold.
- control unit 120 detects the moving speed of the operation object 400 by analyzing a captured image as a moving image using a known moving image recognition processing technique. Then, using the detected moving speed, the control unit 120 may detect that the operating object 400 is moving toward a certain key at a speed equal to or higher than a threshold, as described above. Note that the method (operation) by which the control unit 120 detects such movement of the operation object 400 is not limited to such a specific example.
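- as an illustration of the second rule, the sketch below estimates the fingertip speed and heading from positions tracked in recent frames and checks whether the motion points at a key faster than a threshold; the threshold values, frame rate, and function names are assumptions made for this sketch, not values taken from the embodiment.

```python
# Hypothetical sketch of the second rule: detect a fast movement of the fingertip
# toward a projected key, using fingertip positions tracked over recent frames.
import math

def fast_move_toward_key(track, key_centers, speed_threshold=600.0,
                         frame_dt=1 / 30, angle_tolerance_deg=25.0):
    """track: list of (x, y) fingertip positions, oldest first (pixels).
    key_centers: dict mapping key label -> (x, y) key center in the captured image.
    Returns the label of the key the fingertip is rushing toward, or None."""
    if len(track) < 2:
        return None
    (x0, y0), (x1, y1) = track[-2], track[-1]
    vx, vy = (x1 - x0) / frame_dt, (y1 - y0) / frame_dt
    speed = math.hypot(vx, vy)                      # pixels per second
    if speed < speed_threshold:
        return None
    heading = math.atan2(vy, vx)
    for label, (kx, ky) in key_centers.items():
        to_key = math.atan2(ky - y1, kx - x1)
        diff = abs((heading - to_key + math.pi) % (2 * math.pi) - math.pi)
        if math.degrees(diff) <= angle_tolerance_deg:
            return label
    return None

# Usage: the fingertip jumps toward the projected key "K".
print(fast_move_toward_key([(100, 100), (140, 100)],
                           {"K": (300, 100), "J": (100, 300)}))  # -> K
```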
- FIG. 17 is a diagram illustrating an example of a projection image 315 that is projected when the operation object 400 selects the key “;” () ”in the projection image 314.
- the interface device 100 is configured to project information (video) rather than display it on the screen of a display device, so the size of the projected video is less constrained. For this reason, the interface apparatus 100 can easily display a projected video 315 that protrudes beyond the projected video 314, as shown in FIG. 17.
- the interface in the third specific example is an interface that accepts numerical input.
- the user repeats a first operation for displaying a selection candidate among the numbers 0 to 9 and a second operation for selecting one of the numbers displayed by the first operation, whereby a number is input into the interface device 100.
- FIG. 18 is a diagram illustrating an example of a projection image 318 by the projection unit 130.
- the projected video 318 includes an area 318A (hereinafter referred to as the confirmed area 318A) for displaying a number whose input has been confirmed, and an area 318B (hereinafter referred to as the operation area 318B) used for recognizing an operation by the operation object 400.
- the operation area 318B is an image having a linear shape or a rod shape (strip shape).
- the operation area 318B defines a range in which the operation object 400 should move. That is, the user operates the interface apparatus 100 by moving the operation object 400 along the operation area 318B.
- the operation region 318B is not limited to the shape shown in FIG. 18. For example, the operation region 318B may be an image having a line or bar representing the range in which the operation object 400 should move, with circles arranged at intervals between its two ends.
- a first operation (an operation for displaying a selection candidate among the numbers 0 to 9) performed with the operation object 400 using the projection image 318 will be described with reference to FIGS. 19 and 20.
- the user selects a desired number from 0 to 9 by moving (sliding) the operation object 400 along the operation area 318B.
- the left end of the line (bar) in the operation area 318B corresponds to 0, the corresponding number increases in order from the left side to the right side of the line (bar), and the right end of the line (bar) corresponds to 9.
- FIGS. 19 and 20 show a state in which the selection-candidate numbers change in order as the operation object 400 slides from the left side to the right side of the operation area 318B.
- the operation object 400 is located on the left side of the operation area 318B.
- the control unit 120 grasps the positional relationship between the imaging position of the operation region 318B and the imaging position of the operation object 400 in the captured image, and based on the grasped positional relationship, determines that the operation object 400 is located at the position corresponding to the number "2". Then, the control unit 120 controls the projection unit 130, and in accordance with this control, the projection unit 130 projects an image 318C representing the number "2" in the vicinity of the operation object 400. At this point, the number "2" is only displayed as a selection candidate and has not yet been selected.
- similarly, the control unit 120 grasps the positional relationship between the imaging position of the operation region 318B and the imaging position of the operation object 400 in the captured image, and based on the grasped positional relationship, determines that the operation object 400 is located at the position corresponding to the number "8". Then, the control unit 120 controls the projection unit 130, and in accordance with this control, the projection unit 130 projects an image 318D representing the number "8" in the vicinity of the operation object 400. At this point, the number "8" is only displayed as a selection candidate and has not yet been selected.
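- the candidate digit can be derived directly from where the fingertip sits along the projected bar. The following is a minimal sketch of that mapping, assuming the bar's endpoints in captured-image coordinates are known from the projection; the coordinates and function name are illustrative only.

```python
# Hypothetical sketch: map the fingertip position along the projected operation
# area 318B (a bar whose left end means 0 and right end means 9) to a candidate digit.
def candidate_digit(tip_x, bar_left_x, bar_right_x, digits=10):
    """Return the selection-candidate digit (0..9) for a fingertip at tip_x,
    or None if the fingertip is outside the bar."""
    if not (bar_left_x <= tip_x <= bar_right_x):
        return None
    # Normalize the position to [0, 1] and quantize it into `digits` bins.
    ratio = (tip_x - bar_left_x) / (bar_right_x - bar_left_x)
    return min(int(ratio * digits), digits - 1)

# Usage: with the bar spanning x = 100..300 in the captured image,
# a fingertip at x = 270 corresponds to the candidate "8".
print(candidate_digit(270, 100, 300))  # -> 8
```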
- FIG. 21 is a diagram for explaining an example of an operation in which a number is selected by the operation object 400.
- the user slides the operating object 400 toward the number “8”.
- when the interface device 100 detects this operation (movement) of the operation object 400 using the result of image processing of the image captured by the imaging unit 110, it determines that the number "8" has been selected.
- then the interface device 100 (the control unit 120) controls the projection unit 130, and the projection unit 130 projects the number "8" onto the confirmed area 318A.
- at this point, the input of the number "8" into the interface device 100 may be confirmed immediately, or a step may be provided in which the user separately confirms the input to the interface device 100 after determining all digits of the desired number.
- FIG. 22 is a diagram for explaining another example of an operation in which a number is selected by the operation object 400.
- the user slides the operation object 400 in the direction from the number “8” toward the operation area 318B.
- when the interface device 100 detects this movement, it determines that the number "8" has been selected. Then, the interface device 100 (the control unit 120) controls the projection unit 130, and the projection unit 130 projects the number "8" onto the confirmed area 318A.
- alternatively, the user keeps the operation object 400 at the position corresponding to the selected number for a predetermined time (for example, 1.5 seconds) or longer.
- when the interface device 100 detects this state, it determines that the number "8" has been selected. Then, the interface device 100 (the control unit 120) controls the projection unit 130 to project an animation in which, for example, the number "8" expands and then disappears, as shown in FIG. 23. Thereafter, the number "8" is projected onto the confirmed area 318A. With this display, the user can clearly recognize that the number "8" has been selected.
- as described above, in the third specific example the interface apparatus 100 recognizes that a selection candidate has been chosen from a plurality of options based on the positional relationship between the imaging position of the operation object 400 in the captured image and the imaging position of the operation area (first projection image) 318B. Then, the interface apparatus 100 (the control unit 120) controls the projection unit 130 so as to project the images (second projection images) 318C and 318D corresponding to the selection-candidate option in the vicinity of the operation object 400. When the interface device 100 detects a predetermined operation by the operation object 400 for selecting the selection candidate while the selection-candidate images (second projection images) 318C and 318D are projected, it recognizes that the selection candidate has been selected.
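- to make this two-stage flow concrete, the following is a minimal sketch in which a candidate digit is held while the fingertip rests on the bar and is confirmed only by a separate gesture (a quick movement toward the candidate image, or a dwell of about 1.5 seconds); the thresholds, class name, and inputs are assumptions of the sketch, not values taken from this embodiment.

```python
# Hypothetical sketch of the two-stage recognition in the third specific example:
# stage 1 picks a candidate from the fingertip position on the bar,
# stage 2 confirms it only on a distinct gesture (fast move or long dwell).
class NumberInput:
    def __init__(self, dwell_confirm_s=1.5, frame_dt=1 / 30):
        self.candidate = None
        self.dwell_frames = 0
        self.need_frames = int(dwell_confirm_s / frame_dt)

    def on_frame(self, digit_under_tip, fast_move_toward_candidate):
        """digit_under_tip: digit the fingertip currently maps to (or None).
        fast_move_toward_candidate: True if a quick motion toward the projected
        candidate image was detected this frame. Returns a confirmed digit or None."""
        if digit_under_tip != self.candidate:
            self.candidate, self.dwell_frames = digit_under_tip, 0
            return None
        if self.candidate is None:
            return None
        self.dwell_frames += 1
        if fast_move_toward_candidate or self.dwell_frames >= self.need_frames:
            confirmed, self.candidate, self.dwell_frames = self.candidate, None, 0
            return confirmed
        return None

# Usage: the fingertip settles on "8", then a fast move toward the candidate confirms it.
ni = NumberInput()
ni.on_frame(8, False)          # candidate becomes 8, nothing confirmed yet
print(ni.on_frame(8, True))    # -> 8 (confirmed)
```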
- the interface apparatus 100 to which the third specific example is applied can prevent an erroneous input in which a number other than the number to be selected is erroneously selected.
- the control unit 120 of the interface device 100 in the fourth specific example is provided with a video processing engine that detects the shape of the hand (operation object 400) and detects the shape and movement of each finger.
- FIG. 24 is a diagram illustrating a state in which the projection image 324 is projected in the vicinity of the operation means (right hand) 400.
- the projection video 324 shown in FIG. 24 includes an area 324A, an area 324B, and an area 324C.
- the area 324A is an image having the same function as the projection image 314 shown in FIG. 14. However, unlike the projected image 314, the area 324A shown in FIG. 24 omits keys relating to operations such as "SPC (space)", the delete key, and "RET (return)".
- the area 324B is an image having the same function as the projected image 315 shown in FIG.
- the area 324C is an image representing keys related to operations such as "SPC (space)", the delete key, and "RET (return)".
- the control unit 120 distinguishes and detects the thumb 401 and the index finger 402 of the operation object 400 included in the captured image by using the video processing engine described above. Then, as shown in FIG. 24, the control unit 120 controls the projection unit 130 so that the area 324A and the area 324B are projected near the index finger 402 and the area 324C is projected near the thumb 401.
- the control unit 120 recognizes the operation information by the operation object 400 based on the positional relationship between the imaging position of the projection image 300 in the captured image and the imaging position of the thumb 401 or the index finger 402.
- the interface device 100 to which the fourth specific example described with reference to FIG. 24 is applied has an effect that high-speed input is possible in addition to the effect exhibited by the interface device 100 of the first specific example.
- the reason is that the control unit 120 detects the position and movement of the thumb 401 and the position and movement of the index finger 402, respectively, and recognizes the operation information by the operation object 400 based on the information.
- the interface in the fifth specific example is an interface that accepts input of words.
- the projected video 325 shown in FIG. 25 includes an area 325A, an area 325B, and an area 325C.
- the area 325A is an area having a function similar to that of the projected video 314 in the second specific example.
- the area 325B is an area for displaying characters input using the area 325A. In the example of FIG. 25, the character string “vege” is displayed in the area 325B.
- the region 325C is a region where a video providing a so-called predictive input function is projected. That is, in the example of FIG. 25, when the operation object 400 selects the key "SEL (select)" while the character string "vege" is displayed in the region 325B, a video 325C of word candidates whose input is predicted (input prediction candidates) is projected near the index finger 402. In the example of FIG. 25, word candidates beginning with "vege", namely "vegeburger", "vegemite", and "vegetable", are displayed (vegemite is a registered trademark).
- when the control unit 120 detects, in the manner described above, that "vegetable" has been indicated by the index finger 402, the interface apparatus 100 recognizes that the word "vegetable" has been input.
- the control unit 120 is provided with the same video processing engine as that in the fourth specific example.
- the interface device 100 of the fifth specific example has a configuration in which a word candidate displayed (projected) by the projected video 325C can be changed by moving the thumb 401 of the user.
- when the control unit 120 detects, based on the image captured by the imaging unit 110, that the thumb 401 has moved, the control unit 120 causes the projection unit 130 to project a projection image 325C representing other candidate words starting with "vege", as in FIG. 26. To the user, the word candidates thus appear to change from the candidates shown in FIG. 25 to the candidates shown in FIG. 26. The user inputs the word "vegetarian" into the interface device 100 by indicating "vegetarian" with the index finger 402.
- in other words, the control unit 120 does not always recognize the operation information from the operation object 400 based only on the positional relationship between the imaging position of the image included in the projected image 300 and the imaging position of the operation object 400 in the captured image. That is, as described in this fifth specific example, the control unit 120 may recognize the operation information from the operation object 400 upon detecting the movement of the thumb 401 regardless of that positional relationship, and thereby switch the projected image 325C.
- the character input interface in the fifth specific example described above can also be applied to, for example, a kanji conversion function in Japanese input.
- the interface in the sixth specific example will be described with reference to FIGS.
- the interface in the sixth specific example is an interface that provides a full keyboard function.
- the control unit 120 of the interface device 100 that realizes the sixth specific example has a function of detecting fingers of a plurality of hands in a captured image and detecting their movements.
- the control unit 120 detects the positions of the right hand 403 and the left hand 404 of the operation object 400 in the captured image. Then, the control unit 120 controls the projection unit 130 so that the projection video 327A is projected in the vicinity of the right hand 403 and the left hand 404.
- the projected video 327A includes a key “SP (space)”, a key “S (shift)”, a key “R (line feed)”, and a key “B (back space)”.
- the projected video 327A does not include the alphabet keys constituting the full keyboard.
- the control unit 120 controls the projection unit 130 so that the projection video 327B is projected in the vicinity of the middle finger of the left hand 404.
- the projection video 327B shown in FIG. 28 includes a key “e”, a key “d”, and a key “c”. These three keys are keys assigned to the middle finger of the left hand 404 in a normal full keyboard.
- when the control unit 120 detects, based on the captured image, that the middle finger of the left hand 404 has moved toward one of the keys while the projection image 327B is projected as described above, the control unit 120 recognizes the character corresponding to the key toward which the middle finger moved as the character to be input. Further, when the control unit 120 detects, based on the captured image, that the tip of the middle finger of the left hand 404 has moved further toward the palm, or that the tip of another finger has moved toward the palm, the control unit 120 stops displaying the projection image 327B.
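- as a rough illustration of how keys could be grouped per finger in this full-keyboard example, the sketch below keeps a finger-to-keys table and returns the keys to project when a particular finger is detected as active; only the two assignments named in this description are filled in, and the data structure and function names are assumptions of the sketch.

```python
# Hypothetical sketch: per-finger key assignment for the projected full keyboard.
# When the control logic detects that a finger has become active (for example,
# its tip moved toward the palm), it looks up the keys to project near that finger.
FINGER_KEYS = {
    ("left", "middle"): ["e", "d", "c"],                  # keys named in this example
    ("right", "index"): ["y", "h", "n", "u", "j", "m"],   # keys named in this example
    # ... the remaining fingers would follow the usual touch-typing assignment
}

def keys_to_project(hand: str, finger: str):
    """Return the list of key labels to project near the detected finger."""
    return FINGER_KEYS.get((hand, finger), [])

print(keys_to_project("left", "middle"))  # -> ['e', 'd', 'c']
```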
- six keys are assigned to the right index finger: the key "y", the key "h", the key "n", the key "u", the key "j", and the key "m".
- when the control unit 120 detects, based on the captured image, that the tip of the index finger of the right hand 403 has moved toward the palm, the control unit 120 controls the projection unit 130 so that a projection image 327C as illustrated in FIG. 29 is projected.
- the projected video 327C includes, for example, a “y” key and a “u” key.
- when the control unit 120 detects, based on the captured image, that the index finger of the right hand 403 has moved toward the key "y", the control unit 120 controls the projection unit 130 so that images of the keys associated with the key "y" are projected in the vicinity of the index finger of the right hand 403. The keys associated with the key "y" include, for example, the key "h" and the key "n", and the images of these keys are projected together with the image of the key "y" as described above.
- in addition, the control unit 120 controls the projection unit 130 so that a projected video 327D is projected.
- the projected video 327D includes an escape key (E), a tab key (T), a control key (Ctl), an alt key (A), a function key (F), and a delete key (D).
- when the control unit 120 detects that a projected key has been selected as described above, the control unit 120 recognizes the operation information from the operation object 400 based on the selected key and executes the operation (function) based on that operation information.
- the interface device 100 can display various options in the same manner as described above according to the movement of each of a plurality of fingers.
- the interface of the seventh specific example provided by the interface device 100 will be described with reference to FIGS. 31 to 34.
- the interface in the seventh specific example is an interface that provides a function corresponding to a mouse (a kind of input device for inputting information to a computer (auxiliary input device)).
- the projection unit 130 first projects a projection video 331A as shown in FIG. 31 on the operation surface 200 under the control of the control unit 120.
- the projected video 331A is, for example, a rectangular frame-shaped video.
- when the control unit 120 detects, based on the captured image, that the operation object (hand) 400 has been placed on the projection image 331A, the control unit 120 detects the imaging position of the entire operation object 400 and the imaging position of the index finger 402 in the captured image.
- the control unit 120 controls the projection unit 130 so that a projection image 331B as shown in FIG. 31 is projected in the vicinity of the index finger 402.
- the projected video 331B includes an “L button” and an “R button”.
- the “L button” in the projected video 331B provides a function corresponding to a general mouse L button.
- the “R button” in the projected video 331B provides a function corresponding to a general mouse R button.
- when the control unit 120 detects such a movement of the operation object 400 based on the captured image, the control unit 120 recognizes the change in the position information of the operation object 400 as operation information. Thereby, the interface according to the seventh specific example provides the same function as the position-information input function of a mouse or the like.
- further, the control unit 120 detects the movement of the operation object 400 based on the captured image and controls the projection unit 130 so that the projection position of the projection image 331B follows that movement.
- for example, when the control unit 120 detects, based on the captured image, that the index finger of the operation object 400 has moved in the direction of the "L button" in the projection image 331B, the control unit 120 executes a function similar to the function corresponding to a left click of the mouse. Similarly, when the control unit 120 detects, based on the captured image, that the index finger of the operation object 400 has moved in the direction of the "R button" in the projection image 331B, the control unit 120 executes a function similar to the function corresponding to a right click of the mouse. In the same manner, the control unit 120 may execute functions similar to those corresponding to, for example, a double click or a drag operation of the mouse by detecting the movement of the finger.
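- the sketch below shows one way such gestures could be turned into pointer events: hand displacement becomes pointer movement, and an index-finger move into the projected "L button" or "R button" region becomes a click. The event names and the callback interface are assumptions, not part of the embodiment.

```python
# Hypothetical sketch: translate detected hand/finger gestures into mouse-like events.
class ProjectedMouse:
    def __init__(self, l_button_region, r_button_region, emit):
        """l_button_region / r_button_region: functions (x, y) -> bool that test
        whether a point lies inside the projected button in the captured image.
        emit: callback receiving event dicts such as {"type": "move", ...}."""
        self.in_l, self.in_r, self.emit = l_button_region, r_button_region, emit
        self.last_hand = None

    def on_frame(self, hand_xy, index_tip_xy):
        # Hand displacement between frames -> relative pointer movement.
        if self.last_hand is not None:
            dx, dy = hand_xy[0] - self.last_hand[0], hand_xy[1] - self.last_hand[1]
            if dx or dy:
                self.emit({"type": "move", "dx": dx, "dy": dy})
        self.last_hand = hand_xy
        # Index fingertip entering a projected button -> click event.
        if self.in_l(*index_tip_xy):
            self.emit({"type": "click", "button": "left"})
        elif self.in_r(*index_tip_xy):
            self.emit({"type": "click", "button": "right"})

# Usage with simple rectangular button regions and print() as the event sink.
mouse = ProjectedMouse(lambda x, y: 0 <= x < 20 and 0 <= y < 10,
                       lambda x, y: 20 <= x < 40 and 0 <= y < 10, print)
mouse.on_frame((50, 50), (100, 100))   # first frame: no movement yet, no click
mouse.on_frame((55, 52), (10, 5))      # -> move event, then a left click
```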
- the control unit 120 may also have a function of controlling the projection unit 130 so that the "L button" and the "R button" in the projected video 331B are projected left-right reversed with respect to the arrangement shown in FIG. 31 and FIG. 32. Alternatively, the control unit 120 may have a function of controlling the projection unit 130 so that the "L button" is projected in one of the area near the user's thumb and the area near the user's middle finger, and the "R button" is projected in the other.
- FIG. 33 is a diagram illustrating an application example 1 (variation 1) of the seventh specific example.
- the control unit 120 executes a function similar to the function corresponding to a left click or right click of the mouse, based on the positional relationship between the imaging position (or movement) of the thumb 401 and the imaging position of the projection video 331B.
- the control unit 120 detects the movement of the index finger 402 and recognizes position information for designating an arrangement position of a cursor or the like according to the movement of the index finger 402, for example. In this example, the control unit 120 recognizes the position information based on the movement of the index finger 402, but does not recognize the option selection information.
- as a result, erroneous recognition by the control unit 120 can be prevented.
- FIG. 34 is a diagram illustrating an application example 2 (variation part 2) of the seventh specific example.
- in this application example, a shape formed by a plurality of fingers is given a meaning. That is, the control unit 120 detects the shape of the operation object 400 in the captured image and recognizes operation information corresponding to the detected shape of the operation object 400.
- upon detecting such a shape, the control unit 120 recognizes the start of a drag operation.
- the drag operation is one of the operations performed with a mouse, which is a kind of input device; for example, it is an input operation in which information such as a range specified by moving the mouse while holding down a mouse button is sent to the computer.
- the control unit 120 controls the projection unit 130 so that the projection image 334 is projected near the thumb 401 and the index finger 402.
- the projected video 334 is a video that indicates to the user that the start of the drag operation has been recognized.
- when the control unit 120 detects, based on the captured image, that the operation object 400 is moving while the projection image 334 is being projected, the control unit 120 recognizes the information given by the drag operation (for example, information on a specified range).
- the arrow shown in FIG. 34 is a drag trajectory.
- Such control related to the drag operation can be combined with the control methods in the various specific examples described above.
- FIG. 35 is a block diagram illustrating an example of a hardware configuration that can implement the control unit 120.
- the hardware configuring the control unit 120 includes a CPU (Central Processing Unit) 1 and a storage unit 2.
- the control unit 120 may include an input device (not shown) and an output device (not shown).
- Various functions of the control unit 120 are realized, for example, when the CPU 1 executes a computer program (software program, hereinafter simply referred to as “program”) read from the storage unit 2.
- the control unit 120 may include a communication interface (I / F (InterFace)) not shown.
- the control unit 120 may access an external device via a communication interface and determine an image to be projected based on information acquired from the external device.
- control unit 120 may be a control unit dedicated to the interface device 100, or a part of the control unit provided in a device including the interface device 100 may function as the control unit 120.
- the hardware configuration of the control unit 120 is not limited to the above-described configuration.
- the projection unit 130 only needs to have a function of projecting the projection video 300 under the control of the control unit 120.
- FIG. 36 is a diagram for explaining a specific example of a hardware configuration capable of realizing the projection unit 130.
- the projection unit 130 includes a laser light source 131, a first optical system 132, an element 133, and a second optical system 134.
- the laser light emitted from the laser light source 131 is shaped by the first optical system 132 into a form suitable for the subsequent phase modulation.
- the first optical system 132 includes, for example, a collimator, and the collimator converts the laser light into parallel light suitable for the element 133.
- the first optical system 132 may have a function of adjusting the polarization of the laser light so as to be suitable for later phase modulation. That is, when the element 133 is a phase modulation type, it is necessary to irradiate the element 133 with light having a polarization direction set in the manufacturing stage.
- for example, when the laser light source 131 is a semiconductor laser, the semiconductor laser may be installed so that the polarization direction of the light incident on the element 133 matches the set polarization direction.
- alternatively, the first optical system 132 may include, for example, a polarizing plate, and the polarizing plate adjusts the polarization direction of the light incident on the element 133 so that it matches the set polarization direction.
- the polarizing plate is disposed closer to the element 133 than the collimator.
- Such laser light guided from the first optical system 132 to the element 133 is incident on the light receiving surface of the element 133.
- the element 133 has a plurality of light receiving regions.
- the control unit 120 controls the optical characteristics (for example, the refractive index) of each light receiving region of the element 133 in accordance with the information for each pixel of the image to be projected, for example by varying the voltage applied to each light receiving region.
- the laser light phase-modulated by the element 133 passes through a Fourier transform lens (not shown) and is condensed toward the second optical system 134.
- the second optical system 134 has, for example, a projection lens, and the collected light is imaged by the second optical system 134 and irradiated to the outside.
- the element 133 that realizes the projection unit 130 is a type that reflects light, but the projection unit 130 may be realized by using a transmission type element 133.
- the element 133 will be described. As described above, the laser beam emitted from the laser light source 131 is incident on the element 133 via the first optical system 132. The element 133 modulates the phase of the incident laser light and emits the modulated laser light.
- the element 133 is also called a spatial light phase modulator (Spatial Light Phase Modulator) or a phase-modulation-type spatial modulation device. Details will be described below.
- the element 133 includes a plurality of light receiving areas (details will be described later).
- the light receiving area is a cell constituting the element 133.
- the light receiving areas are arranged in a one-dimensional or two-dimensional array, for example.
- the control unit 120 determines a difference between the phase of light incident on the light receiving region and the phase of light emitted from the light receiving region for each of the plurality of light receiving regions constituting the element 133. Control the parameters to change.
- control unit 120 controls each of the plurality of light receiving regions so that optical characteristics such as a refractive index or an optical path length change.
- the phase distribution of the incident light incident on the element 133 changes according to the change in the optical characteristics of each light receiving region.
- the element 133 emits light reflecting the control information from the control unit 120.
- the element 133 includes, for example, a ferroelectric liquid crystal, a homogeneous liquid crystal, or a vertical alignment liquid crystal, and is realized using, for example, LCOS (Liquid Crystal On Silicon).
- the control unit 120 controls the voltage applied to the light receiving region for each of the plurality of light receiving regions constituting the element 133.
- the refractive index of the light receiving region changes according to the applied voltage. For this reason, the control unit 120 can generate a difference in refractive index between the light receiving regions by controlling the refractive index of each light receiving region constituting the element 133.
- the incident laser light is appropriately diffracted in each light receiving region based on the difference in refractive index between the light receiving regions under the control of the control unit 120.
- the element 133 can also be realized by, for example, a technology of MEMS (Micro Electro Mechanical System).
- FIG. 37 is a diagram for explaining the structure of the element 133 realized by MEMS.
- the element 133 includes a substrate 133A and a plurality of mirrors 133B assigned to each light receiving region on the substrate. Each of the plurality of light receiving regions of the element 133 is configured by a mirror 133B.
- the substrate 133A is, for example, parallel to the light receiving surface of the element 133 or substantially perpendicular to the incident direction of the laser light.
- the control unit 120 controls the distance between the substrate 133A and the mirror 133B for each of the plurality of mirrors 133B included in the element 133. Thereby, the control unit 120 changes, for each light receiving region, the optical path length traveled by the incident light when it is reflected.
- the element 133 diffracts incident light on the same principle as that of a diffraction grating.
- the element 133 can theoretically form any image by diffracting the incident laser beam.
- a diffractive optical element is described in detail in Non-Patent Document 3, for example.
- Non-Patent Document 4 describes a method by which the control unit 120 controls the element 133 to form an arbitrary image; a detailed description is therefore omitted here.
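- for orientation only, the following is a minimal sketch of one widely used way to compute a phase pattern for such a phase-modulation element, an iterative Fourier-transform (Gerchberg-Saxton style) loop; it is not necessarily the method of Non-Patent Document 4, and the resolution, iteration count, and normalization are assumptions of the sketch.

```python
# Hypothetical sketch: compute a phase pattern whose far-field diffraction
# approximates a target intensity image (Gerchberg-Saxton-style iteration).
import numpy as np

def compute_phase_pattern(target_intensity, iterations=50, seed=0):
    """target_intensity: 2-D array of the desired image intensity (far field).
    Returns a phase map in radians to be written to the light receiving regions."""
    rng = np.random.default_rng(seed)
    target_amp = np.sqrt(target_intensity / target_intensity.max())
    phase = rng.uniform(0, 2 * np.pi, target_amp.shape)
    for _ in range(iterations):
        # Field at the modulator plane: uniform illumination with the current phase.
        slm_field = np.exp(1j * phase)
        far_field = np.fft.fft2(slm_field)
        # Impose the target amplitude in the image plane, keep the computed phase.
        far_field = target_amp * np.exp(1j * np.angle(far_field))
        back = np.fft.ifft2(far_field)
        phase = np.angle(back)          # keep only the phase for the modulator
    return phase

# Usage: a toy 64x64 target with a bright square in the middle.
target = np.zeros((64, 64))
target[24:40, 24:40] = 1.0
phase_map = compute_phase_pattern(target)
print(phase_map.shape)  # -> (64, 64)
```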
- the hardware configuration for realizing the projection unit 130 is not limited to the above-described example, but when the projection unit 130 is realized by the above-described hardware configuration, the following effects are obtained.
- the projection unit 130 having the above-described hardware configuration can be reduced in size and weight by including the element 133 manufactured using the MEMS technology.
- since the image projected by the projection unit 130 having the above-described hardware configuration is formed by regions in which the intensity of the laser light is increased by diffraction, the projection unit 130 can project a bright image even onto a projection surface 200 located far away.
- interface device 100 is superior to the conventional interface device in terms of size, weight, and power consumption.
- the inventor considered applying the interface device 100 to a portable terminal by taking advantage of these advantages.
- the present inventor also considered using it as a wearable terminal.
- FIG. 38 shows a specific example in which the interface device 100 is applied to a tablet (portable device).
- the tablet 100A includes an imaging unit 110A and a projection unit 130A.
- the projection unit 130A projects the projection image 338 onto the operation surface 200 (for example, a table top).
- the projected video 338 is a projected video 300 that provides a keyboard function, for example.
- the user of the tablet 100A operates the tablet 100A using the projection video 338 projected on the operation surface 200.
- in this way, the user can input characters and the like using the keyboard (projected image 338) while, for example, an Internet site is displayed across the entire surface of the display.
- the keyboard 100Key is displayed on a part of the display 100Dis. For this reason, in a state where the keyboard 100Key is displayed, an area where information such as video and characters is displayed on the display 100Dis is narrowed. As a result, the user may find it inconvenient because it is difficult to perform an operation of inputting characters using the keyboard 100Key while referring to an image or the like displayed on the display 100Dis.
- the tablet 100A equipped with the interface device 100 as described above, the user uses the keyboard (projected video 338) while referring to the video displayed on the entire surface of the display, etc. Can be entered. That is, this tablet 100A can improve the usability of the user.
- FIG. 40 shows a specific example in which the interface device 100 is applied to a smartphone (portable device).
- the smartphone 100B shown in FIG. 40 includes an imaging unit 110B and a projection unit 130B.
- Projection unit 130B projects the projected video 340 onto the operation surface 200.
- the projected video 340 is, for example, a projected video 300 that provides the function of an input key.
- the user of the smartphone 100B operates the smartphone 100B using the projected video 340 projected on the operation surface 200.
- the operation surface 200 may be, for example, a desk top surface or a screen 101B attached to the smartphone 100B as shown in FIG.
- the usability of the smartphone 100B can be improved. That is, the user can operate the projected video 340 while referring to information displayed on the entire display surface of the smartphone 100B (or while viewing the cursor 102B), and thus the usability of the smartphone 100B can be improved. improves.
- FIG. 41 is a diagram illustrating an example in which the interface device 100 is applied to various small terminals (portable devices).
- the small terminal 100C is a name-plate-like terminal on which the interface device 100 is mounted.
- the small terminal 100D is a pen-like terminal on which the interface device 100 is mounted.
- the small terminal 100E is a portable music terminal in which the interface device 100 is mounted.
- the small terminal 100F is a pendant terminal on which the interface device 100 is mounted.
- the device on which the interface device 100 is mounted is not limited to such small terminals. However, since the interface device 100 has a simple structure and can easily be downsized, application to such small terminals is effective.
- FIG. 42 is a diagram illustrating a specific example in which the interface device 100 is mounted on a wristband type device (portable device (accessory)) 100G.
- FIG. 43 is a diagram showing a device (portable device (jewelry)) 100H in which the interface device 100 is mounted on eyewear such as glasses or sunglasses.
- the interface device 100 may be mounted on shoes, a belt, a tie, a hat, or the like to constitute a wearable terminal (portable device (accessory)).
- the imaging unit 110 and the projection unit 130 are provided at different positions from each other.
- the imaging unit 110 and the projection unit 130 may be designed to be coaxial with each other.
- the interface device 100 can be used by hanging from the ceiling or hanging on a wall, taking advantage of its small size or lightness.
- the operator holds the sandwich 405 in his hand, points the interface device 100 toward the sandwich 405, and images the sandwich 405 by the imaging unit 110.
- the control unit 120 performs image processing on the image captured by the image capturing unit 110, and the sandwich 405 is reflected in the captured image based on the image processing result and information (for example, contour information) on the sandwich provided in advance. Recognize that.
- the control unit 120 further recognizes the label 406 affixed to the sandwich 405, and determines whether the expiration date of the sandwich 405 has expired based on the information of the label 406.
- when the control unit 120 determines that the expiration date of the sandwich 405 has expired, the control unit 120 controls the projection unit 130 so that a projection image 345 as shown in FIG. 45 is projected.
- the projected video 345 is a video that indicates to the worker that the term has expired.
- the operator can perform a slip process for discarding the sandwich 405 as follows. That is, the control unit 120 controls the projection unit 130 so that the projected video 346 as shown in FIG. 46 is projected on the sandwich 405.
- the projected video 346 is a video that accepts selection of whether or not to discard the sandwich 405 from the operator.
- the operator operates the projection image 346 with the operation object (finger) 400.
- the control unit 120 detects in the captured image that the operation object 400 has moved toward the area representing “Y (Yes)”, the control unit 120 confirms that the operator has selected to discard the sandwich 405. recognize.
- when the control unit 120 detects in the captured image that the operation object 400 has moved toward the area representing "N (No)", the control unit 120 ends the process without doing anything.
- when the control unit 120 recognizes that the worker has selected to discard the sandwich 405, it communicates with a server of a store designated in advance and transmits, to the server of the store, information indicating that the sandwich 405 is to be discarded. The slip processing performed by the worker is thereby completed. That is, the worker can finish the slip processing simply by selecting "Y (Yes)" in the projection video 346.
- the interface device 100 described here has a communication function for wireless communication or wired communication with the server as described above.
- the server of the store which received the information that the sandwich 405 is discarded transmits (replies) the information that the slip processing related to the discard of the sandwich 405 is completed to the interface device 100, for example.
- the interface apparatus 100 controls the projection unit 130 such that the projection video 347 as shown in FIG. 47 is projected onto the sandwich 405.
- the projected video 347 is a video that notifies the worker that the slip processing is completed.
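- the exchange with the store server described above could look roughly like the sketch below: notify the server of the discard, wait for its completion reply, and then trigger projection of the completion notice. The endpoint URL, payload fields, and the project_notice callback are hypothetical and only illustrate the flow.

```python
# Hypothetical sketch of the slip-processing exchange with the store server.
# The URL, JSON fields, and the project_notice() callback are illustrative only.
import requests

def process_discard(item_id: str, project_notice):
    """Report that the item is to be discarded and project a completion notice
    (corresponding to the projected video 347) once the server confirms."""
    resp = requests.post(
        "https://store.example.com/api/slips",            # hypothetical endpoint
        json={"item_id": item_id, "action": "discard"},
        timeout=5,
    )
    resp.raise_for_status()
    if resp.json().get("status") == "completed":          # hypothetical reply field
        project_notice("slip processing completed")       # e.g. projected video 347

# Usage (would require a reachable server):
# process_discard("sandwich-405", lambda text: print("project:", text))
```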
- the sandwich 405 is disposed of by putting the sandwich 405 into a disposal basket or the like.
- when an operator performs a disposal process using a check sheet, the operator is likely to make mistakes such as failing to discard food that must be discarded.
- by using the interface device 100 to check, one by one, whether each sandwich 405 is to be discarded, the operator can prevent the mistakes described above.
- the imaging unit 110 images the parts shelf 407.
- the control unit 120 recognizes the parts shelf 407 in the captured video by performing image processing on the captured video.
- the interface device 100 has a communication function and can communicate with the server using the communication function.
- the interface device 100 inquires of the server about the type and number of parts stored in the parts shelf 407. As a result, the server transmits (replies) to the interface apparatus 100 information indicating that, for example, eight parts A are stored in the parts shelf 407.
- the control unit 120 controls the projection unit 130 so that a projection image 348 as shown in FIG. 48 is projected.
- the projected image 348 is an image indicating that eight parts A are stored in the parts shelf 407 for the worker.
- the server may also notify the interface device 100 of an instruction that three parts A should be picked up.
- the operator taps the projected image 348 three times according to the number of picked up parts.
- each time the worker taps the projection image 348, the number displayed in the projection image 349 is counted up, as shown in FIG. 49. After three taps, the display "3" is projected. Using this function, the operator inputs into the interface device 100 the number of parts A picked up from the parts shelf 407.
- the worker then performs a predetermined confirming operation to confirm the number “3” as a number to be input (recognized) to the interface device 100.
- the interface device 100 recognizes the input of the number "3" and calculates "5" by subtracting the number "3" picked up by the operator from the number "8" of parts A stored in the parts shelf 407.
- the control unit 120 controls the projection unit 130 so that a projection image 350 as shown in FIG. 50, in which the calculation result is reflected, is projected onto, for example, a door of the parts shelf 407.
- the operator views the projected image 350 and confirms that the number of parts A displayed in the projected image 350 matches the number of parts A stored in the parts shelf 407, whereby the parts management work is completed.
- the interface apparatus 100 transmits information indicating that three parts A have been picked up from the parts shelf 407 to the server. As a result, the component management data held in the server is updated.
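- a rough sketch of this flow, counting the taps, computing the remaining stock, and preparing the update sent to the server, is shown below; the class and field names are assumptions and the server call itself is left abstract.

```python
# Hypothetical sketch of the parts-management flow: query stock, count taps,
# compute the remaining number, and prepare the update sent to the server.
class PartsPicking:
    def __init__(self, part_name: str, stock: int):
        self.part_name = part_name
        self.stock = stock       # number reported by the server (e.g. 8)
        self.picked = 0          # incremented once per tap on the projected image

    def on_tap(self) -> int:
        """Called for each detected tap; returns the count to project (image 349)."""
        self.picked += 1
        return self.picked

    def confirm(self) -> dict:
        """Called on the confirming operation; returns the update for the server
        and the remaining count to project (image 350)."""
        remaining = self.stock - self.picked
        return {"part": self.part_name, "picked": self.picked, "remaining": remaining}

# Usage: 8 parts A in stock, the worker taps three times and confirms.
job = PartsPicking("A", 8)
for _ in range(3):
    count = job.on_tap()
print(count, job.confirm())  # -> 3 {'part': 'A', 'picked': 3, 'remaining': 5}
```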
- as described above, if the interface device 100 is used, not only is work that is easy for the operator to understand possible, but the work relating to electronic component management can also be completed on the spot.
- a specific example in which the interface apparatus 100 is used for the operation of a portable music terminal will be described below.
- portable music terminals are widespread and convenient devices that allow users to listen to music anywhere, but they have the inconvenience that they must be taken out of a bag or pocket for each operation. Although some operations can be performed with a small device attached to the earphone cord or the like, there is the problem that the amount of information and the operations available are limited, and that such a device is too small to operate easily in the first place. By using the interface device 100, these disadvantages can be overcome.
- the interface device 100 may be mounted integrally with the portable music terminal, or may be provided separately from the portable music terminal and configured to be communicable with the portable music terminal.
- the interface device 100 may be configured by a module including the control unit 120 and the projection unit 130 and a mobile phone camera that can communicate with the module (that is, a camera that functions as the imaging unit 110).
- the control unit 120 can communicate with the portable music terminal.
- the interface device 100 may be configured by an external camera (imaging unit 110) and an external projector (projection unit 130), and a control device that functions as the control unit 120 that can communicate with them. In this case, the control unit 120 can communicate with the portable music terminal, the external camera, and the external projector.
- when the control unit 120 recognizes that a hand appears in the video captured by the imaging unit 110, the control unit 120 controls the projection unit 130 so that a projection video 351 as illustrated in FIG. 51 is projected.
- the projected video 351 includes information 351A on the music being selected and a state selection bar 351B. Various options are displayed on the state selection bar 351B.
- when the operation object 400 slides along the state selection bar 351B, the projection unit 130, under the control of the control unit 120, projects the option corresponding to the position of the operation object 400.
- the operation object 400 (user) has selected reproduction (triangular mark).
- the control unit 120 recognizes the selection information and transmits the information to, for example, the control unit of the portable music terminal, whereby music is played by the portable music terminal.
- further, when the control unit 120 recognizes that the operation object 400 is located at the position of the left circle, the control unit 120 controls the projection unit 130 so that the projected video is switched to a music list video (music selection screen). When the control unit 120 recognizes that the operation object 400 is sliding along the left selection bar, the control unit 120 controls the projection unit 130 so that the music pieces to be selected are highlighted one after another according to the slide position of the operation object 400. When the control unit 120 recognizes that the operation object 400 is located at the highlighted music piece, it recognizes that the music piece has been selected. The control unit 120 then transmits information on the selected music piece to, for example, the control unit of the portable music terminal.
- the functions related to information display as described above can be applied to display of information related to music, for example.
- the function can be applied to, for example, a telephone number display and selection function of a telephone directory registered in the telephone.
- the function can be applied to various functions.
- interface device 100 may be mounted integrally with the remote control, or may be provided separately from the remote control and configured to be communicable with the remote control.
- the interface device 100 may include an external camera (imaging unit 110), an external projector (projection unit 130), and a control device that functions as the control unit 120.
- the control device can communicate with an external camera and an external projector.
- the interface apparatus 100 projects a projection image 354 on the operation surface 200.
- the projected video 354 includes a volume control bar 354A. Similar to the third specific example, when the operation object 400 slides along the volume control bar 354A, the control unit 120 projects a projection image 354B representing options according to the position of the operation object 400 on the volume control bar 354A. Thus, the projection unit 130 is controlled.
- “MUTE (silence)” and “8” representing the volume level are displayed as the images of the options (projection video 354B).
- the projected video 354 further includes a channel selection bar 355A.
- similarly, the control unit 120 controls the projection unit 130 so that a projection image 355B representing the option corresponding to the position of the operation object 400 on the channel selection bar 355A is projected.
- “ABC” representing the channel name of the television broadcast is displayed as the video of the option (projected video 355B).
- in this way, the interface apparatus 100 can realize the functions of such a remote control.
- a remote control normally communicates with a main body such as a television using infrared light; such infrared light may also be projected from the interface device 100.
- the projection unit 130 projects a projection image 356 as shown in FIG. 56.
- the projected video 356 is a video displayed for inputting Japanese, and includes a projected video 356A and a projected video 356B.
- the projected video 356A is a video in which a plurality of key-like videos are represented.
- the projected image 356A displays a plurality of kana key images representing the Japanese hiragana "a", "ka", "sa", "ta", "na", "ha", "ma", "ya", "ra", and "wa". The projected image 356A further includes a key representing the Japanese (katakana) word for "space", meaning blank, and a key representing the Japanese (kanji) word for "new line", meaning line break.
- the projected video 356B is a video in which a plurality of options associated with the kana key instructed by the operation object 400 among a plurality of kana keys in the projected video 356A are developed.
- the operation object 400 indicates the kana key “NA” in the projection image 356A.
- key images respectively representing the Japanese hiragana "na", "ni", "nu", "ne", and "no", which are a plurality of options related to the kana key "na", are projected as the projected image 356B.
- when the control unit 120 detects, based on the image captured by the imaging unit 110, that one of the plurality of option keys in the projection image 356B has been indicated by the operation object 400, the control unit 120 recognizes the character corresponding to the indicated key as the input character.
- in general flick input, for example, when the operation object 400 stays at the position of one key in the projection image 356A, the character displayed on that key is recognized by the apparatus as the input character. There is also a case where the operation object 400 temporarily stops at the position of one key in the projection image 356A and then slides in a certain direction. In that case, the apparatus recognizes as the input character a character determined from the key at the position where the operation object 400 stopped and the direction in which it subsequently slid.
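- the sketch below illustrates such flick resolution for the kana "na" key, using the direction mapping commonly used in Japanese flick keyboards (center/left/up/right/down correspond to na/ni/nu/ne/no); the mapping table, threshold, and function name are assumptions made for illustration.

```python
# Hypothetical sketch of flick-input resolution: the input character is decided by
# the key where the fingertip stopped plus the direction of the subsequent slide.
FLICK_MAP = {
    # Common Japanese flick layout for the "na" key (center, left, up, right, down).
    "na": {"center": "な", "left": "に", "up": "ぬ", "right": "ね", "down": "の"},
}

def resolve_flick(key: str, dx: float, dy: float, min_slide: float = 15.0) -> str:
    """key: label of the key where the fingertip stopped.
    (dx, dy): slide vector in image coordinates (y grows downward).
    Returns the character to input."""
    directions = FLICK_MAP[key]
    if (dx * dx + dy * dy) ** 0.5 < min_slide:
        return directions["center"]            # no slide: the key's base character
    if abs(dx) >= abs(dy):
        return directions["right"] if dx > 0 else directions["left"]
    return directions["down"] if dy > 0 else directions["up"]

print(resolve_flick("na", 0, 0))     # -> な (stayed on the key)
print(resolve_flick("na", 30, -4))   # -> ね (slid to the right)
```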
- in such flick input, the following erroneous input problem is likely to occur. That is, even though the user wants to input the Japanese hiragana "ne", which is related to the Japanese hiragana "na", the Japanese hiragana "na" may be recognized as the input character simply because the operation object 400 stayed at the position of that key in the projection image 356A (erroneous input).
- in this specific example, in contrast, when the operation object 400 stays at the position of one key in the projection image 356A, images of a plurality of options (projection image 356B) related to the character displayed on that key are projected.
- the projection video 356B includes the characters indicated by the operation object 400 in the projection video 356A as options.
- a character determined based on the projected image and the position of the option indicated by the operation object 400 is recognized as an input character by the apparatus (control unit 120). That is, in this specific example, a character is not input only by operating the operation object 400 on the projection image 356A, and an input character is input by the operation of the operation object 400 on the projection image 356B according to the operation of the operation object 400 on the projection image 356A. Be recognized.
- the reference for recognizing the operation of the operation object 400 for the projection image 356A is different from the reference for recognizing the operation of the operation object 400 for the projection image 356B.
- this is because the input character is recognized based on the projection video 356B representing the options rather than on the projection video 356A, and because the options in the projection video 356B also include the character of the key indicated by the operation object 400 in the projection video 356A.
- the interface apparatus 100 of this specific example can prevent erroneous input of characters.
- FIG. 57 is a diagram illustrating another example of the projected image 356 in FIG.
- the operation object 400 indicates the Japanese hiragana “ha” key in the projection image 356A.
- the projection unit 130 based on the control of the control unit 120 projects a key-like image of a plurality of options related to Japanese hiragana “ha” as shown in FIG. 57 as the projected image 356B.
- the projected image 356B shows key images respectively representing the Japanese hiragana "ha", "ba", "pa", "hi", "bi", "pi", "fu", "bu", "pu", "he", "be", "pe", "ho", "bo", and "po".
- the interface device 100 can provide an interface capable of high-speed character input by displaying, all at once, characters including voiced sounds (dakuon) and semi-voiced sounds (handakuon).
- in this specific example, an example in which Japanese characters are displayed has been described. However, the interface device 100 may display, as options of the projected video 356B, not only Japanese characters but also alphabets, umlauts, other symbols, and characters (emoticons) combining them.
- the interface device 100 of this specific example can be applied to various types of character input interfaces.
- FIG. 58 is a block diagram showing a simplified configuration of the interface device 100I according to the second embodiment of the present invention.
- the interface device 100I includes an imaging unit 110I, a control unit 120I, and a projection unit 130I.
- The projection unit 130I projects a first projection image.
- The imaging unit 110I can capture an image including at least an operation object that operates the device itself and the first projection image.
- The control unit 120I recognizes operation information input by the operation object based on the relationship between the imaging position of the operation object and the imaging position of the first projection image in the captured image captured by the imaging unit 110I. The control unit 120I further controls the projection unit 130I in accordance with the recognized operation information.
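The recognition step performed by a control unit such as the control unit 120I can be pictured roughly as follows. This is a hedged sketch: the detection and projection interfaces named here are placeholders and are not part of the publication.

```python
# Hypothetical control-unit step: relate where the operation object and the
# first projection image appear in one captured frame, then drive the projector.
from dataclasses import dataclass

@dataclass
class Box:
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, x, y):
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

def image_for(label):
    return f"options_{label}.png"        # placeholder for the next projection content

def control_step(captured_frame, detect_fingertip, detect_projection_regions, projector):
    """detect_fingertip(frame) -> (x, y) or None; detect_projection_regions(frame)
    -> {label: Box} for the projected first image; projector.show(content)."""
    tip = detect_fingertip(captured_frame)
    regions = detect_projection_regions(captured_frame)
    if tip is None or not regions:
        return None
    hit = next((label for label, box in regions.items() if box.contains(*tip)), None)
    if hit is not None:
        projector.show(image_for(hit))   # project the second image chosen by the operation
    return hit
```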
- FIG. 59 is a block diagram showing a simplified configuration of a module 500 according to the third embodiment of the present invention. As shown in FIG. 59, the module 500 is used in a state where it is communicably connected to the electronic component 800.
- the electronic component 800 includes an imaging unit 810.
- the module 500 includes a control unit 120J and a projection unit 130J.
- The projection unit 130J projects a first projection image.
- the control unit 120J receives the captured image captured by the imaging unit 810 from the electronic component 800. Then, the control unit 120J recognizes the operation information of the operation object based on the positional relationship between the imaging position of the operation object in the captured image and the imaging position of the first projection image. Furthermore, the control unit 120J controls the projection unit 130J according to the operation information.
- FIG. 60 is a simplified block diagram showing the configuration of the control device 600 according to the fourth embodiment of the present invention.
- the control device 600 is communicably connected to the electronic component 900.
- the electronic component 900 includes an imaging unit 910 and a projection unit 930. Note that the imaging unit 910 and the projection unit 930 may be mounted on separate electronic components 900, respectively. In this case, the control device 600 is communicably connected to an electronic component 900 including the imaging unit 910 and another electronic component 900 including the projection unit 930 via a communication network.
- the control device 600 includes a control unit 120K.
- the control unit 120K receives the captured video captured by the imaging unit 910. Then, the control unit 120K recognizes the operation information by the operation object based on the relationship between the imaging position of the operation object in the captured image and the imaging position of the first projection image. Furthermore, the control unit 120K controls the projection unit 930 according to the operation information.
- FIG. 61 is a block diagram schematically illustrating the configuration of an interface device 100L according to the fifth embodiment of the present invention.
- the interface device 100L includes an imaging unit 110L, a control unit 120L, and a projection unit 130L.
- The projection unit 130L projects a projection image.
- the imaging unit 110L captures a projected video.
- the control unit 120L calculates the positional relationship between the surface (operation surface) on which the projection video is projected and the imaging unit 110L based on the imaging position of the projection video in the captured video captured by the imaging unit 110L.
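One plausible way for a control unit such as the control unit 120L to compute the positional relationship between the operation surface and the imaging unit is projector-camera triangulation: each projected feature gives one ray from the projector and, once found in the captured image, one ray from the camera, and the intersection points are fitted with a plane. This is an assumption for illustration, not the publication's stated algorithm; the calibration constants named below are assumed to exist.

```python
# Hypothetical surface-pose estimation from a rigidly mounted projector and camera.
import numpy as np

def ray_intersection(p1, d1, p2, d2):
    """Midpoint of the closest approach of two 3D rays (origins p1, p2; unit dirs d1, d2)."""
    w0 = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = max(a * c - b * b, 1e-12)
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    return 0.5 * ((p1 + s * d1) + (p2 + t * d2))

def fit_plane(points):
    """Least-squares plane through 3D points: returns (unit normal, centroid)."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    return vt[-1], centroid

def estimate_surface(proj_origin, proj_dirs, cam_dirs):
    """proj_origin / proj_dirs: projector position and feature ray directions in the
    camera frame (assumed known from factory calibration); cam_dirs: camera ray
    directions of the same features detected in the captured image."""
    pts = np.array([ray_intersection(proj_origin, dp, np.zeros(3), dc)
                    for dp, dc in zip(proj_dirs, cam_dirs)])
    normal, centroid = fit_plane(pts)
    distance = abs(normal @ centroid)   # distance from the imaging unit to the surface
    return normal, distance
```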
- FIG. 62 is a block diagram showing a simplified configuration of a module 500M according to the sixth embodiment of the present invention.
- the module 500M is communicably connected to the electronic component 800M.
- the electronic component 800M includes an imaging unit 810M.
- the module 500M includes a control unit 120M and a projection unit 130M.
- The projection unit 130M projects a projection image.
- the control unit 120M calculates the positional relationship between the surface (operation surface) on which the projection video is projected and the imaging unit 810M based on the imaging position of the projection video in the captured video captured by the imaging unit 810M.
- FIG. 63 is a block diagram showing a simplified configuration of the control device 600N according to the seventh embodiment of the present invention.
- the control device 600N is communicably connected to the electronic component 900N.
- the electronic component 900N includes an imaging unit 910N and a projection unit 930N.
- the control device 600N includes a control unit 120N. Note that the imaging unit 910N and the projection unit 930N may be mounted on separate electronic components 900N, respectively. In this case, the control device 600N is communicably connected to an electronic component 900N including the imaging unit 910N and another electronic component 900N including the projection unit 930N via a communication network.
- the control unit 120N calculates the positional relationship between the surface (operation surface) on which the projected video is projected and the imaging unit 910N based on information on the imaging position of the projected video in the captured video captured by the imaging unit 910N.
- (Appendix 1) An interface device comprising: projection means for projecting a first projection image; imaging means for capturing an image including at least an operation object that operates the device itself and the first projection image; and control means for accepting an operation by the operation object based on a relationship between a position where the operation object appears and a position where the first projection image appears in a captured image captured by the imaging means, and for controlling the projection means in accordance with the accepted operation.
- (Appendix 2) The interface device according to Appendix 1, wherein the control means, in response to accepting the operation, controls the projection means to project a second projection image, and the control means further accepts the operation following the accepted operation based on a relationship between a position where the operation object appears in the captured image and a position where the second projection image projected by the projection means appears.
- (Appendix 3) The interface device according to Appendix 2, wherein the first projection image is an image for displaying a plurality of options, the control means accepts an operation of selecting a first option from the plurality of options based on a relationship between the position of the operation object and the position of the first projection image in the captured image, and the second projection image is an image for displaying options related to the first option.
- (Appendix 4) The interface device according to Appendix 3, wherein the control means accepts an operation by the operation object based on a relationship between a position where the operation object appears and a position where the projection image appears in the captured image and on a movement of the operation object, and the control means accepts operations according to criteria that differ between the case where an operation is accepted in relation to the first projection image and the case where an operation is accepted in relation to the second projection image.
- (Appendix 5) The interface device according to Appendix 4, wherein the control means controls the projection means so that the second projection image is projected in a manner superimposed on the first projection image, and, while the second projection image appears in the captured image, the control means, based on the speed or acceleration of the operation object in the captured image, either executes a process of accepting input of information corresponding to the first option among the plurality of options indicated by the second projection image, or executes a process of controlling the projection means to project the first projection image again.
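The speed- or acceleration-based choice in Appendix 5 might look like the following sketch. The thresholds, the notion of moving toward or away from the option, and all names are assumptions for illustration.

```python
# Hypothetical decision between confirming the first option and restoring the
# first projection image, driven by the operation object's velocity while the
# second projection image is visible.
import numpy as np

SPEED_CONFIRM = 12.0     # px/frame toward the option, assumed threshold
SPEED_CANCEL = 12.0      # px/frame away from the option, assumed threshold

def decide(tip_track, option_center):
    """tip_track: recent fingertip positions (Nx2); option_center: centre of the
    first option in the second projection image. Returns 'confirm', 'cancel' or None."""
    track = np.asarray(tip_track, dtype=float)
    if len(track) < 2:
        return None
    velocity = track[-1] - track[-2]
    toward = option_center - track[-2]
    toward = toward / (np.linalg.norm(toward) + 1e-9)
    signed_speed = float(velocity @ toward)   # + moving toward the option, - moving away
    if signed_speed >= SPEED_CONFIRM:
        return "confirm"                      # accept input of the first option
    if signed_speed <= -SPEED_CANCEL:
        return "cancel"                       # project the first projection image again
    return None
```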
- (Appendix 6) The interface device according to any one of Appendices 2 to 5, wherein, when the device is operated by a plurality of the operation objects, the control means controls the projection means to project the first projection image in the vicinity of a first operation object among the plurality of operation objects and to project the second projection image in the vicinity of a second operation object different from the first operation object.
- (Appendix 7) The interface device according to Appendix 6, wherein the imaging means captures an image including the first operation object, the first projection image, the second operation object and the second projection image, and the control means accepts an operation based on a relationship between a position where the first operation object appears and a position where the first projection image appears in the captured image and on a relationship between a position where the second operation object appears and a position where the second projection image appears in the captured image, and executes a process of controlling the projection means in accordance with the accepted operation.
- (Appendix 8) The interface device according to Appendix 1, wherein the control means accepts, from among a plurality of options, selection of a certain option determined based on a relationship between a position where the operation object appears and a position where the first projection image appears in a captured image captured by the imaging means, and controls the projection means to project a second projection image, which is an image corresponding to the certain option, in the vicinity of the operation object, and the control means further accepts input of information corresponding to the certain option when a specific action by the operation object is detected in the captured image while the second projection image is included in the captured image.
- (Appendix 9) The interface device according to Appendix 8, wherein the specific action is either an action in which, in the captured image, the position where the operation object appears remains, for a predetermined time or longer, at a position corresponding to the certain option in relation to the position where the first projection image appears, or an action in which, in the captured image, the position where the operation object appears changes at or above a predetermined speed or acceleration toward the direction in which the second projection image appears or toward the direction opposite to that direction.
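A compact, hypothetical classifier for the two specific actions of Appendix 9 is shown below; the frame count and speed threshold are illustrative assumptions.

```python
# Hypothetical classifier: either the fingertip dwells on the option position,
# or it flicks toward / away from the second projection image above a threshold.
DWELL_FRAMES = 20
FLICK_SPEED = 15.0   # px/frame, assumed

def classify_action(on_option_frames, speed_along_axis):
    """on_option_frames: consecutive frames the fingertip stayed on the option.
    speed_along_axis: signed speed along the axis pointing at the second
    projection image (+ toward it, - away from it)."""
    if on_option_frames >= DWELL_FRAMES:
        return "dwell"
    if abs(speed_along_axis) >= FLICK_SPEED:
        return "flick_toward" if speed_along_axis > 0 else "flick_away"
    return None
```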
- (Appendix 10) The interface device according to Appendix 1, wherein the operation object is an index finger of a user who operates the interface device, the control means, upon detecting that the first projection image and the user's index finger appear in the captured image, controls the projection means to project a second projection image onto one of an area near the user's index finger and the user's thumb and an area near the user's middle finger, and to project a third projection image onto the other of these areas, and the control means further accepts a first operation in accordance with a positional relationship between a position where the second projection image appears and a position where the user's index finger appears in the captured image, and accepts a second operation in accordance with a positional relationship between a position where the third projection image appears and a position where the user's index finger appears in the captured image.
- (Appendix 11) An interface device comprising: projection means for projecting a projection image; imaging means for capturing the projection image; and control means for calculating a positional relationship between a surface onto which the projection image is projected and the imaging means based on information indicating a position where the projection image appears in a captured image captured by the imaging means.
- (Appendix 12) The interface device according to Appendix 11, wherein the imaging means captures an image including an operation object that operates the device itself and the projection image, the control means accepts an operation by the operation object based on a relationship between a position where the operation object appears and a position where the projection image appears in the captured image captured by the imaging means and executes a process of controlling the projection means in accordance with the accepted operation, and the projection image includes a first area used for operations on the interface device based on the positional relationship with the operation object, and a second area, different from the first area, used to calculate the positional relationship between the surface onto which the projection image is projected and the imaging means.
- (Appendix 13) The interface device according to any one of Appendices 1 to 12, wherein the projection means includes a laser light source for emitting laser light and an element that, when the laser light is incident thereon, modulates the phase of the laser light and emits it, and the control means determines, in accordance with the content of the accepted operation, an image to be formed based on the light emitted by the element, and controls the element so that the determined image is formed.
- (Appendix 14) The interface device according to Appendix 13, wherein the element has a plurality of light-receiving regions, each of the light-receiving regions modulates the phase of laser light incident on that region and emits it, and the control means controls the element so as to change, for each of the light-receiving regions, a parameter that determines the difference between the phase of the light incident on the region and the phase of the light emitted from the region.
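Purely as an illustration of Appendices 13 and 14, the following sketch models a per-region phase parameter being computed and written out. The element model, the toy phase computation and the register-style interface are all assumptions; real phase-modulating elements are driven through vendor-specific interfaces and computer-generated-hologram algorithms.

```python
# Hypothetical driver for a phase-modulating element with independent
# light-receiving regions: one phase-shift parameter per region is updated so
# that the emitted wavefront forms the image chosen for the accepted operation.
import numpy as np

class PhaseElement:
    def __init__(self, rows, cols):
        self.phase = np.zeros((rows, cols))      # phase shift per light-receiving region

    def load_pattern(self, target_image):
        """Set each region's phase parameter from a stand-in phase pattern of the
        target image (a toy mapping, not a real hologram computation)."""
        hologram = np.angle(np.fft.fft2(target_image))
        self.phase = np.mod(hologram, 2 * np.pi)

    def write_to_hardware(self, bus_write):
        """bus_write(region_index, value) is the assumed hardware interface."""
        for idx, value in enumerate(self.phase.ravel()):
            bus_write(idx, float(value))
```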
- (Appendix 15) A portable electronic device in which the interface device according to any one of Appendices 1 to 14 is incorporated.
- (Appendix 16) An accessory in which the interface device according to any one of Appendices 1 to 14 is incorporated.
- (Appendix 17) A module used by being incorporated in an electronic device, the module comprising: projection means for projecting a first projection image; and control means for receiving an image including at least an operation object that operates the device itself and the first projection image, accepting an operation by the operation object based on a relationship between a position where the operation object appears and a position where the first projection image appears in the received captured image, and executing a process of controlling the projection means in accordance with the accepted operation.
- (Appendix 18) A control device for controlling projection means that projects a first projection image, the control device receiving an image including at least an operation object that operates the device and the first projection image, accepting an operation by the operation object based on a relationship between a position where the operation object appears and a position where the first projection image appears in the received captured image, and transmitting to the projection means a signal for controlling the projection means in accordance with the accepted operation.
- (Appendix 19) A control method executed by a computer that controls an interface device comprising imaging means and projection means for projecting a first projection image, the method comprising: controlling the imaging means to capture an image including at least an operation object that operates the device and the first projection image; and accepting an operation by the operation object based on a relationship between a position where the operation object appears and a position where the first projection image appears in the captured image captured by the imaging means, and executing a process of controlling the projection means in accordance with the accepted operation.
- (Appendix 20) A program causing a computer that controls an interface device comprising imaging means and projection means for projecting a first projection image to execute: a process of controlling the imaging means to capture an image including at least an operation object that operates the device and the first projection image; and a process of accepting an operation by the operation object based on a relationship between a position where the operation object appears and a position where the first projection image appears in the captured image captured by the imaging means, and of controlling the projection means in accordance with the accepted operation.
- (Appendix 21) A module used by being incorporated in an electronic device including imaging means, the module comprising: projection means for projecting a projection image; and control means for calculating a positional relationship between a surface onto which the projection image is projected and the imaging means based on information indicating a position where the projection image appears in a captured image captured by the imaging means.
- (Appendix 22) A control device that controls an electronic device including imaging means and projection means for projecting a projection image, the control device calculating a positional relationship between a surface onto which the projection image is projected and the imaging means based on information indicating a position where the projection image appears in a captured image captured by the imaging means.
- (Appendix 23) A control method executed by a computer that controls an interface device including imaging means and projection means for projecting a projection image, the method calculating a positional relationship between a surface onto which the projection image is projected and the imaging means based on information indicating a position where the projection image appears in a captured image captured by the imaging means.
- (Appendix 24) A program causing a computer that controls an interface device including imaging means and projection means for projecting a projection image to execute a process of calculating a positional relationship between a surface onto which the projection image is projected and the imaging means based on information indicating a position where the projection image appears in a captured image captured by the imaging means.
- The present invention can be applied, for example, to realizing an interface device having an input function with high recognition accuracy.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Hardware Design (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Position Input By Displaying (AREA)
- User Interface Of Digital Computer (AREA)
- Projection Apparatus (AREA)
- Input From Keyboards Or The Like (AREA)
Abstract
Description
An interface device comprises:
projection means for projecting a first projection image;
imaging means for imaging an area onto which the first projection image is projected; and
control means for recognizing, when the first projection image and an operation object both appear in a captured image captured by the imaging means, operation information input by the operation object based on a relationship between an imaging position at which the first projection image appears and an imaging position at which the operation object appears in the captured image.
An interface device comprises:
projection means for projecting a projection image;
imaging means for imaging an area onto which the projection image is projected; and
control means for calculating a positional relationship between a surface onto which the projection image is projected and the imaging means based on information on an imaging position at which the projection image appears in a captured image captured by the imaging means.
A portable device comprises the interface device of the present invention described above.
A control device receives a captured image in which a projection image projected by projection means to be controlled appears; when an operation object appears in the captured image together with the projection image, the control device recognizes operation information input by the operation object based on a relationship between an imaging position at which the projection image appears and an imaging position at which the operation object appears in the captured image, and controls the projection means in accordance with the operation information.
A module comprises:
the control device of the present invention described above; and
projection means controlled by the control device.
A control method comprises:
receiving a captured image in which a projection image projected by projection means to be controlled appears;
when an operation object appears in the captured image together with the projection image, recognizing operation information input by the operation object based on a relationship between an imaging position at which the projection image appears and an imaging position at which the operation object appears in the captured image; and
controlling the projection means in accordance with the operation information.
A program storage medium holds a computer program that causes a computer to execute:
a process of receiving a captured image in which a projection image projected by projection means to be controlled appears and, when an operation object appears in the captured image together with the projection image, recognizing operation information input by the operation object based on a relationship between an imaging position at which the projection image appears and an imaging position at which the operation object appears in the captured image; and
a process of controlling the projection means in accordance with the operation information.
-- Overview --
FIG. 1 is a block diagram showing the configuration of the interface device according to the first embodiment of the present invention. As shown in FIG. 1, the interface device 100 includes an imaging unit (imaging means) 110, a control unit (control means) 120 and a projection unit (projection means) 130.
The calibration process in the interface device 100 of the first embodiment is described below with reference to FIG. 1 and FIGS. 4 to 10.
Next, an example of the operation of the interface device 100 in the first embodiment is described with reference to FIG. 11. FIG. 11 is a flowchart showing an example of the operation of the interface device 100.
Several specific examples of the interface provided by the interface device 100 are described below. To keep the description simple, details of the operations executed by the control unit 120 are omitted as appropriate. For example, the fact that the control unit 120 determines the projection image 300 and its projection direction, and that the projection unit 130, under the resulting control of the control unit 120, projects the projection image 300 in the determined direction, is expressed simply as: the projection unit 130 projects the projection image 300. Likewise, the fact that the imaging unit 110 captures an image and that the control unit 120 recognizes operation information input by the operation object 400 based on the relationship between the imaging position of the operation object 400 and the imaging position of the projection image 300 in the captured image is expressed simply as: the control unit 120 recognizes the operation information.
The interface according to the first specific example is described with reference to FIGS. 12 and 13. When the operation object 400 (in this case, a fingertip) is captured in the image taken by the imaging unit 110, the interface device 100 projects the projection image 312 onto the area surrounding the operation object 400 (including the operation object 400 itself). This state is shown in FIG. 12. As described above, the interface device 100 first projects the projection image 312 onto the area surrounding the operation object 400. The interface device 100 then detects the positional relationship between the operation surface 200 and the imaging unit 110 (for example, the distance between the imaging unit 110 and the operation surface 200 and the tilt of the operation surface 200 with respect to the optical axis of the imaging unit 110) based on the imaging position of the projection image 312 in the captured image. Using this detection result, the interface device 100 adjusts the projection direction and other parameters of the projection image 312, and after the adjustment projects the projection image 312 again around the operation object 400 (fingertip).
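One conceivable way to "adjust the projection" once the corners of the projection image 312 have been located in the captured image is to pre-warp the projected content with the inverse of the observed distortion. This is an assumption for illustration, not the publication's method, and it deliberately ignores the full projector-to-camera geometry.

```python
# Hypothetical keystone correction: warp the content so that, on the tilted
# operation surface, it appears rectangular to the camera.
import numpy as np
import cv2

def prewarp(content, intended_corners_px, observed_corners_px):
    """content: image to project; intended_corners_px: where its corners should
    appear in the captured image; observed_corners_px: where they actually appeared."""
    h, w = content.shape[:2]
    src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    # Distortion introduced by the tilted surface, as seen by the camera.
    distortion = cv2.getPerspectiveTransform(
        np.float32(intended_corners_px), np.float32(observed_corners_px))
    # Map the content corners through the inverse distortion and build the pre-warp.
    corrected = cv2.perspectiveTransform(
        src.reshape(-1, 1, 2), np.linalg.inv(distortion)).reshape(-1, 2)
    prewarp_m = cv2.getPerspectiveTransform(src, np.float32(corrected))
    return cv2.warpPerspective(content, prewarp_m, (w, h))
```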
The interface according to the second specific example is described with reference to FIGS. 14 to 16. The projection image 314 shown in FIG. 14 is, like the projection image 305 shown in FIG. 5, an image presenting a plurality of key-shaped images, and it provides the user with an interface for character input. In the second specific example, the projection image 314 is projected by the projection unit 130 as the main image.
FIG. 17 is a diagram showing an example of the projection image 315 projected when the operation object 400 selects the key ";'()" in the projection image 314. The interface device 100 projects information (images) rather than displaying it on the screen of a display device, so the constraints on the size of the projected image are loose. For this reason, the interface device 100 can easily display a projection image 315 that extends beyond the frame of the projection image 314, as shown in FIG. 17.
The interface in the third specific example is described with reference to FIGS. 18 to 22. The interface in the third specific example accepts input of numerals. The user inputs numerals into the interface device 100 by repeating a first operation, which displays candidate numerals from 0 to 9, and a second operation, which selects one of the numerals displayed by the first operation.
The interface in the fourth specific example is described with reference to FIG. 24.
The interface in the fifth specific example is described with reference to FIGS. 25 and 26. The interface in the fifth specific example accepts input of words. The projection image 325 shown in FIG. 25 includes an area 325A, an area 325B and an area 325C. The area 325A has the same function as the projection image 314 in the second specific example. The area 325B displays the characters input using the area 325A. In the example of FIG. 25, the character string "vege" is displayed in the area 325B.
The interface in the sixth specific example is described with reference to FIGS. 27 to 30. The interface in the sixth specific example provides the functions of a full keyboard. The control unit 120 of the interface device 100 that realizes the sixth specific example has a function of detecting the fingers of a plurality of hands in the captured image and also detecting their movements.
The seventh specific example of the interface provided by the interface device 100 is described with reference to FIGS. 31 to 34. The interface in the seventh specific example provides functions corresponding to a mouse (a kind of auxiliary input device for entering information into a computer).
FIG. 33 is a diagram showing application example 1 (variation 1) of the seventh specific example. In this example, the control unit 120 executes a function equivalent to a left click or a right click of a mouse based on the positional relationship between the imaging position (or the movement) of the thumb 401 and the imaging position of the projection image 331B. The control unit 120 also detects the movement of the index finger 402 and, in accordance with this movement, recognizes position information that designates, for example, the placement position of a cursor. In this example, the control unit 120 recognizes position information from the movement of the index finger 402 but does not recognize selection of an option from it. Therefore, even if the operation object (hand) 400 moves quickly and the index finger 402 merely passes over the positions of the "R button" and "L button" in the projection image 331B, the control unit 120 is prevented from erroneously recognizing that the "R button" or the "L button" has been selected.
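The split of roles in application example 1 (clicks only from the thumb, cursor motion only from the index finger) can be sketched as follows. The event interface and box format are assumptions for illustration.

```python
# Hypothetical mouse emulation: the thumb position relative to the projected
# "L"/"R" button image (331B) produces click events, while the index fingertip
# motion produces cursor displacement only, never a button selection.
def inside(box, pos):
    x0, y0, x1, y1 = box
    return x0 <= pos[0] <= x1 and y0 <= pos[1] <= y1

def mouse_step(thumb_pos, index_pos, prev_index_pos, button_boxes, emit):
    """button_boxes: {'L': (x0, y0, x1, y1), 'R': (...)} detected in the captured
    image; emit(event, payload) forwards events to the host system."""
    if thumb_pos is not None:
        for name, box in button_boxes.items():
            if inside(box, thumb_pos):
                emit("click", name)          # left or right click from the thumb only
    if index_pos is not None and prev_index_pos is not None:
        dx = index_pos[0] - prev_index_pos[0]
        dy = index_pos[1] - prev_index_pos[1]
        emit("move", (dx, dy))               # index finger merely moves the cursor
```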
FIG. 34 is a diagram showing application example 2 (variation 2) of the seventh specific example. In this example, a meaning is given to the shape formed with several fingers. That is, the control unit 120 detects the shape of the operation object 400 in the captured image and recognizes operation information corresponding to the detected shape. For example, when the shape of the operation object (hand) 400 is a shape in which the thumb 401 touches the index finger 402, the control unit 120 recognizes the start of a drag operation. A drag operation is one of the operations performed with a mouse, a kind of input device; for example, it is an operation of entering information such as a designated range into a computer by moving the mouse while holding down a mouse button.
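A minimal sketch of the pinch-shape detection suggested by application example 2 follows; the pixel threshold is an assumption.

```python
# Hypothetical drag-gesture detector: when the thumb tip touches the index
# fingertip, a drag begins; when they separate, the drag ends.
PINCH_DISTANCE = 18.0    # px in the captured image, assumed

def drag_state(thumb_tip, index_tip, dragging):
    if thumb_tip is None or index_tip is None:
        return dragging
    dist = ((thumb_tip[0] - index_tip[0]) ** 2 + (thumb_tip[1] - index_tip[1]) ** 2) ** 0.5
    if not dragging and dist <= PINCH_DISTANCE:
        return True      # drag start (equivalent to pressing and holding a mouse button)
    if dragging and dist > PINCH_DISTANCE:
        return False     # drag end (button release)
    return dragging
```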
- An example of the hardware configuration of the control unit 120 -
FIG. 35 is a block diagram explaining an example of a hardware configuration capable of realizing the control unit 120.
Next, an example of the hardware configuration of the projection unit 130 is described. The projection unit 130 only needs to have a function of projecting the projection image 300 under the control of the control unit 120.
Specific examples of devices to which the interface device 100 is applied are described below. As described above, the interface device 100 is superior to conventional interface devices in terms of size, weight and power consumption. Taking advantage of these merits, the inventor considered applying the interface device 100 to a mobile terminal. The inventor also considered using it as a wearable terminal.
A specific example in which the interface device 100 is used for work of sorting articles is described below with reference to FIGS. 44 to 47.
A specific example in which the interface device 100 is used for work related to parts management is described below with reference to FIGS. 48 to 50.
A specific example in which the interface device 100 is used for operating a portable music player is described below with reference to FIGS. 51 to 53. Portable music players are widespread and are convenient devices that allow music to be listened to anywhere, but they have inconveniences such as having to be taken out of a bag or the like every time they are operated. Some operations can be performed with a small device attached to the earphones, but the amount of information and the range of operations are limited, and such devices are also too small to operate comfortably. Using the interface device 100 eliminates these inconveniences.
A specific example in which the interface device 100 is used as a remote control for a television or the like is described below with reference to FIGS. 54 and 55. Here, the interface device 100 may be implemented as a unit integrated with the remote control, or it may be provided separately from the remote control and configured to communicate with it. Alternatively, the interface device 100 may be configured with an external camera (imaging unit 110), an external projector (projection unit 130) and a control device functioning as the control unit 120. In this case, the control device (control unit 120) can communicate with the external camera and the external projector.
A specific example in which the interface device 100 is applied to a device having a Japanese input function is described below with reference to FIGS. 56 and 57.
FIG. 58 is a block diagram showing, in simplified form, the configuration of the interface device 100I according to the second embodiment of the present invention. The interface device 100I includes an imaging unit 110I, a control unit 120I and a projection unit 130I.
FIG. 59 is a block diagram showing, in simplified form, the configuration of the module 500 according to the third embodiment of the present invention. As shown in FIG. 59, the module 500 is used in a state of being communicably connected to the electronic component 800.
FIG. 60 is a block diagram showing, in simplified form, the configuration of the control device 600 according to the fourth embodiment of the present invention. The control device 600 is communicably connected to the electronic component 900.
FIG. 61 is a block diagram showing, in simplified form, the configuration of the interface device 100L according to the fifth embodiment of the present invention. The interface device 100L includes an imaging unit 110L, a control unit 120L and a projection unit 130L.
FIG. 62 is a block diagram showing, in simplified form, the configuration of the module 500M according to the sixth embodiment of the present invention. The module 500M is communicably connected to the electronic component 800M.
FIG. 63 is a block diagram showing, in simplified form, the configuration of the control device 600N according to the seventh embodiment of the present invention. The control device 600N is communicably connected to the electronic component 900N.
(Appendix 1) An interface device comprising:
projection means for projecting a first projection image;
imaging means for capturing an image including at least an operation object that operates the device itself and the first projection image; and
control means for accepting an operation by the operation object based on a relationship between a position where the operation object appears and a position where the first projection image appears in a captured image captured by the imaging means, and for controlling the projection means in accordance with the accepted operation.
(Appendix 2) The interface device according to Appendix 1, wherein
the control means controls the projection means to project a second projection image in response to accepting the operation, and
the control means further accepts the operation following the accepted operation based on a relationship between a position where the operation object appears in the captured image and a position where the second projection image projected by the projection means appears.
(Appendix 3) The interface device according to Appendix 2, wherein
the first projection image is an image for displaying a plurality of options,
the control means accepts an operation of selecting a first option from the plurality of options based on a relationship between the position of the operation object and the position of the first projection image in the captured image, and
the second projection image is an image for displaying options related to the first option.
(Appendix 4) The interface device according to Appendix 3, wherein
the control means accepts an operation by the operation object based on a relationship between a position where the operation object appears and a position where the projection image appears in the captured image and on a movement of the operation object, and
the control means accepts operations according to criteria that differ between the case where an operation is accepted in relation to the first projection image and the case where an operation is accepted in relation to the second projection image.
(Appendix 5) The interface device according to Appendix 4, wherein
the control means controls the projection means so that the second projection image is projected in a manner superimposed on the first projection image, and
the control means, when the second projection image appears in the captured image, based on the speed or acceleration of the operation object in the captured image,
either executes a process of accepting input of information corresponding to the first option among the plurality of options indicated by the second projection image,
or executes a process of controlling the projection means to project the first projection image again.
(Appendix 6) The interface device according to any one of Appendices 2 to 5, wherein, when the device is operated by a plurality of the operation objects,
the control means controls the projection means to project the first projection image in the vicinity of a first operation object among the plurality of operation objects and to project the second projection image in the vicinity of a second operation object different from the first operation object.
(Appendix 7) The interface device according to Appendix 6, wherein
the imaging means captures an image including the first operation object, the first projection image, the second operation object and the second projection image, and
the control means accepts an operation based on a relationship between a position where the first operation object appears and a position where the first projection image appears in the captured image and on a relationship between a position where the second operation object appears and a position where the second projection image appears in the captured image, and executes a process of controlling the projection means in accordance with the accepted operation.
(Appendix 8) The interface device according to Appendix 1, wherein
the control means accepts, from among a plurality of options, selection of a certain option determined based on a relationship between a position where the operation object appears and a position where the first projection image appears in the captured image captured by the imaging means, and controls the projection means to project a second projection image, which is an image corresponding to the certain option, in the vicinity of the operation object, and
the control means further accepts input of information corresponding to the certain option when a specific action by the operation object is detected in the captured image while the second projection image is included in the captured image.
(Appendix 9) The interface device according to Appendix 8, wherein the specific action is
an action in which, in the captured image, the position where the operation object appears remains, for a predetermined time or longer, at a position corresponding to the certain option in relation to the position where the first projection image appears,
or
an action in which, in the captured image, the position where the operation object appears changes at or above a predetermined speed or acceleration toward the direction in which the second projection image appears or toward the direction opposite to that direction.
(Appendix 10) The interface device according to Appendix 1, wherein
the operation object is an index finger of a user who operates the interface device,
the control means, upon detecting that the first projection image and the user's index finger appear in the captured image, controls the projection means to project a second projection image onto one of an area near the user's index finger and the user's thumb and an area near the user's middle finger, and to project a third projection image onto the other of these areas, and
the control means further
accepts a first operation in accordance with a positional relationship between a position where the second projection image appears and a position where the user's index finger appears in the captured image, and
accepts a second operation in accordance with a positional relationship between a position where the third projection image appears and a position where the user's index finger appears in the captured image.
(Appendix 11) An interface device comprising:
projection means for projecting a projection image;
imaging means for capturing the projection image; and
control means for calculating a positional relationship between a surface onto which the projection image is projected and the imaging means based on information indicating a position where the projection image appears in a captured image captured by the imaging means.
(Appendix 12) The interface device according to Appendix 11, wherein
the imaging means captures an image including an operation object that operates the device itself and the projection image,
the control means accepts an operation by the operation object based on a relationship between a position where the operation object appears and a position where the projection image appears in the captured image captured by the imaging means, and executes a process of controlling the projection means in accordance with the accepted operation, and
the projection image includes a first area used for operations on the interface device based on the positional relationship with the operation object, and a second area, different from the first area, used to calculate the positional relationship between the surface onto which the projection image is projected and the imaging means.
(Appendix 13) The interface device according to any one of Appendices 1 to 12, wherein
the projection means includes
a laser light source for emitting laser light, and
an element that, when the laser light is incident thereon, modulates the phase of the laser light and emits it, and
the control means determines, in accordance with the content of the accepted operation, an image to be formed based on the light emitted by the element, and controls the element so that the determined image is formed.
(Appendix 14) The interface device according to Appendix 13, wherein
the control means controls the element so as to change, for each of the light-receiving regions, a parameter that determines the difference between the phase of the light incident on that light-receiving region and the phase of the light emitted from that light-receiving region.
(Appendix 15) A portable electronic device in which the interface device according to any one of Appendices 1 to 14 is incorporated.
(Appendix 16) An accessory in which the interface device according to any one of Appendices 1 to 14 is incorporated.
(Appendix 17) A module used by being incorporated in an electronic device, the module comprising:
projection means for projecting a first projection image; and
control means for receiving an image including at least an operation object that operates the device itself and the first projection image, accepting an operation by the operation object based on a relationship between a position where the operation object appears and a position where the first projection image appears in the received captured image, and executing a process of controlling the projection means in accordance with the accepted operation.
(Appendix 18) A control device for controlling projection means that projects a first projection image,
the control device receiving an image including at least an operation object that operates the device and the first projection image, accepting an operation by the operation object based on a relationship between a position where the operation object appears and a position where the first projection image appears in the received captured image, and transmitting to the projection means a signal for controlling the projection means in accordance with the accepted operation.
(Appendix 19) A control method executed by a computer that controls an interface device comprising imaging means and projection means for projecting a first projection image, the method comprising:
controlling the imaging means to capture an image including at least an operation object that operates the device and the first projection image; and
accepting an operation by the operation object based on a relationship between a position where the operation object appears and a position where the first projection image appears in a captured image captured by the imaging means, and executing a process of controlling the projection means in accordance with the accepted operation.
(Appendix 20) A program causing a computer that controls an interface device comprising imaging means and projection means for projecting a first projection image to execute:
a process of controlling the imaging means to capture an image including at least an operation object that operates the device and the first projection image; and
a process of accepting an operation by the operation object based on a relationship between a position where the operation object appears and a position where the first projection image appears in a captured image captured by the imaging means, and of controlling the projection means in accordance with the accepted operation.
(Appendix 21) A module used by being incorporated in an electronic device including imaging means, the module comprising:
projection means for projecting a projection image; and
control means for calculating a positional relationship between a surface onto which the projection image is projected and the imaging means based on information indicating a position where the projection image appears in a captured image captured by the imaging means.
(Appendix 22) A control device that controls an electronic device including imaging means and projection means for projecting a projection image,
the control device calculating a positional relationship between a surface onto which the projection image is projected and the imaging means based on information indicating a position where the projection image appears in a captured image captured by the imaging means.
(Appendix 23) A control method executed by a computer that controls an interface device including imaging means and projection means for projecting a projection image,
the method calculating a positional relationship between a surface onto which the projection image is projected and the imaging means based on information indicating a position where the projection image appears in a captured image captured by the imaging means.
(Appendix 24) A program causing a computer that controls an interface device including imaging means and projection means for projecting a projection image to execute a process of calculating a positional relationship between a surface onto which the projection image is projected and the imaging means based on information indicating a position where the projection image appears in a captured image captured by the imaging means.
2 Storage unit
100 Interface device
110 Imaging unit
120 Control unit
130 Projection unit
200 Operation surface
300 Projection image
400 Operation object
500 Module
600 Control device
800 Electronic component
900 Electronic component
Claims (18)
- 1. An interface device comprising:
projection means for projecting a first projection image;
imaging means for imaging an area onto which the first projection image is projected; and
control means for recognizing, when the first projection image and an operation object both appear in a captured image captured by the imaging means, operation information input by the operation object based on a positional relationship between an imaging position at which the first projection image appears and an imaging position at which the operation object appears in the captured image.
- 2. The interface device according to claim 1, wherein the control means controls the projection means to project a second projection image in accordance with the operation information, and the control means further recognizes next operation information input by the operation object based on a positional relationship between the imaging position of the operation object in the captured image and the imaging position of the second projection image projected by the projection means.
- 3. The interface device according to claim 2, wherein the first projection image is an image in which a plurality of options are displayed, the control means recognizes information on an operation in which one of the plurality of options is selected based on a positional relationship between the imaging position of the operation object and the imaging position of the first projection image in the captured image, and the second projection image is an image in which options related to the first option are displayed.
- 4. The interface device according to claim 3, wherein a criterion used by the control means when recognizing operation information input by the operation object using the imaging position of the first projection image in the captured image differs from a criterion used by the control means when recognizing operation information input by the operation object using the imaging position of the second projection image in the captured image.
- 5. The interface device according to claim 2, 3 or 4, wherein the projection means projects the second projection image in a manner superimposed on the first projection image.
- 6. The interface device according to any one of claims 1 to 5, wherein the control means recognizes the operation information input by the operation object using information on the movement of the operation object in addition to the positional relationship.
- 7. The interface device according to claim 6, wherein, when the control means detects a motion in which the operation object remains in place for a predetermined time or longer, the control means recognizes the operation information input by the operation object based on the projection image projected at the position where the operation object remains; or, when the control means detects a motion in which the operation object moves linearly at or above a predetermined speed or a predetermined acceleration, the control means recognizes the operation information input by the operation object based on the projection image located in the direction toward which the operation object is moving; or, when the control means detects a motion in which the operation object moves linearly at or above a predetermined speed or a predetermined acceleration, the control means recognizes the operation information input by the operation object based on the projection image projected at the position where the operation object started the motion.
- 8. The interface device according to any one of claims 1 to 7, wherein the control means has a function of distinguishing a plurality of types of the operation objects, and, when a plurality of types of the operation objects appear in the captured image, the control means controls the projection means so that a separate projection image is projected in the vicinity of each of the operation objects.
- 9. The interface device according to claim 8, wherein one of the operation objects is a thumb of a user and another of the operation objects is an index finger of the user, the control means controls the projection means so that the first projection image is projected onto an area including the thumb and the index finger of the user and so that mutually different projection images are projected in the vicinity of the thumb and in the vicinity of the index finger, and the control means further recognizes operation information input by the thumb based on a positional relationship between the imaging position of the thumb and the imaging position of the projection image projected in the vicinity of the thumb, and recognizes operation information input by the index finger based on a positional relationship between the imaging position of the index finger and the imaging position of the projection image projected in the vicinity of the index finger.
- 10. An interface device comprising:
projection means for projecting a projection image;
imaging means for imaging an area onto which the projection image is projected; and
control means for calculating a positional relationship between a surface onto which the projection image is projected and the imaging means based on information on an imaging position at which the projection image appears in a captured image captured by the imaging means.
- 11. The interface device according to claim 10, wherein, when the projection image and an operation object both appear in the captured image captured by the imaging means, the control means recognizes operation information input by the operation object based on a positional relationship between the imaging position at which the projection image appears and the imaging position at which the operation object appears in the captured image, and the projection image projected by the projection means includes an image area used in the process of recognizing the operation information input by the operation object and an image area used in the process of calculating the positional relationship between the surface onto which the projection image is projected and the imaging means.
- 12. The interface device according to any one of claims 1 to 11, wherein the projection means includes a laser light source that emits laser light and an element that modulates the phase of the incident laser light and emits the modulated laser light, and the control means determines, in accordance with the recognized operation information, an image to be formed based on the light emitted by the element, and controls the element so that the determined image is formed.
- 13. The interface device according to claim 12, wherein the element has a plurality of light-receiving regions, each of the light-receiving regions modulates the phase of the laser light incident on that region and emits it, and the control means controls the element so that, for each of the light-receiving regions, a parameter that determines the difference between the phase of the light incident on that region and the phase of the light emitted from that region is changed.
- 14. A portable device comprising the interface device according to any one of claims 1 to 13.
- 15. A control device that receives a captured image in which a projection image projected by projection means to be controlled appears and, when an operation object appears in the captured image together with the projection image, recognizes operation information input by the operation object based on a relationship between an imaging position at which the projection image appears and an imaging position at which the operation object appears in the captured image, and controls the projection means in accordance with the operation information.
- 16. A module comprising:
the control device according to claim 15; and
projection means controlled by the control device.
- 17. A control method comprising:
receiving a captured image in which a projection image projected by projection means to be controlled appears;
when an operation object appears in the captured image together with the projection image, recognizing operation information input by the operation object based on a relationship between an imaging position at which the projection image appears and an imaging position at which the operation object appears in the captured image; and
controlling the projection means in accordance with the operation information.
- 18. A program storage medium holding a computer program that causes a computer to execute:
a process of receiving a captured image in which a projection image projected by projection means to be controlled appears and, when an operation object appears in the captured image together with the projection image, recognizing operation information input by the operation object based on a relationship between an imaging position at which the projection image appears and an imaging position at which the operation object appears in the captured image; and
a process of controlling the projection means in accordance with the operation information.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015556783A JPWO2015105044A1 (ja) | 2014-01-10 | 2015-01-07 | インターフェース装置、可搬装置、制御装置、モジュール、制御方法およびコンピュータプログラム |
US15/110,486 US20160349926A1 (en) | 2014-01-10 | 2015-01-07 | Interface device, portable device, control device and module |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014-003224 | 2014-01-10 | ||
JP2014003224 | 2014-01-10 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/110,486 A-371-Of-International US20160349926A1 (en) | 2014-01-10 | 2015-01-07 | Interface device, portable device, control device and module |
US15/873,308 Continuation US10299201B2 (en) | 2014-01-09 | 2018-01-17 | Methods and systems relating to ultra wideband broadcasting |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015105044A1 true WO2015105044A1 (ja) | 2015-07-16 |
Family
ID=53523874
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/000030 WO2015105044A1 (ja) | 2014-01-10 | 2015-01-07 | インターフェース装置、可搬装置、制御装置、モジュール、制御方法およびプログラム記憶媒体 |
Country Status (3)
Country | Link |
---|---|
US (1) | US20160349926A1 (ja) |
JP (1) | JPWO2015105044A1 (ja) |
WO (1) | WO2015105044A1 (ja) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018079446A1 (ja) * | 2016-10-27 | 2018-05-03 | 日本電気株式会社 | 情報入力装置および情報入力方法 |
WO2018146922A1 (ja) * | 2017-02-13 | 2018-08-16 | ソニー株式会社 | 情報処理装置、情報処理方法、及びプログラム |
JP2019003397A (ja) * | 2017-06-15 | 2019-01-10 | コニカミノルタ株式会社 | 情報処理装置及びプログラム |
JP2020071587A (ja) * | 2018-10-30 | 2020-05-07 | セイコーエプソン株式会社 | 表示装置、及び、表示装置の制御方法 |
Families Citing this family (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9690400B2 (en) | 2015-04-21 | 2017-06-27 | Dell Products L.P. | Information handling system interactive totems |
US11243640B2 (en) | 2015-04-21 | 2022-02-08 | Dell Products L.P. | Information handling system modular capacitive mat with extension coupling devices |
US11106314B2 (en) | 2015-04-21 | 2021-08-31 | Dell Products L.P. | Continuous calibration of an information handling system projected user interface |
US9921644B2 (en) | 2015-04-21 | 2018-03-20 | Dell Products L.P. | Information handling system non-linear user interface |
US9791979B2 (en) | 2015-04-21 | 2017-10-17 | Dell Products L.P. | Managing inputs at an information handling system by adaptive infrared illumination and detection |
US9804733B2 (en) | 2015-04-21 | 2017-10-31 | Dell Products L.P. | Dynamic cursor focus in a multi-display information handling system environment |
US9983717B2 (en) | 2015-04-21 | 2018-05-29 | Dell Products L.P. | Disambiguation of false touch inputs at an information handling system projected user interface |
US9804718B2 (en) * | 2015-04-21 | 2017-10-31 | Dell Products L.P. | Context based peripheral management for interacting with an information handling system |
JP6354653B2 (ja) * | 2015-04-25 | 2018-07-11 | 京セラドキュメントソリューションズ株式会社 | 拡張現実操作システムおよび拡張現実操作プログラム |
JP6631181B2 (ja) * | 2015-11-13 | 2020-01-15 | セイコーエプソン株式会社 | 画像投射システム、プロジェクター、及び、画像投射システムの制御方法 |
WO2017087872A1 (en) * | 2015-11-20 | 2017-05-26 | Harman International Industries, Incorporated | Dynamic reconfigurable display knobs |
CN107015736B (zh) * | 2016-01-27 | 2020-08-21 | 北京搜狗科技发展有限公司 | 一种按键处理方法和装置、一种用于按键处理的装置 |
US10496216B2 (en) | 2016-11-09 | 2019-12-03 | Dell Products L.P. | Information handling system capacitive touch totem with optical communication support |
US10139951B2 (en) | 2016-11-09 | 2018-11-27 | Dell Products L.P. | Information handling system variable capacitance totem input management |
US10139973B2 (en) | 2016-11-09 | 2018-11-27 | Dell Products L.P. | Information handling system totem tracking management |
US10139930B2 (en) | 2016-11-09 | 2018-11-27 | Dell Products L.P. | Information handling system capacitive touch totem management |
US10146366B2 (en) | 2016-11-09 | 2018-12-04 | Dell Products L.P. | Information handling system capacitive touch totem with optical communication support |
FR3063821B1 (fr) * | 2017-03-10 | 2021-07-30 | Inst Mines Telecom | Interface homme machine |
US10459528B2 (en) | 2018-02-28 | 2019-10-29 | Dell Products L.P. | Information handling system enhanced gesture management, control and detection |
US10664101B2 (en) | 2018-06-28 | 2020-05-26 | Dell Products L.P. | Information handling system touch device false touch detection and mitigation |
US10761618B2 (en) | 2018-06-28 | 2020-09-01 | Dell Products L.P. | Information handling system touch device with automatically orienting visual display |
US10795502B2 (en) | 2018-06-28 | 2020-10-06 | Dell Products L.P. | Information handling system touch device with adaptive haptic response |
US10635199B2 (en) | 2018-06-28 | 2020-04-28 | Dell Products L.P. | Information handling system dynamic friction touch device for touchscreen interactions |
US10852853B2 (en) | 2018-06-28 | 2020-12-01 | Dell Products L.P. | Information handling system touch device with visually interactive region |
US10817077B2 (en) | 2018-06-28 | 2020-10-27 | Dell Products, L.P. | Information handling system touch device context aware input tracking |
KR102207067B1 (ko) * | 2018-12-28 | 2021-01-25 | 이진우 | 홀로그램 기반의 문자 인식 방법 및 그 장치 |
CN111093066A (zh) * | 2019-12-03 | 2020-05-01 | 耀灵人工智能(浙江)有限公司 | 一种动态平面投影方法及*** |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2006104132A1 (ja) * | 2005-03-28 | 2006-10-05 | Matsushita Electric Industrial Co., Ltd. | ユーザインタフェイスシステム |
JP2008134793A (ja) * | 2006-11-28 | 2008-06-12 | Fujifilm Corp | 電子的手書入力装置 |
US20100199232A1 (en) * | 2009-02-03 | 2010-08-05 | Massachusetts Institute Of Technology | Wearable Gestural Interface |
JP2012108771A (ja) * | 2010-11-18 | 2012-06-07 | Panasonic Corp | 画面操作システム |
WO2012173001A1 (ja) * | 2011-06-13 | 2012-12-20 | シチズンホールディングス株式会社 | 情報入力装置 |
JP2013061552A (ja) * | 2011-09-14 | 2013-04-04 | Ricoh Co Ltd | プロジェクタ装置および操作検出方法 |
JP2013182342A (ja) * | 2012-02-29 | 2013-09-12 | Sharp Corp | 入力装置、入力装置の制御方法、制御プログラム、および記録媒体 |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006268209A (ja) * | 2005-03-23 | 2006-10-05 | Akinori Yoshino | ユーザの身体動作による遠隔命令入力装置 |
US8726194B2 (en) * | 2007-07-27 | 2014-05-13 | Qualcomm Incorporated | Item selection using enhanced control |
WO2012088046A2 (en) * | 2010-12-21 | 2012-06-28 | Syndiant, Inc. | Spatial light modulator with storage reducer |
JP5840399B2 (ja) * | 2011-06-24 | 2016-01-06 | 株式会社東芝 | 情報処理装置 |
JP2013074601A (ja) * | 2011-09-29 | 2013-04-22 | Manabu Chijimatsu | 類推型注視点検出及び携帯情報端末操作 |
JP6145963B2 (ja) * | 2012-04-05 | 2017-06-14 | セイコーエプソン株式会社 | プロジェクター、表示システム、及びプロジェクターの制御方法 |
US20160054860A1 (en) * | 2013-03-27 | 2016-02-25 | Sharp Kabushiki Kaisha | Input device |
- 2015-01-07 WO PCT/JP2015/000030 patent/WO2015105044A1/ja active Application Filing
- 2015-01-07 US US15/110,486 patent/US20160349926A1/en not_active Abandoned
- 2015-01-07 JP JP2015556783A patent/JPWO2015105044A1/ja active Pending
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2006104132A1 (ja) * | 2005-03-28 | 2006-10-05 | Matsushita Electric Industrial Co., Ltd. | ユーザインタフェイスシステム |
JP2008134793A (ja) * | 2006-11-28 | 2008-06-12 | Fujifilm Corp | 電子的手書入力装置 |
US20100199232A1 (en) * | 2009-02-03 | 2010-08-05 | Massachusetts Institute Of Technology | Wearable Gestural Interface |
JP2012108771A (ja) * | 2010-11-18 | 2012-06-07 | Panasonic Corp | 画面操作システム |
WO2012173001A1 (ja) * | 2011-06-13 | 2012-12-20 | シチズンホールディングス株式会社 | 情報入力装置 |
JP2013061552A (ja) * | 2011-09-14 | 2013-04-04 | Ricoh Co Ltd | プロジェクタ装置および操作検出方法 |
JP2013182342A (ja) * | 2012-02-29 | 2013-09-12 | Sharp Corp | 入力装置、入力装置の制御方法、制御プログラム、および記録媒体 |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018079446A1 (ja) * | 2016-10-27 | 2018-05-03 | 日本電気株式会社 | 情報入力装置および情報入力方法 |
JPWO2018079446A1 (ja) * | 2016-10-27 | 2019-09-19 | 日本電気株式会社 | 情報入力装置および情報入力方法 |
US10955971B2 (en) | 2016-10-27 | 2021-03-23 | Nec Corporation | Information input device and information input method |
WO2018146922A1 (ja) * | 2017-02-13 | 2018-08-16 | ソニー株式会社 | 情報処理装置、情報処理方法、及びプログラム |
JP2019003397A (ja) * | 2017-06-15 | 2019-01-10 | コニカミノルタ株式会社 | 情報処理装置及びプログラム |
JP2020071587A (ja) * | 2018-10-30 | 2020-05-07 | セイコーエプソン株式会社 | 表示装置、及び、表示装置の制御方法 |
JP7247519B2 (ja) | 2018-10-30 | 2023-03-29 | セイコーエプソン株式会社 | 表示装置、及び、表示装置の制御方法 |
Also Published As
Publication number | Publication date |
---|---|
JPWO2015105044A1 (ja) | 2017-03-23 |
US20160349926A1 (en) | 2016-12-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2015105044A1 (ja) | インターフェース装置、可搬装置、制御装置、モジュール、制御方法およびプログラム記憶媒体 | |
US11816296B2 (en) | External user interface for head worn computing | |
US11886638B2 (en) | External user interface for head worn computing | |
US20170336872A1 (en) | External user interface for head worn computing | |
US10591729B2 (en) | Wearable device | |
US20170100664A1 (en) | External user interface for head worn computing | |
JP5802667B2 (ja) | ジェスチャ入力装置およびジェスチャ入力方法 | |
US20170017323A1 (en) | External user interface for head worn computing | |
US20160026239A1 (en) | External user interface for head worn computing | |
US20160027211A1 (en) | External user interface for head worn computing | |
US20150205351A1 (en) | External user interface for head worn computing | |
JP4681629B2 (ja) | 表示デバイスのキャリブレーション方法及び装置 | |
WO2015195444A1 (en) | External user interface for head worn computing | |
WO2015179877A2 (en) | External user interface for head worn computing | |
US20150220149A1 (en) | Systems and methods for a virtual grasping user interface | |
WO2017015093A1 (en) | External user interface for head worn computing | |
US9678663B2 (en) | Display system and operation input method | |
US20170228057A1 (en) | Projection display unit and function control method | |
CN109643206B (zh) | 控制装置、显示装置、程序及检测方法 | |
JP2013171529A (ja) | 操作入力装置、操作判定方法およびプログラム | |
CN117480483A (zh) | 用于增强现实设备的文本输入方法 | |
US12032754B2 (en) | Information processing apparatus, information processing method, and non-transitory computer readable medium | |
US20240053832A1 (en) | Information processing apparatus, information processing method, and non-transitory computer readable medium | |
JP6169462B2 (ja) | 情報処理装置及び情報処理方法 | |
Molineux et al. | Search Light Interactions with Personal Projector |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15735545 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2015556783 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15110486 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 15735545 Country of ref document: EP Kind code of ref document: A1 |