WO2015062248A1 - Display device and control method thereof, and gesture recognition method - Google Patents
Display device and control method thereof, and gesture recognition method
- Publication number
- WO2015062248A1 (PCT/CN2014/078016)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- unit
- virtual
- user
- control screen
- control
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
Definitions
- The invention belongs to the technical field of gesture recognition, and particularly relates to a display device, a control method thereof, and a gesture recognition method. Background Art
- A display device having a gesture recognition function includes a display unit for performing display and an image collection unit (a camera, webcam, etc.) for capturing gestures; the device analyzes the images collected by the image collection unit to determine what action the user is performing.
- In existing devices, the "select" and "determine" operations must be performed separately through different gestures, which is cumbersome. For example, to change the channel of a television by gesture, the user first selects a channel with a first gesture (such as waving from left to right), each wave changing the channel once; when the desired channel number is reached, a second gesture (such as waving from top to bottom) confirms it. That is, the gesture recognition technology of existing display devices cannot combine the "select" and "determine" operations: it cannot, like a tablet computer, "click" one of several candidate icons once to both select and execute the corresponding instruction. This is because a "click" operation requires the click position to be determined accurately.
- On a tablet, the hand is directly on the screen, so determining the click position by touch technology is feasible.
- For a display device, however, the hand usually cannot touch the display unit (especially for a television, where the user sits far from the screen during normal use) and can only "point" at a certain position of the display unit (such as an icon displayed on it).
- The accuracy of such long-distance "pointing" is very poor.
- Moreover, the pointing gestures of different users differ: some point to the left of the target and some to the right, so the position the user intends to point at cannot be determined, and the "click" operation cannot be realized. Summary of the invention
- The technical problem to be solved by the present invention is that the "select" and "determine" operations must be performed separately in existing gesture recognition. The invention provides a display device, a control method thereof, and a gesture recognition method capable of achieving the "select" and "determine" operations in one step through gesture recognition.
- The technical solution adopted to solve the above technical problem is a control method for a display device, comprising: a display unit displaying a control screen, and a 3D unit converting the control screen into a virtual 3D control screen and presenting it to the user, wherein the 3D unit includes 3D glasses, the virtual distance between the virtual 3D control screen and the user's eyes equals a first distance, and the first distance is smaller than the distance between the display unit and the user's eyes; an image collection unit collecting images of the user's click actions on the virtual 3D control screen; and a gesture recognition unit determining, from the images collected by the image collection unit, the position the user clicked on the virtual 3D control screen and sending the control instruction corresponding to that click position to the corresponding execution unit.
- the first distance is less than or equal to the length of the user's arm.
- the first distance is less than or equal to 0.5 meters and greater than or equal to 0.25 meters.
- the virtual 3D control screen is distributed over the entire display screen for displaying the virtual 3D control screen; or the virtual 3D control screen is a part of a display screen for displaying the virtual 3D control screen.
- the virtual 3D control picture is divided into at least two areas, and each area corresponds to one control instruction.
- The method further includes: the positioning unit determining the position of the user relative to the display unit; and the step of the gesture recognition unit determining the user's click position from the collected images includes: the gesture recognition unit determining the click position on the virtual 3D control screen according to the images collected by the image collection unit together with the position of the user relative to the display unit.
- the positioning unit determines that the position of the user relative to the display unit comprises: the positioning unit analyzes the image collected by the image collection unit, thereby determining the position of the user relative to the display unit.
- A technical solution to solve the technical problem to be solved by the present invention is a display device comprising: a display unit for performing display; a 3D unit, including 3D glasses, for converting the control screen displayed by the display unit into a virtual 3D control screen presented to the user, the virtual distance between the virtual 3D control screen and the user's eyes being equal to a first distance that is smaller than the distance between the display unit and the user's eyes; an image collection unit for collecting images of the user's click actions on the virtual 3D control screen; and a gesture recognition unit configured to determine, from the collected images, the position the user clicked on the virtual 3D control screen and send the control instruction corresponding to the click position to the corresponding execution unit.
- the display unit is a television display or a computer display.
- the 3D unit further includes a 3D polarizing film disposed outside the display surface of the display unit.
- the display device further includes: a positioning unit configured to determine a position of the user relative to the display unit.
- the positioning unit is configured to analyze the image collected by the image collection unit to determine the position of the user relative to the display unit.
- A technical solution for solving the technical problem to be solved by the present invention is a gesture recognition method, comprising: a display unit displaying a control screen, and a 3D unit converting the control screen into a virtual 3D control screen presented to the user, wherein the 3D unit includes 3D glasses, the virtual distance between the virtual 3D control screen and the user's eyes equals a first distance, and the first distance is smaller than the distance between the display unit and the user's eyes; an image collection unit collecting images of the user's click actions on the virtual 3D control screen; and a gesture recognition unit determining, from the collected images, the position the user clicked on the virtual 3D control screen and sending the control instruction corresponding to the click position to the corresponding execution unit.
- The "3D unit" can convert the planar image displayed by the display unit into a stereoscopic 3D image.
- the "virtual 3D control screen” refers to a stereoscopic control screen converted by the 3D unit, and the control screen is used for control.
- "Virtual distance" refers to the distance that the user perceives between the virtual 3D control screen and himself.
- The sense of distance is part of the stereoscopic effect and arises from the difference between the images seen by the left and right eyes. Therefore, as long as the display unit displays suitable content that is then converted by the 3D unit, the user perceives the virtual 3D control screen at a fixed distance in front of himself; even if the user moves away from or closer to the display unit, the perceived distance between the virtual 3D control screen and himself stays the same.
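The text notes that the perceived distance comes only from the difference between the left- and right-eye images. Standard stereoscopic geometry (a general illustration, not a formula given in the patent) makes this concrete: for eyes `eye_sep` apart viewing a display `screen_dist` away, a point perceived at `virtual_dist` in front of the eyes requires a crossed on-screen image separation of `eye_sep * (screen_dist - virtual_dist) / virtual_dist`.

```python
def crossed_disparity(eye_sep, screen_dist, virtual_dist):
    """On-screen horizontal offset (m) between the left- and right-eye
    images needed to make a point appear at virtual_dist from the eyes.

    eye_sep      -- interpupillary distance (m)
    screen_dist  -- distance from the eyes to the display unit (m)
    virtual_dist -- desired perceived distance of the virtual 3D
                    control screen (m); must be smaller than screen_dist
    """
    # Similar triangles: the eye-to-point rays, extended to the display
    # plane, cross over, so the left-eye image lands to the right of the
    # right-eye image (crossed disparity) for points in front of the screen.
    return eye_sep * (screen_dist - virtual_dist) / virtual_dist

# Illustrative numbers: a viewer 3 m from the display, virtual screen 0.5 m
# from the eyes, 65 mm interpupillary distance.
print(crossed_disparity(0.065, 3.0, 0.5))
```

Note how large the required offset becomes (0.325 m in this example): pulling the virtual screen very close to a distant viewer demands substantial image separation, which is consistent with the patent's choice of a 0.25–0.5 m first distance rather than something arbitrarily small.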
- The "execution unit" refers to any unit that can execute the corresponding control instruction; depending on the instruction, it may be, for example, the display unit or the sounding unit.
- With the above scheme, the 3D unit presents a virtual 3D control screen to the user, and since the distance between this screen and the user is smaller than the distance between the display unit and the user, the user feels that the control screen is very close (right in front of him) and can reach out and accurately "touch" the virtual 3D control screen. The actions of different users clicking the same position of the virtual 3D control screen are therefore the same or similar, so the gesture recognition unit can accurately determine the click position the user intends, thereby implementing a "click" operation that combines "select" and "determine".
- the invention is used for the control of a display device, and is particularly suitable for the control of a television.
- Fig. 1 is a flow chart showing a method of controlling a display device according to a first embodiment of the present invention.
- Fig. 2 is a view showing a state in which the display device of the first embodiment of the present invention displays a virtual 3D control screen.
- the embodiment provides a control method for a display device.
- the display device to which the method is applied includes a display unit, a 3D unit, an image collection unit, a gesture recognition unit, and preferably a positioning unit.
- the display unit is any display device capable of displaying a 2D picture, such as a liquid crystal display device, an organic light emitting diode display device, and the like.
- Preferably, the display unit is a television display. People operate a television relatively frequently (changing channels, adjusting the volume, etc.) while usually sitting far from it, which makes control by touch or similar means difficult, so the present invention is particularly suitable for televisions. Of course, the display unit may also be a computer display or the like.
- the 3D unit refers to a device that converts a flat image displayed by the display unit into a stereoscopic 3D image, which includes 3D glasses for the user to wear.
- The 3D glasses may be shutter-type 3D glasses, which open the left and right lenses in turn (for example, switching every frame) so that each eye sees only the frames intended for it, achieving the 3D effect.
- Alternatively, the 3D unit may include 3D glasses together with a 3D polarizing film disposed on the outer side of the display surface of the display unit; the 3D polarizing film converts light from different positions of the display unit into polarized light with different polarization directions.
- the left and right eyeglasses of the 3D glasses are different polarizers, so that the polarized light passing through the 3D polarizing film can be differently filtered, so that the left and right eyes respectively see different images. Since there are many methods for realizing 3D display through 3D glasses, they are not described one by one here.
- the image collection unit is used to collect images of the user, which may be known devices such as a CCD (Charge Coupled Device) camera or a camera. From a convenient point of view, the image collection unit can be located near the display unit (e.g., fixed above or to the side of the display unit) or integrated with the display unit.
- the above control method includes the following steps S01 to S04.
- S01. The display unit displays a control screen, and the 3D unit converts the control screen into a virtual 3D control screen and provides the same to the user.
- The virtual distance between the virtual 3D control screen and the user's eyes equals the first distance, and the first distance is smaller than the distance between the display unit and the user's eyes.
- the control screen refers to a screen specially used for controlling the display device, and includes various control commands for the display device, and the user can realize different control of the display device by selecting different control commands.
- The display unit 1 displays a control screen, and the 3D unit including the 3D glasses 2 converts it into a virtual 3D control screen 4 in the form of a stereoscopic image, making the user feel that the virtual 3D control screen 4 is located in front of him at a distance (the first distance) smaller than the distance between the display unit 1 and the user. Since the user feels that the virtual 3D control screen 4 is close to him, he can accurately "click" a certain position on it, so the display device can more accurately determine what operation the user wants to perform, realizing "click" control.
- the first distance is less than or equal to the length of the user's arm.
- the user feels that he can "touch" the virtual 3D control screen 4 by hand, thus maximally ensuring the accuracy of the click action.
- More preferably, the first distance is less than or equal to 0.5 meters and greater than or equal to 0.25 meters. Within this range, most people can "reach" the virtual 3D control screen 4 without straightening their arms, and do not feel that the virtual 3D control screen 4 is too close to them.
- the virtual 3D control screen 4 is spread over the entire display screen for displaying the virtual 3D control screen 4. That is to say, when the virtual 3D control screen 4 is displayed, the virtual 3D control screen 4 is the entire display content, and the user can only see the virtual 3D control screen 4, so that the virtual 3D control screen 4 has a larger area and can accommodate More control commands to be selected, and the click accuracy is higher.
- Alternatively, the virtual 3D control screen 4 may be a part of the entire display screen. That is, the virtual 3D control screen 4 is displayed together with a normal picture (such as a television program), and the virtual 3D control screen 4 seen by the user can be located at the side or corner of the display, so that the user can watch the regular picture and the virtual 3D control screen 4 simultaneously and perform control at any time (such as adjusting the volume or changing channels).
- When the virtual 3D control screen 4 is spread over the entire display screen, it is preferably displayed only when certain conditions are satisfied (e.g., the user issues an instruction), while the normal screen is displayed in other cases.
- the virtual 3D control screen 4 is a part of the display screen for displaying the virtual 3D control screen 4, it can be continuously displayed.
- the virtual 3D control screen 4 is divided into at least two areas, each of which corresponds to one control command. That is to say, the virtual 3D control screen 4 can be divided into a plurality of different areas, and different control commands can be executed by clicking different areas, so that a plurality of different operations can be performed through one virtual 3D control screen 4.
- the virtual 3D control screen 4 can be equally divided into a total of 9 rectangular regions of 3 rows and 3 columns, and each rectangular region corresponds to a control command (such as changing the volume, changing the station number, changing the brightness, Exit the virtual 3D control screen 4, etc.).
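The 3-row, 3-column division described above amounts to a lookup from a normalized click position to a control command. A minimal sketch (the command names are invented for illustration; the patent only gives examples such as changing the volume, station number, or brightness, and exiting the screen):

```python
# Hypothetical command layout for the nine regions, row by row.
COMMANDS = [
    "volume_up",   "channel_up",          "brightness_up",
    "volume_down", "channel_down",        "brightness_down",
    "mute",        "exit_control_screen", "enter_full_screen",
]

def region_command(x, y, rows=3, cols=3):
    """Map a click position, normalized to [0, 1) across the virtual 3D
    control screen (x rightward, y downward), to the command of the
    grid cell that was hit."""
    if not (0.0 <= x < 1.0 and 0.0 <= y < 1.0):
        return None  # the click fell outside the control screen
    row = int(y * rows)
    col = int(x * cols)
    return COMMANDS[row * cols + col]
```

A click in the middle of the screen, `region_command(0.5, 0.5)`, lands in the center cell; because each region is large relative to pointing error, small inaccuracies in the recovered click position do not change the selected command.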
- Of course, it is also feasible for the virtual 3D control screen 4 to correspond to only one control instruction (for example, when the virtual 3D control screen 4 is a part of the display screen, the single instruction may be "enter full-screen control screen").
- While the control picture is converted into 3D form, the conventional picture (such as a TV program) can remain in 2D form. In that case, when viewing the regular picture, the user need not wear the 3D glasses 2, or the left and right lenses of the 3D glasses 2 are opened simultaneously, or the left-eye and right-eye images displayed by the display unit 1 are the same.
- S02. The image collection unit collects an image of the user's click action on the virtual 3D control screen.
- the image collection unit 5 fixed above the display unit 1 collects an image of the click action of the user's hand 3 on the virtual 3D control screen 4. That is, when the display unit 1 displays the control screen and the 3D unit converts the control screen into the virtual 3D control screen 4 and provides it to the user, the image collection unit 5 is turned on, thereby collecting the image of the user's motion, specifically collecting the image. An image of the action of the user's hand 3 clicking on the virtual 3D control screen 4.
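The patent does not specify how the click action is isolated from the video stream. One common, simple approach (an illustrative assumption, not the patent's stated method) is frame differencing: flag motion when enough pixels change between consecutive frames. Plain nested lists stand in for grayscale frames here, and the thresholds are arbitrary:

```python
def motion_detected(prev_frame, frame, threshold=30, min_pixels=50):
    """Flag a frame as containing hand motion when enough pixels changed
    brightness since the previous frame.

    prev_frame, frame -- equally sized grids (lists of lists) of
                         grayscale values in 0-255
    threshold         -- per-pixel brightness change that counts as "changed"
    min_pixels        -- how many changed pixels count as motion
    """
    changed = sum(
        1
        for row_a, row_b in zip(prev_frame, frame)
        for a, b in zip(row_a, row_b)
        if abs(a - b) > threshold
    )
    return changed >= min_pixels
```

In a real pipeline this coarse detector would only gate the more expensive step of locating the hand 3 in the frame; a static scene produces no differences and is skipped.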
- At other times, the image collection unit 5 can also be turned on to collect images of the user's other gestures or to determine the user's position.
- S03. The positioning unit determines the position (distance and/or angle) of the user relative to the display unit.
- When the user is at a different position relative to the display unit 1 and the image collection unit 5, the collected images are different. For this reason, it is preferable to determine the relative positional relationship between the user and the display unit 1 in advance, so that recognition in the subsequent gesture recognition process is more accurate.
- The positioning unit (not shown) can determine the position of the user relative to the display unit 1 by analyzing the images collected by the image collection unit 5. For example, when the virtual 3D control screen 4 is displayed, the first image collected by the image collection unit 5 can be used to determine the position of the user relative to the display unit 1, and subsequently collected images are used for gesture recognition.
- There are various methods of judging the position of the user relative to the display unit 1 from the collected images: for example, contour analysis can extract the outline of the user or of the 3D glasses 2 and thereby determine the user's position, or a marker can be provided on the 3D glasses 2 and the user's position determined by tracking the marker.
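If the marker on the 3D glasses has a known physical size, its apparent size in the image also yields the user's distance under a pinhole-camera model. This is a sketch under that assumption (the patent only says the marker is tracked); `focal_px` denotes the camera's focal length expressed in pixels, a value obtained from camera calibration:

```python
def marker_distance(focal_px, marker_width_m, marker_width_px):
    """Pinhole-camera distance estimate: a marker of known physical
    width appears smaller in the image the farther away it is.

    focal_px        -- camera focal length in pixels (from calibration)
    marker_width_m  -- real width of the marker on the glasses (m)
    marker_width_px -- measured width of the marker in the image (px)
    """
    return focal_px * marker_width_m / marker_width_px

# Illustrative numbers: a 0.14 m wide pair of glasses imaged at 70 px
# by a camera with a 1000 px focal length.
print(marker_distance(1000.0, 0.14, 70.0))
```

The marker's horizontal offset from the image center, combined with the same focal length, would likewise give the viewing angle, covering both quantities ("distance and/or angle") that the positioning step needs.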
- Alternatively, infrared range finders can be set at two different positions, and the user's position calculated from the two distances they measure.
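Computing the user's position from two range readings is a circle-intersection problem. A sketch, assuming the two range finders sit a known baseline apart along the display and the user is in front of it (positive y); coordinates are an illustrative choice, not specified in the text:

```python
import math

def locate_user(baseline, r1, r2):
    """Intersect the two range circles of finders at (0, 0) and
    (baseline, 0) reporting distances r1 and r2 to the user; return the
    solution in front of the display (y > 0), or None if the readings
    are inconsistent."""
    # x follows from subtracting the two circle equations.
    x = (r1 ** 2 - r2 ** 2 + baseline ** 2) / (2 * baseline)
    y_sq = r1 ** 2 - x ** 2
    if y_sq < 0:
        return None  # no intersection: noisy or inconsistent readings
    return (x, math.sqrt(y_sq))
```

With the finders 2 m apart and both reporting √10 m, the user sits at (1, 3): centered between the finders and 3 m out.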
- Alternatively, the user position may simply be taken to be a default position.
- S04. The gesture recognition unit determines, according to the images collected by the image collection unit (and, where available, the position of the user relative to the display unit), the position the user clicked on the virtual 3D control screen, and sends the control instruction corresponding to that click position to the corresponding execution unit.
- Specifically, since the position of the user relative to the display unit 1 is known, the gesture recognition unit (not shown) can determine the spatial position of the virtual 3D control screen 4 relative to the display unit 1 (the virtual 3D control screen 4 necessarily lies on the line connecting the display unit 1 and the user). When the user reaches out a hand 3 to click the virtual 3D control screen 4, the gesture recognition unit analyzes the collected images; because the position of the image collection unit 5 relative to the display unit 1 is also known, the position of the hand 3 can be determined, and from it the position on the virtual 3D control screen 4 that the click corresponds to, that is, the control instruction corresponding to the user's gesture. The gesture recognition unit can then send that control instruction to the corresponding execution unit, which executes it to implement the control.
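The geometric step described above — recovering the clicked position from the positions of the user's eyes, the hand 3, and the virtual 3D control screen 4 lying on the eye-display line — can be sketched as a ray-plane intersection. The coordinate convention and the single-viewpoint simplification are illustrative assumptions, not the patent's stated method:

```python
def click_position(eye, hand, screen_dist, first_dist):
    """Project the user's fingertip onto the virtual 3D control screen.

    Convention: the display unit lies in the plane z = 0, the user's
    eyes are at z = screen_dist, and the virtual screen occupies the
    plane z = screen_dist - first_dist (i.e., first_dist in front of
    the eyes). eye and hand are (x, y, z) points; returns the (x, y)
    hit point on the virtual screen, or None for a degenerate ray.
    """
    plane_z = screen_dist - first_dist
    dz = hand[2] - eye[2]
    if dz == 0:
        return None  # ray parallel to the screen plane
    t = (plane_z - eye[2]) / dz
    if t <= 0:
        return None  # hand is behind the eyes; not a click
    return (eye[0] + t * (hand[0] - eye[0]),
            eye[1] + t * (hand[1] - eye[1]))
```

When the fingertip rests exactly in the virtual screen plane (t = 1), the hit point is simply the fingertip's own (x, y) — which is why placing the screen within arm's reach makes different users' clicks on the same spot nearly identical.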
- The execution unit refers to any unit that can execute the corresponding control instruction; depending on the instruction, it may be, for example, the display unit or the sounding unit.
- Of course, the user position may also be determined according to a default position, or the click may be judged by determining the relative positional relationship between the user's hand and body.
- The embodiment further provides a display device controllable by the above method, comprising: a display unit 1 for performing display; a 3D unit, including 3D glasses 2, for converting the control screen displayed by the display unit 1 into a virtual 3D control screen 4 presented to the user, the virtual distance between the virtual 3D control screen 4 and the user's eyes being equal to a first distance that is smaller than the distance between the display unit 1 and the user's eyes; an image collection unit 5 for collecting images of the user's click actions on the virtual 3D control screen 4; and a gesture recognition unit for determining, from the images captured by the image collection unit 5, the position the user clicked on the virtual 3D control screen 4 and sending the control instruction corresponding to the click position to the corresponding execution unit.
- the display unit 1 is a television display or a computer display.
- the 3D unit further includes a 3D polarizing film disposed outside the display surface of the display unit 1.
- the display device further comprises: a positioning unit for determining the position of the user relative to the display unit 1.
- the positioning unit is configured to analyze the image collected by the image collection unit 5 to determine the position of the user relative to the display unit 1.
- Example 2
- The embodiment provides a gesture recognition method, comprising: a display unit displaying a control screen, and a 3D unit converting the control screen into a virtual 3D control screen presented to the user, wherein the 3D unit includes 3D glasses, the virtual distance between the virtual 3D control screen and the user's eyes equals a first distance, and the first distance is smaller than the distance between the display unit and the user's eyes; an image collection unit collecting images of the user's click actions on the virtual 3D control screen; and a gesture recognition unit determining, from the collected images, the position the user clicked on the virtual 3D control screen and sending the control instruction corresponding to the click position to the corresponding execution unit.
- The above gesture recognition method is not limited to controlling the display device; it can also be used to control other devices, as long as the gesture recognition unit transmits (e.g., wirelessly) the control instruction to the corresponding device.
- A dedicated gesture recognition system can thus be used to control a wide range of devices such as televisions, computers, air conditioners, and washing machines.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/421,044 US20160048212A1 (en) | 2013-10-31 | 2014-05-21 | Display Device and Control Method Thereof, and Gesture Recognition Method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201310530739.9A CN103530060B (zh) | 2013-10-31 | 2013-10-31 | Display device and control method thereof, and gesture recognition method |
CN201310530739.9 | 2013-10-31 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015062248A1 true WO2015062248A1 (zh) | 2015-05-07 |
Family
ID=49932115
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2014/078016 WO2015062248A1 (zh) | 2013-10-31 | 2014-05-21 | 显示装置及其控制方法、和手势识别方法 |
Country Status (3)
Country | Link |
---|---|
US (1) | US20160048212A1 (zh) |
CN (1) | CN103530060B (zh) |
WO (1) | WO2015062248A1 (zh) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103530061B (zh) * | 2013-10-31 | 2017-01-18 | BOE Technology Group Co., Ltd. | Display device and control method |
CN103530060B (zh) * | 2013-10-31 | 2016-06-22 | BOE Technology Group Co., Ltd. | Display device and control method thereof, and gesture recognition method |
CN105334718B (zh) * | 2014-06-27 | 2018-06-01 | Lenovo (Beijing) Co., Ltd. | Display switching method and electronic device |
US9727296B2 (en) | 2014-06-27 | 2017-08-08 | Lenovo (Beijing) Co., Ltd. | Display switching method, information processing method and electronic device |
CN106502376A (zh) * | 2015-09-08 | 2017-03-15 | Tianjin Samsung Electronics Co., Ltd. | 3D touch operation method, electronic device, and 3D glasses |
CN108369451B (zh) * | 2015-12-18 | 2021-10-29 | Sony Corporation | Information processing apparatus, information processing method, and computer-readable storage medium |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102253713A (zh) * | 2011-06-23 | 2011-11-23 | Konka Group Co., Ltd. | Display system for three-dimensional stereoscopic images |
WO2012064803A1 (en) * | 2010-11-12 | 2012-05-18 | At&T Intellectual Property I, L.P. | Electronic device control based on gestures |
CN102508546A (zh) * | 2011-10-31 | 2012-06-20 | TPV Display Technology (Xiamen) Co., Ltd. | 3D virtual projection and virtual touch user interaction interface and implementation method |
CN103067727A (zh) * | 2013-01-17 | 2013-04-24 | Qianxing Xunke (Beijing) Technology Co., Ltd. | 3D glasses and 3D display system |
CN103246351A (zh) * | 2013-05-23 | 2013-08-14 | Liu Guangsong | User interaction system and method |
CN103530060A (zh) * | 2013-10-31 | 2014-01-22 | BOE Technology Group Co., Ltd. | Display device and control method thereof, and gesture recognition method |
CN103530061A (zh) * | 2013-10-31 | 2014-01-22 | BOE Technology Group Co., Ltd. | Display device and control method, gesture recognition method, and head-mounted display device |
CN103529947A (zh) * | 2013-10-31 | 2014-01-22 | BOE Technology Group Co., Ltd. | Display device and control method thereof, and gesture recognition method |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080266323A1 (en) * | 2007-04-25 | 2008-10-30 | Board Of Trustees Of Michigan State University | Augmented reality user interaction system |
US20110012896A1 (en) * | 2009-06-22 | 2011-01-20 | Ji Maengsob | Image display apparatus, 3d glasses, and method for operating the image display apparatus |
US9407908B2 (en) * | 2009-08-20 | 2016-08-02 | Lg Electronics Inc. | Image display apparatus and method for operating the same |
JP5525213B2 (ja) * | 2009-08-28 | 2014-06-18 | FUJIFILM Corporation | Polarizing film, laminate, and liquid crystal display device |
KR101647722B1 (ko) * | 2009-11-13 | 2016-08-23 | LG Electronics Inc. | Image display apparatus and method for operating the same |
CN102457735B (zh) * | 2010-10-28 | 2014-10-01 | Shenzhen TCL New Technology Co., Ltd. | Implementation method for compatible 3D shutter glasses |
CN102591446A (zh) * | 2011-01-10 | 2012-07-18 | Haier Group Corporation | Gesture control display system and control method thereof |
CN102681651B (zh) * | 2011-03-07 | 2016-03-23 | Liu Guangsong | User interaction system and method |
KR101252169B1 (ko) * | 2011-05-27 | 2013-04-05 | LG Electronics Inc. | Mobile terminal and operation control method thereof |
US20150153572A1 (en) * | 2011-10-05 | 2015-06-04 | Google Inc. | Adjustment of Location of Superimposed Image |
CN102375542B (zh) * | 2011-10-27 | 2015-02-11 | TCL Corporation | Method for remotely controlling a television with body movements and television remote control device |
CN102789313B (zh) * | 2012-03-19 | 2015-05-13 | Suzhou Chuda Information Technology Co., Ltd. | User interaction system and method |
CN102769802A (zh) * | 2012-06-11 | 2012-11-07 | Xi'an Jiaotong University | Human-computer interaction system of a smart television and interaction method thereof |
US9378592B2 (en) * | 2012-09-14 | 2016-06-28 | Lg Electronics Inc. | Apparatus and method of providing user interface on head mounted display and head mounted display thereof |
CN103442244A (zh) * | 2013-08-30 | 2013-12-11 | Beijing BOE Optoelectronics Technology Co., Ltd. | 3D glasses, 3D display system and 3D display method |
- 2013-10-31: CN application CN201310530739.9A granted as patent CN103530060B (active)
- 2014-05-21: PCT application PCT/CN2014/078016 published as WO2015062248A1 (application filing)
- 2014-05-21: US application US14/421,044 published as US20160048212A1 (abandoned)
Also Published As
Publication number | Publication date |
---|---|
CN103530060A (zh) | 2014-01-22 |
US20160048212A1 (en) | 2016-02-18 |
CN103530060B (zh) | 2016-06-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2015062247A1 (zh) | Display device and control method thereof, gesture recognition method, and head-mounted display device | |
WO2015062248A1 (zh) | Display device and control method thereof, and gesture recognition method | |
US9250746B2 (en) | Position capture input apparatus, system, and method therefor | |
EP3293620B1 (en) | Multi-screen control method and system for display screen based on eyeball tracing technology | |
US10440319B2 (en) | Display apparatus and controlling method thereof | |
JP4900741B2 (ja) | Image recognition apparatus, operation determination method, and program | |
JP6480434B2 (ja) | System and method for direct pointing detection for interaction with a digital device | |
WO2017113668A1 (zh) | Method and apparatus for controlling a terminal according to eye movement | |
WO2015062251A1 (zh) | Display device and control method thereof, and gesture recognition method | |
WO2015027574A1 (zh) | 3D glasses, 3D display system, and 3D display method | |
US20120056989A1 (en) | Image recognition apparatus, operation determining method and program | |
US20130154913A1 (en) | Systems and methods for a gaze and gesture interface | |
US20160078680A1 (en) | Technologies for adjusting a perspective of a captured image for display | |
JP5114795B2 (ja) | Image recognition apparatus, operation determination method, and program | |
JP2015503162A (ja) | Method and system for responding to a user's selection gesture on an object displayed in three dimensions | |
CN106327583A (zh) | Virtual reality device for realizing panoramic camera shooting and realization method thereof | |
JP2012216953A (ja) | Information processing apparatus, display control method, and display control apparatus | |
WO2017206383A1 (zh) | Terminal control method, control device, and terminal | |
WO2019028855A1 (zh) | Virtual display device, intelligent interaction method, and cloud server | |
JP2012238293A (ja) | Input device | |
WO2023184816A1 (zh) | Cloud desktop display method, apparatus, and device, and storage medium | |
TW202018486A (zh) | Multi-screen operation method and electronic system using the same | |
CN103713387A (zh) | Electronic device and collection method | |
JP2016126687A (ja) | Head-mounted display, operation acceptance method, and operation acceptance program | |
JP2017033195A (ja) | Transmissive wearable terminal, data processing apparatus, and data processing system | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 14421044 Country of ref document: US |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14858372 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC , EPO FORM 1205A DATED 26-09-16 |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 14858372 Country of ref document: EP Kind code of ref document: A1 |