CN103703763A - System and method of visual layering - Google Patents

System and method of visual layering

Info

Publication number
CN103703763A
CN103703763A CN201180072678.5A
Authority
CN
China
Prior art keywords
layer
digital information
physical object
first layer
working space
Prior art date
Legal status
Granted
Application number
CN201180072678.5A
Other languages
Chinese (zh)
Other versions
CN103703763B (en)
Inventor
O. K. Sievert
Current Assignee
Hewlett Packard Development Co LP
Original Assignee
Hewlett Packard Development Co LP
Priority date
Filing date
Publication date
Application filed by Hewlett Packard Development Co LP filed Critical Hewlett Packard Development Co LP
Publication of CN103703763A publication Critical patent/CN103703763A/en
Application granted granted Critical
Publication of CN103703763B publication Critical patent/CN103703763B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/003Reconstruction from projections, e.g. tomography
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/74Projection arrangements for image reproduction, e.g. using eidophor
    • H04N5/7408Direct viewing projectors, e.g. an image displayed on a video CRT or LCD display being projected on a screen
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B17/00Details of cameras or camera bodies; Accessories therefor
    • G03B17/48Details of cameras or camera bodies; Accessories therefor adapted for combination with other photographic or optical apparatus
    • G03B17/54Details of cameras or camera bodies; Accessories therefor adapted for combination with other photographic or optical apparatus with projector
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00Projectors or projection-type viewers; Accessories therefor
    • G03B21/14Details
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00Projectors or projection-type viewers; Accessories therefor
    • G03B21/14Details
    • G03B21/20Lamp housings
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/74Projection arrangements for image reproduction, e.g. using eidophor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/74Projection arrangements for image reproduction, e.g. using eidophor
    • H04N5/7416Projection arrangements for image reproduction, e.g. using eidophor involving the use of a spatial light modulator, e.g. a light valve, controlled by a video signal

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)

Abstract

A camera identifies a physical object positioned in a workspace. A display displays first digital information into the workspace. A layering module treats the physical object as a first layer in the workspace and treats the first digital information as a second layer in the workspace. A controller controls the visual adjacency of the first and second layers via display of the first digital information.

Description

System and method of visual layering
Background
Sharing digital information, and collaborating based on that digital information, is becoming increasingly common. Input devices capture digital information (e.g., user input on a computing device, digital cameras, scanning devices, etc.). Output devices output digital information for use by a user or group of users. Output devices may include digital displays or digital projectors that display digital information onto a display screen or into a workspace.
Brief description of the drawings
The following description includes discussion of figures having illustrations given by way of example of implementations of embodiments of the invention. The drawings should be understood by way of example, not by way of limitation. As used herein, references to one or more "embodiments" are to be understood as describing a particular feature, structure, or characteristic included in at least one implementation of the invention. Thus, phrases such as "in one embodiment" or "in an alternate embodiment" appearing herein describe various embodiments and implementations of the invention, and do not necessarily all refer to the same embodiment. However, they are also not necessarily mutually exclusive.
Fig. 1 is a block diagram illustrating a system according to various embodiments.
Fig. 2 is a perspective, external view illustrating a system according to various embodiments.
Fig. 3 is a perspective view of logical layers in a system according to various embodiments.
Fig. 4 is a block diagram of layer states in a system according to various embodiments.
Fig. 5 is a flow chart of operation in a system according to various embodiments.
Fig. 6 is a flow chart of operation in a system according to various embodiments.
Detailed description
Embodiments described herein relate to projection capture systems. The combination of projection and capture in the same workspace, along with user input control, facilitates mixed-reality (e.g., physical and virtual) collaboration. For example, a virtual object might be a file on a user's computer (e.g., an electronic presentation slide, an electronic document, a digital photo, etc.). A physical object might be a two-dimensional object (e.g., a photo, a document, etc.) or a three-dimensional object (e.g., a model, a widget, etc.). As described herein, physical and virtual objects are treated as visually interchangeable "layers." These layers represent logical layers and allow the systems and devices described herein to control the workspace environment such that one layer (e.g., a physical object or a set of digital information) has the appearance of being on top of another layer. In a collaborative environment where users at different locations cooperate via separate projection capture systems, the camera in each system sends a live (e.g., video) feed of each user's workspace to the other locations for projection, so both local and remote objects (physical and virtual) can be seen and interacted with at each location.
Fig. 1 is a block diagram illustrating a system according to various embodiments. Fig. 1 includes particular components, modules, etc. according to various embodiments. However, in different embodiments, more, fewer, and/or other components, modules, arrangements of components/modules, etc. may be used according to the teachings described herein. In addition, the various components, modules, etc. described herein may be implemented as one or more software modules, hardware modules, special-purpose hardware (e.g., application-specific hardware, application-specific integrated circuits (ASICs), embedded controllers, hardwired circuitry, etc.), or some combination of these. Various modules and/or components illustrated in Fig. 1 may be implemented as a computer-readable storage medium containing instructions that are stored in a memory and executed by a processor to perform the operations and functions described herein.
System 100 includes a camera 110, a display 120, a layering module 130, and a controller 140. Camera 110, display 120, and layering module 130 are operatively connected to controller 140 to facilitate changing the visual adjacency of layers in a workspace. In one example, display 120 displays a map onto a working surface. Display 120 could be a display screen placed on the working surface, or it could be a digital projector that displays information onto the working surface via digital projection. Camera 110 detects a physical object (e.g., a model of a house or other building) placed on the working surface. Camera 110 may be a visible-light camera (e.g., a digital still-image camera, a digital video camera, etc.) or an infrared (IR) camera.
Layering module 130 associates the physical object with one visual layer and the projected map with another visual layer. By maintaining state information about each layer, controller 140 is able to control the visual adjacency of the layers. In other words, one layer may initially appear to be "on top of" the other, but controller 140 can change the visual adjacency so that the "bottom" layer visually becomes the "top" layer.
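By way of illustration only (the patent itself does not specify an implementation), the layer bookkeeping just described can be sketched in a few lines of Python. All names here (Layer, LayeringModule, associate, swap) are invented for the example:

```python
# Illustrative only: layer-state bookkeeping, with names invented for the example.
from dataclasses import dataclass, field

@dataclass
class Layer:
    name: str   # e.g., "projected map" or "house model"
    kind: str   # "digital" or "physical"

@dataclass
class LayeringModule:
    # Bottom-to-top order; index 0 is visually adjacent to the work surface (layer L0).
    stack: list = field(default_factory=list)

    def associate(self, layer: Layer) -> None:
        """Associate a newly detected object or projection with the topmost layer."""
        self.stack.append(layer)

    def swap(self, a: str, b: str) -> None:
        """Change visual adjacency by exchanging two layers' positions in the order."""
        idx = {layer.name: i for i, layer in enumerate(self.stack)}
        i, j = idx[a], idx[b]
        self.stack[i], self.stack[j] = self.stack[j], self.stack[i]

# Usage: a projected map initially below a physical model, then brought "on top".
module = LayeringModule()
module.associate(Layer("projected map", "digital"))   # becomes L1
module.associate(Layer("house model", "physical"))    # becomes L2
module.swap("projected map", "house model")           # map now appears on top
```

Keeping the layers as a bottom-to-top list makes visual adjacency a simple matter of list position, which is the kind of state information a controller such as controller 140 would consult when deciding what to project.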
Figs. 2-3 are perspective, external views illustrating a system according to various embodiments. Figs. 2-3 include particular components, modules, etc. according to various embodiments. However, in different embodiments, more, fewer, and/or other components, modules, arrangements of components/modules, etc. may be used according to the teachings described herein. In addition, the various components, modules, etc. described herein may be implemented as one or more software modules, hardware modules, special-purpose hardware (e.g., application-specific hardware, application-specific integrated circuits (ASICs), embedded controllers, hardwired circuitry, etc.), or some combination of these.
System 200 includes a top 210 and a base 220. Base 220 includes an infrared (IR) camera 224 and a projector 222, and houses other components (e.g., processor, memory, layering module, controller, etc.). Projector 222 may be any suitable light projector, including but not limited to LED (light-emitting diode) and DLP (digital light processing) projectors. In various embodiments, projector 222 projects digital information toward top 210, where a mirror reflects the projection onto workspace 230. IR camera 224 detects the position of physical objects placed on workspace 230.
Projector 222 projects digital information (e.g., lines) as projection 250 onto workspace 230. Object 260 is a physical object placed in workspace 230. IR camera 224 detects the position of object 260. A layering module (e.g., layering module 130) associates projection 250 with one logical layer and object 260 with another logical layer. As shown in Fig. 2, object 260 appears to be "on top of" projection 250. In other words, object 260 appears as the foreground layer and projection 250 appears as the background layer.
Fig. 3 illustrates the logical layers described above. Layer 310 represents workspace 230. Layer 320 represents projection 250 and, in this example, is positioned "on top of" layer 310. Layer 330 represents object 260 and, in this example, is positioned "on top of" layer 320. Once the object and the digitally projected information have been associated with logical layers, the layering module maintains state information about the visual order of the layers.
System 200 also includes a user input device 240 that allows a user to interact with system 200. In various embodiments, user input (UI) device 240 includes an infrared digital stylus and/or an infrared camera to detect the position of UI device 240 in workspace 230. Although any suitable UI device may be used, a digital stylus has the advantage of allowing input in three dimensions (including along the surface of workspace 230) without a tablet or other special surface. Thus, system 200 can be used on a variety of working surfaces.
User input from UI device 240 or another input mechanism may indicate a request to change the visual adjacency of the layers in workspace 230. For example, touching object 260 with UI device 240 might indicate a request to make object 260 the foreground layer and projection 250 the background layer. Touching the surface of workspace 230 might indicate a request to make projection 250 the foreground layer and object 260 the background layer. Using this input and the state information from the layering module, system 200 changes the projection from projector 222 such that the visual order of the layers is changed (e.g., the foreground and background layers are swapped).
Fig. 4 is a block diagram illustrating state changes of the layers (real and virtual) in a projection capture system. For illustrative purposes, the surface on which objects are placed, and onto which digital information is projected (i.e., the workspace), is considered to be layer L0. Layer L1 is therefore visually adjacent to layer L0 (e.g., "on top of" it), and layer L2 is visually adjacent to layer L1 (e.g., "on top of" it). As shown, state 410 includes a physical object (e.g., a map, a document, another two- or three-dimensional object, etc.) associated with layer L1. In other words, the physical object is visually adjacent to L0. A projection of digital information (e.g., an image, a document, etc.) is initially associated with layer L2 and is visually adjacent to layer L1. Thus, in state 410, the projection of digital information appears to the system user to be "on top of" the physical object.
In response to user input or another control signal for changing the visual adjacency of the layers, the system's layering module changes the layer associations from those of state 410 to those of state 420. In state 420, the physical object becomes layer L2 and the projection of digital information becomes layer L1. To produce the visual appearance of the physical object being "on top of" the projection, the system may simply cease projecting the digital information into the workspace.
State 412 includes a projection of digital information (e.g., a map, a document, an image, etc.) initially associated with layer L1. In other words, the projection is visually adjacent to L0. A physical object (e.g., two- or three-dimensional) is initially associated with layer L2 and is visually adjacent to layer L1. To achieve the visual appearance of the physical object being "on top of" the projection, the system may remove from the projection the digital information (e.g., map or image data, etc.) corresponding to the coordinates of the physical object. For example, at the position of the physical object, the system may project blank space (e.g., white light or light of another suitable color) while still projecting the original digital information (e.g., map or image data, etc.) everywhere else in the workspace. Alternatively, the system may capture a digital image of the physical object (e.g., from directly above it) and project that image at the position of the physical object (instead of projecting blank space). In either example, the effect is the appearance of the physical object being positioned "on top of" the projected digital information.
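As a rough sketch of the masking technique of state 412, assuming a frame buffer for the projector and a detected bounding box for the object (the array shapes, the bbox format, and the function name are all invented for the example, not taken from the patent):

```python
# Illustrative only: blanking (or patching) the projected frame under an object.
import numpy as np

def mask_projection(frame, bbox, patch=None):
    """Return a copy of `frame` with the object's region blanked or replaced.

    frame: H x W x 3 uint8 array holding the image to be projected.
    bbox:  (x0, y0, x1, y1) of the physical object in projector coordinates.
    patch: optional camera capture of the object, already warped to bbox size;
           if None, white light is projected there instead.
    """
    out = frame.copy()
    x0, y0, x1, y1 = bbox
    out[y0:y1, x0:x1] = 255 if patch is None else patch
    return out

# Example: suppress an 80 x 100 pixel region where the IR camera located the object.
frame = np.full((768, 1024, 3), 128, dtype=np.uint8)   # stand-in for the map image
masked = mask_projection(frame, bbox=(200, 300, 280, 400))
```

Passing a camera capture of the object as `patch` corresponds to the alternative described above, in which the object's own image, rather than blank space, is projected at its position.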
In response to user input or another control signal for changing the visual adjacency of the layers, the system's layering module changes the layer associations from those of state 412 to those of state 422. In state 422, the physical object becomes layer L1 and the projection of digital information becomes layer L2. To produce the visual appearance of the projected digital information being "on top of" the physical object, the system may simply project all of the digital information into the workspace, including at the position of the physical object.
It should be noted that multiple projection capture systems may be used for virtual collaboration across remote locations. The states depicted in Fig. 4 illustrate an example of such collaboration. A user at a first location may have a physical object and a projection according to state 410, while a user at a second location has a physical object and a projection according to state 412. The system represented by state 410 captures a digital image of its physical object and sends it to the collaborating system, where the digital image is projected according to state 412. Likewise, the physical object at the second location (state 412) is digitally captured and sent to the other system, where it appears as the projection in state 410. In this way, system users at different locations can collaborate using both real and virtual objects.
Fig. 5 is a flow chart of operation in a system according to various embodiments. Fig. 5 includes particular operations and an execution order according to certain embodiments. However, in different embodiments, other operations, omission of one or more of the depicted operations, and/or other execution orders may also be used according to the teachings described herein.
The system identifies 510 a physical object in a workspace. The physical object may be two- or three-dimensional. The object may be identified by a camera (e.g., an infrared camera, a digital image-capture camera, a digital video camera, etc.). In various embodiments, identifying the object includes determining its position (e.g., coordinates) in the workspace. A layering module associates 520 the physical object with a first layer. The system also displays (e.g., projects) 530 digital information (e.g., an image) into the workspace as a second layer. Again, the layering module maintains the association between the projected information and the second layer.
The system (perhaps in response to user input) changes 540 the visual adjacency between the first layer and the second layer. For example, the first layer may initially be visually adjacent to the surface of the workspace (e.g., appearing to be "on top of" it), and the second layer may initially be visually adjacent to the first layer (e.g., appearing to be "on top of" it). In response to a user request or other control signal, the system switches the visual order of the layers: the second layer becomes visually adjacent to the surface of the workspace, and the first layer becomes visually adjacent to the second layer.
Fig. 6 is a flow chart of operation in a system according to various embodiments. Fig. 6 includes particular operations and an execution order according to certain embodiments. However, in different embodiments, other operations, omission of one or more of the depicted operations, and/or other execution orders may also be used according to the teachings described herein.
A projection capture system identifies 610 a physical object in a workspace. Again, the physical object may be two- or three-dimensional, and the object may be identified by a camera (e.g., an infrared camera, a digital image-capture camera, a digital video camera, etc.). The system's layering module associates 620 the physical object with a first layer. The system also displays (e.g., projects) 630 digital information (e.g., an image) into the workspace as a second layer. Again, the layering module maintains the association between the projected information and the second layer.
The system detects 640 a change in the position of the physical object. For example, the system may include an infrared camera or a visible-light camera (e.g., a digital still-image camera or a digital video camera) for detecting the position of the physical object. In response to detecting the change in position, the system maintains 650 the visual adjacency between the first layer and the second layer. For example, if the workspace is logically divided into four quadrants, the physical object might initially be detected as occupying the first quadrant. If the physical object (associated with one layer) is visually positioned on top of the projected digital information (associated with a different layer), the system may refrain from projecting the digital information that would otherwise fall in the first quadrant, consistent with the visual order of the layers. In response to detecting motion of the physical object from the first quadrant to the second quadrant, however, the system may resume projecting the digital information associated with the first quadrant and remove from the projection the digital information associated with the second quadrant, thereby maintaining the appearance of the physical object being positioned on top of the projected digital information.
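The quadrant example amounts to moving a projection-suppression mask as the object moves. A minimal sketch, assuming a simple four-quadrant grid and invented names (quadrant, on_object_moved, suppressed):

```python
# Illustrative only: move a projection-suppression mask with the object.
def quadrant(pos, width, height):
    """Map an (x, y) position to workspace quadrant 0-3."""
    x, y = pos
    return (1 if x >= width / 2 else 0) + (2 if y >= height / 2 else 0)

def on_object_moved(old_pos, new_pos, width, height, suppressed):
    """Keep the object-on-top appearance as the object moves between quadrants."""
    old_q = quadrant(old_pos, width, height)
    new_q = quadrant(new_pos, width, height)
    if old_q != new_q:
        suppressed.discard(old_q)  # resume projecting digital information there
        suppressed.add(new_q)      # stop projecting where the object now sits
    return suppressed

# Example: the object moves from the left half into the right half.
suppressed = {quadrant((100, 100), 1024, 768)}
suppressed = on_object_moved((100, 100), (700, 100), 1024, 768, suppressed)
print(suppressed)   # {1}: only the newly occupied quadrant is suppressed
```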
In the examples described herein, the methods and systems manage the visual adjacency of a first layer and a second layer. The methods and systems described herein readily extend to more than two layers. For example, additional layers of projected digital information may be managed by the projection capture system, and additional physical objects may each be associated with different layers. The visual adjacency among these multiple physical and projected layers can be managed according to the embodiments described herein.
Various modifications may be made to the disclosed embodiments and implementations of the invention without departing from their scope. Accordingly, the illustrations and examples herein should be construed in an illustrative, and not a restrictive, sense.

Claims (15)

1. A system, comprising:
a camera to identify a physical object positioned in a workspace;
a display to display first digital information into the workspace;
a layering module to treat the physical object as a first layer in the workspace and to treat the first digital information as a second layer in the workspace; and
a controller to control the visual adjacency of the first and second layers via display of the first digital information.
2. The system of claim 1, wherein the camera is at least one of an infrared camera or a visible-light camera.
3. The system of claim 1, further comprising:
the controller to change the display of the first digital information, in response to a change in position of the physical object relative to the first digital information, to maintain the existing visual adjacency between the first and second layers.
4. The system of claim 1, further comprising:
a user input module to receive user input; and
the controller to change the display of the first digital information to alter the visual adjacency of the first and second layers.
5. The system of claim 1, further comprising:
the display to display second digital information into the workspace;
the layering module to treat the second digital information as a third layer in the workspace; and
the controller to control the display of the first and second digital information to alter the visual adjacency among the first, second, and third layers.
6. The system of claim 1, wherein the display is a digital projector.
7. A computer-readable storage medium containing instructions that, when executed, cause a computer to:
identify a physical object positioned in a workspace;
associate the physical object with a first layer;
display digital information into the workspace as a second layer; and
alter the visual adjacency between the first and second layers.
8. The computer-readable storage medium of claim 7, comprising further instructions that cause the computer to:
maintain the existing visual adjacency between the first and second layers in response to a change in position of the physical object relative to the displayed digital information.
9. The computer-readable storage medium of claim 7, comprising further instructions that cause the computer to:
display additional digital information into the workspace as a third layer; and
alter the visual adjacency among the first, second, and third layers.
10. A method for a projector-camera system, comprising:
identifying a physical object positioned in a workspace;
associating the physical object with a first layer;
projecting digital information into the workspace as a second layer; and
altering the visual adjacency between the first and second layers.
11. The method of claim 10, wherein the physical object is identified via infrared sensing.
12. The method of claim 10, wherein the digital information comprises a digital image feed or a digital video feed.
13. The method of claim 10, further comprising: maintaining the existing visual adjacency between the first and second layers in response to a change in position of the physical object relative to the projected digital information.
14. The method of claim 10, wherein altering the visual adjacency comprises:
receiving user input; and
altering the visual adjacency of the first and second layers in response to the user input.
15. The method of claim 10, further comprising:
projecting additional digital information into the workspace as a third layer; and
altering the visual adjacency among the first, second, and third layers.
CN201180072678.5A 2011-07-29 2011-07-29 System and method of visual layering Expired - Fee Related CN103703763B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2011/045983 WO2013019190A1 (en) 2011-07-29 2011-07-29 System and method of visual layering

Publications (2)

Publication Number Publication Date
CN103703763A (en) 2014-04-02
CN103703763B CN103703763B (en) 2018-02-27

Family

ID=47629542

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201180072678.5A Expired - Fee Related CN103703763B (en) 2011-07-29 2011-07-29 Vision layered system and method

Country Status (7)

Country Link
US (1) US10229538B2 (en)
EP (1) EP2737693B1 (en)
JP (1) JP6126594B2 (en)
KR (1) KR101773988B1 (en)
CN (1) CN103703763B (en)
BR (1) BR112014002234B1 (en)
WO (1) WO2013019190A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3036602A4 (en) 2013-08-22 2017-04-12 Hewlett-Packard Development Company, L.P. Projective computing system
JP6304618B2 (en) * 2013-11-05 2018-04-04 パナソニックIpマネジメント株式会社 Lighting device
EP3100136A4 (en) * 2014-01-31 2018-04-04 Hewlett-Packard Development Company, L.P. Touch sensitive mat of a system with a projector unit
JP6372266B2 (en) * 2014-09-09 2018-08-15 ソニー株式会社 Projection type display device and function control method
US10444894B2 (en) 2014-09-12 2019-10-15 Hewlett-Packard Development Company, L.P. Developing contextual information from an image

Family Cites Families (60)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US695460A (en) 1901-03-13 1902-03-18 Horace Craighead Hoop for barrels, & c.
US4986651A (en) 1989-08-04 1991-01-22 Minnesota Mining And Manufacturing Company Overhead projector with centerless Fresnel lens reflective stage
EP0622722B1 (en) * 1993-04-30 2002-07-17 Xerox Corporation Interactive copying system
GB9614837D0 (en) * 1996-07-12 1996-09-04 Rank Xerox Ltd Interactive desktop system with multiple image capture and display modes
JPH10222436A (en) * 1997-02-12 1998-08-21 Meidensha Corp Transfer method for program and data
US6965460B1 (en) 2000-08-08 2005-11-15 Hewlett-Packard Development Company, L.P. Method and system for scanning an image using a look-down linear array scanner
US6431711B1 (en) 2000-12-06 2002-08-13 International Business Machines Corporation Multiple-surface display projector with interactive input capability
US7259747B2 (en) 2001-06-05 2007-08-21 Reactrix Systems, Inc. Interactive video display system
US7253832B2 (en) * 2001-08-13 2007-08-07 Olympus Corporation Shape extraction system and 3-D (three dimension) information acquisition system using the same
JP2003152851A (en) 2001-11-14 2003-05-23 Nec Corp Portable terminal
US20040095562A1 (en) 2002-11-20 2004-05-20 John Moffatt Combination scanner/projector
JP3927168B2 (en) * 2002-11-25 2007-06-06 日本電信電話株式会社 Real world object recognition method and real world object recognition system
US6840627B2 (en) 2003-01-21 2005-01-11 Hewlett-Packard Development Company, L.P. Interactive display device
US7203384B2 (en) 2003-02-24 2007-04-10 Electronic Scripting Products, Inc. Implement for optically inferring information from a planar jotting surface
JP4401728B2 (en) 2003-09-30 2010-01-20 キヤノン株式会社 Mixed reality space image generation method and mixed reality system
US7110100B2 (en) 2003-11-04 2006-09-19 Electronic Scripting Products, Inc. Apparatus and method for determining an inclination of an elongate object contacting a plane surface
US7268956B2 (en) 2003-11-24 2007-09-11 Electronic Scripting Products, Inc. Solid catadioptric lens with two viewpoints
US7038846B2 (en) 2003-11-24 2006-05-02 Electronic Scripting Products, Inc. Solid catadioptric lens with a single viewpoint
US7088440B2 (en) 2003-12-22 2006-08-08 Electronic Scripting Products, Inc. Method and apparatus for determining absolute position of a tip of an elongate object on a plane surface with invariant features
US9229540B2 (en) 2004-01-30 2016-01-05 Electronic Scripting Products, Inc. Deriving input from six degrees of freedom interfaces
US7961909B2 (en) 2006-03-08 2011-06-14 Electronic Scripting Products, Inc. Computer interface employing a manipulated object with absolute pose detection component and a display
US8542219B2 (en) 2004-01-30 2013-09-24 Electronic Scripting Products, Inc. Processing pose data derived from the pose of an elongate object
US7729515B2 (en) 2006-03-08 2010-06-01 Electronic Scripting Products, Inc. Optical navigation apparatus using fixed beacons and a centroid sensing device
US7826641B2 (en) 2004-01-30 2010-11-02 Electronic Scripting Products, Inc. Apparatus and method for determining an absolute pose of a manipulated object in a real three-dimensional environment with invariant features
US7023536B2 (en) 2004-03-08 2006-04-04 Electronic Scripting Products, Inc. Apparatus and method for determining orientation parameters of an elongate object
US7161664B2 (en) 2004-04-13 2007-01-09 Electronic Scripting Products, Inc. Apparatus and method for optical determination of intermediate distances
US7432917B2 (en) 2004-06-16 2008-10-07 Microsoft Corporation Calibration of an interactive display system
US7113270B2 (en) 2004-06-18 2006-09-26 Electronics Scripting Products, Inc. Determination of an orientation parameter of an elongate object with a scan beam apparatus
US7519223B2 (en) 2004-06-28 2009-04-14 Microsoft Corporation Recognizing gestures and using gestures for interacting with software applications
US7743348B2 (en) * 2004-06-30 2010-06-22 Microsoft Corporation Using physical objects to adjust attributes of an interactive display application
US7557966B2 (en) 2004-08-11 2009-07-07 Acushnet Company Apparatus and method for scanning an object
US20060126128A1 (en) 2004-12-15 2006-06-15 Lexmark International, Inc. Scanning assembly
US7843470B2 (en) 2005-01-31 2010-11-30 Canon Kabushiki Kaisha System, image processing apparatus, and information processing method
CN101208738B (en) 2005-04-11 2011-11-09 波利维森有限公司 Automatic projection calibration
EP1739622B1 (en) * 2005-06-28 2013-08-14 Canon Kabushiki Kaisha Image feature identification with two cameras
CN101213487B (en) 2005-06-30 2010-06-16 株式会社理光 Projection image display device
JP2009500963A (en) 2005-07-06 2009-01-08 メディアポッド リミテッド ライアビリティ カンパニー System and method for capturing visual and non-visual data for multi-dimensional video display
CA2621191C (en) 2005-08-29 2012-12-18 Evryx Technologies, Inc. Interactivity via mobile image recognition
EP1980935A1 (en) * 2006-02-03 2008-10-15 Matsushita Electric Industrial Co., Ltd. Information processing device
JP4777182B2 (en) 2006-08-01 2011-09-21 キヤノン株式会社 Mixed reality presentation apparatus, control method therefor, and program
US7690795B2 (en) 2006-10-06 2010-04-06 Hewlett-Packard Development Company, L.P. Projector/camera system
US9377874B2 (en) 2007-11-02 2016-06-28 Northrop Grumman Systems Corporation Gesture recognition light and video image projector
JP4341723B2 (en) * 2008-02-22 2009-10-07 パナソニック電工株式会社 Light projection device, lighting device
JP5277703B2 (en) 2008-04-21 2013-08-28 株式会社リコー Electronics
CN101261557B (en) 2008-04-30 2011-09-14 北京汇冠新技术股份有限公司 Image sensing apparatus for touch screen
US8355038B2 (en) 2009-01-28 2013-01-15 Hewlett-Packard Development Company, L.P. Systems for capturing images through a display
JP5347673B2 (en) * 2009-04-14 2013-11-20 ソニー株式会社 Information processing apparatus, information processing method, and program
US20100271394A1 (en) 2009-04-22 2010-10-28 Terrence Dashon Howard System and method for merging virtual reality and reality to provide an enhanced sensory experience
JP5395507B2 (en) 2009-05-21 2014-01-22 キヤノン株式会社 Three-dimensional shape measuring apparatus, three-dimensional shape measuring method, and computer program
KR20110003705A (en) * 2009-07-06 2011-01-13 엘지전자 주식회사 Method for displaying information in mobile terminal and mobile terminal using the same
GB2469346B (en) 2009-07-31 2011-08-10 Promethean Ltd Calibration of interactive whiteboard
JP2011081556A (en) * 2009-10-06 2011-04-21 Sony Corp Information processor, method of processing information, program, and server
US8842096B2 (en) 2010-01-08 2014-09-23 Crayola Llc Interactive projection system
JP2011203823A (en) * 2010-03-24 2011-10-13 Sony Corp Image processing device, image processing method and program
US20120054355A1 (en) * 2010-08-31 2012-03-01 Nokia Corporation Method and apparatus for generating a virtual interactive workspace with access based on spatial relationships
KR101795644B1 (en) * 2011-07-29 2017-11-08 휴렛-팩커드 디벨롭먼트 컴퍼니, 엘.피. Projection capture system, programming and method
US8440156B2 (en) 2011-08-05 2013-05-14 Chevron U.S.A. Inc. Reduction of oxides of nitrogen in a gas stream using molecular sieve SSZ-28
US9069382B1 (en) * 2012-01-06 2015-06-30 Google Inc. Using visual layers to aid in initiating a visual search
US8970709B2 (en) 2013-03-13 2015-03-03 Electronic Scripting Products, Inc. Reduced homography for recovery of pose parameters of an optical apparatus producing image data with structural uncertainty
US20150042678A1 (en) * 2013-08-09 2015-02-12 Metaio Gmbh Method for visually augmenting a real object with a computer-generated image

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003131319A (en) * 2001-10-25 2003-05-09 Seiko Epson Corp Optical transmission and reception device
JP2006189712A (en) * 2005-01-07 2006-07-20 Nippon Telegr & Teleph Corp <Ntt> Information presenting apparatus, information presenting method and program
CN101292516A (en) * 2005-07-06 2008-10-22 米迪尔波得股份有限公司 System and method for capturing visual data
CN101810003A (en) * 2007-07-27 2010-08-18 格斯图尔泰克股份有限公司 enhanced camera-based input
WO2010137496A1 (en) * 2009-05-26 2010-12-02 パナソニック電工株式会社 Information presentation device

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105208361A (en) * 2014-06-26 2015-12-30 松下知识产权经营株式会社 Light projection apparatus and illumination apparatus using same
CN113709439A (en) * 2017-04-11 2021-11-26 杜比实验室特许公司 Layered enhanced entertainment experience
CN113709439B (en) * 2017-04-11 2024-05-14 杜比实验室特许公司 Method, device and system for rendering visual object

Also Published As

Publication number Publication date
WO2013019190A1 (en) 2013-02-07
KR101773988B1 (en) 2017-09-01
JP2014527643A (en) 2014-10-16
EP2737693A4 (en) 2015-05-27
KR20140043933A (en) 2014-04-11
JP6126594B2 (en) 2017-05-10
US20140125704A1 (en) 2014-05-08
BR112014002234B1 (en) 2022-01-25
EP2737693A1 (en) 2014-06-04
EP2737693B1 (en) 2020-01-08
BR112014002234A2 (en) 2017-02-21
US10229538B2 (en) 2019-03-12
CN103703763B (en) 2018-02-27

Similar Documents

Publication Publication Date Title
CN103703763A (en) System and method of visual layering
JP6951595B2 (en) Housing data collection and model generation methods
US20190212901A1 (en) Manipulation of content on display surfaces via augmented reality
US9619060B2 (en) Display device and method of operating and manufacturing the display device
US9767612B2 (en) Method, system and apparatus for removing a marker projected in a scene
Chan et al. Enabling beyond-surface interactions for interactive surface with an invisible projection
CN104024936A (en) Projection capture system, programming and method
US10869009B2 (en) Interactive display
US20190102135A1 (en) Scalable interaction with multi-displays
US20140267600A1 (en) Synth packet for interactive view navigation of a scene
Grammenos et al. PaperView: augmenting physical surfaces with location-aware digital information
CN105122297A (en) Panorama packet
US20140104431A1 (en) System and Method for Utilizing a Surface for Remote Collaboration
KR20180066440A (en) Apparatus for learning painting, method thereof and computer recordable medium storing program to perform the method
WO2023230182A1 (en) Three dimensional mapping
US20190212135A1 (en) Methods And Systems For 3D Scanning
KR20080052378A (en) Cad system for building
JP2020144413A (en) Display method and display apparatus
RU2783218C1 (en) Method and system for controlling display of virtual tours in multi-user mode
CN104424869B (en) Control the method, apparatus and system of display multimedia messages
US20230351706A1 (en) Scanning interface systems and methods for building a virtual representation of a location
WO2022185719A1 (en) Information processing device, information processing method, and display device
TW202147173A (en) Superimpose virtual object method based on optical communitation device, electric apparatus, and computer readable storage medium
WO2023158345A1 (en) Method and system for controlling the display of virtual tours in multi-user mode
Hardy Toolkit support for interactive projected displays

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee (granted publication date: 20180227)