WO2005124450A1 - Display Device - Google Patents
Display Device
- Publication number
- WO2005124450A1 (PCT/JP2005/010916)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- screen
- viewer
- real
- display device
- Prior art date
Classifications
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09F—DISPLAYING; ADVERTISING; SIGNS; LABELS OR NAME-PLATES; SEALS
- G09F19/00—Advertising or display means not otherwise provided for
- G09F19/12—Advertising or display means not otherwise provided for using special optical effects
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47F—SPECIAL FURNITURE, FITTINGS, OR ACCESSORIES FOR SHOPS, STOREHOUSES, BARS, RESTAURANTS OR THE LIKE; PAYING COUNTERS
- A47F10/00—Furniture or installations specially adapted to particular types of service systems, not otherwise provided for
- A47F10/06—Furniture or installations specially adapted to particular types of service systems, not otherwise provided for for restaurant service systems
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09F—DISPLAYING; ADVERTISING; SIGNS; LABELS OR NAME-PLATES; SEALS
- G09F19/00—Advertising or display means not otherwise provided for
- G09F19/12—Advertising or display means not otherwise provided for using special optical effects
- G09F19/18—Advertising or display means not otherwise provided for using special optical effects involving the use of optical projection means, e.g. projection of images on clouds
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/74—Projection arrangements for image reproduction, e.g. using eidophor
Definitions
- the present invention relates to a display device.
- Patent Document 1 discloses a display device. This display device performs an exhibition that combines a projected image with exhibits. Patent Document 1 discloses an exhibition example in which an image of a dinosaur is combined with a miniature forest.
- Patent Document 2 discloses a virtual operation system. This virtual operation system can provide, via the Internet, a virtual experience as if one were touching or moving a real object from, for example, a personal computer at home. Patent Document 2 discloses a specific example of remotely controlling a toy robot or the like via the Internet.
- Patent Document 3 discloses a pointing device.
- The pointing device includes an input plane plate arranged in front of the display device, an optical sensor unit that projects and receives scanning light within a plane at a predetermined distance from the input plane plate, a retroreflective member that retroreflects light rays from the light projection unit of the optical sensor unit, imaging means that images a pointer on the input plane plate using the retroreflected light received by the optical sensor unit and converts it into an electric signal, and image processing means that analyzes the resulting imaging signal to calculate the coordinate position of the pointer.
- Patent Document 4 discloses a virtual space movement control device that moves a virtual space image displayed on a display screen of a display unit.
- The virtual space movement control device includes detecting means that detects a touch operation or a drag operation of a plurality of pointing parts performed on the display screen, viewpoint position information generating means that generates viewpoint position information of the virtual space image based on the detection result of the detecting means, and stereoscopic image generating means that generates virtual space image data viewed from the viewpoint indicated by the viewpoint position information and outputs the data to the display means.
- Patent Document 1 Japanese Patent Application Laid-Open No. 5-35192 (Abstract, etc.)
- Patent Document 2 Japanese Patent Application Laid-Open No. 2002-170055 (Abstract, etc.)
- Patent Document 3 Japanese Patent Application Laid-Open No. 2004-5271 (Claims, etc.)
- Patent Document 4 Japanese Patent Application Laid-Open No. 2004-5272 (Claims, etc.)
- The display device includes a transparent screen that is provided between the real object to be displayed and the viewing position and through which a viewer at the viewing position can see the real object, and a projector that projects onto the screen an image of the real object in substantially the same posture as the real object appears when seen through the screen from the viewer's point of view, so that the image overlaps the real object.
- This display device allows the viewer to perceive the image of the real object formed on the screen as corresponding one-to-one with the real object, giving the illusion that the image formed on the screen is the real object itself.
- This display device can thus superimpose an image on a real object that actually exists.
- The display device further includes viewing position detecting means that detects the viewing position or viewpoint position of a viewer, and image changing means that changes the image of the real object formed on the screen so that, as seen from the position detected by the viewing position detecting means, the image overlaps the real object.
- The image of the real object appears to overlap the real object even taking the viewer's position into account. Further, it is not necessary to fix a single position from which the image is seen overlapping the real object.
- The display device, in addition to the configuration of each of the above-described inventions, includes viewing position detecting means having a distance sensor that detects the distance from the screen to a viewer and a camera that captures a predetermined range on the viewer side of the screen.
- In the display device, when a plurality of viewers are included in a captured image, the viewer viewpoint specifying means specifies the viewpoint position of the viewer closest to the center of the image or the center of the screen.
- The display device further includes, as viewing position detecting means, contact position detecting means that detects a position at which the screen is touched, and viewer viewpoint specifying means that specifies the position of the viewer from the contact position detected by the contact position detecting means, on the assumption that the viewer reached out and touched the screen, based on the positional relationship between the screen and the viewer.
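One way the viewer viewpoint specifying means described above could work is sketched below. This is a hypothetical illustration, not the patent's implementation: the coordinate convention, the arm-reach and eye-height constants, and the function name are all assumptions.

```python
# Hypothetical sketch: estimate the viewer's eye position from a touch on
# the screen, assuming the viewer stands directly in front of the touched
# point, at an assumed arm's reach, with the eye at a standard height.
# All constants are illustrative, not values from the patent.

ARM_REACH_M = 0.55    # assumed horizontal eye-to-screen distance when touching
EYE_HEIGHT_M = 1.6    # assumed eye height of a standard viewer

def estimate_viewpoint(touch_x_m: float, touch_y_m: float):
    """Return an estimated (x, y, z) eye position in screen coordinates.

    The screen lies in the z = 0 plane; x is horizontal along the screen,
    y is vertical, and +z points toward the viewer.
    """
    return (touch_x_m, EYE_HEIGHT_M, ARM_REACH_M)
```

The touched y coordinate is ignored here; a refinement could scale the assumed eye height from how high the viewer reached.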
- the position where the viewer is present can be specified by simple processing.
- The screen is a transmissive screen having a transmittance of 50% or more and 80% or less, and a light projecting member that illuminates the real object is arranged on the real-object side of the screen.
- The display device includes light amount control means that executes at least one of a first control for reducing the amount of light output by the light projecting member when an image of the real object is formed on the screen, and a second control for increasing the amount of light output by the light projecting member when no image is formed on the screen.
- The display device, in addition to the configuration of each of the above-described inventions, further includes operation detecting means that detects an operation of the viewer on the screen, and image control means that rotates, enlarges, or reduces the image of the real object formed on the screen based on the operation detected by the operation detecting means and the display state of the image at that time.
- The viewer can operate the screen on which the image is formed overlapping the real object, and can thereby rotate, enlarge, or reduce the image formed on the screen.
- The display device further includes operation detecting means that detects an operation of the viewer on the screen, and image forming means that forms an image, video, or character information related to the real object on the screen based on the operation detected by the operation detecting means and the display state of the image at that time.
- FIG. 1 is a view showing a display device according to Embodiment 1 of the present invention.
- FIG. 2 is a cross-sectional view of the display device of FIG. 1.
- FIG. 3 is a cross-sectional view of the display device of FIG. 1.
- FIG. 4 is an explanatory view showing a layout and a structure of a transmission screen, a retroreflective tape, and two infrared light emitting and receiving units of the exhibition apparatus of FIG. 1.
- FIG. 5 is a block diagram showing a hardware configuration of a control system of the exhibition device of FIG. 1.
- FIG. 6 is a block diagram showing a control system of the exhibition device of FIG. 1.
- FIG. 7 is a diagram showing data stored in the hard disk device of FIG. 5.
- FIG. 8 is a diagram showing an example of an image based on the three-dimensional CG data of FIG. 7.
- FIG. 9 is a diagram showing an example of another image based on the three-dimensional CG data of FIG. 7.
- FIG. 10 is a diagram showing still another example of an image based on the three-dimensional CG data of FIG. 7.
- FIG. 11 is a diagram showing a projection state of the initial image of FIG. 8 onto a transmission screen.
- FIG. 12 is a view showing a display device according to Embodiment 2 of the present invention.
- FIG. 13 is a block diagram showing a hardware configuration of a control system of the exhibition device in FIG. 12.
- FIG. 14 is a block diagram showing a control system of the display device in FIG. 12.
- FIG. 15 is a diagram showing data stored in the hard disk device of FIG.
- FIG. 16 is a diagram showing a geometrical positional relationship between a real object, a viewer, and a transmission screen.
- FIG. 17 is a diagram showing an example of an image based on three-dimensional CG data.
- FIG. 18 is a diagram showing a state of projection of the image of FIG. 17 onto a transmission screen.
- FIG. 19 is a diagram showing a situation where three viewers are standing in front of a transmissive screen.
- FIG. 20 is a diagram showing a state in which a viewer is touching a transmission screen.
- FIG. 21 is a block diagram showing a hardware configuration of a control system of the display device according to Embodiment 3 of the present invention.
- FIG. 22 is a block diagram showing a control system of the display device in FIG. 21.
- FIG. 23 is a diagram showing data stored in the hard disk device of FIG. 21.
- FIG. 24 is a cross-sectional view showing a modification of the display device having a spot lamp that outputs incandescent light and a spot lamp that outputs monochromatic light.
- FIG. 25 is a cross-sectional view of a display device having a plurality of spot lamps and a real object.
- the display device according to the first embodiment of the present invention is a display device that can be suitably used when exhibiting arts and crafts that cannot be directly touched.
- The display device according to the first embodiment of the present invention presents real objects such as arts and crafts together with an image based on computer graphics, in a state where the viewer can freely operate the image.
- FIG. 1 is a diagram showing a display device according to Embodiment 1 of the present invention.
- Fig. 1 (A) is a side view of the exhibition device.
- FIG. 1B is a front view of the display device.
- 2 and 3 are cross-sectional views of the display device of FIG.
- Fig. 2 (A) is an A-A cross-sectional view of the display device of Fig. 1 (A).
- FIG. 2 (B) is a B-B cross-sectional view of the display device of FIG. 1 (A).
- FIG. 3 is a C-C cross-sectional view of the display device shown in FIGS. 1 (A) and 1 (B).
- The display device includes an access restriction frame 1, a mounting table 2, a cover member 3, and a spot lamp 4 as light emitting means.
- The access restriction frame 1 has four column members 5.
- The four long column members 5 are erected at the four corners of a square.
- The four column members 5 are connected to each other by beam members 6 and assembled into a cubic frame structure.
- The access restriction frame 1 is installed on the floor 7.
- The four column members 5 of the access restriction frame 1 may be fixed to the floor 7. The access restriction frame 1 is about 2.4 m tall, higher than an adult.
- The access restriction frame 1 has four horizontal bar members 8 connecting the four erected column members 5.
- The four horizontal bar members 8 bridge adjacent column members 5 at or above the height of an adult's knee.
- A person who wants to enter the access restriction frame 1 needs to climb over a horizontal bar member 8.
- A person attempting to enter the access restriction frame 1 must therefore raise his or her feet significantly. As a result, a viewer or a thief cannot easily enter the access restriction frame 1.
- The access restriction frame 1 is reinforced by the four horizontal bar members 8.
- Transparent reinforcing glass may be provided between the erected column members 5.
- The mounting table 2 is arranged at the center inside the access restriction frame 1.
- The mounting table 2 is at least as high as an adult's waist.
- The mounting table 2 has a substantially quadrangular prism shape.
- The cover member 3 covers the real object 9 mounted on the upper surface of the mounting table 2.
- The cover member 3 is formed of a transparent material. Examples of the material of the cover member 3 include glass and a transparent acrylic plate.
- The spot lamp 4 is disposed on the ceiling of the access restriction frame 1 so that its spotlight illuminates the real object 9 mounted on the mounting table 2.
- the spot lamp 4 is a kind of lighting equipment and outputs a spot light having a brightness according to the supplied electric power.
- the spot light of the spot lamp 4 is incandescent. The greater the power supplied, the brighter the spotlight. The smaller the power supplied, the darker the spotlight. When no power is supplied, the spot lamp 4 is turned off. In the first embodiment, the spot lamp 4 is always supplied with power.
- The display device includes a transmissive screen 11 as the screen, an opaque plate 12, and a projector 13.
- the transmission screen 11 has a rectangular shape.
- The transmissive screen 11 is disposed between a pair of column members 5 of the access restriction frame 1 so that its lengthwise direction is horizontal and it extends from the height of an adult's waist to the height of the head.
- the transmission screen 11 has a transmittance of about 65%.
- the opaque plate 12 has a rectangular shape.
- the opaque plate 12 is disposed between the pair of column members 5 of the entrance restriction frame 1 above the transmissive screen 11.
- the opaque plate 12 has a height up to the ceiling of the restricted access frame 1.
- the opaque plate 12 is disposed so that its surface is substantially parallel to the surface of the transmission screen 11.
- The opaque plate 12 is disposed shifted toward the outside of the access restriction frame 1 relative to the transmissive screen 11. As shown by interval B in FIG. 3, a gap of several centimeters is formed between the surface of the transmissive screen 11 (the surface facing the outside of the access restriction frame 1) and the back surface of the opaque plate 12 (the surface facing the inside of the access restriction frame 1).
- When a video signal or still image data is input, the projector 13 outputs an image based on the video signal or the still image data from its output unit.
- the projector 13 is disposed on the ceiling of the entrance restriction frame 1 in such a manner that the center of the output image is the center of the transmissive screen 11.
- The image output from the projector 13 is projected on the transmissive screen 11. From the front side of the transmissive screen 11 (outside the access restriction frame 1), the image projected on the transmissive screen 11 can be seen.
- the projector 13 is disposed so as to be positioned obliquely upward at 45 degrees from the center of the transmissive screen 11.
- The opaque plate 12 is located between a viewer 14 standing in front of the transmissive screen 11 and the output unit of the projector 13. Therefore, the output unit of the projector 13 is not visible to the viewer 14 standing in front of the transmissive screen 11, and the viewer 14 is not dazzled by direct light from the output unit of the projector 13.
- The output of the projector 13 is corrected by a trapezoidal (keystone) distortion correction function so that the image is projected on the transmissive screen 11 with a rectangular outline.
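The geometry behind such keystone correction can be sketched as follows: find the projective transform (homography) that maps the four corners of the trapezoid an oblique projector would otherwise produce back onto the rectangular outline of the screen. The corner coordinates and function names below are illustrative assumptions, not taken from the patent.

```python
# Minimal homography sketch for keystone correction (illustrative only).
import numpy as np

def homography(src, dst):
    """Solve H (3x3) with dst ~ H @ src for four point correspondences (DLT)."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    A = np.array(rows)
    # The solution is the null vector of A (right singular vector with the
    # smallest singular value).
    _, _, vt = np.linalg.svd(A)
    return vt[-1].reshape(3, 3)

def apply(H, p):
    """Map point p through H, with perspective division."""
    x, y, w = H @ np.array([p[0], p[1], 1.0])
    return (x / w, y / w)

# Illustrative corners: the uncorrected image lands as a trapezoid,
# wider at the top because the projector sits obliquely above.
trapezoid = [(-0.1, 1.0), (1.1, 1.0), (1.0, 0.0), (0.0, 0.0)]
rectangle = [(0.0, 1.0), (1.0, 1.0), (1.0, 0.0), (0.0, 0.0)]
H = homography(trapezoid, rectangle)
```

Pre-warping the frame buffer with `H` before output would make the projected image land with the rectangular outline described above.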
- the display device includes a retroreflective tape 21 and two infrared light emitting and receiving units 22 and 23.
- the retroreflective tape 21 and the two infrared light emitting and receiving units 22, 23 are operation detecting means and contact position detecting means.
- FIG. 4 is an explanatory diagram showing the layout and structure of the transmissive screen 11, the retroreflective tape 21, and the two infrared light emitting and receiving units 22, 23 of the display device of FIG.
- the retroreflective tape 21 is a tape that reflects irradiated light in the irradiation direction.
- The retroreflective tape 21 has, for example, a structure in which a plurality of bead bodies 24, each a transparent sphere made of glass or the like with a reflective layer adhered to half of its surface, are arranged so that the reflective layer faces the side where the tape adheres.
- The retroreflective tape 21 is arranged along the left and right sides and the lower side of the transmissive screen 11, with the bead-side surface facing along the surface of the transmissive screen 11.
- Light irradiated onto the retroreflective tape 21 enters a bead body 24 from the front side of the tape, is reflected by the reflective layer, and exits from the front side of the tape back in the irradiation direction.
- The plurality of bead bodies 24 may be encapsulated in the tape.
- One of the two infrared light emitting and receiving units 22 and 23 includes an infrared LED (Light Emitting Diode) 25, a polygon mirror 26, and an infrared CCD (Charge-Coupled Device) 27.
- the infrared LED 25 outputs infrared light. Infrared rays are a type of light.
- The polygon mirror 26 is a polygonal mirror, for example a hexagonal mirror.
- the infrared CCD 27 has a plurality of infrared light receiving elements.
- the infrared light receiving element outputs a light receiving level signal corresponding to the amount of received infrared light.
- the infrared CCD 27 outputs an infrared image based on the light receiving level signals of the plurality of infrared light receiving elements.
- One infrared light emitting/receiving unit 22 is provided near one end of the transmissive screen 11 in the longitudinal direction.
- One infrared light emitting / receiving unit 22 is provided in a gap between the transmission screen 11 and the opaque plate 12.
- the plurality of infrared light receiving elements are arranged along the surface of the transmission screen 11.
- In one infrared light emitting/receiving unit 22, the infrared LED 25 outputs infrared light.
- The polygon mirror 26 reflects the infrared light output from the infrared LED 25 on one of its mirror surfaces.
- the infrared light reflected by the polygon mirror 26 travels along the surface of the transmission screen 11 and enters the retroreflective tape 21 provided along the outer periphery of the transmission screen 11.
- the retroreflective tape 21 reflects the incident infrared light in the incident direction.
- the infrared light reflected by the retroreflective tape 21 returns to one of the infrared light emitting / receiving units 22 along substantially the same path as the case of incidence.
- the infrared light returned to one infrared light emitting / receiving unit 22 is received by a certain infrared light receiving element among the plurality of infrared light receiving elements of the infrared CCD 27.
- the polygon mirror 26 rotates.
- the direction of infrared light output from one infrared light emitting and receiving unit 22 changes.
- the infrared rays output in the different directions proceed along the surface of the transmissive screen 11 and are incident on the retroreflective tape 21 at a site different from the one before.
- the retro-reflective tape 21 reflects the irradiated infrared rays in the direction of incidence.
- the infrared light reflected by the retroreflective tape 21 returns to one of the infrared light emitting and receiving units 22 along substantially the same path as the case of incidence.
- the infrared light that has returned to the one infrared light emitting / receiving unit 22 is received by a certain infrared light receiving element different from the previous one among the plurality of infrared light receiving elements of the infrared CCD 27.
- As the polygon mirror 26 rotates, the path of the infrared light output from one infrared light emitting/receiving unit 22 gradually moves along the surface of the transmissive screen 11.
- Accordingly, the infrared light receiving element that receives the infrared light at the infrared CCD 27 changes.
- In one scan, each of the plurality of infrared light receiving elements of the infrared CCD 27 receives infrared light once.
- In the next scan, each of the plurality of infrared light receiving elements of the infrared CCD 27 receives infrared light again.
- One infrared light emitting/receiving unit 22 scans a predetermined range of the surface of the transmissive screen 11 (see FIG. 4) with infrared light every time the polygon mirror 26 rotates by an angle obtained by dividing 360 degrees by the number of mirror surfaces. One infrared light emitting/receiving unit 22 repeats this scanning. If there is no obstacle in the predetermined range on the surface of the transmissive screen 11 during the period required for one scan, each of the plurality of infrared light receiving elements receives infrared light once.
- the infrared CCD 27 of the infrared emitting / receiving unit 22 outputs an infrared image every time the polygon mirror 26 rotates by an angle obtained by dividing 360 degrees by the number of surfaces.
- When an obstacle is present, the infrared light receiving element that had been receiving the infrared light passing through the position of the obstacle stops receiving infrared light.
- In that case, the infrared CCD 27 outputs an infrared image in which a shadow is formed at the portion that does not receive the infrared rays.
- the other infrared projection / reception unit 23 is provided near the other end in the longitudinal direction of the transmission screen 11.
- the other infrared light emitting / receiving unit 23 is disposed in a gap between the transmission screen 11 and the opaque plate 12.
- the configuration and operation of the other infrared projection / reception unit 23 are the same as the configuration and operation of the infrared projection / reception unit 22, and are denoted by the same reference numerals and description thereof will be omitted.
- the infrared CCD 27 outputs an infrared image every time the polygon mirror 26 rotates by an angle obtained by dividing 360 degrees by the number of surfaces.
- One infrared light emitting/receiving unit 22 and the other infrared light emitting/receiving unit 23 are disposed at both ends of the upper side of the transmissive screen 11. Because the two infrared light emitting/receiving units 22, 23 are disposed at different positions, the predetermined range of the surface of the transmissive screen 11 scanned by the other infrared light emitting/receiving unit 23 (range C indicated by a dashed line in FIG. 4) and the predetermined range of the surface of the transmissive screen 11 scanned by one infrared light emitting/receiving unit 22 (range D indicated by a dashed line in FIG. 4) do not coincide with each other, although they have overlapping portions.
- A point on the transmissive screen 11 (for example, point E in FIG. 4) therefore lies in different directions as viewed from one infrared light emitting/receiving unit 22 and from the other infrared light emitting/receiving unit 23.
- The display device includes a control device 31. Although not shown in FIGS. 1 to 3, the control device 31 is arranged at a position that is not visible from the viewer.
- FIG. 5 is a block diagram showing a hardware configuration of a control system of the exhibition apparatus of FIG.
- FIG. 6 is a block diagram showing a control system of the exhibition device of FIG.
- the control device 31 can be realized by a personal computer or the like.
- The control device 31 has an input/output port 32, a CPU (Central Processing Unit) 33, a memory 34, a hard disk device 35, and a system bus 36 connecting these.
- the input / output port 32 is connected to the projector 13, one infrared light emitting / receiving unit 22, and the other infrared light emitting / receiving unit 23.
- the one infrared light emitting and receiving unit 22 and the other infrared light emitting and receiving unit 23 output an infrared image of the infrared CCD 27 to the input / output port 32 every time scanning is performed.
- FIG. 7 is a diagram showing data stored in the hard disk device 35 of FIG.
- the hard disk device 35 stores, for example, programs such as a projection image generating program 41, a moving image reproducing program 42, and an operation detecting program 43, and display data such as three-dimensional computer graphics data 44 and moving image data 45.
- The three-dimensional computer graphics data 44 is data for generating still image data of an image including an image of the real object 9 as viewed from any direction.
- The three-dimensional computer graphics data 44 includes, for example, modeling data of the real object 9 and data of an image pasted on the surface of the modeling data.
- As the data of the image pasted on the surface of the modeling data, for example, data of an image obtained by capturing the real object 9 can be used.
- FIG. 8 is a diagram showing one image 51 based on still image data generated based on the three-dimensional computer graphics data 44 of FIG.
- In the image 51, an image identical to the view of the real object 9 (a silver cup) from the front side of the transmissive screen 11 is formed.
- FIG. 9 is a diagram showing another image 52 based on still image data generated based on the three-dimensional computer graphics data 44 of FIG.
- In the image 52, an image of the real object larger than that in the image 51 of FIG. 8 is formed.
- FIG. 10 is a diagram showing still another image 53 based on still image data generated based on the three-dimensional computer graphics data 44 of FIG.
- In the image 53, an image identical to the view of the real object 9 (silver cup) as seen looking upward at it is formed.
- the moving image data 45 is data for generating a video signal.
- The moving image data 45 is, for example, moving image data that introduces the manufacturing process of the real object 9.
- the projection image generation program 41 is read into the memory 34 and executed by the CPU 33.
- As a result, the projection image generation unit 61 shown in FIG. 6 is generated as the image changing means and the image control means.
- the projection image generation unit 61 generates still image data from the three-dimensional computer graphics data 44, and outputs the generated still image data to the projector 13.
- the moving image reproducing program 42 is read into the memory 34 and executed by the CPU 33. As a result, the moving image reproducing unit 62 shown in FIG. 6 as an image forming unit is generated. The moving image reproducing unit 62 generates a video signal based on the moving image data 45, and outputs the generated video signal to the projector 13.
- the operation detection program 43 is read into the memory 34 and executed by the CPU 33.
- the operation detecting unit 63 shown in FIG. 6 is generated as the operation detecting unit and the viewer viewpoint specifying unit.
- The operation detection unit 63 determines an operation on the transmissive screen 11 based on the infrared images input from the two infrared light emitting/receiving units 22 and 23, and outputs instructions to the projection image generation unit 61 and the moving image reproducing unit 62 according to the determined operation.
- The real object 9 (here, a silver cup) is placed on the mounting table 2 and illuminated by the spot lamp 4.
- a viewer or the like can check the appearance and color of the real object 9 through the translucent screen 11.
- The projection image generation unit 61 reads the three-dimensional computer graphics data 44 and, based on the read three-dimensional computer graphics data 44, generates two-dimensional still image data obtained by projecting an image of the real object 9 onto a predetermined plane. Here, the projection image generation unit 61 generates still image data of the image 51 shown in FIG. 8.
- the projection image generator 61 outputs the generated still image data to the projector 13 via the input / output port 32.
- the projector 13 outputs an image 51 shown in FIG.
- An image projected by the projector 13 is formed on the transmission screen 11.
- FIG. 11 is a diagram showing a state where the initial image 51 shown in FIG. 8 is projected on the transmissive screen 11.
- In the image projected on the transmissive screen 11, the image of the real object is positioned substantially at the center of the transmissive screen 11.
- The image of the real object formed on the transmissive screen 11 is positioned so that, for a viewer of standard height (for example, 175 cm) viewing from the position of point A (the position of the viewer 14) shown in FIG. 2, the image overlaps the real object 9 seen through the transmissive screen 11.
- The real object 9 and the image may deviate slightly, but the real object 9 and the image rarely appear to be separated.
- The viewer 14 can see both the real object 9 and the image. As a result, the viewer 14 can be given the illusion that the real object 9 itself is at hand.
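The reason the projected image can be made to overlap the real object 9 reduces to simple line-plane geometry: for the viewer's eye at E and an object point P behind the screen, the image point must be drawn where the line from E to P pierces the screen plane. The coordinate convention and all numbers below are illustrative assumptions, not values from the patent.

```python
# Sketch: the screen is the z = 0 plane, the viewer stands at z > 0, and
# the real object sits at z < 0 behind the screen. The image point for an
# object point is the intersection of the eye->object line with z = 0.

def screen_point(eye, obj):
    """Intersect the eye->object line with the plane z = 0."""
    ex, ey, ez = eye
    px, py, pz = obj
    t = ez / (ez - pz)          # parameter where the z component reaches 0
    return (ex + t * (px - ex), ey + t * (py - ey))

# Eye of a 175 cm viewer 1 m in front of the screen; object point 0.5 m behind:
print(screen_point((0.0, 1.75, 1.0), (0.0, 1.0, -0.5)))
```

Evaluating this for every modeled point of the real object 9, for the assumed standard eye position, yields the projection described above; redoing it with a detected eye position corresponds to the image changing means of the second embodiment.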
- When the viewer 14 or the like stands at the position of point A shown in FIG. 2 and reaches out to the transmissive screen 11, the two infrared light emitting/receiving units 22, 23 output two infrared images in which the shadow of a finger of the hand is formed. These infrared images are input to the operation detection unit 63 via the input/output port 32. The two infrared light emitting/receiving units 22, 23 output two infrared images for each scan. Hereinafter, the two infrared images output from the two infrared light emitting/receiving units 22, 23 for each scan are referred to as a set of infrared images.
- the operation detection unit 63 determines an operation on the transmission screen 11 based on each set of infrared images, and outputs an instruction to the projection image generation unit 61 or the moving image reproduction unit 62 according to the determined operation.
- the operation detecting unit 63 first determines the position and the number of fingers based on each set of infrared images. As described above, when infrared light is blocked by a finger, a shadow is formed in the infrared image. The operation detection unit 63 determines the position and number of shadows in each infrared image.
- when one shadow appears in each of the two infrared images, the operation detection unit 63 determines that the shadows are cast by a single finger. The operation detecting unit 63 then specifies, from the position of the shadow in each infrared image, the direction of the finger as seen from each infrared light emitting / receiving unit (if the shadow has a width, the direction of the center of the shadow). Using the specified directions and the distance between one infrared light emitting / receiving unit 22 and the other infrared light emitting / receiving unit 23, the operation detection unit 63 specifies the position of the finger on the transmissive screen 11 based on the principle of triangulation. Accordingly, when a finger is present at point E in FIG. 4, for example, the operation detection unit 63 can determine that there is one finger at that position.
- when two shadows appear in each of the two infrared images, the operation detection unit 63 determines that the shadows are cast by two fingers. The operation detecting unit 63 assumes that the right-hand shadow in each of the two infrared images belongs to one finger and the left-hand shadow to the other finger, and specifies the direction of each finger as seen from each infrared light emitting / receiving unit accordingly.
- the operation detection unit 63 then specifies the positions of the two fingers on the transmissive screen 11 from the specified direction of each finger and the distance between one infrared light emitting and receiving unit 22 and the other infrared light emitting and receiving unit 23, based on the principle of triangulation.
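The triangulation described above can be sketched as follows. This is an illustrative reconstruction, not code from the patent: the coordinate frame (the two units on the top edge of the screen a known distance apart, angles measured from the baseline joining them) is our assumption.

```python
import math

def finger_position(d, alpha, beta):
    """Locate a finger on the screen plane by triangulation.

    The two infrared emitting/receiving units sit at opposite ends of
    the screen's top edge, a distance d apart. Each reports the angle
    (radians, measured from the baseline joining the two units) at
    which it sees the finger's shadow. The intersection of the two
    sight lines gives the finger position: (0, 0) is the first unit,
    x runs along the baseline toward the second unit, y runs down the
    screen.
    """
    ta, tb = math.tan(alpha), math.tan(beta)
    y = d * ta * tb / (ta + tb)   # depth below the baseline
    x = y / ta                    # offset along the baseline
    return x, y
```

With a unit baseline and both shadows seen at 45 degrees, the sight lines cross at (0.5, 0.5); the two-finger case runs the same computation once per shadow pairing.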
- the operation detecting unit 63 determines the content of the operation instruction on the transmissive screen 11 based on the specified number and positions of the fingers, or based on the change of the finger positions relative to the finger positions in the previous set of infrared images. After determining the content of the operation instruction, the operation detection unit 63 outputs an instruction to the projection image generation unit 61 or the moving image reproduction unit 62 according to the determined operation instruction.
- when the positions of two fingers are specified in a certain set of infrared images and in the set of infrared images before it, and the spacing between the two fingers has widened, the operation detection unit 63 determines that an operation instructing enlargement of the image projected on the transmissive screen 11 has been performed. If it determines that the operation instructs enlargement of the image, the operation detection unit 63 outputs an instruction to enlarge the image to the projection image generation unit 61.
- when the positions of two fingers are specified in a certain set of infrared images and in the set of infrared images before it, and the spacing between the two fingers has narrowed, the operation detection unit 63 determines that an operation instructing reduction of the image projected on the transmissive screen 11 has been performed. If it determines that the operation instructs reduction of the image, the operation detecting unit 63 outputs an instruction to reduce the image to the projection image generating unit 61.
- when the positions of two fingers are specified in a certain set of infrared images and in the set of infrared images before it, and the line joining the two fingers has rotated, the operation detecting unit 63 determines that an operation instructing rotation of the image projected on the transmissive screen 11 has been performed. If it determines that the operation instructs rotation of the image, the operation detection unit 63 outputs an instruction to rotate the image to the projection image generation unit 61.
- when the corresponding operation is detected, the operation detection unit 63 determines that an operation instructing reproduction of the moving image has been input. If it determines that the operation instructs reproduction of the moving image, the operation detection unit 63 outputs an instruction to reproduce the moving image to the moving image reproduction unit 62.
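The decision logic of the preceding paragraphs, stripped to essentials, might look like the following sketch. The function name, the tolerance, and the exact thresholds are our assumptions; the patent only fixes the enlarge/reduce/rotate outcomes.

```python
import math

def classify_gesture(prev, curr, tol=0.05):
    """Classify a two-finger gesture between two successive scans.

    prev and curr each hold the pair of (x, y) finger positions taken
    from consecutive sets of infrared images. Widening finger spacing
    is read as "enlarge", narrowing as "reduce", and a change in the
    angle of the line joining the fingers (at roughly constant
    spacing) as "rotate".
    """
    def span(pts):
        (x1, y1), (x2, y2) = pts
        return math.hypot(x2 - x1, y2 - y1)

    def angle(pts):
        (x1, y1), (x2, y2) = pts
        return math.atan2(y2 - y1, x2 - x1)

    d0, d1 = span(prev), span(curr)
    if d1 > d0 * (1 + tol):
        return "enlarge"
    if d1 < d0 * (1 - tol):
        return "reduce"
    if abs(angle(curr) - angle(prev)) > tol:
        return "rotate"
    return "none"
```

Spreading the fingers from a spacing of 1 to 2 classifies as "enlarge"; keeping the spacing constant while the joining line turns classifies as "rotate".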
- the projection image generation unit 61 reads the three-dimensional computer graphics data 44 and generates from it two-dimensional still image data in which the image of the real object 9 is projected onto a predetermined plane.
- when such an instruction is input, the projection image generation section 61 generates still image data for projecting an image obtained by enlarging, reducing, or rotating the image currently projected by the projector 13 on the transmissive screen 11, using the image 51 initially output to the projector 13 as a reference.
- the projection image generation unit 61 outputs the generated still image data to the projector 13.
- the projector 13 outputs an image based on the newly input still image data. On the transmissive screen 11, the image newly projected by the projector 13 is formed.
- for example, an image 53 in which the real object 9 is shown as viewed from above is projected on the transmissive screen 11.
- in this way, the image of the real object initially formed on the transmissive screen 11 so as to be superimposed on the real object 9 can be enlarged, reduced, or rotated by operating the transmissive screen 11. As a result, it is possible to give the operating viewer 14 an illusion as if he or she were holding the exhibited real object 9 in hand.
- the operation detection unit 63 instructs the projection image generation unit 61 to display the initial image 51 shown in FIG. 8. As a result, the image 51 overlapping the real object 9 shown in FIG. 8 is again displayed on the transmissive screen 11. This makes it possible to give the operating viewer 14 an illusion that the real object 9 he or she was holding and viewing has been placed back on the mounting table 2.
- note that the projection image generation unit 61 may generate a series of still image data in which the image of the real object formed on the transmissive screen 11 is continuously enlarged, reduced, and rotated until it matches the real image in the image based on the initial screen data shown in FIG. 8, and then generate the still image data of the image 51 overlapping the real object 9 shown in FIG. 8.
- in this case, the image of the real object formed on the transmissive screen 11 comes to overlap the real object 9 after its size and orientation change smoothly. In this way, by smoothly changing the image of the real object formed on the transmissive screen 11 so that it comes to overlap the real object 9, the sense of unity between the real image formed on the transmissive screen 11 and the real object 9 can be further enhanced.
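A minimal sketch of such a smooth change, assuming simple linear interpolation of scale and rotation angle (the patent does not prescribe an interpolation scheme, and the names here are ours):

```python
def ease_back(current, target, steps):
    """Return a sequence of (scale, angle) pairs stepping linearly from
    the operated pose back to the pose that overlaps the real object,
    so the projected image changes smoothly instead of jumping."""
    (s0, a0), (s1, a1) = current, target
    return [(s0 + (s1 - s0) * i / steps, a0 + (a1 - a0) * i / steps)
            for i in range(steps + 1)]
```

Each returned pose would drive one frame of still image data, ending exactly at the pose that overlaps the real object.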
- the moving image reproducing unit 62 generates a video signal from the moving image data 45 when a moving image reproducing instruction is input.
- the moving image reproducing unit 62 outputs the generated video signal to the projector 13.
- the projector 13 outputs a moving image based on the video signal.
- the moving image projected from the projector 13 is displayed on the transmission screen 11.
- when reproduction of the moving image ends, the moving image reproduction unit 62 instructs the projection image generation unit 61 to display the initial image 51 shown in FIG. 8. As a result, the image 51 overlapping the real object 9 shown in FIG. 8 is displayed on the transmissive screen 11.
- as described above, the display device can form, on the transmissive screen 11, an image of the real object in the same posture as the real object 9 seen through the transmissive screen 11, overlaid on the real object 9. This allows the viewer 14 to recognize that the real image formed on the transmissive screen 11 corresponds one-to-one with the real object 9, and can give the illusion that the real image formed on the transmissive screen 11 is the real object 9 itself.
- in addition, the display device of the first embodiment allows the viewer 14 to view the real object 9 and the virtual real image from the same viewpoint, and can give the two a sense of unity. For example, the size and texture of the real object 9, which are difficult to grasp from the virtual real image alone, can be conveyed to the virtual real image. As a result, the display device of the first embodiment can give the real image a sense of reality.
- furthermore, the viewer 14 can enlarge, reduce, or rotate the real image formed on the transmissive screen 11 by operating the transmissive screen 11. Therefore, the viewer 14 can freely view, through the image of the real object, a real object 9 that cannot be touched. As a result, a high learning effect for art objects can be expected.
- the operation on the transmission screen 11 is detected by the retroreflective tape 21 and the two infrared light emitting and receiving units 22 and 23.
- in Embodiment 1, by employing this configuration, an operation on the transmissive screen 11 can be detected without disposing a member for detecting operations so as to overlap the transmissive screen 11. Since no member is arranged over the transmissive screen 11, neither the view of the real object 9 seen through the transmissive screen 11 nor the sharpness of the real image formed on the transmissive screen 11 is impaired.
- in Embodiment 1, the two infrared light projecting and receiving units 22, 23 are arranged above the erected transmissive screen 11. Therefore, compared to the case where the two infrared emitting and receiving units 22 and 23 are arranged below the transmissive screen 11, for example, infrared rays other than those from the infrared LED 25 are less likely to enter the infrared CCD 26. Also, since infrared light is used for detecting operations, the influence of visible light is reduced. If visible light were used for detecting operations, detection would be susceptible to, for example, the light of the spot lamp 4 illuminating the real object 9.
- thus, the operation detecting section 63 can easily specify the position of the operating hand based on the infrared image in which the shadow of the hand appears clearly.
- in Embodiment 1, the initial image 51 shown in FIG. 8 is projected at a predetermined position on the transmissive screen 11.
- alternatively, the initial image 51 shown in FIG. 8 projected at that predetermined position may be movable left and right and up and down on the transmissive screen 11 in accordance with operations by the viewer 14, and may also be enlargeable and reducible.
- in this way, the viewer 14 can move the real image so that, to his or her eyes, it overlaps the real object 9.
- an image to be subsequently projected on the transmissive screen 11 may be projected based on the moved position.
- in Embodiment 1, the real image formed based on the three-dimensional computer graphics data 44 is an image of the real object 9 placed on the mounting table 2 as it is.
- alternatively, the real image formed based on the three-dimensional computer graphics data 44 may be an image in which a changed color is applied to the real object 9.
- in Embodiment 1, the moving image based on the moving image data 45 is simply projected continuously on the transmissive screen 11.
- alternatively, a video signal based on the moving image data 45 may be generated stepwise according to operations of the viewer 14 on the transmissive screen 11. As a result, for example, the viewer 14 can relive the coloring process of the real object 9.
- moving image data 45 is stored in hard disk device 35.
- in addition to this, various data such as slide data and audio data may be stored and reproduced by the exhibition device to form images, video, or character information related to the real object on the transmissive screen 11.
- the operation on the transmission screen 11 is detected by the retroreflective tape 21 and the two infrared light emitting and receiving units 22 and 23.
- alternatively, the operation on the transmissive screen 11 may be detected by a touch panel or the like disposed over the transmissive screen 11.
- in that case, however, the viewer 14 tends to feel that he or she is operating the touch panel rather than the real image formed on the transmissive screen 11.
- detecting the operation on the transmissive screen 11 with the retroreflective tape 21 and the two infrared light emitting and receiving units 22 and 23 is therefore preferable, because the sense of unity described above is maintained.
- a screen having a transmittance of 65% is used as the transmission screen 11.
- the transmission screen 11 may have a transmittance of 10% or more and 90% or less.
- a polarizing screen may be used in place of the transmission screen 11.
- however, with a polarizing screen, the color of the image changes when the viewing angle changes even slightly, so it is not suitable for a large display device that a plurality of viewers 14 view at the same time.
- in Embodiment 1, since the projector 13 is disposed on the real object 9 side of the transmissive screen 11, the area occupied by the display device becomes smaller than when the projector 13 is disposed, for example, on the viewer 14 side of the transmissive screen 11.
- the display device according to the second embodiment differs from the display device according to the first embodiment in that the position and size of the real image formed on the transmissive screen 11 are controlled in accordance with the height, viewpoint (eye) position, standing position, and the like of the viewer 14 standing in front of the transmissive screen 11. In the following description, this difference will be mainly described.
- FIG. 12 is a diagram showing a display device according to Embodiment 2 of the present invention.
- FIG. 12A is a side view of the display device.
- FIG. 12B is a front view of the display device.
- FIG. 13 is a block diagram showing a hardware configuration of a control system of the exhibition device in FIG. 12.
- FIG. 14 is a block diagram showing a control system of the display device of FIG. 12.
- FIG. 15 is a diagram showing data stored in the hard disk device 35 of FIG. 13.
- the exhibition device includes a distance sensor 71 as a viewing position detecting unit and an imaging device 72 as a viewing position detecting unit.
- the distance sensor 71 is attached to the center of the lower end of the transmission screen 11 toward the front of the transmission screen 11.
- the distance sensor 71 outputs infrared light or other light, and receives reflected light of the light.
- the distance sensor 71 calculates the distance to the object reflecting the light based on the time from when the light is output to when the reflected light is received.
- the distance sensor 71 outputs the calculated distance to the input / output port 32, as shown in FIG.
- the imaging device 72 has a plurality of light receiving elements (not shown).
- the light receiving element outputs a light receiving level signal corresponding to the amount of received light.
- the plurality of light receiving elements are arranged vertically and horizontally in the same plane.
- the surface on which the plurality of light receiving elements are arranged is called a light receiving surface.
- the imaging device 72 is attached to the center of the upper end of the opaque plate 12 in such a manner that the light receiving surface thereof is directed downward by a predetermined angle in front of the transmission screen 11.
- the imaging device 72 outputs a captured image based on the light receiving level signals of the plurality of light receiving elements to the input / output port 32, as shown in FIG.
- the hard disk device 35 stores a projection image generation program 81 and a human detection program 82, as shown in FIG.
- the projection image generation program 81 is read into the memory 34 and executed by the CPU 33.
- the projection image generation unit 91 shown in FIG. 14 as the image changing unit and the image control unit is generated.
- the projection image generation unit 91 generates still image data based on the three-dimensional computer graphics data 44, and sends the generated still image data to the projector 13. Output.
- the human detection program 82 is read into the memory 34 and executed by the CPU 33.
- a person detecting unit 92 shown in FIG. 14 is generated as a viewer viewpoint specifying unit and a viewing position detecting unit.
- the captured image captured by the imaging device 72 and the distance to the reflecting object calculated by the distance sensor 71 are input to the human detection unit 92.
- the components other than those described above are the same as the components of the display device according to the first embodiment; they are denoted by the same reference numerals and names, and their description is omitted.
- in an initial state where there is no person around the display device, the distance sensor 71 outputs light toward the front of the transmissive screen 11.
- the imaging device 72 images a predetermined range on the near side of the transmission screen 11. Spot lamp 4 is lit.
- the projector 13 is off.
- the projector 13 can be turned off by setting its luminance to 0. When the projector 13 cannot be turned off, a black image may be output.
- when the viewer 14 stands in front of the transmissive screen 11 to view the real object 9 (the silver cup) illuminated by the light of the spot lamp 4, the distance sensor 71 receives the reflected light of the light it emitted. The distance sensor 71 calculates the distance to the viewer 14 and outputs the distance to the input / output port 32. The imaging device 72 outputs a captured image including the image of the viewer 14 to the input / output port 32.
- Information on the distance to the viewer 14 detected by the distance sensor 71 and the captured image are input to the human detection unit 92.
- the human detection unit 92 specifies the position of the head of the viewer 14 standing in front of the transmissive screen 11 as the viewpoint position, based on the distance information and the captured image at that time. Instead of the head, a position such as the eyes or the space between the eyebrows may be specified as the viewpoint position.
- specifically, the viewpoint position (L, H) is specified from the installation height of the imaging device 72, the shooting angle θ associated with each pixel of the captured image, and the distance L to the viewer 14.
- the human detection unit 92 specifies the position of the head of the viewer 14 (that is, the viewpoint position) in the captured image by image processing.
- the human detection unit 92 then assumes that the distance to the viewer 14 detected by the distance sensor 71 is the distance from the transmissive screen 11 to the viewer 14, and virtually arranges the captured image at that distance from the transmissive screen 11.
- based on this virtual arrangement, the relative position of the identified head with respect to the transmissive screen 11 and the real object 9 is identified.
- the human detection unit 92 outputs information on the specified relative position of the head to the projection image generation unit 91.
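The head-position calculation can be pictured with a small sketch. The names and the flat-floor, known-mounting-height geometry are our assumptions; the patent only states that the viewpoint (L, H) follows from the per-pixel shooting angle and the measured distance.

```python
import math

def head_height(camera_height, depression_angle, distance):
    """Estimate the height of the viewer's head.

    The imaging device is mounted at camera_height above the floor;
    the pixel showing the head corresponds to a line of sight tilted
    depression_angle radians below horizontal; the distance sensor
    gives the horizontal distance to the viewer. The head then lies
    distance * tan(depression_angle) below the camera.
    """
    return camera_height - distance * math.tan(depression_angle)
```

For example, a camera mounted at 2 m seeing the head 1 m away along a ray tilted down by atan(0.25) puts the head at 1.75 m, the standard-height example used in Embodiment 1.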
- based on the information on the relative position of the head (that is, the viewpoint position) and the three-dimensional computer graphics data 44, the projection image generation unit 91 generates still image data such that the real image formed on the transmissive screen 11 coincides with the real object 9 when viewed from the head position.
- specifically, the projection image generation unit 91 first specifies the position 102 on the transmissive screen 11 at which the center line connecting the head position (that is, the viewpoint position) specified by the human detection unit 92 and the center of the real object 9 intersects the transmissive screen 11. This position 102 becomes the center of the real image formed on the transmissive screen 11. In addition, the projection image generation unit 91 specifies the size 101 of the real image formed on the transmissive screen 11, as shown in FIG. 16.
- after specifying the center 102 and the size 101 of the real image to be formed on the transmissive screen 11, the projection image generation unit 91 generates still image data in which the real image is formed at the specified part.
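The center 102 and size 101 follow from similar triangles along the sight line. The sketch below uses a simplified 2D side view (screen in the plane x = 0, heights on the y axis), which is our own framing rather than the patent's.

```python
def image_on_screen(viewer, obj, obj_size):
    """Place the real image on the screen by similar triangles.

    The screen lies in the plane x = 0. viewer is the eye position
    (negative x, in front of the screen) and obj the object's centre
    (positive x, behind it), each given as (x, height). The sight
    line from eye to object centre crosses the screen at the image
    centre, and the image must be scaled by the same ratio so that,
    seen from the eye, it exactly covers the object.
    """
    (xv, yv), (xo, yo) = viewer, obj
    t = -xv / (xo - xv)            # fraction of the way from eye to object
    center_height = yv + t * (yo - yv)
    return center_height, obj_size * t
```

For an eye 1 unit in front of the screen and an object 1 unit behind it, the image centre sits halfway between eye height and object height, and the image is drawn at half the object's size.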
- the projection image generation unit 91 outputs the generated still image data to the projector 13.
- the projector 13 starts outputting and outputs an image based on the still image data.
- An image projected by the projector 13 is formed on the transmissive screen 11.
- FIG. 17 is a diagram showing an image 111 based on still image data generated based on the three-dimensional computer graphics data 44.
- in the image 111, the image of the real object 9 (here, the silver cup) is rendered as seen from the near side of the transmissive screen 11, with its center shifted to the lower left in FIG. 17.
- FIG. 18 is a diagram showing a state where image 111 shown in FIG. 17 is projected on transmissive screen 11.
- the real image formed on the transmissive screen 11 is formed at a position shifted to the lower left from the center of the transmissive screen 11.
- to the viewer 14 standing to the left in front of the transmissive screen 11, the real image formed on the transmissive screen 11 and the real object 9 seen through the transmissive screen 11 appear to overlap completely.
- as described above, upon detection of the viewer 14 by the distance sensor 71, the projection image generating section 91 generates still image data, and the projector 13 starts outputting an image based on that still image data.
- the operation of the display device after the display of the initial screen is the same as the operation of the display device according to the first embodiment, and a description thereof will be omitted.
- as described above, the display device controls the position and size of the real image formed on the transmissive screen 11 according to the height and position of the viewer 14 standing in front of the transmissive screen 11.
- therefore, the display device according to the second embodiment adjusts the position of the real image on the transmissive screen 11 so that the viewer 14 sees the real image superimposed on the real object 9 irrespective of height and standing position. Further, unlike Embodiment 1, the standing position of the viewer 14 need not be fixed in advance; in the display device according to the second embodiment, the viewing position may be anywhere in front of the transmissive screen 11.
- FIG. 19 is a diagram showing a situation where three viewers 14 are standing in front of the transmissive screen 11.
- the human detection unit 92 identifies the viewer closest to the center of the transmissive screen 11 (in FIG. 19, the middle of the three viewers 14) as the representative viewer 14 operating the transmissive screen 11, and the image of the real object formed on the transmissive screen 11 may be made to overlap the real object 9 as seen by that representative viewer 14.
- the other viewers 14, who do not operate the transmissive screen 11, merely observe the operations of the representative viewer 14, so a sufficient viewing effect can be obtained even with such control. It should be noted that the same effect can be expected if the viewer located at the center of the captured image is identified instead of the viewer located closest to the center of the transmissive screen 11.
- the human detection unit 92 accurately specifies the position of the viewer 14 based on the detection information of the distance sensor 71 and the imaging device 72.
- alternatively, the human detection unit 92 may have the viewer 14 touch the transmissive screen 11 and, from the touched position, specify the approximate position of the head of the viewer 14 based on the positional relationship between the transmissive screen 11 and a viewer assumed to be touching the screen with an outstretched arm.
- FIG. 20 is a diagram showing a state in which the viewer 14 is touching the transmission screen 11.
- the human detection unit 92 may display a message such as “Touch here” on the transmission screen 11 so that the viewer 14 touches the transmission screen 11.
- in this case, the operation detection unit 63, which detects the touched position, serves as viewer viewpoint specifying means for specifying the position of the viewer 14.
- in Embodiment 2, the human detection unit 92 specifies the position of the viewer 14 and adjusts the formation position of the real image when displaying the initial screen.
- in addition to this, for example, the human detection unit 92 may specify the position of the viewer 14 and adjust the formation position of the real image each time the projection image generation unit 91 outputs still image data.
- in this way, by shifting the real image to follow the eyes of the viewer 14, the real image can always be formed at a position overlapping the real object 9.
- however, this requires a control device 31 capable of high-speed image processing, and the display device becomes expensive.
- in Embodiment 2, the projector 13 is turned off when there is no viewer 14 around the display device. Alternatively, when there is no viewer 14 around the display device, the projector 13 may output an image other than the real image. This makes the display easier to notice for viewers 14 who are still far from the display device.
- the display apparatus according to the third embodiment is different from the display apparatus according to the second embodiment in that the spot lamp 4 is dimmed in accordance with the projection of an image onto the transmission screen 11.
- the following description focuses mainly on this difference.
- FIG. 21 is a block diagram showing a hardware configuration of a control system of the display device according to Embodiment 3 of the present invention.
- FIG. 22 is a block diagram showing a control system of the display device of FIG. 21.
- FIG. 23 is a diagram showing data stored in the hard disk device 35 of FIG. 21.
- the spot lamp 4 is connected to the input / output port 32 as shown in FIG.
- the light amount of the spot lamp 4 is controlled by the control system of the exhibition device.
- the hard disk device 35 according to the third embodiment stores a projection image generation program 121 and a lamp control program 122, as shown in FIG.
- the projection image generation program 121 is read into the memory 34 and executed by the CPU 33.
- the projection image generation unit 131 shown in FIG. 22 is generated as the image changing unit, the light amount control unit, and the image control unit.
- the projection image generation unit 131 generates still image data based on the three-dimensional computer graphics data 44, and outputs the generated still image data to the projector 13.
- the lamp control program 122 is read into the memory 34 and executed by the CPU 33. As a result, the lamp control unit 132 shown in FIG. 22 is generated as light amount control means. The lamp control unit 132 controls the light amount of the spot lamp 4 in the range of 0 to 100%.
- the components other than those described above are the same as the components of the display device according to the second embodiment; they are denoted by the same reference numerals and names, and their description is omitted.
- in an initial state in which no person is around the display device, the distance sensor 71 outputs light toward the front of the transmissive screen 11.
- the imaging device 72 images a predetermined range on the near side of the transmission screen 11.
- the lamp control unit 132 lights the spot lamp 4 at 100% output.
- the projector 13 is off.
- when the viewer 14 stands in front of the transmissive screen 11 to view the real object 9 (here, the silver cup) illuminated by the light of the spot lamp 4, the distance sensor 71 outputs the distance to the viewer 14.
- the human detection unit 92 specifies the position of the head of the viewer 14 and outputs information on the relative position of the specified head to the translucent screen 11 and the real object 9 to the projection image generation unit 131.
- based on the information on the relative position of the head and the three-dimensional computer graphics data 44, the projection image generation unit 131 generates still image data such that the real image formed on the transmissive screen 11 coincides with the real object 9 when viewed from the head position, and outputs it to the projector 13. The projector 13 starts outputting an image based on the still image data. The image projected by the projector 13 is formed on the transmissive screen 11.
- projection image generation section 131 instructs lamp control section 132 to turn off the light.
- the lamp control unit 132 turns off the spot lamp 4.
- note that if the transmittance of the transmissive screen 11 is lower than 50%, the real object 9 disappears from view the moment the spot lamp 4 is turned off, and it is difficult to give the impression that the real object 9 placed on the mounting table 2 has moved toward the viewer. If the transmittance of the transmissive screen 11 is higher than 80%, the real object 9 remains faintly visible even after the spot lamp 4 is turned off, and again it is difficult to give the impression that the real object 9 placed on the mounting table 2 has moved toward the viewer.
- the operation detection unit 63 instructs the projection image generation unit 131 to display the previously displayed initial image.
- after displaying the previously displayed initial image 51 on the transmissive screen 11 and waiting for a predetermined time to elapse, the projection image generation unit 131 outputs a turn-off instruction to the projector 13 and a turn-on instruction to the lamp control unit 132.
- the lamp control unit 132 turns on the spot lamp 4.
- the projector 13 is turned off.
- the operation of the display device from the display of the initial screen until the display of the initial screen ends is the same as that of the display device according to the second embodiment, and its description is omitted.
- as described above, when the real image is displayed on the transmissive screen 11, the spot lamp 4 is turned off. Thereby, the viewer 14 can be given an illusion as if the real object 9 placed on the mounting table 2 is approaching, and the attention of the viewer 14 moves back and forth between the real object 9 and the real image. The same effect can be expected by merely reducing the light amount of the spot lamp 4 instead of turning off the spot lamp 4.
- when the display of the real image on the transmissive screen 11 ends, the spot lamp 4 is turned on. Thereby, the viewer 14 can be given an illusion as if the real image formed on the transmissive screen 11 has returned to the real object 9 beyond the screen, and the attention of the viewer 14 moves back and forth between the real object 9 and the real image. The same effect can be expected by merely increasing the light amount of the spot lamp 4 instead of turning on the spot lamp 4.
- in Embodiment 3, only the spot lamp 4 is lit while no one is near the display device, and only the projector 13 is lit when a viewer approaches.
- the lighting of the spot lamp 4 and the lighting of the projector 13 may be controlled stepwise, or may be controlled so as to change gradually.
- alternatively, both the spot lamp 4 and the projector 13 may be kept off while no one is near the display device, and only the spot lamp 4 may be turned on when a viewer approaches.
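The stepwise or gradually changing control mentioned above can be sketched as a crossfade between the two light sources. The `DimmableDevice` stub and the function names are assumptions for illustration only.

```python
class DimmableDevice:
    """Stand-in for a light source whose output level can be set (0.0-1.0)."""
    def __init__(self, level=0.0):
        self.level = level
    def set_level(self, level):
        self.level = max(0.0, min(1.0, level))

def crossfade(spot_lamp, projector, steps=10):
    # Stepwise hand-over from the lit real object to the projected image:
    # dim the spot lamp while brightening the projector. More steps (with a
    # short delay between them) approximates a slow, continuous change.
    for i in range(steps + 1):
        t = i / steps
        spot_lamp.set_level(1.0 - t)  # dim spot lamp 4
        projector.set_level(t)        # brighten projector 13
```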
- the display device has one spot lamp 4 that outputs incandescent light.
- the display device may have a spot lamp 4 that outputs incandescent light and a spot lamp 141 that outputs monochromatic light.
- the display device may be controlled so that the spot lamps 4 and 141 are lit alternately, or so that they are lit simultaneously. When the spot lamps 4 and 141 are lit at the same time, the apparent color of the real object 9 can be changed by combining their light.
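The color change obtained by combining the light of several lamps can be illustrated with a simplified additive RGB model. This is only a rough sketch under assumed names: it ignores the reflectance of the real object 9, which also shapes the color actually seen.

```python
def mix_lamp_colors(colors):
    # Simplified additive mixing of lamp colors given as (R, G, B) tuples in
    # the range 0-255: sum each channel and clip at 255.
    return tuple(min(255, sum(color[ch] for color in colors)) for ch in range(3))
```

For example, adding a blue monochromatic lamp to warm incandescent light shifts the combined illumination toward blue-white.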
- FIG. 24 is a cross-sectional view showing a modification of the display device having the spot lamp 4 that outputs incandescent light and the spot lamp 141 that outputs monochromatic light.
- the display device has one spot lamp 4 and one real object 9.
- the display device may have a plurality of sets of spot lamps 4, 151 and real objects 9, 152.
- the display device may switch which of the spot lamps 4 and 151 is lit, thereby moving the point of interest of the viewer 14 between the real objects 9 and 152.
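Switching which spot lamp is lit, so that the viewer's point of interest moves between the real objects, can be sketched as follows; the class and function names are illustrative assumptions.

```python
class Lamp:
    """Stand-in spot-lamp driver that records whether it is lit."""
    def __init__(self):
        self.lit = False
    def on(self):
        self.lit = True
    def off(self):
        self.lit = False

def switch_focus(spot_lamps, index):
    # Light only the selected spot lamp so that the matching real object
    # draws the viewer's attention; all other lamps in the set are turned off.
    for i, lamp in enumerate(spot_lamps):
        if i == index:
            lamp.on()
        else:
            lamp.off()
```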
- FIG. 25 is a cross-sectional view showing a modification of the display device having a plurality of sets of spot lamps 4 and 151 and real objects 9 and 152.
- the entrance restriction frame 1 of the display device is assembled in a cubic frame structure.
- the entrance restriction frame 1 may be formed in a hexagonal column shape or another columnar shape.
- the display device may be installed in a pillar of a building or the like.
- the transmissive screen 11 is provided only between the pair of column members 5 of the entrance restriction frame 1.
- a plurality of transmissive screens may be provided, one between each pair of column members 5 of the entrance restriction frame 1. This allows a plurality of viewers 14 to operate their respective transmissive screens at the same time and to view at their own pace.
- the display device according to the present invention can be used to display works of art, crafts, merchandise, and other items that actually exist.
Landscapes
- Business, Economics & Management (AREA)
- Accounting & Taxation (AREA)
- Marketing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Projection Apparatus (AREA)
- Transforming Electric Information Into Light Information (AREA)
Abstract
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004-177086 | 2004-06-15 | ||
JP2004177086A JP2006003414A (ja) | 2004-06-15 | 2004-06-15 | Display device
Publications (1)
Publication Number | Publication Date |
---|---|
WO2005124450A1 true WO2005124450A1 (ja) | 2005-12-29 |
Family
ID=35509856
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2005/010916 WO2005124450A1 (ja) | 2004-06-15 | 2005-06-15 | Display device
Country Status (2)
Country | Link |
---|---|
JP (1) | JP2006003414A (ja) |
WO (1) | WO2005124450A1 (ja) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2011248041A (ja) | 2010-05-26 | 2011-12-08 | Seiko Epson Corp | Mounting device and projection display device |
JP2015232633A (ja) * | 2014-06-10 | 2015-12-24 | Seiko Epson Corp | Display device |
JP2015232634A (ja) * | 2014-06-10 | 2015-12-24 | Seiko Epson Corp | Display device |
JP2016001211A (ja) * | 2014-06-11 | 2016-01-07 | Seiko Epson Corp | Display device |
JP6449120B2 (ja) * | 2015-08-31 | 2019-01-09 | Nippon Telegraph and Telephone Corp | Spatial image display device and spatial image display method |
JP6726889B2 (ja) * | 2016-06-20 | 2020-07-22 | Panasonic IP Management Co Ltd | Video display system |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH03221008A (ja) * | 1990-01-25 | 1991-09-30 | Fujitsu Ltd | Display device for exhibition |
JPH0535192A (ja) * | 1991-07-25 | 1993-02-12 | Sony Corp | Display device |
JPH0933856A (ja) * | 1995-07-24 | 1997-02-07 | Denso Corp | Display device |
JPH11164291A (ja) * | 1997-09-26 | 1999-06-18 | Denso Corp | Video information display system |
JP2003058092A (ja) * | 2001-08-10 | 2003-02-28 | Masaaki Matsumura | Electronic display for window advertising and method of using the same |
JP2003173237A (ja) * | 2001-09-28 | 2003-06-20 | Ricoh Co Ltd | Information input/output system, program, and storage medium |
JP2004054065A (ja) * | 2002-07-23 | 2004-02-19 | Saeilo Japan Inc | Show-window interactive display device |
2004
- 2004-06-15 JP JP2004177086A patent/JP2006003414A/ja active Pending
2005
- 2005-06-15 WO PCT/JP2005/010916 patent/WO2005124450A1/ja active Application Filing
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009540349A (ja) * | 2006-06-07 | 2009-11-19 | Koninklijke Philips Electronics N.V. | Light feedback on physical object selection |
US9336700B2 (en) | 2006-06-07 | 2016-05-10 | Koninklijke Philips N.V. | Light feedback on physical object selection |
JP2014042655A (ja) * | 2012-08-27 | 2014-03-13 | Adc Technology Inc | Display device |
JP2017080516A (ja) * | 2017-01-17 | 2017-05-18 | ADC Technology Inc. | Display device |
Also Published As
Publication number | Publication date |
---|---|
JP2006003414A (ja) | 2006-01-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
ES2354985T3 (es) | System and method for operation in 3D virtual space | |
WO2005124450A1 (ja) | Display device | |
KR101795644B1 (ko) | Projection capture system, projection capture programming, and projection capture method | |
JP6059223B2 (ja) | Portable projection capture device | |
JP3968477B2 (ja) | Information input device and information input method | |
JP3092162B2 (ja) | Multiple-image composition device | |
EP1530119A2 (en) | Stereoscopic two-dimensional image display device and method | |
JP2000010194A (ja) | Image display method and device | |
JP4843901B2 (ja) | Display device | |
JP2007318754A (ja) | Virtual environment experience display device | |
JP2004227332A (ja) | Information display method | |
KR101756948B1 (ko) | Electronic blackboard device | |
Fisher et al. | Augmenting reality with projected interactive displays | |
JP2000090285A (ja) | Video display device | |
JP4687820B2 (ja) | Information input device and information input method | |
JP2007200353A (ja) | Information processing device and information processing method | |
JP6233941B1 (ja) | Non-contact three-dimensional touch panel, non-contact three-dimensional touch panel system, control method for non-contact three-dimensional touch panel, program, and recording medium | |
Lee | Projector-based location discovery and tracking | |
WO2023162690A1 (ja) | Aerial floating image display device | |
JP2000039949A (ja) | Video display device | |
WO2017054114A1 (zh) | Display *** and display method therefor | |
WO2004114108A1 (ja) | Stereoscopic image display method and device | |
JP2004258287A (ja) | Video display system | |
Spassova | Interactive ubiquitous displays based on steerable projection | |
Lancelle | Visual Computing | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KM KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
NENP | Non-entry into the national phase |
Ref country code: DE |
WWW | Wipo information: withdrawn in national office |
Country of ref document: DE |
122 | Ep: pct application non-entry in european phase |