WO2023203853A1 - Système d'expérience à distance - Google Patents

Système d'expérience à distance (Remote experience system)

Info

Publication number
WO2023203853A1
Authority
WO
WIPO (PCT)
Prior art keywords
local
image
guide
unit
client
Prior art date
Application number
PCT/JP2023/005479
Other languages
English (en)
Japanese (ja)
Inventor
登仁 福田
Original Assignee
株式会社サンタ・プレゼンツ
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社サンタ・プレゼンツ
Publication of WO2023203853A1


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • This invention relates to a remote experience system for viewing local conditions and shopping remotely.
  • A system has been proposed that allows people to experience travel and shopping without actually going to the location.
  • Patent Document 1 discloses a system that transmits images captured from a bus or the like running in a local area to a user's terminal device in real time via a server device. This allows the user to enjoy the local scenery while staying at home.
  • In Patent Document 2, a local person located far from the user captures video and transmits it to the user's terminal device for viewing.
  • The user gives instructions to the local person regarding the imaging direction and the like, allowing the user to enjoy images in the desired direction.
  • However, neither Patent Document 1 nor Patent Document 2 provides a mechanism for a client to appropriately select and engage a local guide, so it is not easy to make a request.
  • Further, in Patent Document 2, when a remote user wants to change the imaging direction, he or she must instruct the local person to do so, making it difficult for the user to see the direction he or she wants to see.
  • The purpose of this invention is to solve any of the above-mentioned problems and to provide a more advanced remote experience system that allows people to experience sightseeing and shopping in remote locations.
  • The remote experience system according to this invention comprises a plurality of local devices, a server device, and a client device.
  • The local device includes an imaging unit attached to the body of a local guide or local guide robot, or to a moving body that moves with the guide, and a captured image transmitting means that transmits the captured image to the server device by a transmitting unit.
  • It further includes a guidable transmitting means that transmits its location and the fact that guidance is possible to the server device as availability information,
  • and a request information receiving means that receives request information from the server device by a receiving unit and enters a guide mode for the client device.
  • The server device includes an availability information receiving means for receiving the availability information from the local devices by a receiving section, and a possible device list transmitting means for transmitting, by a transmitting section, a possible device list in which the local devices that have transmitted availability information are arranged on a map.
  • The client device includes a possible device list receiving means for receiving the possible device list by a receiving unit, and a possible device list display means for displaying the possible device list on a display unit.
  • It further includes a guidance request transmitting means for transmitting, by the transmitting unit, the identification code of the local device selected by the client's operation, the guidance request, and the identification code of the client's own device to the server device as request information; a captured image receiving means that receives the captured image by the receiving unit; and a captured image display means that displays the received captured image on the display section.
  • Thus, the client can select a guide or guide robot near the place he or she wants to experience and receive guidance.
  • In the system according to this invention, the imaging unit of the local device is a wide-angle imaging unit that outputs a wide-angle captured image,
  • and the captured image transfer means of the server device receives direction instructions from the client device, selects the area corresponding to the direction instruction from the received wide-angle captured image, and transmits it to the client device as a selected area image by the transmitting unit. The client device, while displaying the selected area image on the display unit,
  • comprises a direction instruction transmitting means for transmitting a direction instruction input by the client to the server device.
  • In the system according to this invention, the local device detects changes in the orientation of the wide-angle imaging unit so that, regardless of the movement of the local guide, the local guide robot, or the moving body,
  • the wide-angle captured image is output with a predetermined direction as a reference direction, and the direction instruction is given with respect to that reference direction.
  • In the system according to this invention, a plurality of client devices are provided that receive and display the selected area image from the local device via the server device, and the direction instruction given by each client device may differ from one another.
  • Therefore, each client can enjoy images captured in his or her desired direction.
  • In the system according to this invention, the local device includes a projection unit attached to the body of the local guide or local guide robot, or to a moving body that moves with the guide, which projects an instruction image onto the local space based on given instruction image data. In response to outputs from a drive unit that changes the projection direction of the projection unit and a sensor that detects the orientation of the projection unit, the projection direction is held constant regardless of the movement of the local guide, the local guide robot, or the moving body.
  • The client device includes a fixing command means for giving a fixing command to the local device by the transmitting unit; when there is a fixing command, the vicinity of a characteristic partial image of the captured image is set as a reference captured image, and with respect to this reference captured image,
  • it further comprises an instruction image transmitting means for transmitting instruction image data specifying the position of the instruction image to the local device via the server device.
  • Therefore, the client can project an image instruction at a precise location in the local space for the local guide or local guide robot.
  • The system according to the present invention is further characterized by comprising a correction means for correcting the projection of the instruction image by the projection unit, without depending on the drive unit, so that the instruction image is displayed correctly with reference to a predetermined part of the local space.
  • In the system according to this invention, the local device includes a wide-angle projection unit attached to the body of the local guide or local guide robot, or to a moving body that moves with the guide, which projects an instruction image into the local space based on given instruction image data.
  • The client device includes a fixing command means for giving a fixing command to the local device by the transmitting unit; when there is a fixing command, the vicinity of a characteristic partial image of the captured image is set as a reference captured image, and with respect to this reference captured image,
  • it further comprises an instruction image transmitting means for transmitting instruction image data specifying the position of the instruction image to the local device via the server device.
  • Therefore, the client can project an image instruction at a precise location in the local space for the local guide or local guide robot.
  • The system according to the present invention is further characterized by comprising a correction means for correcting the projection of the instruction image by the projection unit so that the instruction image is displayed correctly with reference to a predetermined part of the local space.
  • In the system according to this invention, the local device further comprises a direction control means that, in response to outputs from a drive unit that changes the imaging direction of the imaging unit and a sensor that detects the orientation of the imaging unit, controls the drive unit so that the imaging unit faces a predetermined direction based on a direction instruction centered on the local guide or local guide robot, regardless of the movement of the local guide, the local guide robot, or the moving body;
  • and the client device includes a direction instruction transmitting means for transmitting to the server device a direction instruction input by the client while looking at the captured image displayed on the display unit.
  • Therefore, the image in a desired direction can be viewed through the client's operation.
  • In the system according to this invention, the local device further includes a projection unit attached to the body of the local guide or local guide robot, or to a moving body that moves with the guide, which projects an instruction image onto the local space based on given instruction image data.
  • The drive unit also changes the projection direction of the projection unit; in response to the output of a sensor that detects the orientation of the projection unit, the projection direction is held constant regardless of the movement of the local guide, the local guide robot, or the moving body.
  • The client device includes a fixing command means for giving a fixing command to the local device by the transmitting unit; when there is a fixing command, the vicinity of a characteristic partial image of the captured image is set as a captured image of interest, and with respect to this captured image of interest,
  • it further comprises an instruction image transmitting means for transmitting instruction image data specifying the position of the instruction image to the local device via the server device.
  • Therefore, the client can project an image instruction at a precise location in the local space for the local guide or local guide robot.
  • The system according to the present invention is further characterized by comprising a correction means for correcting the projection of the instruction image by the projection unit, without depending on the drive unit, so that the instruction image is displayed correctly with reference to a predetermined part of the local space.
  • the "guidable transmission means" corresponds to step S103.
  • the "request information receiving means" corresponds to step S104.
  • the "captured image transmitting means" corresponds to steps S106, S21, S30, and S35.
  • the "availability information receiving means" corresponds to step S121.
  • the "capable device list transmitting means" corresponds to step S123.
  • the "request information transfer means" corresponds to step S124.
  • the "captured image transfer means" corresponds to steps S125 and S91.
  • the "capable device list display means" corresponds to step S143.
  • the "guidance request transmitting means" corresponds to step S144.
  • the "captured image receiving means" corresponds to steps S145 and S41.
  • the "captured image display means" corresponds to steps S146 and S41.
  • the term "device” is a concept that includes not only one computer but also multiple computers connected via a network. Therefore, when the means (or a part of the means) of the present invention is distributed over multiple computers, these multiple computers correspond to the apparatus.
  • Program refers to not only programs that can be directly executed by the CPU, but also programs in source format, compressed programs, encrypted programs, programs that cooperate with the operating system to perform their functions, etc. It is a concept that includes
  • FIG. 1 is a functional configuration diagram of a remote experience system according to an embodiment of the present invention.
  • This is the system configuration of the remote experience system.
  • This is the hardware configuration of Smartphone GT.
  • This is the hardware configuration of the server device SV.
  • This is the hardware configuration of the client device IT.
  • It is a flowchart of local guide determination processing.
  • It is a flowchart of local guide determination processing.
  • It is a flowchart of guidance processing.
  • It is a functional configuration diagram of a remote experience system according to a second embodiment.
  • It is a diagram showing a local guide wearing the local device GT.
  • It is a diagram showing a projector 776 with direction control.
  • It is a diagram showing a cross section of a unit 80.
  • It is a diagram showing the attachment position of the gel bushing 120.
  • This is the hardware configuration of the motor control circuit 400.
  • It is a diagram showing the relationship between the movement of the local guide 56 and the instruction image display.
  • It is a diagram showing the relationship between the movement of the local guide 56 and the instruction image display.
  • It is a diagram showing the relationship between the movement of the local guide 56 and the instruction image display.
  • It is a diagram showing a case where image feature points 512 are used instead of the marker 60.
  • It is a functional configuration of a remote experience system according to a third embodiment.
  • It is a diagram showing the local guide 54 wearing the local device GT.
  • It is a flowchart of guidance processing.
  • It is a flowchart of guidance processing.
  • It is a functional configuration of a remote experience system according to a fourth embodiment.
  • It is a diagram showing the local guide 54 wearing the local device GT.
  • It is a diagram showing a camera/laser projector integrated body 90.
  • It is a flowchart of guidance processing.
  • It is a flowchart of guidance processing.
  • FIG. 1 shows the functional configuration of a remote experience system according to an embodiment of the present invention.
  • Local devices GT1, GT2, . . . GTn are provided so as to be able to communicate with the server device SV.
  • Further, a client device IT is provided so as to be able to communicate with the server device SV.
  • The local devices GT1, GT2, . . . GTn are mobile terminal devices, such as smartphones, owned by the local guides at sites such as tourist spots and shopping malls. Although a plurality of local devices GT1, GT2, . . . GTn are provided, the description below will focus on the local device GT1.
  • When the local guide becomes available for guidance, he or she inputs that fact into the local device GT1.
  • In response, the guidable transmitting means 614 of the local device GT1 transmits the fact that guidance is possible, the identification code of the local device GT1, and its current position to the server device SV as availability information by the transmitting unit 620.
  • the current location of the local device GT1 is obtained, for example, from a built-in GPS receiver.
  • Local devices other than the local device GT1 likewise transmit availability information to the server device SV when their local guides become available for guidance.
  • The availability information receiving means 646 of the server device SV receives this availability information through the receiving unit 642. The server device SV can therefore grasp which local devices are currently available for guidance.
  • The server device SV generates a possible device list showing the local devices that are ready to guide on a map, based on the availability information received from each local device GT1, GT2, . . . GTn.
  • the possible device list transmitting means 652 uses the transmitting unit 644 to transmit the generated possible device list to the client device IT.
  • the possible device list receiving means 664 of the client device IT receives the possible device list.
  • The possible device list display means 666 displays the possible device list on the display section 668. This allows the client to confirm on the map which local guides are available to guide and where they are.
  • The client refers to the list displayed on the display section 668 and selects one of the local devices GT1, GT2, . . . GTn (i.e., a local guide).
  • In response, the guidance request transmitting means 676 transmits the identification code of the selected local device GT1 and the identification code of the client's own device to the server device SV as request information by the transmitting unit 674.
  • the request information transfer means 648 of the server device SV transfers this request information to the selected local device GT1.
  • the request information receiving means 618 of the local device GT1 receives the request information through the receiving unit 622. This causes the local device GT1 to enter the guidance mode.
  • the local device GT1 images the site using the imaging unit 612.
  • the captured image transmitting means 616 transmits the captured image of the site to the server device SV using the transmitting unit 620.
  • the captured image transfer means 650 of the server device SV transfers the captured image to the client device IT.
  • the captured image receiving means 670 of the client device IT receives the captured image through the receiving unit 662.
  • the captured image display means 672 displays the received captured image of the site on the display section 668.
  • In this way, the client can select a local guide and request guidance based on his or her own selection.
  • FIG. 2 shows the system configuration of the remote experience system.
  • smartphones GT1, GT2, . . . GTn are used as local devices.
  • The smartphones GT1, GT2, . . . GTn owned by the local guides can communicate with the server device SV via the Internet.
  • the client device IT used by the client can also communicate with the server device SV via the Internet.
  • FIG. 3 shows the hardware configuration of smartphone GT.
  • Connected to the CPU 202 are a memory 204, a touch display 206, a short-range communication circuit 208, a built-in camera 217 serving as the imaging unit, a nonvolatile memory 212, a speaker 214, a microphone 216, a communication circuit 218, and a GPS receiver 219.
  • the short range communication circuit 208 is a circuit for short range communication such as Bluetooth.
  • the communication circuit 218 is a circuit for communicating with a base station in order to connect to the Internet.
  • the GPS receiver 219 is for receiving radio waves from satellites and acquiring its own position.
  • the built-in camera 217 is for capturing still images and videos of the site.
  • the microphone 216 is used to collect the guide's voice and local sounds.
  • An operating system 222 and a field program 224 are recorded in the nonvolatile memory 212.
  • the field program 224 cooperates with the operating system 222 to perform its functions.
  • FIG. 4 shows the hardware configuration of the server device SV.
  • a memory 554, a display 556, an SSD 558, a DVD-ROM drive 560, and a communication circuit 562 are connected to the CPU 552.
  • Communication circuit 562 is for connecting to the Internet.
  • An operating system 564 and a server program 566 are recorded on the SSD 558.
  • the server program 566 cooperates with the operating system 564 to perform its functions.
  • These programs were recorded on the DVD-ROM 568 and installed into the SSD 558 via the DVD-ROM drive 560.
  • FIG. 5 shows the hardware configuration of the client device IT.
  • a memory 304, a display 306, a microphone 308, a communication circuit 310, an SSD 312, a DVD-ROM drive 314, a mouse/keyboard 316, and a speaker 318 are connected to the CPU 302.
  • Communication circuit 310 is for connecting to the Internet.
  • An operating system 320 and a client program 322 are recorded on the SSD 312 .
  • the client program 322 cooperates with the operating system 320 to perform its functions.
  • These programs were recorded on the DVD-ROM 324 and installed into the SSD 312 via the DVD-ROM drive 314.
  • FIGS. 6 to 8 show flowcharts of the remote experience processing. FIGS. 6 and 7 show the local guide determination process, and FIG. 8 shows the guidance process.
  • A local guide who owns a local device GT registers as a local guide in advance by transmitting his or her name, photograph, address, and payment method (guide fee transfer destination, etc.) to the server device SV.
  • The server device SV records this information together with the identification code of the local device GT.
  • The local guide inputs into the local device (smartphone) GT that he or she owns the fact that he or she is available to provide guidance.
  • In response, the CPU 202 of the local device GT acquires the position of the device using the GPS receiver 219 (step S102). Further, the local device GT transmits availability information, including the fact that guidance is possible, the identification code of the device itself, and the position (latitude and longitude) of the device, to the server device SV (step S103).
  • The server device SV receives the availability information and records it (step S121). Since there are many local devices, a large amount of availability information accumulates in the server device SV. Note that if a local guide becomes unable to provide guidance because of other commitments, a message to that effect is sent to the server device SV. In response, the server device SV changes the status of that local device GT from guidable to not guidable.
  • A client who wishes to have a remote experience operates the client device IT, specifies the place or area in which he or she desires the experience, and requests a list of local devices GT that are available for guidance.
  • In response, the CPU 302 of the client device IT (hereinafter sometimes abbreviated as the client device IT) requests the server device SV for a list of guidable local devices GT (step S141). For example, if a client wishes to be guided around the Metropolitan Museum of Art, he or she may input "Metropolitan Museum of Art" or specify it on a map, and request a list of local devices GT that are available for guidance.
  • The server device SV extracts the local devices GT that are available for guidance near the Metropolitan Museum of Art based on the stored availability information, and places them on a map to generate the possible device list (step S122). Note that the possible device list may instead be generated in advance for each location.
  • the server device SV transmits the generated list of possible devices to the client device IT (step S123).
  • the client device IT receives this (step S142) and displays it on the display 306 (step S143).
  • FIG. 9 shows a list of possible devices displayed on the client's device IT.
  • The local devices GT located near the designated Metropolitan Museum of Art are displayed together with the names of their local guides. Each local device GT is associated with its identification code.
  • When the client places the mouse cursor on a local device GT, the local guide's profile, ratings from past clients, facial photo, and the like can be sent from the server device SV and displayed.
  • The client refers to these and uses the mouse/keyboard 316 to select one of the displayed local devices GT (local guides).
  • The client device IT transmits the identification code of the selected local device GT and the identification code of the client device IT to the server device SV as request information (step S144). For example, assume that Emma's local device GT is selected.
  • The server device SV transfers this request information to the selected local device GT, Emma's device (step S124).
  • The CPU 202 of Emma's local device GT receives the request information (step S104).
  • Emma's local device GT then enters the guidance mode (step S105).
  • In this way, a desired local guide can be selected by comparing local guides who are near the place where the client desires guidance.
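  • As an illustration of the exchange in steps S103, S121-S124, and S141-S144, the following is a minimal sketch in Python. The JSON message format, the in-memory store, and the helper names are assumptions for illustration; the patent does not specify a wire format.

        import json
        from math import radians, sin, cos, asin, sqrt

        AVAILABLE = {}  # identification code -> (lat, lon), held by the server device SV

        def haversine_km(lat1, lon1, lat2, lon2):
            # Great-circle distance between two GPS fixes, in kilometres.
            dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
            a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
            return 2 * 6371.0 * asin(sqrt(a))

        def on_availability_info(message):  # server side, step S121
            info = json.loads(message)
            if info["guidable"]:
                AVAILABLE[info["device_id"]] = (info["lat"], info["lon"])
            else:  # the guide reports that guidance is no longer possible
                AVAILABLE.pop(info["device_id"], None)

        def possible_device_list(lat, lon, radius_km=1.0):  # server side, step S122
            # Arrange every guidable device near the requested spot on the map.
            return [{"device_id": d, "lat": p[0], "lon": p[1]}
                    for d, p in AVAILABLE.items()
                    if haversine_km(lat, lon, p[0], p[1]) <= radius_km]

        def request_info(selected_device_id, client_id):  # client side, step S144
            return json.dumps({"type": "guidance_request",
                               "device_id": selected_device_id,
                               "client_id": client_id})
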
  • FIG. 8 shows a processing flowchart in the guide mode.
  • Emma, the local guide who owns the local device GT, and the client who operates the client device IT can make voice calls to each other using the Internet call function of the local device GT and the client device IT, via the server device SV (or directly) (steps S106, S125, S145, S146, S147, S126, S107, S108 in FIG. 8).
  • For example, the client tells Emma, the local guide, that he or she wants to go to the Metropolitan Museum of Art.
  • In response, the local guide Emma takes video with the camera of the local device GT and heads to the Metropolitan Museum of Art.
  • The client can view the captured video and hear the surrounding sounds on the client device IT.
  • Upon arriving at the ticket counter of the Metropolitan Museum of Art, the local guide Emma operates the local device GT to display a payment window 700 superimposed on the video being captured, as shown in FIG. 10.
  • the payment window 700 displays a two-dimensional barcode for accessing a site for paying admission fees to the Metropolitan Museum of Art. It is preferable that such two-dimensional barcodes be obtained in advance from target facilities or shops and stored in the server device SV.
  • Thus, the local guide can record it on his or her smartphone GT in advance or obtain it from the server device SV.
  • The client reads the two-dimensional barcode displayed on the client device IT using a smartphone or the like and makes the payment.
  • the local guide Emma may pay the entrance fee in advance and settle the payment together with the guide fee later.
  • Thereafter, the client can watch video of the inside of the Metropolitan Museum of Art, guided by the local guide Emma, on the client device IT.
  • In this way, the client can select a local guide and experience the site from a remote location.
  • Likewise, a two-dimensional payment barcode 700 prepared in the same manner as in FIG. 10 can be used.
  • In the above description, when selecting a local guide, the client can check the local guide's profile. Alternatively, or in addition, it may be possible to select a local guide after checking the camera image of the local device GT. This allows the client to check the quality of the camera image of the local device.
  • Furthermore, the local guide may upload videos of past guidance to the server device SV so that clients can view them.
  • Also, evaluations and comments given to the local guide by past clients may be displayed.
  • the local guide fixes, holds, or wears the local device.
  • Alternatively, the local device may be fixed to, held by, or attached to a robot for local guidance. The same applies to the following embodiments.
  • the local guide's smartphone is used as the local device GT in the guidance mode.
  • In the second embodiment, a case will be explained in which a local device GT is used that is designed to transmit more stable video and to make it easier to convey instructions from the client.
  • FIG. 11 shows the functional configuration of the remote experience system according to the second embodiment.
  • This embodiment does not itself provide a mechanism for matching clients and local guides; a mechanism similar to that of the first embodiment can be used, or other matching methods may be used.
  • The local guide wears a wide-angle imaging unit 12. Further, a projection section 14 is provided via a drive section 16.
  • The projection section 14 is configured so that its projection direction can be changed by the drive section 16.
  • The projection direction of the projection section 14 is detected by the sensor 28.
  • The direction control means 20 controls the drive section 16 based on the output of the sensor 28 and maintains the projection direction of the projection section 14 in a predetermined direction centered on the local guide, regardless of the guide's movement.
  • The imaging unit 12 of the local device GT is a wide-angle camera such as a spherical camera; it captures images in all celestial-sphere directions around the local guide to generate a wide-angle captured image.
  • This wide-angle captured image is transmitted to the server device SV by the transmitting unit 22 under the control of the captured image transmitting means 18.
  • The server device SV receives this wide-angle captured image, selects the portion in the instructed direction from it based on the direction instruction received from the client device IT, and generates a selected area image.
  • The captured image transfer means of the server device SV transmits the generated selected area image to the client device IT using the transmitting unit 758.
  • The captured image receiving means 36 of the client device IT receives the selected area image in the direction specified by the client and displays it on the captured image display section 40.
  • When the client looks at this selected area image and wants to change the direction, he or she inputs a direction instruction into the client device IT.
  • The direction instruction transmitting means 39 of the client device IT transmits the direction instruction to the server device SV through the transmitting unit 34.
  • The server device SV receives this direction instruction and uses it to select the selected area image from the wide-angle captured image. Further, the direction command transfer means 752 transmits the direction instruction to the local device GT using the transmitting unit 758.
  • Thus, the client can select and view an image in any celestial-sphere direction around the local guide, enjoying the experience of viewing images in any direction without a fixed field of view.
  • When a plurality of client devices are provided, each client can independently enjoy the selected area image in his or her preferred direction.
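  • A minimal sketch of this per-client selection on the server device SV: each client device keeps its own direction instruction, so the same wide-angle frame yields a different selected area image per client. send() is a hypothetical transport helper, and extract_view() is sketched after FIG. 19 below.

        directions = {}  # client_id -> (yaw_deg, pitch_deg), updated by direction instructions

        def on_direction_instruction(client_id, yaw_deg, pitch_deg):
            directions[client_id] = (yaw_deg, pitch_deg)

        def forward_frame(wide_angle_frame, client_ids, send):
            # Cut a separate selected area image out of the one wide-angle frame
            # for every connected client and transmit it.
            for client_id in client_ids:
                yaw, pitch = directions.get(client_id, (0.0, 0.0))
                send(client_id, extract_view(wide_angle_frame, yaw, pitch))
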
  • Furthermore, this embodiment has a function for the client to give clear instructions to the local guide. For example, when a local guide enters a store and makes purchases on behalf of a client, it is preferable that the client be able to clearly indicate the items to purchase. This is not always possible through voice calls alone; when several similar products are lined up, it is difficult to convey which product to purchase.
  • When giving such an instruction, the client inputs a fixing command into the client device IT.
  • In response, the captured image display section 40 takes the selected area image at that time as the reference selected area image and displays it as a still image.
  • The client inputs an instruction image (for example, a circle image) from the instruction image input section 44 while viewing the displayed reference selected area image.
  • The instruction image transmitting means 38 transmits the input instruction image to the server device SV by the transmitting unit 34.
  • The fixing command transfer means 754 of the server device SV transfers the instruction image to the local device GT.
  • The local device GT receives the instruction image (for example, a circle image) through the receiving section 24 and projects it from the projection section 14 onto the site.
  • Thereby, the instruction image 62 is projected onto the product 52c desired by the client among the products 52a, 52b, 52c, and 52d.
  • Since the projection direction of the projection section 14 is held fixed, even if the local guide changes the direction of his or her face, the instruction image is projected onto the location intended by the client.
  • Furthermore, the correction means 26 of the local device GT compares the characteristic partial image (a marker or the like) in the reference selected area image with the characteristic partial image in the current selected area image, and corrects the projection position of the instruction image so that it is projected correctly at the intended position. As a result, even if the local guide moves, the instruction image is displayed in the correct position.
  • In addition, the transmitting unit 34 transmits the fixing command to the local device GT via the server device SV.
  • The receiving section 24 of the local device GT receives this and records the selected area image at that time as the reference selected area image. Further, the local guide places the marker near the products 52a, 52b, 52c, and 52d so that it can be imaged.
  • In this way, the local guide can confirm which product to purchase from the actually projected instruction image 62.
  • This instruction image 62 is kept at the correct position by the direction control means 20 even if the local guide changes the direction of his or her head. Therefore, the instruction image 62 does not disappear depending on the direction of the local guide's head, which reduces stress. Furthermore, if several local guides accompany him or her, the instruction image 62 continues to be projected even when the guide wearing the local device GT moves his or her head significantly, so the other guides are not confused. Moreover, even if the local guide moves, the instruction image 62 is displayed correctly.
  • FIG. 12 shows the local guide 54 equipped with the local device GT.
  • The local guide 54 goes to sightseeing spots, facilities, shops, and the like, and lets remote clients experience them by transmitting video.
  • A smartphone 772, a laser projector 776 with direction control, a spherical camera 774, and a headset 770 constitute the local device GT.
  • the speaker and microphone of headset 770 are connected to smartphone 772 by short-range communication (such as Bluetooth).
  • The omnidirectional camera 774 (with a built-in short-range communication circuit) and the laser projector 776 with direction control (with a built-in short-range communication circuit) are likewise connected to the smartphone 772 by short-range communication.
  • The omnidirectional camera 774 is provided at the top of the headset 770. Images in all directions are captured by a half-celestial camera that images the area in front of the local guide 54 and a half-celestial camera that images the area behind.
  • The laser projector 776 with direction control is provided above the omnidirectional camera 774.
  • FIG. 13 shows the external appearance of the laser projector 776 with direction control.
  • The laser projector 776 with direction control has a base 93, and the base 93 is fixed to the top of the omnidirectional camera 774.
  • A unit 80 housing a laser projector 84 is fixed to the base 93 via a triaxial structure 90 (another multiaxial structure may be used) serving as the drive section.
  • A motor 92 is fixed to the base 93 of the triaxial structure 90, and one end of an intermediate member 92A, which is rotated in the XY plane by the motor 92, is connected to it.
  • The intermediate member 92A is formed in an L shape, and a motor 94 is fixed to its other end.
  • One end of an intermediate member 94A, which is rotated in the ZX plane by the motor 94, is connected to the motor 94.
  • The intermediate member 94A is formed in an L shape, and a motor 96 is fixed to its other end.
  • A mount member 97, which is rotated in the ZY plane by the motor 96, is connected to the motor 96. Note that the XYZ axes shown in FIG. 13 vary as each member 92A, 94A, 97 rotates.
  • the three-axis structure 90 can adjust the orientation of the mount member 97 with three-axis degrees of freedom by driving the motors 92, 94, and 96.
  • the base 93 is provided with a triaxial gyro sensor JS and a triaxial acceleration sensor AS as the sensors 28. Further, the base 93 is provided with a motor control circuit (not shown) that controls the motors 92, 94, and 96 described above. Each of the motors 92, 94, and 96 is controlled by a motor control circuit based on the outputs of the three-axis gyro sensor JS and the three-axis acceleration sensor AS.
  • a unit 80 housing a laser projector 84 is fixed to the mount member 97 of the triaxial structure 90.
  • a laser projector control circuit 104 (including a short-range communication circuit) that controls the laser projector 84 is provided within the housing 81 of the unit 80.
  • Although the laser projector control circuit 104 may be provided in the base 93, it is preferable that at least the MEMS circuit of the laser projector 84 be provided in the unit 80.
  • The casing 81 is attached to the mount top surface 101, the mount side surface 97, and the mount bottom surface 99 of the triaxial structure 90 via silicone gel bushings 120 (for example, Taica's anti-vibration gel bushing B-1). Note that in FIG. 13, the mount top surface 101 is omitted for clarity.
  • The silicone gel bushing 120 consists of a ring-shaped silicone gel 114 fitted over the upper part of a ring-shaped silicone gel 116.
  • The upper part of the silicone gel 116 is inserted into a hole provided in the housing 81.
  • Thus, the housing 81 is sandwiched between the silicone gel 114 and the silicone gel 116.
  • The silicone gels 114 and 116 are screwed to the mount bottom surface 99 with a bolt 110 and a washer 112. With this structure, the housing 81 is held by the silicone gels 116 and 114, which prevents high-frequency vibrations from being transmitted to the housing 81 from outside.
  • Such silicone gel bushings 120 are provided at two locations on each of the top, side, and bottom surfaces of the housing 81.
  • FIG. 16 shows the hardware configuration of the motor control circuit 400.
  • a memory 404, a gyro sensor JS, an acceleration sensor AS, a camera 82, a laser projector 84, motors 92, 94, and 96, and a nonvolatile memory 406 are connected to the CPU 402.
  • An operating system 31 and a motor control program 32 are recorded in the nonvolatile memory 406.
  • the motor control program 32 cooperates with the operating system 31 to perform its functions.
  • the hardware configuration of the client device IT and the smartphone 772 is the same as in the first embodiment.
  • FIG. 17 shows a flowchart during guidance.
  • During guidance, the local guide's smartphone 200 acquires the wide-angle captured image (video) from the omnidirectional camera 774 through short-range communication and transmits it to the server device SV (step S21).
  • the server device SV receives this wide-angle captured image and records it on the SSD 558.
  • the wide-angle captured image output by the omnidirectional camera 774 is an image captured in all directions.
  • The server device SV selects an image in the predetermined direction from this wide-angle captured image according to the direction command acquired from the client device IT and recorded, and generates the selected area image (step S91). The selected area image is therefore equivalent to the image that would be obtained if the local guide pointed a normal camera in that direction.
  • the server device SV transmits the generated selection area image to the client device IT.
  • the client device IT receives the selected area image and displays it on the display 306 (step S41).
  • FIG. 18 shows the displayed selected area image. This allows the client to view the local video and enjoy seeing the local scene as the local guide moves around.
  • In the above, an image in a predetermined direction is selected and displayed as the selected area image. Since this predetermined direction is defined as a direction within the wide-angle captured image, it is determined as up, down, left, and right directions centered on the local guide.
  • When the client wishes to change this predetermined direction and view an image in a different direction, he or she operates the keyboard/mouse 316 of the client device IT and clicks the direction command button 500.
  • The direction command button 500 can be clicked anywhere through 360 degrees in the circumferential direction: up, down, left, and right.
  • the client device IT transmits a direction command corresponding to the click to the server device SV (step S42).
  • The server device SV receives this and updates the direction command. It thus changes the predetermined direction used for selecting the selected area image in step S91, and the image in the direction commanded with the direction command button 500 is displayed on the display 306 of the client device IT. For example, as shown in FIG. 19, selected area images in different directions can be viewed.
  • Since the wide-angle captured image covers all celestial-sphere directions, selected area images can be viewed in any direction, up, down, left, or right, centered on the local guide.
  • In this way, the client can enjoy the local image in the desired direction through his or her own operations.
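  • A minimal sketch of selecting the area image from the wide-angle frame, assuming the spherical camera outputs a standard equirectangular panorama (width spans 360 degrees of yaw, height spans 180 degrees of pitch). A real implementation would perform a proper perspective reprojection; this simple crop approximates one for small fields of view.

        import numpy as np

        def extract_view(equirect, yaw_deg, pitch_deg, fov_deg=60.0):
            h, w = equirect.shape[:2]
            vw = int(w * fov_deg / 360.0)   # view width in pixels
            vh = int(h * fov_deg / 180.0)   # view height in pixels
            cx = int((yaw_deg % 360.0) / 360.0 * w)           # yaw -> column
            cy = int(np.clip((90.0 - pitch_deg) / 180.0 * h,  # pitch -> row
                             vh // 2, h - vh // 2))
            cols = np.arange(cx - vw // 2, cx + vw // 2) % w  # wrap around in yaw
            return equirect[cy - vh // 2 : cy + vh // 2][:, cols]
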
  • Furthermore, in this embodiment, the client can give instructions to the local guide by projecting an image onto a local object.
  • An instruction image is transmitted from the client device IT and projected onto an object at the site by the direction-controlled laser projector 776 worn by the local guide.
  • Thus, accurate instructions can be given to the guide.
  • In this embodiment, the direction-controlled laser projector 776 is used so that the instruction image is projected correctly, as described below.
  • First, the CPU 402 of the motor control circuit 400 acquires the outputs of the gyro sensor JS and the acceleration sensor AS of the direction-controlled laser projector 776 (FIG. 17, step S1).
  • Here, a gyro sensor and an acceleration sensor for three orthogonal axes are used.
  • The motor control circuit 400 calculates the position and orientation of the base 93 (see FIG. 13) in three-dimensional space based on the outputs of the gyro sensor JS and the acceleration sensor AS.
  • The rotation angles of the motors 92, 94, and 96 are then controlled so that the unit 80 faces the predetermined direction regardless of the position and orientation of the base 93 (step S2). Therefore, regardless of the orientation of the local guide 54's head, the unit 80 is kept pointing in a constant direction.
  • Such control is similar to that of a gimbal used as a stabilizing device for cameras and the like.
  • Note that the above-mentioned predetermined direction is changed by a direction command from the client device IT (steps S42, S92, S22, S3). The projection direction of the laser projector 84 therefore matches the direction of the local image that the client is viewing on the display 306, so that the range of the selected area image and the projection range of the laser projector 84 coincide.
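  • A minimal sketch of this direction-holding loop (steps S1 and S2): estimate the orientation of the base 93 from the gyro and acceleration sensors, then command the motors so the unit 80 stays on the direction ordered by the client. The complementary filter and its gain stand in for whatever sensor fusion the motor control circuit 400 actually uses, and set_motor_angle() is a hypothetical motor driver call.

        def fuse(prev_deg, gyro_dps, accel_deg, dt, alpha=0.98):
            # Complementary filter: integrate the gyro rate, then pull the
            # estimate toward the accelerometer's gravity-referenced angle.
            return alpha * (prev_deg + gyro_dps * dt) + (1.0 - alpha) * accel_deg

        def control_step(state, gyro, accel, target, dt, set_motor_angle):
            # state, gyro, accel, target: dicts keyed by axis, in degrees (deg/s for gyro).
            for axis in ("roll", "pitch", "yaw"):
                state[axis] = fuse(state[axis], gyro[axis], accel[axis], dt)
                # Rotate the motor opposite to the base so the unit 80 stays on target.
                set_motor_angle(axis, target[axis] - state[axis])
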
  • FIG. 20 shows a flowchart of instruction image projection.
  • As shown in FIG. 21, a case will be described in which an instruction is given to purchase product 52c among products 52a, 52b, 52c, 52d, etc. lined up at a store.
  • First, the client instructs the local guide 54, by voice call or the like, to place a marker 60 prepared in advance nearby.
  • The size and shape of the image of the marker 60 are recorded in advance in the nonvolatile memory 212 of the smartphone 200. Therefore, the smartphone 200 can calculate the distance, direction, and so on from the laser projector 84 to the marker 60 based on the captured image of the marker 60.
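  • A minimal sketch of this ranging calculation, assuming a pinhole camera model with a known focal length in pixels and a marker of known physical width (both stored in advance, as described). The numeric defaults are placeholders.

        def marker_distance_m(marker_width_px, marker_width_m=0.10, focal_px=1000.0):
            # Pinhole model: apparent size scales inversely with distance.
            return focal_px * marker_width_m / marker_width_px

        def marker_bearing_deg(marker_center_x_px, image_width_px, hfov_deg=60.0):
            # Horizontal angle from the optical axis to the marker centre.
            return (marker_center_x_px / image_width_px - 0.5) * hfov_deg
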
  • During guidance, the selected area image is displayed as a moving image on the display 306 of the client device IT. While looking at this selected area image, the client clicks the instruction input mode button 501 with the marker 60 in view, as shown in FIG. 21, to give a fixing command. Note that here it is assumed that the marker 60, prepared as a card, is placed leaning against the product 52b.
  • When the client device IT receives a fixing command by the click of the instruction input mode button 501, it sets the selected area image at that time as the reference selected area image and displays it on the display 306 as a still image (step S52).
  • The client uses the mouse 316 to input an instruction to the local guide as an instruction image on this still image (step S53). For example, as shown in FIG. 21b, a circle mark 62 is drawn with the mouse 316 on the image of the product 52c displayed on the display 306 and input.
  • Further, the client device IT transmits the fixing command to the smartphone 200 via the server device SV (steps S51, S93).
  • The smartphone 200 that has received the fixing command records the selected area image at the time of reception in the nonvolatile memory 212 as the reference selected area image (step S32). Note that since the smartphone 200 receives and updates the direction command in step S22, it can generate the selected area image from the wide-angle captured image.
  • In this way, the client device IT and the smartphone 200 can both recognize the selected area image at the same moment as the reference selected area image.
  • To ensure this, information identifying the frame, such as a frame number, may be included in the fixing command; by determining the reference selected area image based on this frame-identifying information, the smartphone 200 can prevent deviations due to the time lag.
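  • A minimal sketch of that frame-number handshake: the fixing command carries the number of the frame the client froze, and the smartphone 200 looks the same frame up in a short history, so both sides adopt exactly the same reference selected area image despite network lag. The buffer length is an assumption.

        from collections import deque

        recent_frames = deque(maxlen=120)  # (frame_no, selected_area_image), newest last

        def on_fixing_command(frozen_frame_no):
            for frame_no, image in recent_frames:
                if frame_no == frozen_frame_no:
                    return image           # the reference selected area image
            return recent_frames[-1][1]    # fallback: use the newest frame
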
  • Next, the client device IT transmits the instruction image to the smartphone 200 via the server device SV (steps S53, S94).
  • Thereafter, the client device IT cancels the instruction input mode, stops displaying the still image as the reference selected area image, and displays the transmitted selected area image as a moving image (step S54). This allows the client to see the local situation again.
  • the data structure of the instruction image sent to the smartphone 200 is shown in FIG. 23A.
  • The instruction image data is the actual data of the instruction image input by the client, as shown in FIG. 23B.
  • The reference coordinate position is the XY coordinate value of the reference point of the instruction image when the reference point of the marker image (for example, the lower center point of the "M") is taken as the origin.
  • Here, the reference point is the upper-left corner of the rectangle circumscribing the instruction image input by the client.
  • In the above, the instruction image is transmitted as image data, but for shapes determined in advance, parameters may instead be transmitted as numerical values. For example, a perfect circle may be expressed and transmitted numerically by its center coordinates and radius, and a square by its upper-left coordinates and side length.
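  • Rendered as a data structure, the instruction image data of FIG. 23A might look like the following sketch. The field names are assumptions; the reference coordinate position is the offset of the instruction image's reference point (the upper-left corner of its circumscribing rectangle) from the marker's reference point, as described above.

        from dataclasses import dataclass
        from typing import Optional

        @dataclass
        class InstructionImage:
            image_data: Optional[bytes]    # actual drawing (FIG. 23B), or None if parametric
            ref_x: float                   # reference coordinate position, X
            ref_y: float                   # reference coordinate position, Y
            shape: Optional[str] = None    # e.g. "circle" when sent as parameters
            params: Optional[dict] = None  # e.g. {"cx": 10.0, "cy": 5.0, "r": 3.0}
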
  • Upon receiving the instruction image data of FIG. 23A, the smartphone 200 stores it in the memory 204. Furthermore, the smartphone 200 acquires the current captured image (selected area image) from the omnidirectional camera 774 (step S33).
  • In this embodiment, the range of the selected area image (the imaging range a normal camera would have when outputting the selected area image) and the projection range of the laser projector 84 are configured to be the same. Therefore, if the current selected area image is exactly the same as the recorded reference selected area image (that is, if the local guide has not moved at all since the reference selected area image was recorded), projecting the instruction image data with the laser projector 84 at the position given by the reference coordinate position (FIG. 23C) projects the instruction image 62 onto the product 52c.
  • Since the position of this instruction image 62 matches the position the client input on the display 306, the target product 52c can be shown accurately to the local guide.
  • the local guide can use the instruction image 62 as a landmark to purchase the product 52c without making a mistake.
  • To this end, the smartphone 200 calculates the distance and direction between the laser projector 84 and the marker 60 (and the location where the instruction image 62 is to be projected) based on the image of the marker 60 in the reference selected area image. As mentioned above, since a known pattern is printed on the marker 60 in advance, the distance and direction to the marker 60 placed near the product 52c (and to the location where the instruction image 62 should be projected) can be calculated from the captured image.
  • Similarly, the smartphone 200 calculates the distance and direction to the marker 60 (and to the location where the instruction image 62 is to be projected) based on the current selected area image acquired in step S33.
  • The smartphone 200 then compares the distance and direction to the marker 60 (and to the place where the instruction image 62 should be projected) in the reference selected area image with those in the current selected area image, transforms the instruction image 62 based on this comparison, and controls the position where the instruction image 62 is projected (step S34).
  • For example, as shown in FIG. 24B, the instruction image 62 is enlarged or reduced and projected according to the change in the distance between the camera 82 and the marker 60 (or the instruction image 62).
  • Also, as shown in FIG. 24C, the position where the instruction image 62 is projected is moved as the marker 60 moves.
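  • A minimal sketch of the corrections of FIGS. 24B and 24C: scale the instruction image with the change in marker distance and shift it with the marker's motion in the image. project_at() is a hypothetical projector call, and the dictionaries are placeholders for the measured values.

        def correct_projection(ref, cur, instr, project_at):
            # ref, cur: {"distance_m": ..., "marker_x": ..., "marker_y": ...}
            # measured at the reference frame and now; instr: an InstructionImage.
            scale = ref["distance_m"] / cur["distance_m"]  # nearer marker -> larger image
            project_at(instr,
                       x=cur["marker_x"] + instr.ref_x * scale,  # follow the marker (FIG. 24C)
                       y=cur["marker_y"] + instr.ref_y * scale,
                       scale=scale)                              # resize (FIG. 24B)
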
  • Since the direction fixing control (see FIG. 17) is performed separately by the triaxial structure 90, in many cases the instruction image 62 can be displayed at the correct position by performing the controls of FIGS. 24B and 24C.
  • In this embodiment, the direction fixing control by the triaxial structure 90 is performed separately and the above control is performed on top of it, so the instruction image 62 can be displayed stably at the correct position. Moreover, even if the local guide 54 turns his or her head and takes his or her line of sight away from the object 52, the instruction image 62 continues to be displayed by the direction fixing control, which reduces stress on the local guide 54.
  • Furthermore, the slope of the surface 510 may change between the reference selected area image of FIG. 25A and the current selected area image of FIG. 25B.
  • In this case, the smartphone 200 calculates the inclination of the surface 510 of the object 52 based on the image of the marker 60 in the reference selected area image (FIG. 25A). The actual distance LL between the marker 60 and the instruction image 62 is thereby calculated from the reference coordinate position PL1 (X or Y) sent from the client device IT.
  • Next, the inclination of the surface 510 of the object 52 is calculated based on the image of the marker 60 in the current selected area image (FIG. 25B).
  • The position where the instruction image 62 should be displayed is then determined from the actual distance LL calculated above, and the reference coordinate position PL2 (X or Y) is calculated.
  • The smartphone 200 controls the position at which the instruction image 62 is projected based on this reference coordinate position PL2, and can thus project the instruction image 62 at the correct position. Furthermore, the instruction image 62 is transformed so that the projected instruction image 62 is not distorted.
  • The above process can be performed in the same way in both the vertical and horizontal directions.
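  • A minimal sketch of that inclination correction in one axis, assuming the tilt of the surface 510 is recovered from the marker's foreshortening: the offset PL1 seen in the reference frame is converted to the actual distance LL on the surface, then re-projected with the current tilt to obtain PL2.

        from math import cos, radians

        def corrected_offset(pl1, tilt_ref_deg, tilt_cur_deg):
            ll = pl1 / cos(radians(tilt_ref_deg))   # actual distance LL on the surface
            pl2 = ll * cos(radians(tilt_cur_deg))   # apparent offset in the current frame
            return pl2
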
  • Furthermore, as shown in FIG. 26, the imaging range 506 for the current selected area image may be tilted diagonally relative to the imaging range 504 for the reference selected area image.
  • Although FIG. 26 shows a tilt in the direction horizontal to the plane of the paper, such a tilt may occur in all three-dimensional directions, and the projected instruction image 62 would be distorted accordingly.
  • Even in this case, by deforming the instruction image 62 (applying the inverse of the distortion) based on the image of the marker 60 in the reference selected area image and the image of the marker 60 in the current selected area image, the correct instruction image can be projected.
  • That is, the smartphone 200 calculates the distance and direction between the laser projector 84 and the marker 60 based on the image of the marker 60 in the reference selected area image, calculates the distance and direction to the marker 60 near the target object 52 based on the current selected area image acquired in step S33, and, based on the comparison of the two, transforms the instruction image 62 and controls the projection position.
  • In this way, the instruction image intended by the client is projected and displayed on the local object 52.
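  • One way to realize this inverse deformation is a perspective transform computed from the four corners of the marker 60, as in the following sketch using OpenCV (an assumption; the patent does not name a library). The transform that maps the marker as seen in the reference frame onto the marker as seen now is applied to the instruction image before projection.

        import cv2
        import numpy as np

        def prewarp_instruction(instr_img, ref_corners, cur_corners):
            # ref_corners, cur_corners: the marker's four (x, y) corners in the
            # reference and current selected area images, in the same order.
            h_matrix = cv2.getPerspectiveTransform(np.float32(ref_corners),
                                                   np.float32(cur_corners))
            height, width = instr_img.shape[:2]
            return cv2.warpPerspective(instr_img, h_matrix, (width, height))
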
  • Note that the marker 60 is preferably placed on the plane where the instruction image 62 is to be displayed.
  • If this is not possible, the smartphone 200 analyzes the captured image to calculate feature points (points on the boundary of the object, etc.) near the object (near the marker 60). By comparing the feature points in the reference selected area image with those in the current selected area image, the positional relationship between the marker 60 and the surface on which the instruction image 62 is to be displayed can be determined.
  • the display of the instruction image can be stopped by operating the client device IT or the local device GT.
  • In the above embodiment, the correction means 26 is provided to display the instruction image correctly by image processing and projection control. However, if such accuracy is not required, only the direction control means 20, which controls the projection direction of the projection section 14 via the drive section 16, may be sufficient.
  • In the above embodiment, a laser projector 776 with direction control is provided. However, it may be omitted and only the omnidirectional camera 774 provided; even in this case, the client can view the image in the direction he or she desires.
  • In this case, the direction instruction is given based on the reference direction of the wide-angle captured image (for example, the front of the local guide). Therefore, when the local guide changes the orientation of his or her body, the orientation of the selected area image changes accordingly: even if the client wants to keep watching the buildings on the right side of the road, if the local guide turns sideways, the client can no longer see the desired direction.
  • To prevent this, the omnidirectional camera 774 may be attached via a triaxial structure so that its direction is controlled, in the same way as the direction-controlled laser projector 776.
  • In this case, the laser projector 84 may be fixed to the omnidirectional camera 774, and the projection direction controlled by the above triaxial structure.
  • Alternatively, a sensor (such as a triaxial gyro sensor or a triaxial acceleration sensor) may be installed to detect the orientation of the omnidirectional camera 774, so that selected area images in the same orientation are extracted even if the local guide changes orientation. In this case, a triaxial structure is not required.
  • Furthermore, the orientation of the omnidirectional camera 774 may be detected by analyzing the captured image itself.
  • In the above embodiment, the instruction image 62 is always projected onto the object 52 by the laser projector 84. However, if people are in the projection direction, the laser projector 84 may be made not to emit.
  • In the above embodiment, the laser projector 84 is used as the projection section, but a normal projector may also be used.
  • In the above embodiment, a triaxial structure 90 (gimbal) is used, but a single-axis structure, a two-axis structure, a structure with four or more axes, and the like may also be used.
  • In the above embodiment, the local guide 54 attaches the marker 60 to the object 52, but the marker 60 may instead be placed on the object 52 at the site in advance.
  • In the above embodiment, the marker 60 is used to determine the distance and direction to the laser projector 84. However, SLAM (simultaneous localization and mapping) or the like may be used to grasp these from the feature points of the captured image alone and perform similar processing.
  • the smartphone 200 recognizes the feature points 512 (vertices that characterize the image, etc.), transmits them to the client device IT, and displays them on the display 306 as shown in FIG. 27.
  • the instructor looks at this image, operates the mouse 316, and selects feature points 512 to be used for position specification. It is preferable to select feature points 512 on the same plane as the object 52.
  • when the instruction input mode button 501 is clicked, information on the selected feature points 512 (coordinate values on the screen) is transmitted to the smartphone 200.
  • the smartphone 200 can specify the position and direction based on these feature points 512.
  • the client confirms the screen shown in FIG. 21b on the client device IT and clicks the instruction input mode button 502.
  • when the client device IT or the smartphone 200 detects that the marker 60 in the captured image has entered a predetermined area (for example, a central area), it may automatically switch to the instruction input mode. The same applies when processing is performed using the feature points 512 without using the marker 60.
  • the motor control circuit 400 controls the triaxial structure 90, and the smartphone 200 controls the projection position based on image processing.
  • the three-axis structure 90 may also be controlled by the smartphone 200.
  • a circuit may be provided in the base 93 to control the projection position based on image processing.
  • in that case, the smartphone 200 would be used only for telephone calls.
  • a telephone call function may also be provided within the base 93.
  • the instruction image is transformed by the smartphone 200 so that the instruction image is not projected in a distorted (or changed in size) manner.
  • however, when the shape of the instruction image is not important and only a specific position matters (for example, when the position is indicated by the center point of a cross mark), there is no problem even if the instruction image becomes distorted or changes size, as long as the position is shown correctly. In such a case, the process of transforming the instruction image may be omitted.
  • the projection position and the like are controlled based on image processing by the smartphone 200 (FIG. 20).
  • however, the control based on image processing by the smartphone 200 may be omitted, and control by the triaxial structure 90 alone may suffice.
  • the marker 60 may not be used.
  • the smartphone 200 not only performs the controls corresponding to FIGS. 24B and 24C, but also performs the controls corresponding to FIGS. 25 and 26. However, only the controls corresponding to FIGS. 24B and 24C may be performed.
  • when a fixing command is given to the client device IT, the mode is set such that a reference selection area image is displayed as a still image and an instruction image is input. However, if the local guide does not move, the selected area image may be displayed as a moving image.
  • the instructor inputs an instruction image in this state and clicks the instruction image transmission button 502.
  • the client device IT and the smartphone 200 may use the captured image at that time as the reference selection area image.
  • a still image is used as the instruction image.
  • a moving image may be used as the instruction image.
  • the on-site device repeatedly reproduces the video.
  • the smartphone 200, the laser projector 776 with direction control, and the omnidirectional camera 774 constitute the on-site device. However, they may be constructed as one. Further, instead of the smartphone 200, a dedicated device, a PC, a tablet, a stick type PC, etc. may be used.
  • the direction of the displayed captured image is changed by operating the direction change button 500.
  • the direction of the displayed captured image may be changed by dragging the screen (moving the cursor while holding down the mouse button).
  • a camera and a projector are attached to the headset 770. However, it may also be attached to something else worn, such as a helmet.
  • It may be attached to a car, bicycle, cart, etc. operated by a local guide.
  • the driving unit 16 controls the projection direction, and the smartphone 200 performs image processing and projection control (step S34) so that the instruction image is correctly displayed by following the marker 60.
  • tracking of the marker 60 and the like may instead be handled by the drive section 16.
  • the wide-angle captured image is transmitted from the smartphone 200 to the server device SV, and the selected area image is generated by the server device SV.
  • since the server device SV generates the selected area images, different selection area images can be transmitted to each client. For example, as shown in FIG. 22, in the case of a tour where a plurality of clients share one local guide 54, each of clients A, B, and C can use the direction change button 500 to view the image in the direction he or she wants to see. In this case, however, only one client can provide the instruction image.
  • alternatively, the selection area images for clients A, B, and C may be generated by the smartphone 200 and transmitted to the client devices IT via the server device SV, which can reduce the communication load.
  • a spherical camera 774 that captures images in all directions is used.
  • instead, a camera that captures 360 degrees horizontally but only a predetermined angle vertically, or a hemispherical camera that captures only the region below (or above) the horizontal, may also be used.
  • an instruction image of a product selection mark is projected at the site.
  • a barcode used for smartphone payment (for example, a PayPay (trademark) payment barcode) may be captured by the camera of the client device IT and projected onto a desk or the like at the site as an instruction image. The local shop can read this barcode to settle the payment with the client.
  • the credit card number may be recorded in the server device SV in advance when the client registers as a user.
  • the local guide accesses the server device SV to use the client's credit card number at the time of payment, and sends amount information to the server device SV.
  • the server device SV transmits the credit card number, amount information, etc. to the client device IT or the client's smartphone to request approval.
  • the server device SV then uses the credit card number and amount to access the credit company's server and process the payment (a sketch of this flow follows below).
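  • A hedged sketch of this settlement flow. The endpoint paths, JSON fields, and the single approval poll below are invented for illustration; the patent only specifies the roles of the parties:

        import requests

        SERVER_SV = "https://example.com/sv"  # placeholder URL for the server device SV

        def settle_payment(client_id: str, amount_jpy: int) -> bool:
            # 1. the local guide posts the amount; the server SV looks up
            #    the client's pre-registered credit card number
            r = requests.post(f"{SERVER_SV}/payments",
                              json={"client": client_id, "amount": amount_jpy})
            payment_id = r.json()["payment_id"]
            # 2. the server SV asks the client device IT (or smartphone) for
            #    approval; we poll once here for brevity
            approval = requests.get(f"{SERVER_SV}/payments/{payment_id}/approval").json()
            if not approval.get("approved"):
                return False
            # 3. the server SV forwards the card number and amount to the
            #    credit company's server and settles
            result = requests.post(f"{SERVER_SV}/payments/{payment_id}/capture").json()
            return result.get("status") == "settled"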
  • a stationary PC is used as an example of the client device IT.
  • a smartphone, tablet, laptop computer, etc. may also be used.
  • an image of the local area may be displayed on a head-mounted display (HMD) worn by the client.
  • if the HMD is equipped with a 6DoF head tracker, the client can change the direction of the local image according to the direction of his or her head, without having to input direction commands using a mouse or the like, and can thus enjoy the local scenery naturally.
  • FIG. 28 shows the functional configuration of the remote experience system according to the third embodiment.
  • a wide-angle camera such as a spherical camera is used as the imaging section 12, and a spherical laser projector is used as the projection section 14.
  • the imaging unit 12 of the field device GT is a wide-angle camera such as a spherical camera, and captures images in all celestial-sphere directions around the local guide to generate a wide-angle captured image.
  • This wide-angle captured image is transmitted to the server device SV by the transmitter 22 under the control of the captured image transmitter 18.
  • the server device SV receives this wide-angle captured image, selects a portion in that direction from the wide-angle captured image based on the direction instruction received from the client device IT, and generates a selected area image.
  • the captured image transfer means of the server device SV transmits the generated selected area image to the client device IT using the transmitter 758.
  • the captured image receiving means 36 of the client's device IT receives the selected area image in the direction specified by the client, and displays it on the captured image display section 40.
  • when the client looks at this selection area image and wants to change the direction, he or she inputs a direction instruction into the client device IT.
  • the direction instruction transmitting means 39 of the client device IT transmits the direction instruction to the server device SV through the transmitter 34.
  • the server device SV receives this direction instruction and uses it to select a selection area image from the wide-angle captured image. Further, the direction command transfer means 752 transmits the direction command to the local device GT using the transmitter 758.
  • the client can select and view images around the local guide in any celestial-sphere direction. This allows the client to enjoy viewing images in any direction, without being limited to a fixed field of view.
  • each client can independently enjoy the selected area image in their preferred direction.
  • when giving an instruction, the client inputs a fixing command while the selected area image containing the target object is displayed.
  • the captured image display unit 40 sets the selected area image in that direction as a reference selected area image and displays it as a still image.
  • the direction when the fixed command is given is transmitted to the local device GT via the server device SV.
  • the client inputs an instruction image from the instruction image input section 44 while viewing the reference selection area image of the site displayed on the captured image display section 40.
  • the instruction image transmitting means 38 transmits the instruction image input by the transmitter 34 to the local device GT via the server device SV.
  • the follow-up control means 21 of the field device GT receives the instruction image through the receiving section 24, and controls the projection section 14 to project the instruction image in the direction that was in effect when the fixing command was given. As a result, the instruction image 62 is projected onto the target object 52.
  • the direction in which the instruction image is projected by the projection unit 14 matches the direction of the reference selection area image, so the instruction image is projected onto the location intended by the instructor. Although this control alone is feasible, if the local guide moves from place to place, the projected position of the instruction image will shift.
  • the correction means 26 of the local device GT therefore compares the characteristic partial images (markers, etc.) in the reference selection area image with those in the current selection area image, deforms the instruction image, and corrects the projection position so that the instruction image is correctly projected at the intended position (a sketch follows below). As a result, even if the local guide moves, the instruction image is displayed in the correct position.
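  • Continuing the earlier sketch (our own illustration, not the patent's implementation): once the homography H between the reference and current selection area images is known, the instruction image can be warped before projection so that the mark stays on the intended spot:

        import cv2

        def correct_instruction_image(instruction_img, H, out_w, out_h):
            # warp the instruction image by H; black pixels mean "laser off",
            # so only the deformed mark is emitted
            return cv2.warpPerspective(instruction_img, H, (out_w, out_h),
                                       flags=cv2.INTER_LINEAR, borderValue=0)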
  • a spherical laser projector 780 is used instead of the laser projector 776 with direction control. Therefore, a spherical camera 774 and a spherical laser projector 780 are fixed to the top of the headset 770.
  • the omnidirectional laser projector 780 is configured to be able to project in all directions, up, down, left and right. It may be constructed by combining a plurality of laser projectors.
  • the hardware configuration of the client device IT is the same as that in the first embodiment (see FIG. 5). Furthermore, since the three-axis structure 90 is not used, the motors 92, 94, and 96 for controlling it are unnecessary, and the motor control circuit 400 is also unnecessary.
  • the hardware configuration of the smartphone 200 is the same as that of the first embodiment (see FIG. 3).
  • the hardware configuration of the server device SV is also similar to that of the first embodiment (see FIG. 4).
  • Remote experience processing: FIG. 30 shows a flowchart of the remote experience processing.
  • the smartphone 200 of the local guide 54 acquires a wide-angle captured image of the omnidirectional camera 774 by short-range communication (wired communication may be used), and transmits it to the server device SV via the Internet (step S21).
  • the server device SV receives this wide-angle captured image and records it on the SSD 558.
  • the wide-angle captured image output by the omnidirectional camera 774 is an image captured in all directions.
  • the server device SV selects an image in a predetermined direction from this wide-angle captured image according to the direction command acquired and recorded from the client device IT, and generates a selected area image (step S91). Therefore, the selected area image is the same as the image obtained when the local guide takes an image in a predetermined direction with a normal camera.
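  • A minimal sketch of this extraction step, under the assumption that the wide-angle captured image is stored in the common equirectangular layout (the patent does not specify the frame format, and the yaw/pitch convention here is ours):

        import numpy as np
        import cv2

        def select_area(equi, yaw, pitch, fov_deg=90.0, out_w=640, out_h=480):
            h, w = equi.shape[:2]
            f = 0.5 * out_w / np.tan(np.radians(fov_deg) / 2)
            x, y = np.meshgrid(np.arange(out_w) - out_w / 2,
                               np.arange(out_h) - out_h / 2)
            z = np.full_like(x, f, dtype=np.float64)
            # rotate viewing rays by pitch (about x), then yaw (about y)
            cp, sp = np.cos(pitch), np.sin(pitch)
            cy, sy = np.cos(yaw), np.sin(yaw)
            y2 = y * cp - z * sp
            z2 = y * sp + z * cp
            x2 = x * cy + z2 * sy
            z3 = -x * sy + z2 * cy
            lon = np.arctan2(x2, z3)                 # -pi .. pi
            lat = np.arctan2(y2, np.hypot(x2, z3))   # -pi/2 .. pi/2
            map_x = ((lon / np.pi + 1) * 0.5 * w).astype(np.float32)
            map_y = ((lat / (np.pi / 2) + 1) * 0.5 * h).astype(np.float32)
            return cv2.remap(equi, map_x, map_y, cv2.INTER_LINEAR)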
  • the server device SV transmits the generated selection area image to the client device IT.
  • the client device IT receives the selected area image and displays it on the display 306 (step S41).
  • FIG. 18 shows the displayed selection area image. This allows the client to view the local video and enjoy watching the scene as the local guide moves around.
  • an image in a predetermined direction is selected and displayed as the selected area image. Since this predetermined direction is defined as a direction within the wide-angle captured image, it corresponds to an up, down, left, or right direction centered on the local guide.
  • when the client wishes to change this predetermined direction and view the image in a different direction, he or she operates the keyboard/mouse 316 of the client device IT and clicks the direction command button 500.
  • the direction command button 500 can be clicked in 360 degrees in the circumferential direction, up, down, left, and right.
  • the client device IT transmits a direction command corresponding to the click to the server device SV (step S42).
  • the server device SV receives this and updates the direction command (step S92). The server device SV then changes the predetermined direction used to select the selection area image in step S91, so that the image in the direction commanded by the direction command button 500 is displayed on the display 306 of the client device IT. For example, as shown in FIG. 19, selected area images can be viewed in different directions.
  • the server device SV also transmits the direction command received from the client device IT to the smartphone 200 (step S92), and the smartphone 200 receives it and updates its direction command (step S22).
  • since the wide-angle captured image covers all celestial-sphere directions, selected area images can be viewed in any direction, up, down, left, or right, centered on the local guide.
  • the client can enjoy the local image in the desired direction through his/her own operations.
  • the selected area image in the direction selected by the client is displayed as a moving image on the display 306 of the client's device IT.
  • the client clicks the instruction input mode button 502 while viewing the selected area image, with the object 52 and the marker 60 displayed.
  • the client device IT sets the selected area image at that time as a reference selected area image and displays it on the display 306 as a still image (step S52).
  • the client uses the mouse 316 to input instructions to the local guide as an instruction image drawn on this still image (step S53). For example, as shown in FIG. 21b, an instruction image 62 (here, a circle) is drawn with the mouse 316 on the product 52c that the client wishes to purchase in the product image displayed on the display 306.
  • the client device IT transmits the fixed command and the direction when the fixed command was given to the smartphone 200 via the server SV (steps S51, S93).
  • the smartphone 200 that has received the fixing command determines a reference selected area image based on the selected area image and direction at the time of receiving the fixing command, and records it in the nonvolatile memory 212 (step S32). Note that since the direction command has been received in step S22 of FIG. 30, the smartphone 200 can generate the selected area image from the wide-angle captured image.
  • the client device IT and the smartphone 200 can thereby treat the selected area image from the same moment as the reference selected area image.
  • information identifying the frame, such as a frame number, may also be transmitted with the fixing command; by determining the reference selection area image from this frame information, the smartphone 200 can prevent deviations caused by communication time lag (see the sketch below).
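  • A small sketch of that synchronization idea (the message fields are our assumption): the fixing command carries the frame number, and the smartphone cuts the reference image from its buffered copy of exactly that frame rather than from the latest one:

        from dataclasses import dataclass

        @dataclass
        class FixingCommand:
            client_id: str
            yaw: float           # direction when the fixing command was given
            pitch: float
            frame_number: int    # identifies the exact wide-angle frame

        def on_fixing_command(cmd: FixingCommand, frame_buffer: dict):
            # the smartphone keeps a short ring buffer of recent frames keyed
            # by frame number; picking the matching frame prevents the
            # reference image from shifting due to network time lag
            frame = frame_buffer[cmd.frame_number]
            return frame  # extract the selection area with (cmd.yaw, cmd.pitch)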
  • when the client finishes inputting the instruction image, he or she clicks the instruction image transmission button 502 (displayed when the instruction input mode is entered) at the lower right of the reference selection area image on the display 306. Thereby, the client device IT transmits the instruction image to the smartphone 200 via the server device SV (steps S53, S94). Further, the client device IT cancels the instruction input mode, stops displaying the still image as the reference selection area image, and displays the transmitted selection area image as a moving image (step S54). This allows the client to see the local situation again.
  • upon receiving the instruction image data of FIG. 23A, the smartphone 200 stores it in the memory 204. Furthermore, the smartphone 200 acquires the current wide-angle captured image from the omnidirectional camera 774, and extracts a selected area image based on the direction command received in step S22 (step S33).
  • the smartphone 200 compares the marker in the reference selection area image with the marker in the current selection area image, and controls the projection direction of the spherical laser projector 780 so that the instruction image correctly follows the marker (step S34). Furthermore, based on this comparison, the instruction image is deformed and its projection position corrected so that it is projected correctly.
  • since the omnidirectional laser projector 780 is used, images instructed by a plurality of clients can be projected simultaneously.
  • a spherical camera 774 that captures images in all directions and a spherical laser projector 780 that projects images in all directions are used.
  • instead of these, a camera that captures 360 degrees horizontally but only a predetermined angle vertically, a hemispherical camera that captures only the region below (or above) the horizontal, or a hemispherical camera that captures only the front (or rear) may be used. Likewise, a laser projector that projects 360 degrees horizontally, a hemispherical laser projector that projects downward (or upward) from the horizontal, a hemispherical laser projector that projects forward (or rearward), etc. may be used.
  • a marker is used as the partial feature image, but feature points of the image may also be used.
  • the omnidirectional camera 774 and the omnidirectional laser projector 780 are directly attached to the headset 770. However, it may be attached via a cushioning material such as silicone gel.
  • the direction instruction is given relative to the reference direction of the wide-angle captured image (for example, the front of the local guide). Therefore, when the local guide changes the orientation of his or her body, the orientation of the selected area image changes accordingly; even if the client wants to watch the buildings on the right side of the road, the view will drift away from the desired direction whenever the guide turns.
  • to deal with this, the omnidirectional camera 774 (and the omnidirectional laser projector 780) can be attached via a three-axis structure to perform direction control.
  • FIG. 32 shows the functional configuration of the remote experience system according to the fourth embodiment.
  • the local guide wears the imaging section 12 and the projection section 14 via the driving section 16.
  • the imaging area of the imaging unit 12 and the projection area of the projection unit 14 are arranged to be substantially the same.
  • the imaging section 12 and the projection section 14 are integrally configured so that their imaging direction and projection direction can be changed by the driving section 16.
  • the imaging direction and projection direction of the imaging unit 12 and the projection unit 14 are detected by the sensor 28.
  • the direction control means 20 controls the drive unit 16 based on the output of the sensor 28, and keeps the imaging unit 12 and the projection unit 14 directed in a predetermined direction centered on the local guide, regardless of the movement of the local guide.
  • the imaging unit 12 of the local device GT images the site and generates a captured image.
  • This captured image is transmitted by the transmitting unit 22 to the client device IT via the server device SV under the control of the captured image transmitting means 18.
  • the captured image receiving means 36 of the client device IT receives the captured image by the receiving unit 32.
  • the captured image display unit 40 displays the received captured image. This allows the instructor to view images of the site.
  • if the client wishes to look in a different direction, he or she inputs a direction instruction into the client device IT.
  • the direction instruction transmitting means 39 of the client device IT transmits this direction instruction to the local device GT via the server device SV.
  • the direction control means 20 of the field device GT controls the drive section 16 and changes the predetermined directions of the imaging section 12 and the projection section 14 according to the direction instruction.
  • the imaging direction at the site is changed according to the direction instruction, and the captured image displayed on the client device IT is also changed to one in a different direction.
  • the client can look in the direction he/she wants to see, regardless of the orientation of the local guide.
  • when giving instructions, the client inputs a fixing command. When the fixing command is input, the captured image display section 40 uses the captured image at that time as a reference captured image and displays it as a still image. The client inputs an instruction image from the instruction image input section 44 while viewing the reference captured image of the site displayed on the captured image display section 40.
  • the instruction image transmitting means 38 transmits the instruction image input by the transmitter 34 to the local device GT via the server device SV.
  • the local device GT receives the instruction image through the receiving section 24 and projects the instruction image from the projection section 14. As a result, the instruction image 62 is projected onto the target object 52. As described above, since the projection direction of the projection unit 14 is fixed, even if the local guide changes the direction of his or her face, the instruction image will be projected onto the location intended by the client. However, if the local guide moves from place to place, the projected position of the instruction image will shift.
  • the correction means 26 of the field device GT compares the characteristic partial image (marker, etc.) in the reference captured image with that in the current captured image, and corrects and controls the projection position of the instruction image so that it is correctly projected at the intended position. As a result, even if the local guide moves, the instruction image is displayed in the correct position.
  • the transmitting unit 34 transmits the fixing command to the local device GT via the server device SV.
  • the receiving unit 24 of the field device GT receives this and records the captured image at that time as a reference captured image. Further, the local guide places a marker near the object 52 so that it appears in the captured image.
  • the local guide can receive instructions on the products to purchase and the direction in which he or she should proceed based on the instruction image 62 actually projected at the site.
  • This instruction image 62 is displayed at the correct position by the direction control means 20 even if the local guide changes the direction of his or her head. Therefore, the instruction image 62 does not disappear depending on the direction of the local guide's head, reducing stress.
  • even if the local guide changes direction, the instruction image 62 continues to be projected, so the other local guides are not confused. Furthermore, even if the local guide moves, the instruction image 62 is displayed correctly.
  • Appearance and hardware configuration: FIG. 33 shows the state in which the local guide 54 is wearing the local device GT.
  • attached to the top of the headset 770 is a camera/laser projector complex 58.
  • the configuration of the camera/laser projector complex 58 is as shown in FIG.
  • the basic configuration is similar to the direction-controlled laser projector 776 of the second embodiment.
  • not only the laser projector 84 but also the camera 82 is housed within the unit 80. Therefore, both the laser projector 84 and the camera 82 are directionally controlled.
  • the projection area of the laser projector 84 and the imaging area of the camera 82 are configured to substantially match.
  • the hardware configuration of the smartphone 200 is the same as that shown in FIG. 3, the hardware configuration of the server device SV is the same as that shown in FIG. 4, the hardware configuration of the client device IT is the same as that shown in FIG. 5, and the hardware configuration of the motor control circuit 400 is the same as that shown in FIG. 16.
  • Remote experience processing: FIG. 35 shows a flowchart during guidance.
  • the local guide's smartphone 200 acquires the image (video) taken by the camera 82 through short-range communication, and transmits it to the server device SV (step S21).
  • the server device SV receives this captured image and transfers it to the client device IT (step S91).
  • the client device IT receives the captured image and displays it on the display 306 (step S41).
  • FIG. 18 shows the displayed image. This allows the client to view the local video and enjoy watching the scene as the local guide moves around.
  • the camera/laser projector complex 58 is provided with a gyro sensor JS and an acceleration sensor AS.
  • the motor control circuit 400 receives the outputs of the gyro sensor JS and the acceleration sensor AS, and detects the orientation of (the base 93 of) the camera/laser projector complex 58 (step S1).
  • the motor control circuit 400 controls the motors 92, 94, and 96 so that the unit 80 is oriented in a predetermined direction even if the orientation of the base 93 changes (step S2). Therefore, an image in a predetermined direction is captured regardless of the direction of the local guide.
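  • A one-axis sketch of steps S1-S2 (the filter constant, sensor conventions, and interfaces are assumptions; a real unit would run this per axis for the motors 92, 94, and 96):

        import math

        ALPHA = 0.98  # complementary-filter weight for the gyro term

        class Stabilizer:
            def __init__(self):
                self.pitch = 0.0  # estimated base pitch (rad)

            def update(self, gyro_pitch_rate, accel_y, accel_z, dt, target_pitch):
                # S1: fuse gyro integration with the accelerometer's gravity cue
                accel_pitch = math.atan2(-accel_y, accel_z)
                self.pitch = (ALPHA * (self.pitch + gyro_pitch_rate * dt)
                              + (1 - ALPHA) * accel_pitch)
                # S2: command the motor to cancel the base motion so the
                # unit 80 keeps pointing in the commanded direction
                return target_pitch - self.pitch  # motor angle for the pitch axis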
  • the direction command button 500 can be clicked in 360 degrees in the circumferential direction, up, down, left, and right.
  • the client device IT transmits a direction command corresponding to the click to the server device SV (step S42).
  • the server device SV receives this and transfers it to the smartphone 200 (step S92).
  • the smartphone 200 that has received the direction command transfers it to the motor control circuit 400 (step S22).
  • the motor control circuit 400 changes the direction (the predetermined direction) that it constantly maintains (step S3).
  • the client can view a stable image in the desired direction through his/her own operations.
  • the client can give instructions to the local guide by projecting an image onto a local object.
  • An instruction image is transmitted from the client's device IT, and is projected onto an object at the site using a laser projector 84 worn by the local guide.
  • in this way, accurate instructions can be given to the local guide.
  • FIG. 36 shows a flowchart of instruction image projection.
  • as shown in FIG. 21a, a case will be described in which the client instructs the purchase of product 52c from among the products 52a, 52b, 52c, 52d, etc. lined up at a store.
  • by voice call or the like, the client instructs the local guide 54 to place a marker 60, prepared in advance, near the product.
  • the size and shape of the image of the marker 60 are recorded in advance in the nonvolatile memory 212 of the smartphone 200. Therefore, the smartphone 200 can calculate the distance, direction, etc. from the laser projector 84 to the marker 60 based on the captured image of the marker 60.
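  • For illustration, with a pinhole camera model the distance and bearing follow directly from the marker's known physical size (the card size and focal length below are assumed values, not from the patent):

        import math

        def marker_distance_m(marker_width_px, marker_width_m=0.10, focal_px=1000.0):
            # pinhole model: width_px = focal_px * width_m / distance_m
            return focal_px * marker_width_m / marker_width_px

        def marker_bearing_rad(marker_cx_px, image_cx_px, focal_px=1000.0):
            # horizontal angle of the marker off the optical axis
            return math.atan2(marker_cx_px - image_cx_px, focal_px)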
  • the captured image is displayed as a moving image on the display 306 of the client's device IT. While looking at this captured image, the client clicks the instruction input mode button 501 with the marker 60 captured as shown in FIG. 21a to give a fixing instruction. Note that here, it is assumed that the marker 60 prepared as a card is placed leaning against the product 52b.
  • the client device IT sets the captured image at that time as a reference captured image and displays it on the display 306 as a still image (step S52).
  • the client uses the mouse 316 to input instructions to the local guide as an instruction image for this still image (step S53). For example, as shown in FIG. 21b, a circle mark 62 is drawn using the mouse 316 on the image of the product 52c displayed on the display 306, and inputted.
  • the client device IT transmits the fixed command to the smartphone 200 via the server device SV (steps S51, S93).
  • the smartphone 200 that has received the fixing instruction records the captured image when receiving the fixing instruction in the nonvolatile memory 212 as a reference captured image (step S32).
  • the client device IT and the smartphone 200 can thereby treat captured images taken at the same moment as the reference captured image.
  • the client clicks the instruction image transmission button 502 (displayed when the instruction input mode is entered) displayed at the lower right of the reference captured image on the display 306.
  • the client device IT transmits the instruction image to the smartphone 200 via the server device SV (steps S53, S94).
  • the client device IT cancels the instruction input mode, stops displaying the still image as the reference captured image, and displays the transmitted captured image as a moving image (step S54). This allows the instructor to see the local situation again.
  • upon receiving the instruction image data of FIG. 24A, the smartphone 200 stores it in the memory 204. Furthermore, the smartphone 200 acquires the current captured image from the camera 82 (step S33).
  • the imaging range of the camera 82 and the projection range of the laser projector 84 are configured to be the same. Therefore, if the current captured image is exactly the same as the recorded reference captured image (that is, if the local guide has not moved at all since the reference captured image was taken), then when the instruction image data is projected by the laser projector 84, the instruction image 62 lands on the product 52c.
  • since the position of this instruction image 62 matches the position input by the client on the display 306, the target product 52c can be shown accurately to the local guide.
  • the local guide can use the instruction image 62 as a landmark to purchase the product 52c without making a mistake.
  • a correction means 26 is provided for correctly displaying the instruction image by performing image processing or projection control.
  • alternatively, only the direction control means 20, which controls the projection direction of the projection section 14 via the drive section 16, may be provided.
  • in the above, a laser projector 84 is provided to project the instruction image 62.
  • however, the laser projector 84 may be omitted. Even in this case, the client can view the image in the direction he or she desires.
  • the driving unit 16 controls the projection direction, and the smartphone 200 performs image processing and projection control (step S34) so that the instruction image is correctly displayed by following the marker 60.
  • tracking of the marker 60 and the like may instead be handled by the drive section 16.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

The problem addressed by the present invention is to provide a more advanced remote experience system enabling a viewing and shopping experience at a remote location. In the solution according to the invention, an available-device-list transmitting means 652 of a server device SV transmits a generated available device list to a client device IT by means of a transmitting unit 644. An available-device-list display means 666 of the client device IT displays the available device list on a display unit 668. A client refers to the list displayed on the display unit 668 and selects any local device GT1, GT2, ... GTn (that is, a local guide). A request information receiving means 618 of the local device GT1 receives the request information by means of a receiving unit 622. This causes the local device GT1 to enter a guide mode. In the guide mode, the local device GT1 captures an image of the local area by means of an imaging unit 612. A captured image transmitting means 616 transmits the captured image of the local area to the server device SV by means of a transmitting unit 620. A captured image transfer means 650 of the server device SV transfers the captured image to the client device IT. A captured image receiving means 670 of the client device IT receives the captured image by means of a receiving unit 662. A captured image display means 672 displays the received captured image of the local area on the display unit 668.
PCT/JP2023/005479 2022-04-20 2023-02-16 Système d'expérience à distance WO2023203853A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022069247 2022-04-20
JP2022-069247 2022-04-20

Publications (1)

Publication Number Publication Date
WO2023203853A1 true WO2023203853A1 (fr) 2023-10-26

Family

ID=88419658

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/005479 WO2023203853A1 (fr) 2022-04-20 2023-02-16 Système d'expérience à distance

Country Status (1)

Country Link
WO (1) WO2023203853A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005117621A (ja) * 2003-09-16 2005-04-28 Honda Motor Co Ltd 画像配信システム
JP2005323310A (ja) * 2004-05-11 2005-11-17 Nippon Telegr & Teleph Corp <Ntt> 視野共有装置、視野移動量入力装置、画像表示装置、撮影範囲投影方法、視野移動量入力装置の制御方法、画像表示装置の制御方法、視野共有装置のプログラム、視野移動量入力装置のプログラム及び画像表示装置のプログラム
JP2013192029A (ja) * 2012-03-14 2013-09-26 Renesas Mobile Corp 撮影機能を有する携帯端末およびそれを用いる画像取得システム
WO2014077046A1 (fr) * 2012-11-13 2014-05-22 ソニー株式会社 Dispositif d'affichage d'image et procédé d'affichage d'image, dispositif formant corps en mouvement, système d'affichage d'image, et programme informatique
JP2014225108A (ja) * 2013-05-16 2014-12-04 ソニー株式会社 画像処理装置、画像処理方法およびプログラム
JP2016126365A (ja) * 2014-12-26 2016-07-11 セイコーエプソン株式会社 表示システム、表示装置、情報表示方法、及び、プログラム


Similar Documents

Publication Publication Date Title
US10354407B2 (en) Camera for locating hidden objects
CN107782314B (zh) 一种基于扫码的增强现实技术室内定位导航方法
US9401050B2 (en) Recalibration of a flexible mixed reality device
US9858643B2 (en) Image generating device, image generating method, and program
CN104903775B (zh) 头戴式显示器及其控制方法
US20170337743A1 (en) System and method for referencing a displaying device relative to a surveying instrument
JP6201024B1 (ja) ヘッドマウントデバイスを用いてコンテンツを提供するアプリケーションへの入力を支援するための方法、当該方法をコンピュータに実行させるためのプログラム、およびコンテンツ表示装置
JP2018009836A (ja) プログラム、頭部装着型表示装置、キャリブレーション方法
CN109561282B (zh) 一种用于呈现地面行动辅助信息的方法与设备
JP4433385B2 (ja) 目的地案内装置および携帯端末装置
JP2022061959A (ja) ハンズフリー歩行者ナビゲーションのためのシステムと方法
  • WO2023203853A1 (fr) Système d'expérience à distance
JP6398630B2 (ja) 可視像表示方法、第1デバイス、プログラム、及び、視界変更方法、第1デバイス、プログラム
  • EP4086571A1 (fr) Capture d'environnement 3D haute densité pour guider la réalité mixte
US20230035962A1 (en) Space recognition system, space recognition method and information terminal
  • WO2022200949A1 (fr) Appareil d'affichage, système de communication, procédé de commande d'affichage et support d'enregistrement
CN114842056A (zh) 多机位第一机器视角追随方法、***、装置及设备
  • WO2023188951A1 (fr) Système d'instruction à distance
KR20200004135A (ko) 증강현실 기반의 모델하우스 가상이미지 제공방법
US20240112422A1 (en) Communication management server, communication system, and method for managing communication
JP2020042667A (ja) 投影システム、投影方法及びプログラム
  • WO2022172335A1 (fr) Dispositif d'affichage de guide virtuel, système d'affichage de guide virtuel et procédé d'affichage de guide virtuel
US20220084258A1 (en) Interaction method based on optical communication apparatus, and electronic device
WO2018019563A1 (fr) Dispositif pour services basés sur la localisation
JP2022021009A (ja) 現場映像管理システムおよび現場映像管理方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23791509

Country of ref document: EP

Kind code of ref document: A1