WO2023199626A1 - Display control device - Google Patents

Display control device

Info

Publication number
WO2023199626A1
Authority
WO
WIPO (PCT)
Prior art keywords
virtual object
display
transmittance
virtual
real
Prior art date
Application number
PCT/JP2023/007281
Other languages
French (fr)
Japanese (ja)
Inventor
健吾 松本
真治 木村
裕一 市川
宏暢 藤野
修 後藤
拓郎 栗原
泰士 山本
Original Assignee
NTT DOCOMO, INC.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NTT DOCOMO, INC.
Publication of WO2023199626A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics

Definitions

  • the present invention relates to a display control device.
  • when virtual objects overlap in a virtual space, a display control device that displays the virtual space on a display device may control the transparency of the virtual objects in order to increase the visibility of both virtual objects for the user.
  • Patent Document 1 discloses a program that allows the user to easily and smoothly operate each virtual object in a virtual space without causing the user any trouble in operating the terminal.
  • the program causes a computer to function as an object determining unit that determines at least one of the virtual objects as a target of transparency processing, and as an object control unit that applies transparency processing to the determined object at a transmittance determined according to its positional relationship with a virtual camera.
  • the technology of Patent Document 1 addresses only the superposition of virtual objects existing in a virtual space, and does not address the superposition of a virtual object and an object existing in real space.
  • an object of the present invention is to provide a display control device that can improve the visibility of an object existing in real space when a virtual object overlaps that object.
  • a first display control device comprises: a display control section that, by displaying an image showing a virtual object on a display device, causes a user to recognize, via the display device, a virtual space in which the virtual object is superimposed on a real space; and a transmittance control section that controls the transmittance of the virtual object according to the occupancy rate of the area occupied by the virtual object in the display area of the display device.
  • a second display control device comprises: an imaging device that captures an image of a real space in which a real object exists; a recognition unit that recognizes the type of the real object; a display control unit that, by displaying an image showing a virtual object on a display device, causes a user to recognize, via the display device, a virtual space in which the virtual object is superimposed on the real space; and a transmittance control section that controls the transmittance of the virtual object according to the type of the real object located behind the virtual object.
  • FIG. 1 is a diagram showing the overall configuration of an information processing system 1 according to a first embodiment.
  • FIG. 2 is a diagram showing an example of a real space and an example of a mixed reality space in which a virtual object is superimposed on the real space.
  • FIG. 3 is a block diagram showing a configuration example of a terminal device 10-K.
  • FIG. 4 is a block diagram showing a configuration example of a server 20.
  • FIG. 5 is a flowchart showing the operation of the terminal device 10-K.
  • FIG. 6 is a block diagram showing a configuration example of a terminal device 10A-K.
  • FIG. 7 is a flowchart showing the operation of the terminal device 10A-K.
  • FIG. 8 is a block diagram showing a configuration example of a terminal device 10B-K.
  • FIG. 9 is an explanatory diagram of a first distance and a second distance.
  • FIG. 10 is a flowchart showing the operation of the terminal device 10B-K.
  • FIG. 1 shows the overall configuration of an information processing system 1 according to the first embodiment.
  • the information processing system 1 includes terminal devices 10-1, 10-2, ... 10-K, ... 10-J, and a server 20.
  • J is an integer of 1 or more.
  • K is an integer greater than or equal to 1 and less than or equal to J.
  • the terminal devices 10-1 to 10-J have the same configuration. However, a terminal device whose configuration is not the same as that of other terminals may be included.
  • the terminal devices 10-1 to 10-J and the server 20 are communicably connected to each other via the communication network NET. Note that in FIG. 1, it is assumed that the user U_K uses the terminal device 10-K.
  • the server 20 provides various data and cloud services to the terminal devices 10-1 to 10-J via the communication network NET.
  • the server 20 provides various contents to be displayed in the virtual space to the terminal devices 10-1 to 10-J.
  • the terminal device 10-K causes a virtual object to be displayed on a display 15 provided in the terminal device 10-K, or on XR glasses connected to the terminal device 10-K and worn on the head of the user U_K.
  • XR glasses is a general term for VR (Virtual Reality) glasses, AR (Augmented Reality) glasses, and MR (Mixed Reality) glasses.
  • "the display 15 provided in the terminal device 10-K" and "the XR glasses connected to the terminal device 10-K and worn on the head of the user U_K" are examples of a "display device."
  • virtual objects are, for example, virtual objects representing data such as still images, moving images, 3DCG models, HTML files, and text files, and virtual objects representing applications.
  • text files include memos and source codes.
  • applications include a browser, an application for using SNS, and an application for generating a document file.
  • the terminal device 10-K is preferably a mobile terminal device such as a smartphone or a tablet, for example.
  • FIGS. 2A and 2B are diagrams showing examples of mixed reality spaces in which virtual objects are superimposed on real spaces.
  • FIG. 2A is an example of the real space RS displayed on the display device.
  • the example of the real space RS shown in FIG. 2A is a university classroom.
  • in the university classroom, there are a desk T1, chairs C1 to C4, and a bulletin board NB as some of the objects existing in the classroom.
  • a desk T1 and chairs C1 to C4 are located in front of the bulletin board NB.
  • chairs C1 to C4 surround desk T1.
  • FIG. 2B is an example of a mixed reality space MS in which a virtual object VO is superimposed on the real space RS shown in FIG. 2A.
  • the virtual object VO is located in front of the desk T1, the chairs C1 to C4, and the bulletin board NB when viewed from the user of the display device.
  • the size of the virtual object VO is such that it covers the desk T1, the chairs C1 to C4, and part of the bulletin board NB.
  • the terminal device 10 controls the transmittance of the virtual object VO to allow the user to see the desk T1, the chairs C1 to C4, and the bulletin board NB.
  • the terminal device 10 controls the transmittance of the virtual object as described above.
  • FIG. 3 is a block diagram showing an example of the configuration of the terminal device 10-K.
  • the terminal device 10-K includes a processing device 11, a storage device 12, a communication device 13, a positioning device 14, a display 15, an input device 16, and an inertial sensor 17.
  • Each element included in the terminal device 10-K is interconnected using one or more buses for communicating information.
  • the processing device 11 is a processor that controls the entire terminal device 10-K. Further, the processing device 11 is configured using, for example, a single chip or a plurality of chips. The processing device 11 is configured using, for example, a central processing unit (CPU) that includes an interface with peripheral devices, an arithmetic unit, registers, and the like. Note that some or all of the functions of the processing device 11 may be implemented using hardware such as a DSP, ASIC, PLD, or FPGA. The processing device 11 executes various processes in parallel or sequentially.
  • the storage device 12 is a recording medium that can be read and written by the processing device 11. The storage device 12 stores a plurality of programs including the control program PR1 executed by the processing device 11, and image information indicating images displayed on the display 15. In particular, the storage device 12 stores image information used by the generation unit 111, described later, to generate the virtual object VO.
  • the communication device 13 is hardware as a transmitting/receiving device for communicating with other devices.
  • the communication device 13 is also called, for example, a network device, a network controller, a network card, a communication module, or the like.
  • the communication device 13 may include a connector for wired connection and an interface circuit corresponding to the connector.
  • the communication device 13 may include a wireless communication interface. Examples of connectors and interface circuits for wired connections include products compliant with wired LAN, IEEE1394, and USB.
  • examples of the wireless communication interface include products compliant with wireless LAN, Bluetooth (registered trademark), and the like.
  • the positioning device 14 acquires position information.
  • the positioning device 14 may be, for example, a GPS (Global Positioning System) device.
  • the positioning device 14 receives radio waves from a plurality of satellites.
  • the positioning device 14 generates GPS information as position information from the received radio waves.
  • the location information may be in any format as long as the location can be specified.
  • the location information indicates, for example, the latitude and longitude of the terminal device 10-K.
  • the acquired position information is output to the processing device 11.
  • the positioning device 14 may be, for example, a VPS (Visual Positioning System) device.
  • the positioning device 14 acquires image information indicating an image of the scenery in front of the user U_K from an imaging device (not shown). The positioning device 14 outputs this image information to a position information server (not shown) via the communication device 13, and acquires VPS information as position information from the position information server via the communication device 13.
  • the position information includes the position of the user U_K in the real space RS and the direction in which the user U_K views the real space RS.
  • the display 15 is a device that displays images and text information.
  • the display 15 displays various images under the control of the processing device 11.
  • various display panels such as a liquid crystal display panel and an organic EL (Electro Luminescence) display panel are suitably used as the display 15.
  • the display 15 does not have to be an essential component.
  • the terminal device 10-K may be configured without the display 15.
  • the display 15 and XR glasses are examples of display devices.
  • the input device 16 accepts operations from the user U_K.
  • the input device 16 includes a keyboard, a touch pad, a touch panel, or a pointing device such as a mouse.
  • the input device 16 may also serve as the display 15.
  • the inertial sensor 17 is a sensor that detects inertial force.
  • the inertial sensor 17 includes, for example, one or more of an acceleration sensor, an angular velocity sensor, and a gyro sensor.
  • the processing device 11 detects the attitude of the terminal device 10-K based on the output information of the inertial sensor 17. Based on the detected attitude, the processing device 11 accepts selection of the virtual object VO, input of characters, and input of instructions in the mixed reality space MS. For example, when the user U_K operates the input device 16 while pointing the central axis of the terminal device 10-K toward a predetermined region of the mixed reality space MS, the virtual object VO placed in that region is selected. The user U_K's operation on the input device 16 is, for example, a double tap. In this way, by operating the terminal device 10-K, the user U_K can select the virtual object VO without looking at the input device 16 of the terminal device 10-K.
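  • as an illustration of the attitude-based selection described above, the following sketch treats the central axis of the terminal device 10-K as a pick ray and tests whether it passes near the virtual object VO. This is a hypothetical geometric helper, not the patent's actual method; the bounding-sphere approximation and all names are assumptions.

```python
import numpy as np

def ray_hits_sphere(origin, direction, center, radius) -> bool:
    """Does the pick ray from the device pass within `radius` of `center`?

    origin/direction: the device position and central-axis direction
    center/radius: a bounding sphere approximating the virtual object VO
    """
    d = np.asarray(direction, dtype=float)
    d /= np.linalg.norm(d)                         # unit-length ray direction
    oc = np.asarray(center, dtype=float) - np.asarray(origin, dtype=float)
    t = max(float(oc @ d), 0.0)                    # closest approach along the ray
    closest = np.asarray(origin, dtype=float) + t * d
    return float(np.linalg.norm(np.asarray(center, dtype=float) - closest)) <= radius
```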
  • the processing device 11 reads the control program PR1 from the storage device 12 and executes the control program PR1. As a result, the processing device 11 functions as a generation section 111, a display control section 112, a calculation section 113, a transmittance control section 114, and a communication control section 115.
  • the generation unit 111 generates a virtual object VO.
  • the generation unit 111 may generate the virtual object VO using image information stored in the storage device 12.
  • the generation unit 111 may acquire image information from the server 20 via the communication device 13 and generate the virtual object VO using the acquired image information.
  • the display control unit 112 causes the user U_K to recognize, via the display 15, the mixed reality space MS in which the virtual object VO is superimposed on the real space RS, by displaying an image showing the virtual object VO on the display 15.
  • when XR glasses are used, the display control unit 112 displays the virtual object VO while light incident from the outside world passes through the XR glasses, thereby causing the user U_K to recognize the mixed reality space MS.
  • the calculation unit 113 calculates the occupancy rate of the area occupied by the virtual object VO displayed by the display control unit 112 in the display area of the display 15.
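  • a minimal sketch of this occupancy calculation, assuming the projected footprint of the virtual object VO is available as a boolean pixel mask of the display (how the mask is obtained from the renderer is not specified in the text):

```python
import numpy as np

def occupancy_rate(vo_mask: np.ndarray) -> float:
    """Fraction of the display area covered by the virtual object, in [0, 1].

    vo_mask: boolean array of shape (display_height, display_width),
    True where the virtual object VO is drawn.
    """
    return float(vo_mask.sum()) / vo_mask.size
```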
  • the transmittance control unit 114 controls the transmittance of the virtual object VO according to the occupancy rate of the area occupied by the virtual object VO calculated by the calculation unit 113. Specifically, the transmittance control unit 114 increases the transmittance of the virtual object VO as the occupancy rate calculated by the calculation unit 113 increases. This is because the larger the area occupied by the virtual object VO in the display area of the display 15, the larger the portion of the real space RS that is invisible to the user U_K, more specifically, the portion hidden behind the virtual object VO. Therefore, the higher the occupancy rate, the more transparent the transmittance control unit 114 makes the virtual object VO.
  • the occupancy rate of the virtual object VO and the transmittance of the virtual object VO may be in a proportional relationship.
  • for example, when the occupancy rate is 0%, the transmittance control unit 114 sets the transmittance of the virtual object VO to 0%; when the occupancy rate is 90%, the transmittance control unit 114 may set the transmittance to 90%.
  • more generally, the transmittance control unit 114 may monotonically increase the transmittance of the virtual object VO as the occupancy rate of the virtual object VO increases.
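  • a minimal sketch of this mapping, assuming the simple proportional law described above (gamma = 1.0 reproduces it; any gamma > 0 keeps the mapping monotonically increasing):

```python
def transmittance_from_occupancy(occupancy: float, gamma: float = 1.0) -> float:
    """Map occupancy in [0, 1] to transmittance in [0, 1].

    With gamma = 1.0 this is the proportional relationship above
    (e.g., 90% occupancy -> 90% transmittance).
    """
    o = max(0.0, min(occupancy, 1.0))   # clamp to the valid range
    return o ** gamma
```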
  • the terminal device 10-K increases the transmittance of the virtual object VO as the occupancy rate of the area occupied by the virtual object VO in the display area of the display 15 increases. As a result, the user U_K can obtain the information from the real space RS that he or she should originally obtain. Furthermore, the increase in danger that would otherwise occur as the user U_K moves is suppressed.
  • the communication control unit 115 causes the communication device 13 to transmit operation information indicating the user U_K's operation using the terminal device 10-K to the server 20.
  • the communication control unit 115 causes the communication device 13 to transmit operation information indicating the user U_K's operation on the virtual object VO to the server 20.
  • FIG. 4 is a block diagram showing an example of the configuration of the server 20.
  • the server 20 includes a processing device 21, a storage device 22, a communication device 23, a display 24, and an input device 25. Each element included in the server 20 is interconnected using one or more buses for communicating information.
  • the processing device 21 is a processor that controls the entire server 20. Further, the processing device 21 is configured using, for example, a single chip or a plurality of chips. The processing device 21 is configured using, for example, a central processing unit (CPU) that includes an interface with peripheral devices, an arithmetic unit, registers, and the like. Note that some or all of the functions of the processing device 21 may be implemented using hardware such as a DSP, ASIC, PLD, or FPGA. The processing device 21 executes various processes in parallel or sequentially.
  • the storage device 22 is a recording medium that can be read and written by the processing device 21. The storage device 22 stores a plurality of programs including the control program PR2 executed by the processing device 21, and image information indicating images displayed on the display 24.
  • the communication device 23 is hardware as a transmitting/receiving device for communicating with other devices.
  • the communication device 23 is also called, for example, a network device, a network controller, a network card, a communication module, or the like.
  • the communication device 23 may include a connector for wired connection and an interface circuit corresponding to the connector.
  • the communication device 23 may include a wireless communication interface. Examples of connectors and interface circuits for wired connections include products compliant with wired LAN, IEEE1394, and USB.
  • examples of the wireless communication interface include products compliant with wireless LAN, Bluetooth (registered trademark), and the like.
  • the display 24 is a device that displays images and text information.
  • the display 24 displays various images under the control of the processing device 21.
  • various display panels such as a liquid crystal display panel and an organic EL (Electro Luminescence) display panel are suitably used as the display 24.
  • the input device 25 accepts operations from the administrator of the information processing system 1.
  • the input device 25 includes a keyboard, a touch pad, a touch panel, or a pointing device such as a mouse.
  • the input device 25 may also serve as the display 24.
  • the processing device 21 reads the control program PR2 from the storage device 22 and executes the control program PR2. As a result, the processing device 21 functions as an acquisition section 211 and a communication control section 212.
  • the acquisition unit 211 acquires operation information indicating the operation of the user U_K from the terminal device 10-K via the communication device 23.
  • the communication control unit 212 causes the communication device 23 to transmit various data for providing various contents and cloud services to the terminal device 10-K.
  • the communication control unit 212 causes the communication device 23 to transmit various data necessary for the user of the terminal device 10-K to experience the mixed reality space MS to the terminal device 10-K.
  • the communication control unit 212 reads the image information from the storage device 22 and causes the communication device 23 to transmit the image information to the terminal device 10-K.
  • FIG. 5 is a flowchart showing the operation of the terminal device 10-K as a display control device according to the first embodiment. The operation of the terminal device 10-K will be described below with reference to FIG.
  • in step S1, the processing device 11 functions as the generation unit 111.
  • the processing device 11 generates a virtual object VO.
  • in step S2, the processing device 11 functions as the display control section 112.
  • the processing device 11 causes the user U_K to recognize, via the display 15, the mixed reality space MS in which the virtual object VO is superimposed on the real space RS, by causing the display 15 to display an image showing the virtual object VO generated in step S1.
  • in step S3, the processing device 11 functions as the calculation unit 113.
  • the processing device 11 calculates the occupancy rate of the area occupied by the virtual object VO displayed by the display control unit 112 in the display area of the display 15 .
  • in step S4, the processing device 11 functions as the transmittance control section 114.
  • the processing device 11 controls the transmittance of the virtual object VO according to the occupancy rate calculated in step S3.
  • the terminal device 10-K then ends all the processes shown in the flowchart of FIG. 5.
  • the terminal device 10-K as a display control device includes a display control section 112 and a transmittance control section 114.
  • the display control unit 112 causes the user U_K to recognize, via the display 15, the mixed reality space MS in which the virtual object VO is superimposed on the real space RS, by displaying an image showing the virtual object VO on the display 15 serving as a display device.
  • the transmittance control unit 114 controls the transmittance of the virtual object VO according to the occupancy rate of the area occupied by the virtual object VO in the display area of the display 15.
  • because the terminal device 10-K has the above configuration, when the virtual object VO and an object existing in the real space RS overlap in the augmented reality space or the mixed reality space, it becomes possible to increase the visibility of the object existing in the real space RS.
  • the terminal device 10-K increases the transmittance of the virtual object VO as the occupancy rate of the area occupied by the virtual object VO on the screen displayed by the display 15 increases. As a result, the user U_K obtains the information that he or she should originally obtain. Furthermore, the increase in danger that would otherwise occur as the user U_K moves is suppressed.
  • 2: Second embodiment 2-1: Configuration of the second embodiment 2-1-1: Overall configuration The information processing system 1A according to the second embodiment differs from the information processing system 1 according to the first embodiment in that it includes terminal devices 10A-1, 10A-2, ...10A-K, ...10A-J instead of the terminal devices 10-1, 10-2, ...10-K, ...10-J.
  • the overall configuration of the information processing system 1A is the same as the overall configuration of the information processing system 1 shown in FIG. 1, so illustration thereof will be omitted.
  • FIG. 6 is a block diagram showing an example of the configuration of the terminal devices 10A-K.
  • the terminal device 10A-K is different from the terminal device 10-K according to the first embodiment in that it includes a processing device 11A instead of the processing device 11, and a storage device 12A instead of the storage device 12. Furthermore, the terminal device 10A-K includes an imaging device 18 in addition to the components included in the terminal device 10-K.
  • the imaging device 18 images the real space RS in the external world where objects exist. Further, the imaging device 18 outputs imaging information indicating an image obtained by imaging the outside world. Further, the imaging device 18 includes, for example, a lens, an imaging element, an amplifier, and an AD converter.
  • the imaging element converts the light collected through the lens into an imaging signal, which is an analog signal.
  • the amplifier amplifies the imaging signal and outputs the amplified imaging signal to the AD converter.
  • the AD converter converts the amplified imaging signal, which is an analog signal, into imaging information, which is a digital signal.
  • the imaging information is output to the processing device 11A.
  • when XR glasses connected to the terminal device 10A-K are used instead of the display 15 of the terminal device 10A-K, an imaging device included in the XR glasses may be used instead of the imaging device 18 provided in the terminal device 10A-K. Note that the "object" here is an example of a "real object."
  • the storage device 12A stores a control program PR1A instead of the control program PR1 included in the storage device 12 according to the first embodiment. Furthermore, the storage device 12A further stores a learning model LM.
  • the learning model LM is a learning model for the recognition unit 116 (described later) to recognize the type of object included in an image obtained by the imaging device 18 performing imaging.
  • the learning model LM is generated by learning teacher data in the learning phase.
  • the teacher data used to generate the learning model LM includes a plurality of pairs, each consisting of a photograph of an object and the type of that object.
  • the learning model LM is generated outside the terminal devices 10A-K.
  • the learning model LM is generated in a second server (not shown).
  • the terminal devices 10A-K acquire the learning model LM from the second server via the communication network NET.
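  • a hedged sketch of how the recognition unit 116 might apply the learning model LM; the patent does not specify the model architecture or its inference API, so the `detect` call and the `label` attribute below are assumptions standing in for any object-recognition model trained on (photograph, type) pairs:

```python
from typing import List

class RecognitionUnit:
    """Recognition unit 116: recognizes the types of real objects in an image."""

    def __init__(self, learning_model):
        self.model = learning_model    # learning model LM (assumed interface)

    def recognize(self, image) -> List[str]:
        """Return the recognized types, e.g. ["desk", "chair", "bulletin board"]."""
        detections = self.model.detect(image)   # hypothetical inference call
        return [d.label for d in detections]
```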
  • the processing device 11A reads the control program PR1A from the storage device 12A and executes the control program PR1A. As a result, the processing device 11A functions as a transmittance control section 114A and a recognition section 116 in addition to the generation section 111, display control section 112, and communication control section 115 similar to those in the first embodiment.
  • the recognition unit 116 uses the learning model LM to recognize the type of the real object RO existing in the real space RS. Specifically, the recognition unit 116 recognizes the type of one or more real objects existing in the real space RS before the virtual object VO is generated.
  • the recognition unit 116 recognizes that the type of desk T1, which is the real object RO existing in the real space RS, is "desk”. Similarly, the recognition unit 116 recognizes that the types of chairs C1 to C4, which are real objects RO existing in the real space RS, are “chairs.” Further, the recognition unit 116 recognizes that the type of the bulletin board NB, which is the real object RO existing in the real space RS, is a "bulletin board".
  • the transmittance control unit 114A controls the transmittance of the virtual object VO according to the type of the real object RO located behind the virtual object VO.
  • in the example shown in FIG. 2, the transmittance control unit 114A controls the transmittance of the virtual object VO according to the types of the real objects RO located behind the virtual object VO being "desk," "chair," and "bulletin board."
  • the transmittance control unit 114A may control the transmittance of the virtual object VO according to the combination of types of these real objects RO.
  • for example, the transmittance control unit 114A may control the transmittance of the virtual object VO based on the combination of the types of the real objects RO located behind the virtual object VO being a combination of the three types "desk," "chair," and "bulletin board."
  • alternatively, the transmittance control unit 114A may control the transmittance of the virtual object VO based on the type of any one real object RO among the one or more real objects RO located behind the virtual object VO.
  • for example, the transmittance control unit 114A may control the transmittance of the virtual object VO based on the fact that a real object RO whose type is "desk" is located behind the virtual object VO.
  • the transmittance control unit 114A may control the transmittance of the virtual object VO based on the fact that the real object RO whose type is "chair” is located behind the virtual object VO.
  • the transmittance control unit 114A may control the transmittance of the virtual object VO based on the fact that a real object RO whose type is "bulletin board" is located behind the virtual object VO.
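  • the per-type behavior above could be realized with a lookup table; the numeric values below are invented for illustration (the text only says the transmittance is controlled according to the recognized type, with higher-risk types warranting higher transmittance):

```python
# Hypothetical minimum transmittance (0.0-1.0) applied when a real object of
# the given type is occluded by the virtual object VO.
RISK_TRANSMITTANCE = {
    "desk": 0.3,
    "chair": 0.5,            # e.g., someone may be about to sit down
    "bulletin board": 0.1,
}

def transmittance_for_types(types_behind: list[str]) -> float:
    """Use the most demanding value among the occluded object types."""
    return max((RISK_TRANSMITTANCE.get(t, 0.0) for t in types_behind), default=0.0)
```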
  • the degree of danger to the user U_K differs depending on the type of the real object RO that is hidden.
  • the terminal device 10A-K recognizes the type of the real object RO and controls the transmittance of the virtual object VO according to the recognized type. As a result, when the danger that arises when the user U_K's field of view is blocked by the virtual object VO is relatively high, the terminal device 10A-K makes the transmittance of the virtual object VO relatively high. By doing so, it is possible to suppress the occurrence of relatively dangerous situations.
  • conversely, when the danger is relatively low, the terminal device 10A-K makes the transmittance of the virtual object VO relatively low. This ensures the visibility of the virtual object VO for the user U_K.
  • FIG. 7 is a flowchart showing the operation of the terminal device 10A-K as a display control device according to the second embodiment. The operation of the terminal devices 10A-K will be described below with reference to FIG.
  • in step S11, the imaging device 18 images the real space RS in the external world where the real object RO exists.
  • in step S12, the processing device 11A functions as the recognition unit 116.
  • the processing device 11A uses the learning model LM to recognize the type of the real object RO existing in the real space RS.
  • in step S13, the processing device 11A functions as the generation unit 111.
  • the processing device 11A generates a virtual object VO.
  • in step S14, the processing device 11A functions as the display control section 112.
  • the processing device 11A causes the user U_K to recognize, via the display 15, the mixed reality space MS in which the virtual object VO is superimposed on the real space RS, by displaying the image showing the virtual object VO generated in step S13 on the display 15.
  • in step S15, the processing device 11A functions as the transmittance control section 114A.
  • the processing device 11A controls the transparency of the virtual object VO according to the type of the real object RO located behind the virtual object VO.
  • the terminal device 10A-K then ends all the processes shown in the flowchart of FIG. 7.
  • the terminal device 10A-K as a display control device according to the present embodiment includes an imaging device 18, a recognition section 116, a display control section 112, and a transmittance control section 114A. The imaging device 18 images the real space RS where the real object RO exists.
  • the recognition unit 116 recognizes the type of real object RO.
  • the display control unit 112 causes the user U_K to recognize, via the display 15, the mixed reality space MS in which the virtual object VO is superimposed on the real space RS, by displaying an image showing the virtual object VO on the display 15 serving as a display device.
  • the transmittance control unit 114A controls the transmittance of the virtual object VO according to the type of the real object RO located behind the virtual object VO.
  • because the terminal device 10A-K has the above configuration, when the virtual object VO and an object existing in the real space RS overlap in the augmented reality space or the mixed reality space, it becomes possible to increase the visibility of the object existing in the real space RS.
  • the degree of danger to the user U_K differs depending on the type of the real object RO that is hidden.
  • the terminal device 10A-K recognizes the type of the real object RO and controls the transmittance of the virtual object VO according to the recognized type. As a result, when the danger that arises when the user U_K's field of view is blocked by the virtual object VO is relatively high, the terminal device 10A-K makes the transmittance of the virtual object VO relatively high. By doing so, it is possible to suppress the occurrence of relatively dangerous situations.
  • conversely, when the danger is relatively low, the terminal device 10A-K makes the transmittance of the virtual object VO relatively low. This ensures the visibility of the virtual object VO for the user U_K.
  • 3: Third embodiment 3-1: Configuration of the third embodiment 3-1-1: Overall configuration The information processing system 1B according to the third embodiment differs from the information processing system 1 according to the first embodiment in that it includes terminal devices 10B-1, 10B-2, . . . 10B-K, . . . 10B-J instead of the terminal devices 10-1, 10-2, . . . 10-K, . . . 10-J.
  • the overall configuration of the information processing system 1B is the same as the overall configuration of the information processing system 1 shown in FIG. 1, so illustration thereof is omitted.
  • FIG. 8 is a block diagram showing an example of the configuration of the terminal device 10B-K.
  • the terminal device 10B-K differs from the terminal device 10-K according to the first embodiment in that it includes a processing device 11B instead of the processing device 11, and a storage device 12B instead of the storage device 12.
  • the storage device 12B stores a control program PR1B instead of the control program PR1 included in the storage device 12 according to the first embodiment.
  • the processing device 11B reads the control program PR1B from the storage device 12B and executes the control program PR1B.
  • as a result, the processing device 11B functions as a transmittance control section 114B, a first acquisition section 117, and a second acquisition section 118, in addition to the generation section 111, display control section 112, calculation section 113, and communication control section 115 similar to those in the first embodiment.
  • the first acquisition unit 117 acquires the distance from the virtual object VO to the real object RO located behind the virtual object VO in the mixed reality space MS.
  • the distance is an example of a "first distance.”
  • FIG. 9 is an explanatory diagram of a first distance and a second distance described below.
  • FIG. 9 is a diagram of the mixed reality space MS shown in FIG. 2B viewed from another angle.
  • the first distance L1 is the distance between the proximate real object RO and the virtual object VO.
  • the proximate real object RO is the real object RO closest to the virtual object VO among the plurality of real objects RO located behind the virtual object VO when viewed from the user UK .
  • the plurality of real objects RO located behind the virtual object VO are a desk T1, chairs C1 to C4, and a bulletin board NB.
  • more specifically, the first distance L1 is the distance between the proximity point of the proximate real object RO and the proximity point of the virtual object VO.
  • the proximity point of the proximate real object RO is the point on the surface of the proximate real object RO that is closest to the virtual object VO.
  • the proximity point of the virtual object VO is the point on the surface of the virtual object VO that is closest to the proximate real object RO.
  • in the example shown in FIG. 9, the real object RO closest to the virtual object VO is the chair C3. Therefore, the first distance L1 is the distance between the point of the chair C3 closest to the virtual object VO and the point of the virtual object VO closest to the chair C3.
  • the first distance L1 is not limited to this.
  • for example, the first distance L1 may be the distance between the center of gravity of the proximate real object RO and the center of gravity of the virtual object VO.
  • the second acquisition unit 118 acquires the distance from the display 15 as a display device to the virtual object VO in the mixed reality space MS.
  • the distance is an example of a "second distance.”
  • for example, the second distance L2 is the distance between the point of the display 15 that is closest to the virtual object VO and the point of the virtual object VO that is closest to the display 15.
  • the second distance L2 is not limited to this.
  • the second distance L2 may be the distance between the center of gravity of the display 15 and the center of gravity of the virtual object VO.
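  • both distances are closest-point distances between two bodies; a minimal sketch, assuming each body is approximated by a sampled set of 3D surface points (the text also permits using centers of gravity instead):

```python
import numpy as np

def closest_point_distance(points_a: np.ndarray, points_b: np.ndarray) -> float:
    """Smallest Euclidean distance between any point of A and any point of B.

    points_a: (Na, 3) surface samples of one body, e.g. the virtual object VO
    points_b: (Nb, 3) surface samples of the other body, e.g. the chair C3
    """
    diffs = points_a[:, None, :] - points_b[None, :, :]      # (Na, Nb, 3)
    return float(np.sqrt((diffs ** 2).sum(axis=-1)).min())

# first distance L1:  closest_point_distance(vo_points, nearest_ro_points)
# second distance L2: closest_point_distance(display_points, vo_points)
```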
  • the transmittance control unit 114B controls the transmittance of the virtual object VO according to the first distance L1 acquired by the first acquisition unit 117, in addition to the occupancy rate of the virtual object VO calculated by the calculation unit 113.
  • specifically, the transmittance control unit 114B lowers the transmittance of the virtual object VO as the first distance L1 becomes longer.
  • as a result, the closer the virtual object VO and the real object RO are to each other, the more the terminal device 10B-K can suppress the increase in danger associated with a situation where the user U_K cannot visually recognize the real object RO.
  • the transmittance control unit 114B may also control the transmittance of the virtual object VO according to the second distance L2 acquired by the second acquisition unit 118, in addition to the occupancy rate of the virtual object VO calculated by the calculation unit 113.
  • in this case, the transmittance control unit 114B lowers the transmittance of the virtual object VO as the second distance L2 becomes longer.
  • as a result, the closer the display 15 and the virtual object VO are to each other, and hence the closer the user U_K and the virtual object VO are to each other, the more the terminal device 10B-K can suppress the increase in danger associated with a situation where the user U_K cannot visually recognize the real object RO.
  • the transmittance control unit 114B may calculate the transmittance z (%) of the virtual object VO using the following formula (1).
  • alternatively, the transmittance control unit 114B may control the transmittance of the virtual object VO according to both the first distance L1 acquired by the first acquisition unit 117 and the second distance L2 acquired by the second acquisition unit 118, in addition to the occupancy rate of the virtual object VO calculated by the calculation unit 113.
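  • formula (1) itself is not reproduced in this text, so the following is only an illustrative stand-in with the same qualitative behavior: the transmittance z rises with the occupancy rate and falls as the first distance L1 or the second distance L2 grows. The functional form and the weights a, b, c are assumptions:

```python
def transmittance_combined(occupancy: float, l1: float, l2: float,
                           a: float = 1.0, b: float = 0.5, c: float = 0.5) -> float:
    """Illustrative combination of occupancy, L1, and L2 into z (%)."""
    z = 100.0 * a * occupancy / (1.0 + b * l1 + c * l2)
    return max(0.0, min(z, 100.0))   # clamp to a valid percentage
```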
  • FIG. 10 is a flowchart showing the operation of the terminal device 10B-K as a display control device according to the third embodiment. The operation of the terminal devices 10B-K will be described below with reference to FIG.
  • in step S21, the processing device 11B functions as the generation unit 111.
  • the processing device 11B generates a virtual object VO.
  • in step S22, the processing device 11B functions as the display control section 112.
  • the processing device 11B causes the user U_K to recognize, via the display 15, the mixed reality space MS in which the virtual object VO is superimposed on the real space RS, by displaying the image showing the virtual object VO generated in step S21 on the display 15.
  • in step S23, the processing device 11B functions as the calculation unit 113.
  • the processing device 11B calculates the occupancy rate of the area occupied by the virtual object VO displayed by the display control unit 112 in the display area of the display 15.
  • in step S24, the processing device 11B functions as the first acquisition unit 117.
  • the processing device 11B obtains a first distance L1 from the virtual object VO to the real object RO located behind the virtual object VO in the mixed reality space MS.
  • in step S25, the processing device 11B functions as the second acquisition unit 118.
  • the processing device 11B obtains a second distance L2 from the display 15 as a display device to the virtual object VO in the mixed reality space MS.
  • in step S26, the processing device 11B functions as the transmittance control section 114B.
  • the processing device 11B controls the transmittance of the virtual object VO according to the occupancy rate calculated in step S23, the first distance L1 acquired in step S24, and the second distance L2 acquired in step S25.
  • the terminal device 10B-K then ends all the processes shown in the flowchart of FIG. 10.
  • the terminal device 10B-K as a display control device according to the present embodiment further includes a first acquisition section 117 in addition to the components provided in the terminal device 10-K according to the first embodiment.
  • the first acquisition unit 117 acquires a first distance L1 from the virtual object VO to the real object RO located behind the virtual object VO in the mixed reality space MS.
  • the transmittance control unit 114B controls the transmittance of the virtual object VO according to the first distance L1 in addition to the occupancy rate of the virtual object VO.
  • because the terminal device 10B-K has the above configuration, when the virtual object VO and an object existing in the real space RS overlap in the augmented reality space or the mixed reality space, it becomes possible to increase the visibility of the object existing in the real space RS.
  • the terminal device 10B-K controls the transmittance of the virtual object VO according to the first distance L1 from the virtual object VO to the real object RO. Specifically, the terminal device 10B-K lowers the transmittance of the virtual object VO as the first distance L1 becomes longer. As a result, the closer the virtual object VO and the real object RO are to each other, the more the terminal device 10B-K can suppress the increase in danger associated with a situation where the user U_K cannot visually recognize the real object RO.
  • the terminal device 10B-K as a display control device according to the present embodiment further includes a second acquisition unit 118 in addition to the components provided in the terminal device 10-K according to the first embodiment.
  • the second acquisition unit 118 acquires a second distance L2 from the display 15 as a display device to the virtual object VO in the mixed reality space MS.
  • the transmittance control unit 114B controls the transmittance of the virtual object VO according to the second distance L2 in addition to the occupancy rate of the virtual object VO.
  • because the terminal device 10B-K has the above configuration, when the virtual object VO and an object existing in the real space RS overlap in the augmented reality space or the mixed reality space, it becomes possible to increase the visibility of the object existing in the real space RS.
  • the terminal device 10B-K controls the transmittance of the virtual object VO according to the second distance L2 between the display 15 and the virtual object VO. Specifically, the terminal device 10B-K lowers the transmittance of the virtual object VO as the second distance L2 becomes longer. As a result, the closer the display 15 and the virtual object VO are to each other, and hence the closer the user U_K and the virtual object VO are to each other, the more the terminal device 10B-K can suppress the increase in danger associated with a situation where the user U_K cannot visually recognize the real object RO.
  • in the embodiments described above, the terminal device 10-K causes the user U_K to recognize, via the display 15 or the XR glasses connected to the terminal device 10-K, the mixed reality space MS in which the virtual object VO is superimposed on the real space RS.
  • the terminal device 10-K may allow the user U_K to recognize a virtual reality space or an augmented reality space instead of the mixed reality space MS.
  • the terminal device 10-K may allow the user U_K to recognize the virtual reality space, augmented reality space, or mixed reality space MS through an HMD (Head Mounted Display) instead of the XR glasses.
  • the HMD is an example of a "display device.”
  • virtual reality space, augmented reality space, and mixed reality space MS are all examples of "virtual space.”
  • for example, the terminal device 10-K may cause a video see-through type HMD to display a virtual reality space in which the virtual object VO is superimposed on a captured image of the real space RS including the real object RO.
  • in this case, the display control unit 112 causes the user U_K to recognize the virtual reality space by displaying, on the HMD, a superimposed image in which an image indicating the virtual object VO is superimposed on a captured image obtained by imaging the outside world.
  • the terminal devices 10A-K according to the second embodiment and the terminal devices 10B-K according to the third embodiment may also be modified in the same way as in the first embodiment.
  • the terminal device 10-K generates a virtual object VO and controls the transmittance of the virtual object VO.
  • the server 20 may generate the virtual object VO and then distribute the virtual object VO to the terminal device 10-K.
  • the server 20 may control the transparency of the virtual object VO distributed to the terminal device 10-K.
  • the information processing system 1A according to the second embodiment and the information processing system 1B according to the third embodiment may also be modified in the same way as the first embodiment.
  • the terminal device according to this modification may control the transmittance of the virtual object VO according to the occupancy rate of the virtual object VO only when a real object RO exists behind the virtual object VO.
  • the recognition section 116 provided in the terminal devices 10A-K according to the second embodiment may be combined with the first acquisition section 117 and the second acquisition section 118 provided in the terminal devices 10B-K according to the third embodiment. In that case, the terminal device according to this modification may control the transmittance of the virtual object VO according to the type of the real object RO, the first distance L1, and the second distance L2.
  • the storage devices 12 to 12B and the storage device 22 are exemplified as ROM and RAM, but they may also be flexible disks, magneto-optical disks (for example, compact discs, digital versatile discs, Blu-ray (registered trademark) discs), smart cards, flash memory devices (for example, cards, sticks, key drives), CD-ROMs (Compact Disc-ROMs), registers, removable disks, hard disks, floppy (registered trademark) disks, magnetic strips, databases, servers, or other suitable storage media.
  • the program may also be transmitted from a network via a telecommunications line. Further, the program may be transmitted from the communication network NET via a telecommunications line.
  • the information, signals, etc. described may be represented using any of a variety of different technologies.
  • data, instructions, commands, information, signals, bits, symbols, chips, and the like, which may be referred to throughout the above description, may be represented by voltages, currents, electromagnetic waves, magnetic fields or magnetic particles, light fields or photons, or any combination of these.
  • the input/output information may be stored in a specific location (for example, memory) or may be managed using a management table. Information etc. to be input/output may be overwritten, updated, or additionally written. The output information etc. may be deleted. The input information etc. may be transmitted to other devices.
  • the determination may be made using a value expressed using 1 bit (0 or 1) or a truth value (Boolean: true or false).
  • the comparison may be performed by comparing numerical values (for example, comparing with a predetermined value).
  • each function illustrated in FIGS. 1 to 10 is realized by an arbitrary combination of at least one of hardware and software.
  • the method for realizing each functional block is not particularly limited. That is, each functional block may be realized using one physically or logically coupled device, or may be realized by directly or indirectly connecting (for example, by wire or wirelessly) two or more physically or logically separated devices and using these plural devices.
  • the functional block may be realized by combining software with the one device or the plurality of devices.
  • the programs exemplified in the above-described embodiments, whether referred to as software, firmware, middleware, microcode, hardware description language, or by any other name, should be construed broadly to mean instructions, instruction sets, code, code segments, program code, programs, subprograms, software modules, applications, software applications, software packages, routines, subroutines, objects, executables, threads of execution, procedures, functions, and the like.
  • software, instructions, information, etc. may be sent and received via a transmission medium.
  • for example, when software is transmitted from a website, server, or other remote source using wired technology (coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), etc.) and/or wireless technology (infrared, microwave, etc.), these wired and/or wireless technologies are included within the definition of a transmission medium.
  • the information, parameters, etc. described in this disclosure may be expressed using absolute values, relative values from a predetermined value, or other corresponding information.
  • the terminal devices 10-1 to 10-J, 10A-1 to 10A-J, and 10B-1 to 10B-J, and the server 20 may in some cases be mobile stations (MS).
  • a mobile station may be referred to by a person skilled in the art as a subscriber station, mobile unit, subscriber unit, wireless unit, remote unit, mobile device, wireless device, wireless communication device, remote device, mobile subscriber station, access terminal, mobile terminal, wireless terminal, remote terminal, handset, user agent, mobile client, client, or some other suitable term. Further, in the present disclosure, terms such as "mobile station," "user terminal," "user equipment (UE)," and "terminal" may be used interchangeably.
  • the terms "connected" and "coupled," or any variation thereof, mean any direct or indirect connection or coupling between two or more elements, and can include the presence of one or more intermediate elements between two elements that are "connected" or "coupled" to each other.
  • the coupling or connection between elements may be a physical coupling or connection, a logical coupling or connection, or a combination thereof.
  • connection may be replaced with "access.”
  • for example, two elements may be considered to be "connected" or "coupled" to each other using one or more wires, cables, and/or printed electrical connections, as well as, as some non-limiting and non-exhaustive examples, electromagnetic energy having wavelengths in the radio frequency, microwave, and optical (both visible and invisible) ranges.
  • the terms "judgment" and "decision" used in this disclosure may encompass a wide variety of operations.
  • "judgment" and "decision" may include, for example, regarding judging, calculating, computing, processing, deriving, investigating, looking up, searching, or inquiring (e.g., searching in a table, database, or other data structure), and ascertaining as having made a "judgment" or "decision."
  • "judgment" and "decision" may include regarding receiving (e.g., receiving information), transmitting (e.g., transmitting information), input, output, and accessing (e.g., accessing data in memory) as having made a "judgment" or "decision."
  • "judgment" and "decision" may include regarding resolving, selecting, choosing, establishing, comparing, and the like as having made a "judgment" or "decision."
  • "judgment" and "decision" may include regarding some action as having been "judged" or "determined."
  • "judgment (decision)" may be read as "assuming," "expecting," "considering," and the like.
  • notification of prescribed information is not limited to explicit notification, and may also be performed implicitly (for example, by not notifying the prescribed information).

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

A display control device according to the present invention comprises: a display control unit that, through a display device, causes a user to recognize a virtual space in which a virtual object is superimposed on a real space, by displaying an image representing a virtual object on a display device; and a transmissivity control unit that controls transmissivity of the virtual object in accordance with an occupancy rate of area of a display region of the display device that the virtual object occupies.

Description

Display control device
The present invention relates to a display control device.
When virtual objects overlap in a virtual space, a display control device that displays the virtual space on a display device may control the transparency of the virtual objects in order to increase the visibility of both virtual objects for the user.
For example, Patent Document 1 discloses a program that allows the user to easily and smoothly operate each virtual object in a virtual space without causing the user any trouble in operating the terminal. The program causes a computer to function as an object determining unit that determines at least one of the virtual objects as a target of transparency processing, and as an object control unit that applies transparency processing to the determined object at a transmittance determined according to its positional relationship with a virtual camera.
Japanese Patent Application Publication No. 2016-016319
When a user uses a smartphone or XR glasses that provide an augmented reality space or a mixed reality space in which a virtual object is superimposed on the real space, cases arise not only where virtual objects overlap one another, but also where a virtual object existing in the virtual space overlaps an object existing in the real space.
However, the technology according to Patent Document 1 addresses only the superposition of virtual objects existing in a virtual space, and does not address the superposition of a virtual object and an object existing in real space.
Therefore, an object of the present invention is to provide a display control device that can improve the visibility of an object existing in real space when a virtual object overlaps that object.
A first display control device according to a preferred aspect of the present invention comprises: a display control section that, by displaying an image showing a virtual object on a display device, causes a user to recognize, via the display device, a virtual space in which the virtual object is superimposed on a real space; and a transmittance control section that controls the transmittance of the virtual object according to the occupancy rate of the area occupied by the virtual object in the display area of the display device.
A second display control device according to a preferred aspect of the present invention comprises: an imaging device that captures an image of a real space in which a real object exists; a recognition unit that recognizes the type of the real object; a display control unit that causes a user to recognize, via a display device, a virtual space in which a virtual object is superimposed on the real space, by causing the display device to display an image representing the virtual object; and a transmittance control unit that controls the transmittance of the virtual object in accordance with the type of the real object located behind the virtual object.
According to the present invention, when a virtual object and an object existing in real space overlap, it is possible to improve the visibility of the object existing in real space.
FIG. 1 is a diagram showing the overall configuration of an information processing system 1 according to a first embodiment.
FIG. 2A is a diagram showing an example of a real space.
FIG. 2B is a diagram showing an example of a mixed reality space in which a virtual object is superimposed on the real space.
FIG. 3 is a block diagram showing a configuration example of a terminal device 10-K.
FIG. 4 is a block diagram showing a configuration example of a server 20.
FIG. 5 is a flowchart showing the operation of the terminal device 10-K.
FIG. 6 is a block diagram showing a configuration example of a terminal device 10A-K.
FIG. 7 is a flowchart showing the operation of the terminal device 10A-K.
FIG. 8 is a block diagram showing a configuration example of a terminal device 10B-K.
FIG. 9 is an explanatory diagram of a first distance and a second distance.
FIG. 10 is a flowchart showing the operation of the terminal device 10B-K.
1: First Embodiment
Hereinafter, the configuration of an information processing system 1 including a terminal device 10 as a display control device according to a first embodiment of the present invention will be described with reference to FIGS. 1 to 5.
1-1: Configuration of First Embodiment
1-1-1: Overall Configuration
FIG. 1 shows the overall configuration of the information processing system 1 according to the first embodiment. As shown in FIG. 1, the information processing system 1 includes terminal devices 10-1, 10-2, ... 10-K, ... 10-J, and a server 20. J is an integer of 1 or more, and K is an integer of 1 or more and J or less. In this embodiment, the terminal devices 10-1 to 10-J have the same configuration as one another, although a terminal device whose configuration differs from that of the other terminal devices may be included.
In the information processing system 1, the terminal devices 10-1 to 10-J and the server 20 are communicably connected to each other via a communication network NET. Note that in FIG. 1, it is assumed that a user U_K uses the terminal device 10-K.
The server 20 provides various data and cloud services to the terminal devices 10-1 to 10-J via the communication network NET. In particular, the server 20 provides the terminal devices 10-1 to 10-J with various types of content to be displayed in the virtual space.
The terminal device 10-K causes a virtual object to be displayed on a display 15 provided in the terminal device 10-K, or on XR glasses that are connected to the terminal device 10-K and worn on the head of the user U_K. Here, "XR glasses" is a general term for VR (Virtual Reality) glasses, AR (Augmented Reality) glasses, and MR (Mixed Reality) glasses. The "display 15 provided in the terminal device 10-K" and the "XR glasses connected to the terminal device 10-K and worn on the head of the user U_K" are each an example of a "display device."
A "virtual object" is, for example, a virtual object representing data such as a still image, a moving image, a 3DCG model, an HTML file, or a text file, or a virtual object representing an application. Examples of text files include memos and source code. Examples of applications include a browser, an application for using an SNS, and an application for generating document files.
Note that the terminal device 10-K is preferably a mobile terminal device such as a smartphone or a tablet, for example.
FIGS. 2A and 2B are diagrams showing examples of mixed reality spaces in which virtual objects are superimposed on real spaces.
FIG. 2A shows an example of the real space RS displayed on the display device. The real space RS shown in FIG. 2A is a university classroom. Among the objects existing in the classroom are a desk T1, chairs C1 to C4, and a bulletin board NB. As viewed from the user of the display device, the desk T1 and the chairs C1 to C4 are located in front of the bulletin board NB, and the chairs C1 to C4 surround the desk T1.
FIG. 2B shows an example of a mixed reality space MS in which a virtual object VO is superimposed on the real space RS shown in FIG. 2A. In the mixed reality space MS, the virtual object VO is located in front of the desk T1, the chairs C1 to C4, and the bulletin board NB as viewed from the user of the display device. Furthermore, as viewed from the user, the virtual object VO is large enough to cover the desk T1, the chairs C1 to C4, and part of the bulletin board NB. As a result, if the user moves forward in the real space RS while looking at the display device, there is a risk that the user collides with the desk T1 or one of the chairs C1 to C4 without noticing them, because they are hidden by the virtual object VO. Furthermore, since part of the material posted on the bulletin board NB is hidden by the virtual object VO, the user cannot know the content of that part. Therefore, the terminal device 10 controls the transmittance of the virtual object VO so that the user can see the desk T1, the chairs C1 to C4, and the bulletin board NB.
As another example, although not illustrated, when the user of the display device moves through a city while viewing the mixed reality space displayed on the display device, a situation may arise in which, for example, traffic lights and traffic signs are blocked by virtual objects, creating a danger for the user. As another case, a signboard installed on a building may be blocked by a virtual object, preventing the user from obtaining information that should otherwise be available. The terminal device 10 therefore controls the transmittance of the virtual object as described above.
1-1-2: Configuration of Terminal Device
FIG. 3 is a block diagram showing a configuration example of the terminal device 10-K. The terminal device 10-K includes a processing device 11, a storage device 12, a communication device 13, a positioning device 14, a display 15, an input device 16, and an inertial sensor 17. The elements of the terminal device 10-K are interconnected by one or more buses for communicating information.
The processing device 11 is a processor that controls the entire terminal device 10-K. Further, the processing device 11 is configured using, for example, a single chip or a plurality of chips. The processing device 11 is configured using, for example, a central processing unit (CPU) that includes an interface with peripheral devices, an arithmetic unit, registers, and the like. Note that some or all of the functions of the processing device 11 may be implemented using hardware such as a DSP, ASIC, PLD, or FPGA. The processing device 11 executes various processes in parallel or sequentially.
The storage device 12 is a recording medium that can be read and written by the processing device 11. The storage device 12 stores a plurality of programs, including the control program PR1 executed by the processing device 11, and also stores image information representing images to be displayed on the display 15. In particular, the storage device 12 stores image information representing images used by the generation unit 111, described later, to generate the virtual object VO.
The communication device 13 is hardware as a transmitting/receiving device for communicating with other devices. The communication device 13 is also called, for example, a network device, a network controller, a network card, a communication module, or the like. The communication device 13 may include a connector for wired connection and an interface circuit corresponding to the connector. Furthermore, the communication device 13 may include a wireless communication interface. Examples of connectors and interface circuits for wired connections include products compliant with wired LAN, IEEE1394, and USB. Furthermore, examples of the wireless communication interface include products compliant with wireless LAN, Bluetooth (registered trademark), and the like.
The positioning device 14 acquires position information. The positioning device 14 may be, for example, a GPS (Global Positioning System) device. In that case, the positioning device 14 receives radio waves from a plurality of satellites and generates GPS information as position information from the received radio waves. The position information may be in any format as long as the position can be specified; for example, it indicates the latitude and longitude of the terminal device 10-K. The acquired position information is output to the processing device 11.
Alternatively, the positioning device 14 may be, for example, a VPS (Visual Positioning System) device. In that case, the positioning device 14 acquires, from an imaging device (not shown), image information representing an image obtained by capturing the scenery in front of the user U_K. The positioning device 14 outputs the acquired image information, via the communication device 13, to a location information server (not shown), and acquires, via the communication device 13, VPS information as position information from the location information server. This position information includes the position of the user U_K in the real space RS and the direction in which the user U_K views the real space RS.
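A minimal sketch of this VPS exchange is shown below, assuming a hypothetical location information server reachable over HTTP; the URL, field names, and response format are illustrative assumptions, not part of this disclosure.

import requests  # third-party HTTP client

def acquire_vps_position(frame_jpeg: bytes) -> dict:
    # Send the captured frame to the (hypothetical) location information server.
    response = requests.post(
        "https://vps.example.com/localize",  # illustrative endpoint
        files={"image": ("frame.jpg", frame_jpeg, "image/jpeg")},
        timeout=5.0,
    )
    response.raise_for_status()
    # Assumed response body:
    # {"position": [x, y, z], "direction": [dx, dy, dz]}
    return response.json()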
The display 15 is a device that displays images and text information. The display 15 displays various images under the control of the processing device 11. For example, various display panels such as a liquid crystal display panel and an organic EL (Electro Luminescence) display panel are suitably used as the display 15. Note that when XR glasses are connected to the terminal device 10-K, the display 15 does not have to be an essential component. Specifically, by using the XR glasses as the display 15, the terminal device 10-K may be configured without the display 15. The display 15 and XR glasses are examples of display devices.
The input device 16 accepts operations from the user U_K. For example, the input device 16 includes a keyboard, a touch pad, a touch panel, or a pointing device such as a mouse. When the input device 16 includes a touch panel, it may also serve as the display 15.
The inertial sensor 17 is a sensor that detects inertial force. The inertial sensor 17 includes, for example, one or more of an acceleration sensor, an angular velocity sensor, and a gyro sensor. The processing device 11 detects the attitude of the terminal device 10-K based on the output of the inertial sensor 17. Based on the attitude of the terminal device 10-K, the processing device 11 accepts selection of a virtual object VO, input of characters, and input of instructions in the mixed reality space MS. For example, when the user U_K operates the input device 16 while pointing the central axis of the terminal device 10-K toward a predetermined region of the mixed reality space MS, the virtual object VO placed in that region is selected. The operation of the user U_K on the input device 16 is, for example, a double tap. In this way, by operating the terminal device 10-K, the user U_K can select the virtual object VO without looking at the input device 16.
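A sketch of this attitude-based selection is given below; representing the device's central axis as a ray and virtual objects as bounding spheres is an illustrative assumption, not the literal implementation.

import math
from dataclasses import dataclass

@dataclass
class VirtualObject:
    center: tuple[float, float, float]  # position in the mixed reality space MS
    radius: float                       # bounding-sphere radius (assumption)

def ray_hits(origin, direction, obj: VirtualObject) -> bool:
    # Project the object's center onto the ray (direction assumed unit length).
    oc = [c - o for c, o in zip(obj.center, origin)]
    t = sum(a * b for a, b in zip(oc, direction))
    if t < 0.0:
        return False  # the object is behind the device
    closest = [o + t * d for o, d in zip(origin, direction)]
    return math.dist(closest, obj.center) <= obj.radius

def select_on_double_tap(origin, axis, objects):
    # Return the nearest virtual object on the device's central axis, if any.
    hits = [o for o in objects if ray_hits(origin, axis, o)]
    return min(hits, key=lambda o: math.dist(origin, o.center), default=None)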
The processing device 11 reads the control program PR1 from the storage device 12 and executes the control program PR1. As a result, the processing device 11 functions as a generation unit 111, a display control unit 112, a calculation unit 113, a transmittance control unit 114, and a communication control unit 115.
The generation unit 111 generates a virtual object VO. The generation unit 111 may generate the virtual object VO using image information stored in the storage device 12. Alternatively, the generation unit 111 may acquire image information from the server 20 via the communication device 13 and generate the virtual object VO using the acquired image information.
The display control unit 112 causes the display 15 to display an image representing the virtual object VO, thereby causing the user U_K to recognize, via the display 15, the mixed reality space MS in which the virtual object VO is superimposed on the real space RS. In particular, when XR glasses are connected to the terminal device 10-K as the display 15 as described above, the display control unit 112 causes the XR glasses (display 15) to display the image representing the virtual object VO in a situation where light incident from the outside world passes through the XR glasses, thereby causing the user U_K to recognize the mixed reality space MS.
The calculation unit 113 calculates the occupancy rate of the area occupied by the virtual object VO displayed by the display control unit 112 in the display area of the display 15.
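One way to realize the calculation unit 113 is sketched below, assuming the renderer can provide a Boolean per-pixel mask of where the virtual object VO is drawn; the mask representation is an assumption for illustration.

def occupancy_rate(mask: list[list[bool]]) -> float:
    # mask[y][x] is True where the virtual object VO is rendered.
    total = sum(len(row) for row in mask)
    covered = sum(1 for row in mask for pixel in row if pixel)
    return covered / total if total else 0.0

# Example: a display area of 100 x 100 pixels with 4,000 covered pixels
# yields an occupancy rate of 0.4 (40%).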
The transmittance control unit 114 controls the transmittance of the virtual object VO in accordance with the occupancy rate of the area occupied by the virtual object VO calculated by the calculation unit 113. Specifically, the transmittance control unit 114 increases the transmittance of the virtual object VO as the calculated occupancy rate increases. This is because the larger the area occupied by the virtual object VO in the display area of the display 15, the larger the region in front of the user U_K in the real space RS, more specifically the region behind the virtual object VO, that the user cannot see. Therefore, the higher the occupancy rate, the more transparent the transmittance control unit 114 makes the virtual object VO.
Here, the occupancy rate of the virtual object VO and the transmittance of the virtual object VO may be in a proportional relationship. For example, when the occupancy rate is 0%, the transmittance control unit 114 may set the transmittance of the virtual object VO to 0%, and when the occupancy rate is 90%, the transmittance control unit 114 may set the transmittance of the virtual object VO to 90%. Alternatively, the transmittance control unit 114 may monotonically increase the transmittance of the virtual object VO as the occupancy rate of the virtual object VO increases, without making the two strictly proportional.
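The proportional rule and a monotonic but non-proportional alternative described above can be sketched as follows; the easing exponent in the second variant is an illustrative choice.

def transmittance_proportional(occupancy: float) -> float:
    # 0.0 occupancy -> 0% transmittance, 0.9 occupancy -> 90% transmittance.
    return 100.0 * max(0.0, min(1.0, occupancy))

def transmittance_monotonic(occupancy: float, gamma: float = 2.0) -> float:
    # Monotonically increasing in occupancy, but not proportional to it.
    occupancy = max(0.0, min(1.0, occupancy))
    return 100.0 * occupancy ** gamma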
As described above, the larger the area over which a real object RO, that is, an object existing in the real space RS, is hidden by the virtual object VO, the less the user U_K can obtain the information that the real object RO would otherwise provide. In addition, when a real object RO located in the direction of travel of the user U_K is hidden by the virtual object VO, the danger arising from the movement of the user U_K increases. The terminal device 10-K increases the transmittance of the virtual object VO as the occupancy rate of the area occupied by the virtual object VO in the display area of the display 15 increases. As a result, the user U_K can obtain the information that the user should originally obtain, and the increase in danger accompanying the movement of the user U_K is suppressed.
The communication control unit 115 causes the communication device 13 to transmit, to the server 20, operation information indicating operations performed by the user U_K using the terminal device 10-K. In particular, the communication control unit 115 causes the communication device 13 to transmit, to the server 20, operation information indicating operations performed by the user U_K on the virtual object VO.
1-1-3: Server Configuration
FIG. 4 is a block diagram showing a configuration example of the server 20. The server 20 includes a processing device 21, a storage device 22, a communication device 23, a display 24, and an input device 25. The elements of the server 20 are interconnected by one or more buses for communicating information.
The processing device 21 is a processor that controls the entire server 20. Further, the processing device 21 is configured using, for example, a single chip or a plurality of chips. The processing device 21 is configured using, for example, a central processing unit (CPU) that includes an interface with peripheral devices, an arithmetic unit, registers, and the like. Note that some or all of the functions of the processing device 21 may be implemented using hardware such as a DSP, ASIC, PLD, or FPGA. The processing device 21 executes various processes in parallel or sequentially.
The storage device 22 is a recording medium that can be read and written by the processing device 21. The storage device 22 stores a plurality of programs, including the control program PR2 executed by the processing device 21, and also stores image information representing images to be displayed on the display 24.
The communication device 23 is hardware as a transmitting/receiving device for communicating with other devices. The communication device 23 is also called, for example, a network device, a network controller, a network card, a communication module, or the like. The communication device 23 may include a connector for wired connection and an interface circuit corresponding to the connector. Furthermore, the communication device 23 may include a wireless communication interface. Examples of connectors and interface circuits for wired connections include products compliant with wired LAN, IEEE1394, and USB. Furthermore, examples of the wireless communication interface include products compliant with wireless LAN, Bluetooth (registered trademark), and the like.
The display 24 is a device that displays images and text information. The display 24 displays various images under the control of the processing device 21. For example, various display panels such as a liquid crystal display panel and an organic EL (Electro Luminescence) display panel are suitably used as the display 24.
The input device 25 accepts operations from the administrator of the information processing system 1. For example, the input device 25 includes a keyboard, a touch pad, a touch panel, or a pointing device such as a mouse. Here, when the input device 25 includes a touch panel, it may also serve as the display 24.
The processing device 21 reads the control program PR2 from the storage device 22 and executes the control program PR2. As a result, the processing device 21 functions as an acquisition unit 211 and a communication control unit 212.
The acquisition unit 211 acquires operation information indicating the operation of the user U_K from the terminal device 10-K via the communication device 23.
The communication control unit 212 causes the communication device 23 to transmit, to the terminal device 10-K, various data for providing various contents and cloud services. In particular, the communication control unit 212 causes the communication device 23 to transmit, to the terminal device 10-K, various data necessary for the user of the terminal device 10-K to experience the mixed reality space MS. Furthermore, when the terminal device 10-K generates a virtual object VO using image information stored in the server 20, the communication control unit 212 reads the image information from the storage device 22 and causes the communication device 23 to transmit the image information to the terminal device 10-K.
1-2: Operation of First Embodiment
FIG. 5 is a flowchart showing the operation of the terminal device 10-K as the display control device according to the first embodiment. The operation of the terminal device 10-K will be described below with reference to FIG. 5.
In step S1, the processing device 11 functions as the generation unit 111. The processing device 11 generates a virtual object VO.
In step S2, the processing device 11 functions as the display control unit 112. The processing device 11 causes the display 15 to display the image representing the virtual object VO generated in step S1, thereby causing the display 15 to present the mixed reality space MS in which the virtual object VO is superimposed on the real space RS.
In step S3, the processing device 11 functions as the calculation unit 113. The processing device 11 calculates the occupancy rate of the area occupied by the virtual object VO displayed by the display control unit 112 in the display area of the display 15.
In step S4, the processing device 11 functions as the transmittance control unit 114. The processing device 11 controls the transmittance of the virtual object VO according to the occupancy rate calculated in step S3. Thereafter, the terminal device 10-K ends all the processes shown in the flowchart of FIG. 5.
1-3: Effects of First Embodiment
The terminal device 10-K as the display control device according to the present embodiment includes the display control unit 112 and the transmittance control unit 114. The display control unit 112 causes the display 15 serving as a display device to display an image representing the virtual object VO, thereby causing the user U_K to recognize, via the display 15, the mixed reality space MS in which the virtual object VO is superimposed on the real space RS. The transmittance control unit 114 controls the transmittance of the virtual object VO in accordance with the occupancy rate of the area occupied by the virtual object VO in the display area of the display 15.
Because the terminal device 10-K has the above configuration, it is possible to improve the visibility of an object existing in the real space RS when the virtual object VO overlaps that object in an augmented reality space or a mixed reality space.
The larger the area over which a real object RO, that is, an object existing in the real space RS, is hidden by the virtual object VO, the less the user U_K can obtain the information that the real object RO would otherwise provide. In addition, when a real object RO located in the direction of travel of the user U_K is hidden by the virtual object VO, the danger accompanying the movement of the user U_K increases. The terminal device 10-K increases the transmittance of the virtual object VO as the occupancy rate of the area occupied by the virtual object VO on the screen displayed by the display 15 increases. As a result, the user U_K obtains the information that the user should originally obtain, and the increase in danger accompanying the movement of the user U_K is suppressed.
2: Second Embodiment
The configuration of an information processing system 1A including a terminal device 10A as a display control device according to a second embodiment of the present invention will be described below with reference to FIGS. 6 and 7.
2-1: Configuration of Second Embodiment
2-1-1: Overall Configuration
The information processing system 1A according to the second embodiment differs from the information processing system 1 according to the first embodiment in that it includes terminal devices 10A-1, 10A-2, ... 10A-K, ... 10A-J instead of the terminal devices 10-1, 10-2, ... 10-K, ... 10-J. In other respects, the overall configuration of the information processing system 1A is the same as that of the information processing system 1 shown in FIG. 1, and illustration thereof is therefore omitted.
In the following, among the components of the information processing system 1A according to the second embodiment, components identical to those of the information processing system 1 according to the first embodiment are denoted by the same reference numerals, and description of their functions is omitted for simplicity.
2-1-2: Configuration of Terminal Device
FIG. 6 is a block diagram showing a configuration example of the terminal device 10A-K. Unlike the terminal device 10-K according to the first embodiment, the terminal device 10A-K includes a processing device 11A instead of the processing device 11 and a storage device 12A instead of the storage device 12. The terminal device 10A-K also includes an imaging device 18 in addition to the components of the terminal device 10-K.
The imaging device 18 captures an image of the external real space RS in which objects exist, and outputs imaging information representing the captured image. The imaging device 18 includes, for example, a lens, an image sensor, an amplifier, and an AD converter. The image sensor converts light condensed through the lens into an imaging signal, which is an analog signal. The amplifier amplifies the imaging signal and outputs the amplified imaging signal to the AD converter. The AD converter converts the amplified imaging signal into imaging information, which is a digital signal. The imaging information is output to the processing device 11A. Note that, when XR glasses connected to the terminal device 10A-K are used instead of the display 15 as described above, the terminal device 10A-K may use an imaging device provided in the XR glasses instead of the imaging device 18. The "object" here is an example of a "real object."
The storage device 12A stores a control program PR1A instead of the control program PR1 included in the storage device 12 according to the first embodiment. Furthermore, the storage device 12A further stores a learning model LM.
The learning model LM is used by the recognition unit 116, described later, to recognize the type of an object included in an image obtained by imaging with the imaging device 18.
The learning model LM is generated by learning teacher data in a learning phase. The teacher data used to generate the learning model LM includes a plurality of one-to-one pairs of a photograph of an object and the type of that object.
Furthermore, the learning model LM is generated outside the terminal devices 10A-K. In particular, it is preferable that the learning model LM is generated in a second server (not shown). In this case, the terminal devices 10A-K acquire the learning model LM from the second server via the communication network NET.
The processing device 11A reads the control program PR1A from the storage device 12A and executes the control program PR1A. As a result, the processing device 11A functions as a transmittance control unit 114A and a recognition unit 116, in addition to the generation unit 111, the display control unit 112, and the communication control unit 115, which are the same as in the first embodiment.
The recognition unit 116 uses the learning model LM to recognize the type of the real object RO existing in the real space RS. Specifically, the recognition unit 116 recognizes the type of one or more real objects existing in the real space RS before the virtual object VO is generated.
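A sketch of how the recognition unit 116 might apply the learning model LM to a captured frame is shown below; the detector interface, confidence threshold, and label set are assumptions for illustration, since the patent does not fix them.

def recognize_types(frame, detector, threshold: float = 0.5) -> list[str]:
    # `detector` is assumed to wrap the learning model LM and to return
    # (label, confidence) pairs for each object found in the frame.
    results = detector.detect(frame)  # hypothetical interface
    return [label for label, confidence in results if confidence >= threshold]

# Illustrative result for the classroom of FIG. 2A:
# ["desk", "chair", "chair", "chair", "chair", "bulletin board"]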
Referring to FIGS. 2A and 2B, the recognition unit 116 recognizes that the type of desk T1, which is the real object RO existing in the real space RS, is "desk". Similarly, the recognition unit 116 recognizes that the types of chairs C1 to C4, which are real objects RO existing in the real space RS, are "chairs." Further, the recognition unit 116 recognizes that the type of the bulletin board NB, which is the real object RO existing in the real space RS, is a "bulletin board".
The transmittance control unit 114A controls the transmittance of the virtual object VO in accordance with the type of the real object RO located behind the virtual object VO. In the example shown in FIG. 2B, the transmittance control unit 114A controls the transmittance of the virtual object VO in accordance with the types of the real objects RO located behind the virtual object VO being "desk", "chair", and "bulletin board".
Here, the transmittance control unit 114A may control the transmittance of the virtual object VO in accordance with the combination of the types of these real objects RO. In the example shown in FIG. 2B, the transmittance control unit 114A may control the transmittance of the virtual object VO based on the fact that the combination of the types of the real objects RO located behind the virtual object VO is the combination of the three types "desk", "chair", and "bulletin board".
Alternatively, the transmittance control unit 114A may control the transmittance of the virtual object VO based on the type of any one of the one or more real objects RO located behind the virtual object VO. In the example shown in FIG. 2B, the transmittance control unit 114A may control the transmittance of the virtual object VO based on the fact that a real object RO of the type "desk" is located behind the virtual object VO. Alternatively, the transmittance control unit 114A may control the transmittance of the virtual object VO based on the fact that a real object RO of the type "chair" is located behind the virtual object VO, or based on the fact that a real object RO of the type "bulletin board" is located behind the virtual object VO.
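One possible policy covering both the combination-based control and the single-type control described above is sketched below; the per-type transmittance values and the rule of taking the maximum over the types behind the virtual object are illustrative assumptions.

# Illustrative per-type transmittance (%); higher values for types whose
# occlusion is more dangerous or more informative for the user.
TYPE_TRANSMITTANCE = {
    "desk": 70.0,
    "chair": 70.0,
    "bulletin board": 50.0,
    "traffic light": 100.0,
    "traffic sign": 100.0,
}

def transmittance_for_types(types_behind: list[str]) -> float:
    # Taking the maximum means that a single high-risk object behind the
    # virtual object VO is enough to make it highly transparent.
    return max((TYPE_TRANSMITTANCE.get(t, 30.0) for t in types_behind),
               default=0.0)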
When a real object RO existing in the real space RS is hidden by the virtual object VO, the degree of danger to the user U_K differs depending on the type of the hidden real object RO. The terminal device 10A-K recognizes the type of the real object RO and controls the transmittance of the virtual object VO in accordance with the recognized type. As a result, when the danger arising from the virtual object VO blocking the view of the user U_K is relatively high, the terminal device 10A-K makes the transmittance of the virtual object VO relatively high, thereby suppressing the occurrence of such high-risk situations. On the other hand, when that danger is relatively low, the terminal device 10A-K makes the transmittance of the virtual object VO relatively low, thereby ensuring the visibility of the virtual object VO for the user U_K.
2-2: Operation of Second Embodiment
FIG. 7 is a flowchart showing the operation of the terminal device 10A-K as the display control device according to the second embodiment. The operation of the terminal device 10A-K will be described below with reference to FIG. 7.
In step S11, the imaging device 18 images the external real space RS where the real object RO exists.
In step S12, the processing device 11A functions as the recognition unit 116. The processing device 11A uses the learning model LM to recognize the type of the real object RO existing in the real space RS.
In step S13, the processing device 11A functions as the generation unit 111. The processing device 11A generates a virtual object VO.
In step S14, the processing device 11A functions as the display control unit 112. The processing device 11A causes the display 15 to display the image representing the virtual object VO generated in step S13, thereby causing the user U_K to recognize, via the display 15, the mixed reality space MS in which the virtual object VO is superimposed on the real space RS.
In step S15, the processing device 11A functions as the transmittance control unit 114A. The processing device 11A controls the transmittance of the virtual object VO in accordance with the type of the real object RO located behind the virtual object VO. Thereafter, the terminal device 10A-K ends all the processes shown in the flowchart of FIG. 7.
2-3: Effects of Second Embodiment
The terminal device 10A-K as the display control device according to the present embodiment includes the imaging device 18, the recognition unit 116, the display control unit 112, and the transmittance control unit 114A. The imaging device 18 captures an image of the real space RS in which real objects RO exist. The recognition unit 116 recognizes the types of the real objects RO. The display control unit 112 causes the display 15 serving as a display device to display an image representing the virtual object VO, thereby causing the user U_K to recognize, via the display 15, the mixed reality space MS in which the virtual object VO is superimposed on the real space RS. The transmittance control unit 114A controls the transmittance of the virtual object VO in accordance with the type of the real object RO located behind the virtual object VO.
Because the terminal device 10A-K has the above configuration, it is possible to improve the visibility of an object existing in the real space RS when the virtual object VO overlaps that object in an augmented reality space or a mixed reality space.
When a real object RO, that is, an object existing in the real space RS, is hidden by the virtual object VO, the degree of danger to the user U_K differs depending on the type of the hidden real object RO. The terminal device 10A-K recognizes the type of the real object RO and controls the transmittance of the virtual object VO in accordance with the recognized type. As a result, when the danger arising from the virtual object VO blocking the view of the user U_K is relatively high, the terminal device 10A-K makes the transmittance of the virtual object VO relatively high, thereby suppressing the occurrence of such high-risk situations. On the other hand, when that danger is relatively low, the terminal device 10A-K makes the transmittance of the virtual object VO relatively low, thereby ensuring the visibility of the virtual object VO for the user U_K.
3: Third Embodiment
The configuration of an information processing system 1B including a terminal device 10B as a display control device according to a third embodiment of the present invention will be described below with reference to FIGS. 8 to 10.
3-1: Configuration of Third Embodiment
3-1-1: Overall Configuration
The information processing system 1B according to the third embodiment differs from the information processing system 1 according to the first embodiment in that it includes terminal devices 10B-1, 10B-2, ... 10B-K, ... 10B-J instead of the terminal devices 10-1, 10-2, ... 10-K, ... 10-J. In other respects, the overall configuration of the information processing system 1B is the same as that of the information processing system 1 shown in FIG. 1, and illustration thereof is therefore omitted.
In the following, among the components of the information processing system 1B according to the third embodiment, components identical to those of the information processing system 1 according to the first embodiment are denoted by the same reference numerals, and description of their functions is omitted for simplicity.
3-1-2: Configuration of Terminal Device
FIG. 8 is a block diagram showing a configuration example of the terminal device 10B-K. Unlike the terminal device 10-K according to the first embodiment, the terminal device 10B-K includes a processing device 11B instead of the processing device 11 and a storage device 12B instead of the storage device 12.
The storage device 12B stores a control program PR1B instead of the control program PR1 included in the storage device 12 according to the first embodiment.
The processing device 11B reads the control program PR1B from the storage device 12B and executes the control program PR1B. As a result, the processing device 11B functions as a transmittance control unit 114B, a first acquisition unit 117, and a second acquisition unit 118, in addition to the generation unit 111, the display control unit 112, the calculation unit 113, and the communication control unit 115, which are the same as in the first embodiment.
The first acquisition unit 117 acquires the distance from the virtual object VO to the real object RO located behind the virtual object VO in the mixed reality space MS. The distance is an example of a "first distance."
FIG. 9 is an explanatory diagram of the first distance and of a second distance described below. FIG. 9 shows the mixed reality space MS of FIG. 2B viewed from another angle. In the mixed reality space MS shown in FIG. 9, the first distance L1 is the distance between the nearest real object RO and the virtual object VO. The nearest real object RO is, of the plurality of real objects RO located behind the virtual object VO as viewed from the user U_K, the real object RO closest to the virtual object VO. The plurality of real objects RO located behind the virtual object VO are the desk T1, the chairs C1 to C4, and the bulletin board NB. More specifically, the first distance L1 is the distance between the proximity point of the nearest real object RO and the proximity point of the virtual object VO. The proximity point of the nearest real object RO is the point on its surface closest to the virtual object VO, and the proximity point of the virtual object VO is the point on its surface closest to the nearest real object RO. In the example shown in FIG. 9, the real object RO closest to the virtual object VO is the chair C3, so the first distance L1 is the distance between the point of the chair C3 closest to the virtual object VO and the point of the virtual object VO closest to the chair C3. However, the first distance L1 is not limited to this. For example, the first distance L1 may be the distance between the center of gravity of the nearest real object RO and the center of gravity of the virtual object VO.
Returning to FIG. 8, the second acquisition unit 118 acquires the distance from the display 15 as a display device to the virtual object VO in the mixed reality space MS. The distance is an example of a "second distance."
Referring to FIG. 9, the user U_K is holding the terminal device 10B-K, which includes the display 15. In the mixed reality space MS shown in FIG. 9, the second distance L2 is the distance between the point of the display 15 closest to the virtual object VO and the point of the virtual object VO closest to the display 15. However, the second distance L2 is not limited to this. For example, the second distance L2 may be the distance between the center of gravity of the display 15 and the center of gravity of the virtual object VO.
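A sketch of how the first acquisition unit 117 and the second acquisition unit 118 might compute the closest-point variants of L1 and L2 is shown below; representing each surface as a sampled set of 3D points is an assumption, since the patent leaves the geometric representation open.

import math

def closest_point_distance(points_a, points_b) -> float:
    # Minimum distance between two surfaces sampled as 3D point sets.
    return min(math.dist(p, q) for p in points_a for q in points_b)

def first_distance(vo_points, real_object_point_sets) -> float:
    # L1: distance from the virtual object VO to the nearest real object RO
    # located behind it.
    return min(closest_point_distance(vo_points, ro)
               for ro in real_object_point_sets)

def second_distance(display_points, vo_points) -> float:
    # L2: distance from the display 15 to the virtual object VO.
    return closest_point_distance(display_points, vo_points)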
Returning to FIG. 8, the transmittance control unit 114B controls the transmittance of the virtual object VO in accordance with the first distance L1 acquired by the first acquisition unit 117, in addition to the occupancy rate of the virtual object VO calculated by the calculation unit 113.
When the virtual object VO is not transparent, the longer the first distance L1 between the virtual object VO and the real object RO, that is, the farther apart the virtual object VO and the real object RO are from each other, the lower the danger associated with the user U_K being unable to see the real object RO. Conversely, the closer the virtual object VO and the real object RO are to each other, the higher the danger to the user U_K. Therefore, the transmittance control unit 114B lowers the transmittance of the virtual object VO as the first distance L1 increases.
As a result, when the virtual object VO and the real object RO are relatively close to each other, the terminal device 10B-K can suppress the increase in danger associated with the user U_K being unable to see the real object RO.
Alternatively, the transmittance control unit 114B may control the transmittance of the virtual object VO in accordance with the second distance L2 acquired by the second acquisition unit 118, in addition to the occupancy rate of the virtual object VO calculated by the calculation unit 113.
The longer the second distance L2 between the display 15 and the virtual object VO, that is, the farther apart the user U_K and the virtual object VO are from each other, the longer the distance the user U_K must travel to reach the virtual object VO, and therefore the lower the danger associated with the user U_K being unable to see the real object RO. Therefore, the transmittance control unit 114B lowers the transmittance of the virtual object VO as the second distance L2 increases. As a result, the closer the display 15 and the virtual object VO are to each other, and hence the closer the user U_K and the virtual object VO are to each other, the more the terminal device 10B-K suppresses the increase in danger associated with the user U_K being unable to see the real object RO.
As an example, let x be the second distance L2 between the display 15 and the virtual object VO. When the virtual object VO starts to become transparent at the distance x = a (m) and its transmittance reaches 100 (%) at the distance x = b (m) (with b < a), the transmittance control unit 114B may calculate the transmittance z (%) of the virtual object VO using the following formula (1):

    z = 100 × (a − x) / (a − b)   (b ≤ x ≤ a) … (1)
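For illustration, formula (1) can be transcribed directly into Python; the clamping of x outside the interval [b, a] is an assumption added here, since the embodiment defines the transmittance only between those two distances.

    def transmittance_from_l2(x: float, a: float, b: float) -> float:
        """Transmittance z (%) per formula (1), with x the second distance L2 (m).

        a: distance at which the virtual object VO starts to become transparent (z = 0).
        b: distance at which the transmittance reaches 100% (b < a).
        """
        if x >= a:
            return 0.0
        if x <= b:
            return 100.0
        return 100.0 * (a - x) / (a - b)

    # Example: transparency starts at a = 3.0 m and is complete at b = 1.0 m.
    print(transmittance_from_l2(2.0, a=3.0, b=1.0))  # 50.0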
Alternatively, the transmittance control unit 114B may control the transmittance of the virtual object VO according to both the first distance L1 acquired by the first acquisition unit 117 and the second distance L2 acquired by the second acquisition unit 118, in addition to the occupancy rate of the virtual object VO calculated by the calculation unit 113.
3-2: Operation of the Third Embodiment
FIG. 10 is a flowchart showing the operation of the terminal device 10B-K as a display control device according to the third embodiment. The operation of the terminal device 10B-K is described below with reference to FIG. 10.
In step S21, the processing device 11B functions as the generation unit 111 and generates a virtual object VO.
In step S22, the processing device 11B functions as the display control unit 112. By causing the display 15 to display an image showing the virtual object VO generated in step S21, the processing device 11B causes the user U-K to recognize, via the display 15, the mixed reality space MS in which the virtual object VO is superimposed on the real space RS.
In step S23, the processing device 11B functions as the calculation unit 113 and calculates the occupancy rate of the area occupied, within the display area of the display 15, by the virtual object VO displayed by the display control unit 112.
In step S24, the processing device 11B functions as the first acquisition unit 117 and acquires the first distance L1 from the virtual object VO to the real object RO located behind the virtual object VO in the mixed reality space MS.
In step S25, the processing device 11B functions as the second acquisition unit 118 and acquires the second distance L2 from the display 15 serving as the display device to the virtual object VO in the mixed reality space MS.
In step S26, the processing device 11B functions as the transmittance control unit 114B and controls the transmittance of the virtual object VO according to the occupancy rate calculated in step S23, the first distance L1 acquired in step S24, and the second distance L2 acquired in step S25. The terminal device 10B-K then ends all of the processing shown in the flowchart of FIG. 10.
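For illustration only, the flow of steps S21 to S26 can be condensed into the following Python sketch. Every class and function name here is an assumption, and so is the multiplicative way the occupancy rate and the two distances are combined in S26; the embodiment leaves the exact combination open.

    from dataclasses import dataclass

    @dataclass
    class VirtualObject:
        transmittance: float = 0.0  # percent; 0 = opaque, 100 = fully transparent

    def falloff(distance: float, d_max: float = 5.0) -> float:
        """1.0 at distance 0, linearly down to 0.0 at d_max and beyond (assumed shape)."""
        return max(0.0, 1.0 - distance / d_max)

    def step_s26(vo: VirtualObject, occupancy: float, l1: float, l2: float) -> None:
        """S26: combine the S23 occupancy rate with the S24/S25 distances L1 and L2.

        Illustrative choice: transmittance rises with occupancy and falls as
        either distance grows, matching the tendencies described in the text.
        """
        z = 100.0 * occupancy * falloff(l1) * falloff(l2)
        vo.transmittance = min(max(z, 0.0), 100.0)

    # Example: large on-screen object, real object 1 m behind it, user 2 m away.
    vo = VirtualObject()
    step_s26(vo, occupancy=0.6, l1=1.0, l2=2.0)
    print(vo.transmittance)  # about 28.8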
3-3: Effects of the Third Embodiment
The terminal device 10B-K as a display control device according to the present embodiment further includes a first acquisition unit 117 in addition to the components of the terminal device 10-K according to the first embodiment. The first acquisition unit 117 acquires the first distance L1 from the virtual object VO to the real object RO located behind the virtual object VO in the mixed reality space MS. The transmittance control unit 114B controls the transmittance of the virtual object VO according to the first distance L1 in addition to the occupancy rate of the virtual object VO.
Because the terminal device 10B-K has the above configuration, it can improve the visibility of an object existing in the real space RS when the virtual object VO and that object overlap in an augmented reality space or a mixed reality space.
In particular, in the present embodiment, the terminal device 10B-K controls the transmittance of the virtual object VO according to the first distance L1 from the virtual object VO to the real object RO. Specifically, the terminal device 10B-K lowers the transmittance of the virtual object VO as the first distance L1 increases. As a result, the closer the virtual object VO and the real object RO are to each other, the more the terminal device 10B-K can suppress the increased danger associated with the user U-K being unable to see the real object RO.
The terminal device 10B-K as a display control device according to the present embodiment also further includes a second acquisition unit 118 in addition to the components of the terminal device 10-K according to the first embodiment. The second acquisition unit 118 acquires the second distance L2 from the display 15 serving as the display device to the virtual object VO in the mixed reality space MS. The transmittance control unit 114B controls the transmittance of the virtual object VO according to the second distance L2 in addition to the occupancy rate of the virtual object VO.
Because the terminal device 10B-K has the above configuration, it can improve the visibility of an object existing in the real space RS when the virtual object VO and that object overlap in an augmented reality space or a mixed reality space.
In particular, in the present embodiment, the terminal device 10B-K controls the transmittance of the virtual object VO according to the second distance L2 between the display 15 and the virtual object VO. Specifically, the terminal device 10B-K lowers the transmittance of the virtual object VO as the second distance L2 increases. As a result, the closer the display 15 and the virtual object VO are to each other, and by extension the closer the user U-K and the virtual object VO are to each other, the more the terminal device 10B-K can suppress the increased danger associated with the user U-K being unable to see the real object RO.
4: Modifications
The present disclosure is not limited to the embodiments illustrated above. Specific modes of modification are illustrated below. Two or more aspects arbitrarily selected from the following examples may be combined.
4-1: Modification 1
In the first embodiment, the terminal device 10-K causes the user U-K to recognize, via the display 15 or XR glasses connected to the terminal device 10-K, the mixed reality space MS in which the virtual object VO is superimposed on the real space RS. However, the terminal device 10-K may cause the user U-K to recognize a virtual reality space or an augmented reality space instead of the mixed reality space MS. Furthermore, the terminal device 10-K may cause the user U-K to recognize the virtual reality space, the augmented reality space, or the mixed reality space MS through an HMD (Head Mounted Display) instead of the XR glasses. The HMD is an example of a "display device." The virtual reality space, the augmented reality space, and the mixed reality space MS are all examples of a "virtual space."
As an example, the terminal device 10-K may cause a video see-through HMD to display a virtual reality space in which the virtual object VO is superimposed on video obtained by imaging the real space RS that includes the real object RO.
In this case, the display control unit 112 causes the user U-K to recognize the virtual reality space by causing the HMD to display a superimposed image in which an image showing the virtual object VO is superimposed on a captured image obtained by imaging the outside world.
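For reference, displaying such a superimposed image with a given transmittance amounts to ordinary alpha blending. A minimal NumPy sketch follows; the array layout and the conversion from transmittance to opacity are assumptions introduced here.

    import numpy as np

    def superimpose(camera_frame: np.ndarray, vo_image: np.ndarray,
                    vo_mask: np.ndarray, transmittance: float) -> np.ndarray:
        """Blend the virtual-object image over the captured frame.

        camera_frame, vo_image: (H, W, 3) float arrays with values in [0, 1].
        vo_mask: (H, W) bool array, True where the virtual object VO is drawn.
        transmittance: percent; 100 means the real scene shows through fully.
        """
        alpha = 1.0 - transmittance / 100.0  # assumed mapping to opacity
        out = camera_frame.copy()
        out[vo_mask] = (alpha * vo_image[vo_mask]
                        + (1.0 - alpha) * camera_frame[vo_mask])
        return out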
The terminal device 10A-K according to the second embodiment and the terminal device 10B-K according to the third embodiment may be modified in the same way as the first embodiment.
4-2: Modification 2
In the information processing system 1 according to the first embodiment, the terminal device 10-K generates the virtual object VO and controls the transmittance of the virtual object VO. However, instead of the terminal device 10-K, the server 20 may generate the virtual object VO and then deliver the virtual object VO to the terminal device 10-K. Furthermore, the server 20 may control the transmittance of the virtual object VO delivered to the terminal device 10-K.
The information processing system 1A according to the second embodiment and the information processing system 1B according to the third embodiment may be modified in the same way as the first embodiment.
4-3: Modification 3
The technical features of the information processing system 1 according to the first embodiment through the information processing system 1B according to the third embodiment may be combined with one another.
For example, by combining the calculation unit 113 of the terminal device 10-K according to the first embodiment with the recognition unit 116 of the terminal device 10A-K according to the second embodiment, a terminal device according to this modification may control the transmittance of the virtual object VO according to the occupancy rate of the virtual object VO only when a real object RO exists behind the virtual object VO.
Alternatively, for example, by combining the recognition unit 116 of the terminal device 10A-K according to the second embodiment with the first acquisition unit 117 and the second acquisition unit 118 of the terminal device 10B-K according to the third embodiment, a terminal device according to this modification may control the transmittance of the virtual object VO according to the type of the real object RO, the first distance L1, and the second distance L2.
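One way to picture this combined control is the sketch below, in which a per-type weight scales a distance-based transmittance. The weight table (the chair and desk echo the real objects C1 to C4 and T1 of the embodiments; the remaining entries are invented), the default weight, and the multiplicative combination are all assumptions for illustration.

    # Assumed weights: higher means the real object is more important to keep
    # visible, so the virtual object in front of it is made more transparent.
    TYPE_WEIGHT = {"person": 1.0, "stairs": 0.9, "chair": 0.5, "desk": 0.4}

    def combined_transmittance(ro_type: str, l1: float, l2: float,
                               occupancy: float, d_max: float = 5.0) -> float:
        """Transmittance (%) from the real-object type, L1, L2, and occupancy rate."""
        near1 = max(0.0, 1.0 - l1 / d_max)  # closer real object -> larger factor
        near2 = max(0.0, 1.0 - l2 / d_max)  # closer virtual object -> larger factor
        weight = TYPE_WEIGHT.get(ro_type, 0.7)  # assumed default for unknown types
        return 100.0 * occupancy * weight * near1 * near2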
5: Others
(1) In the embodiments described above, ROM and RAM were given as examples of the storage devices 12 to 12B and the storage device 22, but each may instead be a flexible disk, a magneto-optical disk (for example, a compact disc, a digital versatile disc, or a Blu-ray (registered trademark) disc), a smart card, a flash memory device (for example, a card, a stick, or a key drive), a CD-ROM (Compact Disc-ROM), a register, a removable disk, a hard disk, a floppy (registered trademark) disk, a magnetic strip, a database, a server, or another suitable storage medium. The program may be transmitted from a network via a telecommunication line. The program may also be transmitted from the communication network NET via a telecommunication line.
(2) In the embodiments described above, the information, signals, and the like that have been described may be represented using any of a variety of different technologies. For example, data, instructions, commands, information, signals, bits, symbols, chips, and the like that may be referred to throughout the above description may be represented by voltages, currents, electromagnetic waves, magnetic fields or magnetic particles, optical fields or photons, or any combination thereof.
(3) In the embodiments described above, input and output information and the like may be stored in a specific location (for example, a memory) or may be managed using a management table. Input and output information and the like may be overwritten, updated, or appended. Output information and the like may be deleted. Input information and the like may be transmitted to another device.
(4) In the embodiments described above, a determination may be made based on a value represented by one bit (0 or 1), based on a Boolean value (true or false), or based on a comparison of numerical values (for example, a comparison with a predetermined value).
(5) The order of the processing procedures, sequences, flowcharts, and the like illustrated in the embodiments described above may be changed as long as no contradiction arises. For example, the methods described in the present disclosure present the elements of the various steps in an exemplary order and are not limited to the specific order presented.
(6) Each of the functions illustrated in FIGS. 1 to 10 is realized by any combination of at least one of hardware and software. The method for realizing each functional block is not particularly limited. That is, each functional block may be realized using one physically or logically coupled device, or using two or more physically or logically separated devices that are directly or indirectly connected (for example, by wire or wirelessly). A functional block may also be realized by combining software with the one device or the plurality of devices.
(7) The programs exemplified in the embodiments described above should be broadly interpreted to mean instructions, instruction sets, code, code segments, program code, programs, subprograms, software modules, applications, software applications, software packages, routines, subroutines, objects, executable files, threads of execution, procedures, functions, and the like, regardless of whether they are called software, firmware, middleware, microcode, hardware description language, or any other name.
Software, instructions, information, and the like may also be transmitted and received via a transmission medium. For example, when software is transmitted from a website, a server, or another remote source using at least one of a wired technology (such as a coaxial cable, an optical fiber cable, a twisted pair, or a digital subscriber line (DSL)) and a wireless technology (such as infrared or microwave), at least one of these wired and wireless technologies is included within the definition of a transmission medium.
(8) In each of the embodiments described above, the terms "system" and "network" are used interchangeably.
(9) The information, parameters, and the like described in the present disclosure may be expressed using absolute values, relative values from a predetermined value, or other corresponding information.
(10) The embodiments described above include the case where the terminal devices 10-1 to 10-J, 10A-1 to 10A-J, and 10B-1 to 10B-J, as well as the server 20, are mobile stations (MS: Mobile Station). A mobile station may also be referred to by those skilled in the art as a subscriber station, a mobile unit, a subscriber unit, a wireless unit, a remote unit, a mobile device, a wireless device, a wireless communication device, a remote device, a mobile subscriber station, an access terminal, a mobile terminal, a wireless terminal, a remote terminal, a handset, a user agent, a mobile client, a client, or some other suitable term. In the present disclosure, the terms "mobile station," "user terminal," "user equipment (UE)," and "terminal" may be used interchangeably.
(11) In the embodiments described above, the terms "connected" and "coupled," and any variations thereof, mean any direct or indirect connection or coupling between two or more elements, and include the presence of one or more intermediate elements between two elements that are "connected" or "coupled" to each other. The coupling or connection between elements may be physical, logical, or a combination thereof. For example, "connection" may be read as "access." As used in the present disclosure, two elements can be considered to be "connected" or "coupled" to each other using at least one of one or more electrical wires, cables, and printed electrical connections, as well as, as some non-limiting and non-exhaustive examples, electromagnetic energy having wavelengths in the radio frequency region, the microwave region, and the optical (both visible and invisible) region.
(12) In the embodiments described above, the phrase "based on" does not mean "based only on" unless otherwise specified. In other words, the phrase "based on" means both "based only on" and "based at least on."
(13) The terms "judging" and "determining" used in the present disclosure may encompass a wide variety of operations. "Judging" and "determining" may include, for example, regarding judging, calculating, computing, processing, deriving, investigating, looking up, searching, or inquiring (for example, looking up a table, a database, or another data structure), or ascertaining, as having "judged" or "determined." "Judging" and "determining" may also include regarding receiving (for example, receiving information), transmitting (for example, transmitting information), input, output, or accessing (for example, accessing data in a memory) as having "judged" or "determined." "Judging" and "determining" may further include regarding resolving, selecting, choosing, establishing, comparing, or the like as having "judged" or "determined." In other words, "judging" and "determining" may include regarding some operation as having "judged" or "determined." "Judging (determining)" may also be read as "assuming," "expecting," "considering," or the like.
(14) In the embodiments described above, when "include," "including," and variations thereof are used, these terms, like the term "comprising," are intended to be inclusive. Furthermore, the term "or" used in the present disclosure is not intended to be an exclusive OR.
(15) In the present disclosure, when articles such as a, an, and the in English are added by translation, the present disclosure may include the case where the nouns following these articles are plural.
(16) In the present disclosure, the phrase "A and B are different" may mean "A and B are different from each other." The phrase may also mean "A and B are each different from C." Terms such as "separated" and "coupled" may be interpreted in the same way as "different."
(17) Each aspect and embodiment described in the present disclosure may be used alone, in combination, or switched in accordance with execution. Notification of predetermined information (for example, notification of "being X") is not limited to explicit notification and may be performed implicitly (for example, by not notifying the predetermined information).
Although the present disclosure has been described in detail above, it is clear to those skilled in the art that the present disclosure is not limited to the embodiments described herein. The present disclosure can be implemented in modified and altered forms without departing from the spirit and scope of the present disclosure as defined by the claims. Accordingly, the description of the present disclosure is for illustrative purposes and has no restrictive meaning with respect to the present disclosure.
1, 1A, 1B… information processing system; 10, 10A, 10B… terminal device; 11, 11A, 11B… processing device; 12, 12A, 12B… storage device; 13… communication device; 14… positioning device; 15… display; 16… input device; 17… inertial sensor; 18… imaging device; 20… server; 21… processing device; 22… storage device; 23… communication device; 24… display; 25… input device; 111… generation unit; 112… display control unit; 113… calculation unit; 114, 114A, 114B… transmittance control unit; 115… output unit; 116… recognition unit; 117… first acquisition unit; 118… second acquisition unit; 211… acquisition unit; 212… output unit; C1, C2, C3, C4… chair; L1… first distance; L2… second distance; PR1, PR1A, PR1B, PR2… control program; T1… desk

Claims (5)

1. A display control device comprising:
    a display control unit that causes a user to recognize, via a display device, a virtual space in which a virtual object is superimposed on a real space, by causing the display device to display an image showing the virtual object; and
    a transmittance control unit that controls a transmittance of the virtual object according to an occupancy rate of an area occupied by the virtual object in a display area of the display device.
2. The display control device according to claim 1, wherein the display control unit causes the user to recognize the virtual space by causing the display device to display a superimposed image in which the image showing the virtual object is superimposed on a captured image obtained by imaging an outside world, or by causing the display device to display the image showing the virtual object in a situation in which light incident from the outside world passes through the display device.
3. A display control device comprising:
    an imaging device that images a real space in which a real object exists;
    a recognition unit that recognizes a type of the real object;
    a display control unit that causes a user to recognize, via a display device, a virtual space in which a virtual object is superimposed on the real space, by causing the display device to display an image showing the virtual object; and
    a transmittance control unit that controls a transmittance of the virtual object according to the type of a real object located behind the virtual object.
4. The display control device according to claim 1 or claim 3, further comprising a first acquisition unit that acquires a first distance, in the virtual space, from the virtual object to a real object located behind the virtual object,
    wherein the transmittance control unit further controls the transmittance of the virtual object according to the first distance.
5. The display control device according to claim 1 or claim 3, further comprising a second acquisition unit that acquires a second distance, in the virtual space, from the display device to the virtual object,
    wherein the transmittance control unit further controls the transmittance of the virtual object according to the second distance.
PCT/JP2023/007281 2022-04-14 2023-02-28 Display control device WO2023199626A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-066851 2022-04-14
JP2022066851 2022-04-14

Publications (1)

Publication Number Publication Date
WO2023199626A1 true WO2023199626A1 (en) 2023-10-19

Family

ID=88329276

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/007281 WO2023199626A1 (en) 2022-04-14 2023-02-28 Display control device

Country Status (1)

Country Link
WO (1) WO2023199626A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020024752A (en) * 2013-12-25 2020-02-13 キヤノンマーケティングジャパン株式会社 Information processing device, control method thereof, and program
JP2018106661A (en) * 2017-07-03 2018-07-05 株式会社Cygames Inconsistency detection system, mixed reality system, program, and inconsistency detection method
JP2021165864A (en) * 2018-06-18 2021-10-14 ソニーグループ株式会社 Information processing device, information processing method, and program
JP2021135776A (en) * 2020-02-27 2021-09-13 キヤノン株式会社 Information processor, information processing method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23788048

Country of ref document: EP

Kind code of ref document: A1