WO2021200187A1 - Portable terminal, information processing method, and storage medium - Google Patents

Portable terminal, information processing method, and storage medium

Info

Publication number
WO2021200187A1
Authority
WO
WIPO (PCT)
Prior art keywords
mobile terminal
control unit
scale
specified
map data
Prior art date
Application number
PCT/JP2021/010983
Other languages
English (en)
Japanese (ja)
Inventor
育英 細田
Original Assignee
Sony Group Corporation (ソニーグループ株式会社)
Priority date
Filing date
Publication date
Application filed by Sony Group Corporation
Publication of WO2021200187A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00Maps; Plans; Charts; Diagrams, e.g. route diagram

Definitions

  • The present technology relates to a mobile terminal, an information processing method, and a storage medium, and more particularly to a mobile terminal, an information processing method, and a storage medium that make it easier to specify a position in real space.
  • A UI (User Interface) for arranging a virtual object at a predetermined position in space based on, for example, distance, line of sight, or gesture operation has also been proposed (see, for example, Patent Document 2).
  • In one such conventional technique, word-of-mouth information is arranged as a virtual object with respect to the current position. In that case, however, the virtual object cannot be placed at a position away from the current position.
  • Further, when the position of the virtual object is defined as coordinates in the global coordinate system, a sensor for specifying the three-dimensional position designated by the user and a dedicated UI are required.
  • The present technology was made in view of such a situation, and makes it easier to specify a position in real space.
  • A mobile terminal according to one aspect of the present technology includes a motion sensor, a position sensor, and a control unit.
  • The motion sensor is configured to detect a horizontal orientation and a vertical orientation.
  • The position sensor is configured to detect its self-position in real space.
  • The control unit is configured to identify one of a plurality of scales of map data of the real space based on the vertical orientation, and to identify a first position of the map data based on the identified scale, the self-position, and the horizontal orientation.
  • The information processing method of one aspect of the present technology includes identifying one of a plurality of scales of map data of the real space based on the vertical orientation detected by the motion sensor of the mobile terminal, and identifying a position of the map data based on the identified scale, the self-position in real space detected by the position sensor of the mobile terminal, and the horizontal orientation detected by the motion sensor.
  • The storage medium of one aspect of the present technology stores a program that causes a computer controlling a mobile terminal, which includes a motion sensor configured to detect a horizontal orientation and a vertical orientation and a position sensor configured to detect its self-position in real space, to execute a process including a plurality of steps.
  • The plurality of steps include a step of identifying one of a plurality of scales of map data of the real space based on the vertical orientation, and a step of identifying a first position of the map data based on the identified scale, the self-position, and the horizontal orientation.
  • In one aspect of the present technology, the horizontal and vertical orientations are detected, the self-position is detected, and one of a plurality of scales of the map data is identified based on the vertical orientation.
  • Then, a position in the real space associated with the map data is specified based on the identified scale, the self-position information indicating the self-position, and the horizontal orientation.
  • The virtual objects mentioned here include not only 3DCG (three-dimensional computer graphics) objects but also 2DCG objects including text information, 3D models using captured images, sound images (audio objects), and the like. That is, a virtual object may be an invisible object as long as it can be associated with a specific position.
  • With the present technology, virtual objects can be arranged three-dimensionally even in a space where a global coordinate system has not been set up by an AR (Augmented Reality) system.
  • Moreover, the operation can be performed intuitively and easily.
  • For example, the user can arrange virtual objects with a sense of distance even on a world-map scale.
  • The mobile terminal is, for example, a smartphone, a hand-held terminal, a wearable terminal, or an HMD (Head Mounted Display); the case where the mobile terminal is a smartphone is described below.
  • The control unit, or a device including at least the control unit, may be referred to as an information processing device.
  • A mobile terminal to which the present technology is applied is configured as shown in FIG. 1, for example.
  • The mobile terminal 11 shown in FIG. 1 can be connected to an external device such as a server 12 via a network.
  • The mobile terminal 11 has a position sensor 21, a motion sensor 22, a control unit 23, a memory 24, a camera 25, a display 26, a communication device 27, and an input unit 28. Note that in FIG. 1, illustration of essential components of the mobile terminal 11 such as a battery and a housing is omitted as appropriate.
  • The camera 25 and the display 26 may be referred to as an image pickup device and a display device, respectively.
  • The position sensor 21 is composed of, for example, a GNSS (Global Navigation Satellite System) sensor; it measures the current position (self-position) of the mobile terminal 11, that is, the current position of the user who owns the mobile terminal 11, and supplies current position information indicating the positioning result to the control unit 23.
  • The motion sensor 22 is composed of, for example, an IMU (Inertial Measurement Unit); it measures the posture of the mobile terminal 11 and supplies posture information indicating the measurement result to the control unit 23. For example, the motion sensor 22 detects (measures) the horizontal orientation and the vertical orientation of the mobile terminal 11 with respect to the ground plane as the posture of the mobile terminal 11.
  • The control unit 23 is composed of a processor such as a CPU (Central Processing Unit) and controls the operation of the entire mobile terminal 11.
  • The control unit 23 functions as functional blocks such as a display control unit that controls the operation of the display 26, an imaging control unit that controls the operation of the camera 25, a position acquisition unit that acquires current position information from the position sensor 21, a posture acquisition unit that acquires posture information from the motion sensor 22, and a communication unit that communicates with an external device such as the server 12 via the communication device 27.
  • The control unit 23 also acquires various data from the server 12 via the communication device 27.
  • The data acquired from the server 12 includes, for example, map data and virtual objects, more specifically, presentation data such as image data and audio data for presenting the virtual objects. These data may be acquired from the server 12 in real time, or may be downloaded in advance and stored in the memory 24.
  • The memory 24 is a non-volatile storage medium; it holds (stores) various data supplied from the control unit 23 and supplies the stored data to the control unit 23.
  • The camera 25 photographs the surroundings of the mobile terminal 11 as a subject and supplies the resulting image (image data, captured image) to the control unit 23.
  • The camera 25 may be a monocular camera or a multi-lens camera.
  • The display 26 is an output device and displays images supplied from the control unit 23.
  • The communication device 27 communicates with the server 12 via a wireless LAN (Local Area Network) or the like under the control of the control unit 23; it transmits data supplied from the control unit 23 to the server 12, and receives data transmitted from the server 12 and supplies it to the control unit 23.
  • The input unit 28 is composed of, for example, a touch panel superposed on the display 26, buttons, switches, a microphone, and the like, and supplies a signal corresponding to the user's operation, or a signal obtained by collecting sound, to the control unit 23.
  • The configuration of the mobile terminal 11 to which the present technology is applied is not limited to the one shown in FIG. 1; various components such as a geomagnetic sensor, an IR (Infrared) camera, and a polarization sensor may be combined.
  • A polarization sensor or an IR camera can function as a depth sensor for acquiring depth information.
  • The control unit 23 acquires the posture information indicating the horizontal and vertical orientations of the mobile terminal 11 detected by the motion sensor 22 from the motion sensor 22 in real time. Further, the control unit 23 acquires the current position information indicating the current position of the mobile terminal 11 detected by the position sensor 21 from the position sensor 21 in real time.
  • Using this information, the control unit 23 points to (specifies) a three-dimensional position in real space, that is, a specific position in the global coordinate system, as described below with reference to FIGS. 2 and 3.
  • Specifically, the control unit 23 specifies the coordinates of a position away from the mobile terminal 11 based on the yaw angle and pitch angle indicating the orientation of the mobile terminal 11 in the global coordinate system, and on its xyz coordinates (or latitude/longitude/altitude).
  • The control unit 23 specifies one of a plurality of scales predefined for the map data based on the vertical orientation (pitch angle) of the mobile terminal 11.
  • The pitch angle is defined, for example, as a value of -90 degrees or more and 90 degrees or less.
  • That is, the operation mode and scale of the mobile terminal 11 are selected (determined) by the vertical orientation (tilt) of the mobile terminal 11 indicated by the posture information, that is, by the pitch angle.
  • As the pitch angle of the mobile terminal 11 increases, a smaller map scale, that is, one covering a wider area, is selected.
  • Here, the pitch angle when the mobile terminal 11 is directed in the front (horizontal) direction is 0 degrees, the pitch angle when it is directed directly upward is 90 degrees, and the pitch angle when it is directed directly downward, that is, toward the ground, is -90 degrees.
  • When the pitch angle is 0 degrees or more, the long-distance pointing mode, which specifies (points to) a position in real space according to the scale, is selected as the operation mode.
  • The long-distance pointing mode may be referred to as a first mode.
  • For example, the scale SC1 is selected when the pitch angle of the mobile terminal 11 is 0 degrees or more and less than 15 degrees, and the scale SC2 is selected when the pitch angle is 15 degrees or more and less than 30 degrees.
  • As the pitch angle increases further, the scale SC3 and then the scale SC4 are selected.
  • In the scale SC1, a position within a range that is close to the user and visible in front of the user, such as a range of 10 m to several tens of meters from the current position of the user (mobile terminal 11), is specified.
  • The position specified in the long-distance pointing mode is a position related to a virtual object, such as the placement position of a virtual object specified by the user.
  • For example, a virtual object specified by the user is placed at the position specified in the long-distance pointing mode, or a virtual object already placed at (associated with) the position specified in the long-distance pointing mode is presented to the user.
  • In the scale SC2, a position several hundred meters to several kilometers away from the user's current position, such as a predetermined position in the city where the user is located, is specified. In other words, a position within a predetermined range (area) located several hundred meters to several kilometers from the user's current position is specified.
  • In the scale SC3, the position of a place several kilometers to several hundred kilometers away from the user's current position is specified, for example, a municipality or prefecture different from the one where the user is located.
  • In the scale SC4, the position of a place several hundred kilometers or more away from the user's current position is specified.
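
As a rough illustration of the scale selection described above, the following Kotlin sketch maps a pitch angle (assumed to be 0 degrees or more, i.e., long-distance pointing mode) to one of the four scales. The 15-degree and 30-degree thresholds follow the text; the 45-degree boundary between SC3 and SC4 is an assumption, since the text does not state it.

```kotlin
// Hypothetical sketch of pitch-angle-based scale selection in the
// long-distance pointing mode (pitch assumed to be >= 0 degrees).
enum class MapScale { SC1, SC2, SC3, SC4 }

fun selectScale(pitchDeg: Double): MapScale = when {
    pitchDeg < 15.0 -> MapScale.SC1 // 10 m to tens of meters ahead
    pitchDeg < 30.0 -> MapScale.SC2 // hundreds of meters to a few km (city)
    pitchDeg < 45.0 -> MapScale.SC3 // km to hundreds of km (assumed boundary)
    else            -> MapScale.SC4 // hundreds of km and beyond
}
```
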
  • The control unit 23 changes the distance from the current position to the specified position based on the specified one of the scales SC1 to SC3 and the change in pitch angle. More specifically, the rate of change of the distance with respect to the change in pitch angle may be set larger as the specified scale becomes larger. That is, the change in distance from the current position based on a change in pitch angle is larger when a larger scale (a second scale) is specified than when a smaller scale (a first scale) is specified.
  • The change in distance with respect to the change in pitch angle may be proportional or may follow another increasing relationship. Further, the change in distance with respect to the change in pitch angle may be continuous or discontinuous.
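
Continuing the sketch above (and reusing its MapScale enum), the snippet below interpolates the pointed distance within each scale's pitch band. The per-scale distance ranges loosely follow the examples in the text, and the exponential curve is just one possible increasing relationship; all concrete values are assumptions.

```kotlin
import kotlin.math.pow

// Assumed distance range (meters) covered by each scale; illustrative only.
fun distanceRange(scale: MapScale): Pair<Double, Double> = when (scale) {
    MapScale.SC1 -> 10.0 to 100.0         // tens of meters
    MapScale.SC2 -> 100.0 to 5_000.0      // hundreds of meters to a few km
    MapScale.SC3 -> 5_000.0 to 300_000.0  // km to hundreds of km
    MapScale.SC4 -> 300_000.0 to 2.0e7    // hundreds of km and beyond
}

// Assumed pitch band (degrees) occupied by each scale.
fun pitchBand(scale: MapScale): Pair<Double, Double> = when (scale) {
    MapScale.SC1 -> 0.0 to 15.0
    MapScale.SC2 -> 15.0 to 30.0
    MapScale.SC3 -> 30.0 to 45.0
    MapScale.SC4 -> 45.0 to 90.0
}

// Exponential interpolation inside the band: the distance change per degree
// of pitch is automatically larger on larger (wider-area) scales.
fun pointedDistance(scale: MapScale, pitchDeg: Double): Double {
    val (near, far) = distanceRange(scale)
    val (lo, hi) = pitchBand(scale)
    val t = ((pitchDeg - lo) / (hi - lo)).coerceIn(0.0, 1.0)
    return near * (far / near).pow(t)
}
```
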
  • In this way, a position is specified on the selected scale, that is, the scale corresponding to the pitch angle, in units such as road name, place name, city name, or country name.
  • In map data, a plurality of specific points are generally set as data for each scale.
  • A specific point in the map data may be called a POI (Point of Interest).
  • The control unit 23 can specify the POIs of each scale.
  • For example, the control unit 23 may specify, instead of the first position, the position (second position) of the map data corresponding to the POI (second point) closest to the first position.
  • Alternatively, the control unit 23 may specify, instead of the first position, the position (third position) of the map data corresponding to the POI (third point) closest to the first position among a plurality of POIs between the first position and the current position. Note that the second and third points can be the same.
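
A minimal sketch of this POI snapping, under the assumption of simple planar map coordinates and an in-memory POI list (real map data would use geodetic coordinates and a spatial index):

```kotlin
import kotlin.math.hypot

data class MapPoint(val x: Double, val y: Double) // assumed planar coordinates
data class Poi(val name: String, val pos: MapPoint)

fun dist(a: MapPoint, b: MapPoint) = hypot(a.x - b.x, a.y - b.y)

// Second position: snap to the POI (second point) nearest the pointed position.
fun snapToNearestPoi(first: MapPoint, pois: List<Poi>): MapPoint? =
    pois.minByOrNull { dist(it.pos, first) }?.pos

// Third position: consider only POIs lying between the current position and
// the pointed position (approximated here as POIs no farther from the current
// position than the pointed position is), then take the one nearest to it.
fun snapToNearestPoiBetween(current: MapPoint, first: MapPoint, pois: List<Poi>): MapPoint? =
    pois.filter { dist(current, it.pos) <= dist(current, first) }
        .minByOrNull { dist(it.pos, first) }?.pos
```
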
  • When the pitch angle is -X degrees or more and less than 0 degrees, the normal operation mode is selected as the operation mode.
  • When the pitch angle is less than -X degrees, the short-distance pointing mode is selected. That is, the control unit 23 dynamically switches between the long-distance pointing mode and the short-distance pointing mode according to the change in pitch angle.
  • In other words, the long-distance pointing mode may be regarded as being set when the mobile terminal 11 faces upward from the horizontal, and the short-distance pointing mode as being set when the mobile terminal 11 faces downward from the horizontal.
  • The short-distance pointing mode may be referred to as a second mode.
  • In the normal operation mode, the user can execute a normal function of the mobile terminal 11, such as shooting a still image or moving image with the camera 25, UI operation, or listening to contents, that is, a function different from the pointing provided by the present technology.
  • In the short-distance pointing mode, a virtual object is placed on a plane at a short distance from the mobile terminal 11 (hereinafter also referred to as a short-distance plane), detected by a known plane detection method, for example at the feet of the user who owns the mobile terminal 11, and the virtual object is presented to the user. More specifically, the control unit 23 specifies a plane in real space based on the captured image.
  • That is, a virtual object associated with a position on the short-distance plane is arranged directly on the short-distance plane, such as the ground, detected by the mobile terminal 11; for example, the virtual object is superimposed at the corresponding position on an image of the ground or the like corresponding to the short-distance plane and displayed on the display 26.
  • The range in which the short-distance plane is detected is, for example, a range less than several meters from the mobile terminal 11. Further, whether to use the normal operation mode or the short-distance pointing mode may be selected not by the pitch angle of the mobile terminal 11 but, for example, by whether or not a short-distance plane is detected.
  • The -X degrees, which is the pitch angle threshold for selecting between the normal operation mode and the short-distance pointing mode, may be a predetermined angle, or may be set (changed) arbitrarily to reflect the personal tendency of each user. However, it is desirable that the pitch angle -X be set to a small value in order to suppress neck strain (so-called straight neck).
  • The range of pitch angles for which the normal operation mode is selected, that is, -X degrees or more and less than 0 degrees, corresponds to the holding angle of the mobile terminal 11 when the user views a normal screen, that is, the angle at which UI operations and content viewing, rather than pointing, are performed.
  • In general, the holding angle of a smartphone is about -30 degrees. If the angle when the display surface of the smartphone is perpendicular to the ground is regarded as 90 degrees, the holding angle of a general smartphone is 60 degrees.
  • Therefore, the holding angle when viewing a normal screen, that is, the pitch angle range of the normal operation mode, may be set to a range of ±30 degrees around the typical holding angle of -30 degrees (-60 degrees or more and less than 0 degrees).
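
Collecting these thresholds, a sketch of the pitch-driven operation-mode selection; X = 60 is an assumed value matching the -60-degree lower bound discussed above, and in practice it might be predefined or personalized per user.

```kotlin
enum class OperationMode { SHORT_DISTANCE_POINTING, NORMAL, LONG_DISTANCE_POINTING }

const val X_DEG = 60.0 // assumed; -X is the normal/short-distance boundary

fun selectMode(pitchDeg: Double): OperationMode = when {
    pitchDeg >= 0.0    -> OperationMode.LONG_DISTANCE_POINTING
    pitchDeg >= -X_DEG -> OperationMode.NORMAL
    else               -> OperationMode.SHORT_DISTANCE_POINTING
}
```
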
  • When the long-distance pointing mode is selected, the control unit 23 specifies (points to) a specific position (point) on the map indicated by the map data, in other words, a specific position in the real space associated with the map data.
  • Then, the control unit 23 associates a virtual object with the designated (specified) position on the map, that is, the position on the map corresponding to the position in real space. Specifically, the control unit 23 sets the designated position on the map as the placement position of a virtual object, or selects the virtual object at the designated position on the map as the virtual object to be presented at that position.
  • For this pointing, the current position information from the position sensor 21, the selected scale, and the yaw angle indicating the horizontal orientation of the mobile terminal 11 included in the posture information are used.
  • For example, the control unit 23 performs pointing as shown in FIG. 2.
  • When the scale SC1 is selected, the control unit 23 points to, for example, the position of a sidewalk or intersection in front of the user based on the current position of the mobile terminal 11 on the map and the horizontal orientation of the mobile terminal 11 indicated by the posture information. That is, the position where the sidewalk or intersection is located is specified.
  • When the scale SC2 is selected, the control unit 23 points to a location in the city where the user is, such as "Osaki" or "Shinagawa", based on the current position and horizontal orientation of the mobile terminal 11.
  • When the scale SC3 is selected, the control unit 23 points to the location of a prefecture different from the one where the user is, such as "Akita" or "Aomori", based on the current position and horizontal orientation of the mobile terminal 11.
  • Hereinafter, the position pointed to (designated) in the long-distance pointing mode is also referred to as the designated position.
  • The pointing direction determined by the orientation of the mobile terminal 11 may be set as a perfect straight line, or may be set to draw a parabola simulating the influence of gravity.
  • If the pointing direction is set as a perfect straight line, it may not be possible to point to any real-space plane when the pitch angle is 0 degrees or more.
  • On the other hand, if a parabolic pointing direction determined by the pitch angle and yaw angle of the mobile terminal 11 is set, some point in the global coordinate system is always specified (pointed to) without having to aim precisely at the horizontal plane.
  • That is, the parabolic pointing direction in the global coordinate system indicating the position in real space is determined based on the pitch angle and yaw angle of the mobile terminal 11 and the selected scale. Then, in the global coordinate system, the position pointed to by the pointing direction starting from the current position indicated by the current position information, that is, the position pointed to by the parabola corresponding to the pointing direction starting from the current position, is specified, and the specified position is set as the designated position.
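
A minimal ballistic-style sketch of the parabolic pointing, under assumed parameters: a flat ground plane, a hand height of 1.5 m, and a scale-dependent "launch speed". The patent does not prescribe a specific curve, so this is only one way to guarantee that every pitch angle lands somewhere.

```kotlin
import kotlin.math.cos
import kotlin.math.sin
import kotlin.math.sqrt

// Returns the east/north ground offset (meters) of the pointed position
// relative to the terminal, on an assumed flat ground plane.
fun parabolicPointing(
    pitchRad: Double,     // vertical orientation (>= 0 in long-distance mode)
    yawRad: Double,       // horizontal orientation (0 = north, assumed)
    speed: Double,        // assumed scale-dependent parameter (larger scale -> faster)
    height: Double = 1.5, // assumed hand height above the ground (m)
    g: Double = 9.81
): Pair<Double, Double> {
    val vHoriz = speed * cos(pitchRad)
    val vUp = speed * sin(pitchRad)
    // Time until the parabola meets the ground: height + vUp*t - g*t^2/2 = 0.
    val t = (vUp + sqrt(vUp * vUp + 2 * g * height)) / g
    val range = vHoriz * t // always finite and positive, so something is always hit
    return range * sin(yawRad) to range * cos(yawRad) // east, north
}
```
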
  • The position (coordinates) in the global coordinate system, that is, the position in real space, and the position on the map indicated by the map data are linked in advance. That is, the correspondence between positions in the global coordinate system and positions on the map is known.
  • Therefore, once the designated position in the global coordinate system is specified, it is possible to identify which point on the map, such as "Osaki", the designated position corresponds to. That is, the designated position on the map can be specified.
  • In other words, since the control unit 23 knows the correspondence between positions in the global coordinate system and positions on the map indicated by the map data, it can specify the current position of the mobile terminal 11 on the map, as well as its horizontal and vertical orientations, from the current position in the global coordinate system and the pitch and yaw angles of the mobile terminal 11 in the global coordinate system.
  • Conversely, the control unit 23 may specify the designated position on the map indicated by the map data based on the current position information, the posture information, the selected scale, and the map data, and may then specify the designated position in the global coordinate system (real space) based on that result.
  • A mode for designating a specific position in the global coordinate system, that is, the surface of a specific real object, based on such a parabolic pointing direction may be referred to as a first operation mode.
  • The control unit 23 may also execute, as a second operation mode, a mode in which a point in mid-air is designated as the designated position (pointing position).
  • In the second operation mode, the designated position is set in the pointing direction, that is, at a position on the pointing straight line separated from the mobile terminal 11 by a predetermined distance, and a virtual object is placed at that designated position.
  • The predetermined distance is defined as an absolute distance in real space, such as 1 m, 10 m, 100 m, or 1 km, and is preferably set larger as the scale of the map data becomes smaller.
  • In this case, the virtual object is set so that it is always displayed with no change in appearance, regardless of changes in the current position or posture of the mobile terminal 11.
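
A sketch of the second operation mode: the designated position is placed in mid-air at a fixed distance along the straight pointing ray. The per-scale distances follow the 1 m / 10 m / 100 m / 1 km examples in the text, with the assignment to scales and the local east/north/up frame being assumptions (this reuses the MapScale enum from the earlier sketch).

```kotlin
import kotlin.math.cos
import kotlin.math.sin

// Assumed mapping from scale to the fixed mid-air distance (meters),
// growing as the map scale shrinks, per the examples in the text.
fun midAirDistance(scale: MapScale): Double = when (scale) {
    MapScale.SC1 -> 1.0
    MapScale.SC2 -> 10.0
    MapScale.SC3 -> 100.0
    MapScale.SC4 -> 1_000.0
}

// Designated position on the straight pointing ray, in a local
// east/north/up frame centered on the terminal.
fun midAirPosition(pitchRad: Double, yawRad: Double, scale: MapScale): Triple<Double, Double, Double> {
    val d = midAirDistance(scale)
    val horiz = d * cos(pitchRad)
    return Triple(horiz * sin(yawRad), horiz * cos(yawRad), d * sin(pitchRad))
}
```
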
  • When the operation mode is the normal operation mode and the normal camera function is activated, an operation area R11-1, an operation area R11-2, and a video-through image area R12 for the normal operation mode are provided on the display screen of the display 26, as shown in FIG. 4.
  • Here, the video-through image is a captured image displayed in real time.
  • In this example, the display 26 is turned sideways, and the operation areas R11-1 and R11-2 are provided at the left and right ends of the display screen of the display 26.
  • The area between the operation area R11-1 and the operation area R11-2 is the video-through image area R12.
  • Hereinafter, when it is not necessary to distinguish between the operation area R11-1 and the operation area R11-2, they are also simply referred to as the operation area R11.
  • In the operation area R11, an operation UI is displayed in which icons (buttons) for capturing an image with the camera 25 and icons for making various settings at the time of image capture are arranged, and the user can capture an image by operating this UI.
  • The video-through image is displayed so that the user can confirm the angle of view and the like at the time of image shooting, and is shot by the camera 25.
  • That is, the video-through image obtained by the camera 25 is supplied from the camera 25 to the display 26 via the control unit 23 and displayed in the video-through image area R12.
  • In the video-through image area R12, a message MS11-1 and a message MS11-2, which are GUIs (Graphical User Interfaces) indicating switching of the operation mode, are displayed superimposed on the video-through image.
  • The message MS11-1 and the message MS11-2 may be referred to as a first indicator and a second indicator, respectively.
  • The message MS11-1 is arranged at the upper part of the video-through image area R12, and the message MS11-2 at the lower part.
  • The message MS11-1 indicates that tilting the mobile terminal 11 upward switches to the long-distance pointing mode, and the message MS11-2 indicates that tilting the mobile terminal 11 downward switches to the short-distance pointing mode.
  • In FIG. 5, in order to make the explanation easy to understand, the portions vertically adjacent to the video-through image area R12, that is, the portions of the region R21 and the region R22 and the portions of the region R23 and the region R24, are drawn with dotted lines as real-space imagery that is not included as a subject in the video-through image.
  • When the operation mode is the normal operation mode, the operation area R11 and the video-through image area R12 are displayed on the display screen of the display 26, as in the example shown in FIG. 4.
  • When the user tilts the mobile terminal 11 upward so that the pitch angle becomes 0 degrees or more, the operation mode automatically switches (transitions) from the normal operation mode to the long-distance pointing mode. That is, the control unit 23 switches the operation mode from the normal operation mode to the long-distance pointing mode.
  • Immediately after the switch, for example, the scale SC1 is selected in the long-distance pointing mode.
  • In the long-distance pointing mode, the entire display screen of the display 26 becomes the video-through image area R12, and the video-through image is displayed there. Further, in the video-through image area R12, an indicator IND11 including a horizontal axis and a vertical axis, and a message MS11-3 indicating switching of the operation mode, are displayed superimposed on the video-through image.
  • The message MS11-3 indicates that tilting the mobile terminal 11 downward switches back to the normal operation mode.
  • Pointers for each scale in the long-distance pointing mode are displayed on the indicator IND11 along its horizontal and vertical axes.
  • In FIG. 5, the triangles represent pointers, and the pointers aligned along the horizontal axis of the indicator IND11 indicate positions in real space on the currently selected scale SC1.
  • The pointers lined up along the vertical axis of the indicator IND11 indicate positions in real space on different scales. For example, the pointer labeled "Aomori" indicates "Aomori", which is a position in real space on the scale SC3.
  • From this display, the user can instantly grasp that there is an "intersection" nearby in front, "Shinagawa" somewhat farther ahead, and "Aomori" far ahead, and that positions can be specified on different scales by tilting the mobile terminal 11. That is, the user can operate intuitively and easily without knowing the UI operations of the long-distance pointing mode in detail.
  • A pointer whose display format, such as color, differs from that of the other pointers indicates that it is selected as the designated position, that is, that a predetermined position (object) in real space on the selected scale is being pointed to.
  • In this example, the position of the "intersection" in front of the user is being pointed to by the pointer located near the intersection of the horizontal and vertical axes of the indicator IND11 and labeled "intersection".
  • The color and shape of the horizontal and vertical axes of the indicator IND11 are not particularly limited, and any color or shape may be used.
  • For example, the control unit 23 may dynamically select a highly visible color according to the background color, that is, the color of the underlying video-through image, as the display color of the horizontal or vertical axis.
  • Alternatively, the display color of the horizontal or vertical axis may be a predetermined color (single color).
  • When the user tilts the mobile terminal 11 further upward, the control unit 23 selects the scale SC3 as the scale in the long-distance pointing mode.
  • In this case, the control unit 23 displays, for example, the video-through image area R12 shown in FIG. 6 on the display 26.
  • In FIG. 6, the parts corresponding to those in FIG. 5 are given the same reference numerals, and their description is omitted as appropriate.
  • In the example of FIG. 6, the indicator IND11 is displayed together with the video-through image in the video-through image area R12.
  • Here, the position "Aomori", far ahead of the user, is being pointed to by the pointer near the intersection of the horizontal and vertical axes labeled "Aomori". This position "Aomori" is a position in real space on the selected scale SC3.
  • When the user tilts the mobile terminal 11 downward so that the short-distance pointing mode is selected as the operation mode, the control unit 23 displays, for example, the screen shown in FIG. 7 on the display 26.
  • In FIG. 7, a video-through image taken by the camera 25 is displayed on the display 26, and word-of-mouth information as a virtual object is displayed superimposed on the video-through image.
  • That is, the control unit 23 performs plane detection based on the video-through image obtained by the camera 25 and selects, from among the virtual objects acquired from the server 12, those associated with positions on the detected short-distance plane. Then, the control unit 23 supplies the selected virtual objects to the display 26 and displays them at the corresponding positions in the video-through image.
  • In FIG. 7, the message MS11-1 and the message MS11-2 are displayed at the upper and lower ends of the video-through image area R12, respectively.
  • The text "REMOTE POINTING" is written as the message MS11-1 displayed at the upper end; from this text and the display position of the message MS11-1, the user can intuitively and easily grasp that tilting the mobile terminal 11 upward switches to the long-distance pointing mode.
  • Similarly, from the display position of the message MS11-2 and the text "NEAR POINTING", the user can intuitively and easily grasp that tilting the mobile terminal 11 downward switches to the short-distance pointing mode. Likewise, for the message MS11-3 shown in FIG. 5, the user can intuitively and easily grasp that tilting the mobile terminal 11 downward switches to the normal operation mode.
  • In the mobile terminal 11, the operation modes, namely the short-distance pointing mode, the normal operation mode, and the long-distance pointing mode, are switched seamlessly according to the change in pitch angle (tilt angle).
  • The messages MS11 are for making the user aware of the switching of the operation mode, and are displayed superimposed on the video-through image.
  • Each message MS11 is composed of a rectangular area portion and a character (text) portion within the rectangular area.
  • By changing the transmittance and color of the rectangular area of the message MS11 as the pitch angle of the mobile terminal 11 changes, the user can easily recognize in which direction and how far the mobile terminal 11 should be tilted to switch the operation mode.
  • For example, when the operation mode is the normal operation mode, the message MS11-1 and the message MS11-2 are displayed at the upper and lower ends of the video-through image area R12, respectively, as shown in FIG. 8.
  • In FIG. 8, the parts corresponding to those in FIG. 5 are given the same reference numerals, and their description is omitted as appropriate.
  • Suppose that the pitch angle of the mobile terminal 11 is -30 degrees (where -30 ≥ -X) in the normal operation mode.
  • At this time, the background of the message MS11-1, that is, the rectangular area, is displayed translucently in a single color, for example with a transmittance of about 50%, in order to ensure the visibility of the character (text) portion. Specifically, the background of the message MS11-1 is displayed in white and the characters in black.
  • Next, suppose the user tilts the mobile terminal 11 upward so that the pitch angle becomes -15 degrees.
  • Then, the message MS11-1 is displayed with an overall background (rectangular area) transmittance of 25%, the upper part of the background in blue, the lower part in white, and the characters in white.
  • After that, as the pitch angle increases, the area displayed in blue in the background of the message MS11-1 continuously grows, and the transmittance of the background decreases.
  • When the pitch angle reaches 0 degrees, the transmittance of the background of the message MS11-1 becomes 0%, the background is displayed in blue, the characters in white, and the operation mode switches to the long-distance pointing mode.
  • In this way, as the pitch angle increases, the transmittance decreases and the display color also changes continuously from white to blue, from top to bottom.
  • This manner of color change allows the user to intuitively and easily understand in which direction and how far the mobile terminal 11 should be tilted to switch the operation mode. That is, the user can intuitively and easily recognize the timing of operation-mode switching and the direction of the operation needed to switch.
  • Specifically, the user can intuitively recognize that the operation mode will soon switch to the long-distance pointing mode when the area displayed in blue in the background of the message MS11-1 becomes large.
  • Since the user can thus recognize the timing of operation-mode switching, it is possible to prevent the operation mode from being switched unintentionally due to a change in posture (tilt).
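
A sketch of this appearance interpolation, mapping the pitch angle to the transmittance and blue-fill fraction of the upper message's background. The keyframes (-30 degrees: 50% transmittance, all white; -15 degrees: 25%, half blue; 0 degrees: 0%, all blue) follow the example above; the linear interpolation between them is an assumption.

```kotlin
data class IndicatorStyle(
    val transmittance: Double, // 1.0 = fully transparent, 0.0 = opaque
    val blueFraction: Double   // fraction of the background filled blue, top-down
)

// Linear interpolation between the example keyframes in the text.
fun upperIndicatorStyle(pitchDeg: Double): IndicatorStyle {
    val t = ((pitchDeg + 30.0) / 30.0).coerceIn(0.0, 1.0) // 0 at -30 deg, 1 at 0 deg
    return IndicatorStyle(transmittance = 0.5 * (1.0 - t), blueFraction = t)
}
```
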
  • In the example described above, the rectangular area of the message MS11 is explicitly divided into a white area and a blue area, and the sizes of those areas change according to the pitch angle.
  • However, the present technology is not limited to this; for example, the color of the rectangular area may be changed as a gradation according to the pitch angle.
  • Further, an operation-mode lock function may be realized by operating the message MS11.
  • For example, suppose that while the operation mode is the normal operation mode and the user is shooting with the camera 25, the user wants to shoot a subject diagonally above, at an angle at which the pitch angle of the mobile terminal 11 becomes 0 degrees or more.
  • In such a case, as shown in FIG. 9, when the user swipes the message MS11, the display of the message MS11 may be removed and the automatic switching (transition) of the operation mode temporarily disabled.
  • In FIG. 9, the parts corresponding to those in FIG. 5 are given the same reference numerals, and their description is omitted as appropriate.
  • In this example, when the user swipes the message MS11-1, its display is removed and switching to the long-distance pointing mode is temporarily disabled.
  • The arrows in the figure indicate the swipe directions for temporarily canceling operation-mode switching, and are not actually displayed.
  • Similarly, the user may swipe the message MS11-2 to the right, left, or down in the figure to remove its display and temporarily disable switching to the short-distance pointing mode.
  • The invalidation of operation-mode switching may then be canceled automatically. By doing so, it is possible to realize seamless operability according to changes in the pitch angle while appropriately disabling the switching of the operation mode.
  • The invalidation of operation-mode switching may be canceled in any way. For example, it may be canceled when the pitch angle returns to a value within the pitch-angle range of an operation mode different from the current one and then comes back into the pitch-angle range of the current operation mode.
  • Alternatively, a GUI such as a button for canceling the invalidation may be explicitly displayed and the invalidation canceled by operating it, or the invalidation may be canceled when the shooting application program is restarted.
  • In step S11, the control unit 23 determines whether or not the operation mode is the normal operation mode based on the posture information supplied from the motion sensor 22.
  • For example, when the pitch angle included in the posture information is -X degrees or more and less than 0 degrees, it is determined that the mode is the normal operation mode.
  • Also, when operation-mode switching is disabled, it is determined that the mode is the normal operation mode.
  • That is, when the user performs an operation such as swiping the message MS11, the control unit 23 temporarily disables switching of the operation mode.
  • When it is determined in step S11 that the mode is the normal operation mode, the control unit 23 controls the display 26 in step S12 to display the UI corresponding to the normal operation mode. As a result, the screen shown in FIG. 4, for example, is displayed on the display 26.
  • In step S13, the control unit 23 performs processing according to the user's operation.
  • For example, a signal corresponding to the user's operation is supplied from the input unit 28 to the control unit 23.
  • Then, the control unit 23 controls the camera 25 in response to the signal from the input unit 28 to capture an image, and supplies the captured image to the memory 24 for storage.
  • When the process of step S13 is performed, the process then proceeds to step S23.
  • If it is determined in step S11 that the operation mode is not the normal operation mode, the control unit 23 determines in step S14 whether or not the operation mode is the long-distance pointing mode based on the posture information supplied from the motion sensor 22.
  • In step S14, for example, when the pitch angle included in the posture information is 0 degrees or more, it is determined that the mode is the long-distance pointing mode.
  • If it is determined in step S14 that the mode is the long-distance pointing mode, the process proceeds to step S15.
  • In step S15, the control unit 23 selects one scale from the plurality of mutually different predetermined scales SC1 to SC4 based on the pitch angle included in the posture information.
  • In step S16, the control unit 23 points to a position in the real space associated with the map data of the selected scale, based on the current position information from the position sensor 21 and the posture information from the motion sensor 22, and specifies the designated position.
  • For example, the control unit 23 specifies, as the designated position, the position pointed to by the straight line or parabola in the pointing direction determined by the pitch and yaw angles indicated by the posture information, starting from the current position on the map obtained from the current position information.
  • Further, the control unit 23 specifies the designated position in the global coordinate system, that is, in real space, corresponding to the designated position on the map, based on the known correspondence between positions in the global coordinate system and positions on the map.
  • Conversely, the designated position in the global coordinate system may be specified first, based on the selected scale, the current position information, and the posture information, and the position on the map of the map data corresponding to that designated position may then be specified.
  • In step S17, the control unit 23 specifies the virtual object corresponding to the designated position specified in step S16.
  • For example, the control unit 23 holds virtual objects acquired in advance from the server 12 by the communication device 27, more specifically, presentation data for presenting the virtual objects and placement position information indicating the placement positions of the virtual objects.
  • The placement position information may be coordinates indicating a position in the global coordinate system, or information indicating a position on the map indicated by the map data.
  • Based on the placement position information of each of the plurality of virtual objects, the control unit 23 identifies, from among the plurality of virtual objects, those placed at the designated position specified in step S16.
  • In step S18, the control unit 23 supplies the presentation data of the virtual objects specified in step S17 to the display 26, and displays the virtual objects superimposed on the video-through image displayed in the video-through image area R12.
  • Alternatively, the control unit 23 may supply audio data as presentation data to an AR speaker or the like (not shown) to output audio as a virtual object.
  • In step S18, a virtual object may also be newly placed at the designated position.
  • That is, when the user operates the input unit 28 to specify a virtual object to be associated with the designated position, the input unit 28 supplies a signal corresponding to the user's operation to the control unit 23.
  • Then, in response to the signal from the input unit 28, the control unit 23 associates the virtual object specified by the user with the designated position specified in step S16. That is, for example, the control unit 23 generates information indicating the designated position specified in step S16 as the placement position information of the virtual object. The control unit 23 then supplies the presentation data and placement position information of the virtual object to the communication device 27, which transmits (uploads) them to the server 12.
  • When the process of step S18 is performed, the process then proceeds to step S23.
  • On the other hand, when it is determined in step S14 that the mode is not the long-distance pointing mode, that is, when the short-distance pointing mode is set, the process proceeds to step S19.
  • In step S19, the control unit 23 performs plane detection based on the video-through image supplied from the camera 25.
  • In step S20, the control unit 23 specifies the position in real space corresponding to the detected short-distance plane, that is, the position (coordinates) in the global coordinate system, based on the result of the plane detection in step S19. More specifically, it specifies which position in real space each position on the short-distance plane corresponds to.
  • In step S21, the control unit 23 specifies the virtual objects corresponding to the positions specified in step S20. That is, based on the placement position information of each of the plurality of virtual objects, the control unit 23 identifies, from among them, those placed at the positions specified in step S20, that is, on the short-distance plane.
  • In step S22, the control unit 23 supplies the presentation data of the virtual objects specified in step S21 to the display 26, and displays the virtual objects superimposed on the video-through image.
  • When the process of step S22 is performed, the process then proceeds to step S23.
  • In step S23, the control unit 23 determines whether or not to end the processing being performed. For example, when the user instructs the end of the camera function of the mobile terminal 11, it is determined that the processing ends.
  • If it is determined in step S23 that the processing has not yet ended, the process returns to step S11, and the above-described processing is repeated.
  • On the other hand, when it is determined in step S23 that the processing ends, each part of the mobile terminal 11 stops the processing being performed, and the display processing ends.
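
As a compact summary of steps S11 to S23, a per-frame dispatch sketch reusing the selectMode and selectScale helpers from the earlier sketches; it returns a description string rather than driving real camera or display APIs, since those are outside the patent text.

```kotlin
// Illustrative per-frame dispatch corresponding to steps S11-S23.
fun describeFrame(pitchDeg: Double, switchingDisabled: Boolean): String = when {
    // S11-S13: normal mode (also forced while mode switching is disabled).
    switchingDisabled || selectMode(pitchDeg) == OperationMode.NORMAL ->
        "S12-S13: show the normal UI (FIG. 4) and handle user operations"
    // S14-S18: long-distance pointing.
    selectMode(pitchDeg) == OperationMode.LONG_DISTANCE_POINTING ->
        "S15-S18: scale ${selectScale(pitchDeg)}, specify the designated position, present virtual objects"
    // S19-S22: short-distance pointing.
    else ->
        "S19-S22: detect the short-distance plane and present virtual objects on it"
}
```
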
  • As described above, the mobile terminal 11 selects a scale according to its own pitch angle and specifies the designated position to be pointed to based on the selected scale, the map data, the current position information, and the posture information. By doing so, a position related to a virtual object can be specified more easily.
  • For example, suppose that, as shown in FIG. 11, the position P11 in Tokyo is the current position and a position several hundred kilometers away is pointed to by the pointing straight line L11.
  • In this case, if the candidate points are sufficiently far apart, it is possible to distinguish the desired point from others and place the virtual object at the desired designated position even if a pointing error of several kilometers occurs in the horizontal direction.
  • However, suppose the distance in the depth direction is several hundred kilometers while the horizontal distance between multiple candidate stores is several tens of meters or less.
  • Then the allowable error in the yaw angle (pan rotation angle) of the pointing is much less than 1 degree. It is therefore difficult to point to a specific point among a plurality of points (positions) located several hundred kilometers away from the current position.
  • Therefore, in the mobile terminal 11, the designated position may be specified using a pseudo current position, for example, with the position P21 in FIG. 12 as the pseudo current position.
  • In FIG. 12, the parts corresponding to those in FIG. 11 are given the same reference numerals, and their description is omitted as appropriate.
  • In the example of FIG. 12, the user is actually at the position P11 shown in FIG. 11, but the current position used for pointing is moved to the position P21 in a pseudo manner. The user can therefore point to the desired position more accurately, with less error, via the pointing straight line L11 determined by the posture information, starting from the position P21.
  • That is, when pointing from the actual current position at a destination several hundred kilometers away, a 1-degree change in yaw angle moves the horizontal pointing destination in units of several kilometers, whereas when pointing from the pseudo current position near the destination, the same 1-degree change moves the pointing destination by only about several meters.
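
The magnitude of this improvement follows from simple trigonometry. With Δx the horizontal displacement of the pointing destination, d the distance to it, and Δψ the yaw error:

    Δx ≈ d · tan(Δψ),  tan(1°) ≈ 0.0175

So pointing from the actual position at an assumed d = 300 km gives Δx ≈ 5.2 km, while pointing from a pseudo position an assumed d = 300 m from the target gives Δx ≈ 5.2 m, consistent with the kilometers-versus-meters contrast above (the 300 km and 300 m figures are illustrative round numbers, not values from the patent).
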
  • For example, when the user voice-inputs a destination such as "Izakaya in front of Aomori Station", a signal corresponding to the input is supplied to the control unit 23, and the process of step S16 described above is performed using the position indicated by that signal as the pseudo current position.
  • That is, instead of the current position indicated by the current position information, the control unit 23 treats the position corresponding to the voice-input "Izakaya in front of Aomori Station", namely "Aomori Station", as the current position.
  • The pseudo destination of the self-position is not limited to voice input; it may also be specified (selected) by designating a place name displayed on the display 26, a position on the map, or the like by touch operation.
  • Further, the control unit 23 controls the display 26 based on the video-through image supplied from the camera 25 and the pseudo current position "Aomori Station", and displays the screen indicated by the arrow Q52.
  • On this screen, pointers indicating the positions of "Izakaya A", "Izakaya B", and "Izakaya C" around the pseudo current position "Aomori Station", together with the text of the user's voice input, that is, the text "Izakaya in front of Aomori Station" indicating the pseudo current position, are displayed superimposed on the video-through image.
  • In this state, the user can point to the desired position by moving the mobile terminal 11 in the horizontal direction.
  • In this example, the pointer indicating "Izakaya B" is displayed in a display format different from that of the other pointers, which shows that the position of "Izakaya B" is being pointed to.
  • The series of processes described above can be executed by hardware or software.
  • When the series of processes is executed by software, the programs constituting the software are installed on a computer.
  • Here, the computer includes a computer embedded in dedicated hardware and, for example, a general-purpose personal computer capable of executing various functions by installing various programs.
  • FIG. 14 is a block diagram showing a configuration example of the hardware of a computer that executes the above-described series of processes by a program.
  • In the computer, a CPU (Central Processing Unit) 501, a ROM (Read Only Memory) 502, and a RAM (Random Access Memory) 503 are connected to each other by a bus 504.
  • An input/output interface 505 is further connected to the bus 504.
  • An input unit 506, an output unit 507, a storage unit 508, a communication unit 509, and a drive 510 are connected to the input/output interface 505.
  • The input unit 506 includes a keyboard, a mouse, a microphone, an image sensor, and the like.
  • The output unit 507 includes a display, a speaker, and the like.
  • The storage unit 508 includes a hard disk, a non-volatile memory, and the like.
  • The communication unit 509 includes a network interface and the like.
  • The drive 510 drives a removable storage medium 511 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
  • In the computer configured as described above, the CPU 501 loads the program stored in the storage unit 508 into the RAM 503 via the input/output interface 505 and the bus 504 and executes it, whereby the above-described series of processes is performed.
  • The program executed by the computer can be provided recorded on the removable storage medium 511 as a package medium or the like, for example. The program can also be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
  • In the computer, the program can be installed in the storage unit 508 via the input/output interface 505 by mounting the removable storage medium 511 in the drive 510. Further, the program can be received by the communication unit 509 via a wired or wireless transmission medium and installed in the storage unit 508. In addition, the program can be installed in advance in the ROM 502 or the storage unit 508.
  • The program executed by the computer may be a program in which processing is performed in chronological order according to the order described in this specification, or a program in which processing is performed in parallel or at necessary timings, such as when a call is made.
  • The embodiments of the present technology are not limited to the embodiment described above, and various changes can be made without departing from the gist of the present technology.
  • For example, the present technology can have a cloud computing configuration in which one function is shared and processed jointly by a plurality of devices via a network.
  • Further, each step described in the above flowchart can be executed by one device or shared among a plurality of devices.
  • Furthermore, when one step includes a plurality of processes, the plurality of processes included in that step can be executed by one device or shared among a plurality of devices.
  • The present technology can also have the following configurations.
  • (1) A mobile terminal including: a motion sensor configured to detect a horizontal orientation and a vertical orientation; a position sensor configured to detect its self-position in real space; and a control unit configured to identify one of a plurality of scales of map data of the real space based on the vertical orientation, and to identify a first position of the map data based on the identified scale, the self-position, and the horizontal orientation.
  • (2) The mobile terminal according to (1), in which the control unit is configured to identify the first position by changing the distance from the self-position to the first position based on the identified scale and the change in the vertical orientation.
  • (3) The mobile terminal according to (2), in which the plurality of scales include a first scale and a second scale larger than the first scale, and the control unit is configured to identify the first position by making the change in the distance with respect to the change in the vertical orientation larger when the identified scale corresponds to the second scale than when it corresponds to the first scale.
  • (4) The mobile terminal according to (2) or (3), in which, when there is no specific point associated with the first position on the identified scale, the control unit is configured to identify, instead of the first position, a second position of the map data corresponding to a second point that is the specific point closest to the first position.
  • (5) The mobile terminal according to any one of (2) to (4), in which, when there is no specific point associated with the first position on the identified scale, the control unit is configured to identify, instead of the first position, a third position of the map data corresponding to a third point that is the specific point closest to the first position among a plurality of specific points between the first position and the self-position.
  • (6) The mobile terminal according to any one of (1) to (5), further including an input unit configured to acquire a signal according to the user's operation, in which the control unit is configured to set a position different from the self-position, based on the signal, as a pseudo self-position, and to identify the first position based on the identified scale, the pseudo self-position, and the horizontal orientation.
  • (7) The mobile terminal according to any one of (1) to (6), in which the control unit is configured to associate a virtual object with the position of the map data corresponding to the first position.
  • (8) The mobile terminal according to (7), further including: an image pickup device configured to acquire a captured image of the real space; and a display device configured to superimpose a virtual object on the captured image, in which the control unit switches, based on the change in the vertical orientation, between a first mode of associating a virtual object with a position of the map data based on one of the plurality of scales, and a second mode of associating a virtual object with a plane of the real space identified based on the captured image.
  • (9) The mobile terminal according to (8), in which the control unit is configured to execute the first mode when the vertical orientation is above the horizontal, and to execute the second mode when the vertical orientation is below the horizontal.
  • (10) The mobile terminal according to (9), in which the control unit is configured to control the display device so as to display a first indicator indicating the first mode at the upper part of the captured image and a second indicator indicating the second mode at the lower part of the captured image.
  • (11) The mobile terminal according to (10), in which the control unit is configured to continuously change the appearance of at least one of the first indicator and the second indicator based on the change in the vertical orientation.
  • (12) An information processing method including: identifying one of a plurality of scales of map data of real space based on a vertical orientation detected by a motion sensor of a mobile terminal; and identifying a position of the map data based on the identified scale, the self-position in real space detected by a position sensor of the mobile terminal, and a horizontal orientation detected by the motion sensor.
  • (13) A storage medium storing a program that causes a computer controlling a mobile terminal, which includes a motion sensor configured to detect a horizontal orientation and a vertical orientation and a position sensor configured to detect its self-position in real space, to execute a process including: identifying one of a plurality of scales of map data of the real space based on the vertical orientation; and identifying a first position of the map data based on the identified scale, the self-position, and the horizontal orientation.
  • 11 Mobile terminal, 12 Server, 21 Position sensor, 22 Motion sensor, 23 Control unit, 24 Memory, 25 Camera, 26 Display, 27 Communication device, 28 Input unit

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present technology relates to a mobile terminal and an information processing method that can make it easier to specify a position in real space, and to a storage medium. The mobile terminal includes a motion sensor, a position sensor, and a control unit. The motion sensor is configured to detect a horizontal orientation and a vertical orientation. The position sensor is configured to detect its self-position in real space. The control unit is configured to identify one of a plurality of scales included in map data of the real space based on the vertical orientation, and to identify a first position in the map data based on the identified scale, the self-position, and the horizontal orientation. The present technology can be applied to smartphones.
PCT/JP2021/010983 2020-03-31 2021-03-18 Portable terminal, information processing method, and storage medium WO2021200187A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020062189 2020-03-31
JP2020-062189 2020-03-31

Publications (1)

Publication Number Publication Date
WO2021200187A1 true WO2021200187A1 (fr) 2021-10-07

Family

ID=77927249

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/010983 WO2021200187A1 (fr) 2020-03-31 2021-03-18 Portable terminal, information processing method, and storage medium

Country Status (1)

Country Link
WO (1) WO2021200187A1 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10103975A (ja) * 1996-10-02 1998-04-24 Nissan Motor Co Ltd Portable navigation device
JP2017116831A (ja) * 2015-12-25 2017-06-29 Zenrin Datacom Co., Ltd. Map display device, map display method, and computer program


Similar Documents

Publication Publication Date Title
KR102414587B1 (ko) Augmented reality data presentation method, apparatus, device, and storage medium
US10217288B2 (en) Method for representing points of interest in a view of a real environment on a mobile device and mobile device therefor
CN104350736B (zh) Augmented reality arrangement of nearby location information
US10043314B2 (en) Display control method and information processing apparatus
KR101295712B1 (ko) Apparatus and method for providing an augmented reality user interface
KR101330805B1 (ko) Apparatus and method for providing augmented reality
US9507485B2 (en) Electronic device, displaying method and file saving method
US20110287811A1 (en) Method and apparatus for an augmented reality x-ray
US10062209B2 (en) Displaying an object in a panoramic image based upon a line-of-sight direction
CN107771310B (zh) Head-mounted display device and processing method thereof
US20210102820A1 (en) Transitioning between map view and augmented reality view
CN111768454A (zh) Pose determination method, apparatus, device, and storage medium
KR20120017783A (ko) Method and apparatus for displaying position information in augmented reality
US20160284130A1 (en) Display control method and information processing apparatus
US20210118236A1 (en) Method and apparatus for presenting augmented reality data, device and storage medium
JP2017211811A (ja) Display control program, display control method, and display control device
KR20110070210A (ko) Mobile terminal and method for providing an augmented reality service using a position sensor and an orientation sensor
JP6481456B2 (ja) Display control method, display control program, and information processing device
JP6145563B2 (ja) Information display device
JP4710217B2 (ja) Information presentation device, information presentation method, information presentation system, and computer program
US11568579B2 (en) Augmented reality content generation with update suspension
WO2021200187A1 (fr) Portable terminal, information processing method, and storage medium
JP2019002747A (ja) Destination identification system
JP2016110296A (ja) Wearable device, information processing method, and program
KR101153127B1 (ko) Geographic information display device for a smartphone

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21781384

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21781384

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP