US20200334888A1 - Display control method, information processing apparatus and non-transitory computer readable medium - Google Patents
- Publication number
- US20200334888A1 (application US16/920,435)
- Authority
- US
- United States
- Prior art keywords
- information
- display
- processing apparatus
- display device
- displayed
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01D—MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
- G01D21/00—Measuring or testing not otherwise provided for
- G01D21/02—Measuring two or more variables by means not covered by a single other subclass
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G06K9/209—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/005—General purpose rendering architectures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/08—Volume rendering
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/012—Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30181—Earth observation
- G06T2207/30192—Weather; Meteorology
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2215/00—Indexing scheme for image rendering
- G06T2215/16—Using real world measurements to influence rendering
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/20—Indexing scheme for editing of 3D models
- G06T2219/2021—Shape modification
Definitions
- the present invention relates to an information processing apparatus.
- Techniques for displaying an object three-dimensionally include a method using a single display screen, a method using plural display screens arranged three-dimensionally, a method using a three-dimensional display, and the like.
- a display control method including the following steps: acquiring positional information, which indicates a position where a display device is being used, from the display device, the display device being a mobile display device; acquiring first information, which is different from the positional information and is associated with the positional information, from a device different from the display device; and differentiating a two-dimensional image, which represents an object defined three-dimensionally from one viewpoint direction and is displayed on the display device, according to the acquired first information, and displaying the two-dimensional image as a moving image.
- an image of a virtual creature displayed on the display device, a form of which is imitated by the object, changes in association with at least the first information and the positional information.
- an image of a virtual creature different from the image of the virtual creature is additionally displayed on the display device in association with a change of the first information and the positional information.
- the number of images of virtual creatures displayed on the display device, forms of which are imitated by the object, is different in association with the first information and the positional information.
- the two-dimensional image is displayed in a mode according to a content of the abnormality.
- warning information is displayed on the display device.
- a magnitude of change in display of the two-dimensional image is changed according to the information regarding weather.
- the two-dimensional image increases on the display device and then decreases according to the first information.
- an information processing apparatus including a display controller.
- the display controller acquires positional information, which indicates a position where a mobile display device is being used, from the display device.
- the display controller acquires first information, which is different from the positional information and is associated with the positional information, from a device different from the display device.
- the display controller differentiates a two-dimensional image, which represents an object defined three-dimensionally from one viewpoint direction and is displayed on the display device, according to the acquired first information, and displays the two-dimensional image as a moving image.
- an image of a virtual creature displayed on the display device, a form of which is imitated by the object, changes in association with at least the first information and the positional information.
- an image of a virtual creature different from the image of the virtual creature is additionally displayed on the display device in association with a change of the first information and the positional information.
- the number of images of virtual creatures displayed on the display device, forms of which are imitated by the object, is different in association with the first information and the positional information.
- the display controller displays the two-dimensional image in a mode according to a content of the abnormality.
- the display controller displays warning information on the display device.
- when the first information is information regarding weather, the display controller changes a magnitude of change in display of the two-dimensional image according to the information regarding weather.
- the display controller increases and then decreases the two-dimensional image on the display device according to the first information.
- a non-transitory computer readable medium storing a program causing a computer to: acquire positional information, which indicates a position where a mobile display device is being used, from the display device; acquire first information, which is different from the positional information and is associated with the positional information, from a device different from the display device; and differentiate a two-dimensional image, which represents an object defined three-dimensionally from one viewpoint direction and is displayed on the display device, according to the acquired first information, and display the two-dimensional image as a moving image.
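The claimed flow (acquire position, acquire associated first information such as weather, then differentiate the displayed two-dimensional image) can be sketched in ordinary code. This is a purely illustrative sketch, not the patent's implementation: the names `update_display` and `Frame`, and the weather-to-magnitude table, are assumptions introduced here. It follows the claim that the magnitude of change in display depends on information regarding weather.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    """One rendered two-dimensional view of the three-dimensional object."""
    viewpoint: str
    scale: float  # how strongly the displayed image is differentiated

def update_display(position, first_info):
    """Differentiate the 2D projection according to first information
    (e.g. weather) associated with the device's position.

    `position` stands in for the acquired positional information;
    `first_info` for the weather string acquired from another device.
    """
    # Larger magnitude of change for worse weather (illustrative values).
    magnitude = {"clear": 0.1, "rain": 0.5, "storm": 1.0}.get(first_info, 0.1)
    return Frame(viewpoint="front", scale=1.0 + magnitude)
```

Calling `update_display` once per frame with fresh inputs would yield the claimed moving image whose change tracks the weather at the device's location.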
- FIG. 1 illustrates an appearance example of an information processing apparatus according to a first exemplary embodiment
- FIG. 2 illustrates an arrangement example of planar sensors
- FIG. 3 illustrates an arrangement example of other sensors
- FIG. 4 illustrates a hardware configuration example of the information processing apparatus
- FIG. 5 illustrates a functional configuration example of a controller according to the first exemplary embodiment
- FIG. 6 illustrates an example of a three-dimensional object displayed on a display of the information processing apparatus
- FIG. 7 illustrates changes that are made to a character displayed as the three-dimensional object upon acceleration (collision) being applied to a top face of the information processing apparatus
- FIG. 8 illustrates changes that are made to the character displayed as the three-dimensional object if acceleration (collision) is continuously applied to the top face of the information processing apparatus
- FIG. 9 illustrates changes that are made to a structure displayed as the three-dimensional object if acceleration (collision) is continuously applied to the top face of the information processing apparatus
- FIG. 10 illustrates changes that are made to the character displayed as the three-dimensional object upon pressure being applied to the head of the character
- FIG. 11 illustrates changes that are made to the character displayed as the three-dimensional object if pressure is continuously applied by a fingertip to the head of the character
- FIG. 12 illustrates changes that are made to the character displayed as the three-dimensional object upon pressure being applied to the chest of the character
- FIG. 13 illustrates changes that are made to the structure displayed as the three-dimensional object upon distortion being applied to the information processing apparatus
- FIG. 14 illustrates changes that are made to the character displayed as the three-dimensional object upon a temperature change being detected by temperature detection sensors
- FIG. 15 illustrates a change that is made to an ice cube displayed as the three-dimensional object upon a temperature change being detected by the temperature detection sensors
- FIG. 16 illustrates a change that is made to the character displayed as the three-dimensional object upon a humidity change being detected by humidity detection sensors
- FIG. 17 illustrates an example in which acceleration detection sensors are arranged in M rows and N columns in the plane of the display
- FIG. 18 illustrates a change in the number of times of detection of acceleration (collision) from a specific position in a case in which the acceleration detection sensors are arranged in 4 rows and 4 columns;
- FIG. 19 illustrates an appearance example of an information processing apparatus including displays on four faces, which are a front face, side faces, and a back face;
- FIG. 20 illustrates an appearance example of an information processing apparatus according to a second exemplary embodiment
- FIG. 21 illustrates a hardware configuration example of the information processing apparatus
- FIG. 22 illustrates a functional configuration example of a controller according to the second exemplary embodiment
- FIGS. 23A to 23E illustrate examples of a bent position determined by a bent position determiner
- FIG. 24 illustrates an example in which an image of the three-dimensional object is edited by using information on the determined bent position
- FIG. 25 illustrates another example in which an image of the three-dimensional object is edited by using information on the determined bent position
- FIG. 26 illustrates an example in which a deformation operation of the information processing apparatus is used to control a display operation of a display device whose display is controlled by another information processing apparatus;
- FIG. 27 illustrates another example in which a deformation operation of the information processing apparatus is used to control a display operation of a display device whose display is controlled by another information processing apparatus;
- FIG. 28 illustrates an example of a three-dimensional object displayed by a three-dimensional display
- FIG. 29 illustrates a state in which a change is made to a displayed three-dimensional object in response to a specific movement of a user
- FIG. 30 illustrates an example in which a change is made to a three-dimensional object that is projected onto a wall or a floor.
- FIG. 31 illustrates an example of a correspondence table representing a correspondence relationship between objects as change targets in combination with numbers of dimensions of sensor values and changed images.
- “to display an object three-dimensionally” means to display the object in a manner including depth information by using a single display screen, plural display screens arranged three-dimensionally, a so-called three-dimensional display, or the like.
- a method using a single display screen includes a case of displaying a two-dimensional image (image including information regarding perspective) obtained by actually taking a picture of an object that is present in a real space, a case of displaying a two-dimensional image representing an object defined in a three-dimensional space from one perspective, and the like.
- the object defined in a three-dimensional space include a character in a virtual space, a three-dimensional image reconstructed from tomographic images, and the like.
- the method using plural display screens arranged three-dimensionally includes a case of displaying, on the respective display screens, plural two-dimensional images obtained by observing an object defined in a three-dimensional space from plural perspectives (corresponding to arrangement positions of the display screens).
- the method using a three-dimensional display includes a method requiring an observer to wear glasses having special optical characteristics, a method requiring no such special glasses, a method requiring an observer to wear a head-mounted display on the head, and the like.
- Examples of the method requiring no glasses include a method using the phenomenon that air at the focal point of condensed laser beams changes into plasma and emits light.
- information corresponding to the object does not necessarily have to include internal information (voxels defining volume data), and may be, for example, a collection of multiple faces (a polygon mesh).
- “to display an object two-dimensionally” means to display the object in a manner not containing depth information by using a single display screen, plural display screens arranged three-dimensionally, a so-called three-dimensional display, or the like.
- an object defined in a three-dimensional space is referred to as a three-dimensional object
- an object defined in a two-dimensional space is referred to as a two-dimensional object.
- An object may alternatively be displayed three-dimensionally by displaying two-dimensional objects corresponding to plural display apparatuses arranged three-dimensionally.
- the object in the exemplary embodiments may be displayed as a still image or a moving image.
- FIG. 1 illustrates an appearance example of an information processing apparatus 1 according to a first exemplary embodiment.
- the information processing apparatus 1 is, for example, assumed to be a mobile information terminal (mobile display device) such as a tablet computer or a smartphone.
- the information processing apparatus 1 illustrated in FIG. 1 includes six flat faces, among which a display 2 is arranged on one face. Note that not all of the faces of the information processing apparatus 1 are necessarily flat. In other words, one or more faces may be curved.
- a face on which the display 2 is provided is referred to as a front face
- faces positioned left and right of the display 2 are referred to as side faces
- a face opposite to the display 2 is referred to as a back face or a rear face.
- a face positioned above the display 2 is referred to as a top face
- a face positioned below the display 2 is referred to as a bottom face.
- the shape of the front face is a rectangle in which a length H in the Z direction (height) is longer than a length W in the X direction (width).
- a length D in the Y direction (depth) defining the side surfaces of the information processing apparatus 1 is shorter than the length W in the X direction (width).
- the display 2 is configured from a thin film display such as a liquid crystal display or an organic electroluminescent (EL) display. If the display 2 is a liquid crystal display, a light source (not illustrated) is also provided.
- a controller 3 that controls operations of the units including the display 2 and other components (not illustrated) are built into a housing of the information processing apparatus 1 .
- FIG. 2 illustrates an arrangement example of planar sensors.
- FIG. 2 illustrates a position detection sensor 4 , a pressure detection sensor 5 , and a distortion detection sensor 6 .
- the position detection sensor 4 is a sensor that detects the position of an input operation performed by a user and is stacked on the top face of the display 2 .
- An electronic device combining the position detection sensor 4 with the display 2 is called a touch panel.
- the position detection sensor 4 is an example of a detector.
- a detectable object differs according to a detection method. For example, in a case of using electrostatic capacitance for detection, parts of a person's body (e.g., fingertips) are detectable objects. For example, in a case of using infrared rays for detection, fingertips and other objects including pens are detectable objects.
- the position detection sensor 4 outputs the coordinates of a detected object.
- the position detection sensor 4 according to the first exemplary embodiment is capable of detecting plural operation positions at a time.
- the pressure detection sensor 5 is a sensor that detects the strength of pressure applied to an operation position at the time of an input operation and is provided on the rear face of the display 2 , for example.
- the pressure detection sensor 5 is a capacitive pressure sensor and detects, as the strength of pressure, the degree of flexure generated in a body of the sensor formed in the form of a film. In the case of this exemplary embodiment, the pressure detection sensor 5 is capable of detecting several levels of pressure difference.
- the distortion detection sensor 6 is a sensor that detects the degree of distortion generated in the body of the sensor and is provided on the rear face of the display 2 .
- as the distortion detection sensor 6 , for example, a displacement sensor to which the piezoelectricity of polylactic acid is applied, developed by Murata Manufacturing Co., Ltd., is used.
- the distortion detection sensor 6 is capable of detecting the direction and degree of distortion.
- the distortion is an example of a physical quantity by which, typically, a part is not directly determined.
- the position detection sensor 4 , the pressure detection sensor 5 , and the distortion detection sensor 6 are each an example of a detector.
- FIG. 3 illustrates an arrangement example of other sensors.
- the information processing apparatus 1 illustrated in FIG. 3 includes, inside the housing, pressure detection sensors 7 A and 7 B that detect the strength of pressure applied locally to lower portions of the side surfaces, an acceleration detection sensor 8 that detects the direction and value of acceleration applied to the body, temperature detection sensors 9 A, 9 B, and 9 C that detect temperatures, and humidity detection sensors 10 A, 10 B, and 10 C that detect humidities.
- the number of temperature detection sensors and the number of humidity detection sensors may each be one.
- the pressure detection sensors 7 A and 7 B are capable of detecting some strength levels of pressure applied to the side surfaces of the information processing apparatus 1 .
- the temperature detection sensors 9 A, 9 B, and 9 C are used for detection of the temperature (ambient temperature) of the space where the information processing apparatus 1 is used and also are used for detection of the temperature of a local area.
- the temperature is an example of a physical quantity by which, typically, a part is not directly determined.
- the humidity detection sensors 10 A, 10 B, and 10 C are used for detection of the humidity of the space where the information processing apparatus 1 is used and also are used for detection of the humidity of a local area.
- the humidity is an example of a physical quantity by which, typically, a part is not directly determined.
- the pressure detection sensors 7 A and 7 B, the acceleration detection sensor 8 , the temperature detection sensors 9 A, 9 B, and 9 C, and the humidity detection sensors 10 A, 10 B, and 10 C are each an example of a detector.
- FIG. 4 illustrates a hardware configuration example of the information processing apparatus 1 .
- the information processing apparatus 1 includes, in addition to the above-described devices, a non-volatile storage device 14 used for data storage and a communication unit 15 . These devices transmit and receive data via a bus 16 .
- the controller 3 includes a central processing unit (CPU) 11 that executes data processing, a read only memory (ROM) 12 that stores programs such as a basic input/output system (BIOS) and firmware, and a random access memory (RAM) 13 used as a work area.
- the storage device 14 is configured from, for example, a semiconductor memory or a hard disk device.
- the communication unit 15 is a communicator used for communication with an external apparatus. A variety of schemes are used for communication. Note that the communication path may be a wired path or a wireless path.
- FIG. 5 illustrates a functional configuration example of the controller 3 according to the first exemplary embodiment.
- the functional configuration illustrated in FIG. 5 is realized by a program being executed.
- the controller 3 serves as a pressure strength determiner 21 that determines the strength of pressure, a pressurized part determiner 22 that determines the part to which pressure has been applied, an operation position determiner 23 that determines the position of an operation, a temperature determiner 24 that determines a temperature, a humidity determiner 25 that determines a humidity, an acceleration direction determiner 26 that determines the direction of acceleration, an acceleration value determiner 27 that determines the value of acceleration, a distortion determiner 28 that determines the direction and degree of distortion, and a display content determiner 29 that determines display content by using determined information.
- the display content determiner 29 is an example of a display controller.
- the pressure strength determiner 21 receives a pressure value that is output from the pressure detection sensor 5 provided on the front face side and pressure values that are output from the two pressure detection sensors 7 A and 7 B provided on the side face side, and outputs the strengths of pressure applied to the respective positions by comparing the values with a threshold prepared in advance.
- if pressure values are input from the pressure detection sensors 7 A and 7 B, the pressurized part determiner 22 determines that the positions at which pressure is applied by a user operation are on the side faces. On the other hand, if operation coordinates are input from the position detection sensor 4 , the pressurized part determiner 22 determines that the position at which pressure is applied by a user operation is on the front face.
- the pressurized part determiner 22 and the operation position determiner 23 are provided separately in this exemplary embodiment, the pressurized part determiner 22 may also serve as the operation position determiner 23 .
- if pressure values are input from the pressure detection sensors 7 A and 7 B, the operation position determiner 23 determines that a user operates the side surfaces. If operation coordinates are input from the position detection sensor 4 , the operation position determiner 23 determines that a user operates the position according to the operation coordinates. The operation position determiner 23 also determines, in addition to operation positions at respective times, a locus of the operation positions over time.
- the operation position determiner 23 according to this exemplary embodiment is capable of detecting plural operation positions at a time.
- the temperature determiner 24 determines temperatures of the respective parts, a temperature distribution, a temporal change, and the like on the basis of temperature values that are input from the temperature detection sensors 9 A, 9 B, and 9 C.
- the humidity determiner 25 determines humidities of the respective parts, a humidity distribution, a temporal change, and the like on the basis of humidity values that are input from the humidity detection sensors 10 A, 10 B, and 10 C.
- the acceleration direction determiner 26 determines the direction of acceleration that has acted on the housing and a temporal change thereof on the basis of acceleration information that is input from the acceleration detection sensor 8 .
- the acceleration value determiner 27 determines the value of acceleration that has acted on the housing and a temporal change thereof on the basis of acceleration information that is input from the acceleration detection sensor 8 .
- the distortion determiner 28 determines the direction and degree of distortion generated in the housing on the basis of an output from the distortion detection sensor 6 .
- the display content determiner 29 makes a change to an object that is displayed three-dimensionally on the display 2 . Specific change contents will be described later.
- the physical quantity as a detection target differs according to the type and arrangement of sensors provided in the housing.
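The determiner logic described above (raw sensor values compared against prepared thresholds, and pressed-part attribution based on which sensor reported) can be illustrated with a small sketch. The function names and the threshold values here are assumptions for illustration only; the patent specifies the behavior, not an API.

```python
def determine_pressure_strength(value, thresholds=(10, 50)):
    """Classify a raw pressure reading into a few discrete strength
    levels by comparison with thresholds prepared in advance
    (illustrative values, standing in for the pressure strength
    determiner 21)."""
    weak_limit, strong_limit = thresholds
    if value < weak_limit:
        return "none"
    if value < strong_limit:
        return "weak"
    return "strong"

def determine_pressurized_part(operation_coords):
    """Attribute the pressed part, as the pressurized part determiner 22
    does: coordinates from the position detection sensor 4 mean the
    front face was pressed; otherwise the side faces were."""
    return "front" if operation_coords is not None else "side"
```

The display content determiner would then combine such judgements (part, strength, direction) to decide how the three-dimensionally displayed object changes.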
- FIG. 6 illustrates an example of a three-dimensional object displayed on the display 2 of the information processing apparatus 1 .
- the three-dimensional object illustrated in FIG. 6 represents a character 30 (virtual creature) in the form of a person and is an example of an object that is displayed three-dimensionally.
- FIG. 7 illustrates changes that are made to the character 30 displayed as the three-dimensional object upon acceleration (collision) being applied to the top face of the information processing apparatus 1 .
- the acceleration (collision) is an example of a physical quantity by which, typically, a part is not directly determined.
- FIG. 7 is a case in which acceleration is applied downward ( ⁇ Z direction) to the information processing apparatus 1 at time t 1 .
- This event is determined by the acceleration direction determiner 26 (see FIG. 5 ) if, for example, a collision is applied to the top face of the information processing apparatus 1 .
- the display content determiner 29 determines the part of the three-dimensional object on which the external force acts, on the basis of information regarding the determined acceleration direction.
- the head of the character 30 is positioned higher than any other portions.
- the display content determiner 29 determines that the external force acts on the head of the character 30 .
- the acceleration value determiner 27 (see FIG. 5 ) also determines the value of acceleration at time t 1 .
- the acceleration value determiner 27 compares a numeric value that is received from the acceleration detection sensor 8 (see FIG. 4 ) with a threshold, and determines the strength of the external force that has acted on the information processing apparatus 1 .
- the display content determiner 29 changes the display content at time t 2 (t 2 >t 1 ) in such a manner that a small amount of blood 31 flows from the head.
- the head is an example of a specific part. Note that the number of specific parts is not necessarily one and may be plural. In addition, the specific part is, in principle, a part of the three-dimensional object.
- the display content determiner 29 changes the display content at time t 2 (t 2 >t 1 ) in such a manner that a greater amount of blood 32 flows from the head than the amount of a case in which the value of acceleration is small.
- a bruise (internal bleeding) may be displayed at the part on which the external force has acted.
- the area of the bruise may be changed in accordance with the strength of the external force that has acted.
- a wen may be displayed at the part on which the external force has acted.
- the size of the wen may be changed in accordance with the strength of the external force that has acted.
- an arm of the character 30 may be determined as a part to which a change is to be made if the acceleration is in the left-right direction of the display 2 .
- the character 30 in the form of a person is assumed as an example of the three-dimensional object in the example of FIG. 7
- the three-dimensional object may be, for example, a structure. If the three-dimensional object is a structure, a change may be made in such a manner that a scratch or a crack is generated in accordance with the determined value of acceleration.
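The mapping described above, in which the acceleration direction selects the affected part (downward acceleration acts on the head, the highest part; left-right acceleration on an arm) and the acceleration value, compared with a threshold, selects the severity of the displayed change, can be sketched as follows. Names and numeric values are illustrative assumptions, not the patent's implementation.

```python
def affected_part(direction):
    """Pick the part of the character on which the external force acts,
    from the determined acceleration direction (illustrative table)."""
    return {"-Z": "head", "+X": "arm", "-X": "arm"}.get(direction, "body")

def bleed_amount(acceleration, threshold=5.0):
    """Compare the acceleration value with a threshold, as the
    acceleration value determiner does, to choose between a small and
    a greater amount of displayed blood (threshold value assumed)."""
    return "large" if acceleration >= threshold else "small"
```

For a structure instead of a character, the same two judgements could instead select where a scratch appears and how severe it is.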
- FIG. 8 illustrates changes that are made to the character 30 displayed as the three-dimensional object if acceleration (collision) is continuously applied to the top face of the information processing apparatus 1 .
- the information processing apparatus 1 may tap another object, the information processing apparatus 1 may be tapped by another object, or the information processing apparatus 1 may be shaken while being held.
- In FIG. 8 , a case in which a user continuously taps an end face (top face) of the information processing apparatus 1 with a finger or the like is assumed.
- the column chart illustrated in FIG. 8 has the horizontal axis representing time and the vertical axis representing the number of times of collision applied at respective times.
- a part of the displayed image may be tapped with a fingertip to apply acceleration (collision) to a specific part.
- the position at which acceleration (collision) is applied is determined by the position detection sensor 4 (see FIG. 4 ).
- the value of acceleration is detected by the acceleration detection sensor 8 (see FIG. 4 ).
- the number of times of collision increases from time t 1 to time t 2 , and then decreases.
- although collisions do not necessarily have equal strength, typically the total strength of collisions that act on the specific part is in proportion to the number of times of collision. However, the total strength of five collisions each having a collision strength of "2" equals the total strength of two collisions each having a collision strength of "5".
- the area of the flowing blood 33 increases from time t 1 to time t 2 ; after time t 2 , from which the number of times of collision decreases, the area of the blood 33 flowing from the head of the character 30 decreases (time t 3 ) and finally disappears (time t 4 ).
- the total strength is an example of a total of results of plural times of detection.
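The accumulation described above can be sketched in code. This is a minimal illustration, not the patented implementation: the function names, the per-unit area factor, and the units are all assumptions.

```python
# Hypothetical sketch: the total strength of collisions detected within a
# period drives the size of the displayed change (here, a blood area).
# Names and the area_per_unit factor are illustrative assumptions.

def total_collision_strength(strengths):
    """Sum the strengths of the individual collisions in a period."""
    return sum(strengths)

def blood_area(total_strength, area_per_unit=2.0):
    """Map total collision strength to a displayed blood area (arbitrary units)."""
    return total_strength * area_per_unit

# Five collisions of strength 2 and two collisions of strength 5 yield the
# same total strength, so they produce the same displayed area.
assert total_collision_strength([2] * 5) == total_collision_strength([5] * 2) == 10
```

This reflects the point made above: only the total of the detection results matters, not how it is divided among individual collisions.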
- FIG. 9 illustrates changes that are made to a structure 34 displayed as the three-dimensional object if acceleration (collision) is continuously applied to the top face of the information processing apparatus 1 .
- FIG. 9 illustrates a case in which the structure 34 is a cylinder.
- acceleration is applied downward ( ⁇ Z direction) to the information processing apparatus 1 at time t 1 .
- the display content on the information processing apparatus 1 is changed in such a manner that a scratch 35 is formed on the top face of the cylinder and a crack 36 grows inside the cylinder.
- although the crack 36 generated inside is represented by a dashed line in FIG. 9 for illustration, only the scratch 35 observable from the outside may be displayed on the information processing apparatus 1 .
- acceleration is still applied to the top face of the information processing apparatus 1 at time t 2 (t 2 >t 1 ).
- the value of acceleration applied at time t 2 may be equal to or different from the value of acceleration applied at time t 1 .
- the width of the scratch 35 increases and the crack 36 grows in the depth direction compared with that in the display at time t 1 .
- acceleration is still applied to the top face of the information processing apparatus 1 at time t 3 (t 3 >t 2 ).
- the value of acceleration applied at time t 3 may be equal to or different from the value of acceleration applied at time t 1 or t 2 .
- the scratch 35 is wider and the crack 36 is deeper than those in the display at time t 2 .
- FIG. 10 illustrates changes that are made to the character 30 displayed as the three-dimensional object upon pressure being applied to the head of the character 30 .
- the position on which the pressure acts at this time is determined by the pressurized part determiner 22 (see FIG. 5 ) that receives an output signal from the position detection sensor 4 (see FIG. 4 ). This is because the position to which the pressure is applied is the same as the contact position of a fingertip 37 .
- the strength of pressure at this time is determined by the pressure strength determiner 21 (see FIG. 5 ) that receives an output signal from the pressure detection sensor 5 (see FIG. 4 ).
- the pressure strength determiner 21 compares the numerical value that is input from the pressure detection sensor 5 with a predetermined threshold to determine the strength of pressure that has acted on the information processing apparatus 1 .
- the strength is determined in several discrete levels.
- if the determined strength of pressure is weak, the display content determiner 29 changes the display content at time t 2 (t 2 >t 1 ) in such a manner that a small amount of the blood 31 flows from the head.
- if the determined strength of pressure is strong, the display content determiner 29 changes the display content at time t 2 in such a manner that a greater amount of the blood 32 flows from the head than in the case in which the strength of pressure is weak.
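The threshold comparison performed by the pressure strength determiner 21 can be sketched as follows. The threshold values, level names, and blood amounts are assumptions for illustration only.

```python
# Minimal sketch of classifying a raw pressure reading into discrete levels
# and selecting the displayed bleeding amount. Thresholds are hypothetical.

WEAK_THRESHOLD = 10.0
STRONG_THRESHOLD = 50.0

def pressure_level(sensor_value):
    """Classify a pressure sensor reading into 'none', 'weak', or 'strong'."""
    if sensor_value < WEAK_THRESHOLD:
        return "none"
    if sensor_value < STRONG_THRESHOLD:
        return "weak"
    return "strong"

def blood_amount(level):
    """Select the displayed bleeding amount for a determined pressure level."""
    return {"none": 0, "weak": 1, "strong": 5}[level]

assert pressure_level(30.0) == "weak" and blood_amount("weak") == 1
assert pressure_level(80.0) == "strong" and blood_amount("strong") == 5
```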
- a user may directly determine the part to which a change is to be made in the display.
- a user may adjust the amount of change to be made to the specific part in accordance with the strength of pressure.
- although the position to which pressure is applied is determined by using the output from the pressure detection sensor 5 in the example described above, the part of the character 30 to which a change is to be made in the display may be determined by using the outputs from the pressure detection sensors 7 A and 7 B (see FIG. 3 ) that are prepared for detecting locally applied pressure.
- blood may flow from the head, or a change may be made to another part such as a hand, a leg, or the chest in the display.
- the part to which a change is to be made may be set in the display content determiner 29 in advance.
- FIG. 11 illustrates changes that are made to the character 30 displayed as the three-dimensional object if pressure is continuously applied by the fingertip 37 to the head of the character 30 .
- the pressure may be applied by an object other than a fingertip, such as a tip of a pen.
- the fingertip 37 does not have to be continuously in contact with the surface of the display 2 and may tap a specific part discontinuously as in the case in which acceleration (collision) is applied.
- the head of the character 30 remains pushed by the fingertip 37 from time t 1 to time t 3 .
- the display content determiner 29 determines that the head keeps bleeding and adds an image of the blood 31 to the head at time t 2 and increases the area of the blood 32 at time t 3 .
- the bleeding amount may be in proportion to the strength of pressure. For example, if the pressure is strong, it may be determined that the bleeding amount is large; if the pressure is weak, it may be determined that the bleeding amount is small. Alternatively, regardless of the strength of pressure, the bleeding amount may be increased and decreased in proportion to the length of time during which a pressure exceeding a predetermined threshold is applied.
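The duration-based alternative described above can be sketched as a simple integration over sensor samples. The sample values, sampling interval, and threshold are hypothetical.

```python
# Sketch: the bleeding amount grows in proportion to how long the pressure
# stays above a threshold, regardless of its exact strength.

PRESSURE_THRESHOLD = 10.0

def bleeding_amount(samples, dt, rate=1.0):
    """Integrate the time during which pressure exceeds the threshold.

    samples: pressure readings taken every dt seconds.
    rate: bleeding amount added per second of over-threshold pressure.
    """
    over = sum(1 for p in samples if p > PRESSURE_THRESHOLD)
    return over * dt * rate

# Pressure held above the threshold for 3 of 5 samples at 0.5 s intervals.
assert bleeding_amount([5, 12, 15, 11, 8], dt=0.5) == 1.5
```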
- the change content may be different in accordance with the part on which an external force acts.
- the change content may be any content as long as the specific part is displayed in a different form, such as addition, deletion, deformation, or division in the display.
- FIG. 12 illustrates changes that are made to the character 30 displayed as the three-dimensional object upon pressure being applied to the chest of the character 30 .
- the chest, not the head, is pushed by the fingertip 37 at time t 1 .
- a change is made to clothing depending on a difference of the strength of pressure applied to the chest.
- the display content determiner 29 changes the image into an image in which the clothing of the character 30 is lifted or strained at time t 2 (t 2 >t 1 ).
- the display content determiner 29 changes the image into an image in which the clothing of the character 30 is ripped at time t 2 .
- the type of clothing, or the clothing itself, may be changed depending on the determined strength of pressure.
- FIG. 13 illustrates changes that are made to the structure 34 displayed as the three-dimensional object upon distortion being applied to the information processing apparatus 1 .
- FIG. 13 illustrates a cylinder as the structure 34 .
- Stripes are provided on the outer circumferential face of the cylinder, parallel to the axial direction of the rotationally symmetric cylinder.
- the direction and degree of distortion acting on the information processing apparatus 1 are given from the distortion determiner 28 (see FIG. 5 ) to the display content determiner 29 .
- in accordance with the direction and degree of distortion determined at time t 1 , the display content determiner 29 generates an image in which the structure 34 is distorted and displays the image on the display 2 .
- a change is made to the structure 34 at time t 2 at which distortion is no longer applied, not at time t 1 at which distortion is applied to the information processing apparatus 1 .
- a change may be made to the displayed structure 34 at time t 1 at which distortion is applied.
- although the change content works together with the distortion applied to the information processing apparatus 1 in the example of FIG. 13 , another change may be made to the three-dimensional object, which is a display target, by using information on the determined direction and degree of distortion.
- a character displayed on the display 2 may bleed or be scratched, or a change may be made to the clothing of the character.
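One way the determined direction and degree of distortion could drive the displayed deformation is a shear applied to the object's vertices. This is an illustrative sketch; the mapping from the distortion determiner's output to a shear factor is an assumption.

```python
# Hypothetical sketch: apply a horizontal shear to 2D vertex positions in
# accordance with the determined degree and direction of distortion.

def shear_vertices(vertices, degree, direction=1):
    """Shear vertex (x, y) pairs horizontally by degree * y.

    direction: +1 shears toward +X, -1 toward -X, mirroring the sign of the
    distortion determined by the distortion determiner 28 (an assumption).
    """
    return [(x + direction * degree * y, y) for x, y in vertices]

square = [(0, 0), (1, 0), (1, 1), (0, 1)]
sheared = shear_vertices(square, degree=0.5)
assert sheared == [(0, 0), (1, 0), (1.5, 1), (0.5, 1)]
```

The same vertex transform could equally feed a change such as a scratch or clothing change, as noted above, rather than a geometric distortion.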
- FIG. 14 illustrates changes that are made to the character 30 displayed as the three-dimensional object upon a temperature change being detected by the temperature detection sensors 9 A, 9 B, and 9 C (see FIG. 3 ).
- the display content determiner 29 makes a change to a specific part of the displayed character 30 on the basis of a result of comparison between a determined temperature and a predetermined threshold or information on the distribution of temperatures detected by the plural temperature detection sensors 9 A, 9 B, and 9 C.
- the display content determiner 29 changes the display content at time t 2 into a state in which a scarf 38 is put around the neck of the character 30 .
- the neck of the character 30 is an example of the specific part.
- the display content determiner 29 changes the display content at time t 2 into a state in which the forehead of the character 30 is covered with sweat 39 .
- the forehead of the character 30 is an example of the specific part.
- FIG. 15 illustrates a change that is made to an ice cube 40 displayed as the three-dimensional object upon a temperature change being detected by the temperature detection sensors 9 A, 9 B, and 9 C ( FIG. 3 ).
- the display content determiner 29 makes a change to a specific part of the displayed ice cube 40 on the basis of a result of comparison between a determined temperature and a predetermined threshold or information on the distribution of temperatures detected by the plural temperature detection sensors 9 A, 9 B, and 9 C.
- the example of FIG. 15 is a case in which the ambient temperature of the information processing apparatus 1 is higher than the predetermined threshold.
- the display content determiner 29 makes a change in such a manner that the displayed ice cube 40 melts to become smaller and water 41 gathers around the ice cube 40 .
- the ice cube 40 is an example of the specific part.
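The temperature-driven changes of FIGS. 14 and 15 both reduce to comparing a determined temperature against thresholds and selecting a change for the specific part. The threshold values and change labels below are assumptions for the sketch.

```python
# Sketch of selecting a display change from a determined temperature.
# Thresholds (in degrees Celsius) and labels are illustrative assumptions.

COLD_THRESHOLD = 10.0
HOT_THRESHOLD = 30.0

def temperature_change(determined_temp, target):
    """Pick the display change for a character or for an object such as an ice cube."""
    if target == "character":
        if determined_temp <= COLD_THRESHOLD:
            return "scarf around neck"      # low temperature, as in FIG. 14
        if determined_temp >= HOT_THRESHOLD:
            return "sweat on forehead"      # high temperature, as in FIG. 14
        return "no change"
    if target == "ice cube":
        # As in FIG. 15: the cube melts when the ambient temperature is high.
        return "melt and show water" if determined_temp >= HOT_THRESHOLD else "no change"
    return "no change"

assert temperature_change(5.0, "character") == "scarf around neck"
assert temperature_change(35.0, "ice cube") == "melt and show water"
```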
- FIG. 16 illustrates a change that is made to the character 30 displayed as the three-dimensional object upon a humidity change being detected by the humidity detection sensors 10 A, 10 B, and 10 C (see FIG. 3 ).
- the display content determiner 29 makes a change to a specific part of the displayed character 30 on the basis of a result of comparison between a determined humidity and a predetermined threshold or information on the distribution of humidities detected by the plural humidity detection sensors 10 A, 10 B, and 10 C.
- the example of FIG. 16 is a case in which the ambient humidity of the information processing apparatus 1 is higher than the predetermined threshold.
- the display content determiner 29 changes the clothing and footwear of the character 30 into a raincoat 43 and rain boots 42 .
- the body and legs of the character 30 are each an example of the specific part.
- the clothing and footwear may be changed to the raincoat 43 and the rain boots 42 for rainy weather.
- the clothing and footwear may be changed to those for waterside such as river or sea.
- although the single acceleration detection sensor 8 detects the value of acceleration applied to the display 2 (see FIG. 1 ) in the first exemplary embodiment, plural acceleration detection sensors 8 may be provided in the plane of the display 2 as illustrated in FIG. 17 .
- FIG. 17 illustrates an example in which the acceleration detection sensors 8 are arranged in M rows and N columns in the plane of the display 2 .
- the distribution of values of local acceleration may be determined from positional information of the acceleration detection sensors 8 .
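Pairing each reading with its sensor's grid position gives the local distribution directly. The following is a minimal sketch with hypothetical sample values.

```python
# Sketch: determine the distribution of local acceleration from sensors
# arranged in M rows and N columns, and find the strongest local value.

def acceleration_distribution(readings):
    """readings: M x N nested list of acceleration values.

    Returns a dict mapping (row, col) grid positions to values.
    """
    return {(r, c): v
            for r, row in enumerate(readings)
            for c, v in enumerate(row)}

def strongest_position(readings):
    """Grid position of the largest detected acceleration."""
    dist = acceleration_distribution(readings)
    return max(dist, key=dist.get)

grid = [[0.0, 0.2, 0.1],
        [0.1, 0.9, 0.3],
        [0.0, 0.2, 0.1]]
assert strongest_position(grid) == (1, 1)
```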
- FIG. 18 illustrates a change in the number of times of detection of acceleration (collision) from a specific position in a case in which the acceleration detection sensors 8 are arranged in 4 rows and 4 columns.
- no collision is applied at time t 1 .
- the number of times of detection is zero in each of the acceleration detection sensors 8 .
- Collision is applied once to two acceleration detection sensors 8 positioned in the first row and the second and third columns at time t 2 .
- Collision is applied 10 times to the two acceleration detection sensors 8 positioned in the first row and the second and third columns at time t 3 , and propagation of vibrations in this process changes outputs from the acceleration detection sensors 8 in the second and third rows in the same columns.
- darker shading of the acceleration detection sensors 8 indicates a larger total number of times of collision.
- it is determined that acceleration propagates from the first row to the second and third rows on the basis of the distribution of the number of times of collision detected by the plural acceleration detection sensors 8 within a predetermined period.
- Information on this propagation direction may be used to control content of a change to be made to the specific part.
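One possible way to estimate the propagation direction from the per-sensor counts is to compare the row of the directly struck sensors with the count-weighted mean row. This is a hypothetical sketch, limited to row-wise propagation as in FIG. 18.

```python
# Hypothetical sketch: estimate the propagation direction of vibrations from
# the per-sensor collision counts accumulated within a predetermined period.

def propagation_direction(counts):
    """counts: M x N grid of collision counts per sensor."""
    total = sum(sum(row) for row in counts)
    if total == 0:
        return "none"
    # Row of the directly struck sensors (maximum count) vs. weighted mean row.
    max_row = max(range(len(counts)), key=lambda r: sum(counts[r]))
    mean_row = sum(r * sum(row) for r, row in enumerate(counts)) / total
    if mean_row > max_row:
        return "downward"   # vibration spread toward higher row indices
    if mean_row < max_row:
        return "upward"
    return "local"

# As at time t3 in FIG. 18: collisions on the first row, weaker responses below.
counts_t3 = [[0, 10, 10, 0],
             [0, 3, 3, 0],
             [0, 1, 1, 0],
             [0, 0, 0, 0]]
assert propagation_direction(counts_t3) == "downward"
```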
- substantially the same technique may be applied to the pressure detection sensor 5 .
- the position detection sensor 4 and the pressure detection sensor 5 having substantially the same shape as the display 2 (see FIG. 2 ) are combined to detect the position and strength of pressure in the first exemplary embodiment.
- plural pressure detection sensors 5 may be arranged in the plane of the display 2 , and positional information of the pressure detection sensors 5 that detect pressure may be used to calculate the position at which pressure is applied, the direction in which pressure as an external force is applied, the strength thereof, and the like.
- a part to which a change is to be made in the display may be determined on the basis of the change and distribution of strengths of pressure detected by the respective pressure detection sensors 5 .
- the first exemplary embodiment has illustrated a case in which the three-dimensional object is displayed on the single display 2 (see FIG. 1 ) included in the information processing apparatus 1 (see FIG. 1 ).
- substantially the same technique may be applied to processing of the controller 3 in an information processing apparatus 1 A that displays an object three-dimensionally by displaying plural two-dimensional objects corresponding to a single object.
- FIG. 19 illustrates an appearance example of the information processing apparatus 1 A including displays 2 A to 2 D on four faces, which are the front face, the side faces, and the back face.
- a character front image 30 A is displayed on the display 2 A provided on the front face
- a character right side image 30 B is displayed on the display 2 B provided on the right side face of the drawing
- a character left side image is displayed on the display 2 C provided on the left side face
- a character back image is displayed on the display 2 D provided on the back face.
- the displays may be provided on two faces, such as the front and back faces or the top and bottom faces.
- FIG. 20 illustrates an appearance example of an information processing apparatus 1 B according to a second exemplary embodiment.
- in FIG. 20 , parts corresponding to those in FIG. 1 are denoted by the corresponding reference numerals.
- the appearance of the information processing apparatus 1 B is substantially in the form of a plate as in the appearance of the information processing apparatus 1 according to the first exemplary embodiment.
- the information processing apparatus 1 B according to the second exemplary embodiment is different from the information processing apparatus 1 according to the first exemplary embodiment in that a housing is deformable at any position and in including deformation detection sensors 51 that detect the position where the housing is deformed.
- the plural deformation detection sensors 51 are arranged along the outline of the housing.
- the arrangement positions and intervals (density) of the deformation detection sensors 51 are determined in accordance with the size or specifications of the deformation detection sensors 51 .
- the deformation detection sensors 51 may be arranged to overlap with the display 2 .
- Each of the deformation detection sensors 51 is configured from a so-called strain sensor; for example, a displacement sensor that uses the piezoelectricity of polylactic acid, developed by Murata Manufacturing Co., Ltd., is used.
- the strain sensor outputs a signal whose level corresponds to the bend amount (angle).
- the strain sensor is a device by which deformation of an attached member is detectable. Accordingly, the deformation detection sensors 51 also detect curving before bending as deformation.
- a state in which the housing of the information processing apparatus 1 B is planar is referred to as a pre-deformation state or an initial state in the second exemplary embodiment.
- on the basis of outputs from the deformation detection sensors 51 , the controller 3 estimates a bent position.
- the deformation detection sensors 51 are arranged on one face (e.g., a face on which the display is provided) of the housing.
- the deformation detection sensors 51 may be arranged on both faces of the housing.
- the bent position is an example of a deformed position.
- a flexible housing formed of plastic or the like is provided with the display 2 used to display an image, the controller 3 that controls the entire apparatus, and the like.
- the information processing apparatus 1 B that is specialized in displaying is referred to as a flexible display.
- FIG. 21 illustrates a hardware configuration example of the information processing apparatus 1 B.
- the information processing apparatus 1 B is different from the information processing apparatus 1 according to the first exemplary embodiment in that the plural deformation detection sensors 51 are connected to the bus 16 .
- FIG. 22 illustrates a functional configuration example of the controller 3 according to the second exemplary embodiment.
- the information processing apparatus 1 B is different from the information processing apparatus 1 described in the first exemplary embodiment in including a bent position determiner 52 that receives outputs from the plural deformation detection sensors 51 and determines a bent position generated in the housing.
- the display content determiner 29 uses information on a bent position in the housing determined by the bent position determiner 52 to make a change to a specific part of an object that is displayed three-dimensionally or to make a change to display content.
- FIGS. 23A to 23E illustrate examples of the bent position determined by the bent position determiner 52 .
- FIG. 23A illustrates a bent position determined by the bent position determiner 52 if deformation is detected by two deformation detection sensors 51 each positioned near the midpoint of a short side of the information processing apparatus 1 B.
- the bent position determiner 52 determines that the display 2 is bent in such a manner that a crease is put along a line L 1 in parallel to long sides of the information processing apparatus 1 B. Note that creases in FIGS. 23A to 23E are illustrated for description, and the creases are not necessarily put.
- FIG. 23B illustrates a bent position determined by the bent position determiner 52 if deformation is detected by two deformation detection sensors 51 each positioned near the midpoint of a long side of the information processing apparatus 1 B.
- the bent position determiner 52 determines that the display 2 is bent in such a manner that a crease is put along a line L 2 in parallel to short sides of the information processing apparatus 1 B.
- FIG. 23C illustrates a bent position determined by the bent position determiner 52 if deformation is detected by two deformation detection sensors 51 positioned at the upper right corner and the lower left corner of the information processing apparatus 1 B.
- the bent position determiner 52 determines that the display 2 is bent in such a manner that a crease is put along a line L 3 on the diagonal connecting the upper right corner and the lower left corner.
- FIG. 23D illustrates a bent position determined by the bent position determiner 52 if deformation is detected by a deformation detection sensor 51 positioned at the upper right corner and a deformation detection sensor 51 positioned immediately below a deformation detection sensor 51 positioned at the upper left corner of the information processing apparatus 1 B.
- the bent position determiner 52 determines that the display 2 is bent in such a manner that a crease is put along a line L 4 that forms the hypotenuse of a right triangle whose right angle is at the upper left corner.
- FIG. 23E illustrates a bent position determined by the bent position determiner 52 if deformation is detected by three deformation detection sensors 51 positioned in the upper side (long side) and three deformation detection sensors 51 positioned in the lower side (long side).
- the bent position determiner 52 determines that the display 2 is deformed in such a manner that creases are put along a line L 5 in a valley fold, a line L 6 in a mountain fold, and a line L 7 in a valley fold from the left side to the right side.
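In the two-sensor cases of FIGS. 23A to 23D, the crease can be estimated as the straight line through the plane coordinates of the two sensors that reported deformation. The following sketch and its coordinates are hypothetical.

```python
# Minimal sketch of the bent position determination: the crease is the line
# through the two deformation detection sensors that reported deformation.

def crease_line(p1, p2):
    """Return (a, b, c) for the line a*x + b*y + c = 0 through p1 and p2."""
    (x1, y1), (x2, y2) = p1, p2
    a, b = y2 - y1, x1 - x2
    c = -(a * x1 + b * y1)
    return a, b, c

def on_line(line, p, eps=1e-9):
    """True if point p lies on the line (within tolerance eps)."""
    a, b, c = line
    return abs(a * p[0] + b * p[1] + c) < eps

# As in FIG. 23A: sensors near the midpoints of the two short sides of a
# 4 x 2 panel give a crease parallel to the long sides through the center.
line = crease_line((0.0, 1.0), (4.0, 1.0))
assert on_line(line, (2.0, 1.0))
```

The multi-sensor case of FIG. 23E would repeat this per pair of triggered sensors, one line per crease.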
- FIG. 24 illustrates an example in which an image of the three-dimensional object is edited by using information on the determined bent position.
- a three-dimensional image 53 of the three-dimensional object is displayed on the display 2 .
- the three-dimensional image 53 in this case is an apple (a virtual object).
- the shape of the information processing apparatus 1 B at time t 1 is an example of a first shape.
- at time t 2 , the information processing apparatus 1 B is bent at a position to the right of the center of the screen in such a manner that the display 2 comes inside. Also at this time, no change is made to the displayed three-dimensional image 53 of the three-dimensional object.
- the shape of the information processing apparatus 1 B at time t 2 is an example of a second shape.
- at time t 3 , the information processing apparatus 1 B is made flat again, and a line 54 is added to the three-dimensional image 53 of the three-dimensional object at the part corresponding to the bent position at time t 2 (a curved part on the surface of the apple).
- the line addition is an example of editing on the three-dimensional image 53 (e.g., image itself) of the three-dimensional object displayed across the bent position.
- a user operation for bending the information processing apparatus 1 B is used for the display content determiner 29 to add the line 54 .
- although the line 54 starts to be displayed after the information processing apparatus 1 B has been made flat again (at and after time t 3 ) in the example of FIG. 24 , the line 54 may start to be displayed while the information processing apparatus 1 B is bent (at time t 2 ). In this case, the displaying of the line 54 may end when the information processing apparatus 1 B is no longer bent.
- the part at which a dashed line corresponding to the bent position crosses the three-dimensional image 53 of the three-dimensional object corresponds to a specific part.
- FIG. 25 illustrates another example in which an image of the three-dimensional object is edited by using information on the determined bent position.
- the displaying of the cut plane enhances the sense of presence of the image processing.
- special effect processing to be performed is specified by a user in advance.
- sound effects may be added in accordance with the content of processing at the time of image processing.
- the sound of cutting an apple may be produced as a sound effect from a speaker (not illustrated).
- the addition of sound effects also enhances the sense of presence of the image processing.
- Deformation of the information processing apparatus 1 B is also usable to control a processing operation of another information processing apparatus.
- FIG. 26 illustrates an example in which a deformation operation of the information processing apparatus 1 B is used to control a display operation of a display device 57 whose display is controlled by another information processing apparatus, which is an information processing apparatus 56 .
- deformation information detected by the information processing apparatus 1 B that is deformable at any position is transmitted to the information processing apparatus 56 through a communication unit, and is used to control a screen displayed on the display device 57 provided on or connected to the information processing apparatus 56 .
- the information processing apparatus 56 is configured as a so-called computer, and content of an image displayed on the information processing apparatus 1 B used as an operation unit may be the same as or different from content of an image displayed on the display device 57 through the information processing apparatus 56 .
- the cut plane 55 is displayed in a part of the three-dimensional image 53 of the three-dimensional object displayed on the display device 57 . In other words, a part of the three-dimensional image 53 is deleted.
- FIG. 27 illustrates another example in which a deformation operation of the information processing apparatus 1 B is used to control a display operation of the display device 57 whose display is controlled by another information processing apparatus, which is the information processing apparatus 56 .
- FIG. 27 illustrates an example in which deformation of the information processing apparatus 1 B, used as an operation input unit, is detected by the information processing apparatus 56 and used to issue an instruction for starting a slide show or for turning pages displayed on the display device 57 .
- a screen for a slide show is switched from a first page to a second page. In other words, a display image is replaced.
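How the receiving apparatus might map transmitted deformation events to slide-show commands can be sketched as follows. The event names and the command mapping are assumptions, not taken from the specification.

```python
# Hypothetical sketch: deformation events from the information processing
# apparatus 1B are mapped by the apparatus 56 to slide-show commands on the
# display device 57. Event names and mapping are illustrative assumptions.

COMMANDS = {
    "bend_right_edge": "next_page",
    "bend_left_edge": "previous_page",
    "bend_and_release": "start_slide_show",
}

class SlideShow:
    def __init__(self, pages):
        self.pages, self.index, self.running = pages, 0, False

    def handle(self, deformation_event):
        """Apply the command for a deformation event; return the shown page."""
        command = COMMANDS.get(deformation_event)
        if command == "start_slide_show":
            self.running = True
        elif command == "next_page" and self.index < len(self.pages) - 1:
            self.index += 1
        elif command == "previous_page" and self.index > 0:
            self.index -= 1
        return self.pages[self.index]

show = SlideShow(["first page", "second page"])
show.handle("bend_and_release")
assert show.handle("bend_right_edge") == "second page"  # the page is turned
```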
- This exemplary embodiment will describe a case in which a three-dimensional display is used to display an object three-dimensionally.
- FIG. 28 illustrates an example of a three-dimensional object displayed by a three-dimensional display.
- An information processing system 61 illustrated in FIG. 28 includes an image capturing apparatus 64 that captures an image of a user, an information processing apparatus 65 , and a three-dimensional space depicting apparatus 66 .
- the image capturing apparatus 64 is an apparatus that captures an image of the movement of a user 63 as a subject and is a type of sensor.
- the information processing apparatus 65 is an apparatus that performs processing for outputting data of the three-dimensional object that is a display target to the three-dimensional space depicting apparatus 66 , processing for determining the movement of the user 63 by processing the captured image, image processing on the three-dimensional object in accordance with the determination results, and the like.
- the information processing apparatus 65 is configured as a so-called computer, and the above-described various kinds of processing are performed by execution of a program.
- the three-dimensional space depicting apparatus 66 is configured from, for example, an infrared pulse laser, a lens for adjusting a focal point at which an image is formed with laser beams, a Galvanometer mirror used for planar scanning of laser beams in a space, and the like and depicts a three-dimensional image in the air on the basis of the given data of the three-dimensional object.
- a three-dimensional image 62 of an apple is formed by air plasma emission and is floating in the air as the three-dimensional object.
- although the movement of the user 63 is detected through image processing in this exemplary embodiment, the movement of the user 63 may be detected by using an output from a sensor that detects the movement of the air. Not only the movement of the user 63 but also the body temperature of the user 63 , a change thereof, the ambient temperature of the user 63 , humidity around the user 63 , and a change thereof may be detected through thermography.
- FIG. 29 illustrates a state in which a change is made to a displayed three-dimensional object in response to a specific movement of the user.
- in FIG. 29 , parts corresponding to those in FIG. 28 are denoted by the corresponding reference numerals.
- FIG. 29 illustrates a case in which a user moves to vertically cut a specific part of the three-dimensional image 62 .
- the information processing apparatus 65 that detects the movement of the user determines the position of the three-dimensional image 62 to which a change is to be made in response to the movement of the user, and determines a change content.
- the volumes of divisions obtained by dividing the three-dimensional image 62 of the apple along the determined position are calculated from data of the three-dimensional object, and depiction data is generated such that a division having the larger volume is depicted in the air.
- in FIG. 29 , a cut plane 67 of the division having the larger volume in the three-dimensional image 62 of the apple is depicted.
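The volume comparison above can be sketched with a point-sampled model: split the samples by a cutting plane and keep the larger group for depiction. The sampling approach and the vertical plane x = x_cut are illustrative assumptions.

```python
# Sketch: split a point-sampled 3D model by a vertical cutting plane
# x = x_cut, estimate the volumes of the two divisions by counting sample
# points, and keep the larger division for depiction.

def larger_division(points, x_cut):
    """Split sample points at x = x_cut and return the larger group."""
    left = [p for p in points if p[0] < x_cut]
    right = [p for p in points if p[0] >= x_cut]
    return left if len(left) >= len(right) else right

# A crude model sampled on a 10 x 10 x 10 grid, cut off-center: the left
# division is larger, so it is the one depicted.
samples = [(x, y, z) for x in range(10) for y in range(10) for z in range(10)]
kept = larger_division(samples, x_cut=7)
assert len(kept) == 700  # 7 of the 10 x-slices survive the cut
```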
- the change content may be controlled in any manner in accordance with the three-dimensional object that is depicted as in the case of the first exemplary embodiment.
- FIG. 30 illustrates an example in which a change is made to a three-dimensional object that is projected onto a wall or a floor.
- in FIG. 30 , parts corresponding to those in FIG. 28 are denoted by the corresponding reference numerals.
- An information processing system 61 A illustrated in FIG. 30 is different from the information processing system 61 in that a projecting apparatus 68 is used in place of the three-dimensional space depicting apparatus 66 .
- the projecting apparatus 68 projects a three-dimensional image 69 of the three-dimensional object onto a wall, which is a two-dimensional space.
- a user illustrated in FIG. 30 moves his/her right hand from up to down ( ⁇ Z direction) along the wall, which is a projection plane.
- the information processing apparatus 65 detects the movement of the user through an infrared sensor (not illustrated) or image processing and makes a change to the three-dimensional image 69 to be projected.
- a cut plane 70 is displayed.
- the color of a specific part of an object that is displayed three-dimensionally may be changed.
- in the exemplary embodiments described above, a change is made by using a sensor such as a sensor that detects the position on a display screen receiving user operations (the position detection sensor 4 ), a sensor that detects a user operation as pressure (the pressure detection sensors 5 , 7 A, and 7 B), or a sensor that detects distortion of a housing (the distortion detection sensor 6 (see FIG. 4 )).
- a sensor that measures the elevation of the position where the information processing apparatus 1 is used may be used.
- examples of this type of sensor include an altimeter that measures atmospheric pressure to calculate the elevation, a global positioning system (GPS) receiver that calculates the elevation by using GPS signals, and the like. Note that some GPS signals are usable indoors.
- Examples further include an air pressure sensor, an illuminance sensor, a water pressure sensor, a water depth sensor, and the like for measuring the position where the information processing apparatus 1 is used.
- a change may be made to a specific part or a background of an object displayed on the display 2 , or a screen displayed on the display 2 may be changed.
- wings may be added to a part of the object (e.g., the back or arms of a character)
- clouds may be added to the background of the object
- the screen may be switched to a bird's-eye view looking down upon surrounding landscapes from above or an image of a sky.
- a change may be made in such a manner that a part of the object (e.g., the stomach or cheeks of a character) is puffed out or sucked in in accordance with a change in the air pressure value.
- a change may be made to an image in such a manner that the volume of an object representing a sealed bag, a balloon, or the like is increased or decreased as the air pressure is decreased or increased.
- a change may be made to a part of an object in accordance with a change in the illuminance value.
- a character as the object may wear sunglasses in bright locations or may carry a flashlight in dark locations.
- the display form of the object or the display content of the screen may be switched between night and day in accordance with the brightness of an ambient environment.
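As a sketch of this kind of illuminance-driven switching, the following hypothetical Python function picks a display variant for the character from an ambient-light reading; the thresholds (50 lx and 10,000 lx) are invented for illustration and are not values from this description:

```python
def character_variant(illuminance_lux):
    """Choose how the character is drawn from ambient illuminance.
    The thresholds below are illustrative assumptions."""
    if illuminance_lux >= 10_000:   # bright location, e.g. direct sunlight
        return "sunglasses"
    if illuminance_lux <= 50:       # dark location, e.g. night or unlit room
        return "flashlight"
    return "default"
```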
- a GPS sensor may be used as the sensor.
- the language therein may be changed to the first language of the detected country or area.
- Contents of questions and answers described in the operation manual or contents related to precautions for operation may be changed on the basis of an output value of a sensor, which changes in accordance with a use environment of the information processing apparatus 1 .
- the change herein includes, for example, a change of the position of description in such a manner that contents that are likely to be referred to in an environment determined on the basis of an output from a sensor are placed in higher levels.
- a change may be made in such a manner that precautions for a hot and humid environment are placed in higher levels of the operation manual displayed as the object.
- pictures in the operation manual may be localized (customized) in accordance with a hot and humid environment.
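The reordering of manual contents described above can be sketched as a relevance sort: sections whose tags match the sensed environment are moved to higher levels. The section titles and environment tags below are hypothetical:

```python
# Hypothetical operation-manual sections tagged with the environments
# in which they are most likely to be consulted.
SECTIONS = [
    {"title": "Basic operation", "tags": set()},
    {"title": "Precautions for hot and humid environments", "tags": {"hot", "humid"}},
    {"title": "Battery care in cold weather", "tags": {"cold"}},
]

def reorder_manual(sections, environment):
    """Place sections matching the sensed environment first; Python's
    sort is stable, so the original order is preserved otherwise."""
    return sorted(sections, key=lambda s: -len(s["tags"] & environment))

# In a hot and humid environment, the matching precautions rise to the top.
ordered = reorder_manual(SECTIONS, {"hot", "humid"})
```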
- each sensor is provided in the information processing apparatus 1 .
- a sensor may be independent of the information processing apparatus 1 , and an output value of the sensor may be given to the information processing apparatus 1 through a communication unit.
- the environmental information herein is information that is related to positional information of the information processing apparatus 1 and that is acquirable from the outside and includes, for example, information regarding weather, such as a weather forecast, information regarding crime prevention, such as occurrence of a crime, and information regarding traffic, such as a traffic accident or a traffic jam.
- sleeves and cuffs of the character image may be shortened (clothing may be changed to a short-sleeved shirt and shorts), and the body may be sweating.
- a change may be made in such a manner that the character's expression is changed to a frightened expression or the character's body is trembling.
- a change may be made to the display in the same manner.
- the display content may be changed by combining changes corresponding to the plural pieces of environmental information. For example, if the above high-temperature warning and the above notification of occurrence of a crime are both acquired, a change may be made in such a manner that a character wears light clothing, is sweating, and has a frightened expression.
- a change may be made to the displayed object by combining information acquired by the sensor and environmental information.
- a specific change is made to the display by using a physical quantity measured by each sensor.
- outputs from plural sensors may be combined to make a change to the display.
- the physical quantity herein may include pressure, acceleration, temperature, humidity, air pressure, elevation, water depth, magnetic pole, sound, positional information, and the like, which are measurement targets.
- the physical quantity herein may further include a change in an electrical signal (current or voltage) that appears in the sensor.
- four pieces of information which are temporal information, an elevation value, a temperature value, and an illuminance, may be combined to determine a use environment of the information processing apparatus 1 , and in accordance with the determined use environment (e.g., mountain in summer), a change may be made to a part of the object or to a screen.
- the clothing of a character as the object may be changed to clothing in a summer resort or clothing for climbing a mountain in summer, and an image displayed on the display 2 may be changed to an image of a summer sky.
- creatures that live in water areas corresponding to the determined water depth and water temperature may be displayed on the display 2 by combining the water depth and the temperature value, and a change may be made to the display form of the object.
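Such a combination of temporal information, an elevation value, a temperature value, and an illuminance can be sketched as a small rule-based classifier. All thresholds and labels below are illustrative assumptions, not values from the patent:

```python
def classify_environment(month, elevation_m, temperature_c, illuminance_lux):
    """Combine four pieces of information into a use-environment label.
    Thresholds and labels are illustrative assumptions."""
    summer = month in (6, 7, 8)  # northern-hemisphere summer, assumed
    if summer and elevation_m > 1500 and 5 < temperature_c < 25 and illuminance_lux > 1000:
        return "mountain in summer"
    if temperature_c < 0 and illuminance_lux < 100:
        return "winter night"
    return "unclassified"
```

The returned label would then select the corresponding change, e.g. summer mountain clothing for the character and a summer-sky background.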
- a character, an apple, and the like are used as examples of the three-dimensional object.
- the object that is displayed three-dimensionally is not limited to these.
- an hourglass representing the lapse of time and a set of antennas representing the intensity of radio waves are also examples of the three-dimensional object.
- the three-dimensional object as a display target may be a product such as a built-in device, equipment, or a machine that operates in accordance with software such as firmware.
- If design information of the product (e.g., computer-aided design (CAD) data or performance information of constituents), each component to be configured, or each member to be attached or detached is acquirable by the information processing apparatus 1 and temperature information is acquired from a sensor, the shape of each component of a displayed object may be changed in accordance with the acquired information.
- the changed shape may be emphasized in the display.
- Such a display function enables a change in the display in such a manner that the shape of some of the components is expanded if the ambient temperature of the information processing apparatus 1 is high.
- This display amounts to a simulation of a future change of a product under the current environment. Although typical simulations require inputs of temperature conditions and the like, this display function enables checking of a change to be generated in the product simply by displaying the target product on a screen on-site.
- this display function enables estimation of increased viscosity of a lubricant and a change of the movement of a product displayed on a screen as a change to hardware.
- an abnormality of hardware movement due to an influence of a use environment on software or a malfunction of software may be represented as a change to a displayed object.
- For example, if the ambient environment of the information processing apparatus 1 is humid and hot, it is possible to estimate a short circuit of an electrical circuit configuring an object or thermal runaway of a CPU and to change the movement of the object on a display screen to movement that is unique to malfunction (e.g., screen blackout or error message display).
- the change of a displayed object based on information acquired by various sensors may be realized by using, for example, a correspondence table 60 illustrated in FIG. 31 .
- FIG. 31 illustrates an example of the correspondence table 60 representing a correspondence relationship between objects as change targets in combination with numbers of dimensions of sensor values and changed images.
- the correspondence table 60 includes objects as change targets displayed as, in addition to three-dimensional images, one-dimensional images and two-dimensional images. That is, although the above exemplary embodiments have described cases in which a change is made to displayed three-dimensional objects, the number of dimensions of an object as a display target may be any number as long as the display is capable of displaying the object. In addition, if changed images corresponding to the respective numbers of dimensions are prepared, displayed images may be changed on the basis of outputs from various sensors or the like.
- sensor values are classified according to the number of dimensions of acquired information. Specifically, one-dimensional sensor values 72 correspond to cases in which sensor values are one-dimensional, two-dimensional sensor values 73 correspond to cases in which sensor values are two-dimensional, and three-dimensional sensor values 74 correspond to cases in which sensor values are three-dimensional.
- the classification into the one-dimensional sensor values 72 , the two-dimensional sensor values 73 , and the three-dimensional sensor values 74 herein may be based on the number of dimensions in a three-dimensional space, for example.
- an output value of this sensor is included in the one-dimensional sensor values 72 .
- output values of the two sensors are the one-dimensional sensor values 72 .
- a physical quantity detected by a single sensor corresponds to two dimensions (e.g., X-axis direction and Y-axis direction) in a three-dimensional space
- an output value of this sensor is included in the two-dimensional sensor values 73 .
- two sensors detect one-dimensional physical quantities in different directions (e.g., one is X-axis direction and the other is Y-axis direction)
- output values of the two sensors are the two-dimensional sensor values 73 .
- a physical quantity detected by a single sensor corresponds to three dimensions (e.g., X-axis direction, Y-axis direction, and Z-axis direction) in a three-dimensional space
- an output value of this sensor is included in the three-dimensional sensor values 74 .
- three sensors detect one-dimensional physical quantities in different directions (e.g., one is X-axis direction, another one is Y-axis direction, and the other is Z-axis direction)
- output values of the three sensors are the three-dimensional sensor values 74 .
- the classification into the one-dimensional sensor values 72 , the two-dimensional sensor values 73 , and the three-dimensional sensor values 74 herein may be based on the number of dimensions of information, for example.
- the two sensor values are the two-dimensional sensor values 73 .
- the two sensor values output from the single sensor are the two-dimensional sensor values 73 .
- the plural output values herein may correspond to the same physical quantity or different physical quantities.
- the classification into the one-dimensional sensor values 72 , the two-dimensional sensor values 73 , and the three-dimensional sensor values 74 may be based on a combination of the number of dimensions of a three-dimensional space and the number of dimensions of information, for example.
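The spatial-dimension classification described above can be sketched as counting the distinct axes covered by the sensors: two one-dimensional sensors along different axes together yield a two-dimensional sensor value, while two sensors along the same axis remain one-dimensional. This is a minimal Python illustration, not code from the patent:

```python
def spatial_dimension(sensor_axes):
    """Count the spatial dimensions covered by one or more sensors.
    `sensor_axes` lists, for each sensor, the set of axes it measures,
    e.g. [{"x"}, {"y"}] for two 1-D sensors in different directions."""
    covered = set()
    for axes in sensor_axes:
        covered |= axes
    return len(covered)
```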
- plural changed images 1 to N with respect to images of the respective numbers of dimensions are associated with the levels of sensor values of the respective numbers of dimensions.
- N changed images 1 are associated with one-dimensional images
- N changed images 4 are associated with two-dimensional images
- N changed images 7 are associated with three-dimensional images.
- M changed images 2 are associated with one-dimensional images
- M changed images 5 are associated with two-dimensional images
- M changed images 8 are associated with three-dimensional images.
- L changed images 3 are associated with one-dimensional images
- L changed images 6 are associated with two-dimensional images
- L changed images 9 are associated with three-dimensional images.
- a change may be made to each image in accordance with the number of dimensions used for depicting the image.
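One possible realization of the correspondence table 60 is a lookup keyed by the pair (number of dimensions of the displayed image, number of dimensions of the sensor value), where each entry lists the prepared changed images 1 to 9. The image names, list lengths, and level handling below are illustrative assumptions:

```python
# Sketch of correspondence table 60. Keys: (image dims, sensor-value
# dims); values: the prepared changed images for that combination.
# Names are placeholders for the "changed images 1" to "9" in FIG. 31.
CORRESPONDENCE_TABLE = {
    (1, 1): ["changed_1_a", "changed_1_b", "changed_1_c"],
    (2, 1): ["changed_4_a", "changed_4_b"],
    (3, 1): ["changed_7_a", "changed_7_b"],
    (1, 2): ["changed_2_a"],
    (2, 2): ["changed_5_a"],
    (3, 2): ["changed_8_a"],
    (1, 3): ["changed_3_a"],
    (2, 3): ["changed_6_a"],
    (3, 3): ["changed_9_a"],
}

def select_changed_image(image_dims, sensor_dims, level):
    """Pick a changed image for the given sensor-value level, clamping
    the level to the number of prepared images."""
    images = CORRESPONDENCE_TABLE[(image_dims, sensor_dims)]
    return images[min(level, len(images) - 1)]
```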
- Although a change in accordance with a sensor value may be made to each of the images corresponding to plural objects, the change may instead be made only to a specific object specified by a user among the plural objects, or only to images corresponding to a specific dimension or dimensions. For example, if there are both three-dimensionally depicted object images and two-dimensionally depicted object images, a change may be made only to the three-dimensionally depicted object images specified by a user.
- an output to an image forming apparatus may reflect such a change in the display.
- print data corresponding to the changed content may be generated to print an image. That is, not only image display, but also printing may reflect a sensor output value.
- a change in the display may be independent of the content of the image to be printed. That is, even if a change is made to a displayed object, the content to be printed may be the content of the three-dimensional object before the change is made in accordance with a sensor output value. In this case, it is desirable that a user be allowed to select, on a user interface screen, whether to print an image including the display content that has been changed in accordance with the sensor output value or an image including the content before the change.
- the user interface screen herein may be prepared as a setting screen or a part of a confirmation screen displayed in a pop-up window each time a print instruction is issued.
- the information processing apparatus 1 converts three-dimensional information defining the three-dimensional object into two-dimensional information.
- the two-dimensional information herein may be, for example, an image obtained by observing the surface of the three-dimensional object, or an image reflecting a change in an internal structure of the three-dimensional object as illustrated in FIG. 24 or the like.
- the so-called printer may be, in addition to an apparatus that prints a two-dimensional image on a recording medium such as a sheet of paper, a three-dimensional printer that forms a three-dimensional image. If the output destination is a three-dimensional printer, the three-dimensional object is output as a three-dimensional object without being converted into two-dimensional information.
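This branching on the output destination can be sketched as follows; the `render_2d` projection callable and the printer-type strings are illustrative assumptions standing in for the surface or cross-section rendering described above:

```python
def generate_output(object_3d, printer_type, render_2d):
    """Route the object to the right output format: a 3D printer
    receives the three-dimensional data unchanged, while a 2D printer
    receives a two-dimensional rendering (e.g., one viewpoint of the
    surface, or an internal cross-section)."""
    if printer_type == "3d":
        return object_3d          # no conversion to 2D information
    return render_2d(object_3d)  # project 3D information to 2D
```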
- a mobile information terminal such as a tablet computer or a smartphone is assumed as the information processing apparatus 1 .
- the above technique is applicable to any information terminal including a display (including projection function) and a sensor, such as a clock, a toy like a game machine, a television receiver, a projector, a head-mounted display, an image forming apparatus (so-called printer), an electronic white board, or a robot for communication.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Graphics (AREA)
- General Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Computer Hardware Design (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Architecture (AREA)
- Quality & Reliability (AREA)
- Human Computer Interaction (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
- This application is a continuation of U.S. patent application Ser. No. 15/943,713, filed on Apr. 3, 2018, now allowed, which claims priority under 35 USC 119 from Japanese Patent Application No. 2017-161182, filed Aug. 24, 2017.
- The present invention relates to an information processing apparatus.
- Techniques for displaying an object three-dimensionally include a method using a single display screen, a method using plural display screens arranged three-dimensionally, a method using a three-dimensional display, and the like.
- According to a first aspect of the invention, there is provided a display control method including the following steps. Acquiring positional information, which indicates a position where a display device is being used, from the display device, the display device being a mobile display device. Acquiring first information, which is different from the positional information and is associated with the positional information, from a device different from the display device. Differentiating a two-dimensional image, which represents an object defined three-dimensionally from one viewpoint direction and is displayed on the display device, according to the acquired first information, and displaying the two-dimensional image as a moving image.
- According to a second aspect of the invention, in the display control method according to the first aspect, an image of a virtual creature displayed on the display device, a form of which is imitated by the object, changes in association with at least the first information and the positional information.
- According to a third aspect of the invention, in the display control method according to the second aspect, an image of a virtual creature different from the image of the virtual creature is additionally displayed on the display device in association with a change of the first information and the positional information.
- According to a fourth aspect of the invention, in the display control method according to the second aspect, the number of images of virtual creatures displayed on the display device, forms of which are imitated by the object, is different in association with the first information and the positional information.
- According to a fifth aspect of the invention, in the display control method according to the first aspect, when the first information is information regarding weather and an abnormality of weather is notified through the information regarding weather, the two-dimensional image is displayed in a mode according to a content of the abnormality.
- According to a sixth aspect of the invention, in the display control method according to the fifth aspect, when the abnormality of weather is notified through the first information, warning information is displayed on the display device.
- According to a seventh aspect of the invention, in the display control method according to the first aspect, when the first information is information regarding weather, a magnitude of change in display of the two-dimensional image is changed according to the information regarding weather.
- According to an eighth aspect of the invention, in the display control method according to the first aspect, the two-dimensional image increases on the display device and then decreases according to the first information.
- According to a ninth aspect of the invention, there is provided an information processing apparatus including a display controller. The display controller acquires positional information, which indicates a position where a mobile display device is being used, from the display device. The display controller acquires first information, which is different from the positional information and is associated with the positional information, from a device different from the display device. The display controller differentiates a two-dimensional image, which represents an object defined three-dimensionally from one viewpoint direction and is displayed on the display device, according to the acquired first information, and displays the two-dimensional image as a moving image.
- According to a tenth aspect of the invention, in the information processing apparatus according to the ninth aspect, an image of a virtual creature displayed on the display device, a form of which is imitated by the object, changes in association with at least the first information and the positional information.
- According to an eleventh aspect of the invention, in the information processing apparatus according to the tenth aspect, an image of a virtual creature different from the image of the virtual creature is additionally displayed on the display device in association with a change of the first information and the positional information.
- According to a twelfth aspect of the invention, in the information processing apparatus according to the tenth aspect, the number of images of virtual creatures displayed on the display device, forms of which are imitated by the object, is different in association with the first information and the positional information.
- According to a thirteenth aspect of the invention, in the information processing apparatus according to the ninth aspect, when the first information is information regarding weather and an abnormality of weather is notified through the information regarding weather, the display controller displays the two-dimensional image in a mode according to a content of the abnormality.
- According to a fourteenth aspect of the invention, in the information processing apparatus according to the thirteenth aspect, when the abnormality of weather is notified through the first information, the display controller displays warning information on the display device.
- According to a fifteenth aspect of the invention, in the information processing apparatus according to the ninth aspect, when the first information is information regarding weather, the display controller changes a magnitude of change in display of the two-dimensional image according to the information regarding weather.
- According to a sixteenth aspect of the invention, in the information processing apparatus according to the ninth aspect, the display controller increases and then decreases the two-dimensional image on the display device according to the first information.
- According to a seventeenth aspect of the invention, there is provided a non-transitory computer readable medium storing a program causing a computer to: acquire positional information, which indicates a position where a mobile display device is being used, from the display device; acquire first information, which is different from the positional information and is associated with the positional information, from a device different from the display device; and differentiate a two-dimensional image, which represents an object defined three-dimensionally from one viewpoint direction and is displayed on the display device, according to the acquired first information, and display the two-dimensional image as a moving image.
- Exemplary embodiments of the present invention will be described in detail based on the following figures, wherein:
- FIG. 1 illustrates an appearance example of an information processing apparatus according to a first exemplary embodiment;
- FIG. 2 illustrates an arrangement example of planar sensors;
- FIG. 3 illustrates an arrangement example of other sensors;
- FIG. 4 illustrates a hardware configuration example of the information processing apparatus;
- FIG. 5 illustrates a functional configuration example of a controller according to the first exemplary embodiment;
- FIG. 6 illustrates an example of a three-dimensional object displayed on a display of the information processing apparatus;
- FIG. 7 illustrates changes that are made to a character displayed as the three-dimensional object upon acceleration (collision) being applied to a top face of the information processing apparatus;
- FIG. 8 illustrates changes that are made to the character displayed as the three-dimensional object if acceleration (collision) is continuously applied to the top face of the information processing apparatus;
- FIG. 9 illustrates changes that are made to a structure displayed as the three-dimensional object if acceleration (collision) is continuously applied to the top face of the information processing apparatus;
- FIG. 10 illustrates changes that are made to the character displayed as the three-dimensional object upon pressure being applied to the head of the character;
- FIG. 11 illustrates changes that are made to the character displayed as the three-dimensional object if pressure is continuously applied by a fingertip to the head of the character;
- FIG. 12 illustrates changes that are made to the character displayed as the three-dimensional object upon pressure being applied to the chest of the character;
- FIG. 13 illustrates changes that are made to the structure displayed as the three-dimensional object upon distortion being applied to the information processing apparatus;
- FIG. 14 illustrates changes that are made to the character displayed as the three-dimensional object upon a temperature change being detected by temperature detection sensors;
- FIG. 15 illustrates a change that is made to an ice cube displayed as the three-dimensional object upon a temperature change being detected by the temperature detection sensors;
- FIG. 16 illustrates a change that is made to the character displayed as the three-dimensional object upon a humidity change being detected by humidity detection sensors;
- FIG. 17 illustrates an example in which acceleration detection sensors are arranged in M rows and N columns in the plane of the display;
- FIG. 18 illustrates a change in the number of times of detection of acceleration (collision) from a specific position in a case in which the acceleration detection sensors are arranged in 4 rows and 4 columns;
- FIG. 19 illustrates an appearance example of an information processing apparatus including displays on four faces, which are a front face, side faces, and a back face;
- FIG. 20 illustrates an appearance example of an information processing apparatus according to a second exemplary embodiment;
- FIG. 21 illustrates a hardware configuration example of the information processing apparatus;
- FIG. 22 illustrates a functional configuration example of a controller according to the second exemplary embodiment;
- FIGS. 23A to 23E illustrate examples of a bent position determined by a bent position determiner;
- FIG. 24 illustrates an example in which an image of the three-dimensional object is edited by using information on the determined bent position;
- FIG. 25 illustrates another example in which an image of the three-dimensional object is edited by using information on the determined bent position;
- FIG. 26 illustrates an example in which a deformation operation of the information processing apparatus is used to control a display operation of a display device whose display is controlled by another information processing apparatus;
- FIG. 27 illustrates another example in which a deformation operation of the information processing apparatus is used to control a display operation of a display device whose display is controlled by another information processing apparatus;
- FIG. 28 illustrates an example of a three-dimensional object displayed by a three-dimensional display;
- FIG. 29 illustrates a state in which a change is made to a displayed three-dimensional object in response to a specific movement of a user;
- FIG. 30 illustrates an example in which a change is made to a three-dimensional object that is projected onto a wall or a floor; and
- FIG. 31 illustrates an example of a correspondence table representing a correspondence relationship between objects as change targets in combination with numbers of dimensions of sensor values and changed images.
- Now, exemplary embodiments of the present invention will be described in detail below with reference to the attached drawings.
- In the exemplary embodiments described below, “to display an object three-dimensionally” means to display the object in a manner including depth information by using a single display screen, plural display screens arranged three-dimensionally, a so-called three-dimensional display, or the like.
- A method using a single display screen includes a case of displaying a two-dimensional image (image including information regarding perspective) obtained by actually taking a picture of an object that is present in a real space, a case of displaying a two-dimensional image representing an object defined in a three-dimensional space from one perspective, and the like. Here, examples of the object defined in a three-dimensional space include a character in a virtual space, a three-dimensional image reconstructed from tomographic images, and the like.
- In addition, the method using plural display screens arranged three-dimensionally includes a case of displaying on the respective display screens, plural two-dimensional images obtained by observing an object defined in a three-dimensional space from plural perspectives (corresponding to arrangement positions of the display screens).
- In addition, the method using a three-dimensional display includes a method requiring an observer to wear glasses having special optical characteristics, a method requiring no such special glasses, a method requiring an observer to wear a head-mounted display on the head, and the like. Examples of the method requiring no glasses include a method using the phenomenon that air at the focal point of condensed laser beams changes into plasma and emits light.
- However, as long as the object is represented three-dimensionally, information corresponding to the object does not necessarily have to include internal information (voxels defining volume data), and may be, for example, a collection of multiple faces (a polygon mesh).
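As a concrete illustration of such a surface-only representation, a tetrahedron can be described by vertex coordinates and face index triples alone, with no voxel data; the coordinates below are arbitrary:

```python
# Surface-only polygon mesh of a tetrahedron: four vertices and four
# triangular faces. Each face is a triple of indices into `vertices`;
# no internal (volume) information is stored.
vertices = [
    (0.0, 0.0, 0.0),
    (1.0, 0.0, 0.0),
    (0.0, 1.0, 0.0),
    (0.0, 0.0, 1.0),
]
faces = [(0, 1, 2), (0, 1, 3), (0, 2, 3), (1, 2, 3)]
```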
- In the exemplary embodiments described below, “to display an object two-dimensionally” means to display the object in a manner not containing depth information by using a single display screen, plural display screens arranged three-dimensionally, a so-called three-dimensional display, or the like.
- In the exemplary embodiments, an object defined in a three-dimensional space is referred to as a three-dimensional object, and an object defined in a two-dimensional space is referred to as a two-dimensional object.
- An object may alternatively be displayed three-dimensionally by displaying two-dimensional objects corresponding to plural display apparatuses arranged three-dimensionally.
- The object in the exemplary embodiments may be displayed as a still image or a moving image.
-
FIG. 1 illustrates an appearance example of aninformation processing apparatus 1 according to a first exemplary embodiment. - The
information processing apparatus 1 according to the first exemplary embodiment is, for example, assumed to be a mobile information terminal (mobile display device) such as a tablet computer or a smartphone. - The
information processing apparatus 1 illustrated inFIG. 1 includes six flat faces, among which adisplay 2 is arranged on a face. Note that all of the faces of theinformation processing apparatus 1 are not necessarily flat. In other words, one or more faces may be curved. - In the example illustrated in
FIG. 1 , a face on which thedisplay 2 is provided is referred to as a front face, faces positioned left and right of thedisplay 2 are referred to as side faces, and a face opposite to thedisplay 2 is referred to as a back face or a rear face. In addition, a face positioned above thedisplay 2 is referred to as a top face, and a face positioned below thedisplay 2 is referred to as a bottom face. - In the case of the
information processing apparatus 1 illustrated in FIG. 1, the shape of the front face is a rectangle in which a length H in the Z direction (height) is longer than a length W in the X direction (width). In addition, a length D in the Y direction (depth) defining the side surfaces of the information processing apparatus 1 is shorter than the length W in the X direction (width). - The
display 2 is configured from a thin-film display such as a liquid crystal display or an organic electroluminescent (EL) display. If the display 2 is a liquid crystal display, a light source (not illustrated) is also provided. - A
controller 3 that controls operations of the units including the display 2, and other components (not illustrated), are built into a housing of the information processing apparatus 1. -
FIG. 2 illustrates an arrangement example of planar sensors. -
FIG. 2 illustrates a position detection sensor 4, a pressure detection sensor 5, and a distortion detection sensor 6. - The
position detection sensor 4 is a sensor that detects the position of an input operation performed by a user and is stacked on the top face of the display 2. An electronic device combining the position detection sensor 4 with the display 2 is called a touch panel. - The
position detection sensor 4 is an example of a detector. The detectable object differs according to the detection method. For example, in a case of using electrostatic capacitance for detection, parts of a person's body (e.g., fingertips) are detectable objects. In a case of using infrared rays for detection, fingertips and other objects including pens are detectable objects. - The
position detection sensor 4 outputs the coordinates of a detected object. The position detection sensor 4 according to the first exemplary embodiment is capable of detecting plural operation positions at a time. - The
pressure detection sensor 5 is a sensor that detects the strength of pressure applied to an operation position at the time of an input operation and is provided on the rear face of the display 2, for example. - The
pressure detection sensor 5 according to this exemplary embodiment is a capacitive pressure sensor and detects, as the strength of pressure, the degree of flexure generated in a body of the sensor formed as a film. In the case of this exemplary embodiment, the pressure detection sensor 5 is capable of detecting several levels of pressure difference. - The
distortion detection sensor 6 is a sensor that detects the degree of distortion generated in the body of the sensor and is provided on the rear face of the display 2. - As the
distortion detection sensor 6 according to this embodiment, for example, a displacement sensor that utilizes the piezoelectricity of polylactic acid, developed by Murata Manufacturing Co., Ltd., is used. In the case of this exemplary embodiment, the distortion detection sensor 6 is capable of detecting the direction and degree of distortion. The distortion is an example of a physical quantity by which, typically, a part is not directly determined. - The
position detection sensor 4, the pressure detection sensor 5, and the distortion detection sensor 6 are each an example of a detector. -
FIG. 3 illustrates an arrangement example of other sensors. The information processing apparatus 1 illustrated in FIG. 3 includes, inside the housing, pressure detection sensors, an acceleration detection sensor 8 that detects the direction and value of acceleration applied to the body, temperature detection sensors, and humidity detection sensors.
- The number of temperature detection sensors and the number of humidity detection sensors may each be one.
- The
pressure detection sensors are used for detection of pressure applied locally to the information processing apparatus 1. - The
temperature detection sensors are used for detection of the temperature of the environment in which the information processing apparatus 1 is used and also are used for detection of the temperature of a local area. The temperature is an example of a physical quantity by which, typically, a part is not directly determined. - The
humidity detection sensors are used for detection of the humidity of the environment in which the information processing apparatus 1 is used and also are used for detection of the humidity of a local area. The humidity is an example of a physical quantity by which, typically, a part is not directly determined. - The
pressure detection sensors, the acceleration detection sensor 8, the temperature detection sensors, and the humidity detection sensors are each an example of a detector. -
FIG. 4 illustrates a hardware configuration example of the information processing apparatus 1. - The
information processing apparatus 1 according to this exemplary embodiment includes, in addition to the above-described devices, a non-volatile storage device 14 used for data storage and a communication unit 15. These devices transmit and receive data via a bus 16. - Note that the
controller 3 includes a central processing unit (CPU) 11 that executes data processing, a read only memory (ROM) 12 that stores programs such as a basic input/output system (BIOS) and firmware, and a random access memory (RAM) 13 used as a work area. - The
storage device 14 according to this exemplary embodiment is configured from, for example, a semiconductor memory or a hard disk device. - The
communication unit 15 is a communicator used for communication with an external apparatus. A variety of schemes are used for communication. Note that the communication path may be a wired path or a wireless path. -
FIG. 5 illustrates a functional configuration example of the controller 3 according to the first exemplary embodiment. - The functional configuration illustrated in
FIG. 5 is realized by a program being executed. - The
controller 3 according to the first exemplary embodiment serves as a pressure strength determiner 21 that determines the strength of pressure, a pressurized part determiner 22 that determines the part to which pressure has been applied, an operation position determiner 23 that determines the position of an operation, a temperature determiner 24 that determines a temperature, a humidity determiner 25 that determines a humidity, an acceleration direction determiner 26 that determines the direction of acceleration, an acceleration value determiner 27 that determines the value of acceleration, a distortion determiner 28 that determines the direction and degree of distortion, and a display content determiner 29 that determines display content by using determined information. - The
display content determiner 29 is an example of a display controller. - The
pressure strength determiner 21 receives a pressure value that is output from the pressure detection sensor 5 provided on the front face side and pressure values that are output from the two pressure detection sensors provided on the side faces. - If pressure values are input from the
pressure detection sensors pressurized part determiner 22 determines that the positions at which pressure is applied by a user operation are on the side faces. On the other hand, if operation coordinates are input from theposition detection sensor 4, thepressurized part determiner 22 determines that the position at which pressure is applied by a user operation is on the front face. Although thepressurized part determiner 22 and theoperation position determiner 23 are provided separately in this exemplary embodiment, thepressurized part determiner 22 may also serve as theoperation position determiner 23. - If pressure values are input from the
pressure detection sensors operation position determiner 23 determines that a user operates the side surfaces. If operation coordinates are input from theposition detection sensor 4, theoperation position determiner 23 determines that a user operates the position according to the operation coordinates. Theoperation position determiner 23 also determines, in addition to operation positions at respective times, a locus of the operation positions over time. Theoperation position determiner 23 according to this exemplary embodiment is capable of detecting plural operation positions at a time. - The
temperature determiner 24 determines temperatures of the respective parts, a temperature distribution, a temporal change, and the like on the basis of temperature values that are input from the temperature detection sensors. - The
humidity determiner 25 determines humidities of the respective parts, a humidity distribution, a temporal change, and the like on the basis of humidity values that are input from the humidity detection sensors. - The
acceleration direction determiner 26 determines the direction of acceleration that has acted on the housing and a temporal change thereof on the basis of acceleration information that is input from the acceleration detection sensor 8. - The
acceleration value determiner 27 determines the value of acceleration that has acted on the housing and a temporal change thereof on the basis of acceleration information that is input from the acceleration detection sensor 8. - The
distortion determiner 28 determines the direction and degree of distortion generated in the housing on the basis of an output from the distortion detection sensor 6. - On the basis of information from the above-described determiners, the
display content determiner 29 makes a change to an object that is displayed three-dimensionally on thedisplay 2. Specific change contents will be specifically described later. - Note that the physical quantity as a detection target differs according to the type and arrangement of sensors provided in the housing.
- In addition, all of the above-described sensors are not necessarily provided in the housing.
- Next, examples of a control operation performed by the
display content determiner 29 according to the first exemplary embodiment by using inputs from sensors will be described. -
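Before the examples, the front/side classification performed by the pressurized part determiner 22 (described above) can be sketched as follows. This is a minimal sketch under the stated rules; the function and argument names are illustrative and not from the patent.

```python
def determine_pressurized_part(side_pressures, front_coordinates):
    """Classify where a user operation acts: operation coordinates from
    the position detection sensor 4 imply the front face, while values
    from the side-face pressure sensors imply the side faces.

    side_pressures: pressure values reported by the side-face sensors.
    front_coordinates: (x, y) from the position detection sensor, or None.
    """
    if front_coordinates is not None:
        return "front"   # operation coordinates imply the front face
    if any(p > 0 for p in side_pressures):
        return "side"    # side-face sensors report applied pressure
    return "none"
```
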
FIG. 6 illustrates an example of a three-dimensional object displayed on the display 2 of the information processing apparatus 1. - The three-dimensional object illustrated in
FIG. 6 represents a character 30 (virtual creature) in the form of a person and is an example of an object that is displayed three-dimensionally. -
FIG. 7 illustrates changes that are made to the character 30 displayed as the three-dimensional object upon acceleration (collision) being applied to the top face of the information processing apparatus 1. The acceleration (collision) is an example of a physical quantity by which, typically, a part is not directly determined. - The example illustrated in
FIG. 7 is a case in which acceleration is applied downward (−Z direction) to the information processing apparatus 1 at time t1. This event is determined by the acceleration direction determiner 26 (see FIG. 5) if, for example, collision is applied to the top face of the information processing apparatus 1. - At this time, the display content determiner 29 (see
FIG. 5) determines the part of the three-dimensional object on which the external force acts, on the basis of information regarding the determined acceleration direction. - In the example of
FIG. 7, the head of the character 30 is positioned higher than any other portion. Thus, the display content determiner 29 determines that the external force acts on the head of the character 30. - Note that the acceleration value determiner 27 (see
FIG. 5) also determines the value of acceleration at time t1. The acceleration value determiner 27 compares a numeric value that is received from the acceleration detection sensor 8 (see FIG. 4) with a threshold, and determines the strength of the external force that has acted on the information processing apparatus 1. - If the determined value of acceleration is small, the
display content determiner 29 changes the display content at time t2 (t2>t1) in such a manner that a small amount of blood 31 flows from the head. The head is an example of a specific part. Note that the number of specific parts is not necessarily one and may be plural. In addition, the specific part is, in principle, a part of the three-dimensional object. - On the other hand, if the determined value of acceleration is large, the
display content determiner 29 changes the display content at time t2 (t2>t1) in such a manner that a greater amount of blood 32 flows from the head than in the case in which the value of acceleration is small. - Although the display content in the example of
FIG. 7 is changed in such a manner that the head of the character 30 bleeds upon the detection of acceleration, a bruise (internal bleeding) may be displayed at the part on which the external force has acted. At this time, the area of the bruise may be changed in accordance with the strength of the external force that has acted.
- Alternatively, a wen may be displayed at the part on which the external force has acted. At this time, the size of the wen may be changed in accordance with the strength of the external force that has acted.
- Although the head of the
character 30 is determined as an example of a part at a high position in the direction of acceleration that has acted on the information processing apparatus 1 in the example of FIG. 7, an arm of the character 30 may be determined as a part to which a change is to be made if the acceleration is in the left-right direction of the display 2. - In addition, although the
character 30 in the form of a person is assumed as an example of the three-dimensional object in the example of FIG. 7, the three-dimensional object may be, for example, a structure. If the three-dimensional object is a structure, a change may be made in such a manner that a scratch or a crack is generated in accordance with the determined value of acceleration. -
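The behavior described for FIG. 7 — picking the part of the object that lies farthest against the acceleration direction, and scaling the change by the acceleration value — can be sketched as below. The threshold and the part positions are assumed values used only for illustration.

```python
# Assumed threshold separating a weak collision from a strong one.
STRONG_COLLISION_THRESHOLD = 5.0

def part_hit_by_collision(parts, accel_direction):
    """parts: mapping of part name -> (x, y, z) position.
    accel_direction: vector of the applied acceleration, e.g. (0, 0, -1)
    for a downward collision. Returns the part with the largest projection
    against the acceleration vector (the head, for a downward collision)."""
    def against(pos):
        return -sum(p * a for p, a in zip(pos, accel_direction))
    return max(parts, key=lambda name: against(parts[name]))

def bleed_amount(acceleration_value):
    """Map the determined acceleration value to the display change:
    a small amount of blood 31 or a greater amount of blood 32."""
    if acceleration_value < STRONG_COLLISION_THRESHOLD:
        return "small"
    return "large"
```
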
FIG. 8 illustrates changes that are made to the character 30 displayed as the three-dimensional object if acceleration (collision) is continuously applied to the top face of the information processing apparatus 1. - As a method for continuously applying acceleration (collision), for example, the
information processing apparatus 1 may tap another object, the information processing apparatus 1 may be tapped by another object, or the information processing apparatus 1 may be shaken while being held. In the example illustrated in FIG. 8, a case in which a user continuously taps an end face (top face) of the information processing apparatus 1 with a finger or the like is assumed. - Thus, the column chart illustrated in
FIG. 8 has the horizontal axis representing time and the vertical axis representing the number of times of collision applied at respective times. - Note that, unlike in the example illustrated in
FIG. 8, a part of the displayed image may be tapped with a fingertip to apply acceleration (collision) to a specific part. In this case, the position at which acceleration (collision) is applied is determined by the position detection sensor 4 (see FIG. 4). The value of acceleration is detected by the acceleration detection sensor 8 (see FIG. 4). - In the example of
FIG. 8, the number of times of collision increases from time t1 to time t2, and then decreases. Although collisions do not necessarily have equal strength, typically, the total strength of collisions that act on the specific part is in proportion to the number of times of collision. However, the total strength of five collisions each having a collision strength of “2” equals the total strength of two collisions each having a collision strength of “5”. - Thus, in the example of
FIG. 8, the area of flowing blood 33 increases from time t1 to time t2, and the area of the blood 33 flowing from the head of the character 30 decreases after time t2, from which the number of times of collision decreases (time t3), and disappears (time t4).
- Note that the total strength is an example of a total of results of plural times of detection.
-
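The equivalence noted above — five collisions of strength 2 totalling the same as two collisions of strength 5 — amounts to summing per-collision strengths. A minimal sketch (the function name is illustrative):

```python
def total_collision_strength(strengths):
    """Total strength of a series of collisions acting on the specific
    part; the displayed change (e.g. the area of the blood 33) can be
    driven by this running total rather than by the count alone."""
    return sum(strengths)
```
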
FIG. 9 illustrates changes that are made to a structure 34 displayed as the three-dimensional object if acceleration (collision) is continuously applied to the top face of the information processing apparatus 1. -
FIG. 9 illustrates a case in which the structure 34 is a cylinder. - Also in the example illustrated in
FIG. 9, acceleration is applied downward (−Z direction) to the information processing apparatus 1 at time t1. At this time, the display content on the information processing apparatus 1 is changed in such a manner that a scratch 35 is formed on the top face of the cylinder and a crack 36 grows inside the cylinder. Although the crack 36 generated inside is represented by a dashed line in FIG. 9 for illustration, only the scratch 35 observable from the outside may be displayed on the information processing apparatus 1. - In
FIG. 9, acceleration (collision) is still applied to the top face of the information processing apparatus 1 at time t2 (t2>t1). The value of acceleration applied at time t2 may be equal to or different from the value of acceleration applied at time t1. - In the display at time t2, the width of the
scratch 35 increases and the crack 36 grows in the depth direction compared with the display at time t1. - In addition, acceleration (collision) is still applied to the top face of the
information processing apparatus 1 at time t3 (t3>t2). The value of acceleration applied at time t3 may be equal to or different from the value of acceleration applied at time t1 or t2. - In the display at time t3, the
scratch 35 is wider and the crack 36 is deeper than those in the display at time t2. -
FIG. 10 illustrates changes that are made to the character 30 displayed as the three-dimensional object upon pressure being applied to the head of the character 30. - In the example illustrated in
FIG. 10, a case in which pressure is applied from the front face side of the information processing apparatus 1 toward the rear face side (Y direction) at time t1 is illustrated. - The position on which the pressure acts at this time is determined by the pressurized part determiner 22 (see
FIG. 5) that receives an output signal from the position detection sensor 4 (see FIG. 4). This is because the position to which the pressure is applied is the same as the contact position of a fingertip 37. - The strength of pressure at this time is determined by the pressure strength determiner 21 (see
FIG. 5) that receives an output signal from the pressure detection sensor 5 (see FIG. 4). The pressure strength determiner 21 compares the numerical value that is input from the pressure detection sensor 5 with a predetermined threshold to determine the strength of pressure that has acted on the information processing apparatus 1. The strength is determined in several levels. - If the determined strength of pressure is weak, the
display content determiner 29 changes the display content at time t2 (t2>t1) in such a manner that a small amount of the blood 31 flows from the head. - On the other hand, if the determined strength of pressure is strong, the
display content determiner 29 changes the display content at time t2 in such a manner that a greater amount of the blood 32 flows from the head than in the case in which the strength of pressure is weak. -
- Although, in the display position of the
character 30 displayed on the display 2 in FIG. 10, the position to which pressure is applied is determined by using the output from the pressure detection sensor 5, the part of the character 30 to which a change is to be made in the display may be determined by using the outputs from the pressure detection sensors (see FIG. 3) that are prepared for detecting locally applied pressure. - For example, if pressure is detected by the
pressure detection sensors provided inside the housing of the information processing apparatus 1, in accordance with the strength of the detected pressure, as in FIG. 10, blood may flow from the head, or a change may be made to another part such as a hand, a leg, or the chest in the display. - If the output values from the
pressure detection sensors display content determiner 29 in advance. -
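The quantization of pressure into discrete strength levels, as performed by the pressure strength determiner 21 for FIG. 10, can be sketched as below. The level boundaries are assumed values; the patent states only that the strength is determined in several levels by comparison with thresholds.

```python
# Assumed level boundaries in arbitrary sensor units.
PRESSURE_LEVEL_THRESHOLDS = [1.0, 3.0, 6.0]

def pressure_level(value):
    """Return a discrete strength level, from 0 (below all thresholds)
    up to len(PRESSURE_LEVEL_THRESHOLDS)."""
    level = 0
    for threshold in PRESSURE_LEVEL_THRESHOLDS:
        if value >= threshold:
            level += 1
    return level
```
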
FIG. 11 illustrates changes that are made to the character 30 displayed as the three-dimensional object if pressure is continuously applied by the fingertip 37 to the head of the character 30. - Needless to say, the pressure may be applied by an object other than a fingertip, such as a tip of a pen. In addition, the
fingertip 37 does not have to be continuously in contact with the surface of the display 2 and may tap a specific part discontinuously as in the case in which acceleration (collision) is applied. - In the example illustrated in
FIG. 11, the head of the character 30 remains pushed by the fingertip 37 from time t1 to time t3. In this case, the display content determiner 29 determines that the head keeps bleeding and adds an image of the blood 31 to the head at time t2 and increases the area of the blood 32 at time t3.
- Here, the bleeding amount may be in proportion to the strength of pressure. For example, if the strength is strong, it may be determined that the bleeding amount is large; if the strength is weak, it may be determined that the bleeding amount is small. However, regardless of the strength of pressure, the bleeding amount may be increased or decreased in proportion to the length of time during which a pressure exceeding a predetermined threshold is applied.
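The growth of the displayed bleeding during a sustained press can be sketched as a simple accumulation over time. The per-second rate and the threshold below are assumed values, not from the patent.

```python
def bleeding_area(samples, threshold, rate=1.0):
    """samples: list of (duration_s, pressure) pairs covering the press.
    The displayed blood area grows at `rate` per second whenever the
    pressure exceeds `threshold`, independent of how strong it is."""
    return sum(rate * dt for dt, p in samples if p > threshold)
```
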
- Although the above-described example is a case in which a change made on a screen is bleeding or a scratch (crack) depending on a difference of the three-dimensional object displayed on the
display 2, the change content may be different in accordance with the part on which an external force acts. Note that the change content may be any content as long as the specific part is displayed in a different form, such as addition, deletion, deformation, or division in the display. -
FIG. 12 illustrates changes that are made to the character 30 displayed as the three-dimensional object upon pressure being applied to the chest of the character 30. - In the example illustrated in
FIG. 12, the chest, not the head, is pushed by the fingertip 37 at time t1. In this example, a change is made to the clothing depending on the strength of pressure applied to the chest. - If the determined strength of pressure is weak, the
display content determiner 29 changes the image into an image in which the clothing of the character 30 is lifted or strained at time t2 (t2>t1). - On the other hand, if the determined strength of pressure is strong, the
display content determiner 29 changes the image into an image in which the clothing of the character 30 is ripped at time t2. -
-
FIG. 13 illustrates changes that are made to the structure 34 displayed as the three-dimensional object upon distortion being applied to the information processing apparatus 1. -
FIG. 13 illustrates a cylinder as the structure 34. Stripes are provided on the outer circumferential face of the cylinder. The stripes are parallel to the axial direction of the rotationally symmetric cylinder. - In the example illustrated in
FIG. 13, while a counterclockwise force acts on an upper portion of the information processing apparatus 1 in the form of a flat plate, a clockwise force acts on a lower portion of the information processing apparatus 1. That is, in FIG. 13, opposite forces act on the upper portion and the lower portion of the information processing apparatus 1. - At this time, the direction and degree of distortion acting on the
information processing apparatus 1 are given from the distortion determiner 28 (see FIG. 5) to the display content determiner 29. - In accordance with the direction and degree of distortion determined at time t1, the
display content determiner 29 generates an image in which the structure 34 is distorted and displays the image on the display 2. - In the example of
FIG. 13, a change is made to the structure 34 at time t2, at which distortion is no longer applied, not at time t1, at which distortion is applied to the information processing apparatus 1. However, a change may be made to the displayed structure 34 at time t1, at which distortion is applied. - Although the change content works together with the distortion applied to the
information processing apparatus 1 in the example of FIG. 13, another change may be made to the three-dimensional object, which is a display target, by using information on the determined direction and degree of distortion. For example, a character displayed on the display 2 may bleed or be scratched, or a change may be made to the clothing of the character. -
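One way to render the distortion of FIG. 13 — opposite rotational forces at the top and bottom — is to twist the displayed geometry about the vertical axis by an angle proportional to height. A sketch, with the twist rate as an assumed parameter:

```python
import math

def twist(points, degrees_per_unit_height):
    """Rotate each (x, y, z) point of the structure 34 about the vertical
    axis by an angle proportional to its height, so that the top and the
    bottom turn in opposite senses relative to the middle."""
    twisted = []
    for x, y, z in points:
        a = math.radians(degrees_per_unit_height * z)
        twisted.append((x * math.cos(a) - y * math.sin(a),
                        x * math.sin(a) + y * math.cos(a),
                        z))
    return twisted
```
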
FIG. 14 illustrates changes that are made to the character 30 displayed as the three-dimensional object upon a temperature change being detected by the temperature detection sensors (see FIG. 3). - In this case, the
display content determiner 29 makes a change to a specific part of the displayed character 30 on the basis of a result of comparison between a determined temperature and a predetermined threshold or information on the distribution of temperatures detected by the plural temperature detection sensors. - If, for example, the ambient temperature of the
information processing apparatus 1 detected at time t1 is lower than a predetermined threshold, the display content determiner 29 changes the display content at time t2 into a state in which a scarf 38 is put around the neck of the character 30. The neck of the character 30 is an example of the specific part. - If, for example, the ambient temperature of the
information processing apparatus 1 detected at time t1 is higher than a predetermined threshold, the display content determiner 29 changes the display content at time t2 into a state in which the forehead of the character 30 is covered with sweat 39. The forehead of the character 30 is an example of the specific part. -
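The two temperature-driven changes above reduce to comparisons against thresholds. A sketch with assumed threshold values (the patent does not give concrete temperatures):

```python
# Assumed thresholds; the patent specifies only comparison against
# predetermined thresholds.
COLD_THRESHOLD_C = 10.0
HOT_THRESHOLD_C = 28.0

def temperature_display_change(ambient_c):
    """Select the change made to the character 30: a scarf 38 around the
    neck when cold, sweat 39 on the forehead when hot."""
    if ambient_c < COLD_THRESHOLD_C:
        return "scarf"
    if ambient_c > HOT_THRESHOLD_C:
        return "sweat"
    return "no_change"
```
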
FIG. 15 illustrates a change that is made to an ice cube 40 displayed as the three-dimensional object upon a temperature change being detected by the temperature detection sensors (see FIG. 3). - In this case, the
display content determiner 29 makes a change to a specific part of the displayed ice cube 40 on the basis of a result of comparison between a determined temperature and a predetermined threshold or information on the distribution of temperatures detected by the plural temperature detection sensors. - The example of
FIG. 15 is a case in which the ambient temperature of the information processing apparatus 1 is higher than the predetermined threshold. In this case, the display content determiner 29 makes a change in such a manner that the displayed ice cube 40 melts to become smaller and water 41 gathers around the ice cube 40. The ice cube 40 is an example of the specific part. -
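The melting of the ice cube 40 can be sketched as a per-step volume transfer to the surrounding water 41. The melting rate and the threshold below are assumptions for illustration.

```python
MELT_THRESHOLD_C = 0.0   # assumed melting threshold

def melt_step(ice_volume, ambient_c, dt_s, rate=0.01):
    """Shrink the displayed ice cube 40 while the ambient temperature is
    above the threshold. Returns (new_ice_volume, melted_volume); the
    melted volume is drawn as water 41 gathering around the cube."""
    if ambient_c <= MELT_THRESHOLD_C or ice_volume <= 0:
        return ice_volume, 0.0
    melted = min(ice_volume, rate * (ambient_c - MELT_THRESHOLD_C) * dt_s)
    return ice_volume - melted, melted
```
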
FIG. 16 illustrates a change that is made to the character 30 displayed as the three-dimensional object upon a humidity change being detected by the humidity detection sensors (see FIG. 3). - In this case, the
display content determiner 29 makes a change to a specific part of the displayed character 30 on the basis of a result of comparison between a determined humidity and a predetermined threshold or information on the distribution of humidities detected by the plural humidity detection sensors. - The example of
FIG. 16 is a case in which the ambient humidity of the information processing apparatus 1 is higher than the predetermined threshold. In this case, the display content determiner 29 changes the clothing and footwear of the character 30 into a raincoat 43 and rain boots 42. - The body and legs of the
character 30 are each an example of the specific part. In addition, the clothing and footwear may be changed to the raincoat 43 and the rain boots 42 for rainy weather. Note that the clothing and footwear may instead be changed to those for a waterside such as a river or the sea. - Although the single
acceleration detection sensor 8 detects the value of acceleration applied to the display 2 (see FIG. 1) in the first exemplary embodiment, plural acceleration detection sensors 8 may be provided in the plane of the display 2 as illustrated in FIG. 17. -
FIG. 17 illustrates an example in which the acceleration detection sensors 8 are arranged in M rows and N columns in the plane of the display 2. In the case in which the acceleration detection sensors 8 are arranged as illustrated in FIG. 17, the distribution of values of local acceleration may be determined from positional information of the acceleration detection sensors 8. -
FIG. 18 illustrates a change in the number of times of detection of acceleration (collision) from a specific position in a case in which the acceleration detection sensors 8 are arranged in 4 rows and 4 columns. - In the case of
FIG. 18, no collision is applied at time t1. At time t1, the number of times of detection is zero in each of the acceleration detection sensors 8. Collision is applied once to two acceleration detection sensors 8 positioned in the first row and the second and third columns at time t2. Collision is applied 10 times to the two acceleration detection sensors 8 positioned in the first row and the second and third columns at time t3, and propagation of vibrations in this process changes outputs from the acceleration detection sensors 8 in the second and third rows in the same columns. In the illustration of FIG. 18, darker shading of the acceleration detection sensors 8 indicates a larger total number of times of collision. - In the example at time t3, it is found that acceleration propagates from the first row to the second and third rows on the basis of the distribution of the number of times of collision detected by the plural
acceleration detection sensors 8 within a predetermined period. Information on this propagation direction may be used to control content of a change to be made to the specific part. - At time t4, five minutes elapses after collision has no longer been applied. In this state, each number of times of detection of collision is reset to zero.
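The per-sensor counting and quiet-period reset described for FIG. 18 can be sketched with a small grid class. The five-minute reset window follows the example; the class and method names are illustrative.

```python
class CollisionGrid:
    """Collision counters for acceleration detection sensors 8 arranged
    in rows and columns, reset after a quiet period as in FIG. 18."""

    RESET_AFTER_S = 300.0  # five minutes, per the example at time t4

    def __init__(self, rows, cols):
        self.counts = [[0] * cols for _ in range(rows)]
        self.last_hit_time = None

    def record(self, row, col, t):
        """Record one detected collision at sensor (row, col) at time t,
        clearing all counters first if no collision has arrived for the
        reset window."""
        if self.last_hit_time is not None and t - self.last_hit_time >= self.RESET_AFTER_S:
            self.counts = [[0] * len(r) for r in self.counts]
        self.counts[row][col] += 1
        self.last_hit_time = t
```
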
- Note that substantially the same technique may be applied to the
pressure detection sensor 5. The position detection sensor 4 and the pressure detection sensor 5 having substantially the same shape as the display 2 (see FIG. 2) are combined to detect the position and strength of pressure in the first exemplary embodiment. However, plural pressure detection sensors 5 may be arranged in the plane of the display 2, and positional information of the pressure detection sensors 5 that detect pressure may be used to calculate the position at which pressure is applied, the direction in which pressure as an external force is applied, the strength thereof, and the like. - In addition, a part to which a change is to be made in the display may be determined on the basis of the change and distribution of strengths of pressure detected by the respective
pressure detection sensors 5. - In addition, the first exemplary embodiment has illustrated a case in which the three-dimensional object is displayed on the single display 2 (see
FIG. 1) included in the information processing apparatus 1 (see FIG. 1). However, as illustrated in FIG. 19, substantially the same technique may be applied to processing of the controller 3 in an information processing apparatus 1A that displays an object three-dimensionally by displaying plural two-dimensional objects corresponding to a single object. -
FIG. 19 illustrates an appearance example of the information processing apparatus 1A including displays 2A to 2D on four faces, which are the front face, the side faces, and the back face. - In
FIG. 19, a character front image 30A is displayed on the display 2A provided on the front face, a character right side image 30B is displayed on the display 2B provided on the right side face of the drawing, a character left side image is displayed on the display 2C provided on the left side face, and a character back image is displayed on the display 2D provided on the back face.
- Also in a case of displaying an object three-dimensionally by arranging
plural displays 2 three-dimensionally as in the above case, by making a change to a specific part of the displayed object on the basis of outputs from the sensors, it is possible to provide a new form of representation to a user.
-
FIG. 20 illustrates an appearance example of an information processing apparatus 1B according to a second exemplary embodiment.
- In
FIG. 20, parts corresponding to those in FIG. 1 are denoted by the corresponding reference numerals.
- The appearance of the
information processing apparatus 1B is substantially in the form of a plate as in the appearance of the information processing apparatus 1 according to the first exemplary embodiment.
- However, the
information processing apparatus 1B according to the second exemplary embodiment is different from the information processing apparatus 1 according to the first exemplary embodiment in that its housing is deformable at any position and in that it includes deformation detection sensors 51 that detect the position where the housing is deformed.
- The plural
deformation detection sensors 51 are arranged along the outline of the housing. The arrangement positions and intervals (density) of the deformation detection sensors 51 are determined in accordance with the size or specifications of the deformation detection sensors 51. Note that the deformation detection sensors 51 may be arranged to overlap with the display 2.
- Each of the
deformation detection sensors 51 is configured from a so-called strain sensor; for example, a displacement sensor that applies the piezoelectricity of polylactic acid, developed by Murata Manufacturing Co., Ltd., is used. The strain sensor outputs a signal whose level corresponds to the bend amount (angle). The strain sensor is thus a device by which deformation of an attached member is detectable. Accordingly, the deformation detection sensors 51 also detect curving that precedes bending as deformation.
- Note that a state in which the housing of the
information processing apparatus 1B is planar is referred to as a pre-deformation state or an initial state in the second exemplary embodiment. - On the basis of a positional relationship between the plural
deformation detection sensors 51 that detect bending, the controller 3 estimates a bent position.
- In this exemplary embodiment, the
deformation detection sensors 51 are arranged on one face (e.g., a face on which the display is provided) of the housing. However, the deformation detection sensors 51 may be arranged on both faces of the housing.
- The bent position is an example of a deformed position.
- In the
information processing apparatus 1B, a flexible housing formed of plastic or the like is provided with the display 2 used to display an image, the controller 3 that controls the entire apparatus, and the like. The information processing apparatus 1B that is specialized in displaying is referred to as a flexible display.
-
FIG. 21 illustrates a hardware configuration example of the information processing apparatus 1B.
- In
FIG. 21, parts corresponding to those in FIG. 4 are denoted by the corresponding reference numerals. The information processing apparatus 1B is different from the information processing apparatus 1 according to the first exemplary embodiment in that the plural deformation detection sensors 51 are connected to the bus 16.
-
FIG. 22 illustrates a functional configuration example of the controller 3 according to the second exemplary embodiment.
- In
FIG. 22, parts corresponding to those in FIG. 5 are denoted by the corresponding reference numerals. The information processing apparatus 1B is different from the information processing apparatus 1 described in the first exemplary embodiment in including a bent position determiner 52 that receives outputs from the plural deformation detection sensors 51 and determines a bent position generated in the housing.
- The
display content determiner 29 according to this exemplary embodiment uses information on a bent position in the housing determined by the bent position determiner 52 to make a change to a specific part of an object that is displayed three-dimensionally or to make a change to display content.
-
FIGS. 23A to 23E illustrate examples of the bent position determined by the bent position determiner 52.
-
FIG. 23A illustrates a bent position determined by the bent position determiner 52 if deformation is detected by two deformation detection sensors 51 each positioned near the midpoint of a short side of the information processing apparatus 1B. In the case of FIG. 23A, the bent position determiner 52 determines that the display 2 is bent in such a manner that a crease is put along a line L1 in parallel to the long sides of the information processing apparatus 1B. Note that the creases in FIGS. 23A to 23E are illustrated for description, and creases are not necessarily actually formed.
-
FIG. 23B illustrates a bent position determined by the bent position determiner 52 if deformation is detected by two deformation detection sensors 51 each positioned near the midpoint of a long side of the information processing apparatus 1B. In the case of FIG. 23B, the bent position determiner 52 determines that the display 2 is bent in such a manner that a crease is put along a line L2 in parallel to the short sides of the information processing apparatus 1B.
-
FIG. 23C illustrates a bent position determined by the bent position determiner 52 if deformation is detected by two deformation detection sensors 51 positioned at the upper right corner and the lower left corner of the information processing apparatus 1B. In the case of FIG. 23C, the bent position determiner 52 determines that the display 2 is bent in such a manner that a crease is put along a diagonal line L3 toward the upper right corner.
-
FIG. 23D illustrates a bent position determined by the bent position determiner 52 if deformation is detected by a deformation detection sensor 51 positioned at the upper right corner and a deformation detection sensor 51 positioned immediately below a deformation detection sensor 51 positioned at the upper left corner of the information processing apparatus 1B. In the case of FIG. 23D, the bent position determiner 52 determines that the display 2 is bent in such a manner that a crease is put along a line L4 that forms the hypotenuse of a right triangle whose right angle is at the upper left corner.
-
FIG. 23E illustrates a bent position determined by the bent position determiner 52 if deformation is detected by three deformation detection sensors 51 positioned in the upper side (long side) and three deformation detection sensors 51 positioned in the lower side (long side). In the case of FIG. 23E, the bent position determiner 52 determines that the display 2 is deformed in such a manner that creases are put along a line L5 in a valley fold, a line L6 in a mountain fold, and a line L7 in a valley fold, from the left side to the right side.
- Display Control Example
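- For illustration only, the determination of FIGS. 23A to 23D can be sketched as follows. This is a minimal sketch, not the disclosed implementation; the class names, coordinate convention, and detection threshold are assumptions.

```python
# Sketch of the bent position determiner 52 (hypothetical implementation).
# Each deformation detection sensor 51 reports an output level; the crease
# is estimated as the line through the positions of the sensors whose
# level indicates deformation (cf. FIGS. 23A to 23D).

from dataclasses import dataclass

@dataclass
class Sensor:
    x: float      # position along the long side (0..1)
    y: float      # position along the short side (0..1)
    level: float  # strain sensor output level

THRESHOLD = 0.5  # assumed detection threshold

def bent_position(sensors):
    """Return the crease as a pair of endpoint coordinates, or None."""
    hits = [s for s in sensors if s.level >= THRESHOLD]
    if len(hits) < 2:
        return None  # not enough information to place a crease
    # With exactly two triggered sensors the crease joins their positions
    # (FIGS. 23A to 23D); with more, join the two outermost hits.
    hits.sort(key=lambda s: (s.x, s.y))
    first, last = hits[0], hits[-1]
    return (first.x, first.y), (last.x, last.y)

# FIG. 23A: sensors near the midpoints of the two short sides
# -> crease parallel to the long sides (line L1).
sensors = [Sensor(0.0, 0.5, 0.9), Sensor(1.0, 0.5, 0.8), Sensor(0.5, 0.0, 0.1)]
print(bent_position(sensors))  # ((0.0, 0.5), (1.0, 0.5))
```

The multi-fold case of FIG. 23E would additionally require grouping triggered sensors into separate creases and distinguishing valley folds from mountain folds by the sign of the sensor output.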
- Next, examples of a control operation performed by the
display content determiner 29 according to the second exemplary embodiment by using inputs from sensors will be described. -
FIG. 24 illustrates an example in which an image of the three-dimensional object is edited by using information on the determined bent position. - In
FIG. 24, at time t1 before the information processing apparatus 1B is deformed, a three-dimensional image 53 of the three-dimensional object is displayed on the display 2. The three-dimensional image 53 in this case is an apple (a virtual object). In addition, the shape of the information processing apparatus 1B at time t1 is an example of a first shape.
- At the next time, time t2, the
information processing apparatus 1B is bent at a position to the right of the center of the screen in such a manner that the display 2 comes inside. Also at this time, no change is made to the displayed three-dimensional image 53 of the three-dimensional object. The shape of the information processing apparatus 1B at time t2 is an example of a second shape.
- At the next time, time t3, the
information processing apparatus 1B is made flat again, but a line 54 is added to the three-dimensional image 53 of the three-dimensional object at a part corresponding to the bent position at time t2 (a curved part on the surface of the apple). In other words, the line addition is an example of editing on the three-dimensional image 53 (e.g., the image itself) of the three-dimensional object displayed across the bent position.
- In the case of
FIG. 24, a user operation for bending the information processing apparatus 1B is used for the display content determiner 29 to add the line 54. Although the line 54 begins to be displayed after the information processing apparatus 1B has been made flat again (at and after time t3) in the example of FIG. 24, the line 54 may begin to be displayed while the information processing apparatus 1B is bent (at time t2). In this case, the displaying of the line 54 may end when the information processing apparatus 1B is no longer bent.
- Here, the part at which a dashed line corresponding to the bent position crosses the three-
dimensional image 53 of the three-dimensional object corresponds to a specific part. -
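- A minimal sketch of this editing step follows; the data layout (a bounding box per object, a vertical crease) is an assumption introduced for illustration, not part of the disclosure.

```python
# Hypothetical sketch: adding the line 54 to the displayed object at the
# specific part where the crease crosses it, once the apparatus is made
# flat again (cf. FIG. 24).

def crease_crosses(obj_bbox, crease_x):
    """True if a vertical crease at x crosses the object's bounding box."""
    left, right = obj_bbox
    return left <= crease_x <= right

def edit_object(edits, obj_bbox, crease_x, flat_again):
    """Add the line only for the specific part the crease crosses."""
    if flat_again and crease_crosses(obj_bbox, crease_x):
        edits.append(("line 54", crease_x))
    return edits

print(edit_object([], obj_bbox=(0.3, 0.8), crease_x=0.6, flat_again=True))
# [('line 54', 0.6)]
```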
FIG. 25 illustrates another example in which an image of the three-dimensional object is edited by using information on the determined bent position. - Also in
FIG. 25, no change is made to the three-dimensional image 53 of the three-dimensional object displayed on the display 2 regardless of the bending state of the information processing apparatus 1B at time t1 and time t2.
- However, in the case of
FIG. 25, after the information processing apparatus 1B has been made flat again, an image in which the three-dimensional image 53 of the three-dimensional object is cut at a part corresponding to the bent position at time t2 is displayed as the three-dimensional image 53 of the three-dimensional object. That is, a cut plane 55 is added.
- Although only an image obtained after cutting the three-dimensional image 53 of the three-dimensional object is displayed in the example of FIG. 25, a process from the start to the end of cutting may be displayed as a moving image.
- The displaying of the cut plane enhances the realism of the image processing. In the case of this exemplary embodiment, the special effect processing to be performed is specified by the user in advance.
- Note that sound effects may be added in accordance with the content of processing at the time of image processing. For example, in the example of
FIG. 25, the sound of cutting an apple may be produced as a sound effect from a speaker (not illustrated). The addition of sound effects also enhances the realism of the image processing.
- Deformation of the
information processing apparatus 1B is also usable to control a processing operation of another information processing apparatus. -
FIG. 26 illustrates an example in which a deformation operation of the information processing apparatus 1B is used to control a display operation of a display device 57 whose display is controlled by another information processing apparatus, an information processing apparatus 56.
- In the case of
FIG. 26, deformation information detected by the information processing apparatus 1B that is deformable at any position is transmitted to the information processing apparatus 56 through a communication unit, and is used to control a screen displayed on the display device 57 provided on or connected to the information processing apparatus 56.
- The
information processing apparatus 56 is configured as a so-called computer, and the content of an image displayed on the information processing apparatus 1B used as an operation unit may be the same as or different from the content of an image displayed on the display device 57 through the information processing apparatus 56.
- In the case of
FIG. 26, on the basis of deformation information detected by the information processing apparatus 1B from time t1 to time t2, the cut plane 55 is displayed in a part of the three-dimensional image 53 of the three-dimensional object displayed on the display device 57. In other words, a part of the three-dimensional image 53 is deleted.
-
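- The deformation information exchanged between the two apparatuses might look like the following sketch. The message format (JSON) and field names are assumptions; the disclosure specifies only that deformation information is transmitted through a communication unit.

```python
# Hypothetical sketch of deformation information transmitted from the
# information processing apparatus 1B to the information processing
# apparatus 56, and of the receiver-side edit of the displayed object.

import json

def encode_deformation(crease_start, crease_end, direction):
    """Serialize one detected deformation event (sender side, 1B)."""
    return json.dumps({
        "event": "deformation",
        "crease": {"start": crease_start, "end": crease_end},
        # "valley" = display folded inward, "mountain" = folded outward
        "direction": direction,
    })

def apply_deformation(message, scene):
    """Receiver side (apparatus 56): edit the object on display device 57."""
    info = json.loads(message)
    if info["event"] == "deformation":
        scene["edits"].append(("cut_plane", info["crease"]))
    return scene

msg = encode_deformation([0.6, 0.0], [0.6, 1.0], "valley")
scene = apply_deformation(msg, {"object": "apple", "edits": []})
print(scene["edits"])
# [('cut_plane', {'start': [0.6, 0.0], 'end': [0.6, 1.0]})]
```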
FIG. 27 illustrates another example in which a deformation operation of the information processing apparatus 1B is used to control a display operation of the display device 57 whose display is controlled by another information processing apparatus, which is the information processing apparatus 56.
-
FIG. 27 illustrates an example in which the information processing apparatus 56 detects deformation of the information processing apparatus 1B used as an operation input unit, and the detected deformation is used to issue an instruction for starting a slide show or for turning pages displayed on the display device 57.
- In the case of
FIG. 27, on the basis of deformation information detected by the information processing apparatus 1B from time t1 to time t2, a screen for a slide show is switched from a first page to a second page. In other words, a display image is replaced.
- This exemplary embodiment will describe a case in which a three-dimensional display is used to display an object three-dimensionally.
-
FIG. 28 illustrates an example of a three-dimensional object displayed by a three-dimensional display. - An
information processing system 61 illustrated in FIG. 28 includes an image capturing apparatus 64 that captures an image of a user, an information processing apparatus 65, and a three-dimensional space depicting apparatus 66.
- The
image capturing apparatus 64 is an apparatus that captures an image of the movement of a user 63 as a subject and is a type of sensor.
- The
information processing apparatus 65 is an apparatus that performs processing for outputting data of the three-dimensional object that is a display target to the three-dimensional space depicting apparatus 66, processing for determining the movement of the user 63 by processing the captured image, image processing on the three-dimensional object in accordance with the determination results, and the like. The information processing apparatus 65 is configured as a so-called computer, and the above-described various kinds of processing are performed by execution of a program.
- The three-dimensional
space depicting apparatus 66 is configured from, for example, an infrared pulse laser, a lens for adjusting a focal point at which an image is formed with laser beams, a galvanometer mirror used for planar scanning of laser beams in a space, and the like, and depicts a three-dimensional image in the air on the basis of the given data of the three-dimensional object. In the example of FIG. 28, a three-dimensional image 62 of an apple is formed by air plasma emission and floats in the air as the three-dimensional object.
- Although the movement of the
user 63 is detected through image processing in this exemplary embodiment, the movement of the user 63 may be detected by using an output from a sensor that detects the movement of the air. Not only the movement of the user 63, but also the body temperature of the user 63 and a change thereof, the ambient temperature of the user 63, and the humidity around the user 63 and a change thereof may be detected through thermography.
-
FIG. 29 illustrates a state in which a change is made to a displayed three-dimensional object in response to a specific movement of the user. - In
FIG. 29, parts corresponding to those in FIG. 28 are denoted by the corresponding reference numerals.
-
FIG. 29 illustrates a case in which a user moves to vertically cut a specific part of the three-dimensional image 62. The information processing apparatus 65 that detects the movement of the user determines the position of the three-dimensional image 62 at which a change is to be made in response to the movement of the user, and determines the change content. In the case of FIG. 29, the volumes of the divisions obtained by dividing the three-dimensional image 62 of the apple along the determined position are calculated from the data of the three-dimensional object, and depiction data is generated such that the division having the larger volume is depicted in the air.
- Thus, in
FIG. 29, a cut plane 67 of the division having the larger volume in the three-dimensional image 62 of the apple is depicted.
- Although the
cut plane 67 is depicted in this example, the change content may be controlled in any manner in accordance with the three-dimensional object that is depicted, as in the case of the first exemplary embodiment.
-
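- The volume comparison described above can be sketched as follows. The voxel representation is an assumption made for illustration; the disclosure says only that volumes are calculated from the data of the three-dimensional object.

```python
# Hypothetical sketch: choosing which division of the cut object to
# depict by comparing the volumes of the two divisions (cf. FIG. 29).

def larger_division(voxels, cut_x):
    """Split voxels at the plane x = cut_x and keep the larger division."""
    left = [v for v in voxels if v[0] < cut_x]
    right = [v for v in voxels if v[0] >= cut_x]
    # Each voxel contributes unit volume, so volume ~ voxel count.
    return left if len(left) > len(right) else right

# Cutting near the right edge keeps the left (larger) part of the apple.
apple = [(x, y) for x in range(10) for y in range(10)]
kept = larger_division(apple, cut_x=7)
print(len(kept))  # 70
```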
FIG. 30 illustrates an example in which a change is made to a three-dimensional object that is projected onto a wall or a floor. - In
FIG. 30, parts corresponding to those in FIG. 28 are denoted by the corresponding reference numerals.
- An
information processing system 61A illustrated in FIG. 30 is different from the information processing system 61 in that a projecting apparatus 68 is used in place of the three-dimensional space depicting apparatus 66.
- In the case of
FIG. 30, the projecting apparatus 68 projects a three-dimensional image 69 of the three-dimensional object onto a wall, which is a two-dimensional space. A user illustrated in FIG. 30 moves his/her right hand from top to bottom (−Z direction) along the wall, which is a projection plane. The information processing apparatus 65 detects the movement of the user through an infrared sensor (not illustrated) or image processing and makes a change to the three-dimensional image 69 to be projected. In the case of FIG. 30, a cut plane 70 is displayed.
- The exemplary embodiments of the present invention have been described above. However, the technical scope of the present invention is not limited to the above-described exemplary embodiments. It is apparent from the claims that various modifications or improvements of the above-described exemplary embodiments are also included in the technical scope of the present invention.
- For example, the color of a specific part of an object that is displayed three-dimensionally may be changed.
- In the above-described exemplary embodiments, to a specific part of an object that is displayed three-dimensionally on a display, a change is made by using a sensor such as a sensor that detects the position on a display screen receiving user operations (the position detection sensor 4), a sensor that detects a user operation as pressure (the
pressure detection sensor 5 (see FIG. 4)). However, any sensor other than the above-described sensors may be used.
- For example, a sensor that measures the elevation of the position where the
information processing apparatus 1 is used may be used. Examples of this type of sensor include an altimeter that measures atmospheric pressure to calculate the elevation, a global positioning system (GPS) receiver that calculates the elevation by using GPS signals, and the like. Note that some GPS signals are intended to be usable indoors.
- Examples further include an air pressure sensor, an illuminance sensor, a water pressure sensor, a water depth sensor, and the like for measuring the position where the
information processing apparatus 1 is used. - In a case of using a sensor that measures the elevation, in accordance with the elevation of the position where the
information processing apparatus 1 is used, a change may be made to a specific part or a background of an object displayed on the display 2, or a screen displayed on the display 2 may be changed. For example, at a position where the elevation is high, wings may be added to a part of the object (e.g., the back or arms of a character), clouds may be added to the background of the object, or the screen may be switched to a bird's-eye view looking down upon surrounding landscapes from above or an image of a sky.
- In a case of using a sensor that measures the air pressure, a change may be made in such a manner that a part of the object (e.g., the stomach or cheeks of a character) is puffed out or sucked in in accordance with a change in the air pressure value. Alternatively, a change may be made to an image in such a manner that the volume of an object representing a sealed bag, a balloon, or the like is increased or decreased as the air pressure is decreased or increased.
- In a case of using an illuminance sensor, a change may be made to a part of an object in accordance with a change in the illuminance value. For example, a character as the object may wear sunglasses in bright locations or may carry a flashlight in dark locations. In addition, the display form of the object or the display content of the screen may be switched between night and day in accordance with the brightness of an ambient environment.
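- The illuminance-driven switching described above might be sketched as follows; the threshold values and accessory names are assumptions introduced for illustration.

```python
# Hypothetical sketch: switching a character's accessories on the basis
# of an illuminance sensor output (thresholds are assumptions).

def accessory_for(illuminance_lx):
    """Return the accessory to display for the measured illuminance."""
    if illuminance_lx >= 20000:   # bright outdoor light
        return "sunglasses"
    if illuminance_lx < 50:       # dark location
        return "flashlight"
    return None                   # no accessory change

print(accessory_for(30000))  # sunglasses
print(accessory_for(10))     # flashlight
```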
- Furthermore, a GPS sensor may be used as the sensor. For example, in a case in which an operation manual of an apparatus is displayed as the object, the language therein may be changed to the first language of the detected country or area.
- Contents of questions and answers described in the operation manual or contents related to precautions for operation may be changed on the basis of an output value of a sensor, which changes in accordance with a use environment of the
information processing apparatus 1. The change herein includes, for example, a change of the position of description in such a manner that contents that are likely to be referred to in an environment determined on the basis of an output from a sensor are placed in higher levels. - For example, if it is determined that the
information processing apparatus 1 is used in a hot and humid environment, a change may be made in such a manner that precautions for a hot and humid environment are placed in higher levels of the operation manual displayed as the object. - Alternatively, for example, pictures in the operation manual may be localized (customized) in accordance with a hot and humid environment.
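- The reordering described above might be sketched as follows; the section titles and the hot-and-humid test are assumptions, as the disclosure does not specify them.

```python
# Hypothetical sketch: placing contents that are likely to be referred to
# in the sensed use environment in higher levels of the operation manual.

manual_sections = [
    "Basic operation",
    "Troubleshooting",
    "Precautions for hot and humid environments",
    "Specifications",
]

def reorder_manual(sections, temperature_c, humidity_pct):
    """Move environment-relevant precautions to the top when needed."""
    hot_and_humid = temperature_c >= 30 and humidity_pct >= 70
    if not hot_and_humid:
        return list(sections)
    relevant = [s for s in sections if "hot and humid" in s.lower()]
    rest = [s for s in sections if "hot and humid" not in s.lower()]
    return relevant + rest  # likely-referenced contents placed higher

print(reorder_manual(manual_sections, temperature_c=33, humidity_pct=85)[0])
# Precautions for hot and humid environments
```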
- The above exemplary embodiments have described a case in which each sensor is provided in the
information processing apparatus 1. However, a sensor may be independent of the information processing apparatus 1, and an output value of the sensor may be given to the information processing apparatus 1 through a communication unit.
- In addition, a change may be made to an object displayed on the
information processing apparatus 1 on the basis of environmental information (first information) acquired from the outside. The environmental information herein is information that is related to positional information of the information processing apparatus 1, that is acquirable from the outside, and that includes, for example, information regarding weather, such as a weather forecast, information regarding crime prevention, such as occurrence of a crime, and information regarding traffic, such as a traffic accident or a traffic jam.
- For example, in a state in which a map is displayed on a display screen of the
information processing apparatus 1 and in which a character image representing the position of a user (i.e., the position of the information processing apparatus 1) is displayed on the map, upon reception of a high-temperature warning associated with the position of the user, sleeves and cuffs of the character image may be shortened (clothing may be changed to a short-sleeved shirt and shorts), and the body may be sweating. - For example, in a state in which a map is displayed on a display screen of the
information processing apparatus 1 and in which a character image representing the position of a user (i.e., the position of the information processing apparatus 1) is displayed on the map, upon reception of a notification of occurrence of a crime in an area associated with the position of the user, a change may be made in such a manner that the character's expression is changed to a frightened expression or the character's body is trembling. - For example, upon reception of a notification of occurrence of a traffic accident, a change may be made to the display in the same manner.
- Note that if plural pieces of environmental information are acquired together, the display content may be changed by combining changes corresponding to the plural pieces of environmental information. For example, if the above high-temperature warning and the above notification of occurrence of a crime are both acquired, a change may be made in such a manner that a character wears light clothing, is sweating, and has a frightened expression.
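- Combining changes corresponding to plural pieces of environmental information might be sketched as follows; the labels are assumptions drawn from the examples above.

```python
# Hypothetical sketch: combining display changes from plural pieces of
# environmental information (high-temperature warning + crime occurrence).

def character_changes(warnings):
    """Map received environmental information to combined display changes."""
    changes = []
    if "high-temperature warning" in warnings:
        changes += ["light clothing", "sweating"]
    if "crime occurrence" in warnings:
        changes += ["frightened expression"]
    return changes

print(character_changes({"high-temperature warning", "crime occurrence"}))
# ['light clothing', 'sweating', 'frightened expression']
```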
- In addition, a change may be made to the displayed object by combining information acquired by the sensor and environmental information.
- In the above-described exemplary embodiments, in principle, a specific change is made to the display by using a physical quantity measured by each sensor. However, outputs from plural sensors may be combined to make a change to the display. The physical quantity herein may include pressure, acceleration, temperature, humidity, air pressure, elevation, water depth, magnetic pole, sound, positional information, and the like, which are measurement targets. The physical quantity herein may further include a change in an electrical signal (current or voltage) that appears in the sensor.
- For example, four pieces of information, which are temporal information, an elevation value, a temperature value, and an illuminance, may be combined to determine a use environment of the
information processing apparatus 1, and in accordance with the determined use environment (e.g., mountain in summer), a change may be made to a part of the object or to a screen. - Alternatively, for example, the clothing of a character as the object may be changed to clothing in a summer resort or clothing for climbing a mountain in summer, and an image displayed on the
display 2 may be changed to an image of a summer sky. - Further alternatively, creatures that live in water areas corresponding to the determined water depth and water temperature may be displayed on the
display 2 by combining the water depth and the temperature value, and a change may be made to the display form of the object. - In the above-described exemplary embodiment, a character, an apple, and the like are used as examples of the three-dimensional object. However, the object that is displayed three-dimensionally is not limited to these. For example, an hourglass representing the lapse of time and a set of antennas representing the intensity of radio waves are also examples of the three-dimensional object.
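- The combination of plural sensor outputs described above (temporal information, elevation, temperature, and illuminance) might be evaluated as in the following sketch; the function name and thresholds are assumptions, not part of the disclosure.

```python
# Hypothetical sketch: combining four sensor inputs to determine the
# use environment of the information processing apparatus 1.

def classify_environment(month, elevation_m, temperature_c, illuminance_lx):
    """Return a coarse use-environment label from combined readings."""
    summer = month in (6, 7, 8)
    if summer and elevation_m > 1500 and illuminance_lx > 10000:
        return "mountain in summer"   # e.g., mountain clothing, summer sky
    if illuminance_lx < 50:
        return "dark location"        # e.g., character carries a flashlight
    if temperature_c > 30:
        return "hot location"         # e.g., short sleeves, sweating
    return "default"

print(classify_environment(month=8, elevation_m=2000,
                           temperature_c=18, illuminance_lx=50000))
# mountain in summer
```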
- In addition, the three-dimensional object as a display target may be a product such as a built-in device, equipment, or a machine that operates in accordance with software such as firmware. In this case, if design information of the product (e.g., computer-aided design (CAD) data or performance information of constituents) is available, such information and output values from sensors may be combined in the display.
- For example, if the thermal expansion coefficient or the like of each component to be configured or each member to be attached or detached is acquirable by the
information processing apparatus 1 and temperature information is acquired from a sensor, the shape of each component of a displayed object may be changed in accordance with the acquired information. In this case, the changed shape may be emphasized in the display. - Such a display function enables a change in the display in such a manner that the shape of some of the components is expanded if the ambient temperature of the
information processing apparatus 1 is high. This display amounts to a simulation of a future change of the product under the current environment. Although typical simulations require inputs of temperature conditions and the like, this display function enables a change that will be generated in the product to be checked simply by displaying the target product on a screen on-site.
- In addition, if the ambient temperature of the
information processing apparatus 1 is low, this display function enables estimation of increased viscosity of a lubricant and a change of the movement of a product displayed on a screen as a change to hardware. - With this display function, an abnormality of hardware movement due to an influence of a use environment on software or a malfunction of software may be represented as a change to a displayed object. For example, if the ambient environment of the
information processing apparatus 1 is humid and hot, it is possible to estimate a short circuit in an electrical circuit constituting an object or thermal runaway of a CPU, and to change the movement of the object on a display screen to movement unique to such a malfunction (e.g., screen blackout or error message display).
- Note that the change of a displayed object based on information acquired by various sensors may be realized by using, for example, a correspondence table 60 illustrated in
FIG. 31 . -
FIG. 31 illustrates an example of the correspondence table 60 representing a correspondence relationship between objects as change targets in combination with numbers of dimensions of sensor values and changed images. - The correspondence table 60 includes objects as change targets displayed as, in addition to three-dimensional images, one-dimensional images and two-dimensional images. That is, although the above exemplary embodiments have described cases in which a change is made to displayed three-dimensional objects, the number of dimensions of an object as a display target may be any number as long as the display is capable of displaying the object. In addition, if changed images corresponding to the respective numbers of dimensions are prepared, displayed images may be changed on the basis of outputs from various sensors or the like.
- On the other hand, sensor values are classified according to the number of dimensions of acquired information. Specifically, one-dimensional sensor values 72 correspond to cases in which sensor values are one-dimensional, two-dimensional sensor values 73 correspond to cases in which sensor values are two-dimensional, and three-dimensional sensor values 74 correspond to cases in which sensor values are three-dimensional.
- The classification into the one-dimensional sensor values 72, the two-dimensional sensor values 73, and the three-dimensional sensor values 74 herein may be based on the number of dimensions in a three-dimensional space, for example.
- For example, if a physical quantity detected by a single sensor corresponds to one dimension (e.g., X-axis direction) in a three-dimensional space, an output value of this sensor is included in the one-dimensional sensor values 72. Note that, even if two sensors detect different physical quantities, as long as the respective physical quantities correspond to the same dimension (e.g., X-axis direction), output values of the two sensors are the one-dimensional sensor values 72.
- Similarly, if, for example, a physical quantity detected by a single sensor corresponds to two dimensions (e.g., X-axis direction and Y-axis direction) in a three-dimensional space, an output value of this sensor is included in the two-dimensional sensor values 73. In addition, if two sensors detect one-dimensional physical quantities in different directions (e.g., one is X-axis direction and the other is Y-axis direction), output values of the two sensors are the two-dimensional sensor values 73.
- Similarly, if, for example, a physical quantity detected by a single sensor corresponds to three dimensions (e.g., X-axis direction, Y-axis direction, and Z-axis direction) in a three-dimensional space, an output value of this sensor is included in the three-dimensional sensor values 74. In addition, if three sensors detect one-dimensional physical quantities in different directions (e.g., one is X-axis direction, another one is Y-axis direction, and the other is Z-axis direction), output values of the three sensors are the three-dimensional sensor values 74.
- Alternatively, the classification into the one-dimensional sensor values 72, the two-dimensional sensor values 73, and the three-dimensional sensor values 74 herein may be based on the number of dimensions of information, for example.
- For example, if a change is to be made to a displayed object in accordance with a combination of the output values of two sensors that each output a single value, the two sensor values are the two-dimensional sensor values 73. In addition, if a single sensor outputs two values, the two values output from the single sensor are the two-dimensional sensor values 73. The plural output values herein may correspond to the same physical quantity or to different physical quantities.
- Further alternatively, the classification into the one-dimensional sensor values 72, the two-dimensional sensor values 73, and the three-dimensional sensor values 74 may be based on a combination of the number of dimensions of a three-dimensional space and the number of dimensions of information, for example.
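The two classification rules described above (counting spatial axes versus counting output values) can be sketched as follows. The class and function names here are illustrative assumptions, not part of the disclosure.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class SensorValue:
    axes: frozenset   # spatial axes the value maps to, e.g. {"X", "Y"}
    values: tuple     # raw output values from the sensor


def classify_by_space(readings: List[SensorValue]) -> int:
    """Dimensionality by the spatial rule: distinct axes covered by all readings."""
    covered = set()
    for r in readings:
        covered |= set(r.axes)
    return len(covered)


def classify_by_information(readings: List[SensorValue]) -> int:
    """Dimensionality by the information rule: total number of output values."""
    return sum(len(r.values) for r in readings)


# Two one-axis sensors on different axes -> two-dimensional sensor values 73
acc_x = SensorValue(frozenset({"X"}), (0.12,))
acc_y = SensorValue(frozenset({"Y"}), (0.34,))
assert classify_by_space([acc_x, acc_y]) == 2

# Two sensors on the same axis stay one-dimensional under the spatial rule,
# even if they detect different physical quantities
mag_x = SensorValue(frozenset({"X"}), (5.0,))
assert classify_by_space([acc_x, mag_x]) == 1

# A single sensor emitting two values is two-dimensional under the information rule
assert classify_by_information([SensorValue(frozenset({"X"}), (1.0, 2.0))]) == 2
```

Either rule, or a combination of the two, yields the classification used in the embodiments.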
- In the example of FIG. 31, plural changed images 1 to N with respect to images of the respective numbers of dimensions are associated with the levels of sensor values of the respective numbers of dimensions.
- For example, for the one-dimensional sensor values 72, in accordance with differences of the levels, N changed images 1 are associated with one-dimensional images, N changed images 4 are associated with two-dimensional images, and N changed images 7 are associated with three-dimensional images.
- For example, for the two-dimensional sensor values 73, in accordance with differences of combinations of the levels, M changed images 2 are associated with one-dimensional images, M changed images 5 are associated with two-dimensional images, and M changed images 8 are associated with three-dimensional images.
- Similarly, for the three-dimensional sensor values 74, in accordance with differences of combinations of the levels, L changed images 3 are associated with one-dimensional images, L changed images 6 are associated with two-dimensional images, and L changed images 9 are associated with three-dimensional images.
- Although the same number of changed images is associated for each number of dimensions of sensor values regardless of the number of dimensions of images in the example of FIG. 31, different numbers of changed images may be assigned in accordance with combinations. In addition, the number of changed images may be determined in accordance with the number of dimensions of images instead of the number of dimensions of sensor values.
- The above exemplary embodiments have described cases in which a single object is displayed on a screen, for simplicity of description. However, plural objects may be displayed. In this case, the plural objects may be depicted in the same dimension or in different dimensions. For example, if images can be depicted in three dimensions, some objects may be displayed three-dimensionally while the other objects are displayed two-dimensionally or one-dimensionally. Note that all of the objects may be displayed in fewer dimensions than the number of possible dimensions.
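The FIG. 31 association can be thought of as a lookup keyed by the pair (number of sensor dimensions, number of image dimensions), with the level combination selecting an image within the chosen group. The table contents and function name below are an illustrative sketch of that structure.

```python
# Group labels follow the numbering used in FIG. 31 as described above.
CHANGED_IMAGE_GROUPS = {
    # (sensor_dims, image_dims): group of changed images
    (1, 1): "changed images 1", (1, 2): "changed images 4", (1, 3): "changed images 7",
    (2, 1): "changed images 2", (2, 2): "changed images 5", (2, 3): "changed images 8",
    (3, 1): "changed images 3", (3, 2): "changed images 6", (3, 3): "changed images 9",
}


def select_changed_image(sensor_dims: int, image_dims: int, level_combination):
    """Pick the group for this dimension pair, then index it by the level(s)."""
    group = CHANGED_IMAGE_GROUPS[(sensor_dims, image_dims)]
    # Within a group the specific image depends on the (combination of) levels;
    # a simple string index stands in for the actual image data here.
    return f"{group}[{level_combination}]"


# Two-dimensional sensor values changing a three-dimensional image
assert select_changed_image(2, 3, (1, 0)) == "changed images 8[(1, 0)]"
```

Assigning different numbers of changed images per combination, as the text allows, would simply mean the groups in the table have different sizes.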
- In this case, a change may be made to each image in accordance with the number of dimensions used for depicting the image.
- In addition, although a change in accordance with a sensor value may be made to every image corresponding to one of the plural objects, the change may instead be made only to a specific object specified by a user among the plural objects, or only to an image corresponding to a specific dimension or dimensions. For example, if both three-dimensionally depicted object images and two-dimensionally depicted object images are present, a change may be made only to the three-dimensionally depicted object images specified by a user.
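The selective application just described can be sketched as a filter over the displayed objects: a change is applied only to objects the user specified, or only to objects depicted with a given number of dimensions. The function and field names are illustrative assumptions.

```python
def apply_change(objects, sensor_value, selected_ids=None, only_dims=None):
    """Apply a sensor-driven change, optionally restricted by id or by dimension."""
    for obj in objects:
        if selected_ids is not None and obj["id"] not in selected_ids:
            continue  # not a user-specified object
        if only_dims is not None and obj["dims"] != only_dims:
            continue  # depicted in a different number of dimensions
        obj["offset"] = sensor_value


objs = [
    {"id": 1, "dims": 3, "offset": 0.0},  # three-dimensionally depicted
    {"id": 2, "dims": 2, "offset": 0.0},  # two-dimensionally depicted
]
apply_change(objs, 0.5, only_dims=3)  # only the 3-D object is changed
assert objs[0]["offset"] == 0.5 and objs[1]["offset"] == 0.0
```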
- The above exemplary embodiments have described cases in which changes are made to the display of the information processing apparatus 1. However, an output to an image forming apparatus (a so-called printer) may also reflect such a change in the display. For example, if a print instruction is issued while a change is made to a displayed three-dimensional object in accordance with a sensor output value, print data corresponding to the changed content may be generated to print an image. That is, not only image display but also printing may reflect a sensor output value.
- Note that a change in the display may be independent of the content of the image to be printed. That is, even if a change is made to a displayed object, the content to be printed may be the content of the three-dimensional object before the change is made in accordance with the sensor output value. In this case, it is desirable that a user be allowed to select, on a user interface screen, whether to print an image including display content that has been changed in accordance with the sensor output value or an image including content before the change is made in accordance with the sensor output value.
- The user interface screen herein may be prepared as a setting screen or a part of a confirmation screen displayed in a pop-up window each time a print instruction is issued.
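The print-time choice described above reduces to selecting which representation of the object becomes the source of the print data. A minimal sketch, assuming hypothetical names for the function and the object records:

```python
def build_print_data(original_object, changed_object, use_changed_display: bool):
    """Choose between the sensor-changed display content and the original object."""
    source = changed_object if use_changed_display else original_object
    return {"content": source}


original = {"shape": "cube"}
changed = {"shape": "cube", "tilt": 12.0}  # tilted in response to a sensor value

# User chose to print the changed display content on the confirmation screen
assert build_print_data(original, changed, use_changed_display=True)["content"] is changed
# User chose to print the object as defined before the change
assert build_print_data(original, changed, use_changed_display=False)["content"] is original
```

The `use_changed_display` flag would be set from the setting screen or the pop-up confirmation screen mentioned above.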
- To print an image on a sheet of paper, the information processing apparatus 1 converts the three-dimensional information defining the three-dimensional object into two-dimensional information. The two-dimensional information herein may be, for example, an image obtained by observing the surface of the three-dimensional object, or an image reflecting a change in an internal structure of the three-dimensional object as illustrated in FIG. 24 or the like.
- Note that the so-called printer may be, in addition to an apparatus that prints a two-dimensional image on a recording medium such as a sheet of paper, a three-dimensional printer that forms a three-dimensional image. If the output destination is a three-dimensional printer, the three-dimensional object is output as a three-dimensional object without being converted into two-dimensional information.
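The two output paths can be sketched as follows: a paper printer receives a two-dimensional projection of the object's geometry, while a three-dimensional printer receives the three-dimensional information unchanged. The orthographic projection (simply dropping the Z coordinate) is an illustrative choice, not the method of the disclosure.

```python
def prepare_output(vertices_3d, printer_kind: str):
    """Convert 3-D geometry for a paper printer; pass it through for a 3-D printer."""
    if printer_kind == "3d":
        return vertices_3d  # no conversion needed for a three-dimensional printer
    # Simple orthographic projection onto the XY plane for a two-dimensional image
    return [(x, y) for (x, y, z) in vertices_3d]


corners = [(0, 0, 0), (1, 0, 1)]
assert prepare_output(corners, "paper") == [(0, 0), (1, 0)]
assert prepare_output(corners, "3d") == corners
```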
- In the above-described exemplary embodiments, a mobile information terminal (mobile display device) such as a tablet computer or a smartphone is assumed as the information processing apparatus 1. However, the above technique is applicable to any information terminal that includes a display (including a projection function) and a sensor, such as a clock, a toy such as a game machine, a television receiver, a projector, a head-mounted display, an image forming apparatus (a so-called printer), an electronic whiteboard, or a robot for communication.
- Based on the above, the invention provides a display control method including the following steps: acquiring positional information, which indicates a position where a mobile display device is being used, from the display device; acquiring first information, which is different from the positional information and is associated with the positional information, from a device different from the display device; and differentiating a two-dimensional image, which represents an object defined three-dimensionally from one viewpoint direction and is displayed on the display device, according to the acquired first information, and displaying the two-dimensional image as a moving image. An image of a virtual creature displayed on the display device, a form of which is imitated by the object, changes in association with at least the first information and the positional information. An image of a virtual creature different from that image is additionally displayed on the display device in association with a change of the first information and the positional information. In some embodiments, the number of images of virtual creatures displayed on the display device, forms of which are imitated by the object, differs in association with the first information and the positional information. In some embodiments, the first information is information regarding weather; when an abnormality of the weather is notified through the information regarding weather, the two-dimensional image is displayed in a mode according to the content of the abnormality. Warning information is displayed on the display device when the abnormality of the weather is notified through the first information. In some embodiments, a magnitude of change in the display of the two-dimensional image is changed according to the information regarding weather. In some embodiments, the two-dimensional image increases on the display device and then decreases according to the first information.
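The claimed flow can be sketched as a single update step: positional information comes from the display device, weather information (the first information) from a separate device, and the displayed virtual-creature image and any warning are derived from both. All names and the specific rules below are illustrative assumptions, not the claimed method itself.

```python
def update_display(position: str, weather: dict) -> dict:
    """Derive one display frame from positional information and weather information."""
    frame = {"creatures": 1, "warning": None}
    # The number of displayed virtual creatures varies with the two inputs
    # (here, an extra creature appears when rain is reported at the position)
    if weather.get("rainfall_mm", 0) > 0:
        frame["creatures"] = 2
    # A weather abnormality notified through the first information adds a warning
    if weather.get("abnormal"):
        frame["warning"] = f"Weather abnormality near {position}"
    return frame


frame = update_display("35.6N,139.7E", {"rainfall_mm": 5, "abnormal": True})
assert frame["creatures"] == 2
assert "35.6N,139.7E" in frame["warning"]

calm = update_display("35.6N,139.7E", {"rainfall_mm": 0})
assert calm == {"creatures": 1, "warning": None}
```

Varying the magnitude of the change, or growing and then shrinking the image, would correspond to scaling the frame contents by further functions of the weather information.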
- The foregoing description of the exemplary embodiments of the present invention has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise forms disclosed. Obviously, many modifications and variations will be apparent to practitioners skilled in the art. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, thereby enabling others skilled in the art to understand the invention for various embodiments and with the various modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the following claims and their equivalents.
Claims (17)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/920,435 US20200334888A1 (en) | 2017-08-24 | 2020-07-03 | Display control method, information processing apparatus and non-transitory computer readable medium |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017-161182 | 2017-08-24 | ||
JP2017161182A JP6946857B2 (en) | 2017-08-24 | 2017-08-24 | Information processing equipment and programs |
US15/943,713 US10706606B2 (en) | 2017-08-24 | 2018-04-03 | Information processing apparatus for modifying a graphical object based on sensor input |
US16/920,435 US20200334888A1 (en) | 2017-08-24 | 2020-07-03 | Display control method, information processing apparatus and non-transitory computer readable medium |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/943,713 Continuation US10706606B2 (en) | 2017-08-24 | 2018-04-03 | Information processing apparatus for modifying a graphical object based on sensor input |
Publications (1)
Publication Number | Publication Date |
---|---|
US20200334888A1 true US20200334888A1 (en) | 2020-10-22 |
Family
ID=65435479
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/943,713 Active 2038-04-13 US10706606B2 (en) | 2017-08-24 | 2018-04-03 | Information processing apparatus for modifying a graphical object based on sensor input |
US16/920,435 Abandoned US20200334888A1 (en) | 2017-08-24 | 2020-07-03 | Display control method, information processing apparatus and non-transitory computer readable medium |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/943,713 Active 2038-04-13 US10706606B2 (en) | 2017-08-24 | 2018-04-03 | Information processing apparatus for modifying a graphical object based on sensor input |
Country Status (3)
Country | Link |
---|---|
US (2) | US10706606B2 (en) |
JP (1) | JP6946857B2 (en) |
CN (1) | CN109427104B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2022272156A1 (en) * | 2021-06-25 | 2022-12-29 | Meta Platforms Technologies, Llc | Offsetting image light aberration due to waveguide movement in display assemblies |
US11719942B2 (en) | 2021-06-25 | 2023-08-08 | Meta Platforms Technologies, Llc | Offsetting image light aberration due to waveguide movement in display assemblies using information from piezoelectric movement sensors |
Family Cites Families (55)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4149574B2 (en) * | 1997-08-29 | 2008-09-10 | ゼロックス コーポレイション | User interface support device and information input method |
JP2000075781A (en) * | 1998-08-28 | 2000-03-14 | Matsushita Electric Ind Co Ltd | Movable object display device |
JP2000157732A (en) * | 1998-11-30 | 2000-06-13 | Sanseibu Entertainment:Kk | Portable electronic game device |
JP2002090326A (en) | 2000-09-20 | 2002-03-27 | Tdk Corp | Portable electronic equipment |
JP2002314649A (en) * | 2001-04-12 | 2002-10-25 | Kyocera Corp | Portable terminal |
JP2003232888A (en) * | 2001-12-07 | 2003-08-22 | Global Nuclear Fuel-Japan Co Ltd | Integrity confirmation inspection system and integrity confirmation method for transported object |
JP2004012637A (en) * | 2002-06-04 | 2004-01-15 | Olympus Corp | Camera with camera-shake detecting function |
JP2004004281A (en) | 2002-05-31 | 2004-01-08 | Toshiba Corp | Information processor and object display method for use in the same |
WO2004079530A2 (en) | 2003-03-03 | 2004-09-16 | America Online, Inc. | Using avatars to communicate |
KR101459985B1 (en) | 2004-03-01 | 2014-11-07 | 애플 인크. | Methods and apparatuses for operating a portable device based on an accelerometer |
JP4757120B2 (en) * | 2006-07-06 | 2011-08-24 | キヤノン株式会社 | Image processing apparatus and control method thereof |
JP2009543588A (en) * | 2006-07-14 | 2009-12-10 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | Method, apparatus, system and computer readable medium for interactive shape manipulation |
JP5101080B2 (en) * | 2006-10-19 | 2012-12-19 | 任天堂株式会社 | GAME PROGRAM, GAME DEVICE, GAME SYSTEM, AND GAME CONTROL METHOD |
JP2008140076A (en) * | 2006-11-30 | 2008-06-19 | Toshiba Corp | Information processor |
CN100495436C (en) * | 2006-12-15 | 2009-06-03 | 浙江大学 | Decomposing method for three-dimensional object shapes based on user easy interaction |
US20090044744A1 (en) * | 2007-01-30 | 2009-02-19 | Luna Innovations Incorporated | Thermal Sensitive Material |
KR101426082B1 (en) * | 2007-10-04 | 2014-07-31 | 삼성전자주식회사 | Method for remote-controlling target apparatus using mobile communication terminal and remote control system thereof |
US8743118B2 (en) * | 2008-03-21 | 2014-06-03 | Hitachi Medical Corporation | Medical image display device and medical image display method |
WO2010070532A2 (en) * | 2008-12-16 | 2010-06-24 | Koninklijke Philips Electronics N.V. | Positioning a cut plane in respect of a three-dimensional image |
JP5793426B2 (en) * | 2009-01-29 | 2015-10-14 | イマージョン コーポレーションImmersion Corporation | System and method for interpreting physical interaction with a graphical user interface |
JP5637346B2 (en) * | 2009-04-24 | 2014-12-10 | フリュー株式会社 | Photo sticker creation apparatus, photo sticker creation method, and program |
JP5719205B2 (en) * | 2010-11-22 | 2015-05-13 | シャープ株式会社 | Electronic device and display control method |
JP2012216148A (en) * | 2011-04-01 | 2012-11-08 | Sharp Corp | Display device, display method, computer program, and recording medium |
JP5613126B2 (en) * | 2011-09-09 | 2014-10-22 | Kddi株式会社 | User interface device, target operation method and program capable of operating target in screen by pressing |
JP2013105312A (en) * | 2011-11-14 | 2013-05-30 | Sony Corp | Information processing device, control method, and program |
CN103135889B (en) * | 2011-12-05 | 2017-06-23 | Lg电子株式会社 | Mobile terminal and its 3D rendering control method |
US9116599B2 (en) * | 2012-03-19 | 2015-08-25 | Autodesk, Inc. | Systems and methods for visualizing a 3D scene using a flexible display |
CN104272214B (en) * | 2012-05-11 | 2018-12-04 | 株式会社半导体能源研究所 | Electronic equipment, storage medium, program and display methods |
KR101984154B1 (en) * | 2012-07-16 | 2019-05-30 | 삼성전자 주식회사 | Control method for terminal using touch and gesture input and terminal thereof |
CN104508600B (en) * | 2012-07-27 | 2017-06-13 | 日本电气方案创新株式会社 | Three-dimensional user interface device and three-dimensional manipulating method |
JP2014035496A (en) * | 2012-08-09 | 2014-02-24 | Canon Inc | Display device, control method of display device, and program |
KR102043810B1 (en) * | 2012-08-20 | 2019-11-12 | 삼성전자주식회사 | Flexible display apparatus and controlling method thereof |
JPWO2014038109A1 (en) | 2012-09-10 | 2016-08-08 | 日本電気株式会社 | Notification information display processing device, notification information display method, and program |
JP5747007B2 (en) * | 2012-09-12 | 2015-07-08 | 富士フイルム株式会社 | MEDICAL IMAGE DISPLAY DEVICE, MEDICAL IMAGE DISPLAY METHOD, AND MEDICAL IMAGE DISPLAY PROGRAM |
JP2014110628A (en) | 2012-12-04 | 2014-06-12 | Tokyo Shashin Kobo Co Ltd | Picture display unit |
KR102124398B1 (en) | 2012-12-18 | 2020-06-18 | 삼성전자주식회사 | Display apparatus and Method for processing image thereof |
EP4220506A1 (en) * | 2013-10-30 | 2023-08-02 | Screening Eagle Dreamlab Pte. Ltd. | Arrangement and method for inspecting an object, in particular a structure |
CN104699235B (en) * | 2013-12-05 | 2017-12-01 | 浙江大学 | Three dimensions imaging exchange method and system based on ultrasonic wave |
WO2015111269A1 (en) * | 2014-01-27 | 2015-07-30 | 富士フイルム株式会社 | Image processing device, imaging device, image processing method, and image processing program |
JP6401459B2 (en) * | 2014-02-14 | 2018-10-10 | キヤノン株式会社 | Image processing apparatus and image processing method |
US9766715B2 (en) * | 2014-05-01 | 2017-09-19 | Seiko Epson Corporation | Head-mount type display device, control system, method of controlling head-mount type display device, and computer program |
JP6510218B2 (en) * | 2014-11-27 | 2019-05-08 | キヤノンメディカルシステムズ株式会社 | Medical image processing device |
EP3232395A4 (en) | 2014-12-09 | 2018-07-11 | Sony Corporation | Information processing device, control method, and program |
JP6437811B2 (en) * | 2014-12-10 | 2018-12-12 | 株式会社Nttドコモ | Display device and display method |
JP2016206694A (en) * | 2015-04-15 | 2016-12-08 | 富士ゼロックス株式会社 | Terminal, information processing apparatus, image forming system, and program |
JP2017009687A (en) * | 2015-06-18 | 2017-01-12 | カシオ計算機株式会社 | Display device |
JP6623657B2 (en) | 2015-10-05 | 2019-12-25 | 日産自動車株式会社 | Information providing apparatus, information providing system, and information providing method |
JP6657797B2 (en) * | 2015-10-30 | 2020-03-04 | 富士ゼロックス株式会社 | Printing system, display control device and program |
CN106933473A (en) * | 2015-12-31 | 2017-07-07 | 南宁富桂精密工业有限公司 | The electronic installation of three-dimensional object creation method and application the method |
JP2017124079A (en) * | 2016-01-15 | 2017-07-20 | セイコーエプソン株式会社 | Display method, swing analyzing device, swing analyzing system, swing analyzing program and recording medium |
US9999823B2 (en) * | 2016-01-15 | 2018-06-19 | Inxpar Inc. | System for analyzing golf swing process and method thereof |
US20180015363A1 (en) * | 2016-07-13 | 2018-01-18 | Play Impossible Corporation | Smart Playable Device, Gestures, and User Interfaces |
US10019849B2 (en) * | 2016-07-29 | 2018-07-10 | Zspace, Inc. | Personal electronic device with a display system |
US10639878B2 (en) * | 2017-02-15 | 2020-05-05 | Autodesk, Inc. | Three-dimensional printing |
JP6315122B2 (en) * | 2017-03-08 | 2018-04-25 | カシオ計算機株式会社 | Display control apparatus, display control method, and program |
-
2017
- 2017-08-24 JP JP2017161182A patent/JP6946857B2/en active Active
-
2018
- 2018-04-03 US US15/943,713 patent/US10706606B2/en active Active
- 2018-04-09 CN CN201810313367.7A patent/CN109427104B/en active Active
-
2020
- 2020-07-03 US US16/920,435 patent/US20200334888A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
US10706606B2 (en) | 2020-07-07 |
US20190066357A1 (en) | 2019-02-28 |
CN109427104A (en) | 2019-03-05 |
CN109427104B (en) | 2023-09-26 |
JP2019040340A (en) | 2019-03-14 |
JP6946857B2 (en) | 2021-10-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10748021B2 (en) | Method of analyzing objects in images recorded by a camera of a head mounted device | |
CN106643699B (en) | Space positioning device and positioning method in virtual reality system | |
US10636185B2 (en) | Information processing apparatus and information processing method for guiding a user to a vicinity of a viewpoint | |
CN105793764B (en) | For providing equipment, the method and system of extension display equipment for head-mounted display apparatus | |
JP5055516B2 (en) | System and method for displaying device maintenance and operation instructions using augmented reality | |
US10380920B2 (en) | System and method for augmented ultrasound simulation using flexible touch sensitive surfaces | |
US20200334888A1 (en) | Display control method, information processing apparatus and non-transitory computer readable medium | |
CN109584375B (en) | Object information display method and mobile terminal | |
US20180158222A1 (en) | Image processing apparatus displaying image of virtual object and method of displaying the same | |
KR20230028532A (en) | Creation of ground truth datasets for virtual reality experiences | |
KR102209745B1 (en) | An information display device of a mirror display for advertisement and shopping by recognizing the reflected images on the mirror and method thereof | |
CN110260857A (en) | Calibration method, device and the storage medium of vision map | |
CN110895676B (en) | dynamic object tracking | |
US20210409626A1 (en) | Patch tracking image sensor | |
CN115917465A (en) | Visual inertial tracking using rolling shutter camera | |
CN109906471A (en) | Real-time three-dimensional camera calibrated | |
TWI653546B (en) | Virtual reality system with outside-in tracking and inside-out tracking and controlling method thereof | |
US20230245396A1 (en) | System and method for three-dimensional scene reconstruction and understanding in extended reality (xr) applications | |
CN103152626A (en) | Far infrared three-dimensional hand signal detecting device of intelligent television set | |
CN112912936A (en) | Mixed reality system, program, mobile terminal device, and method | |
CN105203073B (en) | A kind of imager with dividing scale for electronic distance measuring | |
CN115359422A (en) | High-altitude parabolic image generation method, device and system | |
JP2019050018A (en) | Method for controlling display, information processor, and program | |
KR101850134B1 (en) | Method and apparatus for generating 3d motion model | |
US11017578B2 (en) | Display control system to control a display based on detecting wind |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
AS | Assignment |
Owner name: FUJIFILM BUSINESS INNOVATION CORP., JAPAN Free format text: CHANGE OF NAME;ASSIGNOR:FUJI XEROX CO., LTD.;REEL/FRAME:056294/0219 Effective date: 20210401 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |