US20090262139A1 - Video image display device and video image display method - Google Patents
- Publication number: US20090262139A1 (application US12/375,580)
- Authority: United States
- Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion)
Classifications
- G06T13/00: Animation (G06T, image data processing or generation, in general)
- G06T11/60: Editing figures and text; combining figures or text (G06T11/00, 2D [two-dimensional] image generation)
- A63F2300/6661: Methods for processing data by generating or executing the game program for rendering three-dimensional images, for changing the position of the virtual camera (A63F, video games)
Definitions
- The present invention relates to an image display apparatus and an image display method that display computer graphics (CG) animation and similar images.
- Patent Document 1 Japanese Patent Application Laid-Open No. 2003-323628
- Patent Document 2 Japanese Patent Application Laid-Open No. 2002-150317
- A problem with the technology described in Patent Document 1 above is that the basic image is not displayed while a close-up image is being displayed. For example, if another character begins an action while a particular character is being displayed in close-up, the nature of that action cannot be shown. Also, when only a specific region comprising a facial part of a character is subject to a close-up, and that character performs a whole-body action, the nature of that whole-body action cannot be shown. That is to say, with the technology described in Patent Document 1, the whole-body action, surrounding situation, and the like of an object being displayed in close-up cannot be grasped.
- A problem with the technology described in Patent Document 2 above is that, since a basic image display area and a close-up image display area must both be placed on a limited screen prepared beforehand, the basic image display area becomes small.
- This problem is particularly pronounced on a small, low-resolution screen such as the liquid crystal panel of a mobile phone or PDA (personal digital assistant).
- An image display apparatus of the present invention employs a configuration having a display area discriminating section that discriminates a display area of a specific object in a basic image that is subject to display, and a close-up area determining section that determines a display area of a close-up image in the basic image according to the display area of the specific object in the basic image.
- An image display method of the present invention has a display area discriminating step of discriminating a display area of a specific object in a basic image that is subject to display, and a close-up area determining step of determining a display area of a close-up image in the basic image according to the display area of the specific object in the basic image discriminated by the display area discriminating step.
- the present invention enables both a basic image and a close-up image to be displayed more effectively by determining a display area of the close-up image according to a display area of a specific object in the basic image.
- FIG. 1 is a system configuration diagram showing the configuration of a CG animation display system as an image display apparatus according to an embodiment of the present invention
- FIG. 2 is an explanatory drawing showing a sample description of an animation scenario in this embodiment
- FIG. 3 is a flowchart showing the flow of processing executed by a close-up area determining section in this embodiment
- FIG. 4 is an explanatory drawing showing the content of each item of processing executed by a close-up area determining section in this embodiment
- FIG. 5 is a flowchart showing the flow of processing executed in step S 3000 in FIG. 3 in this embodiment
- FIG. 6 is an explanatory drawing showing an example of the content of an image when close-up display areas are changed in shape in this embodiment
- FIG. 7 is an explanatory drawing showing the nature of changes of a basic image that is subject to close-up display in this embodiment
- FIG. 8 is an explanatory drawing showing the nature of changes of object placement areas in this embodiment.
- FIG. 9 is an explanatory drawing showing the nature of changes of close-up display areas in this embodiment.
- FIG. 10 is an explanatory drawing showing the nature of changes of a final image in this embodiment.
- FIG. 11 is an explanatory drawing showing an example of how the size and position of a close-up display area are interpolated by a smoothing interpolation determining section in this embodiment.
- FIG. 12 is an explanatory drawing showing another example of how the size and position of a close-up display area are interpolated by a smoothing interpolation determining section in this embodiment.
- FIG. 1 is a system configuration diagram showing the configuration of a CG animation display system as an image display apparatus according to an embodiment of the present invention.
- CG animation display system 100 has image material database 200 , CG animation generating section 300 , and image display section 400 .
- CG animation generating section 300 has camerawork determining section 310 , CG picture drawing section 320 , close-up area determining section 330 , smoothing interpolation determining section 340 , and close-up area control section 350 .
- CG picture drawing section 320 has basic image generating section 321 and close-up image generating section 322 .
- Image display section 400 has image display area 410 and close-up display area 420 .
- CG animation display system 100 receives animation scenario 600 , which is the basis of the CG animation, as input.
- FIG. 2 is an explanatory drawing showing a sample description of animation scenario 600 .
- Animation scenario 600 is like the script or scenario of a motion picture or play.
- Animation scenario 600 contains a number of “Scenes” 610 .
- Each “Scene” 610 has attribute “location” 611 indicating a background set.
- each “Scene” 610 has a plurality of “Directions” 620 as sub-elements. Information such as “Subject”, “Action”, and “Object” is written under each “Direction” 620 .
- additional information such as “Expression” is written under “Direction” 620 .
- Animation scenario 600 also contains “Resource (resource information)” 630 .
- “Resource (resource information)” 630 shows the association between a name written in “Scene” 610 and the image material necessary for display as a CG animation image. Specifically, each “Resource” 630 has an attribute “uri” indicating an identifier of image material, and an attribute “name” indicating a name written in “Scene” 610 , “Subject”, or the like. For example, under “Direction” 620 a of “Scene” 610 a , character name “akira” is written as a subject, and in “Resource” 630 a , image material identifier “http://media.db/id/character/akira” is written in association with the name “akira”.
- Image material database 200 shown in FIG. 1 stores image material necessary for generating CG animation.
- Image material includes at least 3D (three-dimensional) model data indicating the shape or external appearance of various kinds of objects such as characters and background sets.
- Image material also includes motion data, still-image data, moving-image data, audio data, music data, and so forth.
- Motion data indicates motion of 3D model data.
- Still-image data and moving-image data are used in drawing textures for 3D model data, backgrounds, and the like.
- Audio data is used in the output of sound effects, synthetic speech, and so forth.
- Music data is used in the output of BGM (background music) or the like.
- CG animation generating section 300 acquires necessary image material from image material database 200 , and generates CG animation of content in line with animation scenario 600 .
- CG animation generating section 300 causes image display section 400 to display a basic image of generated CG animation and a close-up image of an object that appears in generated CG animation.
- camerawork determining section 310 determines a position of an object such as a character, background set, or the like, in an animation space, based on an animation scenario 600 description. Then camerawork determining section 310 determines camerawork for shooting an object whose position has been determined. Specifically, for example, camerawork determining section 310 places a camera at a predetermined position of the animation space and determines basic camerawork. Alternatively, camerawork determining section 310 determines basic camerawork so that shooting is performed with reference to a specific object.
- Technology for determining CG animation camerawork from an animation scenario is known, being described in Japanese Patent Application Laid-Open No. 2005-44181, for example, and therefore a description thereof is omitted here.
- camerawork determining section 310 determines camerawork for shooting a facial part of the character together with basic camerawork. Then camerawork determining section 310 generates data in which determined object positions and camerawork contents are converted to parameters internally by means of coordinate data and so forth, and outputs the generated data to CG picture drawing section 320 .
- CG picture drawing section 320 acquires image material necessary for drawing from image material database 200 based on the data input from camerawork determining section 310 and the animation scenario 600 description, and generates a CG animation image. Specifically, CG picture drawing section 320 first acquires image material from image material database 200 in accordance with the animation scenario 600 description, and then places each acquired image material at a position determined by camerawork determining section 310 .
- CG picture drawing section 320 acquires image material corresponding to identifier “http://media.db/id/character/akira” from image material database 200 , and then places the acquired image material as a subject.
- When placing each acquired image material, CG picture drawing section 320 generates an image implementing the camerawork determined by camerawork determining section 310 . Then CG picture drawing section 320 causes image display section 400 to draw the generated image. Specifically, in CG picture drawing section 320 , basic image generating section 321 generates a basic image based on the basic camerawork, and outputs the generated basic image to close-up area determining section 330 and image display section 400 . Also, close-up image generating section 322 generates a close-up image based on the close-up camerawork, and outputs the generated close-up image to image display section 400 .
- Close-up area determining section 330 determines the size and position of close-up display area 420 . Then close-up area determining section 330 outputs information indicating the size and position of the determined close-up display area 420 to image display section 400 and smoothing interpolation determining section 340 . Close-up area determining section 330 analyzes the basic image input from CG picture drawing section 320 and discriminates an area other than a display area of an object to be displayed with priority from within image display area 410 .
- Close-up area determining section 330 determines close-up display area 420 within an area determined to be other than a display area of an object to be displayed with priority. It goes without saying that this function of close-up area determining section 330 is unnecessary if image display area 410 and close-up display area 420 are prepared in advance as separate display areas, as in the technology described in Patent Document 2 above.
- Smoothing interpolation determining section 340 analyzes a change of close-up display area 420 based on information input from close-up area determining section 330 . Then smoothing interpolation determining section 340 interpolates the analyzed close-up display area 420 change, and provides for the close-up display area 420 change to be performed smoothly or naturally.
- Close-up area control section 350 determines whether or not a close-up image generated by close-up image generating section 322 is to be displayed by image display section 400 .
- Image display section 400 has a liquid crystal panel or suchlike display screen (not shown), and places image display area 410 , which is an area for displaying a CG animation basic image, in the display screen. Also, image display section 400 places close-up display area 420 , which is an area for displaying a CG animation close-up image, in the display screen. Then image display section 400 draws a basic image input from CG animation generating section 300 in image display area 410 , and also draws a close-up image input from CG animation generating section 300 in close-up display area 420 . The size, position, and display/non-display of close-up display area 420 are controlled by information input from CG animation generating section 300 .
- CG animation display system 100 comprises a CPU (Central Processing Unit), a storage medium such as ROM (Read Only Memory) that stores a control program, and RAM (Random Access Memory) or suchlike working memory.
- Image material database 200 , image display area 410 , and close-up display area 420 may each be connected directly to CG animation generating section 300 via a bus, or may be connected to CG animation generating section 300 via a network.
- The operation of close-up area determining section 330 will now be described in detail.
- FIG. 3 is a flowchart showing the flow of processing executed by close-up area determining section 330
- FIG. 4 shows the content of each item of processing executed by close-up area determining section 330 , taking a basic image of a particular moment (hereinafter referred to simply as “basic image”) as an example.
- the operation of close-up area determining section 330 will be described here with reference to FIG. 3 and FIG. 4 .
- close-up area determining section 330 picks up an object to be displayed with priority, such as a character, from a basic image. Then close-up area determining section 330 classifies image display area 410 into a candidacy area and non-candidacy area.
- a candidacy area is an area that is treated as a close-up display area 420 candidate.
- a non-candidacy area is an area that is not treated as a close-up display area 420 candidate. It is assumed here that an object that is subject to a close-up is an object to be displayed with priority.
- basic image 701 in which character “akira” 702 a and character “natsuko” 702 b are placed is generated by basic image generating section 321 of CG picture drawing section 320 based on animation scenario 600 .
- close-up area determining section 330 picks up character “akira” 702 a and character “natsuko” 702 b.
- close-up area determining section 330 divides image display area 410 in which basic image 701 is displayed into N×M areas (where N and M are natural numbers), and determines for each division area whether or not a display area of a picked-up character is present. Then close-up area determining section 330 determines a division area in which a character display area is not present within image display area 410 to be a close-up display area 420 candidacy area. Also, close-up area determining section 330 determines a division area in which a character display area is present to be close-up display area 420 non-candidacy area 703 (the hatched area in the figure).
- image display area 410 is divided into 48 rectangles, eight horizontally and six vertically, but the direction, number, and shape of the divisions are not limited to this case.
- processing may be performed in dot units, with only an area enclosed by the outline of a character being taken to be a non-candidacy area.
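The classification of division areas into candidacy and non-candidacy areas described above can be pictured with a short sketch. This is a hypothetical Python illustration only: the patent does not prescribe data structures, so the grid dimensions, the bounding-box representation of object display areas, and all names here are assumptions.

```python
def classify_areas(n_cols, m_rows, object_boxes):
    """Classify an n_cols x m_rows grid of division areas.

    object_boxes: list of (col_min, row_min, col_max, row_max) inclusive
    ranges of division areas occupied by objects to be displayed with
    priority. Returns (candidacy, non_candidacy) as sets of (col, row).
    """
    non_candidacy = set()
    for (c0, r0, c1, r1) in object_boxes:
        for c in range(c0, c1 + 1):
            for r in range(r0, r1 + 1):
                non_candidacy.add((c, r))  # cell overlaps an object
    all_areas = {(c, r) for c in range(n_cols) for r in range(m_rows)}
    return all_areas - non_candidacy, non_candidacy
```

For the 8-by-6 grid of the example, a character occupying columns 2-3 and rows 1-4 would mark 8 cells as the non-candidacy area and leave the remaining 40 as candidates.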
- close-up area determining section 330 determines whether or not an object for which close-up candidate area determination processing has not been performed remains among objects that are subject to a close-up (objects to be displayed with priority).
- a close-up candidate area is an area that may become close-up display area 420 described later herein. If an object that is subject to a close-up remains (S 2000 : YES), close-up area determining section 330 proceeds to step S 3000 processing. If an object that is subject to a close-up does not remain (S 2000 : NO), close-up area determining section 330 proceeds to step S 4000 processing.
- close-up area determining section 330 selects one object that is subject to a close-up, and determines a close-up candidate area based on the selected object.
- a close-up candidate area is determined based on character “akira” 702 a will first be described as an example.
- the selection order may be the order of appearance in animation scenario 600 , for example.
- a degree of importance may be set in advance for each object and the order selected according to the degree of importance, or selection may be performed randomly each time.
- FIG. 5 is a flowchart showing the flow of processing executed in step S 3000 in FIG. 3 .
- close-up area determining section 330 discriminates division area 704 in which the display area of the object subject to processing (hereinafter referred to as the “object placement area”) is present. Then close-up area determining section 330 selects the division area positioned at the greatest distance from discriminated object placement area 704 among the division areas of image display area 410 . Calculation of the division area positioned at the greatest distance may be performed using simple linear distance, or by applying weights in specific directions, such as the vertical and horizontal directions.
- calculation of a division area positioned at the greatest distance may be performed by taking the distance between adjacent division areas as “1”, taking only the vertical and horizontal directions as measurement directions, and calculating a numeric value indicating the distance of each division area from object placement area 704 .
- close-up area determining section 330 may take a division area with the highest numeric value as an area positioned at the greatest distance from object placement area 704 .
- division area 705 a is selected as a division area to be used as a close-up candidate area reference (hereinafter referred to as “candidate reference area”).
- This candidate reference area need not necessarily be an area positioned at the greatest distance from object placement area 704 , as long as it is an area other than object placement area 704 .
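The greatest-distance selection described above, taking the distance between adjacent division areas as “1” and measuring only in the vertical and horizontal directions, amounts to a Manhattan-distance search. A minimal Python sketch under those assumptions (function and variable names are illustrative, not from the patent):

```python
def farthest_areas(n_cols, m_rows, placement_area):
    """Return the division areas at greatest Manhattan distance from an
    object placement area. placement_area is a set of (col, row) cells."""
    def dist(cell):
        c, r = cell
        # distance to the nearest cell of the object placement area
        return min(abs(c - pc) + abs(r - pr) for (pc, pr) in placement_area)

    cells = [(c, r) for c in range(n_cols) for r in range(m_rows)
             if (c, r) not in placement_area]
    d_max = max(dist(cell) for cell in cells)
    # several cells may tie at the maximum distance (cf. step S3300)
    return [cell for cell in cells if dist(cell) == d_max]
```

Returning all tied cells matches the flow of step S 3300 , which repeats candidate selection for every division area at the same greatest distance.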
- close-up area determining section 330 extends the single division area selected as a candidate reference area in accordance with a predetermined condition, and selects the post-extension area as a close-up candidate area.
- the condition “2 division areas vertically × 2 division areas horizontally and not including non-candidacy area 703 ” may be set as a condition for a post-extension area.
- extension area 706 a comprising four division areas in the top-left corner of image display area 410 is selected as a close-up candidate area, as shown in FIG. 4D .
- Another example of a post-extension area condition that may be used is “the maximum area with the same number of division areas vertically and horizontally and not including non-candidacy area 703 ”.
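The extension step can be sketched for the first example condition, “2 division areas vertically × 2 division areas horizontally and not including the non-candidacy area”. This is an assumed implementation for illustration; the patent leaves the extension strategy open.

```python
def extend_candidate(ref, candidacy, size=2):
    """Grow a size x size block of candidacy cells containing the candidate
    reference cell `ref`; return the block as a set of (col, row) cells,
    or None if no such block exists (e.g. ref is hemmed in by objects)."""
    rc, rr = ref
    # try every size x size block whose cells include the reference cell
    for c0 in range(rc - size + 1, rc + 1):
        for r0 in range(rr - size + 1, rr + 1):
            block = {(c, r) for c in range(c0, c0 + size)
                            for r in range(r0, r0 + size)}
            if block <= candidacy:  # entirely outside the non-candidacy area
                return block
    return None
```

With the top-left corner cell as the reference and an unobstructed grid, this yields the four-cell extension area of FIG. 4D.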
- close-up area determining section 330 determines whether or not another division area that has not been subjected to processing in step S 3200 is present in a division area at the greatest distance from an object placement area. That is to say, close-up area determining section 330 determines whether or not a division area positioned at the same distance from an object placement area as a division area already subjected to step S 3200 processing is present. If a corresponding division area is present (S 3300 : YES), close-up area determining section 330 returns to step S 3200 and performs close-up candidate area selection based on the relevant division area. If a corresponding division area is not present (S 3300 : NO), close-up area determining section 330 terminates the series of processing steps.
- close-up area determining section 330 returns to step S 2000 in FIG. 3 .
- character “natsuko” 702 b still remains as an object that has not been subjected to close-up candidate area determination processing among objects that are subject to a close-up (S 2000 : YES). Therefore, close-up area determining section 330 proceeds to step S 3000 processing again. Close-up area determining section 330 then executes the series of processing steps shown in FIG. 5 , this time with character “natsuko” 702 b as the object that is subject to processing.
- division areas in the four corners including division area 705 b positioned in the top-right corner of image display area 410 , are selected as candidate reference areas, and extension areas in the four corners including extension area 706 b positioned in the top-right corner of image display area 410 are selected as close-up candidate areas.
- close-up candidate area determination processing is performed in this way for all objects that are subject to a close-up (S 2000 : NO)
- close-up area determining section 330 proceeds to step S 4000 processing.
- close-up area determining section 330 assigns a determined close-up candidate area as close-up display area 420 to each object that is subject to close-up display.
- extension areas in the four corners of image display area 410 are determined to be close-up candidate areas. It is assumed here that a rule for close-up display area 420 assignment—for example, “prioritize assignment of an upper close-up candidate area, and assign in order from the close-up candidate area at the shortest distance”—has been set in advance.
- close-up area determining section 330 first assigns extension area 706 b to close-up display area 420 a of character “akira” 702 a , and then assigns remaining extension area 706 a to close-up display area 420 b of character “natsuko” 702 b , as shown in FIG. 4F .
- the rule “prioritize assignment of the nearest close-up candidate area to the close-up candidate area assigned immediately before” may be set in advance as a rule for close-up display area 420 assignment. By applying such a rule, it is possible to keep movement of close-up display area 420 of the same character 702 to a minimum.
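The first example rule above ("prioritize an upper close-up candidate area, and assign in order from the candidate area at the shortest distance") could be realized as a simple greedy assignment. A hypothetical Python sketch, with representative-cell coordinates and Manhattan distance as assumed simplifications:

```python
def assign_close_up_areas(object_areas, candidates):
    """Assign one close-up candidate area to each object subject to close-up.

    object_areas: dict mapping object name -> representative (col, row) of
    its placement area. candidates: list of (col, row) top-left corners of
    candidate areas. Preference: upper areas first, then shortest distance.
    """
    assignment = {}
    free = list(candidates)
    for name, (oc, orow) in object_areas.items():
        # upper candidate areas first (smaller row), then nearest to object
        free.sort(key=lambda cell: (cell[1],
                                    abs(cell[0] - oc) + abs(cell[1] - orow)))
        assignment[name] = free.pop(0)  # each area is assigned at most once
    return assignment
```

In the FIG. 4 example this kind of rule places each character's close-up in an upper corner near that character, as in FIG. 4F.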
- close-up area determining section 330 terminates the series of processing steps.
- Close-up area control section 350 determines sequentially whether or not close-up display area 420 determined by close-up area determining section 330 should be displayed. Then close-up area control section 350 controls display/non-display of close-up display area 420 according to the result of this determination.
- close-up area control section 350 may perform control so that close-up display area 420 is displayed only if close-up display area 420 determined by close-up area determining section 330 can secure at least a predetermined area.
- close-up area control section 350 may control display of the corresponding close-up display area 420 in synchronization with an action of character 702 .
- close-up area control section 350 controls display in such a manner that the corresponding close-up display area 420 is displayed only when character 702 speaks, or only in a fixed section in which the expression of character 702 changes. By this means it is possible to cut down on close-up displays that have little effect, and to reduce screen complexity and apparatus load.
- Close-up area determining section 330 discriminates a section in which character 702 is speaking or a section in which the expression of character 702 changes, for example, from an animation scenario 600 description. Specifically, for example, close-up area determining section 330 identifies a section corresponding to “Direction” 620 under which “Expression” is written, and determines a section extending for only a few seconds before and after that section to be a section in which the expression of character 702 changes.
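The "few seconds before and after" widening of an expression section might be sketched as follows. The timing representation of "Direction" elements and the two-second margin are assumptions for illustration; the patent only specifies that sections with an "Expression" entry are widened by a few seconds.

```python
def expression_sections(directions, margin=2.0):
    """directions: list of (start_time, end_time, has_expression) tuples
    for consecutive "Direction" elements. Returns (start, end) sections,
    widened by `margin` seconds on each side, in which the corresponding
    close-up display area should be shown."""
    sections = []
    for start, end, has_expression in directions:
        if has_expression:
            # widen the section, clamping the start at the animation origin
            sections.append((max(0.0, start - margin), end + margin))
    return sections
```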
- When close-up display areas 420 are displayed by means of the above-described processing, close-up image 707 a of character “akira” 702 a is displayed at the top-right of basic image 701 , and close-up image 707 b of character “natsuko” 702 b is displayed at the top-left of basic image 701 , as shown in FIG. 4F .
- close-up images 707 are both displayed in positions that do not overlap characters 702 . That is to say, both the whole body and the facial part of each character 702 are displayed. Also, character 702 whole-body actions and facial expressions are displayed efficiently, and an expressive image is implemented. Furthermore, since close-up images 707 are displayed within image display area 410 , the size of the display area of basic image 701 and the whole-body display size of each character 702 are unaltered.
- close-up area determining section 330 may change the shape of a close-up display area to other than a rectangle.
- FIG. 6 is an explanatory drawing showing an example of an image when close-up display areas are changed in shape.
- a case is shown here in which a change of shape is applied to rectangular close-up display areas so that the vertex nearest the center point of a close-up of corresponding character 702 is moved to that center point.
- provision is made for objects to be displayed in front of close-up display areas 708 that have been changed in shape. Close-up display areas 708 that have been changed in shape in this way make it easier to understand which object a close-up relates to.
- In FIG. 4 and FIG. 6 , a case has been illustrated in which the shooting direction of a close-up image is the same as the shooting direction of a basic image, but the present invention is not limited to this.
- In step S 1000 in FIG. 3 , an object that is subject to a close-up has been assumed to be an object to be displayed with priority, but objects that are subject to a close-up may also be all or only some of the objects to be displayed with priority.
- close-up area determining section 330 uses an object to be displayed with priority for determination of classification as a candidacy area or non-candidacy area
- close-up area determining section 330 uses objects that are subject to a close-up, which are all or some of the objects to be displayed with priority, for close-up candidate area determination.
- a rule may be applied to the effect that an item for which “Expression” is defined among “Directions” 620 shown in FIG. 2 is made subject to a close-up.
- a rule may be applied to the effect that, when an attribute indicating whether or not the relevant object is a close-up target is defined in “Resource (resource information)” 630 shown in FIG. 2 , that object is made subject to a close-up.
- An attribute indicating whether or not an object is a person may be used as an attribute indicating whether or not an object is subject to a close-up.
- An attribute indicating whether or not an object is a person may be designated “attr”, for example, and indicate whether or not an object is a person according to whether or not “person” has been set.
- close-up area determining section 330 can ensure priority display while providing for a close-up not to be performed for a character whose change of expression cannot be conveyed or, in particular, for a non-human object.
- close-up area determining section 330 may determine a priority for each object according to whether or not there is an “Expression” definition, or whether or not the object is a human object. Then close-up area determining section 330 determines objects that are subject to a close-up in order, starting with the highest-priority object, in conjunction with the candidacy areas.
- close-up area determining section 330 dynamically performs the above-described close-up display area 420 determination processing each time the display area of an object changes.
- close-up area determining section 330 dynamically performs the above-described close-up display area 420 determination processing in sufficiently short cycles (such as 15 times a second) compared with the speed of change of the display area of an object.
- How close-up display area 420 changes according to changes in the display area of an object in a basic image is described below, taking one example of basic image change.
- FIG. 7 is an explanatory drawing showing the nature of changes of a basic image that is subject to close-up display.
- each basic image at the time when close-up display area 420 determination processing is performed nine consecutive times by close-up area determining section 330 is illustrated in a time sequence.
- an animation time subject to close-up display area 420 determination processing by close-up area determining section 330 is referred to as an area determination time.
- character “akira” 702 a and character “natsuko” 702 b appear in basic image 701 .
- the display area of each character 702 changes over time in line with movement in the animation space of each character 702 .
- Close-up area determining section 330 discriminates object placement area 704 in the basic image for each object that is subject to close-up display.
- FIG. 8 is an explanatory drawing showing the nature of changes of object placement areas 704 .
- FIG. 8A through FIG. 8I correspond to FIG. 7A through FIG. 7I respectively.
- Close-up area determining section 330 divides image display area 410 , and here, object placement area 704 a of character “akira” 702 a and object placement area 704 b of character “natsuko” 702 b are discriminated for each basic image 701 , as shown in FIG. 8A through FIG. 8I .
- each object placement area 704 also changes over time.
- An area including object placement area 704 a of character “akira” 702 a and object placement area 704 b of character “natsuko” 702 b becomes non-candidacy area 703 .
- Close-up area determining section 330 determines close-up display areas 420 within image display area 410 so as not to overlap non-candidacy area 703 .
- FIG. 9 is an explanatory drawing showing the nature of changes of close-up display areas 420 .
- FIG. 9A through FIG. 9I correspond to FIG. 7A through FIG. 7I and FIG. 8A through FIG. 8I respectively.
- Close-up area determining section 330 determines areas other than non-candidacy area 703 (candidacy areas) that satisfy a preset condition to be close-up display areas 420.
- Image display area 410 is divided into 64 rectangles, eight horizontally and eight vertically, and "the maximum area with the same number of division areas vertically and horizontally and not exceeding 3 division areas vertically × 3 division areas horizontally" has been set as a close-up candidate area condition.
- Close-up display areas 420 also change over time, but never overlap a character 702 display area.
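As a rough illustration, the candidate area condition above ("the maximum area with the same number of division areas vertically and horizontally, not exceeding 3 × 3") can be sketched in code. The grid layout, the character placement, and all names below are illustrative assumptions, not part of the patent text.

```python
# Sketch of the close-up candidate area search on an 8x8 grid. True marks a
# non-candidacy cell (a cell overlapping a character display area). All names
# and the sample character placement are illustrative assumptions.

def max_square_candidate(grid, max_side=3):
    """Return (row, col, side) of the first largest free square, capped at max_side."""
    n, m = len(grid), len(grid[0])
    for side in range(max_side, 0, -1):        # try the largest square first
        for r in range(n - side + 1):
            for c in range(m - side + 1):
                cells = [grid[r + dr][c + dc]
                         for dr in range(side) for dc in range(side)]
                if not any(cells):             # no non-candidacy cell inside
                    return (r, c, side)
    return None                                # no free cell at all

# 8x8 grid with characters occupying centre columns of the lower rows
grid = [[False] * 8 for _ in range(8)]
for r in range(2, 8):
    for c in range(3, 6):
        grid[r][c] = True                      # non-candidacy area 703

print(max_square_candidate(grid))              # (0, 0, 3): a 3x3 square in a free corner
```

Trying sizes from largest to smallest mirrors the "maximum area" wording: the area may temporarily become smaller when the candidacy area cannot accommodate a full 3 × 3 square.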
- Close-up area determining section 330 assigns close-up display areas 420 to objects that are subject to close-up display in accordance with the preset condition.
- CG picture drawing section 320 causes a final CG animation image (hereinafter referred to as “final image”) to be displayed by causing the relevant close-up image 707 to be displayed in each close-up display area 420 in accordance with this assignment.
- FIG. 10 is an explanatory drawing showing the nature of changes of a final image.
- FIG. 10A through FIG. 10I correspond to FIG. 9A through FIG. 9I respectively.
- CG picture drawing section 320 causes close-up image 707 of each object that is subject to close-up display to be displayed in the assigned close-up display area 420 .
- Close-up images 707 are displayed embedded in basic image 701 without interfering with the display of characters 702, as shown in FIG. 10.
- Close-up display areas 420 may temporarily become smaller due to the relationship between a condition and a candidacy area when determining a close-up candidate area, as described above. Also, as shown in FIG. 10A through FIG. 10I, the position of close-up display area 420 for a particular character differs according to the area determination time. If such switching of the size and position of close-up display area 420 were performed discretely only at each area determination time, the final image could appear unnatural and jerky.
- Smoothing interpolation determining section 340 therefore interpolates the size and position of close-up display area 420 in a section between area determination times so that the size and position of close-up display area 420 change progressively and naturally.
- Smoothing interpolation determining section 340 acquires size and position information of close-up display area 420 determined by close-up area determining section 330. Then smoothing interpolation determining section 340 determines whether or not there is a close-up display area 420 change between adjacent area determination times. If there is a change, smoothing interpolation determining section 340 determines the implementation method of the close-up display area 420 change in a section between preceding and succeeding area determination times in accordance with a preset rule, and performs close-up display area 420 change smoothing processing.
- Smoothing interpolation determining section 340 may, for example, apply the rule “if close-up display areas 420 overlap between preceding and succeeding area determination times, change the close-up display area 420 outline progressively” as a rule for determining the close-up display area 420 change implementation method.
- Smoothing interpolation determining section 340 may, for example, apply the rule "if close-up display areas 420 do not overlap between preceding and succeeding area determination times, and there is a candidacy area enabling close-up display area 420 to be moved progressively, move close-up display area 420 progressively".
- Smoothing interpolation determining section 340 may apply the rule "if close-up display areas 420 do not overlap between preceding and succeeding area determination times, and there is not a candidacy area enabling close-up display area 420 to be moved progressively, temporarily reduce the size of close-up display area 420 until it disappears, and then enlarge it to its original size after changing its position".
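Taken together, the three rules above amount to a simple three-way decision. The following hypothetical sketch assumes an (x, y, w, h) rectangle representation; `can_move_progressively` stands in for the candidacy-area check, whose details the text leaves to the implementation.

```python
# Illustrative sketch of choosing the change implementation method between two
# area determination times. All names are assumptions for illustration.

def overlaps(a, b):
    """Axis-aligned rectangle overlap test; rectangles are (x, y, w, h)."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def choose_change_method(prev_area, next_area, can_move_progressively):
    if overlaps(prev_area, next_area):
        return "morph_outline"          # change the outline progressively
    if can_move_progressively:
        return "slide"                  # move the area progressively
    return "shrink_then_regrow"         # shrink to nothing, reposition, enlarge

print(choose_change_method((0, 0, 2, 2), (1, 1, 2, 2), False))  # morph_outline
print(choose_change_method((0, 0, 2, 2), (5, 5, 2, 2), True))   # slide
print(choose_change_method((0, 0, 2, 2), (5, 5, 2, 2), False))  # shrink_then_regrow
```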
- Determination of the size and position of close-up display area 420 in a section between area determination times may be performed by smoothing interpolation determining section 340 .
- Alternatively, smoothing interpolation determining section 340 may output a determined smoothing change implementation method to close-up area determining section 330, after which the size and position of the above-described close-up display area 420 is determined by close-up area determining section 330.
- Information indicating the size and position of the close-up display area 420 in the determined section between area determination times is output to image display section 400 .
- FIG. 11 is an explanatory drawing showing an example of how the size and position of close-up display area 420 are interpolated by smoothing interpolation determining section 340 .
- Horizontal axis 800 indicates animation times. The area above horizontal axis 800 relates to an explanation concerning area determination times, and the area below horizontal axis 800 relates to an explanation concerning a section between area determination times.
- When character 702 moves between time t-10, which is an area determination time, and time t-20, which is the next area determination time, non-candidacy area 703 also moves. Close-up display areas 420-10 and 420-20 are deemed to be determined for times t-10 and t-20 respectively as a result. Also, as shown in FIG. 11, these close-up display areas 420-10 and 420-20 are deemed to be overlapping areas even though they differ in size.
- Smoothing interpolation determining section 340 progressively changes the outline of close-up display area 420 between times t-10 and t-20.
- Interpolation is performed using close-up display area 420-11, of a size between close-up display areas 420-10 and 420-20. In this way, the size of close-up display area 420 changes smoothly.
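The interpolation between overlapping areas 420-10 and 420-20 can be illustrated as a simple linear interpolation of position and size. The (x, y, w, h) representation and the numeric values are assumptions for illustration only.

```python
# Minimal linear interpolation of a rectangle's position and size between two
# area determination times; t runs from 0.0 (preceding time) to 1.0 (succeeding).

def lerp_rect(a, b, t):
    """Interpolate componentwise between rectangles a and b at parameter t."""
    return tuple(round(a_i + (b_i - a_i) * t, 3) for a_i, b_i in zip(a, b))

area_t10 = (0.0, 0.0, 3.0, 3.0)   # e.g. close-up display area 420-10
area_t20 = (0.0, 0.0, 2.0, 2.0)   # e.g. close-up display area 420-20 (smaller, overlapping)

# an intermediate area such as 420-11 sits between the two in size
print(lerp_rect(area_t10, area_t20, 0.5))  # (0.0, 0.0, 2.5, 2.5)
```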
- FIG. 12 is an explanatory drawing showing another example of how the size and position of close-up display area 420 are interpolated by smoothing interpolation determining section 340 , corresponding to FIG. 11 .
- Close-up display area 420-30 is determined for time t-30, which is an area determination time, and close-up display area 420-40 is determined for time t-40, which is the next area determination time.
- Smoothing interpolation determining section 340 temporarily reduces the size of close-up display area 420 between time t-30 and time t-40 until it disappears, and then enlarges it to its original size after changing its position.
- Close-up display area 420-33 at time t-33 between times t-30 and t-40 is smaller than close-up display area 420-30. Then the position of close-up display area 420 moves, and close-up display area 420-34 at time t-34 immediately after the move overlaps the position of time t-40 close-up display area 420-40. Also, close-up display area 420-36 at time t-36 between times t-34 and t-40 is of a size intermediate between time t-34 close-up display area 420-34 and time t-40 close-up display area 420-40.
- Interpolation is performed using close-up display areas 420-31 and 420-32, of a size between close-up display areas 420-30 and 420-33.
- Interpolation is also performed by means of close-up display area 420-35, of a size between close-up display areas 420-34 and 420-36. In this way, the size of close-up display area 420 changes smoothly.
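The shrink-and-regrow behaviour of FIG. 12 can likewise be sketched. The half-way timing split, the scale-factor approach, and all names are assumptions introduced for illustration, not details given in the text.

```python
# Sketch of the shrink-move-regrow interpolation: shrink at the old position
# until the area disappears, jump to the new position, then grow back.

def shrink_move_regrow(old_area, new_area, t):
    """old_area/new_area are (x, y, w, h); t runs from 0.0 to 1.0."""
    if t < 0.5:
        scale = 1.0 - t / 0.5          # 1 -> 0: shrink until it disappears
        x, y, w, h = old_area
    else:
        scale = (t - 0.5) / 0.5        # 0 -> 1: enlarge to original size
        x, y, w, h = new_area
    return (x, y, w * scale, h * scale)

old = (0, 0, 2, 2)   # e.g. close-up display area 420-30
new = (6, 4, 2, 2)   # e.g. close-up display area 420-40
print(shrink_move_regrow(old, new, 0.25))  # (0, 0, 1.0, 1.0) shrinking at old position
print(shrink_move_regrow(old, new, 0.75))  # (6, 4, 1.0, 1.0) growing at new position
```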
- An object placement area is discriminated from a basic image, enabling close-up display area 420 to be determined according to an object placement area of basic image 701. Also, an object placement area is discriminated for a specific object, and close-up display area 420 is determined so as not to overlap a discriminated object placement area, enabling a close-up image to be displayed without interfering with the display of an object, such as a character, whose entire image should always be displayed in a basic image. That is to say, a close-up image can be displayed in a state in which influence on a basic image is suppressed.
- A CG animation image shot using a plurality of camerawork variations and from a plurality of angles can be displayed on a single screen, enabling a user to be provided with more expressive images.
- Since close-up display area 420 is determined within image display area 410, the size of image display area 410 can be used unaltered. Therefore, both a basic image and a close-up image can be displayed more effectively. Specifically, for example, based on a CG image in which only whole-body actions of characters can be displayed, such as shown in FIG. 7, it is possible to display a CG image in which the expression of each character can be clearly understood and whole-body actions of characters can also be displayed, such as shown in FIG. 10.
- An animation scenario 600 description is analyzed, necessary image material is acquired from image material database 200, appropriate camerawork is determined, and a basic image and close-up image are generated.
- CG animation can thus be generated in line with the content of animation scenario 600, and the above-described effective image display can be implemented with the generated CG animation.
- The size and position of close-up display area 420 in a section between area determination times can be interpolated. By these means, close-up display area 420 changes can be performed smoothly, a CG animation image viewer can easily keep visual track of close-up display areas 420, and higher-quality image display can be implemented.
- In the above description, an object displayed in a basic image has been assumed to be subject to close-up display, but the present invention is not limited to this.
- For example, an object that is subject to close-up display may be an object that is not displayed in a basic image, or a close-up image may be provided as an image independent of a basic image.
- Also, in this embodiment an image subject to display is a CG animation image, but it is also possible to apply the above-described technology to a live-action image.
- For example, a live-action image may be analyzed by means of known image analysis technology, a display area of a specific object such as a human being detected, and a close-up image displayed in an area other than the detected area.
- An image display apparatus and image display method according to the present invention are suitable for use as an image display apparatus and image display method that enable both a basic image and a close-up image to be displayed more effectively.
- An image display apparatus and image display method according to the present invention are suitable for use in a device with a small display screen, such as a mobile phone, PDA, or portable game machine.
Abstract
A video image display device is provided to effectively display both basic and close-up video images. This video image display device comprises a close-up region determining unit (330) for discriminating a display region of a specific object in the basic video image to be subjected to display, and for determining a display region of the close-up video image in the basic video image in accordance with the display region of the specific object in the basic video image.
Description
- The present invention relates to an image display apparatus and image display method that display computer graphics animation and suchlike images.
- In recent years, computer graphics animation (hereinafter referred to as "CG animation") that provides appearing characters with detailed movements such as changes of expression has been attracting attention. Technologies have been described in Patent Document 1 and Patent Document 2, for example, that display close-up images of a specific object in order to make such image details easy to grasp.
- In the technology described in Patent Document 1, display is switched between a basic image that is subject to display and a close-up image providing a close-up of a specific object such as a character in the basic image. This enables detailed movement such as a facial expression of a character to be grasped easily.
- In the technology described in Patent Document 2, a close-up image is displayed in a previously prepared area separate from the display area of the basic image. This enables detailed movement of an object to be grasped easily.
- However, a problem with the technology described in Patent Document 1 above is that the basic image is not displayed while a close-up image is being displayed. For example, if another character begins an action while a particular character is being displayed in close-up, the nature of that action cannot be displayed. Also, when only a specific region comprising a facial part of a character is subject to a close-up, if that character performs a whole-body action, the nature of that whole-body action cannot be displayed. That is to say, a problem with the technology described in Patent Document 1 is that a whole-body action, surrounding situation, or the like, of an object being displayed in close-up cannot be grasped.
- Also, a problem with the technology described in Patent Document 2 above is that, since a basic image display area and a close-up image display area must both be placed on a limited screen prepared beforehand, the basic image display area becomes small. In particular, when display is performed on a small, low-resolution screen such as a liquid crystal panel of a mobile phone or PDA (personal digital assistant), it is difficult to grasp a whole-body action, surrounding situation, or the like, of an object that is being displayed in close-up. Improvements in the processing performance of various kinds of hardware and advances in computer graphics technology have led to widespread development of application software using CG animation images for small devices of this kind.
- Therefore, it is desirable to be able to display both a basic image and a close-up image effectively even on a small, low-resolution screen.
- It is an object of the present invention to provide an image display apparatus and image display method that enable both a basic image and a close-up image to be displayed more effectively.
- An image display apparatus of the present invention employs a configuration having a display area discriminating section that discriminates a display area of a specific object in a basic image that is subject to display, and a close-up area determining section that determines a display area of a close-up image in the basic image according to the display area of the specific object in the basic image.
- An image display method of the present invention has a display area discriminating step of discriminating a display area of a specific object in a basic image that is subject to display, and a close-up area determining step of determining a display area of a close-up image in the basic image according to the display area of the specific object in the basic image discriminated by the display area discriminating step.
- The present invention enables both a basic image and a close-up image to be displayed more effectively by determining a display area of the close-up image according to a display area of a specific object in the basic image.
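As a non-authoritative sketch, the two steps above (a display area discriminating step followed by a close-up area determining step) can be illustrated in code. The `Rect` type, the farthest-corner placement strategy, and all names are assumptions introduced for illustration, not the claimed implementation.

```python
# Minimal sketch of the two-step method: discriminate the specific object's
# display area, then determine a close-up display area according to it.
from dataclasses import dataclass

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

def discriminate_display_area(object_bounds: Rect) -> Rect:
    """Display area discriminating step: here the object's bounds are given."""
    return object_bounds

def determine_close_up_area(screen: Rect, object_area: Rect, size: float) -> Rect:
    """Close-up area determining step: place the close-up in the screen corner
    farthest from the object's centre so the two areas do not overlap."""
    ocx = object_area.x + object_area.w / 2
    ocy = object_area.y + object_area.h / 2
    x = screen.x if ocx > screen.x + screen.w / 2 else screen.x + screen.w - size
    y = screen.y if ocy > screen.y + screen.h / 2 else screen.y + screen.h - size
    return Rect(x, y, size, size)

screen = Rect(0, 0, 800, 600)
obj = discriminate_display_area(Rect(500, 300, 150, 300))
print(determine_close_up_area(screen, obj, 200))  # Rect(x=0, y=0, w=200, h=200)
```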
- FIG. 1 is a system configuration diagram showing the configuration of a CG animation display system as an image display apparatus according to an embodiment of the present invention;
- FIG. 2 is an explanatory drawing showing a sample description of an animation scenario in this embodiment;
- FIG. 3 is a flowchart showing the flow of processing executed by a close-up area determining section in this embodiment;
- FIG. 4 is an explanatory drawing showing the content of each item of processing executed by a close-up area determining section in this embodiment;
- FIG. 5 is a flowchart showing the flow of processing executed in step S3000 in FIG. 3 in this embodiment;
- FIG. 6 is an explanatory drawing showing an example of the content of an image when close-up display areas are changed in shape in this embodiment;
- FIG. 7 is an explanatory drawing showing the nature of changes of a basic image that is subject to close-up display in this embodiment;
- FIG. 8 is an explanatory drawing showing the nature of changes of object placement areas in this embodiment;
- FIG. 9 is an explanatory drawing showing the nature of changes of close-up display areas in this embodiment;
- FIG. 10 is an explanatory drawing showing the nature of changes of a final image in this embodiment;
- FIG. 11 is an explanatory drawing showing an example of how the size and position of a close-up display area are interpolated by a smoothing interpolation determining section in this embodiment; and
- FIG. 12 is an explanatory drawing showing another example of how the size and position of a close-up display area are interpolated by a smoothing interpolation determining section in this embodiment.
- An embodiment of the present invention will now be described in detail with reference to the accompanying drawings.
- FIG. 1 is a system configuration diagram showing the configuration of a CG animation display system as an image display apparatus according to an embodiment of the present invention.
- In FIG. 1, CG animation display system 100 has image material database 200, CG animation generating section 300, and image display section 400. CG animation generating section 300 has camerawork determining section 310, CG picture drawing section 320, close-up area determining section 330, smoothing interpolation determining section 340, and close-up area control section 350. CG picture drawing section 320 has basic image generating section 321 and close-up image generating section 322. Image display section 400 has image display area 410 and close-up display area 420. CG animation display system 100 takes animation scenario 600, which is the basis of CG animation, as input.
- FIG. 2 is an explanatory drawing showing a sample description of animation scenario 600. Animation scenario 600 is like the script or scenario of a motion picture or play. Animation scenario 600 contains a number of "Scenes" 610. Each "Scene" 610 has attribute "location" 611 indicating a background set. Also, each "Scene" 610 has a plurality of "Directions" 620 as sub-elements. Information such as "Subject", "Action", and "Object" is written under each "Direction" 620. Also, if a subject is a character, additional information such as "Expression" is written under "Direction" 620.
- Animation scenario 600 also contains "Resource (resource information)" 630. "Resource (resource information)" 630 shows association between a name written in "Scene" 610 and image material necessary for display as a CG animation image. Specifically, each "Resource" 630 has an attribute "uri" indicating an identifier of image material, and an attribute "name" indicating a name written in "Scene" 610, "Subject", or the like. For example, under "Direction" 620 a of "Scene" 610 a, character name "akira" is written as a subject. And in "Resource" 630 a, image material identifier "http://media.db/id/character/akira" is written associated with the name "akira".
- Image material database 200 shown in FIG. 1 stores image material necessary for generating CG animation. Image material includes at least 3D (dimension) model data indicating the shape or external appearance of various kinds of objects such as characters and background sets. Image material also includes motion data, still-image data, moving-image data, audio data, music data, and so forth. Motion data indicates motion of 3D model data. Still-image data and moving-image data are used in 3D model data texture, background, or suchlike drawing. Audio data is used in the output of sound effects, synthetic speech, and so forth. Music data is used in the output of BGM (background music) or the like.
- CG animation generating section 300 acquires necessary image material from image material database 200, and generates CG animation of content in line with animation scenario 600.
- CG animation generating section 300 causes image display section 400 to display a basic image of generated CG animation and a close-up image of an object that appears in generated CG animation.
- In CG animation generating section 300, camerawork determining section 310 determines a position of an object such as a character, background set, or the like, in an animation space, based on an animation scenario 600 description. Then camerawork determining section 310 determines camerawork for shooting an object whose position has been determined. Specifically, for example, camerawork determining section 310 places a camera at a predetermined position of the animation space and determines basic camerawork. Alternatively, camerawork determining section 310 determines basic camerawork so that shooting is performed with reference to a specific object. Technology for determining CG animation camerawork from an animation scenario is known, being described in Japanese Patent Application Laid-Open No. 2005-44181, for example, and therefore a description thereof is omitted here.
- Also, in a scene in which a character appears in a basic image, camerawork determining section 310 determines camerawork for shooting a facial part of the character together with basic camerawork. Then camerawork determining section 310 generates data in which determined object positions and camerawork contents are converted to parameters internally by means of coordinate data and so forth, and outputs the generated data to CG picture drawing section 320.
- CG picture drawing section 320 acquires image material necessary for drawing from image material database 200 based on the data input from camerawork determining section 310 and the animation scenario 600 description, and generates a CG animation image. Specifically, CG picture drawing section 320 first acquires image material from image material database 200 in accordance with the animation scenario 600 description, and then places each acquired image material at a position determined by camerawork determining section 310.
- For example, when the description content of "Direction" 620 a of animation scenario 600 is converted to CG animation, CG picture drawing section 320 acquires image material corresponding to identifier "http://media.db/id/character/akira" from image material database 200, and then places the acquired image material as a subject.
- When placing each acquired image material, CG picture drawing section 320 generates an image implementing camerawork determined by camerawork determining section 310. Then CG picture drawing section 320 causes image display section 400 to draw the generated image. Specifically, in CG picture drawing section 320, basic image generating section 321 generates a basic image based on basic camerawork, and outputs the generated basic image to close-up area determining section 330 and image display section 400. Also, close-up image generating section 322 generates a close-up image based on close-up camerawork, and outputs the generated close-up image to image display section 400.
- Since each material is a computer graphic, it is easy to recognize what kind of image is displayed in what part of image display area 410 when a basic image is drawn in image display area 410 of image display section 400. Close-up area determining section 330 determines the size and position of close-up display area 420. Then close-up area determining section 330 outputs information indicating the size and position of the determined close-up display area 420 to image display section 400 and smoothing interpolation determining section 340. Close-up area determining section 330 analyzes the basic image input from CG picture drawing section 320 and discriminates an area other than a display area of an object to be displayed with priority from within image display area 410. Then close-up area determining section 330 determines close-up display area 420 in an area determined to be an area other than a display area of an object to be displayed with priority. It goes without saying that this kind of close-up area determining section 330 function is unnecessary if image display area 410 and close-up display area 420 are prepared in advance as separate display areas, as in the technology described in Patent Document 2 above.
- Smoothing interpolation determining section 340 analyzes a change of close-up display area 420 based on information input from close-up area determining section 330. Then smoothing interpolation determining section 340 interpolates the analyzed close-up display area 420 change, and provides for the close-up display area 420 change to be performed smoothly or naturally.
- Close-up area control section 350 determines whether or not a close-up image generated by close-up image generating section 322 is to be displayed by image display section 400.
- Image display section 400 has a liquid crystal panel or suchlike display screen (not shown), and places image display area 410, which is an area for displaying a CG animation basic image, in the display screen. Also, image display section 400 places close-up display area 420, which is an area for displaying a CG animation close-up image, in the display screen. Then image display section 400 draws a basic image input from CG animation generating section 300 in image display area 410, and also draws a close-up image input from CG animation generating section 300 in close-up display area 420. The size, position, and display/non-display of close-up display area 420 are controlled by information input from CG animation generating section 300.
- Although not shown in the drawings, CG animation display system 100 comprises a CPU (Central Processing Unit), a storage medium such as ROM (Read Only Memory) that stores a control program, and RAM (Random Access Memory) or suchlike working memory. The functions of the above sections are implemented by the CPU executing the control program.
- Image material database 200, image display area 410, and close-up display area 420 may each be connected directly to CG animation generating section 300 via a bus, or may be connected to CG animation generating section 300 via a network.
- The operation of close-up area determining section 330 will now be described in detail.
- FIG. 3 is a flowchart showing the flow of processing executed by close-up area determining section 330, and FIG. 4 shows the content of each item of processing executed by close-up area determining section 330, taking a basic image of a particular moment (hereinafter referred to simply as "basic image") as an example. The operation of close-up area determining section 330 will be described here with reference to FIG. 3 and FIG. 4.
- In step S1000 in FIG. 3, close-up area determining section 330 picks up an object to be displayed with priority, such as a character, from a basic image. Then close-up area determining section 330 classifies image display area 410 into a candidacy area and a non-candidacy area. A candidacy area is an area that is treated as a close-up display area 420 candidate. A non-candidacy area is an area that is not treated as a close-up display area 420 candidate. It is assumed here that an object that is subject to a close-up is an object to be displayed with priority.
- As shown in FIG. 4A, basic image 701 in which character "akira" 702 a and character "natsuko" 702 b are placed is generated by basic image generating section 321 of CG picture drawing section 320 based on animation scenario 600. In this case, close-up area determining section 330 picks up character "akira" 702 a and character "natsuko" 702 b.
- Then, as shown in FIG. 4B, close-up area determining section 330 divides image display area 410 in which basic image 701 is displayed into N×M areas (where N and M are natural numbers), and determines for each division area whether or not a display area of a picked-up character is present. Then close-up area determining section 330 determines a division area in which a character display area is not present within image display area 410 to be a close-up display area 420 candidacy area. Also, close-up area determining section 330 determines a division area in which a character display area is present to be close-up display area 420 non-candidacy area 703 (the hatched area in the figure).
- In FIG. 4, image display area 410 is divided into 48 rectangles, eight horizontally and six vertically, but the direction, number, and shape of the divisions are not limited to this case. For example, processing may be performed in dot units, with only an area enclosed by the outline of a character being taken to be a non-candidacy area.
- Next, in step S2000 in FIG. 3, close-up area determining section 330 determines whether or not an object for which close-up candidate area determination processing has not been performed remains among objects that are subject to a close-up (objects to be displayed with priority). A close-up candidate area is an area that may become close-up display area 420, described later herein. If an object that is subject to a close-up remains (S2000: YES), close-up area determining section 330 proceeds to step S3000 processing. If an object that is subject to a close-up does not remain (S2000: NO), close-up area determining section 330 proceeds to step S4000 processing.
- In step S3000, close-up area determining section 330 selects one object that is subject to a close-up, and determines a close-up candidate area based on the selected object. Here, a case in which a close-up candidate area is determined based on character "akira" 702 a will first be described as an example.
- When there are a plurality of objects that are subject to a close-up, the selection order may be the order of appearance in animation scenario 600, for example. Alternatively, a degree of importance may be set in advance for each object and the order selected according to the degree of importance, or selection may be performed randomly each time.
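The classification of step S1000 above (dividing image display area 410 into N×M division areas and marking every cell that contains part of a character display area as non-candidacy area 703) can be sketched as follows. The pixel dimensions and the character bounding box are illustrative assumptions, not values from the text.

```python
# Sketch of step S1000: divide the display area into a grid of division areas
# and mark each cell that intersects a character's bounding box as non-candidacy.

def classify_cells(width, height, n_cols, n_rows, char_boxes):
    """Return an n_rows x n_cols grid; True = non-candidacy cell."""
    cw, ch = width / n_cols, height / n_rows
    grid = [[False] * n_cols for _ in range(n_rows)]
    for bx, by, bw, bh in char_boxes:       # character display areas (x, y, w, h)
        for r in range(n_rows):
            for c in range(n_cols):
                cx, cy = c * cw, r * ch
                # cell intersects the box -> not a close-up candidate
                if cx < bx + bw and bx < cx + cw and cy < by + bh and by < cy + ch:
                    grid[r][c] = True
    return grid

# 800x600 screen, 8x6 division areas, one character occupying a centre column
grid = classify_cells(800, 600, 8, 6, [(300, 200, 100, 300)])
print(sum(cell for row in grid for cell in row))  # 3 non-candidacy cells
```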
FIG. 5 is a flowchart showing the flow of processing executed in step S3000 inFIG. 3 . - In step S3100, close-up
area determining section 330 discriminates a division area 704 in which a display area of an object (hereinafter referred to as an “object placement area”) deemed to be subject to processing is present. Then close-uparea determining section 330 selects a division area positioned at the greatest distance from discriminated object placement area 704 among the division areas ofimage display area 410. Calculation of a division area positioned at the greatest distance may be performed using simple linear distance, or by applying weights in specific directions, such as the vertical and horizontal directions. - Alternatively, calculation of a division area positioned at the greatest distance may be performed by taking the distance between adjacent division areas as “1”, taking only the vertical and horizontal directions as measurement directions, and calculating a numeric value indicating the distance of each division area from object placement area 704. In this case, close-up
area determining section 330 may take a division area with the highest numeric value as an area positioned at the greatest distance from object placement area 704. Calculating numeric values indicating distances fromobject placement area 704 a of character “akira” 702 a for each division area inFIG. 4B using this method, and displaying the calculation results in the division areas, gives the result shown inFIG. 4C . - Here, as shown in
FIG. 4C , the numeric value ofdivision area 705 a in the top-left corner ofimage display area 410 is the highest. Therefore,division area 705 a is selected as a division area to be used as a close-up candidate area reference (hereinafter referred to as “candidate reference area”). This candidate reference area need not necessarily be an area positioned at the greatest distance from object placement area 704, as long as it is an area other than object placement area 704. - Next, in step S3200 in
FIG. 5, close-up area determining section 330 extends the single division area selected as a candidate reference area in accordance with a predetermined condition, and selects the post-extension area as a close-up candidate area. - For example, the condition “2 division areas vertically × 2 division areas horizontally and not including
non-candidacy area 703” may be set as a condition for a post-extension area. In this case, when character “akira” 702a is the object that is subject to processing, extension area 706a comprising four division areas in the top-left corner of image display area 410 is selected as a close-up candidate area, as shown in FIG. 4D. Another example of a post-extension area condition that may be used is “the maximum area with the same number of division areas vertically and horizontally and not including non-candidacy area 703”. - Next, in step S3300 in
FIG. 5, close-up area determining section 330 determines whether or not, among the division areas at the greatest distance from the object placement area, another division area remains that has not yet been subjected to the processing in step S3200. That is to say, close-up area determining section 330 determines whether or not a division area positioned at the same distance from the object placement area as a division area already subjected to step S3200 processing is present. If a corresponding division area is present (S3300: YES), close-up area determining section 330 returns to step S3200 and performs close-up candidate area selection based on the relevant division area. If a corresponding division area is not present (S3300: NO), close-up area determining section 330 terminates the series of processing steps. - Here, as shown in
FIG. 4C, there is only one division area with the highest numeric value (S3300: NO), and therefore close-up area determining section 330 terminates processing after selecting one close-up candidate area. - When the processing in step S3000 in
FIG. 3 is completed in this way, close-up area determining section 330 returns to step S2000 in FIG. 3. Here, character “natsuko” 702b still remains as an object that has not been subjected to close-up candidate area determination processing among the objects that are subject to a close-up (S2000: YES). Therefore, close-up area determining section 330 proceeds to step S3000 processing again. Close-up area determining section 330 then executes the series of processing steps shown in FIG. 5, this time with character “natsuko” 702b as the object that is subject to processing. - Calculating numeric values indicating distances from
object placement area 704b of character “natsuko” 702b and displaying the calculation results in the division areas, in the same way as in the processing to which character “akira” 702a was subjected, gives the result shown in FIG. 4E. Here, as shown in FIG. 4E, the numeric values of the four division areas in the four corners of image display area 410 are the highest. - Therefore, division areas in the four corners, including
division area 705b positioned in the top-right corner of image display area 410, are selected as candidate reference areas, and extension areas in the four corners, including extension area 706b positioned in the top-right corner of image display area 410, are selected as close-up candidate areas. - When close-up candidate area determination processing is performed in this way for all objects that are subject to a close-up (S2000: NO), close-up
area determining section 330 proceeds to step S4000 processing. - In step S4000, close-up
area determining section 330 assigns a determined close-up candidate area as close-up display area 420 to each object that is subject to close-up display. - For example, in the case of
basic image 701 shown in FIG. 4, as described above, extension areas in the four corners of image display area 410 are determined to be close-up candidate areas. It is assumed here that a rule for close-up display area 420 assignment—for example, “prioritize assignment of an upper close-up candidate area, and assign in order from the close-up candidate area at the shortest distance”—has been set in advance. - When the distances from each
character 702 are compared for the upper two of the extension areas positioned in the four corners of image display area 410, the shortest distance is that from character “akira” 702a to extension area 706b. Therefore, close-up area determining section 330 first assigns extension area 706b to close-up display area 420a of character “akira” 702a, and then assigns remaining extension area 706a to close-up display area 420b of character “natsuko” 702b, as shown in FIG. 4F. - The rule “prioritize assignment of the close-up candidate area nearest the close-up candidate area assigned immediately before” may be set in advance as a rule for close-up
display area 420 assignment. By applying such a rule, it is possible to keep movement of close-up display area 420 of the same character 702 to a minimum. - After determining all necessary close-up
display areas 420 in this way, close-up area determining section 330 terminates the series of processing steps. - Close-up
area control section 350 determines sequentially whether or not close-up display area 420 determined by close-up area determining section 330 should be displayed. Then close-up area control section 350 controls display/non-display of close-up display area 420 according to the result of this determination. - For example, close-up
area control section 350 may perform control so that close-up display area 420 is displayed only if close-up display area 420 determined by close-up area determining section 330 can secure at least a predetermined area. Alternatively, close-up area control section 350 may control display of the corresponding close-up display area 420 in synchronization with an action of character 702. To be more specific, close-up area control section 350 controls in such a manner that the corresponding close-up display area 420 is displayed only when character 702 speaks, or only in a fixed section in which the expression of character 702 changes. By this means it is possible to cut down on close-up displays that have little effect, and to reduce screen complexity and apparatus load. - Close-up
area control section 350 discriminates a section in which character 702 is speaking or a section in which the expression of character 702 changes, for example, from an animation scenario 600 description. Specifically, for example, close-up area control section 350 identifies a section corresponding to a “Direction” 620 under which “Expression” is written, and determines a section extending for only a few seconds before and after that section to be a section in which the expression of character 702 changes. - When close-up
display areas 420 are displayed by means of the above-described processing, close-up image 707a of character “akira” 702a is displayed at the top-right of basic image 701, and close-up image 707b of character “natsuko” 702b is displayed at the top-left of basic image 701, as shown in FIG. 4F. - As can also be seen from
FIG. 4F, as a result of processing by close-up area determining section 330, close-up images 707 are both displayed in positions that do not overlap characters 702. That is to say, both the whole body and the facial part of each character 702 are displayed. Also, character 702 whole-body actions and facial expressions are displayed efficiently, and an expressive image is implemented. Furthermore, since close-up images 707 are displayed within image display area 410, the size of the display area of basic image 701 and the whole-body display size of each character 702 are unaltered. - After determining division area 705 and extension area 706, close-up
area determining section 330 may change the shape of a close-up display area to a shape other than a rectangle. -
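The per-object selection flow of steps S3100 and S3200 can be sketched in code. The sketch below is illustrative only and fills in details the text leaves open: the grid is modeled as (row, column) cells, the “vertical and horizontal” distance rule is read as Manhattan distance to the nearest cell of the object placement area, and the extension condition is fixed at 2 × 2 division areas.

```python
from itertools import product

def distance_grid(rows, cols, placement):
    """For each division area, a numeric value indicating its distance from the
    object placement area (step S3100); adjacent areas are distance 1, and only
    vertical/horizontal steps are counted (read here as Manhattan distance)."""
    return {(r, c): min(abs(r - pr) + abs(c - pc) for pr, pc in placement)
            for r, c in product(range(rows), range(cols))}

def candidate_reference_areas(grid):
    """Division areas positioned at the greatest distance from the placement area."""
    farthest = max(grid.values())
    return [cell for cell, d in grid.items() if d == farthest]

def extend(cell, rows, cols, non_candidacy, size=2):
    """Extend a candidate reference area to a size x size block of division areas
    that stays on the grid and avoids the non-candidacy area (step S3200);
    returns None if no such block exists."""
    r0, c0 = cell
    for dr, dc in product(range(size), repeat=2):
        top, left = r0 - dr, c0 - dc
        if top < 0 or left < 0 or top + size > rows or left + size > cols:
            continue
        block = {(top + i, left + j) for i, j in product(range(size), repeat=2)}
        if not block & non_candidacy:  # must not include the non-candidacy area
            return block
    return None
```

On a 4 × 4 grid with an object occupying the two bottom-left cells, the farthest division area is the top-right corner, and the 2 × 2 extension around it becomes the close-up candidate area.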
FIG. 6 is an explanatory drawing showing an example of an image when close-up display areas are changed in shape. A case is shown here in which a change of shape is applied to the rectangular close-up display areas so that the vertex nearest the center point of the close-up of the corresponding character 702 is moved to that center point. However, provision is made for objects to be displayed in front of close-up display areas 708 that have been changed in shape. Close-up display areas 708 that have been changed in shape in this way make it easier to understand which object a close-up relates to. - After determination or change of shape of close-up
display area 420, provision may be made for new close-up camerawork to be determined again by camerawork determining section 310. Alternatively, provision may be made for the actual determination of close-up camerawork to be performed after close-up display area 420 determination. - In
FIG. 4 and FIG. 6, a case has been illustrated in which the shooting direction of a close-up image is the same as the shooting direction of a basic image, but the present invention is not limited to this. For example, provision may be made for close-up camerawork to be determined so that a character is always shot full-face. This enables the expression of a character to be displayed even if the character is facing rearward in a basic image, for example. - In the above description, as stated in the explanation of step S1000 in
FIG. 3, an object that is subject to a close-up has been assumed to be an object to be displayed with priority, but the objects that are subject to a close-up may also be all or some of the objects to be displayed with priority. In this case, in step S1000 in FIG. 3, close-up area determining section 330 uses the objects to be displayed with priority to determine classification as a candidacy area or non-candidacy area, and in step S3000 in FIG. 3, close-up area determining section 330 uses the objects that are subject to a close-up (all or some of the objects to be displayed with priority) for close-up candidate area determination. - Various rules can be applied to differentiate between an object that is subject to a close-up and an object that is not. For example, a rule may be applied to the effect that an item for which “Expression” is defined among “Directions” 620 shown in
FIG. 2 is made subject to a close-up. Also, for example, a rule may be applied to the effect that when an attribute indicating whether or not the relevant object is the target is defined in “Resource (resource information)” 630 shown in FIG. 2, that item is made subject to a close-up. - An attribute indicating whether or not an object is a person, for example, may be used as an attribute indicating whether or not an object is subject to a close-up. An attribute indicating whether or not an object is a person may be designated “attr”, for example, and indicate whether or not an object is a person according to whether or not “person” has been set. In this case, “Resource name=“akira” attr=person uri= . . . ”, for example, is written as “Resource (resource information)” 630 in
animation scenario 600. Also, “Resource name=“chair” attr=chair uri= . . . ”, for example, is written as “Resource (resource information)” 630 in animation scenario 600. From these “Resource (resource information)” 630 items, it can be seen that the object “akira” is a person and the object “chair” is not a person. - By this means, close-up
area determining section 330 can ensure priority display while providing for a close-up not to be performed for a character for which a change of expression cannot be conveyed or, in particular, for a non-human object. - Also, if there are too many objects that are subject to a close-up compared with candidacy areas, provision may be made for close-up
area determining section 330 to reduce the number of objects that are subject to a close-up. For example, close-up area determining section 330 may determine a priority for each object according to whether or not there is an “Expression” definition or whether or not the object is a human object. Then close-up area determining section 330 determines objects that are subject to a close-up in order, starting with a high-priority object, through linkage with a candidacy area. By this means, for example, in a scene in which objects are placed discretely, there are fewer objects that are subject to a close-up because the candidacy area is smaller, and conversely, in a scene in which objects are placed densely, there are more objects that are subject to a close-up because the candidacy area is larger. That is to say, different close-up display effects can be obtained according to the circumstances of a scene. - The way in which CG
animation display system 100 displays an image has been described above for a basic image of a particular moment. However, the display area of an object normally changes from moment to moment in a basic image generated based on an animation scenario. Therefore, close-up area determining section 330 dynamically performs the above-described close-up display area 420 determination processing each time the display area of an object changes. Alternatively, close-up area determining section 330 dynamically performs the above-described close-up display area 420 determination processing in cycles that are sufficiently short (such as 15 times a second) compared with the speed of change of the display area of an object. - The way in which close-up
display area 420 changes according to changes in the display area of an object in a basic image is described below, taking one example of basic image change. -
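The dynamic re-determination policy just described can be sketched as a small driver. This is only a sketch; the `determine` callback, the placement representation, and the 1/15-second cadence are assumptions based on the example rate mentioned in the text.

```python
import time

class CloseUpScheduler:
    """Re-runs close-up display area determination whenever the object
    placement areas change, and otherwise at a fixed minimum cadence."""

    def __init__(self, determine, min_interval=1 / 15):
        self.determine = determine        # area-determination routine
        self.min_interval = min_interval  # e.g. 15 determinations per second
        self.last_placements = None
        self.last_time = float("-inf")
        self.areas = None

    def update(self, placements, now=None):
        now = time.monotonic() if now is None else now
        changed = placements != self.last_placements
        due = now - self.last_time >= self.min_interval
        if changed or due:
            self.areas = self.determine(placements)
            self.last_placements = placements
            self.last_time = now
        return self.areas                 # cached result when nothing changed
```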
FIG. 7 is an explanatory drawing showing the nature of changes of a basic image that is subject to close-up display. Here, each basic image at the time when close-up display area 420 determination processing is performed nine consecutive times by close-up area determining section 330 is illustrated in a time sequence. Below, an animation time subject to close-up display area 420 determination processing by close-up area determining section 330 is referred to as an area determination time. - As shown in
FIG. 7A through FIG. 7I, character “akira” 702a and character “natsuko” 702b appear in basic image 701. The display area of each character 702 changes over time in line with movement in the animation space of each character 702. - Close-up
area determining section 330 discriminates object placement area 704 in the basic image for each object that is subject to close-up display. -
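Discriminating an object placement area can be sketched as mapping an object's bounding box onto the grid of division areas; the pixel bounding-box input and the 8 × 8 split are assumptions taken from the example used later in the text.

```python
def placement_cells(bbox, view_w, view_h, rows=8, cols=8):
    """Division areas ((row, column) cells) overlapped by an object's bounding
    box (x0, y0, x1, y1), given a view of view_w x view_h pixels split into
    rows x cols rectangles."""
    x0, y0, x1, y1 = bbox
    cell_w, cell_h = view_w / cols, view_h / rows
    c0, c1 = int(x0 // cell_w), min(cols - 1, int((x1 - 1e-9) // cell_w))
    r0, r1 = int(y0 // cell_h), min(rows - 1, int((y1 - 1e-9) // cell_h))
    return {(r, c) for r in range(r0, r1 + 1) for c in range(c0, c1 + 1)}
```

The small 1e-9 nudge keeps a box whose edge falls exactly on a cell boundary from claiming the next cell.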
FIG. 8 is an explanatory drawing showing the nature of changes of object placement areas 704. FIG. 8A through FIG. 8I correspond to FIG. 7A through FIG. 7I respectively. - Close-up
area determining section 330 divides image display area 410, and here, object placement area 704a of character “akira” 702a and object placement area 704b of character “natsuko” 702b are discriminated for each basic image 701 as shown in FIG. 7A through FIG. 7I. - As shown in
FIG. 8, each object placement area 704 also changes over time. An area including object placement area 704a of character “akira” 702a and object placement area 704b of character “natsuko” 702b becomes non-candidacy area 703. - Close-up
area determining section 330 determines close-up display areas 420 within image display area 410 so as not to overlap non-candidacy area 703. -
FIG. 9 is an explanatory drawing showing the nature of changes of close-up display areas 420. FIG. 9A through FIG. 9I correspond to FIG. 7A through FIG. 7I and FIG. 8A through FIG. 8I respectively. - Within
image display area 410, close-up area determining section 330 determines areas other than non-candidacy area 703 (that is, candidacy areas) that satisfy a preset condition to be close-up display areas 420. Here, a case is illustrated in which image display area 410 is divided into 64 rectangles, eight horizontally and eight vertically, and “the maximum area with the same number of division areas vertically and horizontally and not exceeding 3 division areas vertically × 3 division areas horizontally” has been set as the close-up candidate area condition. - As shown in
FIG. 9, close-up display areas 420 also change over time, but never overlap a character 702 display area. - Close-up
area determining section 330 assigns close-up display areas 420 to objects that are subject to close-up display in accordance with the preset condition. CG picture drawing section 320 causes a final CG animation image (hereinafter referred to as a “final image”) to be displayed by causing the relevant close-up image 707 to be displayed in each close-up display area 420 in accordance with this assignment. -
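The example assignment rule described earlier (“prioritize an upper close-up candidate area, and assign in order from the shortest distance”) can be sketched as follows. Representing each object and each candidate area by a centre point, and the exact tie-breaking order, are illustrative assumptions not fixed by the text.

```python
from math import hypot

def assign_close_up_areas(objects, candidates):
    """Greedy assignment: repeatedly pick, among all remaining (object, area)
    pairs, the pair whose area is uppermost (smallest y) and, among those,
    whose object-to-area distance is shortest.
    objects:    name -> (x, y) placement centre
    candidates: area id -> (x, y) area centre
    """
    remaining_objs = set(objects)
    remaining_areas = dict(candidates)
    assignment = {}
    while remaining_objs and remaining_areas:
        pairs = [
            (centre[1],                  # prefer upper areas (smaller y)
             hypot(objects[o][0] - centre[0], objects[o][1] - centre[1]),
             o, aid)
            for o in remaining_objs
            for aid, centre in remaining_areas.items()
        ]
        _, _, obj, aid = min(pairs)
        assignment[obj] = aid
        remaining_objs.discard(obj)
        del remaining_areas[aid]
    return assignment
```

With two characters and four corner candidate areas, the upper areas are handed out first, each going to its nearest still-unassigned character, in the spirit of the FIG. 4F example.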
FIG. 10 is an explanatory drawing showing the nature of changes of a final image. FIG. 10A through FIG. 10I correspond to FIG. 9A through FIG. 9I respectively. - CG
picture drawing section 320 causes close-up image 707 of each object that is subject to close-up display to be displayed in the assigned close-up display area 420. By this means, close-up images 707 are displayed embedded in basic image 701 without interfering with the display of characters 702, as shown in FIG. 10. - As shown in
FIG. 9C and FIG. 10C, close-up display areas 420 may temporarily become smaller due to the relationship between the condition and the candidacy area when determining a close-up candidate area, as described above. Also, as shown in FIG. 10A through FIG. 10I, the position of close-up display area 420 for a particular character differs according to the area determination time. If such switching of the size and position of close-up display area 420 is performed discretely only at each area determination time, there is a possibility of the final image appearing unnatural and jerky. - Thus, smoothing
interpolation determining section 340 interpolates the size and position of close-up display area 420 in a section between area determination times so that the size and position of close-up display area 420 change progressively and naturally. - At each area determination time, smoothing
interpolation determining section 340 acquires size and position information of close-up display area 420 determined by close-up area determining section 330. Then smoothing interpolation determining section 340 determines whether or not there is a close-up display area 420 change between adjacent area determination times. If there is a change, smoothing interpolation determining section 340 determines the implementation method of the close-up display area 420 change in the section between the preceding and succeeding area determination times in accordance with a preset rule, and performs close-up display area 420 change smoothing processing. - Smoothing
interpolation determining section 340 may, for example, apply the rule “if close-up display areas 420 overlap between preceding and succeeding area determination times, change the close-up display area 420 outline progressively” as a rule for determining the close-up display area 420 change implementation method. Alternatively, smoothing interpolation determining section 340 may, for example, apply the rule “if close-up display areas 420 do not overlap between preceding and succeeding area determination times, and there is a candidacy area enabling close-up display area 420 to be moved progressively, move close-up display area 420 progressively”. As a further example, smoothing interpolation determining section 340 may apply the rule “if close-up display areas 420 do not overlap between preceding and succeeding area determination times, and there is not a candidacy area enabling close-up display area 420 to be moved progressively, temporarily reduce the size of close-up display area 420 until it disappears, and then enlarge it to its original size after changing its position”. - Determination of the size and position of close-up
display area 420 in a section between area determination times may be performed by smoothing interpolation determining section 340. Alternatively, smoothing interpolation determining section 340 may output the determined smoothing change implementation method to close-up area determining section 330, after which the size and position of the above-described close-up display area 420 are determined by close-up area determining section 330. Information indicating the size and position of close-up display area 420 in the determined section between area determination times is output to image display section 400. -
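The three example rules can be sketched as one interpolation function. Rectangles as (x, y, w, h) tuples, a 0–1 time fraction t between area determination times, and the shrink-then-grow schedule split at t = 0.5 are illustrative assumptions; the text does not fix these details.

```python
def overlaps(a, b):
    """Axis-aligned overlap test for rectangles (x, y, w, h)."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def lerp_rect(a, b, t):
    """Linear interpolation between two rectangles."""
    return tuple(av + (bv - av) * t for av, bv in zip(a, b))

def smooth_area(prev, nxt, t, has_clear_path=False):
    """Close-up display area at fraction t (0..1) between two area
    determination times: overlapping areas (or areas with a clear candidacy
    path) morph/slide progressively; otherwise the area shrinks in place
    until it disappears, then grows back at the new position."""
    if overlaps(prev, nxt) or has_clear_path:
        return lerp_rect(prev, nxt, t)
    if t < 0.5:                       # first half: shrink toward own centre
        x, y, w, h = prev
        s = 1 - 2 * t
    else:                             # second half: grow at the new position
        x, y, w, h = nxt
        s = 2 * t - 1
    return (x + w * (1 - s) / 2, y + h * (1 - s) / 2, w * s, h * s)
```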
FIG. 11 is an explanatory drawing showing an example of how the size and position of close-up display area 420 are interpolated by smoothing interpolation determining section 340. Horizontal axis 800 indicates animation times. The area above horizontal axis 800 relates to an explanation concerning area determination times, and the area below horizontal axis 800 relates to an explanation concerning a section between area determination times. - As shown in
FIG. 11, when character 702 moves between time t-10, which is an area determination time, and time t-20, which is the next area determination time, non-candidacy area 703 also moves. Close-up display areas 420-10 and 420-20 are deemed to be determined for times t-10 and t-20 respectively as a result. Also, as shown in FIG. 11, these close-up display areas 420-10 and 420-20 are deemed to be overlapping areas even though they are different in size. - When the above rule is applied, smoothing
interpolation determining section 340 progressively changes the outline of close-up display area 420 between times t-10 and t-20. As a result, as shown in FIG. 11, at time t-11 between times t-10 and t-20, for example, interpolation is performed using close-up display area 420-11, of a size between close-up display areas 420-10 and 420-20. In this way, the size of close-up display area 420 changes smoothly. -
FIG. 12 is an explanatory drawing showing another example of how the size and position of close-up display area 420 are interpolated by smoothing interpolation determining section 340, corresponding to FIG. 11. As shown in FIG. 12, it is assumed that close-up display area 420-30, determined for time t-30, which is an area determination time, and close-up display area 420-40, determined for time t-40, which is the next area determination time, do not overlap. Also, it is assumed that there is not a candidacy area enabling close-up display area 420 to be moved progressively from close-up display area 420-30 to close-up display area 420-40. - When the above rule is applied, smoothing
interpolation determining section 340 temporarily reduces the size of close-up display area 420 between time t-30 and time t-40 until it disappears, and then enlarges it to its original size after changing its position. - As a result, as shown in
FIG. 12, close-up display area 420-33 at time t-33 between times t-30 and t-40 is smaller than close-up display area 420-30. Then the position of close-up display area 420 moves, and close-up display area 420-34 at time t-34, immediately after the move, overlaps the position of time t-40 close-up display area 420-40. Also, close-up display area 420-36 at time t-36 between times t-34 and t-40 is of a size intermediate between time t-34 close-up display area 420-34 and time t-40 close-up display area 420-40. - As shown in
FIG. 12, between times t-30 and t-33, interpolation is performed using close-up display areas 420-31 and 420-32, of sizes between close-up display areas 420-30 and 420-33. Also, between times t-34 and t-36, interpolation is performed by means of close-up display area 420-35, of a size between close-up display areas 420-34 and 420-36. In this way, the size of close-up display area 420 changes smoothly. - As described above, according to this embodiment, an object placement area is discriminated from a basic image, enabling close-up
display area 420 to be determined according to the object placement area of basic image 701. Also, an object placement area is discriminated for a specific object, and close-up display area 420 is determined so as not to overlap the discriminated object placement area, enabling a close-up image to be displayed without interfering with the display of an object, such as a character, whose entire image it is wished to display without fail in a basic image. That is to say, a close-up image can be displayed in a state in which influence on the basic image is suppressed. Also, utilizing the advantage of computer graphics of not suffering image quality degradation when enlargement is performed, a CG animation image shot using a plurality of camerawork variations and from a plurality of angles can be displayed on a single screen, enabling a user to be provided with more expressive images. - Also, since close-up
display area 420 is determined within image display area 410, the size of image display area 410 can be used unaltered. Therefore, both a basic image and a close-up image can be displayed more effectively. Specifically, for example, based on a CG image in which only whole-body actions of characters can be displayed, such as shown in FIG. 7, it is possible to display a CG image in which the expression of each character can be clearly understood and, moreover, whole-body actions of characters can also be displayed, such as shown in FIG. 10. - Furthermore, according to this embodiment, an
animation scenario 600 description is analyzed, necessary image material is acquired from image material database 200, appropriate camerawork is determined, and a basic image and close-up image are generated. By this means, CG animation can be generated in line with the content of animation scenario 600, and the above-described effective image display can be implemented with the generated CG animation. Also, the size and position of close-up display area 420 in a section between area determination times can be interpolated. By these means, close-up display area 420 changes can be performed smoothly, a CG animation image viewer can easily keep visual track of close-up display areas 420, and higher-quality image display can be implemented. - In the above-described embodiment, an object displayed in a basic image has been assumed to be subject to close-up display, but the present invention is not limited to this. For example, an object that is subject to close-up display may be an object that is not displayed in a basic image, or a close-up image may be provided as an image independent of a basic image. Also, a case has been described in which the image subject to display is a CG animation image, but it is also possible to apply the above-described technology to a live-action image. For example, a live-action image may be analyzed by means of known image analysis technology, a display area of a specific object, such as a human being, detected, and a close-up image displayed in an area other than the detected area.
- The disclosure of Japanese Patent Application No. 2006-211336, filed on Aug. 2, 2006, including the specification, drawings and abstract, is incorporated herein by reference in its entirety.
- An image display apparatus and image display method according to the present invention are suitable for use as an image display apparatus and image display method that enable both a basic image and a close-up image to be displayed more effectively. In particular, an image display apparatus and image display method according to the present invention are suitable for use in a device with a small display screen, such as a mobile phone, PDA, portable game machine, or the like.
Claims (11)
1. An image display apparatus comprising:
a display area discriminating section that discriminates a display area of a specific object in a basic image that is subject to display; and
a close-up area determining section that determines a display area of a close-up image in the basic image according to the display area of the specific object in the basic image.
2. The image display apparatus according to claim 1 , further comprising an image display section that displays the basic image, wherein:
the specific object is an object to be displayed with priority among objects placed in the basic image; and
the close-up area determining section determines an area other than the display area of the specific object within the display area of the basic image according to the image display section to be a display area of the close-up image.
3. The image display apparatus according to claim 2 , further comprising a close-up area control section that causes the close-up image to be displayed when the display area of the close-up image determined by the close-up area determining section has an area greater than or equal to a predetermined area.
4. The image display apparatus according to claim 3 , further comprising a close-up image generating section that generates an image of a close-up of the specific object as the close-up image,
wherein the close-up area control section controls display of the close-up image in synchronization with action of the specific object.
5. The image display apparatus according to claim 2 , wherein the basic image and the close-up image are computer graphics animation images in which a computer graphics object is placed.
6. The image display apparatus according to claim 5 , further comprising:
a camerawork determining section that determines camerawork of the basic image from an animation scenario;
a basic image generating section that generates the basic image based on the animation scenario and the camerawork of the basic image determined by the camerawork determining section; and
a close-up image generating section that generates the close-up image from the animation scenario.
7. The image display apparatus according to claim 6 , wherein:
the camerawork determining section determines camerawork of the close-up image from the animation scenario; and
the close-up image generating section generates the close-up image based on the camerawork of the close-up image determined by the camerawork determining section.
8. The image display apparatus according to claim 7 , further comprising an image material database that stores image material necessary for generating the computer graphics animation image,
wherein the basic image generating section and the close-up image generating section each acquire necessary image material from the image material database and generate the computer graphics animation image.
9. The image display apparatus according to claim 2 , further comprising a smoothing interpolation determining section that, in a section in which the display area of the close-up image determined by the close-up area determining section changes, performs interpolation of that change.
10. The image display apparatus according to claim 1 , wherein the display area discriminating section dynamically discriminates a display area of the specific object in the basic image that is subject to display.
11. An image display method comprising:
a display area discriminating step of discriminating a display area of a specific object in a basic image that is subject to display; and
a close-up area determining step of determining a display area of a close-up image in the basic image according to the display area of the specific object in the basic image discriminated by the display area discriminating step.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2006211336 | 2006-08-02 | ||
JP2006-211336 | 2006-08-02 | ||
PCT/JP2007/064779 WO2008015978A1 (en) | 2006-08-02 | 2007-07-27 | Video image display device and video image display method |
Publications (1)
Publication Number | Publication Date |
---|---|
US20090262139A1 true US20090262139A1 (en) | 2009-10-22 |
Family
ID=38997158
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/375,580 Abandoned US20090262139A1 (en) | 2006-08-02 | 2007-07-27 | Video image display device and video image display method |
Country Status (4)
Country | Link |
---|---|
US (1) | US20090262139A1 (en) |
JP (1) | JPWO2008015978A1 (en) |
CN (1) | CN101490738A (en) |
WO (1) | WO2008015978A1 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012128662A (en) * | 2010-12-15 | 2012-07-05 | Samsung Electronics Co Ltd | Display control device, program and display control method |
CN111541927A (en) * | 2020-05-09 | 2020-08-14 | 北京奇艺世纪科技有限公司 | Video playing method and device |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10239085A (en) * | 1997-02-26 | 1998-09-11 | Casio Comput Co Ltd | Map displaying device, map displaying method and recording medium |
JP2001188525A (en) * | 1999-12-28 | 2001-07-10 | Toshiba Corp | Image display device |
JP3643796B2 (en) * | 2001-08-03 | 2005-04-27 | 株式会社ナムコ | Program, information storage medium, and game device |
JP2004147181A (en) * | 2002-10-25 | 2004-05-20 | Fuji Photo Film Co Ltd | Image browsing device |
2007
- 2007-07-27 WO PCT/JP2007/064779 patent/WO2008015978A1/en active Application Filing
- 2007-07-27 CN CNA2007800270051A patent/CN101490738A/en active Pending
- 2007-07-27 US US12/375,580 patent/US20090262139A1/en not_active Abandoned
- 2007-07-27 JP JP2008527729A patent/JPWO2008015978A1/en active Pending
Patent Citations (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020039924A1 (en) * | 1998-05-27 | 2002-04-04 | Nintendo Co., Ltd. | Portable color display game machine and storage medium for the same |
US6979265B2 (en) * | 2000-01-07 | 2005-12-27 | Konami Corporation | Game system and computer readable storage medium |
US6781592B2 (en) * | 2000-04-26 | 2004-08-24 | Konami Corporation | Image generating device, image generating method, readable storage medium storing image generating program, and video game device |
US20040083025A1 (en) * | 2001-07-17 | 2004-04-29 | Torahiko Yamanouchi | Industrial vehicle equipped with material handling work controller |
US7380208B2 (en) * | 2002-12-20 | 2008-05-27 | Fuji Xerox Co., Ltd. | Image processing apparatus, image processing program and image processing method |
US20050129311A1 (en) * | 2003-12-11 | 2005-06-16 | Haynes Simon D. | Object detection |
US20060031244A1 (en) * | 2004-07-26 | 2006-02-09 | Kabushiki Kaisha Toshiba | Data structure of metadata and processing method of the metadata |
US7873411B2 (en) * | 2004-09-01 | 2011-01-18 | National Institute Of Information And Communications Technology | Interface device, interface method and control training device by the use of the interface device |
US20060097172A1 (en) * | 2004-11-09 | 2006-05-11 | Samsung Electronics Co., Ltd. | Imaging apparatus, medium, and method using infrared rays with image discrimination |
US7848546B2 (en) * | 2005-09-09 | 2010-12-07 | Canon Kabushiki Kaisha | Information processing apparatus, information processing method, program, and storage medium |
US20070057966A1 (en) * | 2005-09-09 | 2007-03-15 | Canon Kabushiki Kaisha | Information processing apparatus, information processing method, program, and storage medium |
US20070147820A1 (en) * | 2005-12-27 | 2007-06-28 | Eran Steinberg | Digital image acquisition system with portrait mode |
US20100172733A1 (en) * | 2006-03-27 | 2010-07-08 | Commissariat A L'energie Atomique | Intelligent interface device for grasping of an object by a manipulating robot and method of implementing this device |
US20090091650A1 (en) * | 2007-10-01 | 2009-04-09 | Fujifilm Corporation | Digital camera capable of appropriately discriminating the face of a person |
US20100245597A1 (en) * | 2009-03-27 | 2010-09-30 | Primax Electronics Ltd. | Automatic image capturing system |
US20110141130A1 (en) * | 2009-12-10 | 2011-06-16 | Sony Corporation | Display device |
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080094421A1 (en) * | 2005-05-19 | 2008-04-24 | Access Co., Ltd. | Information display apparatus and information displaying method |
US20100109985A1 (en) * | 2007-07-19 | 2010-05-06 | Panasonic Corporation | Image display device |
US8305305B2 (en) | 2007-07-19 | 2012-11-06 | Panasonic Corporation | Image display device |
US20090083814A1 (en) * | 2007-09-25 | 2009-03-26 | Kabushiki Kaisha Toshiba | Apparatus and method for outputting video images, and purchasing system |
US8466961B2 (en) | 2007-09-25 | 2013-06-18 | Kabushiki Kaisha Toshiba | Apparatus and method for outputting video images, and purchasing system |
US20100057696A1 (en) * | 2008-08-28 | 2010-03-04 | Kabushiki Kaisha Toshiba | Display Processing Apparatus, Display Processing Method, and Computer Program Product |
US8527899B2 (en) * | 2008-08-28 | 2013-09-03 | Kabushiki Kaisha Toshiba | Display processing apparatus, display processing method, and computer program product |
US8949741B2 (en) | 2009-03-03 | 2015-02-03 | Kabushiki Kaisha Toshiba | Apparatus and method for presenting content |
US20100229126A1 (en) * | 2009-03-03 | 2010-09-09 | Kabushiki Kaisha Toshiba | Apparatus and method for presenting contents |
US20100250553A1 (en) * | 2009-03-25 | 2010-09-30 | Yasukazu Higuchi | Data display apparatus, method, and program |
US8244738B2 (en) | 2009-03-25 | 2012-08-14 | Kabushiki Kaisha Toshiba | Data display apparatus, method, and program |
CN102880458A (en) * | 2012-08-14 | 2013-01-16 | 东莞宇龙通信科技有限公司 | Method and system for generating player interface on background picture |
EP2983074A1 (en) * | 2014-08-07 | 2016-02-10 | Samsung Electronics Co., Ltd. | Method and apparatus for displaying a screen in electronic devices |
US10146413B2 (en) | 2014-08-07 | 2018-12-04 | Samsung Electronics Co., Ltd. | Method and apparatus for displaying screen in electronic devices |
CN105677140A (en) * | 2014-12-08 | 2016-06-15 | 三星电子株式会社 | Method and apparatus for arranging objects according to content of background image |
US10558322B2 (en) | 2014-12-08 | 2020-02-11 | Samsung Electronics Co., Ltd. | Method and apparatus for displaying objects and a background image on a display screen |
CN110134478A (en) * | 2019-04-28 | 2019-08-16 | 深圳市思为软件技术有限公司 | The scene conversion method, apparatus and terminal device of panoramic scene |
Also Published As
Publication number | Publication date |
---|---|
CN101490738A (en) | 2009-07-22 |
WO2008015978A1 (en) | 2008-02-07 |
JPWO2008015978A1 (en) | 2009-12-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20090262139A1 (en) | Video image display device and video image display method | |
CN110766777B (en) | Method and device for generating virtual image, electronic equipment and storage medium | |
CN108010112B (en) | Animation processing method, device and storage medium | |
EP2544073A2 (en) | Image processing device, image processing method, recording medium, computer program, and semiconductor device | |
CN112288665B (en) | Image fusion method and device, storage medium and electronic equipment | |
CN111161392B (en) | Video generation method and device and computer system | |
US20180024703A1 (en) | Techniques to modify content and view content on a mobile device | |
EP2133842A1 (en) | Image generating apparatus, image processing method, information recording medium, and program | |
WO2021135320A1 (en) | Video generation method and apparatus, and computer system | |
JP5713855B2 (en) | Information processing apparatus, information processing method, and data structure of content file | |
JP5851170B2 (en) | Image processing apparatus and image processing method | |
US20210397260A1 (en) | Methods and systems for decoding and rendering a haptic effect associated with a 3d environment | |
CN112884908A (en) | Augmented reality-based display method, device, storage medium, and program product | |
WO2023231235A1 (en) | Method and apparatus for editing dynamic image, and electronic device | |
CN111107427B (en) | Image processing method and related product | |
CN116152416A (en) | Picture rendering method and device based on augmented reality and storage medium | |
JP2012147404A (en) | Information processor | |
CN113132800B (en) | Video processing method and device, video player, electronic equipment and readable medium | |
US6833841B2 (en) | Image forming method, computer program for forming image, and image forming apparatus | |
JP7427786B2 (en) | Display methods, devices, storage media and program products based on augmented reality | |
CN105892663A (en) | Information processing method and electronic device | |
CN114913277A (en) | Method, device, equipment and medium for three-dimensional interactive display of object | |
JP4637199B2 (en) | Image processing apparatus, image processing method, and program | |
CN113034653A (en) | Animation rendering method and device | |
JP2014153747A (en) | Program, information equipment and method for controlling character display on basis of image complexity |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: PANASONIC CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:TANAKA, TOSHIYUKI;URANAKA, SACHIKO;MIYAZAKI, SEIYA;AND OTHERS;REEL/FRAME:022342/0673;SIGNING DATES FROM 20081226 TO 20090106 |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |