CN1316723A - Method and device for providing cartoon outline in three-D video image system - Google Patents


Info

Publication number
CN1316723A
CN1316723A (application CN00128840.7A)
Authority
CN
China
Prior art keywords
pixel
depth
filter
data
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN00128840.7A
Other languages
Chinese (zh)
Inventor
Yoshitaka Yasumoto (安本吉孝)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nintendo Co Ltd
Original Assignee
Nintendo Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US09/468,109 external-priority patent/US6747642B1/en
Application filed by Nintendo Co Ltd filed Critical Nintendo Co Ltd
Publication of CN1316723A publication Critical patent/CN1316723A/en
Pending legal-status Critical Current

Landscapes

  • Image Generation (AREA)
  • Processing Or Creating Images (AREA)

Abstract

In a 3D computer graphics system such as a 3D home video game console, efficient techniques for generating non-photorealistic effects such as cartoon outlining involve generating and displaying border lines at object edges based on contents of a pixel memory. Some techniques use depth (Z) values to determine which pixels are located at object edges, and selectively blend border coloration into those pixels which are then displayed. Object edges are located by comparing pixel depth values with neighboring pixel depth values (e.g., calculating a 'distance' value based on the absolute values of the distance(s) between a pixel's depth and depths of neighboring pixels). A desired border line color is blended into the pixel's color value based on the calculated distance value.

Description

Method and Apparatus for Providing Cartoon Outlining in a 3D Video Graphics System
The present invention relates to 3D computer graphics, and more particularly to non-photorealistic 3D imagery. Specifically, the invention concerns methods and apparatus by which a computer generates and displays borders, such as outlines or other edge lines, around 3D objects.
Most computer graphics research has aimed at producing realistic images, and such research has been remarkably successful. Computers can now generate images so realistic that they cannot be distinguished from photographs. For example, most of us have seen convincing computer-generated special effects in film and television: lifelike dinosaurs, aliens and other imagery. New pilots train on computer flight simulators so realistic that they nearly reproduce actual flight. Inexpensive home video game systems can now provide a fairly high degree of realism, giving the player the authentic feel of driving a race car along a road, skiing down an ice- and snow-covered slope, walking through a medieval castle, and so on. For most games, this realism greatly enhances the play experience.
Sometimes, however, a deliberately unrealistic image is preferable. For example, some interactive video and computer games derive their entertainment value not from realistically simulating the real (or an imagined) world, but from creating and displaying a cartoon world filled with intentionally fanciful, caricature-style cartoon characters. Such a game may, for example, try to recreate the look of browsing a hand-drawn comic book while adding action, speech and interactivity. In such a 3D computer graphics system, a desirable visual effect is to provide distinct solid borders, such as lines outlining the silhouette or other edges of displayed objects. Such border lines can add clarity to certain images; they can help the user more clearly distinguish different surfaces, for example the silhouette of a cartoon character, the peaks of a landscape, or the edges of a wall; and they can help establish the desired hand-drawn, cartoon-artist look.
One way to provide non-photorealistic effects such as border lines around the edges of a character or other object is to define separate line objects adjacent to the polygon-modeled object. However, defining separate border objects substantially increases image-processing complexity. In a resource-constrained system, this approach may slow image generation or degrade performance. Efficiency is especially important in inexpensive 3D graphics systems such as interactive 3D video game systems. Techniques that generate the visual effect more efficiently place a lighter burden on system resources, improving the overall visual experience without significantly sacrificing speed or other performance.
The present invention solves these problems by providing efficient techniques for displaying border lines at silhouettes or other edges in a 3D video graphics system such as a home video game console. According to one aspect of the invention, border lines are generated after an image has been rendered into a frame buffer. The invention uses values stored in the frame buffer to determine which pixels lie on the silhouette or other edges of an object, selectively blends border coloring into those pixels, and then displays them.
According to one exemplary embodiment, silhouette edges are located by comparing a pixel's depth value with the depth values of neighboring pixels. In one particular example, a "distance" value is calculated from the absolute values of the differences between the pixel's depth and the depths of neighboring pixels. A desired border-line color is blended into the pixel's color value based on the calculated distance value. In one particular example, the distance value is used to compute a pixel alpha value, and that alpha value controls how much border color is mixed into the pixel's color.
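As a rough sketch of this distance-to-alpha blend (in Python for readability; the linear alpha ramp and the threshold constant are illustrative assumptions, since the patent leaves the exact mapping open):

```python
def blend_border(pixel_rgb, border_rgb, dz, threshold=16.0):
    """Blend a border color into a pixel color based on the depth
    'distance' dz. The linear dz -> alpha ramp and the threshold
    value are illustrative assumptions, not taken from the patent."""
    alpha = min(dz / threshold, 1.0)  # alpha grows with the depth discontinuity
    return tuple((1.0 - alpha) * p + alpha * b
                 for p, b in zip(pixel_rgb, border_rgb))
```

For an interior pixel (dz = 0) the original color is returned unchanged; at a strong depth discontinuity (dz at or above the threshold) the pixel becomes pure border color.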
Another example adds depth modulation. In this example, the pixel alpha value is computed from the distance value together with a further value that is a function of the pixel's depth, yielding an alpha value modulated by pixel depth. The border color is then mixed into the pixel color according to this depth-modulated alpha value.
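One plausible reading of the depth modulation described above is that distant pixels receive fainter outlines. A minimal sketch under that assumption (the linear depth falloff, `z_far` and `threshold` values are hypothetical):

```python
def depth_modulated_alpha(dz, z, z_far=1.0, threshold=16.0):
    """Compute a border alpha from the depth-distance dz, scaled by a
    function of the pixel's own depth z so that outlines on distant
    geometry come out fainter. Both the linear edge ramp and the
    linear depth falloff are illustrative assumptions."""
    edge_alpha = min(dz / threshold, 1.0)     # from the distance value
    depth_scale = max(1.0 - z / z_far, 0.0)   # 1.0 at the viewpoint, 0.0 at z_far
    return edge_alpha * depth_scale
```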
In a further embodiment, border lines are not drawn along certain internal edges of an object that lie within its silhouette, yet a cartoon artist might add border lines there. Consider, for example, a cartoon character holding an arm in front of his body. Even though the raised arm places its edges inside the character's overall silhouette, a cartoon artist would draw a border line around the arm. According to a further feature of the invention, different identification values are assigned to the pixels of the image; these ID values may, for example, be stored in frame-buffer memory locations normally reserved for alpha values. The pixel identification values are used to distinguish edges that receive border lines from edges that do not.
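A minimal sketch of how such stored ID values could flag border-worthy internal edges (the neighbor choice mirrors the depth comparison described later; the row-major list-of-lists layout is an implementation assumption):

```python
def id_edge(ids, x, y):
    """Decide whether pixel (x, y) lies on a border-worthy edge by
    comparing its object-part ID with those of its left and lower
    neighbors. The IDs are assumed to have been stashed in the alpha
    plane of the frame buffer, as the embodiment suggests."""
    here = ids[y][x]
    return here != ids[y][x - 1] or here != ids[y - 1][x]
```

Where the arm (one ID) overlaps the torso (another ID), the IDs differ and an outline is drawn even though no silhouette edge exists there.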
The file of this patent contains at least one drawing executed in color. Copies of this patent with color drawings will be provided by the Patent and Trademark Office upon request and payment of the necessary fee.
These and other features and advantages of the present invention may be better and more completely understood by referring to the following detailed description of preferred embodiments in conjunction with the accompanying drawings, of which:
Figs. 1-1F show an example 3D video graphics system that may be used to implement the present invention;
Fig. 2 is a flowchart illustrating the steps of a preferred-embodiment pixel filter that provides border-line display;
Fig. 2A is a diagram of neighboring pixels;
Fig. 3A illustrates an example perspective transformation;
Fig. 3B is an example modification of the Fig. 2 flowchart to provide a range conversion accounting for the perspective transformation;
Fig. 4 is a further flowchart of example pixel-filter steps providing depth modulation;
Figs. 5A-5F show example screen effects provided in accordance with the invention;
Figs. 6 and 7 show a cartoon character with cartoon outlining;
Figs. 8A-8C illustrate an alternative embodiment of the invention that uses object-part identification values to specify where cartoon-outline border lines are applied;
Fig. 9 is a flowchart of an example process for handling the Fig. 8A identification values;
Fig. 9A schematically shows alternative code changes; and
Figs. 10 and 11 illustrate example encoding of object-part identification values to achieve cartoon-outlining effects.
Fig. 1 shows an example 3D video graphics system 1005 that may be used to display border lines in accordance with the present invention.
System 1005 responds to real-time interactive input from, for example, game controllers 1007 and/or other manual input devices to produce a visual display on a display device 1009 (such as a home color television set, video monitor or other display). System 1005 can operate under control of a computer program, such as a video game program, stored in external storage media 1011 (for example a removable video game cartridge, a CD-ROM or other optical disk, etc.).
In this example, system 1005 includes a processor 1010, a 3D graphics coprocessor 1020, and a memory 1030. Processor 1010 provides 3D graphics commands to graphics coprocessor 1020, which interactively generates 2D views of a 3D world in response to those commands. For example, graphics coprocessor 1020 may include a hardware-based 3D graphics pipeline that processes graphics primitives, such as polygons defined in 3D space, and produces pixels representing a 2D image of the 3D space projected onto a viewing plane from a selectable viewpoint. In this example, the user can change the viewpoint on a real-time interactive basis by operating game controllers 1007.
Graphics coprocessor 1020 stores the display elements it generates in a frame buffer 1040 within memory device 1030. Frame buffer 1040 comprises a color frame buffer 1042 and a depth (Z) buffer 1044. In this example, color frame buffer 1042 stores a 2D array of red, green and blue (RGB) color values and corresponding alpha (A) values. The stored RGB values may correspond one-to-one with pixels displayed on display 1009, or frame buffer 1042 may store sub-samples. Z buffer 1044 stores a depth value (e.g., distance in the Z direction relative to the viewpoint) for each pixel or sub-pixel stored in the color frame buffer. As is well known, Z buffer 1044 is used for a variety of purposes (such as hidden-surface removal) as the graphics pipeline "builds" the image.
In the preferred embodiment, color buffer 1042 and/or Z buffer 1044 also play a role in determining where border lines fall at object edges and in selectively blending border color into the pixel color values stored in color frame buffer 1042. More specifically, system 1005 further includes a pixel filter 50. Pixel filter 50 processes the image pixel values produced by 3D graphics coprocessor 1020 to selectively apply border-line color to particular pixels at the silhouette and/or other edges of displayed objects. In the preferred embodiment, pixel filter 50 operates as a pixel post-processing stage after the image data has been committed to frame buffer 1040 or another pixel store. As one example, pixel filter 50 can apply non-photorealistic border-line image elements to image data buffered by 3D graphics coprocessor 1020 in the process of transferring the coprocessor's image output into memory 1030.
Fig. 1A is a more detailed schematic diagram of an overall example interactive 3D computer graphics system 1005 in which the present invention may be implemented. System 1005 can be used to play interactive 3D video games accompanied by stereo sound. Different games can be played by inserting appropriate storage media, such as optical disk 1011, into disk player 1134. A game player can interact with system 1005 in real time by operating input devices such as handheld controllers 1007, which may include a variety of controls such as joysticks, buttons, switches, keyboards or keypads.
In this example, main processor 1010 receives inputs from handheld controllers 1007 (and/or other input devices) via coprocessor 1020. Main processor 1010 interactively responds to such user inputs and executes a video game or other graphics program supplied, for example, by external storage 1011. For example, main processor 1010 can perform collision detection and animation processing in addition to a variety of real-time interactive control functions.
Main processor 1010 generates 3D graphics and audio commands and sends them to graphics and audio coprocessor 1020. The coprocessor processes these commands, generating visual images on display 1009 and stereo sound on stereo loudspeakers 1137R, 1137L or other suitable sound-generating devices.
System 1005 includes a TV encoder 1140 that receives image signals from coprocessor 1020 and converts them into composite video signals suitable for display on a standard display device 1009 (e.g., a computer monitor or home color television set). System 1005 also includes an audio codec (compressor/decompressor) 1138 that compresses and decompresses digitized audio signals (and may also convert between digital and analog audio signals). Audio codec 1138 can receive audio inputs via a buffer 1141 and provide them to coprocessor 1020 for processing (e.g., mixing with other audio signals the coprocessor generates and/or receives via a streaming audio output of optical disk device 1134). Coprocessor 1020 stores audio-related information in a memory 1144 dedicated to audio tasks. Coprocessor 1020 provides the resulting audio output signals to audio codec 1138 for decompression and conversion to analog signals (e.g., via buffer amplifiers 1142L, 1142R) so that they can be played by speakers 1137L, 1137R.
Coprocessor 1020 has the ability to communicate with various peripherals that may be present within system 1005. For example, a parallel digital bus 1146 may be used to communicate with optical disk device 1134. A serial peripheral bus 1148 may communicate with a variety of peripherals including, for example, a ROM and/or real-time clock 1150, a modem 1152, and flash memory 1154. A further external serial bus 1156 may be used to communicate with additional expansion memory 1158 (e.g., a memory card).
Fig. 1B is a block diagram of example components within coprocessor 1020. Coprocessor 1020 may be a single integrated circuit including a 3D graphics processor 1107, a processor interface 1108, a memory interface 1110, an audio digital signal processor (DSP) 1162, an audio memory interface (I/F) 1164, an audio interface and mixer 1166, a peripheral controller 1168, and a display controller 1128.
3D graphics processor 1107 performs graphics processing tasks, and audio digital signal processor 1162 performs audio processing tasks. Display controller 1128 accesses image information from memory 1030 and provides it to TV encoder 1140 for display on display device 1009. Audio interface and mixer 1166 interfaces with audio codec 1138, and can also mix audio from different sources (e.g., a streaming audio input from disk 1011, the output of audio DSP 1162, and external audio inputs received via audio codec 1138). Processor interface 1108 provides a data and control interface between main processor 1010 and coprocessor 1020. Memory interface 1110 provides a data and control interface between coprocessor 1020 and memory 1030. In this example, main processor 1010 accesses main memory 1030 via processor interface 1108 and memory controller 1110, which are part of coprocessor 1020. Peripheral controller 1168 provides a data and control interface between coprocessor 1020 and the various peripherals mentioned above (e.g., optical disk device 1134, controllers 1007, ROM and/or real-time clock 1150, modem 1152, flash memory 1154 and memory card 1158). Audio memory interface 1164 provides an interface with audio memory 1144.
Fig. 1C shows 3D graphics processor 1107 and related components within coprocessor 1020 in more detail. 3D graphics processor 1107 includes a command processor 1114 and a 3D graphics pipeline 1116. Main processor 1010 communicates streams of graphics data (i.e., display lists) to command processor 1114. Command processor 1114 receives these display commands and parses them (obtaining from memory 1030 any additional data necessary to process them), and provides a stream of vertex commands to graphics pipeline 1116 for 3D processing and rendering. Graphics pipeline 1116 generates a 3D image based on these commands. The resulting image information may be transferred to main memory 1030 for access by display controller 1128, which displays the frame-buffer output of pipeline 1116 on display 1009. A memory arbitration circuit 130 arbitrates memory access between graphics pipeline 1116, command processor 1114 and display unit 128.
As shown in Fig. 1C, graphics pipeline 1116 may include a transform unit 1118, a setup/rasterizer 1120, a texture unit 1122, a texture environment unit 1124, and a pixel engine 1126. Within graphics pipeline 1116, transform unit 1118 performs a variety of 3D transform operations, and may also perform lighting and texture effects. For example, transform unit 1118 transforms incoming per-vertex geometry from object space to screen space; transforms incoming texture coordinates and computes projective texture coordinates; performs polygon clipping in depth; performs per-vertex lighting computations; and performs bump-mapping texture coordinate generation. Setup/rasterizer 1120 includes a setup unit that receives vertex data from transform unit 1118 and sends triangle setup information to rasterizers that perform edge rasterization, texture-coordinate rasterization and color rasterization. Texture unit 1122 performs various texture-related tasks, including multi-texture handling, post-cache texture decompression, texture filtering, embossed bump mapping, shadows and lighting through the use of projective textures, and BLIT with alpha transparency and depth. Texture unit 1122 outputs filtered texture values to texture environment unit 1124. Texture environment unit 1124 blends polygon colors and texture colors, and performs texture fog and other environment-related functions. Pixel engine 1126 performs Z buffering and blending, and stores data into an on-chip frame buffer memory.
As shown in Fig. 1D, graphics pipeline 1116 may include an embedded DRAM memory 1126a that locally stores frame-buffer information. The on-chip frame buffer 1126a is periodically written out to off-chip main memory 1030 for access by on-chip display unit 1128. In one exemplary embodiment, pixel filter 50 can apply cartoon-outline border lines during this write-out process. The frame-buffer output that pixel engine 1126 of graphics pipeline 1116 ultimately stores in main memory 1030 is read frame by frame by display unit 1128. Display unit 1128 provides digital RGB pixel values for display on display 1009.
Some of the above-described video game system components could be structured differently from the home video console described above. In general, home video game software is written to run on a particular home video game system. The video game software can sometimes be "ported" to a different system, taking into account the differing hardware or software configurations. Such "porting" usually requires access to the video game software's source code. Another way to run game software on a system with a different configuration is to have the second system emulate the first. If the second system can successfully emulate or simulate the hardware and software resources of the first system, then the second system can successfully execute the binary executable image of the video game software without access to the game software's source code.
In the context of a home video game, an emulator is a system, different from the one for which the game programs were written, that is designed to allow those game programs to run. As one example, the emulator system may provide a hardware and/or software configuration (platform) that differs from the hardware and/or software configuration (platform) of the system the game software was written for. The emulator system includes software and/or hardware components that emulate the hardware and/or software components of the system the game software was written for. For example, the emulator system may comprise a general-purpose digital computer, such as a personal computer, that executes a software emulation program imitating the hardware and/or firmware of the system the game software was written for.
Emulation software may be developed that allows games written for the console-based home video game system described above in connection with Figs. 1-1C to be played on personal computers or other types of general-purpose digital computers. Some such general-purpose digital computers (e.g., IBM or Macintosh personal computers and compatibles) are now equipped with 3D graphics cards that provide 3D graphics pipelines compliant with DirectX or other standard 3D graphics command APIs. They are also equipped with stereo sound cards that provide high-quality stereo sound based on a standard set of sound commands. Such multimedia-hardware-equipped personal computers running emulation software may have sufficient performance to approximate the graphics and sound performance of the dedicated home video game console hardware configuration. The emulator software controls the hardware resources of the personal computer platform to simulate the processing, 3D graphics, sound, peripheral and other capabilities of the home video game console platform for which the game programmer wrote the game software.
Fig. 1D illustrates an example overall emulation process using a host platform 1201, an emulator component 1303, and a game software executable binary image provided on a storage medium 1305 such as a ROM or optical disk or other storage device. Host 1201 may be a general-purpose or special-purpose digital computing device such as a personal computer or another type of game platform. Emulator 1303 runs on host platform 1201, and provides real-time conversion of commands, data and other information from storage medium 1305 into a form that host 1201 can process. For example, emulator 1303 fetches program instructions intended for execution by the home video game platform of Figs. 1-1C from storage medium 1305 and converts these program instructions into a format that can be executed or processed by host 1201. As one example, where the game software is written for execution on a platform using a Z-80, MIPS, IBM PowerPC or other specific processor and host 1201 is a personal computer using a different (e.g., Intel) processor, emulator 1303 fetches one or a sequence of program instructions from storage medium 1305 and converts them into one or more equivalent Intel program instructions. Similarly, emulator 1303 fetches graphics commands and audio commands intended for processing by the graphics and audio coprocessor shown in Fig. 1, and converts these commands into a format that can be processed by the hardware and/or software graphics and audio processing resources available on host 1201. As one example, emulator 1303 may convert these commands into commands that can be processed by a specific graphics and/or sound card of host 1201 (e.g., using standard DirectX and sound APIs).
An emulator 1303 used to provide some or all of the features of the video game system described above may also be provided with a graphical user interface (GUI) that simplifies or automates the selection of the various options and screen modes for games run using the emulator. In one example, emulator 1303 may further include enhanced functionality as compared with the host platform for which the video game software was originally written.
Fig. 1E illustrates a personal-computer-based host 1201 suitable for use with emulator 1303. Personal computer system 1201 includes a processing unit 1203 and a system memory 1205. A system bus 1207 couples various system components, including system memory 1205, to processing unit 1203. System bus 1207 may be any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. System memory 1205 includes read-only memory (ROM) 1252 and random-access memory (RAM) 1254. A basic input/output system (BIOS) 1256, containing the basic routines that help transfer information between elements within personal computer system 1201, such as during start-up, is stored in ROM 1252. Personal computer system 1201 further includes various drives and associated computer-readable media. A hard disk drive 1209 reads from and writes to a (typically fixed) hard disk 1211; a magnetic disk drive 1213 reads from and writes to a removable floppy or other magnetic disk 1215; and an optical disk drive 1217 reads from and, in some configurations, writes to a removable optical disk 1219 such as a CD-ROM or other optical media. Hard disk drive 1209, magnetic disk drive 1213 and optical disk drive 1217 are connected to system bus 1207 by a hard disk drive interface 1221, a magnetic disk drive interface 1223 and an optical drive interface 1225, respectively. The drives and their associated computer-readable media provide nonvolatile storage of computer-readable instructions, data structures, program modules, game programs and other data for personal computer 1201. In other configurations, other types of computer-readable media that can store data accessible by a computer (e.g., magnetic cassettes, flash memory cards, digital video disks, Bernoulli cartridges, random-access memories (RAMs), read-only memories (ROMs) and the like) may also be used.
A number of program modules including emulator 1303 may be stored on hard disk 1211, removable magnetic disk 1215, optical disk 1219 and/or the ROM 1252 and/or RAM 1254 of system memory 1205. Such program modules may include an operating system providing graphics and sound APIs, one or more application programs, other program modules, program data and game data. A user may enter commands and information into personal computer system 1201 through input devices such as a keyboard 1227 and a pointing device 1229. Other input devices may include a microphone, joystick, game controller, satellite dish, scanner, or the like. These and other input devices are often connected to processing unit 1203 through a serial port interface 1231 coupled to system bus 1207, but may be connected by other interfaces, such as a parallel port, game port or universal serial bus (USB). A monitor or other type of display device is also connected to system bus 1207 via an interface, such as a video adapter 1235.
Personal computer 1201 may also include a modem 1154 or other means for establishing communications over a wide area network 1152, such as the Internet. The modem 1154, which may be internal or external, is connected to system bus 1207 via serial port interface 1231. A network interface 1156 may also be provided for allowing personal computer 1201 to communicate with a remote computing device 1150 (e.g., another personal computer) via a local area network 1158 (or such communication may be via wide area network 1152 or another communications path, such as dial-up or other communications means). Personal computer system 1201 typically includes other peripheral output devices, such as printers and other standard peripheral devices.
In one example, video adapter 1235 may include a 3D graphics pipeline chip set providing fast 3D graphics rendering in response to 3D graphics commands issued based on a standard 3D graphics application programming interface such as Microsoft's DirectX. A set of stereo loudspeakers 1237 is also connected to system bus 1207 via a sound-generating interface such as a conventional "sound card" providing hardware and embedded software support for generating high-quality stereophonic sound based on sound commands provided over bus 1207. These hardware capabilities allow host 1201 to provide sufficient graphics and sound performance to play the video games stored on storage medium 1305.
Example Non-Photorealistic Cartoon Outlining Technique in Accordance with the Invention
Cartoon outlining/border-line processing in accordance with the present invention can readily be implemented in hardware and/or software. For example, given sufficient time, processor 1010 can perform pixel post-processing to blend border-line colors into color frame buffer 1042 by accessing frame buffer 1040 after graphics coprocessor 1020 has rendered an image into it but before the image is displayed. Alternatively, hardware and/or software within graphics coprocessor 1020 can perform this function. As described above, coprocessor 1020 can use hardware support within its graphics pipeline, such as alpha blending and Z-buffer compare circuitry, to perform the image post-processing/filtering that provides border lines around object edges. As still another alternative, processor 1010 and/or display unit 1128 can perform the pixel filtering based on the contents of frame buffer 1040. Once coprocessor 1020 has performed such pixel post-processing, the modified color values can be written back into frame buffer memory 1030, whose contents the coprocessor's video generation line accesses to create the visual image on display 1009, or the modified values can be sent elsewhere (e.g., directly to display 1009). To increase transfer efficiency, the frame buffer 1040 data can be double-buffered and DMA can be used. If one line is read and processed at a time, a line buffer can be used to retain the Z values of the previous line of frame buffer 1040.
Fig. 2 is a flowchart of an example pixel filter routine 100 in accordance with the preferred embodiment; this routine provides a pixel post-processing border line rendering filter 50. The pixel filter routine 100 of Fig. 2 can be performed (block 102) for each pixel [X][Y] stored in frame buffer 1040. Example pixel filter 100 reads the pixel's color values (Pixel.R[X][Y], Pixel.G[X][Y], Pixel.B[X][Y]) and also reads the pixel's depth value Z[X][Y] (block 104). In addition, the example pixel filter routine reads the depth values of pixels neighboring pixel [X][Y] (see also block 104). In this particular example, pixel filter routine 100 reads the depth values of two adjacent pixels: the depth value Z[X-1][Y] of the pixel to the "left" of the given pixel in the X-Y array, and the depth value Z[X][Y-1] of the pixel "below" the given pixel in the X-Y array. See Fig. 2A; if frame buffer 1040 is read out and processed one line at a time, a line buffer is used to save the Z values of the preceding line.
By determining the difference Dz between the depth value Z[X][Y] of a pixel and the depth values Z of one or more neighboring (here, adjacent) pixels, pixel filter 100 decides whether pixel [X][Y] is at the edge of an object. An example calculation of Dz is as follows (see block 106):
DzDx = |Z[X][Y] - Z[X-1][Y]|
DzDy = |Z[X][Y] - Z[X][Y-1]|
Dz = max(DzDx, DzDy)
In performing the above calculation, pixel filter 100 determines the difference between the pixel's depth and each of the depths of the two adjacent pixels; takes the absolute values of those differences (this is done because one object may be either nearer or farther than another); and selects the larger of the two resulting magnitudes. The result is a distance value Dz that measures how far, in the Z direction, the pixel is from its neighboring pixels. This calculation can be used to test whether the pixel is at the edge of an object. For example, when the pixel is at an object edge, Dz will usually be large (because neighboring pixels have very different depths), and when the pixel is not at an object edge, Dz will usually be small (because neighboring pixels have similar depths).
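As a rough illustration of the block 104-106 logic (a sketch, not the patent's implementation; the tiny `z` array and its depth values are invented for the example):

```python
def depth_distance(z, x, y):
    """Edge measure for pixel [x][y]: the larger absolute depth
    difference versus its 'left' and 'below' neighbors
    (blocks 104-106 of the Fig. 2 flowchart)."""
    dzdx = abs(z[x][y] - z[x - 1][y])  # horizontal depth step
    dzdy = abs(z[x][y] - z[x][y - 1])  # vertical depth step
    return max(dzdx, dzdy)

# Tiny depth map: left half near (z = 10), right half far (z = 200).
z = [[10, 10], [10, 10], [200, 200], [200, 200]]  # indexed z[x][y]
print(depth_distance(z, 1, 1))  # interior pixel: small Dz
print(depth_distance(z, 2, 1))  # object-edge pixel: large Dz
```

An interior pixel whose neighbors share its depth yields Dz = 0, while the pixel straddling the depth discontinuity yields a large Dz, which is exactly the property the filter exploits.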
In another example, the distance value might be computed by adding the two neighboring distance magnitudes:

Dz = DzDx + DzDy

Other variations are also possible (for example, different calculations using different neighboring pixels, or different numbers of neighboring pixels, etc.).
In this example, pixel filter 100 provides a color change proportional to the Dz distance value computed above. In one example, pixel filter 100 modifies the final Dz distance value using scale/correction coefficients and clamps the result to obtain a pixel blend factor, as follows (see block 108):
Alpha = Clamp((Scale Coefficient * Dz + Base Coefficient), 0, 1)

Assuming the scale coefficient is positive, when Dz is large the resulting alpha value will be large (e.g., approaching or clamped to 1), and when Dz is small the resulting alpha value will be small (e.g., approaching or clamped to 0). Different specific calculations can be used to obtain the pixel blend factor. For example, another way to calculate the pixel blend factor is as follows:
Gray = Clamp((Dz - Coefficient A) * Coefficient B, 0, 1)
In this example, pixel filter 100 uses the final pixel blend factor as an alpha value in a blending operation that selectively blends the pixel color values obtained from color frame buffer 1042 (Pixel.R[X][Y], Pixel.G[X][Y], Pixel.B[X][Y]) with a predetermined border line color (e.g., Line.R, Line.G, Line.B) (see block 110). An example calculation for performing this blend is:
New Pixel.R[X][Y] = Old Pixel.R[X][Y] * (1 - Alpha) + Line.R * Alpha
New Pixel.G[X][Y] = Old Pixel.G[X][Y] * (1 - Alpha) + Line.G * Alpha
New Pixel.B[X][Y] = Old Pixel.B[X][Y] * (1 - Alpha) + Line.B * Alpha
According to these formulas, when the distance Dz is small, the blended ("New") pixel color value will be predominantly the original pixel color value, and when the distance Dz is large, it will be predominantly the border line color.
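The block 108-110 computations can be sketched together as follows (a sketch only; the particular scale/base coefficient values and the black line color are illustrative choices, not values from the patent):

```python
def clamp(v, lo, hi):
    return max(lo, min(v, hi))

def blend_border(pixel_rgb, dz, line_rgb=(0, 0, 0),
                 scale=0.05, base=0.0):
    """Derive a blend factor from the depth distance Dz (block 108),
    then mix the border line color into the pixel (block 110)."""
    alpha = clamp(scale * dz + base, 0.0, 1.0)
    return tuple(old * (1 - alpha) + line * alpha
                 for old, line in zip(pixel_rgb, line_rgb))

print(blend_border((255, 255, 255), dz=0))    # flat surface: color unchanged
print(blend_border((255, 255, 255), dz=190))  # strong edge: border line color
```

With a small Dz the alpha clamps toward 0 and the original color dominates; with a large Dz it clamps toward 1 and the line color dominates, matching the New/Old formulas above.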
If the border line is constrained to be black (i.e., Line.R = 0, Line.G = 0, Line.B = 0) and/or only a white canvas color (Line.R = 255, Line.G = 255, Line.B = 255) is used, the blending described above can be simplified (providing, for example, further optimization). Better optimizations may also be obtained using a non-RGB color format such as YUV (in that case, the luminance Y needs to be controlled to ensure the border line is drawn clearly visible).
After blending, the blended pixels are buffered (block 112) and then displayed on display device 1009. In one example, pixel filter 100 is performed on all pixels in frame buffer 1040, and the resulting blended frame buffer contents are then displayed on display 1009.
In some cases, perspective transformation can distort the results provided by the above process. Consider the situation shown in Fig. 3A, which includes a distant pixel PIXA and a nearby pixel PIXB. Because of the perspective transformation, the DzDy of distant pixel PIXA is greater than the DzDy of nearby pixel PIXB. This makes it more difficult to detect border lines accurately across the entire image using the raw DzDx and DzDy values. To overcome this problem, it is useful to correct the Z values according to the Z scale. One example correction uses a log2(n) function to perform the scale (range) conversion:
DzDx = |log2(Z[X][Y]) - log2(Z[X-1][Y])|
DzDy = |log2(Z[X][Y]) - log2(Z[X][Y-1])|
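The point of the log2 conversion can be seen in a short sketch (illustrative values only; the patent does not prescribe these depth maps): a given depth ratio produces the same Dz whether the surfaces are near or far from the camera, which is what makes a single threshold usable across the whole image.

```python
import math

def log_depth_distance(z, x, y):
    """Perspective-compensated variant of the Dz test: compare
    log2 of the Z values so a given depth *ratio* yields the same
    Dz at any distance (addresses the Fig. 3A problem)."""
    dzdx = abs(math.log2(z[x][y]) - math.log2(z[x - 1][y]))
    dzdy = abs(math.log2(z[x][y]) - math.log2(z[x][y - 1]))
    return max(dzdx, dzdy)

# The same 2:1 depth step, once near the camera and once far away:
near = [[10, 10], [20, 20]]        # indexed z[x][y]
far = [[1000, 1000], [2000, 2000]]
print(log_depth_distance(near, 1, 1))  # same Dz near...
print(log_depth_distance(far, 1, 1))   # ...and far: ratio, not raw gap
```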
If processor 1010 has floating-point capability and the Z values are stored in floating-point format, the log2(n) function is very convenient to compute. The floating-point format can be represented as follows:

[3-bit exponent][11-bit mantissa]

Fig. 3B illustrates a modification of the Fig. 2 flowchart to include the log2(n) scale conversion.
Fig. 4 illustrates another example post-processing pixel filter 200 that provides depth modulation of the alpha value used for blending. Blocks 204, 206 and 208 of Fig. 4 are the same as the corresponding blocks 104, 106 and 108 of Fig. 2 (or of Fig. 3B if the scale conversion is used). An additional block 210 of Fig. 4 calculates an alpha value as a function (for example, the product) of the value calculated in block 208 and another value that is a function of the pixel depth Z, for example:
Alpha = AlphaDz * AlphaZ

The final alpha value thus also depends on the depth of the pixel and, as described above in connection with Fig. 2, is used to control the blending process (see block 212).
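The Fig. 4 idea can be sketched as below (a sketch under stated assumptions: the linear depth falloff in `alpha_z`, the `z_max` range and the scale coefficient are invented for illustration, not the patent's formulas):

```python
def clamp(v, lo, hi):
    return max(lo, min(v, hi))

def modulated_alpha(dz, z, z_max=1000.0, scale=0.05):
    """Block 210 sketch: the final blend factor is the product of a
    Dz-derived term and a depth-derived term, so border lines fade
    out on distant pixels."""
    alpha_dz = clamp(scale * dz, 0.0, 1.0)
    alpha_z = clamp(1.0 - z / z_max, 0.0, 1.0)  # nearer pixel: stronger line
    return alpha_dz * alpha_z

print(modulated_alpha(dz=190, z=0))    # near edge: full-strength line
print(modulated_alpha(dz=190, z=900))  # far edge: faded line
```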
Figs. 5A-5F show example screens of available border line effects. Fig. 5A shows an example landscape scene without the border line effect. Fig. 5B shows the resulting border lines blended onto a white canvas color, providing an effective line drawing of the Fig. 5A landscape scene. Fig. 5C shows the Fig. 5A landscape with a black border line color blended into the scene.

Fig. 5D illustrates an example fantasy adventure game landscape. Fig. 5E illustrates the resulting border lines blended onto a white canvas color. Fig. 5F illustrates the Fig. 5D landscape blended with black border lines.
As discussed above, pixel post-processing filter 100 can be implemented in many ways. The following is an example computer source code routine for a pixel post-processing filter:
cfb_PixelProc_1:
        add    sys0, cptr, cbuf_width
        sub    sys1, sys0, cbuf_size
        lqv    depth00[0], 0(sys0)
        lsv    depth10[0], 32+14(zero)
        lqv    depth10[2], 0(sys0)
        lqv    depth01[0], 0(sys1)
        lqv    color00[0], 0(cptr)
        sqv    depth00[0], 0(sys1)
        sqv    depth00[0], 32(zero)
        vnxor  depth00, depth00, _0x7fff
        vnxor  depth10, depth10, _0x7fff
        vnxor  depth01, depth01, _0x7fff
        vsub   depthDX, depth00, depth10  #X
        vsub   depthDY, depth00, depth01  #Y
        vabs   depthDX, depthDX, depthDX
        vabs   depthDY, depthDY, depthDY
#ifdef FILTER_2
        vor    color00, vzero, _0xffff
#endif
        vadd   depthDX, depthDX, depthDY
        vadd   depthDX, depthDX, Coeff_A
        vmudh  depthDX, depthDX, Coeff_B
        vge    depthDX, depthDX, _0x0000
        vmudl  color0r, color00, _0x0020
        vand   color0g, color00, v0x07c0
        vand   color0b, color00, v0x003e
        vmulf  color0r, color0r, depthDX
        vmulf  color0g, color0g, depthDX
        vmulf  color0b, color0b, depthDX
        vand   color00, color0g, v0x07c0
        vmadn  color00, color0b, _0x0001
        vmadn  color00, color0r, v0x0800
        vor    color00, color00, _0x0001
        addi   cptr, cptr, 16
        bne    cptr, cptr_end, cfb_PixelProc_1
        sqv    color00[0], -16(cptr)
        jr     return_sv
        nop
Another Cartoon Outlining Embodiment
Depending on the specific calculations employed, the border line application algorithm described above may not provide completely acceptable results in some cartoon outlining situations. Fig. 6 shows one such example. The example cartoon character 300 of Fig. 6 has border lines applied to its silhouette edges 302 as described above. As Fig. 6 illustrates, this cartoon character also has a right hand, wrist and partial forearm held in front of the character. In some situations (depending, for example, on how far the character's arm is positioned from the character's torso), the above techniques treat the edges around the hand, wrist and forearm portions as internal edges rather than silhouette edges, and therefore do not apply border lines to those edges. Fig. 6 illustrates that if cartoon outlining is applied only to the character's silhouette edges 302, some portions of cartoon character 300 may disappear or become indistinct; yet a viewer (based on experience with coloring books, hand-drawn animated cartoons and/or comic books) expects border lines to be applied to delineate the hand, wrist and forearm.
To make cartoon character 300 look hand-drawn, it is desirable also to apply border lines to certain internal edges 304, in this example the internal edges delineating the character's hand, wrist and forearm held in front of the character itself. Fig. 7 shows character 300 with border lines applied to these internal edges 304. If character 300 raised its arm to an outstretched position, these internal edges 304 would become silhouette edges; they are internal edges, however, when the arm is positioned as shown in Fig. 7.
In accordance with another aspect provided by the invention, cartoon outline lines are automatically applied to certain internal edges, as shown in Fig. 7. In more detail, pixels representing different parts of an object are assigned different identification values. As one example, the identification values can be assigned using bits in frame buffer 1040 and/or embedded DRAM 1126a that are normally used to encode alpha information. The assigned identification values can be used to determine whether a border line should be drawn at a given pixel location. For example, the system can compare the identification value of a pixel with the identification values of neighboring (e.g., adjacent) pixels. If the identification values of two adjacent pixels have a certain predetermined relationship, no border line is drawn. For example, if the identification values are the same, the two pixels are on the same surface and no border line is drawn. However, if the identification values of two adjacent pixels have a certain other predetermined relationship, a border line is drawn.
Figs. 8A-8C illustrate an example. Fig. 8A shows a perspective view of an object 319 comprising three object portions 320, 322, 324. Fig. 8B shows a plan view of the same object 319. Object portion 320 is a square, object portion 322 is a circle, and object portion 324 is a cone. Suppose the graphics artist wants to draw a border line 330 (see Fig. 8A) where cone 324 intersects square 320, but not where the cone intersects the circle 322 or where the circle intersects the square. In this example, the pixels within square 320, circle 322 and cone 324 are each encoded with different identification values. For example, pixels within square 320 are encoded with identification value "1"; pixels within circle 322 are encoded with "2"; and pixels within cone 324 are encoded with "3". Fig. 8C shows the alpha portion of frame buffer 1040 and/or 1126a storing this example encoding (the cross-hatched cells indicate the cells to which the border line color is applied).
During the pixel post-processing phase, the various identification values within the frame buffer are tested. No border line is drawn for pixels having the same identification value as adjacent pixels (all such pixels are on the same surface). If a pixel's identification value differs from that of an adjacent pixel within a predetermined criterion or set of criteria, no border line is drawn either (for example, if the difference between the identification value of pixel K and the identification value of pixel K+1 is less than 2, no border line is drawn). However, if a pixel's identification value differs from that of an adjacent pixel by another criterion or set of criteria, a border line is drawn (for example, if the difference between the identification value of pixel K and the identification value of pixel K+1 is 2 or more, a border line is drawn at pixel K).
Fig. 9 is a flowchart of an example pixel post-processing routine that draws border lines in accordance with this embodiment. Routine 350 includes a loop (blocks 352-362) that processes each pixel [i][j] of the image in frame buffer 1040 and/or 1126a, one pixel at a time. As discussed above, the image generation process can, as part of rendering the image into the frame buffer, set identification values for each visually distinct part of an object in the frame buffer locations normally allocated to store alpha values. Routine 350 tests these alpha (now ID) values to determine whether to draw border lines. In this example, routine 350 retrieves the alpha (ID) value of pixel [i][j] and the alpha (ID) values of adjacent pixels (i.e., pixel[i-1][j] and pixel[i][j-1]) (block 352). Routine 350 then performs the following calculations (block 354) to determine the differences between the alpha (ID) value of pixel [i][j] and the alpha (ID) values of the adjacent pixels:
diffX = ABS(Alpha[i][j] - Alpha[i-1][j])
diffY = ABS(Alpha[i][j] - Alpha[i][j-1])

Routine 350 then tests the resulting calculated differences diffX and diffY to determine whether either exceeds a predetermined difference (for example, an arbitrary fixed or programmable threshold such as 1) (block 356). If at least one difference exceeds the predetermined difference, routine 350 sets the color of pixel [i][j] to the border line color (block 358). Thus, in this particular example, pixels are considered to be on the same surface when the slope of the alpha values is between -1 and +1. Steps 352-358 are repeated for each pixel in the image (blocks 360, 362).
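A rough Python sketch of the routine 350 loop follows (not the patent's code; the frame buffer layout, the black border color, and the skipping of row/column 0 are illustrative assumptions):

```python
BORDER = (0, 0, 0)  # assumed black border line color (illustrative)

def id_outline_pass(color, alpha_id, threshold=1):
    """Sketch of routine 350 (blocks 352-362): compare each pixel's
    alpha (ID) value with its two neighbors; where either difference
    exceeds the threshold, set the pixel to the border color."""
    for i in range(1, len(alpha_id)):
        for j in range(1, len(alpha_id[0])):
            diff_x = abs(alpha_id[i][j] - alpha_id[i - 1][j])  # block 354
            diff_y = abs(alpha_id[i][j] - alpha_id[i][j - 1])
            if diff_x > threshold or diff_y > threshold:       # block 356
                color[i][j] = BORDER                           # block 358
    return color

# Two rows of IDs: parts "1" and "2" abut (difference 1: no line),
# parts "2" and "7" abut (difference 5: line drawn).
ids = [[1, 1, 2, 7],
       [1, 1, 2, 7]]
pix = [[(255, 255, 255) for _ in row] for row in ids]
id_outline_pass(pix, ids)
```

Only the pixel at the seam where the ID jumps by more than the threshold is overwritten with the border color; the seam between IDs 1 and 2 is left unlined, matching the block 356 criterion.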
In a variation on routine 350, certain objects can be encoded with a special alpha identification value (for example, 0x00) that defines the pixels of those objects to be ignored when drawing border lines (see Fig. 9A). This is helpful, for example, for drawing a non-outlined object as a bitmap (such as one used in an explosion animation).
Fig. 10 illustrates how routine 350 described above can be used to efficiently draw border lines on the object 300 displayed in Figs. 6 and 7. In this example, the different parts of object 300 are encoded with different alpha (ID) values. For example, object 300 can comprise two arms 311a, 311b and a torso 309. Each arm can comprise a hand 313, a wrist 315, a forearm portion 312, an elbow portion 308, an upper arm portion 310 and a shoulder portion 317. Each of these different parts can be encoded with a different alpha ID, as follows:
Body part               Alpha ID
Left hand 313a              1
Left wrist 315a             2
Left forearm 312a           3
Left elbow 308a             4
Left upper arm 310a         5
Left shoulder 317a          6
Torso 309                   7
Right shoulder 317b         8
Right upper arm 310b        9
Right elbow 308b           10
Right forearm 312b         11
Right wrist 315b           12
Right hand 313b            13
With the example alpha IDs encoded as above, routine 350 will draw the border lines shown as solid black lines in Fig. 10, but will not draw border lines at the other intersections between the object's parts (shown as dotted lines).
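Applying the block 356 test (ID difference greater than 1) to the table above shows which part-to-part seams receive a line; the named pairings below are illustrative readings of Fig. 10, and the `PART_ID` mapping simply restates the table:

```python
PART_ID = {"left hand": 1, "left wrist": 2, "left forearm": 3,
           "left elbow": 4, "left upper arm": 5, "left shoulder": 6,
           "torso": 7, "right shoulder": 8, "right upper arm": 9,
           "right elbow": 10, "right forearm": 11, "right wrist": 12,
           "right hand": 13}

def seam_outlined(part_a, part_b, threshold=1):
    """A seam between two parts is drawn only when their alpha IDs
    differ by more than the threshold (block 356 criterion)."""
    return abs(PART_ID[part_a] - PART_ID[part_b]) > threshold

print(seam_outlined("left wrist", "left forearm"))  # consecutive IDs: no line
print(seam_outlined("right hand", "torso"))         # hand over torso: line
```

Because connected limb segments are assigned consecutive IDs, their shared seams stay unlined, while a hand crossing in front of the torso (IDs 13 and 7) triggers the border line, which is the Fig. 6/Fig. 7 behavior the encoding is designed to produce.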
The encoding described above can also be used to apply border lines to intersections between connected portions of the same object 300. Traditional coloring books, hand-drawn animated cartoons and the like sometimes apply cartoon outlining at such self-intersection locations, so as to better define the joints, suggest musculature, and so on. For example, Figs. 11A-11C show a close-up of the joint 308 (i.e., the elbow) connecting character 300's upper arm 310 and forearm 312. Using the encoding above, when joint 308 bends so that limbs 310 and 312 meet in the orientations shown in Figs. 11B and 11C, routine 350 will apply a border line segment 316 to the intersection between body portions 310 and 312 (because the difference between the alpha ID of forearm 312 and the alpha ID of upper arm 310 is greater than 1).
The present invention thus provides efficient techniques for adding non-photorealistic effects such as cartoon outline lines in a 3D computer graphics system. These techniques can be applied at the pixel post-processing phase (i.e., after the 3D graphics pipeline has rendered an image into the frame buffer), and require no pixel information beyond what the 3D graphics pipeline already provides (i.e., color, depth and alpha pixel information). Accordingly, the pixel post-processing techniques provided by the invention can be applied efficiently and effectively "downstream" of the frame buffer (for example, as part of an on-chip-buffer to off-chip-buffer copy operation, an anti-aliasing operation, or another post-processing operation), without significantly complicating other parts of the 3D graphics pipeline or system.
While the invention has been described in connection with what is presently considered to be the most practical and preferred embodiment, it is to be understood that the invention is not limited to the disclosed embodiment, but, on the contrary, is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims.

Claims (67)

1. An interactive 3D home video game system, characterized in that it comprises:
at least one manually operable control device that provides user-manipulated real-time inputs;
a storage medium storing 3D data representing at least one cartoon character;
a buffer memory that stores image data;
a 3D graphics pipeline coupled to said buffer memory, said 3D graphics pipeline providing image data corresponding to said cartoon character to said buffer memory based at least on said user-manipulated real-time inputs and said 3D data representing said cartoon character; and
a filter coupled to said buffer memory, said filter applying a cartoon outlining function to said image data so as to automatically draw distinct border lines around said cartoon character.
2. The 3D home video game system as claimed in claim 1, characterized in that said graphics pipeline and said buffer memory are disposed on a single semiconductor chip, and said filter operates on said pixel data in the course of writing said pixel data from said semiconductor chip to an external device.
3. The 3D home video game system as claimed in claim 1, characterized in that said image data comprises pixel color data, and said filter replaces said pixel color data with border line color data.
4. The 3D home video game system as claimed in claim 1, characterized in that said filter selectively modifies color data stored in said buffer memory corresponding to at least one pixel, based at least in part on depth data corresponding to at least said pixel.
5. The 3D home video game system as claimed in claim 1, characterized in that said filter selectively modifies the color data of a pixel correspondingly stored in said buffer memory, based at least in part on depth data corresponding to at least said pixel and its surroundings.
6. The 3D home video game system as claimed in claim 1, characterized in that said filter determines the difference between the depth data corresponding to said pixel and the depth data corresponding to at least one pixel neighboring said pixel, and modifies the color data corresponding to said pixel if said determination reveals that said difference exceeds a predetermined threshold.
7. The 3D home video game system as claimed in claim 1, characterized in that said filter selectively modifies said color data based at least in part on object part ID values stored in said buffer memory.
8. The 3D home video game system as claimed in claim 7, characterized in that said buffer memory is allocated to store alpha values associated with a plurality of pixels, and said object part ID values are stored in said buffer memory as alpha values.
9. The 3D home video game system as claimed in claim 1, characterized in that said filter determines the difference between identification data corresponding to at least one pixel and identification data corresponding to at least one pixel adjacent to said pixel, and modifies said color data corresponding to said pixel if said determination reveals that said difference exceeds a predetermined threshold.
10. The 3D home video game system as claimed in claim 1, characterized in that said filter alpha blends the color data stored in said buffer memory.
11. The 3D home video game system as claimed in claim 10, characterized in that said filter selectively modifies the color data stored in said buffer memory by setting an alpha value corresponding to said pixel to a value corresponding to the depth of said pixel.
12. The 3D home video game system as claimed in claim 11, characterized in that said filter selectively modifies the color data stored in said buffer memory by setting an alpha value corresponding to at least one pixel to a value corresponding to the difference between the depth of said pixel and the depth of at least one neighboring pixel.
13. The 3D home video game system as claimed in claim 12, characterized in that said neighboring pixel is an adjacent pixel.
14. The 3D home video game system as claimed in claim 1, characterized in that said pixel in said image is designated P[X, Y], and said filter compares the depth of said pixel with the depths of a first pixel designated P[X-1, Y] and a second pixel designated P[X, Y-1].
15. The 3D home video game system as claimed in claim 1, characterized in that said pixel in said image is designated P[X, Y], and said filter compares the depth of said pixel with the depths of a first pixel designated P[X+1, Y] and a second pixel designated P[X, Y+1].
16. The 3D home video game system as claimed in claim 1, characterized in that said filter calculates the difference between a first logarithmic value of the depth of said pixel and a second logarithmic value of the depth of at least one neighboring pixel.
17. The 3D home video game system as claimed in claim 1, characterized in that said filter depth modulates the application of said cartoon outlining.
18. The 3D home video game system as claimed in claim 1, characterized in that said filter perspective transforms said cartoon outlining.
19. In a computer graphics system of the type including a pixel memory storing color and depth data corresponding to the pixels of a rendered 3D image, a method of producing a non-photorealistic video effect, characterized in that it comprises the steps of:
(a) reading at least said color data for at least one said pixel from said pixel memory; and
(b) selectively modifying said color data so as to provide a cartoon outlining effect for said rendered image.
20. The method as claimed in claim 19, characterized in that said pixel memory is disposed on a graphics chip, and said reading step (a) is performed as part of a pixel post-processing step of writing said rendered image out from said graphics chip.
21. The method as claimed in claim 19, characterized in that said modifying step includes the step of selectively replacing the color data of said pixel with border line color data.
22. The method as claimed in claim 19, characterized in that said selectively modifying step is performed based at least in part on said depth data corresponding to said pixel.
23. The method as claimed in claim 19, characterized in that said selectively modifying step is performed based at least in part on the depths corresponding to said pixel and its surroundings.
24. The method as claimed in claim 19, characterized in that said selectively modifying step includes determining the difference between the depth data corresponding to said pixel and the depth data corresponding to at least one pixel neighboring said pixel, and modifying the color data corresponding to said pixel if said determination reveals that said difference exceeds a predetermined threshold.
25. The method as claimed in claim 19, characterized in that said selectively modifying step is performed based at least in part on an object part ID value stored in said pixel memory.
26. The method as claimed in claim 25, characterized in that said pixel memory is configured to store alpha values associated with said pixels, and said object part ID value is stored in said pixel memory as an alpha value.
27. The method as claimed in claim 19, characterized in that said selectively modifying step includes determining the difference between identification data corresponding to said pixel and identification data corresponding to at least one pixel adjacent to said pixel, and modifying said color data corresponding to said pixel if said determining step reveals that said difference exceeds a predetermined threshold.
28. The method as claimed in claim 19, characterized in that said selectively modifying step includes using an alpha value to blend a border line into said pixel color data.
29. The method as claimed in claim 28, characterized in that said selectively modifying step includes setting an alpha value corresponding to said pixel to a value corresponding to the depth of said pixel.
30. The method as claimed in claim 29, characterized in that said selectively modifying step includes setting the alpha value corresponding to said pixel to a value corresponding to the difference between the depth of said pixel and the depth of at least one neighboring pixel.
31. The method as claimed in claim 30, characterized in that said neighboring pixel is an adjacent pixel.
32. The method as claimed in claim 19, characterized in that said pixel in said image is designated P[X, Y], and said selectively modifying step includes comparing the depth of said pixel with the depths of a first pixel designated P[X-1, Y] and a second pixel designated P[X, Y-1].
33. The method as claimed in claim 19, characterized in that said pixel in said image is designated P[X, Y], and said selectively modifying step includes comparing the depth of said pixel with the depths of a first pixel designated P[X+1, Y] and a second pixel designated P[X, Y+1].
34. The method as claimed in claim 19, characterized in that said selectively modifying step includes calculating the difference between a first logarithmic value of the depth of said pixel and a second logarithmic value of the depth of at least one adjacent pixel.
35. A computer graphics system, characterized in that it comprises:
a graphics engine that renders a 3D image into a pixel memory storing color and depth data corresponding to the pixels of said 3D image; and
a pixel filter coupled to said pixel memory, said pixel filter reading at least said color data of at least one said pixel from said pixel memory and selectively modifying said color data to provide a cartoon outlining effect for said rendered image.
36. system as claimed in claim 35, it is characterized in that, described image engine and described pixel store are arranged on the common substrate, described picture system comprises another pixel store beyond the described substrate, and described pixel filtrator is write described another pixel store with the described image that provides.
37. system as claimed in claim 35 is characterized in that, described pixel filtrator substitutes the boundary line color data color data of described pixel.
38. system as claimed in claim 35 is characterized in that, described pixel filtrator is at least in part according to selectively revising described color data corresponding to the depth data of described pixel.
39. system as claimed in claim 35 is characterized in that, described pixel filtrator is at least in part according to selectively revising described color data corresponding to described pixel and described depth data on every side.
40. system as claimed in claim 35 is characterized in that, described pixel filtrator is determined the difference between the depth data of at least one pixel of the depth data of corresponding described pixel and corresponding contiguous described pixel; And if described definite announcement, described difference surpasses predetermined threshold value, then revises the color data corresponding to described pixel.
41. The system of claim 35, wherein said pixel filter selectively modifies said color data based at least in part on an object part ID value stored in said pixel memory.
42. The system of claim 41, wherein said pixel memory is configured to store alpha values associated with said pixels, and said object part ID value is stored in said pixel memory as an alpha value.
43. The system of claim 35, wherein said pixel filter determines a difference between identification data corresponding to said pixel and identification data corresponding to at least one pixel adjacent to said pixel, and modifies the color data corresponding to said pixel if said determination reveals that said difference exceeds a predetermined threshold.
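For illustration only: claims 41–43 recite detecting borders from object-part ID values (stored in the alpha channel per claim 42) rather than from depth. A hypothetical sketch, with the threshold defaulting to zero so that any ID mismatch marks an edge:

```python
def id_edge(pixel_id, neighbor_ids, threshold=0):
    """Claims 41-43: the alpha channel is reused to hold an
    object-part ID.  An edge is flagged wherever a pixel's stored ID
    differs from a neighbor's ID by more than the threshold -- this
    catches interior boundaries (e.g. an arm in front of the torso)
    that a depth test can miss when the two surfaces nearly touch."""
    return any(abs(pixel_id - nid) > threshold for nid in neighbor_ids)
```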
44. The system of claim 35, wherein said pixel filter selectively modifies said color data by alpha blending.
45. The system of claim 44, wherein said pixel filter selectively modifies said color data by setting an alpha value corresponding to said pixel to a value corresponding to the depth of said pixel.
46. The system of claim 45, wherein said pixel filter selectively modifies said color data by setting the alpha value corresponding to said pixel to a value corresponding to a difference between the depth of said pixel and the depth of at least one neighboring pixel.
47. The system of claim 46, wherein said neighboring pixel is an adjoining pixel.
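For illustration only: claims 44–47 recite alpha blending rather than hard replacement, with the alpha value derived from a depth difference. A minimal sketch; the linear ramp and the `max_distance` normalization are our assumptions, not recited in the claims:

```python
def blend_border(color, border_color, depth_distance, max_distance):
    """Claims 44-47: rather than hard-replacing the pixel, the depth
    'distance' to a neighboring pixel is treated as the pixel's alpha
    value and used as a blend factor, so border lines fade in
    smoothly where the depth discontinuity is small."""
    alpha = min(depth_distance / max_distance, 1.0)  # clamp to [0, 1]
    return tuple(round((1.0 - alpha) * c + alpha * b)
                 for c, b in zip(color, border_color))
```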
48. The system of claim 35, wherein a pixel in said image is designated P[X, Y], and said filter compares the depth of said pixel with the depths of a first pixel designated P[X-1, Y] and a second pixel designated P[X, Y-1].
49. The system of claim 35, wherein a pixel in said image is designated P[X, Y], and said filter compares the depth of said pixel with the depths of a first pixel designated P[X+1, Y] and a second pixel designated P[X, Y+1].
50. The system of claim 35, wherein said pixel filter calculates a difference between a first logarithmic value of the depth of said pixel and a second logarithmic value of the depth of at least one adjoining pixel.
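For illustration only: the neighbor pattern of claim 48 (compare P[X, Y] only against P[X-1, Y] and P[X, Y-1]) allows one left-to-right, top-to-bottom sweep over the pixel memory to find every edge without revisiting pixels. A sketch on a nested-list image; the hard color replacement here is a simplification of the blending recited in claims 44–47:

```python
def cartoon_outline(colors, depths, threshold, border=(0, 0, 0)):
    """Single-pass outline filter using the claim-48 neighbor
    pattern: each pixel is tested only against its left and upper
    neighbors, so every depth discontinuity is examined exactly
    once during one scanline-order sweep."""
    height, width = len(depths), len(depths[0])
    out = [row[:] for row in colors]  # leave the input image untouched
    for y in range(height):
        for x in range(width):
            for nx, ny in ((x - 1, y), (x, y - 1)):  # P[x-1,y], P[x,y-1]
                if nx >= 0 and ny >= 0 and \
                        abs(depths[y][x] - depths[ny][nx]) > threshold:
                    out[y][x] = border
                    break
    return out
```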
51. An interactive 3D home video game system comprising:
at least one manually operable control device providing real-time inputs manipulated by a user;
a storage medium storing 3D data representing at least one cartoon character;
a frame buffer memory storing image data;
a 3D graphics pipeline coupled to said frame buffer memory, said 3D graphics pipeline providing image data corresponding to said cartoon character to said frame buffer memory based at least on said user-manipulated real-time inputs and on said 3D data representing said cartoon character; and
a filter coupled to said frame buffer memory, said filter applying a cartoon outlining function to said image data so as to automatically draw a distinct border line around said cartoon character.
52. The home video game system of claim 51, wherein said graphics pipeline and said frame buffer memory are disposed on a common semiconductor chip; said system comprises a further pixel memory external to said chip; and said filter writes the resulting image to said further pixel memory.
53. The system of claim 51, wherein said filter replaces the color data of a pixel with border line color data.
54. The system of claim 51, wherein said filter selectively modifies color data corresponding to at least one pixel stored in said frame buffer memory based at least in part on depth data corresponding to said pixel.
55. The system of claim 51, wherein said filter selectively modifies color data corresponding to at least one pixel stored in said frame buffer memory based at least in part on depth data corresponding to said pixel and to surrounding pixels.
56. The system of claim 51, wherein said filter determines a difference between the depth data corresponding to said pixel and the depth data corresponding to at least one pixel neighboring said pixel, and modifies the color data corresponding to said pixel if said determination reveals that said difference exceeds a predetermined threshold.
57. The system of claim 51, wherein said filter selectively modifies said color data based at least in part on an object part ID value stored in said frame buffer memory.
58. The system of claim 57, wherein said frame buffer memory is configured to store alpha values associated with a plurality of pixels, and said object part ID values are stored in said frame buffer memory as alpha values.
59. The system of claim 51, wherein said filter determines a difference between identification data corresponding to at least one pixel and identification data corresponding to at least one pixel adjacent to said pixel, and modifies the color data corresponding to said pixel if said determination reveals that said difference exceeds a predetermined threshold.
60. The system of claim 51, wherein said filter alpha-blends color data stored in said frame buffer memory.
61. The system of claim 60, wherein said filter selectively modifies the color data stored in said frame buffer memory by setting an alpha value corresponding to said pixel to a value corresponding to the depth of said pixel.
62. The system of claim 61, wherein said filter selectively modifies the color data stored in said frame buffer memory by setting an alpha value corresponding to at least one pixel to a value corresponding to a difference between the depth of said pixel and the depth of at least one neighboring pixel.
63. The system of claim 62, wherein said neighboring pixel is an adjoining pixel.
64. The system of claim 51, wherein a pixel in said image is designated P[X, Y], and said filter compares the depth of said pixel with the depths of a first pixel designated P[X-1, Y] and a second pixel designated P[X, Y-1].
65. The system of claim 51, wherein a pixel in said image is designated P[X, Y], and said filter compares the depth of said pixel with the depths of a first pixel designated P[X+1, Y] and a second pixel designated P[X, Y+1].
66. The system of claim 51, wherein said filter calculates a difference between a first logarithmic value of the depth of said pixel and a second logarithmic value of the depth of at least one neighboring pixel.
67. An emulator emulating an interactive 3D home video game system, comprising:
at least one manually operable control device providing real-time inputs manipulated by a user;
a storage medium storing 3D data representing at least one cartoon character;
a frame buffer memory storing image data;
a 3D graphics pipeline coupled to said frame buffer memory, said 3D graphics pipeline providing image data corresponding to said cartoon character to said frame buffer memory based at least on said user-manipulated real-time inputs and on said 3D data representing said cartoon character; and
a filter coupled to said frame buffer memory, said filter applying a cartoon outlining function to said image data so as to automatically draw a distinct border line around said cartoon character.
CN00128840.7A 1999-09-24 2000-09-22 Method and device for providing cartoon outline in three-D video image system Pending CN1316723A (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US15566099P 1999-09-24 1999-09-24
US60/155,660 1999-09-24
US09/468,109 US6747642B1 (en) 1999-01-29 1999-12-21 Method and apparatus for providing non-photorealistic cartoon outlining within a 3D videographics system
US09/468,109 1999-12-21

Publications (1)

Publication Number Publication Date
CN1316723A true CN1316723A (en) 2001-10-10

Family

ID=26852501

Family Applications (1)

Application Number Title Priority Date Filing Date
CN00128840.7A Pending CN1316723A (en) 1999-09-24 2000-09-22 Method and device for providing cartoon outline in three-D video image system

Country Status (5)

Country Link
JP (2) JP4349733B2 (en)
CN (1) CN1316723A (en)
AU (1) AU5652700A (en)
BR (1) BR0004415A (en)
CA (1) CA2319279A1 (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100354892C (en) * 2004-06-29 2007-12-12 英特尔公司 Image edge filtering
CN101540055B (en) * 2009-04-13 2011-05-04 浙江大学 Cartoon stylization method facing online real-time application
CN101438319B (en) * 2006-05-08 2012-02-22 Ati技术无限责任公司 Advanced anti-aliasing with multiple graphics processing units
CN101390131B (en) * 2006-02-27 2013-03-13 皇家飞利浦电子股份有限公司 Rendering an output image
CN103198502A (en) * 2011-10-21 2013-07-10 富士胶片株式会社 Digital comic editor and method
CN109741408A (en) * 2018-11-23 2019-05-10 成都品果科技有限公司 A kind of image and video caricature effect real-time rendering method
CN111127614A (en) * 2019-12-25 2020-05-08 上海米哈游天命科技有限公司 Model stroke processing method and device, storage medium and terminal

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5236214B2 (en) * 2007-06-11 2013-07-17 任天堂株式会社 Image processing program
JP4291384B2 (en) 2007-08-23 2009-07-08 ファナック株式会社 Detection method of disconnection and power supply disconnection of IO unit connected to numerical controller
JP4847572B2 (en) * 2009-11-13 2011-12-28 株式会社コナミデジタルエンタテインメント Image processing apparatus, image processing apparatus control method, and program

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2603445B2 (en) * 1994-11-10 1997-04-23 インターナショナル・ビジネス・マシーンズ・コーポレイション Hair image adaptation method and computer system
JPH08279057A (en) * 1995-04-05 1996-10-22 Hitachi Ltd Emphasis display device for outline and ridge of three-dimensional graphic
JP3721623B2 (en) * 1995-12-29 2005-11-30 カシオ計算機株式会社 Drawing color changing method and moving picture reproducing apparatus
JPH09311954A (en) * 1996-05-22 1997-12-02 Hitachi Ltd Three-dimensional graphic display system and method

Also Published As

Publication number Publication date
CA2319279A1 (en) 2001-03-24
JP4672072B2 (en) 2011-04-20
JP2009199620A (en) 2009-09-03
AU5652700A (en) 2001-03-29
BR0004415A (en) 2001-04-10
JP2001134779A (en) 2001-05-18
JP4349733B2 (en) 2009-10-21

Similar Documents

Publication Publication Date Title
US6700586B1 (en) Low cost graphics with stitching processing hardware support for skeletal animation
TW479206B (en) Method and apparatus for providing non-photorealistic cartoon outlining within a 3D videographics system
JP4890638B2 (en) Method and apparatus for processing direct and indirect textures in a graphics system
US6392655B1 (en) Fine grain multi-pass for multiple texture rendering
JP3725524B2 (en) Method for generating computer display image and computer processing system and graphics processor for generating image data
CN100507832C (en) System and method for providing intermediate target in graphic system
US6580430B1 (en) Method and apparatus for providing improved fog effects in a graphics system
CN101281656B (en) Method and apparatus for mapping texture onto 3-dimensional object model
JP4672072B2 (en) Method and apparatus for providing non-realistic cartoon outline in 3D video graphics system
JP2002074390A (en) Shadow mapping in inexpensive graphics system
US7274365B1 (en) Graphical processing of object perimeter information
CN1317666C (en) System and method suitable for setting up real time shadow of transparent target
JP3369159B2 (en) Image drawing method, image drawing apparatus, recording medium, and program
Boeykens Unity for architectural visualization
US20070291045A1 (en) Multiple texture compositing
Herrlich A tool for landscape architecture based on computer game technology
CN1286454A (en) Method and apparatus for providing deep fuzzy effect in 3D video picture system
EP1094421A2 (en) Method and apparatus for providing non-photorealistic cartoon outlining within a 3D videographics system
JP4740476B2 (en) Method and apparatus for providing a logical combination of N alpha operations in a graphics system
CN116452704A (en) Method and device for generating lens halation special effect, storage medium and electronic device
JP4698894B2 (en) Method, apparatus and program for texture tiling in a graphics system
MXPA00009280A (en) Method and apparatus for providing non-photorealistic cartoon outlining within a 3d videographics system
Patel Computer Aided Design And Visualization
Luo The implementation of a 3D snooker table using OpenGL

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication