GB2378111A - Image filling method, apparatus and computer readable medium for reducing filling process in producing animation - Google Patents
Image filling method, apparatus and computer readable medium for reducing filling process in producing animation
- Publication number
- GB2378111A GB0223892A
- Authority
- GB
- United Kingdom
- Prior art keywords
- color
- filled
- region
- line
- filling
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/40—Filling a planar surface by adding surface attributes, e.g. colour or texture
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Image Generation (AREA)
- Processing Or Creating Images (AREA)
Abstract
An image filling method for reducing the filling process in producing animation. The image filling method includes the steps of: inputting image data; searching said image data for extracting a small region smaller than or equal to a predetermined size; and outputting a list of said small regions. Said regions may be overlaid on the image in the form of marks.
Description
IMAGE FILLING METHOD, APPARATUS AND
COMPUTER READABLE MEDIUM FOR REDUCING FILLING
PROCESS IN PRODUCING ANIMATION
The present invention generally relates to an image processing technique. More particularly, the present invention relates to an image filling method, an apparatus and a computer readable medium storing the program.
First, the related art which corresponds to the after-mentioned first object will be described. Conventionally, a line drawing on a sheet of paper is transferred to a cell which is colored manually with a paintbrush when producing animation. Recently, this work is being replaced by digital painting, in which the line drawing is digitized by a scanner and is filled by using a computer. Therefore, it has become easy to fill the line drawing, which was very difficult conventionally.
However, it is necessary to fill the digitized line drawing data manually one by one even after digital painting is introduced.
As conventional techniques, Japanese patent No. 2835752 and Japanese laid open patent application No. 9-134422 disclose a technique for filling a plurality of line drawings at a time by specifying coordinates (a seed point) from which the filling is started, wherein the coordinates are common to regions where a plurality of line drawings are superimposed and filled with the same color. In the technique of the Japanese patent No. 2835752, the centroid position of each closed region, and the lateral and vertical lengths of a circumscribed rectangle of each closed region, are extracted from an unfilled image and the filled image as features. Then, the unfilled image is
filled by referring to the corresponding region of the filled image by using the features. According to the above invention, a region which has relatively small movement can be filled with the same color as the corresponding region of the filled image. In the Japanese laid open patent application No. 9-134422, when filling closed regions of a line drawing, a point determined in the closed region and a color selected according to the point are correlated and stored. When the point is located in the same closed region of the next images that follow, the color corresponding to the point is called and added to the closed region.
Accordingly, filling images which move successively can be performed speedily and effectively.
However, in the invention of the Japanese patent No. 2835752, since only the centroid position of each closed region and the lateral and vertical lengths of a circumscribed rectangle of each closed region are extracted and used as feature data, there is a problem that the calculation amount for obtaining the centroid position is large.
In the invention of the Japanese laid open patent application No. 9-134422, when overlapping closed regions of a plurality of unfilled line images should be filled with the same color, the closed regions can be filled in a stroke with a seed point which has the same coordinates in the plurality of line drawings. Thus, the invention is built on the premise that there are overlapping closed regions which have the same meaning and should be filled with the same color in a plurality of line drawings. Therefore, this method can be applied only to regions with relatively small movement, so there is a problem that many judgments by the
operator are necessary.
Next, the related art which corresponds to the second object will be described. Conventionally, a boundary line of red, blue or the like (a colored line) is used as well as a black boundary line when producing animation. There is the following rule.
When a line drawing on a sheet of paper is transferred to a cell, only the black boundary lines are transferred. Then, if a red boundary line is specified when the transferred cell is put on the line drawing, the red boundary line is traced with the brighter color, which is one of the colors of the sides divided by the boundary line. If a blue boundary line is specified when the transferred cell is put on the line drawing, the blue boundary line is traced with the darker color, which is one of the colors of the sides divided by the boundary line.
In recent years, line drawings have been digitized and digital painting has become widespread, in which the line drawing is transferred to the cell and is filled on a computer. Thus, the color of the colored line (boundary line) of red or blue should be changed sooner or later on a computer. Fig. 1A shows the line drawing drawn with the colored line and Fig. 1B shows the filled image of the line drawing.
The colored line in the filled image needs to be filled with a proper color as shown in Fig. 1C. For this purpose, there are conventional technologies such as a paint bucket tool, a filling process method with the function of filling the colored line, and the Japanese patent No. 2762753.
The paint bucket tool is a common name for a filling tool widely used in general painting systems and the like. When a pixel is specified by a pointing device such as a mouse, pixels which are connected to the pointed pixel and have the same color as the color of the pointed pixel are filled
with a predetermined color by the paint bucket tool.
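For illustration only (this sketch is not part of the disclosed apparatus), the behaviour of such a paint bucket tool can be expressed roughly as follows; the 2D-list image representation and the function name are assumptions made for the example.

```python
from collections import deque

def paint_bucket(image, x, y, new_color):
    """Repaint all pixels 4-connected to (x, y) that share its color (illustrative sketch)."""
    h, w = len(image), len(image[0])
    old_color = image[y][x]
    if old_color == new_color:
        return image
    queue = deque([(x, y)])
    while queue:
        px, py = queue.popleft()
        if 0 <= px < w and 0 <= py < h and image[py][px] == old_color:
            image[py][px] = new_color
            # search the four neighbouring pixels of the repainted pixel
            queue.extend([(px + 1, py), (px - 1, py), (px, py + 1), (px, py - 1)])
    return image
```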
Figs. 2A-2D show a general example of filling a colored line by using the paint bucket tool. When filling a closed region enclosed by a colored line and the colored line shown in Fig. 2A by using the paint bucket tool, the closed region is filled by the paint bucket tool first (Fig. 2B), then the colored line is filled with the same color (Fig. 2C).
As a result, the image shown in Fig. 2D is obtained.
The order of the processing shown in Fig. 2B and Fig. 2C can be reversed.
On the other hand, the filling process method with the function of filling the colored line is adopted by software specialized for animation filling. According to the method, the colored line is filled with the color used for filling the region enclosed by the colored line at the same time that the region is filled. Figs. 3A and 3B show the example.
As shown in Figs. 3A and 3B, the closed region enclosed by the colored line and the colored line are filled at the same time when the inside of the closed region is filled. As a consequence, the colored line is filled with the color which is used first for filling each region enclosed by the colored line.
In the method of the Japanese patent No. 2762753, every closed region is labeled and a filter of a size is defined, in which the filter is centered on a target pixel on a boundary line when the boundary line is the colored line. Then, the maximum label number within the filter is provided to the target pixel. The processing will be described with reference to Figs. 4A-4C. As shown in Fig. 4A, a label number 1 is assigned to the upper closed region of the colored line and a label number 2 is assigned to the lower closed region. As shown in Fig. 4B, by applying the filter which is wider than the thickness of the colored line, the maximum
label number within the filter is assigned to the label number of the colored line. As a result, as shown in Fig. 4C, the label number of the colored line becomes 2.
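A minimal sketch, under assumed data structures (a 2D array of label numbers and a same-sized boolean mask marking colored-line pixels), of the maximum-label filter rule attributed above to the Japanese patent No. 2762753; the function and parameter names are illustrative only.

```python
def relabel_colored_line(labels, is_colored_line, filter_size):
    """Assign to each colored-line pixel the maximum label found inside a
    square filter window centered on it (sketch of the cited rule)."""
    h, w = len(labels), len(labels[0])
    half = filter_size // 2                      # filter_size is assumed odd
    result = [row[:] for row in labels]
    for y in range(h):
        for x in range(w):
            if not is_colored_line[y][x]:
                continue
            best = 0
            for dy in range(-half, half + 1):
                for dx in range(-half, half + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w and not is_colored_line[ny][nx]:
                        best = max(best, labels[ny][nx])
            result[y][x] = best                  # colored-line pixel takes the maximum label
    return result
```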
However, in the case when a colored line should be filled with a plurality of colors, there is the following problem. As shown in Figs. 5A and 5B, by using the paint bucket tool, the same color pixels which are connected successively to a pixel on which the paint bucket tool is applied are filled with a color. Therefore, for filling such a colored line by using the paint bucket tool, the region on which the paint bucket tool is applied should be specified in advance as shown in Fig. 6A such that the paint bucket tool is applied in the specified region as shown in Fig. 6B. The case in which a colored line should be divided and filled with a plurality of colors occurs very frequently. However, there is a problem in that it takes much time to specify the regions to which the paint bucket tool is applied in the shape of the colored line.
Fig. 7 shows the problem of the filling tool with the function of filling the colored line. The filling tool has a rule in which the colored line is filled with the color of the closed region which is filled first. Therefore, as shown in Fig. 7, the colored line is filled differently depending on the side which is filled first. That is, when the inside is filled first, the colored line is filled with red. When the outside is filled first, the colored line is filled with blue. Thus, the operator should be aware of the color of the colored line and the colors of the closed regions which are divided by the colored line. That is, when the colored line is red, the closed region on the brighter side of the closed regions divided by the colored line should be filled first.
When the colored line is blue, the closed region on the darker side of the closed regions divided by the colored line should be filled first.
There is a problem that the operator should always pay particular attention to the color of the colored line and the order of filling. In addition, the same problem which is explained in Figs. 5A and 5B exists in the filling tool with the function of filling the colored line, as shown in Figs. 8A and 8B. For avoiding the problem, it is necessary to perform the same tasks shown in Figs. 6A and 6B with the filling tool with the function of filling the colored line, as shown in Figs. 9A and 9B. Therefore, it takes much time to set the regions for applying the tool.
According to the method shown in the Japanese patent No. 2762753, the label number of the colored line is determined as one of the label numbers of the closed regions. Thus, the color of the colored line is not determined until the filling process is performed, as shown in Fig. 10A. Therefore, there is a problem that the above-mentioned rule which has been historically established in producing animation cannot be considered. In addition, when the size of the filter is too small (Figs. 10B and 10C) or too large (Fig. 10D), the processing is not performed properly. Thus, it is necessary to adjust the size of the filter according to the thickness or the complexity of the colored lines. However, the Japanese patent No. 2762753 does not disclose a method for solving the problem.
Next, the related art corresponding to the third object will be described.
For filling a closed region in a line drawing to be filled, at least coordinates in the closed region and the color to be painted are required. The coordinates can be specified by a pointing device such as a mouse when a computer is
used for filling. As for the color, an operator inputs (R, G, B) values or (hue, chroma, lightness) values and the like by using an interface shown in Figs. 11A and 11B.
There is another method in which a color is obtained by specifying a point with the pointing device on a display such that the color is used for painting. Figs. 12A and 12B show two representative examples. In the method shown in Fig. 12A, a color specifying table is displayed in which predetermined colors and the corresponding names are shown. The operator specifies a desired color in the table with the pointing device. In the case shown in Fig. 12B, an example image which is already filled is displayed. In this case, the operator finds a closed region in the example image which has a color the operator wants to use and specifies the color with the pointing device.
However, since the color value used for painting each closed region is strictly defined in producing animation, the operator needs to check the color value and input it via a keyboard every time the color is changed by the method shown in Figs. 11A and 11B. Thus, this operation is burdensome to the operator.
As for the method shown in Figs. 12A and 12B, there is a problem that the operator needs to move the pointing device extensively every time the color to be painted is changed.
Next, the related art corresponding to the fourth object will be described. Conventionally, for filling a region enclosed by a line, the operator specifies the color and coordinates (which will be called a seed point) which are the start point for filling. Then, the four connected pixel seed fill algorithm, the eight connected pixel seed fill algorithm or the scan line
seed fill algorithm is generally used for painting the region. These methods are explained, for example, in "Hands-on Computer Graphics", Fujio Yamaguchi, Nikkan Kogyo Shimbun-sha, pp. 104-, 1987.
Each method will be described in the following.
Fig. 13 is a diagram for explaining the four connected pixel seed fill algorithm. First, the color of the specified seed point is checked.
When the color can be changed (that is, when the color is not the color of the outline, for example), the color value of the pixel is saved and the color of the seed point is changed to a specified color.
Next, pixels which are connected to each of the four sides of the seed point are searched. If the color of the searched pixel can be changed (that is, when the color is the same as the saved color and is not the color of the outline), the color of the pixel is changed to the specified color. Next, the same processing is performed for the four pixels which are connected to the pixel in which the color is changed.
After that, the same processing is performed recursively until a pixel which has a color different from the saved color or a pixel which has the same color as the outline color is searched.
Fig. 13 shows pixels 1-4 which are filled in the first filling process and pixels adjacent to the pixel 1 which are further searched and filled.
Fig. 14 is a diagram for explaining the eight connected pixel seed fill algorithm. As shown in Fig. 14, this method is different from the four connected pixel seed fill algorithm in that eight connected pixels are searched. Fig. 14 shows pixels 1-8 which are filled in the first filling process and pixels around the pixel 3 which are further searched and painted.
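The following sketch illustrates the seed fill idea described above for both the four connected and the eight connected variants; an explicit stack replaces the recursion of the description, and the image representation, color values and function name are assumptions made for the example.

```python
def seed_fill(image, seed, new_color, outline_color, connectivity=4):
    """Seed fill: starting from `seed`, repaint pixels whose color equals the
    saved seed color, stopping at the outline color (illustrative sketch)."""
    x0, y0 = seed
    h, w = len(image), len(image[0])
    saved = image[y0][x0]
    if saved == outline_color or saved == new_color:
        return image                      # seed color cannot (or need not) be changed
    if connectivity == 4:
        offsets = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    else:                                 # 8-connected: also search the diagonal neighbours
        offsets = [(dx, dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1) if (dx, dy) != (0, 0)]
    stack = [(x0, y0)]                    # explicit stack plays the role of the recursion
    while stack:
        x, y = stack.pop()
        if 0 <= x < w and 0 <= y < h and image[y][x] == saved:
            image[y][x] = new_color
            stack.extend((x + dx, y + dy) for dx, dy in offsets)
    return image
```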
According to the above-mentioned algorithm,
the recursive processing tends to become deep and a large stack region is necessary. Fig. 15 is a diagram for explaining the scan line seed fill algorithm, which was developed for the sake of decreasing the depth of the recursive processing.
First, the color of the seed point which is specified in the first place is checked. When the color can be changed (for example, when the color is not the color of the outline), the color value of the pixel is saved and the color is changed to a specified color. Next, pixels are searched from the seed point in the lateral direction until a pixel in which the color cannot be changed is searched (for example, the color of the pixel is different from the saved color or the color of the pixel is that of the outline). When a pixel in which the color can be changed is searched (for example, the color of the pixel is the same as the saved color or the color of the pixel is not that of the outline), the color is changed. In addition, the color of a pixel which is connected to the upper side or the lower side of the searched pixel is checked while searching the pixels. Then, the coordinates of the rightmost (or leftmost) pixel in which the color can be changed are stored. The same processing is repeated recursively by using the pixel of the coordinates as a seed point. As a result, the color of the closed region which includes the seed point which is specified in the first place is changed to a specified color.
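A rough sketch of the scan line seed fill idea described above, filling a whole horizontal run at a time and storing one seed per fillable run on the neighbouring scan lines; the representation and names are assumptions, and details may differ from the algorithm in the cited literature.

```python
def scanline_seed_fill(image, seed, new_color, outline_color):
    """Scan line seed fill: repaint complete horizontal runs and push only one
    seed per run found on the rows above and below (illustrative sketch)."""
    h, w = len(image), len(image[0])
    x0, y0 = seed
    saved = image[y0][x0]
    if saved == outline_color or saved == new_color:
        return image

    def fillable(x, y):
        return 0 <= x < w and 0 <= y < h and image[y][x] == saved

    stack = [(x0, y0)]
    while stack:
        x, y = stack.pop()
        if not fillable(x, y):
            continue
        # extend the run containing (x, y) to the left and to the right
        left, right = x, x
        while fillable(left - 1, y):
            left -= 1
        while fillable(right + 1, y):
            right += 1
        for cx in range(left, right + 1):
            image[y][cx] = new_color
        # store the rightmost pixel of each fillable run on the neighbouring scan lines
        for ny in (y - 1, y + 1):
            cx = left
            while cx <= right:
                if fillable(cx, ny):
                    while cx <= right and fillable(cx, ny):
                        cx += 1
                    stack.append((cx - 1, ny))
                else:
                    cx += 1
    return image
```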
Among the above-mentioned conventional methods, the four connected pixel seed fill algorithm is easily programmable and the processing is fast. In addition, the four connected pixel seed fill algorithm does not have the after-mentioned problem of the eight connected pixel seed fill algorithm. Therefore, this algorithm is widely used.
However, when a region is painted once by specifying a seed point as shown in Fig. 16A, unfilled regions remain as shown in Fig. 16B due to the basic characteristics of this algorithm. Such a case often occurs when producing animation, as shown in Fig. 17. In many cases, the remaining region is a small region such as one pixel or two pixels. Thus, the remaining region often goes undetected by the naked eye. Therefore, the operator must concentrate on checking for the minute remaining region, which takes much time.
As for the eight connected pixel seed fill algorithm, the problem of an unpainted region remaining does not occur. However, in a case such as the one shown in Fig. 18, the color used for painting the inside leaks at the point specified by the arrow in Fig. 18 such that the outside is painted with the same color. The case shown in Fig. 18 also often occurs.
Therefore, this method is not generally used.
The scan line seed fill algorithm has the same merits and demerits as the four connected pixel seed fill algorithm in terms of painting. As mentioned above, this method requires a smaller stack region than the other two methods. However, the relatively large stack region used for the other two methods is much smaller than the program region or the data region. Thus, there is no reason to use the scan line seed fill algorithm instead of the four connected pixel seed fill algorithm, which is easily implemented, at the present time when the price of computer memory is very low.
Since the remaining region to be checked is minute in any of the above-mentioned algorithms, it is difficult to detect the remaining region.
Thus, the operator should intensively concentrate on checking whether an unfilled region remains; however, this takes much time.
It is an object of the present invention for the user to check and correct easily an unfilled small region (one pixel, two pixels or the like) which is forgotten or is a mistake when filling a
digitized line drawing which is used in producing animation, in which high concentration is not necessary for the user to check the small region.
According to a first aspect of the present invention, the above object of the present invention is achieved by an image processing method comprising the steps of: inputting image data; searching the image data for extracting a small region smaller than or equal to a predetermined size; and outputting a list of said small regions. According to a second aspect of the present invention, the object of the invention is achieved by an image processing method comprising the steps of: inputting image data; searching said image data for extracting a small region smaller than or equal to a predetermined size; providing a mark to said small region; and displaying said mark wherein said mark is overlaid on said image data.
Further, the above-mentioned image processing method may include the step of: asking the user about processing for the small region such that processing specified by the user is performed in an interactive manner.
According to the first aspect of the invention, the object of the invention is achieved by an image processing apparatus comprising a part for inputting image data; a part for searching said image data for extracting a small region smaller than or equal to a predetermined size; and a part for outputting a list of said small regions.
According to the second aspect of the invention, the object of the invention is achieved by an image processing apparatus comprising: a part for inputting image data; a part for searching said image data for extracting a small region smaller than or equal to a predetermined size; a part for providing a mark to said small region; and a part for displaying said mark wherein said mark is overlaid on said image data.
According to the first aspect of the invention, the object of the invention is achieved by computer program code for causing a computer to process images, said computer program code comprising: program code means for inputting image data; program code means for searching said image data for extracting a small region smaller than or equal to a predetermined size; and program code means for outputting a list of said small regions. According to the second aspect of the invention, the object of the invention is achieved by computer program code for causing a computer to process images, said computer program code comprising: program code means for inputting image data; program code means for searching said image data for extracting a small region smaller than or equal to a predetermined size; program code means for providing a mark to said small region; and program code means for displaying said mark wherein said mark is overlaid on said image data.
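As an informal illustration of the claimed steps (input image data, search for regions smaller than or equal to a predetermined size, output a list of such regions), a sketch along the following lines could be used; the region definition (4-connected pixels of equal color) and all names are assumptions, not part of the claims.

```python
from collections import deque

def find_small_regions(image, max_size):
    """Label 4-connected regions of uniform color and return those whose pixel
    count is at most max_size, as (seed, color, size) entries (sketch)."""
    h, w = len(image), len(image[0])
    visited = [[False] * w for _ in range(h)]
    small_regions = []
    for sy in range(h):
        for sx in range(w):
            if visited[sy][sx]:
                continue
            color = image[sy][sx]
            pixels = []
            queue = deque([(sx, sy)])
            visited[sy][sx] = True
            while queue:
                x, y = queue.popleft()
                pixels.append((x, y))
                for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                    if 0 <= nx < w and 0 <= ny < h and not visited[ny][nx] and image[ny][nx] == color:
                        visited[ny][nx] = True
                        queue.append((nx, ny))
            if len(pixels) <= max_size:
                # one list entry per small region; a mark could be overlaid at each seed
                small_regions.append({"seed": (sx, sy), "color": color, "size": len(pixels)})
    return small_regions
```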
Other objects, features and advantages of
the present invention will become more apparent from the following detailed description when read in
conjunction with the accompanying drawings, in which:
Figs. 1A-1C are diagrams for explaining filling of boundary lines drawn by colored lines;
Figs. 2A-2D show a general example of filling a colored line by using a paint bucket tool;
Figs. 3A and 3B show a filling process method with the function of filling the colored line;
Figs. 4A-4C are diagrams for explaining a method of Japanese patent No. 2762753;
Figs. 5A and 5B and Figs. 6A and 6B are diagrams for explaining problems of the paint bucket tool;
Fig. 7, Figs. 8A and 8B, and Figs. 9A and 9B are diagrams for explaining problems of the filling tool with the function of filling the colored line;
Figs. 10A-10D are diagrams for explaining problems of the method of the Japanese patent No. 2762753;
Figs. 11A and 11B are diagrams for explaining an example of a filling method on a computer according to a conventional technique;
Figs. 12A and 12B are diagrams for explaining an example of a filling method on a computer according to a conventional technique;
Fig. 13 is a diagram for explaining the four connected pixel seed fill algorithm;
Fig. 14 is a diagram for explaining the eight connected pixel seed fill algorithm;
Fig. 15 is a diagram for explaining the scan line seed fill algorithm;
Figs. 16A, 16B and 17 are diagrams for explaining problems of the four connected pixel seed fill algorithm and the scan line seed fill algorithm;
Fig. 18 is a diagram for explaining
problems of the eight connected pixel seed fill algorithm;
Fig. 19 is a block diagram of an image filling apparatus according to a first explanatory example;
Fig. 20 shows examples of a reference line drawing, a reference picture and a line drawing to be filled;
Fig. 21 shows an example of a format of closed region data of the reference line drawing;
Figs. 22A and 22B show flowcharts of a reference line drawing separation part 13;
Figs. 23A-23G show examples of feature amounts of separated closed regions;
Figs. 24A and 24B show flowcharts of a calculation method of the feature amount 1;
Figs. 25A and 25B show flowcharts of a calculation method of the feature amount 2;
Figs. 26A and 26B show flowcharts of a calculation method of the feature amount 3;
Figs. 27A and 27B show flowcharts of a calculation method of the feature amount 4;
Figs. 28A and 28B show flowcharts of a calculation method of the feature amount 5;
Figs. 29A and 29B show flowcharts of a calculation method of the feature amount 6;
Figs. 30A and 30B show flowcharts of a calculation method of the feature amount 7;
Figs. 31A and 31B show flowcharts of a method for calculating all the feature amounts simultaneously;
Fig. 32 is a diagram showing the stored feature amounts;
Fig. 33 is a diagram showing variations of feature amounts;
Fig. 34 is a diagram showing normalized feature amounts;
Fig. 35 is a diagram showing integrated variations;
Fig. 36 is a diagram showing color candidate lists which are generated in the order of certainty for every closed region of the line drawing to be filled;
Figs. 37A and 37B are flowcharts of a method for filling each closed region of the line drawing to be filled with the top color in the color candidate list of the closed region;
Fig. 38 is a block diagram of an image filling apparatus according to a second explanatory example;
Fig. 39 shows an example of the displayed color candidate list;
Fig. 40 is a block diagram of an image filling apparatus according to a third explanatory example;
Fig. 41 is a diagram of a color alias list generation storing part 106 of the third example;
Fig. 42 is an example of a color alias list;
Fig. 43 is an example in which the color alias list is displayed with the corresponding color candidate list;
Fig. 44 is a block diagram of another example of the image filling apparatus according to the third example;
Fig. 45 is a block diagram of an image filling apparatus according to a fourth explanatory example;
Fig. 46 shows an example of a format of boundary line information;
Fig. 47 is a flowchart showing extraction of the boundary line information;
Fig. 48 is a flowchart showing a filling method using the boundary line information;
Figs. 49A and 49B show flowcharts of a method for changing a color of a colored line;
Fig. 50 is a diagram showing an example of changing the color of the colored line;
Figs. 51A and 51B are diagrams showing an example in which the fourth example and the paint bucket tool are combined;
Fig. 52 is a block diagram of another example of the image filling apparatus according to the fourth example of the present invention;
Fig. 53 is a block diagram of an image filling apparatus according to a fifth example of the present invention;
Fig. 54 shows an example of color specifying information;
Figs. 55 and 56 are flowcharts of the filling part 317;
Figs. 57A-57C show displayed examples in the process of image filling;
Fig. 58 is a block diagram of another example of the image filling apparatus according to the fifth example of the present invention;
Fig. 59 is a block diagram of an image filling apparatus according to an embodiment of the present invention;
Fig. 60 shows a search state display table used for small region searching;
Fig. 61 shows a small region table which is generated in the small region searching process;
Fig. 62 is a flowchart showing the whole process of the small region searching part;
Fig. 63 shows an initialization flowchart of the search state display table used for checking small regions included in image data whether the small region is unfilled or filled;
Fig. 64 shows an initialization flowchart of the search state display table used for checking
unfilled small regions;
Fig. 65 shows a search process flow;
Fig. 66 shows an example of a process flow of the small region changing part;
Figs. 67A and 67B show a mark example for a small region displayed on a display;
Fig. 68 shows a process example of the small region changing part for processing a small region in an interactive manner with a user.
(First to third examples)
First, the first to third examples will be described with reference to the figures. With reference to Fig. 19, the image filling apparatus according to the first example of the present invention includes a storage device 11, an image reading part 12, a reference line drawing separation part 13, a separation part for line drawings to be filled 14, a reference line drawing feature amount extraction part 15, a feature amount extraction part for line drawings to be filled 16, a color candidate list generation part 17, a line drawing filling part 18 and a filled line drawing storing part 19.
The storage device 11 stores a reference line drawing, a reference picture and line drawings to be filled as shown in Fig. 20.
The image reading part 12 reads the reference line drawing, the reference picture and the line drawing to be filled from the storage device 11. In the case when the next line drawing to be filled is filled by using the same reference picture and the same reference line drawing, only the next line drawing to be filled is read from the storage device 11.
The reference line drawing separation part 13 extracts all closed regions which form the reference line drawing.
Each closed region to be filled should be completely enclosed by a boundary line (the color is not limited to black), since the closed region is filled digitally.
There are various methods for separating the closed regions in an image. In this example, a method will be described as an example in which a different number is assigned to each closed region which constitutes the reference line drawing.
Fig. 21 shows an example of the format of the closed region data generated by this method. As shown in this figure, a closed region number is assigned to each pixel, wherein the closed region number corresponds to the closed region including the pixel.
A special number (the possible maximum number in the case shown in Fig. 21) is assigned to a pixel on the boundary line such that it is identified that the pixel is on the boundary line. In addition, by assigning the special number, the boundary line is protected from being filled when filling is performed. The sizes of the x axis direction and the y axis direction are the same as the pixel numbers (xsize, ysize) of the x axis direction and the y axis direction of the reference line drawing respectively. Figs. 22A and 22B show flowcharts of the process performed. In step 21, all data in the closed region data is initialized to the possible maximum value (MAXVALUE). The MAXVALUE is determined to be more than the number of closed regions which constitute the reference line drawing. In this embodiment the process starts from the closed region number 0 and (x, y) = (0, 0) in steps 22, 23.
(0, 0) is determined as the seed point in this case.
When the closed region data of the corresponding pixel is MAXVALUE and the color of the pixel is
white (thus, a colored line other than black is treated as a boundary line), the closed region data of the pixel is converted to 0 in steps 24-27.
After all data of the object closed region is converted to 0, the closed region number is incremented by 1 in step 28. Then, the same processing is repeated after finding a next seed point in which the color is white and the value is MAXVALUE. As a result, the boundary lines which divide the reference line drawing into closed regions have MAXVALUE and each of the closed regions is numbered by a different integer from 0 such that the closed regions are separated.
In the closed region numbering process shown in Fig. 22B, (x, y) is substituted into (a, b) in step 31. If a and b are bigger than 0, if a and b are smaller than xsize and ysize respectively, and if the color of (a, b) is white and the closed region data is MAXVALUE, a specified number (closed region number) is assigned to (a, b) as closed region data in steps 32-34. Then, after each of a and b is incremented and decremented by 1 in steps 35-36, the closed region numbering process (step 27) is recursively called.
The separation part for line drawings to be filled 14 performs the same processing on a line drawing to be filled as the reference line drawing separation part 13 performs. As a result, as in the case of the reference line drawing, MAXVALUE is assigned to the boundary line and the closed regions are numbered by integers from 0.
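A compact sketch of the separation processing described for Figs. 22A and 22B, under the assumption that white pixels form the closed regions and any non-white pixel is a boundary; an explicit stack stands in for the recursive numbering process, and the names are illustrative.

```python
import sys

WHITE = (255, 255, 255)        # assumed pixel value for the region interior

def separate_closed_regions(image, max_value=sys.maxsize):
    """Number every white closed region with an integer from 0 upward; boundary
    pixels (any non-white color) keep MAXVALUE (sketch of Figs. 22A, 22B)."""
    h, w = len(image), len(image[0])
    region = [[max_value] * w for _ in range(h)]   # step 21: initialize to MAXVALUE
    number = 0
    for y in range(h):
        for x in range(w):
            if region[y][x] == max_value and image[y][x] == WHITE:
                # flood the region from this seed with the current closed region number
                stack = [(x, y)]
                while stack:
                    a, b = stack.pop()
                    if 0 <= a < w and 0 <= b < h and region[b][a] == max_value and image[b][a] == WHITE:
                        region[b][a] = number
                        stack.extend([(a + 1, b), (a - 1, b), (a, b + 1), (a, b - 1)])
                number += 1                        # step 28: next closed region number
    return region
```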
The reference line drawing feature amount extraction part 15 extracts a feature amount for each closed region which is separated by the reference line drawing separation part 13. In this embodiment, the following seven kinds of feature amounts are used, as shown in Figs. 23A-23G.
1. Central coordinates of a rectangle circumscribing the closed region (Fig. 23A)
2. The area of a rectangle circumscribing the closed region (Fig. 23B)
3. The aspect ratio of a rectangle circumscribing the closed region (Fig. 23C)
4. The number of pixels constituting the closed region (Fig. 23D)
5. The ratio between 2. and 4. (Fig. 23E)
6. The peripheral length of the closed region (Fig. 23F)
7. The ratio between the square roots of 6. and 4. (Fig. 23G)
In Figs. 23A-23G, the boundary line is shown for each closed region for the sake of clarity.
In reality, the boundary line is not included in the separated closed region.
In the following, the calculation method of the feature amounts 1-7 will be described.
In the following, xmin(i), xmax(i), ymin(i), ymax(i) are assumed to be the minimum value and the maximum value of x of the rectangle circumscribing the closed region i, and the minimum value and the maximum value of y of the rectangle circumscribing the closed region i, respectively; center(i).x and center(i).y are assumed to be the x coordinate and the y coordinate of the center of the rectangle circumscribing the closed region i; area_rec(i) is assumed to be the area of the rectangle circumscribing the closed region i; aspect_ratio(i) is assumed to be the aspect ratio of the rectangle circumscribing the closed region i; pixels(i) is assumed to be the number of pixels constituting the closed region i; and circum(i) is assumed to be the peripheral pixel number of the closed region i.
1. The calculation method for the feature amount 1 (Figs. 24A, 24B)
First, the parameters xmin(i), xmax(i), ymin(i), ymax(i) are initialized. In step 41, the closed region number i is initialized. Then, initialization of xmin(i) = xsize, ymin(i) = ysize, xmax(i) = 0, ymax(i) = 0 is performed for every region in steps 42-44. Next, as shown in Fig. 24B, the coordinates (center(i).x, center(i).y) of the center of the rectangle circumscribing the closed region i are calculated. In steps 51-53, the coordinates (x, y) of the closed region data are scanned and the closed region number of the coordinates is substituted into i. When the closed region number i does not represent the boundary line, x is substituted into xmin(i) if x < xmin(i), x is substituted into xmax(i) if x > xmax(i), y is substituted into ymin(i) if y < ymin(i), and y is substituted into ymax(i) if y > ymax(i) in steps 54 and 55. When the scan is completed, the closed region number i is initialized in step 71.
Then, the coordinates (center(i).x, center(i).y) of the center of the rectangle circumscribing the closed region i are calculated in steps 72-74.
2. The calculation method for the feature amount 2 (Figs. 25A, 25B)
First, the parameters xmin(i), xmax(i), ymin(i), ymax(i) are initialized in the same way as for the feature amount 1, as shown in Fig. 25A. Next, as shown in Fig. 25B, the area area_rec(i) of the rectangle circumscribing the closed region i is calculated. This processing is the same as the case of the feature amount 1 except that the step 75 is performed instead of the step 73.
3. The calculation method for the feature amount 3 (Figs. 26A, 26B)
First, the parameters xmin(i), xmax(i), ymin(i), ymax(i) are initialized in the same way as in the cases of the feature amounts 1 and 2, as shown in Fig. 26A. Next, as shown in Fig. 26B, the aspect
ratio aspect_ratio(i) of the rectangle circumscribing the closed region i is calculated.
This processing is the same as the case of the feature amount 1 except that the step 76 is performed instead of the step 73.
4. The calculation method for the feature amount 4 (Figs. 27A, 27B)
First, the number of pixels pixels(i) in the closed region i, which is necessary for calculating the feature amount 4, is initialized as shown in Fig. 27A. For this purpose, the step 45 is performed instead of the step 43 shown in Fig. 24A.
Next, as shown in Fig. 27B, the number of pixels constituting the closed region i is calculated. For this purpose, the step 56 is performed instead of the step 55 shown in Fig. 24B and the steps 71-74 are not performed.
5. The calculation method for the feature amount 5 (Figs. 28A, 28B)
First, xmin(i), xmax(i), ymin(i), ymax(i) and pixels(i), which are necessary for calculating the feature amount 5, are initialized as shown in Fig. 28A.
For this purpose, the step 46 is performed instead of the step 43 shown in Fig. 24A. Next, as shown in Fig. 28B, the ratio ratio_rect_pix(i) between the area of the rectangle circumscribing the closed region i and the number of pixels is calculated.
For this purpose, the step 57 is performed instead of the step 55 shown in Fig. 24B and the step 77 is performed instead of the step 73.
6. The calculation method for the feature amount 6 (Figs. 29A, 29B)
First, the number of pixels constituting the periphery of the closed region i, which is necessary for calculating the feature amount 6, is initialized as shown in Fig. 29A. For this purpose, the step 47 is performed instead of the step 43
shown in Fig. 24A. Next, as shown in Fig. 29B, the peripheral length circum(i) of the closed region is calculated. Before the scan of the y axis direction is started, imax, which is the possible maximum number for i, is substituted into iold, which is the closed region number detected just previously, in step 58.
In step 59, when i does not represent the boundary line, it is judged whether iold < imax and i != iold. When the result is "NO", the process returns to the step 52, since it means that the corresponding pixel is in the outside of the reference line drawing. When the result is "YES", each of circum(i) and circum(iold) is incremented by 1 in step 60. Then, i is substituted into iold in step 61.
7. The calculation method for the feature amount 7 (Figs. 30A, 30B)
First, pixels(i) and circum(i), which are necessary for calculating the feature amount 7, are initialized as shown in Fig. 30A. For this purpose, the step 48 is performed instead of the step 43 shown in Fig. 24A. Next, as shown in Fig. 30B, the ratio between the square root of pixels(i) and circum(i) is calculated. For this purpose, the step 62 is added next to the step 54 shown in Fig. 29B.
Figs. 31A and 31B show the method in which the feature amounts 1-7 are calculated simultaneously such that each feature amount can be calculated with a smaller calculation amount. The steps 49, 63, 78 are performed instead of the steps 48, 62, 77 shown in Figs. 30A, 30B.
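The simultaneous calculation of the feature amounts can be pictured with the following sketch, which scans the closed region data once and accumulates the quantities needed for the feature amounts 1-7; the peripheral length here is only a rough scan-line approximation of circum(i), and all names are assumptions.

```python
import math

def extract_feature_amounts(region, max_value):
    """One pass over the closed region data, accumulating bounding box, pixel
    count and an approximate perimeter per region (sketch of Figs. 31A, 31B)."""
    h, w = len(region), len(region[0])
    stats = {}
    for y in range(h):
        prev = max_value                              # region seen just previously on this scan line
        for x in range(w):
            i = region[y][x]
            if i != max_value:                        # skip boundary pixels
                s = stats.setdefault(i, {"xmin": w, "xmax": 0, "ymin": h, "ymax": 0,
                                         "pixels": 0, "circum": 0})
                s["xmin"], s["xmax"] = min(s["xmin"], x), max(s["xmax"], x)
                s["ymin"], s["ymax"] = min(s["ymin"], y), max(s["ymax"], y)
                s["pixels"] += 1
                if prev != i:                         # horizontal transition between regions
                    s["circum"] += 1
                    if prev in stats:
                        stats[prev]["circum"] += 1
            prev = i
    features = {}
    for i, s in stats.items():
        width, height = s["xmax"] - s["xmin"] + 1, s["ymax"] - s["ymin"] + 1
        area_rect = width * height
        features[i] = {
            "center": ((s["xmin"] + s["xmax"]) / 2.0, (s["ymin"] + s["ymax"]) / 2.0),  # 1
            "area_rect": area_rect,                                                    # 2
            "aspect_ratio": width / height,                                            # 3
            "pixels": s["pixels"],                                                     # 4
            "ratio_rect_pix": area_rect / s["pixels"],                                 # 5
            "circum": s["circum"],                                                     # 6 (approximate)
            "ratio_sqrt_pix_circum": math.sqrt(s["pixels"]) / max(s["circum"], 1),     # 7
        }
    return features
```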
As mentioned above, the feature amounts (feature amounts 1-5 in the following) of all closed regions which constitute the reference line drawing are calculated and stored for each closed region as shown in Fig. 32. The feature amount extraction part for
line drawings to be filled 16 performs the same processing on the line drawing to be filled as the processing by the reference line drawing feature amount extraction part 15. As a result, the above-mentioned feature amounts are stored for every closed region which constitutes the line drawing to be filled.
The color candidate list generation part 17 calculates variation amounts of the feature amounts for all combinations between every closed region of the line drawing to be filled and every closed region of the reference line drawing. The definition of the variation amount differs according to the kind of the feature amount. For example, it is appropriate to consider the variation amount of the feature amount 1 to be the distance between coordinates and to consider the variation amounts of the feature amounts 2-5 to be ratios. For a feature amount, differences may be appropriately used for the variation amount. For every variation amount between the closed region of the line drawing to be filled and the closed region of the reference line drawing, when the variation amount is smaller than one, the inverse of it is calculated; the variation amounts are sorted in ascending order and stored as shown in Fig. 33.
Figs. 33-36 show the case in which the reference line drawing includes four closed regions.
In the description for Figs. 33-36, the number of
the closed regions of the line drawing to be filled is not mentioned. There occurs no problem if the number of closed regions differs between the reference line drawing and the line drawing to be filled. Next, each variation amount of the closed region of the line drawing to be filled is normalized. In this embodiment, when it can be regarded that the smaller the variation amount is,
the closer the feature is between the corresponding closed regions of the line drawing to be filled and the reference line drawing, the minimum value Vmin of the variation amount is normalized to 0.0 and the maximum value Vmax of the variation amount is normalized to 1.0. As for the above-mentioned five feature amounts, this assumption is applicable.
The value Vorg between Vmin and Vmax can be converted to the following value V by applying a simple linear transformation.

    V = (Vorg - Vmin) / (Vmax - Vmin)    (1)

This normalization is performed on every variation amount. Fig. 34 shows an example of the result.
Next, the variation amounts which are calculated for each feature amount are integrated and evaluated. The user can specify weights for each of the variation amounts (V1, V2, V3, V4, V5) of the feature amounts. In this example, the weights are represented as W1, W2, W3, W4, W5 and these are provided after being normalized such that 0.0 <= W1, W2, W3, W4, W5 <= 1.0. By using these weights, the integrated variation amounts between every closed region of the line drawing to be filled and every closed region of the reference line drawing are calculated from the following equation.

    Veq = W1*V1 + W2*V2 + W3*V3 + W4*V4 + W5*V5    (2)

This calculation is performed for each of the variation amounts. Then, the results of the calculation are sorted in ascending order as shown in Fig. 35.
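A small sketch of the normalization of equation (1) and the weighted integration of equation (2); the dictionary-based data layout and names are assumptions made for the example.

```python
def integrated_variations(variations, weights):
    """Normalize each feature's variation amounts to [0, 1] (equation (1)),
    integrate them with the user weights (equation (2)) and sort the reference
    regions in ascending order of the result (illustrative sketch).

    variations maps reference-region id -> (V1, ..., V5); weights = (W1, ..., W5)."""
    n = len(weights)
    mins = [min(v[k] for v in variations.values()) for k in range(n)]
    maxs = [max(v[k] for v in variations.values()) for k in range(n)]

    def normalize(value, k):                          # equation (1)
        if maxs[k] == mins[k]:
            return 0.0
        return (value - mins[k]) / (maxs[k] - mins[k])

    integrated = {}
    for region, v in variations.items():              # equation (2)
        integrated[region] = sum(w * normalize(v[k], k) for k, w in enumerate(weights))
    return sorted(integrated.items(), key=lambda item: item[1])
```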
Next, the color corresponding to each closed region of the reference line drawing is extracted from the reference picture. When two or more closed regions of the reference line drawing
have the same color on the reference picture, the duplication is eliminated from below in the list as shown in Fig. 36. In this way, the color candidate list is generated in the order of certainty for every closed region of the line drawing to be filled as shown in Fig. 36.
The user can specify a threshold for the variation amount of each feature amount. In this case, a combination of closed regions in which at least one variation amount exceeds the threshold should be placed below a combination which does not exceed any threshold for every variation amount in the list after being integrated by the formula (2) and sorted.
The manipulations for this are performed when performing the normalization. For this, the normalization using the formula (1) is performed for values which do not exceed the threshold, wherein Vmax is regarded as the maximum value which does not exceed the threshold. A value which satisfies the following equation is provided for values which exceed the threshold. For example, when the variation amount for the feature amount 1 exceeds the threshold, a value V1' which satisfies the following equation is provided.
    V1' > (W1 + W2 + W3 + W4 + W5) / W1    (3)

As a result, even when all of the other variation amounts which are normalized are 0, the integrated variation amount becomes more than the integrated amount in the case when all of the normalized variation amounts are 1. Thus, the color candidate list may be generated including the value. In addition, the value may be excluded. In the above equation (3), when W1 = 0, the result of the equation becomes indeterminate. In such a case, any value which is substituted into V1' is not evaluated in
the equation (2).
In addition, if the variation amounts of the feature amounts 2 and 4 decrease as the closed region becomes large, for example, when a camera approaches an object or the object approaches the camera, the variation amounts are regarded as exceeding the thresholds. Conversely, if the variation amounts of the feature amounts 2 and 4 increase as the closed region becomes small, for example, when a camera moves away from an object or the object moves away from the camera, the variation amounts are regarded as exceeding the thresholds.
As a result of this, the color candidate list becomes more certain.
The line drawing filling part 18 colors every closed region of the line drawing to be filled with the top color in the color candidate list for the closed region. There are various methods for filling. In this embodiment, a method which conforms to the above-mentioned separation method of the closed region will be described with reference to Figs. 37A and 37B.
In steps 81-83, filling is started by pointing at coordinates (x, y) in the closed region to be filled as a seed point. If the coordinates (x, y) are appropriate and if the closed region data of the coordinates (x, y) is the same as the specified closed region data, the specified color value is provided to the coordinates in steps 91-94.
Then, each of x and y is incremented or decremented by 1 for performing the above-mentioned process on the coordinates around the seed point in steps 95-98. Then, the paint processing is recursively called. According to the above-mentioned process, filling an unrelated closed region with an unrelated color can be prevented such that the boundary line is completely protected.
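The filling of Figs. 37A and 37B can be sketched as follows: only pixels whose closed region data equals the specified closed region number are painted, so the boundary pixels holding MAXVALUE are never touched; the iterative formulation and the names are assumptions.

```python
def fill_closed_region(image, region, seed, region_number, color):
    """Paint only pixels whose closed-region number matches region_number,
    starting from the seed point (sketch of Figs. 37A, 37B)."""
    h, w = len(image), len(image[0])
    stack = [seed]
    visited = set()
    while stack:
        x, y = stack.pop()
        if (x, y) in visited or not (0 <= x < w and 0 <= y < h):
            continue
        visited.add((x, y))
        if region[y][x] != region_number:   # boundary (MAXVALUE) or another region: stop here
            continue
        image[y][x] = color
        stack.extend([(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)])
    return image
```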
The filled line drawing storing part 19 stores the filled line drawing which is filled according to the above-mentioned process in the storage device 11. This process will not be described in detail since this process is general.
After completing the above-mentioned process, the next line drawing to be filled is read and filled in the same way. When the same reference picture / reference line drawing are used, the image reading part 12 is instructed to read the next line drawing, leaving all information on the reference picture / reference line drawing retained. In this case, processing by the reference line drawing separation part 13 and the reference line drawing feature amount extraction part 15 is not necessary.
In addition, when using the filled line drawing and the original line drawing as a new reference picture / a new reference line drawing, all information of the filled line drawing is moved to the reference picture and all information of the original line drawing is moved to the reference line drawing.
After that, the image reading part 12 is instructed to read the next line drawing to be filled. Then, the next line drawing is filled by the above-mentioned processing. If the reference picture / the reference line drawing are newly specified, the above-mentioned process is performed.
Fig. 38 is a block diagram showing the image filling apparatus according to a second example of the present invention. This image filling apparatus of the second example includes a filled line drawing/color candidate list/closed region information storing part 101 instead of the filled line drawing storing part 19 of the first example shown in Fig. 19. In addition, a filled line drawing/color candidate list/closed region information reading part 102, a filled line drawing
presentation part 103, a filled line drawing color correction part 104 and a filled line drawing storing part 105 are provided.
The filled line drawing/color candidate list/closed region information storing part 101 stores only the filled line drawing in the first example. In the second example, the filled line drawing/color candidate list/closed region information storing part 101 additionally stores the corresponding color candidate list and the separated closed region data. Every specified line drawing to be filled is stored in the storage device 11. The filled line drawing/color candidate list/closed region information reading part 102 successively reads the color candidate list and the closed region information corresponding to the filled line drawing from the storage device 11 according to instructions by the user, wherein the color candidate list and the closed region information corresponding to the filled line drawing are obtained by the filling processing for all specified line drawings.
The filled line drawing presentation part 103 displays the filled line drawing on a monitor such as a CRT, an LCD and the like.
The filled line drawing color correction part 104 changes a color of a closed region. More specifically, when the user clicks a mouse button on a closed region specified by the mouse pointer in the filled line drawing displayed on the monitor, the closed region number is identified from the coordinates of the mouse pointer. Then, the color candidate list corresponding to the closed region number is displayed on the filled line drawing, in which the color order of the displayed list is the same. When the user selects a color in the list
with the mouse pointer, the color of the specified closed region is changed to the selected color.
Thus, even when there is an error in the colors determined in the first example, the color can be easily changed with a small mouse movement, since it is probable that the correct color is in the upper part of the list.
Fig. 39 shows an example of the displayed color candidate list. The order of candidate colors is the same as that in the color candidate list obtained in the first example. At the beginning,
the region is filled with the top color in the list.
If only the colors of the color candidate list are displayed, it becomes difficult to select a color when similar colors are used. Therefore, in this example, the color data is displayed next to the color. In addition, if (R, G, B) = (255, 255, 255) is treated as transparent, it is difficult to recognize the difference between transparent (255, 255, 255) and white (254, 254, 254) on the screen.
In this case, the user can easily recognize the difference since the color data is displayed. Moreover, a description which describes that the
color is transparent is displayed on the color (in this case "transparent") such that the user can recognize the color.
When the selected color is different from the color which is already filled, the closed region is filled with the specified color. For this purpose, the processing shown in Figs. 37A and 37B can be used.
The filled line drawing storing part 105 stores the filled line drawing in which the color is changed in the storage device 11. The corresponding color candidate list is not necessarily stored since it is the same as the stored color candidate list.
When every filled line drawing which is
automatically filled in the first example needs to be checked and corrected by the user, the user repeats the above-mentioned operation. The filled line drawing which is checked and corrected by the user may be used as the reference picture, and the original line drawing may be used as the reference line drawing such that line drawings which remain unfilled are automatically filled again.
In this case, the image reading part 12 is instructed to change the reference picture and the reference line drawing, and to calculate the line drawings to be filled. By using the corrected filled line drawing, the line drawings to be filled next can be filled automatically more properly.
Fig. 40 shows a block diagram of an image filling apparatus according to a third example of the present invention. The image filling apparatus of the third example includes a filled line drawing/color candidate list/closed region information storing part 101' and a filled line drawing color correction part 104' instead of the filled line drawing/color candidate list/closed region information storing part 101 and the filled line drawing color correction part 104 of the second example. In addition, a color alias list generation storing part 106 and a color alias list reading part 107 are provided.
The color alias list generation storing part 106 provides and stores aliases of the colors used in a sequence for the color correction of the second example. For example, aliases which are easy to identify and are self-explanatory for performing the color correction in the second example are entered as shown in Fig. 41. The color alias list is stored in the form shown in Fig. 42.
The color alias list reading part 107 reads one or a plurality of color alias lists stored
in the storage device 11. The filled line drawing color correction part 104' displays the color alias list with the corresponding color of the color candidate list as shown in Fig. 43. Thus, a filling error can be detected and corrected more easily than in the second example.
As shown in Fig. 44, the image filling apparatus of the examples can also be configured with an input apparatus 110, storage devices 111 and 112, an output device 113, a recording medium 114 and a data processing apparatus 115. The input device 110 inputs data such as images. The storage device 111 corresponds to the storage device 11 in Figs. 19, 38, 40. The storage device 112 corresponds to a main storage. The output device 113 displays data such as images. The recording medium 114 is, for example, an FD (floppy disk), a CD-ROM, an MO (magneto-optical disk)
and the like, and stores an image filling program which has the parts shown in Figs. 19, 38, 40. The data processing apparatus 115 is a CPU which reads the image filling program from the recording medium 114 and executes it.
According to the examples, the computational amount can be decreased when determining the color used for filling the closed region of the line drawing with reference to the reference line drawing. In addition, the color candidate list is generated in which the candidate colors are sorted in the order of certainty for every closed region of the line drawing to be filled. Further, the top color in the list is used for automatic filling. Thus, no unfilled region remains.
Furthermore, it is easy to check and correct an error of the color. Thus, the user can
save effort for filling.

(Fourth example)
Next, the fourth example will be described with reference to figures. As shown in Fig. 45, the image filling apparatus of the fourth example includes a storage device 211, a line drawing to be filled reading part 212, a boundary line information extraction part 213, a filling part 214, a colored line filling part 215, a filled line drawing storing part 216 and an image display device 217, a pointing device, a keyboard and the like.
The storage device 211 stores line drawings to be filled which include colored lines.
The line drawing to be filled reading part 212 reads the line drawing to be filled from the storage device 211. The boundary line information extraction part 213 extracts a boundary line by using the color of the line drawing to be filled as a key.
Fig. 46 shows an example of the boundary line information which is generated. In this embodiment, there are four kinds of boundary lines, which are black, red, green and blue (as the colored line, there are three kinds: red, green, blue). One bit is assigned to each pixel of the boundary line information for a color. In the figure, Xsize and Ysize represent the number of pixels horizontally and vertically respectively. 0 is assigned to a pixel of a region (white) other than the boundary line as the boundary line information. Therefore, there are five kinds of color information, which are white,
35 black red, green, blue. Thus, three bits are enough for representing one pixel of the boundary line information since the colors are mutually
-34 independent events. However, assigning three bits is equivalent to assigning four bits to the boundary line information for one pixel since memory amount for storing the boundary line information in the 5 cease of assigning three bits is the same as that in the case of assigning four bits due to the byte length. In addition, when the user treats the three kinds (red, green, blue) of colored L3ne likewise, two bits are enough. In this case, it becomes lo impossible to recover the color of the colored line when an error is detected after changing the color of the colored line by after-mentioned colored line filling part 15. However, when there is no problem if the changed color is recovered to a color, for 15example, red, two bits can be assigned to the boundary line information for one pixel such that the memory amount can be decreased.
Fig.47 shows an example of extraction of the boundary line information. A threshold is provided beforehand by the user for each of R, G, B as Rthr, Gthr, Bthr. In step 221, the boundary line information is initialized. Then, the line drawing to be filled is scanned in the x-axis direction and the y-axis direction in steps 222, 223. Color information (r, g, b) of (x, y) is extracted in step 224. Then, the pixels of the line drawing are scanned while comparing the color information (r, g, b) of each pixel with the thresholds Rthr, Gthr, Bthr in steps 225, 227, 229, 231, 233.
If every color r, g, b is larger than or equal to the corresponding threshold, that is, if the following equations are satisfied,
r ≥ Rthr
g ≥ Gthr    (4)
b ≥ Bthr
the pixel is determined to be white and 0000 is set to the boundary line information which corresponds to the pixel (x, y) in step 226. If every color r, g, b is smaller than the corresponding threshold, that is, if the following equations are satisfied,
r < Rthr
g < Gthr    (5)
b < Bthr
the pixel is determined to be black and 1000 is set to the boundary line information which corresponds to the pixel (x, y) in step 228.
If only r is larger than or equal to the corresponding threshold Rthr and g, b are smaller than Gthr, Bthr respectively, that is, if the following equations are satisfied,
r ≥ Rthr
g < Gthr    (6)
b < Bthr
the pixel is determined to be red and 0100 is set to the boundary line information which corresponds to the pixel (x, y) in step 230. If only g is larger than or equal to the corresponding threshold Gthr and r, b are smaller than Rthr, Bthr respectively, that is, if the following equations are satisfied,
r < Rthr
g ≥ Gthr    (7)
b < Bthr
the pixel is determined to be green and 0010 is set to the boundary line information which corresponds to the pixel (x, y) in step 232.
If only b is larger than or equal to the corresponding threshold Bthr and r, g are smaller than Rthr, Gthr respectively, that is, if the following equations are satisfied,
r < Rthr
g < Gthr    (8)
b ≥ Bthr
the pixel is determined to be blue and 0001 is set to the boundary line information which corresponds to the pixel (x, y) in step 234.
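The per-pixel classification of equations (4)-(8) can be summarized in a short sketch. The following Python fragment is only an illustrative reading of the flow of Fig.47, not the patented implementation: the threshold values, the function names and the choice to fall back to white for an unclassified pixel (rather than asking the user, as in step 235) are assumptions.

```python
# Minimal sketch of the per-pixel boundary classification of equations (4)-(8).
# Threshold values and all names are illustrative assumptions.

R_THR, G_THR, B_THR = 128, 128, 128  # user-chosen thresholds (assumed values)

WHITE, BLACK, RED, GREEN, BLUE = 0b0000, 0b1000, 0b0100, 0b0010, 0b0001

def classify_pixel(r, g, b):
    """Return the 4-bit boundary code for one pixel, or None if no rule applies."""
    if r >= R_THR and g >= G_THR and b >= B_THR:
        return WHITE            # equation (4): not on the boundary line
    if r < R_THR and g < G_THR and b < B_THR:
        return BLACK            # equation (5): black boundary line
    if r >= R_THR and g < G_THR and b < B_THR:
        return RED              # equation (6): red colored line
    if r < R_THR and g >= G_THR and b < B_THR:
        return GREEN            # equation (7): green colored line
    if r < R_THR and g < G_THR and b >= B_THR:
        return BLUE             # equation (8): blue colored line
    return None                 # step 235 in the text: ask the user instead

def extract_boundary_info(image):
    """image: 2D list of (r, g, b) tuples; returns a 2D list of boundary codes."""
    info = []
    for row in image:
        info_row = []
        for (r, g, b) in row:
            code = classify_pixel(r, g, b)
            info_row.append(WHITE if code is None else code)  # sketch: default to white
        info.append(info_row)
    return info
```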
The above-mentioned process using equations (4)-(8) is displayed on the image display device 217 one after another such that the user can change the thresholds Rthr, Gthr, Bthr while checking the process on the display device. When a pixel which does not satisfy any one of equations (4)-(8) is detected, the image filling apparatus makes an inquiry to the user whether the thresholds need to be changed or the pixel is not on the boundary line in step 235. In the former, the apparatus requests the user to input new thresholds after suspending the process.
In the latter, the process is continued after 0000 is set to the boundary line information which corresponds to the pixel in step 236.
In this example, as shown in Fig.47, the boundary line is judged in the order of white, black, red, green, blue. However, any other order can be used since the colors are mutually independent for a pixel. The filling part 214 colors the region other than the boundary line by using the boundary line information which is generated by the boundary line information extraction part 213. More specifically, only the region which has 0000 as the boundary information is filled in this embodiment. A general filling method can be used for filling regions which have other than 0000 as the boundary information. Fig.48 shows an example. In this example, the line drawing to be filled is displayed on the image display device 217 one after another while the user colors the line drawing to be filled.
Coloring is started from coordinates (x, y) which are pointed at by the user with a pointing device such as a mouse, and the color used for filling is specified by the user with the pointing device, a keyboard or the like in step 241. When the coordinates are in the line drawing to be filled, the boundary information is 0000 and the color of (x, y) is not the same as the color which the user specifies, the specified color information is assigned to the coordinates in steps 242-245. The same process (paint process) is performed recursively for the coordinates around those coordinates in steps 246-249.
Thus, the closed region which includes the coordinates specified initially by the user and does not include the boundary line is filled with the color specified by the user.
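A minimal sketch of this paint process (steps 241-249) is given below, assuming the 4-bit boundary codes described above. It uses an explicit stack instead of the recursion of the text, adds a seed-color check as a common flood-fill safeguard, and all names are illustrative rather than taken from the patent.

```python
# Sketch of the filling of steps 241-249: flood fill a closed region whose
# boundary information is 0000, never painting over boundary-line pixels.

WHITE = 0b0000  # boundary-information code for "not on a boundary line"

def fill_region(image, boundary_info, x, y, new_color):
    """image: 2D list of (r, g, b); boundary_info: 2D list of 4-bit codes."""
    height, width = len(image), len(image[0])
    if not (0 <= x < width and 0 <= y < height):
        return
    if boundary_info[y][x] != WHITE:
        return                      # the seed must lie off the boundary line
    old_color = image[y][x]
    if old_color == new_color:
        return                      # step 244: already the specified color
    stack = [(x, y)]
    while stack:
        cx, cy = stack.pop()
        if not (0 <= cx < width and 0 <= cy < height):
            continue
        if boundary_info[cy][cx] != WHITE:
            continue                # never paint over the boundary line
        if image[cy][cx] != old_color:
            continue                # extra safeguard: stay inside the seed's region
        image[cy][cx] = new_color   # steps 242-245: assign the specified color
        # steps 246-249: continue with the four surrounding coordinates
        stack.extend([(cx + 1, cy), (cx - 1, cy), (cx, cy + 1), (cx, cy - 1)])
```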
As a result, the line drawing is filled without filling the boundary line. The line drawing to be filled which has been filled in this way is called the filled line drawing. The colored line filling part 215 changes the color of the colored line (boundary line) to an appropriate color.
The processing by the colored line filling part 215 will be described with reference to Figs. 49A and 49B. In the following description, a mouse is used as the pointing device. If the mouse is replaced by a trackball, a graphics tablet or the like, the following method is the same.
As shown in Fig. 49A, when a mouse button and a button of the keyboard (for example, a control button in this case) are pushed, color information (r, g, b) of the coordinates at the mouse pointer on the line drawing is obtained if the boundary line information of the coordinates is 0000 in steps 251-254. If the boundary line information is not 0000, nothing is done. When only the mouse button is pushed, the obtained color information (r, g, b) is set to the coordinates at the mouse pointer if the boundary line information at the coordinates is 0100 or 0010 or 0001, until a mouse up event is detected (that is, while the mouse button is pushed) in steps 255-257. Accordingly, in the case shown in Fig. 49A, the color specified by the mouse pointer is obtained when the mouse button is pushed in conjunction with the control button. Then, when the mouse pointer is moved while pushing only the mouse button, the color of the part of the boundary line over which the mouse pointer passed is changed to the obtained color.
Therefore, when the user wants to change the color of the boundary line (colored line) to the brighter one of the region colors divided by the boundary line, the user may push the mouse button in conjunction with the control button on the side of the brighter color to obtain the color, and then the user may move the mouse pointer over the part whose color the user wants to change while pushing the mouse button. Then, the color of the part is changed to the color the user wants. When there are a plurality of different color regions divided by colored lines, only the color of a part of the colored line can be changed in the same way.
When the mouse has two mouse buttons, the above-mentioned process can be realized by pushing the left button instead of pushing only the above-mentioned mouse button and by pushing the right button instead of pushing the mouse button with the control button.
In the case shown in Fig. 49B, when the mouse button is pushed and the boundary line information of the coordinates at the mouse pointer is 0000, the color information (r, g, b) is obtained in steps 261-263. In the other cases (0100 or 0010 or 0001), the color of the coordinates specified by the mouse pointer is changed to the obtained color until a mouse up event is detected (that is, while the mouse button is pushed). Thus, according to the method shown in Fig.49B, when the mouse pointer is moved from a region to another region across the colored line, the color of the colored line is changed to the color of the region where the mouse pointer was initially located. Therefore, when the user pushes the mouse button in the brighter color region of the regions divided by the colored line and moves the mouse pointer over the part of the colored line whose color the user wants to change, the color of the part over which the mouse pointer passed can be changed to the brighter color as shown in Fig. 50. When there are a plurality of different color regions divided by colored lines, the color of only a part of the colored line can be changed in the same way.
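Either way, the behaviour of Figs. 49A and 49B reduces to picking a color from a non-boundary pixel and then rewriting only colored-line pixels along the pointer path. The sketch below abstracts the mouse handling away into plain arguments; the function and variable names are hypothetical, and the event plumbing of steps 251-257 and 261-263 is not reproduced.

```python
# Sketch of colored-line recoloring: `picked_at` is the pixel where the color was
# obtained (boundary code 0000), `path` is the list of pixels the pointer passed
# over while the button was held. Only colored-line pixels are changed.

WHITE, RED, GREEN, BLUE = 0b0000, 0b0100, 0b0010, 0b0001
COLORED_LINE_CODES = {RED, GREEN, BLUE}

def recolor_colored_line(image, boundary_info, picked_at, path):
    """Change colored-line pixels on `path` to the color picked at `picked_at`."""
    px, py = picked_at
    if boundary_info[py][px] != WHITE:
        return                           # a color can only be picked off the boundary line
    picked_color = image[py][px]
    for (x, y) in path:
        if boundary_info[y][x] in COLORED_LINE_CODES:
            image[y][x] = picked_color   # other pixels on the path are left untouched
```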
In both of the methods shown in Figs.49A and 49B, since all boundary line information is kept stored, the above-mentioned process can be performed any number of times when the user makes a mistake.
In addition, since each of the colored lines of red, blue and green is represented separately in the boundary line information, the changed color of the colored line can be turned back to the initial state.
The present example and the paint bucket tool can be combined. More specifically, when a colored line should be filled with a plurality of colors, the user divides the colored line by using the present invention as shown in Fig.51A. Then, the user can color the divided colored line with the paint bucket tool as shown in Fig.51B.
The filled line drawing storing part 216 stores the filled line drawing in the storage device 211. As shown in Fig.52, the image filling apparatus of the examples can also be configured by a mouse 71, storage devices 72, 73, an output device 74, a recording medium 75 and a data processing apparatus 76. The storage device 72 corresponds to the storage device 211 in Fig.45. The storage device 73 is a main storage device. The output device 74 displays data such as images. The recording medium 75 corresponds to a recording medium such as an FD (floppy disk), a CD-ROM, an MO (magneto-optic disk) and the like which stores an image filling program which has the line drawing to be filled reading part 212 through the filled line drawing storing part 216 shown in Fig.45. The data processing apparatus 76 is a CPU which reads the image filling program from the recording medium 75 and executes it.
According to the present invention, the color of the colored line can be changed without being affected by the filling operation of other regions. Thus, it is not necessary for the user to consider the filling order. In addition, even when a colored line extends over a plurality of regions, the user can change the color of only the necessary part of the colored line without affecting other regions. Thus, the user can perform filling effectively and flexibly.
The above-mentioned fourth example can be realized in concert with other examples of the present invention. For example, the fourth example can be used for filling the colored line in the first - third examples.
(fifth example)
In the following, the fifth example will be described with reference to figures.
As shown in Fig.53, an image filling apparatus of the fifth example includes a color specifying information generation part 311, a color specifying information storing part 312, a storage device 313, a line drawing to be filled reading part 314, a line drawing displaying part 315, a color specifying information reading part 316, a filling part 317 and a filled line drawing storing part 318.
The color specifying information generation part 311 generates color specifying information according to instructions by the user, wherein the color specifying information includes colors used for filling and names corresponding to the colors. Fig.54 shows an example of the color specifying information.
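The color specifying information of Fig.54 can be thought of as a small table pairing names with color values. The entries below are invented examples used only to illustrate the structure; they are not taken from the patent figures.

```python
# Illustrative sketch of color specifying information: names paired with RGB values.
# All names and values here are invented for the example.
color_specifying_information = [
    {"name": "skin",  "rgb": (250, 220, 190)},
    {"name": "hair",  "rgb": (60, 40, 30)},
    {"name": "shirt", "rgb": (200, 30, 30)},
]

def color_for(name):
    """Look up the color value registered under a given name."""
    for entry in color_specifying_information:
        if entry["name"] == name:
            return entry["rgb"]
    return None
```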
The color specifying information storing part 312 stores the generated color specifying information in the storage device 313.
The storage device 313 stores line drawings to be filled, the color specifying information and filled line drawings.
The line drawing to be filled reading part 314 reads the line drawing to be filled from the storage device 313 according to instructions by the user. The line drawing displaying part 315 displays the line drawing to be filled on an image displaying device 319 such as a CRT.
The color specifying information reading part 316 reads the color specifying information from the storage device according to instructions by the user. The filling part 317 obtains coordinates in a closed region which the user wants to color in the line drawing to be filled displayed on the image displaying device 319, in which the coordinates are specified by a pointing device. Then, the filling part 317 overlays the color specifying information on the line drawing at the coordinates. When the user specifies a color in the color specifying information with the pointing device, the filling part 317 obtains the corresponding color value from the color specifying information and colors the closed region specified by the coordinates with the specified color.
In the following, the fifth example will be described more specifically with reference to Figs.55, 56, 57A, 57B and 57C.
In the process shown in Fig.55, a mouse which has one button is used as the pointing device.
In step 301, the user pushes the mouse button at coordinates in a closed region of the line drawing to be filled which the user wants to color, by using the mouse cursor displayed on the image displaying device 319. Then, the coordinates at the mouse cursor are obtained when the mouse button is pushed, and the coordinates are set in (x, y) in step 302. It is checked whether (x, y) is in the line drawing to be filled in step 303. If (x, y) is not in the line drawing, the process is completed without performing any process. If (x, y) is in the line drawing, the color specifying information which was previously read is displayed at (x, y), overlaid on the line drawing to be filled as shown in Fig.57A, in step 304.
The user moves the mouse cursor to a color in the color specifying information which the user wants for filling while pushing the mouse button, as shown in Fig.57B. Then, the user releases the mouse button in step 305. The coordinates at the time of releasing the mouse button are obtained in step 306.
In step 307, it is checked whether the coordinates are on an effective region of the color specifying information. If not, the color specifying information is deleted from the screen and the process is completed in step 310. If the coordinates are on the effective region of the color specifying information, the color value corresponding to the coordinates is obtained from the color specifying information in step 308, and a paint process is performed by passing (x, y) and (R, G, B) to the paint process in step 309. Then, the closed region including (x, y) is filled with (R, G, B) in step 311. When the paint process is completed, the color specifying information is deleted from the screen as shown in Fig.57C in step 310. The above-mentioned process is repeated until filling of the line drawing is completed.
Fig.56 shows an example of the paint process. As mentioned above, when filling the closed region, it is necessary to specify the coordinates in the closed region and the color value. There are various methods for filling the closed region by using the coordinates and the color value.
In this embodiment, a method using a recursive call will be described.
The (x, y) passed from the step 310 is saved in working coordinates (a, b) in step 321. It is checked whether (a, b) is on the line drawing to be filled and does not have the color of the boundary line in step 322. If this condition is not satisfied, the process returns in step 328. When it is satisfied, the color of (a, b) in the line drawing is changed to the color (R, G, B) which was passed from the step 310 in step 323. After that, coordinates which are adjacent to (a, b) are generated and are input to (x, y) in steps 324-327. Then, this paint process is recursively called in step 329. Accordingly, filling with (R, G, B) is performed from the (x, y) passed from the step 310 until the coordinates reach the boundary line or the end of the closed region. That is, the inside of the closed region is filled with the color (R, G, B).
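The recursive call of Fig.56 can be sketched as follows. This is an interpretation under assumptions: the image is a 2D list of RGB tuples, the boundary is detected by comparing against a single boundary color, and an extra "already painted" test is added so the recursion terminates; the step numbers in the comments refer to Fig.56.

```python
import sys
sys.setrecursionlimit(100000)  # the recursion of Fig.56 can be deep for large regions

def paint(image, x, y, rgb, boundary_color):
    """Recursive paint process of Fig.56: fill from (x, y) with `rgb`,
    stopping at the boundary color or the edge of the drawing."""
    a, b = x, y                                   # step 321: working coordinates
    height, width = len(image), len(image[0])
    if not (0 <= a < width and 0 <= b < height):  # step 322/328: outside the drawing
        return
    if image[b][a] == boundary_color or image[b][a] == rgb:
        return                                    # boundary reached or already painted (added check)
    image[b][a] = rgb                             # step 323: change the color
    # steps 324-327 and 329: generate the four adjacent coordinates and recurse
    paint(image, a + 1, b, rgb, boundary_color)
    paint(image, a - 1, b, rgb, boundary_color)
    paint(image, a, b + 1, rgb, boundary_color)
    paint(image, a, b - 1, rgb, boundary_color)
```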
The filled line drawing storing part 318 stores the filled line drawing in the storage device 313. As shown in Fig.58, the image filling apparatus of the examples of the present invention can also be configured by an input device 321, storage devices 322, 323, a display device 324, a recording medium 325 and a data processing apparatus 326. The input device 321 is a pointing device, keyboard and the like for inputting colors and the corresponding names. The storage device 322 corresponds to the storage device 313 in Fig.53. The storage device 322 is, for example, a hard disk. The display device 324 corresponds to the image displaying device 319 in Fig.53. The recording medium 325 corresponds to a recording medium such as an FD (floppy disk), a CD-ROM, an MO (magneto-optic disk) and the like which stores an image filling program which has the color specifying information generation part 311, the color specifying information storing part 312, the line drawing to be filled reading part 314, the line drawing displaying part 315, the color specifying information reading part 316, the filling part 317 and the filled line drawing storing part 318. The data processing apparatus 326 is a CPU which reads the image filling program from the recording medium 325 and executes it. According to the above-mentioned invention, the user can save labor in producing animation in which digitized line drawings are filled with predetermined colors. That is because the user can specify a color for filling the line drawing with small movement of the pointing device.
The above-mentioned fifth example can be realized in concert with other examples of the present invention. For example, the fifth example can be used with the first - third examples. In addition, the color which is painted by the first - third examples can be changed by the fifth example. Thus, filling becomes speedy and accurate.
(first embodiment)
The embodiment of the present invention will be described in the following. The embodiment allows the object of the present invention to be achieved.
Fig.59 shows an example of an image processing apparatus of the embodiment. The image processing apparatus includes a display 410, a keyboard 420, a pointing device 430 (a mouse), a printer 440, a processing device 450, a memory device 460 and an external storage device 470. This configuration itself is basically the same as a computer system. The processing device 450 includes an image reading part 451, a small region searching part 452, a small region changing part 453 and an image writing part 454. The processing device 450 also includes a control part and the like, not shown in the figure, for controlling each part.
The memory device 460 is a so-called working memory.
The memory device 460 stores small region size data 461 which is input from the keyboard 420, a search state display table 462 and a small region table 463 which are generated by the small region searching part 452, and image data 464 which is processed halfway. The memory device 460 may be included in the processing device 450.
The external storage device 470 stores an unfilled line drawing file and a line drawing file which is filled by the 4-connected pixel seed fill algorithm or the scan line seed fill algorithm. The line drawing will be called an image file in the following. This storage device can be realized by a magnetic disc, a magneto-optic disk or any other devices. The external storage device 470 may be included in the memory device 460.
The image reading part 451 reads an object image file which is stored in the external storage device 470, stores it in the memory device (the working memory) 460 as the image data 464 and displays it on the display 410. The image data to be processed may also be input by an image scanner, for example. The small region searching part 452 generates the search state display table 462 and the small region table 463 on the memory device 460.
Then, the small region searching part 452 searches the image data 464 for extracting small regions which are smaller than or equal to the small region size (for example, one pixel, two pixels) which is specified beforehand by the user, wherein the searched pixels are recorded in the search state display table 462 and the extracted small regions are recorded in the small region table 463.
The small region changing part 453 reads the small region table 463 from the memory device 460. Then, the small region changing part 453 displays a mark representing the small region extracted by the small region searching part 452 on the image data, and makes an inquiry to the user about the next processes for the extracted small region. The user provides instructions about the next processes by the keyboard 420 or the mouse 430. The small region changing part 453 performs processing on the small region according to the instructions such that the image data 464 in the memory device is changed. The processing by the small region changing part is performed in an interactive manner between the user and the apparatus, wherein the small region changing process is displayed on the display 410 one after another. The image writing part 454 writes the processed image data 464 to the external storage device 470. The processed image data may be printed out by the printer 440 according to a user's instruction and the like.
In the following, the processes by the small region searching part 452 and the small region changing part 453 will be described in detail.
Fig.60 shows the search state display table 462 for checking the searched pixels in the image data 464. The search state display table 462 is generated corresponding to each image data 464 to be processed. In the table, one bit is assigned to one pixel because it is enough to represent two states (unsearched or searched) for one pixel; for example, 0 is assigned to the unsearched state and 1 is assigned to the searched state. The sizes in the x, y directions are the same as the sizes (Xsize, Ysize) of the image data 464 to be processed.
Fig.61 shows the small region table 463 which stores the extracted small regions (a small region list). The small region table 463 has the number of extracted small regions (count), the size (number of pixels) of each small region (num), the coordinates x, y of a pixel which is in each small region, and a flag indicating whether the small region has been processed by the small region changing part 453 (change state flag). In the same way as the search state display table 462, the small region table 463 is generated for each image data 464 to be processed.
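Read as data structures, the two tables might look like the sketch below. Per-pixel flags are stored as booleans rather than the packed one-bit representation of Fig.60, and all names are illustrative assumptions.

```python
from dataclasses import dataclass, field

def make_search_state_table(xsize, ysize):
    """Search state display table (Fig.60): one flag per pixel,
    False = unsearched (0), True = searched (1). A packed bit array
    could be used instead, as the text assigns one bit per pixel."""
    return [[False] * xsize for _ in range(ysize)]

@dataclass
class SmallRegionTable:
    """Small region table (Fig.61): one entry per extracted small region."""
    num: list = field(default_factory=list)      # size in pixels of each region
    x: list = field(default_factory=list)        # x of one pixel inside the region
    y: list = field(default_factory=list)        # y of one pixel inside the region
    changed: list = field(default_factory=list)  # change state flag (0/1)

    @property
    def count(self):
        return len(self.num)                     # number of extracted small regions
```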
Figs.62-65 show process flowcharts of the small region searching part 452. Fig.62 is a flowchart showing the whole process of the small region searching part 452, Figs. 63 and 64 show generation and initialization flows of the search state display table, and Fig. 65 shows a search process flow.
The process flow shown in Fig. 63 is used for checking small regions included in image data regardless of whether the small regions are unfilled or filled.
The process flow shown in Fig. 64 is used for checking unfilled small regions included in image data in which the image data has been filled by the 4-connected pixel seed fill algorithm, the scan line seed fill algorithm and the like. The process shown in Fig. 63 is effective for preventing the user from forgetting to change a color of a small region. The user specifies the flow shown in Fig. 63 or 64 beforehand.
In step 421, the small region searching part 452 generates and initializes the search state display table 462. As mentioned above, the process flow shown in Fig. 63 is used when checking small regions included in the image data 464 regardless of whether the small regions are unfilled or filled. The process flow shown in Fig. 64 is used when checking unfilled small regions included in the image data 464.
When checking small regions included in image data regardless of whether they are unfilled or filled, it is unnecessary to search lines included in the image data. Therefore, 1 (searched) is assigned to the coordinates of a pixel of the predetermined color representing the line, and 0 (unsearched) is assigned to the coordinates of a pixel which is not included in the line, since searching is necessary for that pixel.
As shown in Fig. 63, the states of all pixels in the search state display table 462 are initialized to 0 in step 441, and the image data 464 is scanned in steps 442 and 443. Then, when the color of a scanned pixel is not the line color, the state of the corresponding coordinates remains 0, and when it is the line color, the state is changed to 1 in steps 444, 445.
When checking only unfilled small regions, it is necessary to search unfilled pixels. Therefore, 0 is set to the state of the coordinates of an unfilled pixel since searching is necessary for the unfilled pixel, and 1 is set to the state of the coordinates of a filled pixel since searching is not necessary for the filled pixel.
As shown in Fig.64, the states of all pixels in the search state display table 462 are initialized to 0 (unsearched) in step 451, and the image data is scanned in steps 452 and 453. Then, when the scanned pixel is unfilled, the state of the corresponding coordinates remains 0, and when it is filled, the state is changed to 1 in steps 454, 455.
Next, the small region searching part 452 generates and initializes the small region table 463 in step 422. For this, all information is changed to 0 except the number.
After that, the small region searching part 452 initializes the number of small regions (count) to 0 in step 423, and scans the search state display table 462 in steps 424 and 425. When an unsearched pixel is found, working variables (the counter of pixels: pixels, the x coordinate of a pixel in the small region: l, the y coordinate of a pixel in the small region: m) are initialized in steps 426 and 427. Then, the searching process shown in Fig.65 is called in step 428.
In the searching process shown in Fig.65, after the coordinates (a, b) at the time when the process is called are set in step 461, it is checked whether the coordinates are in the image data and unsearched. When the coordinates are not in the image data or already searched, the process returns. When the coordinates are in the image data and unsearched, the coordinates (the pixel) in the search state display table 462 are changed to searched in step 464, the working variable counter (pixels) is incremented by 1 in step 465, and a, b are substituted into l, m respectively in step 466. Then, four coordinates around the pixel are generated in steps 467-470 so that the same searching process is recursively called.
After the searching process shown in Fig.65 ends, the small region searching part 452 checks the size of the working variable (pixels) in step 429. If the size is larger than the small region size 461 specified by the user, the region is not regarded as a small region. Thus, the process returns to the step 425 in which the search state display table is scanned. When the size is smaller than or equal to the small region size 461 specified by the user, the value of pixels is substituted into num[count], the value of l is substituted into x[count], and the value of m is substituted into y[count] in step 430. Then, count, which represents the small region number, is incremented by 1 in step 431 and the same process is repeated.
According to the above-mentioned procedure, the number of all extracted small regions, the size of each small region and the coordinates of a pixel included in each small region are obtained regardless of whether the image data is unfilled or filled. They are stored in the small region table 463 as the small region list.
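Putting Figs.62-65 together, the search can be sketched as below for the Fig.63 variant, in which line pixels are pre-marked as searched. An explicit stack replaces the recursion of Fig.65, and the function name and the dictionary rows are assumptions of this sketch rather than details taken from the flowcharts.

```python
# Sketch of the small-region search of Figs.62-65 (line-color variant of Fig.63).
# Returns one row per region whose pixel count is at most small_region_size.

def find_small_regions(image, line_color, small_region_size):
    height, width = len(image), len(image[0])
    # search state display table: line pixels start as "searched" (Fig.63)
    searched = [[image[y][x] == line_color for x in range(width)] for y in range(height)]
    small_regions = []
    for y0 in range(height):
        for x0 in range(width):
            if searched[y0][x0]:
                continue
            # flood the connected region containing (x0, y0), counting its pixels
            pixels, l, m = 0, x0, y0
            stack = [(x0, y0)]
            while stack:
                a, b = stack.pop()
                if not (0 <= a < width and 0 <= b < height) or searched[b][a]:
                    continue
                searched[b][a] = True          # step 464: mark as searched
                pixels += 1                    # step 465: count the pixel
                l, m = a, b                    # step 466: remember a pixel of the region
                # steps 467-470: the four surrounding coordinates
                stack.extend([(a + 1, b), (a - 1, b), (a, b + 1), (a, b - 1)])
            if pixels <= small_region_size:    # steps 429-431: keep only small regions
                small_regions.append({"num": pixels, "x": l, "y": m, "changed": 0})
    return small_regions
```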
Fig.66 shows a process flow example of the small region changing part 453. The small region changing part 453 performs processes for each small region extracted by the small region searching part 452 in an interactive manner with the user. In this embodiment, the processes include:
(1) filling a region with a color other than that of the outline (filling of the small region)
(2) filling a region with the color of the outline (filling the small region with the color of the outline)
(3) removing pixels of the outline (enlarging the small region by deleting the pixels constituting the outline while keeping the closed region, and filling)
(4) maintaining the existing state (maintaining the small region as it is)
As shown in Fig. 66, the small region changing part 453 reads the small region table 463 from the memory device 460 in step 481 and initializes the working variable (the small region number counter: i) in step 482. Next, when i is smaller than or equal to the number of the small regions (count) in the small region table 463 in step 483, the small region changing part 453 checks whether the change state flag of the ith row in the small region table 463 is 0 or not in step 484.
When the flag is 0 (unchanged), a mark corresponding to the small region which includes the coordinates (x[i], y[i]) is provided to the small region and displayed on the display 410 in step 485, in which the mark is overlaid on the image data by using the coordinates (x[i], y[i]) which are in the ith row of the small region table 463. The small region changing part 453 inquires of the user about the processes for the small region in step 486.
The user sees the mark displayed on the image data, recognizes the small region and the place, and specifies a process for the small region.
When the user specifies 'filling a region with a color other than that of the outline', the small region changing part 453 asks the user about the color for filling in step 487 such that the small region is filled with the color in step 488. When the user specifies 'filling a region with the color of the outline', the small region changing part 453 asks the user about the outline color in step 489 such that the small region is filled with the outline color in step 490. When the outline color is already known, the step 489 may be omitted. When the user specifies 'removing pixels of the outline', the small region changing part 453 asks the user about the pixels to be removed and a color used for removing the pixels in steps 491 and 492, and replaces the pixels to be removed with the color in step 493. When the user specifies 'maintaining the existing state', nothing is done.
After that, the small region changing part 453 sets the change state flag of the ith row of the small region table 463 to 1 (changed) in step 494, and deletes the displayed mark corresponding to the small region in step 495. Then, the working variable counter i is incremented by 1 in step 496 and the process returns to the step 483. The above-mentioned process is repeated until i reaches the small region number (count) shown in the small region table 463.
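The interactive loop of Fig.66 can be summarized by the sketch below. The mark display, the user dialogue and the actual fill routine are passed in as callables, and the action dictionary keys are invented for illustration; only the control flow and the four operations (1)-(4) follow the text.

```python
# Sketch of the interactive loop of Fig.66; helpers are injected as callables.

def change_small_regions(small_regions, show_mark, hide_mark, ask_user, fill_from, set_pixel):
    for region in small_regions:                          # steps 482, 483, 496
        if region["changed"]:                             # step 484: already processed
            continue
        show_mark(region["x"], region["y"])               # step 485: overlay a mark
        action = ask_user(region)                         # step 486: inquire of the user
        if action["kind"] == "fill":                      # (1) fill with a chosen color
            fill_from(region["x"], region["y"], action["color"])          # steps 487-488
        elif action["kind"] == "fill_with_outline":       # (2) fill with the outline color
            fill_from(region["x"], region["y"], action["outline_color"])  # steps 489-490
        elif action["kind"] == "remove_outline":          # (3) delete outline pixels
            for (x, y) in action["pixels"]:               # steps 491-493
                set_pixel(x, y, action["color"])
        # (4) "maintaining the existing state": nothing is done
        region["changed"] = 1                             # step 494
        hide_mark(region["x"], region["y"])               # step 495
```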
The small region changing part 453 may ask the user about the following processes (continuing, discontinuing and the like) after displaying the small region list, for example after reading the small region table in step 481. Then, the user may provide an instruction for discontinuing the process when the extracted small regions are very small and negligible. Figs. 67A and 67B show an example of the mark displayed on the display 410, wherein the mark is overlaid on the image data. The design and the color of the mark are specified by the user beforehand such that the user can recognize the location and the color of the small region at a glance when the mark is displayed. For example, the color of the mark is designed to be a conspicuous color which is not used in the image data. The design and the color of the mark may be changed according to the location of the small region in the same image data.
Fig.68 shows an example of the process for the small region. In the case of the example shown in Fig. 68, searching for small regions which are smaller than or equal to 4 pixels is specified by the user. For the sake of clarity, the mark is not shown in Fig. 68, and the small regions are assumed to be unfilled and all small regions are assumed to be processed by one process, although other cases are possible according to the present invention.
In Fig. 68, (a) shows partial image data in the initial state in which there are small regions, and each of (b)-(e) shows the partial data after being processed. In Fig.68, the case of (e) shows the same image as (a) because the instruction by the user is maintaining the existing state. (d') shows a state in progress (in which outline pixels are deleted). Such a state in progress is also displayed on the display. The changing processes shown in (b), (c), (d'), (d) and the like can be realized by a conventional filling algorithm such as the 4-connected pixel seed fill algorithm, a simple one-pixel color changing process or the like.
All or a part of the image reading part 451, the small region searching part 452, the small region changing part 453 and the image writing part 454 shown in Fig. 59 (for example, only 452 and 453) can be described by a language which can be executed by a computer as a program such that the program is stored in a computer readable recording medium such as a floppy disc, a CD-ROM, a memory card and the like.
As mentioned above, according to the present invention, a small region which is smaller than or equal to a size predetermined by the user is searched for in digitized image data. Then, its existence and location are presented to the user, and proper processing can be performed in an interactive manner in which filling, filling with a line color, deleting line pixels or the like is performed. As a result, it becomes easy and accurate to check and correct an unfilled small region which is forgotten or is a mistake in filling, which occurs frequently in producing animation.
The above-mentioned embodiment can be realized in concert with other examples of the present invention. For example, the embodiment can be used with the first - third examples, in which an unfilled region which remained after filling by the first - third examples can be easily checked and corrected by the embodiment. The present invention described with the examples and the embodiment is most effective in two dimensional animation, especially in filling in recent digital animation. In this specification, two dimensional animation means animation produced by filling a line drawing which is input by hand from a cell or input by a digital map. Three dimensional animation means animation in which the color is already specified for each surface when constructing a frame model in a computer.
The present invention is not limited to the specifically disclosed embodiment, and variations and modifications may be made without departing from the scope of the invention.
Claims (11)
1. An image processing method comprising the steps of: inputting image data; searching said image data for extracting a small region smaller than or equal to a predetermined size; and outputting a list of said small regions.
2. An image processing method comprising the steps of: inputting image data; searching said image data for extracting a small region smaller than or equal to a predetermined size; providing a mark to said small region; and displaying said mark wherein said mark is overlaid on said image data.
3. An image processing method as claimed in claim 2, further comprising the steps of: asking a user about processing for said small region such that processing specified by the user is performed.
4. An image processing apparatus comprising: a part for inputting image data; a part for searching said image data for extracting a small region smaller than or equal to a predetermined size; and a part for outputting a list of said small regions.
5. An image processing apparatus comprising: a part for inputting image data; a part for searching said image data for extracting a small region smaller than or equal to a predetermined size; a part for providing a mark to said small region; and a part for displaying said mark wherein said mark is overlaid on said image data.
6. An image processing apparatus as claimed in claim 5, and further comprising: a part for asking a user about processing for said small region such that processing specified by the user is performed.
7. A computer program code for causing a computer to process images, said computer program code comprising: program code means for inputting image data; program code means for searching said image data for extracting a small region smaller than or equal to a predetermined size; and program code means for outputting a list of said small regions.
8. A computer program code for causing a computer to process images, said computer program code comprising: program code means for inputting image data; program code means for searching said image data for extracting a small region smaller than or equal to a predetermined size; program code means for providing a mark to said small region; and program code means for displaying said mark wherein said mark is overlaid on said image data.
9. A computer program code for causing a computer to process images, as claimed in claim 8, and further comprising: program code means for asking a user about processing for said small region such that processing specified by the user is performed.
10. An image processing method substantially as hereinbefore described with reference to and as illustrated in Figs. 59 through 68 of the accompanying drawings.
11. An image processing apparatus substantially as hereinbefore described with reference to and as illustrated in Figs. 59 through 68 of the accompanying drawings.
12. A computer program code for causing a computer to process an image substantially as hereinbefore described with reference to and as illustrated in Figs. 59 through 68 of the accompanying drawings.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| GB0310875A GB2386529B (en) | 1999-05-25 | 2000-05-24 | Image filling method apparatus and computer readable medium for reducing filling process in producing animation |
Applications Claiming Priority (5)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP11145116A JP2000339480A (en) | 1999-05-25 | 1999-05-25 | Image coloring method, apparatus, and recording medium recording image coloring program |
| JP15216299A JP2000339442A (en) | 1999-05-31 | 1999-05-31 | Picture coloring method, picture coloring device and recording medium recording picture coloring block |
| JP20460099A JP2001034739A (en) | 1999-07-19 | 1999-07-19 | Image coloring method, apparatus, and recording medium recording image coloring program |
| JP2000061221A JP2001250104A (en) | 2000-03-06 | 2000-03-06 | Image processing method, image processing apparatus, and recording medium recording image processing program |
| GB0012713A GB2354925B (en) | 1999-05-25 | 2000-05-24 | Image filling method ,apparatus and computer readable medium for reducing filling process in producing animation |
Publications (4)
| Publication Number | Publication Date |
|---|---|
| GB0223892D0 GB0223892D0 (en) | 2002-11-20 |
| GB2378111A true GB2378111A (en) | 2003-01-29 |
| GB2378111B GB2378111B (en) | 2003-09-03 |
| GB2378111A8 GB2378111A8 (en) | 2003-12-18 |
Family
ID=27515950
Family Applications (3)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| GB0223891A Expired - Fee Related GB2378110B (en) | 1999-05-25 | 2000-05-24 | Image filling method,apparatus and computer readable medium for reducing filling process in producing animation |
| GB0223892A Expired - Fee Related GB2378111B (en) | 1999-05-25 | 2000-05-24 | Image filling method,apparatus and computer readable medium for reducing filling process in producing animation |
| GB0223890A Expired - Fee Related GB2377871B (en) | 1999-05-25 | 2000-05-24 | Image filling method,apparatus and computer readable medium for reducing filling process in producing animation |
Family Applications Before (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| GB0223891A Expired - Fee Related GB2378110B (en) | 1999-05-25 | 2000-05-24 | Image filling method,apparatus and computer readable medium for reducing filling process in producing animation |
Family Applications After (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| GB0223890A Expired - Fee Related GB2377871B (en) | 1999-05-25 | 2000-05-24 | Image filling method,apparatus and computer readable medium for reducing filling process in producing animation |
Country Status (1)
| Country | Link |
|---|---|
| GB (3) | GB2378110B (en) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN107895388A (en) * | 2017-11-13 | 2018-04-10 | 广州视睿电子科技有限公司 | Graphic color filling method, device, computer equipment and storage medium |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN115937356B (en) * | 2022-04-25 | 2025-04-01 | 北京字跳网络技术有限公司 | Image processing method, device, equipment and medium |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP0713325A1 (en) * | 1994-11-16 | 1996-05-22 | Mita Industrial Co., Ltd. | Dotted image area detecting apparatus and dotted image area detecting method |
| EP0845717A2 (en) * | 1996-11-29 | 1998-06-03 | Kabushiki Kaisha Toshiba | Image processing apparatus with smoothing feature |
Family Cites Families (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5611037A (en) * | 1994-03-22 | 1997-03-11 | Casio Computer Co., Ltd. | Method and apparatus for generating image |
-
2000
- 2000-05-24 GB GB0223891A patent/GB2378110B/en not_active Expired - Fee Related
- 2000-05-24 GB GB0223892A patent/GB2378111B/en not_active Expired - Fee Related
- 2000-05-24 GB GB0223890A patent/GB2377871B/en not_active Expired - Fee Related
Patent Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP0713325A1 (en) * | 1994-11-16 | 1996-05-22 | Mita Industrial Co., Ltd. | Dotted image area detecting apparatus and dotted image area detecting method |
| EP0845717A2 (en) * | 1996-11-29 | 1998-06-03 | Kabushiki Kaisha Toshiba | Image processing apparatus with smoothing feature |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN107895388A (en) * | 2017-11-13 | 2018-04-10 | 广州视睿电子科技有限公司 | Graphic color filling method, device, computer equipment and storage medium |
Also Published As
| Publication number | Publication date |
|---|---|
| GB0223890D0 (en) | 2002-11-20 |
| GB2377871B (en) | 2003-10-29 |
| GB2378111A8 (en) | 2003-12-18 |
| GB2377871A (en) | 2003-01-22 |
| GB0223891D0 (en) | 2002-11-20 |
| GB0223892D0 (en) | 2002-11-20 |
| GB2378111B (en) | 2003-09-03 |
| GB2378110A (en) | 2003-01-29 |
| GB2378110B (en) | 2003-10-29 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US7102649B2 (en) | Image filling method, apparatus and computer readable medium for reducing filling process in processing animation | |
| US6774889B1 (en) | System and method for transforming an ordinary computer monitor screen into a touch screen | |
| US6522329B1 (en) | Image processing device and method for producing animated image data | |
| US4807143A (en) | System for forming design pattern data | |
| EP0853293B1 (en) | Subject image extraction method and apparatus | |
| US6295066B1 (en) | Method for generating virtual three-dimensional space | |
| US5214718A (en) | Scan-in polygonal extraction of video images | |
| JP2002202838A (en) | Image processing device | |
| US5222206A (en) | Image color modification in a computer-aided design system | |
| EP0390701A2 (en) | Motion information generating apparatus | |
| JPH05250472A (en) | Method and apparatus for preparing a fine mask of boundaries on an image of an area of interest to be separated from the rest of the image | |
| US5341465A (en) | Method and apparatus for displaying and cutting out region of interest from picture | |
| US6728407B1 (en) | Method for automatically determining trackers along contour and storage medium storing program for implementing the same | |
| US6496198B1 (en) | Color editing system | |
| CA2471168C (en) | Image filling method, apparatus and computer readable medium for reducing filling process in producing animation | |
| US6665451B1 (en) | Image processing method and apparatus | |
| GB2378111A (en) | Image filling method, apparatus and computer readable medium for reducing filling process in producing animation | |
| EP1202213A2 (en) | Document format identification apparatus and method | |
| US6430583B1 (en) | Scenario editing apparatus for performing editing of multimedia using figure feature points | |
| GB2386529A (en) | Image filling method, apparatus and computer readable medium for reducing filling process in producing animation | |
| EP0637811B1 (en) | Method for defining a plurality of form definition data sets | |
| EP0263584A2 (en) | Scan-in polygonal extraction of video images | |
| JP3302855B2 (en) | Region extraction method and apparatus | |
| JPH06333008A (en) | Input device for designation of image contour | |
| JP2000339480A (en) | Image coloring method, apparatus, and recording medium recording image coloring program |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 711B | Application made for correction of error (sect. 117/77) | ||
| 711G | Correction allowed (sect. 117/1977) | ||
| 711B | Application made for correction of error (sect. 117/77) | ||
| PCNP | Patent ceased through non-payment of renewal fee |
Effective date: 20120524 |