Automatic recognition of density and weave pattern of yarn-dyed fabric

Published online: 22 Dec 2022
Introduction

With the development of society, small-batch, multi-variety production has gradually become the dominant mode in the textile industry, which means that textile companies urgently need a rapid, automatic, and accurate method for analyzing the structural parameters and weave patterns of fabric. In the traditional manual approach, these parameters are identified by inspectors with a textile magnifying glass, which is time-consuming and labor-intensive. Along with this trend, the recognition of fabric parameters, such as density and weave pattern, has become a research hotspot, and many feasible algorithms [1,2,3,4,5,6] based on image analysis have been proposed to recognize fabric density and pattern. Generally, these methods fall into two categories: hand-crafted feature-based methods and deep learning-based methods.

Low-level features, such as frequency- and time-domain features of the image, are often used to represent the texture information of the fabric. In the frequency domain, the peaks in the Fourier transform of a fabric image represent the periodic structure formed by the warp and weft yarns. Automatic detection [7,8,9,10] of fabric density can be achieved by locating these peaks and counting the located yarns. In the time domain, pixels on a yarn have higher gray levels, while pixels in the gap between yarns reflect less light and show lower gray levels. In ref. [11], gray projection was proposed to locate the nodes and yarns. Pixel-level features, such as geometrical features [9,12,13] and statistical features [14,15], are then extracted to classify the nodes. However, these methods mainly target white or single-colored fabrics.

Recently, significant breakthroughs have been achieved in image analysis by moving from low-level feature-based algorithms to deep learning-based end-to-end frameworks. Convolutional neural networks (CNNs) are widely applied to pattern recognition tasks. In the textile field, many researchers have adopted CNN methods for visual tasks such as fabric image retrieval [16,17,18], weave pattern recognition [1,19,20,21], and defect detection [22,23,24]. For fabric weave pattern recognition, some researchers [20,25] converted the task into a classification task, whereas Meng et al. [1] treated it as a multi-task problem that requires locating and classifying the nodes. Technically, fabric density and pattern recognition involves many sub-tasks, such as regression, classification, object detection, and color measurement. The complexity of the task makes an end-to-end CNN model difficult to train. Moreover, CNN methods require a large amount of labeled data to drive learning, but ground-truth annotations are difficult to obtain in quantity. This is the main reason why there are still few deep learning-based studies addressing the weave pattern problem.

With the high demands on the appearance of clothes, yarn-dyed fabrics play an increasingly important role in the textile market. Because the various colors in an image of yarn-dyed fabric influence one another, it is difficult for current methods to achieve good performance. Pan et al. [26] proposed a method for automatic detection of the structural parameters of yarn-dyed fabric based on the Hough transform and the fuzzy C-means (FCM) algorithm. Xin et al. [27] used an active grid model to recognize the color pattern of yarn-dyed fabric. However, if the yarn-dyed fabric has weft skew, these methods may not work well. Building on this previous work, this article proposes an efficient method to recognize the structural parameters of yarn-dyed fabric, including density, weave pattern, and colored yarn arrangement. The proposed method can help companies analyze fabric parameters more quickly, resulting in more efficient production. The following sections introduce the proposed method in detail.

Fabric image acquisition

Woven fabrics are produced by interlacing two perpendicular sets of yarns: vertically running warps and horizontally running wefts. The crossing states of warps and wefts are called interlacing nodes, which are classified into two types: a warp interlacing node denotes a node with a warp residing on top of a weft, and a weft interlacing node refers to a node with a weft passing above a warp. Generally, the weave pattern consists of recurrences of a basic weave repeat. Yarn-dyed fabric, a common type of woven fabric, is woven from two or more different colors of yarn.

The stability of the acquisition environment, including lighting conditions, scale, and resolution, is essential for fabric pattern recognition. A scanner is a good choice for this requirement, being both cheap and convenient. In this study, a Canon 9000F Mark II scanner is used to capture the yarn-dyed fabric images in the RGB (red, green, blue) model. The light source of this scanner is a white light-emitting diode, which guarantees a stable capture environment. When obtaining fabric images, the resolution is an important parameter: too high a resolution increases the amount of computation in the recognition process, thereby slowing it down, whereas too low a resolution leads to inaccurate recognition. Assuming a warp (or weft) density of up to 800 threads/10 cm and a maximum tightness of 80%, the gap between yarns is 0.025 mm. To ensure recognition accuracy, the gap should occupy at least two pixels in the scanned image; under this condition, the corresponding resolution must be at least 2,032 dpi. Therefore, in this study, the yarn-dyed fabric images are captured at a resolution of 2,400 dpi.
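Restating the arithmetic behind this bound (an expansion of the figures quoted above, assuming the two-pixel minimum gap):

yarn spacing = 100 mm / 800 = 0.125 mm,
yarn diameter = 0.8 × 0.125 mm = 0.100 mm, so gap = 0.125 − 0.100 = 0.025 mm,
minimum resolution = 2 px / 0.025 mm = 80 px/mm ≈ 80 × 25.4 ≈ 2,032 dpi.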

As shown in Figure 1, an image captured at 2,400 dpi contains 946 pixels and 26.5 warp threads per 1 cm; the warp density of the fabric in the image is thus 67.3 threads/inch. Each warp yarn therefore occupies about 35.7 pixels, which is sufficient to ensure recognition accuracy. This calibration serves as the benchmark for the subsequent recognition of fabric structural parameters. When acquiring fabric images, the edge of the fabric should be avoided, and the fabric surface should be clean and flat.
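For reference, the calibration arithmetic is: 26.5 threads/cm × 2.54 cm/inch ≈ 67.3 threads/inch, and 946 px ÷ 26.5 threads ≈ 35.7 px per warp yarn.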

Figure 1

Calibration of pixels corresponding to 1 cm of fabric.

Fabric image correction

When capturing fabric images, it is difficult to ensure that the warp and weft yarns of the fabric remain completely horizontal and vertical. However, the inclination of the yarns in the fabric image will generally cause a large error in the recognition of fabric parameters. To improve the performance of the recognition, skew correction is required.

In this study, the process of skew correction is shown in Figure 2. We first enhance the fabric image: the acquired RGB image is converted to HSV (hue, saturation, value) space, and the brightness component V is extracted and histogram-equalized. Considering that the interlacing nodes of the fabric are closely related to brightness, we use the brightness component to detect the tilt angle of the fabric. The histogram equalization can be represented by

s_k = T(r_k) = \frac{L-1}{MN} \sum_{j=1}^{k} n_j, \quad k = 0, 1, \ldots, L-1, (1)

where MN is the total number of pixels in the image, L is the number of gray levels, and n_j is the number of pixels whose gray level is r_j. Figure 3 presents the effect of the histogram equalization: Figure 3(a) shows the brightness component corresponding to the fabric image in Figure 4, and Figure 3(b) shows the result enhanced by histogram equalization. The interlacing nodes in the enhanced image are more prominent.
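A minimal sketch of this enhancement step with OpenCV (the file path and variable names are illustrative; note that OpenCV loads images in BGR channel order):

```python
import cv2

img = cv2.imread('fabric.png')              # scanned fabric image (illustrative path)
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)  # convert to HSV space
v = hsv[:, :, 2]                            # brightness component V
v_eq = cv2.equalizeHist(v)                  # histogram equalization, equation (1)
```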

Figure 2

The process of skew correction.

Figure 3

Brightness enhancement of fabric image. (a) The brightness component V and (b) enhanced V component.

Figure 4

A sample of captured fabric images.

Then, the Otsu method (named after its inventor) is applied to segment the interlacing nodes of the fabric. Owing to the complex colors in yarn-dyed fabric, directly applying this method to threshold the entire image leaves many interlacing nodes unrecognized, as shown in Figure 5(a). This study therefore first divides the image into multiple blocks, as shown in Figure 5(b), and then applies the Otsu method to threshold each block. The number of blocks is not fixed but depends on the image size; in this study, each block is at least 50 px × 50 px (2,500 pixels). Figure 5(c) presents the segmentation results using this strategy: compared with Figure 5(a), many previously missed interlacing nodes are recognized. After obtaining the binary image, the Canny operator is used to detect edges, with the result shown in Figure 5(d). The promising result demonstrates the effectiveness of the proposed strategy for interlacing node recognition.
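A hedged sketch of the block-wise Otsu segmentation and edge detection described above, assuming the equalized V component `v_eq` from the previous step (the Canny thresholds are illustrative):

```python
import cv2
import numpy as np

def blockwise_otsu(gray, block=50):
    # Threshold each block x block tile independently with Otsu's method,
    # so each local color region gets a suitable threshold.
    out = np.zeros_like(gray)
    h, w = gray.shape
    for y in range(0, h, block):
        for x in range(0, w, block):
            tile = gray[y:y + block, x:x + block]
            _, bw = cv2.threshold(tile, 0, 255,
                                  cv2.THRESH_BINARY + cv2.THRESH_OTSU)
            out[y:y + block, x:x + block] = bw
    return out

binary = blockwise_otsu(v_eq)        # local segmentation, Figure 5(c)
edges = cv2.Canny(binary, 50, 150)   # edge detection, Figure 5(d)
```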

Figure 5

Interlacing node extraction of fabric images. (a) The result of global threshold segmentation, (b) blocked result, (c) result of local threshold segmentation, and (d) the result of edge detection.

The aforementioned method can detect most of the interlacing nodes in fabric images. Then, two morphological operations, erosion and dilation, are used to enhance the features of the interlacing nodes: when detecting the warp inclination, the kernel of erosion and dilation is (4, 1), and when detecting the weft inclination, the kernel is (1, 4). We then apply the Hough transform to detect lines in the binary image. The Hough transform can detect thousands of straight lines; we filter for the lines that pass through more interlacing nodes by limiting the angle range and the line length. The detection results are shown in Figure 6. Finally, we take the mean inclination angle of the filtered lines as the inclination angle of the yarns in the image. For the sample fabric, the detected warp inclination angle is 2.16° (90 − θ). After obtaining the warp inclination angle, we rotate the image by 90 − θ to make the warps vertical, as shown in Figure 7(a).
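A sketch of the inclination detection step (the morphological kernels follow the text; the Hough thresholds, the ±10° angle window, and the sign handling are simplifying assumptions):

```python
import cv2
import numpy as np

def warp_inclination(edges):
    kernel = np.ones((4, 1), np.uint8)            # (4,1) kernel for warp detection
    feat = cv2.dilate(cv2.erode(edges, kernel), kernel)
    lines = cv2.HoughLinesP(feat, 1, np.pi / 180, threshold=100,
                            minLineLength=200, maxLineGap=10)
    angles = []
    for x1, y1, x2, y2 in lines[:, 0]:
        ang = np.degrees(np.arctan2(y2 - y1, x2 - x1))
        if abs(abs(ang) - 90) < 10:               # keep only near-vertical lines
            angles.append(abs(ang))
    return 90 - np.mean(angles)                   # mean inclination from vertical

h, w = edges.shape
incl = warp_inclination(edges)                    # e.g. 2.16 deg for the sample
# Rotate so that warps become vertical; the sign depends on the lean direction.
M = cv2.getRotationMatrix2D((w / 2, h / 2), -incl, 1.0)
warp_corrected = cv2.warpAffine(img, M, (w, h))   # Figure 7(a)
```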

Figure 6

Hough transformation of the fabric image. (a) Spectrogram of Hough space and (b) straight lines detected in the fabric image.

Figure 7

Results of warp and weft correction. (a) Result of warp correction and (b) result of weft correction.

When the warp yarns are vertical, we use an affine transformation to correct the weft inclination. The warp-corrected image undergoes the same steps as above, including threshold segmentation, edge detection, morphological processing (kernel: (1, 4)), and the Hough transform, to obtain the weft tilt angle. Using this tilt angle, we perform the affine transformation to correct the weft. The weft correction result for the sample fabric is shown in Figure 7(b).
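One convenient affine choice for this step is a vertical shear, sketched below under the assumption that `phi_deg` is the weft inclination detected by the Hough step:

```python
import cv2
import numpy as np

def correct_weft(img, phi_deg):
    # Vertical shear: column x is shifted by -x*tan(phi), which maps wefts
    # inclined at phi back to the horizontal while warps stay vertical.
    h, w = img.shape[:2]
    t = np.tan(np.radians(phi_deg))
    M = np.float32([[1, 0, 0],
                    [-t, 1, 0]])
    return cv2.warpAffine(img, M, (w, h))
```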

Detection of the warp and weft yarns

In a yarn image, the yarn axis reflects more light and appears brighter, whereas the two sides of the yarn reflect less light and appear darker [28]. The yarns in a yarn-dyed fabric image follow a similar rule: from the yarn axis to the gap between yarns, the gray level gradually decreases. This rule can be used to detect the yarns in the fabric image.

For accurate recognition, we adopt an efficient method to strengthen the gaps between the yarns. First, we use a relative total variation (RTV)-based smoothing method [29] to remove the fine texture of the image. The smoothing can be expressed as

\arg\min_S \sum_p \left[ (S_p - I_p)^2 + \lambda \left( \frac{D_x(p)}{L_x(p) + \varepsilon} + \frac{D_y(p)}{L_y(p) + \varepsilon} \right) \right], (2)

where the fidelity term (S_p − I_p)^2 keeps the result S from deviating wildly from the input I. The texture-removal effect is introduced by the regularizer D_x(p)/(L_x(p) + ε) + D_y(p)/(L_y(p) + ε), called the RTV. λ is a weight controlling the degree of smoothness, and ε is a small positive number that avoids division by zero; the division is element-wise. More details can be found in ref. [29]. The smoothing result for the yarn-dyed fabric image is shown in Figure 8(a): the uneven brightness distribution of the yarns is weakened or eliminated. This result can also be used for color segmentation. Dividing the original image I element-wise by the smoothed image I_s yields the gap-enhanced image shown in Figure 8(b).
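A sketch of the gap-enhancement step; `rtv_smooth` is a placeholder for any implementation of the RTV method of ref. [29] (none ships with OpenCV), and the final rescaling is an assumption for convenience:

```python
import numpy as np

def enhance_gaps(img, rtv_smooth, eps=1e-6):
    I = img.astype(np.float64)
    S = rtv_smooth(I)                 # structure image with texture removed, eq. (2)
    ratio = I / (S + eps)             # element-wise division I / I_s
    # Rescale to 8-bit so later steps can reuse standard tooling
    ratio = 255 * (ratio - ratio.min()) / (ratio.max() - ratio.min())
    return ratio.astype(np.uint8)
```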

Figure 8

Detection results of the warp and weft yarns. (a) Smoothed fabric image, (b) gap-enhanced fabric image, (c) brightness projection in the weft, (d) brightness projection in the warp, and (e) the detection result of yarn layout.

Automatic interlacing node recognition in fabric images can be achieved by using the brightness information of the images. After skew correction, all yarns can be regarded as straight stripes in the vertical and horizontal directions, and a gridding model is constructed to separate the warp and weft yarns. As mentioned at the beginning of this section, the brightness distribution of the fabric image has an obvious gradient: ranked in descending order of intensity, the regions are the axial lines of the yarns, the marginal zones of the yarns, and the transition regions between yarns. An intensity minimum occurs in the transition region, so the minimum intensity can be found with a projection algorithm in the vertical and horizontal directions. With this method, the yarns can be detected and the interlacing nodes located. Let B(i, j) be the brightness of the pixel in the ith row and jth column. The fabric image can then be mapped into two independent one-dimensional waves:

B(i) = \sum_j B(i, j), (3)

B(j) = \sum_i B(i, j), (4)

where B(i) is the accumulated brightness of the ith row of pixels and B(j) is the accumulated brightness of the jth column. The gap-enhanced image is first converted to HSV color space and its brightness component V is extracted; then V is mapped into the two one-dimensional waves. The mapping results for the sample fabric image are shown in Figure 8(c) and (d).

The valleys of the curves clearly correspond to the weft gap positions in the fabric image, whereas the peaks represent the yarn axes. The case of the warp yarns is similar, except that the warps lie in the vertical direction. Generally speaking, in an ideal smooth curve, B(i) is a peak if B(i) > B(i − 1) and B(i) > B(i + 1), and a valley if B(i) < B(i − 1) and B(i) < B(i + 1). In practice, however, the pixel intensities do not follow such a clean distribution. As noted earlier, one yarn occupies about 36 pixels, so we use a 30-pixel-pitch slider in the warp and weft directions to search for local minima, requiring the distance between two adjacent local minima to be greater than 30 pixels. The local minima found are taken as the gaps between yarns in the fabric image, as shown in Figure 8(e). Based on this result, automatic measurement of the fabric's warp and weft densities can be achieved.
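A sketch of the projection-and-valley search, assuming the enhanced V component is the 2-D array `V`; `find_peaks` on the negated curve returns local minima, and `distance=30` enforces the 30-pixel minimum separation:

```python
import numpy as np
from scipy.signal import find_peaks

col_proj = V.sum(axis=0)   # B(j), equation (4): one value per column (warps)
row_proj = V.sum(axis=1)   # B(i), equation (3): one value per row (wefts)

warp_gaps, _ = find_peaks(-col_proj, distance=30)   # gaps between warp yarns
weft_gaps, _ = find_peaks(-row_proj, distance=30)   # gaps between weft yarns
```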

Inspection of the density of yarn-dyed fabric

Once the yarns are recognized, the density of the yarn-dyed fabric can be computed from the resolution of the image:

D = \frac{N \cdot S_{\mathrm{dpi}}}{M}, (5)

where D denotes the density of the yarn-dyed fabric in threads/inch; M is the number of pixels between the first and last yarns; N is the number of (warp or weft) yarns in the fabric image; and S_dpi is the resolution, here S_dpi = 2,400. With this definition, the density of the sample fabric shown in Figure 4 is D_warp = 79.9 threads/inch (M = 2,228, N = 74) and D_weft = 59.2 threads/inch (M = 1,580, N = 39).
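A worked check of equation (5) using the weft figures quoted above:

```python
def density(n_yarns, span_px, dpi=2400):
    # Equation (5): threads per inch from yarn count and pixel span
    return n_yarns * dpi / span_px

print(density(39, 1580))   # 59.24..., matching D_weft = 59.2 threads/inch
```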

Detection of the color pattern

The node type depends on the relative position of the weft and warp where they interlace: a node is a warp node when the warp is on top; otherwise, it is a weft node. In this article, the node types are determined from the quadrilateral boundary of each interlacing node, as proposed in ref. [30]. The matrix representation of the node types is shown in Figure 9(a).

Figure 9

The recognition results of the node type and color pattern of the sample fabric. (a) The matrix representation of node type and (b) color pattern of the yarn-dyed fabric image.

In yarn-dyed fabric, color is an important feature. The color feature of each interlacing node is represented by

V = (\bar{R}, \bar{G}, \bar{B}), (6)

where

\bar{R} = \frac{1}{hw} \sum_{i=1}^{h} \sum_{j=1}^{w} r(i, j), (7)

\bar{G} = \frac{1}{hw} \sum_{i=1}^{h} \sum_{j=1}^{w} g(i, j), (8)

\bar{B} = \frac{1}{hw} \sum_{i=1}^{h} \sum_{j=1}^{w} b(i, j), (9)

h and w are the height and width of the corresponding interlacing node, and r(i, j), g(i, j), and b(i, j) are the red, green, and blue components at point (i, j). In other words, each node is represented by the average R, G, and B values of all its pixels. However, the colors at the junctions between yarns affect each other, so the colors in some interlacing nodes contain outliers. Assume that the color of each pixel in an interlacing node obeys a three-dimensional Gaussian distribution, c = (r, g, b) ∼ N(μ, Σ), with probability density

f(c) = \frac{1}{(2\pi)^{3/2} |\Sigma|^{1/2}} \exp\left( -\frac{(c - \mu)^{T} \Sigma^{-1} (c - \mu)}{2} \right), (10)

where Σ is the covariance matrix and μ is the vector of mean values of the R, G, and B components. We select the pixels within the 95% confidence region as statistical samples to represent the color feature of each interlacing node.
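A hedged sketch of this outlier-resistant color feature: pixels outside the 95% region are dropped by a Mahalanobis distance test before averaging. The chi-square cutoff 7.815 (95%, 3 degrees of freedom) is a standard choice; the paper's exact confidence handling may differ:

```python
import numpy as np

def node_color(pixels):
    # pixels: (n, 3) float array of the RGB values inside one interlacing node
    mu = pixels.mean(axis=0)
    cov = np.cov(pixels, rowvar=False)
    diff = pixels - mu
    # Squared Mahalanobis distance of each pixel to the node mean
    d2 = np.einsum('ij,jk,ik->i', diff, np.linalg.inv(cov), diff)
    keep = pixels[d2 <= 7.815]     # inside the 95% ellipsoid of eq. (10)
    return keep.mean(axis=0)       # (R_bar, G_bar, B_bar), equations (7)-(9)
```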

The FCM clustering method is employed to classify the nodes using the color features extracted from them. FCM is an unsupervised method that minimizes an objective function by organizing the data into clusters. We follow the method proposed in ref. [26] for node classification. However, the number of yarn colors in the yarn-dyed fabric must be known first. For automatic detection, Pan et al. [26] introduced the cluster validity criterion V_MPC [31] to find the optimal cluster number (see the sketch after Table 1). The V_MPC indices of the sample yarn-dyed fabric are shown in Table 1; the maximum at c = 5 indicates that the sample fabric contains five colors of yarn.

Table 1. V_MPC indices for the yarn-dyed fabric image

c       2      3      4      5      6      7      8      9      10
V_MPC   0.657  0.696  0.769  0.841  0.713  0.642  0.616  0.609  0.596

Note: The best result, V_MPC = 0.841, is obtained at c = 5.
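A sketch of the cluster-number selection using the scikit-fuzzy package (an assumption; any FCM implementation would do), where `feats` stacks the node color features with shape (3, n_nodes):

```python
import numpy as np
import skfuzzy as fuzz

def best_cluster_count(feats, c_range=range(2, 11), m=2.0):
    scores = {}
    for c in c_range:
        # skfuzzy returns the partition coefficient (fpc) as its last value
        *_, fpc = fuzz.cluster.cmeans(feats, c, m, error=1e-5, maxiter=300)
        scores[c] = 1 - c / (c - 1) * (1 - fpc)   # Dave's V_MPC [31]
    return max(scores, key=scores.get), scores

# best_c, _ = best_cluster_count(feats)   # expected to peak at c = 5 (Table 1)
```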

FCM (c = 5) is used to classify the nodes by their color features. The color of each node is then replaced by the corresponding cluster center; the boundary pixels are removed from the image, and the nodes are drawn as color rectangles of equal size. As shown in Figure 9(b), the color pattern can be obtained in this way. The layout of the color yarns can then be obtained by combining the node types and the color pattern.
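The final classification step can be sketched as follows (same `feats` array and scikit-fuzzy assumption as above):

```python
import numpy as np
import skfuzzy as fuzz

cntr, u, *_ = fuzz.cluster.cmeans(feats, 5, 2.0, error=1e-5, maxiter=300)
labels = np.argmax(u, axis=0)    # highest-membership cluster per node
node_colors = cntr[labels]       # replace each node's color with its cluster center
```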

Summary

For a yarn-dyed sample, a scanner is first used to obtain its image. Since the inclination of the yarns in the fabric image generally causes errors in parameter recognition, the correction method introduced above is used to straighten the warp and weft yarns. For accurate detection of the yarns, a brightness-projection method locates the gaps between yarns and thus recognizes the yarn layout; from this result, the density of the yarn-dyed fabric is measured. The types of the interlacing nodes are recognized using the method proposed in ref. [30], and FCM classifies the nodes by their extracted color features. Combining the clustering result and the weave pattern yields the color pattern of the yarn-dyed fabric.

Table 2 and Figure 10 present the actual parameters and detection results of three yarn-dyed fabric samples. The promising results demonstrate the effectiveness of the proposed method for the recognition of yarn-dyed fabric.

Actual parameters and detection parameters of the three samples

Name   Size (px, warp × weft)   Actual threads (warp/weft)   Actual density (warp/weft)   Recognized threads (warp/weft)   Recognized density (warp/weft)   Colors
a      2,228 × 1,580            74/39                        80/59                        74/39                            79.9/59.2                        5
b      2,494 × 1,684            101/45                       97/64                        101/45                           97.2/64.1                        6
c      1,940 × 1,332            60/39                        74/70                        60/39                            74.2/70.3                        6

Figure 10

Recognition results of two other samples. (a) Original images, (b) corrected images, (c) weave pattern, and (d) color pattern.

To demonstrate the superiority of the proposed method, we compare it with several current methods on this topic: Pan et al. [26], Guo [5], Xin et al. [27], Iqbal Hussain et al. [2], and Meng et al. [1]. In this experiment, we implemented the above methods following the authors' papers and tested them on 100 samples of yarn-dyed fabric. The error rate [1] is used to evaluate the recognition performance of the compared methods. The experimental results are presented in Table 3. The proposed method achieves error rates of 2.1% and 1.2% on warp and weft thread counting and 1.4% and 0.6% on warp and weft density, outperforming the other methods. The method of Iqbal Hussain et al. [2] performs best among the compared methods, yet our method reduces the error rate by roughly 2 percentage points across the metrics. These results show that the proposed method offers a clear improvement in the identification of yarn-dyed fabrics, demonstrating its superiority.

Comparison of experimental results with other methods

Methods                     Threads (%)         Density (%)
                            Warp     Weft       Warp     Weft
Pan et al. [26]             7.4      5.2        3.9      4.2
Guo [5]                     8.3      6.5        5.1      4.9
Xin et al. [27]             6.8      6.3        4.3      3.9
Iqbal Hussain et al. [2]    4.5      3.4        3.5      3.1
Meng et al. [1]             23.5     43.3       34.9     43.8
Proposed                    2.1      1.2        1.4      0.6

We also used the method proposed in ref. [1] to recognize the yarn layout of the yarn-dyed fabrics; the results are presented in Figure 11. The poor results have two main causes: (1) the model was trained on a small dataset, which limits its generalization; and (2) the image resolution differs from that used in the original work. These results indicate that, when data are limited, low-level feature-based methods are more effective than learning-based methods.

Figure 11

The recognition results for yarn-dyed fabric using the method proposed in ref. [1]. (a) Original image, (b) warp location, (c) weft location, and (d) warp node location.

Conclusion

Aiming at the recognition of the structural parameters of yarn-dyed fabric, an effective method is proposed based on previous work, especially refs [1,26,30]. In the recognition system, the fabric images are captured by a scanner at a resolution of 2,400 dpi. The inclination angles of the warps and wefts are detected by the Hough transform, and rotation and affine transformations are then used to correct the warp and weft inclinations, respectively. A smoothing method enhances the yarns and the gaps between them to improve the accuracy of yarn location, and from the detection results the density of the yarn-dyed fabric is calculated by equation (5). The edge information in the four directions of an interlacing node is the main basis for judging whether it is a warp node or a weft node. The number of yarn colors obtained from cluster validity analysis, together with the extracted color features, is used to classify the nodes with the FCM algorithm. Based on the clustering results, the color pattern of the yarn-dyed fabric is obtained.

The proposed method has some limitations: it may fail on jacquard or multi-layer fabrics, although it can still be used to measure their density.



References

[1] Meng, S., Pan, R., Gao, W., Zhou, J., Wang, J., He, W. (2021). A multi-task and multi-scale convolutional neural network for automatic recognition of woven fabric pattern. Journal of Intelligent Manufacturing, 32(4), 1147–1161.

[2] Iqbal Hussain, M. A., Khan, B., Wang, Z., Ding, S. (2020). Woven fabric pattern recognition and classification based on deep convolutional neural networks. Electronics, 9(6), 1048.

[3] Pan, R., Gao, W., Liu, J., Wang, H. (2011). Automatic recognition of woven fabric pattern based on image processing and BP neural network. The Journal of the Textile Institute, 102(1), 19–30.

[4] Pan, R., Gao, W., Liu, J., Wang, H. (2010). Automatic recognition of woven fabric patterns based on pattern database. Fibers and Polymers, 11(2), 303–308.

[5] Guo, Y., Ge, X., Yu, M., Yan, G., Liu, Y. (2019). Automatic recognition method for the repeat size of a weave pattern on a woven fabric image. Textile Research Journal, 89(14), 2754–2775.

[6] Anila, S., Rani, K. S. S., Saranya, B. (2018). Fabric texture analysis and weave pattern recognition by intelligent processing. Journal of Telecommunication, Electronic and Computer Engineering (JTEC), 10(1–13), 121–127.

[7] Xu, B. (1996). Identifying fabric structures with fast Fourier transform techniques. Textile Research Journal, 66(8), 496–506.

[8] Lachkar, A., Gadi, T., Benslimane, R., D'orazio, L., Martuscelli, E. (2003). Textile woven-fabric recognition by using Fourier image-analysis techniques: Part I: A fully automatic approach for crossed-points detection. Journal of the Textile Institute, 94(3–4), 194–201.

[9] Lachkar, A., Benslimane, R., D'orazio, L., Martuscelli, E. (2005). Textile woven fabric recognition using Fourier image analysis techniques: Part II – Texture analysis for crossed-states detection. Journal of the Textile Institute, 96(3), 179–183.

[10] Matsuyama, T., Miura, S.-I., Nagao, M. (1983). Structural analysis of natural textures by Fourier transformation. Computer Vision, Graphics, and Image Processing, 24(3), 347–362.

[11] Soltany, M., Zadeh, S. T., Pourreza, H.-R. (2011). Fast and accurate pupil positioning algorithm using circular Hough transform and gray projection. International Conference on Computer Communication and Management, 211, 556–561.

[12] Jeong, Y. J., Jang, J. (2005). Applying image analysis to automatic inspection of fabric density for woven fabrics. Fibers and Polymers, 6(2), 156–161.

[13] Jeon, B. S., Bae, J. H., Suh, M. W. (2003). Automatic recognition of woven fabric patterns by an artificial neural network. Textile Research Journal, 73(7), 645–650.

[14] Kuo, C. F. J., Shih, C. Y., Lee, J. Y. (2004). Automatic recognition of fabric weave patterns by a fuzzy C-means clustering method. Textile Research Journal, 74(2), 107–111.

[15] Ajallouian, F., Tavanai, H., Palhang, M., Hosseini, S., Sadri, S., Matin, K. (2009). A novel method for the identification of weave repeat through image processing. The Journal of The Textile Institute, 100(3), 195–206.

[16] Deng, D., Wang, R., Wu, H., He, H., Li, Q., Luo, X. (2018). Learning deep similarity models with focus ranking for fabric image retrieval. Image and Vision Computing, 70, 11–20.

[17] Xiang, J., Zhang, N., Pan, R., Gao, W. (2019). Fabric image retrieval system using hierarchical search based on deep convolutional neural network. IEEE Access, 7, 35405–35417.

[18] Xiang, J., Zhang, N., Pan, R., Gao, W. (2020). Fabric retrieval based on multi-task learning. IEEE Transactions on Image Processing, 30, 1570–1582.

[19] Meng, S., Pan, R., Gao, W., Zhou, J., Wang, J., He, W. (2019). Woven fabric density measurement by using multi-scale convolutional neural networks. IEEE Access, 7, 75810–75821.

[20] Puarungroj, W., Boonsirisumpun, N. (2019). Recognizing hand-woven fabric pattern designs based on deep learning. Advances in Computer Communication and Computational Sciences, 19, 325–336.

[21] Wang, F., Liu, H., Sun, F., Pan, H. (2019). Fabric recognition using zero-shot learning. Tsinghua Science and Technology, 24(6), 645–653.

[22] Liu, J., Wang, C., Su, H., Du, B., Tao, D. (2019). Multistage GAN for fabric defect detection. IEEE Transactions on Image Processing, 29, 3388–3400.

[23] Ouyang, W., Xu, B., Hou, J., Yuan, X. (2019). Fabric defect detection using activation layer embedded convolutional neural network. IEEE Access, 7, 70130–70140.

[24] Jun, X., Wang, J., Zhou, J., Meng, S., Pan, R., Gao, W. (2021). Fabric defect detection based on a deep convolutional neural network using a two-stage strategy. Textile Research Journal, 91(1–2), 130–142.

[25] Xiao, Z., Liu, X., Wu, J., Geng, L., Sun, Y., Zhang, F., et al. (2018). Knitted fabric structure recognition based on deep learning. The Journal of The Textile Institute, 109(9), 1217–1223.

[26] Pan, R., Gao, W., Liu, J., Wang, H. (2010). Automatic detection of the layout of color yarns for yarn-dyed fabric via a FCM algorithm. Textile Research Journal, 80(12), 1222–1231.

[27] Xin, B., Hu, J., Baciu, G., Yu, X. (2009). Investigation on the classification of weave pattern based on an active grid model. Textile Research Journal, 79(12), 1123–1134.

[28] Liu, J., Yamaura, I., Gao, W. (2006). Discussing reflecting model of yarn. International Journal of Clothing Science and Technology, 18(2), 129–141.

[29] Xu, L., Yan, Q., Xia, Y., Jia, J. (2012). Structure extraction from texture via relative total variation. ACM Transactions on Graphics (TOG), 31(6), 1–10.

[30] Zhong, P., Shi, Y., Chen, X., Tan, Q., Zhang, C. (2013). Research on digital intelligent recognition method of the weave pattern of fabric based on the redundant information. Fibers and Polymers, 14(11), 1919–1926.

[31] Dave, R. N. (1996). Validating fuzzy partitions obtained through c-shells clustering. Pattern Recognition Letters, 17(6), 613–623.
