Abstract
In this paper, an effective method based on Transform Invariant Low-rank Textures (TILT) and the Histogram of Oriented Gradients (HOG) is proposed to identify woven fabric patterns. Firstly, a TILT-based method is used to solve the deflection phenomenon that arises during woven fabric image acquisition. Secondly, the yarn floats in the fabric image are localized by a yarn segmentation method based on 2D spatial-domain gray projection, which separates the weft and warp yarns. Thirdly, HOG is applied to extract distinctive invariant features in the feature extraction step; from the HOG features, the texture features of the woven fabric are acquired. Finally, the yarn floats are classified by Fuzzy C-Means (FCM) clustering to recognize the weft and warp crossings. Experimental results demonstrate that the proposed method can recognize the three woven fabric types, plain, twill, and satin, and obtain accurate classification results.
1. Introduction
In the traditional textile industry, the recognition of woven fabrics is mostly done manually, which is tedious and time-consuming. With the rapid development of technology, the application of image processing and machine vision is becoming increasingly widespread, and these techniques have recently been introduced into the area of woven fabric analysis [1].
The analysis of fabric texture [2–7] has been studied since the mid-1980s; these methods are mainly based on the properties of the Fourier spectrum. Since they recognize patterns by light reflection from the warp and weft, there are limits to their adaptability for solid fabrics. Recently, some relevant research has been carried out on the automatic analysis of fabric weave structures. Haralick et al. [8] proposed the gray-level co-occurrence matrix, which is used to calculate texture characteristics. R. Pan [9] analyzed weft and warp floats to determine the fabric weave patterns. Y. Ben Salem [10] developed a supervised recognition method that uses a support vector machine (SVM) to classify woven fabric structures. Hu [11] used Bayesian statistics to classify fabric weave patterns: a Bayesian model was established for the classification of fabric images, and the morphological parameters of the fabric images were extracted as the eigenvector. Other methods [12, 13] determine the fabric weave patterns by analyzing warp and weft floats. These methods can successfully classify several woven fabrics. However, current woven fabric classification methods ignore the skew correction of fabric images, and their real-time performance and completeness are low.
Hence, this paper introduces an approach for the recognition and classification of woven fabrics that offers real-time performance and completeness. Firstly, TILT is used to correct the deviation of the fabric images, and histogram equalization is employed to reduce the uneven distribution of pixel gray levels caused by local illumination; 2D spatial-domain gray projection is then used to segment the weft and warp yarns. Secondly, HOG features are extracted to obtain the texture features of the woven fabric. Finally, the yarn floats are classified by FCM clustering to recognize the weft and warp crossings. The preliminary recognition results are obtained, and the correct results are obtained through periodic correction.
The rest of this paper is organized as follows: the entire recognition process is described and analyzed in Section 2; the recognition results of our method and a comparison with other methods are presented in Section 3; the conclusions are given in Section 4. The overall framework of the method is shown in Figure 1.

2. Experimental Methods
2.1. Transform Invariant Low-Rank Textures
It is inevitable that fabric images are skewed due to the placement of the samples when a scanner or other imaging device is used for image acquisition. The skewness of the image usually causes a large error in the recognition result, so it is necessary to rectify the woven fabric images during recognition and classification. To solve the deflection phenomenon in the process of fabric image acquisition, we adopt TILT [14] to process the woven fabric images.
Generally speaking, a grayscale image on a plane can be described as a two-dimensional function, which is viewed as a matrix. In matrix theory, a 2D texture is considered as a function $I^0(x, y)$ defined on $\mathbb{R}^2$. We say that $I^0$ is a low-rank texture if the family of one-dimensional functions $\{I^0(x, y_0) \mid y_0 \in \mathbb{R}\}$ spans a finite low-dimensional linear subspace, i.e.,
$$r \doteq \dim\bigl(\operatorname{span}\{I^0(x, y_0) \mid y_0 \in \mathbb{R}\}\bigr) \le k$$
for some small positive integer $k$. If $r$ is finite, then $I^0$ is referred to as a rank-$r$ texture. In practice, we typically never see a perfect low-rank texture in a real image, mainly due to two factors.
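As an illustration of this definition (not part of the TILT algorithm itself), the effective rank of a sampled texture window can be inspected from its singular values; the file name and the 1% threshold below are illustrative assumptions.

    % Sketch: inspect the effective rank of a sampled fabric texture window.
    % 'fabric.png' and the 1% threshold are assumptions, not values from the paper.
    I = im2double(rgb2gray(imread('fabric.png')));   % texture window as a matrix
    s = svd(I);                                      % singular values, descending
    effRank = sum(s > 0.01 * s(1));                  % values above 1% of the largest
    fprintf('Effective rank: %d of %d\n', effRank, min(size(I)));

A well-aligned woven texture tends to yield a small effective rank, while a skewed version of the same window yields a noticeably larger one.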
Camera angle and lens perspective induce a transformation on the domain of the texture function. The image that we observe from a certain viewpoint is a transformed version of the low-rank texture function $I^0(x, y)$:
$$I(x, y) = I^0 \circ \tau^{-1}(x, y) = I^0\bigl(\tau^{-1}(x, y)\bigr),$$
where $\tau: \mathbb{R}^2 \to \mathbb{R}^2$ belongs to a certain Lie group $G$ (e.g., the affine or projective group).
The sampled values of the texture function are subject to many types of corruption such as quantization, occlusions, and noise. For some sparse error matrix $E$, such deviations are modeled as follows:
$$I \circ \tau = I^0 + E.$$
The deflection of a woven fabric image can be viewed as a combination of these two cases. According to the principle of invariance and the solving algorithm of low-rank texture transformation, the correction procedure for a fabric image can be summarized as follows.
Woven fabric images are represented in matrix form. Firstly, the slanted woven fabric image is converted to grayscale. The gray values at the pixel coordinates are the data values, so a gray woven fabric image can be represented as a matrix $I$.
According to TILT, the corresponding model is established and solved. The model is as follows:
$$\min_{I^0, E, \tau} \operatorname{rank}(I^0) + \lambda \|E\|_0 \quad \text{s.t.} \quad I \circ \tau = I^0 + E,$$
where $\|E\|_0$ denotes the number of nonzero entries in $E$, and $\lambda$ is a weighting parameter that trades off the rank of the texture versus the sparsity of the error. Since this problem is nonconvex, it is relaxed by replacing the rank with the nuclear norm and the $\ell_0$ norm with the $\ell_1$ norm, and the constraint is linearized around the current estimate of $\tau$:
$$\min_{I^0, E, \Delta\tau} \|I^0\|_* + \lambda \|E\|_1 \quad \text{s.t.} \quad I \circ \tau + \nabla I \,\Delta\tau = I^0 + E.$$
Then the optimal inverse transformation is calculated under these constraint conditions and the objective function is solved: $I^0$, $E$, and $\Delta\tau$ are optimized alternately until the iterative conditions are met. The outer loop constantly updates the inverse transform $\tau \leftarrow \tau + \Delta\tau$ until the objective function converges.
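As a small numerical illustration of the relaxed objective (a sketch only; the placeholder matrices and the $1/\sqrt{\max(m, n)}$ choice of $\lambda$ are assumptions, not necessarily the values used by the authors):

    % Sketch: evaluate the relaxed objective ||I0||_* + lambda * ||E||_1
    % for a candidate decomposition of a 64 x 64 window.
    I0 = rand(64, 64);                      % placeholder low-rank texture estimate
    E  = zeros(64, 64);  E(5, 7) = 0.8;     % placeholder sparse error
    lambda = 1 / sqrt(max(size(I0)));       % common weighting choice (assumed)
    objective = sum(svd(I0)) + lambda * sum(abs(E(:)));   % nuclear norm + l1 norm
    disp(objective);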
According to the above solution procedure, the woven fabric images are corrected, and the optimal inverse transformation $\tau$ and the sparse error $E$ are obtained.
Figures 2(a) and 2(c) are the original woven fabric images; Figures 2(b) and 2(d) are the corrected woven fabric images using TILT.

Using the Hough transform [15], the image is corrected by obtaining the optimal angle $\theta$, which defines the rotation transformation of the image:
$$\begin{bmatrix} x' \\ y' \end{bmatrix} = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix}.$$
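As a brief illustration of applying such a rotation correction (a sketch assuming an already estimated skew angle and an illustrative file name; imrotate is from the Image Processing Toolbox):

    % Sketch: rotate a skewed fabric image back through the estimated angle.
    theta = -3.0;                                   % placeholder skew angle in degrees
    img = imread('fabric.png');                     % assumed input file
    corrected = imrotate(img, -theta, 'bilinear', 'crop');   % rotate by the opposite angle
    imshow(corrected);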
However, images captured by acquisition devices are often not only rotated; there is usually also some deviation in the horizontal or vertical direction, i.e., a translation. In other words, the images are both rotated and shifted. In this paper, the images are corrected in both the horizontal and vertical directions by using TILT.
The corrected woven fabric images obtained by using the Hough transform, the Canny transform, and angle analysis are shown in Figures 3(e), 3(f), and 3(g); the corresponding rotation angles are -3.0266 and 2.9851, -3.4682 and 3.3665, and -2 and 1 degrees, respectively. The rotation angles obtained by the TILT algorithm in Figures 3(d) and 3(h) are -3 and 3 degrees. Compared with these methods, the TILT algorithm works effectively and robustly for woven fabric images with symmetric patterns and structures; moreover, the sparse noise of the original image can also be obtained. The results obtained in the recognition of woven fabrics are given in Section 3.

Figure 3: correction results of two samples. (a), (e) Hough transform; (b), (f) Canny transform; (c), (g) angle analysis; (d), (h) TILT method.
2.2. HOG Feature Extraction
HOG [16] was proposed in 2005 to overcome the interference of varied pedestrian postures and clothing colors in pedestrian detection. In this paper, HOG features are extracted for the recognition of woven fabrics.
The algorithm of HOG is shown in the following.
For woven fabric images, the spatial representation of the 2D image at different scales can be obtained by convolving the preprocessed images with a Gaussian kernel. Then the region bounded by the first and the last segmentation lines is calculated, and the HOG features are extracted from each of the resulting small blocks.
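A minimal sketch of this Gaussian convolution step follows; the kernel size, sigma, and file name are assumptions (fspecial and imfilter are from the Image Processing Toolbox):

    % Sketch: Gaussian smoothing of the preprocessed image before block extraction.
    grayImg = im2double(rgb2gray(imread('fabric.png')));   % assumed input
    g = fspecial('gaussian', [7 7], 1.5);                  % 7x7 kernel, sigma = 1.5 (assumed)
    smoothed = imfilter(grayImg, g, 'replicate');          % smoothed spatial representation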
HOG feature extraction proceeds as follows; a minimal code sketch of the three steps is given after step (3).
(1) Gamma Rectification. Normalize the input images obtained in the segmented region by using gamma rectification. The formula is as follows:
$$Y(x, y) = I(x, y)^{\gamma},$$
where $\gamma$ is set as $1/2$.
(2) Gradient Image Computation. The gradient value and the gradient orientation of the image pixels in each of the small blocks are calculated:
$$G_x(x, y) = H(x+1, y) - H(x-1, y), \qquad G_y(x, y) = H(x, y+1) - H(x, y-1),$$
where $G_x(x, y)$ and $G_y(x, y)$ are the gradient values in the horizontal and vertical directions at the pixel $(x, y)$, respectively, and $H(x, y)$ is the pixel value at $(x, y)$. The magnitude of the gradient and the direction at each pixel are calculated by the following:
$$G(x, y) = \sqrt{G_x(x, y)^2 + G_y(x, y)^2}, \qquad \alpha(x, y) = \arctan\frac{G_y(x, y)}{G_x(x, y)}.$$
(3) Gradient Histogram Construction. We consider each segmented region as a cell, project the gradient direction of each pixel in the cell onto the histogram bins, and finally generate the gradient histogram. The HOG features are thus obtained.
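The following minimal sketch illustrates steps (1)–(3) for a single cell. The cell size (8 x 8), the 9 unsigned orientation bins, the gradient filters, and the file name are common HOG conventions assumed here rather than details stated in the paper:

    % Sketch of HOG steps (1)-(3) for one cell; parameters are illustrative.
    H = im2double(rgb2gray(imread('fabric.png')));
    H = H .^ (1/2);                               % (1) gamma rectification, gamma = 1/2

    Gx = imfilter(H, [-1 0 1],  'replicate');     % (2) horizontal gradient
    Gy = imfilter(H, [-1 0 1]', 'replicate');     %     vertical gradient
    mag = sqrt(Gx.^2 + Gy.^2);                    % gradient magnitude
    ang = mod(atan2d(Gy, Gx), 180);               % unsigned orientation, 0-180 degrees

    cellMag = mag(1:8, 1:8);                      % (3) one 8 x 8 cell (assumed size)
    cellAng = ang(1:8, 1:8);
    nBins = 9;                                    % 9 orientation bins (assumed)
    hog = zeros(1, nBins);
    bins = min(floor(cellAng / (180 / nBins)) + 1, nBins);
    for k = 1:numel(cellMag)
        hog(bins(k)) = hog(bins(k)) + cellMag(k); % accumulate magnitude into its bin
    end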
2.3. FCM Clustering
Clustering analysis is based on partitioning a collection of data points into a number of subgroups, where the objects inside a subgroup show a certain degree of closeness or similarity. FCM [17] is viewed as an instance of unsupervised learning. The weft and warp cross-areas can be recognized by FCM clustering; it is based on the minimization of the following objective function with respect to $U$ and $V$:
$$J_m(U, V) = \sum_{i=1}^{N} \sum_{j=1}^{C} u_{ij}^{m} \, d_{ij}^{2}, \qquad d_{ij} = \|x_i - c_j\|,$$
where $U = [u_{ij}]$ is a matrix showing the membership degree of instances to different classes, $V = \{c_1, \dots, c_C\}$ is a set of prototypes of the data set, $m$ is a weighting exponent, $N$ is the number of samples, $u_{ij}$ is the membership degree of $x_i$ in the cluster $j$, $x_i$ is the $i$th member of the $d$-dimensional measured data, $c_j$ is the $d$-dimensional center of the cluster $j$, $\|\cdot\|$ is any norm expressing the similarity between measured data and a center, $d_{ij}$ is the Euclidean distance between the sample and the cluster center, and $u_{ij}$ can also be read as the correlation coefficient of the sample and the cluster center.
The main steps of FCM clustering are as follows:
(1) Initialize the cluster centers $c_j$, the weighting exponent $m$, and the termination parameter $\varepsilon$.
(2) Initialize the membership degree matrix $U^{(0)}$ and the iteration counter $t = 0$.
(3) Update the cluster centers:
$$c_j = \frac{\sum_{i=1}^{N} u_{ij}^{m} x_i}{\sum_{i=1}^{N} u_{ij}^{m}}.$$
(4) Calculate the distance from each sample $x_i$ to the cluster centers: $d_{ij} = \|x_i - c_j\|$.
(5) Update the degree of membership:
$$u_{ij} = \frac{1}{\sum_{k=1}^{C} \left( d_{ij} / d_{ik} \right)^{2/(m-1)}}.$$
(6) If $\|U^{(t+1)} - U^{(t)}\| < \varepsilon$, stop the iteration; otherwise set $t \leftarrow t + 1$ and return to step (3).
According to the above steps, the weft and warp cross-areas are divided into two groups, and the preliminary recognition results are obtained.
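A compact sketch of this FCM iteration is given below. The feature matrix X is random placeholder data standing in for the extracted float features; two clusters, m = 2, and the stopping threshold mirror the description above (implicit expansion requires MATLAB R2016b or later):

    % Sketch: FCM with two clusters on placeholder feature vectors.
    X = rand(200, 9);                             % N x d placeholder feature matrix
    [N, d] = size(X);  C = 2;  m = 2;  eps0 = 1e-5;
    U = rand(N, C);  U = U ./ sum(U, 2);          % random memberships, rows sum to 1
    for t = 1:100
        Um = U .^ m;
        centers = (Um' * X) ./ sum(Um, 1)';       % update the C x d cluster centers
        D = zeros(N, C);
        for j = 1:C
            D(:, j) = sqrt(sum((X - centers(j, :)).^2, 2));   % Euclidean distances
        end
        D = D + 1e-10;                            % guard against division by zero
        Unew = 1 ./ (D .^ (2 / (m - 1)));
        Unew = Unew ./ sum(Unew, 2);              % updated membership matrix
        if norm(Unew - U, 'fro') < eps0, U = Unew; break; end
        U = Unew;
    end
    [~, labels] = max(U, [], 2);                  % hard labels: weft float vs. warp float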
3. Experimental Results and Analysis
In this paper, the proposed method was implemented in MATLAB R2012a on Windows 7. All the experimental samples, including plain, twill, and satin fabrics, were scanned with an Epson Scan V330 under the same external conditions. Twelve fabric samples of different sizes were scanned, and four representative fabrics, cut to the same size, were selected for the experiments. The scanning resolution is 1200 dpi. Figure 4 shows some representative samples of the experimental woven fabric images.

The captured woven fabric images are corrected by using the TILT algorithm, and the results are shown in Figure 5. Figures 5(a), 5(c), 5(e), and 5(g) show the corrected images, and Figures 5(b), 5(d), 5(f), and 5(h) show the sparse noise of the original woven fabrics. According to the TILT model in Section 2.1, the optimal inverse transformation $\tau$ is obtained for each sample.

After the image correction, the images are converted into grayscale to improve the processing speed. Histogram equalization is then used to redistribute the image gray levels over the lower and higher parts of the gray-level range so that the image contrast is enhanced. The results are shown in Figure 6.
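A minimal sketch of this preprocessing step (histeq is from the Image Processing Toolbox; the file name is an assumption):

    % Sketch: grayscale conversion followed by histogram equalization.
    rgb = imread('fabric_corrected.png');         % assumed corrected image
    gray = rgb2gray(rgb);                         % grayscale for faster processing
    eq = histeq(gray);                            % spread gray levels to enhance contrast
    imshowpair(gray, eq, 'montage');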

After image preprocessing, the weft and warp floats of the woven fabric images are segmented, which is very important for their recognition. In this paper, the 2D spatial-domain gray projection approach is used to segment the weft and warp floats. Because the weft and warp yarns are perpendicular to each other, the floats can be segmented accurately from the gray projection curves. The segmentation results are obtained by using 2D spatial-domain gray projection.
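A minimal sketch of the gray projection step follows; the smoothing window of 9 pixels and the file name are assumptions, and the valleys of the projection curves are taken as candidate yarn boundaries:

    % Sketch: 2D spatial-domain gray projection for yarn segmentation.
    I = im2double(rgb2gray(imread('fabric_corrected.png')));    % preprocessed image (assumed)
    rowProj = conv(mean(I, 2), ones(9, 1) / 9, 'same');         % smoothed horizontal projection
    colProj = conv(mean(I, 1)', ones(9, 1) / 9, 'same');        % smoothed vertical projection
    weftCuts = find(rowProj(2:end-1) < rowProj(1:end-2) & ...
                    rowProj(2:end-1) < rowProj(3:end)) + 1;     % valleys between weft yarns
    warpCuts = find(colProj(2:end-1) < colProj(1:end-2) & ...
                    colProj(2:end-1) < colProj(3:end)) + 1;     % valleys between warp yarns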
A weft float means that the weft yarn floats over the warp yarn, and a warp float is the reverse. In order to determine the type of each float, features should be extracted from the images. The HOG feature gradients are obtained, and the results are shown in Figure 7.

Finally, FCM clustering is used to recognize the type of weft and warp floats preliminarily. The results are shown in Figure 8.

In this paper, in the recognition of the woven fabric pattern, erroneous judgments in the preliminary recognition results cannot be avoided because of the woven fabric itself and the scanning process. The recognition rate is used to measure the performance of the algorithm, which is defined as follows:
$$P = \frac{W_r}{W_a} \times 100\%,$$
where $P$ means the recognition rate of the woven fabrics; the greater the recognition rate, the better the recognition effect. $W_a$ is the number of the overall recognition windows that are created on the image, and $W_r$ denotes the number of windows of a tested image that are recognized correctly. Table 1 illustrates the recognition rate for the woven fabrics using different methods.
From the preliminary recognition results, it can be seen that the accuracy of (c) is 100%, that of (a) is 95.63%, that of (b) is 95.83%, and that of (d) is only 94.11%. The errors may be caused by a local defect of the fabric surface, a bad segmentation, or the low quality of the fabric image. Overall, more than 90% of the weft and warp floats are correctly recognized.
From Figure 7, it can be seen that it is necessary to correct the erroneous recognition results. A woven fabric consists of repetitions of a basic weave pattern. In order to identify the basic weave pattern, the preliminary weave pattern is transformed into a matrix in which zeros and ones replace the black and white squares, as shown in Figure 9 [18].

From Figure 9, the minimum cycle unit matrices are obtained by using the relevant statistics. The results are shown in Figure 10.
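As an illustration of extracting a minimum cycle unit, the sketch below tests candidate periods along the rows and columns of a 0/1 weave matrix; the placeholder matrix is a tiled 2/1 twill, and this exhaustive-period test is a simplification of the statistical procedure used in the paper:

    % Sketch: find the smallest repeating sub-matrix of a 0/1 weave pattern.
    W = repmat([1 1 0; 0 1 1; 1 0 1], 4, 5);      % placeholder full weave matrix (2/1 twill)
    [R, Cn] = size(W);
    rowPeriod = R;
    for p = 1:R
        T = repmat(W(1:p, :), ceil(R / p), 1);    % tile the first p rows vertically
        if isequal(T(1:R, :), W), rowPeriod = p; break; end
    end
    colPeriod = Cn;
    for q = 1:Cn
        T = repmat(W(:, 1:q), 1, ceil(Cn / q));   % tile the first q columns horizontally
        if isequal(T(:, 1:Cn), W), colPeriod = q; break; end
    end
    unit = W(1:rowPeriod, 1:colPeriod);           % minimum cycle unit matrix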

According to the basic weave patterns in Figure 10, their masks can be generated to obtain the final weave patterns, as shown in Figure 11.
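A minimal sketch of generating such a mask by tiling the minimum cycle unit and expanding each entry to a pixel block (the unit, the repeat count, and the 77 x 111 block size are illustrative assumptions); this is equivalent in spirit to the nested-loop tiling code listed in the Data Availability section:

    % Sketch: expand a basic weave pattern into a full-size mask image.
    unit = [1 1 0; 0 1 1; 1 0 1];                 % placeholder basic weave pattern
    blockH = 77;  blockW = 111;                   % pixel size of one float (assumed)
    tiled = repmat(unit, 3, 3);                   % repeat the unit 3 x 3 times
    mask = kron(tiled, ones(blockH, blockW));     % each entry becomes a pixel block
    figure; imshow(mask);                         % 0 = black square, 1 = white square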

In this paper, an effective method based on TILT is proposed to correct the deviation of the fabric images; correcting the scanned images is necessary for woven fabric recognition. Taking Figures 2(a), 2(b), 3(a), 3(b), and 3(c) as examples, the recognition results are shown in Figure 12.

After the statistical data of the recognition results were analyzed, the recognition rates and the elapsed times were obtained, as shown in Table 1.
The accuracy of woven fabric recognition is objectively appraised by using the recognition rate defined above. With regard to the recognition accuracy of the five methods, the TILT-based method performs best, with a correct rate of 94.57%. The recognition rate not only illustrates the recognition performance of the algorithm on the fabric image, but also shows the recognition capability for the woven fabric. It can be seen that the identification of uncorrected images has a large margin of error; computing a recognition rate for them is not meaningful because of the large deviations in the number of available weave units. Meanwhile, the results obtained by angle analysis contain some error due to angle deviations. The proposed algorithm is characterized as follows: it has a high recognition rate; the sparse noise can be obtained through the correction; and it is applicable to the recognition of a wide range of plain, twill, and satin fabrics. In summary, the TILT-based method proposed in this paper is effective for woven fabric images.
4. Conclusions
In this paper, an automatic and efficient classification method is proposed. In order to correct the captured woven fabric images, an algorithm based on low-rank texture transformation is applied: in the recognition system, the skew angles of the original woven fabric images are corrected by the TILT algorithm. After image correction and preprocessing, weft and warp floats are detected by using 2D spatial-domain gray projection in both the horizontal and vertical directions, and HOG is used to extract the features of the woven fabrics. Preliminary recognition results are obtained by FCM clustering. To obtain the final recognition results, the black and white squares of the weave pattern are replaced with a matrix of zeros and ones, the basic weave patterns are obtained by using the relevant statistics, and the final recognition results are derived from these basic weave patterns.
To the authors’ knowledge, no automatic method is yet applicable to identifying all kinds of woven fabric. The method in this paper is shown to be capable of recognizing plain, twill, and satin woven fabrics, and it attains the best classification results (100%) with a faster speed. There are still some limitations in this recognition system; for example, it cannot recognize double-layer weaves.
Data Availability
The data in this paper are supported by the following: previously reported Hough transform, Canny transform, TILT, HOG, FCM clustering, and histogram equalization data were used to support this study. These prior studies (and datasets) are cited at the relevant places within the text as references [14–18]. The MATLAB code used to generate the masks of the basic weave patterns in this study is included below:

    im = imread('1.png');                  % basic weave pattern image
    im1 = rgb2gray(im);
    mask = zeros(539, 999);                % 7 x 9 repeats of a 77 x 111 unit
    for i = 1:77
        for j = 1:111
            mask(i, j) = im1(3+i, 2+j);    % copy one basic weave unit
        end
    end
    tempt1 = -1;
    for x = 1:7
        tempt1 = tempt1 + 1;
        tempt2 = -1;
        for y = 1:9
            tempt2 = tempt2 + 1;
            for i = 1:77
                for j = 1:111
                    mask(tempt1*77+i, tempt2*111+j) = mask(i, j);   % tile the unit
                end
            end
        end
    end
    figure; imshow(uint8(mask));
Conflicts of Interest
The authors declare that they have no conflicts of interest.
Acknowledgments
This research is supported by Applied Basic Research Programs of China National Textile and Apparel Council (J201509).