
Breast Cancer Detection and Classification using Neural Network

Shekhar Singh (1), Dr. P. R. Gupta (2)
(1) C-DAC, NOIDA, INDIA, shekharsinghsengar@gmail.com
(2) C-DAC, NOIDA, INDIA, poonamgupta@cdacnoida.in


Abstract— Breast cancer detection, classification, scoring and grading of histopathological images is the standard clinical practice for the diagnosis and prognosis of breast cancer. In a large hospital, a pathologist typically handles a large number of cancer detection cases per day; it is therefore a difficult and time-consuming task. This paper proposes a method for automatic breast cancer detection, classification, scoring and grading to assist pathologists by providing second opinions and reducing their workload. A computer-aided breast cancer detection, classification, scoring and grading system for tissue cell nuclei in histological images is introduced and validated as part of the Biopsy Analysis System. Cancer cell nuclei are selectively stained with monoclonal antibodies, such as the anti-estrogen receptor antibodies, which are widely used as part of assessing patient prognosis in breast cancer. This paper also presents the classification of micro cancer objects of breast tumors based on a feed-forward back-propagation neural network (FNN). Twenty-six hundred sets of cell nuclei characteristics were obtained by applying image analysis techniques to microscopic slides. The data set consists of eight features, which form the input layer of the FNN. The FNN classifies the input features into type 4, type 3, type 2 and type 1 cancer-affected objects. The sensitivity, specificity and accuracy were found to be 99.10%, 95.70% and 98.60% respectively. It can be concluded that the FNN gives fast and accurate classification and works as a promising tool for the classification of breast cell nuclei.

Keywords— color conversion, adaptive histogram equalization, adaptive thresholding based segmentation, watershed segmentation

I. INTRODUCTION


A large number of research works have been conducted in the area of breast cancer detection, classification, scoring and grading. Many university centers, research centers and commercial institutions are focused on this issue because breast cancer is becoming the most common form of cancer in today's female population. Thus, the construction of a fully automatic cancer detection system supporting a human expert has become a challenging and difficult task. Nowadays many camera-based automatic breast cancer detection systems have to face the problem of separating cells and their nuclei from the rest of the image content (Lee and Street, 2000; Pena-Reyes and Sipper, 1998; Setiono, 1996; Wolberg et al., 1993) [4]-[15]. The nuclei separation process is very important because the nucleus of the cell is the place where breast cancer malignancy can be observed. Thus, much attention in the construction of an expert system supporting diagnosis has to be paid to the segmentation, or breast cancer detection, stage.

The main difficulty of the segmentation process is due to the incompleteness and uncertainty of the information contained in the histopathological image. The imperfection of the data acquisition process, in the form of noise, chromatic distortion and deformation of the histopathological material caused by its preparation, additionally increases the problem complexity. The nature of image acquisition (3D to 2D transformation) and the method of scene illumination also affect the image luminance, sharpness and quality [13][15]. Until now many segmentation methods have been proposed (Carlotto, 1987; Chen et al., 1998; Kass et al., 1987; Otsu, 1979; Su and Chou, 2001; Vincent and Soille, 1991) but, unfortunately, each of them introduces numerous additional problems and usually works in practice only under given assumptions and/or needs the end-user's interaction or cooperation (Lee and Street, 2000; Street, 2000; Wolberg et al., 1993; Zhou and Pycock, 1997) [19]-[21]. Since many histopathological projects nowadays assume full automation and real-time operation with a high degree of efficiency, a method free of the drawbacks of the already known approaches has to be constructed.

Breast cancer detection, classification, scoring and grading of histopathological images is the standard clinical practice for the diagnosis and prognosis of breast cancer. Pathologists perform cancer detection manually under a microscope, and their experience directly influences the accuracy of cancer detection. Variability among pathologists has been observed in clinical practice [4]-[6]. In a large hospital, a pathologist typically handles a large number of cancer detection cases per day. It is, therefore, a very difficult and time-consuming task. A computer system that performs automatic cancer detection can assist pathologists by providing second opinions, reducing their workload and alerting them to cases that require closer attention, allowing them to focus on diagnosis and prognosis. This paper presents a multi-segmentation method for breast cancer detection and a neural network for classification. A group of modified versions of histopathological image segmentation methods adapted for fine needle biopsy images is presented, namely adaptive thresholding based segmentation and the watershed algorithm. One can also find here a description of the denoising and contrast enhancement techniques, the presegmentation step and the fully automatic nuclei localization mechanism used in our approach.

Cancer detection and classification systems, however, although dedicated, do not use the manual approach of counting individual cancer cell nuclei, but rather work with specific areas of stained nuclei [10]-[13]. This is because the identification of individual cancer cell nuclei makes the accumulation of a reasonable sample size tedious and time consuming, since it is difficult to decide, automatically or by visual inspection, whether cancer cell nuclei touch or partially overlap [14]-[20]. To overcome these difficulties, methods based on the assessment of areas of interest (e.g. specifically stained nuclei) have been developed, which do not rely on identifying individual nuclei. Additionally, methods depending on adaptive thresholds to distinguish between stained tissue (specific staining) and background (nonspecific staining) are based on the assumption that local variations due to preparation or imaging do not significantly influence the measurements [10]-[14]. Furthermore, one of the main objectives of computer-aided biopsy analysis is to minimize some of the variability that occurs as a consequence of the manual microscopic inspection of stained slides. In addition, computer-aided nuclei analysis also has to be efficient, since pathologists are unlikely to spend more time on evaluating a specimen than that required for the routine manual assessments. As already mentioned, in the manual assessment of biopsy slides one strategy has been to utilize a semi-quantitative scheme to make the assessment more accurate and more objective. To the best of our knowledge no systems as yet exist which attempt to aid the expert in the detection, counting, and classification of individual nuclei using the semi-quantitative scheme.



II. BREAST CANCER DETECTION ALGORITHM


Our breast cancer detection system adopts an adaptive histogram equalization and multi-segmentation approach. First, a low-resolution global image of the whole histological slide is reconstructed by scaling down and tiling high-resolution micro-image frames. These frames are captured at 40× magnification from a patient's histological sample stained with the H&E marker. Typically, many image frames are generated from a single patient's sample.


A. Preprocessing & Color Conversion

The gray-scale conversion process converts the true-color RGB image to a gray-scale intensity image by eliminating the hue and saturation information while retaining the luminance. Algorithm: the RGB values are converted to gray-scale values by forming a weighted sum of the R, G and B components: 0.2989 * R + 0.5870 * G + 0.1140 * B.
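As a minimal illustration of this weighted-sum conversion (a sketch only, not the authors' MATLAB routine; the NumPy implementation and function name are our own):

```python
import numpy as np

def rgb_to_gray(rgb):
    """Convert an H x W x 3 RGB image to gray scale using the
    luminance weights 0.2989, 0.5870 and 0.1140 quoted above."""
    rgb = np.asarray(rgb, dtype=np.float64)
    return 0.2989 * rgb[..., 0] + 0.5870 * rgb[..., 1] + 0.1140 * rgb[..., 2]
```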


B. Adjust Image Intensity Value

The intensity-adjustment process maps the intensity values of the gray-scale image to new values such that 1% of the data is saturated at the low and high intensities of the image. This increases the contrast of the output image.
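A rough sketch of this 1% saturation stretch, assuming a NumPy gray-scale input; the percentile-based formulation and the function name are our assumptions rather than the paper's exact routine:

```python
import numpy as np

def adjust_intensity(gray, saturation=1.0):
    """Linearly stretch intensities so that `saturation` percent of pixels
    saturate at each end of the range, then rescale to [0, 255]."""
    lo, hi = np.percentile(gray, [saturation, 100.0 - saturation])
    stretched = np.clip((gray - lo) / max(hi - lo, 1e-12), 0.0, 1.0)
    return (stretched * 255.0).astype(np.uint8)
```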

C. Contrast Limited Adaptive Histogram Equalization Process

Contrast-limited adaptive histogram equalization (CLAHE) enhances the contrast of the gray-scale image. CLAHE operates on small regions of the image, called tiles, rather than on the entire image. The contrast of each tile is enhanced so that the histogram of the output region approximately matches a specified distribution. The neighboring tiles are then combined using bilinear interpolation to eliminate artificially induced boundaries. The contrast, especially in homogeneous areas, can be limited to avoid amplifying any noise that might be present in the image.
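For illustration, CLAHE as described here is available off the shelf; the sketch below uses OpenCV (our assumption, since the paper's implementation was in MATLAB), with illustrative clip-limit and tile-grid values:

```python
import cv2  # OpenCV, assumed here purely for illustration

def enhance_with_clahe(gray_u8, clip_limit=2.0, tiles=(8, 8)):
    """Apply CLAHE to an 8-bit gray-scale image. The clip limit and
    tile grid size are illustrative values, not the paper's settings."""
    clahe = cv2.createCLAHE(clipLimit=clip_limit, tileGridSize=tiles)
    return clahe.apply(gray_u8)
```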


D. Adaptive Thresholding based Segmentation

The adaptive thresholding based segmentation process also operates on small regions of the image (tiles) rather than on the entire image. For each region it computes a threshold (level) that can be used to convert an intensity image to a binary image; the level is a normalized intensity value in the range [0, 1]. For each region Otsu's method is used, which chooses the threshold that minimizes the intraclass variance of the black and white pixels. Multidimensional arrays are automatically converted to 2-D arrays by reshaping, and any nonzero imaginary part of the image is ignored. The thresholding function also returns an effectiveness metric, EM, a value in the range [0, 1] that indicates how effectively the input image was thresholded; the lower bound is attainable only by images having a single gray level, and the upper bound only by two-valued images. After the threshold value is found, the gray-scale image is converted to a binary image: all pixels in the input image with luminance greater than the level are replaced with the value 1 (white) and all other pixels with the value 0 (black). The level is specified in the range [0, 1] regardless of the class of the input image, and the adaptive thresholding function can compute the level automatically; these levels are used to segment the biopsy image. The original image is divided into an array of overlapping sub-images. A gray-level distribution histogram is produced for each sub-image, and the optimal threshold for that sub-image is calculated from this histogram. Since the sub-images overlap, a threshold for each individual pixel can then be produced by interpolating the thresholds of the sub-images. An alternative approach is to statistically examine the intensity values of the local neighborhood of each pixel. The first problem with this method is the choice of the statistic by which the measurement is made; the appropriate statistic may vary from one image to another, and is largely dependent on the nature of the image.
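A simplified sketch of tile-wise Otsu thresholding, assuming scikit-image's threshold_otsu is available; the tile size is illustrative and the overlapping-window interpolation described above is omitted for brevity:

```python
import numpy as np
from skimage.filters import threshold_otsu  # assumed dependency

def adaptive_otsu_binarize(gray, tile=64):
    """Tile-wise Otsu thresholding: each tile receives its own level."""
    h, w = gray.shape
    binary = np.zeros_like(gray, dtype=bool)
    for y in range(0, h, tile):
        for x in range(0, w, tile):
            block = gray[y:y + tile, x:x + tile]
            # a flat tile has no Otsu threshold; leave it as background
            if block.max() > block.min():
                binary[y:y + tile, x:x + tile] = block > threshold_otsu(block)
    return binary
```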

E. Morphological Operation after Adaptive Thresholding based Segmentation

Morphological operations (dilation, erosion, bottom-hat filtering, opening, closing, top-hat filtering and hole filling) are used after the adaptive thresholding based segmentation.
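As a hedged example of such post-thresholding clean-up, a few of the listed operations can be chained as follows; the SciPy/scikit-image calls and the structuring-element radius are our assumptions, not the paper's configuration:

```python
from scipy.ndimage import binary_fill_holes
from skimage.morphology import binary_closing, binary_opening, disk

def clean_mask(mask, radius=3):
    """Opening removes small specks, closing bridges small gaps,
    and holes inside nuclei are filled."""
    selem = disk(radius)
    cleaned = binary_opening(mask, selem)
    cleaned = binary_closing(cleaned, selem)
    return binary_fill_holes(cleaned)
```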


F. Watershed Segmentation

The watershed segmentation algorithm is inspired by natural observations, such as a rainy day in the mountains (Gonzalez and Woods, 2002; Pratt, 2001; Russ, 1999). A given image can be defined as a terrain in which nuclei correspond to valleys (the terrain modeled in the previous steps, turned upside down). The terrain is flooded by rainwater and the arising puddles start to turn into basins. When the water from one basin begins to pour into another, a separating watershed is created. The flooding operation is stopped when the water level reaches a given threshold θ; the threshold should preferably be placed somewhere in the middle between the background and a nucleus localization point. In our approach the distance transform is used for segmentation. The areas where the cells are located are analyzed by dilating the regional maxima of the distance transform for each cell nucleus. To visualize the individual mountain peaks, markers are applied to every peak. The watershed function starts at the peaks and expands in every direction until it reaches an edge from another peak or the edge of the picture. The watershed division lines are applied to the segmented image to make a clear separation of the touching cell nuclei: in one image the watershed lines are superimposed on the segmented image, and in the other the lines are removed, leaving only the segmented cell nuclei. Segmenting the cells in this manner improves not only the accuracy of the cell count, but also the accuracy of the cell nucleus area statistics. In order to make the cell nucleus area statistics even more accurate, we remove any cell nuclei that touch the border, since most of these are likely to be incomplete cells. After the average nucleus size is calculated, excluding the incomplete border nuclei, it is possible to estimate the equivalent total number of border cell nuclei by summing all partial cell nucleus areas and dividing by the average size, if the border cell count is needed for analyzing different types of tissue sample. It is also possible to double count the projected overlapping nuclei area (intersection) to improve the accuracy of the cell nucleus area statistics.
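A common way to realize the marker-based flooding described above is a marker-controlled watershed on the distance transform; the sketch below (scikit-image/SciPy, illustrative parameters) is our reading of the scheme, not the paper's code:

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

def split_touching_nuclei(binary_mask, min_distance=10):
    """Separate touching nuclei: distance transform, one marker per
    regional maximum, then watershed flooding restricted to the mask."""
    distance = ndi.distance_transform_edt(binary_mask)
    peaks = peak_local_max(distance, min_distance=min_distance,
                           labels=binary_mask.astype(int))
    markers = np.zeros(distance.shape, dtype=int)
    markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
    labels = watershed(-distance, markers, mask=binary_mask)
    return labels  # each nucleus receives its own integer label
```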

G. Morphological Operation after Watershed Segmentation

Clearing border objects: the border-clearing process suppresses structures that are lighter than their surroundings and that are connected to the image border. The output image is gray-scale or binary, respectively, and the connectivity is 8 in two dimensions. Removing small objects: this process removes from a binary image all connected components (objects) that have fewer than P pixels, producing another binary image; the connectivity is again 8 in two dimensions. Perimeter of objects: this process returns a binary image containing only the perimeter pixels of the objects in the input image; a pixel is part of the perimeter if it is nonzero and connected to at least one zero-valued pixel. Blob labeling: the blob-labeling process returns a matrix of the same size as the binary image, containing labels for the connected objects in the binary image. The elements of the matrix are integer values greater than or equal to 0: pixels labeled 0 are the background, pixels labeled 1 make up one object, pixels labeled 2 make up a second object, and so on.
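These operations map naturally onto standard library routines; the following sketch is an assumption-laden illustration only, with min_pixels standing in for the unspecified P:

```python
from skimage.measure import label
from skimage.morphology import remove_small_objects
from skimage.segmentation import clear_border, find_boundaries

def postprocess_nuclei(binary_mask, min_pixels=50):
    """Border clearing, small-object removal, blob labeling and
    perimeter extraction, as sketched in the text above."""
    mask = clear_border(binary_mask)                   # drop nuclei touching the border
    mask = remove_small_objects(mask, min_size=min_pixels, connectivity=2)
    labels = label(mask, connectivity=2)               # 8-connectivity in 2-D
    perimeter = find_boundaries(labels, mode='inner')  # perimeter pixels of each object
    return labels, perimeter
```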


III. CLASSIFICATION ALGORITHM

In the present work, neural networks are used for classifying the cancer-affected micro objects. Neural networks derive their power from their massively parallel structure and their ability to learn from experience. A neural network can be used for fairly accurate classification of input data into categories, provided it has previously been trained to do so; the accuracy of the classification depends on the efficiency of training. The knowledge gained by the learning experience is stored in the form of connection weights, which are used to make decisions on fresh input. Three issues need to be settled in designing an ANN for a specific application:
• topology of the network;
• training algorithm;
• neuron activation functions.
In our topology, the input layer of the ANN classifier has 8 neurons. The output layer is determined by the number of classes desired; since the outputs are type 4, type 3, type 2 and type 1 micro objects, the output layer consists of 4 neurons. The hidden layer consists of 20 neurons. Before the training process is started, all the weights are initialized to small random numbers; this ensures that the classifier network is not saturated by large values of the weights. In this experiment, 390 data sets were set aside for the testing process. A cancer object (nucleus) is classified into one of four object types using the feed-forward back-propagation neural network classifier. After the classification of each cancer object (nucleus), the score and grade are computed. The classifier was trained and tested on images annotated by one of the experts. The main steps of the classification algorithm are summarized as follows.
Extract features for each nucleus (micro object): the following local and global features are extracted: area, diameter, solidity, eccentricity, extent, perimeter, roundness and compactness.
Classify each cancer object: a cancer object (nucleus) is classified into one of the four cancer object types using the feed-forward back-propagation neural network classifier.
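The classifier itself was implemented in MATLAB; purely as an illustrative sketch of an 8-20-4 feed-forward network trained by back-propagation, a minimal NumPy version could look as follows (the sigmoid activations, learning rate and epoch count are our assumptions, not the paper's settings):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class SimpleFNN:
    """Minimal 8-20-4 feed-forward network with sigmoid units and plain
    back-propagation; illustrative only, not the paper's MATLAB model."""

    def __init__(self, n_in=8, n_hidden=20, n_out=4):
        self.W1 = rng.normal(0.0, 0.1, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.1, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)

    def forward(self, X):
        self.h = sigmoid(X @ self.W1 + self.b1)
        self.o = sigmoid(self.h @ self.W2 + self.b2)
        return self.o

    def train(self, X, Y, lr=0.1, epochs=500):
        for _ in range(epochs):
            out = self.forward(X)
            # gradient of the squared error through the sigmoid layers
            d_out = (out - Y) * out * (1.0 - out)
            d_hid = (d_out @ self.W2.T) * self.h * (1.0 - self.h)
            self.W2 -= lr * self.h.T @ d_out / len(X)
            self.b2 -= lr * d_out.mean(axis=0)
            self.W1 -= lr * X.T @ d_hid / len(X)
            self.b1 -= lr * d_hid.mean(axis=0)

    def predict(self, X):
        return self.forward(X).argmax(axis=1)  # indices 0..3 -> type 1..type 4
```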

IV. TRAINING AND TESTING

The proposed network was trained with all 1820 micro-object (tumor) data cases. These 1820 cases are fed to the FNN with 8 input neurons, one hidden layer of 20 neurons and four output neurons. The MATLAB software package, version 8, was used to implement the software in the current work. When the training process was completed for the training data (1820 cases), the final weights of the network were saved to be ready for the testing procedure. The time needed to train on the training data sets was approximately 1.80 seconds. The testing process was carried out on 390 cases; these 390 cases were fed to the proposed network and their outputs recorded for calculation of the sensitivity, specificity and accuracy of prediction.
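Continuing the sketch above, training on 1820 cases and testing on 390 could be wired up as follows; the arrays here are random placeholders for the real 8-feature vectors and one-hot type labels, used only to show the intended shapes:

```python
import numpy as np

# Hypothetical stand-ins for the 1820 training and 390 test cases.
X_train = np.random.rand(1820, 8)
Y_train = np.eye(4)[np.random.randint(0, 4, 1820)]
X_test = np.random.rand(390, 8)
Y_test = np.eye(4)[np.random.randint(0, 4, 390)]

net = SimpleFNN()
net.train(X_train, Y_train)
predicted_types = net.predict(X_test)
test_accuracy = (predicted_types == Y_test.argmax(axis=1)).mean()
```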

IJ

A

V. BIOPSY SCORING AND GRADING

Scoring according to positive cells: the scoring of the cancer-detected biopsy image is evaluated by computing the percentage of positive cancer cells; the respective scores are defined as follows:

% of cells positive    Score
0                      0
1-25%                  1
26-50%                 2
51-75%                 3
>=76%                  4

Scoring based on the micro objects: all the detected cells (micro objects) in the high-resolution images are classified into one of the four micro-object types, and scoring is performed based on the object type. Overall grading: the overall grade of the patient is defined in terms of the total score G and is assigned as follows:
• Grade I (low grade): G = 3, 4, 5
• Grade II (intermediate grade): G = 6, 7, 8
• Grade III (high grade): G = 9, 10, 11

TABLE II
GRADE. P: PATHOLOGIST, S: SYSTEM.

Patient Image     P/S    Grade
Patient Image1    P      3
Patient Image1    S      3
Patient Image2    P      2
Patient Image2    S      2
Patient Image3    P      3
Patient Image3    S      2
Patient Image4    P      2
Patient Image4    S      1
Patient Image5    P      1
Patient Image5    S      1
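The grade bands listed above translate directly into a small helper; this is a sketch of the stated mapping only:

```python
def overall_grade(total_score: int) -> str:
    """Map the total score G to the grade bands quoted above."""
    if 3 <= total_score <= 5:
        return "Grade I (low grade)"
    if 6 <= total_score <= 8:
        return "Grade II (intermediate grade)"
    if 9 <= total_score <= 11:
        return "Grade III (high grade)"
    raise ValueError("total score outside the 3-11 range")
```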


VI. RESULTS AND DISCUSSION

Figure 1 shows a sample high-resolution original image, and Figure 2 illustrates the detected cancer cells for nuclear scoring. Nevertheless, scoring is still reliable as long as enough type-2 and type-3 cells are detected. This is analogous to clinical practice, in which a pathologist performs a quick scan of at most 5 image frames to pick up enough type-2 and type-3 cells to make an assessment. Table II compares the grading results of the algorithm and a pathologist. It can be seen that the system's scores are very close to the pathologist's scores; the system's scores tend to be slightly lower, which could be due to the slightly more stringent criteria applied by the system. Given these encouraging results, we are confident that an automatic breast cancer detection and grading system can be developed to assist pathologists by providing second opinions and alerting them to cases that require further attention. Nevertheless, more comprehensive tests are needed to provide a better evaluation of the algorithm's performance. The performance of the classification algorithm was evaluated by computing the percentages of Sensitivity (SE), Specificity (SP) and Accuracy (AC), whose respective definitions are as follows:

SE = TP / (TP + FN) * 100    (1)

SP = TN / (TN + FP) * 100    (2)

AC = (TP + TN) / (TP + TN + FP + FN) * 100    (3)


where TP is the number of true positives, TN is the number of true negatives, FN is the number of false negatives, and FP is the number of false positives. Since it is of interest to estimate the performance of the classifier on the classification of benign and malignant breast cell nuclei, the true positives (TP), false positives (FP), true negatives (TN) and false negatives (FN) are defined as follows:
TP: predicts malignant as malignant.
FP: predicts benign as malignant.
TN: predicts benign as benign.
FN: predicts malignant as benign.
Sensitivity, specificity and accuracy of prediction have been calculated according to the above formulas for all of the testing data (390 micro-object cases). Table III shows the resulting SE, SP and AC for the testing data of the proposed network.
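For concreteness, Eqs. (1)-(3) can be computed directly from these confusion counts; the helper below is a trivial sketch, not code from the paper:

```python
def evaluate(tp: int, tn: int, fp: int, fn: int):
    """Sensitivity, specificity and accuracy as defined in Eqs. (1)-(3)."""
    sensitivity = tp / (tp + fn) * 100
    specificity = tn / (tn + fp) * 100
    accuracy = (tp + tn) / (tp + tn + fp + fn) * 100
    return sensitivity, specificity, accuracy
```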

TABLE III
THE RESULTS AFTER TRAINING OF THE NETWORK

No. of cases    Sensitivity    Specificity    Accuracy
390             99.10%         95.70%         98.60%

Figure 1. Biopsy Image.
Figure 2. Cancer Detected Image.
Figure 3. Classified Cancer Detected Image; red, magenta, blue and green objects are type 4, type 3, type 2 and type 1 objects respectively.


VII. CONCLUSION

This paper presented a multi-segmentation method for automatic breast cancer detection and grading of histopathological images. The individual cells are detected and classified in the high-resolution image frames and are then scored according to the three criteria of the Nottingham system. Given the encouraging test results, we are confident that an automatic grading system can be developed to assist pathologists by providing second opinions and alerting them to cases that require further attention. An FNN has been implemented for the classification of the micro objects of breast cancer tumors. Twenty-six hundred sets of cell nuclei characteristics, obtained by applying image analysis techniques to microscopic slides of H&E-stained samples of breast biopsies, have been used in the current work. The MATLAB software package, version 8, was used to implement the software. These feature vectors, each consisting of eight image analysis features, were used to generate the training and testing sets of the proposed NN. The accuracy was calculated to evaluate the effectiveness of the proposed network: the obtained accuracy was 98.60%, whereas the sensitivity and specificity were found to be 99.10% and 95.70% respectively. We conclude that the proposed system gives fast and accurate classification of breast tumors.

REFERENCES

[1] Ajay Nagesh Basavanhally, Shridar Ganesan, Shannon Agner, James Peter Monaco, et al., "Computerized image-based detection and grading of lymphocytic infiltration in HER2 breast cancer histopathology," IEEE Transactions on Biomedical Engineering, vol. 57, no. 3, March 2010.
[2] Jean-Romain Dalle, Wee Kheng Leow, Daniel Racoceanu, Adina Eunice Tutac, Thomas C. Putti, "Automatic breast cancer grading of histopathological images," 30th Annual International IEEE EMBS Conference, Vancouver, British Columbia, Canada, August 2008.
[3] Ali H. Al-Timemy, Fawzi M. Al-Naima, Nebras H. Qaeeb, "Probabilistic neural network for breast biopsy classification," MASAUM Journal of Computing, vol. 1, issue 2, September 2009.
[4] F. Schnorrenberg, C. S. Pattichis, K. Kyriacou, M. Vassiliou, C. N. Schizas, "Computer-aided classification of breast cancer nuclei," Technology & Health Care, Elsevier Science B.V., Amsterdam, Netherlands, 1996.
[5] Spiros Kostopoulos, Dionisis Cavouras, Antonis Daskalakis, Panagiotis Bougioukos, Pantelis Georgiadis, George C. Kagadis, Ioannis Kalatzis, Panagiota Ravazoula, George Nikiforidis, "Colour-texture based image analysis method for assessing the hormone receptor status in breast tissue sections," Proceedings of the 29th Annual International Conference of the IEEE EMBS, Lyon, France, August 23-26, 2007.
[6] Sokol Petushi, Fernando U. Garcia, Marian M. Haber, Constantine Katsinis, Aydin Tozeren, "Large-scale computations on histology images reveal grade-differentiating parameters for breast cancer," BMC Medical Imaging 2006, 6:14, doi:10.1186/1471-2342-6-14.
[7] Tao Shi, David Seligson, Arie S. Belldegrun, Aarno Palotie, Steve Horvath, "Tumor classification by tissue microarray profiling: random forest clustering applied to renal cell carcinoma," Modern Pathology (2005) 18, 547-557.
[8] Anthony McCabe, Marisa Dolled-Filhart, Robert L. Camp, David L. Rimm, "Automated quantitative analysis (AQUA) of in situ protein expression, antibody concentration, and prognosis," Journal of the National Cancer Institute, vol. 97, no. 24, December 21, 2005.
[9] James W. Bacus, Charles W. Boone, James V. Bacus, Michele Follen, Gary J. Kelloff, Valery Kagan, Scott M. Lippman, "Image morphometric nuclear grading of intraepithelial neoplastic lesions with applications to cancer chemoprevention trials," Cancer Epidemiology, Biomarkers & Prevention, vol. 8, pp. 1087-1094, December 1999.
[10] Hussain Fatakdawala, Jun Xu, Ajay Basavanhally, Gyan Bhanot, Shridar Ganesan, Michael Feldman, John E. Tomaszewski, Anant Madabhushi, "Expectation maximization driven geodesic active contour with overlap resolution (EMaGACOR): application to lymphocyte segmentation on breast cancer histopathology," IEEE Transactions on Biomedical Engineering, December 2009.
[11] Scarff RW, Torloni H, "Histological typing of breast tumors. International histological classification of tumors," World Health Organization 1968, 2(2):13-20.
[12] Elston CW, Ellis IO, "Pathological prognostic factors in breast cancer. The value of histological grade in breast cancer: experience from a large study with long-term follow-up," Histopathology 1991, 19:403-410.
[13] C. Demir and B. Yener, "Automated cancer diagnosis based on histopathological images: a systematic survey," Rensselaer Polytechnic Institute, Tech. Rep., 2005.
[14] A. Tutac, D. Racoceanu, T. Putti, W. Xong, W.-K. Leow, and V. Cretu, "Knowledge-guided semantic indexing of breast cancer histopathology images," in Proc. Int. Conf. on Biomedical Engineering and Informatics, 2008.
[15] S. Petushi, F. U. Garcia, M. M. Haber, C. Katsinis, and A. Tozeren, "Large-scale computations on histology images reveal grade-differentiating parameters for breast cancer," BMC Medical Imaging, vol. 6, no. 14, 2006.
[16] S. Doyle, M. Hwang, M. Feldman, and J. Tomaszeweski, "Automated grading of prostate cancer using architectural and textural image features," in Proc. of 4th IEEE Int. Symp. on Biomedical Imaging, 2007, pp. 1284-1287.
[17] H. Soltanian-Zadeh and K. Jafari-Khouzani, "Multiwavelet grading of prostate pathological images," IEEE Trans. on Biomedical Engineering, vol. 50, pp. 697-704, 2003.
[18] A. Nedzved, S. Ablameyko, and I. Pitas, "Morphological segmentation of histology cell images," in Proc. Int. Conf. Pattern Recognition, 2000, pp. 1500-1503.
[19] H. Jeong, T.-Y. Kim, H.-G. Hwang, and H.-J. Choi, "Comparison of thresholding methods for breast tumor cell segmentation," in Proc. of 7th Int. Workshop on Enterprise Networking and Computing in Healthcare Industry, 2005, pp. 392-395.
[20] F. Schnorrenberg, "Comparison of manual and computer-aided breast cancer biopsy grading," in Proc. of IEEE EMBS, 1996.
[21] M. E. Adawi, Z. Shehab, H. Keshk, and M. E. Shourbagy, "A fast algorithm for segmentation of microscopic cell images," in Proc. of 4th Int. Conf. on Information & Communications Technology, 2006.
[22] Latson L, Sebek B, Powell KA, "Automated cell nuclear segmentation in color images of hematoxylin and eosin-stained breast biopsy," Analytical and Quantitative Cytology and Histology 2003, 25(6):321-331.
[23] Schnorrenberg F, Pattichis C, Kyriacou K, Schizas C, "Computer-aided detection of breast cancer nuclei," IEEE Trans. on Information Technology in Biomedicine 1997, 1(2):128-140.
[24] Schupp S, Elmoataz A, Herlin P, Bloyet D, "Mathematical morphologic segmentation dedicated to quantitative immunohistochemistry," Analytical and Quantitative Cytology and Histology 2001, 23(4):257-267.
[25] Bacus JW, Boone CW, Bacus JV, Follen M, Kelloff GJ, Kagan V, Lippman SM, "Image morphometric nuclear grading of intraepithelial neoplastic lesions with applications to cancer chemoprevention trials," Cancer Epidemiology, Biomarkers & Prevention 1999, 8:1087-1094.
[26] Hoque A, Lippman SM, Boiko IV, Atkinson EN, Sneige N, Sahin A, Weber DM, Risin S, Lagios MD, Schwarting R, Colburn WJ, Dhingra K, Follen M, Kelloff GJ, Boone CW, Hittelman WN, "Quantitative nuclear morphometry by image analysis for prediction of recurrence of ductal carcinoma in situ of the breast," Cancer Epidemiology, Biomarkers & Prevention 2001, 10:249-259.
[27] Wolberg WH, Street WN, Heisey DM, Mangasarian OL, "Computer-derived nuclear grade and breast cancer prognosis," Analytical and Quantitative Cytology and Histology 1995, 17:257-264.
[28] Petushi S, Katsinis C, Coward C, Garcia F, Tozeren A, "Automated identification of microstructures on histology slides," IEEE International Symposium on Biomedical Imaging: Macro to Nano 2004, 1:424-427.
