OCM’s New Hope for Breast Cancer Patients

Nearly one in four women who have breast cancer and opt for a breast-saving lumpectomy will need a second surgery, according to a recent study. These repeat operations increase both the cost of medical treatment and the risk of complications.

Surgeons can freeze and examine surgically removed tissue during an operation to determine whether any cancer cells remain in the margin of tissue surrounding the excised tumor. But the accuracy of this approach is limited. And the results from a more thorough histopathological evaluation of the removed tissue are not available for several days.

What if surgeons had a more accurate way to find out—in real time in the operating room—whether the tumor margins were free of cancer cells?

Chao Zhou, assistant professor of electrical engineering, and Sharon Xiaolei Huang, associate professor of computer science and engineering, are working to make that vision a reality. They have created a computer-aided diagnostic technique that combines cutting-edge imaging technology with advanced artificial intelligence to distinguish, in real time, cancerous cells from benign ones.

“The idea is that one day, if this technique is used during surgery, it could complement the histopathology, potentially reducing the need for a second breast cancer surgery,” said Zhou.

Collaborators on the project include James G. Fujimoto of the Massachusetts Institute of Technology, James L. Connolly of Harvard Medical School, and Xianxu Zeng and Zhan Zhang of The Third Affiliated Hospital of Zhengzhou University in Henan, China.

In an article published in Medical Image Analysis, the researchers reported that their technique correctly identified benign versus cancerous cells more than 90 percent of the time. The article was titled “Integrated local binary pattern texture features for classification of breast tissue imaged by optical coherence microscopy.”

Sunhua Wan, a graduate student in Lehigh’s department of computer science and engineering, is the article’s lead author. Zhou and Huang are coauthors, along with Lehigh graduate students Ting Xu (computer science and engineering) and Tao Xu (electrical and computer engineering).

Powerful imaging meets powerful analysis
   
Zhou and Huang’s method is compelling for its relatively new application of an imaging technique called optical coherence microscopy (OCM) as a breast cancer diagnostic tool. It is also notable for using features extracted from OCM images to train a computer system to recognize texture patterns and automatically identify different tissue types.

“The process takes a large number of images, and labels the types of tissue in the sample,” says Huang. “For every pixel in an image, we know whether it is fat, carcinoma or another cell type. In addition, we extract thousands of different features that can be present in the image, such as texture, color or local contrast, and we use a machine learning algorithm to select which features are the most discriminating.”
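The feature-selection step Huang describes can be illustrated with a toy sketch. The snippet below ranks feature columns with a simple Fisher criterion (between-class mean gap over within-class spread); this is an illustrative stand-in, not the specific machine learning algorithm the team used, and all names and data here are hypothetical.

```python
import numpy as np

def fisher_score_ranking(features, labels):
    """Rank feature columns by a simple Fisher criterion:
    squared between-class mean gap over summed within-class variance."""
    labels = np.asarray(labels)
    scores = []
    for j in range(features.shape[1]):
        col = features[:, j]
        a, b = col[labels == 0], col[labels == 1]
        scores.append((a.mean() - b.mean()) ** 2 / (a.var() + b.var() + 1e-12))
    return np.argsort(scores)[::-1]  # most discriminating feature first

# Toy data: feature 1 cleanly separates the two classes, feature 0 is noise.
rng = np.random.default_rng(0)
labels = np.array([0] * 50 + [1] * 50)
noise = rng.normal(0.0, 1.0, 100)
signal = np.where(labels == 0, 0.0, 5.0) + rng.normal(0.0, 0.1, 100)
X = np.column_stack([noise, signal])
ranking = fisher_score_ranking(X, labels)
print(ranking[0])  # feature 1 ranks first
```

With thousands of candidate features per pixel, a ranking like this lets the classifier keep only the descriptors that actually separate tissue types.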

After examining multiple types of texture features, Huang and Zhou determined that Local Binary Pattern (LBP) features—visual descriptors that compare the intensity of a center pixel with those of its neighbors—worked best for classifying tissues imaged by OCM.
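The core LBP comparison is simple to sketch. The function below computes a basic 3x3 LBP code (a minimal illustration of the descriptor family, not the authors' exact implementation, which uses configurable radius and neighbor counts):

```python
import numpy as np

def lbp_code(patch):
    """Basic 3x3 Local Binary Pattern: compare the center pixel with its
    8 neighbors and pack the comparison results into one 8-bit code."""
    center = patch[1, 1]
    # Neighbors read clockwise starting from the top-left corner.
    neighbors = [patch[0, 0], patch[0, 1], patch[0, 2],
                 patch[1, 2], patch[2, 2], patch[2, 1],
                 patch[2, 0], patch[1, 0]]
    code = 0
    for i, n in enumerate(neighbors):
        if n >= center:        # neighbor at least as bright as center
            code |= (1 << i)   # set bit i of the pattern code
    return code

patch = np.array([[10, 20, 30],
                  [40, 25, 60],
                  [70, 80, 90]], dtype=float)
print(lbp_code(patch))  # 252
```

In practice, histograms of these codes over an image region form the texture feature fed to the classifier.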

The team also integrated two LBP variants. The Average Local Binary Pattern (ALBP) compares the intensity of each neighbor pixel with the average intensity of all neighbors. The Block-Based Local Binary Pattern (BLBP) compares the average intensities of pixel blocks of a certain shape in the neighborhood around the center pixel; their work uses two block shapes, Spoke and Ring.
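The ALBP variant changes only the reference value in the comparison. A minimal sketch, reusing the same 3x3 layout as basic LBP (BLBP's Spoke- and Ring-shaped block averaging is omitted here for brevity):

```python
import numpy as np

def albp_code(patch):
    """Average LBP (ALBP): compare each of the 8 neighbors with the mean
    intensity of all neighbors, instead of with the center pixel."""
    neighbors = np.array([patch[0, 0], patch[0, 1], patch[0, 2],
                          patch[1, 2], patch[2, 2], patch[2, 1],
                          patch[2, 0], patch[1, 0]], dtype=float)
    avg = neighbors.mean()     # reference value is the neighbor average
    code = 0
    for i, n in enumerate(neighbors):
        if n >= avg:
            code |= (1 << i)
    return code

patch = np.array([[10, 20, 30],
                  [40, 25, 60],
                  [70, 80, 90]], dtype=float)
print(albp_code(patch))  # 120
```

Because the reference is the neighborhood average rather than one pixel, ALBP is less sensitive to noise in the center pixel's value.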

Finally, because texture patterns of different scales appear in OCM images of human breast tissue, the researchers constructed a multiscale feature by integrating LBP, ALBP and BLBP features computed with different radius parameters. Together, these integrated features significantly improved classification accuracy.
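One common way to build such a multiscale descriptor is to compute a pattern histogram at each radius and concatenate them. The sketch below does this with a deliberately simplified 4-neighbor pattern (the published method uses richer LBP/ALBP/BLBP variants; this only illustrates the multiscale concatenation idea):

```python
import numpy as np

def pattern_histogram(image, radius):
    """Normalized histogram of simplified 4-neighbor binary patterns
    computed at a given neighbor radius (16 possible codes)."""
    h, w = image.shape
    codes = []
    for y in range(radius, h - radius):
        for x in range(radius, w - radius):
            c = image[y, x]
            nbrs = [image[y - radius, x], image[y, x + radius],
                    image[y + radius, x], image[y, x - radius]]
            codes.append(sum((n >= c) << i for i, n in enumerate(nbrs)))
    hist, _ = np.histogram(codes, bins=16, range=(0, 16))
    return hist / max(hist.sum(), 1)

def multiscale_feature(image, radii=(1, 2, 3)):
    """Concatenate per-radius histograms into one multiscale descriptor."""
    return np.concatenate([pattern_histogram(image, r) for r in radii])

img = np.random.default_rng(0).integers(0, 255, size=(32, 32)).astype(float)
feat = multiscale_feature(img)
print(feat.shape)  # (48,): 16 histogram bins at each of 3 radii
```

Each radius captures texture at a different spatial scale, which is why combining them helps when tissue structures of varying sizes appear in the same image.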

“Our experiments show that by integrating a selected set of LBP and the two new variant (ALBP and BLBP) features at multiple scales, the classification accuracy increased from 81.7 percent (using LBP features alone) to 93.8 percent using a neural network classifier,” the group reported in Medical Image Analysis.

“In addition,” says Huang, “we used these multiscale and integrated image features to achieve high sensitivity—100 percent—and specificity—85.2 percent—for cancer detection using the OCM images.”
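Sensitivity and specificity summarize a classifier's two failure modes. The snippet below shows the standard definitions on hypothetical toy labels (not the study's data): sensitivity of 100 percent means every cancerous sample is caught; specificity of 85.2 percent means most, but not all, benign samples are correctly cleared.

```python
def sensitivity_specificity(y_true, y_pred):
    """Sensitivity = TP / (TP + FN): fraction of cancerous samples caught.
    Specificity = TN / (TN + FP): fraction of benign samples cleared.
    Label convention: 1 = cancer, 0 = benign."""
    pairs = list(zip(y_true, y_pred))
    tp = sum(t == 1 and p == 1 for t, p in pairs)
    fn = sum(t == 1 and p == 0 for t, p in pairs)
    tn = sum(t == 0 and p == 0 for t, p in pairs)
    fp = sum(t == 0 and p == 1 for t, p in pairs)
    return tp / (tp + fn), tn / (tn + fp)

# Toy example: all 4 cancers caught, 1 of 6 benign samples flagged in error.
truth = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
pred  = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]
sens, spec = sensitivity_specificity(truth, pred)
print(sens, spec)  # 1.0 0.8333...
```

For intraoperative use, high sensitivity is the priority: a false negative (missed cancer at the margin) is exactly the error that leads to a second surgery.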

Zhou, whose work focuses on improving biomedical imaging techniques, is a pioneer in the use of OCM, a noninvasive imaging method that can provide 3D, high-resolution images of biological tissue at the cellular level. OCM images come very close to what can be detected through histopathology.  

Huang’s role, as an expert in training computers to recognize visual images, is to identify the best way to analyze OCM images to differentiate between benign and cancerous tissue.

The combination of powerful imaging and powerful image analysis that makes up their technique could be a significant step toward enabling real-time diagnosis of breast cancer tissue in the operating room, the researchers say. Their hope is that it may one day minimize the need for a second surgery, reducing costs and lowering the risk of complications for patients.

Story by Lori Friedman
 
