Breast Cancer Surgical Navigation: An Image Guidance System Leveraging Computer Vision

dc.contributor.advisor: Miga, Michael I
dc.creator: Richey, Winona Lynn
dc.date.accessioned: 2022-05-19T17:32:17Z
dc.date.created: 2022-05
dc.date.issued: 2022-05-16
dc.date.submitted: May 2022
dc.identifier.uri: http://hdl.handle.net/1803/17408
dc.description.abstract: Surgery is a primary treatment option for breast cancer patients; the majority of these patients are recommended for breast conserving surgery. Unfortunately, patient positional changes between imaging and surgery cause breast tumors to undergo large displacements and shape changes with respect to the few available landmarks: the nipple, skin, and chest wall. This confounds understanding of tumor boundaries during surgery, resulting in larger excised volumes and high reoperation rates due to incomplete excision. This dissertation couples high-sensitivity magnetic resonance imaging (MRI) with automatic computer vision surface measurements and patient-specific biomechanical models in an integrated system to guide breast cancer resection. A guidance platform was developed to align preoperative imaging data to the surgical scene and present the surgeon with their tool location in relation to model-predicted tumor boundaries and relevant breast anatomy. The guidance system was used to characterize supine breast deformations resulting from imaging-to-surgery repositioning, arm abduction, and operating room table rotation. Results indicate that breast displacements in the supine position are nonrigid in nature, and the error remaining after rigid correction is significant. A patient-specific biomechanical model-based framework was established to rapidly predict these large, nonrigid deformations from sparse intraoperative measurements collected at the patient bedside via tracked ultrasound images and skin surface positional data. To more closely represent the intraoperative organ presentation, preoperative supine MRIs are deformed using model-predicted volumetric displacements. This approach significantly improves surface and subsurface localization accuracy compared to a rigid correction approach. Finally, a breast surface acquisition method was developed using stereo cameras and computer vision. The continuous, near-real-time measurements use ink-based handwritten skin fiducials to track precise surface points that can drive the model-based correction. In addition to providing accurate intraoperative measurements, the character-based fiducials also serve as landmarks that intuitively convey the relationship between the patient in physical space and the on-screen model display. This dissertation presents a robust, accurate image guidance system, compatible with surgical workflows, to localize breast tumor boundaries in the operating room and fundamentally shift breast conserving surgery toward more precise tumor excision and lower reoperation rates.
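Editorial note: to make the abstract's comparison between rigid correction and model-based nonrigid correction concrete, the following is a minimal sketch of an SVD-based rigid point registration of matched fiducials (the standard Arun/Horn least-squares solution) and the residual fiducial error that a rigid transform leaves behind. This is an illustrative example only, not code from the dissertation; the function name, the use of NumPy, and all fiducial coordinates are assumptions.

import numpy as np

def rigid_register(source_pts, target_pts):
    """Least-squares rigid (rotation + translation) alignment of matched
    3D point sets via SVD (Arun/Horn method). Inputs are (N, 3) arrays."""
    src_centroid = source_pts.mean(axis=0)
    tgt_centroid = target_pts.mean(axis=0)
    src_c = source_pts - src_centroid
    tgt_c = target_pts - tgt_centroid
    # Cross-covariance matrix; its SVD yields the optimal rotation.
    H = src_c.T @ tgt_c
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    # Guard against a reflection solution (determinant of -1).
    if np.linalg.det(R) < 0:
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = tgt_centroid - R @ src_centroid
    return R, t

# Hypothetical fiducial positions (mm): preoperative MRI vs. intraoperative scene.
mri_fiducials = np.array([[10.0, 25.0, 5.0], [40.0, 30.0, 8.0],
                          [22.0, 60.0, 12.0], [35.0, 50.0, 3.0]])
or_fiducials = np.array([[12.5, 24.0, 9.0], [43.0, 28.0, 14.0],
                         [20.0, 63.0, 15.0], [38.0, 47.0, 6.0]])

R, t = rigid_register(mri_fiducials, or_fiducials)
aligned = mri_fiducials @ R.T + t
# Residual error after the best rigid fit: the portion of the displacement a
# rigid transform cannot explain, i.e., the nonrigid component that a
# biomechanical model-based correction is intended to recover.
residuals = np.linalg.norm(aligned - or_fiducials, axis=1)
print("mean residual fiducial error (mm):", round(float(residuals.mean()), 2))

In a workflow of the kind the abstract describes, the mean residual reported here corresponds to the error remaining after rigid alignment, which motivates the model-based nonrigid correction.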
dc.format.mimetype: application/pdf
dc.language.iso: en
dc.subject: breast cancer; image guidance; image guided surgery; surgical guidance; lumpectomy; breast conserving surgery; biomechanical modeling; computer vision; surface tracking; stereovision
dc.title: Breast Cancer Surgical Navigation: An Image Guidance System Leveraging Computer Vision
dc.type: Thesis
dc.date.updated: 2022-05-19T17:32:17Z
dc.contributor.committeeMember: Byram, Brett C
dc.type.material: text
thesis.degree.name: PhD
thesis.degree.level: Doctoral
thesis.degree.discipline: Biomedical Engineering
thesis.degree.grantor: Vanderbilt University Graduate School
local.embargo.terms: 2024-05-01
local.embargo.lift: 2024-05-01
dc.creator.orcid: 0000-0001-9697-4069
dc.contributor.committeeChair: Miga, Michael I

