2023
Authors
Lopes, D; Coelho, L; Silva, MF;
Publication
APPLIED SCIENCES-BASEL
Abstract
Listening to internal body sounds, or auscultation, is one of the most widely used diagnostic techniques in medicine. In addition to being simple, non-invasive, and low-cost, the information it provides in real time is essential for clinical decision-making. This procedure, usually performed by a doctor in the presence of the patient, currently presents three challenges: procedure duration, participants' safety, and the patient's privacy. In this article we tackle these challenges by proposing a new autonomous robotic auscultation system. With the patient prepared for the examination, a 3D computer vision subsystem identifies the auscultation points and translates them into spatial coordinates. The robotic arm is then responsible for bringing the stethoscope into contact with the patient's skin at the various auscultation points. The proposed solution was evaluated by performing a simulated pulmonary auscultation on six patients (of distinct heights, weights, and skin colors). The results showed that the vision subsystem correctly identified 100% of the auscultation points under uncontrolled lighting conditions, and that the positioning subsystem accurately placed the gripper at the corresponding locations on the human body. Patients reported no discomfort during auscultation using the described automated procedure.
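The article itself does not include code; as a rough illustration of the step in which a detected auscultation point is translated into spatial coordinates for the arm, the sketch below back-projects a pixel with its depth reading into 3D camera coordinates (pinhole model) and maps it into the robot base frame. The intrinsics, the camera-to-base transform, and the example pixel are hypothetical placeholders, not values from the paper.

```python
import numpy as np

# Hypothetical camera intrinsics (focal lengths and principal point, in pixels);
# a real system would take these from the depth camera's calibration.
FX, FY = 615.0, 615.0
CX, CY = 320.0, 240.0

# Hypothetical rigid transform from camera frame to robot base frame
# (4x4 homogeneous matrix, e.g. from a hand-eye calibration).
T_BASE_CAM = np.eye(4)
T_BASE_CAM[:3, 3] = [0.40, 0.0, 0.55]  # example translation in metres

def pixel_to_camera(u: float, v: float, depth_m: float) -> np.ndarray:
    """Back-project a pixel with depth into 3D camera-frame coordinates."""
    x = (u - CX) * depth_m / FX
    y = (v - CY) * depth_m / FY
    return np.array([x, y, depth_m])

def camera_to_base(p_cam: np.ndarray) -> np.ndarray:
    """Transform a 3D point from the camera frame into the robot base frame."""
    p_h = np.append(p_cam, 1.0)          # homogeneous coordinates
    return (T_BASE_CAM @ p_h)[:3]

# Example: an auscultation point detected at pixel (412, 233) with 0.78 m depth.
p_cam = pixel_to_camera(412, 233, 0.78)
p_base = camera_to_base(p_cam)
print("Target for the arm (base frame, m):", np.round(p_base, 3))
```

In a setup of this kind, the resulting base-frame coordinates would be passed to the arm's motion planner as the contact target for the stethoscope.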