Image: The enhanced VisualDx app uses Core ML, a framework enabling on-device machine learning, to help provide healthcare professionals with quick and accurate identification of rashes and skin lesions (Photo courtesy of VisualDx).
VisualDx (Rochester, NY, USA) has enhanced its VisualDx app with Core ML, a new framework enabling on-device machine learning that helps provide healthcare professionals with quick and accurate identification of rashes and skin lesions. The technology will further help non-dermatology practitioners identify and treat a variety of skin conditions that were previously difficult to diagnose without a specialist referral.
The full VisualDx suite is a web- and app-based clinical decision support system designed to enhance diagnostic accuracy, aid therapeutic decisions, and improve patient safety. It combines clinical search with a database of more than 41,000 medical images, plus expert medical knowledge, to support diagnosis, treatment, self-education, and patient communication.
The VisualDx app helps emergency medicine, urgent care, and primary care professionals make smarter, faster, more accurate diagnosis and treatment choices based on a series of questions (including geographic location) and on information such as local and regional outbreaks and the patient's health and medication history. VisualDx + DermExpert, an addition to the VisualDx subscription, is available for iPhone and iPad on the App Store. The VisualDx technology analyzes photos taken by the clinician and classifies them within a second on an iPhone or iPad. The clinician views and confirms the lesion type classification, enters additional patient information, and immediately reviews diagnostic possibilities and treatment options.
“iOS 11 with Core ML enables the type of on-device learning and artificial intelligence that will benefit patients, maintain their privacy, and allow for innovation in companies like ours,” said Art Papier, MD, CEO and co-founder of VisualDx. “In a few milliseconds, our app makes a recommendation based on a photo of the condition, augmenting clinical thinking and driving better care.”