The days of depending solely on a human doctor may be numbered, as the future of the health industry looks increasingly AI-assisted.
Researchers and startups are developing artificially intelligent systems capable of diagnosing disease from a patient’s breath and even from the emotional inflection of their voice. Someday, your smartphone may help you and your doctor determine whether a strange-looking lesion on your skin is cancerous, thanks to a team of Stanford University scientists who have developed a deep learning algorithm tailored for the task.
Led by Sebastian Thrun, an adjunct professor at the Stanford Artificial Intelligence Laboratory, the team found that their diagnostic tool, which builds on the same classification technique Google uses to differentiate between images of cats and dogs, performed as well as or better than 21 board-certified dermatologists. Their findings were detailed in a recent paper published in Nature.
Typically, a skin cancer diagnosis starts with a human dermatologist visually examining a patient’s skin, both with the unaided eye and with a handheld, low-power microscope called a dermatoscope. If the dermatologist suspects that the lesion is cancerous, a biopsy will be performed. The team’s algorithm is designed to help dermatologists at this stage of the process, to better determine which lesions actually need a biopsy.
“We made a very powerful machine learning algorithm that learns from data,” said Andre Esteva, a graduate student and co-lead author of the paper. “Instead of writing into computer code exactly what to look for, you let the algorithm figure it out.”
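Esteva’s point can be made concrete with a toy sketch (an assumed illustration, not the team’s code): rather than hard-coding a rule such as “lesions wider than some fixed diameter are suspicious,” the program searches labelled examples for the cut-off that best separates them. The diameters and labels below are invented for illustration.

```python
# Toy contrast between a hand-coded rule and a learned one: the data,
# not the programmer, chooses the decision cut-off.

# Hypothetical (diameter_mm, malignant?) training pairs -- invented data.
data = [(2.0, 0), (3.5, 0), (4.0, 0), (5.5, 1),
        (6.5, 1), (8.0, 1), (4.5, 0), (7.0, 1)]

def accuracy(cutoff):
    """Fraction of examples correctly classified by the rule d > cutoff."""
    return sum((d > cutoff) == bool(y) for d, y in data) / len(data)

# "Learning": try each observed diameter as a candidate cut-off and keep
# the one that classifies the training examples most accurately.
best_cutoff = max((d for d, _ in data), key=accuracy)
print(f"learned cut-off: {best_cutoff} mm, accuracy: {accuracy(best_cutoff):.2f}")
# → learned cut-off: 4.5 mm, accuracy: 1.00
```

A real system learns millions of such parameters at once from raw pixels, but the principle is the same: the threshold is an output of the data, not an input from the programmer.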
Thousands of Blobs
In particular, the team built on a deep convolutional neural network (CNN) that Google had already trained to classify 1.28 million images into 1,000 different object categories. Instead of using it to identify everyday objects, as Google’s researchers did, the Stanford team retrained it on 130,000 images of 2,000 different skin diseases culled from the Internet, teaching the model to pinpoint markers of disease despite variations in lighting, camera angle and zoom.
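This reuse of a network trained on one task for another is known as transfer learning. A minimal sketch of the idea, in pure Python with invented data (the actual work used a large CNN, not a perceptron), keeps a “pretrained” feature extractor frozen and trains only a small new classifier head on top of it:

```python
import random

# Toy illustration of transfer learning (an assumed sketch, not the authors'
# code): the feature extractor is frozen, and only a new classifier head is
# trained on the target task.

random.seed(0)

def frozen_features(pixel_mean, pixel_var):
    """Stand-in for a frozen pretrained network: fixed transforms of raw stats."""
    return [pixel_mean, pixel_var, pixel_mean * pixel_var]

# Synthetic dataset: "malignant" samples have higher mean intensity and variance.
benign = [(random.gauss(0.3, 0.05), random.gauss(0.1, 0.02)) for _ in range(50)]
malignant = [(random.gauss(0.6, 0.05), random.gauss(0.3, 0.02)) for _ in range(50)]
data = ([(frozen_features(m, v), 0) for m, v in benign]
        + [(frozen_features(m, v), 1) for m, v in malignant])

# Train only the new head: a perceptron over the frozen features.
weights, bias = [0.0, 0.0, 0.0], 0.0
for _ in range(20):
    for feats, label in data:
        pred = 1 if sum(w * f for w, f in zip(weights, feats)) + bias > 0 else 0
        if pred != label:
            step = label - pred  # +1 pushes toward malignant, -1 toward benign
            weights = [w + 0.1 * step * f for w, f in zip(weights, feats)]
            bias += 0.1 * step

correct = sum(
    (1 if sum(w * f for w, f in zip(weights, feats)) + bias > 0 else 0) == label
    for feats, label in data
)
print(f"training accuracy: {correct / len(data):.2f}")
```

The payoff is data efficiency: because the frozen layers already encode general visual structure, the new task needs far fewer labelled examples than training from scratch would.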
“There’s no huge dataset of skin cancer that we can just train our algorithms on, so we had to make our own,” explained graduate student Brett Kuprel, another study co-author. “We gathered images from the internet and worked with [Stanford’s] medical school to create a nice taxonomy out of data that was very messy – the labels alone were in several languages, including German, Arabic and Latin.”
The model was then tested against 370 high-quality images of cancerous skin lesions, matching or even surpassing the accuracy of 21 board-certified human dermatologists. One distinct advantage is that the algorithm can be adjusted for sensitivity or specificity, depending on what its human users are looking for.
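That sensitivity/specificity trade-off falls out of a single dial: the decision threshold applied to the model’s output score. A hedged sketch with invented scores (not the paper’s data) shows how one classifier can be tuned either to catch more cancers or to raise fewer false alarms:

```python
# How one probabilistic classifier yields different sensitivity/specificity
# operating points just by moving its decision threshold.

def confusion(scores, labels, threshold):
    """Return (sensitivity, specificity) at a given decision threshold."""
    tp = sum(s >= threshold and y == 1 for s, y in zip(scores, labels))
    fn = sum(s < threshold and y == 1 for s, y in zip(scores, labels))
    tn = sum(s < threshold and y == 0 for s, y in zip(scores, labels))
    fp = sum(s >= threshold and y == 0 for s, y in zip(scores, labels))
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical malignancy scores from a model, with ground-truth biopsy labels.
scores = [0.95, 0.80, 0.65, 0.55, 0.40, 0.30, 0.20, 0.10]
labels = [1,    1,    1,    0,    1,    0,    0,    0]

# A low threshold favours sensitivity; a high one favours specificity.
print(confusion(scores, labels, 0.35))  # → (1.0, 0.75)
print(confusion(scores, labels, 0.70))  # → (0.5, 1.0)
```

Sweeping the threshold across all values traces out the ROC curve used in the paper to compare the algorithm against the dermatologists.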
The team’s aim is to incorporate the technology into tomorrow’s smartphones, which would be especially helpful in developing nations where cellphone ownership is growing quickly. Such a tool would mean quicker and more accurate diagnosis of skin diseases, and earlier detection translates into greater chances of survival.
“My main eureka moment was when I realized just how ubiquitous smartphones will be,” said Esteva. “Everyone will have a supercomputer in their pockets with a number of sensors in it, including a camera. What if we could use it to visually screen for skin cancer? Or other ailments?”
While it will probably take time and further testing before such a system is widely used, it is yet another illustrative example of how recent advances in deep learning are changing our world. From automatic translation of text and images to object classification and speech recognition, deep learning is making deep inroads into our lives. Though nothing beats the reassuring, healing bedside presence of a living, breathing human, quicker and more accurate diagnosis of disease is one more problem deep learning could help solve, hopefully in a healthcare system that partners human doctors and healers with technology rather than replacing them with it.
Images: Stanford University.