Tapping into the technology behind facial recognition programs and self-driving cars, researchers in a new study have taught computers key elements of assessing echocardiograms.
The advance could simplify a time-consuming process now done by humans.
Researchers created algorithms to recognize the images echocardiograms commonly capture and to detect potential heart problems, including enlarged chambers, diminished pumping function and even some uncommon diseases.
The computers accurately identified thousands of echocardiogram images and produced measurements that were “comparable with or superior to manual measurements,” according to the authors of the study, published Monday in the journal Circulation.
Echocardiograms help doctors evaluate heart function by using sound waves to create snapshots of every part of the organ. Because they don’t give off radiation and can be performed easily in a medical office, they’re a popular imaging choice for diagnosing heart disease. But Dr. Rahul Deo, the study’s senior author, said they aren’t used as often as they could be because assessing the results is a long and taxing process.
“It can be tedious to collect everything and interpret all that information. You need specialists on both ends, both to acquire the data and to interpret it, so it often becomes a tool that’s used when there are already symptoms or when (heart) disease has sort of already progressed,” said Deo, chief data scientist at One Brave Idea, an organization centered on finding new ways to fight heart disease. It is partially funded by the American Heart Association.
Conditions such as diabetes and high blood pressure, risk factors for cardiovascular disease, often change the structure and function of the heart muscle years before the onset of any symptoms, he said. “We wanted to figure out how to get studies done at an earlier level … picking up severe cases earlier on, even when people don’t have any symptoms.”
Looking at 14,035 echocardiograms, collected over 10 years from a University of California San Francisco database, researchers fed 23 standard views of the heart from every test into a computer algorithm. They also provided labels for every specific, identifiable image captured.
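For readers curious what that kind of training looks like in practice, the sketch below is a minimal illustration of the supervised approach the article describes: images paired with human-provided labels, used to teach a network to classify them. It is not the study’s actual code; the article doesn’t describe the researchers’ model, so the framework (PyTorch), the network layers, the image size, and the training details are all assumptions. Only the idea of 23 labeled views comes from the study.

```python
# Illustrative sketch only: a small convolutional network trained to
# classify echocardiogram frames into one of 23 labeled views.
# Architecture, image size, and hyperparameters are assumptions.

import torch
import torch.nn as nn

NUM_VIEWS = 23  # the study labeled 23 standard echo views


class EchoViewClassifier(nn.Module):
    def __init__(self, num_classes: int = NUM_VIEWS):
        super().__init__()
        # Small convolutional feature extractor for grayscale echo frames.
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 56 * 56, num_classes),  # assumes 224x224 input
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))


model = EchoViewClassifier()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Stand-in batch: 8 grayscale 224x224 frames with view labels 0..22.
# In the study's setting these would be real echo frames and human labels.
frames = torch.randn(8, 1, 224, 224)
labels = torch.randint(0, NUM_VIEWS, (8,))

optimizer.zero_grad()
logits = model(frames)          # predicted scores for each of the 23 views
loss = loss_fn(logits, labels)  # supervised signal from the human labels
loss.backward()
optimizer.step()
```

The key point the sketch captures is the one the article makes: the algorithm learns only from the paired images and labels it is fed, which is why the researchers had to annotate every identifiable image in the dataset.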
The result was a set of programs that could independently identify images, take measurements and spot potential problems. Deo said automating echocardiogram interpretation could help “democratize” the tests, allowing them to be conducted in more settings, such as primary care offices or rural areas that lack cardiologists and other medical specialists.
An estimated 7.7 million echocardiograms were performed in U.S. hospitals from 2001 to 2011, a period that saw the use of the tests grow at an average annual rate of 3.41 percent, according to one study.
Automated assessment of echocardiograms is still a long way from replacing the need for human expertise, said Dr. Mario Garcia, chief of cardiology at Montefiore Medical Center in New York City.
“To arrive at a diagnosis and decide upon a treatment, you need not only the interpretation of the images, but you need to know the patient’s symptoms and incorporate that with other medical information that’s available, like what’s the patient’s blood pressure, and what have other tests that have been conducted shown?” said Garcia, who was not part of the study.
Garcia, who estimates his hospital conducts between 25,000 and 30,000 echocardiograms a year, said the accuracy of the automated measurements in the study demonstrates “a real advancement” in technology, but disease detection isn’t foolproof.
“You may have the diagnosis being made correctly 85 percent of the time, but what do you do in the other 15 percent if you are wrong?” he said. “A 15 percent error in medicine is not an acceptable rate.”
Garcia compared the shift to the automation of the stock market over the past several decades.