Diabetes is a serious health condition that can lead to a number of complications. In some cases it can result in the amputation of feet; in others it can cause blindness due to diabetic retinopathy. However, Google is hoping to use artificial intelligence to ensure that the latter can be detected and treated early on.
How does this work? In standard screening, doctors examine photographs of the back of the patient’s eye, looking for lesions such as microaneurysms, hemorrhages, and hard exudates, all of which are indicative of bleeding and fluid leakage. Google created a dataset of 128,000 such images, each graded by 3 to 7 ophthalmologists drawn from a panel of 54, and fed those graded images to the computer to train it to distinguish a healthy eye from one that could show signs of diabetic retinopathy.
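Since each image was reviewed by several ophthalmologists, the grades have to be combined into a single training label. A common way to do that is a per-image majority vote; the sketch below illustrates the idea. The function name, the label strings, and the tie-breaking rule (erring toward flagging disease) are illustrative assumptions, not details Google has published.

```python
from collections import Counter

def consensus_label(grades):
    """Majority vote over one image's ophthalmologist grades.

    `grades` is a list of 3 to 7 labels, here simplified to
    "healthy" or "retinopathy". Ties are broken toward
    "retinopathy" — an illustrative choice favoring recall,
    not a documented part of Google's pipeline.
    """
    counts = Counter(grades)
    healthy = counts.get("healthy", 0)
    retinopathy = counts.get("retinopathy", 0)
    return "retinopathy" if retinopathy >= healthy else "healthy"

# Example: five graders reviewed one fundus photo.
print(consensus_label(["healthy", "retinopathy", "healthy",
                       "retinopathy", "retinopathy"]))
# -> retinopathy
```

In practice, consensus labels like these would then serve as the ground truth for training the image classifier.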
According to Google, it then tested the algorithm against a panel of 7 to 8 U.S. board-certified ophthalmologists and found that its results were on par with the doctors’. Beyond being a great example of machine learning in action, Google hopes the speed and accuracy of its algorithm can be put to use in parts of the world where access to doctors and scanning equipment is difficult, where it could quickly identify patients who might be in need.
Filed in AI (Artificial Intelligence), Google and Health.