A cutting-edge AI technology named FaceAge is set to revolutionize how doctors assess patient health by analyzing facial features. Inspired by the traditional “eyeball test” — a quick, intuitive visual check doctors use to evaluate a patient’s overall health — this AI tool aims to bring more precision and objectivity to clinical assessments.
Powered by advanced deep learning technology, FaceAge is designed to estimate a patient’s biological age—a key indicator of their overall health—by analyzing a selfie. Unlike chronological age, which is based on birth date, biological age reflects how well a person’s body is functioning. This innovative AI tool focuses on delivering deeper health insights, offering a more accurate snapshot of a patient’s wellness than traditional age-based assessments.
Biological age plays a crucial role in modern healthcare, as it helps physicians tailor treatments to a patient’s actual physical condition rather than their chronological age. For instance, if a cancer patient’s biological age suggests they are in good health, doctors may opt for more aggressive treatment options, confident that the patient can handle them. This personalized approach can lead to better outcomes and more effective care strategies.

“We found that doctors on average can predict life expectancy with an accuracy that’s only a little better than a coin flip when using a photo alone for their analysis,” said Dr. Raymond Mak, a radiation oncologist at Mass General Brigham.
During a press conference last week, Dr. Mak shared a compelling story about an 86-year-old man diagnosed with terminal lung cancer. “He appeared significantly younger than his actual age,” Dr. Mak noted. Relying on his clinical judgment and several health indicators, he opted for an aggressive radiation therapy approach.
Years later, Dr. Mak used the FaceAge tool to analyze the same patient’s facial data. “The results showed his biological age was over a decade younger than his chronological age,” he explained. Today, the patient is 90 years old and thriving—a testament to the potential of AI-powered tools like FaceAge in personalizing treatment plans and improving long-term outcomes.
How does the AI tool FaceAge work?
Researchers at Mass General Brigham revealed that FaceAge was developed using a robust dataset of over 9,000 facial images of individuals aged 60 and above, all presumed to be in good health. The majority of these images were sourced from publicly available platforms such as Wikipedia and IMDb, ensuring a diverse range of facial data.
To further enhance the AI model’s accuracy, the team also incorporated the UTKFace dataset—a large-scale collection featuring photos of individuals ranging in age from 1 to 116 years. This comprehensive training enabled FaceAge to learn nuanced patterns of aging, helping it deliver precise estimates of a person’s biological age from facial analysis.
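To give a rough sense of what that kind of training looks like, the sketch below is an illustrative assumption only, not the published FaceAge architecture: it fine-tunes a standard convolutional backbone to regress age in years from a pre-processed face crop, using dummy tensors in place of real images.

```python
# Minimal sketch (assumption, not the actual FaceAge model):
# fine-tune a convolutional backbone to predict age from a face crop.
import torch
import torch.nn as nn
from torchvision import models

# Standard ResNet-18 with its classifier replaced by a single-output
# regression head that predicts age in years.
model = models.resnet18(weights=None)
model.fc = nn.Linear(model.fc.in_features, 1)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.L1Loss()  # mean absolute error, in years

def train_step(images: torch.Tensor, ages: torch.Tensor) -> float:
    """One optimization step on a batch of face crops and known ages."""
    model.train()
    optimizer.zero_grad()
    pred = model(images).squeeze(1)   # (batch,) predicted ages in years
    loss = loss_fn(pred, ages.float())
    loss.backward()
    optimizer.step()
    return loss.item()

# Dummy data standing in for pre-processed face crops and their labels.
dummy_images = torch.randn(8, 3, 224, 224)
dummy_ages = torch.tensor([62.0, 71.0, 65.0, 80.0, 68.0, 74.0, 90.0, 61.0])
print(train_step(dummy_images, dummy_ages))
```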
“It is important to know that the algorithm looks at age differently than humans do. So, for example, being bald or not, or being grey is less important in the algorithm than we actually initially thought,” Hugo Aerts, one of the co-authors of the study, said.
What is the accuracy of FaceAge?
Researchers behind the study have emphasized that FaceAge is not meant to replace, but rather to enhance, a doctor’s visual assessment of a patient, otherwise known as the “eyeball test”.
The deep learning system has also been tested on clinical data. FaceAge was run on photographs of more than 6,200 cancer patients, captured before they underwent radiotherapy. On average, the algorithm estimated the patients’ biological age to be five years older than their chronological age.
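To make that figure concrete, here is a tiny illustrative calculation, using made-up numbers rather than the study’s data, of how such an average gap between estimated biological age and chronological age could be computed.

```python
# Illustrative calculation of an average age gap (hypothetical numbers):
# compare estimated biological age with chronological age across a cohort.
import statistics

# Each tuple is (chronological_age, estimated_biological_age) for one patient.
cohort = [(68, 74), (72, 75), (61, 68), (80, 83), (59, 66)]

gaps = [biological - chronological for chronological, biological in cohort]
mean_gap = statistics.mean(gaps)
print(f"Average biological-minus-chronological age gap: {mean_gap:+.1f} years")
```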
The FaceAge AI tool has also shown promise in predicting patient survival outcomes. Researchers found that survival was closely linked to how old a patient’s face appeared to the algorithm, with higher estimated biological ages corresponding to poorer outcomes.
In a separate experiment, eight doctors were tasked with predicting whether terminal cancer patients would survive beyond six months. When relying solely on patient photographs, their accuracy stood at 61%. This improved to 73% when clinical data was included. However, when doctors used FaceAge in combination with medical records, their prediction accuracy rose significantly to 80%, demonstrating the AI tool’s potential to enhance clinical decision-making.
The study also highlighted that while facial appearance influences FaceAge’s assessment, it doesn’t definitively determine health outcomes. For example, researchers tested the AI tool on images of actors Paul Rudd and Wilford Brimley, both at the age of 50. FaceAge estimated Rudd’s biological age at 43, while Brimley’s was assessed at 69. Despite this older-looking profile, Brimley lived until the age of 85, passing away in August 2020. This underscores that while FaceAge provides valuable insights, it should be interpreted alongside other clinical information for a complete picture of health.
What about the safety and privacy issues?
The developers of FaceAge acknowledge that significant work remains before the AI tool can be widely adopted in clinical settings, due to several risks that need careful management.
Privacy is a major concern for AI systems that utilize facial data. However, the study emphasizes that FaceAge focuses solely on age estimation, which the authors believe carries less risk of societal bias than facial recognition technologies.
To tackle potential racial and ethnic bias, the researchers rigorously evaluated FaceAge’s performance across diverse groups using the UTKFace validation dataset. This dataset is one of the most ethnically diverse publicly available collections, with approximately 55% of images representing non-White individuals. Such measures aim to ensure the AI provides fair and accurate biological age predictions across all ethnicities.
The study also found that FaceAge’s biological age predictions are minimally influenced by ethnicity. Researchers accounted for ethnicity as a covariate in their multivariable analysis of the Harvard clinical datasets, helping to ensure the AI tool delivers accurate and unbiased results across diverse populations.
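As a rough illustration of what such a fairness check can look like, the sketch below uses hypothetical data and a simple per-group error comparison; it is not the researchers’ actual analysis pipeline.

```python
# Minimal sketch (hypothetical data) of a per-group fairness check:
# compare the model's age-estimation error across ethnicity groups
# in a validation set such as UTKFace.
import pandas as pd

validation = pd.DataFrame({
    "ethnicity":     ["White", "Black", "Asian", "White", "Black", "Asian"],
    "true_age":      [64, 70, 58, 81, 66, 73],
    "predicted_age": [66, 72, 57, 84, 65, 75],
})

validation["abs_error"] = (validation["predicted_age"] - validation["true_age"]).abs()
per_group_mae = validation.groupby("ethnicity")["abs_error"].mean()
print(per_group_mae)  # similar errors across groups would suggest low ethnic bias
```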
The FaceAge development team emphasizes the importance of strong regulatory oversight and ongoing assessments to monitor and address potential biases in the AI tool’s performance across diverse populations, ensuring safe and equitable clinical use.
Hugo Aerts, director of the Artificial Intelligence in Medicine program at Mass General Brigham and co-author of the study, cautioned, “This technology holds great potential to benefit healthcare, but it also carries risks that must be carefully managed to prevent unintended harm.”
Stay tuned to mortentechnologies.com for more such AI updates!