Chronological age by itself is an outdated clinical measurement

This 2018 editorial in the New England Journal of Medicine concerned a clinical trial of an osteoporosis treatment:

“When measurement of bone density was first introduced 25 years ago, absolute bone mineral density (g per square centimeter) was considered as too onerous for clinicians to understand. Ultimately, these events led to a treatment gap in patients who had strong clinical risk factors for an osteoporotic fracture (particularly age) but had T scores in the osteopenic range.

The average age of the participants in the current trial was approximately 3.5 years older than that in the Fracture Intervention Trial. Owing to the interaction between age and bone mineral density, the results of the current trial should not be extrapolated to younger postmenopausal women (50 to 64 years of age) with osteopenia.

This trial reminds us that risk assessment and treatment decisions go well beyond bone mineral density and should focus particularly on age and a history of previous fractures.” “A Not-So-New Treatment for Old Bones”

This editorial provided some history of how a still-generally-accepted set of diagnostic measurements was selected for relative convenience rather than for efficacy. Chronological age belongs among such ineffective measurements.

Let’s identify better measurements of aging and diagnosis, then incorporate them into practice. How else will we move past the uninformative averaging and the unhelpful age-based recommendation quoted above?

The time has passed for physicians and clinicians to consider only chronological age when evaluating a patient’s clinical age. More effective measurements of human age assess the entire person as well as the body’s component systems.


A slanted view of the epigenetic clock

The founder of the epigenetic clock technique was interviewed for MIT Technology Review:

“‘We need to find ways to keep people healthier longer,’ he says. He hopes that refinements to his clock will soon make it precise enough to reflect changes in lifestyle and behavior.”
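As background for readers unfamiliar with the technique: an epigenetic clock estimates age as a weighted combination of DNA-methylation levels (beta values) at selected CpG sites, with weights fitted by regression against chronological age. A minimal toy sketch follows; the site names, weights, and intercept are invented for illustration and are not real clock coefficients, which involve hundreds of CpGs:

```python
# Toy sketch of an epigenetic-clock-style estimate: a linear model over
# DNA-methylation beta values at selected CpG sites. Site names, weights,
# and the intercept below are hypothetical, purely for illustration.

WEIGHTS = {"cg_a": 25.0, "cg_b": -12.0, "cg_c": 30.0}  # hypothetical coefficients
INTERCEPT = 20.0  # hypothetical baseline (years)

def epigenetic_age(betas):
    """Estimate age in years from methylation beta values in [0, 1]."""
    return INTERCEPT + sum(WEIGHTS[cpg] * betas[cpg] for cpg in WEIGHTS)

sample = {"cg_a": 0.8, "cg_b": 0.3, "cg_c": 0.5}
print(round(epigenetic_age(sample), 1))  # 20 + 20.0 - 3.6 + 15.0 = 51.4
```

Real clocks add refinements (penalized regression to pick the CpGs, a nonlinear calibration for younger ages), but the core idea is this simple weighted sum, which is why the estimate can drift from chronological age as lifestyle and stochastic methylation changes accumulate.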

The journalist attempted to dumb the subject down “for the rest of us” with distortions such as the headline. The story did somewhat better in reporting the varying correlation of epigenetic age with chronological age:

“The epigenetic clock is more accurate the younger a person is. It’s especially inaccurate for the very old.”

The journalist inappropriately used “luck” as a synonym for randomness (stochasticity):

“He estimates that about 40% of the ticking rate is determined by genetic inheritance, and the rest by lifestyle and luck.”

A third example of less-than-straightforward journalism started with:

“Such personalization raises questions about fairness. If your epigenetic clock is ticking faster through no fault of your own…”

Were MIT Technology Review readers unable to comprehend a straightforward story on the epigenetic clock? What was the purpose of slants and distortions in an introductory article? “Want to know when you’re going to die?”