Reading over "Stem cell aging: mechanisms, regulators and therapeutic opportunities" brings to mind Hector Zenil's work on complex systems modeling. Specifically, advances in applying algorithmic information theory to sparse datasets have opened up systematic approaches that converge rapidly toward predictive models.
A simple example would be compiling a dataset of longitudinal measures and experimental outcomes, and constructing stock and flow diagrams to represent various candidate models. Model selection then consists of measuring how well each model losslessly compresses the dataset -- counting the model itself as part of its decompression program. I know this seems alien to most scientists, but it is entirely justified as a model selection criterion. If you think the dataset is impoverished in some way, do the appropriate measurements and add the data to the dataset.
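To make the criterion concrete, here is a minimal sketch of two-part (MDL-style) scoring in Python. Everything here is hypothetical: the "dataset" is synthetic, the candidate models are toy code strings standing in for stock-and-flow model descriptions, and `zlib` is used as a crude stand-in for a proper lossless compressor.

```python
import zlib
import numpy as np

def mdl_score(model_code: str, predictions, observations) -> int:
    """Two-part description length: bytes to encode the model itself
    plus bytes to encode the data given the model (compressed residuals)."""
    residuals = np.asarray(observations) - np.asarray(predictions)
    # Quantize residuals so they are encodable losslessly as bytes.
    quantized = np.round(residuals).astype(np.int16).tobytes()
    model_bytes = len(zlib.compress(model_code.encode()))
    data_bytes = len(zlib.compress(quantized))
    return model_bytes + data_bytes

# Hypothetical longitudinal dataset: a noisy linear trend.
rng = np.random.default_rng(0)
t = np.arange(100, dtype=float)
data = 2.0 * t + rng.normal(0.0, 1.0, size=100)

# Candidate models expressed as code strings, so the model's own
# description length enters the score.
linear_model = "lambda t: 2.0 * t"
constant_model = "lambda t: 100.0"

score_linear = mdl_score(linear_model, eval(linear_model)(t), data)
score_constant = mdl_score(constant_model, eval(constant_model)(t), data)

# The model yielding the shorter total description wins.
best = "linear" if score_linear < score_constant else "constant"
print(best)
```

The linear model's residuals are just small noise, so they compress far better than the constant model's large structured residuals, and the linear model wins despite its slightly longer description. A real application would use a richer model encoding than a lambda string, but the scoring principle is the same.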
Some seriously smart people amongst us))
If I caught the gist of it, it's the same as how a neural network works - training regresses toward the most suitable yet simplest model. I would not say it's THE correct way, though, as models often fail when new data comes in, but it's simple and practical, that's for sure.
To avoid going completely off topic:
I think there is some synergy between the protocol and hydrogen water. Usually my HRV plummets the night after a leg exercise day. If it coincides with C60, the fall is less pronounced, but when it's C60 + hydrogen water, my Oura ring shows completely normal HRV during the night.
Edited by Andey, 25 March 2019 - 10:51 AM.