Five times slower for us distant observers, but regular time (1 second = 1 second) if you were actually there, if I'm understanding this correctly.
Still, difficult to wrap my brain around.
I read through the article and it didn't explain at all how studying quasars determined this. I wonder if there's a better breakdown of how they were able to ascertain it. I know there have been some major announcements in the field thanks to quasar studies, so I'm curious how this ties into that.
Luckily there's a preprint of the paper on arXiv if you want to read the source material behind the article.
My basic summary would be that they have a model for how variable a quasar should be over time, and they can see a difference in that variability depending on the quasar's redshift, which tracks its distance from us. And that difference is right around what we'd expect from relativity.
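For a rough sense of the numbers (my own sketch, not from the paper): in standard cosmology the dilation factor is just (1 + z), so a quasar at redshift z ≈ 4 should appear to vary about 5× slower than an identical nearby one, which lines up with the headline number. The timescales below are made up for illustration:

```python
# Illustration of cosmological time dilation of quasar variability.
# The (1 + z) factor is standard cosmology; the 100-day timescale is
# an arbitrary placeholder, not a value from the paper.

def observed_timescale(rest_frame_days: float, z: float) -> float:
    """Stretch a rest-frame variability timescale by the (1 + z) factor."""
    return rest_frame_days * (1.0 + z)

rest_frame = 100.0  # days, a made-up intrinsic variability timescale
for z in [0.1, 1.0, 2.0, 4.0]:
    print(f"z = {z}: {observed_timescale(rest_frame, z):.0f} observed days "
          f"(dilation factor {1.0 + z:.1f}x)")
```

At z = 4 that factor is exactly 5, so the "five times slower" figure falls straight out of the redshift.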
Does this assume the speed of light is constant? Is there a difference between time running slower while c is constant, and time being constant while c actually changes?