Stopwatch.GetTimestamp() produces different results on Linux vs Windows

  .net-core-3.1, c#, linux, windows

I am hoping someone can explain to me why the snippet below produces such different results on Linux vs Windows.

If I have this little snippet of code:

var elapsed = Stopwatch.GetTimestamp() / TimeSpan.TicksPerMillisecond;

Thread.Sleep(1001); // sleep for roughly one second

var ts = Stopwatch.GetTimestamp() / TimeSpan.TicksPerMillisecond;
var result = ts - elapsed > 10000L; // roughly 10 seconds

On a Windows environment, result is false <- the expected result

But on a Linux environment, result is true <- what… why?

I have read that Stopwatch.GetTimestamp() is dependent on the processor. But this seems excessive.
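The resolution of the timestamp is reported by Stopwatch.Frequency (timestamp units per second), and it is what differs between platforms. A quick way to see this (a sketch; the exact numbers printed depend on your OS and hardware):

```csharp
using System;
using System.Diagnostics;

class FrequencyCheck
{
    static void Main()
    {
        // Stopwatch.Frequency is the number of GetTimestamp units per second.
        // On Windows this is typically the QueryPerformanceCounter frequency
        // (often 10,000,000); on Linux .NET Core uses a nanosecond-resolution
        // clock, so Frequency is typically 1,000,000,000.
        Console.WriteLine($"Frequency: {Stopwatch.Frequency} ticks/sec");
        Console.WriteLine($"IsHighResolution: {Stopwatch.IsHighResolution}");
    }
}
```

That 100x difference in tick rate (10 MHz vs 1 GHz) would account for an apparent 100x difference in elapsed time when the timestamp is divided by a fixed constant.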

As far as I can tell GetTimestamp produces wildly different values on Windows vs Linux.

e.g. in my case running the code above

On Windows, Stopwatch.GetTimestamp() produces a value roughly in the range of 165100732

On Linux, Stopwatch.GetTimestamp() produces a value more than 200x bigger, e.g. 349232049523

So I can see why result differs: on Windows it records an elapsed duration of about 1 second, but on Linux it records an elapsed duration of close to 100 seconds. So that part is fine.

So the question boils down to: why does Stopwatch.GetTimestamp() produce such wildly different numbers between the two environments?
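For what it's worth, the raw timestamp is only meaningful relative to Stopwatch.Frequency, so a portable version of the snippet above would scale the difference by the actual frequency instead of TimeSpan.TicksPerMillisecond (which assumes a fixed 10,000 ticks per millisecond). A sketch:

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class ElapsedSketch
{
    static void Main()
    {
        long start = Stopwatch.GetTimestamp();
        Thread.Sleep(1001);
        long end = Stopwatch.GetTimestamp();

        // Convert timestamp units to milliseconds using the tick rate of
        // this platform's clock, rather than TimeSpan's fixed constant.
        double elapsedMs = (end - start) * 1000.0 / Stopwatch.Frequency;

        // ~1001 ms on both Windows and Linux, so this is false on both.
        Console.WriteLine(elapsedMs > 10000);
    }
}
```

Alternatively, using a Stopwatch instance and reading ElapsedMilliseconds sidesteps the conversion entirely.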

Source: Windows Questions