I am hoping someone can explain to me why the code below produces such wildly different results on Linux vs Windows.
If I have this little snippet of code:
```csharp
var elapsed = Stopwatch.GetTimestamp() / TimeSpan.TicksPerMillisecond;
Thread.Sleep(1001); // let's sleep for one second
var ts = Stopwatch.GetTimestamp() / TimeSpan.TicksPerMillisecond;
var result = ts - elapsed > 10000L; // roughly 10 seconds
```
On a Windows environment, result is:

false <– expected result

But on a Linux environment:

true <– what….why?
I have read that Stopwatch.GetTimestamp() is dependent on the processor. But this seems excessive.
As far as I can tell GetTimestamp produces wildly different values on Windows vs Linux.
e.g. in my case, running the code above: on Windows, Stopwatch.GetTimestamp() produces a value in one range, while on Linux it produces a value more than 200x bigger.
So I can see why result is different: on Windows it records an elapsed duration of 1 second, but on Linux it records an elapsed duration of close to 100 seconds. So that part is fine.
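For context, here is a minimal sketch to inspect the underlying timer resolution, assuming (my guess, not confirmed) that the scale gap comes from Stopwatch.Frequency, the ticks-per-second rate of the timer, differing between the two platforms:

```csharp
using System;
using System.Diagnostics;

class FrequencyCheck
{
    static void Main()
    {
        // Stopwatch.Frequency is the number of Stopwatch ticks per second.
        // It is platform-dependent, so raw GetTimestamp() values are only
        // meaningful relative to this rate.
        Console.WriteLine($"Frequency:        {Stopwatch.Frequency}");
        Console.WriteLine($"IsHighResolution: {Stopwatch.IsHighResolution}");
        Console.WriteLine($"GetTimestamp():   {Stopwatch.GetTimestamp()}");
    }
}
```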
So the question boils down to: why does Stopwatch.GetTimestamp() produce such wildly different numbers between the two environments?
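For completeness, normalizing by Stopwatch.Frequency instead of TimeSpan.TicksPerMillisecond should make the comparison behave the same on both platforms. This is my own workaround sketch, not the original snippet:

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class ElapsedCheck
{
    static void Main()
    {
        long start = Stopwatch.GetTimestamp();
        Thread.Sleep(1001); // sleep for roughly one second
        long end = Stopwatch.GetTimestamp();

        // Convert raw Stopwatch ticks to milliseconds using the
        // platform-specific tick rate rather than TimeSpan's fixed one.
        double elapsedMs = (end - start) * 1000.0 / Stopwatch.Frequency;

        Console.WriteLine(elapsedMs > 10000.0); // false on both platforms (~1001 ms)
    }
}
```

(On .NET 7+, Stopwatch.GetElapsedTime(start) does this conversion directly.)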