On 13/02/2019 at 20:53, clipka wrote:
> Hi folks,
>
> can a few of you folks please run the following little program on your
> computers and report the time it takes to run?
>
> --------------------------------------------------
> #include <chrono>
> #include <thread>
> #include <iostream>
>
> inline void Delay(unsigned int msec)
> {
>     std::this_thread::sleep_for(std::chrono::milliseconds(msec));
> }
>
> int main()
> {
>     int count = 1000;
>     for (int i = 0; i < count; ++i)
>     {
>         Delay(1);
>     }
>     std::cout << "Done." << std::endl;
> }
> --------------------------------------------------
>
> On Windows Subsystem For Linux I see results like the following:
>
> real 0m1.782s
> user 0m0.000s
> sys 0m0.000s
>
> But as I presume this is using the Windows scheduler, I expect that
> genuine Linux systems may behave differently, and I'm also interested in
> other platforms (Mac, maybe BSD if some of you folks are using that, or
> actually any system you can get your hands on.)
>
> I also have reason to believe that results may differ between compilers.
> I'm using g++ 5.4.0 here.
My little contribution, on a Mac with an Intel i5 @ 2.7 GHz.
g++ -v :
Configured with: --prefix=/Applications/Xcode.app/Contents/Developer/usr
--with-gxx-include-dir=/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX10.10.sdk/usr/include/c++/4.2.1
Apple LLVM version 6.0 (clang-600.0.57) (based on LLVM 3.5svn)
Target: x86_64-apple-darwin13.4.0
Thread model: posix
uname -v :
Darwin Kernel Version 13.4.0: Mon Jan 11 18:17:34 PST 2016;
root:xnu-2422.115.15~1/RELEASE_X86_64
result :
real 0m1.239s
user 0m0.009s
sys 0m0.019s
--
Kurtz le pirate
Compagnie de la Banquise