OS X’s std::locale only supports the “C” locale and throws an exception
for any other, including “” when a different locale is specified through
the POSIX locale environment variables.
This is not normally a problem, since these variables are stripped from
the environment when an application is launched by the Finder. It is a
problem when the binary is run directly from the command line.
I have temporarily disabled the use of std::locale(“”) on OS X while I try
to come up with a better solution.
Fixes issue 3598.
Lock emulation while saving state on all platforms.
Fixes issue 2805. Fixes issue 3235.
If this causes problems on Windows, just use this instead:
#if defined(HAVE_X11) && HAVE_X11 || defined __APPLE__
Avoid shadowing variables.
Related to CPUID detection:
* Fixed a bug in the display of info for AMD CPUs.
Fix TextureDecoder.cl to work on both NVidia and ATI video cards.
To do so I had to re-add the casting bloat removed in revision 6102. Also, for some odd reason the NVidia OpenCL drivers don’t like 8-bit rotations but are fine with 2- and 4-bit rotations; these appear to be bugs in the NVidia drivers that will hopefully be fixed in future versions.
Also, on Linux, make sure the TextureDecoder.cl file is copied from the shared data directory to the user’s directory.
Build fix for Linux: svnrev.h was not found when compiling OCLTextureDecoder.cpp because that include directory was only used in Core/Common.