I recently had to start converting UTF-8 characters to ASCII. I have a simple Unix /bin/sh script that does the conversion, and it works fine from RHEL 6. But when I run the same command from a BASIC program with EXECUTE \SH -c 'same command'\, the results are basically correct except that the UTF-8 characters are returned as '?'s.
Your thoughts on this would be greatly appreciated.
eval "$1 -o $temp"
iconv -c -f UTF-8 -t ascii//TRANSLIT "$temp"
$1 is a curl command that makes a web call. The iconv Unix command translates from UTF-8 to ASCII. The temp variable holds the name of a unique temp file; I create it, use it, and remove it.
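For context, here is a minimal self-contained sketch of the script as described above, wrapped in a function so it can be exercised without a live web call. The function name convert_webcall, the use of mktemp, and the stand-in for curl are my assumptions; the eval and iconv lines are the ones from the script.

```shell
#!/bin/sh
# Sketch of the conversion script described above.
# convert_webcall and mktemp are assumptions, not the original names.
convert_webcall() {
    temp=$(mktemp) || return 1        # unique temp file (assumed mktemp)

    # $1 is the full curl command line; -o writes the raw UTF-8
    # response body into the temp file.
    eval "$1 -o $temp"

    # Translate UTF-8 to ASCII: //TRANSLIT approximates characters
    # where possible, and -c silently drops anything that cannot be
    # converted at all.
    iconv -c -f UTF-8 -t ascii//TRANSLIT "$temp"

    rm -f "$temp"                     # create it, use it, remove it
}

# Example with a stand-in for curl (no network needed): a small sh -c
# command that honors the trailing "-o file" by writing to its $1.
convert_webcall 'sh -c '\''printf "hello\n" > "$1"'\'''
```

The example call substitutes a tiny sh -c command for curl that accepts the appended -o argument, so the temp-file and iconv plumbing can be verified without network access or API keys.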
The EXECUTE in question looks like EXECUTE \SH -c 'curl command'\.
Curl works: the temp file contains the web-call data with the UTF-8 characters intact. From Unix the conversion works correctly; from UniVerse it is flawed.
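One thing that often differs between an interactive login shell and a command run through EXECUTE \SH -c '...'\ is the environment, and iconv's ascii//TRANSLIT behavior depends on the locale; with no usable locale, untranslatable characters can come out as '?'. As a diagnostic (not part of the original script), running the following in both contexts and comparing the output may show whether the locale differs:

```shell
# Diagnostic: show the effective locale and any LANG/LC_* variables.
# Run once from the login shell and once via EXECUTE \SH -c '...'\,
# then compare the two outputs.
locale
env | grep -E '^(LANG|LC_)' || echo "no LANG/LC_* variables set"
```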
Because the API involves keys and such, I have not included them here, but they don't appear to have much to do with the problem.