Executing a Unix command from BASIC gives different results (Solved)

All,
I recently had to start converting UTF-8 characters to ASCII. I have a simple Unix /bin/sh script that does the conversion, and it works fine on RHEL 6. When I run a BASIC program that uses EXECUTE SH -c 'same command', the results are basically correct, but the UTF-8 characters come back as '?'s.
Your thoughts on this would be greatly appreciated.
Script:
#!/bin/sh
# fetch the response into a unique temp file, transliterate UTF-8 to ASCII, then clean up
temp=$(mktemp)
eval "$1 -o $temp"
cat "$temp" | iconv -c -f UTF-8 -t ascii//TRANSLIT
rm "$temp"
$1 is a curl command that makes the web call. The iconv Unix command translates from UTF-8 to ASCII. The temp variable holds the name of a unique temp file; I create it, use it, and remove it.
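For illustration only (the script name toascii.sh and the URL are placeholders, not the real call), a direct invocation from the shell looks like this:
# placeholder names: toascii.sh is the script above, example.com stands in for the real API
sh toascii.sh 'curl -s https://example.com/api/resource'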
The EXECUTE in question looks like EXECUTE SH -c 'curl command'.
Curl works; the temp file contains the web-call data with the UTF-8 characters in it. From Unix the script works correctly; from UniVerse the translation is flawed.
Because the API involves keys and such, I have not included them here, but they don't appear to have much to do with the problem.
Jon
Best answer by Mike Rajkowski:
Note that there are 1,112,064 possible Unicode characters, and not all of them will map to ASCII.
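For example, on a typical glibc iconv (exact output can vary by implementation and locale), a character with an ASCII equivalent is transliterated, while one without comes back as '?' or is dropped:
# 'é' has an ASCII transliteration; the CJK characters do not
printf 'café\n' | iconv -f UTF-8 -t ascii//TRANSLIT     # typically prints: cafe
printf '漢字\n' | iconv -c -f UTF-8 -t ascii//TRANSLIT   # typically prints: ?? (or nothing)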
What characters are you having an issue with?