All,
I recently had to start converting UTF-8 characters to ASCII. I have a simple Unix /bin/sh script that does the conversion, and it works fine on RHEL 6. But when I run a BASIC program that does EXECUTE SH -c 'same command', the results are basically correct except that the UTF-8 characters come back as '?'s.
Your thoughts on this would be greatly appreciated.
Script:
#!/bin/sh
temp=$(mktemp)
eval "$1 -o $temp"
iconv -c -f UTF-8 -t ascii//TRANSLIT "$temp"
rm -f "$temp"
$1 is a curl command that performs the web call. The iconv Unix command translates UTF-8 to ASCII. The temp variable holds the name of a unique temporary file; I create it, use it, and remove it.
The EXECUTE in question looks like EXECUTE SH -c 'curl command'.
Curl works: the temp file contains the web-call data with the UTF-8 characters in it. From Unix the script works correctly; from UniVerse the translation is flawed.
Because the API call contains keys and such, I have not included it here, but it doesn't appear to have much to do with the problem.
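For reference, the transliteration step can be tried in isolation; the sample string below is just an illustration, not part of the actual API data:

```shell
# Feed a small UTF-8 sample (caf + e-acute) through the same iconv
# invocation the script uses. //TRANSLIT asks iconv to approximate
# characters with no ASCII equivalent, and -c drops anything that cannot
# be converted at all. Which substitutions happen depends on the locale
# (LANG/LC_CTYPE) in effect, which turns out to matter in this thread.
printf 'caf\303\251\n' | iconv -c -f UTF-8 -t ascii//TRANSLIT
```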
Jon
I don't have a specific answer, but I was wondering: if you added a CAPTURING clause to the EXECUTE and a "set -x" in the shell script, might that highlight a difference between running the script directly at the shell prompt and running it via EXECUTE from a BASIC program? Just a thought.
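To illustrate the idea, here is a minimal sketch of a script with tracing enabled (the commands after set -x are placeholders, not Jon's actual script):

```shell
#!/bin/sh
# set -x makes the shell print each command (prefixed with "+ ") to
# stderr before running it, so a trace captured via CAPTURING from BASIC
# can be compared line-by-line with a run typed at the shell prompt.
set -x
temp=$(mktemp)
echo "probe" > "$temp"
rm -f "$temp"
```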
Thanks,
Neil
I was using a CAPTURING clause to get the return data from the web call. I added set -x, and the traces were again identical, except that the UTF-8 characters were correct from Unix but came back as '?'s from UniVerse, both at TCL and via EXECUTE. The API call is going through, and the results in the temp file contain the UTF-8 characters. The issue is that the output of the iconv command is correct from Unix but is '?'s from TCL or EXECUTE.
Might it have something to do with character sets or environment variables such as LANG=en_US.UTF-8? LANG is set when I do SH at TCL (a full Unix shell), but it is not set with SH -c 'env'.
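The effect described here can be mimicked in a plain shell by starting a child with an empty environment (a rough stand-in for a process spawned without the login shell's locale settings, not the actual UniVerse mechanism):

```shell
# env -i launches the child with an empty environment, so LANG comes
# back unset -- just as 'SH -c env' showed from inside UniVerse.
env -i sh -c 'echo "LANG=${LANG:-unset}"'
```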
Jon
Thanks for the clarification. One other quick thought to perhaps isolate where the difference exists: if you were to write the result of the iconv step in the script to a file, would it differ between running directly from Unix and running via SH -c from UniVerse? I'm wondering whether the translation done by iconv works differently when called from UniVerse, or whether something happens when the output is captured.
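A minimal sketch of that test, with the curl result replaced by a small stand-in sample (file names and contents here are illustrative only):

```shell
#!/bin/sh
# Write the transliterated result to a file instead of stdout, so the
# exact bytes can be compared between a direct shell run and an EXECUTE
# run. If the two files match, the difference lies in how the output is
# captured, not in iconv itself.
temp=$(mktemp)
out=$(mktemp)
printf 'caf\303\251\n' > "$temp"             # stand-in for the curl result
iconv -c -f UTF-8 -t ascii//TRANSLIT "$temp" > "$out"
od -c "$out"                                 # dump the exact bytes written
rm -f "$temp" "$out"
```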
Thanks,
Neil
What about trying PTERM DISPLAY from UniVerse? Does that give you any clues?
Doesn't UniVerse have its own terminfo file, separate from Linux's? So the terminal you get coming out of UniVerse to the shell is potentially different from the one you get going straight into the Linux shell. I've seen problems like this, but I'm not the one who fixed them, so I don't know all the details; I think there's a binary called uvtic you can use to sync them up. A search of the documentation (or a support ticket) could lead you to the process.
I hope that one of those helps.
Tyrel
Note that there are 1,112,064 possible Unicode code points, and not all of them will map to ASCII.
Which characters are you having an issue with?
I had to go out of town for some personal business. The answer to the issue was that the LANG environment variable was not set inside UniVerse, but when I SH'ed out to Unix it was set to LANG=en_US.UTF-8. If we execute ENV LANG=en_US.UTF-8 from TCL first, the command works as expected.
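The locale dependence behind this fix can be reproduced directly in a shell. The exact fallback character depends on the iconv implementation and installed locales, so this is a sketch of the typical glibc behavior rather than a guaranteed output:

```shell
# Under the bare C locale, glibc's //TRANSLIT tables typically have no
# rule for accented letters, so e-acute falls back to '?' -- the same
# symptom seen from UniVerse, where LANG was unset. With
# LANG=en_US.UTF-8 in the environment (as set via ENV at TCL), the same
# pipeline transliterates to plain 'cafe'.
printf 'caf\303\251\n' | LC_ALL=C iconv -c -f UTF-8 -t ascii//TRANSLIT
```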
Thanks to all who replied, and sorry for the late response.
Jon