I have a challenge: automating the creation of a backup file for an existing data file. The data files being backed up vary, and they can be either hashed or dynamic files.
It was reasonably straightforward until I came to supporting an existing dynamic file.
All my previous posts about dynamic files have been attempts to better understand how to assess the existing file, so that I can create the backup file with well-optimised specifications and make the item copy process as quick and efficient as possible.
My first thought was that ANALYZE.FILE might help determine both the minimum modulus and the group size needed to properly cater for the data, but now I'm unsure whether that is the best approach.
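If I have the syntax right, that would be something like this against my test file:

    ANALYZE.FILE TEMP_DYN STATISTICS

which reports the current modulus and record/group size statistics for the existing file.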
My alternative method is to simply count the number of items in the existing file and calculate an average item size, then set MINIMUM.MODULUS to the item count, set RECORD.SIZE to the average item size, and let UV's dynamic file magic do its work.
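For illustration, a minimal sketch of what I mean (BACKUP.FILE and the numbers are placeholders, not from a real run). Say the source file holds 5000 items averaging 250 bytes each:

    CREATE.FILE BACKUP.FILE DYNAMIC MINIMUM.MODULUS 5000 RECORD.SIZE 250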
I have come up with this TCL command to both count the items and calculate the average record size:

    SELECT COUNT(*), AVG(CAST(EVAL "LEN(@RECORD)+LEN(@ID)+1" AS INT)) FROM TEMP_DYN COUNT.SUP COL.SUP MARGIN 0;
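If the SQL route proves awkward, the same figures can be gathered with a short UniVerse BASIC program. Here is a minimal sketch (variable names are illustrative; it assumes TEMP_DYN has a VOC entry in the current account):

    OPEN '', 'TEMP_DYN' TO F.DYN ELSE STOP 'Cannot open TEMP_DYN'
    SELECT F.DYN
    ITEM.COUNT = 0
    TOTAL.BYTES = 0
    LOOP
       READNEXT ID ELSE EXIT
       READ REC FROM F.DYN, ID THEN
          ITEM.COUNT += 1
          * Same sizing as the EVAL expression above: data bytes + key bytes + 1
          TOTAL.BYTES += LEN(REC) + LEN(ID) + 1
       END
    REPEAT
    AVG.SIZE = 0
    IF ITEM.COUNT > 0 THEN AVG.SIZE = INT(TOTAL.BYTES / ITEM.COUNT)
    CRT ITEM.COUNT : ' items, average size ' : AVG.SIZE : ' bytes'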
Are these approaches valid, or are there other methods/options available to achieve the results I need to deliver?
Many thanks for any guidance on this.
------------------------------
Gregor Scott
Software Architect
Pentana Solutions Pty Ltd
Mount Waverley VIC AU
------------------------------




