COPYROW B703 variable exceeds maximum length (AREV Specific)
At 16 JAN 2006 09:40:10PM Janet S Scott wrote:
Hi all
I have ye old AREV system 3.12 now running on a Win2K server with the Linear Hash All Networks 2.1 driver; client workstations are XP, and EM is active (around 4096K available).
I am trying to restore data that went missing during a table rebuild - the rebuild that AREV does periodically when a table starts to get a bit huge. Unfortunately the table went from around 300K records to 45K records; AREV or something ate them.
So I did a COPYTABLE of a backup with the 300K records, and now I have that attached on the live system (yes, I backed that up first). I'm trying to COPYROW records across, but even using a select list with just one ID in it I get
COPYROW_SUB line 1 B703 variable exceeds maximum length.
line 1 copyrow_sub broke because a run time error was encountered
The backup table name is CL_ADMISSION_H_OLD
The live table name is CL_ADMISSION_H
It doesn't seem to matter how many rows I try to copy using the get lists: a list of one ID doesn't work, and a list of 14K IDs doesn't work.
ie
SELECT TABLE1 WITH ID BETWEEN "100000" AND "100500"
SAVELIST TEST1
GETLIST TEST1
COPYROW TABLE1 * TO:(TABLE2 (SE
breaks but if I copy by specifying one key it seems happy.
eg
COPYROW TABLE1 100002 TO:(TABLE2
it works
I don't want to overwrite records at this time.
I've tried using setalias to shorten the table names, no joy. The longest keys are around 6 digits.
I'm not happy. I have to copy 250,000 records.
Is there some other way of doing it? Would trying a different PC help? Would copying without using the * help, or copying all records from the new table to the old, etc.?
At 16 JAN 2006 10:48PM Janet S Scott wrote:
updates
There were no indexes, quickdexes or rightdexes on either table1 or table2 (source or destination tables).
To check (cos I know I'll forget), from TCL:
LISTTABLES - look for ones like !TABLE1 etc.
LISTMEDIA k:\arev\datapath - look for an MFS
DICT TABLE1
press F6, then choose SF3 or SF4
If it asks you whether you want to install a *dex file, then it hasn't got any.
Since COPYROW itself is what doesn't work, suggesting I copy the records to another table isn't going to help here, though it might for other people.
What I have done that has improved things…
I changed PCs. And a copyrow using a savelist with about 10K keys in it worked!
eg
SELECT TABLE1 WITH @ID STARTING "3"
SAVELIST JSS_T1_3
GETLIST JSS_T1_3
COPYROW TABLE1 TO:(TABLE2 (SE)
worked. TABLE2 now has about 10K more keys.
Note: I also left out the * after TABLE1 and before TO: - I don't know if this made a difference.
Why the first PC may have refused to behave: the source data was on a Novell server with a Novell service, and the destination data was on an NT / Win2K server with the All Networks service. Swapping repeatedly between the two while I figured out what needed copying may have broken something.
The second PC I have connected only to the Win2K server, having copied TABLE1 over to a folder on there from the first PC.
But I won't believe it is sorted until I finish copying stuff.
At 17 JAN 2006 01:22AM Janet S Scott wrote:
OK
I've got it sorted.
I had to copy the records in batches of 10K.
Now there is a discrepancy of about 5000 records - not too bad out of 300K. Does anyone know a neat way of selecting all keys that are in TABLE1 but not in TABLE2? I was thinking of writing a symbolic in the dictionary to link up the tables and be empty if there was no matching key.
At 17 JAN 2006 03:16AM support@sprezzatura.com wrote:
Yup @Ans=Not(Xlate(OtherFile, @Id, 0, "X"))
or similar… and next time just write a program if copy isn't working ;)
support@sprezzatura.com
The Sprezzatura Group Web Site
World Leaders in all things RevSoft
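[Editor's note: to flesh that suggestion out a little, here is a minimal sketch of such a symbolic, using the table names from this thread and assuming it is saved in DICT CL_ADMISSION_H_OLD under a made-up field name such as MISSING_FROM_LIVE. A symbolic's formula assigns its result to @ANS:

@ANS = NOT(XLATE("CL_ADMISSION_H", @ID, 0, "X"))

With the "X" action, Xlate returns null when the key isn't found in the live table, so the symbolic returns 1 for exactly the rows that still need copying. Something along the lines of

SELECT CL_ADMISSION_H_OLD WITH MISSING_FROM_LIVE = "1"
SAVELIST MISSING_KEYS

should then give a list of the outstanding keys.]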
At 17 JAN 2006 12:15PM Matt Sorrell wrote:
Janet,
As an aside, yes the "*" in the copy command makes a difference. Even if you have an active select list, the "*" overrides that and tells the system to copy all of the records from Table1 to Table2.
If you are using an active select list, omit the "*" so the system will use the list.
Finally, my guess is that, because you were not using the "(O)" option for overwrite, the list of rows that could not be copied was growing too big, and that is what triggered the B703 "variable exceeds maximum length" error.
As Sprezz suggested, either write a small program next time or use a symbolic to only select records from Table1 that don't exist in Table2.
Matt Sorrell
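[Editor's note: for anyone landing here later who does want to go the "small program" route, here is a rough R/Basic sketch of what that might look like. It reads every key in the backup table and writes the row to the live table only if the key isn't already there. The table names are the ones from this thread; everything else (variable names, the final PRINT) is illustrative and untested, so treat it as a starting point rather than a finished utility:

* COPY_MISSING_ROWS - rough sketch, not production code
OPEN "CL_ADMISSION_H_OLD" TO SRC ELSE STOP
OPEN "CL_ADMISSION_H" TO DEST ELSE STOP
SELECT SRC            ;* build an active list of every key in the backup
COPIED = 0
DONE = 0
LOOP
   READNEXT ID ELSE DONE = 1
UNTIL DONE DO
   READ EXISTING FROM DEST, ID THEN
      NULL            ;* already in the live table - leave it alone
   END ELSE
      READ REC FROM SRC, ID THEN
         WRITE REC ON DEST, ID
         COPIED = COPIED + 1
      END ELSE
         NULL         ;* key vanished from the backup - skip it
      END
   END
REPEAT
PRINT "Copied " : COPIED : " rows"
STOP

Because it never overwrites an existing row and writes one record at a time, it sidesteps both the "variable exceeds maximum length" problem and the need to batch the keys into 10K chunks.]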