Memory limit when writing records. (OpenInsight 16-Bit Specific)
At 24 APR 2003 11:38:36AM Bob Yerkes wrote:
I am trying to copy records from one table to another using a ReadNext loop:

    Loop
        ReadNext @Id Else Eof = True
    Until Eof
        Read @Record From TableData, @Id Then
I then transform some of the data for the new table and write it to the new table:

    Write Cur_Record On Cust_Hist_Data, Trans_Id Else
        Call Msg("Unable to write")
    End
When the total of the keys I am writing is greater than 64K, it bombs on the write. I don't store the keys in a variable or keep any other running total that could grow too big; I read, then write, clearing all of my variables each time.
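For reference, a complete copy loop of this shape might look like the sketch below. The table and variable names (TableData, Cust_Hist_Data, Trans_Id, Cur_Record) are taken from the post; the Open, Select, and transform steps are assumptions filled in to make the fragment self-contained, and the transform itself is only a placeholder:

    * Sketch only: assumes TABLEDATA and CUST_HIST_DATA are the actual table names
    Open "TABLEDATA" To TableData Else Call Msg("Cannot open TABLEDATA") ; Return
    Open "CUST_HIST_DATA" To Cust_Hist_Data Else Call Msg("Cannot open CUST_HIST_DATA") ; Return

    Select TableData
    Eof = 0
    Loop
        ReadNext @Id Else Eof = 1
    Until Eof
        Read @Record From TableData, @Id Then
            Trans_Id   = @Id       ;* transform the key here as needed
            Cur_Record = @Record   ;* transform the data here as needed
            Write Cur_Record On Cust_Hist_Data, Trans_Id Else
                Call Msg("Unable to write")
            End
        End
    Repeat

Since each Write replaces the previous record's variables rather than accumulating anything, the loop itself keeps no running total; any 64K limit hit here comes from something outside the loop, such as an index on the target table.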
I have tried using Flush and GarbageCollect after writing the record but no success.
Thanks for any ideas,
Bob
At 24 APR 2003 11:48AM Don Miller - C3 Inc. wrote:
Bob ..
Do you have a QUICKDEX/RIGHTDEX on the target table? If so, then your write will bomb. Check the media map for the volume to see if there is one on the table you're writing to.
Don M.
At 24 APR 2003 01:48PM Bob Yerkes wrote:
Thanks for the information. I think I found the problem: the developer who created all the tables in the volume had set the sizelock to 2. When I changed it to 0, it started working.
Thanks,
Bob Yerkes