
At 29 MAY 1998 02:44:34PM Mark Petereit wrote:

I have a customer where eventually 18K employees may reside in a single database, using SS# as the key.

The issue arises when doing an Rlist "Select" using "with" criteria and "by" sorting criteria. The problem is that when the select returns keys surpassing the 64K string/record length limit, the results are skewed and unreliable. To work around this we wrote a preprocessing program that uses Select "file.handle" to get all the keys and creates a new table, keyed by a sequential counter, to store the valid records for selecting and sorting. This buys us around 11,000 employees that can be handled in this way.
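A minimal Basic+ sketch of that preprocess, assuming hypothetical table names EMPLOYEES and EMP_WORK (a real version would presumably clear the work table first and handle errors more carefully):

    * Copy every employee key into a work table keyed by a sequential
    * counter, so later selects never build one huge key string.
    Open 'EMPLOYEES' To hEmp Else Stop
    Open 'EMP_WORK' To hWork Else Stop

    Select hEmp                    ;* raw select on the file handle

    Counter = 0
    Done = 0
    Loop
       Readnext Id Else Done = 1
    Until Done Do
       Counter = Counter + 1
       Write Id On hWork, Counter Else Stop  ;* seq counter -> real key
    Repeat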

The question: is there some magical method (documented or otherwise) that will allow a program to Select records where the combined key length of the results exceeds the 64K limit?

I know that with Btree you can bring in chunks of keys, but will this maintain any sorting order so that the output file can be in a specific sequence?


At 01 JUN 1998 03:41AM Peter Bowyer, Swiftscan Systems wrote:

You could try using Rlist and saving the results to a list. Then use Activate_Save_Select to retrieve the list and process the records. We have used this method for large result lists quite successfully. The results will be returned more quickly if you have B-Tree indexes on the select and sort fields.
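A minimal Basic+ sketch of this approach, with hypothetical table and list names; the TARGET_SAVELIST$ equate and the exact Activate_Save_Select signature are assumptions on my part:

    $insert RLIST_EQUATES
    Declare Subroutine RList, Activate_Save_Select

    * Run the query, saving the result keys to a named list instead of
    * returning them in one variable.
    Stmt = 'SELECT EMPLOYEES WITH DEPT EQ "SALES" BY LAST_NAME'
    RList(Stmt, TARGET_SAVELIST$, 'EMP_BY_NAME', '', '')

    * Activate the saved list and walk it one key at a time; Readnext
    * returns the keys in the saved sort order.
    Activate_Save_Select('EMP_BY_NAME')
    Done = 0
    Loop
       Readnext Id Else Done = 1
    Until Done Do
       * process the record identified by Id
    Repeat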


At 01 JUN 1998 08:01AM Cameron Revelation wrote:

Mark,

The issue arises when doing an Rlist "Select" using "with" criteria and "by" sorting criteria. The problem is that when the select returns keys surpassing the 64K string/record length limit, the results are skewed and unreliable.

Are you saying that the SELECT does not work, or that you cannot put all of the results into one variable? The READNEXT statement will return only one key at a time, and that key is never over 64K. If you are getting invalid results, chances are you have a corrupt index, which is probably caused by improper justification of an indexed field, improper single/multi-value specification for an indexed field, or system delimiters (for example, @VM) being present in a key.
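A minimal sketch of the pattern Cameron describes, with hypothetical table and field names; the keys arrive one per Readnext call, so no variable ever has to hold the whole result set:

    $insert RLIST_EQUATES
    Declare Subroutine RList

    * The engine resolves the WITH and BY clauses internally; Readnext
    * then hands back one key at a time.
    Stmt = 'SELECT EMPLOYEES WITH STATE EQ "PA" BY LAST_NAME'
    RList(Stmt, TARGET_ACTIVELIST$, '', '', '')

    Done = 0
    Loop
       Readnext Id Else Done = 1
    Until Done Do
       * each Id is a single key, nowhere near 64K
    Repeat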

Cameron Purdy

info@revelation.com


At 02 JUN 1998 09:45PM Paul Rule wrote:

There are a number of problems with the 64K limit, and a number of ways of handling them. You don't mention what sort of corruption you're getting, so I'll just give you some of my thoughts on the problem. You shouldn't have a Quickdex or Rightdex on a file with more than 64K of keys; if you do, it will crash during the select. The design of your program that selects a large number of keys may also be the issue: if you try to read all of the keys into one variable you will have problems. Instead, read one record at a time and process it (i.e. loop, readnext key, process key, repeat), as in the sketch below. We have clients with databases containing over 500,000 records in a single file without encountering these problems. If you can supply more information I can be a little less vague. Good luck.
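A minimal Basic+ sketch of that loop, assuming a hypothetical EMPLOYEES table; each key and record is processed and discarded individually, so nothing accumulates toward the 64K limit:

    Open 'EMPLOYEES' To hEmp Else Stop

    Select hEmp          ;* or activate a sorted list via RList first

    Done = 0
    Loop
       Readnext Id Else Done = 1
    Until Done Do
       Read Rec From hEmp, Id Then
          * process this one record, then move on; never append
          * Id or Rec to a running string
       End
    Repeat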


At 13 JUN 1998 10:49AM Aaron Kaplan wrote:

What version of OI? If you remove the "with" or the "by" clause, does the problem go away? The engine works with these items in 64K chunks and it works with the LH records in groups, so the 64K limit isn't really an issue. I have a guess as to where the problem could be, but I don't want to lead you there.

akaplan@sprezzatura.com

Sprezzatura, Inc.

www.sprezzatura.com
