At 01 SEP 2000 03:10:47AM LMS wrote:

We are having problems verifying a table within our application.

Either it breaks to the debugger with the message:

'LH_VERIFY_SUB' broke because a run time error was encountered, or we get the Windows message "This program has performed an illegal operation and will be terminated", etc.

We suspect that there is a GFE somewhere within this file but do not want to use FIXLH on the whole file as we have had bad experiences using this with records being deleted.

Is there any other way of fixing the corrupted records?

Thanks


At 02 SEP 2000 10:17PM Richard Hunt wrote:

There is another way to check if a file has GFE's. It does not always catch all GFE's. Do this at the "command window" (F5 function key)…

SELECT tablename

Where tablename is the name of the table (file). If it gets all the way through the select, then it didn't find a GFE (which doesn't necessarily mean the file is clean of GFE's). If the select does hit a problem, then you know for sure the file has GFE's.

If it does have GFE's, the only way I know of to fix the file is to run FIXLH. Before running FIXLH I would copy the file "as is" (remember to copy the *.lk and *.ov). Be sure to deal with the indexes too (if there are any).


At 03 SEP 2000 06:40AM Raymond Blackwell wrote:

Never had any luck myself with FIXLH or the DUMPLH version; I normally finish up having to write an R/Basic routine to pick up the non-corrupted data and use the trusty editor to fix the GFE'd records.

Ray
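
A minimal R/Basic sketch of the kind of salvage routine Ray describes: copy every readable record out of the damaged table into a fresh one and report the keys that cannot be read. The table names BAD_TABLE and SALVAGE_TABLE are placeholders, and on a badly damaged group even the READNEXT pass may still drop to the debugger, in which case the keys would have to come from elsewhere (a parallel file, a backup, or a saved select list).

* Salvage sketch: copy readable rows from BAD_TABLE to SALVAGE_TABLE
OPEN "BAD_TABLE" TO F.BAD ELSE
   PRINT "Cannot open BAD_TABLE"
   STOP
END
OPEN "SALVAGE_TABLE" TO F.GOOD ELSE
   PRINT "Cannot open SALVAGE_TABLE"
   STOP
END
SELECT F.BAD
DONE = 0
LOOP
   READNEXT ID ELSE DONE = 1
UNTIL DONE DO
   READ REC FROM F.BAD, ID THEN
      WRITE REC ON F.GOOD, ID
   END ELSE
      PRINT "Unreadable record: ":ID   ;* candidate GFE - fix by hand later
   END
REPEAT
PRINT "Salvage pass complete."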


At 03 SEP 2000 02:47PM Richard Hunt wrote:

Ahhhh. I have used DUMPLH to view raw data and to see the actual corruption. DUMPLH is also good for compressing the file.

I have only needed to use FIXLH once. It worked fine. The file was left with good data; the bad or questionable data was saved in a temp file. Out of about 150,000 records only about 15 were lost.

I was extremely lucky because the file that was GFE'd kind of parallels another file. What I mean is that I was able to use the other file to check the GFE'd file for missing records. Then I used a backup to restore the missing records. Luckily, a controlled audit trail system also helped to make sure the records from the backup had not been updated since the backup.

Without the above-mentioned controls in place, I can definitely see much difficulty recovering from GFE's.

My concern is that GFE's should be extremely rare! About the only things I can imagine would cause one are…

1) power failure during an update.

2) record locking problems: two users updating the same item at the same time.

3) software directly messing the file structure up.

4) accessing the database from the server. A definite no-no: the server does not lock records, so item 2 will develop (see the locking sketch after this list).
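
A rough R/Basic sketch of the locking discipline behind items 2 and 4: take a row lock from the workstation before reading and rewriting a record, so that two users cannot update the same row at the same time. The table name, key, and field values are placeholders, and the exact lock-release behaviour of WRITE and UNLOCK should be checked against your own Arev version.

OPEN "MY_TABLE" TO F.TABLE ELSE
   PRINT "Cannot open MY_TABLE"
   STOP
END
ID = "CUST001"   ;* placeholder key
LOCK F.TABLE, ID THEN
   * We hold the row lock, so it is safe to read, change, and write back
   READ REC FROM F.TABLE, ID ELSE REC = ""
   REC<1> = "updated value"   ;* placeholder change
   WRITE REC ON F.TABLE, ID
   UNLOCK F.TABLE, ID   ;* release explicitly (some versions also release on WRITE)
END ELSE
   PRINT "Row ":ID:" is locked by another user - try again later"
END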


At 03 SEP 2000 07:26PM Raymond Blackwell wrote:

Richard,

I agree that DUMPLH is very useful for checking the whereabouts of the GFE and also for looking at raw data; what I have found is that it is pretty useless at actually fixing the GFE. I also agree that GFE's should be rare; unfortunately they seem to occur more regularly than they should, even on single-user systems, and more particularly in the indexing files. Backups are not always the answer, since the GFE seems to appear when you ACCESS the record rather than when it is written, which means the GFE can remain hidden for a long period before the dreaded 'Arev blue screen' occurs. It is a pity, since Arev is no longer supported, that the LH code is not released to the wider community so that some technical whizz could produce a more bullet-proof structure.

Ray


At 04 SEP 2000 12:36AM Richard Hunt wrote:

Yeah, most definitely agree with ya. I have found lots of helpful info on the AREV linear hashing. It's in the table SYSKNOWLEDGE, row R29.

I am just amazed at how many times I have read about GFE's here. Any time a GFE appears, it is always a "near-death experience", for sure.

I wonder if it has to do with the indexing??? See, I use my own indexing software, and none of my customers ever seem to have GFE problems, except for one time, and that was when a user crashed the server.

We do verify the files monthly. We do compress the files monthly too. I can't really think of anything else to say. I have heard that source code is available to WORKS subscribers. Not sure what source code though.


At 04 SEP 2000 08:26AM Raymond Blackwell wrote:

Richard,

Like you, I have developed my own indexing routines, and my GFE problems invariably occur with Arev's indexing routines. They are useful if they don't crash, but I have taken the line of least resistance and use them only occasionally.

Ray
