
At 29 NOV 1999 10:07:23PM Scott, LMS wrote:

Hi All

I dunno how to put this, but our application works fine at our office (Win 95 + Novell LAN + Novell server + NLM 1.14) but not at the client site (Novell LAN, Novell server, Win 95, NLM 1.5).

The client site is much bigger than our site. The whole system doesn't work there unless a REVPARAM file is present everywhere, whereas at our site it works fine without them.

Other things that go wrong:

We have 3 different versions of the application at the client site: Development, Test and Training. Development has a one-user licence; the other two have a 256-user licence. If someone logs on to Test or Training I can't use Dev. Dev is starting to disintegrate, in that it now won't open in dev mode - it GPFs instantly (REVDEBUG.DLL errors mostly).

Some bits of code work fine on our system (run time or dev) but not at the client site, in particular copying template tables to temporary tables (copy_table, attach_table).

Also, they have a lot more data than we are accustomed to running, so the whole thing runs like a snail on Prozac, and we keep blowing the 64K variable size limit - we need suggestions for fixing this.

I need some troubleshooting ideas, please help.

Scott


At 29 NOV 1999 10:36PM Don Bakke wrote:

Scott,

What version of OI? Some of the older versions had problems with license counts when running multi-user runtimes and single-user development engines simultaneously. This might cause your GPFs in REVDEBUG.DLL.

As far as blowing 64K limits goes, in what kind of process is this happening? Is it a program blow-up or a window blow-up? Most, if not all, 64K problems can be resolved with proper in-code checking.
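For illustration, here is a minimal BASIC+ sketch of the kind of in-code checking meant here: guard a growing dynamic array against the 64K string limit and spill it to a work table before it gets there. The routine itself, the 60,000-character threshold and the CHUNK* key scheme are assumptions for the example, not anyone's actual code.

   SUBROUTINE Append_Checked(hWork, chunkNo, buffer, item)
   * Sketch only: append "item" to an @FM-delimited buffer, but write the
   * buffer out to an already-opened work table (hWork) before it gets
   * near the 64K string limit. Threshold and key scheme are assumptions.
   IF LEN(buffer) + LEN(item) + 1 GT 60000 THEN
      chunkNo = chunkNo + 1
      WRITE buffer ON hWork, 'CHUNK*' : chunkNo ELSE STOP
      buffer  = ''
   END
   buffer := item : @FM
   RETURN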

[email protected]

SRP Computer Solutions


At 30 NOV 1999 08:19AM [email protected] (Sprezzatura Group) wrote:

64K problems… luxury… try using OICGI for anything bigger than 30K!

Speed shouldn't be that related to data size unless the programs are trawling through files sequentially.

Copy_Table doesn't work on a runtime as it is restricted by the license. You might want to check the license to see if the same applies to attach_table. These restrictions need to be borne in mind when initially designing the system.

[email protected]

Sprezzatura Ltd

World Leaders in all things RevSoft


At 01 DEC 1999 10:05PM Scott, LMS wrote:

Hi All

Thanks for your responses.

I sort of fixed the dev install by copying all the files except the Rev tables from the oinsight directory of our source install. Strangely, I went from a 1-user dev licence to a 3-user dev licence. The client site has OI 3.6.1 on the Dev and OI 3.7 on the runtime. Back in the office we have OI 3.7 for both Dev and runtime. We are planning to upgrade soon.

Even stranger, we have managed to create ourselves a runtime dev system. I don't know how we did this, but we have an install that is the right size to be a runtime (using DOS dir to check) yet it will actually run in dev mode. Maybe we applied a Dev licence upgrade to a runtime, but I don't know.

As for Copy_Table not working in the runtime: I understood that Create_Table did not work, but I thought Copy_Table did. Is there a document on this web site that confirms/denies what works and what doesn't? It is not in the Programmer's Reference help (well, I couldn't find it).

We have several workarounds for 64K limits.

What I want to do is store all the keys from a table that we are planning to update, and then update them only if everything has gone well with the processing. If the processing/validation etc. fails, I don't want to mark the records as processed. I have solved this one by writing the keys to SYSLISTS and then activating the saved list. Works quite well.
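A rough BASIC+ sketch of that save-then-activate pattern, for reference. The SYSLISTS record layout assumed here (one @FM-delimited record of keys stored under the list name) and the use of GETLIST through RLIST are assumptions based on the description above, so compare against a list saved by OI itself before relying on them.

   * Sketch only: save candidate keys, then activate them as a list once
   * the processing/validation has succeeded.
   DECLARE SUBROUTINE RLIST
   $INSERT RLIST_EQUATES

   listName = 'PENDING_UPDATES'     ;* assumed list name
   keyList  = ''                    ;* @FM-delimited keys gathered in the first pass

   OPEN 'SYSLISTS' TO hLists ELSE STOP
   WRITE keyList ON hLists, listName ELSE STOP

   * ...later, only if the whole batch validated cleanly...
   RLIST('GETLIST ' : listName, TARGET_ACTIVELIST$, '', '', '')   ;* assumes GETLIST is available via RLIST
   done = 0
   LOOP
      READNEXT id ELSE done = 1
   UNTIL done
      NULL    ;* mark/update the record for key "id" here
   REPEAT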

I also have one process that effectively creates 4 OIPI reports. What I end up doing is processing the records, writing the result to the report, and saving any error details in - you guessed it - a dynamic array. When the array gets to about 3000 records it croaks with the data limit (or does other strange things, depending on its mood at the time).
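One way around that particular blow-up, sketched below: spill each error detail to a scratch table as it is found, and read the details back a key at a time when the error section of the report is printed. The ERR_LOG table and the batch*sequence key scheme are made up for the example.

   * Sketch only: write each error detail out as its own record instead
   * of growing one dynamic array past the 64K limit. ERR_LOG and the
   * batchId*sequence key scheme are assumptions.
   OPEN 'ERR_LOG' TO hErr ELSE STOP

   batchId = 'RPT*' : DATE()        ;* one batch of errors per report run
   errSeq  = 0

   * ...then, inside the existing processing loop, when a record fails:
   errDetail = id : ': ' : errorText    ;* id/errorText come from that loop
   errSeq    = errSeq + 1
   WRITE errDetail ON hErr, batchId : '*' : errSeq ELSE STOP

   * When printing the error section, read the details back one key at a
   * time (batchId : '*' : 1 through errSeq) instead of holding them all.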

We also have enquiry screens which are supposed to display a list of keys and details to choose from. Some of the tables have over 80,000 records, so naturally this doesn't work too well either. I wrote a paging routine to get around that but it is annoying.
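One common shape for that sort of paging routine is sketched below: select one page of keys past the last key already shown, rather than pulling all 80,000 at once. BIG_TABLE, PAGE_SIZE and the assumption that the keys sort sensibly as text are all placeholders for the example.

   * Sketch only: fetch one page of keys at a time for an enquiry screen.
   DECLARE SUBROUTINE RLIST
   $INSERT RLIST_EQUATES

   EQU PAGE_SIZE$ TO 50

   lastKey = ''        ;* '' for the first page, else the last key shown

   stmt = 'SELECT BIG_TABLE'
   IF LEN(lastKey) THEN
      stmt := ' WITH @ID GT "' : lastKey : '"'
   END
   stmt := ' BY @ID'

   RLIST(stmt, TARGET_ACTIVELIST$, '', '', '')

   pageKeys = ''
   nKeys    = 0
   done     = 0
   LOOP
      READNEXT id ELSE done = 1
   UNTIL done OR nKeys GE PAGE_SIZE$
      pageKeys := id : @FM
      nKeys    = nKeys + 1
   REPEAT
   * pageKeys now holds up to PAGE_SIZE keys for the screen; remember the
   * last one as lastKey and re-run this to fetch the next page.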

Reporting on the 80,000 records is slow. First we select about a quarter of them (based on a year field which is part of the key), then we sort them, then we process and report them. UGH. It seems to take forever, especially on our office system (the one with NLM 1.14 and no REVPARAM files). Note that our office PCs have more guts (RAM, clock speed, hard drive) than our client's PCs, yet they go slower.

I wondered if having a 4-part compound key would contribute to the slowness. It does take a long time to select records based on part of the key.
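On the selection side, assuming a dictionary column is defined over the year part of the key (called YEAR below, purely as a placeholder), the select/sort/process pass would look roughly like this sketch. Unless that column is indexed, the engine still has to read through the whole table to resolve the WITH clause, which fits the earlier comment about trawling files sequentially.

   * Sketch only: the select/sort/process pass described above. Table,
   * column and sort-field names are placeholders.
   DECLARE SUBROUTINE RLIST
   $INSERT RLIST_EQUATES

   OPEN 'BIG_TABLE' TO hTable ELSE STOP

   RLIST('SELECT BIG_TABLE WITH YEAR = "1999" BY SOME_FIELD', TARGET_ACTIVELIST$, '', '', '')

   done = 0
   LOOP
      READNEXT id ELSE done = 1
   UNTIL done
      READ rec FROM hTable, id THEN
         NULL   ;* build the report line / export row for this record
      END
   REPEAT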

The other thing I was wondering about: I start a Rev Report from code and I can pass select criteria but not sort criteria, so I had to change the report itself to sort the way I wanted. Fortunately I don't need that flexibility this time, but I was wondering if there is a /SORT=blah parameter I could feed into the mix. Also, the Rev Reporter runs incredibly slowly compared to the rest of the processing. Mostly our users cancel the reporter before it even puts up the first page. Then they want the report back, but it is quicker to take the export file that was supposed to be the input to the reporter and import it into Excel. I think OIPI could be a winner here, unless I missed something.

Scott

