Difficulty reading a large text file into a native table (OpenInsight 32-bit)
At 05 OCT 2012 12:43:10AM James Alexander wrote:
I have a large text file (14 GB, > 100 million records) of CRLF/tab-delimited data that I am trying to read in using the OSBRead command. The routine I coded runs through about 24 to 30 million records, then goes no further.
I am running this command specifically: OSBRead ifile From FileHandle At counter Length 100000
Where counter = 3250253838
I am stumped as to the best course of action. Thanks in advance.
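A minimal sketch of the kind of loop described, using the names from the post (ifile, FileHandle, counter); the path and the loop structure are assumptions:

    * Sketch only: the path and surrounding logic are placeholders
    OSOpen 'C:\data\bigfile.txt' To FileHandle Else Return

    counter = 0
    Loop
       OSBRead ifile From FileHandle At counter Length 100000
    Until Len(ifile) = 0
       * ... parse the CRLF/tab-delimited records in ifile ...
       counter = counter + Len(ifile)
    Repeat
    OSClose FileHandle

On a 14 GB file, counter eventually passes 2147483647, which turns out to be the problem (see the reply below).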
At 05 OCT 2012 04:58AM rwilson wrote:
I remember some internal limit, but I think it was 16-bit based.
Can you break up the file into 5 or 6 parts using some other tool
(Excel/WordPad/etc.)?
Rich
At 08 OCT 2012 07:25AM Carl Pates wrote:
James,
As OI is currently a 32-bit product, it is limited internally to manipulating numbers as signed 32-bit integers (maximum value 2147483647), though in certain cases such a value can be interpreted as an unsigned 32-bit number.
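To make the arithmetic concrete: the offset of 3250253838 from the original post is larger than 2147483647, so forced into a signed 32-bit integer it wraps to 3250253838 - 4294967296 = -1044713458, a negative offset that no read can satisfy, which would explain the routine stalling at that point.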
However, the Windows API does cater for accessing files larger than this, usually by splitting a 64-bit value into two 32-bit values and allowing you to manipulate them separately.
Take a look at functions like ReadFile and SetFilePointer documented here:
http://msdn.microsoft.com/en-us/library/windows/desktop/aa364232(v=vs.85).aspx
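As a rough sketch of what that might look like from Basic+, assuming SetFilePointer has been prototyped from KERNEL32 (for example via a DLL prototype record passed to Declare_FCns; the exact prototype and type keywords should be checked against the OpenInsight DLL documentation), and with hFile standing in for a handle obtained from a prototyped CreateFile call:

    Declare Function SetFilePointer

    offset64 = 3250253838                  ;* 64-bit target offset
    loPart = Mod(offset64, 4294967296)     ;* low 32 bits: 3250253838
    hiPart = Int(offset64 / 4294967296)    ;* high 32 bits: 0

    * hiPart is passed by reference (LPLONG) so the API can update it;
    * the final 0 is FILE_BEGIN. Note that loPart exceeds 2147483647
    * and relies on being marshalled as an unsigned 32-bit value.
    rv = SetFilePointer(hFile, loPart, hiPart, 0)

A ReadFile call at the new position would then pull back the next chunk.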
World leaders in all things RevSoft
At 18 NOV 2012 09:56PM James Alexander wrote:
Thanks for the replies, everyone. I ended up splitting the file into more manageable sections and was able to import them into native OI tables. I am now trying to solve a processing-time issue with the data: the system seems to run out of RAM before getting through the job.
At 20 NOV 2012 08:32AM Andrew McAuley wrote:
The system seems to run out of RAM before getting through the job.
What version of OI? What are you accumulating in memory to lose RAM?
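One common pattern that exhausts memory is appending every processed record to a single dynamic array and only writing it out at the end; appends to a very large dynamic array can also get slower as the variable grows. A sketch of the write-as-you-go alternative, with hypothetical table and variable names:

    Open 'IMPORT_TABLE' To hTable Else Return   ;* hypothetical table

    done = 0
    Loop
       * ... get the next parsed record into id and rec;
       * set done = 1 when the input is exhausted ...
    Until done
       * Write each record as it is parsed, rather than
       * accumulating them all in one variable (allData<-1> = rec)
       Write rec On hTable, id Else
          * handle the write failure
       End
    Repeat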
World leaders in all things RevSoft