Best way to use the Remove statement (OpenInsight 32-bit)
At 06 APR 2022 09:29:13AM cmeyer wrote:
I have a file with 5 multivalued fields, with the multivalue date field going back 11 years in chronological order. I need to select the last 3 years of multivalue fields. What is the most efficient way (time wise) to select the dates and the other associated multivalued fields? I believe the "Remove" statement is the most efficient, but I do not know how to use the "Remove" statement to also select the other multivalue fields. Is there some sort of grouping available to get the other multivalue field data? The file has 140,000 records and currently the selection process takes more than 40 minutes.
Any advice would be gratefully received.
Chris
At 06 APR 2022 09:56AM Andrew McAuley wrote:
Remove statement per MV field.
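A minimal sketch of that approach, assuming the record has already been read into rec with the dates in field 1 and one associated field (amounts, say) in field 2; the field positions and variable names are placeholders only:

* Sketch only: walk two associated MV fields in lockstep with Remove,
* keeping the values whose date falls inside the cutoff.
* Field positions 1 (dates) and 2 (amounts) are assumptions for this example.
cutoff  = Date() - (3 * 365)
dates   = rec<1>
amounts = rec<2>
keep    = ""
dpos = 1
apos = 1
Loop
   Remove this_date   From dates   At dpos Setting dmark
   Remove this_amount From amounts At apos Setting amark
   If this_date >= cutoff Then
      keep<1,-1> = this_date
      keep<2,-1> = this_amount
   End
While dmark Repeat

The same pattern extends to however many associated fields there are: one Remove per field, all advanced inside the same loop, so the value positions stay aligned.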
World leaders in all things RevSoft
At 06 APR 2022 11:47AM D Harmacek wrote:
Chris, I need a little more information. Are those 5 multivalued fields the only ones in the record?
You want the most recent 3 years of dates. Are the MVs updated together at the VM level per date? Are they most recent at the top, or at the bottom?
Dave Harmacek - Harmacek Database Systems - near Boston, MA USA
At 06 APR 2022 11:51AM bob carten wrote:
The snippet below will parse specific fields from a record with a remove statement.
Function test_remove_Example(void)
/*
** Example code to
** extract relevant information from a record using a remove statement
*/
#pragma format_indent_comments
$insert logical

* For the remove statement:
* assume that the data is only fm / vm delimited
Equ mark_fm$ To 2
Equ mark_vm$ To 3

* Identify the field numbers you want to extract
relevant_positions = "1,7,9,12,14"

* Make an array to hold the results for each record
dim relevant_data(5)

* Make some fake data for the test
recs = ""
For i = 1 To 10
   For j = 1 To 20
      For k = 1 To 5
         recs<i,j,k> = i:'-':j:'-':k
      Next
   Next
Next
Convert @fm:@vm:@svm To @rm:@fm:@vm In recs

* In this example, just work with the fake data.
* Normally this would be loop / readnext / read
rp = 1
Loop
   rec = recs[rp,@rm] ; rp = col2() + 1
   Gosub parse_rec
   /*
   * Do some test on the parsed data?
   is_wanted = (relevant_data(1) Matches criteria<1>) And (relevant_data(2) lt criteria<2>) And so_forth And so_on
   If is_wanted Then
      Gosub process_it
   End
   */
While rp lt Len(recs) Repeat

Return

Parse_rec:
   col = ""
   blank = ""
   prev_mark = 0
   cur_fm = 1
   cur_vm = 1
   mat relevant_data = blank
   Loop
      remove item from rec at col setting mark
      Begin Case
         Case mark == 0
            * last item
            Gosub addit
         Case mark == mark_fm$
            Gosub addit
            cur_fm += 1
            cur_vm = 1
         Case mark == mark_vm$
            Gosub addit
            cur_vm += 1
         Case Otherwise$
            * append it?
            relevant_data(cur_fm)<1,cur_vm,-1> = item
      End Case
      prev_mark = mark
   While mark Repeat
   debug   ;* stop here to inspect relevant_data for each test record
Return

/////////////////////
addIt:
   Locate cur_fm In relevant_positions Using ',' Setting relevant_pos Then
      relevant_data(relevant_pos)<1,cur_vm> = item
   End
Return
/////////////////////
At 07 APR 2022 03:35AM Mark Boorman wrote:
Considering you mention selecting the data, I'm wondering if you mean the "Reduce" statement?
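(If the intent is simply to narrow the record set before parsing it, a plain RList select with the date criterion in the statement, as in Joshua's snippet below, can do that reduction up front. The table and field names here are hypothetical, and the statement assumes a dictionary date field exists:)

* Hypothetical table and dictionary field names; 5 = TARGET_ACTIVELIST$
Declare Subroutine RList
RList('SELECT MY_TABLE WITH ENTRY_DATE GE "01/01/2019"', 5)
* ReadNext then returns only the keys that met the date criterion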
At 07 APR 2022 05:30AM Joshua Goddard wrote:
equ nfields$ to 20   ;* Depends on how many fields are in a record
equ date_col$ to 5

dim rec(nfields$)

cutoff_date = date() - 3*365   ;* Last 3 years

table_name = ""   ;* fill in the table to process
open table_name to table_handle else debug

clearselect
rlist("SELECT TABLE_NAME", 5)   ;* substitute the real table name in the statement

data_we_want = ""
eof = false$
Loop
   readnext key else eof = true$
until eof
   matread rec from table_handle, key then
      p1 = 1
      p2 = 1
      p3 = 1
      p4 = 1
      p5 = 1
      loop
         remove c1 from rec(1) at p1 setting f1
         remove c2 from rec(2) at p2 setting f2
         remove c3 from rec(3) at p3 setting f3
         remove c4 from rec(4) at p4 setting f4
         remove date from rec(date_col$) at p5 setting f5
      while len(date) or f5
         if date >= cutoff_date then
            data_we_want := c1:@vm:c2:@vm:c3:@vm:c4:@vm:date:@fm
         end
      Repeat
   end
Repeat

data_we_want[-1,1] = ""   ;* strip the trailing @fm
At 07 APR 2022 06:09AM cmeyer wrote:
Thanks Bob & Josh,
I will give it a go and I am sure it will improve the extraction performance.
Chris