How to read a large matrix from a CSV file efficiently in Octave

There are many reports of poor performance of Octave's dlmread. I was hoping this would be fixed in 3.2.4, but when I tried to load a CSV file of roughly 8 columns by 4 million rows (32 million values in total), it also took a very long time. I searched the Internet but did not find a workaround for this. Does anyone know a good workaround?

+5
3 answers

I had the same problem, and since R was convenient for me, I decided to use "read.csv" in R, then use the R package "R.matlab" to write a .mat file, and then load that file in Octave.

"read.csv" can also be quite slow, but in my case it worked very well.

+3

The reason is that Octave has a performance bug: appending data to a very large matrix takes much longer than appending the same amount of data to a small matrix.
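A minimal sketch that makes this effect visible (sizes are illustrative, not from the original post):

```octave
% Compare appending N rows to a small matrix vs. appending the same
% N rows to a matrix that is already large. Each append of the form
% m = [m; row] copies the whole matrix, so cost grows with its size.
N = 20000;

small = [];
tic;
for i = 1:N
  small = [small; i, i];      % append to a small, growing matrix
end
t_small = toc;

big = zeros(500000, 2);       % start from an already-large matrix
tic;
for i = 1:N
  big = [big; i, i];          % each append copies all 500000+ rows
end
t_big = toc;

printf("small: %.2fs  big: %.2fs\n", t_small, t_big);
```

The second loop is dramatically slower, which is what motivates flushing the buffer to disk below.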

Below is my attempt. I chose to save the data to disk every 50,000 rows, so that I can already look at intermediate results instead of waiting for the whole file. This is slower for small files, but much faster for large files.

function alldata = load_data(filename)
    fid = fopen(filename, 'r');
    s = 0;
    data = [];
    alldata = [];
    save "temp.mat" alldata;
    if fid == -1
        printf("Couldn't open file %s\n", filename);
    else
        while (~feof(fid))
            line = fgetl(fid);
            # read the time as hh:mm:ss:ms and the value as a float
            [t1, t2, t3, t4, d] = sscanf(line, '%i:%i:%i:%i %f', "C");
            s++;
            t = t1 * 3600000 + t2 * 60000 + t3 * 1000 + t4;  # time in ms
            data = [data; t, d];
            if (mod(s, 10000) == 0)
                disp(s);
                fflush(stdout);
            end
            # every 50,000 rows, move the small buffer into the big
            # matrix kept on disk, so we never append to a huge
            # in-memory matrix row by row
            if (mod(s, 50000) == 0)
                load "temp.mat";
                alldata = [alldata; data];
                data = [];
                save "temp.mat" alldata;
                disp("data saved");
                fflush(stdout);
            end
        end
        # flush the remaining rows
        disp(s);
        load "temp.mat";
        alldata = [alldata; data];
        save "temp.mat" alldata;
        disp("data saved");
        fflush(stdout);
        fclose(fid);
    end
endfunction
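Assuming each line of the input file looks like `12:34:56:789 1.23` (the format the `sscanf` pattern above expects), usage is simply (file name hypothetical):

```octave
% Load the file; column 1 is the time in milliseconds, column 2 the value.
alldata = load_data("mydata.csv");
plot(alldata(:, 1), alldata(:, 2));
```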
+2


My CSV files have a lot of lines. They begin with an 18-line header, followed by a data block in which each row has 135 columns. The code below has been tested. Each data row in my file starts with an mm/dd/yyyy hh:mm field. The code also catches bad lines and reports where they are using try/catch.

My CSV file came from a client who dumped PARCView data into an Excel file.

function [tags, descr, alldata] = fbcsvread(filename)
  fid = fopen(filename, 'r');
  tags = {};
  descr = {};
  data = [];
  alldata = zeros(1, 135);
  if fid == -1
    printf("Couldn't open file %s\n", filename);
  else
    linecount = 1;
    while (~feof(fid))
      line = fgetl(fid);
      data2 = zeros(1, 135);
      if linecount == 1
        tags = strsplit(line, ",");
      elseif linecount == 2
        descr = strsplit(line, ",");
      elseif linecount >= 19          # lines 3-18 are the rest of the header
        data = strsplit(line, ",");
        # first field is "mm/dd/yyyy hh:mm"
        datetime = strsplit(char(data(1)), " ");
        modyyr = strsplit(char(datetime(1)), "/");
        hrmin = strsplit(char(datetime(2)), ":");
        year1 = sscanf(char(modyyr(3)), "%d", "C");
        day1 = sscanf(char(modyyr(2)), "%d", "C");
        month1 = sscanf(char(modyyr(1)), "%d", "C");
        hour1 = sscanf(char(hrmin(1)), "%d", "C");
        minute1 = sscanf(char(hrmin(2)), "%d", "C");
        realtime = datenum(year1, month1, day1, hour1, minute1);
        data2(1) = realtime;
        for location = 2:135          # the remaining 134 data columns
          try
            data2(location) = sscanf(char(data(location)), "%f", "C");
          catch
            printf("Error at %s %s\n", char(datetime(1)), char(datetime(2)));
            fflush(stdout);
          end_try_catch
        endfor
        alldata(linecount - 18, :) = data2;
        if mod(linecount, 50) == 0
          printf(".");                # progress indicator
          fflush(stdout);
        endif
      endif
      linecount = linecount + 1;
    endwhile
    fclose(fid);
  endif
endfunction
+1
