I wrote a script file that saves its output to a CSV file for later reference, but the second script to import data requires an inconvenient amount of time to read it back.
The data is in the following format:
```
Item1,val1,val2,val3
Item2,val4,val5,val6,val7
Item3,val8,val9
```
where the headers are in the leftmost column and the data values occupy the rest of the row. One of the main difficulties is that the array of data values can have a different length for each test item. I would save it as a structure, but I need to be able to edit it outside the MATLAB environment, since I sometimes have to delete rows of bad data on a computer that doesn't have MATLAB installed. So my first question is: should I be saving the data in a different format?
Second part of the question: I tried importdata, csvread, and dlmread, but I'm not sure which is best, or whether there is a better solution. Right now I'm using my own script with a loop and fgetl, which is terribly slow for large files. Any suggestions?
```
function [data,headers] = csvreader(filename) %V1_1
fid = fopen(filename,'r');
data = {};
headers = {};
count = 1;
while 1
    textline = fgetl(fid);
    if ~ischar(textline), break, end    % end of file
    % peel characters off the front of the line, one at a time,
    % until the first comma, to build up the header string
    nextchar = textline(1);
    idx = 1;
    while nextchar ~= ','
        headers{count}(idx) = textline(1);
        idx = idx + 1;
        textline(1) = [];
        nextchar = textline(1);
    end
    textline(1) = [];                   % drop the comma itself
    data{count} = str2num(textline);    % remaining values as a numeric row
    count = count + 1;
end
fclose(fid);
```
(I know that this is probably horribly written code - I am an engineer, not a programmer, please do not shout at me, however, any suggestions for improvement would be welcome.)
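For reference, here is one possible faster sketch of the same reader (this is my own assumption of an alternative, not part of the original script, and csvreader_fast is a hypothetical name): reading the whole file at once with fileread and splitting each line on commas avoids the character-by-character inner loop, while still allowing a different number of values per row.

```
function [data, headers] = csvreader_fast(filename)
% Hypothetical vectorized variant; assumes the whole file fits in memory.
raw = fileread(filename);                   % entire file as one char array
lines = regexp(strtrim(raw), '\r?\n', 'split');  % one cell per row
n = numel(lines);
headers = cell(1, n);
data = cell(1, n);
for k = 1:n
    parts = strsplit(lines{k}, ',');        % header followed by the values
    headers{k} = parts{1};
    data{k} = str2double(parts(2:end));     % numeric row of any length
end
```

Each cell of data still holds a row of a different length, so the output shape should match the original loop's.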
file-io matlab csv data-import
Doresoom