Large data set exceeding maximum execution time

I have a script that loads the contents of a spreadsheet into ScriptDb, but the sheet has about 15,000 rows and 9 columns, and the script keeps failing with the error "Exceeded maximum execution time".

I use the function from the Google documentation to load the data:

function loadDatabaseFromSheet() {
  var spreadsheet = SpreadsheetApp.openById(SPREADSHEET_KEY);
  var sheet = spreadsheet.getActiveSheet();
  var data = sheet.getDataRange().getValues();
  var keys = data[0]; // first row holds the column names
  var db = ScriptDb.getMyDb();
  // Build one object per row, keyed by column name, and save each one
  for (var row = 1; row < data.length; row++) {
    var rowData = data[row];
    var item = {};
    for (var column = 0; column < keys.length; column++) {
      item[keys[column]] = rowData[column];
    }
    db.save(item);
  }
}

Is there a way to speed things up, or will I just have to break it into chunks of several thousand rows?

1 answer

Calling db.save(item) 15,000 times is what makes this slow. Use a bulk operation instead when saving that much data:

  var db = ScriptDb.getMyDb();
  var items = [];
  // Build the full array of objects first...
  for (var row = 1; row < data.length; row++) {
    var rowData = data[row];
    var item = {};
    for (var column = 0; column < keys.length; column++) {
      item[keys[column]] = rowData[column];
    }
    items.push(item);
  }
  // ...then save them all in a single bulk call
  db.saveBatch(items, false);
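(As far as I recall from the ScriptDb documentation, the second argument to saveBatch is the atomic flag: false lets the batch partially succeed even if some records fail to save, while true makes it all-or-nothing.)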

Issuing a single save operation at the end eliminates the round-trip overhead of 15,000 individual calls, so this should speed up your code enough to finish before it exceeds the maximum execution time.
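If a single saveBatch call over ~15,000 rows ever proves too large in practice, a middle ground is to flush in chunks. Here is a minimal sketch, assuming the same data and keys variables as above; CHUNK_SIZE is an arbitrary tuning value I chose for illustration, not an official ScriptDb limit.

  // Sketch: chunked bulk saves. CHUNK_SIZE is an arbitrary tuning
  // value, not a documented ScriptDb limit.
  var CHUNK_SIZE = 2500;
  var db = ScriptDb.getMyDb();
  var items = [];
  for (var row = 1; row < data.length; row++) {
    var rowData = data[row];
    var item = {};
    for (var column = 0; column < keys.length; column++) {
      item[keys[column]] = rowData[column];
    }
    items.push(item);
    // Flush once the buffer reaches the chunk size
    if (items.length === CHUNK_SIZE) {
      db.saveBatch(items, false);
      items = [];
    }
  }
  // Save whatever is left over
  if (items.length > 0) {
    db.saveBatch(items, false);
  }

This keeps each call to ScriptDb a manageable size while still avoiding the per-row round trips of db.save.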
