How to automate the download of weekly export service files

In Salesforce, you can schedule up to weekly “backups” / dumps of your data here: Setup > Administration > Data Management > Data Export

If you have a large Salesforce database, there can be a significant number of files to download.

Does anyone have a best practice, tool, batch file or trick to automate this process or make it a little less manual?

+7
6 answers

The last time I checked, there was no way to access the backup file status (or the actual files) via the API. I suspect they made this process difficult to automate by design.

I use the Salesforce scheduler to prepare the files weekly, then I have a scheduled task that runs on a local server and downloads the files. Assuming you have the ability to automate/script some web requests, here are the steps you can use to download the files (a rough Python sketch of the request flow follows the list):

  • Get an active Salesforce session ID / token
    • Enterprise API - SOAP login() method
  • Get your organization ID ("org ID")
    • Setup > Company Profile > Company Information, OR
    • use the getUserInfo() SOAP API call to get your org ID
  • Send an HTTP GET request to https://{your sf.com instance}.salesforce.com/ui/setup/export/DataExportPage/d?setupid=DataManagementExport
    • Set the request cookie as follows:
      • oid={your org ID}; sid={your session ID};
  • Parse the resulting HTML for instances of <a href="/servlet/servlet.OrgExport?fileName=
    • (the file name begins after fileName=)
  • Plug the file names into this URL to download (and save):
    • https://{your sf.com instance}.salesforce.com/servlet/servlet.OrgExport?fileName={filename}
    • Use the same cookie as in step 3 when downloading the files
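
For illustration, here is a rough Python sketch of steps 3-5 (untested). It assumes you have already obtained a session ID, for example from the SOAP login() call; the instance name, org ID, session ID, and regex below are placeholders that depend on your org and the current page markup, so treat it as a starting point rather than a finished tool.

    # Sketch of the scraping approach described above (placeholders throughout).
    import re
    import requests

    INSTANCE = "na1"              # placeholder: your Salesforce instance
    ORG_ID = "00Dxxxxxxxxxxxxxx"  # placeholder: org ID from Setup or getUserInfo()
    SESSION_ID = "..."            # placeholder: session ID from the SOAP login() call

    base = f"https://{INSTANCE}.salesforce.com"
    cookies = {"oid": ORG_ID, "sid": SESSION_ID}

    # Step 3: request the Data Export page with the oid/sid cookies set.
    page = requests.get(
        base + "/ui/setup/export/DataExportPage/d?setupid=DataManagementExport",
        cookies=cookies,
    )
    page.raise_for_status()

    # Step 4: scrape the export file names out of the returned HTML.
    file_names = re.findall(r'servlet\.OrgExport\?fileName=([^"&]+)', page.text)

    # Step 5: download each file with the same cookies, streaming to disk.
    for name in file_names:
        resp = requests.get(
            base + "/servlet/servlet.OrgExport",
            params={"fileName": name},
            cookies=cookies,
            stream=True,
        )
        resp.raise_for_status()
        with open(name, "wb") as out:
            for chunk in resp.iter_content(chunk_size=1 << 20):
                out.write(chunk)
        print(f"downloaded {name}")

Once something like this works against your org, the same flow can be run from cron or the Windows Task Scheduler alongside the weekly export schedule.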

This is by no means best practice, but it gets the job done. Of course, if they change the layout of the page in question, it will probably stop working. Hope this helps.

+8

I'm Naomi, CMO and co-founder of cloudHQ, so I feel like this is a question that I probably should answer. :-)

cloudHQ is a SaaS service that syncs your cloud services. In your case, you would never need to download your reports as a Salesforce data export; instead you would always have a backup of them in a folder labeled "Salesforce Reports" in whichever service you sync Salesforce with, for example Dropbox, Google Drive, Box, Egnyte, SharePoint, etc.

The service is not free, but there is a 15-day free trial. To date, there is no other service that actually syncs your Salesforce reports with other cloud storage providers in real time.

You can try it here: https://cloudhq.net/salesforce

Hope this helps you!

Cheers, Naomi

+6

A script for downloading Salesforce backup files is available at https://github.com/carojkov/salesforce-export-downloader/

It is written in Ruby and can be run on any platform. The supplied configuration file contains fields for your username, password, and download location.

With a little configuration you can automate your downloads. The script sends email notifications on completion or failure.

It's simple enough to figure out the sequence of steps required to write your own program if the Ruby solution does not work for you.

+5

Be careful to check what you actually get in the backup file. The backup is a ZIP archive of 65 different CSV files. It's raw data that cannot be used very easily outside of the Salesforce UI.
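
If you want a quick look at what came down, here is a small Python example (the ZIP file name is a made-up placeholder) that lists the CSV files inside one of the downloaded archives and shows their column headers:

    # List each CSV inside a downloaded export ZIP and print its header row.
    import csv
    import io
    import zipfile

    with zipfile.ZipFile("WE_00Dxxxxxxxxxxxxxx_1.ZIP") as archive:  # placeholder file name
        for name in archive.namelist():
            if not name.lower().endswith(".csv"):
                continue
            with archive.open(name) as raw:
                header = next(csv.reader(io.TextIOWrapper(raw, encoding="utf-8")), [])
            print(f"{name}: {len(header)} columns, e.g. {header[:5]}")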

+2

Our company makes the free DataExportConsole tool to fully automate the process. You do the following:

  • Automate the weekly data export with the Salesforce scheduler
  • Use the Windows Task Scheduler to run FuseIT.SFDC.DataExportConsole.exe with the correct parameters.
+2

I recently wrote a small PHP utility that uses the Bulk API to download a copy of the sObjects you define in a JSON configuration file.

It is quite simple, but it can be easily expanded to suit your needs.

Force.com Replicator on GitHub.

0