Have you ever needed to do a large export from your EPM Cloud instance? You might be familiar with defining an export job. In this post, I will talk about splitting an existing export job into smaller chunks and running them in parallel.
Suppose the following export job is defined against your ASO cube, where the customer Accounts dimension is large.

```groovy
int numberOfExportFiles = 4

// Connect to the ASO cube
Cube cube = operation.application.getCube("ASOCubeName")

// Dimension object for the (large) customer accounts dimension
Dimension accdim = cube.getApplication().getDimension('Accounts')

// All level-0 customer accounts
List<Member> accs = accdim.getEvaluatedMembers('ILvl0Descendants(ALL Accounts)', cube)

// Split into at most numberOfExportFiles chunks. Ceiling division keeps the
// remainder: collate(size, false) would silently drop the last, shorter chunk
// whenever the list size is not evenly divisible by numberOfExportFiles.
int chunkSize = (int) Math.ceil(accs.size() / (double) numberOfExportFiles)
List<List<Member>> subaccs = accs.collate(chunkSize)

int i = 0
```
Let’s start our Groovy code with a few definitions:
numberOfExportFiles: the number of export jobs that will be triggered when the rule is executed.
cube: connection to the ASO cube. Update it with the name of the ASO cube you want to connect to.
accdim: the Dimension object. Its getEvaluatedMembers method evaluates a member selection function against the cube.
accs: the list of all level-0 customer accounts.
subaccs: a list of lists of accounts. The collate method splits the main list into smaller chunks, in this example four of them.
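To see how collate behaves on its own, here is a standalone sketch in plain Groovy (no EPM API needed; the account names are made up for illustration). Note the pitfall in the second half: passing false as the second argument drops a short remainder chunk, which in an export would mean silently losing accounts.

```groovy
// Ten fake account names standing in for the evaluated member list
def accounts = (1..10).collect { "Acct$it" }
int files = 4

// Ceiling division: no more than `files` chunks, and nothing is dropped
int chunkSize = (int) Math.ceil(accounts.size() / (double) files)
def chunks = accounts.collate(chunkSize)
println chunks.size()    // → 4
println chunks*.size()   // → [3, 3, 3, 1]

// Caution: collate(size, false) discards a trailing chunk smaller than `size`
def lossy = accounts.collate(chunkSize, false)
println lossy.size()     // → 3  (Acct10 was dropped)
```

This is why the chunk size is computed with a ceiling rather than plain integer division.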
```groovy
subaccs.each { subacc ->
    // Job parameters for this chunk of accounts
    def params = new JSONObject()
            .put("cubeName", "ASOCubeName")
            .put("rowMembers", subacc.join(','))
            .put("columnMembers", "column1,column2")
            .put("povMembers", "FY20,Working,Plan,USD,...")
            .put("exportFileName", "ExportFile" + i + ".zip")

    def body = new JSONObject()
            .put("jobType", "EXPORT_DATA")
            .put("jobName", "ExportAccounts")
            .put("parameters", params)
            .toString()

    // Submit the export job through the REST API; the call returns without
    // waiting for the export to finish, so the jobs run in parallel
    HttpResponse<String> jsonResponse = operation.application
            .getConnection("MyConnection")
            .post("/jobs")
            .body(body)
            .asString()

    i = i + 1
}
```
subaccs.each will loop four times in this example, as we set numberOfExportFiles to 4. Feel free to increase or decrease it to find the optimal export performance for your own application.
rowMembers: set automatically from the current chunk of the subaccs list, joined into a comma-separated string.
columnMembers: place members from your Measures dimension or equivalent dimension here.
povMembers: the POV references depend on your application and requirements.
MyConnection: must be a named connection defined in your instance, pointing back to the EPM Cloud environment itself, just like any other web-service connection.
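Since the REST call returns an HttpResponse, it is also worth checking the status code before moving on, so a rejected job does not fail silently. A minimal sketch; the 2xx check and the veto message are my own addition, not part of the original job definition:

```groovy
// Submit the export job and keep the response for inspection
HttpResponse<String> jsonResponse = operation.application
        .getConnection("MyConnection")
        .post("/jobs")
        .body(body)
        .asString()

// A successfully submitted job comes back with an HTTP 2xx status;
// anything else means the submission itself was rejected
if (!(200..299).contains(jsonResponse.status)) {
    throwVetoException("Export job submission failed: ${jsonResponse.statusText}")
}
```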

Once executed, you should see four export jobs triggered. Each of them exports a portion of the database as split by the Accounts dimension, and you will end up with four files named ExportFile0.zip, ExportFile1.zip, ExportFile2.zip, and ExportFile3.zip.
You can monitor how the export jobs are doing from the Job Console, and you can download the files from the Inbox/Outbox folder.
Two questions remain:
– Can we extend the Groovy logic to check the running export jobs for monitoring?
– Can we consolidate the four exported files and prepare a single zip file out of this export?
I suspect those will be topics for another post.