Tuesday, May 1, 2018

Hyperion application data backup automation

After thinking about it for many months, I am finally writing my first blog post.

Blogs written by technology consultants have helped me a lot over the past couple of years in my work, and I always felt that I was only taking benefit from this vast technical community without contributing my bit. So I want to give back by sharing the knowledge I have gained while working for my clients.


In my case Essbase is installed on a Windows server, so I have used a batch (.bat) file and Task Scheduler for the automation. If Essbase is installed on a Linux/Unix server in your environment, you can replicate the same steps with a shell script (.sh) and a Unix cron job.

Below are the steps I performed to automate the daily data backup for Hyperion applications (Essbase/Planning) in one of my client environments.

  • Create the folder structure below to keep a one-week backup history. In my case I take a data backup daily and retain it for one week; you can decide on your own backup strategy and set up the folder structure accordingly.
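For readers on Linux/Unix (as mentioned at the start of the post), a minimal shell sketch of this folder setup is below. `BACKUP_ROOT` and the `Sample` application name are illustrative placeholders, not paths from my environment:

```shell
#!/bin/sh
# Sketch only: base folder and application name are assumptions -- adjust per environment.
BACKUP_ROOT="${BACKUP_ROOT:-./hyperion_backup}"
APP_NAME="${APP_NAME:-Sample}"

# One folder per day of the week for the one-week history, plus a "Processing"
# folder used as a staging area for freshly exported files.
for day in Sun Mon Tue Wed Thu Fri Sat Processing; do
    mkdir -p "$BACKUP_ROOT/$day/$APP_NAME"
done
```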



  • If you want a full data backup, you can simply use the MaxL commands below to extract either the entire application data or the level-0 data, as per your need:
********************************************************
          /* Exporting Level 0 application data  */


login '<username>' '<password>' on '<server name>'; 

alter application <application name> load database <db name>;

export database <application name>.<db name> level0 data to data_file 'D:\<data_file_path>\<data_file_name>';

alter application <application name> unload database <db name>;

*********************************************************

*********************************************************

 /* Exporting all application data  */


login '<username>' '<password>' on '<server name>'; 

alter application <application name> load database <db name>;

export database <application name>.<db name> all data to data_file 'D:\<data_file_path>\<data_file_name>';

alter application <application name> unload database <db name>;

*********************************************************


  • In my case I only need a backup of the current year's data, for which I have set up the calculation script below:
*********************************************************
SET DATAEXPORTOPTIONS
{
DataExportLevel "ALL";
DataExportDynamicCalc OFF;
/*DataExportDecimal n;
DataExportPrecision n; 
DataExportColFormat ON;*/
DataExportColHeader "Period";  
DataExportDimHeader OFF;
DataExportOverwriteFile ON;
DataExportRelationalFile ON;
};

FIX ("<current year>")
DATAEXPORT "File" "|" "D:\<exported data file path>\<exported_data_file_name>.txt" "NULL";
ENDFIX

*********************************************************




  • Next, create a small MaxL script that calls the above calculation script to export the portion of data you need:
*********************************************************

/* Exporting current year application data  */

spool on to 'F:\<log path>\<log_name>.txt';


login '<username>' '<password>' on '<server name>'; 

alter application <application name> load database <db name>;

execute calculation <application name>.<db name>.<calculation script>;

alter application <application name> unload database <db name>;

spool off;
logout;


*********************************************************





  • Now it's time for the fun part: let's automate the daily backups. Create a batch file named ApplicationDataBkp.bat in the folder where you set up the per-day folder structure and the MaxL script above.



  • Open ApplicationDataBkp.bat in a text editor and embed the code below. I am using "Sample" as the application name and "Basic" as the database name in this example:
******************************************************************************


SET Drive=D:

REM SET Gen=%2
REM First three characters of %DATE% give the day name (format is locale-dependent)
SET HWeek=%DATE:~0,3%
if %HWeek% == Sun Set Gen=Sun
if %HWeek% == Mon Set Gen=Mon
if %HWeek% == Tue Set Gen=Tue
if %HWeek% == Wed Set Gen=Wed
if %HWeek% == Thu Set Gen=Thu
if %HWeek% == Fri Set Gen=Fri
if %HWeek% == Sat Set Gen=Sat

REM Calling MXL to export the data from Sample application

call D:\Oracle\Middleware\user_projects\epmsystem1\EssbaseServer\essbaseserver1\bin\StartMaxl.bat D:\<backup folder path>\Export.mxl <username> <password> <server_name>

REM Zipping the exported data files

@ECHO ON

CD /D "<path for 7 zip client installed>"
7z.exe a "D:\<backup folder path>\Processing\<application name>\<backup_zip_file_name>.zip" "D:\<backup folder path>\Processing\<application name>\*.txt"

REM Copying file into respective day folder
move D:\<backup folder path>\Processing\<application name>\*.zip D:\<backup folder path>\%Gen%\<application name>\

REM removing text file after zip file transfer
del D:\<backup folder path>\Processing\<application name>\*.txt



*********************************************************************************

In the above example I have used "D" as the storage drive. You need to replace the backup folder path, application name, and db name as per your environment.


Let's understand what I am trying to achieve in the above script:


  1. First, I define a variable based on the day the script runs, so the file copy targets the correct day folder.
  2. Then I call the MaxL script created in the prior step to export the data.
  3. The data exported by MaxL is initially stored in the "Processing" folder under the specific application-name folder; these files are huge (in my case 17 files of 2 GB each, so 34 GB in total for the current year).
  4. Next, for space management, I zip the data files before storing them in the respective day folder.
  5. Once the files are zipped, I move them into the respective day folder and clear the Processing folder by deleting the unzipped files.
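For the Linux/Unix route mentioned earlier, the same five steps can be sketched in shell. Everything here is an illustrative assumption: the paths, the `Sample` application name, the `startMaxl.sh` call (commented out), and the use of tar/gzip in place of 7-Zip:

```shell
#!/bin/sh
# Sketch of the batch flow above for Linux/Unix; all paths are placeholders.
BACKUP_ROOT="${BACKUP_ROOT:-./hyperion_backup}"
APP_NAME="${APP_NAME:-Sample}"

run_backup() {
    staging="$BACKUP_ROOT/Processing/$APP_NAME"
    day=$(date +%a)                      # Sun, Mon, ... selects the history folder
    mkdir -p "$staging" "$BACKUP_ROOT/$day/$APP_NAME"

    # Call MaxL to export the data (hypothetical path -- adjust per install):
    # "$ESSBASE_BIN/startMaxl.sh" "$BACKUP_ROOT/Export.mxl" "$USER" "$PASS" "$SERVER"

    # Compress the exported text files straight into the day folder
    # (tar/gzip stands in for 7-Zip here), then clear the staging area.
    tar -czf "$BACKUP_ROOT/$day/$APP_NAME/backup_$(date +%F).tar.gz" \
        -C "$staging" .
    rm -f "$staging"/*.txt
}

run_backup
```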



  • Finally, create a Task Scheduler job that calls the script below to schedule the daily backup.

"D:\<backup folder path>\ApplicationDataBkp.bat" >D:\<backup folder path>\logs\Backup.log



Please do share your feedback about this post; it will help me improve my upcoming posts.




