Monday, May 21, 2018

Error Connecting to Planning Application from Smartview After security changes

During my work for a recent client I came across the below issue multiple times, so I thought of documenting the solution.

Issue Description:

If you update Planning security for a user, such as adding or removing a role, or adding or removing a group, then the user can face the below issue while connecting to the Planning application through Smartview.

"Cannot open cube view. Essbase Error(1054060): Essbase failed to select application <Application Name> because <username@Directory> is not completely provisioned by Planning."





Issue Cause: The aforementioned issue occurs because Essbase security is no longer in sync with Planning (Shared Services) after security changes have been made in Shared Services for the specific Planning application. (This is a product bug.)

Issue Resolution:

To resolve this issue you have to delete the user from Essbase and then refresh security from Planning to Essbase.

Please find the detailed steps below:


  • Log in to the EAS console and open the MaxL Script Editor


  • Query the user using the below command
display user <username>;




  • Delete the user from Essbase using the below command
drop user <username@provider> from security_file;




  • Refresh security by logging into Planning application through Workspace



  • Log in to Smartview to confirm the issue is resolved
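For reference, the query and delete steps above can be combined into a single MaxL script; the credentials and user name below are placeholders to substitute for your environment:

```
/* Sketch: verify the user exists, then remove the out-of-sync user from the Essbase security file */
login '<admin_user>' '<password>' on '<essbase_server>';
display user '<username@provider>';
drop user '<username@provider>' from security_file;
logout;
```

After running this, the security refresh from Planning recreates the user in Essbase with the correct provisioning.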



Friday, May 4, 2018

Hyperion metadata backup automation

In my last post, Hyperion Data Backup Automation, I explained a method to automate data backups for Essbase/Planning applications.

In this post I am going to explain how to automate the metadata backup of all Hyperion components (for example: rule files, the Essbase outline, and Planning components like forms, business rules, custom menus, the Planning outline, calculation scripts, etc.) other than data.

There can be multiple approaches depending on the component you want to back up: Planning dimension hierarchies can be backed up using the Outline Load utility, forms with FormDefUtil, and security with the Import/Export Security utility. However, the method I am demonstrating can be used to back up all or selective components using the LCM command line utility.

The LCM command line utility can be found under your EPM instance bin folder, as below:
D:\Oracle\Middleware\user_projects\epmsystem1\bin\Utility.bat

Before starting with automation, let's understand how a manual LCM export works by logging into Shared Services. For demonstration I am taking one Planning application as an example, as it also covers all Essbase components:

To export Planning/Essbase artifacts for backup or migration purposes, you need to log in to Hyperion Shared Services. Under Application Groups, expand the Planning folder and click the application you want to export. After selecting the application you can see multiple checkboxes on the right to select the artifacts you want to export. You can expand each category and select the artifacts you want, leaving the Data option unchecked, as we don't want to export data using LCM.




After you make the selections, click Export and provide the export folder name to begin exporting artifacts. All artifacts will be exported into the folder name you have provided, under the below path, which is the import_export folder you configured at the time of application configuration:

D:\Oracle\Middleware\user_projects\epmsystem1\import_export

Navigate to the above path and open the application export folder. You will see two XML files under that folder, named Export.xml and Import.xml. Each file holds the list of components you have exported and their respective paths in the exported folder.




Automating above LCM export:

Set up a folder named Automation where you will write a script to automate the export. You need to perform a first-time manual setup; to do so, please follow the below steps.

  • Open a command prompt and execute the below command

D:\Oracle\Middleware\user_projects\epmsystem1\bin\Utility.bat D:\Oracle\Middleware\user_projects\epmsystem1\import_export\<Exported folder>\Export.xml

Note: Modify the directory path in the above command as per your environment and replace <Exported Folder> with the folder name you exported with LCM in the previous step.

  • After entering the above command, the system will ask you to provide a username and password to perform the LCM export. In my case I provided credentials for the Admin user.



  • As soon as you hit Enter, the username and password you provided are embedded into the below file. You have to act very quickly here: as soon as you hit Enter, copy the Export.xml file from the below path into the Automation folder, because once the export completes, Export.xml is overwritten again and you will lose the encrypted credentials in the XML file.

D:\Oracle\Middleware\user_projects\epmsystem1\import_export\<Exported folder>\Export.xml

  • The above command exports all artifacts, the same as a manual LCM export through Shared Services, and the exported artifacts will be present in the below folder, which you can later use to import back through LCM.
D:\Oracle\Middleware\user_projects\epmsystem1\import_export\<Exported folder>\
  • To double-check, you can now compare the two files to make sure credentials are present in the Export.xml that was copied over to the Automation folder
The Export.xml present in the Automation folder, which contains the username and encrypted password:



The Export.xml present in D:\Oracle\Middleware\user_projects\epmsystem1\import_export\<Exported folder>\ after the export is completed, which does not contain credentials:
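Instead of opening both files, a quick command-line check is possible with findstr; this is a sketch using the same placeholder paths as above, and the search string is an assumption, so adjust it to whatever credential element you see when you open the file:

```
REM Hypothetical check: look for embedded credentials in each copy of Export.xml
findstr /i "password" D:\Automation\Export.xml
findstr /i "password" D:\Oracle\Middleware\user_projects\epmsystem1\import_export\<Exported folder>\Export.xml
```

If the string matches only in the Automation copy, the credentials were captured before the overwrite.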



  • Now rename the XML file in the Automation folder (the one that contains credentials) to prepend the application name to it. In the next step you will create a batch script to automate the export you just performed manually. Renaming the file is not mandatory, but if you are automating backups for multiple applications in your environment, you will need to repeat all the above steps for each application, and all the XML files will reside in the Automation folder, so you need to give each file a unique application-based name.

  • Next, create a batch file and embed the below commands in it.

copy D:\Automation\XML\Export_<application_name>.xml D:\Oracle\Middleware\user_projects\epmsystem1\import_export\<application folder>\Export.xml

D:\Oracle\Middleware\user_projects\epmsystem1\bin\Utility.bat D:\Oracle\Middleware\user_projects\epmsystem1\import_export\<application folder>\Export.xml

Note: In the above script you are copying the XML file from the Automation folder to the application folder under the import_export directory and then calling the LCM utility to launch the export. When running this script you don't need to provide credentials, as they are embedded in the XML file you are copying to the application folder.

  • Now just schedule the above batch file using Task Scheduler as per your backup frequency, and you are all set.
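As an example, a daily schedule can also be created from the command line with schtasks; the task name, run time, and script path below are assumptions for illustration:

```
REM Hypothetical: run the LCM export batch daily at 2:00 AM
schtasks /Create /TN "LCM_Metadata_Backup" /TR "D:\Automation\LCMExport.bat" /SC DAILY /ST 02:00
```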








Tuesday, May 1, 2018

Hyperion application data backup automation

After thinking about it for many months, I am finally writing my first blog post.

Blogs written by technology consultants have helped me a lot over the past couple of years during my work, and I always felt that I was only taking benefit from this vast technical community without contributing my bit. Hence I wanted to contribute by helping others with the knowledge I have gained while working for my clients.


In my case Essbase is installed on a Windows server, so I have used a batch (.bat) file and Task Scheduler for automation, but you can replicate the same steps in a shell script (.sh) with a Unix cron job if Essbase is installed on a Linux/Unix server in your case.

Please find below the steps I performed to automate the daily data backup for Hyperion applications (Essbase/Planning) in one of my client environments.

  • Create the below folder structure to keep a backup history for one week. In my case I am taking data backups on a daily basis and maintaining them for one week. You can decide your own data backup strategy and set up the folder structure accordingly.
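The weekly structure described above can be created once with a short batch snippet; the root path and application name are placeholders matching the ones used in the backup script later in this post:

```
REM Sketch: one Processing folder plus a folder per weekday, each with an application subfolder
SET Root=D:\<backup folder path>
mkdir "%Root%\Processing\<application name>"
for %%D in (Sun Mon Tue Wed Thu Fri Sat) do mkdir "%Root%\%%D\<application name>"
```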



  • If you want to take a full data backup, you can simply use the below MaxL commands to extract the entire application data or level-0 data as per your need:
********************************************************
          /* Exporting Level 0 application data  */


login '<username>' '<password>' on '<server name>'; 

alter application <application name> load database <db name>;

export database <application name>.<db name> level0 data to data_file 'D:\<data_file_path>\<data_file_name>';

alter application <application name> unload database <db name>;

*********************************************************

*********************************************************

 /* Exporting all application data  */


login '<username>' '<password>' on '<server name>'; 

alter application <application name> load database <db name>;

export database <application name>.<db name> all data to data_file 'D:\<data_file_path>\<data_file_name>';

alter application <application name> unload database <db name>;

*********************************************************


  • In my case I only need a current-year data backup, for which I have set up the below calculation script:
*********************************************************
SET DATAEXPORTOPTIONS
{
DataExportLevel "ALL";
DataExportDynamicCalc OFF;
/*DataExportDecimal n;
DataExportPrecision n; 
DataExportColFormat ON;*/
DataExportColHeader "Period";  
DataExportDimHeader OFF;
DataExportOverwriteFile ON;
DataExportRelationalFile ON;
};

FIX ("<current year>")
DATAEXPORT "File" "|" "D:\<exported data file path>\<exported_data_file_name>.txt" "NULL";
ENDFIX;

*********************************************************




  • Next you need to create a small MaxL script to call the above calculation script, exporting a portion of the data as per your need:
*********************************************************

/* Exporting current year application data  */

spool on to 'F:\<log path>\<log_name>.txt';


login '<username>' '<password>' on '<server name>'; 

alter application <application name> load database <db name>;

execute calculation <application name>.<db name>.<calculation script>;

alter application <application name> unload database <db name>;

spool off;
logout;


*********************************************************





  • Now it's time for the fun part: let's automate the daily backups. Create a batch file named ApplicationDataBkp.bat in the folder where you have set up the folder structure for each day of the week and the above MaxL script to extract the data.



  • Open ApplicationDataBkp.bat in a text editor and embed the below code. I am using "Sample" as the application name and "Basic" as the db name in the below example:
******************************************************************************


SET Drive=D:

REM SET Gen=%2
REM Note: the %DATE% format is locale-dependent; confirm its first three characters are the day abbreviation on your server
SET HWeek=%DATE:~0,3%
if %HWeek% == Sun Set Gen=Sun
if %HWeek% == Mon Set Gen=Mon
if %HWeek% == Tue Set Gen=Tue
if %HWeek% == Wed Set Gen=Wed
if %HWeek% == Thu Set Gen=Thu
if %HWeek% == Fri Set Gen=Fri
if %HWeek% == Sat Set Gen=Sat

REM Calling MaxL to export the data from the Sample application

call D:\Oracle\Middleware\user_projects\epmsystem1\EssbaseServer\essbaseserver1\bin\StartMaxl.bat D:\<backup folder path>\Export.mxl <username> <password> <server_name>

REM Zipping exported data files

@ECHO ON

CD /D "<path for 7 zip client installed>"
7z.exe a "D:\<backup folder path>\Processing\<Application Name>\<Backup_zip_file_name>.zip" "D:\<backup folder path>\Processing\<application name>\*.txt"

REM Copying file into respective day folder
move D:\<backup folder path>\Processing\<application name>\*.zip D:\<backup folder path>\%Gen%\<application name>\

REM removing text file after zip file transfer
del D:\<backup folder path>\Processing\<application name>\*.txt



*********************************************************************************

In the above example I have considered the storage drive name as "D". You need to replace the backup folder path, application name, and db name as per your environment.


Let's understand what I am trying to achieve in the above script:


  1. First of all, I am defining a variable based on the day the script runs, to select the correct folder name during the file copy.
  2. Then I am calling the MaxL script created in the prior step to export the data.
  3. Data exported by the MaxL script is initially stored in the "Processing" folder under the specific "application name" folder, and it will be huge in size (in my case 17 files of 2 GB each, so 34 GB in total for the current year).
  4. Next, for space management, I am zipping the data files before storing them in the respective day folder.
  5. Once the files are zipped, I am moving them into the respective day folder and clearing the Processing folder by deleting the unzipped files.



  • Finally, you can create a Task Scheduler job that calls the below command to schedule the daily backup.

"D:\<backup folder path>\ApplicationDataBkp.bat" >D:\<backup folder path>\logs\Backup.log



Please do share your feedback about this post; it will help me improve my upcoming posts.