If you have jumped into Groovy Calculations, one of the things you have likely tried to do is grab the value of a substitution variable. Hopefully, you haven't spent too much time on it before reading this. I wasted a ton of time trying to get this to work before I opened a ticket with Oracle. This class is NOT available yet and was inadvertently included in the public docs at https://docs.oracle.com/cloud/latest/epm-common/GROOV/. The development team told me they are going to remove it from the API docs.
Introduction
Groovy provides a very easy way to interact with the user via run time prompts, or RTPs. These can be linked to dimensions, members, and input. One of the huge benefits of getting RTPs in Groovy is that the result can be validated, and the calculation can be cancelled if the input doesn't validate (we will touch on this in a future post).
The Solution
This is one of the easier things to do with a Groovy calculation. Two things are required. First, the Groovy calculation must prompt the user to select a value, which is done by adding the following to the script.
/*RTPS: {RTP_Consolidate_Data}*/
At any point in the script after the above, the value can be used. If it is going to be used multiple times, it might be easier to set a variable. Regardless of the approach, the value can be referenced using the rtps object as follows.
String sRTP
sRTP = rtps.RTP_Consolidate_Data.toString()
That is all that is required!
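Putting the two pieces together, a minimal sketch might look like the following; the prompt name comes from the example above, and the Yes/No comparison is hypothetical (it assumes the RTP is tied to a Yes/No Smart List).

/*RTPS: {RTP_Consolidate_Data}*/

// Grab the prompt value once so it can be reused throughout the rule
String sRTP = rtps.RTP_Consolidate_Data.toString()

// Hypothetical use: only continue when the user answered Yes
if (sRTP == "Yes") {
    // ... build and execute the consolidation logic here ...
}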
Conclusion
Beyond the obvious uses of an RTP, I have started using these for a number of other reasons.
- On global forms where multiple values may change within a short period of time and each save executes a long-running calculation, like an allocation, I have seen benefits from prompting the user with a yes/no Smart List RTP. If the user has more changes coming, they may not need to execute the calculation after every save; this gives them the option.
- If there is a requirement where some prompts are dependent on other prompts, using RTPs in Groovy gives you the flexibility to validate the combination. For example, if an employee is set to hourly with a VP title, the combination can be validated and flagged back to the user as invalid before the prompts are dismissed (a hedged sketch of this follows the list).
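As an illustration of the second point, here is a hedged sketch of that kind of cross-prompt check. The prompt names (RTP_PayType, RTP_Title) and the business rule are hypothetical, and it assumes the messageBundle, messageBundleLoader, and throwVetoException helpers available to Groovy rules, so verify against the API documentation for your release.

/*RTPS: {RTP_PayType} {RTP_Title}*/

String payType = rtps.RTP_PayType.toString()
String title = rtps.RTP_Title.toString()

// Hypothetical rule: an hourly employee cannot carry a VP title
if (payType == "Hourly" && title == "VP") {
    def bundle = messageBundle(["validation.invalidCombo":"An hourly employee cannot be assigned a VP title."])
    // Cancel the calculation and return the message to the user
    throwVetoException(messageBundleLoader(["en": bundle]), "validation.invalidCombo")
}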
A bug with EPM Automate has been identified. It does not reproduce on every version or for every client, so pay attention to any EPM Automate updates you install. In the past, I was able to install the latest version without any issues. Currently, the installer prompts users to uninstall the older version. Previously, this worked as expected, but now, when selected, it has no effect and the new EPM Automate is NOT installed, leaving you with the existing version.
Introduction
With the introduction of Groovy Calculations this summer, one of the features I use most, especially on applications with data forms that include a large sparse dimension in the rows with suppression on, is the ability to loop through the grid and identify the POV of only the cells that have changed.
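For a sense of what that looks like, below is a hedged sketch that collects the members behind only the edited cells; it assumes the dataCellIterator predicate and the memberNames helper exposed to form-attached Groovy rules, so treat it as a starting point rather than a finished rule.

// Gather the members behind only the cells the user actually changed
def editedMembers = [] as Set
operation.grid.dataCellIterator({ DataCell cell -> cell.edited }).each { cell ->
    editedMembers.addAll(cell.memberNames)
}

// editedMembers can then be used to build a focused FIX statement
// instead of calculating the entire sparse dimension
println("Changed POV members: " + editedMembers)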
Introduction
We all know the data form validation rules are serviceable, but they are not robust. When Smart View advanced and forms could be opened in Excel, the validation logic developers had built in JavaScript became useless. Since then, we have really missed the ability to communicate with the user interactively, with visual cues and validation rules that halt the saving of data. Well, Groovy calculations to the rescue!
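To give a flavor of that interactivity, here is a hedged sketch that flags edited cells breaking a made-up rule; it assumes the addValidationError method on the grid's data cells (background color, message, and a final boolean flag, following the published samples), so confirm the exact signature in the current API docs.

// Highlight any edited cell with a negative value and attach a validation message
operation.grid.dataCellIterator({ DataCell cell -> cell.edited }).each { cell ->
    if (cell.data < 0) {
        // Hypothetical rule: negative values are not allowed on this form
        cell.addValidationError(0xFF0000, "Negative values are not allowed here.", false)
    }
}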
What Is Groovy
Recently, Groovy scripting was added to ePBCS business rules as an alternative to the GUI, or to the go-to calc scripting for you old-timers who still refuse to change. These are defined in the Business Rule editor as Groovy calculations. So, what is Groovy?
If you are a fan of HsGetValue and HsSetValue, you are probably using a private connection. As you know, anybody who uses the template has to either change the connection string to their own predefined private connection or set up a private connection with the same name. When dealing with inexperienced users, both methods can be problematic.
You may be surprised to know that the Get and Set Value functions can use a shared connection.
Problem
I am currently working with a client that is updating a Planning application, and one of the changes is to remove a dimension. After the new application was set up and the hierarchies were modified to meet the objectives, migrating artifacts was the next step. As many of you know, if you try to migrate web forms and composite forms, they will error during the migration due to the additional dimension in the LCM file. It wouldn't be a huge deal to edit a few XML files, but when there are hundreds of them, it is extremely time consuming (and boring, which is what drove me to create this solution).
Assumptions
To fully understand this article, a basic understanding of XML is recommended. The example below assumes an LCM extract was run on a Planning application and it will be used to migrate the forms to the same application without a CustomerSegment dimension. It is also assumed that the LCM extract has been downloaded and decompressed.
Solution
I have been learning and implementing PowerShell scripts for the last six months and am amazed by how easy it is to complete complex tasks. So, PowerShell was my choice for modifying these XML files in bulk.
It would be great to write a long article about how smart this solution is and overwhelm you with my wit, but there is not much to it. A few lines of PowerShell will loop through all the files and remove the XML tags related to a predefined dimension. So, let's get to it.
Step 1 – Understand The XML
There are two folders of files we will work with. Forms are under the plan type, and composite forms are under Global Artifacts. Both are located inside the resource folder. If there are composite forms that hold the dimension in question as a shared dimension, both areas will need to be updated. Scripts are included to update both.
Inside each of the web form files is a tag for each dimension, and its location varies based on whether the dimension is in the POV, page, column, or row. In this particular example, the CustomerSegment dimension is in the POV section. What we want to accomplish is removing the <dimension/> tag whose name attribute is equal to CustomerSegment.
For the composite forms, the XML tag is slightly different, although the concept is the same. The tag in composite form XML files is <sharedDimension/> and the attribute is dimension, rather than name.
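If you want to prove out the removal logic before pointing it at real LCM files, the toy PowerShell example below runs the same SelectNodes/RemoveChild approach used in the full script against a made-up XML fragment; it only borrows the tag and attribute names described above, not the actual LCM schema.

# Toy example: remove <dimension name="CustomerSegment"/> from a simplified, made-up document
[xml]$sample = @"
<form>
  <dimension name="CustomerSegment"/>
  <dimension name="Entity"/>
</form>
"@

$sample.SelectNodes("//dimension") |
    Where-Object { $_.name -eq "CustomerSegment" } |
    ForEach-Object { [void]$_.ParentNode.RemoveChild($_) }

# Only the Entity dimension tag should remain
$sample.OuterXml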
Step 2 – Breaking Down the PowerShell
The first piece of the script sets a few variables so the script can be changed quickly and reused wherever and whenever it is needed. The first variable is the path of the Data Forms folder to operate on. The second is the dimension to be removed.
# Identify the source of the Data Forms folder and the dimension to be removed
# (point $lcmSourceDir at the plan type's Data Forms folder inside the extract's resource folder)
$lcmSourceDir = "Z:\Downloads\KG04\HP-SanPlan\resource\Plan Type\Plan1\Data Forms"
$dimName = "CustomerSegment"
The next piece of the script recurses through the folder and stores the files in an array. A where statement excludes directories so the code only executes on files.
# List all files, recursively, that exist in the path above
$files = Get-ChildItem $lcmSourceDir -Recurse | where {$_.Attributes -notmatch 'Directory'} |
Step 3 – Removing The Unwanted Dimension
The last section of the script does most of the work. It loops through each file in the $files array and:
- Opens the file
- Loops through all of the tags and deletes any <dimension/> tag whose name attribute equals the $dimName variable
- Saves the file
# Loop through the files and find an XML tag equal to the dimension to be removed
Foreach-Object {
    # Read the file as an XML document
    [xml]$xml = Get-Content $_.FullName
    # Select every dimension tag whose name attribute matches the dimension to remove
    $node = $xml.SelectNodes("//dimension") |
        Where-Object {$_.name -eq $dimName} | ForEach-Object {
            # Remove each node from its parent
            [void]$_.ParentNode.RemoveChild($_)
        }
    # Save the updated file and report it
    $xml.Save($_.FullName)
    Write-Host "$($_.FullName) updated."
}
Executing The Logic On Composite Forms
The concepts above apply exactly the same way to the composite form files in the LCM extract. Compared to the script used for the web form files, there are three differences:
- The node, or XML tag, that needs to be removed is called sharedDimension, not dimension.
- The attribute is not name in this instance; it is called dimension.
- A counter has been added to identify whether the file contains the dimension to be removed, so the file is only saved if it was altered.
The Script
$lcmSourceDir = "Z:\Downloads\KG04\HP-SanPlan\resource\Global Artifacts\Composite Forms"
$dimName = "CustomerSegment"
# List all files
$files = Get-ChildItem $lcmSourceDir -Recurse | where {$_.Attributes -notmatch 'Directory'} |
# Remove CustomerSegment
Foreach-Object {
    # Reset a counter to 0 - used later when the file is saved
    $fileCount = 0
    # Read the file as an XML document
    [xml]$xml = Get-Content $_.FullName
    # Select every sharedDimension tag whose dimension attribute matches the dimension to remove
    $node = $xml.SelectNodes("//sharedDimension") | Where-Object {$_.dimension -eq $dimName} | ForEach-Object {
        # Increase the counter for each match found
        $fileCount++
        # Remove each node from its parent
        [void]$_.ParentNode.RemoveChild($_)
    }
    # If the dimension was found in the file, save the updated contents
    if ($fileCount -ge 1) {
        $xml.Save($_.FullName)
        Write-Host "$($_.FullName) updated."
    }
}
Summary
The first script may need to be run on multiple plan types, but the result is an identical folder structure with altered files that have the identified dimension removed. This can be zipped, uploaded to Shared Services, and used to migrate the forms to the application that no longer has the dimension.
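If you are on PowerShell 5 or later, the altered extract can be re-zipped from the same console; the paths below are placeholders, so point them at your own folder structure.

# Zip the modified LCM folder so it can be uploaded back to Shared Services
Compress-Archive -Path "Z:\Downloads\KG04\HP-SanPlan\*" -DestinationPath "Z:\Downloads\HP-SanPlan-Modified.zip"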
The scripts above can be copied and pasted into PowerShell, or the code can be downloaded.
If you are using PBCS, you may run into some challenges with large files being passed through FDMEE. Whether performance is an issue or you just want to parse a file by month/year, this script might save you some time.
The Challenge
I recently had the need to break apart a file. The source provided one large text file that included two years of data needed to populate the history of an employee metrics application. The current process loaded files by month, and we wanted to piggyback on the existing scripts to load and process data in FDMEE and on the monthly Planning data pushes to the ASO reporting cube.
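One way to attack this kind of split (not necessarily the exact approach in the full post) is sketched below; the file path and the Year and Period column names are made up, so adjust them to your own layout.

# Split one large comma-delimited file into a separate file per Year/Period combination
$source = "C:\FDMEE\inbox\EmployeeMetrics_History.csv"   # hypothetical source file
Import-Csv $source | Group-Object Year, Period | ForEach-Object {
    # Group names come back as "2016, Jan"; turn them into a file-friendly suffix
    $suffix = $_.Name -replace ", ", "_"
    $_.Group | Export-Csv "C:\FDMEE\inbox\EmployeeMetrics_$suffix.csv" -NoTypeInformation
}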
I have been using PBCS for over a year now. As much as it surprises me, I am a fan. From my experience, I have seen good performance, above-average stability, above-average accessibility, and cutting-edge functionality that is lacking in on-premise implementations. I created some infographics for presentations and thought I would share them with the community. As always, if you would like to contract me for design work, I am always excited to take on interesting and challenging projects.
Please remember that I put a lot of work into these, so please do not alter or use them without consent.