Most organizations severely underestimate what it takes to reconcile data. They rarely consider up front how much historical data is needed, how the validation will be performed, or how it will be documented. The best way to keep data reconciliation smooth and mostly uneventful is to take a planned, considered approach.
GETTING STARTED
The first step is to determine how much historical data to load. To do this right, review the data with a critical eye and operate on the assumption that data validation WILL take longer than anticipated. The minimum historical data required is the current year and the prior year, which supports reporting with prior-year comparisons. While there are typically a few reports that draw on data older than the prior year, the time and effort to load that additional data is generally not worth it. That type of reporting is better handled in Excel while history builds up naturally in the application over time. The bottom line for this step: if you think you need additional data, go back and think again, or be prepared to lengthen the project.
The next step is determining how you are going to validate the data. Often the first thought is to run reports from the existing system and from Financial Management, then tick and tie the numbers. That is a very manual, slow process. Depending on the current system, it is usually quicker to perform the validation in Excel. If the current consolidation is done in Excel or in any Hyperion product, validating in spreadsheets is definitely the way to go. The spreadsheets are typically set up in a financial statement format: a column with values from the old consolidation system, a column populated via Smart View with the comparable values from Financial Management, and a difference column. There should also be a “Point of View” section referenced by the Smart View queries so that the entity or period can be changed and the data refreshed easily.
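To make the tick-and-tie mechanics concrete, here is a minimal sketch of the same comparison in Python with pandas, as a stand-in for the Smart View spreadsheet described above. The file names, column names, and tolerance are all hypothetical; it assumes each system can export a trial balance with Account and Amount columns.

```python
import pandas as pd

# Hypothetical exports: one from the legacy consolidation system, one
# pulled from Financial Management (e.g., via Smart View, saved as CSV).
# Both are assumed to have two columns: Account, Amount.
old = pd.read_csv("legacy_trial_balance.csv")
new = pd.read_csv("hfm_trial_balance.csv")

# Mirror the spreadsheet layout: old value, new value, difference.
recon = old.merge(new, on="Account", how="outer",
                  suffixes=("_Legacy", "_HFM")).fillna(0)
recon["Difference"] = recon["Amount_HFM"] - recon["Amount_Legacy"]

# Flag anything outside a rounding tolerance for follow-up.
exceptions = recon[recon["Difference"].abs() > 0.01]
print(exceptions.to_string(index=False))
```

The outer merge matters: it surfaces accounts that exist in only one system, which are often the first mapping problems you find.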
Another increasingly important consideration is the requirements for documenting the data validation. Discuss these upfront with the various stakeholders so that the required documentation gathering can be built into the process rather than bolted on at the end.
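If documentation is required, the validation script can capture its own audit trail as it runs. Continuing the sketch above, one way this might look (the log file name and point-of-view values are assumptions):

```python
import datetime as dt

# Record each validation pass, with its point of view and a timestamp,
# so documentation accumulates as you work rather than being
# reconstructed at the end of the project.
log = exceptions.copy()
log["Entity"] = "US_Plant1"        # hypothetical point-of-view values
log["Period"] = "Jan_PriorYear"
log["RunAt"] = dt.datetime.now().isoformat(timespec="seconds")
log.to_csv("validation_log.csv", mode="a", index=False)
```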
Now that the mechanics of the process are defined, what does the process look like? I have heard the suggestion more than once: “Let’s just check the top and see how it looks!” The folks suggesting this have obviously never validated data.
The validation process needs to be an orderly one that builds from the bottom up. If possible, start with a base entity and base accounts for one month, usually January of last year. At this level, you are really checking that the data load process is loading the data correctly. Then move up to the parent entities and parent accounts; this step confirms that your hierarchies are built correctly and that translation and eliminations are being performed in the appropriate places.
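One way to drive that bottom-up ordering mechanically is to validate entities in order of their depth in the hierarchy, deepest first. A minimal sketch, with a hypothetical hierarchy and a placeholder validation step:

```python
# Hypothetical entity hierarchy, child -> parent (None marks the top).
PARENT = {
    "US_Plant1": "US", "US_Plant2": "US", "UK_Ops": "Europe",
    "US": "TotalCo", "Europe": "TotalCo", "TotalCo": None,
}

def depth(entity: str) -> int:
    # Distance from the top entity; base entities have the largest depth.
    d = 0
    while PARENT[entity] is not None:
        entity = PARENT[entity]
        d += 1
    return d

def validate(entity: str, period: str) -> None:
    # Placeholder: run the tick-and-tie comparison for this slice
    # (e.g., the pandas sketch above, filtered to this entity/period).
    print(f"Validating {entity} for {period}")

# Base entities first, then parents, so hierarchy, translation, and
# elimination issues only surface after the underlying loads are proven.
for entity in sorted(PARENT, key=depth, reverse=True):
    validate(entity, period="Jan of last year")
```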
Once you have reached the top entity, you should have the majority of the issues corrected. Each month’s validation will be smoother and quicker, with a few bumps here and there. At this point, it can be appropriate to take more of an audit approach to the validation where you can even “check the top and see how it looks!”
One final note: this process is easily carried out in OneStream CPM applications.