2022/11/30

The Value of a System Review

by Lucido Group

When is the best time for a system review of your treasury risk management solution? Irrespective of whether you have just implemented, have been in production for years, or are even in the midst of your initial project, the answer is: How Soon is Now.

We have written extensively on the subject of system reviews, be that identifying when your system is in need of a review, recognizing problems through reporting performance, or trying to maximize the value of prior spend. In this article, we will look at the various stages of system maturity, common pitfalls, and areas to focus on whether you are about to implement, are in a post-implementation period, or are dealing with a mature system.

These Things Take Time

Reviews of Treasury and Risk Management Systems (TRMS) are often performed during three stages of system maturity. Those stages can broadly be defined as:

  • Stage 1 – initiation of the system;
  • Stage 2 – post-implementation when functionality is delivered to the business; and
  • Stage 3 – when the system has normalized into the business operations.

At each stage, there are different reasons for initiating a review, but each should deliver benefits through increased automation, fewer customizations, reduced operational and regulatory risk, and improved user satisfaction. In this note we focus on pre- and post-implementation reviews, but similar principles apply to pre- and post-upgrade reviews.

Heaven Knows I’m Miserable Now

During the implementation phase the system vendor, or your implementation partner, should follow a best practice delivery model. Irrespective of your own institution’s preferred approach, waterfall or agile, for example, it is often best to be guided by the vendor/partner and follow their model. They have done this before and you, likely, have not, at least not for this particular system. It is important to be flexible, impose the red lines you think are appropriate, and adopt their recommended approach to delivery.

That said, all too often the standard of the delivery is compromised by any number of reasons. Be mindful of the following:

  • Incomplete business requirements.
  • A poorly executed Target Operating Model. Does everyone understand the signed-off “to be” business processes?
  • The business processes have not been reviewed and adapted to fit the new system.
  • Implementation consultants who lack the required system expertise.
  • Lack of staffing availability, be that on the client side, the system provider, or other third-party partners. Watch for a vendor bait-and-switch: CVs included in the RFP response whose owners never materialize on the project itself.
  • The scope is changed, leading to delays in the delivery. Scope creep is common and should be guarded against.
  • Not all customizations are created equally. Unnecessary and sometimes extensive customizations of platforms may lead to expensive ongoing support and upgrades.
  • Staff training is neglected or left to the end, resulting in insufficient time to be ready for go-live.
  • A system support model was not clearly identified; too often the emphasis is on the deployment of the solution, with support becoming an afterthought and leaving little time to plan the model.

Please, Please, Please, Let Me Get What I Want

A review during the implementation is generally done for long-running projects that are perceived to be failing, but it does not have to be that way. The purpose at this stage should be to take an impartial view of the project and provide recommendations to ensure that the system implementation stays on time and on budget. If you are proactive, you can get ahead of problems before they occur. The saying holds true here: an ounce of prevention is worth a pound of cure. It is why we also recommend that an experienced, independent third party sit on the Steering Committee of large projects.

The review process does not need to be a long engagement. The types of activity include monitoring of the system value proposition against project progress, review of project plans and resources, assessment of the fit of expected business processes against system functionality, and review of system infrastructure and sizing. The value of true Subject Matter Expert (SME) insight at an early stage in your project cannot be overestimated.

The Boy with the Thorn in His Side

A Post Implementation Review (PIR) is conducted after project completion, once the product goes live to the business. Its purpose is:

  • To ascertain the degree of success, evaluating whether the objectives have been met, the planned benefits have been delivered, and the original requirements have been addressed;
  • To examine the business solution implemented to see if further improvements can be made to optimize the benefits delivered; and
  • To learn lessons for the future, which can be used by other teams in the wider organization to improve future projects.

The PIR should be performed once the solution has been deployed, and after any showstopper or high-priority issues have been resolved. Ideally the PIR should be performed in the first few weeks after deployment, but the timing depends on the solution, the client site, and the level of “project fatigue”. It is an opportunity to take a step back, take a big-picture perspective, and resolve the things that will continually annoy you as the system matures.

Work is a Four Letter Word

The PIR is usually performed by the implementation project team as they are familiar with the steps taken, the functionality deployed, the problems encountered in delivering the solution, and the resolutions adopted. There is a risk that using the project team may lead to omissions or oversights – perhaps understandably they may be quite happy just to get over the line. A fresh pair of eyes is critical in those situations; using an independent consultancy working with the project team and business users will help provide an impartial point of view. This independent review can then be submitted to the project board for evaluation and next steps.

Some of the items reviewed may be similar to those during the implementation phase. In addition, reviews of system documentation and user training can identify gaps and head off audit findings. If not already in place, the support model and plan for Business As Usual (BAU) development can be established. A true cost-benefit analysis can be a challenge, but outside expertise can help quantify some of those non-financial benefits.

There is a Light That Never Goes Out

The most common review is of a mature system. Perhaps an upgrade is due and it’s a good time to identify all the customizations that can be removed. Users may be overwhelmed by all the small items that just never quite make it to the top of the BAU priority list. Surely there is a better way of doing this? This review should be scheduled some time after the system has been deployed, with enough time for the business operations to have normalized. Its purpose is:

  • To ascertain whether there are any limitations to the solution that affect the business in performing its function;
  • To conduct a technical review identifying areas that need optimization and to evaluate performance;
  • To examine the approach to testing, the test cases, and any tools used to facilitate solution testing; and
  • To assess internal processes.

Stop Me if You Think You’ve Heard This One Before

As with the PIR, annual or regular reviews can be performed by the project or system support team; however, in most circumstances the team has been reassigned to other projects and/or roles, or the support team is busy with BAU work. This leaves a vacuum in resources with the means and time to evaluate the solution. An independent consultancy working with the business and support teams can provide the necessary resources and impartiality required to perform the review.

In addition to topics covered in prior reviews, at this stage the governance and change management processes can be evaluated. Are the required archiving and purging activities in place? Is the solution fit-for-purpose, usable, maintainable, supportable, scalable, and performing as expected? Are there gains to be made through the implementation of automated testing?

What Difference Does it Make?

Our team at Lucido has performed numerous reviews across a wide range of software and infrastructure. From being dropped in to review an ailing implementation, to a legacy customer 20 years into their system journey asking for help, there is not much we have not seen. We have SMEs who can help you better understand the processes both inside and outside your system. No matter the current state of your platform, there are always improvements to be made; real, tangible, quantifiable benefits to be realized. The only question is: when will you review yours?

If you would like to book in a review of your system, please reach out!