With trading volumes down on pre-crisis years, how is it that asset managers have ended up with huge increases in data and calculation requirements?
The answer lies in a combination of increased risk management requirements and regulatory demands for more frequent reporting. Other factors include the volume of corporate bond issuance (especially in emerging markets) and growing investment in hedge funds by pension companies.
In other words, greater accountability and complexity are adding to the workload.
All of this has resulted in more and more data elements per transaction and more levels of downstream analysis for performance, risk and compliance reporting. For many asset managers, overnight batch processing is the simplest way of reconciling those transactions and making sure that data is effectively processed and ready for value-added analysis.
The idea is that no user interaction is required once everything is underway. While this can be carried out at any time, it is particularly suited to end-of-cycle processing, such as for processing portfolio transactions at the end of a day.
Batch processes remain essential for many reasons. For example, it is simply not efficient to regenerate a complete set of performance data across multiple portfolios and time periods every time the front office processes a single trade.
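To make that point concrete, here is a minimal, purely illustrative sketch (not any vendor's implementation) of why batch recalculation beats per-trade regeneration: a single trade invalidates one portfolio/period cell, and the overnight batch recomputes only the invalidated cells rather than the entire data set.

```python
# Hypothetical sketch: incremental overnight recomputation.
# A trade marks one (portfolio, period) cell as "dirty"; the batch run
# recomputes only dirty cells instead of regenerating everything.

class PerformanceCache:
    def __init__(self):
        self.returns = {}   # (portfolio, period) -> cached performance figure
        self.dirty = set()  # cells invalidated by new trades

    def record_trade(self, portfolio, period):
        # One trade touches one cell, not the whole data set.
        self.dirty.add((portfolio, period))

    def run_batch(self, recompute):
        # Overnight batch: recompute only what changed, then reset.
        for key in sorted(self.dirty):
            self.returns[key] = recompute(*key)
        self.dirty.clear()

cache = PerformanceCache()
cache.record_trade("EM-Bonds", "2024-Q1")
cache.run_batch(lambda portfolio, period: 0.042)  # placeholder calculation
```

The names and the placeholder calculation are invented for illustration; the point is only that the batch window bounds the recomputation work by the number of changed cells, not the size of the book.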
Batch, and its successor “workload automation”, has evolved over time, with the latest innovations adding policy or rules-driven workflows that transform batch jobs into repeatable workflows.
According to a recent article in Computer Weekly, unglamorous batch processing is ‘the new sexy’ for financial services IT. That’s because workload automation is no longer restricted to its unappealing reputation of processing last night’s batches. Today, it is about event-driven, real-time, full automation of business processes.
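The shift from clock-driven batches to event-driven automation can be sketched in a few lines. This is an assumption-laden toy, not any workload automation product's API: rules bind business events to jobs, and jobs fire when the event arrives rather than at a fixed overnight time.

```python
from collections import defaultdict

# Illustrative sketch of rules-driven, event-driven job triggering.
# All names here are hypothetical.

class WorkflowEngine:
    def __init__(self):
        self.rules = defaultdict(list)  # event name -> list of jobs to run
        self.log = []

    def on(self, event, job):
        # Policy: "when <event> occurs, run <job>".
        self.rules[event].append(job)

    def emit(self, event, payload):
        # Jobs run as soon as the event arrives -- no fixed batch window.
        for job in self.rules[event]:
            self.log.append(job(payload))

engine = WorkflowEngine()
engine.on("trade_file_received", lambda f: f"reconcile:{f}")
engine.on("trade_file_received", lambda f: f"report:{f}")
engine.emit("trade_file_received", "APAC_2024-06-03.csv")
```

The design point is the inversion: the schedule no longer drives the work; the arrival of business data does.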
It’s true that workload automation often goes unnoticed. But look at it this way: it has embraced modernity all the way to the cloud and is now going places.
Moving batch processing out of the in-house data center and into the cloud has two important advantages, especially for boutique asset managers that have limited capacity and ability to service a 24/7 global market.
For boutique asset managers that do struggle with capacity when it comes to batch processing, the cloud can provide the right solution, offering enormous scalability and flexible processing capacity which can expand and contract according to need.
Having access to such elastic computing can also help to reduce cost and redundant over-capacity often associated with in-house data centers.
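The "expand and contract according to need" idea amounts to sizing capacity to the backlog. A hedged sketch, with invented numbers, of the kind of rule a cloud scheduler might apply:

```python
import math

# Hypothetical autoscaling rule: workers track the batch queue depth,
# scaling out with the backlog and back to zero when idle.
# jobs_per_worker and max_workers are illustrative parameters.

def workers_needed(queued_jobs, jobs_per_worker=100, max_workers=50):
    return min(max_workers, math.ceil(queued_jobs / jobs_per_worker))
```

An in-house data center must provision for the peak permanently; under a rule like this, off-peak capacity (and cost) falls away with the queue.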
According to BMC Software in its report ‘Workload automation: helping cloud computing take flight’, resources for batch workloads can be available when you need them, which can help organizations meet their service levels.
The cloud also provides a central real-time repository. This suits global asset managers, who need systems that allow them to trade and settle across different time zones.
Not only that, traditional overnight batch windows are shrinking or going away as business runs 24×7, making job scheduling difficult even under the best of circumstances.
Firms also need to provide a raft of data to meet the growing list of regulations.
A common approach is to silo according to geography or asset class and hold data within distinct data centers or applications. This might serve certain needs but restricts collaboration between middle office teams and also leads to duplication of effort.
A more sophisticated approach today is to run cloud-based systems that can link to existing platforms via powerful APIs. Centralizing data sets in the cloud allows for various ‘views’ of the data to be created. The central audited copy remains, but multiple stakeholders can act on the data based on their requirements rather than having their own local copies.
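The single-copy-with-views pattern can be illustrated with a minimal sketch (the data and team names are invented): one audited central data set, with each stakeholder's "view" derived on demand as a read-only filter rather than a local copy.

```python
# Hypothetical sketch: one central, audited data set; per-team views are
# filters over it, so no stakeholder holds a duplicate local copy.

central_trades = [
    {"id": 1, "region": "EMEA", "asset": "bond",   "status": "settled"},
    {"id": 2, "region": "APAC", "asset": "equity", "status": "pending"},
]

def view(predicate):
    # A view is a read-only selection over the single central copy.
    return [row for row in central_trades if predicate(row)]

compliance_view = view(lambda r: r["status"] == "pending")  # unsettled items
emea_desk_view = view(lambda r: r["region"] == "EMEA")      # one desk's book
```

Because every view is derived from the same copy, a correction to the central record is immediately reflected everywhere, which is the collaboration and audit benefit the siloed approach gives up.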
Extending views of centralized data to risk and compliance teams, and even to regulators themselves, is also a more efficient way of calculating, analysing and reporting data.
For boutique asset managers, the ability to take advantage of flexible cloud-based technology is a significant plus. They can leverage existing capacity and feel confident they will always be up to date and scalable for future demand. Being supported by technology that can operate in a 24/7 global environment is also a must-have.
Having access to new business applications and functionality with on demand scalability helps level the playing field for boutique asset managers competing with the larger firms.
It even tips the balance towards the smaller players who are willing to flex their more agile muscles and rapidly adopt scalable, pay-as-you-go technology platforms.
- Increased risk management requirements and regulatory demands, together with front office product innovation, mean large increases in data and calculation requirements
- Bringing batch processing out of the local data centre and into the cloud has many advantages, especially for smaller asset managers
- The cloud has flexible capacity so can deal with temporary increases in processing demand
- It is also future-proof, thanks to extra capacity when required and in-place application upgrades