The client has multiple data feeds, but one vendor's invoices had caused confusion over the past year, as costs kept rising beyond what organic growth could explain. Given the way their engineering was organised, they needed to understand where the cost leakages were occurring before engaging their technology teams to remedy what they could. Their asset classes spanned mainly equities and bonds.
We collected all their invoices relating to bulk files and individual accounts. We analysed their investible and invested universe and flagged superfluous data calls. We also pulled verification reports to assess potentially errant data field calls, which could have proxies within the bulk files.
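To illustrate the kind of cross-check involved (the file, field and account names below are hypothetical, not the client's actual data), the core of the exercise was comparing individually invoiced field calls against what the bulk files already delivered:

```python
# Illustrative sketch only: file, field and account names are hypothetical.
# The principle is to flag per-account data calls whose fields are already
# covered, or proxied, by a bulk file the client is paying for.

bulk_file_fields = {
    "BULK_EQUITY_EOD": {"close_price", "volume", "dividend_yield"},
    "BULK_BOND_REF": {"coupon", "maturity_date", "issuer_rating"},
}

invoiced_account_calls = [
    {"account": "ACC-001", "field": "close_price"},    # already in a bulk file
    {"account": "ACC-002", "field": "intraday_vwap"},  # genuinely incremental
    {"account": "ACC-003", "field": "issuer_rating"},  # already in a bulk file
]

covered = set().union(*bulk_file_fields.values())

superfluous = [c for c in invoiced_account_calls if c["field"] in covered]
incremental = [c for c in invoiced_account_calls if c["field"] not in covered]

print("Candidates for removal:", superfluous)
print("Genuinely incremental calls:", incremental)
```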
We broke down areas of interest based on the likely downstream impact and whether any cost savings were justified by a cost/benefit analysis. We also differentiated between spend where growth of coverage would necessitate higher data costs and spend where poor housekeeping was the main contributory factor.
The result was a report highlighting annual savings ranging from a minimum of six figures to a potential seven figures, achievable through remedies spanning immediate minor changes to more medium-term investigations.
The client is upgrading their clients' online trading experience, which involves a change in market data and exchange licensing. They have a large mixture of non-professional users combined with external and internal professional users. Due to the nature of their data infrastructure, they have certain vendor dependencies which cannot be mitigated in the short or medium term.
As with any large financial institution that has grown over a long period of time, their technology infrastructure developed organically and data was treated accordingly. As at other similar firms, the need over the past decade to harmonise data across multiple business lines and use cases has become pressing, but the technology burden can be somewhat overwhelming.
Given this legacy, we focused on short-term commercial wins by reviewing invoices and identifying what could be renegotiated. We also looked at upcoming projects to see which potential data costs could be reduced through commercial negotiations. We then reviewed previous exchange audits, raised questions about the methodology and gave the client direction on how best to stay compliant in future.
The result has been a deeper commitment from the client to look more broadly at their overall market data spend and at how best they can use Glox to mitigate it.
The client wanted to understand how their data was used, as they felt there were complications with their system but did not know their provenance. Their asset class universe spanned global public and private assets, where they commingle client transaction data to create proprietary derived data.
Their existing system was built by IT professionals who were not versed in financial systems, client experience or best data practices.
We broke down how their existing data tables created bottlenecks in the system, giving rise to scalability, performance and data provenance issues. We then mapped out how the calculation engines were co-dependent, which was also preventing product enhancements. Working collaboratively with the IT team and a new data team, we sketched out new data tables and new data pipes, rerouted into unbundled engines, allowing new features to be developed independently.
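The shape of that change can be sketched as follows; the engine names and data are purely illustrative, assuming a simplified two-engine setup rather than the client's actual codebase. The point is that each engine reads from a shared input snapshot and publishes its own output, instead of chaining off another engine's intermediate tables:

```python
# Hypothetical sketch of the decoupling principle, not the client's code.
# Each engine consumes the same versioned input snapshot and produces its
# own output, so a change to one engine no longer ripples through the rest.

from dataclasses import dataclass

@dataclass(frozen=True)
class InputSnapshot:
    version: str
    transactions: tuple  # simplified: a tuple of signed transaction amounts

def performance_engine(snapshot: InputSnapshot) -> dict:
    # Derives a performance figure from the snapshot alone.
    return {"version": snapshot.version,
            "pnl": sum(snapshot.transactions)}

def exposure_engine(snapshot: InputSnapshot) -> dict:
    # Derives gross exposure from the same snapshot, with no dependency
    # on the performance engine's intermediate results.
    return {"version": snapshot.version,
            "gross": sum(abs(t) for t in snapshot.transactions)}

snapshot = InputSnapshot(version="2024-06-30", transactions=(100.0, -40.0))
for name, engine in [("performance", performance_engine),
                     ("exposure", exposure_engine)]:
    print(name, engine(snapshot))
```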
The result was a more robust and streamlined product which was cheaper to develop and maintain, could integrate quickly with other systems and was more responsive to client enhancement requests.
The client was in the middle of a wealth transformation project. Our involvement related to sub-projects to equip their private bankers with a new front office tool, enhance their client reporting and create a new self-directed trading platform for HNWIs. Where possible, they wanted to customise their solution.
These workstreams were not part of a cohesive global project but were being run independently by various groups within the firm across the USA (front office tool), Europe (client reporting) and Asia (trading platform).
We helped coordinate a selection process, leveraging relationships to bring best-in-class solutions. This involved managing the needs of different front office teams, navigating the compliance around index and fund redistribution and educating the technical architect on how to use real-time streaming data within a client-facing trading application.
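Part of that education covered why a client-facing application should conflate a raw tick stream rather than forward every update. A simplified, hypothetical illustration of the idea:

```python
# Simplified, hypothetical illustration of conflating a real-time tick
# stream: only the latest price per symbol is pushed to the client screen
# on each refresh, rather than every raw tick from the feed.

from collections import deque

raw_ticks = deque([
    ("AAPL", 189.10), ("AAPL", 189.12), ("MSFT", 410.05),
    ("AAPL", 189.15), ("MSFT", 410.10),
])

def conflate(ticks: deque) -> dict:
    """Drain the queued ticks, keeping only the most recent price per symbol."""
    latest = {}
    while ticks:
        symbol, price = ticks.popleft()
        latest[symbol] = price
    return latest

# In a real application this would run on a timer driven by the streaming
# feed; here a single refresh cycle processes the queued ticks.
print("Pushed to client UI:", conflate(raw_ticks))
```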
The result was the selection of two vendors for the front office requirements, alternative ways to benchmark client performance and oversight of the foundational build of a trading platform.
When two large asset managers merged, there were several challenges around data centres and integrations to resolve if the combined firm was to achieve future cost savings. Notably, a wave of departures meant a loss of knowledge of what each firm had and how best to combine the two.
We had relationships with both firms independently, so we were uniquely positioned to help coordinate teams to harmonise trading and data management systems. In doing so, we worked with their external third-party systems to give the new entity an impartial view of its overall data architecture.
One major project was to consolidate three separate instances of an Enterprise Data Management platform into one. This involved working with their business analyst teams to map individual pieces of data to downstream systems and then re-architect the flow so that the data would synchronise through the single remaining instance.
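A stylised sketch of that mapping exercise is shown below; the instance, field and downstream system names are hypothetical. For each EDM instance we recorded which fields fed which downstream systems, then derived the superset that the single consolidated instance would have to serve:

```python
# Hypothetical sketch of the lineage-mapping exercise. Instance, field and
# downstream system names are illustrative only.

lineage = {
    "EDM_A": {"security_id": ["OMS", "Risk"], "rating": ["Risk"]},
    "EDM_B": {"security_id": ["Performance"], "benchmark_id": ["Reporting"]},
    "EDM_C": {"rating": ["Compliance"], "benchmark_id": ["Reporting"]},
}

# Merge the three instances into the superset of fields and consumers that
# the single consolidated platform must keep in sync.
consolidated = {}
for instance, fields in lineage.items():
    for data_field, consumers in fields.items():
        consolidated.setdefault(data_field, set()).update(consumers)

for data_field, consumers in sorted(consolidated.items()):
    print(f"{data_field}: {sorted(consumers)}")
```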
The result was a reduction in implementation costs, as our strategy helped bridge knowledge gaps that would otherwise have taken too long to close.