Build versus Buy: 5 myths and realities (part 1)


In the age-old debate about whether an organisation should use its existing resources to "do it themselves", Lacima weighs in with its observations on five key myths and their realities. This week we look at Myth #1; check back next week for Myths #2 and #3.


Energy market participants now demand more from their valuation and risk management functions in order to meet board expectations. They are also operating in a more stringent regulatory climate, with increased oversight from credit rating agencies as well as from the banks that extend them credit.

Companies are struggling to use their Energy Trading and Risk Management (ETRM) systems to fulfil their risk analysis and valuation requirements for portfolios that typically comprise complex financial contracts and physical assets, such as power plants and gas storage facilities. Lacima has observed that the valuation and risk management capabilities of all the major ETRM systems are recognized as their weakest functional area – a fact borne out by the sheer number of market participants who effectively use their ETRM systems solely for deal capture and for data and position management. In many cases, the preferred strategy to bridge this functionality shortfall is to turn to internally developed solutions. Here we look at five myths associated with undertaking an in-house risk and valuation application development project.

Myth #1: “It does not cost us anything”

One justification for pursuing an internal build strategy is that it is less costly than deploying an external software solution. In fact, we have heard it claimed more than once that it “... doesn’t cost us anything as we already have the quantitative analysts sitting here”. The reality is that the costs of building a comprehensive risk and valuation engine in-house are not well understood by many in senior management positions.

An internally developed application of this kind should be built to meet the needs of trading, risk management, valuation, structuring and origination groups, which have a diverse range of functional requirements for risk and valuation analysis and reporting. Organisations are advised to conduct a thorough analysis of the short- and long-term costs involved in developing an analytics model library and keeping it up to date.

In its May 2010 study, “Optimizing the OTC Pricing and Valuation Infrastructure – Addressing Analytics Costs and Efficiencies”, Celent analysed the cost to banks and investment firms of building their own pricing and valuation capabilities. It concluded that, aggregated over the total software cycle, firms adopting in-house strategies for OTC pricing will require investments of between $25 million and $36 million to “build, maintain, and enhance” a complete derivatives library.


Although the typical spend for most energy organizations will be lower than for the banks and investment firms analysed in the Celent study, the total cost must account for the initial specification, build and test; ongoing bug fixes, enhancements and upgrades; and integration with internal and external systems. Even at one quarter of the Celent figures, energy companies are looking at investments of between $6.25 million and $9 million over a multi-year software cycle.
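The back-of-the-envelope scaling behind these figures can be sketched as follows. This is a minimal illustration only: the one-quarter factor applied to the Celent range is an assumption for energy-sector budgets, not a figure from the study itself.

```python
# Illustrative cost-scaling sketch. The Celent range is quoted in the study;
# the 0.25 energy-sector fraction is an assumed scaling factor.
CELENT_LOW = 25_000_000   # USD, low end of "build, maintain, enhance" range
CELENT_HIGH = 36_000_000  # USD, high end of the range
ENERGY_FRACTION = 0.25    # assumed fraction of investment-bank spend

low = CELENT_LOW * ENERGY_FRACTION    # 6,250,000
high = CELENT_HIGH * ENERGY_FRACTION  # 9,000,000
print(f"Estimated multi-year spend: ${low / 1e6:.2f}M to ${high / 1e6:.2f}M")
```

Scaling a published benchmark this way gives only an order-of-magnitude estimate; the real exercise is itemising specification, build, test, maintenance and integration costs for your own organisation.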

One cost area that is underestimated in nearly all organizations is the inefficiency caused by fragmented data, analytics and platforms. Lacima has consulted in organisations where two people were employed solely to keep the internally developed risk system running – often a result of temporary workarounds and operational systems that are not adequately tied together. These people spent their days obtaining, reformatting and cleaning data, running individual steps of a business process, reading the output in Excel, and producing reports – work that, in a properly configured system, one person should be able to accomplish at the push of a button. It is not unusual to find very well qualified (and highly paid) quantitative analysts spending around 75% of their time on such non-value-added activities, and only 25% on value-added activities such as research, analysis and modelling.

In reality, quantitative analysts typically have limited knowledge of how to design an application’s architecture or how to build interfaces with other applications. Relying on such individuals alone to undertake this type of project is therefore likely to result in inflexible applications with ad-hoc interfaces that are difficult for business users to use.


Check back next week for the next installment covering myths #2 and #3.
