FORUM: Use of Big Data and data analytics as part of a risk management strategy

April 2016  |  SPECIAL REPORT: MANAGING RISK

Financier Worldwide Magazine

April 2016 Issue


FW moderates a discussion on the use of Big Data and data analytics as part of a risk management strategy between Paul Oughton at Advent IM, Michel de Goede at Alliander, Daniel Miessler at IoActive Inc, David Navetta at Norton Rose Fulbright US LLP, and Raj Kushwaha at Warburg Pincus LLC.

FW: How would you describe the increasing use of Big Data and data analytics among companies? What factors are behind the continued rise?

Kushwaha: As the cost of hardware for compute and storage plummeted over the last decade, it became increasingly practical to store and process large amounts of data. Also, the recent rise in social media and other unstructured data sources, where potential risk can be introduced and flagged, has only increased the complexity of the data that risk managers need to assess and analyse when making risk-based decisions. The good news is that recent advances in software technology have now made it possible to draw material inferences and conclusions by correlating and manipulating different types of data, both structured and unstructured, across erstwhile silos of data and other third party sources of unstructured data, thereby unlocking the true potential of Big Data analytics. Furthermore, advances in hardware and software, like in-memory computing, allow companies to draw these conclusions in real time or near real time – fast enough to calculate risk exposure to an asset class and block a rogue trade in real time, or evaluate AML exposure for a transaction in-line, making compliant near real time fund transfer and settlement a practical possibility.
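
To make that kind of in-line check concrete, the sketch below shows a minimal, in-memory pre-trade exposure check in Python. It is an illustration only: the asset classes, limits and trade sizes are invented for the example and are not drawn from any real trading or risk system.

```python
# Illustrative sketch: a toy in-memory pre-trade risk check. Asset classes,
# limits and trade sizes are invented placeholders, not real risk data.
from collections import defaultdict

# Current exposure per asset class, held in memory for near real time checks.
exposure = defaultdict(float)

# Hypothetical hard limits per asset class.
limits = {"equities": 5_000_000.0, "fx": 2_000_000.0, "credit": 1_000_000.0}

def check_trade(asset_class: str, notional: float) -> bool:
    """Accept the trade if it keeps exposure within its limit, otherwise block it."""
    projected = exposure[asset_class] + notional
    if projected > limits.get(asset_class, 0.0):
        return False  # block the rogue trade before it settles
    exposure[asset_class] = projected
    return True

if __name__ == "__main__":
    print(check_trade("fx", 1_500_000))  # True  - within the 2m limit
    print(check_trade("fx", 1_000_000))  # False - would breach the 2m limit
```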

Miessler: The primary draw of Big Data and analytics is the potential to surface questions that a company should have been asking about its business. Big Data has the ability to illuminate trends and patterns that would otherwise have been invisible, which then creates questions and inquiries into how the business works. Ultimately, the outcome of such pattern identification is often the ability to predict when a certain business-contextual event is about to happen, and then to adjust accordingly in an automated fashion. Another key output is the ability to optimise business functions in ways that were not possible before.

Navetta: Over the past decade – and beyond for many – companies have been collecting vast amounts of data concerning their customers, operations, suppliers, products and services. However, until fairly recently the technology did not exist to allow for the curation and analysis of enormous data sets. That has changed with the advent of cloud computing, computers with faster processing speeds and the development of specialised data analytics software. Now companies, both technology and traditional, realise they are sitting on a treasure trove of data that may, if properly organised and analysed using Big Data analytic measures, yield new and interesting insights that may provide a competitive advantage. They also realise that failing to take advantage of Big Data may result in competitive losses. This is driving Big Data in the business context.

de Goede: Big Data has been removed from Gartner’s 2015 Hype Cycle for Emerging Technologies and has instead found its way into the market in more specific applications and techniques, each of which has its own solution space. It is also a concept for which at least 10 different definitions can be found. Increasingly, organisations seem to be finding new areas in which to apply Big Data and analytics concepts and are finally asking their business questions first. One of the main drivers behind this shift is the increasingly competitive landscape in which companies are operating. The market is becoming ever more sophisticated, and in order for companies to remain on top of their game they must keep up with their competitors. The ready availability of new data producing or data storing software, services and products, the prevalent trend of consumerisation and the enormous amount of PE and VC money pushing technology driven innovation, is another strong driver.

Oughton: While it’s true to say more companies are collecting Big Data, not all are using it in a way that benefits the organisation or provides any tangible value. That is not to say the number of companies using Big Data well is not rising; it is just not rising quite as quickly as the number simply collecting and storing large volumes of data for reasons they themselves do not fully know or understand. Humanity’s propensity for hoarding, and the idea of ‘keeping up with the Joneses’, plays a significant role in this. People, and by extension the companies they work for, love the idea of accumulating lots of ‘stuff’ on the off-chance that it will be useful one day. Garden sheds around the country are full of half-empty tins of paint, off-cuts of wood and old mobile phone chargers that ‘might come in useful, one day’. The same principle applies to Big Data storage. Equally, when companies see that one of their competitors is investing in Big Data storage and analytics, they feel compelled to do the same, perhaps without fully identifying any reason or rationale behind it. All companies want the competitive edge that the perception of Big Data provides, and there is no doubt that speed to insight is vitally important – when it works.

Recent advances in software technology have now made it possible to draw material inferences and conclusions by correlating and manipulating different types of data.
— Raj Kushwaha

FW: In a world of increasing complexity and demands, to what extent does the application of Big Data and data analytics serve the risk management requirements of organisations?

Oughton: Having access to the large volumes of historical records, previous transactions and customer and client information that Big Data storage and analytics provides allows an organisation to identify patterns and trends like never before. This approach then allows the organisation to predict and plan for previously unforeseen eventualities and disruptive events that would not have been identified by traditional means. Big Data analytics allows organisations to prepare for incidents based on the information that can be mined from large volumes of historic and real-time data. It allows organisations to join the dots and identify vulnerabilities, areas of concern or, conversely, unexploited opportunities for growth or development. However, identifying a potential weakness or vulnerability based on patterns and trends is one thing; having a robust incident management process in place to deal with a disruptive event is something entirely different. Organisations must ensure that they have a plan in place that outlines what to do in real time, and that they allocate the resources and skills to act upon it in real time as well.

Kushwaha: Big Data can be a significant advance for risk managers. It can improve the efficacy of risk models, provide broader and more comprehensive risk coverage by analysing more variables simultaneously, and generate significant cost and time savings for organisations by improving the velocity of decisions and automating many of the human judgement aspects. This is not only nice to have, but a must have. As the regulatory environment for financial institutions becomes more complex and organisations are required to monitor and report risk and credit exposure more granularly and frequently, and with an ever increasing possibility for fines and criminal liabilities, the need for speed and accuracy in this critical function cannot be overstated. Big Data allows organisations to rapidly bring together multiple data types across silos of data sources to better analyse things like credit risk, market risk, operational risk, compliance risk and asset liability risk.

Miessler: Big Data can potentially inform risk management by capturing thousands of variables from thousands of organisations and then looking at what they have in common. Do companies that get breached have certain attributes in common? Do companies that have great security have certain attributes in common? This is the type of thing that Big Data can assist with.
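
As a rough illustration of that kind of comparison, the hedged sketch below tallies how often each attribute appears among breached versus non-breached organisations. The attributes and company records are toy data invented for the example, not survey results.

```python
# Illustrative sketch: comparing attribute prevalence between breached and
# non-breached organisations. All records and attributes are invented toy data.

companies = [
    {"breached": True,  "attributes": {"flat_network", "no_mfa", "legacy_os"}},
    {"breached": True,  "attributes": {"no_mfa", "unpatched_vpn"}},
    {"breached": False, "attributes": {"segmented_network", "mfa_everywhere"}},
    {"breached": False, "attributes": {"mfa_everywhere", "legacy_os"}},
]

breached = [c for c in companies if c["breached"]]
others = [c for c in companies if not c["breached"]]

def rate(group, attribute):
    """Share of companies in the group that have the attribute."""
    return sum(attribute in c["attributes"] for c in group) / len(group)

all_attributes = set().union(*(c["attributes"] for c in companies))
for attribute in sorted(all_attributes):
    print(f"{attribute}: {rate(breached, attribute):.0%} of breached "
          f"vs {rate(others, attribute):.0%} of the rest")
```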

de Goede: Firstly, there is operational risk. For our company, which operates more than 100,000 kilometres of electricity grid, the sharply growing influx of privately generated wind and solar electricity and the power consumption of electric vehicles may lead to peak values that the electricity grid has simply not been designed to handle, particularly given that elements of the grid may be between 20 and 100 years old. If nothing is done, damage or outages will sooner or later occur. ‘Sensorification’ of the grid, flexible tariffs and in-home Internet of Things switching may help avoid serious investment in additional, underground grid capacity. Over time, more and more smart sensors and technology will find their way into the grid. Should self-generated energy and energy storage devices become the norm, the general public and organisations might no longer feel the need for a connection to the national grid and go off-grid, which would create an obsolescence risk for our assets. Accordingly, companies must monitor a great variety and volume of data sources when searching for emerging trends.

Navetta: In the data security and privacy risk management arena, Big Data analytics is being used to enhance the data security of organisations. Companies have increasingly set up more monitoring of their system networks and traffic coming into their systems with the goal of identifying, preventing and remediating security vulnerabilities and incidents. This has led to the collection of vast amounts of security-related and system data, and unfortunately it is difficult for organisations to actually review and analyse that data. This is where Big Data is helping. Companies have developed sophisticated algorithms that enable real-time analysis of security-related data, which can help pinpoint potential security weaknesses and problems. They can now analyse their computer environment holistically using Big Data, including monitoring external traffic, lateral traffic within the network and external sources of attack. As organisations gain more data and experience, they can feed it back into the Big Data engines and obtain more precise results and insight into their risks.
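
A heavily simplified sketch of that kind of real-time monitoring is shown below: it flags unusual spikes in a stream of security event counts against a rolling baseline. The counts, window and threshold are invented assumptions; production platforms use far richer models and data.

```python
# Illustrative sketch: flagging spikes in security event counts against a
# rolling baseline. The counts, window and threshold are invented placeholders.
from statistics import mean, stdev

hourly_failed_logins = [12, 9, 15, 11, 10, 13, 14, 95, 12, 11]

WINDOW = 5       # hours of history that form the baseline
THRESHOLD = 3.0  # standard deviations above baseline that count as anomalous

for hour in range(WINDOW, len(hourly_failed_logins)):
    history = hourly_failed_logins[hour - WINDOW:hour]
    baseline, spread = mean(history), stdev(history)
    value = hourly_failed_logins[hour]
    if spread > 0 and (value - baseline) / spread > THRESHOLD:
        print(f"hour {hour}: {value} failed logins looks anomalous "
              f"(baseline around {baseline:.0f})")
```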

As organisations gain more data and experience, they can feed it back into the Big Data engines and obtain more precise results and insight into their risks.
— David Navetta

FW: With over 90 percent of the world’s data said to have been created in the last two years, what advice would you give to organisations in terms of identifying and implementing an effective risk management strategy to cope with this unprecedented velocity, variety and volume of data?

de Goede: In terms of establishing effective risk management strategies, companies must first look at what data is necessary, and only then set up the relevant data infrastructure. For example, if our company wants to perform predictive failure analysis of different types of objects in the grid, would it then be necessary to save and use all monitoring and product data for these hundreds of thousands of objects? Not necessarily, as long as there is a statistically representative sample – and be very careful with the word ‘representative’. A data historian may then serve up the required insights quite quickly and easily – I wouldn’t call it Big Data. However, were we to identify the potential failure of each individual object in the grid, we would have to analyse all monitoring and product data, look for history, anomalies and relations with previous or known failure patterns, and then apply the corresponding probability calculations. This, you could argue, is a ‘real’ Big Data and analysis effort.
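
The contrast between a representative sample and a per-object analysis can be sketched roughly as below. The asset records, sampling approach and scoring rule are invented placeholders, not Alliander data or a calibrated failure model.

```python
# Illustrative sketch: fleet-level estimate from a sample versus a per-asset
# scan. Asset records and the scoring heuristic are invented placeholders.
import random

random.seed(42)

# Toy population of grid assets: (asset_id, age in years, recorded fault count).
assets = [(i, random.randint(1, 100), random.randint(0, 5)) for i in range(100_000)]

# Approach 1: a representative sample answers the fleet-level question cheaply.
sample = random.sample(assets, 1_000)
share_with_faults = sum(1 for _, _, faults in sample if faults >= 3) / len(sample)
print(f"estimated share of assets with 3+ recorded faults: {share_with_faults:.1%}")

# Approach 2: predicting failure per object has to touch every record -
# arguably the 'real' Big Data effort.
def failure_score(age: int, faults: int) -> float:
    """Placeholder heuristic, not a calibrated model."""
    return 0.005 * age + 0.1 * faults

at_risk = [asset_id for asset_id, age, faults in assets
           if failure_score(age, faults) > 0.6]
print(f"assets flagged for closer inspection: {len(at_risk)}")
```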

Oughton: The principle of Big Data storage is not a million miles removed from the idea of keeping all of your eggs in one basket. The fact that organisations are storing and processing larger and larger amounts of data undoubtedly makes them a more attractive target to cyber criminals. Equally, such large volumes of data, which could include a wide variety of operationally, commercially or personally sensitive information, may very well increase the impact on the organisation in the event of a breach or other disruptive event. Another thing to consider is ensuring that you can safely and securely store this data. Remember the Data Protection Act and do not store personal information ‘just in case’. Current legislation states that personal information, the definition of which has expanded considerably in recent years, must only be stored and used for a defined purpose. There is a great deal of pressure on organisations to demonstrate not only that this information is being used appropriately, but also that it is adequately protected. Think too about where the data is stored; previous assurances that were provided by schemes such as Safe Harbour have now been ruled invalid. If you are storing data offshore, or outside of the EU, be sure that you have appropriate assurance mechanisms in place and that you are fully aware of the risks. Due diligence is important. If you are using an external storage provider or cloud-based solution, go and check out their facilities.

Kushwaha: A Big Data strategy for risk management will be critical to deal with the explosion in the velocity, variety and volume of data that needs to be considered today in making risk decisions. A Big Data enabled risk strategy needs to have five distinct components. First, as with any journey, a Big Data enabled risk management journey needs to start with a clear vision and problem statement – management needs to decide which high value problems would most benefit from the velocity and accuracy of large data set analysis. Second, domain experts in the problem area need to give careful thought to what data would be most useful for the problem at hand. Big Data by definition means a lot of data, which can imply a lot of noise, so it is important to understand the distinction between high value data and low to no value data. Third, the company needs to assess whether it has the capability to meaningfully identify, combine and manage multiple sources of data. This is generally a skill gap in most IT organisations that needs to be filled or supplemented. Fourth, the company needs to build the capability for advanced analytics models for predicting and optimising conclusions. This is the discipline of practical application of data science – a skill set that is scarce. Fifth and finally, and perhaps most critically, management must possess the personal conviction to transform the organisation’s decision-making so that the data and models actually yield better decisions.

Navetta: At the outset, most organisations do not have the talent or resources to understand the data they have and how it may be leveraged using Big Data. A good first step, if resources allow, is to obtain that talent. Some organisations are hiring their own Big Data experts and data scientists, but many are relying on third party vendors. In fact, in this space it is typically necessary to look to third party vendors to leverage specialised tools and specialised individuals to enable the capture, curation, storage, search, sharing and analysis of the data in a way that is valuable to the organisation. Once a company’s data inventory and analytics potential is understood, and the talent and tools identified to leverage that potential, organisations must address the potential limitations, legal risks and issues associated with the collection, analysis and use of Big Data – and the insights derived from it.

Miessler: The most important thing that can be done is a data asset inventory, where the company makes a comprehensive list of where all company data is stored, how it is protected, and what legitimate applications have different types of access to it. Without doing this crucial step it is extremely hard to be adequately protected. Everything starts with understanding what you have. As we move forward into the future, the implementation of data lakes will become increasingly useful. These will allow common interfaces for accessing data, as well as the access controls and auditing that come with them.
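
A minimal sketch of what such a data asset inventory might record is shown below. The fields, classifications and entries are illustrative assumptions rather than a prescribed schema.

```python
# Illustrative sketch: a minimal data asset inventory. Fields and entries are
# invented placeholders, not a prescribed schema.
from dataclasses import dataclass, field

@dataclass
class DataAsset:
    name: str
    location: str                 # system or store holding the data
    classification: str           # e.g. public, internal, confidential, personal
    encrypted_at_rest: bool
    authorised_apps: list = field(default_factory=list)

inventory = [
    DataAsset("customer_records", "crm_database", "personal", True,
              ["billing_service", "support_portal"]),
    DataAsset("network_logs", "log_archive", "internal", False,
              ["siem_platform"]),
]

# A first review question: which sensitive assets are not encrypted at rest?
for asset in inventory:
    if asset.classification in {"personal", "confidential"} and not asset.encrypted_at_rest:
        print(f"review protection of {asset.name} held in {asset.location}")
```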

FW: The application of Big Data technologies and data analytics is seen as a game changer in today’s highly complex business environment. Accordingly, what are the likely consequences for those organisations that resist or are unable to integrate Big Data and data analytics into their risk management strategy?

Navetta: In many cases, early adopters of Big Data strategies have been able to gain a competitive advantage. Over time, as more companies in a variety of industries adopt similar Big Data strategies, the playing field has been levelling. The net impact of the Big Data movement, especially for companies that are data intensive, is that those companies that fail to leverage their data and apply Big Data analytics will eventually fall behind. Ironically, in some cases newer technology-driven companies with less data, but with a well-conceived Big Data infrastructure and strategy, are able to successfully compete with more established companies that have significant amounts of data, but a limited ability to organise and analyse it. In short, companies that invest in Big Data, over time, will beat their competitors, all other things being equal.

Kushwaha: In the short term, organisations that can effectively harness the power of Big Data will definitely create a strategic advantage in terms of the velocity, efficacy and efficiency of risk decisions. For example, there is the much publicised case study of UOB of Singapore, which cut the time to calculate total ‘value at risk’ from 18 hours to a few minutes. In the medium term, as regulatory requirements get more complex and the availability of data increases, complex risk modelling will become the rule rather than the exception. Organisations that do not harness and embrace the power of Big Data will find themselves increasingly exposed to regulatory and compliance liabilities. Longer term, organisations that do not embrace Big Data will either find themselves unable to comply with regulatory requirements and subject to ever increasing fines, or will become so operationally inefficient that it will become a matter of survival.

de Goede: Gartner has introduced a new and enticing hype cycle for the ‘programmable economy’. Consumer behaviour and its analysis serve as input to automated product and service offerings or other automation applications. The use of Big Data and analytics in the customer domain is driving new business models and tailored offerings, and providing a new level of user experience and customer intimacy. From intimacy and satisfaction, long term customer relationships can be built using the internet and mobile as worldwide enablers. This allows for the rapid growth of organisations; for example, the Samwer brothers’ Rocket Internet venture has seen incredible results. Companies that are unable to keep up with ever cheaper technology standards and leverage their potential could become the ‘dodo’ of their industry, and hence are subject to increasing levels of business and obsolescence risk before becoming extinct.

Oughton: This is a hard one to judge. It is far more important to do it right than to do it fast. Businesses that take the time to risk assess all elements of a Big Data proposal and understand fully what they wish to get from the process – which will vary for each business – are not necessarily the ones that will be fastest in the uptake of this new way of working. However, having put the ground work in, they will be in a much stronger position and able to progress from a firm base, having made sure the concept is understood and embedded within the organisational framework and that all of the associated risks are identified and managed appropriately. It will be a long time before Big Data analytics becomes business as usual; the resources and expertise involved in implementing such an approach, as well as the financial cost, will undoubtedly put it beyond the grasp of many businesses for many years to come. To that end, it is unlikely that businesses will be disadvantaged in the short to medium term because of an unwillingness or inability to adopt Big Data storage and analytic processes. In the longer term, this may have a more pronounced impact; as the cost of storage solutions drops and the pool of available expert resources and applications increases, it is more likely that those organisations which are still reluctant to adopt these processes will suffer.

Miessler: The biggest risk to companies not willing or able to embrace Big Data is that they will be less nimble than their competitors that do. If you cannot predict issues before they happen, if you cannot identify potential optimisations quickly, and if you cannot adjust your security prioritisation based on known indicators of security maturity, these shortcomings will reduce a company’s efficiency and put it at a disadvantage compared to those that can do these things.

The biggest risk to companies not willing or able to embrace Big Data is that they will be less nimble than their competitors that do.
— Daniel Miessler

FW: In your opinion, are risk managers yet to fully harness the power of Big Data technology and data analytics in their everyday decision-making? To what extent does the ability to capture, access and utilise Big Data determine ultimate risk management success?

Miessler: Right now it is still early days for Big Data and risk management, and there is so much remedial work to be done to reduce risk in most organisations. But over the next couple of years, as companies address more of their fundamental flaws and Big Data becomes better tuned for risk management, it will be increasingly important that companies leverage the technology for this purpose.

Oughton: There are still a large number of misconceptions among key decision makers about what Big Data brings to organisations. Just because you have the ability to store ridiculous amounts of data is not in itself justification to do so. Organisations must still be able to identify a reason for storing this data and to properly assess the risks associated with such practices. Regardless, a study carried out by Bain & Co found that over 400 US companies that had adopted Big Data analytic processes had gained a significant lead over the rest of the corporate world. It determined that practically any business that makes, grows and sells anything can use Big Data analytics to create efficiencies and improve business outputs through the ability to access and draw upon, or ‘mine’, the huge volumes of valuable information held within Big Data repositories. These practices have also spelled opportunity for companies looking to connect a different set of dots – identifying potential new customers, spotting fraud or cyber crime in its early stages, or improving products and services. Unsurprisingly, and to an extent controversially, one of the biggest users of Big Data analytics is the US National Security Agency (NSA), which is collecting and analysing unfathomable quantities of data in the fight against crime and terrorism across the world. Ultimately, whatever your stance on civil liberties, the technologies underpinning the NSA’s data collection and analysis programmes are working; a number of notable arrests and counter terrorist operations have cited Big Data analytics as supporting conventional military and law enforcement activity in the pursuit of global stability.

Navetta: Currently Big Data is being used by risk managers on a tactical level, and is still developing as a tool on a higher strategic level. Big Data can help identify customers and market to them effectively. It can enhance the response time and effectiveness of the information security department in responding to security threats. It can suggest ways to optimise operations within an organisation. Dashboards are being developed to help with day-to-day decision making at multiple levels. Eventually it will be used by risk managers, CFOs and boards of directors at a higher strategic level as the technology and techniques associated with Big Data mature.

Kushwaha: Risk managers are not fully harnessing the power of Big Data, for multiple reasons. First, understanding how Big Data can meaningfully help decision making is still in its infancy. Second, even where there are visionary leaders who can connect the dots, getting from a vision to an operational Big Data strategy requires multiple types of new capabilities – in IT, modelling, analytics and so on – that are still scarce in the industry. Third, although the benefits of Big Data are slowly becoming clear through select case studies across a broad spectrum of use cases, there are still more failed experiments than successes, which dampens the broad based conviction required to fundamentally transform the industry – something that will improve with time. Having said that, longer term, correctly capturing, accessing and utilising Big Data will ultimately determine the success or failure of risk managers. With the industry moving to zero tolerance for fraud and an increasing need for near real time regulatory transparency, actionable real time insights and better predictive power will become increasingly critical for assessing and responding to all types of risks.

de Goede: Risk management, much like the rest of many modern organisations, is on the brink of unleashing the potential of Big Data and analytics. However, companies still have a long way to go to truly understand all the intricacies of the new risk paradigm that comes with the dispersed and disparate data included in nearly any Big Data effort, and with the virtual layer that slowly takes up more and more space between consumers, physical assets, employees and organisational processes. As far as IT security risk is concerned, most organisations have the tooling and necessary skills in place. As far as other business risk is concerned – operational, financial, environmental, regulatory, safety and so on – an enormous need for complex event processing, monitoring and business rule engine analysis may have to be satisfied before significant and structural results in the field of risk management can be seen.

Organisations must still be able to identify a reason for storing this data and to properly assess the risks associated with such practices.
— Paul Oughton

FW: How significant are the potential cost savings when Big Data and data analytics are deployed as part of a risk management strategy?

Kushwaha: Cost savings from Big Data need to be analysed through three value levers. First is cost avoidance of fines and penalties. As has been evidenced in recent years, regulators are not averse to imposing ever increasing fines and penalties for noncompliance; individual financial organisations have been fined billions of dollars for things like money laundering and trading surveillance lapses. Second is operational efficiency, as realised by UOB of Singapore, where financial institutions are saving millions of dollars through operational efficiency strategies. Third is the source of new revenues that can be realised, as in the case study of the German company Kreditech, which uses Big Data, including social media, to instantly assess the creditworthiness of private individuals, with up to 10,000 data points considered per assessment. A US company, Kabbage, does the same for pre-financing and working capital advances for corporate customers. These firms are creating billions in enterprise value through new and innovative ways of leveraging Big Data to bring new and differentiated products to market.

Navetta: Over time, potential cost savings may be very significant, especially if the Big Data strategy is able to allow managers to efficiently identify and mitigate risk. The technology should allow fewer people to manage more risk and with better results. This is especially true as an organisation obtains and analyses more data and fine-tunes its data analytics approach. However, because an initial investment is necessary to enable Big Data, in the short term cost savings may not occur. To realise the full potential of and cost savings associated with Big Data, a long view must be taken.

Miessler: Currently, the potential cost savings are minimal, if not negative. But this is only because the technology is so new and there is still so much basic work to be done to decrease risk. So if you have to invest in Big Data technology, train people to use it, and then run a successful programme based on it, it is going to be expensive to set up. Fixing basic issues with one’s security programme, however, could be far more effective and far less expensive, but this will invert as time goes on.

de Goede: There are tremendous cost savings readily available for any company that wisely invests in new, cheaper and more intelligent technology and software. This does not always imply that Big Data or analysis must be used to unlock value; for our company, this is especially true in cases where a sensor can store a reasonable amount of data locally, has enough computing power to decide automatically and has the situational awareness to make the decision framework repeatable and faultless. Where local analysis is not sufficient – for example, for analysis of aggregates of operational data, relations between disparate datasets, pattern recognition or unstructured data mining – Big Data concepts might be called upon. The key is to use the tools and structures that are most adequate to answer the question, be it data lakes, historians, MapReduce, SQL combinations or free text search; there seems to be no more room for a one size fits all business intelligence solution.
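
As a rough illustration of the local decision capability described above, the sketch below shows a sensor that keeps a short local history and decides, without any central Big Data analysis, when to act or escalate. The readings, capacity and thresholds are invented for the example.

```python
# Illustrative sketch: a smart sensor applying a repeatable local decision
# framework. Readings, capacity and thresholds are invented placeholders.
from collections import deque

class LoadSensor:
    """Keeps a short local history of load readings and decides when to act."""

    def __init__(self, capacity_kw: float, history_size: int = 10):
        self.capacity_kw = capacity_kw
        self.history = deque(maxlen=history_size)

    def observe(self, load_kw: float) -> str:
        self.history.append(load_kw)
        average = sum(self.history) / len(self.history)
        if load_kw > self.capacity_kw:
            return "shed load"              # immediate local action
        if average > 0.9 * self.capacity_kw:
            return "report sustained peak"  # escalate for central analysis
        return "ok"

sensor = LoadSensor(capacity_kw=400)
for reading in (320, 350, 390, 410, 380):
    print(reading, "->", sensor.observe(reading))
```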

By using Big Data analytics, more and more automated, repeatable business outcomes can be generated.
— Michel de Goede

FW: What trends and developments do you expect to see within the risk management world in the coming months and years? How much further is the use of Big Data technologies and data analytics likely to transform this area?

de Goede: Risk, according to ISO 31000, is the “effect of uncertainty on objectives”. By using Big Data analytics, more and more automated, repeatable business outcomes can be generated. Hence, the deviation from any objective can be influenced to an increasing degree. Parties mastering this technique will not only reduce risk, they will also increase quality for their stakeholders – customers, shareholders or even the environment – by influencing product quality, dividend payout and so on. These parties may have a big advantage given today’s zero-defect customer expectations and in shareholders’ and analysts’ appreciation of the company. Those parties starting to think about quality as ‘certainty of attaining objectives’ – meaning output equals quality minus risk – might just turn out to be the winners of the game and may, as a result, move the risk department higher up the corporate ladder.

Kushwaha: Big Data and Big Data analytics can be the foundation to avoid billions in fines, to save millions in operational efficiency in the short term and perhaps billions over time as the regulatory environment gets more complex, or to create billions in enterprise value through new market offerings founded on the principles of Big Data. Big Data is real and here to stay in the financial services industry. In terms of key trends, in the near term Big Data initiatives will predominantly continue to revolve around improving customer intelligence, reducing risk and meeting regulatory objectives. However, over time there will be more reported successes than failures among Big Data case studies compared to today, and momentum and industry conviction will continue to build in favour of Big Data strategies. Over time the gaps between leaders and laggards will also start to become more apparent. Fuelled by these successes, there will be increasing investment in and adoption of technologies like machine learning and data science approaches applied to fraud and risk management.

Navetta: Much depends on whether Big Data strategy is able to allow managers to efficiently identify and mitigate risk. The technology should allow fewer people to manage more risk and with better results. This is especially true as an organisation obtains and analyses more data and fine-tunes its data analytics approach. However, because an initial investment is necessary to enable Big Data, in the short term cost savings may not occur. To realise the full potential of Big Data, and the associated cost savings, a long view must be taken.

Miessler: Big Data and analytics will transform the space significantly, but it will take time. The biggest changes in risk management will come from understanding that continuous configuration management, combined with security maturity models and shared indicators of maturity, are key to running a risk management organisation. Without these components it is difficult to know what the current highest risks are, and thus what next action should be taken.

 

Paul Oughton is a highly experienced security consultant and trainer for Advent IM. With a security career spanning 10 years, Mr Oughton has specialist knowledge in the provision and review of security and resilience solutions for critical sites and high profile projects. An accomplished risk management professional, he has a wealth of client advisory experience and technical expertise. He has worked extensively within the fields of critical national infrastructure and aviation security.

Michel de Goede is a Strategy Consultant / Enterprise Architect at Alliander, where he advises general and IT management on the consequences of market developments, investments, divestments and innovation, and sometimes co-works on startups, (international) M&A plans or due diligence.

Daniel Miessler specialises in creative problem solving that leverages an active 17-year technical background, a deep understanding of existing product and service solutions, and a concise, transparent communication style that adjusts to any audience. He helps organisations counteract the intrinsic resource asymmetries between attacker and defender by creating measurable security programmes focused on the maturity of prevention, detection and response capabilities.

David Navetta is a US co-chair of Norton Rose Fulbright’s data protection, privacy and cyber security practice group. He focuses on technology, privacy, information security and intellectual property law. His work ranges from compliance and transactional work to breach notification, regulatory response and litigation. Mr Navetta currently serves as ‘breach coach’ or is on the approved panel for numerous cyber insurance carriers and companies, and has helped dozens of companies across multiple industries respond to data security breaches.

Raj Kushwaha, based in New York, joined Warburg Pincus LLC in 2012 and serves as Chief Technology Officer, responsible for technology due diligence on prospective investments as well as ongoing work with the firm’s existing portfolio companies. He brings over 22 years of experience in leading commercial software product development, strategic planning, technology operations, business transformation, ERP implementations and process outsourcing initiatives at Fortune 500 companies in a variety of industries.

© Financier Worldwide
