Many organisations are struggling with backward-looking technologies and processes that focus on their current risk profile rather than identifying their optimal risk profile for the future. They need deep insight into root causes, indirect effects, and early warning signals, based on new rating methodologies that use new and innovative data sources. Current processes and technologies for “problem finding” and “problem solving” that increase control measurability and risk responsiveness include:
  • Sensors and agents, such as social media listening platforms, horizon scanning, and early warning systems
  • Integrated data warehouses and big data technologies such as Hadoop
  • High frequency reporting and continuous control monitoring
  • Crowdsourcing risk information
  • Visual data discovery, optical character recognition and natural language processing
  • Scenario planning and stress testing
  • Risk analytics and machine learning
Fast, automated access to accurate data through a Business Activity Monitoring (BAM) platform and an integrated data warehouse are prerequisites for the strategic use of advanced analytics. A BAM platform with appropriate sensors and agents answers questions using a set of analysis techniques, including baseline analysis, threshold monitoring, correlation, root cause and impact analysis, and prediction.

Continuous control monitoring allows testing of a full population and offers near “absolute assurance”, as opposed to the reasonable assurance provided by traditional audit and assurance methods. In addition to cost reductions through improved efficiency and effectiveness, its benefits include increased test coverage (through greater sampling and the ability to do more with the same or less labour), improved timeliness of testing, reduced risk velocity and potentially reduced remediation cost, greater visibility (when included in a GRC solution), improved consistency, and the ability to identify trends.

“Risk analytics” is the use of mathematical methods and tools to facilitate the risk value shop activities. These tools have evolved from “what did happen” data warehouse reporting, through “why did it happen” OLAP tools, to “what will happen” machine learning and artificial intelligence. The key elements of risk analytics today are high-frequency reporting based on high-quality data, risk analysis that “drills down” to detailed data, risk modelling and optimisation to predict events, continuous monitoring and real-time alerting, and automated response mechanisms. Machine learning algorithms can be classified as follows:
  • Regression algorithms – iteratively refine a model of the relationships between variables. Examples: ordinary least squares regression (OLSR), linear regression, logistic regression, stepwise regression, multivariate adaptive regression splines (MARS), and locally estimated scatterplot smoothing (LOESS).
  • Instance-based algorithms – learn from example data and compare it to new data. Examples: k-nearest neighbour (kNN), learning vector quantisation (LVQ), support vector machines (SVM), self-organising map (SOM), and locally weighted learning (LWL).
  • Regularisation algorithms – simplify other models (such as regression) to improve generalisation. Examples: ridge regression, least absolute shrinkage and selection operator (LASSO), elastic net, and least-angle regression (LARS).
  • Decision tree algorithms – construct a model of decisions made based on attributes of the data. Examples: classification and regression tree (CART), linear temporal logic (LTL) checker, iterative dichotomiser 3 (ID3), C4.5, C5.0, chi-squared automatic interaction detection (CHAID), decision stump, M5, and conditional decision trees.
  • Bayesian algorithms – calculate the probability of an event. Examples: naïve Bayes, Gaussian naïve Bayes, multinomial naïve Bayes, averaged one-dependence estimators (AODE), Bayesian belief network (BBN), Bayesian network (BN), hidden Markov model (HMM), and Markov chain Monte Carlo (MCMC).
  • Clustering algorithms – use the inherent structure in the data to organise it into groups. Examples: k-means, k-medians, expectation maximisation (EM), and hierarchical clustering.
  • Association rule learning algorithms – extract rules that best explain observed relationships in data. Examples: Apriori and Eclat.
  • Artificial neural network algorithms – model complex patterns inspired by biological neural networks. Examples: perceptron, back-propagation, Hopfield network, and radial basis function network (RBFN).
  • Deep learning algorithms – handle larger and more complex datasets. Examples: deep Boltzmann machine (DBM), deep belief networks (DBN), convolutional neural network (CNN), and stacked auto-encoders.
  • Dimensionality reduction algorithms – seek and exploit the inherent structure in the data to summarise or describe it. Examples: principal component analysis (PCA), principal component regression (PCR), partial least squares regression (PLSR), Sammon mapping, multidimensional scaling (MDS), projection pursuit, linear discriminant analysis (LDA), mixture discriminant analysis (MDA), quadratic discriminant analysis (QDA), and flexible discriminant analysis (FDA).
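To make these families a little more concrete, here is a minimal sketch of one of them: a logistic regression classifier (from the regression family) fitted to an entirely hypothetical set of control-test observations, estimating the probability that a control fails its next audit. The features, the simulated labels and the thresholds are illustrative assumptions, and scikit-learn is assumed to be available.

```python
# Minimal sketch: logistic regression (one of the regression-family algorithms above)
# applied to a hypothetical dataset of control-test observations.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Hypothetical features per control test: days since last review,
# number of exceptions found, and a scaled transaction volume.
n = 500
X = np.column_stack([
    rng.integers(1, 365, n),   # days_since_review
    rng.poisson(2, n),         # exceptions_found
    rng.normal(0, 1, n),       # scaled_volume
])
# Hypothetical label: 1 = control failed a subsequent audit, 0 = passed.
logits = 0.01 * X[:, 0] + 0.8 * X[:, 1] + 0.3 * X[:, 2] - 4.0
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

# The predicted probability of failure can feed risk reporting or alerting.
print("Hold-out accuracy:", model.score(X_test, y_test))
print("Failure probability for a control reviewed 300 days ago with 5 exceptions:",
      model.predict_proba([[300, 5, 1.2]])[0, 1])
```

The same pattern applies to the other supervised families in the list; only the estimator and its assumptions change.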
In the “problem finding” and “problem solving” value shop activities, analytics and machine learning can be used to:
  • Discover processes from logs
  • Detect process inconsistencies between different groups or applications
  • Perform rule-based checking (such as SLAs) – see the sketch after this list
  • Identify trends and make forecasts
  • Predict the likelihood and impact of an event
  • Classify data and build decision trees
  • Visualise data
  • Discover key risk indicators
  • Analyse stresses and scenarios
  • Identify typical problems or common solutions
  • Generate treatment options – the risk of action, inaction, over-reaction, and under-reaction
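One of the activities above, rule-based checking against an SLA, might be sketched as follows: every record in a hypothetical population of incidents is tested against an assumed 24-hour service level, in the spirit of continuous control monitoring over a full population. The data, column names and threshold are invented for illustration, and pandas is assumed to be available.

```python
# Minimal sketch: rule-based SLA checking over a full population of incident
# records (rather than a sample). The data, column names and the 24-hour SLA
# are hypothetical assumptions.
import pandas as pd

incidents = pd.DataFrame({
    "incident_id": [101, 102, 103, 104],
    "opened":   pd.to_datetime(["2024-01-02 09:00", "2024-01-02 11:30",
                                "2024-01-03 08:15", "2024-01-04 16:45"]),
    "resolved": pd.to_datetime(["2024-01-02 17:00", "2024-01-03 14:00",
                                "2024-01-03 10:00", "2024-01-06 09:00"]),
    "severity": ["high", "high", "low", "high"],
})

SLA_HOURS = 24  # hypothetical service level for high-severity incidents

incidents["hours_to_resolve"] = (
    (incidents["resolved"] - incidents["opened"]).dt.total_seconds() / 3600
)
breaches = incidents[(incidents["severity"] == "high")
                     & (incidents["hours_to_resolve"] > SLA_HOURS)]

print(f"SLA breach rate (high severity): "
      f"{len(breaches) / (incidents['severity'] == 'high').sum():.0%}")
print(breaches[["incident_id", "hours_to_resolve"]])
```

Because every record is tested, the breach rate is a population measure rather than a sample estimate, which is what gives continuous control monitoring its near “absolute assurance” character.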
Current processes and technologies for the “choice” value shop activity include:
  • Machine learning (as described above)
  • Simulation models (see the sketch after this list)
  • Stochastic optimisation models
  • Artificial intelligence, and
  • Business Process Management (BPM)
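As a rough illustration of the simulation models listed above, the sketch below runs a simple Monte Carlo simulation to compare the annual loss distribution under two hypothetical treatment options. The frequency and severity parameters, and the assumption that a new control halves event frequency, are invented for illustration.

```python
# Minimal sketch: Monte Carlo simulation of annual operational losses under two
# hypothetical treatment options, to support the "choice" activity.
# All frequency and severity parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(7)
N_YEARS = 50_000  # number of simulated years

def simulate_annual_loss(event_rate, severity_mean, severity_sigma):
    """Aggregate loss per simulated year: Poisson frequency, lognormal severity."""
    counts = rng.poisson(event_rate, N_YEARS)
    return np.array([
        rng.lognormal(severity_mean, severity_sigma, c).sum() for c in counts
    ])

# Option A: do nothing; Option B: a control assumed to halve event frequency.
option_a = simulate_annual_loss(event_rate=4.0, severity_mean=10.0, severity_sigma=1.0)
option_b = simulate_annual_loss(event_rate=2.0, severity_mean=10.0, severity_sigma=1.0)

for name, losses in [("A (no action)", option_a), ("B (new control)", option_b)]:
    print(f"Option {name}: mean loss {losses.mean():,.0f}, "
          f"99th percentile {np.percentile(losses, 99):,.0f}")
```

Comparing the simulated distributions, rather than single point estimates, is what allows the “choice” activity to weigh treatment options against risk appetite.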
The benefit of analytical measures for decision making is that they can remove the overconfidence and anchoring biases inherent in manual decision making. However, because our understanding of what information should trigger action keeps evolving, the use of analytics for decision making should also be iterative. As mentioned in our discussion on risk transformation architecture, the awareness and analysis provided by Business Intelligence (BI) need to be combined with Business Process Management (BPM), which supplies the business rules, process models, and process orchestration needed to make decisions and execute them. BI then enhances BPM, making decisions more efficient as well as more repeatable, scalable, traceable, and accurate; a simple sketch of this combination follows the list below. Current processes and technologies for the “execute” and “evaluate” value shop activities include:
  • Machine Learning
  • Business Process Management (BPM), and
  • Case management, through an integrated GRC
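To illustrate how an analytic (BI) risk score might drive a repeatable (BPM) decision rule, with escalation into case management, consider the following sketch. The score ranges, thresholds and case structure are hypothetical assumptions rather than a description of any particular GRC product.

```python
# Minimal sketch: a BI-derived risk score driving a BPM-style decision rule,
# with high scores escalated as cases (as a GRC/case-management step might do).
# Thresholds, score ranges and the case structure are hypothetical assumptions.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Case:
    subject: str
    risk_score: float
    opened_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    status: str = "open"

ESCALATE_ABOVE = 0.8   # open a case for investigation
REVIEW_ABOVE = 0.5     # route to a manual review queue

def decide(subject: str, risk_score: float, case_log: list[Case]) -> str:
    """Apply a simple, repeatable business rule to an analytic risk score."""
    if risk_score > ESCALATE_ABOVE:
        case_log.append(Case(subject, risk_score))
        return "escalate"
    if risk_score > REVIEW_ABOVE:
        return "manual_review"
    return "auto_approve"

cases: list[Case] = []
for subject, score in [("vendor-123", 0.91), ("vendor-456", 0.62), ("vendor-789", 0.12)]:
    print(subject, "->", decide(subject, score, cases))
print(f"{len(cases)} case(s) opened for investigation")
```

The rule itself is trivial; the point is that encoding it makes the decision repeatable, traceable and auditable, which is what the BI-plus-BPM combination described above is meant to achieve.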
A majority of organisations have structured only part of their risk reporting process and do not have predefined escalation mechanisms in place. An integrated GRC approach improves effectiveness by reducing complexity, facilitates standardisation of processes and taxonomy, allows controls to be rationalised, and enables a shift from tactical to strategic activities through automation. Automating controls reduces the cost of the control (resources, time and effort) as well as the cost of assurance, and reduces risk through increased control effectiveness and coverage. An advanced stage of automation is “enterprise valuation control”, where business performance is assessed and process risk versus value is calculated in near real time during process execution (a rough sketch of this idea follows the list below). Beyond this stage is an adaptive IT system that allows a movement from “security by design” to an adaptive threat, security and risk model. In determining a risk transformation roadmap, a number of technology implementation options exist, including:
  • Using a full stack vendor and implementing technologies based on their roadmap – if you are prepared for potential lock-in.
  • Picking a best of breed solution that includes BI, BPM, GRC and an application development platform – if you can invest in the necessary skills.
  • Picking a BI platform only – if you have sufficient data management and integration tools and skills.
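As a rough sketch of what calculating process risk versus value during execution could look like, the example below scores each in-flight transaction by comparing a crude expected-loss estimate with its expected value and blocks those where loss outweighs value. The probability figures, margin and blocking threshold are purely illustrative assumptions.

```python
# Minimal sketch: near-real-time "risk versus value" scoring of in-flight
# transactions, as a toy illustration of enterprise valuation control.
# The probability figures, margin and blocking threshold are assumptions.
def risk_vs_value(amount: float, failure_probability: float, margin: float = 0.05) -> float:
    """Ratio of expected loss to expected value for a single transaction."""
    expected_value = amount * margin
    expected_loss = amount * failure_probability
    return expected_loss / expected_value if expected_value else float("inf")

BLOCK_ABOVE = 1.0  # block when expected loss exceeds expected value

transactions = [
    {"id": "T1", "amount": 10_000, "failure_probability": 0.001},
    {"id": "T2", "amount": 250_000, "failure_probability": 0.08},
]

for tx in transactions:
    score = risk_vs_value(tx["amount"], tx["failure_probability"])
    action = "block and review" if score > BLOCK_ABOVE else "allow"
    print(f"{tx['id']}: risk/value = {score:.2f} -> {action}")
```

In practice the probabilities and margins would come from the BAM platform and risk models in near real time rather than being hard-coded.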