Actuarial Process Optimization: A Case for Using Modern Technology in the Actuarial Domain

Highlights

  • Actuaries are valuable and strategically important resources to insurance companies. We are trained both on the job and through an intensive education and exam curriculum to study and own insurance risk.
  • As highly capable professionals, actuaries are often self-reliant and interested in owning all the technology tools, data, and processes that support their daily work. However, this hands-on approach can lead to performing many tasks that do not require actuarial expertise.
  • By occupying ourselves with high-effort but low-value tasks, we often neglect the work that is truly valuable to the future of our organization. Advancements in technology and the desire for lean operations have led many insurers to re-evaluate their strategic direction and the role of the actuary of the future, shifting their focus onto the highest-value tasks.
  • Consider some common low-value, high-effort tasks that actuaries may encounter:
  • Unnecessarily complex and error-prone ETL, production, and reporting workflows.
  • Time and resources wasted on resolving errors and tracing back complex process steps.
  • Production and process errors that can result in misstatements and delays in reporting.
  • Multiple unvalidated spreadsheets with overlapping functionalities.
  • Multiple sources of information, but no single “source of truth.”
  • Wasted storage and processing time.
  • The general consensus is that the actuary of the future must be ready to provide strategic business direction, handle governance and risk frameworks, and optimize skill sets and technologies.
  • The actuarial process optimization framework defines three pillars of technological opportunity to support the actuary of the future in succeeding in his or her role:
  • Optimization and upscaling of existing technologies—target “band-aid” fixes, manual legacy processes, and processes that commonly fail or produce inconsistent output.
  • Process automation—target manual and frequently repetitive processes, and processes that could require scalability.
  • Taking advantage of new technologies—target opportunities for new insights, deeper analytical capabilities, new governance and control framework capabilities, and acceleration of complex multi-step processes.
  • The actuarial process optimization framework addresses the pillars of technological opportunity by tackling the highest-effort, lowest-value processes first to prove the concept within the organization and support the business case for a broader actuarial process optimization program. As companies realize returns on investments and enable actuaries to focus on higher-value tasks, they construct a sustainable technological infrastructure for the actuary of the future.
  • Providing Strategic Business Direction
  • Data has always been part of the actuarial realm. Now, more than ever, companies are relying on data to make strategic business decisions. Effective data management, data analysis, and predictive analytics provide opportunities to aid in the actuary’s role in guiding decision making. Insurance companies have been exploring big data, both structured and unstructured, to gain additional insights into their business and industry trends.
  • As stewards and users of data, actuaries are also adept at finding the meaning in data, seeing trends, and generating insights. Enhanced reporting, analytics, and visualization will allow better understanding of your data and improve communication of findings.
  • Disclosures must provide additional detail on financial results through attribution analysis and trends over multiple reporting periods. Many companies saw these regulatory reporting needs as a catalyst to enhance the breadth and depth of their management reporting and to take advantage of modern technology through actuarial process optimization.
  • Modern technology solutions have replaced manual, error-prone consolidation and reporting processes that were often driven by spreadsheets and carried significant key-person risk.
  • Maintaining Governance, Controls, and Risk Frameworks
  • For actuaries to effectively own and manage the risk frameworks within their organizations, they must rely on strong governance structures driven by a robust control framework and risk reporting.
  • The repetitive nature of control testing processes makes them prime candidates for actuarial process optimization. Some insurance companies have introduced robotic process automation to allow actuarial control testing processes to be carried out in a more accurate, efficient, consistent, and scalable manner. These automated control testing routines can be initiated on a periodic or event-driven basis.
  • Common use cases for actuarial control testing automation include data quality testing and validation of large data sets.
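As a minimal sketch of what such an automated data quality control might look like, consider the Python snippet below. The file name, column names, and specific checks are illustrative assumptions, not taken from the article; in practice the rules would be defined by the control framework and the routine triggered by a scheduler or RPA tool on a periodic or event-driven basis.

```python
import pandas as pd

# Illustrative sketch only: the file name, column names, and rules
# below are assumptions, not drawn from any specific control framework.
REQUIRED_COLUMNS = ["policy_id", "issue_date", "face_amount"]

def run_data_quality_controls(df: pd.DataFrame) -> list:
    """Run automated data-quality checks and return exception messages."""
    exceptions = []

    # Completeness: key fields must be populated on every record.
    for col in REQUIRED_COLUMNS:
        n_missing = int(df[col].isna().sum())
        if n_missing:
            exceptions.append(f"{n_missing} records missing '{col}'")

    # Uniqueness: policy identifiers must not repeat.
    n_dupes = int(df["policy_id"].duplicated().sum())
    if n_dupes:
        exceptions.append(f"{n_dupes} duplicate policy_id values")

    # Reasonableness: face amounts must be strictly positive.
    n_bad = int((df["face_amount"] <= 0).sum())
    if n_bad:
        exceptions.append(f"{n_bad} records with non-positive face_amount")

    return exceptions

if __name__ == "__main__":
    # A scheduler or RPA tool could run this step automatically and
    # route any exceptions to reviewers.
    extract = pd.read_csv("policy_extract.csv")  # illustrative path
    for issue in run_data_quality_controls(extract):
        print("CONTROL EXCEPTION:", issue)
```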
  • Consider a large insurance company that routinely uses dozens of assumption sets across functional areas and insurance products. Each set may contain hundreds of tables and adjustment factors. One of the most common sources of model errors identified through model validation exercises is the use of inappropriate assumptions in actuarial calculations.
  • A rule-based automated validation routine that systematically scans assumption tables, both for data quality and for logical relationships among table values, can increase the level of confidence in actuarial model results. Rule-based validation allows every value in an assumption set to be checked, something that would be extremely difficult to execute manually. Rules can range from simple data quality validation rules, like “all table values must be between 0 and 1,” to more complex rules validating relationships across select and ultimate rates within a table or between a smoker and non-smoker table.
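To make the idea concrete, here is a minimal sketch of such a rule-based validation routine in Python. The table structure, age range, and specific rules are illustrative assumptions; a production routine would load its rules from a governed rule library and cover every table in the assumption set.

```python
import pandas as pd

def validate_mortality_tables(nonsmoker: pd.Series, smoker: pd.Series) -> list:
    """Apply rule-based checks to smoker/non-smoker mortality tables
    (rates indexed by age) and return any rule violations.
    Table names, ages, and rules are illustrative assumptions."""
    violations = []

    # Rule 1 (data quality): all table values must be between 0 and 1.
    for name, table in [("non-smoker", nonsmoker), ("smoker", smoker)]:
        bad_ages = table[(table < 0) | (table > 1)].index.tolist()
        if bad_ages:
            violations.append(f"{name}: rates outside [0, 1] at ages {bad_ages}")

    # Rule 2 (within-table relationship): at adult ages, mortality
    # rates should not decrease as attained age increases.
    for name, table in [("non-smoker", nonsmoker), ("smoker", smoker)]:
        if not table.loc[30:].is_monotonic_increasing:
            violations.append(f"{name}: rates not non-decreasing above age 30")

    # Rule 3 (between-table relationship): the smoker rate should be
    # at least as high as the non-smoker rate at every age.
    bad_ages = smoker[smoker < nonsmoker].index.tolist()
    if bad_ages:
        violations.append(f"smoker rate below non-smoker rate at ages {bad_ages}")

    return violations

if __name__ == "__main__":
    # Toy tables for ages 30-35; the age-34 smoker rate is deliberately
    # wrong, so it triggers both Rule 2 and Rule 3.
    ages = range(30, 36)
    ns = pd.Series([0.0010, 0.0012, 0.0013, 0.0015, 0.0016, 0.0018], index=ages)
    sm = pd.Series([0.0020, 0.0024, 0.0026, 0.0030, 0.0015, 0.0036], index=ages)
    for v in validate_mortality_tables(ns, sm):
        print("RULE VIOLATION:", v)
```

Because each rule is just a function of the table values, the same routine scales from a handful of tables to the hundreds in a full assumption set, which is what makes this kind of validation practical where manual review is not.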