Brokerage Ops | October 2021 Issue

Elevating the Art of Underwriting

Q&A with Jay Nichols, Senior Corporate Advisor, and Jeff Tyler, Head of Product, Data Science and Engineering, Two Sigma IQ
Sponsored by Two Sigma IQ | Posted on September 30, 2021

As the industry sits on a wealth of data, Two Sigma IQ, a data science-driven SaaS insurance solutions provider, suggests a path to better underwriting through an event-based modeling approach. The challenge, Nichols and Tyler acknowledge, is structuring the data to make it usable, but when it comes to improving the future of underwriting, they say it’s worth the effort.

Q
What are the current challenges that P&C commercial underwriters face, and what role do you see technology and data science playing in addressing these challenges?
A

Nichols: There are always challenges in commercial insurance. One fundamental issue is that capital is cheap and risk is evolving. Our primary focus is on creating tools to provide insight into the risk side of the equation. We are seeing new risks and more severe events every year, from climate change to the sharing economy.

Tyler: Underwriters are under constant pressure to produce profitable business and more of it. Using the same techniques and tools, they can only get so far. They need a robotic arm, an augmentation to their arsenal that enables them to understand risk quickly and make decisions.

Q
Where does current insurance technology fall short for underwriters?
A

Tyler: To properly harness these data science approaches, you need to grapple with the tremendous amounts of unstructured data that live outside the existing data management infrastructures at many carriers. Additionally, for business that wasn’t written, the quality of data is very low on average.

Nichols: The opportunity is to take all the available data—internal, proprietary, and external—and create a new way to better understand risk and make better decisions across the spectrum of opportunities.

Q
In what ways could firms address those failure points?
A

Nichols: Although this is not new, we believe that the use of event-based underwriting is the future, for several reasons. There are available data sets, the most valuable of which is an incumbent’s claims database. And as risks change, event sets [a collection of possible risk outcomes the insured could experience] can be modified by science (energy, weather patterns, etc.), math (interpolation, trending, etc.), and behaviors (e.g., mitigation). Each of these areas has evolved significantly in the past decade and can be deployed with speed and an enhanced user experience to provide better risk-weighted pricing.
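To make the event-set idea concrete, here is a minimal sketch, assuming an event set is represented as a simple table of hypothetical outcomes with annual rates and loss amounts; the event names, rates, losses, and trend factor below are illustrative assumptions, not Two Sigma IQ's model. It shows how a "math" adjustment such as trending modifies an event set before computing a risk-weighted expected loss.

```python
# Minimal sketch of an "event set": possible outcomes for an insured, each with
# an annual rate and a loss amount. All figures are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Event:
    name: str           # hypothetical label, e.g., "windstorm"
    annual_rate: float  # expected occurrences per year
    loss: float         # loss to the insured if the event occurs

def trend_losses(events: list[Event], annual_trend: float, years: int) -> list[Event]:
    """Apply a simple severity trend, one of the 'math' adjustments mentioned above."""
    factor = (1 + annual_trend) ** years
    return [Event(e.name, e.annual_rate, e.loss * factor) for e in events]

def expected_annual_loss(events: list[Event]) -> float:
    """Risk-weighted loss: sum of rate times loss across the event set."""
    return sum(e.annual_rate * e.loss for e in events)

event_set = [
    Event("windstorm", annual_rate=0.02, loss=5_000_000),
    Event("warehouse_fire", annual_rate=0.05, loss=1_200_000),
    Event("liability_claim", annual_rate=0.30, loss=150_000),
]

trended = trend_losses(event_set, annual_trend=0.04, years=3)
print(f"Expected annual loss (trended): {expected_annual_loss(trended):,.0f}")
```

A real model would also adjust the rates themselves and reflect mitigation, but the structure is the same: a distribution of possible outcomes rather than a single historical average.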

Q
At what points within the underwriting process do you see data science and technology making the most impact?
A

Nichols: The use of data and technology should be influencing all aspects of the insurance value chain—from targeting risks to creating appropriate risk appetite, understanding impacts of mitigation and behaviors, and informing claims experts.

Tyler: There is no corner that should be untouched. At TSIQ, we are presently focused on how to bring immediate value to underwriters upon receipt of submissions. This means fusing third-party data into the application, analyzing loss runs and schedules of exposures, and enabling the underwriter to pick which business to spend time assessing.

Q
With all the data present today, why is it such a challenge to leverage both internal data and third-party data in the underwriting process?
A

Tyler: Data being present and data being available are not the same thing. The data is often in no shape to leverage in a process (other than manually). It needs to be cleansed, organized, and cataloged, and then delivered to the users. Underwriters shouldn’t have to go hunting for the information needed to determine the quality of a risk.

Nichols: The difficulty is getting it to the underwriters at the point of decision-making within the context of their workflow. Many times, different groups or departments have access to valuable data, but it is trapped in silos or spreadsheets or stored in less than “underwriter-friendly” formats. Making it available across systems and enabling it to be shared with employees, especially underwriters, when they most need it is a big challenge.

Q
The industry has been trying to solve that problem for a while. Are we any closer to solving it?
A

Tyler: The industry has targeted most of its solutions at technical users and the back office. Our approach is putting the power into the hands of the decision makers, the underwriters. Our plan is to give underwriters leverage over their data from the second a submission comes in through the entire decision process.

Q
How does the workforce need to adapt in order to pave the way for leveraging technology and data science for better underwriting results?
A

Nichols: Underwriters need to allow some of the information to be created and delivered to them in a form they can use to price risks, while still understanding pricing sensitivities. A big challenge is creating event sets by line of business or at a more granular level and being able to monitor the use and “manipulation” of the event sets, or distributions of outcomes, to arrive at real risk-based pricing for all the risks of an insured.

Tyler: Underwriters need to see themselves as portfolio managers, in a sense. They should not just assess one risk at a time but adapt to changing conditions, identify value opportunities, and diversify their exposure as required.

Q
How do you see the role of an underwriter changing for firms that are able to transform their business using data science and technology?
A

Nichols: In the ‘90s, the property catastrophe industry fundamentally transformed the entire underwriting process by incorporating “event-based underwriting,” replacing account-based historical results with location-based exposure rating. The catastrophe pricing models consist of an event set, a hazard module, a frequency model, and a financial model. The underwriter of the future should be better versed in this event-based approach and drive the adoption of event-based modeling more broadly than it is currently used. Two positive byproducts of event-based modeling: first, equipping underwriters to have better conversations with their clients; and second, arming underwriters and the entire insurer to be more engaged in conversations about mitigation and risk management.
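For readers less familiar with that structure, the sketch below strings those components together in the simplest possible way: a frequency assumption per event, a hazard intensity at the insured location, a toy vulnerability curve, and a financial step applying policy terms. The damage function, policy terms, and event rates are hypothetical assumptions for illustration, not any vendor's actual catastrophe model.

```python
# Toy illustration of the components Nichols lists: an event set, a hazard
# module (intensity at the location), a frequency model (annual rates), and a
# financial module (policy terms). All numbers are hypothetical.
from dataclasses import dataclass

@dataclass
class CatEvent:
    event_id: str
    annual_rate: float       # frequency model: expected occurrences per year
    hazard_intensity: float  # hazard module output at the insured location (0 to 1 here)

def damage_ratio(intensity: float) -> float:
    """Toy vulnerability curve: fraction of insured value damaged at a given intensity."""
    return min(1.0, intensity ** 2)

def financial_module(ground_up_loss: float, deductible: float, limit: float) -> float:
    """Apply deductible and limit to convert ground-up loss to insured loss."""
    return max(0.0, min(ground_up_loss - deductible, limit))

def expected_insured_loss(events: list[CatEvent], insured_value: float,
                          deductible: float, limit: float) -> float:
    total = 0.0
    for e in events:
        ground_up = insured_value * damage_ratio(e.hazard_intensity)
        total += e.annual_rate * financial_module(ground_up, deductible, limit)
    return total

events = [
    CatEvent("hurricane_100yr", annual_rate=0.01, hazard_intensity=0.8),
    CatEvent("hurricane_20yr", annual_rate=0.05, hazard_intensity=0.4),
]
print(expected_insured_loss(events, insured_value=10_000_000,
                            deductible=250_000, limit=5_000_000))
```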

Q
What types of insights and analytics would you like to have that are currently challenging to extract?
A

Nichols: I think the most important insight is the utilization of existing claims information to populate an event-based model. Claims information needs to be moved to the front of the process. Inputs should learn from outcomes. Some firms are challenged to organize claims data into a usable format, but with advances in natural language processing and machine learning, it is getting more manageable and less expensive.

Tyler: This is a particular focus of ours. If you can digitally represent the loss data that’s received in an application immediately, you can start to build a usable event set for anomaly detection, probability curve generation, and countless other use cases.
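As a rough illustration of what digitized loss data enables, the sketch below builds an empirical exceedance-probability curve and a crude outlier flag from a loss run; the loss amounts and the two-standard-deviation threshold are hypothetical assumptions, and a production system would use far more robust methods.

```python
# Minimal sketch: once a loss run is digitized, simple analytics become possible.
# Loss amounts below are hypothetical.
import statistics

loss_run = [12_000, 8_500, 45_000, 9_800, 310_000, 15_200, 11_700]

def exceedance_curve(losses: list[float]) -> list[tuple[float, float]]:
    """For each observed loss, the empirical probability of a loss at least that large."""
    ordered = sorted(losses, reverse=True)
    n = len(ordered)
    return [(loss, (rank + 1) / n) for rank, loss in enumerate(ordered)]

def flag_anomalies(losses: list[float], z_threshold: float = 2.0) -> list[float]:
    """Flag losses far from the mean; a crude stand-in for real anomaly detection."""
    mu = statistics.mean(losses)
    sigma = statistics.stdev(losses)
    return [x for x in losses if abs(x - mu) / sigma > z_threshold]

print(exceedance_curve(loss_run))  # empirical probability curve points
print(flag_anomalies(loss_run))    # the 310,000 loss stands out in this hypothetical data
```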

Q
How do you see the use of data science impacting future underwriting and pricing cycles?
A

Nichols: The goal would be to see pricing based upon the inherent risk, as informed by event-based models. Pricing cycles historically have been informed by what happened in the past, and that is not good for the industry. An insured should pay a fair price based upon the risk in the upcoming coverage period, not for the outcome of past years, although informed by the outcome of past years.

I think there will always be a factor for volatility that will be an input for pricing and will contribute to pricing cycles, but raising all prices X percent or talking about achieved rate increases seems to me like using averages in a universe of unique risks, exposures, and businesses.
