KEY TAKEAWAYS:
--The insurance industry must adapt to digital natives, who are tech-savvy, connected and prioritize convenience.
--Facial analytics can already detect dozens of health-related signals and may soon predict an individual's risk of illness or disease with remarkable accuracy.
--A simple selfie photo helps insurance carriers instantly triage applicants into refined risk pools.
----------
The insurance industry has long been associated with a traditional approach to doing business, often relying on face-to-face interactions and paper-based processes. However, as digital natives enter the workforce, insurance companies must adapt to their preferences and expectations to remain competitive.
Underwriting the old way
In conventional underwriting, carriers gather details such as age and gender, then use actuarial tables to project survival and sort applicants into risk classes, each carrying a specific premium.
Traditional life insurance companies don't directly evaluate personal health and lifespan but follow established guidelines to sort individuals into risk categories based on demographic features and warning signs like smoking, obesity and pre-existing health issues.
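The table-driven sorting described above can be sketched as a toy rule-based classifier. Everything below is invented for illustration: the class names, thresholds, base rate and multipliers are not real actuarial values.

```python
# Illustrative sketch only: a toy rule-based risk classifier in the spirit of
# traditional table-driven underwriting. Thresholds and multipliers are
# invented for illustration, not drawn from any real rate table.

BASE_ANNUAL_PREMIUM = 500.0  # hypothetical base rate for the cohort

def risk_class(age: int, smoker: bool, bmi: float, preexisting: bool) -> str:
    """Sort an applicant into a coarse risk class from demographic flags."""
    if smoker or preexisting or bmi >= 35:
        return "substandard"
    if age < 40 and 18.5 <= bmi < 30:
        return "preferred"
    return "standard"

# Each class maps to a premium multiplier, mimicking a rate table lookup.
MULTIPLIER = {"preferred": 0.8, "standard": 1.0, "substandard": 1.75}

def annual_premium(age: int, smoker: bool, bmi: float, preexisting: bool):
    """Return the risk class and the premium implied by its multiplier."""
    cls = risk_class(age, smoker, bmi, preexisting)
    return cls, BASE_ANNUAL_PREMIUM * MULTIPLIER[cls]

print(annual_premium(32, smoker=False, bmi=24.0, preexisting=False))
# ('preferred', 400.0)
```

The point of the sketch is that nothing in this pipeline measures the individual directly; the premium follows entirely from which demographic bucket the applicant lands in.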
Candidates must then answer multiple questions about their family background and medical status. Depending on the policy, they might need to go to a clinic to provide blood and urine samples and have their blood pressure, weight and height checked.
Some insurers also dig further into personal backgrounds, using independent sources that report prescription medication usage and driving histories. This method is lengthy, often taking 30 to 45 days; expensive, because of the many people involved in collecting and analyzing the information; and invasive, requiring bodily fluids and data that can strike the customer as unrelated, such as driving records.
For clients, particularly the younger generation, the life insurance underwriting process is not enjoyable. So how can we improve it?
See also: Beware the Dark Side of AI
What better way to engage with a tech-savvy generation than with a selfie photo?
A selfie?
When examining a photo featuring two individuals, the human eye can easily recognize which person appears older or younger by observing signs of age like wrinkles, age spots and lines.
Computers, using the science of facial analytics, can mimic humans' ability to assess a face but with even greater accuracy.
During my Ph.D. work 30 years ago, I began researching the relationship between facial features and health outcomes. As computers grew more powerful through graphical processing units (GPUs) and advances in memory density, they propelled AI, and more specifically deep learning. These advances made it possible to apply deep learning to health intelligence.
Now AI, powered by deep learning, can identify dozens of health-related signals such as body mass index, biological age, senescence rate, physical stress, heart rate, blood pressure, genetic diseases and more. Soon, we will be able to predict an individual's risk of illness or disease with remarkable accuracy.
The future of health intelligence is in preventive healthcare, the ability to leverage facial analytics to provide signals of health from any connected mobile device. By identifying early warning signs of disease or illness, this technology could help insurers and healthcare providers intervene quickly, ultimately improving patient outcomes and reducing healthcare costs. This technology will have major impacts for every region of the world, especially in low-income and remote communities.
Using facial analytics for underwriting
Facial analysis can now be incorporated into underwriting through a simple face scan -- a selfie -- of a potential customer. This helps insurance carriers instantly validate self-reported and external data and triage applicants into more refined risk pools -- without needing body fluids or physician assessments. The technology is quicker and less intrusive for customers, and as the algorithms are continually trained and improved, it should become a more accurate and efficient way to assess risk. That could mean reduced premiums for policyholders, improved financial stability for insurance companies and, ultimately, better health outcomes for all.
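The validate-and-triage step might look like the following sketch. Here `estimate_bmi_from_selfie` is a hypothetical stand-in for a facial-analytics model (stubbed with a fixed value so the example runs), and the tolerance and pool names are assumptions, not any carrier's actual rules.

```python
# Illustrative sketch of selfie-based triage. `estimate_bmi_from_selfie` is a
# hypothetical stand-in for a facial-analytics model; it is stubbed here so
# the example runs. Tolerance and pool names are invented for illustration.

def estimate_bmi_from_selfie(selfie_bytes: bytes) -> float:
    """Stub for a facial-analytics BMI estimator (a real model would go here)."""
    return 27.5  # pretend the model returned this estimate

def triage(self_reported_bmi: float, selfie_bytes: bytes,
           tolerance: float = 3.0) -> str:
    """Validate self-reported BMI against the facial estimate, then triage."""
    estimated = estimate_bmi_from_selfie(selfie_bytes)
    if abs(estimated - self_reported_bmi) > tolerance:
        return "refer-to-underwriter"  # discrepancy: route to manual review
    if estimated < 25:
        return "preferred-pool"
    if estimated < 30:
        return "standard-pool"
    return "substandard-pool"

print(triage(self_reported_bmi=26.0, selfie_bytes=b""))  # 'standard-pool'
print(triage(self_reported_bmi=21.0, selfie_bytes=b""))  # 'refer-to-underwriter'
```

Two design points stand out even in a sketch this small: the facial estimate acts first as a consistency check on what the applicant reported, and only then as the signal that assigns the pool, so a mismatch falls back to human review rather than an automatic decision.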
Crucially, facial analytics does not entail facial recognition and cannot be used for identification or tracking; instead, it concentrates on identifying characteristics associated with risk factors and lifespan.
See also: In Race to AI, Who Guards Our Privacy?
Benefits for the insurance sector
Facial analytics can transform the insurance industry in three key ways:
- As an instant verification instrument to guarantee precise reporting of key health intelligence metrics like BMI and health conditions. This will enable immediate, accelerated processing, forever changing the insurance buying experience.
- As an indicator of life expectancy, helping identify individuals likely to live longer, estimate their expected lifespan and tailor their financial planning solutions.
- As a means to offer customized health information and uncover risk factors for debilitating or fatal illnesses, encouraging client health support -- and ultimately saving lives.
I was fortunate to have the opportunity to talk about this transformation and demonstrate facial analytics at InsureTech Connect Asia in Singapore recently. To lead this change, insurers will need to get on board with facial-analytics-based AI technology. If leaders step up and seize this opportunity, they will enable better pricing, detect and minimize fraud and, importantly, offer faster, more individualized underwriting decisions to the new digital generation of customers.