Research on AI Risk
Background
- AI adoption is accelerating rapidly, and more powerful and complex AI systems are expected to emerge. As a lab with strong AI capability and insurance expertise, it is our urgent task to create insurance that covers damage caused by AI.
- ADJ has already begun to explore the development of such insurance, focusing on immediate, visible issues that can be resolved in a short time frame. However, the AI landscape is vast, and we believe there are unseen yet significant AI risks that can be insured and mitigated using technology.
- The Oxford Martin School recently launched the AI Governance Initiative, led by Prof. Mike Osborne and Robert Trager. We have held several discussions with them to explore potential collaboration on the latter issue.
Hypothesis
- By integrating Aioi's expertise in insurance, the Martin School's capabilities in AI research, and Mind Foundry's proficiency in AI for risk detection and mitigation, we could develop effective insurance products and services.
- ADJ's product development department noted that, from an insurance product development point of view, they would like to identify areas where 'cause investigation' and 'AI performance evaluation' are possible (a rough sketch of the latter follows below).
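To make 'AI performance evaluation' slightly more concrete, one plausible building block (our assumption, not a requirement stated by ADJ) is measuring a model's accuracy and probability calibration on held-out data, since both bear on how confidently losses could be underwritten. A minimal sketch on synthetic data:

```python
# Hedged sketch: evaluating an AI model's predictive performance and calibration,
# two quantities that could feed an insurability assessment. The synthetic data,
# model choice, and metrics below are illustrative assumptions only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, brier_score_loss
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))                                   # synthetic features
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=1000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

pred = model.predict(X_test)
prob = model.predict_proba(X_test)[:, 1]
print("accuracy:", accuracy_score(y_test, pred))                 # how often the model is right
print("Brier score:", brier_score_loss(y_test, prob))            # how well its probabilities are calibrated
```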
Research
- Phase I: Initial scoping. The deliverable would ideally be a short, public, and impactful co-branded Oxford Martin AI Governance Initiative (AIGI) and Aioi R&D Lab report. It would survey the AI insurance space and highlight key areas for future work, focusing on the US (with a side report on Japan, in Japanese) over a near-term (perhaps five-year) horizon.
- Phase II (from autumn): research will begin in earnest. We discussed a possible qualitative methodology using structured interviews with experts across the three pillars (size of risk, insurability, potential for mitigation). We might also investigate quantitative approaches, e.g. using patent data, as sketched below.
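As an illustration of what a quantitative approach might look like (an assumption on our part, not an agreed methodology), recent AI-related patent filing counts could serve as a rough proxy for AI adoption, and hence exposure, by sector. The sketch below assumes a hypothetical pre-collected CSV `ai_patents.csv` with columns `sector` and `filing_year`; the file, column names, and time window are illustrative.

```python
# Hedged sketch: estimating relative AI exposure per sector from patent counts.
# The input file, its column names, and the cut-off year are hypothetical assumptions.
import pandas as pd

def ai_exposure_by_sector(path: str = "ai_patents.csv", since: int = 2020) -> pd.Series:
    """Return each sector's share of recent AI-related patent filings."""
    patents = pd.read_csv(path)
    recent = patents[patents["filing_year"] >= since]   # keep only recent filings
    counts = recent.groupby("sector").size()            # filings per sector
    return (counts / counts.sum()).sort_values(ascending=False)

if __name__ == "__main__":
    print(ai_exposure_by_sector())
```

Such a proxy would only rank sectors by apparent AI activity; it says nothing by itself about insurability or mitigation potential, which is why the interview-based pillars remain the core of the proposed methodology.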
Additional Resources
Request a report from here