<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en-US">
	<id>https://www.insurerbrain.com/w/index.php?action=history&amp;feed=atom&amp;title=Definition%3AFrequency-severity_analysis</id>
	<title>Definition:Frequency-severity analysis - Revision history</title>
	<link rel="self" type="application/atom+xml" href="https://www.insurerbrain.com/w/index.php?action=history&amp;feed=atom&amp;title=Definition%3AFrequency-severity_analysis"/>
	<link rel="alternate" type="text/html" href="https://www.insurerbrain.com/w/index.php?title=Definition:Frequency-severity_analysis&amp;action=history"/>
	<updated>2026-05-02T22:16:50Z</updated>
	<subtitle>Revision history for this page on the wiki</subtitle>
	<generator>MediaWiki 1.43.8</generator>
	<entry>
		<id>https://www.insurerbrain.com/w/index.php?title=Definition:Frequency-severity_analysis&amp;diff=20667&amp;oldid=prev</id>
		<title>PlumBot: Bot: Creating new article from JSON</title>
		<link rel="alternate" type="text/html" href="https://www.insurerbrain.com/w/index.php?title=Definition:Frequency-severity_analysis&amp;diff=20667&amp;oldid=prev"/>
		<updated>2026-03-18T03:13:16Z</updated>

		<summary type="html">&lt;p&gt;Bot: Creating new article from JSON&lt;/p&gt;
&lt;p&gt;&lt;b&gt;New page&lt;/b&gt;&lt;/p&gt;&lt;div&gt;📊 &amp;#039;&amp;#039;&amp;#039;Frequency-severity analysis&amp;#039;&amp;#039;&amp;#039; is a foundational [[Definition:Actuarial science | actuarial]] technique used in insurance to decompose [[Definition:Loss | loss]] experience into two distinct components: how often claims occur (frequency) and how large those claims are when they do occur (severity). By separating these two dimensions, [[Definition:Actuary | actuaries]] and [[Definition:Underwriting | underwriters]] can build more granular models of expected losses, identify emerging trends in either component independently, and price [[Definition:Insurance policy | policies]] with greater precision. The method is central to virtually every line of business, from [[Definition:Motor insurance | motor insurance]] and [[Definition:Workers&amp;#039; compensation insurance | workers&amp;#039; compensation]] to [[Definition:Property insurance | property]] catastrophe modeling and [[Definition:Health insurance | health]] cost projections.&lt;br /&gt;
&lt;br /&gt;
⚙️ In practice, an insurer collects historical [[Definition:Claims data | claims data]] and organizes it into frequency distributions (number of claims per exposure unit over a defined period) and severity distributions (the monetary amount of each individual claim). These distributions are then analyzed separately — often fitted to statistical models such as Poisson or negative binomial distributions for frequency, and lognormal or Pareto distributions for severity. The product of expected frequency and expected severity yields the [[Definition:Pure premium | pure premium]], which forms the starting point for [[Definition:Ratemaking | ratemaking]]. Regulatory frameworks across jurisdictions expect this kind of rigor: [[Definition:Solvency II | Solvency II]] in Europe, the [[Definition:Risk-based capital (RBC) | RBC]] framework in the United States, and [[Definition:C-ROSS | C-ROSS]] in China all require insurers to demonstrate that their [[Definition:Technical provisions | technical provisions]] and capital charges rest on defensible loss modeling, and frequency-severity decomposition is often the bedrock of that demonstration.&lt;br /&gt;
&lt;br /&gt;
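The fit-then-multiply step above can be sketched in a few lines. Every parameter value below is an assumption chosen for illustration, not a fit to real data: frequency is taken as Poisson with mean lam, severity as lognormal with parameters mu and sigma, and the pure premium is the product of their expected values.&lt;br /&gt;
&lt;br /&gt;
```python
import math

# Hypothetical fitted parameters (assumed purely for illustration):
#   frequency ~ Poisson(lam): claims per exposure unit per year
#   severity  ~ Lognormal(mu, sigma): claim amount in currency units
lam = 0.08            # expected claims per car-year
mu, sigma = 8.5, 1.2  # lognormal parameters of claim size

expected_frequency = lam
# Mean of a lognormal distribution: exp(mu + sigma^2 / 2)
expected_severity = math.exp(mu + sigma**2 / 2.0)

# Pure premium = expected frequency x expected severity
pure_premium = expected_frequency * expected_severity
print(round(pure_premium, 2))
```
&lt;br /&gt;
With these assumed parameters the pure premium comes out on the order of 800 per car-year, which would then be loaded for expenses and profit during [[Definition:Ratemaking | ratemaking]].&lt;br /&gt;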
💡 The real power of the technique lies in its diagnostic value. A rising [[Definition:Loss ratio | loss ratio]] could stem from more claims, costlier claims, or both — and the strategic response differs dramatically depending on which driver is at work. If frequency is climbing in a [[Definition:Commercial auto insurance | commercial auto]] portfolio, the insurer might tighten [[Definition:Risk selection | risk selection]] criteria or adjust [[Definition:Deductible | deductibles]]; if severity is spiking due to [[Definition:Social inflation | social inflation]] or higher medical costs, the response might involve revising [[Definition:Policy limit | policy limits]] or pursuing [[Definition:Reinsurance | reinsurance]] protection for large losses. [[Definition:Insurtech | Insurtech]] firms have further enhanced frequency-severity analysis by integrating [[Definition:Telematics | telematics]], [[Definition:Internet of things (IoT) | IoT]] sensor data, and [[Definition:Machine learning | machine learning]] to update frequency and severity estimates in near real time, moving the discipline from retrospective analysis toward predictive and even prescriptive pricing.&lt;br /&gt;
&lt;br /&gt;
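The diagnostic decomposition described above can be shown with a toy calculation. The portfolio figures below are hypothetical, invented only to demonstrate the mechanics: the year-over-year change in loss cost factors exactly into a frequency trend times a severity trend, revealing which driver is at work.&lt;br /&gt;
&lt;br /&gt;
```python
# Hypothetical portfolio figures (assumed for illustration only).
years = {
    2024: {"claims": 400, "exposures": 10_000, "incurred": 3_200_000},
    2025: {"claims": 480, "exposures": 10_000, "incurred": 4_080_000},
}

def freq_sev(year):
    d = years[year]
    frequency = d["claims"] / d["exposures"]  # claims per exposure unit
    severity = d["incurred"] / d["claims"]    # average cost per claim
    return frequency, severity

f0, s0 = freq_sev(2024)
f1, s1 = freq_sev(2025)

freq_trend = f1 / f0 - 1.0                      # +20.0%: more claims
sev_trend = s1 / s0 - 1.0                       # +6.25%: costlier claims
loss_cost_trend = (f1 * s1) / (f0 * s0) - 1.0   # +27.5% combined

# The decomposition is multiplicative:
# (1 + loss cost trend) = (1 + frequency trend) x (1 + severity trend)
print(freq_trend, sev_trend, loss_cost_trend)
```
&lt;br /&gt;
Here frequency, not severity, is the dominant driver, so under the reasoning above the insurer would look first at [[Definition:Risk selection | risk selection]] and [[Definition:Deductible | deductibles]] rather than [[Definition:Policy limit | policy limits]] or [[Definition:Reinsurance | reinsurance]].&lt;br /&gt;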
&amp;#039;&amp;#039;&amp;#039;Related concepts:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
{{Div col|colwidth=20em}}&lt;br /&gt;
* [[Definition:Actuarial science]]&lt;br /&gt;
* [[Definition:Loss ratio]]&lt;br /&gt;
* [[Definition:Ratemaking]]&lt;br /&gt;
* [[Definition:Pure premium]]&lt;br /&gt;
* [[Definition:Loss development]]&lt;br /&gt;
* [[Definition:Catastrophe modeling]]&lt;br /&gt;
{{Div col end}}&lt;/div&gt;</summary>
		<author><name>PlumBot</name></author>
	</entry>
</feed>