CMFAS CGI
-
Question 1 of 30
1. Question
Which of the following are precautions in calculating claims inflation from data?
I. Use small data sets of claims from a single policyholder as portfolio/ market information.
II. The empirical index approach should always be used regardless of the data size
III. For average claim amounts, use robust statistics such as the median or trimmed means
IV. It is recommended to use a large data set of claims from different policyholders
Correct
To minimize the difficulties in calculating claims inflation from data, it is recommended to use large data sets of claims from many different policyholders; this minimizes numerical instabilities as far as possible. The empirical index approach is not recommended for a small data set because it can produce erratic results. Robust statistics are recommended, although they have undesirable side effects, such as their focus on the middle of the distribution.
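The point about robust statistics can be sketched in a few lines of Python; the claim amounts below are purely hypothetical:

```python
import statistics

def trimmed_mean(values, trim_fraction=0.1):
    """Mean after discarding the lowest and highest trim_fraction of values."""
    ordered = sorted(values)
    k = int(len(ordered) * trim_fraction)
    kept = ordered[k:len(ordered) - k] if k > 0 else ordered
    return sum(kept) / len(kept)

# Hypothetical claim amounts; one extreme loss distorts the plain mean.
claims = [1200, 1500, 1800, 2100, 2500, 90000]

plain_mean = sum(claims) / len(claims)        # pulled up by the 90,000 outlier
robust_median = statistics.median(claims)
robust_trimmed = trimmed_mean(claims, trim_fraction=0.2)
```

The median and trimmed mean barely move when the outlier is added, which is exactly why they are preferred for average claim amounts, at the cost of ignoring the tails.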
-
Question 2 of 30
2. Question
What date refers to the point in time in which an estimated amount was claimed?
Correct
The quantification date is used when the history of estimates is needed, for example to record the reporting delays of non-zero claims. This date provides factors used in the computation of compensation.
-
Question 3 of 30
3. Question
Which of the following is true regarding exposure measure?
Correct
The exposure measure is proportional to the risk of a policyholder. To increase its objectivity and accuracy, different factors relevant to risk are considered, and more factors are introduced into the calculation of exposure over time as experience accumulates.
-
Question 4 of 30
4. Question
Which of the following criteria are identified by the Casualty Actuarial Society in the measure of exposure?
I. Objective and easy to obtain
II. Historical precedents
III. Proportional to the expected income
IV. Proportional to the expected losses
Correct
These criteria may be used as a starting point in choosing a measure of exposure. The most important of the three is that the measure be proportional to the expected losses. When the measure is objective and easy to obtain, it can be evaluated reliably.
-
Question 5 of 30
5. Question
Which of the following are the components of a Catastrophe Model?
I. Vulnerability Module
II. Hazard Module
III. Financial Module
IV. Weather Module
Correct
The three components of a catastrophe model are the hazard module, the vulnerability module, and the financial module. The hazard module is concerned with the catastrophic event’s characteristics, such as its strength, location, and frequency. The vulnerability module provides vulnerability curves, which show the degree of damage as a function of the event’s intensity, such as the magnitude of an earthquake. The financial module translates physical damage into financial loss; its main outputs are the event loss tables (ELTs), which give a detailed list of events and related losses.
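As a rough illustration of how an ELT is used, one common output is the average annual loss: the sum over all events of the occurrence rate times the mean loss. The events and figures below are invented for illustration:

```python
# Minimal sketch of an event loss table (ELT): each modeled event has an
# annual occurrence rate and a mean loss. All figures are hypothetical.
event_loss_table = [
    {"event": "EQ-001", "rate": 0.01, "mean_loss": 50_000_000},
    {"event": "EQ-002", "rate": 0.002, "mean_loss": 200_000_000},
    {"event": "HU-001", "rate": 0.05, "mean_loss": 10_000_000},
]

# Average annual loss (AAL): sum of rate x mean loss over all events.
aal = sum(e["rate"] * e["mean_loss"] for e in event_loss_table)
```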
-
Question 6 of 30
6. Question
Which of the following information regarding the inadequacies of models is true?
Correct
Each risk in the individual risk model can have at most one loss, which theoretically justifies the use of the binomial/multinomial distribution. In a collective risk model, however, it is better to use a distribution with a variance/mean ratio equal to that of a Poisson process, because more than one event is possible for a given risk, unlike in the individual risk model. In a Poisson process, the interarrival time is exponentially distributed. The process is memoryless because the occurrence of a loss does not change the probability that another will occur in a given interval.
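The variance/mean property can be checked by simulating a Poisson process from its exponential interarrival times and verifying that the ratio for the resulting counts is close to 1. This is a minimal sketch with an assumed rate:

```python
import random

random.seed(42)
lam = 4.0           # assumed expected number of losses per period
n_periods = 20_000

def poisson_count(rate):
    """Count events in a unit period by summing exponential interarrival times."""
    t, count = 0.0, 0
    while True:
        t += random.expovariate(rate)
        if t > 1.0:
            return count
        count += 1

counts = [poisson_count(lam) for _ in range(n_periods)]
mean = sum(counts) / n_periods
variance = sum((c - mean) ** 2 for c in counts) / n_periods
ratio = variance / mean    # close to 1 for a Poisson process
```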
-
Question 7 of 30
7. Question
What amount of claims is needed to be revalued?
Correct
To arrive at the average point of occurrence of the renewal policy, the full incurred amount of all claims needs to be revalued using the appropriate inflation index. Using the full incurred amount is essential to making an informed guess as to the value of a claim at a point in time.
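Revaluation with an inflation index can be sketched as follows; the index values and claim amounts are hypothetical:

```python
# Revalue historical claims to a common valuation date using an inflation
# index: amount x index(valuation year) / index(occurrence year).
inflation_index = {2019: 100.0, 2020: 103.0, 2021: 107.1, 2022: 112.5}

claims = [
    {"year": 2019, "incurred": 10_000},
    {"year": 2020, "incurred": 8_000},
    {"year": 2021, "incurred": 12_000},
]

valuation_year = 2022

def revalue(claim):
    factor = inflation_index[valuation_year] / inflation_index[claim["year"]]
    return claim["incurred"] * factor

revalued = [revalue(c) for c in claims]
```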
-
Question 8 of 30
8. Question
Which of the following information is needed for data summarization of each policy year?
I. Range
II. Total Losses
III. Standard Deviation
IV. Number of claims
Correct
Simple statistics are used to summarize the data for each policy year. The information needed consists of the total losses, the standard deviation, and the number of claims. These statistics allow readers to observe the trends and behavior of the risks. A list of the top losses is also provided so that readers can analyze the risks.
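A minimal per-policy-year summary of this kind can be produced with the standard library; the individual losses below are hypothetical:

```python
import statistics
from collections import defaultdict

# Hypothetical individual losses as (policy year, amount) pairs.
losses = [
    (2020, 1_000), (2020, 2_500), (2020, 4_000),
    (2021, 1_500), (2021, 3_500),
]

by_year = defaultdict(list)
for year, amount in losses:
    by_year[year].append(amount)

# Number of claims, total losses, and standard deviation per policy year.
summary = {
    year: {
        "number_of_claims": len(amts),
        "total_losses": sum(amts),
        "std_dev": statistics.stdev(amts) if len(amts) > 1 else 0.0,
    }
    for year, amts in by_year.items()
}
```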
-
Question 9 of 30
9. Question
Which of the following are part of the three-step approach in data preparation?
I. Data Cleansing
II. Data Revaluation
III. Data Summarization
IV. Data Transformation
Correct
Before the raw data set is transformed into a set of results, it undergoes a three-step data preparation approach: data cleansing, data transformation, and data summarization. This is done to filter the raw data and analyze its quality, checking for errors or anomalies. Another purpose of data preparation is to redact confidential information, such as names and amounts, for data privacy purposes.
-
Question 10 of 30
10. Question
Which of the following regarding Claims Inflation is false?
Correct
When all claims are above a certain level of deductible, inflation pushes claims that were previously below the threshold, and hence not part of the data set, above it. As a result, the claims are not comparable, and there is no way of deducing inflation from statistics based on the empirical severity distribution above a fixed threshold.
-
Question 11 of 30
11. Question
Which of the following statements regarding risk-sharing are false?
I. The reinsurer is the party to whom the principal or direct insurer passes the risk.
II. Protection from large losses is one of the reasons why insurance is sold.
III. The objective of risk sharing is to mitigate the risk among those involved.
IV. The relationship between the policyholder and direct insurer is perpendicular to the relationship of the direct insurer with the reinsurer.
Correct
Risk-sharing aims to spread the risk among those involved. When the principal insurer passes some of the risks to another insurance company, another party becomes involved called the reinsurer. In other words, the direct insurer can be likened to the policyholder who also purchases insurance. Therefore, the relationship of the policyholder with the direct insurer and the direct insurer with the reinsurer is parallel.
-
Question 12 of 30
12. Question
Which of the following are the effects of a reinsurance agreement?
I. The increase of the mean amount to be paid by the direct insurer on claims
II. The reduction in the variability of the claims paid out by the direct insurer
III. The increase of the probability that the direct insurer will pay a very large amount for a particular claim
IV. The direct insurer’s payout on claims is stabilized.
Correct
When a direct insurer enters into a reinsurance agreement, the reinsurer protects the direct insurer from having sole responsibility for the tails of the distributions of large claims. Thus, the effects of a reinsurance agreement are a reduction in the variability of the claims paid out by the direct insurer and a reduction in the mean amount the direct insurer pays out on claims. In addition, the direct insurer’s probability of paying a very large amount on a particular claim is reduced.
-
Question 13 of 30
13. Question
What type of reinsurance arrangement gives both the direct insurer and reinsurer involvement in paying each claim?
Correct
Under proportional reinsurance, there is an agreement on the proportion of each claim that will be covered by the direct insurer. Both the direct insurer and the reinsurer have unlimited liability under this arrangement; thus, the agreement is not favorable to the direct insurer.
-
Question 14 of 30
14. Question
Which of the following are the advantages of the Burning Cost Analysis?
I. It indicates that a mistake has been committed in the methods
II. Its result may be used as an intermediate step between the initial summary statistics and the results of the frequency/ severity modeling
III. It is purely retrospective
IV. There is a reliable prediction beyond the range of experience
Correct
The burning cost analysis is one of the most widely used pricing techniques because of its simplicity, despite its many limitations. With this technique, the results of the frequency/severity modeling are better understood thanks to the sense-check it provides. The technique achieves many useful results without resorting to stochastic modelling.
-
Question 15 of 30
15. Question
Why is it not required for a revaluation factor in the burning cost analysis to be equal to the claim inflation or the standard price indices?
Correct
The revaluation factor in the burning cost analysis, in general, reflects company-specific variables such as product price changes. Therefore, this factor depends on the circumstances of the company, since the burning cost analysis is mostly based on experience.
-
Question 16 of 30
16. Question
Which of the following are the limitations of the burning cost analysis?
I. Retrospective in nature
II. Relies only on the range of experience
III. Lack of reliable estimation of the aggregate loss distribution
IV. Discerns frequency and severity trend
Correct
The burning cost analysis relies only on historical information, which may impair the view of future risks. Since this pricing technique is not model-based, there is no attempt to find regularities in the data and predict beyond its range of experience. It is not possible to estimate the full aggregate loss distribution and capture the richness of possible outcomes.
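Despite these limitations, the core calculation is simple: revalued losses divided by exposure over the experience period. A sketch with invented figures:

```python
# Sketch of a burning cost calculation: revalued historical losses divided
# by exposure, aggregated across policy years. All figures are hypothetical.
years = [
    {"year": 2019, "revalued_losses": 120_000, "exposure": 1_000},
    {"year": 2020, "revalued_losses": 90_000, "exposure": 900},
    {"year": 2021, "revalued_losses": 150_000, "exposure": 1_200},
]

total_losses = sum(y["revalued_losses"] for y in years)
total_exposure = sum(y["exposure"] for y in years)

# Burning cost per unit of exposure over the whole experience period.
burning_cost = total_losses / total_exposure
```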
-
Question 17 of 30
17. Question
What is the advantage of experience and prior knowledge’s input in the model’s data?
I. Better judgment on what kind of model is better suited for a specific kind of data set
II. Established constraints and criteria on the results
III. Risky due to its subjective nature
IV. Has more explanatory power due to its simplicity and fundamental regularities
Correct
The modelling process starts with the analyst’s experience and prior knowledge. With the analyst’s professional opinion, models better suited to a specific client or data set can be chosen. As a result, experience and prior knowledge save time compared with relying solely on the available data, and their input increases the efficiency and effectiveness of the model by focusing it on the real regularities present in the data.
-
Question 18 of 30
18. Question
What statement on modelling is false?
Correct
All models are affected by prediction error to some extent. Selecting the best model is one of the most essential activities in actuarial modelling for general insurance. One method is to select, from a number of statistical distributions, the one that is most suitable for the data. The goal of model selection is to use the model with the most predictive power.
-
Question 19 of 30
19. Question
What is the main goal of statistical learning?
Correct
Statistical learning is also known as “machine learning”. An important point to keep in mind about statistical learning is that increasing the complexity of a model does not necessarily increase its accuracy in making predictions. A simpler model that is further from reality may sometimes be better at making predictions and decisions.
-
Question 20 of 30
20. Question
What are the main assumptions for using the development of claims count triangles in the estimation of the number of IBNR claims?
I. All years develop in roughly the same way
II. There are unique irregularities in each year
III. There is no further development for the earliest year
IV. There is a steady trend of development every year
Correct
The development of claims count triangles is the most common method for the estimation of the number of IBNR claims. Chain ladder or other similar triangle development techniques are used for projecting the claim count. First, the claims count in a triangle is organized in a cumulative format. Only then can the chain ladder method be applied to project the number of claims to ultimate.
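The projection described above can be sketched with a volume-weighted chain ladder on a small cumulative triangle; the claim counts are hypothetical:

```python
# Minimal chain ladder sketch on a cumulative claim-count triangle.
# Rows are accident years, columns are development periods; the oldest
# year (first row) is assumed fully developed. Figures are hypothetical.
triangle = [
    [100, 150, 165, 170],
    [110, 160, 178],
    [120, 175],
    [130],
]

n = len(triangle)

# Volume-weighted development factors from one column to the next.
factors = []
for j in range(n - 1):
    num = sum(row[j + 1] for row in triangle if len(row) > j + 1)
    den = sum(row[j] for row in triangle if len(row) > j + 1)
    factors.append(num / den)

# Project each incomplete row to ultimate by applying the remaining factors.
ultimates = []
for row in triangle:
    value = row[-1]
    for j in range(len(row) - 1, n - 1):
        value *= factors[j]
    ultimates.append(value)
```

The difference between each ultimate and the latest reported count gives the estimated number of IBNR claims for that year.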
-
Question 21 of 30
21. Question
What are the two main problems in using the Poisson distribution?
I. Not suitable for long term use
II. Only applicable in rare events
III. Parameter uncertainty
IV. Not suitable for taking systemic risk into account
Correct
The assumptions of stationary and independent increments are not easy to justify in practice: the number of claims rarely shows such stability. Moreover, the Poisson rate is usually estimated from only a small number of years, hence the parameter uncertainty.
-
Question 22 of 30
22. Question
What is the correct order of the Stochastic Model in its Risk Costing Process?
I. Portfolio/market information
II. Cover Data
III. Individual loss data
IV. Exposure Data
Correct
The correct order is individual loss data, exposure data, portfolio/market information, then cover data. In frequency and severity modeling, exposure data and portfolio information are used for several of its steps.
-
Question 23 of 30
23. Question
When can the use of a Poisson process be justified?
Correct
The Poisson process is adequate only when systemic effects play a negligible role, which may give the impression that its use is justified only when the Poisson rate is genuinely constant. However, several factors affect the constancy of the Poisson rate: for example, the volatility of the loss experience is likely to be underestimated when systemic effects are common and a Poisson distribution is assumed. In the presence of seasonal volatility, the loss process can only be modeled roughly, and a more sophisticated approach is necessary.
-
Question 24 of 30
24. Question
What term refers to the uncertainty from dealing with inherently stochastic phenomena which cannot be reduced?
Correct
Process uncertainty arises when an element exhibits random fluctuations driven by the loss generation process. It is intrinsic to stochastic phenomena, which do not have a stable pattern or order. Its effect can be quantified by producing models of the number of losses or their total amount.
-
Question 25 of 30
25. Question
Which of the following statements on exposure rating in direct insurance is false?
Correct
Exposure rating is unnecessary in personal lines insurance, where the loss experience is normally sufficient, making its use inappropriate. On the other hand, exposure rating is useful when the experience is barely sufficient or poorly diversified.
-
Question 26 of 30
26. Question
Which of the following information in the extended warranty is true?
Correct
Extended warranties allow the insured to prolong the warranty of a purchased product in exchange for a premium. They may be provided by retailers, independent insurers, or manufacturers. The failure rate is the most important element of extended warranty risk.
-
Question 27 of 30
27. Question
What are the elements of a failure rate?
I. Risk Profile
II. Claim Information
III. Exposure
IV. Retail Information
Correct
The exposure information is obtained from the number of sales per day or month, then subdivided by type of product, territory, and other relevant factors. Claims data consist of information such as the claim ID, purchase date, repair date, and type of defect. This information is needed to determine the failure rate, whose goal is to predict the number of failures over the timeframe of the extended warranty.
-
Question 28 of 30
28. Question
Which of the following information are estimated with the use of a basis failure rate?
I. The “long term” failure rate for a particular cohort and for a given number of months from purchase
II. The adjustments of the base failure rate over time
III. The exact shape of the bathtub curve
IV. The base failure rate for a specific cohort
Correct
The base failure rate is the failure rate during the mature phase. Adjustments are made to project the failure rate over the extended warranty period. It is calculated by subdividing all sales into cohorts by a specific number of months.
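A base failure rate estimate of this kind can be sketched as follows, assuming (for illustration only) a mature phase spanning months 6 to 23 after purchase; the cohort size and failure counts are hypothetical:

```python
# Sketch of a base failure rate estimate: for a sales cohort (items sold in
# a given month), count failures observed during the assumed mature phase of
# the bathtub curve and divide by exposure in item-months.
cohort_size = 10_000            # items sold in the cohort month (hypothetical)
mature_months = range(6, 24)    # assumed mature phase: months 6..23 after purchase

# Hypothetical failures observed in each month since purchase.
failures_by_month = {m: 15 for m in mature_months}

total_failures = sum(failures_by_month[m] for m in mature_months)
exposure_item_months = cohort_size * len(mature_months)

base_failure_rate = total_failures / exposure_item_months  # failures per item-month
```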
-
Question 29 of 30
29. Question
What kind of modelling is used for events such as earthquakes, tsunami, or hurricanes?
Correct
Standard actuarial techniques do not work for catastrophes, since the client’s catastrophe experience has near-zero credibility. Instead of the client’s experience, catastrophe models use a database of historical events for a country or the whole world, together with the scientific understanding of phenomena such as hurricanes and earthquakes.
-
Question 30 of 30
30. Question
Based on the credibility theory, what is the effect of combining portfolio data with the client’s data?
Correct
When the client’s experience is barely sufficient, portfolio data provide factors to consider in pricing. Portfolio data in this context are the non-client data owned by the insurer or intermediary. By combining the portfolio data with the client’s data, a better estimate of the expected losses can be calculated.
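The combination can be sketched as a credibility-weighted average, here using a Bühlmann-style factor Z = n / (n + k); the constant k and all figures are assumptions for illustration:

```python
# Credibility-weighted estimate: Z * client estimate + (1 - Z) * portfolio
# estimate, with Z = n / (n + k) (Buhlmann-style). Figures are illustrative.
n_client_claims = 50     # volume of client experience (hypothetical)
k = 200                  # credibility constant (assumed)

Z = n_client_claims / (n_client_claims + k)

client_estimate = 1_200.0     # expected loss per unit from client data alone
portfolio_estimate = 900.0    # expected loss per unit from portfolio data

credibility_estimate = Z * client_estimate + (1 - Z) * portfolio_estimate
```

As the client's volume of experience n grows, Z approaches 1 and the estimate leans increasingly on the client's own data.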