I often hear the question “What is an actuary?” Well, actuaries are unique in that they have a deep understanding of both business and statistics. Traditionally, an actuary’s differentiator was their ability to make financial sense of extremely long-term horizons—typically those associated with the lifespan of human beings. This naturally lent itself to actuaries working in the age-old fields of life insurance and pensions.
However, it is not just an actuary’s grasp of the long term which is unique but also their ability to harness a broad range of academic fields and apply them in a business context. To this end, the Actuaries Institute defines actuaries as professionals who:
“evaluate risk and opportunity – applying mathematical, statistical, economic and financial analyses to a wide range of business problems.”
Due to this broad skillset, I heard recently that only 30% of Actuaries Institute members actually work in life insurance or pensions—the traditional “heartland” of the industry.
Together with their focus on long-term time horizons and “risk and opportunity,” an actuary’s expertise can greatly assist small and medium-sized businesses in gaining significant insights into operational and financial performance. Of potentially greater importance, by applying actuarial techniques, a company’s management can be nudged into making decisions that have the maximization of shareholder value as a constant guiding principle.
I can personally attest to this phenomenon. As an actuary myself, I have worked with clients as diverse as:
This blog post will explore some of the ways in which applying an actuarial valuation methodology to any product, service or business—in particular those with annuity type cash flows—can greatly assist in strategic decision-making. This is particularly pertinent in these days of SaaS and subscription-based business models.
The pursuit of value should permeate a business’s decision-making process. A company’s management should make strategic decisions based on whether they will increase the long-term value of the company to shareholders, rather than any other financial metric.
I do not believe that any management team of a small or medium business would argue that they do not intend to make the company as valuable as possible for its owners. It is rather a matter of practicality that this goal is often not at the forefront of their decision-making process, for how does one measure the implications of each decision for the business’s value?
It is my contention that the use of an actuarial valuation method, or one which approximates it, is the most appropriate way to measure the value-add of a strategic decision. By quantifying the effect on a company’s value of each option being considered, management will have an incredibly powerful tool to enable it to make correct decisions based on the information available.
The value of any asset (or liability, for that matter), whether tangible or intangible, is equal to the expected present value of the future cash flows, which will be realized from that asset. This is, in essence, the discounted cash flow valuation method, which is widely used in investment finance and corporate financial management.
The actuarial valuation technique is founded on this method but has two significant additional features:
Simply put, the actuary will project forward each and every future cash flow, determine the probability with which the cash flow will occur, and then discount these probability-weighted cash flows to the present time. By developing this valuation technique, actuaries crafted a method that explicitly allows for all the relevant contingencies that result from the uncertainty in the cash flows.
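As an illustrative sketch (the cash flows, probabilities, and discount rate below are invented, not taken from any real valuation), the core calculation can be expressed in a few lines of Python:

```python
def expected_present_value(cash_flows, probabilities, annual_rate):
    """Discount probability-weighted cash flows to the present.

    cash_flows[t] is the cash flow expected at the end of year t + 1;
    probabilities[t] is the probability that it actually occurs.
    """
    return sum(
        cf * p / (1 + annual_rate) ** (t + 1)
        for t, (cf, p) in enumerate(zip(cash_flows, probabilities))
    )

# A simple three-year annuity-style stream with declining persistency:
cash_flows = [100.0, 100.0, 100.0]
probabilities = [0.95, 0.85, 0.70]  # e.g. chance the customer is still paying
print(round(expected_present_value(cash_flows, probabilities, 0.08), 2))
```

In a real actuarial model each probability would itself be built from decrement tables (mortality, lapse, and so on), but the probability-weighted discounting shown here is the essence of the technique.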
Chart 1 below shows an example of this, where the addition of longer-term cash flows with individual probabilities attached creates a far more irregular discounted cash flow curve than seen in traditional DCF measures.
The actuarial valuation methodology can be applied to each and every product/service that a business offers. This can assist management in comparing the value added (or not added) by their different product lines. In turn, this should direct decisions on the viability of both current and future products. The actuarial method may also highlight the drivers of the relative success of the products which could inform the actions to be taken to rectify underperforming lines.
By considering past adoption of new products, in particular, the adoption of new products by existing customers, the value of a new product can be determined with a fair amount of accuracy, provided that sufficient past data is available. Again, while the technique is fairly similar to the Net Present Value methodology, the ability to incorporate probabilities and contingencies into cash flows gives the actuarial technique an important advantage.
In a similar way, an entire business can be valued. Both in-force business and future potential business arising from goodwill can be considered. Assuming that accurate assumptions are used, this could be the most appropriate way to value an annuity type business with sufficient data.
In addition to the advantages that have already been mentioned, some significant advantages of actuarially derived values can be identified in comparison to accounting numbers:
Platinum Life, a niche South African life insurer, has partnered with a number of businesses using these actuarial valuation principles. By converting traditional retail or service businesses into membership-based business models, they have created companies supported by annuity income. They have replicated this approach in industries as diverse as cosmetics and software, fashion, education, and nutrition. Their actuaries analyze every new product envisaged for its expected value added to the business.
The numerous contingencies which affect each cash flow are calculated using statistical tables. These tables are usually the result of studies that assess past experience and the actuary’s estimation of how this past experience will change in the future. For life policies, relevant tables would be those used to determine the probability of a person’s survival, of them becoming disabled, or of them discontinuing their policy. The tables used in other businesses would be those relevant to the specific business, product, or project being valued.
Take, for example, a publishing company that offers subscription-based magazines or newspapers. The development of a statistical table from which subscription renewal rates could be predicted would be the key input into an actuarial model. Such a table could be developed by analyzing the subscription trends of past cohorts. One would expect renewal rates to increase as the cohort ages; in other words, the expected probability that a customer who has been subscribing to a title for ten years will renew their subscription is greater than that of a two-year subscriber. The grouping of the titles offered by the company for this analysis would depend on the homogeneity of their characteristics as well as the sufficiency of past data.
A cohort analysis is a great tool for creating such probabilities by narrowing retention and behavioral rates to classes of customers. Through looking at retention rates for past cohorts, a business can project out its future cash flows with more certainty and then tweak its marketing tactics to improve any negative trends. Figure 1 below shows an example of such an analysis alongside a brief explanation.
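As a minimal sketch of such a cohort analysis (the subscriber counts below are invented), pooled renewal rates by subscription duration can be computed like this:

```python
from collections import defaultdict

# Hypothetical cohort data: subscribers still active at the start of each
# subscription year, keyed by the year in which the cohort first subscribed.
active_by_cohort = {
    2019: [1000, 820, 714, 650],   # counts at start of years 1..4
    2020: [1200, 996, 876],
    2021: [900, 747],
}

def renewal_rates(cohorts):
    """Average year-on-year renewal rate at each subscription duration,
    pooled across all cohorts that have data for that duration."""
    renewed = defaultdict(int)
    exposed = defaultdict(int)
    for counts in cohorts.values():
        for year in range(len(counts) - 1):
            exposed[year] += counts[year]       # subscribers who could renew
            renewed[year] += counts[year + 1]   # subscribers who did renew
    return {year + 1: renewed[year] / exposed[year] for year in sorted(exposed)}

print(renewal_rates(active_by_cohort))
```

The resulting rates would feed directly into the renewal-rate table described above; with these illustrative numbers, the rate rises with duration, as the text suggests.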
The actuarial valuation methodology can, therefore, be used to determine the value-add of any uncertain strategic decision. Each option can be valued independently, and the option which results in the greatest additional value to the company can be chosen. The advantages to the approach stem from the application of probabilities to each cash flow. In this way, correlations between different risks and between different products, services, or departments can be considered.
The applications of the actuarial method are of particular value to businesses which generate annuity type income. The structure of these products mimics that of insurance products; take-up rates, persistency rates, and the increase in revenues and costs can all be modeled, and important business decisions can thus be made with key information now in the hands of management.
The total net profit that a business earns, on average, over the course of its relationship with a customer is known as customer lifetime value (CLV). CLV represents an upper bound on the cost a business should incur to acquire a customer, which is referred to as customer acquisition cost (CAC).
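As a hedged sketch, assuming a constant annual retention rate (a deliberate simplification of the duration-dependent probabilities an actuarial model would use), CLV can be computed as a discounted sum of expected margins:

```python
def customer_lifetime_value(annual_margin, retention_rate, discount_rate, horizon=30):
    """Present value of the expected net margin from one customer,
    assuming a constant annual retention rate (an illustrative simplification)."""
    clv = 0.0
    survival = 1.0
    for year in range(1, horizon + 1):
        survival *= retention_rate  # probability the customer is still active
        clv += annual_margin * survival / (1 + discount_rate) ** year
    return clv

clv = customer_lifetime_value(annual_margin=60.0, retention_rate=0.80, discount_rate=0.10)
print(round(clv, 2))  # CLV is the ceiling on what acquiring this customer is worth (CAC)
```

All parameter values here are invented; in practice the retention rate would come from a cohort analysis like the one described earlier.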
The merits of calculating an accurate customer lifetime value are numerous, and it can be critical in determining a business’s marketing strategy. A number of different marketing strategies exist depending on the type of product being sold. These range from complex sales, which may take months of the CEO’s time to close; to personal sales, which require a sales force to directly contact potential clients; to viral marketing, which relies on users to invite their friends to become users too.
Determining the correct mix of marketing strategies is crucial to the success of a product, and the nature of the product will, in most instances, restrict the strategies available.
Companies such as Elon Musk’s SpaceX, which has signed billion-dollar contracts with NASA, are totally reliant on complex sales; with a customer lifetime value that high, almost any amount of time and direct energy from the CEO is worth the cost. On the other hand, the CLV for PayPal was much lower. Traditional advertising, even online, proved too expensive. PayPal discovered that the most cost-efficient method to acquire customers was to actually pay them to join—and then again to refer new customers.
PayPal’s sales technique was only possible because the company had a good idea of its customer lifetime value. Many businesses, however, are not aware of the value of acquiring a new customer, which leads them to employ an incorrect or cost-inefficient marketing strategy. By focusing solely on the profit generated from the initial sale, or even from the first year or two, a company could significantly underestimate its CLV. Management may, therefore, myopically spend too little on marketing or may employ an incorrect strategy or distribution channel. As Chart 2 below shows, the value derived from a customer increases significantly in future years.
In addition, a business may err in considering the customer lifetime value for a particular product in isolation. However, once a customer is acquired, it opens up opportunities for cross-selling both current and future products. A customer’s likelihood to purchase additional products can be modeled, and the value that this unlocked can then be considered in the CLV calculation.
Customer lifetime value can also be calculated per salesperson or client relationship manager. This can be an extremely effective way of monitoring (and rewarding) performance. The use of the actuarial valuation methodology for employee evaluation is discussed later in this article.
An actuarial valuation of a business’s CLV can, therefore, greatly assist management in marketing decisions. Moreover, it can help management answer a number of additional questions, including:
Depending on the type of business, modeling customer lifetime value can vary in complexity. An actuarial valuation of CLV, applying the actuarial valuation method described above, can be a sophisticated and powerful method of accurately modeling this important metric.
Although very useful, actuarial valuations are impractical for the day-to-day decisions made by a business. An actuarial valuation can, however, be performed to determine the key metrics which drive a business’s value. Typically, businesses use financial metrics, such as revenue and operating income, to assess the performance of products, units, or managers. These metrics, particularly when used as quarterly and annual measures, are not reliably linked to the long-term cash flows that produce shareholder value.
It stands to reason that employee incentives should also be aligned to the increase in value added by each manager or employee. Apart from assisting management decisions, these metrics can, therefore, be used to determine the performance of executives and employees. Incentives and rewards can thus be linked to these metrics. Looking at the components that add to customer lifetime value (Figure 3), there are many interrelated components that, if isolated, can be correctly incentivized to enhance performance.
The actuarial valuation technique can be used to assess the value added by competing metrics that have been identified. The metrics which result in the greatest increase to the business’s value would be chosen to be used in employee evaluation. For example, borrowing from the analysis of customer lifetime value, if it is shown that customer retention adds more value to a business than the acquisition of new customers, employees should be rewarded more for improving retention rates than for acquiring new customers.
As proposed by Rappaport, a company’s goal should be to maximize shareholder value as measured using the discounted cash flow methodology. Yet, as he notes, companies often take a short-termist view instead:
“Most companies evaluate and compare strategic decisions in terms of the estimated impact on reported earnings when they should be measuring against the expected incremental value of future cash flows instead.”
The actuarial valuation technique is, therefore, consistent with his view of how a company should be run and how strategic decisions should be made. Since the actuarial technique considers long time periods as well as the uncertainty of cash flows, it is my contention that it is the most appropriate valuation and metric technique for many businesses—particularly those which have annuity type income as well as sufficient past data. Those that understand the actuarial valuation can bring significant value and insight through their valuation analysis.
The statistician George Box’s aphorism, that “all models are wrong but some are useful,” is often quoted (at least in actuarial circles). It is as true in the context of this article as it is in general. A model is, by definition, a simplification of reality; it can never predict the future perfectly. Using the actuarial valuation technique cannot be a panacea to solve all decisions that a business needs to make. The technique is heavily dependent on the quality, sufficiency, and availability of relevant data. It is also highly sensitive to the actuarial assumptions used. However, the use of the actuarial technique, together with other relevant measures, can provide management with extremely useful insights into their business and can greatly assist in the creation of value for shareholders.
This article is about a very important question today’s innovators face when they develop new software products.
Namely, that of:
“What are the appropriate technical skills, technology stack and architecture for the new product?”
This question is very hard to answer because very often it is not clear what features a product should have. Sometimes it is not even clear what problem the company is trying to solve for its customers.
Consider a development team that follows an iterative approach to test fundamental hypotheses about the product, the business model and the market.
Initially, during the startup phase, the team will attempt to build, test and adapt an MVP, in an iterative process. With each iteration, the team will improve the MVP and adapt it to reflect the feedback received during the tests.
Once the team verifies its fundamental hypotheses and validates the business model, the startup phase concludes and the scaleup phase starts. If the team disproves or fails to verify its hypotheses, it will attempt to pivot the strategy to correct the course, formulate new fundamental hypotheses, and reiterate.
By the time the team completes the MVP and validates the business model, the product may have gone through several fundamental changes and most likely already has many users. The next step is to scale up and roll out the product to significantly more users and even enter different markets. At the same time, the team should continue operating the product for the existing users.
Following the above approach comes with contradictory expectations regarding the appropriate technical skills, architecture and technology stack. The reasons are:
To put it another way, the technology dilemmas can be described as:
“A startup needs a technology strategy that not only allows for fast delivery of value and supports growth, but can also fulfill unknown, diverse and possibly contradicting requirements”
To put the technology dilemmas in a familiar and tangible context, two examples, which are based on actual situations startups have encountered, will follow.
The first example is about an e-commerce company that operates in the B2C market.
Their product idea is great, but they have to be very fast in gaining significant market share because competition looms. The team decides to go for a simple architectural pattern and an on-premise hosted solution, because they are very familiar with these technology options and feel that this is the fastest way to build the product. Very soon the first version of the website is ready, and the team continuously develops and tests its assumptions about the market.
After a couple of years, the user base in their home country has grown significantly in size and so has the code of the B2C web-based application. It is now time to grow even further, enter new countries and even the B2B2C markets.
Sadly, the team realizes that the application cannot handle the new traffic load and regularly crashes. To make things worse, implementing requirements for other countries and for the B2B2C market turns out to be very difficult. Their web-based application, a monolithic architecture of spaghetti code, cannot be easily scaled or extended.
The team now feels that they should have chosen a different approach to build the web-based application, although that would not have been as fast as was required in the beginning. Their suggestion now is to migrate the application to a modern micro-service architecture and to move it to an external IaaS/PaaS.
However, this will take much time and effort which they cannot afford because the competition is catching up.
The second example is about a startup that builds a SaaS product for the industry 4.0 market.
Their business model assumes that several thousand customers and tens of thousands of users will work concurrently with the product.
The development team is excited and chooses a micro-service architecture for the product. Work begins and the team chooses to develop the product on the IaaS/PaaS of a major provider. The team is not familiar with these technologies. Progress is slow at the beginning, but they learn fast. With each iteration they incrementally improve their MVP. After several months of work, they begin to test with users and potential customers. Everything seems to go great and finally they are making good progress.
Unfortunately, after a while, the team realizes that their target customers are not willing to adopt a SaaS solution yet. There are good reasons to believe that, in the future, customers will accept a solution running on a third-party cloud, but for now it is too early. The majority of the potential customers have concerns because they plan to use the product with intellectual property and are not willing to allow this information to leave their own infrastructure.
The product team decides to pivot. Instead of offering the product as a service to thousands of customers and tens of thousands of users, the decision is to offer it “on premise” to a handful of big customers and later to a few hundred mid-sized customers. Consequently, every installation must take place on location, as remote access is not always possible. Furthermore, because each installation is customer specific, it will have only a few hundred users at most.
The team is now facing a difficult technical challenge, because the requirements have fundamentally changed:
Re-architecting the application is possible, but it will take much effort, and the team still needs to continue developing new features for existing customers.
Modeling the technology dilemmas can help to better understand their dynamics and develop strategies to tackle the associated challenges.
The value a new product creates resembles an S-curve. Initially, as the team tests various hypotheses and builds the MVP, value creation is relatively flat. Once the right strategy is found and the business model is validated, the value creation potential assumes exponential growth. To capitalize on this potential, the scaleup phase starts. Ultimately, as pressure from competition increases and the market saturates, growth flattens out again.

If it were possible to choose the perfect technical skills, technology stack and architecture, the development team would be able to develop the MVP and validate the business model fast during the startup phase. Furthermore, development could smoothly transition to the scaleup phase, and the team would deliver work that optimally capitalizes on the exponential growth potential.
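This S-curve can be sketched with a logistic function; all parameter values below are purely illustrative, chosen only to show the flat-steep-flat shape:

```python
import math

def value_created(t, cap=100.0, growth=1.2, midpoint=5.0):
    """Logistic S-curve of cumulative value creation over time.

    cap, growth, and midpoint are illustrative assumptions: the market
    ceiling, the steepness of the scaleup phase, and the moment the
    business model is validated.
    """
    return cap / (1 + math.exp(-growth * (t - midpoint)))

# Growth per period: flat early (startup), steep in the middle (scaleup),
# flat again late (saturation).
for t in range(0, 10, 3):
    print(t, round(value_created(t + 1) - value_created(t), 2))
```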
In reality, because of the uncertainty about the requirements and pressure to deliver, the technology choices cannot always satisfy the contradicting expectations for fast development needed to quickly validate the strategy and at the same time the ability to scaleup.
Constantly changing the MVP to reflect feedback from the market, combined with pressure to deliver, has most likely led to quick fixes, shortcuts and poor engineering choices; in other words, a buildup of technical debt. In addition, pivoting may have set a totally new direction for the product, one that may not even be compatible with the current technical solution.
The result is that product development cannot keep up with the demands that derive from the exponential growth potential. The technical debt constrains growth, and value creation flattens. The business encounters difficulties scaling up.
To avoid a premature plateau in growth, the development team will have to perform a technology shift or pivot. What type of pivot is suitable (e.g. re-engineer, migrate or rewrite) depends on the specific situation the startup faces. However, the optimal point in time to perform the pivot is “fixed” and independent of the situation: it is the moment when the team validates the business model, i.e. when it becomes evident that value creation has the potential to grow exponentially.
If the pivot is successful, the product will capitalize on the exponential growth potential. However, most teams have difficulty making this decision. Teams hesitate because the decision is counterintuitive: the pivot will temporarily delay growth, and it must happen at a moment when growth has, for the first time, demonstrated an exponential trend. Despite appearances, the graph shows that the decision to pivot quickly pays off and outperforms the decision to keep the existing technology choices.
What are common pitfalls when it comes to pivoting the technology, besides failing to recognize the need to pivot?
The first and most common mistake is delaying the pivot. Doing so may seem a better option than sticking to the current technology choices; even so, this decision will eventually perform worse than pivoting at the moment the growth potential becomes exponential.
Another danger is that, if the pivot takes too long to complete, it fails to support the growth potential. As a result, the product will miss the chance to scale.
Finally, teams may be tempted to make technology choices that are optimal for scaling right from the start of development, for example by choosing a highly scalable but complex architecture or a heavyweight technology stack, in the hope of avoiding a pivot later. By doing so, however, they will most likely be very slow in developing the MVP and validating the business model. The graph shows how such a choice again leads to sub-optimal growth. Demonstrating the feasibility of a scalable technical solution is vital, but implementing it too early is not.
This article described the technology dilemmas that innovators face and presented a model to explain how the dilemmas affect product development.
The next step would be to answer the question raised at the beginning of the article:
“What are the appropriate technical skills, technology stack and architecture for developing the new product?”
As already mentioned, this is a hard question to answer. Fortunately, the model presented here offers some insights that can help in developing ideas to answer it.
In a follow-up article, I will write about these ideas and try to suggest a technology strategy for innovation.
One of the challenges product teams encounter is how to decide which features should be included in their products. Identifying user needs helps teams focus on what a product should deliver to address a certain type of user. In time, teams develop many ideas on how to address these needs. The Kano model (proposed in the 1980s by Noriaki Kano) offers a way to differentiate these features by focusing on customer satisfaction. Ultimately, it can be used to answer the question: “Which features should be included in the Minimum Viable Product (MVP)?”
The Kano model categorizes features into five types based on the impact each has on customer satisfaction:
There is a methodology for mapping customer responses to questionnaires onto this model. When conducting a customer survey, the customer should be asked for their opinion on each feature, in both a positive and a negative question:
Each question should provide 5 possible answers:
The feature can be mapped to the Kano types based on the chosen answers using the table below:
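The evaluation table itself is not reproduced in this post, so the sketch below encodes one common variant of it; the exact answer wordings and (positive, negative) to type assignments are assumptions for illustration, not the original table:

```python
# One common variant of the Kano evaluation table (an assumption, since the
# original table is not shown): map the answers to the positive and negative
# questions onto a feature type.
KANO_TABLE = {
    ("like", "dislike"): "Linear",
    ("like", "tolerate"): "Delighter",
    ("like", "neutral"): "Delighter",
    ("like", "expect"): "Delighter",
    ("expect", "dislike"): "Must Have",
    ("neutral", "dislike"): "Must Have",
    ("tolerate", "dislike"): "Must Have",
    ("dislike", "like"): "Undesired",
    ("like", "like"): "Questionable",
    ("dislike", "dislike"): "Questionable",
}

def classify(positive_answer, negative_answer):
    """Return the Kano type for a (positive, negative) answer pair."""
    return KANO_TABLE.get((positive_answer, negative_answer), "Indifferent")

print(classify("like", "dislike"))    # a "Linear" (one-dimensional) feature
print(classify("expect", "dislike"))  # a "Must Have" feature
```

Pairs not listed fall through to “Indifferent”; contradictory pairs such as (like, like) are flagged “Questionable”, usually a sign the respondent misread the questions.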
Looking at the distribution of the answers, it is possible to assess the impact a certain feature has on customer satisfaction. If the answers are evenly distributed between two or more Kano types, this could be an indication that the product should be made available in different flavors (e.g. “standard” and “professional”) to better meet the needs of different customer types.
Teams should focus the development efforts on the “Must Have”, “Linear” and “Delighter” features in that order.
The MVP should be based on the “Must Have” features. However, even though the “Must Have” features are required for the product to survive, they do not have to be fully implemented from the beginning. In many cases a reduced version of a “Must Have” feature may also be adequate, with the full feature planned for a later release. The “Linear” and “Delighter” features should be planned for later releases.
A final important note: the impact a certain feature has on customer satisfaction is not constant and may decrease with time. For example, “Delighters” may degrade to “Linear” or “Must Have”, and even to “Undesired”.
How can data cubes (multidimensional data models) be used in the context of Business Process Management (BPM) to support management efforts in understanding, planning and improving business processes?
Traditional approaches define a customized multidimensional data model for analyzing operational data.
The proposed approach is to define a standardized multidimensional data model for analyzing data that can be applied to any business process model.
This approach is novel and innovative, not only because of the mapping of business process data to a data cube, but also because, once the mapping has been completed and facts have been generated (e.g. via simulation), non-technical business analysts gain a completely new understanding of the dynamic behavior within organizations through the extensive range of inquiries made possible by standard data cube analysis interfaces.
Business process management, simulation and the use of a tested methodology have complementary roles in providing businesses with useful information so that managers can make better decisions.
Business Process Management (BPM) is about the management of business processes. It is a top-down, cross-functional, process-centric management approach, which deals with the design, implementation, control and improvement of the end-to-end processes of an organization.
Business process modeling, as a part of Business Process Management, provides a visualization and static analysis component that describes the context in which a set of activities occur.
Simulation provides a dynamic behavior component that is able to test assumptions about all conditions that affect process performance.
The three perspectives (BPM, business process models and simulation) allow the extremely complex interactions and inter-dependencies within organizations to be better understood by managers, so that they can better direct their organizations. All three views are associated with traditional analysis methods that are, at best, only partially integrated. Due to this lack of integration, and the cumbersome, highly technical analysis techniques currently available, many critical questions cannot be easily answered, or cannot be answered at all.
A new approach is proposed that extends the functionality and value of BPM, process modeling, simulation and data cube analysis techniques, by combining, transforming and integrating all relevant data into a data cube structure that can be easily created and queried. In the context of business process modeling, a multidimensional model is developed for analyzing the data generated by business process model simulations. A process model, along with the simulation meta model, is used to define the dimensions and the facts for the multidimensional model. The multidimensional model developed is generic and is therefore not constrained to a specific kind of business process model.
When using this approach, non-technical users are able to analyze a specific business process model and answer critical questions about their organization. The business process model is used to fill the dimensions with data, and data from simulations of the model are used to fill the facts.
The business process cube makes it possible to answer a wide range of mission-critical questions regarding the performance of the organization. These range from simple ones, like “What is the time required to perform Activities per Process?”, to more complex questions which could not easily be answered before, such as “What is the average throughput time and cost for processing a specific piece of Information for a specific type of customer for each product type between a specific Input and a specific Output?”.
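As an illustrative sketch (the fact rows, dimension names, and measures below are invented for this example), the simple question above amounts to a roll-up of one measure over two dimensions of the cube:

```python
from collections import defaultdict

# Hypothetical simulation facts: one row per completed activity instance.
facts = [
    {"process": "Order", "activity": "Check",  "customer": "B2B", "minutes": 12, "cost": 5.0},
    {"process": "Order", "activity": "Pack",   "customer": "B2B", "minutes": 30, "cost": 9.0},
    {"process": "Order", "activity": "Check",  "customer": "B2C", "minutes": 10, "cost": 4.0},
    {"process": "Claim", "activity": "Review", "customer": "B2C", "minutes": 45, "cost": 20.0},
]

def rollup(facts, dims, measure):
    """Aggregate a measure over the chosen dimensions (a simple roll-up)."""
    totals = defaultdict(float)
    for row in facts:
        key = tuple(row[d] for d in dims)
        totals[key] += row[measure]
    return dict(totals)

# "What is the time required to perform Activities per Process?"
print(rollup(facts, ["process", "activity"], "minutes"))
```

Real OLAP tooling adds slicing, dicing, and drill-down on top of the same idea; here the dimension list passed to rollup plays the role of the cube’s group-by axes.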