When pricing the costs of building an application, knowing your projected Average Cost Per User (ACPU) is a critical first step. Having this data early in a new project can influence how you design your system, and it is also an early signal of whether the economics of an idea are viable.
The Average Cost Per User consists of the hardware costs, labor costs, and supporting costs needed to operate the project. Amazon Web Services (AWS) excels at clearly laying out how much hardware will cost as you gain scale, and its pricing offers economies of scale: per-unit costs decrease as your usage grows. With some work, the same figures can also be calculated for your own physical hardware. The following formula can be used to calculate your ACPU.
Average Cost Per User = (Hardware Costs + Labor Costs + Other Supporting Costs) / Total Users
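The formula above can be sketched in a few lines of Python. All figures here are illustrative assumptions, not real pricing data:

```python
# Hypothetical monthly figures -- every value below is an illustrative assumption.
hardware_costs = 12_000.0   # e.g. AWS compute, storage, and networking
labor_costs = 30_000.0      # engineering and operations time
supporting_costs = 3_000.0  # licenses, monitoring, support tooling
total_users = 5_000

# ACPU = (hardware + labor + supporting costs) / total users
acpu = (hardware_costs + labor_costs + supporting_costs) / total_users
print(f"Average Cost Per User: ${acpu:.2f}")  # $9.00 with these sample numbers
```

Tracking these inputs on the same period (for example, per month) keeps the result comparable as the application grows.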
This terminology is common in the commercial space, but it is not commonly calculated in enterprise settings. More often, organizations live within a budget and make use of the resources they are allocated. Calculating ACPU, however, lets you make data-driven decisions about how much investment is needed to support an enterprise application, and about how much it will cost to support X additional users. Knowing your cost per user helps you determine whether the application provides enough value to build or to continue supporting.
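Estimating the cost of X additional users can be sketched with a simple linear model. This is an assumption for illustration: it treats hardware as scaling linearly with users and labor as fixed, whereas in practice AWS economies of scale tend to lower the per-user rate at higher volume:

```python
def cost_for_additional_users(extra_users, variable_cost_per_user, fixed_step_costs=0.0):
    """Estimate the incremental monthly cost of supporting extra_users.

    Assumes per-user costs (mostly hardware) scale linearly -- a deliberate
    simplification; economies of scale usually reduce the rate as you grow.
    fixed_step_costs covers one-time jumps, e.g. hiring another operator.
    """
    return extra_users * variable_cost_per_user + fixed_step_costs

# Illustrative: 1,000 new users at an assumed $2.40/user in variable cost
print(cost_for_additional_users(1_000, 2.40))  # 2400.0
```

A projection like this, even with rough inputs, turns "can we afford to onboard another department?" into a concrete number rather than a guess.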