The difference between normal and actual costing systems lies in how the amount of overhead applied to production is calculated. Under normal costing, a predetermined overhead rate (sometimes called an estimated or normalized rate) is calculated by dividing the expected overhead costs by the expected usage of the allocation base (such as machine hours or direct labor hours). When overhead is applied to production, this predetermined rate is multiplied by the amount of the allocation base actually used to produce the product during the period. (Note: this is in contrast to standard costing, where the predetermined overhead rate is multiplied by the amount of the allocation base allowed for the actual output.) Under actual costing, no predetermined overhead rate is calculated. Instead, at the end of each period (month, quarter, or other reporting interval), the total actual overhead costs incurred for the period are divided by the total actual number of units produced to determine the amount of overhead allocated to each unit. Because actual costing requires a unique overhead application rate to be calculated every period from the actual incurred overhead costs and the actual production volume, it requires more staff time each reporting period. Normal costing is more economical because it uses a preset rate that is usually applied all year.
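The two calculations described above can be sketched in a few lines of Python. All of the budget and production figures below are hypothetical, invented purely for illustration:

```python
# Normal costing: a predetermined rate is set once, from budgeted figures.
budgeted_overhead = 120_000.0       # expected overhead for the year (hypothetical)
budgeted_machine_hours = 24_000.0   # expected usage of the allocation base

predetermined_rate = budgeted_overhead / budgeted_machine_hours  # 5.00 per machine hour

# Overhead applied in a period = predetermined rate x actual base usage.
actual_hours_used = 1_900.0
applied_overhead = predetermined_rate * actual_hours_used        # 9,500.00

# Actual costing: the rate is computed only after the period ends,
# from the actual totals for that period.
actual_overhead_incurred = 10_450.0
actual_units_produced = 950
overhead_per_unit = actual_overhead_incurred / actual_units_produced  # 11.00 per unit

print(applied_overhead, overhead_per_unit)
```

Note that under normal costing the rate stays at 5.00 per machine hour all year, while under actual costing the per-unit figure is recomputed every period.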
Because overhead costs are not incurred evenly during the year and production volume varies from period to period, allocating overhead under actual costing results in a different amount of overhead being allocated to each unit in each reporting period. Normal costing thus has an advantage in that it smooths product costs throughout the year by smoothing the amount of overhead applied to the units produced each period.
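The smoothing effect can be illustrated with hypothetical monthly data: when a largely fixed overhead amount is spread over a fluctuating volume, the actual-costing charge per unit swings widely, while the normal-costing charge per unit stays constant.

```python
# Hypothetical monthly data: overhead is roughly fixed, but volume swings.
months = [
    ("Jan", 10_000.0, 1_000),  # (month, actual overhead incurred, units produced)
    ("Feb", 10_000.0,   500),  # low-volume month
    ("Mar", 10_000.0, 2_000),  # high-volume month
]

# Actual costing: a fresh rate each month, so per-unit overhead fluctuates.
actual_per_unit = {month: overhead / units for month, overhead, units in months}
# Jan 10.00, Feb 20.00, Mar 5.00 per unit

# Normal costing: one predetermined rate for the whole year, here computed
# from hypothetical budgeted annual overhead over budgeted annual units.
predetermined_rate = 120_000.0 / 12_000  # 10.00 per unit, every month

normal_per_unit = {month: predetermined_rate for month, _, _ in months}

print(actual_per_unit)
print(normal_per_unit)
```

Under actual costing, the identical product costs four times as much to make in February as in March solely because of the volume swing; normal costing charges every unit the same 10.00 regardless of when it was produced.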
Since normal costing uses a predetermined overhead application rate while actual costing uses the actual overhead application rate, it is not true that normal costing provides improved accuracy of job and product costing. In fact, it is the other way around: actual costing provides improved accuracy because it uses actual allocation rates.
Since the totals of actual overhead costs incurred and actual units produced cannot be known until after the end of the period, normal costing has an advantage over actual costing in that it is more timely. Overhead can be allocated to products without having to wait for the end-of-period actual data on overhead costs incurred to become available.