
The Real Problem with Public-Sector Pension Plans - C.D. Howe Institute

TORONTO, March 19, 2014 /CNW/ - While public debate has mainly focused on the "gold-plated" defined benefits of many public-service pension plans, the real problem lies in a flawed approach to managing compensation costs, according to respected pension expert Malcolm Hamilton. In "Evaluating Public-Sector Pensions: How Much Do They Really Cost?" Hamilton argues that government sponsors typically underestimate the cost of guaranteeing future payouts, leading to the undervaluation of employee pension costs and the mismanagement of employee compensation.

"The problem is not with the defined-benefit plans per se, it relates to the mispricing of their guarantees, which leads to the over-compensation of employees and badly accounted-for risks to future taxpayers," says Hamilton. "This is particularly true in the federal public sector where plan members are insulated from investment risk, as compared to the provincial public sectors where members frequently bear half of the risk, if not more."

In the first of a two-part series on government employee pensions, Hamilton observes that public-sector pension plans in Canada have many virtues. "They are generally large, efficient and well managed. However, there are large differences between the fair values of the pensions earned by public-sector employees and the 'cost' of these pensions according to public-sector financial statements," states Hamilton.

These differences arise almost entirely from the pricing of guarantees. Specifically, the financial markets attach high values to the guarantees embedded in public-sector pension plans while government financial statements attach little or no value to these guarantees. This means that pension costs are materially understated and, as a consequence:

  • employees in the public sector are paid more than is publicly acknowledged and, in many instances, more than their private-sector counterparts;
  • public-sector employees shelter more of their retirement savings from tax than other Canadians are permitted to shelter; and
  • taxpayers bear much of the investment risk taken by public-sector pension plans while the reward for risk-taking goes to public employees as higher compensation.

Hamilton notes that private-sector pension accounting standards long ago rejected the premise at the heart of today's public-sector accounting standards: that the cost of a fully guaranteed pension depends critically upon the rates of return a pension fund can earn on risky investments, even though the pension itself is totally unaffected by those returns.

Public-sector accounting practice books the returns that a pension fund might reasonably expect to earn as a reward for future risk-taking long before the risks are taken, and this anticipated risk premium is used to reduce the reported cost of employee pensions. As a consequence, in a traditional defined-benefit pension plan where pensions are guaranteed and employee contributions are fixed, the reward for future risk-taking goes to employees who, because their pensions are fully guaranteed, take no risk. Future taxpayers, on the other hand, will be expected to bear risk without fair compensation.
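Hamilton's point is, at bottom, an arithmetic one: the same guaranteed payment has a very different present value depending on whether it is discounted at a rate appropriate to a guarantee or at the expected return on a risky portfolio. The sketch below illustrates the mechanism in Python; the payment, horizon and rates are illustrative assumptions, not figures from the report.

```python
# Illustrative only: the amounts and rates below are assumptions,
# not figures from Hamilton's report.

def present_value(payment: float, rate: float, years: int) -> float:
    """Discount a single guaranteed future payment back to today."""
    return payment / (1 + rate) ** years

payment = 50_000.0      # guaranteed pension payment (assumed)
years = 25              # years until the payment is due (assumed)

risk_free = 0.03        # rate appropriate to a guaranteed cash flow (assumed)
expected_return = 0.06  # expected return on a risky pension portfolio (assumed)

# What the guarantee is worth in financial markets vs. what
# public-sector statements report as its cost.
fair_value = present_value(payment, risk_free, years)
reported_cost = present_value(payment, expected_return, years)

print(f"Fair value at the risk-free rate:      ${fair_value:,.0f}")
print(f"Reported cost at the expected return:  ${reported_cost:,.0f}")
print(f"Understatement borne by taxpayers:     ${fair_value - reported_cost:,.0f}")
```

With these assumed numbers, discounting at the expected return roughly halves the reported cost of the guarantee; the gap is the risk premium credited to the plan before any risk has actually been borne.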

"Essentially, we have devised a complicated way to transfer wealth from future taxpayers to current plan members," notes Hamilton.

The good news, he says, is that once the accounting problem is recognized for what it is, the solution becomes obvious. "The risks that taxpayers are being asked to bear without compensation should be transferred, in whole or in part, to the plan members on whose behalf these risks are being taken."

This can be accomplished in a variety of ways, says Hamilton. Benefits can be tied to funding levels and/or to the performance of pension funds. Employee contributions and/or salaries can be tied to the cost of funding their pensions. "Many provincial governments have already started to move in this direction," notes the author.
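As one illustration of how such a transfer might work mechanically, a risk-sharing rule can tie indexation and member contributions to the plan's funded ratio. The sketch below is hypothetical; its thresholds and adjustments are assumptions for illustration, not the terms of any actual plan.

```python
# A hypothetical risk-sharing rule, sketched for illustration; the
# thresholds and adjustments are assumptions, not any plan's actual terms.

def adjust_plan(funded_ratio: float, base_indexation: float,
                base_contribution: float) -> tuple[float, float]:
    """Tie benefits and member contributions to the plan's funded status.

    Below full funding, inflation indexation is suspended and member
    contributions rise; above a surplus threshold, both are relaxed.
    """
    if funded_ratio < 1.00:   # deficit: members share the downside
        return 0.0, base_contribution + 0.02
    if funded_ratio > 1.10:   # surplus: members share the upside
        return base_indexation, base_contribution - 0.01
    return base_indexation, base_contribution

# Example: an 85%-funded plan suspends indexation and raises contributions.
indexation, contribution = adjust_plan(0.85, base_indexation=0.02,
                                       base_contribution=0.09)
print(f"Indexation granted: {indexation:.1%}, "
      f"member contribution rate: {contribution:.1%}")
```

Under a rule of this kind, members rather than future taxpayers absorb a share of the investment risk, which is precisely the realignment Hamilton recommends.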

The C. D. Howe Institute is an independent not-for-profit research institute whose mission is to raise living standards by fostering economically sound public policies. It is Canada's trusted source of essential policy intelligence, distinguished by research that is nonpartisan, evidence-based and subject to definitive expert review. It is considered by many to be Canada's most influential think tank.

For the report go to: http://www.cdhowe.org/evaluating-public-sector-pensions-how-much-do-they-really-cost/25181

SOURCE C.D. Howe Institute
