
Keeping the Lights On: Electric Grid Safety Hinges on Partnership and Information Sharing

Public Power CEO Sue Kelly Testifies at Senate Hearing

WASHINGTON, April 10, 2014 /PRNewswire-USNewswire/ -- Electric utilities have worked for years to improve the safety and reliability of the complex and dynamic electric grid, Sue Kelly, president and CEO of the American Public Power Association (Public Power), testified today at a Senate Energy and Natural Resources Committee hearing. Kelly testified on behalf of investor-owned, cooperatively owned, and publicly owned utilities, as well as independent generators and Canadian utilities. The industry's top priority is to protect critical power infrastructure from cyber and physical threats by partnering with all levels of government and sharing critical information, she said.

"Keeping the lights on for customers is of paramount importance to electric utilities. Because electricity is produced and consumed instantaneously and follows the path of least resistance, ensuring reliability and grid security is a collective affair," said Kelly.

The hearing, "Keeping the Lights On — Are We Doing Enough to Ensure the Reliability and Security of the U.S. Electric Grid?" was convened by the Senate Energy and Natural Resources Committee, chaired by Sen. Mary Landrieu (D-La.), with ranking member Sen. Lisa Murkowski (R-Alaska).

Kelly described the robust physical security and cybersecurity measures electric utilities already have in place and outlined how these measures have remained responsive to evolving threats over the years.

Recent media reports have profiled attacks on physical infrastructure, including the incident at Pacific Gas and Electric's Metcalf substation in California. While electric utilities take this incident seriously, the notion that media stories spurred action on grid security is inaccurate, Kelly noted. Well before the media reports, government and industry initiated a series of briefings across the country to help utilities and local law enforcement learn more about the Metcalf attack and its potential implications.

On March 7, 2014, the Federal Energy Regulatory Commission (FERC) directed the North American Electric Reliability Corporation (NERC), under Section 215 of the Federal Power Act (FPA), to submit proposed reliability standards on the physical security of critical assets within 90 days. Investor-owned, cooperatively owned, and publicly owned utilities, along with other industry stakeholders, are participating in the NERC process to develop this important standard.

The key to electric utility physical security is a "defense-in-depth" approach, which relies on resiliency, redundancy, and the ability to recover should an extraordinary event occur, Kelly said. The industry applies a similar defense-in-depth approach to cybersecurity to ensure a quick response if an attack occurs. Because there are more than 45,000 substations in the United States, prioritizing the most critical assets and focusing security planning on them is essential, Kelly explained. She noted that cybersecurity must be an iterative process, as the nature of threats constantly evolves.

Cybersecurity of the electric grid can be enhanced by improving information sharing between the federal government and industry, emphasized Kelly. The Electricity Sub-sector Coordinating Council (ESCC), a public/private partnership between the utility sector and the federal government, plays an essential role in coordination and information sharing. The ESCC has representatives from electricity trade associations, utilities and regional transmission organizations.

"The only way industry participants on the ground can truly protect against an event is to be aware of a specific threat or concern. They know which of their assets are critical. They know what they need to do to protect against the majority of physical and cyber threats," explained Gerry Cauley, CEO of the North American Electric Reliability Corporation, who also testified at the hearing. "However, if the government is aware of a specific threat, communicating that information to those individuals on the front lines is important. This communication differs from providing public access to sensitive information, but is an essential component of security protection," he added.

Others who testified were Cheryl LaFleur, FERC acting chair; Colette Honorable, National Association of Regulatory Utility Commissioners president; and Phil Moeller, FERC commissioner.

The American Public Power Association (Public Power) represents more than 2,000 not-for-profit, community-owned electric utilities providing reliable electric service to over 47 million Americans. More at www.PublicPower.org.

Kelly's full testimony is available at http://www.publicpower.org/files/PDFs/Kelly,%20Sue;%20testimony,%2004.10.14.pdf

SOURCE American Public Power Association
