IEEE Forms Two Working Groups to Standardize Steps in Electronic Design Automation Focusing on Mixed-Signal Language Extensions and Interoperability of Large Scale Integration, Package and Board Design

IEEE, the world’s largest professional organization dedicated to advancing technology for humanity, today announced that the IEEE Standards Association (IEEE-SA) approved the formation of the IEEE P1666.1™ SystemC Analog/Mixed-Signal (AMS) Extensions Working Group and the IEEE P2401™ LPB – Standard Format for Large Scale Integration (LSI)-Package-Board Interoperable Design Working Group. Because both working groups conduct technical work in electronic design automation (EDA) standards development, they are sponsored by the IEEE Computer Society’s Design Automation Standards Committee (DASC).

“These draft standards seek to aid in the development of more powerful and economically affordable electronics,” said Stan Krolikoski, chair of the DASC. “Building in affordability broadens access to electronics, so more people can expand their reach of information, automate daily tasks and much more. In addition, both working groups are a result of the IEEE-SA’s collaboration with Accellera Systems Initiative and the Japan Electronics and Information Technology Industries Association (JEITA), which are organizations that support standards development for use by the global electronics industry.”

As the core standard, IEEE 1666™-2011 “Standard for SystemC Language Reference Manual” provides the high-level design and modeling language for digital electronics. Augmenting that standard, IEEE P1666.1 “SystemC Analog/Mixed-Signal (AMS) Extensions Language Reference Manual” is intended to allow SystemC to capture both analog and digital design content. With the objective of standardizing the SystemC AMS extensions, IEEE P1666.1 defines them as a powerful language option in the electronic system-level (ESL) design process.

“IEEE P1666.1 is expanding SystemC to model both digital and AMS functions,” said Martin Barnasconi, chair of the IEEE P1666.1 working group. “These two domains are increasingly found in system-on-a-chip (SoC) and embedded systems; therefore, this extension enables more efficient and effective modeling and design of these emerging heterogeneous systems.”

IEEE P2401 “Standard Format for LSI-Package-Board Interoperable Design” seeks to standardize the data exchange format for the integrated circuit, the package and the board—the three components that make up the hardware system. This interoperable format will expedite the exchange of design information among the three components, thereby accelerating system design and lowering its cost.

“The intention of IEEE P2401 is to offer a common format that LSI-Package-Board design tools can use to exchange information and data seamlessly,” said Yoshinori Fukuba, chair of the IEEE P2401 working group. “The standard proposes to eliminate the many tool-specific input and output formats in favor of a single common, interoperable format used throughout the design process.”

Both working groups are actively seeking participants for the development of these standards. For more information on the IEEE P1666.1 SystemC Analog/Mixed-Signal (AMS) Extensions Working Group, please visit https://standards.ieee.org/develop/wg/P1666.1.html. For more information on the IEEE P2401 LPB – Standard Format for LSI-Package-Board Interoperable Design Working Group, please visit https://standards.ieee.org/develop/wg/LPB.html.

IEEE 1666-2011 is available at no charge via the IEEE GET Program, which grants the public free access to view and download certain current individual standards. To view and download IEEE 1666-2011, please visit the IEEE 1666-2011 GET Program web page.

To learn more about IEEE-SA, visit us on Facebook at http://www.facebook.com/ieeesa, follow us on Twitter at http://www.twitter.com/ieeesa, connect with us on LinkedIn at http://www.linkedin.com/groups?gid=1791118 or on the Standards Insight Blog at http://www.standardsinsight.com.

About the IEEE Standards Association

The IEEE Standards Association, a globally recognized standards-setting body within IEEE, develops consensus standards through an open process that engages industry and brings together a broad stakeholder community. IEEE standards set specifications and best practices based on current scientific and technological knowledge. The IEEE-SA has a portfolio of over 900 active standards and more than 500 standards under development. For more information, visit http://standards.ieee.org/.

About IEEE

IEEE, a large, global technical professional organization, is dedicated to advancing technology for the benefit of humanity. Through its highly cited publications, conferences, technology standards, and professional and educational activities, IEEE is the trusted voice on a wide variety of areas ranging from aerospace systems, computers and telecommunications to biomedical engineering, electric power and consumer electronics. Learn more at http://www.ieee.org.

Copyright © 2009 Business Wire. All rights reserved. Republication or redistribution of Business Wire content is expressly prohibited without the prior written consent of Business Wire. Business Wire shall not be liable for any errors or delays in the content, or for any actions taken in reliance thereon.
