Live Panel: Sustainability in the Data Center

As our data-driven global economy continues to expand with new workloads such as digital assets and currencies, artificial intelligence, and advanced healthcare, our data centers continue to evolve with denser computational systems and larger data stores. This creates challenges for sustainable growth and cost management.

On April 25, 2023, the SNIA Networking Storage Forum will explore this topic in a live webinar, “Sustainability in the Data Center Ecosystem.” We’ve convened a panel of experts who will cover a wide range of topics, including delivering more power efficiency per unit of capacity, revolutionizing cooling to reduce heat, increasing system processing to enhance performance, consolidating infrastructure to reduce the physical and carbon footprint, and applying current and new metrics for carbon footprint and resource efficiency.

Beginning with a definition of sustainability, they will discuss:

Read More

A Q&A on the Open Programmable Infrastructure (OPI) Project

Last month, the SNIA Networking Storage Forum hosted several experts leading the Open Programmable Infrastructure (OPI) project in a live webcast, “An Introduction to the OPI (Open Programmable Infrastructure) Project.” The project has been created to address a new class of cloud and datacenter infrastructure component. This new infrastructure element, often referred to as a Data Processing Unit (DPU), an Infrastructure Processing Unit (IPU) or, as a general term, an xPU, takes the form of a server-hosted PCIe add-in card or on-board chip(s) containing one or more ASICs or FPGAs, usually anchored around a single powerful SoC device.

Our OPI experts provided an introduction to the OPI Project and then explained lifecycle provisioning, API, use cases, proof of concept and developer platform. If you missed the live presentation, you can watch it on demand and download a PDF of the slides at the SNIA Educational Library. The attendees at the live session asked several interesting questions. Here are answers to them from our presenters.

Q. Are there any plans for OPI to use GraphQL for API definitions since GraphQL has a good development environment, better security, and a well-defined, typed, schema approach?

Read More

FAQ on CXL and SDXI

How are Compute Express Link™ (CXL™) and the SNIA Smart Data Accelerator Interface (SDXI) related? It’s a topic we covered in detail at our recent SNIA Networking Storage Forum webcast, “What’s in a Name? Memory Semantics and Data Movement with CXL and SDXI,” where our experts, Rita Gupta and Shyam Iyer, introduced both SDXI and CXL, highlighted the benefits of each, discussed data movement needs in a CXL ecosystem, and covered SDXI advantages in a CXL interconnect. If you missed the live session, it is available in the SNIA Educational Library along with the presentation slides. The session was highly rated by the live audience, who asked several interesting questions. Here are answers to them from our presenters, Rita and Shyam.

Q. Now that SDXI v1.0 is out, can application implementations use SDXI today?

Read More

An Overview of the Linux Foundation OPI (Open Programmable Infrastructure)

A new class of cloud and datacenter infrastructure component is emerging into the marketplace. This new infrastructure element, often referred to as a Data Processing Unit (DPU), an Infrastructure Processing Unit (IPU) or, as a general term, an xPU, takes the form of a server-hosted PCIe add-in card or on-board chip(s) containing one or more ASICs or FPGAs, usually anchored around a single powerful SoC device.

The Open Programmable Infrastructure (OPI) project has been created to address the configuration, operation, and lifecycle of these devices. It also has the goal of fostering an open software ecosystem for DPUs/IPUs covering edge, datacenter, and cloud use cases. The project intends to delineate what a DPU/IPU is; to define frameworks and architecture for DPU/IPU-based software stacks applicable to any vendor’s hardware solution; to create a rich open-source application ecosystem; to integrate with existing open-source projects aligned to the same vision, such as the Linux kernel, IPDK.io, DPDK, DASH, and SPDK; and to create new APIs for interaction with and between the elements of the DPU/IPU ecosystem:

Read More
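To make the “configuration, operation, and lifecycle” scope above a little more concrete, here is a minimal sketch in C that models a DPU/IPU lifecycle as a tiny state machine. The states and the transition rule are hypothetical illustrations for this post; they are not the lifecycle model the OPI project defines.

```c
/* Hypothetical sketch: a DPU/IPU lifecycle as a tiny state machine.
 * The states and transition rule are illustrative only; they are NOT
 * the lifecycle model defined by the OPI project. */
#include <stdbool.h>
#include <stdio.h>

typedef enum {
    XPU_DISCOVERED,     /* card detected on the PCIe bus              */
    XPU_PROVISIONED,    /* firmware/OS image installed on the SoC     */
    XPU_CONFIGURED,     /* offload services (network, storage) set up */
    XPU_RUNNING,        /* serving the host                           */
    XPU_DECOMMISSIONED  /* removed from service                       */
} xpu_state;

/* In this toy model, a device may only advance one step at a time,
 * or be decommissioned from any state. */
static bool can_transition(xpu_state from, xpu_state to)
{
    return to == from + 1 || to == XPU_DECOMMISSIONED;
}

int main(void)
{
    xpu_state state = XPU_DISCOVERED;
    const xpu_state plan[] = { XPU_PROVISIONED, XPU_CONFIGURED, XPU_RUNNING };

    for (unsigned i = 0; i < sizeof plan / sizeof plan[0]; i++) {
        if (!can_transition(state, plan[i])) {
            fprintf(stderr, "illegal transition %d -> %d\n", state, plan[i]);
            return 1;
        }
        state = plan[i];
        printf("device now in state %d\n", state);
    }
    return 0;
}
```

A real framework would drive these transitions through vendor drivers and remote APIs rather than a local loop; the point is only that a common lifecycle model gives every vendor’s device the same provisioning story.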

Programming Frameworks Q&A

Last month, the SNIA Networking Storage Forum made sense of the “wild west” of programming frameworks, covering xPUs, GPUs and computational storage devices at our live webcast, “You’ve Been Framed! An Overview of xPU, GPU & Computational Storage Programming Frameworks.” It was an excellent overview of what’s happening in this space.

There was a lot to digest, so our stellar panel of experts has taken the time to answer the questions from our live audience in this blog.

Q. Why is it important to have open-source programming frameworks?

A. Open-source frameworks enable community support and partnerships beyond what proprietary frameworks allow. In many cases they let ISVs and end users write one integration that works with multiple vendors.
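To illustrate that “one integration, multiple vendors” point, here is a minimal C sketch assuming a made-up ops-table interface: the application is written once against a single structure of function pointers, and different vendor backends plug in behind it. None of the names below come from a real framework.

```c
/* Hypothetical sketch: one application integration, multiple vendor backends.
 * Every name here is invented for illustration; this is not a real framework API. */
#include <stddef.h>
#include <stdio.h>

/* The single interface an ISV or end user codes against. */
typedef struct {
    const char *vendor;
    int (*offload_copy)(const void *src, void *dst, size_t len);
} accel_ops;

/* Stub backend for "vendor A": pretends to offload, just reports. */
static int vendor_a_copy(const void *src, void *dst, size_t len)
{
    (void)src; (void)dst;
    printf("vendor A accelerator moved %zu bytes\n", len);
    return 0;
}

/* Stub backend for "vendor B". */
static int vendor_b_copy(const void *src, void *dst, size_t len)
{
    (void)src; (void)dst;
    printf("vendor B accelerator moved %zu bytes\n", len);
    return 0;
}

/* Application code: written once, unaware of which vendor sits underneath. */
static int run_workload(const accel_ops *ops)
{
    char src[64] = "payload", dst[64];
    return ops->offload_copy(src, dst, sizeof src);
}

int main(void)
{
    const accel_ops backends[] = {
        { "vendor-a", vendor_a_copy },
        { "vendor-b", vendor_b_copy },
    };
    for (unsigned i = 0; i < sizeof backends / sizeof backends[0]; i++)
        run_workload(&backends[i]);
    return 0;
}
```

With a proprietary framework, changing vendors typically means rewriting the integration; with an open, shared interface like the hypothetical one above, only the backend changes.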

Q. Will different accelerators require different frameworks or can one framework eventually cover them all?

Read More

Memory Semantics and Data Movement with CXL and SDXI

Using software to perform memory copies has been the gold standard for applications performing memory-to-memory data movement or system memory operations. With new accelerators and memory types enriching the system architecture, accelerator-assisted memory data movement and transformation need standardization.

At the forefront of this standardization movement is the SNIA Smart Data Accelerator Interface (SDXI), which is designed as an industry-open standard that is extensible, forward-compatible, and independent of I/O interconnect technology.
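To show what the accelerator-assisted model looks like next to a plain software copy, here is a simplified C sketch. The descriptor layout and the submit routine are made-up stand-ins, not the actual SDXI descriptor format defined in the SNIA specification, and the “engine” is emulated in software so the example runs anywhere.

```c
/* Simplified sketch: a CPU memcpy versus a descriptor-based copy offload.
 * The descriptor below is a hypothetical stand-in, NOT the real SDXI
 * descriptor format; the "engine" is emulated in software. */
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* A toy copy descriptor: the CPU describes the work and hands it off,
 * instead of moving every byte itself. */
typedef struct {
    uint64_t src_addr;   /* source buffer address          */
    uint64_t dst_addr;   /* destination buffer address     */
    uint32_t length;     /* bytes to move                  */
    uint32_t completed;  /* set by the "engine" when done  */
} copy_desc;

/* Stand-in for a hardware data mover. */
static void submit_to_engine(copy_desc *d)
{
    memcpy((void *)(uintptr_t)d->dst_addr,
           (const void *)(uintptr_t)d->src_addr, d->length);
    d->completed = 1;   /* software would normally poll or take an interrupt */
}

int main(void)
{
    char src[32] = "hello, data mover";
    char dst[32] = {0};

    /* Traditional path: the CPU performs the copy itself. */
    memcpy(dst, src, sizeof src);

    /* Offloaded path: describe the copy, submit it, check completion. */
    copy_desc d = {
        .src_addr  = (uintptr_t)src,
        .dst_addr  = (uintptr_t)dst,
        .length    = sizeof src,
        .completed = 0,
    };
    submit_to_engine(&d);
    printf("engine completed: %u, dst = \"%s\"\n", d.completed, dst);
    return 0;
}
```

Part of the point of standardizing the descriptor and submission model is that the same application code can target a software implementation today and a hardware data mover later.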

Adjacently, Compute Express Link™ (CXL™) is an industry-supported Cache-Coherent Interconnect for Processors, Memory Expansion, and Accelerators. CXL is designed to be an industry-open standard interface for high-speed communications, as accelerators are increasingly used to complement CPUs in support of emerging applications such as Artificial Intelligence and Machine Learning.

Read More

You’ve Been Framed! An Overview of Programming Frameworks

With the emergence of GPUs, xPUs (DPU, IPU, FAC, NAPU, etc.) and computational storage devices for host offload and accelerated processing, a panoramic wild west of frameworks is emerging, all vying to be one of the preferred programming software stacks that best integrates the application layer with these underlying processing units.

On October 26, 2022, the SNIA Networking Storage Forum will break down what’s happening in the world of frameworks in our live webcast, “You’ve Been Framed! xPU, GPU & Computational Storage Programming Frameworks.”

We’ve convened an impressive group of experts that will provide an overview of programming frameworks that support:

Read More

A Deep Dive on xPU Deployment and Solutions

Our first and second webcasts in this xPU webcast series explained what xPUs are, how they work, and what they can do. If you missed them, they are available to watch in the SNIA Educational Library. On August 24, 2022, the SNIA Networking Storage Forum will host the third webcast in this series, “xPU Deployment and Solutions Deep Dive,” where our xPU experts will explain next steps for deployments, discussing:

When to Deploy:

• Pros and cons of dedicated accelerator chips versus running everything on the CPU
• xPU use cases across hybrid, multi-cloud and edge environments
• Cost and power considerations

Read More

SNIA Experts Answer Questions on xPU Accelerator Offload Functions

The popular xPU webcast series hosted by the SNIA Networking Storage Forum continued last month with an in-depth look at the accelerator offload functions of the xPU. Our experts discussed the problems xPUs solve, where in the system they live, and the functions they implement. If you missed the session, you can watch it on-demand and access the presentation slides at the SNIA Educational Library. The Q&A here offers additional insights into the role of the xPU.

Q. Since xPUs can see traffic on the host, doesn’t that widen the surface area for exposure if the xPU were to be compromised?

Read More

SmartNICs to xPUs Q&A

The SNIA Networking Storage Forum kicked off its xPU webcast series last month with “SmartNICs to xPUs – Why is the Use of Accelerators Accelerating?” where SNIA experts defined what xPUs are, explained how they can accelerate offload functions, and cleared up confusion around the many other names associated with xPUs, such as SmartNIC, DPU, IPU, APU, and NAPU. The webcast was highly rated by our audience and already has more than 1,300 views. If you missed it, you can watch it on-demand and download a copy of the presentation slides at the SNIA Educational Library.

The live audience asked some interesting questions and here are answers from our presenters.

Q. How can we have redundancy on an xPU?

Read More