The Role of Open Source in Data Center Transformation Brought about by Software Defined Infrastructure and the Internet of Things

Imad Sousou, VP and GM-Open Source Technology Center, Intel Corporation | Wednesday, 06 July 2016, 06:37 IST


There’s no better example of a market in transition today than the enterprise data center. The segment is being redefined by increasing business and customer demands as well as advances in technology, coupled with the continued growth of cloud-based solutions, including public and private cloud infrastructures.

Further driving this evolution are two trends: the rapid proliferation of devices and data as a result of the Internet of Things (IoT) and the move to a Software Defined Infrastructure (SDI) for data centers. Given its ability to deliver the rapid innovation and service delivery enterprises require, it’s no surprise open source software plays a critical role in this transition.

To understand why open source is key to the data center transformation, let's look at the trends outlined above. First, it's well known that the number of connected devices is increasing exponentially year over year. These devices can measure their surroundings and share the information they collect, and many now sense and connect to other devices directly, forming the Internet of Things.

It’s not just the smartphones, tablets and PCs we typically think of. Wearables, home products, smart city infrastructure, and industrial applications are all being connected. With each device capturing a steady stream of information, the amount of data being created is also increasing exponentially.

The projections are staggering: an estimated 50 billion devices will be connected by 2020, generating some 44 zettabytes of data. That's equivalent to 44 trillion gigabytes, or 11 times the amount of data contained in the entire World Wide Web in 2013. All that data needs to be stored, analyzed and accessed from somewhere, and that somewhere is the data center.

Even as the mountain of information grows, people expect immediate access to it from any device, any time they want, and, in the case of video, at exceptional quality. Businesses, in turn, are demanding the always-on availability and the highest levels of service delivery their customers expect, while also driving service flexibility, cost efficiency, security and compliance. It's no exaggeration to say today's data center needs to be more agile, efficient, and responsive than ever.

This leads directly to the second trend: the emergence of SDI, which empowers enterprises to compete effectively, respond more quickly to immediate demands and better anticipate the challenges ahead. With SDI, applications run on pooled server, storage and network resources rather than on discrete systems, and are optimally provisioned and dynamically managed by intelligent resource orchestration software to improve and maintain desired service levels.

Put another way, what used to be done by physical equipment and people in a data center is increasingly automated and completed virtually by software, often without human intervention. In this way, SDI enables not only higher efficiency, but also greater agility and better asset utilization compared to a traditional discrete hardware approach.

One example of this is provisioning equipment to deliver a new service in a traditional, non-virtualized environment. It's not uncommon for provisioning to take months as the service owner and IT scope an idea, order the right systems, set up equipment, install software, test it all together and finally deploy the service. With SDI, the process takes minutes: the business owner uses an online portal to choose what they need, and everything is pulled together automatically from a pre-qualified catalog, with the service-level agreement determining availability, storage, connection speed, and power and thermal management requirements.
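To make that contrast concrete, here is a minimal sketch in Python of what such self-service provisioning might look like behind the portal. The SLA tiers, resource figures and function names are hypothetical illustrations, not any vendor's actual API; they simply show how a chosen service level can be translated automatically into a resource request drawn from pooled capacity.

    # Hypothetical illustration: mapping an SLA tier chosen in a self-service
    # portal to a resource allocation drawn from pooled capacity.

    # Each tier pre-qualifies compute, storage, network and availability targets.
    SLA_TIERS = {
        "bronze": {"vcpus": 2, "ram_gb": 8,  "storage_gb": 200,  "bandwidth_mbps": 100,  "availability": "99.9%"},
        "silver": {"vcpus": 4, "ram_gb": 16, "storage_gb": 500,  "bandwidth_mbps": 500,  "availability": "99.95%"},
        "gold":   {"vcpus": 8, "ram_gb": 32, "storage_gb": 1000, "bandwidth_mbps": 1000, "availability": "99.99%"},
    }

    def provision_service(name: str, tier: str) -> dict:
        """Return a provisioning request assembled from the pre-qualified tier."""
        if tier not in SLA_TIERS:
            raise ValueError(f"Unknown SLA tier: {tier}")
        request = {"service": name, **SLA_TIERS[tier]}
        # In a real SDI stack this request would be handed to an orchestrator,
        # which schedules it onto pooled hardware without manual intervention.
        return request

    if __name__ == "__main__":
        print(provision_service("customer-portal", "gold"))

The point of the sketch is not the specific numbers but the shape of the workflow: the human choice is reduced to picking a pre-qualified tier, and software does the rest.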

If these two approaches were competing businesses, it wouldn't be hard to see which one has the advantage, thanks to a massive head start.

Keeping pace with ever-increasing business and consumer demands is only possible through continuous innovation. The industry is being asked to enable new usage models, develop more robust security solutions, and help ensure new features and technologies are easy to adopt—and do it all faster than ever. In a data center defined by software capabilities, more and more companies are looking to open source to power this scale and innovation.

Why? Because the open source approach lets them bring together diverse expertise and tap into the power of a community to tackle complex challenges and deliver new products to market in a way that few companies, even the largest providers, could manage alone. In fact, the nature of open source, built on shared collaboration, allows the industry to drive advances, test solutions, spot issues and make enhancements rapidly. The diverse requirements and the complexity of the required solutions all but demand collaboration to succeed.

For example, the Internet of Things depends on a vast array of products and devices working together seamlessly. That flexibility and interoperability can't be easily achieved through proprietary solutions, which is why so many connected devices, including the majority of mobile devices, run open source code. Developing proprietary products and expecting them to integrate seamlessly with everything else on the market is almost counter-intuitive, yet seamless integration is exactly what consumers expect.

Similarly, open source software is helping make SDI a reality for businesses of all sizes. A great example is OpenStack. The market is hungry for an open cloud operating system to power private enterprise clouds, and OpenStack is the community-driven project building that solution. One would be hard pressed to find a larger open source project that has come together so quickly, with so many contributors, both individual and corporate. The depth of expertise working on OpenStack allows the community to drive innovation on multiple fronts and respond to user needs faster than any single participant could alone.
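As a small illustration of what that self-service model looks like on OpenStack, the sketch below uses the openstacksdk Python library to request a virtual machine from pooled resources. The cloud profile name and the image, flavor and network names are placeholders for whatever an operator has defined; treat it as an assumption-laden sketch rather than a complete deployment recipe.

    # Sketch: launching an instance on an OpenStack cloud with openstacksdk.
    # "my-cloud" and the image/flavor/network names below are placeholders.
    import openstack

    # Credentials are read from a clouds.yaml profile or OS_* environment variables.
    conn = openstack.connect(cloud="my-cloud")

    # Look up pre-qualified building blocks by name.
    image = conn.compute.find_image("ubuntu-22.04")
    flavor = conn.compute.find_flavor("m1.small")
    network = conn.network.find_network("private")

    # The scheduler places the instance on suitable hosts from the pooled capacity.
    server = conn.compute.create_server(
        name="demo-instance",
        image_id=image.id,
        flavor_id=flavor.id,
        networks=[{"uuid": network.id}],
    )
    server = conn.compute.wait_for_server(server)
    print(server.name, server.status)

A request like this, issued through a portal or an automation pipeline, is the minutes-long equivalent of the months-long manual provisioning cycle described earlier.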

That diversity illustrates one of the best aspects of the open source approach: everyone can participate, from individual developers to the largest enterprises. As data centers evolve to be more flexible and agile to meet the always-on-demand expectations of today’s enterprises, open source is delivering the innovation to make this a reality.

The open source community invites industry members and enterprises alike to join the discussion and get involved in the transformation—your participation has never been needed more.
