Like dozens of other popular tech events, Red Hat Summit 2020 didn’t feature any in-person executive speakers or vendors.
Instead, the open source enterprise IT software giant’s annual conference was held virtually due to COVID-19, using the very infrastructure and solutions the company brings to its enterprise customers.
In a virtual keynote, newly minted Red Hat CEO Paul Cormier spoke about the open hybrid cloud, its history and its future, breaking down each word and illustrating how organizations can benefit from a transition to the open hybrid cloud model.
We all have an open source story
According to Cormier, open source solutions are brought to market more quickly and effectively and tap into experts across the tech industry, allowing the best ideas to win.
“Open just results in better, more innovative, more secure code,” Cormier said.
The IT executive gave a brief history of the open source movement, which he said began at the University of Manchester in 1948 with the development of the first software programs.
Those first programs brought consistency and efficiency to computing. Software remained free until the 1970s and 1980s, when the industry – primarily Microsoft – won a legal battle for the right to charge money for it.
Simultaneously, the open source movement was beginning to take hold among hobbyists and a few niche players. Still, most software was brought to market by proprietary vendors, such as the Unix vendors, Cormier said.
“Every hardware vendor had their own Unix operating system and they added proprietary pieces,” Cormier said. “The OS would also only function on their boxes. And believe it or not, they all saw this as a lock-in feature, not a bug.”
Then came Linux, a hobby project that “exploded into a massive worldwide project,” Cormier said.
When enterprise-grade Linux was introduced, the “Unix chokehold was finally broken,” the IT chief said.
Then enterprises and governments started using Linux and open source software, and that innovation eventually spawned virtualization, cloud computing, containers, Kubernetes and, now, edge computing.
Now, open source-developed solutions essentially run the world, Cormier said.
“Open source is the backbone of inventions that have changed our lives, like the internet. Even the development model continues to drive the internet. … Open source puts rockets into space, open source runs the stock market, it helps doctors save lives. Today, every one of us touches open source on a daily basis.”
Hybrid: a strategic imperative
According to Cormier, a hybrid cloud will meet you wherever you are and help you future-proof what you have today and what you’re going to build for tomorrow.
“Hybrid is the reality for now and for a very long time to come,” Cormier said. “Just as we don’t want to be locked into software, we don’t want to be locked into closed infrastructure either.”
A hybrid cloud gives organizations the ability to run applications on whatever footprint makes sense for them, whether that is bare metal, virtualized infrastructure, private cloud, public cloud, multiple public clouds or the edge.
Cormier offered another history lesson about how Linux leveled the playing field and began accelerating innovation across the tech world.
“This is where virtualization came into play,” Cormier said.
“Virtualization now became a key piece to this horizontal world, enabling the enterprise to scale within their existing footprint, allowing applications to be built and deployed in new ways,” Cormier said. “And then the cloud was built on the foundation of Linux and open source virtualization.”
Now, a hybrid strategy is essential to scale.
“Hybrid isn’t a trend — it’s a strategic imperative,” Cormier said.
Distributing technology seamlessly on the cloud
Continuing his history lesson, Cormier said cloud computing took hold in the 1990s, primarily with SaaS applications like Salesforce.
“From an infrastructure standpoint, the early cloud-like experience was when someone else hosted our hardware. It was still ours, but managed by someone else,” Cormier said.
“Our data centers could be co-located with other companies, saving us time and management costs. We were then able to consume applications that lived outside our firewall, managed by someone else.”
Then came Amazon’s Elastic Compute Cloud in 2006, which allowed others to use Amazon’s excess data center infrastructure. That model eventually became Amazon Web Services.
Cormier defines the cloud as consuming technology in a seamless way, extending from on-premises infrastructure to multiple clouds and all the way to the edge, the latter being the most important piece because it puts computing closer to users and their data.
Users shouldn’t have to worry or care about where the cloud is or which cloud it is, Cormier said.
“For all of us running the infrastructure, we should be able to choose the right cloud or on-prem technology for the right application,” he said. “We should be able to have the agility and portability to move data, workloads and management wherever they need to be, whenever we want.”