Charting a Migration Path from the Private Data Center to the Azure Public Cloud

In this Tech Barometer podcast, Rene van den Bedem of Microsoft's Cloud + AI division discusses the future of AI and how cloud computing is evolving to power more aspects of everyday life.

By Jason Lopez, July 30, 2024

He has always been a problem solver at heart. Rene van den Bedem builds solutions that help data centers move to the cloud. The veteran enterprise architect ensures that private data centers can interoperate with Azure, Microsoft's public cloud platform. One way he does this is by adapting the Nutanix and VMware hypervisors (the software that manages multiple virtual machines, or VMs) so they run smoothly inside Microsoft Azure.

Van den Bedem, Principal Technical Program Manager for Microsoft Azure, says most of the customers he works with are migrating their applications to public cloud environments for the long term.

"They no longer want to be in the business of managing their own data centers," he says. "It's operationally complex. They want to get out of the data center."

"I like hard problems, and I'm curious, so whenever I hear that something is impossible, it piques my interest," he said in an interview with The Forecast. Simple problems, he made clear, do not.

In this Tech Barometer podcast, Van den Bedem discusses the evolution of enterprise computing and how AI will shape the future of IT.

"People who work in the industry, if they don't adopt these new tools, they're going to be left behind," he says. "In 10 years, every job around the world will have some kind of AI-based copilot. Those who don't use one will simply be left behind."

He points to the many benefits of cloud computing, as well as the challenges, especially for companies that rushed into the cloud without thinking through their goals.

"When they got there, they realized, 'Oh, the business goals we've been trying to achieve are actually not being met,' or they weren't considered."

The future of computing, like the present, will be a mix of new and traditional models.

"Quantum computing is a big factor," he says. "It will be the next boom in technological innovation after AI."

"Quantum computing is not going to replace traditional computing," he says. "Every technology has pros and cons. It's a hybrid technology, and the two will need to coexist."

He says the way traditional computing models are built will also change so they can integrate with AI as well as quantum computing.

"We're on the cusp of the next rocket launch in this evolution. Will it be smooth sailing? No, I don't think so."


Industry collaboration and meeting customers face to face are, and will remain, critical. He expects hyperconverged infrastructure (HCI) and cloud platforms to keep evolving, and points to Nutanix's customer-centric approach and strong position in the market.

"Nutanix customers are very passionate about the infrastructure and the solutions that Nutanix brings to market."

As AI, cloud technologies, and hybrid models continue to shape the industry, Van den Bedem emphasizes customer experience and adaptability as essential to success.

Transcript (AI-generated):

Rene van den Bedem: When I started, the personal computer was becoming more miniaturized.

Jason Lopez: Rene van den Bedem says when he started his career in computing it was 1994. The trend was smaller and more compact machines. Windows 3, with a more user-friendly graphical interface, was the dominant OS. It was a period of diversification in personal computing. This is the Tech Barometer podcast. Rene van den Bedem is Principal Technical Program Manager at Microsoft, where he does a lot of work in cloud and digital transformation. We asked him about the state of enterprise computing, and he ushered us into a sort of timeline that ends up at AI but starts in the 90s, with computers becoming more miniaturized.

Rene van den Bedem: And then the networking constructs had just come out.


Jason Lopez: The emergence of Ethernet, token ring technology, and TCP/IP helped establish the building blocks for the interconnected world we live in today. It was the beginning of the transition from military and academic use to the public.

Rene van den Bedem: Jump 10 years later, we went from narrowband in telco, so you know, like PSTN, dial-up modems, 64k data circuits.

Jason Lopez: This shift from slow to faster connectivity laid the groundwork for the high-speed internet technologies that would make the cloud possible.

Rene van den Bedem: Jump to let's say 2001-2002, you had the explosion of the internet. The internet really became this mainstream thing.


Jason Lopez: This was the dot-com era, with faster chips and advances in hardware. It moved us from dial-up to broadband. It was a time marked by the spread of Wi-Fi. Mobility was becoming a big deal.

Rene van den Bedem: So you had all of these building blocks coming together to where we are now, with the invention of the cloud back in 2006, I think it was, with AWS.

Jason Lopez: There was a fundamental transformation in information technology, where physical infrastructure was being replaced by the cloud. It gave users unprecedented levels of accessibility, efficiency, and scalability.

Rene van den Bedem: And now in 2024 with AI, we're now on this cusp of this next rocket launch that's coming.

Jason Lopez: AI is becoming a tool with a wide range of uses, much the way calculators did back in the 80s and 90s. Microsoft, Rene says, is integrating AI into all its products under the name Copilot. This shift signals the era we're entering, where AI is a must-have technology.


Rene van den Bedem: People who work in an industry, if they don't adopt these new tools, they're going to be left behind. So in 10 years' time, all jobs around the world, most of them will have some type of AI-based co-pilot that you'll need to use to do your job, and those that don't, they'll just be left behind.

Jason Lopez: It's a continual evolution. And it especially applies to tech companies which must adapt to the changing needs and challenges of storing and processing an ever-increasing volume of data.

Rene van den Bedem: Obviously, having very, very fast, expensive storage, you need that for a part of the workloads, but then the ability to archive petabytes of data so that you can derive business value from your data sets, that's a necessity. So storage is always evolving. I'm sure something similar is going to be true for quantum computing. We're going to see a shift in the way that we build our traditional computing models so that they can harness and integrate with AI as well as quantum computing.

Jason Lopez: Cloud service providers are beginning to offer quantum products in a limited way, though scalable quantum computers are not yet a reality. Right now, it's in the realm of researchers and developers to experiment with quantum principles and algorithms.

Rene van den Bedem: Most of the cloud providers have a service that allows customers to play with quantum computing.

Jason Lopez: Unlike traditional computing, quantum computing stores information in a more complex way, with an exponential increase in processing power for certain types of problems. Rene says the future looks like a hybrid.

Rene van den Bedem: Quantum computing is not going to replace traditional computing. Every technology has got pros and cons. Quantum computing, even though the processing is happening, is not really able to maintain its state once the problem is solved. So what happens is you'll have your quantum computing model that's running, and then you'll have traditional computing services wrapped around that, and all of the data, once it's solved, goes into traditional computing software constructs, I suppose you would say, to maintain the results of that data and the history and the archives and all the reporting and everything. It's a hybrid technology where the two need to work together.

Jason Lopez: With the rise of the cloud, many businesses rushed in. There were and are all sorts of very good reasons: reduced IT costs, scalability and flexibility and agility, access to big data analytics, access to AI. And then, simply, it was a trend. There was a sort of peer pressure to move to the cloud.

Rene van den Bedem: When they got there, they realized, "Oh, the business goals that we've been trying to achieve are actually not being met," or they weren't considered. Typically, cost and operational complexity, because there's a level of skill that you need to have to work in a hyperscaler correctly, regardless of whether it's Google, Azure, or AWS. It may turn out that the laws of the land, the laws of physics, or the laws of economics are very, very important to them. So they're constrained as part of the business goals that they're trying to achieve. And it turns out, "Okay, running in the cloud is not such a good idea." And then they're forced to go back. So I have seen that a few times. Because obviously, you've got four major domains: people, process, technology, and financials. And if one of those is weak, then you're probably not going to be successful.

Jason Lopez: This is what hyperconverged infrastructure was born to do: to consolidate storage, compute, and networking resources into a single, easily managed, software-defined platform that helps reduce capital and operational expenditures. Nutanix launched its first HCI product in 2011, focusing on making data center infrastructure invisible. Rene says that when it comes to virtualization he's seen a variety of platforms, such as VMware and Hyper-V, though the Nutanix platform offers a more user-friendly experience. Nutanix Cloud Clusters, also known as NC2, is aimed at more easily managing workloads on hybrid clouds. Rene's job is to make sure these platforms work inside Azure.

Rene van den Bedem: The beauty of running NC2 on Azure or Azure VMware Solution, to use Microsoft as an example, because that's all being extrapolated in the back end and the customer doesn't see it, it's a lot easier for them to consume, because really the main requirement is "do you have an Azure landing zone," and then you can build whatever service that you want on it.

So, I've been working with Nutanix since 2014, and what I always respected about Nutanix was the fact that Nutanix was the company that invented hyperconverged infrastructure, and it was really about the customer experience. And even with the CSAT scores and support, there is really no better support than Nutanix. So customers that buy into the Nutanix ecosystem, it's similar to Apple fanboys and VMware fanboys: Nutanix customers are very passionate about the infrastructure and the solutions that Nutanix brings to market. And when you look at the evolution of NC2 on Azure, NC2 on AWS, that's really just an extension of the things that the customer is asking for. They want to get out of the data center business, they want to move into the cloud, and that's what Nutanix is doing with this multi-cloud strategy. And obviously VMware is going down a similar path as well. What makes Nutanix more interesting is that focus on customer experience, and then also you have this Broadcom acquisition of VMware. At the moment, the market is very much in a state of flux and it's not clear where the chips are going to fall. What does that mean for Nutanix? I think it puts Nutanix in a very, very interesting position. Because customers that don't have visibility on where the platform that their mission-critical and business-critical apps are running, that's a problem. If you introduce risk to the story, you're going to have a lot of customers that are going to be looking to shift and change to mitigate that risk.

Jason Lopez: Rene van den Bedem is Principal Technical Program Manager at Microsoft. This is the Tech Barometer podcast, thanks for listening, I'm Jason Lopez. Tech Barometer is produced by The Forecast, where you can find more stories on technology. Check out Jason Johnson's article, which profiles Rene, entitled "Simplifying Hybrid Cloud and Migrations to Azure Public Cloud." Just go to theforecastbynutanix.com. That's theforecastbynutanix, all one word, dot com.

Editor's note: Learn about Nutanix hybrid multicloud capabilities, compare VMware by Broadcom and Nutanix offerings, see how to migrate to Nutanix, and check out the VMware-to-Nutanix migration promotion.

Jason Lopez is the executive producer of Tech Barometer, the podcast of The Forecast, and the founder of Connected Social Media. He was previously an executive producer at PodTech and a reporter for NPR.

Ken Kaplan contributed to this article.

© 2024 Nutanix, Inc. All rights reserved. Nutanix, Inc. is not affiliated with VMware by Broadcom or Broadcom. For additional information and important legal disclaimers, please see here.
