Large language models are the new cloud battleground
Perhaps the biggest thing since open source or Google, LLMs may have companies fighting for supremacy, but it’s the developers who come out ahead.
Two months ago, Amazon didn’t make a single mention of AI on its earnings call (Google and Microsoft mentioned AI dozens of times each). This past week, by contrast, the company’s cloud division, Amazon Web Services (AWS), could talk about little else. As announced by Swami Sivasubramanian, vice president of database, analytics, and machine learning at AWS, the company is all over AI with the launch of new large language models (LLMs) and APIs to access them, as well as CodeWhisperer, a GitHub Copilot competitor, and more.
It’s not that AWS wasn’t working on AI before; Amazon has been working with AI for decades. Rather, it’s now impossible to ignore AI. For developers, I’ve recently argued, the time is now to start learning how to put LLMs to work for you in your code development. AWS, never one to chase competitors, has decided it can’t remain silent when everyone else is talking about the power of LLMs and AI to transform software development.
Just in time, too, as RedMonk’s James Governor argues that OpenAI is the new AWS. To the folks at Jina, OpenAI is the new Google. Either way, it’s big, with the potential to dramatically change how the clouds compete.
OpenAI as the new AWS
When Governor calls OpenAI “the new AWS,” he’s not suggesting that OpenAI, the company behind the LLM ChatGPT, will be rolling out its own version of Amazon EC2 or Amazon S3 anytime soon. Rather, he’s talking about the impact LLMs can have on software development. Inspired by a discussion Governor and I had recently over lunch, I wrote about this, suggesting that, “The race is on for developers to learn how to query LLMs to build and test code but also to learn how to train LLMs with context (like code samples) to get the best possible outputs.”
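To make that concrete, here is a minimal illustrative sketch of what "querying an LLM with context" can look like in practice. It uses the OpenAI Python SDK's chat completions endpoint; the model name, the sample function, and the prompt are placeholders chosen for illustration, not anything prescribed by AWS, OpenAI, or Governor.

# Illustrative sketch: prompt an LLM with code context to generate tests.
# Requires the OpenAI Python SDK (pip install openai) and OPENAI_API_KEY set.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Context the model should reason over: an existing function from our codebase.
code_context = '''
def slugify(title: str) -> str:
    return title.lower().replace(" ", "-")
'''

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {"role": "system", "content": "You are a senior Python reviewer."},
        {
            "role": "user",
            "content": "Given this function:\n"
            + code_context
            + "\nWrite pytest unit tests covering punctuation, repeated spaces, and empty strings.",
        },
    ],
)

print(response.choices[0].message.content)

The point is less the specific API than the workflow: the developer supplies context (the code) and intent (the tests), and the model produces a first draft to review.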
For Governor, past revolutions in developer productivity were launched by “AWS, open source, and GitHub” because “all of that stuff came together to help people learn and build.” With LLMs, he continues, “We’re at that point again.” LLMs lower barriers to developer productivity much like open source (no need to get purchasing’s approval for a software license) and cloud (swipe a credit card rather than ask to requisition a server). In this case, Governor says it’s not about reducing time to gain access to software/hardware or about collaboration (GitHub), but rather about dramatically reducing the time to learn. As he stresses, “AI makes it easier than ever to learn new skill sets.”
Back to AWS. One reason for the bevy of announcements this past week is that Microsoft, not AWS, has been at the forefront of enabling developer productivity with AI. Years ago, Microsoft bought GitHub, but prior to that it developed Visual Studio, the No. 1 IDE and code editor used by developers. Together, that’s a powerful one-two punch. Add OpenAI’s ChatGPT, which Microsoft has built into Bing, Copilot, and other Microsoft services, and Microsoft is now in pole position to earn developer loyalty.
As Governor puts it, given that any developer using ChatGPT is running on Azure, “What about a toolchain that eliminated Azure as a gating factor for developers building apps in GitHub and [Visual Studio] Code?” In other words, could Microsoft refocus developers away from the underlying cloud infrastructure onto the applications being built with ChatGPT? It could. Microsoft has done well with Azure, but it’s still catching up to AWS. By elevating the application experience and removing the “undifferentiated heavy lifting” of even thinking about the underlying cloud infrastructure, “Microsoft has the opportunity to create a once and future developer experience which finally and properly brings the pain to AWS,” to borrow Governor’s phrase.
David Linthicum correctly contends that “ ‘cost savings’ are a terrible way to define the value of cloud-based platforms.” Instead, he posits, cloud is “about delivering the more critical business values of agility and speed to innovation.” Nowhere is that more true than in this greenfield area of AI. The way the clouds surface that developer agility—rather than forcing developers to continue to muddle through mountains of different infrastructure services—will determine who wins the next $100 billion in cloud spend.
As the cloud companies contend, the biggest winners of all will be developers. Game on.
About TechX Corp.
TechX Corporation was named “AWS Partner of the Year” in Vietnam for both 2021 and 2022.
TechX Corporation is a young startup, founded in 2020 by a team of well-established technology experts with years of experience at multinational enterprises and VN30 corporations, with the mission of supporting Vietnamese companies on their digital transformation journey. TechX’s team of cloud experts has comprehensive insight into the Vietnamese market, especially in major industries such as banking and finance, technology, and e-commerce.
Having become an AWS Advanced Consulting Partner less than a year after its founding, TechX has been leveraging advanced AWS cloud services and technology to provide tailored cloud transformation solutions to its customers. Today, TechX Corp. is proud to serve as a cloud consulting partner to top banks and financial institutions in Vietnam, such as Maritime Bank (MSB), Vietnam International Bank (VIB), VietinBank, and FE Credit, as well as many other companies across different industries.