To satisfy their generative AI ambitions, firms need to secure state-of-the-art software and ramp up computing power, which quickly brings them knocking on the doors of the globe's cloud computing giants, of which Amazon's AWS is the biggest.
Amazon's computing arsenal is housed in data centers scattered across the globe, and Prasad Kalyanaraman, Vice President for AWS Infrastructure, is the man in charge of keeping them running.
Amazon's AWS data centers are spread across dozens of regions, serving as a kind of engine room to the online world, with Microsoft and Google as the company's closest rivals.
And with the generative AI revolution entering hyperdrive, it is up to Kalyanaraman to make sure the data center battalions are ready for the challenge.
"It takes a significant amount of grit and innovation" to meet the need for computing right now, Kalyanaraman told AFP during an interview at Amazon's second headquarters near Washington.
"Building the right technology, both in terms of consuming the least amount of power needed, and optimizing all the way from the chip level to the data center level... requires a lot of innovation," he said.
Kalyanaraman, a graduate of the prestigious Indian Institute of Technology and Queen's University in Canada, has been at Amazon for almost two decades, where he worked on software before taking charge of the data centers.
"Most users, unbeknownst to them, are using cloud computing today. If you go to a website, or you stream video, or you go to your financial institution and look at your transactions, you're actually using some form of cloud computing," he said.
Amazon's decision to make a side business out of the cloud dates to 2006, when the company realized that its partners and sellers didn't want to build - or buy - expensive computing pipework.
"We saw that it's so hard for our customers to... go through all the muck of building this infrastructure. So why not bring this to them," much like utilities bring power to your home, he said.
Nearly two decades later, AWS accounts for close to 20 percent of the giant's total revenue and brings in about two-thirds of its total profit.
- Constraint as opportunity -
"It's a pretty significant undertaking to actually construct a data center from scratch," Kalyanaraman said.
"First, obviously, we have to find enough land to be able to deploy these data centers. Typically, we deploy further away from metropolitan locations" for both cost and environmental reasons, he said.
Connectivity is also key, since most clients want the high computing speeds that come from being closer to their data.
Then there has to be a power source and the power lines to get the electricity.
With success comes scrutiny, or in the case of some communities across the globe, some exasperation with the proliferation of data centers.
Data centers can creep into an area's bucolic scenery, and put a massive burden on the local power supply, straining already fragile electricity grids.
And with the emergence of generative AI, Amazon has announced new projects around the globe.
Kalyanaraman acknowledged that "power will be a constrained resource in the world today, especially with generative AI and some of the other things that are required to run this amount of compute."
But even though "it's not something that you can actually change overnight," Kalyanaraman said that AWS has worked with power companies to manage the flow, notably through renewable energy.
AWS "is the largest purchaser of renewable energy in the world today. And that's for four years in a row," he said, with AWS committed to being a net-zero carbon company by 2040.
Ever the techno-optimist, Kalyanaraman remained confident that innovation could find a way to meet the generative AI challenge, with the industry looking to nuclear energy to help.
"Every time we've actually had a constraint, we've all figured out a way of innovating."
"I see (AI) as an opportunity," he said.