AWS re:Invent 2025 - AWS Generative AI Innovation Center driving enterprise success with AWS Partners

🦄 Making great presentations more accessible.
This project aims to enhance multilingual accessibility and discoverability while maintaining the integrity of original content. Detailed transcriptions and keyframes preserve the nuances and technical insights that make each session compelling.

Overview

📖 AWS re:Invent 2025 - AWS Generative AI Innovation Center driving enterprise success with AWS Partners

In this video, Jacqueline Krain and Taimur Rashid from AWS discuss moving generative AI from experimentation to production deployment. Taimur, who leads the Generative AI Innovation Center, presents four key pillars for success: leveraging proprietary data as a competitive differentiator, building with a production-ready mindset, scaling through agentic AI, and redefining value measurement. He shares real-world examples including LCA’s HIPAA-compliant healthcare solution, SonicWall’s 40% code maintainability improvement, and Krungsri Bank’s 50% faster cloud migration using specialized ETL agents. The session highlights AWS tools like Bedrock, SageMaker, Agent Core, and the Partner Innovation Alliance, demonstrating how customers achieve up to 40% productivity gains through strategic AI implementation.

; This article is entirely auto-generated while preserving the original presentation content as much as possible. Please note that there may be typos or inaccuracies.

Main Part

Thumbnail 0

Introduction: The Generative AI Innovation Center’s Mission and Four Pillars for Production Success

Hello everyone. How’s your re:Invent going? Very well, excellent. Did you know that customers implementing generative AI at scale are seeing productivity gains of up to 40%? Yet only 15% of enterprises have moved beyond experimentation to full deployment. The difference between those who succeed and those who struggle often comes down to one thing: having the right roadmap. That’s what we’re here to talk with you about today.

Hi everyone, my name is Jacqueline Krain. Thank you for joining us here at our lightning talk. I lead our worldwide AWS growth initiatives, which are our top strategic priorities for AWS. I’m excited to be here with Taimur Rashid, who heads up our AWS Generative AI Innovation Center, one of our AWS growth initiatives.

Before we dive in, I want to acknowledge the incredible energy I have seen from our teams, from both our partners and our customers as we navigate this generative AI journey. It’s exactly the kind of challenge that makes our work at AWS meaningful. At AWS, we think of growth initiatives as our North Star, navigating market inflection points, whether it’s the generative AI revolution we’re experiencing now or helping customers transition workloads. We’re tackling the biggest challenges that keep our customers up at night.

We don’t just experiment; we build for production from day one. We don’t go it alone; we go there together. Innovation is a team sport, and we scale through our partner ecosystem, creating pathways that turn market disruption into opportunity. What you’ll gain today is battle-tested knowledge from thousands of customer implementations—not theoretical concepts, but practical insights about moving AI experiments to enterprise-wide deployment.

Taimur Rashid will walk you through impressive results we’ve achieved with partners, explain why ROI isn’t just a nice-to-have in the generative AI world, and reveal how our most successful customers are using data as their secret weapon while scaling agentic AI to drive efficiency. I can’t wait for you to hear these insights from Taimur. Here he is, Taimur Rashid.

Thumbnail 160

Thank you, Jacqueline. Thank you all for attending the talk here. My name is Taimur Rashid. I lead the Generative AI Innovation Center. What is the Generative AI Innovation Center? We started this team back in 2023, even though our roots go back to 2017 when AWS launched SageMaker. Our whole premise is to help customers with their journey across AI and machine learning. We are a multidisciplinary team, globally distributed, and we comprise strategists, applied scientists, and deployed engineers.

We work very closely with customers in identifying the right sets of use cases. We build alongside them, we deploy the solution, we operate it for them, and we ultimately teach them how to take that system and evolve it over time. We’ve helped customers across multiple industries, from financial services to healthcare. We even help startups as well, and we work on all kinds of use cases, from very simple chatbots to content summarization tools, all the way to very deep, experience-based things around model customization and fine-tuning.

Thumbnail 240

One of the ways that we scale is through our broad set of partners. One special thing that we’ve created within the innovation center is what we call the Partner Innovation Alliance, where we’ve taken a curated set of our Gen AI competency partners and brought them into what we call the Partner Innovation Alliance. We take these partners and we teach them our methodology and our approach. We share many of the tools that we’ve built, including a variety of solution accelerators that really help with the overall delivery of a Gen AI application.

Thumbnail 300

Many of these partners you will see here in the expo hall, and I encourage you to go visit them. What we have seen in the Innovation Center, and what we teach our partners, are the patterns of success: the specific pillars that take a guided implementation and ensure it can go into production safely, securely, and reliably. There are four aspects that I wanted to highlight in this talk.

Thumbnail 320

Thumbnail 350

The first thing is realizing that data is a competitive differentiator. With so many companies having access to these large language models, where do companies truly get their differentiation? By leveraging their most important asset: their proprietary data. Secondly, it is building with a production-ready mindset. Twenty-four months ago, we were all very focused on experimentation, doing proofs of concept and prototypes, but looking at it today, we have learned so much about what is actually required to take a prototype into production, and now we can start off with that production-ready mindset.

Thumbnail 380

Thumbnail 400

The third thing is realizing that with agentic AI, all these applications we have been building can ultimately scale, truly achieving hyper-automation in the front office, the back office, or those unique experiences we are trying to enable for our customers. And finally, it’s very important to rethink and redefine what we mean by value. Today, many of us use common understandings of value realization as proxies, but given how transformative agentic AI is, we have to think about how to redefine value. What I will do now is go a little deeper into each of these areas.

Implementing the Four Pillars: From Data Differentiation to Agentic AI at Scale

So when you look at pillar number one, data as a competitive differentiator, it’s so important to focus on data platforms. Today, any enterprise is dealing with large amounts of structured and unstructured data. The first step is understanding the nature of this data and ensuring that you have the right foundations across security, data quality, and data cleansing: all the things that ensure the data feeding into AI is of the highest quality.

Thumbnail 440

Thumbnail 460

The second aspect is realizing that the data has to be open and secure, and when I say open, to really get the value of AI, you have to have open access to all that data that resides in your company. But at the same time, you also have to ensure that it’s done securely and with the right permissions.

Thumbnail 500

The third aspect is choosing which customization approach to take with that data. If your data is constantly changing, RAG (retrieval-augmented generation) might be the best place to start. But if your data is generally steady and you want to leverage it, whether it’s domain-specific or proprietary in nature, you can consider a variety of customization approaches, starting with fine-tuning; we just announced a number of fine-tuning capabilities within SageMaker and within Bedrock. So that’s the first pillar.
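To make the RAG option above concrete, here is a minimal sketch of the pattern: retrieve the most relevant proprietary documents for a query, then ground the model's prompt in them. Everything here is illustrative; a real system would use a vector store and a hosted model (e.g. via Bedrock) rather than this toy word-overlap scorer.

```python
# Toy RAG sketch: retrieve relevant documents, then ground the prompt in them.
# The scoring, corpus, and prompt format are all illustrative assumptions.

def _tokens(text: str) -> set[str]:
    """Lowercase words with surrounding punctuation stripped."""
    return {w.strip(".,?$") for w in text.lower().split()}

def score(query: str, doc: str) -> int:
    """Crude relevance score: count of shared words between query and doc."""
    return len(_tokens(query) & _tokens(doc))

def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    """Return the k documents that overlap most with the query."""
    return sorted(corpus, key=lambda d: score(query, d), reverse=True)[:k]

def build_prompt(query: str, corpus: list[str]) -> str:
    """Ground the model's answer in retrieved context, not model weights."""
    context = "\n".join(retrieve(query, corpus))
    return f"Use only this context:\n{context}\n\nQuestion: {query}"

corpus = [
    "Claims over $10,000 require secondary review.",
    "Password resets are handled by the IT helpdesk.",
    "Secondary review must complete within 5 business days.",
]
prompt = build_prompt("How long does secondary review of claims take?", corpus)
print(prompt)
```

Because the knowledge lives in the corpus rather than in the model, updating a policy document updates the answers immediately, which is why RAG suits data that changes constantly.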

One example I wanted to share about how important data differentiation is comes from one of our partners, LCA. LCA is part of the Partner Innovation Alliance, and they have unique expertise in healthcare. They worked with a technology provider in the healthcare payer space, taking a HIPAA-compliant data source and applying domain-specific customization on that data lake. In that process they customized the model around this unique HIPAA-compliant dataset, which ensured the customer had a production-ready, curated, domain-specific dataset and model to build applications on top of.

Thumbnail 560

The second pillar is really transitioning towards this production-ready mindset, and there are four different aspects that I want to highlight. Number one is realizing that there are evolving fundamentals and foundational work that you all have to keep in mind. And this is ensuring that from an infrastructure standpoint, data, security, data governance, all those things are adding up. What we’ve observed through the thousands of engagements that we’ve done is when we do proof of concepts or prototypes, eventually it all leads back to foundational work that either needs to be tweaked or completely reimagined, so this is very important to do.

Secondly, it’s important to realize that agility is the name of the game. The space is evolving fast, and customer requirements are evolving just as fast, so as organizations think about being production-ready, they have to think about being very agile as well. We have a mantra in the Innovation Center, which is “live in 45.” Anytime we undertake a project, we give ourselves 45 days to do the work, and that creates a culture of agility within our teams.

The third point is model choice. No one size fits all, which is why in Bedrock we’ve given customers the ability to choose between multiple models: proprietary, first party, third party, as well as open source models. Models come in different sizes. Some are small, some are big, some are very domain-specific. Some models are great for coding tasks, others are great for document processing. It’s very important to identify the task that needs to be done and which model is ideal for it across task capability, accuracy, price, and performance.

Thumbnail 700

Finally, the fourth point is that as much as we think about operational and efficiency improvements with generative AI and agentic AI, the real value is in transformative applications. As we think about how we transform the way we work and the way we help our customers achieve that value from AI, it’s very important to think about that in the transformative mindset.

The second example I wanted to give is a customer called SonicWall. This was done in partnership between the Innovation Center and one of our partners called Cybge. In three weeks, we were able to take an open source framework, ensure that it had the right security guardrails around it, and ultimately ensure that it was performance-ready to go into production. The result of this work was that the customer created a multi-agent system with very specific agents handling different parts of the firewall configuration and the threat detection that needed to happen inside the system.

Thumbnail 770

With that multi-agent architecture, which is a very modularized approach, they were able to achieve a 40 percent improvement in the maintainability of that code base. If you’ve built software systems, a 40 percent improvement in code maintainability is extremely powerful because it helps you iterate on changes in a very agile way.
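A minimal sketch of why that modular multi-agent shape aids maintainability: each specialized "agent" owns one concern, and a thin supervisor only routes work to it. This is illustrative only, not SonicWall's actual architecture; in practice each handler would wrap an LLM-backed agent rather than a plain function, and the names below are assumptions.

```python
# Toy supervisor/specialist sketch of a modular multi-agent system.
from typing import Callable, Dict

def firewall_config_agent(task: str) -> str:
    """Specialist that only handles firewall configuration tasks."""
    return f"[firewall-config] applied rule for: {task}"

def threat_detection_agent(task: str) -> str:
    """Specialist that only handles threat-detection tasks."""
    return f"[threat-detection] analyzed: {task}"

# The supervisor is just a routing table: adding, replacing, or fixing an
# agent touches one entry, which is where the maintainability win comes from.
AGENTS: Dict[str, Callable[[str], str]] = {
    "config": firewall_config_agent,
    "threat": threat_detection_agent,
}

def supervisor(kind: str, task: str) -> str:
    """Route a task to the specialist registered for its kind."""
    agent = AGENTS.get(kind)
    if agent is None:
        raise ValueError(f"no agent registered for {kind!r}")
    return agent(task)

result = supervisor("config", "block inbound port 23")
print(result)
```

Swapping in a better threat-detection agent means replacing one function behind one key, leaving the configuration agent and the supervisor untouched.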

Now, as we think about the third pillar, truly taking agentic AI and scaling it, we’ve realized there’s a three-phase approach: customers need a mechanism for how they design agents, how they build them, and ultimately how they operate them. With Kiro we have a spec-driven IDE that allows teams to very quickly design the software, and it is AI-native in that the code is generated, documentation is generated, and unit tests are also generated, so all the heavy lifting that typically goes into software development is essentially removed. We’ve created a great equalizing effect where multiple teams, semi-technical in nature, can build software.

The second part is actually building the agents, and this is where Strands can be used. It’s an open source SDK that allows developers to build the underlying agents. Once you’ve built the agents, you have to think about how you productionize, operate, and manage them, and this is where Agent Core comes into play. As of yesterday we have eight primitives, the common set of things developers need to operationalize agents: a runtime environment, memory, gateway, identity, a browser tool capability, observability, and evaluations.

Thumbnail 890

You can use any framework, whether it’s an open source framework like LangGraph, CrewAI, or even Strands. You can leverage protocols like MCP and A2A. But ultimately you now have a scalable and reliable way to run and deploy those agents.
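Two of the primitives mentioned above, memory and the tool gateway, can be illustrated in plain Python. This is a conceptual toy only: real deployments would use the Strands SDK and Agent Core services, and every class and method name below is an assumption, not an actual API.

```python
# Toy agent loop illustrating a tool gateway and short-term memory.

class ToolGateway:
    """Registers tools and mediates every tool call (the 'gateway' idea)."""
    def __init__(self):
        self._tools = {}

    def register(self, name, fn):
        self._tools[name] = fn

    def call(self, name, *args):
        # Centralizing calls here is where auth, quotas, and logging
        # would live in a production gateway.
        return self._tools[name](*args)

class Agent:
    """Acts only through the gateway and remembers what it did."""
    def __init__(self, gateway):
        self.gateway = gateway
        self.memory = []  # the 'memory' primitive, reduced to a list

    def act(self, tool, *args):
        result = self.gateway.call(tool, *args)
        self.memory.append((tool, result))  # observable trace of actions
        return result

gateway = ToolGateway()
gateway.register("add", lambda a, b: a + b)
agent = Agent(gateway)
print(agent.act("add", 2, 3))  # tool call routed through the gateway
print(len(agent.memory))       # memory recorded the action
```

Because the agent never touches a tool directly, the gateway is the single place to enforce identity and observability, which is why those ship as separate primitives.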

I wanted to highlight my final example with a customer based out of Thailand called Krungsri. They’re the fifth largest bank in Thailand.

What is unique about this engagement is that we co-delivered it with one of our partners called One B Zero. One B Zero was among the first partners to build an agent as part of our partner agent factory. For those of you who have done migrations, one of the biggest bottlenecks is integrating data via ETL; getting all the data consolidated accounts for about 60% of migration challenges. What One B Zero did was build a very specialized agent around ETL. This agent was leveraged within the customer account, and they were able to achieve a 50% acceleration in the cloud migration, effectively cutting the migration time in half. This demonstrates the power of leveraging specialized agents for very specific tasks in the migration process.

Thumbnail 960

Redefining Value and Pathways to Implementation with AWS Partners

As we think about this agentic AI era and the value we are trying to get from this investment, it is very important for us to rethink and redefine what we mean by value. At the Innovation Center, we love partnering with emerging startups, and one of the emerging startups we have partnered with is PayI. PayI has built a very robust solution around FinOps for Gen AI and Agentic AI where you can get very detailed granularity into cost allocation and cost forecasting. Ultimately, what every C-level person and every organization is trying to answer is what is the value of the investment in AI. We are very excited to partner with them and our customers in helping define that value realization framework.

Thumbnail 1030

All of our capabilities are provided in the stack, and there are multiple ways of entering this platform. If you have a buy approach, you can take something like Quick Suite, Amazon Connect, or AWS Transform and leverage the capabilities we have built into these products; that is certainly one way to get started. In fact, we have seen customers start off with Quick Suite to get an AI-powered knowledge base for their corporate employees. Secondly, if you have a builder culture and want to build a customized solution, whether generative-based or agentic-based, a great way to start is with SageMaker and Agent Core. All the capabilities we have built into Bedrock and Bedrock Agent Core are fully at the disposal of developers and organizations.

Finally, we do not expect companies to do this alone. Where customers can leverage the expertise of the Innovation Center and our very large partner network, we bring that to the forefront for customers as well. I encourage you all to visit us at the Innovation Center. We have a booth here, and our 20 partners that are part of the Innovation Alliance also have booths here. We are all fully available to answer any questions. Thank you for the opportunity and the time, and we look forward to building with you all together.

; This article is entirely auto-generated using Amazon Bedrock.
