
Following the technology deep-dives and innovation highlights covered in our previous blogs from the team on the ground at AWS re:Invent 2025, this final reflection from the event focuses on the key commercial impact that organisations are likely to see down the line.
If there was one defining shift at this year’s event, it was that AI is no longer experimental. It is operational. And it is becoming business critical.
The conversation has moved beyond pilots, proofs of concept and innovation ideation, and the message from AWS was clear – 2026 is about embedding AI into the core of the enterprise, commercially, structurally and culturally.
Here, Leighton’s CCO, Claire Cundill, covers some of the key commercial considerations for organisations shaping their technology and investment decisions in the year ahead.
Perhaps the most commercially significant shift was how AWS reframed AI agents. This time around the discussion wasn’t about copilots assisting individuals. It was about AI agents acting as members of teams, with memory, identity, guardrails and autonomy, built through services like Amazon Bedrock AgentCore.
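To make those four properties concrete, here is a minimal sketch of the shape of such an agent. This is illustrative only: the class and method names are hypothetical, not the Amazon Bedrock AgentCore API.

```python
from dataclasses import dataclass, field

@dataclass
class TeamAgent:
    """Hypothetical sketch of an agent combining identity, memory and guardrails."""
    identity: str                                # who the agent is within the team
    allowed_actions: set                         # guardrail: actions it may take
    memory: list = field(default_factory=list)   # persistent record of past actions

    def act(self, action: str, payload: str) -> str:
        # Guardrail check before any autonomous action
        if action not in self.allowed_actions:
            return f"blocked: {self.identity} may not perform '{action}'"
        self.memory.append((action, payload))    # remember what was done
        return f"{self.identity} performed '{action}'"

agent = TeamAgent(identity="support-agent", allowed_actions={"answer_query"})
print(agent.act("answer_query", "Where is my order?"))  # allowed
print(agent.act("issue_refund", "£50"))                 # blocked by guardrail
print(len(agent.memory))                                # only the allowed action is remembered
```

The point is structural rather than technical: the agent is a named team member with a scoped remit, not a free-floating chat interface.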
Organisations such as Lyft demonstrated what this means in measurable terms. By using Anthropic’s Claude via Amazon Bedrock for customer support, they achieved an 87% reduction in resolution time and 70% driver adoption of AI agents. Likewise, BMW reported a 75% reduction in test creation time and a 60% increase in coverage through domain-specific AI agents supporting modernisation. These examples demonstrate not only the operational efficiencies on offer but also the impact successful implementation could have on the bottom line.
The implications of these agents are profound. Labour productivity models will change. Cost-to-serve will shift dramatically. Linear workflows will no longer be competitive and team structures will likely need redesigning.
And these changes are happening at pace, with speakers at AWS re:Invent highlighting that AI adoption is moving faster than mobile did. The organisations that treat AI agents as core operational assets, not just tools, will soon be at a commercial advantage compared to the rest of the market.
Legacy technology remains the single biggest drag on innovation. Gartner estimates that up to 70% of IT budgets are spent maintaining legacy systems.
AWS directly targeted this issue with AWS Transform, an AI-powered modernisation service capable of reducing tech debt by up to 80%. Air Canada shared how AI-powered code transformation eliminated up to 70% of maintenance and licensing costs for Windows applications, reducing migration time and cost by 80%.
Equally, legacy systems are structurally difficult to integrate with AI at scale. In order to maximise the opportunities presented by AI, companies must invest in modern, reliable and robust infrastructures that can facilitate innovation and maintain operations at the same time.
This matters commercially for two reasons: modernisation is now mandatory if companies want to capitalise on the strategic advantages offered by AI, and boards can now justify transformation on hard ROI, not strategic intent alone.
Organisations that fail to modernise will find themselves unable to operationalise AI at scale. Those that act can reallocate legacy spend into growth, innovation and customer experience.
Beyond AI, AWS made a strong commercial play around cost optimisation. The introduction of Database Savings Plans, spanning RDS, Aurora, DynamoDB and more, signals a shift toward more flexible commitment models, while services like S3 Tables Intelligent-Tiering further automate cost efficiency.
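The commercial logic of a savings plan is simple break-even arithmetic: you commit to an hourly spend in exchange for a discounted rate. The figures below are made up for illustration, not AWS pricing.

```python
# Illustrative only: rates are hypothetical, not actual AWS Database Savings Plan pricing.
on_demand_rate = 1.00      # $/hour for a database workload billed on demand
savings_plan_rate = 0.70   # $/hour equivalent under an assumed 30% commitment discount
hours_per_month = 730      # average hours in a month

on_demand_cost = on_demand_rate * hours_per_month
committed_cost = savings_plan_rate * hours_per_month
monthly_saving = on_demand_cost - committed_cost

print(f"on-demand: ${on_demand_cost:.2f}/month")
print(f"committed: ${committed_cost:.2f}/month")
print(f"saving:    ${monthly_saving:.2f}/month ({monthly_saving / on_demand_cost:.0%})")
```

The commitment only pays off for steady-state usage; spiky or short-lived workloads can leave committed spend unused, which is exactly the trade-off finance and engineering teams need to model together.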
At the same time, new infrastructure releases such as Graviton5 and Trainium3 UltraServers are delivering significant performance-per-dollar improvements.
The commercial takeaway here is that cost optimisation and innovation are no longer opposing forces. The organisations leading in cloud maturity are actively redesigning workloads for efficiency, leveraging ARM-based architectures, aligning financial governance with engineering decisions and embedding observability as a cost-control mechanism.
AWS Interconnect, which enables managed private connectivity between AWS and other clouds, was another notable announcement, particularly with Azure connectivity landing in 2026.
This signals commercial realism that customers are multi-cloud, and AWS is leaning into that reality. For organisations, this means vendor lock-in concerns are reduced, negotiation leverage increases, architectural flexibility improves and M&A integration becomes potentially much more streamlined.
Technology decisions will increasingly be shaped by optionality and commercial leverage, not single-platform loyalty.
One of the more reassuring signals was how seriously AWS is addressing AI governance. With Bedrock AgentCore Policy Controls (Cedar-based enforcement) and AI-assisted IAM policy generation, AWS is enabling organisations to embed compliance into agentic systems, particularly relevant under frameworks like the EU AI Act.
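Cedar's core model is default-deny: an agent's action is allowed only if an explicit policy permits that (principal, action, resource) combination and any attached condition holds. The sketch below expresses that semantics in plain Python; the policy entries and names are hypothetical, and this is not the AgentCore Policy Controls API.

```python
# Hypothetical Cedar-style policies: (principal, action, resource, condition on context).
POLICIES = [
    ("support-agent", "read", "customer-record", lambda ctx: True),
    ("support-agent", "refund", "payment", lambda ctx: ctx.get("amount", 0) <= 100),
]

def is_authorised(principal, action, resource, ctx):
    """Default-deny: permit only if some policy matches and its condition holds."""
    return any(
        p == principal and a == action and r == resource and cond(ctx)
        for p, a, r, cond in POLICIES
    )

print(is_authorised("support-agent", "read", "customer-record", {}))        # True
print(is_authorised("support-agent", "refund", "payment", {"amount": 50}))  # True
print(is_authorised("support-agent", "refund", "payment", {"amount": 500})) # False: over limit
```

Encoding limits like the refund cap as policy rather than application code is what makes agentic systems auditable: the guardrail is declared, versioned and reviewable in one place.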
This matters commercially for three reasons: AI risk is now a board-level conversation; scaling AI without governance increases regulatory exposure; and auditability and observability remain fundamental alongside the practical implementation of AI. The organisations that scale responsibly will move faster in the long run.
Dr. Werner Vogels’ final re:Invent keynote introduced the idea of the “renaissance developer”, describing professionals with broad, cross-disciplinary expertise across the stack, who understand more than just code and who connect multiple disciplines to achieve the best results.
Commercially, this extends beyond developers. The most competitive organisations will be those that are cross-functional by design, organised around outcomes – not just departments, built for human-AI hybrid collaboration and structured as small, empowered innovation squads spanning development, UX/UI, product, design and strategy.
We heard repeatedly about balancing centralised control with decentralised innovation and that tension will define operating models and the shape and size of organisations in 2026.
To maximise the commercial opportunities, companies should focus on identifying high-impact AI agent use cases tied to measurable outcomes, accelerating modernisation to unlock AI capability, embedding cost optimisation into architectural decisions, investing in cross-functional capability and AI fluency, and leveraging partner ecosystems and funding to move faster.
The commercial winners will not be those who experiment the most, but those who operationalise the fastest. The question is no longer whether AI will reshape your business, it’s how quickly you will reshape it yourself.
This is where experienced delivery partners become critical. As a digital product engineering consultancy specialising in application modernisation, cloud enablement and practical AI implementation, Leighton partners with organisations to transform legacy systems into secure, resilient and cost-efficient platforms built for growth. We help leadership teams move from strategy to execution, modernising applications and infrastructure in ways that unlock measurable performance, scalability and commercial impact.
Through our AI ideation workshop, we also help organisations move beyond AI experimentation by identifying practical use cases, building AI-enabled features and automation, and putting the right governance in place to scale safely and responsibly.
If your organisation is exploring modernisation, AI readiness or how to build AI into your roadmap so you can achieve practical outcomes – we’d love to talk. You can contact the team here.