AWS launches Kiro powers, integrating Stripe, Figma, and Datadog for AI-assisted coding



Amazon Web Services on Wednesday introduced Kiro powers, a system that lets software developers instantly give AI coding assistants expertise on specific tools and workflows. The feature addresses what the company calls a fundamental bottleneck in how artificial intelligence agents operate today.

AWS made the announcement at its annual re:Invent conference in Las Vegas. The feature marks a departure from how most current AI coding tools work: they typically preload all possible functionality into memory, a process that consumes computational resources and can overwhelm the AI with irrelevant information. Kiro powers take the opposite approach, enabling expertise only in the moments when developers actually need it.

"Our goal is to provide agents with specialized context to help them reach the right results faster. It is also a method that can reduce costs." Deepak Singh, Amazon’s vice president of developer agency and experiences, said in an exclusive interview with VentureBeat:

The launch includes partnerships with nine providers: Datadog, Dynatrace, Figma, Neon, Netlify, Postman, Stripe, Supabase, and AWS's own services. Developers can also create their own powers and share them with the community.

Why AI coding assistants choke when developers connect too many tools

To understand why Kiro powers matter, it helps to understand a growing tension in the AI development tools market.

Modern AI coding assistants utilize something called the Model Context Protocol (MCP) to connect to external tools and services. If a developer wants their AI assistant to work with Stripe for payments, Figma for design, or Supabase for databases, they connect to each service’s MCP server.

The problem is that each connection loads dozens of tool definitions into the AI's working memory before it writes a single line of code. According to AWS documentation, connecting to just five MCP servers can consume more than 50,000 tokens, roughly 40 percent of an AI model's context window, before a developer enters their first request.
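For a sense of scale, the arithmetic behind that figure works out as follows. In this minimal sketch, the per-server token cost and the context-window size are back-of-the-envelope assumptions chosen to be consistent with the quoted numbers, not published AWS measurements.

```python
# Rough arithmetic behind the context-overload claim. The per-server cost and
# the context window size below are illustrative assumptions, not AWS figures.
servers = 5
tokens_per_server = 10_000      # assumed average size of one server's tool definitions
context_window = 125_000        # assumed model context window

used = servers * tokens_per_server
print(f"{used:,} tokens consumed, about {used / context_window:.0%} of the window")
# -> 50,000 tokens consumed, about 40% of the window
```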

Developers have become increasingly vocal about this issue. Many complain that they don't want to burn through their token allocation just so an AI agent can decide which tools are relevant to a particular task. They want to get into their workflows immediately rather than watch an overloaded agent struggle to sort through irrelevant context.

Some in the industry call this phenomenon "context rot." Because AI services are typically billed by the token, it leads to slower responses, lower-quality output, and significantly higher costs.

Inside the technology that loads AI expertise on demand

Kiro powers address this problem by packaging three components into a single, dynamically loaded bundle.

The first component is a steering file called POWER.md, which serves as an onboarding manual for the AI agent: it tells the agent what tools are available and, crucially, when to use them. The second component is the MCP server configuration itself, the actual connection to external services. The third is a set of optional hooks and automations that trigger specific actions.
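As a rough illustration of how those three pieces might fit together, here is a minimal Python sketch. The Power class, its field names, and the MCP package name are hypothetical stand-ins rather than AWS's actual schema; the point is only to show a steering document, an MCP configuration, and optional hooks traveling as one bundle.

```python
from dataclasses import dataclass, field

@dataclass
class Power:
    """Illustrative model of a power bundle; all field names are hypothetical."""
    name: str
    steering_doc: str                  # contents of POWER.md: what the tools do and when to use them
    mcp_servers: dict[str, dict]       # MCP server configs: the actual connections to external services
    hooks: list[str] = field(default_factory=list)     # optional automations triggered by events
    keywords: list[str] = field(default_factory=list)  # terms that cause the power to load on demand

# A payments power might look something like this (the MCP package name is made up):
stripe_power = Power(
    name="stripe",
    steering_doc="Use the Stripe tools for payments; prefer test-mode keys during development.",
    mcp_servers={"stripe": {"command": "npx", "args": ["-y", "example-stripe-mcp-server"]}},
    hooks=["run payment integration tests after checkout code changes"],
    keywords=["payment", "checkout", "subscription"],
)
```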

When a developer mentions "payments" or "checkout" in a conversation with Kiro, the system automatically enables the Stripe power and loads its tools and best practices into the context. When the developer moves on to database work, the Supabase power activates and the Stripe power deactivates. With no powers active, baseline context usage approaches zero.
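AWS has not published the routing logic, but conceptually it resembles keyword matching over the developer's request. The self-contained Python sketch below is an assumption-laden approximation: the trigger keywords and the matching rule are invented for illustration and are not Kiro's actual mechanism.

```python
# Hypothetical keyword routing: only powers whose trigger terms appear in the
# prompt get their steering doc and MCP tools loaded into the model's context.
POWERS = {
    "stripe":   {"keywords": ("payment", "checkout", "subscription")},
    "supabase": {"keywords": ("database", "schema", "migration", "auth")},
}

def powers_for(prompt: str) -> list[str]:
    """Return the names of powers that should activate for this prompt."""
    text = prompt.lower()
    return [name for name, power in POWERS.items()
            if any(keyword in text for keyword in power["keywords"])]

print(powers_for("add a checkout flow for annual plans"))    # ['stripe']
print(powers_for("write a migration for the orders table"))  # ['supabase']
print(powers_for("refactor this React component"))           # [] -> near-zero baseline context
```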

"It will load automatically when you click the button." Shin said. "Once the power is created, developers simply select “Open in Kiro” and the IDE launches and everything is ready to go."

How AWS brings the techniques of elite developers to the masses

Singh framed Kiro powers as a democratization of advanced development practices. Before this feature, only the most sophisticated developers knew how to properly configure AI agents with specialized context: writing custom steering files, crafting precise prompts, and manually managing which tools were active at any given time.

"We found that developers are adding features to make agents more specialized." Shin said. "They wanted to give agents special powers to perform specific problems. For example, they wanted front-end developers, and they wanted their agents to be backend-as-a-service experts."

That observation led to an important insight: if a company like Supabase or Stripe builds the optimal context configuration once, every developer using those services can potentially benefit.

"Kiro powers formalize what only people, the most advanced, used to do, and make those skills available to everyone." Shin said.

Why dynamic loading beats fine-tuning for most AI coding use cases

The announcement also positions Kiro powers as a more economical alternative to fine-tuning, the process of training an AI model on specialized data to improve its performance in a specific area.

"It’s much cheaper and" When asked how power and fine-tuning compare, Singh replied: "Fine-tuning is very expensive and most Frontier models cannot be fine-tuned."

This is an important point. The most capable AI models from Anthropic, OpenAI, and Google are typically closed source, meaning developers cannot modify the underlying training. They can influence a model's behavior only through the prompts and context they provide.

"Most people are already using powerful models like Sonnet 4.5 or Opus 4.5." Shin said. "All you need for these models is to orient them in the right direction."

The dynamic loading mechanism also reduces ongoing costs. Because powers are enabled only when relevant, developers don't pay token costs for tools they aren't currently using.

How Kiro powers fit into Amazon's big bet on autonomous AI agents

Kiro powers arrive as part of AWS's broader push into what the company calls "agentic AI": artificial intelligence systems that can operate autonomously for long periods of time.

Earlier at re:Invent, AWS announced three "frontier agents": the Kiro autonomous agent for software development, AWS security agents, and AWS DevOps agents, all designed to run for hours or days without human intervention. These represent a different approach from Kiro powers, tackling large, open-ended problems rather than providing specialized expertise for specific tasks.

The two approaches are complementary. Frontier agents handle complex multi-day projects that require autonomous decision-making across multiple codebases. In contrast, Kiro powers provide developers with precise and efficient tools for everyday development tasks where speed and token efficiency are paramount.

The company believes that developers need both ends of the spectrum to be productive.

What Kiro powers reveal about the future of AI-assisted software development

This announcement reflects the maturation of the AI development tools market. GitHub Copilot, launched by Microsoft in 2021, introduced AI-assisted coding to millions of developers. Since then, tools like Cursor, Cline, and Claude Code have proliferated and captured developers' attention.

However, as these tools have grown more capable, they have also become more complex. The Model Context Protocol, which Anthropic open sourced last year, created a standard for connecting AI agents to external services. It solved one problem but created another: the context overload that Kiro powers now address.

AWS positions itself as a company that understands operational software development at scale. Singh emphasized that Amazon’s 20 years of experience running AWS, combined with its large in-house software engineering organization, gives it unique insight into how developers actually work.

"It is not only used for prototypes and toys." Mr. Singh had this to say about AWS’s AI development tools. "If you want to build a production application, there is a lot of knowledge that AWS can apply here."

The future of Kiro powers and cross-platform compatibility

AWS said Kiro powers currently work only within the Kiro IDE, but the company is building toward compatibility with other AI development tools, including command-line interfaces, Cursor, Cline, and Claude Code. The company's documentation describes the goal as letting developers "build power once, use it anywhere," though for now that vision remains aspirational.

For the technology partners launching powers today, the appeal is straightforward: rather than maintaining separate integration documentation for every AI tool on the market, they can create a single power that works everywhere. As more AI coding assistants enter the market, that efficiency becomes increasingly valuable.

Kiro powers are available now to developers using Kiro IDE version 0.7 or later, at no additional charge on top of the standard Kiro subscription.

The underlying bet is a familiar one in the history of computing: the winners in AI-assisted development will not be the tools that try to do everything at once, but the tools smart enough to know what to forget.


