Whiteboard to cloud in minutes using Amazon Q, Amazon Bedrock Data Automation, and Model Context Protocol


Upgrading legacy systems has become increasingly important to stay competitive in today’s market, because outdated infrastructure can cost organizations time, money, and market position. However, modernization efforts face challenges like time-consuming architecture reviews, complex migrations, and fragmented systems. These delays not only slow engineering teams but also carry broader consequences, including lost market opportunities, reduced competitiveness, and higher operational costs. With Amazon Q Developer, Amazon Bedrock Data Automation (Bedrock Data Automation), and Anthropic’s Model Context Protocol (MCP), developers can now go from whiteboard sketches and team discussions to fully deployed, secure, and scalable cloud architectures in a matter of minutes, not months.

We’re excited to share the Amazon Bedrock Data Automation Model Context Protocol (MCP) server, which enables seamless integration between Amazon Q and your enterprise data. With this new capability, developers can use the features of Amazon Q while maintaining secure access to their organization’s data through standardized MCP interactions. In this post, you will learn how to use the Amazon Bedrock Data Automation MCP server to securely integrate with AWS services, use Bedrock Data Automation operations as callable MCP tools, and build a conversational development experience with Amazon Q.

The problem: Five systems, lack of agility

Engineers looked at a whiteboard, eyeing a complex web of arrows, legacy system names, and integration points that had long stopped making sense. The diagram represented multiple disconnected systems held together by brittle scripts, fragile batch jobs, and a patchwork of manual workarounds, as shown in the following illustration.

Collaborative AWS solution design meeting with whiteboard diagrams showing cloud services integration and data flow

The meeting audio was synthesized using Amazon Polly to bring the conversation to life for this post.

“We need to stop patching and start transforming,” Alex said, pointing at the tangled mess. The team nodded, weary from another outage that left the finance team reconciling thousands of transactions by hand. Feature development had slowed to a crawl, infrastructure costs were unpredictable, and any change risked breaking something downstream. Migration felt inevitable but overwhelming. The question wasn’t whether to modernize – it was how to begin without burning months in planning and coordination. That’s when they turned to the new pattern.

The breakthrough

Just a few months ago, building a working prototype from a whiteboard session like this would have taken months, if not longer. The engineers would have started by manually transcribing the meeting, converting rough ideas into action items, cleaning up architecture diagrams, aligning teams across operations and security, and drafting infrastructure templates by hand. Every step would have required coordination, and each change would have invited risk to the system. Even a proof of concept would have demanded hours of YAML, command line interface (CLI) commands, policy definitions, and trial-and-error troubleshooting. Now the engineers need only ask, and what used to take months happens in minutes.

With Amazon Q CLI, the team initiates a conversation. Behind the scenes, Amazon Q CLI invokes the MCP server, which uses Bedrock Data Automation to extract information from multimodal content: both the meeting recording and the draft architecture diagram are analyzed. Amazon Q uses the extracted content to generate the AWS CloudFormation template, and even deploys it to the AWS Cloud when asked. There is no manual translation, no brittle scripting, and no dependency mapping across systems. The result is a fully deployable, secure AWS architecture generated and provisioned in minutes. What once required cross-functional coordination and prolonged development cycles now starts and completes with a chat.

Understanding the Model Context Protocol

The Model Context Protocol (MCP) is an open standard developed by Anthropic to facilitate secure, two-way connections between AI models and multiple data sources, including content repositories, business tools, and development environments. By standardizing these interactions, MCP enables AI systems to access the data they need to provide more relevant and accurate responses.

MCP operates on a client-server architecture, where developers can either expose their data through MCP servers or build AI applications (MCP clients) that connect to these servers. This setup allows for a more streamlined and scalable integration process, replacing the need for custom connectors for each data source.
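
To make this concrete, the following is a minimal sketch of an MCP server built with the open source MCP Python SDK (FastMCP). The server name, tool, and tool logic are illustrative placeholders, not the actual Bedrock Data Automation MCP server implementation:

# Minimal MCP server sketch using the MCP Python SDK (FastMCP).
# The tool below is an illustrative stub, not the real
# Bedrock Data Automation MCP server implementation.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("demo-data-server")

@mcp.tool()
def get_document_summary(s3_uri: str) -> str:
    """Return a summary for a document stored in Amazon S3 (stubbed)."""
    # A real server would call a backend service here; this stub
    # echoes the request so the example stays self-contained.
    return f"Summary placeholder for {s3_uri}"

if __name__ == "__main__":
    # Serve over stdio, the transport that MCP hosts such as Amazon Q CLI use.
    mcp.run(transport="stdio")

An MCP host discovers the tools a server exposes at connection time, so adding a new capability is as simple as registering another decorated function.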

Enhancing Amazon Q with Amazon Bedrock Data Automation and MCP server

Bedrock Data Automation complements MCP by providing a robust suite of tools that automate the extraction, transformation, and loading (ETL) of enterprise data into AI workflows at scale and with minimal manual intervention. With Bedrock Data Automation, customers can:

  • Extract unstructured data from diverse sources such as document, image, audio, and video files.
  • Transform and validate data through schema-driven extraction with Blueprints, confidence scoring, and responsible AI practices to maintain accuracy, completeness, and consistency.
  • Load ready-to-use data into AI models for real-time, context-aware reasoning across the business. A sketch of calling the underlying API directly follows this list.
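
For perspective on what the MCP server automates on your behalf, here is a minimal sketch of invoking a Bedrock Data Automation job directly with the AWS SDK for Python (Boto3). The S3 URIs, project ARN, and profile ARN are placeholders for your own resources, and the polling loop reflects the service’s asynchronous API pattern:

# Sketch: invoking Bedrock Data Automation directly with Boto3.
# All ARNs and S3 URIs are placeholders for your own resources.
import time

import boto3

runtime = boto3.client("bedrock-data-automation-runtime", region_name="us-east-1")

response = runtime.invoke_data_automation_async(
    inputConfiguration={"s3Uri": "s3://amzn-s3-demo-bucket/meeting-recording.mp3"},
    outputConfiguration={"s3Uri": "s3://amzn-s3-demo-bucket/bda-output/"},
    dataAutomationConfiguration={
        "dataAutomationProjectArn": "arn:aws:bedrock:us-east-1:111122223333:"
        "data-automation-project/my-project"
    },
    dataAutomationProfileArn="arn:aws:bedrock:us-east-1:111122223333:"
    "data-automation-profile/us.data-automation-v1",
)

# The call is asynchronous: poll until the job reaches a terminal state,
# then read the extracted output from the S3 output location.
invocation_arn = response["invocationArn"]
while True:
    status = runtime.get_data_automation_status(invocationArn=invocation_arn)
    if status["status"] in ("Success", "ServiceError", "ClientError"):
        break
    time.sleep(5)
print(status["status"])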

This deep integration makes sure that AI models are not just connected to data but grounded in clean, validated, and context-rich information. As a result, intelligent agents deliver more accurate, relevant, and reliable outputs that drive faster decisions and richer insights across the enterprise.

Amazon Q Developer is a generative AI-powered conversational assistant from AWS designed to help software developers and IT professionals build, operate, and transform software with greater speed, security, and efficiency. It acts as an intelligent coding companion and productivity tool, integrated with the AWS environment and available in popular code editors, the AWS Management Console, and collaboration tools such as Microsoft Teams and Slack.

As described in the following figure, the Bedrock Data Automation MCP server works in the following way:

  1. The User sends a “Request action” to the MCP Host.
  2. The MCP Host processes the request with an LLM.
  3. The MCP Host then requests a tool execution to the MCP Client.
  4. The MCP Client makes a tool call request to the MCP Server.
  5. The MCP Server makes an API request to Bedrock Data Automation.
  6. Bedrock Data Automation sends back an API response to the MCP Server.
  7. The MCP Server returns the tool result to the MCP Client.
  8. The MCP Client sends the result back to the MCP Host.
  9. The MCP Host processes the tool result with the LLM.
  10. The MCP Host sends a final response to the User.

End-to-end request flow diagram showing MCP Host/Client/Server interaction with AWS Bedrock and LLM processing steps
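
To illustrate steps 4 through 8 from the client side, the following is a sketch built with the MCP Python SDK. The tool name analyzeasset comes from the server’s tool list shown later in this post; the argument name and value passed to it are assumptions for illustration:

# Client-side sketch of the tool-call flow (steps 4-8) using the MCP Python SDK.
# The arguments passed to the analyzeasset tool are illustrative assumptions.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server_params = StdioServerParameters(
    command="uvx",
    args=["awslabs.aws-bedrock-data-automation-mcp-server@latest"],
    env={"AWS_PROFILE": "your-aws-profile", "AWS_REGION": "your-aws-region"},
)

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()          # handshake with the MCP server
            tools = await session.list_tools()  # discover the exposed tools
            print([tool.name for tool in tools.tools])
            # Step 4: the client issues the tool call; steps 5-7 happen server-side.
            result = await session.call_tool(
                "analyzeasset", {"assetPath": "s3://amzn-s3-demo-bucket/meeting.mp3"}
            )
            print(result.content)               # Step 8: the result returns to the host

asyncio.run(main())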

Step-by-step guide

Prerequisites

If this is your first time using AWS MCP servers, visit the Installation and Setup guide in the AWS Labs GitHub repository for installation instructions.

Set up MCP

Install Amazon Q for command line and add the following MCP server configuration to ~/.aws/amazonq/mcp.json. If you’re already an Amazon Q CLI user, add only the configuration.

{
  "mcpServers": {
    "bedrock-data-automation-mcp-server": {
      "command": "uvx",
      "args": [
        "awslabs.aws-bedrock-data-automation-mcp-server@latest"
      ],
      "env": {
        "AWS_PROFILE": "your-aws-profile",
        "AWS_REGION": "your-aws-region",
        "AWS_BUCKET_NAME": "amzn-s3-demo-bucket"
      }
    }
  }
}

To confirm the setup was successful, open a terminal and run q chat to start a chat session with Amazon Q.

Need to know what tools are at your disposal? Enter: "Tell me the tools I have access to"

If MCP has been properly configured, as shown in the following screenshot, you will see aws_bedrock_data_automation with three tools: getprojects, getprojectdetails, and analyzeasset. This quick check verifies access and confirms that the necessary components are properly set up.

Interactive AWS terminal interface showing Q CLI, MCP and Bedrock Data Automation project management and analysis capabilities

Now, you can ask Amazon Q to use Bedrock Data Automation as a tool and extract the transcript from the meeting stored in the .mp3 file and refer to the updated architecture diagram, as shown in the following screenshot.

can you extract the meeting recording from  and refer to the updated architecture diagram from  using Bedrock Data Automation 

Terminal interface initiating AWS Bedrock analysis of MP3 and PNG files with project listing request

You can seamlessly continue a natural language conversation with Amazon Q to generate an AWS CloudFormation template, write prototype code, or even implement monitoring solutions. The potential applications are virtually endless.
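
When you ask Amazon Q to deploy the generated template, it handles the provisioning for you. Conceptually, that step resembles the following Boto3 sketch, in which the stack name and template file name are hypothetical examples:

# Sketch: deploying a generated CloudFormation template with Boto3.
# The stack name and template file name are hypothetical examples.
import boto3

cfn = boto3.client("cloudformation", region_name="us-east-1")

with open("generated-architecture.yaml") as f:
    template_body = f.read()

cfn.create_stack(
    StackName="whiteboard-to-cloud-demo",
    TemplateBody=template_body,
    # Required when the template creates IAM resources.
    Capabilities=["CAPABILITY_NAMED_IAM"],
)

# Block until stack creation completes (raises if creation fails).
cfn.get_waiter("stack_create_complete").wait(StackName="whiteboard-to-cloud-demo")
print("Stack deployed")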

Clean up

When you’re done working with the Amazon Bedrock Data Automation MCP server, follow these steps to clean up:

  1. Empty and delete the S3 buckets used for Bedrock Data Automation.

     aws s3 rm s3://amzn-s3-demo-bucket --recursive
     aws s3 rb s3://amzn-s3-demo-bucket

  2. Remove the configuration added to ~/.aws/amazonq/mcp.json for bedrock-data-automation-mcp-server.

Conclusion

With MCP and Bedrock Data Automation, Amazon Q Developer can turn messy ideas into working cloud architectures in record time. No whiteboards are left behind.

Are you ready to build smarter, faster, and more context-aware applications? Explore Amazon Q Developer and see how MCP and Amazon Bedrock Data Automation can help your team turn ideas into reality faster than ever before.


About the authors

Wrick Talukdar is a Tech Lead and Senior Generative AI Specialist at Amazon Web Services, driving innovation through multimodal AI, generative models, computer vision, and natural language processing. He is also the author of the bestselling book “Building Agentic AI Systems”. He is a keynote speaker and often presents his innovations and solutions at leading global forums, including AWS re:Invent, ICCE, Global Consumer Technology conference, and major industry events such as CERAWeek and ADIPEC. In his free time, he enjoys writing and birding photography.

Ayush Goyal is a Senior Software Engineer at Amazon Bedrock, where he focuses on designing and scaling AI-powered distributed systems. He’s also passionate about contributing to open-source projects. When he’s not writing code, Ayush enjoys speed cubing, exploring global cuisines, and discovering new parks, both in the real world and through open-world games.

Himanshu Sah is an Associate Delivery Consultant in AWS Professional Services, specializing in Application Development and Generative AI solutions. Based in India, he helps customers architect and implement cutting-edge applications leveraging AWS services and generative AI capabilities. Working closely with cross-functional teams, he focuses on delivering best-practice implementations while ensuring optimal performance and cost-effectiveness. Outside of work, he is passionate about exploring new technologies and contributing to the tech community.


