How to Build an Open Source Agentic AI Toolkit on AWS Graviton (work in progress)

Learn how to leverage AWS Graviton's cost-efficient ARM architecture to build an open-source Agentic AI toolkit, optimizing performance and scalability for cloud-native AI workflows.

Vincent Wang
Amazon Employee
Published Mar 18, 2025
AWS Graviton processors, designed for cloud-native workloads, deliver significant cost savings, energy efficiency, and performance improvements for AI/ML applications. In this guide, we’ll explore how to harness Graviton’s ARM-based architecture to build a powerful Agentic AI toolkit using open-source frameworks like LangManus, an AI automation framework. We’ll walk through setup, optimization tips, and deployment steps to help you create scalable AI agents on AWS.

Why AWS Graviton for Agentic AI?

AWS Graviton processors (e.g., Graviton4) are optimized for modern workloads, offering:
- **Cost Efficiency**: Up to 40% better price-performance vs. comparable x86 instances.
- **Sustainability**: Reduced energy consumption for eco-friendly AI.
- **ARM Compatibility**: Native support for frameworks like PyTorch, TensorFlow, and Python-based AI tools.
For Agentic AI—where autonomous systems require efficient scaling—Graviton’s cost-performance balance makes it ideal.

Setting Up Your AWS Graviton Environment

### **Step 1: Launch a Graviton Instance**
1. Navigate to the AWS EC2 console.
2. Select an ARM-compatible OS (e.g., Amazon Linux 2023).
3. Choose a Graviton-based instance type (e.g., `c8g` or `m8g`).

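If you prefer the CLI, the console steps above can be sketched as follows. The AMI ID is a placeholder and `c8g.xlarge` is just one of several Graviton options; adjust both for your region and workload.

```shell
# Placeholder values -- substitute a real arm64 Amazon Linux 2023 AMI ID
# for your region, plus your own key pair and security group.
AMI_ID="ami-xxxxxxxxxxxxxxxxx"
INSTANCE_TYPE="c8g.xlarge"   # Graviton4-based; m8g instances also work

# Requires AWS CLI v2 with credentials configured:
# aws ec2 run-instances \
#   --image-id "$AMI_ID" \
#   --instance-type "$INSTANCE_TYPE" \
#   --count 1

# After connecting to the instance, confirm you are on 64-bit ARM:
uname -m   # "aarch64" on a Graviton instance
```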
### **Step 2: Install Dependencies**

Installing the LangManus API server on Graviton
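The install flow can be sketched as below, assuming the uv-based Python workflow described in the LangManus README; the `.env.example` file name and `main.py` entry point are assumptions, so verify them against the repository before running.

```shell
# LangManus source repository (from the links at the end of this post).
REPO="https://github.com/langmanus/langmanus.git"
APP_DIR="langmanus"

# Run these on the Graviton instance (network access required):
# git clone "$REPO" "$APP_DIR"
# cd "$APP_DIR"
# uv sync                # install Python dependencies
# cp .env.example .env   # add your LLM API keys (assumed file name)
# uv run main.py         # start the API server (assumed entry point)

echo "LangManus source: $REPO"
```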

The LangManus API server is now running.

Installing the LangManus Web UI on Graviton

Create a `.env` file in the project root and configure the following environment variable:
- **NEXT_PUBLIC_API_URL**: The URL of the LangManus API server.
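A minimal way to create that file from the shell is shown below; the URL is a placeholder for wherever your LangManus API server is actually listening.

```shell
# Write the Web UI's .env in the project root.
# http://localhost:8000/api is a placeholder -- point it at your API server.
cat > .env <<'EOF'
NEXT_PUBLIC_API_URL=http://localhost:8000/api
EOF

# Sanity check that the variable is present.
grep NEXT_PUBLIC_API_URL .env
```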
The LangManus Web UI is now running.

## **Deployment and Scaling**
- **Containerize**: Package your toolkit in Docker with ARM-based images.
- **Orchestrate**: Deploy on AWS EKS/ECS/Fargate (ARM-compatible) for scaling.
- **Monitor**: Use Amazon CloudWatch to track performance and costs.
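The containerization step can be sketched as a minimal Dockerfile; the base image, dependency file, and entry point below are assumptions to adapt to your project.

```shell
# Minimal arm64 Dockerfile sketch (adapt the base image and entry point).
cat > Dockerfile <<'EOF'
FROM --platform=linux/arm64 python:3.12-slim
WORKDIR /app
COPY . .
RUN pip install --no-cache-dir -r requirements.txt
CMD ["python", "main.py"]
EOF

# Build an arm64 image (requires Docker with buildx enabled):
# docker buildx build --platform linux/arm64 -t langmanus-api:arm64 .
```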

Conclusion

By combining AWS Graviton’s efficiency with open-source tools like LangManus, you can build high-performance Agentic AI systems at a fraction of the cost. Start with this approach to maximize savings while ensuring stability.

Ready to dive deeper?

- Explore [LangManus Documentation](https://github.com/langmanus/langmanus)
- Learn more about [AWS Graviton](https://aws.amazon.com/graviton)
 

Any opinions in this post are those of the individual author and may not reflect the opinions of AWS.
