Cloud Engineer Partner AI Assistant

This Cloud Engineer Partner AI application helps you solve your problems and explore the AWS Cloud by providing domain expertise

Published Feb 25, 2024
Envisioned Real-World Application
This application is a tool that helps everyone understand their problems related to AWS cloud computing and find solutions at the same time. Its real impact is increased productivity when solving complex problems in AWS, such as writing scripts for infrastructure provisioning, designing systems, and so on.
Potential Benefits
This application accepts user-defined input, which is then used as one of the main parameters shaping the AI's capabilities, especially in the chat feature.
We configured the AI with two areas of expertise: AWS DevOps and AWS Security. We then tested it by asking a common question, how to send logs from an EC2 instance to a CloudWatch Log Group, and asked the AI to create a Terraform script.
The AI then generated an initial Terraform script that applies various AWS best practices, such as least privilege, based on the input provided by the user.
resource "aws_iam_role" "ec2_logging_role" {
  name = "ec2_cloudwatch_logs_role"

  assume_role_policy = <<EOF
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": "sts:AssumeRole",
      "Principal": {
        "Service": "ec2.amazonaws.com"
      },
      "Effect": "Allow",
      "Sid": ""
    }
  ]
}
EOF
}

resource "aws_iam_role_policy" "ec2_logging_policy" {
  name = "ec2_cloudwatch_logs_policy"
  role = aws_iam_role.ec2_logging_role.id

  policy = <<EOF
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "logs:CreateLogGroup",
        "logs:CreateLogStream",
        "logs:PutLogEvents",
        "logs:DescribeLogStreams"
      ],
      "Resource": [
        "arn:aws:logs:*:*:*"
      ]
    }
  ]
}
EOF
}

resource "aws_cloudwatch_log_group" "ec2_log_group" {
  name = "/ec2/logs"
}

resource "aws_instance" "web" {
  ami           = "ami-0c55b159cbfafe1f0"
  instance_type = "t2.micro"

  iam_instance_profile = aws_iam_instance_profile.ec2_logging_profile.name

  user_data = <<-EOF
    #!/bin/bash
    sudo yum install -y awslogs
    sudo service awslogs start
    sudo service awslogs status
  EOF

  tags = {
    Name = "EC2 with CloudWatch Logs"
  }
}

resource "aws_iam_instance_profile" "ec2_logging_profile" {
  name = "ec2_logging_profile"
  role = aws_iam_role.ec2_logging_role.name
}

resource "aws_ssm_association" "ec2_logging" {
  name        = aws_ssm_document.send_logs.name
  instance_id = aws_instance.web.id
}

resource "aws_ssm_document" "send_logs" {
  name          = "EC2Logging"
  document_type = "Command"

  content = <<DOC
{
  "schemaVersion": "1.2",
  "description": "Send EC2 logs to CloudWatch",
  "parameters": {},
  "runtimeConfig": {
    "aws:runShellScript": {
      "properties": [
        {
          "id": "0.aws:runShellScript",
          "runCommand": ["/home/ec2-user/configure_cw_logs.sh"]
        }
      ]
    }
  }
}
DOC
}

resource "aws_s3_bucket_object" "configure_cw_logs_script" {
  bucket = aws_s3_bucket.script_bucket.id
  key    = "configure_cw_logs.sh"
  source = "./configure_cw_logs.sh"
  etag   = filemd5("./configure_cw_logs.sh")
}

resource "aws_s3_bucket" "script_bucket" {
  bucket = "ec2-script-bucket"
}
Of course, this is only a tiny example of what the AI can do. We can carry out various experiments, such as changing the domain expertise or asking about real-world scenarios involving architectural problems in AWS.
Alternative Development Scenario with Amazon Bedrock
If PartyRock is not available, we can use Amazon Bedrock as an alternative, which we can also integrate with other AWS services. Amazon Bedrock is a fully managed service that offers a choice of high-performing foundation models (FMs) from leading AI companies like AI21 Labs, Anthropic, Cohere, Meta, Mistral AI, Stability AI, and Amazon via a single API, along with a broad set of capabilities for building generative AI applications with security, privacy, and responsible AI. Using Amazon Bedrock, we can easily experiment with and evaluate top FMs for our use case, privately customize them with our data using techniques such as fine-tuning and Retrieval-Augmented Generation (RAG), and build agents that execute tasks using our enterprise systems and data sources. Since Amazon Bedrock is serverless, we don't have to manage any infrastructure, and we can securely integrate and deploy generative AI capabilities into our applications using the AWS services we are already familiar with.
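To make this concrete, here is a minimal sketch of calling a Bedrock model from Python with boto3. The model ID is an assumption (check which models are enabled in your account and region), and actually invoking the model requires AWS credentials with Bedrock access:

```python
import json

# Assumed model ID -- substitute whichever FM is enabled in your account.
MODEL_ID = "anthropic.claude-3-sonnet-20240229-v1:0"

def build_request(prompt: str, max_tokens: int = 512) -> dict:
    """Build the Anthropic Messages API request body that Bedrock expects
    for Claude models."""
    return {
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    }

def ask_cloud_partner(prompt: str) -> str:
    """Send the prompt through the Bedrock runtime API and return the
    model's text reply (needs AWS credentials and model access)."""
    import boto3  # deferred so build_request stays dependency-free
    client = boto3.client("bedrock-runtime")
    response = client.invoke_model(
        modelId=MODEL_ID,
        contentType="application/json",
        accept="application/json",
        body=json.dumps(build_request(prompt)),
    )
    payload = json.loads(response["body"].read())
    return payload["content"][0]["text"]
```

With a prompt such as "How do I send logs from EC2 to a CloudWatch Log Group? Write a Terraform script.", this reproduces the PartyRock experiment above programmatically.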
We will also need a storage service: Amazon S3, a fully managed object store, can hold the images we generate with Amazon Bedrock. This scenario also requires some programming skills, such as Python, to develop a Lambda function triggered by API Gateway, which lets us create, publish, maintain, monitor, and secure APIs at any scale. Of course, we also need Amazon CloudWatch to monitor our AWS resources.
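A minimal sketch of such a Lambda handler is shown below. The bucket name, model ID, and payload shape are assumptions, and the Bedrock and S3 calls are left as comments so that the request/response handling stands on its own:

```python
import json
import uuid

# Hypothetical bucket name -- replace with your own S3 bucket.
GENERATIONS_BUCKET = "my-bedrock-generations"

def lambda_handler(event, context):
    """Handle an API Gateway proxy request: validate the prompt, then
    (in a real deployment) invoke Bedrock and store the image in S3."""
    body = json.loads(event.get("body") or "{}")
    prompt = body.get("prompt", "")
    if not prompt:
        return {"statusCode": 400,
                "body": json.dumps({"error": "missing 'prompt'"})}

    key = f"generations/{uuid.uuid4()}.png"  # illustrative object key

    # The actual AWS calls would go here (assumed model ID and payload):
    #   bedrock = boto3.client("bedrock-runtime")
    #   resp = bedrock.invoke_model(
    #       modelId="stability.stable-diffusion-xl-v1",
    #       body=json.dumps({"text_prompts": [{"text": prompt}]}))
    #   image = base64.b64decode(
    #       json.loads(resp["body"].read())["artifacts"][0]["base64"])
    #   boto3.client("s3").put_object(
    #       Bucket=GENERATIONS_BUCKET, Key=key, Body=image)

    return {"statusCode": 200,
            "body": json.dumps({"bucket": GENERATIONS_BUCKET,
                                "s3_key": key})}
```

API Gateway forwards the HTTP request as the event's "body" field and relays the statusCode/body dictionary back to the caller, which is why the handler uses that shape.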
As we can see, this alternative scenario is viable when PartyRock is unavailable, but it can be complicated for beginners to build an application this way. PartyRock, by contrast, allows everyone to build an application simply through prompt engineering. :)
Cloud Engineer Partner snapshot:
https://partyrock.aws/u/kandlagifari/-nMQpyOMo/Cloud-Engineer-Partner/snapshot/buosqVro7
 
