
Streamlining Code Documentation

Using AWS Bedrock Prompt Management for writing code documentation

Stuart Clark
Amazon Employee
Published Mar 5, 2025
Creating consistent code documentation across teams is a challenge that can consume up to 25% of development time (longer for me, as I need to keep topping up my coffee intake!). Different styles, missing details, and outdated docs make this process inefficient. In this post, we'll explore how to create a prompt that generates comprehensive documentation for code snippets using Amazon Bedrock Prompt Management's features.

Introduction to AWS Bedrock Prompt Management

Amazon Bedrock Prompt Management offers capabilities that simplify the process of building generative AI applications:
  • Structured prompts: Define system instructions, tools, and additional messages when building your prompts
  • Converse and InvokeModel API integration: Invoke your cataloged prompts directly from the Amazon Bedrock Converse and InvokeModel API calls

Creating the Prompt

  1. Navigate to the Bedrock console → Builder tools → Prompt management
  2. Click "Create prompt"
  3. Fill in the input fields:
    • Name: Code-Documentation-Generator
    • Description: "Generates detailed documentation for code snippets"
    • Click "Create"

Building the Prompt Structure

Step 1: Define System Instructions

In this step, we define the model's role with the following system instructions:
You are an expert software developer specializing in code documentation. Your task is to explain the code's purpose and functionality and to document all parameters and return values. Also provide usage examples and note any dependencies or requirements. Highlight best practices and potential issues.

Step 2: Create Variables

You can create variables by enclosing a name with double curly braces. Values for these variables are passed at invocation time and injected into your prompt template. For our documentation generator, we'll use a prompt with three variables: 'language', 'code_snippet', and 'format'.
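For instance, a prompt template along these lines (the exact wording is illustrative; only the three variable names come from the post):

```
Document the following {{language}} code snippet and produce the
documentation in {{format}}:

{{code_snippet}}
```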
Next, select your model. For this example, we're selecting Claude-3 with default settings.

Step 3: Configure Tools

In the Tools section (at the bottom right of the page), configure function calling with this JSON structure:
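Reconstructed from the fields listed below, the configuration looks roughly like this (a sketch following the Converse API's ToolConfiguration shape; the property descriptions are paraphrased):

```json
{
  "toolChoice": { "auto": {} },
  "tools": [
    {
      "toolSpec": {
        "name": "generate_documentation",
        "description": "Generate documentation for a given code snippet",
        "inputSchema": {
          "json": {
            "type": "object",
            "properties": {
              "language": {
                "type": "string",
                "description": "The programming language of the code"
              },
              "code_snippet": {
                "type": "string",
                "description": "The actual code to be documented"
              },
              "format": {
                "type": "string",
                "description": "Desired documentation output format, e.g. Markdown or HTML"
              }
            },
            "required": ["language", "code_snippet", "format"]
          }
        }
      }
    }
  ]
}
```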
This JSON structure includes:
  1. toolChoice: {"auto": {}} indicates the system will automatically choose when to use this tool.
  2. tools: An array containing a single tool specification
  3. Tool Details:
    • name: "generate_documentation"
    • description: "Generate documentation for a given code snippet"
    • inputSchema: Defines the required inputs
      • language: The programming language of the code
      • code_snippet: The actual code to be documented
      • format: Desired documentation output format (e.g., Markdown, HTML)
When using function calling, an LLM doesn't directly use tools; instead, it indicates the tool and parameters needed to use it. Users must implement the logic to invoke tools based on the model's requests and feed results back to the model.
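That dispatch step can be sketched as follows (a minimal example assuming the Anthropic messages response shape returned by InvokeModel; the helper name is ours):

```python
def handle_tool_use(response_body):
    """If the model requested a tool, extract the call so our code can run it.

    Returns {"name": ..., "input": ...} for the requested tool, or None
    if the model finished without asking for a tool.
    """
    if response_body.get("stop_reason") != "tool_use":
        return None
    for block in response_body.get("content", []):
        if block.get("type") == "tool_use":
            # The model only names the tool and supplies its input;
            # invoking it (and feeding the result back) is up to us.
            return {"name": block["name"], "input": block["input"]}
    return None
```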
Click "Save" and return to the overview page. Now you need to copy the Prompt Amazon Resource Name (ARN), which is a unique identifier for your prompt resource in Amazon Bedrock.

Testing and Implementation

To invoke the prompt from our applications, we can include the prompt identifier and version as part of the Amazon Bedrock Converse API call. This allows us to evaluate different documentation styles across models, then integrate using the Bedrock API.
Let's build an example using the AWS SDK in a Jupyter Notebook with Python and Boto3:
This code sets up the necessary components to interact with Amazon Bedrock:
  • It creates a boto3 Session with a specific AWS profile ('stuartck-admin') that contains AWS credentials and configuration settings
  • It creates a specific client for the Bedrock runtime to interact with the Bedrock runtime service
Now, let's invoke the model:
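A sketch of the invocation (the helper name and ARN are placeholders; with InvokeModel, each prompt variable is wrapped as `{"text": "<value>"}` inside the request body):

```python
import json


def document_code(client, prompt_arn, language, code_snippet, fmt="Markdown"):
    """Invoke a managed prompt by its ARN via the InvokeModel API."""
    # Prompt variables are passed in the body, each wrapped as {"text": ...}
    body = json.dumps({
        "promptVariables": {
            "language": {"text": language},
            "code_snippet": {"text": code_snippet},
            "format": {"text": fmt},
        }
    })
    return client.invoke_model(
        modelId=prompt_arn,  # the prompt ARN takes the place of a model ID
        body=body,
        contentType="application/json",
        accept="application/json",
    )
```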
In this code:
  • We pass the prompt ARN in the modelId parameter
  • We include prompt variables as a separate parameter
  • Amazon Bedrock loads our prompt version from the prompt management library at invocation time, so there is no separate retrieval step adding latency
This approach simplifies the workflow by enabling direct prompt invocation through the Converse or InvokeModel APIs, eliminating manual retrieval and formatting. It also allows teams to reuse and share prompts and track different versions.
The key elements in the invoke_model() function are:
  • The modelId parameter identifies the exact AI model you want to use
  • The body parameter provides the details of the task
  • Inside the body, promptVariables are customizable settings (language, code snippet, and output format)
  • The contentType and accept headers ensure proper format handling for request and response
Let's retrieve and display the response:
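A small helper along these lines (the function name is ours; `invoke_model` returns the body as a streaming object that must be read and parsed):

```python
import json


def read_response(response):
    """Parse the StreamingBody returned by invoke_model into a Python dict."""
    return json.loads(response["body"].read())


# Usage (after calling invoke_model):
# response_body = read_response(response)
# print(json.dumps(response_body, indent=2))
```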
The response will look similar to this:
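An abridged sketch of the response body (field values are illustrative; the token counts match the run discussed below):

```json
{
  "id": "msg_...",
  "type": "message",
  "role": "assistant",
  "content": [
    {
      "type": "tool_use",
      "name": "generate_documentation",
      "input": {
        "language": "Python",
        "code_snippet": "def hello_world(): ...",
        "format": "Markdown"
      }
    }
  ],
  "stop_reason": "tool_use",
  "usage": {
    "input_tokens": 506,
    "output_tokens": 119
  }
}
```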
This response can be broken down into three sections:
  1. Top-level structure
  2. Content section
  3. Usage statistics
The response indicates that:
  • The input was our Python hello_world function
  • The process consumed 506 input tokens and generated 119 output tokens
  • The response stopped because it completed a tool use operation
This structured response format allows for programmatic processing of the model's output and tracking of resource usage.

Conclusion

Amazon Bedrock Prompt Management simplifies generative AI development by offering a centralized platform for creating, customizing, and managing prompts. With features like system instruction definition, tool configuration, and prompt variant comparison, developers can efficiently build and deploy AI solutions that generate more relevant output.
By automating code documentation with Amazon Bedrock Prompt Management, teams can maintain consistent documentation standards while reducing the time spent on this essential but time-consuming task. You can start using Amazon Bedrock Prompt Management today to improve your development workflow with AI-powered documentation generation.
Here are some links for Amazon Bedrock Prompt Management that I used when creating this post:
https://aws.amazon.com/bedrock/prompt-management/
https://docs.aws.amazon.com/bedrock/latest/studio-ug/creating-a-prompt.html
 

Any opinions in this post are those of the individual author and may not reflect the opinions of AWS.
