Amazon Q Developer
Analyze usage, adoption, and performance metrics for your engineering team using Amazon Q Developer. This guide will walk you through the process of setting up a secure, read-only integration.
Prerequisites
Before you begin, please ensure you have the following:
- An active AWS account with administrative permissions to:
  - Create and manage S3 buckets.
  - Create and manage IAM roles and policies.
  - Create and manage DataSync tasks and locations.
- Amazon Q Developer Pro subscriptions assigned to the developers on your team.
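Before making any changes, it is worth confirming which account and principal your CLI session is using; a quick check:

```shell
# Prints the account ID, user ID, and ARN of the caller.
# Verify the Account field matches your intended source account.
aws sts get-caller-identity
```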
Configure AWS for DataSync
The process involves configuring AWS to sync Amazon Q metrics to a private S3 bucket.
A. Create an IAM DataSync Role
Create an IAM role with DataSync as the trusted entity.
- Log in to the AWS Management Console with your source account.
- Open the IAM Console at https://console.aws.amazon.com/iam/.
- In the left navigation pane, under Access Management, choose Roles, and then choose Create Role.
- On the Select Trusted Entity page, for Trusted Entity Type, choose AWS Service.
- For Use Case, select DataSync from the service dropdown list, then choose the DataSync use case. Choose Next.
- On the Add Permissions page, choose Next.
- Give your role a name and choose Create Role.
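If you prefer the command line, the same role can be created in one step. This is a sketch: the role name `q-developer-datasync-role` is an example, not a required value.

```shell
# Trust policy that lets the DataSync service assume the role.
cat > datasync-trust-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "datasync.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}
EOF

# Create the role in your source account. Substitute your own role name.
aws iam create-role \
  --role-name q-developer-datasync-role \
  --assume-role-policy-document file://datasync-trust-policy.json
```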
B. Add permissions to the DataSync IAM role
The IAM role that you just created needs the permissions that allow DataSync to transfer data to the S3 bucket in your destination account.
- On the Roles page of the IAM console, search for the role that you just created and choose its name.
- On the role's details page, choose the Permissions tab. Choose Add Permissions then Create inline policy.
- Choose the JSON tab and paste the following JSON into the policy editor:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": [
        "s3:GetBucketLocation",
        "s3:ListBucket",
        "s3:ListBucketMultipartUploads"
      ],
      "Effect": "Allow",
      "Resource": "arn:aws:s3:::q-developer-data",
      "Condition": {
        "StringEquals": {
          "aws:ResourceAccount": "134217665810"
        }
      }
    },
    {
      "Action": [
        "s3:AbortMultipartUpload",
        "s3:DeleteObject",
        "s3:GetObject",
        "s3:GetObjectTagging",
        "s3:GetObjectVersion",
        "s3:GetObjectVersionTagging",
        "s3:ListMultipartUploadParts",
        "s3:PutObject",
        "s3:PutObjectTagging"
      ],
      "Effect": "Allow",
      "Resource": "arn:aws:s3:::q-developer-data/*",
      "Condition": {
        "StringEquals": {
          "aws:ResourceAccount": "134217665810"
        }
      }
    }
  ]
}
- Choose Next.
- Give your policy a name and choose Create policy.
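Equivalently, the inline policy can be attached from the command line; a sketch, assuming you saved the JSON above as `datasync-s3-policy.json` and that your role is named `q-developer-datasync-role` (both names are examples):

```shell
# Attach the S3 permissions as an inline policy on the DataSync role.
# The policy document is the JSON shown above, saved to a local file.
aws iam put-role-policy \
  --role-name q-developer-datasync-role \
  --policy-name q-developer-datasync-s3-access \
  --policy-document file://datasync-s3-policy.json
```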
D. Create your DataSync destination location
Since you can't create cross-account locations by using the DataSync console interface, these instructions require that you run a create-location-s3 command to create your destination location. Amazon recommends running the command by using AWS CloudShell, a browser-based, pre-authenticated shell that you launch directly from the console. CloudShell allows you to run AWS CLI commands like create-location-s3 without downloading or installing command line tools.
⚠️ Important: To complete the following steps by using a command line tool other than CloudShell, make sure that your AWS CLI profile uses the same IAM role that includes the required user permissions to use DataSync in your source account.
To create a DataSync destination location by using CloudShell:
- While still in your source account, do one of the following to launch CloudShell from the console:
- Choose the CloudShell icon on the console navigation bar. It's located to the right of the search box.
- Use the search box on the console navigation bar to search for CloudShell and then choose the CloudShell option.
- Copy the following create-location-s3 command:
aws datasync create-location-s3 \
  --s3-bucket-arn arn:aws:s3:::q-developer-data \
  --s3-config '{
    "BucketAccessRoleArn": "arn:aws:iam::source-account-id:role/source-datasync-role"
  }'
- Replace source-account-id with the source AWS account ID.
- Replace source-datasync-role with the DataSync IAM role that you created in your source account.
- Run the command in CloudShell. If the command returns a DataSync location ARN similar to this, you successfully created the location:
{
  "LocationArn": "arn:aws:datasync:us-east-2:123456789012:location/loc-abcdef01234567890"
}
- In the left navigation pane, expand Data Transfer, then choose Locations.
From your source account, you can see the S3 location that you just created for your destination account bucket.
⚠️ Important:
- Create your source bucket in us-east-1. If your organization does not support us-east-1, let us know which Region the bucket was created in.
- q-developer-data is the name of the S3 bucket in Software.com’s destination account. Do not change this value.
E. Create and Start Your DataSync Task
- While still using the DataSync console in your source account, expand Data Transfer in the left navigation pane, then choose Tasks and Create Task.
- On the Configure Source Location page, do the following:
- Select Choose an existing location.
- For Existing Locations, choose the source location for the S3 bucket that you're transferring data from, then choose Next.
- On the Configure Settings page, choose a Task mode. Note: Amazon recommends using Enhanced mode.
- Give the task a name and configure additional settings, such as specifying an Amazon CloudWatch log group. Choose Next.
- On the Review page, review your settings and choose Create Task.
- On the task's details page, choose Start.
- To run the task without modification, choose Start with defaults.
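The task can also be created and started from CloudShell; a sketch, assuming the source and destination location ARNs returned earlier (the ARNs below are placeholders, and the `--task-mode` flag requires a recent AWS CLI version):

```shell
# Create the task from your source location (the metrics bucket) to the
# destination location created with create-location-s3. ARNs are placeholders.
aws datasync create-task \
  --source-location-arn arn:aws:datasync:us-east-1:111122223333:location/loc-sourceexample1234 \
  --destination-location-arn arn:aws:datasync:us-east-1:111122223333:location/loc-destexample12345 \
  --task-mode ENHANCED \
  --name q-developer-metrics-sync

# Start a run of the task with its default settings.
aws datasync start-task-execution \
  --task-arn arn:aws:datasync:us-east-1:111122223333:task/task-exampleabcdef123
```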
C. Enable Granular Metrics in Amazon Q
- In your Amazon Q admin settings, enable the “Collect granular metrics per user” option.
- Set the S3 location to the bucket path you just created.
- Grant the IAM role you created in the previous step access to this S3 bucket.
⚠️ Important: Do not enable prompt logging. This setting includes full prompts and proprietary code in the data export. If this is enabled, we will be unable to ingest your data.
Ongoing Data Sync
- Amazon Q Developer writes new usage data to your S3 bucket daily.
- Software.com will automatically ingest this data every 24 hours to align with Amazon's schedule.
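To spot-check that the daily export is arriving, you can list recent objects in your own source bucket; a sketch (the bucket name is a placeholder for your bucket, and the key layout depends on what Amazon Q actually writes):

```shell
# List the most recent objects in your source metrics bucket to confirm
# new usage data is being written each day.
aws s3 ls s3://your-q-metrics-bucket/ --recursive | tail -n 20
```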
Resources
- Amazon Q Developer Dashboard: Introducing the Amazon Q Developer Dashboard and CloudWatch metrics
- Tutorial: Transferring data between Amazon S3 buckets across AWS accounts