AI Overview

The AI Overview dashboard provides a comprehensive view of how AI coding tools are being adopted and utilized across your development team, along with their measurable impact on productivity. This page helps you track adoption rates, engagement levels, and the productivity gains achieved by developers using AI tools compared to those who aren’t.

Understanding the Dashboard

The AI Overview dashboard is designed to answer three critical questions:

  1. How widely are AI tools being adopted? Track the percentage of developers who have adopted AI coding tools over time.
  2. How engaged are developers with AI tools? Measure how actively developers are using the tools they’ve adopted.
  3. What is the productivity impact? Compare the performance of AI-assisted developers against non-AI cohorts and their own pre-adoption baseline.

Key Metrics

Productivity Gain (%)

Measures the percentage improvement in productivity for developers after adopting AI tools compared to their performance before adoption. This metric helps quantify the direct impact of AI tools on individual developer output.

Calculation: Compares the New Deliveries per Developer for AI-assisted developers in the current period to their performance in the pre-adoption period.
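
To make the comparison concrete, here is a minimal Python sketch; the function name and example figures are illustrative, not the dashboard’s actual implementation:

```python
def productivity_gain_pct(deliveries_current: float, deliveries_pre_adoption: float) -> float:
    """Percentage change in New Deliveries per Developer vs. the pre-adoption baseline."""
    if deliveries_pre_adoption == 0:
        raise ValueError("pre-adoption baseline must be non-zero")
    return (deliveries_current - deliveries_pre_adoption) / deliveries_pre_adoption * 100

# Example: 9.2 deliveries per developer now vs. 8.0 before adoption -> 15.0% gain
gain = productivity_gain_pct(9.2, 8.0)
```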

Productivity Gain ($)

Translates the percentage productivity gain into a dollar value by applying your organization’s average fully burdened cost per developer. This helps you understand the financial impact of AI tool adoption and can be used to justify investment in AI tooling.

Calculation: Productivity Gain (%) × Average Fully Burdened Cost per Developer
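
Continuing the sketch above, the dollar conversion is a single multiplication (the $180,000 cost below is an illustrative assumption, not a default):

```python
def productivity_gain_dollars(gain_pct: float, burdened_cost_per_dev: float) -> float:
    """Dollar value of the productivity gain, per developer."""
    return (gain_pct / 100) * burdened_cost_per_dev

# Example: a 15% gain at a $180,000 fully burdened cost is worth about $27,000 per developer
value = productivity_gain_dollars(15.0, 180_000)
```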

Adoption Rate

The Adoption Rate measures what percentage of your active developers have adopted AI coding tools. A developer is considered “adopted” if they have any passive or active AI-related activity, such as installing an extension or receiving suggestions—they don’t need to actively engage with the tool (like accepting suggestions or using chat) to be counted as adopted.

Calculation: Number of AI-assisted developers / Total active developers

Who counts as a developer: Only developers who have created at least one pull request in the period are included in the calculation. This ensures the metric focuses on active contributors rather than the entire organization.

Adoption criteria: A developer is considered to have adopted AI tools when they have shown any AI-related activity within the last 90 days, including:

  • Installing an AI coding tool extension
  • Receiving code suggestions (even if not accepted)
  • Any other passive AI tool activity

Note: Adoption does not require active engagement. A developer can be counted as “adopted” even if they haven’t accepted a single suggestion or used any interactive features. See Engagement Rate below for metrics on active usage.
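
Putting the definition, the pull-request filter, and the 90-day window together, a minimal sketch of the adoption-rate calculation could look like this (the Developer fields are hypothetical stand-ins for your activity data, not the product’s schema):

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class Developer:
    pull_requests_in_period: int   # PRs created during the reporting period
    last_ai_activity: date | None  # most recent passive or active AI activity, if any

def adoption_rate(developers: list[Developer], as_of: date, window_days: int = 90) -> float:
    """AI-assisted developers / total active developers (PR authors only)."""
    cutoff = as_of - timedelta(days=window_days)
    # Only developers with at least one pull request in the period count as active
    active = [d for d in developers if d.pull_requests_in_period > 0]
    if not active:
        return 0.0
    # Any AI-related activity (even passive) within the window counts as adopted
    adopted = [d for d in active
               if d.last_ai_activity is not None and d.last_ai_activity >= cutoff]
    return len(adopted) / len(active)
```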

Engagement Rate

While adoption measures passive exposure to AI tools (like receiving suggestions), Engagement Rate measures active usage. This metric tells you what percentage of developers who have adopted AI tools are actually engaging with them through intentional actions.

Calculation: Number of engaged AI developers / Number of AI-assisted developers

What counts as engagement: A developer is considered “engaged” if they take intentional, active actions with their AI tools, such as:

  • Accepting code suggestions
  • Using chat features
  • Actively interacting with AI-powered code completion

Unlike adoption (which includes passive activity like just receiving suggestions), engagement requires the developer to take deliberate actions that demonstrate they’re finding value in the tool.

Engagement levels: Developers are grouped by the percentage of work days on which they use AI tools:

  • High engagement: >80% of work days
  • Medium engagement: 50-80% of work days
  • Low engagement: >0% and <50% of work days
  • Unengaged: 0% of work days

How it’s calculated: For each developer, we compute Total days using AI tools / Work days in the period. A developer is considered “engaged” if this ratio is above 0%.
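
As an illustration, here is a small Python sketch of both the level buckets and the engagement rate; how the boundaries at exactly 50% and 80% are handled is an assumption, since the dashboard doesn’t document it:

```python
def engagement_level(ai_days: int, work_days: int) -> str:
    """Bucket a developer by the share of work days with AI tool usage."""
    ratio = ai_days / work_days if work_days else 0.0
    if ratio == 0:
        return "unengaged"
    if ratio > 0.80:
        return "high"
    if ratio >= 0.50:   # assumed: exactly 50-80% falls in the medium bucket
        return "medium"
    return "low"

def engagement_rate(usage: list[tuple[int, int]]) -> float:
    """Engaged AI developers / AI-assisted developers.

    `usage` holds (ai_days, work_days) per AI-assisted developer; a developer
    counts as engaged when their usage ratio is above 0%.
    """
    if not usage:
        return 0.0
    engaged = sum(1 for ai_days, work_days in usage if work_days and ai_days > 0)
    return engaged / len(usage)
```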

Comparison Charts

AI vs. Non-AI Developer Productivity

This chart compares the productivity of three distinct cohorts over time:

Cohorts:

  • AI-assisted: Developers who have adopted AI tools and meet the adoption criteria
  • Unassisted: Developers who have not adopted AI tools or haven’t shown any AI-related activity in the last 90 days
  • Overall: All active developers, regardless of AI tool usage

What it shows: The chart displays New Deliveries per Developer for each cohort, allowing you to see the productivity difference between developers using AI tools and those who aren’t.

How to interpret: A higher line for AI-assisted developers indicates that those using AI tools are delivering more work compared to their unassisted peers. The gap between the lines represents the productivity advantage of AI tool adoption.

Pre vs. Post AI Adoption Productivity

This chart focuses specifically on developers who have adopted AI tools and compares their productivity before and after adoption.

Comparison periods:

  • Before adoption: The developer’s productivity metrics from the period before they adopted AI tools
  • After adoption: The developer’s productivity metrics from the period after they adopted AI tools

How pre-adoption productivity is calculated: For each developer, we measure their productivity during the 6-month period before they individually adopted AI tools. These individual pre-adoption metrics are then averaged across all AI-assisted developers to create the baseline. This approach ensures that each developer’s “before” period is personalized to their actual adoption date, providing a more accurate comparison.
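
A minimal sketch of that per-developer baseline, assuming an adoption_date field and a hypothetical deliveries_between(dev, start, end) lookup into your delivery data:

```python
from datetime import timedelta
from statistics import mean

def pre_adoption_baseline(developers: list[dict], deliveries_between) -> float:
    """Average New Deliveries per Developer across each developer's own
    6-month window ending at their individual adoption date."""
    per_dev = []
    for dev in developers:
        end = dev["adoption_date"]         # this developer's own adoption date
        start = end - timedelta(days=182)  # roughly 6 months earlier
        per_dev.append(deliveries_between(dev, start, end))
    return mean(per_dev) if per_dev else 0.0
```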

What it shows: This chart isolates the impact of AI tool adoption by comparing the same developers against themselves, removing variables related to individual skill levels or team differences.

How to interpret: An increase in productivity after adoption indicates that AI tools are helping developers deliver more work. This is often the most compelling metric because it controls for individual differences by comparing developers to their own baseline.

Productivity vs. Headcount

This chart shows the relationship between team size and total productivity output, broken down by AI-assisted and unassisted developers.

What it shows:

  • A stacked bar chart showing the count of AI-assisted and unassisted developers over time
  • The correlation between team composition and total New Deliveries

How to interpret: This helps you understand whether productivity gains are coming from increased headcount or from improved efficiency through AI tools. Ideally, you’ll see stable or increasing productivity even as the ratio of AI-assisted to unassisted developers changes.

Scope and Data Considerations

Developer Definition

The dashboard focuses exclusively on developers—defined as individuals who have created at least one pull request during the measured period. This ensures metrics reflect the impact on active contributors rather than the entire organization.

Time Windows

  • Adoption tracking: Uses a 90-day rolling window to determine who is actively using AI tools
  • Engagement calculation: Measures engagement as a percentage of work days within the selected reporting period
  • Comparison periods: Typically compares recent periods (e.g., last 6 months) against pre-adoption baseline periods

Work Days

Work days are calculated based on days with development activity (commits, pull requests, etc.). Because only days with recorded activity are counted, weekends, holidays, and other idle days are excluded automatically, ensuring engagement rates reflect actual working time.
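
As a sketch, assuming you have the set of dates on which a developer committed or opened pull requests:

```python
from datetime import date

def work_days(activity_dates: set[date], period_start: date, period_end: date) -> int:
    """Count distinct days with recorded development activity inside the period.
    Weekends, holidays, and idle days never appear in the activity data,
    so they are excluded automatically."""
    return sum(1 for d in activity_dates if period_start <= d <= period_end)
```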

Best Practices

Interpreting Adoption Rates

  • Target: While adoption rates vary by organization, aim for steady growth over time
  • Low adoption: If adoption is stagnant or declining, investigate barriers such as lack of awareness, training needs, or tool accessibility issues
  • High adoption with low engagement: This pattern suggests developers have access but aren’t finding value—consider additional training or reviewing tool configuration

Interpreting Engagement Rates

  • High engagement (>80%): Indicates developers are finding consistent value in AI tools
  • Medium engagement (50-80%): Developers are using tools regularly but not every day—this can be normal depending on work type
  • Low engagement (<50%): May indicate developers are struggling with the tools, finding them unhelpful, or facing technical issues

Maximizing Productivity Impact

  1. Track trends over time: Look for sustained productivity improvements rather than short-term spikes
  2. Segment by team: Different teams may see different levels of impact based on the type of work they do
  3. Combine metrics: Use adoption, engagement, and productivity together to get a complete picture
  4. Account for learning curves: Expect productivity gains to increase over time as developers become more proficient with AI tools