ITM Platform - Projects Programs Portfolio

20 AI Prompts Every Project and Portfolio Manager Should Be Asking

Most PMOs are data-rich and insight-poor. Dashboards full of metrics, reports generated weekly, color-coded status fields maintained by diligent project managers. And yet, somehow, projects still go off the rails without warning. Resources get quietly overloaded. Budgets drift. Strategic alignment becomes a slide in a quarterly presentation rather than something anyone checks in real time.

The problem was never the data. The problem was the cost of asking a good question.

Every non-trivial question used to require a chain of steps: export the data, open a spreadsheet, cross-reference two or three sources, build a pivot table, interpret, summarize, present. That chain was expensive enough - in time, not money - that most people simply stopped asking. They defaulted to the handful of metrics that were easy to read and hoped the rest would take care of itself.

That has changed. With AI-powered data analysis built into your PPM tool, the cost of asking a hard question drops to nearly zero. You type a question in natural language, and you get an answer grounded in your actual project data - not a generic template, but a response that reflects your timelines, your budgets, your resource assignments.

This post collects 20 prompts organized by scope: seven for project managers working inside a single project, six for PMO directors and executives looking across the portfolio, and a closing set of seven shorter prompts for specific scenarios. The first thirteen include the reasoning behind each prompt and guidance on what to do with the answer.

These prompts are designed for PMPilot Data Analysis, ITM Platform’s AI assistant. But the thinking behind them applies to any organization trying to get more signal out of its project data.

Part 1: Prompts for Your Project

These prompts are for project managers, team leads, and anyone responsible for delivering a single project on time and within budget.

1. What is the status of my project in terms of schedule, effort, cost, risks, and issues?

The prompt:

“What is the status of project X in terms of schedule execution, effort execution, cost execution, and the state of risks and issues?”

Why this matters

This is the foundational prompt - the one that replaces the Monday morning scramble through three different views to piece together a status picture. Most project managers know roughly where things stand, but “roughly” is the enemy of good reporting. You might know the schedule is tight without realizing that effort consumption is outpacing the plan, or that cost is tracking fine only because a major purchase hasn’t been recorded yet.

Asking for all five dimensions in a single prompt forces a holistic view. It also produces a response you can paste directly into a status report or share with a sponsor who doesn’t want to log into the tool.

How to read the response

PMPilot returns a structured summary covering schedule (actual progress vs. expected progress), effort (estimated total vs. consumed), cost (broken down by workforce, purchases, and total - budget vs. actual), and counts of open risks and issues.

Look for mismatches between dimensions. A project that’s ahead on schedule but over on cost might be throwing resources at the problem. A project that’s on budget but behind on effort might have underestimated the work. The interesting insights live in the gaps between numbers, not in the numbers themselves.

What to do next

  • If schedule and cost are both healthy, document it and move on. Don’t over-investigate a project that’s working.
  • If effort consumed is significantly higher than expected for the current progress level, dig into which tasks are consuming more than estimated. This often signals scope creep at the task level.
  • If risks or issues are at zero, that’s not necessarily good news. It might mean nobody is logging them. Ask PMPilot to suggest risks based on the plan execution data (see Prompt 6).

2. Interpret my earned value metrics

The prompt:

“Give me the interpretation of the earned value metrics for project X.”

Why this matters

Earned value management (EVM) is one of the most powerful lenses for understanding project health, and one of the least used in practice. Not because people don’t care, but because reading EVM output requires fluency in a set of acronyms and ratios that most stakeholders don’t have. CPI, SPI, EAC, VAC - these mean something precise, but the translation from number to decision is where most teams stall.

The result is predictable: EVM data sits in the system, technically available, practically ignored. Status meetings default to percent-complete and gut feel.

How to read the response

PMPilot pulls the project’s earned value data and delivers a plain-language interpretation covering:

  • Earned Value (EV) vs. Actual Cost (AC): whether you’re spending more or less than the value of work completed
  • Cost Variance (CV) and Cost Performance Index (CPI): whether the project is over or under budget, and by what ratio
  • Schedule Variance (SV) and Schedule Performance Index (SPI): whether the project is ahead or behind plan
  • Budget at Completion (BAC) vs. Estimate at Completion (EAC): where the project is likely to land financially if current trends continue

Instead of “CPI = 0.87,” you get something like: for every euro budgeted, the project is delivering 87 cents of value - a 13% cost overrun that, if sustained, will push the final cost above the approved budget.

Look at CPI and SPI first. Both at or above 1.0 means the project is in good shape. A CPI below 1.0 with a healthy SPI means you’re overspending but on schedule - a resourcing or estimation problem. An SPI below 1.0 with a healthy CPI means you’re behind schedule but not burning extra money - possibly a scope or dependency bottleneck.
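The arithmetic behind these interpretations is worth seeing once. The sketch below applies the standard earned value formulas to made-up figures - EV, AC, PV, and BAC here are hypothetical inputs, not pulled from any real project:

```python
# Standard earned value formulas applied to hypothetical figures.
def evm_summary(ev, ac, pv, bac):
    cpi = ev / ac    # Cost Performance Index: value delivered per unit spent
    spi = ev / pv    # Schedule Performance Index: work done vs. work planned
    cv = ev - ac     # Cost Variance (negative = over budget)
    sv = ev - pv     # Schedule Variance (negative = behind plan)
    eac = bac / cpi  # Estimate at Completion, assuming the CPI trend holds
    vac = bac - eac  # Variance at Completion (negative = projected overrun)
    return {"CPI": round(cpi, 2), "SPI": round(spi, 2),
            "CV": cv, "SV": sv, "EAC": round(eac), "VAC": round(vac)}

# A CPI of 0.87: each euro budgeted delivers 87 cents of value.
print(evm_summary(ev=87_000, ac=100_000, pv=90_000, bac=500_000))
```

If that 0.87 CPI held, a 500,000 budget would land near 575,000 at completion - exactly the kind of projection worth surfacing before the next steering committee.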

What to do next

  • CPI < 1.0: Review the cost breakdown. Is it workforce overruns, unplanned purchases, or scope creep? The fix depends on the source.
  • SPI < 1.0: Check which phases or tasks are dragging. Follow up with: “Which tasks are behind schedule and by how much?”
  • EAC exceeds BAC by more than 10%: Escalate. Prepare a revised budget request or a scope reduction proposal before the next steering committee.
  • Both CPI and SPI healthy: Document it. A project running well is still worth a one-paragraph status note so the trend is visible over time.

This prompt is particularly valuable for training and onboarding. New project managers can use it to learn how to read earned value by seeing the interpretation applied to their own project, not a textbook example.

3. Does my project description match the work plan?

The prompt:

“Does the description of project X match the work plan?”

Why this matters

Project descriptions tend to be written once - at kickoff, when the scope is still aspirational - and never updated. Meanwhile, the actual plan evolves: phases get added or dropped, timelines shift, the methodology might change entirely. Over time, the description and the plan drift apart until the description is essentially fiction.

This matters more than it seems. Descriptions are often the first thing a sponsor, a new team member, or an auditor reads. If the description says “agile delivery in three sprints” but the plan shows a waterfall structure with eight phases, you’ve created confusion before anyone opens a Gantt chart.

More importantly, when AI-powered tools analyze your project - for reports, risk assessments, or cross-portfolio comparisons - the description is part of the context. A misleading description leads to misleading analysis.

How to read the response

PMPilot compares the free-text description field against the actual plan structure: methodology, phases, timeline, milestones, and deliverables. It flags mismatches and tells you specifically where they are. A common finding is that the description is too generic or still carries placeholder text from a template, while the plan has been developed in detail.

What to do next

  • If the description is outdated or generic, ask PMPilot the follow-up prompt below (#4) to generate a fresh one based on the actual plan.
  • Make description maintenance part of your phase-gate or sprint-review checklist. It takes two minutes and keeps everything aligned.
  • Consider doing this check across all projects in a program. Inconsistent or placeholder descriptions are a sign of governance gaps that compound over time.

4. Write a project description based on the work plan

The prompt:

“Give me a description of the project based on the work plan.”

Why this matters

This is the natural follow-up to Prompt 3. Once you know the description is stale, you need a replacement. Writing project descriptions from scratch is one of those tasks that’s simple in theory and surprisingly annoying in practice - you need to synthesize the objective, scope, methodology, key phases, and timeline into a paragraph or two that’s accurate and readable.

PMPilot does this by reading the actual plan structure and generating a description that reflects what the project really is, not what someone hoped it would be six months ago.

How to read the response

The generated description typically covers the project objective, scope and deliverables, methodology (waterfall, agile, or hybrid), key phases with date ranges, and major milestones. It reads like something a PMO analyst would write after reviewing the plan - structured, specific, and free of the vagueness that plagues most project descriptions.

What to do next

  • Review the generated description for accuracy. PMPilot works from the data in the system, so if the plan itself has issues (missing dates, incomplete phases), those will show up in the description.
  • Use the output as a starting point, not a final draft. Add context that only you know: the business driver, the stakeholder expectations, the strategic rationale.
  • Apply this prompt to every project in a program to create consistent, data-grounded descriptions across the board. This is especially valuable before portfolio reviews or audit cycles.

Encouraging teams to maintain meaningful descriptions - for projects, tasks, risks, and issues - pays compounding returns. Every AI-powered analysis, report, and recommendation becomes more accurate when the underlying text fields contain real information rather than placeholders.

5. Hours assigned per resource on unfinished tasks

The prompt:

“Give me a table with the total hours assigned to each resource on tasks that are not finished.”

Why this matters

Resource allocation is one of the most common sources of project risk, and one of the hardest to see without deliberate analysis. A project manager might know that the team is busy, but “busy” doesn’t tell you whether one person is carrying 60% of the remaining workload or whether three people have almost nothing left to do.

This prompt gives you a clean picture of who owes what to the project from this point forward.

How to read the response

PMPilot may ask clarifying questions before answering - for example, whether “not finished” means less than 100% complete, or how to allocate hours when a task has multiple assignees. These confirmations are a feature, not a delay. They surface assumptions that often go unexamined.

The result is a table listing each resource and their total remaining hours on incomplete tasks. Look for concentration: if one or two people hold the majority of the remaining work, you have a key-person dependency whether you planned for one or not.
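The table itself is a straightforward group-and-sum over incomplete tasks. A minimal sketch, with made-up task data:

```python
# Group remaining hours by resource, excluding finished tasks (illustrative data).
from collections import defaultdict

tasks = [
    {"assignee": "Maria", "remaining_h": 120, "progress": 40},
    {"assignee": "Jon",   "remaining_h": 30,  "progress": 80},
    {"assignee": "Maria", "remaining_h": 80,  "progress": 0},
    {"assignee": "Aiko",  "remaining_h": 25,  "progress": 100},  # finished: excluded
]

remaining = defaultdict(int)
for t in tasks:
    if t["progress"] < 100:  # "not finished" = less than 100% complete
        remaining[t["assignee"]] += t["remaining_h"]

# Sort descending to spot concentration on one or two people.
for name, hours in sorted(remaining.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {hours} h")
```

Here Maria holds 200 of the 230 remaining hours - the kind of concentration that reads as "the team is busy" until you actually sum it up.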

What to do next

  • If the workload is heavily concentrated, consider whether you can redistribute tasks or bring in support. Don’t wait until the bottleneck becomes a missed deadline.
  • Cross-reference this with the schedule. A resource with 200 hours of work and four weeks left has a very different situation than one with 200 hours and twelve weeks.
  • Use this as input for your next resource conversation with the PMO, especially if your team members are shared across projects.

6. Risks the data suggests but nobody has documented

The prompt:

“Can you identify any risks that should be documented, based on plan execution, purchases, revenue, or resource assignments?”

Why this matters

Risk registers in most organizations suffer from a common disease: they contain the risks that were obvious at kickoff and almost nothing discovered since. The project was supposed to have an ongoing process of risk identification, but in practice, nobody revisits the register unless something goes wrong.

This prompt turns that passive process into an active one. Instead of relying on human memory and periodic workshops, you ask the AI to scan the project’s actual data - schedule performance, cost trends, resource allocations, purchase records - and flag patterns that look like risks.

How to read the response

PMPilot proposes risks based on quantitative indicators. For example: if the SPI shows schedule slippage, it may suggest a “schedule recovery risk” noting that catching up will require either more resources or scope reduction. If purchases are under-recorded relative to the budget, it might flag a procurement tracking risk. If a single resource holds the majority of the remaining effort, it will call out a key-person dependency.

Each proposed risk typically includes a brief rationale (“why this is a risk”), a suggested severity, and a mitigation approach. The response tends to cover three dimensions: plan execution, financial data, and resource patterns.

What to do next

  • Don’t accept every suggested risk blindly. Treat the output as a conversation starter: does this pattern actually represent a risk in your specific context?
  • For the risks you do accept, add them to the risk register with proper ownership and response plans.
  • Run this prompt periodically - monthly or at every phase gate. The risks that matter change as the project progresses.
  • If you want to focus on a single dimension, narrow the prompt: “Identify risks based only on resource assignments” or “Identify risks based on purchase and revenue data.”

7. Issues the data suggests but nobody has raised

The prompt:

“Can you identify any issues that should be raised, based on plan execution, purchases, revenue, or resource assignments?”

Why this matters

The distinction between a risk and an issue is simple but important: a risk is something that might happen; an issue is something that has already happened and needs attention. In practice, issue logs are even more neglected than risk registers. Teams deal with problems as they come up but don’t always formalize them, which means there’s no record, no tracking, and no pattern recognition over time.

This prompt is especially valuable when the issue log is empty. An empty log in a project with visible schedule or cost deviations isn’t a sign of health - it’s a sign that the issue management process isn’t being used.

How to read the response

PMPilot scans the same data dimensions as the risk prompt but frames its findings as issues that already exist. For example: a significant gap between reported progress and earned value might be flagged as a “progress reporting inconsistency” requiring immediate investigation. Unrecorded purchases against a non-zero procurement budget might become a “procurement traceability” issue.

Each proposed issue includes a description, its likely impact, and a suggested immediate action.

What to do next

  • Review each proposed issue against what you already know. Some will be things you’re already handling informally - those are worth formalizing so there’s a record.
  • Assign ownership immediately. An issue without an owner is just a complaint.
  • Use the output to prepare for steering committee meetings. Arriving with a clear, data-backed issue list signals maturity and control, even if the issues themselves aren’t great news.

Part 2: Prompts for Your Portfolio

These prompts are for PMO directors, portfolio managers, and executives who need to see across projects and make decisions about where to invest attention, resources, and budget.

8. Which projects have the most issues?

The prompt:

“Give me the three projects with the most issues.”

Why this matters

Issue count is a rough but useful proxy for project distress. A project with significantly more open issues than its peers is either genuinely struggling or doing a better job of logging problems (both worth knowing). Either way, it deserves PMO attention.

How to read the response

The response is a ranked table showing the top three projects by issue count, along with their status, responsible managers, and the number of open issues. It’s deliberately simple - the point is triage, not analysis.

What to do next

  • For the top project, ask PMPilot for the detail: “Show me the open issues for project X, ordered by severity.”
  • Compare the issue counts with project size and complexity. A large, complex project with ten issues might be healthier than a small project with five.
  • If the top-ranked project also shows schedule or cost deviations, it moves to the top of your intervention list.

9. Identify undocumented risks across the program

The prompt:

“Identify risks that are not documented in the projects of program X.”

Why this matters

At the program level, risk management needs to go beyond what individual project managers have logged. A program manager should be looking for risks that span projects - shared resource dependencies, common vendors, correlated timelines - and for risks that individual PMs might not see from their vantage point.

How to read the response

PMPilot analyzes the projects within the program and proposes risks that aren’t currently in any project’s risk register. These tend to be cross-cutting concerns: scope creep patterns, change resistance, resource availability across the program, dependency-driven delays, and cost overrun tendencies.

Each proposed risk includes a rationale, a suggested severity, and a mitigation approach. Pay attention to whether the response references specific projects. If you asked about the program and the risks are generic (not tied to any project), you may want to rephrase: “For each risk, indicate which project or projects it applies to.”

What to do next

  • Review the proposed risks with your project managers. They’ll have context on whether a flagged pattern is real or an artifact of how data was entered.
  • Add the valid ones to a program-level risk register, or assign them back to the relevant projects.
  • Use this as a standing agenda item in program review meetings. Running the prompt before each meeting takes seconds and keeps the risk conversation grounded in data.

10. Weighted progress of the program

The prompt:

“Give me the weighted progress percentage of program X.”

Why this matters

A simple average of project progress percentages is misleading. A program with one massive project at 20% and five small projects at 90% is not at 78% complete - it’s closer to 30% if you weight by effort. The weighted calculation tells you how the program is actually progressing, accounting for the relative size of each project.

How to read the response

PMPilot calculates a weighted average using estimated effort as the weight. If estimated effort is not available for all projects, it falls back to a simple average and tells you so. The response includes a table showing each program with its weighted progress, number of projects, and total estimated effort.

If the weighted progress is significantly lower than the simple average, it means the larger projects are lagging while smaller ones are racing ahead. That’s a classic pattern in programs where the quick wins get done first and the hard, high-effort work keeps slipping.
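The gap between the two averages is easy to reproduce. The sketch below uses invented figures matching the example above - one large project at 20%, five small ones at 90% - weighted by estimated effort:

```python
# Simple vs. effort-weighted progress (hypothetical program data).
projects = [
    {"name": "Core platform rebuild", "progress": 20, "effort_h": 4000},
    {"name": "Quick win A", "progress": 90, "effort_h": 200},
    {"name": "Quick win B", "progress": 90, "effort_h": 200},
    {"name": "Quick win C", "progress": 90, "effort_h": 200},
    {"name": "Quick win D", "progress": 90, "effort_h": 200},
    {"name": "Quick win E", "progress": 90, "effort_h": 200},
]

simple = sum(p["progress"] for p in projects) / len(projects)
total_effort = sum(p["effort_h"] for p in projects)
weighted = sum(p["progress"] * p["effort_h"] for p in projects) / total_effort

print(f"Simple average:   {simple:.0f}%")    # the misleading figure
print(f"Weighted average: {weighted:.0f}%")  # closer to reality
```

With these numbers the simple average reads 78% while the weighted figure is 34% - the quick wins are nearly done, but four-fifths of the actual work sits in the lagging project.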

What to do next

  • Report the weighted figure to stakeholders, not the simple average. It’s more honest and helps set realistic expectations.
  • If there’s a significant gap between weighted and simple averages, investigate the large projects dragging the number down.
  • Ask for the project-by-project breakdown to identify which projects are most responsible for the overall figure.

11. Top resources by number of assignments

The prompt:

“Give me the three resources with the most assignments across projects. Just give me the number of tasks assigned.”

Why this matters

This is the quick-and-dirty overallocation check. Before you get into hours, capacity models, or utilization rates, start with a simple count: who is spread across the most tasks? A person assigned to 40 tasks across six projects is, by definition, context-switching constantly. Whether or not the hours add up, the cognitive load is real.

How to read the response

The response is a ranked table: resource name and number of assigned tasks. It’s intentionally minimal - the prompt asks for task count only, which keeps the answer fast and focused.

What to do next

  • For the top-ranked resources, follow up with: “Show me the project breakdown for resource Y” to understand whether their assignments are concentrated in one project or scattered.
  • Discuss the findings with project managers. Heavy assignment counts sometimes reflect a naming issue (one person assigned to umbrella tasks they don’t actually work on) rather than real overload.
  • Use this as a leading indicator. If the same names appear at the top month after month, the organization has a structural dependency that needs to be addressed.

12. Which resource has the most impact on the portfolio?

The prompt:

“Which resource has the most impact on projects? By impact, I mean that if they were unavailable and couldn’t work on their assignments, the portfolio would suffer the most.”

Why this matters

This is a fragility analysis. Every organization has key-person dependencies, but few make them explicit. The person who would cause the most disruption if they went on extended leave, changed roles, or left the company is usually known informally (“we’d be in trouble without Maria”) but never quantified.

Quantifying it changes the conversation. It moves from anecdote to data, which makes it easier to justify cross-training, backup assignments, or hiring decisions.

How to read the response

PMPilot interprets “impact” as the total estimated effort assigned to each resource across all projects, combined with the number of task assignments. The response shows a ranking - typically the top three to five resources - with their total effort (in hours or minutes) and task count.

This is a reasonable proxy for impact, though not the only one. A person with 500 hours across ten projects causes more disruption than a person with 500 hours on one project, because the blast radius is wider.

What to do next

  • For the top-ranked resource, ask: “What would happen to the portfolio if resource Y were unavailable for four weeks?” This forces a concrete scenario analysis.
  • Initiate cross-training or backup planning for the top two or three. You don’t need a full succession plan - just make sure someone else knows what they’re working on.
  • Present the findings to leadership. Key-person risk is often invisible until it materializes, and by then the options are limited.

13. If we had to cut 20% of the portfolio, what would the data suggest?

The prompt:

“If we had to cut 20% of the portfolio, what would the data suggest?”

Why this matters

Every executive thinks about this question. Few have the data to answer it. Portfolio rationalization is one of the highest-leverage activities in project management - killing the right projects frees resources and budget for the ones that matter - but it’s also one of the most politically charged. Without data, the conversation devolves into who has the loudest voice or the most senior sponsor.

This prompt forces a structured, numbers-first answer. It doesn’t make the decision for you, but it reframes the debate from opinion to evidence.

How to read the response

PMPilot interprets “20%” as a cost target - typically 20% of the total top-down workforce cost across active projects. It then proposes multiple cut options, each with a different tradeoff profile:

  • Single-project cut: The simplest option. If one project has a large cost and low or zero expected ROI, cutting it alone may exceed the 20% target. This is the low-hanging fruit.
  • Multi-project combination: A more surgical approach that combines smaller, lower-ROI projects to reach the target. This preserves the high-value work but requires cutting across several initiatives.
  • Tradeoff option: What it would take to hit the target exactly, and what ROI you’d lose in the process.

For each option, the response includes the cost saved, the expected ROI lost, and the ROI-to-cost ratio for each affected project. This ratio is the key figure: a project with an ROI/Cost ratio of 0 is an obvious candidate; one with a ratio of 19.45 is probably worth keeping unless something else is wrong.
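As a simplified illustration of why the ratio matters, the sketch below greedily cuts the lowest ROI-to-cost projects until a 20% cost target is reached. All names and figures are invented, and this greedy pass is my simplification, not PMPilot's actual option-building logic:

```python
# Greedy cut list ordered by ROI-to-cost ratio (hypothetical portfolio).
projects = [
    {"name": "Legacy migration", "cost": 300_000, "roi": 0},          # ratio 0.0
    {"name": "Internal portal",  "cost": 150_000, "roi": 120_000},    # ratio 0.8
    {"name": "Client platform",  "cost": 400_000, "roi": 7_780_000},  # ratio 19.45
    {"name": "Reporting revamp", "cost": 150_000, "roi": 90_000},     # ratio 0.6
]

target = 0.20 * sum(p["cost"] for p in projects)  # 20% of total portfolio cost

# Cut the lowest ratios first until the savings target is reached.
cuts, saved = [], 0
for p in sorted(projects, key=lambda p: p["roi"] / p["cost"]):
    if saved >= target:
        break
    cuts.append(p["name"])
    saved += p["cost"]

print(f"Target: {target:,.0f}  Saved: {saved:,.0f}  Cut: {cuts}")
```

In this toy portfolio the single zero-ROI project already exceeds the 20% target on its own - the "single-project cut" scenario described above - while the 19.45-ratio project never comes near the cut list.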

What to do next

  • Don’t treat the output as a recommendation to execute. Treat it as a briefing document for a prioritization conversation.
  • Look at the ROI/Cost ratios first. Projects with a ratio below 1.0 are costing more than they’re expected to return - those deserve scrutiny regardless of whether you’re cutting 20%.
  • Check the assumptions. If a project shows zero expected ROI, it might genuinely have none, or the ROI field might never have been filled in. Data quality matters here.
  • Use this prompt before annual planning or whenever there’s a budget pressure event. Having the analysis ready before someone asks for it puts the PMO in a proactive position.
  • Consider running variations: “If we had to cut 20% of effort instead of cost, what changes?” or “Which projects have the lowest strategic alignment score relative to their cost?”

More Prompts to Explore

The thirteen prompts above cover the core of what most project and portfolio managers need. But the question space is much larger. Below are seven more prompts that target specific scenarios - from short-term resource conflicts to long-term strategic alignment.

Project scope

14. Who on my team is overallocated in the next two weeks, and on what? Short-term resource conflicts at the task level. Useful as a weekly check, especially for teams shared across projects.

15. What changed in my project in the last 7 days that I should know about? The Monday morning briefing. Surfaces schedule shifts, new risks, updated progress, and anything else that moved since you last looked.

16. Which tasks have 0% progress but a start date in the past? Finds stalled work hiding in the plan. Tasks that should have started but show no progress are either blocked, forgotten, or assigned to someone who doesn’t know they own them.

Portfolio scope

17. Which projects are in trouble but haven’t raised a flag? The question every executive wants answered. Looks for projects whose data tells a different story than their reported status.

18. Give me a summary of the status of all projects. The portfolio-level snapshot: every project with its status, progress, deviation, business unit, and responsible manager. The Monday morning triage list, generated in seconds.

19. How much of our capacity goes to strategic initiatives vs. maintenance and operational work? The “keeping the lights on” ratio. Almost always worse than leadership assumes.

20. Which resources are assigned to tasks in more than three projects simultaneously? The cross-project spread detector. People working across too many projects aren’t necessarily overloaded in hours, but the context-switching cost is real and invisible in most reports.

Getting started

These twenty prompts share a common thread: they’re all questions that someone in your organization should be asking regularly, and that until recently cost too much time and effort to answer on demand.

The shift isn’t about AI replacing project managers. It’s about removing the friction that prevented good questions from being asked in the first place. When you can get a portfolio status summary, an earned value interpretation, or a resource fragility analysis in seconds, you ask more often. You catch problems earlier. You make decisions based on data instead of the last status report someone remembered to update.

PMPilot Data Analysis is available as part of ITM Platform’s AI-enhanced capabilities. If you want to try these prompts against your own project data, start a free trial and see what your portfolio has been trying to tell you.
