Attention!

The content on this site is a materials pilot. It represents neither changes to existing policy nor pending new policies. THIS IS NOT OFFICIAL GUIDANCE.

We also have a plain text version of this rubric.

Outcomes-orientation

Are efforts clearly connected to intended outcomes and end users?

Why this matters...

Outcomes-oriented Medicaid IT teams have a clear, program-driven direction. As a result, they more promptly identify and address issues that can impact progress and make smart tradeoffs in their day-to-day work.

Top priority

Ask to see the product roadmap and the overall roadmap.

  • Bad: There is no roadmap for the product / service or enterprise.
  • Meh: There is a roadmap, but it is unclear when value will be delivered, or the product and enterprise roadmaps conflict.
  • Good: The roadmap captures how the product / service will evolve, demonstrates value to end users within 12 months, and aligns with the enterprise roadmap.
Related course lessons: oo1.1

Ask to see how teams measure their progress against program, policy, and/or baseline metrics.

  • Bad: Explanations of how the state will measure progress or program impact are incongruent or missing altogether.
  • Meh: Teams consistently articulate the impact they are targeting but do not have metrics or baselines.
  • Good: Teams consistently articulate their target metrics and can demonstrate how they are doing against baselines.
Related course lessons: oo1.2

Medium priority

Ask how feedback from users impacts priorities.

Ask how the state ensures services are accessible for all.

  • Bad: The team has no information about the current user experience.
  • Meh: The team has information about the current experience but has not spoken directly to users.
  • Good: The team has collected feedback from end users directly and can demonstrate how they apply what they learn.
Related course lessons: oo2.1

Lower priority

Ask both project and enterprise-level team members what the current priorities are.

  • Bad: State team members can't speak to priorities at all.
  • Meh: State team members can speak to priorities, but they're not captured anywhere in written or visual form.
  • Good: Outcomes and priorities are clearly articulated by all state team members and live in a shared roadmap that's regularly updated.
Related course lessons: oo3.1

Ask team members from different departments (program, IT, procurement, etc.) about their role and priorities.

  • Bad: IT, program, finance, etc. priorities are unconnected or at odds.
  • Meh: Some priorities are aligned, others are disjointed.
  • Good: IT, program, finance, etc. priorities are aligned around end-user outcomes.
Related course lessons: oo3.2

State capacity

Does the state have the skills and capacity to manage the work?

Why this matters...

Without some familiarity with software development, an agency is not well equipped to evaluate vendor work and lead a development project successfully. Poor working relationships and limited capacity across departments are the biggest blockers we see in projects at both the federal and state level.

Top priority

Ask the product and technical leads what their role is and what other responsibilities they have.

Look at the APD staffing plan vs. current staffing.

  • Bad: The state depends entirely on a vendor for vision and strategy.
  • Meh: State leads are identified but have limited time to lead the strategy and take direction from the vendor.
  • Good: State staff with IT experience have the bandwidth to lead the technical strategy with feedback from vendors.
Related course lessons: sc1.1

Medium priority

Ask how and when the team utilizes different types of expertise.

  • Bad: Teams cannot be staffed, are staffed with only one skill set/perspective (ex. only a PM), and/or do not include program expertise.
  • Meh: Teams are staffed with project and program expertise, and are able to pull in other experts when needed.
  • Good: Teams are staffed with project, program, technical, and other expertise.
Related course lessons: sc2.1

Ask detailed questions about technology, program goals, end users, etc.

Pay attention to who answers each question.

  • Bad: Team members can't answer questions about the project or issues in plain language.
  • Meh: Team members rely on vendors to answer questions, but answers are in plain language.
  • Good: All team members (including IT and procurement) can speak in plain language about the project.
Related course lessons: sc2.2

Lower priority

Join meetings and watch team dynamics.

  • Bad: Adversarial / non-existent relationships across the teams or divisions.
  • Meh: Individuals have good relationships, but team dynamics are strained.
  • Good: Individuals and teams from all divisions talk regularly and have good relationships.
Related course lessons: sc3.1

Procurement flexibility

Do the state's procurement and vendor management practices provide visibility and flexibility?

Why this matters...

States and vendors have different incentives. To keep work driving toward program outcomes, the state should actively manage vendor work and build quality monitoring into its contracts.

Top priority

Ask how the state knows what progress has been made.

Ask if there are known risks that may cause delay.

  • Bad: Little to no visibility into progress. Target dates and budget estimates are missed or frequently revised.
  • Meh: There's limited visibility into progress. Many dates are hit, but misses are surprising.
  • Good: State has full visibility into progress. Most dates are hit, and the state knows early on when and why they are not going to meet a date.
Related course lessons: pf1.1

Ask how the team implements quality monitoring through their contracts.

  • Bad: The state has little or no expectations for monitoring quality in their contracts.
  • Meh: The state has vague expectations around monitoring quality in their contracts and mostly leaves monitoring to development teams.
  • Good: The state can demonstrate how they monitor quality in their contracts and work with teams to make sure monitoring is implemented.
Related course lessons: pf1.2

Medium priority

Ask how the state manages and accesses system data.

  • Bad: The state has little or no visibility into how data is stored or accessed.
  • Meh: The state can use the vendor's tools to access data, but does not own the data.
  • Good: The state owns and controls access to all data.
Related course lessons: pf2.1

Lower priority

Ask what challenges the team is having in driving toward intended outcomes.

Ask who is involved in setting strategy and outcomes.

  • Bad: The state blames or relies on the vendor for the strategy and achievement of the outcomes.
  • Meh: The state takes partial accountability for the strategy and achievement of the outcomes; leans on vendors for procurement strategy.
  • Good: The state considers themselves wholly responsible for the strategy and achievement of the outcomes; leads strategy, collects feedback from vendors.
Related course lessons: pf3.1

Ask team leads and contract owners how they monitor vendor budget.

  • Bad: The state has little or no regular visibility into how the vendor is billing.
  • Meh: The state has visibility into how the vendor is billing but is uncertain how to manage if a vendor is burning too hot.
  • Good: The state has regular conversations with the vendor about burn rate.
Related course lessons: pf3.2

Iterative development

Does the state use iterative, secure development practices?

Why this matters...

More important than any individual practice or technique, healthy design and development teams are able to iterate quickly by 1) identifying process failures and opportunities for improvement, and 2) adjusting and improving their approach accordingly, in real time.
 
The more insight a state has, the better able they are to identify whether they are on track. One tool states use for this is testing. There are many different types of testing. CMS is in the process of releasing new testing guidance to teams.

Top priority

Ask how the development and state team share in-progress work.

Ask to see progress. For example, join demos throughout development.

  • Bad: Nothing is shared until it's completely finished, and no code is shown.
  • Meh: In-progress work is shown, but it's always fairly polished, some code is shown.
  • Good: You see regular (once a month or more) demos ranging from very messy, incomplete work all the way to polished, finished products.
Related course lessons: id1.1

Ask how the state incorporates the end user during the development & testing processes.

  • Bad: The team only collects input from end users at the end of the process.
  • Meh: The team collects feedback at the beginning of the process, but does not validate they are meeting user needs during development.
  • Good: The team regularly collects feedback and tests to ensure the product improves the experience for end users.
Related course lessons: id1.2

Ask how the state approaches security, performance, and migration testing.

Ask how project leads interact with the testing process.

  • Bad: The team cannot answer what types of testing they are doing, only that they test at the end of the process.
  • Meh: The team can describe their testing approaches but testing is done by a siloed team.
  • Good: The team can demonstrate their testing approaches. Testing and development are done by the same team.
Related course lessons: id1.3-1, id1.3-2, id1.3-3

Medium priority

Ask how the process of getting new things into production works.

How long does it take? How many steps are involved?

  • Bad: All changes are rolled out as a "big-bang" effort, where the system is turned off and then back on.
  • Meh: Deploying changes is minimally disruptive to end users, but requires work stoppages for the internal team.
  • Good: Deploying changes is minimally disruptive to end users and the internal team.
Related course lessons: id2.1
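One common technique for rolling out changes without a "big-bang" cutover is a feature flag that gradually routes users to a new code path. The sketch below is illustrative only, assuming a hypothetical flag name and a simple percentage-based rollout, not any specific state's system or product:

```python
# A minimal feature-flag sketch: deterministically enable a change for a
# stable slice of users, so a rollout can grow from 1% to 100% gradually.
# The flag name and rollout logic are hypothetical examples.
import hashlib

def flag_enabled(flag_name, user_id, rollout_percent):
    """Return True for a stable rollout_percent slice of users."""
    digest = hashlib.sha256(f"{flag_name}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100  # stable bucket 0-99 per user/flag pair
    return bucket < rollout_percent

# Example: serve a redesigned renewal form to 10% of users first, watch for
# problems, then raise the percentage instead of switching everyone at once.
# if flag_enabled("new-renewal-form", user_id, 10): ...
```

Because the bucket is derived from a hash rather than a random draw, each user sees a consistent experience across visits while the rollout expands.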

Ask how developers test changes before delivering a feature for review.

  • Bad: Developers don't test changes before adding them to the project.
  • Meh: Developers manually test features, but don't do automated tests.
  • Good: Developers can show how they run manual and automated tests before adding changes to the project.
Related course lessons: id2.2
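To make "automated tests" concrete: these are small programs that check the system's behavior and run automatically on every code change. The sketch below uses the style of pytest, a common Python test framework; the eligibility function is a hypothetical stand-in for real business logic, not an actual Medicaid rule:

```python
# Hypothetical function under test: the income limit rises $500 per
# additional household member. Illustrative only.
def is_eligible(monthly_income, household_size):
    limit = 1500 + 500 * (household_size - 1)
    return monthly_income <= limit

# Automated tests: a test runner (e.g., pytest) finds and runs these on
# every change, catching regressions before they reach users.
def test_single_person_under_limit():
    assert is_eligible(1200, 1)

def test_single_person_over_limit():
    assert not is_eligible(2000, 1)

def test_limit_scales_with_household_size():
    # 2000 is over the 1-person limit (1500) but at the 2-person limit (2000).
    assert is_eligible(2000, 2)
```

A "Good" team can show a suite like this running automatically; a "Meh" team performs these checks by hand each time.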

Ask how they monitor system health in production.

  • Bad: The state can't tell you about the health of the system that's being used.
  • Meh: The state manually monitors system health and creates reports on a regular basis.
  • Good: System health is monitored automatically and staff can show you a reporting dashboard.
Related course lessons: id2.3
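As a rough illustration of automatic monitoring: many systems expose a health endpoint that a script or monitoring service polls on a schedule, feeding a dashboard. The sketch below assumes a hypothetical endpoint URL and uses only Python's standard library:

```python
# A minimal health-check poll: fetch a service's (hypothetical) health
# endpoint and record whether it responded. In practice a monitoring tool
# runs checks like this on a schedule and charts the results.
import urllib.request
from datetime import datetime, timezone

def check_health(url, timeout=5):
    """Return one status record for a single health-check poll."""
    record = {"checked_at": datetime.now(timezone.utc).isoformat(), "url": url}
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            record["status_code"] = resp.status
            record["healthy"] = resp.status == 200
    except OSError as err:  # connection refused, timeout, DNS failure, etc.
        record["status_code"] = None
        record["healthy"] = False
        record["error"] = str(err)
    return record
```

The point of "automatic" is that records like these accumulate without staff effort, so the dashboard reflects system health continuously rather than when someone remembers to check.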

Lower priority

Ask how development choices are made and if open source or reusable options are considered.

  • Bad: The state can't tell you how they make/made their decisions.
  • Meh: The state can tell you how they made their decisions, but did not consider open source options.
  • Good: The state can show you how/why they make their decisions and are using open source when possible.
Related course lessons: id3.1