8 Improvements to the CCC Milestone Review
We continue to make CCC Milestone Reviews more usable and convenient. We have an extensive list of items to implement over time, and these 8 are the first of many. So, what are the latest improvements?
Milestone results from prior Reviews now appear on the radar chart
In the current version of CCC Milestone Reviews there is no way to visualize
a resident’s progress over time. To help you easily see your resident’s
progress and identify problem areas, the individual radar charts now show the
current results, the results from the prior two reviews, and the peer average.
You can view and print the full list of your program’s ACGME Milestones
One item that caught us by surprise was your interest in viewing and printing
the ACGME Milestones. While this information is available on the ACGME website,
it turns out many of you either prefer our format or the ease of access through
our software. The current version of our CCC Milestone Review software only
lets you print the Subcompetencies one at a time, which we realized is tedious
and inconvenient. Accordingly, each program can now print its full list of
Milestones from both CCC pages and from the Evaluations page for its Clinical
Competency Committee Review.
You can now view a count of “N/A” evaluation responses per Subcompetency
This challenge began when you could see that there were 20 total responses
related to a Subcompetency, but could only visualize 18 responses on the trend
chart because 2 of them were entered as N/A. Additionally, for programs that
utilize the ‘Has Not Yet Achieved’ language, visualizing the count helps them
determine resident progress. By showing the Not Yet Achieved and Not Yet
Rotated counts at the aggregate level, there is now a clear way to see how the
average score and the response counts were determined.
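To make the arithmetic concrete, here is a minimal sketch (in Python, with illustrative data and names of our own choosing, not our production code) of how non-numeric responses can be counted separately while the average is computed only from the numeric responses:

```python
from collections import Counter

# Illustrative responses for one Subcompetency: numeric levels plus
# non-numeric markers that cannot be plotted on the trend chart.
responses = [3.0, 3.5, 4.0, "N/A", 3.0, "N/A", 2.5, 3.5]

numeric = [r for r in responses if isinstance(r, (int, float))]
markers = Counter(r for r in responses if isinstance(r, str))

average = sum(numeric) / len(numeric)

print(f"Total responses: {len(responses)}")           # 8
print(f"Plotted on the trend chart: {len(numeric)}")  # 6
print(f"Average of numeric responses: {average:.2f}") # 3.25
print(f"Non-numeric counts: {dict(markers)}")         # {'N/A': 2}
```

Showing these counts alongside the average makes it clear why the trend chart can display fewer points than the total number of responses.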
Improved legibility for low-resolution views (tablets, etc.)
Before this release, when the display size of a screen was reduced (for a tablet or
smartphone), the trend chart and average values on the subcompetency details page
were pushed below the details information. This created a problem because the
trend and average apply directly to the subcompetency grid, so they should sit
near it to avoid extraneous navigation. This is now corrected: when a page is
resized, these charts display immediately below the subcompetency grid and are
easier to read.
Milestones flagged as “Requires Attention” on prior Reviews are now clearly identified
The current version of CCC Milestone Reviews does not indicate milestones that
previously required attention. Any milestone with an associated subcompetency
level can be marked as requiring attention to acknowledge where a resident is
struggling, and during a review CCC members find it helpful to see whether any
milestones were flagged as problem areas on prior reviews. The subcompetency
details page now includes a list of all past milestones that were flagged as
requiring attention.
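As a rough sketch of the kind of data behind this list (the structure and field names below are hypothetical, not our actual schema):

```python
from dataclasses import dataclass, field

@dataclass
class Review:
    date: str
    flagged_milestones: list[str] = field(default_factory=list)  # hypothetical field name

# Prior reviews for one resident (illustrative data only).
reviews = [
    Review("2016-12-15", flagged_milestones=["PC-1"]),
    Review("2017-06-15", flagged_milestones=["PC-1", "SBP-2"]),
    Review("2017-12-15"),
]

# Collect every milestone flagged on a past review, in review order,
# for display on the subcompetency details page.
previously_flagged = [(r.date, m) for r in reviews for m in r.flagged_milestones]
for date, milestone in previously_flagged:
    print(f"{date}: {milestone} was flagged as requiring attention")
```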
Optionally restrict residents from accessing their peer averages
In the current version of CCC Milestone Reviews, residents can show or hide
their peer averages. This became an issue in small programs: a resident who
knows their individual score and the peer average can mathematically solve for
the other residents’ scores. Some educators also simply reject the idea of
showing a peer average to their students. Administrators now have a
configuration option that prevents residents from seeing peer averages.
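To see why small programs are affected, consider this worked example (illustrative numbers; it assumes the peer average includes every resident in the program):

```python
# In a two-resident program, the peer average plus a resident's own score
# fully determines the other resident's score.
n_residents = 2
my_score = 3.5
peer_average = 3.0  # average across all residents for this subcompetency

other_score = peer_average * n_residents - my_score
print(other_score)  # 2.5 -- the other resident's score is exposed
```

If the peer average instead excludes the viewing resident, a two-resident program is even more exposed: that average is simply the other resident’s score.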
Optionally prohibit rating scale responses from being Normalized and Averaged
Currently, all rating scale responses that are mapped to a subcompetency are
automatically normalized and included in the average score for that subcompetency.
Many of you do not want the results of your rating scale evaluation questions
to be normalized and averaged with results from questions that tie more directly
to the milestones. Administrators now have a configuration option that disables
this calculation.
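Conceptually, the setting acts as a filter on which responses feed the subcompetency average. A minimal sketch follows; the setting name and data are hypothetical, not our production code:

```python
# Hypothetical administrator-controlled setting.
include_rating_scale_responses = False

milestone_question_scores = [3.0, 3.5, 4.0]   # questions tied directly to milestones
normalized_rating_scale_scores = [2.0, 2.4]   # rating scale responses after normalization

pool = list(milestone_question_scores)
if include_rating_scale_responses:
    pool += normalized_rating_scale_scores

subcompetency_average = sum(pool) / len(pool)
print(round(subcompetency_average, 2))  # 3.5 with the setting disabled; 2.98 if enabled
```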
We have altered our normalization calculation to better represent skill acquisition
Our current normalization strategy uses a purely proportional calculation that
converts responses from their native scale to a percentage (0-100) on the
“Dreyfus” scale. A problem occurs because the lowest Dreyfus level is 1, so a
percentage value of 0 does not correspond to any point on the Dreyfus scale.
Because numerous customers use their own custom grade scales in Evaluations,
the normalized data no longer had a direct numerical correspondence to the
Dreyfus levels. To solve the problem, we have changed the normalization
calculation: it now accounts for the fact that the Dreyfus scale begins at a
value of 1.
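As a rough illustration of the change (the scale bounds and the exact mapping below are assumptions for the example, not our production formula), compare a purely proportional conversion with one anchored to a lowest value of 1:

```python
def normalize_proportional(value, native_min, native_max):
    """Old approach: purely proportional conversion to a 0-100 percentage.
    The lowest native response maps to 0, which has no counterpart on a
    scale whose lowest level is 1."""
    return 100 * (value - native_min) / (native_max - native_min)

def normalize_to_dreyfus(value, native_min, native_max, dreyfus_min=1, dreyfus_max=5):
    """New approach: anchor the lowest native response to Dreyfus level 1
    and the highest to the top Dreyfus level."""
    fraction = (value - native_min) / (native_max - native_min)
    return dreyfus_min + fraction * (dreyfus_max - dreyfus_min)

# Example with an assumed custom 1-9 grade scale from Evaluations.
print(normalize_proportional(1, 1, 9))  # 0.0 -- falls below the Dreyfus scale
print(normalize_to_dreyfus(1, 1, 9))    # 1.0 -- lowest response = lowest Dreyfus level
print(normalize_to_dreyfus(9, 1, 9))    # 5.0 -- highest response = top Dreyfus level
```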