
Institutional Dashboards for Continuous Quality Improvement

Deb Simpson, Medical College of Wisconsin (Moderator)
Marc Triola, New York University
Mary C. Hill, University of Michigan
Henry Sondheimer, AAMC
Robby Reynolds, AAMC

2012 AAMC Annual Meeting


Background: Why dashboards? Why now?

Medical educators generate a LOT of data for different stakeholders.

New accreditation criteria for academic quality (LCME, ACGME, HLC).

New expectations for data analysis and reporting:
• Link objectives to outcomes
• Link individual, unit, and aggregate data
• Cross-link data


Background: Why dashboards? Why now?

New approaches to evaluating and comparing outcomes require new tools.

Business has led the way in developing and using Business Intelligence (BI) technology.

Dashboards are a navigational tool.


What is a Dashboard?

An auto dashboard? Or a cockpit instrument panel?


Process & Objectives for Session

Process
• WHAT: Intro & Dashboard Examples
  • NYU
  • Michigan
  • AAMC – MBM & ASSET
• NOW WHAT: Discussion Questions
  • What are the key decisions/questions?
  • What data do we need, have, or can access?
  • What are the challenges in dashboarding data?
  • Benchmarking?
• SO WHAT: How do dashboards
  • Add value (and to whom)?
  • Support a CQI educational environment?

Goal & Objectives
• GOAL: Expand our awareness of available data sets.
• OBJECTIVE: Address the questions "you" want answered regarding:
  • Available data sets:
    • What you have
    • What others have (AAMC, USMLE)
    • What you don't have
  • Benchmarks – criterion


Introduction: Decisions & Questions
Key Concepts & Terms

Business Analytics: tell key stakeholders, in a multidimensional manner, about:
• Process: what's happening now
• Outcomes: what's happening relative to a "criterion"
• Prediction: what MAY happen
• Discovery: something interesting

Hammergren TC. Data Warehousing for Dummies. 2009.
Terkla DG. Institutional Dashboards. AIR 2012.


Business Analytics

Key indicators – THE critical information:
• Vary by institution, stakeholder, and decision-maker
• Indicators must be (sketched in code below):
  • Easy to understand
  • Relevant to the user
  • Strategic
  • Not used in isolation
  • Quantitative, up to date with current information, and reliable

Hammergren TC. Data Warehousing for Dummies. 2009.
Terkla DG. Institutional Dashboards. AIR 2012.
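To make the notion of a key indicator concrete, here is a minimal Python sketch of one indicator tracked against a criterion value. Everything in it is illustrative: the class, the traffic-light thresholds, and the example metric are assumptions, not material from the session.

```python
from dataclasses import dataclass

@dataclass
class KeyIndicator:
    """One dashboard indicator: a current value tracked against a target."""
    name: str                      # e.g., a hypothetical pass-rate metric
    current: float                 # most recent observed value (quantitative, up to date)
    target: float                  # criterion or benchmark value
    higher_is_better: bool = True

    def status(self) -> str:
        """Traffic-light summary: easy to understand at a glance."""
        gap = self.current - self.target
        if not self.higher_is_better:
            gap = -gap
        if gap >= 0:
            return "green"
        # Within 5% of the target counts as "yellow"; the cutoff is arbitrary.
        return "yellow" if gap > -0.05 * abs(self.target) else "red"

step1 = KeyIndicator("Step 1 pass rate", current=0.94, target=0.96)
print(step1.name, step1.status())  # -> Step 1 pass rate yellow
```

A real dashboard would refresh `current` from a warehouse query rather than hard-coding it, and would never show such an indicator in isolation.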


Business Analytics – tells key stakeholders by:

Visualization of data:
• Multidimensional
• Chronological (past and present, and IF predictive, future)

Key concept: tell me a lot of things, but don't make me work too hard. (Both views below are sketched in code after this slide.)
• Dashboard
  • Presents current information on your operational performance (car, airplane)
  • Purpose: manage/guide the organization
• Scorecard
  • Shows performance against a plan, a set of objectives, or criteria/standards

Hammergren TC. Data Warehousing for Dummies. 2009.
Terkla DG. Institutional Dashboards. AIR 2012.
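The dashboard/scorecard distinction is small enough to show directly. In this hypothetical sketch (the metric names, values, and targets are invented), the same numbers are rendered once as current operational readings and once against a plan:

```python
# Hypothetical metrics and targets; not data from the session.
metrics = {"clerkship evals completed": 412, "lectures podcast": 38}
targets = {"clerkship evals completed": 500, "lectures podcast": 40}

# Dashboard view: current information on operational performance.
for name, value in metrics.items():
    print(f"[dashboard] {name}: {value}")

# Scorecard view: performance relative to the plan/criterion.
for name, value in metrics.items():
    pct = 100 * value / targets[name]
    print(f"[scorecard] {name}: {pct:.0f}% of target ({targets[name]})")
```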


Dashboard Examples


The NYU Educational Data Warehouse

Marc Triola, MD
Associate Dean for Educational Informatics
NYU School of Medicine, Division of Educational Informatics


Overview

• Background
• Our Approach
• Implementation
• Lessons Learned and Next Steps


Background

• Who we are
• Our new curriculum: C21
• With the rollout of C21, DEI had a goal to create a Framingham Heart Study-style database of educational data


Needs

• C21
  o Learning analytics at the individual level
  o Curriculum mapping and management
  o Regulatory reporting
• Strategic operational dashboards
  o Admissions, Diversity Affairs, etc.
  o UME, GME, and the Biomedical Sciences graduate program
  o DEI


Benefits of an EduDW

• Integrates metrics from numerous heterogeneous sources and enables analysis across multiple systems & processes
• The EduDW architecture, based on dimensions and facts (sketched in code below), promotes exploration:
  o provides a single analytic view that is easier for users
  o ensures high performance
  o is supported by a variety of query & reporting tools
  o facilitates creation of multidimensional cubes
• Preserves historical data
• Takes the load of resource-intensive queries off of operational systems
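To illustrate the dimensions-and-facts pattern named above, here is a generic star-schema sketch, not NYU's actual schema: every table and column name is hypothetical, and pandas is assumed to be available.

```python
import pandas as pd

# Dimension tables hold descriptive attributes.
dim_student = pd.DataFrame({
    "student_id": [1, 2],
    "cohort": ["2014", "2015"],
})
dim_course = pd.DataFrame({
    "course_id": [10, 11],
    "course_name": ["Anatomy", "Physiology"],
})

# The fact table holds one row per measurable event, keyed to dimensions.
fact_exam = pd.DataFrame({
    "student_id": [1, 1, 2],
    "course_id": [10, 11, 10],
    "score": [88, 92, 75],
})

# A single analytic view: join facts to dimensions, then slice and aggregate.
view = fact_exam.merge(dim_student, on="student_id").merge(dim_course, on="course_id")
print(view.groupby(["cohort", "course_name"])["score"].mean())
```

Because every fact row is keyed to its dimensions, the same structure supports drill-down, historical preservation, and multidimensional cubes.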


Education Data Warehouse

[Architecture diagram: source systems (Lecture Podcasting, ePortfolio, Evaluations, Learning Modules, Student Patient Log, LMS, SIS, Exams, Admissions, Simulation) feed through ETL into Data Marts and the EduDW, with a BI layer on top for Reporting and Analytics.]
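A toy sketch of the ETL step in the diagram. The source names echo the slide, but the functions, record shapes, and cleaning rule are invented for illustration:

```python
def extract(source: dict) -> list[dict]:
    """Pull raw records from one source system (here, an in-memory stand-in)."""
    return source["records"]

def transform(records: list[dict]) -> list[dict]:
    """Normalize records to the warehouse's shape; drop rows missing a student key."""
    return [
        {"student_id": r["id"], "measure": r["measure"], "value": float(r["value"])}
        for r in records
        if "id" in r
    ]

def load(warehouse: list[dict], records: list[dict]) -> None:
    """Append the conformed records to the (stand-in) warehouse."""
    warehouse.extend(records)

# Each source system (LMS, SIS, exams, ...) goes through the same pipeline.
edudw: list[dict] = []
lms = {"records": [{"id": 1, "measure": "logins", "value": "14"}]}
load(edudw, transform(extract(lms)))
print(edudw)  # [{'student_id': 1, 'measure': 'logins', 'value': 14.0}]
```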


Learning Analytics and Individual Dashboards

[Series of screenshot slides showing individual learning-analytics dashboards.]


Curriculum Mapping

[Screenshot slides of curriculum mapping views.]

Curriculum Management

[Screenshot slide of the curriculum management view.]


Operational Dashboards

[Series of screenshot slides showing operational dashboards.]


Lessons Learned

• Support within the organizational culture
  o Commitment from senior leadership
• Ongoing partnerships between Informatics/IT, faculty, and administration
  o An iterative process!


Dashboards for Continuous Quality Improvement at the University of Michigan

Mary Hill (maryhill@umich.edu)
Director, Data Management Services
University of Michigan Medical School
November 6, 2012


Why? Manage Better and Smarter

Being "successful" is a matter of survival. We are all experiencing financial challenges:
• The doubling of the NIH budget is over
• Reductions in state support
• More restrictive funding in a world of higher compliance


Looked at What's Important

• Determined relevant key performance indicators (KPIs) and benchmarks
• Reviewed metrics currently in use
• Ensured consistency with leadership vision and strategy
• Bring transparency and agreement
• Don't worry about getting the goal "right"
• Worry about being nimble


University of Michigan: First Dashboard

• Key constituents in one room
• Pushed for a set of KPIs in 3 months
• Paper first – drew pictures
• War room


Current Status

• Must be able to get to the data
  • Locally available
  • Peers – who are they?
  • Externally available
• Numbers must tie back to something users trust
• Allow drill-down (sketched below)
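"Drill-down" here means letting a user move from a headline number to the records behind it, so the number ties back to something users trust. A minimal sketch with invented data and field names:

```python
# Invented rows; a real system would query the warehouse.
rows = [
    {"dept": "Medicine", "metric": "publications", "value": 12},
    {"dept": "Medicine", "metric": "publications", "value": 7},
    {"dept": "Surgery",  "metric": "publications", "value": 9},
]

def summary(rows: list[dict]) -> dict:
    """Headline numbers: one total per department."""
    totals: dict = {}
    for r in rows:
        totals[r["dept"]] = totals.get(r["dept"], 0) + r["value"]
    return totals

def drill_down(rows: list[dict], dept: str) -> list[dict]:
    """The source records behind one headline number."""
    return [r for r in rows if r["dept"] == dept]

print(summary(rows))                 # {'Medicine': 19, 'Surgery': 9}
print(drill_down(rows, "Medicine"))  # the two records behind Medicine's 19
```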


Process

• The process of analysis brought changes to goals
• Different departments have different peers
• Tying performance incentives to goals brought engagement


Learning Program Metrics

Breakdown by area:
• Undergraduate
• Graduate Basic Science
• Graduate Medical Education
• Continuing Professional Education

Create metrics for each (see the sketch after this list):
• Input
• Throughput
• Output
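One way to organize such metrics in code: a hypothetical nesting by program area and pipeline stage, with example entries echoing the slides that follow.

```python
# Hypothetical structure; the entries paraphrase the example slides below.
metrics = {
    "Undergraduate": {
        "input": ["% of admitted students also accepted to peer schools"],
        "throughput": ["# abstracts/presentations"],
        "output": ["% with a primary care focus"],
    },
    "Graduate Medical Education": {
        "input": ["#/% in-state residents"],
        "throughput": ["#/% house officers in quality/patient-safety initiatives"],
        "output": ["# first-author publications by house officers"],
    },
}

for area, stages in metrics.items():
    for stage, items in stages.items():
        for item in items:
            print(f"{area} [{stage}]: {item}")
```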


Example: Undergraduate

Input:
• % of admitted students also accepted to peer schools
Throughput:
• # abstracts/presentations
Output:
• % with a primary care focus


Example: Graduate Basic Science

Input:
• % diversity
Throughput:
• #/$ students on institutional fellowships
Output:
• % in a science-related position five years post-training


Example: Graduate Medical Education

Input:
• #/% in-state residents
Throughput:
• #/% house officers engaged in quality or patient safety initiatives
Output:
• # publications with house officers as first author


Example: Professional Development

Input:
• # regularly scheduled series / # of attendees
Throughput:
• % CME participants providing post-course evaluations
Output:
• # papers published / # external presentations


Snapshot

[Screenshot of the University of Michigan dashboard.]


Challenges

• Keeping current – the environment changes
• Explosion of dashboards – everyone is doing them
• Trying to keep a metric in one place
• Maintaining a consistent user interface across the Health System


AAMC Mission Management Tool

• 2008 meeting of the Group on Student Affairs
• Challenge: give medical schools something better than USN&WR
• Reviewed the MSAR for medical school missions
• Six missions selected
• First release: March 2009




The Missions Dashboard, 2012


ASSET Dashboard

Monitor LCME standards performance annually.

www.aamc.org/medaps


MedAPS: Suite of Services

Provide AAMC member medical schools with the tools necessary to assess, maintain, and fulfill accreditation standards and promote continuous quality improvement.

• Curriculum Inventory & Reports (replacing CurrMIT)
• ASSET (Accreditation Standards Self-Evaluation Tool)
• ASSET Dashboard

www.aamc.org/medaps


ASSET Dashboard

• Review performance on LCME standards annually
• Compare performance and curricula with national data
• Compare performance and curricula with peer institutions
• Link to AAMC tools and solutions to help address deficiencies

www.aamc.org/medaps




Populating MedAPS

[Data-flow diagram linking the data sources (LCME AQ Parts I-A, I-B, and II; the AAMC Data Warehouse; student record systems; the Graduation Questionnaire; the faculty database) to ASSET (one-third pre-populated), the Curriculum Inventory, the Curriculum Inventory Reports, and the ASSET Dashboard.]


MedAPS: Timeline

[Timeline spanning 2013–2015: Curriculum Inventory & Reports Phase 1 (upload school data) launch; ASSET launch; Curriculum Inventory & Reports Phase 2 (launch service); ASSET Dashboard launch.]

www.aamc.org/medaps


Discussion Questions

• How can dashboards help schools monitor their individual missions? How are schools monitoring their missions without dashboards?
• How can dashboards organize vast amounts of data to keep it from becoming overwhelming?
• What data would be the most useful in a dashboard environment?
• What are the data sources for dashboards? What are the challenges to collecting the data?
• What are some questions you would like to be able to answer?


Summary of Key Findings & Next Steps

So What? Now What?


References & Resources

Alexander M. Excel 2007 Dashboards & Reports for Dummies. Wiley Publishing, Hoboken, NJ, 2007.
Arizona State University: http://www.asu.edu/dashboard/
Arnold KE. Signals: Applying Academic Analytics. EDUCAUSE Quarterly, 2010.
Scheps S. Business Intelligence for Dummies. Wiley & Sons, Hoboken, NJ, 2008.
Eckerson WW. Deploying Dashboards & Scorecards. TDWI Best Practices Report, July 2006.
Elias T. Learning Analytics: Definitions, Processes & Potential. 2011.
Fuchs G. Dashboard Best Practices. LogiXML white paper, 2004.
Hammergren TC & Simon AR. Data Warehousing for Dummies. Wiley Publishing, Hoboken, NJ, 2009.
iDashboard: www.idashboard.com
Simpson D, Colbert J, Ferguson K, O'Sullivan P. BI & Dashboards. Presented at SDRME Annual Mtg 2011, Madison, WI.
Terkla DG, Sharkness J, Cohen M, et al. Institutional Dashboards: Navigational Tools for Colleges & Universities. Association for Institutional Research Professional Files, Winter 2012, #123. http://www.airweb.org/EducationAndEvents/Publications/Pages/ProfessionalFiles.aspx
Baldrige Education Criteria for Performance Excellence. http://www.nist.gov/baldrige/


Disclosure(s)

I affirm that all persons involved in the planning/content development do not have relevant financial relationships with pharmaceutical companies, biomedical device manufacturers or distributors, or others whose products or services may be considered related to the subject matter of the educational activity.

Partial funding for this MCW project was provided by:
1. The Educational Leadership for the Health of the Public Research and Education Initiative fund, a component of the Advancing a Healthier Wisconsin endowment at the Medical College of Wisconsin.
2. The Drs. Elsa B. and Roger D. Cohen Children's Hospital of Wisconsin/Medical College of Wisconsin Student Fellowship in Medical Education.
