
JPAE

VOLUME 19 NUMBER 1

JOURNAL OF PUBLIC AFFAIRS EDUCATION

Flagship journal of the National Association of Schools of Public Affairs and Administration

WINTER 2013


National Association of Schools of Public Affairs and Administration

(NASPAA)


Jack Knott, Vice President

Frances S. Berry, Immediate Past President

Laurel McFarland, Executive Director

JPAE Oversight Committee:

Andrew Ewoh, Greg Lindsey & Amy Donahue

David Schultz, Editor in Chief, Hamline University

Kristen Norman-Major, Managing Editor, Hamline University

Iris Geva-May, Associate Editor for International and Comparative Education, Simon Fraser University

Lisa Dejoras, Editorial Assistant, Hamline University

Copy Editor: Chris Thillen

Layout and Cover Design: Val Escher

EDITOR’S COUNCIL

H. George Frederickson, Founding Editor, University of Kansas

James L. Perry, Indiana University, Bloomington

Danny L. Balfour, Grand Valley State University

Mario A. Rivera, University of New Mexico

Marc Holzer

Heather E. Campbell, Claremont Graduate University

Edward T. Jennings, University of Kentucky

BOARD OF EDITORS

Muhittin Acar, Hacettepe University, Turkey

Mohamad Alkadry, Florida International University

Burt Barnow, George Washington University

Peter J. Bergerson, Florida Gulf Coast University

Rajade Berry-James, North Carolina State University

John Bohte, University of Wisconsin, Milwaukee

Espiridion Borrego, University of Texas Pan American

John M. Bryson, University of Minnesota

Lysa Burnier, Ohio University

N. Joseph Cayer, Arizona State University

Heather Campbell, Claremont Graduate University

Barbara Crosby, University of Minnesota

Robert B. Cunningham, University of Tennessee, Knoxville

Dwight Denison, University of Kentucky

Anand Desai, Ohio State University

James W. Douglas, University of North Carolina at Charlotte

Robert Durant, American University

Jo Ann G. Ewalt, Eastern Kentucky University

Cynthia Fukami, University of Denver

Susan Gooden, Virginia Commonwealth University

Cynthia Jackson-Elmoore, Michigan State University

Meagan Jordan

Edward Kellough, University of Georgia

Don Kettl, University of Maryland, College Park

John Kiefer, University of New Orleans

William Earle Klay, Florida State University

Chris Koliba, University of Vermont


Kristina Lambright, Binghamton University, State University of New York

Laura Langbein, American University

Scott Lazenby, City of Sandy, Oregon

Deanna Malatesta, Indiana University-Purdue University Indianapolis

Steven R. Maxwell, Florida Gulf Coast University

Barbara McCabe, University of Texas

Juliet Musso, University of Southern California

Michael O’Hare, University of California, Berkeley

Michael Popejoy, Florida International University

David Powell, California State University, Long Beach

David Reingold, Indiana University

Dahlia Remler, Baruch College CUNY

R. Karl Rethemeyer, University at Albany SUNY

Michelle Saint-Germain, California State University, Long Beach

Jodi Sandfort, University of Minnesota

Robert A. Schuhmann, University of Wyoming

Patricia M. Shields, Texas State University

Robert Smith, Kennesaw State University

Jessica Sowa, University of Colorado

Kendra Stewart, University of Charleston

Giovanni Valotti, Università Bocconi

David Van Slyke, Syracuse University

Karel Van der Molen, Stellenbosch University, South Africa

Howard Whitton, Griffith University

Blue Wooldridge, Virginia Commonwealth University

Firuz Demir Yasamıs, American University in the Emirates

CORRESPONDENTS

Khalid Al-Yahya, Dubai School of Government

Charlene M. L. Roach, University of the West Indies, St. Augustine Campus

Edgar Ramirez Delacruz, Center for Research and Teaching in Economics (CIDE), Mexico

Journal of Public Affairs Education is published quarterly by the National Association of Schools of Public Affairs and Administration. Claims for missing numbers should be made within the month following the regular month of publication. The publishers expect to supply missing numbers free only when losses have been sustained in transit and when the reserve stock will permit.

Subscription Rates: JPAE articles can be accessed at www.naspaa.org/JPAEMessenger.

Change of Address: Please notify us and your local postmaster immediately of both old and new addresses. Please allow four weeks for the change.

Postmaster: Send address changes to JPAE, National Association of Schools of Public Affairs and Administration.

Educators and Copy Centers: National Association of Schools of Public Affairs and Administration. All rights reserved. Educators may reproduce any material for classroom use only, and authors may reproduce their articles without written permission. Written permission is required to reproduce JPAE in all other cases.





Journal of Public Affairs Education

Winter 2013 Volume 19, No. 1

From the Editor—Why Government?

David Schultz.................................................................................................... ii

From the NASPAA President—Looking Outward:

The Changing Context of Public Service Education

Jack H. Knott.................................................................................................... 1

From the Guest Editor—Emergency Management, Homeland Security,

and Public Administration: From the Field to the Classroom

John J. Kiefer....................................................................................................... 9

FEMA SYMPOSIUM

Homeland Security in Higher Education: The State of Affairs

Kendra Stewart and John Vocino........................................................................ 13

Utilization of Service Learning in Emergency Management Programs in

the United States

Naim Kapucu and Claire Connolly Knox............................................................ 31

Developing Decision-Making Skills for Uncertain Conditions:

The Challenge of Educating Effective Emergency Managers

Louise K. Comfort and Clayton Wukich.............................................................. 53

ARTICLES OF CURRENT INTEREST

Productivity and Leadership Patterns of Female Faculty Members in

Public Administration

Meghna Sabharwal........................................................................................... 73

The Development of an MPA Major Field Test

Paulette Camp Jones, Gary E. Roberts, Ervin Paul Martin,

Elaine Ahumada, Stephen M. King, and Pat Kircher........................................... 97

Public Administration Theory, Research, and Teaching: How Does

Turkish Public Administration Differ?

Murat Onder and Ralph S. Brower................................................................... 117

Competency Model Design and Assessment: Findings and Future Directions

Heather Getha-Taylor, Raymond Hummert, John Nalbandian, and Chris Silvia.... 141

Statistical Software for Curriculum and Careers

William C. Adams, Donna Lind Infeld, and Carli M. Wulff............................. 173

BOOK REVIEWS

Review of Rethinking Public Sector Compensation: What Ever Happened to

the Public Interest?

Charles E. Menifield........................................................................................ 189

Review of Financial Management for Public, Health, and Not-for-Profit Organizations

Meagan M. Jordan.......................................................................................... 193

Information for Contributors.....................................................Inside back cover

Cover design by Val Escher. Cover design property of NASPAA. Cover photo: University of New Orleans.


From the Editor—

Why Government?

Why government? This is perhaps the core question of political theory and an

often neglected topic in public affairs programs. So many of us take for granted

the rationale and need for government that we forget that for the last generation,

this question has been a contentious topic of political discourse. In part, the recently concluded American presidential election was an attempt to answer why we need government, and elections across Europe, Asia, and Africa are similar contests over how to reply to this question.

Public affairs faculty must be able to articulate why we need government and

what it should look like—not just to our students, but to university administrators,

politicians, and the general public. The failure to explain what value government

offers, especially for the 21st century, dooms public administration to the dustbin

of history. The Winter 2013 issue of JPAE begins a dialogue on why government

is necessary and considers how a variety of forces are challenging that institution

as well as what we do in the classroom.

It’s traditional to start a new volume of JPAE by printing the inaugural address

of the incoming NASPAA president. Here President Jack Knott begins the exploration

of Why government? He describes what he sees as the five factors transforming

public service education: challenges to democracy, cross-sector governance, technology,

globalization, and demographics. According to Knott, these five factors challenge

public service delivery across the world, necessitating that our curriculum and

the way we teach also reflect these trends. His address charges the academy to

adapt, learn, and lead, urging it to give more than mere lip service to

change. NASPAA must renew its mission in light of these five factors and seek

improved collaboration with other organizations if we are to do a better job in

educating our future leaders.

Although Knott identified five factors driving public affairs education today,

he easily could have listed a sixth—confronting disasters and addressing emergencies.

This is the subject of this issue’s symposium on FEMA. It is almost trite now to

say that the terrorist attacks against the United States on September 11, 2001,

changed everything. Yes, those events have made domestic security and fighting

terrorism a concern for governments across the globe. But 9/11 also highlighted

something that many had forgotten, which is that the task of government is to

address emergencies. Beyond the security issue, 9/11 was about emergency

management. Hurricane Katrina in 2005 represented for many a failure of

government in responding to a crisis; Hurricane Sandy, at least from preliminary

reports, demonstrates how a better prepared government can respond.

Events like 9/11, Katrina, and most recently Sandy point to the important

role of emergency management as a governmental function. A decade ago, probably

no one thought of teaching emergency management in a public affairs curriculum.

Today, this is an emerging if not a central theme of many programs. The FEMA


symposium guest editor John Kiefer introduces the three articles in this issue

addressing this subject. Kiefer provides a succinct overview and history of the field

of emergency management, defines the subject matter, and then summarizes the

three articles for the symposium.

The three articles are “Homeland Security in Higher Education: The State of

Affairs,” by Kendra Stewart and John Vocino; “Utilization of Service Learning in

Emergency Management Programs in the United States,” by Naim Kapucu and Claire

Connolly Knox; and “Developing Decision-Making Skills for Uncertain Conditions:

The Challenge of Educating Effective Emergency Managers,” by Louise Comfort

and Clayton Wukich. These articles offer a kaleidoscope of perspectives on the

emergency management field. The authors describe the skills that professors

need to emphasize, and the tasks to be done as public affairs education moves

forward in its rediscovery of a core function of government that had been forgotten

and that now sits alongside the five challenges identified by Knott.

But this issue presents articles that go beyond the topic of emergency management.

Meghna Sabharwal’s “Productivity and Leadership Patterns of Female Faculty

Members in Public Administration” examines the gender gap among faculty in

public affairs teaching. Although the percentage of women as public affairs faculty

has increased, little attention has been given to their productivity and advancement

to leadership positions. Sabharwal describes one of the first studies of its kind to

do so. Drawing on data from the Survey of Doctorate Recipients spanning 241

schools, she finds female faculty have lower productivity than their male counterparts,

even after accounting for variables such as demographic, institutional, and career

factors. The article begins an exploration of why this gender gap exists and

considers its implications for women and their career paths, both within higher

education and for our students wishing to work in government.

In “The Development of an MPA Major Field Test,” Paulette Camp Jones

and her colleagues describe the yearlong process of systematically developing an

MPA Major Field Test with a team of five schools in various regions of the

United States. The test was based on NASPAA’s five competencies for the MPA

degree. Four schools list the data gleaned from the test results and its impact on

their MPA programs. The authors report on the reliability and effectiveness of the

field test in measuring the strength of MPA programs. This excellent piece contains

useful information for all programs wishing to do self-evaluations.

It is too easy to assume that a rose is a rose, or that teaching public administration

in one country is the same as in others. Not so. In “Public Administration Theory,

Research, and Teaching: How Does Turkish Public Administration Differ?” Murat

Onder and Ralph S. Brower provide a 20-year overview on the evolution of teaching

public administration in that country. They note the emergence of a movement

away from using standard texts found in the United States and elsewhere and the

creation of a unique body of Turkish literature. The authors describe the unique

challenges of public affairs education in Turkey, highlighting themes and issues

other curriculums across the world face as they seek to educate their future leaders.


Perhaps the message of the article is to suggest a seventh challenge to public affairs

education that is the opposite of globalization—the nationalization of training.

In “Competency Model Design and Assessment: Findings and Future Directions,”

Heather Getha-Taylor and coauthors discuss the implementation of a model to

test learning competencies, using the University of Kansas MPA program as a case

study. The article provides a fine overview of the problems regarding definition,

design, and evaluation of competencies, and it offers useful instruction to other

programs looking for directions as they do the same.

For those who teach statistics and research methods, one of the perennial questions

is which software package is most relevant for careers. (What do professionals use

in the real world? And which software programs are the best for teaching critical

concepts?) William C. Adams, Donna Lind Infeld, and Carli M. Wulff address

that query in “Statistical Software for Curriculum and Careers.” They seek to identify

the skills employers are looking for, the statistics packages they use, and the kinds of

software that best meet the instructional needs for class. This is a terrific piece for

those teaching research methods classes and public affairs curricula.

This Winter 2013 issue concludes with two book reviews: Charles E. Menifield

reviews Rethinking Public Sector Compensation: What Ever Happened to the Public Interest?

by Thom Reilly, and Meagan M. Jordan looks at the fourth edition of Financial

Management for Public, Health, and Not-for-Profit Organizations, by Steven A. Finkler,

Robert M. Purtell, Thad D. Calabrese, and Daniel L. Smith. The first book is a

serious examination of public sector compensation, a book suitable for human

resources; the second is a terrific core textbook for many public affairs classes.

Collectively, the articles in this issue of JPAE should challenge readers to find

answers to the question, Why government? This should be the first question we ask

in every class—and we and our students must put this same question to the public

if we are to remain relevant and vibrant programs into the future.

— David Schultz

Editor in Chief

Hamline University

dschultz@hamline.edu

David Schultz is a Hamline University professor in the School of Business and School

of Law. Professor Schultz is a two-time Fulbright Scholar and the author of more

than 25 books and 90+ articles on various aspects of American politics, election law,

and the media and politics, and he is regularly interviewed and quoted on these

subjects in the local, national, and international media, including the New York Times,

Wall Street Journal, Washington Post, the Economist, and National Public Radio.

His most recent book, American Politics in the Age of Ignorance: Why Lawmakers

Choose Belief Over Research, was published in 2012 by Palgrave Macmillan.



From the NASPAA President

NASPAA Presidential Address

Annual Conference, Austin, Texas

October 19, 2012

Looking Outward: The Changing

Context of Public Service Education

Jack H. Knott

Sol Price School of Public Policy,

University of Southern California

Good afternoon! I am truly honored and humbled to be selected as the president

of NASPAA. I care deeply about our profession and the education of our students

and look forward to working with all of you over this coming year as we continue

to build a great professional association for our schools and programs.

Before I begin my remarks, let me first thank Nadia for her service to NASPAA

and for her great year as president!

She has been a delight to work with on the Council and the Executive Committee.

I especially appreciated her initiation of the Strategic Planning process and her

dedication to furthering NASPAA’s international agenda.

I also want to thank Fran Berry, the past, past president of NASPAA who has

served with Nadia and me on the Executive Committee. What a privilege to work

with these smart and capable individuals!

In addition, a hearty thank you to the members of the Executive Council for

their dedication and hard work in helping to deal with the issues facing the organization

and in shaping a vision for the future.

And, I thank the very fine NASPAA staff: Crystal Calarusse and Stacy Drudy

for managing accreditation; Peter Green for keeping the finances growing and in

the black; Stuart Heiser for good work on policy issues; Monchaya Wanna for her

work behind the scenes with web services and membership; Meihua Zhai for her

work on data; and Jackie Lewis for conference planning.

I especially thank Laurel McFarland, an efficient and creative Executive Director,

who among many other accomplishments over the past few years has extended

NASPAA’s presence externally, especially in Washington, D.C.

But most important, I thank all of you for the amazing work you do in your

schools and programs and your commitment to working with NASPAA to truly

make us the “Global Standard for Public Service Education!”

JPAE 19(1), 1–8


Five Factors Transforming Public Service Education

This is a very exciting time for NASPAA! The mission of NASPAA could not

be more pertinent and critical than it is today: “To ensure excellence in education

and training for public service and to promote the ideal of public service.”

To accomplish this mission, however, I believe that NASPAA needs to change.

It must transform from an intimate, club-like group of deans, directors, and faculty

who look inward to become a major professional and trade association that looks

outward to promote public service, relevant research, the role of our graduates in

society, and the competitiveness of our schools and programs.

This internal and external focus is essential. Several factors are dramatically

changing higher education and are having a major impact on our schools and

programs. These factors present significant challenges and potentially new

opportunities for us, but only if we rise to meet them. My hope is that by becoming

more aware and engaged externally, we will be better prepared to adapt internally

to survive and thrive in the future.

So, I would like to outline five factors that are changing the world of public

service and that shape my vision and priorities for NASPAA to become a more

outward-looking organization.

The first factor is the challenge to the legitimacy of democratic government.

This factor is best described by John DiIulio this year in his remembrance of James Q. Wilson in Public Administration Review (DiIulio, 2012). He writes that

Wilson worried most “not about crime or any other single policy issue, but about

the health of the institutions that made or implemented the nation’s policies.”

He states that Wilson worried that the “compromise, ambiguity, and contradiction”

of democratic governance would increasingly confound the quality of administration

and public policy, and that this would lead political elites to ever-greater ideological and partisan polarization and lead citizens to openly doubt the “legitimacy of government itself.”

What are the implications for our schools? First of all, students will no longer

come to our schools from around the world simply because the American government

used to be the global standard in democratic governance. Our nation’s leaders

have failed in recent years to work together through our democratic institutions

to address the pressing and long-term problems of the country. Consequently,

public affairs schools must gain greater recognition as leaders in our own right on

behalf of instilling public service values, research, education, and training.

This crisis of legitimacy and growing polarization should inspire NASPAA

to increasingly look outward to strengthen efforts to promote public service

education, to make sure that the best and the brightest students are going into

government as well as the private sector. It is also about developing ways for our

schools to engage with government in providing reasoned, expert advice, data,

and research findings.

John Kennedy stated, “Today we move along the knife-edge path which requires a Government service more highly skilled than ever before… Government service must be attractive enough to lure our most talented people” (Kennedy, 1961). This has never been more true than today.

I also believe that this factor calls NASPAA to promote and help develop

undergraduate, interdisciplinary education in public affairs that instills public

service values and knowledge of policy analysis and democratic governance.

It is critical that this public service education instill a strong sense of character,

not just technical information and knowledge.

The second factor is cross-sector governance, which is dramatically changing

the way in which our society and other societies address public problems.

Our governance system is much more cross-sector than when our profession began in the 1920s. It involves business, nonprofits and philanthropy, the media, the community, and government.

The implication is that the realities of cross-sector governance out in the real

world mean the end of graduate education for a single sector (as in government

sector, or business sector). It means that the boundaries of public service, and

hence the organizational domain of NASPAA, are permeable and changing.

Business schools now claim to prepare students for public governance, not just

business, including a focus on nonprofits, social entrepreneurship, health management,

and leadership. The same is true for schools of communication, such as

the USC Annenberg School. At the same time, public affairs schools that focus

on a single sector threaten to make our degrees largely irrelevant to the next

generation of students.

Students do continue to resonate with Dr. Martin Luther King, Jr. when he

stated, “Life’s most persistent and urgent question is, ‘What are you doing

for others?’” But while there is a hunger among young people today for public

service, do these students want to work for government? Do they see our schools

as the place to realize their public service passion? Often the answer is no. At the

Price School and other schools, we see a growing number of our graduates who

end up working in all three sectors.

Since our public service domain is becoming increasingly competitive and

contested, we need to embrace all of our major fields, including public policy,

public administration, and public affairs. This is one reason why we have decided

to propose a new name for NASPAA to encompass our diverse major fields.

We also need to consider accrediting public policy and public administration

programs together rather than separately.

It is imperative that we become more innovative. We must innovate in our

types of degrees and focus and make strategic choices about leadership studies,

social innovation, nonprofit management, and health management and policy.

Innovation should drive our knowledge development and curriculum in

strategic communications, acquisitions, and private sector contracting, negotiation,

communications, network management, and public private financial management

and regulation.


We thus face a growing tension between NASPAA standards and accreditation

and the multiplicity of economic and business models and institutional forms

our schools must adopt for survival in this competitive environment.

The third factor is the dramatic advance in technology. This advance

threatens our traditional model of public affairs education down to its very core

identity: It is bringing an end to the geographic monopoly that many public

affairs programs have enjoyed, as the “only game in town” in a particular area.

Online education is inevitable, and it is creating global virtual campuses.

Today technology also is changing how we provide public service education.

The Price School, for example, which includes urban planning as well as public

policy and administration, is increasingly using spatial GIS analysis and

visualization technologies in addressing issues of land use, sustainability, health

care, transportation, housing, and urban development. In partnership with the Southern California Area Government, we are building a spatial and visual analysis laboratory.

We use asynchronous learning, blogs, chats, and electronic case studies.

An important implication for NASPAA is how to accommodate and

facilitate online degrees and learning in the standards and accreditation process

and how to become a resource for schools who want to develop Internet and

other advanced technologies in the classroom.

The fourth factor is the growth in globalization. The professions are globalizing.

In not many years, we will be benchmarking against both domestic and overseas programs, and a NASPAA without leading non-U.S. schools among its members will be an increasingly parochial organization.

Internationally, our field is seeing dramatic growth. There are over 100 public

administration programs in China alone, and even Britain has established new

public policy schools at the University of Edinburgh, Oxford University, and

University College London. The performance of research and education in our

field is shifting to a global stage.

Many of our comprehensive schools are forming alliances with schools

abroad for joint degrees, developing global policy and management programs,

and other collaborations.

It is very important that NASPAA, as the global standard in public service

education, engage with, learn from, identify best practice with, and ultimately

accredit top schools from other countries. We also need to engage with our

schools abroad and NASPAA’s sister associations in Europe, Latin America, Asia,

and elsewhere, to identify, and then continue to refine, the global standard, and

then require it of schools around the world.

The fifth factor is changing demographics. I live in Los Angeles, one of

the most diverse cities on the planet. People of Caucasian, northern European

backgrounds are not the majority, and Los Angeles in this respect looks like the

future for many parts of our country.


In addition, the American population and the populations of many other

countries are aging. The retirement of the baby boom generation will cause

major leadership gaps in government and the private sector.

People change careers over their lifetime more than ever before, including

working at jobs across the public, nonprofit, and private sectors.

The implications for NASPAA are multiple. The most important is the continued

emphasis on diversity in hiring, educational programs, and student recruitment.

The retirement of the baby boom generation and the aging of the population are creating

unsustainable fiscal and financial pressures that call for new models of financing

and greater expertise in fiscal and financial management.

These demographic changes also have an important implication for NASPAA’s

role in developing opportunities for executive education for mid-career officials

and others.

A Vision For NASPAA

In sum, these five factors—the legitimacy crisis, cross-sector governance,

technology, globalization, and demographics—constitute major trends that

should help define the vision and changing priorities for NASPAA.

So what do these priorities look like? First of all, NASPAA must continue to

embrace the public service mission. Our profession started in the 1920s, with the

creation of what are today the Kennedy School at Harvard, the Maxwell School

at Syracuse, and the Price School at USC.

The formation of the Price School grew out of the Progressive movement.

A group of engaged citizens lobbied the university president to establish a school

dedicated to training and educating people for public service with skills in budgeting

and financial management, human resource management, policy analysis,

and urban planning.

They also wanted students trained in ethics and strong moral character. They

wanted to overcome the corrupt and inefficient administration of the time. They

hoped to produce graduates who would transform the government to effectively

and efficiently address the public problems of the day.

While technology, demographics, and governance have changed and become

more global, this mission of public service for our school and all NASPAA schools

endures and is more important than ever.

However, the association and the schools’ contribution to society and their

role in it are endangered if we are content to stay where we are and don’t embrace

the forces of change enveloping governance and society.

It is often said, if we don’t know where we want to go, we won’t know how to

get there. As Oliver Wendell Holmes, a 19th-century American poet, physician,

and Harvard professor, astutely remarked, “The great thing in the world is not so

much where we stand, as in what direction we are moving.”


So, where do we want to go, and is NASPAA moving in the right direction

to get there? My answer is that NASPAA is not yet where it needs to be, but over the past few years it has started to move in the right direction with the Policy Issues Initiative, the International Initiative, the national curriculum initiative, and the database development.

NASPAA needs to grow up from being a somewhat intimate, club-like

organization of deans, directors, and some faculty who talk about curriculum

and accreditation to become a full-blown professional and trade association for

the public policy, administration, and public affairs fields.

This transformation will take the organization from a primary focus internally

to a strategic set of internal and external priorities.

I’ve already suggested specific ways in which external trends have important

implications for NASPAA. Let me summarize these implications to highlight

where my presidency will focus its attention.

1. Expand the external reach of NASPAA to include promotion of our

schools, relevant research, and the role of our graduates in public

service. We will do this through three broad strategies:

• First, through the Policy Issues initiative that includes working with

federal agencies to implement the U.S. Government Student Pathways

program for our students and development of a Policy Issues Café in

partnership with the federal government as a clearinghouse for the

application of our expertise in public problem solving;

• Second, through a close working relationship with APPAM in particular

and our other sister associations of ASPA and NAPA to include

extensive joint database development not only for accreditation but

also for understanding our programs, students, and schools; for

promotion to the external world; and through relating new competencies

in education to research priorities and outreach to policy

makers across the sectors. This includes holding the 2014 conference

jointly with APPAM.

• Third, through exploring ways to foster student engagement and

recognition, such as a case competition or other activity.

2. Foster cross-school and program learning to increase our schools’

competitiveness in at least two ways:

• Through the formation of national working groups on state-of-the-art

curricula and best practices in the schools on key new competencies,

including financial management and regulation, acquisitions, negotiation,

strategic communications, network management, human capital,

and resource management; and

• Through capturing and disseminating to the schools the learning on

best practices and innovations encountered in the accreditation and

other review processes.


• Through fostering better learning among types of schools that

share common interests, such as the comprehensive schools or

the small programs.

• Through strengthening learning about the recruitment of diverse faculty and students and about developing programs in social justice and other topics related to diversity.

3. Identify ways to address the multiplicity of institutional forms,

degree formats, and economic models that our member schools must adopt to compete, survive, and thrive. We will do this through

the following:

• Develop proposals on how NASPAA can promote undergraduate

education in public policy, administration, and public affairs.

• Focus on the implications of online degrees for accreditation and

the role of NASPAA as a resource to the schools in using educational

technology.

• Engage the standards committee on the potential tensions between

developing multiplicity for competitiveness and the processes of

NASPAA standards.

4. Focus on competitive areas including social entrepreneurship, nonprofits,

leadership, and executive education and lifelong learning.

5. Provide further support for the International Committee in the

following ways:

• Developing ways to promote learning across schools about global competitive strategies and activities;

• Promoting relationships with international public affairs associations in Europe, Asia, and Latin America through student and faculty exchanges;

• Pursuing membership and accreditation for top-quality schools and programs abroad; and

• Beginning to understand global standards in public policy, administration, and public affairs education.

To achieve these goals, NASPAA must move to embrace changes occurring

in governance and society; it must transform itself into a full-blown professional

association with an internal and external strategic agenda. It must rise to the

challenge of stiffer competition, and with the right vision, innovation, and strategies

—as the title to this conference so aptly emphasizes—turn these challenges into

new, great opportunities to build a stronger and better association in support of

public service education for the future.

I believe that NASPAA is moving in the right direction, and I look forward

to working with all of you to make significant progress along this exciting path

over the coming year. Thank you!


References

DiIulio, J. J., Jr. (2012). The legitimacy of government itself. Public Administration Review, 72(4), 485–486.

Kennedy, J. F. (1961). Message to the federal civil service. Civil Service Journal, January–March.

Jack H. Knott is the Erwin and Ione Piper Dean and Professor at the Sol Price

School of Public Policy at the University of Southern California. He is a scholar

in the fields of organization theory, public management, and public policy. His

research focuses on the impact of institutions and decision-making processes on

public policy and on government and bureaucratic reform. He is an elected fellow

of the National Academy of Public Administration.



Emergency Management, Homeland

Security, and Public Administration

—from the Field to the Classroom:

An Introduction to the Symposium

John J. Kiefer

University of New Orleans

Emergency management, and lately homeland security, is at its core public

management. The skills forged in the crucible of crises are well suited to a broad

range of public management situations and career paths other than that of

Emergency Manager (or Homeland Security director). An appropriate career

path for an emergency manager is to move “upward” to positions of more

responsibility such as city manager or chief administrative officer (CAO). In

some communities, city managers and CAOs eventually seek elected office,

bringing with them a deeper understanding of how communities can mitigate,

prepare for, respond to, and recover from disasters.

“Emergency management is the management of risk so that societies can

live with environmental and technical hazards and deal with the disasters that

they cause” (Waugh, 2000, p. 3). The lessons brought from field research can

play an important role in educating future public managers, even those who do

not seek to specialize directly in emergency management or homeland security.

For many years, the American Society for Public Administration (ASPA) Section

on Emergency and Crisis Management (SECM) has actively fostered a nexus

between disaster-related scholarship and mainstream public administration

theory and practice. A quick review of the programs at recent ASPA, Southeastern

Conference of Public Administration (SECoPA), and Northeastern Conference

of Public Administration (NECoPA) conferences shows a broad range of papers

related to emergency management and homeland security and panels directly

related to public administration theory and practice. The 2013 ASPA national

conference has a hazard/disaster-related overall conference theme of “Governance

& Sustainability: Local Concerns, Global Challenges.” All are indicators of the

long-term success of SECM’s efforts to tightly integrate emergency management

and public management.

JPAE 19(1), 9–12


Many public administration courses have integrated lessons learned from

disasters even though not specifically directed at emergency management students.

They offer appropriate lessons for a broad range of public managers. Before 9/11

and Katrina, many of us read or used “The Blast in Centralia No. 5: A Mine

Disaster No One Stopped” by John Bartlow Martin, included in Richard Stillman’s

classic public administration overview textbook (Stillman, 2010). “Crisis management”

and “disaster policy” can be offered as modules in leadership or policy courses

respectively, or even as entire courses within a public administration program.

Education in decision making can be honed through real-world examples of

uncertainty and ambiguity. Intergovernmental relations courses can include

lessons from disasters, or can be offered with a single focus on the complexities

of intergovernmental relations during and post-disaster. The service-learning

requirement of public administration programs can be linked closely to emergency

management and homeland security.

In late 2011, SECM invited a group of active members to present their views

of how they moved important lessons from their field research to their

classrooms. A broad range of proposals was submitted, and the top ones were

selected for presentation at an SECM-sponsored panel at the 2012 national

ASPA conference in Las Vegas. The three articles in this symposium provide

cutting-edge examples and practices from scholars whose research focuses on

emergency management and homeland security issues but who also integrate the

lessons from their research into their mainstream public administration courses.

Stewart and Vocino begin the issue by addressing the role of higher

education in meeting the demands of a rapidly increasing market for public

managers skilled in securing the homeland and managing the consequences of

attacks and disasters. In the wake of 9/11 and Hurricane Katrina, homeland

security and emergency management programs enjoyed exponential growth

across the spectrum of community colleges, baccalaureate, and graduate

programs. However, this explosive growth has been characterized by the failure

to identify common competencies within the broad fields of homeland security.

Although this lack of common competencies has been long identified as a

problem in the field, only recently have scholars and practitioners been

systematically meeting to identify and agree upon basic standards for program

graduates at all levels of higher education. The authors provide some important

and sorely needed research-based guidelines for the establishment of common

standards for disaster and hazard-related programs.

Kapucu and Knox continue with a discussion of the integration of service

learning into emergency management education programs. Their research shows

that a significant number of emergency management programs incorporate service

learning into their curriculums, yet many face challenges in doing so. The authors

capably support the importance of service learning as a pedagogical tool, and they


provide a framework for integrating service learning into emergency management

degree and certificate programs throughout the United States. They conclude

with several sound recommendations for effectively implementing service learning

into emergency management and homeland security curricula.

Comfort and Wukich conclude the symposium by examining the important

challenge presented by trying to “teach” students to make effective decisions

under conditions of uncertainty. They argue that the ability to comprehend a

process for making sound decisions requires the student to move beyond the

traditional learning paradigm. The authors stress that students must understand,

recognize, and minimize risk, a challenge for the classroom environment. They

suggest a system for identifying the skills needed to make such decisions through

a curriculum that focuses on elements of administrative process and provide

suggestions for their delivery.

As the symposium’s authors have noted, special skill sets are often required of

those teaching in such specializations due to the unique problems presented by

decision making under conditions of uncertainty. The authors have clearly outlined

the most important problems that must be overcome to provide meaningful

experiential learning for public administration students.

This symposium addresses many of the important challenges that have emerged

due to the rapid growth of emergency management and public administration

programs over the past two decades. Academic and professional organizations such

as ASPA, NASPAA, IAEM (International Association of Emergency Managers),

EMI (Emergency Management Institute), SECM, HSDEC (Homeland Security

Defense Education Consortium), and others have started the process, but much

work remains to be done to identify and agree upon common standards for

homeland security and emergency management specializations.

In early 2011, then-ASPA president Erik Bergrud convened an ASPA task force to make recommendations to ASPA on accrediting homeland security

and emergency management programs. I’ve had the privilege of chairing that

task force, and we’ll make our recommendations to the ASPA National Council

at the 2013 national conference. Still, there are significant obstacles to overcome.

Once core standards and competencies have been crafted, it will be essential to

examine the best ways to measure those special competencies. For public

administration programs, this effort will likely evolve into a role for NASPAA,

much like the existing requirements for specialization in nonprofit

management/leadership. It remains to be seen how well homeland security

and emergency management programs not based in public administration

departments and schools will respond to and accept the new standards.

We welcome the challenges, and look to the broadest participation as we

move forward.


References

Stillman, Richard J. (2010). Public administration: Concepts and cases (9th ed.). Belmont, CA: Wadsworth.

Waugh, William L., Jr. (2000). Living with hazards, dealing with disasters: An introduction to emergency

management. Armonk, NY: M.E. Sharpe.

John J. Kiefer is the director of the Master of Public Administration program

at the University of New Orleans, and an associate professor of political science.

He is also the immediate past chair of ASPA’s Section on Emergency and Crisis

Management. His research and teaching focus on the development of outcome-focused

collaborative networks to create disaster resilience in organizations and

communities. He is or has been either principal investigator or a research team

member for projects that include elderly evacuation, children’s readiness for

disasters, technology initiatives for vulnerable populations, repetitive flood loss

mitigation, disaster resiliency studies, and a disaster-resilient university. He has

published in numerous journals, including Public Administration Review, Public

Works Management and Policy, and Journal of Emergency Management. His first

book, Natural Hazard Mitigation, co-edited with Alessandra Jerolleman, was

recently published.



Homeland Security in Higher

Education: The State of Affairs

Kendra B. Stewart

College of Charleston

John Vocino

U.S. Government Accountability Office

Abstract

In the past two decades, the number of institutions of higher education offering

degrees in homeland security has increased exponentially. This rapid growth,

brought on by external factors, has led to some discussion about the ability

of programs to address the needs of the field. This article is an overview of

the state of higher education (college and technical school) programs in the

fields of homeland security and emergency management. The authors look at

the rapid growth of these programs in the U.S. system, explain the lack of shared learning outcomes and standards in the field, describe how these issues have evolved, and conclude by offering some criteria and guidelines

based on recent studies and organizational needs for such programs.

Keywords: homeland security, emergency management, higher education

JPAE 19(1), 13–29

Catastrophes like the terrorist attack on Oklahoma City in 1995, the attacks

on the World Trade Center in 1993 and on September 11, 2001, along with

natural disasters such as Hurricane Katrina in 2005, have raised the nation’s

awareness of natural, accidental, and human-made risks. The costs of natural

disaster have been increasing exponentially, largely due to increases in population

and wealth density in disaster-prone areas. Even when accounting for the

exponential rise in gross domestic product (GDP) over the last four decades, the

costs of natural disasters have tripled (Blanchard, 2008). In the decade following

9/11, the United States—federal and state governments, private and nonprofit

sectors—have invested countless billions in increasing our nation’s preparedness



to all hazards and other possible acts of terrorism on U.S. soil. Mueller and Stewart

recently estimated that the nation’s investment in domestic homeland security

over this period has amounted to $1 trillion (Mueller & Stewart, 2011). The growing

priority of securing the homeland and managing the consequences of attacks and

disasters has brought about corresponding adaptation and response by the nation’s

higher education sector to meet the demands of the market.

This article begins with an overview of the state of higher education (college

and technical school) programs in the fields of homeland security (HS) and

emergency management (EM). We look at the rapid growth of these programs

in the U.S. system, explain the lack of shared learning outcomes and

standards in the field, and discuss how this situation is evolving. We conclude by

offering some criteria and guidelines based on recent studies and organizational

needs for such programs.

Overview of Programs

Emergency management and homeland security are in much the same position

as public administration always is—striving to find a way to relate theory to

practice and education to professional identity. Scholars who have examined this

issue of developing emergency management and homeland security as a discipline

(Plant, Arminio, & Thompson, 2011) have quoted Dwight Waldo’s observations

that public administration should be a profession without actually becoming one

in the strictest sense, leaving behind the true professional designations, institutionalized

standards for entry, licensing, training, and clear ethical guidelines. The

leaders in public administration in the 1960s saw the need for professionalization

in the field, yet no professional identity emerged for those engaged in the work

of the public sector.

The terrorist attacks of September 11, 2001, found most universities as

unprepared to deal with the “new normalcy” as government agencies. As has been

well documented in the literature, emergency management and homeland security

courses and programs sprang up quickly to meet demand and opportunity

(Bellavita & Gordon, 2006; Cwiak, 2011; Drabek, 2007; Kiltz, 2011; McCreight,

2011; Polson, Persyn, & Cupp, 2010). Plant et al. (2011) concluded that whether

homeland security is a true profession or not is inconsequential when one considers

its universal relevance. Like public administration, homeland security needs to act

as if it is an emerging profession in its attention to the most important aspects of

any professional grouping: a concern for subject matter expertise, shared values,

and a commitment to dialogue and excellence among its members.

Tierney noted in 2005 that disasters often serve as “focusing events” leading

to the development of new legislation, policies, and practices. On September 10,

2001, there was no single federal agency for homeland security. Most of the U.S.

states did not have a Director of Homeland Security, a budget line for homeland

security activities, or an agency committed to coordinating these efforts. By the


end of that year, things had changed significantly. A federal office was created

(and in November 2002, a federal agency), and all 50 states had created a position

to coordinate homeland security efforts. Today, every state and most large localities

have some form of governmental organization tasked specifically with homeland

security efforts (compared to seven states and only a handful of cities before

September 11, 2001). And, since the creation of the Department of Homeland

Security grants programs in 2003, state and local governments have been granted

billions of dollars in funding for improving their preparedness efforts. It is an area

of employment that has been growing rapidly over the past decade. And although

it is hard to pinpoint a single number representing homeland security/emergency

management employment in the United States, the growth in personnel at the

Department of Homeland Security is evidence of this continued upward trend

(see Table 1). In addition, this number does not include the over 200,000

contractors the agency employs annually. This growth is projected to continue.

The Bureau of Labor Statistics expects future growth in

several fields under the umbrella of homeland security, such as cybersecurity

(http://www.bls.gov/oco/cg/cgs041.htm#emply).

Table 1.
U.S. Department of Homeland Security Full-Time Employees

Year      Employees
2007      168,344
2008      179,871
2009      189,507
2010      191,063
2011      199,492

Although many homeland security jobs are with the federal, state, and local

governments, both for-profit and nonprofit organizations also hire in the field

of homeland security. According to the Bureau of Labor Statistics’ (BLS)

Occupational Outlook Quarterly on Homeland Security, security is one of the

largest areas of employment in the private sector (Jones, 2006, p. 4). The field of

homeland security is broad and varied, and according to the BLS includes careers

in areas such as Information Security, Law Enforcement, Emergency Management,

Infrastructure Protection, Business Continuity, Intelligence Analysis, and Physical

Security to name a few (Jones, 2006). Developing programs to prepare graduates

adequately for such a variety of positions can be challenging for academic institutions

when it comes to curriculum development.


These increased employment opportunities are likely a main factor contributing

to the growth in interest in homeland security by both students and colleges and

universities. The Federal Emergency Management Agency’s (FEMA) Emergency

Management Institute charted this growth of higher education programs in emergency

management over the period from 1983 to 2007 (see Figure 1).

Figure 1.

Emergency Management College Programs by Year

[Line graph: number of emergency management college programs per year, 1983–2007.]

Source. FEMA (2011).

As of October 2012, the number of colleges with specialized programs in

emergency management has grown from zero in 1993 to 259, according to FEMA’s

Emergency Management Institute (see Figure 1). This includes

• 67 certificates, minors, diplomas, tracks, and focus areas

• 50 associate degrees

• 46 bachelor degrees

• 87 master’s-level programs

• 9 doctoral-level programs

FEMA’s Emergency Management Institute (EMI) also has identified 131 higher education programs related to Homeland Security, Homeland Defense, and/or Terrorism. Additionally, there are 31 programs related to public health, 10 U.S. international disaster relief/humanitarian assistance programs, and 29 “other related” programs.

Tierney’s point, noted earlier in this work, regarding disasters serving as

“focusing events” can be seen in the growth in emergency management programs

evident in Figure 1. The first bubble of growth occurs in 1995—following the

Oklahoma City bombing—followed by unabated growth through 9/11 in 2001

and by Hurricanes Katrina and Rita in 2005. Simultaneously, this period has also

seen growth in programs focusing on homeland security. U.S. News and World Report identified homeland security and cybersecurity as two of the nine hottest college majors in 2011 (Gearon, 2011). According to U.S. News, job demand for cybersecurity, or information assurance (teaching students to spot and fix vulnerabilities in the nation’s information infrastructure), has grown “tenfold over the last 10 years.”

Further, the article identified a high interest in public health programs, which may

stem from the heightened awareness of pandemics such as SARS and influenza

viruses circulating in animals that pose threats to human health, the spread of

antibiotic resistance, or health care reform.1 As of 2009, some 140 institutions

offered an undergraduate major, minor, or concentration in homeland security,

according to the U.S. News article.

These programs have grown in absolute numbers as the higher education

community has tried to respond to national priorities and increased awareness of

its homeland security role under the War on Terror, as well as a greater appreciation of the effects of catastrophic disasters following Hurricane Katrina. With this rapid growth in program development come questions about consistency in curriculum quality, student learning outcomes, and industry standards and guidelines, as well as the need for accreditation.

Curriculum Overview

In 2007, Rollins and Rowan noted that higher education in the homeland security discipline was an evolving, ungoverned environment of numerous programs purporting to prepare students for various positions of responsibility (Rollins & Rowan, 2007). Many programs were established using existing institutional courses that have since been revised to address issues related to homeland security.

As the discipline continues to mature, program commonalities, core teaching areas,

and course standardization may emerge that shape the homeland security academic

environment and produce graduates conversant with a standard set of homeland

security topics.

Although the homeland security academic environment does not appear to have

matured to the point that common core courses are being taught at any level of higher

education, the education community is still debating whether establishing criteria and standardizing the curriculum is optimal at this time. Members

of both the academic and the professional communities are concerned that the

diversity of issues related to the discipline does not lend itself to identifiable core

teachings, and that homeland security, especially when one includes


the cybersecurity mission, is currently too fluid. Rollins and Rowan

(2007) noted that homeland security practitioners and academicians agreed that

greater attention is needed for the role and utility of homeland security as a permanent,

well-understood discipline. Much consensus also exists around the idea that for the

field to mature, the homeland security environment must be further defined; this

in turn would provide for the development of core educational objectives (Rollins &

Rowan, 2007). However, Drabek (2007) notes that curricula reflecting homeland security issues and competencies have been established following the terrorist

attacks of 9/11.

Although no clear consensus exists on the curriculum that should be delivered in this area, one institution is working closely with the leading agency in the field to develop and deliver coursework. The Naval Postgraduate School (NPS) Center for Homeland Defense and Security (CHDS) has been the nation’s premier provider of homeland security graduate- and executive-level education since 2002. NPS and the U.S. Department of Homeland Security (DHS) are collaborating to pioneer the development and delivery of homeland security education programs for governors, mayors, and senior homeland security leaders across a wide spectrum of disciplines in local, tribal, state, and federal

governments, and the military. Given the time CHDS has spent in the homeland

security academic environment, its ongoing interaction with homeland security

entities throughout the nation, the use of its graduate program as a model for

numerous other universities, and the collaborative activities occurring with DHS,

many institutions look to this organization for guidance on homeland security

academic issues (Rollins & Rowan, 2007, p. 14).

Regarding curriculum standards, the field of emergency management appears to be somewhat further along than homeland security. According to the research,

emergency management has become more professionalized as a field over the past

three decades (Drabek, 2007). An important part of this transformation has been

the explosive growth in higher education programs designed to provide the fundamental

knowledge and skills required of emergency managers (Blanchard, 2005).

Before this unexpectedly rapid growth, FEMA created a Higher Education Program

in 1994 to serve as the “nation’s leading focal point for emergency management

higher education, foster the professionalization of the field via educational efforts,

and contribute to a more resilient nation by creating a cadre of professional emergency

managers” (Blanchard, 2008, p. 5). This program has worked to align higher

education standards and curriculum with the needs of the field.

Recently, the FEMA Higher Education Program organized a collaborative effort

of emergency management academics that produced Curriculum Outcomes describing

the knowledge and skills expected of a person possessing an undergraduate degree in

the field of emergency management (Cwiak, 2011). Although this document “is a much needed step forward and will serve the field well,” it does not address


all the existing issues and “is just one step of many to come as the emergency management

community seeks to standardize what those holding emergency management degrees should know” (Cwiak, 2011, pp. 11–12). As Cwiak points out, graduate education and research agenda priorities in emergency management still need to be

addressed. The efforts of FEMA around higher education seem to have helped

advance the discussion of shared learning objectives and outcomes in this area.

Perhaps as DHS continues to evolve, the agency will play a similar leadership

role in the field of homeland security.

Challenges to Curriculum Development

One challenge to developing a straightforward set of standards is that initiatives for increased integration between emergency management and homeland security curricula are constrained by important cultural differences, future governmental policies, disaster events, and other external factors (Drabek,

2007). These factors include the nation’s War on Terror priority that dominated

all-hazards disaster management during the first decade of this new century. This

period also emphasized the teaching of a top-down management paradigm reflective of homeland security, rather than the alternative emergency management model,

which emphasizes cooperation, not command; coordination, not control. These

factors have created tension over the intergovernmental nature of emergency

management versus homeland security. For example, homeland security emphasizes

“the crime scene” nature of the disaster setting and the important roles played by law

enforcement and intelligence agencies, which gather information to thwart potential enemy attacks and to quickly capture those who succeed in carrying out their plots. In the homeland security paradigm, local officials are recognized,

but the role of the federal bureaus rises to the top of the agenda (Drabek, 2007).

Again, this culture differs significantly from the world of emergency management

as it is practiced within most local communities.

These competing cultures are so strong that they even exist in national doctrine

such as the National Preparedness Goal. This conflict is evident in how the Goal

operationalizes missions such as Mitigation versus the Prevention and Protection

missions (FEMA, 2011). DHS, FEMA, and their partners and national stakeholders

are currently working on developing national frameworks for Prevention, Protection,

and Mitigation. These frameworks are all deliverables of Presidential Policy Directive

8 and set the foundation for the implementation of each mission area. As part of

this directive, the frameworks lay out key roles and responsibilities among all the

partners, including local, state, tribal, territorial, and federal governments; the

private sector; voluntary, faith-based, and community organizations; and the public

(http://fema.ideascale.com/a/ideafactory.do?discussionID=57956). The National

Preparedness Goal separately distinguishes protection, mitigation, and prevention

in a way that is unclear given the stated definitions.2 Distinctions among these


three missions become clearer upon examining the core capabilities list for the Goal (see Table 2), which identifies the steps that can be taken to achieve Mitigation, Protection, or Prevention. Note that Protection and Prevention share mostly the same terrorism-centric capabilities, which differ from the core capabilities of Mitigation. This structure suggests that these missions reflect the priorities of the homeland security community, which is focused on law enforcement, intelligence,

and homeland defense. This national planning effort highlights one of the challenges

that exist in developing common curriculum standards due to the competing cultures

within the fields of homeland security and emergency management.

Table 2.

National Preparedness Goal: Unique Core Capabilities

Protection: Intelligence and information sharing; Interdiction and disruption; Screening, search, and detection; Access control and identity verification; Cybersecurity; Physical protective measures; Risk management for protection programs and activities; Supply chain integrity and security

Prevention: Intelligence and information sharing; Access control and identity verification; Interdiction and disruption; Screening, search, and detection; Forensics and attribution

Mitigation: Community resilience; Long-term vulnerability reduction; Risk and disaster resilience assessment; Threats and hazard identification

Professional Education

In October 2006, the Post-Katrina Emergency Management Reform Act (PKEMRA)3 charged FEMA with responsibility for leading the nation in developing

a national preparedness system, of which training and education are key elements.

In 2009, the Government Accountability Office (GAO) reviewed FEMA’s efforts

to implement PKEMRA, which established a graduate-level Homeland Security Education Program in the National

Capital Region (NCR). PKEMRA acknowledged the need to provide educational

opportunities to senior federal officials and selected state and local officials with


homeland security and emergency management responsibilities. It also required the leveraging of existing resources, as well as the establishment of student enrollment

priorities and selection criteria and employee service commitments. As a result,

the Naval Postgraduate School’s Center for Homeland Defense and Security,

FEMA, and DHS have created an 18-month Homeland Security Master’s Degree

Program for the NCR. The program is taught, and the degree awarded, by CHDS using the approved curricula. The program employs adjunct

faculty from universities and colleges across the United States and is open to DHS

employees at the GS-13, GS-14, GS-15, and exceptional GS-12 levels, as well as

to other federal and nonfederal employees.

FEMA has developed new strategies for its training and education policies.

According to FEMA’s 2011–2014 Strategic Plan, FEMA will realign and enhance

existing and emerging training and education programs for state, local, and tribal

emergency management officials, and FEMA employees, into a comprehensive

emergency management curriculum. FEMA’s Emergency Management Institute

(EMI), Center for Domestic Preparedness (CDP), and the FEMA-sponsored Center

for Homeland Defense and Security (CHDS) at the Naval Postgraduate School

will play prominent roles in this effort. According to the FEMA plan, the programs

at EMI, CDP, and CHDS, along with FEMA’s other training and education providers,

are to serve as the basis for training in core competencies across four areas: foundational, technical, management, and leadership. FEMA’s approach will emphasize

education opportunities for newly appointed emergency managers and staff from

state, local, tribal and territorial, and federal emergency management offices. The

training will focus on the core competencies to provide new practitioners with

broad and generalized knowledge and skills in the field of emergency management

that meaningfully correlate to job performance. It will also build on executive-level programs that develop strategic leadership competencies and foster collaborative

action among current and future emergency management leaders. These professional development efforts in homeland security and emergency management could be a step toward articulating agreed-upon curriculum standards for bachelor’s and master’s degrees in both of these areas as well.

Accreditation

Currently, no official organization accredits undergraduate or graduate programs specifically in homeland security or emergency management. These programs

do exist in colleges and universities accredited by regional accrediting organizations

(such as the Southern Association of Colleges and Schools) and within accredited

programs such as Master of Public Administration (MPA) programs. So, although the National Association of Schools of Public Affairs and Administration (NASPAA, the MPA accrediting organization) and the regional accrediting organizations


do not accredit homeland security or emergency management programs specifically,

they do examine these courses and programs with the same attention they give to

all other courses and programs offered. There is a check on the general quality of

these classes, offering some assurance that they meet basic guidelines and standards

in advancing an institution’s goals and mission.

Other organizations will accredit specific types of homeland security programs,

such as information security. For example, the National Centers of Academic

Excellence in Information Assurance Education Program (CAE/IAE) accredits

undergraduate and graduate programs in Information Assurance (IA) and other

similar areas. However, this accreditation does not apply to the fields of homeland

security or emergency management in general, and it still leaves open the issue of

how to ensure that graduates acquire a certain body of knowledge, understanding, or skill set in such a critical area.

Although in many academic areas no accreditation exists, it seems to be more

prevalent in fields where specific professional careers are associated with the degree (e.g., business, public administration, education, and medicine). Accreditation

can lend credibility to programs and ensure that institutions prepare students

with a particular body of knowledge and skill set. It provides accountability to

programs by setting standards for quality of instruction and program outcomes.

We do not argue that accreditation is a necessary component of higher education degree programs, only that it generally indicates a shared set of values and principles within a discipline.

Recommendations

As the field has continued to grow, so has the need for communication among

programs and between academics and practitioners. Several organizations have

either developed or refocused to address this need. The National Academic

Consortium for Homeland Security (the Consortium) comprises public and

private academic institutions engaged in activities related to current and future

U.S. national security challenges. The primary role of the Consortium is to promote,

support, and enhance academic research, technology development, education

and training, and service programs dealing with all aspects of international and

homeland security (Rollins & Rowan, 2007, p. 14). In addition, the Center for Homeland Defense and Security, along with FEMA’s Higher Education Program, has worked to coordinate discussions among academics and practitioners to

develop curriculum outcomes or focus areas within degree programs.

Based on survey input and face-to-face meetings, both FEMA and CHDS

have put together some suggested guidelines for curriculum development. These

are general core areas in which foundational knowledge and skills should be developed.

According to the FEMA work group in the Curriculum Outcomes document, an

individual with an undergraduate degree in Emergency Management should be

able to demonstrate knowledge, skills, and abilities in the following areas.


Foundational Tenets

• Historical awareness: History of natural disasters and the field of

Emergency Management

• Effective communications: Proficiency in scientific research methodology;

able to produce multiple forms of written professional documentation;

strong verbal and written communication

• Leadership, management, and decision making: Leadership, management,

and decision-making skills; strategic planning; ethics

• Personal, organizational, and professional development: A commitment

to the promotion of personal, organizational, and professional development

Core Areas

• The “Principles of Emergency Management”: Definition, mission,

concepts, and terminology used and applied in emergency management

• Human dimensions: Social, political, economic, cultural, and ecological issues; interpersonal and interorganizational behavior; disaster myths; the concept of vulnerability and the social construction of disaster; human behavior

• Policy and legal dimensions: Statutory basis of emergency

management in the public sector and federal, state, tribal and local

policies, legislation, directives, and regulations

• Areas of emergency management responsibilities: Mitigation opportunities,

planning, training, exercises, warning, evacuation, sheltering, damage

assessment, debris removal, donations management, volunteer management,

public information, federal assistance programs, and recovery

programs. Graduates should be able to understand the need to integrate

the essential stakeholders within their community (e.g., law enforcement,

emergency medical services, public health, fire, Voluntary Organizations

Active in Disaster (VOAD), public works, critical infrastructure partners,

and businesses) in order to create a community framework that reduces

vulnerability to hazards and enhances the ability to cope with disasters.

• Risk assessment process and methodology: Hazard identification, threat

analysis, and vulnerability assessment within the overlapping contexts

of the social, built, and physical environments

• Fiscal dimensions of emergency management: Fiscal responsibilities of

the private, nongovernmental organization (NGO), and public

sectors at the federal, state, tribal, and local level; internal and

external sources of revenue; budgets and expenditures; accountability;

reimbursements; grant management; resource lists; cost-benefit

analysis; mutual aid; procurement; disaster assistance funding


• Awareness and promotion of EM: Recognize and promote the awareness

and advancement of emergency management through the involvement

of political leaders and key decision makers, policy advocacy, stakeholder

engagement, partnerships among practitioners and scholars, public

education, and involvement in professional organizations

• EM standards, best practices, and comparative practices: Existing standards;

best practices and comparative perspectives; current, ongoing, and

developing societal and technological changes (Jaffin et al., 2011)

Based on a study conducted by Craig Marks in 2005, the following master’s-level competencies in Emergency Management were suggested for FEMA support:

• Leadership: Incident Command/NIMS/NRP, Consensus Building,

Risk Communication

• Communications: Oral, written, and technical

• Analytical and planning skills: Preparedness and Prevention Operations,

Response Operations, Recovery Operations and Mitigation Operations

• Hazard and risk assessment: Risk Planning, Risk Management and

Business Recovery/COOP

• Government operations: Administration and Financial Management

• Training and professional development: Professional Development, Exercise

Design, Evaluation, Development and Execution (Marks, 2005)

In 2009, the Center for Homeland Defense and Security sponsored a meeting

of faculty from a number of programs across the country to discuss the key curricular

components that should be included in undergraduate programs or certificates in

Homeland Security. The following major areas were identified as critical.

• Administering homeland security: Leadership; Management; Budget

and Finance; Logistics; Human Resources; Organizational Behavior

and Public Administration

• Intelligence: History, Evolution, and Current Structure of Intelligence

Community; Counterintelligence; State and Local Intelligence;

Intelligence Cycle; Covert and Clandestine Activities

• Private sector and homeland security: Public/Private Partnerships;

Business Continuity and Resilience; Private Sector Motivations;

Public Education; Public Relations; Public vs. Private Organizational

Functions; Role of Private Sector in Planning

• Research and analysis: Information Literacy; Collection and Analysis;

Theory Analysis and Application; Inductive/Deductive Reasoning;

Applied Statistics; Spatial Analysis; Geographic Information Systems

(GIS); Evaluation Research; Quantitative and Qualitative Analysis

• Emergency management: Application of All-hazards Analysis; Federal,

State, Local, and Tribal Emergency Management; Disaster Planning

Models; Community Preparedness; Understanding Basic Concepts

Such as Mitigation, Preparedness, Prevention, Recovery, etc.; Land


Use Planning; Resilient Community Design; Exercise and

Evaluation Programs; Social, Economic, Political, and

Environmental Recovery

• Natural and human caused hazards: Types and History of Hazards;

Organizational Responses; All-hazards Approach; Developing Preparedness

and Instilling Resilience; Needs of Vulnerable Populations

• Risk management: Components of Risk, Methods for Conducting

Risk Assessments; Federal, State, Local, and Private Sector Perspectives;

Application of Prevention, Mitigation, and Recovery Strategies to

Various Sectors

• Critical infrastructure protection: Critical Infrastructure, Key Resources,

and Interdependencies; Critical Components in an Infrastructure in

Particular Contexts (State, Local, Private, etc.); Various Methods to

Achieve Levels of Protection; Financial and Operational Relationships;

Strategies, Policies, Programs, and Agencies Involved; Global Security

Threats and Hazards; Scalable Assessment Methodologies

• Strategic planning: Budgeting; Integrated Planning Systems; Risk-based and Scenario-based Planning; Deliberate and Crisis-Action

Planning; Interagency and Interorganizational Planning; Current

Policy Mechanisms; Leveraging Resources and Grants; NIMS, ICS,

NIPP, NRF

• Strategic communication: Elements of Strategic Communication;

Risk Communication; Cultural Awareness; Audience Identification;

Communication Planning; Synchronization of Messages; Maintaining

Consistency; Role of Media; Public Affairs, Education, and Emergency

Communication; Agencies and Organizations; Technology; Interoperability

of Messaging and Strategies; Community Outreach

• Law and policy: Society and Civics; Constitutional Law and Federalism;

Statutes, Executive Orders, and Directives; National, Regional, State,

and Local Policies and Strategies; International Treaties and Agreements;

Sector-specific Laws; Civil-military Relations; Policy-making Processes

and Analysis; Administrative Law; Regulatory Processes

• Technology: Role and Types used in Homeland Security; Approaches

to Framing; Ethical and Privacy Considerations; Technology Development

Cycle; Network/Cyber infrastructure Protection; Consequences;

Limitations and Interoperability

• Terrorism and counterterrorism: Definitions and Distinctions; History

and Root Causes; Theories of Motivations; Tactics and Operations of

Groups; Role of Media and Internet; Effects of Terrorism; Military

Role; Policing and Actionable Intelligence; Lack of Support;

Competition Among Terrorist Groups; Compromise, Political

Resolution and Cooption


In addition, the group recommends that four themes underlie all courses: critical

thinking, ethics, oral and written communication, and whole of society.

In his research, Drabek (2007) made three points that are relevant to this evolving

field. First, within all democratic societies, universities and other institutions of higher

learning have performed the function of helping to sort out the priorities and roles

of emergency management and homeland security. Hence, within emergency

management and homeland security programs, students must be encouraged to

critically examine current doctrine, no matter its source. It is not enough just to

“know” the book. The capacity for critical analysis must be developed, encouraged,

and protected. Indeed, it must be required of all participants, students, and faculty.

Second, the development of programs and curricula in emergency management and homeland security should be promoted and stimulated by various governmental

bureaus, but the government must not dictate. As faculty implement a wide variety

of courses, programs, and credentials of various types, including specialized

certificates and formal degrees, all governmental bureaus should maintain a clear

boundary. Their role is to nurture, not to prescribe. As with curricular innovations

of the past, the pathways will be many and marked with both successes and failures.

As the professions of emergency management and homeland security continue to

evolve, they must become more active participants in the standard-setting process.

Decisions regarding curricular content and assessments of academic excellence

must come from within these institutions and the accreditation procedures and

bodies they construct.

Finally, we must recognize that future catastrophic events will have the greatest

impact on the long-term developmental pattern of such curricular innovation. As

new disaster events occur, some of which most policy makers, professionals, or faculty cannot imagine, the field must continue to evolve and adapt (Drabek, 2007). It is important for Homeland Security/Emergency Management programs to agree upon a shared set of values and skills necessary for degree and/or certificate completion, yet to remain flexible enough to address the future changes and needs of an ever-evolving field.

Conclusion

Homeland Security and Emergency Management education will continue to

be important topics within the field of public administration. What started out

as mostly a subfield within our discipline (as well as within other disciplines) has

quickly grown into a field of its own. Along with tremendous growth in standalone

programs, degrees, and certificates, the scholarship of homeland security

has also significantly increased. Several new journals focus solely on this area—

including the Journal of Homeland Security Education, the most recent journal

covering innovative concepts and models, strategies, and technical tools connecting

education to practice.

In this article, we discuss the challenges associated with such a fast-growing

field. This past year alone, nine new master’s-level programs in Emergency


Management were added across the country. As higher education institutions develop these programs and curricula, employers are pushing for set standards

across the field so they have some assurance of the knowledge and skill set of their

workforce educated in this area. At this time, no accrediting body or core competencies

for programs in Homeland Security and Emergency Management exist, causing

some concern for consistency in the education and training of degree holders.

However, through the leadership of FEMA, the Center for Homeland Defense and Security, and leading academics in the field, there is movement toward developing standard learning outcomes and competencies for programs. This movement seems more natural for the field of Emergency Management than for Homeland Security, in part because the definition of homeland security continues to evolve.

The continuing dialogue that is taking place should in theory lead to a richer

discussion of the expectations of the higher education community in preparing

students for Homeland Security and Emergency Management careers.

Because the evolving nature of these fields still keeps measurable standards out of reach, prospective students are challenged to determine the best value for their education dollar. A similar challenge faces our nation in assessing the extent to which the growth in these programs and schools has enhanced our national capabilities in homeland security and emergency management.

Footnotes

1 See World Health Organization sources: http://www.who.int/csr/sars/en/ and http://www.who.int/influenza/human_animal_interface/en/

2 The Goal defines Mitigation as including those capabilities necessary to reduce loss of life and

property by lessening the impact of disasters. It is focused on the premise that individuals, the

private sector, communities, critical infrastructure, and the nation as a whole are made more

resilient when the consequences and impacts, the duration, and the financial and human costs

to respond to and recover from adverse incidents are all reduced. But the Goal defines Protection

as “including capabilities to safeguard the homeland against acts of terrorism and man-made or

natural disasters. It is focused on actions to protect the citizens, residents, visitors, and critical

assets, systems, and networks against the greatest risks to our nation in a manner that allows our

interests, aspirations, and way of life to thrive.”

3 The Post-Katrina Act was enacted as Title VI of the Department of Homeland Security

Appropriations Act, 2007, Pub. L. No. 109–295, 120 Stat. 1355 (2006).


References

Bellavita, C., & Gordon, E. M. (2006). Changing Homeland Security: Teaching the Core. Homeland

Security Affairs, 2(1), 1–19.

Blanchard, B. W. (2005). Top ten competencies for professional emergency management. Emmitsburg,

MD: Higher Education Program, Emergency Management Institute, Federal Emergency

Management Agency.

Blanchard, B.W. (2008). FEMA emergency management higher education program description: Background,

mission, current status, and future planning. Emmitsburg, MD: Higher Education Program, Emergency

Management Institute, Federal Emergency Management Agency.

Cwiak, C. (2011). Future of Homeland Security and Emergency Management Education: Framing the

future: What should emergency management graduates know? Journal of Homeland Security and

Emergency Management, 8(2), 1–14.

Drabek, T. E. (2007, August). Social problems perspectives, disaster research and emergency management:

Intellectual contexts, theoretical extensions and policy implications. Paper presented at American

Sociological Association, New York, NY.

Federal Emergency Management Agency (FEMA). (2011). Higher Education Program Bits and Pieces.

Retrieved from http://content.govdelivery.com/bulletins/gd/USDHSFEMA-22535a

Gearon, C. (2011, September 19). Discover 9 hot college majors: Colleges are adding degrees in growth fields, such as homeland security and public health. U.S. News and World Report. Retrieved from www.usnews.com/education/best-colleges/articles/2012/09/12/discover-9-new-college-majorswith-a-future

Jaffin, R. D., Berry, R. T., Carter, S. S., Cwiak, C. L., McEntire, D. A., et al. (2011). Curriculum

outcomes. Emmitsburg, MD: Higher Education Program, Emergency Management Institute,

Federal Emergency Management Agency.

Jones, E. (2006, Summer). Bureau of Labor Statistics occupational outlook summer 2006: Jobs in homeland security. Retrieved from www.bls.gov/opub/ooq/2006/summer/art01.pdf

Kiltz, L. (2011). Future of homeland security and emergency management education: The challenges

of developing a homeland security discipline to meet future threats to the homeland. Journal of

Homeland Security and Emergency Management, 8(2), article 1.

Marks, C. (2005). Professional competencies for the master’s level emergency manager: Knowledge systems necessary for the emergency manager of the 21st century. Blue Horizons, LLC. Presentation to the Higher Education Program, Emergency Management Institute, Federal Emergency Management Agency, Emmitsburg, MD.

McCreight, R. (2011). Future of homeland security and emergency management education: Introduction

to the Journal of Homeland Security and Emergency Management Special Issue. Journal of Homeland

Security and Emergency Management, 8(2), 1–8.

Mueller, J., & Stewart, M. G. (2011, April 1). Terror, security, and money: Balancing the risks, benefits and costs of homeland security. Paper presented at the Annual Convention of the Midwest Political Science Association, Chicago, IL.

Plant, J. F., Arminio, T., & Thompson, P. (2011). Future of homeland security and emergency

management education: A matrix approach to homeland security professional education. Journal

of Homeland Security and Emergency Management, 8(2), 8.


Polson, C. J., Persyn, J. M., & Cupp, O. S. (2010, May). Partnership in progress: A model for

development of a homeland security graduate degree program. Homeland Security Affairs, 3–25.

Rollins, J., & Rowan, J. (2007). Homeland Security Education Survey Project. Homeland Security and

Defense Education Consortium. Retrieved from www.hsdeca.org

Tierney, K. (2005, January 18–20). Recent developments in US homeland security policies and their

implications for the management of extreme events. Paper presented at the First International

Conference on Urban Disaster Reduction, Kobe, Japan.

Kendra B. Stewart is an associate professor and director of the Joseph P. Riley, Jr.

Center for Livable Communities at the College of Charleston. Her research

interests include South Carolina government, nonprofit management, homeland

security, state and local government, and women and politics. She is coeditor of a

book entitled The Practice of Government Public Relations. The articles she has

authored have appeared in The Practice of Strategic Collaboration: From Silos to

Actions; Urban Affairs Review; Public Finance and Management; Perspective in

Politics; and various scholarly books.

Mr. Vocino is currently a senior analyst for the U.S. Government Accountability Office (GAO) in the Homeland Security and Justice division, where he serves as an expert on emergency management issues. Mr. Vocino’s 25-year body of GAO work has focused on intergovernmental relationships and the effects of federal domestic policies and programs on state and local governments, including emergency management, social services and community development, and regulatory

rulemaking. Before joining the GAO, Mr. Vocino served as a county planner and

project administrator for St. Bernard Parish, Louisiana.

The information and views contained in this paper do not represent the evidence,

findings, opinions, or conclusions of the U.S. GAO, and are solely the authors’.



Utilization of Service Learning in

Emergency Management Programs

in the United States

Naim Kapucu and Claire Connolly Knox

School of Public Administration, University of Central Florida

Abstract

Emergency management academic programs continue to strive toward linking

students’ theoretical and practical knowledge before they enter the evolving and

challenging field of emergency management. This article recommends including

service-learning pedagogy in the development of emergency management programs

and curriculum to help meet this educational challenge. Results from a national survey

of emergency management and homeland security academic programs indicate

that many programs are incorporating service-learning projects in some courses.

This article concludes by discussing the benefits and challenges associated with

using service learning in emergency management programs and by presenting advice

for program directors and faculty considering implementing this pedagogy.

Emergency management academic programs have significantly increased in

the last decade, from approximately 75 programs in 2001 to over 150 programs

at colleges and universities throughout the United States in 2011 (Kapucu, 2011).

Yet, there remains an educational challenge: linking students’ theoretical and

practical knowledge before they enter the evolving and challenging field of

emergency management (McCreight, 2009). Emergency management certificates, degree programs, and curricula were created rapidly, resulting in discrepancies in the quality and rigor of the programs. Without a professional accreditation body

for these academic programs, the debate continues on how best to prepare future

emergency managers (Clement, 2011). This article recommends including service-learning pedagogy in the development of emergency management programs and curricula.

Keywords: experiential learning, service learning, emergency management

JPAE 19(1), 31–51


Service learning, one type of experiential learning, engages students in

“activities that address human and community needs together with structured

opportunities intentionally designed to promote student learning and development”

(Jacoby, 1996, p. 5). Service learning allows students to integrate theory (academic

perspectives) and emergency management practice through a facilitated individual

or group project that takes them out of the classroom and into a community setting

and enables them to comprehend the course material better (Bringle & Hatcher,

1996; Bryer, 2011; Bushhouse & Morrison, 2001; Jelier & Clarke, 1999; Kapucu,

2011; Lambright & Lu, 2009; McEntire, 2002; Ostrander, 2004). At its core,

service learning includes a “theoretical foundation with clear learning objectives,

activities, and reflective components” (Kenworthy-U’Ren & Peterson, 2005, p.

272). Well-designed service-learning projects identify community needs, as well

as student needs, before the project starts (Kenworthy-U’Ren & Peterson, 2005).

Additionally, these projects require a strong, committed facilitator throughout

the four cycles: experience, reflect, think, and act (Bringle & Hatcher, 1996). One

distinguishing aspect of service learning is the reflective nature of the experience,

in which students “gain further understanding of course content, a broader appreciation

of the discipline, and an enhanced sense of civic responsibility” (Bringle &

Hatcher, 1996, p. 222).

Although previous research highlights the effectiveness of service-learning pedagogy

in graduate public administration programs (Bryer, 2011; Campbell & Tatro, 1998;

Cunningham, 1997; Jelier & Clarke, 1999), this article narrows the focus to service

learning implemented in emergency management certificate and degree programs

in the United States. This article provides a brief literature review of service learning

as experiential learning and looks at current trends in emergency management

curriculum and program design. Using results from a national survey, this article

aims to promote a greater understanding of how service learning is used in emergency

management programs. More specifically, the article examines (a) how service learning

is being defined by emergency management faculty, (b) how service learning is

being integrated into emergency management programs, and (c) how service

learning can better the emergency management discipline. This information adds

to the ongoing discussion about creating standards, best teaching practices, and

rigorous programs and courses in the emergency management discipline (Clement,

2011; Donahue, Cunnion, Balaban, & Sochats, 2010; Kapucu, 2011; McCreight,

2009; McEntire, 2002).

Literature Review

In recent years, the mode of teaching and the ways of learning in higher education settings have shifted from theory-based education to experience-supported theoretical education. The trend toward such an approach, also known as experiential learning, is explained by several sociocultural factors ranging from a changing workforce and nontraditional learners in academic

settings to increased understanding of learning theories and the need to be closer


and more responsive to business and community (Cantor, 1997). Dewey (1938),

considered the father of experiential learning, is one of the most important scholars

of the 20th century. He focused on education and praised experiential education,

especially with the purpose of reaching community partners related to the academic

subject matter. Cunningham (1997) argues that experiential learning is an empowerment

tool for creativity that brings motivation into the learning process.

Service Learning as Experiential Learning Pedagogy

Kolb (1984), a follower of Dewey’s ideas, also worked on the theory of experiential

learning. In light of Dewey’s work, Kolb defined learning as “the process whereby

knowledge is created through the transformation of experience” (p. 38). Kolb (1984)

argues that learners grasp experience either through concrete experience or abstract

conceptualization, and transform experience either through reflective observation

or active experimentation. These approaches produce four different styles of learning

(Table 1), namely, divergers, who are imaginative and can reflect on issues from

different perspectives; accommodators, who are hands-on and experience oriented;

assimilators, who are theory oriented and can build theories through inductive

reasoning; and convergers, who tend to understand theories through application

and focus on deductive reasoning to solve problems.

Table 1.

Learning Styles

Reflective Observation Active Experimentation

Concrete Experience Divergers Accommodators

Abstract Conceptualization Assimilators Convergers

Source. From Kolb (1984).

The different cognitive, behavioral, and affective learning styles, in turn, result

in different forms of experiential learning. Some commonly used experiential learning

approaches are practicum experiences, internships, service learning (or community

service programs), cooperative education, undergraduate research, fieldwork, or

study abroad. As regards range of applicability, on the other hand, practica, internships,

and service learning stand out as the most effective tools. Renger, Wood, and

Granillo (2011) identify three key components of experiential learning:

1. The learner must be an active participant in the learning process and

have control over the direction of the learning.

2. The learning experience must be based on direct confrontation with

practical, social, personal, or research problems.

3. The learner must be able to evaluate his or her own progress. (p. 58)


Although Renger and colleagues identify these principles as important

components in designing training for disaster preparedness and response, these

principles are also applicable to designing emergency management programs

with experiential learning components.

A practicum experience involves a focused and supervised application of theory in a laboratory or field setting, whereas an internship is a structured and supervised learning experience in an agency relevant to a student’s field. Service learning, in contrast,

involves organized, structured, and supervised participation in a service that meets

a community need. Practica and internships are generally skill based within the context

of a specific profession, but service learning is an experience-based activity not limited

by skill requirements (Bringle & Hatcher, 1996). Generally speaking, students earn

academic credit for such experiential learning activities in academic settings.

Service learning has been considered one of the most influential, effective ways

of integrating theory and practice in higher education (Bringle & Hatcher, 2000).

Bringle and Hatcher (1996) define service learning as an educational experience in which students participate in “an organized service activity that meets identified community needs and reflect on the service activity in such a way as to gain further understanding

of course content, a broader appreciation of the discipline, and an enhanced

sense of civic responsibility” (p. 222). Ash, Clayton, and Moses (2009), on the

other hand, state that service learning mainly contributes to personal development

and growth, civic learning, and academic enhancement. McCrea (2004) summarizes

common definitions of service learning and provides a guide: “The service must

meet an actual community need; the learning from service must be clearly integrated

with course objectives; reflection about the service experience is essential; and the

relationship between service recipients and learners must be reciprocal” (p. 5).

Seigel and Rockwood (1993) also point to the importance of reflection as a

tool for learning from experience. Eyler, Giles, and Schmeide (1996) state that

reflection is a “transformative link between the action of serving and the ideas

and understanding of learning” (p. 14). According to Jacoby (1996), reflection

and reciprocity are two key elements of service learning.

Astin, Vogelgesang, Ikeda, and Yee (2000) conducted a longitudinal study

analyzing how service learning affects students in regard to such variables as academic

outcomes, values, self-efficacy, leadership, career plans, and future service plans.

The authors found a positive relationship between those variables: Specifically,

their findings stress the importance of reflection in making sense of theory; show

that theory and practice are mutually reinforcing; identify a shift from teaching

to learning; show the complex nature of learning interwoven with personal experiences;

and advise including service learning in the curriculum. Lambright and

Lu (2009) analyzed factors that positively affect learning in service-learning projects and found that the effectiveness of such projects depends mainly on (a) the extent to which class materials are integrated into the project; (b) whether the service-learning project is conducted


through a group or individually; and (c) whether students are full-time or part-time.

Aside from its benefits in the learning process, service learning is about community

impact. Although studies analyzing the community impact of service learning are rare, due to disagreements on the definition of community and related

variables as well as methodological problems (Cruz & Giles, 2000), some research

has been done on this subject. Service-learning projects conducted by higher

education programs do make a difference and contribute to the overall well-being

of the community, democratic citizenship, and organizations involved in the projects

(Bacon, 2002; Bushouse, 2005; Saltmarsh & Hartley, 2011; Sandy & Holland, 2006;

Waldner & Hunter, 2008). D’Agostino (2006), in turn, argues that service learning

is a tool for building social capital because of its inherent focus on collaboration

and partnerships, which also would foster development of citizenship. In addition,

service learning can contribute to community capacity building (Kapucu &

Petrescu, 2006).

Service Learning in Emergency Management

Scholars have rigorously worked on professionalization of the emergency

management discipline (Britton & Lindsay, 2005; McCreight, 2009; Wilson &

Oyola-Yemaiel, 2002). Despite these initiatives, there are no general standards,

no widely accepted body of knowledge, no appropriate research methodologies,

no set of standards and ethical norms, and no academic program accreditation

agency that represents the emergency management and homeland security

disciplines (Clement, 2011). These related efforts also have been reflected in

attempts to develop appropriate emergency management curricula and related higher education programs (Darlington, n.d.; Kiltz, 2009).

The higher education programs, in turn, are developed and evaluated today

based on three general principles: teaching, scholarship, and service. Clement (2011)

argues that the three principles are essential for developing and refining the discipline,

and stresses that teaching and scholarship are of little value unless they are

applied in the real world through community and public service. Darlington’s

(n.d.) study points to this already existing problem and argues that there is a lack

of linkage between theory and practice in higher education programs:

What the nation currently has is not a vision of needs, but rather

a reactionary mix of courses that have been assembled to respond

to specific laws aimed at specific hazards and specific responses.

Curriculums are not holistic, but an accumulation of topics related

to hazards and disasters. We need to harness this misdirected energy

in a new direction. Leadership is needed with a vision of how to link

theory and performance based training within a core curriculum of

emergency management education. (p. 11)


Thomas and Mileti (2003) also argue that practical, hands-on learning should be

incorporated into emergency management higher education curricula. Kiltz (2009)

similarly points to the importance of learning by doing in emergency management

and homeland security programs, and states that such an approach fosters and

enhances critical thinking.

With these calls to integrate theory with practice, experiential learning in

general—and service learning in particular—plays a vital role in designing higher

education curricula for the emergency management discipline. Because service

learning envisions a broader appreciation and apprehension of the related discipline

(Bringle & Hatcher, 1996), service learning in the emergency management field

offers a tool for gaining a better understanding of the field, one that is sensitive

to and based on practice and experience (Kapucu, 2011). Kushma (n.d.) is a

supporter of incorporating service learning into higher education curriculum

because it

combines educational objectives with practice environments in such

a way that students, academic institutions, and communities profit

from their experiences. The emphasis on the “real world” provides

students an opportunity to immediately use and apply their classroom

knowledge and skills, and to gain valuable feedback through

assessment and reflection. Service-learning also allows students to “try

on” a number of emergency management roles and functions, and

thus contributes to a successful transition from school to professional

setting. (pp. 3–4)

Renger, Wood, and Granillo (2011) likewise emphasize the application of

knowledge in real-world settings to prepare future emergency managers.

McEntire (2002) states that such preparation would be beneficial both in the

short and long run: In the short run, service learning helps reinforce theory and

demonstrates how it is applied in the real world; in the long run, service learning

helps students gain skills and abilities that would improve the profession of

emergency management.

Emergency management academic programs are expected to cover the following core competencies in their core curricula: comprehensive emergency

management framework or philosophy (all-hazards); leadership and team

building; critical decision making; organizational management; networking and

coordination; integrated emergency management; emergency management

functions; political, bureaucratic, social contexts; technical systems and standards;

social vulnerability reduction approach; interdisciplinary perspectives; ethics and

professionalism; and analytical and research skills (Blanchard, 2005; Cwiak, 2008;

Kapucu, 2011).


The core courses addressing these competencies are as follows:

• Introduction to Emergency Management

• Building Disaster-Resilient Communities

• Homeland Security and Emergency Management

• Disaster Response Operations and Management

• Terrorism and Emergency Management

• Political and Policy Basis of Emergency Management

• Social Dimensions of Disaster

• Principles and Practice of Hazard Mitigation

• Planning Principles

• Information Systems for Emergency Management

• Public Administration and Emergency Management

• Technology and Emergency Management

• Crisis Management
• Sociology of Disaster

• Disaster Recovery

• Research and Analysis Methods in Emergency Management

• Social Vulnerability Approach to Disasters

We expect service-learning projects in emergency management programs to help address some of the core competencies mentioned earlier, in partnership with practicing emergency managers. Overall, experiential learning in emergency management is a relatively new phenomenon, so a gap remains in linking theory and practice before graduates obtain a job in a real-world setting. Despite the lack of an agreed-upon body of knowledge, standards, research methodology, and academic program accreditation and evaluation agencies, the need for hands-on

experience and reflection about what is being learned is apparent. Service learning

presents an opportunity to fill that gap by having students engage in interdisciplinary

and collaborative projects that are beneficial to students and the community

(Kapucu, 2011).

Methods

The data source for this study is a questionnaire administered to emergency management programs in the United States. The Emergency Management Institute (EMI) at

the Federal Emergency Management Agency (FEMA) provided a list of program

coordinators and directors from 245 Emergency Management academic programs.

The questionnaire consisted of a mix of Likert-scale questions and open-ended

questions (see Appendix A).

Of the 245 contacts provided, 5 (2%) had missing contact information and

23 (9.4%) had incorrect contact information (error message received when survey


request was sent electronically). Of those 28 incomplete contacts, correct contact

information was found for seven program coordinators, and the contact list was

finalized. Before distributing the survey, we obtained approval from the institutional

review board (IRB).

Based on Dillman’s Tailored Design Method (2007), the electronic questionnaire, administered via Survey Monkey, was sent to 227 participants in early December 2012. After the initial mailing, nonresponsive participants were sent two reminder

e-mails, two weeks apart, to encourage participation. After three rounds of

encouragement, 70 (30.8%) responded. Of those responses, 50 (22%) were

usable replies. The large majority of respondents (94.3%) were the addressee

(i.e., the director of the Emergency Management academic program). Most of

the respondents (55.9%) incorporate service learning as part of the curriculum.
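The response-rate figures above follow directly from the raw counts. A minimal sketch (Python), assuming only the counts reported in the text (227 surveys sent, 70 responses, 50 usable); the function name is illustrative.

```python
# Minimal sketch reproducing the response-rate arithmetic reported in the text.
# The counts come from the article; everything else is illustrative.

def pct(part: int, whole: int) -> float:
    """Return a percentage rounded to one decimal place."""
    return round(100 * part / whole, 1)

surveys_sent = 227
responses = 70
usable_responses = 50

print(f"Response rate: {pct(responses, surveys_sent)}%")                # 30.8%
print(f"Usable-response rate: {pct(usable_responses, surveys_sent)}%")  # 22.0%
```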

Most of the respondents are male (66%) and have a master’s degree (48%); 43% of the respondents have PhDs, and 9% have a bachelor’s degree. These degrees are primarily in Public Management/Administration, Public Policy/Political Science, Criminal Justice, Emergency Management, Education, and Engineering. The respondents represent Emergency Management and Homeland Security academic programs in 25 of the 46 states surveyed. Most respondents represent university programs (70%); a smaller portion represent community or technical colleges (30%).

On average, the emergency management academic programs have been operating

for 8 years and are housed in a variety of departments and schools. The most

reported departments are Public Administration/Affairs/Management (26%),

Emergency Services/Disaster Management (15%), Public Health/Health Services

(13%), Environmental/Earth Sciences (9%), Engineering (9%), Criminal Justice

(9%), Public Safety (7%), and Business Management (7%). The least reported departments are Social/Behavioral Sciences (4%), Political Science (2%), Urban Planning (2%), and Protective Services (2%).

Findings

Emergency Management academic programs are taught in a variety of ways.

Respondents for this study indicated their three dominant modes of teaching are

online (38%), face-to-face (36%), and mixed mode (26%). The size of these

programs varies at the undergraduate and graduate level. Most of the undergraduate

programs have 20 or fewer students (34.1%), but 31.7% have 21 to 30 students,

2.4% have 31 to 40 students, 2.4% have 41 to 50 students, and 29.3% have 51

or more students. Most of the graduate programs have 10 or fewer students

(61.9%); only 11.9 % have 11 to 20 students, 2.4% have 21 to 30 students,

2.4% have 31 to 40 students, and 21.4% have 41 or more students. Table 2 provides an overview of the certificates, minors, and degrees offered by the respondents’ emergency management programs.


Table 2.
Type of Emergency Management Program Offered (Check all that apply.)

Answer Options: Response (%), Response Count
Certificate (undergraduate level): 35.4%, 17
Certificate (graduate level): 18.8%, 9
Concentration within an undergraduate major: 8.3%, 4
Minor: 18.8%, 9
Associate Degree: 33.3%, 16
Bachelor’s Degree: 31.3%, 15
Master’s Degree: 10.4%, 5
Master’s Level Concentrations: 12.5%, 6
Doctoral Degree: 0.0%, 0
Doctoral Level Concentrations: 10.4%, 5
Other: 5
N = 48
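Because the item is check-all-that-apply, each percentage in Table 2 is simply the response count divided by N = 48, so the column need not sum to 100%. A minimal sketch (Python) of that conversion, using the Table 2 counts and half-up rounding to match the published figures:

```python
# Minimal sketch: Table 2 "Response (%)" = count / N for a check-all-that-apply item.
# Counts and N come from Table 2; rounding is half-up to match the published values.
from decimal import Decimal, ROUND_HALF_UP

table2_counts = {
    "Certificate (undergraduate level)": 17,
    "Certificate (graduate level)": 9,
    "Concentration within an undergraduate major": 4,
    "Minor": 9,
    "Associate Degree": 16,
    "Bachelor's Degree": 15,
    "Master's Degree": 5,
    "Master's Level Concentrations": 6,
    "Doctoral Degree": 0,
    "Doctoral Level Concentrations": 5,
}
N = 48

for option, count in table2_counts.items():
    pct = (Decimal(100 * count) / Decimal(N)).quantize(Decimal("0.1"), ROUND_HALF_UP)
    print(f"{option}: {pct}% ({count} of {N})")
```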

For 65.1% of respondents, the primary purpose of the Emergency Management

program was pre-employment (e.g., preparation for entry into emergency

management, homeland security, or related field); for 34.9% of respondents, the

primary purpose was advancement in a current emergency management, homeland

security, or related position.

Most of the programs (53%) had 25% or fewer full-time faculty members, therefore relying heavily on part-time instructors and adjunct faculty. Part-time faculty members usually are affiliated with emergency management agencies and have significant planning, management, and training experience (e.g., in emergency management operations centers). A quarter of the emergency management and homeland security programs had between 75% and 100% full-time faculty members teaching the courses. The remaining 22% of the programs had between 26% and 67% full-time faculty.

Most of the programs (51%) reported that none of their faculty had a university degree in emergency management. Some programs (40.8%) indicated that between 25% and 50% of their faculty had such a degree. Few programs (8.2%) reported that the entire faculty had a university degree in emergency management.

However, when asked about the faculty’s work experience in emergency


management, the percentages differ. Of the programs, 34% reported that their entire faculty had emergency management work experience. Only 4.3%

of the programs reported that their faculty had no related work experience. For

25.5% of questionnaire respondents, 25% of the faculty had work experience. For

12.8% of respondents, 50% of the faculty had work experience. For 23.4% of

respondents, 75% of the faculty had work experience.

A little over half of the respondents (55.9%) indicated that their program

incorporates service learning as part of the curriculum. Service learning has been

included in the program for an average of 4.5 years. Yet, 65% of the respondents

reported that only 10% to 30% of emergency management courses include a

service-learning component. Twelve percent of the respondents indicated that

half of their courses have service-learning projects, and a few respondents (6%) said

that 75% of their courses include service learning. Eighteen percent of the respondents

indicated that all of their emergency management courses incorporate service-learning

projects. Most respondents (82%) have interdisciplinary service-learning programs.

Service learning is viewed as promoting university-community collaboration (85%)

and contributing to faculty learning about the local community (75%).

As highlighted in Table 3, for 48% of the respondents service-learning projects

are mandatory; for 47% of the respondents, the projects are voluntary. Respondents

preferred individual service-learning projects (64%) over group projects (43%).

Nearly all respondents (89%) indicated that they integrate course materials into

the projects, and the same percentage believed that student reflection is an essential

part of the service-learning process. For 79% of respondents, civic engagement is

viewed as part of their service-learning program.

Students are active throughout the service-learning process. For 82% of respondents,

students evaluate their progress throughout the service-learning project, and

72% of students actively design and execute the project. Most respondents perceive

service-learning projects as positively affecting students’ intellectual growth (96%) as

well as their research skills (79%). Three fourths of respondents include an evaluation strategy to assess the impact of service-learning projects on students’ learning.

Although faculty make a large time commitment in designing, implementing,

and evaluating service-learning projects, the questionnaire results also indicated

that many programs use students as active participants in the project. It is

recommended that the service-learning process increasingly incorporate students

in the design, execution, and evaluation phases to help them become active

participants as well as reduce some of the faculty time commitment (Ash et al.,

2009; Bushouse, 2005). (See Table 3 for the entire evaluation of service-learning

pedagogy in emergency management programs by respondents.)

Based on the literature and feedback from respondents, institutional support is necessary. As highlighted in Table 3, most respondents felt supported by their department (86%) and university (79%), yet 28% of the respondents did not have a service-learning office that provides assistance to faculty in planning, implementing, and monitoring service-learning projects.


Table 3.
Implementation of Service Learning in Emergency Management Programs
(Counts: Strongly Disagree / Disagree / Neither Agree nor Disagree / Agree / Strongly Agree; N)

The main goal of service learning in our program is to link theory with practice: 1 / 0 / 0 / 11 / 18; N = 30
Service-learning projects in our curriculum are mandatory for our emergency management students: 3 / 6 / 6 / 4 / 10; N = 29
Service-learning projects in our curriculum are voluntary for our emergency management students: 7 / 6 / 3 / 10 / 4; N = 30
Students completing service-learning projects are active participants in design and execution of the projects: 1 / 0 / 7 / 12 / 9; N = 29
Students evaluate their progress throughout the service-learning project: 0 / 3 / 2 / 20 / 3; N = 28
Course materials are integrated into the service-learning project: 1 / 0 / 2 / 20 / 5; N = 28
Service-learning projects are completed mainly as a group project: 2 / 10 / 4 / 10 / 2; N = 28
Service-learning projects are completed mainly as an individual project: 1 / 6 / 3 / 14 / 4; N = 28
Civic engagement is part of service learning in our program: 0 / 2 / 4 / 14 / 8; N = 28
Reflection is an essential part of service learning in our program: 1 / 1 / 1 / 16 / 9; N = 28
Our program has an evaluation strategy to assess the impact of service-learning projects on students’ learning: 1 / 5 / 1 / 17 / 4; N = 28
Service learning in our program better prepares future emergency managers: 1 / 0 / 0 / 14 / 13; N = 28
Service learning in our program contributes to students’ intellectual growth: 1 / 0 / 0 / 15 / 12; N = 28
Service learning in our program improves students’ research skills: 1 / 1 / 4 / 15 / 7; N = 28
Service learning in our program is interdisciplinary: 1 / 2 / 2 / 15 / 8; N = 28
Service learning in our program promotes university-community collaboration: 1 / 0 / 3 / 11 / 12; N = 27
Service learning in our program contributes to faculty learning about our communities: 1 / 0 / 6 / 15 / 6; N = 28
Service learning in our program is supported by the department: 1 / 1 / 2 / 15 / 9; N = 28
Service learning in our program is supported by the institution: 1 / 0 / 5 / 11 / 11; N = 28
Our institution has an office of service learning to assist faculty in designing and implementing service-learning projects: 1 / 7 / 5 / 8 / 7; N = 28
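The agreement percentages quoted in the text (e.g., 97% for linking theory with practice, 48% mandatory, 47% voluntary, 64% individual versus 43% group projects) follow from Table 3 as (Agree + Strongly Agree) divided by the item’s N. A minimal sketch (Python) of that calculation for selected rows; the short row labels are our paraphrases of the Table 3 statements:

```python
# Minimal sketch: percent agreement = (Agree + Strongly Agree) / N, using
# selected rows of Table 3 (counts ordered SD, D, Neither, A, SA, N).

table3_rows = {
    "Main goal is to link theory with practice": (1, 0, 0, 11, 18, 30),
    "Projects are mandatory":                    (3, 6, 6, 4, 10, 29),
    "Projects are voluntary":                    (7, 6, 3, 10, 4, 30),
    "Mainly group projects":                     (2, 10, 4, 10, 2, 28),
    "Mainly individual projects":                (1, 6, 3, 14, 4, 28),
    "Better prepares future emergency managers": (1, 0, 0, 14, 13, 28),
}

for label, (sd, d, neither, agree, strongly_agree, n) in table3_rows.items():
    share = 100 * (agree + strongly_agree) / n
    print(f"{label}: {share:.0f}% agree or strongly agree")
```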


Defining Service Learning

Respondents provided varying definitions of service learning as it applies to

their program. However, a few common themes emerged. The main theme was

applying theory or academic learning to real-world experience or practice (24%)

for a community-based need (38%). As one respondent put it: “Putting hands

and feet to brainwork. Bringing the curriculum to life.” Another respondent

stressed the mixing of academic learning with community needs: “Service learning

is the application of concepts, principles, and activities to enhance learning and

support the emergency management programs in our community.”

Another theme was enhancing the learning and growth of the organization

and community as well as that of the students. One respondent defined service

learning as “Active involvement in a project or position that contributes to the

interdisciplinary growth of our student and to the communities and organizations

they are involved with.” Another respondent stated that it also includes “learning

to empower others outside the university.”

Some respondents defined service learning by listing some projects completed

by students, including “co-ops and internships” as well as “participation in

community based projects including training exercises, functional drills, and

special community outreach projects. Interacting with professionals in the

discipline and assisting with special projects.” Lastly, one respondent directly

quoted Bringle and Hatcher’s (1996, p. 222) definition of service learning:

Service-Learning is a “credit-bearing, educational experience in which

students participate in an organized service activity that meets identified

community needs and reflect on the service activity in such a way as to

gain further understanding of course content, a broader appreciation

of the discipline, and an enhanced sense of civic responsibility.”

Benefits of Incorporating Service Learning

Respondents who are implementing service-learning programs see many

benefits. In the Likert Scale questions, 97% agreed or strongly agreed that the

main goal of service learning within their program is to link theory with practice,

and 96% agreed or strongly agreed that service learning better prepares future

emergency managers. This consensus was reflected in the open-ended questions,

in which the two benefits most discussed were applying theory to practice (41%)

and “real-world”/“hands-on” experience for the students (36%). As one respondent

elaborated:

Students get true “hands-on” experience, working with real deadlines,

in true pressure situations, and sometimes experiencing first-hand

the outside influence of politics in Emergency Management. (This

experience is quite valuable as compared to academic theory, where

the student more or less has control over their grades and deadlines.)


Additional benefits stated by a few respondents were that the service-learning experience leads to an increase in students’ “civic engagement and responsibility”; an increase in “institutional visibility”; development of students’ research and leadership skills; and the “professional contacts” students make while completing the project.

Challenges of Incorporating Service Learning

The biggest challenge in implementing service learning in emergency management

programs is the amount of time required for the student and faculty member

(43%). Many students in these programs are employed full-time (sometimes in a

field related to emergency management) or are distance learners (this case is especially

critical for online programs, because students are not co-located with the educational

institution). Therefore, finding time to complete service-learning projects

can be problematic. One respondent elaborated, “Most of our student body are

older students who have fulltime jobs and families to support. Other than scheduled

class times, they have little ‘spare time’ to devote to service learning.”

Time for faculty to develop, implement, monitor, and evaluate these projects

is seen as a challenge by some of the respondents. As one respondent stated, “It takes

a significant commitment of staff resources to identify opportunities, work with

organizations/agencies, support, monitor, and evaluate student and organizational

outcomes. We have a small part-time staff and are recruiting volunteers to help

develop our program.” A few respondents elaborated about needing enough

faculty to mentor a large number of students, “more than 200 declared majors,”

completing service-learning projects, as well as having “faculty with enough EM

[emergency management] background to evaluate the project.”

Scheduling service-learning projects within the limitation of one semester

and having to include other assessments for the students is another challenge

provided by respondents. For example, a respondent described “trying to fit this

[service learning project] in when you are lecturing, giving quizzes or tests,

assigning papers, watching videos, having guest speakers, etc.”

Other challenges mentioned by a few respondents include having “institutional

and administrative buy-in”; “balancing between a safety and applicable activities”;

“balancing between theoretical instruction and practical application”; and providing

a “realistic expectation of student capabilities to organizational personnel.”

Successful Service-Learning Projects

Most of the successful service-learning projects (45%) involved students developing

emergency management plans, manuals, or guidelines for local organizations,

including nonprofit organizations, senior citizens, local government, local businesses,

faith-based organizations, and the university. As one respondent explains, “Students

in my Developing Community Resources course are required to work closely

with a county department or agency, creating a resource manual for that agency/

department. This endeavor has been well-received in our community, with agencies

calling to request a student’s assistance.”


Other successful projects include having students develop and offer trainings

for senior citizens, immigrant populations, local businesses, elementary schoolchildren,

and other community members; hazardous materials flow studies; and

emergency exercise design, execution, and participation.

Respondents provided advice for emergency management programs considering implementing service-learning projects. Having “sufficient administrative and faculty support” was offered as both advice and a challenge among respondents implementing service-learning projects. Service-learning projects can require a significant amount of planning time, including finding the “best location for service learning” as well as “finding an appropriate organization that serves the general public.” Other planning aspects include providing “back-up support for students who get caught in community rows and political dynamics,” helping students choose their project, clearly outlining the “goals and objectives in their projects,” and monitoring the “experiences to be certain that learning outcomes are supported.”

Although service-learning projects require a significant time commitment from faculty members, respondents provided advice on how to overcome this challenge:

• “Go for the obvious community needs to help fill and don’t be afraid

to let the students take on responsibilities that at first glance may seem

‘beyond’ their training. Practical experience combined with reading

materials and writing assignments catapults students to a level of expertise

that takes four times as long (at least) in the classroom.”

• “Work closely with an Emergency Management Advisory Committee

to develop and create opportunities for service learning outside the

classroom.” (Advisory committee members can be helpful in designing

service-learning projects and finding an appropriate partner in

the community.)

• “Be very clear to organizational personnel about what types of questions

will be asked or the work to be done, and also share what

products or outcomes can be expected by the organization as a

result of their sponsorship.”

• “Extensive preplanning with the agencies so that they thoroughly understand what the student’s role is and that they are clear on their responsibilities.”

• “Collaborate with local and state professional organizations. Utilize

local professional Emergency Managers as part of instructional staff

or at least as guest lecturers.”

• “Have very willing partners willing to mentor students.”

• “Start small…build trust…community-based not community-placed.”


Conclusion

Service learning as a pedagogical tool has gained significant attention at U.S.

colleges and universities. Of course, more research is needed in analyzing the impacts

of service-learning projects on student learning and on the community. Service

learning can be an important pedagogical tool, especially in professional disciplines

such as emergency management. The article focused on using service learning as

one experiential learning strategy in emergency management programs in the

United States. As indicated by study respondents, a significant portion of emergency management higher education programs in the United States incorporate service learning in their curricula to link theoretical perspectives with practice as well as to provide students with real-world experience in local communities.

Feedback from respondents shows that, regardless of how they define service learning, they face challenges in implementing these projects. The article highlights successful emergency management service-learning projects in which most of the deliverables are tangible products for community agencies. We find that service-learning programs should start with smaller projects. For successful completion of a service-learning project, it is also critical to provide clear guidelines to students and community partners. The absence of institutional support is a challenge to successful service-learning projects, and we advise program directors to secure sufficient administrative and faculty support before planning begins.

Based on the literature and responses from the survey, a successful emergency

management service-learning program will have institutional support from multiple

levels (i.e., department, college, and university support). An office or center for

experiential learning can support faculty and community members across multiple

disciplines. The program should start with smaller projects that address a specific community need and that students are capable of completing, either individually or in a small group, within the semester time frame. For example, a course that

requires one student to complete a county’s continuity of operations plan within

2 months is not ideal. However, the plan’s numerous sections could be divided

among small groups of students to be completed within the semester term. Before

starting the service-learning project, the faculty member, students, and partnering

organization should agree upon all expected outcomes and goals. The project

should always align with the course materials, so the newly acquired practical

knowledge coincides with the theory or academic perspective. Lastly, time needs

to be allotted before, during, and after the project for student input and reflection.

It is vital for students to play an active role throughout the process.

Key skills and competencies highlighted in this article can be achieved with

carefully designed service-learning projects. These projects help students as well

as faculty to learn more about the community, and they are especially useful in

fostering networking, team-building, and leadership skills. Through these

projects, the university or community college can be established as a valuable

resource and partner in the local community as well as with local emergency

management professionals.

Future research can be conducted among students who completed service-learning projects to measure the impact of service learning on student learning in emergency management programs. Research is also needed to measure the impact of service-learning projects on recipient community organizations. Lastly, documenting service-learning projects that emergency management faculty could implement would add to the expanding knowledge base on emergency management academic programs.

Acknowledgments

Our thanks to the emergency management program coordinators and directors

who took the time to respond to the survey. We also thank anonymous reviewers

of the JPAE and the editor for their feedback.

References

Ash, S. L., Clayton, P. H., & Moses, M. G. (2009). Learning through critical reflection: A tutorial for

service learning students. Raleigh, NC: PHC Ventures.

Astin, A. W., Vogelgesang, L. J., Ikeda, E. K., & Yee, J. A. (2000). How service learning affects students.

Los Angeles: University of California Higher Education Research Institute.

Bacon, N. (2002). Differences in faculty and community partners’ theories of learning. Michigan

Journal of Community Service Learning, 9, 34–44.

Blanchard, W. B. (2005). Top ten competencies for professional emergency management. Retrieved

from http://training.fema.gov/EMIWeb/edu/EMCompetencies.asp

Bringle, R. G., & Hatcher, J. A. (1996). Implementing service learning in higher education. Journal of

Higher Education, 67(2), 221–239.

Bringle, R. G., & Hatcher, J. A. (2000). Institutionalization of service learning in higher education.

Journal of Higher Education, 71(3), 273–290.

Britton, N. R., & Lindsay, J. (2005). Designing educational opportunities for the emergency management

professional of the 21st century: Formulating an approach for a higher education curriculum. Retrieved

from http://training.fema.gov/EMIWeb/edu/emfuture.asp

Bryer, T. A. (2011). Linking students with community in collaborative governance: A report on a

service-learning class. Journal of Public Affairs Education, 17(1), 89–114.

Bushouse, B. K. (2005). Community nonprofit organizations and service learning: Resource constraints

to building partnerships with universities. Michigan Journal of Community Service Learning, 12, 32–40.

Bushouse, B. K., & Morrison, S. (2001). Applying service learning in master of public affairs programs.

Journal of Public Affairs Education, 7(1), 9–17.

Campbell, H. E., & Tatro, B. J. (1998). Teaching program evaluation to public administration students

in a single course: An experiential solution. Journal of Public Affairs Education, 4(2), 101–122.

Cantor, J. A. (1997). Experiential learning in higher education: Linking classroom and community. San

Francisco: Jossey-Bass.

Clement, K. E. (2011). The essentials of emergency management and homeland security graduate

education programs: Design, development, and future. Journal of Homeland Security and

Emergency Management, 8(2), 1–10.


Cruz, N. I., & Giles, D. E., Jr. (2000). Where’s the community in service-learning research? Michigan

Journal of Community Service Learning, 7(1), 28–34.

Cunningham, B. (1997). Experiential learning in public administration education. Journal of Public

Affairs Education, 3(2), 219–227.

Cwiak, C. L. (2008). FEMA emergency management higher education program report. Retrieved from

http://www.training.fema.gov/emiweb/edu/surveys.asp

D’Agostino, M. J. (2006, August). Social capital: Lessons from a service-learning program (white paper).

Parkville, MO: Park University International Center for Civic Engagement.

Darlington, J. D. (n.d.). The profession of emergency management: Educational opportunities and gaps.

Retrieved from http://training.fema.gov/EMIWeb/edu/highpapers.asp

Dewey, J. (1938). Experience and education. New York: Simon & Schuster.

Dillman, D. A. (2007). Mail and Internet surveys: The tailored design method (2nd ed.). Hoboken, NJ:

John Wiley & Sons.

Donahue, D. A., Jr., Cunnion, S. O., Balaban, C. D., & Sochats, K. (2010). Meeting educational

challenges in homeland security and emergency management. Journal of Homeland Security and

Emergency Management, 7(1).

Eyler, J., Giles, D. E., & Schmeide, A. (1996). A practitioner’s guide to reflection in service-learning:

Student voices and reflections. Nashville, TN: Vanderbilt University.

Jacoby, B. (1996). Service-learning in today’s higher education. In B. Jacoby (Ed.), Service-learning in higher education: Concepts and practices (pp. 3–25). San Francisco: Jossey-Bass.

Jelier, R. W., & Clarke, R. J. (1999). The community as a laboratory of study: Getting out of the ivory

tower. Journal of Public Affairs Education, 5(2), 167–180.

Kapucu, N. (2011). Developing competency-based emergency management degree programs in public

affairs and administration. Journal of Public Affairs Education, 17(4), 501–521.

Kapucu, N., & Petrescu, C. (2006). Capacity building through service learning. Academic Exchange

Quarterly, 10(1), 132–138.

Kenworthy-U’Ren, A. L., & Peterson, T. O. (2005). Service-learning and management education:

Introducing the “WE CARE” approach. Academy of Management: Learning and Education, 4(3),

272–277.

Kiltz, L. (2009). Developing critical thinking skills in homeland security and emergency management

courses. Journal of Homeland Security and Emergency Management, 6(1), 1–23.

Kolb, D. A. (1984). Experiential learning: Experience as the source of learning and development.

Englewood Cliffs, NJ: Prentice-Hall.

Kushma, J. A. (n.d.). Incorporating service-learning in emergency management higher education

curriculum. Retrieved from http://training.fema.gov/EMIWeb/edu/sl.asp

Lambright, K. T., & Lu, Y. (2009). What impacts the learning in service learning? An examination of

project structure and student characteristics. Journal of Public Affairs Education, 15(4), 425–444.

McCrea, J. M. (2004). Intergenerational service-learning in gerontology: A manual for faculty—

introduction and instructions. Pittsburgh, PA: Association for Gerontology in Higher Education.


McCreight, R. (2009). Educational challenges in homeland security and emergency management.

Journal of Homeland Security and Emergency Management, 6(1), 1–6.

McEntire, D. A. (2002). Service learning in the emergency administration and planning program.

Retrieved from http://training.fema.gov/EMIWeb/downloads/SL_McEntire.doc

Renger, R., Wood, S., & Granillo, B. (2011). Using experiential learning theory to design emergency

preparedness training curricula. Journal of Emergency Management, 9(5), 57–63.

Saltmarsh, J., & Hartley, M. (2011). “To serve a larger purpose”: Engagement for democracy and the

transformation of higher education. Philadelphia: Temple University Press.

Sandy, M., & Holland, B. A. (2006). Different worlds and common ground: Community partner

perspectives on campus-community partnerships. Michigan Journal of Community Service

Learning, 13, 30–43.

Ostrander, S. A. (2004). Democracy, civic participation, and the university: A comparative study of civic engagement on five campuses. Nonprofit and Voluntary Sector Quarterly, 33(1), 74–93.

Seigel, S., & Rockwood, V. (1993). Democratic education, student empowerment, and community service: Theory and practice. Equity & Excellence in Education, 26(2), 65–70.

Thomas, D., & Mileti, D. (2003). Designing educational opportunities for the hazards manager of the 21st

century. Retrieved from http://training.fema.gov/EMIWeb/edu/highpapers.asp

Waldner, L. S., & Hunter, D. (2008). Client-based courses: Variations in service learning. Journal of

Public Affairs Education, 14(2), 219–239.

Wilson, J., & Oyola-Yemaiel, A. (2002). An emergency management profession: Will we make it?

ASPEP Journal, 9, 74–81.

Naim Kapucu is professor of public policy and administration and founding director

of the Center for Public and Nonprofit Management in the School of Public

Administration at the University of Central Florida. His main research interests

are emergency and crisis management, decision-making in complex environments,

collaborative governance, and scholarship of teaching and learning. His book,

Network Governance in Response to Acts of Terrorism: Comparative Analyses,

was published in 2012 by Routledge. He teaches public and nonprofit management,

emergency and crisis management, research methods, and analytic techniques for

public administration courses. He can be reached at Kapucu@ucf.edu.

Claire Connolly Knox is an assistant professor and the Emergency Management

and Homeland Security program coordinator in the School of Public Administration

at the University of Central Florida. Her research interests include environmental

policy and management, critical theory, and environmental vulnerability and

disaster response. She has coauthored a Public Administration Review article on

government ethics and a forthcoming Journal of Emergency Management article

on an analysis of after action reports from hurricanes Andrew and Katrina.


Appendix

Survey on Service Learning in Higher Education Emergency Management Programs

The following short survey collects information about service-learning utilization

among emergency management academic programs in the United States. The survey

takes about 10–15 minutes to complete. Your responses are confidential and will

not be revealed without your consent; only aggregate results will be made available.

We will be happy to provide you with the final results upon request.

Please answer the following questions:

1. Are you the addressee?
☐ Yes
☐ No → Please state your position/title here: ______________________
☐ Program representing (university and degree): __________________

2. School/Department your program is housed in: ______________________

3. How long has your program been in operation? ______________________

4. Does your program incorporate service learning as part of the curriculum?
☐ Yes  ☐ No
If yes, please continue to Question 5. If no, please continue to Question 32.

Service Learning in Emergency Management

Please assess the following statements regarding the utilization of service learning

in your degree programs. Please use the following scale:

Strongly Disagree (1)   Disagree (2)   Neither Agree nor Disagree (3)   Agree (4)   Strongly Agree (5)

5. The main goal of service learning in our program is to link theory

with practice.

6. Service-learning projects in our curriculum are mandatory for our emergency

management students.

7. Service-learning projects in our curriculum are voluntary for our

emergency management students.

8. Students completing service-learning projects are active participants in

design and execution of the projects.

9. Students evaluate their progress throughout the service-learning project.

10. Course materials are integrated into the service-learning project.

11. Service-learning projects are mainly completed as a group project.


Strongly Disagree (1)   Disagree (2)   Neither Agree nor Disagree (3)   Agree (4)   Strongly Agree (5)

12. Service-learning projects are mainly completed as an individual project.

13. Civic engagement is part of service learning in our program.

14. Reflection is an essential part of service learning in our program.

15. Our program has an evaluation strategy to assess the impact of service-learning projects on students’ learning.

16. Service learning in our program better prepares future emergency managers.

17. Service learning in our program contributes to students’ intellectual growth.

18. Service learning in our program improves students’ research skills.

19. Service learning in our program is interdisciplinary.

20. Service learning in our program promotes university-community collaboration.

21. Service learning in our program contributes to faculty learning about

our communities.

22. Service learning in our program is supported by the department.

23. Service learning in our program is supported by the institution.

24. Our institution has an office of service learning to assist faculty in designing

and implementing service learning projects.

Open-Ended Questions

25. How do you define service learning?

26. How long has your program incorporated service learning in the curriculum?

27. What percentage of your emergency management courses includes

service-learning projects?

28. Can you give us some examples of successful and least successful student

service-learning projects?

29. What are some challenges of incorporating service learning in the curriculum?

30. What are some benefits of incorporating service learning in the curriculum?

31. What advice would you provide to programs considering implementation

of service learning as part of the curriculum?


Demographic Questions

32. Our emergency management program is:
☐ Face-to-face  ☐ Online  ☐ Mixed mode

33. Number of graduate students in the emergency management program:
☐ under 10  ☐ 11–20  ☐ 21–30  ☐ 31–40  ☐ over 41

34. Number of undergraduate students in the emergency management program:
☐ under 20  ☐ 21–30  ☐ 31–40  ☐ 41–50  ☐ over 51

35. Type of emergency management program offered (Check all that apply.)
☐ Certificate (undergraduate level)  ☐ Certificate (graduate level)
☐ Concentration within an undergraduate major  ☐ Minor
☐ Associate Degree  ☐ Bachelor’s Degree
☐ Master’s Degree  ☐ Doctoral Degree
☐ Master’s Level Concentrations  ☐ Doctoral Level Concentrations
☐ Other: Please specify: ______________________

36. What do you consider the primary purpose of your program(s)? (Please select only one.)
☐ Pre-employment (preparation for entry into EM, HS, or related field)
☐ Advancement (preparation of current EM, HS, etc. personnel for advancement)
☐ Other: Please specify: _______________________________________

37. What proportion of your program faculty has a university degree in emergency management? (Please select only one.)
☐ None  ☐ Some 25%  ☐ Some 50%  ☐ Some 75%  ☐ All

38. What proportion of your program faculty has work experience in emergency management? (Please select only one.)
☐ None  ☐ Some 25%  ☐ Some 50%  ☐ Some 75%  ☐ All

39. What is the percentage of full-time faculty in your emergency management program? (Please provide percentage based on your knowledge.) ____________

40. What is your highest degree? ______ In what field? __________________

41. What is your gender?
☐ Male  ☐ Female

Thank you very much for your cooperation.



Developing Decision-Making Skills

for Uncertain Conditions:

The Challenge of Educating

Effective Emergency Managers

Louise K. Comfort

University of Pittsburgh

Clayton Wukich

Sam Houston State University

Abstract

Effective decision making under conditions of uncertainty involves the ability to

recognize risk, formulate strategies for action, and coordinate with others in an

effort to bring an incident under control quickly. Learning to make decisions

effectively in urgent, uncertain conditions is not easily achieved in a classroom

setting. Educators face a particular challenge in creating a learning environment

in which students can develop this ability in preparation and/or support for careers

in emergency management. The Scholarship of Teaching and Learning (SoTL)

suggests that higher-level thinking skills facilitate the kind of problem-solving skills

and subject mastery helpful to decision making under conditions of uncertainty.

A content analysis of syllabi on emergency management demonstrates that instructors,

in practice, focus disproportionately on lower-level thinking skills. We present a

set of propositions informed by SoTL and the study of cognition to design curricula

that facilitate the development of higher-order thinking skills that support decision

making under conditions of uncertainty.

Keywords: Decision-making skills, emergency management, higher-order thinking

JPAE 19(1), 53–71

Creating a Forum for Learning Problem-Solving Skills

The key quality that distinguishes effective from less effective decision makers operating under conditions of uncertainty is the judgment they bring to bear

under a novel and complex set of conditions (Comfort, 1999; Klein, Orasanu,

Calderwood, & Zsambok, 1993; Weick & Sutcliffe, 2007). In situations characterized

by uncertainty and stress, well-known rules may not apply, resources may be scarce,

and the usual means of support may not be available. Yet, the decision maker has

legal responsibility for action and so must act under conditions in which lives may

be at risk, high-value property may be at stake, and chances for escalation of damage

may be high. In such settings, a decision maker will draw upon a stored base of

knowledge about the context, constraints, personal experience, and immediate

opportunities for action to fashion a strategy that works to the best of his or her

capacity (Klein, 1998). The decision may not always be optimal, and it may not

always be efficient; but if lives are saved, stability of operations restored, and the

incident brought under reasonable control, the decision is considered to be effective

(Klein et al., 1993).

Learning to make decisions effectively in urgent, uncertain conditions is not

easily achieved in a classroom setting, using traditional lecture formats. In many

respects, the basic assumptions of traditional classroom learning do not fit the

requirements for developing problem-solving skills (Kiltz, 2009). In a traditional

lecture format, the basic assumptions are that the instructor imparts knowledge

to the students through readings and lectures and that the students learn as individuals

to comprehend the material they have been given. Problem-solving skills in

emergency management, rather, require several different levels of intellectual activity

(Collins & Peerbolte, 2012).

We argue that emergency management courses in higher education currently

focus on lower-level thinking skills that, while useful to developing a foundation

for decision making capacities, do not necessarily lead to higher-level thinking

skills. Scholarship of Teaching and Learning (SoTL) research demonstrates that

higher-level skills are vital to problem solving, critical thinking, and achieving

“mastery” within a domain. We present a set of propositions informed by SoTL

and cognition research to improve the effectiveness of emergency management

curricula in higher education.

Strategy to Achieve Higher-Level Learning

The ability to navigate uncertain, stressful situations requires multiple skills.

How should instructors facilitate these skills in a classroom setting? Research

from SoTL offers a strategy to achieve what is referred to as mastery. Students

develop mastery over time as a result of a three-step process of learning, as visually demonstrated in Figure 1 (Ambrose, Bridges, DiPietro, Lovett, & Norman, 2010).1 In the first step, students acquire specific component skills that underlie

their area of study. Second, students practice integrating these skills in various

combinations depending on the context and guided by the instructor. Students

at this point do not yet achieve mastery. Mastery is achieved when students

demonstrate the ability to know when to apply component skills in different

combinations without the instructor’s intervention. This third step, or transfer,


represents “the application of skills (or knowledge, strategies, approaches, or habits)

learned in one context to a novel context” (Ambrose et al. 2010, p. 108). These

three steps used to achieve mastery provide the foundation upon which to organize

a set of teaching and learning activities aimed at training effective decision makers.

Figure 1.
Ambrose’s Elements of Mastery
[Figure: a three-step progression toward mastery — students acquire component skills, practice integrating skills, and learn when to apply skills.]
Source. From Ambrose et al. (2010, p. 96).

Assessment of Current Instruction in Emergency Management With

Learning Needs

How does this strategy of mastery compare with current instructional practice

in emergency management? This concept is difficult to measure, but one means

is to review what instructors do in their courses. Building on Bloom’s (1956)

taxonomy, Anderson and Krathwohl (2000) lay out six levels of learning: remembering,

understanding, applying, analyzing, evaluating, and creating. Each level

represents a step in a cognitive process that increases in difficulty as a student first

acquires knowledge and then moves on to application and finally to integrating


various pieces of knowledge to create new ideas. To ascertain the degree to which

instructors distributed their learning objectives across these levels, we conducted

a content analysis of 29 syllabi commissioned by the Federal Emergency Management Agency (FEMA).

In 1984, FEMA and the National Association of Schools of Public Affairs and Administration (NASPAA) created a working group of scholars focused on hazards research.

Efforts by this group eventually led to the development of the FEMA Higher Education

Project (Comfort, Waugh, & Cigler, 2012). Over the past decade, the project

commissioned curricula for dozens of emergency management courses by top scholars

in the field. FEMA provides these syllabi and lesson plans at no cost to universities.

Although this sample is not necessarily a definitive representation of the larger teaching

community, it does represent a set of professional examples that are developed,

posted, and used by instructors throughout higher education.

Figure 2 demonstrates that instructors rely disproportionately on lower-level

learning objectives for their courses. Of the 342 objectives posted, 77.5% (265)

were related to remembering or understanding. It is logical that most of these

objectives would fall into the lower-level categories, because the attainment of higher-level thinking skills often requires lower-level skills first (Anderson & Krathwohl, 2000). However, many of the FEMA courses did not expect even one higher-level skill. Mastery as outlined by Ambrose and her colleagues (2010) requires application and other higher-level learning skills. Of the 29 courses, only 13 (44.8%) posted applying objectives, 19 (65.8%) analyzing, 9 (31.0%) evaluating, and 6 (20.7%) creating.
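The distribution reported above can be reproduced by coding each syllabus objective to one of the six levels and tallying. A minimal sketch (Python) of such a tally; the sample coded objectives are placeholders rather than items from the FEMA syllabi, and only the aggregate figures in the text (342 objectives, 265 at the two lowest levels) come from the article.

```python
# Minimal sketch of the tally behind the content analysis: each learning
# objective is coded to one of Anderson and Krathwohl's six levels, then
# counts and shares are computed. The entries below are placeholders.
from collections import Counter

LEVELS = ["remembering", "understanding", "applying",
          "analyzing", "evaluating", "creating"]

# One entry per learning objective across the syllabi:
# (course id, objective text, coded level) -- illustrative examples only.
coded_objectives = [
    ("EM-101", "Define the four phases of emergency management", "remembering"),
    ("EM-101", "Explain the role of local government in disaster response", "understanding"),
    ("EM-210", "Design a tabletop exercise for a county EOC", "creating"),
]

counts = Counter(level for _, _, level in coded_objectives)
total = sum(counts.values())
for level in LEVELS:
    share = 100 * counts.get(level, 0) / total if total else 0.0
    print(f"{level}: {counts.get(level, 0)} ({share:.1f}%)")
```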

The FEMA courses do promote basic knowledge about the domain of emergency

management: the laws, policies, and practices that characterize the field; the major

actors and their responsibilities; and the major risks encountered in daily practice.

This rudimentary level of understanding for emergency management can indeed

be taught effectively in classroom settings or largely acquired by individual students

working on their own. In practice, the challenge is that the knowledge developed

in this format likely will get them only through daily, routine operations in well-structured

situations with limited uncertainty. Emergency management requires

more dynamic, flexible approaches to problem solving than the bureaucratic

approaches that had long characterized the system (Neal & Phillips, 2007). Phillips,

Klein, and Sieck (2004), for example, suggest that expertise is built not on checklists

and memorization, but through deliberate, experienced-based practice in which

mental models are developed and enhanced through the introduction of multiple

scenarios, contingencies, and varied repertoires of action.

Given the objective of mastery of the field of emergency management, the

key question confronting educators in emergency management programs is

whether their teaching and learning activities enable effective decision making

under conditions of uncertainty. The goal becomes enabling students to move

through different levels of problem-solving skills (Savin-Badin, 2000). The

primary goals, in practice, of life safety, incident stabilization, and protection

56 Journal of Public Affairs Education


Developing Decision-Making Skills for Uncertain Conditions

Figure 2.

Course Objectives by Level of Learning*

250

200

150

100

50

0

Remembering

Understanding

Applying

Analyzing

Evaluating

Creating

* This figure represents the aggregate of all learning objectives for 29 courses that posted learning objectives.

Source. FEMA Higher Education Project.

of property and resources for the community represent the overall mission of

emergency managers, and they may require reorganization of routine activities to

meet them under changing conditions (Howitt & Leonard, 2009; Sylves, 2008).

Yet, it is essential that students understand the importance of moving beyond

laws, policies, and procedures to learn how to operate within this context as

functioning decision makers.

Keeping the end goal in mind becomes the guiding principle in recognizing

risk and searching for information to validate a strategy of action in uncertain

conditions. Meeting this threshold requires interaction with other experts, groups,

and sources of information, such as knowledge bases or real-time reports of

changing status of the community at risk. It means learning which sources of

information are valid and which are not, and using valid information to build a

strategy of action. This approach requires thinking in a broader context, mindful

of the impact of one unit’s action upon others, and the reciprocal effect that the

actions of others will have on one’s own group. Only by developing sufficient

skill in interacting with others and searching a wider set of knowledge sources to

achieve a common goal do decision makers pass this threshold and advance to

the next level of problem solving that is critical in managing large-scale, extreme


events: creating a common operating profile for the multiple organizations and

jurisdictions that are involved in response operations.

In what ways does this approach differ from traditional approaches to education

and training for emergency managers? The primary difference is the focus on process

rather than structure. Further, the means of acquiring skills in process differ from

those involved in learning the terms, organizational procedures, and responsibilities

that provide structure to professional practice. Both are important, but the acquisition

of process skills involves developing a mental model that can accommodate uncertainty

and is open to new information, correction of error, and rapid revision of

strategies of action based on valid information. Traditional approaches to emergency

management have focused, rather, on comprehending and analyzing the structure

of emergency management and emphasizing organizational charts, prescribed roles,

detailed procedures, and checklists for performance under specific conditions.

This approach works well in routine, well-structured, stable conditions, and

most emergency management training programs have adopted it as a means to

increase the level of professional performance among disparate organizations or

organizations with large numbers of voluntary personnel, such as fire departments

in most communities outside major cities. For example, the National Incident

Management System (NIMS) represents an effort to standardize emergency

management training and procedures across all levels of jurisdictional operation.

Although this objective is certainly worthy and has contributed to the wider

acquisition of shared knowledge regarding procedures of emergency management,

the instruction has emphasized structure over process (e.g., communication,

networking, decision making), largely because it is easier to teach in classroom

settings (Buck, Trainor, & Aguirre, 2006). The result, in many cases, has increased

the rigidity of practice in extreme events, making emergency personnel less

attentive to indicators that fall outside the prescribed rules, less able to adapt to

changing conditions, and more vulnerable to organizational failure in shifting,

uncertain conditions (Wise, 2006).

Traditional approaches to training have in fact attempted to focus on

process, but this is difficult to do effectively. Tabletop exercises, for example,

often are relatively simplistic models that do not address the complex sets of

interdependencies that affect and inhibit action in a rapidly evolving incident.

Consequently, such exercises rarely change the existing mind-sets of emergency

personnel involved in them. Instead, they often reinforce previous practice and

exclude novel contingencies. Functional and full-scale exercises provide more

complexity (McEntire & Myers, 2004). The classic example of a major training

exercise that failed to attune the existing mental models of emergency managers

at multiple jurisdictional levels to unanticipated extreme conditions was the

Hurricane Pam exercise, conducted by FEMA in Louisiana in July 2004. The

Hurricane Pam exercise used the scenario of a major hurricane striking the

vulnerable city of New Orleans (FEMA, 2004), requiring timely action and

coordination among all four governmental jurisdictions for effective response.


The exercise involving emergency personnel from federal, state, parish, and city

jurisdictions was considered highly successful; 2 but on August 29, 2005, only 13

months later, the actual Hurricane Katrina followed nearly exactly the same scenario

and relentlessly revealed the inability of all four jurisdictions to coordinate their

actions to reduce risk and bring the incident quickly under control. Clearly, different

approaches to education and training in emergency management are needed.

Theoretical Lenses for Changing Conditions

Other theorists have considered the problem of decision making under

uncertainty, and their insights inform our understanding of the relationship

between creating a sufficient structure to hold and exchange information while

also having sufficient flexibility to adapt to changing conditions (Kauffman, 1993).

Edwin Hutchins (1995) documented the process of distributed cognition in group

decision-making processes that pooled knowledge from different sources to inform

complex decisions. Hutchins also acknowledged the role of instruments used to

assess the changing environment in providing real-time information directly to

the decision makers. Gary Klein and his collaborators (Klein et al., 1993) note

the importance of “recognition primed decision making” used by experienced

managers, and the role of experience in developing a repertoire of strategies that

could be combined and recombined to fit emerging situations. Steven Bankes,

Robert Lempert, and Steven Popper (2003), in a sixth edition of their book,

use computational simulation as a method of exploring plausible strategies for

anticipating complex scenarios in uncertain conditions. Although each of these

theorists addresses key aspects of decision making under uncertainty, educators

face a particular challenge in creating the environment in which students can

develop the skills of adapting informed processes within a structured environment

under rapidly changing conditions.

One approach to this dilemma is to identify the skills needed for effective

decision making under uncertainty and, mapping backward, to design learning

tasks that facilitate the development of these skills. In earlier work, Comfort

(2007) identified four skills as essential to decision making under uncertainty:

cognition, communication, coordination, and control. These skills are cumulative;

each skill depends upon the preceding skill. For example, communication depends

on cognition; for without awareness of risk, emergency personnel would not

communicate the threat of risk. Likewise, coordination depends upon communication, for without effective communication, actors will not be able to coordinate their activities with one another. In turn, the fourth skill, achieving control in

complex, dynamic settings, depends on the third, coordination of collaborative

activities among multiple actors. Of the four requisite skills, the most difficult

to develop is cognition, which requires a mental model of how the system in

question “ought” to operate (Klein et al., 1993). The skill lies in distinguishing

any departure from that model as an “anomaly” that warrants attention. The


ability to recognize anomalies and use them to question the existing state of operations

leads decision makers to assess risk and consider what might be alternative

strategies of action in a given context (Weick, 1995). The anomalies may be vague,

ambiguous indicators that, separately, may not constitute a threat, but taken together,

would be recognized by an experienced manager as conditions leading to danger.

Developing the skills needed to recognize risk requires building a cumulative base

of knowledge about a certain area of operation that allows the decision maker to

distinguish valid from invalid indicators of possible dysfunction.

Building Skills for Decision Making Under Conditions of Uncertainty

The classic question for educators is how to design instruction that can substitute

for experience in building problem-solving skills for students. At best, such instruction

can enable students to develop a systematic knowledge base regarding the set

of interacting conditions for a given community that requires continual monitoring

for possible risk. These conditions would include, minimally, (a) an assessment

of risks, given the physical, engineered, and social characteristics of the region;

(b) an assessment of assets for emergency response and recovery, given the same

characteristics of the region; (c) an assessment of the existing information infrastructure

that allows the rapid search, exchange, and analysis of information regarding

emerging risk; and (d) knowledge of who has the capacity to act upon informed

judgments regarding risk with what degree of authority and competence.

The four functions of cognition, communication, coordination, and control

are linked through their progressive dynamic toward action. The goal of problem

solving in extreme events is to build a “common operating picture” among all

participating actors, so it is essential, first, to identify who those actors are likely to be

in any given situation and, second, to recognize their existing constraints on

action. Then, the dynamic toward action can be understood by students as it

evolves in a realistic setting, coached by the instructor to facilitate student

advancement through the sequential set of thresholds of learning. Each of these

four sets of skills can best be fostered by different learning activities, and if

possible, in different actual or virtual settings.

Cognition. Returning to Herbert Simon’s (1997) adage that “we can only create

what we already understand,” the first step to building the capacity to recognize

risk is developing a detailed understanding of the context of operations for known

emergencies. To do so means identifying key indicators for change in a system,

which implies sufficient knowledge of the system’s critical functions to recognize

which indicators, or combination of indicators, would impair system performance,

if found in practice (Klein et al., 1993). Determining the discrepancy between

how the system should function in desired operating conditions and how it actually

functions in practice leads to cognition of risk in a changing environment.
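The following brief Python sketch illustrates this logic in a highly simplified form; it is an editorial illustration rather than material from the course described here, and the indicator names and operating ranges are hypothetical.

# Toy illustration: treat departures from an expected operating range as
# "anomalies" that warrant closer attention. Indicator names and thresholds
# are invented for the example.
EXPECTED_RANGES = {
    "river_stage_ft": (0.0, 18.0),      # flood stage assumed at 18 ft
    "rainfall_in_24h": (0.0, 2.5),
    "bridge_sensor_strain": (0.0, 0.7),
}

def flag_anomalies(observations):
    """Return indicators whose observed value falls outside the expected range."""
    anomalies = {}
    for name, value in observations.items():
        low, high = EXPECTED_RANGES.get(name, (float("-inf"), float("inf")))
        if not (low <= value <= high):
            anomalies[name] = value
    return anomalies

print(flag_anomalies({"river_stage_ft": 21.4, "rainfall_in_24h": 1.1}))
# {'river_stage_ft': 21.4} -> a discrepancy that warrants closer assessment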


Communication. Once risk has been recognized and identified as a threat,

the situation moves from individual to group action. Students then need to learn

how to communicate risk to individuals, organizations, groups, and jurisdictions

that share the risk as well as to agencies that can mobilize response to reduce the

risk. This task requires the “capacity to create shared meanings among individuals,

organizations, and groups” (Comfort, 2007, p. 194) that will enable them to take

appropriate action to reduce their own risk as well as mobilize timely response

for the whole community. The capacity for timely, valid communication of risk

depends, as stated earlier, upon the accuracy and timeliness of recognition of that

risk as well as the willingness to update old information and correct errors based

on obsolete judgment (Doyle, 1979).

Coordination. Skills of coordination depend upon both the validity of the

information being communicated and the residual knowledge and skills of each

of the participating actors. Coordination means the ability to articulate a common

profile of risk for multiple actors, so that each can align his or her actions with

those of other actors to achieve a shared goal (Comfort, 2007; Koppenjan & Klijn,

2004). The timeliness and ease of coordination depends upon the level of shared

cognition of risk achieved through valid communication of risk to the participating

actors and wider community.

Control. The intended goal of emergency managers is to bring the incident

under control, or to a stable, non-escalating state, through the proper integration

of critical tasks performed simultaneously by multiple actors (Comfort, 2007;

Waugh & Tierney, 2007). This effort depends on each of the preceding functions:

cognition of risk by at least one actor in the operating system; communication of

that risk to all other actors in the system and to external actors who have the capacity

and responsibility to support the system with resources and relevant knowledge,

and further, coordination of action among multiple actors simultaneously in the

emergence of a coherent response system. In sequential fashion, the emerging response

system expands through the information processes used to drive the system. The

degree of timeliness, effectiveness, and efficacy of the emerging response system

depends on the degree to which each of the three previous thresholds for cognition,

communication, and coordination have been achieved for that specific region.

A Teaching and Learning Strategy to Facilitate Effective

Decision Making

Students need to grasp sufficient information about the domain of emergency

management that they recognize tensions inherent in uncertain situations and

understand that the process of prioritizing actions often means giving up preferred

outcomes to achieve primary goals (e.g., life safety, incident stabilization, and

protection of property). The cumulative acquisition of knowledge and experience,


however, requires practice. Following the basic steps to achieve mastery (Ambrose

et al., 2010) and informed by research on cognition (Klein, 1998; Comfort, 2007),

we outline a set of learning activities to facilitate effective decision making in the

field of emergency management. Table 1 illustrates the order in which we suggest

these learning activities be tested. We propose that these activities will facilitate

improved decision making under conditions of uncertainty.

Table 1.

Structured Teaching and Learning Activities to Facilitate Effective Decision Making

Acquire component skills: Describe the structure of the emergency management system; Identify hazards; Monitor change; Identify vulnerability.

Practice integrating skills: Link hazards with vulnerabilities; Link hazards and vulnerability to appropriate strategies to reduce risk.

Know when and how to apply skills (Transfer): Alta Madre and Rim Sim exercises; Watch-floor exercise; Emergency operations center (EOC) exercise.

Component Skills

Learning activities focusing on one core skill at a time represent an initial

foundation for learning. Students experience less difficulty in learning isolated,

single tasks as opposed to more complicated, intensive skills. Integrating more

complex skills places heavier cognitive loads on students, because these skills require

more attention and focus. As the demands of a complex task increase, performance

and learning suffer as student focus dissipates (Ambrose et al., 2010).

Students enter the classroom with varying backgrounds and different levels of

knowledge. In an MPA program, for example, classes often consist of a mix of

pre-professional students right out of college and mid-career professional students

who already have experience but are looking for ways to enhance it and gain a

better appreciation of the administrative world in which they operate. Identifying

a student’s level of knowledge at the beginning of the semester offers instructors

the opportunity to offer remediation or background information necessary for

promoting higher levels of learning. Students may not have requisite knowledge

of intergovernmental relations, federalism, or other key concepts. Also, identifying

students with high levels of experience and knowledge enables the instructor to

empower those students to be a resource for others early in the semester.


We begin our instruction with the objective to develop basic knowledge and

comprehension skills (e.g., describing the legal and organizational structure of

emergency management). We move on to facilitate two key skills: identifying

hazards and vulnerabilities using multiple indicators as guides to recognize changes

and anomalies.

Describe the structure of the emergency management system. Students need to

grasp basic knowledge about the domain of emergency management in order to

organize subsequent learning within that broader context. Most university courses

focus on this learning objective. They explore the laws, policies, and practices that

characterize the field. As outlined earlier, this objective is indeed a critical first step

to our goal of developing decision-making capabilities. The ability of a student

to identify and describe the major actors in this operating environment and their

responsibilities creates the framework in which students structure an initial conception

of a common operating picture.

Identify hazards. The initial formulation of a common operating picture depends

not just on the major organizations involved but also on the timely and accurate

recognition of risk. This step may require assigning students a specific geographic

location exposed to risk. The key task is to engage students in developing the

capacity to recognize emerging risk rather than to discuss risk in abstract terms

(Comfort, 2007). In emergency management, the risk of a hazard occurring and

the vulnerability of people and property to that hazard create a fundamental external

problem. A key skill set exhibited by practicing emergency managers is the ability

to identify critical indicators that distinguish risk from otherwise normal, safe

conditions. This function represents the initial step in developing what practitioners

refer to as situational awareness.

Monitor change. Monitoring the status of critical indicators then allows students

to track the performance record of the system and note significant discrepancies

from expected performance. It further requires sufficient understanding of the

structure of the operating system, so that students can identify the conditions

and processes that threaten its continued operation.

Identifying and later retrieving from memory a diverse array of indicators

and hazards that exist across jurisdictions represents an initial key learning objective.

Earthquakes, floods, hurricanes, winter storms, and human-made incidents make

up a handful of potential threats to explore. This is a lower level of thinking

(Anderson & Krathwohl, 2000; Bloom, 1956) that can be facilitated through

traditional teaching and learning activities (e.g., lectures and directed readings).

Access to historical data online, for example, visually demonstrates the ubiquitous

nature of hazards across the globe.


The inability to distinguish high levels of risk and to take action to mitigate

that risk, in practice, precipitates disaster situations (Comfort, 1999; D. J. Johnson,

2006). The ability to recognize risk can be facilitated by observation, intuition

developed through experience, and technologically sophisticated systems (Comfort,

2007; Klein et al., 1993; Ling, Znati, & Comfort, 2010). Both practitioners and

academics develop and maintain several open source indicators used to recognize

risk. The National Oceanic and Atmospheric Administration (NOAA), for example,

maintains descriptions of a variety of hazards as well as historical information

on past incidents and indicators of current risk.

Even experienced emergency service personnel operate in silos on occasion,

whether the silos are based on discipline or geographic location. Learning activities

that expose students to multiple indicators and to the science that underlies those

indicators help break down the silos that students will experience in their professional

roles and facilitate an all-hazards approach to emergency management. This learning

activity empowers students to actively seek out multiple sources of data upon which

to draw initial conclusions. In-class discussions, student presentations, and short

papers all enable students to match hazards with appropriate indicators and identify

baseline evaluative criteria for each indicator.

We now have unprecedented access to both real-time and historical data on

risk. Instructors and students can access National Weather Service (NWS) data

and monitor weather conditions in class. They can locate wildfires with NOAA’s

Satellite Fire Detection program or monitor flood risk across the United States.

And they can monitor seismic activity around the world with the U.S. Geological

Survey’s “Real-time Earthquake Map.”
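As a simple classroom illustration, the short sketch below pulls recent events from the publicly documented USGS GeoJSON earthquake feed and lists those above a chosen magnitude; the feed URL and the magnitude threshold are assumptions for the example rather than part of the exercises described here.

# Minimal sketch: list recent earthquakes above a threshold for in-class monitoring.
# The feed URL and the 4.5 threshold are assumptions for this illustration.
import json
import urllib.request

FEED_URL = "https://earthquake.usgs.gov/earthquakes/feed/v1.0/summary/all_day.geojson"

def significant_quakes(min_magnitude=4.5):
    """Return (magnitude, place, time) tuples for recent events above a threshold."""
    with urllib.request.urlopen(FEED_URL) as response:
        feed = json.load(response)
    events = []
    for feature in feed["features"]:
        props = feature["properties"]
        mag = props.get("mag")
        if mag is not None and mag >= min_magnitude:
            events.append((mag, props.get("place"), props.get("time")))
    return sorted(events, reverse=True)

if __name__ == "__main__":
    for mag, place, _ in significant_quakes():
        print(f"M{mag:.1f}  {place}")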

Identify vulnerability. After (or as) students identify different types of hazards,

a coinciding learning objective is to recognize the vulnerability of people who face

risk. Again, students with experience in the emergency services will demonstrate

previous knowledge in this area. However, they are likely also to be limited in their

scope and interest based on their discipline and geographic area. Three types of

vulnerability—social, built, and geophysical—require time and attention in the

classroom (Mileti, 1999).

Social vulnerability speaks to the level of susceptibility that a population exhibits

to a threat. For example, the assumption is that poorer communities with high

elderly populations are more vulnerable than others. U.S. Census data provides a

means for an initial quantification of this category using metrics based on income,

education, and other variables (Cutter, Boruff, & Shirley, 2003). Learning activities

that require students to use U.S. Census data to identify particularly vulnerable

communities within their region help reinforce the concept.
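A highly simplified sketch of the kind of composite scoring such an assignment might use appears below; it is not the Cutter, Boruff, and Shirley (2003) procedure, and the tract identifiers, indicators, and equal weighting are invented for illustration only.

# Simplified, hypothetical composite vulnerability score: rank each tract on a
# few census-style indicators (0-1 scale, higher = more vulnerable) and average.
from statistics import mean

def percentile_rank(values):
    """Rank each value against the others on a 0-1 scale."""
    ordered = sorted(values)
    n = len(values)
    return [ordered.index(v) / (n - 1) if n > 1 else 0.0 for v in values]

def vulnerability_scores(tracts):
    """tracts: list of dicts with pct_poverty, pct_over_65, pct_no_hs_diploma."""
    indicators = ["pct_poverty", "pct_over_65", "pct_no_hs_diploma"]
    ranked = {ind: percentile_rank([t[ind] for t in tracts]) for ind in indicators}
    return {
        t["tract_id"]: mean(ranked[ind][i] for ind in indicators)
        for i, t in enumerate(tracts)
    }

tracts = [
    {"tract_id": "A", "pct_poverty": 28.0, "pct_over_65": 22.0, "pct_no_hs_diploma": 18.0},
    {"tract_id": "B", "pct_poverty": 9.0, "pct_over_65": 12.0, "pct_no_hs_diploma": 6.0},
]
print(vulnerability_scores(tracts))  # tract A scores higher, i.e., more vulnerable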

The built infrastructure, on which people in every community rely to maintain

daily, normal life, including highly technical systems, creates a distinct vulnerability

that often interacts with social and geological vulnerability to exacerbate risk to a


community. Housing, transportation systems, public utilities, and other critical

systems make up the built environment (Mileti, 1999; D. J. Johnson, 2006).

Brainstorming sessions in class or homework assignments can be used to identify

the critical infrastructures necessary to maintain basic services in a community.

That exercise will enable students to recognize our dependence on key institutions

and systems. Geophysical vulnerability represents varying levels of risk based on an

area’s geophysical surroundings, that is, soil composition, topography, seismic risk,

and waterways as well as weather patterns inherent to a particular area (Mileti, 1999).

Lectures and directed readings can facilitate the recognition of these natural systems

and the process of risk analysis used in practice to measure vulnerability.

Integration

Complex incidents demand not only multiple skills from emergency managers

but also the ability to apply those skills in combination, and sometimes in situations

perhaps not yet experienced by the emergency manager. As students develop

separate sets of skills, they are then prepared to practice integrating those skills in

different combinations. Several teaching and learning activities aim to integrate

component skills.

Link hazards with vulnerabilities. Students should recognize that an actual

hazard is not the main concern of an emergency manager, nor is a particular vulnerability.

Instead, the main concern is the interaction between the hazard and community:

the damage or disruption that a hazard inflicts on our social, built, and geophysical

systems. To put it another way: If a tree falls down in a forest and no one hears,

there is no immediate issue. If a tree falls down in a forest and knocks down a power

line that leads to a disruption in the next town, including the local hospital, then

it is a matter of concern. Linking hazards with vulnerabilities in order to identify

the potential interdependencies among geophysical, built, and social systems that

lead to loss of life, property damage, and disruption of necessary services represents

a key learning objective. This type of skill set can be facilitated by analyzing historical

case studies. Identifying incidents based on hazard type and framing a set of

questions with the intent of matching hazards with vulnerabilities allow students

to recognize cause-and-effect relationships.

Link hazards and vulnerability to appropriate strategies to reduce risk. Once

risk is recognized, effective emergency managers develop strategies of action to

reduce that risk. The identification of appropriate resources, organizations, and

people needed to mitigate and/or respond to that hazard represents an important

step. Learning activities in which groups take on the role of emergency support

functions (ESFs) allow for students to identify appropriate resources, communicate

need across simulated organizational boundaries, and begin coordinating possible

cooperative activities.
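One hypothetical way to seed such a role-play is a simple lookup that pairs hazards with exposed systems and candidate emergency support functions; the pairings below are illustrative prompts rather than an authoritative ESF assignment.

# Hypothetical class prompt: which community systems does a hazard stress, and
# which emergency support functions (ESFs) might respond? Pairings are illustrative.
HAZARD_PROFILE = {
    "riverine_flood": {
        "vulnerabilities": ["low-lying housing", "water treatment plant", "elderly residents"],
        "esfs": ["ESF-1 Transportation", "ESF-3 Public Works", "ESF-6 Mass Care"],
    },
    "hazmat_release": {
        "vulnerabilities": ["downwind neighborhoods", "schools", "hospital intake"],
        "esfs": ["ESF-8 Public Health", "ESF-10 Oil and Hazardous Materials"],
    },
}

def discussion_prompt(hazard):
    profile = HAZARD_PROFILE[hazard]
    return (f"{hazard}: which of {profile['vulnerabilities']} are most exposed, "
            f"and how should {profile['esfs']} coordinate?")

print(discussion_prompt("riverine_flood"))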


Transfer

Knowing where and when to apply skills without guidance represents the

critical point at which a student achieves mastery over a subject (Ambrose et al.,

2010). Simulating decision points in which uncertainty and stress strain a person’s

cognitive ability is a key task of an instructor. Simulations are recognized as learning

activities that facilitate higher-level learning (Silvia, 2012). We review two simulations

readily available to instructors and then develop two exercises in which students

integrate skills and then choose where and when to implement skills in different

combinations with little to no guidance from the instructor.

Alta Madre and Rim Sim. We have used two readily available exercises in class,

which inform our development of the additional learning activities described later. 3

In an American context, Booher and Sutkus (2008) designed a simulation, “Alta

Madre,” for emergency management planning and administration that structures

a scenario for interagency coordination. In this scenario, multiple actors exhibit

varying priorities and compete for grant allocations. The exercise incorporates a

narrative of past conflict to simulate realistic political dynamics. It also recognizes

the need to communicate planning decisions to the public. The exercise does not,

however, develop specific measures of risk or make clear the hazards faced by the

interagency system. As part of a larger set of learning activities, the simulation is

indeed useful.

Another useful simulation, “Rim Sim,” integrates recovery with planning

operations in an international environment (Barrett, Frew, Howell, Karl, & Rudin,

2003). The simulation models a process of negotiation between international actors

working to distribute aid from abroad in a manner that effectively and efficiently

spurs recovery for a hypothetical region of three neighboring nations that experienced

different consequences from the same major earthquake and are exposed to different

degrees of continuing vulnerability. Rim Sim also addresses risk and the goal of

creating more resilient systems of built infrastructure. Particularly innovative is

its use of scientific and technical information as a way to inform debate and counterbalance

traditional political notions of power and authority. In both simulations,

we have learned the importance, first, of student preparation beforehand to comprehend

individual roles and responsibilities and, second, of the role of class-wide

debriefings to promote a larger, systemic understanding of the decision space

involved in the simulation.

Watch-floor exercise. Emergency services and management personnel experience

normal administrative and operational conditions, punctuated by short bursts of

extreme stress and effort. Between emergency incidents, personnel fill their time

in varying ways: mitigation activities, administrative tasks, and preparation for the

next incident. Increasingly, agencies monitor risk indicators more systematically

to make informed, evidence-based decisions related to action. In law enforcement,

for example, fusion centers at varying levels of government collect and exchange

key pieces of information with other agencies. In emergency management, personnel


set up so-called watch floors to monitor risk and inform decision makers. Large

agencies, like FEMA, assign personnel specifically for this function. Usually, these

“watch standers” simply monitor the current status of their assigned geographic

region, take notes, and distribute standard situation reports to key decision makers.

However, in the event of a major incident, the watch floor serves a key function in

collecting data, making an initial evaluation, and distributing that intelligence

throughout the system.

On the watch floor, personnel demonstrate and integrate multiple skills such

as the identification of risk, the initial development of a strategy for action, the

communication of that strategy to others, and finally coordination. A weekly learning

activity simulates the watch-floor experience. Assignments first integrate component

skills (e.g., the identification of risk and the matching of resources) and later allow

students to practice recognizing where and when to apply specific skills without

guidance from the instructor.

The watch-floor exercise simulates, with varying degrees of detail and operational

responsibility, scenarios in which the student receives information from the instructor

that characterizes varying levels of risk and vulnerability. The student then makes

a judgment based on available information and formulates a strategy for action.

The exercise expands as the semester progresses to include students who represent

personnel from varying emergency support functions (ESFs). As the simulation

scales out, students practice communication and coordination.

Emergency operations center (EOC) exercise. The watch-floor exercise practices

and reinforces core skills learned throughout the course. It prepares students to

participate in larger-scale EOC simulations, which replicate more intensively the

complexity and uncertainty of a disaster situation. By design, an EOC serves as

central data hub in which agencies across jurisdictions and sections exchange information

and coordinate collective response to assist on-scene personnel.

In an EOC exercise, students simulate the processes of locating and distributing

resources for emergency response operations, sharing information, establishing

priorities in terms of response tasking, and changing course in response to (or in

anticipation of) new demands. Operating within the context of separate, yet often

interdependent, agencies, students practice communication and coordination throughout

the distributed system. The objective of these teaching and

learning activities is to demonstrate key component skills (e.g., recognizing risk,

formulating strategies for action, and then communicating and coordinating those

strategies with others), practice integrating those skills in different combinations,

and apply those skills appropriately without guidance from the instructor.

Online decision support dashboards offer platforms for simulation exercises

that replicate the EOC environment (T. Johnson & Summerton, 2012). At the

University of Pittsburgh, we have developed a decision support system with, and

for, practicing emergency managers that also lends itself to the classroom. The

Java Interactive, Intelligent, Spatial Information System (JIISIS) is a working

prototype that provides interactive communication and coordination scenarios

that replicate risk indicators for people and infrastructure (Comfort & Wukich,


2009; D. E. A. Johnson, Zagorecki, Gelman, & Comfort, 2011). Students are

asked to monitor risk indicators and match appropriate strategies for action with

the demands of the external system—in our scenarios, flooding and hazardous

materials (hazmat) incidents. Anecdotally, students have responded positively to

the local granularity of our scenarios, which have been informed by data from a

series of semi-structured interviews and surveys with practicing managers,

geographic information systems (GIS) files, and actual exposure to hazards faced

by local communities. A high percentage of our students have either lived in

those communities or are at least familiar with them. By structuring the learning

activity within the students’ home community as opposed to a foreign or

fictional environment, we seek to reduce cognitive load and provide a scaffold

that allows students to focus on the learning objectives and not be distracted by

tertiary details (see Ambrose et al., 2010; Simons & Klein, 2007). Students are

less burdened by learning new geography and names of agencies.

The replication of a complex, uncertain scenario in a structured environment

is not a simple task. It requires a significant amount of consideration and preparation.

EOC situation reports from actual incidents provide historical data on which to

base a simulation. Practicing managers also provide potential partners to create

authentic learning environments that replicate decision spaces characterized by

risk and uncertainty.

Conclusion

Many emergency management courses focus primarily on elements of organizational

structure: formal rules, procedures, and organizational hierarchy. The

comprehension of these elements indeed warrants time and attention, because

they form the basis for conceptualizing a common operating picture. Structure

alone, however, is insufficient to inform emergency managers in practice. During

incidents that cascade beyond anticipation and that demand novel strategies for

action, decision making informed by experience proves to be an essential skill set.

Facilitating this skill through teaching and learning activities in a classroom is not

an easy task. A curriculum focused on elements of administrative process such as

cognition, communication, coordination, and control and delivered through a

tested strategy to achieve mastery provides a road map on which to move forward.

Footnotes

1 We thank Mr. Joel Brady and Dr. Carol Washburn from Pitt’s Center for Instructional Development

and Distance Education (CIDDE) for their input and guidance regarding teaching and

learning strategies.

2 “We made great progress this week in our preparedness efforts,” said Ron Castleman, FEMA

regional director. “Disaster response teams developed action plans in critical areas such as search

and rescue, medical care, sheltering, temporary housing, school restoration and debris management.

These plans are essential for quick response to a hurricane but will also help in other emergencies”

(FEMA, 2004).


3 The Alta Madre simulation, formally entitled “Emergency Management & Homeland Security:

Interagency Collaboration—Emergency!” is available for download at the Maxwell School’s

Program for the Advancement of Research on Conflict and Collaboration website at http://www.

maxwell.syr.edu/uploadedFiles/parcc/eparcc/simulations/Booher-Sutkus%20edited.pdf. “Rim

Sim: A Role-Play Simulation” is available for download at the U.S. Geological Survey’s website at

http://pubs.usgs.gov/bul/b2212/.

References

Ambrose, S., Bridges, M., DiPietro, M., Lovett, M., & Norman, M. (2010). How learning works:

Seven research-based principles for smart teaching. San Francisco: Jossey-Bass.

Anderson, L., & Krathwohl, D. (2000). A taxonomy for learning, teaching, and assessment: A revision

of Bloom’s taxonomy of educational objectives. White Plains, NY: Longman.

Bankes, S., Lempert, R., & Popper, S. (2003). Shaping the next one hundred years: New methods for

quantitative, long-term policy analysis (6th ed.). Santa Monica, CA: RAND.

Barrett, R. C., Frew, S. L., Howell, D. G., Karl, H. A., & Rudin, E. B. (2003). Rim Sim: A role-play

simulation (Version 1.0, 2003). Bulletin 2212. U.S. Geological Survey. Retrieved from http://

pubs.usgs.gov/bul/b2212/

Bloom, B. (1956). Taxonomy of educational objectives, handbook I: The cognitive domain. New York:

David McKay.

Booher, D. E., & Sutkus, A. (2008). Emergency Management & Homeland Security: Interagency

Collaboration—Emergency! Maxwell School of Syracuse University’s Program for the

Advancement of Research on Conflict and Collaboration (E-PARCC). Retrieved from http://

www.maxwell.syr.edu/uploadedFiles/parcc/eparcc/simulations/Booher-Sutkus%20edited.pdf

Buck, D. A., Trainor, J. E., & Aguirre, B. E. (2006). A critical evaluation of the Incident Command

System and NIMS. Journal of Homeland Security and Emergency Management, 3(3), 1–27.

Collins, M. L., & Peerbolte, S. L. (2012). Public administration emergency management pedagogy:

Cultivating the habit of critical thinking. Journal of Public Affairs Education, 18(2), 315–326.

Comfort, L. (1999). Shared risk: Complex systems in seismic response. New York: Pergamon.

———. (2007). Crisis management in hindsight: Cognition, communication, coordination, and

control. Public Administration Review, 67(1), 189–197.

Comfort, L., Waugh, W. L., & Cigler, B. (2012). Emergency management research and practice

in public administration: Emergence, evolution, expansion, and future directions. Public

Administration Review, 72(4), 539–554.

Comfort, L. K., & Wukich, C. (2009). Designing resilience for communities at risk: Building capacity

for collective action. In R. Shaw & R. R. Krishnamurthy (Eds.), Disaster management: Global

challenges and local solutions (pp. 384–399). Hyderabad: Universities Press India Limited.

Cutter, S., Boruff, B., & Shirley, W. (2003). Social vulnerability to environmental hazards. Social

Science Quarterly, 84(2), 242–261.

Doyle, J. (1979). A truth maintenance system. Artificial Intelligence, 12, 231–272.


Federal Emergency Management Agency. (FEMA). (2004, July 23). Hurricane Pam Exercise Concludes.

Retrieved from http://coop.fema.gov/news/newsrelease.fema?id=13051

Howitt, A. M., & Leonard, H. B. (2009). Managing crises: Responses to large-scale emergencies. Washington,

DC: CQ Press.

Hutchins, E. (1995). Cognition in the wild. Cambridge, MA: MIT Press.

Johnson, D. E. A., Zagorecki, A., Gelman, J. M., & Comfort, L. K. (2011). Improved situational awareness

in emergency management through automated data analysis and modeling. Journal of Homeland

Security and Emergency Management, 18(1), 40.

Johnson, D. J. (2006). Dynamic hazard assessment: Using agent-based modeling of complex, dynamic hazards

for hazard assessment. Unpublished doctoral dissertation, University of Pittsburgh, Pittsburgh, PA.

Johnson, T., & Summerton, S. (2012). Use of WebEOC to create an authentic learning environment. Paper

presented at the annual Emergency Management Higher Education Conference, Emmitsburg, MD.

Kauffman, S. (1993). The origins of order: Self-organization and selection in evolution. New York: Oxford

University Press.

Kiltz, L. (2009). Developing critical thinking skills in homeland security and emergency management

courses. Journal of Homeland Security and Emergency Management, 6(1), Article 36.

Klein, G. (1998). Sources of power: How people make decisions. Cambridge, MA: MIT Press.

Klein, G., Orasanu, J., Calderwood, R., & Zsambok, C. (1993). Decision making in action: Models and

methods. Norwood, NJ: Ablex Publishing.

Koppenjan, J., & Klijn, E.-H. (Eds.). (2004). Managing uncertainties in networks: A network approach to

problem solving and decision making. Oxford and New York: Routledge.

Ling, H., Znati, T., & Comfort, L. (2010). Designing resilient systems: Integrating science, technology,

and policy in international risk reduction. In L. K. Comfort, A. Boin, & C. Demchak (Eds.), Designing

resilience: Preparing for extreme events (pp. 244–271). Pittsburgh: University of Pittsburgh Press.

McEntire, D. A., & Myers, A. (2004). Preparing communities for disasters: Issues and processes for

government readiness. Disaster Prevention and Management, 13(2), 140–152.

Mileti, D. (1999). Disasters by design: A reassessment of natural hazards in the United States. Washington,

DC: Joseph Henry Press.

Neal, D. M., & Phillips, B. D. (2007). Effective emergency management: Reconsidering the bureaucratic

approach. Disasters, 19(4), 327–337.

Phillips, J. K., Klein, G., & Sieck, W. R. (2004). Expertise in judgment and decision making: A case

for training intuitive decision skills. In D. J. Koeler & N. Harvey (Eds.), Blackwell handbook of

judgment and decision making (pp. 297–315). Malden, MA: Blackwell Publishing.

Savin-Baden, M. (2000). Problem-based learning in higher education: Untold stories. Buckingham, England:

Society for Research into Higher Education and Open University Press.

Silvia, C. (2012). The impact of simulations on higher-level learning. Journal of Public Affairs Education,

18(2), 397–422.


Simon, H. A. (1997). Administrative behavior: A study of decision-making processes in administrative

organizations (4th ed.). New York: Free Press.

Simons, K. D., & Klein, J. D. (2007). The impact of scaffolding and student achievement levels in a

problem-based learning environment. Instructional Science, 35, 41–72.

Sylves, R. (2008). Disaster policy & politics: Emergency management and homeland security. Washington,

DC: CQ Press.

Waugh, W. L. Jr., & Tierney, K. (2007). Emergency management: Principles and practice for local government

(2nd ed.). Washington, DC: ICMA Press.

Weick, K. E. (1995). Sensemaking in organizations. Thousand Oaks: Sage Publications.

Weick, K. E., & Sutcliffe, K. M. (2007). Managing the unexpected: Resilient performance in an age of

uncertainty. San Francisco: Jossey-Bass.

Wise, C. R. (2006). Organizing for homeland security after Katrina: Is adaptive management what’s

missing? Public Administration Review, 66(3): 302–318.

Louise K. Comfort is professor of Public and International Affairs and director,

Center for Disaster Management, University of Pittsburgh. She is a Fellow,

National Academy of Public Administration, and author or coauthor of six

books, including Designing Resilience: Preparing for Extreme Events (University of

Pittsburgh Press, 2010) and Mega-Crises (Charles C. Thomas, 2012). Her

primary research interests are in decision making under conditions of uncertainty

and rapid change, and the uses of information technology to develop decision

support systems for managers operating under urgent conditions. She teaches

courses in organizational theory and design, systems thinking, and public policy.

She has published articles on information policy, organizational learning, and

sociotechnical systems, and is editor of Safety Science and book review editor for

the Journal of Comparative Policy Analysis. E-mail: comfort@gspia.pitt.edu

Clayton Wukich is an assistant professor in the Department of Political Science

at Sam Houston State University. He received his PhD from the University of

Pittsburgh in 2011. Clayton teaches courses on public management, emergency

management, organizational theory, policy making, and state and local government.

His research interests include intergovernmental relations and collaborative governance,

particularly in the policy domain of disaster and emergency management.

Email: wukich@shsu.edu.



Productivity and Leadership Patterns

of Female Faculty Members

in Public Administration

Meghna Sabharwal 1

School of Economic, Political and Policy Sciences

The University of Texas at Dallas

Abstract

Though the number of female faculty members has risen in public administration

programs throughout the nation, few studies have analyzed the advances made by

them at universities and colleges. The most commonly used method of examining

success in academic settings is by analyzing the research productivity patterns of

faculty members. However, evaluation should not be limited to measuring publication

productivity alone, but also through measuring gender equity in leadership positions.

Thus the purpose of this research is to analyze the scholarly output and leadership

patterns of faculty members in fields of public administration and policy by gender.

The study uses data from the Survey of Doctorate Recipients to examine career

trajectories, and data from 241 schools offering degrees in public administration, affairs,

policy, and management listed on the NASPAA website to examine leadership

patterns by gender. The results suggest that female faculty members have lower

productivity even after controlling for demographic, institutional, and career factors.

However, when interaction terms are introduced between female faculty and ages

of children, the productivity gap by gender disappears.

Keywords: research productivity, gender, public administration faculty, diversity

JPAE 19(1), 73–96

According to the United States Department of Labor statistics, women comprise close to half of the active labor force in this country (46.8% in 2008). That share is expected to rise by 2018. The unemployment rate for women in 2009 was 2.1 percentage points lower than for men. Despite these encouraging statistics, the public administration field continues to grapple with issues facing women. A few commonly cited challenges include the glass ceiling, glass walls, sticky floors, job segregation, job compression, pay inequities, and gender stereotypes. One scholar (Guyot, 2008) argues that the glass ceiling is a factor of

demand and supply and that greater numbers of women are in appointed compared

with elected positions. Guyot suggested, “Glass implies a smooth surface, while

the barrier to women’s advance in government is uneven, as wavy as a fun-house

mirror” (p. 529). Thus there is a clear indication of challenges that women continue

to face in the workforce. 2 This study examines the advancements made by women

in an area that is neglected in the public administration literature with specific

reference to scholarship and leadership.

The most common method of evaluating success in universities is by assessing

the contributions made by faculty members to the knowledge base of the field,

and publications are used as a barometer to judge faculty effectiveness (Farber,

Powers, & Thompson, 1984; Forrester, 1996; Kellough & Pitts, 2005; Rodgers

& Rodgers, 2000). Other methods often used by researchers to measure progress

in the fields of public administration and policy are (a) investigating the theoretical

development (Adams, 1992; Box, 1992); (b) assessing the quality of research in

the discipline (Cozzetto, 1994; Houston & Delevan, 1990, 1994; Kraemer & Perry,

1989; Stallings & Ferris, 1988; Wright, Manigault, & Black, 2004); (c) ranking

of public administration, affairs, and policy programs (W. Adams, 1983; Morgan

et al., 1981; Morgan & Meier, 1982); (d) examining the quality of dissertations

(G. Adams & White, 1994; R. Cleary, 1992, 2000; McCurdy & Cleary, 1984;

White, 1986); (e) assessing graduate productivity (Brewer et al., 1999; Douglas,

1996); and (f) measuring gender equity (Durbin, Ospina, & Schall, 1999; Rubin,

1990, 2000).

The dominant theme of this research is to analyze the scholarly output and

leadership patterns of faculty members in fields of public administration and

policy by gender. This study attempts to answer these research questions:

(a) How do faculty members in public administration and policy differ across

personal, institutional, and career factors by gender? (b) Are there differences in

scholarly publication rates of male and female faculty members? (c) Are there

differences in the leadership patterns of academic faculty in public administration

and policy based on gender? While there are several studies in science and

engineering that measure research productivity of faculty members by gender,

public administration as a discipline is devoid of such studies. The motivation

for this study partly arises from the existing lack of knowledge on this subject,

but more importantly it aims to assess the current standing of female faculty

members in regards to research productivity and leadership. To examine the

association between productivity and leadership by gender, the following sections

review the literature in three main areas: (a) gender as it relates to personal and

work factors; (b) gender as it relates to research productivity; and (c) gender

and leadership.


Background Literature

Gender as It Relates to Personal and Work Factors

Underrepresentation of female faculty members in academia has received much

attention in several disciplines—especially science and engineering. In the field

of public administration, the most recent literature has found that female faculty

members have made advances, but they still lack parity in rank, tenure, salary, and

scholarly productivity (Hale, 1999; Mani, 2001; Rubin, 1990, 2000; Sabharwal,

2010; Slack, Myers, Nelson, & Sirk, 1996). Slack et al. (1996) found that women

in public administration were twice as likely to be in junior-ranking positions as

compared with men, creating differences in publication rates. A similar pattern is

reported by researchers in other disciplines (Long & Fox, 1995; Menges & Exum,

1983; Smart, 1991; Toutkoushian, 1999; Weiler, 1990). Why is it important to

have representation of female faculty members in public administration and policy

programs? Apart from the fact that female faculty members bring demographic

diversity to the workplace, they bring diverse perspectives creating an environment

that reflects society as a whole (Oldfield, Candler, & Johnson III, 2006). Further,

some evidence suggests that female faculty members are more likely to serve in

nurturing and advising roles than male faculty members (Allen, 1994; A. Astin,

Korn, & Lewis, 1991) because they feel a special responsibility to cater to the

needs of female students and students of color (Park, 1996).

However, female faculty members are confronted with several challenges, as

highlighted by Hale (1999). One of the participants, a university professor in a

public administration program, commented on the challenges female faculty

members encounter while going up for promotion.

My wife is a faculty member…[who] can’t get promoted. The question

of why I could get promoted with no questions, with probably less

credentials…and she [with] more credentials…can’t get promoted,

bothers me…In that sense, I think the academic world is much harder.

The glass ceiling is much more difficult to break. (Hale, 1999, p. 415)

Overall, the study pointed to various gender inequities that continue to persist

at work: the lack of dialogue and open exchange on issues relating to socialization

experiences, personal and family life, sexuality and power, and so on, creating barriers

and fortifying existing stereotypes held by both genders.

While other researchers have argued that the lag is at least partially due to

marriage and family obligations, Long, Allison, & McGinnis (1993) concluded

that female faculty members who shoulder child-rearing duties took a longer time

to advance up the academic ladder when compared to men with children. On the

other hand, Hargens & Long (2002) explained the slower growth of women in

higher levels of the academe by a theory of demographic inertia. They argue that

the age and sex structure of a field can severely limit women’s representation in


tenure-track and full professor positions. Others have argued that the lower representation

of women among tenured faculty is due to the differences in human

capital, structural characteristics, and research productivity (Perna, 2001).

Aggregation of female faculty members in lower-ranking positions further leads

to discrepancies in salaries. Researchers in various disciplines including public

administration have found that female faculty members earn less than male faculty

(Barbezat, 1988; Bellas, 1994; Levin & Stephan, 1998; Perna, 2003; Slack, et al.,

1996; Weiler, 1990). Institutionally, female faculty members are less likely to be

hired at a Carnegie I or II research university when compared with male faculty

members (Fassinger, Scantlebury, & Richmond, 2004; Long & Fox, 1995) and

thus are less likely to spend time on research related activities. Drawing on the

evidence presented, the study makes the following hypotheses.

Hypothesis 1: Female faculty members in public administration and policy

programs are more likely to be in junior-ranking positions as compared

to male faculty members.

Hypothesis 2: Female faculty members in public administration and policy

are less likely to be employed at research universities than male faculty,

and thus are less likely to spend their time in research-related activities.

Research Productivity by Gender

Publications are used as a vital evaluation tool in academia to make promotion

and tenure decisions (Bellas & Toutkoushian, 1999; Xie & Shauman, 1998;

Zamarripa, 1993). Departments with highly productive faculty not only help

advance the knowledge base but also act as catalysts in the socialization process

of graduate students into academic culture (Kraemer & Perry, 1989). Productive

faculty members serve as a motivational force for students who wish to advance

their graduate careers (Brewer et al., 1999; Douglas, 1996). Productive faculty

members also are likely to be better teachers because they bring their research

into the classroom. Research and teaching should not be divorced from each

other; in fact one can reinforce the other (Fairweather, 2002). However, increased

time spent on teaching is shown to negatively affect the time spent on research

(Bellas & Toutkoushian, 1999; M. Fox, 1992).

Only a handful of studies in public administration have thus far analyzed the

research productivity of faculty by gender (Condit & Hutchinson, 1997; Rubin,

1990, 2000; Slack et al., 1996). Although most of the studies on gender and

productivity are more than a decade old, a more recent study found female faculty

in the field of public administration and policy to be less productive in terms of

research publications when compared with male faculty members (Corley &

Sabharwal, 2010). Similar findings were reported in other disciplines (Astin, 1969;

Bellas & Toutkoushian, 1999; Cole & Singer, 1991; Cole & Zuckerman, 1984;

Corley, 2005; M. Fox, 2005; Levin & Stephan, 1998; Long & Fox, 1995; Pfeffer

& Langton, 1993; Sonnert & Holton, 1995; Stack, 2004). Some scholars have

demonstrated, however, that the difference in productivity levels between female


and male academics has declined in recent years (Kellough & Pitts, 2005; Sax,

Hagedorn, Arredondo, & DiCrisi, 2002; Ward & Grant, 1996; Xie & Shauman,

1998). Kellough & Pitts (2005) reported women to have significantly higher

acceptance rates in Public Administration Review (PAR), the premier journal in the

discipline (41% vs. 31%). The same study found opposite results for minority

faculty—the acceptance rates in PAR for minority faculty members was disproportionately

lower when compared with white faculty members.

In addition, some studies have established an association between productivity

levels and personal characteristics like marital status and parenthood (Astin & Davis,

1985; Clark & Corcoran, 1986; Cole & Zuckerman, 1984; Mason & Goulden,

2004; Stack, 2004). However, many of these scholars disagree about the direction

of the relationship between productivity levels and marriage or parenting variables

for female faculty. Some studies have found a positive influence of marriage and

children on faculty productivity (Astin & Davis, 1985; Clark & Corcoran, 1986;

Cole & Zuckerman, 1984; Xie & Shauman, 1998). Yet, others have shown that

marriage and children negatively affect research productivity among female faculty

members (Mason & Goulden, 2004; Sonnert & Holton, 1995; Stack, 2004). Stack

(2004) provided a detailed analysis of faculty research productivity in various

disciplines. The author found that among social scientists, women with younger

children (aged 2–11) were disadvantaged in relation to their research productivity

due to childbearing and child-rearing responsibilities, findings that contradicted

results of another study done by M. Fox & Faver (1985). The presence of older

children (18 and above) had no effect on productivity by gender.

Mason & Goulden (2004) found that women with children under the age of

6 are least likely to secure tenure-track positions. Though it could be argued that

female faculty members with children are self-selecting out of tenure-track jobs,

past studies have indicated that female faculty members are far less likely to be

hired at research universities (Fassinger et al., 2004; Long & Fox, 1995). Female

faculty members in tenure-track jobs report spending greater amounts of their

time in teaching and service-related activities (Allen, 1994; Banks, 1984; Bellas

& Toutkoushian, 1999; Carr et al., 1998; Park, 1996). This is in part due to the

perception that women are more caring than men and are hence more likely to

be placed in mentoring roles. Women faculty members also are more likely to be

asked to serve on university-wide committees to fulfill diversity and institutional

goals (Garcia, 1974; Howe, 1980; Menges & Exum, 1983; Sandler, 1991). Thus,

based on the past research, this study poses the following.

Hypothesis 3: Female faculty members in public administration and

policy are likely to have lower publication productivity rates than

male faculty members even after controlling for family and workrelated

factors.

Hypothesis 4: Female faculty members in public administration and

policy with children in their preteen years are likely to have lower

publication productivity rates than male faculty members.


Leadership Patterns by Gender

Leadership of female faculty in public administration and policy programs in

the United States is an understudied area. Overwhelming evidence suggests that

women in local, state, and federal government jobs are less likely to be present in

high-ranking positions (Cornwell & Kellough, 1994; Guy, 1993; Guy & Newman,

2004; Harris, 2009; Hsieh & Winslow, 2006; Kellough, 1990; Kim, 1993; Kim

& Lewis, 1994; Lee & Cayer, 1987; Naff, 2001; Purcell & Baldwin, 2003; Rivera

& Ward, 2008; Selden & Selden, 2001). Currently, no studies examine the

role of female faculty members in positions of chair or dean in public administration

and related programs. Based on the overwhelming evidence suggesting

that female faculty members are far less likely to be in senior ranking positions,

this study poses the following hypothesis.

Hypothesis 5: Female faculty members in public administration and policy

are less likely to be in positions of chair, department head, or program

head when compared with male faculty.

Data and Methodology

Data for this study are taken from the 2003 Survey of Doctorate Recipients

(SDR), which is nationally conducted and funded by the National Science

Foundation (NSF) and the National Institutes of Health (NIH). The survey is

conducted biennially. The latest wave 2006 SDR was not used for this research,

because the productivity variables were omitted. Even though these data are old,

the 2003 data set was chosen because it asks questions related to faculty research

productivity that are not asked in the more recent 2006 survey. 3

The sample consists of doctorate recipients with highest degrees in public

administration or public policy studies. Individuals in the data are full-time

academics working at four-year colleges or universities. Part-time faculty members

and postdoctoral research fellows are not included in the sample, because the career

trajectories of both these groups of individuals can be very different from full-time

faculty members employed in four-year institutions. After filtering out part-time faculty

and postdoctoral researchers, the sample totaled 1,275 respondents (weighted).

Individuals with a doctoral degree in public health were excluded because several

of these faculty members have medical degrees resulting in divergent career trajectories,

publication norms, and salary structures. The variables used from the SDR

data set range from various demographic variables (such as gender, race, ethnicity,

marital status, children status) to work-related factors (rank, tenure, Carnegie

classification of the employer, 4 primary work activity, salary, and research productivity

of faculty members). The analysis was computed using the weighted variables

created by NSF. Weights in this type of sample survey data are produced to make

statistics more representative of the population. The weighted variable is defined

as the reciprocal of the probability of selection under the sample design and is further

adjusted for nonresponse. Data analyses are performed using an independent sample

t-test of the aforementioned variables by gender. To control for the impact of

gender on publishing, an ordinary least squares regression is computed.
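A minimal sketch of these two analyses appears below; it is not the author's code, and the file name and column names stand in as hypothetical placeholders for the restricted SDR variables.

# Minimal sketch of the analyses described above: an independent-samples t-test by
# gender and an OLS model of annual article counts with controls.
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

df = pd.read_csv("sdr_2003_subset.csv")  # hypothetical extract of the weighted SDR file

# Independent-samples t-test comparing annual article rates by gender
t_stat, p_value = stats.ttest_ind(
    df.loc[df["female"] == 1, "articles_per_year"],
    df.loc[df["female"] == 0, "articles_per_year"],
    equal_var=False,
)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

# OLS with demographic, institutional, and career controls
model = smf.ols(
    "articles_per_year ~ female + age + married + children_under_6 + "
    "research_university + tenure + C(rank) + salary",
    data=df,
).fit()
print(model.summary())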


The dependent variable used in this study is productivity, which is measured

by the annual number of peer-reviewed scholarly articles published by individual

faculty members from 1998 to 2003. The survey asked respondents to report the

number of articles (co-)authored or accepted for publication in peer-reviewed

journals since 1998, which is for a total of five years since the date of the survey

(2003). Because not all faculty members have the entire five years to publish,

annual rate of article productivity was computed. For those with highest degree

granted in or before 1998, annual number of articles published was calculated as

a measure of total articles between 2003 and 1998 divided by 5 years. For those

with highest degree granted after 1998, annual number of articles published

equals total articles between 2003 and 1998 divided by experience, which is

calculated by subtracting the year of highest degree from the survey year 2003.

Though these data are self-reported, past research has established high correlation

between self-reported and indexed publications (Allison & Stewart, 1974; Cole

& Zuckerman, 1984; M. Fox & Faver, 1985).
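The rule described above reduces to a small calculation, restated below in Python; the one-year floor for 2003 graduates is an assumption added here to avoid division by zero, and the field names are illustrative.

# Annual article rate as described above: articles since 1998 divided by the
# five-year window, or by years since the highest degree for post-1998 PhDs.
def annual_article_rate(articles_since_1998, degree_year, survey_year=2003):
    """Articles per year over 1998-2003, using years since degree for recent PhDs."""
    window = survey_year - 1998  # five-year publication window
    if degree_year <= 1998:
        years = window
    else:
        # Floor of one year is an assumption added here to avoid dividing by zero
        # for respondents whose degree year equals the survey year.
        years = max(survey_year - degree_year, 1)
    return articles_since_1998 / years

print(annual_article_rate(10, 1995))  # 2.0 articles/year over the full window
print(annual_article_rate(4, 2001))   # 2.0 articles/year over two years of experience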

The model incorporates various control variables that are documented in the

literature (Cole & Singer, 1991; M. Fox, 2005; Stack, 2004; Xie & Shauman, 1998).

These include demographic variables like age, marital status, gender, and children.

Also included are employer type, work activity, salary, and rank. Publication counts

or “straight counts” are the most commonly used methods for assessing research

productivity (Cole & Cole, 1973; Corley 2005; Lange, 2001; Lindsey, 1980;

Morgan et al., 1981; Sabharwal & Corley, 2008). As the name suggests, straight

counts measure the number of publications a researcher produces. This approach

is occasionally treated as a quantification of the peer-review process, because each

publication would have received the seal of approval through the expert judgment

of peers (King, 1987; Melkers, 1993; Weingart, 2005). As a program evaluation

tool, publication count can serve as an indicator of progress by estimating the

number of publications per research dollar spent. The biggest drawback of this

method is that, when used alone, it reveals only the quantity, not the quality of

the publication produced (Melkers, 1993). Quantity and quality of publications

are highly correlated, and research by Cole & Zuckerman (1984) showed a .50

to .75 correlation between publication and citation counts. This finding suggests

that faculty members who are prolific publishers also heavily influence the research

in the field by being cited by other authors. Thus the use of publication count

can partly serve as a measure of quality.

To measure the leadership roles of faculty members in public administration

and policy programs, this study gathered information featured on the National

Association of Schools of Public Affairs and Administration (NASPAA) website.

Data were collected for the 241 schools listed on the NASPAA website, and the

names of directors and department heads were taken from the websites of individual

schools. The names of various program heads were further coded for gender and

Carnegie classification of the institution. 5 Independent sample t-tests were employed

to find differences in personal, career, and institutional characteristics by gender.


Results

Gender as It Relates to Various Personal and Work Factors

Overall, close to 40% of the sample were female faculty members, and about

60% were male. The respondents hold various academic positions. Overall, close

to one third of the faculty members are assistant professors (34%), followed by

associate professors (24.4%), full professors (22.5%), and instructors and other positions (9.6%). An overwhelming majority of faculty members are

white, non-Hispanic males. Female faculty members have higher representation

among black, non-Hispanic, and Hispanic faculty. Overall, 77.5% of the respondents

were white, non-Hispanic; 14.7% were black, non-Hispanic; 2.1% Hispanic,

all races; and 5.6% were Asian. The t-test results (see Table 1) show a significant

difference by gender in the percentage of faculty employed at each rank. The results in

Table 1 suggest that female faculty members are significantly younger and are less

likely to be married or living in a marriage-like relationship as compared with male

faculty (79% vs. 88%). Female faculty members in public administration and

policy programs also are significantly less likely than male faculty to be present

in full professor positions. The results verify the first hypothesis, that female

faculty members are less likely to be in senior-ranking positions as compared

with male faculty members in public administration and policy programs. The

results also are in line with previous studies in public administration (Hale, 1999;

Rubin, 1990, 2000; Slack et al., 1996).

Additionally, the mean difference between salaries by gender is significant

at p < .001 using a t-test. Female faculty members on average are compensated

approximately $10,000 less than male faculty, thus verifying past studies that

found female faculty earn less than male faculty (Barbezat, 1988; Bellas, 1994;

Hamermesh, 1993; Levin & Stephan, 1998; Perna, 2003; Slack et al., 1996;

Weiler, 1990). The variation in salary also can be attributed to rank and

tenure differences.

Results in Table 1 show that female faculty members are significantly (p < .001)

more likely to be employed at research institutions than male faculty members

(48% female vs. 23% male). These findings are encouraging for female faculty

members in public administration and policy given that research in other disciplines

has indicated that female faculty are much less likely to be employed at research

I or II universities (Fassinger et al., 2004; Long & Fox, 1995). Additionally, results

in Table 1 show that female faculty members were more likely (30%) than

male faculty (16%) to report research and development as their primary work

activity. This result is not surprising given the higher proportion of female faculty

members in public administration and policy employed at research universities.

These findings are very different from studies that have reported female faculty

members spend a greater proportion of their time in teaching and service-related

activities when compared with time spent on research (Banks, 1984; Carr et al., 1998).

Table 1.
Differences in Personal, Institutional, and Career Characteristics by Gender

                                              Total             Female               Male
                                              (N = 1,275)       (N = 507, 39.8%)     (N = 768, 60.2%)
                                              N        %        N        %           N        %
I. Personal Characteristics
  Race/Ethnicity***
    % White, non-Hispanic                     985      77.5     350      69.7        635      82.7
  Age**
    Mean age                                  1,275    46.7     507      45.7        768      47.3
  Marital Status***
    % Married                                 1,073    84.1     399      78.7        674      87.8
  % Children living with parents (NS)         807      63.3     326      64.3        481      62.5
II. Career Characteristics
  Rank*
    % Professor                               286      22.5     99       19.5        187      24.3
  Tenure Status***
    % Tenured                                 477      37.4     104      20.5        373      48.5
  % Dean/academic chair position***           262      20.5     73       14.4        189      24.6
III. Institutional Characteristics
  Carnegie Classification of Employer***
    % Research I/II                           422      33.1     244      48.1        178      23.2
  Primary Work Activity***
    % Research and development                273      21.4     151      29.8        122      15.9
  Salary, mean ($)***                         1,275    71,595   507      65,458      768      75,642

Note. Difference-of-means tests (t-tests) are presented for female and male faculty members for personal, institutional, and career-related characteristics. NS = not significant.
* p < .05; ** p < .01; *** p < .001

The results of this study are unable to verify the second hypothesis, which

states that female faculty members in public administration and policy are less

likely to be employed at research universities than male faculty, and thus are less

likely to spend significant amounts of their time in research-related activities.

Does increased time spent on research translate into increased publication productivity

for women? To answer this question, the next section analyzes the research

productivity of public administration and policy faculty by gender.


Research Productivity by Gender

To further analyze the productivity of faculty members by gender, an ordinary

least squares regression was conducted, controlling for various personal, institutional,

and career factors. The dependent variable used for this study is the annual

number of peer-reviewed articles published. Table 2 provides the results of two

regression models of research productivity and gender. Model 1 controls for various

demographic, institutional, and career factors; Model 2 has various interactions

with gender and age of children and gender and type of institution, in addition

to the controls used in Model 1. The study found that Model 1 explained 67.6%

of the variance in annual article productivity rates. The model used

rank as a proxy for tenure and age due to problems with multicollinearity. The

current model has very low variance inflation factors (all below 10) and tolerance

levels greater than .1, indicating no problems with multicollinearity in

the model.
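A multicollinearity check of this kind can be sketched as follows; the statsmodels helper is a standard diagnostic tool, and the predictor names are hypothetical stand-ins rather than the author's actual variables.

```python
# Illustrative multicollinearity check with hypothetical predictor names;
# the thresholds (VIF < 10, tolerance > .1) are the ones cited in the text.
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

df = pd.read_csv("sdr_2003_subset.csv")  # hypothetical extract of the 2003 SDR
predictors = ["female", "married", "research_univ", "salary",
              "assistant_prof", "associate_prof"]

X = sm.add_constant(df[predictors])
for i, name in enumerate(predictors, start=1):  # index 0 is the constant
    vif = variance_inflation_factor(X.values, i)
    print(f"{name}: VIF = {vif:.2f}, tolerance = {1.0 / vif:.2f}")
```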

The results in Model 1 suggest that female faculty members publish 0.6

fewer articles annually than their male counterparts after controlling for factors like

marital status, race/ethnicity, ages of children, employer institution type, faculty

rank, salary, and primary work activity. The findings are consistent with prior

research on scholarly productivity of public administration faculty by gender

(Rubin, 1990, 2000; Slack et al., 1996). Additionally, this research found that

faculty members belonging to Asian and Hispanic race/ethnic groups were more

productive compared with white, non-Hispanic public administration faculty,

which is different from the finding of Kellough and Pitts (2005), who reported

that faculty of color have lower submission and acceptance rates in PAR. 6

As expected, there was a direct relationship between working at a research

university and scholarly productivity. The effects of location were similarly

confirmed by Stack (2004) in his study on publications and gender productivity

across disciplines. The resources available to faculty members at a research university, such as graduate teaching assistants, doctoral students for research collaboration, reduced teaching loads, start-up funds, and research monies for travel to conferences and workshops, help increase research productivity relative to faculty members at liberal arts or comprehensive institutions, who might not have such resources and opportunities available. Faculty

who reported spending a major proportion of their time on research-related

activities rather than management and administration had a higher number

of articles published annually in refereed journals. Spending more time on

teaching-related activities led to higher research productivity. These results

contradict studies that report an inverse relationship between time spent on

teaching and research (Bellas & Toutkoushian, 1999; M. Fox, 1992; Olsen,

Maple, & Stage, 1995).

Assistant professors were more productive than full professors in public

administration. These findings contradict past research that shows lower


productivity among junior faculty members (Baldwin & Blackburn, 1981;

Bonzi, 1992; Latif & Grillo 2001; Olsen et al., 1995; Smart, 1990). These

studies found that younger faculty members are more likely to spend a large

proportion of their time on teaching-related activities, which affects their

scholarly productivity in a negative way.

The findings in this study suggest that publication productivity of faculty

members with children of any age is lower than that of faculty members without children, a factor that weighs more heavily on women, since male faculty members with young children have reported higher publication and tenure rates (Ginther & Kahn, 2006; Stack, 2004). The most common

explanation offered in the literature for lower productivity among female

faculty members is the presence of children (Carr et al., 1998; Kyvik & Teigen,

1996; Sax et al., 2002; Stack, 2004). To explore this factor further, several interaction

terms are introduced between gender and ages of children in the second model,

the results of which are presented in Table 2.
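In formula terms, Model 2 simply adds products of the female indicator with the child-age and institution-type dummies. A hedged sketch using the statsmodels formula interface, with invented variable names standing in for the SDR fields, might look like this:

```python
# Sketch of Model 2's interaction terms using the statsmodels formula API;
# variable names are hypothetical stand-ins for the SDR fields.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("sdr_2003_subset.csv")  # hypothetical extract of the 2003 SDR

model2 = smf.ols(
    "articles_per_year ~ female + child_2_5 + child_6_11 + child_12_18"
    " + child_19plus + female:child_2_5 + female:child_6_11"
    " + female:child_12_18 + female:child_19plus"
    " + female:liberal_arts + female:comprehensive"
    " + C(race) + C(marital_status) + C(carnegie_class)"
    " + C(work_activity) + C(rank) + salary",
    data=df,
).fit()
print(model2.summary())
```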

The difference in research productivity by gender disappears when several

interaction terms between female faculty and ages of children, and female faculty

and type of employer, are introduced. This partially confirms Hypothesis 3,

which suggests that female faculty have a lower publication rate as compared

with male faculty members. When controlling for institutional, career, and

demographic factors, Hypothesis 3 holds up (Model 1). However, when

additional controls are added (i.e., interaction terms between female faculty

members and ages of children and type of employer), the relationship between

gender and publication productivity becomes insignificant. In other words, men

and women do not publish at varying rates when interactions between ages of

children, institution type, and female faculty members are factored into the

regression equation. The lower research productivity of female faculty reported

in Model 1 disappears with the addition of various interaction terms in Model 2.

To conclude that women faculty members are less productive than male faculty

members might be misleading without taking children and institution type into

consideration. The results thus confirm Hypothesis 4 that female faculty members

in public administration and policy with children in their preteen years are likely

to have lower publication productivity rates than male faculty members.

While the remaining relationships (race/ethnicity, marital status, employer

type, and primary work activity) are the same as in Model 1, the impact of rank

on research productivity is different. In Model 2, full professors are the most

productive when compared with assistant and associate professors. Full professors

publish approximately one full article more than associate professors annually,

while the gap is much lower for assistant professors. Overall, Model 2 explains

83.9% of the variance in research productivity, which is an additional 16.3% of

the variance when compared with Model 1.
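That is, the gain from adding the interaction terms is simply the difference between the two adjusted R-squared values reported in Table 2:

```latex
\[
\Delta R^2_{\text{adj}} = .839 - .676 = .163
\]
```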


Table 2.
Relationship between Annual Article Productivity (a) and Gender

                                                             Model 1          Model 2
Independent Variable                                         Unstd. Coeff.    Unstd. Coeff.
Constant                                                      1.638***         1.593***
I. Personal Characteristics
  Female                                                      –.596***          .159 (NS)
  Race/Ethnicity (White, non-Hispanic serves as the reference group)
    Asian, non-Hispanic                                        .918***         1.037***
    Black, non-Hispanic                                       –.643***        –1.551***
    Hispanic, all races
  Marital Status (Married serves as the reference group)
    Never married                                             –.767***        –2.375***
    Divorced and separated                                     .082 (NS)       –.053 (NS)
  Children aged 2–5 living with parents                      –1.349***         –.553***
  Children aged 6–11 living with parents                      –.549***         1.424***
  Children aged 12–18 living with parents                     –.119 (NS)        .187*
  Children aged 19 and older living with parents               .331**          1.284***
II. Institutional Characteristics
  Carnegie Classification of Employer (Research I/II serves as the reference group)
    Doctoral I/II                                            –1.466***        –2.662***
    Comprehensive I/II                                       –1.243***        –1.434***
    Liberal Arts I/II                                        –1.758***        –2.429***
  Primary Work Activity (Research and development serves as the reference group)
    Teaching                                                   .221*            .129*
    Management and Administration                             –.347*          –2.655***
    Other                                                      .953***          .711***
  Salary                                                      1.534E-6         6.011E-6***
III. Career Characteristics
  Rank (Professor serves as the reference group)
    Assistant Professor                                        .598***         –.226*
    Associate Professor                                       –.134*           –.912***
IV. Interaction Variables
  Female × Children aged 2–5 living with parents                              –1.136***
  Female × Children aged 6–11 living with parents                             –1.308***
  Female × Children aged 12–18 living with parents                             1.200***
  Female × Children aged 19 and older living with parents                      –.120 (NS)
  Female Faculty at Liberal Arts                                               1.096**
  Female Faculty at Comprehensive                                               .273
Adjusted R Square                                              .676             .839
                                                              (F change = 74.392 (b))   (F change = 110.290 (b))

(a) For those with highest degree granted in or before 1998, Annual Number of Articles Published = (Total Articles Published 1998–2003) ÷ 5 years. For those with highest degree granted after 1998, Annual Number of Articles Published = (Total Articles Published 1998–2003) ÷ (2003 – Year of Highest Degree).
NS = Not significant. (b) F change is significant at p < .0001.

Leadership Patterns by Gender

A systematic examination of gender patterns among 241 schools offering degrees

in public administration, affairs, policy, and management listed on the NASPAA

website revealed that women are half as likely as men to be in leadership roles as

department heads and chairs (33% vs. 67%). Further analysis of leadership patterns

by gender and type of institution is presented in Table 3.

Table 3.
Leadership Patterns by Type of Institution and Gender of Public Administration/Policy/Management Faculty

                                                 Female               Male
                                                 (N = 79, 32.8%)      (N = 162, 67.2%)
Carnegie Classification                          N        %           N        %
  Research University Very High and High         35       44.3        83       51.2
  Doctoral Research Universities                  8       10.1        19       11.7
  Master's Large/Medium/Small Programs           33       41.8        58       35.8
  Baccalaureate Colleges                          3        3.8         2        1.2

Note. Differences in means (t-tests) are not significant for the various categories of universities by gender.


Though the differences are not statistically significant, the percentage of female faculty members holding leadership positions in research I/II universities is lower than that for male faculty. Institutions classified as Master's (small/medium/large) and Baccalaureate had higher percentages of women in positions of academic chair as

compared to the percentages of male faculty members. Overall, the results indicate

that public administration as a discipline has certainly progressed over the years,

but considerable work remains to be done before women can achieve parity with

men in leadership positions (Rubin, 2000).

Discussion and Conclusion

The purpose of this research was to measure faculty research productivity and

advancement by gender. To date, no study has empirically analyzed

the gender differences in research productivity of public administration and policy

faculty. Past studies on faculty productivity in public administration have taken a

narrow view on productivity by assessing the number of publications in the premier

journal of the discipline, Public Administration Review (PAR). Public administration is a highly interdisciplinary field; thus, using PAR as the sole publication

outlet to capture productivity patterns might be misleading. This study adds to

that literature by using a national database that accounts for all refereed articles

published by individual authors over a span of five years. However, these data are

limited: they do not distinguish between sole-authored, coauthored, and multiple-authored articles; they do not capture the quality of the published articles (all are peer-reviewed, but journal impact factors are not available); and they are self-reported. Another limitation is that the data used for this study are dated. Since such a survey is not conducted by NASPAA/ASPA, the current study relies on the best available data set that

captures faculty productivity and career trajectory. Future studies can use more

recent 2008 SDR data to examine productivity patterns by gender.

The study finds female faculty members in public administration or public

policy are less likely to be in senior-ranking positions. Does this indicate some form of an academic glass ceiling, as some previous studies have suggested? Since a

majority of female faculty members are younger than male faculty, and close to

40% (N = 199) were not in tenure-track positions (or tenure was not applicable),

it is difficult to make outright claims of discriminatory practices in decisions of

hiring and promotion. It is not clear whether the presence of women in low-ranking

positions is due to certain barriers confronted by women, or because they have

entered academia much later than men, or perhaps a combination of these factors.

This issue should be explored in future studies with the use of longitudinal data

and interviews.

Women in public administration and policy are being hired at a higher rate

in research universities and report spending a greater proportion of their time on

research-related activities when compared with male faculty members who are

more likely to be hired at doctoral, comprehensive, and liberal arts colleges. The

results are thus unable to confirm Hypothesis 2 and are certainly reflective of the


positive steps institutions have made in hiring female faculty. Despite being hired

at higher rates in research universities and spending greater proportions of their

time in research-related activities, women respondents in Model 1 report lower

productivity than male faculty members (controlling for various personal, institutional,

and career factors). This could be for several reasons. Firstly, data in Table 1

suggest that very few female faculty members hold senior-ranking positions in

public administration and policy programs, resulting in fewer mentoring opportunities

for young female faculty members. Women are more likely to seek female mentors

with whom they feel most comfortable and with whom they are able to self-identify

(R. Fox & Schuhmann, 2001; Guy, 1993; Ragins & Cotton, 1999). Guy (1993,

p. 290) asserted that although “women benefit from having male mentors, they also need

mentors who have successfully forded the barriers that confront women but which

men may not even be aware exist.” Mentoring by senior faculty can result in research

collaborations, which in turn can increase faculty publication rates (Slack et al.,

1996). Secondly, lower productivity of female faculty members might be a result

of choices these faculty members make in selecting outlets for their research. A

recent study conducted by Corley & Sabharwal (2010) suggests that although

women in public administration publish fewer articles than men, they are likely

to publish in high-impact journals. Quality, rather than quantity, of publications might be paramount to female faculty members.

However, the results in Model 2 differ from those in Model 1: in Model 2, no

difference in publication productivity is found between male and female faculty

members. Contrary to popular belief that female faculty members are less productive

than male faculty (Bellas & Toutkoushian, 1999; Cole & Zuckerman, 1984; Cole

& Singer, 1991; Corley, 2005; M. Fox, 2005; Levin & Stephan, 1998; Long &

Fox, 1995; Pfeffer & Langton, 1993; Sonnert & Holton, 1995; Xie & Shauman,

1998), the current study finds that the productivity gap disappears when ages of

children and institution type are taken into consideration. Female researchers with

children ages 11 and younger publish fewer articles than men; however, the greatest

negative impact is reported by female faculty members with children in their preteen

years (6 to 11), thus confirming Hypothesis 4. The results are consistent with

previous research in other fields of study (Hochschild, 1997; Stack, 2004). Thus

the productivity gap reported in Model 1 is not a function of innate and intellectual

differences in female scholars’ ability to publish, but rather a function of family- and work-related factors. Women oftentimes are disproportionately in caregiving

roles, thus creating a gender role deficit that helps explain why women publish

less when compared with their male colleagues. The challenge of balancing work

and child care responsibilities can take time away from scholarship. The results of

this study can help university policy makers recognize the importance of family-friendly

policies that can cater to the needs of women faculty, especially those

with younger children.

These results should be interpreted with caution since the sample included

only full-time faculty members, and the analysis does not control for time spent by


faculty members on mentoring/advising students or community outreach activities,

which are common pursuits among public administration scholars. Activities relating

to advising and mentoring are often disproportionately carried out by female faculty

members (Allen, 1994; Banks, 1984; Bellas and Toutkoushian, 1999; Carr et al.,

1998; Park, 1996). Despite opportunities available to female faculty members (in

most institutions) to stop their tenure clock during the birth or adoption of a child,

some women scholars do not choose this option due to the fear of unfavorable reviews

they might receive at the time of tenure or promotion (Ropers-Huilman, 2003).

There are currently no studies on this issue in the public administration literature.

Future studies could further investigate whether similar patterns emerge for female

faculty members teaching and researching in public administration programs.

The results in Table 1 using the SDR data suggest that women are significantly

less likely to be in dean and administrative positions. Using the NASPAA data, the current study finds higher percentages of women in leadership positions in schools

offering master’s and baccalaureate programs than in research and doctoral universities.

More research is required to understand institutional differences that lead to varying

patterns of leadership roles among female faculties. Past studies have attributed

this gap to differences in leadership style among men and women leaders: Men

emphasized a quid pro quo approach focusing on self-interest, while women were

found to meet the needs and expectations of their followers (Dominici, Fried, &

Zeger, 2009). In addition, a few studies suggest that women currently in leadership

roles are often not appreciated for their achievements and lack the much-needed

support and encouragement from senior leaders or peers (Dominici et al., 2009;

Greenhaus & Parasuraman, 1993). However, the most common deterrent cited

for the lack of women in leadership roles is the exclusion from informal networks

and the lack of mentorship opportunities provided to them (Chliwniak, 1997;

Dominici et al., 2009; Niemeier & González, 2004). Most of these studies are

conducted in science and engineering, fields documented to have a hostile environment for female faculty in general. Social science, by contrast,

is perceived as being free of such hostilities due to higher proportions of women

in disciplines like psychology and sociology.

Another important finding reported in this study is that increased time spent

in teaching does not lower the publication productivity of faculty members, which

is contrary to previous reports on this subject (Bellas & Toutkoushian, 1999; M.

Fox, 1992; Olsen et al., 1995). The results help dispel the notion that teaching is

divorced from research. In fact, both these activities are mutually reinforcing in

advancing the scholarship of public administration faculty. Assessing scholarly

productivity in academia should not be an end in itself, but rather a means to

creating an inclusive environment that promotes diversity and equal opportunity.

Universities need to recognize what is working and do away with what is not, and

continue to promote equitable environments by valuing diversity in all its forms.


Footnotes

1 The use of NSF data does not imply NSF endorsement of the research methods or conclusions

contained in this report.

2 For more details on the current state of women in public administration (public, private,

and nonprofits), see Women in Public Administration: Theory and Practice (2010), edited by

Maria J. D’Agostino and Helisse Levine. This book expounds on the role women have played

in public service, and it informs theory and practice of women public administrators from a

gendered perspective.

3 The NSF survey is biennial. In 2006, NSF did not ask research productivity-related questions;

instead, it focused on international collaborations and postdoctoral issues. NSF reintroduced

productivity-related questions in the most recent 2008 survey. The author has requested access to

NSF’s 2008 SDR restricted data.

4 The data reported are for the 2003 SDR, which uses the old Carnegie Classification system that

includes the following categories: Research I/II, Doctoral I/II, Comprehensive I/II, Liberal Arts

I/II, two-year colleges, and specialized institutions. The new classification system that came into

effect in 2005 no longer uses I and II designations. However, since these data are from 2003, the

old classification system is pertinent.

5 The NASPAA data are current and thus use the newer Carnegie classification system, which

includes these categories:

1. Doctoral-Granting Universities, which are further classified into RU/VH: Research Universities

(very high research activity), RU/H: Research Universities (high research activity), and

DRU: Doctoral/Research Universities

2. Master’s Colleges and Universities: Master’s/L: Master’s Colleges and Universities (larger

programs), Master’s/M: Master’s Colleges and Universities (medium programs), Master’s/S:

Master’s Colleges and Universities (smaller programs)

3. Baccalaureate Colleges: Bac/A&S: Baccalaureate Colleges—Arts & Sciences, Bac/Diverse:

Baccalaureate Colleges—Diverse Fields, Bac/Assoc: Baccalaureate/Associate’s Colleges

4. Associate’s Colleges, which have 14 different sub-classifications of colleges offering

two-year degrees

5. Special Focus Institutions

6. Tribal Colleges

6 Direct comparisons with Kellough and Pitts’s (2005) study are hard to make, because their research

analyzed the submission and publication rates in PAR and did not compute scholarly productivity

across other journals.

References

Adams, G. B. (1992). Enthralled with modernity: The historical context of knowledge and theory

development in public administration. Public Administration Review 52(4), 363–373.

Adams, G. B., & White, J. D. (1994). Dissertation research in public administration and cognate

fields: An assessment of methods and quality. Public Administration Review 54(6), 565–576.


Adams, W. C. (1983). Reputation, size, and student success in public administration/affairs programs.

Public Administration Review 43(5), 443–446.

Allen, H. L. (1994). Workload and productivity in an accountability era. Almanac of Higher Education,

25–38.

Allison, P. D., & Stewart, J. A. (1974). Productivity differences among scientists: Evidence for accumulative

advantage. American Sociological Review 39(4), 596–606.

Astin, H. S. (1969). The woman doctorate in America: Origins, career, and family. Russell Sage Foundation.

Astin, A. W., Korn, W. S., & Lewis, E. L. (1991). The American college teacher: National norms for

the 1989–90 HERI Faculty Survey. Los Angeles: Higher Education Research Institute, Graduate

School of Education, UCLA.

Astin, H. S., & Davis, D. E. (1985). Research productivity across the life and career cycles: Facilitators

and barriers for women. In M. F. Fox (Ed.), Scholarly writing and publishing: Issues, problems, and

solutions (pp. 147–160). Boulder, CO: Westview Press.

Baldwin, R. G., & Blackburn, R. T. (1981). The academic career as a developmental process:

Implications for higher education. Journal of Higher Education, 52(6), 598–614.

Banks, W. M. (1984). Afro-American scholars in the university: Roles and conflicts. American Behavioral

Scientist, 27(3), 325–338.

Barbezat, D. A. (1988). Gender differences in the academic reward system. In D. W. Breneman & T. I.

K. Youn (Eds.), Academic labor markets and careers (pp. 138–164). New York: Falmer Press.

Bellas, M. L. (1994). Comparable worth in academia: The effects on faculty salaries of the sex composition

and labor-market conditions of academic disciplines. American Sociological Review 59(6), 807–821.

Bellas, M. L., & Toutkoushian, R. T. (1999). Faculty time allocations and research productivity:

Gender, race and family effects. Review of Higher Education 22(4), 367–390.

Bonzi, S. (1992). Trends in research productivity among senior faculty. Information Processing &

Management 28(1), 111–120.

Box, R. C. (1992). An examination of the debate over research in public administration. Public

Administration Review 52(1), 62–69.

Brewer, G. A., Douglas, J. W., Facer II, R. L., & O’Toole, L. J., Jr. (1999). Determinants of graduate

research productivity in doctoral programs of public administration. Public Administration Review

59(5), 373–382.

Carr, P. L., Ash, A. S., Friedman, R. H., Scaramucci, A., Barnett, R. C., Szalacha, L., et al. (1998). Relation

of family responsibilities and gender to the productivity and career satisfaction of medical faculty.

Annals of Internal Medicine 129, 532–538.

Chliwniak, L. (1997). Higher education leadership: Analyzing the gender gap. ASHE-ERIC Higher

Education Report 25(4), 1–138.

Clark, S. M., & Corcoran, M. (1986). Perspectives on the professional socialization of women faculty:

A case of accumulative disadvantage? Journal of Higher Education 57(1), 20–43.


Cleary, R. E. (1992). Revisiting the doctoral dissertation in public administration: An examination of

the dissertations of 1990. Public Administration Review 52(1), 55–61.

———. (2000). The public administration doctoral dissertation reexamined: An evaluation of the

dissertations of 1998. Public Administration Review 60(5), 446–455.

Cole, J. R., & Cole, S. (1973). Social stratification in science. Chicago: University of Chicago Press.

Cole, J. R., & Singer, B. (1991). A theory of limited differences: Explaining the productivity puzzle in

science. In H. Zuckerman, J. R. Cole, & J. T. Bruer (Eds.), The outer circle: Women in the scientific

community (pp. 277–310). New Haven, CT: Yale University Press.

Cole J. R., & Zuckerman, H. (1984). The productivity puzzle: Persistence and change patterns of publication

among men and women scientists. In M. W. Steinkamp, M. L. Maehr, D. A. Kleiber, &

J. G. Nicholls (Eds.), Advances in motivation and achievement (pp. 217–258). Greenwich, CT:

JAI Press.

Condit, D. M., & Hutchinson, J. R. (1997). Women in public administration: Extending the metaphor

of the emperor’s new clothes. American Review of Public Administration, 27(2), 181–197.

Corley, E. A. (2005). How do career strategies, gender, and work environment affect faculty

productivity levels in university-based science centers? Review of Policy Research, 22(5), 637–655.

Corley, E. A., & Sabharwal, M. (2010). Scholarly collaboration and productivity patterns in public

administration: Analyzing recent trends. Public Administration, 88(3), 627–648.

Cornwell, C., & Kellough, J. E. (1994). Women and minorities in federal government agencies: Examining

new evidence from panel data. Public Administration Review, 54(3), 265–276.

Cozzetto, D. A. (1994). Quantitative research in public administration: A need to address some serious

methodological problems. Administration & Society, 26(3), 337–343.

D’Agostino, M. J., & Levine, H. (Eds.). (2010). Women in public administration: Theory and practice. Sudbury, MA: Jones & Bartlett Learning.

Dominici, F., Fried, L. P., & Zeger, S. L. (2009). So few women leaders: It’s no longer a pipeline problem,

so what are the root causes? Academe, 95(4), 25–27.

Douglas, J. W. (1996). Faculty, graduate student, and graduate productivity in public administration

and public affairs programs, 1986–1993. Public Administration Review, 56(5), 433–440.

Durbin, E., Ospina, S., & Schall, E. (1999). Living and learning: Women and management in public

service. Journal of Public Affairs Education, 5(1), 25–41.

Fairweather, J. S. (2002). The mythologies of faculty productivity: Implications for institutional policy

and decision making. Journal of Higher Education, 73(1), 26–48.

Farber, M., Powers, P., & Thompson, F. (1984). Assessing faculty research productivity in graduate

public policy programs. Policy Sciences, 16(3), 281–289.

Fassinger, R. E., Scantlebury, K., & Richmond, G. (2004). Career, family, and institutional variables

in the work lives of academic women in the chemical sciences. Journal of Women and Minorities in

Science and Engineering, 10(4), 297–316.


Forrester, J. P. (1996). Public administration productivity: An assessment of faculty in PA programs.

Administration & Society, 27(4), 537–566.

Fox, M. F. (1992). Research, teaching, and publication productivity: Mutuality versus competition in

academia. Sociology of Education, 65(4), 293–305.

Fox, M. F. (2005). Gender, family characteristics, and publication productivity among scientists. Social

Studies of Science, 35(1), 131–150.

Fox, M. F., & Faver, C. A. (1985). Men, women, and publication productivity: Patterns among social

work academics. Sociological Quarterly, 26(4), 537–549.

Fox, R. L. & Schuhmann, R. A. (2001). American Review of Public Administration, 31(4), 381–392.

Garcia, R. L. (1974). Affirmative action hiring: Some perceptions. Journal of Higher Education, 45(4),

268–272.

Ginther, D. K., & Kahn, S. (2006). Does science promote women? Evidence from academia 1973–2001.

Greenhaus, J. H., & Parasuraman, S. (1993). Job performance attributions and career advancement prospects: An examination of gender and race effects. Organizational Behavior and Human Decision Processes, 55(2), 273–297.

Guy, M. E. (1993). Three steps forward, two steps backward: The status of women’s integration into

public management. Public Administration Review 53(4), 285–292.

Guy, M. E., & Newman, M. A. (2004). Women’s jobs, men’s jobs: Sex segregation and emotional labor.

Public Administration Review 64(3), 289–298.

Guyot, J. F. (2008). Is the ceiling truly glass or something more variable? Society 45(6), 529–533.

Hale, M. (1999). He says, she says: Gender and worklife. Public Administration Review 59(5), 410–424.

Hamermesh, D. S. (1993). Treading water: The annual report on the economic status of the profession.

Academe, 79(2), 6–7.

Hargens, L. L., & Long, J. S. (2002). Demographic inertia and women’s representation among faculty

in higher education. Journal of Higher Education 73(4), 494–517.

Harris, G. L. A. (2009). Review of Public Personnel Administration 29(4), 354–372.

Hochschild, A. R. (1997). The time bind: When work becomes home and home becomes work. New York:

Henry Holt.

Houston, D. J., & Delevan, S. M. (1990). Public administration research: An assessment of journal

publications. Public Administration Review 50(6), 674–681.

Houston, D. J., & Delevan, S. M. (1994). A comparative assessment of public administration journal

publications. Administration & Society 26(2), 252–271.

Howe, R. (1980). Retrenchment policies and practices: A summary. Journal of the College and University

Personnel Association 31, 136–47.

Hsieh, C., & Winslow, E. (2006). Gender representation in the federal workforce: A comparison

among groups. Review of Public Personnel Administration 26(3), 276–295.


Kellough, J. E. (1990). Integration in the public workplace: Determinants of minority and female

employment in federal agencies. Public Administration Review 50(5), 557–566.

Kellough, J. E., & Pitts, D. W. (2005). Who contributes to public administration review? Examining

the characteristics of authors who submit manuscripts to the journal. Public Administration

Review 65(1), 3–7.

Kim, P. S. (1993). Racial integration in the American federal government: With special reference to

Asian Americans. Review of Public Personnel Administration 13(1), 52–66.

Kim, P. S., & Lewis, G. B. (1994). Asian Americans in the public service: Success, diversity, and

discrimination. Public Administration Review 54(3), 285–290.

King, J. (1987). A review of bibliometric and other science indicators and their role in research

evaluation. Journal of Information Science 13(5), 261–276.

Kraemer, K. L., & Perry, J. L. (1989). Institutional requirements for academic research in public

administration. Public Administration Review 49(1), 9–16.

Kyvik, S., & Teigen, M. (1996). Child care, research collaboration, and gender differences in scientific

productivity. Science, Technology & Human Values 21(1), 54.

Lange, L. L. (2001). Citation counts of multi-authored papers: First-named authors and further

authors. Scientometrics 52(3), 457–470.

Latif, D. A., & Grillo, J. A. (2001). Satisfaction of junior faculty with academic role functions.

American Journal of Pharmaceutical Education 65(2), 137–143.

Lee, D. S., & Cayer, N. J. (1987). Recruitment of minority students for public administration

education. Public Administration Review 47(4), 329–335.

Levin, S. G., & Stephan, P. E. (1998). Gender differences in the rewards to publishing in academe:

Science in the 1970s. Sex Roles 38(11), 1049–1064.

Lindsey, D. (1980). Production and citation measures in the sociology of science: The problem of

multiple authorship. Social Studies of Science 10(2), 145–162.

Long, J. S., Allison, P. D., & McGinnis, R. (1993). Rank advancement in academic careers: Sex

differences and the effects of productivity. American Sociological Review 58(5), 703–722.

Long, J. S., & Fox, M. F. (1995). Scientific careers: Universalism and particularism. Annual Review

of Sociology 21(1), 45–71.

Mani, B. (2001). American Review of Public Administration, 31.

Mason, M. A., & Goulden, M. (2004). Marriage and baby blues: Redefining gender equity in the academy. Annals of the American Academy of Political and Social Science, 596(1), 86–103.

McCurdy, H. E., & Cleary, R. E. (1984). Why can’t we resolve the research issue in public

administration? Public Administration Review 44(1), 49–55.

Melkers, J. (1993). Bibliometrics as a tool for analysis of R&D impacts. In B. Bozeman & J.

Melkers (Eds.), Evaluating R&D impacts: Methods and practice (pp. 43–61). Boston: Kluwer

Academic Publishers.


Menges, R. J., & Exum, W. H. (1983). Barriers to the progress of women and minority faculty.

Journal of Higher Education, 54(2), 123–144.

Morgan, D. R., & Meier, K. J. (1982). Reputation and productivity of public administration/affairs

programs: Additional data. Public Administration Review, 42(2), 171–173.

Morgan, D. R., Meier, K. J., Kearney, R. C., Hays, S. W., & Birch, H. B. (1981). Reputation and

productivity among U.S. public administration and public affairs programs. Public Administration

Review, 41(6), 666–673.

Naff, K. C. (2001). To look like America: Dismantling barriers for women and minorities in government.

Westview Press.

Niemeier, D. A., & González, C. (2004). Breaking into the guildmasters’ club: What we know about

women science and engineering department chairs at AAU universities. NWSA Journal, 16(1),

157–171.

Oldfield, K., Candler, G., & Johnson III, R. G. (2006). Social class, sexual orientation, and toward

proactive social equity scholarship. American Review of Public Administration, 36(2), 156–172.

Olsen, D., Maple, S. A., & Stage, F. K. (1995). Women and minority faculty job satisfaction:

Professional role interests, professional satisfactions, and institutional fit. Journal of Higher

Education, 66(3), 267–293.

Park, S. M. (1996). Research, teaching, and service: Why shouldn’t women’s work count? Journal of

Higher Education, 67(1), 46–84.

Perna, L. W. (2001). Sex and race differences in faculty tenure and promotion. Research in Higher

Education, 42(5), 541–567.

———. (2003). Studying faculty salary equity: A review of theoretical and methodological approaches.

In J. C. Smart (Ed.), Higher education: Handbook of theory and research Vol. XVIII (pp. 323–386).

The Netherlands: Kluwer Academic Publishers.

Pfeffer, J., & Langton, N. (1993). The effect of wage dispersion on satisfaction, productivity, and

working collaboratively: Evidence from college and university faculty. Administrative Science

Quarterly, 38(3), 382–407.

Purcell, M. A., & Baldwin, J. N. (2003). The relationship between dependent care responsibility and

employee promotions. Review of Public Personnel Administration, 23(3), 217–240.

Ragins, B. R., & Cotton, J. L. (1999). Mentor functions and outcomes: A comparison of men and

women in formal and informal mentoring relationships. Journal of Applied Psychology, 84(4),

529–550.

Rivera, M., & Ward, J. D. (2008). Employment equity and institutional commitments to diversity:

Disciplinary perspectives from public administration and public affairs education. Journal of

Public Affairs Education, 14(1), 9–20.

Rodgers, R., & Rodgers, N. (2000). Defining the boundaries of public administration: Undisciplined

mongrels versus disciplined purists. Public Administration Review, 60(5), 435–445.

Ropers-Huilman, B. (Ed.). (2003). Gendered futures in higher education: Critical perspectives for change.

Albany: State University of New York Press.


Rubin, M. M. (1990). Women in ASPA: The fifty-year climb toward equality. Public Administration

Review, 50(2), 277–287.

———. (2000). Women in the American society for public administration: Another decade of progress

but still a way to go. Public Administration Review, 60(1), 61–71.

Sabharwal, M. (2010). Research productivity and career trajectories of women in public administration.

In H. Levine & M. J. D’Agostino (Eds.), Women in public administration: Theory and practice.

New York: Jones & Bartlett.

Sabharwal, M., & Corley, E. A. (2008). Categorization of minority groups in academic science and

engineering. Journal of Women and Minorities in Science and Engineering, 14(4), 427–446.

Sandler, B. R. (1991). Women faculty at work in the classroom, or, why it still hurts to be a woman in

labor. Communication Education, 40(1), 6–15.

Sax, L. J., Hagedorn, L. S., Arredondo, M., & DiCrisi, F. A. (2002). Faculty research productivity:

Exploring the role of gender and family-related factors. Research in Higher Education, 43(4),

423–446.

Selden, S. C., & Selden, F. (2001). Rethinking diversity in public organizations for the 21st century:

Moving toward a multicultural model. Administration & Society, 33(3), 303–329.

Slack, J. D., Myers, N., Nelson, L., & Sirk, K. (1996). Women, research, and mentorship in public

administration. Public Administration Review, 56(5), 453–458.

Smart, J. C. (1990). A causal model of faculty turnover intentions. Research in Higher Education, 31(5),

405–424.

———. (1991). Gender equity in academic rank and salary. Review of Higher Education, 14(4), 511–526.

Sonnert, G., & Holton, G. (1995). Gender differences in science careers: The project access study. New

Brunswick, NJ: Rutgers University Press.

Stack, S. (2004). Gender, children and research productivity. Research in Higher Education, 45(8),

891–920.

Stallings, R. A., & Ferris, J. M. (1988). Public administration research: Work in PAR, 1940–1984.

Public Administration Review, 48(1), 580–587.

Toutkoushian, R. K. (1999). The status of academic women in the 1990s: No longer outsiders, but not

yet equals. Quarterly Review of Economics and Finance, 39(5), 679–698.

Ward, K. B., & Grant, L. (1996). Gender and academic publishing. In John C. Smart (Ed.), Higher

education: Handbook of theory and research Vol. XI (pp. 172–212). Edison, NJ: Agathon.

Weiler, W. C. (1990). Integrating rank differences into a model of male–female faculty salary

discrimination. Quarterly Review of Economics and Business, 30(1), 3–15.

Weingart, P. (2005). Impact of bibliometrics upon the science system: Inadvertent consequences?

Scientometrics, 62(1), 117–131.

White, J. D. (1986). Dissertations and publications in public administration. Public Administration

Review, 46(3), 227–234.


Wright, B. E., Manigault, L. J., & Black, T. R. (2004). Quantitative research measurement in public

administration: An assessment of journal publications. Administration & Society, 35(6), 747–764.

Xie, Y., & Shauman, K. A. (1998). Sex differences in research productivity: New evidence about an old

puzzle. American Sociological Review, 63(6), 847–870.

Zamarripa, E. J. (1993). Research productivity: A definition. Mental Retardation, 31(5), 320–325.

Dr. Meghna Sabharwal is an assistant professor in the Public Affairs Program in

the School of Economic, Political and Policy Sciences at the University of Texas

at Dallas. Her research interests are focused on workforce policy as it relates to

job satisfaction, productivity, diversity, and comparative human resource issues.

Dr. Sabharwal’s work is published in academic journals such as Public Administration,

Review of Public Personnel Administration, Research Policy, Social Science Journal,

and Government Information Quarterly, among others.



The Development of an

MPA Major Field Test

Paulette Camp Jones

Mid-America Christian University

Gary E. Roberts

Regent University

Ervin Paul Martin

Belhaven University

Elaine Ahumada

California Baptist University

Stephen M. King

Taylor University

Pat Kircher

California Baptist University

Abstract

This article describes the yearlong process of systematically developing an MPA

Major Field Test with a team of five schools in various regions of the United States.

Each school professor on the test team interviewed numerous practitioners to

determine current issues in the profession. The practitioner information was filtered

through at least one academic team member who used the information to create

exam questions and submitted them for peer review. The test is based on NASPAA’s

five competencies for the MPA degree. Four schools list the data gleaned from the

test results and its impact on their MPA program. Though the data set is currently

not large enough to determine validity or reliability, the article describes the process

and the utility of the test to offer the first collaborative measurement for schools

to determine the strength of their MPA program.

Keywords: capstone, testing, program improvement

JPAE 19(1), 97–115


An Argument for the MPA Major Field Test

The MPA degree has an identity crisis. A proliferation of MPA programs with

great variety and purpose has obscured the definitive study of public administration.

Approximately 260 MPA programs offered in the United States have no set curriculum

or standards. Also, some public administration school administrators have fallen

victim to academic pressure to provide education and socialization within an

academic discipline, instead of professional education for public service (Henry,

Goodsell, Lynn, Stivers, & Wamsley, 2009).

So what standards should be used in defining the MPA program? Does the

National Association of Schools of Public Affairs and Administration (NASPAA)

accreditation provide the standards? According to a study of MPA alumni, yes,

accreditation does make a difference. It is a way to set apart an MPA program

from its competitors on the basis of perceived excellence standards (Yeager, Hildreth,

Miller, & Rabin, 2007).

In 2010, according to reports from the U.S. Department of Education’s National

Center for Education Statistics (NCES), Harvard University reported that it had

conferred the largest number of master’s degrees in public administration for the

2007–2008 academic year of all Title IV institutions with MPA programs in the

nation (Keating, 2010). Some of the institutions listed as conferring the largest

number of MPA degrees are Columbia University, Troy University, New York

University, Syracuse University, CUNY, University of Washington, and California

State University–Northridge (Keating, 2010).

MPA programs are now growing at a pace equal to MBA program offerings

in the 1960s. According to the American Society for Public Administration (ASPA) Task Force evaluation of the Master of Public Administration degree, “a complex

array of degrees and curricula (approximately 260 programs) has blurred the core

mission of public administration, the MPA, and the values that ground it” (Henry

et al., 2009, p. 2). How are MPA institutions to validate curriculum content and

measure graduate competency? Many instructors, institutions, students, and

graduates have expressed interest in a standardized, uniform MPA program

measurement mechanism.

Major Field Tests (MFTs) are innovative program outcome measurements

designed to determine the basic knowledge and understanding achieved by

students in a major field of study.

In response to the former U.S. Secretary of Education Margaret Spellings’

Commission on the Future of Higher Education, states and universities are

attempting to measure learning directly. Direct measures of learning provide

evidence of behavior change (Shavelson, 2009). Test results enable academic

departments to better assess and refine curricula and gauge the progress of

students compared to others in the program and those in similar programs at

schools throughout the country, according to Internet articles by Texas A&M

University (2010, p. 1) and Middle Tennessee State University (2010, p. 1).

Major Field Tests have been developed for the following programs: “Agribusiness,

History, Industrial/Organizational Psychology, Political Science/International


Relations, Social Work, Psychology, Computer Science, Physics, and Sociology”

(Middle Tennessee State University, 2010, p. 1). The Educational Testing Service

(ETS) website lists various MFTs available, including Music and Criminal Justice

(ETS, 2011a). The Master of Business Administration Major Field Test, which

includes 124 multiple-choice questions, is widely used. Mathematical problems

do not require a calculator, and most of the questions require knowledge drawn

from various subfields of business (ETS, 2011b). It is interesting to note that no

MFT exists for graduate programs in public administration, public policy, or public

affairs, all of which differ in emphasis and focus of program core content.

The purpose of MFTs is to allow programs and institutions to measure student

achievement and curriculum effectiveness as well as evaluate graduate performance

against other institutions across the country (ETS, 2011a). MFTs are only one of

several assessment tools for evaluating programs for appropriate content and student

knowledge performance.

MFTs in the most closely related area of business demonstrate predictive validity

in such domains as student engagement and grade point average (Black & Duhon,

2003; Ward, Yates, & Song, 2010). However, field test predictive validity does vary

by business major and student demographics (Contreras, Badua, Chen, & Mitchell,

2011; Settlage & Settlage, 2011), thus necessitating ongoing group analysis. Hence,

the development of an MPA field test will require ongoing validation evaluation.

Another key issue raised by the business MFT relates to its use in reducing

grade inflation. A recent study indicated that the adoption of an MFT increases

the incentive of faculty to enhance grading fairness and maintain rigor (Lee &

Sekhar, 2008). However, to reduce grading inflation, ongoing monitoring by

administrators is also required (Lee & Sekhar, 2008).

How the MPA-MFT Was Developed

To assess the effectiveness of the Mid-America Christian University (MACU)

program, the chair sought an independent measure. Observing that her colleagues

in the School of Business used the MBA Major Field Test and other such instruments

for several degree programs, the chair wondered why there was no such test for the

MPA. Although the MPA is a relatively new field (Shafritz & Hyde, 2007), it is just as

relevant as other disciplines, especially in light of government expansion and non-profit

program growth in the United States (Keating, 2010). This situation is especially

true in many states that have a shortage of qualified city managers (Rabe, 2006).

After determining that there was no such test, the chair discussed the void with

the school’s director of assessment. She noted that it would be important to gather

information not only from current texts and journals in the field, but from practitioners.

Not wanting to gather information only from Oklahoma, the director of

assessment recommended that several universities from several U.S. regions be involved

in the process because the ultimate test should be usable throughout the country.

Because NASPAA has five standards for the MPA degree, the NASPAA

“competencies,” as NASPAA terms them, seemed to be the logical benchmarks

(NASPAA, 2011). The search began for universities across the United States to


join the team. Ideally, a total of five to six schools from various geographic regions

of the United States would participate. After several months it was determined that

the West, Southeast, North-Central, South, and Midwest regions of the United States

would be represented. The team would be Pat Kircher and Elaine Ahumada with

California Baptist University, Stephen M. King with Southeastern University (Florida),

Michael Card with the University of South Dakota, Ervin Martin with Belhaven

University (Mississippi), and Paulette Camp Jones with Mid-America Christian

University (Oklahoma). Four of the schools offer MPA degrees, and the fifth offers

a BPA. (This fifth team member has a PhD in political science and had been on

the faculty for the Master in Government Program at Regent University.) Three of

the schools are members of NASPAA, and one is NASPAA accredited. (Test results and

characteristics of each school are provided later in this article.)

Research Design

To build a Major Field Test, the team had to ask some critical questions.

• What are the foundational job-related elements that government

management practitioners believe most strongly affect the practice

of public administration?

• How do these practitioner job elements compare to the NASPAA

competencies for an MPA program?

• According to academicians in the same geographic areas, which of these

job elements are critically relevant to curriculum in MPA programs?

• What academic factors do academicians believe are critically relevant for

MPA programs?

The process used to create the MPA-MFT was similar to that used

by developers of the MBA-MFT and the Graduate Record Examination (GRE).

The process includes determining objectives; item development review; writing

and reviewing questions; pre-testing with a sample group; removing unfair questions;

and finalizing the test (ETS, 2011c).

Determining the Objectives

Various measures could be used for determining objectives, but the NASPAA

Universal Required Competencies for NASPAA accreditation for the MPA degree

seemed the most definitive. NASPAA is the premier national organization devoted

to public administration education.

It is NASPAA's philosophy that these competencies assist schools in assessing

their curriculum:

An accredited program should implement and be accountable for

delivering its distinctive, public service mission through the course

of study and learning outcomes it expects its graduates to attain. The

curriculum should demonstrate consistency and coherence in meeting

the program’s mission. (NASPAA, 2009)


The team used these competencies as a starting point to determine practicality

and scholarship in the discipline.

The chair at MACU worked as project manager to lead the team. Similar to

the intent of other major field tests, the goal was to develop an exam containing

160 questions. NASPAA Competencies for the MPA Program include the

following abilities:

1. Lead and manage in public governance.

2. Participate in and contribute to the policy process.

3. Analyze, synthesize, think critically, solve problems and make decisions.

4. Articulate and apply a public service perspective.

5. Communicate and interact productively with a diverse and changing

workforce and citizenry. (NASPAA, 2009)

The timeline used for the project during the 2009–2010 school year follows:

The five schools and team members are determined. (August 10)

Subject matter of the five major parts of exam agreed upon. (August 17)

Practitioners are interviewed for Part 1 and material is reviewed in view of scholarly material. (September 1)

Questions are submitted to the project manager. (October 1)

All questions are e-mailed to the team. (October 8)

Votes determine the best 32 questions for Part 1. (October 15)

Practitioners are interviewed for Part 2. (November 1)

Questions submitted. (December 1)

Votes determine Part 2 questions. (January 8)

Practitioners are interviewed for Part 3. (January 15)

Questions submitted. (February 15)

Votes determine Part 3 questions. (February 22)

Practitioners are interviewed for Part 4. (March 1)

Questions submitted. (April 1)

Votes determine Part 4 questions. (April 15)

Practitioners are interviewed for Part 5. (May 1)

Questions submitted. (June 1)

Votes determine Part 5 questions. (June 1)

Similar to the process used by ETS, the team started with Item Development Review.


The test was developed by building only one section at a time, addressing only

one NASPAA competency at a time. Each professor interviewed a practitioner in

his or her geographic area about the practical aspects of the competency at hand.

Only one competency was discussed in the interview. The practitioner was asked

if there were any practical applications of the competency in everyday life of the

public administrator. Example: The ability to lead and manage in public governance

was discussed with a statewide director for a program. The director explained how

this concept applies in daily work, how leadership is a key ingredient to strong

organizations, and how motivating and managing people are skills that can be

learned. Each interview would take approximately 60–90 minutes. The professor

would then review the information to determine if it was consistent with public

administration scholarship.

Writing and Reviewing Questions

After reviewing the interview information, the professor would then use it to

create multiple-choice questions. Multiple-choice questions were used in this instrument because they are common in most major field tests as well as the GRE, and they measure performance more accurately than essay questions, which introduce such irrelevant factors as speed, verbiage under time constraints, and penmanship (Cooper, 1984).

The team members submitted their questions to the project manager with the names and positions of their sources. The project manager compiled all 75

questions and e-mailed them to the team for peer review. Each member voted for

the best 32 questions. The 32 questions receiving the most votes became Part 1

of the test. This process was then replicated for Parts 2–5 until the test was complete,

covering five parts in conjunction with the five NASPAA MPA competencies.

After completing all five parts, the team examined each section for grammatical

errors, clarity, and accuracy.
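To make the tallying step concrete, here is a minimal illustrative sketch in Python (not the team's actual procedure or tooling) of how ballots listing question IDs could be counted and the highest-voted questions selected; the ballot structure and IDs are hypothetical.

    from collections import Counter

    def select_top_questions(ballots, n_selected=32):
        """Tally votes across all members' ballots and return the question
        IDs receiving the most votes (ties broken by lower ID)."""
        tally = Counter(qid for ballot in ballots for qid in ballot)
        ranked = sorted(tally.items(), key=lambda item: (-item[1], item[0]))
        return [qid for qid, _votes in ranked[:n_selected]]

    # Hypothetical example: five members each vote for preferred question IDs.
    ballots = [[1, 4, 5, 7], [4, 5, 9], [1, 4, 5], [4, 7, 9], [5, 9, 12]]
    print(select_top_questions(ballots, n_selected=3))  # -> [4, 5, 9]

In the actual process each member voted for 32 of the 75 submitted questions, so n_selected would be 32.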

Pretesting With a Sample Group and Removing Unfair Questions

Upon completing the process for all five sections of the test, the team scrutinized

it for correctness, clarity, and questions with more than one defensible answer. They also administered

the test to some faculty and graduates to determine any problem questions. Twelve

questions were revised, and two were replaced due to lack of clarity.

Finalizing the Test

Finally, the team administered the test to four of the team schools and an

additional school in the Northeast. Following are the results from each school.

MPA Major Field Test Components

Part 1: The ability to lead and manage in public governance

Part 2: The ability to participate in and contribute to the policy process

Part 3: The ability to analyze, synthesize, think critically, solve problems

and make decisions


Part 4: The ability to articulate and apply a public service perspective

Part 5: The ability to communicate and interact productively with a diverse and changing workforce and citizenry

Importance of Practitioner Input

An important component of test development included interviewing public

administration practitioners in government positions at all three levels in five states

to determine issues related to the competency test section being built. This process

did not alter the test after the fact; rather, development of each section began with the practitioner input.

For example, NASPAA MPA Competency 2 is “The ability to participate in

and contribute to the policy process.” Each professor interviewed a government

practitioner in his or her geographic area about how that practitioner contributed to the policy

process. This was done with various government practitioners throughout the five

states. Each faculty researcher would then submit his or her questions to the team.

Therefore, the process included not only practitioner input but academic screening

and peer review from the team of faculty researchers. Here is an example of the results:

3. As a member of a legislative body, to gain a reputation as an authority

in specific fields,

a. read all the bills and numerous publications related to the bills.

b. focus on becoming well informed in only 1–2 key areas

which you’re passionate about.

c. try to work with the house speaker or the senate president.

d. none of the above.

Only one vote was received for Question 3, so this question was deleted.

4. Public administrators advocate a view of policymaking in which goals

and programs are

a. set correctly initially and require minimal revision.

b. continuously modified to adjust to various constraints and

changing circumstances.

c. set without revision for ten years.

d. set behind closed doors by the elite.

Five votes were received for Question 4, so this question was used.

5. When formulating policy with leaders, it is wise to…

a. identify the problem, explore the alternatives, determine

viable choices, and compromise.

b. brainstorm the problem, consider few alternatives, determine

the most expedient solution.

c. determine who the leader is and follow them.

d. identify who holds the power, determine their agenda and follow it.

Three votes were received for Question 5, so this question was used.


Practitioner emphasis has been upheld historically in public administration

journals. The trends in Public Administration Review (PAR) over a 70-year history

indicate its dedication to including practitioner as well as academic knowledge. Starting in the 1970s, the numbers of articles written by practitioners and by academicians were about equal. Both PAR and the American Society for Public Administration have attempted to include both groups. But in recent years, PAR and other Western public administration journals have published significantly fewer articles from practitioners. Between 2000 and 2009, only 6.8% of the articles in

PAR were from practitioners. Including both types of articles is paramount to the

discipline of public administration. Therefore, vehicles to accomplish this goal have

included calls for practical components in courses as well as for practical application

of theory and research (Raadschelders & Lee, 2011).

Practitioners interviewed for this project included the following:

• City of Lakeland, FL—Director of Communications Kevin Cook

• City of Lakeland, FL—Fire Chief Gary Ballard

• Mississippi State Auditor’s Office—Chief of Staff Bill Pope

• Mississippi Secretary of State’s Office—Human Resources Director

Carla Thornhill

• Mississippi Center for Public Policy—President Forrest Thigpen

• Mississippi Lt. Governor’s Office—Special Assistant, Constituent

Affairs Nycole Campbell-Lewis, PhD

• Mississippi Assistant State Attorney General Patrick Beasley

• U.S. Department of Veterans Affairs, G.V. Montgomery VA

Medical Center—Dr. Michael Pavalock

• Oklahoma Department of Human Services—Director of Human

Resources Ed Sweeney

• Oklahoma State Bureau of Investigation—Director DeWade

Langley, EdD

• Oklahoma Department of Transportation—Director of Human

Resources Brian Kirtley

• Oklahoma State Representative Rebecca Hamilton

• Oklahoma State Representative Sally Kern

• Oklahoma State Representative Paul Wesselhoft

• Former Oklahoma State Deputy Commissioner of Labor, Neva Hill

• Regional Manager, U.S. Transportation Safety Administration

• City of Rialto—Chief of Police Dr. Mark Kling

• City of Hemet (California)—City Manager Dr. Brian Nakamura

• City of Vermillion (SD)—City Manager John Prescott

• Vermillion Coalition for Dispute Resolution—Executive Director

Rich Braunstein


• Lutheran Social Services of South Dakota—V.P. of Community-Based

Service Rebecca Kiesow-Knudsen

• Northeast South Dakota Development Corporation—Executive Director

Darah Darrington

Measuring Programs

Often what the public refers to as testing is actually measuring achievement

instead of learning. College tests measure the status of students at a particular time.

In interpreting the results, higher education may take credit for the achievement

that college students have attained, but many other outside influences and experiences

of the student should also be considered (maturation, experiences outside the classroom,

etc.). Attributing outcomes causally to instruction poses problems for accountability

and assessment (Shavelson, 2009).

Achievement tests such as Major Field Tests measure domain-specific knowledge

and reasoning. There are four types of domain-specific knowledge and reasoning

(Shavelson, 2009): declarative (knowing what), procedural (knowing how), schematic

(knowing why), and strategic (knowing when, where, and how to apply the knowledge).

Educational measurements are used to obtain visible evidence to infer the

unseen learning of students. Collecting such data is wise when attempting to measure

the effects of instruction. When designing tests, curricular aims are paramount.

When the final destination of knowledge has been defined, it is much easier to identify

the necessary skills and knowledge the students will need. Also, determining important

enabling subskills of knowledge improves the chances that such knowledge

will be included in the instruction (Popham, 2003).

Preliminary Results by School

With minor variations, all four of the schools administering the test include

the following core courses in their programs: Organizational Behavior, Managing

Public Programs, Public Policy, Budget and Finance, and Research in Public

Administration. All of the schools offer an MPA degree or a Master of Arts in

Government. The programs total between 36 and 39 hours of credit. None of

the schools require an exam for graduation, but they do require a capstone

project such as a thesis or research project.

Table 1 presents the results from each school and is followed by the narrative

from each professor who administered the test to his or her students. Each

professor describes how the findings have affected their curriculum, including

course revisions, and how he or she plans to use the test. Although the University

of South Dakota is NASPAA accredited and was part of the development team

for the MPA-MFT, the school was not able to administer the test to its students. Also, Southeastern University does not have MPA students, only BPA students. However, its team member contacted a former colleague, Gary Roberts at Regent University, who administered the test to 27 Master of Arts in Government students.


Table 1.

Results of Major Field Test for Participating Schools

School                       Belhaven      California    Mid-America    Regent
                             University    Baptist       Christian      University
                                           University    University
Age of Program               2 years       5 years       2 years        12 years
Mean Score on MPA-MFT        74%           58%           61%            66%
N                            14            14            15             27
NASPAA Member?               N             N             Y              Y
Part- or Full-Time?          Part-time     Full-time     Full-time      Part-time
Pre-service or In-service?   Both          In-service    In-service     Both

Regent University Results

What impact has the test had on your curriculum? The MPA Field Test has

provided an excellent baseline measure of the overall and dimensional competency

level for Regent public administration concentration students. The scores demonstrated

an undesirable level of variability in performance. Raw scores ranged from

71 to 160 with a standard deviation of 22.15. This translates to a percentage score

spread of 44% to 100% correct. Part of the score variability is attributable to the students' varying progress toward credit completion in the program, which produced inconsistent baseline ability levels.

Dimension                      Scale Mean    # of Items
Leadership                     22.37         32
Participate & Contribute       19.96         32
Critical Thinking              20.85         32
Public Service Orientation     19.74         32
Communications & Interaction   20.93         32

We recognize the need to improve the scores in all five dimensions as noted

in the table. (Although these results were helpful, the other schools did not

compile their findings in this manner so such a comparison was not made.)
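As a check on the arithmetic behind these figures, the following minimal sketch (assuming the 160-item test is scored one point per item, which is how the percentages above appear to be derived) converts raw scores and the raw standard deviation to the percentage scale.

    TOTAL_ITEMS = 160  # the MPA-MFT contains 160 questions

    def to_percent(raw_score, total_items=TOTAL_ITEMS):
        """Convert a raw number-correct score to a percentage of items correct."""
        return 100.0 * raw_score / total_items

    # Regent's raw-score range of 71-160 maps to roughly 44%-100% correct,
    # and a raw standard deviation of 22.15 is about 13.8 percentage points.
    print(round(to_percent(71), 1), round(to_percent(160), 1))  # 44.4 100.0
    print(round(100.0 * 22.15 / TOTAL_ITEMS, 1))                # 13.8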

Did you determine that courses should be revised? This exam will serve as the

foundation for enhancing key learning objectives of four courses in the public

administration concentration: (GOV670) Principles of Public Administration,


(GOV630) Public Human Resources Management, (GOV634) Public Budgeting

and Taxation Policy, and (GOV671) Organizational Theory. In the Principles of

Public Administration course, there is a need to more firmly articulate and provide

case study examples of outstanding public sector servant leadership and the importance

of a vibrant and vital public service motivational orientation. In all of the

courses there is a need to provide more rigorous service-learning opportunities that

promote leadership and critical thinking skills. Another key need is for students

to study best practice examples in the respective content domains. For example,

in the human resources management class, it is important to illustrate the key elements

of effective leadership in organizational change situations and how human

resource professionals integrate strategic and critical thinking in terms of surveys,

focus groups, and needs assessments to gauge change receptivity and effectiveness.

The low and highly variable field test scores reinforced the need to develop a

formal MPA degree. The MA in Government degree program with a concentration

in public administration did not provide a valid and reliable foundation in the key

public administration competency areas. This was a product of the structure of

the concentration. Students who were in the public administration concentration

selected from 14 total courses: 4 in public administration, 3 in nonprofit administration,

2 in economics, 4 in political campaign management, and 1 in public policy.

Students selected courses based on availability and feedback from their advisors

and therefore did not possess a uniform course preparation foundation, producing a significant degree of variance in foundational knowledge. The current public administration faculty complement is small (two faculty), but with the advent of

the MPA the program is recruiting additional faculty. All 27 students in the sample

had a single instructor, which illustrated a systematic weakness given

that the faculty member taught all four PA classes, thus reducing the desirable

level of instructor diversity. Both PA faculty members strongly support the use of

field tests and their use as a criterion validity indicator. The low and variable test

scores indicate the need to increase admissions standards and to recruit more

mid-career students.

How do you plan to use the test? The program is in the process of gaining approval

for a 45-credit-hour MPA degree based on NASPAA standards. There are two options

for using the field test. The first is an exit or capstone requirement with a minimum

passing score (say 80%) along with an essay component. This requirement promotes

a high level of student motivation in preparing for the exam. The second is to serve as

a pre- and post-assessment of student learning. The exam could be administered at

orientation, and students would receive an initial overall and dimensional summary

of the scores. The feedback would be given in a supportive and nonthreatening

fashion. Students would take the exam at the end of their public administration

coursework and be provided with a summary of change in overall and dimensional

scores. In this fashion, the test will serve as one standardized means for assessing


individual student and aggregate program learning. Both options possess merit,

but the best approach given the move to a fully developed MPA will be its use

as a diagnostic exam to chart progress.

Belhaven University Results

What impact has the test had on your curriculum? I administered the MFT

to an MPA graduating class and two new incoming groups. I did so because I

have not had the MFT long enough to introduce it to a new incoming group

and then administer it to them upon program completion, so the comparison

of scores is more abstract than concrete.

The graduating class average was 74%. The incoming groups’ average

was 43%.

This indicates that significant knowledge was gained. Obviously, areas of emphasis have been identified and course syllabi revisions are planned. I have three ongoing groups in the program and am starting a new one in August, so by spring 2012 I will have more meaningful comparisons. Scores in

relevant areas will result in course content revisions as needed.

Did you determine that courses should be revised? One thing I learned from

this process is that, whereas our program emphasizes federal and state government

because most of our students and adjunct instructors are state or federal government

employees, other faculty participants in this MPA-MFT initiative focused on local government issues. This difference required me to address local government topics within several of our program courses and to revise the syllabi for those courses.

Mid-America Christian University Results

What impact has the test had on your curriculum? Because our program is

only two years old, we administered the test to our first graduating class. Raw

scores ranged from 89 to 116 with a standard deviation of 5.77. This yielded a

percentage score spread of 55.6% to 72.5%. The following table illustrates the

number of questions that less than 50% of the class answered correctly, the

corresponding competency, and the corresponding courses.

After reviewing the results, it was determined that more case studies will be

incorporated into the courses. Detailed and well-developed case studies (Harvard

and the Electronic Hallway) will more clearly delineate the relationship between

theory-based learning objectives and their practical applications in the key competency

areas. For example, case studies can illustrate the challenges of managing a more

diverse workforce in terms of age and race in conjunction with communities that

are in a state of demographic transition. NASPAA competency components will

be threaded throughout the program, as well as an emphasis on current issues

and communication skills.


NASPAA MPA Competency | # of Difficult Questions | Courses

1. To lead and manage in public governance | 9 | Leadership Theory & Practice; Organizational Culture & Behavior; Managing Public Programs

2. The ability to participate in and contribute to the policy process | 10 | Managing Public Programs; Organizational Culture & Behavior; Legislative Process & Behavior

3. The ability to analyze, synthesize, think critically, solve problems and make decisions | 11 | Public Budgeting & Finance; Leadership Theory & Practice; Organizational Culture & Behavior; Managing Public Programs; Legislative Process & Behavior

4. The ability to articulate and apply a public service perspective | 12 | Managing Public Programs

5. To communicate and interact productively with a diverse and changing workforce and citizenry | 12 | Organizational Culture & Behavior; Managing Public Programs
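Counts of difficult questions such as these can be generated from item-level results. The following is a minimal sketch, assuming the class's answers are available as a students-by-items matrix of 0/1 scores; the response data shown are invented for illustration.

    def flag_difficult_items(responses, threshold=0.5):
        """Return indices of items answered correctly by less than
        `threshold` of the class. `responses` is a list of per-student
        lists of 0/1 item scores (1 = correct)."""
        n_students = len(responses)
        n_items = len(responses[0])
        difficult = []
        for item in range(n_items):
            proportion_correct = sum(student[item] for student in responses) / n_students
            if proportion_correct < threshold:
                difficult.append(item)
        return difficult

    # Hypothetical results for four students on five items.
    responses = [
        [1, 0, 1, 0, 1],
        [1, 0, 0, 0, 1],
        [0, 1, 1, 0, 1],
        [1, 0, 1, 1, 1],
    ]
    print(flag_difficult_items(responses))  # -> [1, 3]: items below 50% correct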

Did you determine that courses should be revised? After determining which questions were most often missed, the underlying concepts will be incorporated into the appropriate courses. Care has been

taken not to teach the test questions, since doing so would not equip students for

the profession. Due to the age of students in our adult accelerated program, and

the number of years since they graduated with a bachelor's degree, it was also

determined that the students coming into the program did not have a sufficient

foundation in public administration. Either they had forgotten such basic

underpinnings or had never been taught the principles. Therefore, a new course, Foundations of Public Administration, was created and will be offered as the second course in the program.

How do you plan to use the test? The MPA-MFT will be one of three assessment

tools for our program. In addition to the students’ yearlong research project to

evaluate an existing program, and the learning outcomes management system, the test

will be administered to each graduating class to determine the need for curriculum

revisions. We are also considering it as a pre-test instrument.

California Baptist University Results

What impact has the test had on your curriculum? The MPA Field Test

has been implemented as a tool for education evaluation in conjunction with

Western Association of Schools and Colleges standards for the Master of Public


Administration program at CBU. Sixteen candidates for graduation within the

spring 2010 semester completed the exam. The exam has provided valuable analytical

insight for revision, evaluation, and validation of course curriculum components

that best meet the expectations of an MPA scholar-practitioner.

Although the tool is also pertinent to analyze and examine the competency

level of the graduate, it has been more useful in addressing the areas of curriculum

that can be bolstered to improve rigor and content. Fourteen students took the

exam. The range of correct answers was 61–112 out of a possible 160. The mean

number of correct answers was 92.86 with a standard deviation of 15.25. The

mean percentage score was 58% with a standard deviation of 10%. Although a

few categories showed higher levels of competency, the goal is to raise scores in every area.

Did you determine that courses should be revised? This exam revealed that

Public Finance and Budgeting needs more in-depth instruction designed to teach

relevant budgeting skill sets as well as more review of terminology and practices.

How do you plan to use the test? At CBU, the obvious choice of how we would

likely use the test is to analyze educational effectiveness in combination with the

culminating activity of either the graduate thesis or comp exam. With the advent

of an online MPA at CBU, it may be useful to compare data from tests from

two different populations (in-class students and online students). Results can be

used to ensure that instruction, whether online or on the ground, can be equally

effective with consistent outcomes for learning. Another way the test can be used

is as a pre- and post-test instrument. The exam will be administered again in the

summer session of 2011. Comparative data can then be analyzed and assessed for

further review.

Results

As noted by each of the school representatives, the test did help determine

weak areas in their curriculum and/or concepts that have not been communicated

well to students. It even indicated that some new courses are needed to address

the lack of knowledge by students with no prior public administration education

or experience.

Of the 20 practitioners who were interviewed for this project, none indicated

that the NASPAA competencies were inappropriate or impractical. It may be

that the competencies are so broad that they address large segments of

administration. Part of the mission of this project was to move from the general

to the specific to create questions of measurement, thereby determining concepts

needed in MPA curriculum. Are the NASPAA competencies too broad to define

what is needed in curriculum? Further testing is needed to answer this question.


Conclusion

After reviewing the results from the four schools, some questions remain:

• Should low test scores lead to changes in curriculum, instructors,

or admission standards?

• Do the NASPAA competencies address the major tenets of an MPA program?

• Should a minimum score be established?

Should Low Test Scores Lead to Changes in Curriculum, Instructors, or

Admission Standards?

Low scores may or may not indicate the need for changes to curriculum,

instructors, or admission standards. Each chair will have to determine if the

questions missed reflect important concepts that should be addressed in his or her

own program. Given NASPAA’s new mission-driven standards, there will be

more variability in the learning objectives and competency areas than was the

case with the old common core curriculum components. Regarding admissions

criteria, any change could be determined after pre-test scores have been tracked with

graduation rates and/or grade point averages in a longitudinal study. If correlation

is found, the MPA-MFT scores may serve as a predictor for student success.

Do the NASPAA Competencies Address the Major Tenets of an MPA Program?

There may be other organizations with foundations on which to build an

MPA major field test, but the team felt this was a good start. Thus far there have

been no benchmarks for the MPA program. Hopefully, this will be the beginning

of a continuing project to standardize key core course learning objectives in the

degree and thereby benefit the discipline.

Should a Minimum Score Be Established?

A minimum score at the end of the degree program could be set if research

can demonstrate that a minimum score will indicate competency in the profession;

but once again, this score would have to come from a more robust database.

The immediate aim of this test is not to assess individual student achievement,

but to serve as an aggregate benchmark of global competency levels across the foundational learning dimensions.

The MPA-MFT is a much-needed instrument for graduate public administration

programs. Significant strides in graduate and undergraduate public administration

programs over the last four decades have been made in the areas of curriculum development,

program enhancement, adaptability to information technology changes, and

public, private, and nonprofit intersectoral and/or interdisciplinary focus. These

changes and adaptations to such graduate programs are significant, particularly with

regard to discovering weaknesses in program and curricular strength. However, assessing

and comparing results of graduate-level public administration content knowledge

has been missing. The creation of the MPA Major Field Test fills this gap.


Use of NASPAA’s five competency areas strongly denotes the broad-based

approach to competency development sought by accrediting bodies. Researching,

developing, and crafting the MPA Major Field Test by and through these five

competencies further demonstrates the need for both content and purpose as well

as critical skill evaluation. These factors are all highly sought after by potential

employers in the public sector.

The MPA-MFT is strengthened by incorporating the knowledge and experiences

of mid- to upper-level public administration practitioners. This action is

supported by the American Society for Public Administration's Strategic Draft Plan,

2011–2013, which among other things calls for significant diversification of membership

among mid-level public employees. There have been several initiatives by ASPA

over the last 25 years to better balance the academic and practitioner perspective.

Although the practitioner input waned originally (Raadschelders & Lee, 2011),

there is some evidence of resurgence of practitioner interest (Strieb, Slotkin, &

Rivera, 2001) and influence (Ospina & Dodge, 2005) in public administration

research. Accessing and incorporating practitioner input was a key decisional element for the researchers when determining what types of questions should be

included within the five competency areas.

Although four of the participating universities that contributed to the development

of the MPA-MFT administered it to their students, the results are far from reliable given the small N. First, the MPA-MFT should be tested and cross-tested

by multiple departments and faculty to ensure reliability and validity of content

and measures. Second, the test should be administered at various intervals, including

as a pre-test, either to graduating undergraduate public administration majors or to incoming MPA students, and as a post-test (i.e., to graduating MPA students).

This pre-test/post-test (PTPT) method provides an elementary way of measuring not only content

knowledge, but program consistency, strength, and weakness. This approach is

critically important when faculty and departments wish to strengthen their

programs and curriculum.
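For programs adopting the pre-test/post-test approach, the following is a minimal illustrative sketch of computing mean gains per competency dimension between the two administrations; the dimension labels echo the NASPAA competencies, and the scores are hypothetical.

    def ptpt_gains(pre_scores, post_scores):
        """Compute the mean gain per dimension between a pre-test and a
        post-test. Each argument maps a dimension name to a list of
        student percentage scores for that administration."""
        gains = {}
        for dimension in pre_scores:
            pre_mean = sum(pre_scores[dimension]) / len(pre_scores[dimension])
            post_mean = sum(post_scores[dimension]) / len(post_scores[dimension])
            gains[dimension] = post_mean - pre_mean
        return gains

    # Hypothetical cohort scores for two of the five competency dimensions.
    pre = {"Leadership": [45, 50, 40], "Policy Process": [38, 42, 44]}
    post = {"Leadership": [68, 72, 60], "Policy Process": [55, 61, 64]}
    print(ptpt_gains(pre, post))  # gains of roughly 21.7 and 18.7 points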

Clearly, there is a need for an MPA-MFT instrument. This article has described

the first step in constructing such a tool. Hopefully others will build on this

foundation and refine it for the good of the discipline. The team has agreed that

it would be best to submit the test to NASPAA and for that organization to make

it available to other schools for further testing, revisions, and updates. With NASPAA

as the keeper of the test, N can be increased as numerous schools administer the

test and examine any problems with the questions and/or answers. The team

realizes this test is only the first effort of an MPA-MFT, but with the help of

NASPAA and other dedicated public administration faculty and administrators,

the test can become a valuable tool for schools wishing to assess their MPA programs

and determine the caliber of their graduates. Evaluating programs and curricula, as well as students' content knowledge, is paramount to producing future public professionals who can meet the challenges ahead.


References

Black, H. T., & Duhon, D. L. (2003). Evaluating and improving student achievement in business programs: The effective use of standardized assessment tests. Journal of Education for Business, 79(2), 90–97.

Contreras, S., Badua, F., Chen, J. S., & Mitchell, A. (2011). Documenting and explaining major field test results among undergraduate students. Journal of Education for Business, 86(2), 64–70.

Cooper, P. L. (1984). The assessment of writing ability: A review of research. GRE Board Research Report. Retrieved from http://www1.ets.org/Media/Research/pdf/RR-84-12-Cooper.pdf

Educational Testing Service (ETS). (2011a, January 19). Major Field Tests. Retrieved from http://www.ets.org/mft/about

Educational Testing Service (ETS). (2011b, August 9). Master of Business Administration (MBA) degree. Retrieved from http://www.ets.org/mft/about/content/mba

Educational Testing Service (ETS). (2011c, October 11). How tests and test questions are developed. Retrieved from http://www.ets.org/understanding_testing/test_development/

Henry, N., Goodsell, C. T., Lynn, L. E., Stivers, C., & Wamsley, G. L. (2009, April 1). Understanding excellence in public administration: The report of the task force on educating for excellence in the master of public administration degree of the American Society for Public Administration. Journal of Public Affairs Education, 15(2), 117–133.

Keating, M. (2010). Harvard reports increased interest in MPA degree programs. Retrieved from http://govpro.com/news/mpa-degree-interest-harvard-20100429/

Lee, J., & Sekhar, A. (2008). Using externally developed standardized tests to control grade inflation. Review of Business Research, 8(4), 79–87.

Middle Tennessee State University. (2010, October 28). Counseling & Testing Center: Major field testing. Retrieved from http://www.mtsu.edu/countest/majfield_countest.shtml

National Association of Schools of Public Affairs and Administration (NASPAA). (2009). NASPAA standards 2009: Commission on Peer Review and Accreditation. Retrieved from http://www.naspaa.org/accreditation/doc/NS2009FinalVote10.16.2009.pdf

National Association of Schools of Public Affairs and Administration (NASPAA). (2011, September 27). NASPAA accreditation standards for master's degree programs. Retrieved from http://www.naspaa.org/accreditation/NS/naspaastandards.asp

Ospina, S. M., & Dodge, J. (2005). Narrative inquiry and the search for connectedness: Practitioners and academics developing public administration scholarship. Public Administration Review, 65(4), 409–423.

Raadschelders, J. C., & Lee, K. H. (2011). Trends in the study of public administration: Empirical and qualitative observations from Public Administration Review, 2000–2009. Public Administration Review, 71, 19–33. doi:10.1111/j.1540-6210.2010.02303.x

Rabe, J. (2006, November 19). Politics blamed for shortage of city managers. The Oklahoman (Oklahoma City), p. A2.

Settlage, D. M., & Settlage, L. A. (2011). A statistical framework for assessment using the ETS major field test in business. Journal of Education for Business, 86(5), 274–278.

Shafritz, J. M., & Hyde, A. C. (2007). Classics of public administration. Boston: Wadsworth.


Shavelson, R. (2009). Measuring college learning responsibly: Accountability in a new era. Palo Alto, CA: Stanford University Press.

Strieb, G., Slotkin, B. J., & Rivera, M. (2001). Public administration research from a practitioner perspective. Public Administration Review, 61(5), 515–525.

Texas A&M University Corpus Christi. (2010, October 28). Office of Academic Testing: Major Field Tests. Retrieved from http://testing.tamucc.edu/major_field.html

Ward, C., Yates, D., & Song, J. Y. (2010). Examining the relationship between the National Survey of Student Engagement and the ETS business major field test. American Journal of Business Education, 3(12), 33–39.

Yeager, S. J., Hildreth, W. B., Miller, G. J., & Rabin, J. (2007). What difference does having an MPA make? Journal of Public Administration Education, 13(2), 147–167.

Paulette Camp Jones received her EdD in Educational Technology from

Oklahoma State University in 2002 and an MA in Political Science–Urban

Affairs from the University of Central Oklahoma in 1987. She established and

chaired the Master in Leadership of Public Administration program (MLPA) at

MACU from 2007 to 2012. She is currently a professor of Political Science, and

director of Online Learning at Hillsdale College in Oklahoma City. Her current

focus is to establish an online public administration program using educational

technology and to assess instructional effectiveness.

Pat Kircher is professor of Public Administration and Political Science in the

Professional and Online Studies Division of California Baptist University. She holds

a doctorate in public administration (DPA) from the University of La Verne.

Stephen M. King received his PhD in Political Science from the University of

Missouri–Columbia in 1990. He is professor of Political Science and the R. Philip

Loy Endowed Chair of Political Science at Taylor University, where he teaches

undergraduate courses in the areas of American Government, Public Administration,

and Public Policy. He developed and oversaw a Master of Public Administration

(MPA) program at Regent University from 1998 to 2002. His current research

agenda focuses on identifying, assessing, and evaluating the type, role, and influence

of local government leadership in addressing crisis points in the stability and strength

of the government organization.


Gary E. Roberts is an associate professor and MPA program director at Regent

University’s Robertson School of Government. His primary teaching areas are

nonprofit administration, human resource management, and public administration.

Current research interests center on workplace spiritual intelligence, servant

leader human resource policy and practice, the impact of the religious-friendly

workplace, and organizational policies to promote employee work–life balance.

He has authored more than 40 journal articles, one book, and many book chapters on

various human resource and public management issues.

Elaine Ahumada is an associate professor of Political Science at California Baptist

University in the Division of Online and Professional Studies. She is the MPA

program director and current president of the California Inland Empire ASPA

chapter. She holds an MPA degree and Doctorate in Public Administration

(DPA) from the University of La Verne.

Ervin Paul Martin, PhD, is MPA program director and has served for 12 years on the faculty of the Belhaven University School of Business, Jackson, MS. His undergraduate areas of emphasis are Ethics, Leadership, and Management; his graduate area of emphasis is graduate research (thesis). He holds a PhD in Human and Organizational Systems (The Fielding Graduate Institute, 1999), an MA in Organization Development (The Fielding Graduate Institute, 1995), an MS in Planning (University of Alaska, Anchorage, 1987), an MPA in Public Administration (University of Alaska, Anchorage, 1983), and a BA in Justice Administration and Political Science (Portland State University, 1977).



Public Administration Theory,

Research, and Teaching: How Does

Turkish Public Administration Differ?

Murat Onder

Yıldırım Beyazıt University

Ralph S. Brower

Florida State University

Abstract

This article gives a broad overview of Turkish public administration research over

the past 20 years and the current situation of public administration education in Turkey.

It presents descriptive findings and discusses, compares, and contrasts them with

previous research in the United States and Turkey. It examines public administration

theory, research, and education together because, in an integrated body of scholarship,

the three should reflect each other. Evidence in this study illustrates that the

field of public administration in Turkey is quite different from American public

administration, but that the elements of theory, research, and teaching are consonant

with each other.

Keywords: public administration research, comparative public administration education, Turkish public administration

JPAE 19(1), 117–139

This article examines the current state of public administration research and education in Turkey. Following earlier research in the United States (Houston & Delevan, 1990; Perry & Kraemer, 1986), we explore the topical content and questions about research design in the principal Turkish public administration journal. We also investigate the content of public administration subject matter in Turkey and compare it to American and European curricula.

Public administration, as a combination of different theories and practices, is concerned with developing four kinds of theories (Henry, 1995, pp. 21–22): descriptive, normative, assumptive, and instrumental. Normative knowledge


provides essential direction and inherent obligations for practice. Professional public

administration education, therefore, should include applications, operations, and

performance. In the U.S. setting, the National Association of Schools of Public

Affairs and Administration (NASPAA) provides guidance on public administration

education to converge theory and practice for knowledgeable action, theoretical

understanding, and mutual learning (Ventriss, 1991, pp. 5–6). Both practitioners

and academics have contributed to the development and evolution of public administration

theory. New theories of explanation and models for practice often arise from

practical experience or from qualitative inquiry. Academics then create frameworks

and hypotheses and test them through research. We anticipate that this connection

between practice and scholarship should travel well to other national settings.

The nature of public administration theory, methodology, and teaching, and the relationships among them, are issues that have been debated since the first days of self-aware public administration. Theory development and methodology cannot be isolated from one another, because relevant methodology is needed to test theories and evaluate them critically. Two decades ago, Houston and Delevan (1990, p. 674)

argued that to produce a meaningful and cumulative body of knowledge as a

discipline, we need to have research methodology that permits us to appropriately

test and further develop our theories. Perry and Kraemer (1986) concluded at

that time that public administration research lacked appropriate methodological

sophistication to develop cumulative theory. These and other studies assessing the

viability of public administration research and theory have largely limited themselves

to dissertations and journal articles published in the United States. The generalizability

of their results should be debated. The research published in American public

administration journals may not represent all that American public administrationists

produce, and it largely underestimates public administration scholarship

elsewhere in the world, including from American-trained scholars.

Ventriss reported (1991) two decades ago that American public administration

was relatively insulated from other cultures and was neglectful of international

issues in general. More recently Jreisat (2005) reported that, although improvements

have occurred, at mid-decade comparative public administration had “not successfully

integrated with the main field of public administration, to the detriment of

both” (p. 231). We observe that outreach toward and participation from international

settings has been uneven. Recent publications and public administration conferences

in the United States show increasing participation from European and East Asian

scholars, but participation from other parts of the world is limited. Some schools

have recently renamed themselves to give an outward appeal to international affairs,

but recent faculty advertisements in the United States show only an occasional position

for internationalists. These ads are being dwarfed by the numbers of ads for local

government, financial administration, and nonprofit management specialties.

We offer an incremental contribution to this void by examining the standing

of public administration in Turkey. Our intention is to complement existing


knowledge rather than repudiate the American experience. In fact, the history of public

administration in the United States provides a foundation to push off against, and

our study employs frameworks previously suggested by Perry and Kraemer (1986)

and Houston and Delevan (1990). As a bridging country between East and West

and the Muslim and non-Muslim world, Turkey offers a significant setting in which

to examine the development of the field of public administration. In addition, many

of its administrative traditions originated in Europe; it possesses long historical ties

to the Byzantines, Balkans, and Turkish Republics in the former USSR; and its

academic ranks have been reinforced by American-trained scholars.

Methods

In this section, we discuss our research questions, our model and statistical

techniques, and our data collection. Articles published in the Journal of Public

Administration (AID) 1 are the population for the first part of our analysis. We studied

published articles over a 20-year period and, for the second part of our analysis,

examined the contents of course catalogs in Turkish public administration schools.

Research Questions

We seek explanations for this question: “What is the current situation of public

administration research and education in Turkey?” Together with this general question,

we also seek answers for the following subquestions:

1. Who publishes in AID? Scholars or practitioners? Can interdisciplinary

perspectives be seen in the background of the faculty and the authors

of the articles? If they are scholars, how do their backgrounds equip

them to understand interdisciplinary tendencies?

2. What do Turkish public administration programs emphasize?

What are the main subject areas for articles and courses in

education programs?

3. Do they build theory and/or test theory? What is the focus of

typical articles?

4. Do articles employ basic research methods? What is the research

stage for each study? Is research funded?

5. What types of methods, empirical analysis, or statistical techniques

do they employ?

6. What types of data and units of analysis do they have?

Data Collection and Statistical Techniques

Our study includes two dimensions: The first evaluates articles published in

the Journal of Public Administration (AID) in Turkey; the second evaluates public

administration education in Turkey. Data for the article analysis section were gathered

from a content analysis and descriptive information of published articles in AID.


All 601 articles were reviewed for the 20-year period from 1990 to 2009. Review

essays and special issues were not included. We picked AID, published by TODAIE 2

in the native language four times a year, as the journal to review because it is the only

public administration journal screened by Social Sciences Citation Index (SSCI)

in Turkey.

After a content analysis, each article was coded according to its descriptive information and the methodologies it employed. General information about the article and author includes characteristics such as the number of authors, university/

practitioner affiliation, academic rank of principal author, and funding for the

research. Following frameworks from Stallings and Ferris (1988) and Houston

and Delevan (1990), the analysis identified main researchable topics, whether a

theory was tested, main areas of study, article topics from other disciplines, and

the general approach of each article.

Each article was coded according to whether the general purpose or approach

of the article was to identify, introduce, or interpret law (legal briefs); introduce a

new subject; discuss issues critically; review literature; or analyze particular issues

with well-defined empirical research design (empirical). Another variable was created

regarding main areas of study in public administration or related disciplines to

search for interdisciplinary approaches in public administration. Additional variables

were created based on methods and statistical techniques employed to examine

whether they aim to test or build a theory, whether they employ statistical techniques,

and what types of data and units of analysis were used.
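As an illustration only (this is not the authors' coding instrument), one coded article record might be represented as follows, with field names paraphrased from the variables described above.

    from dataclasses import dataclass

    @dataclass
    class CodedArticle:
        """One row of the content-analysis data set (illustrative field names)."""
        year: int              # publication year, 1990-2009
        n_authors: int         # number of authors
        affiliation: str       # "University", "TODAIE", or "Practitioner"
        rank: str              # academic rank of the principal author
        funded: bool           # outside funding reported?
        purpose: str           # e.g., "legal brief", "critical discussion", "empirical"
        subject_area: str      # "Public Administration" or a related discipline
        theory_focus: str      # "builds theory", "tests theory", or "neither"
        uses_statistics: bool  # statistical techniques employed?
        data_type: str         # e.g., "survey", "secondary/archival"
        unit_of_analysis: str  # e.g., "individual", "organization"

    example = CodedArticle(2004, 1, "University", "Assistant Professor", False,
                           "empirical", "Public Administration", "tests theory",
                           True, "secondary/archival", "organization")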

The second part of the study focuses on teaching in public administration. We

examined catalogs in four-year bachelor’s degree programs in public administration

in Turkey. Although studies of public administration in the United States focus on

graduate-level courses, we examined undergraduate public administration programs

because these programs in Turkey are well established, accepted, and better known

by the public. We collected and analyzed course catalogs from the universities

offering the public administration degree. Of the 139 universities present in 2010,

only 62 have public administration departments. We gathered catalogs from 42 universities with public administration programs. Some of the remaining

20 programs had new departments and had not yet completed four-year programs,

and catalogs for the rest were not available. We reviewed these public administration

department catalogs to see what was being taught in core courses. Elective courses

were not used in this analysis, because some universities have broad lists of optional

courses that are difficult to categorize into meaningful themes. After reviewing, we

grouped the core courses for public administration into identifiable categories and

coded them accordingly.

In the analysis of articles, we used t-tests for independent samples for interval-level data and chi-square tests for nominal data to compare the influence of decade on our dependent variables. We evaluated Levene's test, the chi-square

test, and Phi coefficients to explain significance and strength of relationships. Our

samples met the conditions for both tests.


Turkish Public Administration Publishing

This portion of the study addresses several issues regarding publishing in Turkish

public administration. Findings are presented under these topics: characteristics

of authors, main areas of study, general purpose of article, and statistical techniques.

Findings for Turkey for the period 1990–2009 were compared with those for the United States for the 1990s, because more recent statistical information was not available for the United States.

Characteristics of Authors

Our first question is, who publishes in AID? Descriptive information about

this question is provided in Tables 1, 2, and 3. Articles in this public administration

journal tend to be single authored (Table 1). Although single-authored public

administration articles were about 65% in the Houston and Delevan U.S. study

(1990), this rate was 90% in the 1990s and 78% between 2000 and 2009 in Turkey.

Berkman (1987, p. 25) found this rate to be 94% for the period 1967–1987 for

AID articles. We observe a small but increasing trend both in coauthored and multiauthored

articles, suggesting that professionals from different topics and disciplines

are coming together to produce higher-quality publications.

Table 1.

Number of Authors

Number of     1990–1999                  2000–2009
Authors       Articles    Percentage     Articles    Percentage
1             287         90.3           220         77.7
2             28          8.8            53          18.7
3             3           0.9            10          3.6
Total         318         100.0          283         100.0

N: 601, d.f.: 599, Levene's test: t = 4.302, p < .001

The articles are mostly authored by university academicians (Table 2). The trend

shows that the percentage of “practitioner articles” declined from 18.9% to 8.5%

between the two decades. Only 1% of articles were authored by practitioners in the

private or nonprofit sectors. This finding suggests that AID is a common outlet for

academicians. Academicians' articles were 81% in the 1990s and increased to 91.5%

in the period 2000–2009, comparable to the average level of academicians’ articles

in U.S. public administration journals of the 1980s (Houston & Delevan, 1990,

p. 675). The other important finding is that articles authored by TODAIE members


have a declining trend over time. Berkman (1987, pp. 35–40) found that articles

written by TODAIE members between 1967 and 1987 represented 38.4% out of

503 articles. Our findings show that this decline has continued: the percentage fell to 28.6% in the 1990s and to 18.7% in the most recent decade.

Table 2.

Principal Author Affiliation

              1990–1999                  2000–2009
Affiliation   Articles    Percentage     Articles    Percentage
University    167         52.5           206         72.8
TODAIE*       91          28.6           53          18.7
Practitioner  60          18.9           24          8.5
Total         318         100.0          283         100.0

N: 601, Pearson's chi-square test: 27.589, Phi: 0.214, p < .001
* Academicians with TODAIE affiliation.
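To show how the statistics reported beneath Table 2 follow from its cell counts, here is a minimal sketch using SciPy. The Phi coefficient is computed here as the square root of chi-square divided by N, an assumption about the authors' formula, but one that closely reproduces the reported values of 27.589 and 0.214.

    import math
    from scipy.stats import chi2_contingency

    # Article counts by principal author affiliation and decade (Table 2).
    observed = [
        [167, 206],  # University
        [91, 53],    # TODAIE
        [60, 24],    # Practitioner
    ]

    chi2, p_value, dof, expected = chi2_contingency(observed, correction=False)
    n = sum(sum(row) for row in observed)
    phi = math.sqrt(chi2 / n)

    print(round(chi2, 2), round(phi, 3), p_value < .001)  # 27.59 0.214 True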

We also found that universities in Ankara such as Ankara University, Gazi

University, and Hacettepe University contributed most to the AID journal. These

three universities alone produced 20% of all articles between 1990 and 2009. This

finding illustrates an additional dynamic to the trend that AID articles are less

frequently authored by TODAIE members even though TODAIE publishes

the journal.

The articles between 1990 and 1999 were dominated by established academicians

(associate professors and professors), whose work constituted 57.2% of the articles

(Table 3). After 2000, we see a sharp increase in research done by assistant professors,

and research done by associate professors and professors declined to 33.4% of the

total. Contrary to our findings, the rate of publishing by established academicians

in earlier studies in the United States was around 45%. As an explanation for changes

in the Turkish context, the Higher Education Board (YOK) raised the standards

for promotion from assistant professorship to associate professorship after 2000,

which resulted in a sharp increase of publications authored by assistant professors.

However, the board made no significant changes regulating promotions from

associate to full professorship. Anybody with five years in the position of associate

professor could become professor throughout the period to 2010. Since AID is the

only public administration journal screened by SSCI, we suspect it also attracted

higher numbers of assistant professors from other disciplines.

We also examined funding for research. We found that only 2% of article

authors reported having outside funding for their work. This rate is very low compared to public administration research even decades earlier in the United States.


Table 3.

Academic Rank of Principal Author

Rank of Principal Author   1990–1999                  2000–2009
                           Articles    Percentage     Articles    Percentage
Student                    69          24.4           69          26.1
Assistant Professor        52          18.4           107         40.5
Associate Professor        90          31.8           49          18.6
Professor                  72          25.4           39          14.8
Total                      283 4       100            264         100

N: 601, Pearson's chi-square test: 27.589, Phi: 0.214, p < .001


Perry and Kraemer (1986, p. 218) found that this rate was around 10% in PAR

articles between 1975 and 1984; in five other major public administration journals,

according to Houston and Delevan (1990, p. 676), the rate was around 13%

by 1988. These numbers suggest that social science research is underfunded in

Turkish universities.

Most universities in Turkey are state owned, a situation that imposes many regulations and procedures on obtaining research support in the social sciences and generally discourages researchers from seeking it. Even when funding is approved, the researcher is limited to cost-basis reimbursement for purchased materials, whereas funding for the physical sciences covers primary costs such as laboratory equipment and other physical materials for experiments. Furthermore, funded researchers lack discretion to employ other researchers of their choosing. The amounts approved after all these limitations are symbolic at best, not enough to cover expenses or to reward the researcher individually. A review of university funding in general shows that most funded research at universities is in the physical sciences and medicine. 4

Main Areas of Study

Tables 4 through 7 present articles by subject area. Berkman (1987, pp. 24–25) reported that 70% of articles published in AID between 1967 and 1987 were directly related to public administration topics. 5 Compared to Berkman’s earlier study, we found slightly more articles published on public administration topics and a slight but statistically insignificant downward trend in subject matter from related disciplines during the second decade.

Table 4.
Subject Area of Study

Subject Area             1990–1999              2000–2009
                         Articles      %        Articles      %
Public Administration      206        64.8        191        67.5
Related Discipline         112        35.2         92        32.5
Total                      318       100.0        283       100.0

N: 601, Pearson’s chi-square test: Not significant

Administrative theory, organization theory, public policy, budgeting, and communication

in the public sector increased in significance between the two decades

as a proportion of total articles. Constitutional law, personnel administration, urban

and environment, and local administration all declined in significance. Similar to

the findings of Perry and Kraemer (1986), our results show that although public

policy attracts more scholars, personnel administration is losing ground in this

journal. Administrative law maintained the same level of attraction for scholars.

However, contrary to Perry and Kraemer (1986), we found that administrative

theory and budgeting are increasing in significance in AID. One possible explanation

is that new public management movements and downsizing policies resulted in more articles being published on these topics in the 1990s and at the beginning of the new millennium.

An interdisciplinary perspective can be seen clearly in the backgrounds of the faculty who teach in departments or schools of public affairs. Two decades ago, only about 40% of public affairs faculty in U.S. public administration schools came from political science or public administration; the remaining 60% came from a variety of disciplines (Ventriss, 1991, pp. 8–9). Holzer, Xu, and Wang (2003, p. 645) provided detailed evidence from PhD program course catalogs revealing similar multidisciplinary tendencies a decade later.

The interdisciplinary nature of public administration is to be encouraged, but directionless fragmentation can erode the field’s substantive worth. We look briefly (Table 6) at the example of journals published by faculties of economic and administrative sciences (IIBF journals). IIBF journals carry articles from all of the departments housed in colleges of economic and administrative sciences, including finance, management, economics, international relations, public administration, econometrics, and industrial relations. They sometimes appear to offer something from everywhere without apparent direction, despite their long lists of distinguished referees, and topics are frequently reviewed by referees who are not experts in the area.

Table 5.
Public Administration Topics

Area of Study                 1990–1999              2000–2009
                              Articles      %        Articles      %
Administrative Theory            33        16.0         58        30.4
Organization Theory              33        16.0         40        20.9
Public Policy                    21        10.2         25        13.1
Personnel Administration         31        15.0         13         6.8
Public Finance & Budgeting        8         3.9         13         6.8
Urban and Environment            15         7.3         11         5.8
Administrative Law               10         4.9         11         5.8
Constitution Law                 23        11.2          3         1.6
Local Administration             26        12.6          4         2.1
Communication                     6         2.9         13         6.8
Total                           206       100.0        191       100.0

N: 397, Pearson’s chi-square test: 50.707, Phi: 0.357, p < .001

We contend that AID, as a reputable professional public administration journal, should keep disciplines of study more successfully separated than IIBF journals have done.

Dose and Finger (1999, p. 653) argued that the original interdisciplinary

approach in Germany is dominated by a political science perspective and by a

management branch directed at public administration. As an extension of the

European tradition, we can identify similarities in the Turkish public administration

tradition. Given the complexities and boundary erosion in contemporary

societies, it seems nearly impossible for any discipline or profession alone to handle

even its own problems. Accordingly, political science and public administration

are taught together under the name of public administration in Turkey, and, as

we expected, around 40% of nonpublic administration articles address political science topics.

It is interesting that other law topics have increased substantially—from 15.2%

to 18.5%—between the two decades. These results also support Heper’s finding

(Berkman, 1987, p. 26) that legalistic approaches are still dominant in Turkish public

administration. We note that law courses in public administration catalogs make

up about 20–25% of total courses. We suggest that this strong legalistic approach

contains a normative influence that is likely to hold back other types of public

administration research.

Table 6.
Nonpublic Administration

Area of Study         1990–1999              2000–2009
                      Articles      %        Articles      %
Political Science        38        33.9         37        40.2
Management               17        15.2          9         9.8
Sociology                27        24.1         14        15.2
Law                      17        15.2         17        18.5
Economics                13        11.6         15        16.3
Total                   112       100.0         92       100.0

N: 204, Pearson’s chi-square test: Not significant

It is reasonable to assume that AID is a major publication of professionals

and academicians of public administration in Turkey. We see that other related

disciplines are increasingly being published in its pages (Table 7).

Table 7.
Articles Pertaining to Public Administration

                   1990–1999              2000–2009
Yes/No             Articles      %        Articles      %
Do not pertain        62        19.5         76        26.9
Do pertain           256        80.5        207        73.1
Total                318       100.0        283       100.0

N: 601, chi-square test: 4.583, Phi: 0.087; p < .05

Although it is acceptable and even encouraged to seek multidisciplinary topics, we note an increase in

articles that do not pertain to public administration from 19.5% to 26.9% between

the two decades. We assert that articles about other disciplines at least should be

closely related to public administration topics or issues. Unfortunately, AID articles

sometimes have nothing to do with public administration, such as marketing

strategies for commercial products. We suspect they get published largely because

they have good research designs, but this means editors are generally neglecting

important content criteria.

General Purpose of Article

What is the major purpose or approach of the article? Data in Table 8 report

the general purpose of articles published in AID. The findings support previous

studies reporting that public administration articles have been dominated by literature

reviews or legal briefs. However, the research orientation in AID articles is much thinner than in Perry and Kraemer’s (1986) review of Public Administration Review (PAR), which found 50% of articles offering empirical studies with well-defined

research designs. Houston and Delevan (1990, p. 677) reported that five

other major public administration journals had published 35% empirical studies

on average. Üsdiken and Pasadeos (1992, p. 254) reported that 37 articles out of

237 (15.6%) between 1975 and 1989, among four management journals in Turkey,

were empirical studies. The proportion of empirical studies remains low in AID,

even though we see an increasing trend between the two decades in empirical studies

(p < .05) against legal briefs. Nonetheless, published articles are still dominated

by literature reviews, which mostly focus on introducing new concepts and discussing problems and topics through descriptive, historical, and logical arguments. We

can conclude that articles in AID do not generally engage in rigorous empirical

research. This observation parallels findings regarding public administration course

catalogs reporting insufficient statistics and methodology courses in both graduate

and undergraduate programs of public administration. Therefore, we see that

neither quantitative nor qualitative methods are employed extensively among

published articles. It is unfortunate that qualitative study is almost nonexistent; our review revealed two case studies and no other alternative qualitative methods in use.

Table 8.
General Purpose of Article

Article Purpose                      1990–1999              2000–2009
                                     Articles      %        Articles      %
Legal Briefs                            33        10.4         15         5.3
Literature Review                      242        76.1        218        77.0
Empirical Study (Research Design)       43        13.5         50        17.7
Total                                  318       100.0        283       100.0

N: 601, Pearson’s chi-square test: 6.503, Phi: 0.104, p < .05

Table 9.
Hypothesis Testing

          1990–1999              2000–2009
No/Yes    Articles      %        Articles      %
No           305        95.9        265        93.6
Yes           13         4.1         18         6.4
Total        318       100.0        283       100.0

N: 601, Pearson’s chi-square test: 6.503, Phi: 0.104, p < .05

Findings of this study support the argument that public administration research in Turkey has not engaged in theory testing (Table 9). Only 6.4% of articles have a robust research design with well-prepared hypotheses described and used to test theories. We see an increasing trend in hypothesis testing in the second decade, but it is not statistically significant. Twenty percent of PAR articles between 1975 and 1984 were theory oriented (Perry & Kraemer, 1986, p. 217), a rate that increased to approximately 30% among major public administration journals in the United States (Houston & Delevan, 1990, p. 678). Cleary (2000, pp. 447–448) looked for an answer to a similar question: “Did the dissertations have a rigorous research design?” He concluded that 57 of the 168 dissertations in 1998 (33.9%) met the criterion for methodological validity, compared to 48 of 165 studies in 1990 (29.1%) and 30 of 142 dissertations in 1981 (21.1%).

Several factors might explain the findings regarding insufficient empirical focus

and lack of theory testing in public administration research. One possibility is that

other scholarly journals from reputable schools such as IIBF journals in Turkey

have become outlets for public administration research that engages in more rigorous

design and theory testing. Similar research done in the United States has noted

that public administration researchers often publish outside public administration

journals (Rodgers & Rodgers, 2000). However, Turkish public administration

scholars know that other IIBF journals are not doing extensive empirical research

either. Arı and her colleagues (2005, p. 21) found that 106 out of 151 Turkish

management master’s theses employed empirical research design, although two

thirds of these 106 had important flaws in their proposed hypotheses and methods

(p. 31). Assuming that management programs cover basic courses in methodology,

we suggest that Turkish management instruction needs to equip its researchers

with stronger methodological skills.

Other Turkish scholars fuel resistance to quantitative and empirical study by

criticizing approaches with too many numbers and too little theory (Keleş, 2009).

Such academic gatekeeping discourages others from developing research skills.

Unavailability of data in Turkey also contributes to the lack of empirical research.

The data in the Turkish Statistical Institute, for example, are at an aggregate rather

than individual level. These data do not help very much to test hypotheses devised

from public administration theories. Scholars are left to collect their own data and

often lose their motivation because of insufficient financial capacity to handle

bigger projects. Other arguments point to identity crisis explanations. The identity crisis of the academic discipline of public administration and the “battleground of administrative theory” have been discussed on both sides of the Atlantic, with critics arguing that public administration does not have a unique framework for guiding scholars. This situation results in a lack of theory building and theory testing. Modest evidence

in our study and others suggests the public administration discipline is nonetheless

better off than before with regard to theories and methodologies.

Statistical Techniques

Next we asked what methodologies empirical studies use (Table 10). We

examined the research designs of published articles to assess their data, units of analysis, and statistical techniques. Most articles do not describe their techniques

and data explicitly. They typically use data to support their descriptive or logical

arguments through cross-tabulation. We could not compare articles in terms of

pre-experimental, experimental, or quasi-experimental design, because too few

were described clearly enough. Among the articles in both decades, we found

only two case studies with well-defined qualitative methods.

Table 10.
Statistical Techniques

Technique                                  1990–1999               2000–2009
                                           Articles      %         Articles      %
t-Test (1 & 2 Samples)                         5        11.61          7        14.0
Cross-Tab                                     31        72.09         27        54.0
Regression & Other Advanced Techniques         7        16.30         16        32.0
Total                                         43       100.00         50       100.0

N: 93, Pearson’s chi-square test: 8.771, Phi: 0.198, p < .1

Wright and his associates (2004) noted that most researchers do not report

their measures appropriately, thus causing reliability and validity problems. The

articles in our study similarly failed to define their techniques and measures

clearly. Although the reasons are not certain, we offer three possible explanations for this failure. First, authors might think that their measures are not vitally important. Second, they might not know the statistical techniques very well. Finally, they might intentionally withhold their measures if they know their work was not carefully done.

Among articles that employed statistics, most used univariate and bivariate techniques. Some employed t-tests for one or two samples, and t-test usage increased only slightly between the two decades. Most articles employed cross-tabulations in their analysis, accompanied in some cases by chi-square, Mann-Whitney U, and t-tests. We note that only a few of the articles supported their cross-tabulations with such tests; that is, most used cross-tabulations only as visual evidence rather than as a statistically sound technique.
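To clarify the distinction drawn above, the following sketch (with invented data, assuming Python and scipy, and not drawn from the AID articles) shows the kind of test that can accompany a group comparison: a t-test for the difference in means and the Mann-Whitney U test as its rank-based counterpart.

    from scipy.stats import ttest_ind, mannwhitneyu

    # Hypothetical scores for two groups of respondents (invented for illustration).
    group_a = [3.1, 2.8, 3.6, 4.0, 3.3, 2.9, 3.8]
    group_b = [2.4, 2.9, 2.2, 3.0, 2.6, 2.1, 2.8]

    t_stat, t_p = ttest_ind(group_a, group_b)      # parametric comparison of means
    u_stat, u_p = mannwhitneyu(group_a, group_b)   # rank-based (nonparametric) alternative

    print(f"t = {t_stat:.2f} (p = {t_p:.3f}); U = {u_stat:.1f} (p = {u_p:.3f})")
    # A cross-tabulation reported without a test such as these offers only visual
    # evidence; adding the test makes the comparison statistically grounded.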

Even in the United States and other countries, Hallett (2000) found that

students who have good technical skills often have little understanding of how to

solve a problem when it is given in context. They can perform computations, but

cannot tell or explain the units of the quantity they have computed. Students of

public administration need to have both skills: computation and interpretation.

Multiple regression was a multivariate technique used more often in the

second decade. Other advanced techniques were ANOVA (n = 5), factor analysis

(n = 6), data envelopment analysis (n = 2), and other mathematical modeling

techniques (n = 4).
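For readers less familiar with these techniques, the sketch below (invented data, assuming Python with numpy; not taken from the articles reviewed here) shows the minimal form of a multiple regression fit of the kind referred to above.

    import numpy as np

    rng = np.random.default_rng(0)

    # Invented data: two hypothetical predictors and a noisy linear outcome.
    x1 = rng.normal(10, 3, size=50)
    x2 = rng.normal(14, 2, size=50)
    y = 2.0 + 0.5 * x1 + 0.8 * x2 + rng.normal(0, 1, size=50)

    X = np.column_stack([np.ones_like(x1), x1, x2])   # add an intercept column
    coefs, residuals, rank, _ = np.linalg.lstsq(X, y, rcond=None)

    print("estimated intercept and slopes:", np.round(coefs, 2))
    # Dedicated statistics packages would also report standard errors and p-values,
    # which is what turns a fitted line into an inferential (hypothesis-testing) model.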

Table 11.
Sources of Data

Data         1990–1999               2000–2009
             Articles      %         Articles      %
Primary         40        50.00         38        53.52
Secondary       40        50.00         33        46.48
Total           80       100.00         71       100.00

N: 151, Pearson’s chi-square test: Not significant

Berkman (1987) found that only 10% of Turkish articles used firsthand (primary)

data during the period of 1968–1987. We see a slight increase in total numbers

and percentages compared to his findings (Table 11); however, there was no

significant change between the two decades in our study. Wright and colleagues

(2004, p. 755) reported that public administration journal researchers created

their own data with self-administered surveys in 66% of cases and used secondary

data sets in 21% of studies. They argued that research questions should guide the

type of data to be used. This implies that, by comparison, data employed in many

Turkish studies were not optimally suited to their application.

Most of the research we examined used the individual as the primary unit of

analysis, and an increasing trend in this regard is seen between the two decades

(Table 12). Selecting individuals as the unit of analysis is suitable for testing public administration theories, which largely adopt behavioral approaches.

Table 12.
Units of Analysis

Unit of Analysis     1990–1999               2000–2009
                     Articles      %         Articles      %
Individual              31        72.09         45        90.0
Organization            12        27.91          5        10.0
Total                   43       100.00         50       100.0

N: 93, Pearson’s chi-square test: 4.350, Phi: 0.182, p < .05

The Curriculum in Turkish Public Administration

In the 1960s, Liebman (1963, p. 167) asked, “Can we teach what we do not

know?” We know a lot more now than we did then. We are arguably more knowledgeable

and more experienced. We also benefit from other disciplines with better

and available quantitative and qualitative methods, advances in computer technology,

better theories, and successful implementations. Professional public administration education provides administratively capable and politically responsible bureaucrats who can strengthen democratic government and carry out public policies for today’s world.

What are we trying to accomplish? Denhardt (2001, p. 526) suggested four

basic questions for educators in the field of public administration to focus on. He

asked first whether we seek to educate our students with respect to theory or to

practice. The tension between theory and practice is central to public administration

education. Administrators must develop specific skills that they can use in practice.

Second, he asked whether we prepare students for their first jobs, or for those they

might aspire to later. Third, he asked what the appropriate delivery mechanisms

are for courses and curricula. Finally, he asked, “What personal commitments do

we make as public administration educators?”

Many public administration graduates get into positions that are primarily

technical or analytical, such as budget analysts, personnel analysts, or administrative

assistants. Learning is a process of sharing, and sharing goes both ways. Thus we

must consider how much we share and to what extent we have systems that promote

sharing with students. We need to make sure that our students develop their technical,

managerial, and institutional skills cognitively, linguistically, and physiologically.

Ventriss (1991) also argued that public administration and public policy educators

overemphasize administration and analysis (teaching students to cope with complexity,

planning and budgeting, and problem solving) and underemphasize leadership

(teaching students to cope with change, communicating a vision, and motivating).

Analytical, management, and policy knowledge is regarded as essential for equipping students with the knowledge and skills necessary to participate successfully in careers devoted to public service. NASPAA’s past standards called for a common

curriculum in three broad areas (Ventriss, 1991, p. 8): (a) management of public

and third sector organizations; (b) application of quantitative and qualitative

techniques; and (c) an understanding of the public policy and organizational

environment. 6 It appears that similar to NASPAA, CAPPA accreditation—the

counterpart in Canada—considerably reduced the variations among programs

and emphasized professional skills (Gow & Sutherland, 2004). Core curriculum

for NASPAA schools included organization theory, public personnel, budget

management, research methods, policy analysis, public law, information systems,

and policy-making process (Averch & Dluhy, 1992, p. 543). Even though it was

in the core, public law was an uncommon course in public administration schools.

The Canadian public administration core focused on governance, policy decision

making, research methods, theories of public administration and policy, human

resources, organization theory, public finance, and macroeconomics (Gow &

Sutherland, 2004, pp. 11–13). When we look at Turkish programs of study, we

do not find the homogeneous core curriculum that was present even in these earlier

versions of American and Canadian standards.

Unlike that in the United States, public administration in Western Europe is

rooted in a strong state tradition. Public administration has to keep the state

going and exercise its public authority. The issue of common administrative law

has been a matter of debate since the outset of the European Community. The

main administrative law principles common to Western European countries are

discussed as follows (Connaughton & Randma, 2002, p. 2): (a) reliability and

predictability (legal certainty or judicial security), (b) openness and transparency,

(c) accountability, and (d) efficiency and effectiveness. These shared basic public

administration values and principles are thought to have led to some convergence

among national administrations. Verheijen and Connaughton (2003, p. 833),

however, contend that it is difficult to speak of a unique European model of

public administration teaching. They assert that a European “mode” or “model”

of public administration teaching has not yet emerged. Public administration

education in Western Europe does not constitute a “regional” model of its own,

due to the variations in administrative culture and the stronger dominance of a

legal orientation and analysis of the use of public power in Southern Europe in

comparison to Northern Europe (Connaughton & Randma, 2002, p. 3). For the

current analysis, this difference is a major concern and divergence point in Europe,

one that inhibits the sort of standardization that has occurred in the United States

and Canada. In particular, British common law stands out against the dominant

codified law perspective of continental Europe.

Public administration in Continental Europe has been predominantly a legal

study. The key theoretical concept in teaching administrative law has been the

concept of the state and its authoritative power over citizens. Thus one should

evaluate public administration together with the development of the state. Public

administration is studied from the integrated viewpoints of different disciplines,

generally those of political science, law, economics, and sociology, and public

administration is the core subject of the program. Since the 1970s, the ideas of

New Public Management and the trend of “doing more with less” in government

have become established, and many countries have incorporated management and business administration perspectives into public administration education. Connaughton

and Randma (2002, pp. 5–7) note that the public administration discipline and its teaching are less developed in former socialist/communist states. A weaker public administration discipline in countries with strong state traditions is initially puzzling. But when public administration is understood

to be fed primarily by political science and management, we see that in these former

socialist states, party dictatorship prevented democratic political science from

developing. On the other hand, due to lack of ownership rights, private sector

organizations and the study of management did not develop, either. Authors from

these countries often discuss the very limited number of applicable textbooks.

The emergence of specialized academic education programs in public administration

is a relatively recent phenomenon of this century in Turkey. However, the

study of government activity, governance, the administrative process, and public

policies may be traced back for centuries to Ottoman times of the 13th-century

Enderun experience. Ottoman sultans relied on people trained in these special

schools to be specialists in the “general business” of governments and states. Together

with Renaissance and Western influences, the Turkish system began to be heavily

influenced by the French system, and Turkey adopted many things from France.

The Turkish system is actually historically rooted in a combination of Roman law

and traditions from Continental Europe on the one hand, and Anglo-American

influences after World War II on the other. This influence can also be seen in

public administration research oriented toward the state and theories coming out

of research in the United States and Europe.

Political science in Turkey is rooted jointly in the disciplines of law and

history, going back to the Ottoman Empire. Organization theory gained a

stronger position in public administration research, and we observe a transition

from legal/constitutional analysis to organizational analysis. However, Turkish

public administration still has very strong legal influences in its course catalogs.

We can observe schools of thought in administrative science that stress judicial

thinking and others that emphasize social science thinking. Thus some catalogs

yield the impression that one is looking at a law school program. Approximately one fifth of Turkey’s public administration programs convey this law school impression. Such programs contain on average six other law courses not directly

related to public administration. Political science courses are also common in

many programs.

Political science departments have recently started to separate themselves from

public administration and are showing up mostly in new and private universities.

Contrary to programs in the United States, in Turkey political science was dominated

by public administration. In checking the trends of separating public administration

and political science departments, we discovered there are only three “political science,”

eight “public administration and political science,” and nine “political science and

international relations” departments in Turkish universities today. The other 42

departments are identified only as public administration. We see that in nine

instances (20%), political science courses are taught under public administration.

Table 13 depicts the numbers of courses under various general topics and the

percentage that each topic constitutes in the overall curricula of the programs in

our study.

Our findings reveal that, although courses in Turkey cover conventional topics

such as the intellectual history of public administration, human resource management,

and the policy process, coverage of these topics is less extensive than in the

United States. Another finding is that new public management receives more

attention in programs in Turkey than in the United States, whereas new governance

and public values are emphasized more in the United States than in Turkey. The

topics of ethics and intergovernmental relations receive less attention among courses

in Turkey than in the United States.

The question of what quantitative methods graduate students in public administration

should be required to master was actively debated among U.S. scholars

in the 1980s and 1990s, and NASPAA requirements provided public administration

education guidance on this matter (Ventriss, 1991, p. 6). The core quantitative curriculum in public administration master’s and PhD programs is more or less well known in the United States. It seems that most public administration programs focus heavily on teaching basic regression analysis and probability theory, progressing from linear regression to maximum likelihood techniques, both from an applied perspective and axiomatically. Five main categories of courses are typically taught (Rethemeyer &

Helbig, 2005, pp. 188–189): (a) introduction to probability theory and hypothesis

testing; (b) research design and survey methods; (c) introduction to regression

analysis; (d) a continuation of regression analysis; and (e) advanced topics in

research methods. However, none of these categories teach all techniques used

in leading research journals.

In both course catalogs and published articles, Turkish public administration programs are by comparison far behind the American and Canadian experience in using and teaching statistical techniques.

Table 13.
Public Administration Undergraduate Course Distribution

Course Topic                  Average Number of Courses    Percentage of Curriculum
Economics                              4.16                         9.03
Sociology                              2.65                         5.69
Methodology and Statistics             2.13                         4.81
General Accounting                     2.03                         4.28
Mathematics                            1.12                         2.66
Computers                              1.66                         3.46
Introduction to Law                    1.05                         2.38
Constitutional Law                     1.87                         4.13
Administrative Law                     1.97                         4.27
Other Law                              5.92                        12.50
Administrative Theory                  3.42                         7.14
Organization Theory                    2.72                         5.72
Local Administration                   1.68                         3.62
Public Policy                          1.67                         4.21
Public Budget & Financing              2.03                         4.45
City and Environment                   2.25                         4.83
Communication                          1.56                         3.42
Personnel Administration               1.04                         2.23
Political Science                      8.88                        19.47
International Relations                2.10                         4.45

In firsthand conversations, the first author has observed that Turkish public administration scholars are still influenced and

dominated by positivist ideology and often resist using many ideas that originate

in the United States. What Turkish programs need, in our view, is to develop both

quantitative and qualitative methodologies as complementary tools. While we

propose teaching increasingly sophisticated quantitative methods, we do not

advocate turning over the entire core to the statisticians. Instead, we suggest that

programs must address deficiencies by introducing methods that are appropriately

tied to or used to test the theories they employ. They should be able to introduce

qualitative methods following this same heuristic.

Conclusions

This article gives a broad overview of Turkish public administration research

over the past 20 years. It describes a research tradition based on organizational

theory and democratic theory. The account offered here is one in which public

administration and political science are usually taught under the heading of public

administration. It also draws a picture of public administration as integrated into

a complex network of domestic political institutions, public agencies, organized

interests, and clients as well as extensive European and international networks

and influence. We have characterized the strong influence of continental European

public law in Turkish course catalogs and published articles.

Conclusions can be drawn from this study that are similar to those in previous

American studies. Articles are primarily authored by public administration academicians

rather than practitioners. Public administration research gets little funding

support. Academic articles mostly focus on literature reviews and conceptual

development for future research. They engage little in testing and developing

theory. The articles published in the principal Turkish journal of public administration

raise serious questions about whether they actually have advanced

theory development.

We found that Turkish theory development and research orientation are probably weaker than what was found in American studies of two decades ago. We found

that Turkish public administration research articles showed low quality on such

indicators as the presence of an explicit theoretical or conceptual framework,

research design, and uses of qualitative and quantitative techniques.

We found that law courses are dominant in public administration programs

in Turkey, and political science courses are also very common in most programs,

although political science departments have recently started to separate from public

administration. Nonetheless, course coverage in many programs continues to reflect

the legalistic orientation that dominated Turkish public administration in the past.

In Turkish public administration, the past is prologue. A prominent theme

in traditional Turkish public administration was the dominant influence of the

military, which led the modernization of the Turkish state by promoting technocratic

aspects of administration. The period from 1990 to the present saw a growing

influence of management and political science perspectives and a contrasting

decline in the legalistic orientation. Other likely trends may include scholarship

on nonprofits and civil society.

Turkey is subject to the globalizing pressures pushing the world away from

traditional and collectivist concerns toward secular and self-expressive values

(Inglehart & Welzel, 2005). Turkey has long desired a place in the European

Union, and thus continues to adopt European administrative practices. Turkish

scholarship, on the other hand, takes its lead increasingly from the United States.

In addition to the influence of the expansive American public administration

literature, Turkish funding favors the rigorous methodological training in

American graduate programs.

But what of the capacity of Turkish traditions to help maintain a distinctive

Turkish public administration? We suggest that two features of Turkish national

culture, power distance and uncertainty avoidance (Hofstede & Hofstede, 2005),

will continue to play roles in retaining Turkey’s distinctive administrative

practices, and hence its scholarship. When subordinates generally accept their

bosses’ authoritative influence and prefer the certainty and stability of rules and

regulations, hierarchical structures persist. Significantly, Turkey’s 2011 economic

growth rate was second only to China’s, perhaps reinforcing the thesis that

hierarchies are more efficient than markets for short-term growth. Reinforced

through path dependency, hierarchical practices are assured a prominent role in

the future of Turkish public administration.

This study aims to provide information to those who, in public administration

or the framework of university networks, seek to reform the systems of public

administration programs to better adapt them to the needs of Turkish society. Of

particular concern, we find too few methodology courses in these programs to equip students with the necessary skills. Most programs include only introductory-level methodology courses and are largely devoid of advanced qualitative and quantitative methods content. This problem calls for immediate action by Turkish scholars, and its implications are visible both in scholarly articles in AID and in

public administration programs. In addition to pointing out the direct implications

for Turkish public administration, we hope our comparative analysis offers useful

insights for American scholars attempting to place their own programs of public

administration education in international context and to make generalizable

contributions to its study in other national contexts.

Footnotes

1 Amme İdaresi Dergisi (AID) in Turkish suggests an equivalent translation of Journal of

Public Administration.

2 Public Administration Institute for Turkey and the Middle East (Türkiye ve Orta Doğu Amme

İdaresi Enstitüsü—TODAIE) is an institute with special status established by the United Nations

in the 1950s. It grants graduate degrees and certificates and is staffed by academicians, but it is

not considered as a university.

3 Numbers of faculty differ somewhat from those in Table 2 because some academicians who were

temporarily assigned to government were counted as practitioners in Table 2, but their acquired

academic ranks are depicted in Table 3.

4 See, for example, university-funded research in Cumhuriyet University at http://www.cumhuriyet.edu.tr

5 Berkman (1987, p. 25) did not include communication and urban and environment as public

administration topics; Turkish public administration education now includes these topics in

course curricula.

6 The NASPAA guidelines have been updated several times since then, but we offer this earlier

framework because it applies temporally to the period of publications and programs of study in

our analysis.

References

Arı, G. S., Armutlu, C., Tosunoğlu, N. G., & Toy, B. Y. (2005). Nicel Araştırmalarda Metodoloji

Sorunları: Yüksek Lisans Tezleri Üzerine Bir Araştırma. Ankara Üniversitesi SBF Dergisi, 64,

16–36. (Methodological issues in quantitative research: A research on master’s thesis. Ankara

University Journal of Faculty of Political Sciences.)

Averch, H., & Dluhy, M. (1992). Teaching public administration, public management, and policy analysis: Convergence or divergence in the master's core. Journal of Policy Analysis and Management, 11, 541–551.

Berkman, A. Ü. (1987). Amme İdaresi Dergisi’nde Yayınlanan Makaleler ve Türk Yönetim Bilimi.

Amme İdaresi Dergisi, 20, 19–42. (The articles published in AID and Turkish Administrative

Sciences. AID.)

Cleary, R. E. (2000). The public administration doctoral dissertation reexamined: An evaluation of the

dissertations of 1998. Public Administration Review, 60, 446–455.

Connaughton, B., & Randma, T. (2002). Teaching ideas and principles of public administration: Is it

possible to achieve a common European perspective? EPAN Fifth Annual Conference, June 14–15.

Granada, Spain.

Denhardt, R. B. (2001). The big questions of public administration education. Public Administration

Review, 61, 526–534.

Dose, N., & Finger, M. (1999). Public administration in Germany and Switzerland: A review

symposium. Public Administration, 77, 651–687.

Gow, J. I., & Sutherland, S. L. (2004). Comparison of Canadian masters programs in public

administration, public management and public policy. Canadian Public Administration, 47,

379–405.

Hallett, D. H. (2000). Teaching quantitative methods to students of public affairs: Present and future.

Journal of Policy Analysis and Management, 19, 335–341.

Henry, N. (1995). Public administration and public affairs. Englewood Cliffs, NJ: Prentice Hall.

Hofstede, G., & Hofstede, G. J. (2005). Cultures and organizations: Software of the mind. New York:

McGraw-Hill.

Holzer, M., Xu, H., & Wang, T. (2003). The status of doctoral programs in public affairs and

administration. Journal of Public Affairs Education, 13, 631–647.

Houston, D. J., & Delevan, S. M. (1990). Public administration research: An assessment of journal

publications. Public Administration Review, 50, 674–681.

Inglehart, R., & Welzel, C. (2005). Modernization, cultural change, and democracy: The human

development sequence. New York: Cambridge University Press.

Jreisat, J. E. (2005). Comparative public administration is back in, prudently. Public Administration Review, 65, 231–242.

Keleş, R. (2009). Yerinden Yönetim ve Siyaset (Local Government and Politics). İstanbul: Cem yayınevi.

Liebman, C. S. (1963). Teaching public administration: Can we teach what we don’t know? Public

Administration Review, 23, 167–169.

Perry, J. L., & Kraemer, K. L. (1986). Research methodology in Public Administration Review, 1975–1984. Public Administration Review, 46, 215–226.

Rethemeyer, K. R., & Helbig, N. C. (2005). By the numbers: Assessing the nature of quantitative

preparation in public policy, public administration and public affairs doctoral education. Journal

of Policy Analysis and Management, 24, 179–191.

Rodgers, R., & Rodgers, N. (2000). Defining the boundaries of public administration: Undisciplined mongrels versus disciplined purists. Public Administration Review, 60, 435–445.

Stallings, R. A., & Ferris, J. A. (1988). Public administration research: Work in PAR, 1940–1984.

Public Administration Review, 48, 580–587.

Üsdiken, B., & Pasadeos, Y. (1992). Türkiye'de Yayınlanan Yönetimle İlgili Makalelerdeki Atıflar Üzerine Bir İnceleme. Amme İdaresi Dergisi, 25, 107–134. (A study on the references of articles published on public administration in Turkey. AID.)

Ventriss, C. (1991). Contemporary issues in American public administration education: The search for

an educational focus. Public Administration Review, 51, 4–14.

Verheijen, T., & Connaughton, B. (2003). Public administration education and Europeanization: Prospects for the emancipation of a discipline? Public Administration, 81, 833–851.

Wright, B. E., Manigault, L. J., & Black, T. R. (2004). Quantitative research measurement in public

administration: An assessment of journal publications. Administration & Society, 35, 747–764.

Murat Onder is an associate professor of Public Administration at Yıldırım

Beyazıt University in Turkey. His research interests include organizational and

institutional theory, cross-national comparisons, culture, nonprofits, public

administration research and theory, strategic and performance management in

the public sector, and the application of analytical techniques to the decision-making process. E-mail: monder@ybu.edu.tr

Ralph S. Brower is associate professor, Askew School of Public Administration

and Policy, and director, Center for Civic and Nonprofit Leadership, Florida

State University. His teaching and research focus on organization studies,

voluntary organizing, and international/comparative administration. His work

has appeared in numerous public administration and nonprofit journals. E-mail:

rbrower@fsu.edu

Competency Model Design

and Assessment: Findings

and Future Directions

Heather Getha-Taylor, Raymond Hummert,

John Nalbandian, and Chris Silvia

University of Kansas

Abstract

Competency models offer potential for defining effective and/or superior performance

and then aligning curriculum and other learning opportunities with individual

development goals. However, barriers exist that prevent optimal use of competency

models, including difficulty identifying competencies and assessing development

appropriately. This paper presents insights based on the design and implementation

of a competency model for MPA students at the University of Kansas. Goals of

this multiyear effort include (a) helping students assess their development as they

progress through the MPA program, (b) linking competencies to curriculum and

experiential learning opportunities, and (c) assessing progress using multiple evaluations

over time. This paper considers associated challenges, including competency

identification, assessment, and the need to capture emerging competencies.

Today’s MPA students will face a host of challenges when practicing public

management after graduation. To prepare them for that reality, NASPAA-accredited

programs seek to “develop the skills and techniques used by leaders and managers

to implement policies, projects, and programs that resolve important societal problems

while addressing organizational, human resource, and budgetary challenges” (NASPAA,

2012). Of course, programs accomplish these tasks in various ways. Regardless

of the approach taken, it is incumbent upon the faculty to assess how well students

develop the skills, aptitudes, and perspectives they will need to operate successfully

in their chosen profession and to develop habits of reflective self-assessment. One

way to address this challenge is to use competency models that define the characteristics

that result in effective and/or superior performance on the job (Boyatzis, 1982)

and then align those competencies with curriculum and other learning opportunities.

Keywords: competency models, student self-assessment, outcomes evaluation

JPAE 19(1), 141–171

Competency modeling offers a number of benefits, including a focus on both

current and future individual development (Sanchez & Levine, 2009), but many

barriers prevent its optimal use, including difficulty identifying competencies and

assessing development appropriately (Op de Beeck & Hondeghem, 2010). Further,

in teaching MPA students, we face the challenge of developing a competency-based

assessment that acknowledges the differences in experiences and accomplishments of

mid-career and pre-service students. This paper provides insights on these challenges

based on the design and implementation of a competency model for MPA students

at the University of Kansas. The goals of this multiyear effort include (a) helping

students track their competency development as they progress through the MPA

program, (b) linking competencies to curriculum and experiential learning opportunities,

and (c) assessing progress using multiyear data collection.

Scholarly literature indicates varied applications for competency models in

the context of graduate education. Competency models can be used to respond

to changing needs of the profession (Batalden, Leach, Swing, H. Dreyfus, & S.

Dreyfus, 2002), help students prepare for leadership roles (Kleinman, 2003), and

help faculty members and administrators respond to curriculum gaps (Johnson

& Rivera, 2007; Rice, 2007). Further, such models can be used to design holistic

educational approaches (Robotham & Jubb, 1996; Talbot, 2004; Tomkins, Laslovich,

& Greene, 1996) and contribute to lifelong professional development and learning

(Rodolfa et al., 2005). Finally, we believe that engaging students with a competencies

model can promote the kind of self-reflection that adds value to the goal

of lifelong learning. The challenge, of course, is identifying, defining, and assessing

the competencies of interest for successful performance and aligning such competencies

with educational components (McEvoy et al., 2005).

In this article, we examine the experience of the University of Kansas in

applying its MPA competency model to pre-service students. Topics presented

include (a) an overview of competency modeling and applications to an MPA

curriculum; (b) the history, development, and purpose of the MPA competency

model used by the University of Kansas; (c) multiyear competency assessment data; and (d) considerations for future directions and broader applications.

Competency Modeling and Applications to MPA Curriculum

Describing the concept of “competencies” is the starting point. Competencies

are those underlying characteristics, says Boyatzis (1982), that are “causally related

to effective or superior performance in a job” (p. 21). Competencies move beyond

traditional knowledge, skills, and abilities (KSAs) to capture job-related motives,

traits, and self-concepts (Daley, 2002). Further, competencies are distinguishable

from KSAs in that they focus on future development and potential for performance.

To this end, competencies can help answer the question, “How do we know good

performance when we see it?” Ideally, competencies can guide a number of critical

workforce functions, including hiring, development, and even evaluation. In the

context of MPA education, competency models can help connect curriculum to

desired outcomes and guide students in their professional development efforts.

Identifying and validating core competencies for MPA education rests on a

process that includes competency identification and modeling, validation, and

assessment. Beginning with identification and modeling, varied methods are available.

Programs may adopt an existing model or may choose to develop an original model

that draws on first-person accounts or expert panel data (L. Spencer & S. Spencer,

1993) to determine those characteristics related to effective or superior performance

in the selected context. Next steps, regardless of the selection process, involve applying

the model and determining the validity of the selected competencies. A final

and often forgotten step is continuous evaluation of the competency model. Just

as workforce demands change with time, so too should competency models keep

pace with emerging developmental needs (Getha-Taylor, 2008).

The contemporary emphasis on competencies reflects rapidly changing environments

that require skills extending beyond the boundaries of any one job and

that indicate an individual’s ability to adapt and learn (Rodriguez, Patel, Bright,

Gregory, & Gowing, 2002). Further, a focus on technical, ethical, and leadership

competencies helps ensure that public servants do things right and also do

the right things (Bowman, West, Berman, & Van Wart, 2004). The rate of rapid

change that affects public service also affects the continued validity of public

service competency models. Ideally, such models should reflect current demands

and help emerging public service professionals meet the challenges of the future.

Despite the promise of competency management, a number of difficulties

for utilizing competency models in MPA programs are notable. First, real or

perceived resource constraints (including financial resources, time, and in-house

expertise) can stall competency efforts. Second, determining how best to select

and/or develop original competency models for use in MPA programs presents a

challenge. In addition, the utilization of these models and the reassessment to

reflect contemporary development needs requires long-term commitment and

continued attention. However, it is not the aim of this paper to focus exclusively

on the challenges. Rather, our goal is to highlight both the need and opportunity

for such initiatives.

Competency models speak to the related and critical instructional concepts

of mastery and transference. To develop mastery, Ambrose and colleagues (2010)

note that “students must acquire component skills, practice integrating them,

and know when to apply what they have learned” (p. 95). Competency models

can help students identify what they’ve learned and reflect on applications. Mastery,

though, is a multiphase developmental process that includes two key dimensions:

competence and consciousness (Sprague & Stuart, 2000). This process and its

applications to MPA competency models are presented in Table 1.

Table 1.
Developmental Stages and MPA Competency Applications

Level   Stage                       Description                                                MPA Competency Application
1       Unconscious incompetence    Students do not know what they don’t know                 Consider initial assessment inflation
2       Conscious incompetence      Students are aware of what they need to learn             Interpretation of subsequent assessments
3       Conscious competence        Students have competence but must act deliberately        Consider development timeline
4       Unconscious competence      Students exercise skills automatically or instinctively   Integrate focus on long-term development

Source. Adapted from Ambrose et al. (2010) and Sprague and Stuart (2000).

It should be stressed that the mastery developmental process begins in a state

where students not only lack competence but also are generally unaware of what

they do not know. This situation may result in inflated initial self-assessments. As

students progress in their education, it is expected that both their consciousness

and competence will develop to help them identify what they are learning and what

they still have to learn. True mastery occurs only when the initial stage of unconscious

incompetence progresses to the final stage of unconscious competence: We want our

students to have competence that can be used automatically and instinctively.

To this end, MPA programs aim to provide students with foundational learning

opportunities and resources that they can apply to the practical challenges of

governance. Key to effectiveness in this regard is the ability of students to transfer

what they are learning to practical contexts. This goal, educational transference,

rests on similar learning and application contexts and the ability of students to

know how to apply what they are learning in the classroom to practical challenges

(Ambrose et al., 2010). One way to improve transference is to give pre-service

students opportunities to apply skills or knowledge, through such activities as

service learning and/or internship experiences. The University of Kansas MPA

program incorporates these components and also provides opportunities for

student growth and reflection on the development process.

In addition to academic coursework and the internship experience, pre-service

students participate in a series of professional development seminars with faculty

members during their second year—their full-time internship year—that allow

for group discussion, individual processing, and debriefing experiences. These

reflective activities help illustrate how students apply what they have learned in

the classroom to practical challenges. This approach is aligned with Schon’s (1983)

“reflection-in-action” perspective, which emphasizes managing through turbulence

and uncertainty using observation and reflective conversation as a supplement

to technical knowledge. Together, it is expected that academic coursework, the

yearlong internship experience, and the series of professional development seminars

provide a total student experience that addresses the connected needs of mastery,

reflection, and application.

Learning from Project History

This section reports the approach and lessons learned from the University of

Kansas (KU) portfolio and competencies project. 1 The competencies project, the

results of which are reported in this paper, grew out of a larger focus on portfolio

development following a National Association of Schools of Public Affairs and

Administration (NASPAA) accreditation visit in the early 2000s. Portfolio development

was discussed with the site visit team and then pursued in conversations at

KU with a faculty member invited from West Virginia University, where a portfolio

requirement had been implemented. Seeking more information, the department

chair and another faculty member at KU visited the Dean of the KU School of

Architecture, where portfolios are commonplace. Of singular importance to our

project was the dean’s observation that although a portfolio contains evidence of

an architect’s work, most important for readers and viewers are reflective statements

conveying why a particular artifact is in the portfolio and what movement from

one project or style of architecture to another means to the architect.

From those beginnings, the project took off with a gathering of first-year

pre-service MPA students and local government practitioners in the summer of

2001 to discuss what elements should be included in student portfolios. The first

group of portfolios consisted of paper documents in three-ring binders, loosely

organized around the International City/County Management Association (ICMA)

competencies. The portfolio project progressed very slowly, in part because the

university lacked enabling software and also because the department was working

its way through portfolio purpose and design issues, including faculty roles. A key

point occurred with a decision to link the portfolio assignment to development

of a competencies matrix. The thinking was that student focus on competencies

could provide an anchor for portfolios and help standardize their presentation.

Work on a competencies rubric began as an assignment in an MPA class in

Human Resources Management. In that initial class, students were asked to search

for lists of managerial/leadership competencies in public sector organizations and

associations. They were encouraged to engage in a worldwide electronic search so

they could understand that the movement toward competencies-based human

resources management was not isolated to the United States.

Following that assignment, in spring 2005, the department convened a meeting in Lawrence of representatives from regional NASPAA schools to talk about outcome-based education. Also, at a NASPAA conference in fall 2006, KU faculty presented

a paper documenting our interest in outcome-based education and our progress

toward developing a list of competencies.


A significant point in the development of a rubric occurred when the Canadian

Public Service (CPS) competencies project was discovered in preparation for the

NASPAA panel. On its face, the CPS definition of public service and its broad

categorization of competencies into four sections made so much sense that the

department simply adopted it as written. The CPS conceptualization has given

the KU rubric intellectual coherence, providing an intellectual guide for future

discussions of curriculum. It should be restated that at the time the Canadian

Public Service Model was discovered, the rubric was still in the class project

stage. In other words, no formal faculty adoption of a rubric or its content was

required. This situation greatly simplified development of the rubric and its

intellectual underpinnings.

In a subsequent class with the same instructor, a small group of students accepted

an assignment to review the lists the previous class had gathered and develop a matrix

consisting of common competencies. The resulting rubric was presented to faculty,

and two faculty members not originally involved in the project volunteered to

review the rubric the students developed with an eye toward creating symmetry.

Symmetry was needed regarding degree of specificity of the competencies (rows

on the rubric) and also consistency in levels of achievement (columns on the rubric).

Also, the two faculty members looked for obvious areas of content omission given

our curriculum and MPA program focus.

Faculty acceptance, with minor changes, of the rubric attests to its face validity.

That validity is drawn from the sources of the lists the students drew from to

identify common competencies. The list included competencies identified and

used by a local government, a state government, the ICMA, and the International

Public Management Association–Human Resources (IPMA–HR), suggesting a

sound professional public administration grounding in practice. Faculty confidence

in the rubric was further enhanced when we saw how easily the competencies fit

into the Canadian Public Service template, since the CPS model captured the

KU public administration curriculum so nicely.

Even though faculty were not involved in the original development of the

competencies rubric, as key stakeholders, they had endorsed the idea of student

portfolios, which pre-career students were then required to prepare and then

present to a faculty advisor during an internship seminar—where they were

discussed—immediately before their graduation. Because the project developed

the way it did, other than for the internship faculty member, its purposes did not

intrude on faculty prerogatives and time.

As described later, our initial goal was student self-assessment, and because of

the way the rubric was developed—drawing from professional practices—it was a

rubric of career-long learning rather than learning isolated to an MPA curriculum

alone. Eventually, we hoped to extend our goal from student self-assessment to

curriculum reform. But the self-assessment purpose, the career-long focus of the

rubric, and questions that might be raised about validity of the rubric—developed

from preexisting lists—postponed that discussion to the present. In addition, the


student cohort we required to use the rubric was focused and had a faculty advisor.

This approach meant that general faculty involvement in reviewing student progress

could be minimized until the department was confident it was on a solid path.

Over time, our goals for the competencies project became clear:

• Help students track development as they progress through the MPA

program both in the classroom and their internships.

• Link competencies to curriculum and experiential learning opportunities.

• Assess progress using both quantitative and qualitative evaluation.

Regarding the first goal, two aspects of the project argued for student self-assessment

rather than third-party (faculty) assessment. First, the project did not

begin with an explicit connection to the MPA curriculum. The rubric was drawn

from other sources, and it represented career-long learning by professional public

administrators rather than from a curriculum itself. Thus, objective third-party

assessment by faculty of student progress in their MPA program seemed inappropriate.

Second, and more important, in the HR class where the competencies

project originated, the idea that students and/or professionals are responsible for

their own professional development was emphasized. Although feedback on

progress is important to inform reflection, faculty assessment of student progress

reinforces hierarchical accountability—a concept deemed inappropriate to

career-long professional development.

To achieve the first goal, students are required periodically to report on their

movement along the matrix. They are asked to pick any three of the competencies

where they can report movement, to produce the evidence they have of movement,

and then to reflect in writing about their movement—why it is important to them.

We have produced a guide with examples of “evidence” artifacts and reflective

statements. Faculty involvement to this point has been isolated to individual

advisors, the faculty member who was involved with the project from the beginning,

and to the school’s academic advisor. Responses to the student’s work are

purposefully nonevaluative but are intended to provoke honesty, accuracy, and

reflection. Faculty comments are often phrased as questions: “You have produced

evidence of progress based on your work in a jurisdiction of 30,000; are you confident

you could do the same in a jurisdiction of 500,000?” What we have learned

from reviewing student statements of progress is that often students think they

have accomplished more than an experienced observer might agree with. We also

saw that while working in their full-time internships, students occasionally saw

themselves as knowing less than originally thought.

We have addressed this issue by re-conceptualizing the rubric from a flat screen

to a cube. In other words, each box in the rubric contains depth, and progress

may be seen in terms of building depth rather than moving horizontally. This

method has relieved some student self-imposed incentive to show “progress” as

horizontal movement toward mastery, and we think it has enabled students to be

more honest with themselves.


A key moment in the entire portfolio/competencies project occurred during

one of the seminars when students were discussing their progress constructing

portfolios. The discussion began to focus on what faculty expected when one of

the graduating students blurted out poignantly and in frustration, “Screw it: It is

not theirs!” This simple expression reinforced the idea that we were not trying to

create another grading tool; we were trying to assist our students and graduates

to plan and chart their careers. It was their responsibility, and we were providing

a tool with the portfolio and the competencies rubric.

Moving to a discussion of the second goal, the most important question we

face in linking the competencies to curriculum and experiential learning opportunities

is how much overlap we want in the MPA curriculum and competencies.

At KU, our pre-career students have a part-time internship in their first year, and

their second year consists of a full-time internship with three professional development

seminars led by KU faculty. The second goal forces a reexamination of the

view that the curriculum is the sole contribution to the student’s education. Linking

the competencies rubric to our students’ total educational experience both in the

classroom and out—as they define their educational experience—has led us to

focus on the concept of the “totality of the student experience” rather than looking

solely at the contribution that faculty make in class to a student’s education.

Faculty have been asked to identify in their syllabi the competencies covered

in their classes. Interestingly, this request met with no resistance from faculty.

This result suggests that faculty members are quite attuned to the concept of

competencies, and including references to the rubric in their syllabi is not an

intrusion on their prerogatives or approach to learning.

Our intent is to use the data we now have about student progress to review

the curriculum—which at KU includes substantial experiential learning for

pre-career students. With the student self-assessment data, we now are able to

identify pre-career student progress in their academic year on campus and also in

their internship year. Rather than working from a preconceived notion of what

competencies are required and shaping the curriculum in a traditional approach,

we now have data we can use to determine what students are learning in their on

campus year and what they are learning in their internships. These data, coupled with a general survey of intern supervisors on student preparedness at the conclusion of the internships, provide additional knowledge about students’ learning.

Our goal is to bring this information to faculty to inform a discussion of

curriculum. This effort is significant in terms of the sequence of curriculum

development, which often has an unspoken purpose of legitimizing faculty

interest areas. Now, we truly will be focused on what students have reported as

their learning. We now can ask more comfortably, “Shouldn’t they be learning


about X on campus?” In other words, the curriculum discussion will have a base

grounded in data rather than one solely based on faculty understanding of the

academic discipline, the field of professional practice, and their own intellectual

connections to that practice and the discipline.

Turning now to the third goal, assessing progress through evaluation, at each

stage in development of the competencies matrix and its connection to a portfolio

requirement, we asked students for feedback. Their input reinforced the idea

of self-assessment, but it also provided ideas on how we could or should use the

matrix, involve faculty, and conceive of the project. We are now at a point where

we can assess the results from separate cohorts on the competencies rubric. The

quantitative results are presented in the next section.

Data Findings

The effort to gather and analyze quantitative data from student competency

self-assessments began with the intern-option KU MPA Class of 2009. At that

time, data were collected at two specific points: at the beginning and end of the

program. It was later determined that a third data collection point would be

necessary to separate the effects of classroom experiences in year one and the

internship/professional development seminars in year two. Beginning with the

Class of 2010, data were collected at three separate points: (a) at the start of

their academic coursework (Time Point 1), (b) at the end of their academic

coursework (Time Point 2), and (c) at the end of their internship experience

(Time Point 3). At each point, the students were asked to self-evaluate their

competency in each of 29 areas of interest using a 5-point Likert Scale, where

higher scores indicate greater perceived competencies. Although early versions

of the matrix included labels for each Likert Scale point such as novice, apprentice,

and so forth, the current version has omitted these in favor of providing

only a competency-based description of what is meant by each Likert Scale point

(see Appendix B). At the time of this writing, complete data sets exist for two

cohorts (Class of 2010 [n = 12] and Class of 2011 [n = 14]). The resulting data

are analyzed and presented here.
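The change scores summarized in the figures that follow reduce to a simple aggregation of the Likert ratings collected at the three time points. The sketch below is a minimal, hypothetical illustration of that arithmetic, not the program's actual analysis code: it assumes a flat table with one row per student per time point and one column per competency, and the file and column names are invented for the example.

# Hypothetical sketch of the aggregation behind Figures 1-5 (not the
# program's actual code). Assumes a CSV with columns: student_id,
# time_point (1, 2, or 3), and one column per competency holding the
# 1-5 Likert self-assessment.
import pandas as pd

ratings = pd.read_csv("competency_self_assessments.csv")  # invented file name
competencies = [c for c in ratings.columns
                if c not in ("student_id", "time_point")]

# Average self-assessment per competency at each time point
# (the basis for Figures 1 and 5).
means = ratings.groupby("time_point")[competencies].mean()

# Average change per competency: Year 1 (classroom, Time Point 1 to 2) and
# Year 2 (internship/seminars, Time Point 2 to 3), as in Figures 2 and 3;
# their sum is the overall program change shown in Figure 4.
year1_change = means.loc[2] - means.loc[1]
year2_change = means.loc[3] - means.loc[2]
overall_change = year1_change + year2_change

print(year1_change.sort_values(ascending=False).head())
print(overall_change.sort_values(ascending=False).head())

Sorting the change scores, as in the last two lines, reproduces the kind of ordering used in the figures, from greatest reported gain to smallest.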

Student self-assessments at the start of their program (Time Point 1) indicate

that overall, conflict resolution, resource allocation, financial management, group

dynamics, and understanding policy trends are the competencies students rate

lowest, and thus are most in need of development (Figure 1). Initial assessments

also reveal that, overall, student self-assessments identify strengths in individual

differences, diversity, verbal communication, public service, and written communication.

These data serve as a starting point for the students as they learn and

grow over the two years in the program and beyond.


Figure 1.

Initial Competency Self-Assessment Score

[Bar chart: average initial self-assessment (Likert Scale Score, 0 to 3.5) for each of the 29 competencies, ordered from highest (Individual Differences, Diversity, Verbal Communication, Public Service, Written Communication) to lowest (Policy Trends, Interpersonal group dynamics, Financial Management, Resource Allocation, Conflict Resolution).]

Note. A 5-point Likert Scale was used, where higher scores indicate greater perceived competencies.

Figure 2 considers the development that takes place over the first year in the

MPA program. Student self-assessments from Time Point 1 and Time Point 2 are

analyzed to consider the change that occurs as a result of classroom experiences

and traditional instructional techniques. During the year of intensive academic

coursework, students illustrate greatest developmental change related to resource

allocation, group dynamics, policy trends, policy impact, and written communication.

These findings are particularly interesting in two ways. First, given the self-assessed

initial strength in written communication, it seems that although students perceived

strength in this area, graduate coursework developed this competency even further.

Second, the classroom experience seems well suited to addressing the competencies

that students perceive in need of development, including resource allocation, group

dynamics, and policy trends. Perhaps even more intriguing is the finding related

to the diversity competency: Students generally reverse their initial strong self-assessment

to indicate perhaps an improved understanding of what diversity really

means in the context of public management. This finding supports the conclusion

that many students enter the program with unconscious incompetence: They do

not yet know what they do not know.


Figure 2.

Examining Development in Year 1: Impact of Classroom Experiences

[Bar chart: average change in Likert Scale score for each competency between Time Point 1 and Time Point 2, ordered from greatest change (Resource Allocation, Interpersonal group dynamics, Policy Trends, External Policy Impact, Written Communication) to smallest (Diversity).]

Note. Scores represent the average change in Likert Scale scores between the initial assessment (Time Point 1) and the assessment at the end of the first year in the program (Time Point 2).

Figure 3 considers the developmental change that occurs in year two of the

program, which involves a full-time internship and also professional development

seminars. Findings indicate significant competency growth on policy formation,

policy implementation, information system management, verbal communication,

and collaboration. Again, it is notable that one of these (verbal communication)

was identified as one of the initial competency strengths. Further, Figure 3 demonstrates

the complementary impact the internship experience and professional

development seminars have on coursework: It appears that these experiences

support growth in needed competency areas, including collaboration and information

system management. Finally, this figure indicates the development of

mastery as revealed by the employment law competency. The data indicate that

students provide a lower self-assessment on this competency after completing their

internship and professional development experiences. This result suggests that

students are learning to apply what they covered in the classroom to practical

situations. Students’ reverse movement on this competency suggests two potential

explanations. First, the reality of applying employment law to practical management

challenges may have revealed the complexity of mastering this competency.


Second, this finding may suggest that the curriculum needs updating to ensure

students are receiving adequate training before beginning their internships.

Figure 3.

Examining Development in Year 2: Impact of Internship and Professional Development Experiences

[Bar chart: average change in Likert Scale score for each competency between Time Point 2 and Time Point 3, ordered from greatest change (Policy Formation, Policy Implementation, Info System Management, Verbal Communication, Collaboration) to smallest (Employment Law).]

Note. Scores represent the average change in Likert Scale scores between the assessment at the end of Year 1 in the program (Time Point 2) and the assessment at the end of Year 2 in the program (Time Point 3).

To present a picture of the overall change in competency ratings over the course of

the two-year program, Figure 4 combines the developmental changes that occurred

during both the academic coursework (see Figure 2) and the internship/professional

development experiences (see Figure 3). Unlike Figure 4, which depicts the change

in competency self-assessment, Figure 5 focuses on the average Likert Scale score

for each competency. It combines the data taken at each of the three time points

to present a cumulative bar chart of the average competency self-assessment. Together,

these figures provide an opportunity for discussion related to strengths and strategic

adjustments for the future. For instance, they can help faculty members identify

competency areas that are developed effectively in the classroom (e.g., group

dynamics, resource allocation, policy impact, and written communication) and

those that are effectively addressed through internship and other professional

development activities (e.g., policy implementation, verbal communication,

information system management, evidence-based practice, and long-term impact).


Figure 4.

Overall Program Impact on Competencies

[Stacked bar chart: average change in Likert Scale score for each competency over the full two-year program, ordered from greatest overall change (Policy Formation) to smallest (Employment Law), with each bar segmented by source of change (classroom experience vs. internship/professional development experience).]

Figure 5.

Student Competency Self-Assessments Over Time

[Cumulative horizontal bar chart: average Likert Scale score (0 to 4.00) for each of the 29 competencies, combining the initial measurement (Time Period 1), the midpoint measurement (Time Period 2), and the final measurement (Time Period 3), with adjustment segments between measurements.]


These figures also illustrate the impact of time and application for key concepts

covered in the classroom. Notably, competency areas like integrity/ethics, collaboration,

and diversity, which are addressed in the curriculum, may not be fully understood

until students have opportunities to apply these concepts in practice. This speaks

to the developmental principle of conscious competence, which emerges only over

time. In addition, these figures reveal areas of opportunity for future curriculum changes, such as conflict resolution, conflict prevention, decision analysis, and service standards and delivery. Students did not feel that their competence in these areas grew as much, relative to their reported growth in other areas. Overall, these

figures illustrate the totality of the student experience, including both classroom

and applied experiences, and the complementary impact these activities have on

competency development overall.

Discussion

The findings present an opportunity to extend this discussion beyond the

university context in which these data were collected to consider three broad themes:

(a) the totality of the MPA student experience, (b) the developmental stages of

mastery, and (c) implications for program decisions, including curriculum choices.

Regarding the first broad theme, the data illustrate the distinctions between

first- and second-year assessments. This finding speaks to the totality of the

student experience, including both the classroom learning experiences and the

second-year full-time internship experience facilitated by professional development

seminars. These intern-option students illustrate different developmental

gains through participating in these separate but related activities. This result speaks

to the key educational goal of transference. Although the data illustrate distinct

developmental differences, it is notable that the classroom experiences are supplemented

with opportunities to apply foundational lessons to practice. It is important

to note that career-option students—who are not part of our project to date—may

illustrate different developmental patterns, given their work experiences.

In addition, the data reveal changes that suggest reflection and reassessment.

On average, some competency components reveal lower self-assessments following the completion of program components. This finding supports the work

of Sprague and Stuart (2000) that describes mastery as a developmental process

that involves the gradual recognition of competence (or incompetence). Initial

assessments on some competency dimensions are inflated because students do not

yet know what they do not know. As they gain classroom and internship experience,

they become more conscious of their developmental needs as academic

themes are highlighted in practice.

The data speak to the final theme in several ways. First, competency assessments

can help faculty members make data-based curriculum decisions by

identifying curricular strengths and needs. Second, the data can highlight the

program components that are best developed through classroom instruction or

internships. Third, competency assessments can become an important segment


of a “balanced scorecard” approach to MPA program evaluation when combined

with student reviews upon graduation and assessments of students by intern

supervisors. This approach helps determine the continued relevance of the model

and its components.

Finally, competency models can help programs align their efforts with broad

priorities that are shared by other public administration programs across the nation

and even around the world. Especially when programs choose to adopt and apply

existing models, these goals can be met more easily. Crafting an original competency

model may prove a useful exercise for legitimizing the program and its curriculum,

but it is not the position of this paper to advocate for either approach. Rather, its

aim is to encourage programs to determine the approach best suited for their needs.

Conclusion

Although this paper presents the University of Kansas MPA competency

model and assessment experience, we expect that the findings and discussion

may help other programs evaluate their current development and evaluation

methods to balance continued focus areas and emerging needs. All NASPAA-accredited

programs are expected to develop competencies in five domains: (a)

leading and managing in public governance; (b) participating in and contributing

to the policy process; (c) analyzing, synthesizing, thinking critically, solving

problems, and making decisions; (d) articulating and applying a public service

perspective; and (e) communicating and interacting productively with a diverse

and changing workforce and citizenry. The KU MPA program competency

model captures these domains. Moving forward, it is expected that this program

and others will further develop and refine competency models to meet these

expectations and reflect program-specific content and concentrations (Powell,

Piskulich, & Saint-Germain, 2011).

Based on our experience with the KU competency model, a number of

issues should be considered when designing, implementing, and evaluating a

competency model.

1. The competency model should attempt to align the broader priorities

of the academic and practitioner communities. Although both groups

have a similar goal in that they have a vested interest in the development

of future public servants, faculty and practitioners can sometimes have

different views on how that development should occur and what types

of conceptual understandings and skills should be developed. Though

different, these two perspectives are important and the voices representing

both perspectives should be heard. This takes time and effort, but it

is one way to help make sure that the model remains relevant. When

engaging the practitioner community, it is important to seek out

those who value the competency initiative and who can be expected

to participate thoughtfully in the conversation.


2. Another voice that should be heard is that of the students. As such,

students and faculty should share the responsibility of developing,

implementing, and revisiting the competency model. Because both

groups will be closely working with the model, their individual and

collective buy-in is paramount. In addition, if student self-assessment

is to become part of the competency initiative, attention should be

paid to rewards and/or punishments that will foster honesty.

3. The competency model should capture the totality of the student

experience. It should be realized that student learning is not restricted

to the classroom. Although the internship aspect of the KU MPA

program is unique, the vast majority of MPA programs incorporate

at least a summer internship experience. In addition, many MPA

programs provide, support, and encourage service learning, volunteerism,

and the like. The learning that occurs during these other types

of experiential learning should be reflected in the model because it

will affect student competency.

4. A significant output of the competency model project can be the

collection of data. As mentioned earlier, the model should contain

learning objectives and areas of competency that are seen as important

by the faculty in consultation with a thoughtful practitioner

community. The data can and should be used to help the faculty

understand how they and the program are helping to meet the

learning goals established in the model. Faculty can see areas in

which the program is succeeding and those in which it is not. The

faculty can then engage in a data-based curriculum discussion to

improve the program where needed.

5. Competency models also provide an opportunity to illustrate

developmental gains over time. This is particularly the case with

models that employ a self-assessment aspect, because they enable

students to compare where they were with regard to a competency to

where they are now. This knowledge can build awareness, confidence,

and an appreciation of the value added by the program, and it can

reinforce a commitment to continuing professional development

after graduation as a responsibility of the student.

6. Developmental gains should not be seen solely as an increase in a

competency. As was seen in the data presented earlier, the perceived

competency of our students in a few cases actually decreased. This

could be because the students inflated their previous competency

and realized that on a subsequent self-assessment. However, this insight

can be a good thing and demonstrate an increased awareness of the

competency itself. It could occur as the students come to appreciate


the complexity of an area of public administration and realize that

they are not as advanced in an area as they originally thought.

7. There is value in supplementing the competency assessment with

reflective statements from the students. It is one thing for students to

give themselves competency scores; it is another for students to

reflect upon what that score means to them. Again, this is another

method of building personal responsibility for future professional

development and continuous professional learning.

8. Although competency models should be continuously revisited to

ensure that the content is appropriate, too many changes can pose

significant problems. Multiple competency model versions may

introduce additional complexity in administration. If development

of competencies is the goal, the model must remain constant so that

repeat measures of the same competency can be taken. Therefore, it

is recommended that models remain the same for a cohort during

the entire course of their program.

9. Finally, links between the competency model and program components

should be identified and evaluated. If the competency model is a true

reflection of what the faculty sees as valuable, the courses offered

should be explicitly tied to one or more expected competencies. The

reverse should also be true. The faculty should ensure that the

competencies are all linked to at least one program component.

Additional questions remain to be considered. First, this effort illustrates the

importance of phased data collection but focuses primarily on self-assessment:

How should faculty members balance student self-assessment data with external

evaluations from instructors and/or internship supervisors? Connected to this

issue, what are the implementation costs associated with analyzing multiple forms

of assessment? Further, how should we interpret these forms of assessment and

connect them to program outcomes? Consideration of these questions leads to

an understanding of the integral relationship between classroom feedback and

grades from faculty and the student self-assessment process. Without a foundation

of faculty evaluation, an academic program would lack credibility. But a

self-assessment element introduces a critical element of self-responsibility, and

sharing those self-assessments in a seminar setting introduces the idea of mutual

accountability as well.

Another question centers on whether competency models should differ for

varied student groups: Should we develop customized competency models for

part-time, pre-service, and in-career students? Or, should we reconsider the

evaluation schedule for these groups? It is expected that the competency model

can serve to focus the MPA student experience, but important questions remain

on how best to design and evaluate progress on the model. In addition, programs


should consider the issue of faculty buy-in. Effective integration of competency

models depends on faculty investment, and it will be important to use the model

as a way to enhance communication and address any concerns related to instructional

autonomy. Finally, continually validating the model over time presents a challenge:

How should we integrate competency feedback from alumni, internship supervisors,

and/or other stakeholders in the future?

Competency management for MPA educational purposes will require a long-term

investment and continual reevaluation. Bowman and colleagues (2010) note

that “public service has been greatly affected by the rapidly changing context within

which it is organized”; there have been and will continue to be changes in the

“(1) technical, (2) internal, (3) external, and (4) managerial environments that

encompass the organization and delivery of public services” (p. 7). Competency

assessment is an iterative process that involves faculty members as learners. To

this end, faculty members must be sufficiently invested in the process to ensure

that both methods and applications are revisited on an ongoing basis. In the case

of the KU MPA program, efforts are under way to revisit the competency model

to ensure its continued applicability. It is expected that faculty, students, alumni,

and program stakeholders will be involved in these discussions.

The ever-changing context of public sector work requires that existing competency

models be revisited to identify congruence (or lack thereof) between

competency models and changing leadership needs. Although these investments

may seem substantial, we hold that the collective investment in competency

management can be regarded as an investment in public leadership development

and one that contributes to the shared goal of preparing students to manage in a

changing public sector landscape (Berry, 2010).

Footnote

1 A chronology of the project appears as Appendix A to this paper.

References

Ambrose, S. A., Bridges, M. W., DiPietro, M., Lovett, M. C., & Norman, M. K. (2010). How learning works: Seven research-based principles for smart teaching. San Francisco, CA: Jossey-Bass.

Batalden, P., Leach, D., Swing, S., Dreyfus, H., & Dreyfus, S. (2002). General competencies and accreditation in graduate medical education. Health Affairs, 21(5), 103–111.

Berry, F. S. (2010). The changing climate for public affairs education. Journal of Public Affairs Education, 17(1), 1–6.

Bowman, J. S., West, J. P., & Beck, M. A. (2010). Achieving competencies in public service: The professional edge (2nd ed.). Armonk, NY: M. E. Sharpe.

Bowman, J. S., West, J. P., Berman, E. M., & Van Wart, M. (2004). Professional edge: Competencies in public service. Armonk, NY: M. E. Sharpe.

Boyatzis, R. E. (1982). The competent manager: A model for effective performance. New York: Wiley.

Daley, D. M. (2002). Strategic human resource management: People and performance management in the public sector. Upper Saddle River, NJ: Prentice Hall.

Getha-Taylor, H. (2008). Identifying collaborative competencies. Review of Public Personnel Administration, 28(2), 103–119.

Johnson, R. G., & Rivera, M. A. (2007). Refocusing graduate public affairs education: A need for diversity competencies in human resource management. Journal of Public Affairs Education, 13(1), 15–27.

Kleinman, C. S. (2003). Leadership roles, competencies, and education. Journal of Nursing Administration, 33(9), 451–455.

McEvoy, G. M., Hayton, J. C., Warnick, A. P., Mumford, T. V., Hanks, S. H., & Blahna, M. J. (2005). A competency-based model for developing human resource professionals. Journal of Management Education, 29(3), 383–402.

National Association of Schools of Public Affairs and Administration (NASPAA). (2012). What is an MPA/MPP degree? Retrieved from http://www.naspaa.org

Op de Beeck, S., & Hondeghem, A. (2010). Managing competencies in government: State of the art practices and issues at stake for the future. Organization for Economic Cooperation and Development (OECD). Retrieved from http://soc.kuleuven.be/io/ned/project/pdf/hrm27_GOV_GC_PEM(2010).pdf

Powell, D., Piskulich, M., & Saint-Germain, M. (2011). NASPAA white paper: Expectations for student learning outcomes assessment for NASPAA-COPRA accreditation. Retrieved from http://www.naspaa.org/accreditation/NS/24%20COPRA%20Appendix%204%20White%20Paper%20Competencies.pdf

Rice, M. F. (2007). Promoting cultural competency in public administration and public service delivery: Utilizing self-assessment tools and performance measures. Journal of Public Affairs Education, 13(1), 41–57.

Robotham, D., & Jubb, R. (1996). Competences: Measuring the unmeasurable. Management Development Review, 9(5), 25–29.

Rodolfa, E., Bent, R., Eisman, E., Nelson, P., Rehm, L., & Ritchie, P. (2005). A cube model for competency development: Implications for psychology educators and regulators. Professional Psychology: Research and Practice, 36(4), 347–354.

Rodriguez, D., Patel, R., Bright, A., Gregory, D., & Gowing, M. K. (2002). Developing competency models to promote integrated human resource practices. Human Resource Management, 41(3), 309–324.

Sanchez, J. I., & Levine, E. L. (2009). What is (or should be) the difference between competency modeling and traditional job analysis? Human Resource Management Review, 19(2), 53–63.

Schon, D. A. (1983). The reflective practitioner: How professionals think in action. New York: Basic Books.

Spencer, L. M., & Spencer, S. M. (1993). Competence at work: Models for superior performance. New York: Wiley.

Sprague, J., & Stuart, D. (2000). The speaker’s handbook. Fort Worth, TX: Harcourt College Publishers.

Talbot, M. (2004). Monkey see, monkey do: A critique of the competency model in graduate medical education. Medical Education, 38(6), 587–592.

Tompkins, J., Laslovich, M. J., & Greene, J. D. (1996). Developing a competency-based MPA curriculum. Journal of Public Affairs Education, 2(2), 117–130.

Heather Getha-Taylor is an assistant professor in the School of Public Affairs and

Administration at the University of Kansas. Her teaching and research interests

focus on human resource management and collaborative governance.

Ray Hummert is on the staff of the School of Public Affairs and Administration

at the University of Kansas. He has an MPA from the University of Kansas. Before

joining the University, he worked for over 30 years in local government management

in cities in California, Nebraska, Missouri, and Kansas.

John Nalbandian is a professor in the School of Public Affairs and Administration.

He teaches human resources management to MPA students, and he has been

involved with the competencies project since its initiation.

Chris Silvia is an assistant professor in the School of Public Affairs and Administration

at the University of Kansas. His teaching and research interests focus on the

leadership and management of intersectoral networks and public and nonprofit

service delivery.


Appendix A

Chronology of Portfolio and Competencies Project

Chronological History

Competency/Portfolio Project

University of Kansas

Prepared by Raymond Hummert, Academic Advisor, School of Public Affairs

and Administration

March 7, 2001: Faculty, Practitioner, and Student representatives met

with Dr. David G. Williams, West Virginia University, to discuss professional

portfolios and the West Virginia assessment model. Discussion on the use of

portfolios included

• Faculty evaluation of student performance

• Personal self-evaluation

• Career planning

• Job search tool

March 30, 2001: Chuck Epp and John Nalbandian met with John Gaunt, dean of the School of Architecture and Urban Design, to discuss the use of portfolios by architects. In the field, portfolios had long been used; they were generally graphic narratives that showed past professional development and had limited use for future development. Portfolio concept solidified.

April–May 2001: Faculty begins discussing portfolio concepts.

August 2001: Faculty approves Student Portfolio Policy.

Summer 2001: Faculty, practitioners, and Intern Option students meet for

the first time to discuss the portfolio project. The discussion centered on ICMA

competencies. Importance of faculty review was emphasized.

Summer 2002: Faculty meets with second class of Intern Option students. Bill Carswell, School of Architecture and Urban Design, made a presentation on the use of portfolios by architects. The list of competencies was expanded beyond ICMA. Creativity and flexibility were encouraged as a professional development tool. Review by supervisors, faculty, and others was encouraged.

April 2003: Portfolios were discussed and reviewed by outside consultant

(Bill Hansell). Emphasis was on professional development. Portfolio Project was

evaluated by first class. Results were mixed, but generally favorable with encouragement

to proceed with the recommendations:

• Portfolios took two forms: (a) Compilation of materials and

reflection and (b) Expanded resume.

• Use a faculty mentor.

• Share with mentors other than faculty.


Summer 2003: Faculty meets with third class of Intern Option students to

discuss portfolios. Portfolios in the academic setting were discussed with representatives

from the Writing Center, English Department, California State University,

Long Beach, and the University of Kansas Portfolio project. Emphasis was given

to process, importance of writing, and reflective nature of portfolio.

October 2003: Portfolio discussion was made part of Professional Development

Seminar.

April 2004: Personal development was emphasized along with professional development in the Professional Development Center. Consultant reviewed/discussed portfolios (Jim Keene). Evaluation of second class of Intern Option students. Response to the concept was good. Recommendations included

• Use as a self-development tool.

• Use as a tool to connect with students.

• Connect students with faculty in the second year of MPA program.

Summer 2004: Faculty meets with Intern and Career option students.

Discussion centered on use of portfolio as professional and personal

development. Process was better defined with emphasis on reflection, writing,

evaluation, and lifelong learning process.

April 2005: Consultant (Linda Barton) and Practitioner in Residence (Carol

Gonzales) discussed portfolios and continual learning. Portfolios have to be

individualized to be relevant. Evaluation of third class. Portfolio better used by

students. Consultants were impressed by concept. The discussion emphasized the

individuality of the process/document. “Screw it! It is not theirs.”

April 2005: John Nalbandian convenes the Mid America Competency

Summit to discuss learning in PA Departments. Representatives from regional

NASPAA schools attended an afternoon session at KU to talk about outcomes-based education and where their schools were programmatically. MU, UNO,

UMKC, and Iowa State University (via phone) were present as well as NASPAA

representatives: Steven Maynard-Moody, Laurel McFarland (NASPAA executive

director via conference phone).

Spring 2006: Hummert surveys graduates on use of portfolios. Graduates

responding to the survey recommended

• Provide students a better outline of what needs to be learned, or

define competencies more clearly.

• Provide guidelines and a mechanism for both the student and the

person evaluating the student.

Spring 2006: Nalbandian assigned an optional research paper in PUAD 834

on competencies. It consisted of

• A review of literature

• A collection of lists and identification of competencies that

commonly appear in the managerial/public service arena


Fall 2006: Nalbandian presented a paper at the NASPAA conference in

Minneapolis on the project.

Fall 2006: Nalbandian made another optional research assignment in PUAD

834 to develop a rubric of competencies. This placed the competencies within the

Canadian Public Service framework, which identified components of each competency

(vertical element of the rubric) and levels of competence (horizontal elements).

Spring 2007: The rubric was given to graduating intern option students, who

spent a morning discussing its values and drawbacks.

Fall 2007: Students participating in a PUAD 831 project reviewed the content

of the rubric with the goal of achieving vertical and horizontal alignment.

The rubric will be presented to all incoming students.

The Intern Option students will anchor the boxes in the rubric with case studies

provided from interviews with local government professionals who attend the

ICMA annual conference. And Career Option students in PUAD 845 will find

cases to describe the boxes.

Fall 2007: All incoming MPA students were presented the competency rubric and portfolio. Three additional documents were developed: reflective essay guidelines, artifact examples, and faculty expectations and student obligations.

Fall 2007: Met with second-year Intern Option students; suggestions included a continuum instead of a rubric, adding depth to the rubric, and continuing to emphasize flexibility. “No KUCIMAT left behind program.” NOT!

Fall 2007: University IT agreed to do a feasibility study of an electronic

portfolio that would include the competency rubric.

May 2008: The first-year students of the 2009 class leaving campus are

asked where they are on the competency rubric.

Fall 2008: The electronic portfolio is tested by students.

October 2008: At the Richmond ICMA Professional Development Seminar,

students of the 2009 class are asked where they are on the competency rubric.

April 2009: At the last Professional Development meeting for the 2009

class, students are asked where they are on competency rubric.

May 2009: As the 2010 class leaves campus, they are asked where they are

on the competency rubric.

June 2009: Students in the incoming class of 2011 are asked where they are

on the competency rubric.

August 2009: The electronic competency rubric and portfolio is activated

and moved to production server.

January 2010: After testing the electronic rubric and portfolio with students, the platform was abandoned.

March 2010: KU Center for Teaching Excellence demonstrates KU Keeps as

a platform for portfolios.


Spring 2010: Continue to collect data from first- and second-year students

on their placement on rubric.

August 2010: Getha-Taylor and Silvia join Nalbandian and Hummert in

working on competencies and portfolios.

September 2010: Developed concept of electronic resume to develop, store,

and present portfolio information.

October 2010: Explored Digication as a platform for rubric, portfolio, and

electronic resume, which seemed promising.

November 2010: Presented to faculty two years of student data.

Spring 2011: Continue to collect data from first- and second-year students

on their placement on rubric, which will be the beginning of four full years of data.

April 2011: Students experiment using web-based sites for portfolios.

The Future

Phase I:

• Refine competency rubric using information from faculty.

• Connect the competency rubric to the curriculum; this requires a higher level of confidence in the rubric. Students would be asked at the end

of the semester to indicate which competencies were covered in a

particular course.

• Use web-based sites for portfolios.

Phase II: Use the rubric to objectively evaluate student progress. This phase

requires the highest level of confidence in the validity of the rubric.


Appendix B

University of Kansas Competencies Matrix

I. Values & Ethics (serving with integrity and respect)

1. Diversity Management

Individual Differences
Level 1: Little experience in working with people from a different background.
Level 2: Aware of the importance of individual differences.
Level 3: Demonstrates respect for difference in people in own personal and professional actions.
Level 4: Encourages others to respect and provide fair and equitable treatment for all people. Articulates benefits of individual difference to others.
Level 5: Inculcates organization-wide recognition of the benefits differences bring to the organization and the community.

Diversity
Level 1: Unaware of differences between cultures, ethnicities, and groups.
Level 2: Aware of the importance of diversity.
Level 3: Demonstrates sensitivity toward and appreciation of diversity in own personal and professional actions.
Level 4: Encourages others to value diversity in the workplace.
Level 5: Strengthens organization by integrating diversity into operating culture.

Employment Law
Level 1: Little knowledge of employment law.
Level 2: Understanding the basics of employment law.
Level 3: Applies principles of employment law in personal and professional actions.
Level 4: Encourages others to understand and fairly apply provisions of employment law.
Level 5: Promotes organizational culture that adheres to the letter of the law, and also values the spirit of the law.

2. Professionalism

Public Service
Level 1: Does not distinguish between commitment to public service and working in the public sector.
Level 2: Grasps the meaning of commitment to public service.
Level 3: Demonstrates commitment to public service in personal and professional actions.
Level 4: Instills in others a commitment to public service.
Level 5: Inspires an organizational commitment to public service.

Integrity & Ethics
Level 1: Minimal understanding of the role that ethics and integrity play in effective public service.
Level 2: Respects importance of integrity and ethical reasoning in public service and is informed of related laws, rules, and regulations.
Level 3: Demonstrates integrity and ethical reasoning in personal and professional actions and complies with related requirements.
Level 4: Encourages others to act with integrity, to employ ethical reasoning, and to respect related rules, regulations, and laws.
Level 5: Elevates integrity and ethical reasoning as defining organizational characteristics.


II. Strategic Thinking (innovating through analysis and ideas)

1. Administrative Policy Making

Policy Formulation
Level 1: Has a limited knowledge of policy formulation processes.
Level 2: Aware of how policy processes work.
Level 3: Participates effectively in policy-making initiatives.
Level 4: Has led an effective policy-making team or effort.
Level 5: Sets overall organizational policy direction and serves as overarching policy entrepreneur.

Policy Implementation
Level 1: Simplistic understanding of how policy gets implemented.
Level 2: Basic and sound understanding of the importance of policy implementation strategy.
Level 3: Plays a role in the implementation of adopted policy.
Level 4: Leads policy implementation effort or team.
Level 5: Responsible for realization of organization’s policy goals.

Policy Trends
Level 1: Unaware of policy trends that may impact the organization.
Level 2: Aware of policy trends that may impact the organization.
Level 3: Is mindful of the impact of policy trends in personal and professional actions.
Level 4: Encourages work group to consider broader policy trends.
Level 5: Integrates trends into organization’s policy making and strategic planning.

2. Innovation

Creativity & Innovation
Level 1: Unaware of the need for creativity or innovation.
Level 2: Appreciates value of creativity in the workplace.
Level 3: Demonstrates creativity in personal and professional actions.
Level 4: Encourages creativity among coworkers and staff.
Level 5: Develops a work environment that encourages creative solutions that lead to organizational improvements.

3. Strategic Management

Long-Term Outlook
Level 1: Limited awareness of long-term issues or needs.
Level 2: Develops long-term perspective on organizational issues and needs.
Level 3: Factors long-term consequences and objectives into personal and professional actions.
Level 4: Responsible for conducting a strategic planning team or activities.
Level 5: Articulates an organizational vision that frames strategic plans.

Capacity Building
Level 1: Unaware of organizational capacities.
Level 2: Aware of organizational capacities.
Level 3: Participates in efforts to define and expand needed organizational capacities.
Level 4: Leads a team effort to define and expand needed organizational capacities.
Level 5: Aligns vision, strategic planning, and capacity development. Sufficient capacity is realized to achieve organizational vision and plans.


III. Engagement (mobilizing people, organizations, & partners)

1. Communication

Verbal
Level 1: Basic verbal communication skills.
Level 2: College-level speaking skills.
Level 3: Effectively communicates regarding own personal and professional actions (e.g., conferring with colleagues and reporting to superiors).
Level 4: Effectively communicates within department or workgroup (e.g., staff meetings, departmental briefings).
Level 5: Effective at public communication and clearly and responsibly articulates organization’s mission and activities (e.g., gives effective speeches, testimony, is comfortable in public discourse).

Written
Level 1: Basic writing skills.
Level 2: College-level writing ability.
Level 3: Effectively writes for personal and professional needs (e.g., internal memos, letters, and staff reports).
Level 4: Ensures effective writing from unit/workgroup (includes ability to edit, e.g., budget narratives, departmental reports, policy statements).
Level 5: Ensures effective organizational written communication (includes editing, e.g., press releases, annual reports, legislative testimony, strategic plans).

2. Conflict

Resolution
Level 1: Inexperienced in conflict resolution.
Level 2: Aware of different processes for conflict resolution.
Level 3: Working familiarity with at least one ADR technique.
Level 4: Involved in the formal resolution of a conflict, and these skills are sought by others.
Level 5: Creates organizational culture that recognizes and resolves conflicts as they arise.

Prevention
Level 1: Unaware of how to prevent conflict in the workplace.
Level 2: Understands and uses proper protocol to prevent escalation of work-related conflicts.
Level 3: Anticipates and takes appropriate actions to avoid detrimental conflicts in the workplace.
Level 4: Provides counsel to individuals in early stages of workplace conflict and encourages individuals to take appropriate preventive action.
Level 5: Creates organizational culture where conflict is recognized in its earliest stages and has systematic means for early resolution.


3. External Awareness

Working in a Political Environment
Level 1: Unaware of the inextricable link between policy and politics.
Level 2: Recognizes the democratic values that politics adds to policy making.
Level 3: Is mindful of appropriate political considerations in his or her personal and professional policy-making actions.
Level 4: Helps workgroup appreciate how politics informs policy making, and vice versa.
Level 5: Works to maintain alignment between the organization’s policy goals and political support.

External Policy Impact
Level 1: Not aware of external stakeholders.
Level 2: Aware of external stakeholders and their interests.
Level 3: Effectively considers external stakeholders and their interests in personal and professional actions.
Level 4: Encourages work group to understand and address external stakeholders and their interests.
Level 5: Ensures that external stakeholders are viewed as legitimate partners and have an appropriate voice in the organization.

Citizen Engagement
Level 1: Does not understand that public service exists to serve the interests and needs of citizens.
Level 2: Appreciates that public service exists to serve the needs of citizens and is familiar with basic techniques of citizen engagement.
Level 3: Serving the needs of citizens is at the heart of personal and professional actions, and is skilled in planning and conducting citizen engagement.
Level 4: Helps workgroup understand that public service exists to serve the public’s needs and develops in others effectiveness in citizen engagement.
Level 5: Fosters an organizational culture that is skilled in and committed to using citizen engagement as an essential component of policy planning and implementation.

4. Group Dynamics

Interpersonal
Level 1: Has basic self-awareness and sensitivity to needs of others in a group setting.
Level 2: Has used formal instruments (e.g., Myers-Briggs Type Indicator [MBTI], SDI, or other) to assess interpersonal skills.
Level 3: Information from formal assessment informs personal and professional behavior.
Level 4: Effectively manages group’s interpersonal skills and dynamics, promoting group effectiveness.
Level 5: Ensures development of organization’s interpersonal skills sufficient to engage the needs of employees, citizens, and stakeholders.

Collaboration
Level 1: Inexperienced in collaborative efforts.
Level 2: Some experience in collaborative efforts and aware of costs and benefits of collaboration and competition.
Level 3: Required to work in a collaborative fashion in personal and professional activities.
Level 4: Leads collaborative efforts within workgroup/unit (e.g., policy development team or annual planning retreats).
Level 5: Ensures effective collaboration both within the organization and among the organization’s stakeholders.


IV. Management Excellence (delivering through action management, people management, & financial management)

1 Decision Making 1 2 3 4 5

Timeliness Aware only of personal

deadlines.

Understands value of

timelines and deadlines

associated with

the decision-making

processes.

Effectively establishes

timelines, and works

effectively with competing

deadlines associated

with decision-making

processes.

Aligns group activities

with timing of decisionmaking

processes and is

seen as a resource in time

management and decisionmaking

processes.

Recognizes moments

of opportunity and acts

to advance the

organization’s goals.

Evidence-Based

Practice

Limited understanding

of links between information,

analytical tools,

and decision making.

Sees the need to link

information and analytical

thinking and tools to

decision making.