The Top American Research Universities 2005
The Center for Measuring University Performance

highlighted the caution that must accompany broad comparative generalizations about institutional quality and competitiveness.

TARU includes a table (beginning with the 2003 edition) that displays the percent of federal research expenditures attributable to different disciplinary fields. These percentages offer an intriguing view into the wide differences in the distribution of research funding by discipline. These differences do not necessarily reflect a strategy related to research competition but rather may reflect institutional traditions, student profiles, state mission definitions, institutional scale, the presence or absence of particular schools or colleges, and similar institutional characteristics that affect research competitiveness. Because research universities serve many constituencies, only one of which is concerned with research competitiveness, they rarely focus exclusively on research competition. The TARU and other studies that categorize institutions relative to their research performance speak only to that portion of the institution's mission associated with research.

The Magic of Medicine and Engineering

Among these compositional issues, it is common for university people to believe that the presence or absence of a medical school or an engineering college profoundly affects research competitiveness. This notion presumes that universities with medical schools have a significant competitive advantage because medical schools have a reputation for producing significant research funding. In an earlier TARU (2001), we looked at the question of whether having or not having a medical school distinguishes universities in terms of their research competitiveness. In that review, it became clear that the simple presence or absence of a medical school does not guarantee research success at high levels.

In this year's TARU, we look at the possibility of disaggregating the medical school component as well as the engineering component from the federal research expenditures reported in our data for those universities with more than $20 million in federal research expenditures. The data for this exercise proved somewhat difficult to acquire, given the various ways in which universities report information to different agencies for different purposes. As is frequently the case for university data, reports provided to one agency or for one purpose do not necessarily match information collected for another agency or purpose, even if the information appears to address the same universe. We have discussed elsewhere the extreme difficulty of identifying a number for faculty, even though common sense tells us it should be easy.

The Medicine and Engineering Data

In the current analysis, we have three sets of data of interest. The primary set comes from the National Science Foundation (NSF) and captures all federal research expenditures. The second set comes from the Association of American Medical Colleges (AAMC) and identifies medical school research expenditures defined in the same fashion as the NSF data.† The third set comes from the American Society for Engineering Education (ASEE)‡ and captures engineering college research expenditures, again using the same definitions as the NSF data.
If we add up the engineering expenditures from ASEE and the medicine expenditures from AAMC for each institution, in some cases we have more expenditures than the institution reported to the NSF for all research fields. This usually means that the institution used slightly different definitions of what should be included in the various data reports, which leads to some overlap. These inconsistencies in the data recommend caution in making too-fine distinctions among institutions, because relatively small differences may well be data reporting artifacts and not reflections of actual differences in performance (a sketch of this consistency check appears after the notes below). For the broader issues related to understanding the general impact of

__________________________
† The tables in the current study reflect the accreditation status of schools for the years reported. Florida State University (FSU) received initial accreditation for its medical school in 2005. However, the AAMC data provided to The Center showed FSU with $511,000 in 2003, which amounted to 0.6% of its NSF total. FSU's AAMC-reported federal research expenditures were subtracted when we removed medicine from the institutions with accredited medical schools. This had no effect on their ranking when removing medicine only. When removing both medicine and engineering, FSU ranked 35th but would have ranked 34th had the $511,000 not been removed. Rutgers, which ranked 34th, would then have ranked 35th.

‡ Not all institutions with federal engineering research expenditures reported data to the American Society for Engineering Education (ASEE). For example, the California Institute of Technology does not report data to ASEE although it is a member institution. However, Cal Tech reported to NSF that 16.9 percent of its $219 million in federal research was in engineering. To take this into account, when an institution did not report to ASEE, its engineering dollars reported to NSF were used instead. For simplicity, the text and tables refer to institutions with and without ASEE engineering schools.
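The consistency check described above can be made concrete. The following Python sketch uses pandas with hypothetical institutions and invented dollar figures (none of the names, column labels, or amounts come from the actual study data) to show how the three sources might be merged, how the ASEE fallback rule in note ‡ could be applied, and how institutions whose medicine and engineering components exceed their NSF total could be flagged.

```python
import pandas as pd

# Illustrative sketch only: toy figures for hypothetical institutions,
# not actual NSF, AAMC, or ASEE reporting data.
nsf = pd.DataFrame({
    "institution": ["University A", "University B", "University C"],
    "nsf_total": [300.0, 85.0, 90.0],        # all federal research ($M)
    "nsf_engineering": [45.0, 20.0, 15.2],   # NSF-reported engineering ($M)
})
aamc = pd.DataFrame({
    "institution": ["University A", "University B"],
    "medicine": [120.0, 70.0],               # AAMC medical school federal ($M)
})
asee = pd.DataFrame({
    "institution": ["University A"],         # B and C did not report to ASEE
    "engineering": [50.0],                   # ASEE engineering federal ($M)
})

merged = (nsf.merge(aamc, on="institution", how="left")
             .merge(asee, on="institution", how="left"))

# Fallback rule from note ‡: if an institution did not report to ASEE,
# substitute its NSF-reported engineering dollars.
merged["engineering"] = merged["engineering"].fillna(merged["nsf_engineering"])
merged["medicine"] = merged["medicine"].fillna(0.0)

# Flag institutions whose medicine + engineering components exceed the NSF
# total, a sign that the reporting channels used overlapping definitions.
merged["components"] = merged["medicine"] + merged["engineering"]
merged["overlap"] = merged["components"] > merged["nsf_total"]

print(merged[["institution", "nsf_total", "components", "overlap"]])
```

Institutions flagged this way would be the cases where overlapping definitions across reporting channels, rather than actual performance differences, explain the discrepancy.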
