An Index of Oscar-Worthiness: Predicting the Academy Award for Best Picture*

Andrew B. Bernard
Tuck School of Business

February 21, 2005

* Profuse thanks are due to Conor Dougherty and the staff at the Weekend Journal of the Wall Street Journal for asking whether this might work and providing all the data. As of the writing of this paper I had not seen any of the nominated films for 2004. Unfortunately, all errors remain mine alone.
1 Introduction

The awarding of the Oscar for Best Picture probably has more to do with art than with science. However, once the five nominees have been selected, history and statistics provide a strong guide as to which film will take home the small golden statue. I use data from the last twenty years to identify which movies had the greatest chance at winning the Best Picture award and to make a prediction about the probability of winning for this year's nominees. According to my model, The Aviator is a heavy favorite to take home the 2004 Best Picture award.
In addition, I create a complete historical ranking for all the nominated films over the last twenty years, an index of so-called 'Oscar-Worthiness'. This index helps us identify several types of movies: a) 'Sure Things', films that would have won in almost any year, b) 'Lucky Dogs', movies that won by virtue of having weak competition, and c) 'Doomed Gems', movies that would have won in almost any other year but had the misfortune to compete against a 'Sure Thing'. Even by historical standards, The Aviator is a strong Oscar contender, with an Oscar-Worthiness Index among the top 15 nominated movies since 1984.
The paper is laid out as follows: Section 2 presents a list of movie characteristics and their association with winning the Best Picture award; Section 3 presents the empirical model used to predict the Best Picture award and gives the results of the prediction exercise for 2004; Section 4 introduces the Index of Oscar-Worthiness and presents 'Sure Things', 'Lucky Dogs', and 'Doomed Gems' over the last twenty years. Section 5 provides the requisite happy ending.
2 Potential Predictors

To begin, I identify a list of characteristics of each film nominated for an Oscar over the last twenty years.[1] These potential explanatory variables include both performance measures, e.g. the number of Golden Globe awards, and attributes of the film, e.g. whether the lead rode a horse. The complete list of variables is given below:

[1] Ideas for the variables were contributed by the WSJ staff.

• Performance Measures
  — Number of Oscar Nominations
  — Number of Golden Globe Wins
  — Golden Globe for Best Picture

• Movie Characteristics

  — Based on Novel or Play
  — Comedy[2]
  — Set at Least 20 Years before Release Date
  — Deals with Real Incident or Person
  — Lead Character Meets Untimely Death
  — Five Hanky Tearjerker
  — Unlikely Love Story
  — Lead Actor/Actress Comes From a Commonwealth Country
  — Has a Sports Theme
  — Lead Character Gets on a Horse
  — Happy but Poor
  — Lead is Disabled
  — Lead Character is a Genius
  — Film Includes War Plot Line
  — Includes Action Outside of North America or Europe
The list of potential explanatory variables is not intended to be exhaustive and surely misses important elements in the Oscar selection process. Table 1 reports the differences in these characteristics across winners and losers for the last twenty years. Quite a few of the variables show statistical differences across the two groups. Perhaps not surprisingly, winners have more total Oscar nominations, 10.5, than losers, 6.7. In addition, Best Picture winners are more likely to have won Golden Globe awards in that same year: on average, winners take home 2.9 Golden Globes while losers end up with just under one Golden Globe apiece. Similarly, movies that go on to win the Academy Award for Best Picture are substantially more likely to have won a Golden Globe for Best Picture.[3]

[2] The designation of a movie as a comedy was based on its primary category on Netflix®. For example, Shakespeare in Love is classified as a romance.

Table 1: Differences Between Best Picture Winners and Losers, 1983-2003

                                                     Losers   Winners   Significant
Number of Oscar Nominations                            6.7      10.5         ✓
Number of Golden Globe Wins                            0.9       2.9         ✓
Golden Globe for Best Picture                          20%       85%         ✓
Based on Novel or Play                                 41%       45%
Set at Least 20 Years                                  55%       70%
Deals with Real Incident or Person                     30%       30%
Lead Meets Untimely Death                              33%       30%
Five Hanky Tearjerker                                  20%       10%
Unlikely Love Story                                    28%       35%
Lead Comes From a Commonwealth Country                 35%       60%         ✓
Has a Sports Theme                                      4%        5%
Lead Gets on a Horse                                   10%       30%         ✓
Happy but Poor                                         11%       15%
Lead is Disabled                                       10%       10%
Lead is a Genius                                        5%       20%         ✓
Includes War Plot Line                                 18%       20%
Includes Action Outside of North America or Europe     19%       35%
Comedy                                                 15%        0%         ✓

The first two columns give the mean (or percentage) of the variable for losers and winners. A ✓ in the third column indicates that the difference in the variable between winners and losers is statistically significant at the 5% level.
Turning to the characteristics of the movies themselves, I find fewer statistically significant variables. However, there are some surprises. Movies whose lead comes from a Commonwealth country, i.e. an English-speaking country such as England, Australia, or New Zealand, are almost twice as likely to end up winning. Having the lead character get on a horse is also associated with Oscar success; 30 percent of the Best Picture winners saw the lead get on a horse while only 10 percent of the losers had mounted leads. It also appears to help if the leading character has above-average intelligence; 20 percent of winners had 'geniuses' for a lead character while only 5 percent of the losers did.[4] Finally, the ultimate predictor of failure among nominated pictures is the designation as a comedy. None of the ten nominated comedies has won the Best Picture Oscar over the last twenty years.[5]

[3] There are two Golden Globe awards for Best Picture, one for Drama and one for Musical or Comedy.
3 A Model for Best Picture

Based on the results in Table 1, I assemble a short list of likely candidates for Best Picture predictors. I then proceed to assess the predictive power of combinations of the variables. The basic empirical technique employed in the paper is commonly referred to as a probit model. The simplest probit model attempts to estimate an unobserved variable, in this case the probability of winning the Best Picture award, by relating two observed phenomena: whether the picture won or lost and a characteristic of the movie:[6]

    Pr(Winning) = f(Variable).

Results from these simple probits (the log-likelihoods) can be examined to determine which variables are most strongly correlated with winning over the last twenty years. Using these results, I then run a probit on larger groups of variables, i.e. a probit with several explanatory factors.
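To make the mechanics concrete, the sketch below evaluates the log-likelihood of a one-predictor probit in plain Python. The data and coefficient values are invented for illustration, not taken from the paper; the point is only that a probit scores win/loss outcomes through the normal CDF of a linear index.

```python
import math

def norm_cdf(x):
    """Standard normal CDF, the link function of a probit model."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def probit_log_likelihood(alpha, beta, wins, xs):
    """Log-likelihood of win/loss outcomes given one film characteristic.

    wins: 1 if the film won Best Picture, else 0
    xs:   the characteristic (e.g. number of Oscar nominations)
    """
    ll = 0.0
    for won, x in zip(wins, xs):
        p = norm_cdf(alpha + beta * x)
        ll += math.log(p) if won else math.log(1.0 - p)
    return ll

# Made-up field of five nominees in which the most-nominated film wins.
wins = [0, 0, 0, 0, 1]
noms = [4, 5, 6, 7, 11]

# A positive coefficient on nominations fits these data better than a
# flat model that ignores nominations entirely (a higher log-likelihood).
with_noms = probit_log_likelihood(-2.5, 0.25, wins, noms)
flat = probit_log_likelihood(-1.4, 0.0, wins, noms)
```

Comparing log-likelihoods in this way is exactly how the paper ranks candidate predictors before combining them.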
3.1 The Simplest Model That Works Well

From the earlier results, I narrow down the list of potential predictors to the group of seven significant characteristics, i.e. those marked with a ✓ in Table 1. As expected, the number of overall Oscar nominations does a reasonable job of predicting Oscar winners. By itself, total nominations correctly identifies a clear winner in 14 of the last twenty years.[7] Similarly, the number of Golden Globes correctly predicts the winner in 13 of 20 years.[8]

[4] However, these results do not suggest that any film featuring an English genius who rides a horse is a shoo-in for Best Picture.
[5] Annie Hall won the Best Picture in 1977; however, that film is not designated as a comedy by Netflix®.
[6] Formally, probits are a form of regression described in every introductory econometrics text.
[7] In four more years, the eventual winner is tied for the most nominations.
[8] The number of Golden Globes incorrectly identifies the winner 6 times, and in one year there is a tie.
Table 2: The Effects of Nominations and Golden Globes on Winning, 1984-2003

                                   Marginal Effect   Standard Error   Z-Stat
Number of Oscar Nominations             0.045             0.019         3.18
Number of Golden Globe Wins             0.102             0.040         3.73

Number of Correct Best Picture Predictions: 18 out of 20 (90%)

The first column gives the marginal increase in the probability of winning the Best Picture Oscar for an extra Oscar nomination or Golden Globe for the average movie. The second column reports the standard error and the third column gives the z-score. The probit was run on the sample of non-comedy nominated movies, as Comedy perfectly predicts losing in sample. Both variables are significant at the 1% level.
However, using groups of variables I am able to substantially improve the predictive power of the simple model.

Among these seven variables, one group of three variables far outperforms any other combination in predicting the Best Picture winner over the last twenty years. These three predictors are: Total Oscar Nominations, Golden Globes Won, and Comedy. No other variable on the list improves the explanatory power of the model once these three predictors are included. The resulting estimating equation is

    Pr(Winning) = f(α · Nominations + β · Golden Globes + θ · Comedy + ε_it).

To estimate the marginal effects of each variable on the probability of winning, I run a probit on Nominations and Golden Globes, excluding the sample of comedies. Since no comedy has won Best Picture in the last twenty years, the Comedy variable is perfectly correlated with losing. The results of the probit are summarized in Table 2.
For each movie, the model gives an overall probability of being a winner across the twenty years. To turn this into a prediction for each year, I examine the probability of winning for each of the five nominees in every year. The movie with the highest predicted probability is designated the predicted winner for that year. This simple framework, using only three variables, does an excellent job of separating winners from losers. Over the last twenty years, it accurately predicts eighteen Oscar winners, i.e. it is correct 90 percent of the time.[9]

Table 3: The Probability of Winning in 2004

       Predicted   Oscar-Worthiness   Probability
Year    Winner          Index          This Year   Title
2004                      0.0              0.0%    Sideways
2004                      0.4              0.4%    Finding Neverland
2004                      1.3              1.4%    Ray
2004                     12.9             13.2%    Million Dollar Baby
2004      ✓              83.0             85.0%    The Aviator
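The per-year prediction rule is simply an argmax within each year's field of five nominees. A minimal sketch, with hypothetical film names and fitted probabilities rather than the model's actual output:

```python
# Hypothetical fitted win probabilities for one year's five nominees
# (illustrative names and numbers, not the model's actual output).
year_probs = {
    "Film A": 0.05,
    "Film B": 0.10,
    "Film C": 0.02,
    "Film D": 0.13,
    "Film E": 0.85,
}

# The nominee with the highest predicted probability is designated
# the predicted winner for that year.
predicted_winner = max(year_probs, key=year_probs.get)
```

Note that the rule compares nominees only within a year, so a film can be the predicted winner even if its probability would be unremarkable in a stronger year.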
The model also gives the relative importance of each of the component predictors. For the average nominated film, an additional Oscar nomination increases the probability of winning by 4.5 percent, while each additional Golden Globe raises the probability of winning by 10.2 percent.
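In a probit, a marginal effect of this kind is the coefficient scaled by the standard normal density evaluated at the average film's linear index. The sketch below shows the computation with placeholder values (the coefficient 0.12 and index -1.0 are invented for illustration, not the paper's estimates):

```python
import math

def norm_pdf(z):
    """Standard normal density."""
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def probit_marginal_effect(coef, z_bar):
    """Marginal effect of a one-unit change in a predictor, evaluated
    at the linear index z_bar of the average film: coef * phi(z_bar)."""
    return coef * norm_pdf(z_bar)

# Placeholder values, invented for illustration: a latent-index
# coefficient of 0.12 evaluated at an average index of -1.0.
effect = probit_marginal_effect(0.12, -1.0)
```

Because the density is largest near zero, the same coefficient translates into a bigger change in the win probability for films near the middle of the pack than for clear favorites or clear losers.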
It should not be surprising that these two variables are the best predictors of Oscar success. Both of these variables capture some body of opinion on the quality of the movie, which is precisely the underlying characteristic that is being considered by the members of the Academy. The number of overall Oscar nominations is a good predictor because of the nominating process itself. Members of the Academy, regardless of their area (cinematography, editing, costume, etc.), may nominate films in their area of specialty as well as in the Best Picture category. Thus the total number of nominations captures the breadth of support for the film among the Academy members.
3.2 And the Oscar Goes to ...

Using the estimated model for 1984-2003 and the characteristics of the nominated movies for 2004, I estimate the probability of success for each film. The results for 2004 are given in column 4 of Table 3. The Aviator, with its eleven overall Oscar nominations and three Golden Globes, is a clear favorite with an 85.0 percent chance of winning. Million Dollar Baby is a distant second according to the model, with only a 13.2 percent chance.

[9] The model incorrectly predicts that Born on the Fourth of July should have narrowly beaten Driving Miss Daisy in 1989. The biggest 'surprise' is the success of The Silence of the Lambs in 1991, which the model has tied for last place in that year.
4 The Index of Oscar-Worthiness

One advantage of the prediction model is that I am able to go beyond just predicting the winner in any particular year and create an overall Index of Oscar-Worthiness. The model actually yields a measure of the probability that a movie would win a Best Picture Oscar, not just in the year of its nomination, but against all of the nominated films of the last twenty years.
4.1 The Greatest Movies of Our Time

For 1984-2003, Tables 4 and 5 report both overall Oscar-Worthiness in column 3 and the probability of winning in the year of nomination in column 4. The average for the Oscar-Worthiness Index, i.e. the average probability of winning, is 20.0; however, there are big differences across movies. The median of the Oscar-Worthiness Index across all movies is 3.0, while 75 percent of the movies have an Index rating below 27. Titanic has the highest Oscar-Worthiness Index at 99.7, closely followed by Amadeus and The Lord of the Rings: The Return of the King, while, among the non-comedies, A Soldier's Story, Field of Dreams, and Awakenings score the lowest at 0.004. The Oscar-Worthiness Index for this year's movies is given in the third column of Table 3. Even relative to the movies of the last twenty years, The Aviator is a strong competitor with an Oscar-Worthiness Index of 83.0, ranking in the top 15 movies of the past two decades.

Different years vary substantially in the quality of the nominated films. Adding up the index for all the movies in a year, I find that the highest overall level of Oscar-Worthiness occurred in 1984. In that year, there were two highly rated movies, Amadeus and A Passage to India, each with an Oscar-Worthiness Index above 80. 1988 was a particularly lean year; in that year Rain Man led a relatively weak field with an Oscar-Worthiness Index of only 21.2. This year's five nominees are close to an average group: their combined Oscar-Worthiness Index is 97.6, just under the annual average of 100.
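The year-quality measure is just the sum of the five nominees' index values. Using the 2004 figures reported in Table 3:

```python
# Oscar-Worthiness Index values for the 2004 nominees, from Table 3.
index_2004 = {
    "Sideways": 0.0,
    "Finding Neverland": 0.4,
    "Ray": 1.3,
    "Million Dollar Baby": 12.9,
    "The Aviator": 83.0,
}

# Summing across a year's five nominees measures the overall quality of
# that year's field; the annual average across 1984-2003 is about 100.
combined = sum(index_2004.values())  # 97.6
```

The 2004 field thus comes in just under the historical average, consistent with the paper's description of it as "close to an average group".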
4.2 Sure Things, Doomed Gems, and Lucky Dogs

Using the Index of Oscar-Worthiness reported in Tables 4 and 5, I identify three types of movies in the Best Picture competition: 'Sure Things' or films that would have won in almost any year, 'Lucky Dogs', movies that won by
Table 4: The Oscar-Worthiness Index, 1984-1993

               Oscar-Worthiness   Probability
Year   Winner       Index          That Year   Title
1984                  0.0              0.0%    A Soldier's Story
1984                  3.0              1.6%    The Killing Fields
1984                  3.0              1.6%    Places in the Heart
1984                 83.0             45.0%    A Passage to India
1984     ✓           95.6             51.8%    Amadeus
1985                  0.0              0.0%    Prizzi's Honor
1985                  0.0              0.0%    Kiss of the Spider Woman
1985                  1.1              0.9%    Witness
1985                 29.1             25.7%    The Color Purple
1985     ✓           83.0             73.3%    Out of Africa
1986                  0.0              0.0%    Hannah and Her Sisters
1986                  0.5              0.8%    Children of a Lesser God
1986                  6.1              9.0%    A Room with a View
1986                 12.9             19.1%    The Mission
1986     ✓           48.1             71.2%    Platoon
1987                  0.0              0.0%    Broadcast News
1987                  0.0              0.0%    Hope and Glory
1987                  0.1              0.2%    Moonstruck
1987                  7.1              7.7%    Fatal Attraction
1987     ✓           85.0             92.1%    The Last Emperor
1988                  0.0              0.0%    Working Girl
1988                  0.0              0.1%    The Accidental Tourist
1988                  0.4              1.9%    Mississippi Burning
1988                  0.4              1.9%    Dangerous Liaisons
1988     ✓           21.2             96.1%    Rain Man
1989                  0.0              0.0%    Field of Dreams
1989                  0.0              0.0%    Dead Poets Society
1989                  0.0              0.0%    My Left Foot
1989     ✓           61.2             44.6%    Driving Miss Daisy
1989                 75.9             55.3%    Born on the Fourth of July
1990                  0.0              0.0%    Awakenings
1990                  0.1              0.2%    Goodfellas
1990                  0.4              0.5%    The Godfather Part III
1990                  0.5              0.6%    Ghost
1990     ✓           90.1             98.8%    Dances With Wolves
1991                  3.0              5.5%    The Prince of Tides
1991     ✓            3.0              5.5%    The Silence of the Lambs
1991                  6.1             11.1%    JFK
1991                 18.8             34.5%    Bugsy
1991                 23.7             43.5%    Beauty and the Beast
1992                  0.0              0.0%    A Few Good Men
1992                  0.1              0.3%    The Crying Game
1992                  8.4             16.2%    Scent of a Woman
1992                 11.2             21.6%    Howards End
1992     ✓           32.1             61.9%    Unforgiven
1993                  0.4              0.4%    In the Name of the Father
1993                  1.1              1.1%    The Remains of the Day
1993                  3.0              3.0%    The Fugitive
1993                  6.1              6.0%    The Piano
1993     ✓           90.1             89.5%    Schindler's List

A ✓ in the Winner column marks the film that won Best Picture that year.
Table 5: The Oscar-Worthiness Index, 1994-2003

               Oscar-Worthiness   Probability
Year   Winner       Index          That Year   Title
1994                  0.0              0.0%    Four Weddings and a Funeral
1994                  0.0              0.0%    Quiz Show
1994                  0.4              0.4%    The Shawshank Redemption
1994                  3.0              3.0%    Pulp Fiction
1994     ✓           94.7             96.5%    Forrest Gump
1995                  0.0              0.0%    The Postman (Il Postino)
1995                  0.0              0.0%    Babe
1995                  2.5              7.2%    Apollo 13
1995                 12.9             37.7%    Sense and Sensibility
1995     ✓           18.8             55.1%    Braveheart
1996                  0.4              0.6%    Fargo
1996                  0.5              0.7%    Jerry Maguire
1996                  0.5              0.7%    Secrets & Lies
1996                  3.0              4.0%    Shine
1996     ✓           70.3             94.0%    The English Patient
1997                  0.0              0.0%    The Full Monty
1997                  0.0              0.0%    As Good as it Gets
1997                 11.2              9.2%    Good Will Hunting
1997                 11.2              9.2%    L.A. Confidential
1997     ✓           99.7             81.7%    Titanic
1998                  0.4              0.3%    The Thin Red Line
1998                  0.4              0.3%    Life is Beautiful
1998                  3.0              1.9%    Elizabeth
1998                 58.0             37.0%    Saving Private Ryan
1998     ✓           94.7             60.5%    Shakespeare in Love
1999                  0.0              0.0%    The Green Mile
1999                  0.1              0.3%    The Sixth Sense
1999                  0.4              0.9%    The Cider House Rules
1999                  0.4              0.9%    The Insider
1999     ✓           48.1             98.0%    American Beauty
2000                  0.0              0.0%    Chocolat
2000                  0.5              0.5%    Erin Brockovich
2000                  3.6              3.0%    Traffic
2000                 44.7             37.5%    Crouching Tiger, Hidden Dragon
2000     ✓           70.3             59.0%    Gladiator
2001                  0.5              0.3%    In the Bedroom
2001                  3.0              1.9%    Gosford Park
2001                 26.3             17.1%    The Lord of the Rings: The Fellowship of the Ring
2001                 48.1             31.3%    Moulin Rouge
2001     ✓           75.9             49.4%    A Beautiful Mind
2002                  0.1              0.1%    The Lord of the Rings: The Two Towers
2002                  0.4              0.2%    The Pianist
2002                 32.1             18.6%    The Hours
2002                 44.7             26.0%    Gangs of New York
2002     ✓           94.7             55.0%    Chicago
2003                  0.4              0.4%    Seabiscuit
2003                  5.1              4.4%    Master and Commander: The Far Side of the World
2003                  7.1              6.1%    Mystic River
2003                  8.4              7.2%    Lost in Translation
2003     ✓           95.6             82.0%    The Lord of the Rings: The Return of the King

A ✓ in the Winner column marks the film that won Best Picture that year.
Table 6: Sure Things, Doomed Gems, and Lucky Dogs

The Sure Things

                         Oscar-Worthiness
Overall Rank   Year           Index         Title
      1        1997            99.7         Titanic
      2        1984            95.6         Amadeus
      3        2003            95.6         The Lord of the Rings: The Return of the King
      4        1994            94.7         Forrest Gump
      5        1998            94.7         Shakespeare in Love
      6        2002            94.7         Chicago
      7        1990            90.1         Dances With Wolves
      8        1993            90.1         Schindler's List
      9        1987            85.0         The Last Emperor
     11        1985            83.0         Out of Africa
     13        2001            75.9         A Beautiful Mind

Doomed Gems

                         Oscar-Worthiness
Overall Rank   Year           Index         Title
     10        1984            83.0         A Passage to India
     12        1989            75.9         Born on the Fourth of July
     17        1998            58.0         Saving Private Ryan

Lucky Dogs

                         Oscar-Worthiness
Overall Rank   Year           Index         Title
     28        1988            21.2         Rain Man
     30        1995            18.8         Braveheart
     47        1991             3.0         The Silence of the Lambs

Note: A 'Sure Thing' is a movie with a predicted probability of 75% or higher that won the Best Picture Oscar. A 'Doomed Gem' is a movie with a probability of winning greater than 50%, i.e. a movie that would have won had it not faced a 'Sure Thing'. A 'Lucky Dog' is a Best Picture winner whose probability of winning was less than 25%.
virtue of having weak competition, and 'Doomed Gems', movies that would have won in almost any other year but had the misfortune to compete against a 'Sure Thing'.

More formally, I use the following criteria to group movies into these categories. 'Sure Things' win the Best Picture Oscar and have an Oscar-Worthiness Index of 75 or higher. 'Doomed Gems' have an Oscar-Worthiness Index above 50 but fail to win the Oscar, whether because of the presence of a 'Sure Thing' or because of a surprise win. 'Lucky Dogs' have an Oscar-Worthiness Index of less than 25 but win the Best Picture nonetheless.
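These threshold rules can be written down directly. The sketch below applies them to a few index values taken from Tables 4-6 (the function name `classify` is mine, not the paper's):

```python
def classify(index, won):
    """Apply the paper's thresholds to an Oscar-Worthiness Index value
    (a probability on a 0-100 scale) and the film's actual outcome."""
    if won and index >= 75:
        return "Sure Thing"
    if won and index < 25:
        return "Lucky Dog"
    if not won and index > 50:
        return "Doomed Gem"
    return None  # an ordinary nominee

# Values from Tables 4-6:
titanic = classify(99.7, won=True)    # Titanic: "Sure Thing"
silence = classify(3.0, won=True)     # The Silence of the Lambs: "Lucky Dog"
passage = classify(83.0, won=False)   # A Passage to India: "Doomed Gem"
```

Note that winners with an index between 25 and 75, and losers below 50, fall into none of the three named categories.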
Table 6 reports on all three categories over the last twenty years. The ultimate 'Lucky Dog' is The Silence of the Lambs, with an Oscar-Worthiness Index of only 3.0. The most Oscar-Worthy picture not to win is A Passage to India, which had the misfortune to be up against Amadeus, a veritable maestro of Oscar-Worthiness. Similarly, Saving Private Ryan, more Oscar-Worthy than 92% of all nominated movies over the last twenty years, was knocked off by Shakespeare in Love, a true 'Sure Thing'.

This year, we find that The Aviator ranks well on the Oscar-Worthiness Index and will make it onto the list as either a 'Sure Thing' or a 'Doomed Gem', depending on the voting. If any other movie wins in 2004, it would certainly have to qualify as a 'Lucky Dog'.
5 The Happy Ending

In this paper, I create a simple model that does well in predicting past winners of the Academy Award for Best Picture. The key variables, the number of overall nominations, the number of Golden Globe awards, and a designator for Comedy, correctly rank movies in 18 of the last 20 years. Using these criteria, I give The Aviator an 85 percent probability of winning the 2004 Best Picture Oscar. I also create an Oscar-Worthiness Index which allows me to compare movies across years and identify 'Sure Things', 'Doomed Gems', and 'Lucky Dogs'.

The ultimate test for any model is how well it does in predicting events that have not yet occurred. This simple model will face its first test on Sunday, February 27, 2005. Remember: it is an honor just to be nominated.