Quality Matters: The Expulsion of Professors and the Consequences for PhD Student Outcomes in Nazi Germany
Fabian Waldinger
University of Warwick
I investigate the effect of faculty quality on PhD student outcomes. To address the endogeneity of faculty quality I use exogenous variation provided by the expulsion of mathematics professors in Nazi Germany. Faculty quality is a very important determinant of short- and long-run PhD student outcomes. A one-standard-deviation increase in faculty quality increases the probability of publishing the dissertation in a top journal by 13 percentage points, the probability of becoming a full professor by 10 percentage points, the probability of having positive lifetime citations by 16 percentage points, and the number of lifetime citations by 6.3.
University quality is believed to be one of the key drivers for a successful professional career of university graduates. This is especially true for PhD students. Attending a better university is likely to improve the quality of a student’s dissertation and will provide students with superior skills and contacts. It is therefore not surprising that students who want
Veronika Rogers-Thomas and Michael Talbot provided excellent research assistance. I thank the editor Derek Neal and an anonymous referee for very helpful suggestions that have greatly improved the paper. Furthermore, Guy Michaels, Sharun Mukand, Andrew Oswald, Chris Woodruff, and especially Steve Pischke provided helpful comments and suggestions. Reinhard Siegmund-Schultze gave very valuable information on historical details. I am also grateful for comments received from seminar participants at Bank of Italy, Bocconi, London School of Economics, Stanford, the Centre for Economic Policy Research Economics of Education and Education Policy in Europe conference in London, the Centre for Economic Performance conference in Brighton, the Christmas Conference of German Economists Abroad in Heidelberg, and the CEPR Transnationality of Migrants conference in Milan. I gratefully acknowledge financial support from CEP at LSE.
[Journal of Political Economy, 2010, vol. 118, no. 4]
to pursue a PhD spend considerable time and effort to be admitted into the best PhD programs. As a result, students obtaining their PhD from the best universities often have the most successful careers later in life. The positive relationship between university quality measured by average citations per faculty member and PhD student outcomes is documented in figure 1. Figure 1a shows the relationship between faculty quality and the probability that a PhD student publishes her dissertation in a top journal for mathematics PhDs in Germany between 1925 and 1938. For comparison figure 1b shows the relationship between university quality and the placement rate for mathematics PhD students in the United States today.1
The figure demonstrates that university quality is a strong predictor of PhD student outcomes. It is not certain, however, whether this correlation corresponds to a causal relationship. Obtaining evidence on the causal effect of university quality is complicated for a number of reasons. More talented and motivated students usually select universities of higher quality. This selection therefore biases ordinary least squares (OLS) estimates of the university quality effect. Further biases are caused by omitted variables that are correlated with both university quality and student outcomes. One potential omitted factor is the quality of laboratories, which is difficult to observe. Better laboratories increase professors’ productivity and thus university quality measured by research output of the faculty. Better laboratories also improve students’ outcomes. Not being able to fully control for laboratory quality will therefore bias conventional estimates of the effect of university quality on PhD student outcomes. The estimation of university quality effects is also complicated by measurement error since it is extremely difficult to obtain measures for university quality. It is particularly challenging to construct quality measures that reflect the aspects of university quality that truly matter for PhD students. Therefore, any measure of university quality is bound to measure true quality with a lot of error, leading to further biases of OLS estimates.
To address these problems, I propose the dismissal of mathematics professors in Nazi Germany as a source of exogenous variation in university quality. Immediately after seizing power, the Nazi government dismissed all Jewish and “politically unreliable” professors from the German universities. Overall, about 18 percent of all mathematics professors
1 Data for pre-1933 Germany are described in further detail below. Data for the United States come from different sources compiled for the Web site http://www.phds.org. University quality is measured as average citations per faculty member between 1988 and 1992 (data source: National Research Council 1995). Placement rate is measured as the fraction of PhD students who have secured a permanent job or postdoc at the time of graduation (data source: Survey of Earned Doctorates 2000–2004). The graph shows all mathematics departments with at least nine mathematics PhD graduates per year for the U.S. data.
Fig. 1.—University quality and PhD student outcomes. a, Germany for the years 1925–38. The vertical axis reports the average probability of publishing the PhD dissertation in a top journal. The horizontal axis measures faculty quality as average department citations (see Sec. II.D for details). b, United States for the years 2000–2004. The vertical axis reports the fraction of PhD students who have secured a permanent job or postdoc at the time of graduation. The horizontal axis measures faculty quality as average department citations (see http://www.phds.org and National Research Council [1995]).
were dismissed between 1933 and 1934. Among the dismissals were some of the most eminent mathematicians of the time such as Johann (John) von Neumann, Richard Courant, and Richard von Mises. Whereas some mathematics departments (e.g., Göttingen) lost more than 50 percent of their personnel, others were unaffected because they had not employed any Jewish or politically unreliable mathematicians. As many of the dismissed professors were among the leaders of their profession, the quality of most affected departments fell sharply as a result of the dismissals. This shock persisted until the end of my sample period because the majority of vacant positions could not be filled immediately. I investigate how this sharp, and I argue exogenous, drop in university quality affected PhD students in departments with faculty dismissals compared to students in departments without dismissals.
I use a large number of historical sources to construct the data for my analysis. The main source is a compilation covering the universe of students who obtained their PhD in mathematics from a German university between 1923 and 1938.2 The data include rich information on the students’ university experience and their future career. An advantage of having data on PhD students from the 1920s and 1930s is that they have all completed their scientific career (which in some cases stretched into the 1980s). One can therefore investigate not only short-term but also long-term outcomes that reach almost to the present. This would not be possible with data on recent university graduates. I combine the PhD student–level data set with data on all German mathematics professors including their publication and citation records. This allows me to construct yearly measures of university quality for all German mathematics departments. I obtain information on all dismissed professors from a number of sources and can therefore calculate how much university quality fell because of the dismissals after 1933. More details on the data sources are given in the data section below (Sec. II).
I use the combined data set to estimate the causal effect of university quality on a variety of student outcomes. The outcomes cover not only the early stages of the former PhD student’s career but also long-term outcomes. I find that the dismissal of Jewish and politically unreliable professors had a very strong impact on university quality. The dismissal can therefore be used as a source of exogenous variation to identify the effect of university quality on PhD student outcomes. The results indicate that university quality, measured by average faculty citations, is a very important determinant of PhD student outcomes. In particular, I find that increasing average faculty quality by one standard deviation increases the probability that a former PhD student publishes her dissertation in a top journal by about 13 percentage points. I also investigate how university
2 I do not consider the years after 1938 because of the start of World War II in 1939.
quality affects the long-run career of the former PhD students. A one-standard-deviation increase in faculty quality increases the probability of becoming a full professor later in life by about 10 percentage points and lifetime citations by 6.3 (the average former PhD student has about 11 lifetime citations). I also show that the probability of having positive lifetime citations increases by 16 percentage points. This indicates that the lifetime citation results are not driven by outliers.
These results indicate that the quality of training matters greatly even in highly selective education markets such as German mathematics in the 1920s and 1930s. At that time, it was very common for outstanding mathematics students from lower-ranked universities to transfer to the best places such as Göttingen and Berlin to obtain their PhD under the supervision of one of the leading mathematics professors. German mathematics during that time can be considered a great example of a flourishing research environment. The likes of David Hilbert, John von Neumann, Emmy Noether, and many others were rapidly advancing the scientific frontier in many fields of mathematics. I therefore believe that these results are particularly informative about thriving research communities such as the United States today. While it is difficult to assess a study’s external validity, it is reassuring that the organization of mathematical research in Germany of the 1920s and 1930s was very similar to today’s practices. Mathematicians published their findings in academic journals, and conferences were very common and widely attended. It is particularly informative to read recommendation letters that mathematics professors had written for their former PhD students. Their content and style are strikingly similar to today’s academic references (see, e.g., Bergmann and Epple 2009).
To my knowledge, this paper is the first to investigate the effect of university quality on PhD student outcomes using credibly exogenous variation in quality. The existing literature on PhD student outcomes has mostly shown correlations that are only indicative of the causal effect of university quality. Siegfried and Stock (2004) show that economics PhD students in the United States who graduate from higher-ranked programs complete their PhD faster, have higher salaries, and are more likely to be full-time employed. In a similar study, Stock, Finegan, and Siegfried (2006) show that PhD students in mid-ranked economics programs in the United States have higher attrition rates than students in higher- or lower-ranked programs. Van Ours and Ridder (2003), studying Dutch economics PhD students from three universities, find that students who are supervised by more active researchers have lower dropout and higher completion rates.
Other researchers have investigated the effect of university quality on the career of undergraduates. These findings cannot be easily extrapolated to PhD students because department quality, in particular, research
output of the faculty, is likely to have a different impact on PhD students. Nonetheless, it is interesting to compare my findings to the findings on undergraduates. The studies on undergraduates usually investigate the effect of college quality on wages. A large part of the literature tries to tackle the endogeneity problem using a large set of control variables or matching estimators (see, e.g., Black and Smith 2004, 2006; Hussain, McNally, and Telhaj 2009). Only a few studies attempt to tackle the endogeneity problem more rigorously. Dale and Krueger (2002) measure college quality by Scholastic Aptitude Test score, which captures the effect of undergraduate peers and is likely to act as a proxy for faculty quality. They address selection bias by matching students who were accepted by similar sets of colleges. While they do not find evidence for positive returns to attending a more selective college for the general population, children from disadvantaged families earn more if they attend a more selective college. Behrman, Rosenzweig, and Taubman (1996) use twins to control for selection and find that attending private colleges, PhD-granting colleges, or colleges with higher faculty salaries increases earnings later in life. Brewer, Eide, and Ehrenberg (1999) use a structural selection model with variables related to college costs as exclusion restrictions and find that undergraduate college quality has a significant impact on wages.3
The remainder of the paper is organized as follows: Section I gives a brief description of historical details. A particular focus lies on the description of the dismissal of mathematics professors. Section II describes the data sources in more detail. Section III outlines the identification strategy. The effect of faculty quality on PhD student outcomes is analyzed in Section IV. Section V presents conclusions.
I. German Universities and National Socialism
A. The Expulsion of Jewish and “Politically Unreliable” Professors
This section gives an overview of the dismissal of mathematics professors that will be used to identify the effect of faculty quality. For simplicity, I use the term “professors” for all faculty members who had the right
3 A more recent strand of the literature has tried to disentangle the impact of different professor attributes on academic achievement of undergraduate students within a university. Hoffmann and Oreopoulos (2009) find that subjective teacher evaluations have an important impact on academic achievement. Objective characteristics such as rank and salary of professors do not seem to affect student achievement. Carrell and West (2008) find that academic rank and teaching experience are negatively related to contemporaneous student achievement but positively related to the achievement in follow-on courses in mathematics and science. For humanities they find almost no relationship between professor attributes and student achievement.
to lecture at a German university, which includes everybody who was at least Privatdozent.4
Just over 2 months after the National Socialist Party seized power in 1933, the Nazi government passed the Law for the Restoration of the Professional Civil Service on April 7, 1933. Despite its misleading name, the law was used to expel all Jewish and politically unreliable persons from the civil service in Germany. At that time most German university professors were civil servants. Therefore, the law was directly applicable to them. Via additional ordinances the law was also applied to other university employees who were not civil servants. The main parts of the law read as follows:
Paragraph 3: Civil servants who are not of Aryan descent are to be placed in retirement. . . . [This] does not apply to officials who had already been in the service since the 1st of August, 1914, or who had fought in the World War at the front for the German Reich or for its allies, or whose fathers or sons had been casualties in the World War.
Paragraph 4: Civil servants who, based on their previous political activities, cannot guarantee that they have always unreservedly sup- ported the national state, can be dismissed from service. (Quoted in Hentschel 1996, 22–23)
A further implementation decree defined “Aryan descent” as follows: “Anyone descended from non-Aryan, and in particular Jewish, parents or grandparents, is considered non-Aryan. It is sufficient that one parent or one grandparent be non-Aryan.” Thus professors who were baptized Christians were dismissed if they had at least one Jewish grandparent. Also, undesired political activities were further specified, and members of the Communist Party were all expelled.
The law was immediately implemented and resulted in a wave of dismissals and early retirements from the German universities. A careful early study by Hartshorne published in 1937 counts 1,111 dismissals from the German universities and technical universities between 1933
4 At that time a researcher could hold a number of different university positions. Ordinary professors held a chair for a certain subfield and were all civil servants. Furthermore, there were different types of extraordinary professors. First, they could either be civil servants (beamteter Extraordinarius) or not have the status of a civil servant (nichtbeamteter Extraordinarius). Universities also distinguished between extraordinary extraordinary professors (außerplanmäßiger Extraordinarius) and planned extraordinary professors (planmäßiger Extraordinarius). Then at the lowest level of university teachers were the Privatdozenten, who were never civil servants. Privatdozent is the first university position a researcher could hold after obtaining the right to teach (venia legendi).
TABLE 1
Number of Dismissed Mathematics Professors

Year of Dismissal    Number of Dismissed Professors    Percentage of All Mathematics Professors in 1933
1933                              35                                    15.6
1934                               6                                     2.7
1935                               5                                     2.2
1936                               1                                      .4
1937                               2                                      .9
1938                               1                                      .4
1939                               1                                      .4
1940                               1                                      .4
1933–34                           41                                    18.3
and 1934.5 This amounts to about 15 percent of the 7,266 university researchers present at the beginning of 1933. Most dismissals occurred in 1933 immediately after the law was implemented. Not everybody was dismissed in 1933 because the law allowed Jewish scholars to retain their position if they had been in office since 1914, if they had fought in the First World War, or if they had lost a father or son in the war. Nonetheless, many of the scholars who could stay according to this exception decided to leave voluntarily, for example, the famous applied mathematician Richard von Mises, who was working on aerodynamics and solid mechanics at the University of Berlin. The originally exempted scholars were just anticipating a later dismissal as the Reich citizenship laws (Reichsbürgergesetz) of 1935 revoked the exception clause.
Table 1 reports the number of dismissed mathematics professors. Similarly to Hartshorne, I focus my analysis on researchers who had the right to teach (venia legendi) at a German university. According to my calculations, about 18.3 percent of all mathematics professors were dismissed between 1933 and 1934. It is interesting to note that the percentage of dismissals was much higher than the fraction of Jews living in Germany, which was about 0.7 percent of the total population at the beginning of 1933.
My data do not allow me to identify whether the researchers were dismissed because they were Jewish or because of their political orientation. Other researchers, however, have investigated this issue and have shown that the vast majority of the dismissed were either Jewish or of Jewish descent. Siegmund-Schultze (1998) estimates that about 79 percent of the dismissed scholars in mathematics were Jewish.
5 The German university system had a number of different university types. The main ones were the traditional universities and the technical universities. The traditional universities usually covered the full spectrum of subjects whereas the technical universities focused on technical subjects.
Before giving further details on the distribution of dismissals across different universities, I am going to provide a brief overview of the fate of the dismissed professors. Immediately after the first wave of dismissals in 1933, foreign émigré aid organizations were founded to assist the dismissed scholars in obtaining positions in foreign universities. The first organization to be founded was the English Academic Assistance Council (later renamed the Society for the Protection of Science and Learning). It was established as early as April 1933 by the director of the London School of Economics, Sir William Beveridge. In the United States the Emergency Committee in Aid of Displaced Scholars was founded in 1933. Another important aid organization, founded in 1935 by some of the dismissed scholars themselves, was the Emergency Alliance of German Scholars Abroad (Notgemeinschaft Deutscher Wissenschaftler im Ausland). The main purpose of these and other, albeit smaller, organizations was to assist the dismissed scholars in finding positions abroad. In addition to that, prominent individuals such as Eugen Wigner or Hermann Weyl tried to use their extensive network of personal contacts to find employment for less well-known mathematicians. Owing to the very high international reputation of German mathematics, many of them could find positions without the help of aid organizations. Less renowned and older scholars had more problems in finding adequate positions abroad. Initially, many dismissed scholars fled to European countries. Most of these countries were only a temporary refuge because the dismissed researchers obtained only temporary positions in many cases. The expanding territory of Nazi Germany in the early stages of World War II led to a second wave of emigration from the countries that were invaded by the German army. The main final destinations of dismissed mathematics professors were the United States, England, Turkey, and Palestine. The biggest proportion of dismissed scholars eventually moved to the United States. For the purposes of this paper it is important to note that the vast majority of the emigrations took place immediately after the researchers were dismissed from their university positions. It was therefore extremely difficult for dismissed supervisors to continue to unofficially supervise their former PhD students. A very small minority of the dismissed professors did not leave Germany, and most of them died in concentration camps or committed suicide. Extremely few managed to stay in Germany and survive the Nazi regime. Even the mathematicians who stayed in Germany were no longer allowed to use university resources for their research. The possibility of ongoing supervision of their students was thus extremely limited.
The aggregate numbers of dismissals hide the fact that the German universities were affected very differently. Even within a university there
was a lot of variation across different departments.6 Whereas some mathematics departments did not experience any dismissals, others lost more than 50 percent of their personnel. As shown above, the vast majority of dismissals occurred between 1933 and 1934. Only a small number of mathematics professors were dismissed after that. The few dismissals occurring after 1933 affected researchers who had been exempted under the clause for war veterans or for having obtained their position before 1914. In addition to that, some political dismissals occurred during the later years. In order to have a sharp dismissal measure I focus on the dismissals in 1933 and 1934. Table 2 reports the number of dismissals in the different mathematics departments. Some of the best departments were hit hardest by the dismissals. Göttingen, for example, lost almost 60 percent of its mathematics faculty and Berlin, the other leading university, lost almost 40 percent. The following quote from a conversation between David Hilbert (one of the most influential mathematicians of the early twentieth century) and Bernhard Rust (Nazi minister of education) at a banquet in 1934 exemplifies the dimension of the dismissals for the mathematics department in Göttingen and the complete ignorance of the Nazi elite.
Rust: How is mathematics in Göttingen now that it has been freed of Jewish influence?
Hilbert: Mathematics in Göttingen? There is really none any more. (Quoted in Reid 1996, 205)
Table 3 gives a more detailed picture of the quantitative and qualitative loss to German mathematics. The dismissed mathematics professors were on average younger than their colleagues who remained in Germany and much more productive, as is exemplified by the publications and citations data.7
B. PhD Students in Germany
I will analyze the effect of the dismissal of professors on the outcomes of mathematics PhD students. Table 4 summarizes some of the characteristics of the PhD students in my sample.8
At the time, students of mathematics could obtain two degrees: a high
6 See Waldinger (2010) for the number of dismissals in physics and chemistry departments.
7 For a more detailed description of the publications and citations data, see Sec. II.
8 Further details on data sources are given in Sec. II.
TABLE 2
Dismissals across Different Universities

                      Professors at        Dismissed 1933–34
University            Beginning of 1933    Number    Percentage
Aachen TU                    7                3         42.9
Berlin                      13                5         38.5
Berlin TU                   14                2         14.3
Bonn                         7                1         14.3
Braunschweig TU              3                0          0
Breslau                      6                3         50.0
Breslau TU                   5                2         40.0
Darmstadt TU                 9                1         11.1
Dresden TU                  10                0          0
Erlangen                     3                0          0
Frankfurt                    8                1         12.5
Freiburg                     9                1         11.1
Giessen                      7                1         14.3
Göttingen                   17               10         58.8
Greifswald                   3                0          0
Halle                        7                1         14.3
Hamburg                      8                0          0
Hannover TU                  6                0          0
Heidelberg                   5                1         20.0
Jena                         5                0          0
Karlsruhe TU                 6                1         16.7
Kiel                         5                2         40.0
Köln                         6                2         33.3
Königsberg                   5                2         40.0
Leipzig                      8                2         25.0
Marburg                      8                0          0
München                      9                0          0
München TU                   5                0          0
Münster                      5                0          0
Rostock                      2                0          0
Stuttgart TU                 6                0          0
Tübingen                     6                0          0
Würzburg                     4                0          0

Note.—This table reports the total number of professors in each department at the beginning of 1933, the number of professors dismissed in each department in 1933–34, and the percentage of dismissed professors in each department.
TABLE 3
Quality of Dismissed Professors

                                                              Dismissed 1933–34
                                     All      Stayers     Number    Percentage Loss
Number of professors
  (beginning of 1933)                224        183          41          18.3
Number of chaired professors         117         99          18          15.4
Average age (1933)                  48.7       50.0        43.0          . . .
Average publications (1925–32)       .33        .27         .56          31.1
Average citation-weighted
  publications (1925–32)            1.45        .93        3.71          46.8

Note.—Percentage loss is calculated as the fraction of dismissed professors among all professors or as the fraction of publications and citations that were contributed by the dismissed professors.
TABLE 4
Summary Statistics PhD Students

                                                        Obtaining PhD    Obtaining PhD in
                                               All      in Top 10        Lower-Ranked
                                                        Department       Department
Average age at PhD (a)                        27.5         26.9              28.1
Average time to PhD in years
  (from beginning of studies) (b)              7.4          7.2               7.7
% female                                       8.7          8.2               9.2
% foreign                                      7.5          9.7               5.3
Average number of university changes
  (from beginning of studies) (c)              1.1          1.1               1.2
Outcomes:
  % obtaining high school teacher
    diploma as additional degree              51.6         42.3              61.2
  % published dissertation in top journal     24.1         29.5              18.3
  % became chaired professor later in life    18.7         25.0              12.1
  Average lifetime citations                  11.2         16.6               5.5
  % with positive lifetime citations          29.9         37.8              21.6
  % with lifetime citations > 25               9.0         13.1               4.7
  % with lifetime citations > 100              3.2          4.8               1.5
Number of PhD students                         690          352               338

Note.—Summary statistics are based on all 690 PhD students in the sample.
a Information on average age at PhD is available for 667 individuals.
b Average time to PhD is available for 579 individuals.
c Information on the number of university changes is available for 626 individuals.
school teacher diploma (Staatsexamen) and/or the PhD.9 The majority of all students studying mathematics obtained a high school teacher diploma and mostly started working as high school teachers. Abele, Neunzert, and Tobies (2004) calculate that about 8 percent of all mathematics students at the time obtained a PhD in mathematics. In this study I focus on PhD students. The mathematics PhD students were on average 27 years old when they obtained their degree. About 9 percent were females and 8 percent foreigners. The future PhD students enrolled in university after high school (Abitur) and then took courses for eight or nine semesters. It was very common to change universities at the time. The average PhD student in my sample had about one university transfer during her studies. About 30 percent of students changed university at least twice.
About half of the mathematics PhD students obtained the PhD degree only. The other half also obtained the high school teacher diploma. PhD students usually started working on their dissertation after completing their course work, that is, after about 4–5 years. They then worked on their thesis for about 3 years. After submitting their dissertation they had to pass an oral examination.
My data show that about 24 percent published their dissertation in a top academic journal. Later in their career about 19 percent became chaired professors. During his or her career the average former PhD student had about 11 lifetime citations. About 30 percent of the sample has positive lifetime citations. Table 4 also demonstrates that students who obtained their PhD from a top 10 university (measured by average faculty quality) had better outcomes. They were more likely to publish their dissertation in a top journal, were more likely to become full professors, and had higher lifetime citations. This, of course, does not indicate that university quality has a causal impact on PhD student outcomes because of the endogenous sorting of good students into high-quality universities.
The following changes affected student life after the Nazi government seized power. After 1934 the new government restricted student numbers by introducing a quota on first-year enrollments, though it was not binding.10 Overall student numbers fell during the 1930s, but this fall
9 In the 1920s the first technical universities started to offer the diploma (Diplom) as an alternative degree to students of mathematics. In the beginning only a very small fraction of mathematics students chose to obtain a diploma. Only 63 students in all technical universities obtained a diploma between 1932 and 1941 (see Lorenz 1943). From 1942 onward the diploma was also offered by the traditional universities.
10 In the first semester of the quota (summer semester 1934) the number of entering students was set at 15,000. The actual entering cohort in the previous year had been 20,000. The quota of 15,000, however, was not binding since only 11,774 students entered university in the summer semester of 1934. The Nazi government furthermore capped the number of female students at 10 percent of the whole entering cohort (the actual
was only partly caused by the Nazi quota. Two other factors were also responsible for falling student numbers: first, high unemployment figures of university graduates that were caused by the Great Depression and, second, the fact that the smaller birth cohorts that were born during the First World War were entering university. Hartshorne (1937) shows that student numbers started falling from 1931 onward and thus already 2 years before the Nazi government came into power and 3 years before it introduced quotas for the entering cohorts.
Jewish students were subject to discrimination already some months after the Nazi government gained power. On April 25, 1933, the Nazi government passed the Law against Overcrowding in Schools and Universities. The law limited the proportion of Jewish students among the entering cohort to 1.5 percent. The proportion of Jewish students among existing students in all universities and departments was capped at 5 percent. This cap, however, was never binding since no department had a larger fraction of Jewish students.11
The enactment of the Reich citizenship laws (Reichsbürgergesetz) of 1935 aggravated the situation for Jews living in Germany. The university quotas, however, were left unchanged even though they became more and more meaningless since many Jewish students discontinued their studies because of continuous humiliation or because they were fleeing from Germany. Jewish students who did not leave could obtain a PhD degree until April 1937. Students who had only some Jewish grandparents could in principle continue to obtain a PhD until 1945 but had to pass the scrutiny of the Nazi party’s local organizations, assessing their alignment with the Nazi party.12
Figure A2 shows the total number of Jewish PhD students in each PhD cohort.13 The PhD student data reflect the political changes very well. The number of Jewish PhD students is more or less constant until 1934. From 1935 onward it declines and is zero in 1938. The figure also demonstrates that the total number of Jewish students was not very large. The data include only students who actually finished the PhD. I therefore cannot directly investigate how many Jewish students discontinued their studies during the Nazi era. The pre-1933 figures, however, suggest
fraction of female students in 1932 had been about 18 percent). For further details see Hartshorne (1937).
11 Jewish students whose father had fought in World War I and Jewish students who had only one or two Jewish grandparents were counted as non-Jewish for the calculation of the quotas. The proportion of Jewish students defined in that way had been below 5 percent in all universities and departments even before 1933.
12 For a detailed report on the life of Jewish students in Nazi Germany, see von Olenhusen (1966).
13 The data do not include the students’ religion. They include, however, a lot of biographical information. Most Jewish students managed to emigrate from Germany. This is indicated in the biographical information of the data. I classify any student who emigrates between 1933 and 1945 as Jewish.
that about four Jewish students per year left the German system in the later years of the sample. The relatively small number of Jewish PhD students makes it unlikely that their selection into departments with many Jewish professors may be driving my results. Nonetheless I explore this possibility in my analysis, and I show below that my results are unaffected by excluding all Jewish students from the regressions.
I consider only students who obtained their PhD by 1938. This PhD cohort entered university in 1930 on average. During their course work (the first 4–5 years of their studies) the policies of the Nazi government affected them only relatively late. As the Nazi government was extremely centralized, the measures of the new government affected all universities in the same fashion. In my identification strategy I exploit the fact that different departments were affected very differently by the dismissal of professors, and I control for PhD cohort fixed effects. There is thus no worry that these aggregate policies affect my findings.
II. Data
A. Data on PhD Students
The data on PhD students include the universe of students who received their PhD in mathematics from a German university between 1923 and 1938. The data were originally compiled by Renate Tobies for the German Mathematical Association (Deutsche Mathematiker Vereinigung). She consulted all university archives of the former PhD students and combined that information with data from additional sources.14 The data set includes short biographies of the PhD students including information on the universities they attended, whether and where they published their dissertation, and their professional career after obtaining their PhD. I define four outcome variables for PhD students. The first, a short-term outcome, is a dummy variable indicating whether the student publishes her dissertation in a top journal. The second outcome looks at the long-run career of the former PhD students. It is a dummy variable that takes the value of one if the former student ever becomes a full professor during her career. I also construct a third outcome equal to the number of lifetime citations in mathematics journals. To explore whether the lifetime citation results are driven by outliers I also use an indicator for positive lifetime citations as a fourth outcome.15
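To make the outcome definitions concrete, the following minimal Python sketch shows one way the four variables could be coded from the biographical records. It is not the paper’s own code; the DataFrame and the column names (dissertation_journal, ever_full_professor, lifetime_citations) are hypothetical stand-ins for the biography-derived information described above.

```python
# Illustrative coding of the four PhD student outcomes; column names are
# hypothetical stand-ins for the biography-derived data described in the text.
import pandas as pd

def build_outcomes(bios: pd.DataFrame, top_journals: set) -> pd.DataFrame:
    out = pd.DataFrame(index=bios.index)
    # 1. Short-term outcome: dissertation published in a top journal.
    out["published_top_journal"] = bios["dissertation_journal"].isin(top_journals).astype(int)
    # 2. Long-term outcome: ever became a full professor during the career.
    out["became_full_professor"] = bios["ever_full_professor"].astype(int)
    # 3. Lifetime citations in mathematics journals.
    out["lifetime_citations"] = bios["lifetime_citations"]
    # 4. Indicator for positive lifetime citations (outlier check for outcome 3).
    out["positive_citations"] = (bios["lifetime_citations"] > 0).astype(int)
    return out
```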
I combine the data on PhD students with information on the quality
14 For a more detailed description of the data collection process, see Tobies (2006).
15 Lifetime citations are obtained by merging data from all mathematics journals in the Web of Science listed below (see Sec. II.D) to the PhD students. I then calculated lifetime citations as all citations the former PhD student received for publications in mathematics journals that appeared between the year before he obtained his PhD and the 30th year after PhD graduation. Citations of these articles are counted until the present.
of departments measured by the faculty’s research output. I also obtain measures of how department quality changed as a result of the dismissal of professors.
B. Data on Dismissed Mathematics Professors
The data on dismissed mathematics professors are obtained from a number of different sources. The main source is the List of Displaced German Scholars (Notgemeinschaft Deutscher Wissenschaftler im Ausland 1936). This list was compiled by the relief organization Emergency Alliance of German Scholars Abroad and was published in 1936. Appendix figure A1 shows a sample page from the mathematics part of the list including two very prominent dismissals: Richard Courant, who later founded the renowned Courant Institute of Mathematical Sciences at New York University, and Paul Bernays, who was working on axiomatic set theory and mathematical logic. Both of them were dismissed from Göttingen. The purpose of publishing the list was to facilitate finding positions for the dismissed professors in countries outside Germany. Overall, the list contained about 1,650 names of researchers from all university subjects. I have extracted all mathematicians from the list. In the introductory part, the editors of the list explain that they have made the list as complete as possible. It is therefore the most common source used by historians of science investigating the dismissals of professors in Nazi Germany. For various reasons, for example, if the dismissed scholar had died before the list was compiled, a small number of dismissed scholars did not appear on the list. To get a more precise measure of all dismissals I complement the information in the List of Displaced German Scholars with information from other sources.16
The main additional source is the Biographisches Handbuch der deutschsprachigen Emigration nach 1933 (Röder and Strauss 1983). The compilation of the handbook was initiated by the Institut für Zeitgeschichte München and the Research Foundation for Jewish Immigration New York. Published in 1983, it contained short biographies of artists and university researchers who emigrated from Nazi Germany.17 In addition to these two main data sources, I obtained further dismissals from a list compiled by Siegmund-Schultze (1998), who has studied the history of mathematics in Nazi Germany.
The complete list of dismissed professors also contains a few researchers who were initially exempted from being dismissed but resigned
16 Slightly less than 20 percent of the 1933–34 dismissals appear only in the additional sources but not in the List of Displaced German Scholars.
17 Kröner (1983) extracted a list of all dismissed university researchers from the handbook. I use Kröner’s list to append my list of dismissed mathematics professors.
voluntarily. The vast majority of them would have been dismissed as a result of the racial laws of 1935 anyway and were thus only anticipating their dismissal. All of these voluntary resignations were directly caused by the discriminatory policies of the Nazi regime.
C. Roster of Mathematics Professors between 1923 and 1938
To assess the effect of department quality on student outcomes one needs yearly measures of faculty quality for each of the 33 German mathematics departments. I measure department quality as the average quality of all mathematics professors who were present in a given department and year. I therefore construct a complete roster of all mathematics professors at the German universities from 1923 to 1938 using data published in the semiofficial University Calendar.18
The data contain all mathematics professors (with their affiliation in each year) from winter semester 1922/23 (lasting from November 1922 until April 1923) until winter semester 1937/38. The data for the 10 technical universities start in 1927/28 since they were published in the University Calendar only after that date. The University Calendar is a compilation of all individual university calendars listing the lectures held by each scholar in a given department. I can identify all mathematics professors by the lectures they are giving (such as Algebra II). If a researcher was not lecturing in a given semester, he was listed with his subject under the heading “not lecturing.”19
D. A Measure of Department Quality Based on Publication Data
To measure the dismissals’ effect on department quality I construct productivity measures for each professor. These are then averaged within departments to measure department quality in each academic year. The
18 The University Calendar was published by J. A. Barth. He collected the official university calendars from all German universities and compiled them into one volume. Originally named Deutscher Universitätskalender, it was renamed Kalender der deutschen Universitäten und technischen Hochschulen in 1927/28. From 1929/30 it was renamed Kalender der Deutschen Universitäten und Hochschulen. In 1933 it was again renamed Kalender der reichsdeutschen Universitäten und Hochschulen.
19 The dismissed researchers who were not civil servants (Privatdozenten and some extraordinary professors) all disappear from the University Calendar between winter semester 1932/33 and winter semester 1933/34. Some of the dismissed researchers who were civil servants (ordinary professors and some extraordinary professors), however, were still listed even after they were dismissed. The original law forced Jewish civil servants into early retirement. As they were still on the states’ payroll, some universities still listed them in the University Calendar even though they were not allowed to teach or do research anymore (which is explicitly stated in the calendar in some cases). My list of dismissals includes the exact year after which somebody was barred from teaching and researching at a German university. I thus use the dismissal data to determine the actual dismissal date and not the year a dismissed scholar disappears from the university calendars.
productivity measure is based on publications in the top academic journals of the time. At that time most German mathematicians published in German journals. The quality of the German journals was usually very high because many of the German mathematicians were among the leaders in their field.
The top publications measure for each mathematics professor is based on articles contained in the online database Institute for Scientific Information Web of Science.20 I extract all German-language mathematics journals that are included in the database for the time period 1920–38. Furthermore, I add the most important foreign mathematics journals of the time, which were important outlets for German mathematicians.21
About 70 percent of the publications by mathematicians are in pure mathematics journals. Some mathematicians also published more applied work, on theoretical physics, for example, in general science, physics, and even chemistry journals from time to time. To get a more accurate productivity measure for the professors I also extract the most important German and foreign journals in those fields from the Web of Science. Appendix table A1 lists all journals used in the analysis.
For each of these journals I obtain all articles published between 1925 and 1932. A very small number of contributions in the top journals were letters to the editor or comments. I restrict my analysis to contributions classified as articles since they provide a cleaner measure for a researcher’s productivity. The database includes the names of the authors of each article and statistics on the number of subsequent citations of each of these articles. For each mathematics professor I then calculate a measure of his predismissal productivity. It is based on his publications in top journals in the 8 years between 1925 and 1932. For each of these publications I then count the number of citations these articles received in the first 50 years after publication. This includes citations in journals that are not in my list of journals but that appear in the Web of Science. The measure therefore includes citations from the entire international scientific community and is thus less heavily based on German mathematics. I then calculate the yearly average of this citation-weighted publications measure for each professor. The following simple example illustrates the construction of the predismissal productivity measure. Suppose that a mathematician published two top journal
20 In 2004 the database was extended to include publications between 1900 and 1945. The journals included in that extension were all journals that had published the most relevant articles in the years 1900–1945 based on their citations in later years. (For more details on the process, see http://wokinfo.com/products_tools/backfiles/cos.) The journals available in the Web of Science are therefore by construction the top journals of the time period 1900–1945 with a heavy emphasis on German journals because of the leading role of German mathematics at the time.
21 The foreign mathematics journals were suggested by Reinhard Siegmund-Schultze and David Wilkins; both are specialists in the history of mathematics.
articles between 1925 and 1932. One is cited 10 times and the other six times in any journal covered by the Web of Science in the 50 years after its publication. The researcher’s predismissal citation-weighted publications measure is therefore (10 + 6)/8 (years) = 2. Appendix table A2 lists the top mathematics professors according to the predismissal productivity measure. It is reassuring to realize that the vast majority of these top 20 researchers are well known in the mathematics community. Economists will find it interesting that John von Neumann is the most cited mathematician.22
Yearly department quality measures are then calculated as the department average of the individual productivity measures. Say a department has three mathematics professors with citation-weighted productivity measures equal to 1, 2, and 3. Department quality is then equal to (1 + 2 + 3)/3 = 2. This measure changes only if the composition of the department changes. The implicit assumption of calculating department quality in this way is that Richard Courant always contributes in the same way to department quality independently of how much he publishes in a given year.
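As a concrete illustration of these two steps, the following minimal sketch (my own, not the paper’s code; inputs are illustrative) reproduces the worked examples above: a professor’s predismissal productivity is his citation-weighted top-journal output over 1925–32 divided by the 8-year window, and yearly department quality is the mean of the productivities of the professors present in that year.

```python
# Minimal sketch of the two quality measures described in the text.
def citation_weighted_productivity(citations_per_article, years=8):
    """Predismissal productivity: each 1925-32 top-journal article is weighted
    by the citations it received in the 50 years after publication, and the
    total is averaged over the 8-year window."""
    return sum(citations_per_article) / years

def department_quality(member_productivities):
    """Yearly department quality: mean predismissal productivity of the
    professors present in the department in that year."""
    return sum(member_productivities) / len(member_productivities)

# Worked examples from the text:
print(citation_weighted_productivity([10, 6]))  # (10 + 6) / 8 = 2.0
print(department_quality([1, 2, 3]))            # (1 + 2 + 3) / 3 = 2.0
```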
I combine the PhD student–level data with measures of faculty quality based on publications in top journals and with information on the effect of the dismissal of professors on faculty quality and department size to obtain my final data set.
22 The large number of very well-known mathematicians among the top 20 researchers indicates that citation-weighted publications are a good measure of a scholar’s productivity. Nevertheless, the measure is not perfect. As the Web of Science reports only last names and the initial of the first name for each author, there are some cases in which I cannot unambiguously match researchers and publications. In these cases I assign the publication to the researcher whose field is most closely related to the field of the journal in which the article was published. In the very few cases in which this assignment rule is still ambiguous between two researchers, I assign each researcher half of the citation-weighted publications. Another problem is the relatively large number of misspellings of authors’ names. All articles published at that time were of course published on paper. In order to include these articles in the electronic database, Thomson Scientific employees scanned all articles published in the historically most relevant journals. The scanning was error prone and thus led to misspellings of some names. As far as I discovered these misspellings I manually corrected them.
23 Yearly department quality is measured as the mean of the individual productivities of all current department members. Individual productivities are computed as average productivity between 1925 and 1932 for each individual. Using performance measures that rely on post-1933 data is problematic because the dismissals may affect post-1933 publications of professors. In particular, one would likely underestimate the quality of some of the professors who were dismissed. The 1925–32 productivity measure is, however, not defined for the very few mathematics professors who join after 1933. These professors are therefore not included in the calculation of the post-1933 department averages. An alternative way of calculating average department quality uses publications in years until 1938, which is defined for all professors but may be affected by the dismissal of professors. Using this alternative way of measuring department quality leaves the results unchanged.
III. Identification
Using this data set I investigate the effect of faculty quality and department size on PhD student outcomes with the following regression model:

$$\begin{aligned}
\text{Outcome}_{idt} ={}& \beta_1 + \beta_2(\text{Avg. Faculty Quality})_{d,t-1} + \beta_3(\text{Student/Faculty Ratio})_{d,t-1} \\
&+ \beta_4\,\text{Female}_{idt} + \beta_5\,\text{Foreigner}_{idt} + \beta_6\,\text{CohortFE}_t + \beta_7\,\text{DepartmentFE}_d + \varepsilon_{idt}. \qquad (1)
\end{aligned}$$
I regress the outcome of student i from department d who obtains her PhD in year t on a measure of university quality, student/faculty ratio, and other controls. The main controls are dummy variables indicating whether the PhD student is a female or a foreigner. To control for factors affecting a whole PhD cohort I also include a full set of yearly cohort dummies. I also control for department-level factors that affect PhD student outcomes and are constant over time by including a full set of department fixed effects. In some specifications reported below I also control for a set of 28 dummy variables indicating father’s occupation.24
The main coefficient of interest is $\beta_2$, indicating how faculty quality affects PhD student outcomes. A further interesting coefficient is $\beta_3$, which indicates how the student/faculty ratio affects PhD students. The number of students that is used to construct the student/faculty ratio variable measures the size of the whole cohort of mathematics students in a given department that may decide to obtain a PhD.25
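For concreteness, equation (1) could be estimated along the lines of the following sketch, with the cohort and department fixed effects entered as dummy sets. This is not the paper’s code: the file name, the DataFrame, and the column names (faculty_quality_lag1, student_faculty_ratio_lag1, and so on) are assumed, illustrative labels.

```python
# Illustrative OLS estimation of equation (1); data and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

# phd: one row per PhD student, with the (hypothetical) columns used below.
phd = pd.read_csv("phd_students.csv")

formula = (
    "published_top_journal ~ faculty_quality_lag1 + student_faculty_ratio_lag1"
    " + female + foreigner + C(phd_cohort) + C(department)"
)
ols_fit = smf.ols(formula, data=phd).fit()
print(ols_fit.params[["faculty_quality_lag1", "student_faculty_ratio_lag1"]])
```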
Estimating this equation using OLS will, however, lead to biased results since university quality is endogenous. The estimates are likely to be
24 The most important occupations are salesman, high school teacher, primary or middle school teacher, salaried employee, higher-level civil servant, craftsman, and civil servant in the national post office. For about 30 percent of the sample father’s occupation is missing. I therefore include an additional dummy indicating whether father’s occupation is missing.
25 As mentioned before, only a small fraction of all mathematics students in a given department proceed to obtain a PhD degree (the majority leave university with a high school teacher diploma). University quality may affect the number of students who enroll and/or complete the PhD. It is therefore preferable to use all mathematics students (not only PhD students) in a department to construct the student/faculty measure. Otherwise one would control for a variable that is endogenous to university quality. The average PhD student takes courses for 4 years and subsequently works on her dissertation for 3 years. A student who obtains her PhD in 1932 therefore comes from a cohort of students that were taking courses at her university 3 years earlier (i.e., in 1929). I therefore assign each PhD student her potential cohort size using all mathematics students in her department 3 years prior to her PhD completion date. The data on all mathematics students in each department and year come from the Deutsche Hochschulstatistik, vols. 1–14, and Statistisches Jahrbuch, vols. 1919–1924/25.
biased for three reasons: selection of inherently better students into better universities; omitted variables, such as the quality of laboratories; and measurement error of the faculty quality variable. Measurement error occurs for two main reasons. First, average faculty citations measure the particular aspects of faculty quality that matter for PhD student success with substantial error. Second, measurement error comes from misspellings of last names in the publications data and the fact that the Web of Science reports only the first letter of each author’s first name. Similar problems affect the OLS coefficient of the student/faculty ratio, for example, because inherently better PhD students may choose to study in departments with a lower student/faculty ratio.
In order to address these concerns I propose the dismissal of mathematics professors in Nazi Germany as an exogenous source of variation in quality and size of mathematics departments (which affects the student/faculty ratio). As outlined before, PhD students in some departments experienced a large shock to the quantity and quality of the faculty whereas others were not affected. Figure 2 shows how the dismissal of professors affected mathematics departments. The dashed line shows mathematics departments with dismissals in 1933 or 1934. The solid line shows departments without dismissals. Figure 2a shows that affected departments were of above-average size. Not surprisingly, the dismissal caused a strong reduction in the number of mathematics professors in the affected departments. In the same time period the size of departments without dismissals remained relatively constant. The dismissed were not immediately replaced because of a lack of suitable researchers without a position and the slow appointment procedures. Successors for dismissed chaired professors, for example, could be appointed only if the dismissed scholars gave up their pension rights, because the dismissed were originally placed into early retirement. The states did not want to pay the salary for the replacement and the pension for the dismissed professor at the same time. It thus took years to fill open positions in most cases.
Figure 2b shows the evolution of average department quality in the two types of departments. Obviously, one would expect a change in average department quality only if the quality of the dismissed professors was either above or below the predismissal department average. The figure demonstrates two interesting points: the dismissals occurred mostly at departments of above-average quality, and within many of the affected departments the dismissed professors were on average more productive than the stayers. As a result, average department quality in affected departments fell after 1933. The graph shows only averages for the two groups of departments. It greatly understates the department-level variation I am using in the regression analysis. Some departments with dismissals also lost professors of below-average quality, as can be
Fig. 2.—Effect of dismissals on faculty in mathematics departments. a, Effect on department size. Dashed line: departments with dismissals in 1933 and 1934; solid line: departments without dismissals. b, Effect on average faculty quality. Dashed line: departments with dismissals of above-average department quality between 1933 and 1934; solid line: departments without dismissals.
In those departments average quality increased after 1933.
It is important to note that the fact that most of the dismissals occurred in bigger and better departments does not invalidate the identification strategy since level effects will be taken out by including department fixed effects. The crucial assumption for the validity of this differences-in-differences type strategy is that the trends in affected versus unaffected departments were the same prior to the dismissal. Below I show in various ways that this is indeed the case.26
The figure suggests that the dismissal had a strong effect on average department quality and department size. It is therefore possible to use it as an instrument for the endogenous faculty quality and student/faculty ratio variables. The two first-stage regressions are as follows:
$$
\begin{aligned}
\text{Avg.\ Faculty Quality}_{dt} ={}& \gamma_1 + \gamma_2\,(\text{Dismissal-Induced Reduction in Faculty Quality})_{dt} \\
&+ \gamma_3\,(\text{Dismissal-Induced Increase in Student/Faculty Ratio})_{dt} \\
&+ \gamma_4\,\text{Female}_{idt} + \gamma_5\,\text{Foreigner}_{idt} + \gamma_6\,\text{Cohort}_t + \gamma_7\,\text{DepartmentFE}_d + \varepsilon_{idt}, \qquad (2)
\end{aligned}
$$

$$
\begin{aligned}
\text{Student/Faculty Ratio}_{dt} ={}& \delta_1 + \delta_2\,(\text{Dismissal-Induced Reduction in Faculty Quality})_{dt} \\
&+ \delta_3\,(\text{Dismissal-Induced Increase in Student/Faculty Ratio})_{dt} \\
&+ \delta_4\,\text{Female}_{idt} + \delta_5\,\text{Foreigner}_{idt} + \delta_6\,\text{Cohort}_t + \delta_7\,\text{DepartmentFE}_d + \upsilon_{idt}. \qquad (3)
\end{aligned}
$$
Equation (2) is the first-stage regression for average faculty quality. The main instrument for average faculty quality is the dismissal-induced reduction in faculty quality. It measures how much average faculty quality fell as a result of the dismissals. The variable is zero until 1933 for all departments. After 1933 it is defined as the predismissal average quality of all professors in the department minus the average quality of the
26 The fact that mostly bigger and better departments were affected, however, influences the interpretation of the instrumental variable (IV) estimates. According to the local average treatment effect framework pioneered by Imbens and Angrist (1994), the IV estimates measure the effect of a change in department quality and the student/faculty ratio in bigger and better departments. As most mathematics departments nowadays are bigger than the average department of the early twentieth century, this LATE is arguably more interesting than the corresponding average treatment effect.
professors who were not dismissed (if the dismissed were of above-average quality):
$$
\text{Dismissal-Induced Reduction in Faculty Quality} = (\text{Avg.\ Pre-1933 Quality}) - (\text{Avg.\ Pre-1933 Quality} \mid \text{Stayer}).
$$
In departments with above-average quality dismissals (relative to the department average) it will be positive after 1933. The variable remains zero for all other departments. The implicit assumption is therefore that dismissals of below-average quality professors did not positively affect PhD student outcomes.27 The following example illustrates the definition of the IV. Average faculty quality in Göttingen in 1932 (before the dismissals) was 2.1 citation-weighted publications per year on average. In 1933 some outstanding professors with citations that were higher than the department average were dismissed. Average quality of the remaining professors was only 1.7 (citation-weighted publications). In 1934 another professor with above-average citations was dismissed. After that, average quality of the remaining professors was only 1.1. For Göttingen the variable is zero until 1933 (as I use a 1-year lag in the department variables, it is zero for 1933 inclusive). In 1934 the value of the dismissal-induced reduction in faculty quality variable is 2.1 − 1.7 = 0.4. From 1935 onward the value of the variable is 2.1 − 1.1 = 1. Higher values of the dismissal-induced reduction in faculty quality variable therefore reflect a larger fall in average department quality.
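To fix ideas, the construction of this variable can be sketched in a few lines of code. The snippet below is only an illustration: the data layout, the column names, and the five hypothetical professor quality values (chosen so that the department reproduces the Göttingen numbers above) are assumptions, not the actual data set used here.

```python
import numpy as np
import pandas as pd

# Hypothetical professor-level data for one department. The quality values are
# chosen so that the department mean is 2.1, the mean of the stayers after the
# 1933 dismissal is 1.7, and the mean after the 1934 dismissal is 1.1.
profs = pd.DataFrame({
    "pre1933_quality": [3.7, 3.5, 1.5, 1.0, 0.8],             # citation-weighted publications p.a.
    "dismissal_year":  [1933, 1934, np.nan, np.nan, np.nan],  # NaN = never dismissed
})

def quality_instrument(profs, years=range(1931, 1939)):
    """Dismissal-induced reduction in faculty quality, year by year.

    Because of the 1-year lag, a professor dismissed in year t is still counted
    among the stayers in year t; below-average-quality dismissals are set to zero."""
    baseline = profs["pre1933_quality"].mean()
    out = []
    for t in years:
        stayers = profs["dismissal_year"].isna() | (profs["dismissal_year"] >= t)
        gap = baseline - profs.loc[stayers, "pre1933_quality"].mean()
        out.append({"year": t, "quality_fall": round(max(gap, 0.0), 2)})
    return pd.DataFrame(out)

print(quality_instrument(profs))  # 0 through 1933, 0.4 in 1934, 1.0 from 1935 on
```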
The instrument for student/faculty ratio is driven by the number of dismissals in a department. It measures how much the student/faculty ratio increased as a result of the dismissals. It is zero before 1933. After 1933 it is defined as follows:
$$
\text{Dismissal-Induced Increase in Student/Faculty Ratio} = \frac{\text{1932 Student Cohort}}{\text{No.\ of Profs.\ in 1932} - \text{No.\ of Dismissed Profs.}} - \frac{\text{1932 Student Cohort}}{\text{No.\ of Profs.\ in 1932}}.
$$
Suppose that a department had 50 students in its potential 1932 cohort. Before the dismissals there were 10 professors. The student/faculty ratio before 1933 would therefore be 50/10 = 5. In 1933 five professors were dismissed. So the new student/faculty ratio would be 50/(10 − 5) = 10. The dismissal-induced increase in student/faculty ratio for this department would be zero until 1933 and [50/(10 − 5)] − (50/10) = 5 thereafter.
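Analogously, a minimal sketch of the second instrument, using the hypothetical department from this example (50 students in the potential 1932 cohort, 10 professors, 5 of whom are dismissed in 1933); the function and argument names are illustrative only.

```python
import pandas as pd

def sf_ratio_instrument(cohort_1932, profs_1932, cum_dismissed_by_year, years=range(1931, 1939)):
    """Dismissal-induced increase in the student/faculty ratio, year by year.

    cum_dismissed_by_year maps a year to the cumulative number of dismissed
    professors that is felt in that year (dismissals enter with a 1-year lag)."""
    rows = []
    for t in years:
        d = cum_dismissed_by_year.get(t, 0)
        shift = 0.0 if d == 0 else cohort_1932 / (profs_1932 - d) - cohort_1932 / profs_1932
        rows.append({"year": t, "sf_increase": shift})
    return pd.DataFrame(rows)

# 5 professors dismissed in 1933 are felt from 1934 onward:
print(sf_ratio_instrument(50, 10, {t: 5 for t in range(1934, 1939)}))
# 0.0 through 1933, then 50/(10 - 5) - 50/10 = 5.0 thereafter
```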
The dismissals between 1933 and 1934 may have caused some PhD students to change university after 1933.
27 An alternative way of defining the dismissal-induced change in faculty quality would be to allow the dismissal of below-average quality professors to have a positive impact on department quality and thus PhD student outcomes. In specifications not reported in this paper I have explored this possibility. The results are unchanged.
This switching behavior, however, will be endogenous. To circumvent this problem I assign each PhD student the relevant dismissal variables for the department she attended at the beginning of 1933.
The effect of the dismissals is likely to be correlated within a department. I therefore account for any dependence between observations within a department by clustering all regression results at the department level. This not only allows the errors to be arbitrarily correlated for all PhD students in one department at a given point in time but also allows for serial correlation of these error terms.
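The two first stages in equations (2) and (3) can then be estimated as standard fixed-effects regressions with department-clustered standard errors. The sketch below uses statsmodels and hypothetical column names; phd_students.csv, avg_faculty_quality, iv_quality_fall, and so on are placeholders, not the paper's actual files or variables.

```python
import pandas as pd
import statsmodels.formula.api as smf

students = pd.read_csv("phd_students.csv")  # placeholder: one row per PhD student

controls = "female + foreigner + C(cohort) + C(department)"  # department as of early 1933

first_stage_quality = smf.ols(
    f"avg_faculty_quality ~ iv_quality_fall + iv_sf_increase + {controls}",
    data=students,
).fit(cov_type="cluster", cov_kwds={"groups": students["department"]})

first_stage_sf = smf.ols(
    f"sf_ratio ~ iv_quality_fall + iv_sf_increase + {controls}",
    data=students,
).fit(cov_type="cluster", cov_kwds={"groups": students["department"]})

print(first_stage_quality.params[["iv_quality_fall", "iv_sf_increase"]])
print(first_stage_sf.params[["iv_quality_fall", "iv_sf_increase"]])
```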
Using the dismissals as IVs relies on the assumption that the dismissals had no other effect on PhD student outcomes than through their effect on faculty quality and department size and thus the student/faculty ratio. It is important to note that any factor affecting all German PhD students in the same way, such as a possible decline of journal quality, will be captured by the yearly PhD cohort effects and would thus not invalidate the identification strategy. As students in unaffected departments act as a control group, only factors changing at the same time as the dismissals and exclusively affecting students in departments with dismissals (or only students at departments without dismissals) may be potential threats to the identification strategy. In the following I discuss some of these potential worries.
One of the main worries is that departments with many Jewish professors also attracted more Jewish students. If Jewish students were better on average than other students and if many Jewish students quit the PhD program, average outcomes in departments with dismissals would have fallen just because all good Jewish students were no longer studying there. I show below that all results hold if I exclude Jewish students from the regressions.
Another worry is that disruption effects during the dismissal years could drive my findings. I show, however, that omitting the turbulent dismissal years 1933–34 from my regressions does not affect my findings.
One may also be concerned that other policies by the Nazi government affected professors remaining in Germany only in departments with dismissals (or only in departments without dismissals). This could affect student outcomes in the respective department. A potential example may be that less ardent Nazi supporters remained in departments with dismissals compared to departments without dismissals. This could negatively affect the early career of their students. In a different paper I investigate many factors that could differentially affect professors in affected and unaffected departments (see Waldinger 2010). I show that the dismissals were unrelated to the number of ardent Nazi supporters, changes in funding, and promotion incentives of professors. It is therefore unlikely that direct effects on professors are driving my PhD student results.
Any difference-in-differences type strategy relies on the assumption that treatment and control groups did not follow differential trends over time. I test this assumption in two ways. First, I show that most results are not affected by including linear department-specific time trends in the regressions. This approach would not address the problem if differential trends were nonlinear. I therefore estimate a placebo experiment, using only the predismissal period of the data and moving the dismissal from 1933 to 1930. Columns 5–8 of Appendix table A3 report the results for the placebo experiment. The coefficients are all close to zero, and none of them is significant. In two of the four regressions the coefficient even has the opposite sign from the results of the true reduced form. This makes it particularly unlikely that differential time trends explain my findings.
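The placebo experiment can be sketched along the following lines: keep only pre-1933 cohorts and switch the (actual, later) dismissal shock on in 1931, as if the dismissals had taken place in 1930. The column names and the exact placebo construction are illustrative assumptions.

```python
import pandas as pd
import statsmodels.formula.api as smf

students = pd.read_csv("phd_students.csv")          # placeholder path
pre = students[students["cohort"] <= 1932].copy()   # predismissal cohorts only

# dept_quality_fall / dept_sf_increase: the department's eventual (post-1933)
# dismissal-induced changes, here attached to every student of that department
pre["placebo_quality_fall"] = pre["dept_quality_fall"] * (pre["cohort"] >= 1931)
pre["placebo_sf_increase"] = pre["dept_sf_increase"] * (pre["cohort"] >= 1931)

placebo = smf.ols(
    "published_top ~ placebo_quality_fall + placebo_sf_increase + female + foreigner"
    " + C(cohort) + C(department)",
    data=pre,
).fit(cov_type="cluster", cov_kwds={"groups": pre["department"]})
print(placebo.params[["placebo_quality_fall", "placebo_sf_increase"]])
```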
IV. The Effect of Faculty Quality on PhD Student Outcomes
An interesting starting point for the empirical investigation is the comparison of PhD student outcomes in departments with dismissals to those of students in departments without dismissals. Figure 3 shows the evolution of three PhD student outcomes in the two sets of departments. Figure 3a shows the probability of publishing the dissertation in a top journal for different PhD cohorts in departments with above-average quality dismissals (dashed line) and departments without dismissals. Before 1933, the probability of publishing the dissertation in a top journal is always above 0.4 in departments that later on experience dismissals of professors. Students graduating from those departments in the years after 1933, however, have a much lower probability of publishing their dissertation in a top journal. The probability of publishing the dissertation varies from year to year in departments that do not experience any dismissals, but it does not change substantially after 1933.
Figure 3b shows the probability of becoming a full professor later in life for PhD students graduating in a certain year. It is evident that the data are much more noisy for this outcome. One reason for this is that becoming a full professor is a relatively low-probability event. As average PhD cohort size across all universities is only about 50 students per year (across all departments with above-average quality dismissals it is only about 17), it is not surprising that the probability of becoming a full professor varies substantially from year to year. Nonetheless, one can see a relatively sharp drop in affected departments in 1933 and an even more substantial drop toward the end of the sample period.
The probability of having positive lifetime citations in mathematics journals is plotted in figure 3c.
Fig. 3.—Effect of dismissals on PhD student outcomes. a, Effect on the probability of publishing the dissertation in a top journal. Dashed line: departments with above-average quality dismissals between 1933 and 1934; solid line: departments without dismissals. b, Effect on the probability of becoming a full professor later in life. Dashed line: departments with above-average quality dismissals between 1933 and 1934; solid line: departments without dismissals. c, Effect on the probability of having positive lifetime citations. Dashed line: departments with above-average quality dismissals between 1933 and 1934; solid line: departments without dismissals.
In departments with dismissals of above-average quality the probability of having positive lifetime citations is always higher before 1933. After 1933, however, the probability of positive lifetime citations declines and is eventually below that of students who graduate from departments without dismissals.
The graphical analysis suggests that PhD student outcomes in affected departments deteriorate after the dismissal of high-quality professors. Again it is important to highlight that the figure understates the variation I am using in the regression analysis. In the regressions I use ample department-level variation in changes to faculty quality and the student/faculty ratio. In columns 1–4 of Appendix table A3 I therefore report regression results of the reduced form.28 The dismissal-induced reduction in faculty quality has a strong negative impact on PhD student outcomes that is always significant at the 1 percent level.29 The dismissal-induced increase in student/faculty ratio never has a significant effect on PhD student outcomes. Columns 5–8 of table A3 show the results from the placebo test suggested before, where I estimate the effect of a placebo dismissal in 1930. I use only the predismissal period and investigate whether students in departments that later experienced dismissals were already on a downward trend before 1933. The results suggest the opposite, if anything. None of the coefficients is significantly different from zero, and in fact two of the four coefficients on the dismissal-induced reduction in faculty quality have the opposite sign. These results strongly support the view that the dismissal of professors can be used as a valid source of exogenous variation in faculty quality and the student/faculty ratio.
In the following, I investigate the effect of faculty quality and the student/faculty ratio on PhD student outcomes using the regression model proposed above. As discussed before, both faculty quality and the student/faculty ratio are endogenous. Using the dismissal as an instrument can overcome these endogeneity problems. Table 5 reports the two first-stage regressions equivalent to equations (2) and (3) presented before. Some of the students may have reacted to the dismissal of professors by changing departments after 1933. I address this problem by assigning the department fixed effect for all post-1933 years according to the department that a student attended at the beginning of 1933.30
28 The estimated reduced-form equation is
$$
\begin{aligned}
\text{Outcome}_{idt} ={}& \beta_1 + \beta_2\,(\text{Dismissal-Induced Reduction in Faculty Quality})_{dt} \\
&+ \beta_3\,(\text{Dismissal-Induced Increase in Student/Faculty Ratio})_{dt} \\
&+ \beta_4\,\text{Female}_{idt} + \beta_5\,\text{Foreigner}_{idt} + \beta_6\,\text{CohortFE}_t + \beta_7\,\text{DepartmentFE}_d + \varepsilon_{idt}.
\end{aligned}
$$
29 The PhD student data include only students who have finished their PhD. The dismissals may have caused some post-1933 PhD students in affected departments to quit the PhD program altogether. It is quite likely that these quitters were in fact the weakest students. In that case my results would underestimate the true effect of the dismissals on PhD student outcomes.
TABLE 5
First Stages
Dismissal-induced fall in faculty quality
Dismissal-induced increase in student/faculty ratio
Female .142*
Dependent Variable
Foreigner
Cohort dummies
University fixed effects Observations
R2
Cragg-Donald eigenvalue statistic
25.2
Average Quality (1)
(.060) .046 (.097)
Yes Yes 690 .795
Student/Faculty Ratio
(2)
4.195 (2.058)
.439** (.116) 1.165 (.705)
1.971 (1.183)
Yes Yes 690 .757
1.236** (.074) .014 (.008)
Note.—All standard errors are clustered at the department level. *Significant at the 5 percent level.
**Significant at the 1 percent level.
Column 1 reports the first-stage regression for faculty quality. As expected, the dismissal-induced reduction in faculty quality has a strong and highly significant effect on average faculty quality. The dismissal-induced increase in student/faculty ratio does not affect faculty quality. The second first-stage regression, for the student/faculty ratio, is reported in column 2. In this case, the dismissal-induced reduction in faculty quality has no significant effect. The dismissal-induced increase in student/faculty ratio variable, which is driven by the number of dismissals in a department, is a strong and highly significant predictor of the student/faculty ratio. This pattern is very reassuring since it indicates that the dismissal indeed provides two orthogonal instruments: one for average faculty quality and one for the student/faculty ratio. A common concern in IV estimation is bias due to weak instruments, as highlighted by Bound, Jaeger, and Baker (1995) and Stock, Wright, and Yogo (2002). In this paper there are two endogenous regressors and two instruments. Using a simple F-test on the instruments would be misleading because not only do the instruments have to be strong but one also needs at least as many instruments as endogenous regressors.
30 Only students who finished their PhD before 1933 or who had at least started their undergraduate studies at the beginning of 1933 are included in my sample.
Stock and Yogo (2005) therefore propose a test based on the Cragg-Donald (1993) minimum eigenvalue statistic to test for weak instruments. Stock and Yogo calculate the critical value of the Cragg-Donald eigenvalue statistic to be 7.03 for a model with two endogenous regressors and two instruments. I report the Cragg-Donald eigenvalue statistic at the bottom of table 5. As it clearly exceeds the critical value, there is no worry of weak instrument bias in this context.31
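For reference, the Cragg-Donald statistic is the minimum eigenvalue of a matrix analog of the first-stage F statistic, computed after partialling out the exogenous regressors. A textbook-style numpy sketch of this calculation (not the exact routine used to produce the numbers in the tables) is:

```python
import numpy as np

def cragg_donald(Y, Z, X):
    """Cragg-Donald minimum-eigenvalue statistic.

    Y : (n, p) endogenous regressors, Z : (n, k) excluded instruments,
    X : (n, q) included exogenous regressors (constant, controls, fixed effects)."""
    n, q, k = Y.shape[0], X.shape[1], Z.shape[1]

    def residualize(A):
        beta, *_ = np.linalg.lstsq(X, A, rcond=None)
        return A - X @ beta

    Yp, Zp = residualize(Y), residualize(Z)
    PzY = Zp @ np.linalg.lstsq(Zp, Yp, rcond=None)[0]        # projection of Yp on Zp
    Sigma = (Yp - PzY).T @ (Yp - PzY) / (n - q - k)          # residual covariance
    A = np.linalg.inv(np.linalg.cholesky(Sigma)).T           # a Sigma^(-1/2) factor
    G = A.T @ (Yp.T @ PzY) @ A / k
    return float(np.linalg.eigvalsh(G).min())

# compare the returned value with the Stock-Yogo critical value of 7.03 quoted above
```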
Table 6 reports the OLS and IV results. The regressions are estimated for a sample of students who started their degree before January 1933 because I assign the dismissal variables according to the university attended at the beginning of 1933 for those who finish their PhD after 1933. This rules out the possibility that the results are driven by an inability to recruit good students at the places that lost professors. Columns 1 and 2 show the effects of faculty quality and the student/faculty ratio on the probability of publishing the dissertation in a top journal. Faculty quality has a strong positive and significant effect on the probability of publishing the dissertation. The student/faculty ratio, however, does not affect the outcome. The IV estimate of faculty quality is not only highly significant but also economically relevant. The standard deviation in faculty quality across departments is about 1.3. A one-standard-deviation increase in faculty quality therefore increases the probability of publishing the dissertation by about 13 percentage points.32
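The second-stage estimates in table 6 correspond to a just-identified two-stage least squares regression with two endogenous regressors and two instruments. A minimal sketch using the third-party linearmodels package follows; the file and column names are again placeholders rather than the paper's actual data.

```python
import pandas as pd
from linearmodels.iv import IV2SLS

students = pd.read_csv("phd_students.csv")  # placeholder path

# exogenous part: constant, individual controls, cohort and department dummies
exog = pd.get_dummies(students[["cohort", "department"]].astype(str), drop_first=True).astype(float)
exog = exog.assign(const=1.0, female=students["female"], foreigner=students["foreigner"])

res = IV2SLS(
    dependent=students["published_top"],
    exog=exog,
    endog=students[["avg_faculty_quality", "sf_ratio"]],
    instruments=students[["iv_quality_fall", "iv_sf_increase"]],
).fit(cov_type="clustered", clusters=students["department"])
print(res.params[["avg_faculty_quality", "sf_ratio"]])
```

Multiplying the faculty-quality coefficient by the cross-department standard deviation of roughly 1.3 reproduces the back-of-the-envelope calculation in the text: a coefficient of about 0.10 implies the 13 percentage points quoted above.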
Column 4 reports the IV results for becoming a full professor later in life. Again the IV coefficient on faculty quality is positive and highly significant. A one-standard-deviation increase in faculty quality increases the probability of becoming a full professor by about 10 percentage points. The student/faculty ratio does not affect the probability of becoming a full professor. The IV result for lifetime citations in mathematics journals is reported in column 6.
The coefficient on faculty quality is positive and highly significant.
31 As the number of instruments is equal to the number of endogenous regressors, the model is just-identified. Just-identified models typically suffer less from weak instruments. Stock and Yogo (2005), however, characterize instruments to be weak not only if they lead to biased IV results but also if hypothesis tests of IV parameters suffer from severe size distortions. It may therefore happen that one obtains a significant IV coefficient that is actually not significantly different from zero. To test whether this problem can potentially affect IV estimates, Stock and Yogo propose values of the Cragg-Donald (1993) minimum eigenvalue statistic for which a Wald test at the 5 percent level will have an actual rejection rate of no more than 10 percent. As outlined in the text, the critical value in this context is 7.03 and thus is way below the Cragg-Donald statistics reported for the regressions in this paper.
32 Interestingly, some of the IV standard errors are slightly smaller than the corresponding OLS ones. This occurs only when I cluster the standard errors at the department level. As the IV residuals are different from the OLS residuals, the intradepartment correlations of these residuals may be smaller in the IV case. If I do not cluster, all results remain very similar and highly significant, and OLS standard errors are always larger than IV standard errors.
Average faculty quality Student/faculty ratio Female
Foreigner
.056** (.018)
.102** (.015)
.037 (.021) .000 (.001)
.076** (.015)
2.388* (1.119)
4.901** (1.546)
.068* (.026)
.125** (.016)
(.048) Cohort dummies Yes Department fixed effects Yes Observations 690
(.045) (.053) Yes Yes Yes Yes 690 690
(5.136) Yes Yes 690
Yes Yes 690
Yes Yes 690 .126
Yes Yes 690
R 2 .163 Cragg-Donald eigenvalue
.155
.080
statistic
25.2
25.2
25.2
25.2
.000 (.001)
.014 .022 .134*
.016 (5.262)
.025 (.072)
.019 (.074)
Note.—All standard errors are clustered at the department level. *Significant at the 5 percent level.
**Significant at the 1 percent level.
TABLE 6
Instrumental Variables
Dependent Variable
Number of Lifetime Positive Lifetime Published Top Full Professor Citations Citations
OLS IV OLS IV OLS IV OLS IV
(1)
(2)
(3)
(4)
(5)
(6)
(7)
(8)
.003
(.002)
.015 .022 .099* (.059) (.055) (.041)
.001 (.003)
.040 (.106) 8.711**
.284 (.328)
.002 (.001)
.003 (.004)
.103** (.036) .135* (.053) Yes Yes 690
(2.756) .803
8.850** (2.436)
.090 (.068)
.099 (.062)
A one-standard-deviation increase in faculty quality translates into an increase in lifetime citations of 6.3. This is again a very sizable effect because the average PhD student has about 11 lifetime citations. Similar to the results on previous outcomes, the student/faculty ratio has no effect on lifetime citations. Lifetime citation counts are usually quite skewed. Because mathematicians receive fewer citations than other researchers, this is less of a problem in this sample, as can be seen from table 4. Nonetheless, I investigate whether outliers are driving the lifetime citation result using an indicator for positive lifetime citations as the dependent variable. The results are reported in column 8 of table 6. The regression suggests that the lifetime citation results are not driven by outliers. A one-standard-deviation increase in faculty quality increases the probability of having positive lifetime citations by about 16 percentage points.
The coefficients on the control variables reveal some interesting patterns. Women have about the same probability of publishing their dissertation in a top journal as men. Women's long-term outcomes, however, are significantly worse than men's. They have a lower probability of becoming a full professor later in life and have fewer lifetime citations. They also seem to have a lower probability of having positive lifetime citations even though the coefficient is not significant at conventional levels. Foreigners have about the same probability of publishing their dissertation in a top journal as Germans. They also have a similar number of lifetime citations and a similar probability of having positive lifetime citations. The probability of becoming a full professor is, however, significantly lower for foreigners than for German PhD students.
In the following, I report a large number of checks indicating that these findings are very robust. The results are reported in table 7. In contrast to the previous tables, each column reports regression results from four separate regressions, each with one of the four PhD student outcomes as the dependent variable.
First, I add 28 dummies indicating different occupations of the father.33 The results are reported in column 2 of table 7 (col. 1 reports the baseline results). It is reassuring that the inclusion of these powerful individual controls hardly affects the results.
It is extremely unlikely that students could forecast the dismissal of professors because the expulsion occurred just 2 months after the Nazi party came into power.34
33 The data include very detailed information on father’s occupation. Unfortunately the information is missing for about 30 percent of the data. I include an additional dummy for all those who do not have any information on their father’s occupation.
Nonetheless, I address this concern by investigating a sample of students who started studying between 1922 and 1930. In this sample, all of those affected by the dismissals should have been well attached to their programs when the dismissal shock occurred, but there was no way of forecasting the dismissals at the time they started studying. The results, reported in column 3, are again very similar to the baseline.
One may worry that the results are mostly driven by Jewish students. They may have been the best students studying in the best universities that later experienced more dismissals. Jewish students faced substantial difficulties after 1933. One would therefore find a drop in the probability of publishing the dissertation, the probability of becoming a full professor, or the number of lifetime citations for students in affected departments that is not caused by a fall in faculty quality. I investigate this issue by reestimating the regressions for non-Jewish students only, as reported in column 4. Encouragingly, the results hardly change. This indicates that the results are not driven by Jewish students.
Another worry is that student life in the dismissal years may have been disrupted. This may have had a direct effect on student outcomes. I therefore reestimate the regressions omitting the years 1933 and 1934, when most of the dismissals took place. Interestingly, the point estimates reported in column 5 are now larger in most cases. This indicates that students who finished their PhD in the early years after the dismissal actually suffered less than students who finished later and were thus exposed to the fall in faculty quality for a longer time period. It is thus relatively unlikely that acute disruption effects can explain my findings.
A related concern is that students' outcomes may have worsened because of the direct disruption caused by the loss of their advisor, not necessarily because of a fall in faculty quality. I investigate this concern by estimating the regressions focusing on students who were still doing course work at the beginning of 1933 and who had thus not yet started working on their dissertation with a specified advisor.35 Reassuringly, the results reported in column 6 are unchanged. This indicates that students who had not yet started their dissertation were equally affected by the dismissals. This strongly suggests that the loss of a PhD supervisor is not driving my results. Finally, I investigate whether differential time trends across departments can explain my findings by including linear department-specific time trends.
34 Even forecasting the fact that the Nazi party would form part of the government would have been very difficult. The Nazi party actually lost votes between the elections of July and November 1932 (which was the last free election). By the beginning of 1933, many political commentators were even suggesting that the threat of the Nazi party gaining power was abating.
35 For the pre-1933 cohorts I include all students, of course, since they were by definition not doing course work anymore.
Dependent Variable
Full Sample (1)
Full Sample (2)
Starting 1922–30 (3)
Non-Jewish Students (4)
Omitting 1933 and 1934 (5)
Students Doing Course Work (6)
Full Sample (7)
Published top:
Average faculty quality
.102** (.015)
.109** (.015)
.096** (.019)
.094** (.018)
.124** (.019)
.144** (.040)
.048* (.022) .004 (.005)
Student/faculty ratio
.003 (.002)
.001 (.003)
.002 (.003)
.002 (.003)
.001 (.003)
.008 (.014)
Full professor:
Average faculty quality
.076** (.015)
.082** (.016)
.057** (.014)
.065** (.019)
.091** (.020)
.069* (.033) .013 (.008)
.055** (.019)
Student/faculty ratio
.001 (.003)
.002 (.004)
.002 (.005)
.004 (.003)
.004 (.003)
.006 (.003)
Number of lifetime citations: Average faculty quality
4.901** (1.546)
6.038** (1.658)
5.588** (1.684)
4.066** (1.233)
6.102** (2.078)
5.448** (1.973)
4.540 (2.922)
TABLE 7
Robustness Instrumental Variable Results Sample
Student/faculty ratio
.284 .272 .272 .213 (.328) (.247) (.353) (.291)
.534 .246 .058 (.375) (.446) (.266)
Positive lifetime citations: Average faculty quality
.125** .136** .120** .106** (.016) (.017) (.018) (.020)
.130** .174** .012 (.028) (.040) (.030) .001 .004 .010 (.004) (.013) (.011) Yes Yes Yes Yes Yes Yes Yes Yes Yes Yes Yes Yes
Student/faculty ratio
.003 .000 .003 .001 (.004) (.004) (.003) (.004) Yes Yes Yes Yes
Controls
Father’s occupation Cohort dummies Department fixed effects Department-specific time
Yes Yes Yes Yes Yes Yes Yes Yes Yes Yes Yes
trends
Observations 690 690 Cragg-Donald eigenvalue
542 635 18.7 23.5
Yes 570 501 690
statistic 25.2 21.6
17.1 6.4 15.1
Note.—This table shows only regressors of interest. Results in the four panels correspond to separate regressions with an indicator for publishing the dissertation in a top journal, an indicator for becoming a full professor, the number of lifetime citations, and an indicator for positive lifetime citations as the respective dependent variables. Regressors not listed are indicators for being female and foreigner. See the text for details. All standard errors are clustered at the department level.
*Significant at the 5 percent level.
**Significant at the 1 percent level.
The results are reported in column 7. The coefficient on the probability of publishing the dissertation in a top journal falls substantially but remains significant. The coefficient on the probability of becoming a full professor changes very little and remains significant at the 1 percent level. The coefficients on the lifetime citation measures, however, are no longer significant at conventional levels. In the regression with the number of lifetime citations as the dependent variable the point estimate remains similar to the previous point estimates but has a p-value of 0.12. The coefficient on the indicator for positive lifetime citations falls substantially and is no longer significant. This is a very demanding test since department-specific time trends will also take out some of the true effect of the fall in university quality.
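Department-specific linear trends can be added by interacting the department dummies with a linear term in the PhD cohort year. A sketch of the corresponding reduced form (again with hypothetical column names; the table's column 7 reports the analogous IV specification):

```python
import pandas as pd
import statsmodels.formula.api as smf

students = pd.read_csv("phd_students.csv")  # placeholder path

trend_rf = smf.ols(
    "published_top ~ iv_quality_fall + iv_sf_increase + female + foreigner"
    " + C(cohort) + C(department) + C(department):cohort",   # department-specific linear trends
    data=students,
).fit(cov_type="cluster", cov_kwds={"groups": students["department"]})
# note: one trend term is redundant given the cohort dummies; the default pinv
# solver in statsmodels still returns the coefficient of interest
print(trend_rf.params["iv_quality_fall"])
```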
A notable feature of these results is that the IV point estimates are higher than the corresponding OLS estimates. Measurement error in faculty quality is likely to be an important reason for this. Black and Smith (2006) highlight that measurement error is likely to affect any measure of university quality. In this context, measurement error attenuates the OLS estimates because average citation-weighted publications do not perfectly measure those aspects of faculty quality that are important for PhD students. Furthermore, the Web of Science publications and citations data include only the first letter of the first name. This will introduce further measurement error in the faculty quality variable.
Another important reason for obtaining higher IV estimates is that these estimates can be interpreted as a local average treatment effect, as suggested by Imbens and Angrist (1994). The dismissals mostly affected high-quality departments. It is quite likely that the compliers in this setup have very high returns to changes in faculty quality since students in these departments are much more research oriented. They may therefore respond more strongly to changes in faculty quality. Table 8 shows OLS and IV results for students in top 10 departments (ranked by average faculty quality) and students in lower-ranked departments. Both the OLS and the IV returns to faculty quality are higher for students in top departments. Furthermore, the low Cragg-Donald eigenvalue statistic for the lower-ranked departments indicates that the instruments do not affect faculty quality and department size in lower-ranked departments.
V. Conclusion
This paper uses the dismissal of professors by the Nazi government to identify the effect of faculty quality on PhD student outcomes. I show that the dismissals indeed had a very strong effect on average faculty quality and the student/faculty ratio.
TABLE 8
Heterogeneity in Returns Sample
Top 10 Lower-Ranked Department Department
Dependent Variable
Published top:
Average faculty quality
Student/faculty ratio
Full professor:
Average faculty quality
Student/faculty ratio
Number of lifetime citations: Average faculty quality
Student/faculty ratio
Positive lifetime citations: Average faculty quality
Student/faculty ratio
Controls
Cohort dummies
Department fixed effects Observations
Cragg-Donald eigenvalue statistic
OLS (3)
IV (4)
.214 (.259)
.016 (.026)
.065 (.214)
.000 (.023)
6.032 (24.717)
.158 (2.914)
.164 (.204)
.007 (.020)
Yes Yes Yes 338 .8
OLS IV (1) (2)
.059** .094**
(.014) (.022) (.032)
.001 .002 .001 (.002) (.003) (.002)
.046* .074** .086* (.017) (.019) (.034)
.001 .001 .001 (.002) (.003) (.003)
2.092* 2.667** 4.570
(.685) (.557) .071 .156 (.167) (.211)
.092** .138** (.016) (.020)
.001 .003 (.002) (.003)
Yes Yes Yes Yes Yes Yes 352 352
30.0
(3.396) .023
(.156)
.055 (.043)
.003 (.002)
Yes Yes Yes 338
.012
Note.—This table shows only regressors of interest. Results in the four panels correspond to separate regressions with an indicator for publishing the dissertation in a top journal, an indicator for becoming a full professor, the number of lifetime citations, and an indicator for positive lifetime citations as the respective dependent variables. Regressors not listed are indicators for being female and foreigner. See the text for details. All standard errors are clustered at the department level.
*Significant at the 5 percent level. **Significant at the 1 percent level.
I then use the exogenous variation in faculty quality and the student/faculty ratio to estimate their effect on short- and long-term outcomes of PhD students. Faculty quality is found to have a very sizable effect on the careers of former PhD students. A one-standard-deviation increase in average faculty quality increases the probability of publishing the dissertation in a top journal by about 13 percentage points. Furthermore, faculty quality during PhD training is also very important for the long-run career of former students.
A one-standard-deviation increase in faculty quality increases the probability of becoming a full professor by about 10 percentage points. Furthermore, faculty quality has a strong effect on lifetime citations. A one-standard-deviation increase in faculty quality leads to an increase of lifetime citations by 6.3 and increases the probability of having positive lifetime citations by 16 percentage points. The student/faculty ratio does not seem to affect PhD student outcomes.
The results suggest that, even in highly selective education markets in which some may claim that the main value of the degree is the signal it sends concerning the talent required to obtain entry into and to complete the program, the quality of instruction matters greatly for future outcomes. The results presented above could of course be partly driven by signaling if journal editors and future employers discriminated sharply between a student from Göttingen who graduated before 1933 and a student who graduated after 1933. It is, however, unlikely that Göttingen's reputation fell so sharply in a single year. If the reputation of universities with dismissals did not fall very sharply, the results are likely driven by large differences in the human capital that PhD students received during their training. This suggests that attending a high-quality PhD program does have real effects on the human capital of PhD students. This result is very different from the findings of Van Ours and Ridder (2003). Their results from three Dutch universities suggest that the main value of good supervisors is to attract good students. From a policy perspective these results suggest that the most efficient way of training PhD students is to have large PhD programs in a small number of very high-quality departments. In pre–World War II Germany, Göttingen and Berlin jointly produced more than 20 percent of all mathematics PhD students. The five best universities produced about 28 percent of all mathematics PhD students at the time. Today the five best universities in Germany produce only about 8.5 percent of all mathematics PhD students. In fact, none of the five best German mathematics departments (according to the faculty's research output) are among the top five producers of PhD students today.36 The less efficient organization of PhD student training may have been an important factor contributing to the decline of German science after World War II. In the United States, however, the best research universities are also the main producers of PhD students. My findings suggest that this is a very productive way of organizing PhD training that should be further encouraged by science policy makers.
36 The data on current PhD students in Germany come from CHE (2009). Quality of departments is measured by publications. While there were 32 universities in Germany that produced mathematics PhD students in the 1920s and 1930s, 62 German universities produce mathematics PhDs today.
My results on the effect of local department quality on PhD student outcomes are particularly interesting if they are contrasted with findings in Waldinger (2010). That paper investigates how the dismissal of professors affected the productivity of professors who remained in Germany after 1933. Interestingly, dismissals in the local department do not affect the productivity of staying professors. That suggests that the quality of the local department is not important for more senior researchers who already have a professional network outside their university. PhD students, however, do not have any such network and are therefore particularly dependent on studying in a department with high-quality faculty.
Appendix
TABLE A1
Top Journals

Journal Name | Published in
Mathematics:
  Journal für die reine und angewandte Mathematik | Germany
  Mathematische Annalen | Germany
  Mathematische Zeitschrift | Germany
  Zeitschrift für angewandte Mathematik und Mechanik | Germany
  Acta Mathematica | Sweden
  Journal of the London Mathematical Society | United Kingdom
  Proceedings of the London Mathematical Society | United Kingdom
General science:
  Naturwissenschaften | Germany
  Sitzungsberichte der Preussischen Akademie der Wissenschaften, Physikalisch-Mathematische Klasse | Germany
  Nature | United Kingdom
  Proceedings of the Royal Society of London A (Mathematics and Physics) | United Kingdom
  Science | United States
Physics:
  Annalen der Physik | Germany
  Physikalische Zeitschrift | Germany
  Physical Review | United States
Chemistry:
  Berichte der Deutschen Chemischen Gesellschaft | Germany
  Biochemische Zeitschrift | Germany
  Journal für Praktische Chemie | Germany
  Justus Liebigs Annalen der Chemie | Germany
  Kolloid Zeitschrift | Germany
  Zeitschrift für Anorganische und Allgemeine Chemie | Germany
  Zeitschrift für Elektrochemie und Angewandte Physikalische Chemie | Germany
  Zeitschrift für Physikalische Chemie | Germany
  Journal of the Chemical Society | United Kingdom
TABLE A2
Top 20 Mathematics Professors, 1925–32: Citation-Weighted Publications Measure

Name | University Beginning of 1933 | Average Citation-Weighted Publications (1925–32) | Average Publications (1925–32) | Dismissed 1933–34
John von Neumann | Berlin | 36.3 | 1.5 | Yes
Richard Courant | Göttingen | 22.3 | 1.3 | Yes
Richard von Mises | Berlin | 15.6 | .9 | Yes
Heinz Hopf (a) |  | 13.3 | 1.3 |
Paul Epstein | Frankfurt | 11.5 | .6 |
Oskar Perron | München | 10.6 | 1.5 |
Willy Prager | Göttingen | 10.0 | .4 | Yes
Gabriel Szegő | Königsberg | 9.4 | 1.4 | Yes
Werner Rogosinski | Königsberg | 9.1 | .6 |
Wolfgang Krull | Erlangen | 8.9 | 1.4 |
Erich Rothe | Breslau TU | 8.0 | 1.0 | Yes
Hans Petersson | Hamburg | 8.0 | 2.0 |
Adolf Hammerstein | Berlin | 8.0 | .5 |
Alexander Weinstein | Breslau TU | 6.3 | .7 | Yes
Erich Kamke | Tübingen | 6.3 | .8 |
Hellmuth Kneser | Greifswald | 6.3 | .6 |
Bartel van der Waerden | Leipzig | 5.8 | 1.8 |
Max Müller | Heidelberg | 5.3 | .3 |
Richard Brauer | Königsberg | 5.0 | .6 | Yes
Leon Lichtenstein | Leipzig | 4.9 | 1.5 | Yes

(a) The university in 1933 is missing for professors who retired before 1933.
Dismissal-induced fall in faculty quality
.134** (.017) .002 (.001)
.090** (.021) .000 (.001)
6.137** .164** (2.218) (.019) .042 .002
.023 .053 (.031) (.037)
3.434 (5.597) .462
.037 (.030)
Dismissal-induced increase in student/faculty ratio
.004 .002 (.004) (.005) .009 .167* (.066) (.068)
.002 (.003)
Female
.004 .119* (.048) (.045) .031 .147*
(.114) (.002) 10.723* .067
(.431) 12.114** .104
Foreigner
(4.459) (.058) .942 .033
.017 .136 (.103) (.102)
(4.228) (.071) 7.169 .050
Father’s occupation Cohort dummies Department fixed effects Observations
(.048) Yes Yes Yes
(.065) (6.151) (.075) Yes Yes Yes Yes Yes Yes Yes Yes Yes
Yes Yes Yes Yes Yes Yes 403 403 .302 .291
(6.479) (.134) Yes Yes Yes Yes Yes Yes 403 403
R2
690 690 .221 .208
690 690 .185 .208
.224 .260
Note.—All standard errors are clustered at the department level. *Significant at the 5 percent level.
**Significant at the 1 percent level.
Published Top (1)
Full Professor (2)
Lifetime Lifetime Citations Citations (3) (4)
Published Full Top Professor
No. of Lifetime Citations (7)
Positive Lifetime Citations (8)
TABLE A3
Reduced-Form and Placebo Test
Dependent Variable
Reduced Form
No. of Positive
Placebo Moving Dismissal to 1930 (Only Pre-1933 Observations)
(5) (6)
Fig. A1.—Sample page from the mathematics section of the List of Displaced German Scholars (1936).
Fig. A2.—Total number of Jewish mathematics PhD graduates in all German universities in each year. Black bar: all departments; white bar: departments with above-average quality dismissals between 1933 and 1934.
References
Abele, Andrea E., Helmut Neunzert, and Renate Tobies. 2004. Traumjob Mathematik—Berufswege von Frauen und Männern in der Mathematik. Basel: Birkhäuser Verlag.
Behrman, Jere R., Mark R. Rosenzweig, and Paul Taubman. 1996. "College Choice and Wages: Estimates Using Data on Female Twins." Rev. Econ. and Statis. 78 (November): 672–85.
Bergmann, Birgit, and Moritz Epple, eds. 2009. Jüdische Mathematiker in der deutschsprachigen akademischen Kultur. Berlin: Springer Verlag.
Black, Dan A., and Jeffrey A. Smith. 2004. "How Robust Is the Evidence on the Effects of College Quality? Evidence from Matching." J. Econometrics 121 (July): 99–124.
———. 2006. "Estimating Returns to College Quality with Multiple Proxies for Quality." J. Labor Econ. 24 (July): 701–28.
Bound, John, David A. Jaeger, and Regina M. Baker. 1995. "Problems with Instrumental Variables Estimation When the Correlation between the Instruments and the Endogenous Explanatory Variables Is Weak." J. American Statis. Assoc. 90 (June): 443–50.
Brewer, Dominic, Eric Eide, and Ronald Ehrenberg. 1999. "Does It Pay to Attend an Elite College? Cross Cohort Evidence on the Effects of College Type on Earnings." J. Human Resources 34 (Winter): 104–23.
Carrell, Scott E., and James E. West. 2008. "Does Professor Quality Matter? Evidence from Random Assignment of Students to Professors." Working Paper no. 14081 (June), NBER, Cambridge, MA.
CHE. 2009. Das CHE Forschungsranking deutscher Universitäten 2009—Mathematik 2009. Gütersloh: CHE gemeinnütziges Zentrum für Hochschulentwicklung.
Cragg, John G., and Stephen G. Donald. 1993. "Testing Identifiability and Specification in Instrumental Variables Models." Econometric Theory 9 (June): 222–40.
Dale, Stacy, and Alan B. Krueger. 2002. "Estimating the Payoff to Attending a More Selective College: An Application of Selection on Observables and Unobservables." Q.J.E. 117 (November): 1491–1527.
Hartshorne, Edward Y. 1937. The German Universities and National Socialism. Cambridge, MA: Harvard Univ. Press.
Hentschel, Klaus. 1996. Physics and National Socialism—an Anthology of Primary Sources. Berlin: Birkhäuser Verlag.
Hoffmann, Florian, and Philip Oreopoulos. 2009. "Professor Qualities and Student Achievement." Rev. Econ. and Statis. 91 (January): 83–92.
Hussain, Iftikhar, Sandra McNally, and Shqiponja Telhaj. 2009. "University Quality and Graduate Wages in the UK." Working Paper no. 99 (March), Center Econ. Educ., London.
Imbens, Guido, and Joshua Angrist. 1994. "Identification and Estimation of Local Average Treatment Effects." Econometrica 62 (March): 467–75.
Kröner, Peter. 1983. Vor fünfzig Jahren—die Emigration deutschsprachiger Wissenschaftler 1933–1939. Edited by Gesellschaft für Wissenschaftsgeschichte Münster. Wolfenbüttel: Heckners Verlag.
Lorenz, Charlotte. 1943. Zehnjahres-Statistik des Hochschulbesuchs und der Abschlußprüfungen. Berlin: n.p.
National Research Council. 1995. Research Doctorate Programs in the United States: Continuity and Change. Edited by Marvin L. Goldberger, Brendan H. Maher, and Pamela Ebert Flattau. Washington, DC: Nat. Acad. Press.
Notgemeinschaft Deutscher Wissenschaftler im Ausland. 1936. List of Displaced German Scholars. London: n.p.
Reid, Constance. 1996. Hilbert. New York: Springer.
Röder, Werner, and Herbert Strauss. 1983. Biographisches Handbuch der deutschsprachigen Emigration nach 1933. Vol. 2, The Arts, Sciences, and Literature. Edited by Institut für Zeitgeschichte München and Research Foundation for Jewish Immigration. New York: K. G. Saur Verlag.
Siegfried, John J., and Wendy A. Stock. 2004. "The Market for New Ph.D. Economists in 2002." A.E.R. Papers and Proc. 94 (May): 272–85.
Siegmund-Schultze, Reinhard. 1998. Mathematiker auf der Flucht vor Hitler. Braunschweig/Wiesbaden: Vieweg Verlag.
Stock, James H., Jonathan H. Wright, and Motohiro Yogo. 2002. "A Survey of Weak Instruments and Weak Identification in Generalized Method of Moments." J. Bus. and Econ. Statis. 20 (October): 518–39.
Stock, James H., and Motohiro Yogo. 2005. "Testing for Weak Instruments in Linear IV Regression." In Identification and Inference for Econometric Models: Essays in Honor of Thomas Rothenberg, edited by Donald W. K. Andrews and James H. Stock. New York: Cambridge Univ. Press.
Stock, Wendy A., T. Aldrich Finegan, and John J. Siegfried. 2006. "Attrition in Economics Ph.D. Programs." A.E.R. Papers and Proc. 96 (May): 458–66.
Tobies, Renate. 2006. Biographisches Lexikon in Mathematik promovierter Personen—an deutschen Universitäten und Technischen Hochschulen WS 1907/08 bis WS 1944/45. Augsburg: Dr. Erwin Rauner Verlag.
Van Ours, Jan C., and Geert Ridder. 2003. "Fast Track or Failure: A Study of Graduation and Dropout Rates of Ph.D. Students in Economics." Econ. Educ. Rev. 22 (April): 157–66.
von Olenhusen, Albrecht G. 1966. "Die 'nichtarischen' Studenten an den deutschen Hochschulen—Zur nationalsozialistischen Rassenpolitik 1933–1945." Vierteljahreshefte für Zeitgeschichte 14 (April): 175–206.
Waldinger, Fabian. 2010. "Peer Effect in Science—Evidence from the Dismissal of Scientists in Nazi Germany." Manuscript, Univ. Warwick, Dept. Econ.