Feature

Birch and Bombieri, which in turn used Deligne's work on the Riemann hypothesis on varieties); one group focused on the combinatorics of putting these estimates together to create a distributional estimate on primes; one group focused on optimising the GPY sieve to convert distributional estimates on primes into a concrete value of k; and one group focused on finding narrow prime tuples which converted values of k to values of H. We had managed to organise ourselves into a sort of factory production line: an advance in, say, the Type I estimates would be handed over to the combinatorics group to produce a new distributional estimate in primes, which the sieve team would then promptly convert into a revised value of k, which the prime tuples team would then use to update their value of H_1. Steady improvements from all of these groups led to new improvements to these values on a daily basis for several weeks (see michaelnielsen.org/polymath1/index.php?title=Timeline_of_prime_gap_bounds for a timeline of these gains).

Many important contributions came from mathematicians who were not initially involved in the Polymath project. For instance, in the first week of June, János Pintz released a preprint [33] in which he went through Zhang's paper and optimised all the numerical constants, leading to further improvements in the values of k and H_1; within a day or two, the Polymath team had gone through the preprint (correcting some arithmetic errors along the way), lowering k to 26 024 and H_1 to about 280 000. Later, we ran into a problem in which we became victims of our own success: the values of k had dropped so low (around 6000) that a certain error term in the sieve analysis that decayed exponentially in k became non-negligible.
Pintz, who had followed developments closely, came up with a much more efficient way to truncate the sieve that rendered this error term almost non-existent (dropping k almost immediately from 6000 to about 5000) and then shared these computations with the Polymath project. Similarly, Étienne Fouvry, Emmanuel Kowalski, Philippe Michel and Paul Nelson, who had previously worked on using Deligne-type exponential sum estimates to obtain distributional estimates similar to Zhang's Type III estimates, found a simpler approach than Zhang's to these estimates that gave significantly better numerology; they also donated these arguments to the Polymath project and began actively participating in the ensuing discussion. Thomas Engelsma, who had made extensive numerical computations of narrow admissible tuples over a decade ago, also opened up his database to our project.

By the end of June (with k dipping below 1000 and H falling below 7000), our understanding of the various components of the project had matured significantly, though the pace of discussion was still lively (with perhaps 20 comments each day, down from a peak of 50 or so each day). A database had been set up at math.mit.edu/~primegaps/ to automatically record the narrowest known prime tuples for a given value of k (up to k = 5000), which automated the task of converting k-values to H-values, with several of the more computationally minded participants competing to add ever narrower tuples to the database. A prime distribution estimate could similarly be converted into a value of k with a few lines of Maple code, using Pintz's version of the truncated sieve, and a further few lines of code allowed us to automatically convert any improvement in the Type I, II, III estimates to a prime distribution estimate in an optimised fashion. The Type II and Type III estimates had been optimised to such an extent that they were no longer the dominant barrier to further improvement of the prime distribution estimates.
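Converting a value of k into a value of H amounts to exhibiting a narrow admissible k-tuple: k integers that avoid at least one residue class modulo every prime p ≤ k, with H the tuple's diameter. A minimal sketch of this conversion, using a classical construction rather than the optimised searches behind the project's database (the function names here are illustrative only):

```python
from sympy import isprime, primerange

def is_admissible(tup):
    # Admissible: for every prime p <= k, the tuple misses
    # at least one residue class modulo p.
    k = len(tup)
    for p in primerange(2, k + 1):
        if len({h % p for h in tup}) == p:  # all residues hit
            return False
    return True

def first_primes_past_k(k):
    # Classical construction: the first k primes greater than k form
    # an admissible k-tuple (none is divisible by any prime p <= k,
    # so the residue class 0 mod p is always missed).
    tup, n = [], k + 1
    while len(tup) < k:
        if isprime(n):
            tup.append(n)
        n += 1
    return tup

tup = first_primes_past_k(10)
print(is_admissible(tup), tup[-1] - tup[0])  # True 32
```

Much narrower tuples exist for a given k; the competition around the database was precisely about shrinking the diameter `tup[-1] - tup[0]`, which is the quantity H.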
The Type I estimates were the only remaining major source of improvement but, over the course of the next few weeks, several new ideas for optimising these estimates (most notably, using the q-van der Corput method of Graham and Ringrose [21], combined with the theory of trace weights arising from the work of Deligne, and carefully splitting the summations to maximise the effectiveness of the Cauchy-Schwarz inequality) were implemented, leading to the final values of k = 632 and H_1 = 4680.

Further numerical improvements looked very hard to come by and so, for the next few months, the Polymath project participants devoted their efforts to writing up the results, splitting the paper into several sections in a shared Dropbox folder, with various participants working on separate sections in parallel and coordinating their efforts through blog comments. Somewhat to our surprise, once all the various arguments, spread out over a dozen blog posts, had been collated together, the final paper turned out to be extremely large – 163 pages, to be exact⁴ (for comparison, Zhang's original paper was 53 pages in length). Somehow, the Polymath format had managed to efficiently segment the research project into more manageable chunks, so that no individual participant had to absorb the entire 163-page argument at any given time while the research was ongoing.

In late October, when the Polymath paper was going through several rounds of proofreading in preparation for submission, James Maynard announced a major breakthrough [28], in which the prime gap H_1 was lowered substantially to 700 (quickly revised downward to 600), with the improvement H_1 ≤ 12 available under the assumption of the Elliott-Halberstam conjecture.
Furthermore, Maynard had found a way to modify the GPY sieve argument in such a way that the new distribution estimates on primes (the most difficult and novel part of Zhang's work and also of the Polymath project) were no longer needed; moreover, the methods could for the first time be used to bound H_m for larger m. (I had also begun to arrive at similar conclusions independently of Maynard, though I did not get the precise numerical results that Maynard did.) After some discussion with Maynard and with the Polymath team, it was decided that we should both publish our results separately and then join forces to see if the new ideas in Maynard's paper could be combined with those in the Polymath paper to obtain further improvements (this effort was soon dubbed "Polymath8b").

I had some mixed feelings about the continuation of the Polymath8 project; running the Polymath8a project had already consumed several months of my time and I was thinking of turning to unrelated research projects. However, the enthusiasm of the other participants and the lure of getting a quick payoff from comparatively brief snatches of mathematical thought were once again too great to resist.

⁴ After significant revisions in the light of the referee report and subsequent developments, the paper has since been shortened somewhat to 107 pages.

16 EMS Newsletter December 2014
Interestingly, Polymath8b ended up moving in a rather different mathematical direction to Polymath8a; it turned out to be quite difficult to effectively combine the prime distribution estimates from Polymath8a with Maynard's sieve to bound H_1 (although they could be used to bound H_m for larger values of m) and it was soon realised that the most promising way forward was to either optimise or generalise a certain multidimensional calculus of variations problem used in Maynard's work.

The former approach was initially more fruitful; by taking the code Maynard used to obtain lower bounds for a variational problem by testing that problem on polynomials, and making it run faster and more efficiently on more powerful computers, we managed to cut down the unconditional H_1 bound from 600 to 300. However, improving the conditional bound of H_1 ≤ 12 was far tougher; indeed, we had established an upper bound on the variational problem that Maynard used which showed that we could not hope to improve upon this bound without modifying the sieve. In order to break the 12 barrier, the efficiency of our sieve (measured by a quantity that we called M_4, which arose from a four-dimensional variational problem) had to exceed 2. There was then a lengthy and frustrating "Zeno's paradox" period in which the efficiency M_4 of our sieves kept improving incrementally (from 1.845, to 1.937, to 1.951, etc.) but never quite enough to surpass the magic threshold of 2 needed to break the barrier.
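The flavour of these lower-bound computations can be conveyed by a toy two-dimensional analogue of the variational problem (a sketch only; the actual Polymath8b computations were four-dimensional, far more refined, and used large polynomial bases). Plugging a single trial polynomial F = 1 - t1 - t2 into the ratio of integrals over the simplex t1 + t2 ≤ 1 certifies a lower bound on the corresponding two-dimensional quantity M_2:

```python
import sympy as sp

t1, t2 = sp.symbols('t1 t2', nonnegative=True)

# A simple symmetric trial function on the simplex t1 + t2 <= 1
F = 1 - t1 - t2

# Denominator: integral of F^2 over the simplex
denom = sp.integrate(sp.integrate(F**2, (t2, 0, 1 - t1)), (t1, 0, 1))

# Numerator: sum over the two coordinates of the integral of
# (integral of F in that coordinate)^2; by symmetry both terms agree
inner = sp.integrate(F, (t2, 0, 1 - t1))
numer = 2 * sp.integrate(inner**2, (t1, 0, 1))

ratio = sp.simplify(numer / denom)
print(ratio)  # 6/5, i.e. this trial function certifies M_2 >= 1.2
```

Enlarging the space of trial polynomials and optimising over their coefficients pushes such ratios upward; the "Zeno's paradox" phase was exactly this game played for M_4, inching towards the threshold 2.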
Finally, there was a breakthrough; after several weeks of effort, we stumbled upon an "epsilon trick" that allowed one to slightly enlarge the class of permissible cutoff functions in the variational problem, at the expense of worsening the quantity that one was trying to optimise. It turned out that this tradeoff was advantageous, allowing us to move M_4 to 2.018 and to reduce the bound to H_1 ≤ 8 (though we had to replace the Elliott-Halberstam conjecture with a strengthening of that conjecture, which we call the generalised Elliott-Halberstam conjecture). A similar "Zeno's paradox" game then played out for the analogous three-dimensional variational problem M_3, which, after several further refinements both to the sieve and to the numerical procedure for optimising the sieve, eventually pushed M_3 to be larger than 2 as well, giving the optimal result H_1 ≤ 6 assuming the generalised Elliott-Halberstam conjecture. (The parity obstruction of Selberg prevents any better bound on H_1 from purely sieve-theoretic considerations.) Some of this new technology also allowed a slight lowering of the unconditional bound on H_1 to 246, but further improvement beyond this point seemed to require enormous amounts of computation and by early May we were happy to "declare victory" and write up the Polymath8b results.

The Polymath8 project had perhaps one or two dozen active participants but many more mathematicians and interested amateurs followed the progress online. During the project, whenever I visited another institution, I was usually asked what the latest value of H_1 was and how low I thought it would go. In that respect, we were fortunate that we had such a simple and easily understood statistic that could be used as a proxy for the degree of technical advancement we were making; it is not clear if future Polymath projects will be as easy to follow on a casual basis.

The project also forced me to think and work in different ways from what I was accustomed to.
I do not have extensive experience with programming, and most of the really heavy computational work was done by others on this project, but I did manage to write some simple Maple code to at least verify the numerical computations that others had generated. At another juncture, the way forward hinged on finding the optimal way to decompose the unit cube into polyhedral pieces; lacking sufficient geometric or algebraic intuition, I ended up having to build a cubic lattice out of my son's construction toys in order to visualise all the decompositions being proposed. I also found myself having to learn areas of mathematics I would not otherwise have been exposed to, from ℓ-adic cohomology to the Krylov subspace method. All in all, it was an exhausting and unpredictable experience but also a highly thrilling and rewarding one.

3 Andrew Gibson

Andrew Gibson is an undergraduate mathematics student at the University of Memphis.

Shortly after Zhang announced his result and you (Tao) proposed the project, my classmates and I began a small, weekly seminar with a professor devoted to studying some of the theory involved (analytic number theory, sieve methods, etc.), albeit on a much more elementary level that was within our reach. Of course, the majority of the actual proof is still mostly over our heads but at least I feel as if I've gained a bird's-eye view of the strategy and, probably more importantly, of how it fits into the larger field. (For instance, before any of this, I could never have explained the Bombieri-Vinogradov theorem or the Hardy-Littlewood prime tuple conjecture.) So for us the project was a great excuse to enter a new subject and it has been immensely educational.

More than that though – reading the posts and following the 'leader-board' felt a lot like an academic spectator sport. It was surreal, a bit like watching a piece of history as it occurred. It made the mathematics feel much more alive and social, rather than just coming from a textbook.
I don't think us undergrads often get the chance to peek behind closed doors and watch professional mathematicians "in the wild" like this so, from a career standpoint, it was illuminating. I get the sense that this is the sort of activity I can look forward to in grad school and as a post-doc doing research (...hopefully). I also suspect that many other students from many other schools have had similar experiences but, like me, chose to stay quiet, as we had nothing to contribute. So, thank you all for organising this project and for making it publicly available online.

4 Pace Nielsen

Pace Nielsen is an assistant professor at Brigham Young University.

As a pre-tenure professional mathematician, I believe that a short introduction to my experiences in the Polymath8 project will be helpful to other young mathematicians deciding whether or not to participate in similar endeavours. I initially joined the project for a number of reasons. One is that I enjoy optimising numerical results. There is a certain plea-