Research and applications of Bayesian methods grew dramatically in the 1980s. Markov Chain Monte Carlo (MCMC) techniques provided a numerical integration scheme that made practical Bayesian analyses feasible.3 With this tool, statisticians could apply the Bayesian approach to data from previously unwieldy, nonstandard, and complex applications.

Statistical modeling continued to be an active research area. In 1978, Schwarz published his Bayesian Information Criterion (BIC) for model selection that penalizes over-fitting. In geostatistics, the variogram and kriging interpolation methods became popular, and statisticians conducted associated research on other interpolation techniques such as smoothing splines, generalized least squares fitting of polynomials, and Bayes linear statistics. Generalized linear models, including logistic regression, became conventional tools; in general, statisticians were increasingly relying on regression models adapted to non-Gaussian responses and heteroscedasticity. Multiple comparisons techniques continued to receive researchers’ attention; e.g., the Holm-Bonferroni method (1979).
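The criterion Schwarz proposed trades fit against complexity: for a model with k free parameters and maximized likelihood L̂ on n observations, BIC = k ln n − 2 ln L̂, and the model with the smaller BIC is preferred. A minimal sketch for Gaussian linear regression follows; the data and the competing models are illustrative, not taken from any analysis mentioned here:

```python
import numpy as np

def bic_linear(y, X):
    """BIC = k*ln(n) - 2*ln(Lhat) for an ordinary least-squares fit,
    using the Gaussian log-likelihood at the MLE of the error variance."""
    n, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / n                            # MLE of error variance
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    return (k + 1) * np.log(n) - 2 * loglik               # +1 counts sigma^2

# Compare a straight line against a needless quintic on noisy linear data.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 100)
y = 2.0 * x + rng.normal(scale=0.1, size=x.size)
X1 = np.column_stack([np.ones_like(x), x])        # intercept + slope
X5 = np.column_stack([x**p for p in range(6)])    # intercept + quintic terms
```

With data like these, the simpler model attains the smaller BIC: the quintic's tiny gain in likelihood does not offset its four extra parameters times ln(100).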

Scatterplot smoothing and non-parametric regression became popular tools for viewing data. The approach was promoted by statisticians at Bell Laboratories, who coined the term LOWESS for locally weighted scatterplot smoothing. Edward Tufte’s The Visual Display of Quantitative Information (1983) set new standards for graphic visualization of data. Data visualization and communication became an increasingly active research area, largely facilitated by advances in digital technology.
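The locally weighted idea can be sketched in a few lines. This is a simplified single-pass version with Cleveland-style tricube weights, omitting the robustness iterations of the full LOWESS procedure; the function, parameters, and data are illustrative:

```python
import numpy as np

def lowess(x, y, frac=0.5):
    """Simplified LOWESS: at each x[i], fit a weighted straight line to the
    nearest ceil(frac*n) points using tricube weights, and keep the locally
    fitted value.  One pass only; robustness iterations omitted."""
    n = len(x)
    r = max(2, int(np.ceil(frac * n)))                 # neighbourhood size
    fitted = np.empty(n)
    for i in range(n):
        d = np.abs(x - x[i])
        h = np.sort(d)[r - 1]                          # bandwidth: r-th nearest distance
        w = np.clip(1.0 - (d / h) ** 3, 0.0, 1.0) ** 3 # tricube kernel
        A = np.column_stack([np.ones(n), x - x[i]])    # local line, centred at x[i]
        sw = np.sqrt(w)
        beta, *_ = np.linalg.lstsq(A * sw[:, None], y * sw, rcond=None)
        fitted[i] = beta[0]                            # intercept = fit at x[i]
    return fitted

# Smooth noisy observations of a sine curve (illustrative data).
rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0.0, 2.0 * np.pi, 80))
y = np.sin(x) + rng.normal(scale=0.3, size=x.size)
smooth = lowess(x, y, frac=0.3)
```

The smoothed values track the underlying curve much more closely than the raw points, which is exactly the "viewing data" use the text describes.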

The First International Conference on Genetic Algorithms was held in 1985, although the genetic algorithm optimization technique had been known for decades. In biostatistics, statisticians sought improved estimates of excess morbidity/mortality attributable to specific risk factors. For example, cancer risk and mutagenic risk assessments were conducted for purposes of developing coherent environmental regulations. Hidden Markov Models (HMMs) became ubiquitous in the field of bioinformatics during the late 1980s.
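Part of what made HMMs so usable in bioinformatics is that the likelihood of an observed sequence can be computed efficiently by the forward algorithm. A minimal sketch with a made-up two-state model (the probabilities are purely illustrative, not from any real analysis):

```python
import numpy as np

def forward(obs, pi, A, B):
    """Forward algorithm: total likelihood of an observation sequence under
    an HMM with initial probabilities pi, transition matrix A (A[i, j] =
    P(state j | state i)), and emission matrix B (B[i, k] = P(symbol k | state i))."""
    alpha = pi * B[:, obs[0]]             # joint prob. of each state and first symbol
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]     # propagate one step, then emit
    return alpha.sum()                    # sum over final hidden states

# Toy two-state model with a binary emission alphabet.
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])
B = np.array([[0.5, 0.5],     # state 0 emits symbols 0/1 evenly
              [0.1, 0.9]])    # state 1 strongly favours symbol 1
p = forward([1, 1, 0], pi, A, B)
```

Summing the result over all possible sequences of a fixed length gives 1, a handy sanity check that the recursion really computes a probability.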

In multivariate analysis, the advantages of James-Stein (shrinkage) estimation and empirical Bayes analysis were compelling. The jackknife and bootstrap techniques became quite popular as Bradley Efron and colleagues extended statistical analysis to take advantage of new computational power. For science in general, statisticians developed and critically evaluated meta-analysis techniques for combining data from several studies. During the 1970s, the projection pursuit technique had been proposed to automatically select "interesting" low-dimensional projections of a high-dimensional point cloud; some methods of classical multivariate analysis (e.g., principal component analysis and discriminant analysis) turned out to be special cases of projection pursuit.
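The bootstrap idea that Efron popularized is simple enough to sketch directly: resample the observed data with replacement, recompute the statistic on each resample, and use the spread of those replicates as a standard error. The helper name and data below are illustrative, not from the text:

```python
import numpy as np

def bootstrap_se(data, statistic, n_boot=2000, seed=0):
    """Bootstrap standard error: recompute `statistic` on n_boot resamples
    drawn with replacement from the observed data, then take the SD."""
    rng = np.random.default_rng(seed)
    n = len(data)
    reps = np.array([statistic(rng.choice(data, size=n, replace=True))
                     for _ in range(n_boot)])
    return reps.std(ddof=1)

rng = np.random.default_rng(42)
sample = rng.normal(loc=10.0, scale=2.0, size=50)
se_mean = bootstrap_se(sample, np.mean)

# For the sample mean there is a textbook formula to check against:
classical = sample.std(ddof=1) / np.sqrt(len(sample))
```

For the mean the two answers nearly coincide; the point of the bootstrap was that the same recipe works for statistics with no closed-form standard error.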


Active Chapters of the American Statistical Association in 1989

During this period, five new sections of the ASA were created: the Biopharmaceutical Section (BIOP), the Statistical Graphics Section (GRPH), the Government Statistics Section (GOVT), the Quality and Productivity Section (Q-P), and the Statistics and the Environment Section (ENVR).

The Montana Chapter of the ASA served a large geographic region: for any location in an area of about 225,000 square miles, Bozeman was the closest ASA chapter. See the map of ASA chapters as of 1989.6

Personal computers and associated software came to the University during this era. In 1977, the Commodore PET, one of the first mass-market personal computers, was introduced, and the Apple Computer company was incorporated. In 1980, a consortium of computer companies issued a standard for Ethernet, the first network to support 10 Mbit/s speeds. In 1983, the Lotus 1-2-3 spreadsheet program and Microsoft Word were first released for personal computers. In 1985, Microsoft Corporation released the first version of Windows, a graphical environment that ran on top of MS-DOS. That year the non-profit Free Software Foundation (FSF) was also founded to support the free software movement and the distribution of software under licenses such as the GNU General Public License. There were negative aspects as well: before the end of the decade, the first computer worm malware was distributed via the Internet; the culprit was caught and prosecuted.

Computational capabilities were rapidly increasing. The commercial package Stata was released in 1985. In 1980, Bell Laboratories distributed the first public version of S, followed by the books S: An Interactive Environment for Data Analysis and Graphics (1984) and Extending the S System (1985). Also in 1984, the source code for S became licensed for educational and commercial purposes. By 1988, many changes had been made to S and to the syntax of the language, resulting in the New S Language, which was very similar to modern versions of S-PLUS.

In the early 1980s the NSF funded the establishment of national supercomputing centers at several universities and, in 1986, provided interconnectivity with the NSFNET project, which gave research and education organizations in the United States network access to the supercomputer sites; MSU had access to supercomputers via that network. The ARPANET was decommissioned in 1990. Also in the early 1980s, networked personal computers on local area networks became increasingly important. Server-based systems similar to the earlier mainframe systems were developed, e.g., cc:Mail and Lotus Notes. BITNET (1981) provided electronic mail services for educational institutions. Eventually these systems could link different organizations, as long as each organization ran the same email system and proprietary protocol. On the horizon lay the World Wide Web and open email standards such as SMTP, POP3, and IMAP (still current in 2019), which eventually displaced those proprietary protocols.



Last revised: 2021-04-19