
Celebrating the UP Centennial

By Flor Lacanilao

If you are out to describe the truth, leave elegance to the tailor.
— Albert Einstein

UP is celebrating its Centennial next year. What can we say about its contribution to knowledge and national progress, two major functions of a university?

The University has produced recognized leaders in government, industry, the scientific community, and other sectors of Philippine society. Some private universities would be proud of such graduates because they attract more students. I think UP should take pride in its graduates only when they have contributed to knowledge or national progress.

When I entered UP as a freshman in 1954, there was only one UP, with the main campus in Diliman. The Philippines was then second only to Japan in Asia, we are told. UP has since grown into a system of 7 constituent universities on 12 campuses, with 5,000 teaching and research staff, 360 graduate programs, and 50,000 students all over the country.

Now no fewer than 10 Asian countries are ahead of us in science and technology and in national progress. Is the State University partly to blame? Where has UP failed? I will review the last two decades and limit myself to some blunders in science, their consequences, and the revival of academic excellence at UP. These offer some simple lessons and signs of hope to help reverse what has happened to our country in the last five decades.

1. Developing capability

Leaders of developed countries have long reminded their counterparts in poor countries that the best way to development is through science and technology. “America’s huge economic success comes from innovation, which is fuelled by its research enterprise. And this in turn is driven by graduate education” (1). They emphasize the sequential relations between research, science, technology, and development — the R&D process — where research is the basic component.

I think it was along these lines that UP Diliman established the College of Science in 1983. This will be one focus of my discussion. The College objective, often repeated by its first dean, was to have an all-PhD faculty. The problem here is that the PhD degree, under existing conditions, merely reflects capability (promise) and does not guarantee performance (contribution to knowledge).

By 1993 over 70 PhDs were added to the faculty of UPD College of Science, making a total of 101 PhDs or half of its teaching force (College of Science General Information, 1993-1994). Most of them were products of its graduate programs. Further, the College granted over 500 advanced degrees, including 133 PhDs, from 1985 to 1994. Other major colleges of UP have greatly increased these figures, thus boosting the University’s and the country’s R&D capability during the 90s.

At the national level, the National Science Development Board established in 1958 was transformed into the National Science and Technology Authority in 1982, and elevated to the Department of Science and Technology (DOST) in 1987. The University’s graduates and staff have dominated the management of this national agency.

The DOST launched the Science and Technology Master Plan 1990-2000. Its budget doubled from its 1991 level to P1.7 billion in 1992, rose to P2.4 billion in 1993, and reached P3.2 billion in 1995 (The S&T Post, October 1995). These were other means of improving capability.

Some of our neighbor countries also launched their development programs to advance S&T. When Singapore was developing its industrial base in the ’60s and ’70s, for example, its government relied much on the country’s scientists (internationally-published researchers, not to be confused with PhDs) and focused on developing S&T.

2. Measuring performance

Meanwhile, over the last 50 years, a revolutionary indexing system for the S&T literature was being developed and refined through the initiative of Eugene Garfield, founder of the Institute for Scientific Information or ISI (2). It was later enriched with the help of a few others, notably Nobel Laureate Joshua Lederberg of biotechnology fame. Citation indexes were issued and “led to wholly new ways of searching the literature and understanding the structure of scientific knowledge.” An example is the Science Citation Index (SCI).

The SCI introduced new ways to evaluate research. The number of publications indexed in SCI later evolved into a common indicator of research and S&T performance. Using this indicator, for example, Scientific American ranked the Philippines no. 60 in the world in 1995 (3).

With publications data from the same influential index, the journal Science reported in 1995 and 2005 on the rapid progress of science in China and India (4, 5). And the journal Nature in 2004 compared the quality of research in 31 countries using ISI’s publication and citation data (6).

The national total of papers listed in the SCI has been widely used to assess a country’s research performance and to estimate its number of productive scientists. It is widely accepted as an indicator of the state of science and technology and of economic development or underdevelopment.

The database of the ISI expanded into other fields, like social sciences, arts, and humanities, leading to other ISI indexes — SCI Expanded, Social Science Citation Index (SSCI), and Arts and Humanities Citation Index (AHCI). These major indexes of ISI have become the preferred sources of information (publications, citations, etc.) for searching the literature and measuring the research performance in all fields, not only of countries but also of universities, other institutions, and individuals.

They transformed the way researchers work. They also made it easier to define what an international refereed or peer-reviewed journal is — one covered in SCI, SCIE, SSCI, or AHCI — especially in poor and developing countries.

With this development, it has become easier to search the literature backward and forward and to evaluate research performance with objective indicators. Sadly, many continue to rate publications by “reputable” or “prestigious” journal without a useful definition of these terms.

3. Capability vs performance

During its first 10 years, the number of papers produced per PhD in the UPD College of Science increased to a yearly average of 4.7 in 1991-1993 (Table 1). Only a third of these were published, however. And the percentage that appeared in ISI-indexed journals fell over the years to 15 in 1991-1993 — only 5 percent of the total number of papers produced.

Two academic institutions abroad with disciplines related to those of the UPD College of Science had SCI-indexed publications amounting to 52 percent and 83 percent of their published papers. They are the Faculty of Science of the National University of Singapore (NUS) and the Scripps Institution of Oceanography at the University of California, San Diego (NUS Bulletin of the Faculty of Science 1988-1993 and Scripps Annual Reports 1990-1992).

The average yearly publications per PhD at NUS were more than 10 times those of the UPD College of Science: 2.53 against 0.24.

Table 1. Number of papers produced by PhDs in the College of Science faculty, University of the Philippines Diliman (yearly mean per PhD)

Period       No. of PhDs   Total papers produced   Number published   ISI-indexed (a)   Percent (b)
1985-1987         42              2.60                   1.23              0.30             24
1988-1990         67              3.74                   1.68              0.31             18
1991-1993         89              4.74                   1.60              0.24             15

Data from the Dean’s Office. An earlier version of this table was published in Diliman Review 47(2):11-18, 1999.

(a) In journals covered by the Science Citation Index of the Institute for Scientific Information.
(b) Of published papers.
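As a quick check on the figures cited above (my own arithmetic from the 1991-1993 row of Table 1, not part of the original data):

$$\frac{0.24}{1.60}\times 100 \approx 15\% \ \text{of published papers were ISI-indexed}, \qquad \frac{0.24}{4.74}\times 100 \approx 5\% \ \text{of all papers produced}.$$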

UP Diliman produced 27 ISI-indexed publications in 1995, all of them from the College of Science and 15 from the Marine Science Institute. The outputs of UP Los Baños, UP Manila, and UP Visayas were 16, 14, and 3, respectively. Note that UP had nearly 4,500 teaching and research staff at the time, more than half of them with graduate degrees.

On the other hand, with only about 60 PhDs, the Los Baños-based International Rice Research Institute (IRRI) produced 106 ISI-indexed publications in 1995. The big difference is partly explained by the teaching load at UP. But then, how many more people are on UP’s research staff?

The main reason, of course, is the numerous papers that UP produced but did not count: the unpublished papers and gray literature (papers published without adequate peer review and not widely accessible for international peer verification of results). In the UPD College of Science, these made up 95 percent of the total (Table 1). Production of such papers continued because the University rewarded their authors.

(The most outstanding Filipino scientist who has done his work in the Philippines was produced by IRRI. Dr. Bienvenido Juliano is the only Filipino listed in the ISI Highly Cited Researchers since 1981. These consist of 6,520 scientists worldwide in 21 categories in life sciences, medicine, physical sciences, engineering, and social sciences. Singapore is the only other ASEAN country included, with four highly cited researchers from NUS. See: http://isihighlycited.com)

Since the UPD College of Science had the highest college output (27 publications against 33 for the rest of UP in 1995), representing only 5 percent of the papers it produced, the College may also have spent the most time and money on unpublished papers and gray literature.

Yet in 1997, when my professorial chair was up for renewal at the UPD College of Science, I filled out the appropriate blanks for international publications, discarded the rest of the form’s pages, which I considered irrelevant, and submitted only that one page with two or three lines filled out. I lost my professorial chair (which I don’t recall how I got in the first place).

The problems described above were prevalent throughout the University. The major cause was the failure to use objective indicators to evaluate research proposals and output. The common way of evaluation was peer judgment or expert opinion (of which we do not have enough).

4. Poor rating in university rankings?

In the ranking of world universities, the criteria used may be grouped into expert opinion, capability indicators, and performance indicators. Two commonly cited rankings are the THES-QS World University Rankings (2004-2006) and the Shanghai Jiao Tong University’s “Academic Ranking of World Universities” (2003-2006).

THES-QS rankings relied largely on expert opinion (40%), faculty/student ratio (20%), other non-performance indicators (20%), and research citations (20%), the last being the only objective indicator of academic performance (7). Four Philippine universities made it into the world’s top 500. Their rankings: UP 299, La Salle 392, Ateneo 484, and UST 500.

Expert opinion has been shown to have no correlation with bibliometric scores: “We criticize the recent expert-based rankings by showing that the correlation of expert score and bibliometric outcomes is practically zero. This finding casts severe doubts on the reliability of these expert-based rankings” (Leiden University, The Netherlands, 2005).

The rankings done by Shanghai Jiao Tong University in 2003-2006, on the other hand, used only objective indicators of academic and research performance: alumni and staff winning prizes and awards (30%), articles covered in major citation indexes (20%), highly cited researchers (20%), articles published in the journals Nature and Science (20%), and the per capita academic performance of an institution (10%).
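Read as a formula (my own restatement of the weights just listed, not the SJTU methodology document itself), the composite score is simply a weighted sum of the five indicator scores:

$$\text{score} = 0.30\,A + 0.20\,C + 0.20\,H + 0.20\,NS + 0.10\,PCP,$$

where A is the alumni-and-staff awards score, C the score for articles covered in the major citation indexes, H the highly-cited-researchers score, NS the Nature and Science articles score, and PCP the per capita academic performance score.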

The source of publications and citations data was ISI’s major indexes. In these rankings, none from the Philippines has yet made the top 100 in the Asia Pacific or in the top 500 in the world (8).

The above examples show the general weakness in academic or research performance of Philippine universities compared with their academic capabilities.

5. Philippines’ S&T performance

The national publication output reflects that of UP. Among seven nations in the region — Taiwan, South Korea, Singapore, Thailand, Malaysia, Indonesia, and the Philippines — we made the least progress in S&T between 1981 and 1995 (9, 10).

Our Asian neighbors posted 37-300 percent increases in ISI-indexed publications in 1994-1995 over their 1981-1992 average figures; the Philippines’ output increased by only 7 percent. Smaller Taiwan had 23 times and tiny Singapore 6 times more publications than the Philippines, which had 20 times more people than Singapore in 1995. Recent ISI data show that Vietnam has already overtaken the Philippines in number of publications.

Whereas we relied mainly on peer judgment, the countries above, which have left us behind, used the established indicators — publications in international peer-reviewed journals — to measure performance and progress.

Despite its poor S&T performance in 1981-1995, the DOST placed a 33-page advertisement in Scientific American in February 1996 under the banner “Globally Competitive Philippines.” It claimed pioneering efforts in metals technology, materials science, electronics, information technology, “and, most especially, biotechnology.” Who were we kidding?

Using SCI-indexed publications, the same science magazine ranked the Philippines’ S&T performance a poor no. 60 in the world in 1995, below all the countries mentioned above except Indonesia (3). The advertisement must have cost the country millions of pesos and worldwide embarrassment. Resorting to publicity and claims of achievements will not improve performance.

6. UP and national progress

The crucial role of science and technology in economic transformation was again shown by the rapid growth of China, which has become the fifth leading nation in terms of its share of the world’s scientific publications (11). On the other hand, the Philippines’ performance in S&T, which is partly the fault of the University of the Philippines, proves RP’s lack of the requisite research ability and political leadership that such a transformation needs.

UNDP’s Human Development Reports, using social and economic indicators, show a nation’s development compared with that of others. Among 177 countries and territories, the Philippines ranked no. 77 in 1997 and 1998, but this dropped to no. 83-85 in 2000-2004. Other indicators may show a better or worse Philippines.

For example, “We could abandon gross national product (GNP) as an indicator of economic well-being; it suggests to the consumer that our economies need take no account of sustainability. In the United States, per capita GNP rose by 49% during 1976-98, whereas per capita ‘genuine progress’ (the economy’s output with environmental and social costs subtracted and added weight given to education, health, etc.) declined by 30%” (12).

7. Why did NAST fail?

The foregoing sections show that UP’s poor S&T performance in the 80s and 90s must have influenced national progress. Here I show the nature of that influence.

Crucial to a country’s economic transformation is the role played by its national academy of science. In the Philippines this is the National Academy of Science and Technology (NAST), which is mandated “to advise the President and the Cabinet on matters related to science and technology.” Hence, it is mainly responsible for the DOST’s administration of science in the Philippines.

As a national academy of science, the NAST should not only provide extensive policy advice to the government, but must also be active in policy debates related to science-based programs. It should play a major role in economic reform and social transformation. It should promote science literacy, especially among government and industry leaders, and ensure that scientific research is incorporated into all of the country’s development strategies.

With UP graduates and staff dominating the NAST membership and officers, why did it fail in its national functions? Perhaps the best way to explain this is to look at the individual performance of its members in science and social sciences. Note that the number of ISI-indexed publications is the established measure of research and S&T performance, not only of countries and organizations but also of individuals.

For example, only 7 of NAST’s 50 members in 1998 had enough ISI-indexed publications in 1981-1997 to be considered scientists (Table 2). None of its presidents is among them. Over a third of its members had no publications at all during these 16.5 years. Neither can the publications in the ’60s and ’70s of most of its older members, including national scientists, compare with those of the top seven NAST members. For a national science academy, this profile is indeed a very sad reflection of its criteria for electing members.

The good news is that our most outstanding scientists are among its members; seven of them are shown in Table 2. Foremost are BO Juliano of the International Rice Research Institute and LJ Cruz and ED Gomez of the UP Diliman Marine Science Institute, who continue to publish after retirement. Other scientists and nonscientists have been elected to NAST since 1998. The outstanding scientists among them are C Saloma of the UP Diliman National Institute of Physics, AC Alcala of Silliman University, and CC Bernido of the Research Center for Theoretical Physics at the Central Visayas Institute in Bohol.

They strengthen the organization, but there are not enough of them to make a difference. In making decisions, they are outnumbered by the dominant unpublished and poorly published members and officers. This is also true of the National Research Council of the Philippines, our other national science organization.

We often hesitate to speak of serious problems, especially if the problems are people, because we don’t want to offend anyone. We forget that problems will remain, and give rise to other problems, if we don’t talk and do something about them. Are we more concerned about the feelings of the culprits than about the persistent poverty in our country? As the editor-in-chief emeritus of Science says, “Scientists who mute their voices to avoid irritating colleagues do not help the overall science program” (13).

We have other outstanding scientists who have not been elected to NAST. No fewer than 15 of them have scientific publications in 1981-1997 within the range of the best seven NAST members shown in Table 2. Two of the four members elected in 2007 were among them; the two others, however, were poorly published — raising more doubts about NAST’s criteria for electing members.

Consider how the U.S. National Academy of Sciences takes new members (14). “A formal nomination can be submitted only by an Academy member. Each nomination includes a brief curriculum vitae plus a 250-word statement of the nominee’s scientific accomplishments— the basis for election—and a list of not more than 12 publications. The latter limit helps to focus on the quality of a nominee’s work, rather than the number of publications”.

The reason members are carefully chosen is to ensure that “each member should serve as a role model for defining excellence in science for the next generation of scientists in his or her field.”

Table 2. Of the 50 members of the National Academy of Science and Technology in 1998, those published in international refereed journals in 1981-1997

                                  No. of publications (a)
Academician        Year elected   Total   As sole or lead author
1. Juliano BO          1979        107             30
2. Cruz LJ             1987         40             10
3. Gomez ED            1993         28             13
4. Domingo EO          1992         27              9
5. Fabella RV          1995         14             14
6. Garcia EG           1987         30              6
7. Encarnacion J       1979          8              8

Of the 50 members in 1998, 25 had fewer than 6 published papers as sole or lead author, and 18 had no published paper in ISI-indexed journals in 1981-1997.

Source: National Citation Report (1981-June 1997), Institute for Scientific Information, Philadelphia.
(a) Published papers with Philippine addresses.

The “scientific accomplishments” that serve as the basis for election to the U.S. academy of science do not necessarily include science administration. In developed countries, a science administrator may also become a member of a national science academy, but there they are well-published scientists before they become science administrators. This is rarely seen in the Philippines. Yet past heads of the DOST are among NAST’s members.

“From global terrorism and the spread of disease to the dangers of global warming, we are increasingly facing the sorts of threats for which governments everywhere will need to turn to their scientists” (6). But with the above picture of NAST membership and leadership, how can we rely on this important national science organization to address these threats effectively? The NAST is the basic cause of the country’s poor state of S&T. And UP is partly responsible.

8. Reviewing poor performance

A lot of money was spent funding research proposals, research honorariums, professorial chairs, and faculty grants for researchers without published papers in international peer-reviewed journals. These have hardly taught recipients how to do research properly. They led instead to the continued production of unpublished studies, institutional publications, and other gray literature, earning promotions or even awards for their authors. An example is the award given by NAST to authors of winning papers published in Philippine journals.

International scientists searching the literature do not find most of these papers because they are largely not covered by widely used indexes. If they are ever found, they are largely ignored by established scientists and institutions around the world. The papers are mostly of doubtful scientific value or contribution to knowledge: their manuscripts have not passed adequate peer review, and the published results are not easily accessible for peer verification. Hence, as a former editor-in-chief of the leading journal Nature says, only international publications are taken seriously (15).

Authors of many local books cite such unpublished papers and gray literature (from project reports, institutional publications, conference proceedings, and local journals), which often dominate the list of references (16). In some of these books, not a single international journal paper by the author is cited. Some training and extension manuals by local authors do not have a single ISI-indexed reference. The quality and integrity of a publication depend on the quality of the bibliography appended to it (17).

An even worse practice is to use unpublished data for policy-making and development programs. Totally ignoring the established procedures of scientific research (a tradition that has been developing for over three centuries), the practice is common in local projects because of contractual demands from the government or international funding agencies. Some seekers of high positions even include a long list of “unpublished research” among their achievements. I remember two candidates for UP president who included this in their CVs published in the UP Newsletter sometime in 1993.

The causes of these poor practices are the following: (a) PhD students are not required to publish their theses, even though the thesis is meant as training for research, and research is not complete until it is properly published; (b) government research honorariums require only a progress or final report; (c) peer judgment is the main basis for funding research proposals and rating output; and (d) gray literature earns its authors recognition and benefits.

These four causes of poor performance have continued with hardly any change. Faculty members became full professors without any indication of contribution to knowledge or publication in international peer-reviewed journals. ISI’s indexes have always been out there and have been used by progressive countries.

If only part of the nearly P2 billion increase in the budget of the DOST from 1991 to 1993 had been used as incentives (e.g., for research proposals and output), the state of S&T at UP and in the country would have been different. And economic growth would have taken a different turn. “The alternatives are clear: keep up or be left behind.”

9. Renaissance

The revival of scholarship in UP had to come. The major development was the proper use of research funds, focused on funding only properly published researchers and rewarding authors of valid publications.

The beginning of the 21st century also marked the start of the revival of academic excellence in UP. The number of UP’s scholarly publications covering all fields doubled in 2002 to 190. This was 40 percent of the national total, up from 25 percent in 1997-1999. The increase was largely due to the P50,000 reward given for every ISI-indexed paper published from 1999 onward.

On the other hand, the combined publication output of La Salle, Ateneo, UST, and San Carlos during the same period increased from only 7.8 to 8 percent of the national total. Sooner or later, they are likely to join UP in this scientific revolution.

In the UPD College of Science, the increase in publications is shown in more detail in Table 3. All the constituent units greatly increased their number of publications over the 1994 figures. Note the College total of 62 valid publications at the turn of the century; as noted earlier, the entire UP produced only 60 such publications in 1995.

Table 3. Number of ISI-indexed papers produced by the UPD College of Science (yearly averages)

             Institute or Department                                               
Year         Physics   Mar Sci   Geology   Chemistry   Biology   Others   Total (a)
              (25)      (17)      (18)       (32)        (20)      (47)     (159)
1994            8        11         0          2           3      0 (b)       24
2000-2001      20.5      19         9.5        8.5         2.5    7 (c)       62
2005-2006      18.5      15        13          6           6.5    6 (c)       65

Data from the Science Citation Index (1994 figures) and the College of Science Annual Report SY 2006-2007 (2000-2001 and 2005-2006 figures).

( ), number of Professors, Associate Professors, and Assistant Professors in the second semester of AY 2006-2007.
(a) Corrected total, since publications coauthored by two institutes are entered twice.
(b) Mathematics and Meteorology.
(c) Mathematics, Environmental Science & Meteorology, Molecular Biology & Biotechnology, and the Natural Science Research Institute.

Given the difficulty of sustaining creativity on an empty stomach, the cash incentive brought some relief and made a significant change in the evaluation practice — the key to better performance. It also ensured that money spent on research produced scholarly publications. Let me quote a former colleague on rewarding researchers:

“UP professors are very much underpaid and the general public does not realize how difficult it is to raise a family on a UP salary alone. Why would young people aspire to become publishing UP professors if they see that their mentors can barely make ends meet? What role models would they have if most professors do not publish and recite ‘science lessons’ from textbooks, rather than doing it themselves! It is in this light that ‘financial inducements’ for doing research of an international caliber and publishing in peer-reviewed international journals make sense.”

“You shouldn’t have to bribe people to be researchers; however, human nature being what it is, most people will put their energy into rewarded rather than unrewarded activities” (18).

At the Southeast Asian Fisheries Development Center in Iloilo, the requirement of ISI-indexed publication for promotion was introduced in 1986, and a cash incentive — 50 percent of the researcher’s annual salary — followed in 1989 (3). By 1993, the average number of publications per researcher had increased seven-fold. The all-Filipino staff of 50 researchers, with only 9 PhDs, published an average of 0.77 papers per researcher. Among Philippine R&D organizations and academic institutions, only the UP Marine Science Institute achieved comparable performance at the time.

There have also been increased publications in international journals from Latin American countries, whose government funding agencies gave research incentives (3). Incentives were in the form of increased salary (e.g., Mexico) or lucrative fellowships (e.g., Brazil) to scientists who published in international peer-reviewed journals.

As much as a four-fold increase in publications has been reported in Brazil. Three universities (in Mexico, Brazil, and Argentina) made the Shanghai Jiao Tong University’s top 200 in the world in 2006. Three others (in Brazil and Chile) are in the top 500. None from the Philippines has yet made the SJTU’s top 500.

Another significant development in the University was the inclusion in 2001 of UPLB’s Philippine Agricultural Scientist (PAS) among the nearly 6,000 journals indexed in ISI’s SCI Expanded. Part of UP’s publication increase in 2002 came through this journal, the Philippines’ only ISI-indexed journal in science since 1984.

The Philippine Journal of Science could have joined the PAS as the Philippines’ second international journal and boosted the revival of science. Through the initiative of some scientists from UP Diliman, the new PJS was launched with 35 international scientists on an editorial board headed by Dr. Lourdes Cruz; some 15 Filipinos on the board were based in the country. The journal could easily have been ISI-indexed after three or four issues. But after two issues in 1999, the editors gave up in frustration with the DOST, which did not give the support expected of a national agency for science.

The Philippines used to have three ISI-indexed journals, all from UP — the Philippine Journal of Veterinary Medicine, the Philippine Journal of Internal Medicine, and Kalikasan — until the early 80s. They could not maintain the standard required for ISI coverage. I think those involved in their management simply did not see the importance of journal accessibility through ISI indexes for peer verification of results, a requirement of the scientific method.

The research incentives have proven to be a more effective way of learning and doing research properly. There are thousands of journals out there that can publish our papers — SCI Expanded covers nearly 6,000 journals; Social Science Citation Index, over 1,700 (including one from the Philippines); and Arts & Humanities Citation Index, over 1,100. There are more than enough journals in one’s field to choose from, without page charges.

Papers published in ISI-indexed journals are assured of superior quality, of reaching more scientists for verification of results, and of a permanent place in the literature of the sciences, social sciences, and humanities. Only then can research output be useful for policy-making, education, development programs, and generating useful technologies.

There is no shortcut for an underdeveloped country. We cannot get the full benefits of technologies from developed countries without knowing how to do research properly. Otherwise using foreign-made technologies would have made ours a developed country by now. But we never seem to learn our lessons.

Now that UP has shown how to do science, the NAST should review its criteria for membership, and the DOST should change its ways of science administration. At the least, the tax-paying public is entitled to the benefits of science, and reducing poverty is long overdue.

10. Was there a Golden Age?

If there was, it would have been sometime during UP’s first 50 years. At the end of that period we were second only to Japan in Asia. I don’t know how UP’s performance was measured then or what criteria were used for developing academic excellence. A golden age could not have come in the second half of the last century, as this was when some Asian countries left us behind.

At the conference to celebrate the 125th anniversary of Nature in 1994, the speakers agreed on the conditions for excellence (19). Among them are the following: First, excellence should be the primary criterion in decisions on appointments and funding. Second, excellence in research is not an excuse for mediocrity in teaching. Third, regular and objective assessment of research and teaching is essential. There are four other conditions, including that networking internationally is crucial.

Objective indicators of academic performance in all fields were developed over the last 50 years. When these indicators (e.g., number of ISI-indexed publications) are used to assess our recognized scholars and academic leaders, including national scientists, most of them fail to measure up to their status and reputation. Their absence from Table 2 is an example.

Another commonly used indicator of performance is the number of citations a body of work receives (6, 20, 21). Divided by the number of publications, it gives some measure of the quality of the average paper and is a widely used objective measure of quality. Citation analysis gives an estimate of one’s contribution to knowledge. Using the Science Citation Index and the Social Science Citation Index, I found only two past UP presidents from the 80s and 90s who can be considered scholars and academic leaders — Dr. Jose V. Abueva and Dr. Francisco Nemenzo.
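In formula form, the indicator just described (a simple restatement of the sentence above, not a figure taken from any dataset) is:

$$\text{quality of the average paper} \approx \frac{\text{total citations received}}{\text{number of publications}}.$$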

The revival of scholarship at UP actually started in 1993, when President Abueva launched the incentive program, which gave P30,000 for outstanding publications. It was timely because the DOST budget had increased three-fold from 1991 to 1993, and UP could easily have asked the DOST for additional support for rewarding researchers and making scientists out of them.

But the innovative program was stopped when Dr. Emil Javier assumed the UP presidency in 1993. The revival, or beginning, of true scholarship at UP had to wait 6 more years, as did UP’s national impact, at a time when the DOST had plenty of money — a four-fold increase in its budget from 1991 to 1995. The research incentive was restored only in 1999, when President Nemenzo took over and raised the reward to P50,000 (later raised further to P55,000) per paper published in an ISI-indexed journal. What better way is there to save research funds, ensure the desired output, and achieve academic excellence?

Perhaps because the government saw that the increased funds for S&T in 1991-1995 did not bring about the expected results, the DOST budget started declining in 1996. But now increased S&T funding would be justified by the changes happening in the research and development programs started at UP.

It took the social scientists Abueva and Nemenzo to see the importance of proper research publication in promoting scholarship and advancing the scientific knowledge essential to national progress. The setback in the growth of UP’s academic performance from 1993 to 1999, and the lost opportunity for national impact, will be recorded in a properly written history of UP, archived for future scholars who will make objective assessments of academic leadership and excellence in the University of the Philippines.

11. Acknowledgment

I thank Dr. Gil Jacinto of the UP Marine Science Institute for help in the ISI data search and Dr. Jurgenne Primavera of the Southeast Asian Fisheries Development Center in the Philippines for reading the manuscript.

12. References

  1. Bhattacharjee Y. How to hone U.S. grad schools. ScienceNOW Daily News, 26 April 2007.
  2. Arunachalam S. 2005. Fifty years of citation indexing. Current Science 89:10.
  3. Wayt Gibbs W. 1995. Lost science in the Third World. Scientific American (August): 76-83.
  4. Mervis J and Kinoshita J. 1995. Science in China: a great leap forward. Science 270: 1131-1152.
  5. Mashelkar RA. 2005. India’s R&D: Reaching for the top. Science 307:1415-1417.
  6. King DA. 2004. The scientific impact of nations. Nature, 430:311-316.
  7. http://www.topuniversities.com/worlduniversityrankings/faqs/
  8. Cheng Y and Liu NC. 2006. A first approach to the classification of the top 500 world universities by their disciplinary characteristics using Scientometrics. Scientometrics 68: 135-150. Also posted at http://ed.sjtu.edu.cn/ranking2006.htm
  9. Garfield E. 1993. A citationist perspective on science in Taiwan, 1981-1992. Current Contents 24(17):3-12, April 26.
  10. ISI’s Science Citation Index, 1994 & Jan-Nov 1995.
  11. Zhou Z and Leydesdorff L. 2006. The emergence of China as a leading nation in science. Research Policy 35: 83-104.
  12. Myers N. 2000. Sustainable consumption. Science 287:2419.
  13. Koshland DE. 1993. Basic research (III): Priorities. Science 259:1379. (Editorial)
  14. Alberts B and Fulton KR. 2005. Election to the National Academy of Sciences: Pathways to membership. PNAS 102:7405-7406.
  15. Maddox J. 1995. Center for research excellence replicates. Nature 374:403.
  16. Lacanilao F. 1997. Continuing problems with gray literature. Environmental Biology of Fishes 49:1-5. (Invited editorial)
  17. Kochen, M. 1987. How well do we acknowledge intellectual debts? Journal of Documentation 43: 54-64.
  18. Johnston TD. 1991. Evaluating teaching. Science 251:1547.
  19. Maddox J. 1994. How to pursue academic excellence. Nature 372: 721-723.
  20. May RM. 1997. The scientific wealth of nations. Science 275: 793-796.
  21. Warner J. 2000. Research assessment and citation analysis. The Scientist 14:39.