
Basic problems in Philippine science and education

By Flor Lacanilao

Producing a bigger workforce and putting in more money have been the usual answers to the poor state of Philippine science and education. The lack of workforce and money has been the convenient excuse for poor work output or outright failure to do the job. An analysis of the situation, however, does not point to these as the real causes. It is the failure to attend to the basic causes that has led to the continued deterioration of Philippine science and education (see Only science can solve poverty).

A major culprit is performance evaluation based on the personal judgment of unpublished and poorly published officials and faculty members, instead of objective, internationally accepted criteria.

Consider the predicament of the National Science Consortium, which has been put up by the country’s seven top universities and the Science Education Institute of the Department of Science and Technology (DOST). Its objectives are (1) to enhance the capability of the higher education system by producing technically competent PhD and MS graduates and (2) to address the lack of Filipino researchers, to enable our country to compete economically with its neighbors (see Time to abandon GDP).

The Consortium cites UNESCO figures showing the Philippines with 7,500 researchers in 2009 against Singapore’s 28,000. Last year, at the international conference in Japan on teaching and research activities, a report on scientific publications of 10 Asian countries showed the Philippines with the fewest published papers. The Philippines had only 178 valid publications in 2005, whereas tiny Singapore had 3,609, or 20 times more. Since Singapore’s 28,000 researchers were only about 4 times the Philippines’ 7,500, while its research output of 3,609 was 20 times our 178, Singapore researchers were roughly 5 times more productive than their Philippine counterparts. How did this happen?
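A quick back-of-the-envelope check confirms that ratio. The sketch below simply divides the publication counts by the researcher counts cited above; it uses only the article’s figures and is an illustration, not an independent dataset.

```python
# Illustrative check of the productivity ratio, using the figures cited
# in the article (2005 publications, 2009 researcher counts).
ph_pubs, ph_researchers = 178, 7_500
sg_pubs, sg_researchers = 3_609, 28_000

ph_rate = ph_pubs / ph_researchers   # about 0.024 papers per researcher
sg_rate = sg_pubs / sg_researchers   # about 0.129 papers per researcher

print(f"Philippines: {ph_rate:.3f} papers per researcher")
print(f"Singapore:   {sg_rate:.3f} papers per researcher")
print(f"Ratio:       {sg_rate / ph_rate:.1f}x")   # roughly 5 times
```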

Failed programs

Data from the last 3 decades show that although the country’s number of researchers has been increasing, properly published papers per PhD have decreased, and the national output has hardly increased. A lot of the research funds went to unpublished or poorly published researchers, who produced unpublished or poorly published papers without adequate peer review; in short, gray literature. Gray literature is not taken seriously, and it does not count in international evaluations of research performance, such as the above-cited study of publications in 10 Asian countries. It does not contribute to development.

At UP Diliman, the College of Science, established in 1983, aimed at an all-PhD faculty. It succeeded in doubling the number to over 90 PhDs in 10 years. But the share of properly published papers decreased: it dropped from 24 to 15 percent of total publications, or from 12 to only 5 percent of the papers produced per PhD. This means that for every 20 papers, only 1 counted in international evaluations of S&T performance.

Further, the DOST launched the Science and Technology Master Plan for 1990-2000. Its R&D budget increased yearly from 1991 to 1995, a fourfold growth from P800 million to over P3 billion. From 1992 to 1998, it implemented the Engineering and Science Education Project (ESEP), which was meant to upgrade engineering and science education through PhD and MS scholarships. “If one surveys local universities today, one will find that many of the leaders were ESEP graduates,” a report said.

Yet with all the money, effort, and years spent on those programs, the country’s S&T hardly improved. The research output remained the same from 1981 to 1995. And the total of Philippine publications in leading journals even decreased from 2000 to 2005, from a mere 185 down to 178, as reported by the above-cited study of publications in Asia; China, South Korea, Singapore, and Thailand, in contrast, roughly doubled their publications during the same period.

Hence, the programs largely succeeded only in increasing the number of poor mentors and decreasing the general quality of graduates.

There are 764 PhD faculty members from our top universities who are involved in the Consortium program. Granting they produced all of the country’s 178 publications in 2005, their research productivity is only 0.23 paper per PhD. This is far below that of the National University of Singapore, whose 154 PhDs in science produced 389 publications in 1994, or 2.5 per PhD. That is more than 10 times the output of our best graduate faculties. (World-class performance is 1 per PhD per year.) How can they be expected to properly mentor the projected 250 PhD and 350 MS graduates yearly? (see Training graduate students)
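The same back-of-the-envelope arithmetic, again using only the numbers quoted above, shows how wide the per-PhD gap is; the sketch below is illustrative, not an official statistic.

```python
# Per-PhD productivity implied by the figures in the article.
consortium_phds, ph_pubs_2005 = 764, 178
nus_science_phds, nus_pubs_1994 = 154, 389

consortium_rate = ph_pubs_2005 / consortium_phds   # about 0.23 papers per PhD
nus_rate = nus_pubs_1994 / nus_science_phds        # about 2.5 papers per PhD

print(f"Consortium faculty:  {consortium_rate:.2f} papers per PhD")
print(f"NUS science (1994):  {nus_rate:.2f} papers per PhD")
print(f"Gap:                 {nus_rate / consortium_rate:.0f}x")  # more than 10 times
```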

The programs cited above have failed because unpublished and poorly published officials and faculty members have relied on personal judgment when evaluating research proposals and publications, screening faculty applicants, and selecting candidates for promotions, recognitions, and awards (see Problems preventing academic reforms).

Effective systems

There are objective and internationally accepted criteria for performance evaluation (The scientific impact of nations). When implemented with cash rewards for outstanding publications, these criteria greatly increased useful research output. At the UP, where a P55,000 reward has been given per paper published in an international journal, publications increased from 25 to 40 percent of the national total between 1997-99 and 2002. (The combined publication output of La Salle, Ateneo, UST, and San Carlos during the same period increased from only 7.8 to 8.0 percent of the national total; the rest was largely produced by the International Rice Research Institute in Los Baños, with only about 60 PhDs.) At the Southeast Asian Fisheries Development Center (SEAFDEC) in Iloilo, which offered a cash incentive of 50 percent of annual salary, publications of the 50 all-Filipino research staff, with only 9 PhDs, increased sevenfold by 1993, after 6 years. In fast-developing countries like China and Brazil, other forms of incentives have significantly increased published papers in international journals. (See Celebrating the UP Centennial.)

With an incentive system that uses objective, internationally accepted criteria, it would be possible for the Philippines to produce the desired output, save on research funds, justify even higher R&D budgets, and find a viable way to really reform its science and higher education. Program funds should therefore be made available as rewards for properly published papers and as support for proposals from published proponents. The accepted criterion is publication in journals covered by the Thomson ISI indexes. Important journals are covered in the Science Citation Index and the Social Sciences Citation Index. These are internationally acknowledged indicators of academic performance.

For its part, the Commission on Higher Education (CHED) should do away with putting up research journals; instead, it should encourage researchers to publish in journals covered by the Thomson ISI indexes, like the two mentioned above. CHED is supporting 190 state universities and colleges where only 10 percent of the faculty members have PhD degrees. Worse yet, how many of them are properly published? How can they effectively manage research journals or review manuscripts?

The objective performance indicators and the incentive system will minimize subjective evaluations by nonscientists, fix the other wrong research practices, improve the performance of the 7,500 researchers, produce better-qualified mentors and instructors for graduate and undergraduate students, and provide better teachers for primary and secondary education (see an example on how academic scientists are solving persistent problems in primary education in Education reform amid scarcity).

As Carl Wieman, Nobel laureate in physics, has observed, it is doubtful that great progress can be made at the primary and secondary levels until a higher standard of science learning is set at the post-secondary level (see Reinventing science education).

In sum, the Philippines should radically reform its approach to solving these problems to ensure the achievement of its objectives. It is not about the advancement of science just for science’s sake. Rather, it is about advancing science in the context of a desire to improve the human condition. This entails attention to the processes by which understandings from the natural sciences, the social sciences, and engineering influence, or fail to influence, public policy (S&T for sustainable well-being). The Philippines will then have a chance of catching up with its more progressive neighboring countries in science, education, and national progress. There is No shortcut to progress.


1 Comment

  1. beatburn

    “And provide better teachers for primary and secondary education (see an example on how academic scientists are solving persistent problems in primary education in Education reform amid scarcity).”

    I strongly agree with this. Start a research and development culture in basic education. Children are more eager to come up with solutions to external, even global, issues. Ask them to come up with creative ideas to propose solutions for our energy and environment problems, and for sure they will have answers. Once they become adults, they tend to be constrained or affected by their personal and domestic affairs.

    Start them young so to speak.

    Thanks for this very informative article most especially regarding the research output of our academics.
