by Flor Lacanilao, Ph.D.
A leading Chinese university has been ranking world universities by academic or research performance. So far, no university from the Philippines has made it into the top 100 in the Asia Pacific or the top 500 in the world.
The top universities are Harvard, Cambridge, Stanford, Berkeley, MIT, Caltech, Columbia, Princeton, Chicago, and Oxford.
In the Asia Pacific, they are Tokyo U, Kyoto, Australian Nat’l, Hebrew U, Osaka, Tohoku, Melbourne, Tokyo Tech, Nagoya, and several universities that tied for no. 10, including Nat’l U Singapore, Tel Aviv, and Queensland.
The study uses the following indicators: alumni and staff winning prizes and awards (30%), articles covered in major citation indexes (20%), highly cited researchers (20%), articles published in the journals Nature and Science (20%), and the per capita academic performance of an institution (10%).
Sources of published articles and citations are the following indexes of the Institute for Scientific Information (ISI): Science Citation Index Expanded, Social Sciences Citation Index, and Arts & Humanities Citation Index. (For the full report, see Scientometrics 68:135-150, 2006, or http://ed.sjtu.edu.cn/ranking.htm)
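To make the weighting concrete, here is a minimal sketch (in Python) of how such a composite score could be computed. It assumes each indicator has already been normalized to a 0-100 scale, as the published methodology does by scaling against the top-scoring institution; the indicator names and example figures below are hypothetical, not data from the study.

```python
# Minimal sketch of the weighted composite score described above.
# Assumes each indicator is already normalized to a 0-100 scale
# (the published methodology scales against the top-scoring institution).
# Indicator names and example figures here are hypothetical.

WEIGHTS = {
    "alumni_and_staff_awards":  0.30,  # prizes and awards (30%)
    "indexed_articles":         0.20,  # articles in major citation indexes (20%)
    "highly_cited_researchers": 0.20,  # highly cited researchers (20%)
    "nature_science_articles":  0.20,  # papers in Nature and Science (20%)
    "per_capita_performance":   0.10,  # per capita academic performance (10%)
}

def composite_score(indicators):
    """Weighted sum of normalized indicator scores (0-100 scale)."""
    return sum(WEIGHTS[name] * score for name, score in indicators.items())

# Hypothetical normalized scores for one institution:
example = {
    "alumni_and_staff_awards":  12.0,
    "indexed_articles":         55.0,
    "highly_cited_researchers":  8.0,
    "nature_science_articles":  10.0,
    "per_capita_performance":   30.0,
}
print(round(composite_score(example), 1))  # 21.2
```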
Note that they use only measures of academic or research performance, rather than academic indicators of capability (advanced degree holders, faculty-student ratio, financial resources, and the like), which other university rankings also use. But capability does not guarantee performance. Further, other university rankings also rely on peer opinion of academic reputation, which has been shown to have no correlation with bibliometric scores.
For example, in the ranking of Asian universities by the now-defunct Asiaweek Magazine (1997-2000), 35 universities, including Tokyo U, withdrew because of such subjective, non-performance indicators. Four from the Philippines (UP, La Salle, Ateneo, UST) made the list of the remaining 77 universities in Asia.
(http://www.asiaweek.com/asiaweek/features/universities2000/schools/multi.overall.html)
Further, in the THES-QS World University Rankings 2006, which also included academic capability indicators, the same four Philippine universities made it into the top 500. Their rankings: UP 299, La Salle 392, Ateneo 484, and UST 500.
(http://www.topuniversities.com/worlduniversityrankings/faqs/)
These examples show our universities’ general weakness in academic or research performance relative to their academic capability.
Quantitative indicators
ISI’s citation indexes have been widely used in evaluating research performance. The most common measure of research and S&T activity is the number of published papers indexed in the Science Citation Index (SCI) and, more recently, the SCI Expanded, as in the university ranking above. Examples have appeared in top science journals and magazines.
Another common indicator is the number of citations a publication receives, as also used above for universities. A corrected citation count estimates readers’ acceptance of a published paper. In the hard sciences, citations are generally formal acknowledgments of authoritative or important work; they indicate the extent of peer verification of published results.
Publications and citations are objective measures of research performance. In developed countries, peer judgment often supplements publication and citation counts. But the Philippines is still short of scientists, so for us peer judgment is neither an appropriate nor a reliable measure of research performance.
Publications, citations, and peer judgment are measures of research output. Research input, or presumed capability, is usually assessed by research funding and the number of research staff. Again, these are not suitable in the Philippines, because most researchers who receive grants from our funding agencies are unpublished in international journals.
Hence, the most reliable measure of research performance for us is the number of publications in ISI-indexed journals, referred to here as int. journals. This may be augmented with citation counts; in fact, these two are the measures commonly used by developing countries.
Symptoms and causes of R&D problems (Bato-bato sa langit: no offense to whomever this hits)
Neighboring nations that recognized the importance of proper research publication and evaluation have made big progress in S&T. They were behind the Philippines in the 1960s, but they now lead the regional race for development. Examples are China, South Korea, Taiwan, and Singapore.
The Philippines, however, has yet to accept these established measures of research performance. Most of our researchers are still engaged in producing gray literature. This is the main cause of our R&D problems, and its symptoms are seen at all levels of our R&D endeavor: graduate training, writing books and manuals, giving research grants and awards, decision making, disseminating scientific information, and implementing development programs. Among those to blame are academic science departments, awards bodies, journal editorial boards, funding and R&D institutions, and the two national science organizations.
Notable exceptions are (a) the UP Marine Science Institute in UP Diliman, with an all-published, all-PhD faculty; (b) SEAFDEC in Iloilo, now regaining its lost standing as a world-class R&D organization under new leadership; and (c) the Philippine Agricultural Scientist, our only ISI-indexed science journal. The National Institute of Physics in UP Diliman has been rapidly increasing its research output to join these models. They are only a small part of our R&D enterprise, but if their example were multiplied nationwide, we could easily catch up with our progressive neighbors.
We often blame corruption, overpopulation, and poverty, forgetting that these are effects or symptoms of underdevelopment rather than its causes. We have spent too much time and resources addressing symptoms instead of attending to causes. The direct cause of underdevelopment is poor S&T, brought about by failure to do research properly. One way to fix this is proper evaluation of research performance.
Research incentives
An effective way to improve research output is to reward researchers who publish in int. journals. This was done at SEAFDEC in Iloilo starting in 1989. At the University of the Philippines, cash incentives were started in 1993 by President Jose V. Abueva. The program was cut short when his term ended, only to be revived in 1999 by President Francisco Nemenzo. A cash reward is given for each paper published in an int. journal.
After only three years with research incentives, output was up two to three times: UP’s international publications increased from 26 to 40 percent of the national total in 1999-2002. By contrast, the combined output of La Salle, Ateneo, UST, and San Carlos over the same period increased from only 7.8 to 8 percent of the national total, which was 478 papers in 2002.
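As a rough check on these figures, the only absolute number given (the 2002 national total of 478 papers) lets us translate the quoted shares into approximate paper counts. This is an illustrative calculation, not data from the study.

```python
# Back-of-the-envelope check of the shares quoted above, using the one
# absolute figure given: a national total of 478 papers in 2002.
national_total_2002 = 478

up_share = 0.40                 # UP's share of the national total by 2002
four_universities_share = 0.08  # La Salle, Ateneo, UST, San Carlos combined

print(round(national_total_2002 * up_share))                 # 191 papers
print(round(national_total_2002 * four_universities_share))  # 38 papers
```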
It took two of our social scientists to see the importance of proper research publication in promoting scholarship and advancing scientific knowledge, both essential to national progress.
Hence, instead of relying on peer review, research performance can be assessed simply by its output: the int. journal paper. A related problem is peer review of research proposals. Again, instead of peer evaluation of the proposal, the proponent’s track record can be the focus of preliminary screening: only those published in int. journals are eligible for funding. In both cases, unlike with peer evaluation, we can never go wrong.
There will then be a double-incentive program: funding only published proponents (proven capability) and rewarding those with similarly published output (valid publication). Note that the proposed incentive program is just a shift to using funds wisely. And with new funds, we are headed for major S&T development.
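To show how simple the proposed screening rule is to apply, here is a minimal sketch; the record format and names are hypothetical examples, not an actual funding-agency system.

```python
# Minimal sketch of the proposed preliminary screening rule: fund only
# proponents already published in int. (ISI-indexed) journals.
# The record format and names here are hypothetical.

proponents = [
    {"name": "Proponent A", "int_journal_papers": 3},
    {"name": "Proponent B", "int_journal_papers": 0},
]

eligible = [p for p in proponents if p["int_journal_papers"] >= 1]
print([p["name"] for p in eligible])  # ['Proponent A']
```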
Clearly, a single crucial action, in the form of a double-incentive program, will not only justify increased R&D funding but will also greatly improve research performance nationwide, even under trying conditions. We can be sure of a move we will not regret.
______________
The author is a retired professor of marine science at UP Diliman, a former chief of the Southeast Asian Fisheries Development Center in the Philippines, and a former chancellor of UP Visayas. E-mail him at flor_lacanilao@yahoo.com.