Title: Exploring the Problems of JCR Journal Category Classification from the Perspective of Academic Evaluation: A Case Study of "Information Science & Library Science"
Author: 邵婉卿
Author (English): Wang-Ching Shaw
Institution: National Taiwan University
Department: Graduate Institute of Library and Information Science
Advisor: 黃慕萱
Degree: Doctoral
Publication date: 2016
Keywords: journal classification; JCR category; library and information science; information science and library science; citation analysis
Because the Impact Factor (IF) rankings in the Journal Citation Reports (JCR) bear on academic evaluation, and those rankings in turn depend on how JCR assigns journals to subject categories, this study examines JCR's journal-category problem from the perspective of academic evaluation using bibliometric methods. Taking the 88 journals indexed in the JCR "Information Science & Library Science" (IS&LS) category from 2005 to 2014 as the sample, it analyzes the IF rankings, quartile (Q) values, the disciplinary distribution of cited references, and the disciplinary affiliation of authors' institutions for four sub-fields within the category, Library Science (LS), Information Science (IS), Scientometrics (SM), and Management Information Systems (MIS), and for four selected journals (LISR, JASIST, SMs, and MISQ).

With respect to the disciplines of cited references, the study finds that IS&LS, LS, IS, and SM journals all cite Library & Information Science (LIS) journals most frequently, whereas MIS journals rarely cite LIS journals. LS, IS, and SM journals most often cite references from LIS, computer science, general science, and medicine, while MIS journals most often cite references from computer science, MIS, management, and business; mutual citation between LIS and MIS is sparse. Among the selected journals, LISR, JASIST, and SMs mainly cite, and are cited by, LIS journals and only rarely MIS journals, whereas MISQ mainly cites MIS and computer-science journals and almost never LIS journals. Cross-citation between the MIS and LIS selected journals is therefore also rare, indicating that, judged by the disciplines they cite, the two groups are not homogeneous.

With respect to the disciplines of authors' institutions, authors publishing in IS&LS and LIS journals are affiliated mainly with LIS institutions, whereas authors publishing in MIS journals come mainly from business schools and MIS units and only rarely from LIS, showing that MIS and LIS draw on distinct institutional communities. The selected journals show the same pattern: the authors of the LIS selected journals come mainly from LIS institutions, while the authors of MISQ come mainly from business, MIS, and management units, with very few from LIS.

Whether the unit of analysis is the four sub-fields or the four selected journals, the distributions of cited-reference disciplines and of authors' institutional disciplines both show that the heterogeneity between MIS and LIS is excessive, and MIS also differs markedly from LIS in its interdisciplinary character and rate of growth. The IF values of MIS and LIS differ significantly: MIS journals consistently occupy Q1, whereas LS journals mostly fall in Q4. Although LIS accounts for the majority of journals, articles, and authors in the category, its average number of cited references per article is far below that of MIS journals, so the two groups differ not only in the disciplines they cite and draw authors from but also in citation behavior.

In sum, both the descriptive and the inferential citation analyses confirm that, over the ten-year period, IS&LS and the LS, IS, and SM sub-fields of LIS are closely related and homogeneous in their cited-reference disciplines, authors' institutional disciplines, and IF performance, whereas MIS and LIS differ substantially on all of these dimensions and are heterogeneous. In particular, the top 25% of the category ranking is dominated by MIS journals, which disadvantages the IF rankings of LIS journals and the academic evaluation of LIS authors. LIS journals seldom cite MIS journals, most of the disciplines LIS cites are unrelated to MIS, MIS journals cite LIS journals even less, and the two fields draw their authors from institutions in different disciplines. Given this high heterogeneity, grouping MIS and LIS together in the JCR IS&LS category for journal ranking is problematic. The study therefore recommends that JCR re-examine the scope of journals included in the IS&LS category so that it contains journals from homogeneous fields, which would yield more accurate journal rankings and fairer evaluation results.
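For reference, the impact-factor and quartile values discussed above follow JCR's standard usage: the two-year impact factor divides a journal's citations received in a given year (to items published in the two preceding years) by its citable items from those two years, and quartiles are assigned by IF rank within the journal's category. The sketch below is a minimal illustration of those two calculations under that common formulation; it is not the thesis's own code, and the journal names and counts are hypothetical.

```python
# Minimal sketch (not from the thesis): two-year impact factor and
# JCR-style quartile assignment within one subject category.
# Journal names and citation counts below are hypothetical examples.
from math import ceil

def impact_factor(cites_to_prev_two_years, citable_items_prev_two_years):
    """Two-year IF: citations received in year Y to items published in
    Y-1 and Y-2, divided by the number of citable items from Y-1 and Y-2."""
    return cites_to_prev_two_years / citable_items_prev_two_years

def assign_quartiles(category_ifs):
    """Rank journals in a category by IF (descending) and map the rank
    percentile to Q1-Q4; one common formulation of JCR quartiles."""
    ranked = sorted(category_ifs.items(), key=lambda kv: kv[1], reverse=True)
    n = len(ranked)
    return {journal: f"Q{ceil(4 * rank / n)}"
            for rank, (journal, _) in enumerate(ranked, start=1)}

if __name__ == "__main__":
    # Hypothetical IS&LS category with a few journals.
    ifs = {
        "MISQ": impact_factor(2500, 500),    # 5.0
        "JASIST": impact_factor(1200, 600),  # 2.0
        "SMs": impact_factor(900, 450),      # 2.0
        "LISR": impact_factor(300, 250),     # 1.2
    }
    print(assign_quartiles(ifs))
```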
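The cited-reference findings above rest on classifying, for each citing group, the disciplines of the journals its articles cite. As a minimal sketch of that kind of tally, assuming hypothetical citation records rather than the study's actual data, the following computes each group's share of cited references per discipline.

```python
# Minimal sketch (illustrative only): tally the disciplinary distribution of
# cited references for each citing group, as in a cited-reference analysis.
# Each record pairs a citing group with the category of a cited journal;
# the records below are hypothetical.
from collections import Counter, defaultdict

citation_records = [
    ("LIS", "LIS"), ("LIS", "Computer Science"), ("LIS", "LIS"),
    ("LIS", "Medicine"), ("MIS", "MIS"), ("MIS", "Computer Science"),
    ("MIS", "Business"), ("MIS", "Management"), ("MIS", "MIS"),
]

def discipline_shares(records):
    """Return, for each citing group, the share of its cited references
    that fall in each cited discipline."""
    counts = defaultdict(Counter)
    for citing_group, cited_discipline in records:
        counts[citing_group][cited_discipline] += 1
    shares = {}
    for group, counter in counts.items():
        total = sum(counter.values())
        shares[group] = {d: round(c / total, 2) for d, c in counter.items()}
    return shares

# With the hypothetical records, LIS citations concentrate in LIS while MIS
# citations concentrate in MIS, computer science, and business, with none to
# LIS, mirroring the heterogeneity pattern described in the abstract.
print(discipline_shares(citation_records))
```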