(一)中文部份
1.方維,演算法與資料結構,維科出版社,台北市,1994年。
2.吳育儒,決策樹中移除不相關值問題的研究,淡江大學資訊工程系碩士論文,1998年。
3.林傑斌,劉明德,資料採掘與OLAP理論與實務,文魁圖書公司,2002年。
4.徐芳玲,以主成分分析應用在決策樹名目屬性之二元分割上,國立成功大學資訊管理研究所碩士論文,2002年。
5.馬芳資,林我聰,決策樹形式知識合併修剪之研究,電子商務研究,已接受未出刊,2005a年。
6.馬芳資,林我聰,Web-based 決策樹知識發掘預測系統架構,ICTA2005技術與認證國際學術研討會,2005b年6月,pp.35-49。
7.馬芳資,林我聰,決策樹形式知識整合之研究,資訊管理學報,已接受未出刊,2005c年。
8.馬芳資,林我聰,決策樹形式知識之線上預測系統架構,圖書館學與資訊科學,Vol. 29,No. 2,2003年10月,pp.60-76。
9.馬芳資,信用卡信用風險預警範例學習系統之研究,第十屆全國技職及職業教育研討會,商業類I,1995年,pp.427-436。
10.馬芳資,信用卡信用風險預警範例學習系統之研究,國立政治大學資訊管理系碩士論文,1994年。
11.陳重銘,結合直線最適法於決策樹修剪之影響研究,國立中山大學資訊管理研究所碩士論文,1995年。
12.陳偉,決策樹中不相關的條件值問題之探討,淡江大學資訊工程學系博士論文,1999年。
13.彭文正譯,Data Mining資料採礦理論與實務-顧客關係管理的技巧與科學,維科圖書,台北市,2001年。
14.曾憲雄,黃國禎,人工智慧與專家系統-理論╱實務╱應用,旗標出版股份公司,台北市,2005年,pp.1-25。
15.曾憲雄,蔡秀滿,蘇東興,曾秋蓉,王慶堯,資料探勘-Data Mining,旗標出版股份公司,台北市,2005年。
16.楊建民,在微平行電腦上發展範例學習系統研究信用卡信用風險評估,行政院國家科學委員會專題研究計畫,1993年7月。
17.楊建民,專家系統與機器學習:財務專家系統知識庫建構與學習之研究,台北,時英出版社,1991年3月。
18.蔣以仁,資料發掘之模糊分類,國立臺灣大學資訊工程學系博士論文,1997年。
19.謝孟錫,分徑指標在建立決策樹的比較,國立中央大學工業管理研究所碩士論文,2002年。
20.謝國義,決策樹形成過程中計算複雜度之改善研究,國立成功大學工業管理學系碩士論文,1998年。
(二)英文部份
1.Auer, P., Holte, R.C. & Maass, W., “Theory and application of agnostic PAC-learning with small decision trees,” Proceedings of the Twelfth International Conference on Machine Learning, 1995, pp.21-29.
2.Bauer, E. & Kohavi, R., “An Empirical Comparison of Voting Classification Algorithms: Bagging, Boosting, and Variants,” Journal of Machine Learning Vol. 36, Nos. 1/2, July/August 1999, pp.105-139.
3.Berry, M. J. A. & Linoff, G. S., Data Mining Techniques: For Marketing, Sales, and Customer Relationship Management (Second Edition), Wiley, 2004.
4.Bolakova, I., “Pruning Decision Trees to Reduce Tree Size,” Proceedings of the international conference--Traditional And Innovations In Sustainable Development Of Society, Rezekne, Latvia, February 28 - March 2002, pp.160-166, ISBN 9984-585-02-6.
5.Bradford, J., Kunz, C., Kohavi, R., Brunk, C. & Brodley, C.E., “Pruning decision trees with misclassification costs,” In Proceedings of the Tenth European Conference on Machine Learning (ECML-98), Berlin, 1998, pp.131-136.
6.Breiman, L., “Bagging predictors,” Machine Learning 24, 1996, pp.123-140.
7.Breiman, L., Friedman, J.H., Olshen, R. & Stone, C., Classification and Regression Trees, Belmont, California: Wadsworth, 1984.
8.Breslow, L. A. & Aha, D. W., “Comparing simplification procedures for decision trees,” Artificial Intelligence and Statistics, 5, 1998, pp.199-206.
9.Brodley, C.E. & Utgoff, P.E., “Multivariate decision trees,” Machine Learning, 19, 1995, pp.45-77.
10.Buntine, W., “Learning classification trees,” Statistics and Computing, 2(2), 1992, pp.63-73.
11.Chan, P. K. & Stolfo, S. J., “On the Accuracy of Meta-learning for Scalable Data Mining,” Journal of Intelligent Integration of Information, L. Kerschberg, Ed., 1998.
12.Chan, P. K. & Stolfo, S. J., “A comparative evaluation of voting and meta-learning on partitioned data,” In Proceedings of the 12th International Conference on Machine Learning (ICML-95), 1995a, pp.90-98, Morgan Kaufmann.
13.Chan, P.K. & Stolfo, S.J., “Learning arbiter and combiner trees from partitioned data for scaling machine learning,” In Proc. Intl. Conf. on Knowledge Discovery and Data Mining, 1995b, pp.39-44.
14.Chen, K., Wang, L. & Chi, H., “Methods of Combining Multiple Classifiers with Different Features and Their Applications to Text-Independent Speaker Identification,” International Journal of Pattern Recognition and Artificial Intelligence, 11(3), 1997, pp.417-445.
15.Cherkauer, K.J. & Shavlik, J.W., “Growing simpler decision trees to facilitate knowledge discovery,” Proceedings of the Second International Conference on Knowledge Discovery and Data Mining, 1996, pp.315-318.
16.Cormen, T.H., Leiserson, C.E., Rivest, R.L. & Stein, C., Introduction to Algorithms (Second Edition), McGraw-Hill, 2001.
17.DMG, The Data Mining Group, http://www.dmg.org, 2005.
18.Dong, M. & Kothari, R., “Classifiability Based Pruning of Decision Trees,” Proc. International Joint Conference on Neural Networks (IJCNN), Volume 3, 2001, pp.1739-1743.
19.Dunham, M.H., Data Mining: Introductory and Advanced Topics, Pearson Education, Inc., 2003.
20.Esposito, F., Malerba, D. & Semeraro, G., “A Further Study of Pruning Methods in Decision Tree Induction,” Proceedings of the Fifth International Workshop on Artificial Intelligence and Statistics, 1995, pp.211-218.
21.Fayyad, U. M. et al., “Data Mining and Knowledge Discovery,” Kluwer Academic Publishers, 1997.
22.Fayyad, U. M., “Data Mining and Knowledge Discovery: Making Sense out of Data,” IEEE Expert, Vol.11, No. 5, October 1996, pp.20-25.
23.Fournier, D. & Crémilleux, B., “A Quality Index for Decision Tree Pruning,” Knowledge-Based Systems, Volume 15, 2002, Elsevier, pp.37-43.
24.Frank, E., Hall, M., Trigg, L., Holmes, G. and Witten, I.H., “Data mining in bioinformatics using Weka,” Bioinformatics Advance Access, published online Bioinformatics, April 8 2004. Oxford University Press.
25.Frank, E., Pruning Decision Trees and Lists, Department of Computer Science, University of Waikato, Hamilton, New Zealand., 2000.
26.Freund, Y. & Schapire, R. E., “A decision-theoretic generalization of on-line learning and an application to boosting,” Journal of Computer and System Sciences, 55, 1997, pp.119-139.
27.Gama, J., “Probabilistic Linear Tree,” Proc. 14th International Conference on Machine Learning, 1997.
28.Hall, L.O., Chawla, N. & Bowyer, K.W., “Combining decision trees learned in parallel,” In Working Notes of the KDD-97 Workshop on Distributed Data Mining, 1998, pp.10-15.
29.Holmes, G., Kirkby, R. & Pfahringer, B., “Mining data streams using option trees,” Working Paper 03, Department of Computer Science, The University of Waikato, Hamilton, 2004.
30.John, G., “Robust Decision Trees: Removing Outliers in Databases,” Proceedings of the First International Conference on Knowledge Discovery and Data Mining, 1995, pp.174-179.
31.Kargupta, H., Hamzaoglu, I., Stafford, B., Hanagandi, V., & Buescher, K., “PADMA: Parallel Data Mining Agent for Scalable Text Classification,” In Proceedings Conference on High Performance Computing 1997, pp.290-295. The Society for Computer Simulation International.
32.Kohavi, R. & Quinlan, J.R., “Decision-tree discovery,” In Will Klosgen and Jan M. Zytkow, editors, Handbook of Data Mining and Knowledge Discovery, chapter 16.1.3, 2002, pp.267-276. Oxford University Press.
33.Kohavi, R. & Kunz, C., “Option Decision Trees with Majority Votes,” Machine Learning: Proceedings of the Fourteenth International Conference, 1997.
34.Krishnaswamy, S., Zaslavsky, A., & Loke, S.W., “Federated Data Mining Services and a Supporting XML Markup Language,” Proceedings of the 34th Annual Hawaii International Conference on System Sciences (HICSS-34), Hawaii, USA, January 2001. In the "e-Services: Models and Methods for Design, Implementation and Delivery" mini-track of the "Decision Technologies for Management" track, IEEE Press, ISBN 0-7695-0981-9.
35.Krishnaswamy, S., Zaslavsky, A. & Loke, S.W., “An Architecture to Support Distributed Data Mining Services in E-Commerce Environments,” Proceedings of the Second International Workshop on Advanced Issues in E-Commerce and Web-Based Information Systems, San Jose, California, June 8-9 2000, pp.238-246.
36.Mingers, J., “An Empirical Comparison of Selection Measures for Decision-Tree Induction,” Machine Learning, 3, 1989a, pp.319-342.
37.Mingers, J., “An empirical comparison of pruning methods for decision tree induction,” Machine Learning, Volume 4, 1989b, pp.227-243.
38.Mingers, J., “Expert systems-rule induction with statistical data,” Journal of the Operational Research Society, 38, 1987, pp.39-47.
39.Murphy, O. J. & McCraw, R. L., “Designing Storage Efficient Decision Trees,” IEEE Transactions on Computers, 40(3), 1991, pp.315-319.
40.Murthy, S. K., “Automatic Construction of Decision Trees from Data: a multi-disciplinary survey,” Data Mining and Knowledge Discovery, 2, 1998, pp.345-389.
41.Murthy, S. K., On growing better decision trees from data, Doctoral dissertation, University of Maryland, 1997.
42.Niblett, T. & Bratko, I., “Learning decision rules in noisy domains,” In Bramer, M. A. (Ed.), Research and Development in Expert Systems III: Proceedings of Expert Systems 1986, Brighton, 1986, pp.413-420.
43.Pagallo, G. & Haussler, D., “Boolean feature discovery in empirical learning,” Machine Learning, 5, 1990, pp.71-100.
44.PMML 3.0 – Predictive Model Markup Language. http://www.dmg.org/pmml-v3-0.html, 2005.
45.Prodromidis, A.L. & Stolfo, S.J., “Mining databases with different schemas: Integrating incompatible classifiers,” Proc. KDD-98, August 1998.
46.Quinlan, J. R., “MiniBoosting Decision Trees,” Journal of Artificial Intelligence Research, 1998.
47.Quinlan, J.R., “Bagging, Boosting, and C4.5,” In Proceedings Thirteenth National Conference on Artificial Intelligence, 1996a, pp.725-730, AAAI Press.
48.Quinlan, J.R., “Improved use of continuous attributes in C4.5,” Journal of Artificial Intelligence Research, 4, 1996b, pp.77-90.
49.Quinlan, J.R., C4.5: Programs for Machine Learning, San Mateo: Morgan Kaufmann, 1993.
50.Quinlan, J.R., “Simplifying decision trees,” International Journal of Man-Machine Studies, 27(3), 1987, pp.221-234.
51.Quinlan, J.R., Machine Learning: An Artificial Intelligence Approach, Volume 2, chapter The effect of noise on concept learning, Los Altos, CA: Morgan Kaufmann, 1986, pp.149-166.
52.Ragavan, H. & Rendell, L., “Lookahead feature construction for learning hard concepts,” Proceedings of the Tenth International Conference on Machine Learning, 1993, pp.252-259.
53.Smyth, P., Gray, A. & Fayyad, U., “Retrofitting decision tree classifiers using kernel density estimation,” In Proceedings of the Twelfth International Conference on Machine Learning, 1995, pp.506-514, Morgan Kaufmann Publishers.
54.Ting, K.M. & Low, B.T., “Model combination in the multiple-data-batched scenario,” Proc. European Conference on Machine Learning, Prague, Czech Republic, LNAI-1224, 1997, pp.250-265, Springer-Verlag.
55.Ting, K.M. & Witten, I.H., “Stacking bagged and dagged models,” Proc International Conference on Machine Learning, Tennessee, 1997, pp.367-375.
56.Ting, K.M. & Low, B.T., “Theory combination: an alternative to data combination,” Working Paper 1996/19, Department of Computer Science, University of Waikato.
57.Todorovski, L. & Dzeroski, S., “Combining Classifiers with Meta Decision Trees,” Machine Learning, 50(3), March 2003, pp.223-249.
58.Todorovski, L. & Dzeroski, S., “Combining Multiple Models with Meta Decision Trees,” In Proceedings of the Fourth European Conference on Principles of Data Mining and Knowledge Discovery, Springer 2000, pp.54-64.
59.Todorovski, L. & Dzeroski, S., “Experiments in meta-level learning with ILP,” In Proceedings of the Third European Conference on Principles of Data Mining and Knowledge Discovery. Springer 1999, pp.98-106.
60.Utgoff, P.E. & Clouse, J.A., “A Kolmogorov-Smirnoff metric for decision tree induction,” Technical Report 96-3, Department of Computer Science, University of Massachusetts, 1996.
61.Utgoff, P.E., “Decision tree induction based on efficient tree restructuring,” Technical Report 95-18, Department of Computer Science, University of Massachusetts, 1996.
62.Webb, G.I., “Decision Tree Grafting From The All Tests But One Partition,” In Proceedings of the Sixteenth International Joint Conference on Artificial Intelligence (IJCAI 99). Morgan Kaufmann, 1999.
63.Webb, G.I., “Further experimental evidence against the utility of occam’s razor,” Journal of Artificial Intelligence Research, 4, 1996, pp.397-417.
64.Williams, G., Induction and Combining Multiple Decision Trees, Ph.D. Dissertation, Australian National University, Canberra, Australia, 1990.
65.Windeatt, T. & Ardeshir, G., “An empirical comparison of pruning methods for ensemble classifiers,” Proc. of Int. Conf. Intelligent Data Analysis, Sep. 13-15 2001, Lisbon, Portugal, Lecture Notes in Computer Science, Springer-Verlag, pp.208-217.
66.Witten, I.H. & Frank, E., Data Mining: Practical Machine Learning Tools and Techniques with Java Implementations, Morgan Kaufmann, 2000.
67.Wolpert, D.H., “Stacked generalization,” Neural Networks, 5, 1992, pp.241-259.
68.Zheng, Z. & Webb, G. I., “Multiple Boosting: A Combination of Boosting and Bagging,” In Proceedings of the 1998 International Conference on Parallel and Distributed Processing Techniques and Applications (PDPTA 98), 1998, pp.1133-1140. CSREA Press.
69.Zheng, Z., “Constructing nominal X-of-N attributes,” Proceedings of the Fourteenth International Joint Conference on Artificial Intelligence, 1995, pp.1064-1070.
70.Zhou, Z.H. & Chen, Z.Q., “Hybrid Decision Tree,” Knowledge-Based Systems, 15(8), 2002, pp.515-528.
71.Zimenkov, A., Tree Classifiers, Department of Information Technology, Lappeenranta University of Technology, 2000.