Title: 由已訓練類神經網路擷取成本敏感之分類規則 (Extracting Cost-Sensitive Classification Rules from Trained Neural Networks)
Journal: 電子商務學報 (Journal of e-Business)
Authors: Huang, Yeu-Shiang; Mao, Shiao-Rei
Publication date: 2005
Volume/Issue: 7(3)
Pages: 275-291
Keywords: data mining; neural network; rule extraction; misclassification cost
Neural networks are one of the techniques for data mining: their learning results usually achieve high accuracy, they tolerate noisy data well, and their network structure can express complex relationships among attributes. However, the learned model is a black box that offers users no explanation, which limits the applicability of neural networks to some extent. This study uses a rule-induction algorithm to extract explicit rules from trained neural networks in order to explain their learning results, and the proposed rule-extraction framework is applicable to different neural network models. The extraction process also takes misclassification costs into account, so that the extracted rules reflect the misclassification costs of the different classes and better meet practical needs. The framework uses the PRISM algorithm proposed by Cendrowska as the basis for rule extraction, and incorporates misclassification costs in three ways: AdaCost, MetaCost, and a modification of the PRISM information function. The proposed method is compared with the REFNE rule-extraction framework on datasets from the UCI-ML repository, analyzing the number of rules produced, accuracy, and misclassification cost.
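The rule-selection criterion the abstract builds on can be sketched concretely. The snippet below is a minimal illustration, not the paper's implementation: `prism_precision` is PRISM's original attribute-value selection measure, and `cost_weighted_precision` is a hypothetical cost-weighted variant in the spirit of the paper's modified information function (the specific weighting scheme here is an assumption for illustration only).

```python
def prism_precision(covered_labels, target):
    """PRISM's selection measure (Cendrowska, 1987): among the examples
    covered by a candidate attribute-value condition, the fraction that
    belongs to the target class, i.e. p / (p + n)."""
    if not covered_labels:
        return 0.0
    p = sum(1 for y in covered_labels if y == target)
    return p / len(covered_labels)

def cost_weighted_precision(covered_labels, target, cost):
    """Hypothetical cost-weighted variant: each covered counter-example
    is weighted by cost[y][target], the cost of predicting `target` when
    the true class is y, so conditions covering expensive mistakes score
    lower. The paper's modified information function may weight differently."""
    p = sum(1 for y in covered_labels if y == target)
    weighted_n = sum(cost[y][target] for y in covered_labels if y != target)
    return p / (p + weighted_n) if (p + weighted_n) else 0.0

labels = ["A", "A", "B"]  # class labels of the examples a condition covers
print(prism_precision(labels, "A"))                             # 2/3
print(cost_weighted_precision(labels, "A", {"B": {"A": 5.0}}))  # 2/7
```

With a misclassification cost of 5 for predicting "A" on a true "B", the same condition drops from 2/3 to 2/7, so the induction step prefers conditions that avoid costly errors.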
Neural networks, a popular approach in data mining, usually produce learning results with relatively high accuracy. They provide good fault tolerance when handling noisy data, and their network structure can also represent complicated relationships among attributes. However, a trained neural network is a black box that cannot explain its results to users in comprehensible terms, so its applications are occasionally restricted. In this paper, a rule induction algorithm is employed to retrieve explicit rules that interpret the learning results of neural networks. Furthermore, by considering misclassification costs in the retrieval process, the retrieved rules become more realistic for practical use. The proposed approach is based on the PRISM algorithm proposed by Cendrowska, and uses the methods of AdaCost, MetaCost, and a modified information entropy to account for misclassification costs. An empirical investigation using the UCI-ML database verifies the effectiveness of the proposed approach.
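Of the three cost-sensitive mechanisms mentioned, MetaCost is the easiest to sketch: it relabels each training example with the class that minimizes expected misclassification cost, then retrains any learner on the relabeled data. The following is a minimal sketch of that relabeling step; in Domingos' formulation the class probabilities come from bagging, while here they are simply taken as given.

```python
import numpy as np

def metacost_relabel(probs, cost_matrix):
    """Relabel each example with the class that minimizes expected
    misclassification cost (the core step of Domingos' MetaCost).

    probs       -- (n_samples, n_classes) estimated class probabilities;
                   MetaCost estimates these by bagging, but any
                   probabilistic classifier can supply them.
    cost_matrix -- (n_classes, n_classes); cost_matrix[i, j] is the cost
                   of predicting class j when the true class is i.
    """
    # Expected cost of predicting class j for each example:
    #   E[cost | predict j] = sum_i P(class=i | x) * cost_matrix[i, j]
    expected_cost = probs @ cost_matrix
    return expected_cost.argmin(axis=1)

# With symmetric unit costs the most probable class wins; when missing
# class 1 is ten times as costly, the assigned label flips to 1.
probs = np.array([[0.7, 0.3]])
unit = np.array([[0.0, 1.0], [1.0, 0.0]])
skewed = np.array([[0.0, 1.0], [10.0, 0.0]])
print(metacost_relabel(probs, unit))    # -> [0]
print(metacost_relabel(probs, skewed))  # -> [1]
```

Because the cost sensitivity lives entirely in the relabeling, the same trick wraps around any classifier, which is why the paper can combine it with PRISM-based rule extraction.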
Journal Articles
1. Craven, M. W., & Shavlik, J. W. (1996). Extracting tree-structured representations of trained networks. Advances in Neural Information Processing Systems, 24-30.
2. Cendrowska, J. (1987). PRISM: An algorithm for inducing modular rules. International Journal of Man-Machine Studies, 27(4), 349-370.
3. Fu, L.-M. (1998). A neural-network model for learning domain rules based on its activation function characteristics. IEEE Transactions on Neural Networks, 9(5), 787-795.
4. Liu, H., & Setiono, R. (1997). Feature selection and discretization of numeric attributes. IEEE Transactions on Knowledge and Data Engineering, 9(4), 642-645.
5. Núñez, M. (1991). The use of background knowledge in decision tree induction. Machine Learning, 6(3), 231-250.
6. Setiono, R., Leow, W.-K., & Zurada, J. M. (2002). Extraction of rules from artificial neural networks for nonlinear regression. IEEE Transactions on Neural Networks, 13(3), 564-577.
7. Thrun, S. (1995). Extracting rules from artificial neural networks with distributed representations. In Tesauro, G., Touretzky, D., & Leen, T. (Eds.), Advances in Neural Information Processing Systems, 7, 505-512.
8. Tsukimoto, H. (2000). Extracting rules from trained neural networks. IEEE Transactions on Neural Networks, 11(2), 377-389.
9. Turney, P. D. (1995). Cost-sensitive classification: Empirical evaluation of a hybrid genetic decision tree induction algorithm. Journal of Artificial Intelligence Research, 2, 369-409.
10. Zhou, Z.-H., Jiang, Y., & Chen, S.-F. (2003). Extracting symbolic rules from trained neural network ensembles. AI Communications, 16(1), 3-15.
Conference Papers
1. Boz, O. (2002). Extracting decision trees from trained neural networks. New York, NY, pp. 456-461.
2. Chan, P. K., & Stolfo, S. J. (1998). Toward scalable learning with non-uniform class and cost distributions: A case study in credit card fraud detection. pp. 164-168.
3. Cohen, W. W. (1995). Fast effective rule induction. pp. 115-123.
4. Domingos, P. (1999). MetaCost: A general method for making classifiers cost-sensitive. Fifth International Conference on Knowledge Discovery and Data Mining, pp. 155-164.
5. Drummond, C., & Holte, R. C. (2000). Exploiting the cost (in)sensitivity of decision tree splitting criteria. pp. 239-246.
6. Fan, W., Stolfo, S. J., Zhang, J., & Chan, P. K. (1999). AdaCost: Misclassification cost-sensitive boosting. pp. 97-105.
7. Fu, X., & Wang, L. (2001). Rule extraction by genetic algorithms based on a simplified RBF neural network. Seoul, South Korea, pp. 753-758.
8. Norton, S. W. (1989). Generating better decision trees. San Francisco, CA, pp. 800-805.
9. Provost, F., Fawcett, T., & Kohavi, R. (1997). The case against accuracy estimation for comparing induction algorithms. pp. 445-453.
10. Tan, M. (1991). Cost-sensitive reinforcement learning for adaptive classification and control. pp. 774-780.
11. Tickle, A. B., Golea, M., Hayward, R., & Diederich, J. (1997). The truth is in there: Current issues in extracting rules from trained feedforward artificial neural networks. Houston, TX, pp. 2530-2534.
12. Ting, K.-M., & Zheng, Z. (1998). Boosting trees for cost-sensitive classifications. pp. 190-195.
13. Turney, P. D. (2000). Types of cost in inductive concept learning. Cost-Sensitive Learning Workshop at the Seventeenth International Conference on Machine Learning, pp. 15-21.
14. Zubek, V. B., & Dietterich, T. G. (2002). Pruning improves heuristic search for cost-sensitive learning. pp. 27-34.
Research Reports
1. Ikizler, N. (2002). Benefit maximizing classification using feature intervals.
Books
1. Han, J., & Kamber, M. (2000). Data Mining: Concepts and Techniques. Morgan Kaufmann.
2. Witten, I. H., & Frank, E. (2005). Data Mining: Practical Machine Learning Tools and Techniques. Amsterdam: Morgan Kaufmann.