Title: A Study on Applying Machine Learning and Metaheuristic Optimization Algorithms to Process Parameter Setting in TFT-LCD Plants
Author: Shou-Cheng Hsiung (熊碩成)
Institution: I-Shou University
Department: Department of Industrial Management
Advisor: 曹以明 (I-Ming Chao)
Degree: Doctorate
Year of publication: 2023
Keywords: Thin film transistor liquid crystal display (TFT-LCD); Intelligent manufacturing; Feature selection; Machine learning; Optimization algorithm
Abstract:
The column spacer (CS) is an essential component of the color filter (CF) in the thin-film transistor liquid crystal display (TFT-LCD) industry. Besides helping maintain a consistent distance between the CF and TFT substrates, the CS also prevents liquid crystal (LC) leakage and light leakage.
The features of the CS dataset fall into seven categories: CL (clean), CO (coater), H (hot plate), U (UV light), A (aligner), D (development), and O (oven). The dataset is small but high-dimensional, containing both categorical and numerical features, and the outcome of the CS fabrication process is not sufficiently stable. This study aims to apply machine learning and metaheuristic optimization algorithms to solve the time-consuming parameter-setting problem in the TFT-LCD industry.
Leveraging the "cluster-then-label" concept from semi-supervised machine learning, this research proposed a solution framework of three sequential stages: prediction-model construction, optimization, and newcomer assignment. The empirical study yielded five results.
1. By adopting data preprocessing and hybrid feature selection techniques, this study selected 30 of the 223 features that do not require adjustment; these were used for clustering and modeling while avoiding overfitting.
2. This study used the K-means++ algorithm to partition the data into three groups, with the number of clusters determined by three well-known evaluation methods. Then, the same category label was assigned to every member within each cluster for modeling.
3. This study constructed multiple linear regression prediction models for each category group, using a different strategy for each.
4. This study compared 14 optimizers and determined that the Harris Hawks Optimizer (HHO) was the most efficient and stable; its result closely matches the actual measured CS height and can be obtained within one second.
5. This study compared five classifiers and determined that the Classification and Regression Tree (CART) classifier was the most suitable. Among the selected hot plate features that do not require adjustment, H2, H4, H5, and H6 were all found to be critical features (splitters) for classification. Presetting the splitter values allows newcomers to be assigned to a category group, so that the prediction models and optimizers previously built for each group can be reused.
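The cluster-then-label step in result 2 can be sketched with scikit-learn. The abstract does not name the three evaluation methods, so this sketch assumes three common cluster-validity indices (silhouette, Calinski-Harabasz, and Davies-Bouldin) and uses synthetic stand-in data in place of the proprietary CS dataset:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import (silhouette_score,
                             calinski_harabasz_score,
                             davies_bouldin_score)

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 30))  # stand-in for the 30 selected features

# Score candidate cluster counts with three common validity indices
# (higher silhouette/CH and lower DB are better).
for k in range(2, 6):
    labels = KMeans(n_clusters=k, init="k-means++", n_init=10,
                    random_state=0).fit_predict(X)
    print(k, silhouette_score(X, labels),
          calinski_harabasz_score(X, labels),
          davies_bouldin_score(X, labels))

# With k fixed (k = 3 in the study), every member of a cluster receives
# that cluster's index as its category label for later modeling.
category_labels = KMeans(n_clusters=3, init="k-means++", n_init=10,
                         random_state=0).fit_predict(X)
```

On the real data, the k jointly favored by the three indices would be chosen before the labels are assigned.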
The proposed approach can be extended to other intelligent manufacturing problems characterized by small, high-dimensional datasets, thereby enhancing industrial competitiveness.
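The remaining stages (results 3-5) fit together as follows. This is a minimal sketch on synthetic data: plain random search stands in for the Harris Hawks Optimizer, and treating the first four columns as the fixed hot-plate splitters is purely illustrative:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(120, 30))          # 30 selected features (synthetic)
y = X @ rng.normal(size=30) + rng.normal(scale=0.1, size=120)  # CS height
groups = rng.integers(0, 3, size=120)   # cluster-then-label category labels

# Stage 1: one multiple linear regression model per category group.
models = {c: LinearRegression().fit(X[groups == c], y[groups == c])
          for c in range(3)}

# Stage 3: a CART classifier on the fixed (non-adjustable) features
# routes a newcomer to a group; here columns 0-3 stand in for the
# hot-plate splitters H2/H4/H5/H6.
cart = DecisionTreeClassifier(random_state=0).fit(X[:, :4], groups)
newcomer = rng.normal(size=(1, 30))
g = int(cart.predict(newcomer[:, :4])[0])

# Stage 2 (random-search stand-in for HHO): search the adjustable
# features to bring the predicted CS height close to a target value.
target = 3.5
best_x, best_err = None, np.inf
for _ in range(2000):
    x = rng.normal(size=(1, 30))
    err = abs(models[g].predict(x)[0] - target)
    if err < best_err:
        best_x, best_err = x, err
```

In the actual framework, HHO would replace the random-search loop and would only vary the adjustable process parameters while holding the splitter features fixed.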