題名:技術型高中教師評鑑性思考量表發展之研究
作者:李志原
作者(外文):LEE, CHIH-YUAN
校院名稱:國立臺北科技大學
系所名稱:技術及職業教育研究所
指導教授:曾淑惠
學位類別:博士
出版日期:2022
主題關鍵詞:評鑑性思考、量表發展、常模；Evaluative Thinking; Scale Development; Norm
因應十二年國教及新課程綱要之推動，教育場域中教師面對評鑑相關事務之比重逐漸提升，且提升評鑑性思考對學校及教師皆有所助益。為此，本研究主要目的在於建構一份評鑑性思考量表，以了解技術型高中教師對評鑑性思考之認知程度。首先分析及彙整國內外相關文獻，彙編成德懷術問卷，由14位專家組成之小組進行三回合德懷術問卷調查，以平均數及標準差作為刪題之判斷標準，形成李克特五點量表初稿。量表施測分為二階段：第一階段為量表初稿之預試，以便利抽樣方式共抽取296名教師作為預試樣本，依預試統計結果進行量表信效度驗證及題目篩選後，產生正式量表；第二階段為正式施測，共計回收1,231份正式量表，依正式施測統計結果進行差異性分析，並建立全國性常模以供後續施測參考對照。本研究結果如下：
一、評鑑性思考量表由七個構面39個題項組成，包含「知識與能力」（8題）、「思考與觀點」（5題）、「態度與信念」（7題）、「運用與結果」（5題）、「自我反思」（5題）、「自我超越」（4題）及「領導與組織」（5題）。
二、量表具有良好之信效度：內部一致性α係數各構面為.866~.912，整體量表為.969；組合信度CR值為.870~.912，平均變異數萃取量為.508~.676，顯示量表具有良好之信度；因素負荷量為.632~.875，顯示量表具有收斂效度；區別效度部分，每一配對組中受限模式與未受限模式之卡方值差異量皆大於10.83，達.001顯著水準，顯示量表各構面之間均具有區別效度。
三、建立量表之常模：依據正式施測結果，分別建構七個百分等級對照常模表，供後續實際施測後解釋量表結果之用。
四、差異分析結果：性別、年齡及任教領域等三項背景變項無顯著差異；學校性質、學校區域、學歷、任教年資及職務等背景變項則均達顯著差異。
最後，依據研究結果，針對技術型高中、技術型高中教師及未來研究提出相關建議。
In response to the implementation of the 12-year basic education policy and the new curriculum guidelines, the share of evaluation-related work that teachers face in the educational field has gradually increased, and strengthening evaluative thinking benefits schools, teachers, and administrators alike.
The main purpose of this study was to develop an Evaluative Thinking Scale in order to understand the evaluative thinking awareness of technical and vocational senior high school teachers. The relevant literature was first reviewed and compiled into a Delphi questionnaire, and a panel of 14 experts then completed three rounds of the Delphi procedure, with item means and standard deviations used as the criteria for deleting and retaining items; the retained items were compiled into the first draft of a five-point Likert scale. Scale testing was carried out in two stages. The first stage was a pilot test of the draft scale: a convenience sample of 296 teachers from the northern, central, southern, and eastern parts of Taiwan was recruited, the results were examined for reliability and validity through internal consistency analysis and confirmatory factor analysis, and the items were edited into the formal scale. The second stage was the formal administration: a total of 1,231 formal scales were collected, difference analyses were conducted on the results, and a national norm was established for reference in subsequent testing. The results of this study were as follows:
1. The Evaluative Thinking Scale consists of 39 items across seven constructs. It is an anonymous five-point Likert scale comprising 8 items on Knowledge and Capability, 5 on Thoughts and Opinions, 7 on Attitudes and Beliefs, 5 on Implementation and Results, 5 on Self-Reflection, 4 on Self-Transcendence, and 5 on Leadership and Organization.
2. The scale showed good reliability and validity. Cronbach's α was .866~.912 for the individual constructs and .969 for the overall scale, composite reliability (CR) was .870~.912, and the average variance extracted (AVE) was .508~.676, indicating good reliability. Factor loadings of .632~.875 indicated convergent validity. For discriminant validity, the chi-square difference between the restricted and unrestricted models for every pair of constructs exceeded 10.83, the critical value of χ² with one degree of freedom at the .001 level, so each construct showed discriminant validity (the conventional formulas behind these indices are sketched after this abstract).
3. Norms were established for the scale. Based on the results of the formal administration, seven percentile-rank norm tables were constructed, one per construct, for interpreting scores in subsequent uses of the scale (an illustrative construction is sketched after this abstract).
4. According to the data from the formal administration, the background variables of school type, school region, educational background, years of teaching, and position showed significant differences, whereas gender, age, and teaching field showed no differences.
Finally, based on the results of this study, suggestions were offered for technical senior high schools, technical senior high school teachers, and future research.
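For readers who wish to check the indices reported in result 2, the following is a minimal sketch of the conventional formulas for composite reliability (CR), average variance extracted (AVE), and the chi-square difference test for discriminant validity. The notation (standardized loadings λ_i for the k items of one construct) is introduced here purely for illustration and is not taken from the thesis.

```latex
% One construct with k items, standardized loadings \lambda_i,
% and error variances 1 - \lambda_i^2:
\mathrm{CR}  = \frac{\bigl(\sum_{i=1}^{k}\lambda_i\bigr)^{2}}
                    {\bigl(\sum_{i=1}^{k}\lambda_i\bigr)^{2} + \sum_{i=1}^{k}\bigl(1-\lambda_i^{2}\bigr)},
\qquad
\mathrm{AVE} = \frac{1}{k}\sum_{i=1}^{k}\lambda_i^{2}

% Discriminant validity for a pair of constructs: fixing their correlation
% to 1 (the restricted model) adds one degree of freedom, so
\Delta\chi^{2} = \chi^{2}_{\text{restricted}} - \chi^{2}_{\text{unrestricted}},
\qquad
\Delta\chi^{2} > \chi^{2}_{.001}(1) \approx 10.83
```

Factor loadings of .632~.875 together with AVE values above .50 are consistent with the convergent-validity claim, and a Δχ² above 10.83 in every pairing is what supports the discriminant-validity claim at the .001 level.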
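As a rough illustration of how the percentile-rank norm tables described in result 3 can be produced from raw construct scores, here is a short Python sketch. The function name, the simulated scores, and the mid-point percentile-rank convention are assumptions made for illustration; they are not taken from the thesis.

```python
import numpy as np

def percentile_rank_table(raw_scores):
    """Map each observed raw score to a percentile rank (0-100).

    Uses the common mid-point convention:
    PR = (frequency below + 0.5 * frequency at the score) / N * 100.
    """
    scores = np.asarray(raw_scores)
    n = scores.size
    table = {}
    for s in np.unique(scores):
        below = np.sum(scores < s)   # respondents scoring lower
        at = np.sum(scores == s)     # respondents with exactly this score
        table[int(s)] = round(float(below + 0.5 * at) / n * 100, 1)
    return table

# Hypothetical usage: one table per construct, built from the norming sample.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    simulated = rng.integers(8, 41, size=1231)   # e.g. an 8-item construct scored 1-5
    norms = percentile_rank_table(simulated)
    print(norms.get(24))   # percentile rank of a raw score of 24 in the simulated data
```

In the thesis itself, seven such tables (one per construct) were built from the 1,231 formal-test responses so that later users of the scale can locate an individual score within the national norming sample.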