Análisis de la competencia científica de alumnado de secundaria: respuestas y justificaciones a ítems de PISA

  1. Beatriz Crujeiras Pérez
  2. María Pilar Jiménez Aleixandre
Journal:
Revista Eureka sobre enseñanza y divulgación de las ciencias

ISSN: 1697-011X

Year of publication: 2015

Volume: 12

Issue: 3

Pages: 385-401

Type: Article

DOI: 10498/17598


Abstract

This paper addresses students’ development of scientific competency by examining: a) their answers and justifications to PISA multiple choice items; and b) the effect of students’ engagement in inquiry-based laboratory tasks, related to some of the performances assessed in PISA, on their acquisition of scientific competency. The participants are 21 high school students attending Physics and Chemistry courses in 9th and 10th grades. Their written responses at the beginning and end of the study were analysed by means of a rubric based on the PISA proficiency-level scales (OECD, 2008) for the competencies of identifying scientific issues and using scientific evidence. The results obtained in the pre- and post-test are compared in terms of percentages and number of adequate justifications. Differences were identified both between students’ justifications and their choices on the multiple choice questions, and between the results and justifications in the pre- and post-test.

References

  • Bybee, R. (2009). Scientific literacy and contexts in PISA 2006 Science. Journal of Research in Science Teaching, 46(8), 862-864.
  • Crujeiras, B., Gallástegui, J. R., & Jiménez-Aleixandre, M. P. (2013). Indagación en el laboratorio de química. Alambique: Didáctica de las ciencias experimentales, 74, 49-56.
  • Denzin, N. K. & Lincoln, Y. S. (2000). The discipline and practice of qualitative research. In Denzin, N. K. & Lincoln, Y. S. (Eds.), Handbook of Qualitative Research (pp. 1-28). 2nd ed. California: Sage Publications.
  • Drechsel, B., Carstensen, C., & Prenzel, M. (2011). The role of content and context in PISA interest scales: A study of the embedded interest items in the PISA 2006 science assessment. International Journal of Science Education, 33(1), 73-95.
  • Goldstein, H. (2004). International comparisons of student attainment: Some issues arising from the PISA study. Assessment in Education, 11(3), 319-330.
  • Haja, S., & Clarke, D. (2011). Middle school students' responses to two-tier tasks. Mathematics Education Research Journal, 23(1), 67-76.
  • Jiménez-Aleixandre, M. P. (2008). Designing argumentation learning environments. In S. Erduran & M. P. Jiménez-Aleixandre (Eds.), Argumentation in Science Education: Perspectives from classroom-based research. Dordrecht: Springer.
  • Jiménez Aleixandre, M. P., Bravo, B., & Puig, B. (2009). ¿Cómo aprende el alumnado a usar y evaluar pruebas? Aula de Innovación Educativa, 186, 10-12.
  • Jiménez-Aleixandre, M. P., & Puig, B. (2011). The role of justifications in integrating evidence in arguments: Making sense of gene expression. Paper presented at the ESERA Conference, Lyon, France, 5-9 September.
  • Lau, K. C. (2009). A critical examination of PISA's assessment on scientific literacy. International Journal of Science and Mathematics Education, 7(6), 1061-1088.
  • Ministerio de Educación, Cultura y Deporte (MECD). Ley Orgánica 8/2013, de 9 de diciembre, para la mejora de la Calidad Educativa. Boletín Oficial del Estado (BOE), 106.
  • Organización para la Cooperación y Desarrollo Económico (OCDE). (2008). Competencias científicas para el mundo del mañana. Madrid: Santillana.
  • Organisation for Economic Co-operation and Development (OECD). (2009). PISA take the test: Sample questions from OECD's PISA assessments. Paris: OECD.
  • Organisation for Economic Co-operation and Development (OECD). (2013). PISA 2015 Draft Science Framework. Paris: OECD.
  • Swanborn, P. G. (2010). Case study research: What, why and how? California: Sage Publications.
  • Tamir, P. (1990). Justifying the selection of answers in multiple choice items. International Journal of Science Education, 12(5), 563-573.
  • Treagust, D. F. (1988). Development and use of diagnostic tests to evaluate students' misconceptions in science. International Journal of Science Education, 10(2), 159-169.
  • Treagust, D. F., & Chandrasegaran, A. L. (2007). The Taiwan national science concept learning study in an international perspective. International Journal of Science Education, 29(4), 391-403.