


How To Quantify That Technology Is Improving Learning

Abstract

Researchers, evaluators, and practitioners need tools to distinguish between applications of technology that support, enhance, and transform classroom instruction and those that are ineffective or even deleterious. Here we present a new classroom observation protocol designed specifically to capture the quality of technology use to support science inquiry in high school science classrooms. We iteratively developed and piloted the Technology Observation Protocol for Science (TOP-Science), building directly on our published framework for quality technology integration. Development included determining content and face validity, as well as the calculation of reliability estimates across a broad range of high school science classrooms. The resulting protocol focuses on the integration of technology in classrooms on three dimensions: science and engineering practices, student-centered pedagogy, and contextualization. It uses both quantitative coding and written descriptions of evidence for each code; both are synthesized in a multi-dimensional measure of quality. Data are collected and evaluated from the perspective of the teacher's intentions, actions, and reflections on these actions. This protocol fills an important gap in understanding technology's role in teaching and learning by doing more than monitoring technology's presence or absence, and considering its integration in the context of the Next Generation Science Standards science and engineering practices. Its applications include research, evaluation, and potentially peer evaluation.

Introduction

Today's science teachers are challenged with immersing their students in the practices of scientific inquiry (DiBiase and McDonald 2015; Herrington et al. 2016). This includes having students ask testable questions, design related studies, collect and interpret data, engage in argumentation from evidence, connect findings to other research, and more. The importance of these practices is readily apparent in the Next Generation Science Standards (NGSS), which address eight science and engineering practices (SEPs) in the context of disciplinary core ideas and crosscutting concepts (National Research Council 2012).

At the same time, digital technology has become a commonplace learning and organizational tool in many schools. For example, according to a 2017 study on connectivity in education, over 40 governors invested resources in improving technology infrastructure in their states, resulting in over 30 million students in 70,000 schools gaining broadband access between 2013 and 2016 (Education Super Highway 2017). Indeed, technology applications have the potential to support and extend student application and learning of science practices. This support or extension can occur with the use of everyday applications, such as Internet searches and word processing, as well as with applications designed specifically for learning environments, such as classroom management systems (Herrington and Kervin 2007). It can even take advantage of the same applications used by science professionals, such as delineating watershed boundaries, mapping land use, and measuring water quality with GIS to address questions about factors impacting stream health (e.g., Stylinski & Doty 2013). These tools have the potential to support, enhance, and even transform the learning experience if used effectively. However, they can also be used in ineffective ways that confuse, distract, and interfere with learning.

Researchers and evaluators need instruments that will enable them to describe the effective integration of technology into science classrooms to understand this changing classroom landscape and better support teachers. Observation protocols provide researchers and evaluators with critical tools for examining pedagogical approaches and curricular resources, including technology hardware and software, and their relationship to student learning (e.g., Bielefeldt 2012). A recent review identified 35 observation protocols aimed at describing teachers' instructional practice for science classrooms; however, fewer than a third accounted for technology use (Stylinski et al. 2018). Of these, most just documented the presence of technology. Fewer than half of those that included technology described the quality of technology use by looking at the integration of technology with pedagogical practices. None comprehensively captured the quality of technology use in the specific context of instruction focused on science disciplinary practices.

In this paper, we present the Technology Observation Protocol for Science (TOP-Science), a new classroom observation protocol designed specifically to capture the quality of technology use to support science inquiry in high school science classrooms. We describe how we determined face validity and construct validity through (a) a careful alignment of the Protocol with our published framework of technology integration in science classrooms and (b) iterative pilot testing that included expert review and inter-rater reliability.

Theoretical Foundation of the Protocol

The Protocol builds on a published theoretical framework that consists of four dimensions: technology type, science and engineering practices (SEPs), student-centered teaching, and relevance to students' lives (Parker, Stylinski, Bonney, Schillaci & McAuliffe 2015). These dimensions are critical to high-quality technology integration, and their inclusion in our observation protocol supports the instrument's content validity. In this section, we describe each of these dimensions and their connection to the Protocol presented in this paper.

Technology Type

Digital technology in classrooms can be defined as computer hardware, software, data, and media that use processors to store or transmit data in digital format (Pullen et al. 2009). In our framework, we expanded on this definition by describing three types based on the use of technology. Specifically, some technology applications are primarily used in instructional settings, such as assessment tools (e.g., clicker software) and online course management (e.g., PowerSchool, Google Classroom). Other technology applications are most commonly used in STEM workplace settings (e.g., computer modeling, image data analysis). Finally, ubiquitous technology applications are used in many settings, including in the classroom, workplace, and elsewhere in daily life (e.g., word processing, Internet search engines). The designation of a technology application to the instructional, ubiquitous, or STEM workplace category may change over time as technologies evolve and are adapted to new settings. For example, Microsoft Excel was originally developed for businesses, universities, and other workplace environments; but it now appears regularly in many different settings such as K-12 schools and homes. In the Protocol, we treated technology type differently from the other three dimensions. As described later, we consider the integration of technology with each of the other three dimensions, and we simply had observers record technology type as a way to capture the diversity of technology applications in the classroom setting.

Authentic Science and Engineering Practices

Real-world technology applications offer the potential to help teachers align with the Next Generation Science Standards (NGSS), which weave eight science and engineering practices with crosscutting concepts and core ideas (NRC, 2012). These eight practices provide a thoroughly vetted description of science practices relevant for the classroom, and thus, they form one of three central dimensions of our observation protocol, which, as noted, are considered in the light of technology integration. Specifically, users consider the integration of technology with this description of authentic science inquiry.

Technology, regardless of type, can scaffold students' understanding and learning during science-inquiry-based activities (e.g., Devolder et al. 2012; Rutten et al. 2012). Classroom-based technology applications, both those designed specifically for education and those drawing from science workplace settings, can help teachers engage students in real-world science and engineering pursuits, deepening their understandings of science research. This linkage between science and technology expands options for classroom science inquiry, such as allowing students to observe phenomena otherwise not possible in a classroom setting (D'Angelo et al., 2014; Smetana and Bell 2011). For example, computer simulations, such as those offered by SimQuest, scaffold students' understanding by asking them to conduct multiple trials of an experiment to explore scientific principles and the effects of changing certain variables on outcomes of interest (De Jong 2006).

Student-Centered Teaching

Student-centered teaching involves instructional practices that allow students to have a more self-directed experience with their learning. The Nellie Mae Education Foundation (NMEF) defines student-centered teaching as encompassing personalized instruction, student autonomy or ownership of the learning process, and competency-based instruction (NMEF and Parthenon-EY 2015; Reif et al. 2016). Footnote 1 Our Protocol breaks student-centered teaching into three aspects closely aligned with the Nellie Mae definition: personalized instruction, autonomy, and competency-based teaching. Personalized instruction occurs when students' individual skills, interests, and developmental needs are taken into account for instruction. Students have autonomy over their own learning when they have frequent opportunities to take ownership of their learning process and have a degree of choice. Competency-based instruction is defined as teaching that progresses based on students' mastery of skills and knowledge, rather than being dictated by how much time has been spent on an activity or topic (NMEF and Parthenon-EY 2015).

Many teachers have capitalized on the increased availability of technology by using digital devices and technology resources to encourage student-centered learning through increased personalization, individual pacing, and student ownership (Reif et al. 2016). Maddux and Johnson (2006) note that technology implementation strategies are most successful when they are more student-centered and move away from teacher-directed instruction. The use of technology in student-centered instruction can allow students to learn in ways that would not otherwise be possible without the technology. For example, online modules may facilitate personalized learning by allowing students to learn at their own pace, using diverse tools and resources to support learning and providing an appropriate level of scaffolding (Patrick et al. 2013). In addition, a learning management system such as Blackboard or Moodle might facilitate competency-based learning by helping to document students' progress toward demonstrating competencies, which might otherwise be too burdensome without technology (Sturgis and Patrick 2010). Technology might also support student-centered instruction such that students may have autonomy over what kind of technology to choose for an activity, allowing technology to be a "potential enabler" of student-centered learning (NMEF and Parthenon-EY 2015, p. 6).

Contextualizing Learning for Relevance

The use of technology in classrooms enables teachers to make learning more relevant to the world in which students live (Brophy et al. 2008; Hayden et al. 2011; Miller et al. 2011). Since the 1990s, there has been an emphasis on trying to make science more relevant to students (Fensham 2004). The proponents of making science content more relevant to students are generally those who argue that such instructional practices lead to student-centered learning and increase student motivation for learning science (Moeller and Reitzes 2011; Ravitz et al. 2000; Stearns et al. 2012).

One of the more common ways to increase the relevance of science to students includes making connections between the content and the real world in which students live. This idea of increasing relevance is also sometimes referred to as authentic learning (Lombardi 2007; Reeves et al. 2002). Grounding lessons in the local geographic context, making connections to youth culture, or matching classroom tasks and activities with those of professionals in practice are all ways in which learning can be made more relevant or authentic to students. Technology can support and facilitate this type of authentic learning via such tools as web resources and search engines, online discussions or emails, online simulated environments, digital photography, or voice recorders (Herrington and Kervin 2007). Technology can facilitate authentic learning by using animations, sounds, and images, allowing students to engage more easily with the material on a conceptual level. For example, high school students in Australia used web resources in conjunction with a field trip to the Sydney Olympic Park to research a problem the park was having with mosquitoes and rats attracted by food scraps, standing water, and trash left behind by visitors. The "geography challenge" was presented to students using animated scenarios involving the mosquitoes and rats affecting visitors' experiences at the park, and invited them, as "geography consultants," to investigate and suggest recommendations on ways to mitigate the problems and "restore an ecological balance to the area" (Brickell and Herrington 2006, p. 536). Students conducted web research before visiting the site, and presented a report with their findings to "park authorities."

Summary

While the three central dimensions of the Protocol (SEPs, student-centered teaching, and contextualization) provide a useful frame for describing the integration of technology with fundamental pedagogical practices, they are not mutually exclusive. The SEPs encourage making science learning authentic, that is, contextualizing learning, through reflecting the real work of scientists and including questions of interest to the field of science. The SEPs likewise encourage autonomy, a key facet of student-centered teaching, especially through opportunities to define researchable questions.

Development of the Protocol

The development and testing process of the TOP-Science Protocol reflects that undertaken by similar observation tools (e.g., Dringenberg et al. 2012; Sawada et al., 2002). As noted, we built the initial draft of the Protocol on the theoretical framework previously described and after review of similar observation protocols. We refined this first draft through preliminary piloting and based on feedback from several experts in instrument development. We then conducted four rounds of iterative pilot testing of the Protocol (Table 1), each followed by feedback-driven revisions as well as additional expert review.

Table 1 Protocol pilot testing


In the first two rounds, individual observers completed feedback sheets that were then analyzed by the team and used to revise the Protocol. In the third and fourth rounds, observer pairs completed observations, compared results, and provided feedback based on the comparisons. Our research team, along with additional trained observers, conducted 66 observations of diverse science classes in 26 high schools across seven states. The seven states included in our study represent a wide range of areas from suburban, urban, and rural locations, as well as from the west coast, New England, and Mid-Atlantic regions of the country. The classrooms observed ranged from 9th to 12th grades (including mixed grades). They occurred in a variety of school settings including low-resourced and high-resourced schools, public schools, and charter schools, and included some that self-identify as STEM schools.

The final Protocol consists of four parts (Appendix A): (1) pre-observation teacher questions to understand teachers' intention for technology integration in their class (Fig. A1), (2) observation sheets where codes and field notes are recorded for the four framework dimensions (Fig. A2) in 10-min intervals, (3) post-observation teacher questions to gather the teacher's reflection on their lesson (Fig. A3), and (4) a code summary sheet that produces a multi-dimensional measure of technology integration quality (Fig. A4).

A detailed Reference Guide walks the observer through each step in the Protocol process, and provides all instructions needed to implement the Protocol. For each of the three categories, there is a detailed description of what observers should look for, with descriptors and examples for each category and technology level. Table 2 provides sample technology codes and evidence text for each category, demonstrating different levels of technology integration for similar lessons.

Table 2 An example from the Reference Guide that illustrates three levels of technology integration for the three central dimensions of the protocol


The pre- and post-observation teacher questions offer insight into teacher technology integration goals and reflections on implementation. The observation sheet provides a quantitative metric and qualitative description of technology integration focused on teacher practices. Observations occur in 10-min intervals; within each interval, observers complete codes for each dimension and include field notes providing evidence for each of the codes. The code summary sheet includes a template to synthesize results across the 10-min intervals, as well as incorporating teacher responses to the pre- and post-questions. Four key features emerged from the protocol development process. Each is described below.

Multi-Level Coding System Coupled with Descriptive Text

To address the challenging process of developing meaningful and user-friendly codes that could describe the presence of each of our framework dimensions, including the extent to which technology was integrated, we tested a range of options, starting with the most basic check for presence or absence of the category and technology. As we had already found in our literature review, however, the presence of technology is not the same as quality integration of that technology. To address this, the next iteration applied the terms absent/low/medium/high for each category. Ultimately, across the four rounds of pilot testing, we found that observers considered it easiest to describe each dimension as either not present, incidentally present, or embedded in the lesson, and the level of technology integration in each of the dimensions as minimally integrated, partially integrated, or fully integrated.

Our initial draft of the Protocol utilized tallies to indicate the frequency of observed technology use and relevant instructional practices. After the first round of pilot testing, we realized that tallies of codes did not paint a rich enough picture of the quality of technology integration. Therefore, we introduced field notes to complement these tallies (as written text alone would present too onerous a task for our 10-min observation intervals), and piloted this structure in subsequent rounds. Specifically, observers supply written text describing evidence for each chosen code. Later, they combine these data to describe the quality of technology integration in the classroom (see "Synthesis of Data into a Quality Measure" for more information). Thus, the Protocol does not limit observers to a checklist or set of codes, but instead couples quantitative and qualitative data. This pairing provides the necessary flexibility to capture multiple and diverse ways that the same technology can be used for different purposes. For example, after selecting the appropriate technology-practice integration that involves GIS (a workplace technology application), the observer could describe evidence of how the teacher used GIS to help students brainstorm environmental science research questions, collect geospatially based data, present geospatially explicit findings, develop maps to share with community members, and more.
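The interval records described above pair a three-level presence code and a three-level technology-integration code with evidence text for each dimension. A minimal sketch of such a record follows; the field names and labels are illustrative assumptions, not part of the published Protocol materials.

```python
from dataclasses import dataclass

# Three-step scales mirroring the Protocol's coding levels.
PRESENCE_LEVELS = ("not_present", "incidental", "embedded")
TECH_LEVELS = ("minimal", "partial", "full")

@dataclass
class DimensionCode:
    """Hypothetical record for one dimension within one 10-min interval."""
    dimension: str          # "SEP", "student_centered", or "contextualization"
    presence: str           # one of PRESENCE_LEVELS
    tech_integration: str   # one of TECH_LEVELS
    evidence: str           # field notes justifying the chosen codes

    def __post_init__(self):
        # Reject codes outside the Protocol's scales.
        assert self.presence in PRESENCE_LEVELS
        assert self.tech_integration in TECH_LEVELS

# Example record loosely based on the GIS scenario in the text.
code = DimensionCode(
    dimension="SEP",
    presence="embedded",
    tech_integration="full",
    evidence="Teacher used GIS to help students brainstorm research "
             "questions and collect geospatially based data.",
)
```

Coupling the evidence string to the quantitative codes in one record reflects the Protocol's design choice: neither the tally nor the text stands alone.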

Integration of Technology with the Three Dimensions

In our initial attempts to operationalize the four dimensions from our Framework into an observation protocol, we recorded the technology use category separately from the three other categories (SEPs, student-centered pedagogy, and contextualization). We quickly realized, however, that in order to truly assess the quality of technology use, we had to more clearly define technology integration within the three other dimensions. Coding technology use separately from the dimensions and then combining the codes post hoc was artificial and did not accurately reflect a true measure of integration. To address this challenge, as described earlier, we replaced it with a system of coding the degree of technology integration with each of the components, not simply technology use. For example, in one pilot classroom, students studied chemical reactions by choosing a reaction of interest and designing a demonstration that would illustrate this reaction during an open house event. With Chromebook laptop computers, they researched chemical reactions and potential risks and identified required supplies. Some students also used Google Docs to record this information and plan their demonstration in collaboration with other students. With our coding system, we were able to document the extent to which the activity was student-centered (i.e., embedded student-centeredness, due to the level of autonomy students had over their proposal and inquiry methods), and to what extent technology use was integrated in those student-centered practices (i.e., partially integrated, because students had options about the kind of technology to use in the activity, and the technology enhanced the student-centeredness of their demo designs). For this particular example, it would not be considered fully integrated because, while the use of technology enhanced the student-centeredness of the activity, the level of student-centeredness was not necessarily contingent on the use of technology. Students would still have been able to choose reactions to study, design their demonstration, and collaborate with other students without technology.

Focus on Teacher Intentions, Actions, and Reflections

The Protocol explicitly focuses on observing the teacher and his/her actions and statements during the class period, and does not include codes for student actions. While relevant student actions and statements can be recorded in the evidence section, the observer does not try to capture students' responses, as it is difficult to follow all of the students and know how they are responding without talking with them. In addition, their responses may be varied and confounded with factors unrelated to the teacher's actions or intentions.

This focus on the teacher requires that the Protocol also include an opportunity for the teacher to describe their intentions and goals before class and their reflections after class, which provides context for the observation. Pairing these insights with the teacher's actions during the lesson offers a more encompassing perspective for the observer to consider when evaluating the quality of technology integration.

Synthesis of Data into a Quality Measure

At the conclusion of the observation, the observer has a collection of 10-min "interval sheets" with codes for the three categories and evidence (written text) of each code. In order to aggregate these data into a meaningful picture of the quality of technology integration across the full lesson, we developed a post-observation summary that combines the quantitative and qualitative information from each interval sheet, as well as information from the pre- and post-observation teacher questions. This summary consists of an average and the highest-level technology codes for each category (science and engineering practices, student-centered instruction, and contextualized teaching) that is supported by relevant qualitative evidence from field notes.
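The summary computation described above (an average plus a highest-level code per category across the 10-min intervals) can be sketched as follows. The numeric encoding of the three integration levels as 0-2 is an illustrative assumption; the Protocol's summary sheet is completed by hand.

```python
# Map the three technology-integration levels onto 0-2 so interval
# codes can be averaged; this encoding is illustrative only.
LEVELS = {"minimal": 0, "partial": 1, "full": 2}

def summarize_intervals(intervals):
    """intervals: one dict per 10-min interval, mapping each category
    name to its integration-level label. Returns, per category, the
    mean level across the lesson and the highest level observed."""
    summary = {}
    for cat in intervals[0]:
        values = [LEVELS[interval[cat]] for interval in intervals]
        summary[cat] = {
            "mean": sum(values) / len(values),  # average across intervals
            "max": max(values),                 # highest-level code
        }
    return summary

# A hypothetical three-interval lesson.
lesson = [
    {"sep": "partial", "student_centered": "minimal", "contextualization": "minimal"},
    {"sep": "full",    "student_centered": "partial", "contextualization": "minimal"},
    {"sep": "full",    "student_centered": "partial", "contextualization": "partial"},
]
result = summarize_intervals(lesson)
# result["sep"] → {"mean": 1.67 (approx.), "max": 2}
```

In the Protocol itself, each summarized value must also be supported by qualitative evidence from the field notes; the numbers alone are not the quality measure.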

Examining Validity and Reliability

We examined the Protocol's validity throughout the development process, and inter-rater reliability of the quantitative codes was calculated using results from the fourth round of iterative piloting.

Instrument Validity

We addressed content validity of the Protocol by aligning it with an existing published theoretical framework and its literature base, and comparing the Protocol with similar protocols. The development of the Protocol consisted of an iterative process of collecting data from a variety of classrooms, reviewing interpretations and definitions of items and constructs, and reflecting on the literature and previously developed protocols. In this way, we established categories and indicators that reflect the range of practices of technology integration (American Educational Research Association 2014). We established face validity of the Protocol with an expert panel who reviewed and submitted written feedback on three drafts of the Protocol and supporting materials. In particular, the expert review confirmed the categories and ratings of technology integration, and their feedback resulted in definitions that were more closely aligned with understandings of technology across a broader range of classrooms. We also made refinements to the reference guide and other support materials after each review.

Inter-Rater Reliability

We calculated inter-rater reliability using results of our fourth implementation (see Table 1). We observed 24 teachers in 17 schools in 6 states (MA, RI, CA, MD, WV, and OR). Grade levels ranged from 8th to 12th grade; schools were in urban, suburban, and rural areas, with a range of demographic characteristics.

Eight observers completed a total of 41 classroom observations, with two to three observers in each classroom. Each observer observed from three to ten full class periods. To the extent possible (given geographic limitations), different sets of partners observed different classrooms. After each observation, individuals completed the Protocol and recorded their results online. The pairs then compared results using a worksheet that helped to reconcile codes and identify feedback for further revisions. Inter-rater reliability was conducted on the 10-min intervals within each observed lesson, resulting in a total of 215 paired intervals. Krippendorff's alpha was used to determine reliability; Krippendorff's alpha is appropriate for this analysis because it "generalizes across scales of measurement; can be used with any number of observers, with or without missing data" (Hayes and Krippendorff 2007, p. 78). Alpha values ranged from 0.40 (student engagement) to 1.0 (STEM workplace technology, which had 100% agreement) (Table 3).

Table 3 Krippendorff's alpha values for Protocol codes

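For two raters coding the same intervals on a nominal scale with no missing data, Krippendorff's alpha reduces to a short computation over a coincidence matrix. The sketch below is a simplified illustration of that special case, not the full statistic used in the study (which must also handle missing values and other measurement scales).

```python
from collections import Counter
from itertools import product

def krippendorff_alpha_nominal(rater1, rater2):
    """Krippendorff's alpha for two raters, nominal codes, no missing
    data: alpha = 1 - D_o / D_e, where D_o is observed disagreement and
    D_e is the disagreement expected by chance."""
    assert len(rater1) == len(rater2)
    # Coincidence matrix: each unit's pair (a, b) contributes one
    # coincidence to cell (a, b) and one to cell (b, a).
    coincidences = Counter()
    for a, b in zip(rater1, rater2):
        coincidences[(a, b)] += 1
        coincidences[(b, a)] += 1
    n = 2 * len(rater1)  # total pairable values
    marginals = Counter()
    for (a, _b), count in coincidences.items():
        marginals[a] += count
    # Observed disagreement: proportion of off-diagonal coincidences.
    d_o = sum(c for (a, b), c in coincidences.items() if a != b) / n
    # Expected disagreement if codes were paired at random.
    d_e = sum(marginals[a] * marginals[b]
              for a, b in product(marginals, repeat=2) if a != b) / (n * (n - 1))
    return 1.0 if d_e == 0 else 1.0 - d_o / d_e

# Perfect agreement (like the STEM workplace technology code) gives 1.0.
perfect = krippendorff_alpha_nominal(["sep", "sep", "none", "none"],
                                     ["sep", "sep", "none", "none"])
```

A full analysis would typically use an established implementation rather than this sketch, but the structure above shows why alpha penalizes chance agreement in a way raw percent agreement does not.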

Given these results, we made a series of adjustments to the Protocol and to the definitions and guidance included in the Reference Guide. Specifically, the code for student engagement had the lowest alpha estimate (0.40). That code was removed rather than revised, in order to retain the focus on teacher activities rather than student activities. Student-level observations, including engagement as well as academic achievement, were not within the scope of this Protocol.

The four codes for technology type (instructional, ubiquitous, STEM instructional, and STEM workplace) had alpha estimates ranging from 0.46 to 1.0. Given this range and the elimination of technology type from the post-observation code summary, the four codes were removed and replaced with a space for a qualitative description of the technology type. The Reference Guide provided additional support on classifying technology types into each category.

We retained six codes in the final Protocol with Krippendorff's alpha estimates from 0.51 to 0.78. Four codes (SEP level, SEP technology, student-centered level, student-centered technology) had acceptable values above 0.68, while contextualization level and contextualization technology were below 0.65. Given that the values were lower than desired and that the post-observation checks between observers resulted in recommendations for clarifications to the Reference Guide, we revised the Guide to address the recommendations, including refining descriptions for each code.

Discussion and Conclusion

The TOP-Science Protocol successfully addresses the gap around observing and documenting the quality of technology integration in high school classrooms (Bielefeldt 2012; Stylinski et al. 2018). In particular, it offers a tool that captures how teachers apply technology to support teaching and learning of scientific pursuits and practices. The Protocol's focus on integration of technology in three key dimensions expands other observation protocols by moving beyond simply registering technology use, and by targeting key categories that are specifically relevant for science classrooms. Its use of a limited number of descriptors, a straightforward three-level coding system, and the integration of descriptive text justifying each code choice provides a manageable frame for each 10-min observation interval. Together, these strategies direct the observer's attention to teacher intentions, actions, and reflections without overwhelming them with trying to interpret student actions and perspectives. The final step of summarizing interval data by category offers an overall quality measure of technology use, while still providing the option to substitute any other synthesis calculation.

The Protocol has some limitations that should be considered in any application and could be addressed in future revisions. First, its structure suggests that full integration of technology in all three categories points to the highest quality. However, such extensive integration across an entire class period is likely not realistic or even desirable, as technology is not appropriate for all learning outcomes and activities. Our Protocol does not account for this variation in appropriate use of technology. That is, it is effective at identifying high-quality technology use, as well as incidental applications of technology, but does not elucidate when technology might or might not be the most appropriate tool to support learning. Thus, further work is needed to explore how to determine technology use relative to technology benefits and affordances. Second, the inter-rater reliability scores in some areas were lower than desired; revisions were made to both the coding structure and the Reference Guide to address the problems we identified, and additional inter-rater reliability testing using the updated Protocol and Reference Guide is needed to examine further evidence of the Protocol's reliability. Finally, the Protocol does not explicitly examine the impact of observed teacher practices on student outcomes. Our Protocol allows a view into teachers' intentions and actions; however, the final arbiter of high-quality instruction must always be student outcomes. A separate evaluative tool is needed to identify or measure these outcomes, ideally in ways that reflect the instructional dimensions relevant for science classrooms (i.e., science practices, student-centered teaching, and contextualization).

Despite these limitations, our Protocol successfully frames the quality of technology integration from the perspective of the teacher and in the context of today's science classrooms, and it is being used in research (Marcum-Dietrich et al. 2019). As described above, the Protocol was developed with care to establish both face validity and construct validity, and adequate reliability has been demonstrated. Additional studies of inter-rater reliability using the final version of the Protocol will serve to further assess its strength as a measure of the quality of technology integration. The inclusion of a post-observation code summary adds an important component not included in many other protocols, helping researchers synthesize what is observed in classrooms using the Protocol's theoretical framework.
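The inter-rater reliability checks discussed above can be made concrete with a short sketch. This is not the study's analysis: the measure the paper cites is Krippendorff's alpha (Hayes and Krippendorff 2007), and the rater data below are invented; the sketch instead computes Cohen's kappa, a simpler two-rater measure of the same idea of chance-corrected agreement on nominal interval codes.

```python
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Chance-corrected agreement between two raters' nominal codes.

    Illustrative stand-in: the Protocol's reliability work cites
    Krippendorff's alpha (Hayes & Krippendorff 2007); Cohen's kappa is a
    simpler two-rater analogue of the same chance-correction idea.
    """
    n = len(codes_a)
    # Proportion of intervals on which the two raters assigned the same code.
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    # Agreement expected by chance from each rater's marginal code frequencies.
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Invented example: two raters coding the same ten intervals on a 0-2 scale.
rater1 = [2, 2, 1, 0, 1, 2, 1, 1, 0, 2]
rater2 = [2, 1, 1, 0, 1, 2, 1, 2, 0, 2]
kappa = cohens_kappa(rater1, rater2)  # raw agreement 0.8, kappa ~ 0.69
```

Kappa below raw agreement is the point: two raters of three-level codes agree fairly often by chance alone, so a chance-corrected statistic is the more honest reliability evidence.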

Notes

  1. Nellie Mae's definition includes a fourth component of student-centered learning—"learning can happen anytime, anywhere"—however, because this component addresses learning that happens outside the classroom, it extends beyond the scope of the Protocol.

References

  • American Educational Research Association, American Psychological Association, National Council on Measurement in Education, & Joint Committee on Standards for Educational and Psychological Testing. (2014). Standards for educational and psychological testing. Washington, DC: AERA.

  • Bielefeldt, T. (2012). Guidance for technology decisions from classroom observation. Journal of Research on Technology in Education (International Society for Technology in Education), 44(3), 205–223.

  • Brickell, G., & Herrington, J. (2006). Scaffolding learners in authentic, problem based e-learning environments: The geography challenge. Australas J Educ Technol, 22(4), 531–547.

  • Brophy, S., Klein, S., Portsmore, M., & Rogers, C. (2008). Advancing engineering education in P-12 classrooms. J Eng Educ, 97(3), 369–387.

  • D'Angelo, C., Rutstein, D., Harris, C., Haertel, G., Bernard, R., & Borokhovski, E. (2014, March). Simulations for STEM learning: Systematic review and meta-analysis study overview. Menlo Park, CA: SRI Education.

  • De Jong, T. (2006). Technological advances in inquiry learning. Science, 312(5773), 532–533.

  • Devolder, A., van Braak, J., & Tondeur, J. (2012). Supporting self-regulated learning in computer-based learning environments: Systematic review of effects of scaffolding in the domain of science education. J Comput Assist Learn, 28(6), 557–573.

  • DiBiase, W., & McDonald, J. R. (2015). Science teacher attitudes toward inquiry-based teaching and learning. The Clearing House, 88(2), 29–38.

  • Dringenberg, E., Wertz, R. E. H., Purzer, S., & Strobel, J. (2012). Development of the science and engineering classroom learning observation protocol. Presented at the 2012 American Society for Engineering Education National Conference. Retrieved August 8, 2015 from http://www.asee.org/public/conferences/8/papers/3324/view.

  • Education Super Highway. (2017). The 2016 state of the states. Retrieved July 30, 2018, from https://s3-us-west-1.amazonaws.com/esh-sots-pdfs/2016_national_report_K12_broadband.pdf.

  • Fensham, P. J. (2004). Increasing the relevance of science and technology education for all students in the 21st century. Sci Educ Int, 15(1), 7–26.

  • Hayden, K., Ouyang, Y., Scinski, L., Olszewski, B., & Bielefeldt, T. (2011). Increasing student interest and attitudes in STEM: Professional development and activities to engage and inspire learners. Contemporary Issues in Technology and Teacher Education, 11(1), 47–69.

  • Hayes, A. F., & Krippendorff, K. (2007). Answering the call for a standard reliability measure for coding data. Commun Methods Meas, 1(1), 77–89.

  • Herrington, J., & Kervin, L. (2007). Authentic learning supported by technology: Ten suggestions and cases of integration in classrooms. Educational Media International, 44(3), 219–236.

  • Herrington, D. G., Bancroft, S. F., Edwards, M. M., & Schairer, C. J. (2016). I want to be the inquiry guy! How research experiences for teachers change beliefs, attitudes, and values about teaching science as inquiry. J Sci Teach Educ, 27(2), 183–204.

  • Lombardi, M. M. (2007). Authentic learning for the 21st century: An overview. In D. G. Oblinger (Ed.), Educause Learning Initiative, ELI Paper 1: 2007. Retrieved October 19, 2017 from: https://www.researchgate.net/profile/Marilyn_Lombardi/publication/220040581_Authentic_Learning_for_the_21st_Century_An_Overview/links/0f317531744eedf4d1000000.pdf.

  • Maddux, C. D., & Johnson, D. L. (2006). Information technology, type II classroom integration, and the limited infrastructure in schools. Comput Sch, 22(3–4), 1–5.

  • Marcum-Dietrich, N., Bruozas, M., & Staudt, C. (2019). Embedding computational thinking into a middle school science meteorology curriculum. Baltimore, MD: Interactive poster presented at the annual meeting of the National Association for Research in Science Teaching (NARST).

  • Miller, L. M., Chang, C.-I., Wang, S., Beier, M. E., & Klisch, Y. (2011). Learning and motivational impacts of a multimedia science game. Computers and Education, 57(1), 1425–1433.

  • Moeller, B., & Reitzes, T. (2011). Integrating technology with student-centered learning. Quincy, MA: Education Development Center, Inc., Nellie Mae Education Foundation.

  • National Research Council. (2012). A framework for K-12 science education: Practices, crosscutting concepts, and core ideas. Washington, DC: National Academies Press.

  • Nellie Mae Education Foundation, & Parthenon-EY. (2015). The connected classroom: Understanding the landscape of technology supports for student-centered learning.

  • Parker, C. E., Stylinski, C. D., Bonney, C. R., Schillaci, R., & McAuliffe, C. (2015). Examining the quality of technology implementation in STEM classrooms: Demonstration of an evaluative framework. J Res Technol Educ, 47(2), 105–121.

  • Patrick, S., Kennedy, K., & Powell, A. (2013). Mean what you say: Defining and integrating personalized, blended and competency education. Vienna, VA: International Association for K-12 Online Learning.

  • Pullen, D. L., Baguley, M., & Marsden, A. (2009). Back to basics: Electronic collaboration in the education sector. In J. Salmons & L. Wilson (Eds.), Handbook of research on electronic collaboration and organizational synergy (pp. 205–222). Hershey, PA: IGI Global.

  • Ravitz, J., Wong, Y., & Becker, H. (2000). Constructivist-compatible beliefs and practices among US teachers. Irvine, CA: Center for Research on Information Technology and Organizations.

  • Reeves, T. C., Herrington, J., & Oliver, R. (2002). Authentic activities and online learning. Annual Conference Proceedings of Higher Education Research and Development Society of Australasia, Perth. Retrieved October 19, 2017, from http://researchrepository.murdoch.edu.au/id/eprint/7034/1/authentic_activities_online_HERDSA_2002.pdf.

  • Reif, G., Schultz, G., & Ellis, S. (2016). A qualitative study of student-centered learning practices in New England high schools. Boston: Nellie Mae Education Foundation and University of Massachusetts Donahue Institute.

  • Rutten, N., van Joolingen, W. R., & van der Veen, J. T. (2012). The learning effects of computer simulations in science education. Comput Educ, 58(1), 136–153.

  • Sawada, D., Piburn, M. D., Judson, E., Turley, J., Falconer, K., Benford, R., & Bloom, I. (2002). Measuring reform practices in science and mathematics classrooms: The reformed teaching observation protocol. Sch Sci Math, 102(6), 245–253.

  • Smetana, L. K., & Bell, R. L. (2011). Computer simulations to support science instruction and learning: A critical review of the literature. Int J Sci Educ, 34(9), 1337–1370.

  • Stearns, L. M., Morgan, J., Capraro, M. M., & Capraro, R. M. (2012). A teacher observation instrument for PBL classroom instruction. Journal of STEM Education: Innovations and Research, 13(2), 7–16.

  • Sturgis, C., & Patrick, S. (2010). When success is the only option: Designing competency-based pathways for next generation learning. Vienna, VA: International Association for K-12 Online Learning.

  • Stylinski, C. D., & Doty, C. (2013). The inquiring with GIS (iGIS) project: Helping teachers create and lead local GIS-based investigations. In J. G. MaKinster, N. M. Trautmann, & M. Barnett (Eds.), Teaching science and investigating environmental issues with geospatial technology: Designing effective professional development for teachers (pp. 161–190). Springer.

  • Stylinski, C. D., DeLisi, J., Wong, J., Bonney, C., Parker, C. E., & Doty, C. (2018). Mind the gap: Reviewing measures of quality and technology use in classroom observation protocols. Paper presented at the NARST International Conference, Atlanta, GA.


Acknowledgments

We thank all of our study teachers who donated their time and opened their classrooms in support of this research.

Funding

This material is based upon work supported by the National Science Foundation under Grants #1438396 and 1438368. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.

Author data

Affiliations

Respective author

Correspondence to Caroline E. Parker.

Ethics declarations

Ethical Approval

All procedures performed in studies involving homo participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards.

Conflict of Interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix A. Technology Observation Protocol for Science

Fig. A1. Part 1. Pre-observation information (complete one for each observation)

Fig. A2. Part 2. Interval sheet (complete one for every 10-min interval during the observation)

Fig. A3. Part 3. Interval sheet (complete one for every 10-min interval during the observation)

Fig. A4. Part 4. Post-observation summary

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.


About this article


Cite this article

Parker, C.E., Stylinski, C.D., Bonney, C.R. et al. Measuring Quality Technology Integration in Science Classrooms. J Sci Educ Technol 28, 567–578 (2019). https://doi.org/10.1007/s10956-019-09787-7


Keywords

  • Classroom observation protocol
  • Technology integration
  • High school science
  • Science inquiry
  • Evaluation

Source: https://link.springer.com/article/10.1007/s10956-019-09787-7
