
Formative assessment and key competences for a conscious recovery after COVID-19: an Action-Research at a school in Italy to enhance reflection starting from mistakes

Published online by Cambridge University Press:  28 November 2022

Daniela Canfarotta*
Affiliation:
Department of Pedagogical Psychological Sciences, Physical Exercise and Training, University of Palermo, Viale delle Scienze – Edificio 15, Palermo, Italy
Carla Lojacono
Affiliation:
Department of Pedagogical Psychological Sciences, Physical Exercise and Training, University of Palermo, Viale delle Scienze – Edificio 15, Palermo, Italy
* Author for correspondence: Daniela Canfarotta, E-mail: [email protected]

Abstract

The study presents the results of an Action-Research project carried out during the COVID-19 pandemic with Italian teachers of primary, lower and upper secondary schools who were interested in monitoring their students' activity in that difficult situation. The purposes of this study were: (a) to demonstrate that the involvement of teachers in the creation of metacognitive tools promotes the use of formative assessment at school; (b) to verify to what extent the use of a metacognitive form makes students more aware of the mistakes made during a test. The results were: (a) teachers showed great enthusiasm in adapting the metacognitive form to their school subject; (b) a high percentage of students appreciated the form; moreover, one group of students improved in identifying the typology of their errors and understood more clearly what they should study, and how, in order to correct their mistakes; another group noted that their awareness of the strengths of their study method had grown; finally, one group highlighted that the skills used while completing the form were also useful in other areas of their daily life, not only at school. Both teachers and students appreciated the online version of the tool: the pie charts created automatically by the system, displaying percentages for the typologies of errors made, provided immediate feedback and further motivated students. The study shows how much reflection on mistakes can be a source of growth.

Type
Research Article
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution, and reproduction in any medium, provided the original work is properly cited.
Copyright
© The Author(s), 2022. Published by Cambridge University Press on behalf of The Classical Association

Introduction

During the 2020–2021 school year, schools around the world faced the challenges of the new wave of the COVID-19 pandemic (Colao et al., 2020). While in the first wave, from February to June 2020, teachers and students were caught unprepared, in the second, from September 2020 to June 2021, there was greater awareness of the risks of prolonged online teaching if not properly designed (Harris and Jones, 2020).

For instance, in Europe, surveys were carried out in Germany and Austria to investigate the most relevant effects of the pandemic and, among the various results, a significant difference between diligent and less diligent students was found in their use of online teaching. In particular, home study and the consequent requests from teachers were perceived as particularly challenging. Furthermore, in the transition from face-to-face to online teaching, the most relevant issue was the evaluation of students. Since students were no longer physically in the classroom, it was indeed more difficult for teachers to ascertain their knowledge: when at home, students are more likely to draw information from sources other than their own knowledge during moments of evaluation (Huber and Helm, 2020). The pandemic lockdown also had a significant impact in Great Britain: on the one hand, the time available for teaching classical languages was reduced; on the other, the exams remained substantially unchanged, and postponing the exam dates by a few weeks certainly did not solve the students' problems (Hunt, 2020). In Sweden it was not possible to take national tests in schools or university entrance exams, which had an impact on admissions to universities in the autumn (Wiberg et al., 2021). In America, many schools were not able to adapt the summative assessment system they generally used to tests carried out at home, since the latter had different characteristics from the traditional ones (Wyse et al., 2020). In Africa, the need to update the usual teaching practices, inadequate for online teaching, was highlighted too (Owolabi, 2020). Attempts were also made to look for solutions: for instance, in the Middle East, in Saudi Arabia, a program based on metacognitive strategies was developed to raise students' awareness of syntactic and semantic errors in order to improve their translation performance (Amin, 2019).

Overall, in these months of pandemic, the importance of combining summative evaluation, which measures the amount of knowledge learned, with formative evaluation has emerged in all its urgency. As Taras (2010) argues, all evaluation begins with a summative one, which is a judgement, whereas formative assessment is actually summative assessment plus feedback that the students use. Nowadays, professional refresher courses for teachers aim to make them increasingly capable of using this latter type of assessment, both face-to-face and online (Badii and Lorenzo, 2018; Huertas-Bustos et al., 2018; Martinez-Borreguero et al., 2017; Postholm, 2012; Verschaffel et al., 2019; Wade-Jaimes et al., 2018). When doing so, it is possible to observe the growth of the students' metacognitive skills, which concern the entire learning process, and thus monitor how they assimilate knowledge (Judge, 2021; Nieto, 2017; Wong and Zhang, 2020).

Assessment based on competences

The European Council (2018) has underlined the need for an assessment based on competences characterised by: a vision of knowledge opposed to an academic encyclopedism focused only on content; a more practical approach, aimed at solving problems; and active learning that develops skills, abilities, values and transversal competences applicable in more than one context (Díaz-Barriga, 2011; Vidal Ledo et al., 2015).

When assessment is based on competences, it is crucial that students receive formative feedback on the competences they acquired through their study activities. In this way, self-regulated learning is facilitated, and students can reflect on the mistakes they made in an atmosphere of trust rather than of judgement, which is strictly connected to the grade (Karabenick and Zusho, 2015; Ya-Hui et al., 2012). Furthermore, when teachers ask students appropriate questions about why they did their homework in a certain way, they are progressively turning them into active citizens, able to reflect on the ‘Why’ of things (Pellerey, 2016; Pring, 2016; Rodríguez Gómez et al., 2018; Taysum, 2012). According to Ciappei and Cinque (2014), moreover, the effort to acquire these competences is not limited to the execution of a task: with adequate assessment and self-assessment tools, it helps students to realise that these skills are useful outside of school and in daily life too. Hence, these competences have a lot to do with soft skills (see Table 1). With an interesting ethical perspective, Ciappei identifies the relationship between soft skills and virtue, in an Aristotelian sense: ‘Virtue is a habitual ability (habit) to do good well, while soft skill can be defined as a habitual ability to do good tout court’ (Ciappei and Cinque, 2014, p. 13). Transversal competences therefore connect the cognitive and emotional spheres with ethical and organisational skills, the spirit of initiative and communication skills. Investing in them means not losing sight of a more complete view of the person, so as not to be absorbed by the spiral of hyper-specialisation. Also, meta-skills need to be practised more than understood (La Marca and Gülbay, 2018).

Table 1. Relationships between competences and soft skills

Among these competences, the ‘personal, social and learning to learn’ competence is strategic not only at school but in any professional sphere. The definition in the Recommendation of the European Council reads:

Personal, social and learning to learn competence consists in the ability to reflect on oneself, to manage time and information effectively, to work with others in a constructive way, to remain resilient and to manage one's own learning and career. (European Council, 2018).

Table 1 shows the similarities between key competences and soft skills. This can become relevant to students, as it shows how their behaviour at school has a lot to do with their behaviour in daily and professional life.

Furthermore, since these behaviours are observable, it is important to monitor their development over time (Heckman and Kautz, 2016). For this purpose, adequate observation tools are needed, which make students aware that the effort made to learn also improves them as people, making them more reflective and more responsible for their learning process.

Evaluation for competences in the field of classical languages

This is particularly necessary for students of classical languages, who are usually assessed on the ability to translate an ancient text, from the source language to the target language, with morpho-syntactic and lexical correctness.

In 2007 the University of Oxford developed an interesting piece of software to move from traditional paper evaluation to electronic evaluation, so as to speed up the quantitative analysis of mistakes (Ashdowne, 2009; Salema, 2017). Sometimes, though, the risk is that students may not be aware that, while carrying out translation activities, they are also developing key and metacognitive skills. To avoid this, test correction and the related evaluation turn out to be an important moment for reflection on the mistakes made. Often, however, teachers complain that students just look at the grade obtained, without paying particular attention to why they made certain mistakes.

Purpose of the study and research question

The purposes of this study were:

  a) to demonstrate that the active involvement of teachers in the creation of metacognitive assessment tools facilitates the use of competence assessment at school;

  b) to verify to what extent the use of a metacognitive form, administered to students after the correction of the tests, makes students more aware that reflecting on the mistakes they made increases their key competences and metacognitive skills.

Therefore, this study's research questions were:

  1) to what extent does the active involvement of teachers in the creation of metacognitive assessment tools facilitate the use of competence assessment in school?

  2) how can students be made aware of the fact that reflecting on mistakes made in teaching activities increases their learning-to-learn competence?

Methodology

Study design

Given the aforementioned aims, the action-research method was chosen to carry out this study. Action-research allows researchers both to study a specific situation or problem and to try out the actions directed at solving it, in order to improve their practice (Elliott, 1991; Latorre, 2004; Lewin, 1946).

Thus, when the action-research method is used in an educational environment, the teacher can directly address the issues of his/her instructional practice and deal with them in the specific environment where he/she works (Baldacci, 2013). Moreover, this method was chosen in order to engage the teachers first-hand and thus start from their instructional practice to make an actual change in their environment.

The research took place between December 2020 and July 2021, and participants were selected on a voluntary basis among Italian teachers from the areas of Cuneo and Vicenza (both in northern Italy) and Ragusa (in southern Italy) who were participating in the Teachers' Training National Plan. Eventually, 12 teachers decided to participate in the study. Thus, the sample consisted of:

  • seven female and one male high school teachers

  • one female middle school teacher

  • three female primary school teachers (one of them was a teacher for students with special needs)

All the participants, after being informed about the research aims, the chosen methodology and about how data would be treated, gave their formal consent to participate, as established by the British Educational Research Association (2018).

Tools and materials

A metacognitive form, the result of the studies following the research documented by the author (Canfarotta, 2021, p. 84), was shared with participants.

The original form (Figure 1) consisted of five major columns:

  (1) Mistake transcription;

  (2) Mistake typology: here students had to choose whether the mistake they made was caused by (a) distraction, (b) non-identification or (c) ignorance;

  (3) Mistake nature: here students had to choose whether it was a (a) morphological, (b) syntactical or (c) lexical mistake;

  (4) Mistake correction;

  (5) Comments, doubts, questions for the teacher.

Figure 1. The metacognitive form.

To stimulate students' metacognitive reflection on their mistakes, the filling-in process was structured so that, after each test, they would write in the form: (a) the mistakes they made, (b) their typology, (c) their nature, and (d) the correct answer.

The last column of the table was used to write comments, doubts and questions addressed to the teacher. While doing so, students had the chance to verbalise the process that led them to make the mistake and, after seeing it, they were able to better understand why they went wrong. Also, students had to assign the value 1 to every mistake, writing the number in the corresponding typology and nature columns. This allowed the system to automatically create two pie charts where both the student and the teacher could see the distribution of mistakes by typology (chart 1) and by nature (chart 2).

Moreover, in the last sheet of the file that teachers and students were using, there was an overall form (Figure 2) whose pie charts automatically updated every time the student or the teacher changed some data. Thus, it was possible to have an up-to-date, overall picture of the students' performance. The tables therefore also served as a tool to collect quantitative data (cf. Figures 1 and 2).

Figure 2. The overall form.
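To make the aggregation mechanism concrete, the following is a minimal sketch of the logic behind the automatically generated pie charts: counting the value 1 entered for each mistake in every category and converting the counts into percentages. It is written in Python for illustration only; the study used a shared spreadsheet, and the column names and error categories shown here are assumptions based on the Latin version of the form.

```python
from collections import Counter

# A hypothetical record of mistakes as a student might enter them in the form:
# each mistake carries a typology (why it happened) and a nature (which area
# of the subject it belongs to). Categories follow the Latin version of the form.
mistakes = [
    {"typology": "distraction", "nature": "morphological"},
    {"typology": "non-identification", "nature": "syntactical"},
    {"typology": "ignorance", "nature": "lexical"},
    {"typology": "non-identification", "nature": "morphological"},
]

def distribution(records, key):
    """Percentage share of each category, i.e. the data behind one pie chart."""
    counts = Counter(r[key] for r in records)
    total = sum(counts.values())
    return {category: round(100 * n / total, 1) for category, n in counts.items()}

# Chart 1: distribution of mistakes by typology; chart 2: by nature.
print(distribution(mistakes, "typology"))
print(distribution(mistakes, "nature"))
```

In the spreadsheet version used during the project, the same counts were obtained simply by summing the 1s entered in the typology and nature columns, and the pie charts updated automatically.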

Students had to fill in a form for each test they took from February to May 2021.

The metacognitive table was first shared with participants through Google Drive. Every participant agreed on the structure, even though afterwards everyone adapted the table to his/her school subject and students. For instance, when used for Mathematics in a middle school program (when students are 11–13 years old), the options among which students had to choose when considering the mistake nature were: (a) numbers, (b) space and figures, and (c) correlations. Also, simpler language was used to describe the mistake typologies among which pupils had to choose, e.g.: ‘I got distracted’; ‘I didn't understand the text’; ‘I didn't know how to start/to which topic I had to relate’ (Figure 3).

Figure 3. Table for middle school students, Mathematics.

Similarly, classical language students in upper secondary school (when students are 14–18 years old) filled out the form after taking a translation test.

Figure 4 shows, in the first column, the error made by the student; in the second, the type of error; in the third, the nature of the error; in the fourth, the student's self-correction; and in the fifth, the student's reflection on the process that generated the error, e.g. ‘I mistook the dative for the first person singular of the verb’.

Figure 4. Table for high school students, Latin.

The student entered the number 1 for each error, specifying its type and nature. The sum of these numbers generated the charts below, so that he/she had immediate feedback on the causes of the errors and on the topics to study in greater depth.

During the Action-Research, qualitative and quantitative data were collected using: (a) online meetings, (b) a research journal, and (c) two semi-structured online questionnaires, one for students and one for teachers.

The research journal consisted of a shared Google Sheet that teachers and researchers could fill in at every step of the process. Every teacher had some dedicated rows where he/she could take notes about: (a) how he/she adapted the metacognitive table to his/her context; (b) the first results he/she observed; (c) the next steps to take; (d) the March and April phase; and (e) conclusions.

The semi-structured online questionnaire for students collected some socio-demographic data such as: gender, area of residence, and school level and then investigated: (a) their degree of appreciation for the metacognitive form; (b) whether the table had been useful and, if so, why; (c) if they had difficulties in filling in the form and, if so, which ones; and (d) if they would recommend the use of the form to their friends.

Besides gathering data about the number of students per class, their gender, the school level and their degree of appreciation for the table, the semi-structured online questionnaire for teachers collected data about: (a) how teachers encouraged their students in using the form; (b) the difficulties students encountered when filling in the form; (c) ideas regarding how to solve these difficulties; (d) the form's strengths; and (e) the form's flaws.

Both questionnaires were constructed and administered online using Google Forms. Qualitative data were analysed using thematic analysis (Green et al., 2007), whereas quantitative data (percentages and statistics) were extracted from the Google Form itself.

Results

The analysis of both quantitative and qualitative data showed that:

  a) Teachers

The teachers' questionnaire was filled in by nine participants out of 12; seven of them thought their students had a medium degree of appreciation for the table, whereas two of them stated that their pupils were highly engaged with it.

Data analysis also showed the strategies teachers used to encourage their students to use the form. All of them tried to show the benefits its use was expected to provide, namely:

  • the chance for students to reflect on their cognitive process and become more conscious of themselves;

  • the opportunity to learn from their mistakes;

  • the possibility to improve their study method.

Teachers were also asked which difficulties students encountered when filling in the form and how these difficulties could be overcome. The detected difficulties were:

  • Technical: some students struggled to get access to the online form and then had issues in sharing it with the teacher because they were not used to this kind of tool;

  • Linguistic: some students had problems in understanding the table because Italian was not their native language;

  • About recognition: some students were not able to recognise the mistake itself and thus its typology and nature. This meant they could not identify its seriousness either;

  • Self-critical: some students struggled to recognise which difficulties they had in their study method;

  • Organisational: pupils had difficulties in dealing with the new assessment system and new challenges arose when the teachers had to explain the table in a distance learning setting.

To overcome these issues teachers used basically two strategies:

  • At the beginning, they did the filling-in process together with the students, with many examples and close tutoring;

  • Then, they repeatedly assigned the table, in order to make it a recurring tool.

Finally, teachers' data analysis highlighted the strengths and weaknesses of the form.

Among the first, participants particularly pointed out:

  • The pie charts automatically created below the table, because they gave immediate, visual feedback;

  • The opportunity to identify strategies to overcome difficulties;

  • The chance for every student, both ‘good’ and ‘bad’ ones, to reflect on their learning process and monitor their performance;

  • The possibility to cooperate among peers;

  • The fact that the table can be used as an online tool, thus in a lockdown situation, too;

  • The motivation enhancement they saw in their students.

Teachers also had to express the weaknesses of the table. Most of all, they pointed out the struggle to implement a new evaluation process and tool: more time than expected was needed to explain the new method to students and to correct the tests.

Another weakness was related to the technical side: in some situations the teachers' digital skills were not sufficient and thus became an obstacle.

  b) Students

The students' questionnaire was filled in by 154 subjects and, when asked the degree of appreciation for the metacognitive form, 65% of students said it was medium, 24% high and 11% low. Also, 83% of the students would recommend the form to a friend.

Students were asked to point out whether the form had been useful to them and, if so, why. They reported that, by using it, they had become more conscious of the reasons why they made some mistakes and developed some strategies they could use to improve their study method. Moreover, they stated they had got better at self-evaluation and at understanding what they needed to review to limit their mistakes.

Data analysis also showed which difficulties students had encountered when using the form. Although most of them (115) stated they did not have any difficulties, the following issues emerged:

  • The difficulty to actually recognise the mistakes and understand their typology and nature;

  • Technical troubles (e.g. sharing the table with the teacher or actually getting access to the online form);

  • The difficulty of adapting to a new assessment method.

Lastly, a final evaluation of the action-research took place in July 2021, and showed that:

  • The Action-Research was an opportunity for discussion between colleagues on a delicate and never-resolved issue, that is to say reflecting on the nature of errors when translating;

  • It was an engaging and interesting experience, which gave the teachers ideas to work on in the future;

  • Many of the participants expressed their wish to be able to continue discussing metacognitive teaching.

Discussion

Overall, a reflection on the data of teachers and students showed that:

  1. The degree of interest in the use of the metacognitive form is medium-high: what the students say is in line with what the teachers highlighted, even though none of the teachers stated that his/her students had a low degree of interest in the table. In particular, it seemed very significant to us that 83% of students would recommend this form to their friends. Numerous benefits were actually found: there is a clear correspondence between the supposed benefits of the table and the responses the students gave when asked why the table had been useful. The students confirmed the usefulness of the form, because:

‘By personally correcting mistakes I can better understand what I need to study more’ (student, 15 years old).

‘I don't just focus on the grade but I learn from mistakes' (student, 14 years old).

‘It helps me to improve human qualities in my studies' (student, 14 years old).

‘It also helps me in everyday life’ (student, 15 years old).

‘It enhances my strengths and those I still have to work on’ (student, 15 years old).

An increase in key and metacognitive skills is therefore evident (reflection on error, attention to the learning process, greater competence in planning this process with more rigour, awareness of one's own strengths and growth, even beyond study). This highlights that the students recognise a certain usefulness in the use of this tool in ordinary teaching. Above all, they emphasise their growing awareness of the reasons for their mistakes.

  2. On the other hand, the teachers stressed that with this tool it was possible to personalise teaching more, because they were able to ‘meet’ each student by correcting their personal forms and thus better understand their learning process. In this way, the time spent in correcting the form was offset by the increase in students' motivation to study.

  3. Among the difficulties during the initial phase of using the form, the most fragile pupils highlighted some problems in identifying the nature and severity of the error, if not guided by the teacher. This highlights the fundamental guiding role that the teacher assumes in similar activities. In these cases, it is suggested to explain the types of errors to students first, in order to facilitate their identification when filling in the form.

  4. The results that emerged from the qualitative data of the action-research participants showed their clear awareness of the importance of planning educational online activities during the time of the pandemic (Colao et al., 2020; Harris and Jones, 2020; Huber and Helm, 2020; Wyse et al., 2020).

  5. Also, the chosen research methodology, based on the full involvement of the participants, fostered in them the acquisition of: a) scientific research skills because, through learning by doing, they learnt how the various phases of the action-research are carried out (Taysum, 2012); b) an attitude of reflection upon the data obtained by observing the behaviour of the students (Judge, 2021); c) a more conscious judgement about their knowledge gaps regarding some aspects of the learning process, related in particular to the self-assessment of students (Wong and Zhang, 2020).

  6. Teachers stressed that this type of intervention to encourage students to reflect on errors makes them better understand that the way in which the disciplinary contents are studied can make them better people too (Pellerey, 2016). One of the participants expressed this concept vividly by sending the researchers a comment made by an American high school headmaster in a letter to his teachers:

Dear professor, I am a survivor of a concentration camp. My eyes have seen things that no human being should ever see: gas chambers built by educated engineers; children killed with poison by well-trained doctors; infants killed by test-tube nurses; women and children killed and burned by high school and university graduates. Therefore, I distrust education. My request is: help your pupils to become human beings. Your efforts must never produce polite monsters, qualified psychopaths, educated Eichmanns. Reading, writing, arithmetic are of no importance unless they serve to make our children more human (Cojean, 1995).

  7. This shows that non-bureaucratic work on skills, animated by the desire to make the human qualities of students flourish through study, reliably gives teachers and students the opportunity to grow in awareness and self-regulation (Pring, 2016).

  8. What is more, the research motivated teachers to get involved in experimenting with metacognitive tools and in learning new ways of teaching that are useful and close to students.

  9. On the other hand, the online version of the sheet, with its immediately viewable pie charts, guaranteed students and teachers a quick view of the learning processes, so that they could start to remedy the gaps.

  10. Furthermore, through this way of carrying out the teaching function, knowledge is humanised: it is not just a matter of informing and making known the different epistemologies of the subjects but is a matter of raising questions about the person who is learning (Bergen, 2009; Mellinger, 2019; Pietrzak, 2018).

In conclusion, the challenges generated by COVID-19 have highlighted the need for the whole school staff to update its competences to create a distributed leadership (Harris, 2020): in fact, challenges and problems will be successfully faced only if the entire educating community is able to reflect ever more deeply on the mistakes made.

Action-research results showed an interesting change in teaching and learning: the reflection that pupils can make upon the mistakes made at school will be an initial preparation to face life's bigger problems. Never has it been clearer than in this case that it is the human dimension of effective teaching that makes the difference (Harris and Jones, 2020).

Limitations of the study

Due to some students' low level of digital literacy, two teachers reported an initial difficulty when students had to fill in and send the metacognitive form through Google Drive.

Another difficulty was the lower involvement of foreign pupils (three Chinese pupils with an A1 level of Italian and one Moroccan pupil with very low motivation).

Also, a pupil with disabilities did not fill in the form, although he was interested in the activity. Unfortunately, due to the progress of the pandemic, the continuous alternation of distance and face-to-face learning made the activity more difficult for teachers and students to follow.

Conclusions

The purposes of this study were: (a) to verify to what extent the use of a metacognitive form, administered after the tests' correction, makes students more aware that reflecting on errors made in teaching activities increases their key and metacognitive skills; (b) to demonstrate that the active involvement of teachers in the creation of metacognitive assessment tools facilitates the use of competence assessment at school.

The results showed that the students of primary and secondary school appreciated the use of the metacognitive form. In particular, the qualitative analysis shows that: a group of pupils, in addition to identifying the nature of their errors, understood more clearly what they had to review to limit the errors and how to act to correct them; another group pointed out that their awareness of their strengths had grown; finally, a last group highlighted how the skills used while filling in the form were also useful in other areas of daily life, not only at school.

A high degree of appreciation for the online version of the form emerged from the involved teachers: in fact, the pie charts with percentages about the typology and nature of the mistakes provided students with immediate feedback.

Among the limitations of the research, we point out: the students' initial difficulty in filling out the form, because they had not yet learnt to recognise the nature of their mistakes; and the greater amount of time teachers had to dedicate to correcting the pupils' sheets. On the other hand, both students and teachers pointed out that this method allows a more personal relationship between teacher and learner and therefore becomes an element of motivation to study.

In conclusion, the use of the metacognitive form helps students to reflect on errors more easily and immediately. In this way pupils can become more aware of the fact that they can grow in key and metacognitive competences.

Acknowledgements

We thank the School Managers, the teachers and the students of the schools involved for the time and commitment dedicated to this study. Daniela Canfarotta, PhD in Theory & Practice of Education for Teacher Training, is a Latin and Greek teacher in a higher secondary school in Bagheria (Palermo, Sicily, Italy). Her research focuses on the development of key competences and metacognition through the study of Latin and Greek, exploring how different didactic approaches may enhance them. She undertook study periods abroad (University of Burgos, Spain, and University of Leicester, UK). Her current research themes are: metacognition; Latin and Greek languages; key competences; life skills; didactics. She is the author of several scientific papers. Carla Lojacono holds a PhD in teacher training with a thesis on the Flipped Classroom in Higher Education. She currently works as an educator with high school students and teachers, helping them develop soft skills and reflective skills. She is also a trainer in schools within civic education programs.

References

Amin, E (2019) Using awareness raising in syntactic and semantic errors to foster translation performance among Majmaah University EFL students. Arab World English Journal 10, 196–212.
Ashdowne, R (2009) Accidence and acronyms: deploying electronic assessment in support of classical language teaching in a university context. Arts and Humanities in Higher Education 8, 201–216.
Badii, I and Lorenzo, M (2018) Weaving ethics with experimental sciences: a didactic proposal for teacher training with the TV series Breaking Bad. Didáctica De Las Ciencias Experimentales Y Sociales 34, 105–121.
Baldacci, M (2013) Questioni di rigore nella ricerca-azione educativa. Journal of Educational, Cultural and Psychological Studies (ECPS Journal) 3, 97–106.
Bergen, D (2009) The role of metacognition and cognitive conflict in the development of translation competence. Across Languages and Cultures 10, 231–250.
British Educational Research Association (2018) Ethical Guidelines for Educational Research, 4th Edn. London: BERA.
Canfarotta, D (2021) Crescere studiando Latino e Greco. Esperienze dall’Italia e dall’estero. Roma: Armando Editore.
Ciappei, C and Cinque, M (2014) Soft skills per il governo dell'agire. La saggezza e le competenze prassico-pragmatiche. Milano: Franco Angeli.
Cinque, M (2017) Soft skills e lavoro: come sviluppare le competenze trasversali? Rivista di Scienze dell'Educazione 60, 197–211.
Cojean, A (1995) Les mémoires de la Shoah. Le Monde, 29 April 1995.
Colao, A, Piscitelli, P, Pulimeno, M, Colazzo, S, Miani, A and Giannini, S (2020) Rethinking the role of the school after COVID-19. Correspondence 5, 370–371.
Díaz-Barriga, Á (2011) Competencias en educación: Corrientes de pensamiento e implicaciones para el currículo y el trabajo en el aula. Revista Iberoamericana de Educación Superior 5, 3–24.
Elliott, J (1991) Action Research for Educational Change. Bristol, PA: Open University Press.
European Council (2018) Recommendation on Key Competences for Lifelong Learning. Brussels: Official Journal of the European Union.
Green, J, Willis, K, Hughes, E, Small, R, Welch, N, Gibbs, L and Daly, J (2007) Generating best evidence from qualitative research: the role of data analysis. Australian and New Zealand Journal of Public Health 31, 545–550.
Harris, A (2020) COVID-19 – school leadership in crisis? Journal of Professional Capital and Community 5, 321–326.
Harris, A and Jones, M (2020) COVID 19 – school leadership in disruptive times. School Leadership & Management 40, 243–247.
Heckman, JJ and Kautz, T (2016) Formazione e valutazione del capitale umano. Bologna: Edizioni Il Mulino.
Huber, S and Helm, C (2020) COVID-19 and schooling: evaluation, assessment and accountability in times of crises – reacting quickly to explore key issues for policy, practice and research with the school barometer. Educational Assessment, Evaluation and Accountability 32, 237–270.
Huertas-Bustos, A, Lopez-Vargas, O and Sanabria-Rodriguez, L (2018) Effect of a metacognitive scaffolding on information web search. Electronic Journal of E-Learning 16, 91–106.
Hunt, S (2020) Editorial. Journal of Classics Teaching 21, 1–4.
Judge, M (2021) COVID 19, school closures and the uptake of a digital assessment for learning pilot project during Ireland's national lockdown. Irish Educational Studies 40, 419–429.
Karabenick, SA and Zusho, A (2015) Examining approaches to research on self-regulated learning: conceptual and methodological considerations. Metacognition and Learning 10, 151–163.
La Marca, A and Gülbay, E (2018) Didattica universitaria e sviluppo delle «soft skills». Lecce: Pensa Multimedia Editore.
Latorre, A (2004) La investigación acción: conocer y cambiar la práctica educativa, 2nd Edn. Barcelona: GRAO.
Lewin, K (1946) Action research and minority problems. Journal of Social Issues 2, 34–46.
Martinez-Borreguero, G, Naranjo-Correa, F and Maestre-Jimenez, J (2017) Relationship between teacher efficacy and emotions in teacher training in technology. In Chova, L, Martinez, A and Torres, I (eds), INTED2017: 11th International Technology, Education and Development Conference. Valencia: IATED, pp. 6743–6752.
Mellinger, C (2019) Metacognition and self-assessment in specialized translation education: task awareness and metacognitive bundling. Perspectives: Studies in Translation Theory and Practice 27, 604–621.
Nieto, N (2017) Active learning and metacognitive competences to achieve the transfer of learning in secondary education. Revista De Investigación Educativa De La Escuela De Graduados En Educación 7, 19–25.
Owolabi, J (2020) Virtualising the school during COVID-19 and beyond in Africa: infrastructure, pedagogy, resources, assessment, quality assurance, student support system, technology, culture and best practices. Advances in Medical Education and Practice 11, 755–759.
Pellerey, M (2016) Promuovere la capacità di governare se stessi nell'affrontare le sfide poste dallo studio e dal lavoro in una società complessa e altamente dinamica. Studi e Ricerche 10, 39–52.
Pietrzak, P (2018) The effects of students’ self-regulation on translation quality. Babel: Revue Internationale de la Traduction / International Journal of Translation 64, 819–839.
Postholm, M (2012) Teachers’ professional development: a theoretical review. Educational Research 54, 405–429.
Pring, R (2016) Preparing for citizenship: bring back John Dewey. Citizenship, Social and Economics Education 15, 6–14.
Rodríguez Gómez, G, Ibarra Saiz, MS and Cubero Ibáñez, J (2018) Competencias básicas relacionadas con la evaluación. Un estudio sobre la percepción de los estudiantes universitarios. Educación XX1 21, 181–208.
Salema, L (2017) Shaping language education: teaching Greek, evaluating learning. In Cravo, C and Marques, S (eds), Ensino das Línguas Clássicas: Reflexões e Experiências Didácticas, pp. 111–127. Available at https://doi.org/10.14195/978-989-26-1340-6_7.
Taras, M (2010) Assessment – summative and formative. Some theoretical reflections. British Journal of Educational Studies 53, 466–478.
Taysum, A (2012) Evidence Informed Leadership in Education. London: Continuum.
Verschaffel, L, Depaepe, F and Mevarech, Z (2019) Learning mathematics in metacognitively oriented ICT-based learning environments: a systematic review of the literature. Education Research International 1, 1–19.
Vidal Ledo, MJ, Salas Perea, RS, Fernández Oliva, B and García Meriño, AL (2015) Educación basada en competencias. Educación Médica Superior 30, 1–10. Available at http://www.ems.sld.cu/index.php/ems/article/view/801/335.
Wade-Jaimes, K, Demir, K and Qureshi, A (2018) Modeling strategies enhanced by metacognitive tools in high school physics to support student conceptual trajectories and understanding of electricity. Science Education 102, 711–743.
Wiberg, M, Lyren, P and Pantzare, A (2021) Schools, universities and large-scale assessment responses to COVID-19: the Swedish example. Education Sciences 11, 175–191.
Wong, L and Zhang, Y (2020) COVID-19 pivot: a reflection on assessments. Accounting Research Journal 34, 357–362.
Wyse, A, Stickney, E, Butz, D, Beckler, A and Close, C (2020) The potential impact of COVID-19 on student learning and how schools can respond. Educational Measurement: Issues and Practice 39, 60–64.
Ya-Hui, S, Li-Yia, F, Chao-Chin, Y and Tzu-Ling, C (2012) How teachers support university students’ lifelong learning development for sustainable futures: the student's perspective. Futures 44, 158–165.