European Journal of STEM Education
Research Article
2023, 8(1), Article No: 03

Digital Research Skills in Secondary Science Education: A Guiding Framework and University Teachers’ Perception

Published in Volume 8 Issue 1: 03 Mar 2023

Abstract

This study focuses on the perceived gap between the required and actual level of digital research skills (DRS) of students entering tertiary science education. By combining existing frameworks for research skills and digital literacy skills, a guiding framework of DRS was constructed. The DRS framework comprises seven categories and was evaluated in an exploratory qualitative study employing semi-structured interviews with university teachers (N = 15). We investigated which DRS university teachers consider important at the start of university science education and how they perceive first-year students’ level of these skills. The results show that the skills of writing a research paper using digital tools, using proper resources, and analysing, transforming, and visualising data were generally found to be wanting.

INTRODUCTION

In preparing students for academic education in STEM subjects, secondary education has the goal of providing a certain initial level of academic skills. Apart from the obvious STEM domain content, these skills include, among others, research skills (RS) and digital literacy skills (DLS). To give just one example, one of the most important research skills for STEM students is constructing a graph from a dataset. A recent study in science education shows that pre-university physics students are generally able to construct suitable graphs on paper, with 67% of these graphs meeting all scientific conventions (Pols et al., 2021). However, when students are confronted with constructing similar graphs by digital means, the results are insufficient (Sadikin et al., 2021).

First-year university students are confronted with an environment that often draws upon RS and DLS, and there have been consistent reports that they lack the necessary skills. To mention a few examples, Julien and Barker (2009) observed that students are unaware of how search engines identify potentially relevant sources. Hyytinen et al. (2017) found that many beginning students in higher education still struggled with how to use their sources. Wollscheid et al.’s (2021) data show a lack of adequate scientific writing skills among starting students. First-year university students have also been reported to have difficulties adapting, interpreting and evaluating outcomes during data processing and working with statistics (Oakleaf and Owen, 2010). Akuegwu and Uche (2019) found that the reading, presentation, communication and information-gathering skills of freshman STEM students were adequate, whereas data analysis in particular was found wanting.

Other research confirms that skills such as information-seeking and the use of ICT are similarly problematic among students in secondary education (Julien and Barker, 2009; Meelissen et al., 2014; Smith et al., 2013; Walraven et al., 2008). In the past decades, several authors have advocated the integration of DLS in undergraduate and secondary education (Pratolo and Solikhati, 2021; Udeogalanya, 2022; Voogt et al., 2013).

In many countries, the learning objectives for secondary education are prescribed in terms of subject aims, core knowledge and skill areas. However, even when secondary education syllabi do give some guidelines on the development of RS and DLS, concrete demands concerning the level of these skills are often lacking (Maddens et al., 2021; Thijs et al., 2014; Voogt et al., 2013). These levels are therefore not easily measured or assessed. As a consequence, several university science programs offer compulsory modules teaching first-year students RS and DLS (Curtis et al., 2017). Recent studies have found that students’ digital literacy positively affects learning outcomes in science subjects (Akhyar et al., 2021; Latip et al., 2022).

Despite the importance of DLS and RS for academic success (Briggs et al., 2012; Hurwitz and Schmitt, 2020; Oostdam et al., 2007), only a few scientific studies have addressed the development of these skills in secondary education and the way this affects the ensuing transition to science undergraduate education. Most studies in this field have focused on students’ characteristics (Maddens et al., 2021; Smith and White, 2015), general transition factors (Leese, 2010), teaching and testing approaches (Mumba et al., 2002) or a mismatch of STEM syllabi (De Meester et al., 2020).

In this study we synthesized a framework for the combination of DLS and RS, which we label Digital Research Skills (DRS). The main aim of this study is to evaluate how proficient science university teachers judge their starting students to be in DRS. The following research question will be addressed: For which digital research skills (DRS) do university teachers perceive a gap between the final level of secondary education and the required entry level for science studies?

The two sub-questions are:

  1. Which DRS do university teachers consider important for starting science students?

  2. What are university teachers’ perceptions about starting science students’ level of DRS?

In the next section, we describe a framework for DRS based on existing RS and DLS frameworks. To answer the sub-questions, we then use this DRS framework as a basis for interviewing university teachers on their perceptions about the level of DRS among first-year science students.

THEORETICAL FRAMEWORK

In this section we aim to combine existing frameworks for RS and DLS into a framework that covers DRS.

Existing Frameworks for RS

Fischer et al. (2014: 29) defined research skills (RS) as a set of “skills and abilities to understand how scientific knowledge is generated in different scientific disciplines, to evaluate the validity of science-related claims, to assess the relevance of new scientific concepts, methods, and findings, and to generate new knowledge using these concepts and methods”. Significant consensus on RS is found in the literature, resulting in the introduction of several similar frameworks. For example, for upper secondary education, Fischer et al. (2014) and Opitz et al. (2017) deconstruct research skills into eight scientific activities in which students have to be able to engage, from problem identification to drawing conclusions and communicating. Similarly, Stokking et al. (2004) mapped secondary school examination requirements to RS in 10 steps. The inquiry-based learning framework developed by Pedaste et al. (2015) is another example of such an approach. Furthermore, different versions of the Research Skills Development (RSD) framework (Willison and O’Regan, 2007) have been developed for students in higher education (van Laar et al., 2017) and undergraduate students (Willison and Buisman-Pijlman, 2016). The RSD framework (not to be confused with our acronym DRS for digital research skills) conceptualises research skills in terms of levels of autonomy and identifies six research skills.

In the present study, the latter framework, presented by Willison (2018), is adopted because: (1) it covers most of what is mentioned in the other frameworks for RS; (2) it is comprehensive; (3) it is quite recent; and (4) it is suitable for the target group.

A comparison of this RSD framework with the other known RS frameworks is given in Table 1.

 

Table 1. Research skills, steps or stages described in literature compared with the RSD framework of Willison

Research skill

(Willison, 2018; Willison and Buisman-Pijlman, 2016; Willison and O’Regan, 2007)

General description

(Willison, 2018, p. 2; Willison and Buisman-Pijlman, 2016, p. 66; Willison and O’Regan, 2007, pp. 402–403)

Similar skill(s) in research stages or steps described in the literature

Embark and clarify

Students respond to or initiate direction, clarify and consider ethical, cultural, social and team (ECST) issues.

1. Recognise the problem. 2. Identify and define the problem. 3. Formulate a problem hypothesis, deduce consequences and define basic terms and variables. (Allison et al., 2016, pp. 17–18)

1. Problem identification, 2. Questioning, 3. Hypothesis generation. (Fischer et al., 2014, pp. 33–35; Maddens et al., 2021, p. 494; Opitz et al., 2017, p. 79)

1. Identify and formulate a problem using subject-specific concepts. 2. Formulate the research question(s), hypotheses and expectations (if any). (Stokking et al., 2004, p. 99)

1. Questioning. 2. Hypothesis generation. (Pedaste et al., 2015, p. 54)

Find and generate

Students find information and generate data/ideas using appropriate methodology.

6. Conduct experiment. (Allison et al., 2016, p. 18)

5. Evidence generation. (Fischer et al., 2014, pp. 33–35; Maddens et al., 2021, p. 494; Opitz et al., 2017, p. 79)

4. Gather and select information/data. 5. Assess the value and utility of the data. (Stokking et al., 2004, p. 99)

4. Experimentation. (Pedaste et al., 2015, p. 54)

Evaluate and reflect

Students determine the credibility of sources, information, data and ideas, and make their own research processes visible.

4. Select experimental variables. (Allison et al., 2016, p. 18)

4. Construction and redesign of artefacts. 6. Evidence evaluation. (Fischer et al., 2014, pp. 33–35; Maddens et al., 2021, p. 494; Opitz et al., 2017, p. 79)

8. Evaluate the research. (Stokking et al., 2004, p. 99)

7. Reflection. (Pedaste et al., 2015, p. 54)

Organise and manage

Students organise information and data to reveal patterns/themes, managing teams and processes.

5. Construct experimental plan. 7. Reduce raw data to allow examination of effect thought to exist. (Allison et al., 2016, p. 18)

3. Make and monitor the research plan: research design and time schedule. (Stokking et al., 2004, p. 99)

3. Exploration. (Pedaste et al., 2015, p. 54)

Analyse and synthesise

Students analyse information/data critically and synthesise new knowledge to produce coherent individual/team understandings.

8. Test for significance. (Allison et al., 2016, p. 18)

7. Drawing conclusions. (Fischer et al., 2014, pp. 33–35; Maddens et al., 2021, p. 494; Opitz et al., 2017, p. 79)

6. Analyse the data. 7. Draw conclusions. (Stokking et al., 2004, p. 99)

5. Data interpretation. (Pedaste et al., 2015, p. 54)

Communicate and apply

Students apply their understanding and discuss, listen, write, perform, respond to feedback and present processes, knowledge and implications of research.

8. Communicating and scrutinising. (Fischer et al., 2014, pp. 33–35; Maddens et al., 2021, p. 494; Opitz et al., 2017, p. 79)

9. Develop and substantiate a personal point of view. 10. Report (describe) and present (communicate) the research. (Stokking et al., 2004, p. 99)

6. Communication. (Pedaste et al., 2015, p. 54)

 

A Framework for DRS

In contrast to the consensus on RS, there appears to be no consensus about what exactly the concept of digital literacy in education entails (Pangrazio et al., 2020). The different definitions of digital literacy (scientific and reference frameworks) mainly address skills and knowledge (Voogt et al., 2019). In order to work towards a framework for both RS and DLS, we have studied publications on DLS from the past ten years that were related to or assessed in secondary education.

We considered six frameworks for digital literacy in the 21st century aimed specifically at students over 12: learning literacies for the digital age (LLiDA), with a broad audience, described by Beetham et al. (2009); the Digital Competence Assessment (DCA) by Calvani et al. (2008, 2010); the international computer and information literacy study (ICILS) by Fraillon et al. (2013, 2014, 2020); digital literacy (DL) by Eshet-Alkalai (2002, 2012; Eshet-Alkalai and Chajut, 2009); the European digital competence framework for citizens (DigComp) by Carretero et al. (2017), Ferrari (2013) and Vuorikari et al. (2016, 2019); and the technology & engineering literacy framework (TEL) by the National Assessment Governing Board (2018).

In Table 2, we have compared these frameworks for DLS with the RSD framework presented by Willison (2018). The third column in Table 2 gives the corresponding categories of our DRS framework. The DLS abilities to browse, search and filter information and to use advanced search methods for various purposes overlap with RSD skill 1, ‘embark and clarify’, in Table 1. We therefore defined the combination of these research and digital literacy skills as ‘browse, search and filter information’, the first digital research skills (DRS) category in the third column of Table 2.

DLS such as accessing and selecting information, the ability to construct knowledge and use digital resources are connected to research skill 2 ‘find and generate’ of the RSD framework. This combination forms the second DRS category: ‘gather, measure and collect digital content/data’ in Table 2.

The ability to evaluate information for timeliness and accuracy, and to develop strategies to check the credibility of sources, overlaps with skill 3, ‘evaluate and reflect’, of the RSD framework. In a similar way, category 4, ‘structure, manage and protect digital content/data’, and category 5, ‘analyse, transform and visualise content/data digitally’, found their way into Table 2.

 

Table 2. Matching of research skills and digital literacy skills to identify corresponding digital research skills (DRS)

Research skill (RSD)

Similar digital literacy skills in other frameworks

Corresponding digital research skill

1. Embark and clarify

Browse, search and filter data, information and digital content (DigComp)

Use advanced search techniques with digital and network tools and media resources (TEL)

Browse, search and filter information

2. Find and generate

Access information (ICILS)

The ability to construct knowledge by nonlinear navigation (DL)

Select digital and network tools and media resources (TEL)

Use digital tools and resources (TEL)

Gather, measure and collect digital content/data

3. Evaluate and reflect

Evaluate information (ICILS, DigComp)

The ability to consume information critically and sort out false and biased information (DL)

Search media and digital resources on a community or world issue and evaluate the timeliness and accuracy of the information (TEL)

Evaluate the credibility of the source (TEL)

Justify choices based on the tools’ efficiency and effectiveness for a given purpose (TEL)

Determine the accuracy and validity of sources/methods

4. Organise and manage

Manage information and digital content (ICILS, DigComp)

Safety, privacy and security (ICILS, DigComp, TEL)

Netiquette, copyright and licenses (DigComp)

Knowledge about many different ICT tools (TEL)

Responsible and ethical behaviour (TEL)

Structure, manage and protect digital content/data

5. Analyse and synthesise

Transform and create information (ICILS)

Integrate and re-elaborate digital content (DigComp)

Creatively use digital technology (DigComp)

The ability to process and evaluate large volumes of information in real time (DL)

Use digital tools to collect, analyse, and display data in order to design and conduct complicated investigations (TEL)

Conduct a simulation of a system using a digital model (TEL)

Analyse, transform and visualise content/data digitally

6. Communicate and apply

Develop digital content (DigComp)

Manipulate pre-existing digital texts and formats (DL)

The ability to create authentic, meaningful written and artwork (DL)

Explain rationale for the design and justify conclusions based on observed patterns in the data (TEL)

Write a research paper using digital tools

7. Communicate and apply

Share and interact with information through digital technologies (ICILS, DigComp)

Develop digital content (DigComp)

The ability to communicate effectively in online communication platforms (DL)

Share and present content/data

 

Skill 6, ‘communicate and apply’, of the original RSD framework is defined as “Discuss, listen, write, present and perform the processes, understandings and applications of the research…” (Willison and Buisman-Pijlman, 2016: 67). Research projects in secondary education, however, often entail writing a report as well as giving a presentation. For the purpose of this study, RSD skill 6 was therefore subdivided into two DRS categories: ‘Write a research paper using digital tools’ (category 6) and ‘Share and present content/data’ (category 7).

METHOD

In order to answer the sub-questions, we conducted an exploratory qualitative study to clarify which DRS are required at the start of higher science education in the Netherlands.

Context

To start academic STEM studies in the Netherlands, only a valid pre-university diploma with one or more compulsory subjects such as chemistry and physics is required. No further specification or prior knowledge in the field of RS and DLS is required to obtain this diploma. The required entrance level of these skills is also unspecified at many Dutch universities.

Every secondary education subject has a nationally-established syllabus which contains a description of learning goals. In the syllabi for physics and chemistry, DRS are mentioned three times in the domain of ‘skills’.

A8.1: Acquiring and selecting information from written, oral and audio-visual sources, partly with the help of ICT:

  • Extract data from graphs, tables, drawings, simulations, schemes and diagrams;

  • Look up quantities, units, symbols, formulas and data in suitable tables. (CvTE, 2021a: 13, 2021b: 9)

A8.2: Analysing, displaying and structuring information, data and measurement results in graphs, drawings, schemes, diagrams and tables, partly with the help of ICT. (CvTE, 2021a: 13, 2021b: 9)

A11.2: Using the computer to model and visualize phenomena and processes, and to process data. (CvTE, 2021a: 15)

Only the third specification, which applies only to pre-university physics, requires that the skills be performed entirely on the computer. The syllabus for mathematics states that pre-university students should be able to use ICT for, among other things, modelling, algebraizing and investigating geometric properties of objects. However, ICT here refers to a rudimentary graphing calculator (CvTE, 2021c: 6).

Participants

This qualitative study was based on semi-structured interviews with 15 academic science teachers (Table 3), selected on the basis of the following criteria: they teach first-year chemistry and/or physics students, and/or they supervise or teach these students during a research project. Out of a total of 30 candidates approached, 15 responded. Participants provided written informed consent, which was repeated orally at the start of the interview. The academics were men and women in different age categories (30-67 years) from eight comprehensive and two technical universities in the Netherlands.

 

Table 3. Pseudonymized name and function of interviewees (N = 15)

1. Ed: Coordinator and lecturer, Physics practical
2. Ethan: Full professor, program director and lecturer, Chemical Science and Engineering
3. Dean: Lecturer, Physics practical
4. Kevin: Coordinator and lecturer, Physics practical
5. Esther: Supervisor, Educational Chemistry
6. Emmy: Lecturer academic skills, Technical Natural Sciences
7. Alycia: Lecturer, Science Education and Communication
8. Noah: Lecturer, Molecular Life Sciences
9. Nadir: Lecturer academic skills, Pharmacy
10. Aimee: Lecturer, Introduction Scientific Research
11. Simon: Full professor, program director and lecturer, Life, Science and Technology
12. Rachid: Lecturer, Chemistry practical
13. Kyano: Full professor and lecturer, Mechanical Engineering
14. Nigel: Program director and associate professor, Engineering Physics
15. Michael: Coordinator and lecturer, Chemistry practical

 

Procedure

All interviews were conducted and recorded via an online platform. In order to determine the academics’ views on DRS, the interview consisted of six basic questions, the first four of which were open-ended.

  1. In which course or courses are you involved and what is your role in this?

  2. Can you name some DRS that you think first-year students should be able to apply while conducting research?

  3. To what extent are you satisfied with the degree to which students have mastered the DRS?

  4. If you were allowed to include one thing in secondary education in the field of DRS, what would you like to add to the program in pre-university education?

The interview ended with two closed-ended questions.

  5. How satisfied are you with the degree to which first-year students apply digital skills in applications such as Word, Excel, PowerPoint? Can you grade this from 1 to 10, where 1 = least and 10 = most?

  6. How important do you think it is that first-year students have mastered digital skills in applications? Can you grade this from 1 to 10, where 1 = least and 10 = most?

During the first part of the interview, follow-up questions were used, such as: Can you give an example? How should they be able to do this? Does this apply to most students? The participants were not introduced to our DRS framework before the interview, in order to leave the approach to the subject entirely to the interviewee. No questions were framed around specific DRS. Note that question 5 addresses no specific science applications, such as Python, Origin, or LaTeX; only the basic applications that all students in secondary education are likely to encounter have been included in the questions. All questions were presented in Dutch, and the quotes have been translated into English.

Data Analysis

All recordings were transcribed and pseudonymised. Potentially interesting quotes from the participants were selected and coded, using the DRS framework in Table 2 as an axial coding scheme. Similar self-contained quotes were grouped together per category of the framework. Each quote was also coded as reflecting either a positive or negative perception by the interviewee. We organized the data in a bar chart, indicating the number of quotes per category and subdividing the bars into positive and negative quotes. Duplicate coding was done on 50 of the 223 quotes; Cohen’s kappa was 0.88 for determining whether a quote was positive or negative and 0.81 for assigning quotes to the seven categories of the DRS framework, indicating good to near-perfect interrater reliability for both coding processes.

Quantitative data were obtained from the last two interview questions. The interviewees’ ratings of importance and of satisfaction with the degree to which their students have mastered digital skills in applications such as Word, Excel, and PowerPoint were averaged and organized in a diagram.
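To make these two computational steps concrete, the following minimal sketch shows how the interrater agreement and the rating summaries could be computed. This is our own illustration, assuming Python with scikit-learn, and not the authors’ actual analysis script; the coder labels and ratings below are invented, not the study’s data.

# Minimal sketch (not the authors' script) of the reliability check and
# rating summaries described above; all data are hypothetical.
from statistics import mean, stdev
from sklearn.metrics import cohen_kappa_score

# Hypothetical positive/negative codes assigned independently by two
# coders to the same duplicate-coded quotes
valence_1 = ["neg", "neg", "pos", "neg", "pos", "neg", "neg", "pos", "neg", "neg"]
valence_2 = ["neg", "neg", "pos", "neg", "neg", "neg", "neg", "pos", "neg", "neg"]
print("kappa (valence):", cohen_kappa_score(valence_1, valence_2))

# Hypothetical DRS category codes (1-7) from the same two coders
category_1 = [6, 5, 5, 2, 1, 6, 3, 7, 5, 6]
category_2 = [6, 5, 2, 2, 1, 6, 3, 7, 5, 6]
print("kappa (category):", cohen_kappa_score(category_1, category_2))

# Hypothetical 1-10 ratings from the two closed-ended questions
satisfaction = [8, 7, 6, 8, 9, 7, 7, 8, 6, 9, 7, 8, 7, 6, 8]
importance = [9, 8, 6, 8, 10, 7, 8, 9, 4, 9, 7, 8, 6, 7, 8]
print("satisfaction: mean", round(mean(satisfaction), 1), "SD", round(stdev(satisfaction), 1))
print("importance: mean", round(mean(importance), 1), "SD", round(stdev(importance), 1))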

RESULTS

This section outlines the main findings of the interviews.

Important Digital Research Skills

Following transcription and coding, 223 quotes with examples in the field of DRS were identified. The quotes were coded according to the framework in Table 2. Table 4 gives some examples for each category of the DRS framework.

The most frequently mentioned examples (76) belong to category 6: writing a research paper using digital tools; 60 quotes concern category 5: analysing, transforming and visualising content/data digitally; 41 quotes concern category 2: gathering, measuring and collecting digital content/data; 21 quotes concern category 1: browsing, searching and filtering information; and 14 quotes concern category 7: sharing and presenting content/data. Categories 3 (7 quotes) and 4 (4 quotes) were mentioned least. Eight quotes contain examples in more than one category.

Quotes with examples such as using search engines, assessing and selecting sources for reliability, and transforming data into a graph were given by 13 of the 15 interviewees. Examples associated with writing a research paper were mentioned by all interviewees and included details such as using calculation functions, choosing a functional type of chart with correct labelling of axes, displaying measurement points, and using a caption.

 

Table 4. Examples of skills related to proposed DRS categories mentioned by interviewees (N = 15)

DRS categories

Examples mentioned by interviewees*

1. Browse, search and filter information

Using search engines and databases (n = 4)

Filtering information (n = 4)

Using advanced search techniques by combining keywords with operators (n = 3)

2. Gather, measure and collect digital content/data

Maintaining a resource list and referring to sources (n = 8)

Selecting scientific literature (n = 5)

Knowledge and use of digital/online environments to (automatically) acquire data (n = 2)

Avoiding plagiarism using quoting and paraphrasing (n = 1)

3. Determine the accuracy and validity of sources/methods

Comparing and assessing digital sources/information (n = 2)

Digital visualisation of the search strategy (n = 1)

Evaluation of information/data for timeliness and accuracy (n = 1)

Checking the credibility of sources (n = 1)

4. Structure, manage and protect digital content/data

Drawing a research setup (n = 2)

Storing data in a safe and orderly way (n = 1)

Developing a research plan, using a structural procedure (n = 1)

5. Analyse, transform and visualise content/data digitally

Using application software to enter calculation functions (n = 11)

Plotting a graph using a functional type of chart with axis labels and/or a legend (n = 8)

Using error bars, a curve-fit and/or trendlines (n = 3)

Processing multiple datasets (n = 2)

Using a curve-fit or fitting a model to the data (n = 1)

Programming a script or model to process and analyse data (n = 1)

6. Write a research paper using digital tools

Inserting figures, graphs and tables using references and captions (n = 5)

Processing and formatting content using a digital application (n = 4)

Using styles, headings and subheadings (n = 2)

Using automatic numbering (n = 2)

7. Share and present content/data

Using application software or an online tool for displaying and formatting content (n = 2)

Collaborating and presenting in a digital/online environment (n = 1)

Using a slide master and/or a digital template (n = 1)

Reduction of text and graphic design of content (n = 1)

* We paraphrased these quotes for the sake of brevity.

 

Perceptions of Students’ Level of Digital Research Skills

In Figure 1, the total number of quotes on the students’ level of proficiency is displayed, divided into positive and negative quotes. A positive quote means that the interviewee was mostly satisfied with the extent to which students applied the corresponding skill. A negative quote means the interviewee was generally not satisfied.

 

Figure 1. Frequency of positive and negative interview quotes per category of our DRS framework (N = 15)

 

It is striking that most quotes in response to the third interview question (106 out of 125) were negative. The negative quotes mainly indicate a lack of skill in writing a research paper using digital tools; analysing, transforming and visualising content/data; and gathering, measuring and collecting digital content/data. Dean (Lecturer Physics practical) said:

They have very little familiarity with more advanced packages. Really, just the digital skills of making a report neat. How do I actually just draw a research setup? What am I going to use? What exactly can I measure?

The positive quotes mainly related to skills such as sharing and presenting content/data and also writing a research paper using digital tools.

We now give examples of quotes as presented in Figure 1, starting with the largest group. The numbers in parentheses denote (number of quotes/number of interviewees mentioning them).

Category 6: Write a research paper using digital tools (42 negative quotes; 5 positive quotes)

The majority of the negative quotes in this category are related to big differences in skill level between the students (11/8). Kevin (Coordinator and Lecturer Physics practical) said:

There are students who come from a pre-university school with a lot of practicals and research report experience, but there are also students who have actually done zero practicals, because the teachers think it is a waste of time, so that is just very diverse what comes in.

Other quotes refer to a lack of skill in using captions and references (10/8), constructing a line of reasoning in a paper (10/6), and formatting text (8/5). Michael (Coordinator and lecturer Chemistry practical) said:

I see, which is a very common mistake, that students just put pictures in there. So you see the results section and then you see two images without a proper caption. And then, if you’re lucky, there will be a few more lines of text underneath and that’s it. That is really something I see very often.

The positive quotes (5/3) related to general skills in writing a paper.

Category 5: Analyse, transform and visualise content/data digitally (29 negative quotes; 4 positive quotes).

In addition to missing explanations in captions (14/8), a significant number of the negative quotes in this category relate to students’ lack of digital skills to process data (8/5). Michael (Coordinator and lecturer Chemistry practical) said:

30 to 40 percent of the students can handle it just fine and others have never done it. So if you look in their lab journal on the first few experiments, they have a hand-drawn graph, while you really said, well, try doing it in Excel, but then they say yes, they have no idea how it should be.

Other quotes refer to adding a correct axis label (8/8) and to adding a trendline instead of connecting the measured points (7/7). The positive quotes were about visualizing content/data in general (4/3).
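To make these graphing conventions concrete, the sketch below (our own illustration with invented measurements, not material from the interviews) produces the kind of digital graph the interviewees describe, assuming Python with numpy and matplotlib: labelled axes with units, visible measurement points with error bars, and a fitted trendline rather than lines connecting the points.

# Illustrative sketch of a digitally produced graph that follows the
# conventions the interviewees mention; the data are invented.
import numpy as np
import matplotlib.pyplot as plt

voltage = np.array([0.5, 1.0, 1.5, 2.0, 2.5])       # hypothetical measurements
current = np.array([0.11, 0.19, 0.32, 0.41, 0.52])
uncertainty = np.full_like(current, 0.02)           # estimated error per point

# Fit a linear trendline instead of connecting the measured points
slope, intercept = np.polyfit(voltage, current, 1)
fit_x = np.linspace(voltage.min(), voltage.max(), 100)

plt.errorbar(voltage, current, yerr=uncertainty, fmt="o", capsize=3, label="measurements")
plt.plot(fit_x, slope * fit_x + intercept, label=f"fit: I = {slope:.2f} U + {intercept:.2f}")
plt.xlabel("Voltage U (V)")                         # axis labels with units
plt.ylabel("Current I (A)")
plt.legend()
plt.savefig("iv_graph.png")                         # a report would add a caption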

Category 2: Gather, measure and collect digital content/data (16 negative quotes; 3 positive quotes)

Three interviewees noticed large differences in the level of skill with which students select scientific literature. They were surprised that most first-year students are not yet able to do this, so that they have to be trained in it. Aimee (Lecturer Introduction Scientific Research) said:

What I often notice is that they have trouble citing sources properly, because very often they cite an article they have found in PubMed as a link to PubMed. I keep encountering that, even in the third year. The difference between a journal and a database is not yet completely clear to them.

Other quotes mentioned that students use too few references (6/5), that the quality of the references used is insufficient (8/4), or that they use links that cannot be evaluated (2/2).

Category 1: Browse, search and filter information (15 negative quotes; 2 positive quotes)

The majority of the negative quotes in this category relate to first-year students’ insufficient ability to search for information on the internet (11/6). Nigel (Program director and associate professor of Engineering Physics) said:

The average student taking final exams is not very good at looking for very specific information. If you’re talking about: “Look for an article that is about topic x or about application y”, then that is quite disappointing.

Not every teacher expected students to be familiar with these skills.

Category 3: Determine the accuracy and validity of sources/methods (11 negative quotes; 0 positive quotes) and Category 4: Structure, manage and protect digital content/data (2 negative quotes; 0 positive quotes)

Only four interviewees gave quotes related to students’ skills at determining the accuracy and validity of sources/methods, category 3 of the DRS framework. The majority of the negative quotes in this category relate to students’ insufficient ability to assess sources for reliability (9/4) and to the view that students should take more time over this step. Dean (Lecturer Physics practical) said:

What they are just not used to is to properly review sources. So the moment you don’t train them and before we trained them, they came up with a lot of Wikipedia, that you think: yes guys, gosh. I mean, you’ve already written a research paper in secondary education.

There were not many quotes about students’ ability to determine the accuracy and validity of sources/methods, category 3, and to structure, manage, and protect content/data, category 4. Moreover, all were negative.

Category 7: Share and present content/data (3 negative quotes; 5 positive quotes)

As can be seen in Figure 1, students’ ability to share and present content/data was the only skill with more positive than negative quotes. The positive quotes concerned presentation skills, such as oral proficiency and designing a poster or slide show (5/4).

Ratings of Importance and Satisfaction

Figure 2 gives the distribution of the ratings from the two closed-ended interview questions. The interviewees rated the students’ ability to apply digital skills in software applications such as Word, Excel and PowerPoint with an average grade of 7.4 (SD = 1.1), indicating that they were reasonably satisfied with the students’ ability. When asked about the importance of these skills, the interviewees gave an average rating of 7.6 (SD = 1.8).

 

Figure 2. Frequency distribution of the ratings of importance of students’ ability and satisfaction with their ability to apply digital skills in software applications (N = 15). 1 = least satisfied/important and 10 = most satisfied/important

 

Suggestions Offered

Each of the 15 interviewees was also asked what they would like to add to the secondary education curriculum in the field of DRS. In answering this question, six interviewees did not mention a research skill, but a digital skill: being able to work with a word processor, a program to process data, or a search engine.

Six of the 15 interviewees would want students in pre-university education to be able to process data better, which is related to category 5 of our DRS framework. Three interviewees explicitly mentioned working with Excel, and three wanted students to be able to program. Nigel (Program director and associate professor of Engineering Physics) said:

I actually don’t understand why they use Coach in high school and not just a very simple programming language with Python scripts. If you can do Python, then every company will say, “oh nice that you can program in Python”.
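As an indication of what such ‘very simple Python scripts’ might look like in a school practical, consider the sketch below. It is our own hypothetical example, not taken from the interviews; the CSV file and its column names are invented.

# Minimal data-processing script a pre-university student could write;
# "measurements.csv" and its columns (time_s, temp_C) are hypothetical.
import csv
from statistics import mean

times, temps = [], []
with open("measurements.csv", newline="") as f:
    for row in csv.DictReader(f):
        times.append(float(row["time_s"]))
        temps.append(float(row["temp_C"]))

print(f"{len(temps)} samples over {times[-1] - times[0]:.1f} s")
print(f"mean temperature: {mean(temps):.2f} °C")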

Four of the 15 interviewees wanted students in pre-university education to learn to write a more coherent story, which is related to category 6 of our DRS framework. Four of the 15 interviewees also stated that they want students to be able to assess the quality of sources, which is related to category 3 of the DRS framework.

In addition, two interviewees mentioned points that were partly related to DRS: they wanted students to be able to combine knowledge and skills from several subjects, so that these are connected when formulating sentences in a report.

CONCLUSION

By structuring and comparing both RS and DLS in pre-university education, we formulated seven categories in a framework for DRS: (1) Browse, search and filter information; (2) Gather, measure and collect digital content/data; (3) Determine the accuracy and validity of sources/methods; (4) Structure, manage and protect digital content/data; (5) Analyse, transform and visualise content/data digitally; (6) Write a research paper using digital tools and (7) Share and present content/data.

Our research question was: For which digital research skills (DRS) do university teachers perceive a gap between the final level of secondary education and the required entry level for science studies?

Starting with the sub-questions:

SQ 1. Which DRS do university teachers consider important for starting science students?

The 15 interviewed academic science teachers and professors generally agreed on the importance of writing skills using digital tools. These include the ability to format text, using styles, headings, subscripts and superscripts, and formula editors; the ability to insert figures and graphs with captions, references and explanatory notes in the text; and the ability to construct a paper with a logical structure. All of them also valued skills with charts, axis labels, error bars and trendlines, and the ability to program a script to analyse data. Most of them mentioned the use of search engines, maintaining a source list, and assessing and selecting sources for reliability and credibility.

SQ 2. What are university teachers’ perceptions about the level of DRS of starting science students?

The university teachers’ perceptions about students’ DRS proficiency were mostly negative, with the exception of skills in elementary software applications such as Word and PowerPoint. The primary problem concerned the writing of a research paper using digital tools: students lack the ability to structure a paper and lack skills in referencing, use of captions and formatting. Although students are able to draw adequate graphs on paper (Pols et al., 2021), displaying graphs digitally remains difficult. In line with Sadikin et al.’s (2021) findings, many students were found to have insufficient experience with spreadsheets, for example in processing data, labelling axes and drawing trendlines. Consistent with the results of earlier studies (Hyytinen et al., 2017; Rakedzon and Baram-Tsabari, 2017; Salisbury and Karasmanis, 2011), the interviewees noted large differences between individual students in this respect. Furthermore, students are reported to have difficulties browsing, searching and filtering information, consistent with the observations of Walraven et al. (2008) and Julien and Barker (2009). In contrast, the interviewees were mostly satisfied with students’ level of sharing and presenting content/data.

All but one interviewee experienced a gap from secondary to university science education and suggested adding DRS or both DRS and DLS to the secondary science education curriculum. Three interviewees even mentioned explicitly that students should be able to program in Python and draw graphs in Excel when starting science studies.

In answering the main research question, our results showed that there is indeed a gap in DRS between the final level of secondary education and the required entry level for science studies. The perceived gap is most pronounced in digitally analysing, transforming and visualising content/data and in writing a research paper using digital tools.

DISCUSSION

Limitations

The sample of interviewed university teachers was drawn exclusively from Dutch universities and thus solely reflects the situation in the Netherlands. However, the literature cited in the introduction shows that the issues addressed are encountered in other countries as well (Akuegwu and Uche, 2019; Care et al., 2018; Wollscheid et al., 2021). The sample size was relatively small, and there is a possible selection bias because participation in the interviews was voluntary. The interviewees were not informed about the DRS framework; the quotes would presumably have been of a different nature had the interviewees been informed about the framework beforehand. However, not mentioning the framework gave the interviewees more freedom in approaching the subject.

The two closed-ended questions during the interview (questions 5 and 6) were essentially about digital skills rather than DRS. Here, too, openness of the interview was preferred over completeness of the concepts. The fact that certain aspects of DRS were not mentioned spontaneously is in itself an indicator of the perceived importance of these aspects. Specific software used in science studies (e.g., modelling and probe software) was not mentioned in the questions; we focused on the more elementary software all students in secondary education are likely to encounter.

The university teachers were, however, consistent about the gap in DRS; further research with broader samples in different countries could confirm our result.

Our proposed framework for DRS was, moreover, based on the most-cited works in the field, and the existing frameworks showed considerable overlap; we therefore expect the resulting framework to be widely applicable.

Implications

Students appear to be especially lacking in skills of data processing and of writing a research paper. More attention to these skills in secondary science education could help to bridge this gap. This could be reflected in the learning goals in the syllabi of science subjects in secondary education, for example by rephrasing ‘partly with the help of ICT’ to ‘with the help of ICT’, or by adding the skills of drawing a trendline with the help of ICT and of digitally processing and formatting the different parts of a research paper.

Future research

Our study provides a step towards identifying national and international digital research skill levels to bridge the gap from pre-university to academic science education. An important next step will be to analyse to what extent pre-university students actually apply skills from the DRS framework in research assignments: the use of resources, citing and paraphrasing; the use of graphs, axis labels and captions; and the use of the formula editor, sub/superscripts, styles and headings. Our proposed framework could serve as a guiding instrument in this regard. In this way, concrete steps can be formulated for pre-university teachers and students to bridge the gap in DRS when making the transition to academic science study.

  • Akhyar, Y., Fitri, A., Zalisman, Z., Syarif, M. I., Niswah, N., Simbolon, P., Purnamasari S, A., Tryana, N., Abidin, Z. and Abidin, Z. (2021). Contribution of Digital Literacy to Students’ Science Learning Outcomes in Online Learning. International Journal of Elementary Education, 5(2), 284–290. https://doi.org/10.23887/ijee.v5i2.34423
  • Akuegwu, B. A. and Uche, K. (2019). Assessing graduate students’ acquisition of research skills in universities in Cross River State Nigeria for development of the total person. Educational Review: International Journal, 6(1), 27–42.
  • Allison, B., Hilton, A., O’Sullivan, T., Owen, A. and Rothwell, A. (2016). Research Skills for Students. Routledge. https://doi.org/10.4324/9781315041780
  • Beetham, H., McGill, L. and Littlejohn, A. (2009). Thriving in the 21st century: Learning literacies for the digital age (LLiDA project): Executive summary, conclusions and recommendations. UK Joint Information Systems Committees (JISC). Available at: http://www.jisc.ac.uk/media/documents/projects/llidaexecsumjune2009.pdf
  • Briggs, A. R. J., Clark, J. and Hall, I. (2012). Building bridges: Understanding student transition to university. Quality in Higher Education, 18(1), 3–21. https://doi.org/10.1080/13538322.2011.614468
  • Calvani, A., Cartelli, A., Fini, A. and Ranieri, M. (2008). Models and instruments for assessing digital competence at school. Journal of E-Learning and Knowledge Society, 4(3), 183–193. https://doi.org/10.1017/cbo9780511554445.007
  • Calvani, A., Fini, A. and Ranieri, M. (2010). Digital competence in K-12. Theoretical models, assessment tools and empirical research. Anàlisi, 40, 157–171. https://doi.org/10.7238/a.v0i40.1151
  • Care, E., Kim, H., Vista, A. and Anderson, K. (2018). Education system alignment for 21st century skills: Focus on assessment. Brookings: Center for universal education. Available at: https://www.brookings.edu/wp-content/uploads/2018/11/Education-system-alignment-for-21st-century-skills-012819.pdf
  • Carretero, S., Vuorikari, R. and Punie, Y. (2017). DigComp 2.1: The Digital Competence Framework for Citizens. With eight proficiency levels and examples of use. Joint Research Centre of the European Commission. Available at: http://publications.jrc.ec.europa.eu/repository/bitstream/JRC106281/web-digcomp2.1pdf_(online).pdf
  • Curtis, E., Wikaire, E., Jiang, Y., McMillan, L., Loto, R., Fonua, S., Herbert, R., Hori, M., Ko, T., Newport, R., Salter, D., Wiles, J., Airini and Reid, P. (2017). Open to critique: predictive effects of academic outcomes from a bridging/foundation programme on first-year degree-level study. Assessment and Evaluation in Higher Education, 42(1), 151–167. https://doi.org/10.1080/02602938.2015.1087463
  • CvTE. (2021a). Natuurkunde VWO Syllabus centraal examen 2021 [Physics VWO syllabus central exam 2021]. CvTE, College voor Toetsen en Examens [Board of Tests and Examinations].
  • CvTE. (2021b). Scheikunde VWO Syllabus centraal examen 2021 [Chemistry VWO syllabus central exam 2021]. CvTE, College voor Toetsen en Examens [Board of Tests and Examinations].
  • CvTE. (2021c). Wiskunde B VWO Syllabus centraal examen 2021 [Math VWO syllabus central exam 2021]. CvTE, College voor Toetsen en Examens [Board of Tests and Examinations].
  • De Meester, J., Boeve-De Pauw, J., Buyse, M.-P., Ceuppens, S., De Cock, M., De Loof, H., Goovaerts, L., Hellinckx, L., Knipprath, H., Struyf, A., Thibaut, L., Van de Velde, D., Van Petegem, P. and Dehaene, W. (2020). Bridging the gap between secondary and higher STEM education-The case of STEM@school. European Review, 28(S1), 135–157. https://doi.org/10.1017/S1062798720000964
  • Eshet-Alkalai, Y. (2002). Digital literacy: A new terminology framework and its application to the design of meaningful technology-based learning environments, in P. Barker and S. Rebelsky (eds), ED-MEDIA 2002 World Conference on Educational Multimedia (pp. 493–498). Association for the Advancement of Computing in Education.
  • Eshet-Alkalai, Y. (2012). Thinking in the digital era: A revised model for digital literacy. Issues in Informing Science and Information Technology, 9, 267–276. https://doi.org/10.28945/1621
  • Eshet-Alkalai, Y. and Chajut, E. (2009). Changes over time in digital literacy. Cyberpsychology and Behavior, 12(6), 713–715. https://doi.org/10.1089/cpb.2008.0264
  • Ferrari, A. (2013). Digital competence in practice: An analysis of frameworks. Joint Research Centre of the European Commission. https://doi.org/10.2791/82116
  • Fischer, F., Kollar, I., Ufer, S., Sodian, B. and Hussmann, H. (2014). Scientific reasoning and argumentation: Advancing an interdisciplinary research agenda in education. Frontline Learning Research, 2(3), 28–45. https://doi.org/10.14786/flr.v2i2.96
  • Fraillon, J., Ainley, J., Schulz, W., Friedman, T. and Duckworth, D. (2020). Preparing for Life in a Digital World: IEA international computer and information literacy study 2018 international report. Springer Nature. https://doi.org/10.1007/978-3-030-38781-5
  • Fraillon, J., Ainley, J., Schulz, W., Friedman, T. and Gebhardt, E. (2014). Preparing for Life in a Digital Age: The IEA international computer and information literacy study international report. Springer. https://doi.org/10.1007/978-3-319-14222-7
  • Fraillon, J., Schulz, W. and Ainley, J. (2013). International computer and information literacy study: assessment framework. International Association for the Evaluation of Educational Achievement. Available at: https://www.iea.nl/publications/assessment-framework/international-computer-and-information-literacy-study-2013
  • Hurwitz, L. B. and Schmitt, K. L. (2020). Can children benefit from early internet exposure? Short- and long-term links between internet use, digital skill, and academic performance. Computers and Education, 146, 103750. https://doi.org/10.1016/j.compedu.2019.103750
  • Hyytinen, H., Löfström, E. and Lindblom-Ylänne, S. (2017). Challenges in argumentation and paraphrasing among beginning students in educational sciences. Scandinavian Journal of Educational Research, 61(4), 411–429. https://doi.org/10.1080/00313831.2016.1147072
  • Julien, H. and Barker, S. (2009). How high-school students find and evaluate scientific information: A basis for information literacy skills development. Library and Information Science Research, 31(1), 12–17. https://doi.org/10.1016/j.lisr.2008.10.008
  • Latip, A., Hardinata, A. and Sutantri, N. (2022). The effect of digital literacy on student learning outcomes in chemistry learning. Jurnal Inovasi Pendidikan IPA, 8(2).
  • Leese, M. (2010). Bridging the gap: Supporting student transitions into higher education. Journal of Further and Higher Education, 34(2), 239–251. https://doi.org/10.1080/03098771003695494
  • Maddens, L., Depaepe, F., Janssen, R., Raes, A. and Elen, J. (2021). Research skills in upper secondary education and in first year of university. Educational Studies, 47(4), 491–507. https://doi.org/10.1080/03055698.2020.1715204
  • Meelissen, M., Punter, R. and Drent, M. (2014). Digitale geletterdheid van leerlingen in het tweede leerjaar van het voortgezet onderwijs [Digital literacy of students in the second year of secondary education]. University of Twente. Available at: http://archief.kennisnet.nl/onderzoek/alle-onderzoeken/digitale-geletterdheid-van-leerlingen-in-het-tweede-leerjaar-van-het-voortgezet-onderwijs-nederlandse-resultaten-van-icils-2013/
  • Mumba, F., Rollnick, M. and White, M. (2002). How wide is the gap between high school and first-year chemistry at University of the Witwatersrand? South African Journal of Higher Education, 16(3), 148–156. https://doi.org/10.4314/sajhe.v16i3.25227
  • National Assessment Governing Board. (2018). Technology & engineering literacy framework for the 2018 National Assessment of Educational Progress. US Department of Education.
  • Oakleaf, M. and Owen, P. L. (2010). Closing the 12 - 13 gap together: School and college librarians supporting 21st century learners. Teacher Librarian, 37(4), 52–58.
  • Oostdam, R., Peetsma, T. and Blok, H. (2007). Het nieuwe leren in basisonderwijs en voortgezet onderwijs nader beschouwd: een verkenningsnotitie voor het Ministerie van Onderwijs, Cultuur en Wetenschap [A closer look at the new learning in primary and secondary education: an exploratory memorandum for the Ministry of Education, Culture and Science]. The Kohnstamm Institute. Available at: https://www.researchgate.net/publication/267636159
  • Opitz, A., Heene, M. and Fischer, F. (2017). Measuring scientific reasoning–a review of test instruments. Educational Research and Evaluation, 23(3–4), 78–101. https://doi.org/10.1080/13803611.2017.1338586
  • Pangrazio, L., Godhe, A. L. and Ledesma, A. G. L. (2020). What is digital literacy? A comparative review of publications across three language contexts. E-learning and Digital Media, 17(6), 442-459. https://doi.org/10.1177/2042753020946291
  • Pedaste, M., Mäeots, M., Siiman, L. A., de Jong, T., van Riesen, S. A. N., Kamp, E. T., Manoli, C. C., Zacharia, Z. C. and Tsourlidaki, E. (2015). Phases of inquiry-based learning: Definitions and the inquiry cycle. Educational Research Review, 14, 47–61. https://doi.org/10.1016/j.edurev.2015.02.003
  • Pols, C. F. J., Dekkers, P. J. J. M. and de Vries, M. J. (2021) What do they know? Investigating students’ ability to analyse experimental data in secondary physics education. International Journal of Science Education, 43(2), 274-297. https://doi.org/10.1080/09500693.2020.1865588
  • Pratolo, B. W. and Solikhati, H. A. (2021). Investigating Teachers’ Attitude toward Digital Literacy in EFL Classroom. Journal of Education and Learning (EduLearn), 15(1), 97-103. https://doi.org/10.11591/edulearn.v15i1.15747
  • Rakedzon, T. and Baram-Tsabari, A. (2017). To make a long story short: A rubric for assessing graduate students’ academic and popular science writing skills. Assessing Writing, 32, 28–42. https://doi.org/10.1016/j.asw.2016.12.004
  • Sadikin, A. N., Mustaffa, A. A., Hasbullah, H., Zakaria, Z. Y., Hamid, M. K. A., Man, S. H. C., ... and Yusof, K. M. (2021). Qualitative Development of Students’ Digital Skills by Integrating a Spreadsheet Software in First Year Introduction to Engineering and Seminar Course. International Journal of Emerging Technologies in Learning, 16(18). https://doi.org/10.3991/ijet.v16i18.24325
  • Salisbury, F. and Karasmanis, S. (2011). Are they ready? Exploring student information literacy skills in the transition from secondary to tertiary education. Australian Academic and Research Libraries, 42(1), 43–58. https://doi.org/10.1080/00048623.2011.10722203
  • Smith, E. and White, P. (2015). What makes a successful undergraduate? the relationship between student characteristics, degree subject and academic success at university. British Educational Research Journal, 41(4), 686–708. https://doi.org/10.1002/berj.3158
  • Smith, J. K., Given, L. M., Julien, H., Ouellette, D. and DeLong, K. (2013). Information literacy proficiency: Assessing the gap in high school students’ readiness for undergraduate academic work. Library and Information Science Research, 35(2), 88–96. https://doi.org/10.1016/j.lisr.2012.12.001
  • Stokking, K., Van der Schaaf, M., Jaspers, J. and Erkens, G. (2004). Teachers’ assessment of students’ research skills. British Educational Research Journal, 30(1), 93–116. https://doi.org/10.1080/01411920310001629983
  • Thijs, A., Fisser, P. and Van der Hoeven, M. (2014). 21e eeuwse vaardigheden in het curriculum van het funderend onderwijs [21st century skills in the basic education curriculum]. SLO. Available at: https://www.slo.nl/publicaties/@4176/21e-eeuwse-0/
  • Udeogalanya, V. (2022). Aligning digital literacy and student academic success: Lessons learned from COVID-19 pandemic. International Journal of Higher Education Management, 8(2). https://doi.org/10.24052/IJHEM/V08N02/ART-4
  • van Deursen, A. and van Dijk, J. (2011). Internet skills and the digital divide. New Media and Society, 13(6), 893–911. https://doi.org/10.1177/1461444810386774
  • van Laar, E., van Deursen, A. J. A. M., van Dijk, J. A. G. M. and de Haan, J. (2017). The relation between 21st-century skills and digital skills: A systematic literature review. Computers in Human Behavior, 72, 577–588. https://doi.org/10.1016/j.chb.2017.03.010
  • Voogt, J., Erstad, O., Dede, C. and Mishra, P. (2013). Challenges to learning and schooling in the digital networked world of the 21st century. Journal of Computer Assisted Learning, 29(5), 403–413. https://doi.org/10.1111/jcal.12029
  • Voogt, J., Godaert, E., Aesaert, K. and Van Braak, J. (2019). Review digitale geletterdheid [Digital literacy reviewed]. Windesheim University of Applied Sciences/Ghent University.
  • Vuorikari, R. and Punie, Y. (2019). The use of reference frameworks to support digitally competent citizens–the case of DigComp. Digital Skills Insights.
  • Vuorikari, R., Punie, Y., Carretero, S. and Van Den Brande, L. (2016). DigComp 2.0: The digital competence framework for citizens. Joint Research Centre of the European Commission. https://doi.org/10.2791/11517
  • Walraven, A., Brand-gruwel, S. and Boshuizen, H. P. A. (2008). Information-problem solving: A review of problems students encounter and instructional solutions. Computers in Human Behavior, 24(3), 623–648. https://doi.org/10.1016/j.chb.2007.01.030
  • Willison, J. W. (2018). Research skill development spanning higher education: Critiques, curricula and connections. Journal of University Teaching and Learning Practice, 15(4). https://doi.org/10.53761/1.15.4.1
  • Willison, J. W. and Buisman-Pijlman, F. (2016). PhD prepared: research skill development across the undergraduate years. International Journal for Researcher Development, 7(1), 63–83. https://doi.org/10.1108/ijrd-07-2015-0018
  • Willison, J. W. and O’Regan, K. (2007). Commonly known, commonly not known, totally unknown: a framework for students becoming researchers. Higher Education Research and Development, 26(4), 393–409. https://doi.org/10.1080/07294360701658609
  • Wollscheid, S., Lødding, B. and Aamodt, P. O. (2021). Prepared for higher education? Staff and student perceptions of academic literacy dimensions across disciplines. Quality in Higher Education, 27(1), 20–39. https://doi.org/10.1080/13538322.2021.1830534