Dr. Ivelina Kotseva, Assist. Prof.,
Dr. Maya Gaydarova, Assoc. Prof.
Sofia University “St. Kliment Ohridski” (Bulgaria)
https://doi.org/10.53656/ped2025-9s.07
Abstract. This study employs an explanatory sequential mixed-methods design to examine differences in digital competencies among university lecturers at Sofia University. The first stage involves a quantitative analysis of survey data from 90 professors across various faculties and six professors from the Faculty of Physics, using the Chi-square test to compare empirical and theoretical distributions. The second stage consists of structured interviews with the six Faculty of Physics lecturers to explore reasons behind these differences. Results show that physics lecturers primarily develop digital competencies through self-study and peer collaboration rather than formal training. Statistically significant differences emerged in three areas: (1) computational thinking and professional development, (2) self-regulated and collaborative learning with digital technologies, and (3) ethical and responsible use of digital resources. Findings suggest a need for a systematic approach to digital competency development across faculties, with physics lecturers showing distinct engagement patterns.
Keywords: DigCompEdu, digital competences, mixed methods, physics teacher training
Introduction
Digital competencies are core professional skills for physics teachers. They are developed during both initial university training and through postgraduate courses and practical experience in schools. The integration of digital competencies during university training facilitates a smoother connection with other pedagogical skills essential for teachers’ full professional development.
According to a European Commission report (Ferrari 2012), digital competence encompasses a set of knowledge, skills, and attitudes – including abilities, strategies, values, and awareness – required for effectively using ICT (Information and Communication Technology) and digital media to: (1) perform tasks; (2) solve problems; (3) communicate; (4) manage information; (5) collaborate; (6) create and share content; and (7) build knowledge in a way that is effective, efficient, appropriate, critical, creative, autonomous, flexible, ethical, and reflective across various domains such as work, leisure, participation, learning, socializing, consuming, and empowerment.
Digital tools are also successfully used to assess learners’ achievements in reflective activities and self-assessment (Miliou et al. 2024). Self-assessment is a key competency for the 21st century (Greenstein 2012). Students should learn to apply self-assessment practically, a process that teachers should actively encourage within formative assessment practices (Hughes & Thompson 2022).
There are various frameworks for defining the digital competences of teachers, such as the National Educational Technology Standards (NETS-T) developed by the International Society for Technology in Education (ISTE)¹, the UNESCO ICT Competency Framework for Teachers (2018)², Claro et al. (2018), and DiKoLAN (Thoms et al. 2022). Efforts are underway at various levels to standardize and classify the digital skills of educators. For example, the UNESCO Competency Framework outlines six aspects: understanding the role of IT technologies (1) in learning, (2) in assessment, and (3) in pedagogical practice; (4) mastery of technological tools (hardware and software); (5) planning and organizing the learning process; and (6) engaging in professional development.
Selected for our research purposes, the DigCompEdu European Reference Framework (2017) defines general and universal digital competences relevant to various types of teachers, both in schools and universities (Punie & Redecker 2017). According to Redecker and Punie, being a digitally competent teacher means helping students develop their own digital competence. This framework outlines a model of digital competence with 22 competencies divided into six categories:
Area 1: Professional Engagement (1.1 Organizational Communication, 1.2 Professional Collaboration, 1.3 Reflective Practice, 1.4 Digital Continuous Professional Development).
Area 2: Digital Resources (2.1 Selecting Digital Resources, 2.2 Creating and Modifying Digital Content, 2.3 Managing, Protecting, and Sharing Digital Resources).
Area 3: Teaching and Learning (3.1 Teaching, 3.2 Guidance, 3.3 Collaborative Learning, 3.4 Self-Regulated Learning).
Area 4: Assessment (4.1 Assessment Strategies, 4.2 Analyzing Evidence, 4.3 Feedback and Planning).
Area 5: Empowering Learners (5.1 Accessibility and Inclusion, 5.2 Differentiation and Personalization, 5.3 Actively Engaging Learners).
Area 6: Facilitating Learners’ Digital Competence (6.1 Information and Media Literacy, 6.2 Digital Communication and Collaboration, 6.3 Digital Content Creation, 6.4 Responsible Use, 6.5 Digital Problem Solving).
Models of Teacher Educators’ Digital Competences
Pupils and students study computer science and information technology at school and university, and many of them have good digital literacy. This necessitates the continuous improvement of teachers’ digital skills to meet new requirements for creating an effective learning environment. University teachers must meet the challenges of teaching in the information age (Esteve-Mon et al. 2020). The use of digital technologies in teacher education should be embedded in the curricula and programs of bachelor’s degrees to link school practice and university knowledge and ensure adequate digital skills in the context of increasingly digitalized learning environments (Krumsvik 2014).
To effectively use digital technologies in education, it is essential for future teachers to be both theoretically and practically familiar with educational approaches and methods; however, this knowledge cannot be replaced by digital skills alone (Mishra & Koehler 2006). For effective teaching, Mishra and Koehler (2006) propose the Technological Pedagogical Content Knowledge (TPACK) model as a framework for conceptualizing the various knowledge requirements for digitally enabled teaching. The model builds on Shulman’s work (Shulman 1986) and includes three intersecting domains: Content Knowledge (CK) – knowledge about the subject being taught, Pedagogical Knowledge (PK) – knowledge about teaching and learning processes, and Technological Knowledge (TK) – knowledge about technologies, including digital tools. The intersections between these areas represent specific domains, such as Technological Content Knowledge (TCK), Technological Pedagogical Knowledge (TPK), and Pedagogical Content Knowledge (PCK). The intersection of all three areas, Technological Pedagogical Content Knowledge (TPACK), is a form of expert knowledge that combines content, pedagogy, and technology, forming the foundation for effective teaching. This approach suggests that teaching any subject requires content knowledge, pedagogical skills, and technological support, including digital tools. The model further proposes that teachers should be systematically trained to use technology and continually improve their knowledge of both hardware and software to enhance teaching effectiveness (Mishra & Koehler 2006).
Collis (Collis et al. 2001; Collis & Moonen 2002) proposed one of the first models predicting the use of information technology in education. The model, which is based on practical experience, describes expectations about the influence of digital technologies on changing teaching and learning. Information and communication technologies facilitate flexible learning in terms of location, types of communication, interactivity, time, approaches, and resources used.
The proposed 4-E model (Collis et al. 2001) outlines the factors that influence the likelihood of successfully integrating technology in an educational environment: the institutional environment, educational effectiveness, ease of use, and personal engagement. Kirschner and Davis (2003) propose a model with two axes for using digital technologies in teacher training. The axes represent core and complementary technologies, as well as approaches for teaching how to use and learn through them. Research has explored the varying extents to which digital technology is used in teacher education. Benchmarks for teacher training have been identified, including personal digital competences, digital competences as cognitive tools, pedagogical digital competences, digital competences as instructional tools, competences for social communication in the learning process (e.g., information exchange and collaborative knowledge construction), and competences for evaluating student achievements.
A tool for assessing the digital competencies of university lecturers is the DCS-UT (Digital Competence Scale for University Teachers). By equipping teachers with essential digital competencies, the scale supports the long-term adaptability and effectiveness of higher education institutions in an increasingly digital environment. The development of this rating scale included several stages: (a) creating an initial pool of items and a response scale, (b) assessing content and face validity, and (c) testing the factor structure and reliability of the new scale (Ličen & Prosen 2024). The psychometric properties of the scale were investigated by surveying 411 university lecturers from the University of Primorska (Ličen & Prosen 2024). The scale has four interrelated factors: digital literacy, digital skills, digital interaction, and technology integration. The study highlights the critical role of digital competencies in promoting sustainable education practices in universities.
Researchers are developing reliable tools to assess the digital skills of university teachers (Tondeur et al. 2017). Tondeur and colleagues defined a two-factor structure for ICT competencies: first, competencies to support pupils in using ICT in the classroom, and second, competencies to use ICT in instructional design.
In assessing digital readiness and competency among Sofia University professors, a version of the SELFIE tool adapted to the Bulgarian context was implemented. How can SELFIE be applied in higher education institutions? Although the main focus of the tool is on primary and secondary schools, the concept of self-assessment and digital self-improvement can be readily adapted to a university environment:
– Adapting to the digital needs of the academic context. All areas of the DigCompEdu framework (resource creation, assessment, development of students’ digital competencies, etc.) are equally applicable to teachers in higher education.
– Collective self-reflection and development. Group reflections by teaching teams, departments, or chairs can provide valuable insights for identifying common learning needs and developing joint strategies.
– Institutional policies and support. Anonymized and aggregated data can be used to plan training programs, institutional digitalization strategies, and the introduction of new technologies.
– Recognizing progress. Certificates and digital badges can be integrated into quality portfolios, professional development pathways, or even the design of teacher education programs.
Recognizing the success of SELFIE in the school sector, the European Commission initiated the development of DIGI-HE to address the specific needs of higher education institutions. DIGI-HE builds on the existing, tested SELFIE tool for schools and the DigCompOrg Framework, transferring the successful approach and lessons learned from the school sector to higher education while adapting to the unique characteristics of universities (Ehlers & Bonaudo 2020).
The development of DIGI-HE filled a significant gap in the higher education sector, as no European self-evaluation tool equivalent to SELFIE existed for universities. While some tools focused on digital learning and teaching or digital skills for individual university members, DIGI-HE represents the first comprehensive institutional assessment tool of its kind for higher education (Ehlers & Bonaudo 2020).
The Bulgarian adaptation of SELFIE used in this research was developed and tested as part of the SUMMIT project (contract no. BG-RRP-2.004-0008) to ensure its relevance to the Bulgarian educational environment. This adaptation was designed to meet specific educational needs and cultural nuances, enhancing the tool’s effectiveness in assessing and improving digital readiness among teachers at all levels of the Bulgarian educational system (Peytcheva-Forsyth & Yovkova 2024; Peytcheva-Forsyth & Racheva 2024).
The original evaluation scale, based on a progression model ranging from ‘A1 – I am aware…’ to ‘C2 – I contribute…,’ was preserved, with an additional ‘None of the above’ option, resulting in seven selectable responses for each statement. During the testing of the Bulgarian version, over 30 university instructors in ICT-based disciplines, involved in initial teacher training and ongoing qualification programs, provided valuable feedback on the wording of the items. The Bulgarian version retains the original questionnaire structure, with the same number of items per subscale (ranging from three to nine), totaling 32 items overall.
The main psychometric properties of the self-assessment questionnaire were preliminarily verified using two independent convenience samples composed of university professors (n = 96) and secondary education teachers (n = 281) (Mizova & Peytcheva-Forsyth 2024). High Cronbach’s alpha coefficients were observed: 0.975 for university professors and 0.896 for teachers, confirming the instrument’s consistency and applicability as outlined in the project documentation (Peytcheva-Forsyth & Racheva 2024). To further explore the internal consistency of the instrument, a correlation analysis using Pearson’s r was conducted in both samples to examine the intercorrelations among the six subscales. The positive correlations between the subscales indicate that increases in one subscale are associated with increases in the others. For all subscale pairs, the correlation was statistically significant (p < 0.05). The measured effect sizes in both validation samples (university professors and teachers) were large to very large, with correlation values ranging from 0.70 to 0.80, indicating the cohesiveness and consistency of the measured construct (Mizova & Peytcheva-Forsyth 2024).
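As a point of reference for how such reliability figures are computed, the short Python sketch below derives Cronbach’s alpha from a respondents-by-items score matrix. The scores and the helper name are invented for illustration; this is not the project’s analysis code.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1).sum()
    total_variance = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_variances / total_variance)

# Invented responses coded 1-7 (five respondents x four items).
scores = np.array([
    [5, 4, 5, 6],
    [3, 3, 4, 3],
    [6, 5, 6, 7],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
])
print(f"alpha = {cronbach_alpha(scores):.3f}")

# Subscale intercorrelations (Pearson's r) can be read off a correlation
# matrix of subscale totals, e.g. np.corrcoef(subscale_totals, rowvar=False).
```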
Research methods and tools
Our research follows an explanatory sequential mixed-methods design consisting of two stages. The first is a quantitative stage, in which we apply statistical hypothesis testing; the second is a qualitative stage, based on the analysis of structured interviews.
The methodology of our research is presented schematically in Figure 1. We follow the mixed-methods sequential explanatory design outlined by Ivankova et al. (2006). In a sequential model, quantitative data analysis in the first phase can reveal extreme or outlier cases. Follow-up qualitative interviews with these outlier cases can then provide insight into why they diverged from the broader quantitative sample (Creswell 2015).
In the first stage, during the 2023-2024 academic year, the Bulgarian version of SELFIE was administered to a sample of 90 university professors from various faculties at Sofia University, excluding the Faculty of Physics (Sample 1). The sample from the Faculty of Physics alone consisted of six professors (Sample 2). In our research we primarily focus on the following research questions:
RQ#1: Are there statistically significant differences in responses between general university teacher training and physics-specific teacher training at Sofia University?
RQ#2: To what extent do qualitative interviews align with or diverge from survey responses regarding digital competencies in teacher training?
It is important to note that our samples are independent, suggesting that, for the quantitative phase, the Chi-square test is the most appropriate for comparing theoretical and empirical distributions. The theoretical distribution (expected values for Sample 2) is generated based on the empirical results of Sample 1. The empirical distribution of responses in Sample 2 is then compared to this theoretical distribution.
The qualitative phase follows the quantitative analysis and aims to explore, through interviews with the six professors from the Faculty of Physics, the reasons behind any statistical differences identified.
Figure 1. Research Process Flowchart
The chosen methodology is based on a multifaceted approach to studying the level of digitalization, as developed by a scientific team from Sofia University (Mizova et al. 2025). Their research describes the full toolkit, which is grounded in qualitative methods. We apply some of these methods to our study, specifically to assess the digital competence of lecturers from the Faculty of Physics at Sofia University.
Results and Analysis
The Chi-square procedure was used to compare two distributions for Sample 2 (n=6): one theoretical distribution, based on the expected frequency of each response option for each question, and one empirical distribution, based on the actual responses of Sample 2. Expected values were calculated by applying the percentage response distribution from Sample 1 to Sample 2, multiplying each percentage by six (the sample size of Sample 2). The null hypothesis is rejected for the upper-tail, one-sided Chi-square test (α = 0.05) in three areas (Table 1). Critical values of the Chi-square distribution are taken from the official website of the National Institute of Standards and Technology (2025).
Table 1. Chi-square values across Areas 1 to 6 and corresponding hypothesis results
(H0 indicates no significant difference between theoretical and empirical distributions;
H1 indicates a significant difference between theoretical and empirical distributions)
| Area of competence | Number of questions (each question has 7 options) | Total number of options (×7) | Chi-square | Degrees of freedom (df) | Critical value for the upper-tail one-sided test | Hypothesis accepted |
| Area 1. Professional Engagement | 9 | 63 | 95.27 | 57 | 75.62 | H1 |
| Area 2. Digital Resources | 5 | 35 | 37.03 | 32 | 49.19 | H0 |
| Area 3. Teaching and Learning | 5 | 35 | 82.37 | 34 | 48.60 | H1 |
| Area 4. Assessment | 3 | 21 | 25.29 | 20 | 31.41 | H0 |
| Area 5. Empowering Learners | 4 | 28 | 22.90 | 28 | 41.34 | H0 |
| Area 6. Facilitating Learners’ Digital Competence | 6 | 42 | 70.98 | 41 | 56.94 | H1 |
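To make the test procedure concrete, the following sketch reproduces the computation for a single seven-option question in Python. All counts are invented placeholders (the paper reports only the aggregate statistics in Table 1), so this is an illustration of the method, not the project’s analysis code.

```python
import numpy as np
from scipy.stats import chi2

# Hypothetical response shares for one 7-option question in Sample 1 (n = 90)
# and hypothetical raw counts for the same question in Sample 2 (n = 6).
sample1_shares = np.array([0.30, 0.20, 0.15, 0.10, 0.10, 0.05, 0.10])
sample2_counts = np.array([2, 1, 1, 0, 0, 0, 2])

# Theoretical (expected) distribution for Sample 2: Sample 1 shares scaled to n = 6.
expected = sample1_shares * sample2_counts.sum()

# Goodness-of-fit statistic: sum over options of (observed - expected)^2 / expected.
chi_square = ((sample2_counts - expected) ** 2 / expected).sum()

# Upper-tail one-sided test at alpha = 0.05; Table 1 reports area-level df values,
# here we use options - 1 for a single question.
df = len(sample2_counts) - 1
critical_value = chi2.ppf(0.95, df)

print(f"chi-square = {chi_square:.2f}, critical value = {critical_value:.2f}")
print("H1: significant difference" if chi_square > critical_value
      else "H0: no significant difference")
```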
Significant statistical differences were found between Sample 1 and Sample 2 in Areas 1, 3, and 6, supporting hypothesis H1, which indicates differences between the distributions. The largest effect sizes contributing to these differences were observed for Question 1.8 (Option 7) in Area 1, Question 3.4 (Option 7) in Area 3, and Questions 6.4 (Option 1) and 6.2 (Option 2) in Area 6 (Table 2). These differences will be further explored in the qualitative stage of the research, considering the context of the other response options for Question 1.8, as well as Questions 3.4 and 6.4. For ease of qualitative analysis, these options are presented in Table 3 (Appendix).
Table 2. Options from the questionnaire with the greatest effect size contributing to the significant differences between the theoretical and empirical distributions in Area 1, 3 and 6
| Area of competence | Question No | Option No | Effect size |
| Area 1. Professional Engagement | Q1.8. Engagement in Professional Training for the Development of Teachers’ Digital Competence. | 7 | 61.70 |
| Area 3. Teaching and Learning | Q3.4. Use of Digital Technologies to Promote and Enhance Learners’ Collaboration for Individual and Collective Learning. | 7 | 56.13 |
| Area 6. Facilitating Learners’ Digital Competence | Q6.4. Empowering Learners to Use Digital Technologies Safely While Mitigating Risks to Ensure Physical, Psychological, and Social Well-being. | 1 | 23.43 |
| Area 6. Facilitating Learners’ Digital Competence | Q6.2. Enhancing the Digital Competence of Learners – Communication and Collaboration: Implementation of Educational Activities That Require Learners to Communicate and Collaborate Using Digital Technologies. | 2 | 20.72 |
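If the per-option “effect size” in Table 2 is read as each option’s contribution (observed − expected)² / expected to the area’s total Chi-square statistic (an interpretation we assume here, since the paper does not define the measure), the dominant cells can be located as in the sketch below, again with invented counts.

```python
import numpy as np

# Invented observed counts for one 7-option question in Sample 2 (n = 6) and
# invented expected counts (Sample 1 shares scaled to n = 6).
observed = np.array([0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 5.0])
expected = np.array([1.5, 1.2, 1.1, 0.9, 0.6, 0.3, 0.4])

# Per-option contribution to the Chi-square statistic.
contributions = (observed - expected) ** 2 / expected
for option, value in enumerate(contributions, start=1):
    print(f"Option {option}: contribution = {value:.2f}")

# The option with the largest contribution drives the significant difference.
print(f"Largest contributor: Option {contributions.argmax() + 1}")
```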
Empirical distributions of responses from Sample 1 and Sample 2 for Questions 1.8, 3.4, and 6.4 are presented in Figure 2.
In the second stage of our research, which was qualitative, we conducted structured interviews to confirm and explain the results of the first stage. These interviews were carried out with university professors from Sample 2, after they completed the survey. Quirkos software was used for the qualitative analysis of the interview data. The interviews lasted between 45 and 60 minutes and followed a set of pre-defined questions covering several topics:
- General digital competencies as a necessity for modern teaching and learning.
- Development of pedagogical digital competencies.
- Approaches for using digital technologies in teaching.
- Application of pedagogical digital competencies in education.
These topics address a range of issues relevant to the study.
Question 1.8 includes seven options. The data in Figure 2 show that lecturers from the Faculty of Physics most often chose Options 1, 3, and 5, and the results for these options are comparable across both samples. These options relate to participation in digital competence development activities and the use of digital resources in teaching. In contrast, the participants from the Faculty of Physics did not select options 2, 4, and 6. These options pertain to choosing and participating in professional training programs on resource utilization, as well as developing such programs. This suggests that lecturers from the Faculty of Physics rely more on self-study and peer collaboration for digital skill development.
Figure 2. Empirical distributions of responses from Sample 1 and Sample 2 for Questions 1.8, 3.4, and 6.4
Half of the participants in Sample 2 selected option 7 (“None of the above”). This may indicate that professional training on the use of digital resources takes forms not reflected in the survey options. Notably, participants who chose option 7 did not explicitly mention participation in professional training. However, they did receive training on using the MOODLE platform during the Covid-19 pandemic. Additionally, the use of other platforms requiring training suggests that faculty members largely depend on self-learning and peer support to develop their digital skills.
To further support the survey data, we analyze interview responses related to options 1, 3, and 5 for Question 1.8. The lecturers from the Faculty of Physics are numbered from 1 to 6 (corresponding to the number of study participants).
Option 1 is confirmed by Lecturer №1, who stated: “Maintaining professional development is extremely important, as individual software products are constantly being updated and improved.”
Options 3 and 5 are supported by responses from:
Lecturer №6, who noted: “We are trying to develop competence, learning from students and continuing to improve our skills in the future…”
Lecturer №4, who emphasized: “Professional development is extremely important, so we refine and perfect our use of software products.”
The survey data for Question 1.8, specifically for options 1, 3, and 5, align with the interview responses.
Like Question 1.8, Question 3.4 includes seven options. According to the data (Figure 2), differences can be observed in options 1, 2, 5, 6, and 7. Options 1, 5, and 6 were not selected, while option 7 (“None of the above”) was chosen by two participants from the Faculty of Physics. This suggests that one-third of the surveyed professors from Sample 2 may not use digital technologies for self-regulated and collective learning. However, three professors indicated in the survey that they actively try to use digital technologies to support students in planning and regulating their own learning (Options 2 and 3).
Both samples coincide completely on option 4, which refers to selecting and using technologies to enhance students’ self-regulated learning skills. This includes encouraging students to take initiative and engage in reflective learning processes. Option 4 also expands on options 2 and 3, both of which were selected by other participants.
To validate the survey data for options 2, 3, and 4 in Question 3.4 (Sample 2), we analyzed interview responses:
For option 3, participants 1 and 3 provided supporting statements:
“I use digital tools to manage the learning process, such as electronic assessment platforms, course management, and communication between teachers and students.”
“…Such resources are also used in assessment, offering students self-assessment sheets. They can use digital resources to encourage reflection and self-assessment, which is part of formative assessment.”
The interview responses confirm and align with the survey data.
Regarding option 4, three professors (Participants 3, 5, and 6) responded positively in the interviews, although only one of them selected this option in the survey. Their responses are as follows:
“During the exam, we create a folder with students’ digital coursework, which they exchange. I encourage them to edit each other’s work and express their opinions on its content.”
“Exercises are provided where students generate content using artificial intelligence, such as ChatGPT or other generative AI tools. Based on their physics knowledge, they verify the accuracy of the generated content. Since AI models use large language datasets, their calculations are often incorrect. Therefore, students must carefully check all calculations—the logic may be correct, but numerical results often have a high probability of error.”
“In astronomy, we use many databases. The resources I share typically have simplified graphical interfaces to ensure accessibility. I deliberately select tools that facilitate information sharing without creating unnecessary barriers for students.”
Option 2, which relates to using digital technologies to encourage and support students’ independent learning, was selected by two teachers in the survey but was not directly confirmed in the interviews.
Overall, the teachers’ responses indicate a consistent and systematic approach to integrating digital technologies in education, rather than isolated efforts. This is further supported by their answers regarding options 3 and 4, which were also selected by two teachers in the survey.
The differences in survey responses for Q6.4 are primarily seen in Option 1, which indicates that participants are aware of learning activities related to the safe use of digital technologies, data privacy protection, participation in social networks, and related topics. This option was selected by two-thirds of the lecturers from the Faculty of Physics, suggesting that their knowledge of these issues is more passive—meaning they are aware of the concepts but do not actively conduct activities with students on safe digital practices.
Option 1 was also the most frequently chosen response among other lecturers, highlighting a general tendency to neglect this aspect of digital technology training, despite its importance. Additionally, there is a correlation between option 7 (“None of the above”) and option 3. Only one of the six lecturers reported conducting activities with students on the responsible use and creation of digital resources, such as privacy protection, creating strong passwords, safeguarding personal data, and blocking suspicious individuals.
The interview data provide weak support for the survey results from Q6.4. Only one teacher explicitly mentioned this topic, stating:
“One such issue is misinformation. The best way to combat this problem, in my opinion, is to instill in students a discipline regarding resources and sources.”
A possible explanation for this weak alignment is that the interview questions may not have emphasized this topic, leading to an omission in the responses.
Discussion
The statistical analysis of survey data from both samples reveals both similarities and differences in responses across the six thematic areas. Statistically significant differences were found in three areas, while the remaining three showed no significant variation.
There were no statistical differences in Areas 2, 4, and 5. We provide the following interpretation.
For Area 2 – Digital Resources (2.1 Selecting Digital Resources, 2.2 Creating and Modifying Digital Content, 2.3 Managing, Protecting, and Sharing Digital Resources):
Lecturers from the Faculty of Physics provided comparable responses regarding the selection, creation, management, and sharing of digital resources. This suggests that their efforts are focused on developing these essential digital competencies.
For Area 4 – Assessment (4.1 Assessment Strategies, 4.2 Analyzing Evidence, 4.3 Feedback and Planning):
The use of digital technologies for assessment is consistent across both samples, with similar practices such as computer-based tests and feedback assignments.
For Area 5 – Empowering Learners (5.1 Accessibility and Inclusion, 5.2 Differentiation and Personalization, 5.3 Actively Engaging Learners):
The provision of access to digital resources for university students is uniform, as evidenced by access to various electronic platforms and databases.
These findings are further supported by the interview data, where these areas were most frequently discussed.
The differences observed in survey responses primarily concern Questions 1.8, 3.4, and 6.4, though these represent only a small portion of the questions in their respective areas.
The remaining five questions in Area 6 (Students’ Digital Competencies in Problem-Solving, Ethical and Responsible Use of Digital Resources, Creation of Digital Content for Self-Expression, and Collaborative Use of Resources) show no significant differences between the two samples.
The remaining four questions in Area 3 (Use of Emerging Technologies, Collaborative Learning, Digital Feedback, and Digital Learning Environments) likewise show no differences.
The remaining eight questions in Area 1 (Computational Thinking, Independent Digital Learning, Safe and Responsible Digital Practices, Reflection, University-Level Digital Infrastructure, Collaboration with Colleagues, Online Learning Management, and Digital Communication) also show no statistical differences between responses.
Overall, the findings indicate that the use of digital technologies and the development of digital competencies are comparable between both samples. However, it is important to acknowledge that the small sample size of lecturers from the Faculty of Physics may not fully represent the broader faculty population.
Conclusion
This study examined the digital competencies of university lecturers at Sofia University, with a particular focus on differences between general faculty and those from the Faculty of Physics.
Regarding RQ#1 (Are there statistically significant differences in responses between general university teacher training and physics-specific teacher training at Sofia University?), the quantitative analysis confirmed statistically significant differences in three key areas: professional development in digital competencies (Area 1), self-regulated and collaborative learning using digital technologies (Area 3), and ethical and responsible use of digital resources (Area 6). These results suggest that physics lecturers tend to rely more on self-directed learning and peer collaboration rather than structured professional training. However, no significant differences were found in areas related to digital resource management, assessment practices, and student empowerment, indicating a shared foundation in integrating digital tools into teaching.
For RQ#2 (To what extent do qualitative interviews align with or diverge from survey responses regarding digital competencies in teacher training?), the qualitative phase provided partial confirmation of the survey results. While interviews supported the findings that physics lecturers actively use digital tools for teaching and assessment, they also revealed a gap in formal training opportunities and structured engagement with ethical and security aspects of digital technology. Notably, awareness of digital ethics was high, but few structured activities were conducted with students in this area. In some cases, interview responses highlighted additional digital practices not explicitly captured in the survey, particularly regarding the use of AI and astronomy databases.
These findings indicate that while digital technology is systematically integrated into university teaching, its development remains largely informal for physics lecturers. Addressing this gap through structured institutional training programs, targeted professional development, and a stronger emphasis on digital ethics could enhance digital competence and ensure a more comprehensive approach to digital learning in higher education.
Appendix 1
Options of Q1.8, Q3.4, and Q6.4
| Q1.8. Option 1 | I am aware that my participation in training on the use of digital technologies can develop my digital competences (e.g. webinars or workshops on the use of digital technologies in teaching and learning). |
| Q1.8. Option 2 | I have attended professional training activities on the use of digital technologies to develop my digital competencies (e.g., micro-training, seminars on the use of digital technologies in teaching and learning). |
| Q1.8. Option 3 | I participate in various formal and informal professional training activities on the use of digital technologies to develop my digital competencies (e.g., hands-on training on the pedagogical use of digital technologies, online learning approaches, digital assessment). |
| Q1.8. Option 4 | I analyze and select professional training on the use of digital technologies based on my needs (e.g., using a self-reflection tool for my digital competence, setting learning goals, designing my training, reflecting on my learning). |
| Q1.8. Option 5 | I provide learning activities on the use of digital technologies and support colleagues in developing their digital competence (e.g., seminars, informal sessions with colleagues, micro-training on the use of digital technologies). |
| Q1.8. Option 6 | I contribute to the design of professional training programs aimed at developing teachers’ digital competence (e.g., project-based training using digital technologies, sharing best practices). |
| Q1.8. Option 7 | None of the above. |
| Q3.4. Option 1 | I am aware that digital technologies can be used to promote active and autonomous learning (e.g., planning, goal setting, tracking progress). |
| Q3.4. Option 2 | I have tried using digital technologies to support students in planning their own learning (e.g., planning with digital calendars, setting goals with digital journals, tracking progress). |
| Q3.4. Option 3 | I use various digital technologies to support students in planning and regulating their own learning (e.g., online learning environments, online resource repositories, collaboration tools and spaces, learning journals, e-portfolios). |
| Q3.4. Option 4 | I choose digital technologies that facilitate the presentation and analysis of learning data to support my observations of my teaching practice and of my students’ learning (e.g. recording and visualizing data, automatically generated graphs, mind mapping tools, digital boards). |
| Q3.4. Option 5 | I select and use digital technologies in my teaching projects based on their characteristics to facilitate my students’ self-regulated learning skills and autonomy (e.g., taking initiative in their own learning, being creative and responsive to new learning situations, engaging in self-reflection to plan and guide their progress). |
| Q3.4. Option 6 | Together with my students, I reflect and support them in (re)designing their learning through and on the use of digital technologies, encouraging their self-regulation of learning and learner autonomy (e.g., identifying their needs, setting learning goals, describing their strategy for achieving those goals, completing learning tasks, gathering evidence of their learning, reflecting on it, and sharing their learning outcomes). |
| Q3.4. Option 7 | None of the above. |
| Q6.4. Option 1 | I am aware of learning activities that encourage students to use digital technologies safely (e.g., how to protect data privacy, read terms of use, avoid social exclusion, prevent violence in digital environments). |
| Q6.4. Option 2 | I have tried learning activities that allow students to reflect on the safety and well-being consequences of using digital technologies (e.g., identifying inappropriate behavior, discussing issues of excessive use/addiction). |
| Q6.4. Option 3 | I carry out various learning activities to encourage students to act in responsible and ethical ways when creating and using digital information (e.g., adjusting their social media settings, protecting personal data and privacy, setting strong passwords, blocking and reporting individuals who make them feel uncomfortable). |
| Q6.4. Option 4 | I design training to help students develop strategies for the responsible and ethical use of technology to protect their reputation and promote social well-being (e.g., balancing online and offline activities, recognizing and addressing cyberbullying/ sexting/racism, etc. in digital environments). |
| Q6.4. Option 5 | I reflect on and (re)design learning activities based on the continuous evolution of online risks and threats, so that I enable students to follow and adopt positive practices for their physical, psychological, and social well-being, as well as that of their peers (e.g., how companies collect and use data on individuals, how social media affects emotional and social relationships). |
| Q6.4. Option 6 | My students and I contribute to creating a culture in our university and its broader community where the negative and positive uses of digital technologies are openly discussed, along with ways to avoid risks and threats (e.g., practical workshops on online safety, coaching on digital well-being for peers, teachers, and parents). |
| Q6.4. Option 7 | None of the above. |
NOTES
1. https://iste.org/standards
2. https://teachertaskforce.org/sites/default/files/2020-07/ict_framework.pdf
3. NATIONAL INSTITUTE OF STANDARDS AND TECHNOLOGY, 2025. E-Handbook of Statistical Methods. Available at: http://www.itl.nist.gov/div898/handbook/ (last accessed 23 June 2025).
Acknowledgement and Funding
This study is supported by the European Union-NextGenerationEU, through the National Recovery and Resilience Plan of the Republic of Bulgaria, project No BG-RRP-2.004-0008.
REFERENCES
CLARO, M.; SALINAS, Á.; CABELLO-HUTT, T.; SAN MARTÍN, E.; PREISS, D. D.; VALENZUELA, S. & JARA, I., 2018. Teaching in a Digital Environment (TIDE): Defining and measuring teachers’ capacity to develop students’ digital information and communication skills, Computers & Education, no. 121, pp. 162 – 174. Available from: https://doi.org/10.1016/j.compedu.2018.03.001.
COLLIS, B.; PETERS, O. & PALS, N., 2001. A model for predicting the educational use of information and communication technologies, Instructional Science, no. 29, pp. 95 – 125. Available from: https://doi.org/10.1023/A:1003937401428.
COLLIS, B. & MOONEN, J., 2002. Flexible learning in a digital world, Open Learning: The Journal of Open, Distance and e-Learning, vol. 17, no. 3, pp. 217 – 230. Available from: https://doi.org/10.1080/0268051022000048228.
CRESWELL, J. W., 2015. Educational research: Planning, conducting, and evaluating quantitative and qualitative research. Pearson.
EHLERS, U. D. & BONAUDO, P., 2020. DIGI-HE – A Strategic Reflection Tool on Digitalisation at European Higher Education Institutions. EDEN Conference Proceedings, no. 1, pp. 289 – 298. Available from: https://doi.org/10.38069/edenconf-2020-ac0027.
ESTEVE-MON, F. M.; LLOPIS-NEBOT, M. Á. & ADELL-SEGURA, J., 2020. Digital teaching competence of university teachers: A systematic review of the literature, IEEE Revista Iberoamericana de Tecnologías del Aprendizaje, vol. 15, no. 4, pp. 399 – 406.
FERRARI, A., 2012. Digital competence in practice: An analysis of frameworks, vol. 10, p. 82116, Luxembourg: Publications Office of the European Union.
GREENSTEIN, L., 2012. Assessing 21st century skills: A guide to evaluating mastery and authentic learning. Thousand Oaks: Corwin.
HUGHES, J. & THOMPSON, S., 2022. Assessment in the makerspace. Making, Makers, Makerspaces: The Shift to Making in 20 Schools. Cham: Springer International Publishing.
IVANKOVA, N. V.; CRESWELL, J. W. & STICK, S. L., 2006. Using mixed-methods sequential explanatory design: From theory to practice. Field Methods, vol. 18, no. 1, pp. 3 – 20. Available from: https://doi.org/10.1177/1525822X05282260.
KIRSCHNER, P. & DAVIS, N., 2003. Pedagogic benchmarks for information and communications technology in teacher education. Technology, Pedagogy and Education, vol. 12, no. 1, pp. 125 – 147. Available from: https://doi.org/10.1080/14759390300200149.
KRUMSVIK, R. J., 2014. Teacher educators’ digital competence. Scandinavian Journal of Educational Research, vol. 58, no. 3, pp. 269 – 280. Available from: https://doi.org/10.1080/00313831.2012.726273.
LIČEN, S. & PROSEN, M., 2024. Strengthening sustainable higher education with digital technologies: development and validation of a digital competence scale for university teachers (DCS-UT), Sustainability, vol. 16, no. 22, p. 9937. Available from: https://doi.org/10.3390/su16229937.
MILIOU, O.; ADAMOU, M.; MAVRI, A. & IOANNOU, A., 2024. An exploratory case study of the use of a digital self-assessment tool of 21st-century skills in makerspace contexts. Educational Technology Research and Development, vol. 72, no. 1, pp. 239 – 260. Available from: https://doi.org/10.1007/s11423-023-10314-0.
MISHRA, P. & KOEHLER, M. J., 2006. Technological pedagogical content knowledge: A framework for teacher knowledge. Teachers College Record, vol. 108, no. 6, pp. 1017 – 1054. Available from: https://doi.org/10.1111/j.1467-9620.2006.00684.x.
MIZOVA, B. & PEYTCHEVA-FORSYTH, R., 2024. Digital Pedagogical Competences of Bulgarian Secondary Teachers – Preliminary Data. EDULEARN24 Proceedings, IATED, pp. 2623 – 2632. Available from: https://doi.org/10.21125/edulearn.2024.0717.
MIZOVA, B.; PEYTCHEVA-FORSYTH, R. & MELLAR, H., 2025. A multi-faceted approach to researching the level of digitalization in initial teacher preparation, Strategies for Policy in Science and Education, vol. 33, no. 1, pp. 28 – 520.
PEYTCHEVA-FORSYTH, R. & YOVKOVA, B., 2024. Development of Digital Pedagogical Competences in the University Initial Teacher Training Programmes – the Perspective of University Professors and Future Teachers (Sofia University Case). EDULEARN24 Proceedings, IATED, pp. 9603 – 9613. Available from: https://doi.org/10.21125/edulearn.2024.2321.
PEYTCHEVA-FORSYTH, R. & RACHEVA, V., 2024. Bridging Digital Competences: A Comparative Analysis Between University Teacher Trainers and Secondary School Teachers in Bulgaria. EDULEARN24 Proceedings, IATED, pp. 4518 – 4527. Available from: https://doi.org/10.21125/edulearn.2024.1125.
PUNIE, Y. & REDECKER, C., 2017. European Framework for the Digital Competence of Educators: DigCompEdu. Luxembourg: Publications Office of the European Union.
SHULMAN, L. S., 1986. Those who understand: Knowledge growth in teaching. Educational researcher, vol. 15, no. 2, pp. 4 – 14. Available from: https://doi.org/10.3102/0013189X015002004.
THOMS, L. J.; KREMSER, E.; VON KOTZEBUE, L.; BECKER, S.; THYSSEN, C.; HUWER, J.; … & MEIER, M., 2022. A framework for the digital competencies for teaching in science education–DiKoLAN. Journal of Physics: Conference Series, vol. 2297, no. 1, p. 012002. Available from: https://doi.org/10.1088/1742-6596/2297/1/012002.
TONDEUR, J.; AESAERT, K.; PYNOO, B.; VAN BRAAK, J.; FRAEYMAN, N. & ERSTAD, O., 2017. Developing a validated instrument to measure preservice teachers’ ICT competencies: Meeting the demands of the 21st century. British Journal of Educational Technology, vol. 48, no. 2, pp. 462 – 472. Available from: https://doi.org/10.1111/bjet.12380.
Dr. Ivelina Kotseva, Assist. Prof.
ORCID iD: 0000-0003-2932-7069
Dr. Maya Gaydarova, Assoc. Prof.
ORCID iD: 0000-0003-1552-1039
Faculty of Physics
Sofia University “St. Kliment Ohridski”
Sofia, Bulgaria
E-mail: iva_georgieva@phys.uni-sofia.bg
E-mail: mayag@phys.uni-sofia.bg