ISSN: 2394-8418
Journal of Dental Problems and Solutions
Research Article       Open Access      Peer-Reviewed

Conception of an endodontics questionnaire in undergraduate dental education

S Sacha, D Sonntag, S Rüttermann and S Gerhardt-Szép*

Department of Operative Dentistry, Carolinum Dental University Institute, J.W. Goethe University, Frankfurt am Main, Germany
*Corresponding author: Dr. Susanne Gerhardt-Szép, Professor, MME, Department of Operative Dentistry, Carolinum Dental University Institute GmbH, J.W. Goethe University, Theodor-Stern-Kai 7, Frankfurt am Main 60590, Germany, Tel: +49-69-6301-7505; E-mail: Szep@em.uni-frankfurt.de
Received: 13 May, 2020 | Accepted: 28 May, 2020 | Published: 29 May, 2020
Keywords: Endodontics; Preclinical; Education; Questionnaire; Conception

Cite this as

Sacha S, Sonntag D, Rüttermann S, Gerhardt-Szép S (2020) Conception of an endodontics questionnaire in undergraduate dental education. J Dent Probl Solut 7(1): 049-055. DOI: 10.17352/2394-8418.000084

Abstract

The purpose of this study was to create a valid assessment instrument with a set of questions to examine preclinical dental education in endodontics. For this reason, we constructed the “German Endodontology Questionnaire” (GEndoQ), which assesses preclinical dental education in endodontics. In recent years and decades, various curricula in preclinical and clinical endodontic education have been evaluated at both national and international levels and the results published. However, the design of the questionnaires used in these studies has never been discussed or published.

The GEndoQ was constructed in five phases using the Delphi technique. In the first phase, the questionnaire was generated in the first Delphi round on the basis of earlier questionnaires and publications and divided into 10 categories. In phases two and three, different expert panellists, such as specialists in endodontology and participants in a master’s degree programme in endodontology, used the Delphi technique to confirm the content validity of the GEndoQ. The most recent literature was incorporated in phase four. In phase five, the GEndoQ was finalised after multiple shortenings based on feedback from the expert panellists, who employed the think-aloud method.

Within the five phases, GEndoQ Version 5 was created, comprising 49 questions in nine categories with different answer formats. A six-point Likert scale was the most frequently used: 1 = don’t agree at all; 2 = don’t agree; 3 = undecided; 4 = agree; 5 = fully agree; 6 = don’t know. Seven questions could be answered in a free text format, while five questions were in a single-choice format, such as yes/no answers. The GEndoQ is a valid instrument for assessing preclinical dental education in endodontics. Future research will focus on further refining and validating the instrument, for example, within a pilot test. Additionally, the questionnaire should be translated into English and validated to facilitate comparisons among international dental faculties.

Introduction

On a national and international level, various curricula in preclinical and clinical endodontic education have been evaluated and published in recent years and decades [1-7]. Al Raisi et al. conducted a survey in the United Kingdom from November 2017 to January 2018 on basic endodontic education in British dental schools and compared their results with an earlier paper-based survey [1]. The study revealed great divergence, especially in the teacher-student ratio, time management and teaching methods within British dental faculties. Endodontic education in the United Kingdom has developed positively over the last 20 years [1]. Sonntag et al. evaluated endodontic training in Germany in 2008 [3]. They noted that preclinical endodontic training varied considerably among German universities due to differences in curricula design, staff and course content [3]. In 2014, Gerhardt-Szép presented a survey in the context of the 11th conference "Training for Trainers". The survey analysed the divergence in general examination projects in preclinical endodontology [8].

Nevertheless, the conception of an assessment instrument with a set of questions to examine preclinical and clinical dental education in endodontics at the national and international levels has not been presented [2-7]. A former paper-based survey by Qualtrough and Dummer from 1997 [7] was used in the studies mentioned above, whose authors stated that no modifications to the survey had been made [1,4-6]. Since 1997, however, various guidelines for endodontic education [9-11] and general guidelines for endodontics [12,13] have been formulated, because many changes have occurred in the last decades with regard to endodontic equipment and materials [14,15]. Furthermore, the guidelines from the European Society of Endodontology (ESE) [9] and the Association for Dental Education in Europe (ADEE) [10,11] support dental faculties in creating undergraduate endodontic curricula that promote consistent standards within Europe and enhance the quality of patient care in the community [1]. It is also known that considerable differences exist among countries in terms of curricular structure and content as well as the scope of practice in dental education generally [16].

Therefore, the use of contemporary dental assessment instruments for national and international comparison is important. Unfortunately, the current state of research indicates that no validated instrument for examining preclinical dental education in endodontics has been published, nor has the methodology of the conception and validation of such an instrument been explained [1-7]. In the most recent study on endodontic education in Great Britain, the authors described that their survey “was piloted locally to check for question readability, clarity, validity and functionality and time required to complete” [1]. However, the authors did not specify the process. Reviews of the literature on instruments for evaluating endodontics by general dental practitioners worldwide show that little to nothing has been published about the construction and validation of the questionnaires used in these studies [17-20]. In contrast, plenty can be found on the development and validation of instruments within the dental field [21-23] and beyond dental education [22-24]. Many publications offer general explanations of questionnaire conception and validation, such as the “seven-step process for designing high-quality questionnaires” given in the Association for Medical Education in Europe (AMEE) Guide No. 87 [21,25-27] or similar step-by-step methods [28]. Many use various methods, such as the Delphi technique [29-33] and the think-aloud method [34-37], for content validation.

On that account, the main goal of this study was to create an assessment instrument comprising a set of questions to examine preclinical dental education in endodontics. To this end, two different methods were applied: the Delphi technique and the think-aloud method.

Materials and methods

Ethical approval

After consultation with the ethics committee of the Department of Medicine of the Goethe University Frankfurt am Main, it was decided that a vote by the ethics committee was not required (reference 114/2019).

General methodology

The development of the questionnaire consisted of five phases using four Delphi rounds and the think-aloud method (Figure 1).

Phase 1: Literature review and conception of a discipline-specific questionnaire

In phase one, a literature search for English-language articles published between 1991 and June 2019 was conducted using the PubMed database. The search included combinations of keywords such as “endodontics”, “education”/“teaching”, “survey”/“questionnaire”, “preclinical”, “undergraduate”, and “root canal treatment”.

These keywords were combined with the Boolean operator “AND”, with “education” and “teaching”, as well as “survey” and “questionnaire”, used as synonyms in the search. In the first step, the abstracts of the retrieved publications were examined to determine the relevance of their content. Irrelevant literature was identified and excluded. For all references with (possibly) relevant content, the full text of the publication was then retrieved online. The following inclusion criterion was applied when selecting publications for the creation of the questionnaire: questionnaire/survey requesting the general status quo of endodontic education in the preclinical and/or clinical undergraduate part of dental faculties nationally/internationally (Table 1).
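The keyword-combination scheme described above can be sketched programmatically. The following snippet is purely illustrative and was not part of the study; the keyword groups are taken verbatim from the text, and the synonym pairs are modelled as interchangeable alternatives:

```python
# Illustrative sketch (not part of the study): assembling Boolean search
# strings when some keywords are treated as synonyms, as described in the text.
from itertools import product

# Each inner list is a group of interchangeable (synonym) keywords.
keyword_groups = [
    ["endodontics"],
    ["education", "teaching"],        # used as synonyms
    ["survey", "questionnaire"],      # used as synonyms
    ["preclinical", "undergraduate"],
]

# One PubMed-style query per combination, joined with the Boolean operator AND.
queries = [" AND ".join(combo) for combo in product(*keyword_groups)]

for q in queries:
    print(q)
# e.g. "endodontics AND education AND survey AND preclinical"
```

With one fixed keyword and three synonym pairs, this yields eight candidate query strings, three of which correspond to the searches reported in the Results section.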

The questionnaire was constructed by two experts (Table 2) based on the data provided in several studies and publications [1,11,14,15,38,39] and the Frankfurt learning objectives for endodontology (FLOE) (Figure 2). The working group named the instrument the “German Endodontology Questionnaire”. In this article, the questionnaire is referred to as the GEndoQ.

The expert group exchanged e-mails, held meetings and discussed their conceptions of the questions. The dental and medical education expertise of one of the authors ensured content validity. GEndoQ Version 1 was thus created.

Phase 2: Conception of the questionnaire

In phase two, the first pre-testing took place within the second Delphi round, which consisted of one expert from the Department of Operative Dentistry, Carolinum Dental University Institute, J.W. Goethe University, Frankfurt am Main, Germany (Table 2). In this round, the expert received the questionnaire digitally and was asked to ensure content validity and return written feedback directly on a printed copy of the questionnaire.

Phase 3: Conception of the questionnaire

In phase three, the second pre-testing of the updated questionnaire took place within the third Delphi round. The four-member panellist group, from the previously mentioned Department of Operative Dentistry, consisted of one specialist in endodontology, one graduate of a structured advanced training course with endodontology content and two participants of a master’s degree course in endodontology (Table 2). Each of them received a printed questionnaire and was asked to improve its grammar, clarity and comprehensibility. They were also asked to ensure objectivity and provide written feedback directly on the printed questionnaire.

Phase 4: Modification of the questionnaire

In the fourth Delphi round, the expert group from the first phase modified and supplemented the third version of GEndoQ using the most recent publication from 2019 concerning undergraduate endodontic education [1].

Phase 5: Final conception of the questionnaire

In the last phase, 15 dentists from the previously mentioned Department of Operative Dentistry served as panellists: specialists in endodontology, graduates of structured further education with endodontological content and participants in a master's degree course in endodontology, with and without a doctorate (Table 2). They checked the fourth version of the GEndoQ for reliability, objectivity, content validity and processing time. The study and the questionnaire were presented to the panellists as a PowerPoint presentation. In this phase, the think-aloud method was used [35].

Results

Phase 1

Using the combination of the keywords “endodontics AND education AND survey AND undergraduate” in the PubMed database, 87 results were returned. The search “endodontics AND teaching AND survey AND undergraduate” returned 81 results, and “endodontics AND teaching AND survey AND preclinical” returned 29. Following the criteria mentioned above, six publications were considered (Table 1). In addition, two publications containing guidelines for undergraduate endodontic education and for endodontic treatment in general were chosen (Table 1).

A total of 45 questions with 149 sub-items (sub-items are the possible answer options presented below the questions) were conceived in the first Delphi round of phase one. The GEndoQ was divided into ten categories, as illustrated in Table 3. For a better overview, the GEndoQ has been highlighted in different colours to show which questions and sub-items belong to which source (Figure 3). The expert group chose different answer formats. A six-point Likert scale was the most frequently used: 1 = don’t agree at all; 2 = don’t agree; 3 = undecided; 4 = agree; 5 = fully agree; 6 = don’t know. Seven questions could be answered in a free text format, while five questions were in single-choice formats, such as yes/no answers. Category III, “Learning-teaching settings: Theoretical endodontology”, and category IV, “Learning-teaching materials: Theoretical endodontology”, also asked for the percentage of the settings and materials used. At the end of categories III, IV, VI, VIII, IX and X, a question asked whether anything else was worth mentioning for the specific category (Table 4).

Phase 2

In the second phase, the second Delphi round, 22 changes proposed by the panel expert were accepted and the questionnaire was adapted. For example, questions 5 and 10 were supplemented with the sub-item “dentist without specific education”, while some questions had to be formulated more precisely for better comprehension (Tables 5-7). This resulted in the second version of the questionnaire with 44 questions and 164 sub-items (Figures 4-5 illustrate some examples of the changes made).

Phase 3

The feedback from the expert group involved in the third Delphi round included 44 suggestions for change, 24 of which were adopted. The proposed amendments were changes in paragraphs and the use of other vocabulary to achieve a better understanding. One definition was added in question 16, where peer-tutoring was defined (Figures 6-7 illustrate examples of the changes made). This resulted in the third version of the questionnaire with 44 questions and 171 sub-items (Tables 8-11). The declined suggestions resulted from differing interpretations of definitions and from proposals to rephrase wording taken from the publications that formed the basis of the questionnaire.

Phase 4

The third version of the GEndoQ was modified and supplemented by 11 more questions to a total of 55 questions and 218 sub-items (Table 12). A question was added that asked for the time that dental schools dedicate to specific topics taught in preclinical endodontic classes: root canal anatomy and pulp histology, pulp pathology and endodontic microbiology, endodontic radiology, endodontic materials, vital pulp therapies, root canal treatment on immature teeth with non-vital pulp tissues, root canal treatment, root canal re-treatment, endodontic surgery, endodontic regeneration, restoration of root-filled teeth, bleaching of endodontically treated teeth, dental trauma, endodontic emergencies.

In addition, multiple questions addressed the materials and methods used for root canal treatment: working length determination, type of NiTi instruments used, root canal irrigation, activation of the root canal irrigation, apexification and materials, type of inter-visit medicament, and root canal filling pastes (sealers). Questions about post-endodontic restoration, the time needed to complete the endodontic treatment and whether there was a special department for endodontology were added to complement the questionnaire. Some questions were to be answered in a free text format; others were in a multiple-choice format.

Phase 5

In phase five, the expert group recommended that some questions and sub-items be abbreviated in order to shorten the processing time. Therefore, the endodontic learning goals had to be reduced to five major ones, which combined theoretical and practical learning goals. Several questions were changed from the Likert scale to free text answers, and vocabulary was defined more precisely. All in all, 20 changes were suggested (Tables 13-14; Figures 8-9 illustrate some examples of the changes made). This step of development guaranteed the objectivity of interpretation. After this phase, the GEndoQ was shortened to 49 questions with 99 sub-items (see GEndoQ Version 5, Figures 10-15).

Discussion

The aim of this study was to develop a questionnaire, using various methods, that can evaluate preclinical endodontic education.

Literature review and conception of a discipline-specific questionnaire

In the first phase, the expert group conducted a literature review to ensure that the idea of the study was oriented towards key previous publications, following the AMEE Guide No. 87 and the Survey Development Guidance for Medical Education Researchers [25,40]. Its aim was to bring the reader up to date with the literature on a specific theme and form the groundwork for future research [41]. Kossioni et al. described, in their methods for developing a questionnaire to measure the clinical learning environment for undergraduate dental students, that a literature review should be conducted [22]. Several other studies have made similar statements [21,25,42-44]. In the current study, the publication of Sonntag et al. [3] and the survey of Gerhardt-Szép [8] served as the basis for the questionnaire; both authors are members of the expert group in this study. In addition, the new licensing regulations for dentists (ZApprO) [38], the national competence-based catalogue of learning objectives (NKLZ) [39] and the Frankfurt learning objectives for endodontology (FLOE) were used. The publications with similar questionnaires found through the literature review [1,2,4-7] served as orientation for developing the GEndoQ. These were the only publications between 1991 and 2020 provided by the database that met the criterion “questionnaire/survey requesting the general status quo of endodontic education in the preclinical and/or clinical undergraduate part of dental faculties nationally/internationally”. Furthermore, the guidelines for undergraduate endodontic education and for endodontic treatment in general were added as orientation publications for the questionnaire [9,12]. The European Society of Endodontology (ESE) published undergraduate curriculum guidelines for endodontology in 1992 and 2001 [45,46]; the latest version, published in 2013, “[…] formed a benchmarking reference for dental schools and regulatory bodies […]” [9]. Therefore, the ESE guidelines were taken into account to ensure that the questions and sub-items remained relevant.

In this study, the GEndoQ was generated by a team of two individuals using the Delphi technique (explained below). One of them, besides being a dentist, is an expert in medical education, thereby ensuring content validity.

Content validation using different methods

To evaluate the instrument’s content validity, which assesses the relevance of each question and sub-item, different panel experts reviewed the questionnaire in several phases [27]. Baker et al. stated that “[e]xperts provide an accessible source of information that can be quickly harnessed to gain opinion. They can often provide knowledge when more traditional research has not been undertaken” [30]. According to Skulmoski et al., the expert panellists in a Delphi round should meet four “expertise requirements: i) knowledge and experience with the issues under investigation; ii) capacity and willingness to participate; iii) sufficient time to participate in the Delphi technique; and iv) effective communication skills” [47]. The use of expert panels is common for assuring the content validity of questionnaires [22,24,29,43,44] and is recommended by the AMEE Guide No. 87 [25].

In the current publication, the experts used the Delphi technique in different phases. The so-called Delphi technique is a systematic, multi-stage decision-making process with feedback, in which experts or groups of experts assess the questionnaire and provide feedback [48-51]. According to Buckley et al., the Delphi technique is used when the topic under investigation is not suitable for precise analytical techniques but can benefit greatly from subjective judgements on a collective basis [52]. They also refer to the possibility of variations in a true Delphi study [53]. In general, the Delphi technique involves multiple iterations to create a consensus of opinions on a specific topic, using a controlled feedback process [47,54,55].
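The controlled feedback process described above can be illustrated schematically. The following sketch is hypothetical and not the procedure actually used in this study; the feedback dictionaries and the toy data merely stand in for the experts' suggestions and the group's accept/decline decisions:

```python
# Illustrative sketch (hypothetical, not part of the study): a Delphi-style
# iteration reduced to minimal Python. Each round collects expert feedback,
# applies only the accepted suggestions, and produces the next version.

def run_delphi_rounds(draft, rounds):
    """Iterate a questionnaire draft through successive Delphi rounds.

    `rounds` is a list of feedback lists; each feedback item records the
    proposed change and whether the group accepted it. All intermediate
    versions are kept, analogous to GEndoQ Versions 1-5.
    """
    versions = [list(draft)]
    for feedback in rounds:
        current = list(versions[-1])
        for item in feedback:
            if item["accepted"]:                # consensus filter
                current.append(item["change"])  # apply the accepted change
        versions.append(current)
    return versions


# Toy usage: two rounds, with one accepted and one declined suggestion.
rounds = [
    [{"change": "add sub-item A", "accepted": True},
     {"change": "rephrase Q5", "accepted": False}],
    [{"change": "define peer-tutoring", "accepted": True}],
]
versions = run_delphi_rounds(["initial item pool"], rounds)
print(len(versions))  # 3: the draft plus one version per round
```

The key property mirrored here is that each round operates only on the output of the previous round, so the instrument converges through controlled, documented revisions rather than a single editing pass.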

For the second and third Delphi rounds in the development of the GEndoQ, the experts were selected according to their professional competence [56]. The second Delphi round therefore consisted of one expert. This expert, among other qualifications, was a specialist in endodontology and an author of an existing publication on the same topic [3]. According to Powell et al., representativeness is assessed by the qualities of the expert panel rather than by its numbers [57]. A similar approach was taken for the third phase, in which the third Delphi round evaluated GEndoQ Version 3. Four panel experts were chosen: one specialist in endodontology, one graduate of a structured advanced training course with endodontology content and two participants in a master’s degree programme in endodontology. To date, no standardised number of experts has been recommended for the Delphi technique; the literature indicates that panel sizes have ranged from two experts to hundreds [47,56,58,59].

Comparing studies that have evaluated preclinical and clinical education in endodontics, only Al Raisi et al. [1] stated that they validated their questionnaire, which was based on the former paper survey developed by Qualtrough and Dummer in 1997 [7]. Unfortunately, the validation process was not part of the publication, since no modification had to be made to the almost two-decades-old instrument. It is therefore unknown by which method the instrument used by Al Raisi et al. was validated and why it was decided not to change anything. One might expect that modifications would have been necessary, because various guidelines for endodontic education were formulated after 1997 [9-11]. In addition, many changes have occurred with regard to endodontic equipment and materials in the last decades [14,15].

After generating GEndoQ Version 3, the instrument was complemented by 11 questions from the publication of Al Raisi et al., part of whose questionnaire was illustrated in their publication [1]. Their questionnaire was based on a former paper survey [7]. The expert group of the first phase could therefore align the questions with the questionnaire of Qualtrough and Dummer from 1997 [7]. This has made the GEndoQ more comparable to the studies mentioned above [2,4,5], since those studies were based on the same questionnaire. In addition, the questions and sub-items used by Al Raisi et al. concerning preclinical education in endodontics completed GEndoQ Version 4 after the additional literature review [25,40].

In the last phase, GEndoQ Version 4 and the study were presented in a PowerPoint presentation to an expert group consisting of 15 dentist panellists with different qualifications (Table 2). After the presentation, the panellists were asked to provide their feedback on GEndoQ Version 4 using the think-aloud method, and the verbal feedback was transcribed (Tables 13-14). This method asks the experts to verbalise their thoughts during the thought process [60]. The think-aloud method is a scientific method that has been used in various disciplines [61]. For example, Zahiri Esfahani et al. used the think-aloud method to measure the effectiveness, learnability, errors and efficiency of a picture archiving and communication system [35]. In contrast, Adams et al. explained the use of the think-aloud interview to create and validate a test [34]. Field et al. described that their questionnaire, which assessed the “pan-European practice in relation to curriculum content, teaching and learning strategies and assessment of preclinical dental skills”, had originally been piloted through think-aloud testing [62]. Therefore, the think-aloud method was an adequate method to gather feedback and make modifications to GEndoQ Version 4.

Transferability to national and international levels

The GEndoQ questions and sub-items were based on recent publications in order to make the results more comparable [1,2,4,7]. It is important to see how and whether dental faculties have developed in recent years and whether they have adapted their endodontic curricula to the contemporary guidelines of the ESE [9]. Standardisation and quality assurance in both endodontic training and general dentistry are necessary because, since 1981, qualifications for various health professions, including dentistry, have been mutually recognised throughout the European Union in accordance with EU Directive 81/1057/EEC [63].

Limitations

Concerning the literature review in the first phase, it has to be mentioned that a single database was used for the search. More relevant publications might have been found if additional databases had been used; however, this would have exceeded the time allowed for this project.

The GEndoQ is an instrument for evaluating preclinical education in endodontics at German-speaking dental faculties. The questions have been formulated in German. Therefore, an English version of the questionnaire would need to be validated after translation as well.

Finally, as the validation process is not yet complete, the final instrument should be field-tested. In future research, the GEndoQ will therefore be tested and applied in real dental faculty environments in order to refine it further and eliminate any weaknesses the questionnaire might have.

While this questionnaire has been generated as a paper-based survey, a web-based or mail survey would “offer anonymity and […] afford respondents time to complete the questionnaire at their own pace”, according to Cavana et al. [64]. Hence, implementation in online survey software could be the next step.

Conclusion

The GEndoQ is an instrument to assess preclinical education in endodontics, but its validity and reliability have not yet been fully assessed; this should be part of further research after field-testing the instrument. This study emphasises the significance of an instrument to assess preclinical education in endodontics. The goal is to make endodontic curricula comparable with each other on a national and international basis. The five phases illustrate the effort involved in the conception of our questionnaire. In general, the methodological development of such an instrument should not be underestimated.

  1. Al Raisi H, Dummer PMH, Vianna M (2019) How is Endodontics taught? A survey to evaluate undergraduate endodontic teaching in dental schools within the United Kingdom. Int Endod J 52: 1077-1085. Link: https://bit.ly/3bPDD9l
  2. Narayanaraopeta U, Alshwaimi E (2015) Preclinical endodontic teaching. A survey of Saudi dental schools. Saudi Med J 36: 94–100. Link: https://bit.ly/2TnmNrW  
  3. Sonntag D, Bärwald R, Hülsmann M, Stachniss V (2008) Pre-clinical endodontics: a survey amongst German dental schools. Int Endod J 41: 863-868. Link: https://bit.ly/2X9bfJW
  4. Petersson K, Olsson H, Söderström C, Fouilloux I, Jegat N, Lévy G (2002) Undergraduate education in endodontology at two European dental schools. Eur J Dent Educ 6: 176-181. Link: https://bit.ly/36gwe1I
  5. Cruz E V, Jimena MEM, Puzon EG, Iwaku M (2000) Endodontic teaching in Philippine dental schools. Int Endod J 33: 427-434. Link: https://bit.ly/3g5bVsN  
  6. Qualtrough AJ, Whitworth JM, Dummer PMH (1999) Preclinical endodontology: an international comparison. Int Endod J 32: 406-414. Link: https://bit.ly/3e2yvAg
  7. Qualtrough AJ, Dummer PMH (1997) Undergraduate endodontic teaching in the United Kingdom: an update. Int Endod J. 30: 234–239. Link: https://bit.ly/3e2PceX
  8. Gerhardt-Szép S (2014) Wie prüft man Endo-Inhalte NKLZ gerecht? [How to examine endodontic content in line with the NKLZ?] Link: https://bit.ly/2TqvWQH
  9. De Moor R, Hülsmann M, Kirkevang LL, Tanalp J, Whitworth J (2013) Undergraduate Curriculum Guidelines for Endodontology. Int Endod J 46: 1105-1114. Link: https://bit.ly/2ANQ88K
  10. Field JC, Cowpe JG, Walmsley AD (2017) The Graduating European Dentist: A New Undergraduate Curriculum Framework. Eur J Dent Educ 21: 2-10. Link: https://bit.ly/2TqJT0V
  11. Cowpe J, Plasschaert A, Harzer W, Vinkka-Puhakka H, Walmsley AD (2010) Profile and competences for the graduating European dentist-update 2009. Eur J Dent Educ 14: 193-202. Link: https://bit.ly/3dZ5iWS  
  12. European Society of Endodontology (2006) Quality guidelines for endodontic treatment: Consensus report of the European Society of Endodontology. Int Endod J 39: 921-930. Link: https://bit.ly/2TqC639
  13. Gluskin AH (2014) Endodontics Colleagues for Excellence: The Standard of Practice in Contemporary Endodontics. Link: https://bit.ly/2zQOa6Z
  14. Lee M, Winkler J, Hartwell G, Stewart J, Caine R (2009) Current Trends in Endodontic Practice: Emergency Treatments and Technological Armamentarium. J Endod 35: 35-39. Link: https://bit.ly/2TqJFH7  
  15. Lababidi EA (2013) Discuss the impact technological advances in equipment and materials have made on the delivery and outcome of endodontic treatment. Aust Endod J 39: 92–97. Link: https://bit.ly/3bWAfK9
  16. Scott J (2003) Dental Education in Europe: The Challenges of Variety. J Dent Educ 67: 69-78. Link: https://bit.ly/2XhoN6d  
  17. Savani GM, Sabbah W, Sedgley CM, Whitten B (2014) Current Trends in Endodontic Treatment by General Dental Practitioners: Report of a United States National Survey. J Endod 40: 618–624. Link: https://bit.ly/36gvR7k
  18. Ahmed MF, Elseed AI, Ibrahim YE (2000) Root canal treatment in general practice in Sudan. Int Endod J 33: 316-319. Link: https://bit.ly/3bJqZsA
  19. Zaugg LK, Savic A, Amato M, Amato J, Weiger R, Connert T (2019) Endodontic Treatment in Switzerland. A National Survey. Swiss Dent J 130: 18-29. Link: https://bit.ly/2XeIdZD
  20. Palmer NOA, Ahmed M, Grieveson B (2009) An investigation of current endodontic practice and training needs in primary care in the north west of England. Br Dent J 206: E22-585. Link: https://bit.ly/3e5IXqE
  21. Chughtai MA, Jamil B, Mahboob U (2019) Developing and validating a questionnaire to Measure Ethical Sensitivity of Freshly Graduated Dentists. JPMA 69: 518–522. Link: https://bit.ly/2Xgaqzd  
  22. Kossioni AE, Lyrakos G, Ntinalexi I, Varela R, Economu I (2014) The development and validation of a questionnaire to measure the clinical learning environment for undergraduate dental students (DECLEI). Eur J Dent Educ 18: 71–79. Link: https://bit.ly/2zf66YU
  23. Schleyer TK, Torres-Urquidy H, Straja S (2001) Validation of an instrument to measure dental students’ use of, knowledge about, and attitudes towards computers. J Dent Educ 65: 883–891. Link: https://bit.ly/2WMmIjQ
  24. Inglis A (2008) Approaches to the validation of quality frameworks for e-learning. Qual Assur Educ. 16: 347–362. Link: https://bit.ly/2LIvjhd  
  25. Artino AR, La Rochelle JS, Dezee KJ, Gehlbach H (2014) Developing questionnaires for educational research: AMEE Guide No. 87. Med Teach 36: 463-474. Link: https://bit.ly/36jMkYv
  26. Aalboe J,  Schumacher MM (2016) An Instrument to Measure Dental Students’ Communication Skills With Patients in Six Specific Circumstances: An Exploratory Factor Analysis. J Dent Educ 80: 58–64. Link: https://pubmed.ncbi.nlm.nih.gov/26729685/
  27. Da Costa ED, Pinelli C, Da Silva Tagliaferro E, Corrente JE, Ambrosano GMB (2017) Development and validation of a questionnaire to evaluate infection control in oral radiology. Dentomaxillofacial Radiol 46: 20160338. Link: https://bit.ly/3e6c5Ou  
  28. Strand P, Sjöborg K, Stalmeijer R, Wichmann-Hansen G, Jakobsson U, et al. ( 2013) Development and psychometric evaluation of the Undergraduate Clinical Education Environment Measure (UCEEM). Med Teach 35: 1014–1026. Link: https://bit.ly/2Xh5pGz  
  29. Nordin N, Deros BM, Wahab DA, Rahman MNA (2012) Validation of lean manufacturing implementation framework using Delphi technique. J Teknol Sciences Eng 59: 1–6. Link: https://bit.ly/3e6bZGN
  30. Baker J, Lovell K, Harris N (2006) How expert are the experts? An exploration of the concept of ‘expert’ within Delphi panel techniques. Nurse Res 14: 59-70. Link: https://bit.ly/3g6Xt3j
  31. Tigelaar DEH, Dolmans DHJM, Wolfhagen IHAP, Van Der Vleuten CPM (2004) The development and validation of a framework for teaching competencies in higher education. High Educ 48: 253–268. Link: https://bit.ly/2A04Ngo
  32. Roff S, McAleer S, Skinner A (2005) Development and validation of an instrument to measure the postgraduate clinical learning and teaching educational environment for hospital-based junior doctors in the UK. Med Teach 27: 326–331. Link: https://bit.ly/2Tqvj9N  
  33. Salcedo-Rocha AL, García-de-Alba-Garcia JE, Velásquez-Herrera JG, Barba-González EA (2011) Oral Health: Validation of a questionnaire of self-perception and self-care habits in Diabetes Mellitus 2, hypertensive and obese patients. The UISESS-B scale. Med Oral Patol Oral Cir Bucal 16: 834–839. Link: https://bit.ly/2ToK1OE  
  34. Adams WK, Wieman CE (2011) Development and validation of instruments to measure learning of expert-like thinking. Int J Sci Educ 33: 1289–1312. Link: https://bit.ly/3cO7Kzu
  35. Zahiri Esfahani M, Khajouei R, Baneshi MR (2018) Augmentation of the think aloud method with users’ perspectives for the selection of a picture archiving and communication system. J Biomed Inform 80: 43–51. Link: https://bit.ly/2AORWyj
  36. Yen PY, Bakken S (2009) A comparison of usability evaluation methods: heuristic evaluation versus end-user think-aloud protocol - an example from a web-based communication tool for nurse scheduling. AMIA Annu Symp Proc 714–718. Link: https://bit.ly/2LRrF4p  
  37. Odukoya OK, Chui MA (2012) Using Think Aloud Protocols to Assess E-Prescribing in Community Pharmacies. Innov Pharm 3(3): 88. Link: https://bit.ly/3e6c8d8
  38. Federal Ministry of Health (Bundesministerium für Gesundheit) (2019) Verordnung zur Neuregelung der zahnärztlichen Ausbildung (Ordinance on the reform of dental education). Link: https://bit.ly/2Tpg2Ge
  39. Association of Medical Faculties in Germany (Medizinischer Fakultätentag der Bundesrepublik Deutschland e.V.) (2015) Nationaler Kompetenzbasierter Lernzielkatalog Zahnmedizin (National Competence-Based Catalogue of Learning Objectives in Dentistry). Link: https://bit.ly/2zVNYU9
  40. Gehlbach H, Artino AR, Durning SJ (2010) AM Last Page: Survey Development Guidance for Medical Education Researchers. Acad Med 85: 925. Link: https://bit.ly/2zUCJLG
  41. Cronin P, Ryan F, Coughlan M (2008) Undertaking a literature review: a step-by-step approach. Br J Nurs 17: 38–43. Link: https://bit.ly/3bPoLb9
  42. Sim JH, Tong WT, Hong WH, Vadivelu J, Hassan H (2015) Development of an instrument to measure medical students’ perceptions of the assessment environment: Initial validation. Med Educ Online 20: 28612. Link: https://bit.ly/2zQNX3H
  43. Alotaibi G, Youssef A (2013) Development of an assessment tool to measure students’ perceptions of respiratory care education programs: Item generation, item reduction, and preliminary validation. J Fam Community Med 20: 116. Link: https://bit.ly/3bQA9mU
  44. Rucker R (2018) Development and preliminary validation of an ageism scale for dental students. Spec Care Dent 38: 31–35. Link: https://bit.ly/3cQbMr1  
  45. European Society of Endodontology (1992) Undergraduate Curriculum Guidelines For Endodontology. Int Endod J 25: 169–172. Link: https://bit.ly/3bKSAcJ
  46. European Society of Endodontology (2001) Undergraduate Curriculum Guidelines for Endodontology. Int Endod J 34: 574–580. Link: https://bit.ly/2ylhfXL
  47. Skulmoski GJ, Hartman FT, Krahn J (2007) The Delphi Method for Graduate Research. J Inf Technol Educ 6: 1–21. Link: https://bit.ly/2XflIDP
  48. Scientific Services German Bundestag (Wissenschaftliche Dienste Deutscher Bundestag) (2014) Elaboration: Aspects of the Delphi method (Ausarbeitung: Aspekte der Delphi-Methode). Link: https://bit.ly/3d2khQ1  
  49. Baden-Württemberg Cooperative State University (Duale Hochschule Baden-Württemberg). Arbeitspapier: Die Delphi-Methode (Working paper: The Delphi method). Project OPEN: Open Education in Nursing. Link: https://bit.ly/3bNrMc4
  50. Steurer J (2011) The Delphi method: An efficient procedure to generate knowledge. Skeletal Radiol 40: 959–961. Link: https://bit.ly/36fx135  
  51. McMillan SS, King M, Tully MP (2016) How to use the nominal group and Delphi techniques. Int J Clin Pharm 38: 655–662. Link: https://bit.ly/2WN4ej2
  52. Buckley CC (1994) Delphi technique supplies the classic result? Aust Libr J 43: 158–164. Link: https://bit.ly/2LRrgip
  53. Wilson RD (1975) Research priorities in social welfare library and information work. J Libr 7: 252-261. Link: https://bit.ly/36il8sZ
  54. Hsu CC, Sandford BA (2007) The Delphi technique: Making sense of consensus. Pract Assess Res Eval 12: 1–8. Link: https://bit.ly/3cPzhAD
  55. Thangaratinam S, Redman CW (2005) The Delphi technique. Obstet Gynaecol 7: 120–125.
  56. Hatcher T, Colton S (2007) Using the internet to improve HRD research: The case of the web-based Delphi research technique to achieve content validity of an HRD-oriented measurement. J Eur Ind Train 31: 570–587. Link: https://bit.ly/2Zo6nDN
  57. Powell C (2003) The Delphi technique: Myths and realities. J Adv Nurs 41: 376–382. Link: https://bit.ly/3cPyRKz  
  58. Grisham T (2009) The Delphi technique: a method for testing complex and multifaceted topics. Int J Manag Proj Bus 2: 112–130. Link: https://bit.ly/3e6bhJD
  59. Van Someren M, Barnard Y, Sandberg J (1994) The think aloud method: A practical guide to modelling cognitive processes. London: Academic Press. Link: https://bit.ly/3cRv6Ea
  60. Güss CD (2018) What is going through your mind? Thinking aloud as a method in cross-cultural psychology. Front Psychol 9: 1292. Link: https://bit.ly/3bTZXi1
  61. Field J (2018) Curriculum content and assessment of pre-clinical dental skills: A survey of undergraduate dental education in Europe. Eur J Dent Educ 22: 122–127. Link: https://bit.ly/3cSJJXV
  62. The Council of the European Communities (1981) Council Directive 81/1057/EEC of 14 December 1981 on the recognition of professional qualifications. Off J Eur Communities. Link: https://bit.ly/2LI9v5n
  63. Ab Latif R, Mohamed R, Dahlan A, Mat Nor MZ (2016) Using Delphi Technique: Making Sense of Consensus in Concept Mapping Structure and Multiple Choice Questions (MCQ). Educ Med J 8(3): 89–98. Link: https://bit.ly/2Zpg04P
© 2020 S Sacha, et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.