Writing for Publication: Transitions and Tools that Support Scholars’ Success - Mary Renck Jalongo, Olivia N. Saracho 2016

Evaluating Qualitative Studies
From Qualitative Research to a Journal Article
Conference Proposals and Article Types

Qualitative studies examine intricate phenomena. Well-designed and well-written studies can contribute to knowledge of the field and guide future research. Most journals provide guidelines that specify a structure to ensure that the published research is of high quality. McWilliam (2000) provides a summary of key indicators that help to evaluate the quality of such research. Use the checklist in Table 8.4 to assess the quality of a qualitative research report.

Table 8.4

Indicators of quality in qualitative research reports: a checklist

For each item, mark Yes or No. Does the qualitative report describe:

The theoretical background?

How the research questions were derived?

How the participants were selected?

The participants’ roles?

How the data were recorded?

The depth and duration of data collection?

How the data were reduced?

The steps for arriving at findings or themes?

How often and thoroughly the original data were consulted during analysis?

How participants or others contributed to verifying information?

The level of information (e.g., transcripts, summaries, manuscripts) used in member checks?

The relationship of the findings to theory and other studies?

A framework can be developed to assess any type of qualitative design. Tong, Sainsbury, and Craig (2007) developed the Consolidated Criteria for Reporting Qualitative Research (COREQ), a 32-item checklist that qualitative researchers use as a guide in their work. Table 8.5 is a checklist based on the COREQ.

Table 8.5

Checklist to evaluate qualitative studies

Item

Evaluation questions

Area 1: Research group

Personal qualities

1. Interviewer/organizer

Who conducted the interviews or organized the focus groups?

2. Qualifications

What were the researchers’ areas of expertise? (Knowledge of the subject area, methodologies, etc.)

3. Preparation and experience

What preparation and experience did the researchers have?

Association with participants

4. Establishing relationships

When was a relationship with the participants established?

5. Communication with participants

Were the participants informed about the researchers’ personal goals, assumptions, reasons for and interest in the study, and methods of conducting it?

Area 2: Research design

Theoretical framework

6. Methods and theory

What research methodology was used? Grounded theory, discourse analysis, ethnography, phenomenology, content analysis?

What theory was used to support the study?

Participants’ description

7. Selection

What process was used to select the participants? Purposive, convenience, consecutive, snowball?

8. Recruitment

What process was used to recruit participants? Personal contact, telephone, mail, email?

9. Selection criteria

What criteria were used to select the participants?

10. Non-participation and rejection

How many people declined to participate or dropped out, and for what reasons? How many volunteers were rejected, and for what reasons?

11. Sample size

How many participants were included in the study?

Background

12. Location for collecting data

Where were the data collected? Home, school, workplace, community?

13. Spectators

Who was present during data collection other than the participants and researchers?

14. Description of participants

What are the major characteristics of the participants? Were demographic data included?

Data collection

15. Interview schedule

What were the questions or prompts for the interviews? Were they open-ended or closed? How long were the interviews? Were they pilot tested?

16. Quantity of interviews

How many interviews were conducted?

17. Technology

What type of technology was used to collect data? Audio or video recording?

18. Recording of field notes

When were field notes recorded? During and/or after interviews or focus groups?

19. Data saturation

How was data saturation determined? (e.g., no new data, themes, or codes were emerging)

20. Sharing transcriptions

Were transcriptions shared with participants for comments and/or revision?

Area 3: Analyses and final report

Data analyses

21. Data coders

Were the coders trained? Were multiple coders used to establish validity and reliability? If so, how many?

22. Description of the coding system

Was there a description of what data were coded and how data were coded?

23. Identification of themes

Was the process of determining themes and generating codes from the data described?

24. Software

If software was used to code the data, was it described?

25. Member checks

Were data, analytic categories, interpretations, and conclusions tested with the participants to obtain feedback?

Final report

26. Quotes

Were quotes from participants used to support the themes/findings? How were quotes used and identified?

27. Correspondence of data and findings

Did the data presented match the findings?

28. Presentation of key themes

Were the key themes distinctly presented in the findings?

29. Presentation of secondary themes

Were divergent cases or minor themes described?

Adapted from Tong, Sainsbury, and Craig (2007)

The criteria included in the checklist can help researchers report important aspects of the research team, the study methods, the context of the study, the findings, and the analysis and interpretation.

Activity 8.5: Self-Evaluation of a Qualitative Research Report

Using Table 8.5 as a guide, write answers to each question for a published manuscript or one that you have written or are developing. Create a list of strengths and weaknesses and make a plan for addressing the flaws.