Sample size in qualitative research has been the subject of enduring discussion [4, 10, 11]. Whilst the quantitative research community has established relatively straightforward statistics-based rules to set sample sizes precisely, the intricacies of qualitative sample size determination and assessment arise from the methodological, theoretical, epistemological, and ideological pluralism that characterises qualitative inquiry (for a discussion focused on the discipline of psychology see [12]). This pluralism militates against clear-cut guidelines that can be invariably applied. Despite these challenges, various conceptual developments have sought to address this issue with guidance and principles [4, 10, 11, 13,14,15,16,17,18,19,20], and more recently, an evidence-based approach to sample size determination has sought to ground the discussion empirically [21,22,23,24,25,26,27,28,29,30,31,32,33,34,35].
Pragmatic approaches to qualitative analysis are likely valuable for IS researchers yet have not received enough attention in the IS literature to support researchers in using them confidently. By pragmatic approaches, we mean strategic combining and borrowing from established qualitative approaches to meet the needs of a given IS study, often with guidance from an IS framework and with clear research and practice change goals. Pragmatic approaches are not new, but they receive less attention in qualitative research overall and are not always clearly explicated in the literature [9]. Part of the challenge in using pragmatic approaches is the lack of guidance on how to mix and match components of established approaches in a coherent, credible manner.
Framework analysis comes from the policy sphere and tends to have a practical orientation; this applied nature typically includes a more structured and deductive approach. The history, philosophical assumptions, and core processes are richly described by Ritchie and Spencer [36]. Framework analysis entails several features common to many qualitative analytic approaches, including defining concepts, creating typologies, and identifying patterns and relationships, but does so in a more predefined and structured way [37, 38]. For example, the research team can create codes based on a framework selected in advance and can also include open-ended inquiry to capture additional insights. This analytic approach is well-suited to multi-disciplinary teams whose members have varying levels of experience with qualitative research [37]. It may require fewer staff resources and less time than some other approaches.
The framework analysis process includes five key steps. Step 1 is familiarization: Team members immerse themselves in the data, e.g., reading, taking notes, and listening to audio. Step 2 is identifying a coding framework: The research team develops a coding scheme, typically using an iterative process primarily driven by deductive coding (e.g., based on the IS framework). Step 3 is indexing: The team applies the coding structure to the entire data set. Step 4 is charting: The team rearranges the coded data and compares patterns between and within cases. Step 5 is mapping and interpretation: The team looks at the range and nature of relationships across and between codes [36, 39, 40]. The team can use tables and diagrams to systematically synthesize and display the data based on predetermined concepts, frameworks, or areas of interest. While more structured than other approaches, framework analysis still presents a flexible design that combines well with other analytic approaches to achieve study objectives [37]. The case example given in section 3 offers a detailed application of a modified framework analytic approach.
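To make the charting and mapping steps concrete, the following is a minimal, hypothetical sketch of Step 4 (charting): rearranging coded excerpts into a framework matrix of cases (rows) by codes (columns) so that patterns can be compared within and between cases. The clinic names, codes, and excerpts are invented for illustration and do not come from any study described here.

```python
# Hypothetical Step 4 (charting) sketch: coded excerpts are rearranged
# into a framework matrix of cases (rows) x codes (columns).
# All cases, codes, and excerpts below are invented for illustration.

coded_excerpts = [
    ("Clinic A", "barriers", "Staff lacked time for screening."),
    ("Clinic A", "facilitators", "Leadership championed the program."),
    ("Clinic B", "barriers", "EMR alerts were frequently ignored."),
    ("Clinic B", "barriers", "No dedicated coordinator."),
]

codes = ["barriers", "facilitators"]
cases = sorted({case for case, _, _ in coded_excerpts})

# Build the matrix: each cell collects all excerpts for that case/code pair.
matrix = {case: {code: [] for code in codes} for case in cases}
for case, code, excerpt in coded_excerpts:
    matrix[case][code].append(excerpt)

# Display the chart for within- and between-case comparison (input to Step 5).
for case in cases:
    for code in codes:
        cell = "; ".join(matrix[case][code]) or "-"
        print(f"{case:10s} | {code:12s} | {cell}")
```

In practice this charting is usually done in qualitative analysis software or a spreadsheet rather than code; the sketch simply shows the underlying data rearrangement.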
Building on the discussion of pragmatic combination of approaches for a given study, we turn now to the question of ensuring and communicating rigor so that consumers of the scientific products will feel confident assessing, interpreting, and engaging with the findings [46]. This is of particular importance for IS given that the field tends to emphasize quantitative methods and there may be perceptions that qualitative research (and particularly research that must be completed more quickly) is less rigorous. To address those field-specific concerns and ensure pragmatic approaches are understood and valued, IS researchers must ensure and communicate the rigor of their approach. Given journal constraints, authors may consider using supplementary files to offer rich details to describe the study context and details of coding and analysis procedures (see for example, Aveling et al. [47]). We build on the work of Mays and Pope [38], Tracy [8], and others [48,49,50,51,52] to offer a shortlist of considerations for IS researchers to ensure pragmatic analysis is conducted with rigor and its quality and credibility are communicated (Table 1). We also recommend these articles as valuable resources for further reading.
We encourage IS researchers to explore the diversity and flexibility of qualitative analytic approaches and combine them pragmatically to best meet their needs. We recognize that some approaches to analysis are tied to particular methodological orientations and others are not, but a pragmatic approach can offer the opportunity to combine analytic strategies and procedures. To do this successfully, it is essential for the research team to ensure fit, preserve quality and rigor, and provide transparent explanations connecting the analytic approach and findings so that others can assess and build on the research. We believe pragmatic approaches offer an important opportunity to make strategic analytic decisions, such as identifying an appropriate balance of insider and outsider perspectives, to extend current IS frameworks and models. Given the urgency to increase the utilization and utility of EBIs in practice settings, we see a natural fit with the pragmatist prompt to judge our research efforts based on whether or not the knowledge obtained serves our purposes [63]. In that spirit, the use of pragmatic approaches can support high-quality, efficient, practice-focused research, which can broaden the scope and ultimate impact of IS research.
Because there have been few published examples of qualitative approaches and methods using RE-AIM for planning or assessment and no guidance on how qualitative approaches can inform these processes, we provide guidance on qualitative methods to address the RE-AIM model and its various dimensions. The intended audience is researchers interested in applying RE-AIM or similar implementation models, but the methods discussed should also be relevant to those in community or clinical settings.
Qualitative measures are of value in RE-AIM (and other planning and evaluation approaches) for several reasons. First, some questions simply cannot be answered with quantitative data: pulling data from an electronic medical record (EMR), analyzing a survey scale, or counting does not work for some questions, or the data are too expensive to collect feasibly. Second, qualitative data provide answers to not just what happened, but why and how; they can illuminate patterns of results and why and how results were obtained for various outcomes, including unintended effects. Third, they provide diverse and multiple assessment methods that offer convergent validity for quantitative results. Fourth, they can engage participants in a collaborative manner and incorporate their input to a program or policy in a way that quantitative approaches do not. Finally, as with many research questions and evaluation approaches, having both quantitative and qualitative methods for RE-AIM dimensions allows these methods to iteratively inform each other. This should enhance understanding and lessons learned, ultimately leading to better dissemination of evidence-based approaches into practice.
The purpose of this article is to summarize and recommend qualitative approaches to address the RE-AIM model and its various dimensions. We provide guidance for researchers and community groups that wish to use qualitative methods in their RE-AIM applications.
Qualitative research provides meaning and understanding. It is utilized in both exploratory and explanatory research, in contrast to quantitative methods that utilize numbers and address statistical outcomes. In general, qualitative methods help explain how and why results occur on individual RE-AIM dimensions, or why patterns of results emerge across dimensions (e.g., high reach and low effectiveness). A wide variety of qualitative techniques and approaches can be used to address RE-AIM issues. As the focus of this paper is not to provide a comprehensive description of qualitative data collection and analysis methods, we refer the reader to excellent texts [13,14,15,16]. Rather than using a single strategy, the methods selected should be tailored to the setting, research questions, and resources available. Table 1 provides simple translational questions that can be used to inquire about RE-AIM issues by and with clinicians and community members [17]. In summary, a variety of methods are conducive to qualitative exploration of RE-AIM dimensions, including interviews, observations, focus groups, photovoice, digital storytelling, and ethnography. Analysis methods are also varied and depend on the research or evaluation issue and question; choices include grounded theory, thematic analysis, matrix analysis, and immersion-crystallization. Below we describe how qualitative methods can be used to address each RE-AIM dimension and the key issues involved. Table 2 provides examples of questions and possible qualitative methods for each RE-AIM dimension.
Standard means of assessing reach are to describe the number and percent of participants who take part in a desired initiative. From a qualitative perspective, the key issues concerning reach are understanding why people accept or decline participation and describing characteristics of participants versus non-participants that are not available from quantitative data or records. For example, if the desired goal is to reach all patients with diabetes and a hemoglobin A1c level over 8, the quantitative measure of reach would be the number or percent participating in the initiative out of the total eligible. Knowing that 25% of patients are participating provides insight into the degree of penetration achieved by the initiative, but does not help to understand the circumstances and characteristics of the reached population that distinguish them from non-participants. Often, quantitative approaches have been used to describe reach in terms of the demographics of the reached versus non-reached population. For example, perhaps the reach was 25%, but three quarters of the participants were female, Caucasian, and privately insured; the program is thus largely missing male, non-Caucasian, and Medicaid-insured patients. These data represent identifiable characteristics of participants that provide a more comprehensive picture of who is missing. However, there are often characteristics that affect participation versus non-participation that are not routinely collected, not readily available from EMRs or other databases, or not easy to quantify. Perhaps reach is limited by factors such as lack of trust in health care providers, disinterest in taking medication, or social-determinant barriers faced by non-participants, such as lack of transportation or family support. These factors are difficult to ascertain without qualitative inquiry.
To thoroughly understand reach, it is often necessary to conduct more in-depth and qualitative work to identify root cause issues of suboptimal reach.
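The quantitative side of the diabetes example above can be sketched as a short calculation: reach as the percentage of eligible patients who participate, followed by stratification of participants by a recorded characteristic. All counts and category names below are invented for illustration.

```python
# Hypothetical sketch of the quantitative reach calculation from the
# diabetes example. All counts and categories are invented.
from collections import Counter

eligible = 400        # patients with diabetes and A1c over 8
participants = 100    # of those, enrolled in the initiative

reach = participants / eligible * 100
print(f"Reach: {reach:.0f}%")

# Stratify participants by a recorded characteristic (e.g., insurance)
# to see who is being missed, as in the 25%-reach example above.
insurance = ["private"] * 75 + ["medicaid"] * 25
by_insurance = Counter(insurance)
for group, n in sorted(by_insurance.items()):
    print(f"{group}: {n / participants:.0%} of participants")
```

As the surrounding text notes, this kind of calculation shows who is missing but not why; the "why" requires qualitative inquiry.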