University of Toronto Assignment: Intro to Health Research Methods Paper

Please answer the following questions about the attached article.

Read the attached program-evaluation research article:

Kadomoto, N., Iwasa, H., Takahashi, M., Dulnuan, M.M., & Kai, I. (2011). “Ifugao males, learning and teaching for the improvement of maternal and child health status in the Philippines: an evaluation of a program.” BMC Public Health, 11:280-289.



Kadomoto et alia 2011 Males Learning & Teaching MCH.pdf


Then, answer these questions about this health-promotion intervention outcomes-evaluation study:

  1. Why do the authors report drop-out rates for the 2 groups of men in this maternal-health training program? …and what are the implications for study findings of the fact that drop-outs had lower knowledge scores at Time 1 (before entering the training program) than did participants who completed the program?
  2. What effect might the use of the English language in the maternal-health educational materials and training lecture have had on the effectiveness of this intervention? …and why is that?
  3. Participants in Group 1 were taught by a trainer who was not from their own village (barangay), but those who taught the program to Group 2 participants were residents of that barangay. How and why might this have affected the outcome measure on men’s attitudes toward involvement in pregnancy, delivery, and post-natal maternal health? What do the outcomes data show about this?
  4. Although this was an outcomes evaluation (testing the effectiveness of the training provided), what changes did researchers suggest for improving the maternal-health intervention program which are more typical of a process evaluation? …and why?







Secondary, “Comparative”, & “Critical” Research; “Action” Research & CBPR; Health Services/Systems Research; and “(Program/Policy) Evaluation Research”


Daniel Schluter, PhD

Specialty Areas of Research



(not really different research methods or separate types of research)

  • “Nonreactive Research”


  • “Comparative Research”

  • “Critical Research”

  • “Action Research”

  • Health Services/Systems Research

  • “(Program/Policy) Evaluation Research”

Basically, there are two things going on here…

  • unusual examples/variations of the usual research methods, or
  • specific areas of interest, studied with the usual research methods


What is “Non-Reactive Research”?


…Topics of Study? …Approaches/Orientations? …Strategies? …Study Designs?


The textbook defines “nonreactive research” as that which uses methods that either…

  • involve indirect data collection (neither directly observing activities/events nor asking participants about experiences/opinions), or
  • don’t allow participants being studied to react to researchers’ enquiries/observations.


This includes two types of studies that we have discussed before:

  • Secondary Research (quantitative)
  • Content Analysis (qualitative)

There are many, many sources of data for secondary research:

  • Data collected by public institutions (official statistics)
    • hospitals, police, coroner (births, deaths, stabbings, etc.), Stats Canada
  • Data collected by private individuals or companies
    • patient records, employment files, claims for on-the-job accidents, etc.
  • Data collected for another study that can be re-used
    • by other researchers, Health Canada, CIHR, etc.


Advantages: cheap (cost-effective), less work


Concerns:

  • The necessary information may be unavailable or incomplete.
  • “Equivalence of Concepts” may be difficult to establish.
    • Those who collected the data may not have had the same concept in mind.
    • There is no universal definition of many “phenomena” (e.g., injury accident, criminal assault, infant mortality, child neglect, spousal abuse, or health generally), so define concepts carefully and fully.

Content Analysis

  • Research conducted to examine the content of communications (messages that are conveyed by written text, spoken word, or even visual images)

  • Involves analysis of data derived from
    • media sources (magazines, newspapers, films, television programs, etc.);
    • political messages (policy statements, party pamphlets, campaign speeches, etc.);
    • procedural documents (meeting agendas/minutes, textbooks, instruction manuals, guidebooks, protocols for organizations’ activities, training materials, etc.); or
    • personal accounts (diaries, war stories, memoirs, personal letters, etc.)
  • The unit of analysis is not a person or group, but the individual items studied (e.g., magazine articles, speeches given, diary entries, recipes, pamphlets)




Data are each “piece of information” or “unit of meaning” communicated (an idea conveyed, an instruction given, a sentence: a statement made or a “line of dialogue”).

  • May be qualitative (analyzing themes, presence/absence of characteristics) or quantitative (enumerating/analyzing the frequency of things mentioned, positions taken, etc.); a minimal frequency-counting sketch follows below.
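For the quantitative option, the counting itself is simple to automate. Below is a minimal sketch (in Python) of tallying how many documents mention each coded category; the categories and keywords are purely illustrative and not part of the lecture.

```python
from collections import Counter

# Hypothetical coding scheme: category -> keywords that count as a mention.
# Categories and keywords are illustrative only, not from the lecture.
CODES = {
    "prenatal_care": ["prenatal", "antenatal", "check-up"],
    "nutrition": ["diet", "nutrition", "iron"],
    "danger_signs": ["bleeding", "fever", "danger sign"],
}

def code_frequencies(documents):
    """Count how many documents mention each coded category at least once."""
    counts = Counter()
    for text in documents:
        lowered = text.lower()
        for category, keywords in CODES.items():
            if any(kw in lowered for kw in keywords):
                counts[category] += 1
    return counts

if __name__ == "__main__":
    sample_docs = [
        "Mothers should attend a prenatal check-up every month.",
        "A balanced diet rich in iron helps during pregnancy.",
        "Heavy bleeding is a danger sign requiring immediate care.",
    ]
    print(code_frequencies(sample_docs))
```

The same tally could of course be produced by hand-coding; the point is only that quantitative content analysis reduces each document to counts over a predefined coding scheme.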



“Comparative” Research

really means: Cross-Cultural or Historical Studies

  • look at similarities and differences between cultures or within the same culture over time
  • commonly employ secondary data, official statistics, including historical documents, diaries, memoirs, or other types of published information
  • can do primary research involving field studies, experiments, surveys, etc.

Advantages: multivariate analysis is often possible, even easy


Concerns:

  • How researchers go about sampling countries/cultures to be compared requires explanation.
  • Causality can be difficult to demonstrate, and may differ by group or country (individual sanitation practices vs. sanitation infrastructure, for instance).
  • “Equivalence of Indicators” will be a concern.
    • It can be challenging to establish the validity of the indicators.
    • Evidence collected in different countries/cultures is rarely based on identical definitions or data-collection procedures, so use trend variables rather than absolute measures (a small sketch follows below).
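To make the trend-variable point concrete, here is a minimal sketch (with invented numbers, not real statistics) that converts absolute country measures into within-country trends before any cross-country comparison is made.

```python
# Illustrative only: convert absolute yearly measures into within-country
# percent change, so countries whose indicators are defined or collected
# differently can still be compared on their trends rather than their levels.
infant_mortality = {          # hypothetical rates per 1,000 live births
    "Country A": {2000: 40.0, 2010: 28.0},
    "Country B": {2000: 12.0, 2010: 9.0},
}

def percent_change(series, start, end):
    """Relative change between two years for one country's series."""
    return (series[end] - series[start]) / series[start] * 100

for country, series in infant_mortality.items():
    trend = percent_change(series, 2000, 2010)
    print(f"{country}: {trend:+.1f}% change, 2000-2010")
```

Here Country A's absolute rate is far higher than Country B's, but on the trend measure the two are roughly comparable, which is exactly why trends are the safer basis for comparison.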

“Critical” Research


We live in a world in which knowledge is used to maintain oppressive relations. Information is interpreted and organized in such a way that the views of a small group of people are presented as objective knowledge, as “The Truth.”

Kirby & McKenna 1989, Experience Research Social Change: Methods from the Margins (Toronto: Garamond Press)

Research, which so far has been largely the instrument of dominance and legitimation of power elites, must be brought to serve the interests of dominated, exploited, and oppressed groups.

Maria Mies 1984, “Towards a Methodology for Feminist Research”, Ch. 54 in German Feminism: Readings in Politics and Literature, E.H. Altbach, J. Clausen, D. Schultz, & N. Stephan, editors (Albany, NY, USA: State University of New York Press)

• Based on the social-change orientation of the “critical approach”

  • Sees inequality as rooted in exploitative social relations
  • Advocates for greater equality
  • Advocates for sociopolitical and structural change

• Examples of “critical research”

  • Marxism
  • Feminism
  • Racial/Ethnic/Minority Studies
  • Rights-Based Research
  • Disability/Diversity Studies
  • Post-Modernist Research

“Feminist” Research

Feminists share many views but may differ on key issues; hence, there are multiple feminisms/feminist approaches.

Shared assumptions and orientations:

  • Value women and their experiences, ideas, and needs
  • See phenomena from the perspective of women
  • Recognize the existence of conditions that oppress women
  • Desire to change conditions through research leading to political action

Shared methods:

  • Uses a variety of quantitative and qualitative methods, but more emphasis on qualitative methods including in-depth interviews, oral histories, comparative and field studies
  • Emphasis on subjectivity and the personal experiences of the participants
  • Data are analyzed within the context of women’s lives in such a way that women are empowered rather than portrayed in ways that stereotype them
  • Encourage women to report their experiences in their own voices
  • Allow for a structural analysis of the conditions of women’s lives with a goal of improving the conditions


Core Themes Common to All “Critical” Approaches

Since knowledge=power…

1. Research process/results should have the potential to benefit marginalized groups.
2. Research topics should be ones that concern disadvantaged, oppressed, or marginalized groups.
3. Researchers’ assumptions must be made explicit and be openly examined.
4. Prior scholarship must be critiqued to expose its hidden assumptions and biases.
5. Researchers and participants should interact as collaborators.





“Research from the margins is not research on people from the margins, but research by, for, and with them.”  (Kirby & McKenna 1989)











Which Research Methods / Study Designs Should Be Used for “Critical Research”?

Many critical researchers reject positivist assumptions…

  • (i.e.: that research and knowledge are objective or value free).
  • Hence, some reject quantitative methods.
  • Others agree with the critique of positivism but believe that quantitative research is useful for exposing social inequality.
    • Because “numbers” can have more impact, leading to policy change
    • Activists want to show that their social/political/economic/health interventions both have a positive impact AND are cost-effective.

  • So, “critical theorists” may employ either quantitative or qualitative methods, sometimes both.
    • Many emphasize mixed-method research designs or “methodological triangulation”.
    • Use “stories and numbers”: qualitative + quantitative.


(Community-Based) (Participatory) “Action Research”: CBR/PAR/CBPAR



Action Research proceeds in cyclical stages that involve planning, implementing, reflecting, and evaluating action steps to improve some situation, and it involves collaboration between researchers and participants throughout the entire process.

• Action research:

  • is educative
  • deals with individuals as members of social groups
  • is problem-focused, context-specific, and future-oriented
  • involves an intervention designed to create change

• Action research:

  • aims at improvement and involvement (“empowerment”)
  • involves a cyclical process that interlinks prevention, social/medical-service intervention, program evaluation, and social/political action
  • is founded on a research relationship in which all those involved are participants in the change process
  • has three elements:
    • research
    • adult education
    • sociopolitical action



Methods of Data Collection for Action Research

  • oral history or “life history” interviewing (multiple, in-depth interviews with structured/semi-structured interview guides)
  • participant observation
  • focus group interviews
  • document collection (from government, community organizations, etc.)

can also involve…

  • quantitative analysis (surveys)
  • secondary data analysis (of government statistics, data from community organizations, etc.)


Steps in Action Research
  • Entry into the community
  • Assessment of the situation and goals of participants
  • Planning for research and required/desirable action
  • Implementation of research plan
  • Evaluation of the research implementation and reflection on its successfulness, usefulness for prompting action
  • Reporting and reassessment of the situation
  • Planning future research and future action



Role of the Researcher

Non-hierarchical, reciprocal relationship

  • Views self as a partner with participants
  • Be vulnerable, share experiences and emotions with participants as a team member
  • Reflexivity is expected of the researcher
    • Think about own role in creating/resolving the situation
    • Concerns about drawing unwanted attention to the issue



To the greatest extent possible…

  • Participants should be made active partners in the planning, data-collection, and data-analysis processes.


Reporting Findings
  • Use descriptive, non-sexist language.
  • Portray participants as people, in their own voices.
  • Provide a structural analysis of the everyday lives of participants.
  • Avoid academic jargon; make findings accessible to all people, not just academics.
  • Include an analysis of the role of the researcher.


Strengths of & Challenges in Conducting “Action Research”



Strengths

  • Involving community members as members of the research team ensures that the work will be useful to the community.
  • Offers opportunities for researchers to simultaneously plan, implement, and evaluate.
  • Researchers gain practical knowledge, while participants gain research skills, political knowledge, and a sense of empowerment.
  • Understanding and respecting diversity of values, abilities, and perspectives among community members and researchers builds good will (for science & society).

Challenges/Weaknesses

  • Involving community members as members of the research team can be difficult/complicated, as it requires extra training by researchers.
  • Ensuring that all members of the research team are sensitive and responsive to the needs of different forms and types of leadership at different stages of the research process is stressful and “political”.
  • Conducting Action Research takes a great deal of time, patience, and tact.


Health-Systems & Health Services Research

  • Healthcare Quality & Service Improvement (waiting lists, wait times, provider skills, delivery systems, location and accessibility of services)
  • Focus on Assuring “Stakeholder Interests” (including patients, practitioners, healthcare organizations, government/regulators, insurance companies, and the public)
  • Basis for Policy Decisions & Implementation Plans (…how to prevent disease/injury, …when to intervene to provide care)
  • Promoting/Ensuring Adherence to Treatments; Developing, Testing, & Changing “Standards of Care” (“best practices”, “evidence-based medicine”)









“(Program/Policy) Evaluation Research”



(Health Promotion Campaigns, Health Policies, Prevention/Intervention Programs)


What is it?



  • systematically investigating the effectiveness of (pre/inter)vention programs and procedures that are designed to improve various health conditions




    (Health Promotion & Disease Prevention or “HPDP”)



  • examination of a specific policy, program, or project —aimed at assessing its merit/value/worth, relevance, or contribution



  • research intended to assess the effectiveness, practicality, or suitability of policy implementation in particular areas or for specific groups




    (policy-analysis research)


Why do it?



  • to determine the effectiveness of a (new) program:
    • Did it achieve its objectives?
    • to assess effects by subgroups
    • to identify ways of improving it
    • to expand it / restrict it
  • to satisfy funder requirements:
    • accountability (“money trail”)
    • efficiency (“money well-spent”)
  • for “political”/policy reasons, for “PR” (public relations):
    • to justify program’s existence, ensure continued funding
    • to calm/address public concerns

  • Formative Evaluation (conducted in the program-planning stages to provide insights into the problem that help to guide program design and development)
    • Needs Assessment for diagnosing the situation, determining value to participants
    • “Best Practices” / protocol to be followed
    • pretesting (communication programs, data-collection tools, etc.)
    • establishing a baseline level against which to evaluate program success/impact

  • Process/Implementation Evaluation (conducted during the course of activity to assess quality, accessibility, reach, and successful implementation of the program/project/policy)
    • Is it being implemented well? …as planned?
    • What improvements/revisions can be made?
    • monitoring of service utilization
    • monitoring of behaviour change and/or health status

  • Summative/Outcomes Evaluation (conducted at the end of project, program, or policy implementation to determine impact and cost-effectiveness)
    • program-based measures of client satisfaction, # of services provided, etc.
    • population-based measures to show reduced incidence of disease, increased protection against infections, etc.; a minimal pre/post comparison sketch follows below
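To give a flavour of a population-based outcome comparison, here is a minimal sketch with invented numbers (they are not data from the Kadomoto et al. study) contrasting an incidence measure before and after a program period.

```python
# Illustrative only: compare a population-based outcome measure before and
# after an intervention period. All numbers are invented for the sketch.
def incidence_per_1000(cases, population):
    """Crude incidence rate per 1,000 population."""
    return cases / population * 1000

before = incidence_per_1000(cases=90, population=12_000)   # pre-program period
after = incidence_per_1000(cases=60, population=12_500)    # post-program period

absolute_change = after - before
relative_change = (after - before) / before * 100

print(f"Before: {before:.1f} per 1,000; After: {after:.1f} per 1,000")
print(f"Change: {absolute_change:+.1f} per 1,000 ({relative_change:+.1f}%)")
```

A real summative evaluation would, of course, also need a comparison group or other design features to attribute the change to the program rather than to secular trends.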



Key Components

  • Literature Review
    • present background information: “Environmental Scan”
    • identify need for the intervention/program/policy: “Needs Assessment”
    • provide a review of what is known about the impact of such projects
  • Evaluability Assessment
    • describe what can be assessed and how it should be done
    • prototype, pretest data-collection instrument, tool, or suggested approach
    • program planning tool for timeline and deliverables
  • “Logic Model” / “Intervention Model”
    • clarify program goals and hoped-for impact from a systems perspective
    • illustrate the flow/process involved (environment, inputs, activities, outputs); a small data-structure sketch follows below
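To make the logic-model components concrete, here is a minimal sketch capturing the environment/inputs/activities/outputs flow as a simple data structure; the example entries are hypothetical (loosely inspired by a maternal-health training program) and are not taken from the lecture.

```python
from dataclasses import dataclass, field

# Illustrative only: a logic model captured as a simple data structure,
# following the environment -> inputs -> activities -> outputs flow.
@dataclass
class LogicModel:
    environment: list[str] = field(default_factory=list)  # context the program operates in
    inputs: list[str] = field(default_factory=list)       # resources invested
    activities: list[str] = field(default_factory=list)   # what the program does
    outputs: list[str] = field(default_factory=list)      # direct products of activities

maternal_health_training = LogicModel(
    environment=["rural villages", "existing health-centre network"],
    inputs=["trainers", "educational materials", "funding"],
    activities=["lectures for men", "peer teaching sessions"],
    outputs=["number of sessions held", "number of men trained"],
)
print(maternal_health_training.activities)
```

Writing the model down this explicitly makes it easier to decide, later, which component each evaluation measure is actually testing.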


Methods for Evaluation Research


(both “ask” and “observe” options)

Interviews

  • intake and exit interviews with program participants / clients
  • interviews with agency staff, directors, volunteers
  • stakeholders’ assessments of services/needs met (relevance)

Tests & Simulations

  • direct assessments of clients’ knowledge, skill, awareness in areas targeted by the program (nutrition, self-care, health habits, injury-prevention, etc.)
  • “simulated client” testing to assess staff training

Surveys

  • client surveys to document needs, attitudes, effects, outcomes, and satisfaction
  • provider surveys to assess effectiveness, efficiency, difficulties in program delivery


Secondary Analysis

  • project documents

    • client assessments (ongoing notation by staff of needs, success, etc.)
    • activity reports (for staff, clients, family members, lay health advisers)
  • service statistics, agency ratings
  • program coverage data


Key Factors to Consider:


Circumstances:

  • characteristics of target population
  • political context (conflict over goals)

Program Structure:

  • scope, duration of activities
  • types of services provided
  • number and location of service sites

Program Goals:

  • action strategies, activities undertaken
  • stage of program development

Resources Available:

  • funding, staffing, time/efforts
  • equipment, facilities

“SWOT” Analysis:


  • Strengths/Weaknesses/Opportunities/Threats

  • alt: Vulnerabilities, Challenges, Risks, Rewards

Inputs:

  • funding, monetary resources
  • administrators, staff, volunteer time
  • facilities, equipment available

Process:

  • set of activities conducted in the program (including recruitment, retention efforts)

  • communication strategies used
  • accessibility and quality of services

Outputs:

  • number of events held, services offered
  • products/services delivered

Outcomes:

  • Initial (such as psycho-social benefits)
  • Intermediate (such as behaviour changes)
  • Long-Term (such as overall health status)


Outcome Measures



Timing Issues

  • Initial
    • antecedents to behaviour
    • knowledge, attitudes, efficacy
  • Intermediate
    • change in behaviours
    • exercise, diet, condom use
  • Long-term
    • health status
    • fertility rates
    • death rates



Confusions/Conclusions


  • Program Failure:
    • Program is implemented as planned, but…
    • Intervention doesn’t produce intermediate results, and/or
    • Doesn’t lead to desired long-term outcomes.

  • Implementation Failure:
    • Program is not implemented properly (whether as planned or not), and therefore
    • Intervention cannot be truly tested.


Common Problems / Assessment Issues

  • dealing with the “politics” of program operations

    • strategic planning
    • stakeholder input
    • reach of program/policy

  • fidelity to program design

    • strategies, success of recruitment and retention
    • dose intended vs. dose delivered/received
    • effects of program changes, mid-course



  • practical constraints in evaluation research

    • budgets and timeline
    • data available/collectable
    • political and social context

  • balancing tensions between

    • (scientific) soundness,
    • practicality, and
    • utility of results (for decision-makers)


 
