RESEARCH METHODOLOGY: a step-by-step guide for beginners – Ranjit Kumar


RESEARCH METHODOLOGY: a step-by-step guide for beginners
Ranjit Kumar, 1999

SAGE Publications Ltd
1 Oliver’s Yard
55 City Road
London EC1Y 1SP

Research methodology is taught as a supporting subject in many academic disciplines, such as health, education, psychology, social work, nursing, public health, library studies and marketing research. A research design is a plan, structure and strategy of investigation so conceived as to obtain answers to research questions or problems. ‘Formulating a research problem’ is the first operational step in the research process. To formulate a ‘good’ research problem you need, in my opinion, to know how to review the literature, formulate a research problem, deal with variables and their measurement, and construct hypotheses. Hence, under this step, there are four chapters. The information they provide will enable you to formulate a researchable problem.

These chapters are titled: ‘Reviewing the literature’, ‘Formulating a research problem’, ‘Identifying variables’ and ‘Constructing hypotheses’. Similarly, for operational step III, ‘Constructing an instrument for data collection’, the chapters titled ‘Selecting a method of data collection’, ‘Collecting data using attitudinal scales’ and ‘Establishing the validity and reliability of a research instrument’ provide sufficient information for you to develop an instrument for data collection for your study. For every aspect of each step, a smorgasbord of methods, models, techniques and procedures is provided for both quantitative and qualitative studies, in order for you to build your knowledge base in research methodology.

Your philosophical orientation may stem from one of several paradigms and approaches in research – positivist, interpretive, phenomenological, action or participatory, feminist, qualitative, quantitative – and from the academic discipline in which you have been trained. The concept of ‘validity’ can be applied to any aspect of the research process; it ensures that in a research study correct procedures have been applied to find answers to a question. ‘Reliability’ refers to the quality of a measurement procedure that provides repeatability and accuracy. ‘Unbiased and objective’ means that you have taken each step in an unbiased manner and drawn each conclusion to the best of your ability and without introducing your own vested interest. The author makes a distinction between bias and subjectivity. Subjectivity is an integral part of your way of thinking that is ‘conditioned’ by your educational background, discipline, philosophy, experience and skills. For example, a psychologist may look at a piece of information differently from the way in which an anthropologist or a historian looks at it. Bias, on the other hand, is a deliberate attempt either to conceal or to highlight something. Adherence to these three criteria – validity, reliability and lack of bias – enables the process to be called ‘research’.

The research process must have certain characteristics: it must, as far as possible, be controlled, rigorous, systematic, valid and verifiable, empirical and critical.

Controlled – In real life there are many factors that affect an outcome. A particular event is seldom the result of a one-to-one relationship. Some relationships are more complex than others. Most outcomes are a sequel to the interplay of a multiplicity of relationships and interacting factors. In a study of cause-and-effect relationships it is important to be able to link the effect(s) with the cause(s) and vice versa. In the study of causation, the establishment of this linkage is essential; however, in practice, particularly in the social sciences, it is extremely difficult – and often impossible – to make the link.

Rigorous – You must be scrupulous in ensuring that the procedures followed to find answers to questions are relevant, appropriate and justified. Again, the degree of rigour varies markedly between the physical and the social sciences and within the social sciences.

Systematic – This implies that the procedures adopted to undertake an investigation follow a certain logical sequence. The different steps cannot be taken in a haphazard way. Some procedures must follow others.

Valid and verifiable – This concept implies that whatever you conclude on the basis of your findings is correct and can be verified by you and others.

Empirical – This means that any conclusions drawn are based upon hard evidence gathered from information collected from real-life experiences or observations.

Critical – Critical scrutiny of the procedures used and the methods employed is crucial to a research enquiry. The process of investigation must be foolproof and free from any drawbacks. The process adopted and the procedures used must be able to withstand critical scrutiny.

Viewed from the perspective of its application, a research endeavour falls into two broad categories: pure research and applied research. In the social sciences, according to Bailey (1978: 17):

Pure research involves developing and testing theories and hypotheses that are intellectually challenging to the researcher but may or may not have practical application at the present time or in the future. Thus such work often involves the testing of hypotheses containing very abstract and specialised concepts.

A hypothesis is a speculative statement that is subjected to verification through a research study. In formulating a hypothesis it is important to ensure that it is simple, specific and conceptually clear; able to be verified; rooted in an existing body of knowledge; and able to be operationalised.
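The verification of a hypothesis described above can be illustrated with a small statistical sketch. The Python code below is not from the book; it uses made-up test scores for two hypothetical groups to show, under those assumptions, how a null hypothesis of ‘no difference between the groups’ can be subjected to verification with a simple permutation test:

```python
import random
from statistics import mean

def permutation_test(a, b, n_perm=10_000, seed=42):
    """Two-sided permutation test for a difference in group means.

    Returns the p-value: the proportion of random label shufflings that
    produce a mean difference at least as extreme as the observed one.
    """
    rng = random.Random(seed)
    observed = abs(mean(a) - mean(b))
    pooled = list(a) + list(b)
    n_a = len(a)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = abs(mean(pooled[:n_a]) - mean(pooled[n_a:]))
        if diff >= observed:
            count += 1
    # Add 1 to numerator and denominator so the p-value is never exactly 0.
    return (count + 1) / (n_perm + 1)

# Hypothetical data: scores for a treatment group and a control group.
treatment = [78, 82, 85, 88, 90, 84, 87]
control = [70, 72, 68, 74, 71, 69, 73]

p_value = permutation_test(treatment, control)
# A small p-value is evidence against the null hypothesis of no difference.
print(f"p-value: {p_value:.4f}")
```

Note that the hypothesis here is simple, specific, verifiable and operationalised – the qualities listed above – because ‘difference’ has been defined as a difference in mean scores that can be measured and tested.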



1 Research: a way of thinking
Research: an integral part of your practice
Research: a way to gather evidence for your practice
Applications of research
Research: what does it mean?
The research process: characteristics and requirements
Types of research
Types of research: application perspective
Types of research: objectives perspective
Types of research: mode of enquiry perspective
Paradigms of research
2 The research process: a quick glance
The research process: an eight-step model
Phase I: deciding what to research
Step I: formulating a research problem
Phase II: planning a research study
Step II: conceptualising a research design
Step III: constructing an instrument for data collection
Step IV: selecting a sample
Step V: writing a research proposal
Phase III: conducting a research study
Step VI: collecting data
Step VII: processing and displaying data
Step VIII: writing a research report

3 Reviewing the literature
The place of the literature review in research
Bringing clarity and focus to your research problem
Improving your research methodology
Broadening your knowledge base in your research area
Enabling you to contextualise your findings
How to review the literature
Searching for the existing literature
Reviewing the selected literature
Developing a theoretical framework
Developing a conceptual framework
Writing about the literature reviewed
4 Formulating a research problem
The research problem
The importance of formulating a research problem
Sources of research problems
Considerations in selecting a research problem
Steps in formulating a research problem
The formulation of research objectives
The study population
Establishing operational definitions
Formulating a research problem in qualitative research
5 Identifying variables
What is a variable?
The difference between a concept and a variable
Converting concepts into variables
Types of variable
From the viewpoint of causal relationship
From the viewpoint of the study design
From the viewpoint of the unit of measurement
Types of measurement scale
The nominal or classificatory scale
The ordinal or ranking scale
The interval scale
The ratio scale
6 Constructing hypotheses
The definition of a hypothesis
The functions of a hypothesis
The testing of a hypothesis
The characteristics of a hypothesis
Types of hypothesis
Errors in testing a hypothesis
Hypotheses in qualitative research

7 The research design
What is a research design?
The functions of a research design
The theory of causality and the research design
8 Selecting a study design
Differences between quantitative and qualitative study designs
Study designs in quantitative research
Study designs based on the number of contacts
Study designs based on the reference period
Study designs based on the nature of the investigation
Other designs commonly used in quantitative research
The cross-over comparative experimental design
The replicated cross-sectional design
Trend studies
Cohort studies
Panel studies
Blind studies
Double-blind studies
Study designs in qualitative research
Case study
Oral history
Focus groups/group interviews
Participant observation
Holistic research
Community discussion forums
Reflective journal log
Other commonly used philosophy-guided designs
Action research
Feminist research
Participatory and collaborative research enquiry

9 Selecting a method of data collection
Differences in the methods of data collection in quantitative and qualitative research
Major approaches to information gathering
Collecting data using primary sources
The interview
The questionnaire
Constructing a research instrument in quantitative research
Asking personal and sensitive questions
The order of questions
Pre-testing a research instrument
Prerequisites for data collection
Methods of data collection in qualitative research
Constructing a research instrument in qualitative research
Collecting data using secondary sources
Problems with using data from secondary sources
10 Collecting data using attitudinal scales
Measurement of attitudes in quantitative and qualitative research
Attitudinal scales in quantitative research
Functions of attitudinal scales
Difficulties in developing an attitudinal scale
Types of attitudinal scale
The summated rating or Likert scale
The equal-appearing interval or Thurstone scale
The cumulative or Guttman scale
Attitudinal scales and measurement scales
Attitudes and qualitative research
11 Establishing the validity and reliability of a research instrument
The concept of validity
Types of validity in quantitative research
Face and content validity
Concurrent and predictive validity
Construct validity
The concept of reliability
Factors affecting the reliability of a research instrument
Methods of determining the reliability of an instrument in quantitative research
External consistency procedures
Internal consistency procedures
Validity and reliability in qualitative research

12 Selecting a sample
The differences between sampling in quantitative and qualitative research
Sampling in quantitative research
The concept of sampling
Sampling terminology
Principles of sampling
Factors affecting the inferences drawn from a sample
Aims in selecting a sample
Types of sampling
Non-random/non-probability sampling designs in quantitative research
Systematic sampling design: a ‘mixed’ design
The calculation of sample size
Sampling in qualitative research
The concept of saturation point in qualitative research

13 How to write a research proposal
The research proposal in quantitative and qualitative research
Contents of a research proposal
The problem
Objectives of the study
Hypotheses to be tested
Study design
The setting
Measurement procedures
Ethical issues
Analysis of data
Structure of the report
Problems and limitations
Work schedule

14 Considering ethical issues in data collection
Ethics: the concept
Stakeholders in research
Ethical issues to consider concerning research participants
Collecting information
Seeking consent
Providing incentives
Seeking sensitive information
The possibility of causing harm to participants
Maintaining confidentiality
Ethical issues to consider relating to the researcher
Avoiding bias
Provision or deprivation of a treatment
Using inappropriate research methodology
Incorrect reporting
Inappropriate use of the information
Ethical issues regarding the sponsoring organisation
Restrictions imposed by the sponsoring organisation
The misuse of information

15 Processing data
Part one: Data processing in quantitative studies
Part two: Data processing in qualitative studies
Content analysis in qualitative research – an example
The role of statistics in research
16 Displaying data
Methods of communicating and displaying analysed data

17 Writing a research report
Writing a research report
Developing an outline
Writing about a variable
Writing a bibliography
18 Research methodology and practice evaluation
What is evaluation?
Why evaluation?
Intervention–development–evaluation process
Perspectives in the classification of evaluation studies
Types of evaluation from a focus perspective
Evaluation for programme/intervention planning
Process/monitoring evaluation
Evaluating participation of the target population
Evaluating service delivery manner
Impact/outcome evaluation
Cost–benefit/cost-effectiveness evaluation
Types of evaluation from a philosophical perspective
Goal-centred/objective-oriented evaluation
Consumer-oriented/client-centred evaluation
Improvement-oriented evaluation
Holistic/illuminative evaluation
Undertaking an evaluation: the process
Step 1: Determining the purpose of evaluation
Step 2: Developing objectives or evaluation questions
Step 3: Converting concepts into indicators into variables
Step 4: Developing evaluation methodology
Step 5: Collecting data
Step 6: Analysing data
Step 7: Writing an evaluation report
Step 8: Sharing findings with stakeholders
Involving stakeholders in evaluation
Ethics in evaluation
Appendix: Developing a research project: a set of exercises for beginners

1.1 The applications of research

1.2 Types of research
2.1 The research journey
2.2 The research process
2.3 The chapters in the book in relation to the operational steps
3.1a Developing a theoretical framework – the relationship between mortality and fertility
3.1b Theoretical framework for the study ‘community responsiveness in health’
3.2 Sample of outline of a literature review
4.1 Dissecting the subject area of domestic violence into subareas
4.2 Steps in formulating a research problem – alcoholism
4.3 Formulating a research problem – the relationship between fertility and mortality
4.4 Narrowing a research problem – health
4.5 Characteristics of objectives
5.1 Types of variable
5.2 Types of variable in a causal relationship
5.3 Independent, dependent and extraneous variables in a causal relationship
5.4 Sets of variables in counselling and marriage problems
5.5 Independent, dependent, extraneous and intervening variables
5.6 Active and attribute variables
6.1 The process of testing a hypothesis
6.2 Two-by-two factorial experiment to study the relationship between MCH, NS and infant mortality
6.3 Types of hypothesis
6.4 Type I and Type II errors in testing a hypothesis
7.1 Factors affecting the relationship between a counselling service and the extent of marital problems
7.2 The relationship between teaching models and comprehension
7.3 The proportion attributable to the three components may vary markedly
7.4 Building into the design
8.1 Types of study design
8.2 Before-and-after (pre-test/post-test) study design
8.3 The regression effect
8.4 The longitudinal study design
8.5a Retrospective study design
8.5b Prospective study design
8.5c Retrospective-prospective study design
8.6 Experimental and non-experimental studies
8.7 Randomisation in experiments
8.8 The after-only design
8.9 Measurement of change through a before-and-after design
8.10 The control experimental design
8.11 Double-control designs
8.12 Comparative experimental design
8.13 The placebo design
8.14 The cross-over experimental design
8.15 The replicated cross-sectional design
8.16 Action research design
9.1 Methods of data collection
9.2 A three-directional rating scale
9.3 Types of interview
9.4 Example 1: Where to go? A study of occupational mobility among immigrants
9.5 Example 2: Occupational redeployment – a study of occupational redeployment among state government employees
9.6 Examples of closed questions
9.7 Examples of open-ended questions
10.1 An example of a categorical scale
10.2 An example of a seven-point numerical scale
10.3 An example of a scale with statements reflecting varying degrees of an attitude
10.4 The procedure for constructing a Likert scale
10.5 Scoring positive and negative statements
10.6 Calculating an attitudinal score
10.7 The procedure for constructing the Thurstone scale
12.1 The concept of sampling
12.2 Types of sampling in quantitative research
12.3 The procedure for using a table of random numbers
12.4 The procedure for selecting a simple random sample
12.5 The procedure for selecting a stratified sample
12.6 The concept of cluster sampling
12.7 Snowball sampling
12.8 The procedure for selecting a systematic sample
12.9 Systematic sampling
15.1 Steps in data processing
15.2 Example of questions from a survey
15.3 Some selected responses to the open-ended question in Figure 15.2
15.4 Some questions from a survey – respondent 3
15.5 Some questions from a survey – respondent 59
15.6 Some questions from a survey – respondent 81
15.7 An example of coded data on a code sheet
15.8 Manual analysis using graph paper
16.1 The structure of a table
16.2a Two-dimensional histogram
16.2b Three-dimensional histogram
16.2c Two-dimensional histogram with two variables
16.3 Bar charts
16.4 The stacked bar chart
16.5 The 100 per cent bar chart
16.6 The frequency polygon
16.7 The cumulative frequency polygon
16.8 The stem-and-leaf display
16.9 Two- and three-dimensional pie charts
16.10 The line diagram or trend curve
16.11 The area chart
16.12 The scattergram
18.1 The concept of evaluation
18.2 The intervention–development–evaluation model
18.3 Perspectives in the classification of evaluation studies
18.4 Aspects of process evaluation
18.5 Reflexive control design
18.6 Interrupted time-series design
18.7 Replicated cross-sectional design
18.8 Converting concepts into indicators into variables
18.9 An example of converting concepts into questions

1.1 Types of research studies from the perspective of objectives

2.1 Differences between qualitative and quantitative research
3.1 Some commonly used electronic databases in public health, sociology, education and business studies
4.1 Aspects of a research problem
4.2 Operationalisation of concepts and the study populations
5.1 Examples of concepts and variables
5.2 Converting concepts into variables
5.3 Categorical/continuous and quantitative/qualitative variables
5.4 Characteristics and examples of the four measurement scales
9.1 Guidelines for constructing a research instrument
10.1 The relationship between attitudinal and measurement scales
12.1 The difference between sample statistics and the population mean
12.2 The difference between a sample and a population average
12.3 Selecting a sample using a table for random numbers
12.4 Selected elements using the table of random numbers
13.1 Developing a time-frame for your study
15.1 An example of a code book
16.1 Respondents by age (frequency table for one population)
16.2 Respondents by age (frequency table comparing two populations)
16.3 Respondents by attitude towards uranium mining and age (cross-tabulation)
16.4 Attitude towards uranium mining by age and gender
16.5 Age and income data
18.1 Types of evaluation from the perspective of its focus and the questions they are designed to answer

Keywords: applied research, controlled, correlational research, descriptive
research, empirical, explanatory research, exploratory research, evidence-based
practice, interpretive paradigm, positivistic paradigm, pure research, qualitative
research, quantitative research, reliability, research, structured and unstructured
enquiries, systematic, validity.

Keywords: data, data display, data processing, empiricism, hypotheses, interview
schedule, non-probability sample, primary data, probability sample, qualitative
research, questionnaire, rationalism, reliability, research design, research instrument,
research objectives, research problem, research proposal, sample, sample size,
sampling design, secondary data, study design, unstructured interview, validity.

Keywords: catalogue, conceptual framework, contextualise, Internet, knowledge
base, literature review, search engines, summary of literature, thematic writing,
theoretical framework.

Keywords: concepts, dissect, operational definition, qualitative research,
quantitative research, research objectives, research problem, study area, study
population, subject area, validity, variable, working definition.

Keywords: active variables, attribute variables, categorical variables, causation,
constant variables, continuous variables, dependent variables, dichotomous,
extraneous variables, independent variables, interval scale, intervening variables,
measurement scales, nominal scale, ordinal scale, polytomous, ratio scale, unit of measurement.

Keywords: alternate hypotheses, hunch, hypothesis, hypothesis of point-prevalence,
null hypothesis, operationalisable, research hypothesis, Type I error, Type II error,
unidimensional, valid.

Keywords: chance variables, control group, experimental group, extraneous
variables, independent variable, matching, ‘maxmincon’ principle, random error,
randomisation, research design, study design, treatment group.

Keywords: action research, after-only design, before-and-after study design, blind
studies, case studies, cohort studies, control studies, cross-sectional study design,
double-blind studies, experimental study design, feminist research, focus studies,
longitudinal studies, non-experimental studies, panel studies, prospective study
design, quasi-experimental studies, reflective journal, retrospective studies, semi-experimental studies, trend studies.

Keywords: closed questions, content analysis, double-barrelled questions, elevation
effect, error of central tendency, focus group, halo effect, Hawthorne effect, interview
schedule, leading questions, non-participant observation, open-ended questions, oral
history, participant observation, primary data, primary sources, questionnaire,
secondary data, secondary sources, structured interview, unstructured interview.

Keywords: attitudinal scales, attitudinal score, attitudinal value, attitudinal weight,
cumulative scale, equal-appearing scale, Guttman scale, interval scale, Likert scale,
negative statements, neutral items, non-discriminate items, numerical scale, ordinal
scale, positive statements, ratio scale, summated rating scale, Thurstone scale.

Keywords: concurrent validity, confirmability, construct validity, content validity,
credibility, dependability, external consistency, face validity, internal consistency,
reliability, transferability, validity.

Keywords: accidental sampling, cluster sampling, data saturation point,
disproportionate sampling, equal and independent, estimate, information-rich,
judgemental sampling, multi-stage cluster sampling, non-random sample, population
mean, population parameters, quota sampling, random numbers, random sample,
sample statistics, sampling, sampling design, sampling element, sampling error,
sampling frame, sampling population, sampling unit, sample size, sampling strategy,
saturation point, snowball sampling, study population, stratified sampling, systematic sampling.

Keywords: conceptual framework, data analysis, data processing, hypothesis,
limitations, literature review, research design, research problem, sampling, study
design, study objectives, theoretical framework, time-frame.

Keywords: bias, code of conduct, confidentiality, deprivation of treatment, ethos,
harm, informed consent, principles of conduct, research participants, sensitive
information, sponsoring organisations, stakeholders, subjectivity.

Keywords: analysis, closed questions, code book, coding, concepts, content
analysis, cross-tabulation, data displaying, data processing, editing, frame of
analysis, frequency distribution, multiple responses, open-ended questions, pre-test.

Keywords: area chart, bar diagram, bivariate, cumulative frequency polygon, data
display, frequency graph, line diagram, pie chart, polygon, polyvariate, scattergram,
table, univariate.

Keywords: association, bibliography, intellectual rigour, non-spurious, outline,
referencing, spurious, variable, verifiability.

Keywords: client-centred evaluation, cost–benefit evaluation, cost-effectiveness
evaluation, ethics, evaluation, evaluation process, goal-centred, holistic evaluation,
illuminative evaluation, impact evaluation, improvement-oriented evaluation,
indicators, intervention, monitoring, objective-oriented evaluation, outcome
evaluation, perspectives, process evaluation, stakeholders.
