Research methodology vs. research methods
The research methodology or design is the overall strategy and rationale you used to carry out the research, whereas research methods are the specific tools and processes you use to gather and analyze the data you need to test your hypothesis.
To see this distinction in practice, let's explore some examples of research methodology:
a. Qualitative research methodology example: A study exploring the impact of author branding on author popularity might utilize in-depth interviews to gather personal experiences and perspectives.
b. Quantitative research methodology example: A research project investigating the effects of a book promotion technique on book sales could employ a statistical analysis of profit margins and sales before and after the implementation of the method.
c. Mixed-Methods research methodology example: A study examining the relationship between social media use and academic performance might combine both qualitative and quantitative approaches. It could include surveys to quantitatively assess the frequency of social media usage and its correlation with grades, alongside focus groups or interviews to qualitatively explore students’ perceptions and experiences regarding how social media affects their study habits and academic engagement.
These examples highlight the meaning of methodology in research and how it guides the research process, from data collection to analysis, ensuring the study’s objectives are met efficiently.
When it comes to writing your study, the methodology in research papers or a dissertation plays a pivotal role. A well-crafted methodology section of a research paper or thesis not only enhances the credibility of your research but also provides a roadmap for others to replicate or build upon your work.
Wondering how to write the research methodology section? Follow these steps to create a strong methods chapter:
At the start of a research paper, you would have provided the background of your research and stated your hypothesis or research problem. In this section, you will elaborate on your research strategy.
Begin by restating your research question and proceed to explain what type of research you opted for to test it. Depending on your research, here are some questions you can consider:
a. Did you use qualitative or quantitative data to test the hypothesis?
b. Did you perform an experiment where you collected data, or are you writing a dissertation that is descriptive/theoretical, without data collection?
c. Did you use primary data that you collected or analyze secondary research data or existing data as part of your study?
These questions will help you establish the rationale for your study on a broader level, which you will follow by elaborating on the specific methods you used to collect and understand your data.
Now that you have told your reader what type of research you’ve undertaken for the dissertation, it’s time to dig into specifics. State what specific methods you used and explain the conditions and variables involved. Explain what the theoretical framework behind the method was, what samples you used for testing it, and what tools and materials you used to collect the data.
Once you have explained the data collection process, explain how you analyzed and studied the data. Here, your focus is simply to explain the methods of analysis rather than the results of the study.
Here are some questions you can answer at this stage:
a. What tools or software did you use to analyze your results?
b. What parameters or variables did you consider while understanding and studying the data you’ve collected?
c. Was your analysis based on a theoretical framework?
Your mode of analysis will change depending on whether you used a quantitative or qualitative research methodology in your study. If you're working within the hard sciences or physical sciences, you are likely to use a quantitative research methodology (relying on numbers and hard data). If you're doing a qualitative study in the social sciences or humanities, your analysis may rely on understanding language and the socio-political contexts around your topic. This is why it's important to establish what kind of study you're undertaking at the outset.
Now that you have gone through your research process in detail, you'll also have to make a case for it. Justify your choice of methodology and methods, explaining why it is the best choice for your research question. This is especially important if you have chosen an unconventional approach or have chosen to study an existing research problem from a different perspective. Compare it with other methodologies, especially ones attempted by previous researchers, and discuss the contributions that using your methodology makes.
No matter how thorough a methodology is, it doesn't come without its hurdles. This is a natural part of scientific research that is important to document so that your peers and future researchers are aware of it. Describing this aspect of your research process also tells your evaluator that you actively worked to overcome the pitfalls that came your way and refined the research process.
Here are some tips to keep in mind while writing your methodology:

1. Remember who you are writing for. Keeping sight of the reader/evaluator will help you know what to elaborate on and what information they are already likely to have. You're condensing months' worth of research into just a few pages, so you should omit basic definitions and information about general phenomena people already know.
2. Do not give an overly elaborate explanation of every single condition in your study.
3. Skip details and findings irrelevant to the results.
4. Cite references that back your claim and choice of methodology.
5. Consistently emphasize the relationship between your research question and the methodology you adopted to study it.
To sum it up, what is methodology in research? It’s the blueprint of your research, essential for ensuring that your study is systematic, rigorous, and credible. Whether your focus is on qualitative research methodology, quantitative research methodology, or a combination of both, understanding and clearly defining your methodology is key to the success of your research.
Once you write the research methodology and finish writing the entire research paper, the next step is to edit your paper. As experts in research paper editing and proofreading services, we'd love to help you perfect your paper!
How to Write an APA Methods Section

Published on February 5, 2021 by Pritha Bhandari. Revised on June 22, 2023.
The methods section of an APA style paper is where you report in detail how you performed your study. Research papers in the social and natural sciences often follow APA style. This article focuses on reporting quantitative research methods .
In your APA methods section, you should report enough information to understand and replicate your study, including detailed information on the sample , measures, and procedures used.
Structuring an APA Methods Section
The main heading of “Methods” should be centered, boldfaced, and capitalized. Subheadings within this section are left-aligned, boldfaced, and in title case. You can also add lower level headings within these subsections, as long as they follow APA heading styles .
To structure your methods section, you can use the subheadings of “Participants,” “Materials,” and “Procedures.” These headings are not mandatory—aim to organize your methods section using subheadings that make sense for your specific study.
Heading | What to include
---|---
Participants | Sample characteristics, sampling procedures, sample size and statistical power
Materials | Primary and secondary measures, quality of measurements (reliability and validity)
Procedure | Data collection methods, research design, data diagnostics, and the analytic strategy
Note that not all of these topics will necessarily be relevant for your study. For example, if you didn’t need to consider outlier removal or ways of assigning participants to different conditions, you don’t have to report these steps.
The APA also provides specific reporting guidelines for different types of research design. These tell you exactly what you need to report for longitudinal designs , replication studies, experimental designs , and so on. If your study uses a combination design, consult APA guidelines for mixed methods studies.
Detailed descriptions of procedures that don’t fit into your main text can be placed in supplemental materials (for example, the exact instructions and tasks given to participants, the full analytical strategy including software code, or additional figures and tables).
Begin the methods section by reporting sample characteristics, sampling procedures, and the sample size.
When discussing people who participate in research, descriptive terms like “participants,” “subjects” and “respondents” can be used. For non-human animal research, “subjects” is more appropriate.
Specify all relevant demographic characteristics of your participants. This may include their age, sex, ethnic or racial group, gender identity, education level, and socioeconomic status. Depending on your study topic, other characteristics like educational or immigration status or language preference may also be relevant.
Be sure to report these characteristics as precisely as possible. This helps the reader understand how far your results may be generalized to other people.
The APA guidelines emphasize writing about participants using bias-free language , so it’s necessary to use inclusive and appropriate terms.
Outline how the participants were selected and all inclusion and exclusion criteria applied. Appropriately identify the sampling procedure used. For example, you should only label a sample as random if you had access to every member of the relevant population.
Of all the people invited to participate in your study, note the percentage that actually did (if you have this data). Additionally, report whether participants were self-selected, either by themselves or by their institutions (e.g., schools may submit student data for research purposes).
Identify any compensation (e.g., course credits or money) that was provided to participants, and mention any institutional review board approvals and ethical standards followed.
Detail the sample size (per condition) and statistical power that you hoped to achieve, as well as any analyses you performed to determine these numbers.
It’s important to show that your study had enough statistical power to find effects if there were any to be found.
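The power analysis described above can be sketched with the standard normal-approximation formula for a two-sample comparison. This is only an illustration, not the article's own procedure; exact t-based calculators give slightly larger numbers, and the function name is invented for the example.

```python
from math import ceil

from scipy.stats import norm

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate per-group sample size for a two-sided two-sample t test,
    via the normal approximation: n = 2 * ((z_alpha/2 + z_beta) / d)^2."""
    z_alpha = norm.ppf(1 - alpha / 2)  # critical value for the significance level
    z_beta = norm.ppf(power)           # z-score corresponding to the desired power
    return ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# Detecting a medium effect (Cohen's d = 0.5) with 80% power at alpha = .05
# requires roughly 63 participants per condition under this approximation.
print(n_per_group(0.5))
```

Reporting the inputs to such a calculation (expected effect size, alpha, target power) is what lets readers judge whether your study was adequately powered.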
Additionally, state whether your final sample differed from the intended sample. Your interpretations of the study outcomes should be based only on your final sample rather than your intended sample.
Write up the tools and techniques that you used to measure relevant variables. Be as thorough as possible for a complete picture of your techniques.
Define the primary and secondary outcome measures that will help you answer your primary and secondary research questions.
Specify all instruments used in gathering these measurements and the construct that they measure. These instruments may include hardware, software, or tests, scales, and inventories.
Make sure to report the settings (e.g., screen resolution) of any specialized apparatus used.
For each instrument used, report measures of its reliability and validity.
Giving an example item or two for tests, questionnaires , and interviews is also helpful.
Describe any covariates—these are any additional variables that may explain or predict the outcomes.
Review all methods you used to assure the quality of your measurements.
For data that’s subjectively coded (for example, classifying open-ended responses), report interrater reliability scores. This tells the reader how similarly each response was rated by multiple raters.
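The interrater reliability mentioned above is often summarized with Cohen's kappa, which corrects raw agreement for the agreement expected by chance. Here is a minimal from-scratch sketch for two raters; the function name and the example codes are illustrative, not from the article.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning categorical codes to the same items."""
    n = len(rater_a)
    # Observed agreement: proportion of items both raters coded identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement, from each rater's marginal code frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# Two raters classifying five open-ended responses as "pro" or "anti":
a = ["pro", "pro", "anti", "pro", "anti"]
b = ["pro", "pro", "anti", "anti", "anti"]
print(round(cohens_kappa(a, b), 2))  # 0.62
```

A kappa near 1 indicates near-perfect agreement; values near 0 mean the raters agree no more often than chance would predict.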
Report all of the procedures applied for administering the study, processing the data, and for planned data analyses.
Data collection methods refer to the general mode of the instruments: surveys, interviews, observations, focus groups, neuroimaging, cognitive tests, and so on. Summarize exactly how you collected the necessary data.
Describe all procedures you applied in administering surveys, tests, physical recordings, or imaging devices, with enough detail so that someone else can replicate your techniques. If your procedures are very complicated and require long descriptions (e.g., in neuroimaging studies), place these details in supplementary materials.
To report research design, note your overall framework for data collection and analysis. State whether you used an experimental, quasi-experimental, descriptive (observational), correlational, and/or longitudinal design. Also note whether a between-subjects or a within-subjects design was used.
For multi-group studies, report additional design and procedural details as well.
Describe whether any masking was used to hide the condition assignment (e.g., placebo or medication condition) from participants or research administrators. Using masking in a multi-group study ensures internal validity by reducing research bias . Explain how this masking was applied and whether its effectiveness was assessed.
Participants were randomly assigned to a control or experimental condition. The survey was administered using Qualtrics (https://www.qualtrics.com). To begin, all participants were given the AAI and a demographics questionnaire to complete, followed by an unrelated filler task. In the control condition, participants completed a short general knowledge test immediately after the filler task. In the experimental condition, participants were asked to visualize themselves taking the test for 3 minutes before they actually did. For more details on the exact instructions and tasks given, see supplementary materials.
Outline all steps taken to scrutinize or process the data after collection, such as checks for missing data, identification of outliers, and any data transformations.
To ensure high validity, you should provide enough detail for your reader to understand how and why you processed or transformed your raw data in these specific ways.
The methods section is also where you describe your statistical analysis procedures, but not their outcomes; those are reported in the results section.
These procedures should be stated for all primary, secondary, and exploratory hypotheses. While primary and secondary hypotheses are based on a theoretical framework or past studies, exploratory hypotheses are guided by the data you’ve just collected.
This annotated example reports methods for a descriptive correlational survey on the relationship between religiosity and trust in science in the US.
The sample included 879 adults aged between 18 and 28. More than half of the participants were women (56%), and all participants had completed at least 12 years of education. Ethics approval was obtained from the university board before recruitment began. Participants were recruited online through Amazon Mechanical Turk (MTurk; www.mturk.com). We selected for a geographically diverse sample within the Midwest of the US through an initial screening survey. Participants were paid USD $5 upon completion of the study.
A sample size of at least 783 was deemed necessary for detecting a correlation coefficient of ±.1, with a power level of 80% and a significance level of .05, using a sample size calculator (www.sample-size.net/correlation-sample-size/).
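The figure of 783 can be reproduced with the standard Fisher z-transformation formula for correlation sample sizes, which is what calculators like the one linked implement. The sketch below is illustrative; the function name is invented for the example.

```python
from math import atanh, ceil

from scipy.stats import norm

def correlation_sample_size(r, alpha=0.05, power=0.80):
    """Sample size needed to detect a correlation of magnitude r in a
    two-sided test, via the Fisher z-transformation: n = ((z_a + z_b)/atanh(r))^2 + 3."""
    z_alpha = norm.ppf(1 - alpha / 2)
    z_beta = norm.ppf(power)
    return ceil(((z_alpha + z_beta) / atanh(abs(r))) ** 2 + 3)

# r = .1, power = 80%, alpha = .05, as in the example above:
print(correlation_sample_size(0.1))  # 783
```

Small target correlations demand large samples: halving the detectable effect roughly quadruples the required n.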
The primary outcome measures were the levels of religiosity and trust in science. Religiosity refers to involvement and belief in religious traditions, while trust in science represents confidence in scientists and scientific research outcomes. The secondary outcome measures were gender and parental education levels of participants and whether these characteristics predicted religiosity levels.
Religiosity
Religiosity was measured using the Centrality of Religiosity scale (Huber, 2003). The Likert scale is made up of 15 questions with five subscales of ideology, experience, intellect, public practice, and private practice. An example item is “How often do you experience situations in which you have the feeling that God or something divine intervenes in your life?” Participants were asked to indicate frequency of occurrence by selecting a response ranging from 1 (very often) to 5 (never). The internal consistency of the instrument is .83 (Huber & Huber, 2012).
Trust in Science
Trust in science was assessed using the General Trust in Science index (McCright, Dentzman, Charters & Dietz, 2013). Four Likert scale items were assessed on a scale from 1 (completely distrust) to 5 (completely trust). An example question asks “How much do you distrust or trust scientists to create knowledge that is unbiased and accurate?” Internal consistency was .8.
Potential participants were invited to participate in the survey online using Qualtrics (www.qualtrics.com). The survey consisted of multiple choice questions regarding demographic characteristics, the Centrality of Religiosity scale, an unrelated filler anagram task, and finally the General Trust in Science index. The filler task was included to avoid priming or demand characteristics, and an attention check was embedded within the religiosity scale. For full instructions and details of tasks, see supplementary materials.
For this correlational study, we assessed our primary hypothesis of a relationship between religiosity and trust in science using the Pearson product-moment correlation coefficient. The statistical significance of the correlation coefficient was assessed using a t test. To test our secondary hypothesis of parental education levels and gender as predictors of religiosity, multiple linear regression analysis was used.
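As a brief illustration of the primary analysis described in this example, here is how a Pearson correlation and its t test could be computed in Python. The five score pairs below are made up for demonstration and are not the study's data; `scipy.stats.pearsonr` already reports the t-test-based p-value.

```python
import numpy as np
from scipy import stats

# Hypothetical scores for five participants (illustration only, not study data).
religiosity = np.array([1, 2, 3, 4, 5])
trust = np.array([1, 2, 3, 4, 6])

# Pearson product-moment correlation; scipy's two-sided p-value is t-test based.
r, p = stats.pearsonr(religiosity, trust)

# The same t statistic by hand: t = r * sqrt(n - 2) / sqrt(1 - r^2)
n = len(religiosity)
t = r * np.sqrt(n - 2) / np.sqrt(1 - r**2)
print(round(r, 3), round(t, 2), p < 0.05)
```

In the methods section you would name the test and its assumptions; the numerical results themselves belong in the results section.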
In your APA methods section, you should report detailed information on the participants, materials, and procedures used.
You should report methods using the past tense, even if you haven't completed your study at the time of writing. That's because the methods section is intended to describe completed actions or research.
In a scientific paper, the methodology always comes after the introduction and before the results, discussion, and conclusion. The same basic structure also applies to a thesis, dissertation, or research proposal.
Depending on the length and type of document, you might also include a literature review or theoretical framework before the methodology.
Bhandari, P. (2023, June 22). How to Write an APA Methods Section | With Examples. Scribbr. Retrieved August 12, 2024, from https://www.scribbr.com/apa-style/methods-section/
Tio Gabunia (B.Arch, M.Arch)
Tio Gabunia is an academic writer and architect based in Tbilisi. He has studied architecture, design, and urban planning at the Georgian Technical University and the University of Lisbon. He has worked in these fields in Georgia, Portugal, and France. Most of Tio’s writings concern philosophy. Other writings include architecture, sociology, urban planning, and economics.
Chris Drew (PhD)
Dr. Chris Drew is the founder of the Helpful Professor. He holds a PhD in education and has published over 20 articles in scholarly journals. He is the former editor of the Journal of Learning Development in Higher Education.
Research methodologies can roughly be categorized into three groups: quantitative, qualitative, and mixed-methods.
Below are research methodologies that fit into each category.
1. Case Study
A case study conducts an in-depth examination of a specific case, individual, or event to understand a phenomenon.
Instead of examining a whole population for numerical trend data, case study researchers seek in-depth explanations of one event.
The benefit of case study research is its ability to elucidate overlooked details of interesting cases of a phenomenon (Busetto, Wick & Gumbinger, 2020). It offers deep insights for empathetic, reflective, and thoughtful understandings of that phenomenon.
However, case study findings aren’t transferrable to new contexts or for population-wide predictions. Instead, they inform practitioner understandings for nuanced, deep approaches to future instances (Liamputtong, 2020).
2. Grounded Theory

Grounded theory involves generating hypotheses and theories through the collection and interpretation of data (Faggiolani, n.d.). Its distinguishing feature is that it doesn't test a hypothesis generated prior to analysis, but rather generates a hypothesis or 'theory' that emerges from the data.
It also involves the application of inductive reasoning and is often contrasted with the hypothetico-deductive model of scientific research. This research methodology was developed by Barney Glaser and Anselm Strauss in the 1960s (Glaser & Strauss, 2009).
The basic difference between traditional scientific approaches to research and grounded theory is that the latter begins with a question, then collects data, and the theoretical framework is said to emerge later from this data.
By contrast, scientists usually begin with an existing theoretical framework , develop hypotheses, and only then start collecting data to verify or falsify the hypotheses.
3. Ethnography

In ethnographic research, the researcher immerses themselves within the group they are studying, often for long periods of time.
This type of research aims to understand the shared beliefs, practices, and values of a particular community by immersing the researcher within the cultural group.
Although ethnographic research cannot predict or identify trends in an entire population, it can create detailed explanations of cultural practices and comparisons between social and cultural groups.
When a person conducts an ethnographic study of themselves or their own culture, it can be considered autoethnography .
Its strength lies in producing comprehensive accounts of groups of people and their interactions.
Common methods researchers use during an ethnographic study include participant observation , thick description, unstructured interviews, and field notes vignettes. These methods can provide detailed and contextualized descriptions of their subjects.
Example Study
Liquidated: An Ethnography of Wall Street by Karen Ho involves an anthropologist who embeds herself with Wall Street firms to study the culture of Wall Street bankers and how this culture affects the broader economy and world.
4. Phenomenology

Phenomenology seeks to understand and describe individuals' lived experiences concerning a specific phenomenon.
As a research methodology typically used in the social sciences , phenomenology involves the study of social reality as a product of intersubjectivity (the intersection of people’s cognitive perspectives) (Zahavi & Overgaard, n.d.).
This philosophical approach was first developed by Edmund Husserl.
5. Narrative Research

Narrative research explores personal stories and experiences to understand their meanings and interpretations.
It is also known as narrative inquiry and narrative analysis (Riessman, 1993).
This approach to research uses qualitative material like journals, field notes, letters, interviews, texts, photos, etc., as its data.
It is aimed at understanding the way people create meaning through narratives (Clandinin & Connelly, 2004).
6. Discourse Analysis

A discourse analysis examines the structure, patterns, and functions of language in context to understand how the text produces social constructs.
This methodology is common in critical theory , poststructuralism , and postmodernism. Its aim is to understand how language constructs discourses (roughly interpreted as “ways of thinking and constructing knowledge”).
As a qualitative methodology , its focus is on developing themes through close textual analysis rather than using numerical methods. Common methods for extracting data include semiotics and linguistic analysis.
7. Action Research

Action research involves researchers working collaboratively with stakeholders to address problems, develop interventions, and evaluate effectiveness.
Action research is a methodology and philosophy of research that is common in the social sciences.
The term was first coined in 1944 by Kurt Lewin, a German-American psychologist who also introduced applied research and group communication (Altrichter & Gstettner, 1993).
Lewin originally defined action research as involving two primary processes: taking action and doing research (Lewin, 1946).
Action research involves planning, action, and information-seeking about the result of the action.
Since Lewin’s original formulation, many different theoretical approaches to action research have been developed. These include action science, participatory action research, cooperative inquiry, and living educational theory among others.
Using Digital Sandbox Gaming to Improve Creativity Within Boys’ Writing (Ellison & Drew, 2019) is a study conducted by a school teacher who used video games to help teach his students English. It involved action research, where he interviewed his students to see if the use of games as stimuli for storytelling helped draw them into the learning experience, and iterated on his teaching style based on their feedback (disclaimer: I am the second author of this study).
8. Experimental Design
As the name suggests, this type of research is based on testing hypotheses in experimental settings by manipulating variables and observing their effects on other variables.
The main benefit lies in its ability to manipulate specific variables to determine their effect on outcomes, which makes it a great method for those looking for causal links in their research.
This is common, for example, in high-school science labs, where students are asked to introduce a variable into a setting in order to examine its effect.
9. Non-Experimental Design

Non-experimental design observes and measures associations between variables without manipulating them.
It can take, for example, the form of a ‘fly on the wall’ observation of a phenomenon, allowing researchers to examine authentic settings and changes that occur naturally in the environment.
10. Cross-Sectional Design

Cross-sectional design involves analyzing data from many subjects at a single point in time.
This approach allows for an extensive examination and comparison of many distinct, independent subjects, offering breadth that in-depth methodologies such as case studies cannot.
While cross-sectional design can be extremely useful in taking a ‘snapshot in time’, as a standalone method, it is not useful for examining changes in subjects after an intervention. The next methodology addresses this issue.
The prime example of this type of study is a census. A population census is mailed out to every house in the country, and each household must complete the census on the same evening. This allows the government to gather a snapshot of the nation’s demographics, beliefs, religion, and so on.
11. Longitudinal Design

Longitudinal research gathers data from the same subjects over an extended period to analyze changes and development.
In contrast to cross-sectional tactics, longitudinal designs examine variables more than once, over a pre-determined time span, allowing for multiple data points to be taken at different times.
A longitudinal design is also useful for examining cohort effects, by comparing differences or changes in multiple different generations' beliefs over time.
With multiple data points collected over extended periods, it's possible to examine continuous changes within things like population dynamics or consumer behavior. This makes detailed analysis of change possible.
12. Quasi-Experimental Design

Quasi-experimental design involves manipulating variables for analysis, but uses pre-existing groups of subjects rather than random groups.
Because the groups of research participants already exist, they cannot be randomly assigned to a cohort as with a true experimental design study. This makes inferring a causal relationship more difficult, but is nonetheless often more feasible in real-life settings.
Quasi-experimental designs are generally considered inferior to true experimental designs.
13. Correlational Research

Correlational research examines the relationships between two or more variables, determining the strength and direction of their association.
Similar to quasi-experimental methods, this type of research focuses on the relationships between variables rather than manipulating them.
This approach provides a fast and easy way to form initial hypotheses based on positive or negative correlation trends observed within a dataset.
Methods used for data analysis may include statistical correlation coefficients such as Pearson's or Spearman's.
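Both coefficients mentioned above are available in scipy. As a brief sketch, the data below are hypothetical and chosen to be perfectly monotonic but non-linear (y = x squared), which is exactly the situation that separates the two measures.

```python
from scipy.stats import pearsonr, spearmanr

# Hypothetical, perfectly monotonic but non-linear data (y = x squared).
x = [1, 2, 3, 4, 5]
y = [1, 4, 9, 16, 25]

r_pearson, _ = pearsonr(x, y)    # sensitive to linearity of the relationship
r_spearman, _ = spearmanr(x, y)  # sensitive only to monotonic rank order
print(round(r_pearson, 3), round(r_spearman, 1))  # 0.981 1.0
```

Spearman's coefficient hits exactly 1.0 because the rank order is preserved, while Pearson's falls short of 1 because the relationship is not a straight line.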
14. Sequential Explanatory Design (QUAN→qual)
This methodology involves conducting quantitative analysis first, then supplementing it with a qualitative study.
It begins by collecting quantitative data that is then analyzed to determine any significant patterns or trends.
Second, qualitative methods are employed to help interpret and expand on the quantitative results.
This offers greater depth of understanding across both the large-scale and fine-grained aspects of the research questions being addressed.
The rationale behind this approach is that the qualitative follow-up generates richer context for the statistical findings, integrating qualitative exploration and statistical procedures in a single study.
15. Sequential Exploratory Design (QUAL→quan)

This methodology goes in the other direction, starting with qualitative analysis and ending with quantitative analysis.
It starts with qualitative research that delves deep into complex areas and gathers rich information through interviewing or observing participants.
After this stage of exploration comes to an end, quantitative techniques are used to analyze the collected data through inferential statistics.
The idea is that a qualitative study can arm the researchers with a strong hypothesis-testing framework, which they can then apply to a larger sample size using quantitative methods.
When I first took research classes, I had a lot of trouble distinguishing between methodologies and methods.
The key is to remember that the methodology sets the direction, while the methods are the specific tools to be used. A good analogy is transport: first you need to choose a mode (public transport, private transport, motorized transit, non-motorized transit), then you can choose a tool (bus, car, bike, on foot).
While research methodologies can be split into three types, each type has many different nuanced methodologies that can be chosen, before you then choose the methods – or tools – to use in the study. Each has its own strengths and weaknesses, so choose wisely!
Published on 25 February 2019 by Shona McCombes . Revised on 10 October 2022.
Your research methodology discusses and explains the data collection and analysis methods you used in your research. A key part of your thesis, dissertation, or research paper, the methodology chapter explains what you did and how you did it, allowing readers to evaluate the reliability and validity of your research.
It should include your overall methodological approach, your data collection methods, your data analysis methods, and an evaluation and justification of the choices you made.
Table of contents:
How to write a research methodology
Why is a methods section important?
Step 1: Explain your methodological approach
Step 2: Describe your data collection methods
Step 3: Describe your analysis method
Step 4: Evaluate and justify the methodological choices you made
Tips for writing a strong methodology chapter
Frequently asked questions about methodology
Your methods section is your opportunity to share how you conducted your research and why you chose the methods you chose. It’s also the place to show that your research was rigorously conducted and can be replicated .
It gives your research legitimacy and situates it within your field, and also gives your readers a place to refer to if they have any questions or critiques in other sections.
You can start by introducing your overall approach to your research. You have two options here.
What research problem or question did you investigate?
And what type of data did you need to achieve this aim?
Depending on your discipline, you can also start with a discussion of the rationale and assumptions underpinning your methodology. In other words, why did you choose these methods for your study?
Once you have introduced your reader to your methodological approach, you should share full details about your data collection methods .
For your results to be considered generalisable, you should describe your quantitative research methods in enough detail for another researcher to replicate your study.
Here, explain how you operationalised your concepts and measured your variables. Discuss your sampling method or inclusion/exclusion criteria, as well as any tools, procedures, and materials you used to gather your data.
Surveys: Describe where, when, and how the survey was conducted.
Experiments: Share full details of the tools, techniques, and procedures you used to conduct your experiment.
Existing data: Explain how you gathered and selected the material (such as datasets or archival data) that you used in your analysis.
The survey consisted of 5 multiple-choice questions and 10 questions measured on a 7-point Likert scale.
The goal was to collect survey responses from 350 customers visiting the fitness apparel company’s brick-and-mortar location in Boston on 4–8 July 2022, between 11:00 and 15:00.
Here, a customer was defined as a person who had purchased a product from the company on the day they took the survey. Participants were given 5 minutes to fill in the survey anonymously. In total, 408 customers responded, but not all surveys were fully completed. Due to this, 371 survey results were included in the analysis.
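The screening step in this example (408 responses collected, incomplete surveys excluded) boils down to a completeness filter. A minimal sketch with hypothetical response records:

```python
# Sketch of the screening step described above: only fully completed surveys
# are kept for analysis. Response records and field names are hypothetical.
responses = [
    {"id": 1, "q1": 4, "q2": 5, "q3": 3},
    {"id": 2, "q1": 2, "q2": None, "q3": 4},   # incomplete: dropped
    {"id": 3, "q1": 5, "q2": 5, "q3": 5},
]

complete = [r for r in responses if all(v is not None for v in r.values())]
print(f"{len(complete)} of {len(responses)} responses retained")
```

Reporting both numbers, as the example text does, lets readers judge whether the exclusions could bias the analysis.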
In qualitative research , methods are often more flexible and subjective. For this reason, it’s crucial to robustly explain the methodology choices you made.
Be sure to discuss the criteria you used to select your data, the context in which your research was conducted, and the role you played in collecting your data (e.g., were you an active participant or a passive observer?).
Interviews or focus groups: Describe where, when, and how the interviews were conducted.
Participant observation: Describe where, when, and how you conducted the observation or ethnography.
Existing data: Explain how you selected case study materials for your analysis.
In order to gain better insight into possibilities for future improvement of the fitness shop’s product range, semi-structured interviews were conducted with 8 returning customers.
Here, a returning customer was defined as someone who usually bought products at least twice a week from the store.
Surveys were used to select participants. Interviews were conducted in a small office next to the cash register and lasted approximately 20 minutes each. Answers were recorded by note-taking, and seven interviews were also filmed with consent. One interviewee preferred not to be filmed.
Mixed methods research combines quantitative and qualitative approaches. If a standalone quantitative or qualitative study is insufficient to answer your research question, mixed methods may be a good fit for you.
Mixed methods are less common than standalone analyses, largely because they require a great deal of effort to pull off successfully. If you choose to pursue mixed methods, it’s especially important to robustly justify your methods here.
Next, you should indicate how you processed and analysed your data. Avoid going into too much detail: you should not start introducing or discussing any of your results at this stage.
In quantitative research , your analysis will be based on numbers. In your methods section, you can include:
In qualitative research, your analysis will be based on language, images, and observations (often involving some form of textual analysis ).
Specific methods might include:
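Whatever specific method is chosen, a recurring low-level step in textual analysis is tallying how often predefined codes appear in the material. A minimal sketch, with an entirely hypothetical coding frame and transcripts:

```python
# Hypothetical coding frame applied to interview transcripts: count how many
# transcripts mention each code at least once (simple keyword matching).
codes = {"motivation": ["motivated", "drive"], "stress": ["stress", "anxious"]}
transcripts = [
    "I felt very motivated after the workshop, though exams bring stress.",
    "The deadlines made me anxious all semester.",
]

tally = {code: 0 for code in codes}
for text in transcripts:
    lowered = text.lower()
    for code, keywords in codes.items():
        if any(k in lowered for k in keywords):
            tally[code] += 1

print(tally)
```

Real qualitative coding is interpretive rather than keyword-driven, so a script like this only supports, and never replaces, the researcher's judgement.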
Mixed methods combine the above two research methods, integrating both qualitative and quantitative approaches into one coherent analytical process.
Above all, your methodology section should clearly make the case for why you chose the methods you did. This is especially true if you did not take the most standard approach to your topic. In this case, discuss why other methods were not suitable for your objectives, and show how this approach contributes new knowledge or understanding.
In any case, it should be overwhelmingly clear to your reader that you set yourself up for success in terms of your methodology’s design. Show how your methods should lead to results that are valid and reliable, while leaving the analysis of the meaning, importance, and relevance of your results for your discussion section .
Remember that your aim is not just to describe your methods, but to show how and why you applied them. Again, it’s critical to demonstrate that your research was rigorously conducted and can be replicated.
The methodology section should clearly show why your methods suit your objectives and convince the reader that you chose the best possible approach to answering your problem statement and research questions .
Your methodology can be strengthened by referencing existing research in your field. This can help you to:
Consider how much information you need to give, and avoid getting too lengthy. If you are using methods that are standard for your discipline, you probably don’t need to give a lot of background or justification.
Regardless, your methodology should be a clear, well-structured text that makes an argument for your approach, not just a list of technical details and procedures.
Methodology refers to the overarching strategy and rationale of your research. Developing your methodology involves studying the research methods used in your field and the theories or principles that underpin them, in order to choose the approach that best matches your objectives.
Methods are the specific tools and procedures you use to collect and analyse data (e.g. interviews, experiments , surveys , statistical tests ).
In a dissertation or scientific paper, the methodology chapter or methods section comes after the introduction and before the results , discussion and conclusion .
Depending on the length and type of document, you might also include a literature review or theoretical framework before the methodology.
Quantitative research deals with numbers and statistics, while qualitative research deals with words and meanings.
Quantitative methods allow you to test a hypothesis by systematically collecting and analysing data, while qualitative methods allow you to explore ideas and experiences in depth.
A sample is a subset of individuals from a larger population. Sampling means selecting the group that you will actually collect data from in your research.
For example, if you are researching the opinions of students in your university, you could survey a sample of 100 students.
Statistical sampling allows you to test a hypothesis about the characteristics of a population. There are various sampling methods you can use to ensure that your sample is representative of the population as a whole.
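Drawing such a sample programmatically might look like the sketch below (the roster and seed are hypothetical). Fixing the seed makes the draw reproducible, which is worth reporting in a methods section:

```python
# Sketch: a simple random sample of 100 students from a hypothetical
# enrolment list, seeded so the draw can be reproduced and audited.
import random

population = [f"student_{i:04d}" for i in range(8000)]  # hypothetical roster

random.seed(42)  # recorded seed -> reproducible selection
sample = random.sample(population, k=100)

print(len(sample), len(set(sample)))  # 100 distinct students
```

More elaborate schemes (stratified, cluster, systematic sampling) follow the same pattern: an explicit, documented procedure standing between the population and the data you actually collect.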
McCombes, S. (2022, October 10). What Is a Research Methodology? | Steps & Tips. Scribbr. Retrieved 12 August 2024, from https://www.scribbr.co.uk/thesis-dissertation/methodology/
Published by Nicolas on March 21st, 2024; revised on March 12, 2024
Research methodology is a crucial aspect of any investigative process, serving as the blueprint for the entire research journey. If you are stuck in the methodology section of your research paper , then this blog will guide you on what is a research methodology, its types and how to successfully conduct one.
Table of Contents
Research methodology can be defined as the systematic framework that guides researchers in designing, conducting, and analyzing their investigations. It encompasses a structured set of processes, techniques, and tools employed to gather and interpret data, ensuring the reliability and validity of the research findings.
Research methodology is not confined to a singular approach; rather, it encapsulates a diverse range of methods tailored to the specific requirements of the research objectives.
Here is why Research methodology is important in academic and professional settings.
Research methodology forms the backbone of rigorous inquiry. It provides a structured approach that aids researchers in formulating precise thesis statements , selecting appropriate methodologies, and executing systematic investigations. This, in turn, enhances the quality and credibility of the research outcomes.
In both academic and professional contexts, the ability to reproduce research outcomes is paramount. A well-defined research methodology establishes clear procedures, making it possible for others to replicate the study. This not only validates the findings but also contributes to the cumulative nature of knowledge.
In professional settings, decisions often hinge on reliable data and insights. Research methodology equips professionals with the tools to gather pertinent information, analyze it rigorously, and derive meaningful conclusions.
This informed decision-making is instrumental in achieving organizational goals and staying ahead in competitive environments.
For academic researchers, adherence to robust research methodology is a hallmark of excellence. Institutions value research that adheres to high standards of methodology, fostering a culture of academic rigour and intellectual integrity. Furthermore, it prepares students with critical skills applicable beyond academia.
Research methodology instills a problem-solving mindset by encouraging researchers to approach challenges systematically. It equips individuals with the skills to dissect complex issues, formulate hypotheses , and devise effective strategies for investigation.
In the pursuit of knowledge and discovery, understanding the fundamentals of research methodology is paramount.
Research, in its essence, is a systematic and organized process of inquiry aimed at expanding our understanding of a particular subject or phenomenon. It involves the exploration of existing knowledge, the formulation of hypotheses, and the collection and analysis of data to draw meaningful conclusions.
Research is a dynamic and iterative process that contributes to the continuous evolution of knowledge in various disciplines.
Research takes on various forms, each tailored to the nature of the inquiry. Broadly classified, research can be categorized into two main types:
To conduct effective research, one must go through the different components of research methodology. These components form the scaffolding that supports the entire research process, ensuring its coherence and validity.
Research design serves as the blueprint for the entire research project. It outlines the overall structure and strategy for conducting the study. The three primary types of research design are exploratory, descriptive, and explanatory.
Choosing the right data collection methods is crucial for obtaining reliable and relevant information. Common methods include:
Once data is collected, analysis becomes imperative to derive meaningful conclusions. Different methodologies exist for quantitative and qualitative data:
Selecting an appropriate research method is a critical decision in the research process. It determines the approach, tools, and techniques that will be used to answer the research questions.
Quantitative research involves the collection and analysis of numerical data, providing a structured and objective approach to understanding and explaining phenomena.
Experimental research involves manipulating variables to observe the effect on another variable under controlled conditions. It aims to establish cause-and-effect relationships.
Applications: Commonly used in scientific studies and psychology to test hypotheses and identify causal relationships.
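The cause-and-effect comparison at the heart of a simple two-group experiment is commonly tested with a two-sample t statistic. A sketch with invented scores, computing a pooled t statistic by hand and comparing it to the tabulated 5% two-tailed critical value:

```python
# Sketch of the analysis behind a simple two-group experiment (hypothetical
# scores for a control and a treatment group of 8 participants each).
from statistics import mean, variance

control   = [70, 68, 75, 71, 69, 73, 72, 70]
treatment = [78, 74, 80, 77, 75, 79, 76, 81]

n1, n2 = len(control), len(treatment)
# Pooled variance assumes similar spread in both groups (an assumption to check).
pooled_var = ((n1 - 1) * variance(control) + (n2 - 1) * variance(treatment)) / (n1 + n2 - 2)
t_stat = (mean(treatment) - mean(control)) / (pooled_var * (1 / n1 + 1 / n2)) ** 0.5

critical_95 = 2.145  # two-tailed critical t, df = 14, alpha = 0.05
print(f"t = {t_stat:.2f}, significant at 5%: {abs(t_stat) > critical_95}")
```

In a real write-up, random assignment to the two groups is what licenses the causal reading of a significant difference; the arithmetic alone does not.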
Survey research gathers information from a sample of individuals through standardized questionnaires or interviews. It aims to collect data on opinions, attitudes, and behaviours.
Applications: Widely employed in social sciences, marketing, and public opinion research to understand trends and preferences.
Descriptive research seeks to portray an accurate profile of a situation or phenomenon. It focuses on answering the ‘what,’ ‘who,’ ‘where,’ and ‘when’ questions.
Applications: Useful in situations where researchers want to understand and describe a phenomenon without altering it, common in social sciences and education.
Qualitative research emphasizes exploring and understanding the depth and complexity of phenomena through non-numerical data.
A case study is an in-depth exploration of a particular person, group, event, or situation. It involves detailed, context-rich analysis.
Applications: Common in social sciences, psychology, and business to investigate complex and specific instances.
Ethnography involves immersing the researcher in the culture or community being studied to gain a deep understanding of their behaviours, beliefs, and practices.
Applications: Widely used in anthropology, sociology, and cultural studies to explore and document cultural practices.
Grounded theory aims to develop theories grounded in the data itself. It involves systematic data collection and analysis to construct theories from the ground up.
Applications: Commonly applied in sociology, nursing, and management studies to generate theories from empirical data.
Research design is the structural framework that outlines the systematic process and plan for conducting a study. It serves as the blueprint, guiding researchers on how to collect, analyze, and interpret data.
Exploratory design.
Exploratory research design is employed when a researcher aims to explore a relatively unknown subject or gain insights into a complex phenomenon.
Applications: Valuable in the early stages of investigation, especially when the researcher seeks a deeper understanding of a subject before formalizing research questions.
Descriptive research design focuses on portraying an accurate profile of a situation, group, or phenomenon.
Applications: Widely used in social sciences, marketing, and educational research to provide detailed and objective descriptions.
Explanatory research design aims to identify the causes and effects of a phenomenon, explaining the ‘why’ and ‘how’ behind observed relationships.
Applications: Commonly employed in scientific studies and social sciences to delve into the underlying reasons behind observed patterns.
Cross-sectional design.
Cross-sectional designs collect data from participants at a single point in time.
Applications: Suitable for studying characteristics or behaviours that are stable or not expected to change rapidly.
Longitudinal designs involve the collection of data from the same participants over an extended period.
Applications: Ideal for studying developmental processes, trends, or the impact of interventions over time.
Experimental design.
Experimental designs involve manipulating variables under controlled conditions to observe the effect on another variable.
Applications: Commonly used in scientific studies, psychology, and medical research to establish causal relationships.
Non-experimental designs observe and describe phenomena without manipulating variables.
Applications: Suitable for studying complex phenomena in real-world settings where manipulation may not be ethical or feasible.
Effective data collection is fundamental to the success of any research endeavour.
Objective Design:
Structured Format:
Pilot Testing:
Sampling Strategy:
Establishing Rapport:
Open-Ended Questions:
Active Listening:
Ethical Considerations:
1. Participant observation.
Immersive Participation:
Field Notes:
Ethical Awareness:
Objective Observation:
Data Reliability:
Contextual Understanding:
1. Using existing data.
Identifying Relevant Archives:
Data Verification:
Ethical Use:
Incomplete or Inaccurate Archives:
Temporal Bias:
Access Limitations:
Conducting research is a complex and dynamic process, often accompanied by a myriad of challenges. Addressing these challenges is crucial to ensure the reliability and validity of research findings.
Sampling bias:
Measurement error:
Timeline pressures:
Selection bias:
Conducting successful research relies not only on the application of sound methodologies but also on strategic planning and effective collaboration. Here are some tips to enhance the success of your research methodology:
Well-defined research objectives guide the entire research process. Clearly articulate the purpose of your study, outlining specific research questions or hypotheses.
A thorough literature review provides a foundation for understanding existing knowledge and identifying gaps. Invest time in reviewing relevant literature to inform your research design and methodology.
A detailed plan serves as a roadmap, ensuring all aspects of the research are systematically addressed. Develop a detailed research plan outlining timelines, milestones, and tasks.
Ethical practices are fundamental to maintaining the integrity of research. Address ethical considerations early, obtain necessary approvals, and ensure participant rights are safeguarded.
Research methodologies evolve, and staying updated is essential for employing the most effective techniques. Engage in continuous learning by attending workshops, conferences, and reading recent publications.
Unforeseen challenges may arise during research, necessitating adaptability in methods. Be flexible and willing to modify your approach when needed, ensuring the integrity of the study.
Research is often an iterative process, and refining methods based on ongoing findings enhances the study's robustness. Regularly review and refine your research design and methods as the study progresses.
What is research methodology?
Research methodology is the systematic process of planning, executing, and evaluating scientific investigation. It encompasses the techniques, tools, and procedures used to collect, analyze, and interpret data, ensuring the reliability and validity of research findings.
Research methodologies include qualitative and quantitative approaches. Qualitative methods involve in-depth exploration of non-numerical data, while quantitative methods use statistical analysis to examine numerical data. Mixed methods combine both approaches for a comprehensive understanding of research questions.
To write a research methodology, clearly outline the study’s design, data collection, and analysis procedures. Specify research tools, participants, and sampling methods. Justify choices and discuss limitations. Ensure clarity, coherence, and alignment with research objectives for a robust methodology section.
In the methodology section of a research paper, describe the study’s design, data collection, and analysis methods. Detail procedures, tools, participants, and sampling. Justify choices, address ethical considerations, and explain how the methodology aligns with research objectives, ensuring clarity and rigour.
Mixed research methodology combines both qualitative and quantitative research approaches within a single study. This approach aims to enhance the details and depth of research findings by providing a more comprehensive understanding of the research problem or question.
Research methodology involves a systematic and well-structured approach to conducting scholarly or scientific inquiries. Knowing the significance of research methodology and its different components is crucial as it serves as the basis for any study.
Typically, your research topic will start as a broad idea you want to investigate more thoroughly. Once you’ve identified a research problem and created research questions , you must choose the appropriate methodology and frameworks to address those questions effectively.
Research methodology is the process or the way you intend to execute your study. The methodology section of a research paper outlines how you plan to conduct your study. It covers various steps such as collecting data, statistical analysis, observing participants, and other procedures involved in the research process.
The methods section should describe the process that converts your idea into a study. Additionally, the outcomes of your process must provide valid and reliable results consistent with the aims and objectives of your research. This rule of thumb holds regardless of whether your paper leans qualitative or quantitative.
Studying research methods used in related studies can provide helpful insights and direction for your own research.
While deciding on your approach towards your research, the reason or factors you weighed in choosing a particular problem and formulating a research topic need to be validated and explained. A research methodology helps you do exactly that. Moreover, a good research methodology lets you build your argument to validate your research work performed through various data collection methods, analytical methods, and other essential points.
Just imagine it as a strategy documented to provide an overview of what you intend to do.
While writing up or performing the research itself, you may drift toward something of little importance. In such a case, a research methodology helps you return to your outlined plan of work.
A research methodology helps in keeping you accountable for your work. Additionally, it can help you evaluate whether your work is in sync with your original aims and objectives or not. Besides, a good research methodology enables you to navigate your research process smoothly and swiftly while providing effective planning to achieve your desired results.
Usually, you must ensure to include the following stated aspects while deciding over the basic structure of your research methodology:
Explain what research methods you’re going to use. Whether you intend to proceed with quantitative or qualitative, or a composite of both approaches, you need to state that explicitly. The option among the three depends on your research’s aim, objectives, and scope.
Based on logic and reason, let your readers know why you have chosen said research methodologies. Additionally, you have to build strong arguments supporting why your chosen research method is the best way to achieve the desired outcome.
The mechanism encompasses the research methods or instruments you will use to develop your research methodology. It usually refers to your data collection methods. You can use interviews, surveys, physical questionnaires, and many other available mechanisms as research instruments. The data collection method is determined by the type of research and whether the data is quantitative (numerical) or qualitative (perceptions, morale, etc.). Moreover, you need to put logical reasoning behind choosing a particular instrument.
The results will be available once you have finished experimenting. However, you should also explain how you plan to use the data to interpret the findings. This section also aids in understanding the problem from within, breaking it down into pieces, and viewing the research problem from various perspectives.
Anything that you feel must be explained to spread more awareness among readers and focus groups must be included and described in detail. You should not just specify your research methodology on the assumption that a reader is aware of the topic.
All the relevant information that explains and simplifies your research paper must be included in the methodology section. If you are conducting your research in a non-traditional manner, give a logical justification and list its benefits.
Include information about the sample and sample space in the methodology section. The term "sample" refers to a smaller set of data that a researcher selects from a larger group of people or focus groups using a predetermined selection method. Let your readers know how you will distinguish relevant from non-relevant samples, and discuss thoroughly how you arrived at the exact numbers, i.e., the sample size, that back your research methodology.
For example, if you are going to conduct a survey or interview, then by what procedure will you select the interviewees (or sample size in case of surveys), and how exactly will the interview or survey be conducted.
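One defensible selection procedure is an explicit eligibility rule followed by a seeded random draw, so that both the rule and the randomness can be reported. A sketch with entirely hypothetical survey records:

```python
# Sketch: selecting up to 8 interviewees from hypothetical survey respondents.
# Eligibility rule and data are invented for illustration.
import random

respondents = [
    {"id": i, "visits_per_week": random.Random(i).randint(0, 4)}
    for i in range(60)
]

eligible = [r for r in respondents if r["visits_per_week"] >= 2]  # "returning"
random.seed(7)  # record the seed so the selection is reproducible
interviewees = random.sample(eligible, k=min(8, len(eligible)))

print(f"{len(eligible)} eligible, {len(interviewees)} selected")
```

Documenting the rule, the seed, and the counts at each stage is exactly the kind of detail the methodology section asks for.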
This part, which is frequently assumed to be unnecessary, is actually very important. The challenges and limitations inherent in your chosen strategy must be specified, whatever type of research you are conducting.
You must have observed that all research papers, dissertations, or theses carry a chapter entirely dedicated to research methodology. This section helps maintain your credibility as a better interpreter of results rather than a manipulator.
A good research methodology always explains the procedure, data collection methods and techniques, aim, and scope of the research. In a research study, it leads to a well-organized, rationality-based approach, while the paper lacking it is often observed as messy or disorganized.
You should pay special attention to validating your chosen way towards the research methodology. This becomes extremely important in case you select an unconventional or a distinct method of execution.
Curating and developing a strong, effective research methodology can help you address a wide variety of research situations.
As a researcher, you must choose the tools and data collection methods that best fit the aims of your research; this decision has to be made wisely.
Many research instruments and tools are available for carrying out the research process. They can be classified as follows:
An interview aimed at obtaining your desired research outcomes can be conducted in many ways: you can design it as structured, semi-structured, or unstructured, the difference being the degree of formality of the questions. In a group interview, by contrast, your aim should be to collect opinions and group perceptions from focus groups on a given topic rather than formal individual answers.
With surveys, you are in greater control because you draft the exact questions you want answered. For example, you may include open-ended questions that can be answered descriptively, provide multiple-choice responses, or combine both approaches, depending on what suits your research process and purpose.
As with group interviews, in a focus group you select a set of individuals and assign them a topic to discuss or to express their opinions on freely. You can note down the responses as they occur and organize them later, deciding on the relevance of each.
If your research domain is the humanities or sociology, observation is a well-proven method for building your research methodology. You can study participants’ spontaneous responses to a situation, or conduct the observation in a more structured manner: a structured observation places participants in a situation at a predetermined time and then studies their responses.
Of all the tools described above, it is up to you to choose the instruments that best fit your research. Do not restrict yourself to a single method; a combination of instruments may be appropriate for drafting a good research methodology.
A research methodology exists in various forms. Depending upon their approach, whether centered around words, numbers, or both, methodologies are distinguished as qualitative, quantitative, or an amalgamation of both.
When a research methodology primarily focuses on words and textual data, then it is generally referred to as qualitative research methodology. This type is usually preferred among researchers when the aim and scope of the research are mainly theoretical and explanatory.
The typical instruments are observations, interviews, and focus groups. You can use this methodology if you are studying human behavior or responses in particular situations. Qualitative research methodology is widely used in sociology, psychology, and related domains.
If your research centers mainly on data, figures, and statistics, then the analysis of such numerical data is referred to as quantitative research methodology. Use quantitative research methodology if your research requires you to validate or justify the results you obtain.
In quantitative methods, surveys, tests, experiments, and evaluations of existing databases can be used as instruments. If your research involves testing a hypothesis, use this methodology.
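As a concrete quantitative example, the before/after book-sales study mentioned earlier could be analyzed with a paired t statistic, one common choice for paired measurements (the sales figures below are invented for illustration, and the source does not prescribe this particular test):

```python
from math import sqrt
from statistics import mean, stdev

def paired_t_statistic(before, after):
    """Paired t statistic for before/after measurements on the same units."""
    diffs = [a - b for a, b in zip(after, before)]
    n = len(diffs)
    # t = mean difference divided by its standard error
    return mean(diffs) / (stdev(diffs) / sqrt(n))

# hypothetical monthly book sales before and after a promotion technique
before = [120, 135, 110, 150, 128, 142]
after  = [138, 150, 125, 160, 140, 155]
t = paired_t_statistic(before, after)  # compare against a t distribution with n-1 df
```

The resulting statistic would then be compared against a t distribution with n-1 degrees of freedom to judge whether the promotion's effect is statistically significant.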
As the name suggests, the amalgam methodology uses both quantitative and qualitative approaches. This methodology is used when a part of the research requires you to verify the facts and figures, whereas the other part demands you to discover the theoretical and explanatory nature of the research question.
The instruments for the amalgam methodology require you to conduct interviews and surveys, including tests and experiments. The outcome of this methodology can be insightful and valuable as it provides precise test results in line with theoretical explanations and reasoning.
The amalgam method makes your work both factual and rational at the same time.
If you have stayed attentive to the aims and scope of your research, you should by now have an idea of which research methodology suits your work best.
Before deciding which research methodology answers your research question, invest significant time in reading and background work. Consulting references that yielded relevant results should be your first step in establishing a research methodology.
Moreover, never refrain from exploring other options. Before setting your work in stone, weigh all the available options; doing so lets you explain why your final choice of research methodology is more appropriate than the alternatives.
Choose a quantitative research methodology if your research requires gathering large amounts of data, figures, and statistics; it will serve you well if your paper involves validating a hypothesis.
If, instead, you are looking for explanations, reasons, opinions, and public perceptions around a theory, use a qualitative research methodology. The choice of an appropriate research methodology ultimately depends on what you want to achieve through your research.
1. How do you write a research methodology?
You can provide a separate research methodology section in which you specify the methods and instruments used during the research, discuss the analysis of results, provide relevant background information, and convey the research limitations.
There are generally four types of research methodology:
The set of techniques and procedures followed to discover and analyze the information gathered to validate or justify a research outcome is generally called research methodology.
Your research methodology directly reflects the validity of your research outcomes and how well-informed your research work is. Moreover, it can help future researchers cite or refer to your research if they plan to use a similar research methodology.
Writing a research paper is both an art and a skill, and knowing how to write the methods section of a research paper is the first crucial step in mastering scientific writing. If, like the majority of early career researchers, you believe that the methods section is the simplest to write and needs little in the way of careful consideration or thought, this article will help you understand that it is not [1].
We have all probably asked our supervisors, coworkers, or search engines “ how to write a methods section of a research paper ” at some point in our scientific careers, so you are not alone if that’s how you ended up here. Even for seasoned researchers, selecting what to include in the methods section from a wealth of experimental information can occasionally be a source of distress and perplexity.
Additionally, journal specifications, in some cases, may make it more of a requirement than a choice to provide a selective yet descriptive account of the experimental procedure. Hence, knowing these nuances of how to write the methods section of a research paper is critical to its success. The methods section is not supposed to be the detail-heavy, dull section that some researchers tend to write; rather, it should be a central component of the study that justifies the validity and reliability of the research.
Are you still unsure of how the methods section of a research paper forms the basis of every investigation? Consider the last article you read, but ignore the methods section and concentrate on the other parts of the paper. Now ask whether you could repeat the study and be sure of the credibility of the findings, despite knowing the literature review and even having the data in front of you. You have the answer!
Having established the importance of the methods section, the next question is how to write a methods section that unifies the overall study. The purpose of the methods section, earlier called Materials and Methods, is to describe how the authors went about answering the research question at hand. The objective is to tell a coherent story that gives a detailed account of how the study was conducted, the rationale behind specific experimental procedures, the experimental setup, the variables involved, the research protocol employed, the tools used for measurement, the calculations and measurements, and the analysis of the collected data [2].
In this article, we will take a deep dive into this topic and provide a detailed overview of how to write the methods section of a research paper . For the sake of clarity, we have separated the subject into various sections with corresponding subheadings.
Table of Contents
The methods section is a fundamental part of any paper, since it typically discusses the ‘what’, ‘how’, ‘which’, and ‘why’ of the study, which is necessary to arrive at the final conclusions. In a research article, the introduction, which sets the foundation for comprehending the background and results, is usually followed by the methods section, which precedes the results and discussion sections. The methods section must explicitly state what was done, how it was done, which equipment, tools, and techniques were utilized, how the measurements and calculations were taken, and why specific research protocols, software, and analytical methods were employed.
The primary goal of the methods section is to provide pertinent details about the experimental approach so that the reader may put the results in perspective and, if necessary, replicate the findings [3]. This section offers readers the chance to evaluate the reliability and validity of any study. In short, it also serves as the study’s blueprint, helping readers who might be unsure about any other portion to establish the study’s context and validity. The methods section plays a crucial role in determining the fate of the article; an incomplete or unreliable methods section can result in early rejection and may lead to numerous rounds of modifications during the publication process. Reviewers often use the methods section to assess the reliability and validity of the research protocol and the data analysis employed to address the research topic. In other words, the purpose of the methods section is to demonstrate the research acumen and subject-matter expertise of the author(s) in their field.
Similar to the research paper, the methods section also follows a defined structure; this may be dictated by the guidelines of a specific journal or can be presented in a chronological or thematic manner based on the study type. When writing the methods section , authors should keep in mind that they are telling a story about how the research was conducted. They should only report relevant information to avoid confusing the reader and include details that would aid in connecting various aspects of the entire research activity together. It is generally advisable to present experiments in the order in which they were conducted. This facilitates the logical flow of the research and allows readers to follow the progression of the study design.
It is also essential to clearly state the rationale behind each experiment and how the findings of earlier experiments informed the design or interpretation of later experiments. This allows the readers to understand the overall purpose of the study design and the significance of each experiment within that context. However, depending on the particular research question and method, it may make sense to present information in a different order; therefore, authors must select the best structure and strategy for their individual studies.
In cases where there is a lot of information, divide the sections into subheadings to cover the pertinent details. If the journal guidelines pose restrictions on the word limit , additional important information can be supplied in the supplementary files. A simple rule of thumb for sectioning the method section is to begin by explaining the methodological approach ( what was done ), describing the data collection methods ( how it was done ), providing the analysis method ( how the data was analyzed ), and explaining the rationale for choosing the methodological strategy. This is described in detail in the upcoming sections.
Contrary to widespread assumption, the methods section of a research paper should be prepared once the study is complete to prevent missing any key parameter. Hence, please make sure that all relevant experiments are done before you start writing a methods section . The next step for authors is to look up any applicable academic style manuals or journal-specific standards to ensure that the methods section is formatted correctly. The methods section of a research paper typically constitutes materials and methods; while writing this section, authors usually arrange the information under each category.
The materials category describes the samples, materials, treatments, and instruments, while experimental design, sample preparation, data collection, and data analysis are a part of the method category. According to the nature of the study, authors should include additional subsections within the methods section, such as ethical considerations like the Declaration of Helsinki (for studies involving human subjects), demographic information of the participants, and any other crucial information that can affect the output of the study. Simply put, the methods section has two major components: content and format. Here is an easy checklist for you to consider if you are struggling with how to write the methods section of a research paper.
Now that you know how to write the methods section of a research paper , let’s address another challenge researchers face while writing the methods section —what to include in the methods section . How much information is too much is not always obvious when it comes to trying to include data in the methods section of a paper. In the next section, we examine this issue and explore potential solutions.
The technical nature of the methods section occasionally makes it harder to present the information clearly and concisely while staying within the study context. Many young researchers tend to veer off subject significantly and frequently become bogged down in minute details, making the text harder to read and impairing its overall flow. However, the best way to write the methods section is to start with the crucial components of the experiments. If you have trouble deciding which elements are essential, think about leaving out only those whose absence would not make it harder to comprehend the context or replicate the results. This top-down approach helps ensure all relevant information is incorporated and vital information is not lost in technicalities. Next, remember to add details that are significant for assessing the validity and reliability of the study. Here is a simple checklist for you to follow (bonus tip: you can also make a checklist for your own study to avoid missing any critical information while writing the methods section).
To address how to write the methods section of a research paper, authors should pay careful attention not only to what to include but also to what not to include. Here is a list of don’ts when writing the methods section:
We hope that by this point, you understand how crucial it is to write a thoughtful and precise methods section and the ins and outs of how to write the methods section of a research paper . To restate, the entire purpose of the methods section is to enable others to reproduce the results or verify the research. We sincerely hope that this post has cleared up any confusion and given you a fresh perspective on the methods section .
As a parting gift, we’re leaving you with a handy checklist that will help you understand how to write the methods section of a research paper . Feel free to download this checklist and use or share this with those who you think may benefit from it.
Last updated: May 27, 2024.
This article was co-authored by Alexander Ruiz, M.Ed., and wikiHow staff writer Jennifer Mueller, JD. Alexander Ruiz is an Educational Consultant and the Educational Director of Link Educational Institute, a tutoring business based in Claremont, California, that provides customizable educational plans, subject and test prep tutoring, and college application consulting. With over fifteen years of experience in the education industry, Alexander coaches students to increase their self-awareness and emotional intelligence while building skills and pursuing higher education. He holds a BA in Psychology from Florida International University and an MA in Education from Georgia Southern University. This article has been viewed 527,700 times.
The research methodology section of any academic research paper gives you the opportunity to convince your readers that your research is useful and will contribute to your field of study. An effective research methodology is grounded in your overall approach, whether qualitative or quantitative, and adequately describes the methods you used. Justify why you chose those methods over others, then explain how those methods will provide answers to your research questions. [1]
To write a research methodology, start with a section that outlines the problems or questions you'll be studying, including your hypotheses or whatever it is you're setting out to prove. Then, briefly explain why you chose to use either a qualitative or quantitative approach for your study. Next, go over when and where you conducted your research and what parameters you used to ensure you were objective. Finally, cite any sources you used to decide on the methodology for your research.
Lawrence Mbuagbaw
1 Department of Health Research Methods, Evidence and Impact, McMaster University, Hamilton, ON Canada
2 Biostatistics Unit/FSORC, 50 Charlton Avenue East, St Joseph’s Healthcare—Hamilton, 3rd Floor Martha Wing, Room H321, Hamilton, Ontario L8N 4A6 Canada
3 Centre for the Development of Best Practices in Health, Yaoundé, Cameroon
Livia Puljak
4 Center for Evidence-Based Medicine and Health Care, Catholic University of Croatia, Ilica 242, 10000 Zagreb, Croatia
5 Department of Epidemiology and Biostatistics, School of Public Health – Bloomington, Indiana University, Bloomington, IN 47405 USA
6 Departments of Paediatrics and Anaesthesia, McMaster University, Hamilton, ON Canada
7 Centre for Evaluation of Medicine, St. Joseph’s Healthcare-Hamilton, Hamilton, ON Canada
8 Population Health Research Institute, Hamilton Health Sciences, Hamilton, ON Canada
Data sharing is not applicable to this article as no new data were created or analyzed in this study.
Methodological studies – studies that evaluate the design, analysis or reporting of other research-related reports – play an important role in health research. They help to highlight issues in the conduct of research with the aim of improving health research methodology, and ultimately reducing research waste.
We provide an overview of some of the key aspects of methodological studies such as what they are, and when, how and why they are done. We adopt a “frequently asked questions” format to facilitate reading this paper and provide multiple examples to help guide researchers interested in conducting methodological studies. Some of the topics addressed include: is it necessary to publish a study protocol? How to select relevant research reports and databases for a methodological study? What approaches to data extraction and statistical analysis should be considered when conducting a methodological study? What are potential threats to validity and is there a way to appraise the quality of methodological studies?
Appropriate reflection and application of basic principles of epidemiology and biostatistics are required in the design and analysis of methodological studies. This paper provides an introduction for further discussion about the conduct of methodological studies.
The field of meta-research (or research-on-research) has proliferated in recent years in response to issues with research quality and conduct [ 1 – 3 ]. As the name suggests, this field targets issues with research design, conduct, analysis and reporting. Various types of research reports are often examined as the unit of analysis in these studies (e.g. abstracts, full manuscripts, trial registry entries). Like many other novel fields of research, meta-research has seen a proliferation of use before the development of reporting guidance. For example, this was the case with randomized trials for which risk of bias tools and reporting guidelines were only developed much later – after many trials had been published and noted to have limitations [ 4 , 5 ]; and for systematic reviews as well [ 6 – 8 ]. However, in the absence of formal guidance, studies that report on research differ substantially in how they are named, conducted and reported [ 9 , 10 ]. This creates challenges in identifying, summarizing and comparing them. In this tutorial paper, we will use the term methodological study to refer to any study that reports on the design, conduct, analysis or reporting of primary or secondary research-related reports (such as trial registry entries and conference abstracts).
In the past 10 years, there has been an increase in the use of terms related to methodological studies (based on records retrieved with a keyword search [in the title and abstract] for “methodological review” and “meta-epidemiological study” in PubMed up to December 2019), suggesting that these studies may be appearing more frequently in the literature. See Fig. 1.
Fig. 1 Trends in the number of studies that mention “methodological review” or “meta-epidemiological study” in PubMed.
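A per-year trend count like this can be assembled programmatically. The sketch below only builds one NCBI E-utilities `esearch` query URL per year; the endpoint and parameter names follow NCBI's public E-utilities documentation, but verify them against the current API (and add an API key and rate limiting) before fetching the counts:

```python
from urllib.parse import urlencode

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def esearch_url(term, year):
    """Build a PubMed esearch URL that returns the record count for a term in one year."""
    params = {
        "db": "pubmed",
        "term": f'"{term}"[Title/Abstract]',  # phrase search in title/abstract
        "mindate": str(year),
        "maxdate": str(year),
        "datetype": "pdat",   # filter on publication date
        "retmode": "json",
        "rettype": "count",   # return only the hit count
    }
    return f"{EUTILS}?{urlencode(params)}"

# one query URL per publication year, 2010-2019
urls = [esearch_url("methodological review", y) for y in range(2010, 2020)]
# fetch each URL (e.g. with urllib.request) and read esearchresult["count"]
```

Plotting the ten counts against year would reproduce the kind of trend shown in Fig. 1, though the exact search string used by the authors is not given here.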
The methods used in many methodological studies have been borrowed from systematic and scoping reviews. This practice has influenced the direction of the field, with many methodological studies including searches of electronic databases, screening of records, duplicate data extraction and assessments of risk of bias in the included studies. However, the research questions posed in methodological studies do not always require the approaches listed above, and guidance is needed on when and how to apply these methods to a methodological study. Even though methodological studies can be conducted on qualitative or mixed methods research, this paper focuses on and draws examples exclusively from quantitative research.
The objectives of this paper are to provide some insights on how to conduct methodological studies so that there is greater consistency between the research questions posed, and the design, analysis and reporting of findings. We provide multiple examples to illustrate concepts and a proposed framework for categorizing methodological studies in quantitative research.
Any study that describes or analyzes methods (design, conduct, analysis or reporting) in published (or unpublished) literature is a methodological study. Consequently, the scope of methodological studies is quite extensive and includes, but is not limited to, topics as diverse as: research question formulation [ 11 ]; adherence to reporting guidelines [ 12 – 14 ] and consistency in reporting [ 15 ]; approaches to study analysis [ 16 ]; investigating the credibility of analyses [ 17 ]; and studies that synthesize these methodological studies [ 18 ]. While the nomenclature of methodological studies is not uniform, the intents and purposes of these studies remain fairly consistent – to describe or analyze methods in primary or secondary studies. As such, methodological studies may also be classified as a subtype of observational studies.
Parallel to this are experimental studies that compare different methods. Even though they play an important role in informing optimal research methods, experimental methodological studies are beyond the scope of this paper. Examples of such studies include the randomized trials by Buscemi et al., comparing single data extraction to double data extraction [ 19 ], and Carrasco-Labra et al., comparing approaches to presenting findings in Grading of Recommendations, Assessment, Development and Evaluations (GRADE) summary of findings tables [ 20 ]. In these studies, the unit of analysis is the person or groups of individuals applying the methods. We also direct readers to the Studies Within a Trial (SWAT) and Studies Within a Review (SWAR) programme operated through the Hub for Trials Methodology Research, for further reading as a potential useful resource for these types of experimental studies [ 21 ]. Lastly, this paper is not meant to inform the conduct of research using computational simulation and mathematical modeling for which some guidance already exists [ 22 ], or studies on the development of methods using consensus-based approaches.
Methodological studies occupy a unique niche in health research that allows them to inform methodological advances. Methodological studies should also be conducted as pre-cursors to reporting guideline development, as they provide an opportunity to understand current practices, and help to identify the need for guidance and gaps in methodological or reporting quality. For example, the development of the popular Preferred Reporting Items of Systematic reviews and Meta-Analyses (PRISMA) guidelines were preceded by methodological studies identifying poor reporting practices [ 23 , 24 ]. In these instances, after the reporting guidelines are published, methodological studies can also be used to monitor uptake of the guidelines.
These studies can also be conducted to inform the state of the art for design, analysis and reporting practices across different types of health research fields, with the aim of improving research practices, and preventing or reducing research waste. For example, Samaan et al. conducted a scoping review of adherence to different reporting guidelines in health care literature [ 18 ]. Methodological studies can also be used to determine the factors associated with reporting practices. For example, Abbade et al. investigated journal characteristics associated with the use of the Participants, Intervention, Comparison, Outcome, Timeframe (PICOT) format in framing research questions in trials of venous ulcer disease [ 11 ].
There is no clear answer to this question. Based on a search of PubMed, the use of related terms (“methodological review” and “meta-epidemiological study”) – and therefore, the number of methodological studies – is on the rise. However, many other terms are used to describe methodological studies. There are also many studies that explore design, conduct, analysis or reporting of research reports, but that do not use any specific terms to describe or label their study design in terms of “methodology”. This diversity in nomenclature makes a census of methodological studies elusive. Appropriate terminology and key words for methodological studies are needed to facilitate improved accessibility for end-users.
Methodological studies provide information on the design, conduct, analysis or reporting of primary and secondary research and can be used to appraise quality, quantity, completeness, accuracy and consistency of health research. These issues can be explored in specific fields, journals, databases, geographical regions and time periods. For example, Areia et al. explored the quality of reporting of endoscopic diagnostic studies in gastroenterology [ 25 ]; Knol et al. investigated the reporting of p -values in baseline tables in randomized trials published in high impact journals [ 26 ]; Chen et al. describe adherence to the Consolidated Standards of Reporting Trials (CONSORT) statement in Chinese Journals [ 27 ]; and Hopewell et al. describe the effect of editors’ implementation of CONSORT guidelines on reporting of abstracts over time [ 28 ]. Methodological studies provide useful information to researchers, clinicians, editors, publishers and users of health literature. As a result, these studies have been a cornerstone of important methodological developments in the past two decades and have informed the development of many health research guidelines including the highly cited CONSORT statement [ 5 ].
Methodological studies can be found in most common biomedical bibliographic databases (e.g. Embase, MEDLINE, PubMed, Web of Science). However, the biggest caveat is that methodological studies are hard to identify in the literature due to the wide variety of names used and the lack of comprehensive databases dedicated to them. A handful can be found in the Cochrane Library as “Cochrane Methodology Reviews”, but these studies only cover methodological issues related to systematic reviews. Previous attempts to catalogue all empirical studies of methods used in reviews were abandoned 10 years ago [ 29 ]. In other databases, a variety of search terms may be applied with different levels of sensitivity and specificity.
In this section, we have outlined responses to questions that might help inform the conduct of methodological studies.
Q: How should I select research reports for my methodological study?
A: Selection of research reports for a methodological study depends on the research question and eligibility criteria. Once a clear research question is set and the nature of literature one desires to review is known, one can then begin the selection process. Selection may begin with a broad search, especially if the eligibility criteria are not apparent. For example, a methodological study of Cochrane Reviews of HIV would not require a complex search as all eligible studies can easily be retrieved from the Cochrane Library after checking a few boxes [ 30 ]. On the other hand, a methodological study of subgroup analyses in trials of gastrointestinal oncology would require a search to find such trials, and further screening to identify trials that conducted a subgroup analysis [ 31 ].
The strategies used for identifying participants in observational studies can apply here. One may use a systematic search to identify all eligible studies. If the number of eligible studies is unmanageable, a random sample of articles can be expected to provide comparable results if it is sufficiently large [ 32 ]. For example, Wilson et al. used a random sample of trials from the Cochrane Stroke Group’s Trial Register to investigate completeness of reporting [ 33 ]. It is possible that a simple random sample would lead to underrepresentation of units (i.e. research reports) that are smaller in number. This is relevant if the investigators wish to compare multiple groups but have too few units in one group. In this case a stratified sample would help to create equal groups. For example, in a methodological study comparing Cochrane and non-Cochrane reviews, Kahale et al. drew random samples from both groups [ 34 ]. Alternatively, systematic or purposeful sampling strategies can be used and we encourage researchers to justify their selected approaches based on the study objective.
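The simple and stratified sampling approaches described above can be sketched in a few lines of Python. This is an illustrative sketch only: the function and variable names are our own, and `report_ids` stands in for whatever identifiers (e.g. PMIDs) make up the sampling frame.

```python
import random

def draw_sample(report_ids, k, strata=None, seed=42):
    """Draw a simple random sample of k reports, or (if strata is given)
    a stratified sample of k reports per stratum.

    report_ids: list of report identifiers from the sampling frame.
    strata: optional dict mapping each id to a group label
            (e.g. "Cochrane" vs "non-Cochrane").
    """
    rng = random.Random(seed)  # fixed seed so the selection is reproducible
    if strata is None:
        return sorted(rng.sample(report_ids, k))
    sample = []
    for group in sorted(set(strata.values())):
        members = [r for r in report_ids if strata[r] == group]
        sample.extend(rng.sample(members, k))
    return sorted(sample)
```

Fixing the random seed and reporting it makes the selection reproducible, which supports the transparency called for elsewhere in this paper.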
Q: How many databases should I search?
A: The number of databases one should search would depend on the approach to sampling, which can include targeting the entire “population” of interest or a sample of that population. If you are interested in including the entire target population for your research question, or drawing a random or systematic sample from it, then a comprehensive and exhaustive search for relevant articles is required. In this case, we recommend using systematic approaches for searching electronic databases (i.e. at least 2 databases with a replicable and time stamped search strategy). The results of your search will constitute a sampling frame from which eligible studies can be drawn.
Alternatively, if your approach to sampling is purposeful, then we recommend targeting the database(s) or data sources (e.g. journals, registries) that include the information you need. For example, if you are conducting a methodological study of high impact journals in plastic surgery and they are all indexed in PubMed, you likely do not need to search any other databases. You may also have a comprehensive list of all journals of interest and can approach your search using the journal names in your database search (or by accessing the journal archives directly from the journal’s website). Although journal web pages can also be searched directly, using a database such as PubMed has multiple advantages, such as filters that narrow the search to a certain period or to study types of interest. Furthermore, individual journals’ websites may have different search functionalities, which do not necessarily yield a consistent output.
Q: Should I publish a protocol for my methodological study?
A: A protocol is a description of intended research methods. Currently, only protocols for clinical trials require registration [ 35 ]. Protocols for systematic reviews are encouraged but no formal recommendation exists. The scientific community welcomes the publication of protocols because they help protect against selective outcome reporting and the use of post hoc methodologies to embellish results, and they help avoid duplication of effort [ 36 ]. While the latter two risks exist in methodological research, the negative consequences may be substantially less than for clinical outcomes. In a sample of 31 methodological studies, 7 (22.6%) referenced a published protocol [ 9 ]. In the Cochrane Library, there are 15 protocols for methodological reviews (21 July 2020). This suggests that publishing protocols for methodological studies is not uncommon.
Authors can consider publishing their study protocol in a scholarly journal as a manuscript. Advantages of such publication include obtaining peer-review feedback about the planned study, and easy retrieval by searching databases such as PubMed. The disadvantages of trying to publish protocols include delays associated with manuscript handling and peer review, as well as costs: few journals publish study protocols, and those that do mostly charge article-processing fees [ 37 ]. Authors who would like to make their protocol publicly available without publishing it in scholarly journals could deposit their study protocols in publicly available repositories, such as the Open Science Framework ( https://osf.io/ ).
Q: How should I appraise the quality of a methodological study?
A: To date, there is no published tool for appraising the risk of bias in a methodological study, but in principle, a methodological study could be considered as a type of observational study. Therefore, during conduct or appraisal, care should be taken to avoid the biases common in observational studies [ 38 ]. These include selection bias, lack of comparability between groups, and errors in the ascertainment of exposure or outcome. In other words, to generate a representative sample, a comprehensive reproducible search may be necessary to build a sampling frame. Additionally, random sampling may be necessary to ensure that all the included research reports have the same probability of being selected, and the screening and selection processes should be transparent and reproducible. To ensure that the groups compared are similar in all characteristics, matching, random sampling or stratified sampling can be used. Statistical adjustments for between-group differences can also be applied at the analysis stage. Finally, duplicate data extraction can reduce errors in assessment of exposures or outcomes.
Q: Should I justify a sample size?
A: In all instances where one is not using the target population (i.e. the group to which inferences from the research report are directed) [ 39 ], a sample size justification is good practice. The sample size justification may take the form of a description of what is expected to be achieved with the number of articles selected, or a formal sample size estimation that outlines the number of articles required to answer the research question with a certain precision and power. Sample size justifications in methodological studies are reasonable in the following instances:
For example, El Dib et al. computed a sample size requirement for a methodological study of diagnostic strategies in randomized trials, based on a confidence interval approach [ 40 ].
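As a generic sketch of the confidence interval approach, the standard formula for estimating a proportion within a desired margin of error, n = z²p(1 − p)/d², can be computed directly. This is an illustration of the general technique, not the specific calculation used by El Dib et al.

```python
import math

def sample_size_for_proportion(p_expected, half_width, confidence=0.95):
    """Number of articles needed to estimate a proportion (e.g. the
    proportion of trials reporting an item of interest) to within
    +/- half_width at the given confidence level."""
    # two-sided standard-normal critical values for common levels
    z = {0.90: 1.645, 0.95: 1.96, 0.99: 2.576}[confidence]
    n = (z ** 2) * p_expected * (1 - p_expected) / half_width ** 2
    return math.ceil(n)

# With no prior estimate, p = 0.5 is the conservative choice:
# estimating a proportion to within +/- 5% at 95% confidence
n_required = sample_size_for_proportion(0.5, 0.05)  # 385 articles
```

If the eligible literature is smaller than the computed n, the entire target population can simply be included instead.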
Q: What should I call my study?
A: Other terms which have been used to describe or label methodological studies include “methodological review”, “methodological survey”, “meta-epidemiological study”, “systematic review”, “systematic survey”, “meta-research”, “research-on-research” and many others. We recommend that the study nomenclature be clear, unambiguous, informative and allow for appropriate indexing. Methodological study nomenclature that should be avoided includes “systematic review” – as this will likely be confused with a systematic review of a clinical question. “Systematic survey” may also lead to confusion about whether the survey was systematic (i.e. using a preplanned methodology) or a survey using “systematic” sampling (i.e. a sampling approach using specific intervals to determine who is selected) [ 32 ]. Any of the above meanings of the word “systematic” may be true for methodological studies and could be potentially misleading. “Meta-epidemiological study” is ideal for indexing, but not very informative as it describes an entire field. The term “review” may point towards an appraisal or “review” of the design, conduct, analysis or reporting (or methodological components) of the targeted research reports, yet it has also been used to describe narrative reviews [ 41 , 42 ]. The term “survey” is also in line with the approaches used in many methodological studies [ 9 ], and would be indicative of the sampling procedures of this study design. However, in the absence of guidelines on nomenclature, the term “methodological study” is broad enough to capture most of the scenarios of such studies.
Q: Should I account for clustering in my methodological study?
A: Data from methodological studies are often clustered. For example, articles coming from a specific source may have different reporting standards (e.g. the Cochrane Library). Articles within the same journal may be similar due to editorial practices and policies, reporting requirements and endorsement of guidelines. There is emerging evidence that these are real concerns that should be accounted for in analyses [ 43 ]. Some cluster variables are described in the section: “ What variables are relevant to methodological studies?”
A variety of modelling approaches can be used to account for correlated data, including the use of marginal, fixed or mixed effects regression models with appropriate computation of standard errors [ 44 ]. For example, Kosa et al. used generalized estimation equations to account for correlation of articles within journals [ 15 ]. Not accounting for clustering could lead to incorrect p -values, unduly narrow confidence intervals, and biased estimates [ 45 ].
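Short of fitting a full regression model, a quick way to gauge how much clustering inflates the variance is the design effect, DEFF = 1 + (m − 1) × ICC, where m is the average number of articles per journal and ICC is the intra-cluster correlation. A minimal sketch, with hypothetical values for illustration:

```python
def effective_sample_size(n_articles, avg_cluster_size, icc):
    """Effective sample size after inflating the variance by the
    design effect DEFF = 1 + (m - 1) * ICC, where m is the average
    number of articles per cluster (e.g. per journal) and ICC is the
    intra-cluster correlation."""
    deff = 1 + (avg_cluster_size - 1) * icc
    return n_articles / deff

# Hypothetical example: 500 articles drawn from journals that each
# contribute about 10 articles, with a modest ICC of 0.05
n_eff = effective_sample_size(500, 10, 0.05)  # about 345 "independent" articles
```

A large gap between the nominal and effective sample size is a signal that the clustered analysis methods mentioned above (marginal, fixed or mixed effects models) are warranted.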
Q: Should I extract data in duplicate?
A: Yes. Duplicate data extraction takes more time but results in fewer errors [ 19 ]. Data extraction errors in turn affect the effect estimate [ 46 ], and therefore should be mitigated. Duplicate data extraction should be considered in the absence of other approaches to minimize extraction errors. Much like systematic reviews, this area will likely see rapid advances with machine learning and natural language processing technologies to support researchers with screening and data extraction [ 47 , 48 ]. However, experience plays an important role in the quality of extracted data, and inexperienced extractors should be paired with experienced extractors [ 46 , 49 ].
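In practice, reconciling duplicate extraction can be as simple as comparing the two extractors’ records field by field and flagging disagreements for consensus discussion. A minimal sketch, assuming a hypothetical data structure where records are keyed by (article_id, field):

```python
def extraction_discrepancies(extractor_a, extractor_b):
    """Compare two extractors' records (dicts keyed by (article_id, field))
    and return the entries that disagree and need consensus resolution."""
    disagreements = {}
    for key in extractor_a.keys() & extractor_b.keys():
        if extractor_a[key] != extractor_b[key]:
            disagreements[key] = (extractor_a[key], extractor_b[key])
    return disagreements

# Hypothetical example: the two extractors disagree on one field
a = {("pmid_1", "blinded"): "yes", ("pmid_1", "n_randomized"): 120}
b = {("pmid_1", "blinded"): "unclear", ("pmid_1", "n_randomized"): 120}
to_resolve = extraction_discrepancies(a, b)  # flags only the "blinded" field
```

Logging the disagreement rate per field is also a cheap way to identify items on the extraction form that need clearer definitions or extractor training.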
Q: Should I assess the risk of bias of research reports included in my methodological study?
A : Risk of bias is most useful in determining the certainty that can be placed in the effect measure from a study. In methodological studies, risk of bias may not serve the purpose of determining the trustworthiness of results, as effect measures are often not the primary goal of methodological studies. Determining risk of bias in methodological studies is likely a practice borrowed from systematic review methodology, but whose intrinsic value is not obvious in methodological studies. When it is part of the research question, investigators often focus on one aspect of risk of bias. For example, Speich investigated how blinding was reported in surgical trials [ 50 ], and Abraha et al., investigated the application of intention-to-treat analyses in systematic reviews and trials [ 51 ].
Q: What variables are relevant to methodological studies?
A: There is empirical evidence that certain variables may inform the findings in a methodological study. We outline some of these and provide a brief overview below:
Q: Should I focus only on high impact journals?
A: Investigators may choose to investigate only high impact journals because they are more likely to influence practice and policy, or because they assume that methodological standards would be higher. However, restricting to journals with a high journal impact factor (JIF) may severely limit the scope of articles included and may skew the sample towards articles with positive findings. The generalizability and applicability of findings from a handful of journals must be examined carefully, especially since the JIF varies over time. Even among journals that are all “high impact”, variations exist in methodological standards.
Q: Can I conduct a methodological study of qualitative research?
A: Yes. Even though a lot of methodological research has been conducted in the quantitative research field, methodological studies of qualitative studies are feasible. Certain databases that catalogue qualitative research including the Cumulative Index to Nursing & Allied Health Literature (CINAHL) have defined subject headings that are specific to methodological research (e.g. “research methodology”). Alternatively, one could also conduct a qualitative methodological review; that is, use qualitative approaches to synthesize methodological issues in qualitative studies.
Q: What reporting guidelines should I use for my methodological study?
A: There is no guideline that covers the entire scope of methodological studies. One adaptation of the PRISMA guidelines has been published, which works well for studies that aim to use the entire target population of research reports [ 71 ]. However, it is not widely used (40 citations in 2 years as of 09 December 2019), and methodological studies that are designed as cross-sectional or before-after studies require a more fit-for-purpose guideline. A more encompassing reporting guideline for a broad range of methodological studies is currently under development [ 72 ]. However, in the absence of formal guidance, the requirements for scientific reporting should be respected, and authors of methodological studies should focus on transparency and reproducibility.
Q: What are the potential threats to validity and how can I avoid them?
A: Methodological studies may be compromised by a lack of internal or external validity. The main threats to internal validity in methodological studies are selection and confounding bias. Investigators must ensure that the methods used to select articles do not make them differ systematically from the set of articles to which they would like to make inferences. For example, attempting to make extrapolations to all journals after analyzing high-impact journals would be misleading.
Many factors (confounders) may distort the association between the exposure and outcome if the included research reports differ with respect to these factors [ 73 ]. For example, when examining the association between source of funding and completeness of reporting, it may be necessary to account for journals that endorse the guidelines. Confounding bias can be addressed by restriction, matching and statistical adjustment [ 73 ]. Restriction appears to be the method of choice for many investigators who choose to include only high impact journals or articles in a specific field. For example, Knol et al. examined the reporting of p -values in baseline tables of high impact journals [ 26 ]. Matching is also sometimes used. In the methodological study of non-randomized interventional studies of elective ventral hernia repair, Parker et al. matched prospective studies with retrospective studies and compared reporting standards [ 74 ]. Some other methodological studies use statistical adjustments. For example, Zhang et al. used regression techniques to determine the factors associated with missing participant data in trials [ 16 ].
With regard to external validity, researchers interested in conducting methodological studies must consider how generalizable or applicable their findings are. This should tie in closely with the research question and should be explicit. For example, findings from methodological studies on trials published in high impact cardiology journals cannot be assumed to be applicable to trials in other fields. However, investigators must ensure that their sample truly represents the target population, either by a) conducting a comprehensive and exhaustive search, or b) using an appropriate, justified, randomly selected sample of research reports.
Even applicability to high impact journals may vary based on the investigators’ definition, and over time. For example, for high impact journals in the field of general medicine, Bouwmeester et al. included the Annals of Internal Medicine (AIM), BMJ, the Journal of the American Medical Association (JAMA), Lancet, the New England Journal of Medicine (NEJM), and PLoS Medicine ( n = 6) [ 75 ]. In contrast, the high impact journals selected in the methodological study by Schiller et al. were BMJ, JAMA, Lancet, and NEJM ( n = 4) [ 76 ]. Another methodological study by Kosa et al. included AIM, BMJ, JAMA, Lancet and NEJM ( n = 5). In the methodological study by Thabut et al., journals with a JIF greater than 5 were considered to be high impact. Riado Minguez et al. used first quartile journals in the Journal Citation Reports (JCR) for a specific year to determine “high impact” [ 77 ]. Ultimately, the definition of high impact will be based on the number of journals the investigators are willing to include, the year of impact and the JIF cut-off [ 78 ]. We acknowledge that the term “generalizability” may apply differently for methodological studies, especially when in many instances it is possible to include the entire target population in the sample studied.
Finally, methodological studies are not exempt from information bias which may stem from discrepancies in the included research reports [ 79 ], errors in data extraction, or inappropriate interpretation of the information extracted. Likewise, publication bias may also be a concern in methodological studies, but such concepts have not yet been explored.
In order to inform discussions about methodological studies and the development of guidance for what should be reported, we have outlined some key features of methodological studies that can be used to classify them. For each of the categories outlined below, we provide an example. In our experience, the choice of approach to completing a methodological study can be informed by asking the following four questions:
A methodological study may be focused on exploring sources of bias in primary or secondary studies (meta-bias), or how bias is analyzed. We have taken care to distinguish bias (i.e. systematic deviations from the truth irrespective of the source) from reporting quality or completeness (i.e. not adhering to a specific reporting guideline or norm). An example of where this distinction would be important is in the case of a randomized trial with no blinding. This study (depending on the nature of the intervention) would be at risk of performance bias. However, if the authors report that their study was not blinded, they would have reported adequately. In fact, some methodological studies attempt to capture both “quality of conduct” and “quality of reporting”, such as Richie et al., who reported on the risk of bias in randomized trials of pharmacy practice interventions [ 80 ]. Babic et al. investigated how risk of bias was used to inform sensitivity analyses in Cochrane reviews [ 81 ]. Further, biases related to choice of outcomes can also be explored. For example, Tan et al. investigated differences in treatment effect size based on the outcome reported [ 82 ].
Methodological studies may report quality of reporting against a reporting checklist (i.e. adherence to guidelines) or against expected norms. For example, Croituro et al. report on the quality of reporting in systematic reviews published in dermatology journals based on their adherence to the PRISMA statement [ 83 ], and Khan et al. described the quality of reporting of harms in randomized controlled trials published in high impact cardiovascular journals based on the CONSORT extension for harms [ 84 ]. Other methodological studies investigate reporting of certain features of interest that may not be part of formally published checklists or guidelines. For example, Mbuagbaw et al. described how often the implications for research are elaborated using the Evidence, Participants, Intervention, Comparison, Outcome, Timeframe (EPICOT) format [ 30 ].
Sometimes investigators may be interested in how consistent reports of the same research are, as it is expected that there should be consistency between: conference abstracts and published manuscripts; manuscript abstracts and manuscript main text; and trial registration and published manuscript. For example, Rosmarakis et al. investigated consistency between conference abstracts and full text manuscripts [ 85 ].
In addition to identifying issues with reporting in primary and secondary studies, authors of methodological studies may be interested in determining the factors that are associated with certain reporting practices. Many methodological studies incorporate this, albeit as a secondary outcome. For example, Farrokhyar et al. investigated the factors associated with reporting quality in randomized trials of coronary artery bypass grafting surgery [ 53 ].
Methodological studies may also be used to describe methods or compare methods, and the factors associated with methods. Muller et al. described the methods used for systematic reviews and meta-analyses of observational studies [ 86 ].
Some methodological studies synthesize results from other methodological studies. For example, Li et al. conducted a scoping review of methodological reviews that investigated consistency between full text and abstracts in primary biomedical research [ 87 ].
Some methodological studies may investigate the use of names and terms in health research. For example, Martinic et al. investigated the definitions of systematic reviews used in overviews of systematic reviews (OSRs), meta-epidemiological studies and epidemiology textbooks [ 88 ].
In addition to the previously mentioned experimental methodological studies, there may exist other types of methodological studies not captured here.
Most methodological studies are purely descriptive and report their findings as counts (percent) and means (standard deviation) or medians (interquartile range). For example, Mbuagbaw et al. described the reporting of research recommendations in Cochrane HIV systematic reviews [ 30 ]. Gohari et al. described the quality of reporting of randomized trials in diabetes in Iran [ 12 ].
Some methodological studies are analytical wherein “analytical studies identify and quantify associations, test hypotheses, identify causes and determine whether an association exists between variables, such as between an exposure and a disease.” [ 89 ] In the case of methodological studies all these investigations are possible. For example, Kosa et al. investigated the association between agreement in primary outcome from trial registry to published manuscript and study covariates. They found that larger and more recent studies were more likely to have agreement [ 15 ]. Tricco et al. compared the conclusion statements from Cochrane and non-Cochrane systematic reviews with a meta-analysis of the primary outcome and found that non-Cochrane reviews were more likely to report positive findings. These results are a test of the null hypothesis that the proportions of Cochrane and non-Cochrane reviews that report positive results are equal [ 90 ].
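The Tricco et al. comparison described above amounts to a test of two proportions. A minimal sketch of such a test using only the standard library follows; the counts used in the example are hypothetical, not the actual study data.

```python
import math

def two_proportion_z_test(x1, n1, x2, n2):
    """Test H0: the proportions in two groups are equal
    (e.g. Cochrane vs non-Cochrane reviews reporting positive findings).

    x1, x2: number of reports with the characteristic in each group.
    n1, n2: total number of reports in each group.
    """
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)  # pooled proportion under H0
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value from the standard normal distribution
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical counts: 60/100 non-Cochrane vs 40/100 Cochrane reviews
# reporting positive findings
z, p = two_proportion_z_test(60, 100, 40, 100)
```

For small counts, an exact test (e.g. Fisher’s) would be preferable to this normal approximation; the choice of test should be justified in the methods section as with any analytical study.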
Methodological reviews with narrow research questions may be able to include the entire target population. For example, in the methodological study of Cochrane HIV systematic reviews, Mbuagbaw et al. included all of the available studies ( n = 103) [ 30 ].
Many methodological studies use random samples of the target population [ 33 , 91 , 92 ]. Alternatively, purposeful sampling may be used, limiting the sample to a subset of research-related reports published within a certain time period, in journals with a certain ranking, or on a particular topic. Systematic sampling can also be used when random sampling may be challenging to implement.
Many methodological studies use a research report (e.g. full manuscript of study, abstract portion of the study) as the unit of analysis, and inferences can be made at the study-level. However, both published and unpublished research-related reports can be studied. These may include articles, conference abstracts, registry entries etc.
Some methodological studies report on items which may occur more than once per article. For example, Paquette et al. report on subgroup analyses in Cochrane reviews of atrial fibrillation in which 17 systematic reviews planned 56 subgroup analyses [ 93 ].
This framework is outlined in Fig. 2 .
A proposed framework for methodological studies
Methodological studies have examined different aspects of reporting such as quality, completeness, consistency and adherence to reporting guidelines. As such, many of the methodological study examples cited in this tutorial are related to reporting. However, as an evolving field, the scope of research questions that can be addressed by methodological studies is expected to increase.
In this paper we have outlined the scope and purpose of methodological studies, along with examples of instances in which various approaches have been used. In the absence of formal guidance on the design, conduct, analysis and reporting of methodological studies, we have provided some advice to help make methodological studies consistent. This advice is grounded in good contemporary scientific practice. Generally, the research question should tie in with the sampling approach and planned analysis. We have also highlighted the variables that may inform findings from methodological studies. Lastly, we have provided suggestions for ways in which authors can categorize their methodological studies to inform their design and analysis.
Abbreviations.
CONSORT: Consolidated Standards of Reporting Trials
EPICOT: Evidence, Participants, Intervention, Comparison, Outcome, Timeframe
GRADE: Grading of Recommendations, Assessment, Development and Evaluations
PICOT: Participants, Intervention, Comparison, Outcome, Timeframe
PRISMA: Preferred Reporting Items of Systematic reviews and Meta-Analyses
SWAR: Studies Within a Review
SWAT: Studies Within a Trial
Authors’ contributions.
LM conceived the idea and drafted the outline and paper. DOL and LT commented on the idea and draft outline. LM, LP and DOL performed literature searches and data extraction. All authors (LM, DOL, LT, LP, DBA) reviewed several draft versions of the manuscript and approved the final manuscript.
Funding.
This work did not receive any dedicated funding.
Ethics approval and consent to participate.
Not applicable.
Competing interests.
DOL, DBA, LM, LP and LT are involved in the development of a reporting guideline for methodological studies.
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
BMC Medical Research Methodology volume 20 , Article number: 226 ( 2020 ) Cite this article
Methodological studies – studies that evaluate the design, analysis or reporting of other research-related reports – play an important role in health research. They help to highlight issues in the conduct of research with the aim of improving health research methodology, and ultimately reducing research waste.
We provide an overview of some of the key aspects of methodological studies such as what they are, and when, how and why they are done. We adopt a “frequently asked questions” format to facilitate reading this paper and provide multiple examples to help guide researchers interested in conducting methodological studies. Some of the topics addressed include: is it necessary to publish a study protocol? How to select relevant research reports and databases for a methodological study? What approaches to data extraction and statistical analysis should be considered when conducting a methodological study? What are potential threats to validity and is there a way to appraise the quality of methodological studies?
Appropriate reflection and application of basic principles of epidemiology and biostatistics are required in the design and analysis of methodological studies. This paper provides an introduction for further discussion about the conduct of methodological studies.
The field of meta-research (or research-on-research) has proliferated in recent years in response to issues with research quality and conduct [ 1 , 2 , 3 ]. As the name suggests, this field targets issues with research design, conduct, analysis and reporting. Various types of research reports are often examined as the unit of analysis in these studies (e.g. abstracts, full manuscripts, trial registry entries). Like many other novel fields of research, meta-research has seen a proliferation of use before the development of reporting guidance. For example, this was the case with randomized trials for which risk of bias tools and reporting guidelines were only developed much later – after many trials had been published and noted to have limitations [ 4 , 5 ]; and for systematic reviews as well [ 6 , 7 , 8 ]. However, in the absence of formal guidance, studies that report on research differ substantially in how they are named, conducted and reported [ 9 , 10 ]. This creates challenges in identifying, summarizing and comparing them. In this tutorial paper, we will use the term methodological study to refer to any study that reports on the design, conduct, analysis or reporting of primary or secondary research-related reports (such as trial registry entries and conference abstracts).
In the past 10 years, there has been an increase in the use of terms related to methodological studies (based on records retrieved with a keyword search [in the title and abstract] for “methodological review” and “meta-epidemiological study” in PubMed up to December 2019), suggesting that these studies may be appearing more frequently in the literature. See Fig. 1 .
Trends in the number of studies that mention “methodological review” or “meta-epidemiological study” in PubMed
The methods used in many methodological studies have been borrowed from systematic and scoping reviews. This practice has influenced the direction of the field, with many methodological studies including searches of electronic databases, screening of records, duplicate data extraction and assessments of risk of bias in the included studies. However, the research questions posed in methodological studies do not always require the approaches listed above, and guidance is needed on when and how to apply these methods to a methodological study. Even though methodological studies can be conducted on qualitative or mixed methods research, this paper focuses on and draws examples exclusively from quantitative research.
The objectives of this paper are to provide some insights on how to conduct methodological studies so that there is greater consistency between the research questions posed, and the design, analysis and reporting of findings. We provide multiple examples to illustrate concepts and a proposed framework for categorizing methodological studies in quantitative research.
Any study that describes or analyzes methods (design, conduct, analysis or reporting) in published (or unpublished) literature is a methodological study. Consequently, the scope of methodological studies is quite extensive and includes, but is not limited to, topics as diverse as: research question formulation [ 11 ]; adherence to reporting guidelines [ 12 , 13 , 14 ] and consistency in reporting [ 15 ]; approaches to study analysis [ 16 ]; investigating the credibility of analyses [ 17 ]; and studies that synthesize these methodological studies [ 18 ]. While the nomenclature of methodological studies is not uniform, the intents and purposes of these studies remain fairly consistent – to describe or analyze methods in primary or secondary studies. As such, methodological studies may also be classified as a subtype of observational studies.
Parallel to this are experimental studies that compare different methods. Even though they play an important role in informing optimal research methods, experimental methodological studies are beyond the scope of this paper. Examples of such studies include the randomized trials by Buscemi et al., comparing single data extraction to double data extraction [ 19 ], and Carrasco-Labra et al., comparing approaches to presenting findings in Grading of Recommendations, Assessment, Development and Evaluations (GRADE) summary of findings tables [ 20 ]. In these studies, the unit of analysis is the person or groups of individuals applying the methods. We also direct readers to the Studies Within a Trial (SWAT) and Studies Within a Review (SWAR) programme operated through the Hub for Trials Methodology Research, for further reading as a potential useful resource for these types of experimental studies [ 21 ]. Lastly, this paper is not meant to inform the conduct of research using computational simulation and mathematical modeling for which some guidance already exists [ 22 ], or studies on the development of methods using consensus-based approaches.
Methodological studies occupy a unique niche in health research that allows them to inform methodological advances. Methodological studies should also be conducted as precursors to reporting guideline development, as they provide an opportunity to understand current practices, and help to identify the need for guidance and gaps in methodological or reporting quality. For example, the development of the popular Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) guidelines was preceded by methodological studies identifying poor reporting practices [ 23 , 24 ]. In these instances, after the reporting guidelines are published, methodological studies can also be used to monitor uptake of the guidelines.
These studies can also be conducted to inform the state of the art for design, analysis and reporting practices across different types of health research fields, with the aim of improving research practices, and preventing or reducing research waste. For example, Samaan et al. conducted a scoping review of adherence to different reporting guidelines in health care literature [ 18 ]. Methodological studies can also be used to determine the factors associated with reporting practices. For example, Abbade et al. investigated journal characteristics associated with the use of the Participants, Intervention, Comparison, Outcome, Timeframe (PICOT) format in framing research questions in trials of venous ulcer disease [ 11 ].
There is no clear answer to this question. Based on a search of PubMed, the use of related terms (“methodological review” and “meta-epidemiological study”) – and therefore, the number of methodological studies – is on the rise. However, many other terms are used to describe methodological studies. There are also many studies that explore design, conduct, analysis or reporting of research reports, but that do not use any specific terms to describe or label their study design in terms of “methodology”. This diversity in nomenclature makes a census of methodological studies elusive. Appropriate terminology and key words for methodological studies are needed to facilitate improved accessibility for end-users.
Methodological studies provide information on the design, conduct, analysis or reporting of primary and secondary research and can be used to appraise quality, quantity, completeness, accuracy and consistency of health research. These issues can be explored in specific fields, journals, databases, geographical regions and time periods. For example, Areia et al. explored the quality of reporting of endoscopic diagnostic studies in gastroenterology [ 25 ]; Knol et al. investigated the reporting of p -values in baseline tables in randomized trials published in high impact journals [ 26 ]; Chen et al. describe adherence to the Consolidated Standards of Reporting Trials (CONSORT) statement in Chinese journals [ 27 ]; and Hopewell et al. describe the effect of editors’ implementation of CONSORT guidelines on reporting of abstracts over time [ 28 ]. Methodological studies provide useful information to researchers, clinicians, editors, publishers and users of health literature. As a result, these studies have been a cornerstone of important methodological developments in the past two decades and have informed the development of many health research guidelines, including the highly cited CONSORT statement [ 5 ].
Methodological studies can be found in most common biomedical bibliographic databases (e.g. Embase, MEDLINE, PubMed, Web of Science). However, the biggest caveat is that methodological studies are hard to identify in the literature due to the wide variety of names used and the lack of comprehensive databases dedicated to them. A handful can be found in the Cochrane Library as “Cochrane Methodology Reviews”, but these studies only cover methodological issues related to systematic reviews. Previous attempts to catalogue all empirical studies of methods used in reviews were abandoned 10 years ago [ 29 ]. In other databases, a variety of search terms may be applied with different levels of sensitivity and specificity.
In this section, we have outlined responses to questions that might help inform the conduct of methodological studies.
Q: How should I select research reports for my methodological study?
A: Selection of research reports for a methodological study depends on the research question and eligibility criteria. Once a clear research question is set and the nature of literature one desires to review is known, one can then begin the selection process. Selection may begin with a broad search, especially if the eligibility criteria are not apparent. For example, a methodological study of Cochrane Reviews of HIV would not require a complex search as all eligible studies can easily be retrieved from the Cochrane Library after checking a few boxes [ 30 ]. On the other hand, a methodological study of subgroup analyses in trials of gastrointestinal oncology would require a search to find such trials, and further screening to identify trials that conducted a subgroup analysis [ 31 ].
The strategies used for identifying participants in observational studies can apply here. One may use a systematic search to identify all eligible studies. If the number of eligible studies is unmanageable, a random sample of articles can be expected to provide comparable results if it is sufficiently large [ 32 ]. For example, Wilson et al. used a random sample of trials from the Cochrane Stroke Group’s Trial Register to investigate completeness of reporting [ 33 ]. It is possible that a simple random sample would lead to underrepresentation of units (i.e. research reports) that are smaller in number. This is relevant if the investigators wish to compare multiple groups but have too few units in one group. In this case a stratified sample would help to create equal groups. For example, in a methodological study comparing Cochrane and non-Cochrane reviews, Kahale et al. drew random samples from both groups [ 34 ]. Alternatively, systematic or purposeful sampling strategies can be used and we encourage researchers to justify their selected approaches based on the study objective.
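The stratified approach above can be made concrete with a minimal sketch. This is an illustrative example with hypothetical data and field names (not the actual procedure used by Kahale et al.): an equal-sized random sample is drawn from each stratum of a sampling frame of article records.

```python
import random

def stratified_sample(records, key, n_per_stratum, seed=2024):
    """Draw an equal-sized simple random sample from each stratum.
    A fixed seed keeps the selection reproducible and reportable."""
    rng = random.Random(seed)
    strata = {}
    for rec in records:
        strata.setdefault(rec[key], []).append(rec)
    sample = []
    for group in strata.values():
        sample.extend(rng.sample(group, min(n_per_stratum, len(group))))
    return sample

# Hypothetical sampling frame: search results tagged by review type
frame = ([{"id": i, "type": "Cochrane"} for i in range(40)]
         + [{"id": i, "type": "non-Cochrane"} for i in range(40, 300)])

sample = stratified_sample(frame, "type", 25)  # 25 per group, 50 in total
```

Reporting the random seed alongside the sampling frame keeps the selection process transparent and reproducible, in line with the advice above.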
Q: How many databases should I search?
A: The number of databases one should search would depend on the approach to sampling, which can include targeting the entire “population” of interest or a sample of that population. If you are interested in including the entire target population for your research question, or drawing a random or systematic sample from it, then a comprehensive and exhaustive search for relevant articles is required. In this case, we recommend using systematic approaches for searching electronic databases (i.e. at least 2 databases with a replicable and time stamped search strategy). The results of your search will constitute a sampling frame from which eligible studies can be drawn.
Alternatively, if your approach to sampling is purposeful, then we recommend targeting the database(s) or data sources (e.g. journals, registries) that include the information you need. For example, if you are conducting a methodological study of high impact journals in plastic surgery and they are all indexed in PubMed, you likely do not need to search any other databases. You may also have a comprehensive list of all journals of interest and can approach your search using the journal names in your database search (or by accessing the journal archives directly from each journal’s website). Even though one could also search journals’ web pages directly, using a database such as PubMed has multiple advantages, such as the use of filters to narrow the search down to a certain time period or to study types of interest. Furthermore, individual journals’ web sites may have different search functionalities, which do not necessarily yield consistent output.
Q: Should I publish a protocol for my methodological study?
A: A protocol is a description of intended research methods. Currently, only protocols for clinical trials require registration [ 35 ]. Protocols for systematic reviews are encouraged but no formal recommendation exists. The scientific community welcomes the publication of protocols because they help protect against selective outcome reporting and the use of post hoc methodologies to embellish results, and help to avoid duplication of effort [ 36 ]. While the latter two risks exist in methodological research, the negative consequences may be substantially less than for clinical outcomes. In a sample of 31 methodological studies, 7 (22.6%) referenced a published protocol [ 9 ]. In the Cochrane Library, there are 15 protocols for methodological reviews (21 July 2020). This suggests that publishing protocols for methodological studies is not uncommon.
Authors can consider publishing their study protocol in a scholarly journal as a manuscript. Advantages of such publication include obtaining peer-review feedback about the planned study, and easy retrieval by searching databases such as PubMed. The disadvantages of trying to publish protocols include delays associated with manuscript handling and peer review, as well as costs, as few journals publish study protocols, and those journals mostly charge article-processing fees [ 37 ]. Authors who would like to make their protocol publicly available without publishing it in scholarly journals could deposit their study protocols in publicly available repositories, such as the Open Science Framework ( https://osf.io/ ).
Q: How should I appraise the quality of a methodological study?
A: To date, there is no published tool for appraising the risk of bias in a methodological study, but in principle, a methodological study could be considered a type of observational study. Therefore, during conduct or appraisal, care should be taken to avoid the biases common in observational studies [ 38 ]. These concerns include selection bias, comparability of groups, and ascertainment of exposure or outcome. In other words, to generate a representative sample, a comprehensive reproducible search may be necessary to build a sampling frame. Additionally, random sampling may be necessary to ensure that all the included research reports have the same probability of being selected, and the screening and selection processes should be transparent and reproducible. To ensure that the groups compared are similar in all characteristics, matching, random sampling or stratified sampling can be used. Statistical adjustments for between-group differences can also be applied at the analysis stage. Finally, duplicate data extraction can reduce errors in assessment of exposures or outcomes.
Q: Should I justify a sample size?
A: In all instances where one is not using the entire target population (i.e. the group to which inferences from the research report are directed) [ 39 ], a sample size justification is good practice. The sample size justification may take the form of a description of what is expected to be achieved with the number of articles selected, or a formal sample size estimation that outlines the number of articles required to answer the research question with a certain precision and power. Sample size justifications in methodological studies are reasonable in the following instances:
Comparing two groups
Determining a proportion, mean or another quantifier
Determining factors associated with an outcome using regression-based analyses
For example, El Dib et al. computed a sample size requirement for a methodological study of diagnostic strategies in randomized trials, based on a confidence interval approach [ 40 ].
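A confidence-interval approach of this kind can be sketched as follows. This is an illustrative normal-approximation calculation with made-up numbers, not the actual computation reported by El Dib et al.:

```python
import math

def articles_needed(p_expected, margin, z=1.96):
    """Articles required so that a 95% confidence interval around an
    expected proportion has at most the given half-width (margin)."""
    return math.ceil(z ** 2 * p_expected * (1 - p_expected) / margin ** 2)

# e.g. expecting ~30% of trials to report the item of interest,
# estimated to within +/-5 percentage points
n = articles_needed(0.30, 0.05)  # 323 articles
```

When no prior estimate of the proportion is available, using p_expected = 0.5 gives the most conservative (largest) sample size requirement.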
Q: What should I call my study?
A: Other terms which have been used to describe or label methodological studies include “methodological review”, “methodological survey”, “meta-epidemiological study”, “systematic review”, “systematic survey”, “meta-research”, “research-on-research” and many others. We recommend that the study nomenclature be clear, unambiguous, informative and allow for appropriate indexing. Methodological study nomenclature that should be avoided includes “systematic review”, as this will likely be confused with a systematic review of a clinical question. “Systematic survey” may also lead to confusion about whether the survey was systematic (i.e. using a preplanned methodology) or a survey using “systematic” sampling (i.e. a sampling approach using specific intervals to determine who is selected) [ 32 ]. Any of the above meanings of the word “systematic” may be true for methodological studies and could be potentially misleading. “Meta-epidemiological study” is ideal for indexing, but not very informative as it describes an entire field. The term “review” may point towards an appraisal or “review” of the design, conduct, analysis or reporting (or methodological components) of the targeted research reports, yet it has also been used to describe narrative reviews [ 41 , 42 ]. The term “survey” is also in line with the approaches used in many methodological studies [ 9 ], and would be indicative of the sampling procedures of this study design. However, in the absence of guidelines on nomenclature, the term “methodological study” is broad enough to capture most of the scenarios of such studies.
Q: Should I account for clustering in my methodological study?
A: Data from methodological studies are often clustered. For example, articles coming from a specific source may have different reporting standards (e.g. the Cochrane Library). Articles within the same journal may be similar due to editorial practices and policies, reporting requirements and endorsement of guidelines. There is emerging evidence that these are real concerns that should be accounted for in analyses [ 43 ]. Some cluster variables are described in the section: “What variables are relevant to methodological studies?”
A variety of modelling approaches can be used to account for correlated data, including the use of marginal, fixed or mixed effects regression models with appropriate computation of standard errors [ 44 ]. For example, Kosa et al. used generalized estimation equations to account for correlation of articles within journals [ 15 ]. Not accounting for clustering could lead to incorrect p -values, unduly narrow confidence intervals, and biased estimates [ 45 ].
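The consequence of ignoring clustering can be illustrated with a small simulation on hypothetical data (a real analysis would use GEE or mixed-effects models as described above). When articles share a journal-level tendency to report an item, they are not independent, and the naive standard error understates the uncertainty:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical data: 20 journals, 10 articles each; a journal-level effect
# induces within-journal correlation in whether an item is reported (1/0).
n_journals, per_journal = 20, 10
journal_effect = rng.normal(0.0, 1.0, n_journals)
y = (journal_effect[:, None]
     + rng.normal(0.0, 1.0, (n_journals, per_journal)) > 0).astype(float)

p_hat = y.mean()  # overall proportion of articles reporting the item

# Naive SE treats all 200 articles as independent observations
se_naive = np.sqrt(p_hat * (1 - p_hat) / y.size)

# Cluster-level SE uses the variability of journal-level means instead
se_cluster = y.mean(axis=1).std(ddof=1) / np.sqrt(n_journals)
```

With positive intra-cluster correlation, the cluster-level standard error exceeds the naive one, which is precisely the “unduly narrow confidence intervals” problem noted above.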
Q: Should I extract data in duplicate?
A: Yes. Duplicate data extraction takes more time but results in fewer errors [ 19 ]. Data extraction errors in turn affect the effect estimate [ 46 ], and therefore should be mitigated. Duplicate data extraction should be considered in the absence of other approaches to minimize extraction errors. Much like systematic reviews, this area will likely see rapid advances in machine learning and natural language processing technologies to support researchers with screening and data extraction [ 47 , 48 ]. However, experience plays an important role in the quality of extracted data, and inexperienced extractors should be paired with experienced extractors [ 46 , 49 ].
Q: Should I assess the risk of bias of research reports included in my methodological study?
A : Risk of bias is most useful in determining the certainty that can be placed in the effect measure from a study. In methodological studies, risk of bias may not serve the purpose of determining the trustworthiness of results, as effect measures are often not the primary goal of methodological studies. Determining risk of bias in methodological studies is likely a practice borrowed from systematic review methodology, but whose intrinsic value is not obvious in methodological studies. When it is part of the research question, investigators often focus on one aspect of risk of bias. For example, Speich investigated how blinding was reported in surgical trials [ 50 ], and Abraha et al., investigated the application of intention-to-treat analyses in systematic reviews and trials [ 51 ].
Q: What variables are relevant to methodological studies?
A: There is empirical evidence that certain variables may inform the findings in a methodological study. We outline some of these and provide a brief overview below:
Country: Countries and regions differ in their research cultures, and the resources available to conduct research. Therefore, it is reasonable to believe that there may be differences in methodological features across countries. Methodological studies have reported loco-regional differences in reporting quality [ 52 , 53 ]. This may also be related to challenges non-English speakers face in publishing papers in English.
Authors’ expertise: The inclusion of authors with expertise in research methodology, biostatistics, and scientific writing is likely to influence the end-product. Oltean et al. found that among randomized trials in orthopaedic surgery, the use of analyses that accounted for clustering was more likely when specialists (e.g. statistician, epidemiologist or clinical trials methodologist) were included on the study team [ 54 ]. Fleming et al. found that including methodologists in the review team was associated with appropriate use of reporting guidelines [ 55 ].
Source of funding and conflicts of interest: Some studies have found that funded studies report better [ 56 , 57 ], while others have found no difference [ 53 , 58 ]. The presence of funding would indicate the availability of resources deployed to ensure optimal design, conduct, analysis and reporting. However, the source of funding may introduce conflicts of interest and warrant assessment. For example, Kaiser et al. investigated the effect of industry funding on obesity or nutrition randomized trials and found that reporting quality was similar [ 59 ]. Thomas et al. looked at reporting quality of long-term weight loss trials and found that industry-funded studies were reported better [ 60 ]. Kan et al. examined the association between industry funding and “positive trials” (trials reporting a significant intervention effect) and found that industry funding was highly predictive of a positive trial [ 61 ]. This finding is similar to that of a recent Cochrane Methodology Review by Hansen et al. [ 62 ]
Journal characteristics: Certain journals’ characteristics may influence the study design, analysis or reporting. Characteristics such as journal endorsement of guidelines [ 63 , 64 ], and Journal Impact Factor (JIF) have been shown to be associated with reporting [ 63 , 65 , 66 , 67 ].
Study size (sample size/number of sites): Some studies have shown that reporting is better in larger studies [ 53 , 56 , 58 ].
Year of publication: It is reasonable to assume that design, conduct, analysis and reporting of research will change over time. Many studies have demonstrated improvements in reporting over time or after the publication of reporting guidelines [ 68 , 69 ].
Type of intervention: In a methodological study of reporting quality of weight loss intervention studies, Thabane et al. found that trials of pharmacologic interventions were reported better than trials of non-pharmacologic interventions [ 70 ].
Interactions between variables: Complex interactions between the previously listed variables are possible. High income countries with more resources may be more likely to conduct larger studies and incorporate a variety of experts. Authors in certain countries may prefer certain journals, and journal endorsement of guidelines and editorial policies may change over time.
Q: Should I focus only on high impact journals?
A: Investigators may choose to investigate only high impact journals because they are more likely to influence practice and policy, or because they assume that methodological standards would be higher. However, the JIF may severely limit the scope of articles included and may skew the sample towards articles with positive findings. The generalizability and applicability of findings from a handful of journals must be examined carefully, especially since the JIF varies over time. Even among journals that are all “high impact”, variations exist in methodological standards.
Q: Can I conduct a methodological study of qualitative research?
A: Yes. Even though a lot of methodological research has been conducted in the quantitative research field, methodological studies of qualitative studies are feasible. Certain databases that catalogue qualitative research, including the Cumulative Index to Nursing & Allied Health Literature (CINAHL), have defined subject headings that are specific to methodological research (e.g. “research methodology”). Alternatively, one could also conduct a qualitative methodological review; that is, use qualitative approaches to synthesize methodological issues in qualitative studies.
Q: What reporting guidelines should I use for my methodological study?
A: There is no guideline that covers the entire scope of methodological studies. One adaptation of the PRISMA guidelines has been published, which works well for studies that aim to use the entire target population of research reports [ 71 ]. However, it is not widely used (40 citations in 2 years as of 09 December 2019), and methodological studies that are designed as cross-sectional or before-after studies require a more fit-for-purpose guideline. A more encompassing reporting guideline for a broad range of methodological studies is currently under development [ 72 ]. However, in the absence of formal guidance, the requirements for scientific reporting should be respected, and authors of methodological studies should focus on transparency and reproducibility.
Q: What are the potential threats to validity and how can I avoid them?
A: Methodological studies may be compromised by a lack of internal or external validity. The main threats to internal validity in methodological studies are selection and confounding bias. Investigators must ensure that the methods used to select articles do not make them differ systematically from the set of articles to which they would like to make inferences. For example, attempting to make extrapolations to all journals after analyzing high-impact journals would be misleading.
Many factors (confounders) may distort the association between the exposure and outcome if the included research reports differ with respect to these factors [ 73 ]. For example, when examining the association between source of funding and completeness of reporting, it may be necessary to account for journals that endorse the guidelines. Confounding bias can be addressed by restriction, matching and statistical adjustment [ 73 ]. Restriction appears to be the method of choice for many investigators who choose to include only high impact journals or articles in a specific field. For example, Knol et al. examined the reporting of p -values in baseline tables of high impact journals [ 26 ]. Matching is also sometimes used. In the methodological study of non-randomized interventional studies of elective ventral hernia repair, Parker et al. matched prospective studies with retrospective studies and compared reporting standards [ 74 ]. Some other methodological studies use statistical adjustments. For example, Zhang et al. used regression techniques to determine the factors associated with missing participant data in trials [ 16 ].
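A toy numerical example (the counts are entirely hypothetical) shows how stratifying on guideline endorsement, as in the funding example above, can change a crude association between funding and complete reporting:

```python
# Hypothetical counts per stratum of journal guideline endorsement:
# (funded complete, funded total, unfunded complete, unfunded total)
strata = {
    "endorsing": (80, 100, 35, 50),
    "non-endorsing": (10, 40, 45, 150),
}

# Crude comparison pools all articles and ignores the confounder
funded_complete = sum(s[0] for s in strata.values())
funded_total = sum(s[1] for s in strata.values())
unfunded_complete = sum(s[2] for s in strata.values())
unfunded_total = sum(s[3] for s in strata.values())
crude_diff = funded_complete / funded_total - unfunded_complete / unfunded_total

# Adjusted comparison: stratum-specific differences averaged with
# weights proportional to stratum size
adjusted_diff = sum(
    (s[0] / s[1] - s[2] / s[3]) * (s[1] + s[3]) for s in strata.values()
) / sum(s[1] + s[3] for s in strata.values())
```

In this invented example the crude difference (about 24 percentage points favouring funded studies) nearly disappears after adjustment (about 2 points), because funded studies are concentrated in guideline-endorsing journals.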
With regard to external validity, researchers interested in conducting methodological studies must consider how generalizable or applicable their findings are. This should tie in closely with the research question and should be explicit. For example, findings from methodological studies on trials published in high impact cardiology journals cannot be assumed to be applicable to trials in other fields. However, investigators must ensure that their sample truly represents the target sample, either by a) conducting a comprehensive and exhaustive search, or b) using an appropriate and justified, randomly selected sample of research reports.
Even applicability to high impact journals may vary based on the investigators’ definition, and over time. For example, for high impact journals in the field of general medicine, Bouwmeester et al. included the Annals of Internal Medicine (AIM), BMJ, the Journal of the American Medical Association (JAMA), Lancet, the New England Journal of Medicine (NEJM), and PLoS Medicine ( n = 6) [ 75 ]. In contrast, the high impact journals selected in the methodological study by Schiller et al. were BMJ, JAMA, Lancet, and NEJM ( n = 4) [ 76 ]. Another methodological study by Kosa et al. included AIM, BMJ, JAMA, Lancet and NEJM ( n = 5). In the methodological study by Thabut et al., journals with a JIF greater than 5 were considered to be high impact. Riado Minguez et al. used first quartile journals in the Journal Citation Reports (JCR) for a specific year to determine “high impact” [ 77 ]. Ultimately, the definition of high impact will be based on the number of journals the investigators are willing to include, the year of impact and the JIF cut-off [ 78 ]. We acknowledge that the term “generalizability” may apply differently for methodological studies, especially when in many instances it is possible to include the entire target population in the sample studied.
Finally, methodological studies are not exempt from information bias which may stem from discrepancies in the included research reports [ 79 ], errors in data extraction, or inappropriate interpretation of the information extracted. Likewise, publication bias may also be a concern in methodological studies, but such concepts have not yet been explored.
In order to inform discussions about methodological studies and the development of guidance for what should be reported, we have outlined some key features of methodological studies that can be used to classify them. For each of the categories outlined below, we provide an example. In our experience, the choice of approach to completing a methodological study can be informed by asking the following four questions:
What is the aim?
Methodological studies that investigate bias
A methodological study may be focused on exploring sources of bias in primary or secondary studies (meta-bias), or how bias is analyzed. We have taken care to distinguish bias (i.e. systematic deviations from the truth irrespective of the source) from reporting quality or completeness (i.e. not adhering to a specific reporting guideline or norm). An example of where this distinction would be important is the case of a randomized trial with no blinding. This study (depending on the nature of the intervention) would be at risk of performance bias. However, if the authors report that their study was not blinded, they would have reported adequately. In fact, some methodological studies attempt to capture both “quality of conduct” and “quality of reporting”, such as Richie et al., who reported on the risk of bias in randomized trials of pharmacy practice interventions [ 80 ]. Babic et al. investigated how risk of bias was used to inform sensitivity analyses in Cochrane reviews [ 81 ]. Further, biases related to choice of outcomes can also be explored. For example, Tan et al. investigated differences in treatment effect size based on the outcome reported [ 82 ].
Methodological studies that investigate quality (or completeness) of reporting
Methodological studies may report quality of reporting against a reporting checklist (i.e. adherence to guidelines) or against expected norms. For example, Croituro et al. report on the quality of reporting in systematic reviews published in dermatology journals based on their adherence to the PRISMA statement [ 83 ], and Khan et al. described the quality of reporting of harms in randomized controlled trials published in high impact cardiovascular journals based on the CONSORT extension for harms [ 84 ]. Other methodological studies investigate reporting of certain features of interest that may not be part of formally published checklists or guidelines. For example, Mbuagbaw et al. described how often the implications for research are elaborated using the Evidence, Participants, Intervention, Comparison, Outcome, Timeframe (EPICOT) format [ 30 ].
Methodological studies that investigate the consistency of reporting
Sometimes investigators may be interested in how consistent reports of the same research are, as it is expected that there should be consistency between: conference abstracts and published manuscripts; manuscript abstracts and manuscript main text; and trial registration and published manuscript. For example, Rosmarakis et al. investigated consistency between conference abstracts and full text manuscripts [ 85 ].
Methodological studies that investigate factors associated with reporting
In addition to identifying issues with reporting in primary and secondary studies, authors of methodological studies may be interested in determining the factors that are associated with certain reporting practices. Many methodological studies incorporate this, albeit as a secondary outcome. For example, Farrokhyar et al. investigated the factors associated with reporting quality in randomized trials of coronary artery bypass grafting surgery [ 53 ].
Methodological studies that investigate methods
Methodological studies may also be used to describe methods or compare methods, and the factors associated with methods. Muller et al. described the methods used for systematic reviews and meta-analyses of observational studies [ 86 ].
Methodological studies that summarize other methodological studies
Some methodological studies synthesize results from other methodological studies. For example, Li et al. conducted a scoping review of methodological reviews that investigated consistency between full text and abstracts in primary biomedical research [ 87 ].
Methodological studies that investigate nomenclature and terminology
Some methodological studies may investigate the use of names and terms in health research. For example, Martinic et al. investigated the definitions of systematic reviews used in overviews of systematic reviews (OSRs), meta-epidemiological studies and epidemiology textbooks [ 88 ].
Other types of methodological studies
In addition to the previously mentioned experimental methodological studies, there may exist other types of methodological studies not captured here.
What is the design?
Methodological studies that are descriptive
Most methodological studies are purely descriptive and report their findings as counts (percent) and means (standard deviation) or medians (interquartile range). For example, Mbuagbaw et al. described the reporting of research recommendations in Cochrane HIV systematic reviews [ 30 ]. Gohari et al. described the quality of reporting of randomized trials in diabetes in Iran [ 12 ].
Methodological studies that are analytical
Some methodological studies are analytical, wherein “analytical studies identify and quantify associations, test hypotheses, identify causes and determine whether an association exists between variables, such as between an exposure and a disease.” [ 89 ] In the case of methodological studies, all these investigations are possible. For example, Kosa et al. investigated the association between agreement in primary outcome from trial registry to published manuscript and study covariates. They found that larger and more recent studies were more likely to have agreement [ 15 ]. Tricco et al. compared the conclusion statements from Cochrane and non-Cochrane systematic reviews with a meta-analysis of the primary outcome and found that non-Cochrane reviews were more likely to report positive findings. These results are a test of the null hypothesis that the proportions of Cochrane and non-Cochrane reviews that report positive results are equal [ 90 ].
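A null hypothesis of this kind can be tested with a two-proportion z-test. The sketch below uses invented counts purely for illustration; they are not the Tricco et al. data:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided z-test of H0: p1 == p2 using the pooled-proportion
    normal approximation."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical: 30/100 Cochrane vs 48/100 non-Cochrane reviews with
# positive conclusion statements
z, p = two_proportion_z(30, 100, 48, 100)  # z ~= -2.61, p < 0.01
```

In a real methodological study, such a test should also respect the clustering considerations discussed earlier if the reviews are correlated within journals or sources.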
What is the sampling strategy?
Methodological studies that include the target population
Methodological reviews with narrow research questions may be able to include the entire target population. For example, in the methodological study of Cochrane HIV systematic reviews, Mbuagbaw et al. included all of the available studies ( n = 103) [ 30 ].
Methodological studies that include a sample of the target population
Many methodological studies use random samples of the target population [ 33 , 91 , 92 ]. Alternatively, purposeful sampling may be used, limiting the sample to research-related reports published within a certain time period, in journals of a certain rank, or on a particular topic. Systematic sampling can also be used when random sampling is challenging to implement.
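The three sampling strategies just described can be sketched as follows. The sampling frame of 500 report identifiers is hypothetical, as is the purposeful inclusion criterion; a real study would define its frame and criteria substantively.

```python
# Sketch of random, systematic and purposeful sampling from a
# hypothetical frame of 500 eligible report identifiers.
import random

frame = [f"report_{i:03d}" for i in range(500)]   # hypothetical frame

# 1. Simple random sample of 50 reports
random.seed(42)                                   # for reproducibility
random_sample = random.sample(frame, k=50)

# 2. Systematic sample: every k-th report after a random start
k = len(frame) // 50                              # sampling interval
start = random.randrange(k)
systematic_sample = frame[start::k]

# 3. Purposeful sample: restrict to a subset meeting a (made-up)
#    criterion, here the last 50 identifiers in the frame
purposeful_sample = [r for r in frame if r >= "report_450"]

print(len(random_sample), len(systematic_sample), len(purposeful_sample))
```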
What is the unit of analysis?
Methodological studies with a research report as the unit of analysis
Many methodological studies use a research report (e.g. full manuscript of study, abstract portion of the study) as the unit of analysis, and inferences can be made at the study-level. However, both published and unpublished research-related reports can be studied. These may include articles, conference abstracts, registry entries etc.
Methodological studies with a design, analysis or reporting item as the unit of analysis
Some methodological studies report on items which may occur more than once per article. For example, Paquette et al. report on subgroup analyses in Cochrane reviews of atrial fibrillation in which 17 systematic reviews planned 56 subgroup analyses [ 93 ].
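The difference between a report-level and an item-level unit of analysis can be sketched with a toy dataset; the reviews and their planned subgroup analyses below are invented for illustration.

```python
# Each dict is one research report (report level); each report may plan
# several subgroup analyses (item level). Data are invented.
reviews = [
    {"id": "review_1", "subgroup_analyses": ["age", "sex"]},
    {"id": "review_2", "subgroup_analyses": []},
    {"id": "review_3", "subgroup_analyses": ["age", "dose", "region"]},
]

# Report-level inference: how many reviews planned any subgroup analysis?
n_reviews = len(reviews)
n_with_subgroups = sum(1 for r in reviews if r["subgroup_analyses"])

# Item-level inference: how many subgroup analyses were planned overall?
n_items = sum(len(r["subgroup_analyses"]) for r in reviews)

print(f"{n_with_subgroups}/{n_reviews} reviews planned subgroup analyses; "
      f"{n_items} analyses in total")
```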
This framework is outlined in Fig. 2 (a proposed framework for methodological studies).
Methodological studies have examined different aspects of reporting such as quality, completeness, consistency and adherence to reporting guidelines. As such, many of the methodological study examples cited in this tutorial are related to reporting. However, as an evolving field, the scope of research questions that can be addressed by methodological studies is expected to increase.
In this paper we have outlined the scope and purpose of methodological studies, along with examples of instances in which various approaches have been used. In the absence of formal guidance on the design, conduct, analysis and reporting of methodological studies, we have provided some advice to help make methodological studies consistent. This advice is grounded in good contemporary scientific practice. Generally, the research question should tie in with the sampling approach and planned analysis. We have also highlighted the variables that may inform findings from methodological studies. Lastly, we have provided suggestions for ways in which authors can categorize their methodological studies to inform their design and analysis.
Data sharing is not applicable to this article as no new data were created or analyzed in this study.
Abbreviations
CONSORT: Consolidated Standards of Reporting Trials
EPICOT: Evidence, Participants, Intervention, Comparison, Outcome, Timeframe
GRADE: Grading of Recommendations, Assessment, Development and Evaluations
PICOT: Participants, Intervention, Comparison, Outcome, Timeframe
PRISMA: Preferred Reporting Items for Systematic reviews and Meta-Analyses
SWAR: Studies Within a Review
SWAT: Studies Within a Trial
Chalmers I, Glasziou P. Avoidable waste in the production and reporting of research evidence. Lancet. 2009;374(9683):86–9.
Chan AW, Song F, Vickers A, Jefferson T, Dickersin K, Gotzsche PC, Krumholz HM, Ghersi D, van der Worp HB. Increasing value and reducing waste: addressing inaccessible research. Lancet. 2014;383(9913):257–66.
Ioannidis JP, Greenland S, Hlatky MA, Khoury MJ, Macleod MR, Moher D, Schulz KF, Tibshirani R. Increasing value and reducing waste in research design, conduct, and analysis. Lancet. 2014;383(9912):166–75.
Higgins JP, Altman DG, Gotzsche PC, Juni P, Moher D, Oxman AD, Savovic J, Schulz KF, Weeks L, Sterne JA. The Cochrane Collaboration's tool for assessing risk of bias in randomised trials. BMJ. 2011;343:d5928.
Moher D, Schulz KF, Altman DG. The CONSORT statement: revised recommendations for improving the quality of reports of parallel-group randomised trials. Lancet. 2001;357.
Liberati A, Altman DG, Tetzlaff J, Mulrow C, Gotzsche PC, Ioannidis JP, Clarke M, Devereaux PJ, Kleijnen J, Moher D. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration. PLoS Med. 2009;6(7):e1000100.
Shea BJ, Hamel C, Wells GA, Bouter LM, Kristjansson E, Grimshaw J, Henry DA, Boers M. AMSTAR is a reliable and valid measurement tool to assess the methodological quality of systematic reviews. J Clin Epidemiol. 2009;62(10):1013–20.
Shea BJ, Reeves BC, Wells G, Thuku M, Hamel C, Moran J, Moher D, Tugwell P, Welch V, Kristjansson E, et al. AMSTAR 2: a critical appraisal tool for systematic reviews that include randomised or non-randomised studies of healthcare interventions, or both. Bmj. 2017;358:j4008.
Lawson DO, Leenus A, Mbuagbaw L. Mapping the nomenclature, methodology, and reporting of studies that review methods: a pilot methodological review. Pilot Feasibility Studies. 2020;6(1):13.
Puljak L, Makaric ZL, Buljan I, Pieper D. What is a meta-epidemiological study? Analysis of published literature indicated heterogeneous study designs and definitions. J Comp Eff Res. 2020.
Abbade LPF, Wang M, Sriganesh K, Jin Y, Mbuagbaw L, Thabane L. The framing of research questions using the PICOT format in randomized controlled trials of venous ulcer disease is suboptimal: a systematic survey. Wound Repair Regen. 2017;25(5):892–900.
Gohari F, Baradaran HR, Tabatabaee M, Anijidani S, Mohammadpour Touserkani F, Atlasi R, Razmgir M. Quality of reporting randomized controlled trials (RCTs) in diabetes in Iran; a systematic review. J Diabetes Metab Disord. 2015;15(1):36.
Wang M, Jin Y, Hu ZJ, Thabane A, Dennis B, Gajic-Veljanoski O, Paul J, Thabane L. The reporting quality of abstracts of stepped wedge randomized trials is suboptimal: a systematic survey of the literature. Contemp Clin Trials Commun. 2017;8:1–10.
Shanthanna H, Kaushal A, Mbuagbaw L, Couban R, Busse J, Thabane L. A cross-sectional study of the reporting quality of pilot or feasibility trials in high-impact anesthesia journals. Can J Anaesth. 2018;65(11):1180–95.
Kosa SD, Mbuagbaw L, Borg Debono V, Bhandari M, Dennis BB, Ene G, Leenus A, Shi D, Thabane M, Valvasori S, et al. Agreement in reporting between trial publications and current clinical trial registry in high impact journals: a methodological review. Contemporary Clinical Trials. 2018;65:144–50.
Zhang Y, Florez ID, Colunga Lozano LE, Aloweni FAB, Kennedy SA, Li A, Craigie S, Zhang S, Agarwal A, Lopes LC, et al. A systematic survey on reporting and methods for handling missing participant data for continuous outcomes in randomized controlled trials. J Clin Epidemiol. 2017;88:57–66.
Hernández AV, Boersma E, Murray GD, Habbema JD, Steyerberg EW. Subgroup analyses in therapeutic cardiovascular clinical trials: are most of them misleading? Am Heart J. 2006;151(2):257–64.
Samaan Z, Mbuagbaw L, Kosa D, Borg Debono V, Dillenburg R, Zhang S, Fruci V, Dennis B, Bawor M, Thabane L. A systematic scoping review of adherence to reporting guidelines in health care literature. J Multidiscip Healthc. 2013;6:169–88.
Buscemi N, Hartling L, Vandermeer B, Tjosvold L, Klassen TP. Single data extraction generated more errors than double data extraction in systematic reviews. J Clin Epidemiol. 2006;59(7):697–703.
Carrasco-Labra A, Brignardello-Petersen R, Santesso N, Neumann I, Mustafa RA, Mbuagbaw L, Etxeandia Ikobaltzeta I, De Stio C, McCullagh LJ, Alonso-Coello P. Improving GRADE evidence tables part 1: a randomized trial shows improved understanding of content in summary-of-findings tables with a new format. J Clin Epidemiol. 2016;74:7–18.
The Northern Ireland Hub for Trials Methodology Research: SWAT/SWAR Information [ https://www.qub.ac.uk/sites/TheNorthernIrelandNetworkforTrialsMethodologyResearch/SWATSWARInformation/ ]. Accessed 31 Aug 2020.
Chick S, Sánchez P, Ferrin D, Morrice D. How to conduct a successful simulation study. In: Proceedings of the 2003 winter simulation conference: 2003; 2003. p. 66–70.
Mulrow CD. The medical review article: state of the science. Ann Intern Med. 1987;106(3):485–8.
Sacks HS, Reitman D, Pagano D, Kupelnick B. Meta-analysis: an update. Mount Sinai J Med New York. 1996;63(3–4):216–24.
Areia M, Soares M, Dinis-Ribeiro M. Quality reporting of endoscopic diagnostic studies in gastrointestinal journals: where do we stand on the use of the STARD and CONSORT statements? Endoscopy. 2010;42(2):138–47.
Knol M, Groenwold R, Grobbee D. P-values in baseline tables of randomised controlled trials are inappropriate but still common in high impact journals. Eur J Prev Cardiol. 2012;19(2):231–2.
Chen M, Cui J, Zhang AL, Sze DM, Xue CC, May BH. Adherence to CONSORT items in randomized controlled trials of integrative medicine for colorectal Cancer published in Chinese journals. J Altern Complement Med. 2018;24(2):115–24.
Hopewell S, Ravaud P, Baron G, Boutron I. Effect of editors' implementation of CONSORT guidelines on the reporting of abstracts in high impact medical journals: interrupted time series analysis. BMJ. 2012;344:e4178.
The Cochrane Methodology Register Issue 2 2009 [ https://cmr.cochrane.org/help.htm ]. Accessed 31 Aug 2020.
Mbuagbaw L, Kredo T, Welch V, Mursleen S, Ross S, Zani B, Motaze NV, Quinlan L. Critical EPICOT items were absent in Cochrane human immunodeficiency virus systematic reviews: a bibliometric analysis. J Clin Epidemiol. 2016;74:66–72.
Barton S, Peckitt C, Sclafani F, Cunningham D, Chau I. The influence of industry sponsorship on the reporting of subgroup analyses within phase III randomised controlled trials in gastrointestinal oncology. Eur J Cancer. 2015;51(18):2732–9.
Setia MS. Methodology series module 5: sampling strategies. Indian J Dermatol. 2016;61(5):505–9.
Wilson B, Burnett P, Moher D, Altman DG, Al-Shahi Salman R. Completeness of reporting of randomised controlled trials including people with transient ischaemic attack or stroke: a systematic review. Eur Stroke J. 2018;3(4):337–46.
Kahale LA, Diab B, Brignardello-Petersen R, Agarwal A, Mustafa RA, Kwong J, Neumann I, Li L, Lopes LC, Briel M, et al. Systematic reviews do not adequately report or address missing outcome data in their analyses: a methodological survey. J Clin Epidemiol. 2018;99:14–23.
De Angelis CD, Drazen JM, Frizelle FA, Haug C, Hoey J, Horton R, Kotzin S, Laine C, Marusic A, Overbeke AJPM, et al. Is this clinical trial fully registered?: a statement from the International Committee of Medical Journal Editors*. Ann Intern Med. 2005;143(2):146–8.
Ohtake PJ, Childs JD. Why publish study protocols? Phys Ther. 2014;94(9):1208–9.
Rombey T, Allers K, Mathes T, Hoffmann F, Pieper D. A descriptive analysis of the characteristics and the peer review process of systematic review protocols published in an open peer review journal from 2012 to 2017. BMC Med Res Methodol. 2019;19(1):57.
Grimes DA, Schulz KF. Bias and causal associations in observational research. Lancet. 2002;359(9302):248–52.
Porta M, editor. A dictionary of epidemiology. 5th ed. Oxford: Oxford University Press; 2008.
El Dib R, Tikkinen KAO, Akl EA, Gomaa HA, Mustafa RA, Agarwal A, Carpenter CR, Zhang Y, Jorge EC, Almeida R, et al. Systematic survey of randomized trials evaluating the impact of alternative diagnostic strategies on patient-important outcomes. J Clin Epidemiol. 2017;84:61–9.
Helzer JE, Robins LN, Taibleson M, Woodruff RA Jr, Reich T, Wish ED. Reliability of psychiatric diagnosis. I. a methodological review. Arch Gen Psychiatry. 1977;34(2):129–33.
Chung ST, Chacko SK, Sunehag AL, Haymond MW. Measurements of gluconeogenesis and Glycogenolysis: a methodological review. Diabetes. 2015;64(12):3996–4010.
Sterne JA, Juni P, Schulz KF, Altman DG, Bartlett C, Egger M. Statistical methods for assessing the influence of study characteristics on treatment effects in 'meta-epidemiological' research. Stat Med. 2002;21(11):1513–24.
Moen EL, Fricano-Kugler CJ, Luikart BW, O’Malley AJ. Analyzing clustered data: why and how to account for multiple observations nested within a study participant? PLoS One. 2016;11(1):e0146721.
Zyzanski SJ, Flocke SA, Dickinson LM. On the nature and analysis of clustered data. Ann Fam Med. 2004;2(3):199–200.
Mathes T, Klassen P, Pieper D. Frequency of data extraction errors and methods to increase data extraction quality: a methodological review. BMC Med Res Methodol. 2017;17(1):152.
Bui DDA, Del Fiol G, Hurdle JF, Jonnalagadda S. Extractive text summarization system to aid data extraction from full text in systematic review development. J Biomed Inform. 2016;64:265–72.
Bui DD, Del Fiol G, Jonnalagadda S. PDF text classification to leverage information extraction from publication reports. J Biomed Inform. 2016;61:141–8.
Maticic K, Krnic Martinic M, Puljak L. Assessment of reporting quality of abstracts of systematic reviews with meta-analysis using PRISMA-A and discordance in assessments between raters without prior experience. BMC Med Res Methodol. 2019;19(1):32.
Speich B. Blinding in surgical randomized clinical trials in 2015. Ann Surg. 2017;266(1):21–2.
Abraha I, Cozzolino F, Orso M, Marchesi M, Germani A, Lombardo G, Eusebi P, De Florio R, Luchetta ML, Iorio A, et al. A systematic review found that deviations from intention-to-treat are common in randomized trials and systematic reviews. J Clin Epidemiol. 2017;84:37–46.
Zhong Y, Zhou W, Jiang H, Fan T, Diao X, Yang H, Min J, Wang G, Fu J, Mao B. Quality of reporting of two-group parallel randomized controlled clinical trials of multi-herb formulae: A survey of reports indexed in the Science Citation Index Expanded. Eur J Integrative Med. 2011;3(4):e309–16.
Farrokhyar F, Chu R, Whitlock R, Thabane L. A systematic review of the quality of publications reporting coronary artery bypass grafting trials. Can J Surg. 2007;50(4):266–77.
Oltean H, Gagnier JJ. Use of clustering analysis in randomized controlled trials in orthopaedic surgery. BMC Med Res Methodol. 2015;15:17.
Fleming PS, Koletsi D, Pandis N. Blinded by PRISMA: are systematic reviewers focusing on PRISMA and ignoring other guidelines? PLoS One. 2014;9(5):e96407.
Balasubramanian SP, Wiener M, Alshameeri Z, Tiruvoipati R, Elbourne D, Reed MW. Standards of reporting of randomized controlled trials in general surgery: can we do better? Ann Surg. 2006;244(5):663–7.
de Vries TW, van Roon EN. Low quality of reporting adverse drug reactions in paediatric randomised controlled trials. Arch Dis Child. 2010;95(12):1023–6.
Borg Debono V, Zhang S, Ye C, Paul J, Arya A, Hurlburt L, Murthy Y, Thabane L. The quality of reporting of RCTs used within a postoperative pain management meta-analysis, using the CONSORT statement. BMC Anesthesiol. 2012;12:13.
Kaiser KA, Cofield SS, Fontaine KR, Glasser SP, Thabane L, Chu R, Ambrale S, Dwary AD, Kumar A, Nayyar G, et al. Is funding source related to study reporting quality in obesity or nutrition randomized control trials in top-tier medical journals? Int J Obes. 2012;36(7):977–81.
Thomas O, Thabane L, Douketis J, Chu R, Westfall AO, Allison DB. Industry funding and the reporting quality of large long-term weight loss trials. Int J Obes. 2008;32(10):1531–6.
Khan NR, Saad H, Oravec CS, Rossi N, Nguyen V, Venable GT, Lillard JC, Patel P, Taylor DR, Vaughn BN, et al. A review of industry funding in randomized controlled trials published in the neurosurgical literature-the elephant in the room. Neurosurgery. 2018;83(5):890–7.
Hansen C, Lundh A, Rasmussen K, Hrobjartsson A. Financial conflicts of interest in systematic reviews: associations with results, conclusions, and methodological quality. Cochrane Database Syst Rev. 2019;8:Mr000047.
Kiehna EN, Starke RM, Pouratian N, Dumont AS. Standards for reporting randomized controlled trials in neurosurgery. J Neurosurg. 2011;114(2):280–5.
Liu LQ, Morris PJ, Pengel LH. Compliance to the CONSORT statement of randomized controlled trials in solid organ transplantation: a 3-year overview. Transpl Int. 2013;26(3):300–6.
Bala MM, Akl EA, Sun X, Bassler D, Mertz D, Mejza F, Vandvik PO, Malaga G, Johnston BC, Dahm P, et al. Randomized trials published in higher vs. lower impact journals differ in design, conduct, and analysis. J Clin Epidemiol. 2013;66(3):286–95.
Lee SY, Teoh PJ, Camm CF, Agha RA. Compliance of randomized controlled trials in trauma surgery with the CONSORT statement. J Trauma Acute Care Surg. 2013;75(4):562–72.
Ziogas DC, Zintzaras E. Analysis of the quality of reporting of randomized controlled trials in acute and chronic myeloid leukemia, and myelodysplastic syndromes as governed by the CONSORT statement. Ann Epidemiol. 2009;19(7):494–500.
Alvarez F, Meyer N, Gourraud PA, Paul C. CONSORT adoption and quality of reporting of randomized controlled trials: a systematic analysis in two dermatology journals. Br J Dermatol. 2009;161(5):1159–65.
Mbuagbaw L, Thabane M, Vanniyasingam T, Borg Debono V, Kosa S, Zhang S, Ye C, Parpia S, Dennis BB, Thabane L. Improvement in the quality of abstracts in major clinical journals since CONSORT extension for abstracts: a systematic review. Contemporary Clin trials. 2014;38(2):245–50.
Thabane L, Chu R, Cuddy K, Douketis J. What is the quality of reporting in weight loss intervention studies? A systematic review of randomized controlled trials. Int J Obes. 2007;31(10):1554–9.
Murad MH, Wang Z. Guidelines for reporting meta-epidemiological methodology research. Evidence Based Med. 2017;22(4):139.
METRIC - MEthodological sTudy ReportIng Checklist: guidelines for reporting methodological studies in health research [ http://www.equator-network.org/library/reporting-guidelines-under-development/reporting-guidelines-under-development-for-other-study-designs/#METRIC ]. Accessed 31 Aug 2020.
Jager KJ, Zoccali C, MacLeod A, Dekker FW. Confounding: what it is and how to deal with it. Kidney Int. 2008;73(3):256–60.
Parker SG, Halligan S, Erotocritou M, Wood CPJ, Boulton RW, Plumb AAO, Windsor ACJ, Mallett S. A systematic methodological review of non-randomised interventional studies of elective ventral hernia repair: clear definitions and a standardised minimum dataset are needed. Hernia. 2019.
Bouwmeester W, Zuithoff NPA, Mallett S, Geerlings MI, Vergouwe Y, Steyerberg EW, Altman DG, Moons KGM. Reporting and methods in clinical prediction research: a systematic review. PLoS Med. 2012;9(5):1–12.
Schiller P, Burchardi N, Niestroj M, Kieser M. Quality of reporting of clinical non-inferiority and equivalence randomised trials--update and extension. Trials. 2012;13:214.
Riado Minguez D, Kowalski M, Vallve Odena M, Longin Pontzen D, Jelicic Kadic A, Jeric M, Dosenovic S, Jakus D, Vrdoljak M, Poklepovic Pericic T, et al. Methodological and reporting quality of systematic reviews published in the highest ranking journals in the field of pain. Anesth Analg. 2017;125(4):1348–54.
Thabut G, Estellat C, Boutron I, Samama CM, Ravaud P. Methodological issues in trials assessing primary prophylaxis of venous thrombo-embolism. Eur Heart J. 2005;27(2):227–36.
Puljak L, Riva N, Parmelli E, González-Lorenzo M, Moja L, Pieper D. Data extraction methods: an analysis of internal reporting discrepancies in single manuscripts and practical advice. J Clin Epidemiol. 2020;117:158–64.
Ritchie A, Seubert L, Clifford R, Perry D, Bond C. Do randomised controlled trials relevant to pharmacy meet best practice standards for quality conduct and reporting? A systematic review. Int J Pharm Pract. 2019.
Babic A, Vuka I, Saric F, Proloscic I, Slapnicar E, Cavar J, Pericic TP, Pieper D, Puljak L. Overall bias methods and their use in sensitivity analysis of Cochrane reviews were not consistent. J Clin Epidemiol. 2019.
Tan A, Porcher R, Crequit P, Ravaud P, Dechartres A. Differences in treatment effect size between overall survival and progression-free survival in immunotherapy trials: a Meta-epidemiologic study of trials with results posted at ClinicalTrials.gov. J Clin Oncol. 2017;35(15):1686–94.
Croitoru D, Huang Y, Kurdina A, Chan AW, Drucker AM. Quality of reporting in systematic reviews published in dermatology journals. Br J Dermatol. 2020;182(6):1469–76.
Khan MS, Ochani RK, Shaikh A, Vaduganathan M, Khan SU, Fatima K, Yamani N, Mandrola J, Doukky R, Krasuski RA. Assessing the quality of reporting of harms in randomized controlled trials published in high impact cardiovascular journals. Eur Heart J Qual Care Clin Outcomes. 2019.
Rosmarakis ES, Soteriades ES, Vergidis PI, Kasiakou SK, Falagas ME. From conference abstract to full paper: differences between data presented in conferences and journals. FASEB J. 2005;19(7):673–80.
Mueller M, D’Addario M, Egger M, Cevallos M, Dekkers O, Mugglin C, Scott P. Methods to systematically review and meta-analyse observational studies: a systematic scoping review of recommendations. BMC Med Res Methodol. 2018;18(1):44.
Li G, Abbade LPF, Nwosu I, Jin Y, Leenus A, Maaz M, Wang M, Bhatt M, Zielinski L, Sanger N, et al. A scoping review of comparisons between abstracts and full reports in primary biomedical research. BMC Med Res Methodol. 2017;17(1):181.
Krnic Martinic M, Pieper D, Glatt A, Puljak L. Definition of a systematic review used in overviews of systematic reviews, meta-epidemiological studies and textbooks. BMC Med Res Methodol. 2019;19(1):203.
Analytical study [ https://medical-dictionary.thefreedictionary.com/analytical+study ]. Accessed 31 Aug 2020.
Tricco AC, Tetzlaff J, Pham B, Brehaut J, Moher D. Non-Cochrane vs. Cochrane reviews were twice as likely to have positive conclusion statements: cross-sectional study. J Clin Epidemiol. 2009;62(4):380–6 e381.
Schalken N, Rietbergen C. The reporting quality of systematic reviews and Meta-analyses in industrial and organizational psychology: a systematic review. Front Psychol. 2017;8:1395.
Ranker LR, Petersen JM, Fox MP. Awareness of and potential for dependent error in the observational epidemiologic literature: A review. Ann Epidemiol. 2019;36:15–9 e12.
Paquette M, Alotaibi AM, Nieuwlaat R, Santesso N, Mbuagbaw L. A meta-epidemiological study of subgroup analyses in cochrane systematic reviews of atrial fibrillation. Syst Rev. 2019;8(1):241.
This work did not receive any dedicated funding.
Authors and affiliations
Department of Health Research Methods, Evidence and Impact, McMaster University, Hamilton, ON, Canada
Lawrence Mbuagbaw, Daeria O. Lawson & Lehana Thabane
Biostatistics Unit/FSORC, 50 Charlton Avenue East, St Joseph’s Healthcare—Hamilton, 3rd Floor Martha Wing, Room H321, Hamilton, Ontario, L8N 4A6, Canada
Lawrence Mbuagbaw & Lehana Thabane
Centre for the Development of Best Practices in Health, Yaoundé, Cameroon
Lawrence Mbuagbaw
Center for Evidence-Based Medicine and Health Care, Catholic University of Croatia, Ilica 242, 10000, Zagreb, Croatia
Livia Puljak
Department of Epidemiology and Biostatistics, School of Public Health – Bloomington, Indiana University, Bloomington, IN, 47405, USA
David B. Allison
Departments of Paediatrics and Anaesthesia, McMaster University, Hamilton, ON, Canada
Lehana Thabane
Centre for Evaluation of Medicine, St. Joseph’s Healthcare-Hamilton, Hamilton, ON, Canada
Population Health Research Institute, Hamilton Health Sciences, Hamilton, ON, Canada
LM conceived the idea and drafted the outline and paper. DOL and LT commented on the idea and draft outline. LM, LP and DOL performed literature searches and data extraction. All authors (LM, DOL, LT, LP, DBA) reviewed several draft versions of the manuscript and approved the final manuscript.
Correspondence to Lawrence Mbuagbaw .
Ethics approval and consent to participate
Not applicable.
Competing interests
DOL, DBA, LM, LP and LT are involved in the development of a reporting guideline for methodological studies.
Publisher’s note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.
Cite this article
Mbuagbaw, L., Lawson, D.O., Puljak, L. et al. A tutorial on methodological studies: the what, when, how and why. BMC Med Res Methodol 20 , 226 (2020). https://doi.org/10.1186/s12874-020-01107-7
Received : 27 May 2020
Accepted : 27 August 2020
Published : 07 September 2020
DOI : https://doi.org/10.1186/s12874-020-01107-7
ISSN: 1471-2288
Updated 25 Jul 2024
When you start working on your first academic research project, you may feel overwhelmed by the abundance of technical concepts in common use. You may encounter seemingly endless terms like "research methods," "research methodology," and "data collection and analysis." Let’s clarify what they mean.
Before starting any research work, you must know what methods you’ll use to reach your goals. For that, you need to understand the definition of research methodology. The research methodology is essential to a dissertation, thesis, or research paper because it explains the methods applied to collect and analyze data. This chapter enables readers to assess the validity and credibility of your research by providing the following information:
The methodology is the overall plan of your project, which includes studying the methods applied in research and the basic principles and theories to develop a suitable approach for achieving your purposes. As for methods, they involve specific procedures used for collecting and analyzing data, like surveys, statistical tests, and experiments.
A simple description of the methods may be sufficient for shorter scientific papers. A dedicated methodology section may be required for more extensive and complex projects (a dissertation or thesis). In this section, a researcher should explain the approach used to explore the research questions and provide links to relevant sources to support the choice of methods.
You shouldn’t underestimate the importance of this paragraph, as it demonstrates the thoroughness of your research process and its potential for further investigation. By including a detailed description of the research methods applied, you increase the credibility of your paper and contextualize it within your area of study. This paragraph also serves as a reference point for readers who have questions or criticisms about other parts of your article.
This paragraph doesn’t have to describe the data collection or analysis process in detail. Instead, it should outline the essential approaches and research perspectives. Here are the steps to take to complete a well-thought-out methodology paragraph:
To ensure a clear and comprehensive methodology paragraph, it’s essential to avoid irrelevant information.
Adherence to ethical standards is critical to establishing trust, mutual respect, accountability, and fairness in research. Researchers should keep the following ethical considerations in mind when collecting and reporting data:
Three research methodology types are distinguished by their focus on numbers, words, or both. Let’s clarify their differences and features.
This approach aims to measure and test numerical data and is typically used to confirm or test a hypothesis. It employs techniques such as tests, surveys, and existing databases. For instance, a quantitative methodology may be appropriate if you need to test several hypotheses.
It involves the collection and analysis of textual data and words. This approach is commonly used for exploratory research, where the study objective is to understand a phenomenon. It employs techniques like interviews, observations, and focus groups. Exploratory research is particularly useful in fields such as sociology or psychology, which aim to understand human behavior.
This approach combines quantitative and qualitative methodologies. The quantitative method provides definitive facts and figures, while the qualitative approach adds a human aspect to the research. Using a mixed-methods approach, researchers can obtain both precise and exploratory data, leading to richer, more nuanced findings.
A quantitative approach produces practical results if your research problem involves collecting extensive numerical data. By contrast, a qualitative methodology is more effective if you aim to understand people and their perspectives on events. To choose the most suitable methodology, keep returning to the research question and think about what you hope to achieve.
Many students feel confused when gathering data for their projects because they don’t know how to find resources for a research paper. The data collection process offers various options, which can be divided into the following types:
Interviews can be conducted one-on-one or in groups. They may be unstructured, structured, or semi-structured, which depends on how formal the questions are. In group interviews, you may ask participants to share their perceptions or opinions on specific topics.
This method is similar to a group interview but differs in emphasis. A focus group involves a group of individuals discussing a particular topic while the researcher takes notes and summarizes the data received.
This approach can be used to study human behavior and is conducted in a structured or spontaneous manner. Structured observations are made at a predetermined time and place, while spontaneous observations occur in participants' natural environments to analyze their behavior in everyday life.
Following this method, questions are posed to participants, either in person or virtually, to gather responses. Questions may be open-ended (essay-style) or closed (multiple-choice), and a survey can combine both types.
This method doesn’t involve asking people questions. Instead, it relies on existing data for a study, which can be cost-effective and efficient because it uses research that has already been completed. Still, because the researcher has limited control over how the data were produced, documents and records may not provide a complete data source.
This method involves observing a person, situation, cultural group, or institution closely and thoroughly to explore the life of this social unit. The case study approach considers the total situation, including the factors, processes, and consequences of events and the individual behavior in its entire setting. It also involves analyzing and comparing cases to formulate hypotheses.
The choice of data collection methodology in research depends on your academic paper objectives, resource constraints, and practicality. Let’s consider an example: if you’re doing exploratory research, qualitative methods such as interviews and focus groups are more appropriate. On the other hand, if your project focuses on testing specific hypotheses or variables, large-scale surveys that provide substantial numerical data are likely more suitable.
Data analysis methods are typically categorized based on whether the research is qualitative or quantitative. Let’s consider them in detail.
Qualitative data analysis starts with data coding and is followed by one or more analysis techniques. While using this approach, researchers investigate based on images, observations, and language. They commonly employ the following data analysis methods:
This approach involves categorizing and interpreting the meaning of language (sentences, phrases, or words). Researchers apply it to various data sources, like newspapers, books, video recordings, social media posts, etc.
This method looks at how social contexts shape communication and meaning. It’s applied to explore cultural and social factors that affect the production and reception of spoken and written communication (political speeches, everyday conversations, media texts, etc.).
This method involves coding and examining data to discover general themes and patterns through analyzing focus group discussions, interview transcripts, and open-ended survey responses. The study’s results based on this method illustrate how the themes identified contribute to a broader understanding of the research question.
It involves interpreting story-based or narrative information and identifying the role and significance of people’s experiences. Narrative analysis can be applied to life histories, biographical accounts, and personal stories.
This approach is applied to develop concepts and theories based on the data gathered during research. It aims to generate theories from the data rather than begin with existing hypotheses and theories. When data is gathered, it is analyzed to discover themes and patterns which serve as a basis for developing concepts.
This is a way to explore how individuals interpret relationships, events, and other spheres of their lives. The central concept of this method is subjective experience, which is analyzed through interviews with participants to uncover how they perceive and make sense of the world around them.
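Several of these qualitative approaches, thematic analysis in particular, start from coded data. As a minimal sketch (in Python, with invented transcript names and codes, not a prescribed tool), tallying hand-assigned codes is a common first step toward identifying themes:

```python
from collections import Counter

# Hypothetical codes assigned by hand to excerpts from three interview
# transcripts; in real thematic analysis, codes come from careful reading.
coded_excerpts = {
    "interview_1": ["time_pressure", "peer_support", "time_pressure"],
    "interview_2": ["peer_support", "motivation"],
    "interview_3": ["time_pressure", "motivation", "motivation"],
}

# Count how often each code appears overall...
code_counts = Counter(code for codes in coded_excerpts.values() for code in codes)

# ...and in how many transcripts it appears at least once.
coverage = {
    code: sum(1 for codes in coded_excerpts.values() if code in codes)
    for code in code_counts
}

for code, count in code_counts.most_common():
    print(f"{code}: {count} mentions across {coverage[code]} transcripts")
```

Codes that recur across many transcripts are candidates for grouping into broader themes; the counting itself is trivial, but it makes the coding work transparent and auditable.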
Quantitative data analysis, in contrast, works with numerical data and is typically characterized by methods such as descriptive statistics (means, frequencies, standard deviations), inferential statistics (t-tests, ANOVA), and regression or correlation analysis.
The research purposes, practicalities, and resource constraints determine the choice of data analysis method.
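On the quantitative side, a minimal sketch of descriptive analysis, using only Python's standard library and invented pre/post survey scores, might look like this:

```python
import statistics

# Hypothetical stress scores (1-10 scale) from six participants,
# measured before and after an intervention; the numbers are invented.
before = [7, 8, 6, 9, 7, 8]
after = [5, 6, 5, 7, 6, 5]

mean_before = statistics.mean(before)
mean_after = statistics.mean(after)
sd_before = statistics.stdev(before)
mean_change = mean_before - mean_after

# Descriptive statistics summarize the data; a real study would follow
# up with an inferential test (e.g., a paired t-test) before drawing
# conclusions about the intervention's effect.
print(f"Mean before: {mean_before:.2f}, after: {mean_after:.2f}, "
      f"change: {mean_change:.2f} (SD before: {sd_before:.2f})")
```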
Your research objectives heavily influence the choice of your methodological approach. Therefore, taking a step back and considering the bigger picture of your research before making any methodology decisions is crucial. To begin, you should determine whether your investigation is confirmatory or exploratory.
If your paper’s objectives are primarily exploratory, qualitative data collection research methodologies (for example, interviews) and analysis methods (such as thematic analysis) may be more suitable. In contrast, if your paper is looking to test or measure something (for example, confirmatory), quantitative data collection methods (like surveys) and statistical analyses may be more appropriate. It is essential to remember that your research objectives should always be the starting point. All methodology decisions should stem from them.
You must understand that your goal is not to describe every method in your research methodology section but to explain why you’ve applied it. Here are some tips for writing a strong research methodology.
1. Concentrate on your purposes and research questions.
You should clearly show why your methods match your objectives and persuade the readers that you’ve selected the best possible method to answer your research questions and problem statement.
2. Refer to relevant sources.
You can strengthen your methodology by citing existing research in your study area. This allows you to do the following:
3. Think about your readers when writing for them.
Decide how much detail and what kind of information you need to give, and avoid redundancy. If you use methods that are standard for your discipline, you don’t need to give much justification or background.
Where does the methodology section go in a research paper?
The methodology section of a research paper is always presented after the introduction and before the results, discussion, and conclusion. This structure is also used in other types of research writing, such as a thesis, dissertation, or research proposal. However, depending on the scope and purpose of the paper, a literature review or theoretical framework may need to be included before the research methodology.
What is the difference between reliability and validity?
When measuring something, the two most important concepts are reliability and validity. Reliability relates to the consistency of a measurement: the ability to reproduce results under identical conditions. Validity, by contrast, concerns the accuracy of a measurement: whether the results truly represent what they are intended to measure. In experimental studies, it is also important to analyze the internal and external validity of the experiment.
What is the most common research methodology?
The most popular research methodologies are quantitative and qualitative. The choice of a suitable approach depends on the research objective. Researchers use a quantitative method if a research problem demands a large amount of numerical data to test hypotheses. If their objective is to gain insight into people's perceptions and understanding of events, they opt for a qualitative method.
What is a good research methodology example?
A good research methodology is thorough, transparent, and systematic. It must be designed to answer the research question and hypothesis and ensure the results are valid and reliable. Here is a good research methodology sample:
" This study aimed to investigate the effects of mindfulness-based interventions on stress and well-being among college students. A randomized controlled trial design was used in which people were randomly assigned to either an experimental group that received a mindfulness-based intervention or a control group that received no intervention. The study sample comprised 80 undergraduate students at a major US public university. Data were collected through self-report measures of stress and well-being at baseline, immediately after the intervention, and during three months of follow-up. Descriptive statistics were used to present the characteristics of the sample, and repeated measures ANOVA was used to explore the effect of the intervention on stress and well-being over time. Ethical considerations were considered throughout the study, and informed consent was obtained from all participants before participation. The university's institutional review board approved the study."
Steven Robinson is an academic writing expert with a degree in English literature. His expertise, patient approach, and support empower students to express ideas clearly. On EduBirdie's blog, he provides valuable writing guides on essays, research papers, and other intriguing topics. Enjoys chess in free time.
Before conducting a study, a research proposal should be created that outlines researchers’ plans and methodology and is submitted to the concerned evaluating organization or person. Creating a research proposal is an important step to ensure that researchers are on track and are moving forward as intended. A research proposal can be defined as a detailed plan or blueprint for the proposed research that you intend to undertake. It provides readers with a snapshot of your project by describing what you will investigate, why it is needed, and how you will conduct the research.
Your research proposal should aim to explain to the readers why your research is relevant and original, that you understand the context and current scenario in the field, have the appropriate resources to conduct the research, and that the research is feasible given the usual constraints.
This article will describe in detail the purpose and typical structure of a research proposal, along with examples and templates to help you ace this step in your research journey.
A research proposal¹,² can be defined as a formal report that describes your proposed research, its objectives, methodology, implications, and other important details. Research proposals are the framework of your research and are used to obtain approvals or grants to conduct the study from various committees or organizations. Consequently, research proposals should convince readers of your study’s credibility, accuracy, achievability, practicality, and reproducibility.
With research proposals, researchers usually aim to persuade the readers, funding agencies, educational institutions, and supervisors to approve the proposal. To achieve this, the report should be well structured with the objectives written in clear, understandable language devoid of jargon. A well-organized research proposal conveys to the readers or evaluators that the writer has thought out the research plan meticulously and has the resources to ensure timely completion.
A research proposal is a sales pitch and therefore should be detailed enough to convince your readers, who could be supervisors, ethics committees, universities, etc., that what you’re proposing has merit and is feasible. Research proposals can help students discuss their dissertation with their faculty or fulfill course requirements and also help researchers obtain funding. A well-structured proposal instills confidence among readers about your ability to conduct and complete the study as proposed.
Research proposals can be written for several reasons:³
Research proposals should aim to answer the three basic questions—what, why, and how.
The What question should be answered by describing the specific subject being researched. It should typically include the objectives, the cohort details, and the location or setting.
The Why question should be answered by describing the existing scenario of the subject, listing unanswered questions, identifying gaps in the existing research, and describing how your study can address these gaps, along with the implications and significance.
The How question should be answered by describing the proposed research methodology, data analysis tools expected to be used, and other details to describe your proposed methodology.
Here is a research proposal sample template (with examples) from the University of Rochester Medical Center.⁴ The sections in all research proposals are essentially the same, although different terminology and other specific sections may be used depending on the subject.
If you want to know how to make a research proposal impactful, include the following components:¹
1. Introduction
This section provides a background of the study, including the research topic, what is already known about it and the gaps, and the significance of the proposed research.
2. Literature review
This section contains descriptions of all the previous relevant studies pertaining to the research topic. Every study cited should be described in a few sentences, moving from general studies to more specific ones. This section builds on the understanding gained by readers in the Introduction section and supports it by citing relevant prior literature, indicating to readers that you have thoroughly researched your subject.
3. Objectives
Once the background and gaps in the research topic have been established, authors must now state the aims of the research clearly. Hypotheses should be mentioned here. This section further helps readers understand what your study’s specific goals are.
4. Research design and methodology
Here, authors should clearly describe the methods they intend to use to achieve their proposed objectives. Important components of this section include the population and sample size, data collection and analysis methods and duration, statistical analysis software, measures to avoid bias (randomization, blinding), etc.
5. Ethical considerations
This refers to the protection of participants’ rights, such as the right to privacy, right to confidentiality, etc. Researchers need to obtain informed consent and institutional review approval by the required authorities and mention this clearly for transparency.
6. Budget/funding
Researchers should prepare their budget and include all expected expenditures. An additional allowance for contingencies such as delays should also be factored in.
7. Appendices
This section typically includes information that supports the research proposal and may include informed consent forms, questionnaires, participant information, measurement tools, etc.
8. Citations
Writing a research proposal begins much before the actual task of writing. Planning the research proposal structure and content is an important stage, which, if done efficiently, can help you seamlessly transition into the writing stage.³,⁵
Key Takeaways
Here’s a summary of the main points about research proposals discussed in the previous sections:
Q1. How is a research proposal evaluated?
A1. In general, most evaluators, including universities, broadly use the following criteria to evaluate research proposals.⁶
Q2. What is the difference between the Introduction and Literature Review sections in a research proposal?
A2. The Introduction or Background section in a research proposal sets the context of the study by describing the current scenario of the subject and identifying the gaps and need for the research. A Literature Review, on the other hand, provides references to all prior relevant literature to help corroborate the gaps identified and the research need.
Q3. How long should a research proposal be?
A3. Research proposal lengths vary with the evaluating authority like universities or committees and also the subject. Here’s a table that lists the typical research proposal lengths for a few universities.
| University | Program | Typical length |
| --- | --- | --- |
|  | Arts programs | 1,000-1,500 |
| University of Birmingham | Law School programs | 2,500 |
|  | PhD | 2,500 |
|  |  | 2,000 |
|  | Research degrees | 2,000-3,500 |
Q4. What are the common mistakes to avoid in a research proposal?
A4. Here are a few common mistakes that you must avoid while writing a research proposal.⁷
Thus, a research proposal is an essential document that can help you promote your research and secure funds and grants for conducting your research. Consequently, it should be well written in clear language and include all essential details to convince the evaluators of your ability to conduct the research as proposed.
This article has described all the important components of a research proposal and has also provided tips to improve your writing style. We hope these tips help you write a well-structured research proposal, whether to secure a grant or for any other purpose.
References
A title page is required for all APA Style papers. There are both student and professional versions of the title page. Students should use the student version of the title page unless their instructor or institution has requested they use the professional version. APA provides a student title page guide (PDF, 199KB) to assist students in creating their title pages.
The student title page includes the paper title, author names (the byline), author affiliation, course number and name for which the paper is being submitted, instructor name, assignment due date, and page number, as shown in this example.
Title page setup is covered in the seventh edition APA Style manuals in the Publication Manual Section 2.3 and the Concise Guide Section 1.6.
Student papers do not include a running head unless requested by the instructor or institution.
Follow the guidelines described next to format each element of the student title page.
Paper title | Place the title three to four lines down from the top of the title page. Center it and type it in bold font. Capitalize major words of the title. Place the main title and any subtitle on separate double-spaced lines if desired. There is no maximum length for titles; however, keep titles focused and include key terms. | |
Author names | Place one double-spaced blank line between the paper title and the author names. Center author names on their own line. If there are two authors, use the word “and” between authors; if there are three or more authors, place a comma between author names and use the word “and” before the final author name. | Cecily J. Sinclair and Adam Gonzaga |
Author affiliation | For a student paper, the affiliation is the institution where the student attends school. Include both the name of any department and the name of the college, university, or other institution, separated by a comma. Center the affiliation on the next double-spaced line after the author name(s). | Department of Psychology, University of Georgia |
Course number and name | Provide the course number as shown on instructional materials, followed by a colon and the course name. Center the course number and name on the next double-spaced line after the author affiliation. | PSY 201: Introduction to Psychology |
Instructor name | Provide the name of the instructor for the course using the format shown on instructional materials. Center the instructor name on the next double-spaced line after the course number and name. | Dr. Rowan J. Estes |
Assignment due date | Provide the due date for the assignment. Center the due date on the next double-spaced line after the instructor name. Use the date format commonly used in your country. | October 18, 2020 |
Page number | Use the page number 1 on the title page. Use the automatic page-numbering function of your word processing program to insert page numbers in the top right corner of the page header. | 1 |
The professional title page includes the paper title, author names (the byline), author affiliation(s), author note, running head, and page number, as shown in the following example.
Follow the guidelines described next to format each element of the professional title page.
Paper title | Place the title three to four lines down from the top of the title page. Center it and type it in bold font. Capitalize major words of the title. Place the main title and any subtitle on separate double-spaced lines if desired. There is no maximum length for titles; however, keep titles focused and include key terms. | |
Author names | Place one double-spaced blank line between the paper title and the author names. Center author names on their own line. If there are two authors, use the word “and” between authors; if there are three or more authors, place a comma between author names and use the word “and” before the final author name. | Francesca Humboldt |
When different authors have different affiliations, use superscript numerals after author names to connect the names to the appropriate affiliation(s). If all authors have the same affiliation, superscript numerals are not used (see Section 2.3 of the Publication Manual for more on how to set up bylines and affiliations). | Tracy Reuter, Arielle Borovsky, and Casey Lew-Williams | |
Author affiliation | For a professional paper, the affiliation is the institution at which the research was conducted. Include both the name of any department and the name of the college, university, or other institution, separated by a comma. Center the affiliation on the next double-spaced line after the author names; when there are multiple affiliations, center each affiliation on its own line. | Department of Nursing, Morrigan University |
When different authors have different affiliations, use superscript numerals before affiliations to connect the affiliations to the appropriate author(s). Do not use superscript numerals if all authors share the same affiliations (see Section 2.3 of the Publication Manual for more). | Department of Psychology, Princeton University | |
Author note | Place the author note in the bottom half of the title page. Center and bold the label “Author Note.” Align the paragraphs of the author note to the left. For further information on the contents of the author note, see Section 2.7 of the Publication Manual. | n/a |
Running head | The running head appears in all-capital letters in the page header of all pages, including the title page. Align the running head to the left margin. Do not use the label “Running head:” before the running head. | PREDICTION ERRORS SUPPORT CHILDREN’S WORD LEARNING |
Page number | Use the page number 1 on the title page. Use the automatic page-numbering function of your word processing program to insert page numbers in the top right corner of the page header. | 1 |
Published: August 08, 2024
One of the most underrated skills you can have as a marketer is marketing research — which is great news for this unapologetic cyber sleuth.
From brand design and product development to buyer personas and competitive analysis, I’ve researched a number of initiatives in my decade-long marketing career.
And let me tell you: having the right marketing research methods in your toolbox is a must.
Market research is the secret to crafting a strategy that will truly help you accomplish your goals. The good news is there is no shortage of options.
Thanks to the Internet, we have more marketing research (or market research) methods at our fingertips than ever, but they’re not all created equal. Let’s quickly go over how to choose the right one.
What are you researching? Do you need to understand your audience better? How about your competition? Or maybe you want to know more about your customer’s feelings about a specific product.
Before starting your research, take some time to identify precisely what you’re looking for. This could be a goal you want to reach, a problem you need to solve, or a question you need to answer.
For example, an objective may be as foundational as understanding your ideal customer better to create new buyer personas for your marketing agency (pause for flashbacks to my former life).
Or if you’re an organic soda company, it could be trying to learn what flavors people are craving.
Next, determine what data type will best answer the problems or questions you identified. There are primarily two types: qualitative and quantitative. (Sounds familiar, right?)
Understanding the differences between qualitative and quantitative data will help you pinpoint which research methods will yield the desired results.
For instance, thinking of our earlier examples, qualitative data would usually be best suited for buyer personas, while quantitative data is more useful for the soda flavors.
However, truth be told, the two really work together.
Qualitative conclusions are usually drawn from quantitative, numerical data. So, you’ll likely need both to get the complete picture of your subject.
For example, if your quantitative data says 70% of people are Team Black and only 30% are Team Green — Shout out to my fellow House of the Dragon fans — your qualitative data will say people support Black more than Green.
(As they should.)
You’ll also want to understand the difference between primary and secondary research.
Primary research involves collecting new, original data directly from the source (say, your target market). In other words, it’s information gathered first-hand that wasn’t found elsewhere.
Some examples include conducting experiments, surveys, interviews, observations, or focus groups.
Meanwhile, secondary research is the analysis and interpretation of existing data collected from others. Think of this like what we used to do for school projects: We would read a book, scour the internet, or pull insights from others to work from.
So, which is better?
Personally, I say any research is good research, but if you have the time and resources, primary research is hard to top. With it, you don’t have to worry about your source's credibility or how relevant it is to your specific objective.
You are in full control and best equipped to get the reliable information you need.
Once you know your objective and what kind of data you want, you’re ready to select your marketing research method.
For instance, let’s say you’re a restaurant trying to see how attendees felt about the Speed Dating event you hosted last week.
You shouldn’t run a field experiment or download a third-party report on speed dating events; those would be useless to you. You need to conduct a survey that allows you to ask pointed questions about the event.
This would yield both qualitative and quantitative data you can use to improve and bring together more love birds next time around.
Now that you know what you’re looking for in a marketing research method, let’s dive into the best options.
Note: According to HubSpot’s 2024 State of Marketing report, understanding customers and their needs is one of the biggest challenges facing marketers today. The options we discuss are great consumer research methodologies , but they can also be used for other areas.
1. Interviews
Interviews are a form of primary research where you ask people specific questions about a topic or theme. They typically deliver qualitative information.
I’ve conducted many interviews for marketing purposes, but I’ve also done many for journalistic purposes, like this profile on comedian Zarna Garg. There’s no better way to gather candid, open-ended insights in my book, but that doesn’t mean they’re a cure-all.
What I like: Real-time conversations allow you to ask different questions if you’re not getting the information you need. They also push interviewees to respond quickly, which can result in more authentic answers.
What I dislike: They can be time-consuming and harder to measure (read: get quantitative data) unless you ask pointed yes or no questions.
Best for: Creating buyer personas or getting feedback on customer experience, a product, or content.
Focus groups are similar to conducting interviews but on a larger scale.
In marketing and business, this typically means getting a small group together in a room (or Zoom) and asking them questions about the topics you are researching. You record and/or observe their responses and then take action.
They are ideal for collecting long-form, open-ended feedback, and subjective opinions.
One well-known focus group you may remember was run by Domino’s Pizza in 2009.
After poor ratings and dropping over $100 million in revenue, the brand conducted focus groups with real customers to learn where they could have done better.
It was met with comments like “worst excuse for pizza I’ve ever had” and “the crust tastes like cardboard.” But rather than running from the tough love, it took the hit and completely overhauled its recipes.
The team admitted their missteps and returned to the market with better food and a campaign detailing their “Pizza Turn Around.”
The result? The brand won a ton of praise for its willingness to take feedback, efforts to do right by its consumers, and clever campaign. But, most importantly, revenue for Domino’s rose by 14.3% over the previous year.
The brand continues to conduct focus groups and share real footage from them in its promotion:
What I like: Similar to interviewing, you can dig deeper and pivot as needed due to the real-time nature. They’re personal and detailed.
What I dislike: Once again, they can be time-consuming and make it difficult to get quantitative data. There is also a chance some participants may overshadow others.
Best for: Product research or development
Pro tip: Need help planning your focus group? Our free Market Research Kit includes a handy template to start organizing your thoughts in addition to a SWOT Analysis Template, Survey Template, Focus Group Template, Presentation Template, Five Forces Industry Analysis Template, and an instructional guide for all of them. Download yours here now.
Surveys are a form of primary research where individuals are asked a collection of questions. They can take many different forms.
They could be in person, over the phone or video call, by email, via an online form, or even on social media. Questions can be also open-ended or closed to deliver qualitative or quantitative information.
A great example of a closed-ended survey is HubSpot’s annual State of Marketing.
In the State of Marketing, HubSpot asks marketing professionals from around the world a series of multiple-choice questions to gather data on the state of the marketing industry and to identify trends.
The survey covers various topics related to marketing strategies, tactics, tools, and challenges that marketers face. It aims to provide benchmarks to help you make informed decisions about your marketing.
It also helps us understand where our customers’ heads are so we can better evolve our products to meet their needs.
Apple is no stranger to surveys, either.
In 2011, the tech giant launched Apple Customer Pulse, which it described as “an online community of Apple product users who provide input on a variety of subjects and issues concerning Apple.”
"For example, we did a large voluntary survey of email subscribers and top readers a few years back."
While these readers gave us a long list of topics, formats, or content types they wanted to see, they sometimes engaged more with content types they didn’t select or favor as much on the surveys when we ran follow-up ‘in the wild’ tests, like A/B testing.”
Pepsi saw similar results when it ran its iconic field experiment, “The Pepsi Challenge,” for the first time in 1975.
The beverage brand set up tables at malls, beaches, and other public locations and ran a blindfolded taste test. Shoppers were given two cups of soda, one containing Pepsi, the other Coca-Cola (Pepsi’s biggest competitor). They were then asked to taste both and report which they preferred.
People overwhelmingly preferred Pepsi, and the brand has repeated the experiment multiple times over the years to the same results.
What I like: It yields qualitative and quantitative data and can make for engaging marketing content, especially in the digital age.
What I dislike: It can be very time-consuming. And, if you’re not careful, there is a high risk of scientific error.
Best for: Product testing and competitive analysis
Pro tip: " Don’t make critical business decisions off of just one data set," advises Pamela Bump. "Use the survey, competitive intelligence, external data, or even a focus group to give you one layer of ideas or a short-list for improvements or solutions to test. Then gather your own fresh data to test in an experiment or trial and better refine your data-backed strategy."
8. Public domain or third-party research
While original data is always a plus, there are plenty of external resources you can access online and even at a library when you’re limited on time or resources.
Some reputable resources you can use include:
It’s also smart to turn to reputable organizations that are specific to your industry or field. For instance, if you’re a gardening or landscaping company, you may want to pull statistics from the Environmental Protection Agency (EPA).
If you’re a digital marketing agency, you could look to Google Research or HubSpot Research. (Hey, I know them!)
What I like: You can save time on gathering data and spend more time on analysis. You can also rest assured the data comes from a source you trust.
What I dislike: You may not find data specific to your needs.
Best for: Companies under a time or resource crunch, adding factual support to content
Pro tip: Fellow HubSpotter Iskiev suggests using third-party data to inspire your original research. “Sometimes, I use public third-party data for ideas and inspiration. Once I have written my survey and gotten all my ideas out, I read similar reports from other sources and usually end up with useful additions for my own research.”
If the data you need isn’t available publicly and you can’t do your own market research, you can also buy it. Many reputable analytics companies offer subscriptions to access their data. Statista is one of my favorites, but there are also Euromonitor, Mintel, and BCC Research.
What I like: Same as public domain research
What I dislike: You may not find data specific to your needs. It also adds to your expenses.
Best for: Companies under a time or resource crunch or adding factual support to content
You’re not going to like my answer, but “it depends.” The best marketing research method for you will depend on your objective and data needs, but also your budget and timeline.
My advice? Aim for a mix of quantitative and qualitative data. If you can do your own original research, awesome. But if not, don’t beat yourself up. Lean into free or low-cost tools. You could do primary research for qualitative data, then tap public sources for quantitative data. Or perhaps the reverse is best for you.
Whatever your marketing research method mix, take the time to think it through and ensure you’re left with information that will truly help you achieve your goals.