How to write a methods section that passes peer review
TL;DR: A methods section is the part of a research paper that explains exactly how a study was conducted, so that another researcher could replicate it. For high school students, writing a methods section that passes peer review means being precise about design, data collection, and analysis decisions, not just describing what you did. This guide walks through every step, shows what strong and weak methods sections look like, and identifies where students most often go wrong without expert guidance.
Introduction
Most high school students think a methods section is a simple description of what they did during their research. It is not. A methods section is a justification. It tells the reader not only what you did, but why you made each decision, and whether those decisions were valid enough to produce trustworthy results. Knowing how to write a methods section that passes peer review is one of the most underestimated skills in academic research, and it is one of the most common reasons student papers get rejected.
The gap between a methods section that gets published and one that gets rejected is almost never about how much work the student did. It is about how clearly and defensibly that work is documented. This post gives you a complete, step-by-step process for writing a methods section that meets the standards peer reviewers actually apply.
What is a methods section and why does it matter for your research paper?
A methods section is the part of a research paper that documents the research design, data sources, data collection procedures, and analysis approach used to answer the research question. It appears after the literature review and before the results. Its purpose is to give peer reviewers and readers enough detail to evaluate whether the findings are credible and reproducible.
Without a strong methods section, even genuinely good research cannot be verified. Peer reviewers are trained to look for specific elements: a clearly stated research design, a justified sampling or data selection process, a transparent analysis method, and an honest acknowledgment of limitations. If any of these are missing or vague, the paper fails review regardless of how interesting the findings are.
For high school students submitting to academic journals or competitions, the methods section is often the section that separates publishable work from work that gets returned for major revision. It is also the section that demonstrates to university admissions readers that a student understands the logic of research, not just the topic. RISE Research scholars who publish original work consistently report that their methods section required the most revision and produced the most learning. You can see examples of published student research across multiple disciplines on the RISE Research publications page.
How to write a methods section that passes peer review: a step-by-step process for high school students
Step 1: State your research design before describing any procedures. The first sentence of your methods section should name the type of study you conducted. Was it a quantitative survey study? A qualitative interview study? A systematic literature review? A comparative case study? Peer reviewers read hundreds of papers. They need to orient themselves immediately. A weak methods section begins with: "I collected data by sending a survey to students." A strong one begins: "This study used a cross-sectional quantitative survey design to examine the relationship between daily screen time and self-reported sleep quality among Grade 11 students." The design statement frames everything that follows.
Step 2: Describe your sample or data source with specificity. Who or what did you study, and why? If you surveyed people, state the total number of participants, how they were selected, and what inclusion or exclusion criteria you applied. If you used secondary data, name the dataset, its source, the time period it covers, and why it was appropriate for your question. Vague descriptions like "a group of students" or "data from the internet" will not survive peer review. Reviewers need to assess whether your sample was appropriate for your research question and large enough to support your conclusions.
Step 3: Document your data collection instruments or procedures exactly. If you used a survey, name the validated scale or describe each question type. If you conducted interviews, state whether they were structured, semi-structured, or unstructured, and how long they ran. If you ran an experiment, describe the conditions, the sequence, and any controls applied. The standard to aim for is replicability: a reader should be able to repeat your study using only your methods section as a guide. Google Forms is a common tool for student surveys; if you used it, state that, along with how responses were stored and whether anonymity was maintained.
Step 4: Explain your analysis method with the same precision. Describing what you did with the data is just as important as describing how you collected it. If you ran statistical tests, name them: a Pearson correlation, a chi-square test, a thematic analysis using open coding. If you used software, name it: SPSS, R, Python with specific libraries, or even Excel with the Analysis ToolPak. If your analysis was qualitative, explain how themes were identified and whether a second coder reviewed your coding for consistency. Reviewers will check whether the analysis method matches the research design and the type of data collected. A mismatch here is a fast path to rejection.
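As a concrete illustration of naming your test precisely, here is a minimal Python sketch using SciPy to run a chi-square test of independence on an invented 2x2 table of survey counts. The numbers are purely hypothetical; the point is that the test, the statistic, the degrees of freedom, and the p-value are all stated explicitly, which is exactly the level of detail a methods and results section should report:

```python
from scipy.stats import chi2_contingency

# Hypothetical 2x2 contingency table:
# rows = heavy vs light phone users, columns = good vs poor sleep
observed = [[30, 20],
            [15, 35]]

# chi2_contingency tests whether the two categorical variables are independent
chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi-square({dof}) = {chi2:.2f}, p = {p:.3f}")
```

Writing the analysis as a short, reproducible script like this also makes it much easier to answer reviewer questions later, because every analytical decision is recorded.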
Step 5: Address ethical considerations. Even high school research requires a statement on ethics. Did participants give informed consent? Were they told they could withdraw? Were data stored securely? For studies involving human participants, peer reviewers expect at least a paragraph on this. If your school or program provided an ethics review, mention it. Omitting this section signals to reviewers that the researcher did not think carefully about research integrity.
Step 6: Acknowledge limitations honestly. Every study has limitations. A small sample size, a non-random selection process, a single geographic location, or a self-report bias in survey responses are all legitimate limitations. Naming them is not a weakness. It shows that you understand the boundaries of your findings. Reviewers trust researchers who are honest about what their study cannot prove. Students who skip this step appear either unaware of their study's constraints or unwilling to acknowledge them. Neither impression helps a paper pass review.
The single most common mistake at this stage is writing the methods section as a narrative story rather than a technical document. Phrases like "first I decided to" or "then we thought it would be good to" belong in a lab notebook, not a methods section. Write in the past tense, use precise academic language, and eliminate any language that centers your personal experience rather than the research procedure.
Where most high school students get stuck with writing a methods section
The first sticking point is justification. Students describe what they did but not why they chose that approach over alternatives. A peer reviewer will ask: why a survey and not an interview? Why 50 participants and not 100? Why thematic analysis and not discourse analysis? Students working alone often do not know how to answer these questions because they have not been trained to think about research design as a set of deliberate choices with trade-offs. They chose their method because it seemed easiest or most familiar, not because it was the most appropriate for the question.
The second sticking point is matching the analysis to the data type. A student who collects Likert-scale responses and then runs a Pearson correlation has made a methodological error that most peer reviewers will flag immediately. Likert data is ordinal, not continuous. The appropriate test is different. Without a background in research methods, students often apply the first statistical test they encounter in a tutorial without checking whether it fits their data structure.
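To see why the test choice matters in practice, here is a short Python sketch with invented Likert responses. Spearman's correlation works on ranks, which makes it appropriate for ordinal data like Likert scales, whereas Pearson's assumes continuous, interval-scaled variables:

```python
from scipy.stats import spearmanr

# Hypothetical Likert responses (1-5 ordinal scale) from ten participants
screen_time = [1, 2, 2, 3, 3, 4, 4, 5, 5, 5]
sleep_score = [5, 4, 5, 3, 4, 2, 3, 1, 2, 2]

# spearmanr converts values to ranks before correlating,
# so it does not assume the 1-5 steps are evenly spaced
rho, p_s = spearmanr(screen_time, sleep_score)
print(f"Spearman rho = {rho:.2f}, p = {p_s:.3f}")
```

Stating in the methods section that a rank-based test was chosen "given the ordinal nature of the scale" is precisely the kind of one-line justification reviewers look for.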
The third sticking point is scope. High school students often design studies that are too ambitious for the resources available to them: a longitudinal study that requires two years of data, a clinical study that requires IRB approval, or a comparative study that requires access to proprietary databases. A PhD mentor can assess within one session whether a proposed design is executable, and redirect it to something equally rigorous but actually achievable. RISE Research mentors, drawn from institutions including Ivy League and Oxbridge universities, do exactly this at the design stage before a student invests weeks in an approach that will not survive review. You can learn more about the mentors who guide this process on the RISE Research mentors page.
If you are at this stage and want a PhD mentor to guide you through writing a methods section that passes peer review and through the full research process, book a free 20-minute Research Assessment to see what is possible before the Summer 2026 Cohort I Deadline.
What does a good methods section look like? A high school example
A weak methods section is vague, narrative, and unjustified. A strong one is precise, structured, and defensible. The difference is not length. It is specificity and logic. A strong methods section names the design, justifies the sample, describes the instrument, explains the analysis, addresses ethics, and acknowledges limitations, all in clear, past-tense prose.
Weak example: "I surveyed some students at my school about their social media use and sleep. I asked them questions about how much they use their phones and how well they sleep. Then I looked at the results and found some patterns."
This version fails peer review on every dimension. The sample is undefined. The instrument is undescribed. The analysis is unnamed. There is no ethical statement and no acknowledgment of limitations.
Strong example: "This study used a cross-sectional survey design to examine the association between daily social media use and self-reported sleep quality among high school students. A convenience sample of 87 students in Grades 10 and 11 at a single urban high school in Mumbai was recruited via classroom announcement. Participants completed an anonymous online questionnaire comprising the Bergen Social Media Addiction Scale (BSMAS) and the Pittsburgh Sleep Quality Index (PSQI). Data were collected via Google Forms between March and April 2024 and exported to SPSS for analysis. A Spearman rank-order correlation was used to assess the relationship between BSMAS and PSQI scores, given the ordinal nature of both scales. All participants provided written informed consent, and no personally identifiable information was collected. Limitations include the convenience sampling method, which restricts generalisability, and the reliance on self-report measures, which are subject to recall bias."
The strong version names validated instruments, justifies the statistical test choice, addresses ethics, and acknowledges limitations. A peer reviewer reading this knows exactly what was done and why. For more examples of what rigorous student research looks like in practice, see this published RISE project on borrower characteristics in peer-to-peer lending, which demonstrates clear methodology documentation across a quantitative study.
The best tools for writing a methods section as a high school student
Google Scholar is the starting point for finding papers in your field that use a similar methodology. Reading the methods sections of published papers in your target journal gives you a direct model for the level of detail and the terminology expected. Search for papers with similar designs to yours and study how they document their procedures.
PubMed is essential for health, biology, and psychology research. It provides free access to peer-reviewed studies and includes structured abstracts that often summarise the methods clearly. Use it to identify validated scales and instruments that have already been used in published research, which strengthens the credibility of your own instrument choices.
JSTOR gives access to humanities and social science journals. If your research is in history, sociology, or political science, JSTOR lets you read full methods sections from published papers in those disciplines, where qualitative and interpretive methods are documented differently than in STEM fields.
Zotero is a free reference manager that helps you organise the sources you cite in your methods section, particularly when citing validated instruments or prior studies that used the same design. It generates citations automatically and keeps your reference list consistent across the paper.
CONSORT, STROBE, and PRISMA reporting guidelines are free, publicly available checklists developed by academic bodies to standardise how different types of studies are reported. CONSORT is for randomised trials, STROBE for observational studies, and PRISMA for systematic reviews. Using the relevant checklist as a writing guide ensures you do not omit any element that peer reviewers expect to see.
Frequently asked questions about writing a methods section for high school students
How long should a methods section be in a high school research paper?
A methods section for a high school research paper is typically 300 to 600 words, or roughly one to two pages double-spaced. Length depends on the complexity of the study design. A simple survey study requires less documentation than a mixed-methods study. The goal is completeness, not length: every element of the design must be present, but there is no benefit to padding.
Should I write the methods section in first person or third person?
Most peer-reviewed journals in the sciences use third person and passive voice in methods sections: "Data were collected" rather than "I collected data." Humanities and social science journals sometimes accept first person. Check the author guidelines for your target journal before writing. When in doubt, third person is the safer choice for a student paper aiming for publication.
What is the difference between methodology and methods in a research paper?
Methodology refers to the theoretical framework or philosophy behind your research approach, such as whether you are working within a positivist or interpretivist tradition. Methods refers to the specific procedures used: the survey instrument, the sample size, the statistical test. High school research papers typically focus on methods rather than methodology, though understanding the distinction helps when responding to peer review comments.
How do I justify my sample size in a methods section?
For quantitative studies, the gold standard is a power analysis, which calculates the minimum sample size needed to detect an effect of a given size at a given significance level. Tools like G*Power are free and widely used for this. For qualitative studies, justification is based on saturation: the point at which additional participants stop producing new themes. State whichever applies to your study and explain the reasoning briefly.
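If you prefer working in Python rather than G*Power, the statsmodels library can run the same calculation. The effect size, alpha, and power values below are conventional illustrative defaults (a "medium" effect at the usual thresholds), not recommendations for any specific study:

```python
from statsmodels.stats.power import TTestIndPower

# Solve for the minimum sample size per group for an
# independent-samples t-test
analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=0.5,  # Cohen's d (medium)
                                   alpha=0.05,       # significance level
                                   power=0.8)        # desired power
print(f"Minimum sample size per group: {n_per_group:.0f}")
```

Reporting the assumed effect size and the resulting minimum n in the methods section turns "87 participants" from an arbitrary number into a defended design choice.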
Can I use a survey I created myself or do I need a validated instrument?
You can use a self-created survey, but peer reviewers will scrutinise it more closely than a validated instrument. A validated instrument has been tested for reliability and validity in prior published research, which strengthens your study's credibility immediately. If you create your own survey, you must pilot test it and report the results. Using a validated scale where one exists is almost always the stronger methodological choice for a student paper targeting publication.
Conclusion
Writing a methods section that passes peer review requires three things: precision in documenting what you did, justification for why you made each design decision, and honesty about the limitations of your approach. These are not instinctive skills. They are learned through practice and through feedback from researchers who have navigated the peer review process themselves.
The most important takeaway from this guide is that the methods section is not a formality. It is the foundation of your paper's credibility. A peer reviewer who cannot replicate your study from your methods section will not trust your results, regardless of how interesting they are. Get the methods right, and the rest of the paper stands on solid ground.
If you want to see what published high school research looks like across a range of disciplines, the RISE Research projects page shows completed scholar work. The RISE Research results page documents the admissions outcomes that follow. The Summer 2026 Cohort I Deadline is approaching. If writing a methods section that passes peer review is a step you want to get right with expert guidance behind you, schedule a free Research Assessment and we will match you with a PhD mentor who has published in your subject area and guided students through this exact process.