Help Select the Next Cohort of Smith Fellows
The Smith Fellowship review process depends on the expertise and perspective of volunteer reviewers. Many of our reviewers return year after year because they deeply value the chance to connect with one another, to support the next generation of early-career professionals, and to be inspired by the bold thinking and leadership of the applicants. This community of reviewers is essential, not only to ensuring that each proposal receives a thoughtful evaluation, but also to helping the program assemble a balanced cohort of Fellows. A strong, diverse reviewer team is the cornerstone of a successful Fellowship program.
Role of Reviewers
As a volunteer reviewer, your role is to:
Evaluate candidates and proposals based on the established review criteria, drawing on your professional expertise and experience to inform your recommendations and feedback.
Provide both scores and written comments that inform the overall assessment of each application.
Contribute to a reviewer-recommended slate of candidates.
The program team then uses this slate as the foundation for advancement decisions, while also weighing factors such as portfolio balance (topics, geographies, and methods) and cohort composition to determine who moves to the next round and who is ultimately offered a Fellowship.
Reviewers assess each application across four main criteria:
Research Quality - clarity of questions, soundness of methods, feasibility, and mentor support.
Conservation Relevance - how the work informs management or policy, and how partnerships are integrated.
Innovation & Track Record - evidence of creativity, collaboration, and strong prior work.
Leadership Potential & Community Fit - communication skills, leadership capacity, commitment to DEI, and readiness to contribute to the cohort model.
We aim for each proposal to be evaluated by at least three reviewers, ensuring a mix of expertise and perspectives. The Fellowship funds the person (not just the project!), so reviews focus on both the research plan and the individual’s potential as a conservation leader.
Who Should Review
We welcome reviewers with:
At least 8 years of experience in conservation- or stewardship-related fields (including management, restoration, policy, or research).
A background in applied conservation, academia, or policy (a PhD is not required).
Independence from the applicant pool (if you would still be eligible to apply, you may not serve as a reviewer).
We seek a diverse mix of professional experiences so that proposals are considered from multiple angles, not just through the lens of disciplinary expertise. We are always interested in expanding our network of reviewers, so if you would like to review but don’t quite fit the guidelines above, we encourage you to reach out to our staff to discuss your interest.
Ways to Participate: Timeline & Commitment
You may choose to join one or more phases:
Phase 1: Written Reviews (October-November): Review 5-10 proposals over about a month.
Phase 2: Semi-Finalist Reviews & Virtual Panel (December): Read additional proposals and join a Zoom panel to recommend who advances.
Phase 3: Short Interviews (January): Smith staff and Fellows conduct brief interviews with semi-finalists.
Phase 4: Finalist Interview Panel (February, in-person): A small panel interviews the top candidates to help select the final Fellows.
Final Selection & Candidate Notifications: by the end of February
Our greatest need for reviewers falls between October and December of each year (Phase 1 and Phase 2).
The time commitment per application varies by reviewer, but expect to review an 8-page research proposal, a cover letter, responses to personal statement questions, and four mentor/reference letters. Reviewers are asked to provide thoughtful feedback to applicants and to score each proposal.
Phase 1 and 2 reviewers can typically expect to be assigned between 5 and 10 applications.
How Reviews Work
Proposal-Reviewer Matching
We strive to match each proposal with reviewers whose expertise aligns closely with the research topic or methods. However, because proposals cover a wide range of disciplines and approaches, a “perfect” match is often not possible. Ideally, every proposal is reviewed by at least three people, so while some reviewers bring deep technical knowledge, others can focus on clarity, feasibility, and leadership potential. This balance ensures that each candidate is evaluated fairly and from multiple perspectives.
Scoring
Reviewers score applications on twenty individual criteria (within the four broad criteria listed above), each rated from 1 to 5, and also provide an overall advancement score.
Written Comments
In addition to numerical scores, reviewers provide written comments. There is one section for private, internal comments that go directly to the program team, and a separate section for applicant-facing feedback that is shared with candidates (with reviewer identities withheld) at the end of the process.
Reviewer Portal
All reviews are assigned and managed within the online RQ platform. Reviewers can save drafts of their work and then submit reviews as final once they are complete.
Confidentiality
Applicants never see reviewer names. Only the comments specifically marked as applicant-facing are shared with candidates, and only after the review process has concluded for that individual candidate.
Conflicts of Interest, Confidentiality & AI Policy
Conflicts of Interest
A conflict of interest may arise if you have close professional or personal ties to an applicant, such as serving as a mentor, being affiliated with their host institution, or collaborating directly with them. If you identify a conflict, you may recuse yourself directly within the reviewer portal or notify program staff so the proposal can be reassigned promptly.
Confidentiality
All proposals submitted to the Smith Fellowship are strictly confidential. Reviewers must not share, forward, or upload any application materials to outside platforms, cloud services, or personal networks. The integrity of the process depends on maintaining applicant privacy and intellectual property protections.
AI Policy
Reviewers may not upload proposals into generative AI tools or use AI to score, rank, or make evaluative judgments about applications. Limited use of AI is permitted only to refine or clarify your own reviewer notes (for example, turning rough notes into polished comments). In these cases, reviewers remain fully responsible for the content, fairness, and accuracy of their feedback.
Reviewer Agreements
When signing up, reviewers agree to:
Confirm they meet the reviewer eligibility requirements.
Complete reviews by the deadline or notify staff promptly if circumstances change.
Provide thoughtful, constructive, and respectful feedback.
Promptly disclose any conflicts of interest.
Abide by the confidentiality and AI policies to protect applicants’ privacy and intellectual property.
Questions?
For any questions about eligibility, assignments, or the review process, contact the Smith Fellows team at smithinfo@conbio.org.