DIME Analytics supports quality research processes across the DIME portfolio, offers public trainings, and develops tools for the global community of development researchers. Its primary portfolio includes: the DIME Wiki, a one-stop shop for practical guidance and resources on impact evaluation research, and the accompanying guidebook, Data for Development Impact; ietoolkit and iefieldkit, Stata packages of commands that routinize common impact evaluation tasks; and Manage Successful Impact Evaluations, our flagship training designed to improve the skills and knowledge of impact evaluation practitioners.
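As an illustration, a typical do-file built on these packages might begin by installing them and standardizing the Stata environment with ietoolkit's ieboilstart command. This is a minimal sketch, not official DIME guidance; the exact options available depend on the package version installed.

```stata
* One-time setup: install the DIME Analytics packages from SSC
ssc install ietoolkit
ssc install iefieldkit

* Harmonize Stata version and settings across all collaborators
* so that results are reproducible on any machine
ieboilstart, version(13.1)
`r(version)'
```

Running ieboilstart at the top of the master do-file is what makes the version and memory settings apply to every script the project runs afterward.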

DIME Analytics creates tools that improve the quality of impact evaluation research for all. We take advantage of the concentration and scale of research at DIME to develop and test solutions that ensure the quality of data work across our portfolio, and to make public trainings and tools available to the broader community of development researchers, who may not have access to the same resources.

Public Resources and Training

  • Manage Successful Impact Evaluations
DIME Analytics’ flagship training is a week-long annual course, open to the public. MSIE is intended to improve the skills and knowledge of impact evaluation (IE) practitioners, familiarizing them with critical issues in IE implementation, recurring challenges, and cutting-edge technologies. The course consists of lectures and hands-on sessions, with parallel tracks based on software preference and skill level. Through small group discussions and interactive computer lab sessions, participants work together to apply what they have learned and gain first-hand experience with new skills.
    2019 course materials. The 2020 course has been postponed due to the COVID-19 crisis.
  • Manage Successful Impact Evaluation Surveys
    A fully virtual course, in which participants learn the workflow for primary data collection. The course covers best practices at all stages of the survey workflow, from planning to piloting instruments and monitoring data quality once fieldwork begins. There is a strong focus throughout on research ethics and reproducible workflows. The course uses a combination of virtual lectures, case studies, readings, and hands-on exercises.
    2020 Remote Course Material
  • Research Assistant Onboarding Course
    This course is designed to familiarize Research Assistants and Research Analysts with DIME's standards for data work. By the end of the course's six sessions, participants will have the tools and knowledge to implement best practices for transparent and reproducible research. The course will focus on how to set up a collaborative workflow for code, datasets, and research outputs. Most content is platform-independent and software-agnostic, but participants are expected to be familiar with statistical software.
  • DIME Wiki
    One-stop shop for impact evaluation research solutions. The DIME Wiki is a resource focused on practical implementation guidelines rather than theory, open to the public, easily searchable, and suitable for users of varying levels of expertise.
  • Development Research in Practice: the DIME Analytics Data Handbook
    The DIME data handbook is a practical guide on how to make data work efficient, ethical, transparent, and scalable.
  • Research Standards Bootcamps
    Workshops promoting real-time adoption of DIME’s Research Standards across the DIME portfolio. 
    Research Reproducibility Bootcamp | Research Ethics & Data Security Bootcamp
  • Software training
    Trainings for World Bank staff and government counterparts on common software for impact evaluation research. 
    R for Advanced Stata Users | Introduction to LaTeX | GitHub for Researchers | Research Assistants Continuing Education | Field Coordinators Continuing Education

Reproducible Research

  • Computational Reproducibility – Code Review
DIME Analytics checks computational reproducibility for all DIME publications, verifying that all results are exactly reproduced by the provided code. As a service to other teams, DIME Analytics also offers computational reproducibility checks, pre-publication code review, assistance with the creation of replication packages, and technical support for data and code publication. Contact dimeanalytics@worldbank.org for details.
  • Reproducible Research Agenda
Regular events and trainings to promote reproducible research, in collaboration with the Berkeley Initiative for Transparency in the Social Sciences (BITSS) and other partners. Events in 2019 included Making Analytics Reusable (event | blog) and Transparency, Reproducibility and Credibility: a Research Symposium (event | presentations | blog).

Quality Assurance for Data Acquisition

Open Source Research Tools

We take advantage of the scope and scale of DIME research to develop and test econometric and technical solutions, and to release them as public tools.


Team

Maria Ruth Jones

Survey Specialist

Roshni Khincha

Data Coordinator

Luiza Cardoso De Andrade

Data Coordinator

Luis Eduardo San Martin

Data Coordinator