Chatterji, Madhabi (mb1434)

Madhabi Chatterji

Professor Emerita of Measurement, Evaluation, and Education
212-678-3357

Office Location:

282 Grace Dodge Hall

Office Hours:

Mondays, 5-6:30 PM; Wednesdays, 4:30-6:30 PM (by appointment)

Educational Background

1990                Ph.D. University of South Florida, Tampa, Florida                    

1980                M.Ed. St. Christopher's College, University of Madras, Madras, India           

1975                B.Ed. University of Bombay, Bombay, India

1973                B.Sc. (Honors) Lady Brabourne College, University of Calcutta, W.B., India

Scholarly Interests

Madhabi Chatterji is Professor Emerita of Measurement, Evaluation, and Education at Columbia University’s Teachers College (TC), where she founded and still directs the Assessment and Evaluation Research Initiative, a center dedicated to promoting meaningful use of assessment-evaluation information to improve equity and the quality of practices and policies in education, psychology, and the health professions (AERI, www.tc.edu/aeri). She retired from TC on August 31, 2022, following almost 22 years of service (2001-2022). Before joining TC, she was an assistant professor of educational measurement and research at the University of South Florida (1996-2000) and the supervisor of research and evaluation services at the Pasco County School District, Florida (1988-1995). She is an award-winning and internationally recognized methodologist and educationist. Her 100+ publications focus on these themes:

  • Instrument design, validation, validity and test use issues
  • Evidence-based practices (EBP), evidence standards, the “evidence debate”, and improving evidence-gathering and evidence-synthesis methods on “what works”
  • Educational equity; closing students' learning and achievement gaps with proximal, diagnostic assessments
  • Standards-based education reforms
  • Assessment policies in U.S. and global settings

 

2023 Curriculum Vitae

2023 Biographical Statement (Long)

2023 Biographical Statement (Short)

2023 Research Themes

Selected Publications

Note: Published as Madhabi Banerji from 1990-2000, and as Madhabi Chatterji from January, 2001-present.

REFEREED PUBLICATIONS BY THEME:

I. Instrument Design, Validation and Validity Issues

Chatterji, M. (2024, in press). User-Centered Assessment Design: An Integrated Methodology for Diverse Populations. New York, NY: Guilford Press. [BOOK]

Chatterji, M. (Ed.) (2013). Validity and Test Use: An International Dialogue on Educational Assessment, Accountability, and Equity. Bingley, UK: Emerald Group Publishing Limited. [BOOK]

Chatterji, M. (2003). Designing and Using Tools for Educational Assessment. Boston, MA: Allyn & Bacon/Pearson. [BOOK]

Chatterji, M., & Lin, M. (2018). Designing non-cognitive construct measures that improve mathematics achievement in grade 5-6 learners: A user-centered approach. Quality Assurance in Education, 26(1), 70-100.

Chatterji, M., Tripken, J., Johnson, S., Koh, N. J., Sabain, S., Allegrante, J.P., & Kukafka, R. (2017). Development and validation of a health information technology curriculum: Toward more meaningful use of electronic health records. Pedagogy in Health Promotion, 3(3), 154-166. Electronic release: © 2016 Society for Public Health Education.

Wyer, P. W. & Chatterji, M. (2013). Designing outcome measures for the accreditation of medical education programs as an iterative process combining classical test theory and Rasch measurement. The International Journal of Educational and Psychological Assessment, 13 (2), 35-61.

Chatterji, M., Sentovich, C., Ferron, J., & Rendina-Gobioff, G. (2002). Using an iterative validation model to conceptualize, pilot-test, and validate scores from an instrument measuring Teacher Readiness for Educational Reforms. Educational and Psychological Measurement, 62, 442-463.

Banerji, M., Smith, R. M., & Dedrick, R. F. (1997). Dimensionality of an early childhood scale using Rasch analysis and confirmatory factor analysis. Journal of Outcome Measurement, 1 (1), 56-86. [Received the Distinguished Paper Award from the Florida Educational Research Association, 1993]

Banerji, M. & Ferron, J. (1998). Construct validity of a developmental assessment made up of mathematical patterns tasks. Educational and Psychological Measurement, 58 (4), 634-660. 

II. Evidence-Based Practices, Evidence Standards, the “Evidence Debate”, and Improving Evidence-Gathering and Evidence-Synthesis Methods

Chatterji, M. (2016). Causal inferences on the effectiveness of complex social programs: Navigating assumptions, sources of complexity and evaluation design challenges. Evaluation and Program Planning, 59, 128-140.

Chatterji, M., Green, L. W., & Kumanyika, S. (2014). L.E.A.D.: A framework for evidence gathering and use for the prevention of obesity and other complex public health problems. Health Education & Behavior, 41 (1), 85-99. First released on June 19, 2013.

Chatterji, M. (2008). Synthesizing evidence from impact evaluations in education to inform action: Comments on Slavin. Educational Researcher, 37 (1), 23-26.

Chatterji, M. (2007). Grades of Evidence: Variability in quality of findings in effectiveness research on complex field interventions. American Journal of Evaluation, 28 (3), 3-17.

Chatterji, M. (2004). Evidence on “what works”: An argument for extended-term mixed method (ETMM) evaluation designs. Educational Researcher, 33 (9), 3-13. (Reprinted in Educational Researcher, 34 (5), 14-24, 2005). [Received an Outstanding Publication Award for Advances in Research Methodology from the American Educational Research Association, 2004]

III. Standards-based Reforms and Educational Equity

Chatterji, M. (2012). Development and validation of indicators of teacher proficiency in diagnostic classroom assessment. The International Journal of Educational and Psychological Assessment, 9 (2), 4-25. Special Issue on Teacher Assessments. 

Chatterji, M. (2006). Reading achievement gaps, correlates and moderators of early reading achievement: Evidence from the Early Childhood Longitudinal Study (ECLS) kindergarten to first grade sample. Journal of Educational Psychology, 98 (3), 489-507.

Chatterji, M. (2005). Achievement gaps and correlates of early mathematics achievement: Evidence from the ECLS K-first grade sample. Educational Policy Analysis Archives, 13 (46).

Chatterji, M., Koh, N., Choi, L., & Iyengar, R. (2009). Closing learner gaps proximally with teacher-mediated diagnostic assessment. Research in the Schools, 16 (2), 60-77.

Chatterji, M. (2002). Models and methods for examining standards-based reforms: Have the tools of inquiry answered the pressing questions on improving schools? Review of Educational Research, 72 (3), 345-386. 

IV. Evaluation Research and Evaluation Methods

Chatterji, M. (2005). Applying the Joint Committee's 1994 standards in international contexts: A case study of educational evaluations in Bangladesh. Teachers College Record, 107 (10), 2373-2400. Special Issue on New Perspectives in Program Evaluation.

Banerji, M. & Dailey, R.A. (1994). A study of the effects of an inclusion program for elementary students with specific learning disabilities. Journal of Learning Disabilities, 28 (8), 511-522.

Pearson, C.L. & Banerji, M. (1993). Effects of a ninth-grade dropout prevention program on student academic achievement, school attendance, and dropout rate. Journal of Experimental Education, 61 (3), 247-256.

V. Assessment Policy

Chatterji, M. (2019). A Consumer’s Guide to Testing under the Every Student Succeeds Act (ESSA): What Can the Common Core and Other ESSA Assessments Tell Us? University of Colorado, Boulder: National Education Policy Center. See: https://nepc.colorado.edu/publication/rd-assessment-guide [BOOK]

Chatterji, M. (2014). Validity Counts: Let’s mend, not end, educational testing. Education Week, Issue 24. Published on March 12, 2014, and archived at: www.edweek.org. [Op-Ed]

Chatterji, M. & Harvey J. (2014). (Co-facilitators). Assessing the Assessments: K-12 Measurement and Accountability in the 21st Century. A blog featuring debate and dialogue between scholars and K-12 school officials/practitioners at Education Week’s blog site: http://blogs.edweek.org/edweek/assessing_the_assessments [Blog]

Chatterji, M. (2013). Bad tests or bad test use? A case of SAT® use to examine why we need stakeholder conversations on validity. Teachers College Record, 115 (9), 1-7. [Foreword]

Popham, W. J., Berliner, D.C., Kingston, N., Fuhrman, S.H., Ladd, S.M., Charbonneau, J. & Chatterji, M. (2014). Can today's standardized tests yield instructionally useful data? Challenges, promises and the state of the art. Quality Assurance in Education, 22 (4), 300-315. [Moderated Policy Discussion]

Pizmony-Levy, O., Harvey, J., Schmidt, W., Noonan, R., Engel, L., Feuer, M.J., Santorno, C., Rotberg, I., Ash, P., Braun, H., Torney-Purta, J., & Chatterji, M. (2014). On the merits of, and myths about, international assessments. Quality Assurance in Education, 22 (4), 316-335. [Moderated Policy Discussion]

Gordon, E. W., McGill, M.V., Sands, D.I., Kalinich, K., Pellegrino, J.W., & Chatterji, M. (2014). Bringing formative assessment to schools and making it count. Quality Assurance in Education, 22 (4), 336-350. [Moderated Policy Discussion]

Chatterji, M. (2010). Review of “Closing the Racial Achievement Gap: Learning from Florida’s Reforms.” Boulder, CO: National Education Policy Center. Available at http://nepc.colorado.edu/thinktank/learnin-from-florida. [Policy Brief]

Chatterji, M. (2005, April). Closing Florida’s achievement gaps: Florida Institute of Education (FIE) Policy Brief 4. Jacksonville, FL: Florida Institute of Education at the University of North Florida. [Policy Brief]

Chatterji, M. (2004, April). Good and bad news about Florida student achievement: Performance trends on multiple indicators since passage of the A+ legislation. Doc No. EPSL-0401-105-EPRU. Tempe, AZ: Education Policy Studies Laboratory, Education Policy Research Unit. [Policy Brief]

NOTE: Publications are available on: PubMed, ERIC, PsycINFO and Psychological Abstracts.

MADHABI CHATTERJI, Ph.D.

Madhabi Chatterji, Ph.D., M.Ed., B.Sc. (Hons.) is Professor Emerita of Measurement, Evaluation, and Education at Columbia University’s Teachers College (TC), where she founded and directs the Assessment and Evaluation Research Initiative, a center dedicated to promoting meaningful use of assessment-evaluation information to improve equity and the quality of practices and policies in education, psychology, and the health professions (AERI, www.tc.edu/aeri). She retired from TC on August 31, 2022, following almost 22 years of service. Prior to joining TC, Chatterji was an Assistant Professor in the Department of Educational Measurement and Research at the College of Education, University of South Florida (1996-2000), and Specialist/Supervisor of Research and Evaluation Services at the Pasco County School System in Florida (1988-1995). She emigrated to the U.S. as a doctoral student in January, 1985; her then-young daughters followed shortly after. They are now settled permanently in the U.S. as naturalized citizens.

An award-winning and internationally recognized methodologist and educationist, Chatterji has taught and mentored numerous doctoral students and post-doctoral researchers over her 35+ year career. Her “signature course” at TC was Instrument Design and Validation, which drew participants pursuing advanced graduate and professional degrees in various fields, including students and faculty from other universities in and around New York City. Her added academic interests include improving methods and technical standards for Evidence Based Practices (EBP); standards-based education reforms; educational equity; and cognitively-based proximal models of diagnostic assessment for detecting and closing students’ learning gaps across the lifespan. Her 100+ publications on these themes include over 50 refereed articles in top-tier academic journals, two peer-reviewed books, multiple edited volumes and special issues of journals, policy briefs, blogs, and numerous technical reports. Her most heavily cited refereed publications to date include her assessment book, Designing and Using Tools for Educational Assessment (2003, Allyn & Bacon/Pearson), and articles in the Journal of Educational Psychology (2006), Educational Researcher (2004/05 and 2008), Review of Educational Research (2002), Journal of Learning Disabilities (1994), Journal of Outcome Measurement (1997), Educational and Psychological Measurement (1998; 1999; 2002), and the American Journal of Evaluation (2007).

 

A public intellectual, Professor Chatterji has spoken out frequently on the limitations of large-scale standardized tests and the adverse social consequences of misused high-stakes educational assessments. Her long-standing scholarly interests lie in instrument design, validation, validity, and test use issues, the central thrust of her forthcoming 12-chapter textbook, User-Centered Assessment Design: An Integrated Methodology for Diverse Populations and Settings (Guilford Publishers, NY, in press). Chatterji’s policy briefs and a book-length guide on educational testing, A Consumer’s Guide to Testing Under the Every Student Succeeds Act (ESSA): What Can the Common Core and Other Assessments Tell Us?, were published by the National Education Policy Center (NEPC), where she is a Fellow, and via op-eds and blogs in Education Week. Her membership as a methodological scientist on an Institute of Medicine expert consensus committee (the Institute is now part of the National Academies of Sciences, Engineering, and Medicine) led to new evidence standards for decision-making in obesity prevention and a systems-based, multi-method framework for evidence synthesis and evidence generation to address major public health problems (published in Health Education & Behavior, 2014). She has served on numerous international and national advisory panels and journal editorial boards in measurement-evaluation, including flagship journals of the American Educational Research Association (AERA) and the National Council on Measurement in Education (NCME).

Professor Chatterji’s notable list of recognitions includes a Fulbright Research Scholar Award (2007-08) for studies examining gender equity issues in primary schools of selected Bengali-speaking regions in India and Bangladesh; an Outstanding Publication Award (2004) from the American Educational Research Association (AERA) for her lead article in the Educational Researcher, titled Evidence on “What Works”: An Argument for Extended-Term Mixed Methods Evaluation Designs; a Distinguished Paper Award from the Florida Educational Research Association (1993) for demonstrating the combined utility of Rasch and confirmatory factor analysis models to examine the dimensionality and construct validity of test-based data (published in the Journal of Outcome Measurement, 1997); and Reviewer Recognitions from the Educational Researcher and the AERA publications committee (2006), the Journal of Graduate Medical Education (2012, 2013), and Studies in Educational Evaluation (2019).

At her center, AERI at TC, Professor Chatterji served as Principal Investigator (PI) or Co-PI on numerous projects supported by competitive research grants from the National Science Foundation, the Stemmler Fund of the National Board of Medical Examiners, various non-profit/state/federal government agencies, including the Educational Testing Service, and most recently, the William T. Grant Foundation and Spencer Foundation.

Chatterji is also a frequently invited speaker at international conferences and forums sponsored by governments, non-governmental organizations, and major national universities in the U.S. and abroad. Most recently, she served as Co-Editor of Quality Assurance in Education, an international peer-reviewed journal in educational evaluation. She hopes to continue as an active member of the Faculty Steering Committee of the Columbia Global Centers, a select cadre of university-wide scholars whose work has international reach and impact.

Note: Madhabi Chatterji’s academic degrees and scholarly publications prior to December, 2000 are listed under the name, Madhabi Banerji; from 2001 onwards, they are under her current name, Madhabi Chatterji.

 

Funded Project and Sponsor/Funder

Project Goals, Years, Grant Amount and Role

1.       AERI Partnership with the Provost’s Office and the Institute for Urban and Minority Education, Teachers College for a centennial conference supported by the Spencer, Hewlett and William T. Grant foundations.

Title: Learning and Thriving Across the Lifespan: A Centennial Celebration of the Intellectual Life and Legacy of Dr. Edmund W. Gordon—The Minister of Education.

 

Purpose: Co-host a virtual conference and publish proceedings for the E.W. Gordon centennial celebration.

AERI Project, 2021-2022.

The Spencer Foundation: $75,000, Principal Investigator

The William T. Grant Foundation: $25,000, Principal Investigator

The Hewlett Foundation: $20,000, Co-Principal Investigator

 

2.       Contract with the Department of Anesthesiology at the Columbia University Medical Center

 

Goals: Support the career development goals of Anesthesiology faculty by providing technical support/consultation in measurement, evaluation, statistics and research methods for preparing grant applications and publications.

Annually renewable contract of $10,000-$15,000, AERI Project, 2018-2023

3.       Foundation for Anesthesia Education Research (FAER). Sub-award.

Title: A mixed-methods, randomized controlled trial comparing two methods of debriefing for a serious game designed to teach novice anesthesia residents to perform anesthesia for emergency caesarian delivery.

Purpose: Serve as Primary Faculty Mentor to guide research proposed by Allison Lee, M.D., Department of Anesthesiology at the Columbia University Medical Center. $4918 (year 1); $7500 (year 2); $7500 (year 3); $7500 (year 4).

Total=$27,418, 2017-23.

4.       Teachers College Global Investment Fund. Seed grant.

Title: Addressing inequities through comprehensive, ecologically-based models of primary education:  A capacity-building effort to support teacher education institutions in India.

 

Goals: Deliver lectures/workshops and initiate possible multi-year partnership projects with the Columbia Global Center (CGC)-Mumbai to serve higher education institutions in India

$8000, AERI Project, 2014-15.

Principal Investigator

 

5.       Subcontract with the International Medical Corps (IMC).

Title: Evaluating comprehensive mental health and psychosocial support services for vulnerable refugees.

 

 

Goals: Select and validate outcome measures, and design a randomized field trial to evaluate the effectiveness of IMC’s new health intervention model for displaced Syrian refugees at camps and urban centers in Amman, Jordan. 

$52,683. AERI Project, 2013-15.

Co-Principal Investigator (with the Department of Clinical and Counseling Psychology, TC).

 

6.       Subcontract with Barnard College on the Howard Hughes Medical Institute project.

Title: Hughes Science Pipeline Project for middle schools in New York City. 

Goals: Support the development and evaluation of the Hughes Science Pipeline Project.

$31,998.  AERI Project, 2013-17. 

Principal Investigator.

7.       National Science Foundation (NSF) REESE Program Award.

Title: Improving validity at the nexus of assessment design and use: A proposal for an international conference and capacity-building institute in assessment and evaluation.

Goals: Support the design/hosting of AERI’s inaugural conference and publication of a volume with conference proceedings (with ETS as the co-sponsor).

$124,747. AERI Project, 2011-12.

Principal Investigator.

8.       Provost’s Investment Fund Award, Teachers College, Columbia University.

Title: Building capacity at home and abroad: A proposal for rotating institutes and conferences to disseminate cutting-edge knowledge in the assessment and evaluation sciences.

Goals: To design and deliver AERI’s inaugural training institute and publication of policy briefs

$20,000. AERI Project, 2011-12

Principal Investigator.

 

9.       Educational Testing Service (ETS).

Title: Educational assessment, accountability and equity—Conversations on validity around the world.

Goals: Co-design/host AERI’s inaugural conference with ETS at TC and publish proceedings.

$52,200. AERI Project, 2011-12.

Principal Investigator.

10.    Subcontract to TC and AERI from Office of the National Coordinator, United States Department of Health and Human Services, Washington, D.C. Curriculum Development Center award to the Department of Biomedical Informatics, Columbia University (Lead investigator).

Goals: Co-lead the development and validation of curriculum goal frameworks and educational assessments in health information technology, and an evaluation protocol for future workforce training programs.

$204,000, AERI Project, 2010-2012.  Co-Principal Investigator (with the Department of Health and Behavior Studies, TC).

11.    The Nand and Jeet Khemka Foundation, India. Title: Design and Evaluation of a Curriculum for the Foundation’s Life Skills and Global Leadership Programme.

Goals: Develop student outcome frameworks and curriculum-based assessments, perform a formative evaluation of pilot programs, and provide training/capacity-building in assessment and evaluation to staff.

$754,000 (approx. half for the assessment and evaluation components).  AERI Project. 2008-11.

Co-Principal Investigator (with the Program of Social Studies and the President’s office, TC).

12.    Fulbright Research Scholar - Award Competition #7410. Center for International Exchange of Scholars, Washington D.C. Title: Education for All (EFA) –case studies on gender equity in selected primary schools in West Bengal, India and Bangladesh.

Goals: Conduct case study research on gender equity issues in primary schools under the state/national government’s EFA policy.

$13,837. Fulbright Commission, 2007-08.

Principal Investigator

13.    Stemmler Fund of the National Board of Medical Examiners. Title: Designing cognitive measures of practice-based learning and improvement as an iterative process combining Rasch and classical measurement methods.

Goals: Create and validate competency assessments for resident physicians in compliance with the Accreditation Council for Graduate Medical Education’s standards.

$145,000, AERI Project, 2006-09.

Co-Principal Investigator (with the Center for Educational Research and Evaluation, Columbia University- College of Physicians and Surgeons).

 

14.    Community Foundation of Elmira/Corning/Finger Lakes areas. Title: The Chemung County School Readiness Studies.

 

Goals: Conduct an evaluation of the county-wide school readiness project and support local instrument design/research efforts to examine inequities.

$94,000, AERI Project, 2006-09. 

Principal Investigator

15.    National Science Foundation-EREC Program Award 03-542.

Title: Improving mathematics achievement in elementary/middle school students with systemic use of proximal assessment data.

Goals: Conduct research, development and field-testing of the Proximal Assessment for Learner Diagnosis (PALD) model for closing learning gaps in Black/minority students, with classroom teachers in four schools in East Ramapo, NY. 

$501,925, 2005-2009.

Principal Investigator.

16.    Sub-contract with the Family Services of Westchester, NY/U.S. Department of Education-funded program.

Title: School-based mentoring program evaluation.  

Goals: Evaluate the long-term effects of the mentoring program on minority adolescents in the Peekskill School District, NY.

$10,000 per year, 2004-07

Principal Investigator.

17.    Contract with the Carnegie Learning Corporation. Title: Cognitive Tutor program evaluation.

 

Goals: Evaluate the effects of the Cognitive Tutor math program on student performance at 13 Brooklyn high schools.

$20,155, 2003-2004. 

Principal Investigator.

18.    National Center for Education Statistics (NCES). AERA Statistical Analysis and Policy Institute. April, 2002.

Goals: Receive training with the ECLS database for conducting research on early childhood achievement gaps.

$1500, Spring, 2002.

19.    Kumon North America, Inc.

Title: Kumon program evaluations at Public School 180 in the Chancellor's District, Harlem, New York.

Goals: Evaluate the effects of the Kumon supplementary math and reading programs.

$28,750, 2001-2003. 

Principal Investigator.

 

20.    Pinellas County Schools, Florida, Goals 2000, district-level training grant.

Title: Data-based decision-making in the classroom.

Goals: Develop a training manual for teachers/leaders in basic statistical analysis and use of assessment data for educational decision-making.

$29,000, 1999-2001.

Principal Investigator

21.    Bureau of Teacher Education, Florida Department of Education.

Title: Teacher Readiness for Statewide Assessment Reforms and its Influences on School Practices and Outcomes

Goals: Conduct a large scale survey to evaluate needs related to state-initiated reforms in nine Florida school districts.

 $25,000, 1999-2001 

Principal Investigator

22.    University of South Florida, Division of Sponsored Research, Creative Scholarship Grants Competition for Faculty

Title: Teacher and School Leader Readiness Levels for Statewide Assessment Reforms and its Influences on School Practices and Outcomes

Goals: Conduct a large scale survey to evaluate needs related to state-initiated reforms in nine Florida school districts.

 $7500, 1999-2001 

Principal Investigator

23.    University of South Florida, Instructional Technology Grants Competition for Faculty. Center for Teaching Enhancement

Title: Designing and Validating Educational Assessments: A Computer-based Module

Goals: Design technology-based modules to teach courses in assessment/test design.

 $7500, 1997-1998 

Principal Investigator

24.    Bureau of Curriculum, Instruction, and Assessment, Florida Department of Education. Invitational grant awarded to the Pasco County School System

Title: Developing Teacher-friendly Guides to teach with Florida’s Goal 3 Standards and assess student achievement.

Goals: To develop teacher guides at Levels 1-4, and deliver training to elementary and secondary school teachers.

$67,000, 1995-1996.

Project Leader and Primary Author.

25.    Florida Educational Research Council.

Title: Evidence of Consequential Validity of Alternative Assessments Aligned to an Elementary Mathematics Curriculum: A Pilot Study.

Goals: Conduct validation studies on the Pasco 2001 mathematics assessments for students.

$2,060, 1994-95

Principal Investigator.

 

 

TC Global Investment Fund Award, 2014
Teachers College, Columbia University

Provost’s Investment Fund Award, 2011
Teachers College, Columbia University

Fulbright Research Scholar, 2008
Center for International Exchange of Scholars (Fulbright Commission), Washington, D.C.
Study title: A study of gender equity in primary education in Bengali-speaking regions of India and Bangladesh: Evaluating access, opportunities, and factors affecting school outcomes and completion rates.

Outstanding Reviewer, 2006
Publications Committee, Educational Researcher, American Educational Research Association.

Outstanding Publication Award (Advances in Research Methodology-Division H): American Educational Research Association, 2004
Paper title: Evidence on what works: An argument for Extended-term Mixed Methods (ETMM) designs
Note: Published as a lead article in the Educational Researcher in 2004; reprinted in 2005.

Fellow, National Education Policy Center (NEPC), University of Colorado at Boulder
Previously Fellow, Education Policy Research Unit (EPRU), Arizona State University,
2006-present.

Distinguished Paper Award: Florida Educational Research Association, 1993
Paper title: Examining dimensionality of data generated from an early childhood scale using Rasch analysis and confirmatory factor analysis.  
Note: Published as a lead article in the Journal of Outcome Measurement in 1997.

Creative Scholarship Award, University of South Florida, 1999.

Instructional Technology Award, University of South Florida, 1997.

Elected Member, Phi Kappa Phi (Academic Honor Society)
University of South Florida, 1986.

Elected Member, Delta Kappa Gamma (Academic Honor Society for Educators)
1987.

Note: Professor Chatterji published as Madhabi Banerji from 1990-2000 and as Madhabi Chatterji from January, 2001-present. Articles are organized by scholarly interest area.

 

BOOKS

Chatterji, M. (2013). (Ed.). Validity and test use: An international dialogue on educational assessment, accountability, and equity. Bingley, UK: Emerald Group Publishing Limited

Chatterji, M. (2003). Designing and Using Tools for Educational Assessment. Boston, MA: Allyn & Bacon/Pearson. 

 

REFEREED ARTICLES

  • Assessment design, construct validation, and validity issues

Chatterji, M. & Lin, M. (2018). Measures and correlates of mathematics-related self-efficacy, self-concept, and anxiety in young learners: Construct validation in context as an iterative process. National Council on Measurement in Education, 2016 paper presentation.

Chatterji, M. (2013). Bad tests or bad test use? A case of SAT use to examine why we need stakeholder conversations on validity. Teachers College Record, 115 (9), 1-7.

Wyer, P. W. & Chatterji, M. (2013). Designing outcome measures for the accreditation of medical education programs as an iterative process combining classical test theory and Rasch measurement. The International Journal of Educational and Psychological Assessment, 13 (2), 35-61.

Chatterji, M. (2013). Global forces and educational assessment: A foreword on why we need an international dialogue on validity and test use. In M. Chatterji (Ed.), Validity and test use: An international dialogue on educational assessment, accountability, and equity (pp. 1-14). Bingley, UK: Emerald Group Publishing.

Chatterji, M., Sentovich, C, Ferron, J., & Rendina-Gobioff, G. (2002). Using an iterative validation model to conceptualize, pilot-test, and validate scores from an instrument measuring Teacher Readiness for Educational Reforms. Educational and Psychological Measurement, 62, 442-463.

Banerji, M. (1999). Validation of scores/measures from a K-2 developmental assessment in mathematics. Educational and Psychological Measurement, 59 (4), 694-715.

Banerji, M. & Ferron, J. (1998). Construct validity of a developmental assessment made up of mathematical patterns tasks. Educational and Psychological Measurement, 58 (4), 634-660.

Banerji, M., Smith, R.M., & Dedrick, R. F. (1997). Dimensionality of an early childhood scale using Rasch analysis and confirmatory factor analysis. Journal of Outcome Measurement, 1 (1), 56-86.

  • Evidence standards, the “evidence debate” and evaluation methods

Chatterji, M. (2016). Causal inferences on the effectiveness of complex social programs: Navigating assumptions, sources of complexity and evaluation design challenges. Evaluation and Program Planning, 59, 128-140. Available online at: http://dx.doi.org/10.1016/j.evalprogplan.2016.05.009

Chatterji, M., Green, L.W., & Kumanyika, S. (2014). L.E.A.D.: A framework for evidence gathering and use for the prevention of obesity and other complex public health problems. Health Education & Behavior, 41 (1), 85-99.

Chatterji, M. (2008). Synthesizing evidence from impact evaluations in education to inform action: Comments on Slavin. Educational Researcher, 37 (1), 23-26.

Chatterji, M. (2007). Grades of Evidence: Variability in quality of findings in effectiveness research on complex field interventions. American Journal of Evaluation, 28(3), 3-17.

Chatterji, M. (2009). Enhancing scientific evidence on how global educational initiatives work: Theory, epistemological foundations, and guidelines for applying multi-phase, mixed methods designs. In K.B. Ryan & J.B. Cousins (Eds.), The SAGE International Handbook of Educational Evaluation (pp. 92-111). Thousand Oaks, CA: Sage Publications.

Chatterji, M. (2010). Evaluation methodology. In P. Peterson, E. Baker, & B. McGaw (Eds.), International Encyclopedia of Education (Vol. 3, pp. 735-745). Oxford: Elsevier.

Chatterji, M. (2004). Evidence on "what works": An argument for extended-term mixed method (ETMM) evaluation designs. Educational Researcher, 33(9), 3-13. (Reprinted in Educational Researcher, 34(5), 14-24, 2005)

Chatterji, M. (2005). Applying the Joint Committee's 1994 standards in international contexts: A case study of educational evaluations in Bangladesh [Special Issue on New Perspectives in Program Evaluation]. Teachers College Record, 107 (10), 2373-2400.

Banerji, M., & Dailey, R.A. (1994). A study of the effects of an inclusion program for elementary students with specific learning disabilities. Journal of Learning Disabilities, 28(8), 511-522.

  • Standards-based reforms, diagnostic classroom assessment, and educational equity

Chatterji, M., Tripken, J., Johnson, S., Koh, N., Sbain, S., Allegrante, J.P., & Kufafka, R. (2016). Development and validation of a health information technology curriculum: Towards meaningful use of electronic health records. Pedagogy in Health Promotion, 1-14. Available online at: http://php.sagepub.com/cgi/reprint/2373379916669149.pdf?ijkey=3sbq6OuzlDS0nze&keytype=finite

Chatterji, M. (2012). Development and validation of indicators of teacher proficiency in diagnostic classroom assessment [Special issue on Teacher Assessments]. The International Journal of Educational and Psychological Assessment, 9(2), 4-25.

Chatterji, M., Koh, N., Choi, L., & Iyengar, R. (2009). Closing learner gaps proximally with teacher-mediated diagnostic assessment. Research in the Schools, 16(2), 60-77.

Chatterji, M. (2006). Reading achievement gaps, correlates and moderators of early reading achievement: Evidence from the Early Childhood Longitudinal Study (ECLS) kindergarten to first grade sample. Journal of Educational Psychology, 98(3), 489-507. 

Chatterji, M. (2005). Achievement gaps and correlates of early mathematics achievement: Evidence from the ECLS K-first grade sample. Educational Policy Analysis Archives, 13(46). 

Chatterji, M., Kwon, Y.A., Paczosa, L., & Sng, C. (2006). Gathering evidence on an after-school supplemental instruction program: Design challenges, lessons, and early findings in light of NCLB. Educational Policy Analysis Archives, 14 (12).

Chatterji, M. (2002). Models and methods for examining standards-based reforms:  Have the tools of inquiry answered the pressing questions on improving schools? Review of Educational Research, 72(3), 345-386.

Chatterji, M. (2002). Measuring leader perceptions of school readiness for standards-based reforms and accountability. Journal of Applied Measurement, 3(4), 455-485.

 

EDITED VOLUMES

Chatterji, M. (2014) (Guest Ed.). Assessment, accountability and quality issues. Quality Assurance in Education, 22(4). Special Issue.

Chatterji, M. (2013) (Guest Ed.). When education measures go public: Stakeholder perspectives on how and why validity breaks down. Teachers College Record, 115(9).

Chatterji, M., & Welner, K.G. (2014) (Guest Eds.). Validity, assessment and accountability: Contemporary issues in primary, secondary, and higher education. Quality Assurance in Education, 22(1). Special Issue.

 

OP-ED ARTICLES, REVIEWS AND BLOGS

Chatterji, M. (2014). Let’s mend, not end, educational testing. Education Week, Issue 24, published in print on March 12, 2014. Available at: http://www.tc.columbia.edu/aeri/conferences-and-forums/education-week-blog-2014/0311Chatterji.pdf.

Chatterji, M. (2014). Validity, test use, and consequences: Pre-empting a persistent problem. In Assessing the Assessments: K-12 Measurement and Accountability in the 21st Century at Education Week’s blog site on March 17, 2014: http://blogs.edweek.org/edweek/assessing_the_assessments. Available at: http://www.tc.columbia.edu/aeri/conferences-and-forums/education-week-blog-2014/0317Chatterji.pdf

Chatterji, M. (2014). Formative classroom assessment and assessment for accountability: Finding a balance. In Assessing the Assessments: K-12 Measurement and Accountability in the 21st Century at Education Week’s blog site on May 16, 2014: http://blogs.edweek.org/edweek/assessing_the_assessments. Available at: http://www.tc.columbia.edu/aeri/conferences-and-forums/education-week-blog-2014/0516Chatterji.pdf

Chatterji, M. (2010). Review of “Closing the Racial Achievement gap: Learning from Florida’s Reforms.” Boulder, CO: National Education Policy Center. Available at http://nepc.colorado.edu/thinktank/learnin-from-florida.

2015-present:          
Professor of Measurement, Evaluation, and Education
Director, Assessment and Evaluation Research Initiative (AERI at www.tc.edu/aeri)
Program of Social-Organizational Psychology, Dept. of Organization and Leadership,
Teachers College, Columbia University
Co-Editor, Quality Assurance in Education (QAE), an international, peer-reviewed journal in evaluation,
Emerald Group Publishing, UK

2006-2015:          
Associate Professor of Measurement, Evaluation, and Education
Director, Assessment and Evaluation Research Initiative (AERI)
Program of Social-Organizational Psychology, Dept. of Organization and Leadership,
Teachers College, Columbia University

2001-2005:            
Associate Professor of Measurement, Evaluation, and Education (Reappointed in 2003; tenured in 2005)
Dept. of Human Development,
Teachers College, Columbia University.

1996-2000:
Assistant Professor, Department of Educational Measurement and Research,
College of Education,
University of South Florida.

1988-1995:
Supervisor, Research and Evaluation Services
District School Board of Pasco County, Florida         

  1. American Evaluation Association (AEA): Member, 2000-present
    Journal article reviewer, American Journal of Evaluation (ongoing)
    Conference participant (ongoing) and International Topical Interest Group (TIG)
    Ambassador (periodic).

  2. American Educational Research Association (AERA): Member, 1986-present
    Conference Proposal Reviewer for Divisions D and H: Ongoing
    Division H Evaluation Report Judging Panel, 2004
    Session Discussant, Ongoing
    Journal article-peer reviewer, American Educational Research Journal, Educational Evaluation and Policy Analysis, Educational Researcher (topical, ongoing)
    Editorial Board Member, Educational Researcher, 2006-09.

  3. Eastern Evaluation Research Society (EERS), an affiliate of the American Evaluation Association:
    Member, 2002-present
    Member of the Board, 2004-2006
    Annual Conference Program Committee Member 2005-2006.

  4. Florida Educational Research Association (FERA): Member, 1986-2000
    Chair of Researcher of the Year Committee, 1997
    Member of Researcher of the Year Committee, 1996
    Chair of Professional Development and Training Committee, 1993
    Conference proposal reviewer and/or discussant 1991-1997, 1999
    Nominated as candidate for president, 1997           

  5. National Council on Measurement in Education (NCME): Member, 1987-present
    Editorial Board Member: Educational Measurement-Issues and Practice 1995-97
    Member, Nominations Committee 1993
    Proposal reviewer for NCME conferences: Ongoing

  6. Florida Educational Research Council (FERC): Member, 1990-1995
    Member, 1990-1995, by appointment of the Superintendent, Pasco County Schools
    Member of the Conference Planning Committee, 1993
    Nominated and elected Treasurer, 1994
    Nominated President-Elect, 1995 (Did not serve as President due to move to USF)

  7. Florida Public Health Association (FPHA): Member, 1999-2000

INVITED TALKS

Invited Lecture-Workshop at the Columbia Global Center-Mumbai (CGC), India on February 12, 2018 to an international audience of educational scholars/faculty, policymakers, leaders from K-12 and higher education institutions. Title: Evidence-based Approaches for Enhancing Educational Quality 

Invited Panel Presentation at Teachers College, Columbia University, Educational Leaders Data Analytics Summit on June 8, 2018

Invited keynote speech at an international conference organized by CENEVAL at the Autonomous University of San Luis Potosi, Mexico City on Oct. 28-29, 2016. Title: Contemporary methodologies for assessing student learning and evaluating the effectiveness of complex programs in higher education

Invited lecture to the Faculty of Education and the Doctoral Program of Education at Universidad Nacional de Educación a Distancia (UNED) in Madrid, Spain on January 12, 2015. Title:  Mixed Methods Evaluations

Keynote speech at an international education assessment conference organized by the Ministry of Education and Culture and Yogyakarta State University, Indonesia on November 8, 2014.
Title: Issues in Implementing Classroom Assessment and the Proximal Assessment for Learner Diagnosis (PALD) Model

Invited lecture at Calcutta University’s Department of Education, Alipore Campus, Kolkata, India on February 5, 2014. Title: Measures and Correlates of Mathematics Self-efficacy, Mathematics Self-Concept and Mathematics-Anxiety in Elementary Students: An Instrument Design and Validation Study.

Keynote speech at conference hosted for international clients (Indonesian delegates). December 21, 2012. Pearson Educational Measurement, New York, NY. Title: Validity Considerations with Large Scale Assessments.

Plenary session participant at the International Conference on Educational Measurement and Evaluation. August 10, 2012. Philippine Educational Measurement and Evaluation Association, Manila, Philippines. Title: Teacher Proficiency Indicators in Diagnostic Classroom Assessment.

Keynote speech given on June 16, 2011. International Forum on Talent Cultivation in Higher and Vocational Education held at Ningbo City, China, sponsored by Ningbo Polytechnic Institute and Institute of Higher Education, Xiamen University, China. Title: Talent Development in Higher and Vocational Education using a Diagnostic Classroom Assessment Model.

Invited Lecture at Institute of Higher Education, Xiamen University, China on March 15, 2010. Title: Models of Quality Assessment and Evaluation in Higher Education Systems in the U.S.

United States-India Educational Foundation, Kolkata, India-60th Anniversary Seminar Series, on February 18, 2010. Title: Gender equity in primary education in West Bengal and Bangladesh: Educational opportunities, achievement outcomes, and school completion rates

Institute of Medicine (Food and Nutrition Board), The National Academies, January 8, 2009. Invited panelist at open workshop on generating and using evidence effectively in obesity prevention decision-making. Title: Alternatives and tradeoffs in generating and evaluating evidence: Perspectives from education

BRAC-Research and Evaluation Division (RED), Dhaka, Bangladesh on March 10, 2008. Title: Assessing student learning: Building assessment capacity in Bangladesh's schools and education systems. Audience from the Directorate of Primary Education, National Board of Textbook and Curriculum Development, Ministry of Secondary Education, BRAC University-Institute for Educational Development, BRAC Education Programs and BRAC-Research and Evaluation Division.

Invited Lecture at the 12th International and 43rd National Conference of the Indian Academy of Applied Psychology, Kolkata, India on February 7, 2008. Title: Mixed-method designs for studying effects of complex field interventions: Criteria for screening the type and grade of evidence.

Invited talk at BRAC University-Institute for Educational Development, Dhaka, Bangladesh on February 12, 2007. Title: Assessment and evaluation in school organizations.

Psychometric Research Unit, Indian Statistical Institute (ISI), Kolkata, India on January 22, 2007 to ISI faculty and students. Title: Using structural equation modeling to study the internal structure of attitudinal measures.

American Educational Research Association (AERA), Mixed Methods SIG. Inaugural Session at AERA annual meeting on April 7, 2006. Title: Grades of evidence in effectiveness research and how mixed-method designs help: Evaluating the quality of findings against methodological choices of researchers.

Eastern Evaluation Research Society, an affiliate of the American Evaluation Association. Conference closing panel discussion with Grover J. Whitehurst of the Institute for Education Sciences, U.S. Department of Education and Nancy Wolff, Rutgers University, at the annual meeting on April 29, 2006. Title: Rigorous evaluations: Is there a gold standard?

Fordham University, Department of Clinical Psychology and Psychometrics, Colloquium Series. Feb 28, 2004. Title: Designing and validating construct measures using a unified process model.

American Educational Research Association/Institute for Education Sciences Postdoctoral Fellows' Summer Retreat, August 15, 2003. Titles: Instrumentation and validity of indicators; Knowledge production through documentation and evaluation.

Eastern Evaluation Research Society-An Affiliate of the American Evaluation Association, April 28, 2003. Paper Title: Models and methods for examining standards-based reforms (Sole author).

Florida Council on Elementary Education, April 1989. Title: Results of the developmental kindergarten study.

American Association of University Women, 1991. Title: Continuous progress: New directions in elementary education at Pasco County (with a panel of administrators from the Pasco County School System and Robert H. Anderson, USF).

For other National and International Conferences see recent vitae under Documents

Related Articles

Mixing Methods to Learn 'What Works'

There are the Reading Wars, the Math Wars, the School Choice Wars and then there is the war over how to settle all the other wars. At its root are the questions: What research methods help us know what really works in education? What constitutes valid evidence that a program helps students? TC Professor Madhabi Chatterji explains.

Research in 2004

Some of the diverse research contributions of TC faculty during 2004

Chatterji Talks about Making and Taking Tests

With the increasing importance placed on testing, it's necessary to look at how tests and assessment tools are designed and the quality of data they yield. Madhabi Chatterji, Associate Professor of Measurement, Evaluation, and Education does just that.

A New Way to Troubleshoot Student Learning

Two new studies show that teachers who successfully use a method called proximal assessment for learner diagnosis, or PALD, can boost the performance of fifth and sixth grade students in math.

Equity Campaign Announces New Assessment and Evaluation Research Initiative

The Campaign for Educational Equity, launched at Teachers College in 2005 to help narrow the gap between the nation's most advantaged and disadvantaged students, will now evaluate school-based efforts outside the College that have that same goal.

Introducing Peer Review

The Office of External Affairs announces the creation of Peer Review, an online talk show hosted by faculty member Renee Cherow-O’Leary.

TC's Equity Campaign Teams with the Harlem Children's Zone

TC's Campaign for Educational Equity is partnering with the Harlem Children's Zone (HCZ) to evaluate the long-term effectiveness of HCZ's system of early and progressive interventions aimed at improving health and educational opportunities for preschool-aged children.

Developing Young Leaders in India

Teachers College has received a $750,000 grant from The Global Education and Leadership Foundation (tGELF) to help develop and assess a leadership curriculum for junior high and high school students at a group of schools in India. tGELF is an initiative of The Nand and Jeet Khemka Foundation, the philanthropic arm of the SUN Group, an energy corporation based in India and Russia.

TC at AERA, 2008

Hank Levin is giving the Distinguished Lecture; Janet Miller is receiving a lifetime achievement award; Susan Fuhrman, Amy Wells, Jeanne Brooks-Gunn and Edmund Gordon are speaking in Presidential Sessions, and Gordon and colleagues are part of "A Scholar's Evening in Harlem." And then there's the research.

Previewing TC at AERA

Some 181 faculty, students and others affiliated with Teachers College will present at "To Know Is Not Enough," this year's meeting of the American Educational Research Association (AERA), which will be held in Vancouver. Click here to see a full listing of the presentations, dates, times and locations.

TC at AERA 2013

Some 172 faculty, students and others affiliated with Teachers College will present at "Education and Poverty: Theory, Research, Policy and Praxis," this year's meeting of the American Educational Research Association (AERA), which will be held in San Francisco from April 27-May 1.

Chatterji Part of Prestigious Committee Recommending Major Change in Thinking About Obesity Studies

The Institute of Medicine committee examined ways by which the existing evidence base and research on obesity and obesity prevention could be accessed, evaluated and made useful to policy-makers.

Changing the Paradigm in Obesity Studies

TC's Madhabi Chatterji is part of a multidisciplinary Institute of Medicine team that's rethinking an intractable problem

Staff news

Menghan Shen: Marshaling the Power of Her Peers

Columbia and TC Cited for Fulbright Excellence

Columbia University is one of the nation's top producers this year of Fulbright Students and Fulbright Scholars -- and TC has played an important part in that success. Top-producing institutions were highlighted in the October 24th digital edition of The Chronicle of Higher Education.

International Conference Will Examine Issues of Validity, Educational Assessment, Equity and Accountability

International stakeholders will gather for a conference and institute, from March 28 through March 31, to discuss how standardized tests and other assessments are constructed, what they measure, and whether the results are appropriately used.

Assessing Assessments — and Assessment Use

Experts from around the world gathered at TC in March to debate whether standardized tests are used in fair and valid ways

2012 Year in Review

Covering the period of September 1, 2011 through August 31, 2012

Academic Festival 2013: The Sessions

Faculty, alumni and students tackle a range of issues, from mobile technology to group therapy for people living in the aftermath of disaster

TC's Chatterji: "More Diverse Students Aspire Toward College"

The Associate Professor of Measurement-Evaluation and Education told The Washington Times that increasing numbers of minority test-takers on the SAT "is a positive thing for our society," but that we need to do more so that all students can succeed in college and beyond.

TC's Chatterji: "Multiple-Choice Tests Not Appropriate for Kindergartners"

The Associate Professor of Measurement, Evaluation and Education writes in the New York Daily News that multiple-choice tests are "not developmentally appropriate for kindergartners, who are not used to so much structure."

TC's Chatterji Co-Hosts EdWeek Blog on Assessment

The first blog entry, by Madhabi Chatterji, the director of TC's Assessment and Evaluation Research Initiative, debuts this week, with guests to follow.

Expanding TC's Global Reach

A renewed focus on international engagement fuels exciting work by TC faculty around the world.
