1990 Ph.D. University of South Florida, Tampa, Florida
1980 M.Ed. St. Christopher's College, University of Madras, Madras, India
1975 B.Ed. University of Bombay, Bombay, India
1973 B.Sc. (Honors) Lady Brabourne College, University of Calcutta, W.B., India
My research and scholarly interests encompass five major lines of inquiry that intersect with policy and practice concerns in education, with expanding lines of inquiry in health, psychology and other applied fields.
INTERDISCIPLINARY STUDIES IN EDUCATION: Doctoral degree applicants interested in working with Professor Chatterji on interdisciplinary problems will find more information on TC’s program here.
Professor Chatterji is also an affiliated faculty member at the Department of Education Policy and Social Analysis at TC.
Note: Published as Madhabi Banerji from 1990-2000, and as Madhabi Chatterji from January, 2001-present.
BOOKS (PEER-REVIEWED):
Chatterji, M. (2019). A Consumer’s Guide to Testing under the Every Student Succeeds Act (ESSA): What Can the Common Core and Other ESSA Assessments Tell Us? University of Colorado, Boulder: National Education Policy Center. See: https://nepc.colorado.edu/publication/rd-assessment-guide
Chatterji, M. (Ed.) (2013). Validity and test use: An international dialogue on educational assessment, accountability, and equity. Bingley, UK: Emerald Group Publishing Limited.
Chatterji, M. (2003). Designing and Using Tools for Educational Assessment. Boston, MA: Allyn & Bacon/Pearson.
REFEREED JOURNAL ARTICLES BY THEME:
I. Evidence Standards, the “Evidence Debate”, Improving Evidence-gathering and Evidence-synthesis Methodologies
Chatterji, M. (2016). Causal inferences on the effectiveness of complex social programs: Navigating assumptions, sources of complexity and evaluation design challenges. Evaluation and Program Planning, 59, 128-140.
Chatterji, M., Green, L.W., & Kumanyika, S. (2014). L.E.A.D.: A framework for evidence gathering and use for the prevention of obesity and other complex public health problems. Health Education & Behavior, 41 (1) 85-99. First released on June 19, 2013.
Chatterji, M. (2008). Synthesizing evidence from impact evaluations in education to inform action: Comments on Slavin. Educational Researcher, 37 (1) 23-26.
Chatterji, M. (2007). Grades of Evidence: Variability in quality of findings in effectiveness research on complex field interventions. American Journal of Evaluation, 28 (3), 3-17.
Chatterji, M. (2004). Evidence on “what works”: An argument for extended-term mixed method (ETMM) evaluation designs. Educational Researcher, 33 (9), 3-13. (Reprinted in Educational Researcher, 34 (5), 14-24, 2005).
II. Standards-based Reforms and Educational Equity
Chatterji, M. (2006). Reading achievement gaps, correlates and moderators of early reading achievement: Evidence from the Early Childhood Longitudinal Study (ECLS) kindergarten to first grade sample. Journal of Educational Psychology, 98 (3), 489-507.
Chatterji, M. (2005). Achievement gaps and correlates of early mathematics achievement: Evidence from the ECLS K-first grade sample. Education Policy Analysis Archives, 13 (46).
Chatterji, M., Koh, N., Choi, L., & Iyengar, R. (2009). Closing learner gaps proximally with teacher-mediated diagnostic assessment. Research in the Schools, 16 (2), 60-77.
Chatterji, M. (2002). Models and methods for examining standards-based reforms: Have the tools of inquiry answered the pressing questions on improving schools? Review of Educational Research, 72 (3), 345-386.
III. Instrument Design, Validation and Validity Issues
Chatterji, M., & Lin, M. (2018). Designing non-cognitive construct measures that improve mathematics achievement in grade 5-6 learners: A user-centered approach. Quality Assurance in Education, 26(1), 70-100.
Chatterji, M., Tripken, J., Johnson, S., Koh, N. J., Sabain, S., Allegrante, J.P., & Kukafka, R. (2017). Development and validation of a health information technology curriculum: Toward more meaningful use of electronic health records. Pedagogy in Health Promotion, 3(3) 154–166. Electronic release: © 2016 Society for Public Health Education
Wyer, P.W. & Chatterji, M. (2013). Designing outcome measures for the accreditation of medical education programs as an iterative process combining classical test theory and Rasch measurement. The International Journal of Educational and Psychological Assessment, 13 (2), 35-61.
Chatterji, M. (2012). Development and validation of indicators of teacher proficiency in diagnostic classroom assessment. The International Journal of Educational and Psychological Assessment, 9 (2), 4-25. Special Issue on Teacher Assessments.
Graham, M.J., Naqvi, N., Harding, K., Encandela, J.A., & Chatterji, M. (2009). Systems-Based Practice defined: Taxonomy development and role identification for competency assessment of residents. Journal of Graduate Medical Education, 1 (1), 49-60.
Chatterji, M. (2002). Measuring leader perceptions of school readiness for standards-based reforms and accountability. Journal of Applied Measurement, 3 (4), 455-485.
Chatterji, M., Sentovich, C., Ferron, J., & Rendina-Gobioff, G. (2002). Using an iterative validation model to conceptualize, pilot-test, and validate scores from an instrument measuring Teacher Readiness for Educational Reforms. Educational and Psychological Measurement, 62, 442-463.
Banerji, M., Smith, R.M., & Dedrick, R. F. (1997). Dimensionality of an early childhood scale using Rasch analysis and confirmatory factor analysis. Journal of Outcome Measurement, 1 (1), 56-86.
Banerji, M. (2000). Construct validity of scores/measures of a developmental assessment in mathematics using classical and many-facet Rasch analysis. Journal of Applied Measurement, 1 (2), 177-198.
Banerji, M. (1999). Validation of scores/measures from a K-2 developmental assessment in mathematics. Educational and Psychological Measurement, 59 (4), 694-715.
Banerji, M. & Ferron, J. (1998). Construct validity of a developmental assessment made up of mathematical patterns tasks. Educational and Psychological Measurement, 58 (4), 634-660.
Caines, J., Bridglall, B.L., & Chatterji, M. (2014). Understanding validity and fairness issues in high stakes individual testing situations. Quality Assurance in Education, 22 (1), 5-18.
Bridglall, B.L., Caines, J., & Chatterji, M. (2014). Understanding validity issues in test-based models of school and teacher evaluation. Quality Assurance in Education, 22 (1), 19-30.
Lin, M., Bumgarner, E., & Chatterji, M. (2014). Understanding validity issues in international large scale assessments. Quality Assurance in Education, 22 (1), 31-41.
IV. Evaluation Research and Evaluation Methods
Chatterji, M., Kwon, Y.A., Paczosa, L., & Sng, C. (2006). Gathering evidence on an after-school supplemental instruction program: Design challenges, lessons, and early findings in light of NCLB. Education Policy Analysis Archives, 14 (12).
Chatterji, M. (2005). Applying the Joint Committee's 1994 standards in international contexts: A case study of educational evaluations in Bangladesh. Teachers College Record, 107 (10), 2373-2400. Special Issue on New Perspectives in Program Evaluation.
Banerji, M. & Dailey, R.A. (1994). A study of the effects of an inclusion program for elementary students with specific learning disabilities. Journal of Learning Disabilities, 28 (8), 511-522.
Pearson, C.L. & Banerji, M. (1993). Effects of a ninth-grade dropout prevention program on student academic achievement, school attendance, and dropout rate. Journal of Experimental Education, 61 (3), 247-256.
V. Assessment Policy
Chatterji, M. (2013). Bad tests or bad test use? A case of SAT® use to examine why we need stakeholder conversations on validity. Teachers College Record, 115 (9), 1-7.
Popham, W. J., Berliner, D.C., Kingston, N., Fuhrman, S.H., Ladd, S.M., Charbonneau, J. & Chatterji, M. (2014). Can today's standardized tests yield instructionally useful data? Challenges, promises and the state of the art. Quality Assurance in Education, 22 (4), 300-315.
Pizmony-Levy, O., Harvey, J., Schmidt, W., Noonan, R., Engel, L., Feuer, M.J., Santorno, C., Rotberg, I., Ash, P., Braun, H., Torney-Purta, J., & Chatterji, M. (2014). On the merits of, and myths about, international assessments. Quality Assurance in Education, 22 (4), 316-335.
Gordon, E. W., McGill, M.V., Sands, D.I., Kalinich, K., Pellegrino, J.W., & Chatterji, M. (2014). Bringing formative assessment to schools and making it count. Quality Assurance in Education, 22 (4), 336-350.
Chatterji, M. (2010). Review of “Closing the Racial Achievement gap: Learning from Florida’s Reforms.” Boulder, CO: National Education Policy Center. Available at http://nepc.colorado.edu/thinktank/learnin-from-florida.
Chatterji, M. (2005, April). Closing Florida’s achievement gaps: Florida Institute of Education (FIE) Policy Brief 4. Jacksonville, FL: Florida Institute of Education at the University of North Florida.
Chatterji, M. (2004, April). Good and bad news about Florida student achievement: Performance trends on multiple indicators since passage of the A+ legislation. Education Policy Research Unit Policy Brief, Doc. No. EPSL-0401-105-EPRU. Tempe, AZ: Educational Policy Studies Laboratory.
See a selection of publications available on PubMed.
Other publications are available via ERIC, PsycINFO and Psychological Abstracts.
MADHABI CHATTERJI
Madhabi Chatterji, Ph.D., M.Ed., B.Sc. (Hons.) is Professor of Measurement, Evaluation, and Education and the founding director of the Assessment and Evaluation Research Initiative (AERI) at Columbia University’s Teachers College (TC), a center dedicated to promoting meaningful use of assessment and evaluation information to improve the quality of practices and policies, across disciplines and internationally (www.tc.edu/aeri).
Professor Chatterji’s long-standing research interests are in assessment and evaluation in education. Since 2008, she has extended these lines of inquiry to health, psychology, and other applied fields, with a forthcoming book: Designing Assessments for Multidisciplinary Constructs and Applications: A User-centered Methodology (Guilford, in preparation). Her 50+ refereed journal articles and edited works have appeared in many top-tier academic journals, including Educational Researcher, Review of Educational Research, Journal of Educational Psychology, Educational and Psychological Measurement, American Journal of Evaluation, Journal of Learning Disabilities, and Health Education & Behavior. Her first book, Designing and Using Tools for Educational Assessment (2003, Allyn & Bacon/Pearson), is popular among students and first presented a Process Model to guide instrument design, validation and use. Professor Chatterji teaches graduate-level courses in measurement and evaluation methods at TC, and has served as major advisor or examiner on 60+ doctoral dissertation committees.
A public intellectual, she has spoken out on the consequences of misuses of tests and test-based information through news outlets, op-eds and blogs. She co-led the blog in Education Week (2014: Assessing the Assessments: K-12 Measurement and Accountability in the 21st Century: http://blogs.edweek.org/edweek/assessing_the_assessments ). She is a Fellow at the National Education Policy Center (NEPC) based at the University of Colorado, Boulder (2006-present), a forum through which she continues her assessment policy work.
Dr. Chatterji was Co-editor of Quality Assurance in Education, an international peer-reviewed journal in educational evaluation (2015-2018). Her past service also includes membership on numerous national and international advisory panels and journal review boards, including the editorial boards of Educational Measurement: Issues and Practice and the Educational Researcher, the flagship journals of the National Council on Measurement in Education and the American Educational Research Association. She served as a member of an expert consensus committee on evidence frameworks for decision-making in obesity prevention convened by the Institute of Medicine, now the National Academy of Medicine (2008-10). Currently, she is on the Faculty Steering Committee of the Columbia Global Centers-Mumbai.
Prior to joining TC in 2001, Dr. Chatterji was Assistant Professor in the Department of Measurement and Research at the College of Education, University of South Florida (USF, 1996-2000), and Supervisor, Research and Evaluation Services at the District School Board of Pasco County in Florida (1988-1995). She received her Ph.D. from USF in 1990, before which she was a secondary school teacher of English and Science in India.
Dr. Chatterji emigrated to the U.S. as a doctoral student in 1985, with her then-young daughters, Ruma and Raka. They are all settled permanently in the U.S. now.
To learn more about Professor Chatterji’s specific research interests, current roles, sponsored research activities and awards/recognitions, please see her CV.
AERI site at www.tc.edu/aeri
Contract/Agreement of AERI with the Department of Anesthesiology at the Columbia University Medical Center (CUMC), 2018-present. Annually renewable contract of $15,000 per year. AERI’s goals: to provide supervised research apprenticeship opportunities for TC’s graduate students and postdoctoral researchers, leading up to joint grant applications, conference papers or publications. CUMC’s goals: to support the career development of medical faculty with technical consultation in measurement, statistics and research methods, grant preparation and publication.
Foundation for Anesthesia Education Research (FAER), 2017-Present. Sub-award # PG008427-01 of FAER REG-08-15-17; TC Index 544129. Purpose: Primary Faculty Mentor to Allison Lee, M.D., now Associate Professor, Department of Anesthesiology at the Columbia University Medical Center. $4918 (year 1); $7500 (year 2); $7500 (year 3). Study title: A mixed methods, randomized controlled trial comparing two methods of debriefing for a serious game designed to teach novice anesthesia residents (CA1) to perform anesthesia for emergency caesarian delivery.
Teachers College Global Investment Fund Competition, 2014-15. Addressing inequities through comprehensive, ecologically-based models of primary education: A capacity-building effort in teacher education institutions in India. $8,000 for developing grant proposals to initiate multi-year projects. AERI Project.
Subcontract with the International Medical Corps (IMC), 2013-14. Evaluating comprehensive mental health and psychosocial support services for vulnerable refugees. $52,683. Co-Principal Investigator (with the Department of Clinical and Counseling Psychology, TC).
Subcontract with the Howard Hughes Medical Institute (HHMI) project at Barnard College, 2013-2017. $31,998. Received a 4 year subcontract to support the Hughes Science Pipeline Project for middle schools. Principal Investigator.
National Science Foundation (NSF) REESE Award, 2012-2013. $148,747. Title: Improving validity at the nexus of assessment design and use: A proposal for an international conference and capacity-building institute in assessment and evaluation. Principal Investigator.
Educational Testing Service (ETS), 2011-12. $52,200. Co-sponsorship of AERI's inaugural conference on March 28-29, 2012. Title: Educational assessment, accountability and equity: Conversations on validity around the world.
Office of the National Coordinator, United States Department of Health and Human Services, Washington, D.C. Curriculum Development Center award to the Department of Biomedical Informatics, Columbia University (ca. $1.2 million). Subcontract for the development and validation of educational assessments in health information technology and designing a program evaluation protocol, 2010-2012: $204,000. Co-Principal Investigator with the Department of Health and Behavior Studies, TC.
The Nand and Jeet Khemka Foundation, India, 2008-2010. The Global Education and Leadership Foundation's Life Skills and Leadership Programme: Development of curriculum-based assessments, formative evaluation of pilot programs, and organizational capacity-building in assessment and evaluation. $754,000 (approx. $350,000 for the assessment and evaluation components). Co-Principal Investigator (with the Department of Arts and Humanities, TC).
Fulbright Research Award, 2007-08, Competition #7410. Center for International Exchange of Scholars, Washington D.C., June 2007. $13,837. Principal Investigator.
Stemmler Fund of the National Board of Medical Examiners (NBME) in collaboration with the Center for Educational Research and Evaluation, Columbia University College of Physicians and Surgeons, April, 2006. $145,000. Title: Designing cognitive measures of practice-based learning and improvement as an iterative process combining Rasch and classical measurement methods. Co-Principal Investigator.
Community Foundation of Elmira/Corning/Finger Lakes areas. The Chemung County School Readiness Studies. $94,000 over a 3 year period (2006-09) granted to AERI. Principal Investigator.
National Science Foundation (NSF) EREC Solicitation 03-542, November, 2004-2008. $501,925. Title: Improving mathematics achievement in middle school students with systemic use of proximal assessment data. Principal Investigator.
U.S. Department of Education. School-based mentoring programs: Evaluation of long-term effects on adolescents in the Peekskill School District, NY. Sub-contract with Family Services of Westchester, N.Y. Funded in 2004; project duration January, 2005-December, 2007. $10,000 per year. Principal Investigator.
Carnegie Learning Corporation-Cognitive Tutor evaluation. Contract to evaluate the effects of the Cognitive Tutor mathematics program at 13 Brooklyn schools 2003-2004. $20,155. Principal Investigator.
Kumon North America, Inc. Grant to evaluate the effects of the Kumon supplementary math and reading programs at P.S. 180 in the Chancellor's District, New York. (2001-2002). $28,750. Principal Investigator.
Pinellas County Schools, Florida, Goals 2000 project, 1999-2001. Data-based decision-making in the classroom. Grant to develop a training manual in statistical analysis and use of assessment data for educational decision-making. $29,000. Principal Investigator.
Bureau of Teacher Education, Florida Department of Education, 1999. Readiness for Statewide Assessment Reforms and its Influence on School Practices and Outcomes. Grant to conduct a large scale survey to evaluate needs related to state-initiated reforms in nine Florida school districts. $25,000. Principal Investigator.
University of South Florida, Division of Sponsored Research, Creative Scholarship Grants Competition, March, 1999. Readiness for Statewide Assessment Reforms and Influences on School Practices and Outcomes. $7,500. Principal Investigator.
University of South Florida, Center for Teaching Enhancement, March, 1997. Designing and Validating Educational Assessments: A Computer-based Module. Instructional Technology Grants Competition. $7,500. Principal Investigator.
Florida Department of Education-Bureau of Curriculum, Instruction, and Assessment. Developing Teacher-friendly Guides for Assessing Florida's Goal 3 Standards. Invitational grant awarded to the Pasco County School System, March, 1995. $67,000. Project Leader and Primary Author.
TC Global Investment Fund Award, 2014
Teachers College, Columbia University
Provost’s Investment Fund Award, 2011
Teachers College, Columbia University
Fulbright Research Scholar, 2008
Center for International Exchange of Scholars (Fulbright Commission), Washington, D.C.
Study title: A study of gender equity in primary education in Bengali-speaking regions of India and Bangladesh: Evaluating access, opportunities, and factors affecting school outcomes and completion rates.
Outstanding Reviewer, 2006
Publications Committee, Educational Researcher, American Educational Research Association.
Outstanding Publication Award (Advances in Research Methodology-Division H): American Educational Research Association, 2004
Paper title: Evidence on what works: An argument for Extended-term Mixed Methods (ETMM) designs
Note: Published as a lead article in the Educational Researcher in 2004; reprinted in 2005.
Fellow, National Education Policy Center (NEPC), University of Colorado at Boulder, 2006-present
Previously Fellow, Educational Policy Research Unit (EPRU), Arizona State University
Distinguished Paper Award: Florida Educational Research Association, 1993
Paper title: Examining dimensionality of data generated from an early childhood scale using Rasch analysis and confirmatory factor analysis.
Note: Published as a lead article in the Journal of Outcome Measurement in 1997.
Creative Scholarship Award, University of South Florida, 1999.
Instructional Technology Award, University of South Florida, 1997.
Elected Member, Phi Kappa Phi (Academic Honor Society)
University of South Florida, 1986.
Elected Member, Delta Kappa Gamma (Academic Honor Society for Educators)
1987.
Note: Professor Chatterji published as Madhabi Banerji from 1990-2000 and as Madhabi Chatterji from January, 2001-present. Articles are organized by scholarly interest area.
BOOKS
Chatterji, M. (Ed.) (2013). Validity and test use: An international dialogue on educational assessment, accountability, and equity. Bingley, UK: Emerald Group Publishing Limited.
Chatterji, M. (2003). Designing and Using Tools for Educational Assessment. Boston, MA: Allyn & Bacon/Pearson.
REFEREED ARTICLES
Chatterji, M. & Lin, M. (2018). Measures and correlates of mathematics-related self-efficacy, self-concept, and anxiety in young learners: Construct validation in context as an iterative process. National Council on Measurement in Education, 2016 paper presentation.
Chatterji, M. (2013). Bad tests or bad test use? A case of SAT use to examine why we need stakeholder conversations on validity. Teachers College Record, 115 (9), 1-7.
Wyer, P. W. & Chatterji, M. (2013). Designing outcome measures for the accreditation of medical education programs as an iterative process combining classical test theory and Rasch measurement. The International Journal of Educational and Psychological Assessment, 13 (2), 35-61.
Chatterji, M. (2013). Global forces and educational assessment: A foreword on why we need an international dialogue on validity and test use. In M. Chatterji (Ed.), Validity and test use: An international dialogue on educational assessment, accountability, and equity (pp. 1-14). Bingley, UK: Emerald Group Publishing.
Chatterji, M., Sentovich, C, Ferron, J., & Rendina-Gobioff, G. (2002). Using an iterative validation model to conceptualize, pilot-test, and validate scores from an instrument measuring Teacher Readiness for Educational Reforms. Educational and Psychological Measurement, 62, 442-463.
Banerji, M. (1999). Validation of scores/measures from a K-2 developmental assessment in mathematics. Educational and Psychological Measurement, 59 (4), 694-715.
Banerji, M. & Ferron, J. (1998). Construct validity of a developmental assessment made up of mathematical patterns tasks. Educational and Psychological Measurement, 58 (4), 634-660.
Banerji, M., Smith, R.M., & Dedrick, R. F. (1997). Dimensionality of an early childhood scale using Rasch analysis and confirmatory factor analysis. Journal of Outcome Measurement, 1 (1), 56-86.
Chatterji, M. (2016). Causal inferences on the effectiveness of complex social programs: Navigating assumptions, sources of complexity and evaluation design challenges. Evaluation and Program Planning, 59, 128-140. Available online at: http://dx.doi.org/10.1016/j.evalprogplan.2016.05.009
Chatterji, M., Green, L.W., & Kumanyika, S. (2014). L.E.A.D.: A framework for evidence gathering and use for the prevention of obesity and other complex public health problems. Health Education & Behavior, 41 (1), 85-99.
Chatterji, M. (2008). Synthesizing evidence from impact evaluations in education to inform action: Comments on Slavin. Educational Researcher, 37 (1) 23-26.
Chatterji, M. (2007). Grades of Evidence: Variability in quality of findings in effectiveness research on complex field interventions. American Journal of Evaluation, 28(3), 3-17.
Chatterji, M. (2009). Enhancing scientific evidence on how global educational initiatives work: Theory, epistemological foundations, and guidelines for applying multi-phase, mixed methods designs. In K.B. Ryan & J. B. Cousins (Eds.), The SAGE International Handbook of Educational Evaluation (pp. 92-111). Thousand Oaks, CA: Sage Publications.
Chatterji, M. (2010). Evaluation methodology. In P. Peterson, E. Baker, and B. McGaw (Eds.), International Encyclopedia of Education, Volume 3 (pp. 735-745). Oxford: Elsevier.
Chatterji, M. (2004). Evidence on "what works": An argument for extended-term mixed method (ETMM) evaluation designs. Educational Researcher, 33(9), 3-13. (Reprinted in Educational Researcher, 34(5), 14-24, 2005)
Chatterji, M. (2005). Applying the Joint Committee's 1994 standards in international contexts: A case study of educational evaluations in Bangladesh [Special Issue on New Perspectives in Program Evaluation]. Teachers College Record, 107 (10), 2373-2400.
Banerji, M. & Dailey, R.A. (1994). A study of the effects of an inclusion program for elementary students with specific learning disabilities. Journal of Learning Disabilities, 28 (8), 511-522.
Chatterji, M., Tripken, J., Johnson, S., Koh, N. J., Sabain, S., Allegrante, J.P. & Kukafka, R. (2016). Development and validation of a health information technology curriculum: Toward more meaningful use of electronic health records. Pedagogy in Health Promotion, 1-14. Available online at: http://php.sagepub.com/cgi/reprint/2373379916669149.pdf?ijkey=3sbq6OuzlDS0nze&keytype=finite
Chatterji, M. (2012). Development and validation of indicators of teacher proficiency in diagnostic classroom assessment [Special issue on Teacher Assessments]. The International Journal of Educational and Psychological Assessment, 9(2), 4-25.
Chatterji, M., Koh, N., Choi, L., & Iyengar, R. (2009). Closing learner gaps proximally with teacher-mediated diagnostic assessment. Research in the Schools, 16(2), 60-77.
Chatterji, M. (2006). Reading achievement gaps, correlates and moderators of early reading achievement: Evidence from the Early Childhood Longitudinal Study (ECLS) kindergarten to first grade sample. Journal of Educational Psychology, 98(3), 489-507.
Chatterji, M. (2005). Achievement gaps and correlates of early mathematics achievement: Evidence from the ECLS K-first grade sample. Education Policy Analysis Archives, 13(46).
Chatterji, M., Kwon, Y.A., Paczosa, L., & Sng, C. (2006). Gathering evidence on an after-school supplemental instruction program: Design challenges, lessons, and early findings in light of NCLB. Education Policy Analysis Archives, 14 (12).
Chatterji, M. (2002). Models and methods for examining standards-based reforms: Have the tools of inquiry answered the pressing questions on improving schools? Review of Educational Research, 72(3), 345-386.
Chatterji, M. (2002). Measuring leader perceptions of school readiness for standards-based reforms and accountability. Journal of Applied Measurement, 3(4), 455-485.
EDITED VOLUMES
Chatterji, M. (Guest Ed.) (2014). Assessment, accountability and quality issues [Special issue]. Quality Assurance in Education, 22 (4).
Chatterji, M. (Guest Ed.) (2013). When education measures go public: Stakeholder perspectives on how and why validity breaks down [Special issue]. Teachers College Record, 115 (9).
Chatterji, M. & Welner, K. G. (Guest Eds.) (2014). Validity, assessment and accountability: Contemporary issues in primary, secondary, and higher education [Special issue]. Quality Assurance in Education, 22 (1).
OP-ED ARTICLES, REVIEWS AND BLOGS
Chatterji, M. (2014). Let’s mend, not end, educational testing. Education Week, Issue 24. In print on March 12, 2014. Available at: http://www.tc.columbia.edu/aeri/conferences-and-forums/education-week-blog-2014/0311Chatterji.pdf.
Chatterji, M. (2014). Validity, test use, and consequences: Pre-empting a persistent problem. In Assessing the Assessments: K-12 Measurement and Accountability in the 21st Century at Education Week’s blog site on March 17, 2014: http://blogs.edweek.org/edweek/assessing_the_assessments. Available at: http://www.tc.columbia.edu/aeri/conferences-and-forums/education-week-blog-2014/0317Chatterji.pdf
Chatterji, M. (2014). Formative classroom assessment and assessment for accountability: Finding a balance. In Assessing the Assessments: K-12 Measurement and Accountability in the 21st Century at Education Week’s blog site on May 16, 2014: http://blogs.edweek.org/edweek/assessing_the_assessments. Available at: http://www.tc.columbia.edu/aeri/conferences-and-forums/education-week-blog-2014/0516Chatterji.pdf
Chatterji, M. (2010). Review of “Closing the Racial Achievement gap: Learning from Florida’s Reforms.” Boulder, CO: National Education Policy Center. Available at http://nepc.colorado.edu/thinktank/learnin-from-florida.
2015-present:
Professor of Measurement, Evaluation, and Education
Director, Assessment and Evaluation Research Initiative (AERI at www.tc.edu/aeri)
Program of Social-Organizational Psychology, Dept. of Organization and Leadership,
Teachers College, Columbia University
Co-Editor, Quality Assurance in Education (QAE), an international, peer-reviewed journal in evaluation,
Emerald Group Publishing, UK
2006-2015:
Associate Professor of Measurement, Evaluation, and Education
Director, Assessment and Evaluation Research Initiative (AERI)
Program of Social-Organizational Psychology, Dept. of Organization and Leadership,
Teachers College, Columbia University
2001-2005:
Associate Professor of Measurement, Evaluation, and Education (Reappointed in 2003; tenured in 2005)
Dept. of Human Development,
Teachers College, Columbia University.
1996-2000:
Assistant Professor, Department of Educational Measurement and Research,
College of Education,
University of South Florida.
1988-1995:
Supervisor, Research and Evaluation Services
District School Board of Pasco County, Florida
American Evaluation Association (AEA): Member, 2000-present
Journal article reviewer, American Journal of Evaluation (ongoing)
Conference participant (ongoing) and International Topical Interest Group (TIG)
Ambassador (periodic).
American Educational Research Association (AERA): Member, 1986-present
Conference Proposal Reviewer for Divisions D and H: Ongoing
Division H Evaluation Report Judging Panel, 2004
Session Discussant, Ongoing
Journal article-peer reviewer, American Educational Research Journal, Educational Evaluation and Policy Analysis, Educational Researcher (topical, ongoing)
Editorial Board Member, Educational Researcher, 2006-09.
Eastern Evaluation Research Society (EERS), an affiliate of the American Evaluation Association:
Member, 2002-present
Member of the Board, 2004-2006
Annual Conference Program Committee Member 2005-2006.
Florida Educational Research Association (FERA): Member, 1986-2000
Chair of Researcher of the Year Committee, 1997
Member of Researcher of the Year Committee, 1996
Chair of Professional Development and Training Committee, 1993
Conference proposal reviewer and/or discussant 1991-1997, 1999
Nominated as candidate for president, 1997
National Council on Measurement in Education (NCME): Member, 1987-present
Editorial Board Member: Educational Measurement-Issues and Practice 1995-97
Member, Nominations Committee 1993
Proposal reviewer for NCME conferences: Ongoing
Florida Educational Research Council (FERC): Member, 1990-1995, by appointment of the Superintendent, Pasco County Schools
Member of the Conference Planning Committee, 1993
Nominated and elected Treasurer, 1994
Nominated President-Elect, 1995 (Did not serve as President due to move to USF)
Florida Public Health Association (FPHA): Member, 1999-2000
INVITED TALKS
Invited Lecture-Workshop at the Columbia Global Center-Mumbai (CGC), India on February 12, 2018 to an international audience of educational scholars/faculty, policymakers, leaders from K-12 and higher education institutions. Title: Evidence-based Approaches for Enhancing Educational Quality
Invited Panel Presentation at Teachers College, Columbia University, Educational Leaders Data Analytics Summit on June 8, 2018
Invited keynote speech at an international conference organized by CENEVAL at the Autonomous University of San Luis Potosi, Mexico City, on Oct. 28-29, 2016. Title: Contemporary methodologies for assessing student learning and evaluating the effectiveness of complex programs in higher education
Invited lecture to the Faculty of Education and the Doctoral Program of Education at Universidad Nacional de Educación a Distancia (UNED) in Madrid, Spain on January 12, 2015. Title: Mixed Methods Evaluations
Keynote speech at an international education assessment conference organized by the Ministry of Education and Culture and Yogyakarta State University, Indonesia on November 8, 2014. Title: Issues in Implementing Classroom Assessment and the Proximal Assessment for Learner Diagnosis (PALD) Model
Invited lecture at Calcutta University’s Department of Education, Alipore Campus, Kolkata, India on February 5, 2014. Title: Measures and Correlates of Mathematics Self-efficacy, Mathematics Self-Concept and Mathematics-Anxiety in Elementary Students: An Instrument Design and Validation Study.
Keynote speech at conference hosted for international clients (Indonesian delegates). December 21, 2012. Pearson Educational Measurement, New York, NY. Title: Validity Considerations with Large Scale Assessments.
Plenary session participant at the International Conference on Educational Measurement and Evaluation. August 10, 2012. Philippine Educational Measurement and Evaluation Association, Manila, Philippines. Title: Teacher Proficiency Indicators in Diagnostic Classroom Assessment.
Keynote speech given on June 16, 2011. International Forum on Talent Cultivation in Higher and Vocational Education held at Ningbo City, China, sponsored by Ningbo Polytechnic Institute and Institute of Higher Education, Xiamen University, China. Title: Talent Development in Higher and Vocational Education using a Diagnostic Classroom Assessment Model.
Invited Lecture at Institute of Higher Education, Xiamen University, China on March 15, 2010. Title: Models of Quality Assessment and Evaluation in Higher Education Systems in the U.S.
United States-India Educational Foundation, Kolkata, India-60th Anniversary Seminar Series, on February 18, 2010. Title: Gender equity in primary education in West Bengal and Bangladesh: Educational opportunities, achievement outcomes, and school completion rates
Institute of Medicine (Food and Nutrition Board), The National Academies, January 8, 2009. Invited panelist at open workshop on generating and using evidence effectively in obesity prevention decision-making. Title: Alternatives and tradeoffs in generating and evaluating evidence: Perspectives from education
BRAC-Research and Evaluation Division (RED), Dhaka, Bangladesh on March 10, 2008. Title: Assessing student learning: Building assessment capacity in Bangladesh's schools and education systems. Audience from the Directorate of Primary Education, National Board of Textbook and Curriculum Development, Ministry of Secondary Education, BRAC University-Institute for Educational Development, BRAC Education Programs and BRAC-Research and Evaluation Division.
Invited Lecture at the 12th International and 43rd National Conference of the Indian Academy of Applied Psychology, Kolkata, India on February 7, 2008. Title: Mixed-method designs for studying effects of complex field interventions: Criteria for screening the type and grade of evidence.
Invited talk at BRAC University-Institute for Educational Development, Dhaka, Bangladesh on February 12, 2007. Title: Assessment and evaluation in school organizations.
Psychometric Research Unit, Indian Statistical Institute (ISI), Kolkata, India on January 22, 2007 to ISI faculty and students. Title: Using structural equation modeling to study the internal structure of attitudinal measures.
American Educational Research Association (AERA), Mixed Methods SIG. Inaugural Session at AERA annual meeting on April 7, 2006. Title: Grades of evidence in effectiveness research and how mixed-method designs help: Evaluating the quality of findings against methodological choices of researchers.
Eastern Evaluation Research Society, an affiliate of the American Evaluation Association. Conference closing panel discussion with Grover J. Whitehurst of the Institute for Education Sciences, U.S. Department of Education and Nancy Wolff, Rutgers University, at the annual meeting on April 29, 2006. Title: Rigorous evaluations: Is there a gold standard?
Fordham University, Department of Clinical Psychology and Psychometrics, Colloquium Series. Feb 28, 2004. Title: Designing and validating construct measures using a unified process model.
American Educational Research Association/Institute for Education Sciences Postdoctoral Fellows' Summer Retreat, August 15, 2003. Titles: Instrumentation and validity of indicators; Knowledge production through documentation and evaluation.
Eastern Evaluation Research Society-An Affiliate of the American Evaluation Association, April 28, 2003. Paper Title: Models and methods for examining standards-based reforms (Sole author).
Florida Council on Elementary Education, April, 1989. Title: Results of the developmental kindergarten study.
American Association of University Women. Title: Continuous progress: New directions in elementary education at Pasco County (with a panel of administrators from the Pasco County School System and Robert H. Anderson, USF), 1991.
For other National and International Conferences see recent vitae under Documents.