Madhabi Chatterji

Professional Background

Educational Background

1990    Ph.D., University of South Florida, Tampa, Florida

1980    M.Ed., St. Christopher's College, University of Madras, Madras, India

1975    B.Ed., University of Bombay, Bombay, India

1973    B.Sc. (Honors), Lady Brabourne College, University of Calcutta, W.B., India

Scholarly Interests

  • Assessment and testing (validity, instrument design and validation, diagnostic classroom assessment)
  • Evaluation methodology (evidence standards and evidence synthesis methods)
  • Educational equity in U.S. and global settings
  • Standards-based educational reforms in U.S. and global settings

Selected Publications

Chatterji, M., Green, L. W., & Kumanyika, S. (2014). L.E.A.D.: A framework for evidence gathering and use for the prevention of obesity and other complex public health problems. Health Education & Behavior, 41(1), 85-99.

 

Chatterji, M. (2013). Global forces and educational assessment: A foreword on why we need an international dialogue on validity and test use. In M. Chatterji (Ed.), Validity and test use: An international dialogue on educational assessment, accountability, and equity (pp. 1-14). Bingley, UK: Emerald Group Publishing Limited.


Chatterji, M. (2013). Bad tests or bad test use? A case of SAT® use to examine why we need stakeholder conversations on validity. Teachers College Record, 115 (9), 1-7.

 

Wyer, P. W. & Chatterji, M. (2013). Designing outcome measures for the accreditation of medical education programs as an iterative process combining classical test theory and Rasch measurement. The International Journal of Educational and Psychological Assessment, 13 (2), 35-61.


Chatterji, M. (2012). Development and validation of indicators of teacher proficiency in diagnostic classroom assessment [Special issue on Teacher Assessments]. The International Journal of Educational and Psychological Assessment, 9(2), 4-25.


Chatterji, M. (2010). Evaluation methodology. In P. Peterson, E. Baker, & B. McGaw (Eds.), International Encyclopedia of Education (Vol. 3, pp. 735-745). Oxford: Elsevier.


Chatterji, M. (2009). Enhancing scientific evidence on how global educational initiatives work: Theory, epistemological foundations, and guidelines for applying phased, mixed method designs. In K. B. Ryan & J. B. Cousins (Eds.), The SAGE International Handbook of Educational Evaluation (pp. 92-111). Thousand Oaks, CA: Sage Publications.

 

Chatterji, M., Koh, N., Choi, L., & Iyengar, R. (2009). Closing learner gaps proximally with teacher-mediated diagnostic assessment. Research in the Schools, 16(2), 60-77.

 

Chatterji, M. (2008). Synthesizing evidence from impact evaluations in education to inform action: Comments on Slavin. Educational Researcher, 37(1), 23-26.

 

Chatterji, M. (2007). Grades of Evidence: Variability in quality of findings in effectiveness research on complex field interventions. American Journal of Evaluation, 28(3), 3-17.

 

Chatterji, M. (2006). Reading achievement gaps, correlates and moderators of early reading achievement: Evidence from the Early Childhood Longitudinal Study (ECLS) kindergarten to first grade sample. Journal of Educational Psychology, 98(3), 489-507. 

 

Chatterji, M. (2005). Achievement gaps and correlates of early mathematics achievement: Evidence from the ECLS K-first grade sample. Education Policy Analysis Archives, 13(46).

 

Chatterji, M. (2005). Applying the Joint Committee's 1994 standards in international contexts: A case study of educational evaluations in Bangladesh [Special Issue on New Perspectives in Program Evaluation]. Teachers College Record, 107 (10), 2373-2400.

 

Chatterji, M. (2004). Evidence on "what works": An argument for extended-term mixed method (ETMM) evaluation designs. Educational Researcher, 33(9), 3-13. (Reprinted in Educational Researcher, 34(5), 14-24, 2005)

 

Chatterji, M. (2002). Models and methods for examining standards-based reforms:  Have the tools of inquiry answered the pressing questions on improving schools? Review of Educational Research, 72(3), 345-386.

 

Chatterji, M. (2002). Measuring leader perceptions of school readiness for standards-based reforms and accountability. Journal of Applied Measurement, 3(4), 455-485.

 

Chatterji, M., Sentovich, C., Ferron, J., & Rendina-Gobioff, G. (2002). Using an iterative validation model to conceptualize, pilot-test, and validate scores from an instrument measuring teacher readiness for educational reforms. Educational and Psychological Measurement, 62, 442-463.

 

BOOKS AND EDITED VOLUMES

 

Chatterji, M., & Welner, K. G. (2014). (Eds.). Validity, assessment and accountability: Contemporary issues in primary, secondary, and higher education [Special issue]. Quality Assurance in Education, 22(1).

 

Chatterji, M. (2013). (Ed.). Validity and test use: An international dialogue on educational assessment, accountability, and equity. Bingley, UK: Emerald Group Publishing Limited.

 

Chatterji, M. (2013). (Ed.). When education measures go public: Stakeholder perspectives on how and why validity breaks down. Teachers College Record, 115 (9).

 

Chatterji, M. (2003). Designing and Using Tools for Educational Assessment. Boston, MA: Allyn & Bacon/Pearson. 

professional experiences

2006-present:           
Associate Professor of Measurement, Evaluation, and Education
Director, Assessment and Evaluation Research Initiative (AERI)
Program of Social-Organizational Psychology, Dept. of Organization and Leadership,
Teachers College, Columbia University
 
 
2001-2005:             
Associate Professor of Measurement, Evaluation, and Education (Pre-tenure; reappointed in 2003)
Dept. of Human Development,
Teachers College, Columbia University.
 
 
1996-2000:
Assistant Professor, Department of Educational Measurement and Research,
College of Education,
University of South Florida.
 
 
1988-1995:
Supervisor, Research and Evaluation Services
District School Board of Pasco County, Florida                                   

honors and awards

Provost’s Investment Fund Award, 2010-12, Teachers College, Columbia University.
"Building capacity at home and abroad:  A proposal for rotating institutes and conferences to generate and disseminate cutting-edge knowledge in the assessment and evaluation sciences."

 

Fulbright Research Scholar, 2007-08
"A study of gender equity in primary education in Bengali-speaking regions of India and Bangladesh: Access, opportunities, and factors affecting school outcomes and completion rates."

 

Outstanding Publication Award, American Educational Research Association, 2004.
"Evidence of what works: An argument for Extended-term Mixed Methods (ETMM) designs"
 (Lead article Educational Researcher in 2004; reprinted in 2005).

 

Outstanding Reviewer, 2006
Publications Committee, Educational Researcher, American Educational Research Association.

 

Distinguished Paper Award:  Florida Educational Research Association, 1993.
"Examining dimensionality of data generated from an early childhood scale using Rasch analysis and confirmatory factor analysis"

(Lead Article in the Journal of Outcome Measurement in 1995).

 

Fellow, National Education Policy Center, University of Colorado at Boulder, 2006-present.
Previously Fellow, Education Policy Research Unit (EPRU), Arizona State University.

 

Creative Scholarship Award, University of South Florida, 1999.

 

Instructional Technology Award, University of South Florida, 1997.

 

Elected Member, Phi Kappa Phi (Academic Honor Society), University of South Florida, 1986.

 

Elected Member, Delta Kappa Gamma (Academic Honor Society for Educators), 1987.

biographical information

MADHABI CHATTERJI

Madhabi Chatterji, Ph.D., is Associate Professor of Measurement, Evaluation, and Education, and the founding director of the Assessment and Evaluation Research Initiative at Teachers College (TC), Columbia University, a center dedicated to promoting meaningful use of assessment and evaluation information, across disciplines and internationally.

Professor Chatterji's research and teaching interests lie broadly in assessment and evaluation methodology, as applied to practice and policy in education, health, and other applied fields. Her publications focus on instrument design, validation, and validity issues; evidence standards, the "evidence debate," and evidence synthesis methods; educational equity and standards-based educational reforms; and diagnostic classroom assessment models. A Fulbright Research Scholar in 2008, she is also studying issues of access, quality, and comprehensiveness of primary education through a collaboration with the Columbia Global Center-South Asia.

As an individual researcher or as AERI's director, she serves as Principal Investigator (PI) or Co-PI on a number of ongoing research partnerships and projects, supported by competitive research grants or sponsorships from the National Science Foundation (2005-08; 2012-13), the Stemmler Fund of the National Board of Medical Examiners (2006-10), the U.S. Department of Health & Human Services (2010-12), the Educational Testing Service (2012), the Howard Hughes Medical Institute (2013-17), and private foundations in the U.S. and abroad.

Dr. Chatterji is the author, co-author, or editor of two books and 50 refereed publications in several top-tier journals, including the Review of Educational Research, Educational Researcher, American Journal of Evaluation, Journal of Educational Psychology, Teachers College Record, and Educational and Psychological Measurement. Her book, Designing and Using Tools for Educational Assessment (2003, Allyn & Bacon/Pearson), presents an iterative process for the design and validation of instruments and measures guided by user contexts. The model was recently applied to develop competency assessments in nationally funded graduate medical education and health information technology programs.

Her most recent publications include When Education Measures Go Public: Why We Need Stakeholder Conversations on Validity (2013, Teachers College Record); Validity and Test Use: An International Dialogue on Assessment, Accountability and Equity (2013, Emerald, UK); and L.E.A.D.: A framework for evidence gathering and use for prevention of obesity and other complex public health problems (2014, Health Education and Behavior). Since 2014, she has been co-editor of Quality Assurance in Education.

Earlier publications, in which she recommends systems-based, mixed-methods designs for gauging the effectiveness of field interventions and the complementary use of classical and modern measurement techniques for validating measures, have been recognized by the American Educational Research Association (Outstanding Publication Award, 2004) and the Florida Educational Research Association (Distinguished Paper Award, 1993).

Dr. Chatterji has served on numerous national and international advisory panels and review boards, including an expert consensus committee convened by the Institute of Medicine of the National Academy of Sciences in the U.S. (2008-10), the editorial board of Educational Measurement: Issues and Practice, a leading journal of the National Council on Measurement in Education, and the editorial board of Educational Researcher, a flagship scholarly journal of the American Educational Research Association. Since 2006, she has been a Fellow of the National Education Policy Center.

Prior to joining TC in 2001, Dr. Chatterji served as Assistant Professor in educational measurement and research at the College of Education, University of South Florida (USF, 1996-2000), and as Supervisor, Research and Evaluation Services, at the Pasco County school system in Florida (1988-1995). She completed her doctorate at USF in 1990. She immigrated to the U.S. in 1985 with her two then-young daughters; they are all now settled permanently in the U.S.

Curriculum Vitae

Visit the AERI website: www.tc.columbia.edu/AERI


grants

Subcontract with the International Medical Corps (IMC), 2013-14. Evaluating comprehensive mental health and psychosocial support services for vulnerable refugees. $52,683. Co-Principal Investigator (with the Department of Clinical and Counseling Psychology, TC).

 

Howard Hughes Medical Institute (HHMI) project at Barnard College, 2013-2017. $31,998. Four-year subcontract to support the Hughes Science Pipeline Project for middle schools. Principal Investigator.

 

National Science Foundation (NSF) REESE Award, 2012-2013. $148,747. Title: Improving validity at the nexus of assessment design and use: A proposal for an international conference and capacity-building institute in assessment and evaluation. Principal Investigator.

 

Educational Testing Service (ETS), 2011-12. $52,200. Co-sponsorship of AERI's inaugural conference on March 28-29, 2012. Title: Educational assessment, accountability and equity: Conversations on validity around the world.

 

Office of the National Coordinator, United States Department of Health and Human Services, Washington, D.C. Curriculum Development Center award to the Department of Biomedical Informatics, Columbia University (approx. $1.2 million). Subcontract for the development and validation of educational assessments in health information technology and for designing a program evaluation protocol, 2010-2012: $204,000. Co-Principal Investigator with the Department of Health and Behavior Studies, TC.

 

The Nand and Jeet Khemka Foundation, India, 2008-2010. The Global Education and Leadership Foundation's Life Skills and Leadership Programme: Development of curriculum-based assessments, formative evaluation of pilot programs, and organizational capacity-building in assessment and evaluation. $754,000 (approx. $350,000 for the assessment and evaluation components).  Co-Principal Investigator (with the Department of Arts and Humanities, TC).

 

Fulbright Research Award, 2007-08, Competition #7410. Center for International Exchange of Scholars, Washington D.C., June 2007. $13,837. Principal Investigator.

 

Stemmler Fund of the National Board of Medical Examiners (NBME), in collaboration with the Center for Educational Research and Evaluation, Columbia University College of Physicians and Surgeons, April 2006. $145,000. Title: Designing cognitive measures of practice-based learning and improvement as an iterative process combining Rasch and classical measurement methods. Co-Principal Investigator.

 

Community Foundation of Elmira/Corning/Finger Lakes areas. The Chemung County School Readiness Studies. $94,000 over a 3 year period (2006-09) granted to AERI. Principal Investigator.

 

National Science Foundation (NSF) EREC Solicitation 03-542, November, 2004-2008. $501,925. Title: Improving mathematics achievement in middle school students with systemic use of proximal assessment data. Principal Investigator.

 

U.S. Department of Education. School-based mentoring programs:  Evaluation of long-term effects on adolescents in the Peekskill School District, NY. Sub-contract with Family Services of Westchester, N.Y. Funded in 2004; project duration January, 2005-December, 2007. $10,000 per year. Principal Investigator.

 

Carnegie Learning Corporation-Cognitive Tutor evaluation. Contract to evaluate the effects of the Cognitive Tutor mathematics program at 13 Brooklyn schools 2003-2004. $20,155. Principal Investigator.

 

Kumon North America, Inc. Grant to evaluate the effects of the Kumon supplementary math and reading programs at P.S. 180 in the Chancellor's District, New York. (2001-2002). $28,750. Principal Investigator.

 

Pinellas County Schools, Florida, Goals 2000 project, 1999-2001. Data-based decision-making in the classroom. Grant to develop a training manual in statistical analysis and use of assessment data for educational decision-making. $29,000. Principal Investigator

 

Bureau of Teacher Education, Florida Department of Education, 1999. Readiness for Statewide Assessment Reforms and its Influence on School Practices and Outcomes. Grant to conduct a large scale survey to evaluate needs related to state-initiated reforms in nine Florida school districts. $25,000. Principal Investigator.

 

University of South Florida, Division of Sponsored Research, Creative Scholarship Grants Competition, March, 1999. Readiness for Statewide Assessment Reforms and Influences on School Practices and Outcomes. $7,500. Principal Investigator.

 

University of South Florida, Center for Teaching Enhancement, March, 1997. Designing and Validating Educational Assessments: A Computer-based Module. Instructional Technology Grants Competition.  $7,500. Principal Investigator.

 

Florida Department of Education-Bureau of Curriculum, Instruction, and Assessment. Developing Teacher-friendly Guides for Assessing Florida's Goal 3 Standards. Invitational grant awarded to the Pasco County School System, March, 1995. $67,000. Project Leader and Primary Author.

principal publications

PEER-REVIEWED BOOKS

 

Chatterji, M. (2013). (Ed.). Validity and test use: An international dialogue on educational assessment, accountability, and equity. Bingley, UK: Emerald Group Publishing Limited. Available at http://www.emeraldinsight.com/products/books/notable/page.htm?id=9781781909461.

 

Chatterji, M. & Welner, K. G. (2014). (Eds.). Validity, assessment and accountability: Contemporary issues in primary, secondary, and higher education [Special issue]. Quality Assurance in Education, 22 (1). Available at www.emeraldinsight.com/0968-4883.htm.

 

Chatterji, M. (2003). Designing and Using Tools for Educational Assessment. Boston, MA: Allyn & Bacon/Pearson. Copyright transferred to author in 2009.

 

 

REFEREED JOURNAL ARTICLES AND CHAPTERS

 

Chatterji, M., Green, L. W., & Kumanyika, S. (2014). L.E.A.D.: A framework for evidence gathering and use for the prevention of obesity and other complex public health problems. Health Education & Behavior, 41(1), 85-99.

  

Chatterji, M. (2013). (Ed.). When education measures go public: Stakeholder perspectives on how and why validity breaks down. Teachers College Record, 115 (9).

 

Chatterji, M. (2013). Global forces and educational assessment: A foreword on why we need an international dialogue on validity and test use. In M. Chatterji (Ed.), Validity and test use: An international dialogue on educational assessment, accountability, and equity (pp. 1-14). Bingley, UK: Emerald Group Publishing Limited.


Chatterji, M. (2013). Bad tests or bad test use? A case of SAT use to examine why we need stakeholder conversations on validity. Teachers College Record, 115 (9), 1-7. Released online in June, 2013.

 

Wyer, P. W., & Chatterji, M. (2013). Designing outcome measures for the accreditation of medical education programs as an iterative process combining classical test theory and Rasch measurement. The International Journal of Educational and Psychological Assessment, 13 (2), 35-61. Released online in August, 2013. Available at http://faculty.tc.columbia.edu/upload/mb1434/PBLI-EBM[TIJEPA]11.14.12.pdf.

 

Chatterji, M. (2012). Validation of a community-based comprehensive index of school readiness for entering kindergartners [Special issue on Early Childhood Assessments]. The International Journal of Educational and Psychological Assessment, 10(1), 6-34.

 

Chatterji, M. (2012). Development and validation of indicators of teacher proficiency in diagnostic classroom assessment [Special issue on Teacher Assessments]. The International Journal of Educational and Psychological Assessment, 9(2), 4-25.

 

Chatterji, M. (2010). Evaluation methodology. In P. Peterson, E. Baker, & B. McGaw (Eds.), International Encyclopedia of Education (Vol. 3, pp. 735-745). Oxford: Elsevier.

 

Chatterji, M., Koh, N., Choi, L., & Iyengar, R. (2009). Closing learner gaps proximally with teacher-mediated diagnostic classroom assessment. Research in the Schools, 16(2), 60-77.

 

Chatterji, M. (2009). Enhancing impact evidence on how global educational initiatives work: Theory, epistemological foundations, and guidelines for applying multiphase, mixed method designs. In K. B. Ryan & J. B. Cousins (Eds.), The SAGE International Handbook of Educational Evaluation (pp. 92-111). Thousand Oaks, CA: Sage Publications.

 

Chatterji, M., Koh, N., & Iyengar, R. (2009). Logic models to support the empirical study of comprehensive education constructs. In E. W. Gordon & H. Varenne (Eds.), Theoretical Perspectives in Comprehensive Education (Vol. II). New York, NY: Mellen Press.

 

Chatterji, M., Graham, M. J., & Wyer, P. W. (2009). Mapping cognitive overlaps between practice-based learning and improvement and evidence-based medicine: An operational definition for assessing resident physician competence. Journal of Graduate Medical Education, 1(2), 287-298.

Graham, M. J., Naqvi, N., Harding, K., Encandela, J. A., & Chatterji, M. (2009). Systems-based practice defined: Taxonomy development and role identification for competency assessment of residents. Journal of Graduate Medical Education, 1(1), 49-60.


Chatterji, M. (2008). Synthesizing evidence from impact evaluations in education to inform action: Comments on Slavin. Educational Researcher, 37(1), 23-26.

 

Chatterji, M. (2007). Grades of evidence: Variability in quality of findings in effectiveness studies of complex field interventions. American Journal of Evaluation, 28(3), 3-17.

Chatterji, M. (2006). Reading achievement gaps, correlates and moderators of early reading achievement: Evidence from the Early Childhood Longitudinal Study (ECLS) kindergarten to first grade sample. Journal of Educational Psychology, 98(3), 489-507.

 

Chatterji, M., Kwon, Y. A., Paczosa, L., & Sng, C. (2006). Gathering evidence on an after-school supplemental instruction program: Design challenges and early findings in light of NCLB. Education Policy Analysis Archives, 14(12).

 

Chatterji, M. (2005). Applying the Joint Committee's 1994 standards in international contexts: A case study of educational evaluations in Bangladesh [Special issue on New Perspectives in Program Evaluation]. Teachers College Record, 107(10), 2373-2400.

 

Chatterji, M. (2005). Achievement gaps and correlates of early mathematics achievement: Evidence from the ECLS K-first grade sample. Education Policy Analysis Archives, 13(46).

Chatterji, M. (2004). Evidence on "what works": An argument for extended-term mixed method (ETMM) evaluation designs. Educational Researcher, 33(9), 3-13. (Reprinted in Educational Researcher, 34(5), 14-24, 2005).


Chatterji, M. (2002). Models and methods for examining standards-based reforms and accountability: Have the tools of inquiry answered pressing questions on improving schools? Review of Educational Research, 72(3), 345-386.


Chatterji, M. (2002). Measuring leader perceptions of school readiness for reforms: Use of an iterative model combining classical and Rasch methods. Journal of Applied Measurement, 3(4), 455-485.

 

Chatterji, M., Sentovich, C., Ferron, J., & Rendina-Gobioff, G. (2002). Using an iterative model to conceptualize, pilot-test, and validate scores from an instrument measuring teacher readiness for educational reforms. Educational and Psychological Measurement, 62, 442-463.

Ganguly, R. & Banerji, M. (2000). Hepatitis B virus infection and vaccine acceptance among university students. American Journal of Health Behavior, 24(2), 96-107. 

 

Banerji, M. (2000). Construct validity of scores/measures from a developmental assessment in mathematics using classical and many-facet Rasch measurement. Journal of Applied Measurement, 1(2), 177-198. 

Banerji, M., Anderson, R. H., & Kerstyn, C. (2000). Designing assessment systems for nongraded environments: Philosophical foundations and classroom applications. National Forum of Teacher Education Journal, 10(2), 19-39.


Banerji, M. (1999). Validation of scores/measures from a K-2 developmental assessment in mathematics. Educational and Psychological Measurement, 59(4), 694-715.

 

Banerji, M. & Ferron, J. (1998). Construct validity of a developmental assessment made up of mathematical patterns tasks. Educational and Psychological Measurement, 58(4), 634-660.  

 

Ganguly, R. & Banerji, M. (1998). Factors affecting influenza vaccination practices in public clinics. Florida Journal of Public Health, 10(1), 17-21.  

 

Banerji, M., Smith, R.M., & Dedrick, R. F. (1997). Dimensionality of an early childhood scale using Rasch analysis and confirmatory factor analysis. Journal of Outcome Measurement, 1(1), 56-86.

 

Banerji, M. & Dailey, R.A. (1994). A study of the effects of an inclusion model on students with specific learning disabilities. Journal of Learning Disabilities, 28(8), 511-522.  

 

Rushton, T. C., Ganguly, R., Sinnott, J. T., & Banerji, M. (1994). Barriers to immunization: An examination of factors that influence the application of pneumococcal vaccine by house staff. Vaccine, 12(13), 1173-1179.

 

Banerji, M. & Malone, P. (1993). Effects of a multi-agency intervention program on at-risk middle school students. ERS Spectrum: Journal of School Research and Information, 11(4), 3-12.   

         

Pearson, C.L. & Banerji, M. (1993). Effects of a ninth-grade dropout prevention program on student academic achievement, school attendance, and dropout rate. Journal of Experimental Education, 61(3), 247-256.

  

Banerji, M. (1992a). Factor structure of the Gesell School Readiness Screening Test. Journal of Psychoeducational Assessment, 10(4), 342-354. 

Banerji, M. (1992b). An integrated study of the predictive properties of the Gesell School Readiness Screening Test. Journal of Psychoeducational Assessment, 10(3), 240-256.   


 

POLICY BRIEFS AND OTHER ARTICLES

 

Chatterji, M. (2010). Review of "Closing the Racial Achievement Gap: Learning from Florida's Reforms". Boulder, CO: National Education Policy Center. Available at http://nepc.colorado.edu/thinktank/learnin-from-florida.

 

Chatterji, M. (2005, April). Closing Florida's achievement gaps: Florida Institute of Education (FIE) Policy Brief 4. Jacksonville, FL: Florida Institute of Education at the University of North Florida.

 

Chatterji, M. (2004, April). Good and bad news about Florida student achievement: Performance trends on multiple indicators since passage of the A+ legislation. Education Policy Research Unit Policy Brief, Doc. No. EPSL-0401-105-EPRU. Tempe, AZ: Education Policy Studies Laboratory.

 

Chatterji, M. (2001). Review of Empowerment Evaluation by David Fetterman. Evaluation and Program Planning. 

 

Banerji, M. & Hutinger, C.P. (1993). Managing classroom assessments: A computer-based solution for classroom teachers.  Florida Educational Research Council: Research Bulletin, 25(1), 31-43.


 

 

professional organization membership

SERVICE IN PROFESSIONAL ORGANIZATIONS
           
American Evaluation Association (AEA): Member, 2000-present
Journal article reviewer, American Journal of Evaluation (ongoing)
Conference participant (ongoing) and International Topical Interest Group (TIG) ambassador.
 
American Educational Research Association (AERA): Member, 1986-present
Conference Proposal Reviewer for Divisions D and H: 1996, 1997, 1998, 1999, 2000, 2005
Division H Evaluation Report Judging Panel, 2004
Session Discussant, 2005
Journal article reviewer, American Educational Research Journal, Educational Evaluation and Policy Analysis, Educational Researcher (topical, ongoing)
Editorial Board Member, Educational Researcher, 2006-09.
 
Eastern Evaluation Research Society (EERS), an affiliate of the American Evaluation Association
Member, 2002-present
Member of the Board, 2004-2006
Annual Conference Program Committee Member 2005-2006.
 
Florida Educational Research Association (FERA):  Member, 1986-2000
Chair of Researcher of the Year Committee, 1997
Member of Researcher of the Year Committee, 1996
Chair of Professional Development and Training Committee, 1993
Conference proposal reviewer and/or discussant 1991-1997, 1999
Nominated as candidate for president, 1997           
 
National Council on Measurement in Education (NCME):  Member, 1987-present
Member, Nominations Committee 1993
Proposal reviewer for NCME conferences, 1996, 1997, 1998, 1999
 
Florida Educational Research Council (FERC):  
Member, 1990-1995, by appointment of the Superintendent, Pasco County Schools
Member of the Conference Planning Committee, 1993
Nominated and elected Treasurer, 1994
Nominated President-Elect, 1995 (Did not serve as President due to move to USF)
 
Florida Public Health Association (FPHA):  Member, 1999-2000

ORLJ 4009: Understanding behavioral research

Overview of alternative methods of behavioral research and their relative strengths and limitations. Application of methodological principles to read and evaluate social science research and to begin conducting research.

ORL 5522: Evaluation Methods I

Provides an overview of major evaluation models and social research methods useful in developing, monitoring, and studying the effects of programs, services, and institutions in education, health, and other fields. This is the second course in a three-course sequence in assessment and evaluation methods offered through the Organization and Leadership department. The prerequisite is the 4000-level course on testing, assessment, and accountability or an instructor-approved substitute. Offered twice annually.

ORL 5523: Evaluation Methods II--Seminar

This evaluation research seminar, conducted in actual client contexts, provides laboratory and field experiences in the planning, design, execution, and reporting of various components of evaluations. It is the third and culminating course in a three-course sequence in assessment and evaluation methods offered through the Organization and Leadership department. The prerequisite is ORL 5522, Evaluation Methods I, or an instructor-approved substitute. Offered once biennially, typically in the fall.

ORL 5524: Instrument design and validation--Seminar

Provides hands-on seminar experiences in the design and validation of instruments to measure educational, psychological, health, and social constructs. The type of instrument can vary according to student interests (e.g., multi-part surveys, attitude scales, behavior rating scales, performance assessments, or tests of cognitive abilities and achievement). The prerequisites are intermediate-level courses in measurement/statistics or instructor-approved substitutes. Offered once biennially, typically in the fall.

Documents & Papers

Download: Vita2014 [PDF]

Centers and Projects

Assessment and Evaluation Research Initiative
Website: http://www.tc.columbia.edu/aeri/


Madhabi Chatterji appeared in the following articles:

TC's Chatterji Co-Hosts EdWeek Blog on Assessment (3/13/2014)

Expanding TC's Global Reach (2/19/2014)

TC's Chatterji: "Multiple-Choice Tests Not Appropriate for Kindergartners" (10/11/2013)

TC's Chatterji: "More Diverse Students Aspire Toward College" (9/27/2013)

A History Of Evaluation (6/26/2013)

TC at AERA 2013 (4/26/2013)

Academic Festival 2013: The Sessions (4/23/2013)

TC's Global Engagement: A Sampling of Past, Present and Future Programs (4/22/2013)

EPSA at AERA 2013 (4/15/2013)

2012 Year in Review (2/20/2013)

Assessing Assessments — and Assessment Use (5/15/2012)

Previewing TC at AERA (3/19/2012)

International Conference Will Examine Issues of Validity, Educational Assessment, Equity and Accountability (3/14/2012)

Columbia and TC Cited for Fulbright Excellence (11/8/2011)

Menghan Shen: Marshaling the Power of Her Peers (10/10/2011)

Staff news (3/29/2011)

Heritage Foundation Responds to Chatterji's Critique of Florida Reforms (1/1/2011)

Go Slow on Following Florida Reforms, TC's Chatterji Says (12/1/2010)

Changing the Paradigm in Obesity Studies (6/24/2010)

Chatterji Part of Prestigious Committee Recommending Major Change in Thinking About Obesity Studies (5/3/2010)

A Letter From the President (4/1/2010)

TC to Partner with Turkish University (2/9/2009)

Bottling the Magic (12/23/2008)

Developing Young Leaders in India (4/28/2008)

A New Way to Troubleshoot Student Learning (4/2/2008)

New Diagnostic Assessment Method Boosts Math Achievement in 5th and 6th Graders; Suggests New Testing Paradigm (4/2/2008)

TC at AERA, 2008 (3/25/2008)

TC's Equity Campaign Teams with the Harlem Children's Zone (3/1/2007)

Pressure Increases As Testing Season Nears (1/8/2007)

Winter break may melt away (12/15/2006)

Equity Update (9/26/2006)

Campaign Announces New Assessment and Evaluation Research Initiative (8/2/2006)

Equity Campaign Announces New Assessment and Evaluation Research Initiative (7/21/2006)

Mixing Methods to Learn 'What Works' (7/13/2005)

Research in 2004 (5/27/2005)

Introducing Peer Review (4/1/2005)

Data Used More and More to Improve Pedagogy (1/6/2005)

Chatterji Talks about Making and Taking Tests (3/1/2003)

In Brief (2/1/2003)

Safety and Security Committee at TC (11/1/2001)

Faculty Adds Thirteen in Fall Semester, Two in Spring (9/1/2000)