Madhabi Chatterji

Professional Background

Educational Background

1990    Ph.D., University of South Florida, Tampa, Florida

1980    M.Ed., St. Christopher's College, University of Madras, Madras, India

1975    B.Ed., University of Bombay, Bombay, India

1973    B.Sc. (Honors), Lady Brabourne College, University of Calcutta, W.B., India

Scholarly Interests

Selected Publications

Note: Professor Chatterji published as Madhabi Banerji from 1990-2000 and as Madhabi Chatterji from January 2001 to the present. Articles are organized by scholarly interest area.


BOOKS

Chatterji, M. (Ed.). (2013). Validity and test use: An international dialogue on educational assessment, accountability, and equity. Bingley, UK: Emerald Group Publishing Limited.

Chatterji, M. (2003). Designing and Using Tools for Educational Assessment. Boston, MA: Allyn & Bacon/Pearson.

Chatterji, M. (Forthcoming). Assessment design: A user-centered methodology for multidisciplinary constructs and applications. New York, NY: Guilford Publishers.


REFEREED ARTICLES

Assessment design, construct  validation, validity issues

Popham, W. J., Berliner, D.C., Kingston, N., Fuhrman, S.H., Ladd, S.M., Charbonneau, J., & Chatterji, M. (2014). Can today's standardized tests yield instructionally useful data? Challenges, promises and the state of the art. Quality Assurance in Education, 22(4), 300-315.

Pizmony-Levy, O., Harvey, J., Schmidt, W., Noonan, R., Engel, L., Feuer, M.J., Santorno, C., Rotberg, I., Ash, P., Braun, H., Torney-Purta, J., & Chatterji, M. (2014). On the merits of, and myths about, international assessments. Quality Assurance in Education, 22(4), 316-335.

Gordon, E. W., McGill, M.V., Sands, D.I., Kalinich, K., Pellegrino, J.W., & Chatterji, M. (2014). Bringing formative assessment to schools and making it count. Quality Assurance in Education, 22(4), 336-350.

Chatterji, M. (2013). Bad tests or bad test use? A case of SAT® use  to examine why we need stakeholder conversations on validity. Teachers  College Record, 115 (9), 1-7.

Wyer, P. W. & Chatterji, M. (2013). Designing outcome measures for  the accreditation of medical education programs as an iterative process  combining classical test theory and Rasch measurement. The International  Journal of Educational and Psychological Assessment, 13 (2), 35-61.

Chatterji, M. (2013). Global forces and educational assessment: A  foreword on why we need an international dialogue on validity and test use. In  M. Chatterji (Ed.), Validity and test use: An international dialogue on  educational assessment, accountability, and equity (pp. 1-14).  Bingley, UK: Emerald Group Publishing.

Chatterji, M., Sentovich, C., Ferron, J., & Rendina-Gobioff, G. (2002). Using an iterative validation model to conceptualize, pilot-test, and validate scores from an instrument measuring Teacher Readiness for Educational Reforms. Educational and Psychological Measurement, 62, 442-463.

Banerji, M.  (1999). Validation of scores/measures from a K-2 developmental assessment in  mathematics. Educational and Psychological Measurement, 59 (4), 694-715.

Banerji, M.  & Ferron, J. (1998). Construct validity of a developmental assessment made  up of mathematical patterns tasks. Educational and Psychological Measurement, 58 (4), 634-660.

Banerji, M.,  Smith, R.M., & Dedrick, R. F. (1997). Dimensionality of an early childhood  scale using Rasch analysis and confirmatory factor analysis. Journal of  Outcome Measurement, 1 (1),  56-86.

Evidence standards, the “evidence debate” and  evaluation methods

Chatterji, M., Green, L.W., & Kumanyika, S. (2014). L.E.A.D.: A framework for evidence gathering and use for the prevention of obesity and other complex public health problems. Health Education & Behavior, 41(1), 85-99.

Chatterji, M. (2008). Synthesizing evidence from impact evaluations in education to inform action: Comments on Slavin. Educational Researcher, 37(1), 23-26.

Chatterji, M. (2007). Grades of Evidence: Variability in quality of  findings in effectiveness research on complex field interventions. American  Journal of Evaluation, 28(3), 3-17.

Chatterji, M. (2009). Enhancing scientific evidence on how global educational initiatives work: Theory, epistemological foundations, and guidelines for applying multi-phase, mixed methods designs. In K.B. Ryan & J. B. Cousins (Eds.), The SAGE International Handbook of Educational Evaluation (pp. 92-111). Thousand Oaks, CA: Sage Publications.

Chatterji, M. (2010). Evaluation methodology. In P. Peterson, E. Baker, and B. McGaw (Eds.), International Encyclopedia of Education, Volume 3 (pp. 735-745). Oxford: Elsevier.

Chatterji, M. (2004). Evidence on "what works": An argument  for extended-term mixed method (ETMM) evaluation designs. Educational  Researcher, 33(9), 3-13. (Reprinted in Educational Researcher, 34(5),  14-24, 2005)

Chatterji, M. (2005). Applying the Joint Committee's 1994 standards in  international contexts: A case study of educational evaluations in  Bangladesh [Special Issue on New Perspectives in Program Evaluation]. Teachers  College Record, 107 (10), 2373-2400.

Banerji, M.  & Dailey, R.A. (1994). A study of the effects of an inclusion program for elementary  students  with specific learning disabilities. Journal  of Learning Disabilities, 28 (8),  511-522.

     
Standards based reforms and educational  equity

Chatterji, M. (2006). Reading achievement  gaps, correlates and moderators of early reading achievement: Evidence  from the Early Childhood Longitudinal Study (ECLS) kindergarten to  first grade sample. Journal of Educational Psychology, 98(3),  489-507. 

Chatterji, M. (2005). Achievement gaps and correlates of early  mathematics achievement: Evidence from the ECLS K-first grade sample. Educational  Policy Analysis Archives, 13(46). 

Chatterji,  M., Kwon, Y.A., Paczosa, L., & Sng, C. (2006). Gathering  evidence on an after-school supplemental instruction program: Design  challenges, lessons, and early findings in light of NCLB. Educational Policy Analysis Archives, 14 (12).
 
Chatterji, M. (2002). Models and methods for examining  standards-based reforms:  Have the tools of inquiry answered the pressing  questions on improving schools? Review of Educational Research,  72(3), 345-386.

Chatterji, M. (2002) Measuring leader perceptions of school readiness  for standards-based reforms and accountability. Journal of Applied  Measurement, 3(4), 455-485.


Diagnostic classroom assessment

Chatterji, M. (2012). Development and  validation of indicators of teacher proficiency in diagnostic classroom  assessment [Special issue on Teacher Assessments]. The International  Journal of Educational and Psychological Assessment, 9(2), 4-25.

Chatterji, M., Koh, N., Choi, L., & Iyengar, R. (2009). Closing  learner gaps proximally with teacher-mediated diagnostic assessment. Research  in the Schools, 16(2), 60-77.

EDITED VOLUMES

Chatterji, M. (2014) (Guest Ed.). Assessment, accountability and quality issues. Quality Assurance in Education, 22(4). Special Issue.

Chatterji, M. (2013) (Guest Ed.). When education measures go public: Stakeholder perspectives on how and why validity breaks down. Teachers College Record, 115(9).

Chatterji, M. & Welner, K. G. (2014) (Guest Eds.). Validity, assessment and accountability: Contemporary issues in primary, secondary, and higher education. Quality Assurance in Education, 22(1). Special Issue.


OP-ED ARTICLES, REVIEWS AND BLOGS

Chatterji, M. (2014). Let's mend, not end, educational testing. Education Week, Issue 24. Published in print on March 12, 2014.

Chatterji, M. (2014). Validity, test use, and consequences: Pre-empting a persistent problem. In Assessing the Assessments: K-12 Measurement and Accountability in the 21st Century at Education Week's blog site on March 17, 2014: http://blogs.edweek.org/edweek/assessing_the_assessments.

Chatterji, M. (2014). Formative classroom assessment and assessment for accountability: Finding a balance. In Assessing the Assessments: K-12 Measurement and Accountability in the 21st Century at Education Week's blog site on May 16, 2014: http://blogs.edweek.org/edweek/assessing_the_assessments.

Chatterji, M. (2010). Review of "Closing the Racial Achievement Gap: Learning from Florida's Reforms." Boulder, CO: National Education Policy Center. Available at http://nepc.colorado.edu/thinktank/learnin-from-florida.

Professional Experiences


2015-present:
  Professor of Measurement, Evaluation, and Education
  Director, Assessment and Evaluation Research Initiative (AERI at www.tc.edu/aeri)
  Program of Social-Organizational Psychology, Dept. of Organization and Leadership,
  Teachers College, Columbia University

2006-2015:
  Associate Professor of Measurement, Evaluation, and Education
  Director, Assessment and Evaluation Research Initiative (AERI)
  Program of Social-Organizational Psychology, Dept. of Organization and Leadership,
  Teachers College, Columbia University

2001-2005:
  Associate Professor of Measurement, Evaluation, and Education (Reappointed in 2003; tenured in 2005)
  Dept. of Human Development,
  Teachers College, Columbia University

1996-2000:
  Assistant Professor, Department of Educational Measurement and Research,
  College of Education,
  University of South Florida

1988-1995:
  Supervisor, Research and Evaluation Services
  District School Board of Pasco County, Florida

Biographical Information

Madhabi Chatterji, Ph.D., is Professor of Measurement, Evaluation, and Education, and the founding director of the Assessment and Evaluation Research Initiative (AERI) at Teachers College (TC), Columbia University, a center dedicated to promoting meaningful use of assessment and evaluation information across disciplines and internationally (see www.tc.edu/aeri).

Professor  Chatterji’s general research and teaching interests lie broadly in assessment  and evaluation methodology, as applied to practice and policy in education,  health, psychology and other applied fields. Her publications focus on  instrument design, construct validation and validity issues; evidence  standards, the “evidence debate” and improving methods for evidence gathering, synthesis,  and use; educational equity and standards-based educational reforms; and  diagnostic classroom assessment.

A Fulbright  Research Scholar in 2008, she is also interested in issues of equity, quality,  and comprehensiveness of primary education in South Asia. Her past  publications—where she recommends the use of systems-based, mixed-methods  designs for assessing the effectiveness of complex social interventions and  complementary use of classical and modern measurement techniques for validating  measures—have been recognized by the American Educational Research Association  (Outstanding Publication Award, 2004) and the Florida Educational Research  Association (Distinguished Paper Award, 1993).

As an  individual researcher or as AERI’s director, she serves as Principal  Investigator (PI) or Co-PI on a number of ongoing research partnerships and collaborative  projects, supported by competitive research grants or sponsorships from the  National Science Foundation (2005-08; 2012-13), the Stemmler Fund of the  National Board of Medical Examiners (2006-10), the U.S. Department of Health  & Human Services (2010-12); the Educational Testing Service (2012); the  Howard Hughes Medical Institute (2013-17), and private foundations in the U.S.  and abroad.

Dr. Chatterji is the author/co-author/editor of two books and over 50 refereed publications in several top-tier journals, including the Review of Educational Research, Educational Researcher, Journal of Educational Psychology, American Journal of Evaluation, Teachers College Record, Journal of Learning Disabilities, and Educational and Psychological Measurement. Her first book, Designing and using tools for educational assessment (2003, Allyn & Bacon/Pearson), presents an iterative process for the design and validation of instruments/measures guided by user contexts. The model was applied recently for developing competency assessments in nationally-funded graduate medical education and health information technology programs.

Recent  publications include: When Education Measures Go Public—Why We Need Stakeholder  Conversations on Validity (Teachers College Record); Validity and Test Use: An International  Dialogue on Assessment, Accountability and Equity (Emerald, UK); and  L.E.A.D.: A framework for evidence gathering and use for prevention of obesity  and other complex public health problems (2014, Health Education and  Behavior). Currently, she is the co-editor of the international evaluation  journal, Quality Assurance in Education.

Dr. Chatterji has served on numerous national and international advisory panels and review boards, including an expert consensus committee convened by the Institute of Medicine, now the National Academy of Medicine (2008-10); the editorial boards of Educational Measurement: Issues and Practice, a leading journal of the National Council on Measurement in Education; and the Educational Researcher, a flagship scholarly journal of the American Educational Research Association. Since 2006, she has been a Fellow at the National Education Policy Center at the University of Colorado at Boulder.

Prior to joining TC in 2001, Dr. Chatterji served as Assistant Professor in educational measurement and research at the College of Education, University of South Florida (1996-2000), and as Supervisor, Research and Evaluation Services at the District School Board of Pasco County, Florida (1988-1995). She received her Ph.D. in 1990 from the University of South Florida.

Madhabi  emigrated to the U.S. in 1985 as a first year doctoral student with her two  then-young daughters. They are all now settled permanently in the U.S.

To learn more about Professor Chatterji's current roles and activities, visit the AERI website at www.tc.edu/aeri.

Professional Presentations

INVITED TALKS

Invited lecture at Calcutta University’s Department of Education, Alipore Campus, Kolkata, India on February 5, 2014. Title: Measures and Correlates of Mathematics Self-efficacy, Mathematics Self-Concept and Mathematics-Anxiety in Elementary Students: An Instrument Design and Validation Study. 

Keynote speech at conference hosted for international clients (Indonesian delegates). December 21, 2012. Pearson Educational Measurement, New York, NY. Title: Validity Considerations with Large Scale Assessments.

Plenary session participant at the International Conference on Educational Measurement and Evaluation. August 10, 2012. Philippine Educational Measurement and Evaluation Association, Manila, Philippines. Title: Teacher Proficiency Indicators in Diagnostic Classroom Assessment.

Keynote speech given on June 16, 2011. International Forum on Talent Cultivation in Higher and Vocational Education held at Ningbo City, China, sponsored by Ningbo Polytechnic Institute and Institute of Higher Education, Xiamen University, China. Title: Talent Development in Higher and Vocational Education using a Diagnostic Classroom Assessment Model.

Invited Lecture at Institute of Higher Education, Xiamen University, China on March 15, 2010. Title: Models of Quality Assessment and Evaluation in Higher Education Systems in the U.S.

United States-India Educational Foundation, Kolkata, India, 60th Anniversary Seminar Series, on February 18, 2010. Title: Gender equity in primary education in West Bengal and Bangladesh: Educational opportunities, achievement outcomes, and school completion rates.

Institute of Medicine (Food and Nutrition Board), The National Academies, January 8, 2009. Invited panelist at open workshop on generating and using evidence effectively in obesity prevention decision-making. Title: Alternatives and tradeoffs in generating and evaluating evidence: Perspectives from education

BRAC-Research and Evaluation Division (RED), Dhaka, Bangladesh on March 10, 2008. Title: Assessing student learning: Building assessment capacity in Bangladesh's schools and education systems. Audience from the Directorate of Primary Education, National Board of Textbook and Curriculum Development, Ministry of Secondary Education, BRAC University-Institute for Educational Development, BRAC Education Programs and BRAC-Research and Evaluation Division.

Invited Lecture at the 12th International and 43rd National Conference of the Indian Academy of Applied Psychology, Kolkata, India on February 7, 2008. Title: Mixed-method designs for studying effects of complex field interventions: Criteria for screening the type and grade of evidence.

Invited talk at BRAC University-Institute for Educational Development, Dhaka, Bangladesh on February 12, 2007. Title: Assessment and evaluation in school organizations.

Psychometric Research Unit, Indian Statistical Institute (ISI), Kolkata, India on January 22, 2007 to ISI faculty and students. Title: Using structural equation modeling to study the internal structure of attitudinal measures.

American Educational Research Association (AERA), Mixed Methods SIG. Inaugural Session at AERA annual meeting on April 7, 2006. Title: Grades of evidence in effectiveness research and how mixed-method designs help: Evaluating the quality of findings against methodological choices of researchers.

Eastern Evaluation Research Society, an affiliate of the American Evaluation Association. Conference closing panel discussion with Grover J. Whitehurst of the Institute for Education Sciences, U.S. Department of Education and Nancy Wolff, Rutgers University, at the annual meeting on April 29, 2006. Title: Rigorous evaluations: Is there a gold standard?

Fordham University, Department of Clinical Psychology and Psychometrics, Colloquium Series. Feb 28, 2004. Title: Designing and validating construct measures using a unified process model.

American Educational Research Association/Institute for Education Sciences Postdoctoral Fellows' Summer Retreat, August 15, 2003. Title: Instrumentation and validity of indicators. Title: Knowledge production through documentation and evaluation.


Eastern Evaluation Research Society-An Affiliate of the American Evaluation Association, April 28, 2003. Paper Title: Models and methods for examining standards-based reforms (Sole author).

Florida Council on Elementary Education, April, 1989. Title: Results of the developmental kindergarten study.

American Association of University Women. Title: Continuous progress: New directions in elementary education at Pasco County (with a panel of administrators from the Pasco County School System and Robert H. Anderson, USF), 1991.           

Invited lecture to the Faculty of Education and the Doctoral Program of Education at Universidad Nacional de Educación a Distancia (UNED) in Madrid, Spain on January 12, 2015. Title: Mixed Methods Evaluations.

Keynote speech at an international education assessment conference organized by the Ministry of Education and Culture and Yogyakarta State University, Indonesia on November 8, 2014. Title: Issues in Implementing Classroom Assessment and the Proximal Assessment for Learner Diagnosis (PALD) Model.

 

NATIONAL AND INTERNATIONAL CONFERENCES

(Refereed or peer-reviewed papers marked **)

A Question of Validity. Panel discussion at the annual meeting of the American Evaluation Association (with M.J. Feuer, A. von Davier, Drew Gitomer, Judith Torney-Purta and Katherine Ryan),  October, 2013 at Washington, D.C. 

Development and Validation of a Health Information Technology (HITECH) Curriculum. Paper presentation at the annual meeting of the American Evaluation Association (with Jennifer Tripken and J.P. Allegrante), October, 2013 at Washington, D.C. 

 

Survey-based non-cognitive measures for young respondents: Tackling errors using a multi-stage validation approach (with Meiko Lin). Paper presented at the annual meeting of the National Council on Measurement in Education, April 16, 2012, at Vancouver, Canada.**

 

Bridging the evidence gap in obesity prevention: A framework for decision-making. Panel presentation with members of Institute of Medicine- Committee on Evidence Frameworks for Decision-making in Obesity Prevention, on November 11, 2010 at the annual conference of the American Evaluation Association.**

 

Proximal Assessment for Learner Diagnosis (PALD): A study of teacher practices and early teacher and student outcomes (first author with Ready, Koh, Choi and Iyengar). Paper presented at the annual meeting of the American Educational Research Association, March, 2008, at New York, NY.**

 

Cognitive pathways in mastering long division: A case study of grade 5-6 learners supported with the Proximal Assessment for Learner Diagnosis (PALD) approach (first author with Koh, Solomon and Everson). Paper presented at the annual meeting of the American Educational Research Association, March, 2008, at New York, NY.**

 

Proximal Assessment for Learner Diagnosis (PALD): Early teacher practices and outcomes of a classroom assessment intervention (with E.W. Gordon). Paper scheduled for presentation at the annual meeting of the American Evaluation Association, November, 2007, at Baltimore, Maryland.**

 

To what extent are mathematics achievement gaps in girls and boys closed with a teacher-delivered model of proximal diagnostic assessment? Paper presented at the national seminar on Gender Issues and the Empowerment of Women at the Indian Statistical Institute, India, Feb 1-3, 2007.**


Grades of evidence: Evaluating the quality of findings against methodological actions in effectiveness research. Paper presented at the annual meeting of the American Educational Research Association in April, 2006, at San Francisco, CA.

 

Designing and validating measures of Teacher Attitudes towards Inclusive Education (TATIE) using an iterative process model (second author with Clare Sng). Paper presented at the annual meeting of the American Educational Research Association in April, 2005, at Montreal, Canada**

 

Monitoring the effectiveness of New York's Written Composition Test in English (WCTE) using multi-facet Rasch measurement (second author with Stephen C. Hetherman). Paper presented at the annual meeting of the American Educational Research Association in April, 2005, at Montreal, Canada**


Documenting classroom processes and early effects of Dynamic Pedagogy: A study in selected elementary classrooms in New York (with E.A. Thomas, E.W. Gordon and other co-authors). Paper presented at the annual meeting of the American Educational Research Association in April, 2005, at Montreal, Canada**


Correlates of early school achievement: School- versus child-level factors that influence reading and mathematics achievement of ethnic minorities/non-minorities in first grade. Paper presented at the annual meeting of the American Educational Research Association in April, 2004, at San Diego, California**


Evidence of what works in education: An argument for extended term mixed-method designs.  Paper presented at the annual meeting of the American Evaluation Association, November, 2004, at Atlanta, Georgia**


Gathering research-based evidence on a supplemental instruction program: A theory-driven quasi-experiment supported with classroom process data. Paper presented at the annual meeting of the American Educational Research Association in April, 2004, at San Diego, California**


Applying the Joint Committee's evaluation standards to international, health, rehabilitation, and education programs (with E. LeBlanc). Paper presented at the annual meeting of the American Evaluation Association, November 9, 2002**


Correlates of early childhood achievement: A comparison of different ethnic groups using the ECLS database. Paper presented at the annual meeting of the American Evaluation Association, November 9, 2002. Supported by the Institute for Urban and Minority Education, Teachers College, Columbia University**


Examining the influences of readiness for state-initiated assessment reforms on school practices and outcomes (with C. Sentovich, J. Ferron, & M. Mele). Paper presented at the annual meeting of the American Educational Research Association in April, 2001, at Seattle, Washington**


Designing district-level assessment systems. Paper presented at the annual meeting of the  American Educational Research Association, Classroom Assessment SIG, in April, 2000, at New Orleans**


Examining construct validity of measures/scores using classical and many-facet Rasch approaches. Paper presented at the annual meeting of the American Educational Research Association in April, 1999, at Montreal, Canada**

Outcomes of schooling accounting for population demographics, risk factors, and instructional program offerings: A path analysis (with E. Paul). Paper presented at the annual meeting of the American Educational Research Association in April, 1999, at Montreal, Canada**

Achievement in grade K-2 classrooms implementing curriculum-based assessment reforms in mathematics. Paper presented at the annual meeting of the National Council on Measurement in Education in April, 1998, at San Diego, California**

Achievement in grade 3-5 classrooms implementing curriculum-based assessment reforms in mathematics. Paper presented at the annual meeting of the American Educational Research Association in March, 1997, at Chicago, Illinois**

Effects of an integrated classroom model on students with specific learning disabilities. Paper presented at the annual meeting of the American Educational Research Association in April, 1994, at New Orleans, Louisiana**

Examining dimensionality of data from an early childhood scale using Rasch analysis and confirmatory factor analysis (with R.M. Smith and R.F. Dedrick). Distinguished paper presentation at the annual meeting of the American Educational Research Association in April, 1994, at New Orleans, Louisiana**

A program evaluation of a multi-agency intervention program for middle school at-risk students.  Paper presented at the annual meeting of the American Educational Research Association in April, 1993, at Atlanta, Georgia**

Predictive properties of the Gesell School Readiness Screening Test in samples from two treatment contexts.  Paper presented at the annual meeting of the American Educational Research Association, Division D in April, 1991 at Chicago, Illinois**

A study of the effects of a ninth grade dropout prevention program: Trends on selected outcomes (with C. Pearson). Paper presented at the annual meeting of the American Educational Research Association, Division H, at Chicago, IL, April, 1991**

A longitudinal study of the effects of a two-year developmental kindergarten on academic achievement. Paper presented at the annual meeting of the American Educational Research Association, Division H, at Boston, MA, April, 1990**

Profile of a developmental kindergarten. Presentation on the evaluation study of the Pasco County Developmental Kindergarten program, made at the annual conference of the National Association for the Education of Young Children in November, 1990 at Washington, DC.**

An assessment of the importance of Joseph Mayer Rice in American educational research. Paper presented at the annual meeting of the American Educational Research Association, SIG on Educational Research, its History, Philosophy and Ethics, at New Orleans, LA, April, 1988.**

 

REGIONAL/LOCAL CONFERENCES

Moderating effects of personal, school, and family protective factors on young children's achievement (with Young Ae Kwon). Paper presented at the Eastern Evaluation Research Society's annual conference, at Absecon, New Jersey in April, 2005.**

An application of multi-faceted Rasch measurement to monitor the effectiveness of the Written Composition Test of English (WCTE) in the New York City Department of Education (with Stephen Hetherman).  Paper presented at the annual conference of the Eastern Educational Research Association, in 2004, at Absecon, New Jersey.

Teacher attitude towards inclusive education: An instrument design and construct validation study. (with C Sng). Paper presented at the annual conference of the Eastern Educational Research Association, in 2004, at Absecon, New Jersey.

Evaluating the effects of a middle school dropout prevention program. Paper presented at the annual conference of the Eastern Educational Research Association in February, 2001, at Hilton Head, South Carolina.**

Influences of readiness for state assessment reforms on school practices and outcomes. Presenting Chair/Organizer (with C. Sentovich, M. Mele, J. Ferron). Papers presented at a symposium on State Assessment Reforms Studies, at the annual meeting of the Florida Educational Research Association, November, 2000, at Tallahassee, Florida.**

Explaining outcomes of schooling using a context-input-process-product framework: Empirical validation of a path model. (With R. Armstrong). Paper presented at the annual meeting of the Florida Educational Research Association, November, 2000 at Tallahassee, Florida.**

Conceptualization, pilot-testing, and validation of an instrument to measure teacher readiness for educational reforms (with C. Sentovich, G. Gobioff, J. Ferron). Paper presented at the annual conference of the Eastern Educational Research Association in February, 2000, at Clearwater, Florida.**

The development of an interactive, computerized module for teaching a graduate measurement course. Presented at the 21st Century Teaching Technologies Symposium on March 20, 1998 at the University of South Florida, Tampa, Florida.

Designing instructionally useful assessment reports: A pilot study (with C. Kerstyn). Paper presented at the annual conference of the Eastern Educational Research Association in February, 1998, at Tampa, Florida.**

The development of an interactive, computerized module for teaching a graduate measurement course. Presented at a symposium on technology-based instruction at the annual meeting of the Florida Educational Research Association, November, 1997, at Orlando, Florida.**

Applying facets analysis to data from portfolio writing samples.  Paper presented at a symposium on the Pasco County Language Arts Portfolio program, at the annual meeting of the Florida Educational Research Association, November, 1995, at St. Petersburg, Florida.**

 Developing teacher-friendly guides for assessing Florida's Goal 3 standards.  Presentation made at a symposium on Florida's Goal 3 assessment project, at the annual meeting of the Florida Educational Research Association, November, 1995 at St. Petersburg, Florida.**

Redesigning assessment programs to support classroom teaching: A conceptual model for school districts.  Paper presented at the annual meeting of the Florida Educational Research Association in November, 1994, at Tampa, Florida.**

Mapping writing development in primary children: Rasch applications on writing data from portfolios. Paper presented at the annual meeting of the Florida Educational Research Association in November, 1994 at Tampa, Florida.**

Redesigning assessment programs in school districts: The Pasco 2001 assessment project. Symposium presentation with B.W. Hall and R.A. Dailey at the annual meeting of the Florida Educational Research Council, March, 1994 at Tampa, Florida.**                 

Examining dimensionality of data from an early childhood scale using Rasch analysis and confirmatory factor analysis (with R.M. Smith and R.F. Dedrick). Paper presented at the annual meeting of the Florida Educational Research Association in November, 1993, at Destin, Florida.**

Managing alternative assessments: A computer-based solution for classroom teachers.  Presentation made with Chuck Hutinger at the annual meeting of the Florida Educational Research Council in March, 1993, at Gainesville, Florida.**

Meeting the assessment challenge in Project CHILD: An application in the Pasco County Schools.  Paper presented at the annual meeting of the Florida Educational Research Association in November, 1991, at Clearwater, Florida.**

Factor structure of the School Work Culture Profile in elementary and secondary samples (with K.J. Snyder). Paper presented at the annual meeting of the Florida Educational Research Association in November, 1991, at Clearwater, Florida.**

A descriptive study of elementary and secondary school work cultures using the School Work Culture Profile (with K.J. Snyder). Paper presented at the annual meeting of the Florida Educational Research Association in November, 1991, at Clearwater, Florida.**

A closer look at face validity. Paper presented at the annual meeting of the Florida Educational Research Association in November, 1986, at Tampa, Florida.**

GRANTS

Subcontract with the International Medical Corps (IMC), 2013-14. Evaluating comprehensive mental health and psychosocial support services for vulnerable refugees. $52,683. Co-Principal Investigator (with the Department of Clinical and Counseling Psychology, TC).


Subcontract with the Howard Hughes Medical Institute (HHMI) project at Barnard College, 2013-2017. $31,998. Four-year subcontract to support the Hughes Science Pipeline Project for middle schools. Principal Investigator.

National Science Foundation (NSF) REESE Award, 2012-2013. $148,747. Title: Improving validity at the nexus of assessment design and use: A proposal for an international conference and capacity-building institute in assessment and evaluation. Principal Investigator.


Educational Testing Service (ETS), 2011-12. $52,200. Co-sponsorship of AERI's inaugural conference on March 28-29, 2012. Title: Educational assessment, accountability and equity: Conversations on validity around the world.

Office of the National Coordinator, United States Department of Health and Human Services, Washington, D.C. Curriculum Development Center award to the Department of Biomedical Informatics, Columbia University (ca. $1.2 million). Subcontract for the development and validation of educational assessments in health information technology and designing a program evaluation protocol, 2010-2012: $204,000. Co-Principal Investigator with the Department of Health and Behavior Studies, TC.

The Nand and Jeet Khemka Foundation, India, 2008-2010. The Global Education and Leadership Foundation's Life Skills and Leadership Programme: Development of curriculum-based assessments, formative evaluation of pilot programs, and organizational capacity-building in assessment and evaluation. $754,000 (approx. $350,000 for the assessment and evaluation components).  Co-Principal Investigator (with the Department of Arts and Humanities, TC).


Fulbright Research Award, 2007-08, Competition #7410. Center for International Exchange of Scholars, Washington D.C., June 2007. $13,837. Principal Investigator.


Stemmler Fund of the National Board of Medical Examiners (NBME), in collaboration with the Center for Educational Research and Evaluation, Columbia University College of Physicians and Surgeons, April, 2006. $145,000. Title: Designing cognitive measures of practice-based learning and improvement as an iterative process combining Rasch and classical measurement methods. Co-Principal Investigator.

Community Foundation of Elmira/Corning/Finger Lakes areas. The Chemung County School Readiness Studies. $94,000 over a 3-year period (2006-09), granted to AERI. Principal Investigator.

National Science Foundation (NSF) EREC Solicitation 03-542, November, 2004-2008. $501,925. Title: Improving mathematics achievement in middle school students with systemic use of proximal assessment data. Principal Investigator.


U.S. Department of Education. School-based mentoring programs:  Evaluation of long-term effects on adolescents in the Peekskill School District, NY. Sub-contract with Family Services of Westchester, N.Y. Funded in 2004; project duration January, 2005-December, 2007. $10,000 per year. Principal Investigator.


Carnegie Learning Corporation, Cognitive Tutor evaluation. Contract to evaluate the effects of the Cognitive Tutor mathematics program at 13 Brooklyn schools, 2003-2004. $20,155. Principal Investigator.

Kumon North America, Inc. Grant to evaluate the effects of the Kumon supplementary math and reading programs at P.S. 180 in the Chancellor's District, New York. (2001-2002). $28,750. Principal Investigator.


Pinellas County Schools, Florida, Goals 2000 project, 1999-2001. Data-based decision-making in the classroom. Grant to develop a training manual in statistical analysis and use of assessment data for educational decision-making. $29,000. Principal Investigator.

Bureau of Teacher Education, Florida Department of Education, 1999. Readiness for Statewide Assessment Reforms and its Influence on School Practices and Outcomes. Grant to conduct a large-scale survey to evaluate needs related to state-initiated reforms in nine Florida school districts. $25,000. Principal Investigator.

University of South Florida, Division of Sponsored Research, Creative Scholarship Grants Competition, March, 1999. Readiness for Statewide Assessment Reforms and Influences on School Practices and Outcomes. $7,500. Principal Investigator.


University of South Florida, Center for Teaching Enhancement, March, 1997. Designing and Validating Educational Assessments: A Computer-based Module. Instructional Technology Grants Competition. $7,500. Principal Investigator.

Florida Department of Education-Bureau of Curriculum, Instruction, and Assessment. Developing Teacher-friendly Guides for Assessing Florida's Goal 3 Standards. Invitational grant awarded to the Pasco County School System, March, 1995. $67,000. Project Leader and Primary Author.

PRINCIPAL PUBLICATIONS

BOOKS

Chatterji, M. (Ed.). (2013). Validity and test use: An international dialogue on educational assessment, accountability, and equity. Bingley, UK: Emerald Group Publishing Limited.

Chatterji, M. (2003). Designing and Using Tools for Educational Assessment. Boston, MA: Allyn & Bacon/Pearson.

Chatterji, M. (Forthcoming). Assessment design: A user-centered methodology for multidisciplinary constructs and applications. New York, NY: Guilford Publishers.

REFEREED ARTICLES 

Assessment design, construct validation, and validity issues

Chatterji, M. (2013). Bad tests or bad test use? A case of SAT® use to examine why we need stakeholder conversations on validity. Teachers College Record, 115(9), 1-7.

Wyer, P. W., & Chatterji, M. (2013). Designing outcome measures for the accreditation of medical education programs as an iterative process combining classical test theory and Rasch measurement. The International Journal of Educational and Psychological Assessment, 13(2), 35-61.

Chatterji, M., Sentovich, C., Ferron, J., & Rendina-Gobioff, G. (2002). Using an iterative validation model to conceptualize, pilot-test, and validate scores from an instrument measuring Teacher Readiness for Educational Reforms. Educational and Psychological Measurement, 62, 442-463.

Banerji, M., & Ferron, J. (1998). Construct validity of a developmental assessment made up of mathematical patterns tasks. Educational and Psychological Measurement, 58(4), 634-660.

Banerji, M., Smith, R.M., & Dedrick, R. F. (1997). Dimensionality of an early childhood scale using Rasch analysis and confirmatory factor analysis. Journal of Outcome Measurement, 1(1), 56-86.


Evidence standards, the “evidence debate,” and evaluation methods

Chatterji, M., Green, L.W., & Kumanyika, S. (2014). L.E.A.D.: A framework for evidence gathering and use for the prevention of obesity and other complex public health problems. Health Education & Behavior, 41(1), 85-99.

Chatterji, M. (2008). Synthesizing evidence from impact evaluations in education to inform action: Comments on Slavin. Educational Researcher, 37(1), 23-26.

Chatterji, M. (2007). Grades of evidence: Variability in quality of findings in effectiveness research on complex field interventions. American Journal of Evaluation, 28(3), 3-17.

Chatterji, M. (2004). Evidence on "what works": An argument for extended-term mixed method (ETMM) evaluation designs. Educational Researcher, 33(9), 3-13. (Reprinted in Educational Researcher, 34(5), 14-24, 2005.)

Banerji, M., & Dailey, R.A. (1994). A study of the effects of an inclusion program for elementary students with specific learning disabilities. Journal of Learning Disabilities, 28(8), 511-522.

Standards-based reforms and educational equity

Chatterji, M. (2006). Reading achievement gaps, correlates and moderators of early reading achievement: Evidence from the Early Childhood Longitudinal Study (ECLS) kindergarten to first grade sample. Journal of Educational Psychology, 98(3), 489-507.

Chatterji, M. (2002). Models and methods for examining standards-based reforms: Have the tools of inquiry answered the pressing questions on improving schools? Review of Educational Research, 72(3), 345-386.

Diagnostic classroom assessment

Chatterji, M. (2012). Development and validation of indicators of teacher proficiency in diagnostic classroom assessment [Special issue on Teacher Assessments]. The International Journal of Educational and Psychological Assessment, 9(2), 4-25.

Chatterji, M., Koh, N., Choi, L., & Iyengar, R. (2009). Closing learner gaps proximally with teacher-mediated diagnostic assessment. Research in the Schools, 16(2), 60-77.

OP-ED ARTICLES AND BLOGS

Chatterji, M. (2014). Let’s mend, not end, educational testing. Education Week, Issue 24; in print on March 12, 2014.

Chatterji, M., & Harvey, J. (2014). (Co-facilitators). Assessing the Assessments: K-12 Measurement and Accountability in the 21st Century. A blog featuring debate and dialogue between scholars and K-12 school officials/practitioners, hosted at Education Week’s blog site: http://blogs.edweek.org/edweek/assessing_the_assessments

Chatterji, M. (2010). Review of “Closing the Racial Achievement Gap: Learning from Florida’s Reforms.” Boulder, CO: National Education Policy Center. Available at http://nepc.colorado.edu/thinktank/learnin-from-florida.

PROFESSIONAL ORGANIZATION MEMBERSHIP

  1. American Evaluation Association (AEA): Member, 2000-present
     Journal article reviewer, American Journal of Evaluation (ongoing)
     Conference participant (ongoing) and International Topical Interest Group (TIG) Ambassador (periodic)

  2. American Educational Research Association (AERA): Member, 1986-present
     Conference proposal reviewer for Divisions D and H (ongoing)
     Division H Evaluation Report Judging Panel, 2004
     Session discussant (ongoing)
     Journal article peer reviewer, American Educational Research Journal, Educational Evaluation and Policy Analysis, Educational Researcher (topical, ongoing)
     Editorial Board Member, Educational Researcher, 2006-09

  3. Eastern Evaluation Research Society (EERS), an affiliate of the American Evaluation Association: Member, 2002-present
     Member of the Board, 2004-2006
     Annual Conference Program Committee Member, 2005-2006

  4. Florida Educational Research Association (FERA): Member, 1986-2000
     Chair, Researcher of the Year Committee, 1997
     Member, Researcher of the Year Committee, 1996
     Chair, Professional Development and Training Committee, 1993
     Conference proposal reviewer and/or discussant, 1991-1997, 1999
     Nominated as candidate for president, 1997

  5. National Council on Measurement in Education (NCME): Member, 1987-present
     Editorial Board Member, Educational Measurement: Issues and Practice, 1995-97
     Member, Nominations Committee, 1993
     Proposal reviewer for NCME conferences (ongoing)

  6. Florida Educational Research Council (FERC): Member, 1990-1995, by appointment of the Superintendent, Pasco County Schools
     Member, Conference Planning Committee, 1993
     Nominated and elected Treasurer, 1994
     Nominated President-Elect, 1995 (did not serve as President due to move to USF)

  7. Florida Public Health Association (FPHA): Member, 1999-2000

INVITED LECTURES

Invited lecture at Calcutta University’s Department of Education, Alipore Campus, Kolkata, India, February 5, 2014. Title: Measures and Correlates of Mathematics Self-Efficacy, Mathematics Self-Concept and Mathematics Anxiety in Elementary Students: An Instrument Design and Validation Study.

ORLJ 4009: Understanding behavioral research

Overview of alternative methods of behavioral research and their relative strengths and limitations. Applies methodological principles to reading and evaluating social science research and to beginning to conduct research.

ORL 5522: Evaluation Methods I

Provides an overview of major evaluation models and social research methods useful in developing, monitoring and studying effects of programs, services and institutions in education, health and other fields. This is the second course in a three-course sequence in assessment and evaluation methods offered through the Organization and Leadership department. The prerequisite is the 4000-level course on testing, assessment and accountability or an instructor-approved substitute. Offered twice annually.

ORL 5523: Evaluation Methods II--Seminar

This evaluation research seminar, conducted in actual client contexts, provides laboratory and field experiences in planning, designing, executing, and reporting various components of evaluations. It is the third and culminating course in a three-course sequence in assessment and evaluation methods offered through the Organization and Leadership department. The prerequisite is ORL 5522 (Evaluation Methods I) or an instructor-approved substitute. Offered once biennially, typically in the fall.

ORL 5524: Instrument design and validation--Seminar

Provides hands-on seminar experiences in the design and validation of instruments to measure educational, psychological, health and social constructs. The type of instrument can vary according to student interests (e.g., multi-part surveys, attitude scales, behavior rating scales, performance assessments or tests of cognitive abilities and achievement). The prerequisites are intermediate-level courses in measurement/statistics or instructor-approved substitutes. Offered once biennially, typically in the fall.

Documents & Papers

Download: CURRENT VITAE [PDF]

Centers and Projects

Assessment and Evaluation Research Initiative
Website: http://www.tc.columbia.edu/aeri/


Madhabi Chatterji appeared in the following articles:

TC's Chatterji Co-Hosts EdWeek Blog on Assessment (3/13/2014)

Expanding TC's Global Reach (2/19/2014)

TC's Chatterji: "Multiple-Choice Tests Not Appropriate for Kindergartners" (10/11/2013)

TC's Chatterji: "More Diverse Students Aspire Toward College" (9/27/2013)

A History Of Evaluation (6/26/2013)

TC at AERA 2013 (4/26/2013)

Academic Festival 2013: The Sessions (4/23/2013)

TC's Global Engagement: A Sampling of Past, Present and Future Programs (4/22/2013)

EPSA at AERA 2013 (4/15/2013)

2012 Year in Review (2/20/2013)

Assessing Assessments — and Assessment Use (5/15/2012)

Previewing TC at AERA (3/19/2012)

International Conference Will Examine Issues of Validity, Educational Assessment, Equity and Accountability (3/14/2012)

Columbia and TC Cited for Fulbright Excellence (11/8/2011)

Menghan Shen: Marshaling the Power of Her Peers (10/10/2011)

Staff news (3/29/2011)

Heritage Foundation Responds to Chatterji's Critique of Florida Reforms (1/1/2011)

Go Slow on Following Florida Reforms, TC's Chatterji Says (12/1/2010)

Changing the Paradigm in Obesity Studies (6/24/2010)

Chatterji Part of Prestigious Committee Recommending Major Change in Thinking About Obesity Studies (5/3/2010)

A Letter From the President (4/1/2010)

TC to Partner with Turkish University (2/9/2009)

Bottling the Magic (12/23/2008)

Developing Young Leaders in India (4/28/2008)

A New Way to Troubleshoot Student Learning (4/2/2008)

New Diagnostic Assessment Method Boosts Math Achievement in 5th and 6th Graders; Suggests New Testing Paradigm (4/2/2008)

TC at AERA, 2008 (3/25/2008)

TC's Equity Campaign Teams with the Harlem Children's Zone (3/1/2007)

Pressure Increases As Testing Season Nears (1/8/2007)

Winter break may melt away (12/15/2006)

Equity Update (9/26/2006)

Campaign Announces New Assessment and Evaluation Research Initiative (8/2/2006)

Equity Campaign Announces New Assessment and Evaluation Research Initiative (7/21/2006)

Mixing Methods to Learn 'What Works' (7/13/2005)

Research in 2004 (5/27/2005)

Introducing Peer Review (4/1/2005)

Data Used More and More to Improve Pedagogy (1/6/2005)

Chatterji Talks about Making and Taking Tests (3/1/2003)

In Brief (2/1/2003)

Safety and Security Committee at TC (11/1/2001)

Faculty Adds Thirteen in Fall Semester, Two in Spring (9/1/2000)