Teachers College, Columbia University

Young-Sun Lee

Professional Background

Educational Background

Ph.D. University of Wisconsin-Madison, Educational Psychology (Educational Measurement & Statistics), 2002.
M.A. Ewha Womans University, Seoul, South Korea, Educational Measurement & Evaluation, 1995.
B.A. Ewha Womans University, Seoul, South Korea, Education, 1992.

Scholarly Interests

Psychometrics (Classical Test Theory, Item Response Theory, & Cognitive Diagnosis Modeling), Educational and Psychological Measurement, and Applied Statistics

Selected Publications

RECENT PUBLICATIONS  

Schnur, J., Lee, Y.-S., Goldsmith, R. E., Litman, L., & Guy, M. H. (in press). Development of the Healthcare Triggering Questionnaire in Adult Sexual Abuse Survivors. Psychological Assessment.

Park, Y. S., & Lee, Y.-S. (2014). An Extension of the DINA model using covariates: Examining factors affecting response probability and latent classification. Applied Psychological Measurement.

de la Torre, J., & Lee, Y.-S. (2013). Evaluating the Wald test for item-level comparison of saturated and reduced models in cognitive diagnosis. Journal of Educational Measurement, 50(4), 355-373. 

Lee, Y.-S., Park, Y. S., Song, M. Y., Kim, S. E., Lee, Y. J., & In, B. R. (2012). Investigating Score Reporting of Attribute Profiles from the National Assessment of Educational Achievement using Cognitive Diagnostic Models. Journal of Educational Evaluation, 25(3), 411-433. (Written in Korean)
 
Lee, Y.-S., de la Torre, J., & Park, Y. S. (2012). Cognitive diagnosticity of IRT-constructed assessment: An empirical investigation. Asia Pacific Education Review, 13(2), 333-345. 
 
Park, Y. S., Lee, Y.-S., Lee, Y. J., In, B. R., Kim, S. E., & Song, M. Y. (2012). Multilevel analysis of a cognitive diagnostic model using National Assessment of Educational Achievement: Examining differences between regions. Journal of Educational Evaluation, 25(2), 193-212. (Written in Korean)
 
Kim, S. E., Park, Y. S., & Lee, Y.-S. (2012). Application of Latent Class Model to Multiple Strategy CDM Analysis. Journal of Educational Evaluation, 25(1), 49-68. (Written in Korean)

Lee, Y.-S., Krishnan, A., & Park, Y. S. (2012). Psychometric properties of the Children's Depression Inventory: An IRT analysis across age in a non-clinical, longitudinal, adolescent sample. Measurement and Evaluation in Counseling and Development, 45(2), 84-100.
 
Lee, J., & Lee, Y.-S. (2012). The Effects of Testing. In Hattie, J., & Anderman, E., International Guide to Student Achievement. New York: Routledge Publishers.

Lee, Y.-S., Lembke, E., Moore, D., Ginsburg, H., & Pappas, S. (2012). Item-Level and Construct Evaluation of Early Numeracy Curriculum-Based Measures. Assessment for Effective Intervention, 37(2), 107-117.
 
Hampton, D. D., Lembke, E. S., Lee, Y.-S., Pappas, S., Chiong, C., & Ginsburg, H. (2012). Technical Adequacy of Early Numeracy Curriculum-Based Progress Monitoring Measures for Kindergarten and First-Grade Students. Assessment for Effective Intervention, 37(2), 118-126.

Lee, Y.-S., Park, Y. S., & Taylan, D. (2011). A cognitive diagnostic modeling of attribute mastery in Massachusetts, Minnesota, and the U.S. national sample using the TIMSS 2007. International Journal of Testing, 11, 144-177.
 
Ginsburg, H. P., Pappas, S., Lee, Y.-S., & Chiong, C. (2011). mCLASS:Math: Insights into Children's Mathematical Minds and Performance. In Noyce, P., & Hickey, D. T., Formative Assessment in Learning Contexts, the Next Generation (pp. 46-97). Harvard Education Press.
 
Park, Y. S., & Lee, Y.-S. (2011). Diagnostic Cluster Analysis: An Empirical Study of Mathematics Skills via TIMSS 2007. IERI Monograph Series: Issues and Methodologies in Large-Scale Assessments, 4, 75-108.
 
Kim, S.-H., Sherry, A., Lee, Y.-S., & Kim, C.-D. (2011). Psychometric Properties of a Translated Korean Adult Attachment Measure. Measurement and Evaluation in Counseling and Development, 44, 135-150. 

Lee, Y.-S., & Park, Y. S. (2011). Examining the Mastery of Mathematics Skills in Italy Using a Cognitive Diagnostic Model. RicercAzione, 3(1), 59-74.
 
de la Torre, J., & Lee, Y.-S. (2010). A Note on the Invariance of the DINA Model Parameters. Journal of Educational Measurement, 47(1), 115-127.
 
Xu, X., Douglas, J., & Lee, Y.-S. (2010). Linking with Nonparametric IRT Models. In von Davier, A. A. (Ed.), Statistical Models for Test Equating, Scaling, and Linking (pp. 243-260). New York: Springer Verlag.  

Lee, Y.-S., Cohen, A., & Toro, M. (2009). Examining Type I error and power for detection of differential item and testlet functioning. Asia Pacific Education Review, 10, 365-375.
 
Lee, Y.-S., Wollack, J., & Douglas, J. (2009). On the use of nonparametric ICC estimation techniques for checking parametric model fit. Educational and Psychological Measurement, 69, 181-197. 
 
Cervellione, K., Lee, Y.-S., & Bonanno, G. A. (2009). Rasch Modeling of the Self-Deception Scale of the Balanced Inventory of Desirable Responding. Educational and Psychological Measurement, 69, 438-458.
 
Lee, Y.-S., Grossman, J., & Krishnan, A. (2008). Cultural Relevance of Adult Attachment: Rasch Modeling of the Revised Experiences in Close Relationships in a Korean Sample. Educational and Psychological Measurement, 68, 824-844.
 
Lee, Y.-S. (2007). A Comparison of Methods for Nonparametric Estimation of Item Characteristic Curves for Binary Items. Applied Psychological Measurement, 31(2), 121-134. 
 
Lee, Y.-S., Douglas, J., & Chewning, B. (2007). Techniques For Developing Health Quality of Life Scales For Point of Service Use. Social Indicators Research: An International and Interdisciplinary Journal for Quality-Of-Life Measurement, 83(2), 331-350.  
 
TECHNICAL REPORTS   
 
Kim, C.-D., Lee, Y.-S., Lee, J. Y., Yoo, H. S., Lee, D. H., Oh, I., & Lee, S. M. (2012). Multiphasic Developmental Potential Inventory-Seoul Form: MDPI-S. Seoul Metropolitan Office of Education, Seoul, Republic of Korea. 
 
Song, M.-Y., Lee, Y.-S., & Park, Y. S. (2011). Analysis and score reporting based on cognitive diagnostic models using the National Assessment of Educational Achievement. Seoul, Republic of Korea: Korea Institute for Curriculum and Evaluation (KICE) Research Report RRE 2011-8.

Lee, Y.-S., Pappas, S., Chiong, C., & Ginsburg, H. (2011). mCLASS:MATH - Technical Manual. Brooklyn, NY: Wireless Generation, Inc.

Lee, Y.-S., Park, Y. S., & Lee, S. Y. (2010). Post-Smoothing by Kernel Equating to Compare College Scholastic Aptitude Test Performance between Offices of Education via Multilevel Methods. Seoul, Republic of Korea: Ministry of Education, Science and Technology.
 
Romero, M., & Lee, Y.-S. (2008). How Maternal, Family and Cumulative Risk Affect Absenteeism in Early Schooling: Facts for Policymakers. New York, NY: National Center for Children in Poverty, Columbia University, Mailman School of Public Health. (available at http://www.nccp.org/publications/pdf/text_802.pdf)
 
Romero, M., & Lee, Y.-S. (2008). Brief #2: The Influence of Maternal and Family Risk on Chronic Absenteeism in Early Schooling. "What data tell us about the role of chronic absenteeism in early schooling?"  New York, NY: National Center for Children in Poverty, Columbia University, Mailman School of Public Health. (available at http://www.nccp.org/publications/pdf/text_792.pdf)
 
Lee, Y.-S., Lembke, E., Moore, D., Ginsburg, H., & Pappas, S. (2007). mCLASS:MATH - Identifying technically adequate early mathematics measures. Brooklyn, NY: Wireless Generation, Inc.  
 
Romero, M., & Lee, Y.-S. (2007). Brief #1: A National Portrait of Chronic Absenteeism in the Early Grades. "What data tell us about the role of chronic absenteeism in early schooling?" New York, NY: National Center for Children in Poverty, Columbia University, Mailman School of Public Health. (available at http://www.nccp.org/publications/pdf/text_771.pdf)

Biographical Information

Dr. Young-Sun Lee joined the department in the fall of 2002. She received her Ph.D. in Educational Measurement and Statistics (Department of Educational Psychology), with a minor in Statistics, from the University of Wisconsin-Madison. Dr. Lee's research interests focus primarily on psychometric approaches to solving practical problems in educational and psychological testing. Studies currently in progress focus on the development and application of mixture IRT models, cognitive diagnostic models, international comparative studies using large-scale assessment data, and test construction/scale development for young children. Dr. Lee currently teaches courses in test theory (HUDM 6051 - Psychometric Theory I (Classical Test Theory) and HUDM 6052 - Psychometric Theory II (Item Response Theory)) and statistical methods (HUDM 4120 - Basic Concepts in Statistics, HUDM 4122 - Probability and Statistical Inference, and HUDM 5122 - Applied Regression Analysis).

Professional Presentations

Professional Experiences

PROFESSIONAL ACTIVITIES
 
Committee Member, Brenda H. Loyd Outstanding Dissertation Award Committee, National Council on Measurement in Education (NCME), 2007-2011
 
Review Panel, Child-related interventions research applications (ZMH1 ERB-P(01)) for the National Institute of Mental Health (NIMH), 2005
 
Journal Reviewer for Psychological Methods, Asia Pacific Education Review, Applied Psychological Measurement, Educational and Psychological Measurement, Journal of Educational Measurement 
 
Proposal Reviewer for AERA Annual Meetings & NCME Annual Meetings

Service to the College and University

 
Committee Member, Affirmative Action Committee (AAC) at Teachers College, Columbia University, 2011 - Present.
 
Committee Member, Institutional Review Board (IRB) Committee at Teachers College, Columbia University, 2011 - Present.
 
Advisor, Korean Graduate Students Association (TCKGSA) at Teachers College, Columbia University, 2002 - Present.
 
Committee Member, Dean's Grant for Student Committee at Teachers College, Columbia University, 2005 - 2006. 
 
Committee Member, TESOL Search Committee at Teachers College, Columbia University, 2005 - 2006.
 
Committee Member, Affirmative Action Committee (AAC) at Teachers College, Columbia University, 2004 - 2005.

Grants

Principal Investigator: "An Extension of the DINA Model Using Covaiates: Examining Attribute Mastery Prevalence and Matsery Profile on 4th Grade TIMSS Science", Dean's Grant for Tenured Faculty Research, Teachers College, Columbia University, Dates: 9/1/2012 - 8/31/2013, a semester research leave with salary & $5,000 research funds.

Co-Principal Investigator: "Developing an instrument for measuring Multiphasic Developmental Potential Inventory (MDPI)" (다면적 성장잠재력 검사 개발 연구), Seoul Metropolitan Office of Education, Seoul, Republic of Korea, Chang-Dai Kim (PI), Dates: 7/1/2011-2/29/2012, 50,000,000 won (approximately $50,000) (Written in Korean).  
 
Principal Investigator: "A Study on Examining and Improving the National Assessment of Educational Achievement using Cognitive Diagnostic Models"(인지진단모형을 통한 국가수준 학업성취도 평가자료의 분석 및 결과보고 개선 연구), Korea Institute for Curriculum and Evaluation (KICE), Dates: 6/1/2011 – 11/31/2011, 48,000,000 won (approximately $44,200) (Written in Korean).
 
Co-Principal Investigator: "Emerging Research-Empirical: Development and Application of a Multilevel Multiple-Group CDM to Compare Cognitive Attribute Distributions based on Eighth Grade TIMSS Mathematics", Johnson (PI), National Science Foundation (NSF), Dates: 9/1/2010 – 8/31/2012, $1,399,144 requested.
 
Principal Investigator: "Post-Smoothing by Kernel Equating to compare College Scholastic Aptitude Test Performance between Offices of Education via Multilevel Methods"(비모수 커널 추정법을 통한 대학수학능력시험의 검사동등화 및 다층모형 분석: 시도교육청간의 학업 성취도 비교 연구), Korea Institute for Curriculum and Evaluation (KICE) - The College Scholastic Aptitude Test (CSAT) Research Project, Dates: 8/15/2010 - 12/7/2010, 20,000,000 won (approximately $18,000) (Written in Korean).
 
Co-Principal Investigator: "mCLASS:Math: Development and analysis of an integrated screening, progress monitoring, and cognitive assessment system for K-3 mathematics", Institute of Education Sciences (IES; R305B070325), Ginsburg (PI), Dates: 9/1/2007 - 6/30/2011, total costs $1,565,455.
 
Co-Principal Investigator: "Computer Guided Comprehensive Mathematics Assessment for Young Children", National Institute of Health (NIH; 1 R01 HD051538-01), Ginsburg (PI), Dates: 10/01/2005 - 8/31/2010, total costs $3,170,839. 
 
Statistician: "What Data Tell Us About the Role of Chronic Absenteeism in Early School", Annie E. Casey Foundation, Knitzer & Romero (PI), National Center for Children in Poverty, School of Public Health, Columbia University, Dates: 3/01/2007 - 2/28/2008, total costs $35,000.

Professional Organization Membership

American Educational Research Association (AERA)
American Psychological Association (APA)
National Council on Measurement in Education (NCME)
Psychometric Society
Society for Research in Child Development (SRCD)

Current Projects

Emerging Research-Empirical: Development and Application of a Multilevel Multiple-Group CDM to Compare Cognitive Attribute Distributions based on Eighth Grade TIMSS Mathematics (NSF funded) 
 
Students in the United States have consistently performed below many of their international peers on mathematics assessments like TIMSS and PISA (AFT, 1999). This gap in performance is described in terms of a few very broad content domains (e.g., Algebra, Geometry), but it remains unclear exactly what math skills U.S. students are missing. The proposed research will develop and apply methods based on cognitive diagnostic modeling to understand exactly what math skills U.S. students lack, and compare the distributions of skills across countries and within countries across years. The research team will also investigate the relationship between the presence or absence of these skills and core background variables (e.g., gender). The proposed research focuses on the TIMSS eighth grade mathematics assessments from 1999, 2003, and 2007.
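At the core of this project is the item response function of a cognitive diagnostic model; the DINA (deterministic inputs, noisy "and" gate) model that appears in several of the publications above is the simplest example. The sketch below is purely illustrative: the attribute pattern, Q-matrix row, and slip/guess values are invented for exposition and are not estimates from the TIMSS analyses.

```python
import numpy as np

def dina_prob(alpha, q_row, slip, guess):
    """P(correct response) under the DINA model for one examinee-item pair.

    alpha : 0/1 array of the examinee's attribute mastery
    q_row : 0/1 array of the attributes the item requires (a Q-matrix row)
    slip  : probability that a master of all required attributes answers incorrectly
    guess : probability that a non-master answers correctly
    """
    # The "and" gate: eta = 1 only if every required attribute is mastered.
    eta = int(np.all(alpha >= q_row))
    return (1.0 - slip) ** eta * guess ** (1 - eta)

# Illustrative values only (not estimates from the TIMSS data).
alpha = np.array([1, 0, 1])   # examinee has mastered attributes 1 and 3
q_row = np.array([1, 1, 0])   # item requires attributes 1 and 2
print(dina_prob(alpha, q_row, slip=0.10, guess=0.20))  # non-master, so P(correct) = guess = 0.20
```

Estimating the latent attribute profiles behind such responses and comparing their distributions across groups (states, countries, or assessment years) is what the multilevel multiple-group extension developed in this project addresses.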

mCLASS:Math: Development and analysis of an integrated screening, progress monitoring, and cognitive assessment system for K-3 mathematics (IES funded)
 
We are conducting a series of studies to evaluate the reliability and validity of an integrated assessment system for K-3 mathematics and to modify it appropriately. The assessment includes Curriculum-Based Measurement (CBM) screening and progress monitoring measures and diagnostic cognitive interviews. It is intended especially for use with under-achieving students from a variety of ethnic, linguistic, and economic backgrounds. The two assessment methods use a proven technology platform, a hand-held computer that guides teachers' assessments, to shorten and simplify the administration process and to make the resulting data readily available to teachers.
 
Over four years, we will: (1) evaluate the reliability and validity of all items and measures in both the CBM and the diagnostic interviews, and make any necessary revisions; (2) create growth models that describe students' typical trajectories and "aim-lines" on the CBM measures; (3) establish cut-points for the CBM, to aid teachers in identifying students in need of special help; (4) investigate student profiles within and across the CBM and diagnostic interviews; and (5) conduct predictive validity studies to establish which combinations of CBM measures and diagnostic questions best predict student performance in later years.
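As a concrete, simplified picture of item (2) above, a student's CBM growth can be summarized by a least-squares trend fitted to weekly probe scores and projected forward to a goal date (an "aim-line"). The scores and goal week below are invented for illustration and do not reflect the project's actual growth-modeling procedures.

```python
import numpy as np

# Hypothetical weekly CBM probe scores for one student (weeks 1-8).
weeks = np.arange(1, 9)
scores = np.array([12, 14, 13, 17, 18, 20, 21, 24])

# Least-squares growth line: score = intercept + slope * week.
slope, intercept = np.polyfit(weeks, scores, 1)

# A simple aim-line projects the fitted trend forward to a goal week.
goal_week = 30
projected = intercept + slope * goal_week
print(f"weekly growth: {slope:.2f} points; projected score at week {goal_week}: {projected:.1f}")
```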
 
Computer Guided Comprehensive Mathematics Assessment for Young Children (NIH funded)
 
Our first aim is to develop an Early Mathematics Assessment System (EMAS) appropriate for young children (3- to 5-year-olds) that will serve three major functions. The long form (L-EMAS) can be used to (1) evaluate the effectiveness of a variety of curricula and (2) provide immediate and specific cognitive process information that can be used to guide instruction; (3) the short form (S-EMAS) can be used as a screening instrument for identifying children at risk for mathematical difficulties and those who might require comprehensive assessment and intervention. The EMAS will have several key features. It will measure a broad range of mathematical content, assessing number, operations, shape, space, measurement, and pattern. It also will measure a broad range of mathematical proficiency, including performance; cognitive processes underlying performance; comprehension and use of mathematical language; and children's "motivation" (attentiveness and affect). The EMAS will be research-based, drawing on modern cognitive science and on developmental and educational research. It will engage and motivate young children through the use of games or purposeful activities such as a birthday party. This type of context should ensure that the EMAS, which will be translated into Spanish, can be used with a diverse population.
 
Our second aim is to develop innovative technology whereby a personal digital assistant (PDA) will guide assessors in administering the EMAS. The technology will help them to use flexible probes modeled on the clinical interview to gain greater insight into student proficiency.
 
Our third aim is to use statistical procedures to ensure that the EMAS is reliable and valid. The psychometric properties of both EMAS forms will be investigated using both classical test theory (CTT) and item response theory (IRT) approaches to ensure high reliability and validity of the scale and the quality of the items. We will conduct reliability studies and will use IRT to provide item and test information. We will also conduct validation studies. Once their reliability and validity have been shown to be strong, the L-EMAS and S-EMAS will be ready for norming.
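As one small illustration of the classical test theory side of this work, internal consistency is commonly summarized with Cronbach's alpha, computed from the item and total-score variances. The data below are invented; this is a sketch of the statistic, not an analysis of EMAS data.

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for an (examinees x items) score matrix."""
    item_scores = np.asarray(item_scores, dtype=float)
    n_items = item_scores.shape[1]
    item_vars = item_scores.var(axis=0, ddof=1)       # variance of each item
    total_var = item_scores.sum(axis=1).var(ddof=1)   # variance of examinee total scores
    return (n_items / (n_items - 1)) * (1.0 - item_vars.sum() / total_var)

# Tiny invented data set: 5 examinees, 4 dichotomous items.
X = np.array([[1, 1, 1, 0],
              [1, 0, 1, 1],
              [0, 0, 1, 0],
              [1, 1, 1, 1],
              [0, 0, 0, 0]])
print(round(cronbach_alpha(X), 3))
```

The IRT analyses mentioned above would add item and test information functions on top of this kind of basic reliability check.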
 
Our fourth aim is to help assessors use the EMAS and to investigate their use of it. We will design and evaluate professional development activities that enable early childhood professionals to use the PDA comfortably to administer the EMAS and to interpret the results. We will also investigate how assessors use the EMAS and learn from it. The EMAS will contribute to the evaluation and improvement of mathematics education for young children.

HUDM 4120: Basic concepts in statistics

Descriptive statistics including organizing, summarizing, reporting, and interpreting data. Understanding relationships expressed by cross-tabulation, breakdown, and scatter diagrams. Designed as a one-semester introduction to statistical methods. Will include reading journal articles. Lab fee $50.00

HUDM 4122: Probability and statistical inference

Prerequisite: HUDM 4120 or undergraduate statistics course. Elementary probability theory; random variables and probability distributions; sampling distributions; estimation theory and hypothesis testing using binomial, normal, t, chi-square, and F distributions. Lab fee $50.00

HUDM 5122: Applied regression analysis

Prerequisite: HUDM 4122 or permission of instructor. Least squares estimation theory. Traditional simple and multiple regression models and polynomial regression models, with grouping variables including one-way ANOVA, two-way ANOVA, and analysis of covariance. Lab devoted to applications of SPSS regression program. Lab fee: $50.

HUDM 6051: Psychometric theory I

Permission required. Prerequisites: HUDM 5059, HUDM 5122, or equivalents. Psychometric theory underlying test construction; classical test theory, item response theory, and applications.

HUDM 6052: Psychometric theory II

Permission required. Prerequisites: HUDM 5059, HUDM 5122, or equivalents. Psychometric theory underlying test construction; classical test theory, item response theory, and applications.
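For readers unfamiliar with item response theory, its basic building block can be illustrated with the two-parameter logistic (2PL) item characteristic curve, which gives the probability of a correct response as a function of ability. The item parameters below are hypothetical, and this sketch is not part of the course materials.

```python
import numpy as np

def icc_2pl(theta, a, b):
    """2PL item characteristic curve: P(correct | theta) with discrimination a and difficulty b."""
    # Logistic metric; some parameterizations also include the scaling constant 1.7.
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

# Hypothetical item: moderately discriminating (a = 1.2), slightly difficult (b = 0.5).
for theta in np.linspace(-3, 3, 7):
    print(f"theta = {theta:+.1f}   P(correct) = {icc_2pl(theta, a=1.2, b=0.5):.2f}")
```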

Young-Sun Lee appeared in the following articles:

TC at AERA 2013 (4/26/2013)

When Less is More (3/22/2011)

Testing New Standards for Standardized Testing (12/8/2010)

AERA 2010 (4/1/2010)

TC at AERA, 2008 (3/25/2008)

TC Welcomes New Members to the Faculty (10/1/2002)