Thursday, October 21, 2010

Assessment Study Detailed To-Do List/Timeline


October: Attract teacher participants/IRB approval

By Saturday, October 23, 2010

• Everyone do the CITI IRB training: http://www.maricopa.edu/irb/apply-employee.html (step 3). The training will take about one to two hours.

• Complete IRB application

o Detailed project plans/methods

- Who – 081, 091, 101, and 102

- Consider 4 sections of each; 2 sections minimum – 1 human assessed placement, 1
computer assessed placement (thus only one section needs to be in computer classroom)

- Research Guided Self-Placement mechanism (Shelley mentioned a couple of models from
the C&W conference)

• Handful of student participants (20-30 participants)

o Data collection materials (interview questions, surveys, etc.)

o Participant recruitment letters/materials/etc.

- Write a letter to faculty to participate

• Shelley get contracts started for Chrises

• Shelley get draft of Yazmin & Emily's contract to HR folks to double-check, etc.

• Yazmin & Emily get Shelley their filled-out W-9 forms

• Shelley figure out how we're going to pay for computer tests

o Yazmin to get Shelley information on the computer tests

o Yazmin to email Shelley the URLs for the companies/tests so that she can start to figure out
how much this will cost


November: Attract teacher participants

December: Contact students who are currently on roster to have them come in and write an essay

January: Conduct guided self-placement between January 1 and January 18 (the week of, and the week before, the week of accountability)

o Saturday, Jan 15 Classes Begin

o Monday, Jan 17 Observance of MLK Birthday

o All instructors to administer the exact same diagnostic writing sample on the first day of class (methods:
the first day is a convenient time to carry this out, and the sample is more representative)

o Collect writing sample

February: Collect writing samples from each teacher participant

March: Interviews with teachers and students

o We will follow up on attendance/withdrawals with teachers and with students: Why did you
leave? (Health, work, children, etc.? We need this data.)

April: Interviews with teachers and students

May: Collect quantitative data (grades—pass rate/retention)

June:

July:

August:

Tuesday, October 5, 2010

Approved Grant Application

This form will be used to request funding or recommend new strategies to the Student Success Initiative-cohort project.

Note: The SSI cohort population is composed of students meeting the following conditions:
  • New to college (no prior college credits)
  • Degree-seeking or transfer-intent
  • Full credit load for current term (12 credit hours or more)
Also, students who test into developmental courses must enroll in those courses, with reading being the priority during their first term.

Rochelle Rodrigo, English ____5/10/10____
Name and Department Date Submitted

ALIGNMENT WITH SSI PRIORITIES
1. Describe how the project supports or aligns with SSI initiatives and/or cohort. SSI cohort population: New to college, Degree or transfer-intent, and Students taking 12 credit hours or more. Students who test into developmental courses must enroll in applicable course(s), with reading being the priority their first term.
Accurate placement of students in the appropriate writing course is critical to both student and institutional success, as well as more efficient and ethical. On the one hand, research on writing placement methodologies is critical of most standardized tests; on the other hand, MCCCD is moving toward mandatory placement, which requires selecting a placement methodology that is valid, economical, and time efficient. In this learning grant we will assess the validity of four major categories of writing placement methods: multiple choice exam, computer assessed writing exam, human assessed writing exam, and directed self-placement. This grant will compensate individuals to help design, develop, and conduct a robust and reliable study.

GOAL #1: Assess the validity of different categories of writing placement methodologies.
To achieve goal #1, the following OBJECTIVES must be met:
a. During the summer, design a study that assesses the validity of the major categories of writing placement methods: multiple choice exam (Veal & Hudson, 1983), computer assessed writing exam (Burstein, 2003; Chung & Baker, 2003; Ericsson & Haswell, 2006; Wohlpart, Lindsey, & Rademacher, 2008), human assessed writing exam (Cherry & Meyer, 1993; Smith, 1993; White, 1985; Williamson, 2003), and directed self-placement (Blakesley, 2002; Lewiecki-Wilson, Sommers, & Tassoni, 2000; Royer & Gilles, 2003; Royer & Gilles, 2000). The study will follow students through at least one semester to assess their retention as well as their grades; qualitative data will be collected as well.
b. During the spring, implement the study with a combination of placement methodologies/exams MCCCD currently uses as well as others not currently in use and then track student retention and grades.
c. During the summer, analyze data from the first round of the study and write report.

GOAL #2: Compare different categories of writing placement methodologies to assess which one might best facilitate valid, economical, and time efficient student placement into appropriate writing courses.
To achieve goal #2, the following OBJECTIVES must be met:
a. Develop criteria for an appropriate MCCCD writing placement assessment methodology, to include but not be limited to: validity, reliability, affordability, and efficiency. We'll also want to look at resources like Moss's (1994) "Can There Be Validity Without Reliability?" in defining what it means for a writing assessment to be valid and reliable while defining what exactly we mean by each term/criterion.
b. Analyze each of the tested methodologies using the different criteria.
c. Compare and contrast the different tested methodologies using the various criteria.
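One way to operationalize objectives (b) and (c) is a simple weighted criteria matrix. The sketch below is purely illustrative: the weights and the 1-5 ratings are placeholders to be replaced with findings from the study, not results.

```python
# Hypothetical criteria matrix for comparing placement methodologies.
# All weights and ratings are illustrative placeholders, not study results.

criteria_weights = {"validity": 0.4, "reliability": 0.3,
                    "affordability": 0.15, "efficiency": 0.15}

# Each method gets a 1-5 rating per criterion (to be filled in from study data).
ratings_by_method = {
    "multiple choice exam":      {"validity": 2, "reliability": 5, "affordability": 4, "efficiency": 5},
    "computer assessed writing": {"validity": 3, "reliability": 4, "affordability": 3, "efficiency": 4},
    "human assessed writing":    {"validity": 4, "reliability": 3, "affordability": 2, "efficiency": 2},
    "directed self-placement":   {"validity": 3, "reliability": 3, "affordability": 5, "efficiency": 4},
}

def weighted_score(ratings):
    """Combine per-criterion ratings into one score using the weights above."""
    return sum(criteria_weights[c] * r for c, r in ratings.items())

# Rank the methods from highest to lowest weighted score.
for method, ratings in sorted(ratings_by_method.items(),
                              key=lambda kv: -weighted_score(kv[1])):
    print(f"{method}: {weighted_score(ratings):.2f}")
```

The value of the exercise is less the final ranking than the forced explicitness: the team has to state how much each criterion counts before comparing methods.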

WORKS CITED
Blakesley, D. (2002). Directed Self-Placement in the University. WPA: Writing Program Administration, 25(3), 9-39.
Burstein, J. (2003). The E-rater Scoring Engine: Automated Essay Scoring with Natural Language Processing. In M. Shermis & J. Burstein (Eds.), Automated Essay Scoring: A Cross-Disciplinary Perspective (pp. 113-122). Mahwah, NJ: Lawrence Erlbaum Associates, Inc.
Cherry, R. D. & Meyer, P. R. (1993). Reliability Issues in Holistic Assessment. In B. Huot & P. O'Neill (Eds.), Assessing writing: A critical sourcebook (pp. 29-56). Boston, MA: Bedford/St. Martin's.
Chung, G. & Baker, E. (2003). Issues in the Reliability and Validity of Automated Scoring of Constructed Responses. In M. Shermis & J. Burstein (Eds.), Automated Essay Scoring: A Cross-Disciplinary Perspective (pp. 23-40). Mahwah, NJ: Lawrence Erlbaum Associates, Inc.
Ericsson, P. F. & Haswell, R. H. (2006). Machine scoring of student essays: Truth and consequences. Logan, UT: Utah State University Press.
Lewiecki-Wilson, C., Sommers, J., & Tassoni, J. (2000). Rhetoric and the writer's profile: Problematizing directed self-placement. Assessing Writing 7(2), 165-183.
Moss, P. A. (1994). Can There Be Validity Without Reliability? In B. Huot & P. O'Neill (Eds.), Assessing writing: A critical sourcebook (pp. 81-96). Boston, MA: Bedford/St. Martin's.
Royer, D. J. & Gilles, R. (2003). Directed Self-Placement: Principles and practices. Cresskill, NJ: Hampton Press.
Royer, D. J., & Gilles, R. (2000). Basic Writing and Directed Self-Placement. Basic Writing e-Journal, 2(2). Retrieved March 12, 2009, from http://www.asu.edu/clas/english/composition/cbw/summer_2000_V2N2.htm
Smith, W. L. (1993). Assessing the reliability and adequacy of using holistic scoring of essays as a college composition placement technique. In M. Williamson & B. Huot (Eds.), Validating holistic scoring for writing assessment: Theoretical and empirical foundations (pp. 142-205). Cresskill, NJ: Hampton Press.
Veal, L. R. & Hudson, S. A. (1983). Direct and Indirect Measures for Large-Scale Evaluations of Writing. In B. Huot & P. O'Neill (Eds.), Assessing writing: A critical sourcebook (pp. 13-18). Boston, MA: Bedford/St. Martin's.
White, E. M. (1985). Holisticism. In B. Huot & P. O'Neill (Eds.), Assessing writing: A critical sourcebook (pp. 19-28). Boston, MA: Bedford/St. Martin's.
Williamson, M. M. (2003). Validity of Automated Scoring: Prologue for a Continuing Discussion of Machine Scoring Student Writing. In B. Huot & P. O'Neill (Eds.), Assessing writing: A critical sourcebook (pp. 435-455). Boston, MA: Bedford/St. Martin's.
Wohlpart, A. J., Lindsey, C., & Rademacher, C. (2008). The Reliability of Computer Software to Score Essays: Innovations in a Humanities Course. Computers and Composition, 25(2), 203-223.

ALIGNMENT WITH COLLEGE PRIORITIES
1. Please describe how the project aligns with college retention or related institutional priorities.
First and foremost, this project is aligned with the district’s strategic priorities, which include mandatory placement testing for all students. It is highly problematic to implement mandatory placement testing if the assessment methodology does not place students adequately and reliably.

This project is also aligned with Mesa Community College’s “Student Retention” strategic priority. The general assumption is that students are more likely to stay and successfully complete their courses if they are appropriately placed; however, research does not necessarily agree with that assumption (Armstrong, 2001; Kolajo, 2004; Perkhounkova, Noble, & Sawyer, 2005). This study will not only test the validity and reliability of the various placement methodologies but will also assess the assumption that placement has a direct correlation with student success and retention.

Working under the assumption that students who are accurately placed will be more likely to successfully complete a course, this project aims to assess methods of more accurately placing students into appropriate-level writing courses. More importantly, staff and faculty who sincerely believe in the validity of a placement exam will share that confidence with students, creating an environment that promotes student success.

WORKS CITED
Armstrong, W. (2001). Explaining Student Course Outcomes by Analyzing Placement Test Scores, Student Background Data, and Instructor Effects. ERIC Database: ED454907
Kolajo, E. (2004). From Developmental Education to Graduation: A Community College Experience. Community College Journal of Research and Practice, 28(4), 365-371.
Perkhounkova, Y., Noble, J., & Sawyer, R. (2005). Modeling the Effectiveness of Developmental Instruction. ACT Research Report Series: 2005-2. Iowa City, IA: ACT Inc.

JUSTIFICATION
** If funds are being requested this area must be filled out. Recipients of SSI funds are required to complete a project status form and share results with the steering committee.
1. Describe the nature of the budget request, recommendation or need in detail: (Budget request must include description of how funding will be used to support SSI-Cohort and student success initiatives).

Personnel Breakdown
Object Code	Description	Amount
51310	Part-Time Wages: Christine Helfers (60 hrs @ $26.00/hr + $280.80 statutory benefits)	$1,840.80
51310	Part-Time Wages: Christine Vassett (60 hrs @ $26.00/hr + $280.80 statutory benefits)	$1,840.80
Personnel Subtotal	$3,681.60

Operational Breakdown
Object Code	Description	Amount
53210	Professional Services: Emily Hooper for research, design, development, and delivery of study	$3,016.00
53210	Professional Services: Yazmin Lazcano for research, design, development, and delivery of study	$3,016.00
53210	Professional Services: Human assessed placement exams (6 x $100)	$600.00
53550	Official Function: Food for longer events with students	$485.00
59837	Other operational item: Stipends for students to participate in the study	$1,200.00
Operational Subtotal	$8,317.00

Personnel and Operational Grand Total	$11,998.60
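The subtotals and grand total above can be cross-checked with a short script; every figure below is taken directly from the table:

```python
# Cross-check the budget figures from the table above.

personnel = [
    ("Christine Helfers", 60 * 26.00 + 280.80),  # 60 hrs @ $26.00/hr + statutory benefits
    ("Christine Vassett", 60 * 26.00 + 280.80),
]
operational = [
    ("Emily Hooper, professional services", 3016.00),
    ("Yazmin Lazcano, professional services", 3016.00),
    ("Human assessed placement exams", 6 * 100.00),  # 6 x $100
    ("Official function: food", 485.00),
    ("Student stipends", 1200.00),
]

personnel_subtotal = sum(amount for _, amount in personnel)
operational_subtotal = sum(amount for _, amount in operational)
grand_total = personnel_subtotal + operational_subtotal

print(f"Personnel subtotal:   ${personnel_subtotal:,.2f}")
print(f"Operational subtotal: ${operational_subtotal:,.2f}")
print(f"Grand total:          ${grand_total:,.2f}")
```

Each line item agrees with the table: the two part-time wage lines come to $1,840.80 apiece, the subtotals to $3,681.60 and $8,317.00, and the grand total to $11,998.60.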

Actual Date Funds Required: July 1, 2010…if possible to start work!

PERSONNEL REQUEST
** Funding is allocated annually. Subsequent request for new fiscal year must be submitted with new budget request form.

1. Duration of project (beginning and end dates): July 1, 2010-June 30, 2011

2. Number of staff to be funded (hourly rates, beginning and ending dates):
  • Jeff Andelora: Co-primary investigator overseeing entire project and acting as liaison between major departments/groups on the MCC campus and in MCCCD at large.
  • Shelley Rodrigo: Co-primary investigator and project manager overseeing entire project and helping Helfers, Vassett, Hooper, & Lazcano get work done.
  • Christine Helfers & Christine Vassett: OYOs on MCC's campus leading/working on the project. They will research, design, develop, and deliver the study, including participating as "advisers" for the guided self-placement method and as "readers" for the human assessed method.
  • Emily Hooper & Yazmin Lazcano: Graduate students from ASU's Rhetoric & Composition program in the English department leading/working on the project. They will research, design, develop, and deliver the study, including participating as "advisers" for the guided self-placement method and as "readers" for the human assessed method.
  • Human Readers: Residential and/or Adjunct faculty hired to assess the essay placement exams for the human assessed methodology (Spring 2011).
Timelines
Activities (What/Where) Start/End Dates (When) Responsibility (Who)
1. Set up blog to report out process; invite Andelora, Helfers, Vassett, Hooper, & Lazcano July 1, 2010 Rodrigo
2. Manage grant funds and other processes July 1, 2010-June 30, 2011 Rodrigo
3. Manage grant communications (motivate to blog, mid-reports, etc.) July 1, 2010-June 30, 2011 Rodrigo
4. Gather, read, and review placement methods literature within the field of rhetoric and composition/E July 1-August 1, 2010 Helfers, Vassett, Hooper, & Lazcano
5. Gather, read, and review placement methods literature within studies about community college student July 1-August 1, 2010 Helfers, Vassett, Hooper, & Lazcano
6. Gather, read, and review placement methods literature within studies in higher education at large. July 1-August 1, 2010 Helfers, Vassett, Hooper, & Lazcano
7. Design study based on draft produce by Hooper & Lazcano in their Assessment Seminar August 1-15, 2010 Helfers, Vassett, Hooper, & Lazcano
8. Review and revise study with MCC’s English Department, the English Instructional Council Services, T August 23-October 15, 2010 Andelora, Rodrigo, Helfers, Vassett, Hooper, & Lazcano
9. Submit revised study for CRRC/IRB approval. October 15-November 15, 2010 Rodrigo, Helfers, Vassett, Hooper, & Lazcano
10. Finalize plans for implementing study during Spring 2011 October 15-December 15, 2010 Helfers, Vassett, Hooper, & Lazcano
11. Implement Study, Collect Data January 15-May 13, 2011 Helfers, Vassett, Hooper, & Lazcano
12. Start Analyzing Data February 15-May 13, 2011 Helfers, Vassett, Hooper, & Lazcano
13. Analyze Final Data from Spring Semester May 13-June 15, 2011 Helfers, Vassett, Hooper, & Lazcano
14. Write Final Report June 1-30, 2011 Andelora, Rodrigo, Helfers, Vassett, Hooper, & Lazcano

3. Are you requesting additional funds for training needs? No

4. Total Budget impact/request: $11,998.60