Thursday, November 4, 2010

Spent two and a half hours trying to contact people in charge of the directed self-placement tests on two different campuses: Portland State and the University of Colorado at Boulder. Yazmin sent me an email saying:
Here are a couple of portals to guided self-placement

Portland State
http://www.pdx.edu/advising/placement

University of Colorado at Boulder
http://www.colorado.edu/orientation/complete/step2asjmc.html

https://aac.colorado.edu/pwr/pwr.asp

Although these links were password-protected, I was able to locate contact names and phone numbers for both colleges. I left email and voice messages with both.

Portland State: Freshman Writing Placement Questionnaire
Requires a PSU computing account. I left a message with the Director of the Undergraduate Advising and Support Center, Mary Ann Barham, (503) 725-5471
Here are some additional phone numbers to use in case this doesn't work out:
(503) 725-4005 (Undergraduate Advising and Support Center)
(503) 725-3822 (CLAS)
(503) 725-4357 Log in Help

I also learned that Portland State does not require any writing course for graduation, but recommends writing courses as electives.
www.pdx.edu/advising/faqs/do-i-need-to-take-a-writing-course

University of Colorado at Boulder
Admission Counselor for Arizona: Adam Becenti (303) 492-1223
Directed Self Placement

Testing Services (GRE, LSAT etc) (303) 492-6970
English Department (303) 492-7381
Renzia (pronounced Ren-cha) Benninger (303) 492-6434 (left message)

The English Department was unaware that it has a Directed Self-Placement exam.

Thursday, October 21, 2010

Assessment Study Detailed To-Do List/Timeline

October: Attract teacher participants/IRB approval

By Saturday, October 23, 2010

• Everyone do the CITI IRB training: http://www.maricopa.edu/irb/apply-employee.html (step 3). The training will take about one to two hours.

• Complete IRB application

o Detailed project plans/methods

- Who – 081, 091, 101, and 102

- Consider 4 sections of each; 2 sections minimum – 1 human-assessed placement, 1 computer-assessed placement (thus only one section needs to be in a computer classroom)

- Research Guided Self-Placement mechanism (Shelley mentioned a couple of models from the C&W conference)

• Handful of student participants (20-30 participants)

o Data collection materials (interview questions, surveys, etc.)

o Participant recruitment letters/materials/etc.

- Write a letter to faculty to participate

• Shelley get contracts started for Chrises

• Shelley get draft of Yazmin & Emily's contract to HR folks to double-check, etc.

• Yazmin & Emily get filled-out W-9 forms to Shelley

• Shelley figure out how we're going to pay for computer tests

o Yazmin to get Shelley information on the computer tests and

o Yazmin to email Shelley the URLs for the companies/tests so that she can start to figure out how much this will cost


November: Attract teacher participants

December: Contact students who are currently on the roster to have them come in and write an essay

January: Conduct guided self-placement between Jan 1 and Jan 18 (the week of accountability and the week before it)

o Saturday, Jan 15 Classes Begin

o Monday, Jan 17 Observance of MLK Birthday

o All instructors to administer the exact same diagnostic writing sample on the first day of class (methods note: this is a convenient time to carry it out, and the sample is more representative)

o Collect writing sample

February: Collect writing samples from each teacher participant

March: Interviews with teachers and students

o We will follow up on attendance/withdrawals with the teacher and with the student. Why did you leave? (Health, work, children, etc.? We need this data.)

April: Interviews with teachers and students

May: Collect quantitative data (grades—pass rate/retention)

June:

July:

August:

Tuesday, October 5, 2010

Approved Grant Application

This form will be used to request funding or recommend new strategies to the Student Success Initiative-cohort project.

Note: The SSI cohort population is composed of students meeting the following conditions:
  • New to college (no prior college credits)
  • Degree-seeking or transfer-intent
  • Full credit load for current term (12 credit hours or more)
Also, students who test into developmental courses must enroll in those courses, with reading being the priority during their first term.

Rochelle Rodrigo, English ____5/10/10____
Name and Department Date Submitted

ALIGNMENT WITH SSI PRIORITIES
1. Describe how the project supports or aligns with SSI initiatives and/or cohort. SSI cohort population: New to college, Degree or transfer-intent, and Students taking 12 credit hours or more. Students who test into developmental courses must enroll in the applicable course(s), with reading being the priority during their first term.
Accurate placement of students in the appropriate writing course is critical to student and institutional success, and it is also the more efficient and ethical approach. On the one hand, research on writing placement methodologies is critical of most standardized tests; on the other hand, MCCCD is moving toward mandatory placement, which requires selecting a placement methodology that is valid, economical, and time efficient. In this learning grant we will assess the validity of four major categories of writing placement methods: multiple-choice exam, computer-assessed writing exam, human-assessed writing exam, and directed self-placement. This grant will compensate individuals to help design, develop, and conduct a robust and reliable study.

GOAL #1: Assess the validity of different categories of writing placement methodologies.
To achieve goal #1, the following OBJECTIVES must be met:
a. During the summer and fall, design a study that assesses the validity of the major categories of writing placement methods: multiple-choice exam (Veal & Hudson, 1983), computer-assessed writing exam (Burstein, 2003; Chung & Baker, 2003; Ericsson & Haswell, 2006; Wohlpart, Lindsey, & Rademacher, 2008), human-assessed writing exam (Cherry & Meyer, 1993; Smith, 1993; White, 1985; Williamson, 2003), and directed self-placement (Blakesley, 2002; Lewiecki-Wilson, Sommers, & Tassoni, 2000; Royer & Gilles, 2000, 2003). The study will follow students through at least one semester to assess their retention as well as their grades; qualitative data will be collected as well.
b. During the spring, implement the study with a combination of placement methodologies/exams MCCCD currently uses as well as others not currently in use and then track student retention and grades.
c. During the summer, analyze data from the first round of the study and write report.

GOAL #2: Compare different categories of writing placement methodologies to assess which one might best facilitate valid, economical, and time efficient student placement into appropriate writing courses.
To achieve goal #2, the following OBJECTIVES must be met:
a. Develop criteria for an appropriate MCCCD writing placement assessment methodology, to include but not be limited to: validity, reliability, affordability, and efficiency. We'll also want to look at resources like Moss's (1994) "Can There Be Validity Without Reliability?" in defining what it means for a writing assessment to be valid and reliable, and in pinning down what exactly we mean by each term/criterion.
b. Analyze each of the tested methodologies using the different criteria.
c. Compare and contrast the different tested methodologies using the various criteria.

WORKS CITED
Blakesley, D. (2002). Directed Self-Placement in the University. WPA: Writing Program Administration, 25(3), 9-39.
Burstein, J. (2003). The E-rater Scoring Engine: Automated Essay Scoring with Natural Language Processing. In M. Shermis & J. Burstein (Eds.), Automated Essay Scoring: A Cross-Disciplinary Perspective (pp. 113-122). Mahwah, NJ: Lawrence Erlbaum Associates, Inc.
Cherry, R. D. & Meyer, P. R. (1993). Reliability Issues in Holistic Assessment. In B. Huot & P. O'Neill (Eds.), Assessing writing: A critical sourcebook (pp. 29-56). Boston, MA: Bedford/St. Martin's.
Chung, G. & Baker, E. (2003). Issues in the Reliability and Validity of Automated Scoring of Constructed Responses. In M. Shermis & J. Burstein (Eds.), Automated Essay Scoring: A Cross-Disciplinary Perspective (pp. 23-40). Mahwah, NJ: Lawrence Erlbaum Associates, Inc.
Ericsson, P. F. & Haswell, R. H. (2006). Machine scoring of student essays: Truth and consequences. Logan, UT: Utah State University Press.
Lewiecki-Wilson, C., Sommers, J., & Tassoni, J. (2000). Rhetoric and the writer's profile: Problematizing directed self-placement. Assessing Writing, 7(2), 165-183.
Moss, P. A. (1994). Can There Be Validity Without Reliability? In B. Huot & P. O'Neill (Eds.), Assessing writing: A critical sourcebook (pp. 81-96). Boston, MA: Bedford/St. Martin's.
Royer, D. J., & Gilles, R. (2000). Basic Writing and Directed Self-Placement. Basic Writing e-Journal, 2(2). Retrieved March 12, 2009, from http://www.asu.edu/clas/english/composition/cbw/summer_2000_V2N2.htm
Royer, D. J., & Gilles, R. (2003). Directed Self-Placement: Principles and practices. Cresskill, NJ: Hampton Press.
Smith, W. L. (1993). Assessing the reliability and adequacy of using holistic scoring of essays as a college composition placement technique. In M. Williamson & B. Huot (Eds.), Validating holistic scoring for writing assessment: Theoretical and empirical foundations (pp. 142-205). Cresskill, NJ: Hampton Press.
Veal, L. R., & Hudson, S. A. (1983). Direct and Indirect Measures for Large-Scale Evaluations of Writing. In B. Huot & P. O'Neill (Eds.), Assessing writing: A critical sourcebook (pp. 13-18). Boston, MA: Bedford/St. Martin's.
White, E. M. (1985). Holisticism. In B. Huot & P. O'Neill (Eds.), Assessing writing: A critical sourcebook (pp. 19-28). Boston, MA: Bedford/St. Martin's.
Williamson, M. M. (2003). Validity of Automated Scoring: Prologue for a Continuing Discussion of Machine Scoring Student Writing. In B. Huot & P. O'Neill (Eds.), Assessing writing: A critical sourcebook (pp. 435-455). Boston, MA: Bedford/St. Martin's.
Wohlpart, A. J., Lindsey, C., & Rademacher, C. (2008). The Reliability of Computer Software to Score Essays: Innovations in a Humanities Course. Computers & Composition, 25(2), 203-223.

ALIGNMENT WITH COLLEGE PRIORITIES
1. Please describe how the project aligns with college retention or related institutional priorities.
First and foremost, this project is aligned with the district's strategic priorities, which include mandatory placement testing for all students. It is highly problematic to implement mandatory placement testing if the assessment methodology does not adequately place students in a reliable manner.

This project is also aligned with Mesa Community College's "Student Retention" strategic priority. The general assumption is that students are more likely to stay and successfully complete their courses if they are appropriately placed; however, research does not necessarily agree with that assumption (Armstrong, 2001; Kolajo, 2004; Perkhounkova, Noble, & Sawyer, 2005). This study will not only test the validity and reliability of the various placement methodologies; it will also assess the assumption that placement has a direct correlation with student success and retention.

Working under the assumption that students who are accurately placed will be more likely to successfully complete a course, this project aims to assess methods of more accurately placing students into appropriate-level writing courses. More importantly, staff and faculty who sincerely believe in the validity of a placement exam will share that confidence with students, creating an environment that promotes student success.

WORKS CITED
Armstrong, W. (2001). Explaining Student Course Outcomes by Analyzing Placement Test Scores, Student Background Data, and Instructor Effects. ERIC Database: ED454907
Kolajo, E. (2004). From Developmental Education to Graduation: A Community College Experience. Community College Journal of Research and Practice, 28(4), 365-371.
Perkhounkova, Y., Noble, J., & Sawyer, R. (2005). Modeling the Effectiveness of Developmental Instruction. ACT Research Report Series: 2005-2. Iowa City, IA: ACT Inc.

JUSTIFICATION
** If funds are being requested this area must be filled out. Recipients of SSI funds are required to complete a project status form and share results with the steering committee.
1. Describe the nature of the budget request, recommendation or need in detail: (Budget request must include description of how funding will be used to support SSI-Cohort and student success initiatives).

Personnel Breakdown (Object Code / Description / Amount)
51310 Part-Time Wages: Christine Helfers, 60 hrs @ $26.00/hr + $280.80 statutory benefits $1,840.80
51310 Part-Time Wages: Christine Vassett, 60 hrs @ $26.00/hr + $280.80 statutory benefits $1,840.80
Personnel Subtotal $3,681.60
Operational Breakdown (Object Code / Description / Amount)
53210 Professional Services: Emily Hooper, for research, design, development, and delivery of study $3,016.00
53210 Professional Services: Yazmin Lazcano, for research, design, development, and delivery of study $3,016.00
53210 Professional Services: Human-assessed placement exams (6 x 100) $600.00
53550 Official Function: Food for longer events with students $485.00
59837 Other operational item: Stipends for students to participate in the study $1,200.00
Operational Subtotal $8,317.00
Personnel and Operational Grand Total $11,998.60
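
For anyone double-checking the request, the budget reduces to simple arithmetic. Below is a minimal Python sketch that reproduces the subtotals and grand total; the figures are copied straight from the table above, and the statutory-benefits amount ($280.80) is taken as given rather than derived from any official rate.

    # Sketch: recompute the SSI learning grant budget from the line items above.
    HOURS = 60
    RATE = 26.00          # dollars per hour
    BENEFITS = 280.80     # statutory benefits, taken as given

    part_time_wage = HOURS * RATE + BENEFITS   # one OYO contract (Helfers or Vassett)
    personnel_subtotal = 2 * part_time_wage    # two part-time contracts

    operational_items = {
        "Professional services: Emily Hooper": 3016.00,
        "Professional services: Yazmin Lazcano": 3016.00,
        "Human-assessed placement exams (6 x 100)": 600.00,
        "Official function: food for student events": 485.00,
        "Student participation stipends": 1200.00,
    }
    operational_subtotal = sum(operational_items.values())

    print(f"Personnel subtotal:   ${personnel_subtotal:,.2f}")    # $3,681.60
    print(f"Operational subtotal: ${operational_subtotal:,.2f}")  # $8,317.00
    print(f"Grand total:          ${personnel_subtotal + operational_subtotal:,.2f}")  # $11,998.60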

Actual Date Funds Required: July 1, 2010…if possible, to start work!

PERSONNEL REQUEST
** Funding is allocated annually. Subsequent request for new fiscal year must be submitted with new budget request form.

1. Duration of project (beginning and end dates): July 1, 2010-June 30, 2011

2. Number of staff to be funded, hourly rates, beginning and ending dates:
  • Jeff Andelora: Co-primary investigator overseeing entire project and acting as liaison between major departments/groups on the MCC campus and in MCCCD at large.
  • Shelley Rodrigo: Co-primary investigator and project manager overseeing entire project and helping Helfers, Vassett, Hooper & Lazcano get work done.
  • Christine Helfers & Christine Vassett: OYOs on MCC's campus leading/working on the project. They will research, design, develop, and deliver the study, including participating as "advisers" for the guided self-placement method as well as "readers" for the human-assessed method.
  • Emily Hooper & Yazmin Lazcano: Graduate students from ASU's Rhetoric & Composition program in the English department leading/working on the project. They will research, design, develop, and deliver the study, including participating as "advisers" for the guided self-placement method as well as "readers" for the human-assessed method.
  • Human Readers: Residential and/or Adjunct faculty hired to assess the essay placement exams for the human assessed methodology (Spring 2011).
Timelines
Activities (What/Where) Start/End Dates (When) Responsibility (Who)
1. Set up blog to report out process; invite Andelora, Helfers, Vassett, Hooper, & Lazcano July 1, 2010 Rodrigo
2. Manage grant funds and other processes July 1, 2010-June 30, 2011 Rodrigo
3. Manage grant communications (motivate to blog, mid-reports, etc.) July 1, 2010-June 30, 2011 Rodrigo
4. Gather, read, and review placement methods literature within the field of rhetoric and composition/E July 1-August 1, 2010 Helfers, Vassett, Hooper, & Lazcano
5. Gather, read, and review placement methods literature within studies about community college students July 1-August 1, 2010 Helfers, Vassett, Hooper, & Lazcano
6. Gather, read, and review placement methods literature within studies in higher education at large. July 1-August 1, 2010 Helfers, Vassett, Hooper, & Lazcano
7. Design study based on draft produced by Hooper & Lazcano in their Assessment Seminar August 1-15, 2010 Helfers, Vassett, Hooper, & Lazcano
8. Review and revise study with MCC’s English Department, the English Instructional Council Services, T August 23-October 15, 2010 Andelora, Rodrigo, Helfers, Vassett, Hooper, & Lazcano
9. Submit revised study for CRRC/IRB approval. October 15-November 15, 2010 Rodrigo, Helfers, Vassett, Hooper, & Lazcano
10. Finalize plans for implementing study during Spring 2011 October 15-December 15, 2010 Helfers, Vassett, Hooper, & Lazcano
11. Implement Study, Collect Data January 15-May 13, 2011 Helfers, Vassett, Hooper, & Lazcano
12. Start Analyzing Data February 15-May 13, 2011 Helfers, Vassett, Hooper, & Lazcano
13. Analyze Final Data from Spring Semester May 13-June 15, 2011 Helfers, Vassett, Hooper, & Lazcano
14. Write Final Report June 1-30, 2011 Andelora, Rodrigo, Helfers, Vassett, Hooper, & Lazcano

3. Are you requesting additional funds for training needs? No

4. Total Budget impact/request: $11,998.60

Wednesday, September 29, 2010

Writing Assessment Seminar Course Project: Proposal Concerning New Placement Procedure for First-Year Composition

TO: Dr. Maria Harper-Marinick, Vice Chancellor of Academic and Student Affairs, Maricopa Community Colleges

FROM: Jeffrey Andelora, Chair, and Shelley Rodrigo, Professor, MCC English Department

DATE: May 4, 2010

SUBJECT: Proposal Concerning New Placement Procedure for First-Year Composition

We write to propose running a pilot version of a feasibility study to be conducted during Fall 2010 concerning the MCC English Department's soon-to-be-mandatory placement procedure for Developmental Writing (ENG 071, 081, and 091) and First-Year Composition (ENG 101).

Need/Concern to be Addressed

As you are well aware, MCCCD has been moving toward a mandatory placement policy for the last couple of years, and the MCCCD Strategic Plan 2009-2010 indicates that better student retention is a district-wide strategic direction as well as a strategic goal specific to MCC. It has been hoped that the district-wide Student Success Initiative (SSI) would contribute to reaching this strategic direction/goal, and one of the requirements of the SSI is that a majority of students must take a battery of placement tests if they plan to enroll in English, reading, math, or ESL courses throughout the district. As stated in the MCC Catalog 09-10, under the current policy placement testing is strongly urged, but it is not yet mandatory for students to enroll in the English, reading, or math course(s) indicated by their placement testing outcomes. It is our understanding, however, that the MCC Catalog 10-11 will reflect changes to the placement policy making it mandatory for students to enroll in the courses indicated by their placement testing outcomes. It is in this context that the district's Course Placement Council (CIC) recently asked the English Instructional Council (EIC) to investigate a more valid and reliable placement instrument that would place students more accurately and successfully than the current English placement instruments, which are strictly multiple choice tests and thus indirect measures of writing competence/ability. In addition to this, there has been for some time a strong desire from within the MCC English Department itself to investigate a more theoretically and practically valid and reliable placement instrument with which to place students more accurately and, thus, promote their greater academic success.

MCC's Current English Placement Practices

Currently, the MCC Testing Center's official practice is to use one of three multiple-choice tests for English placement: ACT's ASSET, ACT's COMPASS, and The College Board's ACCUPLACER. District cut scores are set by the English Instructional Council. Every three years, the EIC reviews these scores based on statistical data regarding retention and pass rates. ASSET is rarely used today for a variety of reasons, including the fact that it isn't adaptive (i.e., it has a set number of questions, 34-36 for English) and is a pencil-and-paper examination. The Testing Center had been using COMPASS for at least the last ten years, but about one year ago the director began using ACCUPLACER, primarily because it is adaptive to student performance, web-based, and easier to set up in computer labs across campus when the large volume of students testing exceeds the physical confines of the Testing Center's limited space. Although placement testing occurs year-round, the Testing Center administers the majority of tests, approximately 9,000, right before the Fall semester begins. As you are aware, for a long time students have been able to re-test in one subject area of their choice (e.g., English, reading, math, or ESL), and this is still the case. It is also true that for a long time students have had the option to appeal the placement outcome based on their scores. While there is still an option for students to appeal their placement outcome, the wording in the district guidelines has recently been changed. The old language implied that all the student had to do was ask for a placement waiver from the appropriate department chair. The updated language of the district guidelines indicates that the decision of whether to sign the placement waiver belongs to the chair of the requisite department and that additional testing may be required. Without clear guidelines, some chairs would waive the score and allow students to enroll in a course above where they tested. This practice has rightly raised concerns among college and district administrators that bumping students to a course above placement sets them up for academic failure.

Proposed Action

So as not to impact registration and enrollment practices that are already in place, we propose that students in our pilot study take the established ACCUPLACER placement test before the start of the Fall 2010 semester. Prior to students taking ACCUPLACER, eight full-time faculty with experience teaching all four relevant courses (ENG 071, 081, 091, and 101) will be identified and briefed on the pilot study during the summer of 2010.
Faculty who agree to participate will allow two ASU graduate student researchers, Emily Hooper and Yazmin Lazcano, to give a brief introduction/overview of the study to their courses during the first week of classes, which begin on August 21, 2010. Participating faculty members will also agree to a 45-minute individual interview with either Emily or Yazmin. For their time, students will be given a ten-dollar gift card to iTunes. In addition to this, self-selecting students who agree to an additional 20-minute interview with either Emily or Yazmin will be given an opportunity to receive some sort of extra credit in their course.

Students in each course (ENG 071, 081, 091, 101) will be divided into two groups, A and B, for the purpose of staggering which test--either WritePlacer or e-Write--they will take in weeks two and three of the semester. During week two, students in A groups will take e-Write and students in B groups will take WritePlacer in the MCC Testing Center at their individual convenience. The following week, A groups will take WritePlacer, while B groups will take e-Write.
Having the same group of students take all three placement tests will yield data offering a comparative view of performance across an indirect and two direct assessments, as well as across two different computer-scored essay placement tests. To supplement this data set, qualitative data will be collected in the form of a brief survey for all participating students and interviews with both faculty and a smaller subset of student participants. This qualitative data will provide insight into the perceptions of the fairness and reliability of each placement test taken. Finally, researchers will collect quantitative data regarding passing and retention rates of students in the pilot for comparison to a control group of students outside the pilot.
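
Since the staggered A/B design is the crux of the pilot, here is a minimal Python sketch of the counterbalanced schedule. It is illustrative only: the random split, the seed, and the roster names are assumptions made for the sketch, not part of the proposal.

    # Sketch: counterbalanced (crossover) test schedule for one pilot section.
    # Assumes a simple random split of the roster; names are hypothetical.
    import random

    TESTS_BY_WEEK = {
        "Week 2": {"A": "e-Write", "B": "WritePlacer"},
        "Week 3": {"A": "WritePlacer", "B": "e-Write"},
    }

    def split_roster(students, seed=2010):
        """Randomly halve a roster into groups A and B."""
        rng = random.Random(seed)
        shuffled = students[:]
        rng.shuffle(shuffled)
        mid = len(shuffled) // 2
        return {"A": shuffled[:mid], "B": shuffled[mid:]}

    roster = [f"student_{i:02d}" for i in range(1, 23)]  # hypothetical 22-student class
    groups = split_roster(roster)

    for week, assignment in TESTS_BY_WEEK.items():
        for group, test in assignment.items():
            print(f"{week}: group {group} ({len(groups[group])} students) takes {test}")

Every student ends up taking both WritePlacer and e-Write, once each, with the order counterbalanced across the two groups, so order effects such as fatigue or practice should wash out when comparing the two computer-scored tests.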


Resources Needed
At the beginning of March, we submitted an application for a 2010-2011 Learning Grant with which to run the proposed pilot study. We should receive news of a decision regarding this grant within the next couple of weeks. The table below accounts for how we plan to allocate the funds from the Learning Grant.

Resources

Description and Amount/Qty/Duration

Money

Student stipend: iTunes gift card ($10/student X 176 students) $1,760
The College Board’s ACCUPLACER WritePlacer essay component ($4/unit) $704
ACT’s COMPASS e-Write essay component ($5/unit) $880

People

Full-time MCC English Department faculty -- 8
MCC students from 2 sections each of ENG 071, 081, 091, and 101 (22/class) -- 176
Self-selecting student population for interviews -- TBD (ideally 16+)
ASU PhD student researchers (Emily Hooper & Yazmin Lazcano) -- 2

Time

Class time (20 minutes/class on First Day of Classes) 20 minutes X 8 classes = ~2.7 hours total
Student testing (Weeks 2 and 3) 2 hours/student X 176 students = 352 hours total
Self-selecting student interviews (Week 4) 20 minutes/interview X 1 interview/student X 16 students = ~5.3 hours total
Faculty interviews (Week 4) 45 minutes/interview X 8 faculty members = 6 hours total
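
As a quick sanity check on the estimates above, the short Python sketch below recomputes the per-unit costs and time totals from the counts in this section; the numbers are copied from these tables, not official price quotes.

    # Sketch: sanity-check the cost and time estimates for the pilot.
    STUDENTS = 176      # 8 sections x 22 students
    CLASSES = 8
    FACULTY = 8
    INTERVIEWEES = 16   # self-selecting students (ideally 16+)

    costs = {
        "iTunes gift cards ($10/student)": 10 * STUDENTS,  # $1,760
        "WritePlacer units ($4/unit)": 4 * STUDENTS,        # $704
        "e-Write units ($5/unit)": 5 * STUDENTS,            # $880
    }
    hours = {
        "First-day class visits": 20 * CLASSES / 60,   # ~2.7 hours
        "Student testing": 2 * STUDENTS,               # 352 hours
        "Student interviews": 20 * INTERVIEWEES / 60,  # ~5.3 hours
        "Faculty interviews": 45 * FACULTY / 60,       # 6 hours
    }
    for label, dollars in costs.items():
        print(f"{label}: ${dollars:,}")
    for label, hrs in hours.items():
        print(f"{label}: {hrs:.1f} hours")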

Facility

MCC Testing Center computer lab Aug. 28-Sept. 10




Calendar

Summer Semester 2010

• June 2010: Identify 8 full-time faculty with experience teaching ENG 071-101. Contact and meet with them to discuss placement study design plan.
Fall Semester 2010

• August 21, 2010 (Saturday) – First day of classes

• Week 1: August 21-27

o Emily and Yazmin visit participating courses and introduce the study to students

• Week 2: August 28-September 3

o All group A students take e-Write and all group B students take WritePlacer in the Testing Center at their convenience. All students answer brief online survey (see Appendix A) directly following testing while they are still at the Testing Center. Emily and Yazmin provide faculty with interview protocol questions via e-mail.

• Week 3: September 6-10

o All group A students take WritePlacer and all group B students take e-Write in the Testing Center at their convenience. All students answer brief online survey (see Appendix A) directly following testing while they are still at the Testing Center.

• Week 4: September 11-17

o Conduct individual 45-minute interviews with 8 faculty members after their students have taken both placement tests. (Emily conducts 4. Yazmin conducts 4. Both use audio recordings.)

o Conduct individual 20-minute interviews with self-selecting student population based on extra credit offer. (Hopefully 16+, approximately 2 from each of the 8 classes.)
• Weeks 5-16: September 18 - December 11

o Preliminary data analysis

• Last Day of Class: December 12 (Sunday)

• Last Day of Fall Semester 2010: December 17 (Friday)

Spring Semester 2011:

• Continued data analysis and reporting of findings.

• May 1, 2011: Final Report due to Office of Vice Chancellor of Academic and Student Affairs.


Appendix A

Interview protocols include adapted questions from:

Herrington, A., & Moran, C. (2006). WritePlacer Plus in place: An exploratory case study. In P. F. Ericsson & R. H. Haswell (Eds.), Machine scoring of student essays: Truth and consequences (pp. 114-129). Logan, UT: Utah State University Press.

Interview Protocol for Faculty

Week 4 Faculty Interview

1. Have you tried the current placement system, ACCUPLACER, yourself?

2. How satisfied are you with using ACCUPLACER as a placement system? Do you believe the students in your class were placed accurately by ACCUPLACER?

3. Do you feel there is a connection between the placement system with ACCUPLACER and the curriculum? If yes, what is it? If no, why not?

4. What is the best thing about ACCUPLACER?

5. What is the worst thing about ACCUPLACER?

6. Have you tried WritePlacer and e-Write for yourself? If so, what was it like to write an essay online to be evaluated by a computer program?

7. Are you generally satisfied with the writing-placement process that includes WritePlacer and e-Write? (Do you think it is a fair way of evaluating students' writing for placement? If given a choice, would you prefer to have a person or the computer program evaluate your students' placement essays, or doesn't it matter? Why?)

8. Do you think a computer looks for different things when evaluating your writing for placement than a person would?

9. What is the best thing about WritePlacer/e-Write?

10. What is the worst thing about WritePlacer/e-Write?

11. Do you feel there is a connection between the placement system with WritePlacer/e-Write and the curriculum? If yes, what is it? If no, why not? (Explicate with course competencies)

Interview Protocol for Students

Weeks 2 and 3 Student Survey: WritePlacer

Background Information
• Gender: Female ___ Male ___ Transgender ___
• Age: under 18___ 18-22___ 23-35___ over 35___
• When were you last in school prior to enrolling at MCC?
• When did you last take an English class? Where?

Short Answer Questions

1. What was it like to write your placement essay online with WritePlacer?

2. What do you think WritePlacer is reading for when it evaluates your writing? That is, what aspects of your writing do you think it's considering when evaluating it? (Do you think a computer looks for different things when evaluating your writing for placement than a person would?)

3. Does it matter to you whether your teacher or a program reads your writing? Why?

4. Do you think that the computer program will be fair to you in evaluating your essay for placement? (Do you think a person would be more fair?) (If given a choice, would you prefer to have a person or the computer program evaluate your writing, or doesn't it matter? Why?)

Weeks 2 and 3 Student Survey: e-Write

Background Information
• Gender: Female ___ Male ___ Transgender ___
• Age: under 18___ 18-22___ 23-35___ over 35___
• When were you last in school prior to enrolling at MCC?
• When did you last take an English class? Where?

Short Answer Questions

1. What was it like to write your placement essay online with e-Write?

2. What do you think e-Write is reading for when it evaluates your writing? That is, what aspects of your writing do you think it's considering when evaluating it? (Do you think a computer looks for different things when evaluating your writing for placement than a person would?)

3. Does it matter to you whether your teacher or a program reads your writing? Why?

4. Do you think that the computer program will be fair to you in evaluating your essay for placement? (Do you think a person would be more fair?) (If given a choice, would you prefer to have a person or the computer program evaluate your writing, or doesn't it matter? Why?)

Week 4 Student Interview

1. At this point in the semester, what is your perception of how you are meeting the number X course competency for ENG 071/081/091/101? (At this point, the interviewer will share a list of the appropriate MCC course competencies with the student and question the student about each course competency individually. For ENG 071 and 101, there are seven course competencies each. For ENG 081 and 091, there are ten course competencies each.)

2. At this point in the semester, what is your overall perception of how you are performing in ENG 071/081/091/101?

3. You were placed into your current course by ACCUPLACER. Do you believe this placement decision was accurate? Did this placement decision set you up for academic success?

4. Your WritePlacer score would have placed you into ENG 071/081/091/101. Do you believe this placement decision would have been accurate? More accurate/less accurate/or about the same level of accuracy as the ACCUPLACER placement decision that was actually used?

5. Would the WritePlacer placement decision have set you up for academic success?

6. Your e-Write score would have placed you into ENG 071/081/091/101. Do you believe this placement decision would have been accurate? More accurate/less accurate/or about the same level of accuracy as the ACCUPLACER placement decision that was actually used?

7. Would the e-Write placement decision have set you up for academic success?




Appendix B

MCCCD Official Course Competencies ENG 071: Language Skills: Speaking and Writing Standard English
1. Generate grammatically correct simple, compound, and complex sentences.
2. Brainstorm, develop coherent sentences that can be organized into thought groups.
3. Revise sentence fragments into complete sentences.
4. Proofread and edit written work to correct errors in punctuation, spelling, and usage.
5. Organize and present an oral report based on library skills.
6. Use library skills to locate and gather information to organize and present in an oral report.
7. Use a computer to generate written text.

http://www.maricopa.edu/curriculum/D-L/096eng071.html

MCCCD Official Course Competencies ENG 081: Basic Writing Skills
1. Describe the contextual nature of writing, including the importance of circumstance, purpose, topic, audience and writer.
2. Organize writing to support a central idea through unity, coherence, and logical development.
3. Use conventions in writing complete sentences, using appropriate grammar, and using mechanics.
4. Use conventions in writing, including consistent voice, tone, and diction.
5. Recognize effective and appropriate ideas.
6. Craft a variety of sentence types.
7. Recognize and implement steps in the writing process for sentence and paragraph projects, including prewriting, drafting, and editing for unity and coherence.
8. Use feedback obtained from peer review, instructor comments and/or other resources to revise writing.
9. Assess one's own writing strengths and identify strategies for improvement through instructor conference, portfolio review, written evaluation, and/or other methods.
10. Generate, format, edit, and deliver writing using appropriate technology.

http://www.maricopa.edu/curriculum/D-L/096eng081.html

MCCCD Official Course Competencies ENG 091: Fundamentals of Writing
1. Recognize how rhetorical contexts (including circumstance, purpose, topic, audience and writer) affect writing.
2. Organize writing to support a central idea through unity, coherence, and logical development.
3. Use conventions in writing complete sentences, using appropriate grammar, and using mechanics.
4. Use conventions in writing, including consistent voice, tone, and diction.
5. Generate and support effective and appropriate ideas.
6. Integrate a variety of sentence types.
7. Recognize and implement steps in the writing process for paragraphs and multi-paragraph projects, including prewriting, drafting, and editing for unity and coherence.
8. Use feedback obtained from peer review, instructor comments and/or other resources to revise writing.
9. Assess one's own writing strengths and identify strategies for improvement through instructor conference, portfolio review, written evaluation, and/or other methods.
10. Generate, format, edit, and deliver writing using appropriate technology.

http://www.maricopa.edu/curriculum/D-L/096eng091.html

MCCCD Official Course Competencies ENG 101: First-Year Composition
1. Analyze specific rhetorical contexts, including circumstance, purpose, topic, audience, and writer, as well as the writing's ethical, political, and cultural implications.
2. Organize writing to support a central idea through unity, coherence, and logical development appropriate to a specific writing context.
3. Use appropriate conventions in writing, including consistent voice, tone, diction, grammar, and mechanics.
4. Summarize, paraphrase and quote from sources to maintain academic integrity and to develop and support one's own ideas.
5. Use feedback obtained from peer review, instructor comments and/or other resources to revise writing.
6. Assess one's own writing strengths and identify strategies for improvement through instructor conference, portfolio review, written evaluation, and/or other methods.
7. Generate, format, and edit writing using appropriate technologies.

http://www.maricopa.edu/curriculum/D-L/096eng101.html

Tuesday, August 31, 2010

What Works--Hunter Boylan's recommendations

Hi Team
I reviewed the text What Works: Research-Based Best Practices in Dev. Ed. to see what comments were made regarding placement.
This meta-analysis shows that, for developmental students at least, mandatory assessment and placement contribute to student success. (Though he notes that poor instruction/drill 'n kill in dev ed can negate the benefits of mandatory placement.)
He notes that "the most common placement tests used by community colleges are the ASSET and the Compass produced by ACT, and the Computerized Placement Test produced by the Education Testing Services."
He does briefly discuss testing measures, saying "Others are reluctant to require students to participate in developmental education because they believe that assessment test results are not completely accurate. This . . . is unsupported by any research. There is, of course, a range of measurement error with any placement instrument. But for most of the scientifically developed and better-validated instruments, this error range is around 5%. Roueche and Roueche (1999) note that placement tests 'may not be accurate without a doubt, (but) the more common tests are valid indicators that students have a problem.'"
Boylan gives the following tips:
*Faculty and staff involved should take whatever placement tests the college uses, to become familiar with them.
*Advisors should be required to read the placement instruments' manual so that they understand what the scores mean.
*Even if testing is "mandatory," some students will "slip through the cracks."
*Students should be familiarized with how to take the test, so placement is not just based on test-taking skill.
*Students should be allowed to 'challenge' the assessment result, since no instrument is 100% accurate for students. This also helps students feel that they are being treated fairly.

So he does not come out and recommend one test over another. He does go on to argue for systematic program evaluation, though not specifically for placement test accuracy.

Anyway, I thought I'd share the key points from this resource. There is also the LSCHE website (Learning Centers) that Frank Christ referred to in a presentation last summer. It has lots of handy resources, so I will check that out in case anything is helpful for our project.

Cheers
Christine Helfers