
MARK FILOWITZ

INCLUDES: (STEM)³: Scaling (STEM)²
California State University, Fullerton; California State Polytechnic University, Pomona; Citrus College
Public Discussion
  • Mia Ong

    Facilitator
    Senior Research Scientist
    March 20, 2017 | 10:01 a.m.

    Hi Mark! I found your video to be very engaging and informative. I loved that it included multiple viewpoints about the program, by students, faculty, and an administrator. The goal of your INCLUDES project -- to scale an already-successful transfer program from two-year to four-year colleges for URM students in STEM -- is truly important and timely. For the first phase of scaling (to 6 Cal State institutions), how many community colleges, and how many URM community college students, will be involved? What methods will you use to measure impact, and how will you know the project will be successful? Thanks. 

  • Mark Filowitz

    Lead Presenter
    March 22, 2017 | 11:11 a.m.

    Hello Mia, the initial goal will be to form a southern California alliance with 6 CSUs. A number of those CSUs have active programs with the same aims, and one of the key elements we are studying in this grant is how we can bridge different approaches and cultures across our campuses. Cal State Fullerton has been successful in establishing itself as a center of excellence for supplemental instruction, so we have a good track record in establishing such central networks. Of course, implementation is not trivial. Each of the CSU campuses serves anywhere from 2 to 6 community colleges, with some overlap, so at best we can hope to form a regional alliance of 7 CSUs serving up to 25 community colleges in our region. Arroyo Research Services will conduct the external evaluation of the (STEM)3 program. Arroyo has extensive experience in education program evaluation and research, with particular strengths in evaluating STEM-related work.

    The primary goals of the evaluation will be to determine: 1) the quality and fidelity of program development and implementation, 2) the effectiveness of the new pilot in terms of student outcomes at participating campuses, and 3) the impact of the program for participating students compared to nonparticipating students. These goals will be explored through implementation, outcome, and impact evaluation phases.

    Implementation evaluation: We will monitor the development and implementation of (STEM)3 with partnering institutions. This will consist of employing needs assessments with participating campuses to determine if there are any unique needs that should be addressed in the partnering institution’s program. We will also collect meeting minutes and other project documents (e.g., program objectives, outreach material), and survey developing partners regarding their perception, satisfaction, and program consensus and capacity (i.e., ability to implement the program and its various components).

    Once the program is developed, we will monitor short-term and formative results, validate program components, and determine whether implemented activities are of the quality and intensity necessary to influence student outcomes. Implementation data collection will consist of gathering program and outreach material; tracking attendance and session data; conducting site visits and observations; and surveying faculty, staff (e.g., mentors, counselors), and students.

    Quantitative data from these collections and evaluation activities will be analyzed using descriptive and inferential statistics, whereas qualitative data will be explored using an inductive coding process for general themes and patterns. Results will be used to determine how program components are being developed and implemented, the nature and intensity of student and faculty participation, and the quality and perceived value of activities.

    Outcome evaluation: Arroyo will employ a number of measures and evaluation activities to assess program outcomes beginning in Year 2. Specific outcomes we will seek to measure include students’ STEM self-efficacy, STEM major declaration, two-year college (TYC) to four-year institution transfers, and other milestones toward STEM bachelor’s degree attainment (e.g., enrolling in and attaining credit in STEM gatekeeper courses, 24 degree credits in the first transfer year). Data will be collected from partnering campuses’ institutional research offices and from project-developed surveys administered throughout students’ program participation.

    The resulting data will be analyzed using descriptive, inferential, and correlational (e.g., regression) statistics. Matched data from implementation and outcome evaluation activities will be used to determine the influence of program participation, the degree of program quality, and the influence of other program components on student outcomes.

    Impact evaluation: Arroyo will employ a quasi-experimental study comparing outcomes for participating students to those of a similar group of non-participating students over the grant period. We will conduct a student-level propensity score matching analysis to select non-participating students, working with the research team to determine the student matching characteristics of interest. Analysis will then include a regression model in which relevant outcomes are modeled as a function of student-level factors. To adjust for any possible baseline differences in performance between the groups, we may include prior performance as a covariate in the model. Additional inferential statistics (e.g., t-tests, chi-square tests, analyses of variance) may be performed if warranted by initial exploratory analysis.
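    The matching step described above can be sketched in a few lines. This is a minimal illustration, not the evaluators' actual code: the propensity scores below are invented numbers, and in practice they would come from a model (e.g., logistic regression) of program participation on the matching characteristics chosen with the research team.

```python
def match_nearest(participants, pool):
    """Greedily pair each participating student with the unmatched
    comparison student whose propensity score is closest.
    Both arguments are dicts of {student_id: propensity_score}."""
    available = dict(pool)
    matches = {}
    # process participants in score order so ties resolve deterministically
    for sid, score in sorted(participants.items(), key=lambda kv: kv[1]):
        if not available:
            break
        best = min(available, key=lambda cid: abs(available[cid] - score))
        matches[sid] = best
        del available[best]  # match without replacement
    return matches

# Hypothetical scores for three participants and four comparison students
participants = {"p1": 0.62, "p2": 0.35, "p3": 0.80}
pool = {"c1": 0.33, "c2": 0.61, "c3": 0.79, "c4": 0.50}
print(match_nearest(participants, pool))
```

    Outcome regressions with a prior-performance covariate would then be run on the matched pairs; many design choices (caliper width, matching with replacement, score model specification) are left open here.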

  • Mia Ong

    Facilitator
    Senior Research Scientist
    March 23, 2017 | 05:43 p.m.

    Hi Mark! Thank you for your thoughtful and thorough answers. As your project continues, I'd love to learn more about how you bridge the different "approaches and cultures" of the campuses included in your project. I'm glad to see that you have an external evaluator who has STEM education experience, though I'd like to have seen more detail about qualitative interviews and focus groups with students, faculty, and staff. I hope they will be able to capture some of the nuances you are seeking to explore.

  • Mark Filowitz

    Lead Presenter
    March 23, 2017 | 06:50 p.m.

    The research element of this project will be carried out by the CSUF CATALYST Center for Research on Science and Mathematics Education in collaboration with an external project evaluator, following the model used in support of CSUF’s Supplemental Instruction (SI) program [Emenike 2012, Filowitz 2012]. CATALYST research teams performed qualitative and quantitative studies including interviews of SI leaders, participants, and faculty as well as analysis of conceptual and attitudinal surveys.

    Arroyo Research Services will monitor the development and implementation of (STEM)3 with partnering institutions.  They will monitor short-term and formative results, validate program components, and determine whether implemented activities are of the quality and intensity necessary to influence student outcomes. Implementation data collection will consist of gathering program and outreach material; tracking attendance or session data; conducting site visits and observations; and surveying faculty, staff (e.g., mentors, counselors), and students.

    Quantitative data from these collections and evaluation activities will be analyzed using descriptive and inferential statistics, whereas qualitative data will be explored using an inductive coding process for general themes and patterns. Results will be used to determine how program components are being developed and implemented, the nature and intensity of student and faculty participation, and the quality and perceived value of activities.

    Arroyo will employ a number of measures and evaluation activities to assess program outcomes. Specific outcomes to measure include students’ STEM self-efficacy, STEM major declaration, TYC to four-year institution transfers, and other milestones toward STEM bachelor’s degree attainment (e.g., enrolling in and attaining credit in STEM gatekeeper courses, 24 degree credits in the first transfer year). Data will be collected from partnering campuses’ institutional research offices and from project-developed surveys administered throughout students’ program participation.

    The resulting data will be analyzed using descriptive, inferential, and correlational (e.g., regression) statistics. Matched data from implementation and outcome evaluation activities will be used to determine the influence of program participation, the degree of program quality, and the influence of other program components on student outcomes.

    Arroyo will employ a quasi-experimental study comparing outcomes for participating students to those of a similar group of non-participating students over the grant period. We will conduct a student-level propensity score matching analysis to select non-participating students, working with the research team to determine the student matching characteristics of interest. Analysis will then include a regression model in which relevant outcomes are modeled as a function of student-level factors. To adjust for any possible baseline differences in performance between the groups, we may include prior performance as a covariate in the model. Additional inferential statistics (e.g., t-tests, chi-square tests, analyses of variance) may be performed if warranted by initial exploratory analysis.

     

  • Jeanne Century

    Facilitator
    Director/Research Associate Professor
    March 20, 2017 | 12:23 p.m.

    Hi Mark - it was great to hear about the success of the peer-mentor idea from those who have experienced it already. Mia's question pointed to what success will look like moving forward; I'm interested in hearing more about the metrics you used to measure success of the S2 project and how you might continue to use those metrics (or different ones) as you move forward.


    Thanks, Jeanne

  • Mark Filowitz

    Lead Presenter
    March 22, 2017 | 11:10 a.m.

    Hello Jeanne

    Arroyo Research Services will conduct the external evaluation of the (STEM)3 program. Arroyo has extensive experience in education program evaluation and research, with particular strengths in evaluating STEM-related work.

    We will monitor the development and implementation of (STEM)3 with partnering institutions.  This will consist of employing needs assessments with participating campuses to determine if there are any unique needs that should be addressed in the partnering institution’s program.  We will also collect meeting minutes and other project documents (e.g., program objectives, outreach material), and survey developing partners regarding their perception, satisfaction, and program consensus and capacity (i.e., ability to implement the program and its various components).

    Once the program is developed, we will monitor short-term and formative results, validate program components, and determine whether implemented activities are of the quality and intensity necessary to influence student outcomes. Implementation data collection will consist of gathering program and outreach material; tracking attendance and session data; conducting site visits and observations; and surveying faculty, staff (e.g., mentors, counselors), and students.

    Quantitative data from these collections and evaluation activities will be analyzed using descriptive and inferential statistics, whereas qualitative data will be explored using an inductive coding process for general themes and patterns. Results will be used to determine how program components are being developed and implemented, the nature and intensity of student and faculty participation, and the quality and perceived value of activities.

    Outcome evaluation: Arroyo will employ a number of measures and evaluation activities to assess program outcomes beginning in Year 2. Specific outcomes we will seek to measure include students’ STEM self-efficacy, STEM major declaration, TYC to four-year institution transfers, and other milestones toward STEM bachelor’s degree attainment (e.g., enrolling in and attaining credit in STEM gatekeeper courses, 24 degree credits in the first transfer year). Data will be collected from partnering campuses’ institutional research offices and from project-developed surveys administered throughout students’ program participation.

    The resulting data will be analyzed using descriptive, inferential, and correlational (e.g., regression) statistics. Matched data from implementation and outcome evaluation activities will be used to determine the influence of program participation, the degree of program quality, and the influence of other program components on student outcomes.

    Impact evaluation: Arroyo will employ a quasi-experimental study comparing outcomes for participating students to those of a similar group of non-participating students over the grant period. We will conduct a student-level propensity score matching analysis to select non-participating students, working with the research team to determine the student matching characteristics of interest. Analysis will then include a regression model in which relevant outcomes are modeled as a function of student-level factors. To adjust for any possible baseline differences in performance between the groups, we may include prior performance as a covariate in the model. Additional inferential statistics (e.g., t-tests, chi-square tests, analyses of variance) may be performed if warranted by initial exploratory analysis.

  • Jeanne Century

    Facilitator
    Director/Research Associate Professor
    March 24, 2017 | 08:44 a.m.

    Thanks for this detailed response, Mark. If you have the info handy, I'd love to know what instrument Arroyo will be using to measure STEM self-efficacy. We have found in the past that "STEM" self-efficacy falls out into distinct constructs by discipline (science, technology, engineering, and mathematics); in other words, what students feel about science may be very different from what they feel about mathematics. These were high school youth, though, so it may be different for your population. Do you have existing reports of any kind that are publicly available about your previous S2 project? Thanks.

  • Mark Filowitz

    Lead Presenter
    March 24, 2017 | 10:37 a.m.

    Hello Jeanne

    Here is the executive summary of the final report for (STEM)^2.

    This Year 5 evaluation summary highlights the fifth and final year of findings from the CSUF Strengthening Transfer Education & Matriculation in STEM, (STEM)2, project. The evaluation monitored progress toward program objectives, indicators of progress, and process objectives, using a comprehensive approach with quantitative and qualitative methods. The external evaluator, Dr. Karen Kim, worked closely with the (STEM)2 project director, Dr. Maria Dela Cruz, and staff to ensure that all program components were evaluated and to provide feedback on the (STEM)2 program. This report highlights the evaluation of the Summer Research Experience, the Academic Transition Program, the Peer Mentor program, the community college cohort activities, the College Education Planner, and the overall partnership.

     

    CSUF Summer Research Experience (SRE)

    The comprehensive evaluation of the SRE program included pre/post student surveys and focus groups, journal reflections, mentor surveys, and participant observation. The findings show that the program staff implemented a highly effective program, making positive improvements based on the recommendations of previous years, especially in the areas of community building and the peer mentor one-on-one meetings. While students provided positive feedback about all aspects of the summer research experience, the program components rated most highly by students were (1) the overall program (M=1.18), (2) the overall lab experience (M=1.48), and (3) the opening night reception at Dr. Filowitz’s house (M=1.48). Participating students consistently reported positive feedback about their experience in the program. In the initial design of the SRE program, the intention was to invite students to return for a second summer when possible; this year, 10 of the 39 students who completed the pre survey were returning SRE participants. In addition, several peer mentors had been part of the program in previous years, either reprising their roles as peer mentors or transitioning into that role after having participated in the SRE program as community college students.

     

    Data analysis of the pre/post surveys also indicates positive findings in the participating students’ perceptions about the program and themselves. A majority of students identified positive impacts from participating in the program, reporting greater science and math ability, problem solving, and interpersonal skills. Paired samples t-tests show significant perceived increases (i.e., differences at p < .05 between the pre and post tests) on several skills and traits compared to the average STEM student. The six skills with the most notable positive changes were:

     

    • Computer programming skills
    • Ability to apply skills and concepts to solving problems
    • Capacity to carry out own investigations and inquiries
    • Ability to explain scientific concepts to others
    • Familiarity with scientific techniques and instrumentation
    • Time management skills
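    The paired-samples t-test mentioned above can be sketched as follows. This is an illustrative example only: the pre/post ratings are invented, not (STEM)² data, and the evaluation team would have used a statistical package rather than hand computation.

```python
import math

def paired_t(pre, post):
    """Return the paired-samples t statistic for pre/post scores
    from the same students: mean difference over its standard error."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)

# Hypothetical self-ratings (1-5 scale) for eight students, pre and post
pre  = [2, 3, 2, 4, 3, 2, 3, 2]
post = [4, 4, 3, 5, 4, 3, 4, 3]
t = paired_t(pre, post)
# with n-1 = 7 degrees of freedom, the two-tailed critical value at
# p = .05 is about 2.365; |t| above that indicates a significant change
print(round(t, 2), abs(t) > 2.365)
```

    In practice the p-value would be read from the t distribution (e.g., `scipy.stats.ttest_rel`) rather than compared against a hardcoded critical value.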

     

    In the surveys, students stated that their experience in the program taught them discipline-specific research methods and more general life skills, and that they often gained greater determination to pursue their education and career goals. In fact, on the post-program survey, 97.5% of students reported that they were “very likely” to transfer to a four-year institution, 92.5% planned to pursue other research experiences in STEM, and 87.5% planned to pursue a career in science. Overall, students felt that the program was highly beneficial: 92.5% of students “strongly agreed” that participating in the (STEM)² program will help them to pursue their goals in a STEM field. The vast majority of participating students had positive feedback about the program and believed strongly that it will help them accomplish their goals in the future. As in previous years, the shared appreciation of the program was especially apparent at the observed concluding poster session, which was well attended by participating students and their families, faculty and student mentors, and staff and administrators from each of the partnering institutions. Overall, the evaluation shows that the SRE program is a central component of (STEM)² that benefits students, faculty, and each of the partnering campuses.

     

    (STEM)² Academic Transition Program (ATP) at CSUF

    Evaluation of ATP for Year 5 includes pre/post student surveys for students in the Fall 2015 and Spring 2016 cohorts. Like the SRE program, the (STEM)² Academic Transition Program (ATP) is seen as highly beneficial by participating students. The ATP evaluation shows that the program provides critical support to students as they transition from the community college to CSUF in three primary ways:

     

    • Learning about the campus and its resources
    • Working with peer mentors
    • Providing motivation and encouragement

    In addition to helping students become more familiar with the campus, gain useful skills, and meet people on campus, the ATP makes a positive difference in students’ experience and success at CSUF. A majority of ATP students feel that they become part of a community that serves as a support network for them. For many students, participation in ATP helps them make connections to CSUF by preparing them for the demands of their academics and by helping them develop a social support network outside of classes, as a student from the Spring 2016 cohort explains: “Being apart [sic] of the stem atp program has helped me ease into the university life much easier then i [sic] expected to. It has also helped me network with teachers and students, making study groups through with stem connections has defiantly [sic] made a difference in my grades.” These positive comments are also reflected in students’ responses to questions about their sense of belonging on campus, their confidence in their ability to accomplish their goals, and their belief that the program can help them realize those goals, as another student said: “The most useful way (STEM)² helped me was by gathering us all STEM majors together and being able to be with other students who are first timers here at CSUF. It created a friendly atmosphere and made becoming a new student an easier in this large of a college.” A majority of students expressed positive identification with the campus, confidence in their ability to accomplish their goals, and confidence in the (STEM)² program’s ability to help them on the pre-program survey, which was administered at the conclusion of the ATP Orientation. At the end of the semester, a majority of students continued to agree with these statements, especially the belief that participating in the (STEM)² program will help them to pursue their goals in a STEM field (e.g., all students agreed in Fall 2015 and 28 out of 30 students agreed in Spring 2016).

     

    The data consistently show that the ATP program provided valued support to students from their initial introduction to CSUF at the orientation through the end of the program. Pre-program surveys show that students had positive feedback about the orientation programs, most commonly identifying the following benefits: learning about resources; becoming more familiar with the CSUF campus; gaining skills that will help them as students; and meeting other students, peer mentors, faculty, and staff. Mentoring support through the academic year was also highly valued. For example, in Spring 2015, 71% rated the meeting with the mentor/outreach coordinator “Excellent” and 19% rated it “Very Good.” In addition, 71% rated the one-on-one meetings with peer mentors “Excellent” and 16% rated them “Very Good.”

     

    Post-program survey findings show that a majority of participating students are highly satisfied with the ATP program overall. In Fall 2015, 31% rated it “Excellent” and 63% rated it “Very good”; in Spring 2016, 74% rated it “Excellent” and 19% rated it “Very good.” Another consistent finding is that ATP participants are highly dedicated to their pursuit of studies in STEM. In both semesters, students rated themselves highest in their “motivation to achieve my education and career goals.” The next highest ratings were in “ability to deal with challenges” for Fall 2015 and in “enjoyment in studying scientific and/or technical topics” for Spring 2016. In both cohorts, a strong majority rated themselves as “above average” or “highest 10%” in their ability to deal with challenges, their enjoyment of STEM topics, their motivation to achieve, and their ability to work collaboratively. These findings are consistent with previous years’ programs, which show that these are major strengths of the students participating in the ATP. When asked to identify the three ways in which the program was most useful to them, students often mentioned aspects also noted in previous years, such as helping to find resources on campus, networking and meeting other STEM students, peer mentor advising, being encouraged to get involved, and financial support and priority registration. As in the previous year, students in both cohorts were most likely to mention the “peer mentors” as the most useful part of the program.

     

    Peer Mentor Program

    As discussed in the previous sections, the peer mentors are an important part of the (STEM)² program from the perspective of both SRE and ATP students. From the perspective of the peer mentors themselves, data from both the peer mentor surveys and the focus groups consistently show that the (STEM)² program is highly beneficial to the peer mentors as well. Like last year, peer mentors had strongly positive feedback about all aspects of the program, especially the “support by CSUF (STEM)² staff” and their “experience working as a peer mentor to fulfill other responsibilities,” which were both rated “excellent” by all peer mentors. The peer mentors’ open-ended responses in the survey, as well as their feedback in the focus group, further support their positive ratings. That is, peer mentors consistently point out that the peer mentor program is highly beneficial to them, both in terms of professional development as they mentor other students and in terms of personal development as they gain access to resources and develop skills that will help them in their own STEM studies. Peer mentoring also involves them in a community of other peer mentors and in the broader (STEM)² network, which they value highly.

     

    One possible reason for the additional strength of the peer mentor program is that all of the peer mentors had also attended a community college, with the majority (7 of the 8 who completed the survey) working as a peer mentor at the same community college they attended as students. All of the peer mentors had worked as student researchers in a lab at CSUF and had participated in STEM workshops to guide their own pathways. Six of the 8 had participated in the (STEM)² program as students (in SRE, ATP, or both). This prior experience helped peer mentors relate to the experiences of their mentees and provide them useful guidance. The peer mentors also become leaders in the (STEM)² community, welcoming their mentees into the community and guiding them along the way. This model has mutual benefits for the peer mentors, their mentees, and the (STEM)² community.

     

    Cohort community activities

    A thorough evaluation of (STEM)2 activities was carried out at each of the partnering institutions. This included student surveys administered to CSUF STEM transfer students and to STEM students at the community colleges. Overall, 44 students responded to the CSUF survey and 66 students responded to the community college survey (Citrus n=0, Cypress n=23, Santiago Canyon n=40). For the CSUF survey, 56% had heard about (STEM)2 (30% had participated and 26% had not yet participated), compared to 60% who had heard about the program last year and 49% the year before. Cross-tabulation analysis shows that 88% of students who had participated in (STEM)2 activities strongly agree or agree that (STEM)2 could help them to accomplish their goals in a STEM field, compared to 95% last year and 69% the year before. For the community college survey, 90% had heard about (STEM)2 (82% had participated and 9% had not yet participated), compared to 87% last year and 69% the previous year. Cross-tabulation analysis shows that all students who had participated in (STEM)2 activities strongly agree or agree that (STEM)2 could help them to accomplish their goals in a STEM field, compared to 95% last year and 84% the previous year. In both surveys, students who participate in the program are likely to agree that the program will help them to accomplish their goals, but these positive outcomes are even greater at the community colleges.
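    A cross-tabulation of this kind simply counts agreement broken out by participation status. The sketch below uses invented survey records, not the actual (STEM)² data, purely to show the shape of the computation.

```python
from collections import Counter

# Hypothetical (participation, agreement) records, one per respondent
responses = [
    ("participated", "agree"), ("participated", "agree"),
    ("participated", "agree"), ("participated", "disagree"),
    ("did_not", "agree"), ("did_not", "disagree"), ("did_not", "disagree"),
]

# Cross-tabulation: counts keyed by (participation, agreement) pairs
table = Counter(responses)

def pct_agree(group):
    """Percentage of the given participation group that agreed."""
    total = sum(v for (g, _), v in table.items() if g == group)
    return 100 * table[(group, "agree")] / total

print(f"participants agreeing: {pct_agree('participated'):.0f}%")
print(f"non-participants agreeing: {pct_agree('did_not'):.0f}%")
```

    With real survey files, the same table is one call to `pandas.crosstab`; the stdlib version is shown only to keep the example self-contained.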

     

    Monitoring of College Education Planner (CEP)

    The CEP made notable progress in the development of the interface and its use by each of the community colleges. Improvements and new features were presented and discussed at collaboration council meetings, and partnership surveys show that 71% agree that the CEP is an information technology infrastructure that supports student graduation and provides accurate student data on enrollment, persistence, and completion for tracking and effective advising.

     

    Progress continues to be made on the CEP, as evidenced by the further development of the interface and its use by each of the community colleges. Pilots conducted at SCC with students and counselors in Fall 2014 provided valuable feedback about how students and counselors use the CEP, including which aspects are most useful and what improvements could be made. This feedback informed refinements to the CEP, which continues to expand its features, functionality, and use at the partnering community colleges.

     

    Partnership Survey

    The final (STEM)² Partnership Survey was completed by 8 members of the leadership team. The findings show that all agreed that there was a perceived need for the program and a clear goal for the partnership, and 100% strongly agreed that “there was a shared understanding of, and commitment to, this goal among all potential partners.” All partners also strongly agreed that “there was an investment in the partnership in time, personnel, materials, or facilities” and that “There was a core group of skilled and committed (in terms of the partnership) staff that has continued since the beginning of the partnership.” This unanimous recognition of the common vision for (STEM)² and the commitment made by each of the partnering institutions helps to explain the strongly positive outcomes of the overall program, as evidenced by the data on the individual components presented in this report. All partners strongly agreed or agreed that “(STEM)² has increased the number of Hispanic or low-income students attaining degrees in the fields of science, technology, engineering, or mathematics,” and all strongly agreed that “The partnership demonstrated the outcomes of its collective work.” The respondents identified several accomplishments of the grant, including:

     

    The greatest thing that (STEM)2 accomplished was building the cohort model of students from the community college to CSUF, retaining students in the STEM majors, and preparing them to successfully complete their bachelor's degrees.

     

    Everyone involved in this grant initiative had a shared goal and vision. The collaboration among participating institutions has been wonderful and rewarding.

     

    The partnership survey clearly shows that (STEM)² was implemented by a core set of leaders from each partnering institution who dedicated time and resources to ensure the overall success of the program. These successes were observed through the effectiveness of the cohort model, which supported a community of students moving from the community colleges to CSUF, with programs such as the Summer Research Experience and the Academic Transition Program that helped students navigate their educational pathways and become part of the (STEM)² community. Ultimately, the success of the partnership helped the program realize its central goal: to increase the number of Hispanic and low-income students pursuing and receiving STEM degrees.

     

    Here is the evaluation plan for the current INCLUDES project.

    The primary goals of the evaluation will be to determine: 1) the quality and fidelity of program development and implementation, 2) the effectiveness of the new pilot in terms of student outcomes at participating campuses, and 3) the impact of the program for participating students compared to nonparticipating students. These goals will be explored through implementation, outcome, and impact evaluation phases.

    Implementation evaluation: Arroyo will assist with program development by working with the research team in determining key program elements and outcomes of (STEM)2 that should be incorporated in (STEM)3.  This will include synthesizing (STEM)2 data collection and report findings with recent literature and conceptual models on STEM education for underrepresented groups.

    Following this, we will monitor the development and implementation of (STEM)3 with partnering institutions.  This will consist of employing needs assessments with participating campuses to determine if there are any unique needs that should be addressed in the partnering institution’s program.  We will also collect meeting minutes and other project documents (e.g., program objectives, outreach material), and survey developing partners regarding their perceptions, satisfaction, and program consensus and capacity (i.e., ability to implement the program and its various components).

    Once developed, we will monitor short-term and formative results, validate program components, and determine whether implemented activities are of the quality and intensity necessary to influence student outcomes. Implementation data collection will consist of gathering program and outreach material; tracking attendance or session data; conducting site visits and observations; and surveying faculty, staff (e.g., mentors, counselors), and students.

    Quantitative data from these collections and evaluation activities will be analyzed using descriptive and inferential statistics; whereas qualitative data will be explored using an inductive coding process for general themes and patterns. Results will be used to determine: how program components are being developed and implemented, the nature and intensity of student and faculty participation, quality and perceived value/satisfaction of activities, etc.

    Outcome evaluation: Arroyo will employ a number of measures and evaluation activities to assess program outcomes beginning in Year 2. Specific outcomes we will seek to measure include students’ STEM self-efficacy, STEM major declaration, two-year college (TYC) to four-year institution transfers, and other milestones toward STEM bachelor’s degree attainment (e.g., enrolling and attaining credit in STEM gatekeeper courses, 24 degree credits in the first transfer year).  Data will be collected from partnering campuses’ institutional research offices and from developed surveys administered throughout students’ program participation.

    The resulting data will be analyzed using descriptive, inferential, and correlational (e.g., regression) statistics. Matched data from implementation and outcome evaluation activities will be used to determine the influence of program participation, the degree of program quality, and the influence of other program components on student outcomes.

    Impact evaluation: Arroyo will employ a quasi-experimental study that will compare outcomes from participating students to a similar group of non-participating students over the grant period. We will conduct a student-level propensity score matching analysis to select non-participating students by working with the research team to determine student matching characteristics of interest. Analysis will then include a regression model in which relevant outcomes are modeled as a function of student-level factors.  To equate for any possible baseline differences in performance between participant groups, we might include prior performance as a covariate in the model.  Additional inferential statistics (e.g., t-tests, chi-square statistics, analyses of variance) may be performed if warranted by initial exploratory analysis.
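    For readers unfamiliar with propensity score matching, the selection step above can be sketched as follows. This is only an illustrative toy, not Arroyo's actual analysis: the cohort data are synthetic, and the two covariates (stand-ins for, e.g., prior GPA and credits earned) are assumptions for the example, not the project's real matching characteristics.

```python
# Toy propensity-score-matching sketch: estimate P(participation | covariates)
# with a hand-rolled logistic regression, then pair each participant with the
# non-participant whose estimated score is closest (nearest-neighbor matching).
import numpy as np

rng = np.random.default_rng(0)

def fit_propensity(X, treated, lr=0.1, steps=2000):
    """Fit logistic regression by gradient descent; return propensity scores."""
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])  # add intercept column
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))          # predicted P(treated)
        w -= lr * Xb.T @ (p - treated) / len(treated)
    return 1.0 / (1.0 + np.exp(-Xb @ w))

def nearest_neighbor_match(scores, treated):
    """For each participant, select the non-participant with the closest score."""
    t_idx = np.where(treated == 1)[0]
    c_idx = np.where(treated == 0)[0]
    pairs = []
    for i in t_idx:
        j = c_idx[np.argmin(np.abs(scores[c_idx] - scores[i]))]
        pairs.append((i, j))                       # (participant, matched control)
    return pairs

# Synthetic cohort: 200 students, two illustrative covariates; participation
# probability depends on the first covariate, mimicking self-selection.
X = rng.normal(size=(200, 2))
treated = (rng.random(200) < 1.0 / (1.0 + np.exp(-X[:, 0]))).astype(float)
scores = fit_propensity(X, treated)
pairs = nearest_neighbor_match(scores, treated)
```

    After matching, outcomes for the paired groups can be compared in a regression that includes prior performance as a covariate, as described above.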

  • Icon for: Erin Sanders

    Erin Sanders

    Director
    March 20, 2017 | 02:04 p.m.

    Dear Mark,

    Thank you for sharing your video and project. Pairing research scientists at Cal State with undergraduate community college students, enabling students to engage in research when they otherwise may not have an opportunity, is a great model for promoting transfer and STEM retention.  UCLA has a similar program established with Santa Monica College.  As such, I appreciate your interest in understanding how to transport this model to new settings and documenting the elements essential to successful program implementation. Can you comment on any of these elements based on what you've done thus far or what you're measuring currently and will be used to inform a plan to successfully scale the program to include other 2-year and 4-year institutions?

    Many thanks,

    Erin

  • Icon for: Mark Filowitz

    Mark Filowitz

    Lead Presenter
    March 22, 2017 | 11:10 a.m.

    Hello Erin

    It may be too soon to evaluate results. Simple things like semesters vs. quarters need to be considered.

    The first year of funding will focus on consensus and capacity building and planning for implementation at Cal Poly Pomona (CPP), and this is well underway. Since the preparation of the pre-proposal we have taken steps toward completing some of these goals.  We have identified key personnel at CPP and Citrus College with responsibilities related to STEM students and transfer and have started to form partnerships; we have met with them and have our strategies in place to move ahead.  In the second year, when transfers from Citrus are at CPP, we will collect and analyze project data to judge impact.

  • Icon for: Janice Jackson

    Janice Jackson

    Facilitator
    education consultant
    March 22, 2017 | 12:11 a.m.

    Mark, this is a fascinating project.  I agree with others that it was interesting to hear from students, mentors, and faculty.  One of the faculty members mentioned that the program leaders are interested in understanding the elements of the program and what can be transferred elsewhere.  What would you say the essential elements are?

    I wish you the best with the hands-on project.

  • Icon for: Mark Filowitz

    Mark Filowitz

    Lead Presenter
    March 22, 2017 | 11:08 a.m.

    Hello Janice

    Here are the key elements.

    1. Peer mentors who work with STEM students at both the community college and the four-year university

    2. An undergraduate research experience at the four-year university the summer prior to transfer

    3. Assistance at the community college with the application process to the four-year university

    4. A STEM transfer resource center at both the community college and the four-year university, for assistance and a sense of community

    5. Use of free software called Transferology for best articulation of courses

    Hope this helps.

  • Icon for: Janice Jackson

    Janice Jackson

    Facilitator
    education consultant
    March 26, 2017 | 12:01 a.m.

    Mark, thanks for your answer.  It sounds like you've thought through details about supporting the students once they transfer.  This is an intriguing project.  I hope it leads to the results you are looking for.  Your findings could be useful beyond the collaborating institutions.

    Janice

  • Icon for: Janice Jackson

    Janice Jackson

    Facilitator
    education consultant
    March 27, 2017 | 11:34 p.m.

    Mark, as the video event closes out I wish your team all the best.

    Janice

  • Further posting is closed as the event has ended.