LA Colloquium 2020

Grade Expectations in Introductory Courses and the Effects on the Corresponding Majors - Paul Graf

Graf explored the probability of students remaining in economics after choosing economics and found gender and ethnicity had significant effects on the likelihood of remaining in the economics major.

Description of the video:

Greetings, my name is Paul Graf and I'm a senior lecturer in the Economics department at Indiana University. Thank you very much for your time. First, I'd like to thank the Learning Analytics Fellows program funded by the Center for Learning Analytics and Student Success, Dennis Groth, the former Vice Provost of Undergraduate Education, George Rehrey, the founding director of CLASS, Harsha Manjunath and Linda Shepard at IU BAR, and Nikita Lopatin, professor at Ashland University. Last year, Dr. Gerhard Glomm and I explored the probability of students remaining in economics after choosing economics, or checking the box on their applications to Indiana University. We found gender, ethnicity, and large sections had significant effects on the likelihood of remaining in the economics major. Based on these results, Dr. Glomm suggested expanding the analysis to include other majors. Often, students declare a major upon arriving at IU Bloomington, which may change from when they first applied, since students may change their major for different reasons, like their experience taking their first major course. After completing this course, students may switch for two reasons: ability or preference, or the grade they received in this course. This project focuses on the latter reason. I believe this project may have important admission and retention policy implications that may improve retention and, therefore, the student experience at IU. This project analyzes three effects on the probability a student stays in their declared major. One: prior knowledge, measured by SAT scores and high school GPA. Two: experience, represented by the first major course grade and overall student performance that semester, excluding the first major course grade, designated as GPAO. Three: other qualitative factors like gender, ethnicity, residency, and financial need. 
After exploring different major enrollments over the last five years, I chose economics, business economics and public policy (BEP), accounting, finance, biology, and psychology. Next, I identified the introductory courses for these majors: E201 for economics and BEP, A100 for accounting and finance, L111 for biology, and P101 for psychology. Further studying each major, I identified specific qualitative variables to account for program-specific variation. I used a binary model, which I estimate using the linear probability OLS method, and for robustness, I also used probit and logistic regression. To compare all six majors using only quantitative variables, I first ran a regression of the likelihood a student remains in their declared major one year after taking the introductory course against four quantitative variables: high-school GPA, SAT score, the course grade, and that term's GPA excluding the course grade. Next, to model major-specific variation, I ran similar regressions including identified qualitative variables specific to the major. Due to their similarity, I compared economics and BEP, identifying the qualitative variables of female, Indiana resident, and first-generation based on the descriptive statistics. For economics and BEP, using E201 as the first major course, I found that for economics, the quantitative variables were not statistically significant. Indiana residents were, on average, more likely to remain, but first-generation students were more likely to switch out of economics one year after taking E201. For BEP, the higher the student's high-school GPA and SAT scores, the more likely, on average, they remain in the BEP major one year after taking E201. Other qualitative variables are not statistically significant. Next, I compared accounting and finance and found similar patterns in their descriptive statistics. 
Specifically, I identified the qualitative variables of female, resident, first-generation, Asian, Hispanic/Latino, and Black/African-American. Using A100 as the introductory course, I found all four quantitative variables, on average, have a statistically significant effect on remaining in the declared major. The higher these values, the more likely the student remains in their declared major one year after taking A100. For accounting, on average, Indiana residents are more likely to switch, while Black/African-American students are significantly more likely to stay one year after taking A100. For finance, on average, female and Indiana resident students are more likely to switch, while Black/African-American and Hispanic/Latino students are more likely to stay one year after taking A100. Finally, I compared biology and psychology due to their popularity. I identified male, non-resident, first-generation, Asian, Hispanic/Latino, and Black/African-American as the qualitative variables. Running the regressions for these two majors, using L111 and P101 as the first course for biology and psychology respectively, I found all but a student's high-school GPA have a statistically significant effect on remaining in both majors one year after taking the first major course. On average, male, Asian, and Hispanic/Latino students are more likely to remain in the biology major one year after taking L111. On average, Hispanic/Latino and Black/African-American students are more likely to remain in psychology one year after taking P101. To summarize: the first major course grade has a significant effect in the most popular majors. Prior knowledge matters for students remaining in their declared majors. A student's performance in other courses taken during the first-major-course semester, gender, and residency have varying effects. Except for economics, first-generation status has no statistically significant effects. 
Finally, minorities are more likely to remain in their declared major one year after taking the first major course. For further consideration, I would like to explore additional majors, generalize the results at the university, college, and major levels to help students stay in their majors, and potentially advance the diversity and inclusion goals at IU. Furthermore, I'd like to look at the variables of grade penalty and interactive effects on changing majors. Thank you very much for your time.

Investigating an Online Course Feature and Instructor (Re) Positioning for Equity using Social Learning Analytics - Dan Hickey and Joshua Quick

Hickey and Quick used social learning analytic methods to examine an online graduate education course to support engagement and learning for students who find themselves minoritized by the composition of classes and/or the disciplinary knowledge in those classes.

Description of the video:

Hello everyone. I'm Dan Hickey. I'm here with Joshua Quick from the Learning Sciences program. We're going to talk about a course feature and some analytics that are currently underway. First, of course, I'm going to say thanks to CLASS for supporting this work, and George Rehrey for all his support over the years. And thanks to some awesome graduate students and colleagues, particularly Suraj Uttamchandani, who was instrumental in the work we're going to be talking about today. So our work has been deeply informed by the work of the late Randi Engle; in key respects, we've really picked up where she left off when she unfortunately passed away in 2012. This notion of productive disciplinary engagement is probably one of the most useful and most used sets of design principles to come out of situative theories of learning. More recently we've embraced the last thing she published before she died, this notion of expansive framing. It's about pushing students to find connections with people, places, topics, and times. The goal with all of our work is to position students as authors rather than consumers of disciplinary knowledge and to hold them accountable to disciplinary discourse. So the research context we're going to talk about is a course that I've been teaching for 10 years online. It's really been the test bed for many of the ideas in my work in this framework we call participatory learning and assessment. What are the key things about this course? Well, the instructor is pretty busy; I travel a lot in the summer, and I often teach this course on a compressed schedule, which means it's sometimes hard for me to get to class for several days at a time. So it's very important that the class can almost run itself when it needs to. I spend almost all of my time providing public feedback to the students and very, very little time engaging in private grading and private work. 
The point here is that the time I'm able to spend is spent positioning students as authors in the course. One of the real goals of a lot of this work is avoiding instructor burnout by minimizing grading and private feedback. We do this using the thing we call wikifolios; these are public artifacts of the class. All of the work happens on these artifacts, in threaded discussions, and on student pages in Canvas. These are actually new discussion posts that students create each week. So much of our work involves looking at the threaded comments that students post on their work. What we're going to focus on today is a new feature that we added, and I'll provide the rationale. Each assignment includes a private self-assessment. Each assignment concludes with public reflections; these are the most important part of our talk today. These reflections are summative for engagement, and we can use them for awarding completion points, but they're formative in that they shape engagement proleptically, because students know that they're going to have to reflect on their contextual, collaborative, consequential, and conceptual engagement. These serve to further position students as authors. These are posted publicly, and I'm able to comment on those reflections. There's also an exam that is automated and time-limited to speed things along even more. Here are the four reflection prompts that we had in 2018. Now, one of the things I want to talk about today is a recent critique of the PDE framework by Agarwal and Sengupta-Irving. They concede that PDE might help with group-based inequities compared to a traditional curriculum; for example, problematizing content from your own perspective allows for culturally meaningful explanation of that content. But they said that doing so without attending to power and privilege probably won't make any difference, because of the way minoritized students get positioned out of discourse in classrooms. 
In particular, they argued that problematizing content in ways that challenge culturally dominant ways of knowing can actually lead to racialized controversies. So they advanced a framework called CPDE and suggested that instructors should reposition minoritized students perceived as low status, and they introduced four new CPDE principles. Central to them is this idea of using sociopolitical uncertainties to help problematize disciplinary knowledge in your courses. The way we responded to this in P507 was that in 2019, we added a cultural reflection to the four existing reflections. You can see it there. Now, we thought that the modest nature of this reflection would be helpful because it might appeal to adjuncts who have limited curricular control, and it might sidestep some of the pushback from more explicit approaches to equity, but I was really interested in whether it could catalyze larger changes via instructor positioning and repositioning through instructor comments. So I was very much looking for and encouraging students to use sociopolitical controversies, and I looked for opportunities to reposition students who might be in the minority in the class. The way we studied this was we compared the 2018 and 2019 courses. We coded the weekly wikifolios for whether or not they used sociopolitical controversies, and we did thematic analysis of anonymous course evaluations across the two courses. Then in the 2019 course, we did thematic analysis of the actual content of the cultural reflections and interpretive analyses of social learning analytics and instructor repositioning, and that work is continuing. One of the things we found was a remarkable increase in the number of sociopolitical controversies. Basically, there were only a handful of them, as you see there: 11% out of 230. They were almost all associated with reliability, fairness, and bias in standardized testing; those are the only chapters that introduced sociopolitical content. 
What we see with the addition of the reflections is this increased use of sociopolitical controversies. Again, this does not include the content of the cultural reflection itself; this is everything else. So we were pleased to see that we had the desired impact there. As for the themes of the sociopolitical controversies, nearly half of them raised assessment bias, but what's interesting is that half of the considerations of bias were outside of the sessions on bias. That was really important, because what it meant was that students, for instance during formative assessment, were appreciating that bias could be a problem there if they weren't careful. So we compared the course evaluations. What was interesting was that there was a politically conservative student in each of the sections, but the responses were very different. In 2018, the student complained quite bitterly about being silenced. And in 2019, the student self-identified and expressed appreciation for the ability to reflect on these issues. There you see the 2018 response: "way too philosophical and political, I had less and less trust in either the instructor... I will not take another course by this instructor." Really quite different than what we saw in 2019: "I introduced my views, I was met with objectivity and politeness." And as I'll show you from the comment, you'll see why I was encouraged. "It was encouraging to me that my voice was a valuable contributor." I'm going to come back to that. So when we did the content analysis (35 in all), 18% of those really got at something that's really important to me: they surfaced a sort of implicit bias that critical teacher educators have long pointed to as a source of inequity. There you see one example on reliability and fairness, but here you see another one on formative assessment, which is what I was getting at earlier: "Perhaps it is my own background as a straight white male that caused me to remember this only after I had completed most of the assignment." 
So this is what I mean, right? And this is an example, by the way, of prolepsis not working, right? So that's some evidence saying that maybe we needed to be more explicit about it, or that it kicked in at some point; that's debatable. This is the kind of thing that we're interested in studying much more carefully. And here you see the 2019 example. Now, I don't have time to really get into this, but what the student did was essentially reject the chapter's discussion of assessment bias and say that, you know, life is not fair. And here's my feedback, right? This was really hasty on my part; in retrospect, it was a pretty egregious mistake. I said, you know, basically bravo to you for expressing this. I was sitting in a refugio in the Alps with a line of people waiting for me to get to the satellite access, and I was really quite hasty. In the end, I was quite embarrassed by the way that I tacitly endorsed the student's characterization of assessment bias as being politically correct. More importantly, I received a really blistering critique from my colleagues with expertise in diversity about several tacit assumptions in the reflection, in particular that I was requiring someone, rather than inviting them, to speak for their group. There's potential for stereotype threat; I might have done more harm than good. We made some revisions to this: we added it later in the semester, when we were prepared to discuss it. I'm not going to talk about that today. Instead, I want to shift and turn this over to Joshua, because we decided to hold off; he decided to use this data, as it was an ideal dataset for his early inquiry project, and he's currently in the process of revising his proposal. It's a fairly formal process, akin to dissertation-level work in many other programs. So I'm going to turn this over to Joshua. Joshua, take it away. Thank you, Dan. 
So as Dan mentioned, what we're really interested in here is how students are positioning themselves as authors and developing these sorts of expansive and contextualized frames. So what we're going to be doing is developing a coded dataset that articulates the extent to which each student did this within the wikifolios, the reflections, and of course the discussions themselves. We'll also be looking at how the instructor repositions students via discussion comments, paying particular attention to the minority students, such as they are within these courses. Next slide, please. As you can see from this graph, we do have some reason to think that there might be something to this process of reflection. On the x-axis is just the occurrence of sociopolitical controversy themes within a wikifolio; that's not the total number of themes that occurred, just whether they occurred or not across the student's wikifolios. And on the y-axis we have a proxy for engagement, how deep the students got into the discussions, measured by the number of sentences. There was some correlation, though it wasn't significant, but we would expect that the more engaged students, the students who participated more effectively as experts and authors of content and engaged in these controversies, may also perform better on the assessments. So that's what we'll be looking at next. And I think you went back there, Dan. As for the way we're going to do that: we initially wanted to look at the Unizin Data Platform, and we were intending to do that for this work. But what we found is that, through the process of how it transforms and cleans data, this course wasn't ingested yet, so we had to go back to the older Canvas data and do all the cleaning ourselves, and that put us behind a bit. 
But as I mentioned, we'll be using these codes and then looking at the associations between codes with a technique called epistemic network analysis. On the right of the slide you can see an example of that; this is just a student's first wikifolio. As you can see, they tended to have stronger associations between places and times past, and they held themselves, and others held them, accountable through that sort of articulation. What's neat about this is that we can actually develop a numeric representation of how the student contextualized, through the mean of these associative codes. We're going to be using that to do comparative work on how the learners performed over time, as well as within the exams themselves. So hopefully we'll have more to report on that soon, and I'll turn it back to Dan. Thank you, Joshua. So one of the things we're really excited about is that we do have exam data in this course. One of the things I'm focusing on quite a bit in this work, alongside the analyses Joshua is doing, is that we're going to basically look at all of these different factors and see how they relate to exam performance, because our exams were very carefully constructed; we never teach to them, right? So arguably they're estimates of the transferability of that knowledge to subsequent settings. We also have a survey that we've developed with grant support that we're looking at right now, and we're proposing it as an alternative to the more widely used Community of Inquiry survey, which has never been shown to be related to learning outcomes. So with that, we'll stop and take questions. I'm going to leave this slide up while we take questions, but for now, we're just going to give it a wrap on this talk. So thanks for coming, everybody.
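The code-association idea Quick describes can be illustrated with a small sketch. This is a hypothetical simplification of epistemic network analysis on invented data: the expansive-framing codes shown and the `association_vector` helper are illustrative, not the actual ENA toolkit or the study's coding scheme.

```python
# A minimal sketch, on invented data, of the code-association idea behind
# epistemic network analysis: count co-occurrences of expansive-framing codes
# within a student's coded lines, then normalize them into a per-student
# association vector.
from collections import Counter
from itertools import combinations

def association_vector(coded_lines):
    """coded_lines: one set of codes per coded line of a wikifolio.
    Returns normalized co-occurrence weights for each unordered code pair."""
    pairs = Counter()
    for codes in coded_lines:
        for pair in combinations(sorted(codes), 2):
            pairs[pair] += 1
    total = sum(pairs.values()) or 1
    return {pair: count / total for pair, count in pairs.items()}

# Example: a student's first wikifolio with three coded lines.
wikifolio = [{"places", "times"}, {"places", "times", "topics"}, {"people"}]
vec = association_vector(wikifolio)
print(vec[("places", "times")])  # strongest association: 0.5
```

Vectors like this, averaged per student, give the numeric representation of "how the student contextualized" that can then be compared over time and against exam performance.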

Means and Ways: Interaction of Family Income and Gender in Academic Field Selection and Persistence - Michael Kaganovich, Morgan Taylor, and Ruli Xiao

Kaganovich et al. focus on how tuition cost differences for in-state and out-of-state domestic students and their family economic status, represented by residential zip-codes, influence enrollment and attrition decisions.


IUB Finite Mathematics and Retention: Comparing Preparatory Mathematics Courses vs. Co-Curricular Mathematics Support Courses - Dr. Andrew M. Koke et al.

Koke et al. determine the extent to which the preparatory mathematics courses lead to attrition and the extent to which the X101 finite help class positively impacts student performance in finite.

Description of the video:

Hello and welcome to IUB Finite Mathematics and Retention: Comparing Preparatory Mathematics Courses versus Co-Curricular Mathematics Support Courses. I'm Dr. Andrew M. Koke. I am joined by Sharon Hoffman, Dr. Molly Berke-Leon, and Dr. Anthony Guest-Scott. All of us work at the Student Academic Center. The first question we want to ask for this Learning Analytics grant is about the math requirement at Indiana University Bloomington. Most students will satisfy the math modeling requirement by taking one of three classes: calculus (M211), finite math (M118), or finite math (D117). Which of these classes they go into is largely dependent upon their ALEKS score. ALEKS is a math placement exam; it's used across the nation, and it ranks students' math proficiency on a scale of 1 to 100. With a score of 75 or higher, students can go into calculus; with 50 to 74, they can go into finite math M118. In fact, finite math M118 is the largest class on our campus, with around 4,000 to 5,000 freshmen and other students taking that course every year. Things get more complicated whenever students do not perform that well on the ALEKS placement exam. If they score 35 to 49, they will often take the two-semester version of finite, taking Math D116 in one semester and Math D117 in the second. And if they score 34 or lower on the ALEKS placement, then there's a host of other classes that they might take in preparation for finally getting into Math D116 and D117. So the weaker your ALEKS score, the more math you need to take. That kind of makes intuitive sense, right? Students who aren't well prepared for finite should maybe take other prep classes in order to be ready. But the unintended consequence is that some students who don't particularly like math and aren't particularly good at math might take four or five math classes in order to meet the math modeling requirement. 
Each one of those takes a semester, so it's quite possible for students to spend two years accomplishing their math modeling requirement. Our first question was whether students who have an ALEKS score below 35 leave the university before graduating. We studied, going back to 2009, all the students who took math classes on our campus who had an ALEKS score below 35. What we found is that, in fact, a lot of students drop out of IU as they face this math hurdle. And you can imagine why: if you're not particularly good at math, if you don't really like math, and you see that there are four semesters of math ahead of you, and one of those semesters goes poorly, you might think to yourself, this isn't the school for me. The most startling result we found is that a very high proportion of students will drop out of IU instead of overcoming that math hurdle. If the student is an underrepresented minority, they're more likely to drop out; if they're Pell-eligible, they're more likely to drop out; and if they're first-generation, they're more likely to drop out as well. So our current system of taking these preparatory classes isn't necessarily the best one. There is possibly another way forward. Since 2013, at the City University of New York, some scholars have been tracking their population of students who also need to take preparatory math courses. What they found was that these students actually do better, not just in the required math course but also in all future math courses, if instead of taking the math prep courses they simply enroll in the required course and pair it with a co-requisite math support class. The Student Academic Center does have just such a co-curricular math support class: it's Education X101. We wondered: could we help IUB students skip those prep classes and actually get across the finish line if we partner them more closely with Education X101? 
That led us to our second question: does our Education X101 finite help class actually improve student grades in finite? What we found is that yes, it does. Overall, students who take Education X101 get fewer F and W grades and more B, C, and D grades in finite, and that effect actually goes up if they are Pell-eligible, an underrepresented minority, and/or a first-generation student. That leads us then to our next step. We know that many students take the prep classes and drop out of IU, we know that at the City University of New York there was an alternate model, and we also have a successful co-curricular class, Education X101. Therefore, we propose in fall 2021 partnering with IUB's Groups Scholars Program and choosing 40 students who score below a 35 and would normally be taking those math prep courses, and instead putting them in Math D116 and Education X101 at the same time, getting them across the math hurdle in two semesters. We think the tools are in place for these students to have success in two semesters, in what would normally take four. So stay tuned next year for another Learning Analytics grant, and we're going to study this in our pilot program. Thanks very much everyone.
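The placement rules Koke walks through can be summarized in a small routing function. This is a hypothetical sketch based only on the thresholds stated in the talk; the function name and the "PREP" placeholder are illustrative, not an actual IU system.

```python
# A hypothetical routing function, assuming only the ALEKS thresholds stated
# in the talk: 75+ -> calculus M211; 50-74 -> finite M118; 35-49 -> the
# two-semester D116/D117 sequence; 34 or below -> other preparatory classes
# before D116/D117.
def aleks_placement(score: int) -> list[str]:
    """Map an ALEKS score (1-100) to the math sequence a student would take."""
    if score >= 75:
        return ["MATH-M211"]                       # calculus
    if score >= 50:
        return ["MATH-M118"]                       # one-semester finite
    if score >= 35:
        return ["MATH-D116", "MATH-D117"]          # two-semester finite
    return ["PREP", "MATH-D116", "MATH-D117"]      # the long "math hurdle"

# The weaker the score, the longer the sequence:
print([len(aleks_placement(s)) for s in (80, 60, 40, 20)])  # [1, 1, 2, 3]
```

The fall 2021 pilot would instead route below-35 students directly into Math D116 paired with the Education X101 support class, cutting the longest path to two semesters.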

Factors Determining Student Success in Downstream Courses from Business Statistics - Nastassia Krukava and John Stone

This project aims to address questions related to student performance in downstream courses from introductory Business Statistics at Indiana University (IU).

Description of the video:

It is our greatest pleasure to present the results of a new learning analytics project that looks at the factors determining student success in downstream courses from business statistics. This project is joint work by learning analytics fellows Nastassia Krukava and John Stone, presented by Nastassia Krukava in this video. Within this project, we aim to look at several questions related to students' performance in introductory business statistics courses at IU. I'd like to start with some background information to emphasize why the questions we raise in this project are important. At IU, there are two main business statistics courses: Economics E370 and Statistics S301. These courses are substitutable for most students required to complete a business statistics curriculum. Both are similar in content and require the same prerequisite, finite mathematics, the so-called M118 course. Since students have two different options, E370 and S301, for completing their business statistics requirement, and these options are very similar from the students' perspective, we'd like to ask whether these courses equally prepare students for their downstream courses. In other words, we try to understand whether the students who take E370 and S301 perform equally well, or differently, in their subsequent courses. We look at two downstream courses: Business G350, which stands for Business Econometrics, and Business M346, which stands for Analysis of Marketing Data. These courses rely heavily on the knowledge of statistical tools taught in introductory business statistics. In this short video, we only present the findings related to G350, noting that our results for M346 are not drastically different. We also look at a related question and try to understand the effect of transfer credit for finite mathematics, M118, from community colleges on students' performance in G350 and M346. 
The main puzzle here is that both business statistics courses require M118, which many students choose to take at a local community college, presumably because it is easier to earn a higher grade there, but whether community colleges prepare students well for their future courses is not immediately clear. So a related question is whether the students who fulfill the M118 requirement at a local community college perform differently in their downstream courses like G350 and M346. Let me jump right into the methodology we use for the analysis of our first question, which looks at the differences in downstream performance for students taking different business statistics options. As I already mentioned, in this video we present our findings for one downstream course, G350. Our baseline population regression model for the relationship between the G350 grade and the business statistics course completed by a student is presented here on the slide, where we use a student's GPA grade in G350 as the left-hand-side variable that depends on a dummy variable for the business statistics course, denoted by E370, and other controls denoted by W. This vector W includes students' observable characteristics such as high-school GPA, SAT score, an indicator for Pell Grant recipients, and others. Our main variable of interest in this model is the dummy for the business statistics course. This E370 variable takes the value of one if the student took E370, and zero if the student completed S301. The estimated coefficient on this variable will correspond to the effect of taking E370, as opposed to S301, on a student's grade in G350. To understand the effect of M118 taken at a local community college, our second model regresses students' grades in G350 on the same set of controls included in W and our main variable of interest, denoted by M118_LCC. This variable is a dummy equal to 1 if the student transferred M118 from a local community college, and equal to 0 if the student completed M118 at IU. 
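The baseline model just described, the G350 grade regressed on the E370 dummy plus the controls in W, can be sketched as follows. This is a minimal illustration with synthetic data; the variable names, coefficient values, and use of a plain least-squares solver are my assumptions, not the project's actual code or data.

```python
# Illustrative sketch of the baseline model described above:
#   G350_grade = b0 + b1*E370 + gamma'W + error
# All data below are synthetic; only the structure mirrors the talk.
import numpy as np

rng = np.random.default_rng(0)
n = 500
e370 = rng.integers(0, 2, n).astype(float)  # 1 if the student took E370, 0 if S301
hs_gpa = rng.normal(3.4, 0.3, n)            # control in W: high-school GPA
sat = rng.normal(1200, 100, n)              # control in W: SAT score
g350 = 1.0 + 0.6 * hs_gpa + 0.001 * sat - 0.04 * e370 + rng.normal(0, 0.5, n)

# OLS: regress the G350 grade on the E370 dummy plus the controls W
X = np.column_stack([np.ones(n), e370, hs_gpa, sat])
beta, *_ = np.linalg.lstsq(X, g350, rcond=None)
effect_e370 = beta[1]  # estimated effect of taking E370 vs. S301 on the G350 grade
```

The second model is identical in form, with the E370 dummy replaced by the M118_LCC indicator.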
So the estimated coefficient on this variable will inform us whether students who take M118 at a local community college tend to do worse, or possibly better, in G350 compared to those who take M118 at IU. Results of the estimation are displayed in this table. The first column shows estimates of the coefficients from regressing the G350 grade on the E370 dummy and other controls. The estimate of the coefficient on the E370 variable is negative and indicates that a student who took E370 earns a GPA grade in G350 that is almost 0.04 points lower on average than a measurably comparable student who took S301. However, this point estimate is not statistically significant, so we cannot rule out that students who took E370 and S301 perform equally well in G350. The confidence interval for this estimate runs from -0.13 to 0.05, so we also cannot rule out small differences in students' performance in G350 that can be attributed to their business statistics course. But again, given that the bounds of the interval are quite narrow around 0, we are talking about the possibility of only small differences. The second column of this table presents estimates of the coefficients for the second model, where we regress the G350 grade on an indicator variable for M118 taken at a local community college and other controls. The estimated coefficient on our main variable shows that a student who completed M118 at a local community college earns a GPA grade in G350 that is 0.11 points lower on average than a measurably comparable student who completed M118 at IU. This estimate is also not statistically significant, so we cannot rule out the possibility that taking M118 at a local community college does not affect students' performance in G350. But unlike the coefficient on E370 in the first model, this estimate has a confidence interval ranging from -0.28 to 0.06, with a more pronounced negative value for the lower bound. 
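The confidence intervals quoted above follow the usual estimate ± 1.96 × standard error construction. As a quick sketch, the standard errors below are backed out from the reported intervals themselves and are approximate, not values taken from the study's output:

```python
# 95% confidence interval from a point estimate and its standard error.
def conf_int_95(estimate, se):
    return (estimate - 1.96 * se, estimate + 1.96 * se)

# SEs backed out from the intervals quoted in the talk (approximate):
lo, hi = conf_int_95(-0.04, 0.046)    # E370 coefficient -> roughly (-0.13, 0.05)
lo2, hi2 = conf_int_95(-0.11, 0.087)  # M118_LCC coefficient -> roughly (-0.28, 0.06)
```

The wider second interval is why a modest negative effect cannot be ruled out for the community-college M118 pathway, even though neither estimate is statistically significant.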
So we cannot rule out the possibility of a modest negative effect of taking M118 at a local community college on students' performance in G350. As a concluding remark, I'd like to say that we conducted a similar analysis of the effect of taking E370 versus S301, and of the effect of taking M118 at a local community college, on students' performance in M346. We used the same methodology as demonstrated in this video for G350, and we obtained similar results. To wind up, I'd like to thank you for taking an interest in our findings. Your questions and suggestions are very welcome.

What drives student major change - Jie Li

Li identifies required Finance, Accounting, and Marketing major courses and studies their enrollment changes in response to trend changes in major declaration, before addressing factors that directly or indirectly affect the enrollment of a particular course and experimenting with predictive data models.

Description of the video:


Hello. My name is Jie Li and I'm from the Kelley School of Business. I'm here to talk about my research subject in 2020, the drivers behind student major change behaviors. The data is from Bloomington undergraduate students enrolled between Fall 2006 and Spring 2019. The data include student demographic information, their admission information, and their academic performance information, including the courses they take each term, the GPA, and the credits earned. If students have graduated, it also provides their degree earned, the time to graduation, etc. This is a continuation of my research in 2019. My focus is narrowed to only students in the business school, and I found last year that over 22,000 students enrolled in the business undergraduate program. Among them, 12,000 have graduated with at least one business degree. I split the students into two groups, major sticklers and major switchers. Major sticklers are those students who have earned the degree that they declared at the time of admission, and about 55% of students fall into this category. The other category are students who did not earn the first degree declared at the time of admission. This year, I further narrowed my focus to five majors: three traditional majors, accounting, finance, and marketing, and two non-traditional majors, information systems and supply chain. I looked at three common course groups. The first group is I-Core, which covers the major business subjects. I also have a group of courses that teach students quantitative and technical skills. Soft skill courses are courses that train students in their presentation, communication, and career development skills. First, I looked at the switchers and sticklers, and no significant academic performance difference was identified between them. But I did find some demographic traits that make students more likely to be a stickler, including female, international, first-generation, and students admitted through the standard admission process. 
A smaller group of students has been identified as late switchers. These are students who declare or drop majors in the sixth term or later. About one-fifth of students belong to this group. This group does have a lower GPA, and their demographic traits are somewhat different. So I tried to create a predictive model to identify those students, but the model's performance was too poor to be useful. I did find some notable demographic differences among students earning different degrees. The chart on the left is the gender distribution. Finance has a significantly lower percentage of female students, and marketing has an overrepresentation of female students. The broken line in the middle represents the overall female percentage. The chart on the right is the ethnicity percentage distribution across the five degrees. Students who graduated with a marketing degree show a significantly lower representation of international students. Students' performance in the three common course groups was analyzed. This is the chart of students' average performance in their I-Core courses. This is the number of students who took I-Core, these are the overall GPA averages of the different courses, and these numbers represent the GPA by course, by different degrees. It's obvious that accounting performed the strongest. I also conducted an ANOVA analysis to make sure that the course GPAs can be used to differentiate students earning these five different degrees. It showed that I370, the discussion course, does not show a significant difference among students, but all five other courses can be used. This is an example of a box plot using F370, and these charts show how these courses group the five degrees into different groups. These two charts show student performance in their quantitative and technical courses and in their soft skill courses. In the quantitative courses, all of them have differentiating power. 
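The ANOVA check described above can be sketched as a one-way F test of whether mean course grades differ across the five degrees. The grade data below are synthetic and purely illustrative; the group means are assumptions, not the study's numbers.

```python
# One-way ANOVA sketch: does a course's grade distribution differ by degree?
import numpy as np

def one_way_anova_F(groups):
    """F statistic: between-group mean square over within-group mean square."""
    all_x = np.concatenate(groups)
    grand = all_x.mean()
    k, n = len(groups), len(all_x)
    ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

rng = np.random.default_rng(1)
# Hypothetical F370 grades for the five degrees (accounting strongest):
grades_by_degree = [rng.normal(m, 0.4, 100) for m in (3.6, 3.4, 3.0, 3.2, 3.2)]
F = one_way_anova_F(grades_by_degree)  # a large F suggests differentiating power
```

A course like I370, which the talk says does not differentiate students, would yield a small F (compared against the F distribution's critical value); courses with differentiating power yield large ones.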
Among the soft skill courses, three should not be used, and we can see that accounting and finance students are still the strongest performers in the quantitative and technical courses, but it's very obvious that information systems students perform very well in the computer and technology courses. For the soft skill courses, marketing students really shine; they are the best performers, and IS students perform the weakest. I-Core gives students exposure to the major business subjects, and it was hypothesized that I-Core has a significant influence on students' major decisions, and this is confirmed in this research. The numbers over here indicate how many students declared majors in the I-Core term or in the term immediately after I-Core. Most students take I-Core in term five, and we see the highest numbers of major declarations in term five and term six. Overall, students who declared new majors performed generally better than the overall average. Among them, the best-performing students declared accounting and finance. The students performing the weakest declared marketing or IS, in both the I-Core term and the subsequent term. The number of students declaring majors is significant: over one-fifth of students declared new majors during the I-Core term, and one-third of students declared new majors in the term after I-Core. Over 40% of students earning IS and supply chain degrees declared new majors in these two terms. Over one-quarter of students declared the other three majors in these two terms. Even though not as many students dropped majors around the time of I-Core, a significant number still did, and in terms five and six the numbers are the highest. About 9% of them dropped majors during the I-Core term, and about 16% of them dropped majors right after the I-Core term. The highest drop is marketing, and the lowest drop is finance. It's very obvious in both charts that the students who perform the weakest in I-Core drop finance. 
Students who performed the best dropped marketing, information systems, or supply chain management. From the study, I have identified some courses that can be labeled as major influencers. The first one is F370, the finance component of I-Core. If students perform poorly in this course, it's very likely they are going to drop finance. Another course is the operations component of I-Core, P370. If students performed relatively well in this course, but not as well in other I-Core courses or other common course groups, it's likely that they will choose supply chain management. If students performed very well in the computer and technology courses but weakly in the soft skill courses, they have a higher chance of moving to the Information Systems major. If students have strong performance in the soft skill courses, weak performance in the quantitative and technical courses, and don't perform very well in F370 and P370, then they have a high inclination to go into marketing. How would these findings impact our decision making? I would suggest that we provide more support and guidance to students who haven't decided their major by their junior year, so that we can reduce late major changes. We should encourage more female students to pursue finance, and we should strengthen quantitative and technical skills for marketing students. We should try to achieve gender balance and diversity in finance and marketing, and help international students overcome language barriers and cultural differences. We should help them improve their communication, presentation, and career-building skills, and include some language- and culture-neutral content in the marketing curriculum to attract more international students. Thank you.

DataCamp learning platform: reducing skills gaps between courses and job market - Olga Scrivner

Scrivner examines students' profiles and develops a learners' classification system to explore patterns in their course trajectories and learning behaviors.

Description of the video:

Good morning everyone. My name is Dr. Farfan D'souza, and my colleague with me today is Dr. Krisha Thiagarajah, and our study investigates the influences on student success in introductory STEM courses. The purpose of our study was to examine the relationship between high school GPA, SAT scores, and course performance, in terms of course grades, in introductory STEM courses. We were particularly interested to see if there was a relationship between student gender, race, ethnicity, immigration status, and socio-economic class and course grades, high school GPA, and SAT scores. The reason for this is that introductory, or gateway, STEM courses are designed to provide strong academic foundations for both non-STEM and STEM majors. For most students coming to college, these are their first experience of STEM courses, and it is therefore very essential to examine these experiences. Another important factor in retention, recruitment, and the first experience of students is academic measures of success such as SAT scores and GPA, because they are often used to predict success in undergraduate education and are therefore considered a major element. For the purpose of our study, we define introductory STEM (science, technology, engineering, and mathematics) courses as 100- or 200-level courses in Math, Natural Sciences, Engineering, and Computer or Information Sciences. The courses we chose to look at are listed on the slide. We had conducted a smaller study last year to understand the relationship between high-school GPA, SAT scores, and course grades in an introductory STEM course, but that study was done in one single course, Dr. Krisha Thiagarajah's course. We are interested to see if these patterns hold across all STEM courses at IUB. For this study, our research method was primarily quantitative statistical analysis, in particular descriptive statistics, Chi-squared testing, and linear regression analysis. 
The data we used came from the Bloomington Assessment and Research Department, and we looked only at students enrolled at IUB during the 5-year period from Fall 2014 to Fall 2019, including Spring semesters but not Summer semesters. So now I'd like to hand it over to my colleague Krisha Thiagarajah to run us through the findings. First, we look at the demographic characteristics. You can see that nearly 70% of the student population is White, international students were 9.3%, Asian students 6.8%, Latinos 6.4%, and African-Americans 4.6%. At the same time, American Indian, Alaska Native, and Native Hawaiian or Pacific Islander students were present in very small numbers, so we excluded those populations from the analyses. Males and females were almost similar in distribution. If you look at the letter grade distribution by ethnicity, the Asian students and the international students have the higher grade distribution. At the same time, they also have the lowest withdrawal rates compared to the other ethnic groups. Black students had a higher withdrawal rate compared to the other populations. Next slide, please. If you look at the mean STEM course GPA by ethnicity and gender, there is no significant difference in the mean GPA between Asian students and international students. Compared to the other ethnic groups, they have the higher GPA. The lowest mean GPA is observed for the Black students. We looked at the correlation between STEM course GPA and high-school GPA, and also the correlation between SAT scores and STEM course GPA. You can see almost everything is actually significant; part of the reason might be the larger sample size. By convention, a correlation from 0.5 to 1 is considered high, and none of our results show that. A correlation from 0.3 to 0.49 is a medium, or moderate, level of correlation, and the lowest level, below 0.3, is low. 
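The correlation-strength convention quoted above (0.5 to 1 high, 0.3 to 0.49 moderate, below 0.3 low) can be captured in a small helper. This is just an illustration of that rule of thumb, not code from the study:

```python
# Classify a Pearson correlation coefficient using the thresholds
# quoted in the talk: |r| >= 0.5 high, 0.3-0.49 moderate, < 0.3 low.
def correlation_strength(r):
    r = abs(r)
    if r >= 0.5:
        return "high"
    if r >= 0.3:
        return "moderate"
    return "low"

print(correlation_strength(0.42))  # a typical GPA-to-course-grade correlation
```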
You can see that for Asian, Black, and almost every group, high-school GPA and STEM course GPA are moderately correlated, except for international students; maybe their education system is different, so that is why it is not reflected there. If you look at the SAT scores, Asian students have a moderate correlation, and White students and those of two or more races also have a higher correlation. But Black and Hispanic students have a lower correlation, and international students have a very low correlation between their STEM course GPA and SAT score. When we look at how well high-school GPA and SAT scores predict STEM course GPA, you can see the percentage is very low: 17% for White students, 13% for Black students, around 13% for Hispanic students, and around 24% for Asian students. So these are actually not good predictors of STEM course GPA. We also looked at Pell Grant eligibility by ethnicity. You can see that almost two-thirds of Black students receive a Pell Grant, more than any other ethnic group. We also looked at the mean STEM course GPA for Pell Grant recipients and non-recipients; non-Pell Grant recipients have a higher STEM course GPA compared to Pell Grant recipients. To conclude, our results show that race and gender do play a factor in performance in introductory STEM courses. But we also saw only low to moderate correlations between STEM course GPA and high-school GPA or SAT scores. We are interested to understand why we see these results. Why do we see this difference in grade performance among different racial groups? So we are definitely interested in examining the various factors influencing the success of students who do well and the challenges of students who are struggling. 
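The "percentage predicted" figures above read as R-squared values from regressing STEM course GPA on high-school GPA and SAT score. A minimal sketch with synthetic data (all numbers are assumed for illustration, not the study's data):

```python
# R^2 sketch: what share of STEM course GPA variance is explained by
# high-school GPA and SAT? Data here are synthetic and illustrative only.
import numpy as np

rng = np.random.default_rng(2)
n = 300
hs_gpa = rng.normal(3.4, 0.3, n)
sat = rng.normal(1200, 100, n)
stem_gpa = 0.5 * hs_gpa + 0.001 * sat + rng.normal(0, 0.5, n)

X = np.column_stack([np.ones(n), hs_gpa, sat])
beta, *_ = np.linalg.lstsq(X, stem_gpa, rcond=None)
resid = stem_gpa - X @ beta
r_squared = 1 - resid.var() / stem_gpa.var()  # share of variance explained
```

An R-squared in the 0.13 to 0.24 range, as reported, means most of the variation in STEM course grades is left unexplained by these admission measures.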
So we aim to use qualitative methods to understand deeply what works, in order to possibly design future classroom interventions that will change these results. To end, we would like to thank Bloomington Assessment and Research, especially Linda Sheppard and Pallavi Chauhan, who helped us with getting the data together and thinking through questions and analyses. Lastly, we would definitely like to thank the Learning Analytics Fellows Program for giving us this opportunity, and George Rehrey for all his support. Thank you.

Improving placement accuracy and efficiency of incoming international students into ESL support courses - Sun-Young Shin

Shin investigates the extent to which TOEFL iBT subskill scores can be comparable to the IAET test scores in terms of placement into various levels of ELIP courses.
