Adventurers in Learning — Part 10

What Has Gone Before

In Parts 1-5, I discussed the Team-Based Learning (TBL) methodology, the course design for an undergraduate INF 202 Introduction to Data and Databases class, and the parallels between TBL and roleplaying games (RPGs). Part 6 reviewed the first week of classes. Parts 7-9 began the coverage of the second section of the course.

Where We Are Going

This entry finishes the discussion of the activity classes for the second section of the course.

Class 5: Spreadsheet Activities Continued

Reacting to the attention lull apparent at the end of the last class, I decided to cover “administrative” material at the beginning. I started with a mini-lecture covering the appeals from RAT2. That gave me a chance to reinforce some of the points from the reading, to talk about the groups’ reasoning on their appeals, and to explain my rationale for my answers. Good review, but it stalled the commencement of activities and returned the students to their “lecture-class” attitudes: sit back, turn mind on auto-listen, nod blankly. Can’t win here, it seems.

As soon as I could, I wrapped up my blathering and returned to the multi-part spreadsheet design exercise. As you may recall from Part 9, we are designing a spreadsheet solution for tracking and reporting grades in the class. The students have a vested interest for two reasons: (1) it’s an opportunity to have input into how their grades will be reported and calculated, and (2) their assignment for this section is to implement a spreadsheet solution based on the design work we were doing in class (so their participation in class sets them up nicely to complete the assignment).

First, to re-engage them, I summarized the goal statements the groups devised in the prior class. This was a nice reminder of what we had done. The slightly different wording of each statement was also a good opportunity to point out that no one “right” answer existed. A variety of “good” goal statements were possible, and that was okay. I also liked the shout-out each group got from seeing their own (cleaned-up) work on the screen. Now that I think about it, I could have replaced this presentation with an activity where the groups picked the statement they liked best. Even if they all picked their own statement, it would have been an opportunity to tease out why one statement might be better than another.

Next, I condensed the various requirements that the groups created in the prior class and listed them as one group. Again, a good review. I then asked each individual to rank each of the 14 requirements as (1) must have, (2) nice to have, or (3) enhanced feature, with a quick explanation that the ranks proceed down the scale of priority. After a few minutes, I asked the groups to combine and finalize their rankings. In general, the rankings were parallel, but there was some variation. I asked the groups to explain their differences on a few of the requirements, but I realize now that I mostly gave my own reactions to their statements. I should have done more to get them to debate with each other.

In the next activity, I gave a couple of simple definitions of an “entity”. I asked the groups to come up with four entities that would go into the spreadsheet. I (unhelpfully) gave them the first one: grades. In truth, I had only two others in mind: students and tests/assignments. Still, I wanted to see what they might come up with in case I was being blind to something important. All groups came up with students and tests/assignments. That was good to see. The additional answers were either variations on the core three or not entities at all. A good check on my design “answers”, but there wasn’t a real opportunity to create a class debate on the fourth “entity”. Each group’s was different and somewhat vague. I ‘fessed up to asking for an answer that I didn’t think existed. Got some wry looks, but I’m hoping no one got too annoyed.

I then asked the groups to sketch out a rough diagram of where the entities we had identified should be placed on the spreadsheet. To me, it seemed pretty settled that the students should be rows and the tests/evaluations should be columns, making the grades the intersection of the two. Still, I wanted to see if the students would reverse the row/column placement or come up with something completely different. In the end, one group did reverse the placement and another had a completely different layout. Great variety, but it turned out not to be a good launching point for a discussion. It was hard for me to articulate why I thought my placement was best (mostly ease-of-use concerns, and those were not that compelling), and it was even harder for the students to articulate why they chose the way they did. There may have been a problem with the phrasing of the directions for this activity. There seems to be a kernel of a good activity here, but the execution needed work.

Next, I focused on the student list (the rows), pointing out the requirements of anonymity (Federal regulations on this issue), easy-to-find (students must be able to identify their own grades), and easy-to-assign (I need to be able to identify the students). I noted that it was not going to be easy to satisfy these design parameters, but asked the groups to brainstorm “some ideas for handling this problem.” A worthy exercise, something that people who design solutions to data problems regularly face, and (for me) a sexy, creative part of the process. Unfortunately, it didn’t make for a very satisfying activity. I think the basic problem was that the students didn’t know enough about Excel’s capabilities to piece together a variety of solutions. I probably should have listed some of the features of Excel that they had read about and asked them to pick one and build a solution around it.

I then pointed them toward the LOOKUP feature as a possible answer to the above design problem. I asked them to devise a scheme for using LOOKUP. LOOKUP is complicated to talk about in general, conceptual terms and that showed in the report out. Meh activity.
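
For anyone wondering what the lookup idea boils down to: in Excel it maps onto the LOOKUP/VLOOKUP family of functions, which find a value in one column and return the matching value from another. Here is a minimal Python sketch of the same concept; the anonymous identifiers and grades below are invented for illustration and are not the class’s actual design.

```python
# Sketch of the lookup idea behind the anonymity requirement: grades are
# keyed by an anonymous identifier so each student can find their own row
# without anyone else's name being exposed. All values here are made up.

gradebook = {
    "AX7Q": {"RAT1": 85, "RAT2": 92},
    "KM3Z": {"RAT1": 78, "RAT2": 88},
}

def look_up_grades(anon_id):
    """Return the grade row for an anonymous identifier, or None if unknown."""
    return gradebook.get(anon_id)

print(look_up_grades("AX7Q"))  # {'RAT1': 85, 'RAT2': 92}
```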

Time to move onto the tests/evaluations (the columns). I listed the six types (RATs, Final Project, Commentary, In-class Activities, End-of-Section Assignments, and Peer Assessments) and reviewed each in turn.

First up was RATs. In my view, these are the most precise grades, but also the most complicated to record and calculate. I asked the groups to “decide how to handle RAT grades.” I primed the pump with the questions “How many grades need to be entered? What calculations are necessary? What does the equation for each individual’s final reported grade look like?” Looking back, I violated just about every guideline for designing good activities in this exercise. My only excuse is that it was probably late when I was doing my prep and I was probably beat. In any event, here’s an initial list of problems with this approach:

Too broad: The overall problem was too wide open. Hard to isolate a decision point there that can be the basis for a discussion.
Multiple: I asked three questions to aid in the brainstorming process. Again, the decision point became muddled.
Unclear: The first priming question “How many grades need to be entered?” could mean “How many grades need to be entered per RAT?” or “How many RAT grades will we have all semester?” or both or something else. A similar issue exists with the third priming question. The result is that the students were not all working on the same problem (violating a TBL activity guideline).
Branching: Depending on how they interpreted the first priming question, the second priming question changes.
Suggestive without being instructive: The question about calculations indicates that I expect calculations to play a part without giving them any way to tie that to the overall problem.

Finally, as if the activity design wasn’t screwy enough, we were at the end of the session and everyone was tired. Despite all that, the report out was not a complete disaster. In the end, I simply explained my thoughts on the proper solution. Then I said we would return to the exercise in the next class.

Class 6: Spreadsheet Activities Continued/Data Definitions

Again, I started this class with some bookkeeping. Turns out one of my students wound up on a team that was not the one I had recorded as their assignment. Leaves me with one team of seven, one team of six, and two of five. Everyone seems attached to their teams (a good thing), so I didn’t force the move. We will be testing the theory that precise group size is not an issue.

I then spent some time talking about blogging. One of the assignments for the semester is to contribute to the CCI blog. I spoke about participating in their professional community and building their personal brand. Finally, I told them about this blog (the one you are currently reading) and my plans for it. I asked if anyone had any objections (given that they were the aggregated subjects of my blog) and promised to post it for them to review before I posted it “live.” One student ribbed me about using them to build my personal brand. Dang insightful wretch!

With that, we headed back to the extended spreadsheet design exercise. First, I summarized the work we had done in the last class, showing a spreadsheet with student identifier, student name (an aid for my input, which would be hidden before publication), and three columns for each RAT (iRAT grade, tRAT grade, and weighted final grade accounting for the 40/60 split).
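
To make the 40/60 split concrete, here is a quick sketch of the per-RAT calculation. The post doesn’t spell out which share goes to the individual score, so the weights below assume 40% iRAT and 60% tRAT; in the spreadsheet itself this is just a one-cell formula along the lines of =0.4*B2+0.6*C2 (cell references illustrative).

```python
# Per-RAT weighted grade, assuming a 40% individual / 60% team split.
# Swap the weights if the course assigns them the other way around.

IRAT_WEIGHT = 0.40
TRAT_WEIGHT = 0.60

def weighted_rat(irat_score, trat_score):
    """Combine the individual and team RAT scores into one reported grade."""
    return IRAT_WEIGHT * irat_score + TRAT_WEIGHT * trat_score

print(weighted_rat(80, 90))  # 0.4*80 + 0.6*90 = 86
```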

Continuing with the list of evaluation types, I turned to the grade for in-class activities. In total, it’s worth 10% of the entire course grade. Trying to better achieve what I now think of as “the decision fulcrum point”, I avoided an open “how would you handle this” question in favor of “decide how many grade entries (columns) are needed.” I gave them the choices “A. One, B. One per section (7 total), C. One per class (20 total), D. One per activity (unknown number).” Given this clear multiple-choice question (my first), I handed out sheets that could be folded to show bold A-D choices in bright colors (suggested by Dr. Ed Prather at an ITLAL presentation). The callers (see Part 6) were instructed to hold one of the cards aloft when I asked the teams to (simultaneously) report out. Got a couple of votes for B and a couple for C. I teased out some reasoning behind each, but not much of a classroom debate resulted. I then gave my ideas (I like C for ease of use on my part). Despite the lack of robust debate, the process worked much better, at least from a mechanics standpoint.

One step forward, another step back. I followed this focused, successful activity with a list of the remaining evaluation types (individual assignments, commentary, final project, and peer evaluation). I asked the groups to pick one and “propose how to handle” it. Ugh. The final project and commentary are one-shots with one grade each. Silly: one column, you’re done. The peer evaluation has 4-6 grades (one for each other member of a student’s group), so there was some complexity there. No one picked that. The six individual assignments also each have a grade. The real issue here was how to account for these evaluations in the totality of the course grade, but that was not the exercise I chose (that was coming up). A vague question combined with groups not all working the same problem meant no real “decision fulcrum point.” Sigh.

Next was the cumulative grade. I told them that the final grade would be a calculation, then asked them to “plot out the design of how the cumulative grade for the course will be registered.” I told them they would need to account for each evaluation (test/assignment) grade, the weight of each such grade, and the combination of those weighted grades. I finally asked if anything else needed to be factored in. Basically, the answer is a big-ol’ formula of some sort. Each group took a game stab at it, but there wasn’t enough time, or enough understanding of the rest of the spreadsheet yet (they hadn’t finalized the implementation of the prior work), to really do the problem justice. Also, the vague, general solutions that the students devised weren’t good grist for any kind of discussion. On the plus side, I was surely running out of ways to torpedo activities . . . right? In any event, the extended series of spreadsheet activities was done. Whew.
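
For what it’s worth, the big-ol’ formula is just a weighted sum of the per-category grades (in Excel, something like a SUMPRODUCT of a row of weights against a row of grades). The sketch below is Python; only the 10% for in-class activities is stated in this post, so every other weight is a placeholder, not the course’s real breakdown.

```python
# Cumulative course grade as a weighted sum of per-category grades.
# Only the in-class-activities weight (10%) comes from the post;
# the rest are placeholders that happen to sum to 1.0.

weights = {
    "rats": 0.25,            # placeholder
    "in_class": 0.10,        # stated in the post
    "assignments": 0.25,     # placeholder
    "commentary": 0.10,      # placeholder
    "final_project": 0.20,   # placeholder
    "peer_eval": 0.10,       # placeholder
}

def cumulative_grade(grades):
    """Weighted combination of per-category grades (each on a 0-100 scale)."""
    return sum(weights[cat] * grade for cat, grade in grades.items())

example = {"rats": 90, "in_class": 95, "assignments": 85,
           "commentary": 88, "final_project": 92, "peer_eval": 100}
print(cumulative_grade(example))  # 0.25*90 + 0.10*95 + ... = 90.45
```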

Transitioning, I spent some time reviewing the grading rubric for the spreadsheet assignment (the end-of-section task that’s a set part of the TBL sectional process). I tried to give them an idea of what was expected in the assignment: implement the design ideas that we discussed in class. I set the middling grade as “all the basic features in a neat presentation” and the maximum grade as “one or more advanced features and a dressy presentation”. I listed what I thought were the possible “advanced features” but left it for them to decide what the basic features were (a process of elimination from the advanced) and what “dressy” meant. Ah, clarity, elusiveness becomes you. I was either clear enough or they were spent; no one asked any questions.

Switching gears, I moved on to a set of activities on the data/information/knowledge/wisdom hierarchy (the substance of the Bellinger article). From my Master’s work, my take is that this discussion is an interesting philosophical activity, but “so what”? The Bellinger article tried to answer that question: the answer is that we try to move our understanding up the hierarchy to better understand the current situation and make better predictions about future events. The breakdown becomes: data, information (data in context), knowledge (patterns in the information), and wisdom (principles derived from the patterns).

With that background, I presented a data problem based around the oft-used (and apparently apocryphal) beer and diapers marketing scenario. I posited the student groups as “crack marketing teams”—that got a rise given the dual meaning of “crack”—of a local supermarket.

I told them their company issued a shopping card that recorded information about each purchase: customer ID, date, time, UPC code, amount, and price. I asked them to decide in their groups whether these recordings were (A) data, (B) information, (C) knowledge, or (D) wisdom. Finally, a truly successful activity. The students split between (A) and (B) and engaged in a robust discussion among themselves about the issue. After so many, at best, partially successful activities, this was a joy. So: well-understood problem background, well-understood definitions, and a precise decision fulcrum. The requirements for a successful activity were becoming clearer.

The next activity was to add more contextual data to the purchase records, to move the whole thing further toward information. I asked them to think about the kind of data that might have been collected when the consumer signed up for the shopping card. Again, the activity called on the students to “create” an answer, not choose among answers. Despite the lack of a solid decision point, it worked pretty well. Some good suggestions were made. No real opportunity for discussion, though; there was no “wrong” answer.
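
To see how the added context nudges the records up the hierarchy, here is a small sketch: a bare purchase record (the “data” candidate from the previous activity) joined with hypothetical signup fields from the shopping card, the kind of context the groups suggested. The field names follow the post; the values and the signup fields themselves are made up.

```python
# A raw purchase record (arguably just data)...
purchase = {
    "customer_id": 1042,
    "date": "2011-10-03",
    "time": "18:45",
    "upc": "012345678905",
    "amount": 2,
    "price": 7.98,
}

# ...and hypothetical context collected when the customer signed up for
# the shopping card (age and zip code are invented examples).
signup_info = {1042: {"age": 29, "zip_code": "12203"}}

# Joining the record with its context is what moves it toward "information."
purchase_in_context = {**purchase, **signup_info[purchase["customer_id"]]}
print(purchase_in_context)
```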

Lastly, I told them that their data analysis team (the adjective was “ace” this time, not “crack”) discovered that men between the ages of 25 and 35 purchase diapers and beer in a single visit to the store. I asked them to use this pattern (knowledge) to come up with a marketing policy (wisdom), specifically to (1) define a marketing objective and (2) announce a policy that would take advantage of the pattern to achieve that objective. All groups came up with a version of “maximize profit,” but two decided to put the two products together while two decided to put the products on opposite ends of the store. I was so happy with the outcome that I rushed ahead and did an instant analysis of the solutions. I should have restrained myself a bit more and teased out the analysis by asking the students leading questions.

Still and all, the group of three Bellinger-article activities felt successful. Woot!

As a final note, I took the time to nap for an hour before this class. After a long day at work, the rest did me good. No doubt, my rested mind/attitude contributed to my impressions of a successful class. New agenda: well designed activities plus rest.

More to come,
M Alexander Jurkat
@malexkat
