astro300_f16:day6: revised 2016/09/20 18:50 (jwang); current revision 2016/09/21 19:02 (ccheng)
======AY 375 - Fall 2016: Sixth Day Lesson Plan======

=====Section Recap (15 min)=====
Remind them what to think about for section recap:
   * What did you do?
   * What worked?
   * What didn't work? What would you do differently?
   * How did you assess learning?
   * Did anything unexpected happen?

===== Misconceptions (40 min)=====
  * Group discussion of video and slides (led by Carina)

===== Break (5 min) =====

===== Demos (20 min) =====

Diffraction grating demo (12 min) - a 7A/120 combo demo. The purpose is to visualize how diffraction/spectrometers work.
  * How a spectrometer works (5 min) - the spectrometer in the 120 lab is a black box. Now we can look into the box to see what goes on.
      * Laser pointer (red) + diffraction grating + point at the whiteboard (2 volunteers: one for the laser, one for the grating (needs to hold still))
        * Identify the orders
        * Grating equation (simpler one)
        * The grating is created by periodically etching opaque lines. Which way are the lines oriented?
        * What is the spacing of the lines? Need a volunteer to mark the spots on the whiteboard and measure things.
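To make the in-class measurement concrete, here is a sketch of the arithmetic, with made-up numbers standing in for whatever the volunteers actually measure (the distances and the red-laser wavelength below are assumptions, not values from the demo):

```python
import math

# Hypothetical numbers for illustration -- use the values your volunteers measure.
wavelength = 650e-9       # red laser pointer, ~650 nm
screen_distance = 2.0     # grating-to-whiteboard distance in meters
first_order_offset = 0.9  # distance from the m=0 to the m=1 spot, in meters

# Diffraction angle of the first-order spot
theta = math.atan(first_order_offset / screen_distance)

# Grating equation (simple form): d * sin(theta) = m * lambda, with m = 1
d = 1 * wavelength / math.sin(theta)

# For these placeholder numbers, d comes out around 1.6 microns (~630 lines/mm)
print(f"line spacing d = {d * 1e6:.2f} microns")
print(f"ruling density = {1e-3 / d:.0f} lines/mm")
```

Having a volunteer plug their own measurements into the last two lines closes the loop between the geometry on the board and the grating equation.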
      * Add/change to a green laser (another volunteer).
        * Explain how this allows us to separate wavelengths, so that a spatial location corresponds to light at a given wavelength.
        * Explain nuances: spectrometers usually use ruled rather than holographic gratings (better efficiency), and usually reflective rather than transmissive ones (more compact).
  * More of a general diffraction demo: width of a human hair (4 min)
    * Volunteer to supply a hair and hold it in front of the laser pointer
    * See the ensuing diffraction pattern. Can anyone explain what we're seeing?
      * This is analogous to the single-slit experiment
      * Equation for the single-slit experiment (diffraction minima)
    * Volunteer to measure the diffraction pattern (what should we measure?)
    * A human hair is ~50 microns wide, with a factor of ~2 variation
      * How does this compare with the diffraction grating?
        * Can you use this to explain why bigger telescopes can be desirable?
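A sketch of the hair-width calculation, using Babinet's principle (a hair of width a diffracts like a slit of width a); the measured numbers here are placeholders chosen so the answer lands at the quoted ~50 microns:

```python
import math

# Hypothetical measurements -- substitute what the volunteer actually marks off.
wavelength = 650e-9     # red laser, ~650 nm
screen_distance = 2.0   # hair-to-wall distance in meters
minima_spacing = 0.026  # average spacing between adjacent dark fringes, in meters

# Babinet's principle: minima fall at a * sin(theta) = m * lambda.
# In the small-angle limit sin(theta) ~ x / L, so adjacent minima are
# separated by lambda * L / a, which we invert for the hair width a.
a = wavelength * screen_distance / minima_spacing

print(f"hair width ~ {a * 1e6:.0f} microns")  # -> ~50 microns for these numbers
```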
  * Pass out diffraction gratings to class (4 min)
    * These can be borrowed from the Physics demo room (72 Le Conte - in the basement). C10 also has a bunch in the storage room cabinet on the 1st floor (used for the arclamp demo).
    * What else can we look at with the diffraction gratings?

Meta Discussion of this Demo (5 min)
  * Demos are great for illustrating phenomena, especially ones that aren't easy to understand. Diffraction is often explained on the board using waves and interference; seeing it can help build physical intuition. (For programming, algorithms can be explained with demos, or by connecting things learned in class to real life (e.g. DNS lookup with dig).)
    * Can be a fun alternative way to cover a topic rather than a worksheet
  * As a class, on the board: thinking about the demo we just did and previous demos you have done, what makes a good demo and a bad demo?
  * What makes a good demo? (with examples connecting it back to the diffraction demo)
    * Illustrates difficult physical concept(s) (e.g. diffraction is not intuitive)
    * Interactive: students can participate (e.g. getting them to help with the demo - they feel they made it happen)
    * A springboard to new topics (e.g. can think about extending this to other applications)
    * Straightforward: minimal risk of failure (e.g. mostly straightforward here, except lasers can fail - backup?)
    * The demo actually illustrates the concept in question (e.g. this one is straightforward because very little of it is analogy)
  * When demos go wrong:
    * Demos **can and sometimes do** FAIL! (e.g. have backups, explain what they should see, test your demo!)
    * Sometimes, especially in astronomy, they can confuse students more than help them, or oversimplify a concept. Both overly complex and overly simplified demos can be confusing! (e.g. this demo attempts to treat diffraction at the same level as the course)
    * Materials may be missing or broken, so CHECK IN ADVANCE! (e.g. go check on the diffraction gratings a week in advance, test the lasers)
  * Collect diffraction gratings from class

  * Some of our C10 favorite demos: see [[astro300_f13:day8#administering_demos_20_min|notes from Ay375 Fall 2013]].
===== Office Hours (15 min) =====
This is particularly relevant to the lab-based courses (Ay120, Python class), where most of the GSI interaction is in the form of office hours or emails answering questions on assignments. These strategies are also useful in TALC.

Group discussion - you've probably been doing this already, but let's make it more explicit:
  * (15 min) Handling Student Questions
    * When you get questions from students on homework problems, how do/should you handle them?
      * We don't generally want to just tell them how to do the problem
        * We don't want the reason they did something to be: "The GSI told me to do it this way"
    * In groups of 2-3, come up with 1 question you face or could face and how you would handle it without giving away the answer (e.g. "I have trouble doing problem _, can you help me?"). Also come up with an example of a question where you should just give away the answer, and explain why that is the case. (7 minutes)
  * Go over as a class (8 minutes)
    * General strategy: identify where the confusion is and address it
        * Identifying the confusion: ask them to explain the problem to you as best as they can and see where they run into trouble
        * What kind of confusion is it: a misunderstanding? unable to grasp what the goal of the task is? a forgotten step? a math error?
        * Address the source of the confusion and have them attempt the problem again
    * In general, we want to push the question back to them, but in a different/leading way.
        * e.g. "How do I find the location of the star in my data?" How would you answer that?
            * e.g. Where do you think it is? What did your brain just do to try to figure out where it is? How could I put that in mathematical/programming terms to implement it? (Basically what I've done is taken a higher-level question and thought of some lower-level questions that can guide me to the answer.)
    * Learning how to problem-solve and get to the answer are skills we want to teach

    * Debugging (specific to lab/python)
      * How to handle debugging issues? Ask for opinions
        * When do you help them?
        * Avoid the spiral of debugging everyone's code
      * General rule for Astrolab: because the class does not have an explicit coding prerequisite, especially at the beginning, help them debug but also use it to teach them how to debug (print statements, pdb, how to use IDEs, how to use Google)
    * When should you just give them the answer?
        * Things that do not really contribute to the learning objectives
          * Math errors: 1+1=3? (Exception: equations in wrong units)
          * Coding API questions (how do I make an array in numpy, what is the argument to do...)
          * Things that take a long time to figure out but don't have much benefit to learning (examples? e.g. the positions of the maxima of the interference pattern when putting a hair in front of a laser (you need to find the maxima of sinc^2(x)))
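As an illustration of why that last calculation is worth just handing over: the maxima of sinc^2(x) satisfy tan x = x, which has no closed-form solution, so you end up root-finding numerically. A minimal sketch (stdlib only):

```python
import math

def sinc2_maximum(m, tol=1e-12):
    """Find the m-th nonzero maximum of (sin x / x)^2 by bisecting
    g(x) = x*cos(x) - sin(x), which is zero exactly where tan x = x
    (and, unlike tan x - x, has no singularities in the bracket)."""
    g = lambda x: x * math.cos(x) - math.sin(x)
    # The m-th root of tan x = x lies between m*pi and (m + 1/2)*pi
    lo, hi = m * math.pi, (m + 0.5) * math.pi
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if g(lo) * g(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

# First secondary maximum: x ~ 4.4934, intensity ~4.7% of the central peak
x1 = sinc2_maximum(1)
print(x1, (math.sin(x1) / x1) ** 2)
```

This is a nice example of effort with little learning payoff for a student stuck on the lab: the physics is in the diffraction pattern, not in the root-finding.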

===== Midsemester Evals (15 minutes) =====

  * Why do we do mid-semester evals (1 min)
    * There is a department-wide, official end-of-semester eval, but by then it's too late to fix things for your current students.
    * The point of student feedback at mid-semester is to allow you to adjust your section and teaching style as necessary to match your current students' needs.
    * However, take the responses with a grain of salt; Ay 10 students don't always know what's best for them!
  * General overview of mid-semester evals (1 min)
    * The questions should cover things you want feedback on. Be explicit about what you want to know whenever possible.
    * There should be room for some free response (e.g., "if there's anything else not addressed here...")
    * You should make sure you take some time to go over the results in a later session.
      * If you have quantitative questions, this might include averages or distributions.
    * You should make sure students are able to fill the evals out anonymously.
    * We aren't doing one for this class because of how short it is (we will be using notecards periodically instead)
  * Going over Aaron's old Ay375 mid-semester eval {{:midsem_evalf13.pdf|Midsemester Evaluation for 2013}} (10 min)
    * Your mid-semester eval should be short (one side, two sides max). This one is just super long to give you different examples of how to approach it. You should feel free to take what you like.
  * Going over Carina's C10 mid-semester eval {{:midsemester_eval_carina.pdf|Carina's Eval}}
    * Question 1: Useful to know how seriously to take the evaluation comments (i.e. take them more seriously if a student attends all the time).
    * Question 2: Free response allows students to express themselves.
    * Question 3: Take it with a grain of salt, because most students want more lectures.
    * Question 4: Another free response.
    * Questions 5-8: Useful for gauging how the difficulty of section compares with the class as a whole, and whether you're hitting a middle-ground level of difficulty in section.
    * Questions 9-10: Fun, extra questions. Pros: makes the evaluation less formal and fun to read. Cons: can get some inappropriate responses (Carina got a bunch of pick-up lines once, and also a phone number) and worrisome responses (one person said they'd be Pluto because no one cares about them and they're sad and depressed).
  * After the eval is filled out: (1 min)
    * It's a good idea to read through them all twice: once to just read every answer, and a second time to see if you can spot trends and consistencies (based on the assumption that if a majority of students agrees on something, then it might actually be a valid point or an accurate assessment).
    * Arguably the most important part of evals is responding to the criticisms (both positive and negative) publicly in front of the class and addressing at least some of their concerns (Slater & Adams call this 'closing the loop').
  * **We will ask you to compile a short summary report.** Some things to consider including:
     - consistent comments
     - averages/standard deviations/histograms for any quantitative questions
     - any hilarious, unfairly mean, or really thoughtful comments
     - a personal assessment of what you've learned and what you'll change (and when)
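For the quantitative items, the averages, standard deviations, and histograms in the summary report can be knocked out in a few lines. A sketch using made-up 1-5 ratings (the data here is invented for illustration):

```python
import statistics
from collections import Counter

# Toy responses to a hypothetical 1-5 "How difficult is section?" question.
responses = [3, 4, 2, 4, 5, 3, 3, 4, 2, 3, 4, 3]

print(f"mean  = {statistics.mean(responses):.2f}")
print(f"stdev = {statistics.stdev(responses):.2f}")

# Crude text histogram -- enough to spot a skewed distribution at a glance.
counts = Counter(responses)
for rating in range(1, 6):
    print(f"{rating}: {'#' * counts.get(rating, 0)}")
```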

=====Homework=====

  - Write a midsemester evaluation and administer it in your sections in the next couple of weeks (9/26-10/5). After reviewing your students' responses, write up a ~1 paragraph summary of the evaluation (What did you learn? What changes might you make?). Bring this summary and a copy of your (blank) evaluation to class on 10/5. If you are not teaching a section-based class, you are encouraged to talk to the Professor of the course and see if it's possible to administer a midsemester evaluation for the class as a whole. If you are not teaching at all this semester, please draft a general midsemester evaluation (one that you could use in future semesters), but you will not be administering it.
  - If you haven't done so already, visit your peer's section. Meet up for discussion and complete the {{:peergsivisit.pdf| Peer Visitation Worksheet}} by 9/28 (next week!).