Let’s go fly a kite!
Nov 22nd, 2009 by Frank LaBanca, Ed.D.
[Image from www.babygadget.net]

I am a strong advocate for authentic inquiry, where we allow students to pursue interesting problems and devise innovative, creative solutions. In order for students to build a strong repertoire of problem finding and solving skills, they must develop the necessary prerequisite skills and have a positive disposition toward learning. I often think back to the expertise literature from the creativity domain. (The following is from LaBanca, 2008):

Experts of a domain structure their knowledge differently from novices (Chase & Simon, 1973; Chi, Glaser, & Rees, 1982; Feldhusen, 2005; Larkin, McDermott, Simon, & Simon, 1980; Sternberg, 2001). Expert knowledge is centered on conceptual understanding, with the use of specific domain-based strategies (Driscoll, 2005). Expert problem finding and solving, therefore, is an application of pattern recognition: matching patterns from previous experience to corresponding aspects of a problem. Novices generally do not possess the same understanding and, in turn, use more general, non-domain-specific problem finding and solving strategies (Driscoll, 2005).

In an instructional setting, some teaching practices convey decontextualized information, whereby students are unable to transfer what they have learned to relevant situations (Brown, Collins, & Duguid, 1989). Students, as novices, have difficulty solving complex, authentic problems because they “tend to memorize rules and algorithms” (Driscoll, 2005, p. 161). Experts, by contrast, tend to use situational cues to solve problems. Because they have greater domain-specific content knowledge, experts approach finding and solving problems by recognizing and applying previously experienced patterns.

Simply put:

  • Experience matters.
  • Experience promotes higher levels of creativity.
  • Experience makes better problem finders and solvers.

[Image from newenglandsite.com]

As a parent, I feel that part of my responsibility is to provide my children with diverse experiences that expose them to authentic problem solving. Today was one of those days. As I was cleaning out the back of my car, I came across several kites. I enjoy flying kites, but had never done this with my children. Spontaneously, I packed them up, took a drive to Seaside Park in Bridgeport (probably the nicest beach on the Connecticut coast), and we set up shop.

Although my younger daughter Maggie (5) was not as impressed, my older daughter Anna (7) really got into it. She was trying to figure out how to get the kite to stay in the air without crashing back to the sand. Once the kite was about 100 feet up, I asked her how she got it so high. She was able to give me a detailed explanation of how it works and some of the tricks necessary to work the kite, without any real advice from me. She tackled the problem and devised a solution using a trial-and-error strategy.

I think that in science education, some of us sometimes get stuck in the mess of using only a hypothesis-based problem solving strategy. That’s a shame, because there are so many other ways to solve problems. For example (from Wikipedia; a short code sketch contrasting two of these follows the list):

  1. Divide and conquer
  2. Hill-climbing strategy (also called gradient descent/ascent, difference reduction, greedy algorithm)
  3. Means-ends analysis
  4. Trial-and-error
  5. Brainstorming
  6. Morphological analysis
  7. Method of focal objects
  8. Lateral thinking
  9. George Pólya’s techniques in How to Solve It
  10. Research
  11. Assumption reversal
  12. Analogy
  13. Reduction (complexity)
  14. Hypothesis testing
  15. Constraint examination
  16. Incubation
  17. Build (or write) one or more abstract models of the problem
  18. Try to prove that the problem cannot be solved.
  19. Get help from friends or online problem solving community
  20. Delegation: delegating the problem to others.
  21. Root Cause Analysis
  22. Working Backwards
  23. Forward-Looking Strategy
  24. Simplification
  25. Generalization
  26. Specialization
  27. Random Search
  28. Split-Half Method
  29. The GROW model
  30. TRIZ
  31. Eight Disciplines Problem Solving
  32. Southbeach Notation
  33. The WWXXD Method
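
To make two of these concrete, here is a minimal sketch (my own illustration, not from the post or Wikipedia) contrasting trial-and-error (#4) with hill climbing (#2) on a toy maximization problem. The objective function, value range, and all names are hypothetical.

```python
import random

def score(x):
    # Toy objective: peaks at x = 73. Stands in for "how high the kite flies".
    return -(x - 73) ** 2

def trial_and_error(tries=50, seed=0):
    # Strategy #4/#27: sample guesses at random and keep the best one seen.
    rng = random.Random(seed)
    best = rng.randint(0, 100)
    for _ in range(tries):
        guess = rng.randint(0, 100)
        if score(guess) > score(best):
            best = guess
    return best

def hill_climb(start=0):
    # Strategy #2: step to a neighboring value only if it scores better;
    # stop when no neighbor improves on the current value.
    x = start
    while True:
        neighbors = [n for n in (x - 1, x + 1) if 0 <= n <= 100]
        best_neighbor = max(neighbors, key=score)
        if score(best_neighbor) <= score(x):
            return x  # local (here also global) optimum reached
        x = best_neighbor

print("trial and error found:", trial_and_error())
print("hill climbing found:  ", hill_climb())
```

Trial-and-error needs no knowledge of the problem’s structure, while hill climbing exploits local structure to improve step by step, which echoes the novice/expert distinction above: experts recognize and follow patterns, novices sample more blindly.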

Let’s really strategize to provide students with DIVERSE opportunities for problem solving in our classrooms. If I can do it unplanned with my children on a sunny, chilly fall day at a beautiful beach, we can certainly find ways to do it in our classrooms.

Secrets of the Dead
Nov 18th, 2009 by Frank LaBanca, Ed.D.

http://video.pbs.org/video/1240086878

Multiple choice or open-ended question . . . it doesn’t matter under the right circumstances
Nov 2nd, 2009 by Frank LaBanca, Ed.D.

The fact of the matter is that objective assessments are here for a while. How do we as teachers find the balance between objective assessments and authentic assessments? I am a strong proponent of authentic assessments:

  • position (critical stance) papers
  • lab reports
  • poster presentations
  • oral PowerPoint presentations
  • blog posts and responses

They provide a far more realistic cognitive apprenticeship for students as they develop their knowledge. But, for better or worse, teachers have an obligation to work with students and allow them to engage in more objective assessments: timed tests on specific content. I’ve often worked with teachers who indicate that they would NEVER use a multiple choice question. They spout off some nonsense about the nature of the question. However, I would only agree with them if the multiple choice question is merely a fact check.

I would classify the types of questions (whether objective or authentic) that teachers ask students into three major categories:

  • factual
  • conceptual
  • analytical

Factual questions are just that: checks of isolated facts that stand alone. They generally fall much lower on Bloom’s Taxonomy (knowledge/comprehension). Conceptual and analytical questions, though, are higher order thinking questions. Conceptual questions are ill-defined, allowing students to connect ideas and draw on knowledge; analytical questions are well-defined, challenging students to interpret information or data and make calculations.

I’ve seen essay questions that were just as factual as a factual multiple choice question. Conversely, when students are challenged to connect ideas or analyze information, that’s higher order thinking no matter what the format.

I often think back to a teacher who told me that his midterm exam had 300 multiple choice questions for the 2-hour period; that works out to 24 seconds per question. My students can barely complete 40 multiple choice questions in the same time frame, about three minutes each. The reason is easy: my questions require more thinking and analysis, while his only check facts. My students’ test booklets are covered with notes, comments, calculations, and figures. There certainly is something to it. The challenge for educators is to put more emphasis on higher order thinking skills (HOTS) no matter what the format. Authentic assessment can stink just as much as some forms of objective assessment if it isn’t pushing students to higher levels of intellectual engagement.

So, ultimately, it’s not what we ask students to do – it’s how we ask them to do it.

I’ve done more detailed posts about conceptual assessment here and here.
