Observing Effective Questioning in the Science Classroom
Apr 28th, 2010 by Frank LaBanca, Ed.D.

Note: This article is cross-posted in the CSSA Newsletter.  Be a part of the discussion, join my personal learning network, and leave a comment on its contents here.


On March 13, 2010, the Obama Administration released its strategy for revising the Elementary and Secondary Education Act (ESEA), also known as No Child Left Behind.  The blueprint, in part, focuses on the development of effective teachers and leaders.  The plan requires states to define an effective teacher, effective principal, highly effective teacher, and highly effective principal. Definitions are to be developed in collaboration with teachers and leaders, based in significant part on student growth and other measures such as classroom observations of practice.


The ESEA contains expectations that district-level evaluation systems

  • meaningfully differentiate teachers and principals by effectiveness across at least three performance levels
  • are consistent with their state’s definition of effective teacher and highly effective teacher and principal 
  • provide meaningful feedback to teachers and principals to improve their practice and inform professional development
  • are developed in collaboration with teachers, principals, and other education stakeholders



How do we, as science education leaders, operationalize these broad statements and translate them into meaningful methods to assist in teacher growth and improvement?  I think at times it is necessary to step back and examine how we can compartmentalize the instructional process for the purpose of identifying an area on which to focus efforts to help teachers improve.  Certainly instruction is a very holistic process, but targeting specific teaching skills in the instructional toolbag can give teachers meaningful feedback to improve their craft.  My focus here is on effective oral questioning.


Questioning in the classroom is vital to help students develop problem solving and critical thinking skills.  To frame this discussion, it is important to consider the different types of questions that a science teacher might ask students (or students might ask teachers).  I would classify them into three major categories:

  • Factual
  • Conceptual
  • Analytical

Factual questions are just that:  checking facts.  Factual questions are composed of isolated information that stands alone and is generally much lower on Bloom’s Taxonomy (knowledge/comprehension).  Conceptual and analytical questions, though, fall under higher order thinking skills (HOTS).  Conceptual questions are ill-defined, allowing students to connect ideas and draw on knowledge to formulate an answer, while analytical questions are well-defined, challenging students to interpret information or data and make calculations. Both are more inquiry-based, but a conceptual question can have multiple possible answers (i.e., the BEST answer), whereas a well-defined analytical question has one right answer (i.e., the CORRECT answer).  Of course, all types of questions are necessary, especially to scaffold student learning, but are a variety used effectively and judiciously?



As I observe teaching and learning, I often find myself asking many of the following questions: Who (teacher or students) is asking the questions?  Are a variety of students participating?  Does the teacher answer student questions, or does the teacher turn them back to the class for a response?  Is appropriate wait time utilized?  If a HOTS question is too difficult to answer, does the teacher rephrase or scaffold to provide a structure for student success?  What types of questions, in what frequency, and in what proportion are being asked by students and teachers?


# HOTS questions vs. # K/C questions (observation tally for both teacher and students)


If inquiry is learning by questioning and investigation, then effective oral questioning in a science class is critical to the development of student inquiry skills.  Helping teachers develop their classroom questioning skills is a necessary and important part of professional mentoring for growth and development.


Problem solving isn’t always obvious
Apr 26th, 2010 by Frank LaBanca, Ed.D.

from: kidsaccident.psy.uq.edu.au


As some might notice, I had a friend design a new header for my blog.  Mark maintains his consulting business at www.mokturtle.net.  He designed the header (which is similar to my homepage labanca.net), sent me some files, and then I had to figure out how to upload them and get them working on my WordPress blog.  I enjoyed the challenge of figuring out how to get it all to work. My problem solving involved several different techniques and cognitive mechanisms (from Wikipedia): 

  • Brainstorming: suggesting a large number of solutions or ideas and combining and developing them until an optimum is found.
  • Lateral thinking: approaching solutions indirectly and creatively.
  • Means-ends analysis: choosing an action at each step to move closer to the goal.
  • Morphological analysis: assessing the output and interactions of an entire system.
  • Research: employing existing ideas or adapting existing solutions to similar problems.
  • Trial-and-error: testing possible solutions until the right one is found.

Often, when some think of problem solving, especially from an educational standpoint, it comes down to:

  • Hypothesis testing: assuming a possible explanation to the problem and trying to prove (or, in some contexts, disprove) the assumption.
This linear method may have applications at times, but doesn’t really allow for the creative potential that is often necessary when solving ill-defined problems:  problems that have more than one possible method of reaching the outcome, or perhaps problems that have more than one acceptable outcome. 

Enter a project that I conducted with my students:  Each student was required to create a short blog post, which had to include a graphic and a self-made media clip (audio or video) about a genetic disorder.  I created a blog (actually two:  here and here), established student accounts, and let them go.  In my usual style, I was intentionally vague so as to not limit the creative potential of the students. 

It was interesting to see that most of the questions I received as the students worked on their projects over the course of a week were focused on operating the blog platform.  Questions were simple and directed, and it was easy to provide support. The students had to troubleshoot the best ways to make their presentations work, but I think they really could focus on the content without getting bogged down in the idiosyncrasies of the technology.

What do I take away?

  1. The tools allow students to focus on content rather than the minutiae of form to create attractive products.
  2. Using the tools has its own challenges, and allowing students to work through these problems is good problem solving.
  3. Quality of content is still important.  Glitz does not substitute for understanding.  Just because we made something fancy doesn’t mean that we can allow the quality of the concepts to slip.
  4. In just the four years since I last gave this assignment, student IT skills have improved tremendously.  I needed to provide very little support for students to make their media components – they know how to do it, and most of them have the tools.  I did loan digital voice recorders to some, but did NOT have to provide instructions for usage.
  5. Making and editing video has become incredibly easy, and there are a wide variety of tools to do it:  webcams, digital cameras, cell phones, video cameras; PC: Movie Maker, Mac: iMovie.

Allowing students to be creative producers is critical; these kinds of projects move us in the right direction.

The Phil Mikan Show
Apr 12th, 2010 by Frank LaBanca, Ed.D.

Two students and I received a request to appear on Phil Mikan’s Corner Radio Show.  Phil’s show broadcasts from Middletown, CT and is heard on WLIS and WMRD.

Phil asked us to join him to talk about our successes at the Connecticut Science Fair.  Both of my students were finalists and won some significant awards.

One of the most interesting comments about the experience came after we left, when one student said, “Boy, I never knew all of the things I would get to see because I did a science fair project.”

How true.  Authentic experiences breed other authentic experiences.  I wrote about those unique experiences last year as well.  There is something magical about doing real work (in this case, science research that has an authentic audience), because “real” people want to hear about it.

Listen to the show here:
