VOLUME 14, NO. 2 - APRIL 14, 2006


Wow! Who can believe it is already nearly the middle of April and graduation and summer break are right around the corner? Started last year, and already deemed an HCC tradition, the after-graduation party will again be held in the Shell immediately following the ceremony. Jeannie Shaw is to be commended and thanked for once again taking the lead in organizing this party with the help of the Faculty Development members and anyone else who wants to help. This year we are collecting $5.00 per person (guests welcome) for the party. Please RSVP and pay your $5.00 (cash only) to me by May 5.

I'm sure you, like me, are feeling that we have been pulled and pushed in way too many directions this year. We are all to be commended for our hard work on the Accreditation Self Study report, the Annual Assessment and Program Review reports, all our committee work and other extra-classroom duties we perform and, oh yeah, taking care of our students. At each new faculty and staff orientation session I share the THE STUDENT IS….. handout with everyone in attendance, including all the old-time presenters. I think we can never be reminded too often why we are here:


Enjoy this last Faculty Development Newsletter for the 2005-2006 academic year. I especially want to thank Silvan Chung, Career Counselor, for the article she contributed about the importance of helping our students with their career choices and how she can help. The Newsletter also contains a short article on Test Anxiety and another on Knowledge Surveys. If the Knowledge Survey article inspires you to want to learn more, check with Cynthia Smith or David Cleveland, if you can catch him on campus, for more information. Many of our colleagues are successfully using Knowledge Surveys in their courses. Two new faculty colleagues are also highlighted at the end of the Newsletter. Have a productive and enjoyable rest of the semester and a great summer. See you at graduation and the after-graduation party!

Jerry Cerny
FD Coordinator

CAREER COUNSELING: What is it and Who is it for?

By Silvan Chung, Career Counselor, Student Services

The process of deciding what our students want to become for the rest of their lives can be both stressful and intimidating. In fact, it takes time as well as hands-on experience to ensure they are making the right decisions for their future. Therefore, I am committed to making this transition and progression as smooth as possible. Helping them get an early start will prove to be a valuable investment in our students' future.

As part of my effort to disseminate information to our students, I feel it is important for them to have access to career information at all times. Therefore, the Career Services website was created to offer career and industry information right at their fingertips. The following is a list of services available to all:

In addition to helping students, I am also available to assist departments and faculty. I strongly feel that it is important for all of us to see the benefits of training students to become job ready. For instance, first impressions speak louder than words when landing that first job. Thus, I am here to make the transition from student to employee as easy as possible. Here is a partial list of classroom workshops I can offer: If you are interested in any of the above presentations, or have ideas for others, please feel free to stop by my office on the first floor of Building 6 or contact me by phone at x404 or by e-mail. I look forward to meeting you and helping your students move ahead!


Reprinted from The Teaching Professor, Volume 19, Number 9, November 2005.

Test anxiety has been formally defined as "the set of phenomenological, physiological, and behavioral responses that accompany concern about possible negative consequences or failure on an exam or similar evaluative situation." (p. 268) But most teachers don't need a formal description: they've seen test anxiety firsthand.

That test anxiety compromises academic performance is a well-established empirical fact - for students all the way from grade school through college. The recent study referenced below confirmed its continued existence across a large cohort of undergraduate students (4,000) and graduate students (1,414). The study measured test anxiety with a widely used instrument and validated student reports of GPA in several ways.

In this sample, low-test-anxious female and male undergraduates had cumulative GPAs averaging 3.35 and 3.22 respectively, compared with GPAs of 3.12 and 2.97 for high-test-anxious females and males respectively. In practical terms this can be thought of as the difference between a B+ and a B. The relationship between GPA and test anxiety was also present in the graduate student population, although it was weaker.

The difference in GPA may seem small, but in the competitive academic environment, where GPA continues to be used as an important criterion for entrance to professional schools and to separate those who do and don't get job interviews, the impact of test anxiety should not be underestimated. Students who experience test anxiety may be just as smart, may know just as much, and may be just as intellectually able as their colleagues, but their anxiety about performance prevents them from showing fully what they know and can do.

If there is good news, it is that teachers can do much to alleviate test anxiety, and students who experience it can learn strategies that help them manage their anxiety (a Google search on "overcoming test anxiety" will turn up many resources for teachers and students). Teachers can include in courses a variety of ways for students to demonstrate their mastery of the material: multiple testing events, ways to retake or redo parts of exams, or extra credit opportunities that constitute substantive encounters with the content. Most campus learning centers also offer workshops and other resources on test anxiety.

Reference: Chapell, M.S., Blanding, B.Z., et al. (2005). Test anxiety and academic performance in undergraduate and graduate students. Journal of Educational Psychology, 97(2), 268-274.

Excellent reference proposing strategies for dealing with test anxiety: Mealey, D.L. & Host, T.R. (1992). Coping with test anxiety. College Teaching, 40(4), 147-150.


From a Knowledge Survey Workshop presented at Honolulu Community College on August 19 and 20, 2005
by Dr. Ed Nuhfer, Idaho State University

You have undoubtedly been involved in the assessment discussions here at HCC and probably have heard something about Knowledge Surveys. This article is written to provide you with more information. Follow the links in the article to find out more about Knowledge Surveys.

Knowledge Surveys provide a means to assess changes in specific content learning and intellectual development. More importantly, they promote student learning by improving course organization and planning. For instructors, the tool establishes a high degree of instructional alignment and, if properly used, can ensure the employment of all seven "best practices" as the course is taught. Beyond increasing the success of individual courses, knowledge surveys inform curriculum development to better achieve, improve, and document program success.

What are knowledge surveys?
Knowledge surveys are an approach to assessing 1) student preparedness and 2) teaching effectiveness. As described by Nuhfer & Knipp (2003), the surveys consist of numerous questions that exhaustively itemize the content of a course. When students take the surveys, they are not asked to provide the information required by the questions. Rather, they are asked to assess their own confidence level with respect to each question. Levels of confidence might include "I could answer this," "I could find the answer to this in ten minutes," "I could not answer this," and so on. Research findings indicate that student responses to the surveys correlate closely with other assessment indicators, such as tests, that require an actual display of knowledge.

How are knowledge surveys useful?
Developing and using a Knowledge Survey forces the faculty member to create a detailed "map" of expected knowledge/outcomes for students and for the faculty member him/herself. It is difficult to get anywhere if you don't know where you are going; however, once armed with a "map" the student is able to "navigate" towards the destination and the faculty member can precisely plot progress on the chart.

Are knowledge surveys suited better to some disciplines than to others?
Any discipline whose course content can be expressed in words, numbers, images, or sounds (and that covers just about everyone) should be able to benefit from knowledge surveys. The project leaders are developing software to allow this kind of flexibility via surveys that will be hosted on-line.

Does the cognitive level of my survey questions have to follow a distribution like that suggested by the Bloom taxonomy?
No. The Bloom taxonomy is offered merely as a help and guide. Some courses may demand disproportionate numbers of purely factual questions, or purely analytical ones, or purely evaluative ones. The nature of the course content should dictate the type of question. However, survey-writing is something of an art and science (more familiar to social scientists than to people in the humanities) and Bloom may help those of us who are inexperienced at survey design to avoid accidentally emphasizing only one kind of question (for example), and thus distorting the actual nature of the knowledge being taught in our classes.

What are knowledge surveys NOT?
Knowledge surveys are not tests. Students are not asked to answer the questions in the survey -- they are asked to assess their own competence to answer the questions if they were to appear on an actual test. Of course, some questions in the survey may actually show up on actual tests, but the advantage of surveys is that they survey the whole content of the course, not just a sample of it for grading purposes. Tests tell teachers how students did in the course; knowledge surveys additionally suggest to teachers how teachers did in the course, that is, how well the content got across.

Do all sections of a course have to use the same Knowledge Survey items?
Knowledge Surveys do not dictate course uniformity (Deans, Division Chairs, and Disciplines may). A common practice on other campuses that use Knowledge Surveys is to at least share a common core of items and then permit individual faculty who teach the same course to include course specific expected learning outcomes. This facilitates programmatic evaluation without requiring complete course uniformity.

How and why will the development and use of a Knowledge Survey improve my student learning outcomes?
To answer the question, we have included Ed Nuhfer's explanation to the Idaho State University faculty: Better Organization = Better Learning. Those skeptical of the statement can consult a particularly revealing document, Feldman, K. A. (1998). Identifying exemplary teachers and teaching: evidence from student ratings. in Teaching and Learning in the College Classroom 2nd edition, K. A. Feldman and M. B. Paulsen, (Eds.) Needham Heights, MA: Simon & Schuster, 391-414, complete with a murderous array of statistics that proves that the most important way we can spend our time to generate improved learning is to spend that time on preparation and organization of the course.

Interestingly, in teasing apart traits that lead to student learning and student ratings ('high student evaluations'), we find the most important practice to produce enhanced learning is only the sixth most important in producing high student ratings. (Resolution is a discussion for another day unless you wish to wade now through a very long summary about student evaluations.)

When we think our organization is clear, students usually do not. Harvard's Phil Sadler's (1992) videotape, "Thinking Together: Collaborative Learning in Science," explains how this occurs: "When you learn to teach a subject, just struggling with how to present it, where you're sort of relearning it yourself, that's when students gain the very most from a lecture. Once you've really got it down and you see all these beautiful connections that you didn't see before, you're well beyond the level of the student." We see our organization; they don't, unless we bring it to their level. One way to bridge the gap is to present our organizational plan completely in writing and to let students engage it at their pace in ways that promote their learning.

The concept behind a knowledge survey is simple. It is a written document constructed through a logic that begins with course goals, then outcomes that are fleshed out by what students should be able to do as a result of successfully meeting an outcome. It is a document that discloses the entire course and takes detailed before/after snapshots of students' perceptions of their learning. If you've taught a course before, rudiments for your first crude knowledge survey are likely already in your computer. Copy all your quiz, test, and review questions into one giant file, in the order you intend to cover the topics. See if it is, in fact, organized so as to cover and make explicit your stated goals and outcomes. If not, make the needed changes and additions to do so. You now have a "monster exam" that covers the entire course.

Students don't merely retain it as a study guide; they interact with it and produce a scaled record based on their confidence in their present knowledge. Students mark an "A" in response to an item if, with present knowledge, they can answer it or perform the skill for test purposes; a "B" if they have partial knowledge/skill or know how to find the information required to answer the question within a short time (say, 20 minutes); or a "C" if they could not presently answer the question for test purposes.
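For those curious what tallying such responses might look like, here is a minimal sketch (not from the workshop materials). The mapping of A/B/C marks to numbers (A=1.0, B=0.5, C=0.0) and the sample data are illustrative assumptions; an actual survey would follow the scaling in the Knowledge Survey documentation.

```python
# Assumed numeric scale for the A/B/C confidence marks (illustrative only).
SCALE = {"A": 1.0, "B": 0.5, "C": 0.0}

def item_averages(responses):
    """responses: one list of marks per student, e.g. [["A", "C"], ["B", "B"]].
    Returns the class-average confidence for each survey item."""
    n_items = len(responses[0])
    totals = [0.0] * n_items
    for student in responses:
        for i, mark in enumerate(student):
            totals[i] += SCALE[mark]
    return [t / len(responses) for t in totals]

# Hypothetical example: three students, two survey items.
averages = item_averages([["A", "C"], ["B", "B"], ["A", "C"]])
for i, avg in enumerate(averages):
    print(f"item {i + 1}: class-average confidence {avg:.2f}")
```

A spreadsheet does the same job, of course; the point is simply that each item ends up with a single class-average confidence number that can be compared before and after the course.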

You now have the basic idea. Next go to the Center's web site for examples, details, and a long list of benefits to be gained from doing so. This paper, published last February, represents our experience as of about two years ago. We now know more about how to use these well, and there is certainly much more to be learned. We have worked with ITRC the past year to allow a knowledge survey prepared in a word processor to be given to students via WebCT and the data returned to the professor as an Excel file to allow pre-post records of the kind shown in the above web site to be produced. We provide workshops on (1) constructing such surveys and (2) getting them up on WebCT. We are happy to come to any unit or department to present this. But you need not wait. Between the web site above and what you intend to do for your course, you can construct a 'first edition' immediately.

To make the best use of this tool, you need to refer to it often throughout the course, align your lessons with your plan, and make certain students are using it too. For you, it will give a detailed record that can serve as a reality check for how fitting your plan is. If all goes well, better learning will be the outcome. Even if disaster occurs (you find the plan impractical and have to scrap it), take notes; the detailed record will reveal fully how you can design the course for success the next time.

What is the HCC Strategy for the Inclusion of Knowledge Surveys in the Assessment of our SLOs?
We developed some pilot Knowledge Surveys this past fall semester, and then launched a few pilot post-tests at the end of the semester. During this spring semester, we continued to launch several pilot pre/post test Knowledge Surveys. The findings from the pilot projects will be reviewed to determine the utility/appropriateness of Knowledge Surveys and inform our decisions about their future use on the campus.

You can access more information about Knowledge Surveys in the Workshop/Conference Information section of the Assessment section of the HCC Intranet.

11 Steps to develop/utilize a Knowledge Survey

  1. Understand the theory in the Knowledge Survey documentation.
  2. Inventory all of the questions which have already been used in your department (key quizzes and test questions).
  3. Develop the questions above in the sequence of teaching and map them to the six Bloom reasoning levels. Construct as many questions as the course requires, using the hierarchy Responsibilities => Goals => Outcomes => details about outcomes. This could range from 50 to more than 200 questions, depending upon how much detail you want.
  4. Organize the Knowledge Survey items chronologically in order of course presentation.
  5. Ensure that the instructions and explanation for using the Knowledge Survey have been added to the syllabus and are in sync with the sequence of topics.
  6. Post the Knowledge Survey on the web with instructions prior to the first day of class, and be sure to dry-run it yourself to confirm it reads and responds as you intended.
  7. On the first day of class instruct your students to fill out the Knowledge Survey.
  8. Perform a formative analysis based on the Knowledge Survey given the first day of class and address any immediate gaps or areas of overlap. Look at class averages to see if there are surprising patterns.
  9. Remind your students throughout the course of the semester that they can and should be using the Knowledge Survey as a study tool. Use it yourself too, when doing class prep. Try to match pedagogy to learning outcomes to get instructional alignment.
  10. Towards the last day of class instruct students to fill out the Knowledge Survey.
  11. Perform a summative analysis comparing the Knowledge Survey data collected on the first day of class with the data from the last day of class, and adjust teaching style and/or curriculum or the survey to accommodate needs.
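The summative comparison in step 11 can be sketched as follows. This is a hedged illustration, not part of the workshop materials: the item values and the 0-to-1 confidence scale are made-up assumptions, and a real analysis would use the pre/post class averages exported from the survey (e.g., the Excel file mentioned above).

```python
def item_gains(pre, post):
    """pre, post: per-item class-average confidence (0.0-1.0) from the
    first-day and last-day surveys. Returns (item_index, pre, post, gain)
    tuples sorted weakest gain first, flagging items that may not have
    gotten across."""
    gains = [(i, p, q, q - p) for i, (p, q) in enumerate(zip(pre, post))]
    return sorted(gains, key=lambda row: row[3])

# Hypothetical pre/post class averages for a four-item survey.
pre = [0.20, 0.50, 0.10, 0.40]
post = [0.85, 0.60, 0.25, 0.90]
for i, p, q, g in item_gains(pre, post):
    print(f"item {i + 1}: pre {p:.2f} -> post {q:.2f} (gain {g:+.2f})")
```

Items that show little or no gain are candidates for revised pedagogy, a reworked place in the course sequence, or a rewritten survey item.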

The following two faculty members are new to our campus this spring. As you meet our new colleagues, please help make them feel welcome. They include:

Tracy Kobayashi, Counselor, College Skills Center. Tracy was born and raised here on Oahu and graduated from the University Laboratory High School. After high school, she attended Honolulu Community College and earned a Liberal Arts degree. She then transferred to Southern Oregon University where she received a Bachelor of Arts degree in Psychology. After taking a year off from school, Tracy returned home to Hawaii and decided to continue her education at Chaminade University where she earned a Master of Science in Counseling Psychology degree. She enjoys meeting and working with people. Her passion is helping others believe in themselves and succeed, so that with effort they can achieve their goals. In her free time, Tracy enjoys playing tennis, softball, and spending time with family and friends.

Fumiko Takasugi, Instructor, Sociology. Fumiko was born in Tokyo, but from the age of 1 1/2, was raised in New York City, New Jersey, Osaka, and Tokyo, in that order. She has lived in Honolulu for the past 12 years. Fumiko earned a BA at Sophia University in Japan in Comparative Culture, an MA in Sociology at Columbia University in NYC, and a PhD in Sociology at the University of Hawaii Manoa. She has been teaching Sociology for 12 years. Fumiko also worked with the police in Japan as a researcher before becoming more interested in working with those who become targeted by law enforcement. Thus, her doctoral dissertation is an ethnography on the Hawaii local punk rock scene. She continues to be involved in the local scene, both musically and academically, and can usually be found at a punk show or two on a weekend night. Fumiko's family all live in the Tokyo metropolitan area, except for her partner and three felines, with whom she presently resides in Manoa.


This newsletter was organized and published by the HCC Faculty Development Committee. Members: Jerry Cerny (Co-Editor), Jeannie Shaw, Lisa Yogi, Rick Ziegler, Rona Wong (Co-Editor), Monir Hodges, Femar Lee, and Bernadette Howard.
