College of Arts and Sciences
Annual Assessment Report
Part I. General Information
Program(s) Discussed: Political Science Major
Current Semester: Fall 2018
Date of Assessment Meeting(s): September 6, 2018 (this was the meeting at which data were shared with department members; other discussions took place as part of scheduled department meetings throughout the 2017/18 academic year).
Participants in Assessment Meeting(s): All full-time department members and some part-time members.
All Annual Assessment Reports are available to the appropriate Associate Dean, Dean, and
the Provost, as well as to other administrators for institutional effectiveness and accreditation
purposes. Please indicate the degree to which your program would like this information more
widely shared.
On-Campus Users: [X] Freely available   [ ] Available upon request   [ ] Unavailable
Off-Campus Users: [X] Freely available   [ ] Available upon request   [ ] Unavailable
Part II. Assessment Process
Prompt: In one or two paragraphs, describe your assessment process. Did you gather data on all of your program's student
learning goals? If not, which student learning goals did you measure in this assessment cycle? What tools did you use to
attempt to measure student learning? Where and how were they administered? Who scored them?
According to the department's original Assessment Plan, the idea for 2017/18 was to focus on Goals I-III of the department's Learning Goals (for details on the departmental Learning Goals and on how they align with JCU's Institutional Academic Learning Goals, we refer back to the 2014-15 report). As explained elsewhere, we made an adjustment toward the end of the 2016/17 academic year to substitute Goal IV for Goal III and postpone the initial assessment of Goal III to 2017/18. Accordingly, this report includes data for Goal III for the first time.
To assess our Learning Goals, we employ the following measures: (1) the Political Science Major Field Test (MFT), which we have administered to all graduating political science majors for the past 20 years and which includes, in addition to the standard questions on substantive political science knowledge, several items on academic and intellectual/critical analysis skills; (2) the Political Science Department Rubric for Assessing Writing Assignments, which we developed specifically for this purpose; (3) the Political Science Oral Presentation Rubric, likewise created specifically for this purpose by the department; and (4) the Senior Exit Interview (newly revised for this year).
The MFT was administered online at the end of April, in a session proctored by the assessment coordinator, to the entire PO graduating class through an interface provided by ETS. Writing was assessed by the instructors in PO 300 (required for majors) as well as in other courses (also accessible to non-majors). Oral presentation skills were assessed in PO 300 only. The Senior Exit Interview was appended to the MFT and administered to all graduating seniors.
Part III. Findings
Prompt: Along with this report, please submit the data charts the program used during the assessment meeting. Describe, in
words, what your program learned about student learning during this assessment cycle. What were your strengths? In what
ways did students fail to meet the goals you set for them?
GOAL I - Demonstrating Knowledge in the Major Fields of Political Science:
As can be seen in Tables 1-3 below, the 2018 MFT scores (mean/median), and (most) subfield scores, are
slightly up from the previous year, the Total Test Score being 154/157 (vs. 153/156 in 2017). This
continues the trend of scores consistently moving up since about 2011 (see especially Table 2). There is
one exception to this trend, namely a decline in the subscore ("Assessment Indicator") for Methodology in 2018, which is quite a bit lower than in 2017 (54 vs. 61). However, it is still at a high level and
considerably higher than at “peer schools” (see Table 3).
The score for International Relations is unchanged and that for Comparative Politics is up a bit,
continuing its recovery from low points in 2015/2016 (possibly reflecting the fact that one department member has been back to full-time teaching since then). The score for Political Thought is unchanged from 2017 and continues to be lower than in both 2015 and 2016. The return of a full-time faculty member in that area should help us boost those scores in future years. The score for Analytical & Critical Thinking is up a bit
from 2017. The data confirm that both Methodology (notwithstanding its 2018 decline) and Analytical &
Critical Thinking continue to be strengths for our department. This is supported by the comparative
analysis shown below (see section on Benchmarking & Table 3).
Finally, the standard deviations continue their upward trend from previous years, suggesting that our
group of majors is academically diverse and that there is a tail of low scores that drags down our overall
average somewhat (which is why the median score is considerably higher than the mean).
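A small worked illustration of this point (using hypothetical numbers, not actual student scores): even a short left tail of low scores pulls the mean below the median and inflates the standard deviation, as the Python sketch below shows.

    # Hypothetical scaled scores: most cluster near the median,
    # with a small left tail of low scores (made-up numbers).
    import statistics
    scores = [160, 158, 157, 157, 156, 155, 130, 128]
    print(statistics.mean(scores))    # 150.125 -> dragged down by the tail
    print(statistics.median(scores))  # 156.5   -> robust to the low tail
    print(statistics.stdev(scores))   # ~13.1   -> spread similar to Table 1's SD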
Table 1: DEPARTMENTAL SUMMARY OF TOTAL MFT TEST SCORES, incl. SUBSCORES, 2015-18 (Mean/Median)

                            2018      2017      2016      2015    Std. Dev. (2018)
Total Test Scaled Score   154/157   153/156   155/157   151/153         13
Subscore: American Govt      55        54        56        52           13
Subscore: Comparative        54        52        50        50           14
Subscore: Intern. Rel.       53        53        54        50           14

Assessment Indicators: 1 = Analytical & Critical Thinking; 2 = Methodology; 3 = Political Thought
Trends (Table 2): As indicated above, the 2018 scores are in line with previous years (in fact slightly higher) and are overall the second highest of the 2011-2018 period. The median actually matches the high-water mark of 2016 (data for 2014 are missing due to a computer glitch that year). Some random fluctuation is normal (e.g., Methodology), but the overall upward trend is steady.
Table 2: TREND JCU-MFAT 2011-18

          2011   2012   2013   2015   2016   2017   2018
Mean       152    152    148    151    155    153    154
Median     154    154    148    153    157    156    157
US          50     52     47     52     56     54     55
Comp        51     51     53     50     50     52     54
IR          55     52     50     50     54     53     53
Meth        67     45     49     47     59     61     54
Pol Tht     51     55     47     52     54     49     49
Crit        68     58     56     59     64     61     62
N           42     29     32     32     24     21     20
Benchmarking (Table 3): One of the shortcomings of the MFT is that it does not automatically provide
any reference points for comparing a given institution’s performance. While peer scores are available
through the comparative data tool, they are not inherently comparable because the institutions
participating in the MFT each year are diverse. In addition, the pool of available reference institutions
varies from year to year. So, unlike in 2017, we did not create a new proxy measure for 2018 (it did not seem worth the added expense and effort). Instead, we employ the reference data from 2017 (12 schools: 6 from 2016's "similar" group, and 3 each from 2016's "better" and "worse" groups).
Table 3: INSTITUTIONAL COMPARISON 2011-18

             JCU 11-15   JCU 16   2017 "Similar"   JCU 17   JCU 18
Mean           150.8       155        155.6          153      154
Median         152.3       157        155            156      157
US              50.3        56         55.5           54       55
Comp            51.3        50         55.5           52       54
IR              51.8        54         55.5           53       53
Meth            52.0        59         47.5           61       54
Pol Tht         51.3        54         57.8           49       49
Crit An         60.3        64         63.7           61       62
N inst             1         1          11             1        1
N students       159        24         515            21       20
As can be seen in Table 3, our 2018 scores are virtually on par with the reference group. The lower overall
mean is balanced by the higher overall median, and the lower Theory score is balanced by the higher
Methodology score. As always, one should be careful not to make too much of individual scores. Instead
one should look for patterns. The patterns observed in Tables 2 and 3 clearly show (1) improving
performance by our students, and (2) performance which is, roughly, in line with expectations, based on
comparisons to similar institutions.
Discussion: The MFT is an imperfect tool for assessing achievement in the major, partly because it is primarily knowledge-based (as opposed to skill-based, with the exception of the critical analysis section). But there is also a significant random element in the question selection, which is particularly impactful for majors such as ours, with a relatively high variability in the courses taken by our students. Nevertheless, as stated before, it is important to know that our students' performance (1) has trended upward, (2) is stronger in areas we emphasize, and (3) is in line with that of their peers at similar institutions across the country.
GOAL IIa - Assessment of Writing:
The Rubric developed and employed by the department is appended to this report. It is based on similar
rubrics that are used by other departments (e.g., HS) or that can be found at universities across the nation.
Table 4: ASSESSMENT OF WRITING, 2015-18
Key: 0 = below expectations; 1 = meets expectations; 2 = exceeds expectations
/% = percent of students scoring below expectations

Year      Courses    RQ/Thesis  Organization  Evidence  Sentence   Grammar  Sources  Average
                                                        Structure
2015/16   Multiple   1.2/21%    1.3/7%        0.8/39%   1.0/27%    1.2/27%  1.1/18%  1.1/27%
(N=59)
2016/17   Multiple   1.5/4%     1.5/0%        1.8/9%    1.6/0%     1.9/0%   2.0/0%   1.7/2%
(N=22)
2017/18   PO 300     1.3/0%     1.4/0%        1.0/21%   1.1/12%    1.1/12%  1.2/8%   1.1/11%
(N=27)
The department assesses writing within the major at three distinct points within a student's four-year passage through JCU: in PO 200 (usually taken during the second year), PO 300 (typically taken during the junior year), and at the 400 level (each student is required to take at least one 400-level class before graduation). Other upper-division classes may at times serve to provide additional data points. It is expected that, over time, a sufficient quantity of data will accumulate to allow for an assessment of the strengths and weaknesses of writing within the political science major. It is further anticipated that progression can be detected and documented over the course of students' undergraduate careers in the major.
During 2017/18, assessment data were collected for only one class, PO 300 (Research Methods), over two semesters. As can be seen in Table 4, the average writing assessment scores for 2017/18 fell roughly halfway between those for 2015/16 and those for 2016/17. On average, about 10% of the students fell below expectations ("unsatisfactory"). By contrast, two years earlier about a quarter of students (27%) scored below expectations (across multiple classes and assessment categories). Taken together with the even better scores from 2016/17, the strong, positive trend is hard to overlook. As stated previously, it is likely that this reflects the greater attention to writing given by faculty, as well as more experience in scoring the written work. In future years we will endeavor to make the data collation more systematic and detailed, so that trends can be analyzed more effectively: PO 200 vs. PO 300 vs. 400-level classes. Individual students could also be identified and tracked. The department is committed to providing more guidance and support for students who struggle with their writing. Perhaps the results of those efforts are already showing up.
GOAL IIb - Assessment of Oral Communication:
The oral communication assessment rubric used by the department is an abbreviated version of the one developed by JCU's Department of Communications. During the spring semesters of 2016, 2017, and 2018, the department assessed the oral communication skills of our political science majors in one class: PO 300.
Effective oral communication was assessed in ten separate categories, grouped into three areas: (1)
Content/Substance; (2) Structure; (3) Delivery. The ratios of satisfactory scores to total N in each
respective category were as follows:
Table 5: ASSESSMENT OF ORAL COMMUNICATION 2015-16, 2016-17, 2017-18
Key: Ratio of satisfactory scores to total N

                                                    2015-16  2016-17  2017-18
Content/Substance
  Salience                                            9/9      4/6     12/14
  Reasoning                                           7/9      5/6     14/14
  Quality & use of evidence                           6/9      5/6     13/14
Structure
  Effective introduction                              8/9      4/6     12/14
  Effective conclusion                                7/9      4/6     12/14
  Effective limitation of ideas                       8/9      4/6     12/14
  Effective organization and development of ideas     7/9      4/6     13/14
Delivery
  Vocal variety                                       8/9      5/6     12/14
  Style                                               7/9      5/6     11/14
  Facial expression & eye contact                     7/9      5/6     13/14
The data for all three years indicate that the clear majority of our students are quite capable of giving
satisfactory (or better) presentations of their research. The data further show (though this cannot be seen
in the table above) that even though there are a total of 10 categories of assessment, it tended to be the
same students who showed weaknesses across multiple or even all areas of assessment (rather than
specific weaknesses being more or less randomly distributed across the different categories).
Opportunities for honing one's presentation skills are unevenly distributed across the political science curriculum (and across the university). A majority of students are able to achieve a satisfactory level of
proficiency. It is not clear from the data whether the remaining students simply have not had sufficient
opportunity to hone their presentation skills. Clearly there will always be students whose performance, for
one reason or another, does not meet expectations. The department will continue to review its
opportunities for students to practice presentations.
GOAL III - Awareness of, and Engagement in, Local, National and Global Politics:
Because Goal III is being assessed for the first time in 2018, its report appears after the one for Goal IV.
GOAL IV - Preparation for Graduate Programs and Careers related to Political Science:
Our Senior Exit Interview asked two questions that are pertinent, directly as well as indirectly, to Goal IV. Tables 6a and 6b present data collected over the last four years (2015-2018).
Questions:
1. What are your plans for next year?
2. Do you feel that your political science coursework has sufficiently prepared you to execute your
future plans?
Table 6a: PLANS FOR AFTER JCU

                                   2015   2016   2017   2018*
Law or graduate school              42%    30%    37%    50%
Politics-related job/internship     15%    15%    21%    38%
Teaching or service                 15%     4%     5%     0%
Private sector job                  13%    36%    21%     0%
Other (e.g., gap year)              15%    15%    16%    12%

Table 6b: BEING PREPARED FOR LIFE AFTER JCU (self-reported)

                                   2015   2016   2017   2018*
Well prepared                       81%    67%    74%   100%
Moderately well prepared            19%    33%    21%     0%
Not well prepared                    0%     0%     5%     0%
N =                                  26     27     19      8
* Only 8 students participated in the open-ended portion of the senior exit interview in 2018. Unfortunately, this was not noticed until weeks later, when the results were analyzed. It is not clear why this happened: a technical glitch, or a lack of attentiveness on the part of the students (missing the fact that there were two parts to the MFT). This had never happened before, so it is likely that there was some sort of technical malfunction.
As Table 6a shows, in 2018, again, all or virtually all political science graduates had specific plans for
after graduation. A plurality had plans to attend graduate or law school. Several had professional
internships lined up. Only one indicated that she wished to take a year off and pursue a graduate
education thereafter. Though based on a limited sample (40% of graduates), the results are in line with
previous years. This is also the case for Table 6b. All JCU political science graduates reported that they felt
well prepared for life after JCU.
For 2018 we piloted a significant revision of the senior exit interview (see below). We eliminated most
“evaluation-type” questions and added more open-ended questions geared toward measuring learning
outcomes (while keeping questions regarding future career plans and being prepared for life after JCU):
Summarize how the political science major prepared you for your continued educational and
career goals. For example, you could explain how your future plans relate to the knowledge, skills,
interests and values you have acquired as a political science major.
How has your political science educational experience enriched you personally or presented you
with opportunities or challenges in pursuing your continued educational and career goals?
Describe a specific skill, principle, or concept you learned in the political science major that you think you will always remember and that may assist you in your future educational or professional career, or life.
Of the 8 students who completed this portion of the survey, most gave thoughtful answers to at least two of these questions. Two recurrent themes stood out among these answers, by virtue of being mentioned by 7 out of 8 students: (1) critical thinking and (2) the ability to research complex issues. Again, these were open-ended questions, so the students arrived at these answers spontaneously. Most students mentioned multiple skills, including reading, writing, speaking, and constructing proper arguments, but also understanding global issues and tolerance for differing viewpoints.
Aside from ensuring that all students complete this survey, we will work to refine these questions in future years and possibly convert some of them into closed-ended questions, based on multiple years of responses to the open-ended questions.
GOAL III - Awareness of, and Engagement in, Local, National and Global Politics:
In 2017 it was decided to measure this goal using a newly created open-ended survey that would be part of the senior exit interview. This was done because it would be the only way to systematically ascertain our graduates' prior engagement with these issues. The questions added to the 2018 senior exit interview were the
following:
What do you consider to be the most pressing needs or problems in the global, national, and local
communities?
And, have you in the past been actively engaged in any of these issues?
Are you planning on doing so in the future?
All students answered these questions, though some more thoroughly than others. Unlike in the previous
section, however, there was relatively little convergence on specific themes. The answers ranged from
broad, such as “lack of respect for human life” or “capitalism,” to more specific, such as “intolerance,
violence, extremism,” “lack of economic development and intercultural understanding” or the “inhumane
prison system.” The eight students literally gave eight different answers. Of those eight students,
five reported that they were very interested in working towards alleviating or improving these problems in
the future.
We think it significant that our students were able to articulate such a variety of issues, many of which are not grabbed from the headlines, but represent issues that have been discussed in courses, were the subjects of guest speakers and public events, or possibly reflect students' volunteer activities.
Part IV. Planned Changes to the Assessment System
Prompt: What changes, if any, do you need to make to your assessment system? (Questions to consider include: 1) Do your measures and processes provide useful data with a reasonable amount of effort? and 2) Are your measures reliable, valid, and sufficient?) On which student learning goals do you plan to focus your attention during the next assessment cycle? Do you need to implement additional formative assessment tools to better understand some of your findings? If so, describe those here.
Our initial plan was to cycle through our four departmental learning goals (LGs) by assessing one
additional LG each successive year, starting in 2014/15. We have now assessed each goal at least once and have accumulated multiple years of data for the first two goals (going back well before 2014 for
Goal I).
As stated last year, despite its limitations, the MFT is an efficient instrument that provides us with useful baseline information to assess our program and to ensure that it is in line with “industry standards.”
The evidence clearly shows continuous overall improvement, and it has highlighted several strengths, as
well as potential problem areas, for our program. No adjustments are required in this area at present.
Our writing and oral proficiency instruments are well suited to tracking student performance in those areas, and we have now started to accumulate good evidence. For 2018 we fell a little short in terms of the number of observations (scores), and we will need to make an effort to increase the number of observations each semester.
We revised our “senior exit interview” for 2018. It has provided us with more useful information, even
though we actually shortened the questionnaire significantly. Notwithstanding the aforementioned glitch for
this year, this improved instrument should help us obtain more meaningful data from the students in future
years. We will wait at least one or two more years before revising this instrument again.
Part V. Planned Changes to the Program in Response to Data
Prompt: What changes, if any, do you need to make to your program in response to what you now know about student learning? (Possibilities include changes to learning goals, pedagogy, assignments in particular classes, activities, and curricular requirements and/or structure.) What is your anticipated timeline for both implementation and assessment of the planned changes?
We are committed to making our courses more interesting and relevant to the students. This is reflected in
our Action Plan, which the department developed following the 2017 program review. This Action Plan
includes the following items, among others:
Develop interdisciplinary institutes in the areas of Applied Politics and Data Science.
Develop opportunities for student-faculty research.
Develop partnerships with community organizations and alumni to create more internship
opportunities for our students.
Sustain our co-curricular/experiential activities such as EU Simulation and Model Arab League.
Develop a moot court program.
In addition, we continue to be active participants in the new core curriculum. Accordingly, we have revised and updated many of our courses (for example, virtually all of our courses are either cross-listed with other programs, such as PJHR; are “linked courses” as part of the new core; or fulfill other new core requirements such as EGC or ISJ).
The return of another faculty member from an administrative position to full-time teaching will strengthen
our program further, particularly in the area of political theory.
Finally, in the past two years we have significantly increased the opportunities for our students to actively
participate in public discussions and debates about current political and policy issues (Goal III), and we
will continue to do so. This includes an additional donor-funded lecture series (Suopis) as well as regular
open forums that take place off campus (“Rant on the Rails”). And we continue to work with the JCU
Center for Career Services to increase our students’ opportunities to improve their professional skills (Goal
IV).
Supporting documents to follow:
1. Political Science Department Action Plan
2. Writing and oral presentation rubrics