NEW DIRECTIONS FOR STUDENT SERVICES, no. 132, Winter 2010 © Wiley Periodicals, Inc. Published online in Wiley Online Library (wileyonlinelibrary.com) • DOI: 10.1002/ss.374
Adopting the practice of systematically and strategically
gathering data to inform the development and
implementation of a strategic plan will ensure its
achievement. This chapter presents a combination of
techniques for student affairs professionals to conduct
data-driven planning.
4
Data-Driven Planning: Using Assessment
in Strategic Planning
Marilee J. Bresciani
Data-driven planning, or evidence-based decision making, is nothing new as a concept. For years, business leaders have claimed to implement planning informed by data that have been strategically and systematically gathered (Banta, Jones, and Black, 2009; Bresciani, 2006; Maki, 2004; Schuh and Associates, 2009; Suskie, 2009; Upcraft and Schuh, 1996). It is therefore safe to assume that the concepts underlying data-driven planning have been around for years. Within higher education and student affairs, however, there may be less evidence of the actual practice of systematically and strategically gathering data to inform planning.
Data-driven planning is often referred to in higher education as outcomes-based program review. The Western Association of Schools and Colleges (WASC) defines outcomes-based program review as a cyclical process for evaluating and continuously strengthening the quality and currency of programs. The evaluation is conducted through a combination of self-evaluation and peer evaluation by reviewers external to the program or department and, usually, external to the organization (Jenefsky and others, 2009). The results of this process inform strategic planning.
For purposes of this chapter, data-driven planning is defined as a systematic process that gathers programmatic outcomes-based assessment data (for example, data derived from outcomes-based program review) and merges those data with trend, forecast, and capacity data, as well as institutional goals and vision. The results of this process are then used to plan resources, policies, and program design to achieve or refine the intended
institutional vision and goals. For student affairs professionals, this means that strategic planning cannot be done in isolation from university data, such as an understanding of market demand for majors, the pool of prospective students, and the institutional learning outcomes and core values. Informed by these data, student affairs professionals must align each portion of their divisional strategic planning with the overall values of the university.
For the profession of student affairs, this means that results derived
from outcomes-based assessment processes inform action planning and
budgeting. This also means that as the student affairs division staff members gather more data on how well they are meeting institutional priorities,
they can also use the same process to demonstrate achievement of their
own divisional priorities and goals. Departments within the division can
use this process to demonstrate how they are meeting division priorities as
well. This chapter provides an overview of the components of and steps to
establishing such a process.
Steps for Data-Driven Planning in Student Affairs
When organizations embark on strategic planning, key steps must be put
into place. Data-driven planning does not replace those steps; rather, it is
intended to contribute to the refinement of those steps by purposefully integrating planning, assessment, and budgeting processes. For example, when an organization decides through strategic planning that it will become the first-choice regional provider of quality education for first-generation students, it begins to design goals that will help it realize that vision. The strategic plan represents the ideal of what the institutional leadership desires to achieve.
Once the strategic plan is put into place, indicators of success are articulated, and programs are often asked to illustrate how they are achieving the goals and indicators represented by the strategic plan (Drucker, 2000; Fullan and Scott, 2009; McClellan, 2009). The challenge here is that key steps, discussed in this chapter, are occasionally left out in implementation. The result is that organizational members may become frustrated that the organization's vision or strategic plan is not being fully realized. In order to address this initial challenge, it may become important for institutional and divisional leadership to follow some basic steps for data-driven planning. The intent of sharing these suggested steps is to provide institutional and divisional leadership with a framework to consider as they adapt each step, cognizant of their own institutional culture. In many cases, institutions and student affairs divisions already have many of these pieces of data-driven planning in place; they have just not yet pulled them together into a systematic, integrated process.
In order to aid readers with determining how they can pull their processes together to formulate data-driven planning, the proposed steps that
follow are intended to be used as guidelines as opposed to procedures that must be followed in the exact order indicated. The steps are not designed as a linear process. You may find, if you follow the steps in numerical order, that when you get to, say, step 4, you may need to go back and refine steps 2 and 3 because you realized that you were collecting data that will not really inform your strategic plan. Or you may choose to engage in step 1 and then step 4 in that order to figure out how to best approach steps 2 and 3. Thus, the steps are to be used as guidelines in any order that makes sense for your division or institution. As usual, institutional and divisional leaders will need to adapt these steps in accordance with their own culture, dynamics, and resources in order to improve their data-driven planning processes (Banta and others, 2009; Bresciani, 2006; Maki, 2004; Schuh and Associates, 2009; Suskie, 2009).
Step One: Establish a Strategic Plan. Many chapters in this book
discuss the importance of having a strategic plan and illustrate various
ways to accomplish it. The important piece of information to note here is
that there must be an institutionally and divisionally agreed-on strategic
plan from which to work (Bresciani, Gardner, and Hickmott, 2009;
Bresciani, 2006; Schuh and Associates, 2009). Many professionals become
frustrated when there is no agreed-on direction for their organization, and
thus, the following steps become even more challenging to implement
(Drucker, 2000; Fullan and Scott, 2009; McClellan, 2009). In an institution
that is not engaged in strategic planning and therefore lacks institutional
values and goals with which to align, this process then starts at the division
level.
Step Two: Gather Forecast and Trend Data. Sometimes the best
strategic plans and the most inspiring visions and goals can go unrealized
because the planning to create those strategic goals has been done without
considering what the forecast or trend data are illustrating. Forecast and
trend data simply attempt to calculate or predict some future event or
condition. A detailed study or analysis usually informs this type of
conversation (Schuh and Associates, 2009).
The types of data used in forecasting and determining trends are typically institutionally reported. They are often collected and stored by agencies outside the institution—for example, extracts from the College Board data sets or other types of national data sets, such as those gleaned from the Common Data Set, the National Student Clearinghouse, or the Integrated Postsecondary Education Data System. Trend data can also be gleaned from admissions applications, the National Survey of Student Engagement, the Community College Survey of Student Engagement, the Cooperative Institutional Research Program, the College Student Experiences Questionnaire, or Your First College Year surveys. Years of gathering these types of data can illustrate certain trends that can be used in informing whether your strategic initiatives are feasible. (An example is provided later in this section.)
These types of data are often collected or stored at the institutional level. The institutional research office is a good place to start when looking to access and use data that will help in forecasting and identifying trends. If the institutional research office is too busy to assist right away, and it often is, consider contacting the Association for Institutional Research, which has a wealth of resources to assist institutional administrators with this type of institutional data gathering.
In gathering and using data for forecasting or determining trends, the idea is not to become consumed by data but rather to use the data to determine if your strategic goals can be achieved. Perhaps your university vision is to become the first-choice regional provider of quality education for first-generation students. Using this example, your strategic plan has informed a design to implement interventions that will aid first-generation students in their success, but your current plan has no goals to change its outreach processes and plans. In accessing admissions applications data and College Board data, you may discover that the number of first-generation students applying and being accepted by your institution is declining. This would indicate that your vision and your corresponding strategic plan would not be realized unless you also have some initiatives to change outreach to and recruitment of first-generation students.
Before adjusting your strategic plan to focus on a change in outreach and recruitment, you access data from the College Board to identify how many regional students are graduating from high school, taking college placement tests, and being identified as first generation. If you see that the number is high, you can then determine that efforts to develop outreach and recruitment plans may be worthwhile. However, if you discover that the first-generation students graduating from high school are low in numbers and appear to have been decreasing, you may want to reexamine your institutional vision altogether. Institutional and student affairs divisional leadership could also choose to design different types of interventions that work collaboratively with local high schools to increase the number of college-bound first-generation students.
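To make the kind of trend check described above concrete, the following Python sketch fits a simple least-squares trend line to a hypothetical series of first-generation applicant counts and projects it forward. The years, counts, and interpretation are invented for illustration only; actual figures would come from your admissions applications and College Board extracts.

# Hypothetical trend check: are first-generation applications rising or falling?
# All counts below are invented for illustration.
years = [2005, 2006, 2007, 2008, 2009]
first_gen_applicants = [412, 398, 371, 366, 349]

n = len(years)
mean_x = sum(years) / n
mean_y = sum(first_gen_applicants) / n

# Ordinary least-squares slope: estimated change in applicants per year.
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, first_gen_applicants))
slope /= sum((x - mean_x) ** 2 for x in years)

projection_2012 = mean_y + slope * (2012 - mean_x)

print(f"Trend: {slope:+.1f} applicants per year")
print(f"Projected applicants in 2012: {projection_2012:.0f}")

if slope < 0:
    print("Applicant pool is shrinking; the plan likely needs outreach and recruitment goals.")

Even a rough slope estimate like this can signal whether the assumptions behind the strategic plan are plausible before resources are committed.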
Step Three: Conduct a Capacity Review. Trend data as well as additional types of data, such as financial records, financial forecasting, and capital assets, can also be useful in determining the institution's capacity to meet the strategic plan. Borrowing from the Western Association of Schools and Colleges (2008), a capacity review determines whether an institution has the resources to fulfill its strategic mission. In other words, can the institution function “with clear purposes, high levels of institutional integrity, fiscal stability, and organizational structures and processes to fulfill its purposes?” (p. 30).
Identifying meaningful data that indicate whether an institution or division has key institutional resources, structures, and processes in place to fulfill its institutional or divisional mission and strategic plan is important in determining whether changes need to be made in strategic
priorities. Consultation with the institutional research office may enable you to identify, access, and use the most appropriate data to inform your planning.
In order to understand how to use these types of data, we return to our example. Consider that your trend data forecast an increase in first-generation graduates intending to take college entrance exams from your regional high schools, so you know you will have plenty of students applying to your college. However, the data from the College Board also indicate that these students will need more financial aid in order to attend college in the future. Your forecast data show steadily increasing tuition, and your capacity study reveals less available institutional and state grant aid. How do you factor this very real scenario into your strategic planning? What other types of data may you need to collect to make an informed decision?
The idea behind conducting short but informative capacity reviews is that if you are able to identify immediate limitations in the ability to provide the resources needed for realizing the strategic plan, then you may be able to immediately adjust your strategic plan to better reflect your capacity. Or you may choose to adjust the strategic plan to build capacity. The building of capacity to achieve the strategic plan may well become a large portion of that plan.
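As a minimal sketch of the kind of capacity arithmetic this scenario implies, the short Python example below projects a widening gap between tuition and available grant aid under hypothetical growth rates. Every figure is invented for illustration; real values would come from the financial aid office, the budget office, and institutional research.

# Hypothetical capacity check: projected gap between tuition and grant aid.
tuition = 6_800          # current annual tuition (hypothetical)
tuition_growth = 0.05    # assumed 5 percent annual increase from forecast data
grant_aid = 4_200        # current average institutional and state grant aid (hypothetical)
aid_growth = -0.02       # assumed 2 percent annual decline from the capacity study

for year in range(1, 6):
    tuition *= 1 + tuition_growth
    grant_aid *= 1 + aid_growth
    gap = tuition - grant_aid
    print(f"Year {year}: tuition ${tuition:,.0f}, grant aid ${grant_aid:,.0f}, unmet gap ${gap:,.0f}")

A projection like this, however rough, makes it easier to decide whether to scale back the plan or to make capacity building, such as raising additional grant aid, an explicit strategic priority.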
Step Four: Articulate Indicators of Success. Leaders who are operationalizing their strategic plans may clearly articulate the goals derived from the plan, yet not have clearly identified the indicators of success that relate directly to those goals. Rather than just selecting indicators of success that are easy to measure, consider starting by spending time describing what a successful strategic plan looks like when it is implemented (Banta and others, 2009; Bresciani and others, 2009; Bresciani, 2006; Maki, 2004; Schuh and Associates, 2009; Suskie, 2009).
Indicators of success “are quantifiable measurements, agreed to beforehand, that reflect the critical success factors of an organization. They help an organization define and measure progress toward organizational goals” (Reh, 2009, paras. 1, 2). Such indicators are typically gathered and disseminated at the institutional level, but what types of data should an institution collect in order to be able to provide such indicators of success?
Returning to our example, what would it look like when your institution is the first-choice regional provider of quality education for first-generation students? The initial inclination of planners is to jump to performance indicators that articulate expectations for numbers of admits, persistence, graduation, and career placement rates. These indicators are easy to measure and certainly would make sense to report in relationship to achievement of this vision. But what else do we know about first-generation learners? Would we also want to be able to determine how well the environment welcomes first-generation learners and their families and
guardians? How integrated are the services and interventions designed to
support these learners (Kuh and Associates, 2005)?
The idea of this step is to indicate purposefully which indicators will be institutionally identified to determine success of the strategic plan (for example, persistence rates, placement rates) and which will be gleaned from more specific programmatic outcomes-based assessment results (for example, evidence of the effectiveness of various and specific student support programs).
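As an illustrative sketch only, the Python fragment below computes one such institution-level indicator, first-to-second-year persistence of first-generation students, from a toy cohort. The record structure and values are invented; a real calculation would draw on the student information system and the institutional research office's cohort definitions.

# Hypothetical indicator calculation with an invented four-student cohort.
cohort = [
    {"id": 101, "first_gen": True,  "returned_year_two": True},
    {"id": 102, "first_gen": True,  "returned_year_two": False},
    {"id": 103, "first_gen": False, "returned_year_two": True},
    {"id": 104, "first_gen": True,  "returned_year_two": True},
]

first_gen = [s for s in cohort if s["first_gen"]]
persisted = sum(s["returned_year_two"] for s in first_gen)
rate = persisted / len(first_gen)

print(f"First-generation first-to-second-year persistence: {rate:.0%}")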
Step Five: Prioritize Action Plans to Meet the Strategic
Goals. Assuming that your organizational strategic plan has articulated
goals or objectives, consider prioritizing them if possible. (Chapter One
details steps for goal setting and action planning.) This will assist with
prioritizing the action plans that operationalize the strategic plan, which in
turn helps prioritize the resources that will enable the strategic plan to
come to fruition. When institutional leadership prioritizes the strategic plan goals, faculty and staff are more likely to feel empowered to prioritize the investment of their own time in their action plans in order to meet the strategic plan (Banta and others, 2009; Bresciani and others, 2009; Bresciani, 2006; Jenefsky and others, 2009; Schuh and Associates, 2009; Suskie, 2009).
In order to prioritize decisions that align with organizational goals, values, and strategic initiatives, criteria must be considered that will assist in the alignment of proposed action plans to the organizational goals, values, and strategic initiatives. Although this chapter cannot anticipate the types of criteria that may best represent various organizational structures, the following questions, adapted from Fred McFarlane (personal communication, February 12, 2007), former department chair of administration, rehabilitation, and postsecondary education at San Diego State University, may assist institutions in formulating their own criteria:
• How well does the proposed action plan fit with our organizational goals, values, and strategic initiatives?
• Within that fit, how will the action plan benefit current students (for example, residential students, commuters, first generation)?
• How will the proposed action plan affect future students (for example, recruitment, new student populations, and their progression from undergraduate to graduate degrees)?
• How will the proposed action plan increase the impact of the department in relationship to the goals and sustaining objectives of the department and the division?
• How will we know whether the proposed action plan will be effective in increasing the impact of the department on the students?
• Does the proposed plan meet the criteria in that it is consistent with our values and beliefs (for example, access, equity, and student success), financially viable (for example, does it cover the costs, and
can it be leveraged for continued development; note that one-shot
efforts take a great deal of time and often diffuse resources and
energy), consistent with our professional development, and
consistent with our passion and commitment to student learning
and development?
Posing such questions begins to develop criteria for prioritizing current outcomes as well as the great ideas for improvements that result from engaging in outcomes-based assessment (see step 7).
Step Six: Align Division Resources with Institutional Priorities. This step may appear a bit similar to previous steps, but nevertheless it is important to consider. The prioritization of the division resources toward strategic initiatives influences the availability of resources to improve more refined levels of action plans. And the decisions to refine the action plans are informed by results of outcomes-based assessment (see step 7). If your institution is bound by a governance structure that gives you very little room to allocate resources in accordance with your strategic plan, then this step will be very quick for the institution to complete, because you are constrained by an inability to prioritize the resources on your own. If the institutional governance allows more flexibility in the allocation of resources, then the idea is to make available certain resources for the improvement and refinement of strategic priorities that can be allocated based on the results of outcomes-based assessment or on the proposals of innovative action plans to improve strategic indicators and initiatives.
Step Seven: Implement Outcomes-Based Assessment Program Review. Implementing outcomes-based assessment plans for the action plans to achieve the strategic plan will help in gathering meaningful data about how well you are achieving your strategic plan. If assessment is done well, the results will yield specific information on what needs to be improved in order to refine the strategic indicators articulated in step 4 (Banta and others, 2009; Bresciani, 2006; Bresciani and others, 2009; Jenefsky and others, 2009; Maki, 2004; Schuh and Associates, 2009; Suskie, 2009).
The following sections set out typical components of an assessment
plan and report.
Program Name. The program name helps indicate the scope of the assessment project. Are you planning on assessing a series of workshops within the leadership development center, or on evaluating the entire leadership development center? Often it is difficult to determine the scope of an assessment plan (Schuh and Associates, 2009). When in doubt, organize the plan around programs that have autonomous outcomes (Bresciani and others, 2004; Bresciani and others, 2009).
Program Mission or Purpose. List the program mission or purpose
statement. It may also be helpful to provide a one- or two-sentence
explanation of how this program mission or purpose aligns with the mission of the department, college, division, or university within which it is organized. Setting this out will help explain how the program aligns with institutional values and priorities.
Program Goals. Goals are broad, general statements of what the program wants students to be able to do and to know, or what the program will do to ensure what students will be able to do and to know. Goals are not directly measurable. Rather, they are evaluated directly or indirectly by measuring specific outcomes derived from the goals (Bresciani and others, 2004; Bresciani and others, 2009). The further alignment of each goal to department, college or division, or university goals or strategic initiatives generated from the strategic plan assists with the communication of priorities and allows programs to show how they are operating within stated priorities. In addition, the alignment of each goal with professional accreditation standards, if applicable, allows you to determine how this program intends to meet higher-level organization goals and strategic planning initiatives.
Outcomes. Outcomes are more detailed and specific statements derived from the goals. They are specifically about what you want the end result of your efforts to be. In other words, what do you expect the student to know and do as a result of, for example, a one-hour workshop, one-hour individual meeting, Web site instructions, or series of workshops? Outcomes do not describe what you are going to do to the student, but rather how you want the student to demonstrate what he or she knows or can do (Bresciani and others, 2004; Bresciani and others, 2009).
In addition, you want to be able to align each outcome with a program goal. This alignment allows you to link your outcomes to department, college or division, or university goals and strategic initiatives, as well as professional accreditation standards. Such alignment allows you to determine how this program intends to meet higher-level organization goals and strategic planning initiatives.
Planning for Delivery of Outcomes. This is where action planning comes into the process. Here is where you describe or simply draw a diagram that explains how you plan for the student to learn what you expect the student to learn in order for the outcome to be met. Do you plan for the students to learn what you expect them to in a workshop, one-on-one consultation, or a Web site? Simply indicate all the ways in which you provide students the opportunity to achieve the learning outcome. Identifying where outcomes are being taught or delivered also provides reviewers with opportunities to identify where that outcome may be evaluated.
Evaluation Methods and Tools. Often the evaluation method or tool section of the assessment plan can be intimidating to practitioners. This section is not intended to include detailed research methodology. It is intended to simply describe the tools and methods (for example, observation with a criteria checklist, survey with specific questions identified,
essay with a rubric, role-playing with a criteria checklist) you will use to evaluate the outcomes of participants in specific programs. In this section, you identify the sample or population you will be evaluating, identify an evaluation method or tool for each outcome, and include the criteria that will be used with the tool to determine whether the outcome has been met—for example:
• If the tool to measure an outcome is a survey, which questions in the survey are measuring the outcome?
• If the tool is a test, which questions measure the outcome?
• If the tool is an observation, what are the criteria that you apply to the observation in order to identify whether the outcome has been met?
Add limitations of the evaluation method or tool if necessary. Limitations are reminders to you and the reviewer that while the evaluation process may not have gone as well as intended, you recognize the limitations and have documented them to be considered in decision making or for improvements to be made the next time. In addition, select other institutional, system, or national data (for example, enrollment numbers, faculty-to-student ratios, retention rates, graduation rates, utilization statistics, satisfaction ratings, National Survey of Student Engagement scores) that will be used to help you interpret how and whether the outcome has been met.
Implementation of Assessment Process. This is the planning section for the implementation of the assessment process. Not everything has to be evaluated every year. You can simply evaluate two or three outcomes each year, creating a multiyear assessment plan whose final year feeds into the comprehensive program review process. Identify who is responsible for doing each step in the evaluation process. Outline the time line for implementation, including the years in which each outcome will be evaluated (so as not to indicate that everything must be evaluated every year). Also include which year you will be reviewing all prior outcomes data results (for example, the comprehensive program review year) for a holistic program review discussion.
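To illustrate such a rotation, the hypothetical schedule below sketches, in Python, how two or three outcomes might be assigned to each year, with the final year reserved for the comprehensive program review. The outcome names and responsible roles are invented for illustration.

# Hypothetical multiyear assessment schedule; names and roles are invented.
schedule = {
    2011: {"outcomes": ["workshop_learning_outcome", "advising_referral_outcome"],
           "lead": "assistant director"},
    2012: {"outcomes": ["peer_mentor_training_outcome", "family_orientation_outcome"],
           "lead": "program coordinator"},
    2013: {"outcomes": ["comprehensive_program_review"],  # all prior results reviewed holistically
           "lead": "department director"},
}

for year, plan in schedule.items():
    print(year, "-", plan["lead"] + ":", ", ".join(plan["outcomes"]))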
In addition, identify other programs that are assisting with the evaluation and when they are assisting. Include time lines for external reviewers (including professional accreditation reviews, if applicable) and for communication across departments or colleges. Identify who will be participating in interpreting the data and making recommendations, along with a time line for implementing the decisions and recommendations. Finally, be sure to outline how lines of communication will flow. Who will see the results, when will they see the results, and who will be involved in determining whether the results are acceptable?
Results. Summarize the results for each outcome as well as the process to verify, validate, or authenticate the results. This may include how results were discussed with students, alumni, other program faculty and administrators, or external reviewers. Link the results generated from the outcomes-based assessment to any other program, college, or institutional performance indicators.
Reflection, Interpretation, Decisions, and Recommendations. This section summarizes the decisions and recommendations made for each outcome and illustrates how you determined if the results were satisfactory. It therefore requires describing the process used to inform how the level of acceptable performance was determined and why it was determined as such.
Illustrate how decisions and recommendations may be contributing to the improvement of higher-level goals and strategic initiatives. Identify the groups that participate in the reflection, interpretation, and discussion of the evidence that led to the recommendations and decisions. It may then be helpful to summarize the suggestions for improving the assessment process, tools, criteria, and outcomes. Finally, be sure to identify when each outcome will be evaluated again (if the outcome is to be retained) and who is responsible.
Documentation of Higher-Level Feedback. This section is designed to
document how results are used and how the results are disseminated
throughout the institution. The intent is to document conversations and
collaborations that are being implemented in order to systematically and
institutionally improve student learning and development. Include the
routing of the recommendations or decisions (for example, who needs to
see the recommendations or be involved in the decision making) if
resources, policy changes, or other information was required outside the
scope of the program. For example, if you are the program coordinator and the decisions you and your students recommend require the approval of the department director, then you need to indicate that the approval of the decision must flow through the department director.
Appendixes. Include any appendixes that may help illustrate the manner in which you evaluate your program. For example, you may want to include the curriculum alignment or outcome and delivery map or the tools and criteria to evaluate each outcome. You may also choose to include any external review of the plan, results, or decisions and what was concluded from that external review. Include any budget plans and resource reallocation or allocation documents as well (Bresciani, 2010).
Step Eight: Allocate and Reallocate Resources to Help Realize the Goals. Jenefsky and others (2009) discuss in detail how outcomes-based program review provides an effective way for institutional leadership to use systematically collected data to inform specific decisions for improving strategic plan initiatives. Thus, the findings and recommendations from step 7 can be used as evidence to inform decision-making processes at
various levels in the institution (for example, from the program level
through the university level).
In order to frame this discussion, remember that some suggestions to improve strategic initiatives can occur with very little resource reallocation (for example, resequencing process steps, refinements in the criteria for student evaluation, or reorganization of workshop material). Other findings may point to a need for a larger reallocation of resources, ranging from staff development for assessment to hiring more staff or faculty members to fill unmet needs.
Step Nine: Make It All Systematic. The final step in this process is to make the entire data-driven planning process systematic. This requires institutional leadership to schedule periodic holistic reviews of their processes in order to ensure that they are working together to inform data-driven planning. There are several things to consider when creating a systematic, sustainable, data-driven planning process. The first is to build collaborations across departments, colleges and divisions, and hierarchical structures so that information can flow in an environment of trust. Second, review position descriptions and personnel review processes to ensure that faculty and staff are constantly reminded of the importance of engaging in data-driven planning, given professional development opportunities to learn how to do this well, and rewarded for using data to inform decisions. Third, maintain the integrity of the data by being forthright about how data will be used for planning purposes, resource reallocations, and professional development opportunities. Fourth, consistently use data and provide systematic processes for communicating how the data are used to inform decisions and planning. This will motivate faculty and staff engagement in the process. Finally, identify strategies to keep morale high when someone's program is not selected as an institutional priority.
Conclusion
Ensuring these steps are followed will more than likely lead your institution to establish an effective data-driven planning process. The gathering and analysis of data also has the potential to strengthen the implementation of a well-documented plan. The ongoing cycle of evaluation and assessment will ensure the plan's effectiveness.
References
Banta, T., Jones, E., and Black, K. Designing Effective Assessment: Principles and Profiles of Good Practice. San Francisco: Jossey-Bass, 2009.
Bresciani, M. J. Outcomes-Based Academic and Co-Curricular Program Review: A Compilation of Institutional Good Practices. Sterling, Va.: Stylus Publishing, 2006.
Bresciani, M. J. “Assessment and Evaluation.” In J. Schuh, S. Jones, and S. Harper (eds.), Student Services: A Handbook for the Profession. San Francisco: Jossey-Bass, 2010.
Bresciani, M. J., Gardner, M. M., and Hickmott, J. Demonstrating Student Success in Student Affairs. Sterling, Va.: Stylus Publishing, 2009.
Bresciani, M. J., Zelna, C. L., and Anderson, J. A. Assessing Student Learning and Development: A Handbook for Practitioners. Washington, D.C.: National Association of Student Personnel Administrators, 2004.
Drucker, P. “Managing Knowledge Means Managing Oneself.” Leader to Leader, 2000, 16, 8–10.
Fullan, M., and Scott, G. Turnaround Leadership for Higher Education. San Francisco: Jossey-Bass, 2009.
Jenefsky, C., and others. WASC Resource Guide for Outcomes-Based Program Review. Oakland, Calif.: Western Association of Schools and Colleges, 2009.
Kuh, G. D., and Associates. Student Success in College: Creating Conditions That Matter. San Francisco: Jossey-Bass, 2005.
Maki, P. L. Assessing for Learning: Building a Sustainable Commitment Across the Institution. Sterling, Va.: Stylus Publishing, 2004.
McClellan, E. “Promoting Outcomes Assessment in Political Science Departments: The Role of Strategic Planning.” Paper presented at the annual meeting of the APSA Teaching and Learning Conference Online, Baltimore, 2009. Retrieved May 26, 2009, from http://www.allacademic.com/meta/p11617_index.html.
Reh, F. J. “Key Performance Indicators: How an Organization Defines and Measures Progress Toward Its Goals.” 2009. Retrieved July 24, 2009, from http://management.about.com/cs/generalmanagement/a/keyperfindic.htm.
Schuh, J. H., and Associates. Assessment Methods for Student Affairs. San Francisco: Jossey-Bass, 2009.
Suskie, L. Assessing Student Learning: A Common Sense Guide. (2nd ed.) San Francisco: Jossey-Bass, 2009.
Upcraft, M. L., and Schuh, J. H. Assessment in Student Affairs: A Guide for Practitioners. San Francisco: Jossey-Bass, 1996.
Western Association of Schools and Colleges. “Handbook of Accreditation.” 2008. Retrieved July 24, 2009, from http://www.wascsenior.org/findit/files/forms/Handbook_of_Accreditation_2008_with_hyperlinks.pdf.
MARILEE J. BRESCIANI is a professor of postsecondary education and codirector
of the Center for Educational Leadership, Innovation, and Policy at San Diego
State University.