
Running head: FUNCTIONAL AREA STUDY

The Student Affairs Profession: Functional Area Study


Katherine M. Knight
Loyola University Chicago



Functional Area Study
Before I began my shadow experience, I thought I had a good grasp of what a career
services department looked like, as I frequently visited Career Services at The University of Iowa
as an undergraduate student. I was proven wrong the second I walked into Melissa
Smith's office. With a team of 50 employees housed in one of the most historic buildings on
campus, The University of Chicago's Career Exploration department provided a unique, eye-opening shadow study.
As an undergraduate student at The University of Iowa, I had a very positive experience
with the institution's career services department, and I had a career advisor I visited often. When
I went through the career exploration process a second time, after working in finance
following graduation, my positive experiences with career services made the list of student affairs
experiences that led me to pursue a master's degree in higher education. While I am still trying
to figure out what my passion is within student affairs, I decided to shadow a career services
professional to get a taste of that particular career path.
I work part time at the Illinois Institute of Technology (IIT), and I wanted to gain shadow
experience at an institution in the Chicago area with which I was not familiar. The University of
Chicago was appealing to me, as it was in a part of Chicago I had never visited, and I wanted to
experience the history and architecture on campus that I had heard about from colleagues. The
prestige and reputation of The University of Chicago drew me in as well, and both would be evident
in what I experienced during the two days I was on campus.
Melissa Smith is a Career Adviser for the Student Preparation team within Career
Exploration, and she has been in the student affairs field for more than three years. I chose
Melissa through a combination of random clicking on the Career Exploration website and the
commonalities I found in her bio. Melissa completed her master's degree at Wright State
and worked in academic advising before taking her position at The University of Chicago. I was drawn
to her passion for developing students and our common hobbies, including running and spending
time with friends and family. Through email communication, she was more than willing to
meet with me on two separate occasions to give me a good sample of what the Career
Exploration department had to offer.
Shadow Study
The shadow study was broken up over two days, one session in mid-October and one
session in early November. Each session lasted around three hours and was held in the morning. When I
arrived to shadow on the first day and walked into Ida Noyes Hall, I was amazed to find that
Melissa's office was in a very beautiful, historic building on campus. This building was vastly
different from the modern Pomerantz Center I had experienced at The University of Iowa. I later
learned that Ida Noyes Hall doubled as a venue for a variety of banquets and weddings. I found it
interesting that the Career Exploration office resided in such a prestigious building, though I am
unaware of where the other student affairs offices are located on The University of Chicago
campus.
Melissa came down to meet me, and we walked upstairs to the Career Exploration staff
offices. Melissa's office had a window with access to a balcony with a beautiful view of the rest
of campus. The space was very open, and while every staff member had their own space, they
could still communicate openly with one another, and there was a constant buzz and
energy in the room.
Melissa and I scheduled our first meeting at a time when she had multiple student
appointments. I was fortunate that all of the students she saw that day were more than
willing to have me sit in on their one-on-one meetings. The students we met with that day were
looking for help on a variety of follow-up items from the fall job and internship fair. One student
was looking for interview preparation, and Melissa volunteered to go through some commonly
asked interview questions to evaluate answers the student had prepared. The most common types of
student meetings Melissa experiences are those involving résumé and cover letter help, which
was reflected in the questions students had in our meetings that day.
In our downtime between meetings, I was able to ask Melissa questions about the other
services Career Exploration offered, and we talked about many of her duties outside of daily
student meetings (M. Smith, personal communication, October 22, 2014). One of the unique
features of the Career Advancement department is that all incoming first-year students are assigned a
career advisor. These students are not required to visit their career advisor by a certain time;
they receive the name of a staff member as a resource to use when they choose. However, in
order to unlock all that Career Advancement has to offer, a student must visit a member of the
Career Advancement team and sign a form acknowledging that they are representing The
University of Chicago in any of the services offered to them by Career Advancement. I sat in on
a meeting with a student who was looking for some résumé help. Through his visit to the Career
Advancement office, he was able to get personal résumé help from Melissa, and he gained access
to services that would continue to help him get an internship for the summer.
Melissa explained several of the services that Career Exploration offers to University of
Chicago students (M. Smith, personal communication, October 22, 2014). A portion of the
services were available online, while other, more comprehensive programs included interviews,
field trips, and banquets. Some of the services offered online include résumé and cover letter
templates, a career exploration self-assessment, and an extensive alumni database. The database
listed alumni by the degree they achieved, the field they went into, and their geographic
location, as well as contact information for current students to use in setting up informational
interviews.
The more extensive programming put on by Career Advancement was very interesting to
learn about. I was surprised at how many different types of experiences were available to
students, allowing them to jump-start their career and personal explorations in a variety of
ways. Treks, CAAP, and the Taking the Next Steps program were all programs offered
by The University of Chicago that I had never heard of in my experience at other
institutions. A Trek is a two- to three-week trip to various cities around the world where
second-year students have the opportunity to hold informational interviews and potentially
complete small projects in the field and geographic location of their choice. The Chicago
Academic Achievement Program (CAAP) is a program for incoming first-year students who
identify as first generation or are from a rural community. CAAP brings these students on
campus and hosts programs in collaboration with other campus departments, including trips to
businesses downtown, to help ease the transition to college in a large urban setting. Lastly, the
Taking the Next Steps program is a banquet held for over 1,000 second- and third-year students
and 200 alumni at the Hilton Head Hotel. This program serves as a round-table networking event
with keynote speakers to inform students about what is available after graduation.
For our second meeting, Melissa wanted to be sure to include me in some of the staff
meetings she attends weekly (M. Smith, personal communication, November 12, 2014). I was
able to sit in on the weekly Career Advancement staff meeting, and it was there that I realized
how large the department is. The meetings take place in a large room where all 50 members of
the team are expected to attend. Per Melissa, aside from department updates, the team often
brings in speakers from campus partners to share how Career Advancement can benefit from
their services. At the meeting I attended, Student Disability Services (SDS) presented on the
processes a student goes through to receive their services, and pointed out that students can use
SDS for help with job applications, interviews, and informing potential employers about the
accommodations they may need.
One of the staff updates that I found extremely interesting pertained to the University's
accreditation process. The University of Chicago is going through the reaccreditation process,
and a part of that involves a large focus on placement rates after graduation. The director
assured the staff that she was more than pleased with the progress and success of the department,
and informed them that many staff members might be asked to pull statistics and complete the
required reports on their students. I asked Melissa about her role in this process; since she is part
of the Career Preparation team, she would not be one of the people asked to pull numbers, but
she may serve as a secondary reader for the reports generated by other team members (M. Smith,
personal communication, November 12, 2014).
The most surprising part of my shadow study was learning about the size, organization,
and makeup of the Career Advancement team. The 50-member team is divided into over 10
different departments, including Experiential Education, Employer Relations and Development,
Student Preparation (Melissa's team), and Career Specialist Advisers in fields including health
professions, business, public and social service, science, technology, engineering, and math
(STEM), and education. What is even more fascinating is that this large team consists of many
industry professionals rather than the student affairs personnel who staff Melissa's team. In
addition, Melissa's Student Preparation team, composed of only six members of the larger team,
is required to go through supplementary counseling training as part of new hire training
facilitated by Melissa (M. Smith, personal communication, November 12, 2014).
The shadow experience forced me to reflect on how much consistent interaction with the
same students I would like in my future career. I am drawn to career services based on my
positive experiences with career services as an undergraduate, and in the past, I have seen it as a
way to integrate my background in business with the student affairs profession. Melissa, a former
academic advisor, expressed that what she missed most about her role as an academic advisor
was the consistent one-on-one interaction with students (M. Smith, personal communication,
November 12, 2014). In Career Advancement, Melissa sees some students consistently, but it is
harder to build more personal relationships with students. My personal philosophy toward the
student affairs profession focuses heavily on building meaningful relationships with students and
having a strong presence in their support group. I am hesitant about career services because,
even though I would still have an opportunity to help a student figure out what career best fits
their values and interests, I may not see as many students on a consistent basis.
Based on my shadow experience, the two core competency areas developed by ACPA
and NASPA (2010) needed to practice career services are advising and helping, and assessment,
evaluation, and research. During my time with Melissa, I witnessed her providing effective
counseling services to individuals during her one-on-one meetings. Melissa also explained her
role in facilitating counseling training for all new staff members (M. Smith, personal
communication, October 22, 2014). Finally, during the weekly staff meeting, I observed how
Career Advancement collaborates with other campus departments by bringing in fellow
professionals students may need in various capacities, like Student Disability Services.
Providing effective counseling and collaborating across campus both fall under the advanced levels of
competency in advising and helping (ACPA & NASPA, 2010). Without an advanced
competency in this area, career services departments may not be using resources to the best of their
abilities to develop students.
With the reaccreditation process at The University of Chicago, Career Advancement
could benefit from, at the minimum, basic assessment, evaluation, and research competencies.
Through these competencies, Career Advancement could better define program and learning
outcomes for their services, and how those outcomes relate to organizational goals and the goals
of the reaccreditation process (ACPA & NASPA, 2010). With basic competencies in assessment,
evaluation, and research, Career Advancement could also ensure appropriate data collection for
reaccreditation.
Concluding my shadow study, one of the most important things I learned was that every
institution structures and values its career services department differently. Compared to
The University of Iowa, The University of Chicago has developed a stronger commitment to
developing a student's career path through its large and diverse team. That commitment
shows through the organizational structure of the department and the resources the department
provides and receives.
Issues Exploration
Sitting in the staff meeting during my second day of shadowing, I was intrigued by the
way the department stressed the importance of placement rates as they related to the reaccreditation
process. On my first day of shadowing, I felt empowered by and confident in the work Melissa and
her Student Preparation team were doing in terms of student development. I had little previous
knowledge of how placement rates are calculated, but from what I could recall, I knew they did
not incorporate any sort of student development or satisfaction component at most institutions.
From this reflection, I decided to focus on the issue of placement rates: how they are calculated,
how they are used, and how they can improve for the betterment of higher education.
The topic of placement rates after graduation is a present and important issue in higher
education today. Higher education in America is constantly under scrutiny for the accountability
and outcomes of students with college degrees (Severy, 2011). With many students and parents
today using placement rates to aid in making decisions on which institution to attend, institutions
are feeling more pressure to calculate and publish these statistics (Sandoval, 2012). Families and
potential students are not the only ones using these rates, however. President Obama's Scorecard
and many national ranking sources are also using placement rates to determine what makes an
institution successful (Kiley, 2013). While many believe the ultimate goal of higher education is
to prepare undergraduates for a career, how does the calculation of placement rates reflect what a
student has truly gained during their time at an institution (Severy, 2011)? The mission of many
student affairs related departments is to help students achieve educational and personal
development goals they have set for themselves (Kuh, 2013). Kuh (2013) states, "In an era of
heightened emphasis on accountability and transparency, it is imperative that student affairs
professionals use policies, programs, and practice known to have desired effects on various
dimensions of student performance" (p. 258). As placement rates gain more importance in
determining the success of an institution and of students after graduation, it is critical that the
benefit students receive from student affairs professionals is highlighted in these statistics.
The issue of placement rates is very relevant to career services. The University of
Chicago, per my conversations with Melissa, will be pulling numbers for graduation rates for its
reaccreditation process (M. Smith, personal communication, October 22, 2014). However, the
immense benefits of Melissa's advising and coaching through her role on the Student Preparation
team could be overlooked in those numbers. Melissa stated that she was not needed for the
reaccreditation process, and may only be used as a second reviewer of information. According
to Severy (2011), some career centers, like The University of Chicago's, are responsible for this
statistical data on student outcomes. These questions are not unfamiliar, but they now carry a
new importance, as these numbers are increasingly difficult to collect and are not very reliable
(Severy, 2011). The question, then, is how higher education institutions, ranking systems,
and government programs can regulate the types of data used in formulating placement rates,
improve the accuracy of the data collected from alumni, and include the qualitative work
of student affairs professionals in this data.
Research Findings
"Accountability for performance is today's mantra in higher education" (Wellman &
Ehrlich, 2003, p. B216). Institutional results and the value of higher education in our society
today are constantly being discussed among governing boards, state commissions, and accreditation
agencies throughout the country (Severy, 2011). As tuition prices increase along with the number of
graduates unable to locate employment after graduation, many institutions feel pressure from families,
students, and policy makers to turn to alumni surveys to produce quantifiable evidence of
positive performance (Rogers, 2013). However, one issue with heavy reliance on these
placement rates is that there are no standards in place across institutions for how these numbers are
calculated (Sandoval, 2012). Institutions vary on when they survey alumni, the number of
students reached, and what the word "placement" means.
Institutions that report placement rates after graduation differ in the length of time they wait
before surveying alumni. For example, when comparing institutional placement rates, one
institution could have collected alumni responses six months after graduation, and another may
have collected responses on graduation day. Colorado College and Colgate University differ in
this exact way (Sandoval, 2012). The day before graduation, Colorado College surveys all
graduates on their plans after graduation. In contrast, Colgate University collects alumni surveys
until six months post graduation. Colorado College reported that 53 percent of the class of 2012
was employed, whereas Colgate University reported a much higher 72 percent employment rate.
At face value, it appears as though Colgate University has greater success in helping students
find employment after graduation. Looking at when the data was collected, however, shows a discrepancy
in how the placement rates were derived.
The timing of alumni surveys is not the only factor that contributes to the discrepancies in
placement rates among institutions. The number of alumni who respond to the surveys differs
between institutions, and those often small samples are the numbers used to represent an entire
class of graduating students (Rogers, 2013). Keene State University and Syracuse University
both report placement rates to prospective students. Keene State University reported a placement
rate of 94 percent, and Syracuse University reported a slightly lower 84 percent placement rate
for the class of 2012. Looking deeper into how these statistics were calculated, however, both
institutions received survey results from less than 50 percent of the graduating class, with Syracuse
at a 42 percent reporting rate and Keene State at a 38 percent reporting rate. Though both
institutions reported relatively high placement rates after graduation, those rates represent
a minority of the graduating class.
These low response rates are not uncommon for institutions nationwide. The National
Association of Colleges and Employers reported that only one third of colleges that report
placement rates had alumni response rates greater than 75 percent (Sandoval, 2012). With low
response rates, it is also important to look at the characteristics of the alumni who respond. At
Keene State, employment status does not affect the number or type of students who reply to
alumni surveys, regardless of whether recent alumni are employed. In contrast, Mark
Schneider at the American Institutes for Research reports that if an institution gets an alumni
response rate lower than 50 percent, it is likely that only the most successfully employed
students are responding, skewing placement rates even further (Sandoval, 2012).
A third important component of calculating placement rates is determining what
constitutes "placement." Unpaid internships and part-time positions often count as sufficient
placement after graduation (Sandoval, 2012). More importantly, many institutions do not
account for underemployment or ask if the positions alumni hold even require a degree. Kansas
State University double-checks its surveys to be sure they do not count unpaid internships,
but other institutions are not required to do the same when determining who has been adequately
placed after graduation.
Taking into account the variety of institutions that calculate placement rates and
the diversity in how those rates are calculated, it is safe to say that many placement rates are hard
to compare (Severy, 2011). Though these rates appear unreliable and difficult to compare, there
are many outside sources that use self-reported placement rates to measure institutional success.
President Barack Obama's College Scorecard and many accreditation agencies use institutionally
reported placement rates in their data collection (Kiley, 2013; Cragg, Henderson,
Fitzgerald, & Griffith, 2013).
In 2013, President Barack Obama announced a College Scorecard that would better aid
prospective students in comparing institutions and receiving the most bang for their educational
buck (Kiley, 2013). The data presented in the scorecard include average net price, graduation rate,
loan default rate, median borrowing, and employment. The employment component of the
scorecard is calculated using information about employment after graduation, self-reported by
institutions. Though the institutions are all ranked with the same criteria, the employment
component of the scorecard does not eliminate the discrepancies in variables previously discussed.
One of the biggest critiques of the scorecard is that it represents such a narrow view of how
a prospective student chooses a college and dismisses many of the reasons why higher education
institutions exist.
Accreditation today is not so much about signaling high quality; rather, a lack of
accreditation is a red flag for a school of low quality (Kiley, 2013; Cragg, Henderson,
Fitzgerald, & Griffith, 2013). Regardless of which of the six regional accrediting agencies is
assigned to an institution, they all have the same goal of ensuring institutional quality, and
institutional assessment plays a large role in the accreditation process. Placement rates after
graduation are considered a direct form of assessment, and alumni survey data is considered an
item that may be used in the assessment and planning reporting process. Recently, accrediting
agencies have been held more responsible for institutional effectiveness due to an increased push
for accountability at the federal and state levels. Within this focus on accountability is the need for
assessment, and while placement rates are a small portion of the assessment judged by
accrediting agencies, it is clear from my shadow study that some institutions still place a
large emphasis on placement rates as an effective form of assessment.
Between the ways placement rates are calculated and the ways outside institutions,
families, and students use those rates, there seem to be many factors contributing to student
experience and student success missing from these alumni surveys. The frequent use of
quantitative statistics is understandable, as numbers are quick, measurable, and concrete (Rogers,
2013). However, there are many more qualitative measures of institutional and student success
that go beyond short-term postgraduate employment (Kiley, 2013). These qualitative measures
can be used by students, families, and outside institutions in addition to serving as an effective
form of institutional assessment both internally and externally.
Personal Experiences
As I was looking at schools as an incoming undergraduate, I placed a heavy emphasis on
placement rates, as my parents stressed the importance of choosing a major that would garner me
a job after graduation. After deciding to pursue a business degree, I looked up placement rates
for the schools in the state of Iowa to see which business school produced the most graduates with
full-time employment that year. Now, two years removed from my undergraduate experience,
based on this research and what I have personally reported to The University of Iowa, I can see
where the numbers I initially looked at may not have been as accurate as I believed.
Immediately upon graduation, I filled out a survey on my plans after graduation,
including my place of employment, the field my position fell under, and the salary I was about to
make.
I have since left the business world and entered the world of higher education.
In my new role as a graduate student, I no longer work full time. I have part-time positions that
relate to my graduate work, but these positions do not pay the same salary as my first job after
graduation. The most important factor in my career change is that I have not yet reported this
information to Iowa, nor have I been prompted to update my information. Therefore, in The
University of Iowa's statistics for the class of 2013, my placement information is no longer up to
date. Had I looked into how placement rates were calculated at different institutions, I may have
placed a greater emphasis on looking at the programs career services had in place to aid in
my professional development.
Recommendations and Best Practices
According to Severy (2011), many career centers feel as though they have a responsibility
in calculating statistical outcomes. As the statistical outcomes become more important, the
numbers become more difficult to collect. Unless an institution is willing to spend an immense
amount of time and money on outreach to alumni, the accuracy of these numbers will not
improve. In addition, if there are no standards in place across institutions for calculating
statistical outcomes, then regardless of the time and money spent on outreach, it will still be hard to
compare schools that collect the data differently. To improve assessment, an institution
can change the questions it asks in alumni surveys, and outside organizations can set institution-wide
standards for calculating placement rates.
Many alumni surveys do not account for value-added measurements including, but not
limited to, underemployment and alumni satisfaction (Severy, 2011). Surveys generally focus on
two questions: whether alumni are gainfully employed and what their salary is (Grasgreen, 2013).
Instead of relying heavily on the numerical data of recent alumni, student affairs professionals
and higher education institutions could benefit from asking alumni more developmental questions.
Many institutions, and one company partnership, are beginning to add questions
to alumni surveys to help develop a fuller picture of students' lives after graduation.
For starters, Keene State University has started to ask recent graduates questions about
whether their current jobs relate to their volunteer, work, and research activities as students
(Rogers, 2013). In addition, Keene State University is asking alumni about the activities
they participate in outside of work that they may find meaningful, including exercise,
hobbies, and volunteer activities. Syracuse University has made moves to increase the detail in
the placement of its recent graduates. Going beyond the company alumni work for, Syracuse asks
about the type of employment, such as internships or part-time positions, which many institutions
leave out. Syracuse, like Keene State, also asks if students are in a career field that relates to
their undergraduate field of study. Both Syracuse and Keene State are looking at ways to expand
their surveys; however, both universities realize that more expansion is needed.
Possibly the most progressive alumni survey system of recent years is the Gallup-Purdue
University Index (Ray & Kafka, 2014). The Index was created to explore the relationship
between a student's college experiences and their engagement in work after graduation. The
study set out to show that a college student's experiences affected life after college
regardless of what institution they attended, taking the conversation away from placement rates
of specific institutions altogether (Ray & Kafka, 2014). In the survey, recent graduates are
asked about their experiences in college in addition to their satisfaction and engagement in
careers and life after graduation. Sample questions about a student's college experience
cover extracurricular activities, internship opportunities, and faculty interaction. Questions
about alumni's postgraduate experiences look at whether a student likes what they are doing
at work each day, if they are recognized for improving the community, and the types of support
they are getting from coworkers, friends, and family in their career (Grasgreen, 2013).
This survey takes the emphasis off the actual placement of graduates in the workforce and
sheds light on how engaged a student is after graduation. Although the survey is brand new, and
there are sure to be critiques of the methodology, the data is beginning to show that great jobs
and careers after graduation correlate with a student's experiences outside of the
classroom (Ray & Kafka, 2014). The data suggests that what a student is involved in during
their time at an institution, and how they experience those involvements, has an overwhelming
effect on a student's success after graduation. These findings are great for showing the
importance of the student affairs profession and the holistic development of students (ACE,
1949).
In contrast to altering the questions offered in alumni surveys, institution-wide
standards for data collection could be set. A large part of why it is hard to compare placement
rates across institutions is the unreliability and inconsistency of the data across universities. If
institution-wide standards were set, either by accreditors or lawmakers, comparisons between
placement rates of institutions would be more accurate. The challenge, however, is determining
standards that are accessible to all institutions. Given the amount of funding required to reach out
to and follow up with alumni to gather these statistics, setting a standard minimum percentage of
alumni surveyed would require institutions with very different levels of resources to reach the
same percentage of students. Standardizing what the word "placement" means across the board
would help in determining which students are working in full-time jobs versus part-time jobs or
unpaid internships after graduation. This clarification still does not show a prospective student
or family how effective an institution is at aiding students in finding meaningful work and life
connections after graduation (Rogers, 2013).
Importance of Assessment
Accuracy aside, calculating placement rates after graduation is a form of institutional and program assessment used internally, by external organizations, and by families and students. "The ability to incorporate assessment and evaluation into day-to-day practices is emerging as a necessary competency for student affairs practitioners" (Bresciani, 2011, p. 321).
The continued importance of assessment, along with further research on the accuracy of the assessment practices already in place, opens the conversation about other effective forms of assessment.

Palomba and Banta (1999) define assessment, encompassing both qualitative and quantitative data, as the collection of information about educational programs undertaken to improve student learning and development. The way many institutions currently calculate placement rates after graduation does not specifically assess how students have developed during their time at the institution. Career services in particular could benefit from a variety of outcomes-based assessments (OBAs) when evaluating departmental programming. An outcomes-based assessment, like a calculated placement rate, includes establishing identifiable end results of student learning and development, ensuring students have opportunities to reach those end results, gathering and interpreting data to determine whether students met those end results, and using the results to celebrate, inform planning processes, reallocate resources, and revisit plans (Bresciani, 2011). The University of Chicago's Trek, CAAP, and Next Steps programs could be assessed with an OBA to show the programs' effectiveness in developing students and to reveal where the programs need improvement. Improving programs in career services and highlighting where students are reaching the outcomes set in the assessments could lead to a better understanding of how those programs affect a student's development through career services, and how that development contributes to the number of students placed after graduation.
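As a hedged illustration of the OBA cycle Bresciani describes (establish end results, gather data, determine whether students met them, then act on the findings), the loop might be sketched as follows; the outcome names, targets, and observed rates are all hypothetical:

```python
# Illustrative sketch of an outcomes-based assessment (OBA) cycle.
# All outcome names, target thresholds, and observed rates are invented.

# 1. Establish identifiable end results of student learning and development
#    (target = share of students expected to demonstrate the outcome).
targets = {
    "writes_targeted_resume": 0.80,
    "articulates_career_goal": 0.70,
}

# 2. (Ensuring students have opportunities to reach these outcomes happens
#    in programming, e.g. workshops and advising, not in code.)

# 3. Gather data: observed share of assessed students demonstrating each outcome.
observed = {
    "writes_targeted_resume": 0.85,
    "articulates_career_goal": 0.60,
}

# 4. Interpret the data to determine whether each end result was met, and
# 5. use the results to celebrate or to inform planning and resources.
for outcome, target in targets.items():
    met = observed[outcome] >= target
    status = "met -- celebrate" if met else "not met -- revisit plan"
    print(f"{outcome}: {observed[outcome]:.0%} vs target {target:.0%} ({status})")
```

Unlike a bare placement rate, this structure ties each data point back to a stated learning outcome, so unmet targets point directly at which program element to revise.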
Conclusion
"Although the purposes of higher education may be debated, the agreed upon expectation is that students will be able to graduate with a set of knowledge and skills" (Bresciani, 2011, p. 322). Increased scrutiny of higher education today revives the ongoing debate over what
students should accomplish during their time at an institution and how that prepares them for life after graduation. My shadow experience showed me that career services is about more than making sure a student gets a job after graduation. The University of Chicago has many programs in place to help develop students professionally and personally, and the Career Exploration department provides students with internal and external opportunities to explore their career interests. The University of Chicago places great emphasis on career services, with a large team dedicated to ensuring student success.
My experience shadowing Melissa Smith in the Career Exploration department sparked an interest in how universities are assessed by outside organizations and how they assess themselves. The University of Chicago responded to its accreditation agency by providing the most recent placement rates of students after graduation. Researching how placement rates are calculated, best practices from other institutions, and current efforts to change the way recent graduates are assessed opened my eyes to the growing importance of assessment in higher education. In a field like student affairs, where many question the function of its departments and how they add to a student's learning experience at a university, showing student learning outcomes through assessment helps to highlight what it is that we do in higher education and how that work produces results (Bresciani, 2011). The Student Personnel Point of View (1949) describes education as more than just the material learned in the classroom: student affairs professionals are to support a student's development physically, spiritually, emotionally, and intellectually. My personal philosophy of the student affairs profession includes becoming a key component of a student's support network as well. Career services plays a very important role in this development, and it is through assessment that
institutions can show the growing and ongoing benefit of career services and of student affairs professionals as a whole.

References
American Council on Education (ACE). (1949). The student personnel point of view. Washington, DC: Author.
ACPA & NASPA. (2010, July 24). Professional competency areas for student affairs
practitioners.
Bresciani, G. (2011). Assessment and evaluation. In Schuh, J., Jones, S., Harper, S., & Associates (Eds.), Student services: A handbook for the profession (5th ed.) (pp. 321-334). San Francisco, CA: Jossey-Bass.
Cragg, K., Henderson, A., Fitzgerald, B., & Griffith, R. (2013). Administrative aspects of accreditation and assessment. In Schloss, P. J., & Cragg, K. M. (Eds.), Organization and administration in higher education (pp. 80-100). New York, NY: Routledge.
Kiley, K. (2013, February 14). White House's new scorecard oversimplifies institutions, liberal arts advocates say. Inside Higher Ed. Retrieved December 1, 2014, from https://www.insidehighered.com/news/2013/02/14/white-houses-new-scorecard-oversimplifies-institutions-liberal-arts-advocates-say
Kuh, G. (2011). Student success. In Schuh, J., Jones, S., Harper, S., & Associates (Eds.), Student
services: A handbook for the profession (5th ed.) (pp. 257-270). San Francisco, CA:
Jossey-Bass.
Grasgreen, A. (2013, December 17). Gallup-Purdue study will measure graduates' quality of life outcomes. Inside Higher Ed. Retrieved December 1, 2014, from https://m.insidehighered.com/news/2013/12/17/gallup-purdue-study-will-measure-graduates-quality-life-outcomes
Palomba, C. A., & Banta, T. W. (1999). Assessment essentials: Planning, implementing, and improving assessment in higher education. San Francisco, CA: Jossey-Bass.

Ray, J., & Kafka, S. (2014, May 6). Life in college matters for life after college. Gallup. Retrieved December 8, 2014, from http://www.gallup.com/poll/168848/life-college-matters-life-college.aspx
Rogers, M. (2013, December 17). Job placement confusion. Inside Higher Ed. Retrieved December 1, 2014, from https://www.insidehighered.com/news/2013/12/17/colleges-report-job-outcomes-results-are-limited-value
Sandoval, T. (2012, July 16). In job-placement rate, fuzzy data. Chronicle of Higher Education. Retrieved December 1, 2014, from http://chronicle.com/article/Job-Placement-Statistics-Are/132881/
Severy, L. (2011). Career services. In Zhang, N., & Associates (Eds.), Rentz's student affairs practice in higher education (4th ed.) (pp. 119-149). Springfield, IL: Charles C. Thomas.
Wellman, J., & Ehrlich, T. (2003, August). Re-examining the sacrosanct credit hour. Chronicle of Higher Education, 50(5), B16.
