
VNU Journal of Science: Education Research, Vol. 36, No. 1 (2020) 1-12

Original Article

Assessing Institutional Learning Outcomes: Implications for Vietnam Higher Education Institutions

Pham Thi Tuyet Nhung*
College of Foreign Languages - Hue University, 57 Nguyen Khoa Chiem, Hue City, Vietnam

Received 22 May 2019; Revised 07 June 2019; Accepted 08 July 2019

* Corresponding author. E-mail address: nhungptt48@gmail.com
https://doi.org/10.25073/2588-1159/vnuer.4265

Abstract: Institutional learning outcomes indicate the knowledge and skills that all students from a specific university, regardless of discipline, are expected to demonstrate. There is some research on assessing learning outcomes at the program level in Vietnam, but none on learning outcomes at the institutional level. This case study shares the experience of a U.S. comprehensive university in conducting assessment of institutional learning outcomes. The paper discusses the achievements, such as a successful two-year institutional assessment implementation, effective use of a national Valid Assessment of Learning in Undergraduate Education (VALUE) rubric to assess students' performance, the use of technology in data analysis, and best practices for communicating assessment results to multiple stakeholders to facilitate leadership decision making; the challenges, such as technology, faculty engagement, the participation rate, and validity and reliability; and improvement plans. The researcher also makes recommendations for Vietnamese HEIs to improve internal quality assurance for both quality improvement and accountability purposes.

Keywords: Institutional learning outcomes, achievements, challenges, quality improvement, accountability.

1. Introduction

Over the past several years, various individuals, organizations, and legislators have continued to express concerns about the quality of higher education. Those concerns have triggered legislation and requirements at the federal and state levels and by regional accreditors to assess and report on student learning (Bassis, 2015 [1]; Jones, 2009 [2]; Nelson, 2014 [3]). The regional accrediting organizations identified and recognized by the Council for Higher Education Accreditation (CHEA) all include requirements related to assessing student learning outcomes for general education. The accreditors have requirements for articulating the outcomes as well as measuring and documenting student success ("Council for Higher Education Accreditation", n.d.) [4].

Assessment of general education has been going on for years. According to Penn (2011) [5], one of the first comprehensive assessments of general education took place in the late 1920s. Major initiatives in higher education assessment were undertaken from the mid-1980s to the early 1990s to assess general education, and universities are again seeing that demand for detailed, comprehensive assessment. With all the requirements, it is easy to lose focus of the reason for assessment and why universities collect data, enter it into databases, and generate reports: so that they can improve the learning and performance of students. Fletcher, Meyer, Anderson, Johnston, and Rees (2012) [6] stated that universities conduct assessment to provide information about student learning, student progress, teaching quality, and program and institutional accountability.

There are numerous ways of conducting effective general education assessment. The Association of American Colleges & Universities (AAC&U) Valid Assessment of Learning in Undergraduate Education (VALUE) project and the resulting rubrics have been implemented by many universities. The VALUE rubrics were developed as part of AAC&U's Liberal Education and America's Promise (LEAP) initiative ("About LEAP," n.d.) [7]. One advantage of implementing the VALUE rubrics is that data and studies such as the Multi-State Collaborative to Advance Quality Student Learning (MSC) and the Great Lakes College Association Project to Advance Learning, to name a few, report their findings and share the lessons learned through their implementation. A recent report, On Solid Ground (McConnell & Rhodes, 2017) [8], provides detailed information from a large number of institutions. The VALUE rubrics were piloted and are used by a diverse range of post-secondary institutions, including community colleges, regional comprehensives, and R1 institutions. These data sets allow a university to benchmark its student performance against that of the collaborating universities. Brown, McGreevy, and Berigan (2018) [9] point out that higher education institutions have typically functioned in an autonomous and siloed culture when implementing changes. Various programs and offices have operated independently of one another. The concept of holistic, institution-wide assessment can be somewhat of a challenge due to past practices and that autonomous nature. A cohesive framework and cooperation across campus are critical for effective implementation of general education assessment.

Similarly, accreditation is also a major driver for Vietnamese higher education institutions (HEIs) to provide evidence of student learning. The new standards of higher education accreditation at both the institution and program levels focus on assessment of student learning following the Plan-Do-Check-Act (PDCA) cycle to make quality improvements (MOET, 2017; MOET, 2016) [10, 11]. Therefore, there is a need to create an internal quality assurance (IQA) system to meet such requirements from external stakeholders. Still, IQA remains a challenge for many Vietnamese HEIs (Nguyen, 2018) [12] and quality assurance offices (Pham, 2019) [13]. One study from Hue University shares the experience of implementing IQA following the ASEAN University Network - Quality Assurance (AUN-QA) framework to assess learning outcomes at the program level (Nguyen and Nguyen, 2017) [14], but no research has shared experience of assessing learning outcomes at the institutional level in the Vietnamese context. This case study shares the experience of a comprehensive university in the United States in conducting assessment of student learning at the institutional level, to support Vietnamese HEIs in improving the quality of student learning and providing accountable evidence for external stakeholders such as accreditors.
2. Method

This research used a case study as its major method to provide a rich description of the phenomenon (Yin, 1994) [15]. A case can be a person, a small group, a program, or an institution. As stated by Merriam (1998) [16], a case study provides an in-depth description of a single instance, phenomenon, or social unit. Creswell (2014) [17] also stated that a case has a clear boundary and can provide an in-depth comprehension of the case. The first step in conducting a case study is to define the case.

The university assessment process explained here is that of a regional comprehensive university in the Midwest of the United States. Its Carnegie classification is that of a comprehensive university offering both undergraduate and graduate programs. The enrollment of the university is just over 12,000 undergraduate and graduate students. The general education program has always had the mission of providing students with foundational knowledge and skills, primarily in the liberal arts and sciences, that encompasses all baccalaureate programs. A frequent observation made by faculty and students alike was that the previous general education program did not appear to be a program at all but rather a collection of unconnected courses. The degree programs and the general education program were operating in that siloed type of environment and not functioning cohesively, particularly with regard to assessment. For those reasons, the university sought a framework for a holistic assessment approach which would allow it to assess the impact of its general education.

Like many universities, the previous general education program focused on input, in the form of courses and their specific competencies, and not on an outcomes-related perspective (Bruce, 2018) [18]. The courses were selected strictly by their alignment with the selected general education topic areas. Under the current general education program, courses must show how they align with and will meet the specific outcomes of the university general education program. Programs on campus can submit courses to the faculty senate general education committee for consideration for inclusion in the general education program. As part of that submission, they must include information on how they will meet and assess the prescribed outcomes. Courses are also reviewed by the general education committee for recertification, to ensure they are following the assessment plan and that student artifacts align with the desired outcomes.

This research tried to answer the following questions:
1. What is the assessment process for institutional learning outcomes?
2. What challenges and improvements has the university had?
3. What key achievements has the university made?
4. What strategies does the university use to sustain the institutional learning outcomes system?

3. Findings

3.1. Assessment process of institutional learning outcomes

Assessment measures. In 2014, the university updated its general education curriculum to include areas of understanding comprising four key outcomes that include a total of ten competencies. To assess these competencies, the Valid Assessment of Learning in Undergraduate Education (VALUE) rubric (Rhodes, 2009) [19] was modified and applied across campus. This activity demonstrated the institution's commitment to ensuring that learning outcomes are achieved and that a degree reflects high quality, a goal of the Multi-State Collaborative (MSC). This effort also responded to the widespread objective of using standardized testing in higher education. Most importantly, the assessment of student learning using a modified VALUE rubric provided the opportunity for faculty to have conversations about improving student learning outcomes (Wehlburg, Carnahan & Rhodes, 2017) [20].

Assessment process. The university assessment system follows the six phases of the assessment cycle: (1) plan and identify outcomes, (2) collect data, (3) analyze data, (4) share results, (5) identify and implement changes, and (6) assess the impact of change (Kuh, Ikenberry, Jankowski, Cain, Ewell, Hutchings and Kinzie, 2015) [21]. The revised general education program serves student needs and the public interest by ensuring students have strong foundational skills, providing a broad, enriched academic experience that both complements and supports their study within specialized disciplines. To capture student learning of the ten general education competencies, the university has used three major assessment measures: the General Education Assessment (GEA) Exam, the modified VALUE rubrics, and the National Survey of Student Engagement (NSSE). The GEA and the modified VALUE rubrics serve as direct assessment measures of student learning outcomes, and the NSSE serves as an indirect assessment measure. This paper discusses only the newly implemented direct measure, the modified VALUE rubric.
In an effort to determine whether the teaching of the GE courses met the requirements of the new general education competencies, the university started working on an assessment plan and timeline for data collection. In 2015-2016, the university conducted a series of planning meetings with faculty teaching in the general education program to collectively define the process for data collection. In the Fall 2016 semester, the institution provided face-to-face as well as online training for all instructors on how to use the modified rubrics. It was determined that pilot data would be collected in the Spring 2017 semester. Student artifacts for five competencies would be collected: written communication, oral communication, quantitative literacy, critical/creative thinking, and managing information. As this was the first time the university had conducted an institution-wide general education assessment, instructors of all courses aligned to a specific competency were asked to voluntarily provide students' artifacts for institutional assessment. Data for four competencies (Oral Communication, Quantitative Literacy, Creative/Critical Thinking, and Managing Information) were gathered in an Excel template, and the Written Communication competency was collected through an assessment management software (AMS). The purpose of this pilot was to ensure the assessment process was appropriate before collecting artifacts for all five competencies from all courses.

Two-Year Timeline. The data collection pilot was successful; therefore, from 2017-2018 the university implemented a two-year assessment plan for general education assessment (Table 1), using the course-embedded assessment (CBA) function in the AMS. Data were collected during the Fall semester, and in the Spring semester the results and opportunities for teaching and learning improvement were discussed and documented.

Table 1. Two-year general education assessment timeline, 2017-2019

Assessment and Evaluation Activity            | 2017-2018                          | 2018-2019
Collect and evaluate data, including processes | Competencies 1, 2, 3 & 5 (Fall)    | Competency 4 (Fall)
Deliver report findings to constituents        | x (Spring)                         | x (Spring)
Take actions where necessary                   | x (Spring)                         | x (Spring)
Review the competency if necessary             | x (Spring)                         | x (Spring)

Human Resources. To support the assessment of the general education program, additional resources were needed and had to be devoted to the process. The structure included administrative support and faculty input. The Vice Provost of Academic Programs and Services oversees the assessment activities. The university assessment coordinator is in charge of implementing the assessment process. The General Education Coordinator, a full-time faculty member with a course release, supports communication of the purpose of assessment and of the assessment process, and facilitates the course-embedded assessment (CBA) training with the university assessment coordinator to streamline the process and to increase artifact submission in the AMS. Both the assessment coordinator and the general education coordinator are non-voting members of the faculty senate general education committee.

Data Collection. Aligning the several components of the general education courses, the assessment process, and data collection is very intentional. The goal is to ensure courses maintain alignment with the competencies and that faculty can collect and report data with a minimal amount of additional workload. Any GE course going through the recertification process needs to demonstrate that the course learning outcomes and course assignments align with a specific GE competency. This ensures courses continue to align with the general education competencies and goals. All courses aligned to a skill-based competency are required to provide students' artifacts from one assignment in the class. Faculty choose an assignment that meets all the dimensions of the modified VALUE rubric for university data collection. The intent is for faculty to utilize a normal or typical assignment that they are currently implementing in their course and to use that for the institutional assessment. This authentic assessment does not create much additional workload for faculty, as opposed to using an assignment created just for institutional assessment as a component of student learning in their course. Since assessment is embedded within all sections of the courses and is evaluated by the faculty member teaching each section, the assessment process has been streamlined.

Advantages of Technology in Data Collection. In addition to the faculty-centered and authentic assessment process, the data collection and data analysis in an AMS also streamlined the assessment process. The first advantage was that it integrated with the existing learning management system (LMS) and enabled a relatively automated transfer of information into the AMS. Faculty therefore grade the students' artifacts in the LMS they are familiar with; as most faculty were familiar with the LMS, this helped to encourage their participation. The second advantage of technology is the protection of confidential information. All data were loaded directly into the AMS, and only people with specific privileges were able to access the data. The third advantage of technology was efficiency (e.g., time savings) in the data analysis, as the assessment software could run various reports. Consequently, the university could collect a large sample of students' artifacts across multiple competencies in a year. This comprehensive data collection enabled the university to capture a more accurate and complete picture of student learning and to facilitate actions for improvement when looking at the assessment results in the later steps. The fourth advantage of using technology for data collection was to provide both faculty and the institution with individualized assessment reports based on their needs.

Assessment Results. In AY 2017-2018, faculty collected students' artifacts from 230 sections aligned with Competency 1 (Written Communication), Competency 2 (Oral Communication), Competency 3 (Quantitative Literacy) and Competency 5 (Managing Information). 57% (2,858) of the artifacts had been assessed by the instructors and loaded into the AMS. For the remaining 43%, in some cases faculty did not collect the data, and in others improvements in the assignments are needed before faculty can independently score the artifacts. The goal is to have 100% of the artifacts scored. In the future, to continue to ensure the sustainability of the assessment process, the university will likely implement sampling of larger sections. Of the four competencies, Competency 3 received the highest response rate (76%) and Competency 2 received the lowest response rate (42%).

Table 2. Modified VALUE rubric response rate, 2017-2018

                | Written Communication | Oral Communication | Quantitative Literacy | Managing Information | Total
Total Students  | 1610                  | 828                | 1218                  | 1330                 | 4986
Total Responses | 752                   | 350                | 924                   | 832                  | 2858
Response rate   | 47%                   | 42%                | 76%                   | 63%                  | 57%
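The percentages in Table 2 are simple response-rate ratios: scored artifacts divided by enrolled students, per competency and overall. A minimal Python sketch of that calculation follows, using the counts copied from Table 2; the dictionary layout and variable names are illustrative only and are not the actual schema of the university's AMS.

```python
# Recompute the Table 2 response rates: scored artifacts divided by enrolled students.
# Counts are copied from Table 2; the names used here are illustrative, not the AMS schema.
counts = {
    "Written Communication": {"students": 1610, "responses": 752},
    "Oral Communication":    {"students": 828,  "responses": 350},
    "Quantitative Literacy": {"students": 1218, "responses": 924},
    "Managing Information":  {"students": 1330, "responses": 832},
}

for name, c in counts.items():
    print(f"{name}: {c['responses'] / c['students']:.0%}")   # 47%, 42%, 76%, 63%

total_students = sum(c["students"] for c in counts.values())
total_responses = sum(c["responses"] for c in counts.values())
print(f"Total: {total_responses}/{total_students} = {total_responses / total_students:.0%}")  # 2858/4986 = 57%
```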
On average, 98% of freshmen met the requirement, scoring one or above on the modified VALUE rubric. Of the four competencies, Oral Communication and Quantitative Literacy had higher average scores (2.4).

Figure 1. Assessment results of competencies (rubric rating distributions, reproduced here as a table):

Rating   | Written Communication (N=534) | Oral Communication (N=297) | Quantitative Literacy (N=603) | Managing Information (N=494)
Rating 0 | 1%                            | 2%                         | 3%                            | 1%
Rating 1 | 47%                           | 13%                        | 14%                           | 22%
Rating 2 | 36%                           | 43%                        | 36%                           | 53%
Rating 3 | 11%                           | 21%                        | 37%                           | 15%
Rating 4 | 5%                            | 19%                        | 9%                            | 10%
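The benchmark figure and the competency averages quoted above can be derived directly from the Figure 1 distributions. A minimal Python sketch of that derivation follows; because the chart percentages are rounded, the computed means only approximate the reported averages, and all names used are illustrative only.

```python
# Derive the "scored 1 or above" benchmark rate and the mean rubric rating for each
# competency from the Figure 1 distributions. Shares are copied from Figure 1; rounding
# in the chart means the computed means only approximate the reported averages.
distributions = {
    "Written Communication": {0: 0.01, 1: 0.47, 2: 0.36, 3: 0.11, 4: 0.05},
    "Oral Communication":    {0: 0.02, 1: 0.13, 2: 0.43, 3: 0.21, 4: 0.19},
    "Quantitative Literacy": {0: 0.03, 1: 0.14, 2: 0.36, 3: 0.37, 4: 0.09},
    "Managing Information":  {0: 0.01, 1: 0.22, 2: 0.53, 3: 0.15, 4: 0.10},
}

for name, dist in distributions.items():
    met_benchmark = sum(share for rating, share in dist.items() if rating >= 1)
    mean_rating = sum(rating * share for rating, share in dist.items())
    print(f"{name}: {met_benchmark:.0%} at rating 1 or above, mean rating {mean_rating:.2f}")
```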
In Spring 2018, the University Assessment Coordinator prepared the university GE Assessment report and shared it with several groups and committees across campus, including the Academic Council, department chairs, the General Education Committee, the Faculty Senate University Assessment Council (FSUAC), and the faculty group that had been involved in the data collection with the modified VALUE rubrics. The purpose of the meeting with the Academic Council was to provide them with the assessment results and discuss strategies to improve the next year's response rates using the modified VALUE rubrics. The discussion with the GE Committee was to facilitate their use of assessment results in the recertification process. In addition to aggregated assessment results for the whole university, the assessment coordinator also provided the assessment report by competency. The faculty meetings were set up by the Vice Provost, the university assessment coordinator, and the GE coordinator to share the results and ask for feedback about the assessment process. One of the key and critical components of the assessment process remains a challenge: documenting actions for improvement for each competency.

3.2. Challenges encountered and improvements

Challenges encountered. After two years of implementation, the university still has some challenges to overcome. The first challenge the university encountered is the technology. Although it provides the ability to collect and analyze a great deal of information, some faculty had issues in the implementation, such as being unable to create a link in the LMS, inappropriate data display, or issues with artifact submission by students. The second challenge is faculty interpretation of the modified VALUE rubrics. Although training on the modified VALUE rubrics was conducted before the data collection, some faculty still had a hard time determining and assigning scores from the rubric to their own assignment, especially when a freshman who scored one on the rubric still earned an A grade in the course. The third challenge is the participation rate across the institution. Although more than two thousand artifacts were collected, they accounted for only 57% of the population. Some faculty decided not to submit any artifacts from their course into the system. Some had challenges separating out the individual artifacts. The fourth challenge is the lack of infrastructure to engage faculty who are directly involved in the assessment process in discussing results of student learning effectively and identifying changes for quality improvement. Finally, the university assessment results relied on one artifact or one assignment; therefore, the reliability of the results was sometimes questioned, a barrier to making appropriate changes for improvement.

Improvements. In response to the challenges encountered, in AY 2018-2019 the university prioritized three solutions to facilitate closing the loop in the assessment process. Acknowledging the value of faculty coming together to discuss student learning and pedagogy in order to identify opportunities to better support teaching and learning in GE courses is critical. The first improvement is to create a time and place for faculty to engage in deep, meaningful conversations about student learning and effective teaching. To facilitate this strategy, the university established lead faculty for each competency. The major responsibilities of these faculty are to lead the discussion of the assessment results within their group and to document the feedback, recommendations to improve the assessment process, and possible actions for improvement. The university provides a template with the key components of the assessment cycle to facilitate the documentation of meeting minutes. The second priority is to improve the validity and reliability of student artifacts. The university is currently providing training and workshops on "assignment design" and a "norming" workshop series facilitated by the university assessment coordinator and external presenters. In the following semesters, lead GE faculty for each competency will facilitate these trainings for their own group annually. These lead faculty will serve as facilitators to promote the professional development opportunities and to coordinate faculty meetings to discuss and review actions taken in response to learning outcomes data. The third improvement the university is working on is the additional requirement of utilizing assessment data in GE recertification. Previously, the GE committee ensured that the course learning outcomes and course assignments aligned with GE competencies. The current practice is also to ensure that student performance meets the expectations of the course learning outcomes and the course assignment.

3.3. Key achievements

The first advantage of this assessment process is the consistent assessment process for all GE competencies, which benefits accreditation-related efforts. The goal is to create processes and strategies that make assessment practice and assessment results visible to all faculty. This was the first time the university conducted an institution-wide authentic assessment following a national authentic assessment instrument, the VALUE rubric. The intent is to capture the 21st-century skills that all graduates need to demonstrate by graduation. To facilitate the implementation, the university sets up GE assessment plans and a two-year timeline to collect data, provides multiple assessment-related trainings to faculty throughout the academic year, and utilizes a central AMS to store and analyze assessment data.

The second advantage of this process is the widespread faculty engagement in the assessment process, from assignment design to pedagogy, data collection, and discussion of assessment results. Two features of this process, the personnel structure and the technological tools, distribute the responsibility for assessment of student learning outcomes so that no one person is solely responsible for the assessment. Multiple coordinators at different levels (university, college, department, and competency) facilitate faculty engagement in meaningful discussion of assessment findings and regular conversations about teaching practices. Most importantly, faculty can experience assessment activities as opportunities for their own learning and professional growth when attending the annual trainings about teaching and learning improvement. At the same time, lead faculty serve as the leaders in their group to facilitate closing-the-loop discussions.

The third advantage of this assessment process is that it also allows individual faculty to evaluate their own practice. After attending meetings with the group to discuss assessment results within their competency, faculty are encouraged to run the CBA report, watch a video on the assessment website about strategies for interpreting assessment data, and then fill in the GE Assessment Self-reflection sheet (Appendix A). This is a meaningful process and allows faculty to determine the strengths and weaknesses of student learning in their own course, then decide what actions they can take for improvement. The goal is not to evaluate faculty assessment efforts but to assist them in using assessment results to evaluate their own practices. It is hoped that multiple minor changes systematically implemented over time can produce a substantive impact on teaching and learning (Stanny, Gonzalez and McGowan, 2015) [22].

3.4. Sustainable strategies

As short-term goals, the university has three plans to improve the assessment of the GE program. The first plan is to improve the alignment of student learning outcomes at different levels (university, GE, and academic programs) to facilitate skill-based assessment at the senior level. Senior-level data not only ensure students have had opportunities to improve, practice, and develop skills related to the competencies, but also allow the university to provide evidence of student growth over time. The University Assessment Committee will work with programs to ensure appropriate skills are embedded in their program learning outcomes. A pilot will be implemented in Spring 2019 in which faculty teaching capstone courses will use the modified VALUE rubric to assess student performance. Faculty can use one capstone assignment to assess multiple skills; they will decide which skills the capstone aligns with and select the appropriate rubric(s). The pilot of capstone assessment will support the university's plan to fully implement assessment across students' entire academic timeframe. The second plan is to improve the validity and reliability of assessment results in order to encourage more meaningful actions for improvement. The university will build an inter-rater reliability system in which a second faculty member assesses a sample of artifacts for each of the five competencies. Statistical power will be tested to obtain a representative and sufficiently powerful sample.
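One common way to quantify the double-scoring arrangement described above is an agreement statistic such as Cohen's kappa, computed over artifacts rated by both the original instructor and the second reader. The Python sketch below is a minimal illustration of that idea under stated assumptions: the paper only names inter-rater reliability as a goal, so the choice of kappa and the sample ratings are illustrative assumptions, not part of the university's system.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters scoring the same artifacts on the 0-4 rubric scale."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    # Chance agreement expected from each rater's marginal rating frequencies.
    expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in set(freq_a) | set(freq_b))
    return (observed - expected) / (1 - expected)

# Hypothetical ratings for ten sampled artifacts: original instructor vs. second reader.
instructor    = [2, 3, 1, 2, 4, 2, 1, 3, 2, 0]
second_reader = [2, 3, 2, 2, 4, 1, 1, 3, 2, 0]

agreement = sum(a == b for a, b in zip(instructor, second_reader)) / len(instructor)
print(f"Percent agreement: {agreement:.0%}")                             # 80%
print(f"Cohen's kappa: {cohens_kappa(instructor, second_reader):.2f}")   # 0.73
```

Percent agreement alone can overstate consistency because some agreement occurs by chance; kappa corrects for that, which is why the sketch reports both values.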
Finally, the university will consider forming a GE Assessment Committee to discuss and continue to improve the GE assessment process. Right now, the bulk of the GE assessment activities are still initiated and overseen at the academic administrative level. Transitioning the assessment functions to the GE committee, or forming a committee specifically addressing GE assessment, will transfer some of the ownership to faculty and help with the dissemination of information. This committee can also support inter-rater reliability as well as the documentation of discussions and recommendations for annual assessment reports.

To sustain the culture of continuous improvement, the university needs to maintain some long-term strategies. The first strategy is to provide continuous professional development opportunities for GE faculty, especially adjuncts. The university continues to have faculty who seek to determine whether the pedagogical changes they make in a course will produce improvement in student learning. Those faculty wish to pursue research and scholarship opportunities related to assessment based on those findings. These efforts can lead to the creation of an assessment network where faculty can design and develop a common course-based assignment for courses. The second strategy to build the culture of assessment is to hold an annual teaching and learning fair, poster sessions, workshops, or think tanks where faculty facilitate sessions on assessment results and their implications. The major goal of these events is to enhance faculty understanding of the assessment process, facilitate the use of data, evaluate the entire assessment cycle, and determine whether the assessment process leads to real changes in student learning. The final strategy is to engage students in the GE assessment process. Although the university administers the NSSE, it is not administered annually. To triangulate assessment data from both direct and indirect assessment measures, instructors can ask students to reflect in class and use that feedback as indirect authentic assessment evidence in addition to the student assignment artifacts (Hutchings, 2018) [23]. That feedback could include qualitative data, which the process has not yet formally included.

4. Conclusion

As discussed in the literature review, there is limited research about the implementation of IQA in the Vietnamese context and no specific research about the assessment of institutional learning outcomes. This case study provided a detailed step-by-step account, from choosing the assessment measure to analyzing the data, to facilitate implementation by other institutions. In addition, the sharing of the challenges this case encountered, the achievements it has made, and the strategies the university uses to sustain the IQA system can serve as good examples for other institutions. Vietnamese HEIs can implement this assessment process for quality improvement and accountability, especially as the current accreditation standards encourage institutions to provide evidence of the quality of student learning.

First, Vietnamese HEIs should look at the institutional mission to set up appropriate institutional learning outcomes (ILOs) for the first sixty credits in the first two years. A good practice for ILOs is to look at the list of 21st-century skills that AAC&U developed and choose the skills necessary for the Vietnamese context. Second, institutions should require courses in the first two-year curriculum to align with appropriate ILOs. To ensure the alignment, the course learning outcomes need to address the ILO language in the course objectives. Third, Vietnamese HEIs should choose reliable assessment measures to collect data. The VALUE rubric is an initiative in U.S. assessment practice to move away from standardized exams toward authentic assessment, using authentic students' artifacts to improve student learning. Some U.S. HEIs simply used the available assessment rubric to collect data. Some adopted the language of the rubric. Others used the VALUE rubric as a framework to build their own rubric. Vietnamese HEIs can choose the appropriate practice to implement. The researcher recommends using the available rubric and then making changes later if any issues arise.

Fourth, one of the keys to engaging faculty is to provide guidance and understanding of the entire assessment process, why it is being undertaken, and what the outcomes of the process will be used for. Vietnamese HEIs should provide professional development opportunities for faculty teaching the courses on how to design the assessment to align with the rubric, how to read, integrate, and use the rubric to score students' assignments, and how to provide consistent scoring across the courses. This is a very important step to avoid the challenges in validity and reliability in the data collection that this case study encountered. Figure 2 provides additional information on how Vietnamese HEIs can share the assessment results with multiple committees to close the assessment loop for quality improvement of student learning. Lastly, Vietnamese HEIs should have a meta-assessment, assessing the assessment process itself, such as peer review of assignment design to ensure the validity of the assignment, calibration to ensure the reliability of the student scores across the multiple courses, and asking for faculty perceptions about the assessment process.
These practices will help institutions figure out the strengths and weaknesses in the process in order to make improvements and, most importantly, provide evidence for institutions to allocate appropriate resources to address the weaknesses. The implementation in this case study aligns fully with the suggestions from eight case studies supported by UNESCO, namely that IQA should be based on national accreditation requirements and international best practice (Martin, 2017) [24]. This case study of ILO assessment demonstrated the four key components of PDCA required by Vietnamese national accreditation in higher education and the updated assessment initiatives from the U.S. Further research could examine how a Vietnamese university learns this process and implements it successfully in the Vietnamese context.

References

[1] M. Bassis, A primer on the transformation of higher education in America. http://www.learningoutcomeassessment.org/documents/BassisPrimer.pdf/, 2015 (accessed 1st April 2019).
[2] D.A. Jones, Higher education assessment - Who are we assessing, and for what purpose? https://www.aacu.org/publications research/periodicals/higher-education assessment%E2%80%94who-are-we-assessing-and-what-purpose/, 2009 (accessed 5th March 2019).
[3] C. Nelson, Assessing assessment. https://www.insidehighered.com/views/2014/11/24/essay-criticizes-state-assessment-movement-higher-education/, 2014 (accessed 4th April 2019).
[4] Council for Higher Education Accreditation (n.d.). https://www.chea.org/regional-accrediting-organizations/ (accessed 10th April 2019).
[5] J.D. Penn, The case for assessing complex general education student learning outcomes, New Directions for Institutional Research 149 (2011) 5-14. https://doi.org/10.1002/ir.376.
[6] R. Fletcher, L. Meyer, H. Anderson, P. Johnston, M. Rees, Faculty and students' conceptions of assessment in higher education, Higher Education 64 (1) (2012) 119-133. http://www.jstor.org/stable/41477923.
[7] About LEAP (n.d.). https://www.aacu.org/leap/ (accessed September 01, 2018).
[8] K.D. McConnell, T.L. Rhodes, On solid ground. https://www.aacu.org/OnSolidGroundVALUE/, 2017 (accessed 10th April 2019).
[9] S. Brown, J. McGreevy, N. Berigan, Evidence-informed improvement through collaborative professional integration, New Directions for Teaching and Learning 155 (2018) 55-64. https://doi.org/10.1002/tl.20303.
[10] MOET, Circular 12/2017/TT-BGDĐT promulgating regulations on accreditation for higher education institutions, Hanoi, Vietnam: The Author, 2017.
[11] MOET, Circular 03/2017/TT-BGDĐT promulgating regulations on accreditation for higher education programs, Hanoi, Vietnam: The Author, 2016.
[12] CEA-HCM, Vietnamese accreditation system: achievements, challenges and lessons learned from international accreditation models, Paper presented at the Conference about Vietnam Higher Education, 2018.
[13] Pham Thi Huong, Limited legitimacy among academics of centrally driven approaches to internal quality assurance in Vietnam, Journal of Higher Education Policy and Management 42 (2) (2019) 172-185. https://doi.org/10.1080/1360080X.2019.1565298.
[14] Nguyen Hong Giang, Nguyen Hong Son, Quality assurance procedure for training programs of Hue University in accordance with AUN-QA, VNU Journal of Science: Education Research 33 (1) (2017) 47-57.
[15] R.K. Yin, Case study research: Design and methods, Sage Publications, Thousand Oaks, CA, 1994.
[16] S. Merriam, Qualitative research and case study applications in education, Jossey-Bass Publications, San Francisco, CA, 1998.
[17] J.W. Creswell, Research design: Qualitative, quantitative, and mixed methods approaches (4th ed.), Sage Publications, Thousand Oaks, CA, 2014.
[18] R.T. Bruce, Assessment in action: Evidence-based discussions about teaching, learning, and curriculum, New Directions for Teaching and Learning 10 (2) (2018) 1-7. https://doi.org/10.1002/tl.20260.
[19] T. Rhodes, Assessing outcomes and improving achievement: Tips and tools for using the rubrics, Association of American Colleges and Universities, Washington, DC, 2009.
[20] C. Wehlburg, J. Carnahan, T. Rhodes, Multi-State Collaborative to advance quality student learning. https://www.aacu.org/sites/default/files/MSC_Demonstration_Year.pdf/, 2017 (accessed 10th April 2019).
[21] G.D. Kuh, S.O. Ikenberry, N.A. Jankowski, T.R. Cain, P.T. Ewell, P. Hutchings, J. Kinzie, Using evidence of student learning to improve higher education, Jossey-Bass, San Francisco, CA, 2015.
[22] C. Stanny, M. Gonzalez, B. McGowan, Assessing the culture of teaching and learning through a syllabus review, Assessment & Evaluation in Higher Education 40 (7) (2015) 898-913.
[23] P. Hutchings, Helping students develop habits of reflection: What we can learn from the NILOA Assignment Library, University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA), Urbana, IL, 2018.
[24] M. Martin, Internal quality assurance: Enhancing higher education quality and graduate employability, UNESCO Publishing, 2017.

Figure 2. Institutional learning outcomes assessment process.

Appendix A

General Education Assessment Self-reflection

Competency:
Note: Please do not provide individual information in the self-reflection.

How does the student learning in your course, based on the CBA data, compare with the institutional assessment results? (Benchmark)
What did you learn from the individual course assessment result? Did you find any common patterns occurring in your courses?
What strategies do you implement in class in order to maintain and support student learning?
If possible, what new strategies, materials, or pedagogy will you implement in this section to better support student learning?

Thank you for your feedback!