IRPA

The Office of Institutional Research, Planning & Assessment (IRPA) supports many kinds of initiatives throughout the university and beyond:  

  • Reports information to the Federal government, the State, numerous professional organizations and accrediting bodies, publications and research institutes. 
  • Answers cyclical and ad hoc requests from Lynn University colleagues and provides coaching, support and additional labor on everything from surveys to complex statistical analyses.  
  • Supports strategic planning and assessment initiatives for units throughout the organization.

Mission Statement

Lynn University’s Office of Institutional Research, Planning & Assessment will serve as an expert resource for the organization and its constituent departments, providing timely, courteous and professional support for research design, data collection, analysis and reporting to both internal and external stakeholders.  IRPA will practice responsible and sustainable citizenship as a member of its community, in concert with the university's mission and in support of the strategic goals expressed in the Lynn 2020 strategic plan.

Values
  • Professionalism
  • Accuracy
  • Continuous Quality Improvement
  • Timeliness 
  • Sustainability

Ethics Statement

Lynn University’s Office of Institutional Research, Planning & Assessment endorses and observes the statement of professional ethics of the Association for Institutional Research (AIR).  IRPA will practice a level of care and compliance that may serve as an exemplar of ethical research policy and practices.

Realizing the Lynn 2020 Mission & Vision

Unit missions should build on the overall mission of the larger organization and translate its mandates into the unit's local context.  Lynn University representatives and consultants developed the overall mission statement as part of the Lynn 2020 Strategic Plan for the entire university: 

“Our mission is to provide the education, support and environment that enable individual students to realize their full potential and to prepare for success in the world.”

The need for effective, timely and accurate data services for institutional decision makers has driven the development of the current structure, processes, systems, policies and personnel of the Office of Institutional Research, Planning & Assessment. IRPA at Lynn University aspires to deliver consistently excellent performance in support of the larger mission and vision of the organization.

IRPA delivers information services to stakeholders within and beyond the university so they can make effective, data-informed decisions in support of the innovative, international and individualized higher education experience for which Lynn University is known. 

The Office of Institutional Research, Planning & Assessment supports many kinds of initiatives throughout the university and beyond.  Among the sustaining mandates for this unit are:  

  • Coordinate the efficiency, speed and accuracy of the routine collection, analysis and dissemination of institutional data.
  • Conduct sophisticated descriptive and inferential analysis of internal and external data.
  • Expand the collective organizational intelligence of the university about both internal and external data contexts in support of informed decision making.
  • Enhance the adaptive functions of an already nimble institution in a changing higher education environment.
  • Provide assessment and support for teaching and learning.
  • Describe institutional effectiveness and productivity for multiple units of the organization to multiple audiences.
  • Support student recruitment, enrollment management, transfer modeling and student retention efforts.
  • Assist with external accreditations (SACS, DoE, Conservatory) and internal program reviews.
  • Gather, analyze, compile and submit regular external reports (Federal agencies, national organizations, national educational and marketing surveys and Web sites, professional associations, and so on). 
  • Provide similar services to the above for ad hoc data requests and project support. 
  • Implement peer comparisons for benchmarking and identification of best practices.
  • Facilitate contemporary records management policies and processes to maintain the integrity of data collection and storage.
  • Maintain professional development for currency of theory and practice.

Goals

After a process of internal and external consultation, IRPA has elected to focus on the following three short-term goals for its development and assessment processes:

  1. Optimize workflow priorities and minimize delivery delays (Operational).
  2. Become identified as a premier department and first-choice support unit for IR, planning and assessment (Reputational).
  3. Provide research and planning support for a successful SACS re-accreditation and coach and facilitate continuous quality improvement throughout the university (Operational).

Objectives

If goals are the destinations of choice for the operational unit, objectives are the roadmap for how the unit intends to get there.  IRPA has collectively identified a set of objectives that will yield measurable outcomes for assessing how well it is performing toward the achievement of its stated mission and goals:

  1. Utilize the online Data & Project Request form to track individual performance and work processes and identify targets for improvement  [Goal 1, Goal 2, Goal 3].
  2. Build a web page presence with links to public information sources and answers to frequently asked questions [Goal 1, Goal 2, Goal 3].
  3. Optimize course evaluation processes and decrease delivery time on results by implementing review suggestions from Data Governance Committee and setting measurable performance targets  [Goal 1, Goal 2, Goal 3].
  4. Submit timely and complete reports to all Federal, State and external associates  [Goal 1, Goal 3].
  5. Hold periodic IRPA department meetings and planning sessions  [Goal 1, Goal 3].
  6. Engage in fiscally feasible professional development activities such as webinars and CD-based training to maintain skill currency  [Goal 1, Goal 2, Goal 3].
  7. Participate as funds allow in national, regional and local professional forums and associations to increase public profile  [Goal 1, Goal 2, Goal 3].
  8. Document standard operating procedures and policies for IRPA  [Goal 1, Goal 2, Goal 3].
  9. Develop a professional Advisory Board to review and evaluate unit plans and progress annually [Goal 1, Goal 2, Goal 3].
  10. Locate, identify, gather, analyze and present data in support of ongoing Institutional Effectiveness for SACS reaffirmation of accreditation  [Goal 2, Goal 3].
  11. Develop and implement templates and time cycles for tracking key institutional metrics such as enrollments, retention, graduation, transfers, returning students and others as negotiated with decision makers  [Goal 2, Goal 3].
  12. Serve as stewards in the community via targeted participation in service and committee work [Goal 2, Goal 3].

Measures & Evidence

IRPA prefers to have direct evidence of its progress on each objective, which may then be supported and augmented by indirect evidence, such as emails from clients or satisfaction survey results.  IRPA has at least one direct measure below for every objective.  An exhibit may serve more than one objective, as noted.

  1. Copy of implemented FootPrints Data Request Form and associated database and tracking materials {Objective 1}.
  2. Copy of posted web page (or drafted materials if not yet posted) and FAQ page {Objective 2}.
  3. Spreadsheet of historic throughput data by term, process detail SOP for evaluation processing, and other materials documenting optimization analysis and implementation {Objective 3}.
  4. Copies of project submission emails and report acknowledgements, fulfillment tracking report from FootPrints {Objective 4}.
  5. Department meeting agendas and minutes, copy of strategic planning materials {Objective 5}.
  6. List of activities and attendees, agendas {Objective 6}.
  7. Emails, agendas, and other materials showing professional-level participation {Objective 7}.
  8. SOP Manual (or draft if still under revision) {Objective 8}.
  9. Membership list of IRPA Advisory Board and annual review schedule {Objective 9}.
  10. Emails and reports associated with the QEP and Institutional Effectiveness Report process, agendas from SACS workshops and trainings {Objective 10}.
  11. Copy of templates, calendar of delivery dates and list of recipients {Objective 11}.
  12. Annual master list of service and committee work {Objective 12}.

This collection of unit performance data will feed the next several steps in the assessment cycle, drive the development of improvements, and suggest additional measures for feedback and tracking.  IRPA will evaluate these results to see which of its goals may need adjustment in a future cycle.   

Assessment & Planning

Robust and systematic cycles of assessment and planning need feedback and monitoring loops built into all process designs.  Through ongoing self-assessment and evaluation of its strengths and weaknesses, IRPA will be able to tell whether, and to what degree, it serves its constituencies well. IRPA will use environmental scanning and peer comparisons to identify external changes that may affect how it optimizes its operations.

Analysis

Direct and indirect measurement data will be collected, compiled, tabulated and analyzed.  Patterns will be identified, and the results used to identify trends, set benchmarks for future cycles, mark areas for improvement, and indicate future resource development needs.

When possible, it is best to combine direct measures of performance with indirect measures, such as customer satisfaction inventories.  IRPA will begin its data collection efforts with direct evidence of its productivity, then expand in its next stage of development to measuring reputational variables once a stable set of policies and procedures is in place.

Interventions

Based on the results of these analyses, targeted interventions will be designed to improve the quality of systems, products, or processes.  This is where IRPA begins to “close the loop” of assessment and where the hard work of evaluation pays off in the greater efficacy of its departmental efforts. 

IRPA has already begun this process in a semi-formal fashion.  For example, in the 2008-2009 academic year, a preliminary examination of patterns of work task demand and fulfillment led to the implementation of the FootPrints job tracking system, which lets IRPA track how many jobs it completes and how quickly it fulfills them.

IRPA also developed new guidelines to measure fulfillment performance by task type.  These guidelines take effect in the Spring 2010 UGD/Spring II 2010 iteration of the SIR II course evaluation process as a pilot run for the next fiscal year beginning July 1, 2010:

  1. Major Projects, such as mandated Federal reporting and national surveys, by deadline.
  2. SIR II Surveys, no later than 4 weeks after the end of term, unless expedited review is specifically requested by deans or the VPAA. DOL evaluations, no later than 3 weeks after the SIR II deadline.
  3. Other ad hoc requests, as possible depending on scheduled workload.

Tracking & Monitoring (Feedback) 

Improvements should not be sent off into the world and never checked on again.  IRPA will seek opportunities to build embedded assessments, feedback, and monitoring of the successes and failures of its attempted improvements into future cycles so that it can continue to refine its development.  When assessment and evaluation efforts are fully integrated into the strategic planning process, the reward is a seamless system of data in support of quality.

Data Requests

Please contact the Director of Institutional Research, Planning & Assessment.