Jisc case studies wiki Case studies / Durham University - Business Intelligence
Durham University - Business Intelligence

Enabling Benchmarking Excellence

 

Case study written October 2012.

 


Background

 

Across the HE sector, significant moves are being made to incorporate benchmarking techniques into institutional planning and performance monitoring activity.

 

One of the greatest barriers to this activity is the structural framework of nationally published data sets, which are organised in ways that inhibit detailed benchmarking below the level of the institution as a whole.

 

Project Scope

 

The Enabling Benchmarking Excellence project proposes to begin work to overcome this barrier, by gathering a set of metadata based on institutional organisational structures. Institutions organise themselves according to a variety of functional models and this is often a significant issue when attempting detailed analysis of data.

 

This metadata will be applied to HESA’s new academic cost centres – ACCs - (as determined by the current HESA cost centre review) and the way in which these map to academic units.  This will facilitate a relationship map that will allow data to be disaggregated and compared in a meaningful and robust way.  Initially we will use a select range of partner institutions chosen to cover a wide range of organisational structures.  While this project aims to deliver a proof of concept, the eventual aim is to make comparative analysis far more flexible and useful to all stakeholders within the HE community.  The outcomes will therefore be taken forward by HESA as part of its benchmarking provision across all HEIs in the sector.

 

Objectives

 

Working with HESA we aim to:

 

  • Prove the concept of a model template which can be used by HESA in providing benchmarking facilities to the HE sector.
  • Identify key metrics which make use of this model to enable more meaningful inter-institutional benchmarking at different levels within organisational structures.
  • Provide guidance to institutions in mapping their academic units to cost centres and their organisational structures.

 

The Business Case

 

The JISC InfoNet BI InfoKit states that a business case should include a set of predicted benefits, together with a baseline of present costs and an assessment of how these will be affected. The starting point, therefore, is to decide what the business intelligence system will do.

 

Our project aims to improve benchmarking in HEIs through the collection of a set of metadata, which will provide greater context to the information that is currently available from HESA collections in HEIDI data sets. This information will be used to enhance benchmarking practices and inform decision making. As stated by our partners at the University of Liverpool,

 

‘We think that the data will allow us to understand the teaching structure of various institutions more clearly.’

 

Sunderland University have stated that they,

‘Would like to use this data when undertaking a mapping and comparisons in relation to sector norms.’

 

This is reiterated at Durham University, who say that the…

‘Extra level of detail means that we will be able to examine in much more depth how recruitment to our programmes and our student populations compare to those other institutions and make really well informed decisions.’

 

The cost of collecting this information was examined via interviews with those providing it: the key staff involved in HESA returns. The information collected was found to take between two hours and half a day’s work from a key member of staff involved in completing the HESA student return. No additional data needed to be collected, as the exercise involved manipulating existing data. Most of the time was taken up with updating the ACCs, which HESA does not expect to happen every year. Sunderland University have stated that they are working on a revised internal structure to reflect the new cost centres, which will make future returns easier.

 

The benefits of the collection were considered to be the enhancement of benchmarking capabilities by providing a greater granularity of data on institutional structures. Thus, the benefits of the available data, once released sector wide, are seen as considerably greater than the cost involved in providing the data.

 

This feeling is expressed by the University of Liverpool who state:

‘The compilation of the table is not too onerous because most of the underlying data is required for internal activity mapping in any case.  Therefore, even moderate usefulness would justify the process.’

 

The process of the collection of data was described by Durham University as:

‘Once this mapping (of the new cost centres) was completed, a database query was written, which extracted programme information with the field required for the JISC project. This was a simple query, which used the tables we already had and did not take too much manipulation.’

 

When asked about the time that was spent on the completion, compared to what would be gained from the results, Durham University said:

‘To say that the two hours that she’s spent will enable other institutions to understand our structures and how our data is compiled so much more easily and the transparency it will give across the sector, it’s definitely worth it.’ 

 

Outcomes

 

BI Maturity Model Level 1

 

At this level in the implementation of a BI system, information is fragmented. This means that institutions will have difficulty providing clarity of information. This will impact on their ability to complete the template for this study, and to provide high quality, accurate BI information. In addition to this, institutions at level 1 may also struggle to use the information that is provided from the collection. This is because the lack of accurate internal information will hamper the ability to benchmark against external data.

 

More generally, benchmarking practices, whether internal or external, will be hampered by the lack of reliable, cohesive information. The Enabling Benchmarking Excellence project will provide a process improvement for current benchmarking practices, but institutions at this early stage of implementation of a BI system will find it difficult to grasp the benefits offered.

 

BI Maturity Model Levels 2 – 4

 

With coherent information available at this stage of the implementation of a BI system, institutions will have the information available to complete the template for this study. Institutions will also be able to improve their benchmarking with access to the information collected. However, information gathering for the completion is likely to be time consuming due to the discrete sets of data that will need to be amalgamated in order to provide the most accurate representation of the institutional structures and cost centre allocation.

 

The availability of this data at a departmental level will allow institutions to conduct meaningful benchmarking activities and enable more informed decision making. However, due to the discrete nature of the data in institutions at level 2, this benchmarking will be conducted in a more isolated fashion, and different sets of data will need to be manually combined. The capacity to do this easily would require the institution to move forward in the maturity model and implement a BI system.

 

The stage of initiating the BI system is likely to be very time consuming, due to the issues of system implementation and development. This will affect the resources available for the completion of tasks such as this collection, as well as other HESA collections such as the staff, student and finance returns. This is particularly likely to affect the first collection of any BI after the implementation of a new system, making that collection quite time consuming. However, this drawback will be short-lived; once the system is running, the benefits will far outweigh it. At that point, information gathering will be far quicker and the institution will be better equipped to apply different data sets for benchmarking purposes.

 

The challenges associated with the adoption and application of a BI system will all take time for adjustment before the benefits of the new system can be fully realised. The period of transformation from the old to the new system will bring with it issues of change management, such as overcoming resistance to the new system. Staff will have to adapt to new work patterns, and likely to new systems of data usage as well as data definition and management. However, the ability of the BI system to reduce the time taken to provide HESA information can be stressed to encourage adoption. Additionally, the ability to apply the information from this collection to enhance decision making and enable effective, meaningful benchmarking will help to ensure senior management buy-in to the changes.

 

BI Maturity Model Levels 5 and 6

 

Institutions at level 5 and 6 of the Maturity Model will be able to more fully realise the potential of their BI system. This, in turn, will enhance their benchmarking capabilities as well as both the ease with which they can supply the information for this collection, and their ability to use the information provided. At these stages, the greater level of granularity in the data available will allow institutions to apply and embed the data in ways that are more meaningful to their own internal structures. This is because it makes it possible for these institutions to routinely undertake subject-specific benchmarking on a range of activities with confidence. This was not possible prior to this study.

 

These capabilities will allow institutions to apply their BI systems to model changes in the external environment and competitor performance. In addition, those institutions with a mature BI system will be able to complete the information required for this collection quickly, saving valuable employee time. Institutions at this stage of the BI maturity model may serve as exemplars, having worked through the implementation issues of all the previous maturity levels in order to develop a fully mature BI system and fully grasp the benefits offered by the information collected in this study.

 

Case Study Findings

 

We have conducted interviews with key members of staff at our partner institutions and have drawn together the information gathered into a case study report, which will be provided to HESA. This details the business use of the information that HESA provide, as well as current benchmarking practices and how HESA can help to improve benchmarking across the HE sector. The key findings are as follows.

 

HESA

 

The project found that HESA datasets are widely considered to be a useful source of information. HESA’s role as a point of commonality between Scottish and English Universities is seen as increasingly important. This is particularly due to the scale of recent changes to funding and student fees, which make the differences between the two regions more distinct. This is also likely to be relevant for Northern Ireland and Wales as all institutions in the UK are required to submit data to HESA; however, we did not have any Northern Irish or Welsh institutions as part of our study.

 

HESA has developed HEIDI – an on-line tool for accessing institution-specific data. HEIDI is a useful tool, but can be complex and time consuming to use owing to the way in which data is structured, primarily on the basis of cost centres and JACS codes. The lack of departmental level data from HEIDI is becoming increasingly important as institutions look to benchmark their activities more accurately.  In addition to benchmarking undertaken centrally by the university, many institutions encourage academic departments to conduct their own benchmarking and performance monitoring activities. However, this can be inhibited by the nature of HEIDI, which is not considered to be user friendly for those who do not use it on a regular basis, and who may not readily appreciate the limitations of data held on a cost centre or JACS code basis.

 

While the roll-out of this study sector-wide is viewed to be of huge benefit to current benchmarking practices, it has been questioned whether this needs to be a separate collection. Institutions anticipate that completion will involve a revision of the student record, and this could be completed by HESA, or alternatively, through a minor modification to the module record.

 

HEI Structures

 

All Universities interviewed as part of this study had a structure of faculty or schools, followed by department, with some slight variations of this form. Each unit was reasonably discrete, although there was some duplication of cost centres across these. There were no large-scale interdisciplinary faculties or colleges.

 

The structures that are being returned are represented below. These different structures also highlight the need for consistency to ensure that the tiers are comparable, in order for the information to be of business use for benchmarking.


 

Template Completion

 

Template completion has been viewed as fairly straightforward overall. There appears to be a mix of approaches to the completion. These need to be examined as consistency is vital to ensure the business use of this collection.

 

This is a copy of the template, with an illustrative example from Durham University.

 

 

The full set of completed templates from Durham and partner institutions can be found here.

 

Joint Programmes

 

Activity on joint programmes was generally divided 50/50 for ‘and’ programmes and according to the relevant proportion for a ‘with’ programme; for example, 30/70 or 60/40 etc. However, this will only create a difference if comparing at programme level. If the template is completed to a modular level then joint programmes will be taken account of in terms of the student’s modular choice.
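As an illustration only (the function and its interface are hypothetical, not part of the project's template), the apportioning rule described above might be sketched as:

```python
# Hypothetical sketch of the joint-programme apportioning rule described above.
# 'and' programmes split activity 50/50; 'with' programmes use the stated
# major share (e.g. 0.7 for a 70/30 split).
def split_fte(fte, programme_type, major_share=0.5):
    """Return (major_dept_fte, minor_dept_fte) for a joint programme."""
    share = 0.5 if programme_type == "and" else major_share
    return (fte * share, fte * (1 - share))

print(split_fte(1.0, "and"))        # an 'and' programme: equal split
print(split_fte(1.0, "with", 0.7))  # e.g. a 70/30 'with' programme
```

As the text notes, completing the template at modular level makes this split unnecessary, since each module registration already carries its own department.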

 

HESA Returns

 

There is a mix of approaches to the completion of HESA returns currently. Most institutions require the specialist department to complete the relevant return, such as finance for the finance return. The returns then may be checked by the central planning department, or, alternatively one of the specialist departments will take the lead and cross-check all the returns. Whatever the approach taken, the collection of this metadata reflecting universities’ departmental structure is likely to place greater prominence on the need for universities to ensure that separate HESA returns for students, finance and staff are co-ordinated. 

 

HEIDI

 

HEIDI is considered to be an excellent tool and contains a large amount of data. However, usability can be an issue, particularly for infrequent users of the programme, though regular users also encounter difficulties. For example, data cannot always be found intuitively, and often cannot be accessed and downloaded without time-consuming navigation through menu choices. Furthermore, HEIDI terminology could be made more accessible for staff in academic departments who may not work with HESA directly; this is particularly important given the increasingly devolved nature of benchmarking activities.

 

Benchmarking

 

Benchmarking activities are often carried out by a central services department. However, there appear to be increasing moves to devolve benchmarking processes to departmental level, and also a desire amongst senior corporate managers (e.g. Pro Vice-Chancellors, Heads of Faculty etc) to have the capability to interrogate data themselves. Liverpool and Birmingham Universities are both examples where changes have been made to provide academic departments with the support needed to both provide accurate and timely data to central services and to conduct their own benchmarking activities. This study is viewed as being particularly useful as it will enhance the ability of academic departments to compare specifically against cognate institutions and departments. HEIDI is often a primary source of benchmarking information, but institutions also often use league tables, the national student survey and Unistats for benchmarking activities.

 

New ACCs

 

The new ACCs are viewed favourably for their addition to the humanities and social sciences. There is a view that more can be done, particularly to improve the ability to benchmark against professional services, such as information, computing and library offerings.

 

Partner Institution Example

 

To provide an example of our partner institution case studies, below is the case study from the University of Liverpool. We have chosen the University of Liverpool as the example due to the timing of our study, which coincided with Liverpool’s adoption and implementation of a new BI system. This meant that many of the issues associated with the transition between the different stages of the BI Maturity Model are highlighted in this case study, and are issues that other institutions are currently facing or can expect to face in the future. Thus, Liverpool provides a good representation of the sector. However, for more detail, particularly of specialist institutions, please see our full case study report, which can be found here.

 

 

Spotlight: The University of Liverpool

 

Key Thematic Findings

  

Liverpool has recently undergone a major university restructure, halving its faculties from six to three. Also of significance is that one of these faculties, the Faculty of Health and Life Sciences, has undergone a fundamental internal restructure, separating into institutes for learning and teaching and institutes for research. From a planning perspective there are four planning units, but only three academic departments.

 

For each faculty there is an Executive Pro-Vice-Chancellor as well as a faculty manager, who is a professional services member of staff. Thus, each faculty has an academic structure as well as a professional services structure. There is a head of school, supported by a school manager. The aim is to work in collaboration, rather than having everything centrally driven. This member of staff (the school manager) understands the faculty business and makes it easier for other departments, such as central services, to get departmental information, as well as for the communication of change from central services, such as the new ACCs and this new collection.

 

Template Completion

  

Liverpool completed the template by the following method:

 

‘This was done by student activity (FTE) at module level.  Each module has a department and a cost centre (or is split between several).  Every registration on each module can be converted to an FTE through the credit points, which are then aggregated across the institution.  So a cost centre which receives 40 FTE from Dept A and 60 FTE from Dept B will be split 40%:60% between the 2 departments.  This also works the other way round, splitting departments by cost centres. In the case of non-modular activity, the mapping was carried out at programme level or, in the case of PGR students, by supervisor splits.’
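Liverpool’s method can be illustrated with a minimal sketch. The registration rows and the 120-credit full-time load below are assumptions for illustration; they are not taken from Liverpool’s systems.

```python
# A minimal sketch of the FTE-splitting method quoted above. The registration
# rows and the 120-credit full-time load are illustrative assumptions only.
from collections import defaultdict

CREDITS_PER_FTE = 120  # a common full-time annual load (an assumption here)

# One row per module registration: (department, cost centre, credit points).
registrations = [
    ("Dept A", "CC137", 20),
    ("Dept A", "CC137", 20),
    ("Dept B", "CC137", 30),
    ("Dept B", "CC137", 30),
]

# Convert each registration to an FTE fraction and aggregate.
fte = defaultdict(float)
for dept, cc, credits in registrations:
    fte[(cc, dept)] += credits / CREDITS_PER_FTE

# Split each cost centre between departments in proportion to FTE delivered.
cc_totals = defaultdict(float)
for (cc, dept), f in fte.items():
    cc_totals[cc] += f
split = {(cc, dept): f / cc_totals[cc] for (cc, dept), f in fte.items()}

print(split)  # CC137 split 40%:60% between Dept A and Dept B
```

As the quote notes, running the same aggregation the other way round splits departments by cost centre.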

 

In terms of HESA’s planned roll out of optional tiers 1 and 2, and required completion of tier 3 (‘Department’), Liverpool think:

 

‘The University of Liverpool will definitely complete the optional tiers, and we would like to see a further split between UG, PGT and PGR as these have their own distinct implications.’

 

Further feedback on the completion was:

 

‘The compilation of the table is not too onerous because most of the underlying data is required for internal activity mapping in any case.  Therefore, even moderate usefulness would justify the process.  Reviewing the submissions made by other institutions, we think that it would be very useful to the point of being indispensible to have a Tier 4 split between UG, PGT and PGR.’

 

Had it not been for the restructure discussed above, this completion would have been a simpler task. Liverpool has made a lot of change around areas of responsibility. The university used to be very centrally focused; it has now diversified and devolved responsibility outwards. This devolved nature means that other areas are more involved in central planning and strategy activity, which has increased the need for the data warehouse project that the university is running. It also means that the central planning department can’t simply complete the ACC mappings but must engage with the faculties and schools much more. Once these new systems are set up, the completion should be quicker and more accurate; it is only the first collection under the new structure that will be most affected.

 

The restructure has also meant that there has been a re-education of the staff to use the information and understand the level of granularity needed. The restructure has also increased the understanding of staff allocation, in terms of who is allocated where. If the MI project was already in place, and the data warehouse was complete, then completion of this study’s template would be far easier. However, as it’s not set up, the data has to be extracted from different systems.

 

It is possible that system changes could cause an issue with completion for a particular year. Liverpool is currently adopting a new HR system, which needs to be set up and structured in such a way that HESA data can be captured. It is important that integrating a new system does not result in an increased workload in a different area. Delays in completion could also be caused by key staff leaving, if the team were dependent on the expertise of a single key member. This was previously a potential issue at Liverpool, but the change in structure has led to the development of two key people who are experts in each return, as well as computing staff to provide technical support.

 

HESA Returns

  

Each return is completed by the relevant expert area: HR complete the Staff Return, Planning complete the Student Return, and Finance complete the Finance Return. The returns are then sent to strategic planning for a cross-check to ensure consistency, prior to being sent to HESA. Liverpool has introduced cross-checking procedures to ensure that the returns are linked.

 

HEIDI

  

HEIDI is very comprehensive, but lacks the ability to measure professional services and their contribution: for example, library and IT services etc. It would be good if there were some more generic measures that would help this. In addition to this, Liverpool would appreciate HEIDI having more international data and international measures. HEIDI does not always display data well and invariably, Liverpool exports data from HEIDI and uses Excel to clean the data. It would be nice to be able to do that within HEIDI. In a similar way, it is not easy to merge two data sets and so Liverpool staff will export them to Excel to merge and complete pivot tables.

 

HEIDI has lots of data, and if you use it a lot then it’s not a problem, but if you are new to the system, or only use it occasionally then there is a lot there to remember. It’s not always intuitive to know how to access the data and to use it effectively, or to understand some of the terminology used, so in some ways, HEIDI is not very user friendly. This is a barrier that Liverpool has faced particularly with relation to its restructure as its aim is to devolve responsibility. This is a trend that is likely to continue and thus the importance of the information being accessible to members of academic departments is even more important.

 

Benchmarking

  

The restructure at Liverpool has meant that departments are now encouraged to do their own benchmarking, for example, for the chemistry department to seek to compare itself against other chemistry departments, rather than the entire science faculty. All departments have access to HEIDI and are receiving training in how to use HEIDI. League table information is also used for benchmarking.

 

In terms of the collection, Liverpool would like to see a collection of metadata, as well as the information mapped into the data.

Liverpool expect that:

 

‘the data will allow us to understand the teaching structure of various institutions more clearly.  However, other than this we do not have a specific project in mind.’

 

‘We expect that the information will be useful to inform such benchmarking, although the exact extent will depend on the details of the exercise and on the range of institutions against which such benchmarking is carried out.  For example, it will be more useful in benchmarking against a smaller sample of institutions and those which are similar to Liverpool such as the Russell Group.’

 

New ACCs

  

The cost centres are good, but fall short in terms of the professional services. Liverpool requires its professional services areas to create a plan and monitor performance in each area, such as library and computing services. However, it is notoriously difficult to put the different areas together to measure professional services as a whole.

 

Key lessons learned

 

Differences in completion

 
As this is a proof of concept study, we have not been prescriptive in how we would like the template to be completed. This has meant that the template has been completed differently by different institutions. This poses a problem for the sector-wide roll out, as a consistent approach is necessary for effective benchmarking.

 

To illustrate these issues, an extract of two different completions from two institutions can be found here.

 

This table is just an extract. There is an issue of comparability as:

 

  • The Durham table represents a small extract of Durham programmes which are similar to SOAS’s departments.
  • The SOAS table represents SOAS’s entire Faculty of Languages and Cultures and all the departments contained within it.
  • Durham’s entries are far more numerous than SOAS’s: the Faculty of Social Science has 987 entries and the Faculty of Arts and Humanities a total of 290 entries. The Department of Modern Languages and Cultures has a subset of 56 programmes, and the cost centre ‘137 Modern Languages’ has 84 entries at Durham.

 

The question is how valuable this extra level of detail really is. After consultation with HESA, it was decided that the template should be completed to department level for all institutions. This will provide a greater level of detail than what is currently available, without being unnecessarily extensive.

 

This does raise the issue of what an HEI will do if it does not have a departmental structure. In order to overcome this, a definition of the sort of information that HESA would like provided by HEIs is included as part of the guidance that will be sent to institutions prior to the collection.

 

Consistency

 
For the collection to provide a real business use to the sector for benchmarking purposes, the information provided by each institution needs to be consistent. There have been a few different issues of consistency that have arisen from the different completions that we have received.

 

This has included:

 

  • Is research activity taken into consideration, or is the template based on students and thus teaching?
  • If research activity was to be included, there are likely further issues that would need to be resolved to ensure consistency. For instance, there may be discrepancies between the approaches taken by research and non-research institutions. A non-research based institution may complete the template based entirely on teaching, not including any research, and a research based institution may complete the template on both.

 

Furthermore, if the template is to be based on students and teaching activity, the information could be gained through a minor modification to the module record or an additional module record.

 

Additional issues of consistency included:

 

  • How are the tiers defined?
  • To what tier should the template be completed?

 

While it was suggested that the template could be completed to the lowest point at which there is a difference in cost centre, this would not provide the level of detail that is viewed as most beneficial, as it may indicate only faculty level for some areas. It was therefore felt that this was not an appropriate definition, since further detail was wanted from the study. Some universities have thus chosen to complete to modular or programme level, whilst others have fairly detailed information at departmental level or subject groupings.

 

These issues were resolved after collaboration with HESA. In order to overcome the issue of tier completion, HESA will request all institutions to complete the mapping at department level. Furthermore, we have recommended to HESA that they also request a separation of the information for undergraduate, postgraduate taught, and postgraduate research levels. HESA has confirmed that it cannot currently provide this information from the existing data it collects, which makes this collection necessary. In order to ensure clarity and consistency of the information provided, such as whether student, teaching or research information is included, comprehensive guidance will be sent to all HEIs prior to the collection.  We would recommend that HESA considers setting out some case examples, particularly so that it is easy for institutions to understand what is meant by ‘department’.  Our project revealed a wide array of organisational structures and differing usages of terminology.
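To make the recommended department-level mapping concrete, one row of the template might take the shape sketched below. Every field name and value here is hypothetical and for illustration only; the actual template will be defined by HESA’s guidance.

```python
# Hypothetical shape of one row of the department-to-cost-centre mapping,
# including the recommended UG/PGT/PGR separation. Field names and values
# are illustrative only, not a HESA specification.
row = {
    "institution": "Example University",
    "tier1_faculty": "Faculty of Arts and Humanities",
    "tier2_school": None,  # not every institution has an intermediate tier
    "tier3_department": "Department of Modern Languages and Cultures",
    "cost_centre": "137 Modern Languages",
    "level": "UG",      # UG, PGT or PGR, per the recommendation above
    "fte_share": 0.85,  # proportion of the department's activity in this cost centre
}

print(row["tier3_department"], "->", row["cost_centre"])
```

A completed return would contain one such row per department, cost centre and level combination, with the shares for each department summing to 1.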

 

Our own experiences have led us to conclude that departments can be characterised in the following ways:

 

  • A department is responsible for organising and delivering academic activity in a discrete subject, discipline or field of knowledge.
  • A department will have a formal position within the institution’s structure – it is likely to be separately identified on the institution’s organisational chart, it is likely to have its own management structure and be led by a ‘head’ or ‘director’ etc, and it is likely to have some degree of budget responsibility.
  • In larger organisations, a department can often form part of a larger structure (such as a faculty or school) which has been formed to cohere and link related subjects or disciplines.

 

HEIDI

 
Our study has also aimed to provide HESA with a more general insight into the use of its information, in order to improve their general provision. This aim led to feedback from our partners about the use of HEIDI.

 

Benchmarking is being viewed as a more devolved activity, undertaken by academic departments, and therefore the usability of HEIDI can be an issue. To overcome this, it was suggested that a glossary of terms could be incorporated into the system.

 

HEIDI was considered to be a good tool for dissemination, but not the only tool. This is because of the potential interest from academic departments who do not use HEIDI on a regular basis. Thus, other forms of dissemination should be examined to ensure high visibility of the findings.

 

In addition to this was feedback about how the information could be presented in HEIDI. After consultation with HESA on the basis of the feedback provided, it has been decided that the information will be presented in its raw form. Additionally, the information will be embedded as a hyperlink in the name of the institution. This way, whatever query you run in HEIDI, you would be able to click the hyperlink, and have that institution’s mapping to the new Academic Cost Centres appear in a new window.

We also recommend that HESA gives consideration to whether information on departmental structures, and/or HESA data compiled to reflect departmental structures, should be made available to bodies other than higher education institutions and HE-sector bodies that operate on a not-for-profit basis.  It seems likely that the compilers of league tables would be interested in this data in its own right, and may well see some potential for combining it with new data that will shortly become available through Key Information Sets.  The development of league table methodologies has not been without controversy, and it would seem prudent for HESA to proceed with caution and in dialogue with institutions.

 

Looking ahead

 

One of the biggest challenges in our project has been ensuring real business use of the information that is collected. This is particularly important for when HESA roll this collection out to the sector, as it will allow the study to provide a means for the sector to develop and improve current benchmarking and business intelligence practices.

 

Our project affects existing systems in HEIs in two ways. On the one hand, additional information is provided to HESA; on the other hand, additional information is received from HESA which affects a different part of the institution from those who provide the information.

 

In the case of Durham University, the additional information is provided by Durham University’s Student Planning and Assessment department. The information was extracted from an existing database, and a simple manipulation was performed to provide the information required. This took approximately two hours, the majority of which was spent updating the cost centres. As this will not need to be done every year, the time impact is likely to be reduced in future collections.
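As a rough sketch of the kind of manipulation involved (the file layout and column names here are invented for illustration, not Durham’s actual schema), the extraction amounts to grouping each department’s cost-centre assignments into the template shape:

```python
import csv
import io
from collections import defaultdict

# Hypothetical extract from a student records system: one row per
# department / cost-centre assignment, with an apportionment percentage.
export = io.StringIO(
    "department,acc_code,percentage\n"
    "Department of Physics,101,100\n"
    "Business School,133,80\n"
    "Business School,121,20\n"
)

# Group rows into the template shape: ACC code -> (department, percentage).
template = defaultdict(list)
for row in csv.DictReader(export):
    template[row["acc_code"]].append((row["department"], float(row["percentage"])))

for code in sorted(template):
    print(code, template[code])
```

The point of the sketch is that, where the underlying data already exists in institutional systems, the return is a simple regrouping exercise rather than a new data-collection burden.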

The information will primarily be used by the Strategic Planning and Change Unit to inform benchmarking activities, planning rounds and strategic policy recommendations and decision making. The information is also likely to be used by academic departments conducting their own benchmarking practices, and to inform their decision making process. Thus, the provision will have widespread benefits for the institution.

 

Furthermore, feedback from colleagues at both Durham University and our partner institutions has indicated that the information will bring greater transparency and granularity to the data used to inform other data sets, with the overall benefit of enabling better-informed decision making. It will be used to inform benchmarking activities at all levels of institutions, and the findings will be embedded in strategic decision-making processes. The improved capabilities and efficiencies are therefore seen as outweighing the small amount of time it will take our institution to provide the information.

 

Our project has gained an understanding of the business use of HESA data as well as current benchmarking practices within our partner institutions. Our partner institutions have included members of the Russell Group, former polytechnics, specialist institutions and arts-based colleges. In this way, we have been able to gain a broad understanding of the business needs of the sector for benchmarking information, as well as an understanding of how these institutions currently provide information to HESA.

 

Our project has thus informed HESA’s understanding of the sector’s BI issues, and this has shaped the way in which HESA are taking the study forward for sector-wide completion. Our findings are feeding into the guidance that HESA will send to the sector, and the structure of the template has evolved on the basis of what we learned from our partner institutions. This means that once the collection is rolled out to the sector, it will provide real business value to institutions and their BI and benchmarking activities.

 

Furthermore, our study has highlighted the ways in which the sector wishes to use the information being collected, informing HESA of the BI issues that affect the sector. This will shape HESA’s approach to this collection and to other returns, and should stimulate the development of information and dissemination vehicles that are conducive to BI activities.

 

Summary & reflection

 

The Enabling Benchmarking Excellence project has been ambitious in that it aims to provide a new set of data to the Higher Education sector and, in doing so, to enhance benchmarking and decision-making capabilities across all levels of HEIs sector-wide. The data collected in this project will make comparative analysis far more flexible and useful to all stakeholders within the HE community.

 

Thanks to the invaluable support of our partner institutions, we have been able to provide a proof of concept for HESA prior to their sector-wide roll-out of the collection. Our partner institutions have offered insight and recommendations which will ensure the value of the study to the sector.

 

The generous dedication of time from our partner institutions has meant that we have successfully conducted qualitative interviews with the users and providers of HESA data at eight institutions. This has enabled us to identify key metrics which will make the use of this model more meaningful for inter-institutional benchmarking at different levels within organisational structures.

 

In addition, our partners have generously completed the template, allowing us to prove the concept of the model that HESA can use in providing benchmarking facilities to the sector. In doing so, our partners have tested the concept and enabled us to develop a knowledge resource that will inform the guidance HESA issue to the sector prior to the collection.

 

The support offered by JISC, in the form of direct discussions with the programme managers and our critical friend, has been invaluable: it has allowed us to clearly define our ideas and adapt our project to the insights provided. Working with JISC has also allowed us to interact with a network of institutions conducting similar and yet very different business intelligence studies, which has enabled us to sound out ideas and refine our approach to the project.

 

On completion of this project, we issued the findings to our partner institutions. We have had some very favourable responses, and are confident that the information contained in our final report to HESA will be used to ensure real value is provided to the sector from this completion. However, as noted in our examination of BI maturity levels, institutions will only really be able to gain from the data gathered in this study if their own internal systems are sufficiently developed to harness the potential benefits. Our project focuses on enhancing the ability of HEIs to conduct external benchmarking activities, and does not examine whether institutions use this information effectively. This project therefore cannot claim to resolve all BI and benchmarking issues in the sector, and institutions will need to conduct their own internal analysis to ensure that they are not constrained by their own systems.

 

Click here for a summary video of the project.