Jisc case studies wiki Case studies / Transformations Falmouth University - Amdash

Transformations Falmouth University - Amdash

Project Name: Admissions Executive Dashboard

Lead Institution: Falmouth University

Project Lead: Andrew Climo


See the full Transformations programme playlist

 

Background

During 2012 Falmouth launched a major new initiative to develop an executive reporting service. This was a marked departure from previous attempts to deliver ad-hoc reports directly from operational systems.

 

Previous thinking held that, as long as administrative staff could be sufficiently skilled to create spreadsheet models presenting data from operational systems (primarily the SITS student record system), the needs of senior executives could be met. A focus on responsiveness rather than process, and on frequency rather than quality assurance, was seen as the main driver. As it became clear that highly skilled analysts were required rather than a more clerical resource, the pressure on a small number of senior analysts grew.

 

By 2012 it had become clear that ad-hoc reporting, and the use of senior analysts for this activity, was inappropriate. It was also clear that much of the report preparation time was spent not on presentation issues but on identifying data quality issues and errors: some resulting from poor definition of data, and some from changes arising from change management internally and with UCAS.

 

The goal of the Executive Information programme is to improve data quality whilst automating turnkey reporting, freeing senior analysts to concentrate on higher-value scenario modelling.

 

The first stage of the programme is to focus on the Admissions Capacity Model, which is the highest value report with a turnkey element. The second stage is to offer this up by means of a dashboard, providing a ‘single version of the truth’ to all members of the Senior Executive Team (SET).

 

The principal issue relating to KPI reporting is the ‘apples and pears’ problem: consecutive management reports may not contain data that is ‘equivalent’, either in business terms or in technical derivation. This is neither a trivial problem nor is the solution simple.

 

Understanding this issue was one of the principal areas of requirements definition and problem solving, and central to creating a robust and credible KPI reporting system.

 

Aims and Objectives

The aim is to replace the current manually intensive report writing service with a data warehouse, ‘off the peg’ reports and an executive dashboard that shows course performance and allows users to ‘drill down’ to identify specific areas of weakness. This is intended to be a largely ‘unattended’ service not requiring the direct input of strategic analysts or IT staff.

 

Context

The advent of the £9,000 per year student fee has created a sharper focus for ensuring that the courses run by Falmouth University are cost effective, aligned with demand and sustainable.

 

Although decisions can be made using existing operational data, extraction and presentation is a time-consuming and costly activity: a simpler, more responsive reporting service would assist with earlier action and allow Falmouth to react more quickly to changing market conditions.

 

The Business Case

The principal reasons for this project are:

  • Single point of failure (one person) – an operations issue
  • Data in SITS required for later reporting may be over-written, so snapshots are required
  • Queries have to be written based on assumptions about statuses: a robust approach is needed to ensure that query results are consistent, so that comparing data from different snapshots is truly comparing like with like
  • Presently, SQL must be manually crafted each time to take account of different cut-off dates
  • Queries are getting longer, and there is no fixed format for reports
  • Elimination of ‘atypical data’ is part of the clean-up process, but what does this mean, when is it valid, and what are the implications for data quality?
  • Access to tools: SQL Developer is only (and correctly) available to skilled staff, while spreadsheets are available to others. This means there are considerable restrictions as to who can do what, with what, safely
  • Many ‘exceptional exceptions’: large numbers of rules are likely to make writing a robust MIS difficult and costly
  • Use of Excel: the model is inherently non-robust, and formulae may easily be corrupted or over-written
  • Potential for mis-keying
  • A cut-and-paste orientation means that it is easy to introduce errors or destroy the model
  • It is unclear whether some reports are, or should be, pull, push or a mixture (what should be scheduled output, ad-hoc, or only produced at certain times/events)
  • Mostly no feedback from course leaders: we cannot tell whether corrective or preventive action has been taken or what the result was.
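Several of these issues (over-written SITS data, SQL hand-crafted per cut-off date, like-for-like comparison) point towards the same remedy: date-stamped snapshots queried by one fixed, parameterised query. A minimal sketch in Python with SQLite follows; the table and column names are invented for illustration, not the real SITS schema.

```python
import sqlite3

# Illustrative only: invented table and column names, not the SITS schema.
# Date-stamped snapshots plus one fixed, parameterised query mean every
# report is derived in exactly the same way, whatever the cut-off date.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE application_snapshot (
        snapshot_date TEXT,     -- when the SITS extract was taken
        applicant_id  INTEGER,
        status        TEXT      -- status as recorded at that snapshot
    )
""")
conn.executemany(
    "INSERT INTO application_snapshot VALUES (?, ?, ?)",
    [
        ("2013-01-15", 1, "APPLIED"),
        ("2013-01-15", 2, "APPLIED"),
        ("2013-02-15", 1, "OFFER"),
        ("2013-02-15", 2, "APPLIED"),
    ],
)

def applications_at(cut_off):
    """Count applications in the snapshot taken at the given cut-off.

    The SQL never changes, only the parameter does, so figures drawn
    from different snapshots genuinely compare like with like."""
    row = conn.execute(
        "SELECT COUNT(*) FROM application_snapshot WHERE snapshot_date = ?",
        (cut_off,),
    ).fetchone()
    return row[0]

print(applications_at("2013-01-15"))  # 2 applications at the January cut-off
```

Because every report is derived by the same query, two reports drawn from different snapshots differ only in their data, not in their derivation.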

 

Key Drivers

The key strategic drivers for this project were:

  • The need for the University to ‘step up’ and achieve wider international recognition
  • To implement ‘evidence based’ decision making and provide suitable support to senior management in terms of information gathering and selection
  • To compete effectively for students when recruiting, particularly when student applications may be falling
  • To increase the conversion ratio from applications to offers and from offers to acceptances
  • To reduce delays in turning around reports and presenting them to senior managers and to reduce the need for ‘at elbow’ support by strategic analysts and IT staff.

 

JISC Resources/technology used

Overview

The project has made use of several JISC toolkit items. Appendix 2 shows our experiences of using each within a project context and how/where we would recommend their use in projects of a similar ‘footprint’.

 

Through our active participation in the FSD programme we developed our Project Enterprise Architecture Toolkit (PEAT) which we have been using to plan and manage project activities. This has been successful in respect of this project and EA was largely followed.

 

To support this project we intended to use the following JISC resources from the outset:

 

  • JISC TechDis
  • Coventry University VISIO EA modelling stencil
  • JISC Transformations programme and its activities  
  • JISC EA community via the FSD programme
  • Several JISC tools and techniques including stakeholder analysis, risk management and several brainstorming techniques
  • JISC Involve Project blog
  • JISC Events and collaborative working 

 

Coventry Visio Template and other modelling techniques

We were successful in conducting a detailed process review of the ‘as is’ process for creating the present spreadsheet models and showing process costs and risks for each step.

 

We created three process models: one high-level IDEF0 model showing the business context of reporting, and two swimlane diagrams relating respectively to the development and usage of the Admissions Capacity Model.

 

The Coventry Stencil was found to be fit for purpose and we have recommended its further use on other projects. The usage of the Coventry Stencil for drilling down into requirements has been essential and it has driven the design of the Data Warehouse.

 

We have also created an IDEF0 template which we can offer up to others and demonstrate its use if requested.

 

In addition, a workflow stencil is useful for describing potential workflow interactions (we have created such a stencil). It is used to design workflow state changes relating to our change management cycle for reporting models. This has also been used on another project internally and, again, we can offer it up for wider usage if requested.

 

Stakeholder Analysis

The JISC Stakeholder Analysis tool provided a useful checklist for the effectiveness of our engagement with management: Institutional stakeholders include the Vice-Chancellor, Deputy Vice-Chancellor, Finance Director, Heads of School, Senior Analysts, Management Accountants and our own Business Information Systems unit.

 

In effect, we have brought these stakeholders together through (i) the Project Board, and (ii) The Expert Group on Reporting Standards [EGORS].

 

Extensive stakeholder workshops were held towards the beginning of the project and throughout requirements gathering. The use of clear agendas, pro formas to minimise writing up time and continual feedback have resulted in an effective collaborative environment and high quality documentation.

 

Technical staff have been more difficult to engage, particularly around the use of an unfamiliar methodology and multi-disciplinary team working. A lesson learned is that business intelligence projects require seamless working between pure business and pure technical skillsets; silo-based working can paralyse such a project.

 

Finding: Staff must be formally inducted into a project before starting work on it, and must sign up to a Project Charter in which they accept the roles and responsibilities of all other team members. Respecting the skills, knowledge and experience of others is crucial in what is a high-value project with high-quality outputs.

 

Power/Interest Grid:

 

 

Enterprise Architecture

The project made good use of process and application modelling techniques, but engagement from the IT team was disappointing: we would have preferred that they took the reins with application and data modelling. The overall notion of progressing from Concept Design through Logical Design to Physical Design is strong, and we would like to build this into future development since it fits well with Agile and RAD approaches (as well as Waterfall).

 

It was always intended that the project would use EA tools and techniques. Early project activities included creating two ‘as is’ process models using the Coventry Visio stencil, and early specification activities used the modelling cycle Concept Design → Logical Design → Physical Design.

 

One process model related to the development of a typical management report, whilst the second related to the update and usage of the model by the executive team.

 

In the application modelling dimension, a generic data warehouse conceptual application model was created for discussion and awareness purposes, along with a data flow diagram showing the principal system components at the logical application model level.

 

The business process model and data flow diagram were both used to derive the key business requirements using a Volere pro forma. The switch away from EA to Volere was at the request of the IT organisation, who were trialling its usage internally. But given that the new-style requirements definition did not achieve any greater level of engagement within the IT organisation, one might reasonably be sceptical as to whether the method of presenting requirements is the critical success factor for achieving meaningful engagement.

 

Finding: Levels of engagement with the ‘end user’ within the Marketing Directorate were good, and models were used on an ongoing basis to discuss requirements and understand the project's direction of travel. We have since used the same engagement technique (EA workshopping, the Coventry Stencil and the development of downstream models) with the eRecruitment and PaperLite Admissions projects. It is certainly not over-stating matters to note that the quality of engagement and the general levels of trust, enthusiasm and team working have improved as a result of these tools.

 

See Appendix 3 – EA Outputs Matrix.

 

Requirements Gathering

Requirements gathering has been formal and highly successful as far as the project team are concerned. Requirements have flowed smoothly from Business Case, to PID, to Business Requirements and Functional Requirements. This speaks to the professionalism of the team and the robustness of the methodology. Workshops have generally been enjoyable and outputs have been effective.

 

Best practice analysis

Whilst there have been copious amounts written on the technical aspects of data warehousing and reporting, very little has been written on the cultural and behavioural aspects of delivering a quality reporting environment.

 

We hope that our contribution will be useful to the wider JISC community.

 

Risk Management

The JISC Risk Management Toolkit has been used and found useful.

 

Collaboration Diagrams

These have not been as useful as expected due to the small number of interactions involved and the close relationship between senior analysts and senior management. Example provided in Appendix 6.

 

Mind Maps 

A comprehensive mind-map was created that showed the problem domain and recorded progress: 

 

 

Five Whys

During workshopping, extensive use was made of the ‘Five Whys’ method for resolving the root cause of an issue.

 

PICA

Extensive use was made of the classic plan-do-check-act approach to understanding process depth, inputs and outputs, in order to obtain a full set of sub-process steps. We used the in-house variant Plan-Implement-Check-Analyse (PICA).

 

 

 

Ishikawa Diagrams

 

We did not, in the end, make extensive use of Ishikawa diagrams, although one was used to describe engagement issues with management.

 

JISC Blog and Events

The JISC Blog and intranet pages all seem to be working well, with all project documentation stored on our internal SharePoint. Extensive communications with Associate Deans were particularly helpful last year and we will be looking to engage at the next level down in the institution this year.

 

Lessons Learned: We have found that JISC Blogs are insufficiently differentiated from monthly highlight reports (which they have effectively become). They have not therefore, resulted in a ‘conversation’ with our JISC peers.

 

Why blogs seem to be a sterile environment for collaborative working is a good question to which we cannot provide a definitive answer, but we suggest that ‘in the flesh’ events are far richer for generating collaborative thinking. It may be that face-to-face exchanges provide learning possibilities that include not only a richer visual and verbal environment, but also open up communication along the kinaesthetic and emotional axes, adding to both the speed and quality of information exchanged.

 

Outcomes

 

Improved management focus

During our analysis of the ‘as is’ process for developing new reporting models we continued to stumble over the semantics of a raft of terms: What does one actually mean by “Applications to Date” or “Offers at Stage 1”?

 

Finding: As we drilled down, it became clear that every item of terminology would need to be defined at both a business level, and in terms of contributing data structures. Each reporting metric (KPI) was given its own English definition, described in terms of business decision points and also in terms of SITS data flags.

 

Without this we were unable either to obtain the necessary clarity when referring to them, or to establish the formulae for combining KPIs later. We therefore strongly recommend the creation of an institution-wide expert group.
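The idea of pairing each metric's plain-English definition with its technical derivation can be made concrete as a small register entry. This is a sketch only: the wording, decision point and flag name below are invented examples, not Falmouth's agreed definitions or real SITS flag names.

```python
from dataclasses import dataclass

# A sketch of a KPI register entry of the kind described above. All
# field values are invented examples, not actual SITS flags.
@dataclass(frozen=True)
class KpiDefinition:
    name: str
    business_definition: str    # the agreed plain-English meaning
    decision_point: str         # where in the admissions cycle it applies
    sits_flags: tuple           # data flags used in the technical derivation

APPLICATIONS_TO_DATE = KpiDefinition(
    name="Applications to Date",
    business_definition=(
        "All applications received since the cycle opened, counted once "
        "per applicant per course."
    ),
    decision_point="Post-submission, pre-offer",
    sits_flags=("STATUS_APPLIED",),  # hypothetical flag name
)
```

With every metric defined at both levels, a discrepancy between two reports can be traced either to a change in the business definition or to a change in the technical derivation.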

 

At Falmouth we now have an Expert Group (communicating ex officio at present), but from autumn onwards it is intended to be part of the committee support structure for senior management. The norms and standards that it publishes will be part of the University's Quality Management System, and those responsible for reporting will have to comply with the definitions when producing management information.

 

Successful concept design and stakeholder acceptance

The JISC project captured the ‘as is’ process model for the development of these reports and conducted a review of ‘as is’ back-office activities.

 

We then undertook a concept design for a data warehouse containing historic data and topical snapshots of SITS (student record) data, and considered how a re-engineered role for the senior analyst might look once the warehouse was implemented.

 

The focus was very much on creating a richer environment for our strategic analysts, enabling them to analyse course data more rapidly and to conduct more profound research.

 

Use was made of several JISC resources (including toolkit elements, modelling tools and techniques, LTC [Leading Transformational Change] and locally derived products built on JISC/LTC products).

 

Outputs include:

 

  • Example ‘as is’ models and associated concept models for the transformation of executive reporting services from legacy spreadsheets to a data warehouse.
  • Model practices for translating business requirements (often expressed in very general terms) into meaningful technical requirements for data warehousing.
  • Identification of which classes of data might be ‘pulled’ by executives without the intervention of senior analysts, thus reducing cost, eliminating wait time and improving the quality of service. 

 

The development of this resource benefits not only Falmouth University but the wider JISC community by providing:

 

  • A release management process for the dissemination of sensitive management information
  • Reusable project support documentation for migrating from legacy reporting processes and tools to data warehousing/reporting environments
  • A definition of an executive reporting environment including application, data and business models, service level specifications and roles and responsibilities documentation

 

Resulting from Falmouth University’s involvement in the LTC programme we have developed the means to capture, control, track and deliver detailed project requirements (including transformation into packages of testing and sign off). This was further developed and delivered as part of our JISC contribution.

 

The processes, practices and tools developed have been shared with the wider community via the JISC blog and this final case study. We have also attended group events to share our experiences. It is our hope that our findings will be of interest to the wider community generally, and to similar arts-oriented colleges in particular.

 

Change management orientation

Change Management of executive information is, we believe, the biggest challenge associated with delivering a trusted executive management system. Lack of trust in the data (both ‘at source’ and ‘as presented’) is continually raised by senior management. This is ahead of usability issues and frequency/service responsiveness.

 

A successful KPI reporting system may be defined (in part) as one where consecutive management reports contain data that is ‘equivalent’ in business terms and technical derivation: when data is extracted for a report, it must be the right data, extracted in the correct manner. There must be clarity as to what the data relates to in terms of business semantics. Processes must be reliable, automated, repeatable and, crucially, traceable, so that when differences become apparent they can be explained and understood in both business and technical terms.

 

However, we have not been successful in driving this message home, and maintaining technical focus has been challenging. We see this as an organisational and skills failing rather than one of poor technique or a project management failure.

 

Finding: Whilst a ‘mandraulic’ system of change management might be applied to the Data Warehouse it would be far preferable to encapsulate the process of release of data and reports within a workflow system/cycle. This would insulate IT staff from requests for ad-hoc reports, remove the temptation for IT staff to just ‘run some SQL’ and insert a layer of management authorisation between technician and senior manager.
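The release cycle suggested in this finding can be sketched as a small state machine. The state names below are an assumption for illustration, not part of the project's delivered design.

```python
# A minimal state machine for a report-release workflow: a report moves
# through fixed states, and only an authorised release reaches senior
# managers. State names are invented for illustration.
ALLOWED = {
    "draft": {"submitted"},
    "submitted": {"authorised", "rejected"},
    "authorised": {"released"},
    "rejected": {"draft"},
    "released": set(),
}

def transition(state, new_state):
    """Refuse any change outside the agreed cycle, so no report can move
    from a technician's desk to a senior manager without authorisation."""
    if new_state not in ALLOWED[state]:
        raise ValueError(f"cannot move from {state!r} to {new_state!r}")
    return new_state

state = "draft"
for step in ("submitted", "authorised", "released"):
    state = transition(state, step)
print(state)  # released
```

Encapsulating release in a cycle like this inserts the layer of management authorisation between technician and senior manager that the finding recommends.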

 

Achievements  

The project has delivered against outcomes as follows: 

 

Outcome: Improved Management Focus
  • Deliveries: Expert Group terms of reference; project communications
  • Contribution to Outcome: secured agreement with senior management on the goals of KPI dashboarding and admissions planning reporting

Outcome: Successful concept and logical designs
  • Deliveries: detailed process models (Coventry Stencil, IDEF0 model); concept models for the Data Warehouse; PICA and 5-Whys problem solving; best practice analysis
  • Contribution to Outcome: provided a clear statement of requirements, the solution concept and critical success factors

Outcome: Successful acceptance by stakeholders
  • Deliveries: stakeholder matrix; collaboration diagrams; workshops and process modelling
  • Contribution to Outcome: secured acceptance by business users (only partially successful on the technical side)

Outcome: Change management orientation
  • Deliveries: concept models
  • Contribution to Outcome: provided the concept for a change management approach to KPI reporting (only partially successful at securing agreement on change management processes)

 

Benefits  

 

By the end of the project design had been completed to:  

  • Partially eliminate the single point of failure issue 
  • Mitigate the risk that data in SITS may be over-written by providing data warehouse snap-shots 
  • Clarify the definition of application statuses and how their business logic works 
  • Reduce reliance on manually crafted SQL for reporting 
  • Reduce the likelihood of mis-keying of data 
  • Document the data clean-up process 
  • Improve the reliability of Excel reports by providing a re-usable data stream and minimising the use of formulae 
  • Remove cut-and-paste operations, thereby stopping errors being introduced by that mechanism.

 

Drawbacks  

Having completed this feasibility study we can report that our initial view of risks was close to the actual risks observed.

 

Firstly, there is the challenge of recruiting a Data Architect who is not only technically able but also temperamentally suited to working in a business-centred rather than IT-focused environment. It is difficult to recruit individuals with this blend of high-end technical capability and the necessary people and team-working skills. It took us over twelve months and three attempts to recruit someone who was close to the person specification.

 

Secondly, the need to understand senior managers (at the level of Vice-Chancellor, Director and Heads of Department) is crucial, as is the ability to communicate effectively with them to create a ‘culture shift’ away from spreadsheet thinking towards an appreciation of the importance of high quality data and the need to fund back-end personnel. The perceived ‘wants’ and actual ‘needs’ of senior managers are some distance apart.

 

There are also some limitations identified:

 

  • Although we have alleviated one single-point-of-failure issue, we have introduced the potential for another (in the form of a Data Warehouse specialist)
  • The lack of an agreed change management process means that it is not impossible for poor SQL to be re-introduced, and there is no means to validate the quality of the SQL
  • The issue of ‘atypical data’ and exceptional exceptions is yet to be addressed, as is the issue of tool access
  • Use of Excel: the model is inherently non-robust, and formulae may easily be corrupted or over-written
  • Although clear in principle, we have yet to produce an agreement as to which reports will be pull or push
  • We have not addressed the issue of poor feedback from course leaders

         

Key lessons

 

Business Analysis

 

  • The development cycle for a data warehouse and/or turnkey reporting follows the traditional development cycle. Agile, RAD and Waterfall are all viable techniques, as long as they are supported by staff who are prepared to work in a disciplined manner and with the equal input of fellow team members. Health Warning: we have found that the term ‘Agile’ is bandied about as cover for not following a process and not providing documentation. If ‘Agile’ methods are to be used they should be formalised in the same manner as any other methodology.
  • Creating a turnkey reporting service based on a data warehouse is a multi-disciplinary activity. Senior management analysts, business analysts and technical architects need to come together to make it work. An IT-focused solution will fail, resulting in a loss of trust with senior management, and may result in project failure.
  • Attention to detail is crucial: robust, accurate and comprehensive data definitions are needed to ensure that quality data is provided and quality outputs are produced.
  • Change management is crucial: changes to datasets and reports must be controlled. Communication to senior management must be candid, honest and informative: guidance must be provided so that senior managers may interpret changed information with confidence, and over- or under-reactions to presented data do not occur. Health Warning: the pressures of creating management information can result in a ‘house on fire’ approach to business intelligence. This will inevitably result in corners being cut, and reports not being created with data of known provenance.
  • Technical staff must be firewalled from senior managers: contact should be through senior analysts. Ad-hoc changes to reports must be eliminated.

         

Technology 

Finding: The creation of staging areas, fact/dimension data and the vault (which the reporting service accesses), whilst not the same as the creation of a relational database, is not technically difficult.
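As a rough illustration of the staging-to-dimension/fact step, reduced to plain Python: the course names and statuses below are invented, and this is not the project's actual vault design.

```python
from collections import Counter

# The staging-to-dimension/fact step reduced to plain Python.
# Course names and statuses are invented examples.
staged = [  # rows landed in the staging area from a snapshot of the student record system
    {"course": "BA Fine Art", "status": "OFFER"},
    {"course": "BA Fine Art", "status": "APPLIED"},
    {"course": "BA Photography", "status": "OFFER"},
]

# Dimension: the distinct courses, each assigned a surrogate key.
course_dim = {
    name: key
    for key, name in enumerate(sorted({r["course"] for r in staged}), start=1)
}

# Fact: one entry per (course key, status) carrying a count measure.
facts = Counter((course_dim[r["course"]], r["status"]) for r in staged)

print(course_dim)   # {'BA Fine Art': 1, 'BA Photography': 2}
```

Separating stable dimensions (courses) from additive facts (counts per status) is what lets a reporting layer aggregate and drill down without re-querying the operational system.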

 

There is no a priori reason why this could not be an arms-length activity considered to be a purely back-office service. If designing this service again we would probably pursue this approach and recruit a report writer rather than creating a single role that is both a single point of failure and bottleneck as they must be involved at every stage. 

 

The technical infrastructure was right-sized from day one: This greatly aided later development. However, we would also now provision a second smaller machine to allow senior analysts a separate and safe environment to build ‘side-models’ using a separate data warehouse built to their own specification. This would lower the risk of the production data warehouse being overwritten and would simplify the change management process.

 

Looking ahead

Forces supporting change are:

 

  • A willingness by senior management to recognise the importance of evidence-based decision making
  • A post has been created to build and develop the institutional Data Warehouse and use this to support the institutional KPI set
  • An ongoing programme has been supported and funded by senior management which recognises the strategic importance of executive dashboarding and the ongoing creation of the infrastructure to support it
  • A clear direction by IT and the Delivery and Innovation Group (DIG) to promote and drive executive dashboarding
  • The willingness to create the Expert Group on Reporting Strategy and for this to report to the Vice-Chancellor.

 

Forces against change:

 

  • An inevitable tendency for senior management to want explanations for short-term bumps in the data, which can drive reporting towards knee-jerk requests for ad-hoc reports
  • Loss of focus by IT staff on quality and a temptation to cut corners by implementing changes that are not thought through in order to satisfy requests
  • A tendency to 'stove pipe' work and drive all reporting work through the Data Warehousing developer rather than broadening skills and involving a wider range of staff, departments and expertise
  • A tendency for reporting to be seen as an IT function rather than one where business knowledge and experience add value and create the context and understanding about what the data 'means': this can never be an IT function.

 

Appendices

Appendix 2b - Stakeholder Analysis Matrix

Stakeholder: Vice-Chancellor, Deputy Vice-Chancellor, Finance Director
  • Stake in the Project: policy and process owners who determine institutional administrative policy and procedures. Sponsors for the Expert Group on Reporting Standards (EGORS). Ultimate customers for the information presented by the new service.
  • Potential Impact on Project: Very High
  • Expectations of the Stakeholder: active delegation and ongoing interest, to ensure that staff are empowered and trusted
  • Perceived attitudes and/or risks: potential for distraction with tactical issues (urgency for admissions reports)
  • Stakeholder Management Strategy: involvement in Project Steering Board (Finance Director); regular update meetings with the programme manager
  • Responsibility: Programme Manager

Stakeholder: Heads of School
  • Stake in the Project: manage School admin staff who will operate the new system at local level, and academic staff who will indirectly input and directly extract data
  • Potential Impact on Project: Medium
  • Expectations of the Stakeholder: commitment to engaging with the system / service
  • Perceived attitudes and/or risks: lack of interest in the project
  • Stakeholder Management Strategy: involvement in briefing sessions at quarterly School meetings
  • Responsibility: Deputy VC, Finance Director and Project Sponsor

Stakeholder: Senior Analysts
  • Stake in the Project: will extract data from the new system, conduct QA activities and manage outputs from the IT department
  • Potential Impact on Project: Very High
  • Expectations of the Stakeholder: contribute to system and process design and testing
  • Perceived attitudes and/or risks: concern about increased workload; concern that Heads of School may not engage fully; concern that the project could be ‘IT for its own sake’
  • Stakeholder Management Strategy: involvement in project steering and buy-off of all deliverables
  • Responsibility: Project Team, Project Board

Stakeholder: IT Developer
  • Stake in the Project: provides the required technical expertise
  • Potential Impact on Project: High
  • Expectations of the Stakeholder: deliver as requested / directed
  • Perceived attitudes and/or risks: sole expert; single point of failure; temptation to cut corners to achieve goals
  • Stakeholder Management Strategy: clear terms of reference; strong management direction and oversight
  • Responsibility: Programme Manager, Line Manager

 

Appendix 2c - Model Practices

When attempting to develop a project to deploy an executive reporting system:

Do

  • Understand the process for gathering and assessing management requirements for executive reporting
  • Understand the sensitivities associated with particular key performance indicators and the risks associated with incorrect measurement and interpretation
  • Build data structures that will capture time-based data for later trend analysis
  • Build-in change management so that management requests for change may be linked to changed data
  • Provide narrative information to guide users so that they are clear as to what is significant and noteworthy, and conversely what changes to reports may safely be ignored (even if apparently significant)
  • Build a team of skilled technical and business users that will work collaboratively and effectively and actively transfer skills from themselves to others
  • Establish an expert group to define reporting terminology, data standards and KPI semantics.

Don’t

  • Become a clearing house for ad-hoc reporting
  • Focus on short term report needs
  • Prioritise report quality or frequency over data quality
  • Get involved in either operational or ‘tactical’ reporting
  • Attempt to displace other (legitimate) forms of reporting

 

Appendix 3 - Enterprise Architecture Models at Falmouth University

 

 

Domain of Change: Technology, Application, Data, Process, Organisation, Location, Enterprise

Model Type: Conceptual

CTM

What: Simple diagram of major items of technology - servers, networks, clients

Why: To communicate basic organisation of hardware and networking

When: RD, updated during A&F

How: Visio (stencil) by consultation with stakeholders and TA

Author: BA

CAM

What: Simple diagram of  application components, databases, interfaces, web, display / print processing

Why: To communicate basic organisation of software functions

When: RD, updated during A&F

How: Visio (stencil) by consultation with stakeholders and TA

Author: BA

CDM

What: Simple diagram of key items of data, people, products, tasks etc. within the proposed system.

Why: Shows key data items / linkages, ensure system supports essential processing

When: RD, updated during A&F

How: Visio (stencil) by consultation with stakeholders and TA

Author: BA

CPM

What: A meta-model of what types of process are involved in the ‘to be’ process, or IDEF0 ‘top down’

Why: Coaching of workshop participants, scoping of project [in conjunction with an ECM]

When: Usually produced when visioning or re-architecting business

How: PowerPoint, Visio (stencil)

Author: BA

COM

What: A meta-model of what types of organisational unit exists / will exist

Why: Coaching of workshop participants and in conjunction with an ECM

When: Usually produced when visioning or re-architecting whole business

How: PowerPoint

Author: BA

CLM

What: A meta-model of what types of location exist / will exist

Why: Coaching of workshop participants and in conjunction with an ECM

When: Usually produced when visioning or re-architecting whole business

How: PowerPoint

Author: BA

ECM

What: A meta-model showing the priorities, influences and direction that the business faces.

Why: Sets the context in which the project is framed

When: This is a pre-requisite to project initiation

How: PowerPoint, Word

Author: Executive Management

Notes: This will typically be related to, or part of any strategic documentation such as institution-wide Business Plans and will be underpinned by Vision, Mission and Top-Down goals.

 

 

Model Type: Logical

LTM

What: Complex diagram at component level

Why: Drives initial configuration

When: DD, updated during Implementation

How: Visio (stencil)

Author: TA

LAM

What: Complex diagram at component level

Why: Drives initial configuration

When: DD, updated during Implementation

How: Visio (stencil)

Author: TA

LDM

What: ‘As designed’ entity model

Why: Drives development

When: DD, updated during Development

How: Visio (stencil)

Author: TA

LPM

What: Swimlane ‘as is’ and ‘to be’ process models showing actual business activities / tasks.

Why: Drives design, Dev., Implementation

How: Visio (stencil)

Author: BA

LOM

What: Stakeholder diagram showing key business and users relating to the proposed system

Why: Drives design, Dev., Implementation

How: Visio or PowerPoint

Author: BA

LLM

What: Diagram of ‘as is’ and ‘to be’ location model showing interactions between locations

Why: Drives design, Dev., Implementation

How: Visio or PowerPoint

Author: BA

Model Type: Physical

PTM

What: Detailed diagram of assignment of network ports, VLANs, drive partitions / RAID

Why: Drives implementation and maintenance

When: DD and Implementation

How: Network management tool(s)

Author: Technician

PAM

What: Detailed diagram of code structure (Class level)

Why: Drives implementation and change management

When: DD and development

How: Code management and application modelling tools

Author: Developer

PDM

What: Detailed diagram of implementation data model

Why: Drives implementation and database management

When: DD and development

How: Code management and database modelling tools

Author: Developer, DBA

PPM

What: Swimlane diagram who, what and when (could be linked to workflow system)

Why: Drives implementation, procedures and methods

When: DD and implementation

How: Process mgmt and workflow s/w

Author: BA

POL

What: Detailed diagram and roles and responsibilities model

Why: Drives implementation, procedures, methods

When: DD and implementation

How: Process mgmt and workflow s/w

Author: Senior Bus. Lead

PLL

What: Detailed diagrams of physical locations, people, equipment and activities

Why: Drives implementation, logistics, methods

When: DD and implementation

How: Process mgmt and workflow s/w

Author: Senior User Lead

 

Project Blog

 

http://amdash.jiscinvolve.org/wp/