NETS Reporting tool

UX/UI: Matt handover

InVision wireframes: https://projects.invisionapp.com/freehand/document/fOAwpR8wd

Packaged design files and any UX

Workshop feedback to be used in the handover for Matt's notice period on 25/10/19

NETS Reporting tool workshop feedback


Wants/Needs | Concerns | Quick Wins
Vision 

Key questions the reporting functionality should answer:

  1. Within which provider organisation is risk greatest?
  2. Within an organisation, where is risk greatest?
  3. Within an organisation, where is good practice being demonstrated?
  4. Which organisation presents the greatest risk within a specific training programme/learner group?
  5. How do results from different professional groups compare within organisations?
It has become apparent that we need to establish exactly what this reporting tool needs to do, and for whom.

We need to understand:

  • User groups and their needs
  • How these groups use the report
  • What these groups expect from the report
  • What are we building this in? (Tableau at the moment, but will that still be possible given the answers to the above?)
Key questions

Language and messaging

  • How are we defining risk? Outliers (NTS), Benchmarks, Raw score
Language and messaging need to be clear. What are these results measured against, and what rules, benchmarks, etc. is HEE using to define risk? What does it all mean in layman's terms?

NETS needs to establish a clear strategy and messaging around the framework and how the data can be used.
Considerations
  • Data extraction
  • The volume of information, and what do external users actually need?
  • Reputational damage
  • Within a theme or indicator, which questions make it up and what were the responses ('drill-down')
  • If EXTERNAL users can't action the data, they won't distribute the survey and there will be no data; we need to make sure it's usable for EXTERNAL users
  • Risk by 5X5 Matrix?
  • Link to the HEE Quality framework
  • Year-on-year longitudinal comparison (data where N<3)
  • Creating a bank of non-med data that can map to NTS data
We need to understand who this is for; Internal and External teams may well need very different types of functionality, reporting information, tools, etc.

Investigate exact needs to establish whether this is a single tool for all (Internal and External) or whether we build out two separate solutions.
Slide 1
  • UI Page instead to access further data
  • The intro dashboard could adapt the page that explains all the other dashboards to create an overall introduction
  • Log in dictates if you see Internal or External tool
  • What happens when response rates increase and the survey is released nationally?
  • What happens when the survey has completed two full cycles?
  • Do we need it at all?
  • The link is broken, so an email address would be better
  • Add which survey (and date) the tool links to
  • HEE branding needs to be better
  • Contents page would be useful
  • Purpose of the survey upfront
  • Navigation to each area with a short description
  • 'Last updated' date as title
  • Contact information
Slide 2
  • Need an indication of response rates to show the validity
  • Add a theme/indicator level under the domain
  • Responders by Region - may be more useful to show response numbers vs total learners (i.e. total potential responses)
  • Future - to cover all domains in the questions
  • Responders by Region: change to numbers and split Med and Non-Med
  • Useful only to operational staff


  • Do filters apply to all pages or just a single page?
  • How will this be presented year on year?
  • Not measuring against all the domains
  • The number of questions varies by domain (2-15); how best to summarise?
  • Not useful EXTERNALLY (just more information to navigate past)
  • What are our response rates? What do we do about non-med?
  • If we get Slide 1 right, we don't need this

Summary slides
  • Overall satisfaction by location (Drill down on a map)
  • RAG ratings
  • Positivity/negativity
  • OS by Region/Trust/Programme - Drill down to see questions
  • Can we RAG?
  • Domain useful internally at National/board level
  • Question categories (e.g. Teaching, Supervision) useful externally and at Local Office level
  • Would like the option to compare responses to regional mean as well as national (overall sat. by location chart)
  • Ability to compare results for sites as well as trusts
  • Create a common NETS language (like GMC flags - or NETS flags)
  • Click on a result by domain to see the question items that feed into it
  • Domain level is too broad; 'Overall Sat' needs to drill down to 'Supervision', 'Training', etc.
  • Breakdown of responses has 'value' for local offices
  • Can we map risk against the HEE risk matrix?
  • OV Sat/OV Variance tabs would be useful if displayed by Category/Teacher/Supervision
  • Results on a page, e.g. for a provider; there are currently too many tabs to view all the data
  • If we theme the questions by Supervision etc, it may be easier to understand/more useful
  • OV Variance: filter by colour coding to investigate potential risk
  • Introduce a slider with movable maximum and minimum to highlight the top and bottom ranges
  • Domain summary - like the idea of having one list of questions rather than response types
  • The end goal: an EDU provider can select themselves, click download, and obtain a report on themselves
  • Addition of an average bar for those selected
  • Can we RAG/categorise data in the same way as the GMC? (We all understand that system, and so do trusts)
  • Currently can't answer business questions from the tool!
  • How do we identify the key areas of risk in these pages?
  • Ask National Directors how they want to analyse business risk
  • How do we know which of our organisations present the greatest risk? Need to agree a threshold and process
  • Calculation of data/scores/risk. How do we want to count? Data methodology
  • If we can't do % response rates, show numbers at National/Regional and also at Placement/Trust level
  • How is this currently better than Excel?
  • Don't use '+ive' and '-ive' as it's confusing terminology
  • Instead of an external tool can we generate a PDF report
Slides 12-13
We need to try to introduce a lot of this into the reporting tool itself, as opposed to having it as a reference at the back. The tool should be intuitive enough for users; where it isn't, we need tooltip areas to help users understand what things are, where they are, and what they do.

Photos and wall space