Progress Report: Data Analytics October 2016
Executive Summary

• Smart Specialisation is a strategy that aspires to promote local economic growth in innovation, and seeks to enhance collaboration between business, higher education and public sector stakeholders. Its aim is the pursuit of comparative advantage – focusing investment on areas of real local strength, as supported by analysis and driven by bottom-up entrepreneurial discovery – to build productivity, resilience, and diversity into local economies.
• This aspiration was followed through by Government with the establishment of a Smart Specialisation Hub, jointly delivered by the National Centre for Universities and Business and the Knowledge Transfer Network (KTN). The Hub aims to provide an evidence base to inform better investment decisions in innovation, and use Smart Specialisation approaches to mobilise collaboration between Local Enterprise Partnerships (LEPs) and other stakeholders across geographical boundaries and sectors.

• Measuring innovation involves using a range of indicators to capture the series of inputs (e.g. funding and human capital) and outputs (e.g. knowledge outputs from research, innovative products, services, and processes) that encompass the innovation process, and the interactions between them.

• Following a period of data curation, we selected thirty such indicators reflecting local and national performance in innovation at LEP level. A pilot analytical framework has been developed as a first step toward profiling and comparing innovation performance, and is structured around different actors - research/Higher Education Institutions (HEIs), business/industry, and the wider LEP environment - and what they each contribute to the innovation ecosystem in terms of funding, talent, and observable outputs.
• In common with all efforts of this nature, the Smart Specialisation Hub worked through a series of challenges in assembling this framework, including (see page 7):

1. a lack of available indicators at LEP level, which was overcome using a combination of NUTS2 regional data, university-level data, and postcode-level data;

2. misalignment of sectoral classification systems; data analysts for the Hub have sought, where possible, to align classification systems across indicators with sectoral validity; and

3. backward-looking and static data; the Hub uses up-to-date university research funding data to highlight emerging innovative technology and industry areas.

• The Hub's Observatory function will house a series of data visualisations designed to enable users to easily track innovation performance within and across LEPs. Benchmarking innovation performance will enable users to evaluate where LEPs are positioned in relation to one another, and highlight fruitful points of collaboration based on respective research and business innovation strengths.

• The Hub's data analytic team aims to complete a pilot testing phase in the coming months, comparing innovation assets and performance across the Heart of the South West and Greater Manchester LEPs. Longer-term milestones include developing a functional analytic tool to better inform investment decisions and points of collaboration in innovation within and across all LEPs.
Introduction

Smart Specialisation in England is an economic strategy centred on the promotion of place-based economic growth in innovation, and seeks to enhance collaboration between business, higher education, and public sector stakeholders (Department for Business, Innovation & Skills, April 2015). Smart Specialisation allows each Local Enterprise Partnership (LEP) in England - see Figure 1 for a map of England's 39 LEPs - the chance to build on their comparative advantages in skills and industry, and the opportunity to work in collaboration with other localities to drive productivity and growth. Many of the current aspirations for building local growth on localised comparative economic advantage were highlighted in Sir Andrew Witty's Review of Universities and Growth, 'Encouraging a British Invention Revolution' (BIS, 2013), including the need for an advisory capability to advise on how strongly local innovation decisions are based on sound assessment of comparative advantage.

This aspiration was formalised by Government with the establishment of the Smart Specialisation Hub, jointly delivered by the National Centre for Universities and Business and the Knowledge Transfer Network (KTN). The Hub was established to provide an evidence base to inform better investment decisions in innovation, and to promote collaboration between LEP stakeholders across geographical boundaries and sectors.

To date, a number of LEPs have begun to develop innovation and smart specialisation strategies alongside their Strategic Economic Plans (SEPs). Whilst SEPs set out broad future economic development and growth plans for a LEP, innovation and Smart Specialisation strategies focus attention on growth and job creation through product and service innovation. In line with its remit, the Hub aims to add to the ongoing processes in LEPs with enhanced university engagement, a systematic communications platform, and an Observatory of data outputs designed to profile and compare Local Enterprise Partnerships' innovation performance. This report will outline initial progress from the Hub's data analytics team in developing an overarching analytical framework for innovation and in profiling the innovation performance within and across LEPs in England.
Measuring innovation across geographical areas

Innovation can be understood as a process that, whilst difficult to examine in its entirety, can be partially followed by observing what inputs it deploys (e.g. funding and human capital), and what outputs it obtains (e.g. knowledge outputs from research, innovative products, services and processes) - though direct relationships of one output to one input are difficult to establish (Fagerberg et al., 2006). Since it is not a linear process, measuring innovation involves using a range of indicators to capture the series of inputs and outputs, and where possible, interactions between them, that encompass the innovation process. Collectively, these indicators can be used to build a more detailed picture not just of the innovation assets present in a locality, but also how they are related. This in turn can provide an evidence-based assessment of innovation performance within and across geographical areas.

To date, several analytic frameworks have been developed with the intent to capture innovation assets in England. A recent Government framework (Mapping Local Comparative Advantages, BIS, July 2015), developed for ranking the innovation performance of LEPs, incorporates a range of input and output indicators oriented around the themes of funding, talent, knowledge assets, structures and incentives, broader environment, and innovation outputs.

Furthermore, as part of the first phase of the Government's programme of Science and Innovation Audits (SIAs) (BIS, November 2015), established to profile the innovation performance of regional consortia in the UK, analysts Technopolis (April 2016) developed an enlarged analytical framework to encompass all aspects of regional excellence in science and research and strengths in innovation. Whilst larger in scope, this framework drew on a similar set of indicators to those featured in the Government's framework (BIS, 2015 op. cit.), but these were structured slightly differently around the themes of 'Regional Science and Innovation Assets', 'Excellence in Science and Research', and 'Innovation Strengths'.

Whilst highly insightful, these innovation frameworks tend to include indicators with a relatively low level of sectoral granularity, usually do not explicitly compare innovation performance across geographical areas, and elect not to highlight emerging innovative technology and industry sectors. Whilst functioning within some constraints relating to data procurement and validity (outlined in more depth on page three), the Smart Specialisation Hub's analytic framework and data outputs seek to respond to these key limitations.

Figure 1: Map showing the 39 LEPs in England (Source: BIS, contains Ordnance Survey data, 2013)
The Hub's analytical framework

We have designed the Hub's analytical framework using indicators curated from a range of sources, including the Government's 'Mapping Local Comparative Advantage' report, produced with Liverpool John Moores University (BIS, 2015), the Science and Innovation Audit inception report (Technopolis, 2016), the Higher Education Statistics Agency (HESA), the Higher Education Funding Council for England (HEFCE), and Innovate UK. The Hub's framework responds to limitations of existing innovation frameworks by attempting to achieve a greater degree of sectoral granularity, by comparing innovation performance across LEPs via benchmarking techniques, and by signposting innovation in emerging technology and industry sectors. The analytical framework will be housed within the Hub's Observatory. This will act as the platform from which users can consult data visualisations relating to innovation assets and performance.

Following the data curation stage, which included wide-ranging stakeholder engagement, we identified thirty indicators reflecting local and national performance in innovation at LEP level. These were structured firstly around innovation actors (i.e. research institutions, businesses, the broader LEP environment), and secondly around a series of innovation inputs and outputs, including funding in innovation, talent in innovation, and innovation outputs (see Figure 2).

Figure 2: Venn diagram of innovation 'actors' and their interactions: Research/HEI Innovation Assets, Business/Industry Innovation Assets, and the Broader LEP Environment, overlapping in University-Business Innovation Interaction

We developed a pilot framework as a first step toward profiling and comparing the innovation performance of LEPs. It comprises 14-18 indicators (see Figure 3, and the annex for a list of all indicators in the pilot framework). For each innovation actor, we look at several elements that reflect their contribution to innovation. In a first step, we analyse what actors contribute to the innovation system in terms of funding (used to invest in infrastructure and new knowledge) and talent (the human capital that contributes to developing and sharing new and existing knowledge), as well as observable outputs of these contributions (measurable outputs of the innovation process). Amongst inputs we include, for example, the number of academic staff submitted to the Research Excellence Framework of 2014 to capture innovative research areas across disciplines. Amongst outputs, we include firms engaged in product or process innovation as an indication of strength in these areas across LEPs. Employing multiple indicators - e.g. size and number of Innovate UK grants - should enable analysts to capture a more nuanced picture of innovation ecosystems. Whilst higher values across indicators will tend to represent better innovation performance, the presence of internal trade-offs between indicators means that qualitative evidence will be required in order to develop a fuller picture. To trial the reduced pilot framework, we selected two LEPs that represent two of the Consortia chosen by BIS for the first phase of the Science and Innovation Audits: the Heart of the South West and Greater Manchester LEPs.
Figure 3: Table showing an abridged version of the pilot framework and sample indicators (Yellow = not available at LEP level/Green = available at LEP level)
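To make the structure of the pilot framework more concrete, the short sketch below shows one possible way of holding the actor/element structure and the LEP-level availability flags from Figure 3 for analysis. It is illustrative only: the indicator entries are examples drawn from the annex rather than the full list, and the representation is an assumption, not the Hub's actual implementation.

# Illustrative only: one way to hold the pilot framework in code.
# Actor names follow Figure 2; the indicator names and the 'lep_level'
# availability flags (the green/yellow coding in Figure 3) are examples.

from dataclasses import dataclass

@dataclass
class Indicator:
    name: str          # indicator title, e.g. from the annex
    actor: str         # "Research/HEI", "Business/Industry", "Broader LEP environment"
    element: str       # "funding", "talent" or "outputs"
    lep_level: bool    # True if the source already reports at LEP level

pilot_framework = [
    Indicator("Staff submitted to REF 2014 (3*/4*)", "Research/HEI", "talent", False),
    Indicator("Innovate UK investment by sector", "Business/Industry", "funding", True),
    Indicator("Firms engaged in product/process innovation (UKIS)",
              "Business/Industry", "outputs", False),
    Indicator("Superfast broadband availability", "Broader LEP environment", "funding", False),
]

# Indicators that still need aggregation before they can be compared at LEP level
needs_aggregation = [i.name for i in pilot_framework if not i.lep_level]
print(needs_aggregation)

A structure of this kind makes it straightforward to filter indicators by actor or element when assembling profiles, and to flag which sources still require the aggregation steps described in the next section.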
Data analytic challenges and initial restrictions on indicator use

Availability of data at the LEP level

A key challenge in profiling innovation performance for the Smart Specialisation Hub is a lack of available indicators at LEP level. To address this challenge we have undertaken the following approaches: 1. using indicators that are available at the NUTS2 (Nomenclature of Territorial Units for Statistics Level 2) regional level, since this geographical scale aligns most closely with LEP areas; 2. employing indicators at the Higher Education Institution level and aggregating values to the LEP level; 3. using data assigned with postcodes and aggregating to the LEP level (an illustrative sketch of this aggregation step follows at the end of this section).

Misalignment of sectoral classification systems

Since indicators come from diverse sources and are used for a variety of purposes, their data tend to be structured around varying sectoral classification systems. Whilst this makes the task of comparing across sectoral indicators problematic, data analysts for the Hub will seek to align classification systems across indicators with sectoral validity.

Backward-looking data

Due to inevitable lags in the release of data, innovation profiling will often be backward-looking and static, with the result that it may overlook emerging innovative technologies and industries. The use of up-to-date grant-funding data for innovative business projects (Innovate UK, 2012-14) provides a way of highlighting emergent innovation in specific technology sectors. Due also to lags in research impact, capturing university research funding could provide a useful indication of forthcoming innovation strengths in research.
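As an illustration of the aggregation approaches above (postcode-level or institution-level records rolled up to LEP geographies), a minimal sketch in Python/pandas follows. The tiny inline tables stand in for real extracts such as Innovate UK project data and a postcode-to-LEP lookup; all names and values are placeholders rather than the Hub's actual data.

# Minimal sketch of approach 3 above: rolling postcode-level records up to LEP level.
# Placeholder data only; in practice the project records and the postcode-to-LEP
# lookup would come from the relevant published sources.

import pandas as pd

projects = pd.DataFrame({             # one row per funded project
    "postcode":  ["EX1 1AA", "M1 2AB", "M4 5CD"],
    "sector":    ["space", "space", "health"],
    "grant_gbp": [250_000, 400_000, 150_000],
})
lookup = pd.DataFrame({                # postcode -> LEP assignment
    "postcode": ["EX1 1AA", "M1 2AB", "M4 5CD"],
    "lep_name": ["Heart of the South West", "Greater Manchester", "Greater Manchester"],
})

merged = projects.merge(lookup, on="postcode", how="left")
lep_totals = merged.groupby(["lep_name", "sector"], as_index=False)["grant_gbp"].sum()
print(lep_totals)      # grant funding aggregated to LEP level, by sector

Approach 2 (Higher Education Institution-level data) would follow the same pattern, with an institution-to-LEP assignment in place of the postcode lookup.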
Preliminary data analysis and visualisations

The Observatory function of the Hub will house a series of data visualisations designed to enable users to easily track innovation performance both within and across LEPs. These will include conventional bar charts, line graphs using normalised averages (a benchmarking tool), and ranking bar charts that enable detailed innovation analysis within and across LEPs. Mapping tools (see page 10 for details of Power Map) will also facilitate comparative analysis across LEPs.

Benchmarking innovation performance across LEPs will be an important function of the Hub's Observatory. This will enable users to evaluate where LEPs are positioned in relation to one another across indicators, and thus highlight potential strengths and weaknesses across their innovation ecosystems. Initial benchmarking is being trialled by the Hub's data analytics team using average values across categories or LEPs, and adopting a ranking format. For multi-variable indicators - i.e. those including university disciplines or sectoral classification - averages can be calculated for each category and more easily compared using a normalised average.

Conventional bar charts

Figure 4 shows Research and Development (R&D) expenditure across the four domains of Higher Education (HERD), businesses (BERD), government (GovERD), and private and non-profit organisations (PNPRD) for the two pilot LEPs in relation to the average for those groups. Whilst conventional bar charts have the benefit of facilitating in-depth profiling of a small number of LEPs (i.e. by showing raw data values and their position in relation to varying sectoral averages), this form of data visualisation makes the task of comparing performance across many LEPs challenging.

Normalised average and benchmarking

Normalising the averages across categories for a given indicator (see Figure 5) facilitates the task of comparing innovation performance across five to 15 LEPs. This process, in the case of R&D expenditure, places average values across expenditure types on a level setting by attributing them with a value of one. The deviation of expenditure for each expenditure type away from the cross-LEP average can then be observed, facilitating the task of visual data comparison across LEPs. Figure 5, for example, shows Research and Development (R&D) expenditure (2011) in relation to a normalised average, allowing for standardised comparison across the Heart of the South West and Greater Manchester LEPs.
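The normalisation described above amounts to dividing each LEP's value by the cross-LEP average for that expenditure type, so that the average sits at 1.0. A minimal sketch in Python/pandas is given below; the figures are placeholders rather than the values behind Figures 4 and 5.

# Illustrative sketch of the normalised-average benchmarking described above.
# Placeholder values only; in practice the denominator would be the average
# across the full set of LEPs being benchmarked, not just the two shown here.

import pandas as pd

rd_spend = pd.DataFrame(
    {"HERD": [300.0, 240.0], "BERD": [125.0, 240.0],
     "GovERD": [20.0, 10.0], "PNPRD": [25.0, 0.5]},
    index=["Heart of the South West", "Greater Manchester"],
)  # £s per person employed (FTE), by R&D expenditure type

cross_lep_average = rd_spend.mean(axis=0)    # average for each expenditure type
normalised = rd_spend / cross_lep_average    # 1.0 = average; >1 above, <1 below
print(normalised.round(2))

The normalised table can then be plotted as the kind of cross-LEP line graph shown in Figure 5, with every expenditure type expressed on the same scale.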
Figure 4: R&D expenditure from Higher Education (HERD), Business (BERD), Government (GovERD), and Private & Non-Profit organisations (PNPRD), in £s per person employed (FTE), by R&D expenditure type, for the Heart of the South West and Greater Manchester LEPs against the average across LEPs (BRES, 2011)

Figure 5: Cross-LEP comparison: R&D expenditure, 2011 (normalised average)
Whilst HERD expenditure can be seen to be above average for both LEPs, Greater Manchester spends proportionately more on Higher Education R&D activities, but less in relation to government and private/non-profit R&D.

Ranking and collaboration

As Figure 6 below demonstrates, ranking LEPs based on individual indicators can usefully show their relative positioning in relation to a sectoral average and can provide a snapshot view of a LEP's overall position for a given indicator or sector. The bar chart usefully illustrates the relative position of all LEPs to the average for Innovate UK investment in Space Programmes, which cannot be observed in the prior data visualisations. This chart suggests, for instance, that the Enterprise M3 LEP receives by far the most grant funding from Innovate UK for space programmes, highlighting its likely business innovation strength in this domain.

Figure 6: Innovate UK Investment in Space Programmes across LEPs, 2012-14
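The ranking format behind Figure 6 simply orders every LEP on a single indicator and notes each LEP's deviation from the cross-LEP average. A minimal, illustrative sketch follows; the LEP values and column names are placeholders, not the Innovate UK figures plotted in the chart.

# Illustrative ranking of LEPs on a single indicator, in the spirit of Figure 6.
# Placeholder values only.

import pandas as pd

lep_totals = pd.DataFrame({
    "lep_name": ["Enterprise M3", "London", "Oxfordshire", "Greater Manchester"],
    "space_grants_gbp": [9.0e6, 4.0e6, 2.5e6, 1.0e6],
})

average = lep_totals["space_grants_gbp"].mean()
ranked = (lep_totals.assign(vs_average=lep_totals["space_grants_gbp"] - average)
                    .sort_values("space_grants_gbp", ascending=False)
                    .reset_index(drop=True))
print(ranked)      # ranked bar chart data: position, value, deviation from average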
Furthermore, looking across ranked bar charts could indicate fruitful points of collaboration between LEPs based on respective research and business innovation strengths. Figure 7 illustrates the ranked volume of research outputs for Space (an Innovate UK priority area) across LEPs from 2012-14 (Scopus). The data suggests that London far exceeds other LEPs in space-related research outputs, and shows that the Greater Cambridge & Greater Peterborough and Oxfordshire LEPs are also strong in this research domain.

Figure 7: Volume of research outputs for Space across LEPs, 2012-14

Looking across Figures 6 and 7, we can observe that London has substantial research strength in space, whilst the Enterprise M3 LEP is ranked first for business investment in this sector. This points to the potential for fruitful collaboration between these two LEPs, most notably the opportunity available to Enterprise M3 to draw upon London's research prowess in this sector. The ability of the Hub to highlight key points of collaboration between research institutions and businesses, whilst employing multiple indicators, is one of its key assets.
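Read side by side, rankings such as those in Figures 6 and 7 can also be cross-referenced programmatically to flag complementary strengths. The sketch below is illustrative only: the inputs are placeholders, and the simple "research leader paired with investment leader" rule is an assumption for demonstration, not the Hub's matching method.

# Illustrative cross-reading of Figures 6 and 7: flag pairs of LEPs where one is
# comparatively strong on research outputs and the other on business investment
# for the same priority area. Placeholder values throughout.

import pandas as pd

space = pd.DataFrame({
    "lep_name":           ["London", "Enterprise M3", "Oxfordshire"],
    "research_outputs":   [950, 120, 400],        # e.g. publication counts
    "innovate_uk_grants": [4.0e6, 9.0e6, 2.5e6],
})

# Rank each LEP on both dimensions (1 = strongest)
space["research_rank"] = space["research_outputs"].rank(ascending=False)
space["investment_rank"] = space["innovate_uk_grants"].rank(ascending=False)

# Candidate collaborations: a research leader paired with an investment leader
research_leaders = space[space["research_rank"] == 1]["lep_name"]
investment_leaders = space[space["investment_rank"] == 1]["lep_name"]
pairs = [(r, i) for r in research_leaders for i in investment_leaders if r != i]
print(pairs)    # e.g. [('London', 'Enterprise M3')]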
Looking across different types of data visualisation (i.e. normalised average line graphs, ranked bar charts, and spider charts), Observatory users will be able to assess innovation performance across a range of actors and input/output indicators within a LEP, and compare innovation performance across multiple LEPs, highlighting areas of potential collaboration across LEPs' innovation ecosystems. Whilst data will take the form of both pre-described charts and a mapping tool, preliminary investigation has identified Excel's Power Map as a less resource-intensive and highly intuitive mapping software. Figure 8 below provides an example of how data could be embedded and visualised using such a mapping tool, showing clusters of research and investment activity areas in the space technology sector.

Figure 8: Mapping research and investment activity for Space

Next steps

The above data visualisations provide an insight into the development of an evidence base that will be used to inform better investment decisions in innovation. The Hub's data analytics team aims to complete the above pilot phase over the coming one to two months, comparing innovation assets and performance across the two pilot LEPs and additional LEPs involved in the first and second waves of the Science and Innovation Audits. Longer-term milestones include: 1. producing a functional analytic framework profiling and comparing 10 to 15 LEPs across a larger number of indicators; 2. the use of mapping tools to visualise data. Continued stakeholder engagement will seek to build in contextual information with expert views and local qualitative data sources. This will result in an analytic framework and subsequent data outputs that are sensitive to the needs and requirements of those eventually using and engaging with the Hub's Observatory function.
References

Department for Business, Innovation & Skills (October 2013). 'Encouraging a British Invention Revolution: Sir Andrew Witty's Review of Universities and Growth'. The National Archives.
Department for Business, Innovation & Skills (April 2015). 'Smart Specialisation in England: Submission to the European Commission'. The National Archives.
Department for Business, Innovation & Skills (July 2015). 'Mapping Local Comparative Advantages in Innovation. Framework and Indicators: Appendices'. The National Archives.
Department for Business, Innovation & Skills (November 2015). 'Science and Innovation Audits: Call for Expressions of Interest'. The National Archives.
Fagerberg, J., Mowery, D. C., & Nelson, R. (2006). The Oxford Handbook of Innovation. Oxford University Press.
Technopolis (April 2016). 'Science and Innovation Audits. (Draft) Inception Report for the Consortia'. Version 1.3. Technopolis Group.

Annex of indicators in pilot framework

- Higher Education (HERD), Business (BERD), Government (GovERD), and Private Non-Profit (PNPRD) expenditure on R&D, Eurostat, 2011.
- Number of staff submitted to REF 2014 (at 3*/4*) across disciplines, HESA, 2013/14.
- Mapping of publication output to the priorities of Innovate UK, Scopus, 2012-14.
- Mapping of publication output to the 8 Great Technologies, Scopus, 2012-14.
- Investment in innovation by sector, Innovate UK/BIS, 2010-14.
- Number employed (FTE) by technology sector, BRES, 2013.
- Number of firms engaged in product or process innovation, UKIS, 2008-10.
- Number of inventors on patents by technology sector, BIS-WIPO, 2011-14.
- Income for knowledge exchange between HEIs and SMEs (consultancy and contract research), HEBCI, 2010/11 - 2012/13.
- Income for knowledge exchange between HEIs and large businesses (consultancy and contract research), HEBCI, 2010/11 - 2012/13.
- Employed first-degree graduates across innovation-active sectors, DLHE, 2013/14.
- Number of graduate start-ups, HESA, 2009/10.
- Average travel-to-work times, Halifax Quality of Life Survey, 2014.
- Superfast broadband availability, Labour Force Survey, 2013.
- Average annual gross full-time earnings, Annual Survey of Hours and Earnings, 2014.

Acronyms

BERD: Business Enterprise Research and Development
BRES: Business Register and Employment Survey
DLHE: Destination of Leavers from Higher Education (Survey)
GovERD: Government Research and Development
HEBCI: Higher Education Business and Community Interaction (Survey)
HERD: Higher Education Research and Development
PNPRD: Private Non-Profit Research and Development
UKIS: UK Innovation Survey

Appendix: List of figures

Figure 1: Map showing the 39 LEPs in England (Source: BIS, contains Ordnance Survey data, 2013)
Figure 2: Venn diagram of innovation 'actors' and their interactions
Figure 3: Table showing an abridged version of the pilot framework and sample indicators
Figure 4: R&D expenditure from HERD, BERD, GovERD & PNPRD, 2011
Figure 5: Cross-LEP comparison: R&D expenditure, 2011 (normalised average)
Figure 6: Innovate UK Investment in Space Programmes across LEPs, 2012-14
Figure 7: Volume of research output for Space across LEPs, 2012-14
Figure 8: Mapping research and investment activity for Space
Compiled by Dr Etienne Bailey, Dr Rosa Fernandez, Andrew Basu-McGowan and Kim MacLean
Smart Specialisation Hub

National Centre for Universities and Business
Studio 10-11, Tiger House
Burton Street
London WC1H 9BY

KTN Ltd
Suite 220, Business Design Centre
52 Upper Street
London N1 0QH