DELIVERABLE

Project Acronym: APOLLON
Grant Agreement number: 250516
Project Title: Advanced Pilots of Living Labs Operating in Networks

Deliverable 1.3: Framework for APOLLON Evaluation and Impact Assessment including KPI Definition and Measurement
Revision: Final Version

Authors: Anna Ståhlbröst (LTU), Petra Turkama (Aalto University), Bram Lievens (IBBT), Hendrik Hielkema (Aalto University), Petra Hochstein (SAP), Christian Merz (SAP), Claudio Vandi (UP8)

Project co-funded by the European Commission within the ICT Policy Support Programme.
Dissemination Level: Public (P)
Revision History

Revision | Date | Author | Organisation | Description
V0.1 | 11.08.2010 | Ståhlbröst | LTU/CDT | –
V0.2 | 12.08.2010 | Turkama | Aalto | Additions to chapters 2 and 5
V0.3 | 02.10.2010 | Ståhlbröst | LTU/CDT | Change in methodology and impact assessment framework
V0.4 | 10.10.2010 | Ståhlbröst | LTU/CDT | Changes in framework
V0.5 | 28.10.2010 | Ståhlbröst | LTU/CDT | Finalising the deliverable
The information in this document is provided as is and no guarantee or warranty is given that the information is fit for any particular purpose. The user thereof uses the information at its sole risk and liability.
Statement of originality: This deliverable contains original unpublished work except where clearly indicated otherwise. Acknowledgement of previously published material and of the work of others has been made through appropriate citation, quotation or both.
Table of Contents

1. Summary
2. Introduction
   2.1 Objective and Aim
   2.2 Theoretical Framework for Evaluation
       2.2.2 Evaluation Approaches
   2.3 Evaluating Methodology
       2.3.2 Four Common Deficiencies in Methodologies
   2.4 Key Performance Indicators
3. APOLLON Criteria for the Evaluation Framework
   3.1 Summary of the Base-Line Investigation among APOLLON Partners
   3.2 Summary of Requirements from APOLLON Deliverable x.1
4. APOLLON Evaluation Processes
   4.1 APOLLON Methodology Evaluation Process
       4.1.2 Methodology Evaluation Process
   4.2 Evaluation of the Thematic Experiments
5. Research Framework for Evaluation of the Cross-Border Networking Process within the Thematic Experiments
   5.1 Evaluation Process of the Thematic Experiments
6. Evaluation Template for the APOLLON Cross-Border Methodology
   6.1 APOLLON Methodology Evaluation Framework Template
7. Template for Evaluating Cross-Border Networking Experiments
8. Template for APOLLON Experiment-Specific KPIs
References
1. Summary

The aim of this deliverable is to provide an evaluation and impact assessment framework that allows assessment of the APOLLON methodology and tool set, as well as identification of the added value of cross-border Living Lab networking through specified key performance indicators.

This deliverable presents the theoretical framework that has guided the development of the evaluation framework. In addition, it presents the investigation performed among the APOLLON partners at the beginning of the project, which served to identify relevant measures of performance among the involved project partners. We have also used the D x.1 deliverables from the other work-packages as a means to identify relevant performance indicators for the experiments in the thematic areas. These sources of information, together with the theoretical framework, formed the basis of this deliverable and of the design of both the evaluation process and the evaluation framework.

The process of evaluating the APOLLON methodology is described: the liaison person from WP1 collaboratively and iteratively evaluates the different stages of the methodology. The process of evaluating the experiments, which is carried out in the different work-packages, is based on self-assessment, where the leaders of task x.4 in each work-package apply the framework and adapt it to their context. This report also contains the research framework, which is applied to the experiments in the work-packages to help them design and assess their experiments in a considered and researchable manner. Finally, this deliverable contains two evaluation framework templates: first, the framework for evaluating each phase of the APOLLON methodology, and second, the framework for evaluating the added value and impact of the experiments for relevant stakeholders.
2. Introduction

The evaluation and impact assessment framework developed in the APOLLON project aims to monitor, analyse and assess the APOLLON methodology as well as the added value of cross-border Living Lab networking. The aim of this deliverable is to provide an evaluation and impact assessment framework that allows assessment of the APOLLON methodology and tool set, as well as identification of the added value of cross-border Living Lab networking. In this framework, key performance indicators are defined, which will be measured in the experiments in WP 2, 3, 4 & 5 in task x.4. This evaluation framework will therefore assess two different processes: (1) the APOLLON methodology supporting the cross-border networking, and (2) the added value of the cross-border Living Lab networking.

The developed APOLLON methodology will provide a framework for engaging, empowering and mobilizing self-organizing individuals within actor networks. The proposed cross-industry infrastructure provides new opportunities and insights for individuals, for relationships between the organization and its members, and for actors within and across organizations. The individual steps of the methodology will be evaluated continuously, at three-month intervals during the project, in close collaboration with the other work-packages.
The cross-border networking process will focus on technology and knowledge transfer activities, which will be evaluated in a formative manner during the project's lifecycle. At the beginning of the project, the different work-packages identify relevant key performance indicators and measures for their thematic experiment. This input then forms the basis of the evaluation framework for these experiments. During the project, the ongoing experiments in the thematic domains will be assessed, and the evaluation framework will be adjusted accordingly to ensure its relevance and usage in the project. The evaluation activities will be carried out continuously with the aim of interpreting the cross-border networking activities from different perspectives. In this process, the applied methodology supporting cross-border networking will be evaluated accordingly. To support these assessment activities within each work-package, the framework developed within this deliverable will be applied. This framework should be viewed as a work in progress, where experiences from the formative evaluations are incorporated into an updated version of the framework to ensure that it remains useful to the project's activities, both in the vertical domains and in WP1.
2.1 Objective and Aim

The objective of this deliverable is twofold: one part is to evaluate the APOLLON cross-border networking methodology, and the other is to identify the added value of Living Lab cross-border networking experiments. That is:

1. To monitor, analyse and assess the vertical experiments in relation to the general objectives and the overall APOLLON methodology
2. To monitor, analyse and assess the cross-border networking impact on its relevant stakeholders

The cross-border networking process in the different experimental domains will be supported and assessed by means of the developed cross-border networking methodology. This assessment will be supported by continuous interaction between WP1 and the vertical domains regarding the implemented networking methodology. The specific objectives of the evaluation and assessment activity can be stated as follows:

• Observe and understand the progress and impact of the APOLLON methodology among its stakeholders, and understand the determining factors, challenges and processes
• Observe and interpret the process by which the vertical domains' activities are implemented in the APOLLON methodology
• Evaluate the different patterns within the vertical domains and how these contribute to the creation of a validated cross-border methodology
• Assess the added value of cross-border networking among the relevant stakeholders, such as Living Labs, SMEs, local authorities, end-users and large enterprises

This will be performed by the liaison persons (explained in more detail in section 5) from WP1, who will collaborate closely with the vertical experiments. These persons will facilitate the usage of the methodology by explaining and suggesting
suitable tools or templates from the methodology in accordance with the vertical experiment's phase. For example, if an experiment is planning to write agreements, the liaison person should inform the person responsible for the thematic domain experiment that they can find support for this in the methodology checklists. The relevance and usage of the checklist will then be assessed in later interactions. To be able to formatively assess these processes and their impact, WP1 strives to build an understanding of cross-border Living Lab networking along the following overarching themes:

• Interaction process (between the stakeholders in different countries)
• Stakeholder needs (which needs are related to cross-border networking among its stakeholders)
• Activities (which key activities are performed during the networking)
• Tools (what kind of tools are needed and used to support the process)
• Results and effects of the cross-border collaboration (what happened during the process and what was its impact)
• Critical success factors (factors ensuring the sustainability of the collaboration)
2.2 Theoretical Framework for Evaluation

To really grasp why the assessment framework is designed in a certain manner, and to be able to apply it in a useful way, it is important to understand the basics of evaluation. Evaluation is a process aiming to investigate the significance, value, or quality of something, based on a careful study of both its good and bad features. In many everyday situations, we all make judgments about different things, actions, and events happening around us, without reflecting on whether this should be called evaluation. Usually, evaluations are related to something being valued in a systematic and well-considered manner. Hence, evaluation becomes a rational process, where methods are followed as a means to gain control over the different steps in the evaluation process. The main aim of an evaluation is to express a value judgement about the thing being evaluated, and the evaluation should critically scrutinize that object. Thus, the mission is not only to describe, map out, or measure an attitude. Instead, the endeavour should be to gain deeper insights and to question what is taken for granted (Lundahl & Öquist, 2002). An evaluation methodology must therefore be chosen that reflects the views of all involved stakeholders.

2.2.2 Evaluation Approaches

Evaluations are performed in a number of areas, such as evaluations of educational programs, organizational changes, project performance, or technology, and there are at least three different evaluation approaches: (1) objective and result evaluation, (2) process evaluation, and (3) interactive evaluation. The first approach, objective and result evaluation, dominated in the 1950s and 1960s. In these evaluations, the evaluator measured and described the results in quantitative terms, and it was not the evaluator's job to value differences (Guba & Lincoln, 1989; Karlsson, 1999).
The second approach, process evaluation, was common between the 1970s and the 1980s. Within this approach, the interest was aimed at how the results had been reached. The evaluations were not only focused on describing something; the expectation was to make a qualitative judgment about the thing being evaluated. The third approach, interactive evaluation, developed during the 1980s and 1990s and focused on participation among those influenced by the evaluation. The basic thinking within this approach was that participation by different stakeholders increased the relevance of the evaluation questions and results; thereby, the stakeholders' influence was strengthened. In this project, we apply the interactive evaluation approach, because it matches both the process we have designed and our perspective on evaluations.

Evaluations can be used in many different ways: instrumentally, where the evaluation results are used to influence people's mind-sets or actions; for long-term or short-term effects; as guidance for choices; and so forth (Karlsson, 1999). Either way, the evaluation should elucidate wholeness and relations, not focus on isolated issues, and it is the evaluator's duty to make sure that different interests are represented in a reasonable and balanced manner (Lundahl & Öquist, 2002).

Generally, in any evaluation, it is important to determine when the evaluation will be carried out and why; that is, to understand whether it is a formative or a summative evaluation. A formative evaluation is performed with the intention to change, or improve, something, such as a project (Benyon et al., 2005; Karlsson, 1999; Lewis, 2001). A formative approach to evaluation requires communication between stakeholders and the evaluator, because its goal is to change something and to identify learning possibilities in the situation. In contrast, a summative evaluation is carried out in order to determine the impact of the evaluand (Benyon et al., 2005; Karlsson, 1999; Newman & Lamming, 1995). For example, a summative evaluation could study the impact of a project such as APOLLON at the end of the project. In either case, when deciding the when and why of the evaluation, communication between stakeholders and the evaluator is vital. Furthermore, those working with evaluations should understand how things are related, and realize that how things relate to each other is influenced by circumstances in the evaluation's context (Córdoba & Robson, 2003).

In order to carry out an evaluation, it is important to know its purpose. This might seem obvious, but it is not always apparent when an evaluation is being planned. When evaluation researchers discuss why an evaluation is performed, they usually distinguish between the aim, purpose, and function of the evaluation. The aim of an evaluation is to produce a judgment that establishes the value of the object being evaluated, that is, the evaluand, which in this project is the APOLLON methodology for cross-border networking and the cross-border networking experiments in the vertical domains. These judgements arise from interpretations, descriptions, and valuation of the evaluand. The purpose of an evaluation is its intended usage. One purpose can be the opportunity to control and judge the effectiveness and quality of an organisation.
Another purpose of an evaluation could be to gain support for decision-making, and a third example could be to supply decision-makers with arguments for prioritizing. In this project, the purpose of the evaluation is to continuously refine the
APOLLON methodology and to transfer knowledge from one experiment to another. The purpose of an evaluation may be separate from the actual usage or function the evaluation has in practice; the function does not have to be the same as the declared purpose (Karlsson, 1999).

Many traditional approaches to evaluation have failed to recognize the reactive nature of evaluation. Just as performance factors reward safe, short-term activities, evaluations based on mean scores, instead of on the recognition of a few but extraordinary accomplishments, work against innovation and against those aiming to explore the unknown. Instead, these approaches reward mediocrity. Failures are usually viewed and treated negatively, with negative consequences for those who have failed, even if the attempted innovation was very ambitious. A project claiming to be innovative and to have a high level of "success" should be viewed with scepticism, because this probably means that what is being attempted is not very ambitious (Perrin, 2002). Hence, a methodological approach to the evaluation of innovations should be able to:
• get at the exceptions, including unintended consequences, given that a quantitative research approach is not relevant and will hide true achievements;
• provide an understanding of the complex processes involved, as well as help identify learning and implications from successes and failures;
• be flexible enough to be open to change and unexpected findings, which, especially regarding innovations, can represent the key outcomes (Perrin, 2002).
For these kinds of questions, qualitative methods are usually most suitable, possibly in combination with other approaches (Patton, 1987, 1990).
2.3 Evaluating Methodology

One part of this deliverable is the framework for the APOLLON methodology evaluation. A methodology is a set of statements or a formal specification that is appropriate for the applied context and culture, clearly documented and rigorously followed. Users must be involved in the specification and in the design, development and implementation of the methodology, and must feel that the process is controllable and predictable. In the APOLLON project, we aim to continuously involve all work-packages in the process of developing the methodology. A methodology must integrate all stakeholders' strategic goals with the practical realities of the available information technology and business environment. This means that a methodology cannot be a static document. Instead, it must provide an adaptable framework for planning, specifying, building, and implementing practical information systems. We strive to accomplish this by using the requirements from the x.1 deliverables as well as the base-line questionnaire as a basis for the design of the APOLLON methodology.

2.3.2 Four Common Deficiencies in Methodologies

1. Lack of structure: The material is so disorganized that readers cannot find what they are looking for. To facilitate the usage of the developed methodology in the APOLLON project, we will structure the methodology in accordance with the project plan and the ongoing activities within the project.
2. Fragmentation: The material the project participants need is scattered among multiple manuals and other documents that have no clear relationship to one another. Fragmentation arises when an organization commits to a new methodology component without considering its impact on other, already established methodology components, or when responsibilities for methodology support are split among different parts of an organization. In the APOLLON project, this is handled by having one entry point to the methodology and by having clear descriptions of where the information can be accessed.

3. Structural incompleteness: There is no natural or obvious place to put certain information needed by the professional staff. Consequently, some important information either never gets written down or is issued in separate memos that are soon forgotten. Structural incompleteness occurs not only as a by-product of a lack of structure (point 1 above), but also whenever the topics in the table of contents are based more on today's specific tools and techniques than on relatively stable concepts. In the APOLLON project, this is handled by continuously validating the methodology in collaboration between WP1 and the other work-packages. The aim here is to re-design the methodology in accordance with the thematic domains' needs and requirements.

4. Obsolescence: Most of the methodology material was developed years earlier and no longer reflects important aspects of the hardware, the system software, or the methods and tools in actual use. This is particularly relevant for a project like APOLLON, in which the methodology is at the same time a prerequisite for collaboration (established by WP1) and one of the main results of the project (critical analysis of best-case methodology use in the verticals and drafting of the APOLLON methodology).

These four shortcomings severely impair the usability of methodology documentation, its acceptance by the users, and its value for the goal to be achieved. Hence, we strive to address these shortcomings from the beginning of the design of the APOLLON methodology to ensure its usability.
2.4 Key Performance Indicators

Key performance indicators (KPIs) are quantifiable measures that mirror the critical success factors of a project. These KPIs should be decided on beforehand, and they give a snapshot view of the status of the project. It is therefore important to relate the KPIs to the project goal. The KPIs thus function as a measure of the progress of the project towards the overarching goals, not of the fulfilment of the goals as such. In the APOLLON project, the goal is to share and harmonize Living Lab approaches and platforms between networks of exemplary European Living Labs. Hence, the KPIs of this project focus on measuring the impact of the cross-border collaboration experiments in relation to methods, approaches and tools, in order to illuminate and measure the degree of goal fulfilment.
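To make the distinction between goal fulfilment and progress concrete, the sketch below shows one minimal way a KPI could be represented and tracked. It is an illustration only: the names, baseline and target figures are hypothetical and not taken from the APOLLON project.

```python
from dataclasses import dataclass

@dataclass
class KPI:
    """A quantifiable measure related to a project goal."""
    name: str
    baseline: float  # value at the start of the project
    target: float    # value agreed on beforehand as indicating goal fulfilment
    current: float   # latest measured value

    def progress(self) -> float:
        """Fraction of the distance from baseline to target covered so far."""
        if self.target == self.baseline:
            return 1.0
        return (self.current - self.baseline) / (self.target - self.baseline)

# Hypothetical KPI: number of cross-border technology transfers in an experiment.
transfers = KPI(name="technology transfers", baseline=0, target=4, current=1)
print(f"{transfers.name}: {transfers.progress():.0%} of target reached")  # 25% of target reached
```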
3. APOLLON Criteria for the Evaluation Framework

In this section we present the data that form the background for the evaluation and assessment framework. This background data includes a summary of the x.1 deliverables, a base-line investigation, interviews with SMEs and Living Labs, and the description of work. To ensure that the evaluation and assessment framework developed within this task is of value for the vertical domains and the methodology development, the framework is built on the data collected so far. This means that we have used, as a basis for our framework, the evaluation criteria expressed in the base-line investigation, deliverable x.1 (Identification of Requirements) from the different work-packages, the results of the work-packages' use of the research framework in the descriptions of their experiments, interviews with SMEs and Living Labs, and input from the project consortium. This material has been analysed with the aim of deriving evaluation and impact assessment criteria for the evaluation framework.
3.1 Summary of the Base-Line Investigation among APOLLON Partners

The base-line investigation was a web-based questionnaire of 43 questions divided into five main categories with subcategories: General, Connect, Set Boundaries & Engage, Support & Govern, and Manage & Track. The questionnaire was pre-tested by one of the partners. After revision it was put online and an invitation was sent to the APOLLON partners. The questionnaire was answered by 16 of the partners, of which 6 are Living Labs, 7 are SMEs and 3 are other organisations.

The Living Labs that answered the questionnaire indicated that they mainly work locally (50 %) or in their home country (50 %), and that public and regional authorities are very significant for their operations. The Living Labs have SMEs and large corporations as their important customers. The SMEs work in various fields, but software development stands out as one of the major areas of development. The SMEs have the self-declared task of understanding their customers better so that their products are better understood. For the SMEs, the most important contributions to the networking activities within the projects they completed are the openness and knowledge they provide to the network. Openness in working is also mentioned as important in the list of essential competencies (questions 14 to 16). The most significant support expected in terms of networking activities is guidance and information delivery for one respondent, while the other SMEs did not indicate specific wishes. The Living Labs are more demanding in terms of methodology. They request, in different words, a description of the different methods and guidance on when they should be used (questions 17 & 18). This is reinforced by the answers to question 25, which also indicate the need for a unified approach.

All respondents indicate that they are well able to work with technical tools. The level of importance differs little between the different groups, and the sample size is too small to find statistically significant differences. The tools that are considered very important are email, project portals, and video and phone conferencing. Other tools, like chat software, groupware such as Lotus Notes, or wiki pages, are important to a lesser extent.
The use of IPR in projects is of considerable importance for the participants. The answers indicate that there is a need for agreements before the project starts; this need is equally strong for the SMEs and the Living Labs. When looking at expectations, there is some difference between the Living Labs and the SMEs. The former show interest in the development of cooperation and partnership forming, whilst the latter focus more on new ideas for business and on markets opening up abroad (question 38). These expectations are in line with what would be expected, and the evaluation of the projects should focus on these items.
3.2 Summary of Requirements from APOLLON Deliverable x.1

In order to facilitate the usage and implementation of the evaluation framework, we aimed to link it to the ongoing activities within the different experiments in the WPs. To accomplish this, we based our work on the x.1 deliverables, "Identification of Requirements", in which the different work-packages presented lists of requirements for the experiments to be transferred between the different Living Labs. These deliverables have been analysed, and the relevant requirements are summarized below.
Health
- Approach: Common eco-system model
- Research focus: To what extent is a trans-national innovation system able to stimulate the adaptation of innovations successful in one country to another country?
- Research question: How can we transfer a contextualised project into another cross-border project, and what issues are related to that?
- Method: Compare use of the platform in different Living Labs
- Expected benefits: Market opportunities for SMEs by doing this collaboration, in terms of creating sustainable and feasible solutions for a broadly defined challenge
- Data collection: Monitoring, interviews, questionnaires
- Result categories: Successful implementation; user experience; connecting with local stakeholders

Energy
- Approach: A common benchmark model
- Research focus: Difficulties faced with product integration
- Research question: User co-innovation
- Method: Results on the impact of regulatory environment, climate, culture and behaviour, compared between the different Living Labs
- Expected benefits: Potential of cross-border transfer
- Data collection: Study of user behaviour change and mechanisms
- Result categories: Are results from one country coherent with results from others; effectiveness of the methods used; the interest of the partners

eManufacturing
- Approach: Common platform
- Research focus: Evaluate new ways of collaborating with partners
- Research question: Which culture-specific issues are problematic when extending innovative applications to broader contexts?
- Method: Networking between Living Labs
- Expected benefits: Market opportunities
- Data collection: Feedback on platform deployment and integration services
- Result categories: Can a shared set of tools and a common methodology extend the validity of national tests?

eParticipation
- Approach: A technology integration framework
- Research focus: What is needed for engaging users to participate; cultural differences; difficulties with users' culture and their surrounding environment
- Research question: The applications' ability to answer users'/citizens' needs
- Method: Integration of different solutions and its impact on market fragmentation
- Data collection: Lingual and cultural misunderstandings
- Result categories: Finance; domain/areas; ways of using Living Lab networks; tools; skills; enhancement; efforts; support; dissemination activities

Table 1. Evaluation criteria expressed by the different thematic domains (summary of the x.1 deliverables).
4. APOLLON Evaluation Processes

As mentioned before, the aim of this deliverable is to provide evaluation frameworks both for the APOLLON methodology and for the cross-border networking process within the experiments in the thematic domains. Hence, this section presents the evaluation process for each of these evaluations. In accordance with the Living Lab approach as such, both processes build on user participation and a bottom-up approach. This means that the evaluations will be carried out in an interactive and iterative manner. A description of these two evaluation processes follows.
4.1 APOLLON Methodology Evaluation Process

In the APOLLON project, the cross-border networking process in the different experimental domains is supported by the developed APOLLON cross-border networking methodology. This methodology is developed continuously during the project; hence, its different phases will be evaluated in an interactive and iterative manner. This assessment will be supported by continuous interaction between WP1
and the vertical domains regarding the implemented methodology for cross-border networking. The specific objectives of the evaluation and assessment activity can be stated as follows:

• Observe and understand the progress and impacts of the APOLLON methodology among its stakeholders, and understand the determining factors, challenges and processes
• Observe and interpret the process by which the vertical domains' activities are implemented in the APOLLON methodology
• Evaluate the different patterns within the vertical domains and how these contribute to the creation of a validated cross-border methodology
• Assess the added value of cross-border networking among the relevant stakeholders: Living Labs, SMEs and large enterprises

This will be performed by the liaison persons (explained in more detail in section 5) from WP1, who collaborate closely with the vertical experiments. These persons will facilitate the usage of the methodology by explaining and suggesting suitable elements of the methodology in accordance with the vertical experiment's phase. For example, if an experiment is planning to write agreements, the liaison person should inform the vertical that they can find support for this in the methodology checklists. The relevance and usage of the checklist will then be assessed in later interactions. To be able to formatively assess these processes and their impact, WP1 strives to build an understanding of cross-border Living Lab networking along the following overarching themes:

• Interaction process (between Living Labs, SMEs and large enterprises in different countries)
• Stakeholder needs (which needs are related to cross-border networking among the relevant stakeholders)
• Activities (which key activities are performed during the networking)
• Tools (what kind of tools are needed and used to support the process)
• Results and effects of the cross-border collaboration (what happened during the process and what was its impact)
• Critical success factors (factors ensuring the sustainability of the collaboration)
This evaluation addresses the methodology being developed in the APOLLON project, whose aim is to support the vertical experiments' cross-border networking activities, and covers the identification of interaction processes, activities and tools. The evaluation will gather information concerning:

• Experienced benefits and challenges of cross-border activities from an overarching perspective
• Applied methodology/approach: which parts of the methodology have been used, the applicability of the methodology, and its support for the process of cross-border networking
• Effectiveness of the applied methodology and its support for the cross-border networking activities
• Efficiency of the usage of the methodology
• Accumulated process value and learning

4.1.2 Methodology Evaluation Process

The evaluation process of the methodology is designed as follows:

1. Based on the ongoing activities in work-packages 2-5, the liaison person informs the work-packages about the methodology and its support for their current tasks
2. The liaison person participates in the work-package meetings and listens to how they have used the methodology
3. The liaison person asks questions to make sure that the stages of the cross-border collaboration methodology are discussed and validated
4. The answers to the questions are gathered in the template suggested below
5. In accordance with the validation results, the methodology is adjusted
6. At the end of the project, the methodology as a whole is evaluated in discussions between the experiment leaders and WP1; this is a task for which the liaison person is responsible
7. The results from the methodology evaluation are gathered in an evaluation report

The specific template is found in section 6 below.
4.2 Evaluation of the Thematic Experiments

Task 1.3 is not only focused on providing a framework for evaluating the methodology, but also on designing an evaluation framework for the thematic domains' experiments, in order to grasp their activities and the results of those activities. As mentioned previously, we have chosen a stakeholder-driven approach focusing on empowering the stakeholders involved in the cross-border collaboration processes. With this approach, we will form a methodology that stems from the learning and experiences of the experiments in the APOLLON project. Based on the description of work, the evaluation aims to assess the stakeholders' different perspectives on working in cross-border networking activities supported by Living Labs.
5. Research Framework for Evaluation of the Cross-Border Networking Process within the Thematic Experiments

To facilitate the evaluation of the different experiments, a research framework has been developed to support the planning of the experiments. Therefore, in order to effectively
apply the framework, we need to review the APOLLON research framework and align the evaluation and data collection processes. The APOLLON research framework is applied to the thematic experiments by answering the questions in each of the following classes; hence, this research framework is used as a support for the vertical work-packages when they plan and design their experiments.

Constructs
- Build: What are the variables that you study?
- Evaluate: What are the elements that you measure?
- Justify: How do you decide best practices across the experiments?
- Generalize: How do you filter pilot-specific elements out?

Model
- Build: What are the basic assumptions, causalities and outcomes that you perceive?
- Evaluate: What measures do you use to evaluate the validity of the assumptions?
- Justify: What are the success criteria that you use?
- Generalize: How do you assess the wider applicability of the model?

Method
- Build: What is the process for validating the assumptions?
- Evaluate: How do you evaluate and adjust the validation process?
- Justify: How do you justify the use of the selected methods?
- Generalize: How do you ensure the scalability and wider applicability of the methods?

Installation
- Build: Who are the stakeholders in your experiment?
- Evaluate: How do you evaluate the added value for each stakeholder?
- Justify: How do you justify the selected collaboration model?
- Generalize: How do you compile recommendations for sustainability?

Figure 2. Thematic experiments' focus and content communicated in categories of 'activities' (Build, Evaluate, Justify, Generalize) and 'outputs' (Constructs, Model, Method, Installation).
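Purely as an illustration of how a work-package might walk through this matrix when planning an experiment, the sketch below encodes the questions as a nested data structure and iterates over it to produce a planning checklist. The representation itself is not prescribed by this deliverable; only the questions come from Figure 2.

```python
# Illustrative sketch: the research framework of Figure 2 as a nested mapping
# (output class -> activity -> question). Iterating over it yields a checklist
# that an experiment team could fill in when planning its pilot.
FRAMEWORK = {
    "Constructs": {
        "Build": "What are the variables that you study?",
        "Evaluate": "What are the elements that you measure?",
        "Justify": "How do you decide best practices across the experiments?",
        "Generalize": "How do you filter pilot-specific elements out?",
    },
    "Installation": {
        "Build": "Who are the stakeholders in your experiment?",
        "Evaluate": "How do you evaluate the added value for each stakeholder?",
        "Justify": "How do you justify the selected collaboration model?",
        "Generalize": "How do you compile recommendations for sustainability?",
    },
    # The Model and Method rows follow the same pattern.
}

for output_class, activities in FRAMEWORK.items():
    for activity, question in activities.items():
        print(f"[{output_class} / {activity}] {question}")
```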
By applying this framework, the work-packages are supported in their process of defining the measures and key performance indicators of each experiment. This information will then be used as input to the evaluation framework of the thematic experiments. The answers will reflect the variables that are measured at project level by tasks x.4 in the thematic experiments. The collected data will be fed back into the development of the APOLLON evaluation framework and contribute to the creation of the final version of the document. In this formative process, we need to be in regular contact with the vertical experiments: we need their contributions and experiences in order to provide them with usable advice on collaboration practices within their Living Lab network. Initially, we propose the following practices:

1. Requirement collection from the thematic experiments
2. Dedicated Work Package 1 members as liaisons to the vertical experiments
3. Regular collaboration and formal meetings for iterative concept validation
4. A wiki as a platform to share insights and practices with the vertical experiments
5. SME engagement process

The requirement collection was done in the form of the base-line questionnaire in May 2010, discussions with WP representatives, and an analysis of the x.1 deliverables. A liaison person has been nominated for each WP:

- WP2 E-Health: Bram Lievens, Hendrik Hielkema
- WP3 Energy Efficiency: Anna Ståhlbröst
- WP4 E-Manufacturing: Christian Merz
- WP5 E-Participation: Claudio Vandi
We propose regular collaboration and formal meetings for iterative concept validation every three months. Responsibility for calling these meetings lies with WP1, and all WP leaders will need to commit to participating in the process, either themselves or through a nominated representative. This process kicks off at the APOLLON general assembly on September 30th and continues at three-month intervals in December 2010, March 2011, June 2011, September 2011 and November 2011. The APOLLON wiki at mybbt will be used to disseminate the latest results; this platform is open for comments and contributions at any time. Other channels are meetings, presentations and emails. WP1 will also take a more active role in large APOLLON events as findings accumulate. For more information, please refer to D1.2, of which this is a short concluding summary.
5.1 Evaluation Process of the Thematic Experiments

The evaluation process of the thematic experiments is designed as follows:

1. The liaison person collaborates with the vertical experiments and helps them plan their experiments according to the research framework presented in section 5 above
2. Based on their research framework, the evaluation framework for the vertical experiments is applied and re-designed to fit the experiment context
3. The design of the evaluation framework is discussed and elaborated in collaboration with the task leader of task x.4 in the vertical experiments
4. After each experiment has been carried out, it is evaluated by the partners in task x.4 using the evaluation framework
5. The evaluation framework is continuously updated based on the results from its usage, to make sure it answers the experiments' requirements
6. After all experiments have been evaluated, the results are cross-case analysed and an evaluation report is authored
The cross-border networking process in the different experimental domains will be supported and monitored by means of the developed cross-border networking methodology. This monitoring will be supported by continuous evaluation of the implemented networking methodology, based on the evaluation framework presented in the following sections of this deliverable.
6. Evaluation Template for the APOLLON Cross-Border Methodology

This section provides the templates for data collection within each experiment in the thematic domains. For each vertical experiment, the evaluation framework will be applied by the liaison person in accordance with the experiment's ongoing activities. Hence, the activities will be matched to the phases of the methodology, which are: Connect, Set Boundaries & Engage, Support & Govern, and Manage & Track. This template should be viewed as a self-assessment framework, where the question areas posed below are implemented as an evaluation carried out by the liaison person in the thematic experiments in the different work-packages. The aim of this template is to facilitate knowledge sharing across the vertical domains and to support the development of the methodology through a bottom-up approach.
6.1 APOLLON Methodology Evaluation Framework Template

In this framework, people involved in the thematic experiments should contribute their experiences from their experiments in relation to the question areas suggested below. For example, if the experiment is focused on supporting and governing the cross-border process, the template for these activities should be filled in collaboratively by the experiment leader and the liaison person. The more specific questions within parentheses should be considered as guidance on what kind of answer is sought; they do not have to be answered one by one. The answers to the questions are filled in continuously. The first part is more overarching and should be filled out in all evaluation activities, to describe the context in which the evaluation is performed.
Methodology Evaluation

In this section the aim is to evaluate the APOLLON cross-border methodology. This evaluation will be carried out continuously by the liaison person, in collaboration and dialogue between the different work-packages and WP1.

Work-package number:

Experiment name (scope):

Introduction
Describe the objective of the experiments:
• who was involved
• what kind of technology, knowledge, etc. was transferred in the cross-border networking process

Overarching activities and experiences
Describe the process of the collaboration in the experiment: what has been done, how you communicated, and who was in charge of the technology/knowledge transfer, covering:
• partners
• roles

What experiences were gained from involving partners from different countries and organisations? (Describe possible lessons learned from sharing knowledge and technology across borders. What kind of similarities, differences, problems, opportunities, strengths, weaknesses, etc. have been experienced during the collaboration?)
Assessing the Connect Phase

In this section, the focus is to evaluate the Connect phase. In this phase, activities such as setting up the Living Lab network, identifying stakeholders, creating work plans and a vision, determining the scope of the collaboration and the project owner, defining technical platforms, and arranging funding and contracts are carried out. Note that not all questions can be answered purely in relation to usage of the methodology; focus on the actual activities carried out during this collaboration phase, which then serve as input to the final design of the APOLLON methodology.
Question Area | Lessons Learned
How did the partners involved in the cross-border experiment get in contact with each other? Consider activities such as:
• Setting up the Living Lab network
• Identifying stakeholders
• Creating work plans and visions
• Determining the scope of the collaboration and the project owner
• Defining technical platforms
• Funding and contracts

Have any parts of the APOLLON methodology been used to support the process of connecting different stakeholders? (If not, why? If so, describe which parts of the methodology have been used and how they have been implemented.)

How do the suggested tools and templates support the process of connecting different stakeholders in cross-border collaboration?

What kind of support is needed when different stakeholders want to get in contact with each other and collaborate across borders?

Are the resources available to support the Connect phase used as efficiently as possible? (Consider to what extent resources have been used efficiently in the process. Is there anything that could have been carried out differently and in a more resource-efficient way? What is that? How could it be performed instead?)
Assessing the Set Boundaries and Engage Phase

In this section, the focus is to evaluate the Set Boundaries and Engage phase. In this phase, activities such as identifying partners, identifying risks and drivers in the cross-border project, creating a management plan and analyzing stakeholders, training cross-border partners, defining technical and IPR issues, and ensuring project team commitment are carried out. Note that not all questions can be answered purely in relation to usage of the methodology; focus on the actual activities carried out during this collaboration phase, which then serve as input to the final design of the APOLLON methodology.
Question Area | Lessons Learned
How was the Set Boundaries and Engage phase carried out? Which activities are common when determining the scope of the project, and which processes create commitment among partners? Consider activities such as:
• Identifying partners
• Identifying risks and drivers in the cross-border project
• Creating a management plan / analyzing stakeholders
• Training cross-border partners
• Defining technical and IPR issues
• Ensuring project team commitment

Have any parts of the APOLLON methodology been used to support the process of setting boundaries and engaging different stakeholders? (If not, why? If so, describe which parts of the methodology have been used and how they have been implemented.)

How do the suggested tools and templates in the APOLLON methodology support the process of setting boundaries and creating engagement between different stakeholders in cross-border networking? What support for this phase is needed?

Are the resources available to support the Set Boundaries and Engage phase used as efficiently as possible? (Consider to what extent resources have been used efficiently in the process. Is there anything that could have been carried out differently and in a more resource-efficient way? What is that? How could it be performed instead?)
Assessing the Support and Govern Phase

In this section, the focus is to evaluate the Support and Govern phase. In this phase, activities such as managing stakeholders, selecting research methods, planning financial aspects, implementing technical infrastructure, supporting deployment, designing evaluation frameworks, and designing the operational model are carried out. Note that not all questions can be answered purely in relation to usage of the methodology; focus on the actual activities carried out during this collaboration phase, which then serve as input to the final design of the APOLLON methodology.
Question Area | Lessons Learned
How was the Support and Govern phase carried out? Which activities are common to support and govern the cross-border collaboration process among partners? Consider activities such as:
• Managing stakeholders
• Selecting research methods
• Planning financial aspects
• Implementing technical infrastructure
• Supporting deployment
• Designing evaluation frameworks
• Designing the operational model

Have any parts of the APOLLON methodology been used to support the process of supporting and governing the cross-border collaboration between different stakeholders? (If not, why? If so, describe which parts of the methodology have been used and how they have been implemented.)

How do the suggested tools and templates in the APOLLON methodology support the process of supporting and governing cross-border collaboration between different stakeholders in the thematic domains? What kind of support is needed for this phase?

Are the resources available to support the Support and Govern phase used as efficiently as possible? (Consider to what extent resources have been used efficiently in the process. Is there anything that could have been carried out differently and in a more resource-efficient way? What is that? How could it be performed instead?)
Assessing the Manage and Track Phase

In this section, the focus is to evaluate the Manage and Track phase. In this phase, activities such as assessing impact, revising the operational model, planning the business model, evaluating the usage of technical platforms, and handing over responsibilities for the new pilot are carried out. Note that not all questions can be answered purely in relation to usage of the methodology; focus on the actual activities carried out during this collaboration phase, which then serve as input to the final design of the APOLLON methodology.
Question Area | Lessons Learned
How was the Manage and Track phase carried out? Consider activities such as:
• Assessing impact
• Revising the operational model
• Planning the business model
• Evaluating usage of technical platforms
• Handover of responsibilities for the new pilot

Which activities are common when managing the cross-border collaboration process among partners? Which activities are common when tracking the results of a cross-border collaboration process among partners?

Have any parts of the APOLLON methodology been used to support the process of managing and tracking the cross-border collaboration between different stakeholders? (If not, why? If so, describe which parts of the methodology have been used and how they have been implemented.)

How do the suggested tools and templates in the APOLLON methodology support the process of managing and tracking cross-border collaboration between different stakeholders in the thematic domains? What kind of support is needed for this phase?

Are the resources available to support the Manage and Track phase used as efficiently as possible? (Consider to what extent resources have been used efficiently in the process. Is there anything that could have been carried out differently and in a more resource-efficient way? What is that? How could it be performed instead?)
7. Template for Evaluating Cross-Border Networking Experiments
In this template, the aim is to support the evaluation of the impact of the cross-border networking activities carried out in the vertical experiments. Hence, this template strives to evaluate the added value, for different stakeholders, of being involved in cross-border networking activities in the APOLLON project. The objective of this template is to support the process of evaluating the cross-border networking experiments carried out in the different work-packages, in order to assess the value of Living Labs operating in networks for the involved stakeholders. This template should be used as an overarching framework that supports the evaluation of the vertical experiments, but it must be complemented with specific questions for each vertical experiment to deal with situational aspects. The template should be applied by the experiment leaders when assessing their experiments; its aim is to ensure that the same areas are assessed in all thematic experiments.

As a structure for the template, we have chosen to apply the components of Living Lab milieus, which are: Users and Partners, Management, Research, Innovation, ICT Tools and Infrastructure, and Approach (Bergvall-Kåreborn et al., 2009). The key components of Living Labs are illustrated in figure 3. Approach stands for the methods and techniques that emerge as best practice within the Living Lab environment. The Living Lab Partners & Users bring their own specific wealth of knowledge and expertise to the collective, helping to achieve boundary-spanning knowledge transfer. The ICT & Infrastructure component outlines the role that new and existing ICT technology can play in facilitating new ways of cooperating and co-creating innovations among stakeholders. Research symbolizes the collective learning and reflection that take place in the Living Lab, and should result in contributions to both theory and practice; technological research partners can also provide direct access to research, which can benefit the outcome of a technological innovation. Finally, Management represents the ownership, organization, and policy aspects of a Living Lab; a Living Lab can be managed by, for example, consultants, companies or researchers (Bergvall-Kåreborn et al., 2009).
Figure 3: Living Lab Milieu Key Components
Evaluation Template of Cross-Border Networking Experiments

This evaluation framework aims to support the evaluation of the cross-border networking experiments. The template is divided into six sections: Background, Approach, Partners & Users, ICT & Infrastructure, Research, and Management. In each section, questions are asked for which a value needs to be entered; each value then needs to be related to a source where it can be retrieved (a deliverable, a data collection, etc.), and finally the impact of the results should be stated. The impact is estimated by the experiment leaders in relation to the ratio of the ordinary values. For instance, if a methodology has been transferred between partners, the output of this methodology (the effects, marked in the grey area) might be new processes that increased the number of successful technology implementations by 5 % relative to the ordinary implementation ratio. There are also a number of questions that demand answers of a more qualitative character; these are recognisable by the large writing section.
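To make the impact calculation concrete, here is a minimal worked example of the 5 % case described above. All figures are hypothetical and serve only to show how an experiment leader might relate an observed value to the ordinary ratio.

```python
# Hypothetical worked example of the impact estimate described above.
ordinary_success_ratio = 0.60  # share of implementations that ordinarily succeed
observed_success_ratio = 0.65  # share observed after the methodology transfer

impact = observed_success_ratio - ordinary_success_ratio
print(f"Impact: {impact:+.0%} relative to the ordinary implementation ratio")
# -> Impact: +5% relative to the ordinary implementation ratio
```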
Background Information

In the subsequent rows, some background data is required to set the evaluation in the right context.
- WP number
- Experiment description
- Involved Partners
- Number of countries involved in the experiment
- Type of cross-border activities that have been carried out in the experiment
- Purpose of the cross-border activities (expected outcome)
- Experienced strengths of working in cross-border collaboration experiments
- Experienced challenges of working in cross-border collaboration experiments
Approach

Approach refers to the methods and techniques that have been used to support the cross-border collaboration in the APOLLON project. Hence, it has a broader scope than what is usually assessed in Living Lab activities. In the following table, some questions require a numerical value while others are of a more descriptive character. Thus, not all questions will have a numerically measurable impact, but if other impacts have been observed these should be filled in.
| Theme | Measures | Value (output) | Measurement tool (where the data stem from, e.g. deliverable number, interview etc) | Impact (e.g. % ratio of ordinary values, or qualitative impacts) |
|---|---|---|---|---|
| Approach (lines with only one column to fill in aim at gathering qualitative data) | No of cross-border activities | | | |
| | No of intellectual products (methodologies, know-how etc) transferred in the experiment | | | |
| | No of technology transfer activities | | | |
| | Which methods were used in the experiment? Please name and/or briefly describe the methods (qualitative) | | | |
Partners and Users

The section Partners & Users refers to those who have been involved and brought their own specific wealth of knowledge and expertise to the project, and thus helped to achieve the cross-border networking experiments. In the following table, some questions require a numerical value while others are of a more descriptive character. Thus, not all questions will have a numerically measurable impact, but if other impacts have been observed these should be filled in.
| Theme | Measures | Value (output) | Measurement tool (where the data stem from, e.g. deliverable number, interview etc) | Impact (% ratio of ordinary values) |
|---|---|---|---|---|
| Partners: Users (lines with only one column to fill in aim at gathering qualitative data) | No of users that have been involved in the experiment | | | |
| | No of user involvement activities | | | |
| | No of new ideas that emerged from the cross-border collaboration with users | | | |
| | No of implementations of e.g. new functions as a result of the cross-border collaboration with users | | | |
| | No of redesigns of products/services as a result of the cross-border collaboration with users | | | |
| | User engagement activities in detail (e.g. usability evaluation, behaviour change studies, user experience evaluations etc) (qualitative) | | | |
| | What was the users' role in the cross-border collaboration activities? (qualitative) | | | |
PARTNERS: SME

In this section, the aim is to evaluate the SME engagement and the added value of their participation for them as SMEs.
| Theme | Measures | Value (output) | Measurement tool (where the data stem from, e.g. deliverable number, interview etc) | Impact (% ratio of ordinary values) |
|---|---|---|---|---|
| Partners: SME (lines with only one column to fill in aim at gathering qualitative data) | No of SMEs involved in the experiment | | | |
| | No of SME engagement activities | | | |
| | No of new international partners | | | |
| | No of signed letters of intent between partners and/or customers | | | |
| | No of new businesses generated in other countries | | | |
| | No of new business proposals | | | |
| | No of new customers in other countries | | | |
| | Did the cross-border collaboration lead to increased turnover? (Yes / No / I do not know / Not relevant in this experiment) | | | |
| | Did the cross-border collaboration lead to increased customer retention? (Yes / No / I do not know / Not relevant in this experiment) | | | |
| | SME engagement activities in detail (e.g. developing technology, user tests, implementation of technology etc) (qualitative) | | | |
| | What was the role of the SME in the cross-border collaboration? (qualitative) | | | |
PARTNERS: Large Enterprises

In this section, the aim is to evaluate the Large Enterprises' engagement and the added value of their participation for them as Large Enterprises.
| Theme | Measures | Value (output) | Measurement tool (where the data stem from, e.g. deliverable number, interview etc) | Impact (% ratio of ordinary values) |
|---|---|---|---|---|
| Partners: Large Enterprise (lines with only one column to fill in aim at gathering qualitative data) | No of LEs involved in the experiment | | | |
| | No of LE engagement activities | | | |
| | No of new international partners | | | |
| | No of signed letters of intent between partners and/or customers | | | |
| | No of new businesses generated in other countries | | | |
| | No of new business proposals | | | |
| | No of new customers in other countries | | | |
| | Did the cross-border collaboration lead to increased turnover? (Yes / No / I do not know / Not relevant in this experiment) | | | |
| | Did the cross-border collaboration lead to increased customer retention? (Yes / No / I do not know / Not relevant in this experiment) | | | |
| | LE engagement activities in detail (e.g. developing technology, implementation of experiments etc) (qualitative) | | | |
| | What was the LE role in the cross-border experiment? (qualitative) | | | |
PARTNERS: Local Authorities

In this section, the aim is to evaluate the Local Authorities' engagement and the added value of their participation for them as Local Authorities.
| Theme | Measures | Value (output) | Measurement tool (where the data stem from, e.g. deliverable number, interview etc) | Impact (% ratio of ordinary values) |
|---|---|---|---|---|
| Partners: Local Authorities (lines with only one column to fill in aim at gathering qualitative data) | No of local authorities involved in the experiment | | | |
| | No of local authority engagement activities | | | |
| | No of new international partners | | | |
| | No of new business proposals | | | |
| | No of new customers in other countries | | | |
| | No of signed letters of intent between partners and/or customers | | | |
| | No of new businesses generated in other countries | | | |
| | Did the cross-border collaboration lead to increased turnover? (Yes / No / I do not know / Not relevant in this experiment) | | | |
| | Did the cross-border collaboration lead to increased customer retention? (Yes / No / I do not know / Not relevant in this experiment) | | | |
| | Local authority engagement activities in detail (e.g. implementation of experiments, experimental settings etc) (qualitative) | | | |
| | What was the local authorities' role in the cross-border collaboration experiment? (qualitative) | | | |
Technology and Infrastructure

The ICT & Infrastructure component outlines the role that new and existing ICT technology can play in facilitating new ways of cooperating and co-creating innovations among stakeholders. In the following table, some questions require a numerical value while others are of a more descriptive character. Thus, not all questions will have a numerically measurable impact, but if other impacts have been observed these should be filled in. For the questions with Yes/No answers, please respond according to the experiences from the experiments.
| Theme | Measures | Value (output) | Measurement tool (where the data stem from, e.g. deliverable number, interview etc) | Impact (% ratio of ordinary values) |
|---|---|---|---|---|
| Technologies | No of products that have been transferred in the experiment | | | |
| | No of cross-border collaboration tools that have been used in the experiment | | | |
| | No of NEW (for the stakeholders) ICT tools that have been used in the experiment | | | |
| | No of distributed cross-border collaboration activities | | | |
| | Did the cross-border collaboration tools you used lead to increased access to relevant information? (Yes / No / I do not know / We didn't use any collaborative tools) | | | |
| | Did the cross-border collaboration tools you used lead to increased effectiveness in communication? (Yes / No / I do not know / We didn't use any collaborative tools) | | | |
| | Did the cross-border collaboration tools you used lead to increased co-creation of innovations among stakeholders? (Yes / No / I do not know / We didn't use any collaborative tools) | | | |
| | Which collaboration tools have been used to support the cross-border collaboration in the experiment? (qualitative) | | | |
Research

Research symbolizes the collective learning and reflection that take place in the Living Lab and should result in contributions to both theory and practice. In the following table, some questions require a numerical value while others are of a more descriptive character. Thus, not all questions will have a numerically measurable impact, but if other impacts have been observed these should be filled in. For the questions with Yes/No answers, please respond according to the experiences from the experiments; this is not an exact measure, but rather strives to gather impressions of the impact.
| Theme | Measures | Value (output) | Measurement tool (where the data stem from, e.g. deliverable number, interview etc) | Impact (% ratio of ordinary values) |
|---|---|---|---|---|
| Research | No of research activities that have been performed during the experiment | | | |
| | No of authored journal articles | | | |
| | No of authored conference papers | | | |
| | No of research conference presentations | | | |
| | No of new research projects initiated | | | |
| | Did the cross-border collaboration lead to increased comparability of Living Lab research? (Yes / No / I do not know / Not relevant for our experiment / We didn't do any research) | | | |
Management

Management represents the ownership, organization, and policy aspects of Living Labs. In this project, the aim is also to define the role of the Living Lab in the cross-border collaboration, as well as the impact of the project on local Living Labs and on EnoLL. In the following table, some questions require a numerical value while others are of a more descriptive character. Thus, not all questions will have a numerically measurable impact, but if other impacts have been observed these should be filled in. For the questions with Yes/No answers, please respond according to the experiences from the experiments; this is not an exact measure, but rather strives to gather impressions of the impact.
| Theme | Measures | Value (output) | Measurement tool (where the data stem from, e.g. deliverable number, interview etc) | Impact (% ratio of ordinary values) |
|---|---|---|---|---|
| Management: Living Lab Management Role | No of Living Labs that have been involved in the experiment | | | |
| | No of new collaboration initiatives between Living Lab network members (planned, prepared or submitted) | | | |
| | No of new Living Lab network members | | | |
| | Did the cross-border collaboration lead to increased access to user communities in other countries? (Yes / No / I do not know / Not relevant for our experiment) | | | |
| | Did the cross-border collaboration lead to an increased value proposition to the stakeholder community? (Yes / No / I do not know / Not relevant for our experiment) | | | |
| | Did the cross-border collaboration lead to increased learning about Living Lab collaboration in networks? (Yes / No / I do not know / Not relevant for our experiment) | | | |
| | Did the cross-border collaboration lead to increased maturity of Living Lab management? (Yes / No / I do not know / Not relevant for our experiment) | | | |
| | The Living Lab's role in the cross-border collaboration activities (what have been the responsibilities of the Living Lab) (qualitative) | | | |
| | The experiment's impact on local policies (describe the impact of the experiment on local policies, both actual and expected) (qualitative) | | | |
8. Template for APOLLON Experiment-Specific KPIs

In this section we want the experiment leaders to fill in the relevant, context-dependent key-performance indicators for their specific cross-border collaboration experiment. These KPIs should be consistent with the description of the experiments in Deliverables x.2 & x.3.
Experiment-Specific Key-Performance Indicators

In this section the experiment leader and the task leader of x.4 should fill in the key-performance indicators that are relevant and specific for each individual experiment.
| KPI (an overarching description of the Key Performance Indicator) | Measures (define the measure you use to measure the KPI) | Value (output) | Measurement tool (where the data stem from, e.g. deliverable number, interviews etc) | Impact (e.g. % ratio of ordinary values, or qualitative impacts) |
|---|---|---|---|---|
| | | | | |
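A purely illustrative example of how a row might be filled in follows; the KPI, figures, and sources below are invented for the sake of illustration and are not taken from any APOLLON experiment:

| KPI | Measures | Value (output) | Measurement tool | Impact |
|---|---|---|---|---|
| Speed of cross-border pilot set-up (hypothetical) | Number of weeks from partner agreement to a running pilot | 6 weeks | Experiment log; partner interviews | -25 % compared with an ordinary set-up time of 8 weeks, i.e. (6 - 8) / 8 |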
References

Benyon, D., Turner, P., and Turner, S. 2005. Designing Interactive Systems. Edinburgh: Pearson Education Limited.

Bergvall-Kåreborn, B., Ihlström Eriksson, C., Ståhlbröst, A., and Svensson, J. 2009. A Milieu for Innovation - Defining Living Labs. The 2nd ISPIM Innovation Symposium: Stimulating Recovery - The Role of Innovation Management, New York City, USA, 6-9 December 2009.

Córdoba, J., and Robson, W. 2003. Making the Evaluation of Information Systems Insightful: Understanding the Role of Power-Ethics Strategies. Electronic Journal of Information Systems Evaluation 6(2). http://www.ejise.com/volume6-issue2/vol6-i2-articles.htm

Guba, E., and Lincoln, Y. 1989. Fourth Generation Evaluation. Newbury Park: Sage Publications.

Karlsson, O. 1999. Utvärdering - mer än metod. ÁJOUR Vol. 3. Stockholm: Svenska Kommunförbundet, Kommentus Förlag.

Lewis, J. 2001. Reflections on Evaluation in Practice. Evaluation 7(3): 384-394.

Lundahl, C., and Öquist, O. 2002. Idén om en helhet - utvärdering på systemteoretisk grund. Lund: Studentlitteratur.

Newman, W., and Lamming, M. 1995. Interactive System Design. Cambridge: Addison-Wesley Publishers Ltd.

Patton, M. Q. 1987. How to Use Qualitative Methods in Evaluation. California: Sage Publications.

Patton, M. Q. 1990. Qualitative Evaluation and Research Methods. 2nd ed. Newbury Park: Sage Publications.

Perrin, B. 2002. How to - and How Not to - Evaluate Innovation. Evaluation 8(1): 13-28.