Feedback Utilization in the Rohingya Response: Summary of Lessons and Promising Practices

January 2019

CDA Collaborative Learning Projects and Disasters Emergency Committee
Kiely Barnard-Webster and Isabella Jean, with Monica Blagescu and Katy Bobin



Table of Contents

Acknowledgements
1. Executive Summary
2. Introduction
3. Methodology
4. Key Lessons Learned
5. Areas for Further Attention
6. Recommendations
7. References
Annex 1: DEC Members Feedback Utilization Survey Conducted in August 2018
Annex 2: Number and Type of Participants


Acknowledgements

The authors of the report thank Oxfam, CARE and Christian Aid for hosting and guiding CDA’s research team during their field work in Cox’s Bazar, Bangladesh. Many thanks also to the World Vision team for providing logistical support. CDA would also like to thank colleagues at the Disasters Emergency Committee for their invaluable contribution and collaboration throughout this entire action research process.

How the DEC Works

The DEC brings together some of the UK’s leading charities to raise funds at times of significant humanitarian need overseas. It allocates appeal funds to its members and ensures that the generous donations of the UK public are spent on emergency relief needed by communities devastated by humanitarian crises, as well as on longer-term support to rebuild the lives of people in these communities and strengthen their resilience. http://www.dec.org



1. Executive Summary

In 2018, Disasters Emergency Committee (DEC) member charities collaborated on a learning-focused initiative to better understand and improve the utilization of community feedback at different stages in the program cycle. DEC members acknowledged that there are good practices across member organizations, but that such practices are often not sufficiently understood or properly documented. In addition to a London learning workshop with DEC members, three DEC member charities offered themselves as case studies for a more in-depth examination of related practices in their emergency response to the Rohingya refugees in Cox’s Bazar, Bangladesh.

Feedback and complaints are types of information provided by communities about the relevance, quality and effectiveness of aid, as well as about staff and partner conduct. This information can be proactively solicited or arrive unsolicited; it can be collected through formal and informal feedback channels and used for improvements and course corrections throughout the program. Aggregated community feedback trends are useful for program strategy reviews and improvement.

DEC member agencies remain deeply committed to addressing safeguarding concerns and urgent protection issues. Therefore, in this learning engagement, it was a priority for DEC members to also understand how feedback systems are utilized to communicate such information, and what the link is between feedback systems and complaint-handling mechanisms for reporting, investigating and responding to serious incidents. Staff of case study organizations were also asked to describe how their agencies handle complaints of a sensitive nature which require confidentiality, including reports of sexual exploitation and abuse incidents.

Utilization of community feedback for improving humanitarian response is one of many key elements for increasing accountability to crisis-affected people.
There is still more to learn about the institutional enabling factors that make such utilization possible. This report highlights the key observations from our field work and examples provided by DEC member organizations.

Main Lessons

1. Using feedback under challenging circumstances: creative solutions are possible. CARE has found ways to use feedback under some of the most challenging circumstances, for example by strengthening coordination and strategically partnering with the Camp in Charge (CIC) and others to strengthen the effectiveness of communication, response and ongoing improvement.

2. Signaling to staff that collected data is being used at the senior level is important for an organizational accountability culture. Making accountability to communities a part of the organizational culture is a challenge. Oxfam’s regular Accountability Reports are shared with all staff to demonstrate how MEAL and senior management teams use community feedback, a signal to staff that accountability matters.

3. Basic tools that enable better inter-agency coordination can make a difference. Christian Aid has developed a visual map showing which services are provided by implementers in Camp 15. The team regards the map as critical for referring feedback to the appropriate agencies and for relieving feedback providers of the burden of seeking that information.

Key Recommendations

• Poor inter-agency communication and coordination pose a risk to effective feedback utilization, as feedback may be lost, misplaced, or require significant time before action is taken. This places the burden on community members to advocate for themselves rather than on the aid providers. Aid agencies, including DEC members, need to review their current external referral and cluster communication processes and prioritize improvements in inter-agency communication channels and coordination.

• Well-functioning feedback systems (where feedback providers receive a response or can see the results of their feedback) strengthen the confidence of survivors of safeguarding breaches to come forward and report incidents. Investing in effective feedback systems should improve reporting of sexual exploitation and abuse incidents perpetrated by aid workers against communities.

• HR and senior management teams should consider a better balance of male to female staff in hiring decisions, because female community members more often choose to disclose sensitive information to other women.



2. Introduction

‘To work in an adaptive way, we need catalysts that compel response to information that indicates we could be doing something better. It requires a will and incentives to change course, but also a skill set to think critically about data, context, problems and solutions.’1

Feedback can inform decision-making from the field level up to the strategy level. However, it is often given second or third priority relative to other forms of data or “expert-led” mid-term and final evaluation reports. Receiving and using feedback effectively signals respect, transparency, and commitment to engaging communities and crisis-affected people in decisions that directly impact their lives. Feedback utilization is part of the set of commitments made by the humanitarian sector at the World Humanitarian Summit and is embedded in the Core Humanitarian Standard on Quality and Accountability (CHS).2

In 2018, Disasters Emergency Committee (DEC) member charities collaborated on a learning-focused initiative to better understand and improve the utilization of community feedback at different stages in the program cycle. DEC members acknowledged that there is good practice across member organizations, but that such practices are often not sufficiently understood or properly documented to promote learning within and across organizations. In addition to a London learning workshop with DEC members, three DEC member charities offered themselves as case studies for a more in-depth examination of their practice in the humanitarian response to the Rohingya crisis in Cox’s Bazar, Bangladesh.

DEC members acknowledged that increased investments and attention have been devoted to developing tools and refining mechanisms for collecting feedback from communities. But such efforts have not been matched with lessons and resources to improve organizational decision-making behavior and increase utilization of community feedback. This learning initiative focused on documenting examples of good practice and understanding how organizations collect feedback and why, when and how they utilize it.
The promising lessons highlighted in this report point to the enormous creativity of frontline staff and partners in overcoming barriers to utilizing feedback in program design and for informing wider organizational improvements. This requires leadership and staff commitment to improve internal communication and decision-making, and to provide sufficient resources for staff to remain responsive to feedback.

Section 3 of this report describes the methodology. Section 4 provides the key lessons and examples of promising practices documented during the London learning workshop and the visits with three DEC case study member organizations in Cox’s Bazar. Section 5 discusses feedback utilization areas that require further attention and improvements. Section 6 concludes with several recommendations for applying these lessons to future strategies and programs.

1 Jean, Isabella, “Beneficiary Feedback: how we hinder and enable good practice.” London: Bond, 2017.
2 CHS Alliance, ICRC, “How Change Happens in the Humanitarian Sector.” Switzerland: CHS Alliance, 2018. https://www.chsalliance.org/har



3. Methodology

The CDA research team (two CDA staff and one Bangladeshi consultant), in partnership with DEC’s Programmes and Accountability team, chose an iterative approach for gathering the lessons presented in this summary report, informed by conversations with DEC members’ staff based in the UK and in Cox’s Bazar. In total, we held 21 key informant interviews, 8 focus group discussions and 3 process mapping exercises, conducted a survey of DEC members, and held 2 learning events (one in London and one in Cox’s Bazar) focused on peer-to-peer exchange of experiences with using community and partner feedback.

Participating DEC members (affiliates and partners in Bangladesh engaged in the process): ActionAid UK, Christian Aid, Oxfam GB, British Red Cross, Concern Worldwide UK, Plan International UK, CARE International UK, Age International, Tearfund, CAFOD, Islamic Relief Worldwide, Save the Children UK, and World Vision UK.

CDA developed a preliminary scoping paper with overall lines of inquiry and guiding questions for the learning process. The scoping paper was based on consultations with the DEC team, evaluators who completed a recent real-time evaluation in CXB, DEC member survey results, and a desk review of good practice in feedback utilization. The approach to this learning process was also informed by CDA’s collaborative learning and advisory work with non-DEC organizations focused on mapping and documenting internal feedback utilization processes.

A one-day learning event with UK-based DEC member staff was facilitated by CDA and DEC in London in September 2018; two representatives from Bangladesh also joined (one in person and one remotely). Participants shared good practice examples and highlighted areas of concern in relation to feedback utilization. Examples of utilization were also shared via a 12-question survey in advance of the learning event and were analyzed by CDA to inform the discussions. The survey was completed by 22 staff in the UK and Cox’s Bazar (see Annex 1 for the complete set of survey questions and Annex 2 for a selection of survey response data).

Three DEC members agreed to be ‘case studies’ and to participate in a closer examination of their formal and informal feedback utilization processes in Cox’s Bazar programming. Organizations put themselves forward as cases either because they had built community listening as a standard approach across their operations in Cox’s Bazar and had examples of feedback practices that work or don’t work, or simply because they wanted an opportunity to engage staff and partners in an open dialogue on how they can listen better to communities, and to improve their response and use of feedback in decision-making.

In Cox’s Bazar, CDA facilitators and staff from participating organizations visually mapped how community feedback travels internally throughout the field office, the rest of the organization, and across different stages of the program cycle. The exercise examined what formal and informal processes and procedures exist to collect, document, analyze and respond to feedback. To guide staff through this participatory process, the CDA team used a visual “process mapping” exercise, applied successfully in several recent CDA engagements with humanitarian partners.



Multiple good practices were described during this exercise and were further explored during the subsequent diagnostic interview process. An informal “diagnostic process” involved a set of quantitative and qualitative questions intended to quickly gather facts, opinions and analysis of cause and effect, and to diagnose “bottlenecks” and reasons why community feedback remains underutilized even when feedback mechanisms solicit and collect it.

Staff in Cox’s Bazar central offices were consulted through individual interviews and, in a few instances, focus groups were also held with staff. Field staff were similarly asked to share their perspective, informed by daily interactions and program implementation roles closer to the points of service in the camps. A small but representative group of community members took part in focus group discussions at field level, to assess the extent of their knowledge about established feedback processes and how organizations communicate their response about actions taken or not taken based on feedback.

In total, CDA held conversations with 156 different people (staff, community volunteers and community members from offices in Cox’s Bazar and within five different camps). Some staff who participated in KIIs also engaged in the process mapping exercises; however, they have not been counted twice in the total figures presented in Annex 2.

Lessons in this summary report draw largely on conversations and process mapping with staff from the three case study organizations: CARE, Christian Aid, and Oxfam. Some lessons below were shared in the member surveys and documented during the two learning events, which extended beyond these three organizations and included a wider group of DEC member charities.

Limitations

It was outside the scope of this engagement to conduct an in-depth examination of how organizations provide information and raise community awareness about the channels for providing feedback and reporting complaints.
Specifically, we did not closely examine the processes by which organizations explain what is considered acceptable or unacceptable behavior and how to report serious incidents. However, we include several relevant recommendations in the final section of this report that were raised by staff who are already working to address these important questions.

Key Concepts & Terminology Used in This Report

Feedback and complaints are types of information provided by communities about the relevance, quality and effectiveness of aid, as well as about staff and partner conduct. This information can be proactively solicited or arrive unsolicited; it can be collected through formal and informal feedback channels and used for improvements and course corrections throughout the program cycle. Aggregated community feedback trends are also useful for program strategy reviews.

Feedback: Feedback is ‘information about reactions to a product, a person’s performance of a task, etc. which is used as a basis for improvement’ (Oxford Dictionary, 2013). According to extensive 2014 ALNAP action research looking at feedback mechanisms in humanitarian contexts, other agencies also define feedback as “opinions, concerns, suggestions and advice that ‘aid agencies may adopt, challenge or disagree with as appropriate’ (Banos Smith, 2009:33).”3

Once feedback is received, it is often used first by field staff, who assess the nature of the information and may respond to it immediately, document it for sharing with other project/sector teams, or refer the feedback to partners or peer agencies at field level that are better placed to respond and act on it. Feedback can also be shared with central office teams and headquarters. It may inform project-level course corrections, wider program re-design, or strategic senior management discussions (e.g., on program relevance, quality improvements, partner selection and other operation-wide adaptations). It can also be shared with the cluster system and donors to demonstrate recurring concerns or positive trends in satisfaction rates. Feedback is often shared with and used by MEAL teams and external evaluators to complement other assessments of the relevance and quality of service provision. Finally, feedback may be shared back with community members, along with a brief report on how the feedback was handled by the agency, in order to support transparency and accountability.

Complaint: Complaints are a critical form of feedback relating to the quality of items and services and can include grievances shared by people negatively affected by the aid program, the organization delivering it, or specific behaviors and conduct of its staff and partners. According to the Core Humanitarian Standard, complaints-handling processes and procedures “cover programming, sexual exploitation and abuse, and other abuses of power.”4

Term: Feedback
Definition: Information about reactions to a product, a person’s performance of a task, etc. which is used as a basis for improvement (Oxford Dictionary, 2013); also, opinions, concerns, suggestions and advice that ‘aid agencies may adopt, challenge or disagree with as appropriate’ (Banos Smith, 2009:33)5
How this information may be used by agencies: Feedback can be used across all types of programmes and by staff at all levels. It may inform project-level course corrections, wider program re-design, or strategic senior management discussions (e.g., on program relevance, quality improvements or operation-wide adaptations). It is helpful to classify feedback as it relates to 1) day-to-day implementation decisions versus 2) bigger-picture concerns related to theories of change, geographical coverage, etc.6

Term: Complaints
Definition: A specific grievance of anyone who has been negatively affected by an organization’s action or who believes that an organization has failed to meet a stated commitment.7
How this information may be used by agencies: Different complaints mechanisms are in use. These are a specified series of actions through which an organization deals with complaints and ensures that complaints are reviewed and acted upon... [and] procedures for handling all types of complaints, including those related to sexual exploitation and abuse of crisis-affected people by staff.8

3 Bonino, F. with Jean, I. and Knox Clarke, P. (2014), “Humanitarian Feedback Mechanisms: Research, Evidence and Guidance.” ALNAP Study. London: ALNAP/ODI. Page 29.
4 Core Humanitarian Standard on Quality and Accountability. 2014. https://corehumanitarianstandard.org/files/files/Core%20Humanitarian%20Standard%20-%20English.pdf. Page 14.
5 Bonino, F. with Jean, I. and Knox Clarke, P. (2014), “Humanitarian Feedback Mechanisms: Research, Evidence and Guidance.” ALNAP Study. London: ALNAP/ODI. Page 29.
6 Bonino, F. with Jean, I. and Knox Clarke, P. (2014), “Humanitarian Feedback Mechanisms: Research, Evidence and Guidance.” ALNAP Study. London: ALNAP/ODI. Page 29.
7 HAP Standard in Accountability and Quality Management. 2010. https://www.chsalliance.org/files/files/Resources/Standards/2010-hap-standard-in-accountability.pdf. Page 6.
8 HAP Standard in Accountability and Quality Management. 2010. https://www.chsalliance.org/files/files/Resources/Standards/2010-hap-standard-in-accountability.pdf. Page 6.



4. Key Lessons Learned: Promising Practices Across Three Case Study Agencies

Christian Aid, CARE and Oxfam GB put themselves forward as case studies for a more in-depth learning collaboration for several reasons: because they had community listening as a standard approach across their operations, or specific examples of feedback practices that work and do not work, or simply because they wanted to engage staff and partners in an open dialogue on how they could improve their feedback and listening processes and utilize community feedback.

All three organizations described good practices and examples of sharing and utilizing feedback for different purposes. CARE’s WASH and Gender teams are currently using community feedback to provide WASH services that address women’s hygiene needs. Christian Aid is similarly using feedback to adapt its programming, in this case to accommodate marginalized community members. Oxfam is successfully using feedback to monitor the perspectives of hard-to-reach groups (e.g., adolescent girls) through Listening Groups (described below), and by improving methods to document and share feedback with senior management teams using Accountability Reports. More examples are highlighted in the sections below.

A. CARE Bangladesh

CARE Bangladesh regularly receives formal and informal feedback. Staff reported an average of 60 verbal feedback entries a week; written feedback is less frequent, averaging fewer than 30 entries a month. CARE staff receive most verbal feedback at field level, as community members feel most comfortable providing feedback this way to volunteers and staff (e.g., during community meetings, house-to-house visits, through Majhi and other community leaders, and when program teams visit camp sites and have direct conversations). Staff noted that 76% of respondents reported that either they or a family member possess a mobile phone.9
Using feedback to inform WASH/Gender program re-design

“WASH/GBV coordination – it started during a community-level women’s discussion in Women Friendly Spaces (WFS) that highlighted menstruation hygiene management as an issue. A CARE [female] GBV staff member used drawings and diagrams to communicate. The CARE WASH and GBV senior staff discussed further how to identify the most culturally appropriate solutions. We are now piloting WASH facilities in WFS, including laundry facilities that will address menstrual hygiene management.” – CARE WASH Advisor

One main challenge in designing the WASH facilities in the WFS project with Menstrual Health Management (MHM) laundry facilities (see Figure 1) was understanding cultural sensitivities around menstrual hygiene practices in Rohingya communities.

9 Interview with CARE MEAL staff.



This required significant research by the CARE Gender & Protection team (see Figure 2 for findings), in coordination with WASH colleagues. At present, most women in Camp 16 wash their clothes in communal wash spaces or behind their houses; however, this is often considered inappropriate because it is too public. The MHM laundry facility design is meant to address this (see Figure 3 on how the decision was made), though it is currently at the pilot stage within communities in Camp 16 to determine whether the approach is a more effective option for women when cleaning menstrual cloths.

Figure 1. CARE WASH/GBV sketches of proposed WASH facilities in WFS, an output of discussions with women.

Findings: toilet, tubewell and bathroom
• No pordha (wall) or segregation at the tubewell
• Women feel ashamed to use the toilet while a man is standing outside
• Men still use the women’s bathroom and toilet
• No water facilities in the bathroom
• Water point is far from the bathroom
• Poor drainage, so grey water from washing menstrual cloth remains visible

Findings: menstrual health management
• Cloths are used more often than pads
• Women initially threw used cloths in the toilet; they later began burying them, as men could see cloths thrown in the toilet
• Washing is done at home in a small area made for women and girls to use at night
• Holes are dug within the household boundary for washing
• Washing is done at the tubewell when men are not around, or in the bathroom, despite water and drainage problems
• Cloths are dried hidden in the backyard or inside the home, resulting in bad smells
• Cloths should not come into contact with any male person: it is considered a sin if a man sees them, and a loss of masculinity if he passes under women’s clothing hung to dry

Figure 2. CARE research findings to inform Menstrual Health Management re-design



Decision-making steps

Risk assessment:
• The Imam, Majhi and family members were consulted.
• Women and girls find the facility useful and safe in the WFS.
• A laundry and MHM facility in the WFS provides more dignity due to the pordha (wall).

Process:
• The GiE team called a meeting with the WASH advisor to present recommendations based on the FGD findings and a gender-sensitive laundry design, along with MHM and bathroom facilities.

Figure 3. Decision-making steps for CARE’s WASH/Gender MHM program

Site Management “Soft” Skills – Camp 16

An example of good practice in feedback sharing and coordination is currently taking place in Camp 16, managed by CARE. The CARE site management team is very small due to a shortage of staff, and is not always able to get out and speak to people face-to-face when issues arise. Out of necessity, this has led staff to intentionally establish new relationships, or further develop close relationships, with government counterparts (e.g., the Deputy Secretary of Government) and agencies directly implementing in the camp.

“Camp is like a big family. It’s a hills problem. It’s a land problem. It’s a space issue. There are many more things [we all face].” – CARE Camp Manager / Technical Coordinator, Management & Coordination team

The employment of soft skills and relationship building has enabled the CARE site management team to coordinate efficiently when urgent issues arise. For example:

“During monsoon season, latrines fall. If they fall, 2-3 houses will be crushed and people will die. I was on leave in Dhaka. They called me. I know who has the capacity. I called around. At the time, CARE was the one with the ability to address this. This was at 4pm, I got the call! I spoke to [my colleague]. I also consulted with CIC, and BRAC because it was their latrine. I have to get the sign-off of CIC and pass off by BRAC too. I need to get the parents to sign off!... The fact is we have good coordination. My phone is always on.” – CARE Camp Manager / Technical Coordinator, Management & Coordination team



The ‘soft skills’ (which are, in fact, often ironically quite hard to develop and to identify during staff hiring processes) include communicating quickly with the multiple stakeholders implicated in feedback response (e.g., security and military government departments, the CIC or local government counterparts, community members, and other agencies). They may also include the ‘emotional intelligence’ skills required to develop relationships and understand how best to engage with partners. Site management staff gave two recommendations for achieving higher levels of feedback sharing and coordination: rapport building (e.g., with communities, through block management teams) and knowing one’s role and responsibilities as a member of the CARE camp management team.

CDA has observed in past learning engagements – and validated again through the work in Cox’s Bazar – that leadership is critical to increasing buy-in and support for listening and feedback processes. “Senior management endorsement and demonstration of the value of listening and the value of feedback helps to advance the use of feedback in the organization overall. This includes closing the loop on feedback from staff as well.”10 This role-model approach may often lead staff to rise to what they see as “expectations set for them by their managers and peers.” It requires agency-wide support to organizational champions, especially those in management roles, who demonstrate that feedback sharing and use are critical to the mission of the organization, and who emphasize the importance of listening, facilitation and joint problem-solving skills.11

Using feedback under challenging circumstances: finding creative solutions

CARE has found ways to use feedback under some of the most challenging circumstances, as also described by other DEC member agencies: when facing barriers and restrictions from local stakeholders (e.g., government administration officials, such as the Camp in Charge, or CIC, at field level).
The CIC must give humanitarian implementers permission for most of the steps required for service provision (e.g., design, budget, etc.) to ensure projects align with existing sector-wide standards. Given the overwhelming conditions at the start of August 2017, the Government of Bangladesh saw this as a necessary step to ensure better coordination at field level and improve the quality of services. To date, for humanitarian agencies, this mandated approval process has frequently led to delays in implementation and raised dilemmas in terms of how to quickly and effectively act on community feedback (e.g., requests for more space, or for quick quality improvements that may fall outside an agency’s remit or budget).

One creative solution for responding to community feedback under challenging circumstances, such as lengthy approval processes, introduced by CARE is to strategically partner with other agencies. For example, months ago in Camp 16, CARE was providing latrines; however, it was not yet approved to provide cleaning (“de-sludging”) services for these units, and it received significant amounts of feedback that latrines needed more rapid cleaning. Rather than request additional funding from donors and go through the lengthy sign-off process from the government to provide

10 CDA Collaborative Learning Projects, Issue Paper on Feedback Utilization: Modeling & Routines. Unpublished, available upon request. 2016.
11 CDA Collaborative Learning Projects, Issue Paper on Feedback Utilization: Modeling & Routines. Unpublished, available upon request. 2016.



this service, CARE simply found a partner agency already approved to provide this service, with budget to do so, and agreed to rapidly liaise with one another when de-sludging services were needed. This required rapport building, joint planning, and frequent communication with stakeholders to ensure this partnership would be possible and admissible.

B. Oxfam

Oxfam’s commitment to accountability to communities and to improving internal processes for reviewing and responding to feedback has meant increased support to the Cox’s Bazar team, including budgetary resources to develop new tools and approaches for using feedback. Over the last several months, Oxfam received significantly higher rates of feedback than in January–June 2018. For example, in July 2018 alone, 1,634 feedback entries were received; in August 2018, 706; and in September 2018, 866.12 The agency’s focus, especially in the last six months, has been to improve the overall quality of programs by designing better processes for analyzing, sharing and responding to feedback. For example, Oxfam MEAL staff described new processes that focus on how to motivate staff to collect and use data, how to most effectively share feedback with the senior management team, and the best methods for rapid trend analysis of feedback at field level.

Listening Groups and Spidergram methodology for rapid trend analysis

Oxfam conducts regular listening exercises with community members, using open-ended discussions for different groups of people to share concerns, discuss the quality of services, and talk about their community’s well-being. Listening groups are held in a central community center, rather than in an office, in members’ homes or in more private spaces, at the request of members themselves (both men and women). Community-based volunteers still make house calls to assess how community members are doing.
Admittedly, Oxfam has been trying to find ways to address the issue of adolescent girls not always attending Listening Group meetings.

“[Community members] didn’t want to come at first because they weren’t paid. They like us and think we take care of them. We don’t provide and leave. We monitor – if de-sludging is needed and it’s not our latrine we will still do it.” – Oxfam Senior WASH Officer

Listening Groups have proven to be an effective way to gather community feedback. When done well, they can build trust between community members and implementers and create space for feedback about a variety of issues. But the process is not easy to design. Oxfam took three months to set up the Listening Groups: establishing the participant selection mechanism, deciding which community members would attend, soliciting feedback about the process itself, and following up when community members did not show up to assess whether there were issues with the process. The team is in the second month of piloting the approach and has experienced challenges such as community members expecting to be paid for attendance, and not remembering when to attend (i.e., not yet building the habit of attending regularly).

12. Conversation with Oxfam MEAL Coordinator.



As open spaces for discussion, Listening Groups serve multiple purposes: to gather community feedback and, going forward, to engage community members in joint analysis of programmatic issues and other concerns, and in generating options to address recurring challenges.

Figure 4. Oxfam Spidergram exercise

Additionally, Listening Groups serve as spaces to conduct rapid trend analysis with community members. Oxfam has introduced the 'Spidergram' methodology for rapid trend analysis, in which Listening Group members are asked to assess the quality of projects (e.g., what is going well or not well with Oxfam's water projects, latrine projects, etc.). Over the course of a listening session, which lasts approximately one hour, each community member shares feedback on a service or project and is then asked to rank its quality on a scale of 1-5. The average rating for each service is documented on a 'Spidergram' (see Figure 4) by making a mark on that service's 'leg' of the web. When the marks on the legs are connected into a single line around the web, the resulting shape lets Oxfam see at a glance which areas are of priority concern and how the agency is doing overall in its service provision in Camp 19.
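The averaging step behind the Spidergram can be sketched in a few lines. The following is an illustrative Python sketch, not Oxfam's actual tooling: the service names, ratings and the below-3.0 flagging rule are invented for illustration, following the 1-5 scale described above.

```python
from statistics import mean

def spidergram_averages(ratings):
    """Average the 1-5 rankings Listening Group members gave per service.

    `ratings` maps a service name to the list of individual rankings collected
    during one session; each average is the value that would be marked on that
    service's 'leg' of the Spidergram."""
    return {service: round(mean(scores), 1) for service, scores in ratings.items()}

def priority_services(averages, threshold=3.0):
    """Services averaging below the threshold are flagged as priority concerns."""
    return sorted(s for s, avg in averages.items() if avg < threshold)

# Hypothetical session data, not actual Oxfam figures.
session = {
    "water points": [4, 5, 4, 3],
    "latrines": [2, 3, 2, 1],
    "bathing cubicles": [3, 4, 3, 2],
}
averages = spidergram_averages(session)  # {'water points': 4.0, 'latrines': 2.0, 'bathing cubicles': 3.0}
concerns = priority_services(averages)   # ['latrines']
```

Connecting the averaged marks by hand on the printed web achieves the same visual effect; the sketch only shows that the underlying arithmetic is a simple per-service mean.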



Accountability Reports: build momentum and involve senior staff
Oxfam currently uses the Survey CTO mobile data-gathering tool to collect feedback at field level (rather than KOBO Toolbox, an open platform that could potentially expose private data). The team is in the process of upgrading its mobile feedback system, to move away from paper forms and to support better data analysis and visualization.

Oxfam staff noted two important challenges: 1) embedding accountability to communities into the organizational culture and norms, and 2) effectively demonstrating to staff why accountability matters. One purpose of the Accountability Reports is to signal to staff that the MEAL and senior management teams are using the feedback gathered, and in turn to motivate staff to support accountability commitments.

"It helps having Accountability Reports. It helps that [staff] see that we read the feedback! We also keep positive feedback and use this too." – Oxfam MEAL Coordinator

Oxfam has been receiving significantly higher rates of feedback over the last few months. Staff reported they are very happy with the regular Accountability Reports, released every month, which provide a visual for macro trends across multiple data points (see Figure 5).

Upgrade of Mobile Feedback System

Figure 5. Oxfam Accountability Reports

The mobile data collection tool, Survey CTO, is used during post-distribution monitoring (PDM), when Oxfam receives a significant amount of concrete feedback on the most or least useful items, what communities need more of, registration and distribution processes, safe access to services, vendors, Oxfam staff behaviour, etc. Only MEAL staff at Oxfam are designated to collect feedback during PDMs, which are completed rapidly, three days after each distribution, allowing changes to be made before the next distribution. Planned improvements to the Accountability Reports include visual presentation of feedback by sector and direct referral channels in Survey CTO, so that sector leads receive feedback specifically related to their projects.
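The PDM workflow above, separating sensitive complaints from routine feedback and tallying routine entries by category for rapid trend review, can be illustrated with a minimal sketch. This is hypothetical Python, not Survey CTO's actual implementation; the keyword list, field names and category labels are invented for illustration, and a real safeguarding protocol would be far more careful than keyword matching.

```python
from collections import Counter
from dataclasses import dataclass

# Illustrative keyword list only; real safeguarding triage needs trained staff review.
SENSITIVE_KEYWORDS = {"abuse", "exploitation", "harassment", "staff conduct"}

@dataclass
class FeedbackEntry:
    category: str          # e.g. "distribution process", "item usefulness" (invented labels)
    text: str
    sensitive: bool = False  # set by the enumerator at collection time

def triage(entries):
    """Split PDM entries into routine feedback and sensitive complaints.

    Sensitive entries would be routed confidentially to safeguarding focal
    points rather than into the shared feedback registry."""
    routine, sensitive = [], []
    for entry in entries:
        flagged = entry.sensitive or any(
            keyword in entry.text.lower() for keyword in SENSITIVE_KEYWORDS
        )
        (sensitive if flagged else routine).append(entry)
    return routine, sensitive

def trend_counts(entries):
    """Tally routine feedback by category for a rapid trend summary."""
    return Counter(entry.category for entry in entries)
```

The design point is simply that sensitive reports never enter the shared registry or the trend tallies; they take a separate, confidential path, mirroring the referral-channel improvements described above.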



C. Christian Aid Bangladesh
The Christian Aid (CAID) team is driven to improve internal processes for reviewing and responding to feedback in order to ensure quality of programming. The organization's leadership has expressed a high level of interest in more, and better, inter-agency coordination at both senior management level (strategy) and camp level (implementation). CAID is using community feedback to motivate peer agencies within and across camps to coordinate collective responses to urgent issues. On average, Christian Aid receives 30 feedback entries a week.

Initiating an inter-agency response to feedback at camp level
Several months ago, CAID heard from a Community Mobilizer (community-based volunteer) that an adolescent boy needed special services to accommodate his physical disability. This was flagged within CAID's database and referred to their Site Management focal point in Camp 15, who immediately spoke to the CAID Accountability team in Cox's Bazar. The newly developed service map of the camp showed there were no implementers, such as Handicap International, able to accommodate his needs. What's more, service providers like Handicap International were not yet authorized to operate within the camp, and the boy could not be transferred to another camp, as this would have raised flags with the local government officials (the Camp in Charge, or CIC) who monitor movement of Rohingya residents in and out of the camp (for security reasons and because there have been instances of kidnapping from the camps). As of October 2018, one month after receiving the feedback, CAID had helped Handicap International receive approval to operate in Camp 15 and provide appropriate services to the boy. The CAID Accountability Team described this as a successful example of coordination to provide appropriate services, despite the hurdles of approval and the restrictions on moving the boy outside the camp.
Block Development Committees
CAID regularly engages members of committees established in the camps and has found this to be one of the best ways to receive and use feedback. Block Development Committees (BDCs) are a new governance system that works as an alternative to the Majhi system, which was put in place by the Bangladeshi Army after the influx of refugees in August 2017. Majhis are not elected leaders, so some camp residents do not feel Majhis represent them, and the system does not work well for women. BDCs were therefore formed with elected Rohingya members to reach more people in the community, to deliver services, and to receive adequate feedback. BDCs were also established to empower women: 50% of BDC members are women. BDC members go door to door to collect feedback, using voice recorders, and report any urgent issues to the Community Mobilizers or the CIC officers. BDC members meet once a week to discuss feedback, complaints and other issues they have observed. CAID Accountability Community Mobilizers also attend these weekly meetings, listen for feedback and complaints, and refer these to the Site Management team.



Service mapping
In Camp 15, which CAID manages, a visual map was recently developed to show which services are provided by which implementers in the camp. Though it may seem a simple tool, the CAID team described it as critical to helping them use feedback. They have already used it to coordinate with other agencies providing services and to refer people to those service providers.

5. Areas for Further Attention and Improvement
DEC case study organizations were all optimistic about future steps for improving feedback processes, and felt momentum was building for improving internal systems as a way to also improve the quality of services and programs. However, conversations with staff in the three case study organizations identified key areas that require attention, which were validated as shared issues by other participating organizations at the learning workshop in Cox's Bazar.

A. Collecting useable and actionable feedback
All case study organizations raised the challenge of collecting feedback from women and adolescent girls. For example, one organization recently reviewed feedback trends over several months and realized that, in this period, they had not received any feedback from young girls. Many agencies are concerned by this, particularly since young girls were responsible for collecting firewood before community kitchens and adequate fuel distributions were established, and there were instances of rape and assault by host communities.13 It is possible there are continued risks that go unreported by young women out of cultural or personal considerations. The experiences and communication preferences of girls and young women need to be better understood by aid organizations.

The language barrier remains a consistent challenge. Many field staff come from across Bangladesh, and some are not familiar with the Rohingya language. Where voice recorders are used, it often takes a long time to transcribe a complaint or piece of feedback from Rohingya into Bangla or English and then decide on the actions needed; this process may take several days, with information potentially getting lost in translation. Organizations hope to hire more Rohingya-speaking staff; until then, the language barrier will remain a key challenge.
During the diagnostic conversations that were part of this initiative, participants indicated that staff capacity needed further development in order to improve feedback utilization. The staff hired for Cox's Bazar positions are experienced humanitarian and development professionals, but working in an emergency setting is relatively new to many of them. Some organizations pointed to staff capacity to respond in an emergency as needing further attention in order to improve feedback sharing and utilization. One of the case study organizations is currently updating its staff hiring and capacity-building processes.

13. Ground Truth Solutions, "Bulletin #3 – Safety and Outlook," August 2018.



B. Using feedback internally
The three organizations are currently developing their feedback tracking or feedback registry processes, to document when feedback is received, the type of feedback, and how it is responded to. Even as these processes mature, however, ensuring that feedback reaches the right channels and individuals to inform decision-making within the agency will remain a challenge, especially while staff turnover remains high.

All case agencies have a process for prioritizing urgent feedback, and some also regularly conduct 'spot checks' of volunteers at field level to determine whether feedback is used properly, depending on the type of issue raised. However, field staff are not always using these processes in consistent ways. This poses a challenge, as it makes it difficult at field level to know what to report and how to note the timing of reporting and response. A lot could be getting lost between field level and central level (particularly if/when working with implementing partners).

Given the intense nature of this work, one challenge mentioned was high staff turnover, which affects information-sharing processes between central and field offices. One case agency has paid attention to its human resources processes and has updated its hiring and staff capacity development procedures. Across all agencies, however, more time and space may be needed for sharing information at central and field levels. Closing the loop, internally and with communities, remains a challenge: both staff and community members want to hear about decisions made in response to feedback and complaints, both urgent and non-urgent.

C. Feedback for donor accountability versus for program improvement and learning purposes [organisational culture]
There are many reasons why feedback is important in reporting: internally for MEAL teams and staff or externally to donors, for identifying program gaps and future needs, for informing future strategy development, etc. DEC member organizations clearly demonstrate a commitment to collecting feedback in order to fulfill both internal and external reporting requirements and information needs, as well as to foster learning and program improvement. However, some staff at the case study organizations, at both central and field levels, could not tell from their job descriptions or documented internal processes what their responsibilities and the expectations of them were in relation to using community feedback. In other peer organizations, some staff at times felt hesitant to mention that they had 'failed' to use feedback, as this was perceived as a personal failure rather than a missed learning opportunity to improve services to communities. It was often unclear what type of 'learning culture' existed within organizations, and whether staff felt encouraged to use and reflect on feedback for program improvements on top of their day-to-day responsibilities, or whether this was consistently encouraged and expected as part of those responsibilities.



D. Inter-agency communication channels and sharing
Staff reported challenges with inter-agency communication channels for information sharing, though they also mentioned different resources available to address these challenges. Issues raised by field staff were validated during process mapping and discussions with central teams, and point to a significant challenge in using feedback that concerns other service providers. Several staff members mentioned that, at times, dynamics between agencies serve as a barrier to sharing important community feedback. At field level there also seems to be some confusion about who provides what service, making it difficult to refer feedback that is not within the immediate remit of the agency that received it.

Poor inter-agency coordination poses a risk to feedback providers, especially in the case of urgent and unresolved complaints. When agencies are not able to effectively coordinate their response to feedback (or effectively use shared complaints channels), the burden falls to the feedback provider to advocate on their own behalf until appropriate action is taken.

E. Sharing feedback with external stakeholders
A significant challenge, mentioned by all case study organizations and at the final learning event hosted for DEC members in Cox's Bazar, is that community feedback often cannot be acted upon because teams do not have approval from the Camp in Charge, or CIC (government administration), to move ahead with necessary changes to programming. This trend is concerning, as it suggests that significant amounts of feedback that are challenging to respond to may not be shared properly within or across organizations, and may instead be lost at field level. Host communities are also providing feedback, which is being used by case study organizations; however, it was unclear how often (and in what ways) the loop is being closed with host communities.
One promising practice was shared by the CARE team during the London learning event, where a CARE WASH team member described how complaints from host communities about water shortages and a lack of latrines were raised to senior management and resulted in a program re-design that allowed CARE to construct wells for host communities. Senior management recognized the increased tensions between refugees and host communities, and after local government officials raised the issue with the aid agencies, CARE gained the momentum to make the decision based on several information sources, including host community feedback. As of September 2018, CARE was planning to add 18 more host communities, with 36% of its WASH assistance going to host communities.

A note on safeguarding
Understanding possible safeguarding concerns, or urgent protection issues, was a priority for this engagement, as DEC member agencies remain deeply committed to addressing problems of this nature. Case study organization staff were asked how their agencies handle serious incident reports, including complaints of a sensitive nature, which require confidentiality. During focus group discussions, community members were asked how comfortable they felt disclosing incidents of harm, even suspected or alleged ones, to agencies, in particular if the perpetrators



are aid workers or volunteers themselves. There is certainly more work to be done in this area to understand the different nuances, and DEC members have made a commitment to progress this work collectively, in particular within those camps where they are providing complementary assistance. The main challenges raised are covered below.

All agencies are investing in, and requesting further support to design, appropriate systems for women and adolescent girls to report abuse and exploitation. Staff, including participants at the London learning event, noted that gender-based violence and domestic violence are increasing, and that frontline staff are also at risk. In Cox's Bazar, aid agencies were concerned that they are not receiving reports about serious incidents affecting their target population, and that this is symptomatic of wider issues: stigma for the survivor of such an incident; lack of confidence that any response would be provided (given the widespread abuse that some of the population suffered or witnessed before fleeing Myanmar); and the fact that such information is more likely to be shared through other channels, meaning more must be done both to strengthen early detection and to support third-party reporting.

• Over the course of focus group discussions with women in the community, it became clear that information does not always flow first to agencies, even when aid workers may be involved. There are several other channels women and young girls may choose to use to report serious incidents, which poses additional challenges to agencies trying to receive and act on this information. Women may disclose information to local Majhis (male community administrators appointed by the Bangladeshi army), or directly to their husbands or other family members. Agencies are struggling with this, as, for example, there have been notable instances in which Majhis neither respond to complaints nor refer them to appropriate channels where they can be addressed.

• As sensitive complaints may not always be reported to agencies, staff need to be even more skilled and well trained in detecting when sensitive issues exist even though they are not being reported (e.g., asking women in focus groups to speak about stories they have heard in other communities, which makes the conversation less personal while still revealing what types of sensitive concerns community members are aware of). It is all the more important, too, for agencies to be sure staff understand reporting processes when serious reports are disclosed.

• Lastly, when agencies do receive sensitive complaints, two kinds of challenges were mentioned. Internally, staff may not know how this information is used within the agency (usually by protection or GBV teams), what decision was made, or whether the loop was closed with the community member. Though this is understandable given the confidentiality considerations linked to serious reports, there remains some desire for more transparency internally, especially to know if and when the loop is closed. External challenges were described as well: for example, during the Cox's Bazar learning event, one staff member told a story about a woman needing to be relocated to another camp after a sensitive incident; the paperwork and administrative hurdles of coordinating with local government made this process extremely challenging for the agency and the community member



involved. Of note, agencies did not mention any instances or examples of serious reports involving men or young boys.

6. General Recommendations
These recommendations are drawn from engagement and consultations involving DEC member charities, including the final workshop in Cox's Bazar, but they are likely to resonate with, and be applicable to, most aid providers in Cox's Bazar. They serve as options to be discussed and refined by staff of organizations working in Cox's Bazar, and with senior management at country level. To implement the ideas and recommendations in this report, a participatory process involving MEAL and frontline staff, partners and community representatives may allow organizations to identify the most appropriate and effective ways to meet the communication and feedback needs of camp residents and the information needs of program staff and senior management. Tailored options for improvement have been separately prepared and submitted to the three DEC members who volunteered as case study organisations in this initiative.

A. To improve collection of useable and actionable feedback
1. HR and senior management should consider the balance of male to female staff, as female community members will most often choose to disclose sensitive information to other women.

2. Hire staff with demonstrated experience in community engagement, listening and feedback collection; ask the human resources team to inquire about applicants' past experience with qualitative data gathering and utilization during the hiring process.

3. Some organizations are designing hotlines answered at site management level (e.g., Oxfam). Several staff mentioned that women (regardless of age) may not provide feedback in the presence of others (including other women) for a variety of reasons; hotlines available in private, e.g., in a private consultation room where callers can remain completely anonymous, may therefore be one option.

4. All case study organizations noted that some of the best ways to get feedback from women are one-on-one conversations. Another option may be more one-on-one visits to women and private consultations offered in women-friendly spaces (WFS).

Training Rohingya enumerators (e.g., community volunteers) to understand what 'quality information' is, and ethical methods for collecting this type of feedback, or using existing community capacity to gather it, is one option some agencies are attempting. This should include 1) defining with them what 'quality information' includes (i.e., rich information that demonstrates the nature of the problem) and 2) training Rohingya enumerators to conduct surveys and to 'listen' for community concerns and urgent needs.


5. In June 2018, Translators without Borders released a glossary for WASH sector projects in Bangladesh, recognizing that Rohingya men and women, with different public and private experiences, will use different language to describe similar concepts or scenarios. Use these resources!

6. Provide feedback collection skills training (e.g., on how to ask open-ended questions, how to ask good probing and follow-up questions rather than only pre-determined survey questions, how to make sense of contradictory feedback and interpret qualitative feedback data, and how to work with translators).

7. Conduct workshops for staff on the CHS guidelines, their practical application and how they apply to the Rohingya response (e.g., "How would you explain duty of care? What exactly does this concept mean to you? To your organization?").

8. Provide psycho-social support for staff and partners at field level (to help manage stress and cope with receiving high volumes of sensitive information).

B. To improve use of feedback internally

9. Ensure feedback collection processes, templates and registries are accessible to all staff, e.g., one central Excel sheet/feedback registry. However, not all staff should see sensitive complaints or data that must be protected per informed consent policies; centralized feedback registries should not include sensitive information, as this could pose harm to communities. Ensure that the feedback registry/database includes the method used to collect the feedback and some description of how it was responded to, including how central and field level coordination made the response possible (as this also demonstrates how loops were closed internally).

10. Clearly define and report on sensitive/non-sensitive feedback, by sector, and initiate systems to support field level (e.g., a color-coded prioritization function in Survey CTO; Oxfam GB has recently used Survey CTO in this way and could be a thought partner to others). For all SEA/safeguarding concerns raised by community members, there should be a very clear protocol explained to and followed by all staff, even drivers. SEA/safeguarding protocols should follow a survivor-centered approach, meaning survivors' identities are protected and cases are kept confidential, especially to protect the integrity of possible (criminal or non-criminal) investigations.

11. Include a standing agenda item during all-staff meetings to share feedback trends (e.g., recurring issues or feedback that staff find particularly challenging to respond to). Some feedback relating to partner misconduct or serious quality concerns can be regularly reviewed as part of program quality and senior management meeting agendas; other recurring issues related specifically to WASH, health or other technical sectors can be discussed at team meetings. The CXB-based team can use a separate regular (e.g., monthly) all-staff meeting for follow-on conversations about how to address complex challenges in responding to, and using, feedback.



12. Consider coordinating with the human resources team to assess whether it is valuable to require staff to commit to the agency for a longer period of time (e.g., six months minimum), which can help maintain good feedback practices for longer.

13. Require outgoing staff to leave a short transition plan explaining how they currently collect, use, or share feedback, and what challenges they observe and/or suggest new staff in the role should consider.

14. Share stories (via email, SMS, meetings between field and central level staff, and regular reporting) that describe how feedback was used to inform decision-making and how the community was informed of decisions.

15. Create short Frequently Asked Questions (FAQ) sheets, translated and posted on bulletin boards in community and service provision areas as well as in field offices; these can include examples of how feedback was used to improve programming. Field staff need this information as much as community members do, and it supports consistent messaging and information provision.

16. To close the loop internally and with communities, one option is to build in more frequent visits by senior staff to field level. These may already occur regularly; however, reserving time during these visits specifically for discussing feedback and decision-making with field staff, and then with communities, would be one option to consider.

17. Program managers in CXB should share regular updates with staff about the status of outstanding issues and include these in the updated FAQs (if using them).

C. To use feedback for program improvement and learning purposes

18. Case study organizations could include a small "Fail Fest" as part of an all-staff gathering to encourage an exchange of examples of programming and feedback processes that, in retrospect, did not go as well as they could have. This could be a fun, light-hearted way to create an environment in which staff feel free to learn and make mistakes without feeling personally culpable. These sessions would work best if moderated by the MEAL team and carried out in separate groups of senior staff and other (including field) staff.

19. Clearly outline in job descriptions or onboarding procedures how staff will be assessed on their performance in relation to community engagement, listening and using feedback, and the resources available to them if they feel, or have been told, they are underperforming.

20. Conduct after-action reviews (consider including these as part of PDM surveys).

21. Develop case studies documenting changing community perceptions of the quality of programming and the responsiveness of case study organizations to community concerns, to motivate staff to continue to learn and improve the services they provide.



D. To improve inter-agency communication channels and sharing
22. Inadequate inter-agency communication and coordination poses a risk to feedback providers, as feedback may be lost, misplaced, or take significant time before action is taken. This places the burden on feedback providers to advocate for themselves, rather than on the aid system to advocate for and better serve communities. Aid agencies should be cognizant of this risk and commit to prioritizing improved inter-agency communication channels and coordination to the extent possible.

23. The site management role is critically important for coordination and for ensuring feedback is circulated to the right people to make decisions and respond to complaints at field level. One option would be to capture and document 'successful coordination' stories from camp management teams, or from other camps perceived to be managed well. These could be circulated among staff and with other agencies to showcase good coordination processes, providing examples of cases in which human dynamics were managed well and a variety of coordination approaches and channels were used to respond to feedback quickly.

24. Improve the protocols and processes for internal and external referrals, for example by creating a visual site map that is regularly updated (for an example, see the Christian Aid site maps recently developed in Camp 15).

25. Aid agencies that have not yet conducted a Do No Harm (DNH) dividers/connectors analysis are highly recommended to do so, in order to identify potential sources of tension within camps and between camp residents and host communities. Members that conducted a DNH analysis at the start of programming should ensure it is frequently monitored and updated (e.g., monthly). This type of analysis (developed and applied frequently by CDA and other partners) helps agencies recognize that assistance has the potential to reduce or exacerbate intra-group and inter-group tensions. The DNH tool enables an organization to understand the context in which it is operating, understand the interaction between the intervention and the context, and act upon that understanding in order to avoid negative impacts and maximize positive impacts on the conflict; this practice is referred to as "conflict sensitivity." The DNH analysis may reveal important positive relationships between DEC case study organizations (or other aid agencies!), the host community and Rohingya community members that are already helping to reduce tensions and enable better coordination and feedback sharing.

26. Establish a group of DEC members to meet regularly and discuss feedback among themselves and with the wider sector, asking, for example: What similar feedback do we all receive across different camps? How do we interpret it together, and what macro trends can we observe across camps? What are the gaps, i.e., what additional information do we consistently need to make decisions? What are the shared challenges for collecting and using feedback, and what are the different agencies' proposals for steps forward? This type of coordination could eventually lead to complementary services and technical expertise. In the context of the DEC, it would capitalise on how member charities and their partners are finding ways to collaborate and leverage one another's experiences and expertise for a more efficient, coordinated approach.


27. Case study organization representatives who attend cluster meetings should add a standing item to the cluster meeting agenda to share feedback trends of relevance to peers.

28. Hold rapport-building activities within camps (e.g., an inter-agency "fail exchange": come together, share what is not working and why).

29. Arrange member visits to different camps, hosted by managing agencies, with the intended purpose of sharing what is working well in coordinating and sharing feedback among agencies, and what the ongoing or complex challenges are. For example, CARE manages Camp 16 and might host Oxfam and CAID to come and learn how the camp is managed, how feedback is shared, and what creative solutions exist for engaging multiple stakeholders so that action is taken in response to complex or challenging feedback.

30. Related to 29 above, consideration of collaboratively conducting inter-agency Listening exercises during these field trips. The purpose would be jointly listening to community concerns and, afterwards, discussing each agency’s interpretation of overall community wellbeing.

E. To improve feedback sharing with external stakeholders

31. Better coordination among agencies to speed up problem-solving and/or better coordination with government. For example, CARE faced resistance from the Camp-in-Charge (CIC) when seeking to make necessary changes to a WASH program (specifically, to find a way to de-sludge latrines). In this instance, CARE staff found creative solutions by collaborating with other agencies operating in the same camps: the CIC had no objection to CARE contracting an agency that specializes in de-sludging to come and de-sludge CARE’s latrines. This collaboration resolved the community-level issue quickly and aligned with the CIC’s expectations about the quality of service provided. CAID offers a similar example of coordinating with Handicap International.



7. References

Aelbers, Stijn, Viviane Lucia Fluck, and Jyoti Rahaman. “Humanitarian Feedback Mechanisms in the Rohingya Response, Cox’s Bazar, Bangladesh.” Internews, 2018.

Bonino, F., with I. Jean and P. Knox Clarke. “Humanitarian Feedback Mechanisms: Research, Evidence and Guidance.” ALNAP Study. London: ALNAP/ODI, 2014. Page 29.

Buchanan-Smith, Margie, and Shahidul Islam. “Real-Time Evaluation of Communicating With Communities Coordination: The Rohingya Response.” UNICEF, Communication and Community Engagement Initiative (CCEI), 2018. http://www.cdacnetwork.org/tools-and-resources/i/20181126144951-mbr81

BBC Media Action, Internews, and Translators without Borders. “What Matters? Community Feedback Summaries from the Cox’s Bazar Response.” 2018.

CDA Collaborative Learning Projects. Issue Paper on Feedback Utilization: Modeling & Routines. 2016. Unpublished, available upon request.

CHS Alliance and ICRC. “How Change Happens in the Humanitarian Sector.” Switzerland: CHS Alliance, 2018. https://www.chsalliance.org/har

Core Humanitarian Standard on Quality and Accountability. 2014. https://corehumanitarianstandard.org/files/files/Core%20Humanitarian%20Standard%20%20English.pdf. Page 14.

Ground Truth Solutions. “Cox’s Bazar Bulletins: #1 Needs and Services; #2 Feedback and Trust; #3 Safety and Outlook.” 2018.

HAP Standard in Accountability and Quality Management. 2010. https://www.chsalliance.org/files/files/Resources/Standards/2010-hap-standard-in-accountability.pdf. Page 6.

Holloway, Kerrie, and Lilianne Fan. “Dignity and the Displaced Rohingya in Bangladesh: ‘Ijot is a huge thing in this world’.” HPG Working Paper. London: ODI, Humanitarian Policy Group, 2018.

Jean, Isabella. “Beneficiary Feedback: How We Hinder and Enable Good Practice.” London: Bond, 2017.



Annex 1: DEC Members Feedback Utilization Survey Conducted in August 2018

1. In your opinion, does your organization put enough emphasis on the use of community feedback? [Yes/No + fill in the response]
   a. If yes, what has helped to ensure this sufficient emphasis?
   b. If not, what are the reasons for lack of emphasis?

2. Which of the following types of feedback does your organisation receive? Select all that apply:
   a. Feedback that the organization proactively seeks through surveys, focus group discussions, consultations and listening sessions, etc.
   b. Feedback that is passively sought through suggestion boxes, hotlines, etc.
   c. Feedback that is received in an ongoing manner for which the organization has processes for recording and responding to such input
   d. Feedback that is received in an ongoing manner for which the organization does not have processes for recording (passed on verbally/informally)
   e. Don’t know

3. What categories does the feedback you receive fall into? Please rank according to frequency/prevalence:
   a. Day-to-day operational/implementation feedback related to services and aid items
   b. Reports and concerns about staff misconduct and financial misconduct
   c. Reports about harassment or sexual exploitation and abuse
   d. Strategic-level feedback about program design, relevance of programming, targeting
   e. Feedback about the work of other entities and authorities, not applicable to your organization

4. To what extent do current standard operating procedures for handling and responding to sensitive complaints meet the safeguarding obligations of your organization?
   a. Our current procedures for handling sensitive complaints are consistently applied and well understood by all relevant staff
   b. Our current protocols for handling sensitive complaints are well designed but not consistently applied and not understood by all staff
   c. Our current protocols for handling sensitive complaints need review and revision to fully meet our safeguarding obligations
   d. Our protocols are not being applied / are non-existent

5. To what extent do current standard operating procedures for handling and responding to sensitive complaints meet the cultural, protection and legal needs of the people reporting harassment and SEA?
   a. Our procedures for handling sensitive complaints were designed with input from a representative sample of the population we are serving, including those most at risk
   b. Our procedures were designed by internal PSEA experts and/or consultants based on industry standards and have been communicated to staff and community members
   c. Our procedures are not well understood in the community and we are concerned that cases of misconduct are not reported due to confusion, uneasiness or fear of retaliation
   d. Our procedures require significant revision to meet the protection needs and legal requirements in cases of SEA and other serious misconduct

6. What team(s) in your organisation effectively use feedback and routinely address community concerns, comments and suggestions?
   a. Program teams in the field
   b. Technical specialists / program support teams at HQ
   c. M&E/MEAL team
   d. Admin and Operations: HR/Audit/Finance/Procurement/Security/Logistics



   e. Senior Management and Leadership
   f. Other ___________ (please fill in)
   Open question: In your opinion, what enables their effective feedback utilisation practice?

7. How is feedback in your organization recorded/organized? Select all that apply:
   a. The feedback is stored in its original form (i.e., survey form, a note, an email, etc.) in a central location
   b. It is compiled into a central list or spreadsheet
   c. Meetings are held to prioritize feedback before compiling
   d. Feedback reports are written to summarize feedback on similar topics
   e. Other (specify) ___________

8. How do you aggregate/summarize/make sense of the various feedback your organisation receives? Select all that apply:
   a. Large team meetings where we discuss the feedback
   b. A single individual is in charge of doing this
      i. If selecting this option, please indicate level and role of the person
   c. A small team is responsible
   d. Other (specify) _____________

9. How is feedback communicated to the relevant personnel in your organization? Select all that apply:
   a. Email
   b. Conference call
   c. In-person meeting
   d. Report
   e. Don’t know

10. How are decisions made regarding how to use feedback? Select all that apply:
   a. They are made on the spot in real time
   b. They are made after the team discusses ways forward
   c. They are made after the team discusses and compiles feedback
   d. They are made after the team discusses, compiles feedback, and consults again with beneficiaries
   e. Don’t know

11. Describe a process you have seen to be effective in internal utilisation of community feedback at your organisation. E.g., specific action plans made to address/incorporate the feedback; feedback prompting additional data collection to understand the issue and address it by modifying program design or implementation plan; senior management review and use of feedback for strategic purposes. [Open-ended question]

12. Please write in the top 3 topics related to feedback utilisation that you would like to discuss, learn about, and reflect on with your peers during our workshop on September 7th.
   a. …
   b. …
   c. …



Annex 2: DEC Member Survey Responses [n = 22; selection of data below]

In your opinion, does your organisation put enough emphasis on the USE of community feedback?

[Chart: Yes/No responses]

Open-ended responses and reasons given for lack of emphasis:

• the sheer amount of information that response teams have to handle, analyse and make decisions based on;

• lack of willingness to adapt programming approaches in general (not just specific to using feedback);

• limited incentives to prioritise analysis and use of feedback (varies from country to country; in some, members of the leadership team do try to create incentives for this, but it is largely down to people rather than systems/organisational culture);

• attitudes of some staff who still perceive humanitarians as the “experts” and do not recognise the value of information from affected people.

Promising reasons why the emphasis is growing:

• “The programme teams have a close relationship with the community and this is apparent in the adaptations made at community level.”

• “The emphasis on using (not just collecting and analysing) feedback is felt and championed within the MEAL team, but there is more work to be done on permeating this attitude through programmatic teams (globally, not just in Cox's). We have really improved on actively seeking, recording, and actioning feedback more systematically, and in many countries, feedback does influence programmatic changes, but perhaps not at the scale or rate it should be.”

• “It is a key commitment within our Quality Standards and ways of working and a key part of our CHS commitment. It is tracked at all levels from the field up to senior management.”



Which of the following types of feedback does your organisation receive?

• Feedback that the organisation proactively seeks through surveys, focus group discussions, consultations and listening sessions, etc.: 21

• Feedback that is passively sought through suggestion boxes, hotlines, etc.: 17

• Feedback that is received in an ongoing manner for which the organisation has processes for recording and responding to such input: 15

• Feedback that is received in an ongoing manner for which the organisation does not have processes for recording (passed on verbally/informally): 14

What team(s) in your organisation effectively use feedback and routinely address community concerns, comments and suggestions?

[Bar chart; response counts of 21, 17, 13, 9, 7 and 1 across the six options in survey question 6; option labels not recoverable]



How is feedback in your organisation recorded/organised?

• The feedback is stored in its original form (i.e., survey form, a note, an email, etc.) in a central location: 14

• It is compiled into a central list or spreadsheet: 13

• Meetings are held to prioritise feedback before compiling: 13

• Feedback reports are written to summarise feedback on similar topics: 7

• Other: 6

How do you aggregate/summarise/make sense of the various feedback your organisation receives?

• Large team meetings where we discuss the feedback: 15

• A single individual is in charge of doing this: 11

• A small team is responsible: 8

• Other: 4



How is feedback communicated to the relevant personnel in your organisation?

[Bar chart; options: Email, Conference Call, In Person Meeting, Report, Don’t Know; response counts of 18, 17, 17, 3 and 2]

How are decisions made regarding how to use feedback?

[Bar chart; options: on the spot in real time; after the team discusses ways forward; after the team discusses and compiles feedback; after the team discusses, compiles feedback, and consults again with beneficiaries; Don’t Know; response counts of 17, 12, 11, 11 and 3]



Annex 3: Number & Type of Participants

Number of DEC “Case Agency” Participating Individuals

Central-level offices (Cox’s Bazar):

• KIIs (key informant interviews): 13 staff members

• Process mapping + focus group discussions: 4 group discussions / 22 staff members (who did not also participate in KIIs)

Camp-level offices (Camps 15, 1E, 1W, 16, 19):

• KIIs: 8 staff members

• Focus group discussions (women): 4 group discussions / 52 women community members

• Focus group discussions (men): 3 group discussions / 61 male community members

Total number of staff, volunteers, and community members consulted: 156 individuals

Peer-to-Peer Learning Events (total participants):

• London: 20 participants

• Cox’s Bazar: 23 participants


