> Posted by Nadia van de Walle, Senior Africa Specialist, the Smart Campaign
The Smart Campaign secretariat does a lot of things – managing a Certification program, providing technical assistance, developing and promoting industry standards, and conducting research. Our small team is always putting on different hats, and we joke about trying to explain our jobs to friends. At the end of the day, the one thing many of our friends can understand is that we are an industry-facing organization offering a “public good.” The Smart Campaign’s public good is not a road or a lighthouse; it happens to be standards and guidance on protecting clients. These standards are a public good because they belong to everyone, and one individual’s or institution’s use does not reduce their availability for others.
Some of our ever-thoughtful friends then ask if this means that we contend with other classic public goods challenges.
The answer is yes, absolutely. One of the biggest issues we struggle with is the lack of a market feedback mechanism. Industry stakeholders can use Smart Campaign tools and resources without paying, and thus without providing feedback on their experience. Without a price signal, it can be difficult for staff to assess demand and user experience, which makes it hard to know how to tailor, expand, or improve our offerings. We are curious to hear examples from readers of how similar organizations consistently improve their offerings without market feedback.
We must also contend with a welcome challenge: the diversity of Smart Campaign participants. The material is translated into dozens of languages and used in very different national contexts, by people in varied types of organizations, in dissimilar roles, and with a range of knowledge backgrounds.
We continue to think a good measure of success is the number of clients worldwide served by institutions that have worked to apply the Client Protection Principles. For instance, we know that the number of clients served by certified MFIs totals roughly 8 million. We also learn which tools appeal most through website hits and material downloads, and of course through conversations at conferences, in public forums, and in private discussions.
Recently, the secretariat has worked to improve a publicly available tool that is central to the Campaign: the Client Protection Assessment Guide. A client protection assessment is a thorough review of an institution’s practices as they relate to client protection, and it is the first step in the process of becoming Smart Certified. The Assessment Guide is one of the Campaign’s key documents, and many financial institutions have told us that it has been of tremendous value to them. It is a comprehensive diagnostic tool that financial institutions use to thoroughly examine their implementation of the seven Client Protection Principles. It serves as a starting point and ongoing metric for institutions seeking to protect their clients, and it can help them protect themselves from default and overindebtedness while also enabling them to improve their product portfolio, client relationships, and public reputation. Assessments also reveal common institutional strengths and weaknesses and regional or linguistic differences, which helps the secretariat learn more about the challenges and opportunities institutions face so we can adjust the tools accordingly.
Yet, all those advantages aside, the Assessment Guide is a public document that can be continually improved, and we have found that the key to improving it is direct user feedback. Recently we noticed that demand for assessments was falling, and when we spoke with institutions we realized there was substantial confusion about the role of client protection assessments versus social performance management tools. As more social performance tools have become available, organizations have had more options to choose from. Additionally, the previous version of the Assessment Guide did not address institutions’ most frequent questions, provide adequate process guidance, or make the business and social performance case for assessments. Moreover, some institutions could not find experts locally available to serve as assessment consultants. Given these questions and doubts, they were not convinced that the outcomes were worth the costs.
In the latest version of the Guide we tried to address these issues. We expanded some sections, reorganized the overall flow, and adjusted the language to be more user friendly. We created a video, PowerPoint presentation, and one-pager to orient users to the Guide. We also prepared almost 30 annexes to provide institutions and technical assistance providers with templates, checklists, and recommendations.
We also responded to the expressed need for alternatives to full assessments conducted by external consultants. The new Guide provides more information about self-assessments, including the accompanied self-assessment, which involves off-site guidance to assist institutions in assessing their own practices. We also added a section at the beginning of the Guide making the business and social performance case for doing assessments, and a section at the end on what to do after the assessment to implement the findings. Finally, we continue to develop the local market for assessors by offering assessor training and refresher courses and by working with associations and consultants to serve as local resources.
An assessment is the first step for an institution seeking to meet the global standard of client protection practices and receive Smart Certification. It should be accessible and achievable for all financial institutions.
It is challenging to improve, disseminate, and support such a public tool and its related information goods. If the industry is to have an assessment tool that is as sharp and useful as possible, we, the secretariat, need feedback from users – institutions, consultant assessors, and others throughout the industry.
Please take a look at the Guide and tell us what you think!