
APQC Communities of Practice Survey Findings Reveal Design and Strategy Gaps Organizations Must Close to Make CoPs Effective and Relevant
Surveys, even when well-crafted, reveal only what respondents already believe to be true. They do not surface what participants don’t know, can’t imagine, or have yet to encounter. That means survey results should be treated as guideposts, not verdicts—signals that reveal where organizations are confident, where they are uncertain, and where blind spots are quietly shaping future challenges.
The APQC Communities of Practice (CoP) survey offers a useful snapshot, but its greatest value lies in the conversations and adjustments it can inspire.
To move from insight to action, it’s essential to treat survey findings as invitations for deeper inquiry, fostering discussion across silos and hierarchy to clarify what real capabilities look like in context. Rather than accepting surface-level responses or static snapshots, organizations should use this data as a launchpad for targeted interventions, defining the characteristics of genuine expertise, tying activity more closely to demonstrable outcomes, and rigorously validating perceptions through tangible evidence. This approach transforms the survey from a mere reporting exercise into a dynamic tool for strategic learning and continuous improvement.
Below, I respond to each major finding, identifying what’s missing and offering recommendations informed by my own work in knowledge management, CoPs, AI, and organizational design.
A PDF of the full APQC report (COMMUNITIES OF PRACTICE IN MODERN ORGANIZATIONS) can be found here.
I will refer to the slides by the numbers within the PDF. I will not reproduce the content as it belongs to the APQC.
My analysis begins with Slide 5.

APQC Communities of Practice Survey Findings: A Slide-by-Slide Commentary
Slide 5 — Organization’s Familiarity with KM
This slide sets the tone for the respondents more so than offering any insight into knowledge management (KM) or its practices. It is important, however, to note that self-reported familiarity is not the same as capability.
People often confuse exposure—such as attending a KM workshop or occasionally accessing a shared drive—with practice. Internally, organizations should define capability based on contributions to knowledge assets, decisions informed by documented insights, or demonstrated reuse of existing work. A KM capabilities heatmap by function will reveal where expertise is genuine and where it’s aspirational. This should be validated through artifact reviews and post-mortems on decisions, tracing them back to the knowledge sources that informed them.
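As a sketch of what such a heatmap could look like in practice, the snippet below rates each function/capability cell from verified evidence counts rather than self-ratings. The data shape, field names, and the threshold are my assumptions for illustration, not a prescribed method.

```python
from collections import defaultdict

# Hypothetical evidence records: (function, capability, verified artifact count)
evidence = [
    ("Sales", "contribution", 14), ("Sales", "reuse", 2),
    ("Engineering", "contribution", 40), ("Engineering", "reuse", 22),
    ("HR", "contribution", 3), ("HR", "reuse", 0),
]

def heatmap(records, threshold=10):
    """Rate each function/capability cell from evidence counts.

    A cell is 'genuine' only when verified evidence clears the
    (assumed) threshold; otherwise the capability is 'aspirational'.
    """
    cells = defaultdict(int)
    for function, capability, count in records:
        cells[(function, capability)] += count
    return {k: ("genuine" if v >= threshold else "aspirational")
            for k, v in cells.items()}

for cell, rating in sorted(heatmap(evidence).items()):
    print(cell, rating)
```

The point of the design is that a cell can only turn "genuine" through artifacts, not opinions, which is exactly what artifact reviews and decision post-mortems would validate.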
Slide 6 — Perceived KM Maturity
Given the rapidly changing environment for knowledge, including artificial intelligence and organizational structures that often place expertise in the minds and hands of partners and contractors, the very idea of maturity in any function, let alone knowledge management, is a challenging one.
As with the previous slide, a self-assessment of maturity is highly misleading. A maturity model must correspond to measurable business impact. Whether your organization is ‘developing’ or ‘optimizing’ matters less than whether KM maturity reduces decision latency, improves win rates, accelerates onboarding, or decreases defects. Those more detailed ideas may roll up to an idea like innovation, but the proof of innovative acts and their success should be the determinant, not a non-empirical perception of innovation.
Internally, tie each stage, however those stages are divided or named, to a specific set of metrics and track them relentlessly. If a stage produces no measurable change, revisit the activities in that stage—they may be ceremonial rather than impactful.
Slide 7 — Organization’s Familiarity with CoP
Rather than familiarity with Communities of Practice, it would be better to combine this slide with richer questions in Slide 8 that would demonstrate effective CoP capabilities rather than familiarity.
Regardless of an integration with Slide 8, it should be noted that Communities of Practice are not monolithic. Generalization makes it difficult to discern the “familiarity” of one CoP over others. A standards-focused community has very different rhythms, governance, and measures of success than one dedicated to innovation or onboarding.
General familiarity does not guarantee capability for all types of CoPs. Organizations should classify communities, for instance, as problem-solving, standards, innovation, or onboarding, and design governance, facilitation, and funding rules accordingly. This avoids applying a one-size-fits-all model that undermines the specific needs of each type.
Familiarity would be more useful by type of CoP.
Slide 8 — Perceived CoP Maturity
Rather than familiarity, the effectiveness of CoPs is more useful. Ask questions that get to the capability of CoP types, and offer insight into which types of CoPs are working best. Lower-level questions would get to areas like governance, facilitation, motivation, and business alignment.
Self-assessed maturity is of limited value without evidence of sustained impact. A practical way to measure this is through a reuse ratio: the percentage of solutions, patterns, or artifacts adopted outside the originating team. Also, track whether these practices survive leadership changes. Look for evidence of resilience rather than maturity, including the resilience of influence in times of business or market flux.
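The reuse ratio described above can be computed from any artifact-tracking data. The sketch below assumes a minimal record shape (`origin_team`, `adopting_teams`); those field names are hypothetical and would need to map onto whatever your repository actually tracks.

```python
def reuse_ratio(artifacts):
    """Share of artifacts adopted outside the team that created them.

    `artifacts` is a list of dicts with 'origin_team' and
    'adopting_teams' (assumed field names for illustration).
    """
    if not artifacts:
        return 0.0
    reused = sum(
        1 for a in artifacts
        if any(team != a["origin_team"] for team in a["adopting_teams"])
    )
    return reused / len(artifacts)

sample = [
    {"origin_team": "A", "adopting_teams": ["A", "B"]},  # reused by B
    {"origin_team": "A", "adopting_teams": ["A"]},       # never left A
    {"origin_team": "C", "adopting_teams": ["D"]},       # reused by D
]
print(reuse_ratio(sample))  # 2 of 3 artifacts reused -> ~0.67
```

Tracking the same ratio across leadership changes would give the resilience signal the text calls for: a practice that survives a sponsor transition keeps its reuse ratio; a ceremonial one sees it collapse.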
Slide 9 — CoP Organization Model
This is a question that clearly requires multiple answers and a revelation of how different CoP models exist within the same organization. In workshops to improve CoP effectiveness, understanding the spectrum of models would make for a good opening discussion.
Slide 10 — CoP Strategy Alignment
Alignment needs to be more than a checkbox. Every CoP should be defined by an Objective and Key Results (OKR) that measures learning, and another tied to business performance. Quarterly value stories—short narratives connecting these OKRs to actual outcomes—make alignment visible and tangible, reinforcing the CoP’s role in business success. An internal survey should not measure perceived alignment but effective alignment based on rolled-up performance.
In addition, CoP strategy should not align with a KM strategy or a business initiative, but to the business strategy, of which KM (or some other initiative) acts as a means of achieving the business strategy. Organizations must, then, clearly define their business strategy and draw clear lines between the expectations for communities and the objectives of the business.
Slide 11 — CoP Oversight and Sponsorship
Oversight without role clarity results in slow decisions and inconsistent support. A sponsor charter should define budget authority, decision rights, and escalation paths, while setting expectations for removing barriers. Rotating sponsors every couple of years prevents complacency, introduces fresh energy, and keeps sponsorship aligned with evolving business priorities.
This survey question would also benefit from the acceptance of multiple answers that would add to the metadata about CoPs across organizations, rather than a “best fit” answer by a respondent within the constraints of the choices. Organizations must understand the range of oversight roles and other governance mechanisms, as they are likely to vary, and require different interactions, incentives and practices depending on the CoP’s chosen model.
Slide 12 — CoP Program Responsibility
Again, a mix of answers would be useful here, showing matrix responsibilities which are not uncommon as business units work with KM teams to define need and reap the rewards of collaborative work.
Although the survey conflates functional area and business unit, organizations should be cautious about CoP responsibility that resides entirely in business units, which often results in the loss of cross-community coherence.
Organizations should consider a lean platform community, one responsible for taxonomy, analytics, facilitation standards, and community tools to ensure interoperability across CoPs while respecting local autonomy. This team becomes the connective tissue between otherwise isolated communities. Done well, members of other communities join this community, which turns it into a community-of-communities that self-governs approaches to meet business needs.
Slide 13 — CoP Goals
Slide 13 suggests a disconnect between the strategic objective of CoPs and their perceived goals. Each of these areas should include a comment that says, “to what end?” Fostering collaboration and sharing across geographies is great, but is it useful in supporting a particular business objective, or many? If so, which ones and how?
The same questions can be asked of all items on this list.
I was instructed decades ago that being busy is not the same as being productive, and it certainly isn’t the same as supporting a business objective. Ambition without focus dilutes impact.
Consider limiting each CoP to three core goals, creating space for depth rather than breadth. Each goal should produce a tangible artifact—be it a playbook, decision log, or tested pattern. This kind of activity creates a clear link between community work and organizational value. It also kick-starts discussions about how to improve the artifact, and what else may be needed to support a community’s knowledge needs.
A CoP is a place people come to learn. That is true. But what they learn is just as important as nurturing the learning.
Slide 15 — Requirements to Launch a CoP
This slide initiated my work on this post. From what you have read to this point, it should be clear that CoPs require a business need to launch—not a priority, but an objective. They need to be about something initially, as a rallying point for action. They can certainly evolve, and should, but at launch, they need purpose.
Once they have purpose, they can organize. This list offers some hints. I think the APQC has deep materials on how to launch a CoP, as each topic on this list requires more than a title. My post on Cultivating Communities of Practice: A Practical Guide covers many of these topics and suggests ways that AI can play a role in CoP development and management.
Just as important as launching a CoP, communities should also be easy to stop when they are no longer delivering value. Define exit triggers—such as two consecutive quarters of participation decline or failure to produce valuable artifacts—and act on them. This prevents dormant CoPs from consuming resources and attention that could go to active, high-value communities.
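Exit triggers like these are easy to make mechanical so they actually get acted on. The sketch below encodes the two triggers named above; the trigger definitions and data shapes are illustrative assumptions, and real thresholds should be set per community.

```python
def should_retire(quarterly_participation, artifacts_per_quarter,
                  decline_quarters=2):
    """Flag a CoP for a retirement review when exit triggers fire.

    Assumed triggers: `decline_quarters` consecutive quarters of
    participation decline, or zero artifacts in the latest quarter.
    """
    p = quarterly_participation
    declining = (
        len(p) > decline_quarters and
        all(p[-i] < p[-i - 1] for i in range(1, decline_quarters + 1))
    )
    no_output = bool(artifacts_per_quarter) and artifacts_per_quarter[-1] == 0
    return bool(declining or no_output)

print(should_retire([40, 31, 22], [3, 2, 1]))  # two declines -> True
print(should_retire([20, 25, 30], [1, 2, 3]))  # growing, producing -> False
```

A flag here should start a conversation, not an automatic shutdown; the point is that dormancy becomes visible instead of quietly consuming attention.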
Slide 16 — Roles in a CoP
Opinions should not matter when it comes to roles in a CoP. All of the roles outlined on this slide reflect good practice. That any of them are missing puts a community at risk. That does not mean that the roles need to be dedicated, full-time, but it does mean the accountability for the role should be clearly understood by the community.
Some of the roles in this list, such as content manager and librarian, can be conflated. Formal leadership at only 58%, however, suggests alternative models of leadership that are informal, but my experience suggests a community cannot survive with informal leadership. It can certainly adopt various forms of leadership, but any of those forms needs to be formalized so that the community understands its operating principles.
In the best CoP models, communities act as extensions of management, often being delegated responsibility for decision making, at least on technical or operational matters within their purview. The role of liaison to management is missing from this list. Again, it need not be a single person with that accountability, but it is a crucial role, one that makes CoPs integral members of an organization’s strategic operations rather than a peripheral, and therefore less important, support structure.
Other roles missing from this list include an Outcome Steward, a role responsible for ensuring that discussions translate into actionable outputs. And with the rise of AI, a Pattern Editor could work hand-in-virtual-hand with AI to curate and refine community contributions into reusable knowledge assets.
Slide 17 — Skills for CoP Leaders
Facilitation and passion are critical, but communities that deliver sustained value are run like products. Teaching CoP leaders backlog grooming, experiment design, and metrics literacy allows them to shape roadmaps, track impact, and pivot based on evidence—not just enthusiasm.
A skill missing from this list is collaboration, which works internally, as well as among communities. Good CoP implementations share knowledge about the practical management, nurturing and sustaining of CoPs in addition to their domain accountabilities.
Slide 18 — Participation Methods
While 59% of respondents selected “Promoting a culture that encourages and supports participation,” my entire book, Management by Design, belies the notion of promoting culture as an effective means of, well, promoting culture. Culture is derived from policy and practice, from the use of the space (virtual and physical) and the integration of technology. It needs to be designed, not promoted. I see culture as an abstract term with too much baggage to be meaningful. Organizations need to focus on the levers that create the “culture” they desire rather than asking employees to figure out what they mean by “culture.”
Further, encouragement is not enough if people have no time to participate. Managers should swap out low-value status meetings for community sessions, effectively granting ‘permission to participate.’
An integration of CoP activity into the workstream signals that CoP involvement is legitimate work, not extracurricular activity. Job descriptions and performance reviews also need to acknowledge the role of CoPs in the work of individual participants. If they do not, CoPs are at best considered positive extracurricular activities, and at worst, extra work for which they are neither recognized nor compensated.
Slide 19 — Critical Success Factors
There should be one overwhelming Critical Success Factor for CoPs, and it is not on this list: business impact. Business impact can be defined in many ways, from cost savings to revenue generation, but without it, a CoP cannot tie itself to the organization’s strategy, its forward progress in its industry.
Alignment to business objectives is not enough. A CoP dedicated to salespeople should not be a community that gives salespeople a place to commune with other salespeople. Sales is a domain, it is a discipline, and it continues to evolve as new tools emerge and as buyer perceptions require new tactics. If the Sales CoP does not result in the adoption of new practices that increase sales, then the CoP, regardless of its ability to meet any of the success factors listed here, is not contributing effectively to the organization’s performance.
Slide 21 — Program Tenure
A long-lived CoP is not necessarily a healthy one, nor a mature one. A well-designed CoP can be “mature” almost immediately, whereas a long-lived one may have devolved into a functional club.
Annual reviews should assess participation trends, the quality and reuse of artifacts, cross-functional reach, leadership resilience, and business impact. Communities that survive but fail to thrive should be restructured or retired. In fact, a healthy metric on CoP tenure should also include the number of retired CoPs, which would demonstrate internal recognition of changes in the needs of the business and its employees.
Slide 22 — Number of Active CoPs
Simple counts mean very little. Multiple communities may unknowingly be solving the same problems. A census—mapping topics, membership, and outputs—reveals duplication. Merge overlapping CoPs to concentrate energy, reduce redundancy, and scale outcomes.
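One concrete way to surface duplication in such a census is to compare communities' topic sets pairwise. The sketch below uses Jaccard similarity with an assumed 0.5 merge-candidate threshold; the community names and topics are invented for illustration.

```python
def topic_overlap(cop_a_topics, cop_b_topics):
    """Jaccard similarity between two communities' topic sets."""
    a, b = set(cop_a_topics), set(cop_b_topics)
    if not (a or b):
        return 0.0
    return len(a & b) / len(a | b)

# Hypothetical census data: community name -> topics it works on
census = {
    "Cloud Guild": {"kubernetes", "cost-control", "observability"},
    "Platform CoP": {"kubernetes", "observability", "ci-cd"},
    "Sales Enablement": {"pricing", "demos"},
}

# Flag pairs above an (assumed) 0.5 similarity threshold as merge candidates
names = sorted(census)
for i, x in enumerate(names):
    for y in names[i + 1:]:
        sim = topic_overlap(census[x], census[y])
        if sim >= 0.5:
            print(f"possible duplicate: {x} / {y} ({sim:.2f})")
```

Membership and output overlap could be scored the same way; high scores on all three are a strong signal that two communities are solving the same problems without knowing it.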
Slide 23 — Time Spent on CoP Activities
It’s not the hours that matter, but what those hours produce. Track how long it takes for a discussion to result in a reusable artifact or meaningful practice adoption. Communities that consume time without producing assets or business impact should adjust their approach.
I have worked with engineering CoPs, for instance, that measure their success based on the magnitude of cost savings. The CoP is a place to bring ideas. For those who don’t have something to share, the time is limited. If, however, an engineer discovers a problem and a solution that can be applied to multiple other instances, then the community buzzes with activities as peers realize the impact and implement the new approach to mitigate risk and reduce costs.
Time spent should be measured against value creation, not as a percentage of a work week.
Slide 24 — CoP Subject Areas
Some of the most valuable communities are in underrepresented areas—like compliance or emerging markets—where decisions carry high stakes. Build a subject-risk map and deliberately start communities in these blind spots.
As I pointed out in my APQC talk (documented in detail here) on the need to apply KM to AI, none of those topics are represented on this list. At this point in the evolution of AI in the enterprise, they all should be very active. Their absence from this list suggests an abdication of that domain to IT, which is likely ill-equipped to manage those issues on its own.
Slide 25 — Budgets
“Unsure” is an unacceptable answer, with “as needed” nearly as unacceptable. CoPs require budget; what this chart tells me is that most organizations have not defined good practice for CoP funding.
Most importantly, the budget should be a baseline of the amount of time individuals can allocate. CoPs represent a complementary organizational structure. No budget, the structure fails. People must be allocated time, which is a proxy for budget, to figure out what the CoP can do.
Like any structure within an organization, the CoP should then be accountable for requesting funds to deliver business results. It should define the proposed value of the results, and request a budget for time, capital, services and other elements that would allow the community to deliver on its proposal.
CoPs should be rewarded (also a budget item) for producing reusable assets, conducting cross-community events, or delivering innovations that include a measurable business impact. That is the crux of the knowledge management challenge for CoPs. If they want budget, they should be able to document their results.
Of course, it can be hard to document results when many CoPs leverage serendipity (see my work on the Serendipity Economy here), but even serendipity, which may not be able to predict time-to-value, can nonetheless document value when it does occur. CoPs, more so than other structures within an organization, need to pay attention to the provenance of their value realization, because they can so often be separated in time from initial value creation events.
Organizations should consider micro-funding to keep resources tied to outcomes rather than activity. Fund based on a combination of the most recent value realization and anticipated return from current work.
Slide 26 — Connection & Involvement
Voluntary participation works best for CoPs, and being part of performance goals, or set as a role expectation, does not make a CoP required. What it means is that if one does not participate, then one is opting to disregard a performance measure (if everyone met all of their performance measures at all times, no one would ever receive a below-“excels” evaluation).
As for how people meet, the list is as informative as the percentages. What CoPs within organizations need, and this will vary by CoP, is a list of the best ways to solve particular problems. Virtual meetings, for instance, can be just talking heads yammering at each other, or they can be active meetings driven by tools like Miro that facilitate deep knowledge capture and offer design and documentation tools exceeding those available in most physical spaces.
A CoP meeting should be designed (see my work on meeting design here).
Slide 27 — Meeting Cadence
Communities should meet as often as their problem space demands. If decisions and deliverables emerge monthly, meet monthly; if quarterly, adjust accordingly. Use asynchronous tools to prepare discussions so synchronous time is focused on action. CoPs should base their cadence on value cycle time, not habit.
Slide 28 — Active Participation
If CoPs provide a combination of professional development through peer interaction, and are held accountable for discipline/domain practice implementation and related policy recommendations (along with space use and technology adoption), then anything less than 100-percent participation is too low.
I say this because those who choose not to participate also choose not to grow within their discipline (at least within the bounds of the opportunity afforded to them by their organization), nor do they choose to actively shape the future of their own role, and of those who will assume that role in the future.
Now, there are caveats to this based on the perceived respect for the CoPs. If CoPs are not delivering on expectations, either personal or organizational, they will likely find much lighter affiliation, as the activities of policy setting and role growth either do not exist or are served through some other mechanism. If that is the case, CoPs need to reflect on their value and their label.
A CoP has a particular function within an organization; a community that serves some other function, such as sharing industry news or talking about the latest vendor issues, is something else. If the goal is camaraderie and not business results, then the community may be one of interest or passion, but it is not a Community of Practice.
Further, high participation rates mean little if the same few voices dominate. Measure unique contributors, spotlight first-time voices, and rotate facilitation to surface different perspectives.
Slide 29 — CoP Content
This question would be much better as a multiple-choice question that results in a matrix showing if a CoP exists in a rationalized collaboration environment, or a diverse one. Organizations need to understand that perception, by CoP.
Many organizations, for instance, use Slack and Microsoft, which includes Teams and SharePoint.
CoP Content needs to answer several questions, including these:
- If all of those are available as locations for CoPs to share, collaborate or share knowledge, how do individual workers know where to look for the system of record?
- Do all CoPs share in one technology, or is it up to the CoP to choose?
- Do they apply the same practices for metadata?
- The same indexing engines?
- Do they measure effective use of content, not just its storage?
- Does the repository include feedback mechanisms?
- What is the CoP turnaround on suggested changes or improvements to content? Do they apply version control to retain previous versions that may need to be referenced in the future?
The question of content should be more about the design of the content management experience than one of the location where that content resides (assuming the repository meets the design criteria for the content maintenance features).
Slide 30 — AI in CoPs (Deployment Stages)
This slide suggests that a maturity model exists for AI. It does not. There is no sense of what “optimizing” AI means at this time. Optimization will likely never be achieved, as it will always be a moving target.
From an advisory standpoint, any organization not implementing AI for CoP use is suboptimizing the current technology. Much of AI does not require formal implementation like systems of the past. Bringing AI into a discussion about engineering is implementation, even if it doesn’t provide all the right answers. AI can be a partner in many discussions.
For deeper integration, such as knowledge chatbots, categorization, pattern recognition, auto-tagging, and other features, those capabilities may require evaluation or piloting.
AI is not just one thing. This question would be better phrased to look at various aspects of AI use, rather than to treat it as a monolithic technology.
Regardless, the “adoption” of AI should be 100% at this point. How the output gets used remains at the discretion of the CoP, but that any CoP is not considering it is irresponsible.
Slides 31–32 — AI Trends Snapshot
These two slides deserve a lot more visual mapping and organization to make the findings pop out. I would like to see the security and compliance concerns listed, and a much richer view of the tools that were available for selection in the question prompt.
As for the “Snapshot of Emerging Trends,” I avoid trends, as I think they are momentary, and in lieu of statistical validation, not really trends. This is, however, a snapshot. Much of the value of this slide is lost in its narrative construction. A table with “Activity” and “AI Use” would be easier to read. And for any of them, the answers beg other questions, such as:
- How long did it take to implement?
- Did you need to implement more than once, given changes to LLM models?
- Did AI deliver on its promise? What was the business impact for AI versus legacy approaches like search?
- Are you using more than one AI tool for the same thing? Do they conflict? Do you use them collaboratively, adversarially?
Slide 34 — Change Management
Change management is only important to CoPs if they are accountable for change, meaning practice or policy—that they are viewed as part of the management infrastructure of an organization.
If they are not seen in that way, then they do not need to worry about change management because they aren’t changing anything.
If they are, however, accountable for change, then they need to follow good change management practice, which extends beyond the topics listed on this slide, as many of these ideas are sub-topics to “Follow a structured change management approach.”
Those accountable for change should consider applying work experience design. Remove barriers like scheduling conflicts, make next steps obvious, and leverage social proof to build momentum intended to meet the goals of the change (all changes require different designs).
Slide 35 — Rewards & Recognition
I find this slide disconcerting, as it reveals that 50% of the respondents work in CoP environments that are likely not just suboptimal, but may actually be costing the organization more than any value put in (which would likely be coming from voluntary, uncompensated participation). It would be better for those organizations to find an alternative way to meet whatever goals CoPs were established to support.
As with all other aspects of CoPs, they benefit from design, meaning rewards, recognition, and other compensation are tied to outcomes, not to participation.
Slide 36 — Adoption/Participation Assessment
My research suggests that any type of “counting” measure says more about activity and less about impact. CoPs should consider the following metrics to determine the adoption of their outputs rather than the engagement of their communities.
- Changes to business practice that result in cost savings (reduction of process, adoption of automation, etc.), employee retention/attraction, revenue generation (including practices that improve sales results and “innovations” that significantly enhance the perceived value of products or services), or other measurable business impact.
- Risk mitigation/cost avoidance that results in mistakes not being made/repeated or human/capital costs associated with accidents.
- Brand equity: thought leadership that leads to perceived value of the organization, and therefore enhances capital availability through stock price or other investment mechanisms.
Activity measurements can prove useful for community management, but they do not reflect the value of the community.
Slide 37 — Value Assessment
Again, a 46% result that reports NA or “We do not formally assess the overall value of our CoP” says to me that 46% of participants in the survey are saying their organizations do not value their CoPs. The only value-based measure on this survey question is “Number of problems/issues solved per time period due to participation in CoP” at 26%, an attribute of CoP that should be its primary purpose.
Top Takeaways from my APQC CoP Assessment
Overall, organizations should be applying design thinking, measuring business-value results, and asking CoPs to take on accountability and responsibility as complementary organizational structures.
This survey reflects deep missed opportunities at a moment when CoPs should be indispensable. In an era of accelerating AI adoption, economic uncertainty, and political volatility, organizations need structures that empower people, connect them across silos, and give them agency in shaping their own disciplines. CoPs are uniquely human systems, capable of turning tacit knowledge into shared capability and aligning expertise with business imperatives.
What this survey reveals instead is that too many CoPs lack the design, the accountability, and the integration with business strategy necessary to thrive. Without clear purpose, evidence-based measures of value, and the authority to influence practice and policy, CoPs risk becoming marginal activities rather than essential levers of performance. The moment demands that leaders treat CoPs not as optional social spaces, but as complementary organizational structures—designed with intent, resourced with time and budget, and held accountable for delivering outcomes that matter.
CoPs should be drivers of adaptation, not victims of change, actively shaping how organizations respond to disruption, absorb new technologies, and sustain the human connections that make innovation possible.
For more serious insights on AI, click here. For more serious insights on KM, click here.
Did you enjoy this post on APQC Communities of Practice Survey Findings? If so, like it, subscribe, leave a comment or just share it!
Thanks for mentioning our research survey, Dan. I want to clarify for you and other readers that this is just one new piece of survey data in an entire body of research, insights, and tools on communities of practice that APQC has been doing for more than two decades. The piece you mentioned is hot off the press this year, so your insights on the raw data are much appreciated. We at APQC are just getting started in creating additional research content that will include more insights, tools/templates and case studies from member organizations. In fact, we had a webinar last week with Stan Garfield as a visiting expert to share his insights into our latest research.
Here is a link to the recording:
https://www.apqc.org/resource-library/resource/unlock-power-communities-practice-research-insights-and-expert/html
As a reminder, APQC members can access all our research on communities, and those who are not members still have access to some “free” information, along with our webinars and blogs. In addition to our body of research on all things KM, we have training throughout the year and our annual conference each spring in Houston. There is a call for speakers RIGHT NOW for APQC Connect 2026. I would love to have KM enthusiasts and experts check it out or submit to share a story with us.
Check out other available resources below!
Research collection on communities of practice (contains both member-only and public resources):
https://www.apqc.org/resource-library/resource-collection/km-essentials-communities-practice
Upcoming Webinars:
https://www.apqc.org/events/upcomingwebinars
APQC Connect 2026:
https://www.apqc.org/events/annual-process-knowledge-management-conference
https://www.apqc.org/events/2026/apqc-conference/call-for-speakers
APQC Training:
https://www.apqc.org/resources/training