Community Exchange
In 2023, RadicalxChange experimented with a novel citizen engagement methodology in Colorado, in partnership with The Civic Canopy and Healthy Democracy. Building on several years of work developing best practices for the use of pol.is and Quadratic Voting in deliberation and collective decision making, and inspired by the successes of the citizens’ assembly model of engagement around the world, RadicalxChange sought to determine what advantages there might be in cross-pollinating the two methodologies.
The opportunity presented itself in a partnership with the Climate Preparedness Office (CPO), a newly created office in the executive branch of the state of Colorado. The CPO was tasked by a State Senate bill with creating a roadmap to coordinate climate preparedness policy across state government agencies. To invite citizens into the process, RadicalxChange assembled a representative sample of the state’s population to discuss climate preparedness policy and communicate their priorities to the CPO using the newly developed engagement model: “Community Exchange.”
Background
Popular citizen feedback formats such as town halls and stakeholder consultations have well-known weaknesses: they often fail to achieve synthesis between opposing views, and they tend to amplify voices that are not necessarily representative of the population. Citizens’ assemblies are the gold standard for representative, deliberative input, but they are typically resource-intensive, multi-day endeavors. The Community Exchange pilots in Colorado explored how tools like Quadratic Voting and pol.is can be harnessed to convene mini-publics that are much shorter in duration than traditional citizens’ assemblies and less costly to implement, yet still deliver more meaningful deliberation, synthesis across disagreements, and actionable inputs than town halls. The results were encouraging, and show that this model has the potential to greatly increase the bandwidth of communication between citizens and the governments that serve them.
The Community Exchange Model
RadicalxChange convened three separate citizen panels to test the Community Exchange model and gather rich input for the CPO. The same general structure was used in all three panels, with some modifications from one panel to the next:
Assemble a mini-public - Using sortition, select a panel of Colorado residents that is demographically representative of the state’s population; the panel convenes over Zoom.
Small-group discussions - Facilitate small-group discussions among the citizens on prompts designed to surface their personal concerns and priorities in the policy area. These discussions build trust and create a safe space for sharing concerns and ideas. They address information asymmetries and build shared understanding, making policy more transparent and participatory.
Pol.is conversation - Host a live pol.is conversation to capture a map of the landscape of perspectives in the group. Pol.is allows the group to efficiently crowdsource a wide array of comments in a fraction of the time it takes to brainstorm verbally, while simultaneously generating an analysis of the areas of consensus and points of tension (a minimal sketch of this kind of analysis follows this list).
Quadratic Vote - Hold a live Quadratic Vote, where panelists allocate a fixed budget of voice credits across the inputs they shared in the pol.is conversation. QV’s mechanism design internalizes the inherent tradeoffs between options and results in a clear, accurate representation of citizens’ collective priorities (the quadratic cost rule is sketched at the end of this section).
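Pol.is’s production pipeline is considerably more sophisticated, but the core of its opinion mapping can be illustrated with a toy example. The sketch below assumes a small participant-by-comment vote matrix and uses off-the-shelf PCA and k-means in place of pol.is’s actual algorithms; all numbers are invented for illustration.

```python
# Minimal sketch of a pol.is-style opinion map (illustrative only).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# Rows = participants, columns = comments; 1 = agree, -1 = disagree, 0 = pass/unseen.
votes = np.array([
    [ 1,  1, -1,  0,  1],
    [ 1,  1, -1,  1,  1],
    [-1,  1,  1,  1,  0],
    [-1,  0,  1,  1, -1],
    [ 1,  1,  0,  0,  1],
    [-1,  1,  1,  1, -1],
])

# Project participants into a 2-D opinion space.
coords = PCA(n_components=2).fit_transform(votes)

# Cluster into opinion groups (k chosen by eye here; pol.is selects it automatically).
groups = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(coords)

# Consensus score per comment: share of all participants who agree.
consensus = (votes == 1).mean(axis=0)

for i, score in enumerate(consensus):
    print(f"comment {i}: {score:.0%} agreement")
print("opinion groups:", groups)
```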
In each case, the panel discussed a broad range of climate preparedness issues and produced a detailed report of the group’s input, all in a single three-and-a-half-hour session.
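The defining feature of Quadratic Voting is its convex cost rule: casting n votes for one option costs n² credits from a fixed budget, so concentrating strongly on a single priority crowds out breadth elsewhere. A minimal sketch of ballot validation and tallying, with an illustrative 100-credit budget and invented ballots:

```python
# Minimal sketch of Quadratic Voting tallying (illustrative numbers only).
BUDGET = 100  # voice credits per panelist

def cost(ballot):
    """Total credit cost of a ballot: sum of votes^2 per option."""
    return sum(v * v for v in ballot.values())

# Each panelist allocates votes across ballot options A-C.
ballots = [
    {"A": 5, "B": 3, "C": 0},   # 25 + 9      = 34 credits
    {"A": 2, "B": 6, "C": 4},   # 4 + 36 + 16 = 56 credits
    {"A": 7, "B": 0, "C": 1},   # 49 + 1      = 50 credits
]
assert all(cost(b) <= BUDGET for b in ballots), "ballot exceeds credit budget"

# Collective priority = total votes per option across all ballots.
totals = {}
for ballot in ballots:
    for option, v in ballot.items():
        totals[option] = totals.get(option, 0) + v

for option, total in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(option, total)
```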
Conclusions
The Community Exchange model is a promising tool for gathering public input on which areas of state action matter most to residents. In particular, these pilots showcased the value of pol.is and Quadratic Voting as tools for synthesizing the viewpoints and convictions of a group of citizens, making the output of a single-session, online assembly more useful to policymakers.
Future experiments should explore how the session agenda can be revised to encourage even more collaboration among the panelists. When gathering citizens, every effort should be made to maximize the time spent in face-to-face conversation and collaborative work relative to the time spent silently interacting with online tools. That said, preserving some space for individuals to reflect silently and organize their thoughts often enhances the quality of group discussions.
Iterations and Learnings
Panel 1
The first panel was intended as a lightweight experiment to test out some of our assumptions in a lower-stakes environment. Recruitment standards were more relaxed to minimize cost.
Recruitment Design
- Panelists were recruited through the mailing list and Twitter reach of The Civic Canopy. Each panelist was offered $100 for their participation.
- From the pool of volunteers, a panel was selected by lottery using Panelot, an open-source sortition tool, according to demographic quotas proportional to the statewide demographics recorded in the most recent census (a toy illustration of quota-based selection follows this list).
- The target panel size was 35 Colorado residents.
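Panelot’s actual lottery handles many overlapping quota categories with fairness guarantees; the sketch below only illustrates the basic idea of proportional quotas on a single attribute. The attribute, census shares, and volunteer format are all assumptions made for illustration.

```python
# Naive sketch of quota-based sortition on a single attribute (region).
# Not Panelot's algorithm; real tools handle overlapping quotas and
# reconcile rounding so quotas sum exactly to the panel size.
import random

CENSUS_SHARES = {"urban": 0.60, "suburban": 0.25, "rural": 0.15}  # illustrative
PANEL_SIZE = 35

def select_panel(volunteers, panel_size=PANEL_SIZE):
    """volunteers: list of (name, region) pairs; returns a quota-balanced panel."""
    rng = random.Random()  # Panelot uses a verifiable lottery; plain PRNG here
    panel = []
    for region, share in CENSUS_SHARES.items():
        quota = round(panel_size * share)
        pool = [v for v in volunteers if v[1] == region]
        if len(pool) < quota:
            raise ValueError(f"not enough {region} volunteers to meet quota")
        panel.extend(rng.sample(pool, quota))
    return panel
```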
Recruitment Results
- Of the selected panelists, 30 responded to their invitation that they would attend; 22 actually attended.
- Our selection process resulted in a panel that was reasonably representative of Colorado’s demographics.
- During the panel, it became clear that many of the attendees were not residents of Colorado. This highlighted the need for a stricter verification process for prospective attendees, especially when a $100 participation reward is on offer.
- When designing verification methods, it is important to be aware of accessibility tradeoffs.
Facilitation Design
- The session began with an icebreaker conversation in small groups (Zoom breakout rooms).
- Re-assembling the full group in the main room, facilitators introduced the agenda and the subject matter.
- The panel then broke into small groups again for three rounds of discussions on the session topic.
- Round 1: Impacts and fears. This round was designed to get panelists talking about their personal experiences with climate change.
- Round 2: Positive futures. In this round, panelists were asked to visualize the future they would like to see.
- Round 3: Recommendations. Finally, panelists were asked to brainstorm recommendations for the state to consider when building policy.
- Panelists were then given a 15-minute break.
- Next, panelists were asked to participate in a live pol.is conversation, to capture and map the ideas they had discussed and continue the brainstorm.
- Facilitators presented the pol.is report to the full group, pointing out features such as opinion groups, areas of consensus, and areas of disagreement. The full group was invited to discuss what they found interesting in the report and to identify ideas they felt weren’t yet represented or fully developed.
- During this conversation, “backstage” facilitators selected a representative set of panelists’ statements from the pol.is report (including the highest-consensus statements, as well as some divisive statements) to form a QV ballot (a sketch of these selection criteria follows this list).
- Finally, panelists used QV to prioritize the recommendations generated by the group.
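The facilitators curated the ballot by hand; as a hypothetical sketch of the selection criteria described above, suppose each pol.is statement carries agree/disagree counts. All statements and counts below are invented for illustration.

```python
# Hypothetical sketch of the ballot-selection criteria the "backstage"
# facilitators applied manually: top-consensus statements plus a few of
# the most divisive ones. Statements and counts are illustrative.
def agreement(stmt):
    return stmt["agree"] / (stmt["agree"] + stmt["disagree"])

def divisiveness(stmt):
    # Closest to a 50/50 split = most divisive.
    return -abs(agreement(stmt) - 0.5)

statements = [
    {"text": "Fund wildfire mitigation", "agree": 18, "disagree": 2},
    {"text": "Subsidize home air filtration", "agree": 11, "disagree": 9},
    {"text": "Expand water storage", "agree": 15, "disagree": 5},
    {"text": "Restrict new development", "agree": 10, "disagree": 10},
]

by_consensus = sorted(statements, key=agreement, reverse=True)
by_division = sorted(statements, key=divisiveness, reverse=True)

# Ballot: top 2 consensus statements plus the most divisive one, de-duplicated.
ballot = list({s["text"]: s for s in by_consensus[:2] + by_division[:1]}.values())
for s in ballot:
    print(s["text"])
```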
Facilitation Results
- Facilitators executed the agenda very successfully in a 3.5-hour session.
- It was difficult to get the panelists talking; many even kept their cameras and microphones off. This was largely because many of the panelists were non-residents, as noted above.
- Panelists were perhaps not sufficiently introduced to core technical concepts. Citizen panels must either remain sufficiently broad as to be inclusive of all levels of subject matter expertise, or provide sufficient background education to bring all participants up to the requisite baseline for a productive discussion.
- Pol.is and QV were very effective at synthesizing the group’s contributions into concise and clearly ranked priorities to submit to the CPO.
- In Panel 1, panelists reported that they would have liked even more time to process and discuss the results of their pol.is conversation.
Panel 2 and Panel 3
Panel 2 was the first of two full-scale panels. Panel 2 and Panel 3 were both selected in one joint recruitment effort.
Recruitment Design
- The target panel size for Panel 2 and Panel 3 was 24 Colorado residents each.
- In order to meet demographic quotas while keeping the lottery genuinely random, the goal was to recruit a volunteer pool of around 400 Colorado residents, from which the final panels would be selected.
- A letter of invitation was sent out via USPS to about 19,000 randomly selected residential addresses in Colorado, using an online mailing service called Lob. These addresses were pulled from the National Address Database (NAD). Counties not represented in the NAD were filled in using public record voter roll data.
- The final panels were selected using Panelot.
- In Panel 2 and Panel 3, Healthy Democracy (HD) held one-on-one orientation calls with each panelist to introduce them to the technologies they would need to use in the session, ensuring accessibility. HD also provided backstage tech support during the sessions to ensure that all panelists could participate fully.
Recruitment Results
- Several hundred of the 19,000 invitations went undelivered or were returned to sender as a result of out-of-date records in the NAD and voter rolls.
- Fewer than 100 invitees responded to the letter, a response rate of about 0.5%. Our expected response rate was 2.5-3%. This may have been because the letter and envelope were not engaging enough.
- Panel 2 was selected from the ~100 volunteers; because the volunteer pool was small, Panel 2 missed some of its demographic targets.
- A follow-up postcard was sent to all addresses that had received a letter but not responded, in order to gather enough volunteers for Panel 3. The postcard was visually engaging and simple, in contrast to the letter. Around 60 new volunteers responded to the postcard, including members of key demographics that were slightly underrepresented in Panel 2.
- Panel 3 was selected from the augmented volunteer pool. Demographic groups that were underrepresented in Panel 2 were overrepresented in Panel 3 to achieve balance across the two panels (a toy quota calculation follows this list).
- For each panel, 24 panelists responded that they would attend. About 20 actually attended in each case.
- The pre-panel orientation calls and in-panel tech support provided by Healthy Democracy were essential to a successful online panel.
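One way to express that rebalancing: size each Panel 3 quota so that the two 24-person panels jointly match census shares. A hypothetical calculation (the share and count below are invented for illustration, not the actual quotas used):

```python
# Hypothetical sketch of rebalancing Panel 3 quotas so that the two
# 24-person panels are jointly representative. Numbers are illustrative.
def panel3_quota(census_share, panel2_count, panel_size=24):
    # Target count across both panels, minus what Panel 2 already seated.
    combined_target = census_share * 2 * panel_size
    return max(0, round(combined_target - panel2_count))

# e.g. a group with a 15% census share that got only 2 seats in Panel 2
# would be allotted 5 of Panel 3's 24 seats (0.15 * 48 - 2 = 5.2 -> 5).
print(panel3_quota(0.15, 2))
```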
Facilitation
Changes from Panel 1 to Panel 2
- A concerted effort was made to better orient the panelists to the technical realities of the CPO’s function at the beginning of the panel. This seemed to help focus the panelists’ conversations. A clearer indication of the intended scope of the panel appeared to give panelists more confidence to contribute. Providing a few examples of existing state policies in the relevant policy area also helped prime panelists for discussion.
- Panelists had two rounds of small group discussions instead of three:
- Round 1: Existing policies & communications. Panelists were asked what existing state policies they already knew about, through which communication channels they had heard about them, and which communication channels they preferred or would like the state to use. In Panel 2, these conversations were hampered by the panelists’ lack of confidence in speaking about existing policies. This showed the importance of encouraging panelists to speak from their personal experience and of establishing that no prior policy knowledge is expected or required. This stage was more successful in Panel 3 as a result of this change.
- Round 2: Recommendations. Panelists were asked to brainstorm recommendations for the state to consider when building policy.
- Panelists were given more time to interact with the pol.is conversation, and after the pol.is report was presented, they were sent back into small groups for further discussion in light of the information shown in the report. These discussions struggled in Panel 2 because the information in the report was overwhelming and panelists weren’t sure what they were supposed to be discussing. This showed that facilitators must re-focus the conversation on specific tension points or areas of confusion elucidated by the report. This stage was more successful in Panel 3 as a result of this change.
- Back in the full group, panelists were shown the ballot that had been constructed from their pol.is comments and were allowed to suggest amendments before prioritizing with QV.