Report from June 10, 2020 ICARP TAC Workgroup on Resilience

Agenda Item #4b
June 26, 2020 TAC Meeting
Report from June 10, 2020 ICARP TAC Workgroup on Resilience Metrics
Purpose: There is a critical need to develop a suite of outcome-based climate adaptation and resilience metrics that can help the state track progress over time. This is a role that ICARP can advance via the Technical Advisory Council and Adaptation Clearinghouse, as well as through its broader coordination role. For example, the current iteration of Safeguarding, the state’s climate adaptation strategy, serves as a valuable inventory of state adaptation programs and projects, but lacks a strategic framework to guide the prioritization of state efforts relative to climate resilience goals. Foundational to building this type of prioritization framework is the need for outcome-based climate adaptation metrics. In addition to feeding into Safeguarding, these metrics could help inform other state climate adaptation planning, assessment, and guidance processes.
OPR White Paper Synopsis
Ahead of the June 10 meeting, OPR released a preliminary draft white paper discussing resilience metrics in California. The draft document is available here for download, review, and comment. The preliminary white paper outlines the TAC workgroup’s goals, some of the resources available to arrive at those goals, and begins to develop a state of practice in resilience metrics.
In terms of preliminary conclusions, OPR staff found that resilience metric availability (the strength and number of data sources throughout the California ecosystem) is comparatively strong, and there is a rich backdrop of resources at the regional and city level across the state for the TAC’s consideration. Meanwhile, OPR staff also noted gaps in the state of practice and challenges in advancing this effort. The draft preliminary white paper also shares several potential metrics for natural, built, and social systems.
Speaker Synopsis:
Martine Schmidt-Poolman (California Energy Commission): CEC presented on three current efforts that could support the TAC’s resilience metrics goals, including sharing data methodologies, developing data for decision-making, and guidance and training:
1. Sharing data methodologies:
a. CEC’s most recent focus has been on bridging the gap between past work in climate data and forward-looking methodologies for calculating climate impacts. This first main area, sharing data methodologies, aims to put detailed scientific data into the hands of practitioners and scientists across the state.
2. Development of data and methodologies to be repackaged for decision-making:
a. Cal-Adapt is a publicly available tool that packages data from across the state. The website allows anyone to search for and use data for custom decision-support tools from a Fourth Assessment perspective.
3. Guidance and training: CEC assists stakeholders in finding resources and analyzing what resources are available and their best uses.
Carmen Milanes (OEHHA): OEHHA provided an overview of the periodically released Indicators of Climate Change Report as background for this discussion. The report focuses primarily on drivers of climate change and associated impacts, categorized by conceptual model. Due to the nature of the indicators used, most are backward-looking, featuring long-term trends in data captured over many years of dedicated research. Each of the metrics requires publicly available information and a commitment to multi-year tracking, involving scientific and technical expertise, and drawing on the experiences of over 70 collaborators across sectors. For the next Indicators report, OEHHA hopes to additionally incorporate indicators of how Native American Tribes have been impacted by climate change.
Dorian Fougeres (CA Tahoe Conservancy): The CA Tahoe Conservancy framed a discussion on lessons learned in Tahoe Basin, specifically around adaptation planning and vulnerability assessment processes. Lessons included:
1. Tahoe began its effort with design principles: what characteristics do we want this system (or systems) to have? What are California’s cities and jurisdictions already doing?
2. Tahoe integrated vulnerability assessments and the existing adaptation portfolio – around ecosystems, infrastructure, and communities, and paid particular attention to vulnerable communities.
3. Tahoe framed their processes through a lens of social-ecological resilience – purposefully not separating people from natural systems.
4. Tahoe also thought through the adaptive capacity of communities. How capable are communities of adapting? What steps can be taken to improve that capacity? Tahoe also considered a subset of the questions below:
a. Who is your system benefiting? Policymakers? Communities? Academia? Think through who the audience is for these metrics.
b. Resilience of what, to what? Resilience in the literature is centered on disturbances and responses. Are these resilience metrics centered only on climate change, or on other hazards as well?
c. What are your process indicators or indicators of success? Who does what? Who monitors what?
d. What is practicable? Balancing broadly applicable metrics with those that are narrowly specific can be difficult, particularly around multiple geographies.
5. Performance measures are important. What are the inputs and outputs, and what do they tell us about what to care about? Where should we go from here? The Natural Hazards Resilience Screenings Index is one good example; the Tahoe-Central Sierra Initiative's 10 Pillars of Resilience is a good starting point for considering an order of operations, highlighting the need for broad agreement first, followed by indicators and outputs, and then outcomes.
Adam Parris (NYC Mayor’s Office of Resiliency): Three major lessons from the NYC context include:
1. Avoid searching for the perfect metric: New York designed One NYC, a long-term resilience strategy, to organize our goals for sustainability and resilience. As part of this, we tried to identify what we’re trying to achieve with adaptation, which is especially difficult in a rapidly changing world, and when faced with a desire and opportunity to confront historic injustice.
2. The ends don’t always justify the means: NYC didn’t just focus on outcome-based indicators of progress, but also on process-based indicators to make sure NYC was bringing in partners and diverse stakeholders. This was a key framing point of the New York City Panel on Climate Change, a mayor-appointed board of researchers with expertise on the sectoral impacts of climate change.
3. Related to the previous lesson, every indicator represents a network of partners. Because of this, NYC not only needed publicly available information from years past, but also institutional arrangements for sustained data collection and reporting.
Draft Definitions
[These definitions come from the DRAFT White Paper on Resilience Metrics and are for discussion purposes. Over the coming months the TAC workgroup will discuss and refine these so they match the goals of the TAC in formulating resilience metrics.]
2018 Safeguarding states that metrics should be developed to track progress in:
• Changing Climate Conditions: Once key risks are identified, metrics should be identified to track the progress and occurrence of change.
• Resilience Outcomes: Metrics should be developed that track the performance of a plan or investment, both in terms of resilience to climate change and in meeting management objectives. Metrics should track proactive action taken by the state to enhance resilience, as well as the effect of past actions.
[DRAFT] Resilient Natural System Definition: “Natural systems adjust and maintain functioning ecosystems in the face of change.”
[DRAFT] Resilient Built System Definition: "Infrastructure and built systems withstand changing conditions and shocks, including changes in climate, while continuing to provide essential services."
[DRAFT] Resilient Social System Definition: "All people and communities respond to changing average conditions, shocks, and stresses in a manner that minimizes risks to public health, safety, and economic disruption and maximizes equity and protection of the most vulnerable."
[DRAFT] Indicator Definition: An “indicator” refers to a characteristic used to describe something.
• An indicator can consist of a process or a condition.
• However, given the difficulty of directly measuring many processes, for our discussions we propose (1) using the term “indicator” to refer to a site-specific condition at a given moment, and (2) that multiple indicators taken together (especially when measured over time) can approximate a process.
• Indicators can be Output or Outcome focused.
• Outcome-based metrics represent a specific, observable and measurable indicator of an outcome.
• An indicator can also be Process focused, measuring how well a set of indicators is utilizing partnerships and bringing in diverse stakeholders.
[DRAFT] Metric Definition: Measuring an indicator implies identifying an appropriate unit of measurement (a “metric”), and then creating or utilizing a corresponding data set. In some cases, an indicator and metric may be identical (e.g., trees per acre). And in some cases, complex indicators may combine multiple metrics and data sets.
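The indicator/metric distinction above can be sketched as a minimal data model. This is an illustration only, not part of the white paper; the class and field names (Metric, Indicator, and the example measurements) are hypothetical:

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class Metric:
    """A unit of measurement paired with its corresponding data set."""
    name: str            # e.g. "trees per acre"
    values: list[float]  # observations collected over time

@dataclass
class Indicator:
    """A characteristic described by one or more metrics."""
    name: str
    metrics: list[Metric] = field(default_factory=list)

    def summary(self) -> dict[str, float]:
        # Average each metric's observations; a complex indicator
        # combines several metrics, a simple one holds just one.
        return {m.name: mean(m.values) for m in self.metrics}

# Simple case: the indicator and its metric are effectively identical.
canopy = Indicator("urban tree canopy", [Metric("trees per acre", [42.0, 44.5, 47.1])])

# Complex case: one indicator combining multiple metrics and data sets.
heat = Indicator("community heat resilience", [
    Metric("cooling centers per 10k residents", [1.2, 1.5]),
    Metric("percent of homes with air conditioning", [61.0, 64.0]),
])
```

The sketch mirrors the draft definition's point that measuring an indicator means choosing a metric and a data set, and that complex indicators aggregate several metrics rather than a single number.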
Initial Feedback from TAC Members:
1. Several TAC members noted that we should be sure to enter this discussion from the correct altitude. One potential scale is the regional level, based on work by a variety of regions in California and the Regions Rise Together initiative.
a. Several TAC members noted that it is important to have standardized metrics across geographies, both State and Federal. TAC members also raised a concern about the degree to which resilience metrics can be scaled from local efforts to the regional or state level. Are those metrics the same irrespective of geographic scale?
2. This effort should set clear boundaries on what is and isn't included: not all metrics may have equal value. Identifying these boundaries and priorities may be a useful starting point.
3. Public understanding and perception metrics should be included.
4. We should critically examine whether the existing vulnerability assessment frameworks are useful for this exercise in resilience metrics.
5. It is vital to define our metrics and indicators from a broader lens of what we hope to accomplish by this effort. We should consider the definitions provided and dig more deeply.
6. We should include clear metrics around public health and environmental justice.
7. The three systems defined by the TAC in April seem appropriate. We should heavily focus on the social systems aspect: while the draft preliminary white paper emphasized the importance of human vulnerability and equity, there are not many indicators in the Safeguarding Appendix on these two issues. The California Department of Public Health has examples that might be useful in defining potential indicators around human vulnerability and equity.