External Peer Review Comments for EPA's Draft Vulnerability Assessment, A Novel Approach Using Expert Judgment, Volume II: Results for the Massachusetts Bays Program

Contract No.: EP-C-07-024
Task Order 112

Submitted to:
Jordan M. West
U.S. Environmental Protection Agency
Office of Research and Development
National Center for Environmental Assessment
Global Change Research Program
Washington, DC 20460

Submitted by:
Eastern Research Group, Inc.
110 Hartwell Avenue
Lexington, MA 02421-3136

October 7, 2011

Contents

Responses to Charge Questions
1. Do you believe that the "expert elicitation" method developed for this study was effective for assessing the sensitivities of ecosystem processes to climate change? If not, how could expert elicitation be more effectively used?
2. Is the level of detail and organization of the report useful to the scientific community as well as ecosystem managers? If not, how would you re-organize the report?
3. Does the report effectively:
3a. Provide sufficient background information on the estuary program?
3b. Explain the scoping process to select vulnerable ecosystem processes?
3c. Use conceptual models of ecosystem processes?
4. Please comment on whether the project steps were adequately described in the report and in detail appropriate for an ecosystem manager to begin to develop adaptation strategies. Please provide any recommendations for improvement.
5. Beyond the scope of this report and looking ahead to future work on adaptation to climate change, please comment on the following. This report presumes that to develop adaptation strategies, the first step is to identify system vulnerabilities and sensitivities. Do you agree?
5a. If no, what alternative method can you suggest for developing adaptation strategies?
5b. If yes, what is the most effective way of identifying those ecosystem characteristics that are most vulnerable to climate change, for deeper focus with sensitivity analysis?
6. Please provide any other comments or recommendations that you feel would strengthen the document.
Additional Reviewer Comments
Donna M. Bilkovic, Ph.D.
Appendix A: Individual Reviewer Comments
Donna M. Bilkovic, Ph.D.
Caitlin M. Crain, Ph.D.
Matthew L. Kirwan, Ph.D.

A Novel Approach Using Expert Judgment - Volume II: Results for the Massachusetts Bays Program

Responses to Charge Questions Provided by External Peer Reviewers

Donna M. Bilkovic, Ph.D., College of William and Mary
Caitlin M. Crain, Ph.D., The Nature Conservancy, Global Marine Initiative
Matthew L. Kirwan, Ph.D., University of Virginia

Submitted on October 7, 2011

Responses to Charge Questions

1. Do you believe that the "expert elicitation" method developed for this study was effective for assessing the sensitivities of ecosystem processes to climate change? If not, how could expert elicitation be more effectively used?

Bilkovic: The method was generally effective for assessing the sensitivities of select ecosystem processes to climate change. There were several caveats regarding results that were mentioned in the document that will need to be addressed before the method can be broadly applied in a standardized manner. In particular, the determination of confidence and uncertainty estimates for individual judgments will be essential for effective adaptation planning.
Management actions will likely be driven by the availability and dissemination of limited resources, and those actions with a strong justification and defined confidence or uncertainty estimates are more likely to be enacted (see later comments). Also, a standardized protocol to ensure that the best-available data/literature are incorporated into influence diagrams and uncertainties will increase the likelihood of acceptance and successful implementation of the most appropriate management actions.

Response: Thank you. We agree that improvements to the confidence method will be important. Section 3.1.1.3 discusses sources of difficulty and how they could be corrected. Thank you for this suggestion.

Crain: Overall yes, I believe the expert elicitation process was effective for reaching the goals of this study. The elicitation was conducted at a level of detail that can be successful - producing qualitative, "rapid" assessments of influence relationships (pathways) and prioritizing their importance. Numerous steps can be taken to improve the "accuracy" of the results and utility of the findings. Some of these are outlined in the report, but additionally, the methodology should be replicated with different experts, or done again with more experts (increase from N=7). The relationship diagrams and relative importance of pathways that are at the core of the outputs can be easily skewed toward "pet projects" or just based on expertise and perspective with such a small number of somewhat connected experts. While I don't necessarily agree with all of the outputs and findings of the expert elicitation (described in more detail below), I found the methodology sound, well thought out, and relatively transparent. These qualities are essential for replicability, which, as described above, is essential for achieving best results.

There is the issue of whether we learned anything new. When you look at figures 2-3, 2-4, 2-9, and 2-10, the vast majority of relationships are considered intermediate sensitivity. These are relationships where an increase or decrease in a variable leads to a proportional increase or decrease in response. While these oversimplifications can be useful for envisioning influences, nature is never that simplistic, and this basically points to a lack of understanding of the system at this level of detail. The impact figures are somewhat more informative, but increasing impact is generally driven by variables that will obviously be increasing in variability or shifting due to climate change. In figure 2-11 we learn that influences that link to inundation increase in relative impact. This information is intuitive with Sea Level Rise (SLR).

In addition, the terms "thresholds" and "synergies" are often not used accurately in scientific terms. For example, on page 2-25, Ln 19, synergy is used to describe two factors that have the same individual effect when applied together. However, factors can have cumulative effects that are additive, synergistic or antagonistic depending on how they affect the outcome when acting in concert. These cumulative effects can all occur regardless of the direction of the individual effects. Synergism signifies that the outcome is greater than expected from the individual effects of each factor added together. Here we have no idea what the magnitude of the interaction will be. It is more accurate to say that the two factors favor the same or opposing outcomes; we have no sense of the statistical magnitude. In Table 2-8 the only type of interaction assigned is synergy (since we tend to anticipate, worry about and over-assign them in general), and what people mean is that the outcome is worse than one variable acting alone; however, we have very little information on the actual interactive effect. While it is important to consider stressor interactions and thresholds, I found that on the scale of this study these terms are thrown around without much accuracy or evidence - more as intuitions. This may be fine for pointing to possible issues, but should be more clearly stated as such.

It would have been interesting to ask the experts before starting the process to rank their top three key pathways and interactions they anticipated would be most impacted by climate change. This is sometimes done in expert elicitation and is helpful to gauge whether novel information or intuitive information emerges through the exercises. This is something to consider when repeating the exercise. While in this case the key pathways and management levers are not all that surprising outcomes, it is helpful to see the steps and components that brought the group to their conclusions. While none of the results of the study are particularly novel or surprising, I do think the output tables and figures are valuable. Using experts to think through the system, identify connections and types of relationships, and place those relationships within the whole system helps envision the climate change issues in an ecosystem context so that prioritization can begin. This is a valuable output and where climate change adaptation efforts must begin.

Response: Thank you. EPA's white paper on expert elicitation reviewed the literature and summarized that 3-11 experts is considered sufficient for most EEs, with a law of diminishing returns beyond 6. We have added text to explain this in the report. It is intuitive that SLR will have an increasing impact, but the way it plays out with certain variables/influences, including with some interactions, is more complex and informative.

'Threshold' is explicitly defined at the beginning of Section 2.3 with an accompanying citation. 'Synergy' is defined in Table 2-3 as where the effect of X on Y increases with an increase in Z. We have made no statement that N and saline inundation have the same individual effect when applied together; rather, we say that they operate synergistically, which by our definition means that we think that each has a greater effect when in the presence of the other than it would otherwise. It is true that this exercise did not quantify the magnitude of interactions, but rather used qualitative categories. Following from our above response, it is unclear in what way our terms are used without much accuracy. Using qualitative categories rather than quantitative values is not equivalent to being inaccurate.

Asking experts beforehand to rank their top pathways will be a good idea to talk about in our lessons learned report. Thank you.

Kirwan: I believe the expert elicitation method was effective for assessing the sensitivities of ecosystem processes and that it will make a good template for future assessments. The report is correct in noting that quantitative modeling could not be used over a short time period to address sensitivities in a particular marsh (Conclusions, A-7). But also please see my response to Question 5 regarding using simple quantitative models to help constrain the variables of interest.
Overall, the approach did a remarkable job of blending qualitative group discussion with a rigorous scoring system designed to capture independent expert opinion. The success depended on each of these steps: having the group agree on a simple conceptual framework, while capturing independent opinions that allow for estimation of uncertainty. Overall, I believe the process was extremely effective.

There were of course several limitations to the process. Two that are highlighted in the text concern the large number of missing cells for confidence scoring (pg. 3-5, Ln 6) and the small number of interactions with enough data to analyze (pg. 2-18, Ln 31). The causes of both limitations, and their impact on incorporating that information into management strategies, are well discussed. I had difficulty understanding how the confidence scoring worked, so I don't have suggestions to improve it. The interactions scoring process, however, seems like an easy fix if and when this approach is used again. The text reports that "Of the 48 combinations of influences with interactions characterized by participants, only nine could be considered for agreement with at least three participants" (pg. 2-18, Ln 31). Next time, make experts score each possible interaction, or at least give them more instruction and time to complete it, and encourage them to rank the most significant 5-10 interactions.

I would add to these limitations the potential for inadvertent bias towards predicting vulnerabilities when certain pathways and ecosystems may not in fact be all that vulnerable. After all, most comparisons of accretion rates and sea level rise rates in the New England region over the past 100 years show that the two processes are roughly balanced and that relative wetland elevations are remarkably stable. In fact, sea level proxies have been developed on the premise that the marsh more or less has kept up with sea level rise rates for much of the late Holocene. Therefore, I was disappointed that the report made no mention or discussion of this historical stability. Instead, it immediately states in the first sentence of the Executive Summary (pg. xi) and Introduction (pg. 1-1, Ln 1) that "The estuaries of the Massachusetts Bays are highly vulnerable to the impacts of climate change." Are these statements based on information gleaned prior to the workshop, or from the results of the workshop? Since the goal of the workshop was to identify potential vulnerabilities, it would be easy to assume that Massachusetts Bays estuaries are vulnerable.

Finally, I believe the report makes very appropriate remarks about how lack of agreement does not indicate that a relationship is not potentially important (pg. 3-2, Ln 28), but that it may appropriately influence the prioritization of management actions. Section 2.2.2 states that "consensus was not the goal of the exercise." I believe the independent scoring and potential for dissenting opinions makes the process and its findings credible.

Response: Thank you. Thank you for the suggestions for improvement of the interactions method - these will be used in the lessons learned write-up. A citation for this assertion would be helpful. We would assert that there is wide agreement that stability since the Holocene is not a good predictor of future change (and that proxies based on this are not good ones).
This is because (1) rates of sea level rise are accelerating; and (2) humans have modified the coastlines in ways that block landward migration and interfere with natural flows. We have added citations for this (Scavia et al., 2002; Fitzgerald et al., 2008) in the opening statements of the introduction. Thank you.

2. Is the level of detail and organization of the report useful to the scientific community as well as ecosystem managers? If not, how would you re-organize the report?

Bilkovic: Overall, the level of detail and organization of the report was sufficient to be useful to the scientific community as well as ecosystem managers. However, I did find it frustrating to have to refer to the Appendices repeatedly for more details on the process when trying to evaluate the methodology. I understand the rationale for this organization, but would have preferred to have some key details included in the main document, such as elements of Appendix A: A.1, A.2 (first section, conceptual models), and A.3 (justifies the need for the elicitation process in relation to an analysis of available data); and Appendix B: B.1.1, Selecting Workshop Participants, particularly the criteria for selection, B.1.2, Straw Man Influence Diagrams, and B.1.3-B.1.4, where a brief description (a few sentences) of the background material and assignment in preparation and development of consolidated influence diagrams will assist the reader in understanding the extent of information incorporated into the exercise.

Specific recommendations:

Executive Summary
• Prior to presenting details on the "top pathways" obtained from the process, it would be beneficial to the reader to have more information on the process (page xi). For instance, how were the 7 experts selected? Were they provided with a framework to develop the influence diagrams based on the literature? What was the level of agreement necessary to identify a pathway, threshold, or expected level of sensitivity? In other words, how will the reader or manager be able to gauge the level of confidence for each process to assist in decision-making? A brief synopsis of the protocols and criteria used to ensure the methodology was effective to elicit accurate and informative responses would enhance this section.
• P xv-xvi. It is unclear who participated in determining adaptation options for each pathway. In order for any adaptation planning exercise to be effective, it is critical to identify and include the stakeholders as early in the process as possible. While this was not the explicit objective of the study, how adaptation options were identified should be clarified.
• I would suggest reorganizing the executive summary by moving up the section on the "Evaluation of Expert Judgment Approach" (xvii) to precede the discussions on Adaptation Planning (p. xv). Additionally, the paragraph (p. xv) beginning "Based on the nature..." seems to fit within the discussions on adaptation planning and may be better served there.

Main document
• "Thresholds" are first mentioned in the Results (P 2-10, Ln 13) and should be introduced in the methodology.
• P 2-12, Ln 25-end, and P 2-13, Ln 1-6. I would suggest stating upfront the total number of influences examined, as the proportion of influences falling in each category (e.g., full agreement) can ascribe a relative value to the results.
• Figures 2-7 and 2-12. These graphics are confusing as there are no x-axis labels, nor an explanation in the figure heading. Perhaps a depiction of relative agreement (%) would be more useful to the reader. Also, the category HL is not intuitive. Some interpretation of why this type of situation occurred would be helpful; for instance, was this due to unfamiliarity with the evidence by some experts, or was there disagreement on the actual mechanisms behind the influences? Can conclusions and adaptation planning still be extracted from influences falling in this category?
• P 3-1, Ln 1-20. Potential issues may arise if "crosswalk" results lead to conflicting management options within a given model or amongst multiple models of processes. Recommending a process to reconcile conflicting outcomes would be helpful to a manager attempting to apply this methodology.

Response: Thank you. All of A.1 is already covered at the beginning of section 1.2.2. We have added some more info from A.2 to the second scoping step, and to the submodels. We have added a sentence to make the A.3 point. We've added a sentence about expert selection criteria. There is no more to say about the straw man diagrams, other than that they were based on the original conceptual models. We've added a sentence on the briefing calls.

We regard the questions raised about the Executive Summary as beyond the level of detail of an Executive Summary; these details are covered in the full technical report. This was not an adaptation planning exercise; it was a vulnerability assessment followed by a discussion of management implications that was not necessarily comprehensive or indicative of prioritization. A comprehensive adaptation planning exercise would be the next step. We have stated that there was a management discussion by the workshop participants from which the example options were drawn. Following analysis of the exercise results, additional examples were developed based on MBP planning documents, again with the purpose of being illustrative of thought processes, rather than comprehensive. Reorganizing the executive summary could definitely work; however, we have ordered the information in order of our perception of importance/interest for the executive reader, in case they stop reading and do not finish.

The threshold concept was not inherent to the methodology but emergent upon applying the methodology and looking at the results. It is defined at the earliest point at which it arises. Good idea; percentages have been added throughout. These figures have been deleted and a simpler explanation given in the text. The issue of trade-offs is acknowledged in several places in the report. Recommending a process for reconciling trade-offs in management options is beyond the scope of this vulnerability assessment.

Crain: It is not clear to me that a goal of this report is that it be useful to the scientific community. If so, it would be important to include a section on how you believe or would like the findings to be used. As is, the most apparent utility is in pointing out research gaps that need addressing. It is also useful to see how experts rank important pathways for focusing research and putting research in context. As far as the level of detail for the report to be useful in general, I believe it is. However, I have several suggestions for improving the readability of the report.

I found the section 2.4 Discussion of Adaptation Strategies somewhat superficial.
It felt like it was there to document the discussion that occurred, which is useful, but gets in the way of jumping to the more relevant, potentially novel findings of the workshop that are elaborated in section 3.1. I found the crosswalk tables (3-1 and 3-2) and key pathway figures (3-2 and 3-3) to be nice synthetic results from the workshop output. While these tables and figures were not created at the workshop and thus sit in another section, they seem like the key "results" of the workshop. You might consider reformatting to focus on expert elicitation and results, both individual and summary, and a final section on adaptation strategies and management links. Currently section 2.4 breaks up what I see as results, and then you get back into adaptation strategies in 3.2.2. In addition, the discussion of two example pathways (sections 3.1.2.1 and 3.1.2.2) is somewhat redundant with the discussion of the three top pathways and breaks up the flow of sections. I recommend removing these examples and including what you want in the following top pathways section.

While the report attempts to document all steps of the process so as to promote transparency and replicability, I found that there were many figures emphasizing slightly different points - you may want to reduce the total number of figures to focus the reader's attention.

The information gathered and goals of the study are referred to in different combinations throughout the study. In the executive summary (and on page 2-9) you refer to 1) direction and strength, 2) sensitivity and 3) highest impact. But then you also collect data on confidence. This is included as a list of four steps on page 2-2. On page 1-3 you list the expert elicitation steps as a "sensitivity analysis, vulnerability assessment and analysis of management implications". Some streamlining and consistency would improve the readability and enable the reader to keep the goals clearly in mind as the process gets complex at times.

There are places where the document is difficult to understand on a quick glance. Take Table 2-12 as an example. It is difficult to understand the point here without referring elsewhere in the document to define what the pathway is. It would be nice to include additional information (a brief description of the pathway in the legend) so that the table makes sense with a quick read.

Response: You are correct that a main utility to the scientific community is in identifying research gaps (as discussed in section 3.1.1.3). Thank you. We included the section on the adaptation strategies discussion based on preferences from our NEP partners that we reflect the workshop discussions, as distinct from follow-on thinking about mainstreaming adaptation that is discussed later. We want to show interested readers the process by which one can identify and build a top pathway; however, we agree with the desire to streamline. Therefore we have deleted one of the examples. Thank you for this suggestion. We have deleted three figures from the report. Since the confidence information was incomplete and few conclusions could be reached from it, it was not presented in the ES. The types of data categories collected are not goals. The steps on p. 1-3 for carrying out the assessment are also not goals, but rather steps in a process. The purpose of the study is stated in the second paragraph of the ES.
It is not possible to provide a description of six complex pathways in a figure legend; however, we have changed the legend to refer the reader to the top pathways figures rather than the text section, which should make cross-referencing much easier.

Kirwan: The greatest impediment to making the report accessible to managers and scientists is its length. The report acknowledges that the workshop resulted in a large volume of information on the sensitivities of processes to stressor interactions, and that the next step lies in organizing the information into a form that managers can use (pg. 3-1, Ln 3). I agree wholeheartedly and emphasize that the sheer volume of information makes this a difficult task. Redundancy is an issue in the report, but here are a few specific ideas to reduce the volume of information presented:

1. If the goal is simply to demonstrate proof-of-concept, then presenting the results of one of the two working groups would be sufficient (i.e., Sediment Retention OR Community Interactions).
2. The report is thorough and well organized, but the volume and detail of information is often too much. Some information, like the total number of agreements for a particular influence under each climate scenario, is better left to the tables or a figure (e.g., pg. 2-12, Ln 25 through pg. 2-13, Ln 6).
3. In a long document, figures represent a convenient way for a reader to scan the document and pick up the important points. For this to work, however, there needs to be more information conveyed in the figure captions. Most figure captions in the report are described in a single sentence. Add a sentence or two on the methodology used to create the figure, and give a one-sentence description of the main point that each figure is designed to show. This will make the figures essentially a concise summary of the text, capable of being read alone.

Other minor recommendations for organization/clarity:

1. Tell the reader early in the introduction or executive summary where the Massachusetts Bays estuaries are.
2. Page 2-2, Lines 16-21. There is discussion of Jeffrey's Neck Marsh. A map in an Appendix showing both the location of the Massachusetts Bays area relative to the North American coast, and Jeffrey's Neck Marsh, would be helpful.

Response: We have done our best to balance calls from some reviewers for more detail with calls from other reviewers for less. Proof of concept was not the only goal; we also wanted managers interested in these processes to be provided with the information from both parts of this study for their management consideration. We agree that the detail is cumbersome and have shortened the information and referred to percentages rather than numbers. Unlike the separate files sent for the review, the final report will have the figures embedded within the text explaining them, making it easy to refer to the explanatory text when studying them. A map has been added. Good idea. An image has been added.

3. Does the report effectively:

3a. Provide sufficient background information on the estuary program?

Bilkovic: Limited background information was provided on either the Climate Ready Estuary program or the National Estuary Program in the document.
While extensive detail on these programs is unnecessary to evaluate or adapt information contained in the report, the overarching goals of these programs in relation to the task could be outlined in greater detail (in the introduction) to assist managers that may wish to utilize similar approaches for other systems/programs.

Response: This information has been added to the Preface of the report.

Crain: Yes. Again, I don't see this as a major goal or necessity of the current project, and the information provided is sufficient to provide context.

Response: Thank you.

Kirwan: The only text I can find related to the overall goals of the Climate Ready Estuaries Program is a single sentence on page 1-1 (Ln 14). Nevertheless, information on the Massachusetts Bays program seems more relevant, and the goals of that particular program are described sufficiently.

Response: Thank you. Some additional information on the CRE has been added to the Preface of the report.

3b. Explain the scoping process to select vulnerable ecosystem processes?

Bilkovic: Adequate details were included if the reader refers to both the main document and Appendix A. Particularly informative and highly relatable to adaptive management was the rationale outlined on P 1-2, Ln 30-33 for process selection based on the MBP's management goals, increasing sensitivity to climate change, and being sufficiently well-studied.

Response: Thank you.

Crain: No. This is a weakness of the document if it is intended to provide guidance to adaptation planning for managers in general. I would like to see more explanation and justification of the ecosystem processes selected. In light of so many possible "processes" to consider, it would strengthen the output to understand why these two were selected and therefore why outputs from this analysis should be useful overall. It would be helpful to know if the initial process of outlining processes was thought to be comprehensive, with representative or important pathways selected for focus. Could the ecosystem processes also be regarded as ecosystem "services" that we care about? There are other easy-to-identify processes that were not included, such as nursery habitat or wildlife habitat, secondary production, etc.

Response: Section 1.2.2 (as well as Section A.2.2) explains that the purpose was to select good processes for piloting the method, not to prioritize among all vulnerable processes; a comprehensive listing/prioritization of ecosystem processes and services was beyond the scope of this report. Section 1.2.2 lists the criteria by which the two processes were selected. We have added a sentence to clarify further that MBP staff were involved in selecting these processes as valuable to them.

Kirwan: Yes, the scoping process is described sufficiently, and the appendices are used appropriately to make the main text more succinct.

Response: Thank you.

3c. Use conceptual models of ecosystem processes?

Bilkovic: Yes, conceptual models were effectively used with the groups to initiate the derivation of refined influence diagrams and the incorporation of relationship types (e.g., direct), interactions and thresholds.
This exercise was particularly instructive and could be readily utilized for statistical evaluations such as structural equation modeling (SEM), which allows one to address the relative importance of multiple processes in one statistical framework, as well as for positing hypotheses that are testable in experimental studies (Grace 2006). See further comments below.

Response: Thank you.

Crain: I don't entirely understand this question. The document obviously uses conceptual models, and I do think laying the expert knowledge out through diagrams is a very effective way of condensing the breadth of understanding. I don't always agree with the influence diagrams created and will use this question as an opportunity to highlight some of my concerns.

Sediment Retention: Relationship AA (marsh edge erosion on sediment deposition) is characterized as a weak inverse effect. This characterization and the mechanism driving this effect are never adequately described. On page xii, this process is described as marsh edge erosion where some sediment is deposited and some not - a weak inverse relationship does not clearly follow from this description, so more detail and justification would be helpful. On page 3-9 this relationship is also described with inadequate explanation of the mechanism connecting deposition on the marsh surface with erosion on the edge - is this causal or correlational (high storm energy both erodes the edge and reduces sediment trapping?)?

Community Interactions: The Community Interactions endpoint is "Saltmarsh Sharp-tailed Sparrow Nesting Habitat". Because this is a physical feature, the influence diagram focuses on physical drivers. It would be nice to have an example with more biotic interactions considered, as this would draw in the many uncertainties and issues of shifting biotic interactions with climate.

In the community interactions influence diagram, they failed to connect the ratio of native high marsh to Phragmites directly with the endpoint. By having the influence go through elevation, you are removing the direct influence of habitat quality, which you state is an important feature. The interpretation and resulting management implications do not actually follow correctly from the diagram as it reads. Page 3-15: I believe the argument behind this green pathway is flawed. The current influence path goes from sparrow to elevation. If increasing elevation is what is essential for the sparrow, then the argument, traveling up the pathway, would be to promote Phragmites through increasing nitrogen runoff. I really feel this diagram should have an arrow linking the ratio of native high marsh to Phragmites directly to sparrow habitat, as it is stated that the species themselves are important, not just marsh elevation as the diagram is currently constructed. In the search for management implications, you are placing judgment on Phragmites as undesirable even though the diagram does not actually indicate that.

Response: Thank you. AA is not characterized as weak but rather as intermediate sensitivity and impact. Uncertainty as to whether there is net positive or negative deposition is the reason there is no agreement on AA under current conditions, but under climate change there is strong agreement in an inverse effect. Page 3-9 states causal agents. The threshold process is explained on 3-10. Understood. But when asked to condense the process down to a tractable influence diagram, the highest priority variables chosen by the experts as most important were these.
We agree that in another iteration of the exercise, this connection could be added. At the same time, the participants and MBP staff had no trouble remembering that the original MBP goal was preservation of native marsh; hence they arrived at the interpretations described in the report, rather than a blind acceptance that because Phrag maintains elevation, that should be the only management consideration.

Kirwan: Yes, the report thoroughly uses and describes conceptual models of ecosystem processes. I found the conceptual models to be very well captured in the figures.

Response: Thank you.

4. Please comment on whether the project steps were adequately described in the report and in detail appropriate for an ecosystem manager to begin to develop adaptation strategies. Please provide any recommendations for improvement.

Bilkovic: Since the development of conceptual models (step 2, P 1-2, Ln 22-32) as a framework in support of the expert elicitation process is an essential step that could influence the outcome of the exercise, some further standardized guidance for their development would be helpful for managers.

While each pathway was explained in great detail, there were instances when pathways and management options appeared to be at odds. For instance, the purple pathway (P 3-15, Ln 31-35) and the green pathway (P 3-16, Ln 32-36) reflect varying effects of sea level rise depending on the primary management goal. For the approach to be most useful to managers, a mechanism to reconcile (or prioritize) potentially conflicting paths could be proposed to support effective ecosystem adaptive management that incorporates climate change into planning. This may be outlined in Volume III, but if not... because the success of the "expert elicitation" method strongly depends on the participating experts, guidance on the optimal number of participants, and on the influences modeled in conceptual diagrams to elicit the best information, should be included.

P 4-7, Ln 16-23; P 3-17, Ln 4-13 (and throughout). One could argue that it is essential to have a clear adaptive management plan in place to not only allow for contingency planning but also create a framework that allows the program to routinely re-evaluate the program goals and priorities and structure monitoring plans so that expected outcomes from a management option are identified and measured. If outcomes are not realized, the adaptive management plan should entail specific paths to change actions and plans (see Boesch 2006 for examples). This may amount to semantics, but other large restoration programs have been struggling with the concept of adaptive management for years. For instance, a recent review by the National Academy of Sciences (2011) in part addressed the effectiveness of the Chesapeake Bay Program's adaptive management strategies, and noted that "milestones and contingencies could be an important part of an adaptive management strategy, but...they do not themselves constitute adaptive management. In a few cases, plans to implement practices or programs, monitor results, and modify activities are described..., which are key elements of adaptive management." Explicitly placing the expert elicitation method in the context of an adaptive management plan would enhance the usefulness of the methods to coastal managers and ease its integration into current plans. The discussion on P 4-6, Ln 33-37 through P 4-7, Ln 1-15 on the iterative process of planning begins to address these comments, but the following paragraphs on contingency planning seem to move away from adaptive management.

P 4-6, Ln 24-31. While I recognize the difficulties in assigning uncertainty given a small number of experts with varying expertise, a means to report a relative uncertainty will obviously be critical for effective adaptation planning. One suggestion is the addition of measures similar to those applied by the IPCC (2007), which have been largely understood and accepted by managers and the public and may be easier for participating experts to understand and apply. For example, the IPCC report indicated very likely and likely to mean "the assessed likelihood, using expert judgment," of over 90% and 66%, respectively. Perhaps, in instances when experts reported varying confidence in a particular path, the additional measure of probability of occurrence may at least elicit a broad categorization of uncertainties. Also, when an expert did not feel comfortable responding (e.g., outside of an area of expertise), a method should be detailed for evaluating uncertainty in those instances. The validity of the outcomes will be suspect if only a small number of experts participate or if variation among participants is high. The importance of the inclusion of extensive literature reviews to summarize the state of the science becomes more critical if adequate expertise in a particular area cannot be recruited for the expert elicitation process.

Response: The appendix describes the types of documents used as references in developing the conceptual models. And the models from this project themselves could be used as a starting point for others in future assessments. The issue of trade-offs is acknowledged in several places in the report. However, recommending a process for reconciling trade-offs in management options (adaptation planning) is beyond the scope of this vulnerability assessment.

EPA's white paper on expert elicitation reviewed the literature and summarized that 3-11 experts is considered sufficient for most EEs, with a law of diminishing returns beyond 6. We have added text to explain this in the report. Since there is no precedent for this method, there is no existing info on the optimal number of influences in an influence diagram. The relationship of adaptive management to iterative planning has been explicitly clarified in section 4.2.2.

The use of "likelihood" to characterize uncertainty involves the use of quantitative probability distributions based on having good quantitative information on the judgments in question. For efforts that involve making judgments about ecological processes for which quantitative information is lacking, a qualitative approach in the form of confidence categories such as "high" and "low" is warranted (CCSP, 2009). Agreed. Next time we would add a code for allowing experts to acknowledge lack of expertise.

Crain: I find this question very similar to #2 above, so see my answer there regarding level of detail. While I believe that the project steps are described in a level of detail that a manager could use the findings while considering adaptation strategies, it is not clear to me that this is really a goal of the report. It is stated throughout that this is a "pilot vulnerability assessment" or "proof-of-concept."
While results can help managers think about prioritizing actions based on key pathways, it is impossible to accomplish everything with this report. As a demonstration of a novel methodology that could be used again, it cannot also be expected to produce results that managers can directly apply. There are several major limitations of the report as it is that make me hesitant to apply the results directly. 1) The scoping project that identified both the overall salt marsh model and then selected two "processes" to analyze is not explained in enough detail, or even intended to validate the processes selected as comprehensive, representative or the most important. Therefore findings from the study of these two processes do not meet those standards for using the results to really prioritize adaptation actions. 2) Replicating the process with additional experts for the same location and new experts in a new location would strengthen the generality and applicability of the findings.

Several areas where more detail or clarifications are needed to improve the utility of the report are outlined below.

Section 2.2.2.5, page 2-7. This section is the crux of the expert elicitation and, while it makes sense after several reads, seems like it could be clearer on the first read. Possibly the "Types" could be called "Direction". Table 2-2 could have pairings by type/direction and degree that vary in X direction in two columns next to each other.

In the expert elicitation methodology it would be helpful to describe how the influence diagrams were considered in future climate scenarios. How do you distinguish between a change in sensitivity to a driver versus a change in the driver's status (shift in degree or increasing variability)? Why would the nature of the relationship (sensitivity) change based on different "start points" driven by shifting climates? What do you assume about other non-climate stressors included in the models?

I have a concern about one of the restoration solutions advocated, the removal of tidal restrictions. This restoration should proceed cautiously in light of SLR since restricted marshes have often already subsided due to lack of sediment input and altered below-ground processes, so that abrupt reintroduction even at today's sea levels can flood and drown wetlands. This issue will only become greater with increasing sea levels.

Response: While one goal was proof of concept, another simultaneous goal was to provide useful information on vulnerabilities of two processes that the MBP staff selected as valuable to look at. See above response explaining criteria for process selection. We disagree that the information on two processes selected by MBP as important cannot be useful without a comprehensive systematic analysis/valuation of all processes. Rather, if a manager has already decided that these are important processes based on their own analysis, they could act on the information provided in this report to begin management planning.

The problem with "direction" is that it implies "up" or "down", when a "type" of relationship can involve X going EITHER up OR down. Table 2-2 is faithful to the actual coding scheme that was used by the participants in the workshop and we would not want to imply that they used something different; however, we certainly agree it could be much simplified/improved in the next go-round. Simplification of the coding scheme has been added to the Conclusions.

(1) The distinction between sensitivity and change in driver status is what comprises the definition of relative impact. (2) The nature of sensitivity would change based on different start points depending on the location of an impending threshold. (3) Participants indicated their assumptions about other non-climate stressors by using the coding scheme to indicate whether the variable was increasing or decreasing under the different scenarios. We agree and have noted in the report that all management actions should be taken with careful consideration of place-based particulars.

Kirwan: Overall, the project steps were adequately described in the report, and the appendices were used appropriately to present much of this information away from the main text. Nevertheless, there are three project steps that need more clarity:

1. There needs to be more information on how the experts were selected. The success of the entire approach depends, almost by definition, on the quality of the "experts." How many potential participants were contacted, and how many declined? Was there any attempt to have representatives from a broader geographic area?
2. There needs to be a clearer description of confidence scoring and its purpose (Section 2.2.2.4). I found this section difficult to comprehend, and it wasn't clear whether the confidence scoring was done before or after the results of the individual judgments were tabulated. Are experts ranking their own judgments, or the outcome of the group's judgments? Does the "level of agreement/consensus in the expert community" (pg. 2-6, Ln 36) refer to the level of agreement in the working group participants, or is it supposed to reflect the scientific community at large? Similarly, the figures from this section cannot be understood by themselves. There is no y-axis label on Figure 2-7.
3. How sea level scenarios were chosen for each climate scenario needs much more discussion. Appendix C (pg. C-1) notes that "Sea level rise information was provided by the Sea Level Affecting Marshes Model (SLAMM 5.0)." This is very troubling since SLAMM is designed to model the effect of sea level on marshes, not to provide information on sea level itself. The Appendix goes on to say that sea level is based on IPCC and Rahmstorf (2007) scenarios, but doesn't say how. Given that sea level rise is a major focus of the report, this needs much more explanation. In particular, we need to know if and how global sea level patterns from the IPCC and Rahmstorf have been adjusted for regional influences such as the rate of land subsidence. This is a critically important oversight.

Response: Thank you. Information on the size of the total pool has been added, as well as clarification of regional experience being part of the criteria for selection. The experts rated their confidence in their own judgments, as indicated in section 2.2.2.4. The "level of agreement" refers to the scientific community at large. The text has been clarified to make this more clear. Figure 2-7 has been deleted. Thank you for pointing out that Appendix C was inaccurate regarding the source of the sea level information. The text has been corrected to indicate that two of the scenarios used in an application of the SLAMM for a study ("Application of the Sea-Level Affecting Marshes Model (SLAMM 5.0) to Parker River NWR") were used.
Results of that study were used in the workshop presentation on the scenarios, with maps of the area showing the modeled response to those increases in sea level. These scenarios are based only on eustatic sea level rise, and while it would have been an improvement to adjust for local subsidence, it was beyond the resources of this effort to develop local estimates, and a survey of the range of historic rates of sea level rise for the Gulf of Maine indicates that the difference at mid-century would not have changed the scenarios outside of other limitations.

5. Beyond the scope of this report and looking ahead to future work on adaptation to climate change, please comment on the following. This report presumes that to develop adaptation strategies, the first step is to identify system vulnerabilities and sensitivities. Do you agree?

5a. If no, what alternative method can you suggest for developing adaptation strategies?

5b. If yes, what is the most effective way of identifying those ecosystem characteristics that are most vulnerable to climate change, for deeper focus with sensitivity analysis?

Bilkovic: Yes, assuming that the planning process has been completed which outlines the scope of the plan and available resources and engages stakeholders; the initial step in adaptation strategy development is vulnerability assessment. Ideally, the identification of ecosystem characteristics that are most vulnerable to climate change would incorporate 1) an understanding of the best-available data, 2) a posited system's response to climate change that is identifiable and measurable, 3) the establishment of specific monitoring activities to measure the response over time and evaluate the variability/sensitivity in that response, and 4) the subsequent application of resulting empirical data for the development of adaptation strategies. The expert elicitation approach contributes where data are insufficient, as is often the case, and targets key pathways for which uncertainty is minimized, as well as highlights those relationships that need further research. If the limitations previously discussed are addressed, the method will be strengthened; for example, through improved evaluation of the confidence in individual judgments, standardized metrics of uncertainty, and methods to ensure the best available data/literature are accounted for in the assessment. Over-reliance on select expert opinion in a given workshop could undermine the validity of the results. However, the use of expert-elicitation methods, in combination with an understanding of the best available data/literature on processes, shows promise as a mechanism to elucidate complex interactions in ecosystems by capturing the collective knowledge of experts in a format that is useful to both managers and researchers.

Response: Thank you for these valuable insights, which we will use in moving forward. Thank you.

Crain: Yes, this is a critical first step. We can't protect, manage, or adapt if we don't know what the problems are and will be. Some terminology is confusing here and in the document. I assume when speaking of sensitivity analysis you are referring to how sensitive the processes are to climate change. However, in the document, you quantify sensitivity within the influence diagrams as the relative impact of one variable on another. I believe several complementary approaches are best.
In addition to the type of approach you've undertaken here, an alternative is to envision what the system might look like physically in 100 years and how we could help it get there. You mention the trade-offs in protecting systems as they are until some tipping point and then re-focusing management on an alternative state. Rather than waste resources on an inevitable transition (Phragmites eradication?), embracing the changing physical state and promoting migration, maximum accretion, etc. may be a useful exercise.

Response: Thank you. See sections 2.2.2.5 and 2.2.2.6 for the distinction between sensitivity and relative impact. Thank you for these valuable insights, which we will use in moving forward.

Kirwan: I agree with the assumption that the first step to developing adaptation strategies is to identify system vulnerabilities. A panel of experts from many regions of the world (as opposed to one that is focused on a particular area, like Massachusetts) would ensure that the widest range of vulnerabilities is considered. Existing peer-reviewed literature would also be helpful at any stage, in both selecting vulnerabilities to consider and narrowing them down to the most important ones. Although I agree with the panel that detailed numerical models are difficult to construct and apply to local ecosystems, they can be particularly helpful in narrowing down the most important variables and pathways. In many ways, simple numerical models have the same goal and requirement as the initial working group meetings discussed here: that is, they have to try and distill a complex ecosystem into a simple conceptual framework that can only incorporate the processes/interactions that are most relevant.

For what it's worth, the report actually suggests a different approach to developing adaptation strategies. Rather than starting with vulnerabilities, the report states: "Another method for sorting through and prioritizing "non-agreement" influences for further study might be to start from the perspective of management opportunities. Managers could look at their most tractable and effective management levers currently available, and trace pathways from those down to the endpoint of interest, as a means of identifying and selecting priority influences for research" (Pg. 3-4, Ln 23). In practice, an iterative process between identifying vulnerabilities and identifying possible "management levers" is certain to occur. Determining which should happen first is not entirely necessary, and the report already does quite well in recommending that iterative process.

One challenge in using the report to inform adaptation strategies is the reported difficulty in agreeing on the direction and sensitivity of processes where thresholds are involved. The causes of the difficulty are well addressed (pg. 4-4 for example), and the report concludes that "Thresholds are clearly relevant to management, but usable information on thresholds remains elusive" (pg. 4-6). These observations lead me to believe that when threshold processes actually do emerge into the top pathways, they are even more significant than the easier-to-define pathways. When it comes to informing management decisions, it seems threshold pathways should perhaps be given extra emphasis given that they are inherently and inadvertently subject to less agreement. I believe the issue of lack of agreement in threshold pathways remains one of the biggest challenges to utilizing this methodology.

Response: Thank you for these valuable insights, which we will use in moving forward. Thank you. We totally agree with you on this.

6. Please provide any other comments or recommendations that you feel would strengthen the document.

Bilkovic: While the temporal influences on pathways were briefly discussed, consideration of spatial variability in system responses will be similarly important for effective adaptation planning. The evaluation of data in spatial frameworks such as GIS is increasingly becoming the decision-support tool of choice for coastal managers. To address this, a brief discussion could be included on how the expert elicitation method (and influence diagrams) can be structured to accommodate spatial distinctions in ecosystems and their sensitivities to climate change.

Response: We have emphasized that individual managers would need to consider the particulars of their place when considering the results. It would be theoretically possible to do multiple sensitivity analyses, based on spatial variability, but this would be very intensive. It would be interesting to compare/consider coupling map-based GIS approaches with our sensitivity analysis method to assess benefits for management planning.

Crain: Several details that would improve the document are below:

Page 3-7, last paragraph. This argument does not make sense to me on several levels. I don't understand why elevation "will become increasingly sensitive to the ratio of native marsh to Phragmites". It seems that the Phragmites shift in accretion rate will have the same influence on elevation regardless of sea level - it may be more important, but I don't understand why more sensitive. The justification given here also makes no sense to me, as trapping more sediments (thus accreting more peat) and migrating landward are two separate mechanisms of dealing with SLR, but do not explain why elevation will be more sensitive to the ratio of plant species.

Page 3-12, Ln 4-5. I don't understand the argument here for restoring hydrology - please explain more clearly.

Page 3-13, Ln 8-9. This is a key point that could use a citation.

Page 3-15, Ln 29. "Competition" is used incorrectly here and in previous discussion of interactions. Salinity and nitrogen are not competing but driving opposing outcomes in Phragmites. Competition has a very different meaning in the scientific community.

Table 2-9. OMWM is not actually defined here - it is in the text and needs to be actually defined here.

Figure 2-7. What is the y-axis?

Response: The implication is not that the sensitivity of Phrag changes, but rather that the sensitivity of marsh elevation to the ratio of Phrag to native marsh changes. You make an excellent point that Phrag's ability to trap sediment alone would not explain this. We have added that the change in sensitivity could occur if the native marsh contribution to elevation shrinks, making the relative contribution of Phrag shift. That said, this is indeed a very complex mechanism that could have benefitted from more discussion in the group; the need for more group discussions of mechanisms will be covered in the lessons learned report. We have added text to further clarify. This is one of many statements made by the experts during the elicitation, so the reference for this and other statements of this kind would be the experts themselves. We have defined competition in Table 2-3 as when the effect of X on Y decreases with an increase in Z. That is the way in which we are using the word here. Thank you. An edit has been made to Table 2-9. This figure has been deleted.

Kirwan: The success of the expert elicitation process depends on the quality of the "experts". While I feel this panel was adequate and actually did quite well (I find myself mostly agreeing with their choices of top pathways, and how they may become more or less sensitive under climate change), I recommend that the next panel consist of scientists representing broader perspectives. I understand that the intent was to have scientists familiar with the region of study, but having scientists from outside New England and ideally outside the United States would dramatically improve the breadth of pathways considered. Similarly, since each group consisted of only 7 participants (this number should be expanded in the future), they are very subject to duplication of expertise and education background. 2 of the 7 experts in the Sediment Retention group were educated by the same graduate advisor, 2 of the 7 participants in the Community Interactions group are from U. New Hampshire, 3 participants overall work in Woods Hole, etc. Agreement between participants on important pathways is much less impressive if they are all from the "same crowd," so to speak, and agreement between participants would be more impressive with more scientific diversity. Nevertheless, I want to emphasize that while I believe that the breadth of scientific representation should be considered more carefully in the next assessment, I believe the panel selected here was successful, especially if the primary goal was proof-of-concept.

The Sediment Retention group seemed to have a strong focus on purely physical processes, rather than on biological processes that have been shown to influence sediment retention. For example, climate factors such as atmospheric CO2 and temperature warming clearly affect plant growth in ways that allow more belowground biomass and sediment deposition, but were not discussed. I recognize the value of letting the group pick its own dominant pathways, but I think if the group were more diverse, they might have given more proper consideration to biophysical pathways.

A minor issue: "The community diagram includes both above ground and below ground biomass variables while the sediment diagram only includes above ground biomass" (pg. 4-2, Ln 26). The second half of this sentence is incorrect. The sediment diagram includes only below ground biomass, and a case could be made that the Surface Roughness variable incorporates above ground biomass. See Figure 3-3.

Response: Thank you for these thoughts. This may or may not be the case; we sought to make sure there were both biologists and physical science experts in both participant groups. In the case of both MBP and the parallel SFEP effort, the sediment retention groups felt that when limited to only the most important variables, physical factors were overall most important. Thank you; the appropriate correction has been made.

Additional Reviewer Comments

MBP Additional Comments Submitted by Dr. Donna M. Bilkovic

Literature used in this review

Boesch, D.F. 2006. Scientific requirements for ecosystem-based management in the restoration of Chesapeake Bay and Coastal Louisiana. Ecological Engineering 26:6-26.

Grace, J.B. 2006. Structural Equation Modeling and Natural Systems. Cambridge University Press.
Intergovernmental Panel on Climate Change (IPCC). 2007. Climate Change 2007: Impacts, Adaptation and Vulnerability. Contribution of Working Group II to the Fourth Assessment Report of the Intergovernmental Panel on Climate Change. M.L. Parry, O.F. Canziani, J.P. Palutikof, P.J. van der Linden, and C.E. Hanson, eds. Cambridge University Press. www.ipcc.ch

National Academy of Sciences (NAS). 2011. Achieving Nutrient and Sediment Reduction Goals in the Chesapeake Bay: An Evaluation of Program Strategies and Implementation. 258 pp. http://dels.nas.edu/Report/Achieving-Nutrient-Sediment-Reduction-Goals/13131