I. Overview Section
Wendy Martinez, the ONR Program Manager overseeing this MURI, gave an introduction and overview of the origin of this MURI topic. She also mentioned that with a MURI, there is a concern that though the work is focused on basic research, the sponsors hope to see some payoff delivered to the Department of Defense at a rate that is faster than is typically expected of 6.1 research. It would be helpful to send information on any early successes to the program manager. MURIs have historically had a tough time maintaining a coordinated effort, but this MURI has a higher chance of success because of the co-location on a single campus and the tight interaction with an operational DoD organization.
Navy View on Uncertainty in Mesoscale Meteorology
Ted Tsui, from the Naval Research Lab, Monterey, described the kinds of meteorology information navy tactical users need and the current operational infrastructure that provides that information. He then went on to describe how uncertainty complicates the interpretation of the meteorology information. Ted commented that forecasters need error characteristics or confidence levels associated with model predictions. They also need a way to visualize the uncertainty in the model output, which includes the uncertainty or possible error in the atmospheric observations. These tools need to be simple, easy to understand, and offer one-click access.
Army View on Uncertainty in Mesoscale Meteorology
Doug Brown, from the Army Research Laboratory, presented an overview of an Army project to help reduce the uncertainty in battlefield-scale meteorology. The Integrated Meteorological System (IMETS) incorporates a Battlefield Forecast Model (BFM) and workstation technology on a mobile platform (a truck) to produce nowcast information. The model provides near-term forecasts at resolutions down to 50 m, which are not available in current operational numerical models. The additional value of IMETS is that it corrects timing errors and captures fine-scale weather events missed by synoptic-scale models.
II. Atmospheric Sciences Section
Cliff Mass provided a presentation on the MM5 ensemble system being run at the University of Washington and its implications for forecasters.
Question: In regard to Cliff Mass’ “Forecaster Console of 2005” and the “paradigm shift” taking place in the forecasting process, what will be the human role?
Response and Discussion:
Cliff: The human role may be completely different 20-25 years from now, moving more toward interpretation of probabilistic numerical weather predictions and communication of the uncertainty in the forecasts. Humans may be completely removed from the forecast itself. Brad Colman and Ted Tsui disagreed with Cliff’s opinion that the human role in the forecast process would be so diminished, though they agreed that the human role would most definitely change a great deal. Bob Miyamoto commented that the forecaster role would change in concert with the forecast process and the way in which the guidance is created. Everyone agreed that educating forecasters on how to use and interpret probabilistic forecast guidance is critical and will present challenges.
Question: How can we avoid overwhelming forecasters with guidance?
Response and Discussion:
Cliff: We will create software that takes into account the past (last few days) performance of the model as well as its real-time performance. Using this measure of trustworthiness in addition to the confidence estimate (derived from the variance in the ensemble forecasts), the software will be able to inform the forecaster which aspects of the forecast to trust and which not to trust.
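For illustration only, a minimal sketch (in Python) of how recent model performance and ensemble spread might be combined into a single trust indicator; the function name, inputs, and the specific combination rule are hypothetical assumptions, not the software Cliff described:

```python
import numpy as np

def trust_summary(recent_errors, ensemble_members):
    """Illustrative sketch: combine recent model performance with
    ensemble spread to flag how much to trust a forecast field.

    recent_errors    -- array of verification errors from the last few days
    ensemble_members -- array of shape (n_members, ...) of current forecasts
    """
    # Trustworthiness from recent performance: smaller mean error -> more trust
    skill = 1.0 / (1.0 + np.mean(np.abs(recent_errors)))

    # Confidence from ensemble agreement: smaller spread -> more confidence
    spread = np.std(ensemble_members, axis=0)
    confidence = 1.0 / (1.0 + spread)

    # A simple combined indicator in (0, 1]; values near 1 suggest the
    # guidance can be leaned on, values near 0 suggest caution.
    return skill * confidence
```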
III. Statistics Section
Adrian Raftery gave a presentation on Bayesian Melding and Bayesian Model Averaging. He described how these techniques could yield methods that do a better job of describing the uncertainty in mesoscale predictions and might also help forecasters make better use of the forecasts from these models.
Question: Will the statistical methods work no matter where MM5 is running? This question is in regard to both the computing platform MM5 is running on and the geographic location of interest.
Response and Discussion:
In terms of working on different computing platforms, and for the statistical formulation of the problem in general, it will be necessary to employ methods for reducing the dimensionality of the problem (methods such as principal components, wavelets, methods from spatial statistics, etc.). The dimensionality reduction allows us to look at which variables are most important, both statistically and in terms of their significance to the user. The Bayesian approach allows for differences in geographic areas by allowing for differences in likelihood and prior formulations.
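As an illustration of the dimensionality-reduction step, a minimal sketch (in Python) using principal components; the function name and array shapes are hypothetical assumptions, not part of the project plan:

```python
import numpy as np

def leading_components(forecast_fields, n_components=10):
    """Illustrative sketch: reduce gridded forecast fields to a few
    principal components before statistical post-processing.

    forecast_fields -- array of shape (n_cases, n_gridpoints)
    """
    # Center the data and take the leading empirical orthogonal functions
    anomalies = forecast_fields - forecast_fields.mean(axis=0)
    _, _, vt = np.linalg.svd(anomalies, full_matrices=False)
    basis = vt[:n_components]      # leading spatial patterns
    scores = anomalies @ basis.T   # low-dimensional representation of each case
    return basis, scores
```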
Question: How will Bayesian techniques help in picking the "weather model of the day"?
Response and Discussion:
In general, the traditional approach in statistics is to pick the single most appropriate model given the data. The Bayesian model averaging approach is quite different in that it allows one to use a group of competing models for the purpose of prediction. Thus, one is not obliged to choose a single "best" model. As for the weather prediction problem, there appear to be roughly 8 "good" models, which is a good opportunity to employ Bayesian model averaging. This approach will put ensemble forecasting on a better statistical footing. We might also calibrate the approach with information from model verification procedures, in order to determine which models ought to be weighted more heavily. Initially, this might be done on a monthly basis. Through appropriate choices of likelihoods and priors (expert knowledge of a certain geographic region), the method might choose different "models of the day".
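To make the idea concrete, a minimal sketch (in Python) of a BMA-style weighted prediction; the eight weights shown are purely illustrative placeholders, not verification results or the method Adrian presented:

```python
import numpy as np

def bma_forecast(member_forecasts, weights):
    """Illustrative sketch of Bayesian model averaging for prediction:
    the blended forecast is a weighted combination of competing models,
    with weights reflecting each model's recent verification skill.

    member_forecasts -- array of shape (n_models, ...) of current forecasts
    weights          -- nonnegative weights, one per model
    """
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()  # normalize so the weights sum to one
    return np.tensordot(weights, member_forecasts, axes=1)

# Example: eight competing models with weights updated (say, monthly) from
# verification statistics -- all numbers here are purely illustrative.
forecasts = np.random.randn(8, 100)  # 8 models, 100 grid points
weights = np.array([0.20, 0.18, 0.15, 0.12, 0.12, 0.10, 0.08, 0.05])
blended = bma_forecast(forecasts, weights)
```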
Tilmann Gneiting gave a presentation on spatial statistics and its potential uses as a technical tool in the project.
Question: Does spatial statistics generally assume stationarity and isotropy, the latter meaning that covariances depend on geographic distance only?
Response and Discussion:
No; the presentation was deliberately kept simple. The development of appropriate non-stationary and non-isotropic models has been an active research area in spatial statistics, environmental statistics, and meteorology. Spatio-temporal prediction is another important area. In the project context, the role of spatial statistics is that of a technical tool to reduce the computational complexity of deterministic simulation models through so-called "model emulators." In this context stationary models seem appropriate. The situation in atmospheric data assimilation is quite different. Cliff pointed out that an MM5 run takes about 60 to 90 minutes.
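As a small illustration of a stationary, isotropic covariance model of the kind used in such emulators, a sketch (in Python) of an exponential covariance that depends only on distance; the exponential form and the range parameter are illustrative assumptions, not choices made at the meeting:

```python
import numpy as np

def exponential_covariance(coords, variance=1.0, range_km=200.0):
    """Illustrative sketch of a stationary, isotropic covariance model:
    covariance between two locations depends only on their separation.

    coords -- array of shape (n_locations, 2) of (x, y) positions in km
    """
    diff = coords[:, None, :] - coords[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))      # pairwise distances
    return variance * np.exp(-dist / range_km)    # decay with distance only
```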
Question: Could some of our problems be solved through the use of supercomputers?
Response and Discussion:
Maybe in the initial stages of the project; however, we need to keep in mind that our eventual goal is a method that allows users with somewhat limited computing facilities to use what we develop. The final result must be scalable.
Question: In dealing with the computational complexity, is it possible to work at lower resolution first and then use the results in higher resolution maps?
Response and Discussion:
In general the statistical approach that we have in mind ought to work at all resolutions. In terms of computational efficiency, this could be a very good idea. Cliff provided a clarification of atmospheric scales:
Global
Synoptic (1,000 to 4,000 km)
Mesoscale (Pacific Northwest, say)
Microscale
IV. Human Factors Section
Buz Hunt gave a presentation on the expected approach the cognitive scientists will take in studying the tasks of naval forecasters. He also described some preliminary thoughts after visiting the naval forecasters at the Whidbey Island Naval Air Station weather facility. A key concern was the level of interruptions with which the naval forecasters are confronted. Susan Joslyn provided amplifying information on the current research on the cognitive task of naval forecasting and described the scenario in which the forecaster operates. She went on to provide a detailed task analysis of an expert forecaster who was observed this year.
Discussion:
Concern was expressed about demand characteristics in observations and interviews of military forecasters. When formally asked about procedures, forecasters might describe the proper or prescribed procedures. However, they might actually do something else entirely when they are not being observed. The speakers responded that this was indeed a problem in this type of work, but one that could be minimized by the approach of the observer. A quiet observer, introduced without fanfare and sitting in the back of the room, can become background to the workers, making it more likely that they will engage in typical procedures. Likewise, interview techniques in which the expected answer is not part of the question (i.e., avoiding leading questions) make it more likely that the respondent will give an honest answer. There was also concern that "casting a wide net" would make the cognitive task analysis problem unmanageably difficult. The speakers thought the idea of narrowing the focus to one type of situation was an excellent idea.

There were several questions from the audience about the issue of "interruptions". Buz explained the difference between interruptions and dual tasking. Dual tasking is "riding a bike and reciting poetry" or driving a car and talking on a cell phone: a situation in which you do two things more or less simultaneously. There is considerable experimental literature on dual tasking. Interruptions, on the other hand, are situations in which you are working on one project and must put it aside to attend to another task that takes several minutes or more. There is far less experimental work concerning this situation. Some work suggests that the more similar the interrupting task is to the primary task, the larger the decrement in performance. Buz suggested that in military forecasting, if the interruption involves providing a briefing based on the forecast upon which one was working (the interrupted task), it might have the effect of increasing the forecaster's estimate of certainty. That is, it might make a tentative early forecast seem more certain to the forecaster because he or she has just given a briefing based on it. It was acknowledged that this was an interesting question and might be the subject of later research.

There was concern about individual differences in forecasting skill and procedures. Doug Brown recommended considering the goals of the end user (e.g. pilots) in the design. In the army setting there are no forecasters and the information goes straight to the end user. Adrian commented on the potential usefulness for the statistics group of data emerging from the cognitive task analysis. The variables evaluated by expert forecasters for particular predictions might be used to narrow the choice of variables for the statistical approach. He also felt that patterns identified by experts might suggest variables that cluster together.
V. Visualization Section
Bob Miyamoto started the presentation and described what APL brings to the MURI project and the role it expects to play. He described ongoing APL research work in METOC Human Systems Improvement and the Directed Research Initiative on Uncertainty in Acoustics, which have similar concerns and should have spin-offs that could help in the MURI research. David Jones then described how naval forecasters deal with uncertainty today and the products available to them that show the uncertainty in model forecasts. He described future plans for collaborative work with Navy operational forecast centers and possible visualization tools. Keith Kerr described the basic architecture of a future visualization system and the kinds of visualizations that could be produced by VISAD, a new open-source software package designed for the meteorology research community.
Discussion:
Buz - One way to help improve the Navy’s tools for visualizing uncertainty, and their methodology (or lack thereof) for dealing with uncertainty, might be to look at the medical community, and in particular the radiological community, because of their long history of dealing with uncertainty in x-ray technology.
Ted - Displays must fit the needs of the forecaster or they will be ignored.
Bob - To understand how uncertainty plays a role in forecasting, researchers need to first study the needs of the tactical customers to whom the forecasters provide information.
David - Since the forecaster is already overloaded with information, additional information on uncertainty may be useless unless we can automate some of the time-consuming, non-cognitive aspects of the current work flow. A system of automation and alarms to focus forecaster attention, in cases where full automation may be too dangerous, might offer the best combination.
Cliff - Described the dModel/dt tool in the National Weather Service’s AWIPS system. It is a very useful tool but could be improved upon.
VI. Open Discussion
Following the formal presentations there was a general discussion on how the project should proceed. Adrian described a weekly seminar that will focus on the statistics topics needed for the MURI research. He also described a plan to create a web site that would give researchers access to information produced by the MURI team and a second site that will provide general information to a wider public audience. Cliff commented that the best approach might be to quickly develop a prototype system that delivers ensemble information to naval forecasters, then study how the forecasters use the system and iterate on it as the statistics researchers devise better ways to deal with model uncertainty and improve its visualization capabilities. There was general consensus that this might be the best way to get things moving on the operational side, while more basic research in Bayesian approaches could proceed semi-independently and then be folded in as results and techniques become available.

There was also a discussion on how training will be an important component in gaining acceptance of any uncertainty tools that are produced. Bob mentioned that the Navy’s new METOC Professional Development Center might be a venue to get the training to the Navy forecasters. Additionally, online courses were thought to be a promising approach.

There will be two web sites, an internal one and a public one. We intend to post and announce papers, software, meetings, progress reports, achievements, and links. There will be quarterly meetings of the MURI group. Each year, one of these (likely the Fall meeting) will be a "big" meeting like this one. The report from the Fall meeting can be developed into the yearly progress report to ONR, which is due in January. In Winter quarter 2002, Adrian and Tilmann will teach a Special Topics course in Statistics that focuses on the research for the MURI project. Cliff has good connections to the Air Force, which is likely to be interested in the project, too. Wendy thought that it would be a good idea to hold an annual MURI meeting and that it probably would be best to keep it here in Seattle.
|