Title: Real-time hurricane wind field analysis

Mark Powell, NOAA Hurricane Research Division, Miami, Florida

Abstract

For several years NOAA's Hurricane Research Division has conducted real-time analysis of tropical cyclone wind fields. While forecasts of Atlantic basin hurricane tracks show increasing skill, forecasts of intensity show little skill, and forecast model wind fields are not routinely evaluated against observations. In models such as the GFDL, the hurricane is "bogused" in rather than initialized through data assimilation. Hence, little is known about model wind field errors, and intensity errors are determined through comparison to a subjective "best track" estimate. Observations are available from a variety of conventional, remote, and specialized air-, space-, land-, and sea-based platforms, but large data voids can exist and not all platforms perform equally in all situations. In particular, aircraft reconnaissance measurements are critical to resolving the inner core, and such data are only available when storms are west of 60°W longitude. In addition, the aircraft observations must be adjusted to the surface, and the adjustment methods carry high uncertainty. The current uncertainty in estimating the intensity of a tropical cyclone is at least ~20%. Hence, our approach (called H*Wind) uses a Java graphical front end to an Oracle database, providing interactive quality control that relies on the analyst's experience to decide which data are representative of the current situation. To improve data coverage, we apply a time-to-space conversion within a storm-relative coordinate system, with a time window that may be adjusted to provide sufficient data for analysis. A low-weight background field from a prior analysis may also be incorporated to provide continuity and help fill data voids. Once these data pass scrutiny, they are objectively analyzed with a nested "mechanical" statistical interpolation approach developed by Dr. Vic Ooyama in 1987 and later implemented by Dr. Steve Lord and James Franklin. Analyses are made available in graphic and gridded form on the web (http://www.aoml.noaa.gov/hrd/data_sub/wind.html), where they are used for a variety of applications including experimental forecast/warning guidance and damage assessment. Analysis frequency is typically 6 h (or ~3 h at landfall). Based on several years of experience, we believe that much of the quality control may be automated so that analyses could be run routinely at 6 h frequency, and eventually 1 h (in support of wave and storm surge model hindcasts). Future plans involve tracking the error characteristics of various observation platforms and evaluating model wind fields to help support development of the HWRF model.
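As an illustration of the surface adjustment issue noted above, the Python sketch below applies a single mean reduction ratio to a flight-level reconnaissance wind and carries the ~20% intensity uncertainty through the result. The 0.90 factor, the function name, and the example values are assumptions made here for illustration; the operational adjustment is considerably more involved.

# Illustrative sketch only: reduce a flight-level reconnaissance wind to a
# 10 m surface estimate with a single mean ratio. The 0.90 factor and the
# example values are assumptions, not the operational adjustment.

def flight_level_to_surface(fl_wind_ms, reduction_factor=0.90):
    """Estimate the 10 m wind speed (m/s) from a flight-level wind (m/s)."""
    return fl_wind_ms * reduction_factor

if __name__ == "__main__":
    fl = 60.0                          # hypothetical 700 hPa wind, m/s
    sfc = flight_level_to_surface(fl)
    lo, hi = 0.8 * sfc, 1.2 * sfc      # ~20% intensity uncertainty band
    print(f"flight level {fl:.0f} m/s -> surface ~{sfc:.0f} m/s "
          f"(~20% band: {lo:.0f}-{hi:.0f} m/s)")

With a 60 m/s flight-level wind this gives a surface estimate near 54 m/s and a ~20% band of roughly 43-65 m/s, wide enough to span more than one Saffir-Simpson category.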
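The time-to-space conversion and adjustable time window described above can be pictured with the following minimal Python sketch, which shifts each observation into a storm-relative frame using the storm motion and keeps only observations inside the window. The data structure, the units, the steady-motion assumption, and the default 4 h window are simplifications introduced here, not details of the H*Wind implementation.

# Minimal sketch of time-to-space compositing in a storm-relative frame.
# Observations taken within an adjustable window of the analysis time are
# shifted by the storm motion so they can be treated as near-simultaneous.

from dataclasses import dataclass

@dataclass
class Obs:
    x_km: float      # east-west position (km) in a common map projection
    y_km: float      # north-south position (km)
    wind_ms: float   # wind speed (m/s), already adjusted to the surface
    t_hours: float   # observation time minus analysis time (h)

def storm_relative(obs_list, storm_x_km, storm_y_km,
                   u_storm_kmh, v_storm_kmh, window_hours=4.0):
    """Return (dx_km, dy_km, wind_ms) tuples relative to the storm center
    at analysis time, keeping only observations inside +/- window_hours."""
    composited = []
    for ob in obs_list:
        if abs(ob.t_hours) > window_hours:
            continue                     # widen the window if data are sparse
        # Storm center at the observation time, assuming steady motion.
        cx = storm_x_km + u_storm_kmh * ob.t_hours
        cy = storm_y_km + v_storm_kmh * ob.t_hours
        composited.append((ob.x_km - cx, ob.y_km - cy, ob.wind_ms))
    return composited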
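Finally, the role of the low-weight background field can be seen in the toy analysis below, which estimates the wind at a grid point as a Gaussian distance-weighted mean of the composited observations plus a prior-analysis value carried at a small weight. This is only a simplified stand-in: the operational system uses the nested statistical interpolation noted above rather than simple distance weighting, and the length scale and background weight here are arbitrary illustrative choices.

# Toy analysis step (a stand-in, not the Ooyama-style interpolation):
# blend composited observations with a low-weight prior-analysis value.

import math

def analyze_point(x_km, y_km, composited, background_ms,
                  length_scale_km=50.0, background_weight=0.1):
    """Estimate the wind (m/s) at a storm-relative grid point from
    (dx_km, dy_km, wind_ms) tuples and a low-weight background value."""
    num = background_weight * background_ms
    den = background_weight
    for dx, dy, wind in composited:
        r = math.hypot(x_km - dx, y_km - dy)
        w = math.exp(-(r / length_scale_km) ** 2)   # Gaussian distance weight
        num += w * wind
        den += w
    return num / den

Because the background weight is small, the prior analysis matters only where few observations survive quality control, which is the continuity and void-filling role described in the abstract.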