UNIVERSITY OF CALIFORNIA, SAN DIEGO
SCRIPPS INSTITUTION OF OCEANOGRAPHY
MARINE PHYSICAL LABORATORY
San Diego, CA 92152-6400

ATMOSPHERIC OPTICS GROUP
Technical Note No. 273, Version 1.1
November 2007

Continuing Support of Cloud Free Line of Sight Determination Including Whole Sky Imaging of Clouds

Published as Final Report for ONR Contract N00014-01-D-0043 DO #13

Janet E. Shields
Monette E. Karr
Art R. Burden
Richard W. Johnson
William S. Hodgkiss

The material contained in this note is to be considered proprietary in nature and is not authorized for distribution without the prior consent of the Marine Physical Laboratory.

Table of Contents

1. Introduction
2. Background
3. Statement of Work
4. Funding Increments
5. Major Deliveries and Documentation
6. WSI Refurbishment, Deployment, and Support
   6.1 WSI Refurbishment under the Previous Contract
   6.2 Site 2 Deployment
   6.3 Site 5 Deployment
   6.4 Site 3 Preparation and Site Maintenance for SOR and VA Sites
7. Software Upgrades and Site Data Processing
   7.1 Field Software
   7.2 Software Related to Data QC
   7.3 Data Archival and Processing
8. Day Algorithm Upgrade and Analysis
   8.1 Previous Day Algorithm Results
   8.2 Day Algorithm Results for Site 2
9. Night Algorithm Developments and Analysis
   9.1 Previous Night Algorithm Results
   9.2 Development of the First Version of the Full Resolution Night Algorithm
      9.2.1 Transmittance Results
      9.2.2 The Clear Sky Radiance Distribution or "Shell"
      9.2.3 The Opaque Radiance Distribution or "Shell"
      9.2.4 Cloud Decision at the Pixel
      9.2.5 No-moon Algorithm Results for Site 2
      9.2.6 Moonlight Results for Site 2
      9.2.7 Conversion of the Test Code into Field Code
   9.3 Stellar Irradiance Time Series Study
   9.4 Summary of Night Algorithm Results
10. Summary
11. Acknowledgements
12. References
   12.1 In-house Technical Memoranda, in order by memo number; available to sponsor on request
   12.2 PowerPoint Files from Presentations to SOR
   12.3 Selected Published References and Technical Notes in order by date

List of Illustrations

Fig. 1   Sensor and Controller Configuration for Unit 7
Fig. 2   Data Availability, Site 2, 1 May – 10 Dec 2006
Fig. 3   Data Availability, Site 2, 28 Feb – 30 Sep 2007
Fig. 4   Data Availability, Site 5, 2 Feb – 23 Jul 2007
Fig. 5   WSI User Interface
Fig. 6   Raw red and processed cloud decision from SOR site, 26 Jul 05 2200
Fig. 7   SORCloudAssess program display for user assessment of algorithm results
Fig. 8   Fraction of correct answers, in percent, for each ROI for SOR Day Test Bed Processing
Fig. 9   Fraction of correct answers, in percent, for each ROI for Virginia Day Test Bed Processing
Fig. 10  NIR/blue ratios at beta points for several clear-to-hazy days for Site 2
Fig. 11  Typical clear sky (no cloud) results, 20 May 06 1800
Fig. 12  Poor clear sky results due to heavy haze, 10 May 06 1500
Fig. 13  Nice mixed cloud results, 19 May 06 1700
Fig. 14  Clouds on the horizon, 17 May 06 1800
Fig. 15  Overcast sky, 23 May 06 1800
Fig. 16  Hourly time series, 19 May 06 1400 – 2200
Fig. 17  Clear Night, 14 Feb 99 0340
Fig. 18  Cloudy Night, 16 June 99 0510
Fig. 19  Clear sky sample, no moon
Fig. 20  Radiance from Fig. 19 compared with nominal clear sky and opaque sky radiances
Fig. 21  Broken transparent cloud sample
Fig. 22  Radiance from Fig. 21 compared with nominal clear sky and opaque sky radiances
Fig. 23  Example of relatively good night clear sky results
Fig. 24  Example of a night case with opaque and thin cloud
Fig. 25  Example of a moonlight case
Fig. 26  Fraction of correct answers, in percent, for each ROI for SOR Night Test Bed for Region of Sight Test
Fig. 27  TBD
Fig. 28  Relationship between spectral stellar irradiance from measured data and spectral stellar irradiance computed from stellar libraries
Fig. 29  Raw image and cloud decision image with full resolution algorithm, Site 2, 26 May 2006 at 0926
Fig. 30  Curves showing data through the middle column for the typical opaque cloud for this site and time (red), typical clear for this site and time (blue), and actual image shown in previous figure (black)
Fig. 31  Typical results for a clear (cloud-free) night, 20 May 2006 at 0200
Fig. 32  Typical results for scattered clouds, 26 May 2006 at 0918
Fig. 33  Typical results for full overcast with both opaque and thin cloud, 24 May 2006 at 0900
Fig. 34  Relatively poor results for a clear sky, 30 May 2006 at 0300
Fig. 35  Fraction of correct answers, in percent, for each Line of Sight for Site 2 Test Data using Program SORCloudAssess
Fig. 36  Relatively good results for moonlight, 8 May 2006 at 0500
Fig. 37  Relatively poor results for moonlight, 10 May 2006 at 0400
Fig. 38  Time series of stellar irradiance from the images during a night with variable cloud, Polaris, SOR Site, 11 January 2002
Fig. 39  Time series of stellar irradiance from the images during a clear night, Polaris, SOR Site, 12 January 2002
Fig. 40  Time series of stellar irradiance from several nights, Polaris, SOR Site
Fig. 41  Time series of stellar irradiance from Alnath and Dubhe for the same nights shown in the previous figure

List of Tables

Table 1  Typical QC Report, Site 2, Year 2007
Table 2  Field Data Archival
Table 3  Status of Data Processing
Table 4  Summary of Estimated Accuracy for the Day Cloud Algorithm using SORCloudAssess for the SOR and Virginia Test Bed Data Sets
Table 5  Summary of Estimated Accuracy for the Night Cloud Algorithm using SORCloudAssess for the SOR Test Bed Data Set
Table 6  Summary of Estimated Accuracy for the Night Cloud Algorithm using SORCloudAssess for the Site 2 and SOR Data Sets

Continuing Support of Cloud Free Line of Sight Determination Including Whole Sky Imaging of Clouds

Janet E. Shields, Monette E. Karr, Art R. Burden, Richard W. Johnson, and William S. Hodgkiss

1. Introduction

This report describes the work done for the Starfire Optical Range, Kirtland Air Force Base under ONR Contract N00014-01-D-0043 DO #13, between 20 April 2006 and 31 July 2007. This work relates to the Air Force's need to characterize the cloud distribution during day and night for a variety of applications. These applications include support of research into the impact of clouds on laser communication and support of satellite tracking. This contract followed Contract N00014-01-D-0043 DO #4, which is documented in Shields et al. 2007a, Technical Note 271, and Contract N00014-01-D-0043 DO #11, which is reviewed in Section 2 and documented in Shields et al. 2007b, Technical Note 272.

Under DO #13, we finished preparation of two of the WSI units and their software, and fielded them at Air Force Sites 2 and 5. We upgraded the field software and the support software. We upgraded the day cloud algorithm to use Near Infrared data, developed a full resolution night cloud algorithm, and tested and evaluated these algorithms with test data sets. The new algorithms were installed in the field at Site 2. A follow-on grant, N00244-07-01-0009, was funded through the Naval Postgraduate School on 28 June 2007. The work under that grant will be reported separately upon completion of the contract.

2. Background

A series of digital automated Whole Sky Imagers (WSI) have been developed by MPL over many years, beginning in the early 1980's (Johnson et al. 1989 and 1991, and Shields et al. 1993, 1994, 1997a and b). (Published references are listed in Section 12.3.) These systems are designed to acquire accurate imagery of the full upper hemisphere in several spectral filters, in order to assess the presence of clouds at each pixel in the image. A system capable of 24-hour operation, the Day/Night WSI, was developed by MPL under funding from the Air Force, Navy, and Army in the early 1990's (Shields et al. 1998, 2003b and e, 2004a and b, and 2005b and c). One of the first two units was fielded at the Air Force's Starfire Optical Range in October 1992. Technical Memo AV06-034t documents the estimated run-time experience with the different systems. (Technical Memos are listed in Section 12.1.) We have approximately 69 system-years, or 600,000 hours, of experience with the Day/Night WSI systems.

Related systems have been designed and fielded over the years.
These include a Daytime Visible/NIR WSI (Feister et al. 2000, and Shields et al. 2003d), and visible and Short-wave IR systems for airborne use (Shields et al. 2003c). An imaging system for measuring visibility was developed and successfully tested (Shields et al. 2005a and 2006). Also, a field calibration device to enable radiometric calibration of the systems in the field was developed for use with the Day/Night WSI (Shields et al. 2003a).

Partly as a result of SOR's experience with the Day/Night WSI fielded in 1992, they funded development of additional instruments and algorithm development in the past decade. Under Contract N00014-97-D-0350 DO #2 (September 1997 – June 2001), MPL was funded to develop and provide a new Day/Night WSI, which was designated Unit 12 (Shields et al. 2003b). This was the first system using a Windows operating system and the Series 300 Photometrics cameras. (Older systems used DOS and Series 200 Photometrics cameras.) We also analyzed and processed existing daytime cloud decision images to provide statistical estimates of Cloud Free Line of Sight (CFLOS) and related properties.

Under Contract N00014-97-D-0350 DO #6 (May 1999 – May 2003), we were funded to provide two additional instruments, Units 13 and 14, and do additional analysis work (Shields et al. 2004b). These instruments included significant design upgrades, including relocation of the control computer into the outdoor environmental housing, new control hardware and software, and new software to provide near-real-time cloud processing on an additional processing computer. One of the instruments was fielded at a site in California for an experiment. The other was kept at MPL pending sponsor readiness for deployment. In addition, we developed a night cloud algorithm based on detection of the contrast between the signal from stars and their background. The concepts had been developed under funding from another sponsor (Shields et al. 2002). Under Delivery Order #6, this concept was expanded to handle moonlight, converted to fieldable C code, and installed on Unit 12 running at the SOR site.

Under Contract N00014-01-D-0043 DO #4 (May 2001 – September 2006), documented in Shields et al. 2007a, we fielded WSI Unit 14 at a site in Virginia, and supported this deployment as well as the continued operation of Unit 12 at the SOR site. Although Unit 14 ran flawlessly on the earlier deployment and in extensive tests at MPL, we had more problems than normal at the Virginia site, but were able to keep it operational much of the time. The Unit 12 system generally operated well, although it required extensive repair following a lightning strike, and somewhat more repairs than normal during the period after that. We also began evaluation of how to build a new WSI with currently available technology, and how to simplify the system for increased robustness and decreased cost.

Under DO #4 the day algorithm at the SOR site was updated to run in real time (like Units 13 and 14) and to include a horizon and occultor mask, and the clear sky background was updated. Also, a stand-alone version of the algorithm was written. Under funding from another sponsor, an extensive data base was processed and analyzed to extract CFLOS statistics. This work allowed us to determine the strengths and weaknesses of the daytime cloud algorithm. The night algorithm was further developed to handle the anthropogenic light sources near the SOR site. The new night algorithm was installed to run in real time at the SOR site.
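To make the contrast concept behind the night algorithm concrete, the following minimal C sketch shows the essence of the test. This is an illustration only, not the fielded code; the function name, the contrast definition, and the 0.2 threshold are assumptions made for the example. The idea is simply that a star detectable at its cataloged position argues for a cloud-free line of sight, while a star whose signal is indistinguishable from the surrounding sky background argues for cloud.

    #include <stdio.h>

    /* Illustrative contrast-based star test (not the fielded code).
     * star_counts: background-subtracted signal at the star's cataloged
     *              position in the image.
     * bg_counts:   local sky background near the star.
     * The 0.2 contrast threshold is a placeholder value. */
    static int star_visible(double star_counts, double bg_counts)
    {
        double contrast = star_counts / (bg_counts + 1.0); /* avoid divide by zero */
        return contrast > 0.2;  /* placeholder threshold */
    }

    int main(void)
    {
        /* Hypothetical measurements for two cataloged star positions */
        double clear_star = 850.0, cloudy_star = 40.0, sky_bg = 300.0;
        printf("clear-sky star visible:   %d\n", star_visible(clear_star, sky_bg));
        printf("under-cloud star visible: %d\n", star_visible(cloudy_star, sky_bg));
        return 0;
    }

The fielded algorithm was considerably more sophisticated, including the moonlight and anthropogenic-light handling noted above, and it was later reworked to be transmittance-based rather than contrast-based, as described in Section 9.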
Under DO #4 we also provided an extensive analysis of IR systems and their pros and cons for this program. Most of this analysis concentrated on Long Wave IR (LWIR) systems in the 8 – 12 µm wavelength region, because our analysis showed that Short Wave IR (SWIR) systems near the 1.6 µm wavelength region would not be adequate for our purposes, and Mid Wave IR (MWIR) systems near the 3 – 5 µm wavelength region had disadvantages with respect to the LWIR systems. The analysis of theoretical LWIR performance, and of the LWIR images we had access to, showed that an LWIR system probably will not detect most clouds near the horizon in normal haze environments, and will have difficulty detecting high clouds over many parts of the sky under many conditions. This analysis, as well as the other work done under DO #4, is presented in Shields et al. 2007a.

Contract N00014-01-D-0043 DO #11 (September 2004 – April 2006) is documented in Shields et al. 2007b. During this time, we received some of the older DOS-based WSI systems, and began refurbishing them for use with this project. The refurbishment also included creating the control software and the processing software to meet the needs of the SOR program and also be compatible with these older hardware systems. Many new features were added to the software during this time. The first of the refurbished WSI's was operational in June 05, and ran well, but was not fielded because the site was not ready.

We were asked to direct much of the effort under DO #11 toward processing and testing data, and developing methods to evaluate the results. Program SORCloudAssess was developed to enable testing, and day data from the SOR site were processed and found to be very good, with an estimated accuracy of about 98%. This program does not use a blind test, so the numbers could be optimistic; however, the program does enable us to quantitatively compare different sites and algorithm versions. We were challenged to process data for a hazy site, so data from the Virginia site were processed, and the day cloud algorithm also worked well for this site, with an estimated accuracy of about 98%. For the VA data, we updated the day algorithm to use Near Infrared (NIR) data, and also tested an adaptive algorithm concept.

Meanwhile, the night algorithm data were tested using Program SORCloudAssess. The accuracy is more difficult to assess due to the moderate resolution of the night algorithm; however, we still got results of 88 – 95%, depending on whether one assesses the line of sight or the region of sight, as discussed in Technical Note 272. We also developed a method to ground-truth the results using the brightest stars, and developed and tested beginning concepts for a high resolution night algorithm. Much of the work to convert the night algorithm to a transmittance-based algorithm (rather than contrast-based) was done under this contract. Work in several other areas, including evaluating night data in cities and developing forecasting techniques, was also begun under this contract.

3. Statement of Work

The Statement of Work for Delivery Order #13 is given below.

Primary Task: The contractor shall, unless otherwise specified herein, supply the necessary personnel, facilities, services, and materials to accomplish the following tasks within a one-year period following receipt of funding.

1. Continue refurbishment of WSI units for deployment at locations to be determined, including upgrading of software capabilities as appropriate for efficient use of the instruments.
If a sponsor-provided deployment site becomes available, deploy one of the WSI units there.

2. Support the sponsor in QC and analysis of the field data and in getting the cloud algorithms initialized and in place. Continue development of cloud algorithms appropriate to the conditions encountered at the fielded locations.

3. Continue development of capabilities related to the forecast of Cloud Free Line of Sight. This may also include extension of cloud algorithms to other environments, and improved means to assess the results.

4. Continue evaluation of the optimal system and algorithms for possible future applications.

5. Provide interim verbal reports at meetings with the sponsor or provide interim reports via e-mail. Provide a written summary report on conclusion of the contract.

Under the optional budget, if funded, the contractor shall accomplish the following:

6. Coordinate with the sponsor regarding the most appropriate tasks and estimated costs for further development. Tasks are anticipated to include one or more of the following:

6a. Deploy additional WSI units at sponsor-provided sites.

6b. Continue work on the tasks outlined in the primary statement of work.

6c. Extend the above work to meet new and evolving needs of the program, as discussed with SOR and mutually agreed upon by SOR and MPL.

7. Provide personnel trained in the WSI and its capabilities to address these tasks to the limit of funding provided under the optional budget. These tasks may include hardware deployment, development and repair, data analysis, software development, documentation, and other tasks related to the WSI and as mutually agreed upon by the sponsor and by MPL to be appropriate.

4. Funding Increments

The funding was sent from the Air Force to ONR in two funding increments. An initial funding increment was sent to ONR in a MIPR dated 24 January 2006. The new contract was received at MPL 20 April 2006. This funded the primary task budget, and the priority was to field the next WSI, continue system refurbishment, and continue with data processing and cloud algorithm evaluation, i.e. primarily Tasks 1 and 2. The second funding increment was issued 17 May 2006 and received 31 August 2006. This MIPR fully funded the optional budget. The priority from the sponsors was to continue further developments under Tasks 1 and 2.

Because it impacted our work, I would also like to note that we lost our hardware engineer in August 2006. We immediately advertised to replace him; however, funding issues prevented our actually hiring anyone full time. Although the funding on this contract was originally intended to cover FY 06, we realized by late 2006 that we had to make the funding last for most of FY 07. Our sponsors allowed us to slow our spending, so that we would be able to continue to address their needs longer than just a year. As a result, we did not have any hardware engineer or technician during most of the contract period. Our project manager was mostly unfunded under the project for about 6 months, and team members lowered their spending rates as much as possible. In spite of this, we were able to make significant progress under this project. The project manager continued working on an unfunded basis on instrument support and preparation for deployment, as well as project management and support of the other work. We used the unfunded time to write reports on the previous contracts, as well as to help a graduate student with development of a technique to extract aerosol amount from the WSI data.
5. Major Deliveries and Documentation

This section outlines the tasks and associated deliveries. The details of the hardware and software deliveries are discussed in Sections 6 and 7. The details of the algorithm developments and data analysis are discussed in Sections 8 and 9.

Task 1: This task is directed primarily toward the refurbishment of instruments, the deployment when sites were ready, and the associated development of the software. The refurbishment of WSI Unit 7 was completed under this contract, and the instrument was deployed to Site 2 during 1 – 5 May 2006, as documented in Memo AV06-008t. This deployment and the resulting data are discussed in Section 6.2. The refurbishment of WSI Unit 4 was also completed, and this unit was deployed to Site 5 on 31 January 2007, as documented in Memo AV07-010t. This work is discussed in Section 6.3. In addition, a third unit, WSI Unit 8, was partially refurbished under the contract. The supporting software was upgraded in numerous ways, as discussed in Section 7. The first version of the new RunWSI program that controls these instruments is documented in Memo AV06-005t, and the first version of the processing code for these instruments, ProcWSID, is documented in Memo AV06-006t. Other software deliveries throughout the contract period are documented in Section 7.

Task 2: This task is to support the sponsor with data QC, and continue algorithm development. Upgrades to the QC software were made, and new programs have been written to support the QC, such as Program WSIFluxChk, documented in Memo AV06-023t. The data from the full deployment set have been QC'd as the files became available to us, and the results are reported approximately monthly. The QC is discussed in Section 7.

The day algorithm was upgraded to use the NIR data, and to use a clear sky background reference level that is a function of solar zenith angle. This algorithm was installed in the field at Site 2 on 6 September 2006, as documented in Memo AV06-015t. At Site 5, the algorithm is installed, but its inputs are not initialized yet, as discussed in Sections 7.3 and 8. The day algorithm was also used to process considerable test data, as discussed in Section 8. This algorithm works quite well except for very hazy conditions. Under this contract, work was continued on an adaptive algorithm to handle haze.

The first version of the full resolution night algorithm was developed under this contract, and is discussed in Section 9. It was initially installed at Site 5 on 1 February 2007, as documented in Memo AV07-007t, although it has not yet been initialized for that site with the proper site-dependent backgrounds. The algorithm was installed in the field at Site 2 on 5 March 2007, with the site-dependent backgrounds. This algorithm works very well for no-moon conditions. It is an adaptive algorithm, and adjusts each image for current lighting levels. Under this contract, work was begun on a moonlight full resolution algorithm.

Task 3: Task 3 was to continue development toward forecasting cloud free line of sight, and/or extend the algorithms into new environments. Under the previous contract, we began development of concepts for forecasting the cloud free line of sight, as documented in Memo AV05-036t. This forecast concept is based on the presence and amount of clouds in the direction of the line of sight, in the rings around the LOS, and over the whole sky. Under the earlier contract, we also began creating software to use archived WSI data to determine the most effective logic for the forecast.
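As an illustration of the kinds of forecast features just described, the following C sketch computes the cloud fraction in the LOS direction, in an annular ring around it, and over the whole sky, from a cloud decision image. It is a sketch only, under stated assumptions: the image size, the numeric decision codes, and the radii are all values chosen for the example, not taken from the fielded software.

    #include <math.h>
    #include <stdio.h>

    #define N 64   /* assumed (small) cloud-decision image size for the example */

    /* Decision codes as described in Section 8; numeric values assumed here. */
    enum { NO_DATA = 0, CLEAR = 1, THIN = 2, OPAQUE = 3 };

    /* Fraction of valid pixels that are cloudy inside an annulus of radii
     * [r_in, r_out] centered on the line-of-sight pixel (cx, cy).
     * r_in = 0 gives the disk containing the LOS itself; a large r_out
     * approximates the whole-sky cloud fraction. */
    double cloud_fraction(unsigned char cld[N][N],
                          int cx, int cy, double r_in, double r_out)
    {
        long cloudy = 0, valid = 0;
        for (int y = 0; y < N; y++)
            for (int x = 0; x < N; x++) {
                double r = hypot((double)(x - cx), (double)(y - cy));
                if (r < r_in || r > r_out || cld[y][x] == NO_DATA) continue;
                valid++;
                if (cld[y][x] != CLEAR) cloudy++;
            }
        return valid ? (double)cloudy / valid : 0.0;
    }

    int main(void)
    {
        static unsigned char cld[N][N];
        for (int y = 0; y < N; y++)
            for (int x = 0; x < N; x++)
                cld[y][x] = (x > 40) ? OPAQUE : CLEAR;  /* toy cloud bank */

        /* Features of the kind described above, for a LOS at image center:
         * LOS disk, surrounding ring, and whole sky. */
        printf("LOS disk:  %.2f\n", cloud_fraction(cld, N/2, N/2, 0.0, 3.0));
        printf("ring:      %.2f\n", cloud_fraction(cld, N/2, N/2, 3.0, 10.0));
        printf("whole sky: %.2f\n", cloud_fraction(cld, N/2, N/2, 0.0, N));
        return 0;
    }

In a forecast scheme of the kind described above, time series of such features would then be used to estimate the probability that the LOS remains cloud free.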
Under this contract, we did not continue with the forecasting work, because there was so much to do under Tasks 1 and 2, which were the sponsors' priorities. (We hope to continue this work in the future.) However, we did extend the algorithms into new regimes. Specifically, we upgraded the day algorithm to handle haze better, based on analysis of data from the Virginia site, as documented in Memo AV06-018t, and used this same upgraded algorithm for Site 2, which is also somewhat hazy. A second new regime that was successfully addressed was bright regions at night: our new night algorithm handles bright city lights well, as tested with Site 2 data and documented in Memo AV06-033t.

Task 4: Task 4 was to continue development of optimal hardware configurations for future systems. This task received less attention than Tasks 1 and 2, due to sponsor priorities. However, under this contract we documented several ideas for hardware upgrades in Memo AV06-017t. Also, we continued mock-up level development of new concepts for solar occultors that should be more reliable than the current trolley system. We also wrote Memo AV07-026t, which documents the analysis of infrared system concepts completed under earlier contracts.

Task 5: Task 5 was to provide interim verbal reports at sponsor meetings, provide interim reports via e-mail, and provide a final report. PowerPoint presentations were made to the sponsors in September 2006, October 2006, and January 2007. The contents of these reports are briefly described in Reference Section 12.2. We provided informal e-mail reports initially, and starting October 2006, formal monthly reports were submitted in the format requested by the sponsors. In addition, there were weekly conference calls for planning, when the sponsors were available. This report, Technical Note 273, constitutes the final report, which will also be submitted to ONR for distribution.

Under the optional funding, we were asked to continue with the work under Tasks 1 and 2, which is described above. Although this section documents some of the more significant deliveries, there were numerous additional deliveries, which are documented in the following sections. We believe we have completed all contract requirements. The Statement of Work tasks were open-ended, and we have made progress in all areas, most significantly in Tasks 1 and 2, which were our sponsors' priorities. This work is discussed more fully in the following Sections 6 – 9.

6. WSI Refurbishment, Deployment, and Support

This section discusses the progress with WSI refurbishment and deployment, and the QC and support of the fielded instruments. The software upgrades that were part of this development are discussed in Section 7. More details on the day and night cloud algorithm developments are given in Sections 8 and 9.

6.1. WSI Refurbishment under the Previous Contract

As discussed in Technical Note 272, some older WSI systems became available early in the previous contract, and the decision was made to refurbish these systems for use at three new sites for this program. These older instruments were originally built for the Department of Energy (DOE) and used for many years at a variety of sites. The first two units for refurbishment, Units 7 and 8, were received in-house in February 05, as documented in Technical Memo AV05-004t. The retired unit that was already in-house, Unit 4, is documented in Memo AV05-020t.
Other units that can serve as test systems and spare parts, or be used for other programs, are documented in Memos AV05-025t, AV05-028t, and AV06-022t.

The configurations of the Unit 7 sensor and the controller upon completion of the refurbishment are shown in Figure 1. The left side shows the sensor unit, with its environmental housing and the solar/lunar occultor. This unit has successfully operated for many years in the Arctic, and similar units have operated in other locations, including the tropics and the desert. The right side shows the controller unit, as reconfigured for the SOR project. The other units are quite similar, except for the size of the occultor shade, which depends on the latitude of the site. Also, Units 7 and 8 have glass domes, and Unit 4 has an acrylic dome.

Fig. 1. Sensor and Controller Configuration for Unit 7

These systems were built in the mid-to-late 1990's, and as a result they needed a fair amount of refurbishment. Typical refurbishment tasks include disassembly and cleaning, replacing worn components such as the coolant tubing, replacing any failed components such as arc drive motors, and getting the cameras tested and purged at Photometrics. In many cases, these components were still operational, but it was sensible to replace them prior to redeployment. Unit 4 had significant electronics issues with the Accessory Control Panels and camera housings. We also replaced filters as necessary, replaced the shutter, replaced a few cables as required, cleaned the system, replaced the acrylic optical domes, and re-labeled cables. These repairs were completed on the first system (Unit 7), and partially completed on two other systems (Units 4 and 8) under DO #11. Details of the refurbishments are included in Technical Note 272.

The full automated hands-off test of Unit 7 began 27 June 2005. The sponsors had to request that we delay the deployment due to site issues outside of the control of MPL. As a result, Unit 7 was allowed to run continuously at MPL, where it ran well. We ran the control computer and WSI system in hands-off mode, but we also frequently stopped the processing computer in order to test ongoing updates to the processing code. Toward the end of the previous DO #11 contract, the site became available, and so we were able to schedule the deployment as soon as the funds for the DO #13 contract arrived.

6.2. Site 2 Deployment

As soon as funding was received, we scheduled a trip during 1 – 5 May 2006 to deploy Unit 7 to Site 2. The trip report and the component serial numbers are documented in Memo AV06-008t. The system hardware overview is documented in Memo AV06-007t rev 1. The emergency shutdown procedures for site personnel are documented in Memo AV06-010t. The WSI software includes the features in the other SOR instruments, but it is adapted for the hardware of these older systems. In addition, the hardware and software were updated to enable real-time processing of the cloud algorithm, similar to that done on the other SOR systems. The software updates are documented in Memos AV06-005t and AV06-006t, and further discussed in Section 7.

The quality of the resulting imagery was quite good, as may be seen in the images in Sections 8 and 9, which discuss the cloud algorithms. We had a few issues that we had to work our way through.
a) We had been promised an Uninterruptible Power Supply (UPS) to be supplied by another contractor. However, we had not understood that the UPS would only hold power for 3 minutes, i.e. long enough to enable a graceful shut-down of the computer, and that the shut-down information would not be sent to the WSI. It also takes the UPS 2 – 3 hours to come back on. As a result, the WSI is vulnerable to site power-outages. There were many outages at the beginning of the deployment period, but they have lessened considerably. We have found a commercially available UPS that can provide extensive running time as well as a graceful shut-down, but did not have the funding to purchase it under this contract. Consequently, there was a significant amount of down-time due to power-outages.

b) We began to see problems with some of the sub-assemblies during the week of July 24. A trip was made to the site during 2 – 4 August to determine the cause of the problem. The problem turned out to be excess temperature (120°F) in the shed holding the WSI controller, caused by failure of the cooler provided by another contractor. The trip to diagnose this problem is documented in Memo AV06-021t.

c) During the initial deployment we had plugged the Global Positioning System (GPS) into wall power, rather than UPS power. The GPS does not have an automatic turn-on feature after a power-out (this is one of the limitations of using the old camera and hence computer systems). Whenever the GPS was turned off, the computer would miss every other image while trying to find the GPS. To correct this problem, we plugged the GPS into UPS power during the 2 August trip. We also modified the software so that images are not lost when the computer is hunting for the GPS, as discussed in Section 7.

d) In September 2006, the system hung during the full moon cycle. The problem was solved in October. In August, we had set the system up so that if the processing computer did not receive data from the control computer for 5 minutes, it would reboot. During full moon, a data set takes longer than 5 minutes to acquire, so this was causing the computer to reboot. The problem was solved by resetting the wait time to 15 minutes, as documented in Memo AV06-028t. (A minimal sketch of this watchdog logic is given below.)

The resulting data loss statistics are shown in Figure 2. The dark blue shows the fraction of expected images that were successfully grabbed. The turquoise segment shows the fraction of cases lost due to power-outs and failures associated with the excess temperature. The yellows are those cases lost due to hardware or software failures. In this case, the yellow cases were almost all due to times when the GPS was off and the WSI only grabbed every 2 minutes while hunting for the GPS. The red cases are those where we cannot grab every 2 minutes during full moon, because the instrument is currently set to acquire both spectral and open hole data during full moon, and it takes significantly longer than 2 minutes.

Fig. 2. Data Availability, Site 2, 1 May – 10 Dec 2006. (Pie chart; approximately 85% of the expected image sets were successfully grabbed.)

Unfortunately, the Occultor ACP at Site 2 failed on 11 December 2006. Due to the funding issues mentioned in Section 4, we were not in a position to make the repair at that time. We were able to hire a part-time technician for about a month, and get a spare ACP operational. The SOR team replaced the ACP, replaced a faulty occultor arc drive potentiometer, and calibrated the ACP in February 2007, and the instrument performed well after this time. This trip is documented in Memo AV07-038t.
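The watchdog logic behind item d) above is simple; the following C sketch is illustrative only (the function and constant names are assumptions, not the fielded code):

    #include <stdio.h>
    #include <time.h>

    /* Wait limit before the processing computer reboots itself when no data
     * arrive from the control computer.  It was raised from 5 to 15 minutes
     * because a full-moon acquisition sequence takes longer than 5 minutes. */
    #define WAIT_LIMIT_SEC (15 * 60)   /* was (5 * 60) before the fix */

    static int should_reboot(time_t last_data_time, time_t now)
    {
        return difftime(now, last_data_time) > WAIT_LIMIT_SEC;
    }

    int main(void)
    {
        time_t now = time(NULL);
        /* A full-moon set finished 10 minutes ago: no reboot under the new
         * limit, though the old 5-minute limit would have triggered one. */
        printf("reboot? %d\n", should_reboot(now - 10 * 60, now));
        return 0;
    }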
The data rates for the period after February to the end of the fiscal year are shown in Figure 3. We still lose about 1% of the data due to acquiring both spectral and open hole data during moonlight. The 1% hardware/software category was actually due to two things: we occasionally miss grabs when the exposure is long and we have to grab a dark image, and we occasionally lose an image when the file transfer from the control computer to the processing computer is slow.

Fig. 3. Data Availability, Site 2, 28 Feb – 30 Sep 2007. (Pie chart; approximately 98% of the expected image sets were successfully grabbed.)

We should also note that the Teca cooler used at this site is less efficient than the McLean cooler used in the other instruments. The instrument tends to run a bit hot during the summer, but not hot enough to affect the data. Also, we updated the software in September 2006, as documented in Section 7.1, to provide more realistic image grab rates during sunset, install the new day algorithm, upgrade the flux control, and change the file structure in the data archival. In March 2007, we installed the new high resolution night algorithm. The software is discussed in Section 7, and the algorithms in Sections 8 and 9.

6.3. Site 5 Deployment

Both Units 4 and 8 had been partially refurbished under the previous contract, DO #11. Under this contract, DO #13, we continued the refurbishment. By late summer of 2006, we were able to run Unit 4, and were using it also to support the software tests for Unit 7. The system test began in November 2006, and we began final preparations such as replacing tubing and labels, preparing crates, and so on. The deployment was delayed slightly while we waited for the sponsor site to be ready. The instrument was deployed 31 January 2007.

Memo AV07-010t documents the deployment. A new system check list is documented in Memo AV07-001t. The system hardware is documented in Memo AV07-002t. The system setup instructions are in a preliminary version in Memo AV06-011t, and in the final version in Memo AV07-003t. The emergency shut-down and other procedures needed by site support are documented in Memo AV07-004t. Memo AV07-011t provides a post-deployment updated check list.

The software is discussed in Section 7. Memo AV07-005t provides an overview of the control computer operations. Memo AV07-006t documents the processing computer operations. Memo AV07-007t documents the updates to the processing software program ProcWSID. Memo AV07-008t provides an update of the utility programs on the system. The control software, RunWSI, did not change since the last update documented in Memo AV06-016t.

Following deployment, we found that the night images looked excellent, but the day images were slightly out of focus. This does not affect the algorithm results significantly. A leak in the camera housing developed, and by this time we did not have sufficient funding to address this issue. This does not affect the data. While the SOR team was at the site for another reason on 27 – 28 June 2007, they attempted to fix the pressure leak, as documented in Memo AV07-039t. They also replaced the external hard drive and turned the GPS on. The instrument ran very well until it was hit by lightning on 23 July 2007.
Figure 4 shows the data success rate for the period from deployment through the lightning strike. The instrument lost about 6% of the data due to power-outs. We still lose 1% of the data due to taking both spectral and open hole data during full moon. In addition, 2% of the data were lost due to the long exposures and transfer times mentioned in Section 6.2. Repair of the Site 5 instrument is ongoing under the follow-on contract.

Fig. 4. Data Availability, Site 5, 2 Feb – 23 Jul 2007. (Pie chart; approximately 91% of the expected image sets were successfully grabbed.)

6.4. Site 3 Preparation and Site Maintenance for SOR and VA Sites

We completed much of the refurbishment of Unit 8, for Site 3, during this contract. However, the camera has intermittent problems, including loss of sync, and other problems that appear to be associated with the analog-to-digital conversion. During the latter half of the contract, we did not have the funds to hire a technician to debug this, so the problem waited for the follow-on funding.

This contract was not intended to include sufficient funds to address the SOR or Virginia (VA) sites. The occultor ACP had failed at the SOR site, and the camera had failed at the VA site prior to the start of this contract, but we were unable to address these issues.

7. Software Upgrades and Site Data Processing

This section discusses the upgrades to the field software, the software related to data quality control, and the archival and processing of data. Although this section discusses some of the algorithm installation, the actual algorithm upgrades are discussed in Sections 8 and 9.

7.1 Field Software

The WSI control program, RunWSI, had been completed and installed on Unit 7 (for Site 2) toward the end of the previous contract. It included many significant upgrades, as discussed in Technical Note 272. Following deployment to Site 2 in May 2006, we found some issues, discussed in Section 6. The most important of these was the problem with the GPS. The GPS that is compatible with these older computer systems does not have an automatic turn-on feature following power-outs. We had expected to be protected from this problem by a reliable UPS provided by another contractor, but it turned out that the UPS protection was limited to 3 minutes. And a bit of human error was involved, because we plugged the GPS into wall power during the initial deployment. Whenever the GPS was off, the software would hunt for the GPS, and we would lose every other image, acquiring imagery every 2 minutes until the GPS was turned back on by someone at the site.

In August 2006, the GPS was plugged into the UPS. Most importantly, the software was updated to speed up the handling of the GPS, so that data are not lost when the GPS is down. This update, as well as other minor updates in the RunWSI program, were made on 3 August 2006, and are documented in Memo AV06-012t.

In addition, the processing program, ProcWSID, was updated on 26 July 2006 to provide better handling of out-of-bounds ACP readings, and to provide a better method for creating the user interface, i.e. the display shown on the monitor. These upgrades are documented in Memo AV06-013t. The user interface is shown in Figure 5.
This interface shows the most recent image, the most recent status readings (such as occultor position and temperature readings), and the current status of the system (readings normal or not). Normal readings are marked with a green dot. Slightly abnormal readings that indicate that future maintenance may be required are indicated with a yellow dot. An example is the yellow flag shown in Fig. 5 on nitrogen, which does not actually affect the data. Problems that may affect the data, such as unsafe temperatures, are indicated with a red dot. The red is actually colored slightly orange-red, so that color-blind people can see it. The system flags associated with these color dots are documented in Memo AV07-054t.

Fig. 5. WSI User Interface

On 6 September 2006, a new version of ProcWSID was installed at Site 2, as documented in Memo AV06-015t. This installation included a major upgrade to the day cloud algorithm. The day cloud algorithm was improved to use either the red/blue or the NIR/blue ratio, where NIR is the Near Infrared data acquired by the NIR filter. For this site, it was set to use the NIR/blue ratio, as discussed in Section 8. Also, in the new algorithm the clear sky background ratio reference corrections are a function of solar zenith angle. The upgrade included installing the clear sky background library for this site, as well as the other algorithm upgrades, as discussed in Section 8. Other updates associated with this change, such as updated input file formats, are documented in Memo AV06-015t. Also, the file structure for the archival data stored on the external drive was updated to be compatible with the file structure used at SOR to store the data received by ftp.

The next upgrade to RunWSI was made on 8 September 2006 at Site 2, and is documented in Memo AV06-016t. This change modified the expected grab rate for sunrise and sunset from 1 minute to 2 minutes, since longer exposures are used during this time. Also, the flux control settings were updated, as will be discussed in Section 7.2. When Unit 4 was deployed to Site 5 on 31 Jan 2007, it used the same version of RunWSI documented in Memo AV06-016t.

The new high resolution night algorithm was developed during this time, as discussed in Section 9. The new algorithm is documented in Memo AV06-029t, as well as in the September 2006 and October 2006 PowerPoint files. Memo AV06-030t documents the conversion from IDL to C code, and Memo AV06-033t documents the result of processing Site 2 data. The new code was first installed in ProcWSID at the Site 5 installation on 31 January 2007, although the background and other algorithm inputs could not be extracted until after the system had run awhile. This means that the system now applies calibrations to the images, determines the earth-to-space beam transmittance for selected stars (although this is not reported), and then computes the cloud algorithm results in real time in the field. (A sketch of the transmittance computation is given below.) Related changes include changes to header formats, input files, and other subroutines. This software is documented in Memo AV07-007t. The inputs to the night algorithm for Site 2 were extracted and tested, as documented in Section 9. The new software, along with the algorithm inputs, was installed in the field at Site 2 on 5 March 2007.
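A minimal C sketch of the transmittance idea follows. It is illustrative only: the numeric values are hypothetical, and the vertical-path reduction shown is a plane-parallel Beer's-law approximation, not necessarily what the fielded code uses.

    #include <math.h>
    #include <stdio.h>

    /* Earth-to-space beam transmittance along the slant path to one star:
     * the ratio of the calibrated irradiance measured from the image to the
     * exoatmospheric irradiance expected for that star in the sensor
     * passband.  T near 1 implies a clear path; T near 0 implies opaque
     * cloud; thin cloud yields intermediate values. */
    static double beam_transmittance(double measured_irr, double exo_irr)
    {
        return measured_irr / exo_irr;
    }

    /* Approximate reduction of a slant-path transmittance to the vertical
     * using a plane-parallel airmass m:
     * T_slant = exp(-tau * m)  =>  T_vert = exp(-tau) = T_slant^(1/m). */
    static double vertical_transmittance(double t_slant, double airmass)
    {
        return pow(t_slant, 1.0 / airmass);
    }

    int main(void)
    {
        double t = beam_transmittance(3.1e-9, 4.0e-9);  /* hypothetical values */
        printf("slant T = %.2f, vertical T = %.2f\n", t,
               vertical_transmittance(t, 1.6));  /* airmass for ~51 deg zenith */
        return 0;
    }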
The Site 5 deployment also included several new utility programs, as documented in Memo AV07-008t. Program OccInfo provides a sun and moon ephemeris that can be used at the site. Program ProcWSIDInput allows the users to change input files with much less risk of changing inputs that should not be changed. Programs SendWSIFiles, which sends the data by ftp, and SendWSIQCFiles, which sends QC files by ftp, were updated to read the new ProcWSID input files. And Programs WSIView and WSIImageLoop, which allow the user to view stored images and loop through sets of images, were added to the instrument. Prior to this time, they had only been used to study archived data in the lab.

Because the information stored in the image headers has continued to evolve, Memo AV07-009t was written to document the format of the raw and processed image headers on Units 4 and 7, the Unit 12 deployed at SOR, and the Unit 14 deployed in VA. We plan to make these programs more consistent in the future, if funded to do so.

ProcWSID was also updated on 18 May 2007, to provide better handling of dark images during the full moon period. This is documented in Memo AV07-028t. We also realized during this time that we had not documented the software upgrades to Unit 12 at SOR made during the previous contract. These updates are documented in Memo AV06-027t.

7.2 Software related to Data QC

There are several layers of quality control for the WSI systems. While much of the work on the QC software was done under previous contracts, the developments are ongoing, and a short review is appropriate.

The first layer of protection occurs in the program RunWSI, where a variety of checks determine whether the instrument is operating in a safe condition and should produce valid data. These QC checks are documented in Memo AV07-054t. They result in the green, yellow, and red indicators shown in Figure 5. Certain of the red flags, such as excess temperature, will also result in turning the system off automatically, to protect the camera. Under these conditions, the system will continue checking to see if it is safe to turn the camera back on, and turn it on when it is safe to do so. These QC results are also stored in the image header, so that the user may later determine under what conditions a given image was acquired. The header formats are documented in Memo AV07-009t.

The next layer of protection is the generation of QC files by the program ProcWSID. As documented in Memo AV05-023t, there are several QC files, including files that summarize the image acquisition, the occultor behavior, the occasions with abnormal temperatures, and so on. The primary purpose of these files is to make post-archival evaluation of the instrument status easier. The SOR site team has developed a program that automatically checks for the existence of these QC files. Under the follow-on contract, MPL has also developed further programs to process these files. We have also added a 12-bit quality string embedded in the cloud decision header. It is documented in Memo AV06-006t.

In addition, there are several utility programs that support the analysis of the data. Under this program, we developed Program WSIFluxChk, documented in Memo AV06-023t. This program is designed to evaluate all data to determine whether the flux levels are reasonably optimal. One of the flags generated by the RunWSI program checks whether there are any images with more than 1% of the pixels offscale (yellow flag) or more than 5% of the pixels offscale (red flag).
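This offscale test can be sketched in a few lines of C. The 1% and 5% thresholds are from the text above; the 16-bit full-scale value, the names, and the rest of the code are assumptions made for the example, not the fielded implementation.

    #include <stdio.h>

    /* Offscale-pixel QC test.  Returns 0 = green, 1 = yellow (>1% of the
     * pixels offscale), 2 = red (>5% offscale). */
    static int flux_flag(const unsigned short *img, long npix,
                         unsigned short full_scale)
    {
        long offscale = 0;
        for (long i = 0; i < npix; i++)
            if (img[i] >= full_scale) offscale++;
        double frac = (double)offscale / (double)npix;
        if (frac > 0.05) return 2;   /* red: exposure table needs attention */
        if (frac > 0.01) return 1;   /* yellow */
        return 0;                    /* green */
    }

    int main(void)
    {
        static unsigned short img[1000];
        for (int i = 0; i < 1000; i++)
            img[i] = (i < 30) ? 65535 : 12000;  /* 3% of pixels saturated */
        printf("flag = %d\n", flux_flag(img, 1000, 65535));  /* prints 1 */
        return 0;
    }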
When we first installed the instrument at Site 2, this offscale flag was being raised frequently, because the city lights were so bright. Program WSIFluxChk makes it much easier to see when this occurs, so that we can adjust the flux control table accordingly. The flux control table instructs the program on what exposures and filters to select, depending on the sun and moon position, the moon phase, and the earth-to-moon distance.

The results of the Site 2 flux check are given in Memo AV06-024t. On the basis of this analysis, the flux table was improved on 8 September 2006, as documented in Memo AV06-016t. In addition, the flux levels for Unit 4 were checked at MPL prior to its installation at Site 5. The spectral filters we used in earlier systems are no longer commercially available. Unit 4 is the first of the systems with the new spectral filters, and they require slightly different exposure settings. Memo AV06-025t documents the flux check based on data acquired at MPL.

Other programs used at MPL to ease the data QC task are programs WSIView and WSIImageLoop, which allow the user to view stored images and loop through sets of images. These are documented in Memo AV07-008t.

The QC results are reported regularly (approximately monthly) to the sponsors in the form of an Excel table. A sample table, for Site 2, Year 2007, is shown in Table 1. In the original Excel file, the cells have embedded comments, so that additional details may be seen. In this example, we see that the temperatures had yellow flags at least once during essentially every week. As discussed in Section 6, this is due to the use of the Teca cooler, which is not as good as the McLean. During the August 2006 trip, we replaced the Teca cooler with a new Teca cooler, to see whether this would improve the temperatures, but it did not. We verified that the data are not impacted. The cooler was not replaced with a McLean, because this requires a significant effort, and the data appear to be unaffected.

We also note several GPS yellow flags in Table 1. This occurs when there has been a power-out, and the GPS was not yet turned on by the site personnel. This affects the computer clock time if it is allowed to continue long enough for the time to drift significantly. We recommend purchasing a UPS for the field sites if the sponsors approve in the future. The sponsors did not consider it a priority at the time of this contract. The only red flags during this time were a few images the week of March 5, when someone had turned on the lights at the site at night, affecting the flux levels.

Table 1. Typical QC Report, Site 2, Year 2007

Notes: A few SP (spectral filter) errors are normal; only abnormal amounts are noted. The spectral filter sequence is "143" or "1435" during the day, and "2" or "25" at night. Days with fewer than 3 flux control errors are not counted. All times are GMT.

Week              | Yellow Flags               | Red Flags    | Comments
Aug. 20 - Sep. 2  | Env. Housing, GPS          | None         | Env. Housing over 32°C Aug. 20-31, Sep. 1-2
Aug. 1 - 19       | Env. Housing               | None         | Env. Housing over 32°C Aug. 1, 3-12. Four-hour power out on Aug. 12
July 1 - 31       | Env. Housing, GPS          | None         | Env. Housing over 32°C July 1-31. Five-hour power out(?) on July 16. GPS off July 16-24
June 18 - 30      | Env. Housing               | None         | Env. Housing over 32°C June 18-30. 22 June: Dari at site, system reset at 20:08
May 28 - June 17  | Env. Housing, GPS          | None         | Env. Housing over 32°C May 28 - June 17. Power out on June 1, 20:05-20:51. GPS off June 1-17
May 21 - 27       | Env. Housing               | None         | Env. Housing over 32°C May 21-27
May 14 - 20       | Env. Housing               | None         | Env. Housing over 32°C May 14-20
May 7 - 13        | Env. Housing, Flux control | None         | Env. Housing over 32°C May 10-13. Temperature highs at site in the mid-to-high 80's (°F). 6 flux control errors on May 7 at sunset
Apr. 30 - May 6   | Env. Housing               | None         | Env. Housing over 32°C Apr. 30 - May 6. Temperature highs at site in the mid-to-high 80's (°F)
Apr. 23 - 29      | Env. Housing               | None         | Env. Housing over 32°C Apr. 23, 24, 27-29. Temperatures at site in the mid-80's (°F). Possible power spike at 18:53 on Apr. 25
Apr. 16 - 22      | Env. Housing               | None         | Env. Housing over 32°C Apr. 22. This is normal with this cooler and similar to last year; it does not affect the data or indicate a problem
Apr. 3 - 15       | Env. Housing               | None         | Env. Housing over 32°C Apr. 3, 4, 12-14. This is normal with this cooler and similar to last year; it does not affect the data or indicate a problem
Mar. 12 - Apr. 2  | Env. Housing               | None         | Env. Housing over 32°C Mar. 23-31
Mar. 5 - 11       | Flux control               | Flux control | Flux control errors on Mar. 5-6 may be due to work being done on site by Oceanit(?). Control and processing computers were restarted a few times on Mar. 5-6. Pending: review images and redo geometric calibration
Mar. 1 - 4        | None                       | None         | System returned to service Feb. 28. No significant errors noted in the QC files for Mar. 1-4
Prior to Feb. 28  | (No data)                  | (No data)    | System shut down due to Occultor ACP failure

7.3. Data Archival and Processing

The instruments deployed to Sites 2 and 5 include external hard drives (originally 300 GB, now 500 GB). The primary data archival mechanism is at SOR; data are ftp'd directly from the processing computer to the SOR site, where they are archived.
The external drives are a redundant backup that enables MPL to use the data without causing extra work transferring data at SOR. These external drives also provide a second archive, in case of server crashes at the SOR site. The 300 GB drives must be changed approximately every 6 months, and the 500 GB drives every 9 months. The procedure for changing the drive is documented in Memo AV07-017t. The drives with archived data that have been acquired to date are listed in Table 2.

Table 2. Field Data Archival

Drive Number | Site | Dates
17           | 2    | 2 May - 26 Sep 06
18           | 2    | 3 Aug - 11 Dec 06, 27 Feb - 27 Apr 07
19           | 5    | 31 Jan - 27 June 07

We would also like to document the status of the processed data. When the system is first placed in the field, the processed cloud data are not valid, because the site-dependent inputs have not been determined. Once the algorithm inputs such as the clear sky background are set up using the field data from the site, the processed data should be valid (to within the accuracy of the installed algorithm). Ideally, we like to set up the algorithms as quickly as possible, and also to process the data that were acquired in the interim. This has not always been possible, because often we have been directed toward different priorities. For example, because we had no hardware person during most of this contract, the work on preparing the instruments for fielding was largely performed by the team members who normally work on the algorithms. And often at the meetings, we are directed to different priorities, such as doing analysis of time series of stellar irradiance (as discussed in Section 9.3).
As a result, we have not been as responsive as we would like to the need to get the algorithms set up and the data processed. The status of the processed data is given in Table 3. The results of the processing are discussed in Sections 8 and 9, which discuss the cloud algorithms and data processing results.

Table 3. Status of Data Processing

Site | Day or Night | Dates                | Status
2    | Day          | 3 May - 27 Jun 06    | Processed, delivered 20 Jul 06, Memo AV06-020t
2    | Day          | 27 Jun - 6 Sep 06    | Unprocessed
2    | Day          | 6 Sep 06 - Present   | Automatically processed in field, Memo AV06-015t
2    | Night        | 3 May - 27 Jun 06    | Processed, Memo AV06-033t
2    | Night        | 27 Jun 06 - 5 Mar 07 | Unprocessed
2    | Night        | 5 Mar 07 - Present   | Automatically processed in field
5    | Day          | 31 Jan - 27 Jun 07   | Unprocessed
5    | Night        | 31 Jan - 27 Jun 07   | Unprocessed
SOR  | Day          | Jul - Aug 2005       | Processed, delivered 11 Oct 05, Memo AV05-032t
SOR  | Night        | Jul - Aug 2005       | Processed, delivered 11 Oct 05, Memo AV05-032t
SOR  | Day & Night  | Nov 2005 - Present   | Automatically processed in field
SOR  | Day & Night  | All other dates      | Processed data not required
VA   | Day          | April 2005           | Processed, reported Feb 06, Memo AV06-018t
VA   | Day & Night  | All other dates      | Processed data not required

8. Day Algorithm Upgrade and Analysis

During the previous contract, as documented in Technical Note 272, we had processed and evaluated data from both a clear site at SOR and a hazy site in Virginia (VA). We had also developed a program, SORCloudAssess, that enables a somewhat more quantitative evaluation of the algorithm results. Section 8.1 provides an overview of these earlier results, and Section 8.2 discusses the work under this contract.

8.1. Previous Day Algorithm Results

For the SOR site, we processed a test-bed of data from July and August 2005 taken at 1-minute intervals, yielding a set of 15,000 images. For the VA data, we processed data from April 2005, as it was the closest month we had to summer, when the haze should be heaviest on average. Hourly data were processed, and 307 images were analyzed. Also, a major upgrade was made to the algorithm at this time, to use the Near Infrared (NIR) imagery in the day cloud algorithm.

A sample raw image and cloud decision image are shown in Fig. 6. The cloud decision image on the right shows the results of the cloud algorithm. In this image, black is the color code used for "no data", blue is "no cloud", yellow marks pixels the algorithm has identified as "thin cloud", and white is "opaque cloud". The texture within each of these regions is only to help the analyst assess the images. (For example, in the opaque cloud regions, colors ranging from grey to white indicate how much the ratio exceeded the opaque threshold; a sketch of this per-pixel ratio decision is given below.)

Fig. 6. Raw red and processed cloud decision from SOR site, 26 Jul 05 2200

Because the WSI looks up, the directions in these images are not the same as on a map. East is to the right and West to the left; however, North is at the bottom of the image and South at the top. (Visualize the scene while lying on your back with your toes to the north.)
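As a rough illustration of the per-pixel decision just described, the following C sketch compares the measured ratio at a pixel to the recreated clear-sky background ratio. The threshold factors and names are placeholders for the example, not the fielded values, and the fielded algorithm includes corrections not shown here.

    #include <stdio.h>

    enum cloud_code { NO_DATA, CLR, THIN, OPAQUE };

    /* Per-pixel day decision sketch.  ratio is the measured NIR/blue (or
     * red/blue) ratio at the pixel; clear_bg is the clear-sky background
     * ratio recreated for that pixel from the site library.  The 1.3 and
     * 2.0 factors are placeholder thresholds. */
    static enum cloud_code day_decision(double ratio, double clear_bg)
    {
        if (clear_bg <= 0.0) return NO_DATA;   /* masked: horizon, occultor */
        double r = ratio / clear_bg;           /* excess over clear sky */
        if (r > 2.0) return OPAQUE;
        if (r > 1.3) return THIN;
        return CLR;
    }

    int main(void)
    {
        /* Hypothetical pixel ratios against a clear-sky background of 0.80 */
        printf("clear pixel:  %d\n", day_decision(0.85, 0.80));  /* CLR    */
        printf("thin pixel:   %d\n", day_decision(1.20, 0.80));  /* THIN   */
        printf("opaque pixel: %d\n", day_decision(2.00, 0.80));  /* OPAQUE */
        return 0;
    }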
During DO #11, we had developed the program SORCloudAssess to help us evaluate quantitatively how the algorithm was doing, and where the trouble areas were. The display in Figure 7 is from Program SORCloudAssess. In this image, several Regions of Interest (ROI) are marked, at the (zenith, azimuth) angle combinations (0,0), (30,0), (30,90), (30,180), (30,270), (60,0), (60,90), (60,180), (60,270), (80,90), and (80,270). The analyst uses the mouse to mark any of these predesignated ROI's that are judged to be incorrect, and also indicates whether the overall results are judged to be correct over 90% of the image. (The ROI indicators are difficult to see in the report format, and are designed for best viewing during use of the program.) Although this program does not use a blind test, we felt that it would provide a good way to look at the errors, and the conditions under which the errors tend to occur. Another contractor has developed a blind test program based on this program, and we hope to adapt it for our needs and use it in the future.

For the SOR data set, we processed data taken at 1-min intervals, and then we assessed data taken at hourly intervals from the entire test bed data set, thus assessing 310 images, or 3410 ROI's. Overall, we found that the ROI results were evaluated to be correct 97.4% of the time, with a 99.0% rate at the zenith, and a 95 - 96% rate near the horizon. A later evaluation sorted out the sunrise/sunset cases, and with these removed, the average results over the full sky were evaluated to be correct 98.4% of the time. Results were estimated to be correct over 90% of the sky 94.5% of the time. The distribution of the results as a function of ROI position is shown in Figure 8.

Fig. 7. SORCloudAssess program display for user assessment of algorithm results

A similar assessment was made with the VA data, with results as shown in Fig. 9. Even though the VA site was more difficult, due to the heavier haze, we still achieved very similar results, due to the introduction of the NIR/blue algorithm. We did note that there were four days in our VA test set with quite heavy haze, and we made a manual adjustment to the haze level for this set. It will be important to develop an algorithm that can automatically sense and adjust for the haze level.

Fig. 8. Fraction of correct answers, in percent, for each ROI for SOR Day Test Bed Processing

Fig. 9. Fraction of correct answers, in percent, for each ROI for Virginia Day Test Bed Processing

Although there are obvious improvements we can make to the algorithm, we were very pleased with these results. A summary of these daytime results (with sunrise/sunset removed) for both the SOR and the VA test bed data sets is shown in Table 4.

Table 4 Summary of Estimated Accuracy for the Day Cloud Algorithm Using Program SORCloudAssess for the SOR and Virginia Test Bed Data Sets

Region         SOR Results   VA Results
Overall        98.4%         98.1%
Zenith         99.5%         97.5%
Horizons       98.5%         98.8%
90% Correct?   96.7%         96.7%

At this point, we felt that we had demonstrated that the day algorithm works well in relatively haze-free atmospheres, and works reasonably well in hazy atmospheres. As discussed above, we would like to develop a better assessment tool, based on the blind test program which was developed using the ideas from this program. It is important to continue to test other environments, and to continue to make upgrades to the day algorithm.

8.2. Day Algorithm Results for Site 2

As documented in Section 6.2, the Site 2 instrument deployment occurred right after this contract was funded, during May 1 - 5, 2006.
The data for the period May 3 through June 27 were used to set up the algorithm. The data were processed and delivered to the sponsors on 20 July. The results of the processing are documented in Memo AV06-020t, and the updates to the stand-alone algorithm processing software are documented in Memo AV06-019t. The algorithm was integrated into the real-time software on the instrument on 6 September 2006. The upgrades in the field version of the code are documented in Memo AV06-015t, and the WSI control code (RunWSI) updates made on 8 September are documented in Memo AV06-017t. The day data were acquired at 1-min resolution, and the full data set was processed. There were some days when data were acquired only every 2 minutes, as discussed in Section 7.1.

These data were processed using the NIR/blue algorithm used for the VA site, rather than the red/blue algorithm used for the SOR site. This is because we expect Site 2 to be somewhat hazy, and the NIR data should work better for hazy sites. Also, inspection of the imagery showed that although the red and the NIR images were often similar, the NIR images sometimes showed better contrast between the clouds and the background sky.

When we extracted the NIR/blue clear sky background, we also evaluated how the NIR/blue reference values behaved. The reference values are the ratios at the beta-beta points, and are normally extracted only for clear skies. There are two points in the sky where the beta scattering angle is 45° and the zenith angle is also 45°. The beta-beta values are the average of the NIR/blue ratio values for these two points. The saved clear sky background values are actually the ratios divided by these reference values. When the clear sky background for a specific image is recreated from the saved background library, it is renormalized. Previously the renormalization value was a fixed number based on these reference values, but as discussed below, that changed with this data set.

The reference values for several clear days are shown in Fig. 10, plotted as a function of solar zenith angle. The behavior was similar to what we have seen before, with higher values at sunrise and sunset, and higher values on hazy days. We also saw a sharp downturn in the curve at 85° zenith angle, as had been hinted at by the VA data, as shown in Memo AV06-020t.

Fig. 10. NIR/blue ratios at beta points for several clear-to-hazy days for Site 2. [Plot: reference values vs. solar zenith angle (degrees) for May 4, 7, 12, 17, 20, and June 8]

Also, we noted that this data set had larger increases in reference values with increasing Solar Zenith Angle (SZA) than we had seen before. This was much more marked at SZA's from 70° to 85°, and even to some extent from 40° to 70°. As a result, we decided to upgrade the program to allow this normalization constant to vary with SZA.
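A minimal sketch of this upgrade is given below, assuming the reference values have been tabulated from clear days as a function of SZA; the grid and values shown are illustrative, not the Site 2 calibration.

    import numpy as np

    # Illustrative table of beta-beta reference values vs. solar zenith
    # angle, of the kind extracted from clear days (Fig. 10); these are
    # placeholder numbers, not the Site 2 values.
    SZA_GRID = np.array([20.0, 40.0, 70.0, 85.0, 90.0])
    REF_VALUES = np.array([1.05e4, 1.10e4, 1.35e4, 1.80e4, 1.20e4])

    def reference_value(sza_deg):
        """Interpolate the NIR/blue beta-beta reference value at a given
        SZA. Previously a single fixed number was used; this allows the
        renormalization to vary with solar zenith angle."""
        return np.interp(sza_deg, SZA_GRID, REF_VALUES)

    def renormalize_background(saved_background, sza_deg):
        """Recreate the clear-sky NIR/blue background for one image.
        The saved background holds ratios divided by the reference value,
        so multiplying by the SZA-dependent reference restores the
        absolute ratios."""
        return saved_background * reference_value(sza_deg)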
We found that the results of the new cloud algorithm processing were reasonably good, but that it would be very helpful to finish testing and integrating the adaptive algorithm that should automatically adjust for the haze amount. With the current algorithm we found that on very clear days, some thin cirrus were misidentified as clear, while on the more hazy days, the haze was identified as thin cloud. In most cases, however, the results appear to be quite good. Examples are shown in Figures 11 - 15. Figure 11 shows good clear sky results. Relatively poor results from a hazy sky are shown in Figure 12.

Fig. 11. Typical clear sky (no cloud) results, 20 May 06 1800

Fig. 12. Poor clear sky results due to heavy haze, 10 May 06 1500

Figures 13 and 14 show clouds overhead and on the horizon. Figure 15 shows an overcast case with rain drops on the dome. Figure 16 shows a beautiful hourly time series acquired over an 8-hour interval (9 images).

Quite a few program updates were made during this time. The Site 2 data have a slightly different format than the VA data taken by the older programs, so we had to update the AutoProcWSI program that is used for interactive processing of archived data. We added various test features, such as the ability to test the day algorithm down to a user-selected SZA value. In addition, several of the utility programs were updated to handle these changes. The updates were also made to the field code, and installed on the Site 2 and Site 5 systems.

Fig. 13. Nice mixed cloud results, 19 May 06 1700

Fig. 14. Clouds on the horizon, 17 May 06 1800

Fig. 15. Overcast sky, 23 May 06 1800

Fig. 16. Hourly time series, 19 May 06 1400 - 2200

We did not have the opportunity to do quantitative tests on this data set. As mentioned earlier, another contractor has developed a new version of the SORCloudAssess program that uses a blind test. This program was used to evaluate data from several months at Site 2, and yielded 83% accuracy. This is surprisingly low compared with the 98% values obtained at the earlier sites. Part of the difference may be that in doing the SORCloudAssess test, we were slightly biased because it was not a blind test. Also, it is possible that the Site 2 data set had more variable haze amounts, in which case it will be even more important to introduce programming to handle the haze amount. We also had some concerns with the blind test program itself. For example, it treated the "no data" occultor region as data, and we are not sure how that impacts the statistics. Also, the portions of the image the algorithm classified as "indeterminate" were handled, I believe, as if the algorithm had assigned them a "cloud" value, and this would certainly cause an artificially low accuracy estimate. Finally, untrained observers made the assessment, and I believe this would also cause problems. We hope to have an opportunity in the future to fix these issues, and reassess the accuracy.

Although the next site, Site 5, was installed in January 07, we did not extract the day algorithm inputs for that site, because we had no funding to pay the analyst for this task. During this period, we also documented the processing of the VA data in Memo AV06-018t and the results of an infrared study in Memo AV07-026t. The work documented in those memos was performed under the previous contract; however, the documentation had not been completed at that time.

In summary, we are pleased with the current status of the day algorithm, as we believe it provides reasonably accurate results. We hope to have an opportunity soon to continue to upgrade the algorithm to include the adaptive adjustment for haze, and to upgrade the blind test program to be compatible with the features of the algorithm. In addition, we need to work on calibrating the definition of the thin and opaque cloud thresholds in the algorithm in terms of optical depth. However, a higher priority at this time (on the follow-on contract) is to extract the clear sky background for Site 5, and process the data from that site.

9. Night Algorithm Developments and Analysis
During the previous contract, as documented in Technical Note 272, we developed a new ground-truthing method using bright stars, did additional work toward developing a transmittance-based moderate resolution algorithm, and tested concepts for a high resolution algorithm. We also processed a data set from SOR, and delivered the processed data to SOR. This data set enabled us to do a much better evaluation of the state of the night algorithm, in terms of when it does well and when it does not. Under DO #13, the subject of this report, we developed the first version of the high resolution algorithm, set it up for Site 2, and installed it in the field. This high resolution algorithm works well on moonless nights, so we proceeded to develop concepts, and start the programming, for an upgrade to handle moonlight. Before discussing these current developments, we will review the results obtained under the previous contract.

9.1. Previous Night Algorithm Results

During the previous contract, DO #11, discussed in Technical Note 272, our sponsors asked us to develop a ground-truthing method that would provide a means to verify that the cloud algorithm results are reasonably correct. We suggested that a few of the brightest stars might be used to test the results, by assessing their transmittance. (These bright stars would not be used in the actual algorithm, in order to provide at least some independence.) In order to extract the transmittance more accurately, we studied bright stars, developed methods to determine a reasonable correction for aerosol transmittance, and evaluated the inherent (above the atmosphere) stellar irradiance. Although there was a good correlation between these irradiance values and those computed from clear sky libraries, we found that there is an offset between the measured and inherent star irradiances. Using data from SOR taken in 1999 (ref. Technical Note 272), we found that the mean offset for this limited data set was near 1.37, and the values ranged from 1.06 to 1.7. We believe some of this offset may be due to a calibration effect, and the variance may be caused by characterizing the stellar spectral output using color temperature. We wrote a program to automatically extract the correction constant, called the Site Star Irradiance Correction (SSIC), for each star we use in the star library, and adapted the software to use this correction.

For a given star, there is also some variance, even on a clear night, in the apparent irradiance corrected for aerosol transmittance and for the star offset. Typical Standard Deviation (STD) values range from 5% to 15%. We evaluated the data as a function of the background brightness, and found that the uncertainty is independent of background radiance. This indicates that the offset has little to do with the ability of the code to correctly extract the background. We also found no correlation between the offset of a star and an adjacent star, indicating that the variations are not due to small variations in the clear-night transmittance. Our system is very well focused, with a Point Spread Function (PSF) of less than a pixel; this may mean that under-sampling of the Gaussian is the cause of the uncertainty. We have not yet isolated the cause of this variance. For the bright star ground-truthing, we selected those bright stars with an STD in the aerosol-corrected irradiance of 5% or less.
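The ground-truth quantity itself is straightforward once the SSIC offsets are known. A minimal sketch, with hypothetical numbers, of the beam transmittance computed for one star follows; as in Figures 17 and 18, no aerosol correction is applied, so cloud, aerosol, and molecular losses are combined.

    def star_beam_transmittance(measured_irradiance, inherent_irradiance, ssic):
        """Earth-to-space beam transmittance toward one star.

        measured_irradiance -- spectral irradiance extracted from the image
                               (integral under the star's Gaussian), W/m²-µm
        inherent_irradiance -- above-atmosphere irradiance from the star library
        ssic                -- Site Star Irradiance Correction: the per-star
                               offset (roughly 1.06 - 1.7 in the 1999 SOR data)
        The result still includes aerosol and molecular losses unless a
        separate aerosol correction is applied.
        """
        return measured_irradiance / (ssic * inherent_irradiance)

    # Example with made-up values: a star measured at 62% of its
    # offset-corrected clear-sky irradiance.
    t = star_beam_transmittance(3.1e-9, 4.0e-9, 1.25)  # -> 0.62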
To choose the ground-truth stars for this test, we started with the brightest stars, and selected stars that had multiple cloud-free appearances near the zenith, and that also had an STD of 5% or less in the clear night set. We next evaluated a number of images to see if the corrected bright star transmittances appeared to be reasonable. Examples are shown in Figures 17 and 18. These plots have been corrected for the offsets; however, they have not been adjusted for aerosol transmittance. That is, the transmittances include losses due to both clouds and aerosols (and molecular losses).

Figure 17 shows a clear night. The extracted transmittances averaged .89, which is reasonable for a clear night in this wave band at the SOR site. The average variation from this mean was .04, and the maximum variation was .07. A very cloudy case is shown in Figure 18. Here, the areas that appear to be thinner are identified with transmittances of .10 and .14. Areas with thicker cloud had 3 stars that were not detected, and one star that was detected with a beam transmittance of .01. Further evaluation of samples with thin clouds, and of time series, demonstrated to our sponsors and ourselves that this technique is reasonably accurate, and may be used for ground-truthing of the night cloud algorithm. Further examples are shown in Technical Note 272.

Under DO #11, we also developed concepts for a spatially high resolution algorithm. These are documented in Technical Note 272. We extracted average clear sky radiance distributions based on 145 images over 11 nights. The typical opaque cloudy radiance distribution was based on 101 images over 5 nights. These were compared with the radiance from nights with mixed clouds.

Fig. 17. Clear night, 14 Feb 99, 0340

Fig. 18. Cloudy night, 16 June 99 0510

Figure 19 shows a clear night with no moon. Figure 20 shows the associated middle row radiance for this night (black curve) compared with the nominal clear sky (blue curve) and opaque cloud (red curve).

Fig. 19. Clear sky sample, no moon. [SOR, February 14, 1999, 0750 UTC]

Fig. 20. Radiance from Fig. 19 compared with nominal clear sky and opaque sky radiances. [Cross-section through the middle row; radiance (W/m²-µm-sr) vs. column pixel]

Figures 21 and 22 show a similar example with no moon and broken cloud. Note that in this example, the regions with cloud are brighter than the clear sky, but darker than the typical cloud curve. From the points within the clouds that we know are cloudy (based on the transmittance), we would determine the cloud radiance values and lower the nominal cloud curve to generate a cloud curve for that image, to use in the pixel evaluation.

Fig. 21. Broken transparent cloud sample. [SOR, June 16, 1999, 0700 UTC]

Fig. 22. Radiance from Fig. 21 compared with nominal clear sky and opaque sky radiances. [Cross-section through the middle row; radiance (W/m²-µm-sr) vs. column pixel]
From these data and other examples, including examples acquired with moonlight, we concluded that the use of the radiance distributions should have a very good chance of working well under moonless conditions. The work on the high resolution algorithm was continued under DO #13, and is discussed in Section 9.2.

Finally, under DO #11, we had the opportunity to process a database from the SOR site, and evaluate the results. These data were processed with the old moderate resolution algorithm that had been installed at the site for some time. We extracted the current angular calibration, and updated the geometric inputs to the algorithm as well as the horizon mask. We processed the nighttime data from the same SOR test bed data from July and August 05 used for the day algorithm evaluation. The night data were taken at 3-minute intervals at that time, with the result that 3698 images were processed. The data were all processed with the same contrast-based algorithm that was in the field at the time.

Sample results are shown in Figures 23 - 25, which illustrate a clear case, a case with scattered clouds, and a case under moonlight. These figures are shown in the format presented to the user in the SORCloudAssess program. The raw open-hole (no spectral filter) image is shown on the left, and the cloud decision is on the right. In the cloud decision image, black indicates the "no data" category. Pixels in this category can be due to physical structures on the horizon; or, if the black region is in one of the cloud decision cells, it means that there were insufficient stars in the cell to make a determination. Green indicates a night thin cloud decision, and grey-to-white indicates an opaque cloud decision. Although the moderate resolution is not optimum, we were reasonably pleased with the results.

Fig. 23. Example of a relatively good night clear sky result

Fig. 24. Example of a night case with opaque and thin clouds

In order to assess these results more systematically, the SORCloudAssess program was updated to allow assessment of night imagery as well as day imagery. We assessed the Line of Sight, but this is somewhat unrealistic, because of the moderate resolution. We also assessed the Region of Sight (the region around the line of sight), as described in Technical Note 272. These results are shown in Figure 26, and a summary of the night SORCloudAssess results is shown in Table 5.

Fig. 25. Example of a moonlight case

Fig. 26. Fraction of correct answers, in percent, for each ROI for SOR Night Test Bed for Region of Sight Test

In Figure 26 and Table 5 we can see that the results over most of the sky are very good for the Region of Sight test, and reasonably good for the Line of Sight test (Table 5). This implies that if we were to continue to use the reduced resolution algorithm, we would have to look at the Region of Sight, not just the Line of Sight, to assess the site. We felt that the results shown in Fig. 26 and Table 5 were very encouraging, but that they would clearly benefit from algorithm improvements.
Table 5 Summary of Estimated Accuracy for the Night Cloud Algorithm Using SORCloudAssess for the SOR Test Bed Data Set

Region            LOS Results   ROS Results
Overall           87.7%         95.5%
Zenith            93.1%         97.9%
Eastern Horizon   87.8%         99.5%
Western Horizon   70.2%         81.4%
70% Correct?      95.2%         -

Following delivery of the SOR night algorithm processed results under the earlier DO #11, we immediately went ahead with work toward integrating the beam transmittance calculations into the moderate resolution night algorithm. Under this contract, we got the algorithm programmed in IDL, and ran a few test cases. As part of this work, we extracted the star inherent irradiance offset for 7600 stars using the techniques discussed earlier. In addition, we did preliminary assessments of the very bright night skies at the Virginia site, and concluded that although the algorithm would have to be modified, the data were appropriate for development of an algorithm, even at this very bright site. These results are discussed in Technical Note 272.

9.2. Development of the First Version of the Full Resolution Night Algorithm

The concepts for a spatially high-resolution algorithm were first developed in 2001 (although we had informally been thinking about these concepts for many years prior to that date), and are documented in Memo AV01-069t. Basically, the concepts as developed at that time are as follows:

a) Extract typical radiance distributions for clear skies and for cloudy skies.

b) Use the star detection and associated beam transmittance determination to assign a value of opaque cloud, thin cloud, or no cloud to selected star locations within the image.

c) Near these star locations, determine how the radiance distribution differs from the typical radiance distribution at the same location. Locations that are determined to be clear sky are used to adjust the clear sky distribution, and locations that are determined to be cloud are used to adjust the cloudy sky distribution.

d) Compare the radiance in each pixel with the adjusted nominal clear sky and cloudy sky distributions to determine the presence of cloud.

Note that Step c makes this an adaptive algorithm, in which the algorithm adapts on an image-by-image basis to the ambient conditions. In order to pursue this approach, our first step was to improve the techniques for extracting earth-to-space beam transmittance at the star locations. This work was mostly completed under DO #11, but documented under the current contract, in Memos AV06-009t and AV07-037t.

9.2.1. Transmittance Results

Shortly after the start of DO #13, WSI Unit 7 was fielded at the first of the new Air Force sites, which is designated Site 2. This site is near a city, and consequently the radiance distributions are impacted by city lights. For development of the night algorithm, we immediately jumped to Site 2 data, because the sponsors needed these data more critically than data from SOR, and because we were eager to address this more stressing site. In general, when there is more scattered light in the atmosphere, it becomes more difficult to detect the stars. On extracting the transmittance, we found that the results were quite good. Figure 27 shows the correlation between the inherent spectral stellar irradiances determined from the measurements and the values determined from the star libraries. For stars down to magnitude 5.5, the correlation is .972, and for stars down to magnitude 4, the correlation is .981.
These correlations are the result of 50,000 extracted cases from clear nights for the magnitude 5.5 result, and 10,000 extracted cases for the magnitude 4 result. Given these results, we decided to proceed with the concepts we had evolved based on darker sites.

Fig. 27. Relationship between spectral stellar irradiance from measured data and spectral stellar irradiance computed from stellar libraries. Site 2, star magnitudes 4.0 or less, with star offset corrections. 10,943 stars used for the correlations; data sub-sampled for the plot. [Log-log plot of measured vs. theoretical spectral irradiance (W/m²-µm), WSI passband, zenith angles < 60°]

9.2.2. The Clear Sky Radiance Distribution or "Shell"

The full resolution algorithm is discussed in Memo AV06-029t, which is available to the sponsors upon request. In order to use NightCloudAlg2 at a particular WSI site, nominal cloud-free background imagery must be created for that location. As discussed in Memo AV05-012t, a large number of cloud-free images are compiled, the background image is created by determining the median value at each pixel, and the resulting clear shell is then smoothed.

The radiance levels at night tend to vary with time (perhaps due to fewer lights being turned on later in the night). The variability of the cloud-free background with time of night is handled in two ways. First, radiance shells are created as a function of hour angle. Second, as will be discussed below, a scale factor, the Image Background Adjustment Factor (IBAF), is used to further adjust the cloud-free background image for each processed image. Figure 28 shows cross-sections of the hour angle background imagery for SOR on the left, and Site 2 on the right. In each plot we chose a cross-section that passes through both the brightest and darkest parts of the sky.

Fig. 28. Typical no-moon clear sky radiance levels as a function of hour angle (solid color curves), and typical no-moon cloud radiance levels (dashed line). The plot on the left is for the SOR site, and the plot on the right is for Site 2. The left plot y-axis scale is 0 to 1.5 x 10^-4, and the right plot y-axis scale is 0 to 9 x 10^-4 W/m²-µm-sr.

For a given image, the appropriate cloud-free background image for the time of image acquisition serves as the base cloud-free shell.
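A minimal sketch of this shell construction follows, assuming the cloud-free images have already been identified and calibrated; the bin edges and smoothing kernel size are illustrative, not those of the fielded code.

    import numpy as np
    from scipy.ndimage import median_filter

    def build_clear_shells(images, hour_angles, bin_edges, smooth_size=5):
        """Build nominal cloud-free radiance shells, one per hour-angle bin.

        images      -- array (n_images, rows, cols) of calibrated,
                       cloud-free night radiance images
        hour_angles -- hour angle of each image (n_images,)
        bin_edges   -- edges of the hour-angle bins, e.g. [6, 7, 8, ...]
        Each shell is the per-pixel median of the clear images falling in
        the bin, followed by a smoothing filter.
        """
        shells = {}
        for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
            in_bin = (hour_angles >= lo) & (hour_angles < hi)
            if not np.any(in_bin):
                continue                      # no clear imagery in this bin
            median_image = np.median(images[in_bin], axis=0)
            shells[(lo, hi)] = median_filter(median_image, size=smooth_size)
        return shells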
After the individual star cloud decisions have been performed and the medium resolution cloud image has been created, the software searches for a subset of at least 10 bright stars to use in determining the cloud-free shell scale factor. The search is iterative, beginning with strict selection criteria, and relaxing the criteria until a sufficient number of stars are compiled or until the criteria are exhausted without finding sufficient stars. In order to pass the initial criteria, in addition to being classified as cloud-free, a star must possess the following attributes: magnitude < 4; zenith angle < 60°; the cell the star is found in must also have been classified as cloud-free; and the cloud-free background percent difference for the pixels surrounding the star must fall within acceptable limits (e.g. 25% difference). If an insufficient number of stars are found in the first pass, a second and third pass are performed, relaxing the magnitude limit to 4.5 and 5, respectively. If fewer than 10 usable stars are found matching the criteria, no scale factor is applied. Other criteria may be investigated in the future to handle cases where no usable stars are found, such as keeping a recent history of scale factors or using additional parameters to isolate stars that are cloud-free.

If a subset of stars identified as cloud-free is found, the fractional difference, d, between the background radiance at each star location and the radiance from the nominal cloud-free background image is used to derive the IBAF scale factor that adjusts the nominal cloud-free background to the cloud-free background for that particular image. (This is the adaptive algorithm feature.) At present, this factor is not pixel-dependent, i.e. the same correction is applied to the nominal background radiance over the whole image.

Before comparing the current calibrated image to the cloud-free radiance distribution on a pixel-by-pixel basis, stars must also be removed from the calibrated image. Because the radiance in the area surrounding a star is significantly elevated above the background sky, often above the cloud-free shell radiances, those locations would appear as cloud in the cloud decision image. To account for this, a star-free image is created prior to comparing the image with the cloud-free and opaque cloud shells. The star-free image is created by replacing the pixel values associated with each star with the nearby background radiance. Once this is done, the image is subjected to a median filter for smoothing. Some objects, such as planets, are not currently corrected for.

9.2.3. The Opaque Cloud Sky Radiance Distribution or "Shell"

The opaque cloud shell is created in a similar manner to the cloud-free shell. The radiances associated with opaque cloud are far more variable than those for cloud-free sky. Cloud radiances are dependent on features such as cloud thickness, cloud height, cloud bottom shape, the location and amount of city lights, and the total cloud cover. Since we are interested in distinguishing opaque cloud from thin cloud and clear sky, the goal is to create an opaque cloud shell that represents something close to the lower radiance limit of opaque cloud. Like the clear sky background, the cloud background is adjusted with a scale factor, the Image Cloud Adjustment Factor (ICAF). In order for stars to be used in the ICAF determination, they must have magnitude < 4, must have zenith angle < 60°, and must have been identified as opaque cloud during the individual star cloud decisions. If fewer than 10 stars are identified this way, the magnitude restriction is relaxed to 4.5, and then to 5, until enough stars are found. If fewer than 10 usable stars are found matching the criteria, no scale factor is applied. The iterative star search, common in outline to both the IBAF and ICAF cases, is sketched below.
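The following is an illustrative sketch of that search; the record layout and field names are hypothetical, not the structures used in the actual code.

    def select_scale_stars(stars, want_class, min_count=10):
        """Iteratively select stars for a shell scale factor (IBAF or ICAF).

        stars      -- list of dicts with keys 'magnitude', 'zenith_deg',
                      'star_class', 'cell_class', 'pct_diff' (hypothetical
                      record layout, for illustration only)
        want_class -- 'clear' for the cloud-free shell, 'opaque' for the
                      opaque cloud shell
        Passes relax the magnitude limit from 4 to 4.5 to 5; if fewer than
        min_count stars ever qualify, None is returned and no scale factor
        is applied.
        """
        for mag_limit in (4.0, 4.5, 5.0):
            chosen = [s for s in stars
                      if s['magnitude'] < mag_limit
                      and s['zenith_deg'] < 60.0
                      and s['star_class'] == want_class
                      # the cell and percent-difference tests apply to the
                      # cloud-free (IBAF) search
                      and (want_class != 'clear'
                           or (s['cell_class'] == 'clear'
                               and abs(s['pct_diff']) <= 25.0))]
            if len(chosen) >= min_count:
                return chosen
        return None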
As with the cloud-free shell case, the fractional difference, d, between the background radiance at each star location and the radiance from the nominal opaque cloud background image is calculated, and the correction factor is based on this fractional difference.

9.2.4. Cloud Decision at the Pixel

The correction factor for the cloud-free shell is intentionally set so that the shell will be slightly brighter than the actual anticipated clear sky. Similarly, the correction factor for the cloud shell is intentionally set so that the cloud shell will be slightly darker than the anticipated cloudy sky. As a result, the logic at each pixel is quite simple. If the radiance at a given pixel is lower than the cloud-free shell, the pixel is identified as cloud-free. If the radiance at a given pixel is higher than the cloud shell, it is identified as opaque cloud. Otherwise it is identified as thin cloud. We may find that more sophistication is beneficial in the future, but this method is yielding reasonably good results.
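In code form, the decision reduces to two comparisons per pixel. The sketch below is illustrative only, not the fielded C implementation.

    import numpy as np

    def night_cloud_decision(star_free_image, clear_shell, cloud_shell,
                             ibaf=1.0, icaf=1.0):
        """Per-pixel decision of Section 9.2.4 (illustrative sketch).

        star_free_image -- calibrated radiance image with stars removed
        clear_shell     -- nominal cloud-free shell for this hour angle;
                           the IBAF biases it slightly bright
        cloud_shell     -- nominal opaque cloud shell; the ICAF biases it
                           slightly dark
        Returns 1 = cloud-free, 2 = thin cloud, 3 = opaque cloud.
        """
        clear = clear_shell * ibaf
        opaque = cloud_shell * icaf
        decision = np.full(star_free_image.shape, 2, dtype=np.uint8)  # thin
        decision[star_free_image < clear] = 1    # darker than clear shell
        decision[star_free_image > opaque] = 3   # brighter than cloud shell
        return decision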
A nice sample is shown in Figures 29 and 30. The raw image and the cloud decision results are shown in Figure 29. The cloud-free shell, the cloud shell, and the actual image radiance for the central column are shown in Figure 30. In the plot, the blue curve shows the clear sky radiance distribution, adjusted for the current image. Note that the values are higher at the bottom of the image, due to the proximity of anthropogenic light. The red curve shows the cloud shell; no adjustment was made to the cloud shell for this image. Like the clear sky shell, the cloudy sky shell is brighter toward the city lights. The actual sky radiances are shown with the black curve. They are slightly below the blue curve where there are no clouds, and above the blue curve where there are clouds. We feel that these results are excellent. A few planets are identified as cloud, but the results in this example otherwise appear to be reasonable.

Fig. 29. Raw image and cloud decision image with full resolution algorithm, Site 2, 26 May 2006 at 0926

Fig. 30. Curves showing data through the middle column for the typical opaque cloud for this site and time (red), the typical clear sky for this site and time (blue), and the actual image shown in the previous figure (black). [Radiance (W/m²-µm-sr) vs. row pixel; multipliers CF = 1.07, Opaq = 1.00]

9.2.5. No-moon Algorithm Results for Site 2

We processed data from May and June 2006 from Site 2, as documented in Memo AV06-033t. The full resolution algorithm results were presented in the Power Point talks in September and October 2006. We found that the results were quite good for no-moon conditions, but not as good for moonlight conditions. Several samples of no-moon conditions are shown in Figures 31 - 34.

Fig. 31. Typical results for a clear (cloud-free) night, 20 May 2006 at 0200

Fig. 32. Typical results for scattered clouds, 26 May 2006 at 0918

Fig. 33. Typical results for full overcast with both opaque and thin cloud, 24 May 2006 at 0900

Figures 31 - 33 show good results for clear sky, thin cloud, and opaque cloud with rain. In these examples, the results appear to be correct over the whole sky, with the possible exception of very narrow slivers near the horizon. Figure 34 shows relatively poor results for a clear sky. In Fig. 34 the result is good except near the East horizon. The solar/lunar occultor was to the West, and one can see from the results that the cloud-free radiance shell has not been optimized for the region on the Eastern horizon that is normally hidden by the occultor. This problem should be relatively easy to correct, as it only requires providing valid clear sky background radiances for the region normally covered by the occultor.

Fig. 34. Relatively poor results for a clear sky, 30 May 2006 at 0300

In order to more fully assess these data, we also evaluated the results with the program SORCloudAssess. The results for the no-moon conditions are shown in Figure 35. In this figure, we have shown the results for the actual Line of Sight, as opposed to the Region of Sight shown earlier. As before, this is a program that allows us to make a visual assessment of the imagery, and of whether we believe the results are correct. It is not a blind test, and we hope to use a blind test at some point in the future.

Fig. 35. Fraction of correct answers, in percent, for each Line of Sight for Site 2 Test Data using Program SORCloudAssess

Table 6 Summary of Estimated Accuracy for the Night Cloud Algorithm Using SORCloudAssess for the Site 2 and SOR Data Sets

                  Full Resolution, Site 2   Moderate Resolution, SOR
Region            LOS Results               LOS Results   ROS Results
Overall           94.3%                     87.7%         95.5%
Zenith            99.2%                     93.1%         97.9%
Eastern Horizon   58.3%                     87.8%         99.5%
Western Horizon   88.1%                     70.2%         81.4%
70% Correct?      97.5%                     95.2%         -

We were very pleased with these results. In Table 6, we have used the no-moon cases for Site 2, since the moonlight algorithm is still in development. The LOS results were much better than the LOS results from the earlier moderate resolution algorithm. There is no longer a need to look at the Region of Sight, since the algorithm is now high resolution, but it is also worth noting that the results are in general better than the previous ROS results, except at the eastern horizon. The overall result was an estimated accuracy of 94.3% when the moonlight cases are removed, and 95.7% when the horizon values are also removed. We have concepts in place to improve both the horizons and the moonlight. We had wanted to work on the full resolution algorithm for quite some time, and it was very pleasing to have results, and to have them be reasonably good. In addition, this site is a bright site, and we were concerned that this might cause problems; the algorithm handled the bright site quite well. As will be reported in the next report, we find that the full resolution algorithm is also working quite well at a relatively dark site.

9.2.6. Moonlight Results for Site 2

The moonlight results were not very good for this first version of the full resolution algorithm. Whereas the overall accuracy for no-moon data was estimated to be about 94.3% (with horizons included), this number drops to 77% if moonlight is included. A reasonably good moonlight result is shown in Fig. 36, and a poor result is shown in Fig. 37. The source of the problem illustrated in Figure 37 is that there were very few cloud-free moonlight cases to use in extracting the cloud-free background, and the situation is further complicated by the changing phase of the moon, the earth-to-moon distance, and the anthropogenic light.

Fig. 36. Relatively good results for moonlight, 8 May 2006 at 0500

Fig. 37. Relatively poor results for moonlight, 10 May 2006 at 0400
As a result of this issue, we began development of a different way to characterize the moonlight clear sky radiance shell. This method, documented in Memo AV07-012t, characterizes the sky radiance distribution under moonlight as the sum of the no-moon light and the scattered light from the lunar source. The lunar source light is in turn characterized using the solar radiance distribution, times factors that account for the ratio of moonlight to sunlight, and factors that characterize the relative moon radiance as a function of earth-to-moon distance and moon phase. Under this delivery order, we documented the concept and began writing the software to extract the moonlight radiance distribution. The programs for extracting the solar radiance were written, and the data for Site 2 were extracted. Programming to use these data was in development at the end of the contract period. At the time of this writing, it has been tested with limited Site 5 data processed at MPL, and is working quite well. That work will be documented in the next report.
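In outline, the model combines the terms as sketched below; the factor names here are placeholders for the parameters defined in Memo AV07-012t, not the memo's actual notation.

    def moonlit_clear_radiance(no_moon_radiance, solar_radiance,
                               moon_sun_ratio, phase_factor, distance_factor):
        """Sketch of the moonlight clear-sky model concept of Memo AV07-012t.

        The moonlit clear sky is modeled as the no-moon radiance plus the
        scattered lunar light, with the lunar term built from the solar
        radiance distribution for the same source geometry.

        no_moon_radiance -- no-moon clear-sky shell (per pixel)
        solar_radiance   -- solar radiance distribution for the moon's
                            current sky position (per pixel)
        moon_sun_ratio   -- overall ratio of lunar to solar irradiance
        phase_factor     -- relative moon radiance vs. moon phase
        distance_factor  -- relative moon radiance vs. earth-to-moon distance
        """
        lunar_term = solar_radiance * moon_sun_ratio * phase_factor * distance_factor
        return no_moon_radiance + lunar_term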
9.2.7. Conversion of the Test Code into Field Code

The first version of the full resolution algorithm was converted to C, as documented in Memo AV06-030t. It was then integrated into the real-time field code, where it was tested, and further debugged and updated. The field code was installed in the field at Site 2 on 5 March 2007, as documented in Memo AV07-028t. We should also note that the geometric calibrations for this site are documented in Memos AV06-014t and AV07-024t, and the radiometric calibrations for Site 2 are documented in Memos AV07-040t and AV07-041t.

9.3. Stellar Irradiance Time Series Studies

In October 2006, we were asked to do a study of time series of the stellar irradiances extracted from the night images. These results were presented in the Power Point talk in January 2007. I believe the purpose of this effort was to determine whether it might be possible for others to evaluate these data, much as they evaluate pyrheliometer data, in order to look at time series of cloud behavior. Accordingly, we delayed further development of the night algorithm to do this work.

We first evaluated Polaris, as its nearly fixed position in the sky makes the evaluation somewhat easier to interpret. Figure 38 shows the results for the night of 11 Jan 2002, at the SOR site.

Fig. 38. Time series of stellar irradiance from the images during a night with variable cloud, Polaris, SOR Site, 11 January 2002. [Measured spectral irradiance (W/m²-µm) vs. hour (UTC), with clear, thin cloud, and opaque cloud periods marked]

In Figure 38, we have marked times when the images were visually determined to have clear sky, thin cloud, and opaque cloud in the direction of Polaris and over most of the sky. As can be seen in the time series plot, there is some variance within the clear sky region, but there is a clear separation between the clear sky and the thin and opaque cloud. That is, the difference between the thin cloud signal and the clear sky signal is much larger than the variance in the clear sky signal.

The variance during the times when the sky appears to be clear is quite interesting. Figure 39 shows a sample during a time that was believed to be clear. We see small changes of about ±8% in the signal that appear to be almost periodic (see the two horizontal lines in the figure). These variations are certainly too regular to be due to shot noise. We have not had the time to determine the causes of these variances. (We feel that it is most important first to get a good night algorithm working for moonlight.) The variations do not appear to be a function of something in the imagery, such as a pixel boundary, nor do they appear to be related to something in the hardware, like chip temperature. At some point, it may be worth further exploring this issue, as we should be able to distinguish thinner cloud thresholds if we can remove this variance.

Fig. 39. Time series of stellar irradiance from the images during a clear night, Polaris, SOR Site, 12 January 2002

We extracted the results for Polaris for several days, as shown in Figure 40. The red curve represents a night that was overcast with opaque clouds (Jan 10, 2002). The turquoise and green curves show variable cloud conditions (Jan 13 and 11, respectively), and the blue and yellow curves show clear skies (Jan 12 and 14, respectively).

Fig. 40. Time series of stellar irradiance from several nights, Polaris, SOR Site. Red is an overcast night, green and turquoise show nights with variable clouds, and the yellow and deep blue show two clear nights

In Figure 40, the separation between clear sky, thin cloud, and opaque cloud is obvious. It is also perhaps worth noting that the shape of the curve under clear skies is not completely flat, as we would have anticipated. Note in particular the rise just before 0600. We do not know the cause of this feature. It is unimportant at the present time, but if we can diagnose it, it may help us to provide thinner cloud thresholds in the future.

Figure 41 shows similar plots for the stars Alnath and Dubhe. In these plots, we see the additional impact of the aerosol transmittance as the star position changes throughout the night. Dubhe, for example, was rising from the eastern horizon during this period, and the lower transmittance due to the larger zenith angle is apparent on the left side of the plot. The sponsors agreed that, in terms of the temporal behavior of the stellar irradiance, the thin clouds and opaque clouds appear to be well separated from the clear sky results. This should enable temporal studies of cloud blockages in many directions in the night sky, similar to studies being done with pyrheliometer data; a minimal sketch of such a classification follows the figure caption below.

Fig. 41. Time series of stellar irradiance from Alnath and Dubhe for the same nights shown in the previous figure
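The fractional thresholds in the sketch below are placeholders chosen only to illustrate the kind of separation visible in Figs. 38 - 40; they are not calibrated values.

    import numpy as np

    def classify_time_series(irradiance, clear_level, thin_frac=0.8,
                             opaque_frac=0.2):
        """Label each sample of a stellar irradiance time series.

        irradiance  -- measured spectral irradiance samples for one star,
                       already corrected for the star offset
        clear_level -- the clear-sky irradiance level for that star
        thin_frac, opaque_frac -- illustrative fractions of the clear level
                       separating clear/thin and thin/opaque
        Returns an array of labels: 'clear', 'thin', or 'opaque'.
        """
        frac = np.asarray(irradiance) / clear_level
        labels = np.where(frac >= thin_frac, 'clear',
                          np.where(frac >= opaque_frac, 'thin', 'opaque'))
        return labels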
9.4. Summary of Night Algorithm Results

The development of the first version of the night algorithm using transmittance and providing full resolution (or nearly full, given the smoothing) was a major step forward. We were pleased that it handled a bright site very well, and as will be shown in the next report, it is also working well for dark sites. A modification of the algorithm to handle moonlight is in progress, and appears very promising. The full resolution version is operating in the field at Site 2.

10. Summary

Under this contract, we finished the refurbishment of two systems, and fielded them both. In general they have worked very well, and most of the problems have been due to outside issues such as site power outages, excessive shed temperatures, and lightning. Once the initial issues were addressed, we had 2% or less data loss due to WSI hardware and software issues. The new day algorithm, which uses NIR data and a clear-sky reference that varies with solar zenith angle, is in place in the field at Site 2 and working well. A major upgrade was made to the night algorithm, which now uses beam transmittance and radiance distributions to provide a nearly full spatial resolution result.

11. Acknowledgements

We would like to express our appreciation to the personnel of Starfire Optical Range and their contractors from Boeing. Dr. Earl Spillar, the head of this program, and Ann Slavin, our acting contract monitor, were very helpful in providing guidance, and have been a pleasure to work with. We also have found our contacts with Lt. Benjamin Karlow, Lt. Kevin Pastorello, and the other Air Force personnel to be professional and productive. The newer members of the Boeing team, Marjorie Shoemake and Darielle Dexheimer, supported us with careful work and enthusiasm, and were very helpful in diagnosing hardware issues and working with us on other aspects of the deployments. The SOR team is always a pleasure to work with. We feel this WSI work is valuable, and very much appreciate having had the chance to advance the state of the art, as well as to meet our sponsor's specific needs. We also would like to express our appreciation to ONR, which provided the funding vehicle for this work.

12. References

12.1. In-house Technical Memoranda, in order by memo number; available to sponsor on request

Shields, J., "Research Approach for High Resolution Night Cloud Algorithms", Atmospheric Optics Group Technical Memorandum AV01-069t, 3 July 2001
Shields, J., "Return of D/N WSI Units 7 and 8 from NSA Sites", Atmospheric Optics Group Technical Memorandum AV05-004t, 15 February 2005
Burden, A., "Cloud-free Background Imagery for Version 3 Night Cloud Algorithm", Atmospheric Optics Group Technical Memorandum AV05-012t, 24 February 2005
Shields, J., "Unit 4 Parts changes during 2002 - 2005", Atmospheric Optics Group Technical Memorandum AV05-020t, 1 July 2005
Karr, M., "ProcWSID QC Files", Atmospheric Optics Group Technical Memorandum AV05-023t, 2 August 2005
Baker, J., "Return of D/N WSI Unit 6 from TWP Site", Atmospheric Optics Group Technical Memorandum AV05-025t, 24 August 2005
Baker, J., "Return of D/N WSI Unit 9 from TWP Site", Atmospheric Optics Group Technical Memorandum AV05-028t, 12 September 2005
Shields, J., "Cloud Free Line of Sight Forecast Test-bed Program", Atmospheric Optics Group Technical Memorandum AV05-036t, 1 November 2005
Karr, M., "RunWSI for DOS based SOR units, Version 1.0", Atmospheric Optics Group Technical Memorandum AV06-005t, 5 June 2006
Karr, M., "ProcWSID for DOS based SOR units, Version 1.0", Atmospheric Optics Group Technical Memorandum AV06-006t, 5 June 2006
Baker, J., "Unit 7 General Instrument Overview and FAQ", Atmospheric Optics Group Technical Memorandum AV06-007t rev 1, 28 June 2006
Shields, J. E., J. R. Baker, and M. E. Karr, "Unit 7 SOR Site 2 Trip Report, 1 - 5 May 06", Atmospheric Optics Group Technical Memorandum AV06-008t, 1 June 2006
Burden, A., "Night Algorithm Progress", Atmospheric Optics Group Technical Memorandum AV06-009t, 1 August 2006
Baker, J., "Unit 7 Reboot Procedures and Emergency Shutdown Procedures", Atmospheric Optics Group Technical Memorandum AV06-010t, 27 June 2006
Baker, J., "Set-Up Instructions for WSI Unit 4 - Preliminary Version", Atmospheric Optics Group Technical Memorandum AV06-011t, 1 August 2006
Karr, M., "Software update: RunWSI for Unit 7 Version 1.1", Atmospheric Optics Group Technical Memorandum AV06-012t, 18 August 2006
Karr, M., "Software update: ProcWSI for Unit 7 Version 1.1", Atmospheric Optics Group Technical Memorandum AV06-013t, 18 August 2006
Burden, A., "Unit 7 Star-Based Geometric Calibration for Data Collected May 2006 and Later", Atmospheric Optics Group Technical Memorandum AV06-014t, 20 August 2006
Karr, M., "Software update: ProcWSI for Unit 7 Version 1.2", Atmospheric Optics Group Technical Memorandum AV06-015t, 19 September 2006
Karr, M., "Software update: RunWSI for Units 7 and 4 Version 1.2", Atmospheric Optics Group Technical Memorandum AV06-016t, 20 September 2006
Baker, J. and J. Shields, "WSI Modernization Ideas", Atmospheric Optics Group Technical Memorandum AV06-017t, 20 September 2006
Shields, J., "Processing of the VA Apr 05 Data Set", Atmospheric Optics Group Technical Memorandum AV06-018t, 18 September 2006
Karr, M., "Program Updates for Site 2 Stand-alone Processing", Atmospheric Optics Group Technical Memorandum AV06-019t, 20 September 2006
Shields, J., "Processing of the Site 2 May - June 06 Data Set", Atmospheric Optics Group Technical Memorandum AV06-020t, 19 September 2006
Karr, M., "Site 2 Trip report, Aug 2 - 4, 2006", Atmospheric Optics Group Technical Memorandum AV06-021t, 25 August 2006
Shields, J., "Return of D/N WSI Unit 3 from Manus Site", Atmospheric Optics Group Technical Memorandum AV06-022t, 20 September 2006
Karr, M., "Program WSIFluxChk", Atmospheric Optics Group Technical Memorandum AV06-023t, 3 October 2006
Shields, J., "Unit 7 Site 2 Flux Check", Atmospheric Optics Group Technical Memorandum AV06-024t, 3 October 2006
Shields, J., "Unit 4 Flux Check at MPL", Atmospheric Optics Group Technical Memorandum AV06-025t, 3 October 2006
Karr, M., "Past software revisions for Unit 12", Atmospheric Optics Group Technical Memorandum AV06-027t, 6 October 2006
Karr, M., "Program hangs at full moon", Atmospheric Optics Group Technical Memorandum AV06-028t, 29 October 2007
Burden, A., "WSI Full Resolution Nighttime Cloud Algorithm Progress", Atmospheric Optics Group Technical Memorandum AV06-029t, 10 October 2006
Burden, A., "NightCloudAlg2 - Conversion to C", Atmospheric Optics Group Technical Memorandum AV06-030t, 10 October 2006
Burden, A., "AF2 Night Cloud Image Processing - May/June 2006", Atmospheric Optics Group Technical Memorandum AV06-033t, 6 October 2006
Shields, J., "WSI Run Time Hours", Atmospheric Optics Group Technical Memorandum AV06-034t, 31 October 2006
Shields, J., "WSI Field Check List", Atmospheric Optics Group Technical Memorandum AV07-001t, 8 January 2007
Shields, J., "Unit 4 System Overview and Parts List, Site 5", Atmospheric Optics Group Technical Memorandum AV07-002t, 11 January 2007
Shields, J. and M. Karr, "Set-up Instructions for WSI Unit 4, Site 5", Atmospheric Optics Group Technical Memorandum AV07-003t, 11 January 2007
Karr, M., "WSI Quick Checklist, Emergency Shutdown/Startup Procedures and Reboot Instructions - Unit 4, Site 5", Atmospheric Optics Group Technical Memorandum AV07-004t, 10 January 2007
Karr, M., "Control Computer operations overview for WSI Unit 4, Site 5", Atmospheric Optics Group Technical Memorandum AV07-005t, 10 January 2007
Karr, M., "Processing Computer operations overview for WSI Unit 4, Site 5", Atmospheric Optics Group Technical Memorandum AV07-006t, 10 January 2007
Karr, M., "Software update: ProcWSID Version 2.0", Atmospheric Optics Group Technical Memorandum AV07-007t, 10 January 2007
Karr, M., "WSI Utility Program Updates", Atmospheric Optics Group Technical Memorandum AV07-008t, 12 January 2007
Karr, M., "WSI Image Headers", Atmospheric Optics Group Technical Memorandum AV07-009t, 27 April 2007
Karr, M., "Unit 4 Site 5 Trip Report, 31 Jan - 1 Feb 07", Atmospheric Optics Group Technical Memorandum AV07-010t, 7 February 2007
Shields, J., "WSI Field Check List [Update]", Atmospheric Optics Group Technical Memorandum AV07-011t, 8 February 2007
Shields, J., "Moonlight Radiance Distribution Concepts", Atmospheric Optics Group Technical Memorandum AV07-012t, 21 February 2007
Karr, M., "Unit 7 Maxtor drive replacement", Atmospheric Optics Group Technical Memorandum AV07-017t, 17 April 2007
Burden, A., "Unit 7 Star-Based Geometric Calibration for Data Collected after February 26, 2007", Atmospheric Optics Group Technical Memorandum AV07-024t, 27 April 2007
Shields, J., "Wavelength Options in Cloud Imaging", Atmospheric Optics Group Technical Memorandum AV07-026t, 3 May 2007
Karr, M., "Software Update: ProcWSID Version 2.2", Atmospheric Optics Group Technical Memorandum AV07-028t, 16 May 2007
Burden, A., "Earth-to-Space Beam Transmittance from Nighttime WSI Imagery: Additional Results", Atmospheric Optics Group Technical Memorandum AV07-037t, 24 July 2007
Karr, M., "Unit 7 Site 2 Trip report, 27 - 28 Feb 2007", Atmospheric Optics Group Technical Memorandum AV07-038t, 16 July 2007
Karr, M., "Unit 4 Site 5 Trip report, 27 - 28 June 2007", Atmospheric Optics Group Technical Memorandum AV07-039t, 16 July 2007
Burden, A., "Unit 7v3 Effective Lamp Radiances for Field Calibration", Atmospheric Optics Group Technical Memorandum AV07-040t, 2 July 2007
Burden, A., "Unit 7v3 Field Calibration Results for Site 2", Atmospheric Optics Group Technical Memorandum AV07-041t, 2 July 2007
Karr, M., "Red and Yellow Flags in Day/Night WSI Acquisition Programs", Atmospheric Optics Group Technical Memorandum AV07-054t, 7 November 2007 [written after contract period]

12.2. Power Point Files from Presentations to SOR

September 06: First version high resolution night algorithm and Site 2 results; upgrades to day algorithm (variable reference value) and Site 2 results
October 06: Hardware and data rate status; reported day algorithm installed at Site 5 in September, night algorithm nearing installation readiness; Site 5 instrument nearly ready, but site is not ready; Site 3 instrument prep had to stop due to funding
January 07: Site 5 instrument shipped to site; Site 2 ACP failed, ACP repaired but trip awaiting funding; night algorithm has been installed on Site 5 instrument; presented results of time series of star irradiances

12.3. Selected Published References and Technical Notes, in order by date

Johnson, R. W., W. S. Hering, and J. E. Shields (1989), "Automated Visibility and Cloud Cover Measurements with a Solid-State Imaging System", Marine Physical Laboratory, Scripps Institution of Oceanography, University of California San Diego, SIO 89-7, GL-TR-89-0061, NTIS No. ADA216906
Johnson, R. W., J. E. Shields, and T. L. Koehler (1991), "Analysis and Interpretation of Simultaneous Multi-Station Whole Sky Imagery", Marine Physical Laboratory, Scripps Institution of Oceanography, University of California San Diego, SIO 91-3, PL-TR-91-2214
Shields, J. E., R. W. Johnson, and T. L. Koehler (1993), "Automated Whole Sky Imaging Systems for Cloud Field Assessment", Fourth Symposium on Global Change Studies, 17 - 22 January 1993, American Meteorological Society, Boston, MA
Shields, J. E., R. W. Johnson, and M. E. Karr (1994), "Upgrading the Day/Night Whole Sky Imager from Manual/Interactive to Full Automatic Control", Marine Physical Laboratory, Scripps Institution of Oceanography, University of California San Diego, Report MPL-U-140/94
Shields, J. E., R. W. Johnson, M. E. Karr, R. A. Weymouth, and D. S. Sauer (1997a), "Delivery and Development of a Day/Night Whole Sky Imager with Enhanced Angular Alignment for Full 24 Hour Cloud Distribution Assessment", Marine Physical Laboratory, Scripps Institution of Oceanography, University of California San Diego, Report MPL-U-8/97
Shields, J. E., M. E. Karr, and R. W. Johnson (1997b), "Service Support for the Phillips Laboratory Whole Sky Imager", Marine Physical Laboratory, Scripps Institution of Oceanography, University of California San Diego, Report MPL-U-10/97
Shields, J. E., R. W. Johnson, M. E. Karr, and J. L. Wertz (1998), "Automated Day/Night Whole Sky Imagers for Field Assessment of Cloud Cover Distributions and Radiance Distributions", Tenth Symposium on Meteorological Observations and Instrumentation, 11 - 16 January 1998, American Meteorological Society, Boston, MA
Feister, U., J. Shields, M. Karr, R. Johnson, K. Dehne, and M. Woldt (2000), "Ground-Based Cloud Images and Sky Radiances in the Visible and Near Infrared from Whole Sky Imager Measurements", Proceedings of the Climate Monitoring - Satellite Application Facility Training Workshop sponsored by DWD, EUMETSAT and WMO, Dresden, 2000
Shields, J. E., M. E. Karr, A. R. Burden, R. W. Johnson, and J. G. Baker (2002), "Analytic Support for the Phillips Laboratory Whole Sky Imager, 1997 - 2001", Marine Physical Laboratory, Scripps Institution of Oceanography, University of California San Diego
Shields, J. E., R. W. Johnson, M. E. Karr, A. R. Burden, and J. G. Baker (2003a), "WSI Field Calibration System Operations Manual", Marine Physical Laboratory, Scripps Institution of Oceanography, University of California San Diego, Technical Note 252, February 2003
Shields, J. E., M. E. Karr, A. R. Burden, R. W. Johnson, and J. G. Baker (2003b), "Analysis and Measurement of Cloud Free Line of Sight and Related Cloud Statistical Behavior - Published as Final Report for ONR Contract N00014-97-D-0350 DO #2", Marine Physical Laboratory, Scripps Institution of Oceanography, University of California San Diego, Technical Note 262, June 2003
Shields, J. E., R. W. Johnson, M. E. Karr, A. R. Burden, and J. G. Baker (2003c), "Calibrated Fisheye Imaging Systems for Determination of Cloud Top Radiances from a UAV", International Symposium on Optical Science and Technology, SPIE the International Society for Optical Engineering, 2003
Shields, J. E., R. W. Johnson, M. E. Karr, A. R. Burden, and J. G. Baker (2003d), "Daylight Visible/NIR Whole Sky Imagers for Cloud and Radiance Monitoring in Support of UV Research Programs", International Symposium on Optical Science and Technology, SPIE the International Society for Optical Engineering, 2003
Shields, J. E., R. W. Johnson, M. E. Karr, A. R. Burden, and J. G. Baker (2003e), "Whole Sky Imagers for Real-time Cloud Assessment, Cloud Free Line of Sight Determinations and Potential Tactical Applications", The Battlespace Atmospheric and Cloud Impacts on Military Operations (BACIMO) Conference, Monterey, CA, http://www.nrlmry.navy.mil/bacimo.html, 2003
Shields, J. E., A. R. Burden, M. E. Karr, R. W. Johnson, and J. G. Baker (2004a), "Development of Techniques for Determination of Nighttime Atmospheric Transmittance and Related Analytic Support for the Whole Sky Imager - Published as Final Report for ONR Contract N00014-01-D-0043 DO #5", Marine Physical Laboratory, Scripps Institution of Oceanography, University of California San Diego, Technical Note 263, April 2004
Shields, J. E., M. E. Karr, A. R. Burden, R. W. Johnson, and J. G. Baker (2004b), "Project Report for Providing Two Day/Night Whole Sky Imagers and Related Development Work for Starfire Optical Range - Published as Final Report for ONR Contract N00014-97-D-0350 DO #6", Marine Physical Laboratory, Scripps Institution of Oceanography, University of California San Diego, Technical Note 265, May 2004
Shields, J. E., J. G. Baker, M. E. Karr, R. W. Johnson, and A. R. Burden (2005a), "Visibility measurements along extended paths over the ocean surface", International Symposium on Optical Science and Technology, SPIE the International Society for Optical Engineering, August 2005
Shields, J. E., A. R. Burden, R. W. Johnson, M. E. Karr, and J. G. Baker (2005b), "Cloud Free Line of Sight Probabilities and Persistence Probabilities from Whole Sky Imager Data", Marine Physical Laboratory, Scripps Institution of Oceanography, University of California San Diego, Technical Note 266, August 2005
Shields, J. E., A. R. Burden, R. W. Johnson, M. E. Karr, and J. G. Baker (2005c), "Measurement and Evaluation of Cloud Free Line of Sight with Digital Whole Sky Imagers", The Battlespace Atmospheric and Cloud Impacts on Military Operations (BACIMO) Conference, Monterey, CA, http://www.nrlmry.navy.mil/bacimo.html, 2005
Shields, J. E., R. W. Johnson, J. G. Baker, M. E. Karr, and A. R. Burden (2006), "Multispectral scattering measurements along extended paths using an imaging system", International Symposium on Optical Science and Technology, SPIE the International Society for Optical Engineering, August 2006
Shields, J. E., M. E. Karr, A. R. Burden, R. W. Johnson, and W. S. Hodgkiss (2007a), "Enhancement of Near-Real-Time Cloud Analysis and Related Analytic Support for Whole Sky Imagers, Final Report for ONR Contract N00014-01-D-0043 DO #4", Marine Physical Laboratory, Scripps Institution of Oceanography, University of California San Diego, Technical Note 271, May 2007
Shields, J. E., M. E. Karr, A. R. Burden, R. W. Johnson, and W. S. Hodgkiss (2007b), "Whole Sky Imaging of Clouds in the Visible and IR for Starfire Optical Range, Final Report for ONR Contract N00014-01-D-0043 DO #11", Marine Physical Laboratory, Scripps Institution of Oceanography, University of California San Diego, Technical Note 272, July 2007