Rosetta Review - Wednesday, 17 February 2016

In the room: Tony Farnham, Cesare Gravas, Heather Franz, Mike A'Hearn,
Ludmilla Kolokolova, Steve Joy, Emily Law, Sascha Kempf (left ~9:45am),
Tilden Barnes, Anne Raugh (recording), Mike Kelley, Bryan Butler,
Lori Feaga (joined ~10:00am)

Joint Session
-------------

ROSINA Data
===========

Simon Sheridon's presentation

o RID 101 is listed as major (provide smaller files for downloading), but the team is correct - this is an archive function, not a data preparation function. In this case, the archive can repackage the files in various ways and provide an interface for users to select data. PSA may have a different view.

o RID 102, listed as "minor", requests that lab-based calibration data be added to the archive. The team thinks this may be misleading, because the behaviour of the instrument in space is substantially different. The team suggests documenting the history of the instrument instead, which seems reasonable to the reviewer.

COSIMA Data
===========

Eberhard Gruen's presentation

o This presentation includes a diagram that shows the positions of the named particles, at least for one view of the detector surface.

o The requested mass scale and deadtime correction calibration data are still in active development. The team notes that providing a mass scale involves scientific interpretation, though the automatically generated scale is very good in some ranges.

o The team will attempt to deliver the requested ground calibration data.

o RID CS8 - grain lists are labelled "image" if there is an accompanying image, but it sounds like the team wants to rethink both the logic and the labelling.

Returning to Sascha's presentation:

o The data tables include file references for files that are not always present. This is a result of gaps in the telemetry (the file names are apparently produced automatically). This seems to be easy for the team to correct; a sketch of the kind of consistency check involved follows the PTOLEMY notes below.

o In addition to the other usability issues, Eberhard's suggestion that the particle names be included in any index listing for the spectra would be a great help to users.

GIADA Data
==========

Amara Graps's presentation:

o This reviewer attempted to work backward from the level 3 data to level 2 and found that data with a quality description of "bad data" were used. The team says this is not the case. This may be a problem with the calibration description being too high-level, or possibly reviewer error. Notwithstanding, additional calibration detail would be helpful.

o The team notes that at least one paper about the calibration is currently under review prior to publication. Copyright issues prevent it (and possibly others) from being included in the archive.

o The reviewer requested a data file index in the DATA/ directory. Raugh directed her to the manifest index file in the INDEX/ directory, but it may be that this does not contain the information actually desired (it really only contains some PDS parameters and file and directory names).

SESAME Data
===========

Sascha had no outstanding issues with this data. There is no review available for this data set from ESA.

COSAC
=====

Lena Le Roy's presentation:

o The reviewer would prefer one spectrum per file, but each file represents an observing run. No change will be made.

PTOLEMY
=======

Lena Le Roy's presentation

o Individual observations need to be tied to the scan function used. The scan function is not unique to instrument mode (which is also not provided), so it will have to be added explicitly.

o Again the question of a single file for each spectrum is raised. In this case there is some additional detailed documentation that can be added regarding individual observational circumstances.
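A note on the COSIMA file-reference gaps above: the fix amounts to cross-checking every file name a data table cites against the files actually delivered. A minimal sketch of such a check, assuming a comma-delimited table with a header row and the referenced file name in a named column (the table layout, file names, and column name here are all hypothetical, for illustration only):

    import csv
    from pathlib import Path

    def missing_references(table_path, data_dir, filename_column):
        """Return file names cited in a data table that do not exist
        under data_dir. Assumes a CSV-style table with a header row;
        the real COSIMA tables may be fixed-width instead."""
        missing = []
        with open(table_path, newline="") as f:
            for row in csv.DictReader(f):
                name = row[filename_column].strip().strip('"')
                if name and not (Path(data_dir) / name).exists():
                    missing.append(name)
        return missing

    # Example use (all names are placeholders):
    # print(missing_references("grain_list.tab", "DATA", "SPECTRUM_FILE_NAME"))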
MIRO
====

Alice Le Gall's presentation

o This reviewer also reports gaps in the data. Some are gaps in the telemetry, but one is a published result data set. The team reports that the missing published data is a mystery currently under investigation.

o There are known issues with the calibration, but these are expected to be corrected in a future delivery. It is unlikely this can be done well and fully documented for this delivery.

o Similarly, improved and expanded geometry is planned for future deliveries.

MIDAS
=====

Jon Hiller's presentation...

Anne R. explained the discussions from last night regarding the actual values in the data and the discrepancy in ranges between the proprietary image display and the PDS image display. This needs to be worked offline. The format itself, though not standard, should be PDS3-compliant as it currently exists. There seems to be some piece missing in either the format description or the PDS label.

MUPUS
=====

ESA reviewer's presentation...

o Also notes the missing hammer data. The team is researching this.

SESAME
======

The ESA reviewer was not available, so we recapped Kevin Walsh's comments and reviewed the ESA RID submission.

SD2
===

Miguel Perez Ayucar's presentation

o Also noted the groupings of four lines with the same time tag in the calibrated data, though this does not appear in the housekeeping data. The team notes there is a difference in the onboard time counter (column 28). They will identify the document that explains this and provide the time conversion algorithm. The reviewer requests that the time be updated to reflect the actual observation time calculated from the spacecraft clock, rather than leaving that as an exercise for the user. (A sketch of that conversion follows these notes.)

o The reviewer notes there is no BROWSE/ directory. This is where the noted missing plots should be (confirmed by the team).

The issue of the lack of a binary file structure description came up again. Apparently Tilden will be raising this as a RID and SONC will address it.
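On the SD2 time conversion above: the standard route from a Rosetta onboard clock count to an actual observation time is through the mission's SPICE kernels. A minimal sketch using the spiceypy wrapper; the kernel file names and the sample clock string are placeholders, and whether the column-28 counter maps directly onto a standard SCLK string is an assumption that the document the team will identify would have to confirm:

    import spiceypy as spice

    ROSETTA = -226  # Rosetta's NAIF spacecraft ID

    # Placeholder kernel names; the real leapseconds (LSK) and
    # spacecraft clock (SCLK) kernels come from the mission's
    # SPICE archive.
    spice.furnsh("naif0012.tls")
    spice.furnsh("ros_sclk.tsc")

    def obt_to_utc(obt_string):
        """Convert an onboard-time string to UTC. Assumes the counter
        is expressed as a Rosetta SCLK string such as
        "1/0397138439.15625" (a made-up value)."""
        et = spice.scs2e(ROSETTA, obt_string)  # SCLK string -> ephemeris time
        return spice.et2utc(et, "ISOC", 3)     # ET -> ISO calendar UTC

    # print(obt_to_utc("1/0397138439.15625"))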
-----------------------------------------------------------------------

UMD Review Session
------------------

ROMAP
=====

Steve Joy's presentation contains a number of issues that will be raised as RIDs. Additional notes:

General notes:

o In the calibration description comments: if it is not possible to do any or all of these things, then there should be clear documentation to that effect.

o The checksum.tab/lbl comment is not for the teams - this file is added by SBN for data integrity management.

MAG notes:

o The data set catalog files (of all data sets) must contain sufficient information to allow a reasonable user to distinguish between similar data sets.

o The data designated as level 5 do not appear to be derived, merely calibrated.

o The data designated level 3 do not appear to be actually calibrated. The data indicate field strengths an order of magnitude away from any plausible reality. This is a major problem for certification, as it misrepresents the processing status of the data to potential users.

Certification: Level 2 certified, archive pending lien resolution. Level 3 not certified, delta review required.

SPM notes:

o The SPM files appear to have been created to minimize the effort involved in creating the FMT file, rather than to support reading the data for analysis.

Certification: Deferred.

RPCIES
======

o Same issue with data set catalog descriptions.

o The archive requirements could not be assessed because a complete data set was not presented. In fact, not even a complete archive data set structure was presented.

o The data labels do not actually describe the data (they do not validate).

o The flux unit in column 22 is both too long (too many characters) and incomprehensible to everyone in the room.

o The level 3 data have an implausible number of significant digits.

o The level 3 data contain contradictory notes, and probably other errors noted in the level 2 data that have not been propagated here.

Certification: Not reviewable.

RPCICA
======

o The data sets designated levels 2 and 3 are not actually related to each other by derivation - they are parallel, apparently irreversibly processed (level 4 or higher) data sets.

o A summary description of the calibration file name formation rules in the calinfo.txt file would be particularly helpful to users. (This is in the EAICD, but it is not complex and could be copied.)

o The BROWSE plots appear to be of derived data, not direct plots of any actual data file.

Certification: Not certified.

RPCLAP
======

The data are unusable as presented. The overhead required to read the labels in order to get to the data overwhelms whatever processing time might be spent in analysis. (A sketch of what that label-reading step demands of every user follows the CONSERT notes below.)

Certification: Not certified.

RPCMIP
======

No major issues; a couple of minor issues.

Certification: Certified; archive pending lien resolution.

RPCMAG
======

No major issues; a couple of minor liens.

Certification: Certified; archive pending lien resolution.

RSI
===

Dick Simpson's presentation

o The documentation of the Earth stations is incomplete and does not note the use of the DSN and ESA Malargue antennas.

o Some files refer to other missions and are just plain wrong.

o The labels are wildly inaccurate and incomplete.

o The two data sets supplied are not representative of the entire collection.

o The dataset.cat description for EVERY data set must contain sufficient detail to allow a user to discriminate between similar data sets.

Certification: Not certified.

CONSERT
=======

Dick Simpson's presentation contains issues that should be taken as liens. Additional notes:

o The instrument.cat file does not contain much that uniquely defines or describes the instrument. It references a paper for details, but we need to check whether that is sufficient in this case.

o The dataset.cat description must contain enough detail to enable a user to distinguish between similar data sets.

o Note that it is not appropriate to include URL references in archival documents as references. It must be assumed they will not be resolvable for most of the life of the archive.

Certification: Not certified. (?) May only need a delta review.
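For context on the RPCLAP note above: every PDS3 product carries a label that a user must parse before touching the data at all, just to learn where and how the values are stored. A minimal sketch of that first step, reading only the simple KEYWORD = VALUE pairs of a label (it deliberately ignores OBJECT nesting and multi-line values, which a real reader must also handle; the file name is a placeholder):

    def read_pds3_keywords(label_path):
        """Collect flat KEYWORD = VALUE pairs from a PDS3 label.
        Deliberately minimal: skips OBJECT/END_OBJECT structure and
        values continued across lines, both common in real labels."""
        keywords = {}
        with open(label_path, errors="replace") as f:
            for line in f:
                line = line.strip()
                if line == "END":
                    break
                if "=" in line and not line.startswith("/*"):
                    key, _, value = line.partition("=")
                    keywords[key.strip()] = value.strip().strip('"')
        return keywords

    # Example (placeholder file name):
    # lbl = read_pds3_keywords("RPCLAP_EXAMPLE.LBL")
    # print(lbl.get("RECORD_TYPE"), lbl.get("RECORD_BYTES"))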