PDS_VERSION_ID = PDS3
LABEL_REVISION_NOTE = "2018-07-11, R. Alanis and J. Maki, initial release for Version 2.0 of the Navcam RDR dataset."
RECORD_TYPE = STREAM

OBJECT = DATA_SET
  DATA_SET_ID = "MSL-M-NAVCAM-5-RDR-V2.0"

  OBJECT = DATA_SET_MISSION
    MISSION_NAME = "MARS SCIENCE LABORATORY"
  END_OBJECT = DATA_SET_MISSION

  OBJECT = DATA_SET_INFORMATION
    DATA_SET_NAME = "MSL MARS NAVIGATION CAMERA 5 RDR V2.0"
    DATA_SET_COLLECTION_MEMBER_FLG = N
    DATA_OBJECT_TYPE = IMAGE
    ARCHIVE_STATUS = "ARCHIVED - ACCUMULATING"
    START_TIME = 2017-11-09T15:56:56.773
    STOP_TIME = NULL
    DATA_SET_RELEASE_DATE = 2018-08-01
    PRODUCER_FULL_NAME = "Justin Maki"
    DETAILED_CATALOG_FLAG = Y
    DATA_SET_TERSE_DESC = "Mars Science Laboratory Mars Navigation Camera Reduced Data Records"
    CITATION_DESC = "Maki, Justin, MSL Mars Navigation Camera RDR V2.0, NASA Planetary Data System, MSL-M-NAVCAM-5-RDR-V2.0, 2018."
    ABSTRACT_DESC = "NULL"
    DATA_SET_DESC = "

Data Set Overview
=================

This data set contains derived data products for the MSL Navigation cameras (Navcam), processed with correctly rotated flat field files, to support rover traverse planning, post-traverse assessment, rover localization, operation of the robotic arm, and the selection of science targets. For a description of the flat field rotation issue affecting previously released products from Releases 1-17, see SECTION B, item 6 of the accompanying ERRATA_V1.TXT file. See the VOLDESC.CAT and ERRATA.TXT at the root level of this volume for additional information about the addition of MSL-M-NAVCAM-5-RDR-V2.0 to this volume.

Most Navcam images were commanded from Earth, although a subset of Navcam images were commanded autonomously during rover traverses by the rover's onboard autonomous navigation system. The Navcams were used extensively to acquire end-of-drive 360-degree panoramas. This data set is similar to the reduced data sets for the Navigation cameras on MER [MAKIETAL2003].
Detailed descriptions of all Reduced Data Record (RDR) products are available in the MSL_CAMERA_SIS.PDF, located in the DOCUMENT directory of this volume.

Processing
==========

This data set uses the Committee on Data Management and Computation (CODMAC) data level numbering system. The MSL Navcam RDRs are considered CODMAC Level 3 (calibrated data, equivalent to NASA Level 1A), CODMAC Level 4 (resampled data, equivalent to NASA Level 1B), or CODMAC Level 5 (derived data, equivalent to NASA Level 3). The RDRs are derived from the Navcam EDR data set and include radiometrically corrected and/or camera model corrected and/or geometrically altered versions of the raw camera data, in single frame form. All of the RDR data products in this data set have detached PDS labels.

Data
====

There are dozens of types of RDR products, described in detail in Section 5.1 of the MSL_CAMERA_SIS.PDF. Below are descriptions of the most commonly used RDRs.

1) Geometrically Corrected (Linearized) Images

EDRs and single-frame RDRs are described by a camera model. This model, represented by a set of vectors and numbers, permits a point in space to be traced into the image plane, and vice versa. EDR camera models are derived by acquiring images of a calibration target with known geometry at a fixed azimuth/elevation. The vectors representing the model are derived from analysis of this imagery. These vectors are then translated and rotated based on the actual pointing of the camera to represent the conditions of each specific image. The result is the camera model for the EDR.

The Navcams use a CAHVOR model, while the Hazcams use a more general CAHVORE model. Neither model is linear, and both require complex calculations to transform line/sample points in the image plane to XYZ positions in the scene. To simplify this, the images are warped, or reprojected, so that they can be described by a linear CAHV model.
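The linear CAHV projection mentioned above can be sketched in a few lines: the image coordinates of a 3-D point P are the dot products of (P - C) with the H and V vectors, divided by the dot product with the axis A. The model vectors below are illustrative toy values, not flight calibration values.

```python
import numpy as np

# Sketch of the linear CAHV projection: C is the camera center, A the
# pointing axis, H and V the horizontal and vertical image vectors.
def cahv_project(p, c, a, h, v):
    d = p - c                    # ray from camera center to the 3-D point
    depth = np.dot(d, a)         # distance along the camera axis A
    if depth <= 0.0:
        return None              # point is behind the camera
    x = np.dot(d, h) / depth     # sample (horizontal image coordinate)
    y = np.dot(d, v) / depth     # line (vertical image coordinate)
    return x, y

# Toy model: 100-pixel focal length, optical center at (512, 512).
# Hypothetical values for illustration only.
c = np.zeros(3)
a = np.array([0.0, 0.0, 1.0])
h = 100.0 * np.array([1.0, 0.0, 0.0]) + 512.0 * a
v = 100.0 * np.array([0.0, 1.0, 0.0]) + 512.0 * a
```

Because the model is linear, the inverse mapping (image plane back to a ray in space) is equally simple, which is what makes linearized products convenient for stereo work.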
This linearization process has several benefits:

a) It removes geometric distortions inherent in the camera instruments, so that straight lines in the scene are straight in the image.
b) It aligns the images for stereo viewing. Matching points fall on the same image line in both left and right images, and both left and right models point in the same direction.
c) It facilitates correlation, allowing the use of 1-D correlators.
d) It simplifies the math involved in using the camera model.

The transformation introduces some artifacts in terms of scale change and/or omitted data (see the references). The linearized CAHV camera model is derived from the EDR's camera model by considering both the left and right eye models and constructing a pair of matched linear CAHV models that conform to the above criteria. The image is then projected, or warped, from the CAHVOR/CAHVORE model to the CAHV model. This involves projecting each pixel through the EDR camera model into space, intersecting it with a surface (which matters only for Hazcams and is a sphere centered on the camera), and projecting the pixel back through the CAHV model into the output image. See GEOMETRIC_CM.TXT for additional detail.

2) Inverse Lookup Table Scaled Products

If the Navcam EDR is in 8-bit format as a result of onboard 12-to-8-bit scaling using a Lookup Table (LUT), then an Inverse LUT is applied to rescale the 8-bit values back into the lowest 12 bits of a 16-bit signed integer.

3) Radiometrically Corrected Products

There are three types of radiometrically corrected products, and multiple methods of performing radiometric correction. For more information, see the MSL_CAMERA_SIS.PDF in the DOCUMENT directory of this volume.

a) RA products have been corrected to absolute radiance units of W/m^2/nm/steradian.
b) RI products have been corrected for instrument effects only, and are in units of DN.
c) IO products are radiance factor (I/F) and are dimensionless.
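The inverse LUT step described above can be sketched as follows. The forward table here is a hypothetical square-root companding example; the actual flight LUTs are defined in the MSL_CAMERA_SIS.PDF, not here.

```python
import numpy as np

# Hypothetical 12-to-8-bit companding table (square root), standing in
# for the onboard LUT. Flight LUTs differ; see MSL_CAMERA_SIS.PDF.
fwd_lut = (np.sqrt(np.arange(4096) / 4095.0) * 255.0).astype(np.uint8)

# Ground-side inverse LUT: for each 8-bit code, take the rounded mean of
# all 12-bit values the forward table mapped to it.
inv_lut = np.zeros(256, dtype=np.int16)
for code in range(256):
    src = np.where(fwd_lut == code)[0]
    if src.size:
        inv_lut[code] = np.round(src.mean())

def apply_inverse_lut(img8):
    # Rescale an 8-bit EDR back into 12-bit DN in a 16-bit signed image.
    return inv_lut[img8]
```

The key point, matching the text, is that the output occupies only the lowest 12 bits of the 16-bit signed integer, so values never exceed 4095.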
4) Disparity Files

A Disparity file contains 2 bands of 32-bit floating point numbers in Band Sequential order (line, sample). Alternatively, line and sample may be stored in separate single-band files.

The parallax, or difference measured in pixels, between an object's location in two individual images (typically the left and right images of a stereo pair) is called the disparity. Disparity files contain these disparity values in both the line and sample dimensions for each pixel in the reference image. This reference image is traditionally the left image of a stereo pair, but could be the right image for special products.

The geometry of the Disparity image is the same as the geometry of the reference image. This means that for any pixel in the reference image, the disparity of the viewed point can be obtained from the same pixel location in the Disparity image. The values in a Disparity image are the 1-based coordinates of the corresponding point in the non-reference image. Thus, the coordinates in the reference image are the same as the coordinates in the Disparity image, and the matching coordinates in the stereo partner image are the values in the Disparity image. Disparity values of 0.0 indicate that no valid disparity exists, for example due to lack of overlap or correlation failure. This value is reflected in the MISSING_CONSTANT keyword.

5) XYZ Products

An XYZ file contains 3 bands of 32-bit floating point numbers in Band Sequential order. Alternatively, X, Y and Z may be stored in separate single-band files as an X Component RDR, Y Component RDR and Z Component RDR, respectively. The single component RDRs are implicitly the same as the XYZ file, which is described below. XYZ locations in all coordinate frames for MSL are expressed in meters unless otherwise noted.

The pixels in an XYZ image are coordinates in 3-D space of the corresponding pixel in the reference image.
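The disparity lookup just described amounts to a simple indexing rule, sketched below under the stated conventions: band 0 holds line values, band 1 holds sample values, coordinates are 1-based, and 0.0 in both bands marks a hole.

```python
import numpy as np

def stereo_match(disp, ref_line, ref_sample):
    # disp is a (2, rows, cols) band-sequential array: band 0 holds
    # line coordinates, band 1 sample coordinates, both 1-based in the
    # stereo partner image. 0.0 is the MISSING_CONSTANT.
    line = disp[0, ref_line, ref_sample]
    sample = disp[1, ref_line, ref_sample]
    if line == 0.0 and sample == 0.0:
        return None                          # hole: no valid disparity
    # Convert 1-based file coordinates to 0-based array indices.
    return int(round(line)) - 1, int(round(sample)) - 1

# Toy 4x4 disparity image with one valid correlation at reference
# pixel (1, 2), pointing at 1-based partner coordinates (2.0, 4.0).
disp = np.zeros((2, 4, 4), dtype=np.float32)
disp[0, 1, 2] = 2.0   # line band
disp[1, 1, 2] = 4.0   # sample band
```

Note that real disparity values are fractional (sub-pixel correlation); the rounding here is only for the illustrative integer lookup.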
This reference image is traditionally the left image of a stereo pair, but could be the right image for special products. The geometry of the XYZ image is the same as the geometry of the reference image. This means that for any pixel in the reference image, the 3-D position of the viewed point can be obtained from the same pixel location in the XYZ image. The 3-D points can be referenced to any of the MSL coordinate systems (specified by the DERIVED_IMAGE_PARAMS Group in the PDS label).

Most XYZ images will contain holes, or pixels for which no XYZ value exists. These are caused by many factors, such as differences in overlap and correlation failures. Holes are indicated by X, Y, and Z all having the same specific value, defined by the MISSING_CONSTANT keyword in the IMAGE object. For the XYZ RDR, this value is (0.0,0.0,0.0), meaning that all three bands must be zero; if only one or two bands are zero, that does not indicate missing data.

6) Range (Distance) Files

A Range (distance) file contains 1 band of 32-bit floating point numbers. The pixels in a Range image represent Cartesian distances from a reference point (defined by the RANGE_ORIGIN_VECTOR keyword in the PDS label) to the XYZ position of each pixel. This reference point is normally the camera position, as defined by the C point of the camera model. A Range image is derived from an XYZ image and shares the same pixel geometry and XYZ coordinate system. As with XYZ images, Range images can contain holes, defined by MISSING_CONSTANT. For MSL, this value is 0.0.

7) Surface Normal Files

A Surface Normal (UVW) file contains 3 bands of 32-bit floating point numbers in Band Sequential order. Alternatively, U, V and W may be stored in separate single-band files as a U Component RDR, V Component RDR and W Component RDR, respectively. The single component RDRs are implicitly the same as the UVW file. The pixels in a UVW image correspond to the pixels in an XYZ file, with the same image geometry.
However, the pixels are interpreted as a unit vector representing the normal to the surface at the point represented by the pixel. U contains the X component of the vector, V the Y component, and W the Z component. The vector is defined to point out of the surface (e.g., upward for flat ground). The unit vector can be referenced to any of the MSL coordinate systems (specified by the DERIVED_IMAGE_PARAMS Group in the PDS label).

Most UVW images will contain holes, or pixels for which no UVW value exists. These are caused by many factors, such as differences in overlap, correlation failures, and insufficient neighbors to compute a surface normal. Holes are indicated by U, V, and W all having the same specific value. Unlike XYZ, (0,0,0) is an invalid value for a UVW file, since the vectors are defined to be unit vectors. Thus there is no ambiguity with the MISSING_CONSTANT, as there is with XYZ, where (0.0,0.0,0.0) is a valid point.

8) Surface Roughness Maps

The roughness maps contain surface roughness estimates at each pixel in the image, along with a 'goodness' flag indicating whether the roughness meets certain criteria. For each pixel, the surface normal product defines a reference plane. XYZ pixels in the area of interest are gathered, and the distance of each to the plane is computed. Minimum and maximum distances from the plane are computed, with outliers excluded. Roughness is defined as the distance between this minimum and maximum.

9) Slope Products

The slope RDR products represent aspects of the slope of the terrain as determined by stereo imaging. There are several slope types, described in further detail in the MSL_CAMERA_SIS.PDF.

10) Arm Reachability Maps

The Arm Reachability Maps contain information about whether or not the instruments on the arm can reach (either contact or image) the object or location represented by each pixel in the scene. They are derived from the XYZ and Surface Normal products.
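The derivation of a Range image from an XYZ image, including the propagation of (0,0,0) holes to the 0.0 Range MISSING_CONSTANT, can be sketched as below. The reference point argument stands in for RANGE_ORIGIN_VECTOR.

```python
import numpy as np

MISSING_XYZ = 0.0     # XYZ MISSING_CONSTANT: all three bands zero
MISSING_RANGE = 0.0   # Range MISSING_CONSTANT

def xyz_to_range(xyz, origin):
    # xyz is a (3, rows, cols) band-sequential image in meters; origin
    # is the 3-D reference point (normally the camera model's C point,
    # carried in RANGE_ORIGIN_VECTOR in the PDS label).
    diff = xyz - np.asarray(origin, dtype=np.float32).reshape(3, 1, 1)
    rng = np.sqrt((diff ** 2).sum(axis=0)).astype(np.float32)
    # Holes, where X, Y and Z are all zero, propagate into the Range
    # image as its own missing value.
    hole = np.all(xyz == MISSING_XYZ, axis=0)
    rng[hole] = MISSING_RANGE
    return rng

# Toy 2x2 XYZ image: one valid point at (0, 0), holes elsewhere.
xyz = np.zeros((3, 2, 2), dtype=np.float32)
xyz[:, 0, 0] = [3.0, 4.0, 0.0]
rng = xyz_to_range(xyz, [0.0, 0.0, 0.0])
```

The hole test must check all three bands, matching the rule above that a single zero band does not indicate missing data.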
11) Stereo Anaglyphs

A stereo anaglyph is a method of displaying stereo imagery quickly and conveniently using conventional display technology (no special hardware) and red/blue glasses. This is done by displaying the left eye of the stereo pair in the red channel, and displaying the right eye in the green and blue channels. An anaglyph data product simply captures this in a single 3-band color image, which can be displayed using any standard image display program with no knowledge that it is a stereo image. The red (first) band contains the left eye image, while the green and blue (second and third) bands each contain the right eye image (so the right image is duplicated in the file).

Anaglyphs are created manually from CAHV linearized Full Frame stereo pair EDRs or mosaics. Often the images are stretched prior to creating the anaglyph. After stretching, the images are converted to a VICAR cube, which creates a single multi-band image. The final step is adding the PDS label.

Software
========

Navigation camera downlink processing software used by the science and engineering team during operations is focused on rapid reduction, calibration, and visualization of images in order to make discoveries, to accurately and expeditiously characterize the geologic environment around the rover, and to provide timely input for operational decisions concerning rover navigation and Robotic Arm target selection.

Navigation images can be viewed with the program NASAView, developed by the PDS and available for a variety of computer platforms from the PDS web site http://pds.jpl.nasa.gov/tools/nasa-view.shtml. There is no charge for NASAView.

Media/Format
============

The data set will be delivered and made available to the public through the Planetary Data System web sites.
" CONFIDENCE_LEVEL_NOTE = "NULL" END_OBJECT = DATA_SET_INFORMATION OBJECT = DATA_SET_TARGET TARGET_NAME = MARS END_OBJECT = DATA_SET_TARGET OBJECT = DATA_SET_HOST INSTRUMENT_HOST_ID = MSL INSTRUMENT_ID = NAV_LEFT_A END_OBJECT = DATA_SET_HOST OBJECT = DATA_SET_HOST INSTRUMENT_HOST_ID = MSL INSTRUMENT_ID = NAV_LEFT_B END_OBJECT = DATA_SET_HOST OBJECT = DATA_SET_HOST INSTRUMENT_HOST_ID = MSL INSTRUMENT_ID = NAV_RIGHT_A END_OBJECT = DATA_SET_HOST OBJECT = DATA_SET_HOST INSTRUMENT_HOST_ID = MSL INSTRUMENT_ID = NAV_RIGHT_B END_OBJECT = DATA_SET_HOST OBJECT = DATA_SET_REFERENCE_INFORMATION REFERENCE_KEY_ID = "JPLD-38107" END_OBJECT = DATA_SET_REFERENCE_INFORMATION OBJECT = DATA_SET_REFERENCE_INFORMATION REFERENCE_KEY_ID = "MAKIETAL2012" END_OBJECT = DATA_SET_REFERENCE_INFORMATION OBJECT = DATA_SET_REFERENCE_INFORMATION REFERENCE_KEY_ID = "MAKIETAL2003" END_OBJECT = DATA_SET_REFERENCE_INFORMATION END_OBJECT = DATA_SET END