Data for testing readstat-rs binary
ahs2019n.sas7bdat
→ US Census data - http://www2.census.gov/programs-surveys/ahs/2019/AHS%202019%20National%20PUF%20v1.1%20Flat%20SAS.zip
- Must be downloaded manually as it is currently ignored by git (i.e. it has been added to the repository's .gitignore file)
- Renamed to be _ahs2019n.sas7bdat in order to be picked up by the _*.sas7bdat pattern in the .gitignore file
all_types.sas7bdat
→ SAS dataset containing all SAS types
- Created using create_all_types.sas
cars.sas7bdat
→ SAS cars dataset
hasmissing.sas7bdat
→ SAS dataset containing missing values
intel.sas7bdat
messydata.sas7bdat
rand_ds.sas7bdat
→ Created using create_rand_ds.sas
- Renamed to be _rand_ds.sas7bdat in order to be picked up by the _*.sas7bdat pattern in the .gitignore file
rand_ds_largepage_err.sas7bdat
→ Created using create_rand_ds.sas with BUFSIZE set to 2M
- Does not parse with version 1.1.6 of ReadStat
rand_ds_largepage_ok.sas7bdat
→ Created using create_rand_ds.sas with BUFSIZE set to 1M
- Parses with version 1.1.6 of ReadStat
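The two page-size variants above can be reproduced with SAS's BUFSIZE= dataset option. A hypothetical sketch (the library and source dataset names are assumptions, not the contents of the actual create_rand_ds.sas):

```sas
/* Write the same data with a 2M page size (fails to parse with ReadStat 1.1.6) */
data out.rand_ds_largepage_err(bufsize=2M);
    set work.rand_ds;
run;

/* Write it with a 1M page size (parses with ReadStat 1.1.6) */
data out.rand_ds_largepage_ok(bufsize=1M);
    set work.rand_ds;
run;
```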
scientific_notation.sas7bdat
→ Used to test float parsing
- Created using create_scientific_notation_ds.sas
- Previously parsed floats with the lexical crate by first converting to a string with lexical::to_string(value) and then converting back (after truncating) with lexical::parse
- Had an issue where large floats were read correctly via ReadStat, but the string conversion via lexical resulted in a string with scientific notation; attempting to parse that string back to a float with lexical::parse would then throw an error
- Fixed by d301a9f9ff8c5e3c34a604a16c095e99d205f624
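The failure mode above can be sketched with std-only Rust; the `{:e}` formatter stands in for the scientific-notation string that lexical produced (the lexical crate itself is not used here, and the naive truncation is a simplified stand-in for the old round-trip logic):

```rust
fn main() {
    // Ordinary values round-trip fine through a plain decimal string.
    let s = 1234.5_f64.to_string();
    assert_eq!(s, "1234.5");
    assert_eq!(s.parse::<f64>().unwrap(), 1234.5);

    // For very large floats, a shortest-representation formatter emits
    // scientific notation; std's `{:e}` mimics that output here.
    let s = format!("{:e}", 1.0e300_f64);
    assert_eq!(s, "1e300");

    // Truncating such a string before parsing it back (the old approach)
    // drops exponent digits and silently changes the value -- and a parser
    // that rejects scientific notation errors out entirely.
    let truncated = &s[..3]; // "1e3" -- hypothetical naive truncation
    assert_eq!(truncated.parse::<f64>().unwrap(), 1000.0);
}
```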
somedata.sas7bdat
somemiss.sas7bdat