Author: Jory, H. M.
Affiliation: McDonnell Douglas Corporation
Rights: Copyright © International Foundation for Telemetering
Collection Information: Proceedings from the International Telemetering Conference are made available by the International Foundation for Telemetering and the University of Arizona Libraries. Visit http://www.telemetry.org/index.php/contact-us if you have questions about items in this collection.
Abstract: This report presents the results of a company-funded computer study to determine the effectiveness of redundancy removal algorithms as applied to manned spacecraft data. The company's familiarity with and access to manned space flight data provided an almost unique opportunity to study this method of data compression using data representative of that which will be required from a Manned Mars Mission. A total of 28,500 seconds of the Gemini XII flight is examined using seven algorithms and three different tolerance bands. Over eleven million samples have been examined using terminology and descriptions consistent with previously published literature to allow direct comparison of actual flight data with previous results using synthetic data. The computer outputs presented the following information: (A) compression ratios as a function of technique, channel number, and type of data for each of the activity periods; (B) buffer input rates and accumulated queue lengths every 2.4 seconds for the ZFN technique; and (C) error distributions for each of the techniques for six different apertures. The results indicate that the zero-order, variable-corridor, adjusted-preceding-sample-transmitted (ZVA) technique can provide data compression ratios of 187:1 using a 1.2% tolerance. Nominal buffer sizes of 20K bits are adequate to handle the data activity periods involved. The error-distribution evaluation indicates that the error distribution is primarily a function of the technique and the aperture.
Sponsors: International Foundation for Telemetering
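The redundancy removal idea described in the abstract can be illustrated with a minimal sketch of a zero-order predictor with a fixed tolerance band (aperture): a sample is transmitted only when it leaves the corridor around the last transmitted value, and the compression ratio is the input sample count divided by the transmitted count. This is a generic illustration of the technique class, not the paper's ZVA or ZFN algorithms; all names, parameters, and data below are hypothetical.

```python
def zero_order_compress(samples, aperture):
    """Zero-order predictor, fixed aperture: return (index, value) pairs
    that would be transmitted; all other samples are redundant."""
    if not samples:
        return []
    transmitted = [(0, samples[0])]        # first sample is always sent
    reference = samples[0]                 # centre of the current corridor
    for i in range(1, len(samples)):
        if abs(samples[i] - reference) > aperture:   # outside tolerance band
            transmitted.append((i, samples[i]))
            reference = samples[i]         # re-centre corridor on new sample
    return transmitted

def compression_ratio(samples, transmitted):
    """Input samples per transmitted sample, e.g. 187:1 -> 187.0."""
    return len(samples) / len(transmitted)

# Illustrative telemetry-like channel: quiet stretches with occasional steps.
data = [10.0, 10.1, 9.9, 10.05, 12.0, 12.1, 11.95, 15.0]
sent = zero_order_compress(data, aperture=0.5)
print(sent)                                # [(0, 10.0), (4, 12.0), (7, 15.0)]
print(compression_ratio(data, sent))       # 8/3, about 2.67:1
```

A variable-corridor variant such as ZVA would adjust the corridor placement (e.g. re-centre on an adjusted preceding sample rather than the raw one), which is what lets it reach the high ratios reported in the study on low-activity channels.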