Using Generic Telemetry Prognostic Algorithms for Launch Vehicle and Spacecraft Independent Failure Analysis Service
Author
Losik, Len
Affiliation
Failure Analysis
Issue Date
2009-10
Keywords
Failure analysis
independent failure analysis
fault analysis
prognostics
diagnostics
telemetry analysis
failure isolation, identification and recovery
Rights
Copyright © held by the author; distribution rights International Foundation for Telemetering
Collection Information
Proceedings from the International Telemetering Conference are made available by the International Foundation for Telemetering and the University of Arizona Libraries. Visit http://www.telemetry.org/index.php/contact-us if you have questions about items in this collection.
Abstract
Current equipment and vehicle failure analysis practices use diagnostic technology, developed over the past 100 years of designing and manufacturing electrical and mechanical equipment, to identify the root cause of equipment failure; this approach requires expertise with the equipment under analysis. If the equipment that failed had embedded telemetry, prognostic algorithms can instead be used to identify deterministic behavior in completely normal-appearing data from fully functional equipment. This behavior identifies which equipment will fail within one year of use, and the algorithms can also identify when the deterministic behavior first appeared for any equipment failure.
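As a minimal illustration of the kind of screening the abstract describes, the sketch below flags telemetry channels whose otherwise normal-looking readings contain a persistent deterministic trend. This is not the paper's algorithm: the drift-to-noise score, the threshold of 3.0, and the channel names are illustrative assumptions.

```python
import numpy as np

def drift_score(samples: np.ndarray) -> float:
    """Fit a line to the window; return the total fitted drift expressed
    in units of the residual noise level (a crude trend detector)."""
    t = np.arange(len(samples))
    coef = np.polyfit(t, samples, 1)            # [slope, intercept]
    resid = samples - np.polyval(coef, t)       # de-trended noise
    sigma = resid.std() or 1.0                  # guard against zero variance
    return abs(coef[0]) * len(samples) / sigma  # drift-to-noise ratio

def flag_suspect_channels(telemetry: dict, threshold: float = 3.0) -> list:
    """Channels whose drift-to-noise ratio exceeds the (assumed) threshold."""
    return [name for name, data in telemetry.items()
            if drift_score(np.asarray(data, dtype=float)) > threshold]

# Toy run: a stationary channel and one with a slow two-unit drift.
rng = np.random.default_rng(0)
healthy = rng.normal(28.0, 0.5, 500)
degrading = rng.normal(28.0, 0.5, 500) + np.linspace(0.0, 2.0, 500)
print(flag_suspect_channels({"battery_temp_a": healthy,
                             "battery_temp_b": degrading}))  # ['battery_temp_b']
```

Both channels stay inside normal operating limits for the whole window; only the trend test separates them, which is the sense in which prognostic screening can flag "completely normal-appearing" data.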
Sponsors
International Foundation for Telemetering
ISSN
0884-5123
0074-9079
Additional Links
http://www.telemetry.org/
Related items
Showing items related by title, author, creator and subject.
-
Stability analysis of wedge type rock slope failures
Sublette, William Robert, 1944- (The University of Arizona., 1976)
-
Acetazolamide Therapy in Patients with Heart Failure: A Meta-Analysis
Wongboonsin, Janewit; Thongprayoon, Charat; Bathini, Tarun; Ungprasert, Patompong; Aeddula, Narothama Reddy; Mao, Michael A; Cheungpasitporn, Wisit; Univ Arizona, Dept Internal Med (MDPI, 2019-03-12)
Nine studies (three randomized controlled trials and six cohort studies) with a total of 229 HF patients were enrolled. After acetazolamide treatment, there were significant decreases in serum pH (mean difference (MD) of -0.04 (95% CI, -0.06 to -0.02)), pCO₂ (MD of -2.06 mmHg (95% CI, -3.60 to -0.53 mmHg)), and serum bicarbonate levels (MD of -6.42 mmol/L (95% CI, -10.05 to -2.79 mmol/L)). When compared to a placebo, acetazolamide significantly increased natriuresis (standardized mean difference (SMD) of 0.67 (95% CI, 0.08 to 1.27)), and decreased the apnea-hypopnea index (AHI) (SMD of -1.06 (95% CI, -1.75 to -0.36)) and central apnea index (CAI) (SMD of -1.10 (95% CI, -1.80 to -0.40)). Egger's regression asymmetry tests revealed no publication bias, with p = 0.20, 0.75, and 0.59 for analysis of the changes in pH, pCO₂, and serum bicarbonate levels with use of acetazolamide in HF patients.
-
Analysis of Failures of Decoders for LDPC Codes
Chilappagari, Shashi Kiran; Vasic, Bane; Marcellin, Michael W.; Ryan, William E.; Lux, Klaus M. (The University of Arizona., 2008)
Ever since the publication of Shannon's seminal work in 1948, the search for capacity-achieving codes has led to many interesting discoveries in channel coding theory. Low-density parity-check (LDPC) codes, originally proposed in 1963, were largely forgotten and rediscovered recently. The significance of LDPC codes lies in their capacity-approaching performance even when decoded using low-complexity sub-optimal decoding algorithms. Iterative decoders are one such class of decoders that work on a graphical representation of a code known as the Tanner graph. Their properties have been well understood in the asymptotic limit of the code length going to infinity. However, the behavior of various decoders for a given finite-length code remains largely unknown.
An understanding of the failures of the decoders is vital for the error floor analysis of a given code. Broadly speaking, the error floor is the abrupt degradation in the frame error rate (FER) performance of a code in the high signal-to-noise-ratio domain. Since the error floor phenomenon manifests in regions not reachable by Monte-Carlo simulations, analytical methods are necessary for characterizing the decoding failures. In this work, we consider hard-decision decoders for transmission over the binary symmetric channel (BSC).
For column-weight-three codes, we provide tight upper and lower bounds on the guaranteed error correction capability of a code under the Gallager A algorithm by studying combinatorial objects known as trapping sets. For higher column-weight codes, we establish bounds on the minimum number of variable nodes that achieve certain expansion as a function of the girth of the underlying Tanner graph, thereby obtaining lower bounds on the guaranteed error correction capability. We explore the relationship between a class of graphs known as cage graphs and trapping sets to establish upper bounds on the error correction capability.
We also propose an algorithm to identify the most probable noise configurations, also known as instantons, that lead to the error floor for linear programming (LP) decoding over the BSC. With the insight gained from the above analysis techniques, we propose novel code construction techniques that result in codes with superior error floor performance.
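The hard-decision message passing this related thesis analyzes can be made concrete with a small example. The sketch below is a generic decoder in the spirit of the Gallager A algorithm over the BSC; it is an illustrative assumption rather than the thesis's construction, and the majority-vote decision rule and the toy repetition-code parity-check matrix H are simplifications.

```python
import numpy as np

def gallager_a(H: np.ndarray, y: np.ndarray, max_iters: int = 20) -> np.ndarray:
    """Hard-decision message passing on the Tanner graph of H (sketch).

    Variable nodes forward the channel bit unless every other incoming
    check message disagrees; the tentative decision is a majority vote.
    """
    m, n = H.shape
    # variable-to-check messages, initialized to the channel output
    v2c = {(j, i): int(y[i]) for j in range(m) for i in np.flatnonzero(H[j])}
    x = y.copy()
    for _ in range(max_iters):
        # check-to-variable: extrinsic XOR of the other incoming messages
        c2v = {}
        for j in range(m):
            idx = np.flatnonzero(H[j])
            total = sum(v2c[(j, i)] for i in idx) % 2
            for i in idx:
                c2v[(j, i)] = (total - v2c[(j, i)]) % 2
        # variable-to-check: send y[i] unless all other checks disagree
        for j in range(m):
            for i in np.flatnonzero(H[j]):
                others = [c2v[(k, i)] for k in np.flatnonzero(H[:, i]) if k != j]
                flip = bool(others) and all(o != y[i] for o in others)
                v2c[(j, i)] = int(1 - y[i]) if flip else int(y[i])
        # tentative decision: majority vote of the incoming check messages
        for i in range(n):
            votes = [c2v[(k, i)] for k in np.flatnonzero(H[:, i])]
            x[i] = 1 - y[i] if sum(o != y[i] for o in votes) > len(votes) / 2 else y[i]
        if not ((H @ x) % 2).any():
            break  # all parity checks satisfied
    return x

# Toy run: length-3 repetition code, single bit error on the BSC.
H = np.array([[1, 1, 0], [0, 1, 1]])
print(gallager_a(H, np.array([1, 0, 0])))  # -> [0 0 0]
```

On this toy run the single flipped bit is corrected in one iteration; the thesis's trapping-set analysis characterizes the error patterns for which such decoders fail even though the patterns have low weight.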