Using Data-Driven Prognostic Algorithms for Completing Independent Failure Analysis
Keywords: Independent Failure Analysis; Identification and Recovery
Rights: Copyright © held by the author; distribution rights held by the International Foundation for Telemetering
Collection Information: Proceedings from the International Telemetering Conference are made available by the International Foundation for Telemetering and the University of Arizona Libraries. Visit http://www.telemetry.org/index.php/contact-us if you have questions about items in this collection.
Abstract: Current failure analysis practices rely on diagnostic technology developed over the past 100 years of designing and manufacturing electrical and mechanical equipment to identify the root cause of an equipment failure, and they require expertise with the equipment under analysis. If the failed equipment has embedded telemetry, prognostic algorithms can instead be used to identify deterministic behavior in completely normal-appearing data from fully functional equipment. This deterministic behavior identifies which equipment will fail within one year of use and, for any equipment failure, when the deterministic behavior first appeared.
Sponsors: International Foundation for Telemetering
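One way to picture the kind of screen the abstract describes is a trend test on telemetry that never leaves its normal limits: a series can look healthy point by point while carrying a small deterministic drift. The sketch below is our illustration, not the author's actual algorithm; the function name, the simulated series, and the flag threshold are all assumptions. It fits a least-squares slope to a telemetry series and flags the series when that slope is statistically significant.

```python
import math
import random

def trend_statistic(samples):
    """Least-squares slope of a telemetry series and a t-like statistic.

    A fully in-limits series with a statistically significant slope is
    the kind of 'deterministic behavior' a prognostic screen looks for.
    """
    n = len(samples)
    xs = range(n)
    mean_x = (n - 1) / 2.0
    mean_y = sum(samples) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, samples))
    slope = sxy / sxx
    # Residual standard error of the fitted slope.
    resid = [y - (mean_y + slope * (x - mean_x)) for x, y in zip(xs, samples)]
    se = math.sqrt(sum(r * r for r in resid) / (n - 2) / sxx)
    return slope, (slope / se if se > 0 else float("inf"))

random.seed(0)
healthy = [random.gauss(5.0, 0.05) for _ in range(200)]               # flat noise
drifting = [random.gauss(5.0 + 0.001 * t, 0.05) for t in range(200)]  # hidden drift

for name, series in (("healthy", healthy), ("drifting", drifting)):
    slope, t_stat = trend_statistic(series)
    verdict = "deterministic" if abs(t_stat) > 4 else "normal"
    print(f"{name}: t-statistic = {t_stat:.1f} -> {verdict}")
```

Both series stay near 5.0 and would pass a conventional limit check, but the drifting series produces a large trend statistic and is flagged.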
Related items (by title, author, creator, and subject):
EFFICIENT METHODS FOR MECHANICAL AND STRUCTURAL RELIABILITY ANALYSIS AND DESIGN (SAFETY-INDEX, FATIGUE, FAILURE). Wu, Yih-Tsuen (The University of Arizona, 1984). Three fundamental problems of mechanical reliability are addressed: (1) computing the probability of failure, p(f), of a component having design factors with known statistical distributions and a limit state with a closed-form algebraic expression; (2) computing the probability of failure of a component having design factors with known distributions and a limit state which can only be expressed by a computer algorithm; and (3) deriving safety-check expressions in a "design by reliability" approach. An algorithm for generating estimates of p(f) is presented. The method is an extension of, and is demonstrated to be a significant improvement over, the widely used Rackwitz-Fiessler (R-F) method, a fast and efficient numerical method for performing reliability analysis. Comparisons were made for numerous examples; the error in p(f) using the proposed method is typically about half of the error in R-F estimates. A method is proposed for computing p(f) when the relationship between design factors can be defined only by a computer algorithm, e.g., finite element analysis. A second-order polynomial is constructed, using a simple curve-fitting routine, to approximate the limit state in the neighborhood of the design point (a point close to the most likely value of the design variables at failure); the R-F method can then be applied easily. This scheme is demonstrated to be much faster than the Monte Carlo method at producing reasonable estimates of p(f). Methods of deriving safety-check expressions for design codes and design criteria documents are also studied: a Level I format employing partial safety factors derived from Level II methods is used to construct safety-check expressions suitable for code development. The procedures are demonstrated using numerous examples, including problems where the limit states are complicated, i.e., not explicitly defined.
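As a concrete illustration of the first problem in the abstract above (computing p(f) for a closed-form limit state), the sketch below compares the exact safety-index answer with a crude Monte Carlo estimate for an assumed limit state g = R − S with independent normal resistance R and load S. The distributions, names, and sample size are our illustrative assumptions, not examples from the dissertation. For independent normals the exact failure probability is Φ(−β) with β = (μ_R − μ_S)/√(σ_R² + σ_S²), which is what a safety-index (Level II) method returns directly.

```python
import math
import random

# Illustrative limit state g(R, S) = R - S: failure when load S exceeds
# resistance R. Distribution parameters are assumed for this sketch.
MU_R, SIG_R = 10.0, 1.0   # resistance, normal
MU_S, SIG_S = 6.0, 1.5    # load, normal

# Safety index beta and exact p(f) for independent normals.
beta = (MU_R - MU_S) / math.hypot(SIG_R, SIG_S)
pf_exact = 0.5 * math.erfc(beta / math.sqrt(2))  # Phi(-beta)

# Crude Monte Carlo estimate for comparison -- the slow baseline the
# dissertation's curve-fitting scheme is measured against.
random.seed(1)
n = 200_000
failures = sum(
    1 for _ in range(n)
    if random.gauss(MU_R, SIG_R) - random.gauss(MU_S, SIG_S) < 0
)
pf_mc = failures / n

print(f"beta = {beta:.3f}, exact p(f) = {pf_exact:.5f}, Monte Carlo p(f) = {pf_mc:.5f}")
```

The Monte Carlo estimate converges to the exact value only slowly (error shrinks as 1/√n), which is why closed-form and approximation-based reliability methods are attractive.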
Analysis of Failures of Decoders for LDPC Codes. Chilappagari, Shashi Kiran (The University of Arizona, 2008). Ever since the publication of Shannon's seminal work in 1948, the search for capacity-achieving codes has led to many interesting discoveries in channel coding theory. Low-density parity-check (LDPC) codes, originally proposed in 1963, were largely forgotten and only recently rediscovered. The significance of LDPC codes lies in their capacity-approaching performance even when decoded using low-complexity, sub-optimal decoding algorithms. Iterative decoders are one such class of decoders; they work on a graphical representation of a code known as the Tanner graph. Their properties are well understood in the asymptotic limit of the code length going to infinity; however, the behavior of various decoders for a given finite-length code remains largely unknown. An understanding of decoder failures is vital for the error-floor analysis of a given code. Broadly speaking, the error floor is the abrupt degradation in the frame-error-rate (FER) performance of a code in the high signal-to-noise-ratio domain. Since the error-floor phenomenon manifests in regions not reachable by Monte Carlo simulations, analytical methods are necessary for characterizing the decoding failures. In this work, we consider hard-decision decoders for transmission over the binary symmetric channel (BSC). For column-weight-three codes, we provide tight upper and lower bounds on the guaranteed error-correction capability of a code under the Gallager A algorithm by studying combinatorial objects known as trapping sets. For higher-column-weight codes, we establish bounds on the minimum number of variable nodes that achieve certain expansion as a function of the girth of the underlying Tanner graph, thereby obtaining lower bounds on the guaranteed error-correction capability. We explore the relationship between a class of graphs known as cage graphs and trapping sets to establish upper bounds on the error-correction capability. We also propose an algorithm to identify the most probable noise configurations, also known as instantons, that lead to the error floor for linear-programming (LP) decoding over the BSC. With the insight gained from the above analysis techniques, we propose novel code-construction techniques that result in codes with superior error-floor performance.
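The hard-decision iterative decoding that the abstract above analyzes can be illustrated with a minimal bit-flipping decoder. The sketch below is our illustration, not anything from the dissertation: it uses the (7,4) Hamming parity-check matrix as a toy stand-in for an LDPC code, the function name and iteration cap are assumptions, and it implements the Gallager-style bit-flipping idea rather than the Gallager A message-passing algorithm itself.

```python
# Parity-check matrix of the (7,4) Hamming code -- a toy stand-in for an
# LDPC code, chosen only to keep the example small.
H = [
    [1, 0, 1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

def bit_flip_decode(word, h, max_iters=20):
    """Hard-decision bit-flipping decoding over the BSC.

    Each iteration flips the bit involved in the most unsatisfied
    parity checks; decoding stops when every check is satisfied.
    """
    word = list(word)
    for _ in range(max_iters):
        syndrome = [sum(hij * wj for hij, wj in zip(row, word)) % 2 for row in h]
        if not any(syndrome):
            return word  # valid codeword reached
        # Count unsatisfied checks touching each bit.
        counts = [
            sum(syndrome[i] for i in range(len(h)) if h[i][j])
            for j in range(len(word))
        ]
        word[counts.index(max(counts))] ^= 1  # flip the most-suspect bit
    return word  # decoder failure: some checks still unsatisfied

# A single BSC error on the all-zero codeword is corrected:
received = [0, 0, 1, 0, 0, 0, 0]
print(bit_flip_decode(received, H))  # -> [0, 0, 0, 0, 0, 0, 0]
```

Decoder failures of this kind (received patterns the flipping rule cannot resolve) are exactly what trapping-set analysis characterizes for real LDPC codes.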
Using Generic Telemetry Prognostic Algorithms for Launch Vehicle and Spacecraft Independent Failure Analysis Service. Losik, Len; Failure Analysis (International Foundation for Telemetering, 2010-10). Current failure analysis practices rely on diagnostic technology developed over the past 100 years of designing and manufacturing electrical and mechanical equipment to identify the root cause of an equipment failure, and they require expertise with the equipment under analysis. If the failed equipment has embedded telemetry, prognostic algorithms can instead be used to identify deterministic behavior in completely normal-appearing data from fully functional equipment. This deterministic behavior identifies which equipment will fail within one year of use and, for any equipment failure, when the deterministic behavior first appeared.