SERTA (SER Technology Analyzer) enables fast characterization of the raw failure rates of current and future technologies.
ReDO optimizes your system to match your reliability constraints.
SyRA automates reliability analysis of complex electronic systems by means of component-based statistical reliability models.
Partner UOA will present two papers at the 2016 IEEE International Symposium on Performance Analysis of Systems and Software (ISPASS), April 17-19, 2016, Uppsala, Sweden: “Anatomy of Microarchitecture-Level Reliability Assessment: Throughput and Accuracy” and “GUFI: a Framework for GPUs Reliability Assessment”.
Partner POLITO will present a paper at the 2nd Workshop On Approximate Computing (WAPCO 2016), held in conjunction with HiPEAC 2016 in Prague, 18-20 January 2016, on “Early Component-Based System Reliability Analysis for Approximate Computing Systems”.
With the proliferation of integrated circuits implemented in the most advanced process technologies, there is a growing need to jointly analyze the effects of multiple sources of failure, including variability and aging, and to understand, early in the design cycle, their impact on system reliability.
Today, conservative margins are required to ensure devices operate correctly over their full lifetime, despite the impact of aging effects such as bias temperature instability (BTI) and hot carrier injection (HCI) and failure mechanisms such as electromigration (EM). New methodologies for improved cross-layer modeling and mitigation, if planned early in the design of a product, have the potential to remove unnecessary conservatism, reduce power and cost, and improve yield.
This workshop focuses on sharing new research on techniques and methodologies for modeling the effects of failures due to transistor aging, variability, and other mechanisms, all the way from the cell level to the system level. New approaches to early estimation of system reliability are much needed to enable cost-effective designs jointly optimized for reliability, power consumption, and cost.
Research in the last few years has focused on approximate computing as a means to overcome the energy scaling barrier of computer systems. Energy savings can be achieved by exploiting the inherent error resilience of algorithms in many application domains, such as signal processing, multimedia, data analytics, and computational engineering. Indeed, fully accurate arithmetic in specific phases of a computation in those applications may have only a marginal effect on output quality, especially if combined with error correction frameworks such as iterative refinement. Thus, accurate execution may be traded for lower energy consumption, for example by scaling the supply voltage below nominal values or by using lower-precision arithmetic (e.g., 8- or 16-bit), at the cost of some reduction in output quality.
Rather than focusing on a single layer, designing such systems in a general-purpose computing environment requires a holistic view of all layers, from algorithms, programming models, system software, and hardware down to the transistor level. This half-day workshop is an interdisciplinary effort to bring together researchers from mathematics, computer science, and computer and electrical engineering to discuss the challenges, risks, and opportunities of approximate computing across all design layers. Papers will not be published in proceedings, so submitting to WAPCO will not preclude future publication opportunities. We are soliciting original papers on topics that include but are not limited to the following:
Partner UoA will present a paper at the IEEE Symposium on Defect and Fault Tolerance in VLSI and Nanotechnology Systems (DFTS) 2015 on “Accelerated Microarchitectural Fault Injection-Based Reliability Assessment”.