The growing number of fine-granular data streams opens up new opportunities for improved risk analysis,
situation and evolution monitoring as well as event detection. However, there are still some major roadblocks
for leveraging the full potential of data stream processing, as would be needed, for example, for the highly
relevant systemic risk analysis in the financial domain. The QualiMaster project will address those roadblocks
by developing novel approaches for autonomously dealing with load and need changes in large-scale data
stream processing, while opportunistically exploiting the available resources for increasing analysis depth
whenever possible. For this purpose, the QualiMaster infrastructure will enable autonomous proactive,
reflective and cross-pipeline adaptation, in addition to the more traditional reactive adaptation. Starting from
configurable stream processing pipelines, adaptation will be based on quality-aware component description,
pipeline optimization and the systematic exploitation of families of approximate algorithms with different
quality/performance tradeoffs. However, adaptation will not be restricted to the software level alone: we will
go a level further by investigating the systematic translation of stream processing algorithms into code for
reconfigurable hardware and the synergistic exploitation of such hardware-based processing in adaptive,
high-performance, large-scale data processing. The project focuses on financial analysis based on combining financial
data streams and social web data, especially for systemic risk analysis. Our user-driven approach involves
two SMEs from the financial sector. Rigorous evaluation with real-world data loads from the financial domain,
enriched with relevant social Web content, will further demonstrate the applicability of QualiMaster results.