Winter Term 2013/14

We present different topics of our research every Wednesday at 11:30 am in room 305, Obermarkt 17. For news about the EAD-Lunch talks and seminars, please feel free to subscribe to EAD-Public@googlegroups.com. (Register here: https://groups.google.com/forum/?hl=de&fromgroups#!forum/ead-public)

29.01.2014 Waheed Aslam Ghumman "Automation of the SLA Life Cycle in Cloud Computing"

contact: w.ghumman(at)hszg.de

Cloud computing has emerged as a popular paradigm for scalable infrastructure solutions and services. The need for automated management of Service Level Agreements (SLAs) between cloud service providers and cloud users has grown in order to minimize user interaction with the computing environment. Thus, effective SLA negotiation, monitoring and timely detection of possible SLA violations represent challenging research issues. A big gap exists between a manual/semi-automated and a fully automated SLA life cycle. This gap can be bridged by formalizing the natural-language SLAs that are common today. Algorithms and strategies for SLA monitoring, management and violation detection depend directly on a complete formalization of SLAs. The goal of the thesis is to analyze currently existing SLA description languages, to identify their shortcomings and to develop a complete SLA description language. As a next step, we plan to develop distributed algorithms for automated SLA negotiation, monitoring, integration and timely detection of SLA violations in cloud computing.
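
A formalized SLA could, for instance, be represented as machine-readable objectives that a monitor checks against measured values. The following Python sketch is purely illustrative; the class and field names are hypothetical and not taken from any existing SLA description language.

    from dataclasses import dataclass

    @dataclass
    class SLAObjective:
        """One hypothetical, machine-readable SLA term (illustrative only)."""
        metric: str       # e.g. "availability"
        comparator: str   # ">=" or "<="
        threshold: float  # agreed target value
        unit: str         # e.g. "percent"

        def is_violated(self, measured: float) -> bool:
            """Return True if the measured value breaks the agreed threshold."""
            if self.comparator == ">=":
                return measured < self.threshold
            if self.comparator == "<=":
                return measured > self.threshold
            raise ValueError(f"unsupported comparator: {self.comparator}")

    # Example: availability must stay at or above 99.9 percent.
    availability = SLAObjective("availability", ">=", 99.9, "percent")
    print(availability.is_violated(99.5))  # True -> potential SLA violation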

29.01.2014 The EAD group is hosting the first Java User Group meeting in 2014; Max Wielsch will talk about "Tool Integration in the software development process"

contact: max.wielsch(at)googlemail.com

22.01.2014 Marc Ritter "Generic Video Analysis at Chemnitz University of Technology"

contact: marc.ritter(at)informatik.tu-chemnitz.de

Since 2007, the Chair of Media Informatics at Chemnitz University of Technology has been cooperating with companies operating in the media sector, with the research objective of developing solutions and strategies for several tasks and problems resulting from the current turmoil in media technology. We focus on prevailing problems of local and regional television operators, such as improving platforms for joint production and archiving processes or switching from analog to digital TV services. This presentation gives insights into the outcomes of six years in the field of metadata extraction by video analysis and introduces a holistic, unified and generic research framework that is capable of providing arbitrary, application-dependent, multi-threaded custom processing chains for workflows in the area of image processing. This comprises methods such as shot boundary detection and shot composition analysis to structurally divide a video into disjoint parts and to reduce data by creating representative keyframes at scene level. Techniques from machine learning and statistical pattern recognition are utilized for content-based analysis to reliably detect and recognize faces and other objects. To enhance and speed up the development of algorithms, a graphical user interface connects to the framework and allows for model-driven and semi-automated (ground truth) annotation as well as training, visualization and evaluation. The heavy use of plug-in concepts and design patterns throughout the framework guarantees lasting maintainability and extensibility while being flexible enough to adapt to new demands and fields of application that are not necessarily restricted to computer vision.
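
As a rough illustration of one of the mentioned techniques: shot boundary detection is often based on comparing intensity histograms of consecutive frames, where a large histogram difference suggests a cut. The sketch below assumes grayscale frames given as NumPy arrays and uses a hand-picked threshold; both are my assumptions for illustration, not details of the Chemnitz framework.

    import numpy as np

    def shot_boundaries(frames, bins=64, threshold=0.4):
        """Return indices where a cut between frame i-1 and frame i is likely.

        frames: iterable of 2-D grayscale arrays with values in [0, 255].
        The threshold is an illustrative, hand-picked value.
        """
        cuts = []
        prev_hist = None
        for i, frame in enumerate(frames):
            hist, _ = np.histogram(frame, bins=bins, range=(0, 255))
            hist = hist / max(hist.sum(), 1)  # normalize to a distribution
            if prev_hist is not None:
                # L1 distance between consecutive histograms, in [0, 2]
                diff = np.abs(hist - prev_hist).sum()
                if diff > threshold:
                    cuts.append(i)
            prev_hist = hist
        return cuts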

15.01.2014 Andreas Schulz "Web Data Extraction"

contact: a.schulz(at)hszg.de

A mammoth amount of data is available via the World Wide Web. To leverage the potential behind this data, we need to access it and make it digestible. Web Data Extraction aims to enable the extraction and use of this vast amount of data lying around on the internet and to make the immanent information and knowledge usable. Because of the evolving nature of the Web, the challenge is to find robust and adaptable methods that enable us to keep pace with this constant change. In addition, the integration of semantics and NLP promises to improve this process, while the Deep Web hides a plethora of additional data and poses additional difficulties in accessing it. This talk gives a short overview of the field, locates its various concepts and gives an insight into possible technologies and applications.
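
A minimal, hedged example of wrapper-based extraction: a page is fetched and structured values are pulled out with CSS selectors. The URL and the selector below are placeholders; real extractors additionally need the robustness against layout changes discussed in the talk.

    import requests
    from bs4 import BeautifulSoup

    def extract_titles(url, selector="h2.title"):
        """Fetch a page and return the text of all elements matching the selector.

        Both the URL passed in and the default CSS selector are hypothetical placeholders.
        """
        response = requests.get(url, timeout=10)
        response.raise_for_status()
        soup = BeautifulSoup(response.text, "html.parser")
        return [element.get_text(strip=True) for element in soup.select(selector)]

    # titles = extract_titles("https://example.org/articles")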

13.11.2013 Max Wielsch & Raik Bieniek "Efficient Tool Integration with OpenDIP"

contact: max.wielsch(at)googlemail.com

The software development process and its supporting tools have been subject to change in recent years. Today, agile process models are becoming increasingly established, and tool vendors are providing tool suites and platforms that achieve better integration. Is this level of integration sufficient to realize tool environments that are perfectly adapted to individual projects? Two master's theses and a scientific paper have presented further use cases that imply the need for an enhancement of current integration technology. In a short presentation and discussion with Max Wielsch and Raik Bieniek from Saxonia Systems AG, we consider the efficient and easy development of new tools based on the idea of an open development integration platform (OpenDIP).

06.11.2013 Daniel Tasche "Sustainable Sensing - Design of a Remote Monitoring System"

contact: d.tasche(at)hszg.de

A possible contribution to achieving the ambitious climate protection goals of the federal government, especially with regard to the power supply, is saving energy through advanced technologies and modern methods. Municipalities and counties own a large number of buildings, and there is in fact significant energy-saving potential in intelligent building management. The first step to identifying such potentials is the measurement of energy consumption data within the building. If this data is visualized and compared with expected values, shortcomings and deficits can be detected effectively. Usually, a municipality has no resources to monitor all buildings on site. For this purpose, a system for automated sensor data acquisition and web-based visualization was designed, and a prototypical service-oriented architecture was implemented.
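
The comparison of measured consumption against expected values mentioned above could look as simple as the following sketch; the reading format and the tolerance are assumptions made for illustration, not the design of the implemented system.

    from dataclasses import dataclass

    @dataclass
    class Reading:
        building_id: str
        timestamp: str     # ISO 8601, e.g. "2013-11-06T11:30:00"
        kwh: float         # measured consumption in the interval

    def flag_deviations(readings, expected_kwh, tolerance=0.15):
        """Return readings whose consumption deviates from the expected value
        by more than the given relative tolerance (15% here, an assumed value)."""
        flagged = []
        for reading in readings:
            expected = expected_kwh[reading.building_id]
            if abs(reading.kwh - expected) > tolerance * expected:
                flagged.append(reading)
        return flagged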

23.10.2013 Markus Ullrich "Current Challenges and Approaches for Resource Demand Estimation in the Cloud"

contact: mullrich(at)hszg.de

The increasing popularity of Cloud computing, especially for high performance computing (HPC) applications, offers a huge potential for optimizing the consumption of compute resources. Since hybrid Cloud platforms in particular offer the best balance between data security, performance, business agility and mobile support, they are used more and more frequently. In this work, we highlight the most important challenges that arise for resource demand estimation systems, especially in public and hybrid Cloud environments. We present existing approaches, separated into load balancing (single resource type) systems and Cloud or virtual machine (VM) type selection (multiple resource type) systems. The approaches are analyzed with respect to different aspects, including their potential to overcome the presented challenges and their applicability in different Cloud environments. Our research reveals that not all of the issues have been resolved yet, but the means to achieve this are available. We conclude our work with useful suggestions that can help to overcome the remaining challenges.
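
A very simple single-resource estimator of the kind surveyed in this line of work is an exponentially weighted moving average over recent utilization samples; the smoothing factor below is an arbitrary illustrative choice, not a recommendation from the paper.

    def ewma_demand_estimate(samples, alpha=0.3):
        """Predict the next resource demand (e.g. CPU utilization in percent)
        from past samples using an exponentially weighted moving average.

        alpha is a smoothing factor in (0, 1]; 0.3 is an illustrative choice.
        """
        if not samples:
            raise ValueError("at least one sample is required")
        estimate = samples[0]
        for value in samples[1:]:
            estimate = alpha * value + (1 - alpha) * estimate
        return estimate

    # Example: recent CPU utilization samples of a VM, in percent.
    print(ewma_demand_estimate([40, 55, 60, 58, 70]))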

16.10.2013 Jens Heider "Scenario Analysis and Climate Action Planning"

contact: j.heider(at)hszg.de

There are a number of ambitious goals for reducing greenhouse gas emissions. Usually these are established by national or international institutions and governments, while the implementation is done at a local or municipal level. In order to evaluate and estimate the (long-term) impacts of appropriate measures, especially their effectiveness and efficiency, extensive strategic planning is required. For this purpose, scenario development and analysis has been established as part of the so-called "Climate-Action-Workshops" that support regional and municipal climate action management. The primary goal is to identify unused potentials and to evaluate appropriate mitigation measures. The talk discusses several common methods and algorithms used for the scenario technique and evaluates their potential implementation in software that supports the workshops.
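
The core of such a scenario calculation can be as plain as compounding an assumed annual reduction rate over a planning horizon; the numbers below are invented for illustration and deliberately ignore sector-specific detail.

    def project_emissions(base_emissions, annual_reduction, years):
        """Project yearly greenhouse gas emissions under a constant relative
        reduction rate (a deliberately simplified scenario model).

        base_emissions: emissions in the base year (e.g. tonnes CO2e).
        annual_reduction: relative reduction per year, e.g. 0.02 for 2%.
        """
        return [base_emissions * (1 - annual_reduction) ** year
                for year in range(years + 1)]

    # Compare two illustrative scenarios over 20 years.
    business_as_usual = project_emissions(100000, 0.005, 20)
    ambitious = project_emissions(100000, 0.03, 20)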

02.10.2013 Abhishek Awasthi "Common Due Date Problem: Exact Algorithms for a Given Job Sequence"

contact: aawasthi(at)hszg.de

This talk considers the Common Due Date scheduling problem (CDD), where all jobs have different processing times but the same due date. The CDD problem is encountered by major industries that have to process and supply a particular good just in time. We present exact polynomial-time algorithms for optimizing a given job sequence on single and parallel machines and prove optimality for the single machine case. Furthermore, we present results for benchmark problems for both the single and parallel machine cases and compare them with previous work.
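
To make the objective concrete: for a fixed job sequence and a common due date d, the cost of a schedule is the weighted sum of the earliness and tardiness of every job. The brute-force evaluation below only illustrates this objective (it tries aligning each job's completion with the due date, plus a start at time zero, and assumes unit penalty weights); it is not the exact polynomial algorithm presented in the talk.

    def sequence_cost(processing_times, due_date, start, alpha=1.0, beta=1.0):
        """Total earliness/tardiness penalty of a sequence processed without
        idle time, beginning at `start`. alpha and beta are the per-unit
        earliness and tardiness penalties (unit weights assumed here)."""
        cost, completion = 0.0, start
        for p in processing_times:
            completion += p
            cost += (alpha * max(due_date - completion, 0)
                     + beta * max(completion - due_date, 0))
        return cost

    def best_alignment_cost(processing_times, due_date):
        """Illustrative brute force: try starting at time 0 and every start that
        makes some job finish exactly at the due date, and keep the cheapest."""
        prefix, candidates = 0, [0.0]
        for p in processing_times:
            prefix += p
            start = due_date - prefix
            if start >= 0:
                candidates.append(float(start))
        return min(sequence_cost(processing_times, due_date, s)
                   for s in candidates)

    print(best_alignment_cost([4, 2, 3, 5], due_date=8))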