
MAST@RE’19 – Best Poster and Tool Demo award, paper presentations, workshop organization

Highlights

  • Six MAST members attended RE’19
  • 1 full paper, 1 workshop paper, 2 Poster and Tool demos
  • AffectRE workshop organization
  • Best Poster and Tool Demo award

Paper Presentations

Six members of the MAST team attended the 27th IEEE International Requirements Engineering Conference (RE’19, Sept. 23–27) on Jeju Island, South Korea.

We presented 1 full conference paper, 1 workshop paper, and 2 Poster and Tool demos:

  • Extracting and Analyzing Context Information in User-Support Conversations on Twitter, Daniel Martens and Walid Maalej. [preprint]
  • Classifying Multilingual User Feedback using Traditional Machine Learning and Deep Learning, Christoph Stanik, Marlo Haering, and Walid Maalej. [preprint]
  • Requirements Intelligence with OpenReq Analytics, Christoph Stanik and Walid Maalej. [preprint] [poster] [video]
  • OpenReq Issue Link Map: A Tool to Visualize Issue Links in Jira, Clara Marie Lüders, Mikko Raatikainen, Joaquim Motger, and Walid Maalej. [preprint] [poster] [video]

Workshop Organization

Davide Fucci and Walid Maalej from MAST, together with Simone Kühn from the Max Planck Institute for Human Development, organized the AffectRE workshop.

The topic of the workshop can be summarized as follows: since software systems are designed and used by human beings, who are characterized by emotions, Affective Computing studies the development of software systems and devices that can recognize, interpret, process, and exploit human affects, feelings, emotions, attitudes, and personalities.

The program included a keynote by Jan Ole Johannsen from the Technische Universität München, as well as presentations of the five accepted papers.

Best Poster and Tool Demo Award

We are proud to have received the RE’19 Best Poster and Tool Demo award for our contribution Requirements Intelligence with OpenReq Analytics.
The tool is being developed in the Horizon 2020 project OpenReq. It supports requirements engineering tasks by analyzing user feedback from social media and app stores. Currently, it provides three distinct views: a dashboard that reports the general health of a software product or service, focus views for filtering and reading user feedback, and a competitor comparison. The linked video briefly introduces the tool, and the linked poster summarizes the contribution.

Further Impressions