Research Intern for Discussion Toxicity Prediction (within International Research Project)

Our vision is to make Slovakia a great research and innovation space. We work hard to motivate talents to return home, to attract world-class researchers to join our team, and to stimulate knowledge sharing both nationally and internationally through the circulation of “brains”.

We are a nonprofit organization that brings together experts in AI and other areas of computer science, dedicating our time and effort to intelligent technology research.

Job description

We are looking to reinforce our team working on the international project VIGILANT (Vital Intelligence to Investigate Illegal Disinformation), a Horizon Europe project that applies data collection and AI-based web content analysis methods to support police authorities in investigating disinformation-related crime. In this project, KInIT collaborates with research institutions, software companies, police authorities and training institutions from across Europe.

The intern will work on a specific task: discussion toxicity prediction on social media. The goal is to create a machine-learned model that predicts the scale and intensity of toxicity of the discussion that emerges under a social media post, based solely on the content of that post. In other words, the model would measure how provocative (toxicity-inducing) a given post is. The purpose is very practical: when sourcing data for disinformation or hate speech analysis, social media platforms often limit their APIs to a certain number of requests. Predicting toxicity would allow us to focus the precious API call quota on potentially interesting content.
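To make the task concrete, the idea can be sketched as a text regression problem: given a post, predict the average toxicity score of the discussion it provoked. The snippet below is a minimal illustrative baseline only, with a hypothetical vocabulary and made-up training data; an actual solution in the project would rely on richer features (e.g. neural text representations) and scores aggregated from annotated discussion threads.

```python
from collections import Counter

def features(post: str, vocab: list[str]) -> list[float]:
    """Bag-of-words counts over a fixed vocabulary, plus a leading bias term."""
    counts = Counter(post.lower().split())
    return [1.0] + [float(counts[w]) for w in vocab]

def train(posts, scores, vocab, lr=0.1, epochs=200):
    """Least-squares regression fitted by stochastic gradient descent (toy scale)."""
    w = [0.0] * (len(vocab) + 1)
    for _ in range(epochs):
        for post, y in zip(posts, scores):
            x = features(post, vocab)
            err = sum(wi * xi for wi, xi in zip(w, x)) - y
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
    return w

def predict(post, w, vocab):
    """Predicted toxicity intensity of the discussion under a post."""
    x = features(post, vocab)
    return sum(wi * xi for wi, xi in zip(w, x))

# Hypothetical training data: posts paired with the average toxicity (0-1)
# of the discussions they provoked.
vocab = ["always", "lie", "nice", "weather"]
posts = ["they always lie to us", "nice weather today"]
scores = [0.9, 0.1]
w = train(posts, scores, vocab)
```

A trained model of this kind could then rank unseen posts, so that limited API calls are spent first on posts with the highest predicted toxicity.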

Your responsibilities will include:

  • Creation of AI models and methods, statistical analysis, machine learning
  • Data acquisition (including cleaning and preprocessing)
  • Data annotation and organization of annotation activities
  • Application of machine-learned models to practical problem solving
  • Experimental software development
  • Assisting with execution of research studies

Employee perks, benefits of the internship:

  • Improve your data analysis, machine learning and programming skills.
  • Experience best practices of research and project work.
  • Opportunity to work on the design of research studies.
  • Learn about disinformation ecosystems and methods for their tackling.
  • Opportunity to meet the Slovak and European elites in AI and disinformation tackling.
  • Work with a very friendly team.


Beyond the VIGILANT project, tackling toxic content is one of KInIT’s core application domains. Presently, KInIT is a partner in multiple large grant projects, especially in the Horizon Europe programme, where we are teamed up with first-class European research institutions.


Learn to tackle disinformation the AI way!

Jakub Šimko

Lead and Researcher, Hiring manager

Skills

  • Slovak 100%
  • English 80%

Personality and profile requirements

  • You should have some experience with programming (e.g. several university courses).
  • You should have some experience with statistics (and math in general).
  • You should have some experience with machine learning basics.
  • Further nice-to-have skills (not required; you will learn them at KInIT):
      • Automatic data collection from the web (scraping, APIs)
      • Software development and deployment (including of machine-learned models)


Does this internship sound attractive to you? Apply here with your CV and cover letter.

By hitting “Apply”, you agree that we process your personal data for recruitment purposes, based on the legal basis of GDPR Art. 6 (1) letter (a), consent to the processing of personal data. You can withdraw your consent at any time. For more information, please read our Privacy Policy.