HR-Dashboard

Reading time approx. 6 minutes
30.06.2023

Every day, our Xperts use their skills to deliver the best possible products and results to our customers. Why not use these skills to improve our internal processes?

In our post from 02.05.2023, we already presented our first internal hackathon. Its result gives our sustainability guild a much more efficient way to determine our annual carbon footprint. The administrative departments of our company also have multifaceted tasks whose processes can be simplified or better visualised with the right tools. Currently, our data science teams are supporting our HR department in developing a dashboard that visualises the entire recruiting process.

Requirements

Due to the complexity of the recruiting process and the variety of tools involved, the necessary information is widely dispersed. The goal is a uniform, centralised representation of the entire process, from the receipt of an application to the first day in the company.

Collecting data from different sources, transforming it into a uniform and easily processable form, and presenting it graphically are core tasks of a data engineer or data scientist. This internal use case offers the ideal opportunity for our Data Engineering and Machine Learning teams to apply their skills away from project work and optimise our internal processes. This not only lets us keep a better eye on all current recruiting processes, but also uncovers hidden potential for improvement in the individual steps.

Starting point

The idea was born a few years ago, when it initially served as a project for an ambitious working student. The topic offered ample scope to engage intensively with requirements engineering, planning, implementation and the graphical presentation of results. The project was supported by one of our Data Engineer Xperts. In several meetings with the HR department, the exact requirements for the future tool and the most important KPIs were clarified together.

The most important KPIs in the recruiting process include time-to-hire (the time between receipt of an application and the signing of the contract), cost-per-hire (total recruiting costs divided by the number of new hires) and source-of-hire (which sourcing channels yield the best success rate).
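
To make these definitions concrete, here is a small worked example in Python. The figures and channel names are made up purely for illustration and are not taken from our actual recruiting data.

    # Worked example for the three KPIs on made-up hires: each record holds
    # the application date, the contract-signing date, the sourcing channel
    # and the recruiting cost attributed to that hire.
    from collections import Counter
    from datetime import date
    from statistics import mean

    hires = [  # illustrative data only
        {"applied": date(2023, 1, 10), "signed": date(2023, 2, 20), "source": "job board", "cost": 1800},
        {"applied": date(2023, 2, 1),  "signed": date(2023, 3, 15), "source": "referral",  "cost": 600},
        {"applied": date(2023, 3, 5),  "signed": date(2023, 4, 1),  "source": "referral",  "cost": 600},
    ]

    time_to_hire = mean((h["signed"] - h["applied"]).days for h in hires)  # ~36.7 days
    cost_per_hire = sum(h["cost"] for h in hires) / len(hires)             # 1000.0 per hire
    source_of_hire = Counter(h["source"] for h in hires)                   # referral: 2, job board: 1

    print(f"time-to-hire: {time_to_hire:.1f} days, cost-per-hire: {cost_per_hire:.2f}")
    print("source-of-hire:", source_of_hire.most_common())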

In the first step, only one of the tools used in the recruitment process, namely JIRA, was to be connected in order to create a Proof of Concept (POC). A POC is a scaled-down version of a product that serves as proof that the ideas discussed in theory can actually be implemented in practice and fulfil the desired goal. For the realisation, we chose the open source tool Apache NiFi. There, a workflow was configured that queried a REST interface of JIRA at regular intervals, bundled data packages from the responses, carried out certain transformations and then wrote the results to a PostgreSQL database.
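
For illustration, the following Python sketch mirrors what the NiFi workflow does in its extract step: it pages through JIRA's REST search endpoint and collects the raw issues. The base URL, project key and token are placeholders, not our actual configuration.

    # Sketch of the extract step: poll JIRA's REST search endpoint in pages
    # and collect the raw issue data. URL, project key and token are
    # placeholders for illustration only.
    import os
    import requests

    JIRA_BASE_URL = "https://jira.example.com"       # hypothetical instance
    JQL = "project = RECRUIT ORDER BY created DESC"  # hypothetical project key

    def fetch_recruiting_issues(batch_size: int = 50) -> list[dict]:
        """Page through the JIRA search API and return all matching issues."""
        issues, start_at = [], 0
        while True:
            response = requests.get(
                f"{JIRA_BASE_URL}/rest/api/2/search",
                params={"jql": JQL, "startAt": start_at, "maxResults": batch_size},
                headers={"Authorization": f"Bearer {os.environ['JIRA_TOKEN']}"},
                timeout=30,
            )
            response.raise_for_status()
            payload = response.json()
            issues.extend(payload["issues"])
            start_at += batch_size
            if start_at >= payload["total"]:
                return issues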

This is where the first ETL route was born. ETL is a central principle in the field of data engineering and stands for Extract - Transform - Load: data is extracted from a source, then transformed into the desired, easily usable form, before it is loaded into a designated data store. In our use case, the data covers everything about the advertised positions: the type and number of positions sought (development, UXD, Scrum Master, etc.), the current status of the process, the timestamps of all process steps, and much more.
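
A load step for such a route could look like the following sketch, which writes the transformed records into PostgreSQL via psycopg2. The table and column names are illustrative assumptions, not our actual schema.

    # Sketch of the load step: upsert transformed ticket records into a
    # PostgreSQL table. Table and column names are illustrative only.
    import psycopg2

    DDL = """
    CREATE TABLE IF NOT EXISTS recruiting_positions (
        ticket_key TEXT PRIMARY KEY,
        position   TEXT,         -- e.g. development, UXD, Scrum Master
        status     TEXT,         -- current step in the recruiting process
        created_at TIMESTAMPTZ,
        updated_at TIMESTAMPTZ
    );
    """

    def load(records: list[dict], dsn: str) -> None:
        """Write the transformed ticket records into the target table."""
        with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
            cur.execute(DDL)
            cur.executemany(
                """
                INSERT INTO recruiting_positions
                    (ticket_key, position, status, created_at, updated_at)
                VALUES (%(ticket_key)s, %(position)s, %(status)s,
                        %(created_at)s, %(updated_at)s)
                ON CONFLICT (ticket_key) DO UPDATE SET
                    position = EXCLUDED.position,
                    status = EXCLUDED.status,
                    updated_at = EXCLUDED.updated_at;
                """,
                records,
            )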

Some fields in the JIRA tickets can contain free-text entries, and for certain data the units have to be standardised. The data from the tickets is therefore not taken over as-is but, as already mentioned, transformed: it is either prepared so that it is optimally suited for the desired presentation, or shaped so that it remains easy to handle for future analyses. The prepared data sets were then accessed via Microsoft Power BI to provide an appealing visualisation for our HR team.
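
The following sketch shows what such a transformation might look like for a single ticket: free-text position names are mapped onto canonical categories and the timestamps are normalised to UTC. The field names and the mapping table are assumptions for illustration, not the real ticket layout.

    # Sketch of the transform step on a single raw JIRA issue: map free-text
    # position names onto canonical categories and normalise timestamps to
    # UTC. Field names and the mapping are illustrative assumptions.
    from datetime import datetime, timezone

    POSITION_MAP = {
        "software developer": "development",
        "ux designer": "UXD",
        "scrum master": "Scrum Master",
    }

    def transform(issue: dict) -> dict:
        """Turn one raw JIRA issue into a flat, uniform record."""
        fields = issue["fields"]
        raw_position = fields.get("summary", "").strip().lower()
        position = next(
            (canon for needle, canon in POSITION_MAP.items() if needle in raw_position),
            "other",
        )
        return {
            "ticket_key": issue["key"],
            "position": position,
            "status": fields["status"]["name"],
            # Python 3.11+ parses JIRA's ISO-8601 offsets directly
            "created_at": datetime.fromisoformat(fields["created"]).astimezone(timezone.utc),
            "updated_at": datetime.fromisoformat(fields["updated"]).astimezone(timezone.utc),
        }

Chained together, these three sketches trace the same extract-transform-load route that the NiFi workflow models graphically.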

Continuation

After a thorough evaluation of this POC by our data science teams, the idea was taken up again this year. In addition to JIRA, other tools used by HR are now to be connected, thus creating a holistic, central mapping of the recruiting process.

For this purpose, a new working student, under the guidance of our two data science team leads, is taking on the role of project manager and coordinating the implementation. Task packages are currently being put together, which will then be worked through piece by piece by the two specialist teams, Machine Learning and Data Engineering. As the implementation runs in parallel with ongoing project work and the participants cannot always work on the tasks at the same time, the work follows the Kanban principle rather than Scrum. This way, all employees can complete their tasks in a suitable time window and everyone keeps track of the current status. As with our hackathon, this gives our employees the opportunity to engage with methods and tools beyond their project topics that also lie in their areas of interest. They can also get a taste of other areas of responsibility, such as project management or the role of product owner.

Currently, our specialist teams are evaluating which software is best suited for the implementation of the dashboard. For further ETL work, Apache NiFi has so far convinced us with its clear graphical user interface and its easy-to-follow data flow. Individual process steps can be started and stopped independently of each other, so that the intermediate results - the transformed data records - can be viewed at any time. In addition, external data sources can be tapped easily and the transformed data written to a separate data store; Apache NiFi provides dedicated processors for this purpose, which can be configured individually. Important findings from the POC can be reused or expanded for the transformation steps that are still required.

Other challenges in the intended implementation are the secure storage of authentication data for the data sources to be connected and the use of version control, so that the team can work together on the transformation scripts and the workflow. The only thing still missing is the visualisation tool. The requirement is that it makes the processed information available to the stakeholders regularly, in a visually appealing, simple and cost-effective way. Whether Microsoft Power BI will be used for this, as in the POC, is still unclear at the moment: in addition to various licences, a permanently operated data gateway would be necessary to provide the dashboard with live updates. Our two specialist teams are therefore looking into alternatives better suited to the extensive requirements.
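
As one conceivable (and deliberately simple) approach to the first of these challenges, the credentials for the connected sources could be kept out of the versioned scripts entirely and injected through the runtime environment. The variable names in this sketch are placeholders, not the team's final solution.

    # One possible approach (an assumption, not the team's final solution):
    # read credentials for each connected source from environment variables
    # so they never end up in the versioned transformation scripts.
    import os

    def get_credentials(source: str) -> tuple[str, str]:
        """Look up user and token for a data source, e.g. source='JIRA'."""
        try:
            return os.environ[f"{source}_USER"], os.environ[f"{source}_TOKEN"]
        except KeyError as missing:
            raise RuntimeError(f"credential {missing} is not configured") from missing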

Conclusion

The HR dashboard gives a good insight into how we understand the specialist-team concept at IT Sonix. Our Xperts use their specialist-team time not only to build up broad knowledge in their field and stay up to date for the best possible realisation of our customer projects; the time and the knowledge gained can also flow into internal projects that continuously improve our processes and the working conditions of our employees.