Career advertisements

Advancing Emergency Operations with AI and Merging Multi-Sensor Information

2025-01-17
Spring 2025
Engineering Physics, Engineering Mathematics, and Global Systems
AI, Computer Engineering, Electronics, Physics, Programming, Robotics
30 credits (hp)
Academic supervisor: Magnus Karlsteen (magnus.karlsteen@chalmers.se)
Industrial supervisor: Andreas Eriksson (andreas.eriksson@uavision.se)
UA Vision AB, Göteborg, Sweden
1.             Thesis background

Emergency services and industries worldwide face increasingly complex challenges in hazardous environments, where visibility, structural safety, and swift decision-making are critical. UA Vision Nordic AB is developing MEV-Cam, a groundbreaking technological solution that combines advanced LIDAR, infrared (IR), color camera, and AI technologies to enhance safety and efficiency in these environments. MEV-Cam will enable real-time analysis and insights, providing rescue leaders and industrial operators with the tools to make informed decisions and minimize risks.

 

2.             Thesis objectives and timeline

This thesis aims to implement and test the MEV-Cam core technology Proof-Of-Concept (POC) according to the following timeline:

●     Week 1: Definition of formal POC goals and requirements.

●     Weeks 1-3: Literature review and POC use-case study.

●     Weeks 3-5: Definition of overall hardware and software needs.

●     Weeks 6-13: Solution implementation and integration testing.

●     Weeks 14-17: POC verification tests as per the verification plan (the verification plan is not within the thesis scope, but student input is expected).

●     Weeks 18-20: Thesis writing (and any remaining tests and verifications; only test reports intended as thesis content are expected, and drafts are accepted).

 

3.             Main thesis expected outcomes

a)     Detailed evaluation of HW and SW options for a Proof-Of-Concept (POC) MEV-Cam system, including future MEV-Cam development phases.

b)     Implementation of a HW/SW solution for the MEV-Cam POC.

c)     POC tests and preliminary performance evaluation.

d)     Scientific insights and technical recommendations to guide future product development.

 

4.             Student qualifications

Minimal:

●     Final-year MSc student with interests in Electronics, Data Science, Mechatronics, or related fields such as Biomedical Engineering, Complex Adaptive Systems, or Physics.

●     Capacity to “put the pieces together” for complex systems comprising software, hardware, and data processing.

●     Proficiency in C/C++ and Python, and familiarity with at least one SLAM algorithm.

●     Experience with sensors and their data processing and utilization.

●     Experience with ROS/ROS2/Gazebo or similar on Linux (experience from MSc course projects is acceptable).

●     Understanding of serial communication, Ethernet, CAN, and similar robotics protocols.

●     Proficient in English (including technical English).

●     Interest in, and the ability to quickly learn, new technologies.

 

Additional preferred:

●     Hands-on experience with 3D LIDAR data visualization in ROS/ROS2/Gazebo or similar.

●     Hands-on experience in 3D SLAM.

●     Image-based data processing (color and/or thermal, encoders, filters, etc.).

●     Practical front-end/user-interface experience (any language and graphical interface that replaces command-line usage).

●     Proficient in Swedish (including technical).

 

5.             UA Vision’s main roles and support

●     Guidance from two industry experts.

●     MEV-Cam hardware and software procurement, as well as facilitation of test sites.

Andreas Eriksson