Master Thesis Projects in Sensing & Perception
Shape the way cars see and think. Work with cutting-edge AI and sensor data to enable autonomous vehicles to understand their environment and make safer decisions.
👁️ Help Cars See, Understand, and React to the World
This year, we’re trying something a little different to make it easier for you to explore and apply for our master thesis projects. Instead of separate ads for every topic, we’ve grouped all projects into three main clusters — each focused on a different part of autonomous driving. You’re welcome to apply to one, two, or all three clusters, but later in the application process we’ll ask you to prioritize the projects you’re most excited about within each cluster.
Let’s take a closer look at what this cluster is all about:
In autonomous driving, everything starts with perception. How a vehicle senses, interprets, and understands its surroundings determines how safely and intelligently it can act. In this master thesis cluster, you’ll work on projects that push the boundaries of how cars perceive the world — turning raw sensor data into actionable awareness that powers real-world decision-making.
From transforming sensor data into meaningful insights with cutting-edge AI, to developing systems that detect drivable areas and predict how environments will change, your work will directly shape how vehicles see, understand, and interact with the world around them.
🔬 Sensing & Perception: Thesis Projects (Cluster A)
Here are the master thesis projects offered in this cluster — each topic below is a separate project you can apply for:
- Project 1: 🧠 Captioning Engine for AD/ADAS Data Using Multi-Modal Large Language Models – Use advanced LLMs to describe and interpret sensor data from autonomous vehicles.
- Project 2: 🌐 World Model Online Evaluation – Develop methods to continuously evaluate how vehicles represent and understand their environment.
- Project 3: 📡 Realistic Radar Simulation Using Transfer Learning for Autonomous Driving – Build radar simulations that improve perception model training and validation.
- Project 4: 🤖 Self-Supervised Representation Learning on LiDAR Point Clouds – Enable vehicles to learn meaningful features from raw LiDAR data without manual labeling.
- Project 5: 🚘 Free-Space Network Heads for Drivable Area Estimation – Improve how cars detect drivable areas and plan safe navigation.
- Project 6: 🧩 Explicit Model Pre-Training for 3D Occupancy Prediction – Develop foundational models that enhance 3D environmental understanding.
- Project 7: 💡 Deep-Learning-Based Headlight Beam Control – Explore how perception and deep learning can adaptively control headlights for safer driving.
Depending on which project you’re offered, you’ll conduct applied research in areas such as AI, sensor fusion, and 3D perception. You’ll design and train deep learning models using real or simulated sensor data and explore innovative ways to process inputs from LiDAR, radar, and cameras. Throughout the projects, you’ll collaborate closely with experienced researchers and engineers — and the results of your work will directly contribute to the future of safer, smarter autonomous vehicles.
We offer several master thesis projects across three clusters:
- Sensing & Perception (this one) – how the car sees and understands the world
- AI Tooling & Infrastructure – the data, platforms, and tools that power autonomous systems
- Planning, Decision-Making & Safety – how the car predicts, plans, and acts intelligently
Each cluster has its own job ad and a detailed project PDF with background on all topics. You’ll receive the PDF in a separate email after you apply, to help you explore the projects in more depth and choose the ones that best match your skills and interests.
🎓 So Who Are We Looking For?
We’re seeking passionate and curious Master’s students from fields including (but not limited to):
- Computer Science / Software Engineering
- Machine Learning / Artificial Intelligence
- Robotics / Autonomous Systems
- Signal Processing / Computer Vision
Because this cluster spans several projects, the required skills vary. When you apply, please list all relevant skills, tools, and knowledge areas — and describe your level of experience with each (e.g., basic, intermediate, advanced). This helps us understand your profile and match you with the project that best fits your background.
🧰 Expected Skills & Experience
Typical skills we look for (you’re not expected to have them all):
- Programming (e.g., Python, C++)
- Experience with machine learning or deep learning frameworks (e.g., PyTorch, TensorFlow)
- Understanding of sensor data processing, 3D perception, or sensor fusion
- Knowledge of computer vision, LiDAR/Radar data, or scene understanding
- Familiarity with self-supervised learning, data pipelines, or simulation tools
🌟 What’s in It for You?
- Work on impactful projects that directly contribute to autonomous driving systems
- Gain hands-on experience with real-world data and cutting-edge technology
- Learn from industry experts and grow your professional network
- Be part of an inclusive and innovative team shaping the future of mobility
📩 How to Apply?
Submit your CV, motivation letter, and grade transcripts.
- Applying as a pair? Include your partner’s name in the application.
- Planned start: January 2026 (flexible)
- Application deadline: October 31, 2025 (applications reviewed continuously)
📧 For more information, contact: Gabriel Campos, Research Manager – gabriel.campos@zenseact.com
This role may involve access to sensitive information, trade secrets, and confidential data. Selected candidates may undergo a background check as part of the recruitment process.
More about Zenseact
🚗 Our Software Makes a Difference
We use AI-driven technology to fight traffic accidents and make roads safer. Every year, 1.4 million people lose their lives in traffic — we’re here to change that.
🎯 One Purpose, One Product
We design the complete software stack for autonomous driving and advanced driver-assistance systems. With continuous updates, cars become safer over time — bringing us closer to our vision: Towards Zero. Faster.
❤️ Culture with People at Heart
We can only succeed together. Our culture is built on care, trust, and belonging — a place where everyone can grow, be themselves, and do their best, both at work and in life.
Zenseact works proactively to create a culture of diversity and inclusion, where individual differences are appreciated and respected. To drive innovation, we see diversity as an asset — we value and respect differences in gender, race, ethnicity, religion or belief, disability, sexual orientation, age, and more.
🕐 Interviews are held on a continuous basis, so we highly recommend that you submit your application as early as possible.
- Competence area: Opportunities for Students, Graduates & Innovators
- Locations: Gothenburg, Sweden; Lund, Sweden
- Remote status: Hybrid
About Zenseact
One purpose, one product
We are a software company focused on transforming car safety. By developing a complete software stack for autonomous driving and advanced driver-assistance systems, we aim to eliminate car accidents and make roads safer for all. Founded by Volvo Cars, Zenseact operates globally, with teams in Gothenburg and Lund, Sweden; and Munich, Germany.