What will you be doing?
Cameras are widely used for surveillance; CCTV systems can be found in city centres, in shopping malls, in car parks, in train stations, at airports, and so on.
This widespread use of cameras is motivated by their high resolution and ease of use. However, cameras do have some disadvantages.
The quality of (daylight) camera imagery degrades in poor lighting conditions, smoke and dust. Furthermore, a single camera cannot provide information about the range to a person.
These issues may be alleviated by exploiting the strengths of radar. Radar systems provide the range and velocity of detected subjects, and radar performance is largely preserved at night and in poor weather, dust and smoke.
Compared to cameras, however, the resolution of radar images is typically much lower. Since their strengths and weaknesses appear to complement each other, a natural question arises: can radar and camera systems learn from each other?
Multimodal learning is the topic of this assignment. Based on earlier results, the goal is to design and evaluate a multimodal neural network to classify different types of human motion on the basis of video and radar data.
Understanding the operation of the multimodal neural network is an important asset in evaluating the final classification results. For this purpose, visualization techniques can be tested and applied that highlight which image pixels are actually used for feature extraction and classification.
A multimodal (radar / video) data set of walking people is readily available. The work may also include planning and conducting new multimodal measurements.
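As a rough illustration of what such a multimodal classifier might involve, the sketch below shows late (decision-level) fusion of per-modality class scores in plain NumPy. The random logits, the three motion classes and the probability-averaging rule are assumptions for illustration only; in practice the scores would come from trained video and radar network branches, and the fusion strategy is part of the design work in this assignment.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    """Row-wise softmax, numerically stabilised by subtracting the row max."""
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Hypothetical class scores for 3 motion classes on a batch of 4 samples;
# stand-ins for the outputs of trained video and radar branches.
video_logits = rng.normal(size=(4, 3))
radar_logits = rng.normal(size=(4, 3))

# Late fusion: convert each modality's scores to probabilities,
# then average them per sample.
p_video = softmax(video_logits)
p_radar = softmax(radar_logits)
p_fused = 0.5 * (p_video + p_radar)

# Predicted motion class per sample.
pred = p_fused.argmax(axis=-1)
```

Late fusion is only one option; intermediate (feature-level) fusion, where radar and video feature maps are combined before classification, is equally plausible for this data.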
You will perform this assignment in the Department of Radar Technology. We are a passionate and creative group of professionals (60 people) dedicated to the specification, development and evaluation of innovative, high-performance MMICs, miniaturised and integrated RF subsystems, antennas and front-ends. The department is at the heart of novel, game-changing radar system and signal processing concepts for the military, space and civil domains.
What do we require of you?
You are in the final stages of your degree in artificial intelligence, physics, mathematics or a related field, and you have a track record in machine learning (deep learning).
You have experience programming in Python and/or MATLAB, and you are pragmatic and focused on making things work. In addition to technical expertise, we value communication skills and a results-driven attitude.
What can you expect of your work situation?
TNO is an independent research organisation whose expertise and research make an important contribution to the competitiveness of companies and organisations, to the economy and to the quality of society as a whole.
Innovation with purpose is what TNO stands for. With 3000 people we develop knowledge not for its own sake but for practical application.
To create new products that make life more pleasant and valuable and help companies innovate. To find creative answers to the questions posed by society.
We work for a variety of customers: governments, companies, service providers and non-governmental organisations.
What can TNO offer you?
A work placement is the prelude to your career: it gives you an opportunity to take a good look at your prospective future employer.
TNO goes a step further. It’s not just looking that interests us; you and your knowledge are essential to our innovation.
That’s why we attach a great deal of value to your personal and professional development. You will, of course, be properly supervised during your work placement and be given the scope for you to get the best out of yourself.
Naturally, we provide suitable work placement compensation.
For this vacancy, it is required that the AIVD issues a security clearance after conducting a security screening. For more information, please visit the AIVD website.
Has this vacancy sparked your interest?
Then please feel free to apply for this vacancy! If you have any questions, don't hesitate to contact us.
Contact: Jacco de Wit
Phone number: +31 (0)88-86 61057