Funded PhD Project: The augmented agronomist: Synthesis of AI, ML and robotics to assist decision support
We are offering an exciting PhD opportunity (funded for up to four years) at the crossroads of innovative robotics & AI research and agri-food technology within the Collaborative Training Partnership (CTP) for Fruit Crop Research, funded by BBSRC and industry. This project is funded through the National Productivity Investment Fund (NPIF) via BBSRC.
“The augmented agronomist: Synthesis of AI, ML and robotics to assist decision support”
With recent advances in Artificial Intelligence and Machine Learning, and the maturity gained in many robotic applications and domains, this project sets out to provide agronomists with dedicated technological support for assessment and decision making. Adopting innovative paradigms already successfully deployed in telemedicine and telecare, a mobile robotic “proxy”, equipped with multi-modal sensing to directly facilitate visual as well as multi-modal (e.g. NIR, moisture, …) inspection, will be developed and field-tested in the context of soft fruit production. The objectives of the project are:
- Shared Control and Assisted Assessment from a Mobile Robotic Platform,
- Integration of Automated Diagnosis Employing Multi-Modal Sensing,
- Robotic Telepresence facilitated through Adaptive Augmented/Virtual Reality Interfaces
Consequently, decision support and outcomes from automated analysis, as well as control-relevant information, are provided by means of virtual and augmented reality, offering the operator an immersive experience with fluid shared control and assessment. The operator is in a closed loop with the system, despite their remote location, enabling them to assess the situation effectively and decide on interventions quickly, with all information at hand. The project is closely linked with the RASberry project (https://rasberryproject.com/) and will have access to its software and hardware resources to minimise risks and maximise synergies.
The main scientific questions addressed by this PhD project are:
- How can a mobile robotic platform, equipped with multi-modal sensing, be effectively used by a remote agronomist to assess a situation and take decisions?
- How does this tele-presence affect the quality and performance of the assessment of an agronomist in comparison to their physical presence?
- What is the performance of different AI and ML techniques in this scenario, and which methods are well suited?
It is in this context that we are looking for a highly skilled PhD student to specialise in artificial intelligence, machine learning, robot navigation, human-machine interaction, and shared autonomy with robotic systems. The post is fully funded for 4 years of study (bursary at RCUK rate and all fees covered), including the opportunity and funds for 1 year of direct collaboration with the industry partner. A generous travel and equipment budget is also available. Applicants will possess excellent research and programming skills as well as a solid mathematical understanding. Successful applicants will usually have graduated with an MSc degree in a relevant area (e.g. maths, computer science, engineering). Previous publications (e.g. from MSc studies) are not required but will be considered in the candidate's favour.
Note: Due to conditions of the grant, this position can only be offered to UK/EU/EEA candidates initially. International PhD fees cannot be covered at this point.
Applications are received online only (no email applications, please). Candidates are asked to apply with their CV (including a list of publications), names of 2 referees who are content to be contacted, a transcript of their prior studies, and a very brief motivation statement (all in the online form).
Applications are received continuously and assessed regularly (weekly) until the position is filled (this advert will disappear once it is). Due to the anticipated high number of applications, please understand that we may only get back to you if you are short-listed for an interview.