Fully Funded PhD position in Human-Computer Interaction at Lancaster University.
I am looking for a new PhD student in Human-Computer Interaction with a focus on IoT/multi-device interaction, collaborative augmented reality, or physical data visualisations.

We have some defined projects available (see below) but are open to project proposals within the scope of:

  • Multi-device or Cross-device interaction techniques, toolkits, systems and applications
  • Internet-of-things systems, interaction techniques and applications
  • Augmented or Virtual Reality systems, tracking, and interaction techniques
  • Interactive spaces and spatial/proxemic interaction techniques, displays and applications
  • Human-data interaction, physical data visualisations, in-the-wild deployment of physical data installations
  • Large-scale interactive systems, urban IoT, sensor systems and public installations

Application deadline: June 28th, 2019
Start date: between October 2019 and January 2020

If you are interested in doing doctoral research in the area of human-computer interaction and physical computing, contact me directly to discuss opportunities and how to apply using the official application procedure.

Example projects I am currently working on:

“Multi-user collocated interaction in Augmented/Mixed Reality”

Augmented or mixed reality (AR) is a technology that overlays digital information and interfaces on the real physical world. Wearable and mobile AR, using tablets and smart glasses, has many applications in healthcare, industry, knowledge work, and gaming, as it allows people to seamlessly interact with both digital and physical objects in real time. Research into AR often focuses on applications and techniques for individual use. However, many work environments in reality require ad hoc collaboration with other people at different locations and times. People often work in environments (such as hospitals or factories) that require them to roam between different locations and interact with various people, machines, objects, and information. There is therefore a need to explore how AR can support ad hoc collaborations that enable people to easily interact with the same virtual and physical content at the same time.

This project will explore how to enable and support seamless multi-person augmented reality environments that facilitate and foster ad hoc collaboration. The candidate will invent new technical solutions for multi-user augmented reality, and study foundational interaction techniques and applications for collocated interaction in AR that can be applied in healthcare or industry environments. There will be opportunities to intern and collaborate with industry and healthcare partners.

Profile and requirements:

  • Degree in Computer Science or Engineering (first degree or GPA equivalent)
  • Academic excellence and interest in HCI and UbiComp research domains.
  • Strong interest or experience in Augmented or Virtual Reality on wearable and mobile devices.
  • Experience in programming and designing 3D environments (e.g., Unity or Unreal Engine, 3ds Max, Maya, …).
  • Strong design and programming skills, as you will be required to build high-fidelity prototypes that will be tested in both lab and real-world environments.

“Evaluating Physical Ambient Displays in the Wild”

Urban Internet-of-Things (IoT) devices and sensors provide rich information about the world around us. They can measure air pollution, traffic, noise levels, temperature, and other environmental data that can help us make decisions in our everyday lives. We may, for example, decide to take a different route to work if the air quality on our commute is particularly bad, or open a window if humidity or CO levels are rising indoors. Access to such data can also make people more aware of the environment and how they contribute to it. While sensor platforms are commercially available and sensors have been placed in our cities and streets, data from these sensors is usually only available through websites or stored in large data sheets that are hidden from, or unengaging to, inexperienced users. Furthermore, direct representations of sensor readings do not always make sense to us: what unit are readings presented in, and when are readings out of the ordinary?

This project examines how new physical data representations (or tangible interfaces) can be used to make invisible data streams from IoT sensors visible. The candidate will design and build novel physical data representations and conduct a range of in-the-wild deployments to evaluate the proposed physical representations in real-world scenarios. This project will be in collaboration with industry partners who will provide access to a sensor platform.

Profile and requirements:

  • Degree in Design, Computer Science, or Engineering (first degree or GPA equivalent)
  • Academic excellence and interest in Urban Internet-of-Things (IoT), Smart Cities and HCI research domains.
  • Strong interest or experience in Physical Computing: fabrication techniques (laser cutting, 3D printing, CAD) and hardware prototyping (Arduino).
  • Experience in programming and designing physical computing prototypes.
  • Experience in conducting research studies in the wild.

“Augmented Factories: Using IoT Technology to improve Factory Safety”

Factories are complex systems consisting of machines, moving vehicles, people, and materials. Despite the rise in automation, factories will continue to require human workers and maintenance crews who will have to work with (1) current machinery and (2) increasingly automated equipment (robots, drones). Many Health & Safety directives in factories are ‘passive’ or instructional (e.g. wear a hi-vis jacket, wash your hands for food safety, keep hands clear while machines are operating), but these rely on workers following directives and require human intervention to monitor and uphold. There is a significant opportunity to transform workers’ health and safety into an ‘active’ activity where sensing and computation provide ‘always-on’ monitoring and immediate interventions in unsafe conditions (e.g. audible/visible warnings, machine shutdowns, space reconfiguration, incident reporting).

This ambitious project explores how Internet-of-Things technology such as sensors, actuators, projection systems, and other ‘smart’ technologies can be leveraged to make factories more dynamic and safe. The candidate will invent and design new industrial smart-space technologies and evaluate them in both ‘living lab’ conditions and real factory testbeds. This project is in close collaboration with several industry partners, and the candidate will have the opportunity to intern with our industrial partner.

Profile and requirements:

  • Degree in Computer Science or Engineering (first degree or GPA equivalent)
  • Academic excellence and interest in HCI and UbiComp research domains.
  • Strong programming and development skills in IoT technology.
  • Experience with projection-based technology (projection mapping) and Ubicomp environments.
  • Overall strong design and programming skills, as you will be required to build high-fidelity prototypes that will be tested in both lab and real-world environments.