We are looking for master's students who want to work with applied technology in the areas of navigation, tangible interaction and mobile applications that facilitate everyday life for people with disabilities and their relatives. Some examples are presented here, but you can also come up with your own project suggestion that has a similar focus.
Location based games for young people who are blind
In contrast to their sighted peers, young visually impaired people face a greater challenge in liberating themselves from the protection of adults, because it is so difficult for them to get around on their own. They also have a greater workload: in addition to schoolwork, they have to learn to walk with a cane and to orient themselves as best they can. A mobile, location-based game (using GPS, for example) could be a new, fun and motivating way for them to learn orientation. At the same time, the location-based system can be a source of security, encouraging them to venture out and practice getting around on their own. This master's project can to some extent be carried out in cooperation with the company Do-Fi, and will involve both the implementation and user testing of prototypes. It is suitable for one or preferably two engineering students who are studying the C, D, E or F programmes and who have taken one or more courses in rehabilitation engineering and/or the series of courses in human-computer interaction. Good programming skills are also required.
Contact person: Kirsten Rassmus-Gröhn
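To make the idea concrete, here is a minimal sketch of the core logic such a game needs: computing the distance between two GPS fixes with the haversine formula and detecting when the player reaches a game waypoint. Python is used only for illustration (the real game would run on a phone), and the 15 m trigger radius is an arbitrary assumption to be tuned with users.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS-84 coordinates."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def reached_waypoint(pos, waypoint, radius_m=15.0):
    """True when the player's GPS fix is within the trigger radius
    of a game waypoint; the game would then play an audio cue."""
    return haversine_m(pos[0], pos[1], waypoint[0], waypoint[1]) <= radius_m
```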
Investigation and practical application of GPS tracking of children
Some children with cognitive limitations, such as those who have autism or Down's syndrome, can suddenly decide to wander off or run away from their homes, schools or daycare centres. This causes much worry and uncertainty for their relatives and the personnel who work with them. There are a number of technical tracking solutions that use GPS, but as far as we know no comprehensive, user-friendly solution exists. The project would involve a thorough analysis of the problem and market research on existing systems, and then combining this with a tailor-made, functional system to be tested. This master's project is suitable for one or two engineering students who are studying the C, D, E or F programmes and who have (preferably) taken one or several courses in rehabilitation engineering and/or the series of courses in human-computer interaction.
Contact persons: Arne Svensk & Kirsten Rassmus-Gröhn
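One design problem the analysis will run into is GPS jitter: single fixes often jump outside a safe zone even when the child has not moved. A common mitigation is to require several consecutive out-of-zone fixes before raising an alarm. The sketch below is an illustration of that debouncing logic only; the fix count (and any zone radius) are assumptions to be tuned in user tests.

```python
def should_alert(distances_m, radius_m, n_consecutive=3):
    """Raise a wander-off alert only after n_consecutive GPS fixes in a row
    fall outside the safe-zone radius, filtering out single-fix jitter."""
    outside = 0
    for d in distances_m:
        outside = outside + 1 if d > radius_m else 0
        if outside >= n_consecutive:
            return True
    return False
```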
Non-visual use of mobile phones with touch screens
Today’s mobile phones with touch screens are difficult to interact with when you are unable to look at the screen. This entirely excludes people who are blind, but it can also be a problem for people who are walking or cycling in traffic. Our interest is therefore in examining the possibilities for non-visual interaction. The project would involve an analysis of the problem and market research on touch-screen interfaces for blind users, leading to technical development for, for example, the Android platform and testing with the target group. This master’s project is suitable for one or (preferably) two engineering students who are studying the C, D, E or F programmes and who have (preferably) taken one or several courses in rehabilitation engineering and/or the series of courses in human-computer interaction. Good knowledge of Java programming is also required.
Contact person: Kirsten Rassmus-Gröhn
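One building block for non-visual touch interaction is turning raw touch events into gestures that can be mapped to speech or audio feedback. As an illustration only (a real implementation would likely use Android's `GestureDetector` or accessibility APIs in Java), a simple tap/four-direction-swipe classifier from a touch's start and end coordinates might look like this; the 50-pixel minimum swipe distance is an assumption:

```python
def classify_swipe(x0, y0, x1, y1, min_dist=50.0):
    """Classify a touch gesture as a tap or a 4-direction swipe.
    Screen coordinates: y grows downwards, as on Android."""
    dx, dy = x1 - x0, y1 - y0
    if (dx * dx + dy * dy) ** 0.5 < min_dist:
        return "tap"
    if abs(dx) >= abs(dy):          # mostly horizontal movement
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"
```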
iBeacon toys for blind interaction
An audio bracelet for blind interaction (ABBI) is being developed to support sensory-motor rehabilitation for visually impaired children. The bracelet provides spatial information on where and how body movements occur via audio feedback. The ABBI device has mainly been used in spatial rehabilitation studies, but it has good potential for use in social and playful contexts. The idea of the project is to develop games and toys that connect to and interact with ABBI via Bluetooth LE and/or react to beaconing signals from multiple ABBIs. The toys will probably be developed using small embedded computers such as Arduino boards with BLE modules; Bluetooth LE and beacon capabilities are already present in ABBI.
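To give a feeling for what "reacting to beaconing signals" involves: BLE beacons are usually ranged by converting received signal strength (RSSI) into an approximate distance with a log-distance path-loss model. The sketch below is illustrative Python (a toy would do this on the Arduino in C/C++); the calibrated 1 m RSSI of −59 dBm and path-loss exponent of 2.0 are assumptions that would have to be measured for the actual ABBI hardware.

```python
def estimate_distance_m(rssi_dbm, tx_power_dbm=-59, n=2.0):
    """Rough distance estimate from a BLE beacon's RSSI using the
    log-distance path-loss model; tx_power_dbm is the calibrated
    RSSI measured at 1 m from the beacon."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * n))

def nearest_bracelet(readings):
    """Given {bracelet_id: rssi_dbm}, return the id of the (estimated)
    closest ABBI, e.g. so a toy can react to whoever is nearest."""
    return min(readings, key=lambda k: estimate_distance_m(readings[k]))
```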
Interactive objects for mobility training and rehabilitation after stroke
Interactive objects may be used to make training after stroke both more motivating and easier. Today there are smart watches, activity bracelets and smartphone apps that can be used to support various activities. However, interactive objects (“tangibles”) can be easier to use for older people or people recovering from stroke. The idea is to encourage activities that are already known to be effective in stroke rehabilitation. The project would involve developing interactive objects that make it easier and more motivating to perform the training exercises. The students must develop and test the interactive objects to see if they work as intended (not evaluate training or rehabilitation effects). The tangibles will probably be built with small embedded computers such as Arduino boards together with different kinds of sensors and actuators.
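A typical piece of logic inside such a tangible is counting exercise repetitions from an accelerometer signal. A minimal sketch (in Python for readability; the object itself would run this on an Arduino) uses two hysteresis thresholds so that noise around a single threshold is not counted as several repetitions. The threshold values are assumptions to be tuned per exercise:

```python
def count_repetitions(samples_g, high=1.5, low=1.1):
    """Count exercise repetitions from accelerometer magnitude samples
    (in g). A repetition is counted when the signal rises above `high`,
    and the counter re-arms only after it drops below `low` (hysteresis)."""
    reps, armed = 0, True
    for a in samples_g:
        if armed and a > high:
            reps += 1
            armed = False       # ignore further peaks until signal settles
        elif not armed and a < low:
            armed = True
    return reps
```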
Interactive object creation with mobile phone and 3D printer
The aim of this project is to create an interactive tangible object by 1) making a 3D-printed object that encases a mobile phone and 2) programming a simple app for the encased phone so that the object responds to movement (or lack thereof) with light and/or sound and/or vibration. The object should be shaped to be held with two hands – you are free to choose which movement to respond to, but it should be one that can be performed while holding the object with two hands.
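Detecting "lack of movement" can be as simple as checking whether any recent accelerometer sample exceeded a movement threshold. The sketch below illustrates that check in Python (the app itself would be written for the phone's own platform); the 0.15 g threshold and 5-second idle window are arbitrary assumptions:

```python
def inactivity_alert(timestamps_ms, magnitudes_g, threshold_g=0.15, idle_ms=5000):
    """Return True if no accelerometer sample (gravity-compensated
    magnitude, in g) exceeded the movement threshold during the last
    idle_ms of the recording - i.e. the object has been left still."""
    if not timestamps_ms:
        return False
    end = timestamps_ms[-1]
    for t, m in zip(timestamps_ms, magnitudes_g):
        if end - t <= idle_ms and m > threshold_g:
            return False        # recent movement detected
    return True
```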
Shared control of an autonomous powered wheelchair
Autonomous wheelchairs allow users with severe physical disabilities to navigate an environment by themselves. However, they currently provide very limited navigation, mostly in known environments. Users should be allowed as much control of the wheelchair as their capabilities allow. The idea of the project is to find an optimal, adaptive way to share control tasks between the user and the autonomous wheelchair, for people with varying degrees of disability and in different contexts. The focus will be on the interaction between user and wheelchair. A powered wheelchair with an RGB-D camera and a laser sensor is available for the project, and the low-level control is already implemented. The wheelchair will be controlled by a small computer such as a Raspberry Pi running the Robot Operating System (ROS).
Contact person: Héctor Caltenco
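A common starting point for shared control is linear blending of the user's joystick command with the autonomous controller's command, where the user's authority is scaled down as obstacles get closer. The sketch below shows one such illustrative policy, not a prescribed solution for the project; the authority level, safety distance and ramp length are all assumptions to be explored:

```python
def blend_command(user_v, auto_v, obstacle_dist_m,
                  user_authority=0.8, safety_dist_m=0.5, ramp_m=1.5):
    """Blend user and autonomous velocity commands:
    output = alpha * user + (1 - alpha) * autonomy, where alpha ramps
    from 0 (autonomy takes over inside safety_dist_m) up to
    user_authority once the nearest obstacle is ramp_m further away."""
    if obstacle_dist_m <= safety_dist_m:
        alpha = 0.0  # too close: autonomous controller has full control
    else:
        alpha = user_authority * min(1.0, (obstacle_dist_m - safety_dist_m) / ramp_m)
    return alpha * user_v + (1.0 - alpha) * auto_v
```

In ROS terms, the blended value would be published as the wheelchair's velocity command, with `user_v` from the joystick and `auto_v` from the navigation stack.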
Evaluation of energy-usage feedback for house and apartment inhabitants (in cooperation with Kraftringen)
Most domestic energy use is invisible to the user: most people have only a vague idea of their energy consumption for different purposes (heating, cooking, washing, etc.). More visible feedback on energy usage is important to both inhabitants and house/apartment owners, to help them understand how they use energy and perhaps adjust their behaviour or invest in efficiency measures. However, providing energy consumption data on a daily, weekly or monthly basis may not be enough, since consumption depends on many different factors, such as presence at home, outdoor weather, and daily or weekly routines. Feedback that takes the combination of these factors into account might give users a more useful and understandable way to interpret and adjust their energy consumption behaviour. The project aims to evaluate an energy consumption app and design useful feedback strategies for users.
This master's thesis is carried out in collaboration with Kraftringen; Liisa Fransson is the project leader.
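One concrete way to make feedback weather-aware is to normalise heating energy by heating degree-days, so that periods with different outdoor temperatures become comparable. A small illustrative sketch follows; the 17 °C base temperature is a common convention for degree-day calculations in Sweden, but remains an assumption for this project:

```python
def degree_days(daily_mean_temps_c, base_temp_c=17.0):
    """Heating degree-days for a period: the sum over days of how far
    each day's mean outdoor temperature falls below the base temperature."""
    return sum(max(0.0, base_temp_c - t) for t in daily_mean_temps_c)

def normalised_use(kwh, daily_mean_temps_c, base_temp_c=17.0):
    """Heating energy per degree-day, so that e.g. a cold week and a mild
    week can be compared fairly in the feedback shown to the user."""
    dd = degree_days(daily_mean_temps_c, base_temp_c)
    return kwh / dd if dd > 0 else float("nan")
```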