People with learning difficulties or memory problems face barriers in the working environment because they need extra supervision. New environments, unfamiliar equipment and changing tasks can be especially challenging.
VirtuAssist provides real-time guidance for operating workplace equipment so people can work with minimal supervision in these challenging environments. VirtuAssist combines cutting-edge technologies such as computer vision, pointing-gesture recognition, machine learning and task modelling with smart glasses. This personalises information and interaction to the end-user’s needs and preferences in a fun and effective way.
The 2012 Horizon Report was recently released by the New Media Consortium and the EDUCAUSE Learning Initiative. The report highlights emerging technology trends projected to impact education on a global scale.
I’ve personally been following the annual reports for the past four years, and their forward-thinking, progressive approach to technology is very encouraging. This is not to say that the “horizon” they point towards is inevitable, but that the reports are informed enough and courageous enough to project a future in an industry where it’s become cliché to describe development as exponential.
The 2012 report projects the following adoption trends:
- Adoption in 1 year or less
  - Mobile apps
  - Tablet computing
- Adoption in 2 to 3 years
  - Game-based learning
  - Learning analytics
- Adoption in 4 to 5 years
  - Gesture-based computing
  - Internet of things
Without turning this post into a dissertation, I’d like to toss around some ideas about how these developments may impact the right to education and the right to information for persons with disabilities. This post will be divided into three parts, addressing each of the projections in the report.
In 2012, mobile apps and tablet computing have become emblematic of how we consume, educate, and learn. However, mobile apps and tablet computing are part and parcel of the same fundamental accessibility challenge: the rise of touchscreen interfaces as a popular mode of human-computer interface (HCI). This HCI du jour is paradoxical in that it promotes cost efficiency and inclusiveness for certain populations while potentially excluding persons with disabilities, such as those who are blind or partially sighted. As educational providers continue to adopt this technology, it is imperative that they evaluate the accessibility of both the device hardware and the apps.
The development of HCI, from the command prompts of MS-DOS, to the graphical user interfaces of early Apple and Microsoft operating systems, to the touchscreen interfaces now ubiquitous in mobile devices, makes one thing clear: multi-sensory inputs will continue to broaden the widespread appeal of technology while challenging its inclusiveness for persons with disabilities. This presents a clear opportunity for Universal Design in the approach to another Horizon Report projection, the adoption of gesture-based computing in the next four to five years. More on the role of accessibility in gesture-based computing and education in part 3 of this post.