Translate this: 'cognition-strength interfaces'

(PhysOrg.com) -- A highly ambitious European project has used electroencephalography (EEG), eye-tracking and keystroke logging as the starting point for the study of human-computer interaction during translation. It could be the dawn of a new era, with cognition-strength interfaces that work with brainwaves.

Most interface research, however inventive or effective, looks at a fairly basic set of data. The usability of touch screens, for example, was vastly enhanced by new intuitive software that taps into natural human gestures to create powerful new access methods and even new applications.

But for all their success, these types of devices represent a fairly coarse approach to human-computer interaction. Now one European project has taken the state of the art to a whole new level. The EYE-to-IT project combines electroencephalography (EEG), eye-tracking and keystroke logging to study how real people use computers to engage with real-world problems at the level of cognitive, visual and motor function.

It looks like an eclectic mix because the EYE-to-IT project sought to unite three almost contradictory aims: basic scientific research into translation as cognition, the study of eye-tracking in the context of human-computer interaction (HCI) - very fashionable in modern business - and the use of EEG to study brain activity during translation.

Real-world benefits

By combining these three research elements into one strand, EYE-to-IT sought to develop a useful research tool that could also bring real-world benefits to HCI while developing even more sophisticated analysis on how our brains, eyes and hands work with a computer as we complete a complex task.

It was a highly ambitious, highly interdisciplinary project, as befits an effort with such diverse aims. The consortium's expertise ran from classical philology through HCI, neurology and neuro-linguistics, electro-physiology and translation studies, spread across just six partners led by the New Bulgarian University as scientific coordinator.

With a relatively modest €2m research budget, €1.9m of it coming from the EU, the project achieved some compelling results, ranging from improvements in eye-tracking to new tools for translators, and it even tackled some fundamental research problems that have prevented this type of joined-up thinking in the past.

Seeming contradictions

“Some of the aims of the project seemed contradictory,” project coordinator Maxim Stamenov explains. “For example, eye-tracking obviously requires eye movement, while electroencephalography requires that the eyes stay very still. Moving eyes cause a lot of ‘noise’ in the electrical signals coming from the brain.”

EYE-to-IT developed a partial solution. On the face of it, presenting a text word by word would take out the eye saccades, but then eye-tracking is pointless. Instead, EYE-to-IT timed EEG recording in such a way as to provide windows for registering brain activity. These recording moments depend on the timing of the onsets and offsets of eye movements during natural reading.
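To make the idea concrete, here is a minimal, hypothetical Python sketch of that kind of gating (not the project's actual pipeline): it assumes the eye tracker reports fixation windows, i.e. the periods between saccades, and it keeps only the EEG samples that fall inside those windows, so saccade-contaminated segments are excluded from analysis.

```python
import numpy as np

def fixation_locked_epochs(eeg, sample_rate, fixations, margin_s=0.02):
    """Cut one EEG channel into epochs aligned to fixation windows.

    eeg         : 1-D numpy array of EEG samples (single channel)
    sample_rate : sampling rate in Hz
    fixations   : list of (onset_s, offset_s) pairs from the eye tracker,
                  the periods between saccades when the eyes are still
    margin_s    : seconds trimmed at each end to avoid residual saccade noise
    """
    epochs = []
    for onset_s, offset_s in fixations:
        start = int((onset_s + margin_s) * sample_rate)
        stop = int((offset_s - margin_s) * sample_rate)
        if stop - start > 0:                      # skip fixations too short to use
            epochs.append(eeg[start:stop])
    return epochs

# Hypothetical usage: 500 Hz EEG, fixation times in seconds on the same clock.
eeg_signal = np.random.randn(5000)                # stand-in for recorded data
fixations = [(0.40, 0.65), (0.72, 1.10), (1.18, 1.50)]
epochs = fixation_locked_epochs(eeg_signal, 500, fixations)
print([len(e) for e in epochs])
```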

The upshot is that the results achieved by the partners are among the first that report experiments with this combination of technologies. “We believe we are among the world’s first on this count when it comes to reading,” says Stamenov.

Tracking EEG, eye movements and keystrokes also presented an enormous challenge. “It generates a huge amount of data, but how do you visualise this data and find meaningful information?” asks Stamenov. The EU-funded project solved this problem by developing a visualisation tool, called KiEV, which offers easy access to all this data.

Not a small Babylon

KiEV, developed at the University of Tampere Human-Computer Interaction Unit, presents the three data sets across a timeline, in graphs, so researchers can quickly home in on interesting trends or artefacts, or search for specific tendencies.
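A rough sense of what such a timeline view involves can be given with a short, hypothetical sketch (not the actual KiEV code): three independently sampled streams - EEG, gaze position and keystrokes - are plotted against one shared experiment clock, so an event in any one stream can be read off against the other two.

```python
import numpy as np
import matplotlib.pyplot as plt

# Illustrative stand-in data: each stream has its own timestamps (seconds),
# but all refer to the same experiment clock, which is what makes alignment possible.
eeg_t = np.linspace(0, 10, 5000)
eeg_v = np.random.randn(5000) * 10          # synthetic EEG, microvolts
gaze_t = np.linspace(0, 10, 600)
gaze_x = np.cumsum(np.random.randn(600))    # synthetic horizontal gaze position
key_t = [1.2, 2.8, 3.1, 5.0, 7.6]           # synthetic keystroke timestamps

fig, (ax1, ax2, ax3) = plt.subplots(3, 1, sharex=True, figsize=(8, 6))
ax1.plot(eeg_t, eeg_v)
ax1.set_ylabel("EEG (µV)")
ax2.plot(gaze_t, gaze_x)
ax2.set_ylabel("Gaze x (px)")
ax3.eventplot(key_t)                         # each keystroke as a tick on the timeline
ax3.set_ylabel("Keystrokes")
ax3.set_xlabel("Time (s)")
plt.show()
```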

The project coordinator cites the author and philosopher Umberto Eco, who is reported to have said, “Translation is the language of Europe”.

“The European Union, with its 23 official languages, is not a small Babylon,” stresses Stamenov. But thanks to the work of EYE-to-IT, there are now opportunities to develop new tools for translators. For example, eye-tracking can identify the word currently being translated, and a pop-up menu can offer possible matches in the target language.
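The core of such a prompt could look something like the following hypothetical sketch: a gaze x-coordinate is mapped to the word whose on-screen bounding box contains it, and that word is looked up in a bilingual glossary to produce candidate translations. The layout model and the tiny English-Bulgarian glossary are invented for illustration; they are not the project's tool.

```python
# Hypothetical sketch: map a gaze position to the word under the eyes and
# suggest target-language candidates from a small bilingual glossary.

# Each source word with its horizontal bounding box on screen (pixels).
words = [("the", 10, 55), ("translation", 60, 180), ("process", 185, 270)]

# Toy English->Bulgarian glossary; a real system would use a full lexicon.
glossary = {
    "translation": ["превод"],
    "process": ["процес"],
}

def word_at_gaze(gaze_x, layout):
    """Return the word whose bounding box contains the gaze x-coordinate."""
    for word, left, right in layout:
        if left <= gaze_x <= right:
            return word
    return None

def suggestions(gaze_x, layout, lexicon):
    """Candidate target-language matches for the word currently being read."""
    word = word_at_gaze(gaze_x, layout)
    return lexicon.get(word, []) if word else []

print(suggestions(120, words, glossary))   # -> ['превод']
```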

“Cognates and false or non-cognates are another possible application, where words seem the same between the two languages, but actually mean different things. For example, there are 3000 false cognates between English and Bulgarian!”

In all, EYE-to-IT was a very unusual and successful project, tackling fundamental science and technology issues to develop new tools for both the lab and the real world, and drawing on a broad range of disciplines to bring a new approach to old problems.

Unanticipated tools

Most practically, in both the short and long term, it has laid the groundwork for new, unanticipated tools that support new modes of human-computer interaction, at a level of interactivity that goes way beyond user-testing for touch screens or motion sensors.

EYE-to-IT drills down to the basic functioning of the brain during extremely sophisticated tasks and offers the opportunity to use, in innovative ways, methods and technologies normally reserved for the research institute.

“Our work could certainly be applied to almost any task requiring cognition and computers,” Stamenov declares. “We chose translation because it tackles a real problem, it is probably the most sophisticated linguistic cognitive function and thus a good test case, it is a vital area for Europe and it reflects the interests of the partners. But our work could also be applied to other areas.”

And, in the long term, the work paves the way for developing far more sophisticated computer interfaces that adapt to the unique brain patterns of individual users by learning what makes their brain tick.

Commercial prospects

For now, one EYE-to-IT partner, Tobii, will seek to adapt their commercial eye-tracking technology to lab work by dramatically increasing the tracking resolution. Results from the project will also be incorporated into the Translog keystroke logging tool, developed at the Copenhagen Business School, and the methods and approach of the project will have a significant impact on cognitive research.

Stamenov notes that there are immediate applications in basic research and HCI. There is also scope to adapt the eye-tracking to the treatment and study of dyslexia, for example.

At the first Future and Emerging Technologies (FET) conference, in Prague during April this year, the project was approached by a large number of businesspeople who could see potential applications for the technology. One delegate from a leading mobile phone maker asked EYE-to-IT about plans to use eye-tracking for smart phones, showing that the concept is generating considerable interest.

Currently the project is finalising its data and final reports, but Stamenov says the partners will look at the possibilities of a follow-on project later in the year.

The EYE-to-IT project received funding from the FET-Open strand of the EU’s Sixth Framework Programme for research.

Provided by ICT Results

