SEOUL, Dec. 27 (Yonhap) -- "Congratulations on your graduation. I'm proud that you're doing well at work. I hope you will overcome hardships and succeed. Don't push yourself too hard. My little child, I'm sorry I can't give you enough love. Take care of yourself, eat well and be safe."
This could be a note from any parent with endless worries for their children. But every word of this particular memo is precious.
These are the first words that a father of three has written since he was diagnosed with amyotrophic lateral sclerosis (ALS), better known as Lou Gehrig's disease. It is a progressive neurodegenerative illness in which patients gradually lose control of their muscles. In the later stages, often the only movement left to them is that of their eyes.
A sample of the DIY eye-tracking tool made by engineers at Samsung Electronics (Courtesy of the eyeCan project)
The eyeCan project team tests the DIY device on a patient with Lou Gehrig's Disease. (Courtesy of the eyeCan project)
The man, who asked for anonymity, was one of the first patients to test the eye-tracking tool named eyeCan (http://eyecanproject.wordpress.com/english/), made by five engineers from Samsung Electronics as an independent project of their own.
Wearing what looks like makeshift eyeglasses with a small web camera duct-taped to them, he uses one eye as a computer mouse. When he blinks, as if winking, the camera with infrared LEDs recognizes the movement of his pupil and clicks on a virtual keyboard on a PC screen.
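The blink-to-click mechanism described above can be pictured as a simple state machine: the camera reports each frame whether the pupil is visible, and a deliberate wink -- a gap longer than an involuntary blink but shorter than simply resting the eyes -- is turned into a click. Below is a minimal illustrative sketch; the class name and frame thresholds are assumptions for illustration, not part of the actual eyeCan code.

```python
class BlinkClicker:
    """Turn per-frame pupil visibility into click events.

    A deliberate wink is a run of pupil-less frames that is longer
    than an involuntary blink but shorter than a rest. The frame
    thresholds here are illustrative assumptions.
    """

    def __init__(self, min_frames=5, max_frames=30):
        self.min_frames = min_frames  # shorter gaps: involuntary blink
        self.max_frames = max_frames  # longer gaps: eyes simply closed
        self.gap = 0                  # current run of pupil-less frames

    def update(self, pupil_visible):
        """Feed one frame; return True when a click should fire."""
        if pupil_visible:
            click = self.min_frames <= self.gap <= self.max_frames
            self.gap = 0
            return click
        self.gap += 1
        return False
```

In a real device the per-frame visibility flag would come from detecting the pupil in the infrared camera image; the state machine above only shows how a wink could be told apart from ordinary blinking.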
Although vision-tracking technology is nothing new for people with neuromuscular conditions such as Lou Gehrig's disease, cerebral palsy or spinal cord injuries, what sets eyeCan apart is that it is a do-it-yourself (DIY) kit with open source code that costs no more than US$100.
The kit includes a pair of glasses, a micro web camera, infrared LEDs and copper wires, along with free software to be installed on a PC or laptop. In that sense, it is a life-changing tool for patients who cannot afford commercial assistive equipment, which easily costs over $10,000.
The original idea stemmed from Mick Ebeling, founder of the Not Impossible Foundation (http://notimpossiblefoundation.org), an international creative think tank for social causes. Ebeling first came up with the DIY concept to help his friend Tony "Tempt" Quan, an L.A.-based graffiti artist, continue his work after he was diagnosed with ALS in 2003.
Last April, Quan wrote the following note using the assistive eyewear, which Ebeling showed during his presentation at a TED conference.
That was the first time I've drawn
anything in 7 years. I feel like I had
been held underwater, and someone finally
reached down, and pulled my head up
so I could take a breath.
Inspired by the TED Talk, the team of Samsung engineers tweaked the code released by Ebeling's foundation and came up with a similar device, eyeCan, which works more like a mouse: the user can click, double-click, and drag and drop.
The engineering work at the lab was easy and exciting, said Yu Kyung-hwa, the Samsung team's manager, but applying the device to patients at the bedside was the tough part, because it had to be customized for each of them.
For instance, the distance from the user to the monitor and the angle of each patient's gaze differed, and because many patients were hooked up to several pieces of medical equipment, making adjustments was harder.
Unexpected interference appeared, too. In one case, strong sunlight in the room interfered with the camera's infrared lighting; in another, an outdated PC introduced latency that kept the device from working smoothly.
For more than seven months, the team visited about 10 patients, and each patient took several visits of trial and error. Even when they failed, the families showed them optimism and gratitude.
"The first thing (the patients) communicate to the family and to us was love and thank you, and I realized technology can be something warm in a way that it connects people," Yu said.
Most recently, a remote eye-tracking web camera connected to a TV was introduced by the Daejeon-based Electronics and Telecommunications Research Institute (ETRI). Without wearing eyeglasses or any other gadget, people can change TV channels, play games or search the web using only their eyes.
Vision-tracking technology has already appeared at international IT shows such as IFA, but those systems were made for PCs or worked only at short range. ETRI's camera, which looks like a set-top box with infrared LEDs on each side, traces pupil movement at distances of up to 3 meters (nearly 10 feet).
Some of the global leaders in the field include Sweden's Tobii, EyeTech of the U.S. and Germany's SensoMotoric Instruments (SMI).
A virtual Korean keyboard to be used with the eye-tracking assistive device was introduced by the Electronics and Telecommunications Research Institute (ETRI) (Courtesy of ETRI)
ETRI's technology uses multiple camera lenses to improve precision in tracking the coordinates of the viewer's gaze point, even when the viewer is in motion. It also relies on gaze duration rather than blinking, which tends to have a higher error rate, according to Cha Ji-hun, head of the convergence media team at ETRI.
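Gaze-duration selection of the kind Cha describes is commonly implemented as "dwell clicking": a selection fires only when the gaze point stays inside a small region for long enough. The sketch below illustrates that idea; the class name, radius and dwell-time values are assumptions for illustration, not details of ETRI's system.

```python
import math


class DwellSelector:
    """Fire a selection when the gaze stays within a small radius
    for a minimum dwell time (radius/time values are illustrative).
    """

    def __init__(self, radius=30.0, dwell_time=0.8):
        self.radius = radius          # pixels of allowed gaze jitter
        self.dwell_time = dwell_time  # seconds of steady gaze needed
        self.anchor = None            # where the current dwell started
        self.start = None             # when the current dwell started

    def update(self, x, y, t):
        """Feed one gaze sample (pixels, seconds); return the selected
        point when a dwell completes, else None."""
        if self.anchor is None or math.dist((x, y), self.anchor) > self.radius:
            # Gaze moved away: restart the dwell at the new point.
            self.anchor, self.start = (x, y), t
            return None
        if t - self.start >= self.dwell_time:
            selected = self.anchor
            self.anchor = self.start = None  # require a fresh dwell next
            return selected
        return None
```

Because small involuntary eye movements are tolerated inside the radius, dwell selection avoids the misfires that blink-based clicking can produce, at the cost of a short fixed delay per selection.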
Although it is not ready for commercial rollout, this advance could help improve DIY assistive devices like eyeCan. Since July, the government and non-profit organizations have taken over the eyeCan project, and about 70 patients are currently using the eyewear after months of training. Some of the most successful users can do simple web surfing, play games like Angry Birds and read e-books, said Nam Se-hyun, a manager at the Korea Disabled People's Development Institute.
There are roughly 30,000 people suffering from various neuromuscular conditions in Korea, Nam said. Meanwhile, designDIVE, the project behind eyeCan, has gathered about 100 creative minds to fix bugs, improve the design, better distribute the device and educate users, and fund DIY assistive technology.