SPOTLIGHT

IMMEDIATE RELEASE
February 14, 2014
CONTACT: K.E. Schwab
724.738.2199
karl.schwab@sru.edu

SRU researchers 'thinking to make it so'

SLIPPERY ROCK, Pa. - "Thinking will make it so" is the philosophy behind Slippery Rock University Computer Science Professor Sam Thangiah and his two student researchers as they work to use human brainwaves to control a computer robot.

The three researchers are using a $5,000 grant from the University's joint Faculty/Student Research Grants Program to support their "A Brain-Computer Interface to Control a Remote Mobile Robot" research project, which is already seeing positive results.

Thangiah, who joined the SRU faculty in 1991, said that for this project, "I am mostly following my students' lead. They were interested in the project and I tied it to the return of U.S. veterans with loss of limbs. They were looking for a solution to make things easier, and this project is right in line. They are frequently asking such questions as 'Can we add this?' to the robot's repertoire of doable tasks."

Stephen Bierly, from Apollo, and Jordan Schiller, a post-baccalaureate international student from Canada, are the SRU computer science majors working on the project.

Thangiah explained the project, saying, "We are developing ways for a human to think of a function the robot can intelligently accomplish, have a sensor pick up the brainwaves, and 'tell' the robot to then complete the task."

At this point in their work, the trio is first defining what kinds of tasks the robot can actually do, and setting the internal commands or instructions for accomplishing the task.

The SRU computer science department has purchased a brain-computer interface headset for use in undergraduate research and in advanced computer classes. The headset consists of 14 electrodes that can detect conscious thoughts, such as 'move left' or 'move right'; emotions, such as 'excitement' or 'boredom'; and facial expressions.
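
The article does not describe the headset's software interface. As a rough illustration only, the sketch below (in Python, using a made-up HeadsetEvent structure rather than any vendor's SDK) shows how those three kinds of detections might be grouped and filtered so that only deliberate thoughts are treated as robot commands:

# Minimal sketch: grouping headset detections into the three categories the
# article describes (conscious thoughts, emotions, facial expressions).
# The HeadsetEvent structure and the 0.8 cutoff are illustrative assumptions,
# not part of any vendor API or of the SRU project.
from dataclasses import dataclass
from enum import Enum, auto

class Category(Enum):
    THOUGHT = auto()      # e.g. 'move left', 'move right'
    EMOTION = auto()      # e.g. 'excitement', 'boredom'
    EXPRESSION = auto()   # e.g. a blink or a smile

@dataclass
class HeadsetEvent:
    category: Category
    label: str            # the detected thought, emotion or expression
    confidence: float     # 0.0 to 1.0, as reported by the device

def is_robot_command(event: HeadsetEvent) -> bool:
    """Treat only deliberate, high-confidence thoughts as robot commands."""
    return event.category is Category.THOUGHT and event.confidence >= 0.8

print(is_robot_command(HeadsetEvent(Category.THOUGHT, "move left", 0.9)))  # True
print(is_robot_command(HeadsetEvent(Category.EMOTION, "boredom", 0.9)))    # False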

"The research project is working to develop software that will interface the brain-computer headset to the Hercules Mobile Robot in order to move the Hercules robot with a user's neural signal," Thangiah said. "The user will be able to direct the movements of the mobile robot with his/her conscious thoughts. The potential applications of this research, using neural signals to control objects remotely, vary from helping paralytic people control objects in their environment to using the thoughts of artists to control a device capable of printing or composing music."

"We can have the robot scan a room and locate pre-determined objects and points of reference such as where a chair is located, a bookshelf, a table and individual objects like a ball or a box, a specific toy, book or that kind of thing. The operator could use a computer screen to 'think' of pointing to an object and set the robot in motion to complete a pre-set series of motions," Thangiah said.

As an example, he said, the user could think of moving the cursor to point to a ball and then "bring to me." The robot, receiving the signal and already having scanned the room for objects, would locate the ball, use its preset instructions to move to the ball, pick it up and return it to the user.
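
A minimal sketch of that fetch sequence, assuming hypothetical robot calls (navigate_to, pick_up, hand_over) because the article does not describe the actual Hercules command interface:

# Sketch of the 'bring to me' sequence described above. The robot calls and the
# object coordinates are placeholders, not the project's real interface or data.
scanned_objects = {"ball": (3, 4), "box": (7, 1), "book": (2, 6)}  # from the room scan
user_position = (0, 0)

def navigate_to(point):                   # stand-in for the routing step
    print(f"navigating to {point}")

def pick_up(name) -> bool:                # stand-in for the grasp step
    print(f"picking up {name}")
    return True

def hand_over(name):                      # stand-in for delivering the object
    print(f"handing over {name}")

def bring_to_me(target: str) -> bool:
    """Run the preset fetch sequence for an object found during the room scan."""
    if target not in scanned_objects:
        return False                      # the object was never located
    navigate_to(scanned_objects[target])  # move to the object
    if not pick_up(target):
        return False                      # grasp failed; a retry could follow
    navigate_to(user_position)            # return to the user
    hand_over(target)
    return True

bring_to_me("ball")   # the user 'thinks' of the ball: locate, fetch, return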

He said the steps required for the robot to move to the ball, pick it up and return it to the user are based on intelligent routing algorithms and have been implemented for the robot. The user would have to know what the robot was actually capable of doing in terms of tasks.

"For example, it might not be able to retrieve the ball from the top shelf of the bookcase, but it could locate the ball on the floor or on a table, navigate around furniture or other obstacles in the room to the ball's location, pick it up, then navigate back to the user and properly hand or place the ball. The robot would be able to determine on its own, based on intelligent algorithms in how to deal with problems, like not being able to pick up the ball if it moved slightly on the first try," he said.

"I became interested in EEG [electroencephalogram] machines when I first saw someone trying to control a character in a video game with them. It was then that I realized the massive potential that these devices could have. By sheer coincidence I was sitting in the AI lab with my fellow research partner Stephen Bierly and he bought up the topic of reading brainwaves and controlling things with your mind, and I told him that was something I was also interested in and I showed him the EEG machine I had recently discovered on Kickstarter, called "the Emotiv Insight," Schiller said.

"This led to more talking and excitement about the possibilities that such a device would offer. After some brainstorming, we realized we were sitting in the same room as the high robot, Hercules, a device that will accept commands and perform complicated actions. Why couldn't we use that? We contacted Dr. Sam [Thangiah] about the idea of moving the robot using an EEG machine, and soon after we had the funding, the EEG machine, full access to robot, and some working code that began to perform the task of moving the robot forward," he said.

"My role in the project is to build the signal analyzer. I will write and maintain the program that will take the raw data from the EEG and translate and recognize it reliably then hand off the command to the robot controller that Stephen is in charge of," he said

"This means ideally, my signal analyzer will take the electric signals from the EEG and be able to tell if the user is telling the robot to perform 'action X' whenever they want that command to occur, and do so without accidentally performing 'action Y' or doing nothing at all because the user is slightly distracted," he said.

"I feel this project is a real capstone on my education at SRU. It is something that thoroughly interests me and has huge potential in future careers that I will be pursuing. It will also be the foundation for a great education program opportunity for other Slippery Rock University students.

While the project is clearly hard work, Schiller said, "This project is tremendously fun. Seeing the robot move when you think 'move' is an amazing experience. Seeing other people's eyes light up with possibility when they see it move is extremely rewarding. I am very excited for the future of this project and this field in general."

Fellow researcher Bierly said, "I've always had an interest in technology and new applications of that technology. Therefore, when I was given the opportunity to be a part of the research project I jumped at the chance because it is closer to the newer applications of technology than I had ever been."

"I am one of the programmers. We work together to plan and implement the interface between the EEG headset's signal data and the robot. Essentially, we are responsible for making the software that translates the EEG information into commands that are then sent to the robot," he said.

A transfer student, Bierly said, "Overall, the project has definitely helped me expand my view and knowledge as a programmer, and it has caused me to be a more creative programmer. More than likely it has helped me with other computer science courses as well because it gives me regular practice at thinking up solutions, planning things out and programming."

"In a field where people are always vying for an edge, working on this research project will likely give me that something extra that is interesting on my resume," he said. "It is both fun and frustrating at times, and I wouldn't have it any other way. I enjoy puzzles and solving things and the project has presented me with just that, an interesting problem that forces me to think of possible solutions and be more creative. It often provides me with a sense of accomplishment when we break through a particular problem or get something related to the project to work when it was eluding us previously."

Thangiah said a major advantage of the project is that it exposes undergraduates to thought-controlled robots and to the potential expansion of the research into other thought-controlled objects in the environment. Software developed from the project will be integrated into advanced computer classes to allow SRU students to experience and explore new areas of research in computer science and artificial intelligence.

The researchers also hope to publish and present papers on their work.

"We hope the major outcomes of the research are that we can train a Brain-Computer Interface device to send commands that can be understood by the Hercules robot; develop software for transmitting neural commands from the BCI device to the Hercules robot; mentor and involve undergraduate students to scientific research methods; and enhance the scholarly activity of the faculty and students in designing, implementing and testing algorithms for analyzing and transmitting neural signals from a BCI device to a mobile robot," Thangiah said.

Slippery Rock University is Pennsylvania's premier public residential university. It provides students with a comprehensive learning experience that intentionally combines academic instruction with enhanced educational and learning opportunities that make a positive difference in their lives.