SDK overview

This article is a quick overview of the three SDKs (one per headset). Each is accompanied by an illustration created in a test project.

Test projects are written in C# and available for download here (password protected because of Emotiv's licensed libraries):
[[http://projets-labinfo.he-arc.ch/attachments/download/689/HeadsetsTestProjects.zip]]
To get the password, please contact Professor David Grunenwald.

1. NeuroSky development tools summary

There are 4 levels of interfaces:

  1. the ThinkGear Connector (TGC) (Windows and Mac OS X executables)
  2. the ThinkGear Communications Driver (TGCD) (Windows, Windows Mobile, Mac OS X, and J2ME (Symbian) libraries)
  3. the ThinkGear Stream Parser (source code for any C platform)
  4. the MindSet Communications Protocol (specs for any platform with Bluetooth serial I/O)

The highest level interfaces supply executables and binary libraries for some of the most common platforms, like Windows and Mac OS X, while the lower level interfaces provide source code and low level communication stream specs that allow MindSet development on virtually any platform that can receive a Bluetooth serial data stream.

1.1 ThinkGear

ThinkGear is the technology inside every NeuroSky product (including the MindSet), or partner product, that enables a device to interface with the wearer's brainwaves. It includes the sensor that touches the forehead, the contact and reference points located on the ear pad and the onboard chip that processes all of the data. Both the raw brainwaves and the eSense Meters (Attention and Meditation) are calculated on the ThinkGear chip.

1.2 ThinkGear Connector – TGC

The ThinkGear Connector (TGC) is an executable that provides a daemon-like service managing communications with ThinkGear devices, such as the MindSet, that are connected to the computer. The TGC runs continuously in the background and keeps a socket open on the local user's computer, allowing applications to connect to it and receive information from the connected ThinkGear devices. This means that any application, in any language that can open and read from sockets (such as Flash's ActionScript 3 and other scripting languages), can connect to and receive data from MindSet headsets.
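Because the TGC is just a local socket server, a few lines of C# suffice to read from it. The sketch below assumes the TGC's documented default local TCP port 13854 and its JSON output format; adjust both if your TGC configuration differs, and note that it only runs with the TGC daemon and a paired headset present:

```csharp
using System;
using System.IO;
using System.Net.Sockets;
using System.Text;

class TgcClient
{
    static void Main()
    {
        // The TGC daemon listens locally; 13854 is its documented default port.
        using (var client = new TcpClient("127.0.0.1", 13854))
        using (var stream = client.GetStream())
        {
            // Ask the daemon for JSON-formatted output without raw samples.
            byte[] cfg = Encoding.ASCII.GetBytes(
                "{\"enableRawOutput\": false, \"format\": \"Json\"}");
            stream.Write(cfg, 0, cfg.Length);

            // Each line is one JSON object, e.g. {"eSense":{"attention":57,...},...}
            using (var reader = new StreamReader(stream, Encoding.ASCII))
                for (int i = 0; i < 10; i++)
                    Console.WriteLine(reader.ReadLine());
        }
    }
}
```

This is the route to take from languages that have sockets but no way to load the native driver.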

1.3 ThinkGear Communications Driver – TGCD

The ThinkGear Communications Driver (TGCD) is a device driver with a simple API that allows communication between an application on a computer (or mobile device) and a ThinkGear chip/module/headset. It is available as a .dll (for x86 or ARMV4I platforms), as a .bundle (for Mac OS X platforms), or as a .java library (for J2ME/Symbian platforms).
These provide wrappers (class files) which use the “thinkgear.dll” library. Languages available: C/C++, C# and Java.
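From C#, the .dll can also be reached directly through P/Invoke. A minimal sketch follows; the function names follow NeuroSky's TGCD documentation, while the COM port and baud rate are placeholders for whatever your Bluetooth pairing assigned. It can only run with thinkgear.dll and a headset present:

```csharp
using System;
using System.Runtime.InteropServices;

class ThinkGearDemo
{
    // P/Invoke signatures for thinkgear.dll, per NeuroSky's TGCD documentation.
    [DllImport("thinkgear.dll")] static extern int TG_GetNewConnectionId();
    [DllImport("thinkgear.dll")] static extern int TG_Connect(int id,
        string serialPortName, int baudRate, int dataFormat);
    [DllImport("thinkgear.dll")] static extern int TG_ReadPackets(int id, int n);
    [DllImport("thinkgear.dll")] static extern float TG_GetValue(int id, int type);
    [DllImport("thinkgear.dll")] static extern void TG_FreeConnection(int id);

    const int DATA_ATTENTION = 2;     // same constant table as section 2.1
    const int TG_STREAM_PACKETS = 0;  // normal ThinkGear packet stream format

    static void Main()
    {
        int id = TG_GetNewConnectionId();
        // "COM4" is a placeholder for the port assigned by the Bluetooth pairing.
        if (TG_Connect(id, "\\\\.\\COM4", 9600, TG_STREAM_PACKETS) < 0)
            throw new Exception("Could not connect to the MindSet");

        for (int i = 0; i < 100; i++)
            if (TG_ReadPackets(id, 1) == 1)   // one full packet parsed
                Console.WriteLine("Attention: " + TG_GetValue(id, DATA_ATTENTION));

        TG_FreeConnection(id);
    }
}
```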

1.4 ThinkGear Stream Parser

For all other platforms not covered by the TGC or the TGCD, it is up to the application to open the serial I/O communication channel (COM port or direct serial UART). Refer to the platform's documentation for "serial I/O" or "UART" on how to open and read from such a channel, as serial I/O APIs tend to be platform-specific. Once the communication channel is open and bytes can be read, NeuroSky provides a packet parsing library for parsing and decoding the incoming data bytes. The library comes in the form of ANSI C source code.
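The framing itself is simple enough to parse by hand. As a C# sketch of what such a parser does (framing per the MindSet Communications Protocol: two 0xAA sync bytes, a payload length of at most 169, the payload, then a checksum equal to the bitwise inverse of the low byte of the payload sum):

```csharp
using System;
using System.Collections.Generic;

static class MindSetParser
{
    // Extracts one payload from a raw byte stream using the documented framing:
    // [0xAA][0xAA][PLENGTH][PAYLOAD...][CHKSUM]. Returns null on a bad frame.
    public static byte[] ReadPacket(Queue<byte> stream)
    {
        // Skip garbage until the first 0xAA sync byte (consumed here).
        while (stream.Count > 0 && stream.Dequeue() != 0xAA) { }
        // Second sync byte, then the payload length (max 169 per the spec).
        if (stream.Count < 2 || stream.Dequeue() != 0xAA) return null;
        int len = stream.Dequeue();
        if (len > 169 || stream.Count < len + 1) return null;

        byte[] payload = new byte[len];
        int sum = 0;
        for (int i = 0; i < len; i++)
        {
            payload[i] = stream.Dequeue();
            sum += payload[i];
        }
        // Checksum is the ones' complement of the low byte of the payload sum.
        byte checksum = stream.Dequeue();
        return ((~sum) & 0xFF) == checksum ? payload : null;
    }
}
```

For example, a payload row of [0x04][value] carries the Attention eSense, so the six bytes 0xAA 0xAA 0x02 0x04 0x35 0xC6 decode to Attention = 53.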

2. NeuroSky development tools test

The test is performed on Windows 7, 32-bit version. It first attempts a simple connection to the headset and tries to retrieve the attention data using the TGCD. The language used is C#. Following the basic indications given in the documentation, all the principal data values can be retrieved quickly.

Here are screenshots of a demo application written in C# .NET. It displays the alpha, beta, gamma and theta waves, as well as the two eSense measurements, Attention and Meditation.

Notice in this first illustration how the two eSense measurements produce a very nice graph while the other brainwaves seem very low. This seems to confirm what the HUG people told us: the alpha and beta waves are hard to measure on the forehead. Notice also the “Poor signal” graph, which shows very few poor-signal periods, and how this influences the other graphs: a poor signal flattens the eSense graphs while agitating the brainwave graphs.

This second illustration shows the direct impact of a poor signal: it makes the eSense signals nil, while alpha, beta, gamma and theta go haywire.

Finally, this last illustration shows a situation which occurs very often with the NeuroSky device: the signal is poor and you don’t know why. Neither the eSense graphs nor the others look very convincing. It is hard to tell which of these illustrations is the most truthful.

Questions have been submitted to NeuroSky but have not been answered to this day.

It takes a few retrievals to get a value for Attention and Meditation, probably the time it takes the onboard chip to calculate these values.

2.1 Getters

Raw data, alpha 1 and 2, beta 1 and 2, gamma 1 and 2, delta, theta, eSense Attention, eSense Meditation, battery power level and poor signal.

  • public const int DATA_BATTERY = 0;
  • public const int DATA_POOR_SIGNAL = 1;
  • public const int DATA_ATTENTION = 2;
  • public const int DATA_MEDITATION = 3;
  • public const int DATA_RAW = 4;
  • public const int DATA_DELTA = 5;
  • public const int DATA_THETA = 6;
  • public const int DATA_ALPHA1 = 7;
  • public const int DATA_ALPHA2 = 8;
  • public const int DATA_BETA1 = 9;
  • public const int DATA_BETA2 = 10;
  • public const int DATA_GAMMA1 = 11;
  • public const int DATA_GAMMA2 = 12;

3. Emotiv Development tools summary

Our research edition comes with an API which gives access to all the signals presented in the Control Panel. Expressive, affective and cognitive data can be retrieved with great ease.

NB: the Emotiv Control Panel which is delivered with the SDK now works with Windows 7. If it doesn't, be sure to check for available updates.

The Emotiv website was updated at the beginning of this year, and several new downloads and pieces of information are available. Emotiv has published a new SDK for raw data acquisition; this SDK is called “Research Edition” and can be purchased for 250 USD. For the moment we have no particular interest in using the raw data, but this might become useful soon.

Amongst the downloads are several items which we will be testing in this development study.

NeuroVault:

“With NeuroVault, you will be able to record and playback your Emotiv Headset data along with fully synchronized audio and video”
The idea is to fix a small webcam to the headset in order to film what the wearer is actually doing; this will let us test whether the filmed events can explain the changes observed in the data.

NeuroKey:

“With NeuroKey and the Emotiv Headset, you will be able to compose email, or use other applications (with the official release), without a keyboard.”
This application has not worked yet: it seems impossible to connect the headset to it automatically, as stipulated in the documentation.
Further tests should be done.

Cortex Arcade:

A series of games which can be played using facial expressions or cognitive signals. Tetris and the classic PONG game are the most elaborate; the Star Wars ship game didn’t work. A good way to practice one’s abilities in a different context than the Control Panel.

EPOCDemo:

This demo is the one shown in several YouTube videos. The player can move around a virtual world, quickly learns to clutch objects and to use the lift/pull cognitive joysticks, and follows a spirit which guides him through very basic and quick exercises.

Mind Photo Viewer:

This software enables the user to navigate through pictures and rotate them using the joysticks trained in the Control Panel.

4. Emotiv Development tools test

Here is a demo developed in C#:

In this illustration, affective values are being retrieved from the headset. For an unknown reason, “Meditation” and “Frustration” wouldn’t give any results during the test; it may be an initialisation problem. Apart from that, everything else works fine.
This demonstration application connects directly to the Emotiv Control Panel.
Regarding the meditation and frustration scores, the Emotiv helpdesk says: “The most likely cause is a noisy signal, which will force these outputs to 0. The noise detections are different and more stringent for frustration and meditation versus the other states.”

4.1 Getters

An API is delivered with the software. It is essentially an engine which connects to the EPOC device.
You can either create a new instance of the engine or attach to the running Control Panel, like the other demo software does. Using this API you can access basically the same functions as those present in the Control Panel: the Expressive, Affective and Cognitive suites. It also includes multiple-user support and learning (calibration) options.

You can retrieve Affective suite values, Expression suite values and Cognitive values. All can be calibrated in a user profile.
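As a sketch of the connection flow from C#, using the class and method names of the .NET wrapper bundled with our SDK version (treat the exact names, and the Control Panel's default remote port 3008, as assumptions if your version differs; this only runs with the Emotiv libraries and Control Panel present):

```csharp
using System;
using Emotiv;   // the C# wrapper (DotNetEmotivSDK) shipped with the SDK

class AffectivDemo
{
    static void Main()
    {
        EmoEngine engine = EmoEngine.Instance;

        // Called whenever the engine publishes a new EmoState.
        engine.EmoStateUpdated += delegate(object sender, EmoStateUpdatedEventArgs e)
        {
            EmoState es = e.emoState;
            Console.WriteLine("Excitement: "
                + es.AffectivGetExcitementShortTermScore());
        };

        // Attach to the running Control Panel (default remote port 3008)
        // instead of creating a private engine instance.
        engine.RemoteConnect("127.0.0.1", 3008);

        while (true)
            engine.ProcessEvents(1000);   // pump pending engine events
    }
}
```

Attaching to the Control Panel rather than starting a private engine means the demo benefits from the user profile already loaded there.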

Expression suite
  • EXP_NEUTRAL = 0x0001,
  • EXP_BLINK = 0x0002,
  • EXP_WINK_LEFT = 0x0004,
  • EXP_WINK_RIGHT = 0x0008,
  • EXP_HORIEYE = 0x0010,
  • EXP_EYEBROW = 0x0020,
  • EXP_FURROW = 0x0040,
  • EXP_SMILE = 0x0080,
  • EXP_CLENCH = 0x0100,
  • EXP_LAUGH = 0x0200,
  • EXP_SMIRK_LEFT = 0x0400,
  • EXP_SMIRK_RIGHT = 0x0800
Affective suite
  • AFF_EXCITEMENT = 0x0001,
  • AFF_MEDITATION = 0x0002,
  • AFF_FRUSTRATION = 0x0004,
  • AFF_ENGAGEMENT_BOREDOM = 0x0008
Cognitiv suite
  • COG_NEUTRAL = 0x0001,
  • COG_PUSH = 0x0002,
  • COG_PULL = 0x0004,
  • COG_LIFT = 0x0008,
  • COG_DROP = 0x0010,
  • COG_LEFT = 0x0020,
  • COG_RIGHT = 0x0040,
  • COG_ROTATE_LEFT = 0x0080,
  • COG_ROTATE_RIGHT = 0x0100,
  • COG_ROTATE_CLOCKWISE = 0x0200,
  • COG_ROTATE_COUNTER_CLOCKWISE = 0x0400,
  • COG_ROTATE_FORWARDS = 0x0800,
  • COG_ROTATE_REVERSE = 0x1000,
  • COG_DISAPPEAR = 0x2000
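Since all these identifiers are powers of two, several detections can be packed into one integer and tested with a bitwise AND. A small illustrative sketch (enum members transcribed, abbreviated, from the Expression suite list above):

```csharp
using System;

[Flags]
enum Expression
{
    EXP_NEUTRAL    = 0x0001,
    EXP_BLINK      = 0x0002,
    EXP_WINK_LEFT  = 0x0004,
    EXP_WINK_RIGHT = 0x0008,
    EXP_SMILE      = 0x0080,
    EXP_CLENCH     = 0x0100
}

class FlagDemo
{
    static void Main()
    {
        // Suppose the engine reports a blink together with a smile:
        Expression active = Expression.EXP_BLINK | Expression.EXP_SMILE;

        Console.WriteLine((active & Expression.EXP_BLINK) != 0);   // True
        Console.WriteLine((active & Expression.EXP_CLENCH) != 0);  // False
    }
}
```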

5. Emotiv Research Edition & Neurovault

Emotiv has just launched a new version of Neurovault, which appears to be more stable than the last. The Neurovault application is basically a logging program.

It can record video, sound and all EPOC headset values.

It saves the data in several files. The video is obviously in the .avi file; the .eeg file seems to be binary (so not immediately readable); and the .neuro file is probably the one which links all the files together. Last but not least, the .csv file contains all the recorded EPOC values.

It looks like this. Notice that the values of each electrode are accessible!
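Since the .csv file is plain text, the samples are easy to load from C#. A minimal sketch, assuming (based on the screenshot) a header row naming each column followed by one sample per row:

```csharp
using System;
using System.Globalization;
using System.Linq;

static class NeurovaultCsv
{
    // Parses Neurovault-style CSV text: a header row naming each column
    // (the exact channel names are an assumption), then one sample per row.
    public static double[][] Parse(string[] lines)
    {
        return lines
            .Skip(1)   // drop the header row
            .Select(line => line.Split(',')
                .Select(f => double.Parse(f, CultureInfo.InvariantCulture))
                .ToArray())
            .ToArray();
    }
}
```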

The Research Edition of the SDK is an update of the basic development SDK. It adds access to the raw data of each electrode. It comes with a test bench application which demonstrates the possibilities offered by the SDK. Here are a few screenshots which give a global view of these possibilities.




6. NIA (Brainfingers) Development tools summary

The Brainfingers software must be running for the SDK to work (the two communicate via shared memory). The software itself needs to be activated using a generated key which is computed directly on a physical machine. This tight coupling between hardware and key has caused many problems so far. Since the key is valid for only one computer, we decided to install the software in a virtual machine. This setup seems to cause unpredictable hardware changes, and therefore key changes. For the moment, Andrew Junker has given us a 365-day extension code.

Once the software is launched, you need to activate the shared memory.

To do this, go to the “Access” tab and create an empty profile (or a specific one if needed). Go to the pre-launch window and click on the bottom-left side of the window. This will make a checkbox called “Shared Memory” appear. Check the box and restart the software.

If you do not activate the shared memory, the example which is delivered with Brainfingers will generate the following error.

7. NIA (Brainfingers) Development tools test

7.1 Getters

The getters are listed in the SDK example which comes with the Brainfingers software.
Direct data input
  • CLB_INPUT = 0;
  • CLB_EOG = 1;
  • CLB_EEG = 2;
  • CLB_EMG = 3;
  • CLB_GLANCE_MAG_CM = 4;
  • CLB_GLANCE_DIR_CM = 5;
  • CLB_ALPHA1_CM = 6;
  • CLB_ALPHA2_CM = 7;
  • CLB_ALPHA3_CM = 8;
  • CLB_BETA1_CM = 9;
  • CLB_BETA2_CM = 10;
  • CLB_BETA3_CM = 11;
  • CLB_MUSCLE_CM = 12;
Joystick data input
  • CLB_GLANCE_MAG_JS = 13;
  • CLB_GLANCE_DIR_JS = 14;
  • CLB_ALPHA1_JS = 15;
  • CLB_ALPHA2_JS = 16;
  • CLB_ALPHA3_JS = 17;
  • CLB_BETA1_JS = 18;
  • CLB_BETA2_JS = 19;
  • CLB_BETA3_JS = 20;
  • CLB_MUSCLE_JS = 21;
  • CLB_GLANCE_MAG_SW = 22;
  • CLB_ALPHA1_SW = 23;
  • CLB_ALPHA2_SW = 24;
  • CLB_ALPHA3_SW = 25;
  • CLB_BETA1_SW = 26;
  • CLB_BETA2_SW = 27;
  • CLB_BETA3_SW = 28;
  • CLB_MUSCLE_SW = 29;
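Reading these slots from the shared memory could look like the following C# sketch. The memory name "BrainfingersData" and the one-32-bit-float-per-slot layout are placeholders; the real name and layout are defined in the SDK example delivered with the software, and the code only runs while Brainfingers is active with shared memory enabled:

```csharp
using System;
using System.IO.MemoryMappedFiles;

class BrainfingersReader
{
    static void Main()
    {
        // "BrainfingersData" is a placeholder name; use the name from the
        // SDK example shipped with the Brainfingers software.
        using (var mmf = MemoryMappedFile.OpenExisting("BrainfingersData"))
        using (var view = mmf.CreateViewAccessor())
        {
            const int CLB_EEG = 2;   // index from the getter list above
            // Assumes one 32-bit float per slot, read by byte offset.
            float eeg = view.ReadSingle(CLB_EEG * sizeof(float));
            Console.WriteLine("EEG: " + eeg);
        }
    }
}
```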

02_Neurosky.JPG (94.7 kB) Paul Maire, 07.02.2011 13:13

03_Brainfingers.JPG (30.2 kB) Paul Maire, 07.02.2011 13:13

03_Emotiv_Neurovault.JPG (68.5 kB) Paul Maire, 07.02.2011 13:13

03_Neurosky.JPG (92.3 kB) Paul Maire, 07.02.2011 13:13

01_Emotiv.JPG (54.7 kB) Paul Maire, 07.02.2011 13:13

01_Neurosky.JPG (70.2 kB) Paul Maire, 07.02.2011 13:13

02_Brainfingers.JPG (86 kB) Paul Maire, 07.02.2011 13:13

04_Brainfingers.JPG (48.5 kB) Paul Maire, 07.02.2011 13:13

02_Emotiv_Neurovault.JPG (67.4 kB) Paul Maire, 07.02.2011 13:13

08_Emotiv.JPG (60.4 kB) Paul Maire, 07.02.2011 13:14

09_Emotiv.JPG (72.4 kB) Paul Maire, 07.02.2011 13:14

04_Emotiv_Neurovault.JPG (23.6 kB) Paul Maire, 07.02.2011 13:14

04_Neurosky.JPG (87.8 kB) Paul Maire, 07.02.2011 13:14

05_Emotiv_Neurovault.JPG (166.3 kB) Paul Maire, 07.02.2011 13:14

06_Emotiv.JPG (88.5 kB) Paul Maire, 07.02.2011 13:14

07_Emotiv.JPG (76.1 kB) Paul Maire, 07.02.2011 13:14

01_Brainfingers.JPG (27.5 kB) Paul Maire, 07.02.2011 13:34
