Data from Kinect

Mar 4, 2011 at 5:13 PM
Edited Mar 4, 2011 at 5:28 PM

Good morning,

I’d like to ask you a question:

I’m not interested in watching the video (depth image, skeleton, …), but I would like to write to a file the whole data stream that the Kinect supplies to OpenNI.

Any advice?

Coordinator
Mar 4, 2011 at 6:28 PM

Hi,

 

You can write an output plugin for monoMIG that records received input to a file.

Then you could also write an input plugin for monoMIG that plays back a recorded session from that file.

That is already on my to-do list for monoMIG, but I can't say when I'll be able to work on it.

If you are interested in doing that with monoMIG, I'll be glad to give you all the information you need to write such plugins.

Writing a plugin for monoMIG is a very easy task.

 

 

Regards,

Gene.

Mar 5, 2011 at 12:49 PM

Thank you very much,

On Monday I'll try to write my first plugin for monoMIG.

I think I'll need your help.

Have a nice weekend,

David.

Mar 7, 2011 at 12:06 PM

Good morning,

I have problems with monoMIG (NetMono/WinFormsTest). When I run it, this error occurs:

Mig Service Error!= (retrying in 5 seconds),

Any advice?

 

Coordinator
Mar 7, 2011 at 12:11 PM
candidodavide wrote:

Good morning,

I have problems with monoMIG (NetMono/WinFormsTest). When I run it, this error occurs:

Mig Service Error!= (retrying in 5 seconds),

Any advice?

 

You have to run the MIG.exe service first. It is located in the folder mono-mig\MultiInputGateway\MIG\bin\Release

Are you using Windows or Linux? Future releases will have an installer.

 

G.

Mar 7, 2011 at 12:31 PM

I am using Windows 7 32-bit.

I’m doing research on gesture recognition.

First, I’d like to observe the data that the Kinect is able to capture and provide to the PC when my hand performs different gestures.

Could monoMIG be helpful even if I don’t connect a Wii remote to my PC?

(I actually have a Wii remote, but I would like to use just the Kinect.)

Thanks again for your answers.

Coordinator
Mar 7, 2011 at 12:42 PM
candidodavide wrote:

I am using Windows 7 32-bit.

I’m doing research on gesture recognition.

First, I’d like to observe the data that the Kinect is able to capture and provide to the PC when my hand performs different gestures.

Could monoMIG be helpful even if I don’t connect a Wii remote to my PC?

(I actually have a Wii remote, but I would like to use just the Kinect.)

Thanks again for your answers.

Yes, the Wii remote is not required. You can even remove it from the configuration file (config.xml).
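For example, the input plugin section of config.xml might look like this sketch (the element names here, including `WiiRemote`, are assumptions; check the exact ones in your own config.xml):

```xml
<InputPlugins>
  <NiteKinect></NiteKinect>
  <!-- the Wii remote entry can simply be deleted or commented out:
  <WiiRemote></WiiRemote>
  -->
</InputPlugins>
```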

In order to use the Kinect you have to install OpenNI + NITE.

Follow the instructions described here:

http://www.codeproject.com/Articles/148251/How-to-Successfully-Install-Kinect-on-Windows-Open.aspx?msg=3777884

 

 

G.

Mar 8, 2011 at 10:55 AM
Edited Mar 8, 2011 at 10:56 AM

Judging by your names, I think you are Italian; so, if you don't mind, I'll write in Italian.

I'm also interested in understanding which data are extracted from the Kinect, and in particular I'd like to understand how the hand is "detected".

In other words, how does the Kinect infer the position of the hand? Using OpenNI, is it possible to obtain information (velocity, acceleration, depth, etc.) for each node of the skeleton?

Moreover, I would like to save that data using the OpenNI libraries.

Thanks in advance for any reply.

Mar 8, 2011 at 1:39 PM

Hi AndrewAdmin, I'm asking myself the same questions. As soon as I find an answer, I promise to post everything here.

As for the language, writing in Italian is fine with me. Anyway, while waiting for Generoso to give the "authorization" to write in Italian, I'll post my question in English:

 

Dear Generoso,

As you suggested, I'd like to write an output plugin for monoMIG that records received input to a file.

Can you give me any advice on how to get started?

Thank you.

Coordinator
Mar 8, 2011 at 2:57 PM
AndrewAdmin wrote:

Judging by your names, I think you are Italian; so, if you don't mind, I'll write in Italian.

I'm also interested in understanding which data are extracted from the Kinect, and in particular I'd like to understand how the hand is "detected".

In other words, how does the Kinect infer the position of the hand? Using OpenNI, is it possible to obtain information (velocity, acceleration, depth, etc.) for each node of the skeleton?

Moreover, I would like to save that data using the OpenNI libraries.

Thanks in advance for any reply.

The hand is recognized by the NITE middleware software: http://www.primesense.com/?p=515

monoMIG uses the NITE middleware to perform hand recognition and user skeleton tracking.

For lower-level programming of the Kinect sensor, I suggest you both have a look at PrimeSense OpenNI first. Perhaps it would fit your needs better than monoMIG.

 

Regards,

G.

Coordinator
Mar 8, 2011 at 3:33 PM
candidodavide wrote:

Hi AndrewAdmin, I'm asking myself the same questions. As soon as I find an answer, I promise to post everything here.

As for the language, writing in Italian is fine with me. Anyway, while waiting for Generoso to give the "authorization" to write in Italian, I'll post my question in English:

 

Dear Generoso,

As you suggested, I'd like to write an output plugin for monoMIG that records received input to a file.

Can you give me any advice on how to get started?

Thank you.

I'll reply to you later tonight with some code snippets and an explanation.

=)

G.

Mar 8, 2011 at 4:05 PM

Wow, now I understand why your nick is "Generoso",

you're very very kind.

:-)

Coordinator
Mar 8, 2011 at 10:13 PM
Edited Mar 8, 2011 at 10:14 PM

Ciao Davide,

have you succeeded in running the WinFormsTest application or the MIRIAWeb example? =)

I'll give you some information about how the monoMIG Kinect input plugin works.

Before starting to track hands or user skeletons, the Kinect input plugin waits for the "Wave Gesture" (to start hand tracking) or the "Calibration Pose" (to start user skeleton tracking).

Once the startup gesture is detected, the input plugin starts sending hand/skeleton data (the 3D position of the hand or of the skeleton joints) to the output plugins.

For instance, the "Silverlight" output plugin receives the Kinect data and sends it to clients through a TCP connection using a very minimal and simple protocol.

In order to build an output plugin follow these steps:

- Add a new class library project to the OutputPlugins folder of the monoMIG solution
- Give it a proper name, e.g. InputRecorder, and define the namespace to be MIG.Plugins.Output.InputRecorder
- Add a class named Plugin.cs to the InputRecorder project folder (delete MyClass1.cs)
- Plugin.cs will implement the IOutputPlugin interface so that it can be properly loaded and controlled by monoMIG

The resulting class should be something like this:

 

using System;
using System.Xml;

using MIG;
using MIG.Plugins.Input.NiteKinect;

namespace MIG.Plugins.Output.InputRecorder
{
    public class Plugin : IOutputPlugin
    {
        private string _identifier = "migrecorder";
        private const string _pluginname = "MIG Input Record and Playback";
        private MIGHost _host = null;

        public Plugin ()
        {
        }

        public string Identifier
        {
            get { return _identifier; }
        }

        public string Name
        {
            get { return _pluginname; }
        }
       
        public void SetHost(MIGHost mighost)
        {
            _host = mighost;
        }
       
        public bool SetConfig(XmlElement xmlconfig)
        {
            // TODO: If any configuration parameter is planned for this plugin add code here
            return true;
        }

        // Accept input data only from the NiteKinect input plugin
        public bool CanHandleInputType(Type plugintype)
        {
            bool canreceive = false;
            if (plugintype == typeof(MIG.Plugins.Input.NiteKinect.Plugin))
            {
                canreceive = true;
            }
            return canreceive;
        }
       
        public void InputMessageHandle(IPluginData msgdata)
        {
            if (msgdata.GetType() == typeof( NiteKinectUserEventData ) || msgdata.GetType() == typeof( NiteKinectHandGestureEventData ))
            {
                // TODO: Insert input recording logic here 
            }
        }
    }
}


- Build the project and copy the newly generated InputRecorder.dll to the OutputPlugins folder of the monoMIG application
- Add an output plugin config section to monoMIG's config.xml so that monoMIG loads the new plugin:
      <OutputPlugins>
        ...
        <InputRecorder></InputRecorder>
        ...
      </OutputPlugins>
- Run monoMIG and check that the new plugin is loaded correctly

This will let you receive input data as NiteKinectUserEventData and NiteKinectHandGestureEventData objects, which you can handle in the InputMessageHandle function; monoMIG calls this function automatically to route data from input plugins to output plugins.
Have a look at the MultiInputGateway/OutputPlugins/Silverlight/Silverlight/Handlers/NiteKinect.cs class to better understand how this input event data is used.
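As a rough sketch (not from the monoMIG sources: the serialization below is an assumption, since the exact members of the NiteKinect* event classes aren't shown here), the recording TODO could be filled in like this:

```csharp
public void InputMessageHandle(IPluginData msgdata)
{
    if (msgdata.GetType() == typeof( NiteKinectUserEventData ) || msgdata.GetType() == typeof( NiteKinectHandGestureEventData ))
    {
        // Append one timestamped entry per received event.
        // msgdata.ToString() is only a placeholder: replace it with a real
        // serialization of the event fields (hand/joint 3D positions, etc.).
        using (var writer = new System.IO.StreamWriter("session.rec", true))
        {
            writer.WriteLine("{0}\t{1}\t{2}",
                DateTime.Now.Ticks,
                msgdata.GetType().Name,
                msgdata);
        }
    }
}
```

In a real plugin you would keep the writer open for the whole session and close it when recording stops, instead of reopening the file for every event.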
Once you have your recorded input data file, you can play it back using an input plugin that simply reads the data from the file and fires the proper DataReceived event, just as the real Kinect input plugin does.
See the file MultiInputGateway/InputPlugins/NiteKinect/Plugin.cs to better understand how events are fired by the input plugin.
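A playback input plugin could then look roughly like this sketch (the DataReceived event signature and the file parsing here are assumptions; check the real NiteKinect input plugin for the actual ones):

```csharp
// Hypothetical playback loop: one recorded event per line of the session file.
public event EventHandler<EventArgs> DataReceived;

private void PlaybackSession(string filepath)
{
    foreach (string line in System.IO.File.ReadAllLines(filepath))
    {
        // TODO: parse the line back into the proper event data object
        // (NiteKinectUserEventData / NiteKinectHandGestureEventData)
        // and optionally sleep between events to reproduce the original timing.
        if (DataReceived != null)
        {
            DataReceived(this, EventArgs.Empty);
        }
    }
}
```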
I hope this can help you somehow.

G.