{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Driving the Kinect2 from Python"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"In this notebook we will demonstrate how to use the ``freenect2`` module with Python."
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"# Use IPython's built in %pylab command to import matplotlib as \"plt\" and numpy as \"np\"\n", | |
"%matplotlib inline\n", | |
"import matplotlib.pyplot as plt\n", | |
"import numpy as np" | |
] | |
}, | |
{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
"The ``freenect2`` module is imported in the usual way. Here we're using ``import ... as`` to give the module a shorter name." | |
] | |
}, | |
{ | |
"cell_type": "code", | |
"execution_count": 2, | |
"metadata": { | |
"collapsed": true | |
}, | |
"outputs": [], | |
"source": [ | |
"import freenect2 as fn2" | |
] | |
}, | |
{ | |
"cell_type": "markdown", | |
"metadata": {}, | |
"source": [ | |
"Each Kinect which is connected to the computer appears as a freenect \"device\" represented by a ``Device`` object. If you pass no arguments to the constructor the first connected device is used which is what you usually want." | |
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"device = fn2.Device()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Devices need to be started before frames are captured. This can be done explicitly via the ``.start()`` and ``.stop()`` members but a more Pythonic way is to use the ``with`` statement. This automatically takes care of starting and stopping the device even when an exception is thrown.\n", | |
"\n", | |
"Once a device is running, it may be iterated over to return ``FrameType``, ``Frame`` pairs for each captured frame." | |
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"FrameType.Color: Frame(width=1920, height=1080, sequence=996, timestamp=306765, format=FrameFormat.BGRX)\n",
"FrameType.Ir: Frame(width=512, height=424, sequence=974, timestamp=310852, format=FrameFormat.Float)\n",
"FrameType.Depth: Frame(width=512, height=424, sequence=974, timestamp=310852, format=FrameFormat.Float)\n",
"FrameType.Ir: Frame(width=512, height=424, sequence=983, timestamp=313252, format=FrameFormat.Float)\n",
"FrameType.Depth: Frame(width=512, height=424, sequence=983, timestamp=313252, format=FrameFormat.Float)\n",
"FrameType.Ir: Frame(width=512, height=424, sequence=988, timestamp=316452, format=FrameFormat.Float)\n",
"FrameType.Depth: Frame(width=512, height=424, sequence=988, timestamp=316452, format=FrameFormat.Float)\n",
"FrameType.Color: Frame(width=1920, height=1080, sequence=1021, timestamp=319565, format=FrameFormat.BGRX)\n",
"FrameType.Color: Frame(width=1920, height=1080, sequence=1022, timestamp=319831, format=FrameFormat.BGRX)\n",
"FrameType.Color: Frame(width=1920, height=1080, sequence=1023, timestamp=320098, format=FrameFormat.BGRX)\n",
"FrameType.Color: Frame(width=1920, height=1080, sequence=1024, timestamp=320365, format=FrameFormat.BGRX)\n"
]
}
],
"source": [
"# Start the device and print out information on the first 10 frames captured.\n", | |
"with device.running():\n", | |
" for idx, (frame_type, frame) in enumerate(device):\n", | |
" print('{}: {}'.format(frame_type, frame))\n", | |
" \n", | |
" # Stop iterating if we've captured enough frames\n", | |
" if idx >= 10:\n", | |
" break" | |
] | |
}, | |
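{
"cell_type": "markdown",
"metadata": {},
"source": [
"As noted above, the device can also be started and stopped explicitly via ``.start()`` and ``.stop()``. The cell below is a minimal, unexecuted sketch of that pattern; it assumes the running device iterates exactly as in the previous cell and uses ``try``/``finally`` so the device is always stopped, even if an exception is raised."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Sketch: explicit start/stop instead of the ``with`` statement.\n",
"device.start()\n",
"try:\n",
"    # Iterate over the running device exactly as before.\n",
"    for idx, (frame_type, frame) in enumerate(device):\n",
"        print('{}: {}'.format(frame_type, frame))\n",
"        # Stop after a handful of frames.\n",
"        if idx >= 2:\n",
"            break\n",
"finally:\n",
"    # Always stop the device, even if an exception was raised above.\n",
"    device.stop()"
]
},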
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The possible ``FrameType`` values are: ``Color``, ``Ir`` and ``Depth``. Let's write a little function which iterates over a device and returns a color, depth and IR frame tuple once one of each have been received." | |
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"def get_next_frames(device):\n", | |
" # These are the types of the frames we want to capture and the order\n", | |
" # they should be returned.\n", | |
" required_types = [fn2.FrameType.Color, fn2.FrameType.Depth, fn2.FrameType.Ir]\n", | |
" \n", | |
" # Store incoming frame in this dictionary keyed by type.\n", | |
" frames_by_type = {}\n", | |
" \n", | |
" for frame_type, frame in device:\n", | |
" # Record frame\n", | |
" frames_by_type[frame_type] = frame\n", | |
" \n", | |
" # Try to return a frame for each type. If we get a KeyError, we need to keep capturing\n", | |
" try:\n", | |
" return [frames_by_type[t] for t in required_types]\n", | |
" except KeyError:\n", | |
" pass # This is OK, capture the next frame" | |
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now, with the device running, get the next RGB, IR and depth frames." | |
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {},
"outputs": [],
"source": [
"with device.running():\n",
"    color, depth, ir = get_next_frames(device)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The ``Frame`` class allows direct conversion to ``PIL`` images. Notice that the images returned by Kinect are mirrored left to right." | |
]
},
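{
"cell_type": "markdown",
"metadata": {},
"source": [
"Below is a minimal, unexecuted sketch of such a conversion. It assumes the PIL conversion method is called ``to_image()``; Pillow's ``ImageOps.mirror`` can then be used to undo the left-to-right mirroring."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Sketch: convert the captured color frame to a PIL image (assumes Frame.to_image()\n",
"# is the PIL conversion referred to above) and flip it back the right way round.\n",
"from PIL import ImageOps\n",
"\n",
"color_image = color.to_image()\n",
"color_image = ImageOps.mirror(color_image)  # undo the left-to-right mirroring\n",
"color_image"
]
},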
{
"cell_type": "code",
"execution_count": 7,
"metadata": {},
"outputs": [
{
"data": { |