Intel® RealSense™ Developer Documentation

These examples demonstrate how to use the Python wrapper of the SDK.

List of Examples

Sample source code is available on GitHub.
For full Python library documentation, please refer to module-pyrealsense2.



Streaming Depth

This example demonstrates how to start streaming depth frames from the camera and display the image in the console as ASCII art.
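The idea can be sketched as follows, assuming pyrealsense2 is installed and a depth camera is attached. The character ramp and grid size are illustrative choices of this sketch, not the SDK sample's:

```python
CHARS = "@#*+:. "  # near objects print dense, far objects print light

def depth_to_ascii(get_distance, width, height, cols=64, rows=20, max_m=4.0):
    """Downsample a depth image into a small grid of ASCII characters."""
    lines = []
    for r in range(rows):
        row = ""
        for c in range(cols):
            # Sample the depth image on a coarse grid, in meters
            d = get_distance(c * width // cols, r * height // rows)
            if d <= 0 or d > max_m:
                row += " "  # no data, or beyond the range we render
            else:
                row += CHARS[int(d / max_m * (len(CHARS) - 1))]
        lines.append(row)
    return "\n".join(lines)

def main():
    import pyrealsense2 as rs
    pipeline = rs.pipeline()
    pipeline.start()
    try:
        frames = pipeline.wait_for_frames()
        depth = frames.get_depth_frame()
        print(depth_to_ascii(depth.get_distance,
                             depth.get_width(), depth.get_height()))
    finally:
        pipeline.stop()

if __name__ == "__main__":
    main()
```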



Rendering depth and color with OpenCV and Numpy

This example demonstrates how to render depth and color images with the help of OpenCV and NumPy.
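A sketch of the rendering loop, assuming pyrealsense2, NumPy and opencv-python are installed and a camera is attached. The 0.03 scaling factor used to squeeze 16-bit depth into 8 bits for display is an illustrative choice:

```python
import numpy as np

def depth_to_display(depth_image, alpha=0.03):
    """Scale a 16-bit depth image to 8 bits so a colormap can be applied."""
    return np.clip(depth_image * alpha, 0, 255).astype(np.uint8)

def main():
    import pyrealsense2 as rs
    import cv2
    pipeline = rs.pipeline()
    pipeline.start()
    try:
        while True:
            frames = pipeline.wait_for_frames()
            depth = frames.get_depth_frame()
            color = frames.get_color_frame()
            if not depth or not color:
                continue
            depth_image = np.asanyarray(depth.get_data())
            color_image = np.asanyarray(color.get_data())
            # Colorize the scaled depth image with the jet colormap
            depth_colormap = cv2.applyColorMap(depth_to_display(depth_image),
                                               cv2.COLORMAP_JET)
            cv2.imshow("RealSense", np.hstack((color_image, depth_colormap)))
            if cv2.waitKey(1) == 27:  # Esc to quit
                break
    finally:
        pipeline.stop()

if __name__ == "__main__":
    main()
```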



Align & Background Removal

Demonstrates a way of performing background removal by aligning depth images to color images and performing a simple calculation to strip the background.
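The core of the technique can be sketched as below: `rs.align` maps depth frames into the color frame, and a NumPy mask strips pixels beyond a clipping distance. The 1 m clip and grey fill value are illustrative choices, not the sample's:

```python
import numpy as np

def remove_background(color_image, depth_image, clip_m, depth_scale, bg=153):
    """Replace pixels whose depth is zero or beyond clip_m with a grey value."""
    clip_units = clip_m / depth_scale  # meters -> raw depth units
    mask = (depth_image <= 0) | (depth_image > clip_units)
    out = color_image.copy()
    out[mask] = bg  # paint masked pixels grey across all channels
    return out

def main():
    import pyrealsense2 as rs
    pipeline = rs.pipeline()
    profile = pipeline.start()
    depth_scale = profile.get_device().first_depth_sensor().get_depth_scale()
    align = rs.align(rs.stream.color)  # align depth frames to the color stream
    try:
        frames = align.process(pipeline.wait_for_frames())
        depth = np.asanyarray(frames.get_depth_frame().get_data())
        color = np.asanyarray(frames.get_color_frame().get_data())
        result = remove_background(color, depth, clip_m=1.0,
                                   depth_scale=depth_scale)
        print(result.shape)
    finally:
        pipeline.stop()

if __name__ == "__main__":
    main()
```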



Advanced Mode

Example of the advanced mode interface for controlling different options of the D400 series cameras.
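A sketch of the advanced-mode round trip: serialize the current preset to JSON, edit a field, and load it back. The `param-disparityshift` key is taken from serialized presets and should be treated as an assumption; inspect your own serialized output for the exact names:

```python
import json

def set_advanced_param(json_text, key, value):
    """Edit one field in a serialized advanced-mode JSON preset."""
    params = json.loads(json_text)
    params[key] = value
    return json.dumps(params)

def main():
    import pyrealsense2 as rs
    dev = rs.context().query_devices()[0]
    advnc = rs.rs400_advanced_mode(dev)
    if not advnc.is_enabled():
        advnc.toggle_advanced_mode(True)  # note: the device reconnects
    preset = advnc.serialize_json()
    # "param-disparityshift" is an assumed key name, for illustration only
    advnc.load_json(set_advanced_param(preset, "param-disparityshift", "0"))

if __name__ == "__main__":
    main()
```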



Read Bag File

Example showing how to read a bag file and use the colorizer to show the recorded depth stream in a jet colormap.
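A sketch of bag playback, assuming pyrealsense2 and NumPy are installed. The name-to-scheme table mirrors the colorizer's color_scheme values (0 = Jet); treat the mapping as an assumption and verify it against the SDK documentation:

```python
def color_scheme(name):
    """Map a human-readable name to the colorizer's color_scheme value.
    The mapping is an assumption based on librealsense's documented schemes."""
    schemes = {"jet": 0, "classic": 1, "white-to-black": 2, "black-to-white": 3}
    return schemes[name.lower()]

def main(path):
    import numpy as np
    import pyrealsense2 as rs
    pipeline = rs.pipeline()
    config = rs.config()
    rs.config.enable_device_from_file(config, path)  # play back the .bag file
    pipeline.start(config)
    colorizer = rs.colorizer()
    colorizer.set_option(rs.option.color_scheme, color_scheme("jet"))
    try:
        frames = pipeline.wait_for_frames()
        # Colorized depth as an HxWx3 array, ready for display
        depth_color = np.asanyarray(
            colorizer.colorize(frames.get_depth_frame()).get_data())
        print(depth_color.shape)
    finally:
        pipeline.stop()

if __name__ == "__main__":
    main("recording.bag")  # hypothetical file name
```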



Box Measurement and Multi-camera Calibration

Simple demonstration for calculating the length, width and height of an object using multiple cameras.



Pose (basic)

Demonstrates how to retrieve pose data from a T265 camera.
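A sketch of the pose loop, assuming a T265 is attached; the derived speed printout is an illustrative addition of this sketch:

```python
import math

def speed(v):
    """Magnitude of a velocity vector given as (x, y, z) in m/s."""
    return math.sqrt(v[0] ** 2 + v[1] ** 2 + v[2] ** 2)

def main():
    import pyrealsense2 as rs
    pipeline = rs.pipeline()
    cfg = rs.config()
    cfg.enable_stream(rs.stream.pose)
    pipeline.start(cfg)
    try:
        for _ in range(50):
            frames = pipeline.wait_for_frames()
            pose = frames.get_pose_frame()
            if pose:
                data = pose.get_pose_data()
                t, v = data.translation, data.velocity
                print("pos ({:.3f}, {:.3f}, {:.3f}) m, speed {:.3f} m/s".format(
                    t.x, t.y, t.z, speed((v.x, v.y, v.z))))
    finally:
        pipeline.stop()

if __name__ == "__main__":
    main()
```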



Coordinate system

This example shows how to change the coordinate system of a T265 pose.
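The underlying math is a change of basis. The NumPy sketch below maps the T265 frame (+X right, +Y up, +Z backward, out of the lens) into a hypothetical forward-left-up robot convention; the matrix H is an assumption about the mounting and should be adjusted for your rig:

```python
import numpy as np

def change_basis(translation, rotation, H):
    """Re-express a pose (t, R) in the basis described by H:
    t' = H t and R' = H R H^T (a similarity transform)."""
    t = H @ np.asarray(translation)
    R = H @ np.asarray(rotation) @ H.T
    return t, R

# Example basis: robot-forward is T265 -Z, robot-left is T265 -X,
# robot-up is T265 +Y.  Rows of H are the new axes in T265 coordinates.
H = np.array([[0.0, 0.0, -1.0],
              [-1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])
```

With this H, moving the camera forward (T265 -Z) shows up as motion along the robot's +X axis.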



Sparse Stereo Depth (FishEye Passive)

This example shows how to use T265 intrinsics and extrinsics in OpenCV to asynchronously compute depth maps from T265 fisheye images on the host.



Stream over Ethernet

This example shows how to stream depth data from RealSense depth cameras over Ethernet.
It includes an Ethernet client and server using Python's asyncore module.



PointCloud with OpenCV

This sample is mostly for demonstration and educational purposes.
It really doesn't offer the quality or performance that can be
achieved with hardware acceleration.



PointCloud with PyGlet

OpenGL point cloud viewer built with PyGlet.


Interactive Examples

  1. Distance to Object - This notebook offers a quick hands-on introduction to Intel RealSense depth-sensing technology. Please refer to Distance to Object for further information. Click to experience it in Binder.
  2. Depth Filters - This notebook is intended to showcase the effect of post-processing filters. Please refer to Depth Filters for further information. Click to experience it in Binder.

Installation Guidelines

Please refer to the installation guidelines at Python Installation.

Building from Source

Please refer to the instructions at Building from Source

Box Measurement and Multi-camera Calibration

This sample demonstrates using the SDK to align multiple devices to a unified world coordinate system in order to solve a simple task such as measuring the dimensions of a box.


This code requires Python 3.6 and does not work with Python 2.7.


Place the two cameras in a similar way to the picture below.

Place the calibration chessboard object into the field of view of all the Intel RealSense cameras. See example for a target below.

Update the chessboard parameters in the script according to the target size if you are using a different target from the one used in this demo.

6 x 9 chessboard


chessboard_width = 6    # squares
chessboard_height = 9   # squares
square_size = 0.0253    # meters
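For reference, these parameters feed the standard OpenCV calibration idiom: a grid of 3-D object points for the inner chessboard corners, matched against corners detected in each camera's image. A sketch, where the helper names are mine rather than the sample's:

```python
import numpy as np

def chessboard_object_points(width, height, square_size):
    """3-D coordinates of the inner chessboard corners in the board frame,
    laid out on the z = 0 plane in units of meters."""
    pts = np.zeros((width * height, 3), np.float32)
    pts[:, :2] = np.mgrid[0:width, 0:height].T.reshape(-1, 2)
    return pts * square_size

def find_corners(gray_image, width=6, height=9):
    """Detect the chessboard corners in one grayscale camera image."""
    import cv2
    found, corners = cv2.findChessboardCorners(gray_image, (width, height))
    return corners if found else None
```

Solving a pose from these correspondences for each camera is what places all devices in the board's common coordinate system.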

Start the program. Allow calibration to occur and place the desired object ON the calibration object when the program asks for it. Make sure that the object to be measured is not bigger than the calibration object in length and width.

Example for a measured box


The length, width and height of the bounding box of the object are then displayed in millimeters.
Note: To keep the demo simple, the clipping of the usable point cloud assumes that the object is placed on the calibration object and that its length and width are less than those of the calibration object.

Once the calibration is done and the target object's dimensions are calculated, the application will open as many windows as there are devices connected, each displaying a color image along with an overlay of the calculated bounding box. In the following example, we've used two Intel® RealSense™ Depth Cameras D435 pointing at a common object placed on a 6 x 9 chessboard.

Example for output


Stream over Ethernet - Python Example

Ethernet client and server for RealSense using Python's asyncore.


Installation and Setup of Server:
These steps assume a fresh install of Ubuntu 18.04 on an UpBoard, but they have also been tested on an Intel NUC.

sudo apt-get update; sudo apt-get upgrade; 

sudo apt-get install python

sudo apt-get install python-pip  

sudo apt-get install git

Clone the repo then run:

sudo python

This will first install the pip dependencies, followed by the creation of cron jobs in the /etc/crontab file that keep an instance of the server running whenever the device is powered.


Multicast is used to establish connections to servers present on the network.
Once a server receives a connection request from a client, asyncore is used to establish a TCP connection for each server.
Frames are collected from the camera using the librealsense pipeline; each frame is then resized and sent in smaller chunks to conform with TCP.
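The chunking can be sketched with a simple length prefix, so the receiver can re-assemble each frame from arbitrary TCP segment boundaries. This mirrors the idea rather than the sample's exact wire format:

```python
import pickle
import struct

def pack_frame(frame_obj):
    """Serialize a frame and prepend its length as a 4-byte little-endian
    integer, so the stream is self-delimiting."""
    payload = pickle.dumps(frame_obj, protocol=2)
    return struct.pack("<I", len(payload)) + payload

def unpack_frames(buffer):
    """Extract every complete frame from a byte buffer.
    Returns (frames, leftover_bytes); leftover is kept for the next recv."""
    frames = []
    while len(buffer) >= 4:
        (length,) = struct.unpack("<I", buffer[:4])
        if len(buffer) < 4 + length:
            break  # frame not fully received yet
        frames.append(pickle.loads(buffer[4:4 + length]))
        buffer = buffer[4 + length:]
    return frames, buffer
```

On the server, each resized depth image would be passed through `pack_frame` before sending; the client accumulates received bytes and calls `unpack_frames` on the growing buffer.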

UpBoard PoE

Below shows the use of a PoE switch and PoE breakout devices (available from online retailers) powering each dedicated UpBoard.
This configuration should allow a number of RealSense cameras to be connected over distances greater than 30 m.

The 5 RealSense cameras are connected to each UpBoard using the provided USB3 cables.


Client Window

Below shows the result of having connected to five cameras over the local network:

The window titles indicate the port which the frames are being received over.


Error Logging

Errors are piped to a log file stored at /tmp/error.log as part of the command set up in /etc/crontab.


Power Considerations

The UpBoards require a 5 V, 4 A power supply. When using PoE breakout adaptors I have found some stability issues; for example, the device kernel can crash when the HDMI port is connected. As such, I recommend running the UpBoard as a headless server when using PoE.

Network bandwidth

It is currently very easy to saturate the bandwidth of the Ethernet connection. I have tested 5 servers connected to the same client without issue, beyond a limited framerate:

cfg.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)

self.decimate_filter.set_option(rs.option.filter_magnitude, 2)

There are a number of strategies that can be used to increase this bandwidth. For brevity, they are left to the user, along with the specific tradeoffs for your application. These include:

  1. Transmitting frames using UDP and allowing frame drop; this requires implementing packet ordering.
  2. Reducing the depth channel to 8 bits.
  3. Reducing the resolution further.
  4. Adding compression, either frame-wise or, better still, temporal.
  5. Recording the depth data locally into a buffer, with asynchronous frame transfer.
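As an illustration of one of these strategies, the depth channel can be squeezed from 16 to 8 bits, trading range and precision for bandwidth. The 4 m ceiling below is an assumption; pick it to match your scene:

```python
import numpy as np

def depth_to_8bit(depth16, max_mm=4000):
    """Compress 16-bit depth (in mm) into 8 bits; values beyond max_mm clip."""
    return (np.clip(depth16, 0, max_mm) * 255.0 / max_mm).astype(np.uint8)

def depth_from_8bit(depth8, max_mm=4000):
    """Approximate inverse; quantization error is max_mm / 255 per step."""
    return (depth8.astype(np.float32) * max_mm / 255.0).astype(np.uint16)
```

This halves the payload per frame at the cost of roughly 16 mm depth resolution at a 4 m ceiling.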

Troubleshooting Tips

First of all, I suggest installing and configuring openssh-server on each of the UpBoards to allow remote connection from the client machine.

Check that the UpBoards are available on the local network using "nmap -sP 192.168.2.*"

Check that the server is running on the UpBoard using "ps -eaf | grep python"

Finally check the log file at /tmp/error.log

There might still be some conditions where the server is running but not in a state to transmit; help in narrowing down these cases would be much appreciated.


