Simple webcam with websockets and Processing

I heard someone was streaming Kinect data and it made me wonder if you could stream image data over websockets by encoding every frame to Base64. This is useful because you can use Base64 strings to define images in HTML:

var img;
ws.onmessage = function (e) {
	$("#image").attr('src', 'data:image/jpg;base64,' + e.data);
};

It also seems pretty straightforward to do this in reverse from an iPhone/iPad native app. Same process: for each frame, crunch the image into a Base64 string and send it. This is what I’ll be trying next.

You can also write this data to a canvas element and then convert it to any format you want. Kind of a neat trick. Here is the Processing sketch; you can download the package with all the required libraries below. Dependency issues, who needs ‘em? …I do :(
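The canvas trick could look something like the sketch below. This is my own illustration, not code from the original post: the `toDataURI` helper and the element id are names I made up, and the browser-only part is shown in comments since it needs a DOM to run.

```javascript
// Hypothetical helper: build the same data URI the <img> trick uses.
function toDataURI(b64) {
  return 'data:image/jpg;base64,' + b64;
}

// In the browser, you could then paint each frame onto a canvas and
// re-export it in whatever format you want (sketch, assumes a
// <canvas id="canvas"> element exists on the page):
//
// ws.onmessage = function (e) {
//   var img = new Image();
//   img.onload = function () {
//     var canvas = document.getElementById('canvas');
//     canvas.getContext('2d').drawImage(img, 0, 0);
//     var asPNG = canvas.toDataURL('image/png'); // now it's a PNG
//   };
//   img.src = toDataURI(e.data);
// };
```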

The only complicated part is the encoding of the image:

Grab the frame from the camera’s Capture object
Create a BufferedImage object
Load the pixels from the Capture object into the BufferedImage
Encode the RGB pixels as a JPEG and write it to an output byte stream
Convert that stream to a byte array and then to a Base64 string
Broadcast it over the wire via websockets

import java.awt.image.BufferedImage;
import javax.imageio.ImageIO;
import java.io.ByteArrayOutputStream;
import java.io.IOException;

void captureEvent(Capture myCapture) {
	myCapture.read(); // load the latest frame into myCapture.pixels

	// copy the frame's RGB pixels into a BufferedImage
	BufferedImage buffimg = new BufferedImage( width, height, BufferedImage.TYPE_INT_RGB );
	buffimg.setRGB( 0, 0, width, height, myCapture.pixels, 0, width );

	// encode the image as a JPEG into an in-memory byte stream
	ByteArrayOutputStream baos = new ByteArrayOutputStream();
	try {
		ImageIO.write( buffimg, "jpg", baos );
	} catch( IOException ioe ) {
		ioe.printStackTrace();
	}

	// Base64-encode the JPEG bytes and push them to every connected client
	String b64image = Base64.encode( baos.toByteArray() );
	socket.broadcast( b64image );
}

And here is the markup:
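The markup didn’t survive in this copy of the post, so here’s a guess at a minimal page along the lines the snippets above imply. The `#image` id matches the jQuery selector used earlier; the address, port, and jQuery version are placeholders I picked, not values from the original.

```html
<!-- A sketch of a minimal client page, not the original markup.
     Swap 127.0.0.1:8080 for your machine's IP and the port your sketch uses. -->
<!DOCTYPE html>
<html>
<head>
  <script src="https://ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js"></script>
  <script>
    var ws = new WebSocket('ws://127.0.0.1:8080');
    ws.onmessage = function (e) {
      $('#image').attr('src', 'data:image/jpg;base64,' + e.data);
    };
  </script>
</head>
<body>
  <img id="image" />
</body>
</html>
```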

You’ll wanna change the IP address for the websocket server to whatever your computer’s is: System Preferences > Network on OS X will show your IP. I’m sure you can get around this, but for my example I created a folder called “html” and placed it inside the Processing sketch folder: ~/Documents/Processing/SimpleSocketServer/html/index.html

This might be required.

It worked well at 640×480 @ 30fps in Chrome and at 320×240 @ 30fps on the iPad and iPhone. Here is a video of me demoing it.

Also, I think the WebSocketP5 library is broken for draft 10 of the websockets handshake (which is crucial for Chrome), so here is a complete package of the Processing webserver/websocket libraries. If you want to, you can download the WebSocketP5 library and then drop in a new version of the webbit server.
