Processing For Mac




The Microsoft Kinect sensor is a peripheral device (designed for the Xbox and Windows PCs) that functions much like a webcam. However, in addition to providing an RGB image, it also provides a depth map: for every pixel seen by the sensor, the Kinect measures its distance from the sensor. This makes a variety of computer vision problems, like background removal and blob detection, easy and fun!

The Kinect sensor itself only measures color and depth. However, once that information is on your computer, lots more can be done, like “skeleton” tracking (i.e. detecting a model of a person and tracking his/her movements). To do skeleton tracking you’ll need to use Thomas Lengling’s Windows-only Kinect v2 Processing library. However, if you’re on a Mac and all you want is raw data from the Kinect, you are in luck! This library uses the libfreenect and libfreenect2 open source drivers to access that data on Mac OS X (Windows support coming soon).

What hardware do I need?

First, you need a “stand-alone” Kinect. You do not need to buy an Xbox.

  • Standalone Kinect Sensor v1. I believe this one comes with the power supply, so you do not need the separate adapter listed next. However, if you have a Kinect v1 that came with an Xbox, it will not include the Kinect Sensor Power Supply.
  • Standalone Kinect Sensor v2. You also probably need the Kinect Adapter for Windows. Don’t be thrown off: although it says Windows, it will allow you to connect the sensor to your Mac via USB. Finally, you’ll also want to make sure your computer supports USB 3. Most modern machines do, but if you are unsure you can find out more here for Mac OS X.

Some additional notes about different models:

  • Kinect 1414: This is the original Kinect and works with the library documented on this page in the Processing 3.0 beta series.
  • Kinect 1473: This looks identical to the 1414, but is an updated model. It should work with this library, but I don’t have one to test. Please let me know if it does or does not!
  • Kinect for Windows version 1: ???? Help? Does this one work?
  • Kinect for Windows version 2: This is the brand spanking new Kinect with all the features found in the Xbox One Kinect. Also works with this library!

SimpleOpenNI

You could also consider using the SimpleOpenNI library and read Greg Borenstein’s Making Things See book. OpenNI has features (skeleton tracking, gesture recognition, etc.) that are not available in this library. Unfortunately, OpenNI was recently purchased by Apple and, while I thought it was shut down, there appear to be some efforts to revive it! It’s unclear what the future of OpenNI and SimpleOpenNI will be.

I’m ready to get started right now

The easiest way to install the library is with the Processing Contributions Manager: Sketch → Import Library → Add Library, then search for “Kinect”. A button will appear labeled “Install”. If you want to install it manually, download the most recent release and extract it into the libraries folder. Restart Processing, open up one of the examples in the examples folder, and you are good to go!

What is Processing?

Processing is an open source programming language and environment for people who want to create images, animations, and interactions. Initially developed to serve as a software sketchbook and to teach fundamentals of computer programming within a visual context, Processing also has evolved into a tool for generating finished professional work. Today, there are tens of thousands of students, artists, designers, researchers, and hobbyists who use Processing for learning, prototyping, and production.

What if I don’t want to use Processing?

If you are comfortable with C++ I suggest you consider using openFrameworks or Cinder with the Kinect. These environments have some additional features and you also may get a C++ speed advantage when processing the depth data, etc.:

  • More resources from: The OpenKinect Project

What code do I write?

First thing is to include the proper import statements at the top of your code:
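
A typical sketch begins with the two import lines below; the package names are taken from the library’s bundled examples and may differ between versions:

```java
// Open Kinect for Processing imports (package names as used in the library's examples)
import org.openkinect.freenect.*;   // low-level libfreenect driver bindings
import org.openkinect.processing.*; // the Kinect / Kinect2 Processing wrappers
```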

As well as a reference to a Kinect object, i.e.
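
A minimal sketch declares it as a global variable, for example:

```java
Kinect kinect; // for a Kinect v2, declare "Kinect2 kinect2;" instead
```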

Then in setup() you can initialize that kinect object:
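
A sketch of a minimal setup(), using initDevice() from the function list later on this page (it starts video, depth, and IR together):

```java
void setup() {
  size(640, 480);
  kinect = new Kinect(this); // "this" is the sketch itself
  kinect.initDevice();       // or initVideo() / initDepth() to start a single stream
}
```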

If you are using a Kinect v2, use a Kinect2 object instead.

Once you’ve done this you can begin to access data from the kinect sensor. Currently, the library makes data available to you in five ways:

  • PImage (RGB) from the kinect video camera.
  • PImage (grayscale) from the kinect IR camera.
  • PImage (grayscale) with each pixel’s brightness mapped to depth (brighter = closer).
  • PImage (RGB) with each pixel’s hue mapped to depth.
  • int[] array with raw depth data (11 bit numbers between 0 and 2048).

Let’s look at these one at a time. If you want to use the Kinect just like a regular old webcam, you can access the video image as a PImage!

You can simply ask for this image in draw(); however, you can also use videoEvent() to know when a new image is available.
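
A minimal draw() that treats the Kinect as a webcam, using getVideoImage() from the function list below:

```java
void draw() {
  // Draw the most recent RGB frame at the top-left corner
  image(kinect.getVideoImage(), 0, 0);
}
```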

If you want the IR image:
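
On a v1, switching to IR is done with enableIR(); the image still comes back through getVideoImage():

```java
kinect.enableIR(true); // v1 only: the video stream now returns the IR image
```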

With the Kinect v1 you cannot get both the video image and the IR image. They are both passed back via getVideoImage(), so whichever one was most recently enabled is the one you will get. However, with the Kinect v2, they are both available as separate methods:
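
Assuming a Kinect2 object named kinect2, both streams can be read independently:

```java
PImage video = kinect2.getVideoImage(); // RGB camera
PImage ir    = kinect2.getIrImage();    // IR camera (v2 only)
```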

Now, if you want the depth image, you can request the grayscale image:
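
Using getDepthImage() from the function list, a draw() fragment might look like:

```java
PImage depthImg = kinect.getDepthImage(); // grayscale: brighter = closer
image(depthImg, 0, 0);
```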

As well as the raw depth data:
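
And getRawDepth() returns one value per pixel:

```java
int[] depth = kinect.getRawDepth(); // 0–2048 for v1, 0–4500 for v2
```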

For the Kinect v1, the raw depth values range between 0 and 2048; for the Kinect v2, the range is between 0 and 4500.

For the color depth image, use kinect.enableColorDepth(true);. And just like with the video image, there’s a depth event you can access if necessary.

Unfortunately, because the RGB camera and the IR camera are not physically located in the same spot, there is a stereo vision problem: pixel XY in one image is not the same XY in an image from a camera an inch to the right. The Kinect v2 offers what’s called a “registered” image, which aligns all the depth values with the RGB camera ones. This can be accessed as follows:
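
Again assuming a Kinect2 object named kinect2:

```java
PImage registered = kinect2.getRegisteredImage(); // depth aligned to the RGB camera (v2 only)
image(registered, 0, 0);
```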

Finally, for kinect v1 (but not v2), you can also adjust the camera angle with the setTilt() method.
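
For example, tilting a v1 sensor up by one degree with getTilt() and setTilt():

```java
float angle = kinect.getTilt(); // current angle, between 0 and 30 degrees (v1 only)
kinect.setTilt(angle + 1);
```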

So, there you have it, here are all the useful functions you might need to use the Processing kinect library:

  • initDevice() — start everything (video, depth, IR)
  • activateDevice(int) — activate a specific device when multiple devices are connected
  • initVideo() — start video only
  • enableIR(boolean) — turn on or off the IR camera image (v1 only)
  • initDepth() — start depth only
  • enableColorDepth(boolean) — turn on or off the depth values as color image
  • enableMirror(boolean) — mirror the image and depth data (v1 only)
  • PImage getVideoImage() — grab the RGB (or IR for v1) video image
  • PImage getIrImage() — grab the IR image (v2 only)
  • PImage getDepthImage() — grab the depth map image
  • PImage getRegisteredImage() — grab the registered depth image (v2 only)
  • int[] getRawDepth() — grab the raw depth data
  • float getTilt() — get the current sensor angle (between 0 and 30 degrees) (v1 only)
  • setTilt(float) — adjust the sensor angle (between 0 and 30 degrees) (v1 only)

For everything else, you can also take a look at the javadoc reference.

Examples

There are four basic examples for both v1 and v2.

Display RGB, IR, and Depth Images

Code for v1: RGBDepthTest

Code for v2: RGBDepthTest2

This example uses all of the above listed functions to display the data from the kinect sensor.

Multiple devices

Both v1 and v2 have multiple Kinect support.

Code for v1: MultiKinect

Code for v2: MultiKinect2

Point Cloud

Code for v1: PointCloud

Code for v2: PointCloud

Here, we’re doing something a bit fancier. Number one, we’re using the 3D capabilities of Processing to draw points in space. You’ll want to familiarize yourself with translate(), rotate(), pushMatrix(), popMatrix(). This tutorial is also a good place to start. In addition, the example uses a PVector to describe a point in 3D space. More here: PVector tutorial.

The real work of this example, however, doesn’t come from me at all. The raw depth values from the kinect are not directly proportional to physical depth. Rather, they scale with the inverse of the depth according to this formula:
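
The commonly published approximation (credited below to Matthew Fisher) maps an 11-bit raw v1 reading to meters; the exact constants here are taken from widely circulated calibration results and are an assumption on my part:

```java
// Approximate conversion of a raw v1 depth reading to physical depth in meters
float depthInMeters = (float) (1.0 / (rawDepth * -0.0030711016 + 3.3309495161));
```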


Rather than do this calculation all the time, we can precompute all of these values in a lookup table since there are only 2048 depth values.
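
A standalone sketch of the lookup-table idea, using the approximation above (the constants and the treatment of 2047 as “no reading” are assumptions based on commonly published calibration values):

```java
// Precompute meters for every possible 11-bit raw depth value (v1).
public class DepthLookup {
    static final float[] TABLE = new float[2048];
    static {
        for (int i = 0; i < TABLE.length; i++) {
            TABLE[i] = rawDepthToMeters(i);
        }
    }

    // Depth scales with the inverse of the raw reading.
    static float rawDepthToMeters(int raw) {
        if (raw < 2047) {
            return (float) (1.0 / (raw * -0.0030711016 + 3.3309495161));
        }
        return 0.0f; // 2047 signals "no reading"
    }

    public static void main(String[] args) {
        System.out.println(TABLE[1000]); // roughly 3.85 meters
    }
}
```

In draw(), `depthLookUp[rawDepth]` then replaces the per-pixel division, which matters when you do it 640 × 480 times per frame.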

Thanks to Matthew Fisher for the above formula. (Note: for the results to be more accurate, you would need to calibrate your specific kinect device, but the formula is close enough for me so I’m sticking with it for now. More about calibration in a moment.)

Finally, we can draw some points based on the depth values in meters:
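
A sketch of the drawing loop, assuming a helper depthToWorld(x, y, rawDepth) that returns a PVector in meters (as in the library's PointCloud example):

```java
int skip = 4; // sample every 4th pixel to keep the frame rate up
for (int x = 0; x < kinect.width; x += skip) {
  for (int y = 0; y < kinect.height; y += skip) {
    int offset = x + y * kinect.width;          // index into the raw depth array
    PVector v = depthToWorld(x, y, depth[offset]);
    stroke(255);
    pushMatrix();
    float factor = 200; // scale meters up to screen units
    translate(v.x * factor, v.y * factor, factor - v.z * factor);
    point(0, 0);        // draw the point at the translated origin
    popMatrix();
  }
}
```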

Average Point Tracking

The real magic of the kinect lies in its computer vision capabilities. With depth information, you can do all sorts of fun things like say: “the background is anything beyond 5 feet. Ignore it!” Without depth, background removal involves all sorts of painstaking pixel comparisons. As a quick demonstration of this idea, here is a very basic example that computes the average x-y location of any pixels in front of a given depth threshold.

Source for v1: AveragePointTracking

Source for v2: AveragePointTracking2

In this example, I declare two variables to add up all the appropriate x’s and y’s and one variable to keep track of how many there are.
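
Something along these lines (the variable names are my own):

```java
float sumX = 0;  // running total of x coordinates
float sumY = 0;  // running total of y coordinates
float count = 0; // how many pixels passed the threshold test
```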

Then, whenever we find a given point that complies with our threshold, I add the x and y to the sum:
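
Inside the loop over the raw depth array, that accumulation might look like this (threshold is a raw depth value you pick for your scene; for v1, smaller raw values are closer):

```java
if (rawDepth < threshold) {
  sumX += x;
  sumY += y;
  count++;
}
```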

When we’re done, we calculate the average and draw a point!
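
A sketch of that final step, guarding against an empty count:

```java
if (count > 0) {
  float avgX = sumX / count;
  float avgY = sumY / count;
  fill(255, 0, 0);
  ellipse(avgX, avgY, 16, 16); // draw the tracked point
}
```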

What’s missing?

  • Everything is being tracked via github issues.

FAQ

  1. Why are there shadows in the depth image (v1)? See the Kinect shadow diagram.
  2. What is the range of depth that the Kinect can see? (v1) ~0.7–6 meters or 2.3–20 feet. Note that you will get black pixels (or a raw depth value of 2048) for elements that are either too far away or too close.

Buying a new Mac is exciting, but it can also be very confusing. For most of the Macs it sells, Apple offers a number of different configuration options, including different RAM capacities, storage sizes and disk types, and processors. For many people, it’s the last of those that’s most important. It’s arguable that processors, or CPUs, are now so powerful that they can do everything most of us need to do at a speed that is more than good enough and so choice of CPU isn’t important. But that’s not quite true, as we’ll explain below.

What is a processor?

Put simply, a processor is the ‘brain’ in your Mac. Until relatively recently the CPU was responsible only for taking input, executing instructions and passing on the results. Now, CPUs incorporate short-term memory of their own and, sometimes, graphics processors, or GPUs. In fact, when it comes to choosing a processor for your Mac, deciding whether to opt for one that has an on-chip graphics processor is one of the key decisions you’ll have to make. Macs that have processors with on-board GPUs tend to be less expensive than those that have separate graphics processors, but also are less capable when it comes to things like rendering 3D graphics and 4K video. To further complicate matters, new macOS features like Metal make excellent use of the hardware in GPUs, meaning the choice of graphics processor is almost as important as the choice of CPU.

What processor does my Mac have inside?

Since 2006 all Macs have used Intel processors — unlike iPhones and iPads which have Apple processors. Apple labels the Intel CPUs it uses in the Mac as Core i5 and Core i7 and differentiates them by speed in GHz. The other difference is the number of cores in the CPU and the number of CPUs in the Mac. So, roughly speaking a quad-core processor should be able to process instructions at twice the rate of a dual-core CPU. That’s not the case in the real world, as executing instructions relies on more than just the speed at which the CPU’s ‘brain’ can perform calculations (for example, it’s dependent on how quickly those instructions can be passed to and from the CPU). However, in an application optimized for multiple cores, you should notice a significant difference between CPUs with different numbers of cores.

Intel gives each generation of its processor a code name. Recent Intel CPUs have had names like Sandy Bridge, Haswell, and Skylake. Apple doesn’t use those names, or even talk publicly about which processor is in each Mac, but it’s known that the current crop of iMacs and MacBook Pros have Kaby Lake processors, which were the most recent available at the time they were released in 2017. So, if you bought it in the last year or so, your MacBook Pro processor is Kaby Lake. The slimline MacBook processor is known as Core M, designed specifically for low power mobile use. The MacBook Air and Mac mini have Haswell processors, as they were released in 2013. The Mac Pro uses a completely different family of Intel processors, designed for high-end workstations and known as Xeon. The Mac Pro, last updated in 2013, uses the Romley variant of Xeon.

What are the important features of a processor?

We’ve already talked about processors that have on-board graphics, such as Intel Iris and Iris Pro. These offer benefits such as taking up less space than discrete CPU and GPU chips, and Macs that use them tend to be less expensive than those with separate CPU and GPU. However, they also tend to be less powerful.

The other key feature of a processor is the balance between speed and power consumption. CPUs that run faster use more energy and so generate more heat. This doesn’t just mean that fans have to run more often; it also uses more power, and if the Mac is a laptop, runs the battery down more quickly. Indeed, CPUs are often ‘throttled’ so that they don’t run at their theoretical maximum, in order to preserve battery life and reduce heat generation.

What about Turbo Boost?

Turbo Boost is a technology introduced by Intel and is designed to allow processors to run at speeds faster than those quoted on your Mac’s label in certain circumstances. Remember we said that CPUs are often throttled to prevent overheating? Turbo Boost monitors the power consumption and heat of the CPU and removes that throttle when it’s safe to do so. So, for example, a quad-core 2.8GHz Mac Pro could run as fast as 3.8GHz in the right circumstances.

Which processor should I choose?

It’s likely that if you’re buying a new Mac, your choice of processor will be made from the Core M, Core i5, and Core i7. Not all Mac models offer a choice of all three. The Core M, for example, is specifically designed to minimise power consumption in mobile devices and is used only in the MacBook. If you’re buying a MacBook Pro or iMac, you’ll have the choice of Core i5 or Core i7. The same is true if you buy a MacBook Air or Mac mini, although those machines use older versions of Intel processors. And, as we said earlier, if you buy a Mac Pro, you’ll be able to choose from Xeon workstation processors with multiple cores.

There are two decisions you’re likely to have to make: i5 or i7 and dual-core or quad-core. Generally speaking, in terms of speed, a dual-core i5 is the slowest and a quad-core i7 the fastest. That, however, is not the whole story. In order to get the most from multiple cores, you’ll need to be performing tasks that really benefit from the ability to execute more instructions simultaneously. So, tasks like 3D rendering, video editing and working with large images in Photoshop will all improve noticeably with a quad-core vs a dual-core processor.

Core i7 processors have two main benefits over Core i5: a larger cache and hyper-threading. A larger cache means the CPU can store more data locally and so spends less time transferring it back and forth to RAM. Hyper-threading allows the CPU to simulate additional cores, so a quad-core i7 with hyper-threading behaves like an eight-core CPU.

The benefits of a larger cache and hyper-threading are seen in scientific applications, where large calculations are performed and their results stored, as well as in 3D animation and 4K video editing.

Integrated vs discrete graphics

As we discussed earlier, some Intel processors have GPUs onboard. In some Mac ranges, such as the MacBook Pro and iMac, you’ll have the choice of a model with integrated graphics or one with a separate, or discrete, GPU. If you’re going to use your Mac primarily for playing power-hungry games, manipulating large images, or editing video or animation, you should choose a Mac with a separate GPU. On the other hand, if you’re mostly going to use it for writing, email, social media and editing your own photos, a CPU with integrated graphics, like Iris, is fine.

Can I upgrade the processor in my Mac?

That’s a flat no, sadly. There have in the past been Macs whose processors could be upgraded, but now they’re soldered firmly in place and can’t be removed. That makes the choice you make when you buy your Mac even more important. The good news is that for most users, every processor that ships with a currently available Mac, even those that haven’t been updated in several years, is absolutely fine and will run as fast as you need it to.

If I can’t upgrade the processor, how else can I speed up my Mac?

Two of the most effective ways to make your Mac go faster are to install more RAM and swap your hard drive for an SSD. Those both cost quite a bit of money, however. A much less expensive and much easier way is to get rid of the ‘junk’ files that can clog up your Mac. These are installed by applications, or by the system, or downloaded to your machine by websites. Deleting them one by one is a long and difficult process, but CleanMyMac X makes it very easy. CleanMyMac identifies files on your Mac that either serve no purpose, that you’re unlikely to need, or that are large and haven’t been opened for a while.


You can scan your Mac with one click and CleanMyMac will report back to you with the files it thinks you can delete and how much space it will save you — it can be tens of gigabytes. You can then review them and choose which to get rid of or press Delete and get rid of them all. CleanMyMac also makes it easy to uninstall apps you no longer use and removes all their associated files. You can download it free here.


Choosing a processor for your Mac can seem confusing and difficult but it’s not really. Once you’ve chosen the Mac you want, there are likely only to be a few options. And with the help of our guide, you should now know which one is right for you.