Detecting the Toy Car with OpenCV

OpenCV is a wonderful library with a lot of options for image processing. The simplest way to detect an object is by color. The color range can be specified in RGB, HSV or YUV; the filtering algorithms work best with HSV.


  1. Find the lower and upper ranges of the HSV values
  2. Filter by color and build a color mask
  3. Optional: filter using BackgroundSubtractorMOG2 to get a motion mask, then bitwise-AND the two masks
  4. Use erode, dilate and threshold to remove any noise
  5. Find contours. Assume that whatever contours remain are part of the moving car and combine all their points
  6. Find the centre of mass using moments (steps 2 to 6 are sketched after the threshold values below)
Here is the sample code used for finding the HSV ranges from an image:

// Dump every pixel of an HSV image: one pixel per line, channels separated by tabs
void readPixels(final Mat imageHSV) {
    for (int i = 0; i < imageHSV.rows(); i++) {
        for (int j = 0; j < imageHSV.cols(); j++) {
            final double[] pixel = imageHSV.get(i, j);
            for (int k = 0; k < pixel.length; k++)
                System.out.print(pixel[k] + "\t");
            System.out.println();
        }
    }
}

I used GIMP to crop the car out of a frame. The crop was converted to HSV, and the output from readPixels was copied into OpenOffice Calc (a spreadsheet) to get the value ranges:
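
A minimal usage sketch, assuming the OpenCV 3.x Java bindings (HsvDump and car_crop.png are hypothetical names, not from the original source):

import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.imgcodecs.Imgcodecs;
import org.opencv.imgproc.Imgproc;

public class HsvDump {
    public static void main(String[] args) {
        System.loadLibrary(Core.NATIVE_LIBRARY_NAME);      // load the native OpenCV library

        final Mat bgr = Imgcodecs.imread("car_crop.png");  // the car cropped out with GIMP
        final Mat hsv = new Mat();
        Imgproc.cvtColor(bgr, hsv, Imgproc.COLOR_BGR2HSV); // imread returns BGR; convert to HSV

        new HsvDump().readPixels(hsv);  // redirect stdout to a file and import it into Calc
    }

    void readPixels(final Mat imageHSV) {
        // same as the method listed above
    }
}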

final Scalar carColorThresholdHSVLow = new Scalar(90, 20, 180);
final Scalar carColorThresholdHSVHigh = new Scalar(102, 115, 255);
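
Putting steps 2 to 6 together, here is a minimal sketch of the detection, again assuming the OpenCV 3.x Java bindings (the class name CarDetector, the findCar method and the 5x5 kernel are my own choices, not taken from the original source):

import java.util.ArrayList;
import java.util.List;
import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.core.MatOfPoint;
import org.opencv.core.Point;
import org.opencv.core.Scalar;
import org.opencv.core.Size;
import org.opencv.imgproc.Imgproc;
import org.opencv.imgproc.Moments;

public class CarDetector {
    final Scalar carColorThresholdHSVLow = new Scalar(90, 20, 180);
    final Scalar carColorThresholdHSVHigh = new Scalar(102, 115, 255);

    // Steps 2-6: color mask -> clean-up -> contours -> centre of mass.
    // Step 3 (a BackgroundSubtractorMOG2 motion mask ANDed with the color mask
    // via Core.bitwise_and) is left out to keep the sketch short.
    Point findCar(final Mat frameBGR) {
        final Mat hsv = new Mat();
        Imgproc.cvtColor(frameBGR, hsv, Imgproc.COLOR_BGR2HSV);

        // Step 2: keep only the pixels inside the car's HSV range
        final Mat mask = new Mat();
        Core.inRange(hsv, carColorThresholdHSVLow, carColorThresholdHSVHigh, mask);

        // Step 4: erode and dilate to remove speckle noise
        final Mat kernel = Imgproc.getStructuringElement(Imgproc.MORPH_ELLIPSE, new Size(5, 5));
        Imgproc.erode(mask, mask, kernel);
        Imgproc.dilate(mask, mask, kernel);

        // Step 5: assume every remaining contour belongs to the car; combine all points
        final List<MatOfPoint> contours = new ArrayList<>();
        Imgproc.findContours(mask, contours, new Mat(), Imgproc.RETR_EXTERNAL, Imgproc.CHAIN_APPROX_SIMPLE);
        final List<Point> allPoints = new ArrayList<>();
        for (final MatOfPoint contour : contours)
            allPoints.addAll(contour.toList());
        if (allPoints.isEmpty())
            return null;  // car not visible in this frame

        // Step 6: centre of mass from the image moments of the combined points
        final MatOfPoint combined = new MatOfPoint();
        combined.fromList(allPoints);
        final Moments m = Imgproc.moments(combined);
        return new Point(m.m10 / m.m00, m.m01 / m.m00);
    }
}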


The output from Detection.

Source can be found here.

Back to the toy car: My experiments with Raspberry Pi

What can I do with a Raspberry Pi?

It is a B+ model. Use it as a fan-less credit-card computer and do some browsing? Nah... it sucks; too slow to respond. Connect it to the TV and make it "smart"?
Oh wow... Well, within a week of getting one, I had configured it to run Raspbmc (now KODI). It happily started serving as my media manager.

I thought about the experiment I did for the OpenHouse event. Though it was done in two days (and nights) and demonstrated only a simple concept of tracking, it became one of my best posts in terms of web traffic. So I thought: why not try it again? I went and bought a remote-control car (around $8) from a local shop. So as not to fry my RPi GPIO, I found a motor driver from FabToLab. Last weekend I finished the first set of code to control the GPIO, and it worked as expected on the very first run.

Dissection

  1. Opened up the remote (instead of the car, as in the previous experiment)
  2. Removed the 4 push buttons underneath the joystick and soldered 4 pins, which were connected to the RPi through the motor driver. Thanks to Jeshwanth for doing it for me.
  3. Used the RPi's 3.3V GPIO to power up the remote.

Coding

  1. Used the Python GPIO library to test the set-up. Left, right and accelerate were working.
  2. Got back to my comfort-zone language, Java, with Pi4J.
  3. Wrote the first set of code around an "event loop". The events are the buttons, and the joystick action is represented by the duration of each event (see the sketch below).
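
A minimal sketch of pressing one of the soldered button pads with Pi4J (ButtonPressTest, the GPIO_01 pin and the 500 ms hold are hypothetical; the actual wiring and durations are not in this post):

import com.pi4j.io.gpio.GpioController;
import com.pi4j.io.gpio.GpioFactory;
import com.pi4j.io.gpio.GpioPinDigitalOutput;
import com.pi4j.io.gpio.PinState;
import com.pi4j.io.gpio.RaspiPin;

public class ButtonPressTest {
    public static void main(String[] args) throws InterruptedException {
        final GpioController gpio = GpioFactory.getInstance();

        // Hypothetical wiring: GPIO_01 drives the "forward" button pad on the remote
        final GpioPinDigitalOutput forward =
                gpio.provisionDigitalOutputPin(RaspiPin.GPIO_01, "Forward", PinState.LOW);

        forward.high();     // simulate pressing the button
        Thread.sleep(500);  // hold it for half a second
        forward.low();      // release

        gpio.shutdown();    // release the GPIO resources
    }
}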

Code design: Thanks to Uncle Bob's talks

  1. Single Responsibility Principle and separation of concerns
  2. Factory pattern + Singleton
  3. Program to an interface; prefer containment (composition) over derivation (inheritance)
  4. Use the main program to "assemble" the pieces

High Level Design

  1. Interface: Drive (sketched below)
    • drive (activate forward/left/right)
    • release (turn off the pins that were kept on)
  2. DriveEvent (Drive, duration)
  3. Interface: AbstractDriveEventLoop
    • addEvent(DriveEvent)
    • fireEvent
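
A sketch of these types reconstructed from the descriptions above; the exact names and signatures in the original code may differ, and the types are shown in one listing only for brevity (each would live in its own file):

// Reconstructed sketch; the original signatures may differ
interface Drive {
    void drive();    // activate the pins for forward/left/right
    void release();  // turn off the pins that were kept on
}

// A drive action paired with how long it should stay active
class DriveEvent {
    private final Drive drive;
    private final long durationMillis;

    DriveEvent(final Drive drive, final long durationMillis) {
        this.drive = drive;
        this.durationMillis = durationMillis;
    }

    Drive getDrive()         { return drive; }
    long getDurationMillis() { return durationMillis; }
}

interface AbstractDriveEventLoop {
    void addEvent(DriveEvent event); // queue an event
    void fireEvent();                // start processing queued events
}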

Implementation

  1. DriveLeft, DriveRight, DriveStraight, NoDrive and StopCar implementing Drive
  2. DriveFactory, which creates the drives
  3. DriveEventLoop using a BlockingQueue (sketched below)
  4. ReadInstructions, which takes a BufferedReader and fills the event loop
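
A sketch of how the BlockingQueue-based event loop could look, reconstructed under the names above (the original implementation may differ):

import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Reconstruction of DriveEventLoop: producers add events, a single
// consumer thread activates each Drive for the requested duration.
public class DriveEventLoop implements AbstractDriveEventLoop, Runnable {
    private final BlockingQueue<DriveEvent> queue = new LinkedBlockingQueue<>();

    @Override
    public void addEvent(final DriveEvent event) {
        queue.add(event);
    }

    @Override
    public void fireEvent() {
        new Thread(this, "drive-event-loop").start();
    }

    @Override
    public void run() {
        try {
            while (!Thread.currentThread().isInterrupted()) {
                final DriveEvent event = queue.take();   // blocks until an event arrives
                event.getDrive().drive();                // press the button(s)
                Thread.sleep(event.getDurationMillis()); // hold for the requested duration
                event.getDrive().release();              // release the button(s)
            }
        } catch (final InterruptedException e) {
            Thread.currentThread().interrupt();          // stop the loop cleanly
        }
    }
}

ReadInstructions would then parse each line read from its BufferedReader into a DriveEvent (via the DriveFactory) and hand it to addEvent.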

Pictures and Result


Interested to peek into the code?

 Next steps

  1. Add a closed-loop control system with OpenCV
  2. Add unit-testing code (yeah yeah!!! I know, I am yet to reach the TDD level)