Gesture-controlled stepper motor-based positioning function for camera systems with facial recognition

04/02/2020 Know-How

In the city of Hasselt in Belgium there is a parking garage named “Q-Park”. At the entrance gate, the vehicle’s license plate is read by an imaging camera and recorded on the magnetic strip of the ticket that the driver draws. The data is stored on a server, possibly in the cloud. The vehicle is parked so that the occupants can pursue their private or business interests. Payment can be made at a machine in the parking garage or, for overnight guests, at the hotel when checking out, since the hotels have a partnership with the parking garage operator. The occupants then get into the vehicle and drive to the exit, where the gate opens automatically once another camera has read the vehicle’s license plate and compared it against the data stored in the database. A very convenient concept that eliminates the need to wind down the driver’s window or to stand at a payment machine to pay the parking charge.

I Foreword

Put simply, the system consists of two fixed cameras, software algorithms for vehicle license plate detection, a database (in the cloud) and a control system for the exit gate.

This inter-system concept is consistent with Rutronik's strategy of tackling the challenges of system solutions involving many manufacturers and partners, and of developing a modified version as a pre-study to provide our customers with a proof of concept. The modified version here uses facial recognition instead of license plate recognition: the system consists of a camera with a software algorithm for facial recognition and a motor that turns the camera based on gesture control.

Rutronik leverages one of its major strengths here - the use of synergies. Specialists from the various departments, among them Power, Microcontroller, Analog & Sensor, Wireless, Embedded, Mechanical, and Passive, were called upon to define the components and the assignment of tasks. This saves time and allows an overarching, inter-departmental concept to be presented to the public, in keeping with Mr. Rudel's slogan: "all from a single source".

II Functional Description for the Demonstrator and Applications

The task: a maximum of three participants stand next to one another; a camera captures one image of each person, and each image is stored under an identifier in the form of a number or name. The camera head is aligned by a stepper motor so that each participant can be photographed. The stepper motor is driven by a high-voltage controller, and the motor controller receives its instructions via a gesture sensor. A second run is then performed with the participants' positions switched, in order to verify whether they can be recognized by the OKAO Vision algorithm provided by the image sensor manufacturer.

Our example application here is granting employees/staff access to company premises/buildings or to internal zones within the company. The company takes a picture of each of its employees and stores it in its database/on its cloud server with the appropriate permissions based on the person's position within the company. If the live camera image matches an entry in the database, the employee is granted access to the company car park, the building and internal zones within the building. Persons who are not employees of the company are denied access and must register through reception or an intercom system. Visitors may be granted corresponding rights so that they can move around within certain zones. Visitors who come regularly may also be stored in the database and avoid the need to register at reception, provided they are approved beforehand by their contact for the scheduled visit. For employees, the system can also be expanded to encompass the clock-in/clock-out system, eliminating the need for old-fashioned systems such as badges, cards or IDs. It can be used not only by businesses but also by hospitals and other public institutions.

Considering another approach, the camera can also be replaced with fin blades that can be opened or closed by means of gesture control or other sensor technology, for example to control a car's wing mirrors or to control the air intake of a vehicle. There are many other potential applications in which the basic principle of the demonstrator can be used.

III Construction of the Demonstrator and List of the Electronic Components Used

The demonstrator consists of a high-voltage controller of the HVC 4223F family from TDK, a stepper motor (14HS17-0504S), a proximity and ambient light sensor (VCNL4035X01) from Vishay, an STM32F4 board and a camera module (image sensor) from Omron (HVC-P2).

The system concept is kept very simple by constructing it almost entirely from evaluation kits in order to ensure its reproducibility. The HVC 4223F motor controller comes on the Small Demo Board SDB-I from TDK-Micronas, the gesture sensor on the VCNL4035X01-GES-SB board; these are complemented by a Nucleo board from ST with the STM32F4 microcontroller and the B5T-007001-020 kit, comprising a camera head and mainboard.

IV Gesture Sensor from Vishay, VCNL4035X01

The potential applications mentioned at the start are based on a motor drive for the alignment of the camera or fin blade, with control performed in our case via a gesture sensor. The optical sensor from Vishay used here (VCNL4035X01) is a compact (4.0 x 2.36 x 0.75mm), multifunctional sensor.

It integrates a proximity and ambient light sensor, a multiplexer, two 16-bit ADCs, an I2C interface, a programmable interrupt for the proximity and ambient light sensor, and "power on" and "shut down" functions. The sensor does not include an integrated IR emitter, but it does feature a driver that allows three external IR emitters to be connected. Vishay's gesture sensor board is shipped with demo software featuring a detection algorithm and a display of the measurement data. The software can be used to modify certain parameters.
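
As an illustration of how the raw proximity data might be fetched from the sensor, the following minimal C sketch reads one proximity channel over I2C using the STM32 HAL. The 7-bit slave address and the register addresses are assumptions chosen for illustration and must be verified against the VCNL4035X01 datasheet; the I2C handle hi2c1 is assumed to be initialized elsewhere.

#include "stm32f4xx_hal.h"            /* STM32 HAL; board setup assumed elsewhere */

/* Assumed values - verify against the VCNL4035X01 datasheet before use. */
#define VCNL4035_ADDR   (0x60u << 1)  /* assumed 7-bit I2C address, shifted for the HAL */
#define REG_PS1_DATA    0x08u         /* proximity data, channel 1 (assumed) */
#define REG_PS2_DATA    0x09u         /* proximity data, channel 2 (assumed) */
#define REG_PS3_DATA    0x0Au         /* proximity data, channel 3 (assumed) */

extern I2C_HandleTypeDef hi2c1;       /* initialized by CubeMX/board code */

/* Read one 16-bit proximity result (low byte first). */
static uint16_t vcnl4035_read_ps(uint8_t reg)
{
    uint8_t buf[2] = {0};
    HAL_I2C_Mem_Read(&hi2c1, VCNL4035_ADDR, reg, I2C_MEMADD_SIZE_8BIT,
                     buf, sizeof(buf), HAL_MAX_DELAY);
    return (uint16_t)(buf[0] | (buf[1] << 8));
}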

V Considerations Regarding the Gesture Detection Algorithm

As previously mentioned, the VCNL4035X01-GES-SB sensor board from Vishay is used, featuring three VSMY2940GX01 IR emitters arranged in a triangle and the VCNL4035X01 sensor, as shown in Figure 3. A red LED indicates whether gesture detection has been triggered, i.e. whether the signal has crossed the switching limits defined by the Upper/Lower parameters.

To detect which IR emitter the reflected light is coming from, the three emitters are controlled via a switch: they are triggered in sequence and the corresponding reflected signals are measured at the proximity sensor.

With a right-to-left gesture, an event (object) is detected if the signal from IR emitter ps2 falls within the defined switching limits (parameters Upper/Lower). The detected value is stored in a variable, IR emitter ps2 is deactivated and IR emitter ps1 is pulsed; its detected value is stored in another variable, IR emitter ps1 is deactivated, and the process is repeated.

A custom-defined algorithm is used to interpret whether a gesture runs from right to left or vice versa; the algorithm we use is called "insertion sort". With a right-to-left motion, a peak value is expected first in the measured values from IR emitter ps2, then in the values from IR emitter ps1. Observed over time, the value from IR emitter ps2 increases continuously up to its maximum and then declines continuously. The value from IR emitter ps1 is initially constant, then increases continuously until it reaches its peak, and then declines continuously.

The algorithm compares successive measured values to determine whether the signal from an IR emitter is still increasing (X_ps2(n) > X_ps2(n-1)); if it is, this value is compared against the value from the other IR emitter by way of subtraction. If X_ps2(n) < X_ps2(n-1), the previous value X_ps2(n-1) is compared against the result from the other IR emitter. If subsequent measurements from IR emitter ps2 confirm that the value X_ps2(n-1) is greater, the maximum has been found. If, in the subtraction of the two IR emitter values, IR emitter ps2 is always the minuend and IR emitter ps1 the subtrahend, the direction of the gesture is determined by the sign of the difference: a result greater than zero indicates a right-to-left motion and vice versa.

The variables are stored with timestamps. A time window of 300 ms is defined for gesture control, with a sample rate of 10 ms and a lower threshold value of 50 mA for an event to have occurred. This means that an event is only registered when the object is within a certain height above the sensor.
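
The timing-based part of this decision can be sketched in C. The following is a simplified illustration, not the firmware actually used: it samples both proximity channels every 10 ms within the 300 ms window, records when each channel reaches its peak, and derives the direction from which peak occurred first. The helper functions read_ps1(), read_ps2() and delay_ms() are assumed wrappers (for example around the I2C readout shown earlier), and the threshold is applied as a plain numeric value.

#include <stdint.h>

#define SAMPLE_PERIOD_MS  10u    /* sample rate mentioned in the text */
#define WINDOW_MS         300u   /* gesture time window mentioned in the text */
#define EVENT_THRESHOLD   50u    /* lower limit for a valid event (see above) */

typedef enum { GESTURE_NONE, GESTURE_RIGHT_TO_LEFT, GESTURE_LEFT_TO_RIGHT } gesture_t;

/* Assumed wrappers around the raw-data readout, e.g. via the I2C sketch above. */
extern uint16_t read_ps1(void);
extern uint16_t read_ps2(void);
extern void delay_ms(uint32_t ms);               /* platform-specific delay, assumed */

/* Sample ps1 and ps2 alternately over one time window and compare the
 * timestamps of their maxima to decide the direction of the gesture. */
gesture_t detect_gesture(void)
{
    uint16_t max1 = 0, max2 = 0;
    uint32_t t_max1 = 0, t_max2 = 0;

    for (uint32_t t = 0; t < WINDOW_MS; t += SAMPLE_PERIOD_MS) {
        uint16_t ps1 = read_ps1();
        uint16_t ps2 = read_ps2();

        if (ps1 > max1) { max1 = ps1; t_max1 = t; }   /* track the peak of ps1 */
        if (ps2 > max2) { max2 = ps2; t_max2 = t; }   /* track the peak of ps2 */

        delay_ms(SAMPLE_PERIOD_MS);
    }

    if (max1 < EVENT_THRESHOLD || max2 < EVENT_THRESHOLD)
        return GESTURE_NONE;                          /* no object close enough */

    /* ps2 peaking before ps1 corresponds to a right-to-left motion. */
    return (t_max2 < t_max1) ? GESTURE_RIGHT_TO_LEFT : GESTURE_LEFT_TO_RIGHT;
}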

The gesture sensor does not directly provide information about the detected gesture, only the raw data. The raw data is interpreted by an algorithm running on an external microcontroller; in our case the analysis is performed by a Nucleo board (STM32F401), which in turn controls the motor controller.

VI Motor Driver from TDK Micronas HVC 4223F

The HVC 4223F motor controller from TDK, marketed under the Micronas brand, does not possess a hardware I2C interface, but the performance capacity of the integrated ARM Cortex M3 allows the I2C signals from the Nucleo board to be emulated in software on the 11 available LGPIO pins. In our construction the first two free pins, LGPIO3 and LGPIO4, are used.
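
How such a software emulation might look in principle is sketched below. This is not the TDK-Micronas example code: the GPIO helper functions and the slave address are placeholders, and real firmware would add time-outs, STOP detection and interrupt-driven edge handling. The sketch only shows the basic idea of sampling SDA on SCL rising edges to receive one command byte on the two LGPIO pins.

#include <stdint.h>

/* Placeholder GPIO helpers for LGPIO3 (SCL) and LGPIO4 (SDA) - the real
 * HVC 4223F register accesses differ and must be taken from the SDK. */
extern int  scl_read(void);        /* read SCL level */
extern int  sda_read(void);        /* read SDA level */
extern void sda_drive_low(void);   /* switch SDA to output low (ACK) */
extern void sda_release(void);     /* release SDA (open-drain high) */

#define OWN_ADDR 0x42u             /* hypothetical 7-bit slave address */

/* Wait for a START condition: SDA falls while SCL is high. */
static void wait_for_start(void)
{
    for (;;) {
        while (!(scl_read() && sda_read())) ;   /* wait until the bus is idle */
        while (scl_read() && sda_read()) ;      /* wait for one of the lines to change */
        if (scl_read() && !sda_read())
            return;                             /* SDA fell first -> START */
    }
}

/* Clock in one byte, MSB first, sampling SDA on every SCL rising edge. */
static uint8_t read_byte(void)
{
    uint8_t value = 0;
    for (int i = 0; i < 8; i++) {
        while (scl_read()) ;                    /* wait for SCL low */
        while (!scl_read()) ;                   /* wait for the rising edge */
        value = (uint8_t)((value << 1) | (sda_read() ? 1u : 0u));
    }
    return value;
}

/* Acknowledge by holding SDA low for one clock pulse. */
static void send_ack(void)
{
    while (scl_read()) ;
    sda_drive_low();
    while (!scl_read()) ;                       /* ACK is sampled while SCL is high */
    while (scl_read()) ;
    sda_release();
}

/* Receive one command byte addressed to this device (write transfers only). */
uint8_t i2c_slave_receive_command(void)
{
    for (;;) {
        wait_for_start();
        uint8_t addr = read_byte();
        if ((addr >> 1) != OWN_ADDR || (addr & 1u))
            continue;                           /* not our address, or a read request */
        send_ack();
        uint8_t cmd = read_byte();
        send_ack();
        return cmd;                             /* STOP handling omitted for brevity */
    }
}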

The HVC 4223F is distinguished by its high level of integration in a compact QFN40 package (6.0 x 6.0 mm). The target applications are smart actuation solutions, both with brushed DC motors and with brushless DC and stepper motors. The integrated half-bridges, a supply stage for direct connection to the battery voltage, a LIN interface and 32 kB of flash memory enable this integration while largely eliminating the need for external components. To control the bipolar stepper motor in our example application, four of the six available n/n-channel half-bridge FETs are used.
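
A rough sketch of how the four half-bridges can drive the two motor phases is shown below. The set_phase() and delay_ms() helpers are placeholders for the actual HVC 4223F output control; the step table itself is the standard full-step pattern for a bipolar stepper motor.

#include <stdint.h>

/* Placeholder: drive coil A and coil B with the given polarity (-1, 0, +1)
 * via the four half-bridge outputs; the real HVC 4223F register writes differ. */
extern void set_phase(int coil_a, int coil_b);
extern void delay_ms(uint32_t ms);

/* Standard full-step sequence for a bipolar stepper: A+, B+, A-, B-. */
static const int FULLSTEP[4][2] = {
    { +1,  0 },
    {  0, +1 },
    { -1,  0 },
    {  0, -1 },
};

/* Move the motor by 'steps' full steps; negative values reverse the direction. */
void stepper_move(int steps, uint32_t step_delay_ms)
{
    static int index = 0;                       /* remembers the current phase */
    int dir = (steps >= 0) ? 1 : -1;
    int count = (steps >= 0) ? steps : -steps;

    for (int i = 0; i < count; i++) {
        index = (index + dir + 4) % 4;
        set_phase(FULLSTEP[index][0], FULLSTEP[index][1]);
        delay_ms(step_delay_ms);                /* simple fixed step rate */
    }
}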

The HVC 4223F offers hardware support for controlling a bipolar stepper motor by means of current or voltage control. Where current control is used, the measured phase current is compared against a pre-defined threshold value from the programmable 8-bit DAC. When this value is exceeded, a comparator in the EPWM (enhanced PWM) module automatically disables the corresponding MOUTx output until the current falls below the defined normal operating value again.
Overvoltage, overcurrent and overtemperature monitors are integrated into the IC for diagnostic functions. The chip can be connected directly to the 12-18 V car battery and also has a LIN interface (LIN 2.2 transceiver) for direct communication with the outside world. The 32-bit ARM Cortex M3 processor and 32 kB of flash memory provide enough computing power to support even complex algorithms for controlling devices such as permanent magnet synchronous motors. TDK-Micronas recommends the Keil MDK-ARM v5.14 software environment, which requires no license for code sizes up to 32 kB.
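
How the 8-bit threshold value relates to the phase current can be illustrated with a small worked example. The shunt resistance, amplifier gain and DAC reference voltage below are assumed values chosen purely for illustration; the real figures must be taken from the HVC 4223F documentation and the board design.

#include <stdint.h>

/* Assumed example values - not taken from the HVC 4223F datasheet. */
#define R_SHUNT_OHM   0.5f     /* current-sense shunt resistance */
#define AMP_GAIN      10.0f    /* gain of the current-sense amplifier */
#define V_REF         5.0f     /* DAC reference voltage */

/* Convert a desired phase-current limit in amperes into an 8-bit DAC code. */
static uint8_t current_limit_to_dac(float i_limit_a)
{
    float v_sense = i_limit_a * R_SHUNT_OHM * AMP_GAIN;   /* voltage seen by the comparator */
    float code = v_sense / V_REF * 255.0f;
    if (code > 255.0f) code = 255.0f;                     /* clamp to the 8-bit range */
    if (code < 0.0f)   code = 0.0f;
    return (uint8_t)(code + 0.5f);
}

/* Example: a 0.5 A limit with the values above gives
 * 0.5 A * 0.5 Ohm * 10 = 2.5 V  ->  2.5 / 5.0 * 255 = 128 (rounded). */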

Libraries and source code in the form of application notes are provided by TDK-Micronas as examples for controlling motors, for the LIN interface and for other general applications. For cases where production software is eventually to be used in the course of development, TDK-Micronas has had ASPICE-compliant firmware, which also meets functional safety requirements, developed by a system vendor.

Example: In "config.h" the motor control behavior can be adjusted.

#define CMD_POSITION_STEP (300) // defines the step size in tenths of a degree; 300 = 30°
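
As an illustration of how such a position command could be translated into motor steps, the following sketch assumes the 1.8° full-step angle of the 14HS17-0504S and a microstepping factor of 16; it is not taken from the TDK-Micronas application note.

#include <stdint.h>

/* CMD_POSITION_STEP comes from "config.h" (see the define above). */
#define FULL_STEP_DEG  1.8f    /* full-step angle of the 14HS17-0504S */
#define MICROSTEPS     16      /* assumed microstepping factor */

/* Convert an angle in tenths of a degree into a rounded number of microsteps. */
static int32_t angle_to_microsteps(int32_t tenths_of_degree)
{
    float degrees = tenths_of_degree / 10.0f;
    float steps   = degrees / (FULL_STEP_DEG / MICROSTEPS);
    return (int32_t)(steps + (steps >= 0.0f ? 0.5f : -0.5f));   /* round to nearest */
}

/* Example: angle_to_microsteps(300) -> 30 deg / 0.1125 deg per microstep = 267 microsteps. */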

In summary, the HVC 4223F is a versatile motor controller for a variety of small, smart electric actuators. The ARM Cortex M3 processor offers enough performance to implement even complex motor control algorithms. The complete integration of all components required for controlling BLDC or stepper motors reduces development time and thus development costs, and significantly cuts the adaptation workload for other applications once the initial familiarization is complete. TDK-Micronas supports users with libraries and source code and offers contact with a system vendor that has developed production software/firmware for the HVC 4223F module.

In our demo application the motor controller is connected to a bipolar stepper motor (14HS17-0504S) that performs the rotary motions of the camera.

VII Stepper Motor Specifications (See table 2)

VIII Image Sensor from Omron, B5T-007001-020

The facial recognition image sensor from Omron, B5T-007001-020, is connected to the stepper motor via a shaft to enable it to be turned 30° to the right/left. The B5T HVC-P2 sensor module kit used comprises a camera head (1600x1200 pixels) and a mainboard that are connected together by means of a ribbon cable.

Omron's OKAO Vision technology provides ten detection functions to choose from, among them facial recognition, human body detection, gender estimation, age estimation, eye tracking and blink detection, and hand detection. The expression estimation function recognizes five facial expressions (neutral, happy, surprised, angry, sad). Three image output formats are available: no image output, 160x120 pixels and 320x240 pixels. The kit is available with two different camera heads, for wide-angle capture and for long-distance capture.
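
The module is controlled by a host over a serial interface using command frames. The sketch below shows how such a request might be assembled and sent; the sync byte, command number, flag bits and frame layout are placeholders that must be replaced with the values from Omron's HVC-P2 command specification, and uart_send() stands for any platform UART transmit routine.

#include <stdint.h>
#include <string.h>

extern void uart_send(const uint8_t *data, uint16_t len);   /* platform UART TX, assumed */

/* Placeholder values - replace with those from the HVC-P2 command specification. */
#define HVC_SYNC_BYTE     0xFEu   /* assumed frame sync code */
#define HVC_CMD_EXECUTE   0x04u   /* assumed "execute detection" command number */

/* Build and send a command frame: sync, command, 16-bit payload length (LSB first), payload. */
static void hvc_send_command(uint8_t cmd, const uint8_t *payload, uint16_t len)
{
    uint8_t frame[4 + 8];                        /* header plus a small payload */
    if (len > sizeof(frame) - 4)
        return;                                  /* payload too large for this sketch */
    frame[0] = HVC_SYNC_BYTE;
    frame[1] = cmd;
    frame[2] = (uint8_t)(len & 0xFFu);           /* payload length, low byte */
    frame[3] = (uint8_t)(len >> 8);              /* payload length, high byte */
    memcpy(&frame[4], payload, len);
    uart_send(frame, (uint16_t)(4 + len));
}

/* Example: request detection with a small preview image.
 * The flag bytes and image-format byte are assumptions. */
void hvc_execute_detection(void)
{
    uint8_t payload[3] = { 0x03u, 0x00u, 0x01u };   /* detection flags (2 bytes) + image format */
    hvc_send_command(HVC_CMD_EXECUTE, payload, sizeof(payload));
}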

IX Remarks Regarding the Position Feedback from the Motor Driver

A remark regarding the demonstrator: no position feedback is used, so the microcontroller cannot know the motor's position at power-on. The motor or camera therefore has to be aligned manually at each power-on. The motor controller implements the following behavior: at power-on, the motor (and thus the camera) is assumed to be in its center position (0°). From this center position the camera can be moved to the left (-30°); from there it can only be moved back to the right, to the center position (0°). Likewise, from the center position the camera can be moved to the right (+30°) and from there back to the center position (0°).
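
The power-on behavior described above can be captured in a small state machine. The following sketch assumes three discrete positions and rejects moves that would leave the allowed range; rotate_motor() is a placeholder for the actual positioning command (for example based on CMD_POSITION_STEP from config.h).

#include <stdint.h>

typedef enum { POS_LEFT = -1, POS_CENTER = 0, POS_RIGHT = 1 } cam_pos_t;

extern void rotate_motor(int tenths_of_degree);   /* placeholder: +/-300 = +/-30 deg */

static cam_pos_t position = POS_CENTER;           /* assumed center position at power-on */

/* Move one 30-degree step to the left or right; moves beyond the end positions are ignored. */
void camera_move(int direction)                   /* direction: -1 = left, +1 = right */
{
    int target = (int)position + direction;
    if (target < POS_LEFT || target > POS_RIGHT)
        return;                                   /* already at the end position */
    rotate_motor(direction * 300);                /* 300 tenths of a degree = 30 deg */
    position = (cam_pos_t)target;
}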

The following options for position feedback are available. Mechanical method: the motor performs a calibration sweep after power-on against mechanical limit stops; using the software's blockage detection, this position can be detected as a limit position (+X). A mechanical limit stop should therefore be provided for each rotational direction. This requires additional effort for the mechanical design and a minor software adjustment. Sensor method: the motor performs a calibration sweep at power-on and a sensor detects the limit position. When using a Hall-effect sensor, two sensors should be used to provide a limit position for each rotational direction. This requires additional effort for the mechanical design, the attachment of a magnet and the sensors, plus wiring and a software adjustment.
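
The sensor-based variant could be sketched roughly as follows: after power-on the motor steps slowly in one direction until an end-position sensor (for example a Hall switch) triggers, and that point is then used as the position reference. The helper functions and the step limit are placeholders; a real implementation would also need error handling and a sweep in both directions if two limit positions are used.

#include <stdint.h>
#include <stdbool.h>

extern bool end_sensor_triggered(void);          /* placeholder: Hall switch at the limit position */
extern void step_once(int direction);            /* placeholder: one (micro)step, -1/+1 */
extern void delay_ms(uint32_t ms);

#define MAX_HOMING_STEPS 1000                    /* assumed safety limit for the sweep */

/* Calibration sweep: step towards the limit until the sensor triggers,
 * then use this point as the position reference. Returns false on failure. */
bool home_camera(int32_t *position_steps)
{
    for (int i = 0; i < MAX_HOMING_STEPS; i++) {
        if (end_sensor_triggered()) {
            *position_steps = 0;                 /* limit position becomes the reference */
            return true;
        }
        step_once(-1);                           /* sweep towards the left limit */
        delay_ms(5);                             /* slow, gentle approach */
    }
    return false;                                /* sensor never triggered */
}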

Find components at www.rutronik24.com.

Subscribe to our newsletter at www.rutronik.com/newsletter and stay updated.