STM32F4 Quadcopter flight controller

Keywords: C, FreeRTOS, STM32F4, flight controller, quadcopter, sensor fusion, PID

As an on-and-off project I’ve been working on a custom flight controller firmware. It’s based on an STM32F4 ARM Cortex-M4 running FreeRTOS, with a 9-DOF IMU for stability around the pitch, roll and yaw axes, as well as three different barometers and a downwards-pointing laser for reliable altitude-hold functionality. It can be controlled either from an RC receiver/transmitter system, or from a custom Android joystick app via an ESP8266 over WiFi. A very simple PCB was designed in KiCad.

I used STM32CubeMX to set up the peripherals and HAL libraries, and IAR Embedded Workbench for development and debugging. The clip below demonstrates the basic functionality, and a new video of the full project is expected shortly!

The app was designed using Blynk and the circuit board was ordered from OSHPark.

app_pcb

Used in the project are:

  • STM32F4 ARM Cortex-M4 running at 168 MHz
  • EM7180 sensor fusion module with a 9-DOF MPU-9250 (gyroscope, accelerometer, magnetometer) and a BMP280 (barometer)
  • BMP280 barometer
  • BMP180 barometer
  • VL53L0X laser for ground distance estimation

drone_flowchart

    The EM7180 outputs raw sensor data as well as quaternions at 200 Hz. A task converts the quaternions into Euler angles, feeding the roll, pitch and yaw PID loops at 200 Hz. Another task calculates the altitude (in cm) using a low-pass-filtered average of the three barometers as well as data from the accelerometer and laser. Depending on the altitude, either the laser readings (low altitude) or the barometer readings (high altitude) are fused with the accelerometer data using a complementary filter. The altitude PID loop runs at 50 Hz due to the slower barometer and laser update rates. Each PID loop is given the filtered estimate of the current state from the sensors, the velocity, and the desired state read from the RC receiver or the Android app. It outputs a float between -1.0f and 1.0f which contributes to the final PWM signal, consisting of:

  • hover – a hard coded value between 0.0f and 1.0f which corresponds to when the drone has enough thrust to hover.
  • alt – a PID controlled value between -1.0f and 1.0f for altitude hold functionality.
  • pitch – a PID controlled value between -1.0f and 1.0f.
  • roll – a PID controlled value between -1.0f and 1.0f.
  • yaw – a PID controlled value between -1.0f and 1.0f.
    // Calculate PID values
    float pitch     = PID_Calc(&pitchPid);
    float roll      = PID_Calc(&rollPid);
    float yaw       = PID_Calc(&yawPid);
    float alt       = PID_Calc(&altitudePid);
     
    // Calculate the motor values
    // 1  front  4
    // left  right
    // 2  back   3
    esc1 = hover + alt + roll + pitch + yaw;
    esc2 = hover + alt + roll - pitch - yaw;
    esc3 = hover + alt - roll - pitch + yaw;
    esc4 = hover + alt - roll + pitch - yaw;
     
    // Feed motors only if armed
    if(arm){
        ESC_SetSpeed(TIM_CHANNEL_1, esc1);
        ESC_SetSpeed(TIM_CHANNEL_2, esc2);
        ESC_SetSpeed(TIM_CHANNEL_3, esc3);
        ESC_SetSpeed(TIM_CHANNEL_4, esc4);
    }

    The sum of the PID outputs together with the hard-coded hover value adds up to a float between -1.0f and 1.0f, which is converted into a PWM signal of 1-2 ms pulses.
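    For illustration, that conversion could look like the sketch below. This is a hypothetical helper, not the firmware's actual `ESC_SetSpeed`; the assumption here is a timer counting in microseconds.

```c
#include <stdint.h>

/* Hypothetical sketch: map a throttle in [-1.0, 1.0] to a 1-2 ms
 * ESC pulse width, assuming the timer counts in microseconds. */
static uint16_t pulse_from_throttle(float throttle)
{
    if (throttle < -1.0f) throttle = -1.0f;  /* clamp to valid range */
    if (throttle >  1.0f) throttle =  1.0f;
    /* -1.0 -> 1000 us, 0.0 -> 1500 us, +1.0 -> 2000 us */
    return (uint16_t)(1500.0f + 500.0f * throttle);
}
```

    The resulting pulse width would then be loaded into the timer's compare register for the corresponding channel.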

    The full firmware can be found on GitHub.

    Angle estimation using Kalman filter

    Keywords: Sensor fusion, angle estimation, gyroscope, accelerometer, Kalman filter

    In order to estimate an angle from gyroscope and accelerometer data I decided to implement a Kalman filter. I won’t dig much into the theory as plenty of other sources on the interwebs are more qualified for that.

    Shown below is the result of the Kalman filter in action during a 20-second period, where it can be seen that the resulting curve (blue) follows the gyro (green) short term but is corrected by the accelerometer (red) long term.

    kalman

    I decided to implement the system matrices as a struct so the filter can easily be modified for other sensors and purposes, though only in two dimensions, meaning only two different sensor inputs can be used at a time.

    typedef struct{
        // Inputs
        float u;        // Control input (for angle estimation use gyro data)
        float z[2];     // Measurement input (for angle estimation use acc data)
        float dt;       // Time step
        // Output matrix
        float x[2];     // Output
        // Initialize variance for each sensor
        float Q[2];     // Process (control) variance (for u)
        float R[2];     // Measurement variance (for z)
        // Initialize models
        float H[2][2];  // Observation model
        float F[2][2];  // State transition model
        float B[2];     // Control input model
        float P[2][2];  // Covariance matrix
        // Don't initialize
        float y[2];
        float S[2][2];
        float K[2][2];
    } FilterKalman_t;

    In order to initialize it, the state transition model F as well as the control input model B have to be found. They can be derived by looking at the units of our inputs and desired outputs, where our desired output vector is angle and gyro bias:

        \[ x_{k} = \begin{bmatrix} \theta \\ \dot{\theta}_{b} \end{bmatrix}_{k} \]

    The first element in the measurement vector z is acceleration, which can be translated directly into an angle and hence already has the correct unit. Our control input u is the gyroscope output (angular velocity), which can be translated into an angle by integrating over time. The full expression for our first output parameter would be:

    angle = initial angle + (angular velocity * dt)

    You might recognize the formula from high school science class.

    Since an angle can be extracted directly from the acceleration (which the measurement update uses as correction), and our gyroscope reading drifts by a bias, we change the expression to:

    angle = previous angle + ((gyro - gyro_bias) * dt)

    The state of the system

        \[ x_{k} = \boldsymbol{F}x_{k-1} + \boldsymbol{B}u_{k} \]

    hence gives us the expression:

        \[ \begin{bmatrix} \theta \\ \dot{\theta}_{b} \end{bmatrix}_{k} = \begin{bmatrix} 1 & -dt \\ 0 & 1 \end{bmatrix} \begin{bmatrix} \theta \\ \dot{\theta}_{b} \end{bmatrix}_{k-1} + \begin{bmatrix} dt \\ 0 \end{bmatrix} u_{k} \]

    The full initialization will be:

    FilterKalman_t angleFilterKalman = {
        .F[0][0] = 1,
        .F[0][1] = -dt,
        .F[1][0] = 0,
        .F[1][1] = 1,
        .B[0] = dt,
        .B[1] = 0,
        .Q[0] = 0.001,  // Variance for gyro
        .Q[1] = 0.003,  // Variance for gyro bias
        .R[0] = 0.03,   // Variance for accelerometer
        .R[1] = 0.03,   // Variance for accelerometer
        .H[0][0] = 1,
        .H[0][1] = 0,
        .H[1][0] = 0,
        .H[1][1] = 0,
     
        .P[0][0] = 0,
        .P[0][1] = 0,
        .P[1][0] = 0,
        .P[1][1] = 0
    };

    The filter function itself is implemented with comments to keep it quite straightforward.

    void Filter_Kalman(FilterKalman_t *k)
    {
        // x = F*x + B*u
        float x_temp[2];
        x_temp[0] = k->F[0][0] * k->x[0] + k->F[0][1] * k->x[1] + k->B[0]*k->u; // Position
        x_temp[1] = k->F[1][0] * k->x[0] + k->F[1][1] * k->x[1] + k->B[1]*k->u; // Velocity
        k->x[0] = x_temp[0];
        k->x[1] = x_temp[1];
        //rate = u - x[1];
     
        // P = F * P * Ft + Q
        float P_temp[2][2];
        P_temp[0][0] = k->F[0][0] * (k->F[0][0]*k->P[0][0]+k->F[0][1]*k->P[1][0]) + k->F[0][1] * (k->F[0][0]*k->P[0][1]+k->F[0][1]*k->P[1][1]) + k->Q[0]*k->dt;
        P_temp[0][1] = k->F[1][0] * (k->F[0][0]*k->P[0][0]+k->F[0][1]*k->P[1][0]) + k->F[1][1] * (k->F[0][0]*k->P[0][1]+k->F[0][1]*k->P[1][1]);
        P_temp[1][0] = k->F[0][0] * (k->F[1][0]*k->P[0][0]+k->F[1][1]*k->P[1][0]) + k->F[0][1] * (k->F[1][0]*k->P[0][1]+k->F[1][1]*k->P[1][1]);
        P_temp[1][1] = k->F[1][0] * (k->F[1][0]*k->P[0][0]+k->F[1][1]*k->P[1][0]) + k->F[1][1] * (k->F[1][0]*k->P[0][1]+k->F[1][1]*k->P[1][1]) + k->Q[1]*k->dt;
     
        k->P[0][0] = P_temp[0][0];
        k->P[0][1] = P_temp[0][1];
        k->P[1][0] = P_temp[1][0];
        k->P[1][1] = P_temp[1][1];
     
        // S = H*P * Ht + R
        k->S[0][0] = k->H[0][0] * k->H[0][0] * k->P[0][0] + k->R[0];
        k->S[0][1] = k->H[0][1] * k->H[1][0] * k->P[0][1];
        k->S[1][0] = k->H[0][1] * k->H[1][0] * k->P[1][0];
        k->S[1][1] = k->H[1][1] * k->H[1][1] * k->P[1][1] + k->R[1];
     
        // K = P * Ht * S^-1
        float S_inv = k->S[0][0]*k->S[1][1] - k->S[0][1]*k->S[1][0];
     
        k->K[0][0] = ((k->H[0][0]*k->P[0][0] + k->H[0][1]*k->P[0][1])*k->S[1][1] - (k->H[1][0]*k->P[0][0] + k->H[1][1]*k->P[0][1])*k->S[1][0]) / S_inv;
        k->K[0][1] = ((k->H[1][0]*k->P[0][0] + k->H[1][1]*k->P[0][1])*k->S[0][0] - (k->H[0][0]*k->P[0][0] + k->H[0][1]*k->P[0][1])*k->S[0][1]) / S_inv;
        k->K[1][0] = ((k->H[0][0]*k->P[1][0] + k->H[0][1]*k->P[1][1])*k->S[1][1] - (k->H[1][0]*k->P[1][0] + k->H[1][1]*k->P[1][1])*k->S[1][0]) / S_inv;
        k->K[1][1] = ((k->H[1][0]*k->P[1][0] + k->H[1][1]*k->P[1][1])*k->S[0][0] - (k->H[0][0]*k->P[1][0] + k->H[0][1]*k->P[1][1])*k->S[0][1]) / S_inv;
     
        // y = z - (H*x)
        k->y[0] = k->z[0] - (k->H[0][0]*k->x[0] + k->H[0][1]*k->x[1]);
        k->y[1] = k->z[1] - (k->H[1][0]*k->x[0] + k->H[1][1]*k->x[1]);
     
        // x = x + (K*y)
        k->x[0] = k->x[0] + k->K[0][0]*k->y[0] + k->K[0][1]*k->y[1];
        k->x[1] = k->x[1] + k->K[1][0]*k->y[0] + k->K[1][1]*k->y[1];
     
        // P = (I - K*H)*P
        P_temp[0][0] = k->P[0][0]*(1 - k->H[0][0]*k->K[0][0] - k->H[1][0]*k->K[0][1]);
        P_temp[0][1] = k->P[0][1]*(0 - k->H[0][1]*k->K[0][0] - k->H[1][1]*k->K[0][1]);
        P_temp[1][0] = k->P[1][0]*(0 - k->H[0][0]*k->K[1][0] - k->H[1][0]*k->K[1][1]);
        P_temp[1][1] = k->P[1][1]*(1 - k->H[0][1]*k->K[1][0] - k->H[1][1]*k->K[1][1]);
     
        k->P[0][0] = P_temp[0][0];
        k->P[0][1] = P_temp[0][1];
        k->P[1][0] = P_temp[1][0];
        k->P[1][1] = P_temp[1][1];
    }
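    Since only H[0][0] = 1 is non-zero in the initialization above, the general 2×2 update collapses to the classic two-state (angle, gyro bias) filter. For reference, here is a condensed, self-contained sketch of the same iteration; this is illustrative only, not the firmware's actual code, and the names are assumptions.

```c
/* Condensed two-state (angle, gyro bias) Kalman step, equivalent to
 * the struct-based filter above when only H[0][0] = 1 is non-zero.
 * Names and structure are illustrative. */
typedef struct {
    float angle, bias;      /* state: angle and gyro bias */
    float P[2][2];          /* error covariance */
    float Q_angle, Q_bias;  /* process noise variances */
    float R;                /* accelerometer measurement variance */
} Kalman1D_t;

float Kalman_Step(Kalman1D_t *k, float gyro, float acc_angle, float dt)
{
    /* Predict: x = F*x + B*u, integrating the bias-corrected rate */
    k->angle += dt * (gyro - k->bias);

    /* Predict: P = F*P*Ft + Q*dt */
    k->P[0][0] += dt * (dt * k->P[1][1] - k->P[0][1] - k->P[1][0] + k->Q_angle);
    k->P[0][1] -= dt * k->P[1][1];
    k->P[1][0] -= dt * k->P[1][1];
    k->P[1][1] += dt * k->Q_bias;

    /* Update: innovation y, gain K, then correct state and covariance */
    float S  = k->P[0][0] + k->R;
    float K0 = k->P[0][0] / S;
    float K1 = k->P[1][0] / S;
    float y  = acc_angle - k->angle;

    k->angle += K0 * y;
    k->bias  += K1 * y;

    float P00 = k->P[0][0], P01 = k->P[0][1];
    k->P[0][0] -= K0 * P00;
    k->P[0][1] -= K0 * P01;
    k->P[1][0] -= K1 * P00;
    k->P[1][1] -= K1 * P01;

    return k->angle;
}
```

    With the variances from the initialization above (Q = {0.001, 0.003}, R = 0.03) the output follows the gyro short term while converging toward the accelerometer angle long term.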

    The full source code can be found here, but has probably been improved since this post was written.

    Altitude estimation using accelerometer & barometer

    Keywords: Sensor fusion, altitude estimation, barometer, accelerometer, complementary filter

    As part of my ongoing quadcopter flight controller project I had to find a reliable way to estimate altitude in order to get an “altitude hold” functionality. I solved this by fusing data from the barometer and accelerometer using a complementary filter. The result is the picture below, followed by a brief explanation of my approach. As can be seen, the filtered result depends short term on the drifting accelerometer, while long term it is corrected by the noisy barometer.

    filter1png

    The data provided by the barometer varies by about ±50 cm, which is not good enough for altitude estimation of a quadcopter. In order to fix this, the z-axis of the accelerometer can be used. Firstly, gravity has to be subtracted so the acceleration is zero at rest. To do this, we must calculate the orientation of the gravitational vector using the pitch and roll angles. This allows the quadcopter to read zero g on the z-axis independent of orientation, as long as it’s at rest.

    Plotting the corrected data, it can be seen that both sensors are rather noisy, which is why I applied a simple low-pass filter to both signals.

    typedef struct{
      float input;
      float output_last;
      float output;
      float coeff;  // lower value, less smoothing
      float dt;       // delta time
    } lowpass_t;
     
    void low_pass(lowpass_t *temp)
    {
        const float damping = temp->dt/(temp->coeff + temp->dt);
        temp->output = temp->output_last + damping*(temp->input - temp->output_last);
        temp->output_last = temp->output;
    }

    filter2

    The unit of measurement from the barometer is already position. But the unit of the accelerometer is, shockingly, acceleration. To be able to fuse these two signals their units must match. While the temporal integral of acceleration is velocity, the second integral is position, which is what we’re interested in. Unfortunately, all accelerometers have an internal offset error, which causes the error in the calculated velocity to grow linearly with time, and the error in the calculated position to grow quadratically with time.

    This is where the complementary filter becomes useful. It fuses the velocity calculated by integrating the acceleration with the velocity calculated by differentiating the altitude estimate from the barometer.

    typedef struct{
      float vel;
      float acc;
      float alt;
      float dt;
    } altitude_t;
     
    altitude_t alt_baro;
    altitude_t alt_tot;
     
    // Differentiate the barometer altitude to get velocity in cm/s
    static float last_alt;
    alt_baro.vel = (alt_baro.alt - last_alt) / alt_baro.dt;
    last_alt = alt_baro.alt;
     
    // Velocity = initial velocity + (acceleration * time)
    float vel_temp = alt_tot.vel + alt_tot.acc * alt_tot.dt;
    // Calculate velocity using complementary filter
    static float vel_coeff = 0.999f;
    alt_tot.vel = vel_coeff * vel_temp + (1 - vel_coeff) * alt_baro.vel;

    When the velocity has been corrected, the same procedure can finally be used for the altitude estimate:

    // Position = initial position + (initial velocity * time) + (1/2 * acceleration * (time^2))
    float alt_temp = alt_tot.alt + (vel_temp * alt_tot.dt) + (0.5f * alt_tot.acc * alt_tot.dt * alt_tot.dt);
    // Calculate altitude with complementary filter
    static float alt_coeff = 0.998f;
    alt_tot.alt = alt_coeff * alt_temp + (1 - alt_coeff) * alt_baro.alt;

    The result (first picture) is a fused signal almost as smooth as the altitude estimate from the accelerometer, but with the long term correction from the barometer. This can surely be tuned and improved, and I might update the post when progress has been made.

    The full source code can be found here, but has probably been improved since this post was written.

    Speech recognition on Raspberry Pi

    Keywords: C, Node.js, FreeRTOS, ESP8266, Raspberry Pi, Speech Recognition, WiFi, MQTT

    I wanted to play around with speech recognition on a Raspberry Pi and decided to use the voice API released by IBM, through the Watson TJBot project. There are several guides out there to get you started.

    I already had my custom ESP8266 RGBW lights (post further down) controlled by a Raspberry Pi up and running at home. So instead of sending the commands over MQTT from the app, I wanted to send the same MQTT values using voice commands.

    The existing speech-to-text script used by TJBot takes a voice stream as input and gives a text string as output. So the only thing I had to do was modify the script to look for my trigger words, for example lamp1, lamp2, on, off, blue, green, red and so on, and send MQTT commands to the lights on these triggers.

    The source code for the speech recognition can be found on GitHub here. The source code for the ESP8266 firmware is the same as in an older post.

    Demo on YouTube:

    CC2650 BLE controlled longboard

    Keywords: C, Bluetooth, BLE, CC2650

    Once every year at Texas Instruments we have this thing called “Innovation Day”, where for 24 hours we play around with basically whatever we want, as long as it is TI related. This was my first time, and I decided to replace the Bluetooth controller on my longboard with TI products.

    I used two CC2650 LaunchPads, one of the BLE products I usually work with. One acts as a central device (longboard side) and one as a peripheral device (joystick side). The peripheral reads an analog value from a joystick (a PSP model from eBay) using the Sensor Controller* in the CC2650. The joystick x-position is sent as a 16-bit integer over BLE to the central, which outputs a corresponding PWM signal. The PWM signal is connected to an ESC which outputs a three-phase current to the electric motor.
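    A minimal sketch of the central-side mapping, assuming the full 16-bit range maps linearly onto a 1-2 ms ESC pulse; `esc_pulse_from_joystick` is a hypothetical helper, and the real firmware may use a different scaling or a deadband:

```c
#include <stdint.h>

/* Hypothetical sketch: map the 16-bit joystick value received over
 * BLE to an ESC pulse width in microseconds (1000-2000 us). */
static uint16_t esc_pulse_from_joystick(uint16_t raw)
{
    /* 0 .. 65535  ->  1000 .. 2000 us, integer arithmetic only */
    return (uint16_t)(1000u + ((uint32_t)raw * 1000u) / 65535u);
}
```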

    The longboard can go forward and brake.

    Indoor test on YouTube:

    Outdoor test on YouTube:

    Since this was work-ish related I didn’t put the code on GitHub, but the BLE stack can be found here and additional examples here.

    *The Sensor Controller is an ultra-low-power 16-bit RISC processor that can access the same peripherals as the main processor. This is very useful when the Cortex-M3 is overkill and not power friendly.

    ESP8266 WiFi lights

    Keywords: C, Javascript, Node.js, FreeRTOS, ESP8266, Raspberry Pi, WiFi, MQTT

    During the late autumn of 2016 I finally got the opportunity to fiddle around with the ESP8266. There are several ways to write code for the chip, but I decided to go for the open-rtos-sdk since it has the perks of being open source and based on FreeRTOS, which I’m already familiar with. I bought some cheap RGB light bulbs based on the ESP8266 chip online, together with a USB to TTL cable to be able to flash my custom firmware.

    lyt8266

    The project, which I decided to call Alfred Home Assistant, consists of several light bulbs with ESP8266s acting as MQTT clients, and a Raspberry Pi Zero acting as an MQTT broker (using Mosquitto). The RPi also runs a server to host the web application that is displayed on the smartphone. The ESP8266 outputs four PWM signals: one 8-bit signal for each RGB color and one for white. The app displays three different bars:
    1. The color spectrum, in the range 0-255 for the three colors.
    2. Saturation, the intensity of the color from 0-100: 0 gives only white, 50 gives 100% white and 100% color, and 100 gives 100% RGB color but no white.
    3. Brightness 0-100.
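    The saturation rule above can be sketched as a mixing function. This is a hypothetical illustration of the described behavior; the actual firmware's scaling may differ.

```c
#include <stdint.h>

/* Hypothetical sketch of the mixing rule described above:
 * saturation 0 -> white only, 50 -> full white + full color,
 * 100 -> color only. Brightness (0-100) scales everything. */
typedef struct { uint8_t r, g, b, w; } rgbw_t;

static rgbw_t mix_rgbw(uint8_t r, uint8_t g, uint8_t b,
                       uint8_t saturation, uint8_t brightness)
{
    float color = (saturation >= 50) ? 1.0f : saturation / 50.0f;
    float white = (saturation <= 50) ? 1.0f : (100 - saturation) / 50.0f;
    float bri   = brightness / 100.0f;

    rgbw_t out = {
        (uint8_t)(r * color * bri),
        (uint8_t)(g * color * bri),
        (uint8_t)(b * color * bri),
        (uint8_t)(255 * white * bri),
    };
    return out;
}
```

    The four fields would then drive the four PWM channels directly.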

    alfred

    The source code for the ESP8266 can be found here. Some parts are not beautiful; especially the fact that the ESP8266 seems to have only two timers (really?) caused problems when generating four individual PWM signals.

    The code for the web app can be found here. It was made in Javascript using the Onsen framework for AngularJS. It can be run on any device in basically any browser and is screen responsive. However, I’m a shitty web developer, so most of it is hardcoded and there’s plenty of room for improvement.

    Demo on YouTube:

    PID Controlled Temperature

    Keywords: C, PIC

    During my exchange at UPV (Polytechnic University of Valencia) I decided to learn how to implement a PID controller in C. I did this by making a simple temperature controller project using a PIC18F4520, simulated in Proteus ISIS 7 Professional.

    The project shows two identical temperature controllers, both connected to a seven-segment display showing the current temperature. The simulation starts at 20°C, with a desired temperature of 50°C. Each temperature controller drives a heater which can be either ON or OFF. The left one is turned ON when the temperature is < 50°C and turned OFF when the temperature is > 50°C. The right one is controlled by a PID with a PWM signal as output, running at 2 Hz.
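    The two strategies can be sketched as follows. The gains, names and structure are illustrative assumptions, not the project's actual code.

```c
/* Hypothetical sketch of the two control strategies. */
typedef struct {
    float kp, ki, kd;
    float integral, prev_error;
    float dt;                   /* 0.5 s at the 2 Hz PWM rate */
} pidctl_t;

/* PID controller whose output is clamped to a PWM duty cycle */
static float pid_step(pidctl_t *p, float setpoint, float measured)
{
    float error = setpoint - measured;
    p->integral += error * p->dt;
    float deriv = (error - p->prev_error) / p->dt;
    p->prev_error = error;

    float out = p->kp * error + p->ki * p->integral + p->kd * deriv;
    if (out > 1.0f) out = 1.0f;  /* clamp to the 0..1 duty range */
    if (out < 0.0f) out = 0.0f;
    return out;                  /* duty cycle of the 2 Hz PWM */
}

/* The left controller is plain on/off control */
static int bang_bang(float setpoint, float measured)
{
    return measured < setpoint;  /* heater ON below the setpoint */
}
```

    The on/off controller keeps switching at full power and therefore overshoots, while the PID output tapers off as the error shrinks.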

    The video shows that the left one reaches and exceeds the desired temperature, while the right, PID-controlled heater reaches 50°C and stays put.

    The source code can be found on GitHub here.

    Demo on YouTube:

    ZigBee Mesh Home Automation

    Keywords: C, ZigBee, MSP430, ATmega, Raspberry Pi

    I started this project in the summer of 2014 just for fun, and used pretty much what I had at home. The only thing I bought specifically for it was the XBee Series 2 ZigBee modules. The modules are configured as a ZigBee mesh, and can therefore forward messages between them if the transmitting node is out of range of the receiving node.

    The “brain” of the system is a Raspberry Pi. Its purpose is to run a server for the GUI, store addresses and variables in a database and, most importantly, serve as a bridge between the WiFi and ZigBee networks. The Pi is connected directly to the router with an ethernet cable, and to an ATmega162 microcontroller via UART on the GPIO pins. The coordinator ZigBee module is then connected to the ATmega.

    The end point nodes are connected to whatever microcontrollers I had at home. To control the lights I broke apart some old 433 MHz radio receiver outlets and stuffed in an MSP430G2553 and a ZigBee module, along with some necessary electronics. The cabinet door is controlled by an ATmega48 connected to a servo, along with the ZigBee module.

    The source code can be found on GitHub here.

    beforeaftercabinet
    Demo on YouTube:

    Daft Punk helmet

    Keywords: C/C++, Arduino, Bluetooth, ATmega

    I built the helmet to wear at a costume party contest on Halloween 2013, which is why it is aesthetically damaged in the videos below. However, I won first prize and got myself a nice diploma, so it was totally worth it.

    Shortly after the party it was noticed by a major Swedish TV channel, TV4 (the clip can be found here), in an interview with my good friend Sofie about encouraging girls to apply for technical educations. Nowadays it can be found in a showcase at Linköping University.

    The electronics consist of 256 LEDs soldered together into a 32×8 matrix, four 8-bit shift registers, an HC-05 Bluetooth module and an ATmega328 microcontroller. It was originally programmed with Arduino and later rewritten in C.
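    As an illustration of the data path, each 32-pixel row of the frame buffer has to be split into the four bytes clocked into the daisy-chained shift registers. This is a hypothetical sketch; the actual byte order depends on the wiring.

```c
#include <stdint.h>

/* Hypothetical sketch: split one 32-column row of the 32x8 frame
 * buffer into the four bytes for the daisy-chained 8-bit shift
 * registers. Byte order is an assumption about the wiring. */
static void row_to_bytes(uint32_t row, uint8_t out[4])
{
    out[0] = (uint8_t)(row >> 24);  /* first register in the chain */
    out[1] = (uint8_t)(row >> 16);
    out[2] = (uint8_t)(row >> 8);
    out[3] = (uint8_t)(row >> 0);   /* last register in the chain */
}
```

    The display would then be row-scanned: clock out one row's four bytes, latch, enable that row, and move on to the next.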

    Demo on YouTube: