Inertial Measurement Unit
I am using the SparkFun 9DoF Block for the Intel Edison, which has an accelerometer, a gyroscope and a magnetometer. The accelerometer points down when the board is at rest, but while it is being moved it can point in any direction. The gyroscope, on the other hand, only knows how fast it is rotating around each axis, and I am ignoring the magnetometer for now. On top of that, vibrations can cause a lot of noise in all of these readings. There are a couple of very clever and complicated ways to combine this data into a good orientation estimate, but I couldn't get them to work for me since I didn't fully understand them...
Fortunately, somewhere someone wrote about simply lerping the orientation derived from the accelerometer with the current orientation, which is the sum of the previous orientation and the integrated gyroscope data. This is easy enough to implement and turns out to work very well.
The accelerometer points in the direction a force is applied to it, which is straight down due to gravity as long as it isn't being moved otherwise. There is no yaw information in the accelerometer data, but pitch and roll can be calculated like this from the normalized accelerometer values:
float pitch = atan2(-accelerometer.x, sqrt(accelerometer.y*accelerometer.y + accelerometer.z*accelerometer.z)) * 180.0/M_PI;
float roll = atan2(accelerometer.y, (accelerometer.z > 0.0f ? 1.0f : -1.0f)*sqrt(accelerometer.z*accelerometer.z + 0.001f*accelerometer.x*accelerometer.x)) * 180.0/M_PI;
To get rid of some of the noise I am also doing simplistic low-pass filtering afterwards by lerping between the old and new values.
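That lerp-based filter might look roughly like this; `lowpass` and `alpha` are illustrative names, not taken from my code:

```cpp
// Exponential low-pass filter via lerp: blend the previous filtered value
// with the new raw reading. An alpha close to 1 keeps more of the old value,
// i.e. smooths more strongly but reacts slower.
static float lowpass(float previous, float raw, float alpha) {
    return previous * alpha + raw * (1.0f - alpha);
}
```

Applied per axis after each accelerometer update, this cheaply suppresses vibration spikes at the cost of some lag.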
The gyroscope data is added to the previous orientation (still ignoring yaw for now), and the resulting new orientation is then lerped with the orientation from the accelerometer, currently weighting the gyro-based orientation very strongly:
orientation = Quaternion::WithLerpSpherical(_accelerometer, _gyroscope, 0.99f);
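For a single axis the same idea can be sketched without quaternions; this is a hypothetical pitch-only version for illustration (my actual code uses the quaternion lerp above):

```cpp
#include <cmath>

// Pitch-only sketch of the filter: integrate the gyro rate (deg/s) into the
// previous pitch, derive an absolute but noisy pitch from the normalized
// accelerometer, then lerp between the two, trusting the gyro strongly.
struct Accel { float x, y, z; };

static float pitchFromAccel(const Accel& a) {
    return atan2f(-a.x, sqrtf(a.y * a.y + a.z * a.z)) * 180.0f / (float)M_PI;
}

static float updatePitch(float prevPitch, float gyroRate, float dt, const Accel& a) {
    float gyroPitch  = prevPitch + gyroRate * dt;  // dead-reckoned, drifts over time
    float accelPitch = pitchFromAccel(a);          // absolute, but noisy
    return gyroPitch * 0.99f + accelPitch * 0.01f; // gyro-heavy lerp
}
```

The small accelerometer weight is enough to slowly pull the integrated gyro value back toward gravity, which cancels the drift.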
The result is quite stable and accurate.
PID Controller
I thought I could get away with just the proportional part, but that turned out not to work very well, so I implemented the whole thing, which is easy enough to do. Wanting to be clever, I am not controlling the angle directly but the distance between each motor's actual position and its target position. I have a feeling this is not going to work out very well once I include yaw, but for now it does the job. What turned out to be tricky is finding good values for the P, I and D terms. I started out with the quadcopter on the floor of my room and had to reprint lots of parts because it did anything but hover.
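A minimal PID update looks roughly like this; the struct and names are placeholders rather than my actual code, and the error fed in would be that motor-to-target distance:

```cpp
// Minimal PID controller: the output is a weighted sum of the current
// error (P), its accumulated history (I) and its rate of change (D).
struct PID {
    float kp, ki, kd;
    float integral  = 0.0f;
    float prevError = 0.0f;

    float update(float error, float dt) {
        integral += error * dt;
        float derivative = (error - prevError) / dt;
        prevError = error;
        return kp * error + ki * integral + kd * derivative;
    }
};
```

With only kp set, the loop tends to oscillate around the target; the D term damps that overshoot and the I term removes the remaining steady-state offset.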
Then I built this rig where I attached two of the arms to a string and tested with only one axis. My current values still aren't great, but they somewhat work.
At some point it stops staying level, but checking the IMU data it knows about this and is commanding the motors correctly. I think one of the motors is simply spinning slower than the other for the same value sent to its speed controller. I am currently investigating this...
As always, the code can be found on GitHub:
And as you can see, there is also the software to control it with. It uses ENet for the networking and implements the controls using a PS4 controller connected with a cable to the PC running the program. It is all still very much WIP, but it might be good enough for a first flight soon :).