I started playing the VR game Echo Arena competitively, and in this post I explain what it's all about.
I started implementing multiplayer and wrote about my progress so far using ENet and Protocol Buffers.
I implemented a 3D audio library based on Steam Audio and libsoundio. This post covers how I did it, as well as some things I would have liked to know before I started, because the Steam Audio documentation is somewhat lacking.
I want to make VR games in my spare time in a more serious way. This blog post is supposed to be one in a series of hopefully many about that. This one is especially about my general plan and some more specific technical issues I had to overcome with Bullet physics.
Short blog post about switching from pants to Anchor and Sushi, the Game Boy game we wrote in assembly during the Global Game Jam 2017.
If you start writing a 3D graphics engine, the basics usually consist of loading mesh, texture, and shader data and getting it to the GPU in a way that enables the GPU to run the shader with the mesh and texture data as input. To load texture data in a platform-independent way, easy formats to get started with are TGA, BMP, and PNG (with a lot of help from libpng...). Much better for a real game are usually compressed formats such as S3TC that the GPU can decompress while rendering at no performance cost. But at least I have a lot of PNG files lying around that I want to test with, and I don't really feel like converting everything until I have a real asset pipeline going.
I participated in the InnoGames Game Jam 2016 at Gamescom and wrote a little bit about my experience.
I am showing the minimalistic way I currently implemented for my quadcopter's IMU to merge the sensor's relative gyroscope data and its accelerometer data into an absolute orientation. Also a tiny little bit about PID...