Does anyone know of a technical breakdown for the development of this game? Apparently the 3D assets are real paper objects that went through a photogrammetry pipeline, but I've found no information about how they did it! A few years ago Meshroom [0] was the OSS reference for this, but I'm not sure what you'd use nowadays. I'm also curious about the Godot stuff, since I would have expected Blender Studio to use Armory 3D [1] for something like this a few years ago, after the Blender Game Engine went defunct. (Happy they went with Godot though; it looks much more promising.)
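For reference, Meshroom can still be driven headlessly over a folder of photos. A minimal sketch of kicking it off from Python; the meshroom_batch executable name and its flags are from memory, so treat them as assumptions and check your install's help output first:

    import subprocess
    from pathlib import Path

    # Hedged sketch: run Meshroom's batch photogrammetry pipeline on a folder
    # of photos of one prop. Paths are placeholders; "meshroom_batch" and its
    # flags are assumptions from memory, not verified against a current build.
    photos = Path("photos/prop_01")
    out = Path("recon/prop_01")
    out.mkdir(parents=True, exist_ok=True)

    subprocess.run(
        ["meshroom_batch", "--input", str(photos), "--output", str(out)],
        check=True,
    )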
They've published a series of articles on this project:
>So this is how a lot of our assets ended up being primarily designed in real life as painted paper-cutout models. The models are then photographed, taken apart and scanned. Some paint samples are also created specifically to become tileable.
The paper objects were unfolded, scanned/photographed, imported into Blender, and mapped onto 3D models of the objects; no photogrammetry was used, afaik.
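If you want to try that workflow yourself, here's a minimal Blender Python sketch of wiring a scanned paper texture into a model's material. The object name and file path are made up, and it assumes the mesh already has a UV map laid out to match the unfolded cutout:

    import bpy

    # Placeholder object name; in practice this is the mesh traced from the cutout.
    obj = bpy.data.objects["Prop"]

    # New node-based material holding the scanned/photographed paper.
    mat = bpy.data.materials.new(name="ScannedPaper")
    mat.use_nodes = True
    nodes = mat.node_tree.nodes
    links = mat.node_tree.links

    # Image texture node pointing at the scan (path relative to the .blend file).
    tex = nodes.new("ShaderNodeTexImage")
    tex.image = bpy.data.images.load("//scans/prop_unfolded.png")

    # Feed the scan into the default Principled BSDF's base color.
    bsdf = nodes["Principled BSDF"]
    links.new(tex.outputs["Color"], bsdf.inputs["Base Color"])

    # Assign the material; the existing UV map does the actual "mapping onto the model".
    obj.data.materials.append(mat)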
[0]: https://alicevision.org/#meshroom

[1]: https://armory3d.org/engine/