I'm developing a query engine for spatial data in Python and needed a way to efficiently serialize 2D/3D data into 1D while maintaining spatial locality. I went with Hilbert space-filling curves, since they preserve locality better than Z-order curves; the trade-off is that Hilbert mappings are more complex and slower to compute.

So I built HilbertSFC, a pure-Python encoder/decoder accelerated with Numba. It reaches roughly 1.8 nanoseconds per point for 2D encoding/decoding, and up to 500 million points per second when multi-threaded. In my benchmarks it is 3-4x faster than existing Python packages and even outperforms some Rust implementations.

I'm targeting Python developers who need a fast, production-ready package for spatial indexing and processing. More details are in my benchmarks comparing the various implementations. I'd be really curious to hear from anyone else with insights into this type of performance work!
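For readers unfamiliar with the mapping itself: HilbertSFC's internals aren't shown here, but the classic iterative rotate/flip formulation of the 2D Hilbert mapping (as described on Wikipedia's Hilbert-curve page) is a minimal pure-Python sketch of what such an encoder/decoder computes. The function names `xy2d`/`d2xy` are my own, not the package's API.

```python
# Minimal reference implementation of the 2D Hilbert mapping.
# n is the grid side length and must be a power of two.

def xy2d(n, x, y):
    """Map a point (x, y) on an n x n grid to its Hilbert-curve index."""
    d = 0
    s = n // 2
    while s > 0:
        rx = 1 if x & s else 0
        ry = 1 if y & s else 0
        d += s * s * ((3 * rx) ^ ry)
        # Rotate/flip the quadrant into canonical orientation for the next bit.
        if ry == 0:
            if rx == 1:
                x = s - 1 - x
                y = s - 1 - y
            x, y = y, x
        s //= 2
    return d

def d2xy(n, d):
    """Inverse mapping: Hilbert index d back to the point (x, y)."""
    x = y = 0
    t = d
    s = 1
    while s < n:
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        # Undo the rotation/flip applied at this level during encoding.
        if ry == 0:
            if rx == 1:
                x = s - 1 - x
                y = s - 1 - y
            x, y = y, x
        x += s * rx
        y += s * ry
        t //= 4
        s *= 2
    return x, y
```

Consecutive Hilbert indices always land on adjacent grid cells, which is exactly the locality property that makes the curve attractive for spatial indexing; compiling tight per-bit loops like these with Numba is presumably where a fast implementation gets its speed.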
3 Answers
Looks interesting, but I can’t help but wonder if it's really as good as you say. Would you be open to integrating HilbertSFC with a popular open-source library to prove its capabilities? That might help potential users trust it more.
I have to say, I've seen quite a few libraries being touted as 'production-ready' lately, and they often fall short. What makes you feel confident that HilbertSFC is ready for prime time? Any specific testing or use cases?
I totally get that skepticism! I've tested it rigorously on large datasets, and it has performed consistently and correctly. It’s currently being used in a private project for point cloud data, which I plan to make public soon.
Awesome work on HilbertSFC! The performance improvements you outlined are really impressive. I’m especially interested in how you managed to optimize the architectural aspects like the fixed-structure finite state machine and the LUT indexing. Did you encounter any significant challenges during development?
Thanks! Yes, those optimizations were the hardest part, especially balancing speed against code complexity. The architectural tweaks took a lot of testing to see how they performed on real-world data.
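For anyone curious what a "fixed-structure finite state machine with LUT indexing" can look like in practice: the author's actual tables aren't shown, but here's one way to build a table-driven 2D encoder, where the per-bit rotate/flip logic is precomputed into a 4-state transition table so each bit-plane of the input costs just two lookups. All names here are illustrative and not taken from HilbertSFC.

```python
# Sketch of a table-driven ("LUT") 2D Hilbert encoder. The state table is
# derived mechanically from the classic per-bit rotate/flip rules.

def _build_tables():
    # A state maps raw quadrant bits (bx, by) to curve-local bits (rx, ry).
    # Each state is a 4-tuple indexed by bx * 2 + by.
    identity = tuple(divmod(i, 2) for i in range(4))
    swap     = tuple((by, bx) for bx, by in identity)          # (x, y) -> (y, x)
    flipswap = tuple((1 - by, 1 - bx) for bx, by in identity)  # flip both, then swap

    def compose(op, state):
        # Apply `state` first, then `op`.
        return tuple(op[rx * 2 + ry] for rx, ry in state)

    states, digit_lut, next_lut = [identity], [], []
    for state in states:                  # list grows as new states are found
        digits, nexts = [], []
        for q in range(4):
            rx, ry = state[q]
            digits.append((3 * rx) ^ ry)  # base-4 Hilbert digit for this quadrant
            if ry == 0:
                op = flipswap if rx == 1 else swap
            else:
                op = identity
            nxt = compose(op, state)
            if nxt not in states:
                states.append(nxt)
            nexts.append(states.index(nxt))
        digit_lut.append(digits)
        next_lut.append(nexts)
    return digit_lut, next_lut

DIGIT_LUT, NEXT_LUT = _build_tables()    # 4 states x 4 quadrants each

def hilbert_encode_lut(order, x, y):
    """Encode (x, y), with `order` bits per axis, using only table lookups."""
    d, state = 0, 0
    for bit in range(order - 1, -1, -1):
        q = ((x >> bit) & 1) * 2 + ((y >> bit) & 1)
        d = (d << 2) | DIGIT_LUT[state][q]
        state = NEXT_LUT[state][q]
    return d
```

Because the tables are tiny (4 states x 4 quadrants), they stay in cache, and inside a compiled loop the per-bit work reduces to two array indexings and a couple of shifts, which is the general shape of optimization the throughput numbers above would depend on.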

Integration is a great idea! Do you have any specific libraries in mind? I'll definitely consider it, as demonstrating compatibility with established projects would enhance credibility.