Updated v2 API with faster VPN and ASN lookups now live!

At the end of April we shared some performance numbers for the new update to our v2 API, which enhances VPN and ASN lookup speed. Today we're pleased to announce that the update is now live on our v2 endpoint.

This update has been a large undertaking, as we focused not only on speed but also on improving accuracy. For over a year we have been painstakingly adding VPN providers to our dataset, but frankly there are thousands upon thousands of datacenters all over the world that can, at a moment's notice, offer service to any of the thousands of VPN providers operating globally.

So we set upon a new strategy. Firstly, the way we were blocking VPNs previously (blocking ASNs that serve specific datacenters) was a good strategy, but it had some flaws: we couldn't make exceptions for companies that use those same ASN blocks for residential or business internet access. It also meant we often gave out the incorrect provider name for a VPN service when we blocked its ASN range.

With our new VPN code launched today, both of those issues have been solved. We can now block ASNs while making exceptions for specific IP ranges or providers, and we always give you the most accurate provider name for a specific IP even if it shares an ASN range with another company.
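To illustrate the idea (this is a minimal sketch, not our actual implementation, and all of the ASNs, ranges and provider names below are hypothetical), blocking by ASN with per-range exceptions and per-range provider names can look something like this:

```python
import ipaddress

# Hypothetical data for illustration only: ASNs treated as VPN/hosting,
# IP ranges inside those ASNs that are exempt (e.g. residential or business
# allocations), and per-range provider name overrides.
BLOCKED_ASNS = {64500, 64501}
EXCEPTION_RANGES = [ipaddress.ip_network("203.0.113.0/26")]
PROVIDER_OVERRIDES = {
    ipaddress.ip_network("198.51.100.0/24"): "Example VPN Co",
}

def check_ip(ip_str: str, asn: int, asn_owner: str):
    """Return (is_vpn, provider_name) under the ASN-with-exceptions model."""
    ip = ipaddress.ip_address(ip_str)

    # Resolve the provider name per IP range first, falling back to the ASN
    # owner, so two companies sharing an ASN block aren't mislabelled.
    provider = next(
        (name for net, name in PROVIDER_OVERRIDES.items() if ip in net),
        asn_owner,
    )

    # An IP inside an exception range is allowed even though its ASN is blocked.
    if any(ip in net for net in EXCEPTION_RANGES):
        return False, provider

    return asn in BLOCKED_ASNS, provider

print(check_ip("198.51.100.10", 64500, "Example Hosting Inc"))  # (True, 'Example VPN Co')
print(check_ip("203.0.113.5", 64500, "Example Hosting Inc"))    # (False, 'Example Hosting Inc')
```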

Another change we've made is that we're now using a new Machine Learning system for VPN detection. This is a real-time inference engine which makes determinations for all queries that have the &vpn=1 flag. In testing, this new engine has already improved our VPN detection rate by 8% when combined with our previous VPN detection methods.
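Using the engine requires nothing more than adding the &vpn=1 flag to your queries. Here's a minimal sketch of a single-address lookup in Python; the base URL, key parameter and response shape are placeholders, so substitute your real v2 endpoint and API key:

```python
import requests  # third-party: pip install requests

# Placeholder endpoint for illustration; use your actual v2 endpoint here.
API_BASE = "https://api.example.com/v2"

def lookup(ip: str, api_key: str) -> dict:
    """Query a single IP with VPN and ASN checks enabled on the v2 endpoint."""
    resp = requests.get(
        f"{API_BASE}/{ip}",
        params={"key": api_key, "vpn": 1, "asn": 1},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

# result = lookup("8.8.8.8", "YOUR_API_KEY")
# print(result)
```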

The last thing we wanted to discuss is our Real-Time Inference Engine for proxy detection. With this update to v2, where we've introduced the new VPN Inference Engine, we have made quite a performance breakthrough. By using enhanced math functions in the processors of our nodes, combined with pre-computing computationally heavy operations and storing their results, we have been able to greatly reduce inference time from an average of 250ms to just 1.5ms. This is why we have not added a way to disable the Inference Engine when performing VPN checks: it's simply so fast there was no need.
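The principle behind pre-computation is straightforward: anything expensive that doesn't depend on the incoming query is computed once ahead of time and stored, so the hot path is left with cheap lookups and a little arithmetic. This is a rough, hypothetical sketch of that pattern in Python, not our actual engine:

```python
import math

# Hypothetical model weights; in a real system these come from training.
WEIGHTS = [0.42, -1.3, 0.07, 2.1]

# Precompute the expensive part once at startup: a lookup table for a costly
# function over its practical input range, so inference never calls exp().
SIGMOID_TABLE = [1.0 / (1.0 + math.exp(-x / 100.0)) for x in range(-1000, 1001)]

def fast_sigmoid(x: float) -> float:
    """Approximate sigmoid via the precomputed table instead of calling exp()."""
    idx = max(-1000, min(1000, int(round(x * 100.0))))
    return SIGMOID_TABLE[idx + 1000]

def infer(features: list[float]) -> float:
    """Hot path: a dot product and a table lookup, both very cheap."""
    score = sum(w * f for w, f in zip(WEIGHTS, features))
    return fast_sigmoid(score)

print(infer([1.0, 0.2, 3.5, 0.0]))
```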

And that brings me to the benchmarks. In our testing with VPN, ASN and Inference checks enabled, supplying the API with 1,000 IPs in a single query would previously use up the entire 90-second query window and check only 300 of the 1,000 IPs.

With the new code we're able to supply 10,000 IP addresses with the same flags enabled and receive results for all 10,000 addresses within 10.5 seconds. This is a vast improvement, and it means you no longer need to forgo VPN, ASN or Inference checks to get the fastest results possible. For queries checking a single address we're seeing a consistent query time of under 6ms (after network overhead).
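For completeness, here's a hedged sketch of a bulk lookup against the same placeholder endpoint as above. The assumption that multiple addresses are POSTed as a comma-separated "ips" field is ours for illustration, so check the API documentation for the exact parameter name:

```python
import requests  # third-party: pip install requests

API_BASE = "https://api.example.com/v2"  # placeholder; use your actual v2 endpoint

def bulk_lookup(ips: list[str], api_key: str) -> dict:
    """Check many addresses in one request with VPN and ASN checks enabled.

    Assumes addresses are sent as a comma-separated 'ips' form field; adjust
    to match the real API documentation.
    """
    resp = requests.post(
        API_BASE,
        params={"key": api_key, "vpn": 1, "asn": 1},
        data={"ips": ",".join(ips)},
        timeout=90,
    )
    resp.raise_for_status()
    return resp.json()

# results = bulk_lookup(["8.8.8.8", "1.1.1.1"], "YOUR_API_KEY")
```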

If you're not already using our v2 API we highly recommend the upgrade: not only is VPN detection more accurate, but the speed enhancements are unreal. We have ported some of this functionality back to v1 to maintain compatibility, but we cannot guarantee it will be as fast. As always, all of these new features are available immediately to all customers, whether you're on a paid or free plan.

Thanks for reading, we hope everyone has a great weekend.

