New Status Code System

Today we've introduced a new status code and message system to the v2 API. This was prompted by a user request, and we felt it was a very useful feature for the API to have; standardising our errors and warnings makes the API easier to code against.

We have updated our API Documentation with the new information, and below is a screenshot of that new section.

[Screenshot: the new status code section of the API Documentation]

We hope you all like the change. We were careful not to break compatibility with any v2-supporting clients while implementing the new status system.
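
To illustrate how a client might use the new statuses, here's a minimal sketch in Python. The endpoint shape is our standard v2 single-IP query, but the specific status values and fields shown are illustrative, so please consult the API Documentation for the authoritative list.

    import requests

    # Query the v2 API for a single address. The "status" and "message"
    # fields below are illustrative; see the API Documentation for the
    # real code values.
    data = requests.get("https://proxycheck.io/v2/8.8.8.8").json()

    status = data.get("status")
    if status == "ok":
        print("Query succeeded:", data)
    elif status == "warning":
        # The query worked but something deserves attention.
        print("Warning:", data.get("message"))
    else:
        # Denied/error style statuses mean the query didn't complete.
        print("Query failed:", data.get("message"))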

Thanks!


WordPress plugin update!

Today Ricksterm, an independent developer who makes the WordPress Proxy & VPN Blocker plugin, has released a major update which adds the ability to perform checks across your entire WordPress website. Previously the plugin supported checking and blocking only on the signup, login and comment posting pages, but now you can choose to enable it site-wide!

[Screenshot: the plugin's new site-wide protection option]

This has been a much-requested feature, and we thank him for his continued support of the plugin with frequent, substantial updates. Alongside this new feature the plugin has also gained an improved stats view with support for showing countries and a new slider which lets you adjust the detection sensitivity.

[Screenshot: the improved stats view and detection sensitivity slider]


If the service is free then you are the product

The title of this post is a common phrase you'll read online in forum posts within privacy-minded communities. In general it's true, and has been true for as long as products have been offered to consumers for "free".

Since we started we've had customers enquire about what we're doing with the data they send us. Specifically, when they send us a customer's IP address, do we correlate it with their web property and then sell that information?

For example, if you operated a store that sold guitars and used our service for your registration or checkout system, are we recording the IP addresses you send us for proxy checking and then handing that data off to a marketing company so they can target your visitors with ads for guitar-related products?

With the recent Cambridge Analytica disclosures involving Facebook we've been asked this question much more frequently than before, and we thought it would be a good idea to write a blog post about our stance.

So the question is: do we sell your information? The answer is no, we do not. In fact we do not make available any of the data our customers entrust to us. The only third parties we ever allow to handle your data in any way are Stripe, our card payment processor, and Mailgun; both of these companies receive only the bare minimum of your personal information needed to perform the duties we've entrusted to them.

For Stripe that means your bank card information to perform transactions, and for Mailgun that means your email address. Beyond that they don't receive anything else, and neither does anyone else. We simply do not make customer information available in any form, even as aggregate data, to any third party, period.

Now of course the question is: if our free customers aren't our product, how are we staying profitable? Well, our business model is built around converting free customers into paid customers. We give unregistered users 100 queries per day and registered users 1,000 queries per day, both for free.

Then, as those customers' needs grow, meaning they're regularly making over 1,000 queries per day, we attempt to convert them into paying customers. We do this in a few ways: firstly, the stats on the dashboard help users determine their own query volume needs, and secondly, when you go over your query allotment for five days in a row we send you a single email to let you know.

Essentially a single $29.99 subscription, which is our most popular paid plan right now, can subsidise the usage of several hundred free users. That's part of what enables us to offer a very competitive free plan with feature parity to our paid plans.

The other part is that we designed proxycheck.io from day one to scale across multiple servers. Not just the API but every facet of our service, including our website, the customer dashboard and the web interface. As the queries hitting our API have grown, the cluster has let us meet that increasing demand efficiently with very little waste.

With some of our competitors' infrastructure we've seen them place free customers on one server and paid customers on another. We've also seen competitors set up single dedicated servers just for single paid customers. While that sounds very premium on the surface, the reality is that it increases the chance of failure, and it's very inefficient business-wise, as you end up with under-utilised resources that you have to pay for regardless of each server's actual usage. It also makes their premium plans exceedingly, and in some cases outrageously, pricey. Essentially you pay more, but you get less.

Our custom cluster architecture has allowed us to maximise our resource use, so all of our customers benefit equally from the increased performance and redundancy that adding more servers to the cluster brings, while keeping our costs low because we don't have to keep paying for under-utilised servers. All of that means we can offer our generous free plans while respecting every customer's privacy.

When companies sell their customers' data while also having paid plans, we call that double-dipping. Frankly, we think the privacy situation globally right now is in a very poor state and we don't want to be part of the problem. We have welcomed the GDPR (General Data Protection Regulation), because for too long internet companies have been operating like it's the wild west when it comes to user privacy and user data ownership rights.

We hope this blog post has made our stance clear: we have no plans to make customer data available to third parties. Frankly, we don't want to know who your website visitors are or what your website does. All we're interested in is making the best proxy and VPN detection API at the lowest possible cost, and we can certainly do that without invading anyone's privacy.

Thanks for reading and have a great week!


v2 API adoption rates

On January 1st 2018 we introduced our v2 API, the first new version of our API since we started in April 2016. Sure, we'd re-coded v1 quite significantly several times since launch, but the v2 rewrite changed everything. It really was a from-the-ground-up re-implementation of our API, with almost no shared code between v1 and v2, driven by the main feature we wanted to implement: the ability to check multiple IPs with a single query, a feature we call multi-checking.
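
Here's a minimal sketch of what multi-checking looks like from a client's perspective, written in Python. The exact field names are documented on our API Documentation page; this example assumes a POST with a comma-separated ips field.

    import requests

    # Check several addresses in one round trip instead of one query each.
    # The "ips" POST field and per-IP response keys assumed here may differ
    # from the authoritative format in the API Documentation.
    addresses = ["8.8.8.8", "1.1.1.1", "203.0.113.7"]
    data = requests.post(
        "https://proxycheck.io/v2/",
        data={"ips": ",".join(addresses)},
    ).json()

    for ip in addresses:
        print(ip, "->", data.get(ip, {}).get("proxy", "unknown"))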

I'm pleased to say that since the launch we have seen a high proportion of registered users utilising the new API. In fact 46.37% of all registered users are now making use of the v2 API, and as of right now 79.54% of all queries made today by registered users went to the v2 API endpoint. Our largest customers have been the fastest movers in this regard, jumping onto the v2 endpoint very quickly.

For a new product that's only just over three months old that's incredible adoption, and it means we're getting our message out to our customers and they're trusting us to make the right decisions with a product they rely on every day.

When it comes to unregistered users the adoption rate is quite a bit lower: only 11.56% of unregistered users that performed any query to the API today did so via our v2 endpoint. We expected this, because many of these users are using third-party software solutions that still use the older v1 API, and they haven't configured the software with an API key yet.

We're still two years away from ending support for v1, so these numbers are incredibly encouraging to us. The third-party software that implemented our v1 API will likely be updated or replaced before then, but we're leaning towards creating a simple redirect endpoint at v1 which will forward queries to the v2 endpoint and translate the responses back into the v1 format. This won't take much effort from us and guarantees no one gets left behind; we know people sometimes implement an API and then forget about it, and we don't want to leave anyone unprotected.
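
To give a sense of how simple such a shim can be, here's a rough sketch using Python and Flask. The route and the field mapping are hypothetical, since the real translator would follow whatever differences actually exist between the v1 and v2 response formats.

    from flask import Flask, jsonify, request
    import requests

    app = Flask(__name__)

    @app.route("/v1/<ip>")
    def v1_shim(ip):
        # Forward the v1-style query to the v2 endpoint unchanged.
        v2 = requests.get(
            f"https://proxycheck.io/v2/{ip}", params=request.args
        ).json()
        # Translate the v2 response back into a v1-shaped payload.
        # This mapping is purely illustrative.
        result = v2.get(ip, {})
        return jsonify({"proxy": result.get("proxy", "no")})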

So that's the update we wanted to share. The high adoption rate among our registered customers has been a great win for us, and the v2-specific features we launched in March have only accelerated it. We don't poll users for their satisfaction with our service, but many like to write in and tell us anyway, and I'm pleased to say the satisfaction rate is incredibly high; we've been able to provide a stable service whilst responding to new feature requests from our customers.

We love to hear from all of you, so if you have a new feature idea, found a bug or just want to tell us what you think of the service, please get in touch and let us know!


Introducing two-factor authentication

Since we added the dashboard we've had various requests from users to add two-factor authentication, where you use an application local to you (such as on your computer or phone) to generate a one-time password which is used in conjunction with your normal password to authenticate your logins.

The added security is obvious: even if an attacker compromises both your API key and password, they would still need physical access to the device you run your authenticator on. Most people choose a mobile-phone-based authenticator for this reason.
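
For the curious, here's roughly what happens under the hood, sketched with the third-party pyotp library; our server-side implementation may differ, but any TOTP authenticator follows the same scheme.

    import pyotp

    # A per-account secret is generated at enrolment and shared with your
    # authenticator app, typically by scanning a QR code.
    secret = pyotp.random_base32()
    totp = pyotp.TOTP(secret)

    # The authenticator derives a fresh six-digit code from the secret and
    # the current time, changing every 30 seconds.
    code = totp.now()

    # At login the server performs the same derivation and compares.
    print("Code accepted:", totp.verify(code))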

So today we've enabled two-factor authentication for all accounts; whether you're on our free or paid tiers, you can benefit from the extra security two-factor provides. Below is a screenshot showing the new user interface for the feature within your dashboard.

[Screenshot: the two-factor authentication interface within the dashboard]

We hope you like the appearance of the feature; we wanted to make sure it looks clean while being easy to understand and use. As the image shows, we're not limiting the feature to a specific authenticator. You're free to use any TOTP-compatible authenticator, which includes Google's, Authy, 1Password and more.

Thanks for reading and we hope everyone is having a great week!


New dashboard stats

Over the past couple of days we've been updating the stats tab on the Dashboard. Yesterday, by customer request, we added a country display to the recent detections section, as shown below. We also added support for country information to the JSON output feature.

[Screenshot: the country display in the recent detections section]

Today we've added a very neat-looking interactive geographic heat map which shows the countries your positive detections originate from. We're using MaxMind's GeoIP data for this feature, meaning we download their IP-to-country database and perform the IP address lookups on our own server.

This is different to where we obtain the country/ASN information for our main API; we're using MaxMind here for speed, so the graph can load very quickly. Below is an image of the map you'll find within your dashboard as of this post.

[Screenshot: the interactive geographic heat map]
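
If you're curious what a local MaxMind lookup looks like, here's a minimal sketch using their geoip2 Python library against a downloaded country database; our production code differs, but the idea is the same.

    import geoip2.database

    # Open the locally downloaded IP-to-country database. No network request
    # is made per lookup, which is what keeps the heat map fast.
    with geoip2.database.Reader("GeoLite2-Country.mmdb") as reader:
        record = reader.country("8.8.8.8")
        print(record.country.iso_code, record.country.name)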

We hope you enjoy the new additions to the dashboard, these changes are the result of direct customer feedback.


Customer dashboard and network improvements

Recently we've made some important changes we wanted to tell you about.

Network Improvement

The first is improving our network performance through better peering. As some of you may be aware if you've looked up our DNS records, we use Cloudflare as our internet edge network partner.

This means that before your requests reach our servers they first go through Cloudflare's infrastructure, which is made up of 150 or so data centres around the globe. Their goal is to have a data centre in every major city by the end of this year, which would cover 95% of the world's population.

As part of their service Cloudflare offers a paid add-on called Argo, which essentially funnels traffic between Cloudflare servers within a virtual private network, over the lowest-latency and highest-bandwidth internet links available to Cloudflare.

When companies like us have Argo enabled, your data first connects to the Cloudflare server closest to you, hopefully within the same city as you. From there it travels only along Cloudflare's servers around the world until it reaches one of our server nodes.

The end result is significantly lower latency for some connections. Argo is smart enough to determine whether a connection will be faster through a native route (You -> Cloudflare -> Plain Internet -> Us) or through Cloudflare's Argo network (You -> Cloudflare -> Cloudflare -> Us).

So what are the benefits like? Well, as you may be aware, all three of our nodes are currently in Europe. We're planning to add nodes in North America and Asia, but until we do we're leaning on Argo to get faster peering for our customers outside Europe. With Argo enabled we've been able to reduce our average Time To First Byte (TTFB) from 542ms to 382ms, a 29.52% improvement, which is a considerable difference when you consider we're handling millions of queries per day.

We love Cloudflare; they're a great company that makes a business like ours possible, and we highly recommend them.

Customer Dashboard

The second change we wanted to discuss is something customers have been requesting for a while: the ability to update their card information without cancelling their current subscription. We're sorry it has taken us this long to implement this important feature, but today we have made it available to all customers who hold a currently active subscription. Below are two screenshots showing what subscribed users see.

[Screenshot: the new update card details button on the dashboard]

When clicking the update card details button you'll be presented with a simple modal where you can enter your updated card details, as shown below.

[Screenshot: the card update modal]

You may also notice that we altered the cancellation text to be a bit clearer. It now explains that when you request a manual change to your subscription tier we can prorate that change. So if you downgrade mid-way through your subscription we can refund you the difference, and if you upgrade to a higher tier you'll receive a discount that takes into account how much you've already paid.
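
To make the proration concrete, here's the basic arithmetic with hypothetical numbers; our billing system computes the actual amounts, so treat this purely as a worked example.

    # Downgrading half-way through a $29.99 monthly plan to a hypothetical
    # $9.99 plan. Illustrative arithmetic only, not our billing code.
    old_price, new_price = 29.99, 9.99
    days_in_period, days_used = 30, 15

    unused = (days_in_period - days_used) / days_in_period
    credit = old_price * unused   # unused value left on the old plan
    charge = new_price * unused   # cost of the new plan for the remainder
    print(f"Refunded difference: ${credit - charge:.2f}")  # $10.00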

Thanks for reading!


Dashboard notices about v1 API

In our last blog post we mentioned that we would soon start showing a message in your dashboard if you're still making queries on our v1 API. You may have already seen the message we're displaying, but here is a screenshot of it.

As you can see in the message, we have decided to turn off the v1 API on March 1st 2020. However, if we still find a lot of traffic using it at that time, we will write some kind of basic translator which forwards queries to the v2 API and translates the responses back into the v1 format. If very few queries are still being made to the v1 endpoint, we will remove it entirely on that date.

We think two years should give you all ample opportunity to upgrade, and I'm sure many of you will have reason to visit the dashboard before the cut-off point.

As we get closer to March 1st 2020 we will make the notice more prominent, placing it nearer the top of the dashboard and changing its colours, and of course we will eventually send out emails to people still using the v1 API when we're only 3-6 months away from the cut-off date.

We hope that none of you will feel inconvenienced by the change. We want to make available the best API we possibly can, and setting a firm date allows us to evolve the v2 API without needing to keep making compatibility patches for the v1 API, essentially giving us more time to work on forward-thinking features.

Thanks for reading and have a great weekend.


Updated APIs have launched!

Earlier today we launched the latest versions of our v2 and v1 API endpoints, with new performance improvements and, in the case of the v2 API, new features too. We've updated both our Web Interface page and our API Documentation page to take full advantage of the new features. Below we've included some details of the development process we went through.

In testing, the average lookup time for a single IP with the inference engine turned off and VPN checks disabled has been reduced from 42ms to under 1ms. This is a huge decrease in database transaction latency, due to the new custom in-memory database program we wrote last month which runs on each server.
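
To illustrate why keeping the dataset in memory makes such a difference, here's a trivial benchmark of an in-process hash table lookup; our actual database is a separate custom process, so this only demonstrates the order of magnitude involved.

    import time

    # A toy in-memory table mapping addresses to results.
    table = {f"192.0.2.{i}": {"proxy": "no"} for i in range(256)}

    iterations = 1_000_000
    start = time.perf_counter()
    for _ in range(iterations):
        table.get("192.0.2.1")
    elapsed = time.perf_counter() - start

    # Typically prints tens of nanoseconds per lookup on modern hardware,
    # far below the cost of a disk-backed or networked database query.
    print(f"{elapsed / iterations * 1e9:.0f} ns per lookup")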

We did initially have some teething issues with the new database software in February, which is why we took a very cautious approach to the rollout, spending many weeks gathering data and making changes to ensure the software was stable and consistent in its performance.

When dealing with sub-millisecond data retrieval between programs you start to see performance variance caused by the operating system kernel, and we saw these fluctuations mostly on our weaker node, ATLAS, when our custom process and our web server interacted.

It took some time to find the cause (the kernel pausing new application connections due to high processor load) and to create a solution that would scale from our 32-thread monster node (PROMETHEUS) down to our weakest four-thread node (ATLAS). We solved the problem two weeks ago, but we allowed the new API version to run on our development cluster for a few more weeks so we could gather more data and be sure our changes had eliminated the variance, which happily they had.

So that brings us to today: both the v2 and v1 API endpoints are utilising the new process. Next on our agenda is to improve VPN checks, which are quickly becoming a large share of the lookups we perform due to the popularity of the VPN flag in queries. We've already added sophisticated multi-level caching to ASN lookups, which has led to a dramatic improvement in lookup speed there, but there is more we can do for those kinds of lookups as well.
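
As a rough sketch of what multi-level caching means in practice, here's the pattern in Python; the function names and cached values are hypothetical, and our actual implementation is considerably more involved.

    import functools

    # The slow path: the authoritative ASN lookup, standing in for a query
    # against the full routing dataset.
    def asn_from_database(ip: str) -> str:
        return "AS15169"  # hypothetical result for illustration

    # The fast path: an in-process LRU cache in front of the slow lookup,
    # so repeated queries for the same address skip the database entirely.
    # Further levels (e.g. a cache shared between processes) would sit
    # between these two in a real deployment.
    @functools.lru_cache(maxsize=65536)
    def asn_lookup(ip: str) -> str:
        return asn_from_database(ip)

    print(asn_lookup("8.8.8.8"))  # first call is slow, repeats are cached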

We hope everyone enjoys the new features. As we said at the top of this post, the API Documentation page has been updated with all the new flags; these are only included in the v2 API, and we will soon be showing a message in your dashboard if your most recent query was made to the v1 API. We're still going to offer the v1 API for many years, updating it to maintain functionality, but all new features (such as new flags) will only be added to the v2 API moving forward.

Thanks for reading and we hope everyone has a great weekend!


v2 API update progress


It has been a week since we last updated you on the progress of our new API release, which contains new features and increased speed. So far testing is going very well, and we're currently filling the API's new database with data ready for live queries from our customers.

We expect to deploy the new API some time next week, and we will be back-porting parts of the code to the v1 API endpoint, specifically the way it performs checks on IP addresses and how it accesses our database of address data.

The new server process for handling database lookups has been working very well in testing; performance has been extremely consistent, with low data access times even under simulated loads of millions of incoming queries per second.

So that's where things are right now: we're on track for a release soon. The last thing I wanted to mention is that a few customers have emailed us asking how the new API will affect their use of the service, so I just want to explain that for everyone.

This new API update does not change the response format you receive from our API in any way that should break your software. We're not changing the names of anything or how we present the information. We're simply making the results faster and adding more information to the results if you supply the flags to view such extended information. So once it's live you won't need to do anything except enjoy the increased performance.
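
Here's a quick sketch of what that means in practice, assuming the vpn and asn flags from the API Documentation; if you omit the flags, the response stays exactly as it is today.

    import requests

    # Same endpoint as always; the flags simply enrich the result with
    # extended information. Flag names assume the v2 API Documentation.
    data = requests.get(
        "https://proxycheck.io/v2/8.8.8.8",
        params={"vpn": 1, "asn": 1},
    ).json()
    print(data)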

Thanks for reading and we hope everyone has a great week.

