A quick note on Meltdown and Spectre

Recently we've had customers enquire about our operational security and how we safeguard customer data. Sharing our GDPR compliance with you earlier this month answered some of those questions about how we store and process data while respecting data privacy laws.

But another question raised has been whether and how we are affected by the Meltdown and Spectre vulnerabilities. As a website that accepts sensitive customer information including login credentials, payment information and IP address data, there is potential for user information to be read from our various servers' operating memory if we used shared computing resources that haven't been patched against Meltdown and Spectre.

Whilst the computer processors we use are affected by these bugs (as are all modern processors to some degree), our infrastructure as a whole is not affected because we do not use shared computing resources and have never done so for our cluster architecture. We have only ever used shared resources for honeypots, which have never contained and will never contain customer information or any other potentially sensitive information.

As part of our GDPR compliance we are bound to continuously evaluate potential threats to our infrastructure. So as soon as the Meltdown and Spectre news broke, we read all of the information available, starting with the Linux kernel patches and comments from AMD right through to the disclosures from the researchers who discovered the processor flaws, Google's announcement and Intel's press release.

We came to the conclusion almost immediately that our infrastructure was not in danger due to our use of bare-metal servers that we either purchased, built and deployed ourselves or rent from major data centers. Due to the way the attacks work, an attacker would need to be running malicious code on your server or be sharing a physical server with you.

So to sum up: we were aware of Meltdown and Spectre on the very first day the news broke, we carried out a risk assessment immediately and we determined we were not affected thanks to decisions made while building our infrastructure.

Thanks for reading and have a great weekend.


General Data Protection Regulation Compliance

On the 25th of May 2018 a new piece of positive regulation called the General Data Protection Regulation (GDPR) is coming into force within the European Union to help protect EU citizens' sensitive and non-sensitive personal information online.

Under this regulation proxycheck.io will be classed as a data processor and our customers who send data to us to be processed will be classed as data controllers. As we have customers ourselves, we will also be classed as a data controller with regard to the data we store about our own customers.

Because there are now specifically defined ways in which we can receive, handle, process and pass on the data given to us by our customers, we felt it prudent to create a document explaining how we are GDPR compliant. We believe that we meet all the requirements set forth by the GDPR and have been meeting them for over a year.

We have included links to our GDPR compliance document at the bottom of every page of our website and also next to our registration button on both our homepage and pricing page, so that everyone can find the document easily.

I'd like to reiterate that we're proud to be compliant. We think the GDPR is a great piece of regulation that we hope will strengthen consumer privacy and stop what we see as rampant over-collection and misuse of personal data.

You can view our GDPR compliance document here.

Thank you.


Saying goodbye to Skype

When we began offering customer support 16 months ago we started with Skype, iMessage and email. Later on we added live chat support and since then we have seen an explosion in live web chats. In fact, most of our real-time interactions with customers now happen through the live chat on our website.

Skype was once a very well utilised platform, but the numbers are telling us it's no longer worth our time to support it for our business and that we should instead focus on our live chat support, email and iMessage, all of which can be easily accessed by our customers without any hassle.

In fact, you receive the same level of service through all three support channels, meaning we're able to help you with payments, account-related issues and service questions however you choose to contact us. That's a marked difference from many other sites, where live chat support is manned by sales staff who can't assist with account-related matters.

Given how few customers took advantage of our Skype support, we know this won't affect many of you. We're continually evaluating the best way to offer support to our customers and we're confident the options we currently provide will satisfy everyone.

Thanks for reading and have a great weekend.


ASN flag issue on our HELIOS node has been resolved

Earlier today we became aware of an issue with ASN data lookups on our HELIOS node affecting both the v1 and v2 API endpoints. We have since corrected the problem and ASN data from HELIOS is now working correctly.

Neither our ATLAS nor PROMETHEUS nodes were affected. Thank you.


v2 API Improvements & Web Interface Refresh

Since we debuted the v2 API on January 1st we've been collecting a lot of data about how our customers are using it. One of the obvious use cases has been multi-checking: we've seen a huge volume of customers making use of our multi-checking functionality, especially our largest customers, who have lots of prior data they wish to check or re-check.

To help facilitate this use case we have increased the number of IP Addresses that can be checked in a single query from 100 to 1,000. We've also added a new timing feature: if your query reaches 90 seconds it will stop and show you which IP Addresses have and have not been processed up to that point, ensuring the query doesn't time out.

So if you provision your software to handle the time-exhausted response Not processed. Reached 90 second processing time limit. you can re-send in a new query only the addresses that weren't checked in your prior query. Depending on the flags you're using it is possible to check close to 1,000 addresses within the 90 second query window, so we know this will be a useful addition to the API.
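
To illustrate, here's a rough PHP sketch (not taken from our official function) of one way to handle that response and re-send whatever wasn't processed. It assumes the POST field is named ips and that an unprocessed address comes back as a key whose value contains the message above; please check the API documentation for the exact field names.

  <?php
  // Rough sketch of handling the 90 second time limit.
  // Assumptions: the POST field is named "ips" and an unprocessed address is
  // returned as a key whose value contains the "Not processed" message above.
  function check_addresses(array $addresses, $api_key) {
      $curl = curl_init('https://proxycheck.io/v2/?key=' . $api_key);
      curl_setopt($curl, CURLOPT_POST, true);
      curl_setopt($curl, CURLOPT_POSTFIELDS, http_build_query(array('ips' => implode(',', $addresses))));
      curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
      $decoded = json_decode(curl_exec($curl), true);
      curl_close($curl);

      $unprocessed = array();
      foreach ($addresses as $address) {
          $result = isset($decoded[$address]) ? $decoded[$address] : null;
          // Anything that hit the time limit gets queued for a follow-up query.
          if (is_string($result) && stripos($result, 'Not processed') === 0) {
              $unprocessed[] = $address;
          }
      }
      return array('results' => $decoded, 'unprocessed' => $unprocessed);
  }

  // Usage: $all_addresses is your list of up to 1,000 IP Addresses to check.
  $first = check_addresses($all_addresses, 'YOUR_API_KEY');
  if (!empty($first['unprocessed'])) {
      $second = check_addresses($first['unprocessed'], 'YOUR_API_KEY');
  }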

In addition to these changes we've also remade the results interface on our Web Interface page (which also now goes to 1,000 checks per query or 90 seconds of query time, whichever comes first). We've made these changes so that it's easier to view large data sets; the previous UI was great up to a certain number of results, but once you're dealing with hundreds of addresses it could become cumbersome to navigate. Below is an example of the new interface with collapsible results.

Under each section, if you're receiving 90 results or fewer it will automatically be expanded, but if there are more than 90 results it will be collapsed, enabling you to get an overview of all the kinds of results you've received before you choose to drill down into a specific result type.

We hope you enjoy these changes; they are all live as of this blog post.


Introducing our new v2 API

Over the past couple of months we've been working hard on a new version of our Proxy Checking API, and today we're proud to launch it officially and welcome you to try it out.

You may remember that last month we shared with you a new API endpoint called v1b. At that time we weren't quite sure whether we should launch the new API under the /v1/ endpoint, because making the new API compatible with our old result format while still supporting the new features we've included created a lot of code bloat.

New features such as:

  • Properly formatted URL strings so you can supply [ipaddress]?key= instead of [ipaddress]&key=.
  • The ability to send your IP Address to be checked using the POST method instead of just GET.
  • The ability to check up-to 100 IP Addresses in a single query using GET or POST methods.
  • The ability to disable real-time inference checking with a new inf flag.

v2 has all of these, and to support the multi-checking feature we've had to alter our result format so that the proxy declarations are nested under the IP Addresses. We did (with v1b) create a compatibility layer so that if you were checking a single IP Address it presented the old format, but this created code bloat and frankly we didn't feel it was a good trade-off.
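
To make the nested format more concrete, here's a small illustrative sketch. The sample JSON is simplified and hand-written for this example (real responses contain additional fields), but it shows how each result sits under the IP Address it belongs to.

  <?php
  // Illustrative only: a simplified multi-check response where each result is
  // nested under the IP Address it belongs to (real responses contain more fields).
  $json = '{"status":"ok","1.2.3.4":{"proxy":"yes","type":"VPN"},"5.6.7.8":{"proxy":"no"}}';
  $Decoded_JSON = json_decode($json);

  foreach ($Decoded_JSON as $key => $value) {
      if ($key === 'status') { continue; } // skip the top-level status field
      if ($value->proxy == "yes") {
          echo $key . " is a proxy (" . $value->type . ")\n";
      } else {
          echo $key . " looks clean\n";
      }
  }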

So instead we're going to maintain our /v1/ endpoint for a long while (probably until 2020-2022 depending on usage). And in its place we will be presenting /v2/ as our main API endpoint from now on.

If you're worried about how this change will affect your code, don't be. The API is still very easy to query and in fact in our own PHP Function (available on GitHub) we were able to upgrade with just two changes: the URL we were querying and the JSON conditional statement.

Essentially this: if ( $Decoded_JSON->proxy == "yes" && $Decoded_JSON->ip == $Visitor_IP ) {

Became this: if ( $Decoded_JSON->$Visitor_IP->proxy == "yes" ) {

It's just that simple. Now again, you do not need to rush around and change your API over to v2; we will be supporting the v1 endpoint for many years yet. But if you want to get on the latest and greatest you're more than welcome to do so.
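
For context, here's a simplified sketch of what a single-IP check against /v2/ can look like with that updated conditional. It's illustrative rather than a copy of the GitHub function; the API key and the blocking behaviour are placeholders you'd replace with your own.

  <?php
  // Simplified sketch of a single-IP check against the /v2/ endpoint.
  // 'YOUR_API_KEY' is a placeholder; the blocking logic is up to your application.
  $Visitor_IP = $_SERVER['REMOTE_ADDR'];
  $Response = file_get_contents('https://proxycheck.io/v2/' . $Visitor_IP . '?key=YOUR_API_KEY');
  $Decoded_JSON = json_decode($Response);

  // In v2 the result is nested under the IP Address that was checked.
  if (isset($Decoded_JSON->$Visitor_IP) && $Decoded_JSON->$Visitor_IP->proxy == "yes") {
      // Treat this visitor as a proxy user.
      exit('Proxies are not allowed here.');
  }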

You'll find that we've updated our API Documentation page to include the new /v2/ endpoint and we've also spruced up the page's appearance. We hope you enjoy the new look.

We've got a lot of things coming in 2018 and this is the foundation on which we'll be building them. Thank you everyone for reading and we wish you a Happy New Year!


WordPress Plugin Now Available!

In our yearly retrospective we teased that a WordPress plugin was being made and would be available in early January. Well, it was approved by the WordPress repository maintainers much faster than anticipated, so the plugin is available here on the WordPress site right now.

With the release of this new plugin we've also added a dedicated plugins page to the website where we'll maintain a listing of all the plugins that support the proxycheck.io API. As with our Code Examples page, if you've made a plugin let us know and we'll feature it, just like we've done with this WordPress plugin made by Ricksterm.

We hope you all enjoy the new plugin and have a lovely new year.


A Giant Year

This year has been a giant one for us at proxycheck: we added lots of new features and dramatically overhauled our service, website and API. Below we'd like to share with you all the changes that happened this year.

Outreach

  • We started this company blog that you're currently reading
  • We started a GitHub account featuring PHP client code
  • We started a Twitter account and have been tweeting new features
  • We created a new contact us page with live chat support

API

  • We greatly decreased query latency through code refactoring
  • We added query tagging support
  • We built an Inference Engine to discover new proxies and to curate our existing data
  • We created 20 honeypots positioned around the world to capture malicious activity to further feed our Inference Engine
  • We vastly improved our VPN detection
  • We added support for VPN detection in IPv6 address ranges
  • We vastly improved our ASN flag support which now also supports IPv6 alongside IPv4

Website

  • Website gained a new look with drop-shadows, subtle animations and vibrant colours
  • A top navigation bar was added and tab bars were placed on some pages
  • We added a Pricing page and overhauled the Web Interface page
  • We completely remade the Service Status page
  • We made the website more mobile friendly with media queries
  • We remade the API Documentation page which now features dashboard API examples under a new tab
  • We significantly improved our Code Examples page with examples added for Python, Node.js, C# and Ruby
  • We added Stats, Whitelist and Blacklist features to the Customer Dashboard

Payments

  • We switched from one time yearly payments to monthly and yearly subscriptions (which you can cancel at any time)
  • We expanded our plan sizes into lower and higher priced plans (plans from just $1.99 a month all the way to $99 a month).

Infrastructure

  • We added a third server node to the cluster called ATLAS
  • We altered our international routing to enable lower latency access to our server nodes worldwide
  • We significantly upgraded our PROMETHEUS node going from 6 to 16 CPU cores

Email

  • We greatly improved the appearance of our emails and now bundle our CSS in the email itself for reliability
  • We standardised all our emails look and feel by creating a standard callable email function used by all our code
  • We send you more emails for things like email/password changes, query overages, payment failures etc.

With all these changes the service has become really fleshed out, but we're not done. We still have features planned for next year, including our new API that allows up to 100 IP Addresses to be checked in a single query.

We're also working on a batch processing and webhook system which we think will be very beneficial to some of our largest customers; this should be available some time early next year, after the multi-check API is brought online.

The final thing we wanted to discuss is our free tier and how we've been distributing larger free plans to communities that need them most. Protecting websites, forums, game servers, payment gateways and more is something we're very proud to do, and our service was actually started because our founder needed this very service to protect their own online properties, which included chat rooms, forums, websites and game servers.

That's why, from the very start, we've offered a generous 1,000 free queries to anyone who signs up, and we have no intention of restricting that offering or disabling our premium features. We're aware that there are many competitors in this space that differentiate between free and paid customers by restricting features such as limiting Whitelists/Blacklists, not offering statistics or easy-to-access online support. We're different, and proudly so.

We've also gone out of our way to give larger free plans ranging from 10,000 to 80,000 daily queries to many people who run free online games, chat communities, forums and support groups. We're proud to support forums that help teenagers and young adults struggling with thoughts of self-harm, and also open-source developers who release software we all benefit from.

This year has been really great for us. We've seen huge volumes of new customers and also many developers across the web integrating our API into their products and services. In fact, we have a WordPress plugin being developed on our behalf which should be available in January.

We'd like to thank everyone who has taken a chance on our service, and we're really looking forward to bringing you new features and improvements next year. With that last sign-off we also want to wish everyone a Merry Christmas and a Happy New Year.


New Dashboard Graph

The stats tab in the dashboard is the most heavily trafficked part of the site for registered users, and that's because it's where you gather insight into the positive detections being made by our API on your properties. It's also where you can monitor your query allowance to make sure you're not going over your plan's daily allotment of queries.

Today we're improving the stats tab with a new graph so that you can quickly see what your month has been like without needing to page through each day's queries. We're still giving you those full granular bar charts, and the JSON API is still there for you to export and graph your queries however you wish, but we've also built in a really nice graph as shown below.

Our new graph is fully interactive, so you can hover your mouse over the data to see detailed number breakdowns and toggle parts of the graphed data by clicking on the keys along the top. This is especially useful if only a small number of your queries are positive detections, which is common.

We know you'll find the new graph useful, as we're often asked by current customers what size plan they should purchase based on their current usage. The new graph will make those decisions easier, especially as it shows your highest query days as large peaks, and those are the days you want to plan ahead for.

This is likely the last major update we'll be releasing before the new year, as we're winding things down for the Christmas break and new year celebrations. Thanks for reading and we hope everyone has a wonderful holiday and a happy new year.


Multi-check API update

Since we launched our multi-check API yesterday we've been hard at work improving performance and squashing bugs. Today we'd like to share with you some progress.

Firstly, there were some bugs with IPv6 VPN detection with regard to Google address spaces. This has been corrected in both the /v1b/ endpoint and in the back-ported code running on our main /v1/ endpoint.

The second IPv6 bug was also with VPN detection: if you had not set the VPN flag to on but checked an IPv6 address, it would still be checked against our VPN data and a positive VPN result presented. This has also been corrected today in our /v1b/ and /v1/ endpoints.

The third bug we dealt with today was with dashboard statistics. Under certain circumstances you may have had a discrepancy in your total API queries reported at the top of your dashboard compared with the graphed breakdown of your query statistics. This was caused by underreporting on some negative detection scenarios. This bug only affected our /v1b/ endpoint.

Apart from fixing these bugs we've also corrected some very specific edge-case bugs. For example, when performing a single IP check where the address fell within a whitelisted IP range, you may have received a response without the IP Address being repeated back to you in the JSON response. This behaviour has been corrected.

On the new functionality side, we've improved how we handle invalid IP Addresses. Previously you would simply receive a vague message indicating that one or more addresses were invalid, but it didn't list the actual addresses you supplied. That's not an issue when you're checking a single address, but with multi-checking you need to know which of the addresses you sent were invalid. To that end we now display that information back to you.

We're also planning to improve the 100-check-per-query limiter code. At present it simply stops processing your addresses once it reaches 100 and does not output a list of the unchecked addresses. We'll be changing this behaviour soon to indicate which addresses went unprocessed due to hitting the limit. We'll also be changing the statistics behaviour to account for this; at present, if you send us 500 IPs you'll have 500 queries registered to your API Key even though we only processed the first 100.
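
In the meantime, a simple client-side workaround is to split your addresses into batches of 100 before sending them. Below is a rough sketch of that idea; the ips POST field name and the check_batch() helper are illustrative assumptions rather than part of our official code.

  <?php
  // Minimal client-side batching sketch: split addresses into groups of 100 so
  // no query exceeds the current per-query limit. The "ips" POST field name is
  // an assumption - check the API documentation for the exact parameter.
  function check_batch(array $batch, $api_key) {
      $curl = curl_init('https://proxycheck.io/v1b/?key=' . $api_key);
      curl_setopt($curl, CURLOPT_POST, true);
      curl_setopt($curl, CURLOPT_POSTFIELDS, http_build_query(array('ips' => implode(',', $batch))));
      curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
      $response = curl_exec($curl);
      curl_close($curl);
      return json_decode($response, true);
  }

  // $all_addresses is your full list of IP Addresses to check.
  $results = array();
  foreach (array_chunk($all_addresses, 100) as $batch) {
      // Each batch stays within the 100 address limit, so nothing is silently dropped.
      $results = array_merge($results, (array) check_batch($batch, 'YOUR_API_KEY'));
  }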

So that's a quick update on where we're at. The new API is coming along steadily: we've already improved performance since yesterday and we're squashing all the bugs we find as quickly as possible. We're on track for an early January 2018 rollout to our main API endpoint address.

Thanks for reading and have a great day.

