Code refactoring, bug fixes and performance improvements

Over the past week we've been very busy at proxycheck.io, not designing new user-facing features but improving the existing code base in several ways:

  1. Refactoring the code so it's more compact, easier to read and edit in the future
  2. Fixing some edge-case bugs that have been discovered
  3. Improving performance of the API to lower latency

We're not completely done with these efforts, but we have gotten pretty far. Almost all of the code that runs proxycheck.io has been improved in some way, including the API itself, the dashboard, the web interface and more.

The API already answers queries incredibly quickly, even as we handle several million queries a day. But during our testing we were able to shave two entire seconds off a 500-query lookup through our web interface with our new performance-optimised code, which, extrapolated over the millions of daily queries we handle, results in huge time savings. (If you're curious, it was a reduction from 22ms per query to 18ms per query with proxy and VPN checks enabled.)
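As a rough illustration of what those numbers mean in practice, the sketch below (Python, using the requests library, with a placeholder key and the v2 endpoint) times a batch of lookups the same way: at 22ms per query, 500 lookups take around 11 seconds, and at 18ms per query they take around 9, which is where the two-second saving comes from.

    import time
    import requests

    API_KEY = "YOUR_API_KEY"  # placeholder - paste your own key from the dashboard

    def timed_check(ip):
        """Query the API for one address and return the elapsed time in seconds."""
        start = time.perf_counter()
        requests.get(
            f"https://proxycheck.io/v2/{ip}",
            params={"key": API_KEY, "vpn": 1},  # vpn=1 enables VPN detection alongside proxy checks
            timeout=5,
        )
        return time.perf_counter() - start

    # 500 lookups at 22ms each take ~11s; at 18ms each, ~9s - the two seconds mentioned above.
    ips = ["8.8.8.8"] * 500
    total = sum(timed_check(ip) for ip in ips)
    print(f"{len(ips)} queries took {total:.2f}s ({total / len(ips) * 1000:.1f}ms per query)")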

To assist us in tracking down potential code optimisations we have built a new tool called ocebot (a play on the words ocelot and bot), which makes automated queries to our API all day, every day, and supplies a special flag to the API when it does. This flag triggers the recording of the query at a deep architectural level on the node that handled ocebot's query.

The data about that query is then saved, and over time statistics are built up which we can analyse. That lets us see at a glance which functions in our code are the slowest, what kinds of anomalies are slowing queries down some of the time but not all of the time, and which software optimisations we should be looking into for our overall architecture, meaning the operating system, web server and databases.

Due to the way ocebot works, only the queries ocebot itself makes are recorded, so there is no performance impact on the queries made by our customers. But it will be making queries similar to the ones made by our customers, and some of its queries will even send malformed data so that we can see the performance impact of bad queries.
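In spirit it's not much more complicated than the loop below. This is only an illustrative sketch: the "profiling" parameter is a stand-in for the internal flag ocebot supplies, which isn't public.

    import itertools
    import random
    import time
    import requests

    GOOD_IPS = ["8.8.8.8", "1.1.1.1", "208.67.222.222"]
    BAD_INPUTS = ["999.999.999.999", "not-an-ip"]  # deliberately malformed lookups

    def ocebot_loop(base_url="https://proxycheck.io/v2/", interval=1.0):
        """Probe the API around the clock, mixing well-formed and malformed queries."""
        for i in itertools.count():
            target = random.choice(BAD_INPUTS if i % 10 == 0 else GOOD_IPS)
            try:
                # "profiling": 1 stands in for the internal flag that tells the node
                # to record this query in depth; the real flag name isn't public.
                requests.get(base_url + target, params={"vpn": 1, "profiling": 1}, timeout=5)
            except requests.RequestException:
                pass  # a failed probe is itself useful data
            time.sleep(interval)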

We hope this blog post was interesting. I'm sure we'll have some data to share on the exploits of ocebot in the future.


3rd Party Software and the proxycheck.io API

Since we started the API there have been a few enquiries about our policy on third parties producing software that integrates our API, and specifically whether we allow you to then sell that software.

So we thought it would be a great idea to explain our policy on this and our general thinking. Firstly, yes, we completely allow it: you are free to make any software you want that integrates our API, and we are more than happy for you to sell your software with our API integrated into it. You do not need to contact us first and ask permission. Simply make whatever it is you want to make.

Of course, we would like you to provide some way for your users to input an API key from proxycheck.io into your software. That way they can manage their proxy/VPN checking from the dashboard on our website.
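A key-aware integration can be very small. Here's a minimal sketch in Python, assuming the v2 endpoint and its JSON response keyed by the queried address:

    import requests

    def check_visitor(ip, user_api_key=None):
        """Return True when the address looks like a proxy or VPN.

        user_api_key is whatever key the end user pasted into your software's
        settings screen, so their usage shows up in their own dashboard."""
        params = {"vpn": 1}
        if user_api_key:
            params["key"] = user_api_key
        data = requests.get(f"https://proxycheck.io/v2/{ip}", params=params, timeout=5).json()
        return data.get(ip, {}).get("proxy") == "yes"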

Our reasoning for allowing others to produce third party software is that it's just good business sense. We cannot possibly author all the software ourselves that would benefit from a good proxy detection API. So encouraging other developers to build our API into their software is in our best interest.

But it's also a good deal for 3rd party developers, because you don't need to worry about running a complicated, always-available API. You can simply build and charge for the client software you make. And we're not doing any revenue-split structure, so you do not owe us a penny for anything you make which uses our API.

And in fact, to assist you in broadening your software's audience, we are featuring 3rd party software on our examples page. So if you've made something that uses our API and you give your users the ability to input a proxycheck.io API key, then we are more than happy to feature it: simply email us with a link to your code example, application, plugin, function or SDK - if it uses our API, we'll feature it.

To that end, we have today added three new 3rd party applications that utilise our API to the examples page. We hope to add many more, and perhaps even your application will be featured soon. Thanks for reading!


New proxycheck.io blog!

Over the past year we've been working diligently on the proxycheck.io service adding new features and improving the responsiveness and reliability of the API. We've updated the appearance of the website several times and fleshed out the dashboard.

We feel now is the perfect time to launch our blog so we can better reach out to you and provide greater insight into our new features and future plans. We have a great roadmap ahead, the service is performing well and we're fully committed to proxycheck.io and to you.

Firstly, we want to allay some fears that the service is going to turn paid-only or that we're going to segment features between free and paid tiers. We don't believe that a paid-only model for this kind of service is viable. Instead, we believe that the best way to go is a free model with extra paid options for people that need more queries for commercial usage.

Feature segmentation is also not something we feel is necessary at this time. Every user, whether on our free or paid tier, should have access to TLS/SSL-encrypted queries, powerful statistics, query tagging and white/black lists. These features are fundamental to the usage of the API; to only allow paying customers to utilise them would make the service inherently less effective.
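For instance, a tagged query over HTTPS looks the same whichever tier your key is on. This sketch assumes the v2 endpoint and a placeholder key; the tag is simply the label you'll see against the query in your dashboard statistics:

    import requests

    # The same encrypted endpoint and tagging are available to free and paid keys;
    # "login-form" is just an example label for the dashboard statistics.
    result = requests.get(
        "https://proxycheck.io/v2/8.8.8.8",
        params={"key": "YOUR_API_KEY", "vpn": 1, "tag": "login-form"},
        timeout=5,
    ).json()
    print(result)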

Similarly, our free and paying users both enjoy full access to our entire cluster, as we do not differentiate between them. Some competitors' services restrict access to multi-server technology, which means free users are treated as second class. Furthermore, should there be any kind of server issue, those users may lose access to the API temporarily, whereas our customers wouldn't.

The final thing I wanted to discuss was our unique technology. Everything we use has been built from the ground up by us. That includes the API technology, the website, our dashboard features, and even the cluster architecture that our servers utilise.

We did look around for existing cluster technology we could make use of. However, we found it all a bit too complicated, with a lot of code debt, and many of the options are also operating-system specific.

With our custom architecture we maintain mirrored databases and files through a constant, in-process syncing system. If any node in our cluster goes offline, it is instantly kicked from service. It only rejoins once it has re-synced and been tested for coherency by the rest of our cluster members.
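In rough outline the cycle looks something like this (purely an illustration, not our actual cluster code, and the node names are made up):

    class Node:
        def __init__(self, name):
            self.name = name
            self.healthy = True      # stand-in for a real health probe
            self.in_rotation = True  # whether the API is allowed to use this node

    def heartbeat(nodes):
        """One pass of the kick / re-sync / rejoin cycle described above."""
        for node in nodes:
            if not node.healthy and node.in_rotation:
                node.in_rotation = False  # instantly kicked from service
            elif node.healthy and not node.in_rotation:
                # in reality the node first re-syncs its databases and files, then
                # the other members verify coherency before it is allowed back in
                node.in_rotation = True

    nodes = [Node("node-1"), Node("node-2"), Node("node-3")]
    nodes[1].healthy = False
    heartbeat(nodes)
    print([(n.name, n.in_rotation) for n in nodes])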

At the moment we expose two nodes within our cluster for the API to use, but we have a third, redundant node in the cluster which is not exposed. As the service grows and we receive more paying customers, we will be expanding the number of nodes in the cluster, which will enhance proxycheck.io's availability and performance.

I hope this first blog post was enlightening and interesting. If you would like to discuss anything please feel free to message us on Skype, iMessage or Email via [email protected].

Thanks!
