
How I used ProxyScrape proxies to win ipinfo.io merchandise

Jan-05-2024 · 5 mins read
Hey, I’m Benji, a ProxyScrape user, and I love messing around with systems and finding out how programs work in the background. I also do some server administration and occasional coding (https://benji.link). ProxyScrape asked me to write a little about what I used their proxies for, so here we go:
For Halloween 2023, ipinfo.io held the “Halloween Hunt” contest, where users were supposed to use the IPinfo app to submit IPs and GPS data. I’m guessing they wanted to improve their IP data, while we got the chance to win some merch.

Each day for 2 weeks they picked a user from the submissions to win that day’s merch. That way it wasn’t purely a numbers game and still involved some luck.

I started my journey the way it was probably intended: with some simple automation. I made a slow auto-clicker that toggled Airplane mode on my phone to force a new mobile IP from my ISP. This worked, but it was very slow and only gave me around 2-3 IPs per minute.

I collected around 400-500 IPs in total with that method before I started thinking of faster ways.

Once I got home that day, I started to reverse engineer the app to see if there was any way I could easily get around the system and use proxies without having to use a phone for it.

After inspecting the network traffic from the app, I noticed that all it sent was a single request to the JSON endpoint (“https://ipinfo.io/json”), along with your device ID.


Screenshot from HTTP Toolkit monitoring traffic going through the IPinfo app (device ID highlighted).

I first tried just copying one of those requests and using proxies with it, but the requests were not accepted. Through some trial and error, I figured out that it had to do with the extra information being sent along in the request.

The device ID and the endpoint URL turned out to be the only important parts of that request.

After stripping out all the other details so that the request carried only the device_id and nothing else, it started working.

url = "https://ipinfo.io/json?token=app_test"


headers = {
  'Host': 'ipinfo.io',
  'User-Agent': 'IPinfo/Android-Lib/3.0.6/IPinfo',
  'x-conn-details': 'device_id=d813353d28df2ad3'
}

The device ID could simply be copied from any phone that had the app installed. I could probably have made something to generate an ID for me without the app, but that wasn’t worth the hassle.

Now the only thing left to do was to use proxies, which was the easy part.

I decided to use ProxyScrape residential proxies because I had some free data from a promotion, and they let me get thousands of unique IPs.

I started off with just a very simple script that sent 100 requests like this:
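
A rough sketch of that first version (the exact loop and variable names here are my reconstruction, not the original script) would look something like this, picking a random proxy from the ProxyScrape list for each request:

import random
import requests


# read the proxy list (one proxy per line), skipping blank lines
proxies = [p for p in open("proxies.txt").read().split("\n") if p]


url = "https://ipinfo.io/json?token=app_test"
headers = {
  'Host': 'ipinfo.io',
  'User-Agent': 'IPinfo/Android-Lib/3.0.6/IPinfo',
  'x-conn-details': 'device_id=d813353d28df2ad3'
}


# send 100 requests, each one through a randomly picked proxy
for i in range(100):
  proxy = {"https": random.choice(proxies)}
  response = requests.get(url, headers=headers, proxies=proxy, timeout=20)
  print("Request #{}: {}".format(i, response.text))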


This worked well and increased the speed from around 2-3 IPs per minute to around 30 IPs per minute. (With each request taking between 0.5-2 seconds.)

To increase the speed further, since this was still not fast enough for me, I wanted to implement some simple threading to send the requests concurrently.

import requests
import concurrent.futures
import time
import random


# open the proxies.txt file and read the proxies (one per line, skipping blank lines)
proxies = [p for p in open("proxies.txt", "r").read().split("\n") if p]


url = "https://ipinfo.io/json?token=app_test"


device_ids = {
  "3d8e0d7245a92152",
  "a9c7b2b233dd06b8",
  "661035895999a7fe",
  "d813353d28df2ad3",
  "982078c380f4fe38"
}


success_count = 0


def send_request(i):
  global success_count
  try:
    # pick a random proxy from the list
    proxy = {"https": random.choice(proxies)}
    device_id = random.choice(list(device_ids))


    payload = {}
    headers = {
        'Host': 'ipinfo.io',
        'User-Agent': 'IPinfo/Android-Lib/3.0.6/IPinfo',
        'x-conn-details': 'device_id={}'.format(device_id)
    }


    response = requests.request("GET", url, headers=headers, data=payload, proxies=proxy, timeout=20)
    print("Request #{}: \n{} \nTime taken: {}\n".format(i, response.text, response.elapsed.total_seconds()))
    success_count += 1
  except Exception as e:
    print("Request #{}: Error - {}".format(i, str(e)))


with concurrent.futures.ThreadPoolExecutor() as executor:
  futures = []
  for i in range(500):
    time.sleep(0.02)  # wait 20 ms before starting each thread
    futures.append(executor.submit(send_request, i))


  try:
    for future in concurrent.futures.as_completed(futures):
      future.result()
  except KeyboardInterrupt:
    print("Program interrupted by user.")
    for future in futures:
      future.cancel()
  except Exception as e:
    print("An error occurred:", str(e))


print("Success count:", success_count)

I added a few different device IDs just in case, and I got a list of 10,000 1-minute rotating proxies from ProxyScrape, which I pasted into proxies.txt. I also made sure to add a short sleep between starting each thread so that they don’t all fire at the exact same time (which seemed to cause issues).
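
For reference, proxies.txt is just one proxy per line. The entries below are made-up placeholders, and I’m assuming a plain host:port format (requests treats these as HTTP proxies; an authenticated list would use the user:pass@host:port form instead):

203.0.113.10:8080
203.0.113.17:8080
198.51.100.42:8080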

Now all I had to do was increase the number in “range(500)” to get thousands of IPs counted.

All these changes let me get a few hundred IPs per minute, and I kept submitting a couple thousand IPs every day for the 2-week hunt.

According to the event organizers, I managed to get 149k unique IPs, although I suspect that I sent a lot more. That put me at 6th place worldwide and left me with some sweet merch.
You can see the results of the contest here: https://community.ipinfo.io/t/the-great-ip-hunt-is-over/3906

I got myself a sticker pack, an “I am a Huntathon Winner T-Shirt”, a Notecard with a map of the Internet, and some IPinfo socks.

These all arrived around 3 weeks later:

As a disclaimer, the IPinfo team was expecting the app to get reverse engineered and was actually quite happy to see how people found creative ways around the blocks.
In the end, it was a fun time meeting new people, learning some things about proxies and Android reverse engineering, and of course, getting some free merch.

Benji