Read URLs

GET /urls - retrieves a batch of the 100 most recent URLs.

Example response:

{
  "error": { "code": 0, "message": "" },
  "urls": [
    {
      "hash": "h",
      "long_url": "http://google.com",
      "short_url": "m.tiny.cc/h",
      "total_clicks": 0,
      "unique_clicks": 0,
      "last_visit": null,
      "email_stats": false,
      "protected": false,
      "ping": false,
      "archived": false,
      "note": "",
      "expiration_date": null,
      "max_clicks": 0,
      "tags": [],
      "links": { "qr_small": ..., "qr_big": ... }
    },
    {
      "hash": "g",
      "long_url": "http://yahoo.com",
      "short_url": "m.tiny.cc/g",
      "total_clicks": 0,
      "unique_clicks": 0,
      "last_visit": null,
      "email_stats": false,
      "protected": false,
      "ping": false,
      "archived": false,
      "note": "",
      "expiration_date": null,
      "max_clicks": 0,
      "tags": [],
      "links": { "qr_small": ..., "qr_big": ... }
    },
    ...
  ],
  "page": { "results_count": 13, "total_count": 13, "offset": 0 }
}

HTTP status code - 200 OK
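A minimal request sketch in Python using the requests library. The base URL and the Authorization header below are placeholder assumptions for illustration; they are not specified in this section:

import requests

BASE_URL = "https://tiny.cc/api/v2"  # placeholder base URL (assumption)
HEADERS = {"Authorization": "Bearer YOUR_TOKEN"}  # placeholder auth (assumption)

# Fetch the batch of most recent URLs.
response = requests.get(f"{BASE_URL}/urls", headers=HEADERS)
response.raise_for_status()  # expect 200 OK
data = response.json()

for entry in data["urls"]:
    print(entry["hash"], entry["long_url"], entry["short_url"])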


URLs can also be paginated with the offset and limit query parameters:
GET /urls?offset=10&limit=20
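In the Python sketch above, the same pagination request can be expressed through the params argument, which builds the query string:

# Request 20 URLs, skipping the 10 most recent ones.
response = requests.get(
    f"{BASE_URL}/urls",
    headers=HEADERS,
    params={"offset": 10, "limit": 20},
)
page = response.json()["page"]
print(page["offset"], page["results_count"], page["total_count"])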

Other supported query string parameters include search (filter entries by a search term) and order_by (choose the sort order, e.g. by clicks); they can be combined with offset and limit.

Example:
GET /urls?offset=5&limit=1&search=john&order_by=clicks
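The same filtered request in the Python sketch (assumptions as above):

# URLs matching "john", sorted by clicks, one result per page.
response = requests.get(
    f"{BASE_URL}/urls",
    headers=HEADERS,
    params={"offset": 5, "limit": 1, "search": "john", "order_by": "clicks"},
)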


Read a single URL:
GET /urls/[hash]
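A single entry is fetched by interpolating the hash into the path (same assumed BASE_URL and HEADERS as above; the response shape for a single entry is not shown in this section):

# Fetch the entry whose hash is "h".
url_hash = "h"
response = requests.get(f"{BASE_URL}/urls/{url_hash}", headers=HEADERS)
response.raise_for_status()
print(response.json())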


To read a group of URLs, pass a comma-separated list of hashes in the "hashes" query parameter; the commas must be URL-encoded as %2C.
For example, to request a group of three hashes (a, b, c):
GET /urls?hashes=a%2Cb%2Cc

Each individual URL entry has a "links" section listing the locations of related resources.
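For example, assuming the elided "links" values are resource URLs (their exact form is not shown in the response above), they can be read directly from an entry:

entry = response.json()["urls"][0]
print(entry["links"]["qr_small"])  # assumed: URL of the small QR code image
print(entry["links"]["qr_big"])    # assumed: URL of the big QR code image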