APIs.io is an experimental search engine that uses the APIs.json discovery format. The project is maintained by 3scale and also supported by APIEvangelist and APItools.com. It is the successor to a project launched earlier this year.
A bit about the project:
- The APIs.json format is a draft format for API metadata. It is described as a “machine readable JSON file that lives in the root of your domain, and describes your APIs and its supporting API program.”
- The search engine crawls known APIs.json files and top Alexa domains, and follows include links. The APIs it discovers are available via web search and its search API.
- The search engine also provides a validator and a generator for APIs.json files. The code is available on GitHub.
- APIs.io can pull in all the elements, such as attributes, events and exceptions, which means a developer can deep link to everything, said 3scale CEO Steve Wilmott on the Hacker News thread about the new search engine.
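To make the crawling model above concrete, here is a minimal sketch of how an engine like APIs.io might index a single APIs.json file: parse the document, record each API entry, and queue any include links for further crawling. The document and its example.com URLs are placeholders written against the draft spec's field names (name, url, apis, include), not output from the real crawler.

```python
import json

# A minimal APIs.json document. Field names follow the draft spec;
# the example.com URLs are hypothetical placeholders.
APIS_JSON = """
{
  "name": "Example API Program",
  "description": "APIs offered by example.com",
  "url": "http://example.com/apis.json",
  "apis": [
    {
      "name": "Example Search API",
      "description": "Full-text search over example.com data",
      "humanURL": "http://example.com/docs",
      "baseURL": "http://api.example.com/v1"
    }
  ],
  "include": [
    {"name": "Partner APIs", "url": "http://partner.example.com/apis.json"}
  ]
}
"""

def index_apis(document):
    """Return (API entries, include URLs to crawl next) from an APIs.json file."""
    data = json.loads(document)
    apis = [(api["name"], api.get("baseURL")) for api in data.get("apis", [])]
    follow = [inc["url"] for inc in data.get("include", [])]
    return apis, follow

apis, follow = index_apis(APIS_JSON)
print(apis)    # the indexed API names and base URLs
print(follow)  # include links the crawler would visit next
```

Because the metadata lives at a predictable location (the root of the domain) in a machine-readable format, a crawler can refresh its index without anyone filing a directory update.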
One of the goals is to encourage others to build their own search engines so the metadata can be shared more broadly and be readable by both machines and humans.
Until APIs.io we only had directories like ProgrammableWeb, which has done a great job – but it’s hard to keep up with the growth in APIs that way. Until now there hasn’t been any “meta-data” about APIs out on the web – so we’re hoping apis.json and apis.io help make that happen.
Google also doesn’t make it easy to find some APIs, since it either treats them as ordinary content or they aren’t linked at all, Wilmott said on Hacker News.
In the long run, hopefully a lot of the fields in the apis.json files this pulls in will be machine readable (e.g. meta-data for things like major T&C provisions, pointers to libraries, etc.) so you could filter on that.
APIs.io was originally launched by Chris Matthieu, the co-founder of Octoblu, which gives APIs, platforms and devices the capability to talk to each other. That service is still live, with its source code available on GitHub.
Mashape, which helps customers distribute, monetize, manage and consume cloud APIs, also has a thorough directory called PublicAPIs. Each API that turns up in a search result includes recommendations for other relevant APIs.
The PublicAPIs service is clean and looks very useful, but using metadata has certain advantages: machine-to-machine communication and the benefits that come with it, such as automatic updates of information about an API and the ability to crawl the data. Directories, by contrast, require manual updating.
There are some mega trends underway. Containers are impacting the IaaS model that tends to lock down customer data. APIs help ameliorate this problem. But who wants to go to the trouble of pulling all your data and metadata out of a proprietary database?
Containers allow for a new kind of backup: a way to move code so it can be used across different services and on-premise environments. Code can be updated. Microservices offer ways to plug and play services. APIs can be used in a container context, too. API Evangelist Kin Lane wrote a post earlier this year describing how AutoDevBot uses a suite of containers to monitor API endpoints.
In the years ahead, APIs should work with machines that run containers to adapt to data loads, changing environments and different kinds of use cases. If the machines can understand the metadata then a discovery engine like APIs.io becomes quite valuable and necessary, especially as millions of things connect and the need for automation becomes critical for the programming of our world.
Feature image via Flickr Creative Commons.
3Scale is a sponsor of The New Stack.