If you are a data lover like me, this post should give you a good push into data & analytics platforms on the Internet Computer. I am going to share some interesting channels, sites, and persons where you can get some data insights. Also, I am going to say a little about how to pull public data from the canisters on your own.
Talking about data on the IC, there are many specific areas you might be interested in – governance data, network data, NFT data... I'm going to split them into groups, so you can easily jump to the point of your interest.
Governance and neurons
There is no better place to read governance and neuron data than the Internet Computer dashboard. You can see the details of each proposal, the number of votes, how much ICP is staked, how much is dissolving or not dissolving, and so on.
I visit this dashboard every week to check how staking is trending. If the number of neurons and staked ICP grows, it is a good sign for the whole project.
The well-known analytics legend [Kyle Langham](https://twitter.com/kylelangham) has excellent knowledge of staking. In his Substack newsletter, he dives into topics like how much ICP is getting staked, voting analytics, ICP supply versus demand predictions, and so on. He always shares the data and charts his conclusions are based on. I can confidently suggest subscribing to his newsletter so you don't miss anything.
Network
If you are interested in how the network evolves, I can recommend the Internet Computer dashboard from the previous chapter. Plus, one other site shows data about the network size: the ICP Explorer.
Of the two, the IC dashboard shows more information; in particular, you can also see upcoming nodes and boundary nodes. On top of that, there is a chart for the ICP burn rate, which is one of the most important metrics of all. The ICP Explorer, on the other hand, is simpler but has an incredible visualization of the whole network (picture above) and how the nodes communicate with each other.
NFTs
While in the previous chapters I mostly had just one or two tips, the NFT sphere has a lot more to offer. NFTs are commercially very popular, so no doubt more people and solutions are digging into the data. Also, it is quite easy to get this data from the canisters, which I will show you at the end of this text.
Let's see what we've got in terms of NFT analytics:
1. NFT Geek
NFT Geek is a site where you can check real-time data about NFT minting, listings, and transactions. I especially love the real-time feed, where you can select your favorite collections and the types of events you want to monitor. I have it open almost every day while I work so I can be the first to know about new opportunities.
2. RaisinRank
RaisinRank is a nicely designed web app, also focused on NFT data. It is useful for seeing how many unique holders a specific collection has or for checking how many transactions happened in a specific time frame.
3. My NFTs reporting dashboard
Even though I totally love the previous two, I missed having more charts and the ability to explore the data the way I wanted. So I decided to create my own report using Data Studio, which gives me the flexibility to visualize data however I want.
This report pulls data automatically every hour, directly from the canisters. It is not as real-time as NFT Geek or RaisinRank, but it provides long-term trend views.
4. Dgastonia
Dgastonia is a great explorer for NFT collections on the IC. It provides nice filtering, so you can see how many pieces with a specific trait are listed in a collection. You can also see some quality indexes and charts there.
5. Avocado research & IC stats
The Avocado research and IC stats Twitter profiles are a good place to start if you don't want to bother working with the data yourself. They regularly share nice infographics that are easy to understand.
If you don't really need to see real-time data or do your own research, this is a good way to get some high level informative insights.
Canister and web analytics
The authors of NFT Geek have more tools in their pockets. If you are familiar with Google Analytics, User Geek is a Web3.0 alternative that gives you insights about your web app while not spying on your visitors or monetizing their data.
Canister Geek, on the other hand, gives you a dashboard about the usage of your canister – how many cycles it consumes plus memory usage.
I had a one-hour video call with Dmitry from User/Canister/NFT Geek, and these guys are incredible. It's a strong six-person team of IT adventurers building their own analytics engine. From what I understand, they keep the data on AWS but are looking for ways to integrate it 100% on the IC. I wish them luck.
How to get data from the canister
To wrap up, I would like to give you a small tutorial in case you are wondering how to get data from a canister. The easiest way to pull data out of the IC is the ic-py library. If you have at least basic knowledge of Python, you can do it!
I have also shared a minified version of the Python code that I use to pull data for my NFTs reporting dashboard. You can find it here.
Most of the apps on the IC have a backend canister, and you need the ID of this canister. For example, for NFTs listed on Entrepot, you can get it by clicking on this link, or you can take it from the NFT's URL and then search for it on ic.rocks.
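If you prefer grabbing the ID programmatically, canister IDs follow a recognizable textual pattern (dash-separated groups ending in "-cai"), so a small regex can pull one out of an NFT URL. This is just a sketch: the URL below is made up for illustration, so substitute the URL of an actual NFT.

```python
import re

# Canister IDs in text form are dash-separated groups ending in "-cai",
# e.g. "ryjl3-tyaaa-aaaaa-aaaba-cai".
CANISTER_ID_RE = re.compile(r"\b(?:[a-z0-9]{5}-)+cai\b")

def canister_id_from_url(url: str):
    """Return the first canister ID found in the URL, or None."""
    match = CANISTER_ID_RE.search(url)
    return match.group(0) if match else None

# Hypothetical NFT URL, for illustration only:
url = "https://ryjl3-tyaaa-aaaaa-aaaba-cai.raw.ic0.app/?tokenid=abc123"
print(canister_id_from_url(url))  # → ryjl3-tyaaa-aaaaa-aaaba-cai
```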
ic.rocks is a site where you can search for canisters and see their public interface. For the BTC Flower NFT canister, you can see that there is a public function called Transactions. You can call this function in your Python code with the ic-py library, or click 'query' in the UI to get the data without any coding.
Voila! You've generated your first dataset on the Internet Computer! That is all for today's article. Follow me on Twitter and I can promise you I will be sharing more about data integration on the IC in the future.
Disclaimer: The views and opinions expressed on this website are solely those of the original author and other contributors. These views and opinions do not necessarily represent those of the Dfinity Community staff and/or any/all contributors to this site.