There are three data sources:
- On-chain data (DeFi, NFTs, GameFi, etc.)
- Off-chain data (e.g. some token prices come from the CoinGecko API)
- Data uploaded by the community
Key features:
- No-code experience: users can easily analyse on-chain data without writing SQL
- Cross-chain analysis supported
- Combined on-chain and off-chain data analysis supported
- Ability to upload your own data
- Semantic data, so users can quickly understand complex on-chain data
Open the dashboard to see the Time Period and Table Description for each data table; click a table's hyperlink to view its details.
Getting new contract addresses from external sources was our early practice; we have since iterated to an automated solution that identifies contract addresses from our own foundation tables (transactions, token_transfers, etc.). When incremental node data contains contract addresses that are not yet in the database, the mechanism collects them automatically.
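The discovery step can be sketched as a simple set difference against the addresses already indexed. This is a minimal illustration only: the field names (`contract_address`, `to_address`, `to_is_contract`) are assumptions, not the actual schema.

```python
# Minimal sketch of the automated contract-address discovery described
# above. Field names are illustrative assumptions, not the real schema.

def discover_new_contracts(incremental_txs, known_contracts):
    """Return contract addresses seen in new transactions but missing
    from the database."""
    found = set()
    for tx in incremental_txs:
        # A contract creation exposes the deployed address directly;
        # an ordinary call may reference a contract not indexed yet.
        addr = tx.get("contract_address") or (
            tx.get("to_address") if tx.get("to_is_contract") else None
        )
        if addr and addr not in known_contracts:
            found.add(addr)
    return found

known = {"0xaaa"}
incremental = [
    {"contract_address": "0xbbb"},                     # new deployment
    {"to_address": "0xaaa", "to_is_contract": True},   # already indexed
    {"to_address": "0xccc", "to_is_contract": True},   # unseen contract
    {"to_address": "0xddd", "to_is_contract": False},  # EOA transfer, skip
]
new_addresses = discover_new_contracts(incremental, known)
# new_addresses -> {"0xbbb", "0xccc"}
```

In the production pipeline this comparison would run against the foundation tables rather than in-memory sets, but the logic is the same.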
An NFT marketplace aggregator consolidates the listing inventory of multiple marketplaces, giving users full transparency of the market and letting them buy and sell NFTs in bulk without interacting with each marketplace separately. From the event-parsing perspective, we treat NFT aggregators as marketplaces.
Aggregator transactions will include not only the marketplace name but also the aggregator name. In special cases (such as when more than two aggregators are involved), we display only the name of the first aggregator that triggered the trade.
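The labelling rule above can be sketched as follows. The platform names and the call-trace format are hypothetical, chosen only to illustrate the "first aggregator wins" behaviour.

```python
# Hypothetical sketch of the trade-labelling rule. Platform names and
# the trace format are assumptions for illustration only.
AGGREGATORS = {"Gem", "Genie"}
MARKETPLACES = {"OpenSea", "LooksRare"}

def label_trade(platforms_in_call_order):
    """Label one parsed trade with its marketplace and aggregator."""
    marketplace = next(
        (p for p in platforms_in_call_order if p in MARKETPLACES), None
    )
    # If several aggregators appear in the trace, keep only the first
    # one that triggered the trade.
    aggregator = next(
        (p for p in platforms_in_call_order if p in AGGREGATORS), None
    )
    return {"marketplace": marketplace, "aggregator": aggregator}

trade = label_trade(["Gem", "Genie", "OpenSea"])
# trade -> {"marketplace": "OpenSea", "aggregator": "Gem"}
```

Here two aggregators appear in the trace, so only the first, "Gem", is kept alongside the marketplace name.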
We use several strategies to validate data:
Statistical validation
No unusual data is present, verified with standard statistical methods, and there are no gaps in the time series.
Logical (internal) validation
The data makes sense in terms of business metrics. For instance, most lending protocols require overcollateralization, so the net deposit amount should be higher than the net borrowing amount.
Cross-platform validation
Finally, we compare calculated metrics such as TVL and market cap (MC) with other data platforms.
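The checks above can be sketched in a few lines. Thresholds, field names, and the specific statistical test (a z-score cut-off) are assumptions for illustration; the actual pipeline may use different methods.

```python
# Illustrative sketches of the validation strategies described above.
# Thresholds and the z-score test are assumptions, not the real checks.
from statistics import mean, stdev

def has_time_gaps(timestamps, expected_step):
    """Statistical validation: detect missing intervals in a series."""
    ts = sorted(timestamps)
    return any(b - a > expected_step for a, b in zip(ts, ts[1:]))

def unusual_points(values, z_threshold=3.0):
    """Statistical validation: flag values far from the mean."""
    mu, sigma = mean(values), stdev(values)
    return [v for v in values if sigma and abs(v - mu) / sigma > z_threshold]

def passes_lending_check(net_deposits, net_borrows):
    """Logical validation: overcollateralized lending implies
    deposits exceed borrows."""
    return net_deposits > net_borrows

# Hourly series with one missing hour, and one spiky value:
gaps = has_time_gaps([0, 3600, 10800], expected_step=3600)   # True
spikes = unusual_points([10] * 20 + [100])                   # [100]
ok = passes_lending_check(net_deposits=100.0, net_borrows=60.0)  # True
```

Cross-platform validation has no single formula: calculated metrics are compared against the figures other platforms publish, within an agreed tolerance.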