Since Curate was first conceptualized, demand for Ethereum block space has skyrocketed. This put a high price floor on the use cases Curate can serve. Furthermore, we learned that there is a lot more demand for user-facing data than for contract-to-contract queries.
Light Curate is a new version of the contract that significantly decreases the cost of deploying and operating Curate lists by leveraging new technologies and changing the data storage strategy.
1- Light Curate does not use contract storage to store item data. Instead, we only store the item's IPFS multihash in the contract. This means other contracts can't query the TCR by field values, but storage costs are roughly O(1) vs Classic Curate's O(n).
2- The Graph's IPFS API means we can also store the item fields in the subgraph. This comes with several benefits, among them:
- No need to use @kleros/gtcr-encoder to encode and decode items. Just query the subgraph and you have the item.
- Faster, scalable search: Classic Curate needs to sync with the client by downloading every single item and decoding it. With subgraphs we do not need to do this and can query the fields directly.
- No need for complex Solidity code searching fields on-chain.
3- EIP-1167: Light Curate uses the minimal proxy pattern for new deployments. This means the cost of deploying a new TCR dropped from roughly 7 million gas to 700k. Ten times cheaper!
Development
This section is divided into 4 parts:
1- Fetching Parameters: Your UI needs to display some important information to users, such as the bounty for successful challenges and how long items stay in the challenge period.
2- Item Submission: Here you will learn how to build a button to submit an item to the list.
3- Fetching Items: How to view items and item details.
4- Item Interaction: This includes challenging, submitting evidence and crowdfunding appeals.
Fetching Parameters
We use a view contract to fetch all the relevant information at once. Deployments:
Note: If you are using React, you can take the hook we built here or use it as an example.
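The view contract batches all of these reads into a single call; its exact interface is best taken from the deployed ABI. As an illustration of which values a UI typically needs, below is a sketch with ethers.js that reads the equivalent public getters directly from the list contract (the getter names follow LightGeneralizedTCR's public state variables; the total submission deposit is the base deposit plus the arbitration cost). Treat it as a sketch under those assumptions, not as the view contract's interface.

import { ethers } from "ethers";

// Public getters on the list contract and the standard ERC-792 arbitrator.
const tcrAbi = [
  "function submissionBaseDeposit() view returns (uint256)",
  "function challengePeriodDuration() view returns (uint256)",
  "function arbitrator() view returns (address)",
  "function arbitratorExtraData() view returns (bytes)",
];
const arbitratorAbi = [
  "function arbitrationCost(bytes _extraData) view returns (uint256)",
];

const provider = new ethers.providers.JsonRpcProvider(process.env.RPC_URL);

const fetchParameters = async (tcrAddress: string) => {
  const tcr = new ethers.Contract(tcrAddress, tcrAbi, provider);
  const [baseDeposit, challengePeriodDuration, arbitratorAddress, extraData] =
    await Promise.all([
      tcr.submissionBaseDeposit(),
      tcr.challengePeriodDuration(),
      tcr.arbitrator(),
      tcr.arbitratorExtraData(),
    ]);

  const arbitrator = new ethers.Contract(
    arbitratorAddress,
    arbitratorAbi,
    provider
  );
  const arbitrationCost = await arbitrator.arbitrationCost(extraData);

  return {
    // Total deposit required to submit an item.
    submissionDeposit: baseDeposit.add(arbitrationCost),
    // Duration items stay challengeable, in seconds.
    challengePeriodDuration,
  };
};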
Item Submission
With Light Curate, item submission consists of first uploading the item to IPFS and then submitting a transaction with the required deposit.
Since we use @graphprotocol/graph-ts, we must submit items to The Graph's IPFS endpoint until custom endpoints are supported. We also upload to the Kleros IPFS node.
In addition to Kleros' and The Graph's nodes, we strongly advise pinning the data to IPFS nodes you control as well. Update the snippet provided below to do so.
const pinFiles = async (
  data: FormData,
  pinToGraph: boolean
): Promise<[Array<string>, Array<{ filebaseCid: string; graphCid: string }>]> => {
  const cids = new Array<string>();
  // keep track in case some cids are inconsistent
  const inconsistentCids = new Array<{
    filebaseCid: string;
    graphCid: string;
  }>();
  for (const [_, dataElement] of Object.entries(data)) {
    if (dataElement.isFile) {
      const { filename, mimeType, content } = dataElement;
      const path = `${filename}`;
      const cid = await filebase.storeDirectory([
        new File([content], path, { type: mimeType }),
      ]);
      if (pinToGraph) {
        const graphResult = await publishToGraph(filename, content);
        if (!areCidsConsistent(cid, graphResult)) {
          console.warn("Inconsistent cids from Filebase and Graph Node :", {
            filebaseCid: cid,
            graphCid: graphResult[1].hash,
          });
          inconsistentCids.push({
            filebaseCid: cid,
            graphCid: graphResult[1].hash,
          });
        }
      }
      cids.push(`/ipfs/${cid}/${path}`);
    }
  }
  return [cids, inconsistentCids];
};

/**
 * Send file to IPFS network via The Graph hosted IPFS node
 * @param data - The raw data from the file to upload.
 * @returns ipfs response. Should include the hash and path of the stored item.
 */
export const publishToGraph = async (fileName, data) => {
  const url = `${process.env.GRAPH_IPFS_ENDPOINT}/api/v0/add?wrap-with-directory=true`;

  const payload = new FormData();
  payload.append("file", new Blob([data]), fileName);

  const response = await fetch(url, {
    method: "POST",
    body: payload,
  });

  if (!response.ok) {
    throw new Error(
      `HTTP error! status: ${response.status}, Failed to pin to graph`
    );
  }

  const result = parseNewlineSeparatedJSON(await response.text());

  return result.map(({ Name, Hash }) => ({
    hash: Hash,
    path: `/${Name}`,
  }));
};

/**
 * @description parses json from stringified jsons separated by new lines
 */
const parseNewlineSeparatedJSON = (text) => {
  const lines = text.trim().split("\n");
  return lines.map((line) => JSON.parse(line));
};

export const areCidsConsistent = (filebaseCid, graphResult) => {
  const graphCid = graphResult[1].hash;
  return graphCid === filebaseCid;
};
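Once the item is pinned, the second step mentioned above is the on-chain call that pays the required deposit. Below is a minimal sketch with ethers.js; it assumes the list contract exposes LightGeneralizedTCR's payable addItem(string) function and that the required deposit (submission base deposit plus arbitration cost) has already been fetched as shown in the Fetching Parameters section.

import { ethers } from "ethers";

// ABI fragment: addItem takes the item's IPFS path and is payable.
const tcrAbi = ["function addItem(string _item) payable"];

const submitItem = async (
  signer: ethers.Signer,
  tcrAddress: string,
  ipfsPath: string, // e.g. the "/ipfs/<cid>/<file>" path returned by pinFiles
  requiredDeposit: ethers.BigNumber // submissionBaseDeposit + arbitrationCost
) => {
  const tcr = new ethers.Contract(tcrAddress, tcrAbi, signer);
  const tx = await tcr.addItem(ipfsPath, { value: requiredDeposit });
  return tx.wait(); // resolves once the submission is mined
};

The item JSON itself, i.e. the file that gets pinned and whose /ipfs path is passed on-chain, is structured as described next.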
The JSON file for the item is composed of its metadata and values.
Metadata (columns): An array describing each of the item's columns (its type, name, description, etc.)
Values (values): An object mapping the column name to the value.
The metadata is available inside the meta evidence file, which is returned by the useTCRView hook. The Values are input by the user.
Example of columns used by the TCR at
[ {"label":"Logo","description":"The token's logo.","type":"image","isIdentifier":false }, {"label":"Name","description":"The token name.","type":"text","isIdentifier":true }, {"label":"Ticker","description":"The token ticker.","type":"text","isIdentifier":true }, {"label":"Address","description":"The token address.","type":"address","isIdentifier":true }, {"label":"Chain ID","description":"The ID of the chain the token contract was deployed","type":"number" }, {"label":"Decimals","description":"The number of decimal places.","type":"number" }]
And an example of values. Note that it is required for the keys to match the column names in the columns object.
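For the columns above, a values object could look like the following. The token details here are purely illustrative and the logo path is a placeholder; the exact serialization (e.g. numbers as strings) should follow what your columns and UI expect.

{
  "Logo": "/ipfs/<logo-multihash>/pnk-logo.png",
  "Name": "Pinakion",
  "Ticker": "PNK",
  "Address": "0x93ED3FBe21207Ec2E8f2d3c3de6e058Cb73Bc04d",
  "Chain ID": "1",
  "Decimals": "18"
}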
Fetching Items
We break this section down into two parts, as the list view and the details view have different requirements.
Fetching items is best done via the subgraph we provide. If you deployed a list using the factory, it already has a subgraph deployed and available here.
List
Whenever we want to fetch items, or a specific item, we must pass the TCR address to the subgraph.
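Below is a minimal sketch of such a query, written as a plain fetch against the subgraph endpoint. The entity and field names (litems, registryAddress, itemID, status, data) are assumptions about the Light Curate subgraph schema and should be confirmed against the schema of your deployment; the endpoint URL is a placeholder.

const SUBGRAPH_URL = "<YOUR_SUBGRAPH_ENDPOINT>"; // placeholder

// Fetch the first 25 items of a given list.
// Field names are assumptions: check them against the subgraph schema.
const fetchItems = async (tcrAddress: string) => {
  const query = `
    query Items($registry: String!) {
      litems(first: 25, where: { registryAddress: $registry }) {
        itemID
        status
        data
      }
    }
  `;
  const response = await fetch(SUBGRAPH_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      query,
      variables: { registry: tcrAddress.toLowerCase() },
    }),
  });
  const { data } = await response.json();
  return data.litems;
};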