Support smart-contract-based mapping of MinerID to IndexProviderPeerID #250

@bajtos

Description

The current Spark implementation assumes the on-chain PeerID found in the Filecoin.StateMinerInfo response is the same as the IPNI index provider PeerID; see https://github.com/CheckerNetwork/FIPs/blob/frc-retrieval-checking-requirements/FRCs/frc-retrieval-checking-requirements.md#link-on-chain-minerid-and-ipni-provider-identity

Curio uses a different PeerID for on-chain state and index provider advertisements. To support Spark, Curio is maintaining the MinerID->IndexProviderPeerID mapping in a smart contract. (There is a single smart contract shared by all miners using this new mechanism.)

We need to improve Spark to support both flavours of MinerID->IndexProviderPeerID lookups.
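To make the two flavours concrete, here is a minimal sketch of what the combined lookup could look like. This is not existing Spark code; `lookupContractMapping` and `lookupStateMinerInfo` are assumed helper functions (one querying the shared smart contract, the other calling `Filecoin.StateMinerInfo`), and the "contract first, then on-chain state" ordering is one possible answer to the open question below.

```javascript
// Hypothetical sketch: resolve a miner's index provider PeerID by checking
// the Curio smart-contract mapping first and falling back to the PeerID
// reported by Filecoin.StateMinerInfo. Both lookup functions are assumed
// to return a PeerID string or a falsy value when no entry exists.
async function resolveIndexProviderPeerId (minerId, { lookupContractMapping, lookupStateMinerInfo }) {
  // 1. Check the single smart contract shared by all miners using the
  //    new Curio mechanism.
  const contractPeerId = await lookupContractMapping(minerId)
  if (contractPeerId) return contractPeerId

  // 2. Fall back to the on-chain PeerID from Filecoin.StateMinerInfo.
  return await lookupStateMinerInfo(minerId)
}
```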

Resources:

Places where to implement this new mechanism:

Related:

Open questions:

  • How do we want to interact with the smart contract - are we going to use Ethers.js as we do elsewhere, or implement something custom?
  • If Ethers.js: how do we obtain the ABI.json file needed to initialise the Ethers.js smart contract client?
  • Are we okay to share the implementation using the current copy-n-paste approach, or do we need to figure out how to share the miner-to-peer-id lookup code first?
  • How should we check the two flavours - one after the other (and if so, which one first?) or in parallel?
  • How can we make the RPC API call querying the smart contract state reasonably cheap in terms of Glif Compute Units spent? (Can we use the same mechanism as we use for Filecoin.StateMinerInfo queries?)
  • Should we verify the signature of the mapping entry? I think we should, otherwise a malicious SP can delegate all Spark retrieval checks to a different SP by posting a mapping from their MinerID to somebody else's PeerID. OTOH, this is not critical and can be moved to a follow-up task.
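On the sequential-vs-parallel question, the parallel option could be sketched roughly as follows. Again, `fromContract` and `fromStateMinerInfo` are hypothetical lookup functions, and preferring the smart-contract result when both succeed is an assumption, not a decided design.

```javascript
// Hypothetical sketch of the "in parallel" option: fire both lookups at
// once, tolerate either one failing, and prefer the smart-contract mapping
// when it returns an entry.
async function lookupBothFlavours (minerId, { fromContract, fromStateMinerInfo }) {
  const [contract, state] = await Promise.allSettled([
    fromContract(minerId),
    fromStateMinerInfo(minerId)
  ])
  if (contract.status === 'fulfilled' && contract.value) return contract.value
  if (state.status === 'fulfilled' && state.value) return state.value
  throw new Error(`No index provider PeerID found for miner ${minerId}`)
}
```

The trade-off versus the sequential approach: parallel lookups reduce latency but always spend the RPC cost of both queries, which interacts with the Glif Compute Units question above.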

Tasks:

  • Write a design proposal answering the questions above.
  • Convert the proposal into a list of implementation (sub)tasks.
  • Add the tasks to this list and implement them :)
  • Set the AbortSignal timeout to 60_000 ms.

Blocks CheckerNetwork/spark-deal-observer#117

Metadata

Status: ✅ done