Here are the steps:
- Prerequisites
- Initialize services
- Create a new node.js project
- Install dependencies
- Import dependencies and add variables and constants
- Initialize accounts and deploy contracts
- Publish a dataset and an algorithm
- Resolve published datasets and algorithms
- Send datatokens to consumer
- Consumer fetches compute environment
- Consumer starts a free compute job using a free C2D environment
- Check compute status and get the compute results download URL
- Consumer starts a paid compute job
- Check the paid compute job status and get the compute results download URL
Let's go through each step.
Before we start, make sure you have all of the necessary prerequisites installed on your computer.
- A Unix-based operating system (Linux or macOS). If you are a Windows user, you can try running Linux inside a virtual machine, but that is outside the scope of this article.
- Git. Instructions for installing Git can be found here: https://git-scm.com/book/en/v2/Getting-Started-Installing-Git
- Node.js can be downloaded from here: https://nodejs.org/en/download/
- Docker can be installed from here: https://docs.docker.com/get-docker/. Please note that Docker must run as a non-root user; you can set this up by following these instructions: https://docs.docker.com/engine/install/linux-postinstall/
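You can quickly confirm that the tooling is available from a terminal (a quick sanity check; any reasonably recent versions should work):

```shell
# Check that each prerequisite is available on your PATH
git --version || echo "git is missing"
docker --version || echo "docker is missing"
docker ps >/dev/null 2>&1 || echo "docker is installed but not usable as non-root"
node --version || echo "node is missing"
npm --version || echo "npm is missing"
```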
Ocean.js uses off-chain services for metadata (Aquarius) and consuming datasets (Provider).
We start by initializing the services. To do this, we clone the Barge repository and run it. This will run the current default versions of Aquarius, Provider, and Ganache with our contracts deployed to it.
git clone https://github.com/oceanprotocol/barge.git
cd barge/
./start_ocean.sh

Start by creating a new Node.js project. Open a new terminal and enter the following commands:
mkdir compute-quickstart
cd compute-quickstart
npm init
# Answer the questions in the command line prompt
touch compute.ts
# On Linux, press CTRL + D to save

Next, we need to set up our TypeScript compiler options. Create a new file called tsconfig.json in the root of the compute-quickstart directory.
touch tsconfig.json
# Copy the following JSON content into the file. On Linux, press CTRL + D to save
{
"compilerOptions": {
"lib": ["es6", "es7"],
"module": "CommonJS",
"target": "ES5",
"esModuleInterop": true,
"allowSyntheticDefaultImports": true,
"outDir": "./dist/",
"declaration": true,
"declarationDir": "./dist/"
},
"include": [
"compute.ts"
],
"exclude": [ "node_modules", "dist" ]
}

Now you can compile your TypeScript project. If you have TypeScript installed, use the following command:

tsc

If you don't have TypeScript installed, you can install it with the command below and then compile using the command above:

npm install -g typescript

Or, if you don't want to install TypeScript globally, use the following command to compile your file:

npx tsc compute.ts

To run your script as we go along, compile it and then use the following command:

node dist/compute.js

Install the dependencies by running the following command in your terminal:
npm install @oceanprotocol/lib @oceanprotocol/ddo-js bignumber.js crypto-js ethers

Now open the compute.ts file in your text editor.
Start by importing all of the necessary dependencies:
import fs from 'fs'
import { homedir } from 'os'
import { ethers, getAddress, JsonRpcProvider, parseEther, Signer, toBeHex } from 'ethers'
import {
ProviderInstance,
Aquarius,
NftFactory,
Datatoken,
Nft,
ZERO_ADDRESS,
transfer,
sleep,
approveWei,
ProviderComputeInitialize,
ConsumeMarketFee,
ComputeAlgorithm,
ComputeAsset,
Config,
StorageObject,
NftCreateData,
DatatokenCreateParams,
sendTx,
configHelperNetworks,
ConfigHelper,
getEventFromTx,
amountToUnits,
isDefined,
ComputeResourceRequest,
unitsToAmount,
AssetFiles
} from '@oceanprotocol/lib'
import crypto from 'crypto-js'
import assert from 'assert'
import { DDO } from '@oceanprotocol/ddo-js'
import { EscrowContract } from '@oceanprotocol/lib'
import BigNumber from 'bignumber.js'
const { SHA256 } = crypto

We will need two files to publish, one as the dataset and one as the algorithm, so here we define the files that we intend to publish.
const DATASET_ASSET_URL: StorageObject = {
type: 'url',
url: 'https://raw.githubusercontent.com/oceanprotocol/testdatasets/main/shs_dataset_test.txt',
method: 'GET'
}
const ALGORITHM_ASSET_URL: StorageObject = {
type: 'url',
url: 'https://raw.githubusercontent.com/oceanprotocol/testdatasets/main/shs_dataset_test.txt',
method: 'GET'
}

Next, we define the metadata that will describe our dataset and algorithm assets. This is what we call the DDOs.
const DATASET_DDO: DDO = {
'@context': ['https://w3id.org/did/v1'],
id: 'did:op:efba17455c127a885ec7830d687a8f6e64f5ba559f8506f8723c1f10f05c049c',
version: '4.1.0',
chainId: 8996,
nftAddress: '0x0',
metadata: {
created: '2021-12-20T14:35:20Z',
updated: '2021-12-20T14:35:20Z',
type: 'dataset',
name: 'dataset-name',
description: 'Ocean protocol test dataset description',
author: 'oceanprotocol-team',
license: 'https://market.oceanprotocol.com/terms',
additionalInformation: {
termsAndConditions: true
}
},
services: [
{
id: '1155995dda741e93afe4b1c6ced2d01734a6ec69865cc0997daf1f4db7259a36',
type: 'compute',
files: '',
datatokenAddress: '0xa15024b732A8f2146423D14209eFd074e61964F3',
serviceEndpoint: 'http://127.0.0.1:8001',
timeout: 300,
compute: {
publisherTrustedAlgorithmPublishers: ['*'] as any,
publisherTrustedAlgorithms: [
{
did: '*',
filesChecksum: '*',
containerSectionChecksum: '*'
}
] as any,
allowRawAlgorithm: false,
allowNetworkAccess: true
}
}
]
}
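Note that the dataset's service has type 'compute': that is what makes it usable for Compute-to-Data, and its compute block whitelists any algorithm via the '*' wildcards. As a quick structural sanity check before publishing (a sketch with hypothetical helper names, not the schema validation that Aquarius performs), you could assert the service type:

```typescript
// Hypothetical helper: confirm a DDO-like object exposes a compute-type
// service before we try to run C2D jobs against it.
type ServiceLike = { type: string }
type DdoLike = { services: ServiceLike[] }

function hasComputeService(ddo: DdoLike): boolean {
  return ddo.services.some((s) => s.type === 'compute')
}

console.log(hasComputeService({ services: [{ type: 'compute' }] })) // true
console.log(hasComputeService({ services: [{ type: 'access' }] })) // false
```

The same check applied to ALGORITHM_DDO below would return false, since algorithms are published with an 'access' service instead.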
const ALGORITHM_DDO: DDO = {
'@context': ['https://w3id.org/did/v1'],
id: 'did:op:efba17455c127a885ec7830d687a8f6e64f5ba559f8506f8723c1f10f05c049c',
version: '4.1.0',
chainId: 8996,
nftAddress: '0x0',
metadata: {
created: '2021-12-20T14:35:20Z',
updated: '2021-12-20T14:35:20Z',
type: 'algorithm',
name: 'algorithm-name',
description: 'Ocean protocol test algorithm description',
author: 'oceanprotocol-team',
license: 'https://market.oceanprotocol.com/terms',
additionalInformation: {
termsAndConditions: true
},
algorithm: {
language: 'Node.js',
version: '1.0.0',
container: {
entrypoint: 'node $ALGO',
image: 'ubuntu',
tag: 'latest',
checksum:
'sha256:2d7ecc9c5e08953d586a6e50c29b91479a48f69ac1ba1f9dc0420d18a728dfc5'
}
}
},
services: [
{
id: 'db164c1b981e4d2974e90e61bda121512e6909c1035c908d68933ae4cfaba6b0',
type: 'access',
files: '',
datatokenAddress: '0xa15024b732A8f2146423D14209eFd074e61964F3',
serviceEndpoint: 'http://127.0.0.1:8001',
timeout: 300
}
]
}

Now we define the variables which we will need later:
let config: Config
let aquariusInstance: Aquarius
let datatoken: Datatoken
let providerUrl: string
let publisherAccount: Signer
let consumerAccount: Signer
let addresses
let computeEnvs
let datasetId: string
let algorithmId: string
let resolvedDatasetDdo: DDO
let resolvedAlgorithmDdo: DDO
let computeJobId: string
let agreementId: string
let computeRoutePath: string
let hasFreeComputeSupport: boolean

Now we define the helper methods which we will use later to publish the dataset and the algorithm, and to order them.
Add a createAssetHelper() function:
async function createAssetHelper(
name: string,
symbol: string,
owner: Signer,
assetFiles: StorageObject[],
ddo: DDO,
providerUrl: string
) {
const { chainId } = await owner.provider.getNetwork()
const nft = new Nft(owner, Number(chainId))
const nftFactory = new NftFactory(addresses.ERC721Factory, owner, Number(chainId))
ddo.chainId = Number(chainId)
const nftParamsAsset: NftCreateData = {
name,
symbol,
templateIndex: 1,
tokenURI: 'aaa',
transferable: true,
owner: await owner.getAddress()
}
const datatokenParams: DatatokenCreateParams = {
templateIndex: 1,
cap: '100000',
feeAmount: '0',
paymentCollector: ZERO_ADDRESS,
feeToken: ZERO_ADDRESS,
minter: await owner.getAddress(),
mpFeeAddress: ZERO_ADDRESS
}
const bundleNFT = await nftFactory.createNftWithDatatoken(
nftParamsAsset,
datatokenParams
)
const trxReceipt = await bundleNFT.wait()
// events have been emitted
const nftCreatedEvent = getEventFromTx(trxReceipt, 'NFTCreated')
const tokenCreatedEvent = getEventFromTx(trxReceipt, 'TokenCreated')
const nftAddress = nftCreatedEvent.args.newTokenAddress
const datatokenAddressAsset = tokenCreatedEvent.args.newTokenAddress
// create the files encrypted string
const assetUrl: AssetFiles = {
nftAddress,
datatokenAddress: datatokenAddressAsset,
files: assetFiles
}
ddo.services[0].files = await ProviderInstance.encrypt(
assetUrl,
Number(chainId),
providerUrl,
owner
)
ddo.services[0].datatokenAddress = datatokenAddressAsset
ddo.services[0].serviceEndpoint = providerUrl
ddo.nftAddress = nftAddress
ddo.id = 'did:op:' + SHA256(ethers.getAddress(nftAddress) + chainId.toString(10))
const encryptedResponse = await ProviderInstance.encrypt(
ddo,
Number(chainId),
providerUrl,
owner
)
const validateResult = await aquariusInstance.validate(ddo, owner, providerUrl)
await nft.setMetadata(
nftAddress,
await owner.getAddress(),
0,
providerUrl,
'',
toBeHex(2),
encryptedResponse,
validateResult.hash
)
return ddo.id
}

Add a handleOrder() function:
async function handleOrder(
order: ProviderComputeInitialize,
datatokenAddress: string,
payerAccount: Signer,
consumerAccount: string,
serviceIndex: number,
consumeMarkerFee?: ConsumeMarketFee
) {
/* We do have 3 possible situations:
- have validOrder and no providerFees -> then order is valid, providerFees are valid, just use it in startCompute
- have validOrder and providerFees -> then order is valid but providerFees are not valid, we need to call reuseOrder and pay only providerFees
- no validOrder -> we need to call startOrder, to pay 1 DT & providerFees
*/
if (order.providerFee && order.providerFee.providerFeeAmount) {
await approveWei(
payerAccount,
config,
await payerAccount.getAddress(),
order.providerFee.providerFeeToken,
datatokenAddress,
order.providerFee.providerFeeAmount
)
}
if (order.validOrder) {
if (!order.providerFee) return order.validOrder
const tx = await datatoken.reuseOrder(
datatokenAddress,
order.validOrder,
order.providerFee
)
const reusedTx = await tx.wait()
const orderReusedTx = getEventFromTx(reusedTx, 'OrderReused')
return orderReusedTx.transactionHash
}
const tx = await datatoken.startOrder(
datatokenAddress,
consumerAccount,
serviceIndex,
order.providerFee,
consumeMarkerFee
)
const orderTx = await tx.wait()
const orderStartedTx = getEventFromTx(orderTx, 'OrderStarted')
return orderStartedTx.transactionHash
}

At the end of your compute.ts file, define an async function run() { }. We will use this function to add and test the following chunks of code.
We need to load the configuration. Add the following code into your run() { } function:
const provider = new JsonRpcProvider(
process.env.NODE_URI || configHelperNetworks[1].nodeUri
)
publisherAccount = (await provider.getSigner(0)) as Signer
consumerAccount = (await provider.getSigner(1)) as Signer
config = new ConfigHelper().getConfig(
parseInt(String((await publisherAccount.provider.getNetwork()).chainId))
)
if (process.env.NODE_URL) {
config.oceanNodeUri = process.env.NODE_URL
}
aquariusInstance = new Aquarius(config?.oceanNodeUri)
providerUrl = config?.oceanNodeUri
addresses = JSON.parse(
// eslint-disable-next-line security/detect-non-literal-fs-filename
fs.readFileSync(
process.env.ADDRESS_FILE ||
`${homedir}/.ocean/ocean-contracts/artifacts/address.json`,
'utf8'
)
).development

As we go along, it's a good idea to log values to the console so you can check they are correct. At the end of your run() { ... } function, add the following logs:
console.log(`Indexer URL: ${config.oceanNodeUri}`)
console.log(`Provider URL: ${providerUrl}`)
console.log('Deployed contract addresses: ', addresses)
console.log(`Publisher account address: ${await publisherAccount.getAddress()}`)
console.log(`Consumer account address: ${await consumerAccount.getAddress()}`)
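Putting the pieces together, compute.ts ends up with this overall shape (a sketch; the run() body is where all the snippets from this article accumulate):

```typescript
// Overall file layout (sketch): imports, constants, and helper functions
// sit at the top of the file, then one async entry point runs every step.
async function run(): Promise<string> {
  // 1. load config, 2. mint/transfer tokens, 3. publish dataset + algorithm,
  // 4. start the free and paid compute jobs, 5. fetch the results URLs
  return 'done'
}

run()
  .then((result) => console.log(`Script finished: ${result}`))
  .catch((err) => console.error(err))
```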
Now, at the end of your compute.ts file, call your run() function. Next, compile the file with the tsc command and run node dist/compute.js.
If everything is working you should see the logs in the console and no errors.
We will use all of the following code snippets in the same way: add each snippet, together with its logs, to the end of your run() { ... } function.
Then compile your file with the tsc command and run it with node dist/compute.js.
You can skip this step if you are running your script against a remote network; you only need to mint OCEAN tokens to these accounts when using Barge to test your script.
const minAbi = [
{
constant: false,
inputs: [
{ name: 'to', type: 'address' },
{ name: 'value', type: 'uint256' }
],
name: 'mint',
outputs: [{ name: '', type: 'bool' }],
payable: false,
stateMutability: 'nonpayable',
type: 'function'
}
]
const tokenContract = new ethers.Contract(addresses.Ocean, minAbi, publisherAccount)
const estGasPublisher = await tokenContract.mint.estimateGas(
await publisherAccount.getAddress(),
amountToUnits(null, null, '1000', 18)
)
await sendTx(
estGasPublisher,
publisherAccount,
1,
tokenContract.mint,
await publisherAccount.getAddress(),
amountToUnits(null, null, '1000', 18)
)

await transfer(
publisherAccount,
config,
addresses.Ocean,
await consumerAccount.getAddress(),
'100'
)
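The amountToUnits calls above scale a human-readable amount by the token's decimals. For an 18-decimals token like OCEAN, the conversion is just a multiplication (a minimal sketch that handles whole-number amounts only):

```typescript
// Sketch: convert a whole-number token amount to base units (wei-style),
// the same scaling amountToUnits performs for an 18-decimals token.
function toBaseUnits(amount: string, decimals: number): bigint {
  return BigInt(amount) * 10n ** BigInt(decimals)
}

console.log(toBaseUnits('1000', 18).toString()) // '1000000000000000000000'
```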
datasetId = await createAssetHelper(
'D1Min',
'D1M',
publisherAccount,
[DATASET_ASSET_URL],
DATASET_DDO,
providerUrl
)

Now, let's check that we successfully published the dataset (created an NFT + datatoken):

console.log(`dataset id: ${datasetId}`)

Next, publish the algorithm in the same way:

algorithmId = await createAssetHelper(
'D1Min',
'D1M',
publisherAccount,
[ALGORITHM_ASSET_URL],
ALGORITHM_DDO,
providerUrl
)

Now, let's check that we successfully published the algorithm (created an NFT + datatoken):

console.log(`algorithm id: ${algorithmId}`)

Next, resolve the published dataset and algorithm with the indexer:

resolvedDatasetDdo = await aquariusInstance.waitForIndexer(datasetId)
resolvedAlgorithmDdo = await aquariusInstance.waitForIndexer(algorithmId)

Now mint datatokens and send them to the consumer so it can order the assets:

const { chainId } = await publisherAccount.provider.getNetwork()
datatoken = new Datatoken(publisherAccount, Number(chainId))
await datatoken.mint(
resolvedDatasetDdo.services[0].datatokenAddress,
await publisherAccount.getAddress(),
'10',
await consumerAccount.getAddress()
)
await datatoken.mint(
resolvedAlgorithmDdo.services[0].datatokenAddress,
await publisherAccount.getAddress(),
'10',
await consumerAccount.getAddress()
)

The consumer now fetches the available compute environments from the Provider:

computeEnvs = await ProviderInstance.getComputeEnvironments(providerUrl)

Let's check the free compute environment:
const computeEnv = computeEnvs.find((ce) => isDefined(ce.free))
console.log('Free compute environment = ', computeEnv)

Let's have 5 minutes of compute access:
const mytime = new Date()
const computeMinutes = 5
mytime.setMinutes(mytime.getMinutes() + computeMinutes)

Let's prepare the dataset and algorithm assets to be used in the compute job:
const assets: ComputeAsset[] = [
{
documentId: resolvedDatasetDdo.id,
serviceId: resolvedDatasetDdo.services[0].id
}
]
const algo: ComputeAlgorithm = {
documentId: resolvedAlgorithmDdo.id,
serviceId: resolvedAlgorithmDdo.services[0].id,
meta: resolvedAlgorithmDdo.metadata.algorithm
}

Let's start the free compute job:
const computeJobs = await ProviderInstance.freeComputeStart(
providerUrl,
consumerAccount,
computeEnv.id,
assets,
algo
)

Let's save the compute job ID and agreement ID; we are going to use them later:

computeJobId = computeJobs[0].jobId
// eslint-disable-next-line prefer-destructuring
agreementId = computeJobs[0].agreementId

You can also add delays here so that you can observe the different states of the compute job:
const jobStatus = await ProviderInstance.computeStatus(
providerUrl,
await consumerAccount.getAddress(),
computeJobId,
agreementId
)

Now, let's see the current status of the previously started compute job:

console.log('Current status of the compute job: ', jobStatus)

await sleep(10000)
const downloadURL = await ProviderInstance.getComputeResultUrl(
providerUrl,
consumerAccount,
computeJobId,
0
)

Let's check the compute results URL for the specified index:

console.log(`Compute results URL: ${downloadURL}`)

Next, let's start a paid compute job. First, select a compute environment that has both free and paid resources. (If you are adding this to the same run() function as the free job, rename or reuse the earlier computeEnv, mytime, computeMinutes, assets, algo, computeJobs, jobStatus, and downloadURL declarations to avoid redeclaring const variables.)
const computeEnv = computeEnvs[0]
console.log('Compute environment = ', computeEnv)Let's have 5 minute of compute access
const mytime = new Date()
const computeMinutes = 5
mytime.setMinutes(mytime.getMinutes() + computeMinutes)
const computeValidUntil = Math.floor(mytime.getTime() / 1000)

Let's prepare the compute resources, dataset, and algorithm assets to be used in the compute job:
const resources: ComputeResourceRequest[] = [
{
id: 'cpu',
amount: 2
},
{
id: 'ram',
amount: 2
},
{
id: 'disk',
amount: 0
}
]
const assets: ComputeAsset[] = [
{
documentId: resolvedDatasetDdo.id,
serviceId: resolvedDatasetDdo.services[0].id
}
]
const dtAddressArray = [resolvedDatasetDdo.services[0].datatokenAddress]
const algo: ComputeAlgorithm = {
documentId: resolvedAlgorithmDdo.id,
serviceId: resolvedAlgorithmDdo.services[0].id,
meta: resolvedAlgorithmDdo.metadata.algorithm
}

Trigger initializeCompute to see the payment options. We also need a payment token; on Barge we can use the OCEAN token deployed with the contracts:

const paymentToken = addresses.Ocean
const providerInitializeComputeResults = await ProviderInstance.initializeCompute(
assets,
algo,
computeEnv.id,
paymentToken,
computeValidUntil,
providerUrl,
consumerAccount,
resources,
Number(chainId)
)
console.log(
'providerInitializeComputeResults = ',
JSON.stringify(providerInitializeComputeResults)
)

Let's check the funds for the escrow payment:
const escrow = new EscrowContract(
getAddress(providerInitializeComputeResults.payment.escrowAddress),
consumerAccount
)
const paymentTokenPublisher = new Datatoken(publisherAccount)
const balancePublisherPaymentToken = await paymentTokenPublisher.balance(
paymentToken,
await publisherAccount.getAddress()
)
assert(
new BigNumber(parseEther(balancePublisherPaymentToken)).isGreaterThan(0),
'Balance should be higher than 0'
)
const tx = await publisherAccount.sendTransaction({
to: computeEnv.consumerAddress,
value: parseEther('1.5')
})
await tx.wait()
await paymentTokenPublisher.transfer(
paymentToken,
getAddress(computeEnv.consumerAddress),
(Number(balancePublisherPaymentToken) / 2).toString()
)
const amountToDeposit = (
providerInitializeComputeResults.payment.amount * 2
).toString()
await escrow.verifyFundsForEscrowPayment(
paymentToken,
computeEnv.consumerAddress,
await unitsToAmount(consumerAccount, paymentToken, amountToDeposit),
providerInitializeComputeResults.payment.amount.toString(),
providerInitializeComputeResults.payment.minLockSeconds.toString(),
'10'
)

Let's order the assets:
algo.transferTxId = await handleOrder(
providerInitializeComputeResults.algorithm,
resolvedAlgorithmDdo.services[0].datatokenAddress,
consumerAccount,
computeEnv.consumerAddress,
0
)
for (let i = 0; i < providerInitializeComputeResults.datasets.length; i++) {
assets[i].transferTxId = await handleOrder(
providerInitializeComputeResults.datasets[i],
dtAddressArray[i],
consumerAccount,
computeEnv.consumerAddress,
0
)
}

Let's start the paid compute job:
const computeJobs = await ProviderInstance.computeStart(
providerUrl,
consumerAccount,
computeEnv.id,
assets,
algo,
computeValidUntil,
paymentToken,
resources,
Number(chainId)
)

Let's save the compute job ID; we are going to use it later:

computeJobId = computeJobs[0].jobId

You can also add delays here so that you can observe the different states of the compute job:
const jobStatus = await ProviderInstance.computeStatus(
providerUrl,
await consumerAccount.getAddress(),
computeJobId
)

Now, let's see the current status of the previously started compute job:

console.log('Current status of the compute job: ', jobStatus)

await sleep(10000)
const downloadURL = await ProviderInstance.getComputeResultUrl(
providerUrl,
consumerAccount,
computeJobId,
0
)

Let's check the compute results URL for the specified index:

console.log(`Compute results URL: ${downloadURL}`)

Please note that ComputeExamples.md is an autogenerated file; you should not edit it directly.
Updates should be made in test/integration/ComputeExamples.test.ts, and all markdown lines there should be prefixed with three forward slashes,
e.g. /// # H1 Title