I use Pinata to host a number of websites on IPFS. However, because they double-charge for similar pins, I need to clean up old copies of my sites to avoid high costs. I am just sharing a couple of simple scripts that work for me and could easily be adapted to other simple cleanups.
Setup
You need an API key to modify your account. Set up a shell variable with your key:
pinata_auth='-HAuthorization: Bearer YOUR_PINATA_JWT_HERE'
You may want to do something to keep this out of your shell history. For example, I have configured my shell so that I can prefix commands with a space to prevent them from being written to disk. Also note that the value contains spaces, so quote it when you expand it ("$pinata_auth") in shells such as bash that word-split unquoted variables.
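As a quick illustration of the quoting issue (DUMMY_JWT is a placeholder, not a real key):

```shell
# In bash, an unquoted expansion splits the header value on spaces,
# so curl would see three separate arguments instead of one.
pinata_auth='-HAuthorization: Bearer DUMMY_JWT'

set -- $pinata_auth       # unquoted: word-split (bash behavior)
echo "unquoted: $#"       # prints: unquoted: 3
set -- "$pinata_auth"     # quoted: stays a single argument
echo "quoted: $#"         # prints: quoted: 1
```

zsh does not word-split unquoted variables by default, which is why the unquoted form can appear to work there.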
Scripts
Delete by Regex
First preview the paths to delete:
curl "$pinata_auth" \
'https://api.pinata.cloud/data/pinList?status=pinned&pageLimit=1000' \
| jq -r '
.rows[]
| select(.metadata.name | match("^_dnslink\\."))
'
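To sanity-check the regex before pointing it at your account, you can run the same jq filter over a hand-written sample (the names and hashes below are made up):

```shell
# Fabricated pinList-style response: one matching, one non-matching pin.
echo '{"rows": [
  {"metadata": {"name": "_dnslink.example.com"}, "ipfs_pin_hash": "QmAaa"},
  {"metadata": {"name": "my-other-pin"}, "ipfs_pin_hash": "QmBbb"}
]}' \
| jq -r '
  .rows[]
  | select(.metadata.name | match("^_dnslink\\."))
  | .metadata.name
'
# prints: _dnslink.example.com
```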
Then actually perform the deletion:
curl "$pinata_auth" \
'https://api.pinata.cloud/data/pinList?status=pinned&pageLimit=1000' \
| jq -r '
.rows[]
| select(.metadata.name | match("^_dnslink\\."))
| "https://api.pinata.cloud/pinning/unpin/\(.ipfs_pin_hash)"
' \
| xargs -n1 -- curl -iXDELETE --retry 2 "$pinata_auth"
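Before running the destructive step, it can be handy to do a dry run: put echo in front of curl so xargs prints each command instead of executing it (the hashes below are placeholders, and the auth header is omitted for brevity):

```shell
# Dry run: xargs appends one URL per invocation, and echo prints the
# resulting curl command instead of executing it.
printf '%s\n' \
  'https://api.pinata.cloud/pinning/unpin/QmFakeHash1' \
  'https://api.pinata.cloud/pinning/unpin/QmFakeHash2' \
| xargs -n1 -- echo curl -iXDELETE --retry 2
# prints:
# curl -iXDELETE --retry 2 https://api.pinata.cloud/pinning/unpin/QmFakeHash1
# curl -iXDELETE --retry 2 https://api.pinata.cloud/pinning/unpin/QmFakeHash2
```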
Delete Old Versions
I upload new versions of sites under the same name. In this case I want to delete all but the last three versions of each site.
curl "$pinata_auth" \
'https://api.pinata.cloud/data/pinList?status=pinned&pageLimit=1000' \
| jq -r '
.rows
| group_by(.metadata.name)
| .[]
| sort_by(.date_pinned)
| .[0:-3]
| "https://api.pinata.cloud/pinning/unpin/\(.[].ipfs_pin_hash)"
' \
| xargs -n1 -- curl -iXDELETE --retry 2 "$pinata_auth"
This one is a bit more complicated. Here is a quick overview of how it works:
- Group by name.
- Separate each group into its own item in the stream.
- Sort by the date pinned to Pinata.
- Take all but the last three. If a group has fewer than three entries, none are selected.
- Generate the unpin URLs.
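The steps above can be exercised against a small fabricated payload (all names, dates, and hashes are invented for the example):

```shell
# site-a has four versions, so only its oldest pin falls outside the
# last three; site-b has one version, so nothing of it is selected.
echo '{"rows": [
  {"metadata": {"name": "site-a"}, "date_pinned": "2023-01-01T00:00:00Z", "ipfs_pin_hash": "QmOld"},
  {"metadata": {"name": "site-a"}, "date_pinned": "2023-02-01T00:00:00Z", "ipfs_pin_hash": "Qm2"},
  {"metadata": {"name": "site-a"}, "date_pinned": "2023-03-01T00:00:00Z", "ipfs_pin_hash": "Qm3"},
  {"metadata": {"name": "site-a"}, "date_pinned": "2023-04-01T00:00:00Z", "ipfs_pin_hash": "Qm4"},
  {"metadata": {"name": "site-b"}, "date_pinned": "2023-01-01T00:00:00Z", "ipfs_pin_hash": "QmB1"}
]}' \
| jq -r '
  .rows
  | group_by(.metadata.name)
  | .[]
  | sort_by(.date_pinned)
  | .[0:-3]
  | "https://api.pinata.cloud/pinning/unpin/\(.[].ipfs_pin_hash)"
'
# prints: https://api.pinata.cloud/pinning/unpin/QmOld
```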
Other Notes
- Pinata only supports deleting a single pin at a time and has a rate limit. The --retry flag to curl will handle the rate limiting for us, but deleting a lot of pins can still take a long time.
- I make no attempt to handle paging. For the examples provided you can keep re-running the whole pipeline as long as the number of paths that you want to keep is less than 1000. However, if you have more than 1000 live paths you will need to handle paging.
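If you do need paging, something along these lines could work. This is an untested sketch: the pageOffset query parameter is an assumption on my part, so check it against Pinata's pinList documentation before relying on it.

```shell
# Hedged sketch: walk pinList pages via an assumed pageOffset parameter,
# printing the unpin URLs from every page. Stops when a page has no rows.
collect_unpin_urls() {
  offset=0
  while :; do
    page=$(curl -s "$pinata_auth" \
      "https://api.pinata.cloud/data/pinList?status=pinned&pageLimit=1000&pageOffset=$offset")
    count=$(printf '%s' "$page" | jq '.rows | length')
    [ "$count" -eq 0 ] && break
    printf '%s' "$page" | jq -r '
      .rows[]
      | select(.metadata.name | match("^_dnslink\\."))
      | "https://api.pinata.cloud/pinning/unpin/\(.ipfs_pin_hash)"
    '
    offset=$((offset + count))
  done
}

# Collect every URL first, then delete, so that deletions cannot shift
# the page offsets mid-walk:
#   urls=$(collect_unpin_urls)
#   printf '%s\n' "$urls" | xargs -n1 -- curl -iXDELETE --retry 2 "$pinata_auth"
```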