
From Git to Skynet: How to Host Large Files on Sia

by Derek Anderson



At Akash Network, we are always looking for ways to use our network alongside technologies that align with our mission and vision of a truly decentralized web.

One service that pairs well with our network is Skynet. Built on the Sia blockchain technology stack, Skynet enables fluid, easy file serving through portals, which let users upload and share media without needing to interact with the blockchain directly.

The real beauty of Sia is its heavy investment in a JavaScript SDK.

I thought about the kinds of use cases where we could put this SDK to work at Akash Network. One of the primary uses floating around today is Git Large File Storage, or Git LFS. Natively, you can find this service on GitHub, and there are integrations that use AWS S3 buckets, but I couldn’t find one with Skynet integration.

The Importance of Using Git

Using Git to store code and large files has a lot of advantages. In scientific communities, storing references to large datasets and offloading the datasets themselves can be both cost-effective and a necessity. Creatives might want source control for binaries, but with better tracking and checkout behavior, similar to Subversion.

For Akash, we’re continuing our focus on reducing deployment barriers and increasing developer efficiency while maintaining our commitment to open-source software. We want to use this integration to bring deployment tooling in line with, and beyond, our Web 2.0 counterparts.

Currently, there are a few LFS server implementations. The version 1 specification of Git LFS has been around for almost five years. While you still need to download and enable the LFS extension, the base Git client now supports it during some of its main actions, such as automatically pulling in binaries from the LFS server.

Configuring Git LFS

Configuring Git LFS is pretty straightforward. Download and install the latest version from Git Large File Storage. Once you have completed the install step, use the command below:

git lfs install
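
If you want to confirm the extension is hooked up, you can check the installed version; the exact output depends on your platform and release:

git lfs version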

Tracking PNG Files

Next, let’s add tracking for the binary files that you want to send to Skynet. Luckily, there is an easy command to include files in tracking:

git lfs track "*.png"

The previous command will start tracking any ‘png’ files in the project; in the repository, they will be swapped out for text pointer files that contain tracking hashes.
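
Under the hood, the track command writes a rule to a .gitattributes file in the project root. If you open that file, you should see an entry along these lines:

*.png filter=lfs diff=lfs merge=lfs -text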

Now, anytime we commit and push, the binary versions of the files will be uploaded to the LFS endpoint on our Git server. To keep configuration to zero, the LFS client assumes your LFS server is attached to your Git repository host, which isn’t the case in our guide today.

Redirecting the LFS Client

To redirect the LFS client, we simply need to add a dot file to our project. This file is appropriately named .lfsconfig.

[lfs]
	url = "http://localhost:3000/akashnetwork/testFile"

We’ll point our Git client to our local LFS server that’s going to send our files to Skynet. 
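
If you prefer the command line, the same file can be generated with git config; this writes the lfs.url key into .lfsconfig in the current directory:

git config --file .lfsconfig lfs.url "http://localhost:3000/akashnetwork/testFile"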

Starting your LFS Server

Our next step is to start that LFS server. Akash has done the legwork of creating an out-of-the-box server built on ‘Fastify’, the web framework from NearForm. The extensible server is ready to upload to Skynet.

To use this server, you’ll need Node 16 and a local MongoDB instance to track uploads for the LFS server. First, head over to MongoDB to install a local version of the database server; no extra configuration is needed here.
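
As one example, on macOS with Homebrew the local install might look like the following; use your platform’s package manager or the MongoDB installers if you’re on another OS:

# Illustrative only: assumes macOS with Homebrew installed
brew tap mongodb/brew
brew install mongodb-community
brew services start mongodb-community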

To start the server, you can simply use npx to download and execute it automatically.

npx @akashnetwork/lfs

You should see output on the terminal indicating the server is up and running and ready to accept inbound connections.

You’re Almost There! 

Now you’re ready to add and push tracked files to the sky.

git add file.png

git commit -m "Add cool image"

git push origin main
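
To double-check which files are being handled by LFS, you can list them; the output shows a shortened object id next to each tracked path:

git lfs ls-files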

While working locally, Git will keep your binary files in the working directory. When you push, only the pointer files with their resource IDs are sent to the remote repository, and upon cloning, the binaries are pulled back down from the LFS server into your working copy.
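
For reference, the pointer file that Git stores in place of each binary is just a small text stub; the oid and size below are illustrative:

version https://git-lfs.github.com/spec/v1
oid sha256:4d7a214614ab2935c943f9e0ff69d22eadbb8f32b1258daaa5e2ca24d17e2393
size 12345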

You can see a demo of the working implementation on our YouTube channel here, or try it for yourself! We’re excited to keep developing tools to rapidly accelerate decentralized use and adoption.
