Systems Programming with Nim


Nim is a statically typed, compiled systems programming language. I've seen a few posts about Nim popping up on Hacker News recently and was curious enough to give it a go.

My go-to project when trying out a new language these days is a simple CLI tool.

Recently I had a go at trying Rust – creating fstojson-rust. Sticking to systems programming languages, Nim came up next.

Here are the things I initially liked about Nim:

  • Statically typed, compiled language.
  • The Nim compiler and generated executables work on all the major operating systems, i.e. Linux, macOS, Windows, and BSD.
  • Lots of documentation and tutorials.
  • An online Playground.

Getting Started With Nim

Installation is really simple regardless of your OS. On macOS I chose to run the choosenim installation script. (Remember, always check and inspect scripts that you cURL / run directly from online sources):

curl https://nim-lang.org/choosenim/init.sh -sSf | sh

You can also download pre-built binaries for your OS at the main Nim Download page, or use a myriad of other methods to get it running such as Homebrew, Docker images, etc…

Nimble is a package manager that is bundled with Nim (since version 0.15.0). It plays a similar role for Nim projects to what npm does for Node.js, for example.

If you want to upgrade Nimble itself, you can simply run:

nimble install nimble

Just like an npm update!

To create a new project, you can use the init command. An interactive CLI wizard guides you through creating a Library, Binary, or Hybrid project type.

nimble init [pkgname]

A Nim project can use a .nimble file, which is equivalent to package.json in the Node.js world. Nimble is used to work with the project’s .nimble file.
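
A minimal .nimble file looks something along these lines. The values below are placeholders for illustration, not the exact contents of my project's file:

# Package
version       = "0.1.0"
author        = "Your Name"
description   = "Traverse a path and output JSON"
license       = "MIT"
srcDir        = "src"
bin           = @["traversefs"]

# Dependencies
requires "nim >= 1.4.0"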

To check the validity of your project and dependencies, you can run the check command:

nimble check

Then to install a new package you use the install command with nimble (the default is to install the latest version, but you can also install a specific version of a package, e.g. one from a git repository):

$ nimble install package@#head

Getting Acquainted with Nim

I started off with a really simple program that reads from stdin and then echoes back a message.

echo "What's your name? "
var name: string = readLine(stdin)
echo "Hi, ", name, "!"

Building and running locally is simple. You can run nimble build or nimble install. The build command will create a debug version of your program which includes stack traces, while install will create a release version.

You can also pass Nim flags to these commands, though for anything permanent you can add a configuration file and specify them there.
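
For instance, a nim.cfg file in the project root can hold compiler flags that get applied on every build. The flags below are just an illustration:

# nim.cfg
-d:release
--opt:speed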

You use nimble run to build and run your program in one go.

There are many more command options available for nimble. Just run nimble without any arguments to see the full list.

fstojson-nim

Next, I jumped into writing my go-to application for traversing the file system and collecting directory and files in a designated path to output to a JSON hierarchy.

You can take a look at my code in GitHub.

My overall experience was that I found Nim to be a lot more forgiving in terms of safety checks and recursive functions than Rust was.

I was able to use a single function (in Nim these are called procedures and are defined with the proc keyword), which could recursively call itself for traversing a nested directory hierarchy.

I created a simple object (PathObjectNode) which stores the information about each file or directory, with a children property that holds a list (seq) of further PathObjectNode objects if the object is a directory (files of course do not have children).

A single hierarchy is created at the start of traversal and all directory or file object nodes are added to this.

var hierarchy = newSeq[PathObjectNode]()

At the end of traversal I simply echo out the JSON representation of the root hierarchy node (along with all children). Optionally, this can be prettified JSON.
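
The overall shape of the code is roughly as follows. This is a simplified sketch rather than the exact code in the repository (type and procedure names are illustrative):

import std/[os, json, jsonutils]

type
  PathObjectNode = object
    path: string
    isDir: bool
    children: seq[PathObjectNode]

proc traverse(path: string): PathObjectNode =
  # Build a node for this path and, if it is a directory, recurse into its entries.
  result = PathObjectNode(path: path, isDir: dirExists(path))
  if result.isDir:
    for kind, entry in walkDir(path):
      result.children.add(traverse(entry))

when isMainModule:
  # std/jsonutils can serialise a plain object tree to JSON; pretty() adds indentation.
  echo traverse(".").toJson().pretty()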

Speaking of options, I used a handy package called docopt to provide the CLI interface. It was a case of simply adding this to my project’s .nimble file dependencies.

The interface can then be specified by providing a docopt string. For example:

let doc = """
traversefs

Usage:
  traversefs [-p | --pretty] [-r | --recurse] PATH

Arguments:
  PATH          The path to begin traversing

Options:
  -h --help     Show this screen.
  --version     Show version.
  -p --pretty   Pretty print JSON
  -r --recurse  Recursively traverse paths.
"""

With that done, and a simple nimble build later, I could run the binary directly to use it.

Closing Thoughts

Nim feels like an easily accessible systems programming language to me. Although I haven't really used it enough to have an informed opinion, my impression is that it strikes a nice balance between usability, ease of entry, performance, and type safety.

I’ll definitely consider exploring it further for other projects when I get the chance.

Beginning Rust: Writing a Small CLI Tool


My first go at writing an application in Rust has been slightly frustrating. Coming in from using mostly dynamic languages every day I quickly found myself butting heads with Rust’s borrow checker. However, I’ve found that this is a fair price to pay for a statically typed language with a focus on memory safety and performance.

While these Rust features do increase the barrier to entry for newcomers such as myself, they also help to keep your code in check and are certainly major contributing factors to the language's success.

Another interesting point is that Rust doesn’t have a GC (garbage collector). As soon as something in your code is not required anymore (a function call returns) the memory associated with that scope is cleaned up. Rust inserts Drop::drop calls at compile time to do this. I imagine this is similar in concept to the way that IL or code weaving is done in the .NET world. This fact means that Rust doesn’t suffer from performance hits that languages with a GC tend to sometimes encounter. Discord wrote an interesting article on how they improved performance by switching from Go to Rust that touches on this particular point.
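
Here's a tiny illustration of that scope-based cleanup. This is just a demonstration of the Drop trait, not code from fstojson-rust:

struct Resource {
    name: String,
}

impl Drop for Resource {
    // Called automatically when a Resource goes out of scope.
    fn drop(&mut self) {
        println!("dropping {}", self.name);
    }
}

fn main() {
    {
        let _r = Resource { name: String::from("scoped") };
        // _r is dropped right here, at the end of the block, with no GC involved.
    }
    println!("after inner scope"); // "dropping scoped" has already been printed
}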

Goals

To take a look at the Rust language and ecosystem at a really high-level, I decided to write a simple tool. My goals were to:

  • Write a CLI tool, small in scope. The tool will traverse a target directory in the file system recursively and print the structure to stdout as JSON.
  • Get a feeling for the language’s syntax.
  • See how package management and dependencies work.
  • Look at what the options are for cross-compiling to other platforms.

The tool – fstojson

Here is the small tool I wrote to achieve the above list of goals: fstojson-rust.

I’ve compiled my first app on macOS, Linux and Windows all from the same source, with no issues whatsoever.

Rust Packages

On my first look at Rust, packages were simple to understand and use. Rust uses “crates” and they work very similarly to JavaScript packages.

To add a crate to your project you simply add the dependency to your Cargo.toml file (akin to a package.json file in Node.js).

For example:

[dependencies]
serde_json = "1.0.68"

Once crates are installed with the cargo command, you’ll even get a lock file (Cargo.lock), just like with npm or yarn in a Node.js project.
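
Once the crate is in place, using it is just an import away. A quick, illustrative example (not taken from fstojson-rust):

use serde_json::json;

fn main() {
    // Build a small JSON value with the json! macro and pretty-print it.
    let node = json!({
        "name": "src",
        "children": ["main.rs"]
    });
    println!("{}", serde_json::to_string_pretty(&node).unwrap());
}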

Rust cross compiling

The first time you install Rust with rustup, the standard library for your current platform is installed. If you want to cross-compile to other platforms you need to add those target platforms separately.

Use the rustup target add command to add other platform targets. Use rustup target list to show all possible targets.
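
For example, to add the 64-bit Linux GNU target and build a release binary for it (substitute whichever target triple you need):

rustup target add x86_64-unknown-linux-gnu
cargo build --release --target x86_64-unknown-linux-gnu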

To cross-compile you'll often also need to install a linker. For example, if you were trying to compile for x86_64-unknown-linux-gnu on Windows you would need the cc linker.

Thoughts and impressions

To get a really simple “hello-world” application up and running in Rust was trivial. The cargo command makes things really easy for you to scaffold out a project.

However, I honestly struggled with anything more complex for a couple of hours after that, mostly fighting the "borrow checker". This is my fault because I didn't really spend much time getting acquainted with the language initially via the documentation. I dove right in with trying to write a small app.

The last time I wrote something in a systems programming language was at least 7 or 8 years ago – I wrote a tool in C++ to quiesce the file system in preparation for snapshots to be taken. Aside from that, the last time I really had to concern myself with memory management was with Objective-C (iOS), before ARC was introduced (see my first serious attempt at creating an iOS game, Cosmosis).

In my opinion, some of Rust's great benefits also mean it has a high barrier to entry. It has a really strong emphasis on memory safety. I came at my first application trying to do all the things I can easily do in TypeScript / JavaScript or C#.

I very quickly realised how different things are in the Rust world, and how this opinionated approach helps to keep your code bug-free and your apps memory-safe.

Closing thoughts

After years of dynamic language use, my first introduction to Rust has been a little bit shaky. There's a high barrier to entry, but with that said, I did find it satisfying that if there were no compiler warnings my code was pretty much guaranteed to run without issue.

The Rust ecosystem is active and thriving from what I can tell. You can use crates.io to search online for packages. You can use rustup to install toolchains and targets.

There are tons of stackoverflow questions and answers and the documentation page for Rust is full of good information.

Going forward I'll try to dig into the Rust language a bit more. I'm on a little bit of a journey to try different programming languages (I've had a fair bit of experience in C# and TypeScript / JavaScript, so I'm branching out from those now).

I discovered this post recently – A half-hour to learn Rust. In hindsight it would have been great to have found that before diving in.

Update: thanks to noah04 on GitHub for their improvements PR on applying some Rust idioms.

Quick and Easy ffmpeg Cheat Sheet


This post contains my ffmpeg cheat sheet.

ffmpeg is a very useful utility to have installed for common audio and video conversion and modification tasks. I find myself using it frequently for both personal and work use.

Got an audio or video file to compress or convert? Or maybe you’re looking to split off a section of a video file. Did you do a screen recording of a meeting session and find you need to share it around, but it’s way too large in filesize?

ffmpeg is the clean and easy command-line approach. I've put together an ffmpeg cheat sheet below that has some quick and easy usages documented.

Skip ahead to the ffmpeg Cheat Sheet section below if you already have it installed and ready to go.

Installing ffmpeg (macOS / Windows)

On macOS, you’re best off using homebrew. If you don’t have homebrew already, go get that done first. Then:

macOS

brew install ffmpeg

Windows

You can grab a release build in 7-zip archive format from here (recent as of the publish date of this post). Be sure to check the main ffmpeg downloads page for newer builds if you’re reading this page way in the future.

If you don’t have 7-zip, download and install that first to decompress the downloaded ffmpeg release archive.

Now you'll ideally want to update your user PATH to include the directory that you extracted ffmpeg to. Make sure you've moved it to a convenient location first. For example, on my Windows machine, I keep ffmpeg in D:\Tools\ffmpeg.

Open PowerShell: press Windows key + R, type powershell, and hit enter.

To ensure that the PATH change persists whenever you run PowerShell in the future, run the following in PowerShell:

notepad $profile

This will load a start-up profile for PowerShell in Notepad. If it doesn't exist yet, you'll be prompted to create a new file. Choose yes.

In your profile notepad window, enter the following, replacing D:\Tools\ffmpeg with the path you extracted ffmpeg to on your own machine.

[Environment]::SetEnvironmentVariable("PATH", "$Env:PATH;D:\Tools\ffmpeg")

Save the changes and close Notepad. Close PowerShell, then launch it again. This time, if you type ffmpeg in the PowerShell window it will run, no matter which directory you're in.

ffmpeg Cheat Sheet

This is a list of my most useful ffmpeg commands for conversion and modification of video and audio files.

  • Convert a video from MP4 to AVI format:
    ffmpeg -i inputfile.mp4 outputfile.avi
  • Trim a video file (without re-encoding):
    ffmpeg -ss START_SECONDS -i input.mp4 -t DURATION_SECONDS -c copy output.mp4
  • Convert a WAV audio file to compressed MP3 format:
    ffmpeg -i input.wav -acodec libmp3lame output.mp3
  • Mux video from input1.mp4 with audio from input2.mp4:
    ffmpeg -i input1.mp4 -i input2.mp4 -c copy -map 0:0 -map 1:1 -shortest output.mp4
  • Resize or scale a video to 1280x720:
    ffmpeg -i input.mp4 -s 1280x720 -c:a copy output.mp4
  • Extract audio to MP3 from a video file:
    ffmpeg -i inputvideo.mp4 -vn -acodec libmp3lame outputaudio.mp3
  • Add a watermark or logo to the top-left of a video (change the overlay parameter for different positions):
    ffmpeg -i inputvideo.mp4 -i logo.png -filter_complex "overlay=5:5" -codec:a copy outputvideo.mp4
  • Convert a video to GIF:
    ffmpeg -i inputvideo.mp4 output.gif
  • Change the frame rate of a video:
    ffmpeg -i inputvideo.mp4 -filter:v 'fps=fps=15' -codec:a copy outputvideo.mp4

Bonus batch scripts for video file resizing are below. For example, to iterate over every .MP4 file in a directory and resize it to 1080p, or over every file prefixed with DJI_ and resize it to 1440p:

for f in *.MP4 ; do ffmpeg -i "$f" -s 1920x1080 -c:a copy "resized-1080p-$f" ; done
for f in DJI_* ; do ffmpeg -i "$f" -s 2560x1440 -c:a copy "resized-1440p-$f" ; done

ffmpeg cheat sheet example - logo overlay.
Yes, that is a banana logo overlaid onto this video using ffmpeg.

This list doesn't even scratch the surface of the capabilities of ffmpeg. If you dig deeper you'll find commands and parameters for just about every audio and video modification process you need.

Remember, while it's often easy to find a free conversion tool online, there'll always be a catch or risk to using these. Whether it's being subjected to unnecessary advertising, catching potential malware, or being tracked with 3rd-party cookies, you always take risks using free online tools. I guarantee almost every single free conversion website is using ffmpeg on its backend.

So do yourself a favour and practise using CLI tools to do things yourself.