
Mutability Isn’t Variability

How programmers confuse ideas about things changing over time.
Estimated reading time: 7 mins
Bruce Hill October 24, 2024

In programming, there are a lot of benefits to using immutable values, but those benefits are unrelated to whether variables can be reassigned. Immutable values are data that are guaranteed to never change (as opposed to mutable data, which can change).

In essence, immutability is a guarantee that can make it easier to build complex programs with fewer bugs.

There is a separate but related concept that I will refer to as “variability”. Variability is not a property of data, but a property of the symbols we use as placeholders to represent data. In a programming language, a symbol might be either a constant, like PI, which holds the same value for the entire lifetime of a program, or a variable like x, which can represent various different values over the course of a program’s lifetime. Some languages also have something which I will call a “local constant”: a symbol that represents the same value for the duration of the block where it was defined (its lexical scope). In functional programming languages, local constants are called “let bindings”, but this term has become hopelessly confusing, because Javascript and Rust both use the keyword let to declare symbols that can take on new values over time.
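
To make the distinction concrete, here is a short JavaScript sketch (an illustration of the idea, not an excerpt from the post): const and let control whether the symbol can be rebound, while Object.freeze controls whether the value itself can change, and the two properties are independent.

    // Variability is a property of the symbol: can it be rebound?
    const a = [1, 2, 3];  // constant binding: a always refers to this array
    let b = [1, 2, 3];    // variable binding: b may be reassigned
    b = [4, 5, 6];        // fine
    // a = [4, 5, 6];     // TypeError: assignment to constant variable

    // Mutability is a property of the data: can the value itself change?
    a.push(4);            // fine: a is a constant binding to mutable data
    const frozen = Object.freeze([1, 2, 3]);  // an immutable value
    // frozen.push(4);    // TypeError: the frozen array cannot be modified

    // The two are independent: a let can be rebound between immutable values.
    let c = Object.freeze({x: 1});
    c = Object.freeze({x: 2});  // rebinding the symbol, not mutating the data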

Continue reading…

Packrat Parsing from Scratch

How to implement a packrat parser from scratch, one easy piece at a time.
Estimated reading time: 12 mins
Bruce Hill June 5, 2021

Packrat parsing is a relatively new approach to parsing code introduced by Bryan Ford in his 2002 Master’s thesis. Packrat parsers are capable of very efficiently parsing a class of grammars called Parsing Expression Grammars (PEGs). I covered PEGs in more depth in my previous post, PEGs and the Structure of Languages, so if you’re new to the topic, I recommend you begin there. The packrat parser is based on earlier work on recursive descent parsers, dating back to the 70s, but focused specifically on Parsing Expression Grammars. In this post, I’ll attempt to demystify packrat parsers by walking through a complete packrat parser implementation that fits in a few dozen lines of Javascript code.
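
As a preview of the core idea (a condensed sketch, not the implementation the post walks through): a packrat parser is an ordinary recursive descent parser over PEG operators like sequencing and ordered choice, except that the result of each rule at each input position is memoized, so no rule is ever re-run at the same position.

    // A condensed packrat-style sketch (not the code this post builds up).
    // A rule is a function (text, pos) -> {end, value} or null, and memoizing
    // each (rule, position) result is what makes the parser "packrat".

    const cache = new Map();

    function memoize(name, rule) {
      return (text, pos) => {
        const key = name + "@" + pos;
        if (!cache.has(key)) cache.set(key, rule(text, pos));
        return cache.get(key);
      };
    }

    // Two basic PEG building blocks: literal strings and ordered choice.
    const literal = (s) => (text, pos) =>
      text.startsWith(s, pos) ? {end: pos + s.length, value: s} : null;

    const choice = (...rules) => (text, pos) => {
      for (const rule of rules) {
        const m = rule(text, pos);  // try each alternative in order...
        if (m) return m;            // ...and commit to the first match
      }
      return null;
    };

    const sequence = (...rules) => (text, pos) => {
      const values = [];
      for (const rule of rules) {
        const m = rule(text, pos);
        if (!m) return null;
        values.push(m.value);
        pos = m.end;
      }
      return {end: pos, value: values};
    };

    // Example rule: Greeting <- ("hello" / "hi") " world"
    const Greeting = memoize("Greeting",
      sequence(choice(literal("hello"), literal("hi")), literal(" world")));

    console.log(Greeting("hello world", 0));  // {end: 11, value: ["hello", " world"]}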

Continue reading…

PEGs and the Structure of Languages

How Parsing Expression Grammars help us (and computers) understand the structure of programming languages.
Estimated reading time: 7 mins
Bruce Hill May 29, 2021

When I first began working on my programming language Nomsu, one of the first challenges I faced was how to get the computer to see code written in my language as more than just a long sequence of letters. Code has structure, and finding that structure is called parsing. After spending a long time digging through different approaches to the problem, one lesser-known (though increasingly popular) approach called Parsing Expression Grammars became my favorite, and allowed me to make my own programming language. This post will explain what Parsing Expression Grammars (PEGs) are, and a bit of background on the problem they solve. There are some very good PEG libraries available (I first found out about PEGs through the Lua library LPEG), and in a follow-up post, I’ll even explain how to implement a PEG parser from scratch. Hopefully, after reading this post, you’ll have enough understanding to try defining your own language (programming or otherwise) with a Parsing Expression Grammar.

Continue reading…

Load-Bearing C Projects

A guide to building and maintaining nontrivial C projects of high quality.
Estimated reading time: 18 mins
Bruce Hill May 23, 2021

C can be a very fun language to program in, if you can set up your project well. A lot of what I know about managing C projects was learned the hard way by experimentation and by studying other people’s code. This post is meant to be the guide I never had for organizing and polishing nontrivial C projects, as well as setting up Makefiles to build them and manpages to document them.

Continue reading…

6 Useful Snippets

A handful of useful programming concepts that can each be implemented in under 20 lines of code.
Estimated reading time: 12 mins
Bruce Hill April 3, 2019

I’d like to present a handful of programming concepts that I find incredibly useful, all of which can be implemented in under 20 lines of code. These techniques can be used in a wide range of contexts, but here, I’ll show how they can all build off of each other to produce a full physics engine in just 75 lines of code.
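
To give a flavor of what fits in under 20 lines (this particular pair is only an illustration, not necessarily one of the six from the post), here are two helpers of roughly that size in JavaScript:

    // Blend between a and b: amount = 0 gives a, amount = 1 gives b.
    const mix = (a, b, amount) => (1 - amount) * a + amount * b;

    // Constrain x to the range [lo, hi].
    const clamp = (x, lo, hi) => Math.min(Math.max(x, lo), hi);

    console.log(mix(0, 10, 0.25));  // 2.5
    console.log(clamp(15, 0, 10));  // 10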

Continue reading…

Hill Noise

A new approach to random noise generation with novel mathematical properties.
Estimated reading time: 13 mins
Bruce Hill June 1, 2017

Noise functions are a way to produce continuous pseudorandom values across one or more dimensions. The simplest example is a one-dimensional noise function that makes the y-value wobble up and down smoothly, but unpredictably, as you move along the x-axis. It’s also possible to produce 2-dimensional noise, where the value of the noise depends on the (x, y) coordinates you’re sampling.
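
For a sense of what that means in code, here is a bare-bones 1D value-noise function in JavaScript. It is not the Hill Noise technique this post introduces, just a minimal illustration of the concept: pseudorandom values at integer positions, smoothly interpolated in between.

    // A bare-bones 1D noise sketch (NOT the Hill Noise algorithm described in
    // the post). Pseudorandom values are assigned to integer x positions and
    // smoothly interpolated, so the output wobbles unpredictably but continuously.

    // Deterministic pseudorandom value in [0, 1) for an integer x (a simple hash).
    function pseudorandom(x) {
      const s = Math.sin(x * 127.1 + 311.7) * 43758.5453;
      return s - Math.floor(s);
    }

    // Smoothstep easing so the curve has no sharp corners at the integers.
    const smoothstep = (t) => t * t * (3 - 2 * t);

    function noise1D(x) {
      const x0 = Math.floor(x);
      const t = smoothstep(x - x0);
      return (1 - t) * pseudorandom(x0) + t * pseudorandom(x0 + 1);
    }

    for (let x = 0; x <= 2; x += 0.25) console.log(x, noise1D(x));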

Continue reading…

Meandering Triangles

An in-depth explanation of the Meandering Triangles algorithm for drawing contour lines.
Estimated reading time: 5 mins
Bruce Hill February 14, 2017

Contour lines, also known as isolines, are lines drawn along paths of constant elevation. They are often used when drawing maps to indicate the slopes of hills and mountains.

A contour map

But they can also be used to visualize functions that map (x,y) values to z values, or to find the boundaries of implicit surfaces. One of the common algorithms for finding contour lines is Marching Squares. The algorithm assumes that you have a set of elevation data points spaced out at regular intervals on a grid, and generates contour lines for each square of the grid. A simpler algorithm, called Meandering Triangles, divides the data points into triangles rather than squares, which reduces the number of cases the algorithm has to handle. Here, I’ll give a full explanation of how the Meandering Triangles algorithm works, with some examples to show it in action.
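
The heart of the algorithm can be sketched in a few lines of JavaScript (a paraphrase of the general technique, not the post’s code): for each triangle, find the edges where the elevation data crosses the contour threshold, and connect the crossing points with a line segment.

    // Per-triangle step of a Meandering Triangles-style contour tracer
    // (an illustration of the general idea, not the post's implementation).
    // Each vertex is {x, y, z}; we return the segment where the contour at
    // elevation `threshold` crosses this triangle, or null if it doesn't.

    // Point where the contour crosses the edge from a to b, found by linear
    // interpolation of the elevation values.
    function crossingPoint(a, b, threshold) {
      const t = (threshold - a.z) / (b.z - a.z);
      return {x: a.x + t * (b.x - a.x), y: a.y + t * (b.y - a.y)};
    }

    function contourSegment(v1, v2, v3, threshold) {
      const crossings = [];
      for (const [a, b] of [[v1, v2], [v2, v3], [v3, v1]]) {
        // The contour crosses an edge if exactly one endpoint is below the threshold.
        if ((a.z < threshold) !== (b.z < threshold)) {
          crossings.push(crossingPoint(a, b, threshold));
        }
      }
      // Either zero or two edges cross, so there is at most one segment per triangle.
      return crossings.length === 2 ? crossings : null;
    }

    // Example: the contour at elevation 5 cuts across two edges of this triangle.
    console.log(contourSegment(
      {x: 0, y: 0, z: 0}, {x: 1, y: 0, z: 10}, {x: 0, y: 1, z: 10}, 5));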

Continue reading…

A Faster Weighted Random Choice

Performing fast random selection with non-uniform weights is trickier than you might imagine.
Estimated reading time: 9 mins
Bruce Hill February 2, 2017

Consider this problem: You have a list of items and a list of their weights. You want to randomly choose an item, with probability proportional to the item’s weight. For these examples, the algorithms will take a list of weights and return the index of the randomly selected item.
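
As a baseline for comparison (the simple starting point, not the faster approach this post works toward), the linear-scan version looks like this in JavaScript:

    // O(n) weighted random choice: pick a random point in [0, total weight)
    // and walk the list until the running total passes it.
    function weightedChoice(weights) {
      const total = weights.reduce((sum, w) => sum + w, 0);
      let remaining = Math.random() * total;
      for (let i = 0; i < weights.length; i++) {
        remaining -= weights[i];
        if (remaining < 0) return i;  // index of the chosen item
      }
      return weights.length - 1;  // guard against floating-point rounding
    }

    // Example: index 2 should come up about 70% of the time.
    console.log(weightedChoice([1, 2, 7]));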

Continue reading…