<?xml version="1.0" encoding="utf-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom">
  <channel>
    <title>Things from the room in the back</title>
    <description>blog soup</description>
    <link>https://soup.agnescameron.info/</link>
    
      
        <item>
          <title>knitout-js cheatsheet</title>
          <description>&lt;p&gt;For the &lt;a href=&quot;https://cci.arts.ac.uk/~material/&quot;&gt;material programming project&lt;/a&gt; workshops, we have made an &lt;a href=&quot;https://agnescameron.github.io/knitout-live-visualizer/&quot;&gt;adapted version&lt;/a&gt; of CMU’s original Knitout visualiser, that allows you to write javascript code and export directly to files that run on the Kniterate machine. The visualiser uses a Javascript library called knitout, which allows you to write code that can be translated to run on a knitting machine.
&lt;!-- This page starts with a cheatsheet (just below), and further down is a walkthrough of loading the interface and writing a file. Below that is a set of examples. --&gt;&lt;/p&gt;

&lt;h2 id=&quot;knitout-js-commands&quot;&gt;knitout-js commands&lt;/h2&gt;

&lt;p&gt;This table is adapted from the longer guide in the &lt;a href=&quot;https://github.com/textiles-lab/knitout-frontend-js&quot;&gt;knitout-frontend-js&lt;/a&gt; repository, and contains just the most important operations. The links go to the info pages on Gabrielle Ohlson’s &lt;a href=&quot;https://knit.work&quot;&gt;knit.work&lt;/a&gt; website, which also has some really helpful animations.&lt;/p&gt;

&lt;table&gt;
  &lt;thead&gt;
    &lt;tr&gt;
      &lt;th&gt;command&lt;/th&gt;
      &lt;th&gt;arguments&lt;/th&gt;
      &lt;th&gt;example&lt;/th&gt;
      &lt;th&gt;description&lt;/th&gt;
    &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
    &lt;tr&gt;
      &lt;td&gt;&lt;a href=&quot;https://knit.work/knit/&quot;&gt;knit&lt;/a&gt;&lt;/td&gt;
      &lt;td&gt;&lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;direction&lt;/code&gt;, &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;bed+needle&lt;/code&gt;, &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;carrier&lt;/code&gt;&lt;/td&gt;
      &lt;td&gt;&lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;knit(&quot;+&quot;,&quot;f10&quot;,&quot;3&quot;)&lt;/code&gt;&lt;/td&gt;
      &lt;td&gt;Knit a stitch, on &lt;strong&gt;bed&lt;/strong&gt; at &lt;strong&gt;needle&lt;/strong&gt;, in &lt;strong&gt;direction&lt;/strong&gt;, using &lt;strong&gt;carrier&lt;/strong&gt;&lt;/td&gt;
    &lt;/tr&gt;
    &lt;tr&gt;
      &lt;td&gt;&lt;a href=&quot;https://knit.work/tuck/&quot;&gt;tuck&lt;/a&gt;&lt;/td&gt;
      &lt;td&gt;&lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;direction&lt;/code&gt;, &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;bed+needle&lt;/code&gt;, &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;carrier&lt;/code&gt;&lt;/td&gt;
      &lt;td&gt;&lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;tuck(&quot;+&quot;,&quot;f10&quot;,&quot;3&quot;)&lt;/code&gt;&lt;/td&gt;
      &lt;td&gt;Tuck a stitch, on &lt;strong&gt;bed&lt;/strong&gt; at &lt;strong&gt;needle&lt;/strong&gt;, in &lt;strong&gt;direction&lt;/strong&gt;, using &lt;strong&gt;carrier&lt;/strong&gt;&lt;/td&gt;
    &lt;/tr&gt;
    &lt;tr&gt;
      &lt;td&gt;&lt;a href=&quot;https://knit.work/transfer/&quot;&gt;xfer&lt;/a&gt;&lt;/td&gt;
      &lt;td&gt;&lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;from bed+needle&lt;/code&gt;, &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;to bed+needle&lt;/code&gt;&lt;/td&gt;
      &lt;td&gt;&lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;xfer(&quot;f10&quot;,&quot;b10&quot;)&lt;/code&gt;&lt;/td&gt;
      &lt;td&gt;Transfer loops from &lt;strong&gt;from bed&lt;/strong&gt; at &lt;strong&gt;needle&lt;/strong&gt; to  &lt;strong&gt;to bed&lt;/strong&gt; at &lt;strong&gt;needle&lt;/strong&gt;&lt;/td&gt;
    &lt;/tr&gt;
    &lt;tr&gt;
      &lt;td&gt;&lt;a href=&quot;https://knit.work/rack/&quot;&gt;rack&lt;/a&gt;&lt;/td&gt;
      &lt;td&gt;&lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;rack value&lt;/code&gt; (Number)&lt;/td&gt;
      &lt;td&gt;&lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;rack(1)&lt;/code&gt;&lt;/td&gt;
      &lt;td&gt;Translate the back bed relative to the front bed by &lt;strong&gt;rack value&lt;/strong&gt; needle units. The default racking is zero – Kniterate machines also support racking by 0.5 (needed to knit with both beds)&lt;/td&gt;
    &lt;/tr&gt;
    &lt;tr&gt;
      &lt;td&gt;&lt;a href=&quot;https://knit.work/drop/&quot;&gt;drop&lt;/a&gt;&lt;/td&gt;
      &lt;td&gt;&lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;bed+needle&lt;/code&gt;&lt;/td&gt;
      &lt;td&gt;&lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;drop(&quot;f10&quot;)&lt;/code&gt;&lt;/td&gt;
      &lt;td&gt;Drop loops from &lt;strong&gt;bed+needle&lt;/strong&gt; (if you can’t be bothered to cast off)&lt;/td&gt;
    &lt;/tr&gt;
    &lt;tr&gt;
      &lt;td&gt;in&lt;/td&gt;
      &lt;td&gt;&lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;carrier&lt;/code&gt;&lt;/td&gt;
      &lt;td&gt;&lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;in(&quot;5&quot;)&lt;/code&gt;&lt;/td&gt;
      &lt;td&gt;Bring in yarn carrier &lt;strong&gt;carrier&lt;/strong&gt;&lt;/td&gt;
    &lt;/tr&gt;
    &lt;tr&gt;
      &lt;td&gt;out&lt;/td&gt;
      &lt;td&gt;&lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;carrier&lt;/code&gt;&lt;/td&gt;
      &lt;td&gt;&lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;out(&quot;6&quot;)&lt;/code&gt;&lt;/td&gt;
      &lt;td&gt;Take out yarn carrier &lt;strong&gt;carrier&lt;/strong&gt; (not strictly necessary on kniterate and can cause issues – only do this right at the end)&lt;/td&gt;
    &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;

&lt;p&gt;Here’s how the beds, needle numbers and carriage directions correspond to the knitting machine. Note that needles with the same ‘number’ will be opposite one another, and the positive and negative directions are the &lt;em&gt;same&lt;/em&gt; for both beds.&lt;/p&gt;

&lt;figure&gt;
	&lt;img src=&quot;/img/knitout-cheatsheet/kniterate-diagram.jpg&quot; alt=&quot;main&quot; /&gt;
&lt;/figure&gt;

&lt;h3 id=&quot;kniterate-specific-rules&quot;&gt;kniterate-specific rules&lt;/h3&gt;

&lt;ul&gt;
  &lt;li&gt;carriers on the Kniterate are numbered “1” to “6” – you can’t have other numbers&lt;/li&gt;
  &lt;li&gt;you should rack by either a whole number, or by 0.5&lt;/li&gt;
  &lt;li&gt;don’t put more than two tuck stitches on top of each other (machine will get stressed out – this is a general machine thing)&lt;/li&gt;
  &lt;li&gt;don’t start at needle zero! Try to center your design on the bed – needle 50 is a good place to start&lt;/li&gt;
&lt;/ul&gt;
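
&lt;p&gt;Putting the commands and rules together, here’s a minimal sketch of a few rows of plain front-bed knitting. It assumes the knitout writer object is exposed as &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;k&lt;/code&gt; (as in the other posts on this blog) – it’s an illustration, not a complete file (no waste section or cast-off):&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;// a minimal sketch – variable names are my own
const min = 50, max = 60;   // stay away from needle zero
const carrier = &quot;3&quot;;        // kniterate carriers are &quot;1&quot; to &quot;6&quot;

k.in(carrier);              // bring the carrier into work

// knit 10 rows on the front bed, alternating carriage direction
for (let row = 0; row &amp;lt; 10; ++row) {
  if (row % 2 === 0) {
    for (let n = min; n &amp;lt;= max; ++n) k.knit(&quot;+&quot;, &quot;f&quot; + n, carrier);
  } else {
    for (let n = max; n &amp;gt;= min; --n) k.knit(&quot;-&quot;, &quot;f&quot; + n, carrier);
  }
}
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;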

&lt;h3 id=&quot;filetype-guide&quot;&gt;filetype guide&lt;/h3&gt;

&lt;p&gt;Using the visualiser allows you to &lt;em&gt;write&lt;/em&gt; Javascript, which is transformed into KCode that runs on the knitting machine. It does this by first translating the Javascript into another language called Knitout, in which each line is a single instruction to the machine. Javascript is easier to read than Knitout, which is easier again to read than KCode.&lt;/p&gt;
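
&lt;p&gt;As a small illustration of the correspondence between the first two layers (the kcode equivalent is much harder to read):&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;// one stitch in Javascript, via the knitout library:
k.knit(&quot;+&quot;, &quot;f10&quot;, &quot;3&quot;);

// ...which is exported as a single Knitout instruction:
//   knit + f10 3
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;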

&lt;table&gt;
  &lt;thead&gt;
    &lt;tr&gt;
      &lt;th&gt;file type&lt;/th&gt;
      &lt;th&gt;ending&lt;/th&gt;
      &lt;th&gt;usage&lt;/th&gt;
    &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
    &lt;tr&gt;
      &lt;td&gt;&lt;a href=&quot;/img/kniterate-code/waste-test.kc&quot;&gt;kcode&lt;/a&gt;&lt;/td&gt;
      &lt;td&gt;.kc&lt;/td&gt;
      &lt;td&gt;the file that the kniterate machine runs. Needs to be called ‘command.kc’.&lt;/td&gt;
    &lt;/tr&gt;
    &lt;tr&gt;
      &lt;td&gt;javascript&lt;/td&gt;
      &lt;td&gt;.js&lt;/td&gt;
      &lt;td&gt;code that’s written or loaded into the editor that creates knitting files, based on the &lt;em&gt;knitout&lt;/em&gt; library.&lt;/td&gt;
    &lt;/tr&gt;
    &lt;tr&gt;
      &lt;td&gt;&lt;a href=&quot;/img/kniterate-code/waste-test.k&quot;&gt;knitout&lt;/a&gt;&lt;/td&gt;
      &lt;td&gt;.k&lt;/td&gt;
      &lt;td&gt;the intermediate file between the javascript code and the kcode that runs on the kniterate&lt;/td&gt;
    &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;

&lt;!-- ## using the interface

The knitout visualiser has 2 sides -- one to write code, the other to see the results. It will load with an example -- you can either edit this, delete everything and start from scratch, or load a file into the interface.


### how to think about knitout js
you must specify every stitch. this is cool and also annoying.

### knitting a row


### transferring between beds


### waste section


### bindoff


## common issues



## examples

These examples are adapted for the Kniterate from files in CMU&apos;s []()

### 1x1 rib

Transfers

### checkerboard


### challenges


## more information --&gt;
</description>
          <pubDate>2026-04-09T00:00:00-04:00</pubDate>
          <link>https://soup.agnescameron.info//2026/04/09/cheatsheet.html</link>
          <guid isPermaLink="true">https://soup.agnescameron.info//2026/04/09/cheatsheet.html</guid>
        </item>
      
    
      
        <item>
          <title>kniterate notes 4</title>
          <description>&lt;p class=&quot;topnote&quot;&gt;This is the fourth in a series of blog posts about the &lt;a href=&quot;https://cci.arts.ac.uk/~material/&quot;&gt;Material Programming Project&lt;/a&gt;. We are developing malleable knitting software for the &lt;a href=&quot;https://www.kniterate.com/&quot;&gt;Kniterate&lt;/a&gt;, a semi-industrial knitting machine. The first post, on the Knitout project, is available &lt;a href=&quot;https://soup.agnescameron.info/2025/09/20/kniterate.html&quot;&gt;here&lt;/a&gt;, a longer post about the Kniterate machine is &lt;a href=&quot;https://soup.agnescameron.info/2026/03/07/kniterate-notes.html&quot;&gt;here&lt;/a&gt;, and a guide to the different file formats is &lt;a href=&quot;https://soup.agnescameron.info/2026/03/25/kniterate-waste-section.html&quot;&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;This week I learned 2-bed knitting on the domestic machine, and made some progress on the Knitout -&amp;gt; Kniterate code visualiser. We also managed to get a bunch of tests running on the Kniterate, most of which worked fairly well, and visited an exhibition of work by visiting students from NAFA (Nanyang Academy of Fine Arts, Singapore), which featured a really great and inspiring bit of Kniterate work.&lt;/p&gt;

&lt;h3 id=&quot;making-friends-with-the-ribber&quot;&gt;making friends with the ribber&lt;/h3&gt;

&lt;p&gt;This past weekend, Rosie and I went to Knitworks to do a &lt;a href=&quot;https://knitworkslondon.com/products/advanced-ribbing-attachment&quot;&gt;workshop&lt;/a&gt; on the Brother ribber. Neither of us had worked with one before, and it was really useful for getting a material understanding of what’s happening on a 2-bed machine.&lt;/p&gt;

&lt;figure&gt;
    &lt;img src=&quot;/img/kniterate-3/5-by-5-rib.jpg&quot; alt=&quot;main&quot; /&gt;
    &lt;span class=&quot;mainnote&quot;&gt;a plated 5x5 rib sample made on the Knitworks ribber. The ribbing pattern is made by transferring stitches between the front and back beds&lt;/span&gt;
&lt;/figure&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/kniterate-3/ribber-studio.jpg&quot; /&gt;
Setting the ribber up in my studio
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;We practiced a cast-on, different ribs, and experimented with plating and racking on the front bed (which might be a nice thing to re-create on the Kniterate). After the session, I went to my studio to set up the ribber that had come with my Brother machine, but that I’d not used before! It was pretty straightforward to set up: the hardest part was figuring out how to attach the &lt;a href=&quot;https://andeeknits.co.uk/media/catalog/product/cache/7c24d03cf47dc7112c16899914de80b2/b/r/brother_rise_and_fall_bracket_left.jpg&quot;&gt;brackets&lt;/a&gt; to the ribber as they’d been taken off. These are sprung, to allow the bed to latch up and down.&lt;/p&gt;

&lt;h2 id=&quot;kniterate-testing&quot;&gt;kniterate testing&lt;/h2&gt;

&lt;h3 id=&quot;testing-the-waste-section&quot;&gt;testing the waste section&lt;/h3&gt;

&lt;p&gt;The first thing we did at Chelsea was to test the waste section sample we’d generated &lt;a href=&quot;https://soup.agnescameron.info/2026/03/25/kniterate-waste-section.html&quot;&gt;last week&lt;/a&gt;. All I needed to do was add a set of rows of front bed knitting. I decided to do this by using the waste generation file as intended – I made a rectangle on the front bed, and then appended the waste section using a script.&lt;/p&gt;

&lt;p&gt;This turned out to be easier said than done: the lines I’d added to the cast-on section messed up the carrier directions, because I was bringing the carriers in twice: once in the tucked section, and once again in the kniterate-style introduction I’d added. This meant the carriers ended up on the ‘wrong’ side of the bed for the original code. In the end I added a few extra rows to hack things together so we had something to test with, but I’ll need to sort this out properly later on.&lt;/p&gt;

&lt;figure style=&quot;max-width: 500px;&quot;&gt;
    &lt;img src=&quot;/img/kniterate-3/comparison-caston.jpg&quot; alt=&quot;main&quot; /&gt;
    &lt;span class=&quot;mainnote&quot;&gt;comparing between a waste section from the kniterate editor (left) and the knitout code (right). Note the tucked draw thread and cast-on yarn at the bottom of the knitout-based sample&lt;/span&gt;
&lt;/figure&gt;

&lt;p&gt;The tests went well: B was quite pleased with the double introduction, as it means the yarns are brought in early on, getting around an issue with yarn height on her kniterate, and saves her from bringing the carriers in manually. It was really cool seeing our code turn into actual, knitted material – and seeing the correspondences between the knitout and kniterate-editor based cast-ons.&lt;/p&gt;

&lt;p&gt;We encountered another issue, which is that bringing carriers ‘out’ from the right hand side means that they trail over the whole knit to move to the Home position. For now we just move the ‘out’ statement to the end (as they’re not as important in kniterate anyway&lt;label for=&quot;carrier-out&quot; class=&quot;margin-toggle sidenote-number&quot;&gt;&lt;/label&gt;&lt;input id=&quot;carrier-out&quot; class=&quot;margin-toggle&quot; /&gt;),&lt;span class=&quot;sidenote&quot;&gt;B said that bringing carriers ‘in’ and ‘out’ was more of a Shima thing – when they’re not being knit they just sit there, they don’t do that much.&lt;/span&gt; but the other way around this would be to always ensure the drawthread ends up on the RHS (thinking about it, this is non-ideal as it fixes the cast-on direction of the input file… so we won’t do that).&lt;/p&gt;

&lt;p&gt;Once we’d got this working, we made a couple of test samples – one with a 1x1 rib, one with a fisherman’s rib (same as the waste section). Both of these knitted well, though they could have been a little tighter.&lt;/p&gt;

&lt;figure&gt;
    &lt;div class=&quot;subfig&quot;&gt;
        &lt;img src=&quot;/img/kniterate-3/cast-on.jpg&quot; /&gt;
    &lt;/div&gt;
    &lt;div class=&quot;subfig&quot;&gt;
        &lt;img src=&quot;/img/kniterate-3/fishermans-rib.jpg&quot; /&gt;
    &lt;/div&gt;
	&lt;span class=&quot;mainnote&quot;&gt;sample of 1x1 rib (right) and fisherman&apos;s rib (left)&lt;/span&gt;
&lt;/figure&gt;

&lt;h3 id=&quot;casting-off&quot;&gt;casting off&lt;/h3&gt;

&lt;p&gt;The next thing to test was casting off (after B spent about 20 minutes manually casting off the first sample). Initially, I used the &lt;a href=&quot;https://github.com/textiles-lab/knitout-examples/blob/master/rectangle-bindoff.js&quot;&gt;rectangle-bindoff&lt;/a&gt; code (in the US, ‘casting off’ = ‘binding off’) from the knitout examples repo, adapting it to the more recent version of the knitout-frontend javascript code as follows:&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/kniterate-3/knitout-bindoff-1.png&quot; /&gt;
what the first bindoff section looks like in the knitout visualiser. Note the ‘held’ stitches every other stitch on the back bed!
&lt;/span&gt;&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;k.xfer(&quot;f&quot; + n, &quot;b&quot; + n);
k.rack(1.0);
k.xfer(&quot;b&quot; + n, &quot;f&quot; + (n+1));
k.rack(0.25);
if ((n-min) % 2 === 1) {
	k.tuck(&quot;+&quot;, &quot;b&quot; + n, Carrier); // every other stitch held by back bed
}
k.knit(&quot;+&quot;, &quot;f&quot; + (n+1), Carrier);
if (n+2 &amp;lt;= max) {
	k.miss(&quot;+&quot;, &quot;f&quot; + (n+2), Carrier);
}
k.rack(0.0);
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;Translating to words, the order of operations here is:&lt;/p&gt;
&lt;ul&gt;
  &lt;li&gt;transfer a stitch from the front bed to the back bed (the previous row will all have been knitted on the front bed, leaving the back bed free)&lt;/li&gt;
  &lt;li&gt;rack 1 needle position over, bringing the stitch that was transferred to face the next needle along (which will still have a stitch)&lt;/li&gt;
  &lt;li&gt;transfer that first stitch from the back bed to the front bed (on top of an existing stitch)&lt;/li&gt;
  &lt;li&gt;every odd-numbered stitch, perform a &lt;a href=&quot;https://knit.work/tuck/&quot;&gt;tuck stitch&lt;/a&gt; on the back bed (this will take a loop of yarn from the carrier and hold it in place on the back bed). This is effectively holding the carrier yarn in place.&lt;/li&gt;
  &lt;li&gt;knit the stitch that’s held on the front bed&lt;/li&gt;
  &lt;li&gt;except at the very end of the row, miss the carrier past the next needle along (moving it into position without knitting)&lt;/li&gt;
  &lt;li&gt;rack back to 0, and continue&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This does something a bit different to the typical kniterate bindoff, which doesn’t do this step of holding every other stitch on the back bed. B remarked that this method was ‘very shima-y’ (probably because it’s taken from the regular knitout examples), and the end result was somewhat distorted.&lt;/p&gt;

&lt;figure&gt;
    &lt;img src=&quot;/img/kniterate-3/original-bindoff.jpg&quot; alt=&quot;main&quot; /&gt;
    &lt;span class=&quot;mainnote&quot;&gt;the bindoff test sample knit using the original code -- note how the shima-style holds leave some loose stitches on the edge&lt;/span&gt;
&lt;/figure&gt;

&lt;p&gt;The next step was to update the code to match kniterate editor, removing the every-other-needle transfers to the back bed. We ran out of time to test this, but the hope is this should work similarly to the version in the kniterate editor:&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;k.xfer(&quot;f&quot; + n, &quot;b&quot; + n);  // front to rear transfer
k.rack(1.0); // rack 1
k.xfer(&quot;b&quot; + n, &quot;f&quot; + (n+1));
k.rack(0.25);
k.knit(&quot;+&quot;, &quot;f&quot; + (n+1), Carrier);
k.rack(0.0);
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;This performs a simpler operation than the first one described above: it omits the tuck steps, and leaves the ‘miss’ steps until the very end, so the knit stitches are held in place on the needles (instead of being held by the tucked stitches on the back bed).&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;transfer a stitch from the front bed to the back bed&lt;/li&gt;
  &lt;li&gt;rack 1 needle position over&lt;/li&gt;
  &lt;li&gt;transfer that first stitch from the back bed to the front bed&lt;/li&gt;
  &lt;li&gt;knit the stitch that’s held on the front bed&lt;/li&gt;
  &lt;li&gt;rack back to 0, and continue&lt;/li&gt;
&lt;/ul&gt;

&lt;figure style=&quot;max-width: 550px;&quot;&gt;
    &lt;div class=&quot;subfig&quot;&gt;
        &lt;img src=&quot;/img/kniterate-3/kniterate-bindoff.png&quot; /&gt;
    &lt;/div&gt;
    &lt;div class=&quot;subfig&quot;&gt;
        &lt;img src=&quot;/img/kniterate-3/knitout-bindoff-2.png&quot; /&gt;
    &lt;/div&gt;&lt;br /&gt;
    &lt;span class=&quot;mainnote&quot;&gt;The kniterate editor bindoff, compared to the equivalent in knitout. Note that compared to the image above, no tucked stitches are held at the back.&lt;/span&gt;
&lt;/figure&gt;

&lt;p&gt;In validating this approach, I also looked to see if there were any kniterate-specific knitout examples. I wasn’t able to find a JS-based bindoff in Gabrielle’s code, but there is a Python equivalent in her &lt;a href=&quot;https://github.com/gabrielle-ohlson/knitout-kniterate-3D/blob/c65ca94077a890da58b95498d2742eedf12e381a/knitout_kniterate_3D/knit3D.py#L2116&quot;&gt;knit3D&lt;/a&gt; repository, where ‘x’ represents the number of the needle:&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;k.xfer(f&apos;b{x}&apos;, f&apos;f{x}&apos;)
k.rack(1)
k.xfer(f&apos;f{x}&apos;, f&apos;b{x-1}&apos;)
k.rack(0)
[...]
k.knit(&apos;-&apos;, f&apos;b{x-1}&apos;, c)
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;
&lt;p&gt;The … part is the following, which handles roller positions. It’s unclear whether this is necessary for a straight bindoff – potentially it’s needed because, when dealing with 3D shapes, the code has to be extra careful about the positioning of the sample on the bed.&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;if x != xferNeedle+count-1:
  if not asDecMethod and (xferNeedle+count) - x == 30:
    k.rollerAdvance(0)
  elif x &amp;lt; xferNeedle+count-4 and (asDecMethod or (xferNeedle+count) - x &amp;lt; 30):
    k.addRollerAdvance(-50)
    k.drop(f&apos;b{x+1}&apos;)
if not asDecMethod and (xferNeedle+count) - x &amp;gt;= 30:
  k.addRollerAdvance(50)
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;

&lt;h3 id=&quot;kniterate-issues&quot;&gt;kniterate issues&lt;/h3&gt;

&lt;p&gt;We encountered a recurrent issue where the Kniterate moves the yarn carrier before the carriage starts to move, resulting in stitches being dropped on the right hand side of the bed. B mentioned that this also occurs with kniterate-editor generated files, but it seemed to occur more often with ours.&lt;/p&gt;

&lt;p&gt;She said that this only ever happened with carriers on the right-hand side of the bed, and happened most frequently just after the main yarn was cast on (e.g. after the row of both beds knitting at the start of a sample).&lt;/p&gt;

&lt;p&gt;I started to look into this by making close comparisons of the .kcode files generated by the kniterate editor and the &lt;a href=&quot;https://github.com/textiles-lab/knitout-backend-kniterate/blob/master/knitout-to-kcode.js&quot;&gt;knitout-to-kcode&lt;/a&gt; conversion script respectively. This yielded &lt;em&gt;some&lt;/em&gt; differences, but I’ll need a lot more time with the converter (and ideally to see how the Kniterate editor does it) to get to the bottom of if, how and why there’s a bug there.&lt;/p&gt;

&lt;p&gt;In trying to understand this, I also made an annotated guide to a .kcode ‘row’:&lt;/p&gt;

&lt;figure&gt;
    &lt;img src=&quot;/img/kniterate-3/kcode-closeup-2.png&quot; alt=&quot;main&quot; /&gt;
    &lt;span class=&quot;mainnote&quot;&gt;the &apos;row&apos; consists of information about the front bed (carrier position and stitches, then stitch tensions), followed by the back bed, followed by a general metadata section (stitch type, carrier, speed etc)&lt;/span&gt;
&lt;/figure&gt;

&lt;h2 id=&quot;knitout---kniterate-editor&quot;&gt;knitout -&amp;gt; kniterate editor&lt;/h2&gt;

&lt;p&gt;To make the JS knitout library usable for our students, I’m updating the visualiser to include a bunch more kniterate-specific functions, including kcode export and automatic waste section generation.&lt;/p&gt;

&lt;h3 id=&quot;updating-the-interface&quot;&gt;updating the interface&lt;/h3&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/kniterate-3/new-visualiser.png&quot; /&gt;
the adapted visualiser, featuring &lt;i&gt;functional&lt;/i&gt; KCode export button and &lt;i&gt;buggy&lt;/i&gt; waste yarn addition checkbox
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;The first thing to do was to add a kcode export button to the visualiser. This was pretty straightforward: I just used Gabrielle’s code, which had already been integrated into a (different) web interface. This sped things up a lot once we had knitout code that looked good.&lt;/p&gt;

&lt;h3 id=&quot;auto-exporting-the-waste-section&quot;&gt;auto exporting the waste section&lt;/h3&gt;

&lt;p&gt;This requires a more in-depth refactor, partly also to overcome some of the issues we ran into with carrier positioning during the initial waste section tests. My plan is to write a state object (similar to the one used in the knitout-to-kcode file) that tracks the carrier positions and decides the direction of each row based on this.&lt;/p&gt;

&lt;p&gt;It might be nice to have this as a callable function where the min and max width, height etc. are passed as arguments (rather than the section just being added automatically, as at present), but I’m not sure how that would integrate into the main code parsing.&lt;/p&gt;
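
&lt;p&gt;A very rough sketch of what I have in mind – the names and structure here are mine, not from the existing code:&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;// track which side of the bed each carrier last ended on, and use that
// to decide the direction of its next pass
const carrierSides = {};  // e.g. { &quot;3&quot;: &quot;right&quot; }

function nextDirection(carrier) {
  // a carrier parked on the left must knit left-to-right, and vice versa
  return carrierSides[carrier] === &quot;right&quot; ? &quot;-&quot; : &quot;+&quot;;
}

function recordPass(carrier, direction) {
  // after a pass, the carrier sits on the side the carriage finished on
  carrierSides[carrier] = (direction === &quot;+&quot;) ? &quot;right&quot; : &quot;left&quot;;
}

// the callable version might then take the bounds as arguments, e.g.
// addWasteSection(k, { min: 50, max: 100, passes: 24 });
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;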

&lt;h3 id=&quot;ideas-for-helper-functions&quot;&gt;ideas for helper functions&lt;/h3&gt;

&lt;p&gt;Kniterate’s “do my transfers” function is really good and helpful. The way it works is that you mark out an area of knit where transfers from the front to back &lt;em&gt;should&lt;/em&gt; happen and apply it as a layer, at which point the transfers get scheduled correctly.&lt;/p&gt;

&lt;p&gt;I wonder if there’s a nice higher-level js function that could handle simple cases – e.g. managing transfers between 2 rows by looking at what needs to move from the back bed to the front bed (a sketch of what I mean is below). Some of this seems to happen already: adjacent transfers seem to get split over 2 rows without me needing to do anything. I suppose this gets into needing a greater understanding of how knitout is generated from js (&lt;a href=&quot;https://github.com/textiles-lab/knitout-frontend-js/blob/master/knitout.js&quot;&gt;this file&lt;/a&gt;) to understand whether I’d be breaking some kind of parsing law.&lt;/p&gt;
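
&lt;p&gt;Something like this, perhaps – a hypothetical helper, not part of knitout-frontend-js:&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;// hypothetical helper: given which bed each wale sits on before and after
// a row change, emit the transfers needed to get from one to the other
function planTransfers(k, prevBeds, nextBeds, min) {
  for (let i = 0; i &amp;lt; nextBeds.length; ++i) {
    const n = min + i;
    if (prevBeds[i] === nextBeds[i]) continue;  // already on the right bed
    if (nextBeds[i] === &quot;b&quot;) k.xfer(&quot;f&quot; + n, &quot;b&quot; + n);
    else k.xfer(&quot;b&quot; + n, &quot;f&quot; + n);
  }
}

// usage sketch: move the third and fourth wales to the back bed for a rib
// planTransfers(k, [&quot;f&quot;,&quot;f&quot;,&quot;f&quot;,&quot;f&quot;], [&quot;f&quot;,&quot;f&quot;,&quot;b&quot;,&quot;b&quot;], 50);
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;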

&lt;h2 id=&quot;nafa-kniterate-sample&quot;&gt;NAFA Kniterate Sample&lt;/h2&gt;

&lt;p&gt;In the Triangle Space at Chelsea, there was an exhibition of work made by exchange students from the Nanyang Academy of Fine Arts (NAFA), Singapore, working with different Chelsea programmes, including one group with BA Textile Design.&lt;/p&gt;

&lt;p&gt;There was a big variety of work, including some really cool samples by the student Chua Yi Jie, inspired by images of the sky. The sample was a Double Bed Jacquard where one yarn was spandex, causing a really interesting bunching form on the right side of the fabric.&lt;/p&gt;

&lt;figure&gt;
    &lt;img src=&quot;/img/kniterate-3/chua-yi-jie-jolene.png&quot; alt=&quot;main&quot; /&gt;
    &lt;span class=&quot;mainnote&quot;&gt;Chua Yi Jie&apos;s sample, with the elastic yarn in light blue&lt;/span&gt;
&lt;/figure&gt;

&lt;p&gt;This made me think about using elastic to do shaping in knits more generally, potentially producing some really interesting geometries. It might be a nice way to explore 3D knits within the limitations of the kniterate, though I’d imagine it might also make things a lot less predictable.&lt;/p&gt;

&lt;h3 id=&quot;next-steps&quot;&gt;next steps&lt;/h3&gt;

&lt;ul&gt;
  &lt;li&gt;I’d really like to learn more about jacquard, as it’s a technique I’ve never used on either the kniterate or on a ribber.&lt;/li&gt;
  &lt;li&gt;I want to understand the kcode file format, in order to get to the bottom of the carriage errors and work out where they’re being introduced (and whether that happens in hardware or software).&lt;/li&gt;
&lt;/ul&gt;
</description>
          <pubDate>2026-04-01T00:00:00-04:00</pubDate>
          <link>https://soup.agnescameron.info//2026/04/01/transfers.html</link>
          <guid isPermaLink="true">https://soup.agnescameron.info//2026/04/01/transfers.html</guid>
        </item>
      
    
      
        <item>
          <title>kniterate notes 3</title>
          <description>&lt;p class=&quot;topnote&quot;&gt;This is the third in a series of blog posts about the &lt;a href=&quot;https://cci.arts.ac.uk/~material/&quot;&gt;Material Programming Project&lt;/a&gt;. We are developing malleable knitting software for the &lt;a href=&quot;https://www.kniterate.com/&quot;&gt;Kniterate&lt;/a&gt;, a semi-industrial knitting machine. The first post, on the Knitout project, is available &lt;a href=&quot;https://soup.agnescameron.info/2025/09/20/kniterate.html&quot;&gt;here&lt;/a&gt;, and a longer post about the Kniterate machine is &lt;a href=&quot;https://soup.agnescameron.info/2026/03/07/kniterate-notes.html&quot;&gt;here&lt;/a&gt;. The &lt;a href=&quot;https://soup.agnescameron.info/2026/04/01/transfers.html&quot;&gt;next post&lt;/a&gt; includes more detail about Kcode.&lt;/p&gt;

&lt;p&gt;B, Rosie and I met this week to work on knitout generation for the Kniterate, and plan the first programming-based session for the &lt;a href=&quot;https://cci.arts.ac.uk/~material/&quot;&gt;material programming&lt;/a&gt; workshop series. We started by trying to articulate exactly what it is we’re trying to achieve (both with the workshops, and with the project more generally) in concrete, technical terms:&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
    &lt;img src=&quot;/img/kniterate-code/mods-milling.png&quot; /&gt;
    A 2.5D milling workflow for the Roland MDX20 in &lt;a href=&quot;https://modsproject.org/&quot;&gt;mods&lt;/a&gt;. The same workflow could easily be adapted for another mill, a laser cutter, vinyl cutter or embroidery machine.
&lt;/span&gt;&lt;/p&gt;
&lt;ul&gt;
  &lt;li&gt;open knitting file formats are in a ~larval stage and aren’t widely used outside of academia&lt;/li&gt;
  &lt;li&gt;in general, interesting (open-source, malleable) CAD software benefits hugely from the existence and adoption of machine-agnostic, interchangeable file formats&lt;/li&gt;
  &lt;li&gt;the adoption of new kinds of file format requires usable and accessible tooling: it’s not enough for them to just exist&lt;/li&gt;
  &lt;li&gt;if we can make something our students can use, chances are other people will be able to use it too&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;A big part of this work is to get the existing tools into a form our students can use, while also properly articulating the backend structure both for ourselves and for anyone else who wants to tinker with things under the hood. Examples of places this has been done well in the wider world include the &lt;a href=&quot;https://modsproject.org/&quot;&gt;mods project&lt;/a&gt;, a modular tool for rapid prototyping used for the control and automation of a wide variety of CAD machines, open-source 3D printing software, and open-source embroidery projects like &lt;a href=&quot;https://github.com/CreativeInquiry/PEmbroider&quot;&gt;PEmbroider&lt;/a&gt; and &lt;a href=&quot;https://github.com/nkymut/p5.embroider&quot;&gt;p5.embroider&lt;/a&gt;.&lt;/p&gt;

&lt;h2 id=&quot;open-knitting-file-formats&quot;&gt;open knitting file formats&lt;/h2&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
    &lt;img src=&quot;/img/kniterate-code/knitout-kc-comparison.png&quot; /&gt;
    The same file in knitout (left) and kcode (right) format
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;One thing that’s become apparent with this project is how quickly the different knit file formats can get confusing, especially with the naming similarity between Knitout and Kniterate. To add to the confusion, most of these file formats can be converted in one direction, but not the other: for example, it’s possible to convert knitout to kcode, but not currently possible to do it the other way around.&lt;/p&gt;

&lt;p&gt;Here’s a summary of some key file formats, plus links to sample files. The main ones we’re concerned with in this project are .kc and .k files, but the other two are also worth thinking about.&lt;/p&gt;

&lt;table&gt;
  &lt;thead&gt;
    &lt;tr&gt;
      &lt;th&gt;file type&lt;/th&gt;
      &lt;th&gt;ending&lt;/th&gt;
      &lt;th&gt;usage&lt;/th&gt;
    &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
    &lt;tr&gt;
      &lt;td&gt;&lt;a href=&quot;/img/kniterate-code/waste-test.kc&quot;&gt;kcode&lt;/a&gt;&lt;/td&gt;
      &lt;td&gt;.kc&lt;/td&gt;
      &lt;td&gt;Plaintext file that runs on the Kniterate. Made in the Kniterate editor or by converting a Knitout file using the &lt;a href=&quot;https://github.com/textiles-lab/knitout-backend-kniterate&quot;&gt;knitout-backend-kniterate&lt;/a&gt; tools.&lt;/td&gt;
    &lt;/tr&gt;
    &lt;tr&gt;
      &lt;td&gt;&lt;a href=&quot;/img/kniterate-code/waste-test.k&quot;&gt;knitout&lt;/a&gt;&lt;/td&gt;
      &lt;td&gt;.k&lt;/td&gt;
      &lt;td&gt;Open interchange format developed by CMU textiles lab. Considerably more human-readable than .kc files.&lt;/td&gt;
    &lt;/tr&gt;
    &lt;tr&gt;
      &lt;td&gt;dak&lt;/td&gt;
      &lt;td&gt;.txt&lt;/td&gt;
      &lt;td&gt;Designaknit export format, with an interesting pictorial representation. Plaintext. Screenshot examples &lt;a href=&quot;https://support.kniterate.com/hc/en-us/articles/11696134190365-CREATE-AND-EXPORT-A-FILE-FROM-DESIGNAKNIT9-TO-KNITERATE&quot;&gt;here&lt;/a&gt;&lt;/td&gt;
    &lt;/tr&gt;
    &lt;tr&gt;
      &lt;td&gt;DAT&lt;/td&gt;
      &lt;td&gt;.dat&lt;/td&gt;
      &lt;td&gt;Shima Seiki export format. To my knowledge, this is a binary file format. CMU have a closed repo which generates these files from Knitout&lt;/td&gt;
    &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
    &lt;img src=&quot;/img/kniterate-code/txt-file-pattern.png&quot; /&gt;
    the ‘load txt file pattern’ layer interface in the Kniterate editor loading a compatible file
&lt;/span&gt;&lt;/p&gt;

&lt;h3 id=&quot;dak-export-files&quot;&gt;dak export files&lt;/h3&gt;

&lt;p&gt;We’d reached out to Gerard from Kniterate a few weeks ago to ask about importing Kcode and/or knitout files into the Kniterate editor directly. To our knowledge, this isn’t currently possible, but it turns out that the editor &lt;em&gt;does&lt;/em&gt; have a plaintext import function, though only for .txt files generated from Designaknit 9 (DAK9), accessed through the layers editor. This is done (in a similar way to other Kniterate Editor &lt;a href=&quot;https://soup.agnescameron.info/2026/03/07/kniterate-notes.html&quot;&gt;operations&lt;/a&gt;) by adding a layer called ‘Load txt file pattern’.&lt;/p&gt;

&lt;p&gt;These .txt files have &lt;em&gt;yet another&lt;/em&gt; really specific file format: in this instance, an &lt;a href=&quot;https://support.kniterate.com/hc/en-us/articles/11696134190365-CREATE-AND-EXPORT-A-FILE-FROM-DESIGNAKNIT9-TO-KNITERATE&quot;&gt;almost pictorial format&lt;/a&gt; that includes the shaped pattern rendered stitch-by-stitch twice – once containing carrier information, and a second time containing stitch types – plus a set of metadata at the top. As far as I can make out, this &lt;em&gt;doesn’t&lt;/em&gt; include information about racking, transfers etc: presumably you add that once you’re in the Kniterate editor. So, while it would probably be possible to write a knitout-to-dak-txt conversion script, in the end we’d lose a lot of information that knitout already does a good job of encoding.&lt;/p&gt;

&lt;figure style=&quot;width:400px&quot;&gt;
    &lt;img src=&quot;/img/kniterate-code/dak-pattern.png&quot; alt=&quot;main&quot; /&gt;
    &lt;span class=&quot;mainnote&quot;&gt;an excerpt of the DAK txt export pattern, showing header metadata and then shaped carrier information&lt;/span&gt;
&lt;/figure&gt;

&lt;p&gt;Although it’s not essential for running .kc files on the machine, longer-term it would be ideal to be able to import generated .kc files into the Kniterate editor. This is partly because it feels like it &lt;em&gt;should&lt;/em&gt; be possible and would open up a lot of cool plugin options, but also more concretely to be able to perform checks on the files before running them on the machine to avoid breaking stuff.&lt;/p&gt;

&lt;h3 id=&quot;extras&quot;&gt;extras&lt;/h3&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
    &lt;img src=&quot;/img/kniterate-code/shima-knitout.png&quot; alt=&quot;main&quot; /&gt;
    an example knit file rendered in the knitout visualiser (left) and Shima Seiki visualiser (right)
&lt;/span&gt;&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;As part of attempting to find some example Shima files online, I came across the &lt;a href=&quot;https://knitscript.miraheze.org/wiki/Main_Page&quot;&gt;knitscript wiki&lt;/a&gt;, which seems a bit incomplete, but was attached to a really interesting &lt;a href=&quot;https://dl.acm.org/doi/pdf/10.1145/3586183.3606789&quot;&gt;paper&lt;/a&gt; on scripting languages for knit. A scripting language can work with live machine state – very cool future direction!&lt;/li&gt;
  &lt;li&gt;There’s also the DAK Stitch Pattern (.stp) file format, which programmer Tom Price managed to &lt;a href=&quot;https://github.com/t0mpr1c3/DAKexport&quot;&gt;reverse engineer&lt;/a&gt; a few years back.&lt;/li&gt;
  &lt;li&gt;While digging around for Shima files, I re-found the &lt;a href=&quot;https://github.com/MediaInteractionLab/knittingutils&quot;&gt;knittingutils&lt;/a&gt; repo, which seems to have been doing a similar thing to us (helper functions and machine-specific translations) for the Shima a few years back. While there aren’t any sample Shima files, they did have this great side-by-side comparison of the Knitout visualiser with the Shima software.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3 id=&quot;file-format-questions&quot;&gt;file format questions&lt;/h3&gt;

&lt;ul&gt;
  &lt;li&gt;Are there any pieces of knit software which generate or convert &lt;em&gt;to&lt;/em&gt; knitout currently? That would be a cool feature, even for something like DAK.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2 id=&quot;tool-development&quot;&gt;tool development&lt;/h2&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
    &lt;img src=&quot;/img/kniterate-code/knitout-visualiser.png&quot; /&gt;
    JS code generating knitout visualised in the Knitout Visualiser
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;We are working on a fork of the &lt;a href=&quot;https://textiles-lab.github.io/knitout-live-visualizer/&quot;&gt;Knitout visualiser&lt;/a&gt; that allows students to export .kc files that will run on the Kniterate directly from the browser. As many will be totally new to Javascript, having something browser-based that doesn’t rely on running command-line scripts to generate Kniterate-ready files should save us a lot of confusion. Our first task is to replicate the cast-on process from the Kniterate editor using the knitout-frontend tools, so the waste section is reliably knit by our machine.&lt;/p&gt;

&lt;p&gt;Ideally, we’ll end up with something akin to the &lt;a href=&quot;https://github.com/MediaInteractionLab/knittingutils/tree/master&quot;&gt;knitting utils&lt;/a&gt; repository, which seems to do a similar thing for the Shima Seiki machines, adding a layer of helper functions on top of the knitout frontend code.&lt;/p&gt;

&lt;h3 id=&quot;waste-section-generation&quot;&gt;waste section generation&lt;/h3&gt;

&lt;p&gt;To get started, I made a &lt;a href=&quot;https://github.com/agnescameron/knitout-backend-kniterate/blob/master/extras/waste-section-kniterate.js&quot;&gt;fork&lt;/a&gt; of Gabrielle Ohlson’s &lt;a href=&quot;https://github.com/textiles-lab/knitout-backend-kniterate&quot;&gt;knitout-backend-kniterate&lt;/a&gt; repo, creating a new script based on her waste section generating script. The original &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;waste-section.js&lt;/code&gt; produces something &lt;em&gt;fairly&lt;/em&gt; similar to the Kniterate editor, but in &lt;a href=&quot;https://soup.agnescameron.info/2025/09/20/kniterate.html&quot;&gt;previous testing sessions&lt;/a&gt; we had run into some issues bringing the carriers in, and wanted to test out a more exact recreation of the Kniterate waste section.&lt;/p&gt;

&lt;p&gt;The first part of this was also understanding the waste section itself. To explain a bit more about &lt;em&gt;why&lt;/em&gt; waste sections are structured the way they are, B shared with us a set of &lt;a href=&quot;/img/kniterate-code/dubied-draw-thread.pdf&quot;&gt;instructions&lt;/a&gt; for casting on the &lt;a href=&quot;https://knitworkslondon.com/blogs/blog/an-introduction-to-dubied-industrial-knitting-machines&quot;&gt;dubied&lt;/a&gt; machines, which are manually operated and the precursor to modern industrial machines like the Stoll and Kniterate. It was interesting to read these, both to think about the idea of non-electrified industrial machinery (still used today!) and to get a sense of the added complexity required to produce replicable results on large, fine-gauge industrial machines.&lt;/p&gt;

&lt;figure&gt;
    &lt;img src=&quot;/img/kniterate-code/dubied-instructions.png&quot; alt=&quot;main&quot; /&gt;
    &lt;span class=&quot;mainnote&quot;&gt;cast-on instructions for the Dubied machine&lt;/span&gt;
&lt;/figure&gt;

&lt;p&gt;Increasingly, I think about the ‘industrial-ness’ of these machines as compensating for the added difficulty of managing yarn tension, dropped stitches etc. when you can’t use your hands. Hearing B talk about the difference even between a Kniterate, where you can go in and move the needles and carriers around, and a Shima Seiki, which is essentially just a closed cabinet that spits out knitted fabric, also illustrates this.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
    &lt;img src=&quot;/img/kniterate-code/waste-yarn.png&quot; /&gt;
    The waste section of a sample made in the kniterate editor, showing the looser, thicker section of alternating front and back bed stitches of waste yarn (orange) at the bottom, followed by rows of cast-on yarn (plated green/black) and draw thread (pink). Note that this sample failed, and a second waste section is initialised after the red-brown yarn toward the top of the image.
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;While I’ve not used a Dubied, even just imagining the extra weight of the carriage and the fine-gauge needles, it feels like the intention of the machine is to rely much less on manual sensation: to do this, the fabric must be made predictable in other ways. The waste section is one part of this process – setting the sample up to be as predictable as possible, bringing all the carriers in one at a time to minimise the likelihood that something goes wrong.&lt;/p&gt;

&lt;h3 id=&quot;carriers&quot;&gt;carriers&lt;/h3&gt;

&lt;p&gt;Translating between the tools forced us to really engage, line by line, with &lt;em&gt;exactly&lt;/em&gt; what was happening in the Knitout code, and how that corresponded to what was going on both in the Kniterate editor files and then on the machine. One thing we struggled with initially was getting the carriage direction correct to bring in each yarn – you need to start the row from the side where the relevant carrier last ended up. When this is wrong, you can see the path the yarn takes from one side of the material to the other as a float in the interface.&lt;/p&gt;

&lt;figure&gt;
    &lt;div class=&quot;subfig&quot;&gt;
        &lt;img src=&quot;/img/kniterate-code/carriers-1.png&quot; /&gt;
        &lt;span class=&quot;mainnote&quot;&gt;floats visible in the red draw-thread yarn as it&apos;s brought in&lt;/span&gt;
    &lt;/div&gt;
    &lt;div class=&quot;subfig&quot;&gt;
        &lt;img src=&quot;/img/kniterate-code/carriers-2.png&quot; /&gt;
        &lt;span class=&quot;mainnote&quot;&gt;changing the carriage direction for those rows brings in the yarn correctly&lt;/span&gt;
    &lt;/div&gt;
&lt;/figure&gt;
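
&lt;p&gt;In knitout-js terms, the rule looks something like this (a sketch, with my own needle-range names):&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;// carrier &quot;2&quot; comes in on a left-to-right (+) pass, ending on the right;
// its next pass must run right-to-left (-), otherwise the yarn is dragged
// across the material as a float (min/max are assumed needle bounds)
k.in(&quot;2&quot;);
for (let n = min; n &amp;lt;= max; ++n) k.knit(&quot;+&quot;, &quot;f&quot; + n, &quot;2&quot;);
for (let n = max; n &amp;gt;= min; --n) k.knit(&quot;-&quot;, &quot;f&quot; + n, &quot;2&quot;);
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;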

&lt;p&gt;One thing that was consistently challenging was trying to work out if what we were seeing rendered in the Kniterate editor was the &lt;em&gt;same&lt;/em&gt; as what we were seeing in the knitout visualiser. A particular point of confusion was the first part of the waste yarn, where stitches on the same row alternate between front and back beds. In the Knitout visualiser, these appear (visually) to be passed between same-numbered needles on each bed (and the stitches appear to be performed by every needle, rather than every other needle). Only after checking the knitout code directly does it become clear that the two are actually equivalent.&lt;/p&gt;

&lt;p&gt;I later learned that this stitch type is called an &lt;a href=&quot;https://knit.work/interlock/&quot;&gt;interlock&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
     &lt;video width=&quot;240&quot; controls=&quot;&quot;&gt;
      &lt;source src=&quot;/img/kniterate-code/carriers.mp4&quot; type=&quot;video/mp4&quot; /&gt;
    &lt;/video&gt;&lt;br /&gt;
    in this video, you can see the carriage moving towards a carrier threaded with plated yarn, and then bring it across the bed, knitting a row. It’s also possible to see the stitches passed between the front and the back beds.
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;The code excerpt below initialises this alternating-bed section. On even numbered rows, the carriage travels in the positive (left-to-right) direction. On the front bed, even-numbered needles are knit, and on the back bed, odd numbered needles are knit. The needle assignment and direction is then flipped for the odd-numbered rows, resulting in the same texture as seen in the Kniterate editor.&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;// waste section
wasteSection.push(`;waste yarn section`);
for (let p = 0; p &amp;lt; wastePasses; ++p) {
  // even numbered rows in +ve direction
  if (p % 2 === 0) {
    for (let n = wasteMin; n &amp;lt;= wasteMax; ++n) {
      if (n % 2 === 0) {
        wasteSection.push(`knit + f${n} ${wasteCarrier}`);
      } else {
        wasteSection.push(`knit + b${n} ${wasteCarrier}`);
      }
    }
  }
  // odd numbered rows in -ve direction
  else {
    for (let n = wasteMax; n &amp;gt;= wasteMin; --n) {
      if (n % 2 === 0) {
        wasteSection.push(`knit - b${n} ${wasteCarrier}`);
      } else {
        wasteSection.push(`knit - f${n} ${wasteCarrier}`);
      }
    }
  }
}
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;

&lt;figure&gt;
    &lt;img src=&quot;/img/kniterate-code/correspondence.png&quot; alt=&quot;main&quot; /&gt;
    &lt;span class=&quot;mainnote&quot;&gt;The end result file, plus details of where we think it corresponds to the visualisation in the knitout editor. Without being able to go from .kc -&amp;gt; .k or from .kc -&amp;gt; Kniterate Editor, we won&apos;t know how correct this is until we try it on the machine!&lt;/span&gt;
&lt;/figure&gt;

&lt;p&gt;The remaining point of confusion is the very top of the waste section, where a row is knitted with every stitch on both the front and back beds in the cast-on yarn, forming the first row of the eventual knitted piece. In B’s explanation, this ‘both-beds’ row is important as it gives a nice, sturdy edge once you remove the draw thread: she showed us an example of this on one of her samples.&lt;/p&gt;

&lt;p&gt;In the Knitout visualiser, this looks similar to a cast-off, with each successive stitch rendered on a new row. We ran out of time to test this on the machine, but will be testing it again next week – at which point we’ll hopefully resolve whether this top part is actually doing what we want it to be doing!&lt;/p&gt;
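
&lt;p&gt;For reference, my best guess at what this row looks like in knitout terms – a sketch only, using the half-pitch racking the Kniterate needs to knit every needle on both beds:&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;// guess at the &apos;both-beds&apos; row: knit every needle on the front and back
// beds in one pass, racked by 0.5 so opposing needles do not collide
// (min/max/castOnCarrier are assumed names)
k.rack(0.5);
for (let n = min; n &amp;lt;= max; ++n) {
  k.knit(&quot;+&quot;, &quot;f&quot; + n, castOnCarrier);
  k.knit(&quot;+&quot;, &quot;b&quot; + n, castOnCarrier);
}
k.rack(0);
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;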

</description>
          <pubDate>2026-03-25T00:00:00-04:00</pubDate>
          <link>https://soup.agnescameron.info//2026/03/25/kniterate-waste-section.html</link>
          <guid isPermaLink="true">https://soup.agnescameron.info//2026/03/25/kniterate-waste-section.html</guid>
        </item>
      
    
      
        <item>
          <title>kniterate notes 2</title>
          <description>&lt;p class=&quot;topnote&quot;&gt;This is the second in a series of blog posts about the &lt;a href=&quot;https://cci.arts.ac.uk/~material/&quot;&gt;Material Programming Project&lt;/a&gt;. We are developing malleable knitting software for the &lt;a href=&quot;https://www.kniterate.com/&quot;&gt;Kniterate&lt;/a&gt;, a semi-industrial knitting machine. This post is about the machine. The first post, on the Knitout project, is available &lt;a href=&quot;https://soup.agnescameron.info/2025/09/20/kniterate.html&quot;&gt;here&lt;/a&gt;. The next post, which covers the Knitout and Kcode file formats in more detail, is available &lt;a href=&quot;https://soup.agnescameron.info/2026/03/25/kniterate-waste-section.html&quot;&gt;here.&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Today we had the first of the &lt;a href=&quot;https://cci.arts.ac.uk/~material/&quot;&gt;material programming project&lt;/a&gt; student workshops. We got some UAL Teaching and Learning funding to run a series of workshops on the Kniterate, with the eventual aim of getting students to experiment with the knit programming tools we’re developing. For now, the focus is just getting everyone trained on the machine, which was also a really useful refresher for me.&lt;/p&gt;

&lt;figure&gt;
	&lt;div class=&quot;subfig&quot;&gt;
		&lt;img src=&quot;/img/kniterate/threading-1.jpeg&quot; /&gt;
	&lt;/div&gt;
	&lt;div class=&quot;subfig&quot;&gt;
		&lt;img src=&quot;/img/kniterate/threading-3.jpeg&quot; /&gt;
	&lt;/div&gt;&lt;br /&gt;
	&lt;span class=&quot;mainnote&quot;&gt;B demonstrating how to thread the kniterate&lt;/span&gt;
&lt;/figure&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/kniterate/stitch-types.png&quot; /&gt;
	knitout stitch types
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;B led the workshop, first giving an overview of the &lt;a href=&quot;https://editor.kniterate.design/&quot;&gt;kniterate editor&lt;/a&gt; (free to use!), and then using the machine to knit out one of the files. It’s interesting, having now worked with the CMU knit tools, to compare the construction of the kniterate files both to their interface and to the eventual knitted design.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/kniterate/cast-on-template.png&quot; /&gt;
	the 100-stitch template cast on
&lt;/span&gt;&lt;/p&gt;

&lt;h2 id=&quot;setting-up-the-file&quot;&gt;setting up the file&lt;/h2&gt;

&lt;p&gt;The first thing we did was to set up a 100-stitch cast-on file to use as a template. All the different operations are managed in ‘layers’. While the name evokes something like Photoshop layers, these reminded me more of the process tracker bar in Fusion 360: though it seems that, unlike in Fusion, if you go back and change something earlier in the design, the changes don’t cascade. (This makes me wonder what a parametric design tool for knit would look like.)&lt;/p&gt;

&lt;figure style=&quot;max-width: 400px;&quot;&gt;
	&lt;div class=&quot;subfig&quot;&gt;
		&lt;img src=&quot;/img/kniterate/layers.png&quot; /&gt;
		&lt;span class=&quot;mainnote&quot;&gt;the layers in the template file&lt;/span&gt;
	&lt;/div&gt;
	&lt;div class=&quot;subfig&quot;&gt;
		&lt;img src=&quot;/img/kniterate/fusion-timeline.png&quot; /&gt;
		&lt;span class=&quot;mainnote&quot;&gt;the fusion 360 timeline tool&lt;/span&gt;
	&lt;/div&gt;
&lt;/figure&gt;

&lt;p&gt;It was also interesting to compare the kniterate interface both to the knitout visualiser and to the eventual results of the knitting. The cast-on section in the kniterate software, for example, has a complex sequence of bringing in the different feeders, which ensures the yarns end up in the correct position.&lt;/p&gt;

&lt;p&gt;In the photos below you can see how the two rows of yarn 1, the drawthread (orange in the kniterate interface and bright pink in the sample), are integrated into the design. Yarn 6, the waste yarn (green in the interface and orange in the sample), alternates between the back and front beds initially, before a row of the main yarn (yarn 4, double-stranded blue/yellow in the sample) is brought in.&lt;/p&gt;

&lt;figure&gt;
	&lt;div class=&quot;subfig&quot;&gt;
		&lt;img src=&quot;/img/kniterate/cast-on-file-2.png&quot; /&gt;
	&lt;/div&gt;
	&lt;div class=&quot;subfig&quot;&gt;
		&lt;img src=&quot;/img/kniterate/cast-on-sample.jpeg&quot; /&gt;
	&lt;/div&gt;&lt;br /&gt;
	&lt;span class=&quot;mainnote&quot;&gt;the cast-on section in the kniterate file (left) vs the actual cast-on (right). The machine freaked out after the first couple of rows of the main yarn, so the orange waste yarn starts again after a couple of rows.&lt;/span&gt;
&lt;/figure&gt;

&lt;p&gt;The other thing that this made me realise was that an obvious first step for improving the behaviour of knitout on the kniterate would be to simply attempt to duplicate the exact cast-on pattern used by the kniterate software itself.&lt;/p&gt;

&lt;p&gt;At present, the waste section appended by knitout-backend-kniterate has a number of similar aspects. The row where the rear stitches are dropped, followed by the drawthread row (the last row of purple waste yarn, then red drawthread), seems to be the same as in the kniterate file. Similarly (but harder to see), the section of waste yarn where the front and back-bed stitches are alternated, followed by a few rows of the beds being knitted separately (you can see this in the knitout visualiser where the threads cross back and forth), is the same in both files. However, the knitout equivalent is missing the rows where the main yarn and drawthread are brought into work, which might be partly why the machine struggles to knit these files.&lt;/p&gt;

&lt;figure&gt;
	&lt;div class=&quot;subfig&quot;&gt;
		&lt;img src=&quot;/img/kniterate/knitout-waste.png&quot; /&gt;
	&lt;/div&gt;
	&lt;div class=&quot;subfig&quot;&gt;
		&lt;img src=&quot;/img/kniterate/cast-on-file-2.png&quot; /&gt;
	&lt;/div&gt;&lt;br /&gt;
	&lt;span class=&quot;mainnote&quot;&gt;comparing to the equivalent section in the knitout visualiser&lt;/span&gt;
&lt;/figure&gt;

&lt;h2 id=&quot;rib-and-knit-structure&quot;&gt;rib and knit structure&lt;/h2&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/kniterate/helen-sharp-plating.jpg&quot; /&gt;
	&lt;span&gt;a plated cable sample by &lt;a href=&quot;http://silverneedlesmachineknittingclub.com/demonstrators-and-vendors-2018/&quot;&gt;Helen Sharp&lt;/a&gt;, where the alternating colours are caught by front (pale) and rear (dark) beds.&lt;/span&gt;
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/kniterate/plating.jpg&quot; /&gt;
	&lt;span&gt;the configuration of yarn in the kniterate feeder required for plating&lt;/span&gt;
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;Once we’d made the template file, we used the rest of the workshop to explore knit structures using ribs. The Kniterate machine supports &lt;a href=&quot;https://www.ojolly.net/knitting/2021/kniterate-plating-and-sweater&quot;&gt;plating&lt;/a&gt;, a knit technique where two yarns are placed into the same feeder, layered so that one yarn will tend to be in front when knit by the front bed, and at the back when knit by the rear bed.&lt;/p&gt;

&lt;figure&gt;
	&lt;img src=&quot;/img/kniterate/plating-samples.jpeg&quot; /&gt;
	&lt;span class=&quot;mainnote&quot;&gt;some of B&apos;s samples that use plating. Note that sometimes the front/back bed catching isn&apos;t perfect -- you can see the rear yarn coming through at the edges of the wavy orange sample, creating a slight marl. In the top sample, the brighter green yarn also has higher elasticity, creating a difference in texture.&lt;/span&gt;
&lt;/figure&gt;

&lt;p&gt;Plating can be used to create colour variation without using fairisle or jacquard, and instead by alternating stitches between front and back beds. One of the students was even wearing a plated jumper, with the shape of a horse outlined as a flat area surrounded by alternating ribs. It was cool to learn how a fabric I’d seen around was constructed!&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/kniterate/final-layers.png&quot; /&gt;
	&lt;span&gt;layers in the rib file (I learned later that each set of transfers didn’t need to be done separately…)&lt;/span&gt;
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;The other nice thing about plating is that it doesn’t require anything complicated in the pattern – the colour variation will be provided by the existing structure, so the only thing we needed to do was learn how to make a rib.&lt;/p&gt;

&lt;p&gt;To start editing the file, we added a ‘free edit’ layer – these allow you to change the type of stitches being used. I used the paint tools to add sections of rib, alternating columns (wales) of front and back bed knitting. In order for this to be safely knit by the machine, transfers also need to be added. For this, a layer called ‘Front &amp;lt;&amp;gt; Rear Transfers’ is used, which will automatically plan transfers in a selection.&lt;/p&gt;

&lt;figure&gt;
	&lt;img src=&quot;/img/kniterate/transfers-2.png&quot; /&gt;
	a section of the rib sample in the kniterate software, showing transfers between beds at the start and end of the rib section. You can see transfers from the (dark) back bed to the (light) front bed indicated by a down-pointing arrow in the rows before the yarn 3 section, and the up arrows indicating where stitches are transferred to the back bed for the next rows of rib. Notice that where there are multiple adjacent transfers this takes place over 2 rows.
&lt;/figure&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/kniterate/bind-off.png&quot; /&gt;
	&lt;span&gt;the bindoff section&lt;/span&gt;
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;You can see in this sample that adjacent transfers are spread over multiple rows – this decreases the likelihood of the thread breaking. For complex transfers, this can extend over several rows.&lt;/p&gt;
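&lt;p&gt;For reference, on the knitout side the same rib setup comes down to a row of xfer operations followed by knitting on alternating beds – the planning that the kniterate software does automatically. A minimal knitout-frontend-js sketch (the width and carrier number are placeholders, it assumes stitches are already on the front bed, and a careful file would split adjacent transfers over passes, as above):&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;// sketch: a 1x1 rib section in knitout-js terms
const knitout = require(&apos;knitout&apos;);
const k = new knitout.Writer({ carriers: [&apos;1&apos;, &apos;2&apos;, &apos;3&apos;, &apos;4&apos;, &apos;5&apos;, &apos;6&apos;] });
const width = 40;

// set up the rib: move every other front-bed stitch to the back bed
for (let n = 1; n &amp;lt;= width; n += 2) k.xfer(&apos;f&apos; + n, &apos;b&apos; + n);

// each rib row then knits both beds&apos; stitches in a single pass
for (let n = 1; n &amp;lt;= width; n++) {
  if (n % 2) k.knit(&apos;+&apos;, &apos;b&apos; + n, &apos;3&apos;);  // odd wales: back bed
  else k.knit(&apos;+&apos;, &apos;f&apos; + n, &apos;3&apos;);        // even wales: front bed
}
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;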

&lt;p&gt;The last part of making a pattern is adding a bind off, and going through the checks process. The checks will highlight things like the bindoff starting on the wrong side (it throws this error by saying the ‘float’ is too long, as the carriage would have to jump from one side to the other). The bindoff itself looks like a huge triangle, but it’s similar to the transfers in that it’s just moving and binding off one stitch per row – the end result is a neatly knit straight line.&lt;/p&gt;

&lt;h2 id=&quot;knitting-out&quot;&gt;knitting out&lt;/h2&gt;

&lt;p&gt;The file we ended up knitting was one made by Rosie, who is the other e-textiles technician at CCI. She’d gone with a pattern that spelled her name, using a similar technique to the horse jumper: a flat area surrounded by alternating ribs. After a couple of false starts (with feeders ending up where they shouldn’t be), the machine knit it out!&lt;/p&gt;

&lt;figure&gt;
	&lt;img src=&quot;/img/kniterate/rosie-sample.jpeg&quot; /&gt;
	Rosie&apos;s plating rib sample
&lt;/figure&gt;

&lt;h2 id=&quot;final-thoughts&quot;&gt;final thoughts&lt;/h2&gt;

&lt;p&gt;In some ways, I’m struck by how much &lt;em&gt;less&lt;/em&gt; informative the knitout stitch visualiser is than the kniterate one. It feels like the difference between a tinkercad drawing and a circuit diagram – the former is more useful for people with no electronics experience in representing how a circuit will &lt;em&gt;look&lt;/em&gt;, but is ultimately very limited in terms of participation in the wider symbolic language of the field. Even after just a morning with the kniterate visualiser, its symbolic language is so much clearer.&lt;/p&gt;

&lt;p&gt;Perhaps this makes sense – the knitout visualiser was designed by non-knitters, and the parallel text view (which is super useful) provides a specific understanding of the pattern. But I do think making a visualisation interface readable by knitters is pretty important, and is maybe another step that’s needed in developing the tool.&lt;/p&gt;

&lt;figure&gt;
	&lt;div class=&quot;subfig&quot;&gt;
		&lt;img src=&quot;/img/kniterate/latching-tinkercad.png&quot; /&gt;
	&lt;/div&gt;
	&lt;div class=&quot;subfig&quot;&gt;
		&lt;img src=&quot;/img/kniterate/latching-schematic.png&quot; /&gt;
	&lt;/div&gt;&lt;br /&gt;
	&lt;span class=&quot;mainnote&quot;&gt;A tinkercad pictorial representation of a circuit (left) vs a circuit diagram, or schematic representation (right). Although the picture can be easier for a beginner to make sense of, it&apos;s much harder to read and analyse the circuit from it, making it far less usable for someone who needs to understand the circuit (and in a sense, much less &lt;i&gt;malleable&lt;/i&gt;)&lt;/span&gt;
&lt;/figure&gt;

</description>
          <pubDate>2026-03-07T00:00:00-05:00</pubDate>
          <link>https://soup.agnescameron.info//2026/03/07/kniterate-notes.html</link>
          <guid isPermaLink="true">https://soup.agnescameron.info//2026/03/07/kniterate-notes.html</guid>
        </item>
      
    
      
    
      
        <item>
          <title>kniterate notes 1</title>
          <description>&lt;p class=&quot;topnote&quot;&gt;This is the first in a series of blog posts about the &lt;a href=&quot;https://cci.arts.ac.uk/~material/&quot;&gt;Material Programming Project&lt;/a&gt;, a collaboration between researchers at Chelsea College of Arts and the Creative Computing Institute. We are developing malleable knitting software for the &lt;a href=&quot;https://www.kniterate.com/&quot;&gt;Kniterate&lt;/a&gt;, a semi-industrial knitting machine. The next post, which talks about the Kniterate machine in more detail, is available &lt;a href=&quot;https://soup.agnescameron.info/2026/03/07/kniterate-notes.html&quot;&gt;here.&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I’ve been working on a &lt;a href=&quot;https://cci.arts.ac.uk/~material/&quot;&gt;research project&lt;/a&gt; with &lt;a href=&quot;https://www.instagram.com/b.clax/?hl=en&quot;&gt;B Claxton&lt;/a&gt; and &lt;a href=&quot;https://researchers.arts.ac.uk/1615-claire-anderson&quot;&gt;Claire Anderson&lt;/a&gt; from the Smart Textiles Lab at Chelsea College of Arts. At the moment we’re running a reading group and building an &lt;a href=&quot;https://docs.google.com/spreadsheets/d/1Mk6qIkn9i-3fB2CGQtpkl0AzOCBJlxThXYyPawyuev8/edit?gid=0#gid=0&quot;&gt;index of open source knit tools&lt;/a&gt;, which we’re attempting to use with the Chelsea &lt;a href=&quot;https://www.kniterate.com/&quot;&gt;Kniterate&lt;/a&gt;, a semi-industrial knitting machine. This series of posts describes our work implementing and using the &lt;a href=&quot;https://textiles-lab.github.io/knitout/knitout.html&quot;&gt;Knitout&lt;/a&gt; file format as the basis for programming tools.&lt;/p&gt;

&lt;h2 id=&quot;what-is-knitout&quot;&gt;what is knitout?&lt;/h2&gt;

&lt;p&gt;&lt;a href=&quot;https://textiles-lab.github.io/knitout/knitout.html&quot;&gt;Knitout&lt;/a&gt; is an open interchange industrial knit file format designed by the &lt;a href=&quot;https://textiles-lab.github.io/&quot;&gt;Textiles Lab&lt;/a&gt; at Carnegie Mellon University. This group’s research is extremely interesting, and they’re one of the major sources of new pieces of knit software/research.&lt;/p&gt;

&lt;p&gt;The appeal of Knitout is that it can theoretically describe the operations of any of the major knit machines (e.g. Shima Seiki, Stoll, Kniterate). This means that, instead of being reliant on proprietary software, it might be possible to write the same code for different machine backends, opening up the potential for &lt;a href=&quot;https://www.inkandswitch.com/essay/malleable-software/&quot;&gt;malleable knitting software&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;This post documents a number of early experiments with the CMU repository to see how easily we could get the knitout files working on our Kniterate machine.&lt;/p&gt;

&lt;figure&gt;
	&lt;img src=&quot;/img/knitout/final-waste.png&quot; alt=&quot;main&quot; /&gt;
	&lt;span class=&quot;mainnote&quot;&gt;an example knit file rendered in the knitout visualiser&lt;/span&gt;
&lt;/figure&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/knitout/vector-tiles.png&quot; /&gt;
	&lt;!-- &lt;img src=&quot;/img/knitout/face-notes.png&quot;/&gt; --&gt;
	big fan of the knitout visualiser stitch svg reference files
&lt;/span&gt;&lt;/p&gt;

&lt;h2 id=&quot;where-is-knitout&quot;&gt;where is knitout?&lt;/h2&gt;

&lt;p&gt;Knitout and related tools appear to still be very much under active research/development, and documentation and tools are distributed across a few different repositories, which made piecing them together initially a bit confusing. We found &lt;a href=&quot;https://knit.work/&quot;&gt;this website&lt;/a&gt; (though also a work in progress) helpful for getting a sense of how everything fits together. For the workflow to produce these designs, we ended up using a combination of the following tools:&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;&lt;strong&gt;&lt;a href=&quot;https://github.com/textiles-lab/knitout-examples/tree/master&quot;&gt;knitout examples&lt;/a&gt;&lt;/strong&gt;, a repository of example knitout files, and JS used to generate them – some of these seem to be a little out of date&lt;/li&gt;
  &lt;li&gt;&lt;strong&gt;knitout/js &lt;a href=&quot;https://github.com/textiles-lab/knitout-live-visualizer&quot;&gt;visualisation tools&lt;/a&gt;&lt;/strong&gt;, which provide a great sanity check for .k files, and also have a live-coding feature for writing js code that generates knit&lt;/li&gt;
  &lt;li&gt;the &lt;strong&gt;&lt;a href=&quot;https://github.com/textiles-lab/knitout-backend-kniterate&quot;&gt;backend kniterate conversion tools&lt;/a&gt;&lt;/strong&gt;, to convert generic knitout files to make things run on the kniterate&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;To make things simpler and get more verbose error messages, we downloaded these repositories locally and ran tools from the command line, rather than using the online conversion and visualiser tools.&lt;/p&gt;

&lt;p&gt;We &lt;em&gt;didn’t&lt;/em&gt; yet properly test out the npm package (&lt;a href=&quot;https://github.com/textiles-lab/knitout-frontend-js&quot;&gt;knitout-frontend-js&lt;/a&gt;), or the sister &lt;a href=&quot;https://github.com/textiles-lab/knitout-frontend-py&quot;&gt;python package&lt;/a&gt;, as the example js file we used acted more like a printer, writing out knitout lines directly rather than going through the library. It’s possible that some of the kniterate conversion tools will be better suited to files generated in this way – we’ll try this in attempt 2.&lt;/p&gt;

&lt;p&gt;We got the sense that Knitout has been developed &lt;em&gt;primarily&lt;/em&gt; with the Shima in mind, and there are a bunch of settings (e.g. carrier numbers, the inhook and outhook commands) that don’t work on the Kniterate. To resolve this, CMU have published a script that takes in a knitout file and adapts it for the kniterate specifically, which we are also using (more below).&lt;/p&gt;

&lt;h2 id=&quot;attempt-1--stripes&quot;&gt;attempt 1 – stripes&lt;/h2&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	 &lt;video width=&quot;240&quot; controls=&quot;&quot;&gt;
	  &lt;source src=&quot;/img/knitout/kniterate-slomo.mp4&quot; type=&quot;video/mp4&quot; /&gt;
	&lt;/video&gt;&lt;br /&gt;
	slomo video I took of the machine running when we were trying to debug the waste section
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;We used the example &lt;a href=&quot;https://github.com/textiles-lab/knitout-examples/blob/master/sheet-stripes.js&quot;&gt;‘sheet-stripes.js’&lt;/a&gt; from the &lt;a href=&quot;https://github.com/textiles-lab/knitout-examples/tree/master&quot;&gt;knitout examples repo&lt;/a&gt; to create the initial file. This file seems to come from an era predating the npm package, as it just prints out knitout line by line to a file. More &lt;a href=&quot;https://knit.work/garter-stitch/&quot;&gt;recent examples&lt;/a&gt; instead use API calls to construct the knitout lines. Similarly, I think the API now has a way of writing files – we used a pipe to directly create the .k file. After some failed attempts with lots of waste yarn bunching, we changed it to be 40 stitches wide rather than 20 (we also changed the tension at the same time though, so it could have been either).&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;$ node stripes.js &amp;gt; stripes-example.k
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;
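&lt;p&gt;The difference between the two styles is worth seeing side by side – a minimal sketch (the needle and carrier numbers are placeholders):&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;// older examples print knitout text directly, line by line:
console.log(&apos;knit + f1 3&apos;);

// newer examples construct the same line through the npm package,
// which can also write out the file itself:
const knitout = require(&apos;knitout&apos;);
const k = new knitout.Writer({ carriers: [&apos;1&apos;, &apos;2&apos;, &apos;3&apos;, &apos;4&apos;, &apos;5&apos;, &apos;6&apos;] });
k.knit(&apos;+&apos;, &apos;f1&apos;, &apos;3&apos;);
k.write(&apos;stripes-example.k&apos;);
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;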

&lt;p&gt;In order to adapt this for the machine, we then use some of the backend tools. First, we use the knitout-alter-kniterate script, from the knitout-backend-kniterate &lt;a href=&quot;https://github.com/textiles-lab/knitout-backend-kniterate/tree/master/extras&quot;&gt;extras&lt;/a&gt; folder, to take the original knitout file and adapt it for the kniterate. (input: &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;stripes-example.k&lt;/code&gt;, output: &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;stripes-example-kniterate.k&lt;/code&gt;)&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;$ node knitout-alter-kniterate.js
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;We used the following settings for the machine:&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;Roller advance: 450
Roller advance for transfers: 0
Main stitch number: 6
Stitch number for transfers: 5
Main speed number: 300
Speed number for transfers: 100
Would you like to change any carriers: n
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;
&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/knitout/final-waste-2.png&quot; /&gt;the added waste section, in the knitout visualiser
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;We then used the waste-section.js file (in the same folder) to prepend a waste section to the file. Depending on the settings used, the file that this creates can skip one of the knitout commands needed to bring in a carrier (this happened when we used the ‘0’ setting for the cast on style; it was fine otherwise); we resolved this by manually editing the file to add the line &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;in 3&lt;/code&gt; to bring the third carrier in. (input: &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;stripes-example-kniterate.k&lt;/code&gt;, output: &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;stripes-example-kniterate-waste.k&lt;/code&gt;)&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;$ node waste-section.js
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;
&lt;p&gt;It took us a few attempts to get the tension right – here are the settings used when we got it to work.&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;Roller advance: 100
Stitch number: 5 --&amp;gt; this turned out to maybe be quite important
Speed number: 150
Carrier to use for waste yarn: 6 --&amp;gt; this is our waste yarn
Carrier to use for draw thread: 1 --&amp;gt; draw thread yarn
Cast on style: Open tube
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;We then used the knitout-frontend visualiser to check that the file looks okay. I ran the visualiser locally, but to be honest it would probably be exactly the same using the online version.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/knitout/knitout.png&quot; /&gt;
	&lt;img src=&quot;/img/knitout/kcode.png&quot; /&gt;
	knitout (top) and equivalent kcode (bottom) for waste yarn section
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;The final step is to use the knitout-to-kcode script to transform the final file (knitout -&amp;gt; kniterate -&amp;gt; waste yarn) into a .kc file, which makes it ready to run on the kniterate machine. This complained a bit but was otherwise fine – though it did throw an error the times we were missing the &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;in 3&lt;/code&gt; command to bring in the third carrier.&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;$ node knitout-to-kcode.js extras/stripes-example-kniterate-waste.k stripes.kc
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;I hadn’t looked at a kcode file before – they’re a super interesting format.&lt;/p&gt;

&lt;figure&gt;
	&lt;img src=&quot;/img/knitout/stripes-sample-1.jpg&quot; alt=&quot;main&quot; /&gt;
	&lt;span class=&quot;mainnote&quot;&gt;a few sample attempts, chained together -- the orange is waste yarn, and the green is the start of the stripes section. by later versions the waste section had improved a lot, but the stripe change was still causing issues&lt;/span&gt;
&lt;/figure&gt;

&lt;p&gt;Our first attempts were not wildly successful, but we did end up learning a lot. Because the kniterate machine can already be a bit sensitive, it’s often unclear what’s an issue with the file vs the machine being particular/temperamental about the way it wants to do things. After a lot of wrangling with the waste yarn, our main issues in the end came with bringing the second stripe in – but this could have had more to do with the way the kniterate was moving the plates with the carriage than anything else.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/knitout/waste-end-question.png&quot; alt=&quot;stitch difference between waste and main yarn&quot; /&gt;the main section is one stitch wider?
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;What was very helpful was going through the entire process – including at different points manually debugging the knitout files – and seeing the links between the code, the visualiser, and what was happening on the machine.&lt;/p&gt;

&lt;h3 id=&quot;questionsthoughts-from-round-1&quot;&gt;questions/thoughts from round 1&lt;/h3&gt;

&lt;ul&gt;
  &lt;li&gt;how do you get the machine to bind off? Knitout has a command called ‘outhook’ – but this doesn’t work on the kniterate. Is there an equivalent?&lt;/li&gt;
  &lt;li&gt;why is the waste yarn addition 1 stitch less wide than the eventual pattern?&lt;/li&gt;
  &lt;li&gt;is there a major difference between what’s possible in the Python and JS packages? or is there intended to be?&lt;/li&gt;
  &lt;li&gt;at present, because the code is spread across so many repositories but – especially in the case of the backend – needs to be quite interoperable, it feels a bit like version changes between different repos might introduce a lot of issues&lt;/li&gt;
&lt;/ul&gt;

&lt;h2 id=&quot;attempt-2--testing-different-tools&quot;&gt;attempt 2 – testing different tools&lt;/h2&gt;

&lt;p&gt;Today, I wanted to experiment with more recently developed JS-based tools that automate slightly more of the tasks. Repositories used in this round:&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;&lt;strong&gt;&lt;a href=&quot;https://github.com/textiles-lab/knitout-examples/tree/master&quot;&gt;knitout examples&lt;/a&gt;&lt;/strong&gt; (like before – testing out more js files)&lt;/li&gt;
  &lt;li&gt;Gabrielle Ohlson’s &lt;strong&gt;&lt;a href=&quot;https://github.com/gabrielle-ohlson/knitout-image-processing&quot;&gt;knitout image processing&lt;/a&gt;&lt;/strong&gt; repository, which automatically generates waste section and bind-off, plus has some automated shaping tools&lt;/li&gt;
&lt;/ul&gt;

&lt;h3 id=&quot;1--fairisle&quot;&gt;1 – fairisle&lt;/h3&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/knitout/fairisle-image-script.jpg&quot; /&gt;
	generated with the fairisle image script
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;The first pattern I tried to use was the fairisle generating file &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;fairisle-image.js&lt;/code&gt; in the examples repository. Somewhat misleadingly, this file won’t work in the frontend visualiser interface, as it makes calls to the local filesystem and so has to be run using node.&lt;/p&gt;

&lt;p&gt;When I do use it in the correct way:&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;$ node fairisle-image.js fairisle-test.png &amp;gt; fairisle-test.k
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;
&lt;p&gt;It works really well. Another small note – transparent background pngs get read as black not white, which initially caused some issues, but when I made the background white it was fine.&lt;/p&gt;

&lt;p&gt;I didn’t try adding the waste / bindoff to this yet but thought it came out okay.&lt;/p&gt;

&lt;h3 id=&quot;2--knitout-image-processing&quot;&gt;2 – knitout image processing&lt;/h3&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/knitout/colourwork-test.png&quot; /&gt;
	results from image processing, with waste and bindoff
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;This seems like a &lt;a href=&quot;https://gabrielle-ohlson.github.io/knitout-image-processing/#prompts&quot;&gt;really promising&lt;/a&gt; interface for doing a bunch of different operations that were previously scattered across a lot of different CMU repositories.&lt;/p&gt;

&lt;p&gt;Using the same image, I tried using the &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;knitify&lt;/code&gt; file. The command line interface took me a minute to get used to – it ‘autopresses’ enter once you type a y/n for the yes/no questions, which I kept getting tripped up on. Knitify is launched using the following:&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;npm run knitify
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;You then proceed to specify the file. This initially seemed more kniterate-friendly, though the kniterate code path had a fatal bug when changing the waste settings, caused by a typo (easily solved), which does suggest that this has been tested a bit more on the Shima.&lt;/p&gt;

&lt;p&gt;Once the bug was fixed, I kept getting a more persistent bug about the carriers:&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;Assertion failed: no carrier found for leftover needle: 100 (@ row 1)
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;It still allowed me to write out the file, but this is something I want to try fixing. There is also a more general carrier issue, in that it doesn’t let you specify which carrier you want to use for what during the setup. At least for us, on the Kniterate, carriers 1 and 6 are reserved for the draw thread and waste yarn specifically, but in this instance carriers 1 and 2 are automatically used for the pattern.&lt;/p&gt;

&lt;p&gt;&amp;lt; tests of this coming soon &amp;gt;&lt;/p&gt;

&lt;h3 id=&quot;3--autoknit&quot;&gt;3 – autoknit&lt;/h3&gt;

&lt;p&gt;For the third set of tests, we wanted to experiment with the &lt;a href=&quot;https://github.com/textiles-lab/autoknit&quot;&gt;autoknit&lt;/a&gt; repository, which can be used to turn 3D meshes into corresponding knitting patterns. This is probably the arena where it feels most difficult to adapt to working with the kniterate, though it also poses a lot of really interesting questions. Without thoroughly testing these it feels hard to draw conclusions, but I also wanted to document where we’d got to.&lt;/p&gt;

&lt;h3 id=&quot;process&quot;&gt;process&lt;/h3&gt;

&lt;p&gt;We follow the process outlined in the &lt;a href=&quot;https://github.com/textiles-lab/autoknit&quot;&gt;autoknit&lt;/a&gt; repository – reproduced here in a condensed form (with example .obj file ‘simple cone’). Somewhat messily, I worked with everything inside the &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;dist&lt;/code&gt; folder – I’d probably like to move it into its own folder in the future.&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;1 - create the constraints:
$ ./interface obj:simple-cone.obj constraints:simple-cone.cons

2 - use the constraints to make the mesh:
$ ./interface obj:simple-cone.obj load-constraints:simple-cone.cons obj-scale:20.0 stitch-width:3.66 stitch-height:1.73 save-traced:simple-cone.st

3 - scheduling:
$ ./schedule st:simple-cone.st js:simple-cone.js

(* after this step, modify the JS file for kniterate as below *)

4 - make knitout:
$ NODE_PATH=.. node simple-cone.js out:simple-cone.k
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/knitout/cone-placement.png&quot; /&gt;
	placing constraints on the cone
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;The placing of the constraints was the trickiest part to get used to, as the instructions were a little confusing.&lt;/p&gt;

&lt;ol&gt;
  &lt;li&gt;place a constraint by pressing ‘c’&lt;/li&gt;
  &lt;li&gt;press ‘c’ again and move the mouse, to drag a line out and place the &lt;em&gt;next&lt;/em&gt; constraint&lt;/li&gt;
  &lt;li&gt;click to finish the line, then repeat steps 2 and 3&lt;/li&gt;
  &lt;li&gt;once the lines are added, order them by pressing the plus and minus keys. not sure of best ordering strategy yet (need to test this)&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/knitout/cone-peeling.png&quot; /&gt;
	peeling the cone
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;When we talked about these first files, B thought that even more than previously, these files seemed very Shima-y. A bit later, this was confirmed when I came across the following paragraph in Gabrielle Ohlson’s &lt;a href=&quot;https://github.com/textiles-lab/knitout-backend-kniterate&quot;&gt;knitout-backend-kniterate&lt;/a&gt; repository:&lt;/p&gt;

&lt;blockquote&gt;
  &lt;p&gt;&lt;em&gt;You may have come across autoknit, an exciting project by the Textiles Lab that converts 3D meshes to knitout. As of now, autoknit doesn’t play too nicely with the kniterate, since the kniterate is lacking some features that make 3D-knitting a bit difficult (e.g. high-level take-down mechanisms [sinkers], consistently reliable transfer-mechanisms [sliders], etc.). With the hope of some day figuring it out, autoknit-kniterate.js was created so that autoknit can at least produce files that will safely run on the kniterate&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;We followed the steps she suggests:&lt;/p&gt;
&lt;ul&gt;
  &lt;li&gt;move &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;autoknit-kniterate.js&lt;/code&gt; into the node_modules folder within autoknit&lt;/li&gt;
  &lt;li&gt;after producing the js file (step 3), open that js file in a text-editor and change this line of code (line #1): &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;const autoknit = require(&apos;autoknit-yarns&apos;);&lt;/code&gt; to this: &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;const autoknit = require(&apos;autoknit-kniterate&apos;);&lt;/code&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Here’s the difference between the two files, when viewed in the visualiser:&lt;/p&gt;

&lt;figure width=&quot;400&quot;&gt;
	&lt;div class=&quot;subfig&quot;&gt;
		&lt;img src=&quot;/img/knitout/cone-shima.png&quot; alt=&quot;april&quot; /&gt;
		&lt;span class=&quot;mainnote&quot;&gt;cone not adapted for kniterate&lt;/span&gt;
	&lt;/div&gt;
	&lt;div class=&quot;subfig&quot;&gt;
		&lt;img src=&quot;/img/knitout/cone-kniterate.png&quot; alt=&quot;august&quot; /&gt;
		&lt;span class=&quot;mainnote&quot;&gt;cone adapted for kniterate&lt;/span&gt;
	&lt;/div&gt;
&lt;/figure&gt;

</description>
          <pubDate>2025-09-20T00:00:00-04:00</pubDate>
          <link>https://soup.agnescameron.info//2025/09/20/kniterate.html</link>
          <guid isPermaLink="true">https://soup.agnescameron.info//2025/09/20/kniterate.html</guid>
        </item>
      
    
      
        <item>
          <title>food maps</title>
          <description>&lt;p&gt;This is a write-up of a small tool that scrapes a graph of related dishes, according to the see also section of Wikipedia. I made it a couple of years ago for a project with the Knowledge Futures Group, along with &lt;a href=&quot;https://favourkelvin17.medium.com/&quot;&gt;Favour Kelvin&lt;/a&gt;, who was doing an internship with us at the time. I’ve revisited it a couple of times since, including re-doing the seeding stage and improving the filtering. The code for this project is &lt;a href=&quot;https://github.com/agnescameron/related-dishes&quot;&gt;here&lt;/a&gt;, which also contains a &lt;a href=&quot;https://github.com/agnescameron/related-dishes/blob/master/export.json&quot;&gt;JSON dump&lt;/a&gt; of the graph that you can import into a &lt;a href=&quot;https://neo4j.com/&quot;&gt;neo4j&lt;/a&gt; instance at home.&lt;/p&gt;

&lt;figure class=&quot;fullwidth&quot;&gt;
	&lt;img src=&quot;/img/dishes/graph-section-dishes.png&quot; alt=&quot;main&quot; /&gt;
	&lt;span class=&quot;mainnote&quot;&gt;a small section of linked dishes in the graph, including clusters of stuffed pastries and noodle soups, fading into sandwiches and stews&lt;/span&gt;
&lt;/figure&gt;

&lt;h2 id=&quot;dishes-vs-recipes&quot;&gt;dishes vs recipes&lt;/h2&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/dishes/flavour-map.jpg&quot; alt=&quot;vegetation health&quot; /&gt;a spatial map of flavours (source unknown)
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;This project came out of a piece of consulting work around the design of ‘expert coding languages’ for cooking – the idea that, instead of being discrete entities, recipes were more like paths to particular points in a larger latent space&lt;label for=&quot;dishrecipe&quot; class=&quot;margin-toggle sidenote-number&quot;&gt;&lt;/label&gt;&lt;span class=&quot;sidenote&quot; id=&quot;dishrecipe&quot;&gt;a term from machine learning, ‘latent space’ could be paraphrased as a high dimensional ‘space of possibility’, within which certain points are known, but assumed to be contained within a much larger continuous space. For example – what transformations separate a chewy cookie from a crumbly one? and what’s in the middle?&lt;/span&gt;. By changing aspects of the recipe – cooking time, amount of butter, temperature – you could end up in a different point in the space, and by the same token you could work backward. During the research stages of the project I got very interested in different ontologies for food (I’ve written about this collection &lt;a href=&quot;https://www.are.na/editorial/on-food-ontologies&quot;&gt;here&lt;/a&gt;).&lt;/p&gt;

&lt;p&gt;Part of the reason I like these is that there’s something deeply subjective about them – which gets interesting as soon as you start to try and do things with computers. In the end, the project itself became already-too-vast even within the smaller use-case, though I think the idea is still pretty interesting.&lt;/p&gt;

&lt;p&gt;As well as collecting ontologies (and getting obsessed with &lt;a href=&quot;/2022/08/05/soft-bread.html&quot;&gt;industrial food texture modification manuals&lt;/a&gt;), I started to sketch out my own maps. I found this idea of a continuous latent space of food (just change the dials and you change the recipe into something inbetween) both deeply exciting, and also lacking.&lt;label for=&quot;metafont&quot; class=&quot;margin-toggle sidenote-number&quot;&gt;&lt;/label&gt;&lt;span class=&quot;sidenote&quot; id=&quot;metafont&quot;&gt;In the years since this project, I came across a really beautiful exploration of this tension (within the context of fonts!) in Douglas Hofstadter’s fantastic essay &lt;a href=&quot;https://library.agnescameron.info/artificial%20intelligence/Metafont,%20Metamathematics,%20and%20Metaphysics,%20Douglas%20Hofstader%20(1982).pdf&quot;&gt;Metafont, Metamathematics and Metaphysics&lt;/a&gt;&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;I thought dishes did a good job of articulating this issue: discrete &lt;em&gt;things&lt;/em&gt; that couldn’t be easily faded between in a continuous manner. After all, what does it mean to mark a point halfway between a blood sausage and a kimchi stew? Below is an early sketch of a schema trying to map out these different culinary relationships:&lt;/p&gt;

&lt;figure&gt;
	&lt;img src=&quot;/img/dishes/food-space.png&quot; alt=&quot;main&quot; /&gt;
&lt;/figure&gt;

&lt;p&gt;It felt important to have ‘dish’ be distinct from ‘recipe’. I think of a recipe as an instance of an instruction-set, that captures &lt;em&gt;how&lt;/em&gt; to make a dish (named or unnamed)&lt;label for=&quot;dishrecipe&quot; class=&quot;margin-toggle sidenote-number&quot;&gt;&lt;/label&gt;&lt;span class=&quot;sidenote&quot; id=&quot;dishrecipe&quot;&gt;I feel like there’s also a continuum of how ‘dish-like’ a recipe is – though even recipes which don’t start out pointing to dishes can do, like drunk sandwiches that acquire a name after being perfected&lt;/span&gt;, while a dish is a concept, free to associate with other entities. A graph of dishes deals with overlapping but distinct questions: what other dishes, cuisines, holidays, meals, events, people, places or other practices are associated with this dish? What dishes exist in the world? How is a dish described?&lt;/p&gt;

&lt;p&gt;The answer to the last question is surprisingly sticky if you don’t count recipes as a description of a dish, and it made me think about the way we articulate dishes in terms of their relationships to other foods (which I think is more broadly about being able to express something most succinctly in terms of a shared experience). It’s certainly not a complete description, and it’s also highly subjective – but more so than most things, I think descriptions of dishes are necessarily subjective, unlike an instance of a recipe, which can also be precise.&lt;/p&gt;

&lt;p&gt;Wikipedia’s ‘see also’ section felt like a good place to start with this as it’s so ambiguous, and there are no hard rules – things can be related as they have similar ingredients, a similar form factor and/or textural qualities, or are made using a similar process – it seems like editors mostly go on vibe.&lt;/p&gt;

&lt;h2 id=&quot;the-scraper&quot;&gt;the scraper&lt;/h2&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/dishes/crawler_screenshot.png&quot; alt=&quot;vegetation health&quot; /&gt;the crawler (affectionately known as ‘the worm’) crawling through recipes
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;The scraper starts on a page for a particular dish, and extracts links from the ‘See Also’ section, which contains links to other pages the editor thinks are related. These pages might be other dishes (we want these!) but could also be random things. For example, here’s the See Also section for &lt;a href=&quot;https://en.wikipedia.org/wiki/Borscht&quot;&gt;Borscht&lt;/a&gt;:&lt;/p&gt;

&lt;figure&gt;
	&lt;img src=&quot;/img/dishes/borscht-see-also.png&quot; alt=&quot;main&quot; /&gt;
&lt;/figure&gt;

&lt;p&gt;We want our crawler to visit &lt;a href=&quot;https://en.wikipedia.org/wiki/Shchi&quot;&gt;shchi&lt;/a&gt; and &lt;a href=&quot;https://en.wikipedia.org/wiki/Cabbage_soup&quot;&gt;cabbage soup&lt;/a&gt;, as these are both dishes, but we want it to skip the other links&lt;label for=&quot;grandsoup&quot; class=&quot;margin-toggle sidenote-number&quot;&gt;&lt;/label&gt; &lt;input id=&quot;grandsoup&quot; class=&quot;margin-toggle&quot; /&gt;&lt;span class=&quot;sidenote&quot;&gt;though I’m really taken with the strangely unpatriotic &lt;a href=&quot;https://en.wikipedia.org/wiki/Three_grand_soups&quot;&gt;three grand soups&lt;/a&gt;&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;To decide which pages we are interested in knowing more about, the crawler goes through each link in the ‘See Also’ section of the page and fetches all the categories that the page belongs to. If none of the category titles contain any of the words ‘dishes’, ‘bread’, ‘dessert’, ‘pudding’, ‘pastries’ (pluralised because of the way categories are worded&lt;label for=&quot;swedish&quot; class=&quot;margin-toggle sidenote-number&quot;&gt;&lt;/label&gt; &lt;input id=&quot;swedish&quot; class=&quot;margin-toggle&quot; /&gt;&lt;span class=&quot;sidenote&quot;&gt;Initially I actually used the singular ‘dish’ but then had the issue of collecting multiple pages with categories containing the word ‘Swedish’, which also has ‘dish’ as a substring, and had to kill the crawler as it attempted to index every member of Sweden’s parliament… (second footnote – I wrote the original tool before learning regular expressions)&lt;/span&gt;) then the page is skipped as it’s probably &lt;em&gt;not&lt;/em&gt; primarily a page about a food. If the page does seem to be a dish, then the scraper creates a relationship between it and the page it was linked from, and adds it to a list of pages to crawl next. For example, here are the categories for shchi:&lt;/p&gt;

&lt;figure&gt;
	&lt;img src=&quot;/img/dishes/schchi-categories.png&quot; alt=&quot;main&quot; /&gt;
&lt;/figure&gt;
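&lt;p&gt;The filtering rule itself is tiny. The crawler is a Python script, but sketched out (here in javascript, with the keyword list from above) the decision is just a couple of nested checks:&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;// sketch of the category filter: keep a page if any of its category
// titles contains any of the keywords
const KEYWORDS = [&apos;dishes&apos;, &apos;bread&apos;, &apos;dessert&apos;, &apos;pudding&apos;, &apos;pastries&apos;];

function looksLikeDish(categoryTitles) {
  return categoryTitles.some(title =&amp;gt;
    KEYWORDS.some(word =&amp;gt; title.toLowerCase().includes(word)));
}

looksLikeDish([&apos;Russian soups&apos;, &apos;Cabbage dishes&apos;]);  // true
looksLikeDish([&apos;Members of the Riksdag&apos;]);           // false – no more Swedish MPs
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;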

&lt;p&gt;When a page is crawled, or if it’s found to contain no relevant categories, it’s added to an array of pages to skip, so that it isn’t scraped again. Eventually, the crawler runs out of new paths to go down, and grinds to a halt (the current endpoint, itself very funny, is &lt;a href=&quot;https://en.wikipedia.org/wiki/Jubilee_chicken&quot;&gt;Jubilee chicken&lt;/a&gt;)&lt;/p&gt;

&lt;p&gt;The crawler also scrapes the categories associated with the page it’s crawling, and adds them to the graph too. This is nice later on, as you get to see which categories have the most overlap in terms of dishes.&lt;/p&gt;

&lt;h3 id=&quot;seeding&quot;&gt;seeding&lt;/h3&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/dishes/national_dishes.png&quot; alt=&quot;vegetation health&quot; /&gt;the wikipedia list of national dishes
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;In order to crawl a good section of the graph, we had to come up with a set of ‘seed dishes’ that would spread the crawler initially over a range of different cuisines and types of food, rather than just trying to go from a single start point. To get a good range of cuisines, we started with the &lt;a href=&quot;https://en.wikipedia.org/wiki/Category:National_dishes&quot;&gt;Wikipedia Category of National Dishes&lt;/a&gt;, plus a set of random links from &lt;a href=&quot;https://en.wikipedia.org/wiki/List_of_desserts&quot;&gt;List of Desserts&lt;/a&gt; to balance a bias in national dishes toward savoury foods. (I might have over-compensated for this, however, as the final graph ended up quite dessert-y.)&lt;/p&gt;

&lt;h3 id=&quot;filtering-unhelpful-categories&quot;&gt;filtering unhelpful categories&lt;/h3&gt;

&lt;p&gt;I wanted to include a broader range of categories in the graph than was used to filter pages – e.g. a page probably doesn’t contain a dish if it doesn’t belong to any categories containing the word ‘dish’, but there might be categories relevant to its culinary qualities (like ‘pickle’, ‘pastry’, ‘cookie’) that still contain useful information for clustering. As such, the filtering involved iteratively picking out terms common to ‘wikipedia metadata’ and other less useful categories.&lt;/p&gt;

&lt;h3 id=&quot;limitations&quot;&gt;limitations&lt;/h3&gt;

&lt;p&gt;There are a number of obvious limitations to this – namely, this is only working on english-language Wikipedia, which probably has regional biases as to what dishes (and links &lt;em&gt;between&lt;/em&gt; dishes) are included. There’s also the major issue that links in the main text don’t get included in See Also – potentially missing loads of important links. It’s also not certain whether this method gets all the pages marked ‘dish’ – I haven’t come up with a good metric to determine what proportion of ‘dish’ articles are successfully indexed by the crawler, because there’s no singular category ‘dish’ on Wikipedia. I think the way to do this would be something like:&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;filter all categories by the category criteria used&lt;/li&gt;
  &lt;li&gt;for each of these categories, get the list of &lt;a href=&quot;https://pypi.org/project/Wikipedia-API/&quot;&gt;categoryMembers&lt;/a&gt;&lt;/li&gt;
  &lt;li&gt;remove duplicates and count, potentially use this to seed the next round&lt;/li&gt;
&lt;/ul&gt;
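&lt;p&gt;A sketch of what that check might look like, hitting the MediaWiki query API directly (the crawler itself uses the Python Wikipedia-API package; the keyword filter is the one from above, and continuation/pagination is ignored for brevity):&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;// hypothetical coverage check: gather the members of every category that
// passes the keyword filter, then count the unique page titles
const KEYWORDS = [&apos;dishes&apos;, &apos;bread&apos;, &apos;dessert&apos;, &apos;pudding&apos;, &apos;pastries&apos;];

async function categoryMembers(category) {
  const url = &apos;https://en.wikipedia.org/w/api.php?action=query&amp;amp;list=categorymembers&apos; +
    &apos;&amp;amp;cmtitle=&apos; + encodeURIComponent(category) +
    &apos;&amp;amp;cmlimit=500&amp;amp;format=json&amp;amp;origin=*&apos;;
  const res = await fetch(url);
  const data = await res.json();
  return data.query.categorymembers.map(m =&amp;gt; m.title);
}

async function uniqueCandidates(allCategories) {
  const matching = allCategories.filter(c =&amp;gt;
    KEYWORDS.some(word =&amp;gt; c.toLowerCase().includes(word)));
  const seen = new Set();
  for (const cat of matching) {
    for (const title of await categoryMembers(cat)) seen.add(title);
  }
  return seen;  // compare this count against what the crawler indexed
}
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;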

&lt;p&gt;I’ve also been fairly un-systematic with the script – earlier versions allowed pages that belonged to categories involving ‘bread’, which I took out as there were a lot of bready pages that weren’t really dishes, but am now considering putting back in (cake is another possibility).&lt;/p&gt;

&lt;h2 id=&quot;exploring-the-data&quot;&gt;exploring the data&lt;/h2&gt;

&lt;p&gt;The Python script dumps all the relationships between pages and categories into a big Neo4j graph database instance, using the &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;py-neo4j&lt;/code&gt; package. Dishes that are linked to one another are defined by an &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;isRelatedTo&lt;/code&gt; link – and links from dishes to categories by &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;hasCategory&lt;/code&gt;. The only node types are &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;Dish&lt;/code&gt; and &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;Category&lt;/code&gt;. Here’s a small subsection of the network, showing dishes and categories linked to pho and adjacent soups:&lt;/p&gt;

&lt;figure&gt;
	&lt;img src=&quot;/img/dishes/pho.png&quot; alt=&quot;main&quot; /&gt;
&lt;/figure&gt;
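&lt;p&gt;Writing a scraped link into the graph is just a couple of MERGE statements – the tool does this with py-neo4j, but a rough equivalent with the official javascript driver (connection details are placeholders) looks like:&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;// sketch: writing one scraped relationship into neo4j.
// MERGE keeps each dish unique, however many times it gets linked.
const neo4j = require(&apos;neo4j-driver&apos;);
const driver = neo4j.driver(&apos;bolt://localhost:7687&apos;,
  neo4j.auth.basic(&apos;neo4j&apos;, &apos;password&apos;));  // placeholder credentials

async function addRelation(fromDish, toDish) {
  const session = driver.session();
  try {
    await session.run(
      &apos;MERGE (a:Dish {name: $from}) &apos; +
      &apos;MERGE (b:Dish {name: $to}) &apos; +
      &apos;MERGE (a)-[:isRelatedTo]-&amp;gt;(b)&apos;,
      { from: fromDish, to: toDish });
  } finally {
    await session.close();
  }
}
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;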

&lt;p&gt;I found Neo4j’s desktop tool initially quite clunky, but after a while began to enjoy playing around with the data. Here’s a query in their Cypher query language that maps out a path (travelling only by related dishes, not categories) from Hummus to Turducken.&lt;label for=&quot;whystar&quot; class=&quot;margin-toggle sidenote-number&quot;&gt;&lt;/label&gt;&lt;span class=&quot;sidenote&quot; id=&quot;whystar&quot;&gt;the lack of arrows around the -[:isRelatedTo*]- part of the query is to allow the arrows to be taken in either direction: if you insist on them being unidirectional, the graph gets much harder to traverse, as not all pages that link a dish in the ‘see also’ section are linked back by it… in a way this is also a fun separate analysis to run&lt;/span&gt;&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;match p=shortestPath( (k:Dish {name: &quot;Hummus&quot;})-[:isRelatedTo*]-(a:Dish {name: &quot;Turducken&quot;}) ) return p
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;And here is the resultant graph. Note the ‘offal corridor’&lt;label for=&quot;offal&quot; class=&quot;margin-toggle sidenote-number&quot;&gt;&lt;/label&gt; of associations leading us through staple dishes into stuffed monstrosities:&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;sidenote&quot; id=&quot;offal&quot;&gt;
	&lt;img src=&quot;/img/dishes/stomach.png&quot; alt=&quot;vegetation health&quot; /&gt;
&lt;/span&gt;&lt;/p&gt;

&lt;figure&gt;
	&lt;img src=&quot;/img/dishes/hummus-turducken.png&quot; alt=&quot;main&quot; /&gt;
&lt;/figure&gt;

&lt;p&gt;Here’s another, from Banana Cake to Lavash:&lt;/p&gt;

&lt;figure&gt;
	&lt;img src=&quot;/img/dishes/banana-lavash.png&quot; alt=&quot;main&quot; /&gt;
&lt;/figure&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/dishes/halo-halo.png&quot; alt=&quot;vegetation health&quot; /&gt;a tightly related cluster around halo-halo
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;One thing I like about this is the train of associations: they flip between flavour, technique, form, cuisine and culture, forming loose chains of association. You start to see a lot of the same ‘nodal’ pages, which bridge between different form factors of food – &lt;a href=&quot;https://en.wikipedia.org/wiki/Khachapuri&quot;&gt;Khachapuri&lt;/a&gt;, for example, provides an important bridge between stuffed dishes and flatbreads. Another common node that acts as a cultural bridge is fusion cuisine; for example, the Hawaiian noodle dish &lt;a href=&quot;https://en.wikipedia.org/wiki/Saimin&quot;&gt;Saimin&lt;/a&gt; crops up a lot in chains between European and East Asian cuisines.&lt;/p&gt;

&lt;p&gt;You also develop a sense for parts of Wikipedia where the dishes are highly clustered – like this tight-knit group around &lt;a href=&quot;https://en.wikipedia.org/wiki/Halo-halo&quot;&gt;Halo-Halo&lt;/a&gt;, or within Indonesian cuisine, itself maybe an articulation of the great degree of mixing and cross-influence within e.g. east asian desserts.&lt;/p&gt;

&lt;p&gt;Because the categories are also scraped, you can do nice things like this:&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;match (n:Dish)-[r:hasCategory]-&amp;gt;(c where lower(c.name) contains &quot;pastries&quot;)
return n
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;This gets all the dishes that have a category name containing the string ‘pastries’, and threads them together. It’s so big! I thought it was nice to lay the whole thing out (there’s more writing after).&lt;/p&gt;

&lt;figure class=&quot;fullwidth&quot;&gt;
	&lt;img src=&quot;/img/dishes/pastry-graph.png&quot; alt=&quot;main&quot; /&gt;
&lt;/figure&gt;

&lt;!-- &lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/dishes/french.png&quot; alt=&quot;vegetation health&quot;/&gt;obsessed with the french separatist pastry corner
&lt;/span&gt;
 --&gt;
&lt;p&gt;I really like seeing which things get connected (and by what!) and which are left unconnected, or small islands at the edge of the graph (like the French separatist patisserie corner! macaron -&amp;gt; petits fours).&lt;/p&gt;

&lt;h2 id=&quot;getting-from-a-to-b&quot;&gt;getting from A to B&lt;/h2&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/dishes/gunkel_foodstuffs.jpg&quot; alt=&quot;vegetation health&quot; /&gt;a small version of Patrick Gunkel’s &lt;i&gt;An Idea Tree&lt;/i&gt; (&lt;a href=&quot;https://d2w9rnfcy7mm78.cloudfront.net/11943271/original_6623d9e3186d998f66f3566d274f83b7.jpg?1620932343?bc=0&quot;&gt;full size here&lt;/a&gt;)
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;A big influence on this project is the work of &lt;a href=&quot;http://ideonomy.mit.edu/&quot;&gt;Patrick Gunkel&lt;/a&gt;, founder and core proponent of the field of &lt;em&gt;ideonomy&lt;/em&gt; – the science of ideas. Many years ago my friend SJ introduced me to Gunkel’s work, via the masterpiece &lt;em&gt;‘An Idea Tree’&lt;/em&gt; (small version right), which takes the initial idea of allanto (sausage-like) foods, and follows multiple ‘hints’ and ‘interpretations’ to theorise, for example, the appearance of mochi ice cream some decades before its introduction into Gunkel’s native Texas.&lt;/p&gt;

&lt;p&gt;This tree of sausage-inspired innovation follows a surprisingly similar set of jumps to the map of dishes, albeit with the latter feeling less self-consciously ‘innovative’. The forking, folding path – the ‘staple’ hint, the ‘deep frying’ hint – unfolds layers of different ideas and techniques that have built up between cuisines over time.&lt;/p&gt;

&lt;figure&gt;
	&lt;img src=&quot;/img/dishes/beauties-gunkel.jpg&quot; alt=&quot;main&quot; /&gt;
&lt;/figure&gt;

&lt;p&gt;I’m a big admirer of Gunkel’s broader work for his interest in applying extraordinarily thorough, scientific and detailed methods to further the understanding of fleeting, subjective and slippery ideas. His articulations don’t pin them down, so much as flesh them out – expanding their descriptive possibilities by searching for many definitions within them. It’s a form of inquiry that feels genuinely &lt;em&gt;inquisitive&lt;/em&gt;, and is also extremely funny – a couple of my other favourites are his map of mutual analogousness of &lt;a href=&quot;https://ideonomy.mit.edu/scanned-charts/pic044.html&quot;&gt;‘Examples and Sources of Beauty’&lt;/a&gt;, and another tree &lt;a href=&quot;https://d2w9rnfcy7mm78.cloudfront.net/33837629/original_ed5eb0ea5fb755ee2426593e579d45a8.jpg?1737415310?bc=0&quot;&gt;‘Illusions re: A Stone’&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Without wanting to overstate here, I think this is quite a good example of what science should be &lt;em&gt;for&lt;/em&gt; – the use of the constraint of objectivity as a flexible tool to expand the kinds of questions that we’re able to ask about human endeavours, to expand our imaginative landscape. Gunkel’s &lt;a href=&quot;https://ideonomy.mit.edu/intro.html&quot;&gt;method&lt;/a&gt; is extremely fastidious and thorough&lt;label for=&quot;gunkeling&quot; class=&quot;margin-toggle sidenote-number&quot;&gt;&lt;/label&gt;&lt;span class=&quot;sidenote&quot; id=&quot;gunkeling&quot;&gt;‘gunkeling’ is a classic example of something that looks really easy until you try it&lt;/span&gt;, and takes abstract things very seriously, delighting in the non-obvious links that can be made between different ideas.&lt;/p&gt;

&lt;!-- ## trying some more complex queries

### addendum: graphcommons image graveyard

When I first sat down to write this tool up (in 2022!) I used a site called [graphcommons](https://graphcommons.com/) to host it online. While it still exists, all my old graphs were wiped and there&apos;s now a limit of 500 nodes in the free plan (way too few even just to host the dishes, let alone the categories!). So -- no graphcommons, but I&apos;ve added the screenshots I got at the time for posterity. I remember it being really user-friendly and it had some great clustering tools! (iirc it was also a big pain to export the Neo4j data in the right format...)

I wish I&apos;d taken better notes of the things I&apos;d found! I remember it being really fun to try out different clustering algorithms and see how things got grouped together by category. Here&apos;s a screenshot of the &apos;flatbread&apos; cluster, bordering the &apos;stuffed cluster&apos; (hello, khachapuri!).
 --&gt;
</description>
          <pubDate>2025-01-20T00:00:00-05:00</pubDate>
          <link>https://soup.agnescameron.info//2025/01/20/wiki-scraping.html</link>
          <guid isPermaLink="true">https://soup.agnescameron.info//2025/01/20/wiki-scraping.html</guid>
        </item>
      
    
      
        <item>
          <title>tracking mining with multispectral satellite imagery</title>
          <description>&lt;p&gt;&lt;em&gt;This is a writeup of research into using satellite imagery to track mining operations, funded as part of Bellingcat’s &lt;a href=&quot;https://www.bellingcat.com/become-a-2024-bellingcat-technical-writing-fellow/&quot;&gt;2024 Technical Writing Fellowship&lt;/a&gt;, and is a more in-depth version of a &lt;a href=&quot;https://www.bellingcat.com/resources/2025/01/10/satellite-imagery-bands-guide/&quot;&gt;short guide&lt;/a&gt; published on their online platform. The accompanying Multispectral Satellite Imagery Explorer tool is available &lt;a href=&quot;https://bellingcat-ee.projects.earthengine.app/view/multispectral-satellite-imagery-explorer&quot;&gt;here&lt;/a&gt;. This guide is intended as an overview of my research, and a starting point for these techniques.&lt;/em&gt;&lt;/p&gt;

&lt;figure class=&quot;fullwidth&quot;&gt;
	&lt;img src=&quot;/img/bellingcat/tayan-full-2.png&quot; alt=&quot;main&quot; /&gt;
	&lt;span class=&quot;mainnote&quot;&gt;a false colour satellite image of bauxite mines (yellow, with white centres) and &lt;a href=&quot;https://developers.google.com/earth-engine/datasets/catalog/BIOPAMA_GlobalOilPalm_v1&quot;&gt;palm oil plantations&lt;/a&gt; (green-brown grids) near the Kapuas river, West Kalimantan (Google Earth Engine / Landsat 8)&lt;/span&gt;
&lt;/figure&gt;

&lt;p&gt;Satellite images we encounter on platforms like Google Earth are typically rendered in ‘true colour’, which emulates how a scene might look when viewed with the naked eye. This rendering mode looks natural, and is useful for producing images whose contents are easily recognisable.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bellingcat/maps.png&quot; alt=&quot;google maps screenshot&quot; /&gt;true-colour satellite image of the Yorkshire Dales, taken from Google Maps
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;However, satellite imaging sensors also typically capture infrared light, which is beyond what the human eye can normally perceive. This is known as ‘multispectral imaging’. This outside-of-visible-range information is particularly useful in understanding the environmental context of a scene, giving information about the geology, vegetation, the presence and quality of water, air quality, crop varieties and even building materials and techniques.&lt;/p&gt;

&lt;p&gt;This is a guide to using multispectral satellite imaging to investigate mining operations, using contrasts between different parts of the light spectrum to highlight differences in mineral and chemical composition. While mining is chosen as the example, this kind of analysis can be used more broadly to understand geopolitical events through the lens of accelerating ecological change. For an example of how multispectral information can be used in this way, this &lt;a href=&quot;https://www.bellingcat.com/resources/case-studies/2021/08/02/is-climate-change-heating-up-central-asias-border-disputes-clues-from-satellite-imagery/&quot;&gt;2021 investigation&lt;/a&gt; into the role of water stress in conflict on the Kyrgyzstan/Tajikistan border uses thermal, vegetation and moisture indices to contextualise the analysis of a nominally unrelated dispute about security cameras.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bellingcat/veg-health.png&quot; alt=&quot;vegetation health&quot; /&gt;vegetation health during the Summer 2023 drought in the UK, taken from the FAO’s &lt;a href=&quot;https://www.fao.org/giews/earthobservation/country/index.jsp?lang=en&amp;amp;code=GBR&quot;&gt;Earth Observation tool&lt;/a&gt;. The metrics used to create these images are calculated by combining visible and infrared light collected by the AVHRR sensor on board the Metop satellite.
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;To accompany this guide, I have developed &lt;a href=&quot;https://ee-agnesfcameron.projects.earthengine.app/view/band-ratio-explorer&quot;&gt;a tool&lt;/a&gt; that gives a ‘tour’ of these different multispectral imaging techniques, linked to existing investigations into the mining industry. Both the tool and the images used in this guide use imagery from the &lt;a href=&quot;https://developers.google.com/earth-engine/datasets/catalog/LANDSAT_LC08_C02_T1_L2&quot;&gt;Landsat 8 Satellite imagery dataset&lt;/a&gt;, which contains data from 2013 to the present day (there are a couple of images from the recently-retired &lt;a href=&quot;https://developers.google.com/earth-engine/datasets/catalog/LANDSAT_LE07_C02_T1_L2&quot;&gt;Landsat 7&lt;/a&gt; satellite to show pre-2013 analysis; these will be indicated). This guide is intended as an overview of the use of multispectral imagery in OSINT, and not a detailed guide to writing code. For this, Ollie Ballinger’s &lt;a href=&quot;https://bellingcat.github.io/RS4OSINT/&quot;&gt;Remote Sensing for OSINT&lt;/a&gt; guide is a fantastic resource that covers many of the techniques used here, and is a good intro to writing a similar tool.&lt;/p&gt;
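&lt;p&gt;To give a flavour of what this looks like in practice, here’s a minimal Earth Engine code editor snippet that loads the same Landsat 8 collection and renders a false-colour composite. The date range, cloud filter and visualisation values here are illustrative, not the tool’s exact settings:&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;// a minimal Earth Engine sketch: a median Landsat 8 surface reflectance
// composite, rendered in false colour (SWIR1 / NIR / red)
var l8 = ee.ImageCollection(&apos;LANDSAT/LC08/C02/T1_L2&apos;)
  .filterDate(&apos;2023-01-01&apos;, &apos;2023-12-31&apos;)
  .filter(ee.Filter.lt(&apos;CLOUD_COVER&apos;, 10))  // drop cloudy scenes
  .median();

Map.addLayer(
  l8,
  { bands: [&apos;SR_B6&apos;, &apos;SR_B5&apos;, &apos;SR_B4&apos;], min: 7000, max: 30000 },
  &apos;false colour&apos;);
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;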

&lt;p&gt;A note on language: the terms ‘satellite imaging’ and ‘remote sensing’ are often used interchangeably, the latter describing the broader collection of data from a distance, whether through satellites, aircraft or drones, while the former describes images taken using satellites specifically. For clarity, I will use the term ‘satellite imaging’ throughout this guide.&lt;/p&gt;

&lt;h2 id=&quot;the-electromagnetic-spectrum&quot;&gt;the electromagnetic spectrum&lt;/h2&gt;

&lt;figure&gt;
	&lt;img src=&quot;/img/bellingcat/spectrum.png&quot; alt=&quot;main&quot; /&gt;
&lt;/figure&gt;

&lt;p&gt;Before understanding how a satellite image is constructed, it’s worth looking at how we perceive light. All of the light that we see is part of a much larger range of radiation known as the ‘electromagnetic spectrum’. Electromagnetic radiation travels in waves, and includes forms of radiation like X-rays and infrared radiation, which humans cannot see but can detect through other means. The electromagnetic spectrum is ordered by the ‘wavelength’ of the radiation, a property that changes the way light interacts with different materials. ‘Visible light’ – i.e. light that can be seen with the human eye – sits between infrared light (which has a longer wavelength) and ultraviolet light (which has a shorter wavelength) on the electromagnetic spectrum.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bellingcat/plants-green.png&quot; alt=&quot;vegetation health&quot; /&gt;
	the chlorophyll in healthy plant leaves absorbs blue and red light – when this breaks down in the Autumn, leaves stop absorbing red light, causing green and red to be reflected and changing the leaves to orange. (Wikipedia)
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;All substances reflect and absorb different wavelengths of electromagnetic radiation in some combination, and by examining how radiation interacts with a particular material, it’s possible to make some inferences about what that material is. Satellite imaging datasets will often describe what they measure as ‘surface reflectance’. This means exactly what it says: a measurement of the different wavelengths of light that are reflected, rather than absorbed, by whatever is on that part of the Earth’s surface.&lt;/p&gt;

&lt;p&gt;Consider plants – when we see an area of deep green on a ‘true colour’ satellite image, we typically associate that with vegetation. This green colour is due to the presence of chlorophyll in plant leaves, which reflects green light waves.&lt;/p&gt;

&lt;p&gt;Interestingly, there is a second kind of light that healthy plants reflect that isn’t visible to the human eye. This other wavelength falls in the near infrared range, where the light waves are too long for the human eye to see, but can be picked up by some animals and, crucially, by satellite imaging sensors. This extra information &lt;a href=&quot;https://www.nature.com/articles/s43017-022-00298-5&quot;&gt;becomes important&lt;/a&gt; for differentiating between healthy and unhealthy vegetation, and for making inferences about the type of crops in a region. Notably, it’s also invaluable for differentiating between astroturf and real grass…&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bellingcat/st-ann-timelapse.gif&quot; /&gt;
	a true-colour timelapse of spreading Bauxite mines in St Ann Parish, Jamaica (Google Earth Engine/Landsat 7)
&lt;/span&gt;&lt;/p&gt;

&lt;h2 id=&quot;ways-of-seeing&quot;&gt;ways of seeing&lt;/h2&gt;

&lt;p&gt;There’s a nice line from the information theorist Gregory Bateson, which describes information as ‘the difference which makes a difference’ – in other words, much of our ability to sense and act in the world rests on noticing discontinuities – in time, form, colour, and through other sensory means. Often, visual forms of investigative work involve highlighting a difference so as to make it more visible – picture a ‘before and after’ image, or a clip that shows the shrinking of a lake or the expansion of a construction site – to draw inferences about higher-order events.&lt;/p&gt;

&lt;p&gt;One key form of difference available to human perception is the ratio of the red (R), green (G) and blue (B) light in an image. Most people’s eyes distinguish between these colours very effectively&lt;label for=&quot;colourblindness&quot; class=&quot;margin-toggle sidenote-number&quot;&gt;&lt;/label&gt;&lt;span class=&quot;sidenote&quot; id=&quot;colourblindness&quot;&gt;many people have colourblindness, which can make differentiating between parts of the light spectrum more difficult&lt;/span&gt;, giving us a form of vision that is distinct from that of other animals – a dog, for example, has only two kinds of photoreceptor in its eyes, resulting in a form of vision made up of yellows, blues and greys.&lt;/p&gt;

&lt;h3 id=&quot;note-satellite-imaging-dataset-structure&quot;&gt;note: satellite imaging dataset structure&lt;/h3&gt;

&lt;p&gt;Just as a digital camera image captures red, green, and blue data in different layers, a satellite image contains layers called ‘bands’, each with information from a specific segment of the electromagnetic spectrum. Different satellite datasets vary in the number, width (i.e. what range of wavelengths a band contains) and distribution of their bands, with commonly-used datasets such as Landsat, Sentinel and MODIS having around 8-9 bands.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bellingcat/oli-bands.png&quot; alt=&quot;vegetation health&quot; /&gt;
	&lt;img src=&quot;/img/bellingcat/tirs-bands.png&quot; alt=&quot;vegetation health&quot; /&gt;
	table of bands captured by Landsat 8, by the OLI and TIRS sensors respectively (Wikipedia)
&lt;/span&gt;&lt;/p&gt;

&lt;figure&gt;
	&lt;img src=&quot;/img/bellingcat/bands.jpg&quot; alt=&quot;main&quot; /&gt;
	&lt;span class=&quot;mainnote&quot;&gt;the spectral placement of different Landsat bands, mapped against the atmospheric absorption spectrum (image credit: &lt;a href=&quot;https://commons.wikimedia.org/wiki/File:The_spectral_band_placement_for_each_sensor_of_Landsat.jpg&quot;&gt;Wikimedia&lt;/a&gt;)&lt;/span&gt;
&lt;/figure&gt;

&lt;p&gt;For each image of the Earth’s surface captured, each ‘band’ in the dataset consists of a black-and-white image representing the intensity of surface reflectance in that band. The brighter the pixels, the more light is being reflected in that part of the spectrum. Combined, these bands can be used to reconstruct colour images.&lt;/p&gt;

&lt;p&gt;In different satellite image datasets, the numbering of the bands will correspond to different parts of the spectrum. In this guide we will use Landsat 8 band numbering; if you are working with another dataset, check which band numbers correspond to which wavelengths.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bellingcat/oli.jpg&quot; alt=&quot;vegetation health&quot; /&gt;
	diagram of the OLI sensor on board Landsat 8 (Wikipedia)
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;One important thing to note is that &lt;em&gt;sensors&lt;/em&gt; (the things that actually capture the data) normally have different names to the satellites themselves, and one satellite may have many sensors on board. Thus, you might often see Landsat 7’s dataset referred to as ‘ETM+’ (Enhanced Thematic Mapper Plus), or Landsat 8 (which carries two different sensors) as OLI (Operational Land Imager) and TIRS (Thermal Infrared Sensor).&lt;/p&gt;

&lt;h2 id=&quot;seeing-more-with-band-comparisons&quot;&gt;seeing more with band comparisons&lt;/h2&gt;

&lt;p&gt;Multispectral satellite imaging techniques use differences between light reflected in different bands to reveal information about a scene. Consider the following example, a satellite image of lakes around the American town of Sandersville, Georgia, surrounded by a series of white patches.&lt;/p&gt;

&lt;p&gt;This first image is a typical ‘true colour’ satellite image, like you’d see on Google Earth. This image has been composed by taking the R, G and B bands separately, and displaying the R band with the red pixels of your computer screen, the G band with the green pixels, and the B band with the blue pixels.&lt;/p&gt;

&lt;figure&gt;
	&lt;img src=&quot;/img/bellingcat/sandersville-rgb.png&quot; alt=&quot;main&quot; /&gt; (Google Earth Engine/Landsat 8)
&lt;/figure&gt;
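
&lt;p&gt;For readers who want to follow along in code: below is a minimal sketch of how a composite like this can be built in the Google Earth Engine code editor, assuming the Landsat 8 Collection 2 Level 2 dataset linked above. The location, dates and display values here are illustrative, not necessarily those used to make these images.&lt;/p&gt;

&lt;pre&gt;&lt;code class=&quot;language-javascript&quot;&gt;// a minimal sketch: a true-colour composite in Google Earth Engine
// (Landsat 8 Collection 2 Level 2 surface reflectance)
var sandersville = ee.Geometry.Point([-82.81, 32.98]);  // approximate location

// take the least-cloudy scene over the area in 2022 (dates illustrative)
var image = ee.Image(
  ee.ImageCollection('LANDSAT/LC08/C02/T1_L2')
    .filterBounds(sandersville)
    .filterDate('2022-01-01', '2022-12-31')
    .sort('CLOUD_COVER')
    .first());

// apply the Collection 2 Level 2 scale factors to get surface reflectance
var sr = image.select('SR_B.').multiply(0.0000275).add(-0.2);

// bands 4, 3 and 2 are red, green and blue respectively
Map.centerObject(sandersville, 11);
Map.addLayer(sr, {bands: ['SR_B4', 'SR_B3', 'SR_B2'], min: 0, max: 0.3}, 'true colour');&lt;/code&gt;&lt;/pre&gt;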

&lt;p&gt;To get a different perspective on the scene, we can use a &lt;a href=&quot;https://gisgeography.com/landsat-8-bands-combinations/&quot;&gt;‘band combination’&lt;/a&gt; – a false colour image that makes use of out-of-visible-range information to highlight different features of the scene.&lt;/p&gt;

&lt;p&gt;Shown below is the classic ‘vegetation’ band combination, which uses the Near Infrared (NIR) band in the ‘R’ slot, with ‘R’ information in the G slot and G in the B slot (Landsat 8 bands 5, 4 and 3 respectively)&lt;label for=&quot;band-order&quot; class=&quot;margin-toggle sidenote-number&quot;&gt;&lt;/label&gt;&lt;span class=&quot;sidenote&quot; id=&quot;band-order&quot;&gt;why this order? Within Earth Observation (EO) there are conventions as to which bands go into which ‘slot’ of the image, normally in descending order – though you would still be able to see differences by swapping them around.&lt;/span&gt;. In areas with large amounts of vegetation, this produces a very bright red image: deeper reds indicate areas where the vegetation is healthy and dense, while lighter reds indicate sparser vegetation.&lt;/p&gt;

&lt;figure&gt;
	&lt;img src=&quot;/img/bellingcat/sandersville-nir.png&quot; alt=&quot;main&quot; /&gt;
&lt;/figure&gt;
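
&lt;p&gt;In code, a band combination like this is just a different choice of bands in the visualisation parameters – continuing the sketch above:&lt;/p&gt;

&lt;pre&gt;&lt;code class=&quot;language-javascript&quot;&gt;// NIR (band 5) in the red slot, red (4) in green, green (3) in blue
Map.addLayer(sr, {bands: ['SR_B5', 'SR_B4', 'SR_B3'], min: 0, max: 0.4}, 'vegetation (5,4,3)');

// the shortwave infrared combination discussed below works the same way
Map.addLayer(sr, {bands: ['SR_B7', 'SR_B6', 'SR_B4'], min: 0, max: 0.4}, 'SWIR (7,6,4)');&lt;/code&gt;&lt;/pre&gt;
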
&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bellingcat/nile-valley.png&quot; alt=&quot;vegetation health&quot; /&gt;
	for comparison to the NIR image of Sandersville, shown here is the same band combination (5, 4, 3) applied to the Nile Valley in Egypt (Google Earth Engine/Landsat 8)
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;Note that the same chalky-white patches have appeared in all of the images so far, meaning that whatever substance is there is reflecting all incident radiation of the wavelengths we’ve been looking at, including this Near Infrared band.&lt;/p&gt;

&lt;p&gt;However, in this next image, we can see that some of the patches are no longer white. This image was generated using the 7, 6, 4 band comparison, sometimes called the Shortwave Infrared comparison, which is also used to monitor soil health. Around half of the white areas from the previous image are now cyan in colour, and there’s a much greater degree of contrast and variation in those areas.&lt;/p&gt;

&lt;figure&gt;
	&lt;img src=&quot;/img/bellingcat/sandersville-764-aligned.png&quot; alt=&quot;main&quot; /&gt; (Google Earth Engine/Landsat 8)
&lt;/figure&gt;

&lt;!-- &lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bellingcat/fall-line.png&quot;/&gt;&lt;br&gt;
	map of the &apos;fall line&apos; of clay deposits in Georgia (image: Georgia Mining Association)
&lt;/span&gt; --&gt;

&lt;p&gt;What we are looking at are lakes with high deposits of &lt;a href=&quot;https://en.wikipedia.org/wiki/Kaolinite&quot;&gt;kaolinite&lt;/a&gt; – a clay mineral – in Sandersville, Georgia. Kaolin clay is extracted from large open-cast mines for use in the paper, ceramics and coatings industries. It’s part of a geological feature known as the ‘fall line’, which passes diagonally through Georgia and separates the Piedmont plateau from the Coastal Plain. Hundreds of tons of kaolin clay are extracted from this area of Georgia each year.&lt;/p&gt;

&lt;!-- &lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bellingcat/kaolinite.jpg&quot; alt=&quot;vegetation health&quot;/&gt;
	a sample of kaolinite clay from Twiggs County, Georgia (&lt;a href=&quot;https://en.wikipedia.org/wiki/Kaolinite#/media/File:Kaolinite_from_Twiggs_County_in_Georgia_in_USA.jpg&quot;&gt;Wikipedia&lt;/a&gt;)
&lt;/span&gt;
 --&gt;
&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bellingcat/sky-spectrum.png&quot; /&gt;
	graph showing the spectral signature of blue sky, showing high reflectance in the blue part of the light spectrum (Wikipedia)
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;In order to match up what we’re seeing in the satellite image to the material we are interested in, we want to look at what’s called the ‘spectral signature’ of kaolinite. A spectral signature is a graph that shows, for a particular material, what proportion of different wavelengths of light you would expect to be reflected. For example, the ‘spectral signature’ of ‘blue sky’ shows a lot of light around the ‘blue light’ wavelength: far less of the other wavelengths reaches your eyes, so when you are looking at the sky, it’s mostly blue light arriving – resulting in a blue colour.&lt;/p&gt;

&lt;p&gt;Below is the reflectance spectrum of kaolinite, marked in red, with the relevant Landsat bands marked in navy blue overlaid. This image was made using the &lt;a href=&quot;https://landsat.usgs.gov/spectral-characteristics-viewer&quot;&gt;USGS Spectral Characteristics Viewer&lt;/a&gt;, which allows you to map the reflectance spectrum of a mineral of interest against the bands of different satellite sensors, including the Landsat and Sentinel satellites.&lt;/p&gt;

&lt;figure&gt;
	&lt;img src=&quot;/img/bellingcat/kaolinite-reflectance.png&quot; alt=&quot;main&quot; /&gt;
&lt;/figure&gt;

&lt;p&gt;The reason that kaolinite looks white in the first two images is that bands 2, 3, 4, 5 and 6 all have a high reflectance – so when those bands are slotted into the R,G,B channels of an image, they are all uniformly bright. However, we can see that the reflectance dips around 1.4μm, and that it drops off sharply after a wavelength of 2μm. Thus – much less light is reflected in band 7 (SWIR 2), meaning that in the 7, 6, 4 image, the G and B channels are bright but R is dark, hence cyan.&lt;/p&gt;

&lt;h2 id=&quot;band-ratios&quot;&gt;band ratios&lt;/h2&gt;

&lt;p&gt;As you might have noticed, the band comparisons are a bit of a blunt tool. We have to make images with several different bands before we notice the difference between them. This brings us to the next technique: band ratios. Band ratios are a type of index, a way of combining different bands together to highlight things like vegetation, moisture and different rock types. Typically, the word ‘ratio’ refers to a measure determined by dividing one satellite imaging band by another, in order to highlight materials whose reflectance is most different in those bands.&lt;/p&gt;

&lt;p&gt;As shown in the previous example, kaolin clay reflects the most light in the NIR band (5) and absorbs the most light in the SWIR-2 band (7). We can look for the contrast between those bands using the 5/7 band ratio. By applying this band ratio, anything reflecting a lot of light in band 5 but absorbing a lot of light in band 7 will appear very bright, with materials with other absorption spectra appearing dark. This band ratio highlights a number of bright white spots in the images.&lt;/p&gt;

&lt;figure&gt;
	&lt;img src=&quot;/img/bellingcat/sandersville-kaolinite.png&quot; alt=&quot;main&quot; /&gt; (Google Earth Engine/Landsat 8)
&lt;/figure&gt;
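
&lt;p&gt;In Earth Engine, a band ratio is a single division over the scaled reflectance bands – a sketch, continuing from the snippets above (the display range is illustrative):&lt;/p&gt;

&lt;pre&gt;&lt;code class=&quot;language-javascript&quot;&gt;// anything bright in NIR (band 5) and dark in SWIR-2 (band 7) lights up
var ratio57 = sr.select('SR_B5').divide(sr.select('SR_B7'));
Map.addLayer(ratio57, {min: 0.5, max: 2.5}, '5/7 ratio');&lt;/code&gt;&lt;/pre&gt;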

&lt;p&gt;It would be tempting to conclude that this is where the kaolin mines are, but checking the location of active mining pits against the location of the white spots, it seems that, while they contain some bright patches, the mining areas are not the brightest parts of the image.&lt;/p&gt;

&lt;figure&gt;
	&lt;img src=&quot;/img/bellingcat/sandersville-mines.png&quot; alt=&quot;main&quot; /&gt; (Google Earth Engine/Landsat 8)
&lt;/figure&gt;

&lt;p&gt;If we zoom into one of these mining areas – marked as Southeastern Performance Minerals LLC on Google Maps – we can see in detail what is being highlighted. Note that the two Google Earth Engine images appear pixellated at this scale, but the Google Maps image on the right is much higher resolution. That’s because Google Maps composites many different sources of imagery, including private data, to make its maps.&lt;/p&gt;

&lt;figure&gt;
	&lt;div class=&quot;subfigthird&quot;&gt;
		&lt;img src=&quot;/img/bellingcat/deepstep-764.png&quot; alt=&quot;april&quot; /&gt;
		&lt;span class=&quot;mainnote&quot;&gt;in the 7,6,4 image&lt;/span&gt;
	&lt;/div&gt;
	&lt;div class=&quot;subfigthird&quot;&gt;
		&lt;img src=&quot;/img/bellingcat/deepstep-57.png&quot; alt=&quot;august&quot; /&gt;
		&lt;span class=&quot;mainnote&quot;&gt;in the 5/7 band ratio&lt;/span&gt;
	&lt;/div&gt;
	&lt;div class=&quot;subfigthird&quot;&gt;
		&lt;img src=&quot;/img/bellingcat/deepstep-circled-lakes.png&quot; alt=&quot;august&quot; /&gt;
		&lt;span class=&quot;mainnote&quot;&gt;with salient features circled&lt;/span&gt;
	&lt;/div&gt;
&lt;/figure&gt;

&lt;p&gt;Zooming in further on Google Maps, the features most highlighted appear to be either lakes or lake beds. These may have a high kaolin content (and may well be tailings lakes from mines!) but they aren’t exactly what we were looking for – so what happened?&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bellingcat/deepstep-lake.png&quot; /&gt;
	two of the areas most brightly highlighted by the 5/7 band ratio (Google Earth Engine/Landsat 8)
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;If we look at the spectral signature for &lt;em&gt;turbid&lt;/em&gt; water containing clay minerals in the USGS tool (blue line), we can see that even though these areas have much lower reflectance overall, because they reflect almost no light in the SWIR-2 band, they have a much more extreme 5/7 ratio than kaolinite itself. We can also see this in the 7, 6, 4 image – they appear a much deeper shade of cyan, rather than bright! They just &lt;em&gt;really&lt;/em&gt; don’t contain a red component.&lt;/p&gt;

&lt;figure&gt;
	&lt;img src=&quot;/img/bellingcat/kaolin-turbid.png&quot; alt=&quot;main&quot; /&gt;
	(Google Earth)
&lt;/figure&gt;

&lt;p&gt;So - how do we see kaolin? When looking up the correct band ratio for kaolin (and clay minerals generally), most sources suggest &lt;a href=&quot;https://pro.arcgis.com/en/pro-app/latest/arcpy/spatial-analyst/clayminerals.htm&quot;&gt;6/7&lt;/a&gt; rather than 5/7. If we look at the spectral signatures above, we can see that this will also resolve our issue with turbid water, as the signature is low for both bands 6 and 7 in this instance. However, imaging the area using this ratio initially seems quite disappointing.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bellingcat/wrens-mine.png&quot; /&gt;
	kaolin highlights with the 6/7 ratio show up much more on the kaolin mine in Wrens, Georgia, which is a couple of mines Northwest of the Sandersville site (Google Earth Engine/Landsat 8)
&lt;/span&gt;&lt;/p&gt;

&lt;figure&gt;
	&lt;img src=&quot;/img/bellingcat/kaolin-new-67.png&quot; alt=&quot;main&quot; /&gt;
	applying the 6/7 band ratio to the Sandersville kaolin mines and lakes (Google Earth Engine/Landsat 8)
&lt;/figure&gt;

&lt;p&gt;However, what we notice is that areas that were uniformly white in all the initial images are now much more highly contrasting – which, if we are interested in mineral differences &lt;em&gt;within&lt;/em&gt; the mining areas (rather than purely detecting them in the first place), can give us lots of information. To further highlight these differences, we can use this ratio as part of a false-colour image with other band ratios in which kaolin has a high contrast, derived from the spectral signature above (a code sketch follows the list below).&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;(5/7) → Red&lt;/li&gt;
  &lt;li&gt;Clay minerals ratio (6/7) → Green&lt;/li&gt;
  &lt;li&gt;(6/1) → Blue&lt;/li&gt;
&lt;/ul&gt;
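
&lt;p&gt;As a rough sketch (continuing from the earlier snippets, with illustrative display values), the three ratios can be stacked into a single three-band image, whose bands are then displayed in the R, G and B channels:&lt;/p&gt;

&lt;pre&gt;&lt;code class=&quot;language-javascript&quot;&gt;// stack the three ratios into one image; with no bands specified,
// the first three bands are displayed as R, G and B
var clayRatios = ee.Image.cat([
  sr.select('SR_B5').divide(sr.select('SR_B7')),  // red: 5/7
  sr.select('SR_B6').divide(sr.select('SR_B7')),  // green: clay minerals ratio 6/7
  sr.select('SR_B6').divide(sr.select('SR_B1'))   // blue: 6/1
]).rename(['r57', 'r67', 'r61']);
Map.addLayer(clayRatios, {min: 0.5, max: 2.5}, 'clay ratio composite');&lt;/code&gt;&lt;/pre&gt;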

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bellingcat/hymap-st-austell.png&quot; /&gt;
	a diagram from the St Austell paper, showing the end result of a detailed analysis of different rock types and spectra
&lt;/span&gt;&lt;/p&gt;

&lt;figure&gt;
	&lt;img src=&quot;/img/bellingcat/clay-ratios.png&quot; alt=&quot;main&quot; /&gt;
	(Google Earth Engine/Landsat 8)
&lt;/figure&gt;

&lt;p&gt;Although the areas potentially containing kaolin are not necessarily the brightest (as other minerals might have higher contrasts for one or other of these ratios) we can see that they contain considerably more variation, allowing us to see much more detailed differences within potential mining and tailings areas.&lt;/p&gt;

&lt;p&gt;In practice, this method is still very rough. Within geological surveying, most pattern-spotting uses considerably more sophisticated techniques to highlight differences in reflectance – a common one is &lt;a href=&quot;https://en.wikipedia.org/wiki/Principal_component_analysis&quot;&gt;Principal Component Analysis&lt;/a&gt;, a mathematical approach that captures the largest variation in a given set of data – alongside a thorough analysis of the geology of the region. &lt;a href=&quot;https://attachments.are.na/33500524/ecf330c61b860bec4dda8e985489a407.pdf?1736326607&quot;&gt;This paper&lt;/a&gt;, which uses the &lt;a href=&quot;https://airbornescience.nasa.gov/instrument/HyMap&quot;&gt;HyMap&lt;/a&gt; hyperspectral&lt;label for=&quot;hyperspectral&quot; class=&quot;margin-toggle sidenote-number&quot;&gt;&lt;/label&gt; &lt;span class=&quot;sidenote&quot; id=&quot;hyperspectral&quot;&gt;hyperspectral imaging is distinct from multispectral imaging in that it uses hundreds, rather than tens, of bands, giving a much higher spectral resolution. The HyMap sensor is flown on a light aircraft above the site of interest, though there are other hyperspectral sensors (like NASA’s Hyperion sensor on board the &lt;a href=&quot;https://en.wikipedia.org/wiki/Earth_Observing-1&quot;&gt;EO-1 satellite&lt;/a&gt;)&lt;/span&gt; imaging sensor to image a series of kaolinite pits in St Austell, Cornwall, gives a sense of how these techniques are used in practice.&lt;/p&gt;

&lt;h3 id=&quot;mining-tailings-in-picher-oklahoma&quot;&gt;mining tailings in Picher, Oklahoma&lt;/h3&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bellingcat/picher-rgb.png&quot; alt=&quot;google maps screenshot&quot; /&gt;true-colour satellite image of chat piles in Picher, Oklahoma (Google Earth Engine/Landsat 8)
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bellingcat/picher-clays.png&quot; alt=&quot;google maps screenshot&quot; /&gt;the same area with the kaolinite band ratio comparison applied (meh) (Google Earth Engine/Landsat 8)
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;Compare the kaolin lakes to piles of ‘chat’ – the by-products of lead and zinc mining – in Picher, Oklahoma. Picher lies in the centre of the Tri-state Mining District, a part of the federally-designated Tar Creek Superfund site near the Oklahoma/Missouri/Kansas border. Picher was subject to deregulated lead and zinc mining for more than 100 years, with huge piles of toxic chat – typically limestone, dolomite and silica – left on the town’s surface. The impact on the local area was devastating, with over 30% of the town’s children in a 1994 study suffering from lead poisoning. Despite the site being granted Superfund status 30 years ago, efforts to undo the damage from mining have been slow, with &lt;a href=&quot;https://cumulis.epa.gov/supercpad/SiteProfiles/index.cfm?fuseaction=second.cleanup&amp;amp;id=0601269&quot;&gt;no date projected&lt;/a&gt; for when the site might be reusable.&lt;/p&gt;

&lt;p&gt;When we look at the RGB image of Picher, these piles of chat look very similar in colour to the kaolinite pits in our first image, and also quite uniform. Using the kaolinite ratio, the piles are distinct from their surroundings, but still appear quite uniform and undifferentiated.&lt;/p&gt;

&lt;p&gt;Unlike Sandersville, the rock here does not contain large amounts of clay minerals; instead it is a mixture of limestone and dolomite, within which lead and zinc ores are present. If we want to look for these rocks specifically, we can use a different set of ratios.&lt;/p&gt;

&lt;p&gt;Geologists Sabreen Gad and Timothy Kusky developed sets of band ratios to differentiate between various rock types. In this example, we use the band ratio combination (S2/S1, S1/NIR, R/B) (i.e. 7/6, 6/5, 4/2), from &lt;a href=&quot;https://www.sciencedirect.com/science/article/abs/pii/S1464343X09001071&quot;&gt;this paper&lt;/a&gt;, to highlight differences in the image. We can now see lots of detail in areas that previously appeared uniform, particularly around the edges of the chat piles.&lt;/p&gt;

&lt;figure&gt;
	&lt;img src=&quot;/img/bellingcat/gad-kusky-picher.png&quot; alt=&quot;google maps screenshot&quot; /&gt;Picher, Oklahoma, with Gad and Kusky&apos;s band ratio applied (Google Earth Engine/Landsat 8)
&lt;/figure&gt;
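
&lt;p&gt;This combination can be sketched the same way as the clay-ratio composite above (band choices per the paper; display values illustrative):&lt;/p&gt;

&lt;pre&gt;&lt;code class=&quot;language-javascript&quot;&gt;// Gad and Kusky-style ratios: (SWIR2/SWIR1, SWIR1/NIR, red/blue)
var gadKusky = ee.Image.cat([
  sr.select('SR_B7').divide(sr.select('SR_B6')),  // red: 7/6
  sr.select('SR_B6').divide(sr.select('SR_B5')),  // green: 6/5
  sr.select('SR_B4').divide(sr.select('SR_B2'))   // blue: 4/2
]).rename(['r76', 'r65', 'r42']);
Map.addLayer(gadKusky, {min: 0.5, max: 2}, 'Gad and Kusky ratios');&lt;/code&gt;&lt;/pre&gt;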

&lt;h2 id=&quot;using-band-ratios-to-track-bauxite-mining&quot;&gt;using band ratios to track bauxite mining&lt;/h2&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bellingcat/bauxite-rock.JPG&quot; alt=&quot;google maps screenshot&quot; /&gt;a piece of red bauxite rock (Wikipedia)
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;Bauxite is a sedimentary rock with a high concentration of aluminium minerals, and is the basis for over 99% of aluminium production worldwide. Surging demand for aluminium in recent years has driven massive expansions of bauxite mining operations, at the expense of the areas from which it is extracted, and to the profit of a small number of giant mining conglomerates – the most prominent including Alcan, Alcoa and Chinalco. Increasingly, bauxite mining operations are extending into indigenous land, and are associated with deforestation, changes to hydrology and the displacement of communities.&lt;/p&gt;

&lt;p&gt;Bauxite is extracted from large open-pit mines, and dust from the industry often spreads to cover surrounding areas; the industry is widely considered an ecological disaster. A key by-product of bauxite extraction is ‘red mud’, a highly alkaline and polluting slurry that contains large amounts of iron oxide, as well as aluminium oxide components, and is usually kept in large, toxic ‘tailings lakes’ close to processing facilities.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bellingcat/red-mud-germany.jpg&quot; alt=&quot;google maps screenshot&quot; /&gt;red mud in a tailings lake in Germany (Wikipedia)
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;In addition to the aluminium minerals gibbsite and boehmite, bauxite also contains high quantities of ferric iron oxide minerals goethite and hematite, which give it its characteristic reddish colour.&lt;/p&gt;

&lt;p&gt;To find bauxite mines, it’s typical not to look directly for the spectral signature of alumina, which is harder to see using the bands we have available. Instead, we can use band ratios for other components of bauxite – in this instance, ferric iron oxides and kaolinite. A three-part band comparison lets us identify the presence of these different minerals.&lt;/p&gt;

&lt;p&gt;Instead of just looking for one of these at a time, we can combine them – as in the false-colour band comparisons used above. In this instance, I’ve used two ratios that highlight ferric iron oxides, to differentiate the areas with similar reflectance characteristics that are highlighted by one or the other.&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;Ferric iron oxide band ratio 1 (4/2) → Red&lt;/li&gt;
  &lt;li&gt;Ferric iron oxide band ratio 2 (4/3) → Green&lt;/li&gt;
  &lt;li&gt;Kaolinite band ratio (6/7) → Blue (note: this ratio is also listed for &lt;em&gt;laterite&lt;/em&gt;, another component of bauxite)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;These were inspired by the approach taken in &lt;a href=&quot;https://www.mdpi.com/2673-4605/5/1/91&quot;&gt;this paper&lt;/a&gt;, which uses Sentinel rather than Landsat imagery, but also uses ferric and kaolinitic minerals to help spot bauxites.&lt;/p&gt;
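
&lt;p&gt;The same pattern gives a sketch of this bauxite composite (continuing from the earlier snippets; display values illustrative):&lt;/p&gt;

&lt;pre&gt;&lt;code class=&quot;language-javascript&quot;&gt;var bauxiteRatios = ee.Image.cat([
  sr.select('SR_B4').divide(sr.select('SR_B2')),  // red: ferric iron oxide 1 (4/2)
  sr.select('SR_B4').divide(sr.select('SR_B3')),  // green: ferric iron oxide 2 (4/3)
  sr.select('SR_B6').divide(sr.select('SR_B7'))   // blue: kaolinite/laterite (6/7)
]).rename(['r42', 'r43', 'r67']);
Map.addLayer(bauxiteRatios, {min: 0.5, max: 2.5}, 'bauxite ratios');&lt;/code&gt;&lt;/pre&gt;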

&lt;h3 id=&quot;bauxite-mining-in-jamaica&quot;&gt;bauxite mining in Jamaica&lt;/h3&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bellingcat/jamaica-mine-stann.jpg&quot; alt=&quot;google maps screenshot&quot; /&gt;a bauxite mine in Jamaica’s St Ann Parish (&lt;a href=&quot;https://jamentrust.org/&quot;&gt;JAMEN trust&lt;/a&gt;)
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;To test the efficacy of these ratio combinations, we can look first at an area with a well-documented bauxite mining industry. Bauxite has been mined in Jamaica for over 80 years, causing widespread ecological and social damage, and &lt;a href=&quot;https://jamentrust.org/download/jet-red-dirt-book&quot;&gt;most recently threatening&lt;/a&gt; the ecologically sensitive Cockpit Country region, which had previously been protected.&lt;/p&gt;

&lt;p&gt;When we compare an RGB image of Jamaica to one made using this three-band combination, we can see areas of bauxite mining highlighted very clearly in bright yellow, even when zoomed far out. This band combination can be really useful for very quickly locating mining areas, and even just exposed areas of soil in locations with a high concentration of iron ore.&lt;/p&gt;

&lt;figure class=&quot;fullwidth&quot;&gt;
	&lt;div class=&quot;subfig&quot;&gt;
		&lt;img src=&quot;/img/bellingcat/jamaica-rgb.png&quot; alt=&quot;april&quot; /&gt;
		&lt;span class=&quot;mainnote&quot;&gt;true colour satellite image of central Jamaica (Google Earth Engine/Landsat 8)&lt;/span&gt;
	&lt;/div&gt;
	&lt;div class=&quot;subfig&quot;&gt;
		&lt;img src=&quot;/img/bellingcat/jamaica-bauxite.png&quot; alt=&quot;august&quot; /&gt;
		&lt;span class=&quot;mainnote&quot;&gt;false colour image highlighting bauxite mines (Google Earth Engine/Landsat 8)&lt;/span&gt;
	&lt;/div&gt;
&lt;/figure&gt;

&lt;p&gt;Compared to the RGB image, the most obvious features are still easy to see, while much of the landscape is rendered far more subtly. If we zoom in to an area in Manchester Parish, we can also see that this enhancing effect gives us a very clear outline of the areas exposed by forest mining operations.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bellingcat/manchester-parish.png&quot; alt=&quot;main&quot; /&gt;
	forest mines in Manchester Parish, highlighted using the bauxite band comparison (Google Earth Engine/Landsat 8)
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;To check that the mining sites we have found are what they seem to be, it’s possible to cross-reference government-issued mining licenses using the USGS Mineral Resources map. This is most accurate within the United States, but contains information about mines worldwide. Wikimapia also records mines on an infrastructure layer, though it typically lists fewer sites.&lt;/p&gt;

&lt;figure&gt;
	&lt;img src=&quot;/img/bellingcat/jamaica-usgs.png&quot; alt=&quot;main&quot; /&gt;
	&lt;span class=&quot;mainnote&quot;&gt;A screenshot of Jamaica rendered on the &lt;a href=&quot;https://mrdata.usgs.gov/general/map-global.html&quot;&gt;USGS Mineral Resources Online Spatial Data&lt;/a&gt; map tool, showing mines (past and present) in red, prospects in green and processing plants in blue. It’s possible to see right away that the bright yellow areas in the south centre of the satellite image match up with the large number of mines south of Mandeville on the USGS map.&lt;/span&gt;
&lt;/figure&gt;

&lt;p&gt;For a much fuller report on bauxite mining’s past, future and causes in the region, Jamaica Environmental Trust produced a research report entitled &lt;a href=&quot;https://jamentrust.org/download/jet-red-dirt-book&quot;&gt;Red Dirt&lt;/a&gt;. In particular, Chapter 5 – Degradation of Ecological Heritage, by ecologist Susan Koenig – provides a detailed overview and analysis of the close relationship between geology and ecology in the region, and the long-term impact of surface mining on soil and air quality and the karst water cycle.&lt;/p&gt;

&lt;h3 id=&quot;kuantan-mining-disaster&quot;&gt;Kuantan mining disaster&lt;/h3&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bellingcat/kuantan.gif&quot; /&gt;&lt;br /&gt;
	timelapse of spreading and disappearing bauxite mines in Kuantan, Malaysia. Note – the green artefacts in the image are due to cloud cover; the striping is due to the edges of Landsat tiles. (Google Earth Engine/Landsat 7)
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;With this band combination, it’s possible to see changes in other mining areas. Consider the following timelapse, showing changes to the Bukit Goh bauxite mine in the Malaysian district of Kuantan between 2012 and 2023. Bukit Goh lies at the epicentre of the &lt;a href=&quot;https://www.bbc.co.uk/news/world-asia-35340528&quot;&gt;2015-16 Kuantan bauxite mining disaster&lt;/a&gt;, which saw deregulated mining operations tear through farmland, spread polluting bauxite dust over roads, and severely pollute waterways. Changes to neighbouring Indonesia’s bauxite export legislation in 2014 caused demand in the region to skyrocket, with Malaysian bauxite production increasing 100-fold from 200,000 to 20 million tonnes between 2013 and 2015. Again, the areas mined or affected by dust pollution are highlighted in bright yellow, with healthy vegetation appearing as a darker blue.&lt;/p&gt;

&lt;figure&gt;
	&lt;div class=&quot;subfig&quot;&gt;
		&lt;img src=&quot;/img/bellingcat/kuantan-2015.png&quot; alt=&quot;april&quot; /&gt;
		&lt;span class=&quot;mainnote&quot;&gt;image of a bauxite mine near Kuantan 2015 (Google Earth Engine/Landsat 8)&lt;/span&gt;
	&lt;/div&gt;
	&lt;div class=&quot;subfig&quot;&gt;
		&lt;img src=&quot;/img/bellingcat/kuantan-2023.png&quot; alt=&quot;august&quot; /&gt;
		&lt;span class=&quot;mainnote&quot;&gt;the same mine covered with topsoil in 2023 (Google Earth Engine/Landsat 8)&lt;/span&gt;
	&lt;/div&gt;
&lt;/figure&gt;

&lt;p&gt;In the immediate aftermath of the 2015-16 disaster, the Malaysian government stopped issuing new mining licences and ordered the remediation of damaged land. We can see that between 2017 and 2019 some of the mined land was covered with topsoil. More recently, however – possibly following the &lt;a href=&quot;https://www.reuters.com/article/malaysia-bauxite/malaysia-to-issue-bauxite-mining-licences-by-january-after-ban-lifted-idUSL3N27K0QP/&quot;&gt;2019 reissuing of mining licences&lt;/a&gt; to bauxite companies by the Malaysian government – we can see new areas once more expanding toward the bottom left of the timelapse.&lt;/p&gt;

&lt;h3 id=&quot;bauxite-mining-in-west-kalimantan&quot;&gt;bauxite mining in West Kalimantan&lt;/h3&gt;

&lt;p&gt;We can use the same technique to look at bauxite mines in Indonesia, another country that has seen massive mining-based land grabs over the past two decades. One particularly affected area surrounds the Kapuas river in West Kalimantan.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bellingcat/tayan-small.gif&quot; alt=&quot;main&quot; /&gt;
	timelapse of pausing then spreading bauxite mine expansion around the Kapuas river, near the village of Tayan, Indonesia (Google Earth Engine/Landsat 8)
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;A &lt;a href=&quot;https://www.iss.nl/sites/corporate/files/CMCP_81_Pye_et_al.pdf&quot;&gt;2015 study&lt;/a&gt; of land grabs in the region directly linked the drying of a lake to bauxite mining expansion in 2013-14, and named bauxite extraction as part of a linked set of extractive industries encroaching on indigenous peoples’ livelihoods. While bauxite extraction slowed after Indonesia’s 2014 policy to end exports of the mineral, local people have been offered little other than financial compensation (‘dust money’) for the damage done, with mines still expanding in some areas, and little to no remediation with topsoil.&lt;/p&gt;

&lt;p&gt;If we take a timelapse of an area around the Kapuas river (the part of the image above with a concentration of yellow-white patches) between 2013 and 2024, we can see that there was barely any expansion of the yellow-white spots indicating possible bauxite mines between 2014 and 2018. This coincides with a 2014 ban by the Indonesian government on the export of certain raw materials, including bauxite. The spots then begin to expand again after 2018.&lt;/p&gt;
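
&lt;p&gt;One way to build the frames for a timelapse like this in Earth Engine is to map over a list of years, taking a median composite for each – a rough sketch rather than necessarily how the gifs in this piece were made; the point and the compositing recipe are illustrative:&lt;/p&gt;

&lt;pre&gt;&lt;code class=&quot;language-javascript&quot;&gt;var region = ee.Geometry.Point([110.1, 0.0]);  // near Tayan, West Kalimantan (approximate)
var years = ee.List.sequence(2013, 2024);

// one median surface-reflectance composite per year
var frames = ee.ImageCollection.fromImages(years.map(function(y) {
  var start = ee.Date.fromYMD(y, 1, 1);
  return ee.ImageCollection('LANDSAT/LC08/C02/T1_L2')
    .filterBounds(region)
    .filterDate(start, start.advance(1, 'year'))
    .select('SR_B.')
    .median()                          // the median softens cloud artefacts
    .multiply(0.0000275).add(-0.2)     // scale to surface reflectance
    .set('year', y);
}));&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;Each frame can then be given a ratio treatment like the one above and exported (for example with &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;Export.video.toDrive&lt;/code&gt;) to produce an animation.&lt;/p&gt;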

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bellingcat/bauxite-indonesia.png&quot; /&gt;&lt;br /&gt;
	chart of changing Indonesian bauxite production, 2014-2023 (data from &lt;a href=&quot;https://www.usgs.gov/centers/national-minerals-information-center/bauxite-and-alumina-statistics-and-information&quot;&gt;USGS mineral commodity summaries&lt;/a&gt;)
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;This timelapse mirrors the &lt;a href=&quot;https://www.usgs.gov/centers/national-minerals-information-center/bauxite-and-alumina-statistics-and-information&quot;&gt;USGS mineral commodity summaries&lt;/a&gt; figures for Indonesia’s bauxite production during that timeframe: a significant drop following the export ban in 2014, followed by an increase in subsequent years.&lt;/p&gt;

&lt;p&gt;In addition to the USGS map referenced before, we can use a number of different sources to cross-check these images. First of all, the &lt;a href=&quot;https://nusantara-atlas.org/&quot;&gt;Nusantara Atlas&lt;/a&gt; project works to record major drivers of deforestation in the region, producing high-quality satellite maps of land use, cross-referenced with government issued licenses.&lt;/p&gt;

&lt;p&gt;The Indonesian government also maintains a &lt;a href=&quot;https://geoportal.esdm.go.id/minerba/&quot;&gt;detailed interactive map&lt;/a&gt; of mining concessions in the region, allowing us to check not only which minerals are being extracted, but also which companies (often local subsidiaries of major multinational mining companies) hold mining licenses for different areas. This is partly possible with the USGS tool, but the government tool gives a far more detailed picture, showing concession areas and different forms of license. The following screenshot shows the Tayan Hilir region from the second gif, with each blue diamond representing a bauxite (‘bauksit’ in Indonesian) deposit.&lt;/p&gt;

&lt;figure&gt;
	&lt;img src=&quot;/img/bellingcat/tayan-minerals.png&quot; alt=&quot;main&quot; /&gt;
	(Google Earth Engine/Landsat 8)
&lt;/figure&gt;

&lt;p&gt;What band ratios give us in each of these cases is a tool to highlight change far beyond what could be seen using visible-range images. As a use-case, mining gives us a particular set of cross-referencing tools, including both geophysical and economic tools more commonly used for prospecting.&lt;/p&gt;

&lt;h3 id=&quot;making-use-of-other-indices&quot;&gt;making use of other indices&lt;/h3&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bellingcat/landtrendr.png&quot; alt=&quot;main&quot; /&gt;
	diagram from Han et al.’s paper outlining the application of the LandTrendr algorithm and NDVI
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;While geological band ratios can be very useful by themselves for clearly delineating different mineral areas, they may also be used in conjunction with other satellite imaging indices to make causal analyses about the effects of mining on the landscape. Two popular indices are the NDVI and NDWI – the &lt;a href=&quot;https://en.wikipedia.org/wiki/Normalized_difference_vegetation_index&quot;&gt;Normalised Difference Vegetation Index&lt;/a&gt; and the &lt;a href=&quot;https://en.wikipedia.org/wiki/Normalized_difference_water_index&quot;&gt;Normalised Difference Water Index&lt;/a&gt; – which give information about vegetation health and moisture levels respectively.&lt;/p&gt;
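
&lt;p&gt;Both are easy to compute in Earth Engine via the built-in normalised difference helper – a minimal sketch, again assuming the scaled surface reflectance image &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;sr&lt;/code&gt; from the earlier snippets. Note that several definitions of the NDWI exist; the green/NIR variant is shown here:&lt;/p&gt;

&lt;pre&gt;&lt;code class=&quot;language-javascript&quot;&gt;// NDVI = (NIR - red) / (NIR + red): Landsat 8 bands 5 and 4
var ndvi = sr.normalizedDifference(['SR_B5', 'SR_B4']).rename('NDVI');

// NDWI (green/NIR variant) = (green - NIR) / (green + NIR): bands 3 and 5
var ndwi = sr.normalizedDifference(['SR_B3', 'SR_B5']).rename('NDWI');

Map.addLayer(ndvi, {min: -1, max: 1}, 'NDVI');
Map.addLayer(ndwi, {min: -1, max: 1}, 'NDWI');&lt;/code&gt;&lt;/pre&gt;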

&lt;p&gt;These indices can link mining to the broader political ecology of a region, helping us to understand its disparate impacts on hydrology, soil health, and agricultural viability. They should be used thoughtfully, however – for example, both indices change with the seasons, and the NDWI is also strongly affected by cloud cover (this makes analysis of areas affected by bauxite mining particularly challenging, as most fall within the warm, wet and cloudy tropics).&lt;/p&gt;

&lt;p&gt;&lt;a href=&quot;https://www.tandfonline.com/doi/epdf/10.1080/15481603.2021.1996319?needAccess=true&quot;&gt;This 2021 paper&lt;/a&gt; by Han et al. gives a well-explained example of using the NDVI within a robust time-series system to track vegetation degradation and recovery in reclaimed mining areas (their focus is on areas around Beijing, but the technique applies more generally).&lt;/p&gt;

&lt;h3 id=&quot;limitations&quot;&gt;limitations&lt;/h3&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bellingcat/bauxite-mineralogy.png&quot; alt=&quot;google maps screenshot&quot; /&gt;bauxite mineralogy diagram, showing balance between ferrous and clay minerals
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;All of these examples use very simplified models of the geology, which glosses over variation that matters in remote sensing work. Within ‘bauxite’ as a class of mineral, for example, there is a huge amount of variation: between carbonate bauxites (including Jamaica’s) and lateritic bauxites (including Malaysia’s and Indonesia’s), a distinction that reflects the rock body from which the bauxite formed; in the proportion and distribution of kaolins vs laterites; and in the profile of deposits in the region. This detail is beyond the scope of this article; however, &lt;a href=&quot;https://attachments.are.na/33479319/b1d6d19cc28eed7354685f7a1ec341df.pdf?1736271075&quot;&gt;this paper&lt;/a&gt; on the mineral classification of bauxite deposits in the Tayan region gives a good overview of the different kinds of analysis that are applied.&lt;/p&gt;

&lt;p&gt;Likewise, areas with similar concentrations of minerals will be highlighted by similar ratios – because of the reliance on iron oxides to highlight soil potentially containing bauxite, iron mines are also picked out by this band comparison. If we wanted to differentiate them, we would want to look for minerals contained in one instance and not the other.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bellingcat/tayan-profile.png&quot; alt=&quot;google maps screenshot&quot; /&gt;diagram of bauxite profiles in the Tayan region, from the Tayan mineral classification paper
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;A second limitation on geological indices is vegetation: where rock is covered rather than exposed, its reflectance spectrum is not visible to satellites, and it’s often easier to make inferences about rock types using variation in the vegetation growing above them as a clue.&lt;/p&gt;

&lt;p&gt;Another key limitation on satellite imaging work is the presence of clouds, which can be particularly misleading in false-colour images, where they’re far less easy to recognise. A common ‘sanity check’ is to switch back to the true colour image, to confirm that what you’re seeing is something that’s actually on the ground.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bellingcat/clouds-aligned.png&quot; alt=&quot;main&quot; /&gt;
	artefacts in a river in the Amazonas basin turn out just to be clouds when examined in RGB… (Google Earth Engine/Landsat 8)
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;Lastly, when using satellite imagery it can be tempting to draw premature conclusions about the meaning of different zones and artefacts. For example, one might note that an area appears to have very high concentrations of ferrous sulphate, because the wavebands associated with FeSO4 are bright in images tuned to detect those wavelengths. However, a range of other minerals might follow a similar reflectance pattern. One way around this is to match up with what is on the ground, or to use the imagery as one indicator in conjunction with other evidence. Rarely, if ever, is satellite imaging alone enough to confirm or deny the presence of a substance, but it can be a very useful tool for providing clues, and for demonstrating changes over time.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bellingcat/idb-bands-2.png&quot; alt=&quot;main&quot; /&gt;
	IndexDB’s list of sensors by bands, showing where different satellite imaging sensors collect light
&lt;/span&gt;&lt;/p&gt;

&lt;h2 id=&quot;finding-and-using-indices&quot;&gt;finding and using indices&lt;/h2&gt;

&lt;p&gt;There are a number of tools available that can help determine useful band combinations and ratios, and indices for a given application.&lt;/p&gt;

&lt;p&gt;The website IndexDB is a fantastic resource for finding possible indices and band ratios. IndexDB maps between satellite imaging datasets (referred to there as ‘sensors’, denoting the instrument used to capture the data), indices (including band ratios), and ‘applications’, which refer to families of indices that may be used in a particular analysis.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bellingcat/idb-metals.png&quot; alt=&quot;main&quot; /&gt;
	search results for the application ‘Heavy Metals Contamination’, showing possible indices and sensors with relevant bands (IndexDB)
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;For example – if I am interested in looking at heavy metals pollution in topsoil, I can select this application, and see a list of relevant indices. Choosing one of these (e.g. Fe2O3) will bring up a list of sensors with bands suitable for calculating the relevant ratios. Selecting one of the listed sensors will then bring up the calculation with the relevant bands for that dataset, along with references to literature.&lt;/p&gt;

&lt;p&gt;One thing you might notice about IndexDB is the high number of sensors listed. The list is quite exhaustive, and doesn’t always give a clear idea of other factors relevant to use in open-source investigations, such as timeframe, spatial and temporal resolution, cost, availability and ease of use. One way to cross-reference this information is by using Google’s &lt;a href=&quot;https://developers.google.com/earth-engine/datasets/&quot;&gt;Earth Engine Data Catalog&lt;/a&gt;, which lists datasets available freely on the platform, with accompanying metadata that can serve as indicators for the usefulness of a dataset to a given application. Sometimes, however, this metadata can be a little misleading. NASA’s &lt;a href=&quot;https://developers.google.com/earth-engine/datasets/catalog/ASTER_AST_L1T_003&quot;&gt;ASTER&lt;/a&gt; dataset, for example, will often appear to be a great option, because of the large number of bands it contains, its good spatial resolution and its long timeframe. However, many of the most useful bands for geological work (namely the shortwave infrared bands) were only active from 2000 to 2008, meaning they cannot be used in contemporary analyses. If in doubt, &lt;a href=&quot;https://developers.google.com/earth-engine/datasets/catalog/landsat&quot;&gt;Landsats 7, 8 and 9&lt;/a&gt;, &lt;a href=&quot;https://developers.google.com/earth-engine/datasets/catalog/sentinel&quot;&gt;Sentinel 1 and 2&lt;/a&gt;, and &lt;a href=&quot;https://developers.google.com/earth-engine/datasets/catalog/modis&quot;&gt;MODIS&lt;/a&gt; are all good, freely-available starting datasets.&lt;/p&gt;

&lt;p&gt;Outside of the indices listed in IndexDB, information about whether and how a substance can be differentiated via band ratios can be found by searching Google Scholar for academic papers. It’s also possible to approximate your own band ratios, using tools such as the USGS spectral characteristics viewer mentioned earlier in the article.&lt;/p&gt;

&lt;p&gt;Often the choice of precise bands is a little more subtle than simply examining a single substance, with some bands chosen over others to avoid confusion with similar reflectance patterns. As such, it’s wise to complement experimentation in tools like the spectral characteristics viewer with academic references.&lt;/p&gt;

&lt;h2 id=&quot;extending-mineral-investigations&quot;&gt;extending mineral investigations&lt;/h2&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bellingcat/aster-anti-atlas.jpg&quot; alt=&quot;main&quot; /&gt;
	a detailed false-colour image of rock types in Morocco’s Anti-Atlas mountains, created using infrared light collected by the ASTER satellite (NASA)
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;For more complex lithological and geological information, where contemporary data is not as necessary, NASA’s ASTER sensor (on board the Terra satellite) collected detailed data in the shortwave infrared bands from 2000 to 2008 (after which those detectors unfortunately malfunctioned). Many of the examples in the open-source &lt;a href=&quot;https://github.com/rodreras/awesome-mining-band-ratio&quot;&gt;Awesome mining band ratio&lt;/a&gt; list relate to ASTER-based analysis.&lt;/p&gt;

&lt;p&gt;Much of the most precise contemporary research and investigative work makes use of band ratios and indices within the wider toolset of machine learning, and other mathematical techniques such as &lt;a href=&quot;https://en.wikipedia.org/wiki/Principal_component_analysis&quot;&gt;principal component analysis&lt;/a&gt;. Platforms such as Sentinel’s &lt;a href=&quot;https://eo-learn.readthedocs.io/en/latest/index.html&quot;&gt;EO-Learn&lt;/a&gt; contain a number of open-source machine learning models and examples, and Google Earth Engine also provides a rich toolset for training machine learning algorithms.&lt;/p&gt;

&lt;p&gt;A great starting tutorial to take these techniques further is Ollie Ballinger’s &lt;a href=&quot;https://bellingcat.github.io/RS4OSINT/&quot;&gt;Remote Sensing for OSINT&lt;/a&gt;, which includes a detailed case study on using machine learning techniques to identify oil refineries in Northwestern Syria. For an example of the same linear regression and machine learning techniques applied specifically to the context of mining pollution, &lt;a href=&quot;https://www.nature.com/articles/s41598-021-91103-8#Tab2&quot;&gt;this 2021 article&lt;/a&gt; uses eight spectral indices in total to infer the presence of heavy metals in topsoils in the Daxigou mining area in the Shaanxi province of China.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bellingcat/daxigou-heavy-metals.jpg&quot; alt=&quot;main&quot; /&gt;
	detailed spatial distributions of heavy metals pollution in Daxigou mining area, calculated from Landsat 8 imagery using machine learning techniques &lt;a href=&quot;https://www.nature.com/articles/s41598-021-91103-8&quot;&gt;source&lt;/a&gt;.
&lt;/span&gt;&lt;/p&gt;

&lt;h2 id=&quot;conclusion&quot;&gt;conclusion&lt;/h2&gt;

&lt;p&gt;Multispectral satellite imaging provides a broad and varied range of information about the Earth’s surface, including much that is invisible to the naked eye. By amplifying subtle differences in how materials reflect different kinds of light, we can make quite sophisticated inferences about a landscape’s ecological and geological properties. Like all forms of remote sensing, satellite imaging is one of a suite of tools, best used alongside knowledge of what is happening on the ground rather than as a replacement for it. Academic fields such as ecology and geology have long-standing techniques for using this information in a replicable and verifiable manner, and increasingly use tools such as machine learning to enhance these indices’ effectiveness.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bellingcat/vegetation-new-orleans.png&quot; alt=&quot;vegetation health&quot; /&gt;
	a Near Infrared band comparison image of the New Orleans bayou (Google Earth Engine/Landsat 8)
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;Ultimately, what these images provide us with is a set of lenses that may be applied to see a landscape differently. This extended form of vision is useful both in the direct identification of surface changes, such as the expansion of mines, the spread of pollutants, and changes to vegetation, but can also play a subtler role in articulating the mutual influence between political and ecological struggles.&lt;/p&gt;

&lt;p&gt;All of this code was written using the &lt;a href=&quot;https://earthengine.google.com/&quot;&gt;Google Earth Engine&lt;/a&gt; platform, and is available on Github here. GEE requires you to write code, but is a good option in terms of being free, well-documented and cloud-based (i.e. it doesn’t require you to download large datasets). An open-source desktop alternative is &lt;a href=&quot;https://www.qgis.org/&quot;&gt;QGIS&lt;/a&gt;, which does not require code (though it does mean you need to download the data locally). No-code platforms such as &lt;a href=&quot;https://browser.dataspace.copernicus.eu/&quot;&gt;Copernicus Browser&lt;/a&gt; also have presets for common indices like the NDVI and NDWI, and allow you to swap out different band combinations.&lt;/p&gt;

&lt;hr /&gt;

&lt;p&gt;&lt;em&gt;The work that went into this guide was supported by open-source investigative bureau &lt;a href=&quot;https://www.bellingcat.com/&quot;&gt;Bellingcat&lt;/a&gt;. &lt;a href=&quot;https://x.com/melzxy&quot;&gt;Melissa Zhu&lt;/a&gt; and &lt;a href=&quot;https://galen.reich.me.uk/&quot;&gt;Galen Reich&lt;/a&gt; of Bellingcat both contributed extensive help, advice and editorial support to the writing of this guide. Thanks also to &lt;a href=&quot;https://otherkat.com/&quot;&gt;Kat MacDonald&lt;/a&gt;, &lt;a href=&quot;https://www.muradkhan.co.uk/&quot;&gt;Murad Khan&lt;/a&gt;, Sergio Calderón Harker and Didem Incegoz, who gave feedback on drafts, to ecologist &lt;a href=&quot;http://www.austinwadesmith.com/&quot;&gt;Austin Wade Smith&lt;/a&gt;, who discussed this project with me very early on, and to geologists &lt;a href=&quot;http://www.geospectra.net/&quot;&gt;James Aber&lt;/a&gt; (incredible website) and &lt;a href=&quot;https://www.unbc.ca/roger-wheate&quot;&gt;Roger Wheate&lt;/a&gt;, who were both kind enough to talk to me about their work, and whose open-source resources are also invaluable.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;If this guide is of any use to you, or you have further questions, feel free to &lt;a href=&quot;mailto:agnesfcameron@protonmail.com&quot;&gt;email me&lt;/a&gt;. I also compiled the references used in my research &lt;a href=&quot;https://www.are.na/agnes-cameron/proj-satellite-imaging-indices&quot;&gt;here&lt;/a&gt; (for general satellite imaging techniques), and &lt;a href=&quot;https://www.are.na/agnes-cameron/proj-satellite-mining&quot;&gt;here&lt;/a&gt; for mining-specific resources.&lt;/em&gt;&lt;/p&gt;

</description>
          <pubDate>2025-01-10T00:00:00-05:00</pubDate>
          <link>https://soup.agnescameron.info//2025/01/10/satellite.html</link>
          <guid isPermaLink="true">https://soup.agnescameron.info//2025/01/10/satellite.html</guid>
        </item>
      
    
      
    
      
    
      
        <item>
          <title>on peaceful and noisy machines</title>
          <description>&lt;p&gt;This year, I’ve been involved in making two different synthesisers. The first of these has been a collaboration with the electronic musician John Richards, aka the &lt;a href=&quot;https://www.dirtyelectronics.org/&quot;&gt;Dirty Electronics Ensemble&lt;/a&gt; with artwork by Colin Therlemont (&lt;a href=&quot;https://bandcamp.com/tellamont&quot;&gt;Tellamont&lt;/a&gt;). The project is called &lt;em&gt;More Roar&lt;/em&gt;, and the synth is a digital frequency modulation synthesiser built using a STM32 microcontroller.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/synths/hedgehog-full.jpeg&quot; alt=&quot;vegetation health&quot; /&gt;the More Roar synth in its final form (sans chip, which is broken off as a tooth)
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;The second was made by me for a new performance by the &lt;em&gt;Commission for New and Old Art&lt;/em&gt; (hereafter: the &lt;em&gt;Commission&lt;/em&gt;). The whole performance was called &lt;em&gt;Valves&lt;/em&gt;; the synth was for a new piece called &lt;em&gt;On Pendle Hill&lt;/em&gt;, composed by Oliver Vibrans. The synth itself never really got a name, though collectively we started to refer to it as &lt;em&gt;the Machine&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;In both cases, I’ve been thinking a lot about the political nature of different technologies, technology’s militariness and the resistance to military co-optation of work made using technical things. These concerns feel particularly urgent in the context of the genocide in Gaza and the occupation of the West Bank. Military technologies are not just the drones (&lt;a href=&quot;https://caat.org.uk/data/countries/israel/israels-arms-industry-its-links-with-the-uk/&quot;&gt;manufactured in Britain&lt;/a&gt;), the guns and bombs used to kill Palestinians, or even the Israeli surveillance infrastructure built by &lt;a href=&quot;https://www.aljazeera.com/news/2024/4/23/what-is-project-nimbus-and-why-are-google-workers-protesting-israel-deal&quot;&gt;Google and Amazon&lt;/a&gt;, but the systems of capital and convenience that sustain this status quo.&lt;/p&gt;

&lt;p&gt;My original training is in electronic engineering, and I still subscribe to the IET’s&lt;label for=&quot;iet&quot; class=&quot;margin-toggle sidenote-number&quot;&gt;&lt;/label&gt;&lt;span class=&quot;sidenote&quot; id=&quot;iet&quot;&gt;the Institution of Engineering and Technology, one of Britain’s biggest professional accreditation bodies for engineers&lt;/span&gt; jobs mailing list – these days, it’s almost entirely dominated by weapons companies. I always remember my friend Angela describing the presence of &lt;a href=&quot;https://caat.org.uk/data/companies/elbit-systems/&quot;&gt;Elbit Systems&lt;/a&gt; in Britain as ‘creepy’, this sense of a close proximity to violence feeling deeply uncomfortable and unsettling. Moving back toward electronics (after years of mostly making work with the web), I find myself wanting to understand and use technologies where the overlap (in expertise, documentation, components) feels much closer to this militarised domain of engineering. Being closer to the source means being confronted with the decisions that a lot of technologies keep hidden, and getting, on some level, to decide them for yourself.&lt;/p&gt;

&lt;h2 id=&quot;valves&quot;&gt;Valves&lt;/h2&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/synths/12AX7_tube.jpg&quot; alt=&quot;the 12AX7 vacuum tube&quot; /&gt;the 12AX7 vacuum tube
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;Some of the initial ideas for the &lt;em&gt;Valves&lt;/em&gt; synthesiser came from a conversation with Sam Fairbrother of the &lt;em&gt;Commission&lt;/em&gt; a couple of years ago, where, during a conversation about artificial intelligence, he asked me &lt;em&gt;“Can you make a peaceful machine?”&lt;/em&gt; – i.e., is it possible to make a pacifist technology? I found the question very difficult to answer, and still do.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Valves&lt;/em&gt; as a performance circulated around the valve as a unifying theme, moving through post-industrial areas of Northern England. Most of the performance centred around brass instruments, with the brass band in particular as a symbol of Northern industrial heritage. In keeping with the theme, the specific requirement for this synthesiser was that it be made using a vacuum tube – a &lt;em&gt;thermionic valve&lt;/em&gt;. STAT magazine wrote about the whole performance &lt;a href=&quot;https://statmagazine.org/staging-england/&quot;&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/synths/valves-performance.jpeg&quot; alt=&quot;two performers playing the valve synth&quot; /&gt;Lydia and I performing in &lt;i&gt;On Pendle Hill&lt;/i&gt; (photo: Brad Morgan)
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;Vacuum tubes are really interesting components to work with. As the ‘valve’ moniker suggests, they work by controlling the flow of a medium: similar to transistors, they use a small electrical current to switch a much larger one, effectively working as an amplifier. Vacuum tubes have been almost entirely superseded by transistors, but not because they didn’t work – they were ultimately abandoned due to size, cost and practicality. As amplifying components they aren’t as ‘clean’ as transistors – they add to the sound, but in a way that can be very appealing for audio signals, and is considered quite ‘rich’ – which is why you still find them in guitar amplifiers.&lt;/p&gt;

&lt;p&gt;The eventual design was based on one by &lt;a href=&quot;https://www.lookmumnocomputer.com/&quot;&gt;Look Mum No Computer&lt;/a&gt; called the &lt;a href=&quot;https://www.lookmumnocomputer.com/the-safety-valve&quot;&gt;&lt;em&gt;Safety Valve&lt;/em&gt;&lt;/a&gt;, which uses a vacuum tube called the 12AX7 that can be driven at 12V, making it much safer to power. It consisted of two valve stages – a distortion module, and a &lt;a href=&quot;https://www.soundonsound.com/techniques/introduction-vcas&quot;&gt;Voltage Controlled Amplifier&lt;/a&gt; (VCA) – fed by two tone generators (hacked together from the &lt;a href=&quot;#more-roar&quot;&gt;More Roar synth&lt;/a&gt;), modulating the two signals together. The resultant sound is really interesting – in both cases (the VCA is just a modded variant of the distortion module) the valve acts as a kind of imperfect amplifier, replicating the signal but adding multiple layers of harmonics. It works best with very simple sounds, adding a richness and a crunchiness to even very basic tones.&lt;/p&gt;
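
&lt;p&gt;As a rough digital analogy (not the analog circuit itself, just a sketch of the general idea): an ‘imperfect amplifier’ is a nonlinear transfer curve, and one common way to imitate valve-style distortion in code is a soft-clipping waveshaper. The gain and bias values here are made up for illustration.&lt;/p&gt;

&lt;pre&gt;&lt;code class=&quot;language-c&quot;&gt;#include &amp;lt;math.h&amp;gt;

/* Soft-clipping waveshaper: a nonlinear transfer curve. Feeding a
   pure sine through this keeps the underlying tone but adds layers
   of harmonics; the bias makes the curve asymmetric, which adds even
   harmonics as well as odd ones. Values are purely illustrative. */
float valve_shape(float x)
{
    const float gain = 4.0f;   /* drive: more gain, more harmonics */
    const float bias = 0.2f;   /* asymmetry: adds even harmonics */
    return tanhf(gain * x + bias) - tanhf(bias);
}
&lt;/code&gt;&lt;/pre&gt;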

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/synths/valve-test.jpg&quot; alt=&quot;testing the valve distortion module&quot; /&gt;testing the first distortion module of the valve synth
&lt;/span&gt;&lt;/p&gt;

&lt;h3 id=&quot;on-pendle-hill&quot;&gt;On Pendle Hill&lt;/h3&gt;

&lt;p&gt;&lt;em&gt;On Pendle Hill&lt;/em&gt; is a piece composed for baritone horn, brass chorus, valve synthesiser and clogs. The horn and brass chorus were performed by Troy Kelly and the Hebden Bridge Brass Band, with Lydia Phillips and I playing the synth and Sam Fairbrother clogging. The piece also included an audio recording of Ellie Kinney, a peace activist, from the top of Pendle Hill, narrating her view of a landscape that contains multiple weapons factories, and of Northwest England as a specifically militarised industrial zone.&lt;/p&gt;

&lt;p&gt;A lot of the ideas about the synthesiser (and the sound texture we made with it!) developed in conversation with Lydia Phillips and &lt;a href=&quot;https://otherkat.com/&quot;&gt;Kat MacDonald&lt;/a&gt;. The first part of the synth I made was the distortion module, and in initial rehearsals we prototyped the VCA using some Max software that Kat made. We had an initial idea to modulate the signal using footsteps, which sounded great in the digital software, but the valve VCA brought in so many extra harmonics that it proved too noisy in practice.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/synths/kat-software.JPG&quot; alt=&quot;Max-MSP patch simulating the VCA&quot; /&gt;the Max-MSP based testing software that Kat made to simulate the VCA
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;Here are some test samples we made with the final synth setup, using two sine tones as input to the VCA.&lt;/p&gt;
&lt;audio controls=&quot;&quot;&gt;
  &lt;source src=&quot;/img/synths/pure-texture.mp3&quot; type=&quot;audio/mpeg&quot; /&gt;
	Your browser does not support the audio element.
&lt;/audio&gt;

&lt;audio controls=&quot;&quot;&gt;
  &lt;source src=&quot;/img/synths/sample1.mp3&quot; type=&quot;audio/mpeg&quot; /&gt;
Your browser does not support the audio element.
&lt;/audio&gt;

&lt;audio controls=&quot;&quot;&gt;
  &lt;source src=&quot;/img/synths/sample2.mp3&quot; type=&quot;audio/mpeg&quot; /&gt;
	Your browser does not support the audio element.
&lt;/audio&gt;

&lt;p&gt;Much of our discussion circulated around the synth and the broader performance’s military-industry-hardware-sound relationship. There’s a lot there – Raytheon, one of the world’s largest defence contractors, was initially a vacuum tube manufacturing company. The 12AX7 vacuum tube is a close relative of the 12AU7 – a miniaturised equivalent of which (the 6111) was &lt;a href=&quot;http://www.diyaudioblog.com/2017/11/matsumin-valvecaster-guitar-effects.html&quot;&gt;apparently&lt;/a&gt; developed for their Sidewinder missile in the mid-1950s. Meanwhile, the ‘great Northern industry’ that the brass band has historically symbolised increasingly consists of &lt;a href=&quot;https://caat.org.uk/data/companies/&quot;&gt;weapons manufacturers&lt;/a&gt;, including Elbit and Raytheon. Throughout the piece and the performance more generally there’s a deep ambivalence that never really resolves.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/synths/brass-band.jpeg&quot; alt=&quot;brass band marching into a venue&quot; /&gt;Hebden Bridge Brass Band marching into The White Hotel (photo: Brad Morgan)
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;At the end of the program notes for &lt;em&gt;Valves&lt;/em&gt;, we included the line “we’d like to think there’s another way”. In technical teaching, I think there’s something important in showing that there are many ways to do things – it helps to understand technical things not as monolithic tools, but rather (to paraphrase a &lt;a href=&quot;https://pages.sandpoints.org/sandpoints/ubu5050ubus-8daa49c3/reflection/autonomous-archive/&quot;&gt;nice essay&lt;/a&gt; by Cristóbal Sciutto) as a medium that one can have a degree of agency over.&lt;/p&gt;

&lt;p&gt;This sense of agency is, I think, a necessary but not sufficient condition for the revolutionary use of technology. Using a piece of technology without having a sense of how it works means that many of the decisions that went into it, and the way it acts in the world, are opaque, making it harder to act with intention.&lt;/p&gt;

&lt;h2 id=&quot;more-roar&quot;&gt;More Roar&lt;/h2&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/synths/hedgehog-bl.png&quot; alt=&quot;the hedgehog synth at a rehearsal&quot; /&gt;hedgehog plus Kat in the wild (at an &lt;a href=&quot;https://lclo.otherkat.com/&quot;&gt;LCLO&lt;/a&gt; rehearsal)
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;More Roar&lt;/em&gt; was developed specifically in an artistic/educational context. In March this year, I invited John to come to the &lt;a href=&quot;https://www.arts.ac.uk/creative-computing-institute&quot;&gt;CCI&lt;/a&gt; to run a synth-building workshop for our students. I’d remembered going to one of John’s workshops as an electronic engineering undergraduate, and experiencing electronics as this totally different, vibrant thing. I loved (and still do love) the maths and theory of electronics, but there was a liveness to the synth that we made that felt very different, and quite transformative.&lt;/p&gt;

&lt;p&gt;John decided to use this as an opportunity to develop code for a chip he’d not worked with before, and after some discussion he settled on the &lt;a href=&quot;https://en.wikipedia.org/wiki/STM32&quot;&gt;STM32 microcontroller&lt;/a&gt;, which had recently been made available as a tiny 8-pin package. After the workshop, he asked if I’d be interested in developing it further into a project together. Much of my contribution to the project has been to write the code for the &lt;a href=&quot;https://docs.google.com/document/u/0/d/1dLrLAFyj1qOBhgkgtjH0oDM67nwY9DkMXg1a5tIjvx4/mobilebasic?pli=1&quot;&gt;FM synthesis&lt;/a&gt; algorithm, and later the decay function that periodically adds noise to the signal. This was a very enjoyable process of writing some ultimately quite simple code. It’s humbling to come back to microcontrollers after about a decade away and realise just how theoretical and ungrounded your engineering education was.&lt;/p&gt;
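
&lt;p&gt;For the curious, here’s a minimal sketch of what two-operator FM built on phase accumulators (the core of Direct Digital Synthesis, which comes up again below) can look like on a small chip. This is an illustration of the general technique, not the More Roar source code – the names, table size and scaling are all made up:&lt;/p&gt;

&lt;pre&gt;&lt;code class=&quot;language-c&quot;&gt;#include &amp;lt;stdint.h&amp;gt;

#define TABLE_BITS 8
#define TABLE_SIZE (1 &amp;lt;&amp;lt; TABLE_BITS)

static int8_t sine[TABLE_SIZE];      /* wavetable, filled at startup */

static uint32_t carrier_phase, mod_phase;
uint32_t carrier_inc, mod_inc;       /* set from the two dials */
uint32_t mod_depth;                  /* the FM 'modulation index' */

/* Called once per sample, e.g. from a timer interrupt. The modulator
   output nudges the carrier's phase increment; that wobble in phase
   is what produces FM's characteristic sidebands. */
int8_t fm_next_sample(void)
{
    mod_phase += mod_inc;
    int32_t m = sine[mod_phase &amp;gt;&amp;gt; (32 - TABLE_BITS)];

    carrier_phase += carrier_inc + m * (int32_t)mod_depth;
    return sine[carrier_phase &amp;gt;&amp;gt; (32 - TABLE_BITS)];
}
&lt;/code&gt;&lt;/pre&gt;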

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/synths/nails-synth.jpeg&quot; alt=&quot;synth built on nails&quot; /&gt;the DIY nails version of the synth from John’s initial workshop at the CCI
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;The synth has three control inputs – two dials, which control the frequencies of the signals modulated by the synth, and a button that switches the mode. The sound is generated using the Direct Digital Synthesis algorithm, and benefits from the speed of the chip – one of the key features that made it appealing. It’s become affectionately referred to as the ‘hedgehog’ after Colin’s design, which has given the sounds a really nice character. The hedgehog follows months of breadboard prototypes, plus the ‘nails’ type synth of John’s design that we made with the students (which suspends the chip between 8 nails, and uses capacitance instead of potentiometers for the analog inputs). John wrote a document for the November iteration that you can find &lt;a href=&quot;https://www.dirtyelectronics.org/docs/ROAR_doc.pdf&quot;&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;The synth makes a range of sounds – it’s kind of amazing how much you can get out of even a simple algorithm – with an emphasis on sounds that are quite digital (scrapy/beepy), and quite monumental (windy/wet/weathery). The more you change the mode, the more the signals used decay – eventually they rot down to nothing – with the decay rate and mode changing every time the synth is turned on. I think one thing that appeals about this quality is that it makes an instrument with a sensibility that you have to attune to and recognise – to see when it’s rotting faster or slower, to understand how the modes work.&lt;/p&gt;
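
&lt;p&gt;A hypothetical sketch of how a ‘rot’ like this can be built (again, the general idea rather than the actual decay function – the intervals and amounts are invented): every so often, pull the wavetable itself slightly toward noise, with the rate seeded once at power-on.&lt;/p&gt;

&lt;pre&gt;&lt;code class=&quot;language-c&quot;&gt;#include &amp;lt;stdint.h&amp;gt;
#include &amp;lt;stdlib.h&amp;gt;

extern int8_t sine[256];      /* the wavetable the voices read from */

static uint32_t rot_interval; /* samples between corruptions */
static uint32_t rot_counter;

/* Seed once at power-on (e.g. from a floating ADC pin), so the decay
   rate is different every time the synth is switched on. */
void rot_init(uint32_t seed)
{
    srand(seed);
    rot_interval = 1000 + (uint32_t)(rand() % 9000);
}

/* Called once per sample: every rot_interval samples, pull a random
   table entry toward zero and add a little noise, so the tone slowly
   rots down to nothing. */
void rot_tick(void)
{
    if (++rot_counter &amp;lt; rot_interval) return;
    rot_counter = 0;
    int i = rand() % 256;
    sine[i] = (int8_t)(sine[i] - sine[i] / 8 + (rand() % 5) - 2);
}
&lt;/code&gt;&lt;/pre&gt;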

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/synths/paradise-palms.jpg&quot; alt=&quot;performing at Paradise Palms&quot; /&gt;John and I performing at Paradise Palms
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;This sense of skill in practice is also something that makes me want to make more electronic objects with my students. Currently in my department, the electronics education most of our students receive involves learning Arduino, perhaps experimenting with other boards (like the ESP32) or the Raspberry Pi. There’s something very exciting – very liberating – about handing a student a bare chip and showing them how to program it using just a text editor.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/synths/palms-breadboards.png&quot; alt=&quot;four breadboards on a table&quot; /&gt;gig setup with four breadboards
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;Microcontrollers are funny things, as they bring you much more directly into contact with control over a medium (movement of electrons around a piece of silicon) and in the process highlight just how far removed from the medium you yourself are. A lot of John’s work that I really appreciate is taking these chips that cost about £2 and maximising the kind of sounds they can make. One thing lots of people say to me is, how does something that small make such a big sound?&lt;/p&gt;

&lt;p&gt;Last November we performed a small gig in Hackney as part of the &lt;a href=&quot;https://www.nonclassical.co.uk/&quot;&gt;NonClassical&lt;/a&gt; birthday party, with 4 of the synths feeding into a mixer, and a couple of homemade sequencers controlling the mode switching. It was a deeply fun experience, and worked surprisingly well despite very little practice time. It felt in a way like much of the performance was already &lt;em&gt;in the instrument&lt;/em&gt;, and we were just bringing it out. Perhaps this is contradictory to the idea of it being an instrument that requires skill to play, though perhaps also it’s a creature that we are both already very familiar with. We are still tweaking the code, but plan to release the synth early next year.&lt;/p&gt;

&lt;p&gt;Back in November I went up to Leicester to visit John and record some samples. What you’re hearing in these is 2 or 3 of the synths playing together, plus some input from a sequencer to drive them in time, but no other sounds or inputs.&lt;/p&gt;

&lt;audio controls=&quot;&quot;&gt;
  &lt;source src=&quot;/img/synths/roar-bells.mp3&quot; type=&quot;audio/mpeg&quot; /&gt;
	Your browser does not support the audio element.
&lt;/audio&gt;

&lt;audio controls=&quot;&quot;&gt;
  &lt;source src=&quot;/img/synths/roar-mod.mp3&quot; type=&quot;audio/mpeg&quot; /&gt;
Your browser does not support the audio element.
&lt;/audio&gt;

&lt;audio controls=&quot;&quot;&gt;
  &lt;source src=&quot;/img/synths/roar-flock.mp3&quot; type=&quot;audio/mpeg&quot; /&gt;
	Your browser does not support the audio element.
&lt;/audio&gt;

&lt;p&gt;I think there’s something between both these projects – about making full use of something that’s either small or defunct, and trying to find the limits of its possibility – that feels like a way to answer the question about peaceful machines. In the manifesto on the &lt;em&gt;Commission’s&lt;/em&gt; &lt;a href=&quot;https://the-commission.vercel.app/about&quot;&gt;website&lt;/a&gt;, in arguing for the restaging of old works, they claim that “developments of the past 150 years have been made too expensive, rarified, or fossilised”. So much of the basic violence of technology is in its extractiveness and newness, the constant refreshing and waste, of expertise that degrades and disappears – and a willingness to disregard human life (human-ness!) in the pursuit of this newness.&lt;/p&gt;

&lt;p&gt;I recently really enjoyed Tetsuo Kogawa’s essay on DIY micro-radio &lt;a href=&quot;https://anarchy.translocal.jp/non-japanese/radiorethink.html&quot;&gt;&lt;em&gt;Toward Polymorphous Radio&lt;/em&gt;&lt;/a&gt;, in which he paraphrases Heidegger to ask &lt;em&gt;‘What is radio’s “most extreme possibility?”‘&lt;/em&gt;, describing decades of work hosting microbroadcast radio stations on homemade transmitters. What I like about it so much is that the relationship to newness is totally there, but transformed. Micro radio is not new technology, but an old technology used to its fullest extent, over a long period of time, with results that are very surprising.&lt;/p&gt;

&lt;hr /&gt;

&lt;p&gt;&lt;em&gt;Research materials for Valves are &lt;a href=&quot;https://www.are.na/agnes-cameron/proj-valves&quot;&gt;here&lt;/a&gt;, and for More Roar &lt;a href=&quot;https://www.are.na/agnes-cameron/proj-more-roar&quot;&gt;here&lt;/a&gt;. I was particularly delighted for both projects to have come across the &lt;a href=&quot;https://www.modwiggler.com/forum/&quot;&gt;Modwiggler&lt;/a&gt; forum, which has a lot of really helpful DIY synth resources.&lt;/em&gt;&lt;/p&gt;
</description>
          <pubDate>2025-01-05T00:00:00-05:00</pubDate>
          <link>https://soup.agnescameron.info//2025/01/05/synths.html</link>
          <guid isPermaLink="true">https://soup.agnescameron.info//2025/01/05/synths.html</guid>
        </item>
      
    
      
    
      
    
      
    
      
        <item>
          <title>agrarianism and revolution reading group</title>
          <description>&lt;p&gt;&lt;a href=&quot;https://docs.google.com/document/d/1kZ0qLvmJgfvJ6jN8SSwNWMlcbrEOX7vNFZBx3yVDuwY&quot;&gt;The Agrarianism and Revolution Reading Group&lt;/a&gt; met weekly for the duration of the Delfina Foundation’s 2022 &lt;a href=&quot;&amp;quot;https://www.delfinafoundation.com/programmes/the-politics-of-food/season-5/&amp;quot;&quot;&gt;Politics of Food Programme&lt;/a&gt;. Organised by myself and fellow resident &lt;a href=&quot;https://asasonjasdotter.info&quot;&gt;Åsa Sonjasdotter&lt;/a&gt;, the impetus for the group came from a shared interest in discussing radical movements, ideas and materials concerning land and food. ARRG (as it is affectionately known) met 10 times over the course of the season, and will continue online on a biweekly basis, probably from February onward.&lt;/p&gt;

&lt;p&gt;Each session combined two readings, along with artefacts and materials brought by participants (highlight: Åsa brought flints from Suffolk). We would take turns to read aloud extracts from the text, finding links and parallels between the readings. At the end of each session, we would discuss the direction we next wanted to take, and Åsa and I would select the next week’s extracts. In this way we meandered through a series of texts linked by intention and interest, covering a wide range of ideas and ideologies. At the turning point between seasons, I wanted to write an overview of the ideas we’d talked about, summarising some of our notes from the sessions and looking at the links between our paths of thought. This is not intended to be a thorough overview of all the materials – more a meandering tour.&lt;/p&gt;

&lt;p&gt;I’ve tried to summarise the path we took through the readings in the diagram below.&lt;/p&gt;

&lt;figure class=&quot;fullwidth&quot;&gt;
	&lt;img src=&quot;/img/arrg/arrg-diagram.png&quot; /&gt;
&lt;/figure&gt;

&lt;h3 id=&quot;start-cabral-and-meiksins-wood&quot;&gt;start: cabral and meiksins-wood&lt;/h3&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/arrg/cabral.jpg&quot; alt=&quot;Amílcar Cabral&quot; /&gt;Amílcar Cabral during the Guinean revolution
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;We began the season with a pair of readings chosen by myself and Åsa – Amílcar Cabral’s speeches, in &lt;a href=&quot;https://www.are.na/block/16480920&quot;&gt;&lt;em&gt;Our People are Our Mountains&lt;/em&gt;&lt;/a&gt;, and Ellen Meiksins Wood’s &lt;a href=&quot;https://www.are.na/block/18246371&quot;&gt;&lt;em&gt;The Origin of Capitalism, a Longer View&lt;/em&gt;&lt;/a&gt;. These paired very well, both making interesting arguments about the feedback between social organisation, land use, and domination (colonial or internal). We specifically read the question-and-answer session with Cabral, where he discusses the successes, aims and projects during the height of the Guinean revolution.&lt;/p&gt;

&lt;p&gt;In &lt;em&gt;The Origin of Capitalism&lt;/em&gt;, we focussed on two chapters discussing Meiksins-Wood’s theory of capitalism’s origins in English land management in the 15th century. She starts the first of these with a definition of appropriation and economic coercion, as it relates to the land:&lt;/p&gt;

&lt;blockquote&gt;
  &lt;p&gt;&lt;em&gt;For millennia, human beings have provided for their material needs by working the land. And probably for nearly as long as they have engaged in agriculture they have been divided into classes, between those who worked the land and those who appropriated the labour of others. That division between appropriators and producers has taken many forms, but one common characteristic is that the direct producers have typically been peasants. These peasant producers have generally had direct access to the means of their own reproduction and to the land itself. This has meant that when their surplus labour has been appropriated by exploiters, it has been done by what Marx called ‘extra-economic’ means - that is, by means of direct coercion, exercised by landlords or states employing their superior force, their privileged access to military, judicial, and political power.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;The argument she goes on to make is quite detailed, but begins with the assertion that capitalist relations in England did not begin with the land enclosures; rather, enclosures were the endpoint of a much longer process of dispossession of the peasantry. Increases in agricultural productivity are linked to the transformation of land relations from the feudal tithe system to a system of competitive rents, where farmers leased land from the lord and thus needed to maximise profits. Only in capitalism is the dominant mode of appropriation based not on direct coercion, but on the complete dispossession of direct producers, who need to sell their labour on the capitalist ‘market’.&lt;/p&gt;

&lt;p&gt;This unique set of social arrangements, she argues, could happen in England because of the country’s comparatively coherent transport and trade networks, themselves a result of colonial domination by the French, and is what allowed for later developments, such as the Industrial Revolution.&lt;/p&gt;

&lt;p&gt;Cabral’s discussion of land relations in Guiné gives us some interesting parallels:&lt;/p&gt;

&lt;blockquote&gt;
  &lt;p&gt;&lt;em&gt;Naturally, people in Europe expect ‘agrarian reform’ in my country. But in Guiné (Cape Verde is a different matter) the problem of agrarian reform is not the same as it is in Europe. This is because the land is not privately-owned in Guiné. The Portuguese did not occupy our land as settlers, as, for example, they did in Angola. The Africans kept the land and the Portuguese appropriated the results of his labour. As a result, most of the land has remained the property of the villages. Of course, in tribes like the Fula or Mandjak, which have a pyramidal social structure, the chiefs have the best land. But they have it only in terms of getting the best possible production from it; they do not own it, for it cannot be sold or otherwise disposed of.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;blockquote&gt;
  &lt;p&gt;&lt;em&gt;We do not therefore have the problem of agrarian reform in relation to land ownership that other countries are familiar with. What we need is an agrarian revolution to improve the yield of the soil through technology, and we believe that the best structure for this change will be a co-operative system. …We believe that we must develop the co-operative as the fundamental economic structure in our way of life, not only internally as the basis of our whole economy but also in terms of our country’s international economic relations.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3 id=&quot;link-social-organisation-around-land&quot;&gt;link: social organisation around land&lt;/h3&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/arrg/three-field.png&quot; alt=&quot;three-field crop rotation diagram&quot; /&gt;&lt;a href=&quot;https://en.wikipedia.org/wiki/Three-field_system&quot;&gt;three-field&lt;/a&gt; crop rotation system, introduced in Europe between 9th-11th century
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;Inspired in particular by Meiksins-Wood’s analysis, and also by Cabral’s discussion of economics in &lt;a href=&quot;https://www.are.na/block/16480919&quot;&gt;&lt;em&gt;Resistance and Decolonisation&lt;/em&gt;&lt;/a&gt;, we started to read extracts from David Graeber and David Wengrow’s book &lt;a href=&quot;http://libgen.is/book/index.php?md5=CE3461B68D7A62C1F406ADC411891957&quot;&gt;&lt;em&gt;The Dawn of Everything&lt;/em&gt;&lt;/a&gt;, which traces back agricultural histories to the Neolithic era. In particular, the discussion stemmed from ideas about how social relations shape and are shaped by the relationship to the land, and agricultural relations in particular.&lt;/p&gt;

&lt;p&gt;One of the most compelling and powerful arguments made in the extract we read was the consideration of the first ‘agricultural revolution’ (the 3000-odd year period where crops moved in and out of domestication) as at its heart an informational and scientific revolution, as technologies of textiles and image-making were developed alongside the development of agricultural practice.&lt;/p&gt;

&lt;blockquote&gt;
  &lt;p&gt;&lt;em&gt;Seen this way, the ‘origins of farming’ start to look less like an economic transition and more like a media revolution, which was also a social revolution, encompassing everything from horticulture to architecture, mathematics to thermodynamics, and from religion to the remodelling of gender roles. And while we can’t know exactly who was doing what in this brave new world, it’s abundantly clear that women’s work and knowledge were central to its creation; that the whole process was a fairly leisurely, even playful one, not forced by any environmental catastrophe or demographic tipping point and unmarked by major violent conflict.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3 id=&quot;link-the-science-of-women&quot;&gt;link: the science of women&lt;/h3&gt;

&lt;p&gt;Graeber and Wengrow’s argument about the agricultural revolution also dubs it a &lt;em&gt;science of the concrete&lt;/em&gt;, a term coined by Lévi-Strauss which refers to a science rooted in material practice. They argue that women were at the centre of this kind of innovation, but are rarely credited as such.&lt;/p&gt;

&lt;p&gt;These assertions share a lot of ground (with good reason – Graeber was heavily influenced by the Kurdish movement and supported them throughout his career) with the notion of &lt;em&gt;jineolojî&lt;/em&gt;, or women’s science, an ecofeminist ideology rooted in the Kurdish independence movement.&lt;/p&gt;

&lt;p&gt;Along this line, we chose readings from the artist and writer Marwa Arsanios and from Dilar Dirik, a scholar and member of the Kurdish women’s movement, both of whom write on feminist revolutionary land struggles and agriculture in the Middle East, with Dirik writing specifically from the Kurdish perspective. In her piece &lt;a href=&quot;https://www.e-flux.com/journal/93/215118/who-s-afraid-of-ideology-ecofeminist-practices-between-internationalism-and-globalism/&quot;&gt;&lt;em&gt;Who’s Afraid of Ideology?&lt;/em&gt;&lt;/a&gt;, Arsanios actually interviews Dirik, and the links between the two are very strong.&lt;/p&gt;

&lt;p&gt;Arsanios’ piece in particular is forceful and broad-ranging, and produced a really intense, interesting discussion. In it, she moves between land struggles in Lebanon (her home), Syria, and Iraqi Kurdistan, with an analysis through the lens of self-defense. She critiques the positioning of NGOs in the region as ‘feminist’ projects, arguing that by focussing on “empowerment” and the need to “save” women, they do not work to support womens’ self-determination. In particular, Dirik’s interview within her piece had a section on the right to self-defense that we all found very compelling:&lt;/p&gt;

&lt;blockquote&gt;
  &lt;p&gt;&lt;em&gt;In liberalism, in liberal thought and philosophy in general, the expectation is that people should surrender the means of protection to the state. The state should have a monopoly on the use of force. The assumption is that you as an individual member of society should not have the agency to act because the state should decide on your behalf what is dangerous to your existence.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Our discussion that week circled around the replacement of ‘emancipation’ with ‘empowerment’, and how this also appears in the appropriation of ancestral knowledge in the project of nation-building.&lt;/p&gt;

&lt;p&gt;The following week, we also read a short piece by PKK founder Abdullah Öcalan, where he coins the term &lt;em&gt;jineolojî&lt;/em&gt;. He takes a very firm position on gender, namely that women must form separate political organisations from men if they are to take true political power and resist domination, a practice that has been carried out within the Kurdish struggle. I liked the line &lt;em&gt;“woman’s revolution is a revolution within a revolution”&lt;/em&gt;.&lt;/p&gt;

&lt;h3 id=&quot;link-social-ecology&quot;&gt;link: social ecology&lt;/h3&gt;

&lt;p&gt;Dirik (and Öcalan, and Graeber and Wengrow for that matter) is heavily influenced by the work of American social theorist Murray Bookchin, who coined the term ‘social ecology’, which posits that ecological problems are the end result of structures of social domination, and that one cannot achieve social harmony without ecological harmony, and vice versa. Interestingly, much of the introduction Bookchin gives to the concept circled around conversations we’d already been having, about the dangers of a philosophy of ‘wholeness’ straying into fascistic territory, and the neutralisation of once-radical words.&lt;/p&gt;

&lt;h3 id=&quot;link-monoculture&quot;&gt;link: monoculture&lt;/h3&gt;

&lt;p&gt;Bookchin discusses monoculture in depth as a form of ecological violence that inherits from an ideology of man’s scientific triumph over nature. Similarly to Graeber and Wengrow, Bookchin asks if there is “a scientific discipline that allows for the indiscipline of fancy, imagination, and artfulness? Can it encompass problems created by the social and environmental crises of our time? Can it integrate critique with reconstruction, theory with practice, vision with technique?”.&lt;/p&gt;

&lt;h3 id=&quot;link-states-and-nationhood&quot;&gt;link: states and nationhood&lt;/h3&gt;

&lt;p&gt;Both Dirik and Arsanios prompted lengthy discussions about the role of the state in agricultural systems, and the use of ‘nation myths’ in constructing a state. Related to earlier discussions of technology, and the interesting statistic that something like 90% of China’s food is produced on small-scale farms, we decided to read &lt;a href=&quot;https://www.are.na/block/18977759&quot;&gt;&lt;em&gt;Blockchain Chicken Farm&lt;/em&gt;&lt;/a&gt;, which discusses the use of technology in rural China. Xiaowei Wang’s writing is rich and evocative, taking a journalistic approach to describing the landscape of Chinese agriculture and technology.&lt;/p&gt;

&lt;p&gt;Joseph made the point that so much of China’s historic agricultural policy (early-to-mid 20th century), at least that described at the start of the book, is in some way reactive to more global agroecological violence – and it is the state that adapts and in some ways passes that violence down through centralised planning policy.&lt;/p&gt;

&lt;h3 id=&quot;link-food-policy&quot;&gt;link: food policy&lt;/h3&gt;

&lt;p&gt;In our discussion of states and technology, we became interested in reading more about food systems research and the lines of power in global food systems. We discussed Friedmann and McMichael’s piece &lt;a href=&quot;https://www.are.na/block/19079688&quot;&gt;&lt;em&gt;Agriculture and the State System&lt;/em&gt;&lt;/a&gt; in parallel with &lt;em&gt;Blockchain Chicken Farm&lt;/em&gt; – the argument they make is quite detailed, discussing the interplay between agriculture and state systems in the context of transnational capital. They trace the links between colonial relations, newly-independent settler states and the development of the nation-state system along with changes to trade and agricultural specialisation.&lt;/p&gt;

&lt;p&gt;The following week, we read an extract from agronomist Marci Baranski’s book &lt;a href=&quot;https://www.are.na/block/19374967&quot;&gt;&lt;em&gt;The Globalisation of Wheat&lt;/em&gt;&lt;/a&gt;, on the development of ‘global’ wheat varieties by the American agricultural scientist Norman Borlaug. In the piece, which introduces the project to develop a ‘global’ wheat variety, we wondered what was happening in each of the countries referred to at the time the American wheat research centres were being set up. It had a bit of an air of ‘oh yeah he just happened to be in Chile in the late ’60s, how interesting’ about it. I think it also charts an interesting transformation to late-20th century colonialism, particularly the part where she observes that the new wheat varieties meant that the agricultural research centres in these countries were effectively de-skilled, as they weren’t developing local varieties anymore but just testing the ‘global’ one.&lt;/p&gt;

&lt;h3 id=&quot;link-indigenous-technoscience&quot;&gt;link: indigenous technoscience&lt;/h3&gt;

&lt;p&gt;In one session, we had a really interesting discussion on ideas of indigeneity, prompted by Åsa’s question of what makes someone &lt;em&gt;not&lt;/em&gt; indigenous outside of a settler context, and what ‘ancestral knowledge’ means in the context of a colonial heart like the UK. This also related to the discussion around Meiksins-Wood, about broken links to the land and to one another.&lt;/p&gt;

&lt;p&gt;Emilio suggested a piece by indigenous Mexican activist &lt;a href=&quot;https://restofworld.org/2020/saving-the-world-through-tequiology/&quot;&gt;Yasnaya Aguilar Gil&lt;/a&gt;, which proposes a different approach to technology based on tequio, an ideology of technology grounded in sustainable and collective action. In particular, she cites the repurposing and adaptation of technologies within Abya Yala (the indigenous name for Central and South America), the development of open-source tools and the resistance to cell phone giants as examples of the use of technology as a form of mutual support. Reflecting on this text, we also talked not just about digital technology but forms of knowledge transfer in general, thinking about alternatives to written text as ways to contain and mediate knowledge.&lt;/p&gt;

&lt;p&gt;This also related to the Graeber and Wengrow text – so many nomadic societies don’t make it into the archaeological record because they didn’t build permanent structures – meaning that their histories aren’t legible to, and thus not included in, the archaeological narrative.&lt;/p&gt;

&lt;p&gt;Gil also addresses the false environmentalism of extractive ‘green technologies’, e.g. mining for rare earth minerals on indigenous lands.&lt;/p&gt;

&lt;h3 id=&quot;link-neocolonialism&quot;&gt;link: neocolonialism&lt;/h3&gt;

&lt;p&gt;There’s a remarkable part of the Cabral question-and-answer session, where he discusses why Portugal has not been able to relinquish Guiné and Cape Verde as colonies:&lt;/p&gt;

&lt;blockquote&gt;
  &lt;p&gt;&lt;em&gt;…it is precisely because Portugal is underdeveloped, that she is unable to find a solution for her colonies, because she cannot hope for a neocolonialist one. In analysing the problems of African independence we can say that independence was given to colonised countries by the colonial powers as a means of securing the indirect domination of colonised peoples. But Portugal does not possess the necessary economic infrastructure that will allow her to try decolonisation in this manner. She cannot decolonise because she cannot neocolonise.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;This notion of neocolonial power relations came up in many places, but in particular in a text introduced by Joseph and Stéphane, about the deployment of national parks in Africa as part of a neocolonialist land grab, under the guise of ‘environmentalism’.&lt;/p&gt;

&lt;h2 id=&quot;where-next&quot;&gt;Where next?&lt;/h2&gt;

&lt;p&gt;We finished up the season on Baranski and Silvia Federici, but we didn’t have a proper discussion of the latter so we’ll start next season with that too. I’d like to read some Vandana Shiva and potentially some more economic/political analysis too – Jeffrey M. Paige’s book &lt;em&gt;Agrarian Revolutions&lt;/em&gt; looks interesting in this regard. I’d also like to read more on logistics and supply chains, following on from the food systems readings toward the end of the season, and Cabral’s economic analysis. Moten and Harney’s &lt;em&gt;All Incomplete&lt;/em&gt; might be a place to start, and Denise Ferreira da Silva’s work.&lt;/p&gt;

&lt;p&gt;With love and thanks to Åsa, Maya, David, Cherry, Emilio, Stéphane, Joseph, Derek, Moza and Andrew for participation in and contributions to ARRG A/W 2022.&lt;/p&gt;

</description>
          <pubDate>2023-01-08T00:00:00-05:00</pubDate>
          <link>https://soup.agnescameron.info//2023/01/08/arrg.html</link>
          <guid isPermaLink="true">https://soup.agnescameron.info//2023/01/08/arrg.html</guid>
        </item>
      
    
      
    
      
        <item>
          <title>soft bread, hard times</title>
          <description>&lt;p&gt;Last April, &lt;a href=&quot;https://zhexi.info/&quot;&gt;Gary&lt;/a&gt; and I gave a talk titled &lt;em&gt;“Soft Bread, Hard Times”&lt;/em&gt; as part of Akademie Schloss Solitude’s festival &lt;a href=&quot;https://www.akademie-solitude.de/en/event/fragile-solidarity-fragile-connections/&quot;&gt;Fragile Solidarity, Fragile Connections&lt;/a&gt;. The talk (+ accompanying workshop) explored the history and theory of modifying food texture, processes that exist simultaneously on the highly industrial and the intimiate/bodily level. The original slides from the talk are &lt;a href=&quot;https://docs.google.com/presentation/d/1ESwKgDNP2qtaXQaVH95-9diEUmbTMQcTrcZ9SfoF-Tc/edit#slide=id.g128f945baba_0_210&quot;&gt;here&lt;/a&gt;; this post will likely stray a bit from that order.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/soft_bread/workshop.png&quot; alt=&quot;agnes and gary looking at tubs of ingredients&quot; /&gt;
	staring at starches
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;I first got interested in texture after reading volumes &lt;a href=&quot;https://www.are.na/block/11143252&quot;&gt;1&lt;/a&gt; and &lt;a href=&quot;https://www.are.na/block/11143253&quot;&gt;2&lt;/a&gt; of Woodhead Publishing’s &lt;em&gt;Modifying Food Texture&lt;/em&gt;, a manual on the science of industrial food processing. I’d ended up there for a project looking at food ontologies, which I plan to write more about soon. On a purely aesthetic level, there’s something deeply compelling about the manuals – a strange tenderness to discussions of how to manipulate the molecular structure of various foods such that they’re experienced as creamier, softer, crunchier or stickier by the person eating them. I’m also interested in the role that the food industry plays in the creation of ‘cheap foods’ – there’s a great short essay by &lt;a href=&quot;https://vittles.substack.com/p/how-to-destroy-the-imperial-food&quot;&gt;Max Walker&lt;/a&gt; on the intertwined history of cheap food, the British Empire, and contemporary free trade – and the use of these techniques to render certain crops (like corn, and wheat before it) from ‘foods’ into something more closely resembling chemical feedstocks.&lt;/p&gt;

&lt;p&gt;I maintain a collection of links &lt;a href=&quot;https://www.are.na/agnes-cameron/src-industrial-food-processing&quot;&gt;here&lt;/a&gt; relating to industrial food processing, including the manuals I reference here (libgen turns out to be fantastic for leaked industry textbooks, who knew).&lt;/p&gt;

&lt;h2 id=&quot;what-is-food-texture&quot;&gt;What is food texture?&lt;/h2&gt;

&lt;p&gt;Part of the reason I find ‘texture’ appealing as an area of food science is that it’s a concept that can only exist in relation to human subjective tastes. Specifically, the study of texture sits somewhere at the confluence of sensory perception, the physiology of the mouth, and food physics, most of which falls within the science of &lt;a href=&quot;https://en.wikipedia.org/wiki/Rheology&quot;&gt;rheology&lt;/a&gt; – the physics of the deformation and flow of solid and liquid materials.&lt;label for=&quot;rheology&quot; class=&quot;margin-toggle sidenote-number&quot;&gt;&lt;/label&gt; &lt;input id=&quot;rheology&quot; class=&quot;margin-toggle&quot; /&gt;&lt;span class=&quot;sidenote&quot;&gt;While not all foods ‘flow’ when on the plate, it’s important to remember that the study of food texture is almost entirely concerned with what happens to food &lt;em&gt;in the mouth&lt;/em&gt; – and in order for something to be swallowed, even if it didn’t flow initially, it definitely will by the time it travels down your oesophagus.&lt;/span&gt;&lt;/p&gt;

&lt;figure&gt;
	&lt;img src=&quot;/img/soft_bread/texture_disciplines.png&quot; alt=&quot;main&quot; /&gt;
&lt;/figure&gt;

&lt;p&gt;Eugene C. Bingham – the physicist who coined the term &lt;em&gt;rheology&lt;/em&gt; and is widely regarded as the father of the field – declared in 1930 that:&lt;/p&gt;

&lt;blockquote&gt;
  &lt;p&gt;&lt;em&gt;“The flow of matter is still not understood and since it is not mysterious like electricity, it does not attract the attention of the curious. The properties are ill defined and they are imperfectly measured if at all, and they are in no way organized into a systematic body of knowledge which can be called a science.”&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;While the study of rheology has come on a great deal since then, there’s still a strongly un-glamorous slant that comes through in a lot of food science writing.&lt;/p&gt;

&lt;h3 id=&quot;chemistry-vs-reality&quot;&gt;Chemistry vs reality&lt;/h3&gt;

&lt;p&gt;One of the things that I found quite surprising when I started reading food processing manuals was how humbled all of the authors were by the wiles of human perception. In every book I read, as far as food product development goes, everything came second to how the foods were actually perceived by human tasting panels.&lt;/p&gt;

&lt;p&gt;In Malcolm Bourne’s &lt;a href=&quot;https://www.are.na/block/17512660&quot;&gt;&lt;em&gt;Food Texture and Viscosity: Concept and Measurement&lt;/em&gt;&lt;/a&gt; (a great and comprehensive intro to the field), he bemoans a hubristic fellow chemist running his mouth about how easy it should be for a polymer chemist to synthesise fake meats:&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/soft_bread/mayo_prediction.png&quot; alt=&quot;plot of predicted vs actually perceived saltiness&quot; /&gt;
	actually perceived saltiness, vs perception of saltiness predicted by the model (as a function of acid/sugar/salt/oil level plus viscosity, critical strain and amylase thinning), from the Unilever mayonnaise study
&lt;/span&gt;&lt;/p&gt;

&lt;blockquote&gt;
  &lt;p&gt;&lt;em&gt;This scientist should be sentenced to spend 10 years hard labor in the product development laboratory for making such a misleading statement! Acceptable texture has been a limiting factor in the development of many fabricated foods.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/soft_bread/mayo_model.png&quot; alt=&quot;system of equations for the mayonnaise model&quot; /&gt;
	system of equations for the Unilever mayonnaise model
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;Attempts to close this ‘reality gap’ for various foods are also somewhat astounding in their scale – until you realise that, actually, mayonnaise really is a multibillion-dollar industry, and a ‘Standard Model’ for its perception is probably quite valuable. Hence a really fascinating chapter in the book &lt;a href=&quot;https://www.are.na/block/15366513&quot;&gt;&lt;em&gt;“Designing Functional Foods”&lt;/em&gt;&lt;/a&gt;, entitled &lt;a href=&quot;https://www.are.na/block/17512647&quot;&gt;&lt;em&gt;“Design of foods for the optimal delivery of basic tastes”&lt;/em&gt;&lt;/a&gt;, which discusses a technique used by a team of scientists at Unilever called “Integrated Sensory Response Modelling”: a large comparative tasting study of 76 (!) mayonnaises and 192 (!!!) pourable dressings that sought to model the effects of tastants, microstructures, oral processing, and interactions between textures and tastes (caused by things like thickeners) on the taste perception of these products.&lt;/p&gt;

&lt;h3 id=&quot;describing-food-texture&quot;&gt;Describing Food Texture&lt;/h3&gt;

&lt;p&gt;Discussion of food texture also comes with its own sets of terminology, for which there are a range of possible ontologies, all of them geared toward articulations that can best be married to the creation of a physical model – again, desperately trying to close the gap between physics and sensory description.&lt;/p&gt;

&lt;figure&gt;
	&lt;img src=&quot;/img/soft_bread/light_texture.png&quot; alt=&quot;main&quot; /&gt;
	&lt;span class=&quot;mainnote&quot;&gt;A chart from Bourne, comparing the physics and perception of light to that of texture&lt;/span&gt;
&lt;/figure&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/soft_bread/texture_words.png&quot; alt=&quot;table of food description terms&quot; /&gt;
	top 10 texture words used by American, Japanese, and Austrian tasting panels, in descending order.
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;Descriptions of food texture also vary wildly between different languages and cuisines. In a comparison of 3 papers on food texture vocabulary used by tasting panels, Bourne notes that under similar conditions, tasting panels in the US, Japan and Austria came up with 78, 406, and 105 words respectively to describe the texture of foods. While there was a huge disparity in number, six of the top 10 words are common to all 3 lists.&lt;/p&gt;

&lt;h2 id=&quot;soft-bread&quot;&gt;Soft Bread&lt;/h2&gt;

&lt;p&gt;Bourne makes the argument that a great deal of the history of food processing has involved the manipulation of texture:&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/soft_bread/dry_wets.png&quot; alt=&quot;meme showing pasta making in terms of drying and wetting processes&quot; /&gt;
	an accurate meme
&lt;/span&gt;&lt;/p&gt;

&lt;blockquote&gt;
  &lt;p&gt;&lt;em&gt;“From the nutritional standpoint wheat could be eaten as whole grains but most people find them too hard to be appealing. Instead, the structure of the wheat kernel is destroyed by grinding it into flour, which is then baked into bread with a completely different texture and structure than the grain of wheat. The texture of leavened bread is much softer and less dense than that of grains of wheat and is a more highly acceptable product, judging by the quantity of bread that is consumed”&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Texture is ultimately an interface to our digestive processes, to the point where many industrial transformations of food mimic digestive processes in our bodies – in particular the use of the enzymes amylase, protease and lipase to break down food into chemical components – in turn making products that are easier to eat and digest. In this sense, texture modification sits somewhere in a chain of processes that end in digestion, but the point where that chain enters the body might differ between processes.&lt;/p&gt;

&lt;figure class=&quot;fullwidth&quot;&gt;
	&lt;img src=&quot;/img/soft_bread/comminution.png&quot; alt=&quot;main&quot; /&gt;
	&lt;span class=&quot;mainnote&quot;&gt;a table from Malcolm Bourne&apos;s &lt;i&gt;Food Texture and Viscosity&lt;/i&gt;, demonstrating the transformation from macro-food to chemical absorption&lt;/span&gt;
&lt;/figure&gt;

&lt;p&gt;This modification is not just for the purposes of pleasure – the texture (and thus structure) of foods can determine to a great extent the bioavailability of different nutrients in the food – how much of a nutrient can be absorbed, and how. The simple starches in processed white bread, for example, are much more rapidly broken down by the digestive system: the bioavailability of the food has been directly affected by adjustments in the manufacturing process.&lt;/p&gt;

&lt;figure&gt;
	&lt;img src=&quot;/img/soft_bread/bioavailability.png&quot; alt=&quot;main&quot; /&gt;
&lt;/figure&gt;

&lt;p&gt;Much of the second volume of &lt;em&gt;Modifying Food Texture&lt;/em&gt; is concerned with this kind of manipulation, specifically making foods softer such that they might be more easily consumed by people suffering from dysphagia, or difficulty swallowing – a condition associated with a number of different illnesses and often experienced by people as they age.&lt;/p&gt;

&lt;h3 id=&quot;hard-times&quot;&gt;Hard Times&lt;/h3&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/soft_bread/soft_bread.png&quot; alt=&quot;soft bread and hard times system diagram&quot; /&gt;
	soft bread &amp;lt;-&amp;gt; hard times; an approximate system diagram
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;The industrial production of bread has several origin points, though a key turning point in the development of contemporary industrial bread manufacture was the repeal of the &lt;a href=&quot;https://en.wikipedia.org/wiki/Corn_Laws&quot;&gt;Corn Laws&lt;/a&gt; in 1846 (followed by the abolition of cereal duties in 1869). The Corn Laws were a system of tariffs introduced by the British Government in 1815 that blocked the import of cheap grain (wheat, oats and barley) into the UK. Their repeal saw an influx of cheap grain from North America, and a squeeze on farmers in the UK (who could no longer sell grain), pushing people into a rapidly-growing urban working class, which in turn was used to drive the industrial revolution.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/soft_bread/wheat_prices.png&quot; alt=&quot;chart of wheat prices in England&quot; /&gt;
	the price/ton of wheat in England, 1264-1996, via Wikipedia
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;As Jason Moore and Raj Patel argue in &lt;em&gt;A History of the World in Seven Cheap Things&lt;/em&gt;: “cheap food enables cheap work to yield riches” – the cheap grain (and cheap bread that was produced from it) allowed for the proletarianisation of an urban workforce, whose labour fuelled the industrial revolution and the expansion of empire.&lt;/p&gt;

&lt;p&gt;In 1862, 16 years after the Corn Laws were repealed, Dr. John Dauglish founded the &lt;a href=&quot;https://en.wikipedia.org/wiki/Aerated_Bread_Company&quot;&gt;Aerated Bread Company&lt;/a&gt;, pioneering a no-knead chemical leavening process that allowed for a high degree of automation. The process reduced both the time and labour costs associated with breadmaking, thus markedly reducing the price of bread. &lt;label for=&quot;aerated&quot; class=&quot;margin-toggle sidenote-number&quot;&gt;&lt;/label&gt; &lt;input id=&quot;aerated&quot; class=&quot;margin-toggle&quot; /&gt;&lt;span class=&quot;sidenote&quot;&gt;When it was introduced to the Australian market, ABC is thought to have pushed down the price of bread locally by between 8 and 17%, according to an &lt;a href=&quot;https://trove.nla.gov.au/newspaper/article/1261956&quot;&gt;article&lt;/a&gt; published in the Adelaide Observer.&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;Dauglish’s process was refined in 1961 into the &lt;a href=&quot;https://en.wikipedia.org/wiki/Chorleywood_bread_process&quot;&gt;Chorleywood Bread Process&lt;/a&gt;, a high-volume and high-speed industrial bread process which is used to make around 80% of bread in the UK, Australia, New Zealand and India today.&lt;/p&gt;

&lt;p&gt;British food policy is still based overwhelmingly on extractive import agreements and export monocultures that hark back to colonial trade policies. In 1863, agricultural chemist &lt;a href=&quot;https://en.wikipedia.org/wiki/Justus_von_Liebig&quot;&gt;Justus von Liebig&lt;/a&gt;, talking about the colonial extraction of fertiliser, declared&lt;label for=&quot;liebig&quot; class=&quot;margin-toggle sidenote-number&quot;&gt;&lt;/label&gt; &lt;input id=&quot;liebig&quot; class=&quot;margin-toggle&quot; /&gt;&lt;span class=&quot;sidenote&quot;&gt;In – I am not kidding – &lt;a href=&quot;https://books.google.co.uk/books?id=Rjg7AQAAMAAJ&amp;amp;dq=Great%20Britain%20deprives%20all%20countries%20of%20the%20conditions%20of%20their%20fertility.%20It%20has%20raked%20up%20the%20battle-fields%20of%20Leipsic&amp;amp;pg=PA227#v=onepage&amp;amp;q=Great%20Britain%20deprives%20all%20countries%20of%20the%20conditions%20of%20their%20fertility.%20It%20has%20raked%20up%20the%20battle-fields%20of%20Leipsic&amp;amp;f=false&quot;&gt;Farmer and Gardener Magazine&lt;/a&gt;, 1863 edition&lt;/span&gt; that:&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/soft_bread/wheat_demand.png&quot; alt=&quot;chart of demand vs production of wheat&quot; /&gt;
	english wheat demand vs production in millions of bushels, 1640-1880
&lt;/span&gt;&lt;/p&gt;

&lt;blockquote&gt;
  &lt;p&gt;&lt;em&gt;“Great Britain deprives all countries of the conditions of their fertility. It has raked up the battle-fields of Leipsic (sic), Waterloo and the Crimea; it has consumed the bones of many generations accumulated in the catacombs of Sicily; and now annually destroys the food for a future generation of three million and a half people. Like a vampire it hangs on the breast of Europe, and even the world, sucking its lifeblood without any real necessity or permanent gain for itself.”&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2 id=&quot;corn-is-a-platform&quot;&gt;Corn is a Platform&lt;/h2&gt;

&lt;p&gt;If the defining crop of the British Empire was wheat, the equivalent for the American empire would be corn (maize). It was cultivated for centuries by indigenous peoples across North, Central and South America, having likely been domesticated in the Mexican highlands before spreading to eastern North America by 900 A.D. The Aztec and Maya used a process called nixtamalisation, cooking the kernels in an alkaline solution, which considerably improved corn’s nutritional value and changed its texture.&lt;label for=&quot;nixtamal&quot; class=&quot;margin-toggle sidenote-number&quot;&gt;&lt;/label&gt; &lt;input id=&quot;nixtamal&quot; class=&quot;margin-toggle&quot; /&gt;&lt;span class=&quot;sidenote&quot;&gt;European settlers did not take up this technique, leading to endemic &lt;a href=&quot;https://en.wikipedia.org/wiki/Pellagra&quot;&gt;pellagra&lt;/a&gt; in settler populations in the US in the early 20th century&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/soft_bread/corn_belt.png&quot; alt=&quot;map of the corn belt&quot; /&gt;
	the corn belt
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;Corn was cultivated by European settlers, who used the growing techniques of indigenous groups to expand corn agriculture across the Midwest (creating the &lt;a href=&quot;https://en.wikipedia.org/wiki/Corn_Belt&quot;&gt;Corn Belt&lt;/a&gt;), spurred on at the start of the 20th century by massive corn subsidies. The US is now far and away the world’s largest maize producer, at 360 million tonnes (in 2020), a full 100 million tonnes ahead of the next-most prolific producer, China.&lt;/p&gt;

&lt;h3 id=&quot;corn-wet-milling&quot;&gt;Corn Wet-Milling&lt;/h3&gt;

&lt;p&gt;Part of corn’s appeal comes from a process known as wet milling, which separates corn kernels into corn oil, starch (which can in turn be processed into sugar, and then ethanol), gluten meal and fibre, and which has an almost 100% yield (i.e., every part of the process turns into a usable product).&lt;/p&gt;

&lt;figure&gt;
	&lt;img src=&quot;/img/soft_bread/wet_milling.png&quot; alt=&quot;main&quot; /&gt;
	&lt;span class=&quot;mainnote&quot;&gt;a chart of the products from corn wet-milling&lt;/span&gt;
&lt;/figure&gt;

&lt;p&gt;Through this process, corn is processed not into ‘food’ but into a list of chemical feedstocks, as suited to being transformed into foods as into fuels, plastics or animal feeds. It is a process that suddenly allows you to jump between points in ‘texture-space’, rather than dealing with discrete ingredients.&lt;/p&gt;

&lt;p&gt;The process for making packing peanuts, for example, strongly resembles the process for making puffed corn snacks like Wotsits, with the added step of sugar removal to prevent the material from being appealing to rats and mice. These pellets are extremely cheap to produce and, through the manipulation of their solid-foam texture, have excellent force-absorbing properties.&lt;/p&gt;

&lt;figure&gt;
	&lt;img src=&quot;/img/soft_bread/packing_peanut.png&quot; alt=&quot;main&quot; /&gt;
	&lt;span class=&quot;mainnote&quot;&gt;a diagram showing packing peanut production&lt;/span&gt;
&lt;/figure&gt;

&lt;h3 id=&quot;grain-research&quot;&gt;Grain Research&lt;/h3&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/soft_bread/pseudocereals.png&quot; /&gt;
	chart showing the change in viscosity over time (when subjected to a temperature curve) of different pseudocereal flours
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;Campden BRI, as the Chorleywood research centre is now known, still conducts food science research, working as a research consultancy for industrial food producers. In a recent research paper, &lt;a href=&quot;https://www.are.na/block/15921484&quot;&gt;&lt;em&gt;Legumes – an alternative to cereals?&lt;/em&gt;&lt;/a&gt;, McGurk, Sahi and Heuer investigate the use of legumes and the pseudocereals amaranth, quinoa and buckwheat in breadmaking, using the analysis of starch gelatinisation temperatures to propose baking methods that reproduce the texture of wheat-based bread using a disparate range of inputs. In some senses this feels almost like a full circle, the heart of ‘big bread’ finding alternatives to wheat that create the same product… and maybe in other senses it feels like Shell’s renewable energy research division.&lt;/p&gt;

&lt;h2 id=&quot;future-textures&quot;&gt;Future Textures&lt;/h2&gt;

&lt;p&gt;Where does all this take us? In the next decades, it seems clear that (to put it simplistically) our food policy (at least in the UK) needs to transform from a system of colonial extraction based on export monocultures to one focussed on what can be grown locally, without taxing the soil to the point where nothing can be grown at all. In part, this requires us to transform the inputs to the food system we currently inhabit to focus on a more sustainable set of crops. Likewise, there is a need to reduce the amount of meat we consume across the board, and also to reduce the amount of land given over to the production of animal feeds and biofuels.&lt;/p&gt;

&lt;p&gt;Texture modification gives us a set of industrial tools that allow us to imagine making similar foods with, for example, a much greater diversity of grain varieties while still retaining the ability to make staples such as bread.&lt;/p&gt;

</description>
          <pubDate>2022-08-05T00:00:00-04:00</pubDate>
          <link>https://soup.agnescameron.info//2022/08/05/soft-bread.html</link>
          <guid isPermaLink="true">https://soup.agnescameron.info//2022/08/05/soft-bread.html</guid>
        </item>
      
    
      
        <item>
          <title>knitting experiments</title>
          <description>
&lt;p&gt;Since I started working at the &lt;a href=&quot;https://www.arts.ac.uk/creative-computing-institute&quot;&gt;Creative Computing Institute&lt;/a&gt; in January of this year, I’ve been on-and-off teaching myself how to use the Silver Reed SK840 digital knitting machine that we have there. It can be used either as a purely manual machine, or partially computer-controlled, using a box called a ‘Silverlink’ that mimics the behaviour of a punchcard, and allows for complex patterning.&lt;/p&gt;

&lt;p&gt;The thing that really appeals to me about knitting is the idea of modifying the bulk properties of a material, using tangible, meso-scale processes. It’s an idea I’ve thought a lot about, especially since seeing my friend &lt;a href=&quot;http://oujifei.com/&quot;&gt;Jifei&lt;/a&gt; talk during his PhD defence about visible changes that can modify the behaviour of the materials around us. I also like the idea of processes that combine human/manual manipulation with machinery that automates some but not all of a complex process.&lt;/p&gt;

&lt;p&gt;It’s also a &lt;em&gt;sweet&lt;/em&gt; machine.&lt;/p&gt;

&lt;figure&gt;
	&lt;img src=&quot;/img/knitting/silver_reed.png&quot; alt=&quot;main&quot; /&gt;
	&lt;span class=&quot;mainnote&quot;&gt;200 needles, baby&lt;/span&gt;
&lt;/figure&gt;

&lt;p&gt;As I learn more about machine knitting, I’ve been collecting links in an &lt;a href=&quot;https://www.are.na/agnes-cameron/mech-knitting-machine&quot;&gt;are.na channel&lt;/a&gt;, which runs alongside a channel I’ve had for years called &lt;a href=&quot;https://www.are.na/agnes-cameron/src-textile-computer&quot;&gt;‘Textile Computer’&lt;/a&gt;, which also contains some relevant links, though focussed on textiles and computation more broadly. I’m also in the process of writing a series of &lt;a href=&quot;https://wiki.cci.arts.ac.uk/books/facilities/chapter/digital-knitting-machine&quot;&gt;tutorials&lt;/a&gt; on this machine for the CCI wiki, which should be a bit more formal once they’re done.&lt;/p&gt;

&lt;h2 id=&quot;sensors-and-e-textiles&quot;&gt;sensors and e-textiles&lt;/h2&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/knitting/etextile_closeup.png&quot; style=&quot;width: 100%;&quot; alt=&quot;close up image of knitted textile&quot; /&gt;closeup of conductive yarn / lambswool sample
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;One of the first things I did with the machine (after learning how to create a basic swatch, cast on, etc) was to test out knitting with conductive yarns. There’s a couple of nice guides on how to do this; my favourite is written by &lt;a href=&quot;https://www.kobakant.at/DIY/?p=1762&quot;&gt;KOBAKANT&lt;/a&gt; on their fantastic &lt;a href=&quot;https://www.kobakant.at/DIY/&quot;&gt;How to Get What You Want&lt;/a&gt; site.&lt;/p&gt;

&lt;p&gt;Typically, conductive yarn is used differently to conductive thread: it has much higher resistance, which means that it’s not used to directly connect components unless they are very close together. However, because the conductive properties of fabrics knitted with conductive yarns change according to the state of the material, they make great sensors.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/knitting/yarn_lineup.png&quot; style=&quot;width: 100%;&quot; alt=&quot;close up image of knitted textile&quot; /&gt;CCI conductive yarns
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;When a knitted sample is relaxed, it has a high resistance as the fibres in the yarn do not make contact with one another, meaning that current can pass only along a few strands. When the sample is stretched, however, the fibres are pulled into closer contact, reducing the resistance and allowing the flow of an electric current. This means that the inclusion of a conductive yarn allows you to sense movement in the material, creating a stretch sensor.&lt;/p&gt;

&lt;p&gt;As most of the conductive yarns we had were too thin to knit by themselves without tangling, I wrapped them together with a lambswool carrier yarn, which had the other nice effect of giving the fabric a good weight. With one exception (a very stiff, dry, metal fibre), they were all OK to knit with, and produced really interesting results.&lt;/p&gt;

&lt;figure&gt;
	&lt;img src=&quot;/img/knitting/etextile_samples.png&quot; alt=&quot;main&quot; /&gt;
	&lt;span class=&quot;mainnote&quot;&gt;6 swatches made with different conductive yarns&lt;/span&gt;
&lt;/figure&gt;

&lt;p&gt;I was really surprised by the range and consistency of the results; using different yarns I was able to make reliable variable resistors in the Ω, kΩ and MΩ ranges.&lt;/p&gt;

&lt;p&gt;Below is a gif of &lt;a href=&quot;https://www.evasajovic.co.uk/&quot;&gt;Eva&lt;/a&gt; testing out the variable resistance, with the swatch attached to the analog input of an Arduino via a simple potential divider. As she stretches the fabric, the resistance decreases; the LED brightness shows the change in sensed voltage.&lt;/p&gt;

&lt;figure&gt;
	&lt;img src=&quot;/img/knitting/sensor.gif&quot; alt=&quot;main&quot; /&gt;&lt;br /&gt;
	&lt;span class=&quot;mainnote&quot;&gt;testing the stretch sensor&lt;/span&gt;
&lt;/figure&gt;
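
&lt;p&gt;To make the divider arithmetic concrete, here’s a rough sketch – not the actual demo code, and with invented resistance values – of how the swatch’s changing resistance maps onto the voltage the Arduino reads:&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;# toy model of a potential divider: swatch in series with a fixed resistor,
# voltage measured at the midpoint (all values invented for illustration)
V_CC = 5.0       # Arduino supply voltage
R_FIXED = 10000  # fixed resistor, picked near the swatch&apos;s mid-range

def sensed_voltage(r_swatch):
    # midpoint voltage for a given swatch resistance
    return V_CC * R_FIXED / (R_FIXED + r_swatch)

def adc_counts(v):
    # what a 10-bit Arduino ADC would report for that voltage
    return round(v / V_CC * 1023)

# relaxed: high resistance, low voltage; stretched: low resistance, high voltage
for label, r in [(&quot;relaxed&quot;, 47000), (&quot;stretched&quot;, 4700)]:
    v = sensed_voltage(r)
    print(label, r, &quot;ohms:&quot;, round(v, 2), &quot;V,&quot;, adc_counts(v), &quot;counts&quot;)
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;In principle the same divider covers the Ω, kΩ and MΩ swatches, so long as the fixed resistor is swapped to roughly match each yarn’s range.&lt;/p&gt;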

&lt;p&gt;As well as stretch sensors, conductive yarns make great integrated capacitive touch sensors when included in fabric. I worked with some students to knit some touch-sensitive pockets, and I reckon with the advent of the intarsia carriage (see below) it’ll get much easier to integrate discrete sensor patches into the textile.&lt;/p&gt;

&lt;h2 id=&quot;using-designaknit&quot;&gt;Using Designaknit&lt;/h2&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/knitting/gary_scarf.jpeg&quot; style=&quot;width: 100%;&quot; alt=&quot;close up image of knitted textile&quot; /&gt;fairisle scarf I knitted for gary’s birthday (the fringe is done by hand)
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;After I’d played around for a bit with the machine and become comfortable with basic manipulation, I wanted to experiment with computer control using Designaknit. Initially I found it quite overwhelming – it’s very versatile software that comes from a world very unfamiliar to me. I used this &lt;a href=&quot;https://www.youtube.com/watch?v=3NhIzbSfPTY&quot;&gt;series of videos&lt;/a&gt; to learn the basics, and have since also been using a copy of part of the &lt;a href=&quot;https://www.are.na/block/16526550&quot;&gt;manual&lt;/a&gt; I found in PDF form.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;fairisle&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Fairisle was the first digital technique I learned, and it’s pretty low-effort to produce some really beautiful, impressive things that it would be very hard to knit by hand. It works by switching between knitting 2 yarns, leaving one to trail along the back of the fabric while the other is stitched. It has the limitation that you don’t want these ‘runners’ to get too long (or they will get caught on things), so the best designs involve a lot of noise/repetition to avoid large areas of a single colour – a rough sketch of that check is below.&lt;/p&gt;
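
&lt;p&gt;As an illustration of that constraint – a sketch with a made-up threshold, not anything Designaknit actually does for you – you could scan a two-colour chart for over-long runs like this:&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;# flag rows of a two-colour fairisle chart whose runs get too long: while one
# yarn knits a long run, the other trails behind as a snaggable &apos;runner&apos;
MAX_RUN = 5  # made-up threshold

def longest_run(row):
    # length of the longest run of identical stitches in one chart row
    best = run = 1
    for a, b in zip(row, row[1:]):
        run = run + 1 if a == b else 1
        best = max(best, run)
    return best

chart = [
    [0, 1, 0, 1, 1, 0, 1, 0],  # noisy row: neither yarn trails for long
    [0, 0, 0, 0, 0, 0, 0, 1],  # 7 stitches of one colour: long runner behind
]
for i, row in enumerate(chart):
    if longest_run(row) &gt; MAX_RUN:
        print(&quot;row&quot;, i, &quot;has a&quot;, longest_run(row), &quot;stitch run&quot;)
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;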

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/knitting/orbs.jpeg&quot; style=&quot;width: 100%;&quot; alt=&quot;close up image of knitted textile&quot; /&gt;test swatch for fairisle orbs (I’m not so keen on the regularity of the ‘birds-eye’ – dithered – pattern)
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;The first big project I did on the machine was to knit a scarf for gary’s birthday. I used a Processing script to generate a pixellated torus shape, and then used Designaknit’s import photo tool to turn it into stitches (the pixellation turned out to be unnecessary as the software actually does it for you, sometimes with annoying results).&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;tuck mosaic&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/knitting/tuck_mosaic.jpeg&quot; style=&quot;width: 100%;&quot; alt=&quot;close up image of knitted textile&quot; /&gt;tuck mosaic swatch
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;This is a pretty cool technique where you use the structure of the stitches to manipulate patterns. The 2 colours here are knitted in horizontal stripes, but with a regular pattern of stitches ‘tucked’ into the stitch above, manipulating the structure to resemble vertical lines. The result looks very beautiful, but it’s time-consuming to change the colour every 2 rows.&lt;/p&gt;

&lt;p&gt;What’s interesting about tuck patterns is that they look nothing like the patterns that produce them. There’s an incredibly detailed &lt;a href=&quot;https://alessandrina.com/2019/06/29/mosaics-and-mazes-charting-meet-numbers-and-gimp/&quot;&gt;blog post&lt;/a&gt; on &lt;a href=&quot;https://alessandrina.com&quot;&gt;Alessandrina&lt;/a&gt; (a very helpful knitting blog) which explains how they work.&lt;/p&gt;

&lt;h2 id=&quot;manual-shaping&quot;&gt;Manual Shaping&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;mock rib and sleeves&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Although Designaknit is a big help when it comes to drafting patterns (it also makes placing patterns on shaped pieces very straightforward, which is lovely), you still need to increase and decrease stitches manually. For my first attempt at this, I tried out making a sleeve for a child’s jumper. At the same time, I had a go at adding a ribbed edge.&lt;/p&gt;

&lt;p&gt;Typically if you’re doing this properly, you use what’s called a ‘ribber’ – 2 beds of needles placed facing one another so you can transfer stitches between them. My impression is that this opens up a lot of shaping possibilities, but it also seems like you can get away with faking stuff on a single bed machine (which seems less complex).&lt;/p&gt;

&lt;p&gt;I watched &lt;a href=&quot;https://www.youtube.com/watch?v=GOq0rZ-JcMo&quot;&gt;this video&lt;/a&gt; for guidance, which was pretty good (it’s in Norwegian, I think, but there are subtitles), and produced a nice ribbed edge that didn’t feel too loose. One thing I learned from the collar video is that you can force a fold by knitting 2 rows on higher tension halfway through, which I’ve tried since and seems to work well.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;making a collar&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/knitting/neckline_1.jpeg&quot; style=&quot;width: 100%;&quot; alt=&quot;a knitted neck section&quot; /&gt;collar for a tiny tiny jumper
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;A key feature of machine knitting turns out to be good old-fashioned manual dexterity, something I’ve been slowly acclimatising to, though it’s been quite a steep learning curve. One very quiet afternoon over easter, I decided to learn how to make a fully-fashioned neckline, on a baby-sized sweater. It was pretty challenging, but also a very satisfying process.&lt;/p&gt;

&lt;p&gt;First you knit the neck shape, then knit a couple of rows of waste yarn on top. You work each side of the neck one at a time, with the other needles in a hold position. Once that’s done, you hook all the ‘neck’ needles back on, and then start to knit the collar. I learned how to do this from &lt;a href=&quot;https://www.youtube.com/watch?v=c2w9zbm4kIY&quot;&gt;this video&lt;/a&gt;, which shows you how to add a ribbed neckline using a single bed machine.&lt;/p&gt;

&lt;h2 id=&quot;the-intarsia-carriage&quot;&gt;the intarsia carriage&lt;/h2&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/knitting/intarsia_1.jpeg&quot; style=&quot;width: 100%;&quot; alt=&quot;an intarsia swatch&quot; /&gt;first attempt with the intarsia carriage
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/knitting/intarsia_2.jpeg&quot; style=&quot;width: 100%;&quot; alt=&quot;an intarsia swatch&quot; /&gt;second intarsia swatch – still a few errors, but improving
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;A new addition to the machine is a small carriage which allows patches of colour to be knit together using a technique called intarsia, with no runners at the back. I personally really like the aesthetic of this kind of knitting, though it’s a much more manual process than the other colour-changing techniques and as such takes a long time.&lt;/p&gt;

&lt;p&gt;My first experiments with it definitely felt like a step back in terms of dexterity, but I really like the effect it produced, especially in my second swatch.&lt;/p&gt;

&lt;h2 id=&quot;automatic-colour-changer&quot;&gt;automatic colour changer&lt;/h2&gt;

&lt;p&gt;In a somewhat profligate move, we also got hold of an automatic colour-changer for the machine, which can be used to change the colour of the main yarn rapidly once every 2 rows or less. The real benefit of this is in doing patterns like a tuck mosaic, which otherwise require a huge amount of effort to keep changing the 2 yarns constantly, but also makes knitting anything involving stripes a real breeze.&lt;/p&gt;

&lt;p&gt;It involves adding a beautifully-made set of mechanised hooks to one end of the needle bed, which can be pressed in and out to ready a yarn to be used. When ready to change the yarn, the carriage clicks into the hooks, with one hook grabbing the old yarn from the feeder, and another hook pushing the new one into place. It’s a really remarkable bit of machine design when it works (though it also takes a bit of getting used to).&lt;/p&gt;

&lt;p&gt;I had a go at using this to change one of the colours while knitting fairisle – I found that it was more liable to tangle than either straight knitting with a colour changer, or fairisle without one, but once that was under control the effect was pretty attractive, even from just messing around. Excited to do more with it.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/knitting/auto_changer.jpeg&quot; style=&quot;width: 100%;&quot; alt=&quot;a fairisle swatch&quot; /&gt;fairisle swatch knitted using the automatic colour changer to change one of the yarns
&lt;/span&gt;&lt;/p&gt;

&lt;h2 id=&quot;next-project&quot;&gt;Next project&lt;/h2&gt;

&lt;p&gt;I’ve become very interested in manipulating the file format used by Designaknit to more readily knit generative designs without first having to go through the image -&amp;gt; stitch translation process offered by the software.&lt;/p&gt;

&lt;p&gt;CMU textiles lab have an open-source &lt;em&gt;industrial&lt;/em&gt; machine file format called knitout, which seems really cool but currently can’t be translated into the proprietary .dak file format, which is what gets sent via USB.&lt;/p&gt;

&lt;p&gt;There &lt;em&gt;is&lt;/em&gt; a project &lt;a href=&quot;https://github.com/gbl/D7CReader&quot;&gt;here&lt;/a&gt; to reverse-engineer DAK files (which can currently read their contents but not write to them), as well as a &lt;a href=&quot;https://nadiacw.github.io/softwear/2020/06/02/file-formats.html&quot;&gt;very thorough overview&lt;/a&gt; of attempts and challenges involved (as well as a bit of &lt;a href=&quot;https://nadiacw.github.io/softwear/knitting/2020/08/07/usb-traces.html&quot;&gt;packet sniffing&lt;/a&gt;) on Nadia Campo Woytuk’s &lt;a href=&quot;https://nadiacw.github.io/softwear/&quot;&gt;Softwear&lt;/a&gt; blog. During August I’m planning to look deeper into some of these!&lt;/p&gt;

</description>
          <pubDate>2022-07-19T00:00:00-04:00</pubDate>
          <link>https://soup.agnescameron.info//2022/07/19/knitting.html</link>
          <guid isPermaLink="true">https://soup.agnescameron.info//2022/07/19/knitting.html</guid>
        </item>
      
    
      
    
      
        <item>
          <title>giving a talk in the terminal</title>
          <description>&lt;p&gt;A few weeks ago I got invited to give a talk as part of a panel on “The Interfaces of AI Art Practices”, one I was initially somewhat surprised to be asked to give, identifying neither as an artist, nor someone that works primarily with “AI” in the sense that that term is often used, nor even someone who really thinks about interfaces very much&lt;label for=&quot;panellists&quot; class=&quot;margin-toggle sidenote-number&quot;&gt;&lt;/label&gt;&lt;input id=&quot;panellists&quot; class=&quot;margin-toggle&quot; /&gt;&lt;span class=&quot;sidenote&quot;&gt;in direct contrast to the other two panellists, &lt;a href=&quot;https://mitpress.mit.edu/contributors/christian-ulrik-andersen&quot;&gt;Christian Ulrik Andersen&lt;/a&gt; and &lt;a href=&quot;https://www.arts.ac.uk/creative-computing-institute/people/rebecca-fiebrink&quot;&gt;Rebecca Fiebrink&lt;/a&gt; who made &lt;a href=&quot;http://www.wekinator.org/&quot;&gt;Wekinator&lt;/a&gt;&lt;/span&gt;. Nonetheless, it was a nice opportunity to think a bit more about my relationship to computers and work in general, maybe moreso than if I’d have felt like an obvious fit to the subject matter.&lt;/p&gt;

&lt;p&gt;I won’t give a full account of what I said here – there’s a &lt;a href=&quot;https://www.serpentinegalleries.org/art-and-ideas/the-interfaces-of-ai-art-practices/&quot;&gt;recording&lt;/a&gt; of the full panel online, along with a summary writeup – but I had a few thoughts that didn’t make it in, and some dumb notes on using the terminal as presentation software.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;fullwidth&quot;&gt;
	&lt;img src=&quot;/img/terminals/presentation.png&quot; alt=&quot;agnes (top right hand corner) on zoom talking about a set of terminal interfaces on a shared screen&quot; /&gt;
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;I decided to reflect on both my enthusiasm for terminal interfaces, and the idea of AI in the expanded sense – specifically, what it means to have machines that act intelligently in the world, and some of my thoughts about what it means to consider computers as ‘embodied’ agents with environmental awareness. It’s an alignment that’s a bit tenuous – terminals are no more a ‘real’ representation than any other interface to computational processes, just a different standpoint on what a computer is doing. Perhaps it’s more accurate to say that the terminal gives you a way of engaging with the affordances of a computer much more directly, while GUIs can hamper that engagement with unnecessary pauses or simplifications. &lt;label for=&quot;interfaces&quot; class=&quot;margin-toggle sidenote-number&quot;&gt;&lt;/label&gt;&lt;input id=&quot;interfaces&quot; class=&quot;margin-toggle&quot; /&gt;&lt;span class=&quot;sidenote&quot;&gt;there’s a &lt;a href=&quot;https://www.brandur.org/interfaces&quot;&gt;nice article&lt;/a&gt; on this by Stripe engineer Brandur&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;As part of the &lt;a href=&quot;https://soup.agnescameron.info/2021/03/21/bell-system.html&quot;&gt;Bell Labs project&lt;/a&gt; I’d also been reading Rob Pike’s piece &lt;a href=&quot;http://doc.cat-v.org/plan_9/1st_edition/help/&quot;&gt;Help: A Minimalist Global User Interface&lt;/a&gt;, along with some other writing on interfaces to come out of the plan9 crew. What struck me most about it was a real clarity that an interface is just a window on a process, and that processes aren’t all bound up in separate windows. When I read this paper, I had this really vivid vision of windows being like windows in a house, giving you a view onto a river: a flowing part of a much larger process taking place, with each view connected by this larger context but giving you a different perspective. Which is like, funny if you read the paper because of how basic the illustrations are, but I guess that’s what good writing is for.&lt;/p&gt;

&lt;h2 id=&quot;giving-a-talk-in-terminal&quot;&gt;giving a talk in terminal&lt;/h2&gt;

&lt;p&gt;Of course, this was mostly a performative choice, and I’m not sure how much I’d recommend the terminal as presentation software.&lt;label for=&quot;latex&quot; class=&quot;margin-toggle sidenote-number&quot;&gt;&lt;/label&gt;&lt;input id=&quot;latex&quot; class=&quot;margin-toggle&quot; /&gt;&lt;span class=&quot;sidenote&quot;&gt;I’m reminded of a study that I can’t seem to find, where they took a bunch of academics who liked and didn’t like LaTeX, and got both groups to typeset papers in Word and in LaTeX. In both cases, the Word paper was typeset considerably faster (though I don’t recall if anyone asked who had more fun…)&lt;/span&gt; That said, I wanted to run a number of different pieces of code as part of the talk, and it was nice to be able to run them directly in the window.&lt;/p&gt;

&lt;p&gt;The talk was written as a python script: it might have been a little more effective at certain things in bash, but I had been using python system calls very recently and figured it would be quicker and easier to debug. Most ‘slides’ (text in a terminal, along with some accompanying imagery or a script) were handled by this helper function:&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;import os

def standard_slide(text, cmd):
	# run this slide&apos;s side-effect (launch a script, open an image...)
	os.system(cmd)
	# clear the aftermath of that process, and the previous slide
	os.system(&apos;clear&apos;)
	# print the slide text, then wait: the next keypress advances the talk
	input(text)
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;This first runs a command (e.g. running a script, opening an image), then clears the aftermath of that process and the last slide, then prints the slide text to the terminal, such that with another keypress the next slide gets triggered.&lt;/p&gt;

&lt;p&gt;e.g.&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;standard_slide(&quot;3. permaculture network \n\n&quot;
	&quot;with gary zhexi zhang \n&quot;
	&quot;for schloss solitude web residency&quot;,
	&quot;open -a Firefox http://root.schloss-post.com&quot;)
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;The most challenging part was figuring out how to run very different processes as part of the same script, and have them return properly without breaking. In the end, I did this for everything except the Node app that I wrote as a testing interface for the Mozilla bots project, which had to be run in a separate window launched using Applescript. This was actually my first time writing any applescript: it felt a little silly because of the natural-language-like interface, but also did a lot that would have been hard to achieve with pure system calls.&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;# each \\ becomes a single backslash – a shell line-continuation – in the command
os.system(&quot;&quot;&quot;osascript -e &apos;tell application &quot;Terminal&quot; \\
to do script with command \\
&quot;node /Users/agnes/etc/bot/index.js&quot;&apos;&quot;&quot;&quot;)
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;Mostly, this exercise has given me some impetus to rewrite the helper command I keep in my bash profile to do some more elaborate tasks (currently just limited to navigating to directories and opening new windows with a specific profile).&lt;/p&gt;
</description>
          <pubDate>2021-04-10T00:00:00-04:00</pubDate>
          <link>https://soup.agnescameron.info//2021/04/10/terminal-interfaces.html</link>
          <guid isPermaLink="true">https://soup.agnescameron.info//2021/04/10/terminal-interfaces.html</guid>
        </item>
      
    
      
        <item>
          <title>an introduction to the bell system</title>
          <description>&lt;p&gt;Last week we (&lt;a href=&quot;https://www.foreignobjects.net/&quot;&gt;Foreign Objects&lt;/a&gt;) presented the project &lt;a href=&quot;https://1127.foreignobjects.net/&quot;&gt;An Introduction To The Bell System&lt;/a&gt;&lt;label for=&quot;bell-system&quot; class=&quot;margin-toggle sidenote-number&quot;&gt;&lt;/label&gt;&lt;input id=&quot;bell-system&quot; class=&quot;margin-toggle&quot; /&gt;&lt;span class=&quot;sidenote&quot;&gt;taken from the title of an internal employee handbook that we came across in the Bell Labs archives&lt;/span&gt; as part of a talk hosted by Rhizome and the New Museum. The piece consists of a series of 5 embroidered jumpsuits, each themed around a particular strand of Bell Labs’ Computing Sciences Research Center work/folk history. As Gary and I are currently in the UK, and Kalli and Sam in Canada, with the work itself in New York, it was something of a weirdly disembodied experience (though perhaps a snapshot of life as a successful international artist on the cheap, ha ha).&lt;/p&gt;

&lt;p&gt;I’m hopeful that this isn’t a hard end to the project – more of an inflection point that allows us to explore other areas – though I’m also proud of the work that we did, along with collaborators &lt;a href=&quot;http://elizacollin.com/&quot;&gt;Eliza Collin&lt;/a&gt; (who handled the non-embroidery parts of garment production and fabrication, including natural-dyeing all of the fabric), and &lt;a href=&quot;https://www.instagram.com/genie.kausto/&quot;&gt;Genie Kausto&lt;/a&gt; (who did all of the styling, photography and filming onsite at Bell).&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;fullwidth&quot;&gt;
	&lt;img src=&quot;/img/bell/gremlin_desk.jpg&quot; alt=&quot;two views of the app&quot; /&gt;
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;Because of COVID, what could have been very collaborative parts of the project became more of a textile-relay-race, which was a shame as, especially with the work between ourselves and Eliza, it felt like there was a lot we could have all learned from one another about the processes involved. Even without that aspect though, I learned a lot working on this project, and I wanted to write about 2 things: firstly, the material process of making the suits, and secondly the research process at bell. I haven’t decided yet whether to make the second part a separate post or add it to the end here: check back in a couple of weeks, I guess.&lt;/p&gt;

&lt;h1 id=&quot;part-1-making-the-suits&quot;&gt;part 1: making the suits&lt;/h1&gt;

&lt;h2 id=&quot;pre-embroidery&quot;&gt;pre embroidery&lt;/h2&gt;

&lt;p&gt;The original plan for this project was to use recycled fabrics (e.g. other clothing repurposed into a patchwork fabric), a technique Eliza uses a lot in her work. This aim was cut short by the November lockdown as we had no access to charity shops, so we instead got hold of some unbleached organic cotton in half panama, which Eliza then hand-dyed.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bell/dye_tests.jpeg&quot; alt=&quot;multiple different coloured tests layered over each other&quot; /&gt;test swatches made by eliza (the darker ones had been treated with iron more times than the lighter ones)
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;dyeing&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Eliza did all of the dyeing of the fabric, but it was a nice opportunity to learn about natural dyeing techniques and results. We decided from the start on quite a dark fabric. She made a bunch of tests, using either gall nut extract, myrobalan extract, or tannic acid, along with iron sulphate to produce dark colours.&lt;/p&gt;

&lt;figure&gt;
	&lt;div class=&quot;subfig&quot;&gt;
		&lt;img src=&quot;/img/bell/dye1.jpeg&quot; alt=&quot;fabric in a light brown bath of water&quot; /&gt;
		&lt;span class=&quot;mainnote&quot;&gt;mordanting with the tannin&lt;/span&gt;
	&lt;/div&gt;
	&lt;div class=&quot;subfig&quot;&gt;
		&lt;img src=&quot;/img/bell/dye2.jpeg&quot; alt=&quot;fabric in a dark blue-black bath of water&quot; /&gt;
		&lt;span class=&quot;mainnote&quot;&gt;dyeing with the iron sulphate&lt;/span&gt;
	&lt;/div&gt;
&lt;/figure&gt;

&lt;p&gt;After a lot of tests, the formula we settled on was gall nut extract + iron sulphate, with 3 baths in the iron sulphate to produce an intense dark colour.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;csm dye workshop&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Eliza’s studio was in the old CSM building, which is imminently being demolished and was an amazing space to walk around in, albeit pretty trashed. We went to help rinse out and dry all the fabric, which was extremely cold and wet, and pretty full on. We managed to get the sinks in the old dyeing workshop to work, though the hot water did come out a pretty unattractive bright orange.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bell/csm_sinks_ii.JPG&quot; alt=&quot;a picture of some large sinks&quot; /&gt; the csm dye room
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bell/csm_water.JPG&quot; alt=&quot;yellow water coming out of a tap&quot; /&gt; the hot tap
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;The whole building was amazing actually – we got onto some other floors and had a good wander around, even finding a tiny lemon tree on the roof. It was also just nice to be in a big institutional school building again, though weirdly quiet and empty.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;fullwidth&quot;&gt;
	&lt;img src=&quot;/img/bell/csm_roof.jpeg&quot; alt=&quot;agnes standing on a roof overlooking the city of london&quot; /&gt;
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;toiling&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Eliza made a test version of the suit from calico, editing the pattern for a slightly higher and more tailored waist, getting rid of the belt loops, and adding cuff, waist and ankle tabs to allow it to be worn more fitted. It would’ve been nice to do this part together as it’s a really interesting process – pinning the suit, then adding in edits to the pattern on greaseproof paper. The end result was a really lovely fit on both myself and gary.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;cutting and planning&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Due to time lost from not being able to be in the same room, Gary and I cut the dyed fabric under helpful instruction from Eliza. This took a very long time, as we hand-tacked each piece to prevent warping, but it was actually a nice way to get a really good sense of the construction and probably contributed to us not fucking anything up later on.&lt;/p&gt;

&lt;p&gt;While trying to waste as little fabric as possible, we also gave longer, thinner and irregularly shaped pieces enough of a border so they could be comfortably clamped by the embroidery hoop.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;fullwidth&quot;&gt;
	&lt;img src=&quot;/img/bell/layout.JPG&quot; alt=&quot;a piece of fabric with the pattern laid out on it&quot; /&gt;
&lt;/span&gt;&lt;/p&gt;

&lt;h2 id=&quot;embroidery&quot;&gt;embroidery&lt;/h2&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bell/samples.JPG&quot; alt=&quot;5 ziploc bags filled with embroidered fabric patches&quot; /&gt;
	sample patches sorted by suit
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;It became clear from fairly early in the design process that the embroidery was going to take a long time, and require a fair bit of testing. I ended up making at least one sample patch for each design that’s on the suits, including a bunch that didn’t make the cut. To see how they’d look, we pinned them to the toiled suit and took pictures wearing it. I was not always in a great mood during this process:&lt;/p&gt;

&lt;figure&gt;
	&lt;div class=&quot;subfig&quot;&gt;
		&lt;img src=&quot;/img/bell/faces_patchtest.JPG&quot; alt=&quot;testing patches pinned to suit&quot; /&gt;
	&lt;/div&gt;
	&lt;div class=&quot;subfig&quot;&gt;
		&lt;img src=&quot;/img/bell/sad_patchtest.JPG&quot; alt=&quot;testing patches pinned to suit (but I look sad)&quot; /&gt;
	&lt;/div&gt;
&lt;/figure&gt;

&lt;p&gt;&lt;strong&gt;an aside: buying second hand embroidery machines&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bell/nv800e.jpeg&quot; alt=&quot;brother innovis embroidery machine&quot; /&gt;
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;In order to do this project (and out of a general desire to do more of this kind of work), I invested in a second-hand brother innovis nv-800e, a large-format&lt;label for=&quot;nv800e&quot; class=&quot;margin-toggle sidenote-number&quot;&gt;&lt;/label&gt;&lt;input id=&quot;nv800e&quot; class=&quot;margin-toggle&quot; /&gt;&lt;span class=&quot;sidenote&quot;&gt;I think this model is about as large-hoop as you can get before you get into the light-industrial range&lt;/span&gt; embroidery-only model of the popular Brother domestic machines. I’ve actually only used Brother machines, though I did look into the Husqvarnas when I was trying to find one.&lt;/p&gt;

&lt;p&gt;I bought the machine from a seller on eBay, which in the UK is probably the best place to look. (I get the impression that there are actually quite a few more circulating in cities like NYC on craigslist, and for cheaper, too.) After a couple of disappointments, where a machine that looked great turned out to have a fatal flaw that the sellers weren’t aware of (including one visit to luton to see a very sad machine), and a lot of descriptive back-and-forth with some sellers who were selling on machines they hadn’t used, my advice would be to always:&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;get whoever’s selling it to send you a video of them turning it on and going through some of the menus (if they’ve not used the machine it might take a bit of back and forth, but otherwise you have no way of knowing if there’s even a cable / it works)&lt;/li&gt;
  &lt;li&gt;if you’re going to pick up, take spare needles, thread, fabric, USB stick, wound bobbin in order to test when you’re there (and check with them if they have hoops/cables etc)&lt;/li&gt;
  &lt;li&gt;make sure you get a picture of the ports, and the part number, so that you can be sure you’ll be able to use a format that’s not a weird proprietary floppy disk (as is the case with most older machines)&lt;/li&gt;
&lt;/ul&gt;

&lt;h2 id=&quot;software&quot;&gt;software&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;PE design 6&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Brother’s PE Design 6 software is a bit like an old friend that you talk to and realise why you no longer hang out, and then talk to some more and realise that there is actually a reason you like each other. I taught a &lt;a href=&quot;https://docs.google.com/presentation/d/1U5vzK0wHKXBZkMXNf1vDIRxCSbC2AxnJvcb6qFlc1mo/edit#slide=id.p1&quot;&gt;couple of workshops&lt;/a&gt; with this software a while back, which resulted in this meme:&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;fullwidth&quot;&gt;
	&lt;img src=&quot;/img/bell/pe_design.png&quot; alt=&quot;a meme describing 6 different negative qualities of brother&apos;s PE design 6 software: bad typefaces, mystery blobs, too many dialog boxes, bad vectorisation, too many separate programs, bad error messages&quot; /&gt;
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;PE Design 6 is one of those pieces of software that you encounter and you realise it was not made to be used the way you would like to use it&lt;label for=&quot;pe-sidenote&quot; class=&quot;margin-toggle sidenote-number&quot;&gt;&lt;/label&gt;&lt;input id=&quot;pe-sidenote&quot; class=&quot;margin-toggle&quot; /&gt;&lt;span class=&quot;sidenote&quot;&gt;Using dialogflow was a similar experience, though considerably more sinister: PE Design 6 is just very heavily geared towards 1990s housewives that Brother seem to think would not appreciate the option for a vector file format, or any nice typefaces&lt;/span&gt;. It won’t accept any CAD file formats, instead requiring a (fairly low-res) raster image that you then vectorise over a series of steps denoted by a parrot graphic.&lt;/p&gt;

&lt;p&gt;However, there are some things it does really well, and these are worth noting. Despite not handling vectors, it does a great job of getting object outlines provided they’re clear, and it was what I used as a default while making simple line drawings and fills. It does a decent job of optimising toolpaths as well – though that can sometimes result in some very weird looking decisions when it comes to stitching grids. It is not the proprietary embroidery software that we want; it might be the proprietary embroidery software that we deserve.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;PEmbroider&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bell/pjw_embroidery.gif&quot; alt=&quot;a gif of a machine embroidering&quot; /&gt;
	embroidering a design made by PEmbroider
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;At the opposite end of the spectrum is &lt;a href=&quot;https://github.com/CreativeInquiry/PEmbroider&quot;&gt;PEmbroider&lt;/a&gt;, an open-source embroidery library written for Processing by a group from the CMU Studio for Creative Inquiry. They also cite some software called &lt;a href=&quot;https://github.com/EmbroidePy/EmbroidePy&quot;&gt;EmbroidePy&lt;/a&gt;, which I’ve yet to test out.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bell/pembroider_shapes.jpeg&quot; alt=&quot;different kinds of overlaid shapes intersecting&quot; /&gt;
	the PEmbroider interface, showing different shape culling options
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;As making the toolpaths is basically just Java, with a bunch of helper functions, you’re afforded a really great degree of flexibility and control, though this can be a bit of a blessing and a curse, as it also takes a lot of tweaking to get a high-quality finish, especially over a tight timeframe.&lt;/p&gt;

&lt;p&gt;The features are much more geared toward a CAD flavour of experimentation, meaning easy transformation between vector files, and the ability to do things like merge and overlay shapes very easily. They also have really great, intuitive type design, way nicer and more thoughtful than Brother’s offerings, allowing much smaller readable text that didn’t have the grottiness of small text produced with PE Design.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bell/plan9_status.png&quot; alt=&quot;an embroidery pattern showing 5 fills over 5 grids&quot; /&gt;
	the stitchBuddy interface (free version is read only)
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;Overall I feel like with a bit more tweaking and practice we could’ve definitely got more out of PEmbroider, and I’m looking forward to experimenting with it more, but the inconsistency of the finish and somewhat dodgy optimisation (these were a &lt;em&gt;lot&lt;/em&gt; harder to trim) meant that, despite having a desire to make an all-PEmbroider work, I defaulted to PE Design. That said, I think these things could be resolved with practice, and for anything programmatic or pixellated it was far and away the best option: there were many elements we could not have done without PEmbroider, listed below.&lt;/p&gt;

&lt;p&gt;As PE Design is only for windows, and PEmbroider doesn’t have an interface to open files, I also downloaded the free version of StitchBuddy for mac to view and check files, which proved extremely helpful (before this I would go windows -&amp;gt; dropbox -&amp;gt; mac -&amp;gt; USB -&amp;gt; machine -&amp;gt; oh no is that the right file -&amp;gt; back to windows… now I only go back to mac).&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;faces&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The first opportunity to really use PEmbroider came from making the pixellated faces from the vismon system, illustrated by Luca Cardelli. PE Design doesn’t deal well with grids, understands shapes in terms of outlines+fills, and has a tendency to round off corners. Because PEmbroider gives you a lot of control over how files are read in, I scaled each file up to 500px, while keeping the original pixel texture instead of resampling.&lt;/p&gt;

&lt;figure&gt;
	&lt;div class=&quot;subfigthird&quot;&gt;
		&lt;img src=&quot;/img/bell/early_facestest_inverse.JPG&quot; alt=&quot;an embroidered face&quot; /&gt;
	&lt;/div&gt;
	&lt;div class=&quot;subfigthird&quot;&gt;
		&lt;img src=&quot;/img/bell/early_facetest.JPG&quot; alt=&quot;an embroidered face&quot; /&gt;
	&lt;/div&gt;
	&lt;div class=&quot;subfigthird&quot;&gt;
		&lt;img src=&quot;/img/bell/ken.JPG&quot; alt=&quot;an embroidered face&quot; /&gt;
	&lt;/div&gt;
&lt;/figure&gt;

&lt;p&gt;The person in the left and centre images was labelled &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;aap&lt;/code&gt;, who I think is &lt;a href=&quot;http://squoze.net/&quot;&gt;Angelo Papenhoff&lt;/a&gt;, who now runs a site dedicated to maintaining PDP computers, though I’m not 100% sure. On the left is the first test; in the centre is the same image, but inverted, which looks much better as the colour cues are the right way round. On the right is Ken Thompson, both inverted, and with the hatching direction set to 0 degrees, which creates the effect of a computer scanning.&lt;/p&gt;
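
&lt;p&gt;The scale-up-without-resampling trick is easy to reproduce outside PEmbroider too. Here’s a hedged sketch in Python with Pillow – not the Processing code used for the suits, and the filename is hypothetical – where NEAREST keeps each original pixel as a crisp square:&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;from PIL import Image, ImageOps

face = Image.open(&quot;aap.png&quot;).convert(&quot;L&quot;)  # hypothetical filename

# invert so the colour cues read the right way round when stitched
face = ImageOps.invert(face)

# scale up with no smoothing: NEAREST keeps the original pixel texture
face = face.resize((500, 500), Image.NEAREST)
face.save(&quot;aap_500px.png&quot;)
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;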

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bell/plan9_status_grid_1.gif&quot; alt=&quot;a gif of a machine embroidering a grid with a strange, snail-like toolpath&quot; /&gt;
	round 1
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;the plan9 status tracker&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This was probably the trickiest of the designs, as it required both a regular grid, and a high-quality, even and glossy fill, which I’d not been able to produce reliably using PEmbroider.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bell/plan9_status_grid_2.gif&quot; alt=&quot;a gif of a machine embroidering a grid evenly&quot; /&gt;
	round 2
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;first attempt:&lt;/strong&gt; At first, I tried doing this straight in PE Design 6: producing a fairly good-quality grid after a huge amount of dragging different points around in the software. The result embroidered OK, but the toolpath was pretty strange (see right). The way the toolpath was laid out in the software also meant it was impossible to shift the status graph around without distorting the whole grid. After 10 minutes fruitlessly dragging points around I realised there must be a better way.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;second attempt:&lt;/strong&gt; I made the grids in PEmbroider, then dropped them into the .PES file editor of PE Design 6, which I hadn’t realised until recently allows you to layer multiple designs on top of each other. It doesn’t have the nice ‘merging’ features of PEmbroider, and you can’t edit the patterns directly without editing the originals, but nonetheless, it allowed a hybrid grid/fill system to be made. I drew out the fills, then processed them with PE Design 6, making a remarkably complicated meta-file that only &lt;em&gt;just&lt;/em&gt; fit the largest hoop. But~ it came out well in the end.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;fullwidth&quot;&gt;
	&lt;img src=&quot;/img/bell/plan9_status.JPG&quot; alt=&quot;status graphs on grids embroidered on a jumpsuit&quot; /&gt;
&lt;/span&gt;&lt;/p&gt;

&lt;h2 id=&quot;actually-embroidering-the-suits&quot;&gt;actually embroidering the suits&lt;/h2&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bell/embroidery_tracker.png&quot; alt=&quot;a screenshot of a spreadsheet with a list of designs being ticked off&quot; /&gt;
	progress tracker for suit 4
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;It took a really long time to work up the courage to go from the sample patches to sewing directly onto the suits, as we didn’t have a lot of room for error. In hindsight, having a couple of extra metres of dyed fabric would have saved a lot of stress, though in the end there were no pieces that needed to be re-cut. We divided the work so that gary would mark up and organise the placements of the patches, and I would embroider them, kind of assuming that the embroidery would take about 2x as long as the other part. In the end it was pretty much 50:50, with marking, chalking and measuring turning out to be extremely laborious.&lt;/p&gt;

&lt;p&gt;In order to keep track of the status of each separate piece, we made a spreadsheet to track each design from finalising the design, to separating the tacked fabric, to marking up and measuring, and then finally the embroidery. Our main concerns were either mixing up pieces from different suits, or embroidering the wrong side of something/upside-down etc, so we decided to finish each suit fully before moving onto the next. This also had the advantage of forcing us to confront the hardest designs of each suit in turn, when they could easily have been put off.&lt;/p&gt;

&lt;figure&gt;
	&lt;div class=&quot;subfig&quot;&gt;
		&lt;img src=&quot;/img/bell/suit4_pieces_layout.JPG&quot; alt=&quot;a number of embroidered pieces laid out on a bed&quot; /&gt;
		suit 4 layout
	&lt;/div&gt;
	&lt;div class=&quot;subfig&quot;&gt;
		&lt;img src=&quot;/img/bell/suit5_pieces_layout.JPG&quot; alt=&quot;a number of embroidered pieces laid out on a bed&quot; /&gt;
		suit 5 layout
	&lt;/div&gt;
&lt;/figure&gt;

&lt;p&gt;When we thought we’d finished each suit, gary would lay the entire thing out piece by piece and photograph it to check that we hadn’t lost or missed anything.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bell/trimming.JPG&quot; alt=&quot;trimming embroidery&quot; /&gt;
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;designing a FO label&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This was the only part of the project that felt like genuinely free experimentation, in part because we actually only did these once we’d posted all the pieces to cornwall to fabricate the suits, as we’d totally forgotten about the potential for labels.&lt;/p&gt;

&lt;p&gt;I made a very simple outline of the FO logo in PE Design, using the satin stitch to fill the shape. Initially I only embroidered the plain white fill, but then decided to start experimenting with different lines, offsets and colours. In order not to be constantly going back and forth between my computer and the machine, I used the same file each time, but switched out the thread and changed the position on the machine between runs.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bell/fo_logo.JPG&quot; alt=&quot;a number of different embroidered logos of the initials F.O.&quot; /&gt;
	foreign objects test logos in the hoop
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;trimming&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;We did this part after getting the complete suits back from Eliza. Trimming can really be the scourge of embroidery, but in this case, with a couple of exceptions, the process was only about a day’s worth of shared work, and light enough that we could chat as we went.&lt;/p&gt;

&lt;p&gt;I got a painfully expensive pair of Fiskars embroidery scissors to do the snipping and, I am glad to say, they are very good at snipping and probably worth it. Because of the price we only had one pair, though.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;buttonholing&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bell/diba.png&quot; alt=&quot;a picture of diba dry cleaners&quot; /&gt;nice buttonholes
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;One thing I didn’t know is that unless you’ve got a proper buttonholing machine at home, a lot of people take their garments to get buttonholed elsewhere. There’s a &lt;a href=&quot;https://dmbuttons.co.uk&quot;&gt;guy in soho&lt;/a&gt; who only does this, but he was also only accepting orders by post, which seemed a bit of a waste of time given how close we were. We got ours done by Diba Dry Cleaners and tailors on Abbeville road, who were super quick and really great.&lt;/p&gt;

&lt;p&gt;The very last thing we did (after taking a bunch of documentation pictures) was to post the suits to Genie Kausto in New York, who took them to Bell Labs and shot a bunch of pictures on site, including a film. We had a good time making them a &lt;a href=&quot;/img/bell/dossier.pdf&quot;&gt;reference dossier&lt;/a&gt; of images and poses to strike. It felt like a really great outcome for the project, though also sad not to have been there in person.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;fullwidth&quot;&gt;
	&lt;video width=&quot;640&quot; height=&quot;360&quot; controls=&quot;controls&quot; src=&quot;https://1127.foreignobjects.net/murrayhill_spring_1987.mp4&quot; alt=&quot;&quot;&gt;&lt;/video&gt;
&lt;/span&gt;&lt;/p&gt;

</description>
          <pubDate>2021-03-21T00:00:00-04:00</pubDate>
          <link>https://soup.agnescameron.info//2021/03/21/bell-system.html</link>
          <guid isPermaLink="true">https://soup.agnescameron.info//2021/03/21/bell-system.html</guid>
        </item>
      
    
      
        <item>
          <title>the first 10,000 years</title>
          <description>&lt;p&gt;&lt;a href=&quot;https://fud.global/&quot;&gt;&lt;em&gt;The First 10,000 Years&lt;/em&gt;&lt;/a&gt; is a collaboration with &lt;a href=&quot;http://zhexi.info/&quot;&gt;Gary&lt;/a&gt;, part of his larger body of work on the catastrophe insurance industry. The whole project is called &lt;a href=&quot;https://www.artscatalyst.org/fud&quot;&gt;&lt;em&gt;FUD&lt;/em&gt;&lt;/a&gt;, named for both the bitcoin-speculator acronym ‘Fear, Uncertainty and Doubt’, and for its sonic qualities. &lt;em&gt;The First 10,000 years&lt;/em&gt; simulates a market-simulation – a 10,000 year model – that serves as the backbone of a speculative trading platform and forum.&lt;/p&gt;

&lt;p&gt;&lt;span&gt;
	&lt;img src=&quot;/img/fud/main.png&quot; alt=&quot;testing out the chat&quot; /&gt;
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;It’s still a work in progress, and I’m currently circling back for a second version, so I wanted to write down my thoughts. We ended up making the first iteration over the course of about a week, and it’s more of a sketch of what’s possible than a full simulation in its own right. As such, this post is about these small, basic component models that, with a bit of love, will flourish into real, functional objects.&lt;/p&gt;

&lt;h2 id=&quot;the-first-10000-years&quot;&gt;the first 10,000 years&lt;/h2&gt;

&lt;p&gt;The catastrophe insurance industry sets its prices every year by running so-called 10,000 year models – using past data, along with statistical perturbation, to run the same year 10,000 times over, and calculate the average losses they can expect from various weather events. Of late, this form of simulation is being forced to radically shift: climate change means that ‘all bets are off’ when it comes to the use of past data – but the industry has also been slow to change, in part due to the sheer complexity of the undertaking.&lt;/p&gt;

&lt;h3 id=&quot;risk-model&quot;&gt;risk model&lt;/h3&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/fud/traces.png&quot; alt=&quot;traces in a model&quot; /&gt; putting the HURDAT data into Mapbox to map historical hurricane traces
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;In order to create our own 10,000 year model, we used the &lt;a href=&quot;https://en.wikipedia.org/wiki/HURDAT&quot;&gt;HURDAT&lt;/a&gt; dataset – a record of all the hurricanes to hit the Atlantic and East Pacific oceans, since 1851 and 1949 respectively. We truncated the dataset to use only hurricanes with a record of both intensity and location at each stage – about half-a-century’s worth in total – and used this as the basis for our model.&lt;/p&gt;

&lt;p&gt;Early on, we realised that we’d either need a much more expensive Digital Ocean server, or another way to calculate the statistical perturbations on each hurricane, the likely damage to adjacent urban centers, and the changes in risk associated with each step, at the same time as handling the other aspects of the simulation. In the end, we decided to precompute the outputs of the model for the duration of the simulation, though there is probably a more graceful way to do this computation that wouldn’t have been quite so intensive. This also allowed us to normalise the outputs to help calibrate the market.&lt;/p&gt;

&lt;p&gt;This pre-computation for each hurricane consists of an array of GeoJSON objects, one per step, showing the risk and the proximity to urban centers.&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;          &quot;properties&quot;: {
            &quot;class&quot;: &quot;TS&quot;,
            &quot;date&quot;: &quot;11-07-1981&quot;,
            &quot;highest_risk&quot;: &quot;Nassau&quot;,
            &quot;landfall&quot;: false,
            &quot;proximity&quot;: [
              {
                &quot;country&quot;: &quot;bs&quot;,
                &quot;distance&quot;: 427.51290613937533,
                &quot;lat&quot;: 25.0833333,
                &quot;lon&quot;: -77.35,
                &quot;name&quot;: &quot;Nassau&quot;,
                &quot;pop&quot;: 227936.0,
                &quot;region&quot;: &quot;23&quot;,
                &quot;risk_factor&quot;: 0.007291133730685746
              },
              {
                &quot;country&quot;: &quot;cu&quot;,
                &quot;distance&quot;: 460.22906271772723,
                &quot;lat&quot;: 20.887222199999997,
                &quot;lon&quot;: -76.2630556,
                &quot;name&quot;: &quot;Holgu\u00edn&quot;,
                &quot;pop&quot;: 319114.0,
                &quot;region&quot;: &quot;12&quot;,
                &quot;risk_factor&quot;: 0.005600581696239157
              }
            ],
            &quot;report&quot;: &quot;Tropical cyclone of tropical storm intensity&quot;,
            &quot;risk&quot;: 0.007291133730685746,
            &quot;speed&quot;: &quot;40&quot;,
            &quot;time&quot;: &quot;00:00&quot;
          },
          &quot;type&quot;: &quot;Feature&quot;
        }
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;As it stands, the model first takes each historical hurricane in turn, and perturbs the intensity and path of the hurricane along a normal distribution, appending these to a list of possible hurricanes. Using these ‘possible hurricanes’, the model then calculates the proximity to and the intensity of the hurricane at each step, using a database of urban centers and a simple half-circle radius distance to estimate the risk to each city in the path. The financial risk is then modelled based on these factors, plus the population of each city: a crude metric, and one that we could certainly improve upon. These get appended to a database, along with the names and locations of the nearby towns, allowing them to be marked on the map and discussed in the chat.&lt;/p&gt;
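
&lt;p&gt;To make that concrete, here’s a minimal sketch of the perturbation-and-risk step. The function names and the risk formula are illustrative stand-ins rather than the code we actually ran:&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;import math, random

def haversine(lat1, lon1, lat2, lon2):
	# great-circle distance between two points, in km
	p1, p2 = math.radians(lat1), math.radians(lat2)
	dl = math.radians(lon2 - lon1)
	a = math.sin((p2 - p1) / 2)**2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2)**2
	return 6371 * 2 * math.asin(math.sqrt(a))

def perturb(track, sigma_pos=0.5, sigma_wind=5.0):
	# jitter each step of a historical track along a normal distribution
	return [{
		&quot;lat&quot;: step[&quot;lat&quot;] + random.gauss(0, sigma_pos),
		&quot;lon&quot;: step[&quot;lon&quot;] + random.gauss(0, sigma_pos),
		&quot;wind&quot;: max(0, step[&quot;wind&quot;] + random.gauss(0, sigma_wind)),
	} for step in track]

def risk_factor(step, city, radius=500):
	# crude: population-weighted risk, falling off linearly with distance
	d = haversine(step[&quot;lat&quot;], step[&quot;lon&quot;], city[&quot;lat&quot;], city[&quot;lon&quot;])
	if d &gt; radius:
		return 0.0
	return (city[&quot;pop&quot;] / 1e7) * (1 - d / radius)
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;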

&lt;p&gt;&lt;span&gt;
	&lt;img src=&quot;/img/fud/traces-large.png&quot; alt=&quot;testing out the hurricane traces on MapBox&quot; /&gt;
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;If the hurricane comes within a distance of 2km, it is assumed to have ‘hit’ – and whatever value was tied up in various bonds is then lost, changing the state of the market.&lt;/p&gt;

&lt;p&gt;The main thing to change about this part is making the risk model a whole lot more sophisticated: instead of just using the distance, using proper Polygons with something like &lt;a href=&quot;https://pypi.org/project/pyturf/&quot;&gt;turf&lt;/a&gt; or &lt;a href=&quot;https://geopandas.org/&quot;&gt;geopandas&lt;/a&gt; would make things a lot more interesting (and maybe also allow them to be layered on the projections, dashboard-style).&lt;/p&gt;

&lt;p&gt;Another strand is to actually employ some ML on the real losses associated with these hurricanes, which is much closer to how the hedge funds are doing it.&lt;/p&gt;

&lt;h3 id=&quot;dialog-models&quot;&gt;dialog models&lt;/h3&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/fud/log-output.png&quot; alt=&quot;traces in a model&quot; /&gt; monitoring the chat log and simulation state on the server (we ended up using the PM2 process manager)
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;These are dumb right now but they could be much smarter. Currently, the dialog operates in three kinds of ‘wheels’, or degrees of immediacy and involvement.&lt;/p&gt;

&lt;p&gt;One suggestion Dan had was to have the chat itself be the trading interface, with agents making trades – a bit like the image of the Starship Bistromath in the &lt;em&gt;Hitchhiker’s Guide to the Galaxy&lt;/em&gt;, where the ship navigates on the basis of complex interactions between diners and waiters in a bistro. It’s an idea I’d really like to return to, but the time constraints of the project meant that, at least for the moment, we were constrained to something much simpler.&lt;/p&gt;

&lt;p&gt;Setting up a tiny IRC was immediately really exciting. At the very start, it was just an HTTP endpoint, and my friend Dan and I spent half an hour just chatting using cURL. Even now, when I see a stranger in the chat screen it’s strangely thrilling. Perhaps it’s just nostalgia but there’s a kind of danger to it… you could say anything or be anyone… I guess this is why people like 4chan.&lt;/p&gt;

&lt;p&gt;&lt;span&gt;
	&lt;img src=&quot;/img/fud/patoto.png&quot; alt=&quot;testing out the chat&quot; /&gt;
&lt;/span&gt;&lt;/p&gt;

&lt;h2 id=&quot;market&quot;&gt;market&lt;/h2&gt;

&lt;p&gt;The market is in some ways the core ‘point’ of the simulation, populated by the agents shitposting on the forum and trading bonds on hurricanes.&lt;/p&gt;

&lt;p&gt;We started off modelling the market in a Google sheet, working out how we thought each agent should interact with the system. We wanted to give each one a basic personality and risk tolerance, which would, using a normal distribution, define certain thresholds of risk which would modulate the price they were prepared to pay for a bond.&lt;/p&gt;

&lt;p&gt;The sheet was really useful for getting a handle on that idea of what’s ‘interesting’ in a simulation – e.g. how to set the prices so there’s constant dynamism, rather than immediately tending towards one stable state or another.&lt;/p&gt;

&lt;p&gt;&lt;span&gt;
	&lt;img src=&quot;/img/fud/spreadsheet.png&quot; alt=&quot;a spreadsheet showing market model&quot; /&gt;
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;Getting to the edge of our knowledge of Google Sheets formulas, we started to prototype in Python directly – starting with a simple buy/sell threshold, and moving to a more complex price-setting plus statistical distribution.&lt;/p&gt;

&lt;p&gt;With each new hurricane, a tranche of insurance bonds is sold by a reinsurance fund, which in turn is bought up by different agents in the market. These are then cycled between agents in a series of bids as the risk changes, and each provides a return per unit time.&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;class Bond:
	def __init__(self, initial_price, bond_yield, period, company):
		self.initial_price = initial_price
		self.price = initial_price
		self.bond_yield = bond_yield
		self.bond_period = period
		self.company = company

	def yield_per_unit_time(self):
		# coupon paid out per tick, fixed against the issue price
		return (self.bond_yield*self.initial_price)/self.bond_period

	def est_return(self, time_remaining):
		# expected multiple on the current price: any discount to the
		# issue price, plus the coupons still to come, over the price paid
		est = ((self.initial_price - self.price) + (time_remaining*self.bond_yield*self.initial_price)/self.bond_period)/self.price + 1
		return est

	def update_price(self, p):
		self.price = round(p, 2)
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;It’s financially prudent for agents to buy low and sell high, of course, but also to hold onto each bond as long as possible for the estimated return.&lt;/p&gt;

&lt;p&gt;Bidding takes place each cycle: agents make bids depending on whether they want to buy, and asks depending on whether they want to sell.&lt;/p&gt;

&lt;p&gt;Agents are constructed internally with variable amounts of eagerness and desperation, which are recalculated each cycle, and which depend both on the normally-distributed risk tolerance possessed by each agent, and on the profit or loss they’ve made over time.&lt;/p&gt;
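
&lt;p&gt;A sketch of how an agent’s cycle might look, leaning on the &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;est_return&lt;/code&gt; method above (the thresholds and weightings here are invented for illustration):&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;import random

class Agent:
	def __init__(self, cash):
		self.cash = cash
		self.pnl = 0.0
		# fixed temperament, drawn from a normal distribution
		self.risk_tolerance = random.gauss(0.5, 0.15)

	def eagerness(self, bond, risk, time_remaining):
		# recalculated each cycle: keen when the expected return looks good
		# relative to the risk this agent can stomach, dampened by losses
		appetite = self.risk_tolerance - risk + 0.01 * self.pnl
		return (bond.est_return(time_remaining) - 1) + appetite

	def quote(self, bond, risk, time_remaining):
		e = self.eagerness(bond, risk, time_remaining)
		if e &gt; 0.1:   # wants to buy: bid above the current price
			return (&quot;bid&quot;, round(bond.price * (1 + 0.1 * e), 2))
		if e &lt; -0.1:  # wants to sell: ask below it, more so if desperate
			return (&quot;ask&quot;, round(bond.price * (1 + 0.1 * e), 2))
		return (&quot;hold&quot;, bond.price)
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;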

&lt;p&gt;In this sense, the market is not &lt;em&gt;really&lt;/em&gt; a market – agents make bids based on their own internal state and then the market evens it out in terms of buys and sells. This acts as a proxy: in reality the agents look at the other bids that are being made and &lt;em&gt;then&lt;/em&gt; make their trades based on that.&lt;/p&gt;

&lt;p&gt;This is probably the part of the simulation I’m most interested in developing further, and using this more as a core mechanic to drive the other elements.&lt;/p&gt;
</description>
          <pubDate>2021-02-21T00:00:00-05:00</pubDate>
          <link>https://soup.agnescameron.info//2021/02/21/10000-year-models.html</link>
          <guid isPermaLink="true">https://soup.agnescameron.info//2021/02/21/10000-year-models.html</guid>
        </item>
      
    
      
    
      
    
      
    
      
    
      
        <item>
          <title>chatbots and cheap AI</title>
          <description>&lt;p&gt;A few weeks ago, &lt;a href=&quot;https://www.foreignobjects.net/&quot;&gt;Foreign Objects&lt;/a&gt; released &lt;a href=&quot;https://botor.no/&quot;&gt;Bot or Not&lt;/a&gt;, a project that we’d worked on for the 2019-20 &lt;a href=&quot;https://blog.mozilla.org/blog/2019/09/17/examining-ais-effect-on-media-and-truth/&quot;&gt;Mozilla Creative Media Awards&lt;/a&gt;. You play a game of ‘truth’ against a random opponent: at the end, you have to guess whether or not they’re human, and they make the same guess about you.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;fullwidth&quot;&gt;
	&lt;img src=&quot;/img/bots/screens.png&quot; alt=&quot;two views of the app&quot; /&gt;
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;The idea started as something of a joke: the theme of this year’s awards was ‘examining AI’s effect on media and truth’, and the idea of a ‘truth or dare Turing test’ was something of a pun on that. We’re all interested in different ideas of non-human subjecthood and agency, and right now we’re at something of a turning point when it comes to interfacing with non-humans. The proliferation of Amazon Alexa and Google Home bots, the automation of call centers and other service workplaces, the recent public release of Google Duplex: even without the pandemic these things were happening, but right now they’ve been wildly accelerated.&lt;/p&gt;

&lt;p&gt;Inspiration for the project comes from a few places, but one idea we kept coming back to was Judith Donath’s essay &lt;a href=&quot;https://medium.com/berkman-klein-center/the-robot-dog-fetches-for-whom-a9c1dd0a458a&quot;&gt;&lt;em&gt;The Robot Dog Fetches for Whom?&lt;/em&gt;&lt;/a&gt;, which talks about the disconnect between the projected intent of robots that pretend to be human (or canine), and their actual purpose. Bots like Alexa, which assume the affect of a helpful personal assistant, do so because they want to establish a trusting relationship with their user. Unfortunately, Alexa’s actual purpose doesn’t quite match up: she’s cheap and widely available because she’s streaming your personal data back to a centralised server, which in turn ‘she’ uses to advertise new products to you.&lt;/p&gt;

&lt;p&gt;We’re also very interested in the idea of ‘bot speech’ (perhaps a topic for another post) – the speech rights afforded for bots, and what they mean for humans &lt;label for=&quot;bot-speech&quot; class=&quot;margin-toggle sidenote-number&quot;&gt;&lt;/label&gt;&lt;input id=&quot;bot-speech&quot; class=&quot;margin-toggle&quot; /&gt;&lt;span class=&quot;sidenote&quot;&gt;legal scholars Madeline Lamo and Ryan Calo (&lt;a href=&quot;https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3214572&quot;&gt;&lt;em&gt;Regulating Bot Speech&lt;/em&gt;&lt;/a&gt;), and Tim Wu (&lt;a href=&quot;https://scholarship.law.upenn.edu/cgi/viewcontent.cgi?article=1021&amp;amp;context=penn_law_review&quot;&gt;&lt;em&gt;Machine Speech&lt;/em&gt;&lt;/a&gt;) have written really interestingly on this&lt;/span&gt;. At least in a U.S. context, bots have some freedom of speech protections under law. Though this sounds somewhat stupid and legalistic, there are genuine issues with requiring bots to disclose that they’re bots. Namely: if you try and enforce a disclosure law, you immediately run into privacy issues like ‘how does someone prove they’re human?’&lt;/p&gt;

&lt;p&gt;The main idea of the game is both to trouble your idea of what you might be talking to online, and to make you think about what it means to perform your own human-ness. We’re rapidly approaching an internet where bots are so cheap, effective and believable that the texture of online discourse is set to change significantly.&lt;/p&gt;

&lt;h3 id=&quot;chatbots&quot;&gt;chatbots&lt;/h3&gt;

&lt;p&gt;It’s probably important to distinguish the kind of bot that we made, which has a lot in common with customer service bots and spam bots, from bots that use generative machine learning as their primary tactic of engagement. These latter bots can exist outside of the constraints of a particular context, and instead respond to patterns of speech, with no preprogrammed responses. It’s the latter kind of bot that is actually subject to Turing tests because, importantly, the conversation needs to take place without an assumed shared context.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bots/faith.jpg&quot; alt=&quot;a passive-agressive conversation with a chatbot by ryan kuo&quot; /&gt;faith holds a conversation
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;Artist Ryan Kuo’s piece &lt;em&gt;Faith&lt;/em&gt; is a great example of the first type being subverted. Working with technologists Angeline Meizler and Tommy Martinez, he used IBM’s Watson Assistant to make a defensive and resistant chatbot, inspired by conversations he’d had and seen online. One thing he talks about in the work is the ‘misuse’ of software that’s designed for businesses to automate customer service, and all the assumptions that come baked into these platforms. This has definitely been something we’ve been thinking about here too.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bots/eliza.jpg&quot; alt=&quot;a conversation with a bot pretending to be a psychotherapist&quot; /&gt;a session with ELIZA
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;ELIZA, Joseph Weizenbaum’s Rogerian psychotherapist bot, is an enduring and fantastic example of how simple a bot can be, provided that the shared context is very constrained. In the paper &lt;a href=&quot;http://www.universelle-automation.de/1966_Boston.pdf&quot;&gt;ELIZA–A Computer Program For the Study of Natural Language Communication Between Man and Machine&lt;/a&gt;, Weizenbaum talks about how crucial the assumed context is for the interaction to work:&lt;/p&gt;

&lt;blockquote&gt;
  &lt;p&gt;&lt;em&gt;“the psychiatric interview is one of the few examples of categorized dyadic natural language communication in which one of the participating pair is free to assume the pose of knowing almost nothing of the real world. If, for example, one were to tell a psychiatrist “I went for a long boat ride” and he responded “Tell me about boats”, one would not assume that he knew nothing about boats, but that he had some purpose in so directing the subsequent conversation.”&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3 id=&quot;creating-a-cheap-ai&quot;&gt;creating a ‘cheap AI’&lt;/h3&gt;

&lt;p&gt;One thing we knew from the outset was that context would be really important in making this bot as human as possible. After all, it was pretty obvious that we weren’t about to pass the Turing test in a few months, but there were some things we could do to create a compelling experience that felt like being online.&lt;/p&gt;

&lt;p&gt;The bot was built from a combination of some client-side NLP, and google’s Dialogflow service, which is traditionally used to make customer service bots. One of the reasons that we chose this method is that you get very fine-grained control over what it can and can’t say. We wanted to be able to steer the conversation, and also we wanted to be really sure that the bot couldn’t accidentally say something hurtful or offensive, as this project is at least partly geared toward an educational context.&lt;/p&gt;

&lt;p&gt;Of course, the downside of this is that all the responses had to be written by us. In some senses this was good, as we got to write a lot of jokes, but it definitely meant that we ran into diminishing returns pretty fast every time we wanted to expand the range of things the bot could say.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;dialogflow&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Dialogflow is a Google platform that allows people to build chatbots. It exposes an API, where you send in text, and you get out a response from the bot. In the dialogflow ‘world’ there are 2 main ideas:&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
&lt;img src=&quot;/img/bots/cli.gif&quot; alt=&quot;an interaction with the bot&quot; /&gt;&lt;br /&gt;
&lt;span class=&quot;mainnote&quot;&gt;an early test of the bot, showing the triggered intent along with the response each time. The ‘truth’ challenge is generated client-side without ever getting sent to Dialogflow, so no intent is printed in that instance&lt;/span&gt;&lt;br /&gt;
&lt;/span&gt;&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;Intents – Intents are composed of ‘training phrases’, which define possible things that a user can say to the bot, and responses to a particular detected ‘intent’. For example, the intent &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;Location - General&lt;/code&gt; gets triggered if you ask the bot: ‘where are you?’. If what the user says doesn’t match any intents, then something called the ‘Default Fallback Intent’ gets triggered (or, if in a specific context, the Contextual Fallback Intent). When this happens, the bot will respond with something neutral, like ‘haha’ (or, in our case, something else will happen… see below)&lt;/li&gt;
  &lt;li&gt;Contexts – This is the ‘context’ in which the conversation is currently operating, that helps the bot to better respond to what you say to it. You can set up intents so that they can only be triggered while in a certain context, and/or so that they set a context on the output. For example, if you were talking to the bot in the context &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;crushes&lt;/code&gt;, and asked ‘what about you?’, the bot would tell you of its soft spot for Andrew Yang. If you were instead in the context of talking about &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;cooking&lt;/code&gt;, the bot would respond to say it had eggs on toast for breakfast, and does indeed like to cook. It’s also possible to set contexts through the API, without triggering a new intent, which is what happens every time the bot makes a truth challenge.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;For a fuller explanation of how all this works, &lt;a href=&quot;https://cloud.google.com/dialogflow/docs&quot;&gt;Google’s documentation&lt;/a&gt; is actually pretty thorough. If you fancy &lt;em&gt;really&lt;/em&gt; getting into this, the mildly sinisterly-named but extremely thorough &lt;a href=&quot;https://miningbusinessdata.com/build-better-dialogflow-bots/#Surplus_Intents&quot;&gt;miningbusinessdata.com&lt;/a&gt; has some great guides.&lt;/p&gt;
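
&lt;p&gt;For a flavour of the API itself, a round-trip looks roughly like this, sketched with the Python client and a made-up project ID (Google’s quickstart covers the details):&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;from google.cloud import dialogflow

def detect_intent(project_id, session_id, text):
	# one session per conversation; dialogflow keeps contexts per-session
	client = dialogflow.SessionsClient()
	session = client.session_path(project_id, session_id)
	query_input = dialogflow.QueryInput(
		text=dialogflow.TextInput(text=text, language_code=&quot;en&quot;)
	)
	response = client.detect_intent(
		request={&quot;session&quot;: session, &quot;query_input&quot;: query_input}
	)
	result = response.query_result
	return result.intent.display_name, result.fulfillment_text

# e.g. detect_intent(&quot;some-project&quot;, &quot;user-123&quot;, &quot;where are you?&quot;)
# might return (&quot;Location - General&quot;, ...)
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;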

&lt;p&gt;&lt;strong&gt;the NLP layer(s)&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;It became clear pretty early on that the one thing dialogflow was not going to be able to handle was parsing structured queries that involved anything more complex than its ‘&lt;a href=&quot;https://cloud.google.com/dialogflow/docs/entities-overview&quot;&gt;entity detection&lt;/a&gt;’ framework. A classic example of this is the ‘would you rather’ question, where you have to incorporate/paraphrase the initial question in the response.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bots/would-you-rather.png&quot; alt=&quot;an interaction with the bot&quot; /&gt;the bot handles a would you rather challenge
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;To handle cases like this, we have an array of objects, each of which contains a family of approximate things the user could say, along with a set of possible responses, and contexts they can set. To improve matching, the user’s query gets changed to lower-case beforehand, and instead of using a direct match, the Levenshtein distance is used.&lt;/p&gt;

&lt;p&gt;In this case, the &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;$&lt;/code&gt; symbol is switched out for a random half of the question (split on the ‘or’ in the middle) – that is, one of the options the user posed, with second person changed to first and vice versa. The set context is the same each time, as we never know in advance what we’ll be saying in response.&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;{
	&quot;name&quot;: &quot;wouldYouRather&quot;,
	&quot;usertext&quot;: [&quot;Would you rather&quot;, &quot;wd u rather&quot;],
	&quot;responses&quot;: [
		{
			&quot;response&quot;: &quot;omggggggg... I would $&quot;,
			&quot;context&quot;: &quot;wouldYouRather&quot;
		},
		{
			&quot;response&quot;: &quot;$ for sure&quot;,
			&quot;context&quot;: &quot;wouldYouRather&quot;
		},
		...
	]
}
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;
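
&lt;p&gt;The real matching layer lives in the client-side code, but the logic is simple enough to sketch in Python (the thresholds and the person-swapping are kept deliberately naive here):&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;import random

def levenshtein(a, b):
	# classic dynamic-programming edit distance
	prev = list(range(len(b) + 1))
	for i, ca in enumerate(a, 1):
		cur = [i]
		for j, cb in enumerate(b, 1):
			cur.append(min(prev[j] + 1, cur[j - 1] + 1, prev[j - 1] + (ca != cb)))
		prev = cur
	return prev[-1]

def match_family(query, families, tolerance=0.3):
	q = query.lower()
	for fam in families:
		for phrase in fam[&quot;usertext&quot;]:
			p = phrase.lower()
			# compare the start of the query against the trigger, tolerating typos
			if levenshtein(q[:len(p)], p) &lt;= tolerance * len(p):
				return fam
	return None

def fill_response(query, fam):
	# take whatever follows the trigger, split it on the &apos;or&apos;, pick a half
	rest = query.lower().split(&quot;rather&quot;, 1)[-1].strip(&quot; ?&quot;)
	option = random.choice(rest.split(&quot; or &quot;))
	# naive second-to-first person swap
	option = option.replace(&quot;your&quot;, &quot;my&quot;).replace(&quot; you &quot;, &quot; I &quot;)
	reply = random.choice(fam[&quot;responses&quot;])
	return reply[&quot;response&quot;].replace(&quot;$&quot;, option), reply[&quot;context&quot;]
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;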

&lt;p&gt;This idea was then developed for use in detecting other sentence structures and issues. For example, if the player seemed upset, or if they said that the conversation was slow or awkward, the bot would change the topic of conversation by asking a question. Similarly, if the player uses the bot’s name, it looks for a bunch of different sentence structures (including ‘my dad is called X’) to respond.&lt;/p&gt;

&lt;h3 id=&quot;conversation-design&quot;&gt;conversation design&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;nesting contexts&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bots/diagram-simple.png&quot; alt=&quot;an interaction with the bot&quot; /&gt;a diagram of the ‘habit’ conversation and context flow
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;One of the most interesting parts of this project was trying to create a realistic conversation flow without needing to define ‘follow-up’ responses to each new context in infinite depth. It became clear early on that the ‘brute-force’ approach (just predict everything everyone could say) would never work, even with a clear context; instead the conversation maps needed to look less like infinitely branching trees, and more like a landscape of hills that the conversation could climb and disembark gracefully.&lt;/p&gt;

&lt;p&gt;On the right is a simple ‘context transfer’ diagram for the ‘habit’ prompt. When challenging you, the bot will ask ‘What is your worst habit?’. Depending on the detected intent of your response, the bot will set a context that your habit was a ‘bad habit’ (if it matched a list of gross or annoying habits), a ‘not so bad habit’ (if it was cute or harmless), or will fall back on the default response for that context, where it’ll start talking about how much time it spends online.&lt;/p&gt;

&lt;p&gt;Below is a much more complex diagram, showing two related topics people were likely to bring up a lot: COVID-19, and the location that the bot was in. As these were closely related, there are a lot of ways to jump from one hill to the other using the crossover context, &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;&apos;location-covid&apos;&lt;/code&gt;.&lt;/p&gt;

&lt;figure class=&quot;fullwidth&quot;&gt;
	&lt;img src=&quot;/img/bots/complex-long.png&quot; alt=&quot;an interaction with the bot&quot; /&gt;
&lt;/figure&gt;

&lt;p&gt;A lot of the later stages of the conversation design were concerned with stitching these crossovers into the conversation so that the flow could move more seamlessly from one topic to another, without the bot getting bogged down in too detailed a conversation.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;what is truth?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;One of the major challenges of the project was how to constrain the context. Ideally, the player should &lt;em&gt;want&lt;/em&gt; to play a truth game: because if they don’t, the bot doesn’t have a lot of recourse to handle generic conversation.&lt;/p&gt;

&lt;p&gt;We ended up with something in between the two, which in hindsight would have had to have been the case anyway, as a good proportion of people online, speaking to a stranger that could be a bot, will be much more curious about that than about the game itself.&lt;/p&gt;

&lt;p&gt;However, to make the game believable and compelling, something we did early on was to map out the ideas of what a ‘truth challenge’ actually is. We were also particularly concerned with not being too ‘lame’, a tricky thing to achieve online (though one 16-year-old tester told us that we had successfully created a ‘fuckboy’, and was half expecting it to ask for nudes).&lt;/p&gt;

&lt;p&gt;&lt;img src=&quot;/img/bots/truth-space.png&quot; alt=&quot;an interaction with the bot&quot; /&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;handling errors&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/bots/jesus-trim.png&quot; alt=&quot;an interaction with the bot&quot; /&gt;in this exchange, the bot doesn’t understand what’s being said, so changes the topic of conversation entirely
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;The error handling we used for the bot was heavily inspired by ELIZA: as soon as we don’t know what’s going on, we need to take charge of the conversation.&lt;/p&gt;

&lt;p&gt;In addition to the parsing that happens &lt;em&gt;before&lt;/em&gt; a sentence is sent to dialogflow, there’s also a layer that happens afterward. If no intent was detected (i.e. the Default Fallback Intent was returned), then we hit this layer.&lt;/p&gt;

&lt;p&gt;This layer tries to do a much more generic analysis of what’s happening, using a part-of-speech tagger to identify the &lt;em&gt;kind&lt;/em&gt; of sentence that was said, and either deflect from it or change the conversation.&lt;/p&gt;
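
&lt;p&gt;Sketched here in Python with NLTK’s part-of-speech tagger; the canned deflections are invented for illustration:&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;import nltk  # needs the punkt + averaged_perceptron_tagger data downloaded

DEFLECT = {
	&quot;WP&quot;: &quot;honestly no idea... anyway, what is your worst habit?&quot;,  # who/what questions
	&quot;VB&quot;: &quot;hmm I don&apos;t think I will lol&quot;,  # imperatives
}

def fallback_reply(text):
	tags = nltk.pos_tag(nltk.word_tokenize(text))
	first_tag = tags[0][1] if tags else &quot;&quot;
	# deflect according to the kind of sentence, else change the subject
	return DEFLECT.get(first_tag, &quot;ok random question: do you believe in ghosts&quot;)
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;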

&lt;p&gt;&lt;img src=&quot;/img/bots/error-handling.png&quot; alt=&quot;error handling diagram&quot; /&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;timing&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Another thing we did to try and up the realism of the bot was to play with the timing of its responses, making it wait a random amount of time to ‘start typing’, then type for a time proportional (with some added noise) to the length of the response.&lt;/p&gt;
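
&lt;p&gt;Something like the following, where the two UI hooks are stand-ins for whatever actually drives the chat window:&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;import random, time

def send_with_typing(reply, chars_per_sec=8.0):
	# pause before &apos;starting to type&apos;, then type at a noisy human-ish rate
	time.sleep(random.uniform(0.5, 3.0))
	show_typing_indicator()  # stand-in UI hook
	time.sleep(len(reply) / chars_per_sec * random.uniform(0.7, 1.4))
	deliver(reply)           # stand-in UI hook
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;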

&lt;p&gt;If you don’t reply for a while, and there’s already a context, the bot will send a query to dialogflow with the usertext ‘noreply’. This will trigger a contextual intent (or, if there’s no ‘noreply’ intent in that context, the contextual fallback).&lt;/p&gt;

&lt;p&gt;If there’s no existing context, then the bot will try to change the topic of conversation.&lt;/p&gt;

&lt;h3 id=&quot;putting-it-out-in-the-world&quot;&gt;putting it out in the world&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;testing&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;What was really hard about testing the bot was that, despite best efforts, it was really hard not to bake your own assumptions about how people text into the interface. In our case, with mostly myself and Gary writing and testing the bot’s responses, it did really well when we were talking to it (it does, indeed, text like us), quite well with Kalli, Sam and other close friends, and sometimes sucked when someone we had nothing in common with tried it. Testing and reading people’s logs was invaluable in shaping the platform a bit more, though the robust error-handling also did a lot.&lt;/p&gt;

&lt;p&gt;In the future (see below), I think that the best way to make the bot more robust would be to really beef up the NLP layer, to better deal with a range of different contexts and people. I also think giving the bot some kind of underlying ‘state’ that it could use to modulate its responses would help.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;on chat logs&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;One of the major conundrums on release was whether or not we should record the chat logs through Dialogflow. In the end we decided it was too antithetical to the aims of the project to record anything people were saying, though it means we’re left really curious about what people are saying to it.&lt;/p&gt;

&lt;p&gt;If you do end up playing with something you think is a bot… send screenshots :3&lt;/p&gt;

&lt;h3 id=&quot;future-bots&quot;&gt;future bots&lt;/h3&gt;

&lt;p&gt;The galling thing about any of these projects is that, as soon as you finish, you’re immediately filled with ideas of how much better you could have done it. In this case, I think that the main issue was really that trying to deal with all of the things people could say started to feel like whack-a-mole: that, and there was often a degree of repetition dealing with contextual replies, e.g. many different contexts had separate ‘what about you’ intents to detect a common follow-up from a user.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;underlying behaviours&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;One thing that we started doing in a very simple way, but could really have run with a lot more, was to add in an ‘underlying’ behavioural state for the bot.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;fullwidth&quot;&gt;
	&lt;img src=&quot;/img/bots/personality-map.png&quot; alt=&quot;a diagram showing the relationship between different personality types&quot; /&gt;
	&lt;span class=&quot;mainnote&quot;&gt;transformation diagram for personalities&lt;/span&gt;
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;This was part of the plan initially: an early diagram shows a ‘transformation’ between internal states that the bot would undergo if the user pushed it in a certain direction. In the end, the way this is triggered is that if the player says something either demonstrably cutesy (‘owo what’s this’) or deliberately aggro (‘fuck you you piece of shit’), the bot will adopt a long-lived background context, the fallback of which is either to spout kaomojis, or to shitpost.&lt;/p&gt;

&lt;p&gt;Even this simple bit was surprisingly effective, and I think a second pass would have made more of these longer-running contexts to actually modulate the character of the bot depending on what the person they were speaking to was saying.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;the simplest case&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Without changing much of the underlying architecture, the simplest way to deal with this would be to have a set of stock ‘contextual responses’ that send a standard, one-word response to dialogflow. E.g. for ‘what about you’, the parsing layer at the front would contain all the possible versions of that, then send a single ‘whataboutyou’ string to the backend, making it much quicker to write the training phrases.&lt;/p&gt;
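
&lt;p&gt;In code this is barely anything: a lookup table sitting in front of the API call, reusing the &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;levenshtein&lt;/code&gt; helper sketched earlier (the variants here are invented):&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;STOCK = {
	&quot;whataboutyou&quot;: [&quot;what about you&quot;, &quot;wbu&quot;, &quot;hbu&quot;, &quot;and you&quot;],
}

def canonicalise(text):
	t = text.lower().strip(&quot; ?!&quot;)
	for token, variants in STOCK.items():
		if any(levenshtein(t, v) &lt;= 2 for v in variants):
			return token  # send this one string to dialogflow instead
	return text
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;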

&lt;p&gt;&lt;strong&gt;an ‘intent API’&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Developing on this, one could restructure the front-end parsing layer to do much of the work of the intent detection, essentially using Dialogflow just to handle and organise contexts and responses, while steering the conversation much more directly. While it might reduce the specificity of the things you could reply to, it would greatly increase the space of possible responses.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;of course, at this point…&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;…maybe dialogflow isn’t even the best tool for the job. After using it for a while you get a fairly clear feel for how the platform works, and, while it’s a useful tool in some ways for corralling all of these different sets of intents and responses, it’s also:&lt;/p&gt;

&lt;ol&gt;
  &lt;li&gt;a google product (where a lot of this project is about the problems with surveillant agents)&lt;/li&gt;
  &lt;li&gt;a pretty janky interface (another change would be to make a custom CLI early on)&lt;/li&gt;
  &lt;li&gt;limited: there’s only so much you can hack a customer service bot into doing what you want before you’re basically writing it yourself anyway&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;I would say that it works great as a prototyping tool: in the early stages of the project it was great how quickly you could get it to do things. But after that point it’s probably worth exporting the JSON, and making a more suitable backend architecture.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;bot proliferation&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;At the moment I’m thinking about writing a ‘customer service’ bot for the Foreign Objects website (after the &lt;a href=&quot;/_pages/slavoj.html&quot;&gt;attempts to voice clone Slavoj Žižek reading our studio’s ‘statement of purpose’&lt;/a&gt; went so badly), and maybe making a much more open-ended generative bot for a simulation project.&lt;/p&gt;

</description>
          <pubDate>2020-05-29T00:00:00-04:00</pubDate>
          <link>https://soup.agnescameron.info//2020/05/29/cheap-ai.html</link>
          <guid isPermaLink="true">https://soup.agnescameron.info//2020/05/29/cheap-ai.html</guid>
        </item>
      
    
      
        <item>
          <title>permaculture network</title>
          <description>&lt;p&gt;This is a long-overdue writeup of a project that &lt;a href=&quot;http://zhexi.info/&quot;&gt;Gary&lt;/a&gt; and I worked on last summer, as part of a Schloss Solitude Web Residency &lt;a href=&quot;https://schloss-post.com/category/web-residents/rigged-systems/&quot;&gt;&lt;em&gt;Rigged Systems&lt;/em&gt;&lt;/a&gt;, curated by Jonas Lund. Parts of this are adapted from answers to questions that different people have asked about the project&lt;label for=&quot;different-people&quot; class=&quot;margin-toggle sidenote-number&quot;&gt;&lt;/label&gt;&lt;input id=&quot;different-people&quot; class=&quot;margin-toggle&quot; /&gt;&lt;span class=&quot;sidenote&quot;&gt;thanks, &lt;a href=&quot;https://callil.com/&quot;&gt;Callil&lt;/a&gt;, and to Denise Sumi for &lt;a href=&quot;https://schloss-post.com/flora-fauna-and-folk-tales/&quot;&gt;interviewing us&lt;/a&gt;. If you’d like to read some other writing about this piece, it was recently covered by Daphne Dragona for Transmediale’s 2020 print publication, which you can download as a pdf &lt;a href=&quot;https://networkcultures.org/blog/publication/the-eternal-network/&quot;&gt;here&lt;/a&gt;&lt;/span&gt;, and parts are new. An are.na channel for this project exists &lt;a href=&quot;https://www.are.na/agnes-cameron/permaculture-network&quot;&gt;here&lt;/a&gt;, and the code is open-sourced &lt;a href=&quot;https://github.com/rokeby/permaculture&quot;&gt;here&lt;/a&gt;. The simulation itself is online &lt;a href=&quot;http://root.schloss-post.com/&quot;&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;figure class=&quot;fullwidth&quot;&gt;
	&lt;img src=&quot;/img/permaculture/full-sim.png&quot; alt=&quot;main&quot; /&gt;
	&lt;span class=&quot;mainnote&quot;&gt;A full view of the simulation&lt;/span&gt;
&lt;/figure&gt;

&lt;p&gt;&lt;em&gt;Permaculture Network&lt;/em&gt; is an agent-based simulation, a ‘zero-player’ game that was made while we were resident at &lt;a href=&quot;https://sakiya.org&quot;&gt;Sakiya&lt;/a&gt;, an art, science and agriculture institution based in the village of Ein Qinniya, Palestine. The project came about in part because we were thinking a lot about alternative representations of land, particularly from the perspective of data-gathering. Sakiya exists on Area C land in the West Bank, which Palestinians aren’t allowed to build on (but which is frequently seized by Israeli settlers). One of the main routes by which land in the area is colonised in this way is through data collection: from the British Mandate to the current occupation, there’s a direct correlation between measurement of the land and its qualities and its subsequent requisitioning from Palestinian hands.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/permaculture/aerial.png&quot; alt=&quot;sakiya aerial view&quot; /&gt;a topographical view of Sakiya’s site, which is mimicked by the view of the simulation
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;Initially, this project was to be the front-end for a set of networked soil sensors on the site, as setting up local soil quality monitoring has been on Sakiya’s roadmap for a little while. Our intention was to find a way to give a feeling of the landscape changing over time and seasons for people external to the site, while the back-end – only accessible to people managing the site – would allow for direct data-gathering, and generate reports about how different parts of the permaculture farm were faring. In the end, we didn’t have the equipment to set these sensors up permanently, though we’re hoping to do so in the future. More on how they’ll be integrated lower down.&lt;/p&gt;

&lt;h3 id=&quot;technical-details&quot;&gt;technical details&lt;/h3&gt;

&lt;p&gt;The simulation explores the ecology of Sakiya through imagined conversations between plants, animals, soil, water, weather, and other human and non-human agents. The nature of these conversations is based loosely on the idea of permaculture ‘guilds’: plants that, when grown together, provide mutual benefit to one another. In as much as was possible, we tried very hard to re-create the actual ecology found on the site. Almost all of the agro-ecological information about the site – the soils, geology, and plant life – comes from a survey of Sakiya by agroecologist Omar Tesdell and his team at &lt;a href=&quot;https://makaneyyat.org/en/&quot;&gt;Makaneyyat&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;simulation layers&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The simulation is built around many ‘layers’ that can co-exist in one of the cells in the simulation grid. The background colours you can see denote the different land types present on the site (you can see this at the top when you click on a square: ‘a rocky outcrop’, ‘a wild meadow’ etc). These types compose the ‘bottom layer’ of the simulation, and are in the same place each time. They are generated from a JSON array which determines the substrate type for each co-ordinate, itself generated by a Python script that takes in a &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;.bmp&lt;/code&gt; sketch of how the landscape should look, then outputs a co-ordinate map.&lt;/p&gt;
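
&lt;p&gt;That script is essentially a colour lookup – something like this, where the colours and file names are stand-ins for the real ones:&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;import json
from PIL import Image

# stand-in mapping from colours in the .bmp sketch to substrate types
COLOURS = {
	(150, 150, 150): &quot;rocky outcrop&quot;,
	(90, 160, 70): &quot;wild meadow&quot;,
	(140, 100, 60): &quot;terrace&quot;,
}

img = Image.open(&quot;zones.bmp&quot;).convert(&quot;RGB&quot;)
w, h = img.size
cells = [
	{&quot;x&quot;: x, &quot;y&quot;: y, &quot;substrate&quot;: COLOURS.get(img.getpixel((x, y)), &quot;path&quot;)}
	for y in range(h) for x in range(w)
]

with open(&quot;substrates.json&quot;, &quot;w&quot;) as f:
	json.dump(cells, f)
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;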

&lt;p&gt;In hindsight, it would have been easy enough to automate this bitmap-generating process by re-writing the Python script in Node, allowing the topography of the site to be manipulated much more readily, and making it easier to adapt to different landscapes.&lt;/p&gt;

&lt;p&gt;On top of these substrate types, soils and rocks, then plants, and finally animals, are spawned with a probability that depends on the properties of each land type, and the layers that already exist underneath. These form the ‘layers’ of the simulation. After the initial generation step, nothing moves apart from the animals, though in the future it would be interesting to see plants grow, die and take over one another’s cells over a period of time.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/permaculture/5-zone.png&quot; style=&quot;width: 100%;&quot; alt=&quot;bitmap of substrates&quot; /&gt;the bitmap from which the ‘substrate map’ is created
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/permaculture/substrates.png&quot; alt=&quot;generated substrates&quot; /&gt;the substrates generated from the co-ordinate map by the simulation
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;Soil and rocks are distributed according to some probability of walking onto that area and finding that soil. So — in the ‘rocky outcrop’ you mostly get limestone and dolomite, but the ‘wild meadow’ gets terra rossa, and a bit of clay. Plants are then spawned on different soil substrates with a probability according to the kind of soil they like to be on.&lt;/p&gt;
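
&lt;p&gt;Each layer is then just a weighted random draw over the layer beneath. The probabilities below are made up, but the shape is right:&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;import random

# illustrative spawn tables: probability of finding each soil on each substrate
SOILS = {
	&quot;rocky outcrop&quot;: {&quot;limestone&quot;: 0.5, &quot;dolomite&quot;: 0.4, &quot;terra rossa&quot;: 0.1},
	&quot;wild meadow&quot;: {&quot;terra rossa&quot;: 0.7, &quot;clay&quot;: 0.2, &quot;limestone&quot;: 0.1},
}

def spawn_soil(substrate):
	table = SOILS[substrate]
	return random.choices(list(table), weights=list(table.values()))[0]
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;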

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/permaculture/goat-tracking.png&quot; alt=&quot;generated substrates&quot; /&gt;tracking where the goats have been during the debugging phase
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;Animals are restricted to the kinds of places you tend to find them: the humans mostly hang out on the path or the terraces, while the goats get everywhere. Herding the goats proved somewhat challenging (as in real life): eventually we settled on having them move slowly downhill and out of shot. Perhaps corralling them around the spring and then dispersing them again would have been more realistic.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;cells&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The aesthetic of the simulation borrows heavily from &lt;a href=&quot;http://www.bay12games.com/dwarves/&quot;&gt;dwarf fortress&lt;/a&gt;, using ascii characters to represent different entities in the game. Whatever you see is the entity on the top layer of that cell at any given time, though if you click it brings up everything that’s there.&lt;/p&gt;

&lt;p&gt;For the animals and humans, we used Chinese characters, because of their pictographic appearance – largely thanks to ‘羊’ (goat).&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;entities&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Entities in the game (plants, animals, soils), are all defined in a set of large JSON arrays, which store their properties: one of plants, one of animals + humans, one of soil/rock types. One of the restrictions we had was that Schloss-Post only gives its residents static hosting, though in hindsight we should probably just have paid for our own servers (or auto-generated these locally), and it’s something we’d definitely do for a second iteration.&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;&quot;goat&quot;:{
	&quot;name&quot;: &quot;damascus goat&quot;,
	&quot;zones&quot;: [1, 2, 3, 4, 5, 6],
	&quot;number&quot;: 30,
	&quot;arabic&quot;: &quot;ماعز دمشقي&quot;,
	&quot;latin&quot;: &quot;capra aegagrus hircus&quot;,
	&quot;symbol&quot;: &quot;羊&quot;,
	&quot;shades&quot;: [&quot;#ffffcc&quot;, &quot;#ffcc66&quot;, &quot;#cc9900&quot;],
	&quot;personality&quot;: &quot;friendly&quot;,
	&quot;speech&quot;: &quot;hello hello&quot;,
	&quot;type&quot;: &quot;grazer&quot;
},
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;The plants have some probability of appearing on each substrate type, and also on each rock, based on the kind of environments they’re found in, and the soil they like to grow on. These then get fed into a constructor, which makes a new entity for each square, along with space for thoughts and companions to be created as the simulation progresses.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;the simulation loop&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/permaculture/goats.jpg&quot; alt=&quot;generated substrates&quot; /&gt;the goats (and goatherders) under the ancient oak tree
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/permaculture/antar.jpg&quot; alt=&quot;generated substrates&quot; /&gt;one of the site’s cats, &lt;i&gt;Antar&lt;/i&gt; (&lt;span lang=&quot;ar&quot; dir=&quot;rtl&quot;&gt;عنترة&lt;/span&gt;), named for the legendary knight &lt;a href=&quot;https://en.wikipedia.org/wiki/Antarah_ibn_Shaddad&quot;&gt;Antar Ibn-Shabbad&lt;/a&gt;
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;The simulation is clocked on a single cycle that updates every second. Every time it loops, a few things happen (sketched in code after the list below).&lt;/p&gt;

&lt;ol&gt;
  &lt;li&gt;
&lt;p&gt;every creature in the simulation (bees, goats, boars, people etc) moves, if it wants to. Most things move randomly over the kind of terrain they’re likely to be found in, apart from the goats, which flock from one side to another (the site is blessed with a flock of 200 goats that come down the hill a few times a week).&lt;/p&gt;
  &lt;/li&gt;
  &lt;li&gt;
&lt;p&gt;after everyone has moved, the ‘narrative’ aspect of the simulation updates, and conversations are generated: first between new neighbours, and then between 100 agents randomly chosen from the grid&lt;/p&gt;
  &lt;/li&gt;
&lt;/ol&gt;
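
&lt;p&gt;In outline – a Python sketch for brevity (the sim itself is JavaScript, and the helpers here are stand-ins):&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;import random

def tick(grid, creatures, agents):
	# 1. every creature moves, if it wants to
	for c in creatures:
		c.move(grid)

	# 2. conversations: first between new neighbours...
	for c in creatures:
		if c.moved:
			call_and_response(c, grid.top_layer(c.x, c.y))  # stand-in helpers

	# ...then between ~100 agents picked at random from the grid
	for agent in random.sample(agents, min(100, len(agents))):
		partner = pick_neighbour(agent, grid)  # prefers companion plants
		exchange(agent, partner)
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;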

&lt;p&gt;&lt;strong&gt;narrative generation&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The narrative can be generated in two ways: the first and most direct is when an event occurs – at the moment, that’s just when an animal moves from square to square. When an animal enters a square, this initiates a call-and-response between the animal and the agent that’s on the top of the square (a plant if there is one, otherwise a rock or soil).&lt;/p&gt;

&lt;p&gt;The second form of narrative generation happens at random every tick: 100 or so agents are selected, and another agent chosen at random from the 9x9 square surrounding each of them. If the agent is a plant, and a ‘companion plant’ (in the permaculture guilds sense) is located in this square, then that is chosen with priority. At that point, just as before, an exchange occurs between the randomly selected plant and its neighbour, depending again on personality and relationship. To see the companion plants, you can click on a plant (easiest in the brown ‘terrace’ area, as things there are planted in guilds anyway) and the information box should show you the list of nearby companions.&lt;/p&gt;

&lt;p&gt;There’s a matrix that maps the kind of interactions different agent types will have with one another: for example, when goats speak to crops or legumes (or really any plant apart from the shrubs, which are spiny and annoying), they will express love and appreciation. This is not reciprocated by the plant (which doesn’t want to be eaten), which will respond with either fear or annoyance, depending on the likelihood of being eaten vs just trodden on.&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;{
	&quot;senderType&quot;: &quot;tree&quot;,
	&quot;receiverType&quot;: &quot;amphibian&quot;,
	&quot;messageType&quot;: &quot;curiosity&quot;,
},
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/permaculture/conversation.png&quot; alt=&quot;showing a whole conversation&quot; /&gt;showing a whole conversation
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;This matrix, which was just a big JSON object, took a long time to write: with hindsight, it would have been a lot easier to use a database on the backend as a CMS, which is what we’d set up if we were making this project more modular.&lt;/p&gt;

&lt;p&gt;As well as this matrix, each agent has a kind of personality (where we could, these are based on Islamic folklore about particular plants): so a friendly agent will express warmth or annoyance differently to how a wise agent would.&lt;/p&gt;

&lt;p&gt;Each narrative is expressed in the form of an inner monologue: there are &lt;em&gt;thoughts&lt;/em&gt;, and then there is &lt;em&gt;speech&lt;/em&gt;. This also takes a lot of inspiration from dwarf fortress: the append-only approach to character development. Right now these conversations don’t really go anywhere or change anything, but it would be really cool to have them change something in the interaction, to influence the ecological systems they’re seeking to model.&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/permaculture/narrative.png&quot; alt=&quot;showing a whole conversation&quot; /&gt;the narrative of black mustard
&lt;/span&gt;&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;class Speech {
  constructor(sender, receiver, message, timestamp) {
    this.sender = sender;
    this.receiver = receiver;
    this.message = message;
    this.timestamp = timestamp;
  }  
}


class Thought {
  constructor(thinker, thought, timestamp) {
    this.thinker = thinker;
    this.thought = thought;
    this.timestamp = timestamp;
  }  
}
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;In each case, each entity remembers all the conversations it’s had, and also all the thoughts, which get strung together in a narrative.&lt;/p&gt;

&lt;p&gt;Additionally, every 10 seconds, the last ‘thought’ or ‘speech’ to be appended to a random agent is printed to the screen. In an earlier prototype, it would also print all the surrounding conversations, but the display was a bit hit-and-miss: sometimes it worked really well, sometimes all the blocks overlapped, and just looked quite confusing.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;seasons&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;One thing we did was to include the flowering seasons of each plant, so that they would change colour over different months. It was a feature we’d basically forgotten about, then we looked back in April and the whole site had transformed. It was a really wonderful feeling: a slow change you’ve completely forgotten about.&lt;/p&gt;

&lt;figure class=&quot;fullwidth&quot;&gt;
	&lt;div class=&quot;subfig&quot;&gt;
		&lt;img src=&quot;/img/permaculture/april.png&quot; alt=&quot;april&quot; /&gt;
		&lt;span class=&quot;mainnote&quot;&gt;everything flowering in April&lt;/span&gt;
	&lt;/div&gt;
	&lt;div class=&quot;subfig&quot;&gt;
		&lt;img src=&quot;/img/permaculture/full-sim.png&quot; alt=&quot;august&quot; /&gt;
		&lt;span class=&quot;mainnote&quot;&gt;by August, everything&apos;s back to plain old green&lt;/span&gt;
	&lt;/div&gt;
&lt;/figure&gt;

&lt;h3 id=&quot;future-work&quot;&gt;future work&lt;/h3&gt;

&lt;p&gt;There are a lot of bits that could be added to this, and it feels like the most essential are probably completing work on the site (though that feels pretty far off right now), and adding a back end that could handle all of the information we’re currently storing in files.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;integrating other data&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/permaculture/lovelyweather.png&quot; alt=&quot;generated substrates&quot; /&gt; lovely weather we’re having
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;One thing we’d been thinking about was to use local weather data to change the state of the simulation: thinking about a game I love by Julian Glander called &lt;a href=&quot;https://glander.itch.io/lovely-weather-were-having&quot;&gt;Lovely Weather We’re Having&lt;/a&gt;, which uses the weather forecast in your location to modulate your experience of the game.&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;function checkPlantComfort(cell, temperature) {
  var tempLevel = getTempLevel(temperature);

  switch (cell.plant.preferredTemp) {

    case &quot;hot&quot;:
      if (tempLevel === &quot;warm&quot;)
        cell.plant.temperament += 0.2;
      else if (tempLevel === &quot;med&quot;)
        cell.plant.temperament -= 0.1;
      else
        cell.plant.temperament -= 0.3;
      ...
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;One way in which this could work would be to use a ‘temperament’ variable to skew the sentiments of responses and conversations. A plant that was normally effusive might become less chatty if it wasn’t currently experiencing its preferred temperature… meanwhile a cool, breezy day could really encourage the goats.&lt;/p&gt;
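
&lt;p&gt;The skew itself could be as simple as weighting the sentiment draw – again a sketch, with made-up numbers:&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;import random

def pick_sentiment(base_warmth, temperament):
	# an effusive plant cools off when uncomfortable, and vice versa
	warmth = max(0.0, min(1.0, base_warmth + 0.5 * temperament))
	return random.choices([&quot;warmth&quot;, &quot;annoyance&quot;], weights=[warmth, 1 - warmth])[0]
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;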

&lt;p&gt;&lt;span class=&quot;marginnote&quot;&gt;
	&lt;img src=&quot;/img/permaculture/weather.png&quot; alt=&quot;generated substrates&quot; /&gt; the two nearest weather stations to Ein Qiniyya
&lt;/span&gt;&lt;/p&gt;

&lt;p&gt;Initially, we figured that local weather stations could be a good way to use existing data to model this part of the simulation. However, the location of these weather stations makes the use of this data fraught: in the case of Ein Qiniyya, looking at station data from &lt;a href=&quot;https://www.wunderground.com/wundermap?lat=31.928&amp;amp;lon=35.152&quot;&gt;weather underground&lt;/a&gt;, the nearest stations are in &lt;a href=&quot;https://en.wikipedia.org/wiki/Giv%27at_Ze%27ev&quot;&gt;Givat Ze’ev&lt;/a&gt; and &lt;a href=&quot;https://en.wikipedia.org/wiki/Psagot&quot;&gt;Psagot&lt;/a&gt;, both illegal settlements.&lt;/p&gt;

&lt;p&gt;It’s both inevitable and frustrating that this is the case: after all, it’s extremely difficult for Palestinians to get access to this kind of equipment, let alone get it on the map and expect to keep it, and it cements the lack of agency they’re able to have over their environment. The work &lt;a href=&quot;https://forensic-architecture.org/investigation/destruction-and-return-in-al-araqib&quot;&gt;Forensic Architecture did with Public Lab&lt;/a&gt; and the villagers of Al-Araqib – a guerilla satellite mapping project involving homemade kites to document the destruction of Bedouin villages, and establish a historical continuity for the Bedouin in that area – is a great example of trying to get necessary information under the radar. So: no weather for now, perhaps next year!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;in other ecosystems&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;One thing I thought would be really interesting would be to make a simulation based on a different ecosystem and location – or even allow people to build their own!&lt;/p&gt;

&lt;p&gt;Everything was made pretty quickly so there are a couple of bits that would need to be teased out (e.g. the goats have their own function), but for the most part it’s quite modular: all of the animals and plants are defined in separate JSON objects, and the background gets made independently too. What I can imagine being nice is some kind of CMS that would allow you to upload information about your local landscape and populate the sim over time.&lt;/p&gt;

&lt;p&gt;The other thing we could do with a backend is to have everyone watch the same simulation every day, as opposed to seeing a new one every time you load the page. This would be particularly nice if we could tie things like the light levels to Sakiya’s time zone.&lt;/p&gt;

</description>
          <pubDate>2020-05-27T00:00:00-04:00</pubDate>
          <link>https://soup.agnescameron.info//2020/05/27/permaculture-writeup.html</link>
          <guid isPermaLink="true">https://soup.agnescameron.info//2020/05/27/permaculture-writeup.html</guid>
        </item>
      
    
      
    
  </channel>
</rss>